diff --git a/ACL_PyTorch/built-in/audio/CosyVoice/public_address_statement.md b/ACL_PyTorch/built-in/audio/CosyVoice/public_address_statement.md
new file mode 100644
index 0000000000000000000000000000000000000000..fd066ecdc22f3744ecd1e1c64bb8f25bb2e51b0a
--- /dev/null
+++ b/ACL_PyTorch/built-in/audio/CosyVoice/public_address_statement.md
@@ -0,0 +1,4 @@
+| 文件位置 | 公网地址 | 公网地址用途 |
+|-----------------------------------------------------------------------|----------------------|---------|
+| ModelZoo-PyTorch/ACL_PyTorch/built-in/audio/CosyVoice/diff_300I.patch | shikang12@huawei.com | 作者邮箱 |
+| ModelZoo-PyTorch/ACL_PyTorch/built-in/audio/CosyVoice/diff_800I.patch | shikang12@huawei.com | 作者邮箱 |
\ No newline at end of file
diff --git a/ACL_PyTorch/built-in/audio/EspNet_for_Pytoch/public_address_statement.md b/ACL_PyTorch/built-in/audio/EspNet_for_Pytoch/public_address_statement.md
index 5b53248d32071e3f3acb8d5386171a22f072b080..83e61d100baf1bb07b8f6ec55a197e55d7a740bd 100644
--- a/ACL_PyTorch/built-in/audio/EspNet_for_Pytoch/public_address_statement.md
+++ b/ACL_PyTorch/built-in/audio/EspNet_for_Pytoch/public_address_statement.md
@@ -1,3 +1,3 @@
-| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 |
-| ---- | ------------ | ------ | ------------------------------------ | -------- |
-|开发引入|/|EspNet_for_Pytoch/url.ini|www.openslr.org/resources/33|下载数据集|
+| 文件位置 | 公网地址 | 公网地址用途 |
+|-----------------------------------------------------------------------|------------------------------|---------|
+| ModelZoo-PyTorch/ACL_PyTorch/built-in/audio/EspNet_for_Pytoch/url.ini | www.openslr.org/resources/33 | 数据集网址 |
\ No newline at end of file
diff --git a/ACL_PyTorch/built-in/cv/FCENet_for_PyTorch/public_address_statement.md b/ACL_PyTorch/built-in/cv/FCENet_for_PyTorch/public_address_statement.md
index a5e4591f245b541fa546938251534aa350d4e2c6..3dc58c0e2f5a2a041048a64ebbfcb610cca05ffc 100644
--- a/ACL_PyTorch/built-in/cv/FCENet_for_PyTorch/public_address_statement.md
+++ b/ACL_PyTorch/built-in/cv/FCENet_for_PyTorch/public_address_statement.md
@@ -1,5 +1,3 @@
-
-| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 |
-| ---- | ------------ | ------ | ------------------------------------ | -------- |
-|开发引入|/|FCENet/url.ini|https://download.openmmlab.com/mmocr/textdet/|下载数据集|
-|开发引入|/|FCENet/url.ini|https://github.com/open-mmlab/mmdeploy|下载数据集|
+| 文件位置 | 公网地址 | 公网地址用途 |
+|---------------------------------------------------------------------|-----------------------------------------------|---------|
+| ModelZoo-PyTorch/ACL_PyTorch/built-in/cv/FCENet_for_PyTorch/url.ini | https://download.openmmlab.com/mmocr/textdet/ | 模型网址 |
diff --git a/ACL_PyTorch/built-in/cv/MobileNetV2_for_Pytorch/public_address_statement.md b/ACL_PyTorch/built-in/cv/MobileNetV2_for_Pytorch/public_address_statement.md
index 01d22e9aba8350062827de08acc9b52938785124..357e71f0aaa1650641662fcfd17b9377948df11a 100644
--- a/ACL_PyTorch/built-in/cv/MobileNetV2_for_Pytorch/public_address_statement.md
+++ b/ACL_PyTorch/built-in/cv/MobileNetV2_for_Pytorch/public_address_statement.md
@@ -1,5 +1,3 @@
-| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 |
-| ---- | ------------ | ------ | ------------------------------------ | -------- |
-|开发引入|/|MobileNetV2_for_Pytorch/url.ini|https://download.pytorch.org/models/mobilenet_v2-b0353104.pth|下载权重|
-|开发引入|/|mobilenet.py|https://github.com/tensorflow/models/blob/master/research/slim/nets/mobilenet/mobilenet.py|注释说明|
-|开发引入|/|mobilenet.py|`"MobileNetV2: Inverted Residuals and Linear Bottlenecks" `_.|注释说明|
+| 文件位置 | 公网地址 | 公网地址用途 |
+|--------------------------------------------------------------------------|---------------------------------------------------------------|---------|
+| ModelZoo-PyTorch/ACL_PyTorch/built-in/cv/MobileNetV2_for_Pytorch/url.ini | https://download.pytorch.org/models/mobilenet_v2-b0353104.pth | 模型网址 |
\ No newline at end of file
diff --git a/ACL_PyTorch/built-in/cv/PSENet_for_Pytorch/public_address_statement.md b/ACL_PyTorch/built-in/cv/PSENet_for_Pytorch/public_address_statement.md
index 4cc98a09101dec0036ce09e5dd9fbfac8cd904a0..33d4c8dabf079a7ec9986653e653b530085282c5 100644
--- a/ACL_PyTorch/built-in/cv/PSENet_for_Pytorch/public_address_statement.md
+++ b/ACL_PyTorch/built-in/cv/PSENet_for_Pytorch/public_address_statement.md
@@ -1,11 +1,7 @@
-
-| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 |
-| ---- | ------------ | ------ | ------------------------------------ | -------- |
-|开发引入|/|PSENet_for_Pytorch/url.ini|https://download.pytorch.org/models/resnet18-5c106cde.pth|下载权重|
-|开发引入|/|PSENet_for_Pytorch/url.ini|https://download.pytorch.org/models/resnet34-333f7ec4.pth|下载权重|
-|开发引入|/|PSENet_for_Pytorch/url.ini|https://download.pytorch.org/models/resnet50-19c8e357.pth|下载权重|
-|开发引入|/|PSENet_for_Pytorch/url.ini|https://download.pytorch.org/models/resnet101-5d3mb4d8f.pth|下载权重|
-|开发引入|/|PSENet_for_Pytorch/url.ini|https://download.pytorch.org/models/resnet152-b121ed2d.pth|下载权重|
-|开发引入|/|fpn_resnet_nearest.py|http://www.apache.org/licenses/|license|
-|开发引入|/|fpn_resnet_nearest.py|http://www.apache.org/licenses/LICENSE-2.0|license|
-|开发引入|/|Post-processing/Algorithm_DetEva.py|It is slightly different from original algorithm(see https://perso.liris.cnrs.fr/christian.wolf/software/deteval/index.html)|注释说明|
+| 文件位置 | 公网地址 | 公网地址用途 |
+|---------------------------------------------------------------------|-------------------------------------------------------------|---------|
+| ModelZoo-PyTorch/ACL_PyTorch/built-in/cv/PSENet_for_Pytorch/url.ini | https://download.pytorch.org/models/resnet18-5c106cde.pth | 模型网址 |
+| ModelZoo-PyTorch/ACL_PyTorch/built-in/cv/PSENet_for_Pytorch/url.ini | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 模型网址 |
+| ModelZoo-PyTorch/ACL_PyTorch/built-in/cv/PSENet_for_Pytorch/url.ini | https://download.pytorch.org/models/resnet50-19c8e357.pth | 模型网址 |
+| ModelZoo-PyTorch/ACL_PyTorch/built-in/cv/PSENet_for_Pytorch/url.ini | https://download.pytorch.org/models/resnet101-5d3mb4d8f.pth | 模型网址 |
+| ModelZoo-PyTorch/ACL_PyTorch/built-in/cv/PSENet_for_Pytorch/url.ini | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 模型网址 |
\ No newline at end of file
diff --git a/ACL_PyTorch/built-in/cv/R(2+1)D_for_Pytorch/public_address_statement.md b/ACL_PyTorch/built-in/cv/R(2+1)D_for_Pytorch/public_address_statement.md
index f5803c9f2492ae3479b3e6942b136dfaf4ea404a..ca51240aecedc83be300b7d21e00dc132c20be4e 100644
--- a/ACL_PyTorch/built-in/cv/R(2+1)D_for_Pytorch/public_address_statement.md
+++ b/ACL_PyTorch/built-in/cv/R(2+1)D_for_Pytorch/public_address_statement.md
@@ -1,4 +1,3 @@
-
-| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 |
-| ---- | ------------ | ------ | ------------------------------------ | -------- |
-|开发引入|/|R(2+1)D_for_Pytorch/url.ini|https://download.openmmlab.com/mmaction/recognition/r2plus1d/r2plus1d_r34_256p_8x8x1_180e_kinetics400_rgb/r2plus1d_r34_256p_8x8x1_180e_kinetics400_rgb_20200729-aa94765e.pth|下载权重|
\ No newline at end of file
+| 文件位置 | 公网地址 | 公网地址用途 |
+|----------------------------------------------------------------------|------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|---------|
+| ModelZoo-PyTorch/ACL_PyTorch/built-in/cv/R(2+1)D_for_Pytorch/url.ini | https://download.openmmlab.com/mmaction/recognition/r2plus1d/r2plus1d_r34_256p_8x8x1_180e_kinetics400_rgb/r2plus1d_r34_256p_8x8x1_180e_kinetics400_rgb_20200729-aa94765e.pth | 模型网址 |
\ No newline at end of file
diff --git a/ACL_PyTorch/built-in/foundation_models/blip/public_address_statement.md b/ACL_PyTorch/built-in/foundation_models/blip/public_address_statement.md
new file mode 100644
index 0000000000000000000000000000000000000000..08b884afbc4ea4998b66b8afc7cf4066301e1936
--- /dev/null
+++ b/ACL_PyTorch/built-in/foundation_models/blip/public_address_statement.md
@@ -0,0 +1,4 @@
+| 文件位置 | 公网地址 | 公网地址用途 |
+|------------------------------------------------------------------------------|----------------------------------------------------------------------------------------------|---------|
+| ModelZoo-PyTorch/ACL_PyTorch/built-in/foundation_models/blip/ascend_infer.py | https://storage.googleapis.com/sfr-vision-language-research/datasets/coco_karpathy_val.json | 数据集网址 |
+| ModelZoo-PyTorch/ACL_PyTorch/built-in/foundation_models/blip/ascend_infer.py | https://storage.googleapis.com/sfr-vision-language-research/datasets/coco_karpathy_test.json | 数据集网址 |
\ No newline at end of file
diff --git a/ACL_PyTorch/built-in/nlp/Bert_Base_Chinese_for_Pytorch/public_address_statement.md b/ACL_PyTorch/built-in/nlp/Bert_Base_Chinese_for_Pytorch/public_address_statement.md
index 2e7896502290481b6c6b88c2e868ac1b3915a7ea..c15d433fe5e56067e1af621519951342e757c40d 100644
--- a/ACL_PyTorch/built-in/nlp/Bert_Base_Chinese_for_Pytorch/public_address_statement.md
+++ b/ACL_PyTorch/built-in/nlp/Bert_Base_Chinese_for_Pytorch/public_address_statement.md
@@ -1,3 +1,3 @@
-| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 |
-| ---- | ------------ | ------ | ------------------------------------ | -------- |
-|开发引入|/|Bert_Base_Chinese_for_Pytorch/url.ini|https://scikit-learn.org/stable/modules/generated/sklearn.metrics.accuracy_score.html|下载数据集|
\ No newline at end of file
+| 文件位置 | 公网地址 | 公网地址用途 |
+|---------------------------------------------------------------------------------|---------------------------------------------------------------------------------------|---------|
+| ModelZoo-PyTorch/ACL_PyTorch/built-in/nlp/Bert_Base_Chinese_for_Pytorch/url.ini | https://scikit-learn.org/stable/modules/generated/sklearn.metrics.accuracy_score.html | 函数说明网址 |
\ No newline at end of file
diff --git a/ACL_PyTorch/built-in/nlp/MiniCPM_for_Pytorch/public_address_statement.md b/ACL_PyTorch/built-in/nlp/MiniCPM_for_Pytorch/public_address_statement.md
new file mode 100644
index 0000000000000000000000000000000000000000..4d71f1513f6c47507ff44406111462112742ccab
--- /dev/null
+++ b/ACL_PyTorch/built-in/nlp/MiniCPM_for_Pytorch/public_address_statement.md
@@ -0,0 +1,4 @@
+| 文件位置 | 公网地址 | 公网地址用途 |
+|-----------------------------------------------------------------------------|---------------------------------------------------------|---------|
+| ModelZoo-PyTorch/ACL_PyTorch/built-in/nlp/MiniCPM_for_Pytorch/minicpm.patch | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 函数说明网址 |
+| ModelZoo-PyTorch/ACL_PyTorch/built-in/nlp/MiniCPM_for_Pytorch/minicpm.patch | https://arxiv.org/abs/1910.13461 | 论文网址 |
\ No newline at end of file
diff --git a/ACL_PyTorch/contrib/audio/WeNet/public_address_statement.md b/ACL_PyTorch/contrib/audio/WeNet/public_address_statement.md
index 73b5e09436f35c2f91904da4a2f681efff9d026c..9d0a3097c8292cb8b260768fb5a4beba3152af74 100644
--- a/ACL_PyTorch/contrib/audio/WeNet/public_address_statement.md
+++ b/ACL_PyTorch/contrib/audio/WeNet/public_address_statement.md
@@ -1,4 +1,3 @@
-| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 |
-| ---- | ------------ | ------ | ------------------------------------ | -------- |
-|开发引入|/|WeNet/url.ini|www.openslr.org/resources/33|下载数据集|
-|开发引入|/|ACL_PyTorch/contrib/audio/WeNet/run_static.sh|https://docs.nvidia.com/deeplearning/nccl/user-guide/docs/env.html|引用环境配置说明|
+| 文件位置 | 公网地址 | 公网地址用途 |
+|----------------------------------------------------------|------------------------------|---------|
+| ModelZoo-PyTorch/ACL_PyTorch/contrib/audio/WeNet/url.ini | www.openslr.org/resources/33 | 数据集网址 |
\ No newline at end of file
diff --git a/ACL_PyTorch/contrib/cv/classfication/Deit_Small/public_address_statement.md b/ACL_PyTorch/contrib/cv/classfication/Deit_Small/public_address_statement.md
index 80ca0aa7a95b57cd05ab217d357cc91c0f39a150..4dae6fd29213f2d42731accbee2706df3ff264c0 100644
--- a/ACL_PyTorch/contrib/cv/classfication/Deit_Small/public_address_statement.md
+++ b/ACL_PyTorch/contrib/cv/classfication/Deit_Small/public_address_statement.md
@@ -1,3 +1,3 @@
-| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 |
-| ---- | ------------ | ------ | ------------------------------------ | -------- |
-|开发引入|/|Deit_Small/url.ini|https://dl.fbaipublicfiles.com/deit/deit_small_patch16_224-cd65a155.pth|下载权重|
\ No newline at end of file
+| 文件位置 | 公网地址 | 公网地址用途 |
+|--------------------------------------------------------------------------|-------------------------------------------------------------------------|---------|
+| ModelZoo-PyTorch/ACL_PyTorch/contrib/cv/classfication/Deit_Small/url.ini | https://dl.fbaipublicfiles.com/deit/deit_small_patch16_224-cd65a155.pth | 模型网址 |
\ No newline at end of file
diff --git a/ACL_PyTorch/contrib/cv/classfication/R(2+1)D/public_address_statement.md b/ACL_PyTorch/contrib/cv/classfication/R(2+1)D/public_address_statement.md
index 846873b8794859b2c011b63878d0533ca3c693b3..ad96b7ebfded12f4ff69dcc2fa322c1d816620f1 100644
--- a/ACL_PyTorch/contrib/cv/classfication/R(2+1)D/public_address_statement.md
+++ b/ACL_PyTorch/contrib/cv/classfication/R(2+1)D/public_address_statement.md
@@ -1,4 +1,3 @@
-
-| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 |
-| ---- | ------------ | ------ | ------------------------------------ | -------- |
-|开发引入|/|R(2+1)D/url.ini|https://download.openmmlab.com/mmaction/recognition/r2plus1d/r2plus1d_r34_256p_8x8x1_180e_kinetics400_rgb/r2plus1d_r34_256p_8x8x1_180e_kinetics400_rgb_20200729-aa94765e.pth|下载权重|
\ No newline at end of file
+| 文件位置 | 公网地址 | 公网地址用途 |
+|-----------------------------------------------------------------------|------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|---------|
+| ModelZoo-PyTorch/ACL_PyTorch/contrib/cv/classfication/R(2+1)D/url.ini | https://download.openmmlab.com/mmaction/recognition/r2plus1d/r2plus1d_r34_256p_8x8x1_180e_kinetics400_rgb/r2plus1d_r34_256p_8x8x1_180e_kinetics400_rgb_20200729-aa94765e.pth | 模型网址 |
\ No newline at end of file
diff --git a/ACL_PyTorch/contrib/cv/detection/CenterNet/public_address_statement.md b/ACL_PyTorch/contrib/cv/detection/CenterNet/public_address_statement.md
index 9700779cebfc5c6bc903dec681dd001948aaeda7..6cedff61f92da5e603b84ac096f84ec60d803c96 100644
--- a/ACL_PyTorch/contrib/cv/detection/CenterNet/public_address_statement.md
+++ b/ACL_PyTorch/contrib/cv/detection/CenterNet/public_address_statement.md
@@ -1,4 +1,3 @@
-
-| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 |
-| ---- | ------------ | ------ | ------------------------------------ | -------- |
-|开发引入|/|CenterNet/url.ini|http://dl.yf.io/dla/models|下载数据集|
\ No newline at end of file
+| 文件位置 | 公网地址 | 公网地址用途 |
+|---------------------------------------------------------------------|----------------------------|---------|
+| ModelZoo-PyTorch/ACL_PyTorch/contrib/cv/detection/CenterNet/url.ini | http://dl.yf.io/dla/models | 模型网址 |
\ No newline at end of file
diff --git a/ACL_PyTorch/contrib/cv/detection/FCENet/public_address_statement.md b/ACL_PyTorch/contrib/cv/detection/FCENet/public_address_statement.md
index a5e4591f245b541fa546938251534aa350d4e2c6..464d940011c6cafafcde7a8a55380f64fc196c8f 100644
--- a/ACL_PyTorch/contrib/cv/detection/FCENet/public_address_statement.md
+++ b/ACL_PyTorch/contrib/cv/detection/FCENet/public_address_statement.md
@@ -1,5 +1,4 @@
-
-| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 |
-| ---- | ------------ | ------ | ------------------------------------ | -------- |
-|开发引入|/|FCENet/url.ini|https://download.openmmlab.com/mmocr/textdet/|下载数据集|
-|开发引入|/|FCENet/url.ini|https://github.com/open-mmlab/mmdeploy|下载数据集|
+| 文件位置 | 公网地址 | 公网地址用途 |
+|------------------------------------------------------------------|-----------------------------------------------|---------|
+| ModelZoo-PyTorch/ACL_PyTorch/contrib/cv/detection/FCENet/url.ini | https://download.openmmlab.com/mmocr/textdet/ | 数据集网址 |
+| ModelZoo-PyTorch/ACL_PyTorch/contrib/cv/detection/FCENet/url.ini | https://github.com/open-mmlab/mmdeploy | 代码仓网址 |
\ No newline at end of file
diff --git a/ACL_PyTorch/contrib/cv/video_understanding/NonLocal/public_address_statement.md b/ACL_PyTorch/contrib/cv/video_understanding/NonLocal/public_address_statement.md
index a0321c04d94bc646f4b39de67f418477f7dc6d3b..a9e8aac6de936d78789af637957fac826385bea9 100644
--- a/ACL_PyTorch/contrib/cv/video_understanding/NonLocal/public_address_statement.md
+++ b/ACL_PyTorch/contrib/cv/video_understanding/NonLocal/public_address_statement.md
@@ -1,5 +1,5 @@
-| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 |
-| ---- | ------------ | ------ | ------------------------------------ | -------- |
-|开发引入|/|NonLocal/url.ini|http://s3.amazonaws.com/kinetics/400/readme.md|下载公开数据集|
-|开发引入|/|NonLocal/url.ini|https://s3.amazonaws.com/kinetics/400/annotations/val.csv|下载公开数据集|
-|开发引入|/|NonLocal/url.ini|https://s3.amazonaws.com/kinetics/400/val/k400_val_path.txt|下载公开数据集|
\ No newline at end of file
+| 文件位置 | 公网地址 | 公网地址用途 |
+|------------------------------------------------------------------------------|-------------------------------------------------------------|---------|
+| ModelZoo-PyTorch/ACL_PyTorch/contrib/cv/video_understanding/NonLocal/url.ini | https://s3.amazonaws.com/kinetics/400/val/k400_val_path.txt | 数据集网址 |
+| ModelZoo-PyTorch/ACL_PyTorch/contrib/cv/video_understanding/NonLocal/url.ini | https://s3.amazonaws.com/kinetics/400/annotations/val.csv | 数据集网址 |
+| ModelZoo-PyTorch/ACL_PyTorch/contrib/cv/video_understanding/NonLocal/url.ini | http://s3.amazonaws.com/kinetics/400/readme.md | 数据集网址 |
\ No newline at end of file
diff --git a/ACL_PyTorch/contrib/nlp/GNMT/public_address_statement.md b/ACL_PyTorch/contrib/nlp/GNMT/public_address_statement.md
index cc4463287ca2b434614a6f77415d7fc911dfa855..d49e77381a4f083253fbc24e0cc37e16f7765941 100644
--- a/ACL_PyTorch/contrib/nlp/GNMT/public_address_statement.md
+++ b/ACL_PyTorch/contrib/nlp/GNMT/public_address_statement.md
@@ -1,4 +1,3 @@
-| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 |
-| ---- | ------------ | ------ | ------------------------------------ | -------- |
-|开发引入|/|GNMT/url.ini|http://data.statmt.org/wmt16/translation-task/dev.tgz|下载数据集|
-|开发引入|/|GNMT/url.ini|https://github.com/moses-smt/mosesdecoder.git|获取开源代码|
\ No newline at end of file
+| 文件位置 | 公网地址 | 公网地址用途 |
+|-------------------------------------------------------|-------------------------------------------------------|---------|
+| ModelZoo-PyTorch/ACL_PyTorch/contrib/nlp/GNMT/url.ini | http://data.statmt.org/wmt16/translation-task/dev.tgz | 数据集网址 |
\ No newline at end of file
diff --git a/ACL_PyTorch/contrib/nlp/Transformerxl_large/public_address_statement.md b/ACL_PyTorch/contrib/nlp/Transformerxl_large/public_address_statement.md
index 39d91cdbe2a03115339b6aa0c03ff0234616d1db..a7f37aa26e4742e07d462edaa9e9f1278fbcb3e6 100644
--- a/ACL_PyTorch/contrib/nlp/Transformerxl_large/public_address_statement.md
+++ b/ACL_PyTorch/contrib/nlp/Transformerxl_large/public_address_statement.md
@@ -1,5 +1,4 @@
-| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 |
-| ---- | ------------ | ------ | ------------------------------------ | -------- |
-|开发引入|https://github.com/kimiyoung/transformer-xl.git |getdata.sh |http://mattmahoney.net/dc/enwik8.zip |说明源码出处|
-|开发引入|https://github.com/kimiyoung/transformer-xl.git |getdata.sh |https://raw.githubusercontent.com/salesforce/awd-lstm-lm/master/data/enwik8/prep_enwik8.py |说明源码出处|
-|开源代码引入|https://github.com/kimiyoung/transformer-xl.git |sample.patch |https://github.com/tensorflow/tensorflow/blob/r1.10/tensorflow/python/ops/candidate_sampling_ops.py |说明源码出处|
\ No newline at end of file
+| 文件位置 | 公网地址 | 公网地址用途 |
+|-------------------------------------------------------------------------|--------------------------------------------------------------------------------------------|---------|
+| ModelZoo-PyTorch/ACL_PyTorch/contrib/nlp/Transformerxl_large/getdata.sh | http://mattmahoney.net/dc/enwik8.zip | 数据集网址 |
+| ModelZoo-PyTorch/ACL_PyTorch/contrib/nlp/Transformerxl_large/getdata.sh | https://raw.githubusercontent.com/salesforce/awd-lstm-lm/master/data/enwik8/prep_enwik8.py | 代码网址 |
\ No newline at end of file
diff --git a/PyTorch/built-in/audio/ESPnet2_for_PyTorch/public_address_statement.md b/PyTorch/built-in/audio/ESPnet2_for_PyTorch/public_address_statement.md
index f21c28677f3f21e125806f9448954684c55ab74d..e22e6716dc4338650d7a174b0872287d3cfea217 100644
--- a/PyTorch/built-in/audio/ESPnet2_for_PyTorch/public_address_statement.md
+++ b/PyTorch/built-in/audio/ESPnet2_for_PyTorch/public_address_statement.md
@@ -1,1503 +1,233 @@
-| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 |
-|--------|------------------------------------------------|------------------------------------------------------------------------------------------------------------------------------|--------------------------------------------------------------------------------------------------------------------------------|---------|
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/ci/doc.sh | https://github.com/kaldi-asr/kaldi | 下载依赖 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/ci/install.sh | https://github.com/kpu/kenlm/archive/master.zip | 下载依赖 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/ci/install_kaldi.sh | https://github.com/kaldi-asr/kaldi | 下载依赖 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/ci/install_kaldi.sh | https://github.com/espnet/kaldi-bin/releases/download/v0.0.1/ubuntu16-featbin.tar.gz | 下载依赖 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/ci/test_shell.sh | https://github.com/kaldi-asr/kaldi | 下载依赖 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/ci/test_shell.sh | https://github.com/bats-core/bats-core.git | 下载依赖 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/ci/test_shell.sh | https://github.com/koalaman/shellcheck/releases/download/stable/shellcheck-stable.linux.x86_64.tar.xz | 下载依赖 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/ci/test_shell.sh | https://github.com/bats-core/bats-core.git | 下载依赖 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/doc/conf.py | https://github.com/rtfd/recommonmark/tree/master/doc/ | 文档地址 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/docker/prebuilt/devel/Dockerfile | nyalta21@gmail.com | 邮箱 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/docker/prebuilt/devel/Dockerfile | https://github.com/espnet/espnet | 下载源码 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/10.0/Dockerfile | nyalta21@gmail.com | 邮箱 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/10.0/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64/7fa2af80.pub | 下载密钥 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/10.1/Dockerfile | nyalta21@gmail.com | 邮箱 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/10.1/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64/7fa2af80.pub | 下载密钥 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/10.2/Dockerfile | nyalta21@gmail.com | 邮箱 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/10.2/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64/7fa2af80.pub | 下载密钥 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/11.1/Dockerfile | nyalta21@gmail.com | 邮箱 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/11.1/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64/7fa2af80.pub | 下载密钥 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/8.0/Dockerfile | nyalta21@gmail.com | 邮箱 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/8.0/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1604/x86_64/7fa2af80.pub | 下载密钥 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/9.0/Dockerfile | nyalta21@gmail.com | 邮箱 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/9.0/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1604/x86_64/7fa2af80.pub | 下载密钥 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/9.1/Dockerfile | nyalta21@gmail.com | 邮箱 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/9.1/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1604/x86_64/7fa2af80.pub | 下载密钥 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/9.2/Dockerfile | nyalta21@gmail.com | 邮箱 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/9.2/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1710/x86_64/7fa2af80.pub | 下载密钥 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/docker/prebuilt/Dockerfile | nyalta21@gmail.com | 邮箱 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/docker/prebuilt/Dockerfile | https://github.com/cybertronai/pytorch-lamb | 下载第三方库 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/docker/prebuilt/local/Dockerfile | nyalta21@gmail.com | 邮箱 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/docker/prebuilt/runtime/Dockerfile | nyalta21@gmail.com | 邮箱 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/docker/prebuilt/runtime/Dockerfile | https://github.com/kaldi-asr/kaldi | 下载依赖 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/docker/prebuilt/runtime/Dockerfile | https://github.com/espnet/kaldi-bin/releases/download/v0.0.1/ubuntu16-featbin.tar.gz | 下载依赖 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/docker/prebuilt/runtime/Dockerfile | https://repo.continuum.io/miniconda/Miniconda3-latest-Linux-x86_64.sh | 下载依赖 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/aidatatang_200zh/asr1/run.sh | www.openslr.org/resources/62 | 下载配置 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/aishell/asr1/run.sh | www.openslr.org/resources/33 | 下载配置 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/ami/asr1/local/ami_download.sh | http://groups.inf.ed.ac.uk/ami | 下载配置 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/ami/asr1/local/ami_download.sh | http://groups.inf.ed.ac.uk/ami/download/temp/amiBuild-04237-Sun-Jun-15-2014.manifest.txt | 下载配置 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/ami/asr1/local/ami_download.sh | http://groups.inf.ed.ac.uk/ami/download/temp/Creative-Commons-Attribution-NonCommercial-ShareAlike-2.5.txt | 下载配置 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/ami/asr1/local/ami_text_prep.sh | http://groups.inf.ed.ac.uk/ami | 下载配置 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/ami/asr1/local/ami_xml2text.sh | http://sourceforge.net/projects/nite/files/nite/nxt_1.4.4/nxt_1.4.4.zip | 下载依赖 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/ami/asr1/local/ami_xml2text.sh | http://groups.inf.ed.ac.uk/ami/AMICorpusAnnotations/$annots.gzip | 下载配置 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/an4/asr1/run.sh | http://www.speech.cs.cmu.edu/databases/an4/ | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/an4/tts1/run.sh | http://www.speech.cs.cmu.edu/databases/an4/ | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/arctic/tts1/local/data_download.sh | http://festvox.org/cmu_arctic/cmu_arctic/packed/cmu_us_${spk}_arctic-0.95-release.tar.bz | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/arctic/tts1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1iXdQv_YGD9VG1dR_xCjSkX6A4HkrpTbF | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/arctic/tts1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1iOwvCx6wX5_qCmHZSX_vCd_ZYn-B5akh | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/arctic/tts1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1rHQMMjkSoiX3JX2e70MKUKSrxHGwhmRb | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/arctic/tts1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1cNrTa8Jxa3AYcap7jo0_RPBapiay3etG | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/arctic/tts1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1zv9GwhhBW32a6RM5wHzjqRxkkv9IrXTL | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/arctic/vc1/local/ob_eval/evaluate.sh | https://drive.google.com/open?id=1BtQvAnsFvVi-dp_qsaFP7n4A_5cwnlR6 | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/arctic/vc1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1mPf-BxX3t_pqFFV6MGPBRePm5kgNR5sM | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/arctic/vc1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1fRLw6EA0x55xa449i_YRjCgm8sgv3hJI | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/arctic/vc1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1v70TtwfmYtTHq9LvksX907mNTEv1G-J1 | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/arctic/vc1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1ty_de85SNldzVJSMQrHwl1ASBdGdSRav | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/aurora4/asr1/local/aurora4_data_prep.sh | http://www.ldc.upenn.edu/Catalog/docs/LDC93S6A/wsj0-train-spkrinfo.txt | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/aurora4/asr1/local/aurora4_data_prep.sh | https://sourceforge.net/projects/kaldi/files/wsj0-train-spkrinfo.txt | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/aurora4/asr1/local/wsj_data_prep.sh | https://catalog.ldc.upenn.edu/docs/LDC93S6A/wsj0-train-spkrinfo.txt | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/aurora4/asr1/local/wsj_data_prep.sh | https://sourceforge.net/projects/kaldi/files/wsj0-train-spkrinfo.txt | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/blizzard17/tts1/local/download.sh | http://data.cstr.ed.ac.uk/blizzard2017-18/usborne/2018/2018_EH1/blizzard_release_2017_v2.zip | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/chime4/asr1/local/clean_wsj0_data_prep.sh | http://www.ldc.upenn.edu/Catalog/docs/LDC93S6A/wsj0-train-spkrinfo.txt | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/chime4/asr1/local/clean_wsj0_data_prep.sh | https://sourceforge.net/projects/kaldi/files/wsj0-train-spkrinfo.txt | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/chime6/asr1/local/generate_chime6_data.sh | https://github.com/chimechallenge/chime6-synchronisation.git | 下载代码 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/chime6/asr1/local/install_pb_chime5.sh | https://github.com/fgnt/pb_chime5.git | 下载代码 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/chime6/asr1/local/prepare_baseline_chime6_data.sh | http://www.openslr.org/resources/28/rirs_noises.zip | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/chime6/asr1/local/prepare_dict.sh | https://svn.code.sf.net/p/cmusphinx/code/trunk/cmudict | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/cmu_indic/tts1/local/data_download.sh | http://festvox.org/h2r_indic/cmu_indic_${spk}.tar.bz2 | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/cmu_indic/tts1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1iXdQv_YGD9VG1dR_xCjSkX6A4HkrpTbF | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/cmu_indic/tts1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1iOwvCx6wX5_qCmHZSX_vCd_ZYn-B5akh | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/cmu_indic/tts1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1rHQMMjkSoiX3JX2e70MKUKSrxHGwhmRb | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/cmu_indic/tts1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1cNrTa8Jxa3AYcap7jo0_RPBapiay3etG | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/cmu_indic/tts1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1zv9GwhhBW32a6RM5wHzjqRxkkv9IrXTL | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/commonvoice/asr1/run.sh | https://voice-prod-bundler-ee1969a6ce8178826482b88e843c335139bd3fb4.s3.amazonaws.com/cv-corpus-5.1-2020-06-22/${lang}.tar.gz | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/covost2/asr1/run.sh | https://voice-prod-bundler-ee1969a6ce8178826482b88e843c335139bd3fb4.s3.amazonaws.com/cv-corpus-4-2019-12-10/${src_lang}.tar.gz | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/covost2/asr1/run.sh | https://dl.fbaipublicfiles.com/covost/covost_v2.${src_lang}_${tgt_lang}.tsv.tar.gz | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/covost2/asr1/run.sh | https://dl.fbaipublicfiles.com/covost/covost2.zip | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/covost2/mt1/run.sh | https://voice-prod-bundler-ee1969a6ce8178826482b88e843c335139bd3fb4.s3.amazonaws.com/cv-corpus-4-2019-12-10/${src_lang}.tar.gz | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/covost2/mt1/run.sh | https://dl.fbaipublicfiles.com/covost/covost_v2.${src_lang}_${tgt_lang}.tsv.tar.gz | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/covost2/mt1/run.sh | https://dl.fbaipublicfiles.com/covost/covost2.zip | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/covost2/st1/run.sh | https://voice-prod-bundler-ee1969a6ce8178826482b88e843c335139bd3fb4.s3.amazonaws.com/cv-corpus-4-2019-12-10/${src_lang}.tar.gz | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/covost2/st1/run.sh | https://dl.fbaipublicfiles.com/covost/covost_v2.${src_lang}_${tgt_lang}.tsv.tar.gz | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/covost2/st1/run.sh | https://dl.fbaipublicfiles.com/covost/covost2.zip | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/csj/align1/local/download_model.sh | https://drive.google.com/open?id=1UqIY6WJMZ4sxNxSugUqp3mrGb3j6h7xe | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/csj/align1/local/download_model.sh | https://drive.google.com/open?id=1cac5Uc09lJrCYfWkLQsF8eapQcxZnYdf | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/csj/align1/local/download_model.sh | https://drive.google.com/open?id=1cVeSOYY1twOfL9Gns7Z3ZDnkrJqNwPow | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/csj/align1/local/download_model.sh | https://drive.google.com/open?id=1zcPglHAKILwVgfACoMWWERiyIquzSYuU | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/csj/align1/local/download_model.sh | https://drive.google.com/open?id=1BtQvAnsFvVi-dp_qsaFP7n4A_5cwnlR6 | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/csj/align1/local/download_model.sh | https://drive.google.com/open?id=17cOOSHHMKI82e1MXj4r2ig8gpGCRmG2p | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/csj/align1/local/download_model.sh | https://drive.google.com/open?id=1tWccl6aYU67kbtkm8jv5H6xayqg1rzjh | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/csj/align1/local/download_model.sh | https://drive.google.com/open?id=120nUQcSsKeY5dpyMWw_kI33ooMRGT2uF | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/csj/align1/local/download_model.sh | https://drive.google.com/open?id=1ALvD4nHan9VDJlYJwNurVr7H7OV0j2X9 | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/csj/align1/local/download_model.sh | https://drive.google.com/open?id=1Az-4H25uwnEFa4lENc-EKiPaWXaijcJp | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/csj/align1/local/download_model.sh | https://drive.google.com/open?id=1jdEKbgWhLTxN_qP4xwE7mTOPmp7Ga--T | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/dipco/asr1/local/download_data.sh | https://s3.amazonaws.com/dipco/DiPCo.tgz | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/dirha_wsj/asr1/local/wsj_data_prep.sh | https://catalog.ldc.upenn.edu/docs/LDC93S6A/wsj0-train-spkrinfo.txt | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/dirha_wsj/asr1/local/wsj_data_prep.sh | https://sourceforge.net/projects/kaldi/files/wsj0-train-spkrinfo.txt | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/fisher_callhome_spanish/st1/local/normalize_trans.sh | https://github.com/joshua-decoder/fisher-callhome-corpus.git | 下载代码 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/fisher_swbd/asr1/local/fisher_swbd_prepare_dict.sh | https://svn.code.sf.net/p/cmusphinx/code/trunk/cmudict | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/fisher_swbd/asr1/local/swbd1_data_download.sh | http://www.openslr.org/resources/5/switchboard_word_alignments.tar.gz | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/fisher_swbd/asr1/local/swbd1_data_download.sh | http://www.isip.piconepress.com/projects/switchboard/releases/switchboard_word_alignments.tar.gz | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/how2/st1/local/data_prep_test.sh | https://islpc21.is.cs.cmu.edu/ramons/iwslt2019.tar.gz | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/iwslt16/mt1/local/download_and_untar.sh | https://wit3.fbk.eu/archive/2016-01/texts/en/de/en-de.tgz | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/iwslt16/mt1/local/train_and_apply_bpe.sh | https://github.com/rsennrich/subword-nmt | 下载依赖 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/iwslt18/st1/local/data_prep_train.sh | https://drive.google.com/open?id=1agQOUEm47LIeLZAFF8RTZ5qx6OsOFGTM | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~mmueller/iwslt-corpus.zip | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/preprocessed/IWSLT-SLT.dev2010.en-de.tgz | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/preprocessed/IWSLT-SLT.tst2010.en-de.tgz | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/preprocessed/IWSLT-SLT.tst2013.en-de.tgz | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/preprocessed/IWSLT-SLT.tst2014.en-de.tgz | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/preprocessed/IWSLT-SLT.tst2015.en-de.tgz | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/preprocessed/IWSLT-SLT.tst2018.en-de.tgz | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/preprocessed/IWSLT-SLT.tst2019.en-de.tgz | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/preprocessed/IWSLT-SLT.tst2020.en-de.tgz | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/iwslt21/asr1/local/data_prep_wmt20.sh | https://www.cs.jhu.edu/~kevinduh/t/iwslt21/wmt20/wmt20-de-en-subset5m.tgz | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/iwslt21/asr1/local/data_prep_wmt20.sh | https://www.cs.jhu.edu/~kevinduh/t/iwslt21/wmt20/wmt20-de-en-subset10m.incl_paracrawl.tgz | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/iwslt21/asr1/local/data_prep_wmt20.sh | https://www.cs.jhu.edu/~kevinduh/t/iwslt21/wmt20/wmt20-de-en-subset20m.incl_paracrawl.tgz | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/iwslt21/asr1/local/data_prep_wmt20.sh | https://github.com/saffsd/langid.py | 下载工具脚本 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/iwslt21/asr1/run.sh | https://www.openslr.org/resources/11/librispeech-lm-norm.txt.gz | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/jesc/mt1/local/download_data.sh | https://nlp.stanford.edu/projects/jesc/data/split.tar.gz | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/jnas/tts1/run.shh | http://kaldi-asr.org/models/8/0008_sitw_v2_1a.tar.gz | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/jsut/tts1/local/download.sh | http://ss-takashi.sakura.ne.jp/corpus/jsut_ver1.1.zip | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/jsut/tts1/local/download.sh | https://github.com/r9y9/jsut-lab | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/jvs/tts1/local/data_download.sh | https://drive.google.com/open?id=19oAw8wWn3Y7z6CKChRdAyGOB9yupL_Xt | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/jvs/tts1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1mEnZfBKqA4eT6Bn0eRZuP6lNzL-IL3VD | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/jvs/tts1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1kp5M4VvmagDmYckFJa78WGqh1drb_P9t | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/libri_css/asr1/diarization/vb_hmm_xvector.sh | https://github.com/desh2608/kaldi-io-for-python.git@vbx | 下载依赖 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/libri_css/asr1/local/data_download.sh | https://docs.google.com/uc?export=download&confirm= | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/libri_css/asr1/local/data_download.sh | https://docs.google.com/uc?export=download&id=1Piioxd5G_85K9Bhcr8ebdhXx0CnaHy7l | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/libri_css/asr1/local/diarize.sh | https://github.com/desh2608/dscore.git | 下载依赖 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/libri_css/asr1/local/download_xvector.sh | http://kaldi-asr.org/models/12/0012_diarization_v1.tar.gz | 下载依赖 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/libri_css/asr1/local/download_xvector.sh | https://desh2608.github.io/static/files/jsalt/plda | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/libri_css/asr1/run.sh | https://drive.google.com/open?id=17cOOSHHMKI82e1MXj4r2ig8gpGCRmG2p | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/librispeech/asr1/run.sh | www.openslr.org/resources/12 | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/librispeech/asr1/run.sh | http://www.openslr.org/resources/11/librispeech-lm-norm.txt.gz | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/libritts/tts1/run.sh | www.openslr.org/resources/60 | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/libritts/tts1/run.sh | http://kaldi-asr.org/models/8/0008_sitw_v2_1a.tar.gz | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/ljspeech/tts1/local/data_download.sh | http://data.keithito.com/data/speech/LJSpeech-1.1.tar.bz2 | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/ljspeech/tts1/local/ob_eval/evaluate_cer.sh | https://drive.google.com/open?id=1BtQvAnsFvVi-dp_qsaFP7n4A_5cwnlR6 | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/m_ailabs/tts1/local/download.sh | http://www.caito.de/data/Training/stt_tts/${lang}.tgz | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/mboshi_french/st1/local/data_prep.sh | https://github.com/besacier/mboshi-french-parallel-corpus | 下载工具脚本 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/mgb2/asr1/local/xml2stm.py | yzhang@qf.org.qa | 邮箱 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/mucs21_subtask1/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Hindi_train.tar.gz | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/mucs21_subtask1/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Marathi_train.tar.gz | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/mucs21_subtask1/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Odia_train.tar.gz | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/mucs21_subtask1/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Hindi_test.tar.gz | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/mucs21_subtask1/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Marathi_test.tar.gz | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/mucs21_subtask1/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Odia_test.tar.gz | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/mucs21_subtask2/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Hindi-English_train.tar.gz | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/mucs21_subtask2/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Bengali-English_train.tar.gz | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/mucs21_subtask2/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Hindi-English_test.tar.gz | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/mucs21_subtask2/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Bengali-English_test.tar.gz | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/must_c/st1/local/download_and_untar.sh | https://drive.google.com/open?id=1Mf2il_VelDIJMSio0bq7I8M9fSs-X4Ie | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/must_c/st1/local/download_and_untar.sh | https://drive.google.com/open?id=14d2ttsuEUFXsxx-KRWJMsFhQGrYOJcpH | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/must_c/st1/local/download_and_untar.sh | https://drive.google.com/open?id=1acIBqcPVX5QXXXV9u8_yDPtCgfsdEJDV | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/must_c/st1/local/download_and_untar.sh | https://drive.google.com/open?id=1qbK88SAKxqjMUybkMeIjrJWnNAZyE8V0 | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/must_c/st1/local/download_and_untar.sh | https://drive.google.com/open?id=11fNraDQs-LiODDxyV5ZW0Slf3XuDq5Cf | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/must_c/st1/local/download_and_untar.sh | https://drive.google.com/open?id=1C5qK1FckA702nsYcXwmGdzlMmHg1F_ot | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/must_c/st1/local/download_and_untar.sh | https://drive.google.com/open?id=1nbdYR5VqcTbLpOB-9cICKCgsLAs7fVzd | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/must_c/st1/local/download_and_untar.sh | https://drive.google.com/open?id=1Z3hSiP7fsR3kf8fjQYzIa07jmw4KXNnw | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/must_c/st1/local/download_and_untar.sh | https://drive.google.com/file/d/1UBPNwFEVhIZCOEpu4hTqPji57XRg85UO | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/puebla_nahuatl/asr1/run.sh | https://www.openslr.org/resources/92/Puebla-Nahuatl-Manifest.tgz | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/puebla_nahuatl/asr1/run.sh | https://www.openslr.org/resources/92/Sound-Files-Puebla-Nahuatl.tgz.part0 | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/puebla_nahuatl/st1/run.sh | https://www.openslr.org/resources/92/Puebla-Nahuatl-Manifest.tgz | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/puebla_nahuatl/st1/run.sh | https://www.openslr.org/resources/92/Sound-Files-Puebla-Nahuatl.tgz.part0 | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/puebla_nahuatl/st1/run.sh | https://www.openslr.org/resources/92/SpeechTranslation_Nahuatl_Manifest.tgz | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/reverb/asr1/local/download_se_eval_tool.sh | https://www.itu.int/rec/dologin_pub.asp?lang=e&id=T-REC-P.862-200102-I!!SOFT-ZST-E&type=items | 下载密钥 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/reverb/asr1/local/download_se_eval_tool.sh | https://reverb2014.dereverberation.com/tools/REVERB-SPEENHA.Release04Oct.zip | 下载工具脚本 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/reverb/asr1/local/download_se_eval_tool.sh | https://github.com/MuSAELab/SRMRToolbox.git | 下载代码 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/reverb/asr1/local/generate_data.sh | http://reverb2014.dereverberation.com/tools/reverb_tools_for_Generate_mcTrainData.tgz | 下载工具脚本 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/reverb/asr1/local/generate_data.sh | http://reverb2014.dereverberation.com/tools/REVERB_TOOLS_FOR_ASR_ver2.0.tgz | 下载工具脚本 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/reverb/asr1/local/generate_data.sh | http://reverb2014.dereverberation.com/tools/taskFiles_et.tgz | 下载工具脚本 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/reverb/asr1_multich/local/download_se_eval_tool.sh | https://www.itu.int/rec/dologin_pub.asp?lang=e&id=T-REC-P.862-200102-I!!SOFT-ZST-E&type=items | 下载密钥 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/reverb/asr1_multich/local/download_se_eval_tool.sh | https://reverb2014.dereverberation.com/tools/REVERB-SPEENHA.Release04Oct.zip | 下载工具脚本 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/reverb/asr1_multich/local/download_se_eval_tool.sh | http://reverb2014.dereverberation.com/tools/taskFiles_et.tgz | 下载工具脚本 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/reverb/asr1_multich/local/download_se_eval_tool.sh | https://github.com/MuSAELab/SRMRToolbox.git | 下载代码 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/ru_open_stt/asr1/local/ru_open_stt_download_data.sh | https://raw.githubusercontent.com/snakers4/open_stt/4bff5470a29dcca5c7175fa3b6fd106c6151b756/${f} | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/ru_open_stt/asr1/local/ru_open_stt_download_data.sh | https://github.com/snakers4/open_stt/releases/download/v0.5-beta/public_exclude_file_v5.tar.gz | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/ru_open_stt/asr1/local/ru_open_stt_download_data.sh | https://github.com/snakers4/open_stt/files/3386441/exclude_df_youtube_1120.zip | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/swbd/asr1/local/swbd1_data_download.sh | http://www.openslr.org/resources/5/switchboard_word_alignments.tar.gz | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/swbd/asr1/local/swbd1_data_download.sh | http://www.isip.piconepress.com/projects/switchboard/releases/switchboard_word_alignments.tar.gz | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/tedlium2/align1/local/download_data.sh | http://www.openslr.org/resources/19/TEDLIUM_release2.tar.gz | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/tedlium2/align1/local/download_model.sh | https://drive.google.com/open?id=1UqIY6WJMZ4sxNxSugUqp3mrGb3j6h7xe | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/tedlium2/align1/local/download_model.sh | https://drive.google.com/open?id=1cac5Uc09lJrCYfWkLQsF8eapQcxZnYdf | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/tedlium2/align1/local/download_model.sh | https://drive.google.com/open?id=1cVeSOYY1twOfL9Gns7Z3ZDnkrJqNwPow | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/tedlium2/align1/local/download_model.sh | https://drive.google.com/open?id=1zcPglHAKILwVgfACoMWWERiyIquzSYuU | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/tedlium2/align1/local/download_model.sh | https://drive.google.com/open?id=1BtQvAnsFvVi-dp_qsaFP7n4A_5cwnlR6 | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/tedlium2/align1/local/download_model.sh | https://drive.google.com/open?id=17cOOSHHMKI82e1MXj4r2ig8gpGCRmG2p | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/tedlium2/align1/local/download_model.sh | https://drive.google.com/open?id=1tWccl6aYU67kbtkm8jv5H6xayqg1rzjh | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/tedlium2/align1/local/download_model.sh | https://drive.google.com/open?id=120nUQcSsKeY5dpyMWw_kI33ooMRGT2uF | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/tedlium2/align1/local/download_model.sh | https://drive.google.com/open?id=1Az-4H25uwnEFa4lENc-EKiPaWXaijcJp | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/tedlium2/align1/local/download_model.sh | https://drive.google.com/open?id=1jdEKbgWhLTxN_qP4xwE7mTOPmp7Ga--T | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/tedlium2/asr1/local/download_data.sh | http://www.openslr.org/resources/19/TEDLIUM_release2.tar.gz | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/tedlium3/asr1/local/download_data.sh | http://www.openslr.org/resources/51/TEDLIUM_release-3.tgz | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/timit_ssc/ssr1/local/ssc_data_prepare.sh | ftp://ftp.espci.fr/pub/sigma/Features/${feat_dir}/ | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/timit_ssc/ssr1/local/ssc_data_prepare.sh | https://ftp.espci.fr/pub/sigma/TIMIT_training/TIMIT_Transcripts.txt | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/timit_ssc/ssr1/local/ssc_data_prepare.sh | https://ftp.espci.fr/pub/sigma/WSJ05K_Test/WSJ0_5K_Transcripts.txt | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/vais1000/tts1/local/download.sh | https://drive.google.com/open?id=1HHhLuYhrkk3J6OJctZvgaSd0UgiaROwG | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/vcc20/tts1_en_de/local/download.sh | http://data.solak.de/data/Training/stt_tts/${lang}.tgz | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/vcc20/tts1_en_de/run.sh | http://kaldi-asr.org/models/8/0008_sitw_v2_1a.tar.gz | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/vcc20/tts1_en_fi/local/download.sh | http://data.solak.de/data/Training/stt_tts/${lang}.tgz | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/vcc20/tts1_en_fi/run.sh | http://kaldi-asr.org/models/8/0008_sitw_v2_1a.tar.gz | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/vcc20/tts1_en_zh/local/download_csmsc.sh | https://weixinxcxdb.oss-cn-beijing.aliyuncs.com/gwYinPinKu/BZNSYP.rar | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/vcc20/tts1_en_zh/local/download_mailabs.sh | http://data.solak.de/data/Training/stt_tts/${lang}.tgz | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/vcc20/tts1_en_zh/run.sh | http://kaldi-asr.org/models/8/0008_sitw_v2_1a.tar.gz | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/vcc20/vc1_task1/local/data_download.sh | https://github.com/nii-yamagishilab/VCC2020-database.git | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1Xj73mDPuuPH8GsyNO8GnOC3mn0_OK4g3 | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1UvtFkqdkE8bOCKWXlEltc746JsCKaTMX | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1XYpBZe9-9AgAxGpKfrgQPDjlW2S6duac | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1E6vzNaXT6r7Zybefat_p9ncnMOQCXtem | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=11qOvuMGP76BEe_pcPgYdqnWi05MIIYTA | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1y6IFgLMatjh9wspwu-oBba-rPOH1zKlS | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=16Q3XOAfI5tG0LZ0SIKE166N3RCtzO722 | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1EET2qhBi6nl0DH7UEg0Ez9SfgDX-cFM- | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1Vd4Qa8Dm9UQ-LZbyNPRiqoSgOZkGQsQi | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1bvyMfA-zKfO2LEogq-QXhHQeETxdBU29 | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1rA9ucA-VvhWkcFsGG6izBt2USOZY1_g6 | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1QfqwnTK0BKO0z_eYqltzL_MeqVGrMiZg | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1kWBYSkvaQ0-7CwOfjVaWQYF0vEm0rNyS | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=13xDOSo53BSQoF1kD27SdwXoAGqtjtIEM | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=11KKux-du6fvsMMB4jNk9YH23YUJjRcDV | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1li9DLZGnAheWZrB4oXGo0KWq-fHuFH_l | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/vcc20/vc1_task1/local/recognize.sh | https://drive.google.com/open?id=1BtQvAnsFvVi-dp_qsaFP7n4A_5cwnlR6 | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/vcc20/vc1_task1/run.sh | http://kaldi-asr.org/models/8/0008_sitw_v2_1a.tar.gz | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/vcc20/vc1_task2/run.sh | http://kaldi-asr.org/models/8/0008_sitw_v2_1a.tar.gz | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/vivos/asr1/run.sh | https://ailab.hcmus.edu.vn/assets/vivos.tar.gz | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/voxforge/asr1/local/getdata.sh | http://www.repository.voxforge1.org/downloads/SpeechCorpus/Trunk/Audio/Main/16kHz_16bit | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/voxforge/asr1/local/getdata.sh | http://www.repository.voxforge1.org/downloads/Dutch/Trunk/Audio/Main/16kHz_16bit | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/voxforge/asr1/local/getdata.sh | http://www.repository.voxforge1.org/downloads/Russian/Trunk/Audio/Main/16kHz_16bit | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/voxforge/asr1/local/getdata.sh | http://www.repository.voxforge1.org/downloads/$lang/Trunk/Audio/Main/16kHz_16bit | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/wsj/asr1/local/wsj_data_prep.sh | https://catalog.ldc.upenn.edu/docs/LDC93S6A/wsj0-train-spkrinfo.txt | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/wsj/asr1/local/wsj_data_prep.sh | https://sourceforge.net/projects/kaldi/files/wsj0-train-spkrinfo.txt | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 |
ESPnet2_for_PyTorch/egs/wsj_mix/asr1/local/wsj_data_prep.sh | https://catalog.ldc.upenn.edu/docs/LDC93S6A/wsj0-train-spkrinfo.txt | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/wsj_mix/asr1/local/wsj_data_prep.sh | https://sourceforge.net/projects/kaldi/files/wsj0-train-spkrinfo.txt | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/wsj_mix/asr1/local/wsj0_create_mixture.sh | http://www.merl.com/demos/deep-clustering/create-speaker-mixtures.zip | 下载工具脚本 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/yesno/asr1/run.sh | http://www.openslr.org/resources/1/waves_yesno.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/yesno/tts1/run.sh | http://www.openslr.org/resources/1/waves_yesno.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/yoloxochitl_mixtec/asr1/run.sh | http://www.openslr.org/resources/89/Yoloxochitl-Mixtec-Data.tgz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs/yoloxochitl_mixtec/asr1/run.sh | http://www.openslr.org/resources/89/Yoloxochitl-Mixtec-Manifest.tgz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/aidatatang_200zh/asr1/local/data.sh | www.openslr.org/resources/62 | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/aishell/asr1/local/data.sh | www.openslr.org/resources/33 | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/aishell3/tts1/local/data.sh | https://www.openslr.org/resources/93/data_aishell3.tgz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/aishell4/asr1/local/data.sh | https://github.com/DanBerrebbi/AISHELL-4.git | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/aishell4/asr1/local/data.sh | https://www.openslr.org/resources/111/$room_name.tar.gz -P ${AISHELL4}/ | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/an4/asr1/local/data.sh | http://www.speech.cs.cmu.edu/databases/an4/ | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/chime4/enh1/local/data.sh | http://spandh.dcs.shef.ac.uk/chime_challenge/CHiME4/download.html | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/cmu_arctic/tts1/local/data_download.sh | http://festvox.org/cmu_arctic/cmu_arctic/packed/cmu_us_${spk}_arctic-0.95-release.tar.bz2 | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/cmu_indic/tts1/local/data_download.sh | http://festvox.org/h2r_indic/cmu_indic_${spk}.tar.bz2 | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/commonvoice/asr1/local/data.sh | https://voice-prod-bundler-ee1969a6ce8178826482b88e843c335139bd3fb4.s3.amazonaws.com/cv-corpus-5.1-2020-06-22/${lang}.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/dirha_wsj/asr1/local/wsj_data_prep_dirha.sh | https://catalog.ldc.upenn.edu/docs/LDC93S6A/wsj0-train-spkrinfo.txt | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/dirha_wsj/asr1/local/wsj_data_prep_dirha.sh | 
https://sourceforge.net/projects/kaldi/files/wsj0-train-spkrinfo.txt | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/gigaspeech/asr1/local/data.sh | https://github.com/SpeechColab/GigaSpeech.git | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/grabo/asr1/local/data.sh | ftp://ftp.esat.kuleuven.be/psi/speech/vrenkens/grabo.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/how2/asr1/local/data.sh | http://islpc21.is.cs.cmu.edu/ramons/iwslt2019.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/hui_acg/tts1/local/data_download.sh | https://opendata.iisys.de/systemintegration/Datasets/HUI-Audio-Corpus-German/dataset_clean | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/iwslt21_low_resource/asr1/local/data.sh | https://zenodo.org/record/4541727/files/asr_train_asr_conformer_raw_ru_bpe100_valid.acc.ave.zip?download=1 | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/iwslt21_low_resource/asr1/local/data.sh | https://zenodo.org/record/5227612/files/swahili-asr-resources.tar.xz?download=1 | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/jdcinal/asr1/local/data.sh | http://tts.speech.cs.cmu.edu/awb/infomation_navigation_and_attentive_listening_0.2.zip | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/jmd/tts1/local/data_download.sh | https://drive.google.com/a/g.sp.m.is.nagoya-u.ac.jp/uc?id=1gacw6Ak6rlEZ_gx9KwafIIfc3dU0EAHW | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/jmd/tts1/local/data_download.sh | https://drive.google.com/a/g.sp.m.is.nagoya-u.ac.jp/uc?id=1mCbmUKVifEEEcm7A3ofqWW7dCqVXGrsh | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/jmd/tts1/local/data_download.sh | https://github.com/takenori-y/JMDComplements | 下载依赖 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/jsss/tts1/local/data_download.sh | https://drive.google.com/a/g.sp.m.is.nagoya-u.ac.jp/uc?id=1NyiZCXkYTdYBNtD1B-IMAYCVa-0SQsKX | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/jsss/tts1/local/data_download.sh | https://github.com/kan-bayashi/JSSSLabel | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/jtubespeech/tts1/local/download.sh | https://drive.google.com/uc?id=1X_harC0e1tjMX1FtCldD67XOysQuq_Ib | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/jv_openslr35/asr1/local/data.sh | https://www.openslr.org/resources/35/asr_javanese_${i}.zip | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/librilight_limited/asr1/local/data.sh | https://dl.fbaipublicfiles.com/librilight/data/librispeech_finetuning.tgz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/librilight_limited/asr1/local/data.sh | www.openslr.org/resources/12 | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/librimix/enh1/local/data.sh | https://github.com/JorisCos/LibriMix | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | 
ESPnet2_for_PyTorch/egs2/librimix/enh1/local/data.sh | https://storage.googleapis.com/whisper-public/wham_noise.zip | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/librispeech/asr1/conf/tuning/train_asr_transformer3_w2v_large_lv60_960h_finetuning_last_1layer.yaml | https://dl.fbaipublicfiles.com/fairseq/wav2vec/wav2vec2_vox_960h_new.pt | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/librispeech/asr1/local/data.sh | www.openslr.org/resources/12 | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/librispeech/asr1/local/data.sh | http://www.openslr.org/resources/11/librispeech-lm-norm.txt.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/libritts/tts1/local/data.sh | www.openslr.org/resources/60 | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/libritts/tts1/local/data.sh | https://github.com/kan-bayashi/LibriTTSCorpusLabel.git | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/lrs2/lipreading1/local/data.sh | https://zenodo.org/record/5090353/files/lipread_lrw_pretrain.pt.tgz | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/mini_librispeech/diar1/local/data.sh | https://drive.google.com/a/g.sp.m.is.nagoya-u.ac.jp/uc?id=1gacw6Ak6rlEZ_gx9KwafIIfc3dU0EAHW | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/mini_librispeech/diar1/local/data.sh | http://www.openslr.org/resources/26/sim_rir_8k.zip | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/mini_librispeech/diar1/local/data.sh | https://www.openslr.org/resources/17/ | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/misp2021/avsr1/local/data.sh | https://github.com/mispchallenge/misp2021_baseline.git | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/mls/asr1/local/data.sh | https://dl.fbaipublicfiles.com/mls/mls_${download_id}.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/mls/asr1/local/data.sh | https://dl.fbaipublicfiles.com/mls/mls_lm_${download_id}.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/nsc/asr1/local/data.sh | https://github.com/pzelasko/Praat-textgrids | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/open_li52/asr1/local/data.sh | https://voice-prod-bundler-ee1969a6ce8178826482b88e843c335139bd3fb4.s3.amazonaws.com/cv-corpus-5.1-2020-06-22/${lang}.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/open_li52/asr1/local/getdata.sh | http://www.repository.voxforge1.org/downloads/SpeechCorpus/Trunk/Audio/Main/16kHz_16bit | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/open_li52/asr1/local/getdata.sh | http://www.repository.voxforge1.org/downloads/Dutch/Trunk/Audio/Main/16kHz_16bit | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/open_li52/asr1/local/getdata.sh | http://www.repository.voxforge1.org/downloads/Russian/Trunk/Audio/Main/16kHz_16bit | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | 
ESPnet2_for_PyTorch/egs2/open_li52/asr1/local/getdata.sh | http://www.repository.voxforge1.org/downloads/$lang/Trunk/Audio/Main/16kHz_16bit | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/primewords_chinese/asr1/local/data.sh | https://www.openslr.org/resources/47/primewords_md_2018_set1.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/puebla_nahuatl/asr1/local/data.sh | https://www.openslr.org/resources/92/Puebla-Nahuatl-Manifest.tgz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/puebla_nahuatl/asr1/local/data.sh | https://www.openslr.org/resources/92/Sound-Files-Puebla-Nahuatl.tgz.part0 | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/puebla_nahuatl/asr1/local/data.sh | https://github.com/ftshijt/Puebla_Nahuatl_Split.git | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/reverb/asr1/local/prepare_rir_noise_1ch.sh | http://reverb2014.dereverberation.com/tools/REVERB_TOOLS_FOR_ASR_ver2.0.tgz | 下载工具脚本 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/ruslan/tts1/local/data_download.sh | https://drive.google.com/uc?id=1Y6vv--gcDx-S8DieSGaD7WnB86kZLgc_ | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/ruslan/tts1/local/data_download.sh | https://drive.google.com/uc?id=11TD_ZwIOo-Wo75GYv-OWWOS3ABmwmAdK | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/siwis/tts1/local/data_download.sh | https://datashare.ed.ac.uk/download/DS_10283_2353.zip | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/sms_wsj/enh1/local/create_database.sh | https://zenodo.org/record/3517889/files/sms_wsj.tar.gz.parta{a,b,c,d,e} | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/sms_wsj/enh1/local/data.sh | https://github.com/fgnt/sms_wsj.git | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/sms_wsj/enh1/local/data.sh | https://github.com/boeddeker/rir-generator.git | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/speechcommands/asr1/local/data.sh | http://download.tensorflow.org/data/speech_commands_v0.02.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/speechcommands/asr1/local/data.sh | http://download.tensorflow.org/data/speech_commands_test_set_v0.02.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/su_openslr36/asr1/local/data.sh | https://www.openslr.org/resources/36/asr_sundanese_${i}.zip | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/swbd_da/asr1/local/data_prep.py | http://nite.sourceforge.net/ | 下载配置 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/tedlium2/asr1/local/download_data.sh | http://www.openslr.org/resources/19/TEDLIUM_release2.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/TEMPLATE/asr1/asr.sh | https://huggingface.co/${hf_repo} ${dir_repo} | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/TEMPLATE/asr1/scripts/utils/upload_models_to_hub.sh | 
https://huggingface.co/espnet/${repo_name} | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/TEMPLATE/diar1/diar.sh | https://huggingface.co/${hf_repo} ${dir_repo} | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/TEMPLATE/enh1/enh.sh | https://huggingface.co/${hf_repo} ${dir_repo} | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/TEMPLATE/ssl1/scripts/km.sh | https://dl.fbaipublicfiles.com/hubert/hubert_base_ls960.pt | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/TEMPLATE/tts1/tts.sh | http://kaldi-asr.org/models/8/0008_sitw_v2_1a.tar.gz | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/TEMPLATE/tts1/tts.sh | https://huggingface.co/${hf_repo} ${dir_repo} | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/thchs30/asr1/local/data.sh | https://www.openslr.org/resources/18/data_thchs30.tgz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/thchs30/asr1/local/data.sh | https://www.openslr.org/resources/18/data_thchs30.tgz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/totonac/asr1/local/data.sh | https://www.openslr.org/resources/107/Amith-Lopez_Totonac-recordings-northern-Puebla-and-adjacent-Veracruz_Metadata.xml | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/totonac/asr1/local/data.sh | https://www.openslr.org/resources/107/Totonac_Corpus.tgz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/totonac/asr1/local/data.sh | https://github.com/ftshijt/Totonac_Split.git | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/tsukuyomi/tts1/local/data_download.sh | https://tyc.rei-yumesaki.net/files/sozai-tyc-corpus1.zip | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/vctk/tts1/local/data_download.sh | http://www.udialogue.org/download/VCTK-Corpus.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/vctk/tts1/local/data_download.sh | https://github.com/kan-bayashi/VCTKCorpusFullContextLabel.git | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/vivos/asr1/local/data.sh | https://ailab.hcmus.edu.vn/assets/vivos.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/wham/enh1/local/wham_create_mixture.sh | https://storage.googleapis.com/whisper-public/wham_noise.zip | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/wham/enh1/local/wham_create_mixture.sh | https://storage.googleapis.com/whisper-public/wham_scripts.tar.gz | 下载工具脚本 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/whamr/enh1/local/whamr_create_mixture.sh | https://storage.googleapis.com/whisper-public/wham_noise.zip | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/whamr/enh1/local/whamr_create_mixture.sh | https://storage.googleapis.com/whisper-public/whamr_scripts.tar.gz | 下载工具脚本 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/wsj0_2mix/enh1/local/wsj0_create_mixture.sh | 
http://www.merl.com/demos/deep-clustering/create-speaker-mixtures.zip | 下载工具脚本 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/wsj0_2mix_spatialized/enh1/local/spatialize_wsj0_mix.sh | https://www.merl.com/demos/deep-clustering/spatialize_wsj0-mix.zip | 下载工具脚本 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/wsj0_2mix_spatialized/enh1/local/spatialize_wsj0_mix.sh | https://github.com/ehabets/RIR-Generator | 下载工具脚本 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/yesno/asr1/local/data.sh | http://www.openslr.org/resources/1/waves_yesno.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/yoloxochitl_mixtec/asr1/local/data.sh | http://www.openslr.org/resources/89/Yoloxochitl-Mixtec-Data.tgz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/yoloxochitl_mixtec/asr1/local/data.sh | http://www.openslr.org/resources/89/Yoloxochitl-Mixtec-Manifest.tgz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/egs2/zeroth_korean/asr1/local/download_and_untar.sh | http://www.openslr.org/resources/40/zeroth_korean.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/espnet2/asr/encoder/wav2vec2_encoder.py | https://dl.fbaipublicfiles.com/fairseq/wav2vec/dict.ltr.txt | 下载配置 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/setup.py | http://github.com/espnet/espnet | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/setup.py | shinjiw@ieee.org | 邮箱 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/tools/installers/install_chainer.sh | https://github.com/chainer/chainer | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/tools/installers/install_chainer_ctc.shh | https://github.com/jheymann85/chainer_ctc.git | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/tools/installers/install_fairseq.sh | https://github.com/pytorch/fairseq.git | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/tools/installers/install_k2.sh | https://k2-fsa.org/nightly/ | 下载依赖 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/tools/installers/install_kenlm.sh | https://github.com/kpu/kenlm.git | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/tools/installers/install_mwerSegmenter.sh | https://www-i6.informatik.rwth-aachen.de/web/Software/mwerSegmenter.tar.gz | 下载第三方库 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/tools/installers/install_nkf.sh | https://ja.osdn.net/dl/nkf/nkf-2.1.4.tar.gz | 下载第三方库 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/tools/installers/install_pesq.sh | http://www.itu.int/rec/dologin_pub.asp?lang=e&id=T-REC-P.862-200511-I!Amd2!SOFT-ZST-E&type=items | 下载第三方库 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/tools/installers/install_phonemizer.sh | https://github.com/festvox/speech_tools.git | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/tools/installers/install_phonemizer.sh | https://github.com/festvox/festival.git | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | 
ESPnet2_for_PyTorch/tools/installers/install_phonemizer.sh | https://github.com/espeak-ng/espeak-ng.git | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/tools/installers/install_phonemizer.sh | https://github.com/numediart/MBROLA.git | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/tools/installers/install_py3mmseg.sh | https://github.com/kamo-naoyuki/py3mmseg | 下载工具脚本 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/tools/installers/install_s3prl.sh | https://github.com/s3prl/s3prl.git | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/tools/installers/install_sctk.sh | https://github.com/espnet/kaldi-bin/releases/download/v0.0.2/sctk-2.4.10-20151007-1312Z.tar.bz2 | 下载依赖 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/tools/installers/install_sctk.sh | http://www.openslr.org/resources/4/sctk-2.4.10-20151007-1312Z.tar.bz2 | 下载依赖 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/tools/installers/install_sctk.sh | ftp://jaguar.ncsl.nist.gov/pub/sctk-2.4.10-20151007-1312Z.tar.bz2 | 下载依赖 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/tools/installers/install_sph2pipe.sh | https://github.com/espnet/kaldi-bin/releases/download/v0.0.2/sph2pipe_v2.5.tar.gz | 下载依赖 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/tools/installers/install_sph2pipe.sh | http://www.openslr.org/resources/3/sph2pipe_v2.5.tar.gz | 下载依赖 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/tools/installers/install_sph2pipe.sh | https://sourceforge.net/projects/kaldi/files/sph2pipe_v2.5.tar.gz | 下载依赖 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/tools/installers/install_tdmelodic_pyopenjtalk.sh | https://github.com/sarulab-speech/tdmelodic_openjtalk.git | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/tools/installers/install_tdmelodic_pyopenjtalk.sh | https://github.com/r9y9/pyopenjtalk.git | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/tools/installers/install_torch.sh | https://download.pytorch.org/whl/torch_stable.html | 下载第三方库 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/tools/installers/install_warp-ctc.sh | https://github.com/espnet/warp-ctc/releases/tag/v${warpctc_version} | 下载依赖 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/tools/installers/install_warp-ctc.sh | https://github.com/espnet/warp-ctc.git | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/tools/installers/install_warp-transducer.sh | https://github.com/b-flo/warp-transducer.git | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/tools/Makefile | https://github.com/kaldi-asr/kaldi | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/tools/Makefile | https://github.com/moses-smt/mosesdecoder.git | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/tools/setup_anaconda.sh | https://repo.continuum.io/miniconda/Miniconda3-latest-Linux-x86_64.sh | 下载工具脚本 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/utils/asr_align_wav.sh | 
https://drive.google.com/open?id=1UqIY6WJMZ4sxNxSugUqp3mrGb3j6h7xe | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/utils/asr_align_wav.sh | https://drive.google.com/open?id=1cac5Uc09lJrCYfWkLQsF8eapQcxZnYdf | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/utils/asr_align_wav.sh | https://drive.google.com/open?id=1cVeSOYY1twOfL9Gns7Z3ZDnkrJqNwPow | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/utils/asr_align_wav.sh | https://drive.google.com/open?id=1zcPglHAKILwVgfACoMWWERiyIquzSYuU | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/utils/asr_align_wav.sh | https://drive.google.com/open?id=1BtQvAnsFvVi-dp_qsaFP7n4A_5cwnlR6 | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/utils/asr_align_wav.sh | https://drive.google.com/open?id=17cOOSHHMKI82e1MXj4r2ig8gpGCRmG2p | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/utils/asr_align_wav.sh | https://drive.google.com/open?id=1tWccl6aYU67kbtkm8jv5H6xayqg1rzjh | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/utils/asr_align_wav.sh | https://drive.google.com/open?id=120nUQcSsKeY5dpyMWw_kI33ooMRGT2uF | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/utils/asr_align_wav.sh | https://drive.google.com/open?id=1ALvD4nHan9VDJlYJwNurVr7H7OV0j2X9 | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/utils/asr_align_wav.sh | https://drive.google.com/open?id=1Az-4H25uwnEFa4lENc-EKiPaWXaijcJp | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/utils/asr_align_wav.sh | https://drive.google.com/open?id=1jdEKbgWhLTxN_qP4xwE7mTOPmp7Ga--T | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/utils/gdown.pl | https://docs.google.com/uc?id=$1&export=download | 下载依赖 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/utils/gdown.pl | https://docs.google.com | 前置网址 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/utils/recog_wav.sh | https://drive.google.com/open?id=1UqIY6WJMZ4sxNxSugUqp3mrGb3j6h7xe | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/utils/recog_wav.sh | https://drive.google.com/open?id=1cac5Uc09lJrCYfWkLQsF8eapQcxZnYdf | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/utils/recog_wav.sh | https://drive.google.com/open?id=1cVeSOYY1twOfL9Gns7Z3ZDnkrJqNwPow | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/utils/recog_wav.sh | https://drive.google.com/open?id=1zcPglHAKILwVgfACoMWWERiyIquzSYuU | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/utils/recog_wav.sh | https://drive.google.com/open?id=1BtQvAnsFvVi-dp_qsaFP7n4A_5cwnlR6 | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/utils/recog_wav.sh | https://drive.google.com/open?id=17cOOSHHMKI82e1MXj4r2ig8gpGCRmG2p | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/utils/recog_wav.sh | https://drive.google.com/open?id=1tWccl6aYU67kbtkm8jv5H6xayqg1rzjh | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | 
ESPnet2_for_PyTorch/utils/recog_wav.sh | https://drive.google.com/open?id=120nUQcSsKeY5dpyMWw_kI33ooMRGT2uF | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1dKzdaDpOkpx7kWZnvrvx2De7eZEdPHZs | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=11T9qw8rJlYzUdXvFjkjQjYrp3iGfQ15h | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1hiZn14ITUDM1nkn-GkaN_M3oaTOUcn1n | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=13DR-RB5wrbMqBGx_MC655VZlsEq52DyS | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1xxAwPuUph23RnlC5gym7qDM02ZCW9Unp | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1M_w7nxI6AfbtSHpMO-exILnAc_aUYvXP | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=17RUNFLP4SSTbGA01xWRJo7RkR876xM0i | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1zD-2GMrWM3thaDpS3h3rkTU4jIC0wc5B | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1W86YEQ6KbuUTIvVURLqKtSNqe_eI2GDN | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1iAXwC0AuWusa9AcFeUVkcNLG0I-hnSr3 | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1Xj73mDPuuPH8GsyNO8GnOC3mn0_OK4g3 | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1mEnZfBKqA4eT6Bn0eRZuP6lNzL-IL3VD | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1kp5M4VvmagDmYckFJa78WGqh1drb_P9t | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1bTSygvonv5TS6-iuYsOIUWpN2atGnyhZ | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1T8thxkAxjGFPXPWPTcKLvHnd6lG0-82R | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1eA1VcRS9jzFa-DovyTgJLQ_jmwOLIi8L | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1sY7gEUg39QaO1szuN62-Llst9TrFno2t | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1tv9GKyRT4CDsvUWKwH3s_OfXkiTi0gw7 | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1jHUUmQFjWiQGyDd7ZeiCThSjjpbF_B4h | 
下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=187xvyNbmJVZ0EZ1XHCdyjZHTXK9EcfkK | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1OwrUQzAmvjj1x9cDhnZPp6dqtsEqGEJM | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1PsjFRV5eUP0HHwBaRYya9smKy5ghXKzj | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=10M6H88jEUGbRWBmU1Ff2VaTmOAeL8CEy | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/utils/synth_wav.sh | http://kaldi-asr.org/models/8/0008_sitw_v2_1a.tar.gz | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/utils/synth_wav.sh | https://github.com/r9y9/wavenet_vocoder "${MDN_WAVENET_VOC_DIR}" | 下载依赖 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/utils/translate_wav.sh | https://drive.google.com/open?id=1wFIAqxoBUioTKTLRLv29KzvphkUm3qdo | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet2_for_PyTorch/utils/translate_wav.sh | https://drive.google.com/open?id=1hawp5ZLw4_SIHIT3edglxbKIIkPVe8n3 | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/utils/translate_wav.sh | ESPnet2_for_PyTorch/utils/translate_wav.sh | https://drive.google.com/open?id=1wFIAqxoBUioTKTLRLv29KzvphkUm3qdo | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/fisher_callhome_spanish/st1/RESULTS.md | ESPnet2_for_PyTorch/utils/translate_wav.sh | https://drive.google.com/open?id=1hawp5ZLw4_SIHIT3edglxbKIIkPVe8n3 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/utils/synth_wav.sh | ESPnet2_for_PyTorch/utils/synth_wav.sh | https://github.com/espnet/espnet#tts-demo | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/README.md | ESPnet2_for_PyTorch/utils/synth_wav.sh | https://colab.research.google.com/github/espnet/notebook/blob/master/tts_realtime_demo.ipynb | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/ljspeech/tts1/RESULTS.md | ESPnet2_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1dKzdaDpOkpx7kWZnvrvx2De7eZEdPHZs | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/ljspeech/tts1/RESULTS.md | ESPnet2_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=11T9qw8rJlYzUdXvFjkjQjYrp3iGfQ15h | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/ljspeech/tts1/RESULTS.md | ESPnet2_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1hiZn14ITUDM1nkn-GkaN_M3oaTOUcn1n | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/ljspeech/tts1/RESULTS.md | ESPnet2_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=13DR-RB5wrbMqBGx_MC655VZlsEq52DyS | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/ljspeech/tts1/RESULTS.md | ESPnet2_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1xxAwPuUph23RnlC5gym7qDM02ZCW9Unp | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/ljspeech/tts1/RESULTS.md | ESPnet2_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1M_w7nxI6AfbtSHpMO-exILnAc_aUYvXP | 模型相关说明 | -| 开源代码引入 | 
https://github.com/espnet/espnet/blob/master/egs/ljspeech/tts1/RESULTS.md | ESPnet2_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=17RUNFLP4SSTbGA01xWRJo7RkR876xM0i | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/ljspeech/tts1/RESULTS.md | ESPnet2_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1zD-2GMrWM3thaDpS3h3rkTU4jIC0wc5B | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/ljspeech/tts1/RESULTS.md | ESPnet2_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1W86YEQ6KbuUTIvVURLqKtSNqe_eI2GDN | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libritts/tts1/RESULTS.md | ESPnet2_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1iAXwC0AuWusa9AcFeUVkcNLG0I-hnSr3 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libritts/tts1/RESULTS.md | ESPnet2_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1Xj73mDPuuPH8GsyNO8GnOC3mn0_OK4g3 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/jvs/tts1/local/pretrained_model_download.sh | ESPnet2_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1mEnZfBKqA4eT6Bn0eRZuP6lNzL-IL3VD | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/jvs/tts1/local/pretrained_model_download.sh | ESPnet2_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1kp5M4VvmagDmYckFJa78WGqh1drb_P9t | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/csmsc/tts1/RESULTS.md | ESPnet2_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1bTSygvonv5TS6-iuYsOIUWpN2atGnyhZ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/csmsc/tts1/RESULTS.md | ESPnet2_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1T8thxkAxjGFPXPWPTcKLvHnd6lG0-82R | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/README.md | ESPnet2_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1eA1VcRS9jzFa-DovyTgJLQ_jmwOLIi8L | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/README.md | ESPnet2_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1sY7gEUg39QaO1szuN62-Llst9TrFno2t | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/README.md | ESPnet2_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1tv9GKyRT4CDsvUWKwH3s_OfXkiTi0gw7 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/README.md | ESPnet2_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1jHUUmQFjWiQGyDd7ZeiCThSjjpbF_B4h | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/README.md | ESPnet2_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=187xvyNbmJVZ0EZ1XHCdyjZHTXK9EcfkK | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/README.md | ESPnet2_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1OwrUQzAmvjj1x9cDhnZPp6dqtsEqGEJM | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/README.md | ESPnet2_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1PsjFRV5eUP0HHwBaRYya9smKy5ghXKzj | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/README.md | ESPnet2_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=10M6H88jEUGbRWBmU1Ff2VaTmOAeL8CEy | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/tts.sh | ESPnet2_for_PyTorch/utils/synth_wav.sh | 
http://kaldi-asr.org/models/8/0008_sitw_v2_1a.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/README.md | ESPnet2_for_PyTorch/utils/synth_wav.sh | https://github.com/r9y9/wavenet_vocoder | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/README.md | ESPnet2_for_PyTorch/utils/synth_wav.sh | https://github.com/r9y9/wavenet_vocoder | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/utils/spm_encode | ESPnet2_for_PyTorch/utils/spm_train | https://github.com/pytorch/fairseq/blob/master/LICENSE | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/utils/spm_encode | ESPnet2_for_PyTorch/utils/spm_encode | https://github.com/pytorch/fairseq/blob/master/LICENSE | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/utils/spm_encode | ESPnet2_for_PyTorch/utils/spm_decode | https://github.com/pytorch/fairseq/blob/master/LICENSE | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/tedlium2/align1/local/download_model.sh | ESPnet2_for_PyTorch/utils/recog_wav.sh | https://drive.google.com/open?id=1UqIY6WJMZ4sxNxSugUqp3mrGb3j6h7xe | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/tedlium2/align1/local/download_model.sh | ESPnet2_for_PyTorch/utils/recog_wav.sh | https://drive.google.com/open?id=1cac5Uc09lJrCYfWkLQsF8eapQcxZnYdf | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/tedlium2/asr1/RESULTS.md | ESPnet2_for_PyTorch/utils/recog_wav.sh | https://drive.google.com/open?id=1cVeSOYY1twOfL9Gns7Z3ZDnkrJqNwPow | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/tedlium2/align1/local/download_model.sh | ESPnet2_for_PyTorch/utils/recog_wav.sh | https://drive.google.com/open?id=1zcPglHAKILwVgfACoMWWERiyIquzSYuU | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/tedlium2/align1/local/download_model.sh | ESPnet2_for_PyTorch/utils/recog_wav.sh | https://drive.google.com/open?id=1BtQvAnsFvVi-dp_qsaFP7n4A_5cwnlR6 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/tedlium2/align1/local/download_model.sh | ESPnet2_for_PyTorch/utils/recog_wav.sh | https://drive.google.com/open?id=17cOOSHHMKI82e1MXj4r2ig8gpGCRmG2p | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/tedlium2/align1/local/download_model.sh | ESPnet2_for_PyTorch/utils/recog_wav.sh | https://drive.google.com/open?id=1tWccl6aYU67kbtkm8jv5H6xayqg1rzjh | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/tedlium2/align1/local/download_model.sh | ESPnet2_for_PyTorch/utils/recog_wav.sh | https://drive.google.com/open?id=120nUQcSsKeY5dpyMWw_kI33ooMRGT2uF | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/utils/pack_model.sh | ESPnet2_for_PyTorch/utils/pack_model.sh | shinjiw@ieee.org | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/utils/json2trn.py | ESPnet2_for_PyTorch/utils/json2trn_wo_dict.py | https://github.com/espnet/espnet/issues/993 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/utils/json2trn.py | ESPnet2_for_PyTorch/utils/json2trn.py | https://github.com/espnet/espnet/issues/993 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/wavenet.py | ESPnet2_for_PyTorch/utils/generate_wav_from_fbank.py | https://github.com/kan-bayashi/PytorchWaveNetVocoder | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/utils/generate_wav_from_fbank.py | ESPnet2_for_PyTorch/utils/generate_wav_from_fbank.py | 
https://ieeexplore.ieee.org/abstract/document/8461332 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/utils/gdown.pl | ESPnet2_for_PyTorch/utils/gdown.pl | https://docs.google.com/uc?id= | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/utils/gdown.pl | ESPnet2_for_PyTorch/utils/gdown.pl | https://docs.google.com/uc?id= | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_css/asr1/local/data_download.sh | ESPnet2_for_PyTorch/utils/gdown.pl | https://docs.google.com | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/utils/eval-source-separation.py | ESPnet2_for_PyTorch/utils/eval-source-separation.py | https://ieeexplore.ieee.org/document/5495701 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/utils/eval-source-separation.py | ESPnet2_for_PyTorch/utils/eval-source-separation.py | http://www.itu.int/rec/dologin_pub.asp?lang=e&id=T-REC-P.862-200511-I!Amd2!SOFT-ZST-E&type=items | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/utils/eval-source-separation.py | ESPnet2_for_PyTorch/utils/eval-source-separation.py | https://ieeexplore.ieee.org/document/941023 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/utils/eval-source-separation.py | ESPnet2_for_PyTorch/utils/eval-source-separation.py | https://arxiv.org/abs/1804.06267 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/utils/download_from_google_drive.sh | ESPnet2_for_PyTorch/utils/download_from_google_drive.sh | https://drive.google.com/open?id=1zF88bRNbJhw9hNBq3NrDg8vnGGibREmg | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/utils/download_from_google_drive.sh | ESPnet2_for_PyTorch/utils/download_from_google_drive.sh | https://qiita.com/namakemono/items/c963e75e0af3f7eed732 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/utils/download_from_google_drive.sh | ESPnet2_for_PyTorch/utils/download_from_google_drive.sh | https://github.com/wkentaro/gdown | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/tedlium2/align1/local/download_model.sh | ESPnet2_for_PyTorch/utils/asr_align_wav.sh | https://drive.google.com/open?id=1UqIY6WJMZ4sxNxSugUqp3mrGb3j6h7xe | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/tedlium2/align1/local/download_model.sh | ESPnet2_for_PyTorch/utils/asr_align_wav.sh | https://drive.google.com/open?id=1cac5Uc09lJrCYfWkLQsF8eapQcxZnYdf | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/tedlium2/asr1/RESULTS.md | ESPnet2_for_PyTorch/utils/asr_align_wav.sh | https://drive.google.com/open?id=1cVeSOYY1twOfL9Gns7Z3ZDnkrJqNwPow | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/tedlium2/align1/local/download_model.sh | ESPnet2_for_PyTorch/utils/asr_align_wav.sh | https://drive.google.com/open?id=1zcPglHAKILwVgfACoMWWERiyIquzSYuU | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/tedlium2/align1/local/download_model.sh | ESPnet2_for_PyTorch/utils/asr_align_wav.sh | https://drive.google.com/open?id=1BtQvAnsFvVi-dp_qsaFP7n4A_5cwnlR6 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/tedlium2/align1/local/download_model.sh | ESPnet2_for_PyTorch/utils/asr_align_wav.sh | https://drive.google.com/open?id=17cOOSHHMKI82e1MXj4r2ig8gpGCRmG2p | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/tedlium2/align1/local/download_model.sh | ESPnet2_for_PyTorch/utils/asr_align_wav.sh | 
https://drive.google.com/open?id=1tWccl6aYU67kbtkm8jv5H6xayqg1rzjh | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/tedlium2/align1/local/download_model.sh | ESPnet2_for_PyTorch/utils/asr_align_wav.sh | https://drive.google.com/open?id=120nUQcSsKeY5dpyMWw_kI33ooMRGT2uF | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/csj/asr1/RESULTS.md | ESPnet2_for_PyTorch/utils/asr_align_wav.sh | https://drive.google.com/open?id=1ALvD4nHan9VDJlYJwNurVr7H7OV0j2X9 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/tedlium2/align1/local/download_model.sh | ESPnet2_for_PyTorch/utils/asr_align_wav.sh | https://drive.google.com/open?id=1Az-4H25uwnEFa4lENc-EKiPaWXaijcJp | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/tedlium2/align1/local/download_model.sh | ESPnet2_for_PyTorch/utils/asr_align_wav.sh | https://drive.google.com/open?id=1jdEKbgWhLTxN_qP4xwE7mTOPmp7Ga--T | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/runtime/Dockerfile | ESPnet2_for_PyTorch/tools/setup_anaconda.sh | https://repo.continuum.io/miniconda/Miniconda3-latest-Linux-x86_64.sh | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/ksponspeech/asr1/local/get_space_normalized_hyps.py | ESPnet2_for_PyTorch/tools/Makefile | https://github.com/kaldi-asr/kaldi | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/tools/Makefile | ESPnet2_for_PyTorch/tools/Makefile | https://github.com/moses-smt/mosesdecoder.git | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/tools/installers/install_warp-transducer.sh | ESPnet2_for_PyTorch/tools/installers/install_warp-transducer.sh | https://github.com/b-flo/warp-transducer.git | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/tools/installers/install_warp-ctc.sh | ESPnet2_for_PyTorch/tools/installers/install_warp-ctc.sh | https://github.com/espnet/warp-ctc/releases/tag/v | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/tools/installers/install_warp-ctc.sh | ESPnet2_for_PyTorch/tools/installers/install_warp-ctc.sh | https://github.com/espnet/warp-ctc.git | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/tools/installers/install_torch.sh | ESPnet2_for_PyTorch/tools/installers/install_torch.sh | https://anaconda.org/anaconda/cudatoolkit/files | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/tools/installers/install_torch.sh | ESPnet2_for_PyTorch/tools/installers/install_torch.sh | https://anaconda.org/nvidia/cudatoolkit/files | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/tools/installers/install_torch.sh | ESPnet2_for_PyTorch/tools/installers/install_torch.sh | https://anaconda.org/conda-forge/cudatoolkit/files | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/tools/installers/install_torch.sh | ESPnet2_for_PyTorch/tools/installers/install_torch.sh | https://download.pytorch.org/whl/torch_stable.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/tools/installers/install_torch.sh | ESPnet2_for_PyTorch/tools/installers/install_torch.sh | https://download.pytorch.org/whl/torch_stable.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/tools/installers/install_torch.sh | ESPnet2_for_PyTorch/tools/installers/install_torch.sh | https://download.pytorch.org/whl/torch_stable.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/tools/installers/install_torch.sh | 
ESPnet2_for_PyTorch/tools/installers/install_torch.sh | https://download.pytorch.org/whl/torch_stable.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/tools/installers/install_torch.sh | ESPnet2_for_PyTorch/tools/installers/install_torch.sh | https://download.pytorch.org/whl/torch_stable.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/tools/installers/install_torch.sh | ESPnet2_for_PyTorch/tools/installers/install_torch.sh | https://download.pytorch.org/whl/torch_stable.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/tools/installers/install_torch.sh | ESPnet2_for_PyTorch/tools/installers/install_torch.sh | https://download.pytorch.org/whl/torch_stable.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/tools/installers/install_torch.sh | ESPnet2_for_PyTorch/tools/installers/install_torch.sh | https://download.pytorch.org/whl/torch_stable.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/tools/installers/install_torch.sh | ESPnet2_for_PyTorch/tools/installers/install_torch.sh | https://anaconda.org/pytorch/pytorch/files | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/tools/installers/install_tdmelodic_pyopenjtalk.sh | ESPnet2_for_PyTorch/tools/installers/install_tdmelodic_pyopenjtalk.sh | https://github.com/sarulab-speech/tdmelodic_openjtalk.git | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/tools/installers/install_tdmelodic_pyopenjtalk.sh | ESPnet2_for_PyTorch/tools/installers/install_tdmelodic_pyopenjtalk.sh | https://github.com/r9y9/pyopenjtalk.git | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/tools/installers/install_sph2pipe.sh | ESPnet2_for_PyTorch/tools/installers/install_sph2pipe.sh | https://github.com/espnet/kaldi-bin/releases/download/v0.0.2/sph2pipe_v2.5.tar.gz | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/tools/installers/install_sph2pipe.sh | ESPnet2_for_PyTorch/tools/installers/install_sph2pipe.sh | http://www.openslr.org/resources/3/sph2pipe_v2.5.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/tools/installers/install_sph2pipe.sh | ESPnet2_for_PyTorch/tools/installers/install_sph2pipe.sh | https://sourceforge.net/projects/kaldi/files/sph2pipe_v2.5.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/tools/installers/install_sctk.sh | ESPnet2_for_PyTorch/tools/installers/install_sctk.sh | https://github.com/espnet/kaldi-bin/releases/download/v0.0.2/sctk-2.4.10-20151007-1312Z.tar.bz2 | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/tools/installers/install_sctk.sh | ESPnet2_for_PyTorch/tools/installers/install_sctk.sh | ftp://jaguar.ncsl.nist.gov/pub/sctk-2.4.10-20151007-1312Z.tar.bz2|| | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/tools/installers/install_sctk.sh | ESPnet2_for_PyTorch/tools/installers/install_sctk.sh | http://www.openslr.org/resources/4/sctk-2.4.10-20151007-1312Z.tar.bz2 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/tools/installers/install_s3prl.sh | ESPnet2_for_PyTorch/tools/installers/install_s3prl.sh | https://github.com/s3prl/s3prl.git | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/tools/installers/install_py3mmseg.sh | ESPnet2_for_PyTorch/tools/installers/install_py3mmseg.sh | https://github.com/kamo-naoyuki/py3mmseg | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/tools/installers/install_phonemizer.sh | 
ESPnet2_for_PyTorch/tools/installers/install_phonemizer.sh | https://github.com/festvox/speech_tools.git | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/tools/installers/install_phonemizer.sh | ESPnet2_for_PyTorch/tools/installers/install_phonemizer.sh | https://github.com/festvox/festival.git | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/tools/installers/install_phonemizer.sh | ESPnet2_for_PyTorch/tools/installers/install_phonemizer.sh | https://github.com/espeak-ng/espeak-ng.git | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/tools/installers/install_phonemizer.sh | ESPnet2_for_PyTorch/tools/installers/install_phonemizer.sh | https://github.com/numediart/MBROLA.git | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/utils/eval-source-separation.py | ESPnet2_for_PyTorch/tools/installers/install_pesq.sh | http://www.itu.int/rec/dologin_pub.asp?lang=e&id=T-REC-P.862-200511-I!Amd2!SOFT-ZST-E&type=items | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/tools/installers/install_nkf.sh | ESPnet2_for_PyTorch/tools/installers/install_nkf.sh | https://ja.osdn.net/dl/nkf/nkf-2.1.4.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/tools/installers/install_mwerSegmenter.sh | ESPnet2_for_PyTorch/tools/installers/install_mwerSegmenter.sh | https://www-i6.informatik.rwth-aachen.de/web/Software/mwerSegmenter.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/tools/installers/install_kenlm.sh | ESPnet2_for_PyTorch/tools/installers/install_kenlm.sh | https://github.com/kpu/kenlm.git | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/tools/installers/install_k2.sh | ESPnet2_for_PyTorch/tools/installers/install_k2.sh | https://k2-fsa.org/nightl | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/tools/installers/install_k2.sh | ESPnet2_for_PyTorch/tools/installers/install_k2.sh | https://k2-fsa.org/nightly/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/tools/installers/install_k2.sh | ESPnet2_for_PyTorch/tools/installers/install_k2.sh | https://k2-fsa.org/nightly/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/tools/installers/install_k2.sh | ESPnet2_for_PyTorch/tools/installers/install_k2.sh | https://k2-fsa.org/nightly/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/tools/installers/install_k2.sh | ESPnet2_for_PyTorch/tools/installers/install_k2.sh | https://k2-fsa.org/nightly/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/tools/installers/install_fairseq.sh | ESPnet2_for_PyTorch/tools/installers/install_fairseq.sh | https://github.com/pytorch/fairseq.git | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/tools/installers/install_fairseq.sh | ESPnet2_for_PyTorch/tools/installers/install_fairseq.sh | https://github.com/pytorch/fairseq.git | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/tools/installers/install_chainer_ctc.sh | ESPnet2_for_PyTorch/tools/installers/install_chainer_ctc.sh | https://github.com/jheymann85/chainer_ctc.git | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/lm/pytorch_backend/lm.py | ESPnet2_for_PyTorch/tools/installers/install_chainer.sh | https://github.com/chainer/chainer | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/tools/installers/install_chainer.sh | ESPnet2_for_PyTorch/tools/installers/install_chainer.sh | https://github.com/pypa/setuptools/issues/855 
| 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/setup.py | ESPnet2_for_PyTorch/setup.py | http://github.com/espnet/espnet | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/utils/pack_model.sh | ESPnet2_for_PyTorch/setup.py | shinjiw@ieee.org | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet2/utils/sized_dict.py | ESPnet2_for_PyTorch/espnet2/utils/sized_dict.py | https://github.com/bosswissam/pysize | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet2/utils/config_argparse.py | ESPnet2_for_PyTorch/espnet2/utils/config_argparse.py | https://github.com/bw2/ConfigArgParse | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/e2e_tts_transformer.py | ESPnet2_for_PyTorch/espnet2/tts/transformer/transformer.py | https://arxiv.org/pdf/1809.08895.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet2_for_PyTorch/espnet2/tts/tacotron2/tacotron2.py | https://arxiv.org/abs/1712.05884 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet2_for_PyTorch/espnet2/tts/gst/style_encoder.py | https://arxiv.org/abs/1803.09017 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet2_for_PyTorch/espnet2/tts/gst/style_encoder.py | https://arxiv.org/abs/1803.09017 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet2_for_PyTorch/espnet2/tts/gst/style_encoder.py | https://arxiv.org/abs/1803.09017 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet2/tts/feats_extract/dio.py | ESPnet2_for_PyTorch/espnet2/tts/feats_extract/dio.py | https://doi.org/10.1587/transinf.2015EDP7457 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet2_for_PyTorch/espnet2/tts/fastspeech2/variance_predictor.py | https://arxiv.org/abs/2006.04558 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet2_for_PyTorch/espnet2/tts/fastspeech2/fastspeech2.py | https://arxiv.org/abs/2006.04558 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet2_for_PyTorch/espnet2/tts/fastspeech2/fastspeech2.py | https://arxiv.org/abs/2006.06873 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/fastspeech/length_regulator.py | ESPnet2_for_PyTorch/espnet2/tts/fastspeech/fastspeech.py | https://arxiv.org/pdf/1905.09263.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/mucs21_subtask1/asr1/path.sh | ESPnet2_for_PyTorch/espnet2/train/distributed_utils.py | https://docs.nvidia.com/deeplearning/sdk/nccl-developer-guide/docs/env.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet2/train/distributed_utils.py | ESPnet2_for_PyTorch/espnet2/train/distributed_utils.py | https://pytorch.org/docs/stable/distributed.html#torch.distributed.init_process_group | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet2/torch_utils/model_summary.py | ESPnet2_for_PyTorch/espnet2/torch_utils/model_summary.py | https://github.com/PyTorchLightning/pytorch-lightning/blob/master/pytorch_lightning/core/memory.py | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet2_for_PyTorch/espnet2/text/phoneme_tokenizer.py | 
https://doi.org/10.1587/transinf.2020EDP7104 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet2_for_PyTorch/espnet2/text/phoneme_tokenizer.py | https://github.com/bootphon/phonemizer | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet2/text/phoneme_tokenizer.py | ESPnet2_for_PyTorch/espnet2/text/phoneme_tokenizer.py | https://github.com/bootphon/phonemizer/blob/master/phonemizer/phonemize.py#L32 | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet2_for_PyTorch/espnet2/text/phoneme_tokenizer.py | https://github.com/jaywalnut310/vits | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet2/text/korean_cleaner.py | ESPnet2_for_PyTorch/espnet2/text/korean_cleaner.py | https://github.com/hccho2/Tacotron-Wavenet-Vocoder-Korean | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/librispeech/ssl1/README.md | ESPnet2_for_PyTorch/espnet2/tasks/hubert.py | https://arxiv.org/pdf/2106.07447.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/librispeech/ssl1/README.md | ESPnet2_for_PyTorch/espnet2/tasks/hubert.py | https://github.com/pytorch/fairseq/tree/master/examples/hubert | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet2/tasks/abs_task.py | ESPnet2_for_PyTorch/espnet2/tasks/abs_task.py | https://github.com/pytorch/examples/blob/master/imagenet/main.py | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet2/tasks/abs_task.py | ESPnet2_for_PyTorch/espnet2/tasks/abs_task.py | https://github.com/pytorch/pytorch/blob/master/torch/multiprocessing/spawn.py | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet2/schedulers/noam_lr.py | ESPnet2_for_PyTorch/espnet2/schedulers/noam_lr.py | https://arxiv.org/pdf/1706.03762.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/lm/seq_rnn.py | ESPnet2_for_PyTorch/espnet2/lm/seq_rnn_lm.py | https://github.com/pytorch/examples/blob/4581968193699de14b56527296262dd76ab43557/word_language_model/model.py | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/lm/seq_rnn.py | ESPnet2_for_PyTorch/espnet2/lm/seq_rnn_lm.py | https://arxiv.org/abs/1608.05859 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/lm/seq_rnn.py | ESPnet2_for_PyTorch/espnet2/lm/seq_rnn_lm.py | https://arxiv.org/abs/1611.01462 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet2/layers/sinc_conv.py | ESPnet2_for_PyTorch/espnet2/layers/sinc_conv.py | https://github.com/mravanelli/SincNet | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/voxforge/asr1/README_LightweightSincConvs.md | ESPnet2_for_PyTorch/espnet2/layers/sinc_conv.py | https://arxiv.org/abs/2010.07597 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet2/iterators/sequence_iter_factory.py | ESPnet2_for_PyTorch/espnet2/iterators/sequence_iter_factory.py | https://discuss.pytorch.org/t/what-is-the-disadvantage-of-using-pin-memory/1702 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet2/hubert/hubert_loss.py | ESPnet2_for_PyTorch/espnet2/hubert/hubert_loss.py | https://github.com/pytorch/fairseq/blob/master/fairseq/criterions/hubert_criterion.py | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/librispeech/ssl1/README.md | 
ESPnet2_for_PyTorch/espnet2/hubert/hubert_loss.py | https://arxiv.org/pdf/2106.07447.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/librispeech/ssl1/README.md | ESPnet2_for_PyTorch/espnet2/hubert/hubert_loss.py | https://github.com/pytorch/fairseq/tree/master/examples/hubert | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/librispeech/ssl1/README.md | ESPnet2_for_PyTorch/espnet2/hubert/espnet_model.py | https://arxiv.org/pdf/2106.07447.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/librispeech/ssl1/README.md | ESPnet2_for_PyTorch/espnet2/hubert/espnet_model.py | https://github.com/pytorch/fairseq/tree/master/examples/hubert | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet2_for_PyTorch/espnet2/gan_tts/wavenet/wavenet.py | https://github.com/kan-bayashi/ParallelWaveGAN | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet2_for_PyTorch/espnet2/gan_tts/wavenet/residual_block.py | https://github.com/kan-bayashi/ParallelWaveGAN | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet2_for_PyTorch/espnet2/gan_tts/vits/vits.py | https://arxiv.org/abs/2006.04558 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet2/gan_tts/vits/transform.py | ESPnet2_for_PyTorch/espnet2/gan_tts/vits/transform.py | https://github.com/bayesiains/nflows | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet2_for_PyTorch/espnet2/gan_tts/vits/text_encoder.py | https://github.com/jaywalnut310/vits | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet2_for_PyTorch/espnet2/gan_tts/vits/text_encoder.py | https://arxiv.org/abs/2006.04558 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet2_for_PyTorch/espnet2/gan_tts/vits/residual_coupling.py | https://github.com/jaywalnut310/vits | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet2_for_PyTorch/espnet2/gan_tts/vits/residual_coupling.py | https://arxiv.org/abs/2006.04558 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet2_for_PyTorch/espnet2/gan_tts/vits/posterior_encoder.py | https://github.com/jaywalnut310/vits | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet2_for_PyTorch/espnet2/gan_tts/vits/posterior_encoder.py | https://arxiv.org/abs/2006.04558 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet2_for_PyTorch/espnet2/gan_tts/vits/monotonic_align/core.pyx | https://github.com/jaywalnut310/vits | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet2_for_PyTorch/espnet2/gan_tts/vits/monotonic_align/__init__.py | https://github.com/jaywalnut310/vits | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet2_for_PyTorch/espnet2/gan_tts/vits/loss.py | https://github.com/jaywalnut310/vits | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet2_for_PyTorch/espnet2/gan_tts/vits/generator.py | https://github.com/jaywalnut310/vits | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | 
ESPnet2_for_PyTorch/espnet2/gan_tts/vits/generator.py | https://arxiv.org/abs/2006.04558 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet2_for_PyTorch/espnet2/gan_tts/vits/flow.py | https://github.com/jaywalnut310/vits | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet2_for_PyTorch/espnet2/gan_tts/vits/duration_predictor.py | https://github.com/jaywalnut310/vits | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet2_for_PyTorch/espnet2/gan_tts/vits/duration_predictor.py | https://arxiv.org/abs/2006.04558 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet2_for_PyTorch/espnet2/gan_tts/style_melgan/tade_res_block.py | https://github.com/kan-bayashi/ParallelWaveGAN | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet2_for_PyTorch/espnet2/gan_tts/style_melgan/style_melgan.py | https://github.com/kan-bayashi/ParallelWaveGAN | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet2_for_PyTorch/espnet2/gan_tts/parallel_wavegan/upsample.py | https://github.com/kan-bayashi/ParallelWaveGAN | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet2_for_PyTorch/espnet2/gan_tts/parallel_wavegan/parallel_wavegan.py | https://github.com/kan-bayashi/ParallelWaveGAN | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet2_for_PyTorch/espnet2/gan_tts/melgan/residual_stack.py | https://github.com/kan-bayashi/ParallelWaveGAN | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet2_for_PyTorch/espnet2/gan_tts/melgan/pqmf.py | https://github.com/kan-bayashi/ParallelWaveGAN | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet2/gan_tts/melgan/pqmf.py | ESPnet2_for_PyTorch/espnet2/gan_tts/melgan/pqmf.py | https://ieeexplore.ieee.org/abstract/document/681427 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet2/gan_tts/melgan/pqmf.py | ESPnet2_for_PyTorch/espnet2/gan_tts/melgan/pqmf.py | https://ieeexplore.ieee.org/document/258122 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet2/gan_tts/melgan/pqmf.py | ESPnet2_for_PyTorch/espnet2/gan_tts/melgan/pqmf.py | https://github.com/kan-bayashi/ParallelWaveGAN/issues/195 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet2_for_PyTorch/espnet2/gan_tts/melgan/melgan.py | https://github.com/kan-bayashi/ParallelWaveGAN | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet2/gan_tts/melgan/melgan.py | ESPnet2_for_PyTorch/espnet2/gan_tts/melgan/melgan.py | https://github.com/descriptinc/melgan-neurips/blob/master/mel2wav/modules.py | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet2/gan_tts/melgan/melgan.py | ESPnet2_for_PyTorch/espnet2/gan_tts/melgan/melgan.py | https://github.com/descriptinc/melgan-neurips/blob/master/mel2wav/modules.py | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet2_for_PyTorch/espnet2/gan_tts/hifigan/residual_block.py | https://github.com/kan-bayashi/ParallelWaveGAN | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | 
ESPnet2_for_PyTorch/espnet2/gan_tts/hifigan/loss.py | https://github.com/kan-bayashi/ParallelWaveGAN | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet2_for_PyTorch/espnet2/gan_tts/hifigan/hifigan.py | https://github.com/kan-bayashi/ParallelWaveGAN | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet2/gan_tts/hifigan/hifigan.py | ESPnet2_for_PyTorch/espnet2/gan_tts/hifigan/hifigan.py | https://github.com/jik876/hifi-gan/blob/master/models.py | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet2/fileio/rttm.py | ESPnet2_for_PyTorch/espnet2/fileio/rttm.py | https://catalog.ldc.upenn.edu/docs/LDC2004T12/RTTM-format-v13.pdf | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet2/enh/separator/asteroid_models.py | ESPnet2_for_PyTorch/espnet2/enh/separator/asteroid_models.py | https://github.com/asteroid-team/asteroid/ | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet2/enh/separator/asteroid_models.py | ESPnet2_for_PyTorch/espnet2/enh/separator/asteroid_models.py | https://github.com/asteroid-team/asteroid/ | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet2/enh/separator/asteroid_models.py | ESPnet2_for_PyTorch/espnet2/enh/separator/asteroid_models.py | https://huggingface.co/models?filter=asteroid | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/enh1/README.md | ESPnet2_for_PyTorch/espnet2/enh/separator/asteroid_models.py | https://github.com/asteroid-team/asteroid | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet2/enh/loss/criterions/time_domain.py | ESPnet2_for_PyTorch/espnet2/enh/loss/criterions/time_domain.py | https://arxiv.org/abs/2011.15003 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet2/enh/layers/wpe.py | ESPnet2_for_PyTorch/espnet2/enh/layers/wpe.py | https://github.com/fgnt/nara_wpe | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet2/enh/layers/tcn.py | ESPnet2_for_PyTorch/espnet2/enh/layers/tcn.py | https://github.com/kaituoxu/Conv-TasNet/blob/master/src/conv_tasnet.py | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet2/enh/layers/dprnn.py | ESPnet2_for_PyTorch/espnet2/enh/layers/dprnn.py | https://github.com/yluo42/TAC/blob/master/utility/models.py | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet2/enh/layers/dnn_beamformer.py | ESPnet2_for_PyTorch/espnet2/enh/layers/dnn_beamformer.py | http://proceedings.mlr.press/v70/ochiai17a/ochiai17a.pdf | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/frontends/beamformer.py | ESPnet2_for_PyTorch/espnet2/enh/layers/beamformer.py | https://ieeexplore.ieee.org/document/5089420 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/frontends/beamformer.py | ESPnet2_for_PyTorch/espnet2/enh/layers/beamformer.py | https://ieeexplore.ieee.org/document/5089420 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet2/enh/layers/beamformer.py | ESPnet2_for_PyTorch/espnet2/enh/layers/beamformer.py | https://ieeexplore.ieee.org/document/8691481 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet2/enh/layers/beamformer.py | ESPnet2_for_PyTorch/espnet2/enh/layers/beamformer.py | https://ieeexplore.ieee.org/document/8691481 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet2/enh/layers/beamformer.py | 
ESPnet2_for_PyTorch/espnet2/enh/layers/beamformer.py | https://gitlab.uni-oldenburg.de/hura4843/deep-mfmvdr/-/blob/master/deep_mfmvdr | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/mini_librispeech/diar1/conf/train_diar_eda.yaml | ESPnet2_for_PyTorch/espnet2/diar/espnet_model.py | https://arxiv.org/pdf/1909.06247.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/mini_librispeech/diar1/conf/train_diar_eda.yaml | ESPnet2_for_PyTorch/espnet2/diar/espnet_model.py | https://arxiv.org/pdf/2005.09921.p | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/mini_librispeech/diar1/local/data.sh | ESPnet2_for_PyTorch/espnet2/diar/espnet_model.py | https://github.com/hitachi-speech/EEND | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/mini_librispeech/diar1/local/data.sh | ESPnet2_for_PyTorch/espnet2/diar/espnet_model.py | https://github.com/hitachi-speech/EEND | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet2/bin/launch.py | ESPnet2_for_PyTorch/espnet2/bin/launch.py | https://pytorch.org/docs/stable/distributed.html#initialization | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet2/train/distributed_utils.py | ESPnet2_for_PyTorch/espnet2/bin/launch.py | https://pytorch.org/docs/stable/distributed.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet2/bin/launch.py | ESPnet2_for_PyTorch/espnet2/bin/launch.py | https://discuss.pytorch.org/t/why-torch-nn-parallel-distributeddataparallel-runs-faster-than-torch-nn-dataparallel-on-single-machine-with-multi-gpu/32977/2 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet2/train/distributed_utils.py | ESPnet2_for_PyTorch/espnet2/bin/launch.py | https://pytorch.org/docs/stable/distributed.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet2/train/distributed_utils.py | ESPnet2_for_PyTorch/espnet2/bin/launch.py | https://pytorch.org/docs/stable/distributed.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet2/bin/asr_inference_k2.py | ESPnet2_for_PyTorch/espnet2/bin/asr_inference_k2.py | https://github.com/k2-fsa/snowfall/blob/master/snowfall/training/ctc_graph.py#L13 | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet2/bin/asr_inference_k2.py | ESPnet2_for_PyTorch/espnet2/bin/asr_inference_k2.py | https://github.com/k2-fsa/snowfall/blob/master/snowfall/common.py#L309 | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet2/bin/asr_inference_k2.py | ESPnet2_for_PyTorch/espnet2/bin/asr_inference_k2.py | https://k2-fsa.github.io/k2/core_concepts/index.html#dense-fsa-vector | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet2/bin/asr_inference_k2.py | ESPnet2_for_PyTorch/espnet2/bin/asr_inference_k2.py | https://github.com/k2-fsa/k2/blob/master/k2/python/k2/fsa_algo.py#L308 | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet2/bin/asr_inference_k2.py | ESPnet2_for_PyTorch/espnet2/bin/asr_inference_k2.py | https://github.com/k2-fsa/k2/blob/master/k2/python/k2/autograd.py#L648 | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/README.md | ESPnet2_for_PyTorch/espnet2/bin/asr_align.py | https://arxiv.org/abs/2007.09127 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet2/bin/asr_align.py | ESPnet2_for_PyTorch/espnet2/bin/asr_align.py | https://github.com/lumaku/ctc-segmentation | 源码实现 | -| 开源代码引入 | 
https://github.com/espnet/espnet/blob/master/espnet/nets/beam_search_transducer.py | ESPnet2_for_PyTorch/espnet2/asr/transducer/beam_search_transducer.py | https://arxiv.org/pdf/1211.3711.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/beam_search_transducer.py | ESPnet2_for_PyTorch/espnet2/asr/transducer/beam_search_transducer.py | https://arxiv.org/pdf/1211.3711.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/beam_search_transducer.py | ESPnet2_for_PyTorch/espnet2/asr/transducer/beam_search_transducer.py | https://ieeexplore.ieee.org/document/9053040 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/beam_search_transducer.py | ESPnet2_for_PyTorch/espnet2/asr/transducer/beam_search_transducer.py | https://ieeexplore.ieee.org/document/9053040 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/beam_search_transducer.py | ESPnet2_for_PyTorch/espnet2/asr/transducer/beam_search_transducer.py | https://arxiv.org/pdf/2002.03577.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/beam_search_transducer.py | ESPnet2_for_PyTorch/espnet2/asr/transducer/beam_search_transducer.py | https://ieeexplore.ieee.org/document/9250505 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/voxforge/asr1/README_LightweightSincConvs.md | ESPnet2_for_PyTorch/espnet2/asr/preencoder/sinc.py | https://arxiv.org/abs/2010.07597 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet2/asr/encoder/wav2vec2_encoder.py | ESPnet2_for_PyTorch/espnet2/asr/encoder/wav2vec2_encoder.py | https://dl.fbaipublicfiles.com/fairseq/wav2vec/dict.ltr.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/librispeech/ssl1/README.md | ESPnet2_for_PyTorch/espnet2/asr/encoder/hubert_encoder.py | https://arxiv.org/pdf/2106.07447.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/librispeech/ssl1/README.md | ESPnet2_for_PyTorch/espnet2/asr/encoder/hubert_encoder.py | https://github.com/pytorch/fairseq/tree/master/examples/hubert | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet2/asr/encoder/hubert_encoder.py | ESPnet2_for_PyTorch/espnet2/asr/encoder/hubert_encoder.py | https://github.com/pytorch/fairseq/blob/master/fairseq/models/hubert/hubert.py | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet2/asr/encoder/contextual_block_transformer_encoder.py | ESPnet2_for_PyTorch/espnet2/asr/encoder/contextual_block_transformer_encoder.py | https://arxiv.org/abs/1910.07204 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/transformer/attention.py | ESPnet2_for_PyTorch/espnet2/asr/encoder/conformer_encoder.py | https://github.com/espnet/espnet/pull/2816 | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/rnn/decoders.py | ESPnet2_for_PyTorch/espnet2/asr/decoder/rnn_decoder.py | https://arxiv.org/pdf/1409.2329.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet2_for_PyTorch/espnet/vc/pytorch_backend/vc.py | https://arxiv.org/abs/1905.09263 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/utils/spec_augment.py | ESPnet2_for_PyTorch/espnet/utils/spec_augment.py | https://github.com/zcaceres/spec_augment | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/utils/spec_augment.py | 
ESPnet2_for_PyTorch/espnet/utils/spec_augment.py | https://arxiv.org/pdf/1904.08779.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/utils/spec_augment.py | ESPnet2_for_PyTorch/espnet/utils/spec_augment.py | https://github.com/zcaceres/spec_augment | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/utils/spec_augment.py | ESPnet2_for_PyTorch/espnet/utils/spec_augment.py | https://en.wikipedia.org/wiki/Polyharmonic_spline | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/utils/deterministic_utils.py | ESPnet2_for_PyTorch/espnet/utils/deterministic_utils.py | https://github.com/pytorch/pytorch/issues/6351 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet2_for_PyTorch/espnet/tts/pytorch_backend/tts.py | https://arxiv.org/abs/1905.09263 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/utils/spec_augment.py | ESPnet2_for_PyTorch/espnet/transform/spec_augment.py | https://arxiv.org/pdf/1904.08779.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/transform/perturb.py | ESPnet2_for_PyTorch/espnet/transform/perturb.py | https://groups.google.com/forum/#!topic/kaldi-help/8OOG7eE4sZ8 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/transform/perturb.py | ESPnet2_for_PyTorch/espnet/transform/perturb.py | http://spandh.dcs.shef.ac.uk/chime_workshop/papers/CHiME_2018_paper_kanda.pdf | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/mt/pytorch_backend/mt.py | ESPnet2_for_PyTorch/espnet/st/pytorch_backend/st.py | https://github.com/NVIDIA/apex#linux | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/scheduler/scheduler.py | ESPnet2_for_PyTorch/espnet/scheduler/scheduler.py | https://openreview.net/pdf?id=BJYwwY9ll | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/scheduler/scheduler.py | ESPnet2_for_PyTorch/espnet/scheduler/scheduler.py | https://arxiv.org/pdf/1608.03983.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/scheduler/scheduler.py | ESPnet2_for_PyTorch/espnet/scheduler/scheduler.py | https://github.com/NVIDIA/Megatron-LM | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/batch_beam_search_online.py | ESPnet2_for_PyTorch/espnet/nets/scorers/ctc.py | https://arxiv.org/abs/2006.14941 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/batch_beam_search_online.py | ESPnet2_for_PyTorch/espnet/nets/scorers/ctc.py | https://arxiv.org/abs/2006.14941 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/wavenet.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/wavenet.py | https://github.com/kan-bayashi/PytorchWaveNetVocoder | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/wavenet.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/wavenet.py | https://arxiv.org/abs/1611.09482 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/fastspeech/length_regulator.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/transformer/multi_layer_conv.py | https://arxiv.org/pdf/1905.09263.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/transformer/dynamic_conv2d.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/transformer/lightconv2d.py | 
https://github.com/pytorch/fairseq/tree/master/fairseq | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/transformer/dynamic_conv2d.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/transformer/lightconv.py | https://github.com/pytorch/fairseq/tree/master/fairseq | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/transformer/encoder.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/transformer/encoder.py | https://github.com/espnet/espnet/commit/21d70286c354c66c0350e65dc098d2ee236faccc#diff-bffb1396f038b317b2b64dd96e6d3563 | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/transformer/decoder.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/transformer/encoder.py | https://github.com/espnet/espnet/commit/3d422f6de8d4f03673b89e1caef698745ec749ea#diff-bffb1396f038b317b2b64dd96e6d3563 | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/transformer/embedding.py | https://arxiv.org/abs/1809.08895 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/transformer/attention.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/transformer/embedding.py | https://github.com/espnet/espnet/pull/2816 | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/transformer/attention.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/transformer/embedding.py | https://arxiv.org/abs/1901.02860 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/transformer/attention.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/transformer/embedding.py | https://github.com/espnet/espnet/pull/2816 | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/transformer/attention.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/transformer/embedding.py | https://arxiv.org/abs/1901.02860 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/transformer/attention.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/transformer/embedding.py | https://arxiv.org/abs/1901.02860 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/transformer/dynamic_conv2d.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/transformer/dynamic_conv2d.py | https://github.com/pytorch/fairseq/tree/master/fairseq | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/transformer/dynamic_conv2d.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/transformer/dynamic_conv.py | https://github.com/pytorch/fairseq/tree/master/fairseq | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/transformer/decoder.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/transformer/decoder.py | https://github.com/espnet/espnet/commit/3d422f6de8d4f03673b89e1caef698745ec749ea#diff-bffb1396f038b317b2b64dd96e6d3563 | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/transformer/attention.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/transformer/attention.py | https://github.com/espnet/espnet/pull/2816 | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/transformer/attention.py | 
ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/transformer/attention.py | https://arxiv.org/abs/1901.02860 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/transformer/attention.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/transformer/attention.py | https://arxiv.org/abs/1901.02860 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/transformer/attention.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/transformer/attention.py | https://arxiv.org/abs/1901.02860 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/transformer/attention.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/transformer/attention.py | https://github.com/espnet/espnet/pull/2816 | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/transformer/attention.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/transformer/attention.py | https://arxiv.org/abs/1901.02860 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/transformer/attention.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/transformer/attention.py | https://arxiv.org/abs/1901.02860 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/transformer/attention.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/transformer/attention.py | https://arxiv.org/abs/1901.02860 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/transformer/argument.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/transformer/argument.py | https://arxiv.org/abs/1912.11793v2 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/transformer/argument.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/transformer/argument.py | https://arxiv.org/abs/1901.10430 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/transducer/arguments.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/transducer/arguments.py | https://arxiv.org/abs/2010.11148 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/tacotron2/encoder.py | https://arxiv.org/abs/1712.05884 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/tacotron2/decoder.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/tacotron2/decoder.py | https://arxiv.org/abs/1606.01305 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/tacotron2/decoder.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/tacotron2/decoder.py | https://github.com/eladhoffer/seq2seq.pytorch | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/tacotron2/decoder.py | https://arxiv.org/abs/1712.05884 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/tacotron2/decoder.py | https://arxiv.org/abs/1712.05884 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/tacotron2/decoder.py | https://arxiv.org/abs/1712.05884 | 参考论文地址 | -| 开源代码引入 | 
https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/rnn/attentions.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/tacotron2/decoder.py | https://arxiv.org/abs/1710.07654 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/tacotron2/cbhg.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/tacotron2/cbhg.py | https://arxiv.org/abs/1703.10135 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/tacotron2/cbhg.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/tacotron2/cbhg.py | https://github.com/pytorch/pytorch/pull/6327 | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/tacotron2/cbhg.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/tacotron2/cbhg.py | https://arxiv.org/abs/1505.00387 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/rnn/decoders.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/rnn/decoders.py | https://arxiv.org/pdf/1409.2329.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/rnn/attentions.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/rnn/attentions.py | https://arxiv.org/abs/1710.07654 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/rnn/attentions.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/rnn/attentions.py | https://arxiv.org/pdf/1506.07503.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/rnn/attentions.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/rnn/attentions.py | https://arxiv.org/abs/1704.04368 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/rnn/attentions.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/rnn/attentions.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/rnn/attentions.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/rnn/attentions.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/rnn/attentions.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/rnn/attentions.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/rnn/attentions.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/rnn/attentions.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/rnn/attentions.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/rnn/attentions.py | https://arxiv.org/pdf/1807.06736.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/rnn/attentions.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/rnn/attentions.py | https://arxiv.org/pdf/1807.06736.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/lm/seq_rnn.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/lm/seq_rnn.py | https://github.com/pytorch/examples/blob/4581968193699de14b56527296262dd76ab43557/word_language_model/model.py | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/lm/seq_rnn.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/lm/seq_rnn.py | https://arxiv.org/abs/1608.05859 | 参考论文地址 | -| 开源代码引入 
| https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/lm/seq_rnn.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/lm/seq_rnn.py | https://arxiv.org/abs/1611.01462 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/lm/default.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/lm/default.py | https://github.com/espnet/espnet/issues/1075 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/gtn_ctc.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/gtn_ctc.py | https://github.com/facebookresearch/gtn_applications/blob/master/utils.py#L251 | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/frontends/dnn_beamformer.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/frontends/dnn_beamformer.py | https://arxiv.org/abs/1703.04783 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/frontends/beamformer.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/frontends/beamformer.py | https://ieeexplore.ieee.org/document/5089420 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/fastspeech/length_regulator.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/fastspeech/length_regulator.py | https://arxiv.org/pdf/1905.09263.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/fastspeech/length_regulator.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/fastspeech/duration_predictor.py | https://arxiv.org/pdf/1905.09263.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/e2e_vc_transformer.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/e2e_vc_transformer.py | https://arxiv.org/pdf/1912.06813.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/e2e_tts_transformer.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/e2e_tts_transformer.py | https://arxiv.org/pdf/1809.08895.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/e2e_tts_tacotron2.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/e2e_tts_tacotron2.py | https://arxiv.org/abs/1710.08969 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/e2e_tts_tacotron2.py | https://arxiv.org/abs/1712.05884 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/fastspeech/length_regulator.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/e2e_tts_fastspeech.py | https://arxiv.org/pdf/1905.09263.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/e2e_st_conformer.py | https://arxiv.org/abs/2005.08100 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/e2e_st.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/e2e_st.py | https://discuss.pytorch.org/t/set-forget-gate-bias-of-lstm/1745 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/e2e_asr_mulenc.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/e2e_asr_mulenc.py | https://arxiv.org/pdf/1811.04903.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/e2e_st.py | 
ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/e2e_asr_mulenc.py | https://discuss.pytorch.org/t/set-forget-gate-bias-of-lstm/1745 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/e2e_asr_mix_transformer.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/e2e_asr_mix_transformer.py | https://arxiv.org/pdf/2002.03921.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/e2e_st.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/e2e_asr_mix.py | https://discuss.pytorch.org/t/set-forget-gate-bias-of-lstm/1745 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/bin/asr_recog.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/e2e_asr_maskctc.py | https://arxiv.org/abs/2005.08700 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/e2e_asr_conformer.py | https://arxiv.org/abs/2005.08100 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/e2e_st.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/e2e_asr.py | https://discuss.pytorch.org/t/set-forget-gate-bias-of-lstm/1745 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/ctc.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/ctc.py | https://github.com/pytorch/pytorch/issues/17798 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/transformer/attention.py | ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/conformer/argument.py | https://github.com/espnet/espnet/pull/2816 | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/chainer_backend/rnn/training.py | ESPnet2_for_PyTorch/espnet/nets/chainer_backend/transformer/training.py | https://github.com/chainer/chainer/blob/master/chainer/optimizer.py | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/chainer_backend/rnn/training.py | ESPnet2_for_PyTorch/espnet/nets/chainer_backend/rnn/training.py | https://github.com/chainer/chainer/blob/master/chainer/optimizer.py | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/beam_search_transducer.py | ESPnet2_for_PyTorch/espnet/nets/beam_search_transducer.py | https://arxiv.org/pdf/1211.3711.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/beam_search_transducer.py | ESPnet2_for_PyTorch/espnet/nets/beam_search_transducer.py | https://arxiv.org/pdf/1211.3711.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/beam_search_transducer.py | ESPnet2_for_PyTorch/espnet/nets/beam_search_transducer.py | https://ieeexplore.ieee.org/document/9053040 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/beam_search_transducer.py | ESPnet2_for_PyTorch/espnet/nets/beam_search_transducer.py | https://ieeexplore.ieee.org/document/9053040 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/beam_search_transducer.py | ESPnet2_for_PyTorch/espnet/nets/beam_search_transducer.py | https://arxiv.org/pdf/2002.03577.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/beam_search_transducer.py | ESPnet2_for_PyTorch/espnet/nets/beam_search_transducer.py | https://ieeexplore.ieee.org/document/9250505 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/batch_beam_search_online.py | 
ESPnet2_for_PyTorch/espnet/nets/batch_beam_search_online_sim.py | https://arxiv.org/abs/2006.14941 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/batch_beam_search_online.py | ESPnet2_for_PyTorch/espnet/nets/batch_beam_search_online_sim.py | https://arxiv.org/abs/2006.14941 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/batch_beam_search_online.py | ESPnet2_for_PyTorch/espnet/nets/batch_beam_search_online.py | https://arxiv.org/abs/2006.14941 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/batch_beam_search_online.py | ESPnet2_for_PyTorch/espnet/nets/batch_beam_search_online.py | https://arxiv.org/abs/2006.14941 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/batch_beam_search.py | ESPnet2_for_PyTorch/espnet/nets/batch_beam_search.py | https://github.com/espnet/espnet/pull/1402#discussion_r354561029 | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/mt/pytorch_backend/mt.py | ESPnet2_for_PyTorch/espnet/mt/pytorch_backend/mt.py | https://github.com/NVIDIA/apex#linux | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/lm/pytorch_backend/lm.py | ESPnet2_for_PyTorch/espnet/lm/pytorch_backend/lm.py | https://github.com/chainer/chainer/blob/master/examples/ptb/train_ptb_custom_loop.py | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/mt/pytorch_backend/mt.py | ESPnet2_for_PyTorch/espnet/lm/pytorch_backend/lm.py | https://github.com/NVIDIA/apex#linux | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/lm/pytorch_backend/lm.py | ESPnet2_for_PyTorch/espnet/lm/lm_utils.py | https://github.com/chainer/chainer/blob/master/examples/ptb/train_ptb_custom_loop.py | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/lm/lm_utils.py | ESPnet2_for_PyTorch/espnet/lm/lm_utils.py | http://docs.h5py.org/en/stable/special.html#arbitrary-vlen-data | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/lm/pytorch_backend/lm.py | ESPnet2_for_PyTorch/espnet/lm/chainer_backend/lm.py | https://github.com/chainer/chainer/blob/master/examples/ptb/train_ptb_custom_loop.py | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/bin/st_train.py | ESPnet2_for_PyTorch/espnet/bin/st_train.py | https://nvidia.github.io/apex/amp.html#opt-levels | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/bin/st_train.py | ESPnet2_for_PyTorch/espnet/bin/st_train.py | https://github.com/pytorch/pytorch/issues/21108 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/bin/st_train.py | ESPnet2_for_PyTorch/espnet/bin/mt_train.py | https://nvidia.github.io/apex/amp.html#opt-levels | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/bin/st_train.py | ESPnet2_for_PyTorch/espnet/bin/mt_train.py | https://github.com/pytorch/pytorch/issues/21108 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/lm/pytorch_backend/lm.py | ESPnet2_for_PyTorch/espnet/bin/lm_train.py | https://github.com/chainer/chainer/blob/master/examples/ptb/train_ptb_custom_loop.py | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/bin/st_train.py | ESPnet2_for_PyTorch/espnet/bin/lm_train.py | https://nvidia.github.io/apex/amp.html#opt-levels | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/bin/st_train.py | ESPnet2_for_PyTorch/espnet/bin/asr_train.py | 
https://nvidia.github.io/apex/amp.html#opt-levels | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/bin/st_train.py | ESPnet2_for_PyTorch/espnet/bin/asr_train.py | https://github.com/pytorch/pytorch/issues/21108 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/bin/asr_recog.py | ESPnet2_for_PyTorch/espnet/bin/asr_recog.py | https://arxiv.org/abs/2005.08700 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/asr/pytorch_backend/recog.py | ESPnet2_for_PyTorch/espnet/asr/pytorch_backend/recog.py | https://github.com/espnet/espnet/pull/3616 | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/mt/pytorch_backend/mt.py | ESPnet2_for_PyTorch/espnet/asr/pytorch_backend/asr_mix.py | https://github.com/NVIDIA/apex#linux | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/asr/pytorch_backend/asr.py | ESPnet2_for_PyTorch/espnet/asr/pytorch_backend/asr.py | https://github.com/espnet/espnet/pull/1388 | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/asr/pytorch_backend/asr.py | ESPnet2_for_PyTorch/espnet/asr/pytorch_backend/asr.py | https://github.com/espnet/espnet/issues/777 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/mt/pytorch_backend/mt.py | ESPnet2_for_PyTorch/espnet/asr/pytorch_backend/asr.py | https://github.com/NVIDIA/apex#linux | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/asr/pytorch_backend/asr.py | ESPnet2_for_PyTorch/espnet/asr/pytorch_backend/asr.py | https://github.com/espnet/espnet/pull/2171 | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/asr/pytorch_backend/asr.py | ESPnet2_for_PyTorch/espnet/asr/pytorch_backend/asr.py | https://github.com/pytorch/pytorch/issues/27963 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/zeroth_korean/asr1/local/download_and_untar.sh | ESPnet2_for_PyTorch/egs2/zeroth_korean/asr1/local/download_and_untar.sh | http://www.openslr.org/resources/40/zeroth_korean.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/zeroth_korean/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/yoloxochitl_mixtec/asr1/local/data.sh | ESPnet2_for_PyTorch/egs2/yoloxochitl_mixtec/asr1/local/data.sh | http://www.openslr.org/resources/89/Yoloxochitl-Mixtec-Data.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/yoloxochitl_mixtec/asr1/local/data.sh | ESPnet2_for_PyTorch/egs2/yoloxochitl_mixtec/asr1/local/data.sh | http://www.openslr.org/resources/89/Yoloxochitl-Mixtec-Manifest.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/yoloxochitl_mixtec/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/yesno/asr1/local/data.sh | ESPnet2_for_PyTorch/egs2/yesno/asr1/local/data.sh | http://www.openslr.org/resources/1/waves_yesno.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/yesno/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/wsj0_2mix_spatialized/enh1/local/spatialize_wsj0_mix.sh | ESPnet2_for_PyTorch/egs2/wsj0_2mix_spatialized/enh1/local/spatialize_wsj0_mix.sh | 
https://www.merl.com/demos/deep-clustering/spatialize_wsj0-mix.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/wsj0_2mix_spatialized/enh1/local/spatialize_wsj0_mix.sh | ESPnet2_for_PyTorch/egs2/wsj0_2mix_spatialized/enh1/local/spatialize_wsj0_mix.sh | https://github.com/ehabets/RIR-Generator | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/wsj0_2mix_spatialized/enh1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/wsj0_2mix/enh1/local/wsj0_create_mixture.sh | ESPnet2_for_PyTorch/egs2/wsj0_2mix/enh1/local/wsj0_create_mixture.sh | http://www.merl.com/demos/deep-clustering/create-speaker-mixtures.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/wsj0_2mix/enh1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/wsj/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/wham/enh1/local/wham_create_mixture.sh | ESPnet2_for_PyTorch/egs2/whamr/enh1/local/whamr_create_mixture.sh | https://storage.googleapis.com/whisper-public/wham_noise.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/whamr/enh1/local/whamr_create_mixture.sh | ESPnet2_for_PyTorch/egs2/whamr/enh1/local/whamr_create_mixture.sh | https://storage.googleapis.com/whisper-public/whamr_scripts.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/whamr/enh1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/wham/enh1/local/wham_create_mixture.sh | ESPnet2_for_PyTorch/egs2/wham/enh1/local/wham_create_mixture.sh | https://storage.googleapis.com/whisper-public/wham_noise.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/wham/enh1/local/wham_create_mixture.sh | ESPnet2_for_PyTorch/egs2/wham/enh1/local/wham_create_mixture.sh | https://storage.googleapis.com/whisper-public/wham_scripts.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/wham/enh1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/README.md | ESPnet2_for_PyTorch/egs2/wenetspeech/asr1/local/data.sh | https://wenet-e2e.github.io/WenetSpeech/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/README.md | ESPnet2_for_PyTorch/egs2/wenetspeech/asr1/local/data.sh | https://wenet-e2e.github.io/WenetSpeech/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/wenetspeech/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/voxforge/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/vivos/asr1/local/data.sh | ESPnet2_for_PyTorch/egs2/vivos/asr1/local/data.sh | https://ailab.hcmus.edu.vn/assets/vivos.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/vivos/asr1/cmd.sh | 
http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/vctk_noisyreverb/enh1/local/data.sh | ESPnet2_for_PyTorch/egs2/vctk_noisyreverb/enh1/local/data.sh | https://doi.org/10.7488/ds/2139 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/vctk_noisy/enh1/local/data.sh | ESPnet2_for_PyTorch/egs2/vctk_noisyreverb/enh1/local/data.sh | https://doi.org/10.7488/ds/2117 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/vctk_noisyreverb/enh1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/vctk_noisy/enh1/local/data.sh | ESPnet2_for_PyTorch/egs2/vctk_noisy/enh1/local/data.sh | https://doi.org/10.7488/ds/2117 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/vctk_noisy/enh1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/vctk/tts1/local/data_download.sh | ESPnet2_for_PyTorch/egs2/vctk/tts1/local/data_download.sh | http://www.udialogue.org/download/VCTK-Corpus.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/vctk/tts1/local/data_download.sh | ESPnet2_for_PyTorch/egs2/vctk/tts1/local/data_download.sh | https://github.com/kan-bayashi/VCTKCorpusFullContextLabel.git | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/vctk/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/tsukuyomi/tts1/local/data_download.sh | ESPnet2_for_PyTorch/egs2/tsukuyomi/tts1/local/data_download.sh | https://tyc.rei-yumesaki.net/files/sozai-tyc-corpus1.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/tsukuyomi/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/totonac/asr1/local/data.sh | ESPnet2_for_PyTorch/egs2/totonac/asr1/local/data.sh | https://www.openslr.org/resources/107/Amith-Lopez_Totonac-recordings-northern-Puebla-and-adjacent-Veracruz_Metadata.xml | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/totonac/asr1/local/data.sh | ESPnet2_for_PyTorch/egs2/totonac/asr1/local/data.sh | https://www.openslr.org/resources/107/Totonac_Corpus.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/totonac/asr1/local/data.sh | ESPnet2_for_PyTorch/egs2/totonac/asr1/local/data.sh | https://github.com/ftshijt/Totonac_Split.git | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/totonac/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/timit/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/thchs30/tts1/local/download_and_untar.sh | ESPnet2_for_PyTorch/egs2/thchs30/tts1/local/download_and_untar.sh | https://common-voice-data-download.s3.amazonaws.com/cv_corpus_v1.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/thchs30/asr1/local/data.sh | ESPnet2_for_PyTorch/egs2/thchs30/tts1/local/data.sh | 
https://www.openslr.org/resources/18/data_thchs30.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/thchs30/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/thchs30/asr1/local/data.sh | ESPnet2_for_PyTorch/egs2/thchs30/asr1/local/data.sh | https://www.openslr.org/resources/18/data_thchs30.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/thchs30/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/tts.sh | ESPnet2_for_PyTorch/egs2/TEMPLATE/tts1/tts.sh | http://kaldi-asr.org/models/8/0008_sitw_v2_1a.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/README.md | ESPnet2_for_PyTorch/egs2/TEMPLATE/tts1/tts.sh | https://zenodo.org/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/asr1/asr.sh | ESPnet2_for_PyTorch/egs2/TEMPLATE/tts1/tts.sh | https://zenodo.org/account/settings/applications/tokens/new/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/aishell/asr1/README.md | ESPnet2_for_PyTorch/egs2/TEMPLATE/tts1/tts.sh | https://github.com/espnet/espnet/ | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/tsukuyomi/tts1/README.md | ESPnet2_for_PyTorch/egs2/TEMPLATE/tts1/tts.sh | https://github.com/espnet/espnet_model_zoo | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/aishell/asr1/README.md | ESPnet2_for_PyTorch/egs2/TEMPLATE/tts1/tts.sh | https://github.com/espnet/espnet | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/thchs30/tts1/README.md | ESPnet2_for_PyTorch/egs2/TEMPLATE/tts1/tts.sh | https://huggingface.co/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/mucs21_subtask1/asr1/path.sh | ESPnet2_for_PyTorch/egs2/TEMPLATE/tts1/path.sh | https://docs.nvidia.com/deeplearning/sdk/nccl-developer-guide/docs/env.html#nccl-socket-ifname | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/TEMPLATE/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/librilight_limited/asr1/conf/tuning/train_asr_hubert_base_10h_finetuning.yaml | ESPnet2_for_PyTorch/egs2/TEMPLATE/ssl1/scripts/km.sh | https://dl.fbaipublicfiles.com/hubert/hubert_base_ls960.pt | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/ssl1/pyscripts/sklearn_km.py | ESPnet2_for_PyTorch/egs2/TEMPLATE/ssl1/pyscripts/sklearn_km.py | https://github.com/pytorch/fairseq/blob/master/examples/hubert/simple_kmeans/learn_kmeans.py | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/librispeech/ssl1/README.md | ESPnet2_for_PyTorch/egs2/TEMPLATE/ssl1/pyscripts/sklearn_km.py | https://arxiv.org/pdf/2106.07447.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/librispeech/ssl1/README.md | ESPnet2_for_PyTorch/egs2/TEMPLATE/ssl1/pyscripts/sklearn_km.py | https://github.com/pytorch/fairseq/tree/master/examples/hubert | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/ssl1/pyscripts/feature_loader.py | ESPnet2_for_PyTorch/egs2/TEMPLATE/ssl1/pyscripts/feature_loader.py | 
https://github.com/pytorch/fairseq/blob/master/examples/hubert/simple_kmeans/dump_mfcc_feature.py | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/librispeech/ssl1/README.md | ESPnet2_for_PyTorch/egs2/TEMPLATE/ssl1/pyscripts/feature_loader.py | https://arxiv.org/pdf/2106.07447.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/librispeech/ssl1/README.md | ESPnet2_for_PyTorch/egs2/TEMPLATE/ssl1/pyscripts/feature_loader.py | https://github.com/pytorch/fairseq/tree/master/examples/hubert | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/mucs21_subtask1/asr1/path.sh | ESPnet2_for_PyTorch/egs2/TEMPLATE/ssl1/path.sh | https://docs.nvidia.com/deeplearning/sdk/nccl-developer-guide/docs/env.html#nccl-socket-ifname | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/librispeech/ssl1/README.md | ESPnet2_for_PyTorch/egs2/TEMPLATE/ssl1/hubert.sh | https://arxiv.org/pdf/2106.07447.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/librispeech/ssl1/README.md | ESPnet2_for_PyTorch/egs2/TEMPLATE/ssl1/hubert.sh | https://github.com/pytorch/fairseq/tree/master/examples/hubert | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/TEMPLATE/ssl1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/enh1/scripts/utils/enhance_dataset.sh | ESPnet2_for_PyTorch/egs2/TEMPLATE/enh1/scripts/utils/enhance_dataset.sh | https://github.com/espnet/espnet/pull/3226 | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/mucs21_subtask1/asr1/path.sh | ESPnet2_for_PyTorch/egs2/TEMPLATE/enh1/path.sh | https://docs.nvidia.com/deeplearning/sdk/nccl-developer-guide/docs/env.html#nccl-socket-ifname | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/README.md | ESPnet2_for_PyTorch/egs2/TEMPLATE/enh1/enh.sh | https://zenodo.org/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/asr1/asr.sh | ESPnet2_for_PyTorch/egs2/TEMPLATE/enh1/enh.sh | https://zenodo.org/account/settings/applications/tokens/new/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/aishell/asr1/README.md | ESPnet2_for_PyTorch/egs2/TEMPLATE/enh1/enh.sh | https://github.com/espnet/espnet/ | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/tsukuyomi/tts1/README.md | ESPnet2_for_PyTorch/egs2/TEMPLATE/enh1/enh.sh | https://github.com/espnet/espnet_model_zoo | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/aishell/asr1/README.md | ESPnet2_for_PyTorch/egs2/TEMPLATE/enh1/enh.sh | https://github.com/espnet/espnet | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/thchs30/tts1/README.md | ESPnet2_for_PyTorch/egs2/TEMPLATE/enh1/enh.sh | https://huggingface.co/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/TEMPLATE/enh1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/mucs21_subtask1/asr1/path.sh | ESPnet2_for_PyTorch/egs2/TEMPLATE/diar1/path.sh | https://docs.nvidia.com/deeplearning/sdk/nccl-developer-guide/docs/env.html#nccl-socket-ifname | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/README.md | ESPnet2_for_PyTorch/egs2/TEMPLATE/diar1/diar.sh | https://zenodo.org/ | 模型相关说明 | -| 
开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/asr1/asr.sh | ESPnet2_for_PyTorch/egs2/TEMPLATE/diar1/diar.sh | https://zenodo.org/account/settings/applications/tokens/new/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/aishell/asr1/README.md | ESPnet2_for_PyTorch/egs2/TEMPLATE/diar1/diar.sh | https://github.com/espnet/espnet/ | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/tsukuyomi/tts1/README.md | ESPnet2_for_PyTorch/egs2/TEMPLATE/diar1/diar.sh | https://github.com/espnet/espnet_model_zoo | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/aishell/asr1/README.md | ESPnet2_for_PyTorch/egs2/TEMPLATE/diar1/diar.sh | https://github.com/espnet/espnet | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/thchs30/tts1/README.md | ESPnet2_for_PyTorch/egs2/TEMPLATE/diar1/diar.sh | https://huggingface.co/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/TEMPLATE/diar1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/librispeech/asr1/README.md | ESPnet2_for_PyTorch/egs2/TEMPLATE/asr1/scripts/utils/upload_models_to_hub.sh | https://huggingface.co/espnet/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/mucs21_subtask1/asr1/path.sh | ESPnet2_for_PyTorch/egs2/TEMPLATE/asr1/path.sh | https://docs.nvidia.com/deeplearning/sdk/nccl-developer-guide/docs/env.html#nccl-socket-ifname | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/TEMPLATE/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/README.md | ESPnet2_for_PyTorch/egs2/TEMPLATE/asr1/asr.sh | https://zenodo.org/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/asr1/asr.sh | ESPnet2_for_PyTorch/egs2/TEMPLATE/asr1/asr.sh | https://zenodo.org/account/settings/applications/tokens/new/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/aishell/asr1/README.md | ESPnet2_for_PyTorch/egs2/TEMPLATE/asr1/asr.sh | https://github.com/espnet/espnet/ | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/tsukuyomi/tts1/README.md | ESPnet2_for_PyTorch/egs2/TEMPLATE/asr1/asr.sh | https://github.com/espnet/espnet_model_zoo | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/aishell/asr1/README.md | ESPnet2_for_PyTorch/egs2/TEMPLATE/asr1/asr.sh | https://github.com/espnet/espnet | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/thchs30/tts1/README.md | ESPnet2_for_PyTorch/egs2/TEMPLATE/asr1/asr.sh | https://huggingface.co/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/tedlium2/asr1/local/download_data.sh | ESPnet2_for_PyTorch/egs2/tedlium2/asr1/local/download_data.sh | http://www.openslr.org/resources/19/TEDLIUM_release2.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/tedlium2/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/swbd_da/asr1/local/data_prep.py | ESPnet2_for_PyTorch/egs2/swbd_da/asr1/local/data_prep.py | http://nite.sourceforge.net/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/swbd_da/asr1/local/data_prep.py | 
ESPnet2_for_PyTorch/egs2/swbd_da/asr1/local/data_prep.py | http://nite.sourceforge.net/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/swbd_da/asr1/local/data_prep.py | ESPnet2_for_PyTorch/egs2/swbd_da/asr1/local/data_prep.py | http://nite.sourceforge.net/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/swbd_da/asr1/local/data_prep.py | ESPnet2_for_PyTorch/egs2/swbd_da/asr1/local/data_prep.py | http://nite.sourceforge.net/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/swbd_da/asr1/local/data_prep.py | ESPnet2_for_PyTorch/egs2/swbd_da/asr1/local/data_prep.py | http://nite.sourceforge.net/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/swbd_da/asr1/local/data_prep.py | ESPnet2_for_PyTorch/egs2/swbd_da/asr1/local/data_prep.py | http://nite.sourceforge.net/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/swbd_da/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/swbd/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/su_openslr36/asr1/local/data.sh | ESPnet2_for_PyTorch/egs2/su_openslr36/asr1/local/data.sh | https://www.openslr.org/resources/36/asr_sundanese_ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/spgispeech/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/speechcommands/asr1/README.md | ESPnet2_for_PyTorch/egs2/speechcommands/asr1/local/data_prep_35.py | https://arxiv.org/abs/1804.03209 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/speechcommands/asr1/README.md | ESPnet2_for_PyTorch/egs2/speechcommands/asr1/local/data_prep_12.py | https://arxiv.org/abs/1804.03209 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/README.md | ESPnet2_for_PyTorch/egs2/speechcommands/asr1/local/data_prep_12.py | https://www.tensorflow.org/datasets/catalog/speech_commands | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/speechcommands/asr1/README.md | ESPnet2_for_PyTorch/egs2/speechcommands/asr1/local/data.sh | https://arxiv.org/abs/1804.03209 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/speechcommands/asr1/local/data.sh | ESPnet2_for_PyTorch/egs2/speechcommands/asr1/local/data.sh | http://download.tensorflow.org/data/speech_commands_v0.02.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/speechcommands/asr1/local/data.sh | ESPnet2_for_PyTorch/egs2/speechcommands/asr1/local/data.sh | http://download.tensorflow.org/data/speech_commands_test_set_v0.02.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/speechcommands/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/snips/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/README.md | ESPnet2_for_PyTorch/egs2/sms_wsj/enh1/local/data.sh | https://github.com/fgnt/sms_wsj | 源码实现 | -| 开源代码引入 | 
https://github.com/espnet/espnet/blob/master/egs2/sms_wsj/enh1/local/data.sh | ESPnet2_for_PyTorch/egs2/sms_wsj/enh1/local/data.sh | https://github.com/mpariente/asteroid/blob/master/egs/sms_wsj/CaCGMM/local/prepare_data.sh | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/sms_wsj/enh1/local/data.sh | ESPnet2_for_PyTorch/egs2/sms_wsj/enh1/local/data.sh | https://github.com/fgnt/sms_wsj.git | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/sms_wsj/enh1/local/data.sh | ESPnet2_for_PyTorch/egs2/sms_wsj/enh1/local/data.sh | https://github.com/boeddeker/rir-generator.git | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/sms_wsj/enh1/local/create_database.sh | ESPnet2_for_PyTorch/egs2/sms_wsj/enh1/local/create_database.sh | https://zenodo.org/record/3517889/files/sms_wsj.tar.gz.parta | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/sms_wsj/enh1/local/create_database.sh | ESPnet2_for_PyTorch/egs2/sms_wsj/enh1/local/create_database.sh | https://zenodo.org/record/3517889/files/sms_wsj.tar.gz.parta | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/sms_wsj/enh1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/slurp_entity/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/slurp/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/slue-voxceleb/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/siwis/tts1/local/data_prep.sh | ESPnet2_for_PyTorch/egs2/siwis/tts1/local/data_prep.sh | https://stackoverflow.com/questions/43638993/bash-remove-all-unicode-spaces-and-replace-with-normal-space | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/siwis/tts1/local/data_download.sh | ESPnet2_for_PyTorch/egs2/siwis/tts1/local/data_download.sh | https://datashare.ed.ac.uk/download/DS_10283_2353.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/siwis/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/ruslan/tts1/local/data_download.sh | ESPnet2_for_PyTorch/egs2/ruslan/tts1/local/data_download.sh | https://drive.google.com/uc?id=1Y6vv--gcDx-S8DieSGaD7WnB86kZLgc_ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/ruslan/tts1/local/data_download.sh | ESPnet2_for_PyTorch/egs2/ruslan/tts1/local/data_download.sh | https://drive.google.com/uc?id=11TD_ZwIOo-Wo75GYv-OWWOS3ABmwmAdK | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/ruslan/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/ru_open_stt/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/reverb/asr1/local/prepare_rir_noise_1ch.sh | ESPnet2_for_PyTorch/egs2/reverb/asr1/local/prepare_rir_noise_1ch.sh | 
http://reverb2014.dereverberation.com/tools/REVERB_TOOLS_FOR_ASR_ver2.0.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/reverb/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/puebla_nahuatl/asr1/local/data.sh | ESPnet2_for_PyTorch/egs2/puebla_nahuatl/asr1/local/data.sh | https://www.openslr.org/resources/92/Puebla-Nahuatl-Manifest.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/puebla_nahuatl/asr1/local/data.sh | ESPnet2_for_PyTorch/egs2/puebla_nahuatl/asr1/local/data.sh | https://www.openslr.org/resources/92/Sound-Files-Puebla-Nahuatl.tgz.part0 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/puebla_nahuatl/asr1/local/data.sh | ESPnet2_for_PyTorch/egs2/puebla_nahuatl/asr1/local/data.sh | https://github.com/ftshijt/Puebla_Nahuatl_Split.git | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/puebla_nahuatl/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/primewords_chinese/asr1/local/data.sh | ESPnet2_for_PyTorch/egs2/primewords_chinese/asr1/local/data.sh | https://www.openslr.org/resources/47/primewords_md_2018_set1.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/primewords_chinese/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/polyphone_swiss_french/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/open_li52/asr1/local/getdata.sh | ESPnet2_for_PyTorch/egs2/open_li52/asr1/local/getdata.sh | http://www.repository.voxforge1.org/downloads/SpeechCorpus/Trunk/Audio/Main/16kHz_16bit | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/open_li52/asr1/local/getdata.sh | ESPnet2_for_PyTorch/egs2/open_li52/asr1/local/getdata.sh | http://www.repository.voxforge1.org/downloads/Dutch/Trunk/Audio/Main/16kHz_16bit | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/open_li52/asr1/local/getdata.sh | ESPnet2_for_PyTorch/egs2/open_li52/asr1/local/getdata.sh | http://www.repository.voxforge1.org/downloads/Russian/Trunk/Audio/Main/16kHz_16bit | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/open_li52/asr1/local/getdata.sh | ESPnet2_for_PyTorch/egs2/open_li52/asr1/local/getdata.sh | http://www.repository.voxforge1.org/downloads/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/commonvoice/asr1/local/data.sh | ESPnet2_for_PyTorch/egs2/open_li52/asr1/local/data.sh | https://voice-prod-bundler-ee1969a6ce8178826482b88e843c335139bd3fb4.s3.amazonaws.com/cv-corpus-3/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/commonvoice/asr1/local/data.sh | ESPnet2_for_PyTorch/egs2/open_li52/asr1/local/data.sh | https://voice-prod-bundler-ee1969a6ce8178826482b88e843c335139bd3fb4.s3.amazonaws.com/cv-corpus-5.1-2020-06-22/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/nsc/asr1/local/data.sh | ESPnet2_for_PyTorch/egs2/nsc/asr1/local/data.sh | https://github.com/pzelasko/kaldi/blob/feature/nsc-recipe/egs/nsc/s5/local/nsc_data_prep.sh | 源码实现 | -| 开源代码引入 | 
https://github.com/espnet/espnet/blob/master/egs2/nsc/asr1/local/data.sh | ESPnet2_for_PyTorch/egs2/nsc/asr1/local/data.sh | https://github.com/pzelasko/Praat-textgrids | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/nsc/asr1/local/data.sh | ESPnet2_for_PyTorch/egs2/nsc/asr1/local/data.sh | https://github.com/pzelasko/kaldi/blob/feature/nsc-recipe/egs/nsc/s5/local/nsc_data_prep.sh#L30-L35 | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/nsc/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/mucs21_subtask2/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/mucs21_subtask1/asr1/path.sh | ESPnet2_for_PyTorch/egs2/mucs21_subtask1/asr1/path.sh | https://docs.nvidia.com/deeplearning/sdk/nccl-developer-guide/docs/env.html#nccl-socket-ifname | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/mucs21_subtask1/asr1/local/data.sh | ESPnet2_for_PyTorch/egs2/mucs21_subtask1/asr1/local/data.sh | https://navana-tech.github.io/MUCS2021/data.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/mucs21_subtask1/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/thchs30/tts1/local/download_and_untar.sh | ESPnet2_for_PyTorch/egs2/mls/asr1/local/download_and_untar.sh | https://common-voice-data-download.s3.amazonaws.com/cv_corpus_v1.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/mls/asr1/local/data.sh | ESPnet2_for_PyTorch/egs2/mls/asr1/local/data.sh | https://dl.fbaipublicfiles.com/mls/mls_ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/mls/asr1/local/data.sh | ESPnet2_for_PyTorch/egs2/mls/asr1/local/data.sh | https://dl.fbaipublicfiles.com/mls/mls_lm_ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/misp2021/avsr1/local/data.sh | ESPnet2_for_PyTorch/egs2/misp2021/avsr1/local/data.sh | https://bit.ly/3glF4 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/misp2021/avsr1/local/data.sh | ESPnet2_for_PyTorch/egs2/misp2021/avsr1/local/data.sh | https://github.com/mispchallenge/misp2021_baseline.git | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/misp2021/avsr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/misp2021/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/mini_librispeech/diar1/local/data.sh | ESPnet2_for_PyTorch/egs2/mini_librispeech/diar1/local/simulation/random_mixture_nooverlap.py | https://github.com/hitachi-speech/EEND | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/mini_librispeech/diar1/local/data.sh | ESPnet2_for_PyTorch/egs2/mini_librispeech/diar1/local/simulation/random_mixture.py | https://github.com/hitachi-speech/EEND | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/mini_librispeech/diar1/local/data.sh | ESPnet2_for_PyTorch/egs2/mini_librispeech/diar1/local/simulation/make_mixture_nooverlap.py | https://github.com/hitachi-speech/EEND | 源码实现 | 
-| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/mini_librispeech/diar1/local/data.sh | ESPnet2_for_PyTorch/egs2/mini_librispeech/diar1/local/simulation/make_mixture.py | https://github.com/hitachi-speech/EEND | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/mini_librispeech/diar1/local/data.sh | ESPnet2_for_PyTorch/egs2/mini_librispeech/diar1/local/simulation/common.py | https://github.com/hitachi-speech/EEND | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/mini_librispeech/diar1/local/data.sh | ESPnet2_for_PyTorch/egs2/mini_librispeech/diar1/local/data.sh | https://github.com/hitachi-speech/EEND | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/mini_librispeech/diar1/local/data.sh | ESPnet2_for_PyTorch/egs2/mini_librispeech/diar1/local/data.sh | http://www.openslr.org/resources/31 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/mini_librispeech/diar1/local/data.sh | ESPnet2_for_PyTorch/egs2/mini_librispeech/diar1/local/data.sh | http://www.openslr.org/resources/26/sim_rir_8k.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/mini_librispeech/diar1/local/data.sh | ESPnet2_for_PyTorch/egs2/mini_librispeech/diar1/local/data.sh | https://www.openslr.org/resources/17/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/mini_librispeech/diar1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/mini_an4/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/mini_an4/ssl1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/mini_an4/enh1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/mini_an4/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/lrs2/lipreading1/local/data.sh | ESPnet2_for_PyTorch/egs2/lrs2/lipreading1/local/data.sh | lichenda1996@sjtu.edu.cn | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/lrs2/lipreading1/local/data.sh | ESPnet2_for_PyTorch/egs2/lrs2/lipreading1/local/data.sh | https://zenodo.org/record/5090353/files/lipread_lrw_pretrain.pt.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/lrs2/lipreading1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/ljspeech/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/libritts/tts1/local/data.sh | ESPnet2_for_PyTorch/egs2/libritts/tts1/local/data.sh | https://github.com/kan-bayashi/LibriTTSCorpusLabel.git | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/libritts/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/librispeech/ssl1/cmd.sh | 
http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/librispeech/asr1/local/data.sh | ESPnet2_for_PyTorch/egs2/librispeech/asr1/local/data.sh | http://www.openslr.org/resources/11/librispeech-lm-norm.txt.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/librispeech/asr1/local/data.sh | ESPnet2_for_PyTorch/egs2/librispeech/asr1/local/data.sh | http://www.openslr.org/resources/11/librispeech-lm-norm.txt.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/librispeech/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/README.md | ESPnet2_for_PyTorch/egs2/librimix/enh1/local/data.sh | https://github.com/JorisCos/LibriMix | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/wham/enh1/local/wham_create_mixture.sh | ESPnet2_for_PyTorch/egs2/librimix/enh1/local/data.sh | https://storage.googleapis.com/whisper-public/wham_noise.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/librimix/enh1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/librilight_limited/asr1/local/data.sh | ESPnet2_for_PyTorch/egs2/librilight_limited/asr1/local/data.sh | https://dl.fbaipublicfiles.com/librilight/data/librispeech_finetuning.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/librilight_limited/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/local/data.sh | ESPnet2_for_PyTorch/egs2/laborotv/asr1/local/data.sh | https://github.com/laboroai/TEDxJP-10K | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/laborotv/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/kss/tts1/local/data.sh | ESPnet2_for_PyTorch/egs2/kss/tts1/local/data.sh | https://bit.ly/376oCzY | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/kss/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/ksponspeech/asr1/local/get_space_normalized_hyps.py | ESPnet2_for_PyTorch/egs2/ksponspeech/asr1/local/get_space_normalized_hyps.py | https://github.com/kaldi-asr/kaldi/blob/master/src/bin/align-text.cc | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/README.md | ESPnet2_for_PyTorch/egs2/ksponspeech/asr1/local/data.sh | https://aihub.or.kr/aidata/105 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/ksponspeech/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/jvs/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/jv_openslr35/asr1/local/data.sh | ESPnet2_for_PyTorch/egs2/jv_openslr35/asr1/local/data.sh | https://www.openslr.org/resources/35/asr_javanese_ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/jtubespeech/tts1/local/download.sh | 
ESPnet2_for_PyTorch/egs2/jtubespeech/tts1/local/download.sh | https://drive.google.com/uc?id=1X_harC0e1tjMX1FtCldD67XOysQuq_Ib | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/jtubespeech/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/jsut/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/jsut/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/jsss/tts1/local/data_download.sh | ESPnet2_for_PyTorch/egs2/jsss/tts1/local/data_download.sh | https://drive.google.com/a/g.sp.m.is.nagoya-u.ac.jp/uc?id=1NyiZCXkYTdYBNtD1B-IMAYCVa-0SQsKX | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/jsss/tts1/local/data_download.sh | ESPnet2_for_PyTorch/egs2/jsss/tts1/local/data_download.sh | https://github.com/kan-bayashi/JSSSLabel | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/jsss/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/jmd/tts1/local/data_download.sh | ESPnet2_for_PyTorch/egs2/jmd/tts1/local/data_download.sh | https://drive.google.com/a/g.sp.m.is.nagoya-u.ac.jp/uc?id=1gacw6Ak6rlEZ_gx9KwafIIfc3dU0EAHW | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/jmd/tts1/local/data_download.sh | ESPnet2_for_PyTorch/egs2/jmd/tts1/local/data_download.sh | https://drive.google.com/a/g.sp.m.is.nagoya-u.ac.jp/uc?id=1mCbmUKVifEEEcm7A3ofqWW7dCqVXGrsh | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/jmd/tts1/local/data_download.sh | ESPnet2_for_PyTorch/egs2/jmd/tts1/local/data_download.sh | https://github.com/takenori-y/JMDComplements | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/jmd/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/jkac/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/README.md | ESPnet2_for_PyTorch/egs2/jdcinal/asr1/local/data.sh | http://tts.speech.cs.cmu.edu/awb/infomation_navigation_and_attentive_listening_0.2.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/jdcinal/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/iwslt21_low_resource/asr1/local/data.sh | ESPnet2_for_PyTorch/egs2/iwslt21_low_resource/asr1/local/data.sh | https://zenodo.org/record/4541727/files/asr_train_asr_conformer_raw_ru_bpe100_valid.acc.ave.zip?download=1 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/iwslt21_low_resource/asr1/local/data.sh | ESPnet2_for_PyTorch/egs2/iwslt21_low_resource/asr1/local/data.sh | https://zenodo.org/record/5227612/files/swahili-asr-resources.tar.xz?download=1 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/iwslt21_low_resource/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | 
https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/indic_speech/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/README.md | ESPnet2_for_PyTorch/egs2/iemocap/asr1/local/data.sh | https://sail.usc.edu/iemocap/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/iemocap/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/hui_acg/tts1/local/data_download.sh | ESPnet2_for_PyTorch/egs2/hui_acg/tts1/local/data_download.sh | https://opendata.iisys.de/systemintegration/Datasets/HUI-Audio-Corpus-German/dataset_clean | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/hui_acg/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/how2/asr1/local/data.sh | ESPnet2_for_PyTorch/egs2/how2/asr1/local/data.sh | http://islpc21.is.cs.cmu.edu/ramons/iwslt2019.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/how2/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/hkust/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/README.md | ESPnet2_for_PyTorch/egs2/grabo/asr1/local/data_prep.py | https://www.esat.kuleuven.be/psi/spraak/downloads/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/grabo/asr1/local/data_prep.py | ESPnet2_for_PyTorch/egs2/grabo/asr1/local/data_prep.py | https://arxiv.org/pdf/1805.02922.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/grabo/asr1/local/data_prep.py | ESPnet2_for_PyTorch/egs2/grabo/asr1/local/data_prep.py | https://arxiv.org/pdf/2008.01994.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/grabo/asr1/local/data_prep.py | ESPnet2_for_PyTorch/egs2/grabo/asr1/local/data_prep.py | https://arxiv.org/pdf/2008.01994.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/grabo/asr1/local/data.sh | ESPnet2_for_PyTorch/egs2/grabo/asr1/local/data.sh | ftp://ftp.esat.kuleuven.be/psi/speech/vrenkens/grabo.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/grabo/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/gigaspeech/asr1/local/data.sh | ESPnet2_for_PyTorch/egs2/gigaspeech/asr1/local/data.sh | https://github.com/SpeechColab/GigaSpeech.git | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/gigaspeech/asr1/local/data.sh | ESPnet2_for_PyTorch/egs2/gigaspeech/asr1/local/data.sh | https://github.com/SpeechColab/GigaSpeech#dataset-download | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/gigaspeech/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/fsc_unseen/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | 
https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/fsc_challenge/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/fsc/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/dns_ins20/enh1/local/data.sh | ESPnet2_for_PyTorch/egs2/dns_ins20/enh1/local/data.sh | https://github.com/microsoft/DNS-Challenge.git | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/dns_ins20/enh1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/dirha_wsj/asr1/local/wsj_data_prep_dirha.sh | ESPnet2_for_PyTorch/egs2/dirha_wsj/asr1/local/wsj_data_prep_dirha.sh | https://catalog.ldc.upenn.edu/docs/LDC93S6A/wsj0-train-spkrinfo.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/dirha_wsj/asr1/local/wsj_data_prep_dirha.sh | ESPnet2_for_PyTorch/egs2/dirha_wsj/asr1/local/wsj_data_prep_dirha.sh | https://sourceforge.net/projects/kaldi/files/wsj0-train-spkrinfo.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/dirha_wsj/asr1/local/data.sh | ESPnet2_for_PyTorch/egs2/dirha_wsj/asr1/local/data.sh | https://github.com/SHINE-FBK/DIRHA_English_phrich | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/dirha_wsj/asr1/local/data.sh | ESPnet2_for_PyTorch/egs2/dirha_wsj/asr1/local/data.sh | https://github.com/SHINE-FBK/DIRHA_English_phrich | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/dirha_wsj/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/css10/tts1/README.md | ESPnet2_for_PyTorch/egs2/css10/tts1/local/data.sh | https://github.com/Kyubyong/css10 | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/css10/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/csmsc/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/csj/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/commonvoice/asr1/local/data.sh | ESPnet2_for_PyTorch/egs2/commonvoice/asr1/local/data.sh | https://voice-prod-bundler-ee1969a6ce8178826482b88e843c335139bd3fb4.s3.amazonaws.com/cv-corpus-3/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/commonvoice/asr1/local/data.sh | ESPnet2_for_PyTorch/egs2/commonvoice/asr1/local/data.sh | https://voice-prod-bundler-ee1969a6ce8178826482b88e843c335139bd3fb4.s3.amazonaws.com/cv-corpus-5.1-2020-06-22/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/commonvoice/asr1/local/data.sh | ESPnet2_for_PyTorch/egs2/commonvoice/asr1/local/data.sh | https://commonvoice.mozilla.org/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/commonvoice/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | 
https://github.com/espnet/espnet/blob/master/egs2/cmu_indic/tts1/local/data_download.sh | ESPnet2_for_PyTorch/egs2/cmu_indic/tts1/local/data_download.sh | http://festvox.org/h2r_indic/cmu_indic_ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/cmu_indic/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/cmu_arctic/tts1/local/data_download.sh | ESPnet2_for_PyTorch/egs2/cmu_arctic/tts1/local/data_download.sh | http://festvox.org/cmu_arctic/cmu_arctic/packed/cmu_us_ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/cmu_arctic/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/chime4/enh1/local/CHiME3_simulate_data_patched_parallel.m | ESPnet2_for_PyTorch/egs2/chime4/enh1/local/localize.m | http://www.gnu.org/licenses/gpl.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/chime4/enh1/local/data.sh | ESPnet2_for_PyTorch/egs2/chime4/enh1/local/data.sh | http://spandh.dcs.shef.ac.uk/chime_challenge/CHiME4/download.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/chime4/enh1/local/CHiME3_simulate_data_patched_parallel.m | ESPnet2_for_PyTorch/egs2/chime4/enh1/local/CHiME3_simulate_data_patched_parallel.m | http://www.gnu.org/licenses/gpl.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/chime4/enh1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/chime4/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/catslu/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/babel/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/an4/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/README.md | ESPnet2_for_PyTorch/egs2/an4/asr1/local/data.sh | http://www.speech.cs.cmu.edu/databases/an4/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/an4/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/ami/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/aishell4/asr1/local/data.sh | ESPnet2_for_PyTorch/egs2/aishell4/asr1/local/data.sh | https://github.com/felixfuyihui/AISHELL-4.git | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/aishell4/asr1/local/data.sh | ESPnet2_for_PyTorch/egs2/aishell4/asr1/local/data.sh | https://github.com/DanBerrebbi/AISHELL-4.git | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/aishell4/asr1/local/data.sh | ESPnet2_for_PyTorch/egs2/aishell4/asr1/local/data.sh | https://www.openslr.org/resources/111/ | 模型相关说明 | -| 开源代码引入 | 
https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/aishell4/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/thchs30/tts1/local/download_and_untar.sh | ESPnet2_for_PyTorch/egs2/aishell3/tts1/local/download_and_untar.sh | https://common-voice-data-download.s3.amazonaws.com/cv_corpus_v1.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/aishell3/tts1/local/data.sh | ESPnet2_for_PyTorch/egs2/aishell3/tts1/local/data.sh | https://www.openslr.org/resources/93/data_aishell3.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/aishell3/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/aishell/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs2/aidatatang_200zh/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/yoloxochitl_mixtec/asr1/local/data.sh | ESPnet2_for_PyTorch/egs/yoloxochitl_mixtec/asr1/run.sh | http://www.openslr.org/resources/89/Yoloxochitl-Mixtec-Data.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/yoloxochitl_mixtec/asr1/local/data.sh | ESPnet2_for_PyTorch/egs/yoloxochitl_mixtec/asr1/run.sh | http://www.openslr.org/resources/89/Yoloxochitl-Mixtec-Manifest.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/thchs30/tts1/local/download_and_untar.sh | ESPnet2_for_PyTorch/egs/yoloxochitl_mixtec/asr1/local/download_and_untar.sh | https://common-voice-data-download.s3.amazonaws.com/cv_corpus_v1.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/yoloxochitl_mixtec/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/yesno/asr1/local/data.sh | ESPnet2_for_PyTorch/egs/yesno/tts1/run.sh | http://www.openslr.org/resources/1/waves_yesno.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/yesno/asr1/run.sh | ESPnet2_for_PyTorch/egs/yesno/tts1/run.sh | http://sourceforge.net/projects/kaldi/files/waves_yesno.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/yesno/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/yesno/asr1/local/data.sh | ESPnet2_for_PyTorch/egs/yesno/asr1/run.sh | http://www.openslr.org/resources/1/waves_yesno.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/yesno/asr1/run.sh | ESPnet2_for_PyTorch/egs/yesno/asr1/run.sh | http://sourceforge.net/projects/kaldi/files/waves_yesno.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/yesno/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/wsj0_2mix/enh1/local/wsj0_create_mixture.sh | ESPnet2_for_PyTorch/egs/wsj_mix/asr1/local/wsj0_create_mixture.sh | http://www.merl.com/demos/deep-clustering/create-speaker-mixtures.zip | 模型相关说明 | -| 开源代码引入 | 
https://github.com/espnet/espnet/blob/master/egs2/dirha_wsj/asr1/local/wsj_data_prep_dirha.sh | ESPnet2_for_PyTorch/egs/wsj_mix/asr1/local/wsj_data_prep.sh | https://catalog.ldc.upenn.edu/docs/LDC93S6A/wsj0-train-spkrinfo.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/dirha_wsj/asr1/local/wsj_data_prep_dirha.sh | ESPnet2_for_PyTorch/egs/wsj_mix/asr1/local/wsj_data_prep.sh | https://sourceforge.net/projects/kaldi/files/wsj0-train-spkrinfo.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/wsj_mix/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/dirha_wsj/asr1/local/wsj_data_prep_dirha.sh | ESPnet2_for_PyTorch/egs/wsj/asr1/local/wsj_data_prep.sh | https://catalog.ldc.upenn.edu/docs/LDC93S6A/wsj0-train-spkrinfo.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/dirha_wsj/asr1/local/wsj_data_prep_dirha.sh | ESPnet2_for_PyTorch/egs/wsj/asr1/local/wsj_data_prep.sh | https://sourceforge.net/projects/kaldi/files/wsj0-train-spkrinfo.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/wsj/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/open_li52/asr1/local/getdata.sh | ESPnet2_for_PyTorch/egs/voxforge/asr1/local/getdata.sh | http://www.repository.voxforge1.org/downloads/SpeechCorpus/Trunk/Audio/Main/16kHz_16bit | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/open_li52/asr1/local/getdata.sh | ESPnet2_for_PyTorch/egs/voxforge/asr1/local/getdata.sh | http://www.repository.voxforge1.org/downloads/Dutch/Trunk/Audio/Main/16kHz_16bit | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/open_li52/asr1/local/getdata.sh | ESPnet2_for_PyTorch/egs/voxforge/asr1/local/getdata.sh | http://www.repository.voxforge1.org/downloads/Russian/Trunk/Audio/Main/16kHz_16bit | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/open_li52/asr1/local/getdata.sh | ESPnet2_for_PyTorch/egs/voxforge/asr1/local/getdata.sh | http://www.repository.voxforge1.org/downloads/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/voxforge/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/vivos/asr1/local/data.sh | ESPnet2_for_PyTorch/egs/vivos/asr1/run.sh | https://ailab.hcmus.edu.vn/assets/vivos.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/vivos/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/vcc20/voc1/local/make_subset_data.sh | ESPnet2_for_PyTorch/egs/vcc20/voc1/local/make_subset_data.sh | https://opensource.org/licenses/MIT | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/vcc20/voc1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/tts.sh | ESPnet2_for_PyTorch/egs/vcc20/vc1_task2/run.sh | http://kaldi-asr.org/models/8/0008_sitw_v2_1a.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | 
ESPnet2_for_PyTorch/egs/vcc20/vc1_task2/local/clean_text_mandarin.py | https://github.com/Kyubyong/g2p | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet2_for_PyTorch/egs/vcc20/vc1_task2/local/clean_text_german.py | https://github.com/Kyubyong/g2p | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet2_for_PyTorch/egs/vcc20/vc1_task2/local/clean_text_finnish.py | https://github.com/Kyubyong/g2p | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/vcc20/vc1_task2/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/tts.sh | ESPnet2_for_PyTorch/egs/vcc20/vc1_task1/run.sh | http://kaldi-asr.org/models/8/0008_sitw_v2_1a.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/vcc20/README.md | ESPnet2_for_PyTorch/egs/vcc20/vc1_task1/local/recognize.sh | https://github.com/espnet/espnet/blob/master/egs/librispeech/asr1/RESULTS.md#pytorch-large-transformer-with-specaug-4-gpus--large-lstm-lm | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/tedlium2/align1/local/download_model.sh | ESPnet2_for_PyTorch/egs/vcc20/vc1_task1/local/recognize.sh | https://drive.google.com/open?id=1BtQvAnsFvVi-dp_qsaFP7n4A_5cwnlR6 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libritts/tts1/RESULTS.md | ESPnet2_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1Xj73mDPuuPH8GsyNO8GnOC3mn0_OK4g3 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | ESPnet2_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1UvtFkqdkE8bOCKWXlEltc746JsCKaTMX | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | ESPnet2_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1XYpBZe9-9AgAxGpKfrgQPDjlW2S6duac | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | ESPnet2_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1E6vzNaXT6r7Zybefat_p9ncnMOQCXtem | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | ESPnet2_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=11qOvuMGP76BEe_pcPgYdqnWi05MIIYTA | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | ESPnet2_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1y6IFgLMatjh9wspwu-oBba-rPOH1zKlS | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | ESPnet2_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=16Q3XOAfI5tG0LZ0SIKE166N3RCtzO722 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | ESPnet2_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1EET2qhBi6nl0DH7UEg0Ez9SfgDX-cFM- | 模型相关说明 | -| 开源代码引入 | 
https://github.com/espnet/espnet/blob/master/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | ESPnet2_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1Vd4Qa8Dm9UQ-LZbyNPRiqoSgOZkGQsQi | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | ESPnet2_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1bvyMfA-zKfO2LEogq-QXhHQeETxdBU29 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | ESPnet2_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1rA9ucA-VvhWkcFsGG6izBt2USOZY1_g6 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | ESPnet2_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1QfqwnTK0BKO0z_eYqltzL_MeqVGrMiZg | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | ESPnet2_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1kWBYSkvaQ0-7CwOfjVaWQYF0vEm0rNyS | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | ESPnet2_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=13xDOSo53BSQoF1kD27SdwXoAGqtjtIEM | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | ESPnet2_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=11KKux-du6fvsMMB4jNk9YH23YUJjRcDV | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | ESPnet2_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1li9DLZGnAheWZrB4oXGo0KWq-fHuFH_l | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/vcc20/vc1_task1/local/data_download.sh | ESPnet2_for_PyTorch/egs/vcc20/vc1_task1/local/data_download.sh | https://github.com/nii-yamagishilab/VCC2020-database.git | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet2_for_PyTorch/egs/vcc20/vc1_task1/local/clean_text_asr_result.py | https://github.com/Kyubyong/g2p | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/vcc20/vc1_task1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/tts.sh | ESPnet2_for_PyTorch/egs/vcc20/tts1_en_zh/run.sh | http://kaldi-asr.org/models/8/0008_sitw_v2_1a.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/vcc20/tts1_en_de/local/download.sh | ESPnet2_for_PyTorch/egs/vcc20/tts1_en_zh/local/download_mailabs.sh | http://data.solak.de/data/Training/stt_tts/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/vcc20/tts1_en_zh/local/download_csmsc.sh | ESPnet2_for_PyTorch/egs/vcc20/tts1_en_zh/local/download_csmsc.sh | https://weixinxcxdb.oss-cn-beijing.aliyuncs.com/gwYinPinKu/BZNSYP.rar | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | 
ESPnet2_for_PyTorch/egs/vcc20/tts1_en_zh/local/clean_text_mailabs.py | https://github.com/Kyubyong/g2p | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/vcc20/tts1_en_zh/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/tts.sh | ESPnet2_for_PyTorch/egs/vcc20/tts1_en_fi/run.sh | http://kaldi-asr.org/models/8/0008_sitw_v2_1a.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/vcc20/tts1_en_de/local/download.sh | ESPnet2_for_PyTorch/egs/vcc20/tts1_en_fi/local/download.sh | http://data.solak.de/data/Training/stt_tts/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet2_for_PyTorch/egs/vcc20/tts1_en_fi/local/clean_text_css10.py | https://github.com/Kyubyong/g2p | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/vcc20/tts1_en_fi/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/tts.sh | ESPnet2_for_PyTorch/egs/vcc20/tts1_en_de/run.sh | http://kaldi-asr.org/models/8/0008_sitw_v2_1a.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/vcc20/tts1_en_de/local/download.sh | ESPnet2_for_PyTorch/egs/vcc20/tts1_en_de/local/download.sh | http://data.solak.de/data/Training/stt_tts/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/vcc20/tts1_en_de/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/vais1000/tts1/local/download.sh | ESPnet2_for_PyTorch/egs/vais1000/tts1/local/download.sh | https://drive.google.com/open?id=1HHhLuYhrkk3J6OJctZvgaSd0UgiaROwG | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/vais1000/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/tweb/README.md | ESPnet2_for_PyTorch/egs/tweb/tts1/run.sh | https://www.kaggle.com/bryanpark/the-world-english-bible-speech-dataset | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/tweb/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/timit_ssc/ssr1/run.sh | ESPnet2_for_PyTorch/egs/timit_ssc/ssr1/run.sh | ftp://ftp.espci.fr/pub/sigma/Features/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/timit_ssc/ssr1/run.sh | ESPnet2_for_PyTorch/egs/timit_ssc/ssr1/local/ssc_data_prepare.sh | ftp://ftp.espci.fr/pub/sigma/Features/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/timit_ssc/ssr1/local/ssc_data_prepare.sh | ESPnet2_for_PyTorch/egs/timit_ssc/ssr1/local/ssc_data_prepare.sh | https://ftp.espci.fr/pub/sigma/TIMIT_training/TIMIT_Transcripts.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/timit_ssc/ssr1/local/ssc_data_prepare.sh | ESPnet2_for_PyTorch/egs/timit_ssc/ssr1/local/ssc_data_prepare.sh | https://ftp.espci.fr/pub/sigma/WSJ05K_Test/WSJ0_5K_Transcripts.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/timit_ssc/ssr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | 
https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/timit/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/tedlium3/asr1/local/download_data.sh | ESPnet2_for_PyTorch/egs/tedlium3/asr1/local/download_data.sh | http://www.openslr.org/resources/51/TEDLIUM_release-3.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/tedlium3/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/tedlium2/asr1/local/download_data.sh | ESPnet2_for_PyTorch/egs/tedlium2/asr1/local/download_data.sh | http://www.openslr.org/resources/19/TEDLIUM_release2.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/tedlium2/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/tedlium2/align1/local/download_model.sh | ESPnet2_for_PyTorch/egs/tedlium2/align1/local/download_model.sh | https://drive.google.com/open?id=1UqIY6WJMZ4sxNxSugUqp3mrGb3j6h7xe | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/tedlium2/align1/local/download_model.sh | ESPnet2_for_PyTorch/egs/tedlium2/align1/local/download_model.sh | https://drive.google.com/open?id=1cac5Uc09lJrCYfWkLQsF8eapQcxZnYdf | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/tedlium2/asr1/RESULTS.md | ESPnet2_for_PyTorch/egs/tedlium2/align1/local/download_model.sh | https://drive.google.com/open?id=1cVeSOYY1twOfL9Gns7Z3ZDnkrJqNwPow | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/tedlium2/align1/local/download_model.sh | ESPnet2_for_PyTorch/egs/tedlium2/align1/local/download_model.sh | https://drive.google.com/open?id=1zcPglHAKILwVgfACoMWWERiyIquzSYuU | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/tedlium2/align1/local/download_model.sh | ESPnet2_for_PyTorch/egs/tedlium2/align1/local/download_model.sh | https://drive.google.com/open?id=1BtQvAnsFvVi-dp_qsaFP7n4A_5cwnlR6 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/tedlium2/align1/local/download_model.sh | ESPnet2_for_PyTorch/egs/tedlium2/align1/local/download_model.sh | https://drive.google.com/open?id=17cOOSHHMKI82e1MXj4r2ig8gpGCRmG2p | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/tedlium2/align1/local/download_model.sh | ESPnet2_for_PyTorch/egs/tedlium2/align1/local/download_model.sh | https://drive.google.com/open?id=1tWccl6aYU67kbtkm8jv5H6xayqg1rzjh | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/tedlium2/align1/local/download_model.sh | ESPnet2_for_PyTorch/egs/tedlium2/align1/local/download_model.sh | https://drive.google.com/open?id=120nUQcSsKeY5dpyMWw_kI33ooMRGT2uF | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/tedlium2/align1/local/download_model.sh | ESPnet2_for_PyTorch/egs/tedlium2/align1/local/download_model.sh | https://drive.google.com/open?id=1Az-4H25uwnEFa4lENc-EKiPaWXaijcJp | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/tedlium2/align1/local/download_model.sh | ESPnet2_for_PyTorch/egs/tedlium2/align1/local/download_model.sh | https://drive.google.com/open?id=1jdEKbgWhLTxN_qP4xwE7mTOPmp7Ga--T | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/tedlium2/asr1/local/download_data.sh | 
ESPnet2_for_PyTorch/egs/tedlium2/align1/local/download_data.sh | http://www.openslr.org/resources/19/TEDLIUM_release2.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/tedlium2/align1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/swbd/asr1/local/swbd1_fix_speakerid.pl | ESPnet2_for_PyTorch/egs/swbd/asr1/local/swbd1_fix_speakerid.pl | pengqi@cs.stanford.edu | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/swbd/asr1/local/swbd1_data_prep.sh | ESPnet2_for_PyTorch/egs/swbd/asr1/local/swbd1_data_prep.sh | http://www.ldc.upenn.edu/Catalog/desc/addenda/swb-multi-annot.summary | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/swbd/asr1/local/swbd1_data_download.sh | ESPnet2_for_PyTorch/egs/swbd/asr1/local/swbd1_data_download.sh | http://www.openslr.org/resources/5/switchboard_word_alignments.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/swbd/asr1/local/swbd1_data_download.sh | ESPnet2_for_PyTorch/egs/swbd/asr1/local/swbd1_data_download.sh | http://www.isip.piconepress.com/projects/switchboard/releases/switchboard_word_alignments.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/swbd/asr1/local/eval2000_data_prep.sh | ESPnet2_for_PyTorch/egs/swbd/asr1/local/eval2000_data_prep.sh | http://www.ldc.upenn.edu/Catalog/catalogEntry.jsp?catalogId=LDC2002S09 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/swbd/asr1/local/eval2000_data_prep.sh | ESPnet2_for_PyTorch/egs/swbd/asr1/local/eval2000_data_prep.sh | http://www.ldc.upenn.edu/Catalog/CatalogEntry.jsp?catalogId=LDC2002T43 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/swbd/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/ru_open_stt/asr1/local/ru_open_stt_download_data.sh | ESPnet2_for_PyTorch/egs/ru_open_stt/asr1/local/ru_open_stt_download_data.sh | https://raw.githubusercontent.com/snakers4/open_stt/4bff5470a29dcca5c7175fa3b6fd106c6151b756/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/ru_open_stt/asr1/local/ru_open_stt_download_data.sh | ESPnet2_for_PyTorch/egs/ru_open_stt/asr1/local/ru_open_stt_download_data.sh | https://github.com/snakers4/open_stt/releases/download/v0.5-beta/public_exclude_file_v5.tar.gz | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/ru_open_stt/asr1/local/ru_open_stt_download_data.sh | ESPnet2_for_PyTorch/egs/ru_open_stt/asr1/local/ru_open_stt_download_data.sh | https://github.com/snakers4/open_stt/files/3386441/exclude_df_youtube_1120.zip | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/ru_open_stt/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/reverb/asr1_multich/local/evaltools/score_STOI_scp.m | ESPnet2_for_PyTorch/egs/reverb/asr1_multich/local/evaltools/score_STOI_scp.m | REVERB-challenge@lab.ntt.co.jp | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/reverb/asr1_multich/local/evaltools/score_STOI_scp.m | ESPnet2_for_PyTorch/egs/reverb/asr1_multich/local/evaltools/score_SimData_scp.m | REVERB-challenge@lab.ntt.co.jp | 开发者邮箱配置 | -| 开源代码引入 | 
https://github.com/espnet/espnet/blob/master/egs/reverb/asr1_multich/local/evaltools/score_STOI_scp.m | ESPnet2_for_PyTorch/egs/reverb/asr1_multich/local/evaltools/score_RealData_scp.m | REVERB-challenge@lab.ntt.co.jp | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/reverb/asr1_multich/local/evaltools/score_STOI_scp.m | ESPnet2_for_PyTorch/egs/reverb/asr1_multich/local/evaltools/prog/score_sim_scp.m | REVERB-challenge@lab.ntt.co.jp | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/reverb/asr1_multich/local/evaltools/score_STOI_scp.m | ESPnet2_for_PyTorch/egs/reverb/asr1_multich/local/evaltools/prog/score_real_scp.m | REVERB-challenge@lab.ntt.co.jp | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/reverb/asr1/local/download_se_eval_tool.sh | ESPnet2_for_PyTorch/egs/reverb/asr1_multich/local/download_se_eval_tool.sh | https://www.itu.int/rec/dologin_pub.asp?lang=e&id=T-REC-P.862-200102-I!!SOFT-ZST-E&type=items | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/reverb/asr1/local/download_se_eval_tool.sh | ESPnet2_for_PyTorch/egs/reverb/asr1_multich/local/download_se_eval_tool.sh | https://reverb2014.dereverberation.com/tools/REVERB-SPEENHA.Release04Oct.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/reverb/asr1_multich/local/download_se_eval_tool.sh | ESPnet2_for_PyTorch/egs/reverb/asr1_multich/local/download_se_eval_tool.sh | https://reverb2014.dereverberation.org/tools/taskFiles_et.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/reverb/asr1/local/download_se_eval_tool.sh | ESPnet2_for_PyTorch/egs/reverb/asr1_multich/local/download_se_eval_tool.sh | https://github.com/MuSAELab/SRMRToolbox.git | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/reverb/asr1_multich/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/reverb/asr1/RESULTS | ESPnet2_for_PyTorch/egs/reverb/asr1/RESULTS | https://reverb2014.dereverberation.com/result_asr.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/reverb/asr1/RESULTS | ESPnet2_for_PyTorch/egs/reverb/asr1/RESULTS | https://reverb2014.dereverberation.com/result_asr.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/reverb/asr1/local/prepare_real_data.sh | ESPnet2_for_PyTorch/egs/reverb/asr1/local/prepare_simu_data.sh | https://github.com/kaldi-asr/kaldi/tree/master/egs/reverb/s5/local | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/reverb/asr1/local/prepare_real_data.sh | ESPnet2_for_PyTorch/egs/reverb/asr1/local/prepare_real_data.sh | https://github.com/kaldi-asr/kaldi/tree/master/egs/reverb/s5/local | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/reverb/asr1/local/Generate_mcTrainData_cut.m | ESPnet2_for_PyTorch/egs/reverb/asr1/local/Generate_mcTrainData_cut.m | http://stevem.us/fconv.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/reverb/asr1/local/prepare_real_data.sh | ESPnet2_for_PyTorch/egs/reverb/asr1/local/generate_data.sh | https://github.com/kaldi-asr/kaldi/tree/master/egs/reverb/s5/local | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/reverb/asr1/local/generate_data.sh | ESPnet2_for_PyTorch/egs/reverb/asr1/local/generate_data.sh | http://reverb2014.dereverberation.com/tools/reverb_tools_for_Generate_mcTrainData.tgz | 模型相关说明 | -| 
开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/reverb/asr1/local/prepare_rir_noise_1ch.sh | ESPnet2_for_PyTorch/egs/reverb/asr1/local/generate_data.sh | http://reverb2014.dereverberation.com/tools/REVERB_TOOLS_FOR_ASR_ver2.0.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/reverb/asr1/local/generate_data.sh | ESPnet2_for_PyTorch/egs/reverb/asr1/local/generate_data.sh | http://reverb2014.dereverberation.com/tools/taskFiles_et.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/reverb/asr1/local/download_se_eval_tool.sh | ESPnet2_for_PyTorch/egs/reverb/asr1/local/download_se_eval_tool.sh | https://www.itu.int/rec/dologin_pub.asp?lang=e&id=T-REC-P.862-200102-I!!SOFT-ZST-E&type=items | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/reverb/asr1/local/download_se_eval_tool.sh | ESPnet2_for_PyTorch/egs/reverb/asr1/local/download_se_eval_tool.sh | https://reverb2014.dereverberation.com/tools/REVERB-SPEENHA.Release04Oct.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/reverb/asr1/local/download_se_eval_tool.sh | ESPnet2_for_PyTorch/egs/reverb/asr1/local/download_se_eval_tool.sh | https://github.com/MuSAELab/SRMRToolbox.git | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/reverb/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/puebla_nahuatl/asr1/local/data.sh | ESPnet2_for_PyTorch/egs/puebla_nahuatl/st1/run.sh | https://www.openslr.org/resources/92/Puebla-Nahuatl-Manifest.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/puebla_nahuatl/asr1/local/data.sh | ESPnet2_for_PyTorch/egs/puebla_nahuatl/st1/run.sh | https://www.openslr.org/resources/92/Sound-Files-Puebla-Nahuatl.tgz.part0 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/puebla_nahuatl/st1/run.sh | ESPnet2_for_PyTorch/egs/puebla_nahuatl/st1/run.sh | https://www.openslr.org/resources/92/SpeechTranslation_Nahuatl_Manifest.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/puebla_nahuatl/st1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/puebla_nahuatl/asr1/local/data.sh | ESPnet2_for_PyTorch/egs/puebla_nahuatl/asr1/run.sh | https://www.openslr.org/resources/92/Puebla-Nahuatl-Manifest.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/puebla_nahuatl/asr1/local/data.sh | ESPnet2_for_PyTorch/egs/puebla_nahuatl/asr1/run.sh | https://www.openslr.org/resources/92/Sound-Files-Puebla-Nahuatl.tgz.part0 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/thchs30/tts1/local/download_and_untar.sh | ESPnet2_for_PyTorch/egs/puebla_nahuatl/asr1/local/download_and_untar.sh | https://common-voice-data-download.s3.amazonaws.com/cv_corpus_v1.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/puebla_nahuatl/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/README.md | ESPnet2_for_PyTorch/egs/polyphone_swiss_french/asr1/local/data_prep.py | http://catalog.elra.info/en-us/repository/browse/ELRA-S0030_02 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | 
ESPnet2_for_PyTorch/egs/polyphone_swiss_french/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/must_c_v2/st1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/must_c_v2/mt1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/must_c_v2/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/must_c/st1/local/download_and_untar.sh | ESPnet2_for_PyTorch/egs/must_c/st1/local/download_and_untar.sh | https://drive.google.com/open?id=1Mf2il_VelDIJMSio0bq7I8M9fSs-X4Ie | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/must_c/st1/local/download_and_untar.sh | ESPnet2_for_PyTorch/egs/must_c/st1/local/download_and_untar.sh | https://drive.google.com/open?id=14d2ttsuEUFXsxx-KRWJMsFhQGrYOJcpH | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/must_c/st1/local/download_and_untar.sh | ESPnet2_for_PyTorch/egs/must_c/st1/local/download_and_untar.sh | https://drive.google.com/open?id=1acIBqcPVX5QXXXV9u8_yDPtCgfsdEJDV | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/must_c/st1/local/download_and_untar.sh | ESPnet2_for_PyTorch/egs/must_c/st1/local/download_and_untar.sh | https://drive.google.com/open?id=1qbK88SAKxqjMUybkMeIjrJWnNAZyE8V0 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/must_c/st1/local/download_and_untar.sh | ESPnet2_for_PyTorch/egs/must_c/st1/local/download_and_untar.sh | https://drive.google.com/open?id=11fNraDQs-LiODDxyV5ZW0Slf3XuDq5Cf | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/must_c/st1/local/download_and_untar.sh | ESPnet2_for_PyTorch/egs/must_c/st1/local/download_and_untar.sh | https://drive.google.com/open?id=1C5qK1FckA702nsYcXwmGdzlMmHg1F_ot | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/must_c/st1/local/download_and_untar.sh | ESPnet2_for_PyTorch/egs/must_c/st1/local/download_and_untar.sh | https://drive.google.com/open?id=1nbdYR5VqcTbLpOB-9cICKCgsLAs7fVzd | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/must_c/st1/local/download_and_untar.sh | ESPnet2_for_PyTorch/egs/must_c/st1/local/download_and_untar.sh | https://drive.google.com/open?id=1Z3hSiP7fsR3kf8fjQYzIa07jmw4KXNnw | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/must_c/st1/local/download_and_untar.sh | ESPnet2_for_PyTorch/egs/must_c/st1/local/download_and_untar.sh | https://drive.google.com/file/d/1UBPNwFEVhIZCOEpu4hTqPji57XRg85UO | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/must_c/st1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/must_c/mt1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/must_c/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/mucs21_subtask2/asr1/local/download_data.sh | ESPnet2_for_PyTorch/egs/mucs21_subtask2/asr1/local/download_data.sh 
| http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Hindi-English_train.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/mucs21_subtask2/asr1/local/download_data.sh | ESPnet2_for_PyTorch/egs/mucs21_subtask2/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Bengali-English_train.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/mucs21_subtask2/asr1/local/download_data.sh | ESPnet2_for_PyTorch/egs/mucs21_subtask2/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Hindi-English_test.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/mucs21_subtask2/asr1/local/download_data.sh | ESPnet2_for_PyTorch/egs/mucs21_subtask2/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Bengali-English_test.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/mucs21_subtask2/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/mucs21_subtask1/asr1/local/download_data.sh | ESPnet2_for_PyTorch/egs/mucs21_subtask1/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Hindi_train.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/mucs21_subtask1/asr1/local/download_data.sh | ESPnet2_for_PyTorch/egs/mucs21_subtask1/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Marathi_train.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/mucs21_subtask1/asr1/local/download_data.sh | ESPnet2_for_PyTorch/egs/mucs21_subtask1/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Odia_train.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/mucs21_subtask1/asr1/local/download_data.sh | ESPnet2_for_PyTorch/egs/mucs21_subtask1/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Hindi_test.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/mucs21_subtask1/asr1/local/download_data.sh | ESPnet2_for_PyTorch/egs/mucs21_subtask1/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Marathi_test.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/mucs21_subtask1/asr1/local/download_data.sh | ESPnet2_for_PyTorch/egs/mucs21_subtask1/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Odia_test.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/mucs21_subtask1/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/mtedx/st1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/mtedx/mt1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/mini_an4/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/mgb2/asr1/run.sh | 
ESPnet2_for_PyTorch/egs/mgb2/asr1/run.sh | https://arabicspeech.org/mgb2 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/mgb2/asr1/local/xml2stm.py | ESPnet2_for_PyTorch/egs/mgb2/asr1/local/xml2stm.py | yzhang@qf.org.qa | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/mgb2/asr1/run.sh | ESPnet2_for_PyTorch/egs/mgb2/asr1/local/mgb_extract_data.sh | https://arabicspeech.org/mgb2 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/mgb2/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/mboshi_french/st1/local/data_prep.sh | ESPnet2_for_PyTorch/egs/mboshi_french/st1/local/data_prep.sh | https://github.com/besacier/mboshi-french-parallel-corpus | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/mboshi_french/st1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/m_ailabs/tts1/local/download.sh | ESPnet2_for_PyTorch/egs/m_ailabs/tts1/local/download.sh | http://www.caito.de/data/Training/stt_tts/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/m_ailabs/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet2_for_PyTorch/egs/ljspeech/tts2/run.sh | https://github.com/Kyubyong/g2p | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/ljspeech/tts2/RESULTS | ESPnet2_for_PyTorch/egs/ljspeech/tts2/RESULTS | https://drive.google.com/drive/folders/1AMAKY8uQY59-DL5KNjrPSoosu4sftJPn?usp=sharing | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/ljspeech/tts2/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet2_for_PyTorch/egs/ljspeech/tts1/run.sh | https://github.com/Kyubyong/g2p | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/tedlium2/align1/local/download_model.sh | ESPnet2_for_PyTorch/egs/ljspeech/tts1/local/ob_eval/evaluate_cer.sh | https://drive.google.com/open?id=1BtQvAnsFvVi-dp_qsaFP7n4A_5cwnlR6 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/ljspeech/tts1/local/data_download.sh | ESPnet2_for_PyTorch/egs/ljspeech/tts1/local/data_download.sh | http://data.keithito.com/data/speech/LJSpeech-1.1.tar.bz2 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet2_for_PyTorch/egs/ljspeech/tts1/local/clean_text.py | https://github.com/Kyubyong/g2p | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/ljspeech/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet2_for_PyTorch/egs/ljspeech/asr1/run.sh | https://github.com/Kyubyong/g2p | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/ljspeech/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/tts.sh | ESPnet2_for_PyTorch/egs/libritts/tts1/run.sh | 
http://kaldi-asr.org/models/8/0008_sitw_v2_1a.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/libritts/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/librispeech/asr1/local/data.sh | ESPnet2_for_PyTorch/egs/librispeech/asr1/run.sh | http://www.openslr.org/resources/11/librispeech-lm-norm.txt.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/librispeech/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_trans/st1/run.sh | ESPnet2_for_PyTorch/egs/libri_trans/st1/run.sh | https://persyval-platform.univ-grenoble-alpes.fr/DS91/detaildataset | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_trans/st1/local/data_prep.sh | ESPnet2_for_PyTorch/egs/libri_trans/st1/local/data_prep.sh | https://github.com/eske/seq2seq/blob/master/config/LibriSpeech/prepare-raw.sh | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/libri_trans/st1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_trans/st1/run.sh | ESPnet2_for_PyTorch/egs/libri_trans/mt1/run.sh | https://persyval-platform.univ-grenoble-alpes.fr/DS91/detaildataset | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/libri_trans/mt1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_trans/st1/run.sh | ESPnet2_for_PyTorch/egs/libri_trans/asr1/run.sh | https://persyval-platform.univ-grenoble-alpes.fr/DS91/detaildataset | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/libri_trans/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_css/asr1/run.sh | ESPnet2_for_PyTorch/egs/libri_css/asr1/run.sh | https://github.com/espnet/espnet/blob/master/egs/librispeech/asr1/RESULTS.md#pytorch-large-transformer-with-specaug-4-gpus--transformer-lm-4-gpus | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/tedlium2/align1/local/download_model.sh | ESPnet2_for_PyTorch/egs/libri_css/asr1/run.sh | https://drive.google.com/open?id=17cOOSHHMKI82e1MXj4r2ig8gpGCRmG2p | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_css/asr1/local/wer_output_filter | ESPnet2_for_PyTorch/egs/libri_css/asr1/local/wer_output_filter | jtrmal@gmail.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_css/asr1/local/segmentation/apply_webrtcvad.py | ESPnet2_for_PyTorch/egs/libri_css/asr1/local/segmentation/apply_webrtcvad.py | https://github.com/wiseman/py-webrtcvad/blob/master/example.py | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_css/asr1/local/download_xvector.sh | ESPnet2_for_PyTorch/egs/libri_css/asr1/local/download_xvector.sh | http://kaldi-asr.org/models/12/0012_diarization_v1.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_css/asr1/local/download_xvector.sh | ESPnet2_for_PyTorch/egs/libri_css/asr1/local/download_xvector.sh | https://desh2608.github.io/static/files/jsalt/plda | 模型相关说明 | -| 开源代码引入 | 
https://github.com/espnet/espnet/blob/master/egs/tedlium2/align1/local/download_model.sh | ESPnet2_for_PyTorch/egs/libri_css/asr1/local/download_asr.sh | https://drive.google.com/open?id=17cOOSHHMKI82e1MXj4r2ig8gpGCRmG2p | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_css/asr1/local/diarize.sh | ESPnet2_for_PyTorch/egs/libri_css/asr1/local/diarize.sh | https://github.com/nryant/dscore | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_css/asr1/local/diarize.sh | ESPnet2_for_PyTorch/egs/libri_css/asr1/local/diarize.sh | https://github.com/desh2608/dscore.git | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_css/asr1/local/data_download.sh | ESPnet2_for_PyTorch/egs/libri_css/asr1/local/data_download.sh | https://docs.google.com/uc?export=download&confirm= | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_css/asr1/local/data_download.sh | ESPnet2_for_PyTorch/egs/libri_css/asr1/local/data_download.sh | https://docs.google.com/uc?export=download&id=1Piioxd5G_85K9Bhcr8ebdhXx0CnaHy7l | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_css/asr1/diarization/vb_hmm_xvector.sh | ESPnet2_for_PyTorch/egs/libri_css/asr1/diarization/vb_hmm_xvector.sh | https://arxiv.org/abs/1910.08847 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_css/asr1/diarization/vb_hmm_xvector.sh | ESPnet2_for_PyTorch/egs/libri_css/asr1/diarization/vb_hmm_xvector.sh | http://www.fit.vutbr.cz/research/groups/speech/publi/2019/diez_IEEE_ACM_2019_08910412.pdf | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_css/asr1/diarization/vb_hmm_xvector.sh | ESPnet2_for_PyTorch/egs/libri_css/asr1/diarization/vb_hmm_xvector.sh | https://github.com/desh2608/kaldi-io-for-python.git@vbx | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_css/asr1/diarization/vb_hmm_xvector.py | ESPnet2_for_PyTorch/egs/libri_css/asr1/diarization/vb_hmm_xvector.py | https://github.com/BUTSpeechFIT/VBx | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_css/asr1/diarization/VB_diarization.py | ESPnet2_for_PyTorch/egs/libri_css/asr1/diarization/VB_diarization.py | burget@fit.vutbr. 
| 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/libri_css/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/li42/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/li10/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/README.md | ESPnet2_for_PyTorch/egs/ksponspeech/asr1/run.sh | https://aihub.or.kr/aidata/105 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/ksponspeech/asr1/local/get_space_normalized_hyps.py | ESPnet2_for_PyTorch/egs/ksponspeech/asr1/local/get_space_normalized_hyps.py | https://github.com/kaldi-asr/kaldi/blob/master/src/bin/align-text.cc | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/ksponspeech/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/jvs/tts1/local/pretrained_model_download.sh | ESPnet2_for_PyTorch/egs/jvs/tts1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1mEnZfBKqA4eT6Bn0eRZuP6lNzL-IL3VD | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/jvs/tts1/local/pretrained_model_download.sh | ESPnet2_for_PyTorch/egs/jvs/tts1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1kp5M4VvmagDmYckFJa78WGqh1drb_P9t | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/jvs/tts1/local/data_download.sh | ESPnet2_for_PyTorch/egs/jvs/tts1/local/data_download.sh | https://drive.google.com/open?id=19oAw8wWn3Y7z6CKChRdAyGOB9yupL_Xt | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/jvs/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/jsut/tts1/local/prep_segments.py | ESPnet2_for_PyTorch/egs/jsut/tts1/local/prep_segments.py | https://kaldi-asr.org/doc/data_prep.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/jsut/tts1/local/download.sh | ESPnet2_for_PyTorch/egs/jsut/tts1/local/download.sh | http://ss-takashi.sakura.ne.jp/corpus/jsut_ver1.1.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/jsut/tts1/local/download.sh | ESPnet2_for_PyTorch/egs/jsut/tts1/local/download.sh | https://github.com/r9y9/jsut-lab | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/jsut/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/jsut/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/jsalt18e2e/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/tts.sh | ESPnet2_for_PyTorch/egs/jnas/tts1/run.sh | http://kaldi-asr.org/models/8/0008_sitw_v2_1a.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/jnas/tts1/cmd.sh | 
http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/jnas/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/jesc/mt1/local/download_data.sh | ESPnet2_for_PyTorch/egs/jesc/mt1/local/download_data.sh | https://nlp.stanford.edu/projects/jesc/data/split.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/jesc/mt1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/README.md | ESPnet2_for_PyTorch/egs/iwslt21_low_resource/st1/run.sh | https://iwslt.org/2021/low-resource | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/iwslt21_low_resource/st1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/README.md | ESPnet2_for_PyTorch/egs/iwslt21_low_resource/asr1/run.sh | https://iwslt.org/2021/low-resource | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/iwslt21_low_resource/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/iwslt21/st1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/iwslt21/punc1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/iwslt21/mt1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/iwslt21/asr1/run.sh | ESPnet2_for_PyTorch/egs/iwslt21/asr1/run.sh | https://www.openslr.org/resources/11/librispeech-lm-norm.txt.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/iwslt21/asr1/local/data_prep_wmt20.sh | ESPnet2_for_PyTorch/egs/iwslt21/asr1/local/data_prep_wmt20.sh | https://www.cs.jhu.edu/~kevinduh/t/iwslt21/wmt20/wmt20-de-en-subset5m.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/iwslt21/asr1/local/data_prep_wmt20.sh | ESPnet2_for_PyTorch/egs/iwslt21/asr1/local/data_prep_wmt20.sh | https://www.cs.jhu.edu/~kevinduh/t/iwslt21/wmt20/wmt20-de-en-subset10m.incl_paracrawl.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/iwslt21/asr1/local/data_prep_wmt20.sh | ESPnet2_for_PyTorch/egs/iwslt21/asr1/local/data_prep_wmt20.sh | https://www.cs.jhu.edu/~kevinduh/t/iwslt21/wmt20/wmt20-de-en-subset20m.incl_paracrawl.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/iwslt21/asr1/local/data_prep_wmt20.sh | ESPnet2_for_PyTorch/egs/iwslt21/asr1/local/data_prep_wmt20.sh | https://github.com/saffsd/langid.py | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/iwslt21/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/iwslt19/st1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | 
ESPnet2_for_PyTorch/egs/iwslt19/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/iwslt18/st1/local/download_and_untar.sh | ESPnet2_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~mmueller/iwslt-corpus.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/iwslt18/st1/local/download_and_untar.sh | ESPnet2_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~mmueller/iwslt-corpus.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/iwslt18/st1/local/download_and_untar.sh | ESPnet2_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/preprocessed/IWSLT-SLT.dev2010.en-de.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/iwslt18/st1/local/download_and_untar.sh | ESPnet2_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/preprocessed/IWSLT-SLT.tst2010.en-de.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/iwslt18/st1/local/download_and_untar.sh | ESPnet2_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/preprocessed/IWSLT-SLT.tst2013.en-de.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/iwslt18/st1/local/download_and_untar.sh | ESPnet2_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/preprocessed/IWSLT-SLT.tst2014.en-de.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/iwslt18/st1/local/download_and_untar.sh | ESPnet2_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/preprocessed/IWSLT-SLT.tst2015.en-de.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/iwslt18/st1/local/download_and_untar.sh | ESPnet2_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/preprocessed/IWSLT-SLT.tst2018.en-de.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/iwslt18/st1/local/download_and_untar.sh | ESPnet2_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/segmented/IWSLT-SLT.segmented.tst2019.en-de.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/iwslt18/st1/local/download_and_untar.sh | ESPnet2_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/preprocessed/IWSLT-SLT.tst2019.en-de.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/iwslt18/st1/local/download_and_untar.sh | ESPnet2_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/segmented/IWSLT-SLT.segmented.tst2020.en-de.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/iwslt18/st1/local/download_and_untar.sh | ESPnet2_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/preprocessed/IWSLT-SLT.tst2020.en-de.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/iwslt18/st1/local/data_prep_train.sh | 
ESPnet2_for_PyTorch/egs/iwslt18/st1/local/data_prep_train.sh | https://drive.google.com/open?id=1agQOUEm47LIeLZAFF8RTZ5qx6OsOFGTM | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/iwslt18/st1/local/data_prep_train.sh | ESPnet2_for_PyTorch/egs/iwslt18/st1/local/data_prep_eval.sh | https://drive.google.com/open?id=1agQOUEm47LIeLZAFF8RTZ5qx6OsOFGTM | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/iwslt18/st1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/iwslt18/mt1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/iwslt18/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/iwslt16/mt1/RESULTS.md | ESPnet2_for_PyTorch/egs/iwslt16/mt1/local/train_and_apply_bpe.sh | https://github.com/rsennrich/subword-nmt | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/iwslt16/mt1/local/download_and_untar.sh | ESPnet2_for_PyTorch/egs/iwslt16/mt1/local/download_and_untar.sh | https://wit3.fbk.eu/archive/2016-01/texts/en/de/en-de.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/iwslt16/mt1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_css/asr1/local/wer_output_filter | ESPnet2_for_PyTorch/egs/hub4_spanish/asr1/local/write_kaldi_files.pl | jtrmal@gmail.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_css/asr1/local/wer_output_filter | ESPnet2_for_PyTorch/egs/hub4_spanish/asr1/local/prepare_training_text.pl | jtrmal@gmail.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_css/asr1/local/wer_output_filter | ESPnet2_for_PyTorch/egs/hub4_spanish/asr1/local/prepare_test_text.pl | jtrmal@gmail.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_css/asr1/local/wer_output_filter | ESPnet2_for_PyTorch/egs/hub4_spanish/asr1/local/prepare_data.sh | jtrmal@gmail.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_css/asr1/local/wer_output_filter | ESPnet2_for_PyTorch/egs/hub4_spanish/asr1/local/parse_sgm.pl | jtrmal@gmail.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/hub4_spanish/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/how2/st1/local/data_prep_test.sh | ESPnet2_for_PyTorch/egs/how2/st1/local/data_prep_test.sh | https://islpc21.is.cs.cmu.edu/ramons/iwslt2019.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/how2/st1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/how2/mt1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/how2/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | 
ESPnet2_for_PyTorch/egs/hkust/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/swbd/asr1/local/swbd1_data_prep.sh | ESPnet2_for_PyTorch/egs/fisher_swbd/asr1/local/swbd1_data_prep.sh | http://www.ldc.upenn.edu/Catalog/desc/addenda/swb-multi-annot.summary | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/swbd/asr1/local/swbd1_data_download.sh | ESPnet2_for_PyTorch/egs/fisher_swbd/asr1/local/swbd1_data_download.sh | http://www.openslr.org/resources/5/switchboard_word_alignments.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/swbd/asr1/local/swbd1_data_download.sh | ESPnet2_for_PyTorch/egs/fisher_swbd/asr1/local/swbd1_data_download.sh | http://www.isip.piconepress.com/projects/switchboard/releases/switchboard_word_alignments.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/fisher_swbd/asr1/local/fisher_swbd_prepare_dict.sh | ESPnet2_for_PyTorch/egs/fisher_swbd/asr1/local/fisher_swbd_prepare_dict.sh | https://svn.code.sf.net/p/cmusphinx/code/trunk/cmudict | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/swbd/asr1/local/eval2000_data_prep.sh | ESPnet2_for_PyTorch/egs/fisher_swbd/asr1/local/eval2000_data_prep.sh | http://www.ldc.upenn.edu/Catalog/catalogEntry.jsp?catalogId=LDC2002S09 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/swbd/asr1/local/eval2000_data_prep.sh | ESPnet2_for_PyTorch/egs/fisher_swbd/asr1/local/eval2000_data_prep.sh | http://www.ldc.upenn.edu/Catalog/CatalogEntry.jsp?catalogId=LDC2002T43 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/fisher_swbd/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/fisher_callhome_spanish/st1/local/normalize_trans.sh | ESPnet2_for_PyTorch/egs/fisher_callhome_spanish/st1/local/normalize_trans.sh | https://github.com/joshua-decoder/fisher-callhome-corpus.git | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/fisher_callhome_spanish/st1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/fisher_callhome_spanish/mt1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/fisher_callhome_spanish/asr1b/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/fisher_callhome_spanish/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/dirha_wsj/asr1/local/wsj_data_prep_dirha.sh | ESPnet2_for_PyTorch/egs/dirha_wsj/asr1/local/wsj_data_prep.sh | https://catalog.ldc.upenn.edu/docs/LDC93S6A/wsj0-train-spkrinfo.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/dirha_wsj/asr1/local/wsj_data_prep_dirha.sh | ESPnet2_for_PyTorch/egs/dirha_wsj/asr1/local/wsj_data_prep.sh | https://sourceforge.net/projects/kaldi/files/wsj0-train-spkrinfo.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/dirha_wsj/asr1/local/tools/readsph.m | ESPnet2_for_PyTorch/egs/dirha_wsj/asr1/local/tools/readsph.m | 
http://www.ee.ic.ac.uk/hp/staff/dmb/voicebox/voicebox.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/dirha_wsj/asr1/local/tools/readsph.m | ESPnet2_for_PyTorch/egs/dirha_wsj/asr1/local/tools/readsph.m | http://www.gnu.org/copyleft/gpl.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/dirha_wsj/asr1/local/tools/linear_shift.m | ESPnet2_for_PyTorch/egs/dirha_wsj/asr1/local/tools/read_sphere.m | mravanelli@fbk.eu | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/dirha_wsj/asr1/local/tools/linear_shift.m | ESPnet2_for_PyTorch/egs/dirha_wsj/asr1/local/tools/linear_shift.m | mravanelli@fbk.eu | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/dirha_wsj/asr1/local/tools/linear_shift.m | ESPnet2_for_PyTorch/egs/dirha_wsj/asr1/local/tools/find_files.m | mravanelli@fbk.eu | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/dirha_wsj/asr1/local/tools/linear_shift.m | ESPnet2_for_PyTorch/egs/dirha_wsj/asr1/local/tools/Data_Contamination.m | mravanelli@fbk.eu | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/dirha_wsj/asr1/local/tools/Data_Contamination.m | ESPnet2_for_PyTorch/egs/dirha_wsj/asr1/local/tools/Data_Contamination.m | https://www.ldc.upenn.edu/language-resources/tools/sphere-conversion-tools | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/dirha_wsj/asr1/local/tools/linear_shift.m | ESPnet2_for_PyTorch/egs/dirha_wsj/asr1/local/tools/create_folder_str.m | mravanelli@fbk.eu | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/dirha_wsj/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/dipco/asr1/local/download_data.sh | ESPnet2_for_PyTorch/egs/dipco/asr1/local/download_data.sh | https://s3.amazonaws.com/dipco/DiPCo.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/dipco/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/README.md | ESPnet2_for_PyTorch/egs/csmsc/tts1/local/data_download.sh | https://www.data-baker.com/open_source.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/csmsc/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/csj/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/tedlium2/align1/local/download_model.sh | ESPnet2_for_PyTorch/egs/csj/align1/local/download_model.sh | https://drive.google.com/open?id=1UqIY6WJMZ4sxNxSugUqp3mrGb3j6h7xe | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/tedlium2/align1/local/download_model.sh | ESPnet2_for_PyTorch/egs/csj/align1/local/download_model.sh | https://drive.google.com/open?id=1cac5Uc09lJrCYfWkLQsF8eapQcxZnYdf | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/tedlium2/asr1/RESULTS.md | ESPnet2_for_PyTorch/egs/csj/align1/local/download_model.sh | https://drive.google.com/open?id=1cVeSOYY1twOfL9Gns7Z3ZDnkrJqNwPow | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/tedlium2/align1/local/download_model.sh | ESPnet2_for_PyTorch/egs/csj/align1/local/download_model.sh | 
https://drive.google.com/open?id=1zcPglHAKILwVgfACoMWWERiyIquzSYuU | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/tedlium2/align1/local/download_model.sh | ESPnet2_for_PyTorch/egs/csj/align1/local/download_model.sh | https://drive.google.com/open?id=1BtQvAnsFvVi-dp_qsaFP7n4A_5cwnlR6 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/tedlium2/align1/local/download_model.sh | ESPnet2_for_PyTorch/egs/csj/align1/local/download_model.sh | https://drive.google.com/open?id=17cOOSHHMKI82e1MXj4r2ig8gpGCRmG2p | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/tedlium2/align1/local/download_model.sh | ESPnet2_for_PyTorch/egs/csj/align1/local/download_model.sh | https://drive.google.com/open?id=1tWccl6aYU67kbtkm8jv5H6xayqg1rzjh | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/tedlium2/align1/local/download_model.sh | ESPnet2_for_PyTorch/egs/csj/align1/local/download_model.sh | https://drive.google.com/open?id=120nUQcSsKeY5dpyMWw_kI33ooMRGT2uF | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/csj/asr1/RESULTS.md | ESPnet2_for_PyTorch/egs/csj/align1/local/download_model.sh | https://drive.google.com/open?id=1ALvD4nHan9VDJlYJwNurVr7H7OV0j2X9 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/tedlium2/align1/local/download_model.sh | ESPnet2_for_PyTorch/egs/csj/align1/local/download_model.sh | https://drive.google.com/open?id=1Az-4H25uwnEFa4lENc-EKiPaWXaijcJp | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/tedlium2/align1/local/download_model.sh | ESPnet2_for_PyTorch/egs/csj/align1/local/download_model.sh | https://drive.google.com/open?id=1jdEKbgWhLTxN_qP4xwE7mTOPmp7Ga--T | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/csj/align1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/covost2/st1/run.sh | ESPnet2_for_PyTorch/egs/covost2/st1/run.sh | https://voice-prod-bundler-ee1969a6ce8178826482b88e843c335139bd3fb4.s3.amazonaws.com/cv-corpus-4-2019-12-10/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/covost2/st1/run.sh | ESPnet2_for_PyTorch/egs/covost2/st1/run.sh | https://dl.fbaipublicfiles.com/covost/covost_v2 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/covost2/st1/run.sh | ESPnet2_for_PyTorch/egs/covost2/st1/run.sh | https://dl.fbaipublicfiles.com/covost/covost2.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/covost2/st1/local/process_tsv.py | ESPnet2_for_PyTorch/egs/covost2/st1/local/process_tsv.py | https://github.com/facebookresearch/covost/blob/master/get_covost_splits.py | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/covost2/st1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/covost2/st1/run.sh | ESPnet2_for_PyTorch/egs/covost2/mt1/run.sh | https://voice-prod-bundler-ee1969a6ce8178826482b88e843c335139bd3fb4.s3.amazonaws.com/cv-corpus-4-2019-12-10/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/covost2/st1/run.sh | ESPnet2_for_PyTorch/egs/covost2/mt1/run.sh | https://dl.fbaipublicfiles.com/covost/covost_v2 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/covost2/st1/run.sh | ESPnet2_for_PyTorch/egs/covost2/mt1/run.sh | 
https://dl.fbaipublicfiles.com/covost/covost2.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/covost2/mt1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/covost2/st1/run.sh | ESPnet2_for_PyTorch/egs/covost2/asr1/run.sh | https://voice-prod-bundler-ee1969a6ce8178826482b88e843c335139bd3fb4.s3.amazonaws.com/cv-corpus-4-2019-12-10/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/covost2/st1/run.sh | ESPnet2_for_PyTorch/egs/covost2/asr1/run.sh | https://dl.fbaipublicfiles.com/covost/covost_v2 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/covost2/st1/run.sh | ESPnet2_for_PyTorch/egs/covost2/asr1/run.sh | https://dl.fbaipublicfiles.com/covost/covost2.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/covost2/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/commonvoice/asr1/local/data.sh | ESPnet2_for_PyTorch/egs/commonvoice/asr1/run.sh | https://voice-prod-bundler-ee1969a6ce8178826482b88e843c335139bd3fb4.s3.amazonaws.com/cv-corpus-3/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/commonvoice/asr1/local/data.sh | ESPnet2_for_PyTorch/egs/commonvoice/asr1/run.sh | https://voice-prod-bundler-ee1969a6ce8178826482b88e843c335139bd3fb4.s3.amazonaws.com/cv-corpus-5.1-2020-06-22/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/thchs30/tts1/local/download_and_untar.sh | ESPnet2_for_PyTorch/egs/commonvoice/asr1/local/download_and_untar.sh | https://common-voice-data-download.s3.amazonaws.com/cv_corpus_v1.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/commonvoice/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/cmu_indic/tts1/local/pretrained_model_download.sh | ESPnet2_for_PyTorch/egs/cmu_indic/tts1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1iXdQv_YGD9VG1dR_xCjSkX6A4HkrpTbF | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/cmu_indic/tts1/local/pretrained_model_download.sh | ESPnet2_for_PyTorch/egs/cmu_indic/tts1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1iOwvCx6wX5_qCmHZSX_vCd_ZYn-B5akh | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/cmu_indic/tts1/local/pretrained_model_download.sh | ESPnet2_for_PyTorch/egs/cmu_indic/tts1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1rHQMMjkSoiX3JX2e70MKUKSrxHGwhmRb | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/cmu_indic/tts1/local/pretrained_model_download.sh | ESPnet2_for_PyTorch/egs/cmu_indic/tts1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1cNrTa8Jxa3AYcap7jo0_RPBapiay3etG | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/cmu_indic/tts1/local/pretrained_model_download.sh | ESPnet2_for_PyTorch/egs/cmu_indic/tts1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1zv9GwhhBW32a6RM5wHzjqRxkkv9IrXTL | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/cmu_indic/tts1/local/data_download.sh | ESPnet2_for_PyTorch/egs/cmu_indic/tts1/local/data_download.sh | http://festvox.org/h2r_indic/cmu_indic_ | 模型相关说明 | 
-| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/cmu_indic/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_css/asr1/local/wer_output_filter | ESPnet2_for_PyTorch/egs/chime6/asr1/local/wer_output_filter | jtrmal@gmail.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_css/asr1/local/wer_output_filter | ESPnet2_for_PyTorch/egs/chime6/asr1/local/prepare_dict.sh | jtrmal@gmail.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/fisher_swbd/asr1/local/fisher_swbd_prepare_dict.sh | ESPnet2_for_PyTorch/egs/chime6/asr1/local/prepare_dict.sh | https://svn.code.sf.net/p/cmusphinx/code/trunk/cmudict | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/chime6/asr1/local/prepare_baseline_chime6_data.sh | ESPnet2_for_PyTorch/egs/chime6/asr1/local/prepare_baseline_chime6_data.sh | http://spandh.dcs.shef.ac.uk/chime_challenge/data.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/chime6/asr1/local/prepare_baseline_chime6_data.sh | ESPnet2_for_PyTorch/egs/chime6/asr1/local/prepare_baseline_chime6_data.sh | http://www.openslr.org/resources/28/rirs_noises.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/chime6/asr1/local/prepare_baseline_chime6_data.sh | ESPnet2_for_PyTorch/egs/chime6/asr1/local/prepare_baseline_chime6_data.sh | http://spandh.dcs.shef.ac.uk/chime_workshop/papers/CHiME_2018_paper_boeddecker.pdf | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/chime6/asr1/local/install_pb_chime5.sh | ESPnet2_for_PyTorch/egs/chime6/asr1/local/install_pb_chime5.sh | https://github.com/fgnt/pb_chime5.git | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/chime6/asr1/local/install_pb_chime5.sh | ESPnet2_for_PyTorch/egs/chime6/asr1/local/install_pb_chime5.sh | https://stackoverflow.com/a/3796947/5766934 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/chime6/asr1/local/generate_chime6_data.sh | ESPnet2_for_PyTorch/egs/chime6/asr1/local/generate_chime6_data.sh | https://github.com/chimechallenge/chime6-synchronisation.git | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/chime6/asr1/local/generate_chime6_data.sh | ESPnet2_for_PyTorch/egs/chime6/asr1/local/generate_chime6_data.sh | https://github.com/chimechallenge/chime6-synchronisation | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_css/asr1/local/wer_output_filter | ESPnet2_for_PyTorch/egs/chime6/asr1/local/check_tools.sh | jtrmal@gmail.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/chime6/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/chime5/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/chime4/asr1_multich/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/aurora4/asr1/local/aurora4_data_prep.sh | ESPnet2_for_PyTorch/egs/chime4/asr1/local/clean_wsj0_data_prep.sh | http://www.ldc.upenn.edu/Catalog/docs/LDC93S6A/wsj0-train-spkrinfo.txt | 模型相关说明 | -| 开源代码引入 | 
https://github.com/espnet/espnet/blob/master/egs2/dirha_wsj/asr1/local/wsj_data_prep_dirha.sh | ESPnet2_for_PyTorch/egs/chime4/asr1/local/clean_wsj0_data_prep.sh | https://sourceforge.net/projects/kaldi/files/wsj0-train-spkrinfo.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/chime4/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/blizzard17/tts1/run.sh | ESPnet2_for_PyTorch/egs/blizzard17/tts1/run.sh | http://www.cstr.ed.ac.uk/projects/blizzard/2017/usborne_blizzard2017/license.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/blizzard17/tts1/local/download.sh | ESPnet2_for_PyTorch/egs/blizzard17/tts1/local/download.sh | http://data.cstr.ed.ac.uk/blizzard2017-18/usborne/2018/2018_EH1/blizzard_release_2017_v2.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/blizzard17/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/babel/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/dirha_wsj/asr1/local/wsj_data_prep_dirha.sh | ESPnet2_for_PyTorch/egs/aurora4/asr1/local/wsj_data_prep.sh | https://catalog.ldc.upenn.edu/docs/LDC93S6A/wsj0-train-spkrinfo.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/dirha_wsj/asr1/local/wsj_data_prep_dirha.sh | ESPnet2_for_PyTorch/egs/aurora4/asr1/local/wsj_data_prep.sh | https://sourceforge.net/projects/kaldi/files/wsj0-train-spkrinfo.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/aurora4/asr1/local/aurora4_data_prep.sh | ESPnet2_for_PyTorch/egs/aurora4/asr1/local/aurora4_data_prep.sh | http://www.ldc.upenn.edu/Catalog/docs/LDC93S6A/wsj0-train-spkrinfo.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/dirha_wsj/asr1/local/wsj_data_prep_dirha.sh | ESPnet2_for_PyTorch/egs/aurora4/asr1/local/aurora4_data_prep.sh | https://sourceforge.net/projects/kaldi/files/wsj0-train-spkrinfo.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/aurora4/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/arctic/vc1/path.sh | ESPnet2_for_PyTorch/egs/arctic/vc1/path.sh | https://github.com/cybertronai/pytorch-lamb | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/arctic/vc1/local/pretrained_model_download.sh | ESPnet2_for_PyTorch/egs/arctic/vc1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1mPf-BxX3t_pqFFV6MGPBRePm5kgNR5sM | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/arctic/vc1/local/pretrained_model_download.sh | ESPnet2_for_PyTorch/egs/arctic/vc1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1fRLw6EA0x55xa449i_YRjCgm8sgv3hJI | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/arctic/vc1/local/pretrained_model_download.sh | ESPnet2_for_PyTorch/egs/arctic/vc1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1v70TtwfmYtTHq9LvksX907mNTEv1G-J1 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/arctic/vc1/local/pretrained_model_download.sh | 
ESPnet2_for_PyTorch/egs/arctic/vc1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1ty_de85SNldzVJSMQrHwl1ASBdGdSRav | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/tedlium2/align1/local/download_model.sh | ESPnet2_for_PyTorch/egs/arctic/vc1/local/ob_eval/evaluate.sh | https://drive.google.com/open?id=1BtQvAnsFvVi-dp_qsaFP7n4A_5cwnlR6 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/arctic/vc1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/cmu_indic/tts1/local/pretrained_model_download.sh | ESPnet2_for_PyTorch/egs/arctic/tts1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1iXdQv_YGD9VG1dR_xCjSkX6A4HkrpTbF | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/cmu_indic/tts1/local/pretrained_model_download.sh | ESPnet2_for_PyTorch/egs/arctic/tts1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1iOwvCx6wX5_qCmHZSX_vCd_ZYn-B5akh | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/cmu_indic/tts1/local/pretrained_model_download.sh | ESPnet2_for_PyTorch/egs/arctic/tts1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1rHQMMjkSoiX3JX2e70MKUKSrxHGwhmRb | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/cmu_indic/tts1/local/pretrained_model_download.sh | ESPnet2_for_PyTorch/egs/arctic/tts1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1cNrTa8Jxa3AYcap7jo0_RPBapiay3etG | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/cmu_indic/tts1/local/pretrained_model_download.sh | ESPnet2_for_PyTorch/egs/arctic/tts1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1zv9GwhhBW32a6RM5wHzjqRxkkv9IrXTL | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/cmu_arctic/tts1/local/data_download.sh | ESPnet2_for_PyTorch/egs/arctic/tts1/local/data_download.sh | http://festvox.org/cmu_arctic/cmu_arctic/packed/cmu_us_ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/arctic/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/README.md | ESPnet2_for_PyTorch/egs/an4/tts1/run.sh | http://www.speech.cs.cmu.edu/databases/an4/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/an4/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/README.md | ESPnet2_for_PyTorch/egs/an4/asr1/run.sh | http://www.speech.cs.cmu.edu/databases/an4/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/an4/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/ami/asr1/local/ami_xml2text.sh | ESPnet2_for_PyTorch/egs/ami/asr1/local/ami_xml2text.sh | http://sourceforge.net/projects/nite/files/nite/nxt_1.4.4/nxt_1.4.4.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/ami/asr1/local/ami_xml2text.sh | ESPnet2_for_PyTorch/egs/ami/asr1/local/ami_xml2text.sh | http://groups.inf.ed.ac.uk/ami/AMICorpusAnnotations/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/ami/asr1/conf/ami_beamformit.cfg | 
ESPnet2_for_PyTorch/egs/ami/asr1/local/ami_text_prep.sh | http://groups.inf.ed.ac.uk/ami | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/ami/asr1/conf/ami_beamformit.cfg | ESPnet2_for_PyTorch/egs/ami/asr1/local/ami_download.sh | http://groups.inf.ed.ac.uk/ami | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/ami/asr1/local/ami_download.sh | ESPnet2_for_PyTorch/egs/ami/asr1/local/ami_download.sh | http://groups.inf.ed.ac.uk/ami/download/temp/amiBuild-04237-Sun-Jun-15-2014.manifest.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/ami/asr1/local/ami_download.sh | ESPnet2_for_PyTorch/egs/ami/asr1/local/ami_download.sh | http://groups.inf.ed.ac.uk/ami/download/temp/Creative-Commons-Attribution-NonCommercial-ShareAlike-2.5.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/ami/asr1/local/ami_beamform.sh | ESPnet2_for_PyTorch/egs/ami/asr1/local/ami_beamform.sh | http://groups.inf.ed.ac.uk/ami/corpus/dataproblems.shtml | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/ami/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/aishell2/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/aishell/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet2_for_PyTorch/egs/aidatatang_200zh/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/8.0/Dockerfile | ESPnet2_for_PyTorch/docker/prebuilt/runtime/Dockerfile | nyalta21@gmail.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/ksponspeech/asr1/local/get_space_normalized_hyps.py | ESPnet2_for_PyTorch/docker/prebuilt/runtime/Dockerfile | https://github.com/kaldi-asr/kaldi | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/runtime/Dockerfile | ESPnet2_for_PyTorch/docker/prebuilt/runtime/Dockerfile | https://github.com/espnet/kaldi-bin/releases/download/v0.0.1/ubuntu16-featbin.tar.gz | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/runtime/Dockerfile | ESPnet2_for_PyTorch/docker/prebuilt/runtime/Dockerfile | https://repo.continuum.io/miniconda/Miniconda3-latest-Linux-x86_64.sh | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/8.0/Dockerfile | ESPnet2_for_PyTorch/docker/prebuilt/local/Dockerfile | nyalta21@gmail.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/8.0/Dockerfile | ESPnet2_for_PyTorch/docker/prebuilt/Dockerfile | nyalta21@gmail.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/arctic/vc1/path.sh | ESPnet2_for_PyTorch/docker/prebuilt/Dockerfile | https://github.com/cybertronai/pytorch-lamb | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/8.0/Dockerfile | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/9.2/Dockerfile | nyalta21@gmail.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/9.2/Dockerfile | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/9.2/Dockerfile | 
https://gitlab.com/nvidia/cuda/blob/ubuntu18.04/9.2/base/Dockerfile | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/9.2/Dockerfile | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/9.2/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1710/x86_64/7fa2af80.pub | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/9.2/Dockerfile | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/9.2/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1710/x86_64 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/8.0/Dockerfile | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/9.2/Dockerfile | https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1604/x86_64 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/9.2/Dockerfile | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/9.2/Dockerfile | https://gitlab.com/nvidia/cuda/blob/ubuntu18.04/9.2/devel/Dockerfile | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/9.2/Dockerfile | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/9.2/Dockerfile | https://gitlab.com/nvidia/cuda/blob/ubuntu18.04/9.2/devel/cudnn7/Dockerfile | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/8.0/Dockerfile | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/9.1/Dockerfile | nyalta21@gmail.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/8.0/Dockerfile | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/9.1/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1604/x86_64/7fa2af80.pub | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/8.0/Dockerfile | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/9.1/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1604/x86_64 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/8.0/Dockerfile | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/9.1/Dockerfile | https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1604/x86_64 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/8.0/Dockerfile | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/9.0/Dockerfile | nyalta21@gmail.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/8.0/Dockerfile | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/9.0/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1604/x86_64/7fa2af80.pub | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/8.0/Dockerfile | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/9.0/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1604/x86_64 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/8.0/Dockerfile | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/9.0/Dockerfile | https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1604/x86_64 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/8.0/Dockerfile | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/8.0/Dockerfile | nyalta21@gmail.com | 开发者邮箱配置 | -| 开源代码引入 | 
https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/8.0/Dockerfile | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/8.0/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1604/x86_64/7fa2af80.pub | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/8.0/Dockerfile | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/8.0/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1604/x86_64 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/8.0/Dockerfile | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/8.0/Dockerfile | https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1604/x86_64 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/8.0/Dockerfile | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/11.1/Dockerfile | nyalta21@gmail.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/11.1/Dockerfile | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/11.1/Dockerfile | https://gitlab.com/nvidia/container-images/cuda/-/blob/master/dist/11.1.1/ubuntu20.04-x86_64/base/Dockerfile | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/11.1/Dockerfile | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/11.1/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64/7fa2af80.pub | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/11.1/Dockerfile | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/11.1/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/11.1/Dockerfile | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/11.1/Dockerfile | https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu2004/x86_64 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/10.1/Dockerfile | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/11.1/Dockerfile | https://docs.nvidia.com/cuda/eula/index.html#attachment-a | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/11.1/Dockerfile | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/11.1/Dockerfile | https://gitlab.com/nvidia/container-images/cuda/-/blob/master/dist/11.1.1/ubuntu20.04-x86_64/devel/Dockerfile | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/10.2/Dockerfile | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/11.1/Dockerfile | https://gitlab.com/nvidia/container-images/cuda/-/issues/88 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/8.0/Dockerfile | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/10.2/Dockerfile | nyalta21@gmail.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/10.1/Dockerfile | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/10.2/Dockerfile | https://gitlab.com/nvidia/cuda/blob/ubuntu18.04/10.1/base/Dockerfile | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/10.1/Dockerfile | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/10.2/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64/7fa2af80.pub | 模型相关说明 | -| 开源代码引入 | 
https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/10.1/Dockerfile | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/10.2/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/10.1/Dockerfile | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/10.2/Dockerfile | https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1804/x86_64 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/10.1/Dockerfile | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/10.2/Dockerfile | https://docs.nvidia.com/cuda/eula/index.html#attachment-a | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/10.2/Dockerfile | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/10.2/Dockerfile | https://gitlab.com/nvidia/cuda/blob/ubuntu18.04/10.1/runtime/Dockerfile | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/10.2/Dockerfile | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/10.2/Dockerfile | https://gitlab.com/nvidia/cuda/blob/ubuntu18.04/10.1/devel/Dockerfile | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/10.2/Dockerfile | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/10.2/Dockerfile | https://gitlab.com/nvidia/container-images/cuda/-/issues/88 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/8.0/Dockerfile | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/10.1/Dockerfile | nyalta21@gmail.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/10.1/Dockerfile | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/10.1/Dockerfile | https://gitlab.com/nvidia/cuda/blob/ubuntu18.04/10.1/base/Dockerfile | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/10.1/Dockerfile | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/10.1/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64/7fa2af80.pub | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/10.1/Dockerfile | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/10.1/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/10.1/Dockerfile | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/10.1/Dockerfile | https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1804/x86_64 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/10.1/Dockerfile | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/10.1/Dockerfile | https://docs.nvidia.com/cuda/eula/index.html#attachment-a | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/8.0/Dockerfile | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/10.0/Dockerfile | nyalta21@gmail.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/10.1/Dockerfile | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/10.0/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64/7fa2af80.pub | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/10.1/Dockerfile | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/10.0/Dockerfile | 
https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/10.1/Dockerfile | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/10.0/Dockerfile | https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1804/x86_64 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/10.1/Dockerfile | ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/10.0/Dockerfile | https://docs.nvidia.com/cuda/eula/index.html#attachment-a | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/8.0/Dockerfile | ESPnet2_for_PyTorch/docker/prebuilt/devel/Dockerfile | nyalta21@gmail.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/aishell/asr1/README.md | ESPnet2_for_PyTorch/docker/prebuilt/devel/Dockerfile | https://github.com/espnet/espnet | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/doc/make.bat | ESPnet2_for_PyTorch/doc/make.bat | http://sphinx-doc.org/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/doc/conf.py | ESPnet2_for_PyTorch/doc/conf.py | https://qiita.com/pashango2/items/d1b379b699af85b529ce | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/doc/conf.py | ESPnet2_for_PyTorch/doc/conf.py | https://github.com/rtfd/recommonmark/tree/master/doc/ | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/doc/conf.py | ESPnet2_for_PyTorch/doc/conf.py | http://alabaster.readthedocs.io/en/latest/installation.html#sidebars | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/ci/test_shell.sh | ESPnet2_for_PyTorch/ci/test_utils.sh | https://github.com/bats-core/bats-core.git | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/ksponspeech/asr1/local/get_space_normalized_hyps.py | ESPnet2_for_PyTorch/ci/test_shell.sh | https://github.com/kaldi-asr/kaldi | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/ci/test_shell.sh | ESPnet2_for_PyTorch/ci/test_shell.sh | https://github.com/bats-core/bats-core.git | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/ci/test_shell.sh | ESPnet2_for_PyTorch/ci/test_shell.sh | https://github.com/koalaman/shellcheck/releases/download/stable/shellcheck-stable.linux.x86_64.tar.xz | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/ci/test_integration_espnet2.sh | ESPnet2_for_PyTorch/ci/test_integration_espnet2.sh | https://github.com/pytorch/pytorch/issues/42446 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/ksponspeech/asr1/local/get_space_normalized_hyps.py | ESPnet2_for_PyTorch/ci/install_kaldi.sh | https://github.com/kaldi-asr/kaldi | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/runtime/Dockerfile | ESPnet2_for_PyTorch/ci/install_kaldi.sh | https://github.com/espnet/kaldi-bin/releases/download/v0.0.1/ubuntu16-featbin.tar.gz | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/ci/install.sh | ESPnet2_for_PyTorch/ci/install.sh | https://github.com/kpu/kenlm/archive/master.zip | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/ci/install.sh | ESPnet2_for_PyTorch/ci/install.sh | https://github.com/psf/black/issues/1707 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/ksponspeech/asr1/local/get_space_normalized_hyps.py | ESPnet2_for_PyTorch/ci/doc.sh | https://github.com/kaldi-asr/kaldi | 源码实现 | -| 开源代码引入 | 
https://github.com/espnet/espnet/blob/master/.gitmodules | ESPnet2_for_PyTorch/.gitmodules | https://github.com/espnet/notebook | 源码实现 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|----------------------------------------------------------------------------------------------------------------------------------------------------------------------|--------------------------------------------------------------------------------------------------------------------------------|---------------| +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/docker/prebuilt/devel/Dockerfile | nyalta21@gmail.com | maintainer邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/10.0/Dockerfile | nyalta21@gmail.com | maintainer邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/10.0/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64 / | cuda地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/10.0/Dockerfile | https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1804/x86_64 / | 下载相关依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/10.0/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64/7fa2af80.pub | 下载秘钥 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/10.1/Dockerfile | nyalta21@gmail.com | maintainer邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/10.1/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64 / | cuda地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/10.1/Dockerfile | https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1804/x86_64 / | 下载相关依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/10.1/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64/7fa2af80.pub | 下载秘钥 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/10.2/Dockerfile | nyalta21@gmail.com | maintainer邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/10.2/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64 / | cuda地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/10.2/Dockerfile | https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1804/x86_64 / | 下载相关依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/10.2/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64/7fa2af80.pub | 下载秘钥 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/11.1/Dockerfile | nyalta21@gmail.com | maintainer邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/11.1/Dockerfile | https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu2004/x86_64 / | 下载相关依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/11.1/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64 / | 下载相关依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/11.1/Dockerfile | 
https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64/7fa2af80.pub | 下载秘钥 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/8.0/Dockerfile | nyalta21@gmail.com | maintainer邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/8.0/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1604/x86_64 / | cuda地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/8.0/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1604/x86_64/7fa2af80.pub | 下载秘钥 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/8.0/Dockerfile | https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1604/x86_64 / | ubuntu镜像地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/9.0/Dockerfile | nyalta21@gmail.com | maintainer邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/9.0/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1604/x86_64 / | cuda地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/9.0/Dockerfile | https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1604/x86_64 / | 下载相关依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/9.0/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1604/x86_64/7fa2af80.pub | 下载秘钥 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/9.1/Dockerfile | nyalta21@gmail.com | maintainer邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/9.1/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1604/x86_64 / | cuda地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/9.1/Dockerfile | https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1604/x86_64 / | 下载相关依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/9.1/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1604/x86_64/7fa2af80.pub | 下载秘钥 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/9.2/Dockerfile | nyalta21@gmail.com | maintainer邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/9.2/Dockerfile | https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1604/x86_64 / | 下载相关依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/9.2/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1710/x86_64 / | 下载相关依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/docker/prebuilt/devel/gpu/9.2/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1710/x86_64/7fa2af80.pub | 下载秘钥 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/docker/prebuilt/Dockerfile | nyalta21@gmail.com | maintainer邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/docker/prebuilt/local/Dockerfile | nyalta21@gmail.com | maintainer邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/docker/prebuilt/runtime/Dockerfile | nyalta21@gmail.com | maintainer邮箱 | +| 
ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/docker/prebuilt/runtime/Dockerfile | https://repo.continuum.io/miniconda/Miniconda3-latest-Linux-x86_64.sh | miniconda链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/aidatatang_200zh/asr1/run.sh | www.openslr.org/resources/62 | 下载配置 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/aishell/asr1/run.sh | www.openslr.org/resources/33 | 下载配置 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/ami/asr1/local/ami_download.sh | http://groups.inf.ed.ac.uk/ami | 下载配置 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/ami/asr1/local/ami_download.sh | http://groups.inf.ed.ac.uk/ami/download/temp/amiBuild-04237-Sun-Jun-15-2014.manifest.txt | 下载配置 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/ami/asr1/local/ami_text_prep.sh | http://groups.inf.ed.ac.uk/ami | 下载配置 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/ami/asr1/local/ami_xml2text.sh | http://groups.inf.ed.ac.uk/ami/AMICorpusAnnotations/$annots.gzip | 下载配置 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/an4/asr1/run.sh | http://www.speech.cs.cmu.edu/databases/an4/ | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/an4/tts1/run.sh | http://www.speech.cs.cmu.edu/databases/an4/ | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/arctic/tts1/local/data_download.sh | http://festvox.org/cmu_arctic/cmu_arctic/packed/cmu_us_${spk}_arctic-0.95-release.tar.bz2 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/aurora4/asr1/local/aurora4_data_prep.sh | http://www.ldc.upenn.edu/Catalog/docs/LDC93S6A/wsj0-train-spkrinfo.txt | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/aurora4/asr1/local/wsj_data_prep.sh | https://catalog.ldc.upenn.edu/docs/LDC93S6A/wsj0-train-spkrinfo.txt | 下载数据集 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/blizzard17/tts1/local/download.sh | http://data.cstr.ed.ac.uk/blizzard2017-18/usborne/2018/2018_EH1/blizzard_release_2017_v2.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/chime4/asr1/local/clean_wsj0_data_prep.sh | http://www.ldc.upenn.edu/Catalog/docs/LDC93S6A/wsj0-train-spkrinfo.txt | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/chime6/asr1/local/prepare_baseline_chime6_data.sh | http://www.openslr.org/resources/28/rirs_noises.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/chime6/asr1/local/prepare_dict.sh | https://svn.code.sf.net/p/cmusphinx/code/trunk/cmudict $dir/cmudict | 相关设置 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/cmu_indic/tts1/local/data_download.sh | http://festvox.org/h2r_indic/cmu_indic_${spk}.tar.bz2 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/commonvoice/asr1/run.sh | https://voice-prod-bundler-ee1969a6ce8178826482b88e843c335139bd3fb4.s3.amazonaws.com/cv-corpus-5.1-2020-06-22/${lang}.tar.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/covost2/asr1/run.sh | https://dl.fbaipublicfiles.com/covost/covost2.zip | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/covost2/asr1/run.sh | https://voice-prod-bundler-ee1969a6ce8178826482b88e843c335139bd3fb4.s3.amazonaws.com/cv-corpus-4-2019-12-10/${src_lang}.tar.gz | 数据集地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/covost2/asr1/run.sh | https://dl.fbaipublicfiles.com/covost/covost_v2.${src_lang}_${tgt_lang}.tsv.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/covost2/mt1/run.sh | https://dl.fbaipublicfiles.com/covost/covost2.zip | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/covost2/mt1/run.sh | https://voice-prod-bundler-ee1969a6ce8178826482b88e843c335139bd3fb4.s3.amazonaws.com/cv-corpus-4-2019-12-10/${src_lang}.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/covost2/mt1/run.sh | https://dl.fbaipublicfiles.com/covost/covost_v2.${src_lang}_${tgt_lang}.tsv.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/covost2/st1/run.sh | https://dl.fbaipublicfiles.com/covost/covost2.zip | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/covost2/st1/run.sh | https://voice-prod-bundler-ee1969a6ce8178826482b88e843c335139bd3fb4.s3.amazonaws.com/cv-corpus-4-2019-12-10/${src_lang}.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/covost2/st1/run.sh | https://dl.fbaipublicfiles.com/covost/covost_v2.${src_lang}_${tgt_lang}.tsv.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/csmsc/tts1/local/data_download.sh | https://www.data-baker.com/open_source.html | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/dipco/asr1/local/download_data.sh | https://s3.amazonaws.com/dipco/DiPCo.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/dirha_wsj/asr1/local/wsj_data_prep.sh | https://catalog.ldc.upenn.edu/docs/LDC93S6A/wsj0-train-spkrinfo.txt | 下载数据集 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/fisher_swbd/asr1/local/fisher_swbd_prepare_dict.sh | https://svn.code.sf.net/p/cmusphinx/code/trunk/cmudict | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/fisher_swbd/asr1/local/swbd1_data_download.sh | http://www.openslr.org/resources/5/switchboard_word_alignments.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/fisher_swbd/asr1/local/swbd1_data_download.sh | http://www.isip.piconepress.com/projects/switchboard/releases/switchboard_word_alignments.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~mmueller/iwslt-corpus.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/preprocessed/IWSLT-SLT.tst2020.en-de.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/preprocessed/IWSLT-SLT.tst2019.en-de.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/preprocessed/IWSLT-SLT.tst2018.en-de.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/preprocessed/IWSLT-SLT.tst2015.en-de.tgz | 数据集链接 | +| 
ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/preprocessed/IWSLT-SLT.tst2014.en-de.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/preprocessed/IWSLT-SLT.tst2013.en-de.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/preprocessed/IWSLT-SLT.tst2010.en-de.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/preprocessed/IWSLT-SLT.dev2010.en-de.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/iwslt21/asr1/local/data_prep_wmt20.sh | https://www.cs.jhu.edu/~kevinduh/t/iwslt21/wmt20/wmt20-de-en-subset20m.incl_paracrawl.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/iwslt21/asr1/local/data_prep_wmt20.sh | https://www.cs.jhu.edu/~kevinduh/t/iwslt21/wmt20/wmt20-de-en-subset20m.incl_paracrawl.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/iwslt21/asr1/local/data_prep_wmt20.sh | https://www.cs.jhu.edu/~kevinduh/t/iwslt21/wmt20/wmt20-de-en-subset5m.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/iwslt21/asr1/run.sh | https://www.openslr.org/resources/11/librispeech-lm-norm.txt.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/iwslt21_low_resource/asr1/run.sh | https://iwslt.org/2021/low-resource | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/iwslt21_low_resource/st1/run.sh | https://iwslt.org/2021/low-resource | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/jesc/mt1/local/download_data.sh | https://nlp.stanford.edu/projects/jesc/data/split.tar.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/jnas/tts1/run.sh | http://kaldi-asr.org/models/8/0008_sitw_v2_1a.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/jsut/tts1/local/download.sh | http://ss-takashi.sakura.ne.jp/corpus/jsut_ver1.1.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/libri_css/asr1/local/download_xvector.sh | http://kaldi-asr.org/models/12/0012_diarization_v1.tar.gz | 下载依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/librispeech/asr1/run.sh | www.openslr.org/resources/12 | 下载配置 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/librispeech/asr1/run.sh | http://www.openslr.org/resources/11/librispeech-lm-norm.txt.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/libritts/tts1/run.sh | http://kaldi-asr.org/models/8/0008_sitw_v2_1a.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/libritts/tts1/run.sh | www.openslr.org/resources/60 | 下载配置 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/ljspeech/tts1/local/data_download.sh | http://data.keithito.com/data/speech/LJSpeech-1.1.tar.bz2 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/m_ailabs/tts1/local/download.sh | http://www.caito.de/data/Training/stt_tts/${lang}.tgz | 数据集链接 | +| 
ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/mgb2/asr1/local/mgb_extract_data.sh | https://arabicspeech.org/mgb2 | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/mgb2/asr1/local/xml2stm.py | yzhang@qf.org.qa | 作者邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/mucs21_subtask1/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Odia_train.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/mucs21_subtask1/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Marathi_train.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/mucs21_subtask1/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Hindi_train.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/mucs21_subtask1/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Odia_test.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/mucs21_subtask1/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Marathi_test.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/mucs21_subtask1/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Hindi_test.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/mucs21_subtask2/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Hindi-English_test.tar.gz | 下载数据集 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/mucs21_subtask2/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Hindi-English_test.tar.gz | 下载数据集 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/mucs21_subtask2/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Hindi-English_train.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/mucs21_subtask2/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Bengali-English_train.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/puebla_nahuatl/asr1/run.sh | https://www.openslr.org/resources/92/Puebla-Nahuatl-Manifest.tgz Puebla-Nahuatl-Manifest.tgz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/puebla_nahuatl/asr1/run.sh | https://www.openslr.org/resources/92/Sound-Files-Puebla-Nahuatl.tgz.part0 Sound-Files-Puebla-Nahuatl.tgz.part0 9 | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/puebla_nahuatl/st1/run.sh | https://www.openslr.org/resources/92/Puebla-Nahuatl-Manifest.tgz Puebla-Nahuatl-Manifest.tgz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/puebla_nahuatl/st1/run.sh | https://www.openslr.org/resources/92/SpeechTranslation_Nahuatl_Manifest.tgz SpeechTranslation_Nahuatl_Manifest.tgz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/puebla_nahuatl/st1/run.sh | https://www.openslr.org/resources/92/Sound-Files-Puebla-Nahuatl.tgz.part0 Sound-Files-Puebla-Nahuatl.tgz.part0 9 | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/reverb/asr1/local/download_se_eval_tool.sh | 
https://reverb2014.dereverberation.com/tools/REVERB-SPEENHA.Release04Oct.zip | 下载工具脚本 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/reverb/asr1/local/generate_data.sh | http://reverb2014.dereverberation.com/tools/taskFiles_et.tgz | 下载工具脚本 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/reverb/asr1/local/generate_data.sh | http://reverb2014.dereverberation.com/tools/REVERB_TOOLS_FOR_ASR_ver2.0.tgz | 下载工具脚本 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/reverb/asr1/local/generate_data.sh | http://reverb2014.dereverberation.com/tools/reverb_tools_for_Generate_mcTrainData.tgz | 下载工具脚本 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/reverb/asr1_multich/local/download_se_eval_tool.sh | https://reverb2014.dereverberation.org/tools/taskFiles_et.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/reverb/asr1_multich/local/download_se_eval_tool.sh | https://reverb2014.dereverberation.com/tools/REVERB-SPEENHA.Release04Oct.zip | 下载工具脚本 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/ru_open_stt/asr1/local/ru_open_stt_download_data.sh | https://raw.githubusercontent.com/snakers4/open_stt/4bff5470a29dcca5c7175fa3b6fd106c6151b756/${f} | 模型相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/swbd/asr1/local/swbd1_data_download.sh | http://www.openslr.org/resources/5/switchboard_word_alignments.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/swbd/asr1/local/swbd1_data_download.sh | http://www.isip.piconepress.com/projects/switchboard/releases/switchboard_word_alignments.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/tedlium2/align1/local/download_data.sh | http://www.openslr.org/resources/19/TEDLIUM_release2.tar.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/tedlium2/asr1/local/download_data.sh | http://www.openslr.org/resources/19/TEDLIUM_release2.tar.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/tedlium3/asr1/local/download_data.sh | http://www.openslr.org/resources/51/TEDLIUM_release-3.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/vcc20/tts1_en_de/local/download.sh | http://data.solak.de/data/Training/stt_tts/${lang}.tgz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/vcc20/tts1_en_de/run.sh | http://kaldi-asr.org/models/8/0008_sitw_v2_1a.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/vcc20/tts1_en_fi/local/download.sh | http://data.solak.de/data/Training/stt_tts/${lang}.tgz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/vcc20/tts1_en_fi/run.sh | http://kaldi-asr.org/models/8/0008_sitw_v2_1a.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/vcc20/tts1_en_zh/local/download_csmsc.sh | https://weixinxcxdb.oss-cn-beijing.aliyuncs.com/gwYinPinKu/BZNSYP.rar | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/vcc20/tts1_en_zh/local/download_mailabs.sh | http://data.solak.de/data/Training/stt_tts/${lang}.tgz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/vcc20/tts1_en_zh/run.sh | http://kaldi-asr.org/models/8/0008_sitw_v2_1a.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/vcc20/vc1_task1/run.sh | http://kaldi-asr.org/models/8/0008_sitw_v2_1a.tar.gz | 模型相关说明 | +| 
ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/vcc20/vc1_task2/run.sh | http://kaldi-asr.org/models/8/0008_sitw_v2_1a.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/voxforge/asr1/local/getdata.sh | http://www.repository.voxforge1.org/downloads/SpeechCorpus/Trunk/Audio/Main/16kHz_16bit | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/voxforge/asr1/local/getdata.sh | http://www.repository.voxforge1.org/downloads/Russian/Trunk/Audio/Main/16kHz_16bit | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/voxforge/asr1/local/getdata.sh | http://www.repository.voxforge1.org/downloads/Dutch/Trunk/Audio/Main/16kHz_16bit | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/voxforge/asr1/local/getdata.sh | http://www.repository.voxforge1.org/downloads/$lang/Trunk/Audio/Main/16kHz_16bit | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/wsj/asr1/local/wsj_data_prep.sh | https://catalog.ldc.upenn.edu/docs/LDC93S6A/wsj0-train-spkrinfo.txt | 下载数据集 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/wsj_mix/asr1/local/wsj_data_prep.sh | https://catalog.ldc.upenn.edu/docs/LDC93S6A/wsj0-train-spkrinfo.txt | 下载数据集 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/wsj_mix/asr1/local/wsj0_create_mixture.sh | http://www.merl.com/demos/deep-clustering/create-speaker-mixtures.zip | 下载工具脚本 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/yesno/asr1/run.sh | http://www.openslr.org/resources/1/waves_yesno.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/yesno/tts1/run.sh | http://www.openslr.org/resources/1/waves_yesno.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/yoloxochitl_mixtec/asr1/run.sh | http://www.openslr.org/resources/89/Yoloxochitl-Mixtec-Manifest.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs/yoloxochitl_mixtec/asr1/run.sh | http://www.openslr.org/resources/89/Yoloxochitl-Mixtec-Data.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/aidatatang_200zh/asr1/local/data.sh | www.openslr.org/resources/62 | 下载配置 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/aishell/asr1/local/data.sh | www.openslr.org/resources/33 | 下载配置 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/aishell3/tts1/local/data.sh | https://www.openslr.org/resources/93/data_aishell3.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/aishell4/asr1/local/data.sh | https://www.openslr.org/resources/111/$room_name.tar.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/an4/asr1/local/data.sh | http://www.speech.cs.cmu.edu/databases/an4/ | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/cmu_arctic/tts1/local/data_download.sh | http://festvox.org/cmu_arctic/cmu_arctic/packed/cmu_us_${spk}_arctic-0.95-release.tar.bz2 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/cmu_indic/tts1/local/data_download.sh | http://festvox.org/h2r_indic/cmu_indic_${spk}.tar.bz2 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/commonvoice/asr1/local/data.sh | https://voice-prod-bundler-ee1969a6ce8178826482b88e843c335139bd3fb4.s3.amazonaws.com/cv-corpus-5.1-2020-06-22/${lang}.tar.gz | 数据集链接 | +| 
ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/dirha_wsj/asr1/local/wsj_data_prep_dirha.sh | https://catalog.ldc.upenn.edu/docs/LDC93S6A/wsj0-train-spkrinfo.txt | 下载数据集 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/how2/asr1/local/data.sh | http://islpc21.is.cs.cmu.edu/ramons/iwslt2019.tar.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/hui_acg/tts1/local/data_download.sh | https://opendata.iisys.de/systemintegration/Datasets/HUI-Audio-Corpus-German/dataset_clean | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/iwslt21_low_resource/asr1/local/data.sh | https://zenodo.org/record/5227612/files/swahili-asr-resources.tar.xz?download=1 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/iwslt21_low_resource/asr1/local/data.sh | https://zenodo.org/record/4541727/files/asr_train_asr_conformer_raw_ru_bpe100_valid.acc.ave.zip?download=1 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/jdcinal/asr1/local/data.sh | http://tts.speech.cs.cmu.edu/awb/infomation_navigation_and_attentive_listening_0.2.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/jv_openslr35/asr1/local/data.sh | https://www.openslr.org/resources/35/asr_javanese_${i}.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/librilight_limited/asr1/local/data.sh | https://dl.fbaipublicfiles.com/librilight/data/librispeech_finetuning.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/librilight_limited/asr1/local/data.sh | www.openslr.org/resources/12 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/librimix/enh1/local/data.sh | https://storage.googleapis.com/whisper-public/wham_noise.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/librispeech/asr1/conf/tuning/train_asr_transformer3_w2v_large_lv60_960h_finetuning_last_1layer.yaml | https://dl.fbaipublicfiles.com/fairseq/wav2vec/wav2vec2_vox_960h_new.pt | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/librispeech/asr1/local/data.sh | www.openslr.org/resources/12 | 下载配置 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/librispeech/asr1/local/data.sh | http://www.openslr.org/resources/11/librispeech-lm-norm.txt.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/librispeech/asr1/local/data.sh | http://www.openslr.org/resources/11/librispeech-lm-norm.txt.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/libritts/tts1/local/data.sh | www.openslr.org/resources/60 | 下载配置 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/lrs2/lipreading1/local/data.sh | https://zenodo.org/record/5090353/files/lipread_lrw_pretrain.pt.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/mini_librispeech/diar1/local/data.sh | http://www.openslr.org/resources/26/sim_rir_8k.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/mini_librispeech/diar1/local/data.sh | https://www.openslr.org/resources/17/ | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/mini_librispeech/diar1/local/data.sh | http://www.openslr.org/resources/31 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/mls/asr1/local/data.sh | https://dl.fbaipublicfiles.com/mls/mls_lm_${download_id}.tar.gz | 数据集链接 | +| 
ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/mls/asr1/local/data.sh | https://dl.fbaipublicfiles.com/mls/mls_${download_id}.tar.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/open_li52/asr1/local/data.sh | https://voice-prod-bundler-ee1969a6ce8178826482b88e843c335139bd3fb4.s3.amazonaws.com/cv-corpus-5.1-2020-06-22/${lang}.tar.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/open_li52/asr1/local/getdata.sh | http://www.repository.voxforge1.org/downloads/SpeechCorpus/Trunk/Audio/Main/16kHz_16bit | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/open_li52/asr1/local/getdata.sh | http://www.repository.voxforge1.org/downloads/Russian/Trunk/Audio/Main/16kHz_16bit | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/open_li52/asr1/local/getdata.sh | http://www.repository.voxforge1.org/downloads/Dutch/Trunk/Audio/Main/16kHz_16bit | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/open_li52/asr1/local/getdata.sh | http://www.repository.voxforge1.org/downloads/$lang/Trunk/Audio/Main/16kHz_16bit | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/primewords_chinese/asr1/local/data.sh | https://www.openslr.org/resources/47/primewords_md_2018_set1.tar.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/puebla_nahuatl/asr1/local/data.sh | https://www.openslr.org/resources/92/Puebla-Nahuatl-Manifest.tgz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/puebla_nahuatl/asr1/local/data.sh | https://www.openslr.org/resources/92/Sound-Files-Puebla-Nahuatl.tgz.part0 | 下载地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/reverb/asr1/local/prepare_rir_noise_1ch.sh | http://reverb2014.dereverberation.com/tools/REVERB_TOOLS_FOR_ASR_ver2.0.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/siwis/tts1/local/data_download.sh | https://datashare.ed.ac.uk/download/DS_10283_2353.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/sms_wsj/enh1/local/create_database.sh | https://zenodo.org/record/3517889/files/sms_wsj.tar.gz.parta{a,b,c,d,e} | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/speechcommands/asr1/local/data.sh | http://download.tensorflow.org/data/speech_commands_test_set_v0.02.tar.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/speechcommands/asr1/local/data.sh | http://download.tensorflow.org/data/speech_commands_v0.02.tar.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/su_openslr36/asr1/local/data.sh | https://www.openslr.org/resources/36/asr_sundanese_${i}.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/tedlium2/asr1/local/download_data.sh | http://www.openslr.org/resources/19/TEDLIUM_release2.tar.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/TEMPLATE/ssl1/scripts/km.sh | https://dl.fbaipublicfiles.com/hubert/hubert_base_ls960.pt | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/TEMPLATE/tts1/tts.sh | http://kaldi-asr.org/models/8/0008_sitw_v2_1a.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/thchs30/asr1/local/data.sh | https://www.openslr.org/resources/18/data_thchs30.tgz | 数据集链接 | +| 
ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/thchs30/tts1/local/data.sh | https://www.openslr.org/resources/18/data_thchs30.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/totonac/asr1/local/data.sh | https://www.openslr.org/resources/107/Amith-Lopez_Totonac-recordings-northern-Puebla-and-adjacent-Veracruz_Metadata.xml | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/totonac/asr1/local/data.sh | https://www.openslr.org/resources/107/Totonac_Corpus.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/tsukuyomi/tts1/local/data_download.sh | https://tyc.rei-yumesaki.net/files/sozai-tyc-corpus1.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/vctk/tts1/local/data_download.sh | http://www.udialogue.org/download/VCTK-Corpus.tar.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/vctk_noisy/enh1/local/data.sh | https://doi.org/10.7488/ds/2117 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/vctk_noisyreverb/enh1/local/data.sh | https://doi.org/10.7488/ds/2117 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/vctk_noisyreverb/enh1/local/data.sh | https://doi.org/10.7488/ds/2139 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/wham/enh1/local/wham_create_mixture.sh | https://storage.googleapis.com/whisper-public/wham_scripts.tar.gz | 下载依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/wham/enh1/local/wham_create_mixture.sh | https://storage.googleapis.com/whisper-public/wham_noise.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/whamr/enh1/local/whamr_create_mixture.sh | https://storage.googleapis.com/whisper-public/wham_noise.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/whamr/enh1/local/whamr_create_mixture.sh | https://storage.googleapis.com/whisper-public/whamr_scripts.tar.gz | 下载工具脚本 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/wsj0_2mix/enh1/local/wsj0_create_mixture.sh | http://www.merl.com/demos/deep-clustering/create-speaker-mixtures.zip | 下载工具脚本 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/wsj0_2mix_spatialized/enh1/local/spatialize_wsj0_mix.sh | https://www.merl.com/demos/deep-clustering/spatialize_wsj0-mix.zip | 下载工具脚本 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/yesno/asr1/local/data.sh | http://www.openslr.org/resources/1/waves_yesno.tar.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/yoloxochitl_mixtec/asr1/local/data.sh | http://www.openslr.org/resources/89/Yoloxochitl-Mixtec-Manifest.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/yoloxochitl_mixtec/asr1/local/data.sh | http://www.openslr.org/resources/89/Yoloxochitl-Mixtec-Data.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/egs2/zeroth_korean/asr1/local/download_and_untar.sh | http://www.openslr.org/resources/40/zeroth_korean.tar.gz | 模型相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/espnet/nets/pytorch_backend/transducer/arguments.py | https://arxiv.org/abs/2010.11148 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/espnet2/asr/encoder/wav2vec2_encoder.py | https://dl.fbaipublicfiles.com/fairseq/wav2vec/dict.ltr.txt | 下载配置 | +| 
ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/setup.py | shinjiw@ieee.org | 作者邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/tools/installers/install_k2.sh | https://k2-fsa.org/nightly/ | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/tools/installers/install_k2.sh | https://k2-fsa.org/nightly/ | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/tools/installers/install_k2.sh | https://k2-fsa.org/nightly/ | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/tools/installers/install_k2.sh | https://k2-fsa.org/nightly/ | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/tools/installers/install_mwerSegmenter.sh | https://www-i6.informatik.rwth-aachen.de/web/Software/mwerSegmenter.tar.gz | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/tools/installers/install_nkf.sh | https://ja.osdn.net/dl/nkf/nkf-2.1.4.tar.gz | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/tools/installers/install_sctk.sh | ftp://jaguar.ncsl.nist.gov/pub/sctk-2.4.10-20151007-1312Z.tar.bz2 | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/tools/installers/install_sctk.sh | http://www.openslr.org/resources/4/sctk-2.4.10-20151007-1312Z.tar.bz2 | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/tools/installers/install_sph2pipe.sh | http://www.openslr.org/resources/3/sph2pipe_v2.5.tar.gz | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/tools/installers/install_torch.sh | https://download.pytorch.org/whl/torch_stable.html | 三方库下载 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/tools/installers/install_torch.sh | https://download.pytorch.org/whl/torch_stable.html | 三方库下载 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/tools/installers/install_torch.sh | https://download.pytorch.org/whl/torch_stable.html | 三方库下载 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/tools/installers/install_torch.sh | https://download.pytorch.org/whl/torch_stable.html | 三方库下载 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/tools/installers/install_torch.sh | https://download.pytorch.org/whl/torch_stable.html | 三方库下载 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/tools/installers/install_torch.sh | https://download.pytorch.org/whl/torch_stable.html | 三方库下载 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/tools/installers/install_torch.sh | https://download.pytorch.org/whl/torch_stable.html | 三方库下载 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/tools/installers/install_torch.sh | https://download.pytorch.org/whl/torch_stable.html | 三方库下载 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/tools/setup_anaconda.sh | https://repo.continuum.io/miniconda/Miniconda3-latest-Linux-x86_64.sh | miniconda链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/utils/pack_model.sh | shinjiw@ieee.org | 作者邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet2_for_PyTorch/utils/synth_wav.sh | http://kaldi-asr.org/models/8/0008_sitw_v2_1a.tar.gz | 模型相关说明 | \ No newline at end of file diff --git a/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/public_address_statement.md b/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/public_address_statement.md index 2a54e345002d562d67a810a25caaa444d02676de..e605d0cf3043a8c430726cc4e596fdc400089705 100644 --- 
a/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/public_address_statement.md @@ -1,682 +1,167 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ------ |------|---------------------------------------------------------------------|------------------------------------------------------------------------|---------| -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/Dockerfile | nyalta21@gmail.com | 邮箱 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/Dockerfile | https://github.com/espnet/espnet | 下载源码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/10.0/Dockerfile | nyalta21@gmail.com | 邮箱 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/10.0/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64/7fa2af80.pub | 下载密钥 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/10.1/Dockerfile | nyalta21@gmail.com | 邮箱 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/10.1/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64/7fa2af80.pub | 下载密钥 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/10.2/Dockerfile | nyalta21@gmail.com | 邮箱 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/10.2/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64/7fa2af80.pub | 下载密钥 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/11.1/Dockerfile | nyalta21@gmail.com | 邮箱 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/11.1/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64/7fa2af80.pub | 下载密钥 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/8.0/Dockerfile | nyalta21@gmail.com | 邮箱 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/8.0/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1604/x86_64/7fa2af80.pub | 下载密钥 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/9.0/Dockerfile | nyalta21@gmail.com | 邮箱 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/9.0/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1604/x86_64/7fa2af80.pub | 下载密钥 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/9.1/Dockerfile | nyalta21@gmail.com | 邮箱 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/9.1/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1604/x86_64/7fa2af80.pub | 下载密钥 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | 
ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/9.2/Dockerfile | nyalta21@gmail.com | 邮箱 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/9.2/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1710/x86_64/7fa2af80.pub | 下载密钥 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/docker/prebuilt/Dockerfile | nyalta21@gmail.com | 邮箱 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/docker/prebuilt/Dockerfile | https://github.com/cybertronai/pytorch-lamb | 下载第三方库 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/docker/prebuilt/local/Dockerfile | nyalta21@gmail.com | 邮箱 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/docker/prebuilt/runtime/Dockerfile | https://github.com/kaldi-asr/kaldi | 下载依赖 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/docker/prebuilt/runtime/Dockerfile | https://github.com/espnet/kaldi-bin/releases/download/v0.0.1/ubuntu16-featbin.tar.gz | 下载依赖 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/docker/prebuilt/runtime/Dockerfile | https://repo.continuum.io/miniconda/Miniconda3-latest-Linux-x86_64.sh | 下载依赖 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/aidatatang_200zh/asr1/run.sh | www.openslr.org/resources/62 | 下载配置 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/aishell/asr1/run.sh | www.openslr.org/resources/33 | 下载配置 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/ami/asr1/local/ami_download.sh | http://groups.inf.ed.ac.uk/ami | 下载配置 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/ami/asr1/local/ami_download.sh | http://groups.inf.ed.ac.uk/ami/download/temp/amiBuild-04237-Sun-Jun-15-2014.manifest.txt | 下载配置 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/ami/asr1/local/ami_download.sh | http://groups.inf.ed.ac.uk/ami/download/temp/Creative-Commons-Attribution-NonCommercial-ShareAlike-2.5.txt | 下载配置 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/ami/asr1/local/ami_text_prep.sh | http://groups.inf.ed.ac.uk/ami | 下载配置 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/ami/asr1/local/ami_xml2text.sh | http://sourceforge.net/projects/nite/files/nite/nxt_1.4.4/nxt_1.4.4.zip | 下载依赖 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/ami/asr1/local/ami_xml2text.sh | http://groups.inf.ed.ac.uk/ami/AMICorpusAnnotations/$annots.gzip | 下载配置 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/an4/asr1/run.sh | http://www.speech.cs.cmu.edu/databases/an4/ | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/an4/tts1/run.sh | http://www.speech.cs.cmu.edu/databases/an4/ | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/arctic/tts1/local/data_download.sh | http://festvox.org/cmu_arctic/cmu_arctic/packed/cmu_us_${spk}_arctic-0.95-release.tar.bz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | 
ESPnet_Dynamic_for_PyTorch/egs/arctic/tts1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1iXdQv_YGD9VG1dR_xCjSkX6A4HkrpTbF | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/arctic/tts1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1iOwvCx6wX5_qCmHZSX_vCd_ZYn-B5akh | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/arctic/tts1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1rHQMMjkSoiX3JX2e70MKUKSrxHGwhmRb | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/arctic/tts1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1cNrTa8Jxa3AYcap7jo0_RPBapiay3etG | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/arctic/tts1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1zv9GwhhBW32a6RM5wHzjqRxkkv9IrXTL | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/arctic/vc1/local/ob_eval/evaluate.sh | https://drive.google.com/open?id=1BtQvAnsFvVi-dp_qsaFP7n4A_5cwnlR6 | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/arctic/vc1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1mPf-BxX3t_pqFFV6MGPBRePm5kgNR5sM | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/arctic/vc1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1fRLw6EA0x55xa449i_YRjCgm8sgv3hJI | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/arctic/vc1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1v70TtwfmYtTHq9LvksX907mNTEv1G-J1 | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/arctic/vc1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1ty_de85SNldzVJSMQrHwl1ASBdGdSRav | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/aurora4/asr1/local/aurora4_data_prep.sh | http://www.ldc.upenn.edu/Catalog/docs/LDC93S6A/wsj0-train-spkrinfo.txt | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/aurora4/asr1/local/aurora4_data_prep.sh | https://sourceforge.net/projects/kaldi/files/wsj0-train-spkrinfo.txt | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/aurora4/asr1/local/wsj_data_prep.sh | https://catalog.ldc.upenn.edu/docs/LDC93S6A/wsj0-train-spkrinfo.txt | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/aurora4/asr1/local/wsj_data_prep.sh | https://sourceforge.net/projects/kaldi/files/wsj0-train-spkrinfo.txt | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/blizzard17/tts1/local/download.sh | http://data.cstr.ed.ac.uk/blizzard2017-18/usborne/2018/2018_EH1/blizzard_release_2017_v2.zip | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/chime4/asr1/local/clean_wsj0_data_prep.sh | http://www.ldc.upenn.edu/Catalog/docs/LDC93S6A/wsj0-train-spkrinfo.txt | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/chime4/asr1/local/clean_wsj0_data_prep.sh | 
https://sourceforge.net/projects/kaldi/files/wsj0-train-spkrinfo.txt | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/chime6/asr1/local/generate_chime6_data.sh | https://github.com/chimechallenge/chime6-synchronisation.git | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/chime6/asr1/local/install_pb_chime5.sh | https://github.com/fgnt/pb_chime5.git | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/chime6/asr1/local/prepare_baseline_chime6_data.sh | http://www.openslr.org/resources/28/rirs_noises.zip | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/chime6/asr1/local/prepare_dict.sh | https://svn.code.sf.net/p/cmusphinx/code/trunk/cmudict | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/cmu_indic/tts1/local/data_download.sh | http://festvox.org/h2r_indic/cmu_indic_${spk}.tar.bz2 | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/cmu_indic/tts1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1iXdQv_YGD9VG1dR_xCjSkX6A4HkrpTbF | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/cmu_indic/tts1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1iOwvCx6wX5_qCmHZSX_vCd_ZYn-B5akh | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/cmu_indic/tts1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1rHQMMjkSoiX3JX2e70MKUKSrxHGwhmRb | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/cmu_indic/tts1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1cNrTa8Jxa3AYcap7jo0_RPBapiay3etG | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/cmu_indic/tts1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1zv9GwhhBW32a6RM5wHzjqRxkkv9IrXTL | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/commonvoice/asr1/run.sh | https://voice-prod-bundler-ee1969a6ce8178826482b88e843c335139bd3fb4.s3.amazonaws.com/cv-corpus-5.1-2020-06-22/${lang}.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/covost2/asr1/run.sh | https://voice-prod-bundler-ee1969a6ce8178826482b88e843c335139bd3fb4.s3.amazonaws.com/cv-corpus-4-2019-12-10/${src_lang}.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/covost2/asr1/run.sh | https://dl.fbaipublicfiles.com/covost/covost_v2.${src_lang}_${tgt_lang}.tsv.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/covost2/asr1/run.sh | https://dl.fbaipublicfiles.com/covost/covost2.zip | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/covost2/mt1/run.sh | https://voice-prod-bundler-ee1969a6ce8178826482b88e843c335139bd3fb4.s3.amazonaws.com/cv-corpus-4-2019-12-10/${src_lang}.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/covost2/mt1/run.sh | https://dl.fbaipublicfiles.com/covost/covost_v2.${src_lang}_${tgt_lang}.tsv.tar.gz | 下载数据集 | -| 开源代码引入 | 
https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/covost2/mt1/run.sh | https://dl.fbaipublicfiles.com/covost/covost2.zip | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/csj/align1/local/download_model.sh | https://drive.google.com/open?id=1UqIY6WJMZ4sxNxSugUqp3mrGb3j6h7xe | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/csj/align1/local/download_model.sh | https://drive.google.com/open?id=1cac5Uc09lJrCYfWkLQsF8eapQcxZnYdf | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/csj/align1/local/download_model.sh | https://drive.google.com/open?id=1cVeSOYY1twOfL9Gns7Z3ZDnkrJqNwPow | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/csj/align1/local/download_model.sh | https://drive.google.com/open?id=1zcPglHAKILwVgfACoMWWERiyIquzSYuU | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/csj/align1/local/download_model.sh | https://drive.google.com/open?id=1BtQvAnsFvVi-dp_qsaFP7n4A_5cwnlR6 | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/csj/align1/local/download_model.sh | https://drive.google.com/open?id=17cOOSHHMKI82e1MXj4r2ig8gpGCRmG2p | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/csj/align1/local/download_model.sh | https://drive.google.com/open?id=1tWccl6aYU67kbtkm8jv5H6xayqg1rzjh | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/csj/align1/local/download_model.sh | https://drive.google.com/open?id=120nUQcSsKeY5dpyMWw_kI33ooMRGT2uF | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/csj/align1/local/download_model.sh | https://drive.google.com/open?id=1ALvD4nHan9VDJlYJwNurVr7H7OV0j2X9 | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/csj/align1/local/download_model.sh | https://drive.google.com/open?id=1Az-4H25uwnEFa4lENc-EKiPaWXaijcJp | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/csj/align1/local/download_model.sh | https://drive.google.com/open?id=1jdEKbgWhLTxN_qP4xwE7mTOPmp7Ga--T | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/dipco/asr1/local/download_data.sh | https://s3.amazonaws.com/dipco/DiPCo.tgz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/dirha_wsj/asr1/local/wsj_data_prep.sh | https://catalog.ldc.upenn.edu/docs/LDC93S6A/wsj0-train-spkrinfo.txt | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/dirha_wsj/asr1/local/wsj_data_prep.sh | https://sourceforge.net/projects/kaldi/files/wsj0-train-spkrinfo.txt | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/fisher_callhome_spanish/st1/local/normalize_trans.sh | https://github.com/joshua-decoder/fisher-callhome-corpus.git | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/fisher_swbd/asr1/local/fisher_swbd_prepare_dict.sh | https://svn.code.sf.net/p/cmusphinx/code/trunk/cmudict | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | 
ESPnet_Dynamic_for_PyTorch/egs/fisher_swbd/asr1/local/swbd1_data_download.sh | http://www.openslr.org/resources/5/switchboard_word_alignments.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/fisher_swbd/asr1/local/swbd1_data_download.sh | http://www.isip.piconepress.com/projects/switchboard/releases/switchboard_word_alignments.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/how2/st1/local/data_prep_test.sh | https://islpc21.is.cs.cmu.edu/ramons/iwslt2019.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/iwslt16/mt1/local/download_and_untar.sh | https://wit3.fbk.eu/archive/2016-01/texts/en/de/en-de.tgz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/iwslt16/mt1/local/train_and_apply_bpe.sh | https://github.com/rsennrich/subword-nmt | 下载依赖 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/iwslt18/st1/local/data_prep_train.sh | https://drive.google.com/open?id=1agQOUEm47LIeLZAFF8RTZ5qx6OsOFGTM | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~mmueller/iwslt-corpus.zip | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/preprocessed/IWSLT-SLT.dev2010.en-de.tgz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/preprocessed/IWSLT-SLT.tst2010.en-de.tgz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/preprocessed/IWSLT-SLT.tst2013.en-de.tgz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/preprocessed/IWSLT-SLT.tst2014.en-de.tgz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/preprocessed/IWSLT-SLT.tst2015.en-de.tgz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/preprocessed/IWSLT-SLT.tst2018.en-de.tgz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/preprocessed/IWSLT-SLT.tst2019.en-de.tgz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/preprocessed/IWSLT-SLT.tst2020.en-de.tgz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/iwslt21/asr1/local/data_prep_wmt20.sh | 
https://www.cs.jhu.edu/~kevinduh/t/iwslt21/wmt20/wmt20-de-en-subset5m.tgz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/iwslt21/asr1/local/data_prep_wmt20.sh | https://www.cs.jhu.edu/~kevinduh/t/iwslt21/wmt20/wmt20-de-en-subset10m.incl_paracrawl.tgz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/iwslt21/asr1/local/data_prep_wmt20.sh | https://www.cs.jhu.edu/~kevinduh/t/iwslt21/wmt20/wmt20-de-en-subset20m.incl_paracrawl.tgz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/iwslt21/asr1/local/data_prep_wmt20.sh | https://github.com/saffsd/langid.py | 下载工具脚本 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/iwslt21/asr1/run.sh | https://www.openslr.org/resources/11/librispeech-lm-norm.txt.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/jesc/mt1/local/download_data.sh | https://nlp.stanford.edu/projects/jesc/data/split.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/jnas/tts1/run.shh | http://kaldi-asr.org/models/8/0008_sitw_v2_1a.tar.gz | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/jsut/tts1/local/download.sh | http://ss-takashi.sakura.ne.jp/corpus/jsut_ver1.1.zip | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/jsut/tts1/local/download.sh | https://github.com/r9y9/jsut-lab | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/jvs/tts1/local/data_download.sh | https://drive.google.com/open?id=19oAw8wWn3Y7z6CKChRdAyGOB9yupL_Xt | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/jvs/tts1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1mEnZfBKqA4eT6Bn0eRZuP6lNzL-IL3VD | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/jvs/tts1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1kp5M4VvmagDmYckFJa78WGqh1drb_P9t | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/libri_css/asr1/diarization/vb_hmm_xvector.sh | https://github.com/desh2608/kaldi-io-for-python.git@vbx | 下载依赖 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/libri_css/asr1/local/data_download.sh | https://docs.google.com/uc?export=download&confirm= | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/libri_css/asr1/local/data_download.sh | https://docs.google.com/uc?export=download&id=1Piioxd5G_85K9Bhcr8ebdhXx0CnaHy7l | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/libri_css/asr1/local/diarize.sh | https://github.com/desh2608/dscore.git | 下载依赖 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/libri_css/asr1/local/download_xvector.sh | http://kaldi-asr.org/models/12/0012_diarization_v1.tar.gz | 下载依赖 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/libri_css/asr1/local/download_xvector.sh | https://desh2608.github.io/static/files/jsalt/plda | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | 
ESPnet_Dynamic_for_PyTorch/egs/libri_css/asr1/run.sh | https://drive.google.com/open?id=17cOOSHHMKI82e1MXj4r2ig8gpGCRmG2p | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/librispeech/asr1/run.sh | www.openslr.org/resources/12 | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/librispeech/asr1/run.sh | http://www.openslr.org/resources/11/librispeech-lm-norm.txt.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/libritts/tts1/run.sh | www.openslr.org/resources/60 | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/libritts/tts1/run.sh | http://kaldi-asr.org/models/8/0008_sitw_v2_1a.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/jsut/tts1/local/download.sh | http://data.keithito.com/data/speech/LJSpeech-1.1.tar.bz2 | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/ljspeech/tts1/local/ob_eval/evaluate_cer.sh | https://drive.google.com/open?id=1BtQvAnsFvVi-dp_qsaFP7n4A_5cwnlR6 | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/m_ailabs/tts1/local/download.sh | http://www.caito.de/data/Training/stt_tts/${lang}.tgz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/mboshi_french/st1/local/data_prep.sh | https://github.com/besacier/mboshi-french-parallel-corpus | 下载工具脚本 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/mucs21_subtask1/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Hindi_train.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/mucs21_subtask1/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Marathi_train.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/mucs21_subtask1/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Odia_train.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/mucs21_subtask1/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Hindi_test.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/mucs21_subtask1/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Marathi_test.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/mucs21_subtask1/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Odia_test.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/mucs21_subtask2/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Hindi-English_train.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/mucs21_subtask2/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Bengali-English_train.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | 
ESPnet_Dynamic_for_PyTorch/egs/mucs21_subtask2/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Hindi-English_test.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/mucs21_subtask2/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Bengali-English_test.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/must_c/st1/local/download_and_untar.sh | https://drive.google.com/open?id=1Mf2il_VelDIJMSio0bq7I8M9fSs-X4Ie | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/must_c/st1/local/download_and_untar.sh | https://drive.google.com/open?id=14d2ttsuEUFXsxx-KRWJMsFhQGrYOJcpH | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/must_c/st1/local/download_and_untar.sh | https://drive.google.com/open?id=1acIBqcPVX5QXXXV9u8_yDPtCgfsdEJDV | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/must_c/st1/local/download_and_untar.sh | https://drive.google.com/open?id=1qbK88SAKxqjMUybkMeIjrJWnNAZyE8V0 | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/must_c/st1/local/download_and_untar.sh | https://drive.google.com/open?id=11fNraDQs-LiODDxyV5ZW0Slf3XuDq5Cf | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/must_c/st1/local/download_and_untar.sh | https://drive.google.com/open?id=1C5qK1FckA702nsYcXwmGdzlMmHg1F_ot | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/must_c/st1/local/download_and_untar.sh | https://drive.google.com/open?id=1nbdYR5VqcTbLpOB-9cICKCgsLAs7fVzd | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/must_c/st1/local/download_and_untar.sh | https://drive.google.com/open?id=1Z3hSiP7fsR3kf8fjQYzIa07jmw4KXNnw | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/must_c/st1/local/download_and_untar.sh | https://drive.google.com/file/d/1UBPNwFEVhIZCOEpu4hTqPji57XRg85UO | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/puebla_nahuatl/asr1/run.sh | https://www.openslr.org/resources/92/Puebla-Nahuatl-Manifest.tgz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/puebla_nahuatl/asr1/run.sh | https://www.openslr.org/resources/92/Sound-Files-Puebla-Nahuatl.tgz.part0 | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/puebla_nahuatl/st1/run.sh | https://www.openslr.org/resources/92/Puebla-Nahuatl-Manifest.tgz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/puebla_nahuatl/st1/run.sh | https://www.openslr.org/resources/92/Sound-Files-Puebla-Nahuatl.tgz.part0 | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/puebla_nahuatl/st1/run.sh | https://www.openslr.org/resources/92/SpeechTranslation_Nahuatl_Manifest.tgz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/reverb/asr1/local/download_se_eval_tool.sh | https://www.itu.int/rec/dologin_pub.asp?lang=e&id=T-REC-P.862-200102-I!!SOFT-ZST-E&type=items | 下载密钥 | -| 
开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/reverb/asr1/local/download_se_eval_tool.sh | https://reverb2014.dereverberation.com/tools/REVERB-SPEENHA.Release04Oct.zip | 下载工具脚本 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/reverb/asr1/local/download_se_eval_tool.sh | https://github.com/MuSAELab/SRMRToolbox.git | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/reverb/asr1/local/generate_data.sh | http://reverb2014.dereverberation.com/tools/reverb_tools_for_Generate_mcTrainData.tgz | 下载工具脚本 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/reverb/asr1/local/generate_data.sh | http://reverb2014.dereverberation.com/tools/REVERB_TOOLS_FOR_ASR_ver2.0.tgz | 下载工具脚本 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/reverb/asr1/local/generate_data.sh | http://reverb2014.dereverberation.com/tools/taskFiles_et.tgz | 下载工具脚本 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/reverb/asr1_multich/local/download_se_eval_tool.sh | https://www.itu.int/rec/dologin_pub.asp?lang=e&id=T-REC-P.862-200102-I!!SOFT-ZST-E&type=items | 下载密钥 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/reverb/asr1_multich/local/download_se_eval_tool.sh | https://reverb2014.dereverberation.com/tools/REVERB-SPEENHA.Release04Oct.zip | 下载工具脚本 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/reverb/asr1_multich/local/download_se_eval_tool.sh | http://reverb2014.dereverberation.com/tools/taskFiles_et.tgz | 下载工具脚本 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/reverb/asr1_multich/local/download_se_eval_tool.sh | https://github.com/MuSAELab/SRMRToolbox.git | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/ru_open_stt/asr1/local/ru_open_stt_download_data.sh | https://raw.githubusercontent.com/snakers4/open_stt/4bff5470a29dcca5c7175fa3b6fd106c6151b756/${f} | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/ru_open_stt/asr1/local/ru_open_stt_download_data.sh | https://github.com/snakers4/open_stt/releases/download/v0.5-beta/public_exclude_file_v5.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/ru_open_stt/asr1/local/ru_open_stt_download_data.sh | https://github.com/snakers4/open_stt/files/3386441/exclude_df_youtube_1120.zip | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/swbd/asr1/local/swbd1_data_download.sh | http://www.openslr.org/resources/5/switchboard_word_alignments.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/swbd/asr1/local/swbd1_data_download.sh | http://www.isip.piconepress.com/projects/switchboard/releases/switchboard_word_alignments.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/tedlium2/align1/local/download_data.sh | http://www.openslr.org/resources/19/TEDLIUM_release2.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/tedlium2/align1/local/download_model.sh | https://drive.google.com/open?id=1UqIY6WJMZ4sxNxSugUqp3mrGb3j6h7xe | 
下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/tedlium2/align1/local/download_model.sh | https://drive.google.com/open?id=1cac5Uc09lJrCYfWkLQsF8eapQcxZnYdf | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/tedlium2/align1/local/download_model.sh | https://drive.google.com/open?id=1cVeSOYY1twOfL9Gns7Z3ZDnkrJqNwPow | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/tedlium2/align1/local/download_model.sh | https://drive.google.com/open?id=1zcPglHAKILwVgfACoMWWERiyIquzSYuU | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/tedlium2/align1/local/download_model.sh | https://drive.google.com/open?id=1BtQvAnsFvVi-dp_qsaFP7n4A_5cwnlR6 | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/tedlium2/align1/local/download_model.sh | https://drive.google.com/open?id=17cOOSHHMKI82e1MXj4r2ig8gpGCRmG2p | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/tedlium2/align1/local/download_model.sh | https://drive.google.com/open?id=1tWccl6aYU67kbtkm8jv5H6xayqg1rzjh | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/tedlium2/align1/local/download_model.sh | https://drive.google.com/open?id=120nUQcSsKeY5dpyMWw_kI33ooMRGT2uF | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/tedlium2/align1/local/download_model.sh | https://drive.google.com/open?id=1Az-4H25uwnEFa4lENc-EKiPaWXaijcJp | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/tedlium2/align1/local/download_model.sh | https://drive.google.com/open?id=1jdEKbgWhLTxN_qP4xwE7mTOPmp7Ga--T | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/tedlium2/asr1/local/download_data.sh | http://www.openslr.org/resources/19/TEDLIUM_release2.tar.gz | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/tedlium3/asr1/local/download_data.sh | http://www.openslr.org/resources/51/TEDLIUM_release-3.tgz | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/timit_ssc/ssr1/local/ssc_data_prepare.sh | ftp://ftp.espci.fr/pub/sigma/Features/${feat_dir}/ | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/timit_ssc/ssr1/local/ssc_data_prepare.sh | https://ftp.espci.fr/pub/sigma/TIMIT_training/TIMIT_Transcripts.txt | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/timit_ssc/ssr1/local/ssc_data_prepare.sh | https://ftp.espci.fr/pub/sigma/WSJ05K_Test/WSJ0_5K_Transcripts.txt | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/vais1000/tts1/local/download.sh | https://drive.google.com/open?id=1HHhLuYhrkk3J6OJctZvgaSd0UgiaROwG | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/vcc20/tts1_en_de/local/download.sh | http://data.solak.de/data/Training/stt_tts/${lang}.tgz | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/vcc20/tts1_en_de/run.sh | http://kaldi-asr.org/models/8/0008_sitw_v2_1a.tar.gz | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/vcc20/tts1_en_fi/local/download.sh | http://data.solak.de/data/Training/stt_tts/${lang}.tgz | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/vcc20/tts1_en_fi/run.sh | http://kaldi-asr.org/models/8/0008_sitw_v2_1a.tar.gz | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/vcc20/tts1_en_zh/local/download_csmsc.sh | https://weixinxcxdb.oss-cn-beijing.aliyuncs.com/gwYinPinKu/BZNSYP.rar | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/vcc20/tts1_en_zh/local/download_mailabs.sh | http://data.solak.de/data/Training/stt_tts/${lang}.tgz | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/vcc20/tts1_en_zh/run.sh | http://kaldi-asr.org/models/8/0008_sitw_v2_1a.tar.gz | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/vcc20/vc1_task1/local/data_download.sh | https://github.com/nii-yamagishilab/VCC2020-database.git | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1Xj73mDPuuPH8GsyNO8GnOC3mn0_OK4g3 | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1UvtFkqdkE8bOCKWXlEltc746JsCKaTMX | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1XYpBZe9-9AgAxGpKfrgQPDjlW2S6duac | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1E6vzNaXT6r7Zybefat_p9ncnMOQCXtem | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=11qOvuMGP76BEe_pcPgYdqnWi05MIIYTA | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1y6IFgLMatjh9wspwu-oBba-rPOH1zKlS | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=16Q3XOAfI5tG0LZ0SIKE166N3RCtzO722 | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1EET2qhBi6nl0DH7UEg0Ez9SfgDX-cFM- | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1Vd4Qa8Dm9UQ-LZbyNPRiqoSgOZkGQsQi | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1bvyMfA-zKfO2LEogq-QXhHQeETxdBU29 | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1rA9ucA-VvhWkcFsGG6izBt2USOZY1_g6 | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1QfqwnTK0BKO0z_eYqltzL_MeqVGrMiZg | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1kWBYSkvaQ0-7CwOfjVaWQYF0vEm0rNyS | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=13xDOSo53BSQoF1kD27SdwXoAGqtjtIEM | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=11KKux-du6fvsMMB4jNk9YH23YUJjRcDV | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1li9DLZGnAheWZrB4oXGo0KWq-fHuFH_l | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/vcc20/vc1_task1/local/recognize.sh | https://drive.google.com/open?id=1BtQvAnsFvVi-dp_qsaFP7n4A_5cwnlR6 | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/vcc20/vc1_task1/run.sh | http://kaldi-asr.org/models/8/0008_sitw_v2_1a.tar.gz | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/vcc20/vc1_task2/run.sh | http://kaldi-asr.org/models/8/0008_sitw_v2_1a.tar.gz | 下载预训练模型 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/vivos/asr1/run.sh | https://ailab.hcmus.edu.vn/assets/vivos.tar.gz | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/voxforge/asr1/local/getdata.sh | http://www.repository.voxforge1.org/downloads/SpeechCorpus/Trunk/Audio/Main/16kHz_16bit | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/voxforge/asr1/local/getdata.sh | http://www.repository.voxforge1.org/downloads/Dutch/Trunk/Audio/Main/16kHz_16bit | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/voxforge/asr1/local/getdata.sh | http://www.repository.voxforge1.org/downloads/Russian/Trunk/Audio/Main/16kHz_16bit | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/voxforge/asr1/local/getdata.sh | http://www.repository.voxforge1.org/downloads/$lang/Trunk/Audio/Main/16kHz_16bit | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/wsj/asr1/local/wsj_data_prep.sh | https://catalog.ldc.upenn.edu/docs/LDC93S6A/wsj0-train-spkrinfo.txt | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/wsj/asr1/local/wsj_data_prep.sh | https://sourceforge.net/projects/kaldi/files/wsj0-train-spkrinfo.txt | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/wsj_mix/asr1/local/wsj_data_prep.sh | https://catalog.ldc.upenn.edu/docs/LDC93S6A/wsj0-train-spkrinfo.txt | 下载数据集 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | 
ESPnet_Dynamic_for_PyTorch/egs/wsj_mix/asr1/local/wsj_data_prep.sh | https://sourceforge.net/projects/kaldi/files/wsj0-train-spkrinfo.txt | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/wsj_mix/asr1/local/wsj0_create_mixture.sh | http://www.merl.com/demos/deep-clustering/create-speaker-mixtures.zip | 下载工具脚本 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/yesno/asr1/run.sh | http://www.openslr.org/resources/1/waves_yesno.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/yesno/tts1/run.sh | http://www.openslr.org/resources/1/waves_yesno.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/yoloxochitl_mixtec/asr1/run.sh | http://www.openslr.org/resources/89/Yoloxochitl-Mixtec-Data.tgz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/egs/yoloxochitl_mixtec/asr1/run.sh | http://www.openslr.org/resources/89/Yoloxochitl-Mixtec-Manifest.tgz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/setup.py | http://github.com/espnet/espnet | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/setup.py | shinjiw@ieee.org | 邮箱 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/tools/installers/install_chainer.sh | https://github.com/chainer/chainer | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/tools/installers/install_chainer_ctc.shh | https://github.com/jheymann85/chainer_ctc.git | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/tools/installers/install_fairseq.sh | https://github.com/pytorch/fairseq.git | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/tools/installers/install_k2.sh | https://k2-fsa.org/nightly/ | 下载依赖 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/tools/installers/install_kenlm.sh | https://github.com/kpu/kenlm.git | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/tools/installers/install_mwerSegmenter.sh | https://www-i6.informatik.rwth-aachen.de/web/Software/mwerSegmenter.tar.gz | 下载第三方库 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/tools/installers/install_nkf.sh | https://ja.osdn.net/dl/nkf/nkf-2.1.4.tar.gz | 下载第三方库 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/tools/installers/install_pesq.sh | http://www.itu.int/rec/dologin_pub.asp?lang=e&id=T-REC-P.862-200511-I!Amd2!SOFT-ZST-E&type=items | 下载第三方库 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/tools/installers/install_phonemizer.sh | https://github.com/festvox/speech_tools.git | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/tools/installers/install_phonemizer.sh | https://github.com/festvox/festival.git | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/tools/installers/install_phonemizer.sh | https://github.com/espeak-ng/espeak-ng.git | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/tools/installers/install_phonemizer.sh | 
https://github.com/numediart/MBROLA.git | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/tools/installers/install_py3mmseg.sh | https://github.com/kamo-naoyuki/py3mmseg | 下载工具脚本 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/tools/installers/install_s3prl.sh | https://github.com/s3prl/s3prl.git | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/tools/installers/install_sctk.sh | https://github.com/espnet/kaldi-bin/releases/download/v0.0.2/sctk-2.4.10-20151007-1312Z.tar.bz2 | 下载依赖 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/tools/installers/install_sctk.sh | http://www.openslr.org/resources/4/sctk-2.4.10-20151007-1312Z.tar.bz2 | 下载依赖 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/tools/installers/install_sctk.sh | ftp://jaguar.ncsl.nist.gov/pub/sctk-2.4.10-20151007-1312Z.tar.bz2 | 下载依赖 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/tools/installers/install_sph2pipe.sh | https://github.com/espnet/kaldi-bin/releases/download/v0.0.2/sph2pipe_v2.5.tar.gz | 下载依赖 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/tools/installers/install_sph2pipe.sh | http://www.openslr.org/resources/3/sph2pipe_v2.5.tar.gz | 下载依赖 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/tools/installers/install_sph2pipe.sh | https://sourceforge.net/projects/kaldi/files/sph2pipe_v2.5.tar.gz | 下载依赖 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/tools/installers/install_tdmelodic_pyopenjtalk.sh | https://github.com/sarulab-speech/tdmelodic_openjtalk.git | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/tools/installers/install_tdmelodic_pyopenjtalk.sh | https://github.com/r9y9/pyopenjtalk.git | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/tools/installers/install_torch.sh | https://download.pytorch.org/whl/torch_stable.html | 下载第三方库 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/tools/installers/install_warp-ctc.sh | https://github.com/espnet/warp-ctc/releases/tag/v${warpctc_version} | 下载依赖 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/tools/installers/install_warp-ctc.sh | https://github.com/espnet/warp-ctc.git | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/tools/installers/install_warp-transducer.sh | https://github.com/b-flo/warp-transducer.git | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/tools/Makefile | https://github.com/kaldi-asr/kaldi | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/tools/Makefile | https://github.com/moses-smt/mosesdecoder.git | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/tools/setup_anaconda.sh | https://repo.continuum.io/miniconda/Miniconda3-latest-Linux-x86_64.sh | 下载工具脚本 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/utils/asr_align_wav.sh | https://drive.google.com/open?id=1UqIY6WJMZ4sxNxSugUqp3mrGb3j6h7xe | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | 
ESPnet_Dynamic_for_PyTorch/utils/asr_align_wav.sh | https://drive.google.com/open?id=1cac5Uc09lJrCYfWkLQsF8eapQcxZnYdf | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/utils/asr_align_wav.sh | https://drive.google.com/open?id=1cVeSOYY1twOfL9Gns7Z3ZDnkrJqNwPow | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/utils/asr_align_wav.sh | https://drive.google.com/open?id=1zcPglHAKILwVgfACoMWWERiyIquzSYuU | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/utils/asr_align_wav.sh | https://drive.google.com/open?id=1BtQvAnsFvVi-dp_qsaFP7n4A_5cwnlR6 | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/utils/asr_align_wav.sh | https://drive.google.com/open?id=17cOOSHHMKI82e1MXj4r2ig8gpGCRmG2p | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/utils/asr_align_wav.sh | https://drive.google.com/open?id=1tWccl6aYU67kbtkm8jv5H6xayqg1rzjh | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/utils/asr_align_wav.sh | https://drive.google.com/open?id=120nUQcSsKeY5dpyMWw_kI33ooMRGT2uF | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/utils/asr_align_wav.sh | https://drive.google.com/open?id=1ALvD4nHan9VDJlYJwNurVr7H7OV0j2X9 | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/utils/asr_align_wav.sh | https://drive.google.com/open?id=1Az-4H25uwnEFa4lENc-EKiPaWXaijcJp | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/utils/asr_align_wav.sh | https://drive.google.com/open?id=1jdEKbgWhLTxN_qP4xwE7mTOPmp7Ga--T | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/utils/gdown.pl | https://docs.google.com/uc?id=$1&export=download | 下载依赖 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/utils/gdown.pl | https://docs.google.com | 前置网址 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/utils/recog_wav.sh | https://drive.google.com/open?id=1UqIY6WJMZ4sxNxSugUqp3mrGb3j6h7xe | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/utils/recog_wav.sh | https://drive.google.com/open?id=1cac5Uc09lJrCYfWkLQsF8eapQcxZnYdf | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/utils/recog_wav.sh | https://drive.google.com/open?id=1cVeSOYY1twOfL9Gns7Z3ZDnkrJqNwPow | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/utils/recog_wav.sh | https://drive.google.com/open?id=1zcPglHAKILwVgfACoMWWERiyIquzSYuU | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/utils/recog_wav.sh | https://drive.google.com/open?id=1BtQvAnsFvVi-dp_qsaFP7n4A_5cwnlR6 | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/utils/recog_wav.sh | https://drive.google.com/open?id=17cOOSHHMKI82e1MXj4r2ig8gpGCRmG2p | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/utils/recog_wav.sh | https://drive.google.com/open?id=1tWccl6aYU67kbtkm8jv5H6xayqg1rzjh | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | 
ESPnet_Dynamic_for_PyTorch/utils/recog_wav.sh | https://drive.google.com/open?id=120nUQcSsKeY5dpyMWw_kI33ooMRGT2uF | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1dKzdaDpOkpx7kWZnvrvx2De7eZEdPHZs | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=11T9qw8rJlYzUdXvFjkjQjYrp3iGfQ15h | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1hiZn14ITUDM1nkn-GkaN_M3oaTOUcn1n | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=13DR-RB5wrbMqBGx_MC655VZlsEq52DyS | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1xxAwPuUph23RnlC5gym7qDM02ZCW9Unp | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1M_w7nxI6AfbtSHpMO-exILnAc_aUYvXP | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=17RUNFLP4SSTbGA01xWRJo7RkR876xM0i | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1zD-2GMrWM3thaDpS3h3rkTU4jIC0wc5B | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1W86YEQ6KbuUTIvVURLqKtSNqe_eI2GDN | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1iAXwC0AuWusa9AcFeUVkcNLG0I-hnSr3 | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1Xj73mDPuuPH8GsyNO8GnOC3mn0_OK4g3 | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1mEnZfBKqA4eT6Bn0eRZuP6lNzL-IL3VD | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1kp5M4VvmagDmYckFJa78WGqh1drb_P9t | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1bTSygvonv5TS6-iuYsOIUWpN2atGnyhZ | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1T8thxkAxjGFPXPWPTcKLvHnd6lG0-82R | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1eA1VcRS9jzFa-DovyTgJLQ_jmwOLIi8L | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1sY7gEUg39QaO1szuN62-Llst9TrFno2t | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1tv9GKyRT4CDsvUWKwH3s_OfXkiTi0gw7 | 下载预训练模型 | -| 开源代码引入 | 
https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1jHUUmQFjWiQGyDd7ZeiCThSjjpbF_B4h | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=187xvyNbmJVZ0EZ1XHCdyjZHTXK9EcfkK | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1OwrUQzAmvjj1x9cDhnZPp6dqtsEqGEJM | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1PsjFRV5eUP0HHwBaRYya9smKy5ghXKzj | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=10M6H88jEUGbRWBmU1Ff2VaTmOAeL8CEy | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/utils/synth_wav.sh | http://kaldi-asr.org/models/8/0008_sitw_v2_1a.tar.gz | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/utils/synth_wav.sh | https://github.com/r9y9/wavenet_vocoder "${MDN_WAVENET_VOC_DIR}" | 下载依赖 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/utils/translate_wav.sh | https://drive.google.com/open?id=1wFIAqxoBUioTKTLRLv29KzvphkUm3qdo | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_Dynamic_for_PyTorch/utils/translate_wav.sh | https://drive.google.com/open?id=1hawp5ZLw4_SIHIT3edglxbKIIkPVe8n3 | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/utils/synth_wav.sh | ESPnet_Dynamic_for_PyTorch/utils/synth_wav.sh | https://github.com/espnet/espnet#tts-demo | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/README.md | ESPnet_Dynamic_for_PyTorch/utils/synth_wav.sh | https://colab.research.google.com/github/espnet/notebook/blob/master/tts_realtime_demo.ipynb | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/README.md | ESPnet_Dynamic_for_PyTorch/utils/synth_wav.sh | https://github.com/r9y9/wavenet_vocoder | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/utils/spm_encode | ESPnet_Dynamic_for_PyTorch/utils/spm_train | https://github.com/pytorch/fairseq/blob/master/LICENSE | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/utils/spm_encode | ESPnet_Dynamic_for_PyTorch/utils/spm_encode | https://github.com/pytorch/fairseq/blob/master/LICENSE | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/utils/spm_encode | ESPnet_Dynamic_for_PyTorch/utils/spm_decode | https://github.com/pytorch/fairseq/blob/master/LICENSE | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/utils/pack_model.sh | ESPnet_Dynamic_for_PyTorch/utils/pack_model.sh | shinjiw@ieee.org | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/utils/json2trn.py | ESPnet_Dynamic_for_PyTorch/utils/json2trn_wo_dict.py | https://github.com/espnet/espnet/issues/993 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/utils/json2trn.py | ESPnet_Dynamic_for_PyTorch/utils/json2trn.py | https://github.com/espnet/espnet/issues/993 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/wavenet.py | ESPnet_Dynamic_for_PyTorch/utils/generate_wav_from_fbank.py | https://github.com/kan-bayashi/PytorchWaveNetVocoder | 源码实现 | -| 开源代码引入 | 
https://github.com/espnet/espnet/blob/master/utils/generate_wav_from_fbank.py | ESPnet_Dynamic_for_PyTorch/utils/generate_wav_from_fbank.py | https://ieeexplore.ieee.org/abstract/document/8461332 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/utils/gdown.pl | ESPnet_Dynamic_for_PyTorch/utils/gdown.pl | https://docs.google.com/uc?id= | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/utils/eval-source-separation.py | ESPnet_Dynamic_for_PyTorch/utils/eval-source-separation.py | https://ieeexplore.ieee.org/document/5495701 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/utils/eval-source-separation.py | ESPnet_Dynamic_for_PyTorch/utils/eval-source-separation.py | http://www.itu.int/rec/dologin_pub.asp?lang=e&id=T-REC-P.862-200511-I!Amd2!SOFT-ZST-E&type=items | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/utils/eval-source-separation.py | ESPnet_Dynamic_for_PyTorch/utils/eval-source-separation.py | https://ieeexplore.ieee.org/document/941023 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/utils/eval-source-separation.py | ESPnet_Dynamic_for_PyTorch/utils/eval-source-separation.py | https://arxiv.org/abs/1804.06267 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/utils/download_from_google_drive.sh | ESPnet_Dynamic_for_PyTorch/utils/download_from_google_drive.sh | https://drive.google.com/open?id=1zF88bRNbJhw9hNBq3NrDg8vnGGibREmg | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/utils/download_from_google_drive.sh | ESPnet_Dynamic_for_PyTorch/utils/download_from_google_drive.sh | https://qiita.com/namakemono/items/c963e75e0af3f7eed732 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/utils/download_from_google_drive.sh | ESPnet_Dynamic_for_PyTorch/utils/download_from_google_drive.sh | https://github.com/wkentaro/gdown | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/tools/installers/install_warp-ctc.sh | ESPnet_Dynamic_for_PyTorch/tools/installers/install_warp-ctc.sh | https://github.com/espnet/warp-ctc/releases/tag/v | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/tools/installers/install_torch.sh | ESPnet_Dynamic_for_PyTorch/tools/installers/install_torch.sh | https://anaconda.org/anaconda/cudatoolkit/files | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/tools/installers/install_torch.sh | ESPnet_Dynamic_for_PyTorch/tools/installers/install_torch.sh | https://anaconda.org/nvidia/cudatoolkit/files | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/tools/installers/install_torch.sh | ESPnet_Dynamic_for_PyTorch/tools/installers/install_torch.sh | https://anaconda.org/conda-forge/cudatoolkit/files | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/tools/installers/install_torch.sh | ESPnet_Dynamic_for_PyTorch/tools/installers/install_torch.sh | https://anaconda.org/pytorch/pytorch/files | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/tools/installers/install_k2.sh | ESPnet_Dynamic_for_PyTorch/tools/installers/install_k2.sh | https://k2-fsa.org/nightl | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/tools/installers/install_chainer_ctc.sh | ESPnet_Dynamic_for_PyTorch/tools/installers/install_chainer_ctc.sh | https://github.com/jheymann85/chainer_ctc.git | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/tools/installers/install_chainer.sh | ESPnet_Dynamic_for_PyTorch/tools/installers/install_chainer.sh | 
https://github.com/pypa/setuptools/issues/855 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet_Dynamic_for_PyTorch/espnet/vc/pytorch_backend/vc.py | https://arxiv.org/abs/1905.09263 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/utils/spec_augment.py | ESPnet_Dynamic_for_PyTorch/espnet/utils/spec_augment.py | https://github.com/zcaceres/spec_augment | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/utils/spec_augment.py | ESPnet_Dynamic_for_PyTorch/espnet/utils/spec_augment.py | https://arxiv.org/pdf/1904.08779.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/utils/spec_augment.py | ESPnet_Dynamic_for_PyTorch/espnet/utils/spec_augment.py | https://en.wikipedia.org/wiki/Polyharmonic_spline | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/utils/deterministic_utils.py | ESPnet_Dynamic_for_PyTorch/espnet/utils/deterministic_utils.py | https://github.com/pytorch/pytorch/issues/6351 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet_Dynamic_for_PyTorch/espnet/tts/pytorch_backend/tts.py | https://arxiv.org/abs/1905.09263 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/utils/spec_augment.py | ESPnet_Dynamic_for_PyTorch/espnet/transform/spec_augment.py | https://arxiv.org/pdf/1904.08779.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/transform/perturb.py | ESPnet_Dynamic_for_PyTorch/espnet/transform/perturb.py | https://groups.google.com/forum/#!topic/kaldi-help/8OOG7eE4sZ8 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/transform/perturb.py | ESPnet_Dynamic_for_PyTorch/espnet/transform/perturb.py | http://spandh.dcs.shef.ac.uk/chime_workshop/papers/CHiME_2018_paper_kanda.pdf | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/mt/pytorch_backend/mt.py | ESPnet_Dynamic_for_PyTorch/espnet/st/pytorch_backend/st.py | https://github.com/NVIDIA/apex#linux | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/scheduler/scheduler.py | ESPnet_Dynamic_for_PyTorch/espnet/scheduler/scheduler.py | https://openreview.net/pdf?id=BJYwwY9ll | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/scheduler/scheduler.py | ESPnet_Dynamic_for_PyTorch/espnet/scheduler/scheduler.py | https://arxiv.org/pdf/1608.03983.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/scheduler/scheduler.py | ESPnet_Dynamic_for_PyTorch/espnet/scheduler/scheduler.py | https://github.com/NVIDIA/Megatron-LM | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/batch_beam_search_online.py | ESPnet_Dynamic_for_PyTorch/espnet/nets/scorers/ctc.py | https://arxiv.org/abs/2006.14941 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/wavenet.py | ESPnet_Dynamic_for_PyTorch/espnet/nets/pytorch_backend/wavenet.py | https://github.com/kan-bayashi/PytorchWaveNetVocoder | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/wavenet.py | ESPnet_Dynamic_for_PyTorch/espnet/nets/pytorch_backend/wavenet.py | https://arxiv.org/abs/1611.09482 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/fastspeech/length_regulator.py | ESPnet_Dynamic_for_PyTorch/espnet/nets/pytorch_backend/transformer/multi_layer_conv.py | 
https://arxiv.org/pdf/1905.09263.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/transformer/dynamic_conv2d.py | ESPnet_Dynamic_for_PyTorch/espnet/nets/pytorch_backend/transformer/lightconv2d.py | https://github.com/pytorch/fairseq/tree/master/fairseq | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/transformer/dynamic_conv2d.py | ESPnet_Dynamic_for_PyTorch/espnet/nets/pytorch_backend/transformer/lightconv.py | https://github.com/pytorch/fairseq/tree/master/fairseq | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/transformer/encoder.py | ESPnet_Dynamic_for_PyTorch/espnet/nets/pytorch_backend/transformer/encoder.py | https://github.com/espnet/espnet/commit/21d70286c354c66c0350e65dc098d2ee236faccc#diff-bffb1396f038b317b2b64dd96e6d3563 | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/transformer/decoder.py | ESPnet_Dynamic_for_PyTorch/espnet/nets/pytorch_backend/transformer/encoder.py | https://github.com/espnet/espnet/commit/3d422f6de8d4f03673b89e1caef698745ec749ea#diff-bffb1396f038b317b2b64dd96e6d3563 | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet_Dynamic_for_PyTorch/espnet/nets/pytorch_backend/transformer/embedding.py | https://arxiv.org/abs/1809.08895 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/transformer/attention.py | ESPnet_Dynamic_for_PyTorch/espnet/nets/pytorch_backend/transformer/embedding.py | https://github.com/espnet/espnet/pull/2816 | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/transformer/attention.py | ESPnet_Dynamic_for_PyTorch/espnet/nets/pytorch_backend/transformer/embedding.py | https://arxiv.org/abs/1901.02860 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/transformer/dynamic_conv2d.py | ESPnet_Dynamic_for_PyTorch/espnet/nets/pytorch_backend/transformer/dynamic_conv2d.py | https://github.com/pytorch/fairseq/tree/master/fairseq | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/transformer/dynamic_conv2d.py | ESPnet_Dynamic_for_PyTorch/espnet/nets/pytorch_backend/transformer/dynamic_conv.py | https://github.com/pytorch/fairseq/tree/master/fairseq | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/transformer/decoder.py | ESPnet_Dynamic_for_PyTorch/espnet/nets/pytorch_backend/transformer/decoder.py | https://github.com/espnet/espnet/commit/3d422f6de8d4f03673b89e1caef698745ec749ea#diff-bffb1396f038b317b2b64dd96e6d3563 | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/transformer/attention.py | ESPnet_Dynamic_for_PyTorch/espnet/nets/pytorch_backend/transformer/attention.py | https://github.com/espnet/espnet/pull/2816 | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/transformer/attention.py | ESPnet_Dynamic_for_PyTorch/espnet/nets/pytorch_backend/transformer/attention.py | https://arxiv.org/abs/1901.02860 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/transformer/argument.py | ESPnet_Dynamic_for_PyTorch/espnet/nets/pytorch_backend/transformer/argument.py | https://arxiv.org/abs/1912.11793v2 | 参考论文地址 | -| 开源代码引入 | 
https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/transformer/argument.py | ESPnet_Dynamic_for_PyTorch/espnet/nets/pytorch_backend/transformer/argument.py | https://arxiv.org/abs/1901.10430 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/transducer/arguments.py | ESPnet_Dynamic_for_PyTorch/espnet/nets/pytorch_backend/transducer/arguments.py | https://arxiv.org/abs/2010.11148 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet_Dynamic_for_PyTorch/espnet/nets/pytorch_backend/tacotron2/encoder.py | https://arxiv.org/abs/1712.05884 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/tacotron2/decoder.py | ESPnet_Dynamic_for_PyTorch/espnet/nets/pytorch_backend/tacotron2/decoder.py | https://arxiv.org/abs/1606.01305 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/tacotron2/decoder.py | ESPnet_Dynamic_for_PyTorch/espnet/nets/pytorch_backend/tacotron2/decoder.py | https://github.com/eladhoffer/seq2seq.pytorch | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet_Dynamic_for_PyTorch/espnet/nets/pytorch_backend/tacotron2/decoder.py | https://arxiv.org/abs/1712.05884 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/rnn/attentions.py | ESPnet_Dynamic_for_PyTorch/espnet/nets/pytorch_backend/tacotron2/decoder.py | https://arxiv.org/abs/1710.07654 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/tacotron2/cbhg.py | ESPnet_Dynamic_for_PyTorch/espnet/nets/pytorch_backend/tacotron2/cbhg.py | https://arxiv.org/abs/1703.10135 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/tacotron2/cbhg.py | ESPnet_Dynamic_for_PyTorch/espnet/nets/pytorch_backend/tacotron2/cbhg.py | https://github.com/pytorch/pytorch/pull/6327 | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/tacotron2/cbhg.py | ESPnet_Dynamic_for_PyTorch/espnet/nets/pytorch_backend/tacotron2/cbhg.py | https://arxiv.org/abs/1505.00387 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/rnn/decoders.py | ESPnet_Dynamic_for_PyTorch/espnet/nets/pytorch_backend/rnn/decoders.py | https://arxiv.org/pdf/1409.2329.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/rnn/attentions.py | ESPnet_Dynamic_for_PyTorch/espnet/nets/pytorch_backend/rnn/attentions.py | https://arxiv.org/abs/1710.07654 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/rnn/attentions.py | ESPnet_Dynamic_for_PyTorch/espnet/nets/pytorch_backend/rnn/attentions.py | https://arxiv.org/pdf/1506.07503.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/rnn/attentions.py | ESPnet_Dynamic_for_PyTorch/espnet/nets/pytorch_backend/rnn/attentions.py | https://arxiv.org/abs/1704.04368 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/rnn/attentions.py | ESPnet_Dynamic_for_PyTorch/espnet/nets/pytorch_backend/rnn/attentions.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/rnn/attentions.py | 
ESPnet_Dynamic_for_PyTorch/espnet/nets/pytorch_backend/rnn/attentions.py | https://arxiv.org/pdf/1807.06736.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/lm/seq_rnn.py | ESPnet_Dynamic_for_PyTorch/espnet/nets/pytorch_backend/lm/seq_rnn.py | https://github.com/pytorch/examples/blob/4581968193699de14b56527296262dd76ab43557/word_language_model/model.py | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/lm/seq_rnn.py | ESPnet_Dynamic_for_PyTorch/espnet/nets/pytorch_backend/lm/seq_rnn.py | https://arxiv.org/abs/1608.05859 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/lm/seq_rnn.py | ESPnet_Dynamic_for_PyTorch/espnet/nets/pytorch_backend/lm/seq_rnn.py | https://arxiv.org/abs/1611.01462 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/lm/default.py | ESPnet_Dynamic_for_PyTorch/espnet/nets/pytorch_backend/lm/default.py | https://github.com/espnet/espnet/issues/1075 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/gtn_ctc.py | ESPnet_Dynamic_for_PyTorch/espnet/nets/pytorch_backend/gtn_ctc.py | https://github.com/facebookresearch/gtn_applications/blob/master/utils.py#L251 | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/frontends/dnn_beamformer.py | ESPnet_Dynamic_for_PyTorch/espnet/nets/pytorch_backend/frontends/dnn_beamformer.py | https://arxiv.org/abs/1703.04783 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/frontends/beamformer.py | ESPnet_Dynamic_for_PyTorch/espnet/nets/pytorch_backend/frontends/beamformer.py | https://ieeexplore.ieee.org/document/5089420 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/fastspeech/length_regulator.py | ESPnet_Dynamic_for_PyTorch/espnet/nets/pytorch_backend/fastspeech/length_regulator.py | https://arxiv.org/pdf/1905.09263.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/fastspeech/length_regulator.py | ESPnet_Dynamic_for_PyTorch/espnet/nets/pytorch_backend/fastspeech/duration_predictor.py | https://arxiv.org/pdf/1905.09263.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/e2e_vc_transformer.py | ESPnet_Dynamic_for_PyTorch/espnet/nets/pytorch_backend/e2e_vc_transformer.py | https://arxiv.org/pdf/1912.06813.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/e2e_tts_transformer.py | ESPnet_Dynamic_for_PyTorch/espnet/nets/pytorch_backend/e2e_tts_transformer.py | https://arxiv.org/pdf/1809.08895.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/e2e_tts_tacotron2.py | ESPnet_Dynamic_for_PyTorch/espnet/nets/pytorch_backend/e2e_tts_tacotron2.py | https://arxiv.org/abs/1710.08969 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet_Dynamic_for_PyTorch/espnet/nets/pytorch_backend/e2e_tts_tacotron2.py | https://arxiv.org/abs/1712.05884 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/fastspeech/length_regulator.py | ESPnet_Dynamic_for_PyTorch/espnet/nets/pytorch_backend/e2e_tts_fastspeech.py | https://arxiv.org/pdf/1905.09263.pdf | 参考论文地址 | -| 开源代码引入 | 
https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet_Dynamic_for_PyTorch/espnet/nets/pytorch_backend/e2e_st_conformer.py | https://arxiv.org/abs/2005.08100 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/e2e_st.py | ESPnet_Dynamic_for_PyTorch/espnet/nets/pytorch_backend/e2e_st.py | https://discuss.pytorch.org/t/set-forget-gate-bias-of-lstm/1745 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/e2e_asr_mulenc.py | ESPnet_Dynamic_for_PyTorch/espnet/nets/pytorch_backend/e2e_asr_mulenc.py | https://arxiv.org/pdf/1811.04903.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/e2e_st.py | ESPnet_Dynamic_for_PyTorch/espnet/nets/pytorch_backend/e2e_asr_mulenc.py | https://discuss.pytorch.org/t/set-forget-gate-bias-of-lstm/1745 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/e2e_asr_mix_transformer.py | ESPnet_Dynamic_for_PyTorch/espnet/nets/pytorch_backend/e2e_asr_mix_transformer.py | https://arxiv.org/pdf/2002.03921.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/e2e_st.py | ESPnet_Dynamic_for_PyTorch/espnet/nets/pytorch_backend/e2e_asr_mix.py | https://discuss.pytorch.org/t/set-forget-gate-bias-of-lstm/1745 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/bin/asr_recog.py | ESPnet_Dynamic_for_PyTorch/espnet/nets/pytorch_backend/e2e_asr_maskctc.py | https://arxiv.org/abs/2005.08700 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet_Dynamic_for_PyTorch/espnet/nets/pytorch_backend/e2e_asr_conformer.py | https://arxiv.org/abs/2005.08100 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/e2e_st.py | ESPnet_Dynamic_for_PyTorch/espnet/nets/pytorch_backend/e2e_asr.py | https://discuss.pytorch.org/t/set-forget-gate-bias-of-lstm/1745 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/ctc.py | ESPnet_Dynamic_for_PyTorch/espnet/nets/pytorch_backend/ctc.py | https://github.com/pytorch/pytorch/issues/17798 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/transformer/attention.py | ESPnet_Dynamic_for_PyTorch/espnet/nets/pytorch_backend/conformer/argument.py | https://github.com/espnet/espnet/pull/2816 | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/chainer_backend/rnn/training.py | ESPnet_Dynamic_for_PyTorch/espnet/nets/chainer_backend/transformer/training.py | https://github.com/chainer/chainer/blob/master/chainer/optimizer.py | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/chainer_backend/rnn/training.py | ESPnet_Dynamic_for_PyTorch/espnet/nets/chainer_backend/rnn/training.py | https://github.com/chainer/chainer/blob/master/chainer/optimizer.py | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/beam_search_transducer.py | ESPnet_Dynamic_for_PyTorch/espnet/nets/beam_search_transducer.py | https://arxiv.org/pdf/1211.3711.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/beam_search_transducer.py | ESPnet_Dynamic_for_PyTorch/espnet/nets/beam_search_transducer.py | https://ieeexplore.ieee.org/document/9053040 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/beam_search_transducer.py 
| ESPnet_Dynamic_for_PyTorch/espnet/nets/beam_search_transducer.py | https://arxiv.org/pdf/2002.03577.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/beam_search_transducer.py | ESPnet_Dynamic_for_PyTorch/espnet/nets/beam_search_transducer.py | https://ieeexplore.ieee.org/document/9250505 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/batch_beam_search_online.py | ESPnet_Dynamic_for_PyTorch/espnet/nets/batch_beam_search_online_sim.py | https://arxiv.org/abs/2006.14941 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/batch_beam_search_online.py | ESPnet_Dynamic_for_PyTorch/espnet/nets/batch_beam_search_online.py | https://arxiv.org/abs/2006.14941 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/batch_beam_search.py | ESPnet_Dynamic_for_PyTorch/espnet/nets/batch_beam_search.py | https://github.com/espnet/espnet/pull/1402#discussion_r354561029 | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/mt/pytorch_backend/mt.py | ESPnet_Dynamic_for_PyTorch/espnet/mt/pytorch_backend/mt.py | https://github.com/NVIDIA/apex#linux | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/lm/pytorch_backend/lm.py | ESPnet_Dynamic_for_PyTorch/espnet/lm/pytorch_backend/lm.py | https://github.com/chainer/chainer/blob/master/examples/ptb/train_ptb_custom_loop.py | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/mt/pytorch_backend/mt.py | ESPnet_Dynamic_for_PyTorch/espnet/lm/pytorch_backend/lm.py | https://github.com/NVIDIA/apex#linux | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/lm/pytorch_backend/lm.py | ESPnet_Dynamic_for_PyTorch/espnet/lm/lm_utils.py | https://github.com/chainer/chainer/blob/master/examples/ptb/train_ptb_custom_loop.py | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/lm/lm_utils.py | ESPnet_Dynamic_for_PyTorch/espnet/lm/lm_utils.py | http://docs.h5py.org/en/stable/special.html#arbitrary-vlen-data | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/lm/pytorch_backend/lm.py | ESPnet_Dynamic_for_PyTorch/espnet/lm/chainer_backend/lm.py | https://github.com/chainer/chainer/blob/master/examples/ptb/train_ptb_custom_loop.py | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/bin/st_train.py | ESPnet_Dynamic_for_PyTorch/espnet/bin/st_train.py | https://nvidia.github.io/apex/amp.html#opt-levels | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/bin/st_train.py | ESPnet_Dynamic_for_PyTorch/espnet/bin/st_train.py | https://github.com/pytorch/pytorch/issues/21108 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/bin/st_train.py | ESPnet_Dynamic_for_PyTorch/espnet/bin/mt_train.py | https://nvidia.github.io/apex/amp.html#opt-levels | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/bin/st_train.py | ESPnet_Dynamic_for_PyTorch/espnet/bin/mt_train.py | https://github.com/pytorch/pytorch/issues/21108 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/lm/pytorch_backend/lm.py | ESPnet_Dynamic_for_PyTorch/espnet/bin/lm_train.py | https://github.com/chainer/chainer/blob/master/examples/ptb/train_ptb_custom_loop.py | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/bin/st_train.py | ESPnet_Dynamic_for_PyTorch/espnet/bin/lm_train.py | https://nvidia.github.io/apex/amp.html#opt-levels | 模型相关说明 | -| 开源代码引入 | 
https://github.com/espnet/espnet/blob/master/espnet/bin/st_train.py | ESPnet_Dynamic_for_PyTorch/espnet/bin/asr_train.py | https://nvidia.github.io/apex/amp.html#opt-levels | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/bin/st_train.py | ESPnet_Dynamic_for_PyTorch/espnet/bin/asr_train.py | https://github.com/pytorch/pytorch/issues/21108 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/bin/asr_recog.py | ESPnet_Dynamic_for_PyTorch/espnet/bin/asr_recog.py | https://arxiv.org/abs/2005.08700 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/asr/pytorch_backend/recog.py | ESPnet_Dynamic_for_PyTorch/espnet/asr/pytorch_backend/recog.py | https://github.com/espnet/espnet/pull/3616 | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/mt/pytorch_backend/mt.py | ESPnet_Dynamic_for_PyTorch/espnet/asr/pytorch_backend/asr_mix.py | https://github.com/NVIDIA/apex#linux | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/asr/pytorch_backend/asr.py | ESPnet_Dynamic_for_PyTorch/espnet/asr/pytorch_backend/asr.py | https://github.com/espnet/espnet/pull/1388 | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/asr/pytorch_backend/asr.py | ESPnet_Dynamic_for_PyTorch/espnet/asr/pytorch_backend/asr.py | https://github.com/espnet/espnet/issues/777 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/mt/pytorch_backend/mt.py | ESPnet_Dynamic_for_PyTorch/espnet/asr/pytorch_backend/asr.py | https://github.com/NVIDIA/apex#linux | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/asr/pytorch_backend/asr.py | ESPnet_Dynamic_for_PyTorch/espnet/asr/pytorch_backend/asr.py | https://github.com/espnet/espnet/pull/2171 | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/asr/pytorch_backend/asr.py | ESPnet_Dynamic_for_PyTorch/espnet/asr/pytorch_backend/asr.py | https://github.com/pytorch/pytorch/issues/27963 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/thchs30/tts1/local/download_and_untar.sh | ESPnet_Dynamic_for_PyTorch/egs/yoloxochitl_mixtec/asr1/local/download_and_untar.sh | https://common-voice-data-download.s3.amazonaws.com/cv_corpus_v1.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/yoloxochitl_mixtec/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/yesno/asr1/run.sh | ESPnet_Dynamic_for_PyTorch/egs/yesno/tts1/run.sh | http://sourceforge.net/projects/kaldi/files/waves_yesno.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/yesno/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/yesno/asr1/run.sh | ESPnet_Dynamic_for_PyTorch/egs/yesno/asr1/run.sh | http://sourceforge.net/projects/kaldi/files/waves_yesno.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/yesno/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/wsj_mix/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | 
ESPnet_Dynamic_for_PyTorch/egs/wsj/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/open_li52/asr1/local/getdata.sh | ESPnet_Dynamic_for_PyTorch/egs/voxforge/asr1/local/getdata.sh | http://www.repository.voxforge1.org/downloads/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/voxforge/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/vivos/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/vcc20/voc1/local/make_subset_data.sh | ESPnet_Dynamic_for_PyTorch/egs/vcc20/voc1/local/make_subset_data.sh | https://opensource.org/licenses/MIT | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/vcc20/voc1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet_Dynamic_for_PyTorch/egs/vcc20/vc1_task2/local/clean_text_mandarin.py | https://github.com/Kyubyong/g2p | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet_Dynamic_for_PyTorch/egs/vcc20/vc1_task2/local/clean_text_german.py | https://github.com/Kyubyong/g2p | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet_Dynamic_for_PyTorch/egs/vcc20/vc1_task2/local/clean_text_finnish.py | https://github.com/Kyubyong/g2p | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/vcc20/vc1_task2/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/vcc20/README.md | ESPnet_Dynamic_for_PyTorch/egs/vcc20/vc1_task1/local/recognize.sh | https://github.com/espnet/espnet/blob/master/egs/librispeech/asr1/RESULTS.md#pytorch-large-transformer-with-specaug-4-gpus--large-lstm-lm | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet_Dynamic_for_PyTorch/egs/vcc20/vc1_task1/local/clean_text_asr_result.py | https://github.com/Kyubyong/g2p | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/vcc20/vc1_task1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/vcc20/tts1_en_de/local/download.sh | ESPnet_Dynamic_for_PyTorch/egs/vcc20/tts1_en_zh/local/download_mailabs.sh | http://data.solak.de/data/Training/stt_tts/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet_Dynamic_for_PyTorch/egs/vcc20/tts1_en_zh/local/clean_text_mailabs.py | https://github.com/Kyubyong/g2p | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/vcc20/tts1_en_zh/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/vcc20/tts1_en_de/local/download.sh | ESPnet_Dynamic_for_PyTorch/egs/vcc20/tts1_en_fi/local/download.sh | http://data.solak.de/data/Training/stt_tts/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | 
ESPnet_Dynamic_for_PyTorch/egs/vcc20/tts1_en_fi/local/clean_text_css10.py | https://github.com/Kyubyong/g2p | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/vcc20/tts1_en_fi/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/vcc20/tts1_en_de/local/download.sh | ESPnet_Dynamic_for_PyTorch/egs/vcc20/tts1_en_de/local/download.sh | http://data.solak.de/data/Training/stt_tts/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/vcc20/tts1_en_de/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/vais1000/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/tweb/README.md | ESPnet_Dynamic_for_PyTorch/egs/tweb/tts1/run.sh | https://www.kaggle.com/bryanpark/the-world-english-bible-speech-dataset | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/tweb/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/timit_ssc/ssr1/run.sh | ESPnet_Dynamic_for_PyTorch/egs/timit_ssc/ssr1/run.sh | ftp://ftp.espci.fr/pub/sigma/Features/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/timit_ssc/ssr1/run.sh | ESPnet_Dynamic_for_PyTorch/egs/timit_ssc/ssr1/local/ssc_data_prepare.sh | ftp://ftp.espci.fr/pub/sigma/Features/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/timit_ssc/ssr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/timit/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/tedlium3/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/tedlium2/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/tedlium2/align1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/swbd/asr1/local/swbd1_fix_speakerid.pl | ESPnet_Dynamic_for_PyTorch/egs/swbd/asr1/local/swbd1_fix_speakerid.pl | pengqi@cs.stanford.edu | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/swbd/asr1/local/swbd1_data_prep.sh | ESPnet_Dynamic_for_PyTorch/egs/swbd/asr1/local/swbd1_data_prep.sh | http://www.ldc.upenn.edu/Catalog/desc/addenda/swb-multi-annot.summary | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/swbd/asr1/local/eval2000_data_prep.sh | ESPnet_Dynamic_for_PyTorch/egs/swbd/asr1/local/eval2000_data_prep.sh | http://www.ldc.upenn.edu/Catalog/catalogEntry.jsp?catalogId=LDC2002S09 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/swbd/asr1/local/eval2000_data_prep.sh | ESPnet_Dynamic_for_PyTorch/egs/swbd/asr1/local/eval2000_data_prep.sh | 
http://www.ldc.upenn.edu/Catalog/CatalogEntry.jsp?catalogId=LDC2002T43 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/swbd/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/ru_open_stt/asr1/local/ru_open_stt_download_data.sh | ESPnet_Dynamic_for_PyTorch/egs/ru_open_stt/asr1/local/ru_open_stt_download_data.sh | https://raw.githubusercontent.com/snakers4/open_stt/4bff5470a29dcca5c7175fa3b6fd106c6151b756/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/ru_open_stt/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/reverb/asr1_multich/local/evaltools/score_STOI_scp.m | ESPnet_Dynamic_for_PyTorch/egs/reverb/asr1_multich/local/evaltools/score_STOI_scp.m | REVERB-challenge@lab.ntt.co.jp | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/reverb/asr1_multich/local/evaltools/score_STOI_scp.m | ESPnet_Dynamic_for_PyTorch/egs/reverb/asr1_multich/local/evaltools/score_SimData_scp.m | REVERB-challenge@lab.ntt.co.jp | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/reverb/asr1_multich/local/evaltools/score_STOI_scp.m | ESPnet_Dynamic_for_PyTorch/egs/reverb/asr1_multich/local/evaltools/score_RealData_scp.m | REVERB-challenge@lab.ntt.co.jp | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/reverb/asr1_multich/local/evaltools/score_STOI_scp.m | ESPnet_Dynamic_for_PyTorch/egs/reverb/asr1_multich/local/evaltools/prog/score_sim_scp.m | REVERB-challenge@lab.ntt.co.jp | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/reverb/asr1_multich/local/evaltools/score_STOI_scp.m | ESPnet_Dynamic_for_PyTorch/egs/reverb/asr1_multich/local/evaltools/prog/score_real_scp.m | REVERB-challenge@lab.ntt.co.jp | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/reverb/asr1_multich/local/download_se_eval_tool.sh | ESPnet_Dynamic_for_PyTorch/egs/reverb/asr1_multich/local/download_se_eval_tool.sh | https://reverb2014.dereverberation.org/tools/taskFiles_et.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/reverb/asr1_multich/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/reverb/asr1/RESULTS | ESPnet_Dynamic_for_PyTorch/egs/reverb/asr1/RESULTS | https://reverb2014.dereverberation.com/result_asr.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/reverb/asr1/local/prepare_real_data.sh | ESPnet_Dynamic_for_PyTorch/egs/reverb/asr1/local/prepare_simu_data.sh | https://github.com/kaldi-asr/kaldi/tree/master/egs/reverb/s5/local | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/reverb/asr1/local/prepare_real_data.sh | ESPnet_Dynamic_for_PyTorch/egs/reverb/asr1/local/prepare_real_data.sh | https://github.com/kaldi-asr/kaldi/tree/master/egs/reverb/s5/local | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/reverb/asr1/local/Generate_mcTrainData_cut.m | ESPnet_Dynamic_for_PyTorch/egs/reverb/asr1/local/Generate_mcTrainData_cut.m | http://stevem.us/fconv.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/reverb/asr1/local/prepare_real_data.sh | 
ESPnet_Dynamic_for_PyTorch/egs/reverb/asr1/local/generate_data.sh | https://github.com/kaldi-asr/kaldi/tree/master/egs/reverb/s5/local | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/reverb/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/puebla_nahuatl/st1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/thchs30/tts1/local/download_and_untar.sh | ESPnet_Dynamic_for_PyTorch/egs/puebla_nahuatl/asr1/local/download_and_untar.sh | https://common-voice-data-download.s3.amazonaws.com/cv_corpus_v1.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/puebla_nahuatl/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/README.md | ESPnet_Dynamic_for_PyTorch/egs/polyphone_swiss_french/asr1/local/data_prep.py | http://catalog.elra.info/en-us/repository/browse/ELRA-S0030_02 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/polyphone_swiss_french/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/must_c_v2/st1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/must_c_v2/mt1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/must_c_v2/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/must_c/st1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/must_c/mt1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/must_c/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/mucs21_subtask2/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/mucs21_subtask1/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/mtedx/st1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/mtedx/mt1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/mini_an4/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/mgb2/asr1/run.sh | 
ESPnet_Dynamic_for_PyTorch/egs/mgb2/asr1/run.sh | https://arabicspeech.org/mgb2 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/mgb2/asr1/local/xml2stm.py | ESPnet_Dynamic_for_PyTorch/egs/mgb2/asr1/local/xml2stm.py | yzhang@qf.org.qa | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/mgb2/asr1/run.sh | ESPnet_Dynamic_for_PyTorch/egs/mgb2/asr1/local/mgb_extract_data.sh | https://arabicspeech.org/mgb2 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/mgb2/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/mboshi_french/st1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/m_ailabs/tts1/local/download.sh | ESPnet_Dynamic_for_PyTorch/egs/m_ailabs/tts1/local/download.sh | http://www.caito.de/data/Training/stt_tts/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/m_ailabs/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet_Dynamic_for_PyTorch/egs/ljspeech/tts2/run.sh | https://github.com/Kyubyong/g2p | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/ljspeech/tts2/RESULTS | ESPnet_Dynamic_for_PyTorch/egs/ljspeech/tts2/RESULTS | https://drive.google.com/drive/folders/1AMAKY8uQY59-DL5KNjrPSoosu4sftJPn?usp=sharing | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/ljspeech/tts2/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet_Dynamic_for_PyTorch/egs/ljspeech/tts1/run.sh | https://github.com/Kyubyong/g2p | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/ljspeech/tts1/local/data_download.sh | ESPnet_Dynamic_for_PyTorch/egs/ljspeech/tts1/local/data_download.sh | http://data.keithito.com/data/speech/LJSpeech-1.1.tar.bz2 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet_Dynamic_for_PyTorch/egs/ljspeech/tts1/local/clean_text.py | https://github.com/Kyubyong/g2p | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/ljspeech/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet_Dynamic_for_PyTorch/egs/ljspeech/asr1/run.sh | https://github.com/Kyubyong/g2p | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/ljspeech/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/libritts/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/librispeech/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_trans/st1/run.sh | ESPnet_Dynamic_for_PyTorch/egs/libri_trans/st1/run.sh | 
https://persyval-platform.univ-grenoble-alpes.fr/DS91/detaildataset | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_trans/st1/local/data_prep.sh | ESPnet_Dynamic_for_PyTorch/egs/libri_trans/st1/local/data_prep.sh | https://github.com/eske/seq2seq/blob/master/config/LibriSpeech/prepare-raw.sh | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/libri_trans/st1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_trans/st1/run.sh | ESPnet_Dynamic_for_PyTorch/egs/libri_trans/mt1/run.sh | https://persyval-platform.univ-grenoble-alpes.fr/DS91/detaildataset | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/libri_trans/mt1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_trans/st1/run.sh | ESPnet_Dynamic_for_PyTorch/egs/libri_trans/asr1/run.sh | https://persyval-platform.univ-grenoble-alpes.fr/DS91/detaildataset | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/libri_trans/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_css/asr1/run.sh | ESPnet_Dynamic_for_PyTorch/egs/libri_css/asr1/run.sh | https://github.com/espnet/espnet/blob/master/egs/librispeech/asr1/RESULTS.md#pytorch-large-transformer-with-specaug-4-gpus--transformer-lm-4-gpus | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_css/asr1/local/wer_output_filter | ESPnet_Dynamic_for_PyTorch/egs/libri_css/asr1/local/wer_output_filter | jtrmal@gmail.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_css/asr1/local/segmentation/apply_webrtcvad.py | ESPnet_Dynamic_for_PyTorch/egs/libri_css/asr1/local/segmentation/apply_webrtcvad.py | https://github.com/wiseman/py-webrtcvad/blob/master/example.py | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/tedlium2/align1/local/download_model.sh | ESPnet_Dynamic_for_PyTorch/egs/libri_css/asr1/local/download_asr.sh | https://drive.google.com/open?id=17cOOSHHMKI82e1MXj4r2ig8gpGCRmG2p | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_css/asr1/local/diarize.sh | ESPnet_Dynamic_for_PyTorch/egs/libri_css/asr1/local/diarize.sh | https://github.com/nryant/dscore | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_css/asr1/diarization/vb_hmm_xvector.sh | ESPnet_Dynamic_for_PyTorch/egs/libri_css/asr1/diarization/vb_hmm_xvector.sh | https://arxiv.org/abs/1910.08847 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_css/asr1/diarization/vb_hmm_xvector.sh | ESPnet_Dynamic_for_PyTorch/egs/libri_css/asr1/diarization/vb_hmm_xvector.sh | http://www.fit.vutbr.cz/research/groups/speech/publi/2019/diez_IEEE_ACM_2019_08910412.pdf | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_css/asr1/diarization/vb_hmm_xvector.py | ESPnet_Dynamic_for_PyTorch/egs/libri_css/asr1/diarization/vb_hmm_xvector.py | https://github.com/BUTSpeechFIT/VBx | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_css/asr1/diarization/VB_diarization.py | ESPnet_Dynamic_for_PyTorch/egs/libri_css/asr1/diarization/VB_diarization.py | burget@fit.vutbr. 
| 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/libri_css/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/li42/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/li10/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/README.md | ESPnet_Dynamic_for_PyTorch/egs/ksponspeech/asr1/run.sh | https://aihub.or.kr/aidata/105 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/ksponspeech/asr1/local/get_space_normalized_hyps.py | ESPnet_Dynamic_for_PyTorch/egs/ksponspeech/asr1/local/get_space_normalized_hyps.py | https://github.com/kaldi-asr/kaldi/blob/master/src/bin/align-text.cc | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/ksponspeech/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/jvs/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/jsut/tts1/local/prep_segments.py | ESPnet_Dynamic_for_PyTorch/egs/jsut/tts1/local/prep_segments.py | https://kaldi-asr.org/doc/data_prep.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/jsut/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/jsut/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/jsalt18e2e/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/tts.sh | ESPnet_Dynamic_for_PyTorch/egs/jnas/tts1/run.sh | http://kaldi-asr.org/models/8/0008_sitw_v2_1a.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/jnas/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/jnas/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/jesc/mt1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/README.md | ESPnet_Dynamic_for_PyTorch/egs/iwslt21_low_resource/st1/run.sh | https://iwslt.org/2021/low-resource | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/iwslt21_low_resource/st1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/README.md | ESPnet_Dynamic_for_PyTorch/egs/iwslt21_low_resource/asr1/run.sh | https://iwslt.org/2021/low-resource | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | 
ESPnet_Dynamic_for_PyTorch/egs/iwslt21_low_resource/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/iwslt21/st1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/iwslt21/punc1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/iwslt21/mt1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/iwslt21/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/iwslt19/st1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/iwslt19/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/iwslt18/st1/local/download_and_untar.sh | ESPnet_Dynamic_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/segmented/IWSLT-SLT.segmented.tst2019.en-de.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/iwslt18/st1/local/download_and_untar.sh | ESPnet_Dynamic_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/segmented/IWSLT-SLT.segmented.tst2020.en-de.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/iwslt18/st1/local/data_prep_train.sh | ESPnet_Dynamic_for_PyTorch/egs/iwslt18/st1/local/data_prep_eval.sh | https://drive.google.com/open?id=1agQOUEm47LIeLZAFF8RTZ5qx6OsOFGTM | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/iwslt18/st1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/iwslt18/mt1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/iwslt18/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/iwslt16/mt1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_css/asr1/local/wer_output_filter | ESPnet_Dynamic_for_PyTorch/egs/hub4_spanish/asr1/local/write_kaldi_files.pl | jtrmal@gmail.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_css/asr1/local/wer_output_filter | ESPnet_Dynamic_for_PyTorch/egs/hub4_spanish/asr1/local/prepare_training_text.pl | jtrmal@gmail.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_css/asr1/local/wer_output_filter | ESPnet_Dynamic_for_PyTorch/egs/hub4_spanish/asr1/local/prepare_test_text.pl | jtrmal@gmail.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_css/asr1/local/wer_output_filter | 
ESPnet_Dynamic_for_PyTorch/egs/hub4_spanish/asr1/local/prepare_data.sh | jtrmal@gmail.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_css/asr1/local/wer_output_filter | ESPnet_Dynamic_for_PyTorch/egs/hub4_spanish/asr1/local/parse_sgm.pl | jtrmal@gmail.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/hub4_spanish/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/how2/st1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/how2/mt1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/how2/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/hkust/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/swbd/asr1/local/swbd1_data_prep.sh | ESPnet_Dynamic_for_PyTorch/egs/fisher_swbd/asr1/local/swbd1_data_prep.sh | http://www.ldc.upenn.edu/Catalog/desc/addenda/swb-multi-annot.summary | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/swbd/asr1/local/eval2000_data_prep.sh | ESPnet_Dynamic_for_PyTorch/egs/fisher_swbd/asr1/local/eval2000_data_prep.sh | http://www.ldc.upenn.edu/Catalog/catalogEntry.jsp?catalogId=LDC2002S09 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/swbd/asr1/local/eval2000_data_prep.sh | ESPnet_Dynamic_for_PyTorch/egs/fisher_swbd/asr1/local/eval2000_data_prep.sh | http://www.ldc.upenn.edu/Catalog/CatalogEntry.jsp?catalogId=LDC2002T43 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/fisher_swbd/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/fisher_callhome_spanish/st1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/fisher_callhome_spanish/mt1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/fisher_callhome_spanish/asr1b/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/fisher_callhome_spanish/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/dirha_wsj/asr1/local/tools/readsph.m | ESPnet_Dynamic_for_PyTorch/egs/dirha_wsj/asr1/local/tools/readsph.m | http://www.ee.ic.ac.uk/hp/staff/dmb/voicebox/voicebox.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/dirha_wsj/asr1/local/tools/readsph.m | ESPnet_Dynamic_for_PyTorch/egs/dirha_wsj/asr1/local/tools/readsph.m | http://www.gnu.org/copyleft/gpl.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/dirha_wsj/asr1/local/tools/linear_shift.m | 
ESPnet_Dynamic_for_PyTorch/egs/dirha_wsj/asr1/local/tools/read_sphere.m | mravanelli@fbk.eu | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/dirha_wsj/asr1/local/tools/linear_shift.m | ESPnet_Dynamic_for_PyTorch/egs/dirha_wsj/asr1/local/tools/linear_shift.m | mravanelli@fbk.eu | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/dirha_wsj/asr1/local/tools/linear_shift.m | ESPnet_Dynamic_for_PyTorch/egs/dirha_wsj/asr1/local/tools/find_files.m | mravanelli@fbk.eu | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/dirha_wsj/asr1/local/tools/linear_shift.m | ESPnet_Dynamic_for_PyTorch/egs/dirha_wsj/asr1/local/tools/Data_Contamination.m | mravanelli@fbk.eu | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/dirha_wsj/asr1/local/tools/Data_Contamination.m | ESPnet_Dynamic_for_PyTorch/egs/dirha_wsj/asr1/local/tools/Data_Contamination.m | https://www.ldc.upenn.edu/language-resources/tools/sphere-conversion-tools | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/dirha_wsj/asr1/local/tools/linear_shift.m | ESPnet_Dynamic_for_PyTorch/egs/dirha_wsj/asr1/local/tools/create_folder_str.m | mravanelli@fbk.eu | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/dirha_wsj/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/dipco/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/README.md | ESPnet_Dynamic_for_PyTorch/egs/csmsc/tts1/local/data_download.sh | https://www.data-baker.com/open_source.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/csmsc/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/csj/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/csj/align1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/covost2/st1/run.sh | ESPnet_Dynamic_for_PyTorch/egs/covost2/st1/run.sh | https://voice-prod-bundler-ee1969a6ce8178826482b88e843c335139bd3fb4.s3.amazonaws.com/cv-corpus-4-2019-12-10/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/covost2/st1/run.sh | ESPnet_Dynamic_for_PyTorch/egs/covost2/st1/run.sh | https://dl.fbaipublicfiles.com/covost/covost_v2 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/covost2/st1/run.sh | ESPnet_Dynamic_for_PyTorch/egs/covost2/st1/run.sh | https://dl.fbaipublicfiles.com/covost/covost2.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/covost2/st1/local/process_tsv.py | ESPnet_Dynamic_for_PyTorch/egs/covost2/st1/local/process_tsv.py | https://github.com/facebookresearch/covost/blob/master/get_covost_splits.py | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/covost2/st1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/covost2/st1/run.sh | ESPnet_Dynamic_for_PyTorch/egs/covost2/mt1/run.sh | 
https://voice-prod-bundler-ee1969a6ce8178826482b88e843c335139bd3fb4.s3.amazonaws.com/cv-corpus-4-2019-12-10/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/covost2/st1/run.sh | ESPnet_Dynamic_for_PyTorch/egs/covost2/mt1/run.sh | https://dl.fbaipublicfiles.com/covost/covost_v2 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/covost2/mt1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/covost2/st1/run.sh | ESPnet_Dynamic_for_PyTorch/egs/covost2/asr1/run.sh | https://voice-prod-bundler-ee1969a6ce8178826482b88e843c335139bd3fb4.s3.amazonaws.com/cv-corpus-4-2019-12-10/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/covost2/st1/run.sh | ESPnet_Dynamic_for_PyTorch/egs/covost2/asr1/run.sh | https://dl.fbaipublicfiles.com/covost/covost_v2 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/covost2/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/commonvoice/asr1/local/data.sh | ESPnet_Dynamic_for_PyTorch/egs/commonvoice/asr1/run.sh | https://voice-prod-bundler-ee1969a6ce8178826482b88e843c335139bd3fb4.s3.amazonaws.com/cv-corpus-3/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/commonvoice/asr1/local/data.sh | ESPnet_Dynamic_for_PyTorch/egs/commonvoice/asr1/run.sh | https://voice-prod-bundler-ee1969a6ce8178826482b88e843c335139bd3fb4.s3.amazonaws.com/cv-corpus-5.1-2020-06-22/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/thchs30/tts1/local/download_and_untar.sh | ESPnet_Dynamic_for_PyTorch/egs/commonvoice/asr1/local/download_and_untar.sh | https://common-voice-data-download.s3.amazonaws.com/cv_corpus_v1.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/commonvoice/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/cmu_indic/tts1/local/data_download.sh | ESPnet_Dynamic_for_PyTorch/egs/cmu_indic/tts1/local/data_download.sh | http://festvox.org/h2r_indic/cmu_indic_ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/cmu_indic/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_css/asr1/local/wer_output_filter | ESPnet_Dynamic_for_PyTorch/egs/chime6/asr1/local/wer_output_filter | jtrmal@gmail.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_css/asr1/local/wer_output_filter | ESPnet_Dynamic_for_PyTorch/egs/chime6/asr1/local/prepare_dict.sh | jtrmal@gmail.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/chime6/asr1/local/prepare_baseline_chime6_data.sh | ESPnet_Dynamic_for_PyTorch/egs/chime6/asr1/local/prepare_baseline_chime6_data.sh | http://spandh.dcs.shef.ac.uk/chime_challenge/data.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/chime6/asr1/local/prepare_baseline_chime6_data.sh | ESPnet_Dynamic_for_PyTorch/egs/chime6/asr1/local/prepare_baseline_chime6_data.sh | http://spandh.dcs.shef.ac.uk/chime_workshop/papers/CHiME_2018_paper_boeddecker.pdf | 模型相关说明 | -| 开源代码引入 | 
https://github.com/espnet/espnet/blob/master/egs/chime6/asr1/local/install_pb_chime5.sh | ESPnet_Dynamic_for_PyTorch/egs/chime6/asr1/local/install_pb_chime5.sh | https://stackoverflow.com/a/3796947/5766934 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/chime6/asr1/local/generate_chime6_data.sh | ESPnet_Dynamic_for_PyTorch/egs/chime6/asr1/local/generate_chime6_data.sh | https://github.com/chimechallenge/chime6-synchronisation | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_css/asr1/local/wer_output_filter | ESPnet_Dynamic_for_PyTorch/egs/chime6/asr1/local/check_tools.sh | jtrmal@gmail.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/chime6/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/chime5/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/chime4/asr1_multich/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/chime4/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/blizzard17/tts1/run.sh | ESPnet_Dynamic_for_PyTorch/egs/blizzard17/tts1/run.sh | http://www.cstr.ed.ac.uk/projects/blizzard/2017/usborne_blizzard2017/license.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/blizzard17/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/babel/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/aurora4/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/arctic/vc1/path.sh | ESPnet_Dynamic_for_PyTorch/egs/arctic/vc1/path.sh | https://github.com/cybertronai/pytorch-lamb | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/arctic/vc1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/cmu_arctic/tts1/local/data_download.sh | ESPnet_Dynamic_for_PyTorch/egs/arctic/tts1/local/data_download.sh | http://festvox.org/cmu_arctic/cmu_arctic/packed/cmu_us_ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/arctic/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/an4/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/an4/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/ami/asr1/local/ami_xml2text.sh | ESPnet_Dynamic_for_PyTorch/egs/ami/asr1/local/ami_xml2text.sh | 
http://groups.inf.ed.ac.uk/ami/AMICorpusAnnotations/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/ami/asr1/local/ami_beamform.sh | ESPnet_Dynamic_for_PyTorch/egs/ami/asr1/local/ami_beamform.sh | http://groups.inf.ed.ac.uk/ami/corpus/dataproblems.shtml | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/ami/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/aishell2/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/aishell/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_Dynamic_for_PyTorch/egs/aidatatang_200zh/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/8.0/Dockerfile | ESPnet_Dynamic_for_PyTorch/docker/prebuilt/runtime/Dockerfile | nyalta21@gmail.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/9.2/Dockerfile | ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/9.2/Dockerfile | https://gitlab.com/nvidia/cuda/blob/ubuntu18.04/9.2/base/Dockerfile | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/9.2/Dockerfile | ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/9.2/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1710/x86_64 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/8.0/Dockerfile | ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/9.2/Dockerfile | https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1604/x86_64 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/9.2/Dockerfile | ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/9.2/Dockerfile | https://gitlab.com/nvidia/cuda/blob/ubuntu18.04/9.2/devel/Dockerfile | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/9.2/Dockerfile | ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/9.2/Dockerfile | https://gitlab.com/nvidia/cuda/blob/ubuntu18.04/9.2/devel/cudnn7/Dockerfile | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/8.0/Dockerfile | ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/9.1/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1604/x86_64 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/8.0/Dockerfile | ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/9.1/Dockerfile | https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1604/x86_64 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/8.0/Dockerfile | ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/9.0/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1604/x86_64 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/8.0/Dockerfile | ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/9.0/Dockerfile | https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1604/x86_64 | 
模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/8.0/Dockerfile | ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/8.0/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1604/x86_64 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/8.0/Dockerfile | ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/8.0/Dockerfile | https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1604/x86_64 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/11.1/Dockerfile | ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/11.1/Dockerfile | https://gitlab.com/nvidia/container-images/cuda/-/blob/master/dist/11.1.1/ubuntu20.04-x86_64/base/Dockerfile | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/11.1/Dockerfile | ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/11.1/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/11.1/Dockerfile | ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/11.1/Dockerfile | https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu2004/x86_64 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/10.1/Dockerfile | ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/11.1/Dockerfile | https://docs.nvidia.com/cuda/eula/index.html#attachment-a | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/11.1/Dockerfile | ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/11.1/Dockerfile | https://gitlab.com/nvidia/container-images/cuda/-/blob/master/dist/11.1.1/ubuntu20.04-x86_64/devel/Dockerfile | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/10.2/Dockerfile | ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/11.1/Dockerfile | https://gitlab.com/nvidia/container-images/cuda/-/issues/88 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/10.1/Dockerfile | ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/10.2/Dockerfile | https://gitlab.com/nvidia/cuda/blob/ubuntu18.04/10.1/base/Dockerfile | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/10.1/Dockerfile | ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/10.2/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/10.1/Dockerfile | ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/10.2/Dockerfile | https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1804/x86_64 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/10.1/Dockerfile | ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/10.2/Dockerfile | https://docs.nvidia.com/cuda/eula/index.html#attachment-a | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/10.2/Dockerfile | ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/10.2/Dockerfile | https://gitlab.com/nvidia/cuda/blob/ubuntu18.04/10.1/runtime/Dockerfile | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/10.2/Dockerfile | 
ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/10.2/Dockerfile | https://gitlab.com/nvidia/cuda/blob/ubuntu18.04/10.1/devel/Dockerfile | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/10.2/Dockerfile | ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/10.2/Dockerfile | https://gitlab.com/nvidia/container-images/cuda/-/issues/88 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/10.1/Dockerfile | ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/10.1/Dockerfile | https://gitlab.com/nvidia/cuda/blob/ubuntu18.04/10.1/base/Dockerfile | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/10.1/Dockerfile | ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/10.1/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/10.1/Dockerfile | ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/10.1/Dockerfile | https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1804/x86_64 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/10.1/Dockerfile | ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/10.1/Dockerfile | https://docs.nvidia.com/cuda/eula/index.html#attachment-a | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/10.1/Dockerfile | ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/10.0/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/10.1/Dockerfile | ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/10.0/Dockerfile | https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1804/x86_64 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/10.1/Dockerfile | ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/10.0/Dockerfile | https://docs.nvidia.com/cuda/eula/index.html#attachment-a | 模型相关说明 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|----------------------------------------------------------------------------------------------------------------------------|--------------------------------------------------------------------------------------------------------------------------------|---------------| +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/Dockerfile | nyalta21@gmail.com | maintainer邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/10.0/Dockerfile | nyalta21@gmail.com | maintainer邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/10.0/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64 / | cuda地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/10.0/Dockerfile | https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1804/x86_64 / | 下载相关依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/10.0/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64/7fa2af80.pub | 下载秘钥 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/10.1/Dockerfile | 
nyalta21@gmail.com | maintainer邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/10.1/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64 / | cuda地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/10.1/Dockerfile | https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1804/x86_64 / | 下载相关依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/10.1/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64/7fa2af80.pub | 下载秘钥 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/10.2/Dockerfile | nyalta21@gmail.com | maintainer邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/10.2/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64 / | cuda地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/10.2/Dockerfile | https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1804/x86_64 / | 下载相关依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/10.2/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64/7fa2af80.pub | 下载秘钥 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/11.1/Dockerfile | nyalta21@gmail.com | maintainer邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/11.1/Dockerfile | https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu2004/x86_64 / | 下载相关依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/11.1/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64 / | 下载相关依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/11.1/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64/7fa2af80.pub | 下载秘钥 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/8.0/Dockerfile | nyalta21@gmail.com | maintainer邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/8.0/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1604/x86_64 / | cuda地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/8.0/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1604/x86_64/7fa2af80.pub | 下载秘钥 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/8.0/Dockerfile | https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1604/x86_64 / | ubuntu镜像地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/9.0/Dockerfile | nyalta21@gmail.com | maintainer邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/9.0/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1604/x86_64 / | cuda地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/9.0/Dockerfile | https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1604/x86_64 / | 下载相关依赖 
| +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/9.0/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1604/x86_64/7fa2af80.pub | 下载秘钥 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/9.1/Dockerfile | nyalta21@gmail.com | maintainer邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/9.1/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1604/x86_64 / | cuda地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/9.1/Dockerfile | https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1604/x86_64 / | 下载相关依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/9.1/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1604/x86_64/7fa2af80.pub | 下载秘钥 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/9.2/Dockerfile | nyalta21@gmail.com | maintainer邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/9.2/Dockerfile | https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1604/x86_64 / | 下载相关依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/9.2/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1710/x86_64 / | 下载相关依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/docker/prebuilt/devel/gpu/9.2/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1710/x86_64/7fa2af82.pub | 下载秘钥 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/docker/prebuilt/Dockerfile | nyalta21@gmail.com | maintainer邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/docker/prebuilt/local/Dockerfile | nyalta21@gmail.com | maintainer邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/docker/prebuilt/runtime/Dockerfile | nyalta21@gmail.com | maintainer邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/docker/prebuilt/runtime/Dockerfile | https://repo.continuum.io/miniconda/Miniconda3-latest-Linux-x86_64.sh | miniconda链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/aidatatang_200zh/asr1/run.sh | www.openslr.org/resources/62 | 下载配置 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/aishell/asr1/run.sh | www.openslr.org/resources/33 | 下载配置 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/ami/asr1/local/ami_download.sh | http://groups.inf.ed.ac.uk/ami | 下载配置 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/ami/asr1/local/ami_download.sh | http://groups.inf.ed.ac.uk/ami/download/temp/amiBuild-04237-Sun-Jun-15-2014.manifest.txt | 下载配置 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/ami/asr1/local/ami_text_prep.sh | http://groups.inf.ed.ac.uk/ami | 下载配置 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/ami/asr1/local/ami_xml2text.sh | http://groups.inf.ed.ac.uk/ami/AMICorpusAnnotations/$annots.gzip | 下载配置 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/an4/asr1/run.sh | http://www.speech.cs.cmu.edu/databases/an4/ | 数据集地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/an4/tts1/run.sh | http://www.speech.cs.cmu.edu/databases/an4/ | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/arctic/tts1/local/data_download.sh | http://festvox.org/cmu_arctic/cmu_arctic/packed/cmu_us_${spk}_arctic-0.95-release.tar.bz2 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/aurora4/asr1/local/aurora4_data_prep.sh | http://www.ldc.upenn.edu/Catalog/docs/LDC93S6A/wsj0-train-spkrinfo.txt | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/aurora4/asr1/local/wsj_data_prep.sh | https://catalog.ldc.upenn.edu/docs/LDC93S6A/wsj0-train-spkrinfo.txt | 下载数据集 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/blizzard17/tts1/local/download.sh | http://data.cstr.ed.ac.uk/blizzard2017-18/usborne/2018/2018_EH1/blizzard_release_2017_v2.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/chime4/asr1/local/clean_wsj0_data_prep.sh | http://www.ldc.upenn.edu/Catalog/docs/LDC93S6A/wsj0-train-spkrinfo.txt | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/chime6/asr1/local/prepare_baseline_chime6_data.sh | http://www.openslr.org/resources/28/rirs_noises.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/chime6/asr1/local/prepare_dict.sh | https://svn.code.sf.net/p/cmusphinx/code/trunk/cmudict $dir/cmudict | 相关设置 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/cmu_indic/tts1/local/data_download.sh | http://festvox.org/h2r_indic/cmu_indic_${spk}.tar.bz2 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/commonvoice/asr1/run.sh | https://voice-prod-bundler-ee1969a6ce8178826482b88e843c335139bd3fb4.s3.amazonaws.com/cv-corpus-5.1-2020-06-22/${lang}.tar.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/covost2/asr1/run.sh | https://dl.fbaipublicfiles.com/covost/covost2.zip | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/covost2/asr1/run.sh | https://voice-prod-bundler-ee1969a6ce8178826482b88e843c335139bd3fb4.s3.amazonaws.com/cv-corpus-4-2019-12-10/${src_lang}.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/covost2/asr1/run.sh | https://dl.fbaipublicfiles.com/covost/covost_v2.${src_lang}_${tgt_lang}.tsv.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/covost2/mt1/run.sh | https://dl.fbaipublicfiles.com/covost/covost2.zip | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/covost2/mt1/run.sh | https://voice-prod-bundler-ee1969a6ce8178826482b88e843c335139bd3fb4.s3.amazonaws.com/cv-corpus-4-2019-12-10/${src_lang}.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/covost2/mt1/run.sh | https://dl.fbaipublicfiles.com/covost/covost_v2.${src_lang}_${tgt_lang}.tsv.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/covost2/st1/run.sh | https://dl.fbaipublicfiles.com/covost/covost2.zip | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/covost2/st1/run.sh | https://voice-prod-bundler-ee1969a6ce8178826482b88e843c335139bd3fb4.s3.amazonaws.com/cv-corpus-4-2019-12-10/${src_lang}.tar.gz | 数据集地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/covost2/st1/run.sh | https://dl.fbaipublicfiles.com/covost/covost_v2.${src_lang}_${tgt_lang}.tsv.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/csmsc/tts1/local/data_download.sh | https://www.data-baker.com/open_source.html | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/dipco/asr1/local/download_data.sh | https://s3.amazonaws.com/dipco/DiPCo.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/dirha_wsj/asr1/local/wsj_data_prep.sh | https://catalog.ldc.upenn.edu/docs/LDC93S6A/wsj0-train-spkrinfo.txt | 下载数据集 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/fisher_swbd/asr1/local/fisher_swbd_prepare_dict.sh | https://svn.code.sf.net/p/cmusphinx/code/trunk/cmudict | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/fisher_swbd/asr1/local/swbd1_data_download.sh | http://www.openslr.org/resources/5/switchboard_word_alignments.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/fisher_swbd/asr1/local/swbd1_data_download.sh | http://www.isip.piconepress.com/projects/switchboard/releases/switchboard_word_alignments.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~mmueller/iwslt-corpus.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/preprocessed/IWSLT-SLT.tst2020.en-de.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/preprocessed/IWSLT-SLT.tst2019.en-de.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/preprocessed/IWSLT-SLT.tst2018.en-de.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/preprocessed/IWSLT-SLT.tst2015.en-de.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/preprocessed/IWSLT-SLT.tst2014.en-de.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/preprocessed/IWSLT-SLT.tst2013.en-de.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/preprocessed/IWSLT-SLT.tst2010.en-de.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/preprocessed/IWSLT-SLT.dev2010.en-de.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/iwslt21/asr1/local/data_prep_wmt20.sh | https://www.cs.jhu.edu/~kevinduh/t/iwslt21/wmt20/wmt20-de-en-subset20m.incl_paracrawl.tgz | 
数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/iwslt21/asr1/local/data_prep_wmt20.sh | https://www.cs.jhu.edu/~kevinduh/t/iwslt21/wmt20/wmt20-de-en-subset20m.incl_paracrawl.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/iwslt21/asr1/local/data_prep_wmt20.sh | https://www.cs.jhu.edu/~kevinduh/t/iwslt21/wmt20/wmt20-de-en-subset5m.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/iwslt21/asr1/run.sh | https://www.openslr.org/resources/11/librispeech-lm-norm.txt.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/iwslt21_low_resource/asr1/run.sh | https://iwslt.org/2021/low-resource | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/iwslt21_low_resource/st1/run.sh | https://iwslt.org/2021/low-resource | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/jesc/mt1/local/download_data.sh | https://nlp.stanford.edu/projects/jesc/data/split.tar.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/jnas/tts1/run.sh | http://kaldi-asr.org/models/8/0008_sitw_v2_1a.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/jsut/tts1/local/download.sh | http://ss-takashi.sakura.ne.jp/corpus/jsut_ver1.1.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/libri_css/asr1/local/download_xvector.sh | http://kaldi-asr.org/models/12/0012_diarization_v1.tar.gz | 下载依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/librispeech/asr1/run.sh | www.openslr.org/resources/12 | 下载配置 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/librispeech/asr1/run.sh | http://www.openslr.org/resources/11/librispeech-lm-norm.txt.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/libritts/tts1/run.sh | http://kaldi-asr.org/models/8/0008_sitw_v2_1a.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/libritts/tts1/run.sh | www.openslr.org/resources/60 | 下载配置 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/ljspeech/tts1/local/data_download.sh | http://data.keithito.com/data/speech/LJSpeech-1.1.tar.bz2 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/m_ailabs/tts1/local/download.sh | http://www.caito.de/data/Training/stt_tts/${lang}.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/mgb2/asr1/local/mgb_extract_data.sh | https://arabicspeech.org/mgb2 | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/mgb2/asr1/local/xml2stm.py | yzhang@qf.org.qa | 作者邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/mucs21_subtask1/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Odia_train.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/mucs21_subtask1/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Marathi_train.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/mucs21_subtask1/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Hindi_train.tar.gz | 数据集地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/mucs21_subtask1/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Odia_test.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/mucs21_subtask1/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Marathi_test.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/mucs21_subtask1/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Hindi_test.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/mucs21_subtask2/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Hindi-English_test.tar.gz | 下载数据集 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/mucs21_subtask2/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Hindi-English_test.tar.gz | 下载数据集 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/mucs21_subtask2/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Hindi-English_train.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/mucs21_subtask2/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Bengali-English_train.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/puebla_nahuatl/asr1/run.sh | https://www.openslr.org/resources/92/Puebla-Nahuatl-Manifest.tgz Puebla-Nahuatl-Manifest.tgz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/puebla_nahuatl/asr1/run.sh | https://www.openslr.org/resources/92/Sound-Files-Puebla-Nahuatl.tgz.part0 Sound-Files-Puebla-Nahuatl.tgz.part0 9 | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/puebla_nahuatl/st1/run.sh | https://www.openslr.org/resources/92/Puebla-Nahuatl-Manifest.tgz Puebla-Nahuatl-Manifest.tgz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/puebla_nahuatl/st1/run.sh | https://www.openslr.org/resources/92/SpeechTranslation_Nahuatl_Manifest.tgz SpeechTranslation_Nahuatl_Manifest.tgz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/puebla_nahuatl/st1/run.sh | https://www.openslr.org/resources/92/Sound-Files-Puebla-Nahuatl.tgz.part0 Sound-Files-Puebla-Nahuatl.tgz.part0 9 | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/reverb/asr1/local/download_se_eval_tool.sh | https://reverb2014.dereverberation.com/tools/REVERB-SPEENHA.Release04Oct.zip | 下载工具脚本 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/reverb/asr1/local/generate_data.sh | http://reverb2014.dereverberation.com/tools/taskFiles_et.tgz | 下载工具脚本 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/reverb/asr1/local/generate_data.sh | http://reverb2014.dereverberation.com/tools/REVERB_TOOLS_FOR_ASR_ver2.2.tgz | 下载工具脚本 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/reverb/asr1/local/generate_data.sh | http://reverb2014.dereverberation.com/tools/reverb_tools_for_Generate_mcTrainData.tgz | 下载工具脚本 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/reverb/asr1_multich/local/download_se_eval_tool.sh | 
https://reverb2014.dereverberation.org/tools/taskFiles_et.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/reverb/asr1_multich/local/download_se_eval_tool.sh | https://reverb2014.dereverberation.com/tools/REVERB-SPEENHA.Release04Oct.zip | 下载工具脚本 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/ru_open_stt/asr1/local/ru_open_stt_download_data.sh | https://raw.githubusercontent.com/snakers4/open_stt/4bff5470a29dcca5c7175fa3b6fd106c6151b756/${f} | 模型相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/swbd/asr1/local/swbd1_data_download.sh | http://www.openslr.org/resources/5/switchboard_word_alignments.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/swbd/asr1/local/swbd1_data_download.sh | http://www.isip.piconepress.com/projects/switchboard/releases/switchboard_word_alignments.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/tedlium2/align1/local/download_data.sh | http://www.openslr.org/resources/19/TEDLIUM_release2.tar.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/tedlium2/asr1/local/download_data.sh | http://www.openslr.org/resources/19/TEDLIUM_release2.tar.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/tedlium3/asr1/local/download_data.sh | http://www.openslr.org/resources/51/TEDLIUM_release-3.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/vcc20/tts1_en_de/local/download.sh | http://data.solak.de/data/Training/stt_tts/${lang}.tgz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/vcc20/tts1_en_de/run.sh | http://kaldi-asr.org/models/8/0008_sitw_v2_1a.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/vcc20/tts1_en_fi/local/download.sh | http://data.solak.de/data/Training/stt_tts/${lang}.tgz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/vcc20/tts1_en_fi/run.sh | http://kaldi-asr.org/models/8/0008_sitw_v2_1a.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/vcc20/tts1_en_zh/local/download_csmsc.sh | https://weixinxcxdb.oss-cn-beijing.aliyuncs.com/gwYinPinKu/BZNSYP.rar | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/vcc20/tts1_en_zh/local/download_mailabs.sh | http://data.solak.de/data/Training/stt_tts/${lang}.tgz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/vcc20/tts1_en_zh/run.sh | http://kaldi-asr.org/models/8/0008_sitw_v2_1a.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/vcc20/vc1_task1/run.sh | http://kaldi-asr.org/models/8/0008_sitw_v2_1a.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/vcc20/vc1_task2/run.sh | http://kaldi-asr.org/models/8/0008_sitw_v2_1a.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/voxforge/asr1/local/getdata.sh | http://www.repository.voxforge1.org/downloads/SpeechCorpus/Trunk/Audio/Main/16kHz_16bit | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/voxforge/asr1/local/getdata.sh | http://www.repository.voxforge1.org/downloads/Russian/Trunk/Audio/Main/16kHz_16bit | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/voxforge/asr1/local/getdata.sh | 
http://www.repository.voxforge1.org/downloads/Dutch/Trunk/Audio/Main/16kHz_16bit | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/voxforge/asr1/local/getdata.sh | http://www.repository.voxforge1.org/downloads/$lang/Trunk/Audio/Main/16kHz_16bit | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/wsj/asr1/local/wsj_data_prep.sh | https://catalog.ldc.upenn.edu/docs/LDC93S6A/wsj0-train-spkrinfo.txt | 下载数据集 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/wsj_mix/asr1/local/wsj_data_prep.sh | https://catalog.ldc.upenn.edu/docs/LDC93S6A/wsj0-train-spkrinfo.txt | 下载数据集 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/wsj_mix/asr1/local/wsj0_create_mixture.sh | http://www.merl.com/demos/deep-clustering/create-speaker-mixtures.zip | 下载工具脚本 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/yesno/asr1/run.sh | http://www.openslr.org/resources/1/waves_yesno.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/yesno/tts1/run.sh | http://www.openslr.org/resources/1/waves_yesno.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/yoloxochitl_mixtec/asr1/run.sh | http://www.openslr.org/resources/89/Yoloxochitl-Mixtec-Manifest.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/egs/yoloxochitl_mixtec/asr1/run.sh | http://www.openslr.org/resources/89/Yoloxochitl-Mixtec-Data.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/espnet/nets/pytorch_backend/transducer/arguments.py | https://arxiv.org/abs/2010.11148 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/setup.py | shinjiw@ieee.org | 作者邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/tools/installers/install_k2.sh | https://k2-fsa.org/nightly/ | 三方库连接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/tools/installers/install_k2.sh | https://k2-fsa.org/nightly/ | 三方库连接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/tools/installers/install_k2.sh | https://k2-fsa.org/nightly/ | 三方库连接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/tools/installers/install_k2.sh | https://k2-fsa.org/nightly/ | 三方库连接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/tools/installers/install_mwerSegmenter.sh | https://www-i6.informatik.rwth-aachen.de/web/Software/mwerSegmenter.tar.gz | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/tools/installers/install_nkf.sh | https://ja.osdn.net/dl/nkf/nkf-2.1.4.tar.gz | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/tools/installers/install_sctk.sh | ftp://jaguar.ncsl.nist.gov/pub/sctk-2.4.10-20151007-1312Z.tar.bz2 | 模型相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/tools/installers/install_sctk.sh | http://www.openslr.org/resources/4/sctk-2.4.10-20151007-1312Z.tar.bz2 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/tools/installers/install_sph2pipe.sh | http://www.openslr.org/resources/3/sph2pipe_v2.5.tar.gz | 模型相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/tools/installers/install_torch.sh | https://download.pytorch.org/whl/torch_stable.html | 三方库下载 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/tools/installers/install_torch.sh 
| https://download.pytorch.org/whl/torch_stable.html | 三方库下载 |
+| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/tools/installers/install_torch.sh | https://download.pytorch.org/whl/torch_stable.html | 三方库下载 |
+| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/tools/installers/install_torch.sh | https://download.pytorch.org/whl/torch_stable.html | 三方库下载 |
+| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/tools/installers/install_torch.sh | https://download.pytorch.org/whl/torch_stable.html | 三方库下载 |
+| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/tools/installers/install_torch.sh | https://download.pytorch.org/whl/torch_stable.html | 三方库下载 |
+| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/tools/installers/install_torch.sh | https://download.pytorch.org/whl/torch_stable.html | 三方库下载 |
+| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/tools/installers/install_torch.sh | https://download.pytorch.org/whl/torch_stable.html | 三方库下载 |
+| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/tools/setup_anaconda.sh | https://repo.continuum.io/miniconda/Miniconda3-latest-Linux-x86_64.sh | miniconda链接 |
+| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/utils/pack_model.sh | shinjiw@ieee.org | 作者邮箱 |
+| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_Dynamic_for_PyTorch/utils/synth_wav.sh | http://kaldi-asr.org/models/8/0008_sitw_v2_1a.tar.gz | 模型相关说明 |
\ No newline at end of file
diff --git a/PyTorch/built-in/audio/ESPnet_for_PyTorch/public_address_statement.md b/PyTorch/built-in/audio/ESPnet_for_PyTorch/public_address_statement.md
index 545a13cdeeffd03cf5cdabe2eb581be52e4877d4..f0c415ac0cfff368a979b5b9bab11cbd98dfeb7a 100644
--- a/PyTorch/built-in/audio/ESPnet_for_PyTorch/public_address_statement.md
+++ b/PyTorch/built-in/audio/ESPnet_for_PyTorch/public_address_statement.md
@@ -1,685 +1,167 @@
-| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 |
-|--------|------------------------------------------------|-----------------------------------------------------------------------------|--------------------------------------------------------------------------------------------------------------------------------|---------|
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/docker/prebuilt/devel/Dockerfile | nyalta21@gmail.com | 邮箱 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/docker/prebuilt/devel/Dockerfile | https://github.com/espnet/espnet | 下载源码 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/10.0/Dockerfile | nyalta21@gmail.com | 邮箱 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/10.0/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64/7fa2af80.pub | 下载密钥 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/10.1/Dockerfile | nyalta21@gmail.com | 邮箱 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/10.1/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64/7fa2af80.pub | 下载密钥 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/10.2/Dockerfile | nyalta21@gmail.com | 邮箱 |
-| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | 
ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/10.2/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64/7fa2af80.pub | 下载密钥 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/11.1/Dockerfile | nyalta21@gmail.com | 邮箱 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/11.1/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64/7fa2af80.pub | 下载密钥 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/8.0/Dockerfile | nyalta21@gmail.com | 邮箱 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/8.0/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1604/x86_64/7fa2af80.pub | 下载密钥 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/9.0/Dockerfile | nyalta21@gmail.com | 邮箱 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/9.0/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1604/x86_64/7fa2af80.pub | 下载密钥 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/9.1/Dockerfile | nyalta21@gmail.com | 邮箱 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/9.1/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1604/x86_64/7fa2af80.pub | 下载密钥 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/9.2/Dockerfile | nyalta21@gmail.com | 邮箱 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/9.2/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1710/x86_64/7fa2af80.pub | 下载密钥 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/docker/prebuilt/Dockerfile | nyalta21@gmail.com | 邮箱 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/docker/prebuilt/Dockerfile | https://github.com/cybertronai/pytorch-lamb | 下载第三方库 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/docker/prebuilt/local/Dockerfile | nyalta21@gmail.com | 邮箱 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/docker/prebuilt/runtime/Dockerfile | nyalta21@gmail.com | 邮箱 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/docker/prebuilt/runtime/Dockerfile | https://github.com/kaldi-asr/kaldi | 下载依赖 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/docker/prebuilt/runtime/Dockerfile | https://github.com/espnet/kaldi-bin/releases/download/v0.0.1/ubuntu16-featbin.tar.gz | 下载依赖 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/docker/prebuilt/runtime/Dockerfile | https://repo.continuum.io/miniconda/Miniconda3-latest-Linux-x86_64.sh | 下载依赖 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/aidatatang_200zh/asr1/run.sh | www.openslr.org/resources/62 | 下载配置 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/aishell/asr1/run.sh | www.openslr.org/resources/33 | 下载配置 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | 
ESPnet_for_PyTorch/egs/ami/asr1/local/ami_download.sh | http://groups.inf.ed.ac.uk/ami | 下载配置 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/ami/asr1/local/ami_download.sh | http://groups.inf.ed.ac.uk/ami/download/temp/amiBuild-04237-Sun-Jun-15-2014.manifest.txt | 下载配置 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/ami/asr1/local/ami_download.sh | http://groups.inf.ed.ac.uk/ami/download/temp/Creative-Commons-Attribution-NonCommercial-ShareAlike-2.5.txt | 下载配置 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/ami/asr1/local/ami_text_prep.sh | http://groups.inf.ed.ac.uk/ami | 下载配置 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/ami/asr1/local/ami_xml2text.sh | http://sourceforge.net/projects/nite/files/nite/nxt_1.4.4/nxt_1.4.4.zip | 下载依赖 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/ami/asr1/local/ami_xml2text.sh | http://groups.inf.ed.ac.uk/ami/AMICorpusAnnotations/$annots.gzip | 下载配置 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/an4/asr1/run.sh | http://www.speech.cs.cmu.edu/databases/an4/ | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/an4/tts1/run.sh | http://www.speech.cs.cmu.edu/databases/an4/ | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/arctic/tts1/local/data_download.sh | http://festvox.org/cmu_arctic/cmu_arctic/packed/cmu_us_${spk}_arctic-0.95-release.tar.bz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/arctic/tts1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1iXdQv_YGD9VG1dR_xCjSkX6A4HkrpTbF | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/arctic/tts1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1iOwvCx6wX5_qCmHZSX_vCd_ZYn-B5akh | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/arctic/tts1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1rHQMMjkSoiX3JX2e70MKUKSrxHGwhmRb | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/arctic/tts1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1cNrTa8Jxa3AYcap7jo0_RPBapiay3etG | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/arctic/tts1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1zv9GwhhBW32a6RM5wHzjqRxkkv9IrXTL | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/arctic/vc1/local/ob_eval/evaluate.sh | https://drive.google.com/open?id=1BtQvAnsFvVi-dp_qsaFP7n4A_5cwnlR6 | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/arctic/vc1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1mPf-BxX3t_pqFFV6MGPBRePm5kgNR5sM | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/arctic/vc1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1fRLw6EA0x55xa449i_YRjCgm8sgv3hJI | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/arctic/vc1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1v70TtwfmYtTHq9LvksX907mNTEv1G-J1 | 下载预训练模型 | -| 开源代码引入 | 
https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/arctic/vc1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1ty_de85SNldzVJSMQrHwl1ASBdGdSRav | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/aurora4/asr1/local/aurora4_data_prep.sh | http://www.ldc.upenn.edu/Catalog/docs/LDC93S6A/wsj0-train-spkrinfo.txt | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/aurora4/asr1/local/aurora4_data_prep.sh | https://sourceforge.net/projects/kaldi/files/wsj0-train-spkrinfo.txt | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/aurora4/asr1/local/wsj_data_prep.sh | https://catalog.ldc.upenn.edu/docs/LDC93S6A/wsj0-train-spkrinfo.txt | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/aurora4/asr1/local/wsj_data_prep.sh | https://sourceforge.net/projects/kaldi/files/wsj0-train-spkrinfo.txt | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/blizzard17/tts1/local/download.sh | http://data.cstr.ed.ac.uk/blizzard2017-18/usborne/2018/2018_EH1/blizzard_release_2017_v2.zip | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/chime4/asr1/local/clean_wsj0_data_prep.sh | http://www.ldc.upenn.edu/Catalog/docs/LDC93S6A/wsj0-train-spkrinfo.txt | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/chime4/asr1/local/clean_wsj0_data_prep.sh | https://sourceforge.net/projects/kaldi/files/wsj0-train-spkrinfo.txt | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/chime6/asr1/local/generate_chime6_data.sh | https://github.com/chimechallenge/chime6-synchronisation.git | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/chime6/asr1/local/install_pb_chime5.sh | https://github.com/fgnt/pb_chime5.git | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/chime6/asr1/local/prepare_baseline_chime6_data.sh | http://www.openslr.org/resources/28/rirs_noises.zip | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/chime6/asr1/local/prepare_dict.sh | https://svn.code.sf.net/p/cmusphinx/code/trunk/cmudict | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/cmu_indic/tts1/local/data_download.sh | http://festvox.org/h2r_indic/cmu_indic_${spk}.tar.bz2 | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/cmu_indic/tts1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1iXdQv_YGD9VG1dR_xCjSkX6A4HkrpTbF | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/cmu_indic/tts1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1iOwvCx6wX5_qCmHZSX_vCd_ZYn-B5akh | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/cmu_indic/tts1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1rHQMMjkSoiX3JX2e70MKUKSrxHGwhmRb | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/cmu_indic/tts1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1cNrTa8Jxa3AYcap7jo0_RPBapiay3etG | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | 
ESPnet_for_PyTorch/egs/cmu_indic/tts1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1zv9GwhhBW32a6RM5wHzjqRxkkv9IrXTL | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/commonvoice/asr1/run.sh | https://voice-prod-bundler-ee1969a6ce8178826482b88e843c335139bd3fb4.s3.amazonaws.com/cv-corpus-5.1-2020-06-22/${lang}.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/covost2/asr1/run.sh | https://voice-prod-bundler-ee1969a6ce8178826482b88e843c335139bd3fb4.s3.amazonaws.com/cv-corpus-4-2019-12-10/${src_lang}.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/covost2/asr1/run.sh | https://dl.fbaipublicfiles.com/covost/covost_v2.${src_lang}_${tgt_lang}.tsv.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/covost2/asr1/run.sh | https://dl.fbaipublicfiles.com/covost/covost2.zip | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/covost2/mt1/run.sh | https://voice-prod-bundler-ee1969a6ce8178826482b88e843c335139bd3fb4.s3.amazonaws.com/cv-corpus-4-2019-12-10/${src_lang}.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/covost2/mt1/run.sh | https://dl.fbaipublicfiles.com/covost/covost_v2.${src_lang}_${tgt_lang}.tsv.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/covost2/mt1/run.sh | https://dl.fbaipublicfiles.com/covost/covost2.zip | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/covost2/st1/run.sh | https://voice-prod-bundler-ee1969a6ce8178826482b88e843c335139bd3fb4.s3.amazonaws.com/cv-corpus-4-2019-12-10/${src_lang}.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/covost2/st1/run.sh | https://dl.fbaipublicfiles.com/covost/covost_v2.${src_lang}_${tgt_lang}.tsv.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/covost2/st1/run.sh | https://dl.fbaipublicfiles.com/covost/covost2.zip | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/csj/align1/local/download_model.sh | https://drive.google.com/open?id=1UqIY6WJMZ4sxNxSugUqp3mrGb3j6h7xe | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/csj/align1/local/download_model.sh | https://drive.google.com/open?id=1cac5Uc09lJrCYfWkLQsF8eapQcxZnYdf | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/csj/align1/local/download_model.sh | https://drive.google.com/open?id=1cVeSOYY1twOfL9Gns7Z3ZDnkrJqNwPow | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/csj/align1/local/download_model.sh | https://drive.google.com/open?id=1zcPglHAKILwVgfACoMWWERiyIquzSYuU | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/csj/align1/local/download_model.sh | https://drive.google.com/open?id=1BtQvAnsFvVi-dp_qsaFP7n4A_5cwnlR6 | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/csj/align1/local/download_model.sh | https://drive.google.com/open?id=17cOOSHHMKI82e1MXj4r2ig8gpGCRmG2p | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/csj/align1/local/download_model.sh | 
https://drive.google.com/open?id=1tWccl6aYU67kbtkm8jv5H6xayqg1rzjh | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/csj/align1/local/download_model.sh | https://drive.google.com/open?id=120nUQcSsKeY5dpyMWw_kI33ooMRGT2uF | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/csj/align1/local/download_model.sh | https://drive.google.com/open?id=1ALvD4nHan9VDJlYJwNurVr7H7OV0j2X9 | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/csj/align1/local/download_model.sh | https://drive.google.com/open?id=1Az-4H25uwnEFa4lENc-EKiPaWXaijcJp | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/csj/align1/local/download_model.sh | https://drive.google.com/open?id=1jdEKbgWhLTxN_qP4xwE7mTOPmp7Ga--T | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/dipco/asr1/local/download_data.sh | https://s3.amazonaws.com/dipco/DiPCo.tgz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/dirha_wsj/asr1/local/wsj_data_prep.sh | https://catalog.ldc.upenn.edu/docs/LDC93S6A/wsj0-train-spkrinfo.txt | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/dirha_wsj/asr1/local/wsj_data_prep.sh | https://sourceforge.net/projects/kaldi/files/wsj0-train-spkrinfo.txt | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/fisher_callhome_spanish/st1/local/normalize_trans.sh | https://github.com/joshua-decoder/fisher-callhome-corpus.git | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/fisher_swbd/asr1/local/fisher_swbd_prepare_dict.sh | https://svn.code.sf.net/p/cmusphinx/code/trunk/cmudict | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/fisher_swbd/asr1/local/swbd1_data_download.sh | http://www.openslr.org/resources/5/switchboard_word_alignments.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/fisher_swbd/asr1/local/swbd1_data_download.sh | http://www.isip.piconepress.com/projects/switchboard/releases/switchboard_word_alignments.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/how2/st1/local/data_prep_test.sh | https://islpc21.is.cs.cmu.edu/ramons/iwslt2019.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/iwslt16/mt1/local/download_and_untar.sh | https://wit3.fbk.eu/archive/2016-01/texts/en/de/en-de.tgz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/iwslt16/mt1/local/train_and_apply_bpe.sh | https://github.com/rsennrich/subword-nmt | 下载依赖 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/iwslt18/st1/local/data_prep_train.sh | https://drive.google.com/open?id=1agQOUEm47LIeLZAFF8RTZ5qx6OsOFGTM | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~mmueller/iwslt-corpus.zip | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/preprocessed/IWSLT-SLT.dev2010.en-de.tgz | 下载数据集 | -| 开源代码引入 | 
https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/preprocessed/IWSLT-SLT.tst2010.en-de.tgz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/preprocessed/IWSLT-SLT.tst2013.en-de.tgz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/preprocessed/IWSLT-SLT.tst2014.en-de.tgz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/preprocessed/IWSLT-SLT.tst2015.en-de.tgz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/preprocessed/IWSLT-SLT.tst2018.en-de.tgz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/preprocessed/IWSLT-SLT.tst2019.en-de.tgz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/preprocessed/IWSLT-SLT.tst2020.en-de.tgz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/iwslt21/asr1/local/data_prep_wmt20.sh | https://www.cs.jhu.edu/~kevinduh/t/iwslt21/wmt20/wmt20-de-en-subset5m.tgz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/iwslt21/asr1/local/data_prep_wmt20.sh | https://www.cs.jhu.edu/~kevinduh/t/iwslt21/wmt20/wmt20-de-en-subset10m.incl_paracrawl.tgz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/iwslt21/asr1/local/data_prep_wmt20.sh | https://www.cs.jhu.edu/~kevinduh/t/iwslt21/wmt20/wmt20-de-en-subset20m.incl_paracrawl.tgz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/iwslt21/asr1/local/data_prep_wmt20.sh | https://github.com/saffsd/langid.py | 下载工具脚本 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/iwslt21/asr1/run.sh | https://www.openslr.org/resources/11/librispeech-lm-norm.txt.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/jesc/mt1/local/download_data.sh | https://nlp.stanford.edu/projects/jesc/data/split.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/jnas/tts1/run.shh | http://kaldi-asr.org/models/8/0008_sitw_v2_1a.tar.gz | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/jsut/tts1/local/download.sh | http://ss-takashi.sakura.ne.jp/corpus/jsut_ver1.1.zip | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/jsut/tts1/local/download.sh | https://github.com/r9y9/jsut-lab | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/jvs/tts1/local/data_download.sh | https://drive.google.com/open?id=19oAw8wWn3Y7z6CKChRdAyGOB9yupL_Xt | 
下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/jvs/tts1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1mEnZfBKqA4eT6Bn0eRZuP6lNzL-IL3VD | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/jvs/tts1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1kp5M4VvmagDmYckFJa78WGqh1drb_P9t | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/libri_css/asr1/diarization/vb_hmm_xvector.sh | https://github.com/desh2608/kaldi-io-for-python.git@vbx | 下载依赖 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/libri_css/asr1/local/data_download.sh | https://docs.google.com/uc?export=download&confirm= | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/libri_css/asr1/local/data_download.sh | https://docs.google.com/uc?export=download&id=1Piioxd5G_85K9Bhcr8ebdhXx0CnaHy7l | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/libri_css/asr1/local/diarize.sh | https://github.com/desh2608/dscore.git | 下载依赖 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/libri_css/asr1/local/download_xvector.sh | http://kaldi-asr.org/models/12/0012_diarization_v1.tar.gz | 下载依赖 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/libri_css/asr1/local/download_xvector.sh | https://desh2608.github.io/static/files/jsalt/plda | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/libri_css/asr1/run.sh | https://drive.google.com/open?id=17cOOSHHMKI82e1MXj4r2ig8gpGCRmG2p | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/librispeech/asr1/run.sh | www.openslr.org/resources/12 | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/librispeech/asr1/run.sh | http://www.openslr.org/resources/11/librispeech-lm-norm.txt.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/libritts/tts1/run.sh | www.openslr.org/resources/60 | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/libritts/tts1/run.sh | http://kaldi-asr.org/models/8/0008_sitw_v2_1a.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/jsut/tts1/local/download.sh | http://data.keithito.com/data/speech/LJSpeech-1.1.tar.bz2 | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/ljspeech/tts1/local/ob_eval/evaluate_cer.sh | https://drive.google.com/open?id=1BtQvAnsFvVi-dp_qsaFP7n4A_5cwnlR6 | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/m_ailabs/tts1/local/download.sh | http://www.caito.de/data/Training/stt_tts/${lang}.tgz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/mboshi_french/st1/local/data_prep.sh | https://github.com/besacier/mboshi-french-parallel-corpus | 下载工具脚本 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/mgb2/asr1/local/xml2stm.py | yzhang@qf.org.qa | 邮箱 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/mucs21_subtask1/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Hindi_train.tar.gz | 下载数据集 | -| 开源代码引入 | 
https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/mucs21_subtask1/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Marathi_train.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/mucs21_subtask1/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Odia_train.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/mucs21_subtask1/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Hindi_test.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/mucs21_subtask1/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Marathi_test.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/mucs21_subtask1/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Odia_test.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/mucs21_subtask2/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Hindi-English_train.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/mucs21_subtask2/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Bengali-English_train.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/mucs21_subtask2/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Hindi-English_test.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/mucs21_subtask2/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Bengali-English_test.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/must_c/st1/local/download_and_untar.sh | https://drive.google.com/open?id=1Mf2il_VelDIJMSio0bq7I8M9fSs-X4Ie | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/must_c/st1/local/download_and_untar.sh | https://drive.google.com/open?id=14d2ttsuEUFXsxx-KRWJMsFhQGrYOJcpH | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/must_c/st1/local/download_and_untar.sh | https://drive.google.com/open?id=1acIBqcPVX5QXXXV9u8_yDPtCgfsdEJDV | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/must_c/st1/local/download_and_untar.sh | https://drive.google.com/open?id=1qbK88SAKxqjMUybkMeIjrJWnNAZyE8V0 | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/must_c/st1/local/download_and_untar.sh | https://drive.google.com/open?id=11fNraDQs-LiODDxyV5ZW0Slf3XuDq5Cf | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/must_c/st1/local/download_and_untar.sh | https://drive.google.com/open?id=1C5qK1FckA702nsYcXwmGdzlMmHg1F_ot | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/must_c/st1/local/download_and_untar.sh | https://drive.google.com/open?id=1nbdYR5VqcTbLpOB-9cICKCgsLAs7fVzd | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | 
ESPnet_for_PyTorch/egs/must_c/st1/local/download_and_untar.sh | https://drive.google.com/open?id=1Z3hSiP7fsR3kf8fjQYzIa07jmw4KXNnw | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/must_c/st1/local/download_and_untar.sh | https://drive.google.com/file/d/1UBPNwFEVhIZCOEpu4hTqPji57XRg85UO | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/puebla_nahuatl/asr1/run.sh | https://www.openslr.org/resources/92/Puebla-Nahuatl-Manifest.tgz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/puebla_nahuatl/asr1/run.sh | https://www.openslr.org/resources/92/Sound-Files-Puebla-Nahuatl.tgz.part0 | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/puebla_nahuatl/st1/run.sh | https://www.openslr.org/resources/92/Puebla-Nahuatl-Manifest.tgz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/puebla_nahuatl/st1/run.sh | https://www.openslr.org/resources/92/Sound-Files-Puebla-Nahuatl.tgz.part0 | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/puebla_nahuatl/st1/run.sh | https://www.openslr.org/resources/92/SpeechTranslation_Nahuatl_Manifest.tgz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/reverb/asr1/local/download_se_eval_tool.sh | https://www.itu.int/rec/dologin_pub.asp?lang=e&id=T-REC-P.862-200102-I!!SOFT-ZST-E&type=items | 下载密钥 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/reverb/asr1/local/download_se_eval_tool.sh | https://reverb2014.dereverberation.com/tools/REVERB-SPEENHA.Release04Oct.zip | 下载工具脚本 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/reverb/asr1/local/download_se_eval_tool.sh | https://github.com/MuSAELab/SRMRToolbox.git | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/reverb/asr1/local/generate_data.sh | http://reverb2014.dereverberation.com/tools/reverb_tools_for_Generate_mcTrainData.tgz | 下载工具脚本 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/reverb/asr1/local/generate_data.sh | http://reverb2014.dereverberation.com/tools/REVERB_TOOLS_FOR_ASR_ver2.0.tgz | 下载工具脚本 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/reverb/asr1/local/generate_data.sh | http://reverb2014.dereverberation.com/tools/taskFiles_et.tgz | 下载工具脚本 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/reverb/asr1_multich/local/download_se_eval_tool.sh | https://www.itu.int/rec/dologin_pub.asp?lang=e&id=T-REC-P.862-200102-I!!SOFT-ZST-E&type=items | 下载密钥 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/reverb/asr1_multich/local/download_se_eval_tool.sh | https://reverb2014.dereverberation.com/tools/REVERB-SPEENHA.Release04Oct.zip | 下载工具脚本 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/reverb/asr1_multich/local/download_se_eval_tool.sh | http://reverb2014.dereverberation.com/tools/taskFiles_et.tgz | 下载工具脚本 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/reverb/asr1_multich/local/download_se_eval_tool.sh | https://github.com/MuSAELab/SRMRToolbox.git | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | 
ESPnet_for_PyTorch/egs/ru_open_stt/asr1/local/ru_open_stt_download_data.sh | https://raw.githubusercontent.com/snakers4/open_stt/4bff5470a29dcca5c7175fa3b6fd106c6151b756/${f} | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/ru_open_stt/asr1/local/ru_open_stt_download_data.sh | https://github.com/snakers4/open_stt/releases/download/v0.5-beta/public_exclude_file_v5.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/ru_open_stt/asr1/local/ru_open_stt_download_data.sh | https://github.com/snakers4/open_stt/files/3386441/exclude_df_youtube_1120.zip | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/swbd/asr1/local/swbd1_data_download.sh | http://www.openslr.org/resources/5/switchboard_word_alignments.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/swbd/asr1/local/swbd1_data_download.sh | http://www.isip.piconepress.com/projects/switchboard/releases/switchboard_word_alignments.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/tedlium2/align1/local/download_data.sh | http://www.openslr.org/resources/19/TEDLIUM_release2.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/tedlium2/align1/local/download_model.sh | https://drive.google.com/open?id=1UqIY6WJMZ4sxNxSugUqp3mrGb3j6h7xe | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/tedlium2/align1/local/download_model.sh | https://drive.google.com/open?id=1cac5Uc09lJrCYfWkLQsF8eapQcxZnYdf | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/tedlium2/align1/local/download_model.sh | https://drive.google.com/open?id=1cVeSOYY1twOfL9Gns7Z3ZDnkrJqNwPow | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/tedlium2/align1/local/download_model.sh | https://drive.google.com/open?id=1zcPglHAKILwVgfACoMWWERiyIquzSYuU | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/tedlium2/align1/local/download_model.sh | https://drive.google.com/open?id=1BtQvAnsFvVi-dp_qsaFP7n4A_5cwnlR6 | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/tedlium2/align1/local/download_model.sh | https://drive.google.com/open?id=17cOOSHHMKI82e1MXj4r2ig8gpGCRmG2p | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/tedlium2/align1/local/download_model.sh | https://drive.google.com/open?id=1tWccl6aYU67kbtkm8jv5H6xayqg1rzjh | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/tedlium2/align1/local/download_model.sh | https://drive.google.com/open?id=120nUQcSsKeY5dpyMWw_kI33ooMRGT2uF | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/tedlium2/align1/local/download_model.sh | https://drive.google.com/open?id=1Az-4H25uwnEFa4lENc-EKiPaWXaijcJp | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/tedlium2/align1/local/download_model.sh | https://drive.google.com/open?id=1jdEKbgWhLTxN_qP4xwE7mTOPmp7Ga--T | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/tedlium2/asr1/local/download_data.sh | http://www.openslr.org/resources/19/TEDLIUM_release2.tar.gz | 下载数据集 | -| 开源代码引入 | 
https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/tedlium3/asr1/local/download_data.sh | http://www.openslr.org/resources/51/TEDLIUM_release-3.tgz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/timit_ssc/ssr1/local/ssc_data_prepare.sh | ftp://ftp.espci.fr/pub/sigma/Features/${feat_dir}/ | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/timit_ssc/ssr1/local/ssc_data_prepare.sh | https://ftp.espci.fr/pub/sigma/TIMIT_training/TIMIT_Transcripts.txt | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/timit_ssc/ssr1/local/ssc_data_prepare.sh | https://ftp.espci.fr/pub/sigma/WSJ05K_Test/WSJ0_5K_Transcripts.txt | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/vais1000/tts1/local/download.sh | https://drive.google.com/open?id=1HHhLuYhrkk3J6OJctZvgaSd0UgiaROwG | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/vcc20/tts1_en_de/local/download.sh | http://data.solak.de/data/Training/stt_tts/${lang}.tgz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/vcc20/tts1_en_de/run.sh | http://kaldi-asr.org/models/8/0008_sitw_v2_1a.tar.gz | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/vcc20/tts1_en_fi/local/download.sh | http://data.solak.de/data/Training/stt_tts/${lang}.tgz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/vcc20/tts1_en_fi/run.sh | http://kaldi-asr.org/models/8/0008_sitw_v2_1a.tar.gz | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/vcc20/tts1_en_zh/local/download_csmsc.sh | https://weixinxcxdb.oss-cn-beijing.aliyuncs.com/gwYinPinKu/BZNSYP.rar | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/vcc20/tts1_en_zh/local/download_mailabs.sh | http://data.solak.de/data/Training/stt_tts/${lang}.tgz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/vcc20/tts1_en_zh/run.sh | http://kaldi-asr.org/models/8/0008_sitw_v2_1a.tar.gz | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/vcc20/vc1_task1/local/data_download.sh | https://github.com/nii-yamagishilab/VCC2020-database.git | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1Xj73mDPuuPH8GsyNO8GnOC3mn0_OK4g3 | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1UvtFkqdkE8bOCKWXlEltc746JsCKaTMX | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1XYpBZe9-9AgAxGpKfrgQPDjlW2S6duac | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1E6vzNaXT6r7Zybefat_p9ncnMOQCXtem | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=11qOvuMGP76BEe_pcPgYdqnWi05MIIYTA | 下载预训练模型 | -| 开源代码引入 | 
https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1y6IFgLMatjh9wspwu-oBba-rPOH1zKlS | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=16Q3XOAfI5tG0LZ0SIKE166N3RCtzO722 | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1EET2qhBi6nl0DH7UEg0Ez9SfgDX-cFM- | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1Vd4Qa8Dm9UQ-LZbyNPRiqoSgOZkGQsQi | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1bvyMfA-zKfO2LEogq-QXhHQeETxdBU29 | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1rA9ucA-VvhWkcFsGG6izBt2USOZY1_g6 | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1QfqwnTK0BKO0z_eYqltzL_MeqVGrMiZg | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1kWBYSkvaQ0-7CwOfjVaWQYF0vEm0rNyS | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=13xDOSo53BSQoF1kD27SdwXoAGqtjtIEM | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=11KKux-du6fvsMMB4jNk9YH23YUJjRcDV | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/vcc20/vc1_task1/local/pretrained_model_download.sh | https://drive.google.com/open?id=1li9DLZGnAheWZrB4oXGo0KWq-fHuFH_l | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/vcc20/vc1_task1/local/recognize.sh | https://drive.google.com/open?id=1BtQvAnsFvVi-dp_qsaFP7n4A_5cwnlR6 | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/vcc20/vc1_task1/run.sh | http://kaldi-asr.org/models/8/0008_sitw_v2_1a.tar.gz | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/vcc20/vc1_task2/run.sh | http://kaldi-asr.org/models/8/0008_sitw_v2_1a.tar.gz | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/vivos/asr1/run.sh | https://ailab.hcmus.edu.vn/assets/vivos.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/voxforge/asr1/local/getdata.sh | http://www.repository.voxforge1.org/downloads/SpeechCorpus/Trunk/Audio/Main/16kHz_16bit | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/voxforge/asr1/local/getdata.sh | http://www.repository.voxforge1.org/downloads/Dutch/Trunk/Audio/Main/16kHz_16bit | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 
| ESPnet_for_PyTorch/egs/voxforge/asr1/local/getdata.sh | http://www.repository.voxforge1.org/downloads/Russian/Trunk/Audio/Main/16kHz_16bit | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/voxforge/asr1/local/getdata.sh | http://www.repository.voxforge1.org/downloads/$lang/Trunk/Audio/Main/16kHz_16bit | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/wsj/asr1/local/wsj_data_prep.sh | https://catalog.ldc.upenn.edu/docs/LDC93S6A/wsj0-train-spkrinfo.txt | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/wsj/asr1/local/wsj_data_prep.sh | https://sourceforge.net/projects/kaldi/files/wsj0-train-spkrinfo.txt | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/wsj_mix/asr1/local/wsj_data_prep.sh | https://catalog.ldc.upenn.edu/docs/LDC93S6A/wsj0-train-spkrinfo.txt | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/wsj_mix/asr1/local/wsj_data_prep.sh | https://sourceforge.net/projects/kaldi/files/wsj0-train-spkrinfo.txt | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/wsj_mix/asr1/local/wsj0_create_mixture.sh | http://www.merl.com/demos/deep-clustering/create-speaker-mixtures.zip | 下载工具脚本 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/yesno/asr1/run.sh | http://www.openslr.org/resources/1/waves_yesno.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/yesno/tts1/run.sh | http://www.openslr.org/resources/1/waves_yesno.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/yoloxochitl_mixtec/asr1/run.sh | http://www.openslr.org/resources/89/Yoloxochitl-Mixtec-Data.tgz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/egs/yoloxochitl_mixtec/asr1/run.sh | http://www.openslr.org/resources/89/Yoloxochitl-Mixtec-Manifest.tgz | 下载数据集 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/setup.py | http://github.com/espnet/espnet | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/setup.py | shinjiw@ieee.org | 邮箱 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/tools/installers/install_chainer.sh | https://github.com/chainer/chainer | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/tools/installers/install_chainer_ctc.shh | https://github.com/jheymann85/chainer_ctc.git | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/tools/installers/install_fairseq.sh | https://github.com/pytorch/fairseq.git | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/tools/installers/install_k2.sh | https://k2-fsa.org/nightly/ | 下载依赖 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/tools/installers/install_kenlm.sh | https://github.com/kpu/kenlm.git | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/tools/installers/install_mwerSegmenter.sh | https://www-i6.informatik.rwth-aachen.de/web/Software/mwerSegmenter.tar.gz | 下载第三方库 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/tools/installers/install_nkf.sh | https://ja.osdn.net/dl/nkf/nkf-2.1.4.tar.gz | 下载第三方库 | -| 开源代码引入 | 
https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/tools/installers/install_pesq.sh | http://www.itu.int/rec/dologin_pub.asp?lang=e&id=T-REC-P.862-200511-I!Amd2!SOFT-ZST-E&type=items | 下载第三方库 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/tools/installers/install_phonemizer.sh | https://github.com/festvox/speech_tools.git | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/tools/installers/install_phonemizer.sh | https://github.com/festvox/festival.git | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/tools/installers/install_phonemizer.sh | https://github.com/espeak-ng/espeak-ng.git | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/tools/installers/install_phonemizer.sh | https://github.com/numediart/MBROLA.git | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/tools/installers/install_py3mmseg.sh | https://github.com/kamo-naoyuki/py3mmseg | 下载工具脚本 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/tools/installers/install_s3prl.sh | https://github.com/s3prl/s3prl.git | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/tools/installers/install_sctk.sh | https://github.com/espnet/kaldi-bin/releases/download/v0.0.2/sctk-2.4.10-20151007-1312Z.tar.bz2 | 下载依赖 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/tools/installers/install_sctk.sh | http://www.openslr.org/resources/4/sctk-2.4.10-20151007-1312Z.tar.bz2 | 下载依赖 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/tools/installers/install_sctk.sh | ftp://jaguar.ncsl.nist.gov/pub/sctk-2.4.10-20151007-1312Z.tar.bz2 | 下载依赖 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/tools/installers/install_sph2pipe.sh | https://github.com/espnet/kaldi-bin/releases/download/v0.0.2/sph2pipe_v2.5.tar.gz | 下载依赖 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/tools/installers/install_sph2pipe.sh | http://www.openslr.org/resources/3/sph2pipe_v2.5.tar.gz | 下载依赖 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/tools/installers/install_sph2pipe.sh | https://sourceforge.net/projects/kaldi/files/sph2pipe_v2.5.tar.gz | 下载依赖 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/tools/installers/install_tdmelodic_pyopenjtalk.sh | https://github.com/sarulab-speech/tdmelodic_openjtalk.git | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/tools/installers/install_tdmelodic_pyopenjtalk.sh | https://github.com/r9y9/pyopenjtalk.git | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/tools/installers/install_torch.sh | https://download.pytorch.org/whl/torch_stable.html | 下载第三方库 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/tools/installers/install_warp-ctc.sh | https://github.com/espnet/warp-ctc/releases/tag/v${warpctc_version} | 下载依赖 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/tools/installers/install_warp-ctc.sh | https://github.com/espnet/warp-ctc.git | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/tools/installers/install_warp-transducer.sh | https://github.com/b-flo/warp-transducer.git | 下载代码 | -| 开源代码引入 | 
https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/tools/Makefile | https://github.com/kaldi-asr/kaldi | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/tools/Makefile | https://github.com/moses-smt/mosesdecoder.git | 下载代码 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/tools/setup_anaconda.sh | https://repo.continuum.io/miniconda/Miniconda3-latest-Linux-x86_64.sh | 下载工具脚本 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/utils/asr_align_wav.sh | https://drive.google.com/open?id=1UqIY6WJMZ4sxNxSugUqp3mrGb3j6h7xe | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/utils/asr_align_wav.sh | https://drive.google.com/open?id=1cac5Uc09lJrCYfWkLQsF8eapQcxZnYdf | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/utils/asr_align_wav.sh | https://drive.google.com/open?id=1cVeSOYY1twOfL9Gns7Z3ZDnkrJqNwPow | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/utils/asr_align_wav.sh | https://drive.google.com/open?id=1zcPglHAKILwVgfACoMWWERiyIquzSYuU | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/utils/asr_align_wav.sh | https://drive.google.com/open?id=1BtQvAnsFvVi-dp_qsaFP7n4A_5cwnlR6 | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/utils/asr_align_wav.sh | https://drive.google.com/open?id=17cOOSHHMKI82e1MXj4r2ig8gpGCRmG2p | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/utils/asr_align_wav.sh | https://drive.google.com/open?id=1tWccl6aYU67kbtkm8jv5H6xayqg1rzjh | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/utils/asr_align_wav.sh | https://drive.google.com/open?id=120nUQcSsKeY5dpyMWw_kI33ooMRGT2uF | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/utils/asr_align_wav.sh | https://drive.google.com/open?id=1ALvD4nHan9VDJlYJwNurVr7H7OV0j2X9 | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/utils/asr_align_wav.sh | https://drive.google.com/open?id=1Az-4H25uwnEFa4lENc-EKiPaWXaijcJp | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/utils/asr_align_wav.sh | https://drive.google.com/open?id=1jdEKbgWhLTxN_qP4xwE7mTOPmp7Ga--T | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/utils/gdown.pl | https://docs.google.com/uc?id=$1&export=download | 下载依赖 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/utils/gdown.pl | https://docs.google.com | 前置网址 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/utils/recog_wav.sh | https://drive.google.com/open?id=1UqIY6WJMZ4sxNxSugUqp3mrGb3j6h7xe | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/utils/recog_wav.sh | https://drive.google.com/open?id=1cac5Uc09lJrCYfWkLQsF8eapQcxZnYdf | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/utils/recog_wav.sh | https://drive.google.com/open?id=1cVeSOYY1twOfL9Gns7Z3ZDnkrJqNwPow | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/utils/recog_wav.sh | https://drive.google.com/open?id=1zcPglHAKILwVgfACoMWWERiyIquzSYuU | 下载预训练模型 | -| 开源代码引入 | 
https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/utils/recog_wav.sh | https://drive.google.com/open?id=1BtQvAnsFvVi-dp_qsaFP7n4A_5cwnlR6 | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/utils/recog_wav.sh | https://drive.google.com/open?id=17cOOSHHMKI82e1MXj4r2ig8gpGCRmG2p | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/utils/recog_wav.sh | https://drive.google.com/open?id=1tWccl6aYU67kbtkm8jv5H6xayqg1rzjh | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/utils/recog_wav.sh | https://drive.google.com/open?id=120nUQcSsKeY5dpyMWw_kI33ooMRGT2uF | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1dKzdaDpOkpx7kWZnvrvx2De7eZEdPHZs | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=11T9qw8rJlYzUdXvFjkjQjYrp3iGfQ15h | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1hiZn14ITUDM1nkn-GkaN_M3oaTOUcn1n | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=13DR-RB5wrbMqBGx_MC655VZlsEq52DyS | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1xxAwPuUph23RnlC5gym7qDM02ZCW9Unp | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1M_w7nxI6AfbtSHpMO-exILnAc_aUYvXP | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=17RUNFLP4SSTbGA01xWRJo7RkR876xM0i | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1zD-2GMrWM3thaDpS3h3rkTU4jIC0wc5B | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1W86YEQ6KbuUTIvVURLqKtSNqe_eI2GDN | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1iAXwC0AuWusa9AcFeUVkcNLG0I-hnSr3 | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1Xj73mDPuuPH8GsyNO8GnOC3mn0_OK4g3 | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1mEnZfBKqA4eT6Bn0eRZuP6lNzL-IL3VD | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1kp5M4VvmagDmYckFJa78WGqh1drb_P9t | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1bTSygvonv5TS6-iuYsOIUWpN2atGnyhZ | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1T8thxkAxjGFPXPWPTcKLvHnd6lG0-82R | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/utils/synth_wav.sh | 
https://drive.google.com/open?id=1eA1VcRS9jzFa-DovyTgJLQ_jmwOLIi8L | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1sY7gEUg39QaO1szuN62-Llst9TrFno2t | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1tv9GKyRT4CDsvUWKwH3s_OfXkiTi0gw7 | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1jHUUmQFjWiQGyDd7ZeiCThSjjpbF_B4h | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=187xvyNbmJVZ0EZ1XHCdyjZHTXK9EcfkK | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1OwrUQzAmvjj1x9cDhnZPp6dqtsEqGEJM | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=1PsjFRV5eUP0HHwBaRYya9smKy5ghXKzj | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/utils/synth_wav.sh | https://drive.google.com/open?id=10M6H88jEUGbRWBmU1Ff2VaTmOAeL8CEy | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/utils/synth_wav.sh | http://kaldi-asr.org/models/8/0008_sitw_v2_1a.tar.gz | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/utils/synth_wav.sh | https://github.com/r9y9/wavenet_vocoder "${MDN_WAVENET_VOC_DIR}" | 下载依赖 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/utils/translate_wav.sh | https://drive.google.com/open?id=1wFIAqxoBUioTKTLRLv29KzvphkUm3qdo | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | ESPnet_for_PyTorch/utils/translate_wav.sh | https://drive.google.com/open?id=1hawp5ZLw4_SIHIT3edglxbKIIkPVe8n3 | 下载预训练模型 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/utils/synth_wav.sh | ESPnet_for_PyTorch/utils/synth_wav.sh | https://github.com/espnet/espnet#tts-demo | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/README.md | ESPnet_for_PyTorch/utils/synth_wav.sh | https://colab.research.google.com/github/espnet/notebook/blob/master/tts_realtime_demo.ipynb | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/README.md | ESPnet_for_PyTorch/utils/synth_wav.sh | https://github.com/r9y9/wavenet_vocoder | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/utils/spm_encode | ESPnet_for_PyTorch/utils/spm_train | https://github.com/pytorch/fairseq/blob/master/LICENSE | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/utils/spm_encode | ESPnet_for_PyTorch/utils/spm_encode | https://github.com/pytorch/fairseq/blob/master/LICENSE | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/utils/spm_encode | ESPnet_for_PyTorch/utils/spm_decode | https://github.com/pytorch/fairseq/blob/master/LICENSE | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/utils/pack_model.sh | ESPnet_for_PyTorch/utils/pack_model.sh | shinjiw@ieee.org | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/utils/json2trn.py | ESPnet_for_PyTorch/utils/json2trn_wo_dict.py | https://github.com/espnet/espnet/issues/993 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/utils/json2trn.py | 
ESPnet_for_PyTorch/utils/json2trn.py | https://github.com/espnet/espnet/issues/993 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/wavenet.py | ESPnet_for_PyTorch/utils/generate_wav_from_fbank.py | https://github.com/kan-bayashi/PytorchWaveNetVocoder | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/utils/generate_wav_from_fbank.py | ESPnet_for_PyTorch/utils/generate_wav_from_fbank.py | https://ieeexplore.ieee.org/abstract/document/8461332 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/utils/gdown.pl | ESPnet_for_PyTorch/utils/gdown.pl | https://docs.google.com/uc?id= | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/utils/eval-source-separation.py | ESPnet_for_PyTorch/utils/eval-source-separation.py | https://ieeexplore.ieee.org/document/5495701 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/utils/eval-source-separation.py | ESPnet_for_PyTorch/utils/eval-source-separation.py | http://www.itu.int/rec/dologin_pub.asp?lang=e&id=T-REC-P.862-200511-I!Amd2!SOFT-ZST-E&type=items | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/utils/eval-source-separation.py | ESPnet_for_PyTorch/utils/eval-source-separation.py | https://ieeexplore.ieee.org/document/941023 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/utils/eval-source-separation.py | ESPnet_for_PyTorch/utils/eval-source-separation.py | https://arxiv.org/abs/1804.06267 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/utils/download_from_google_drive.sh | ESPnet_for_PyTorch/utils/download_from_google_drive.sh | https://drive.google.com/open?id=1zF88bRNbJhw9hNBq3NrDg8vnGGibREmg | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/utils/download_from_google_drive.sh | ESPnet_for_PyTorch/utils/download_from_google_drive.sh | https://qiita.com/namakemono/items/c963e75e0af3f7eed732 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/utils/download_from_google_drive.sh | ESPnet_for_PyTorch/utils/download_from_google_drive.sh | https://github.com/wkentaro/gdown | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/tools/installers/install_warp-ctc.sh | ESPnet_for_PyTorch/tools/installers/install_warp-ctc.sh | https://github.com/espnet/warp-ctc/releases/tag/v | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/tools/installers/install_torch.sh | ESPnet_for_PyTorch/tools/installers/install_torch.sh | https://anaconda.org/anaconda/cudatoolkit/files | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/tools/installers/install_torch.sh | ESPnet_for_PyTorch/tools/installers/install_torch.sh | https://anaconda.org/nvidia/cudatoolkit/files | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/tools/installers/install_torch.sh | ESPnet_for_PyTorch/tools/installers/install_torch.sh | https://anaconda.org/conda-forge/cudatoolkit/files | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/tools/installers/install_torch.sh | ESPnet_for_PyTorch/tools/installers/install_torch.sh | https://anaconda.org/pytorch/pytorch/files | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/tools/installers/install_k2.sh | ESPnet_for_PyTorch/tools/installers/install_k2.sh | https://k2-fsa.org/nightl | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/tools/installers/install_chainer_ctc.sh | ESPnet_for_PyTorch/tools/installers/install_chainer_ctc.sh | 
https://github.com/jheymann85/chainer_ctc.git | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/tools/installers/install_chainer.sh | ESPnet_for_PyTorch/tools/installers/install_chainer.sh | https://github.com/pypa/setuptools/issues/855 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet_for_PyTorch/espnet/vc/pytorch_backend/vc.py | https://arxiv.org/abs/1905.09263 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/utils/spec_augment.py | ESPnet_for_PyTorch/espnet/utils/spec_augment.py | https://github.com/zcaceres/spec_augment | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/utils/spec_augment.py | ESPnet_for_PyTorch/espnet/utils/spec_augment.py | https://arxiv.org/pdf/1904.08779.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/utils/spec_augment.py | ESPnet_for_PyTorch/espnet/utils/spec_augment.py | https://en.wikipedia.org/wiki/Polyharmonic_spline | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/utils/deterministic_utils.py | ESPnet_for_PyTorch/espnet/utils/deterministic_utils.py | https://github.com/pytorch/pytorch/issues/6351 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet_for_PyTorch/espnet/tts/pytorch_backend/tts.py | https://arxiv.org/abs/1905.09263 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/utils/spec_augment.py | ESPnet_for_PyTorch/espnet/transform/spec_augment.py | https://arxiv.org/pdf/1904.08779.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/transform/perturb.py | ESPnet_for_PyTorch/espnet/transform/perturb.py | https://groups.google.com/forum/#!topic/kaldi-help/8OOG7eE4sZ8 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/transform/perturb.py | ESPnet_for_PyTorch/espnet/transform/perturb.py | http://spandh.dcs.shef.ac.uk/chime_workshop/papers/CHiME_2018_paper_kanda.pdf | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/mt/pytorch_backend/mt.py | ESPnet_for_PyTorch/espnet/st/pytorch_backend/st.py | https://github.com/NVIDIA/apex#linux | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/scheduler/scheduler.py | ESPnet_for_PyTorch/espnet/scheduler/scheduler.py | https://openreview.net/pdf?id=BJYwwY9ll | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/scheduler/scheduler.py | ESPnet_for_PyTorch/espnet/scheduler/scheduler.py | https://arxiv.org/pdf/1608.03983.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/scheduler/scheduler.py | ESPnet_for_PyTorch/espnet/scheduler/scheduler.py | https://github.com/NVIDIA/Megatron-LM | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/batch_beam_search_online.py | ESPnet_for_PyTorch/espnet/nets/scorers/ctc.py | https://arxiv.org/abs/2006.14941 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/wavenet.py | ESPnet_for_PyTorch/espnet/nets/pytorch_backend/wavenet.py | https://github.com/kan-bayashi/PytorchWaveNetVocoder | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/wavenet.py | ESPnet_for_PyTorch/espnet/nets/pytorch_backend/wavenet.py | https://arxiv.org/abs/1611.09482 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/fastspeech/length_regulator.py | 
ESPnet_for_PyTorch/espnet/nets/pytorch_backend/transformer/multi_layer_conv.py | https://arxiv.org/pdf/1905.09263.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/transformer/dynamic_conv2d.py | ESPnet_for_PyTorch/espnet/nets/pytorch_backend/transformer/lightconv2d.py | https://github.com/pytorch/fairseq/tree/master/fairseq | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/transformer/dynamic_conv2d.py | ESPnet_for_PyTorch/espnet/nets/pytorch_backend/transformer/lightconv.py | https://github.com/pytorch/fairseq/tree/master/fairseq | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/transformer/encoder.py | ESPnet_for_PyTorch/espnet/nets/pytorch_backend/transformer/encoder.py | https://github.com/espnet/espnet/commit/21d70286c354c66c0350e65dc098d2ee236faccc#diff-bffb1396f038b317b2b64dd96e6d3563 | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/transformer/decoder.py | ESPnet_for_PyTorch/espnet/nets/pytorch_backend/transformer/encoder.py | https://github.com/espnet/espnet/commit/3d422f6de8d4f03673b89e1caef698745ec749ea#diff-bffb1396f038b317b2b64dd96e6d3563 | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet_for_PyTorch/espnet/nets/pytorch_backend/transformer/embedding.py | https://arxiv.org/abs/1809.08895 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/transformer/attention.py | ESPnet_for_PyTorch/espnet/nets/pytorch_backend/transformer/embedding.py | https://github.com/espnet/espnet/pull/2816 | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/transformer/attention.py | ESPnet_for_PyTorch/espnet/nets/pytorch_backend/transformer/embedding.py | https://arxiv.org/abs/1901.02860 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/transformer/dynamic_conv2d.py | ESPnet_for_PyTorch/espnet/nets/pytorch_backend/transformer/dynamic_conv2d.py | https://github.com/pytorch/fairseq/tree/master/fairseq | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/transformer/dynamic_conv2d.py | ESPnet_for_PyTorch/espnet/nets/pytorch_backend/transformer/dynamic_conv.py | https://github.com/pytorch/fairseq/tree/master/fairseq | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/transformer/decoder.py | ESPnet_for_PyTorch/espnet/nets/pytorch_backend/transformer/decoder.py | https://github.com/espnet/espnet/commit/3d422f6de8d4f03673b89e1caef698745ec749ea#diff-bffb1396f038b317b2b64dd96e6d3563 | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/transformer/attention.py | ESPnet_for_PyTorch/espnet/nets/pytorch_backend/transformer/attention.py | https://github.com/espnet/espnet/pull/2816 | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/transformer/attention.py | ESPnet_for_PyTorch/espnet/nets/pytorch_backend/transformer/attention.py | https://arxiv.org/abs/1901.02860 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/transformer/argument.py | ESPnet_for_PyTorch/espnet/nets/pytorch_backend/transformer/argument.py | https://arxiv.org/abs/1912.11793v2 | 参考论文地址 | -| 开源代码引入 | 
https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/transformer/argument.py | ESPnet_for_PyTorch/espnet/nets/pytorch_backend/transformer/argument.py | https://arxiv.org/abs/1901.10430 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/transducer/arguments.py | ESPnet_for_PyTorch/espnet/nets/pytorch_backend/transducer/arguments.py | https://arxiv.org/abs/2010.11148 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet_for_PyTorch/espnet/nets/pytorch_backend/tacotron2/encoder.py | https://arxiv.org/abs/1712.05884 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/tacotron2/decoder.py | ESPnet_for_PyTorch/espnet/nets/pytorch_backend/tacotron2/decoder.py | https://arxiv.org/abs/1606.01305 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/tacotron2/decoder.py | ESPnet_for_PyTorch/espnet/nets/pytorch_backend/tacotron2/decoder.py | https://github.com/eladhoffer/seq2seq.pytorch | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet_for_PyTorch/espnet/nets/pytorch_backend/tacotron2/decoder.py | https://arxiv.org/abs/1712.05884 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/rnn/attentions.py | ESPnet_for_PyTorch/espnet/nets/pytorch_backend/tacotron2/decoder.py | https://arxiv.org/abs/1710.07654 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/tacotron2/cbhg.py | ESPnet_for_PyTorch/espnet/nets/pytorch_backend/tacotron2/cbhg.py | https://arxiv.org/abs/1703.10135 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/tacotron2/cbhg.py | ESPnet_for_PyTorch/espnet/nets/pytorch_backend/tacotron2/cbhg.py | https://github.com/pytorch/pytorch/pull/6327 | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/tacotron2/cbhg.py | ESPnet_for_PyTorch/espnet/nets/pytorch_backend/tacotron2/cbhg.py | https://arxiv.org/abs/1505.00387 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/rnn/decoders.py | ESPnet_for_PyTorch/espnet/nets/pytorch_backend/rnn/decoders.py | https://arxiv.org/pdf/1409.2329.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/rnn/attentions.py | ESPnet_for_PyTorch/espnet/nets/pytorch_backend/rnn/attentions.py | https://arxiv.org/abs/1710.07654 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/rnn/attentions.py | ESPnet_for_PyTorch/espnet/nets/pytorch_backend/rnn/attentions.py | https://arxiv.org/pdf/1506.07503.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/rnn/attentions.py | ESPnet_for_PyTorch/espnet/nets/pytorch_backend/rnn/attentions.py | https://arxiv.org/abs/1704.04368 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/rnn/attentions.py | ESPnet_for_PyTorch/espnet/nets/pytorch_backend/rnn/attentions.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/rnn/attentions.py | ESPnet_for_PyTorch/espnet/nets/pytorch_backend/rnn/attentions.py | https://arxiv.org/pdf/1807.06736.pdf | 参考论文地址 | -| 开源代码引入 | 
https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/lm/seq_rnn.py | ESPnet_for_PyTorch/espnet/nets/pytorch_backend/lm/seq_rnn.py | https://github.com/pytorch/examples/blob/4581968193699de14b56527296262dd76ab43557/word_language_model/model.py | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/lm/seq_rnn.py | ESPnet_for_PyTorch/espnet/nets/pytorch_backend/lm/seq_rnn.py | https://arxiv.org/abs/1608.05859 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/lm/seq_rnn.py | ESPnet_for_PyTorch/espnet/nets/pytorch_backend/lm/seq_rnn.py | https://arxiv.org/abs/1611.01462 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/lm/default.py | ESPnet_for_PyTorch/espnet/nets/pytorch_backend/lm/default.py | https://github.com/espnet/espnet/issues/1075 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/gtn_ctc.py | ESPnet_for_PyTorch/espnet/nets/pytorch_backend/gtn_ctc.py | https://github.com/facebookresearch/gtn_applications/blob/master/utils.py#L251 | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/frontends/dnn_beamformer.py | ESPnet_for_PyTorch/espnet/nets/pytorch_backend/frontends/dnn_beamformer.py | https://arxiv.org/abs/1703.04783 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/frontends/beamformer.py | ESPnet_for_PyTorch/espnet/nets/pytorch_backend/frontends/beamformer.py | https://ieeexplore.ieee.org/document/5089420 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/fastspeech/length_regulator.py | ESPnet_for_PyTorch/espnet/nets/pytorch_backend/fastspeech/length_regulator.py | https://arxiv.org/pdf/1905.09263.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/fastspeech/length_regulator.py | ESPnet_for_PyTorch/espnet/nets/pytorch_backend/fastspeech/duration_predictor.py | https://arxiv.org/pdf/1905.09263.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/e2e_vc_transformer.py | ESPnet_for_PyTorch/espnet/nets/pytorch_backend/e2e_vc_transformer.py | https://arxiv.org/pdf/1912.06813.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/e2e_tts_transformer.py | ESPnet_for_PyTorch/espnet/nets/pytorch_backend/e2e_tts_transformer.py | https://arxiv.org/pdf/1809.08895.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/e2e_tts_tacotron2.py | ESPnet_for_PyTorch/espnet/nets/pytorch_backend/e2e_tts_tacotron2.py | https://arxiv.org/abs/1710.08969 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet_for_PyTorch/espnet/nets/pytorch_backend/e2e_tts_tacotron2.py | https://arxiv.org/abs/1712.05884 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/fastspeech/length_regulator.py | ESPnet_for_PyTorch/espnet/nets/pytorch_backend/e2e_tts_fastspeech.py | https://arxiv.org/pdf/1905.09263.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet_for_PyTorch/espnet/nets/pytorch_backend/e2e_st_conformer.py | https://arxiv.org/abs/2005.08100 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/e2e_st.py | 
ESPnet_for_PyTorch/espnet/nets/pytorch_backend/e2e_st.py | https://discuss.pytorch.org/t/set-forget-gate-bias-of-lstm/1745 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/e2e_asr_mulenc.py | ESPnet_for_PyTorch/espnet/nets/pytorch_backend/e2e_asr_mulenc.py | https://arxiv.org/pdf/1811.04903.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/e2e_st.py | ESPnet_for_PyTorch/espnet/nets/pytorch_backend/e2e_asr_mulenc.py | https://discuss.pytorch.org/t/set-forget-gate-bias-of-lstm/1745 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/e2e_asr_mix_transformer.py | ESPnet_for_PyTorch/espnet/nets/pytorch_backend/e2e_asr_mix_transformer.py | https://arxiv.org/pdf/2002.03921.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/e2e_st.py | ESPnet_for_PyTorch/espnet/nets/pytorch_backend/e2e_asr_mix.py | https://discuss.pytorch.org/t/set-forget-gate-bias-of-lstm/1745 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/bin/asr_recog.py | ESPnet_for_PyTorch/espnet/nets/pytorch_backend/e2e_asr_maskctc.py | https://arxiv.org/abs/2005.08700 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet_for_PyTorch/espnet/nets/pytorch_backend/e2e_asr_conformer.py | https://arxiv.org/abs/2005.08100 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/e2e_st.py | ESPnet_for_PyTorch/espnet/nets/pytorch_backend/e2e_asr.py | https://discuss.pytorch.org/t/set-forget-gate-bias-of-lstm/1745 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/ctc.py | ESPnet_for_PyTorch/espnet/nets/pytorch_backend/ctc.py | https://github.com/pytorch/pytorch/issues/17798 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/transformer/attention.py | ESPnet_for_PyTorch/espnet/nets/pytorch_backend/conformer/argument.py | https://github.com/espnet/espnet/pull/2816 | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/chainer_backend/rnn/training.py | ESPnet_for_PyTorch/espnet/nets/chainer_backend/transformer/training.py | https://github.com/chainer/chainer/blob/master/chainer/optimizer.py | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/chainer_backend/rnn/training.py | ESPnet_for_PyTorch/espnet/nets/chainer_backend/rnn/training.py | https://github.com/chainer/chainer/blob/master/chainer/optimizer.py | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/beam_search_transducer.py | ESPnet_for_PyTorch/espnet/nets/beam_search_transducer.py | https://arxiv.org/pdf/1211.3711.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/beam_search_transducer.py | ESPnet_for_PyTorch/espnet/nets/beam_search_transducer.py | https://ieeexplore.ieee.org/document/9053040 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/beam_search_transducer.py | ESPnet_for_PyTorch/espnet/nets/beam_search_transducer.py | https://arxiv.org/pdf/2002.03577.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/beam_search_transducer.py | ESPnet_for_PyTorch/espnet/nets/beam_search_transducer.py | https://ieeexplore.ieee.org/document/9250505 | 模型相关说明 | -| 开源代码引入 | 
https://github.com/espnet/espnet/blob/master/espnet/nets/batch_beam_search_online.py | ESPnet_for_PyTorch/espnet/nets/batch_beam_search_online_sim.py | https://arxiv.org/abs/2006.14941 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/batch_beam_search_online.py | ESPnet_for_PyTorch/espnet/nets/batch_beam_search_online.py | https://arxiv.org/abs/2006.14941 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/batch_beam_search.py | ESPnet_for_PyTorch/espnet/nets/batch_beam_search.py | https://github.com/espnet/espnet/pull/1402#discussion_r354561029 | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/mt/pytorch_backend/mt.py | ESPnet_for_PyTorch/espnet/mt/pytorch_backend/mt.py | https://github.com/NVIDIA/apex#linux | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/lm/pytorch_backend/lm.py | ESPnet_for_PyTorch/espnet/lm/pytorch_backend/lm.py | https://github.com/chainer/chainer/blob/master/examples/ptb/train_ptb_custom_loop.py | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/mt/pytorch_backend/mt.py | ESPnet_for_PyTorch/espnet/lm/pytorch_backend/lm.py | https://github.com/NVIDIA/apex#linux | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/lm/pytorch_backend/lm.py | ESPnet_for_PyTorch/espnet/lm/lm_utils.py | https://github.com/chainer/chainer/blob/master/examples/ptb/train_ptb_custom_loop.py | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/lm/lm_utils.py | ESPnet_for_PyTorch/espnet/lm/lm_utils.py | http://docs.h5py.org/en/stable/special.html#arbitrary-vlen-data | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/lm/pytorch_backend/lm.py | ESPnet_for_PyTorch/espnet/lm/chainer_backend/lm.py | https://github.com/chainer/chainer/blob/master/examples/ptb/train_ptb_custom_loop.py | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/bin/st_train.py | ESPnet_for_PyTorch/espnet/bin/st_train.py | https://nvidia.github.io/apex/amp.html#opt-levels | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/bin/st_train.py | ESPnet_for_PyTorch/espnet/bin/st_train.py | https://github.com/pytorch/pytorch/issues/21108 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/bin/st_train.py | ESPnet_for_PyTorch/espnet/bin/mt_train.py | https://nvidia.github.io/apex/amp.html#opt-levels | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/bin/st_train.py | ESPnet_for_PyTorch/espnet/bin/mt_train.py | https://github.com/pytorch/pytorch/issues/21108 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/lm/pytorch_backend/lm.py | ESPnet_for_PyTorch/espnet/bin/lm_train.py | https://github.com/chainer/chainer/blob/master/examples/ptb/train_ptb_custom_loop.py | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/bin/st_train.py | ESPnet_for_PyTorch/espnet/bin/lm_train.py | https://nvidia.github.io/apex/amp.html#opt-levels | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/bin/st_train.py | ESPnet_for_PyTorch/espnet/bin/asr_train.py | https://nvidia.github.io/apex/amp.html#opt-levels | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/bin/st_train.py | ESPnet_for_PyTorch/espnet/bin/asr_train.py | https://github.com/pytorch/pytorch/issues/21108 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/bin/asr_recog.py | 
ESPnet_for_PyTorch/espnet/bin/asr_recog.py | https://arxiv.org/abs/2005.08700 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/asr/pytorch_backend/recog.py | ESPnet_for_PyTorch/espnet/asr/pytorch_backend/recog.py | https://github.com/espnet/espnet/pull/3616 | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/mt/pytorch_backend/mt.py | ESPnet_for_PyTorch/espnet/asr/pytorch_backend/asr_mix.py | https://github.com/NVIDIA/apex#linux | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/asr/pytorch_backend/asr.py | ESPnet_for_PyTorch/espnet/asr/pytorch_backend/asr.py | https://github.com/espnet/espnet/pull/1388 | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/asr/pytorch_backend/asr.py | ESPnet_for_PyTorch/espnet/asr/pytorch_backend/asr.py | https://github.com/espnet/espnet/issues/777 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/mt/pytorch_backend/mt.py | ESPnet_for_PyTorch/espnet/asr/pytorch_backend/asr.py | https://github.com/NVIDIA/apex#linux | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/asr/pytorch_backend/asr.py | ESPnet_for_PyTorch/espnet/asr/pytorch_backend/asr.py | https://github.com/espnet/espnet/pull/2171 | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/asr/pytorch_backend/asr.py | ESPnet_for_PyTorch/espnet/asr/pytorch_backend/asr.py | https://github.com/pytorch/pytorch/issues/27963 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/thchs30/tts1/local/download_and_untar.sh | ESPnet_for_PyTorch/egs/yoloxochitl_mixtec/asr1/local/download_and_untar.sh | https://common-voice-data-download.s3.amazonaws.com/cv_corpus_v1.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/yoloxochitl_mixtec/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/yesno/asr1/run.sh | ESPnet_for_PyTorch/egs/yesno/tts1/run.sh | http://sourceforge.net/projects/kaldi/files/waves_yesno.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/yesno/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/yesno/asr1/run.sh | ESPnet_for_PyTorch/egs/yesno/asr1/run.sh | http://sourceforge.net/projects/kaldi/files/waves_yesno.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/yesno/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/wsj_mix/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/wsj/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/open_li52/asr1/local/getdata.sh | ESPnet_for_PyTorch/egs/voxforge/asr1/local/getdata.sh | http://www.repository.voxforge1.org/downloads/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/voxforge/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/vivos/asr1/cmd.sh 
| http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/vcc20/voc1/local/make_subset_data.sh | ESPnet_for_PyTorch/egs/vcc20/voc1/local/make_subset_data.sh | https://opensource.org/licenses/MIT | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/vcc20/voc1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet_for_PyTorch/egs/vcc20/vc1_task2/local/clean_text_mandarin.py | https://github.com/Kyubyong/g2p | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet_for_PyTorch/egs/vcc20/vc1_task2/local/clean_text_german.py | https://github.com/Kyubyong/g2p | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet_for_PyTorch/egs/vcc20/vc1_task2/local/clean_text_finnish.py | https://github.com/Kyubyong/g2p | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/vcc20/vc1_task2/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/vcc20/README.md | ESPnet_for_PyTorch/egs/vcc20/vc1_task1/local/recognize.sh | https://github.com/espnet/espnet/blob/master/egs/librispeech/asr1/RESULTS.md#pytorch-large-transformer-with-specaug-4-gpus--large-lstm-lm | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet_for_PyTorch/egs/vcc20/vc1_task1/local/clean_text_asr_result.py | https://github.com/Kyubyong/g2p | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/vcc20/vc1_task1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/vcc20/tts1_en_de/local/download.sh | ESPnet_for_PyTorch/egs/vcc20/tts1_en_zh/local/download_mailabs.sh | http://data.solak.de/data/Training/stt_tts/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet_for_PyTorch/egs/vcc20/tts1_en_zh/local/clean_text_mailabs.py | https://github.com/Kyubyong/g2p | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/vcc20/tts1_en_zh/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/vcc20/tts1_en_de/local/download.sh | ESPnet_for_PyTorch/egs/vcc20/tts1_en_fi/local/download.sh | http://data.solak.de/data/Training/stt_tts/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet_for_PyTorch/egs/vcc20/tts1_en_fi/local/clean_text_css10.py | https://github.com/Kyubyong/g2p | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/vcc20/tts1_en_fi/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/vcc20/tts1_en_de/local/download.sh | ESPnet_for_PyTorch/egs/vcc20/tts1_en_de/local/download.sh | http://data.solak.de/data/Training/stt_tts/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/vcc20/tts1_en_de/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | 
ESPnet_for_PyTorch/egs/vais1000/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/tweb/README.md | ESPnet_for_PyTorch/egs/tweb/tts1/run.sh | https://www.kaggle.com/bryanpark/the-world-english-bible-speech-dataset | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/tweb/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/timit_ssc/ssr1/run.sh | ESPnet_for_PyTorch/egs/timit_ssc/ssr1/run.sh | ftp://ftp.espci.fr/pub/sigma/Features/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/timit_ssc/ssr1/run.sh | ESPnet_for_PyTorch/egs/timit_ssc/ssr1/local/ssc_data_prepare.sh | ftp://ftp.espci.fr/pub/sigma/Features/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/timit_ssc/ssr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/timit/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/tedlium3/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/tedlium2/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/tedlium2/align1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/swbd/asr1/local/swbd1_fix_speakerid.pl | ESPnet_for_PyTorch/egs/swbd/asr1/local/swbd1_fix_speakerid.pl | pengqi@cs.stanford.edu | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/swbd/asr1/local/swbd1_data_prep.sh | ESPnet_for_PyTorch/egs/swbd/asr1/local/swbd1_data_prep.sh | http://www.ldc.upenn.edu/Catalog/desc/addenda/swb-multi-annot.summary | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/swbd/asr1/local/eval2000_data_prep.sh | ESPnet_for_PyTorch/egs/swbd/asr1/local/eval2000_data_prep.sh | http://www.ldc.upenn.edu/Catalog/catalogEntry.jsp?catalogId=LDC2002S09 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/swbd/asr1/local/eval2000_data_prep.sh | ESPnet_for_PyTorch/egs/swbd/asr1/local/eval2000_data_prep.sh | http://www.ldc.upenn.edu/Catalog/CatalogEntry.jsp?catalogId=LDC2002T43 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/swbd/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/ru_open_stt/asr1/local/ru_open_stt_download_data.sh | ESPnet_for_PyTorch/egs/ru_open_stt/asr1/local/ru_open_stt_download_data.sh | https://raw.githubusercontent.com/snakers4/open_stt/4bff5470a29dcca5c7175fa3b6fd106c6151b756/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/ru_open_stt/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/reverb/asr1_multich/local/evaltools/score_STOI_scp.m | ESPnet_for_PyTorch/egs/reverb/asr1_multich/local/evaltools/score_STOI_scp.m | REVERB-challenge@lab.ntt.co.jp | 
开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/reverb/asr1_multich/local/evaltools/score_STOI_scp.m | ESPnet_for_PyTorch/egs/reverb/asr1_multich/local/evaltools/score_SimData_scp.m | REVERB-challenge@lab.ntt.co.jp | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/reverb/asr1_multich/local/evaltools/score_STOI_scp.m | ESPnet_for_PyTorch/egs/reverb/asr1_multich/local/evaltools/score_RealData_scp.m | REVERB-challenge@lab.ntt.co.jp | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/reverb/asr1_multich/local/evaltools/score_STOI_scp.m | ESPnet_for_PyTorch/egs/reverb/asr1_multich/local/evaltools/prog/score_sim_scp.m | REVERB-challenge@lab.ntt.co.jp | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/reverb/asr1_multich/local/evaltools/score_STOI_scp.m | ESPnet_for_PyTorch/egs/reverb/asr1_multich/local/evaltools/prog/score_real_scp.m | REVERB-challenge@lab.ntt.co.jp | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/reverb/asr1_multich/local/download_se_eval_tool.sh | ESPnet_for_PyTorch/egs/reverb/asr1_multich/local/download_se_eval_tool.sh | https://reverb2014.dereverberation.org/tools/taskFiles_et.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/reverb/asr1_multich/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/reverb/asr1/RESULTS | ESPnet_for_PyTorch/egs/reverb/asr1/RESULTS | https://reverb2014.dereverberation.com/result_asr.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/reverb/asr1/local/prepare_real_data.sh | ESPnet_for_PyTorch/egs/reverb/asr1/local/prepare_simu_data.sh | https://github.com/kaldi-asr/kaldi/tree/master/egs/reverb/s5/local | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/reverb/asr1/local/prepare_real_data.sh | ESPnet_for_PyTorch/egs/reverb/asr1/local/prepare_real_data.sh | https://github.com/kaldi-asr/kaldi/tree/master/egs/reverb/s5/local | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/reverb/asr1/local/Generate_mcTrainData_cut.m | ESPnet_for_PyTorch/egs/reverb/asr1/local/Generate_mcTrainData_cut.m | http://stevem.us/fconv.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/reverb/asr1/local/prepare_real_data.sh | ESPnet_for_PyTorch/egs/reverb/asr1/local/generate_data.sh | https://github.com/kaldi-asr/kaldi/tree/master/egs/reverb/s5/local | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/reverb/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/puebla_nahuatl/st1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/thchs30/tts1/local/download_and_untar.sh | ESPnet_for_PyTorch/egs/puebla_nahuatl/asr1/local/download_and_untar.sh | https://common-voice-data-download.s3.amazonaws.com/cv_corpus_v1.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/puebla_nahuatl/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/README.md | ESPnet_for_PyTorch/egs/polyphone_swiss_french/asr1/local/data_prep.py | 
http://catalog.elra.info/en-us/repository/browse/ELRA-S0030_02 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/polyphone_swiss_french/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/must_c_v2/st1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/must_c_v2/mt1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/must_c_v2/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/must_c/st1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/must_c/mt1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/must_c/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/mucs21_subtask2/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/mucs21_subtask1/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/mtedx/st1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/mtedx/mt1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/mini_an4/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/mgb2/asr1/run.sh | ESPnet_for_PyTorch/egs/mgb2/asr1/run.sh | https://arabicspeech.org/mgb2 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/mgb2/asr1/run.sh | ESPnet_for_PyTorch/egs/mgb2/asr1/local/mgb_extract_data.sh | https://arabicspeech.org/mgb2 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/mgb2/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/mboshi_french/st1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/m_ailabs/tts1/local/download.sh | ESPnet_for_PyTorch/egs/m_ailabs/tts1/local/download.sh | http://www.caito.de/data/Training/stt_tts/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/m_ailabs/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet_for_PyTorch/egs/ljspeech/tts2/run.sh | https://github.com/Kyubyong/g2p | 源码实现 | -| 开源代码引入 | 
https://github.com/espnet/espnet/blob/master/egs/ljspeech/tts2/RESULTS | ESPnet_for_PyTorch/egs/ljspeech/tts2/RESULTS | https://drive.google.com/drive/folders/1AMAKY8uQY59-DL5KNjrPSoosu4sftJPn?usp=sharing | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/ljspeech/tts2/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet_for_PyTorch/egs/ljspeech/tts1/run.sh | https://github.com/Kyubyong/g2p | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/ljspeech/tts1/local/data_download.sh | ESPnet_for_PyTorch/egs/ljspeech/tts1/local/data_download.sh | http://data.keithito.com/data/speech/LJSpeech-1.1.tar.bz2 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet_for_PyTorch/egs/ljspeech/tts1/local/clean_text.py | https://github.com/Kyubyong/g2p | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/ljspeech/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/README.md | ESPnet_for_PyTorch/egs/ljspeech/asr1/run.sh | https://github.com/Kyubyong/g2p | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/ljspeech/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/libritts/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/librispeech/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_trans/st1/run.sh | ESPnet_for_PyTorch/egs/libri_trans/st1/run.sh | https://persyval-platform.univ-grenoble-alpes.fr/DS91/detaildataset | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_trans/st1/local/data_prep.sh | ESPnet_for_PyTorch/egs/libri_trans/st1/local/data_prep.sh | https://github.com/eske/seq2seq/blob/master/config/LibriSpeech/prepare-raw.sh | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/libri_trans/st1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_trans/st1/run.sh | ESPnet_for_PyTorch/egs/libri_trans/mt1/run.sh | https://persyval-platform.univ-grenoble-alpes.fr/DS91/detaildataset | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/libri_trans/mt1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_trans/st1/run.sh | ESPnet_for_PyTorch/egs/libri_trans/asr1/run.sh | https://persyval-platform.univ-grenoble-alpes.fr/DS91/detaildataset | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/libri_trans/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_css/asr1/run.sh | ESPnet_for_PyTorch/egs/libri_css/asr1/run.sh | 
https://github.com/espnet/espnet/blob/master/egs/librispeech/asr1/RESULTS.md#pytorch-large-transformer-with-specaug-4-gpus--transformer-lm-4-gpus | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_css/asr1/local/wer_output_filter | ESPnet_for_PyTorch/egs/libri_css/asr1/local/wer_output_filter | jtrmal@gmail.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_css/asr1/local/segmentation/apply_webrtcvad.py | ESPnet_for_PyTorch/egs/libri_css/asr1/local/segmentation/apply_webrtcvad.py | https://github.com/wiseman/py-webrtcvad/blob/master/example.py | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/tedlium2/align1/local/download_model.sh | ESPnet_for_PyTorch/egs/libri_css/asr1/local/download_asr.sh | https://drive.google.com/open?id=17cOOSHHMKI82e1MXj4r2ig8gpGCRmG2p | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_css/asr1/local/diarize.sh | ESPnet_for_PyTorch/egs/libri_css/asr1/local/diarize.sh | https://github.com/nryant/dscore | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_css/asr1/diarization/vb_hmm_xvector.sh | ESPnet_for_PyTorch/egs/libri_css/asr1/diarization/vb_hmm_xvector.sh | https://arxiv.org/abs/1910.08847 | 参考论文地址 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_css/asr1/diarization/vb_hmm_xvector.sh | ESPnet_for_PyTorch/egs/libri_css/asr1/diarization/vb_hmm_xvector.sh | http://www.fit.vutbr.cz/research/groups/speech/publi/2019/diez_IEEE_ACM_2019_08910412.pdf | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_css/asr1/diarization/vb_hmm_xvector.py | ESPnet_for_PyTorch/egs/libri_css/asr1/diarization/vb_hmm_xvector.py | https://github.com/BUTSpeechFIT/VBx | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_css/asr1/diarization/VB_diarization.py | ESPnet_for_PyTorch/egs/libri_css/asr1/diarization/VB_diarization.py | burget@fit.vutbr. 
| 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/libri_css/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/li42/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/li10/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/README.md | ESPnet_for_PyTorch/egs/ksponspeech/asr1/run.sh | https://aihub.or.kr/aidata/105 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/ksponspeech/asr1/local/get_space_normalized_hyps.py | ESPnet_for_PyTorch/egs/ksponspeech/asr1/local/get_space_normalized_hyps.py | https://github.com/kaldi-asr/kaldi/blob/master/src/bin/align-text.cc | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/ksponspeech/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/jvs/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/jsut/tts1/local/prep_segments.py | ESPnet_for_PyTorch/egs/jsut/tts1/local/prep_segments.py | https://kaldi-asr.org/doc/data_prep.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/jsut/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/jsut/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/jsalt18e2e/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/TEMPLATE/tts1/tts.sh | ESPnet_for_PyTorch/egs/jnas/tts1/run.sh | http://kaldi-asr.org/models/8/0008_sitw_v2_1a.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/jnas/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/jnas/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/jesc/mt1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/README.md | ESPnet_for_PyTorch/egs/iwslt21_low_resource/st1/run.sh | https://iwslt.org/2021/low-resource | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/iwslt21_low_resource/st1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/README.md | ESPnet_for_PyTorch/egs/iwslt21_low_resource/asr1/run.sh | https://iwslt.org/2021/low-resource | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/iwslt21_low_resource/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | 
https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/iwslt21/st1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/iwslt21/punc1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/iwslt21/mt1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/iwslt21/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/iwslt19/st1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/iwslt19/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/iwslt18/st1/local/download_and_untar.sh | ESPnet_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/segmented/IWSLT-SLT.segmented.tst2019.en-de.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/iwslt18/st1/local/download_and_untar.sh | ESPnet_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/segmented/IWSLT-SLT.segmented.tst2020.en-de.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/iwslt18/st1/local/data_prep_train.sh | ESPnet_for_PyTorch/egs/iwslt18/st1/local/data_prep_eval.sh | https://drive.google.com/open?id=1agQOUEm47LIeLZAFF8RTZ5qx6OsOFGTM | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/iwslt18/st1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/iwslt18/mt1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/iwslt18/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/iwslt16/mt1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_css/asr1/local/wer_output_filter | ESPnet_for_PyTorch/egs/hub4_spanish/asr1/local/write_kaldi_files.pl | jtrmal@gmail.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_css/asr1/local/wer_output_filter | ESPnet_for_PyTorch/egs/hub4_spanish/asr1/local/prepare_training_text.pl | jtrmal@gmail.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_css/asr1/local/wer_output_filter | ESPnet_for_PyTorch/egs/hub4_spanish/asr1/local/prepare_test_text.pl | jtrmal@gmail.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_css/asr1/local/wer_output_filter | ESPnet_for_PyTorch/egs/hub4_spanish/asr1/local/prepare_data.sh | jtrmal@gmail.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_css/asr1/local/wer_output_filter | ESPnet_for_PyTorch/egs/hub4_spanish/asr1/local/parse_sgm.pl | 
jtrmal@gmail.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/hub4_spanish/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/how2/st1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/how2/mt1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/how2/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/hkust/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/swbd/asr1/local/swbd1_data_prep.sh | ESPnet_for_PyTorch/egs/fisher_swbd/asr1/local/swbd1_data_prep.sh | http://www.ldc.upenn.edu/Catalog/desc/addenda/swb-multi-annot.summary | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/swbd/asr1/local/eval2000_data_prep.sh | ESPnet_for_PyTorch/egs/fisher_swbd/asr1/local/eval2000_data_prep.sh | http://www.ldc.upenn.edu/Catalog/catalogEntry.jsp?catalogId=LDC2002S09 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/swbd/asr1/local/eval2000_data_prep.sh | ESPnet_for_PyTorch/egs/fisher_swbd/asr1/local/eval2000_data_prep.sh | http://www.ldc.upenn.edu/Catalog/CatalogEntry.jsp?catalogId=LDC2002T43 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/fisher_swbd/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/fisher_callhome_spanish/st1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/fisher_callhome_spanish/mt1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/fisher_callhome_spanish/asr1b/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/fisher_callhome_spanish/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/dirha_wsj/asr1/local/tools/readsph.m | ESPnet_for_PyTorch/egs/dirha_wsj/asr1/local/tools/readsph.m | http://www.ee.ic.ac.uk/hp/staff/dmb/voicebox/voicebox.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/dirha_wsj/asr1/local/tools/readsph.m | ESPnet_for_PyTorch/egs/dirha_wsj/asr1/local/tools/readsph.m | http://www.gnu.org/copyleft/gpl.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/dirha_wsj/asr1/local/tools/linear_shift.m | ESPnet_for_PyTorch/egs/dirha_wsj/asr1/local/tools/read_sphere.m | mravanelli@fbk.eu | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/dirha_wsj/asr1/local/tools/linear_shift.m | ESPnet_for_PyTorch/egs/dirha_wsj/asr1/local/tools/linear_shift.m | mravanelli@fbk.eu | 开发者邮箱配置 | -| 开源代码引入 | 
https://github.com/espnet/espnet/blob/master/egs/dirha_wsj/asr1/local/tools/linear_shift.m | ESPnet_for_PyTorch/egs/dirha_wsj/asr1/local/tools/find_files.m | mravanelli@fbk.eu | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/dirha_wsj/asr1/local/tools/linear_shift.m | ESPnet_for_PyTorch/egs/dirha_wsj/asr1/local/tools/Data_Contamination.m | mravanelli@fbk.eu | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/dirha_wsj/asr1/local/tools/Data_Contamination.m | ESPnet_for_PyTorch/egs/dirha_wsj/asr1/local/tools/Data_Contamination.m | https://www.ldc.upenn.edu/language-resources/tools/sphere-conversion-tools | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/dirha_wsj/asr1/local/tools/linear_shift.m | ESPnet_for_PyTorch/egs/dirha_wsj/asr1/local/tools/create_folder_str.m | mravanelli@fbk.eu | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/dirha_wsj/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/dipco/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/README.md | ESPnet_for_PyTorch/egs/csmsc/tts1/local/data_download.sh | https://www.data-baker.com/open_source.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/csmsc/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/csj/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/csj/align1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/covost2/st1/run.sh | ESPnet_for_PyTorch/egs/covost2/st1/run.sh | https://voice-prod-bundler-ee1969a6ce8178826482b88e843c335139bd3fb4.s3.amazonaws.com/cv-corpus-4-2019-12-10/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/covost2/st1/run.sh | ESPnet_for_PyTorch/egs/covost2/st1/run.sh | https://dl.fbaipublicfiles.com/covost/covost_v2 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/covost2/st1/local/process_tsv.py | ESPnet_for_PyTorch/egs/covost2/st1/local/process_tsv.py | https://github.com/facebookresearch/covost/blob/master/get_covost_splits.py | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/covost2/st1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/covost2/st1/run.sh | ESPnet_for_PyTorch/egs/covost2/mt1/run.sh | https://voice-prod-bundler-ee1969a6ce8178826482b88e843c335139bd3fb4.s3.amazonaws.com/cv-corpus-4-2019-12-10/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/covost2/st1/run.sh | ESPnet_for_PyTorch/egs/covost2/mt1/run.sh | https://dl.fbaipublicfiles.com/covost/covost_v2 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/covost2/mt1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/covost2/st1/run.sh | ESPnet_for_PyTorch/egs/covost2/asr1/run.sh | 
https://voice-prod-bundler-ee1969a6ce8178826482b88e843c335139bd3fb4.s3.amazonaws.com/cv-corpus-4-2019-12-10/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/covost2/st1/run.sh | ESPnet_for_PyTorch/egs/covost2/asr1/run.sh | https://dl.fbaipublicfiles.com/covost/covost_v2 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/covost2/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/commonvoice/asr1/local/data.sh | ESPnet_for_PyTorch/egs/commonvoice/asr1/run.sh | https://voice-prod-bundler-ee1969a6ce8178826482b88e843c335139bd3fb4.s3.amazonaws.com/cv-corpus-3/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/commonvoice/asr1/local/data.sh | ESPnet_for_PyTorch/egs/commonvoice/asr1/run.sh | https://voice-prod-bundler-ee1969a6ce8178826482b88e843c335139bd3fb4.s3.amazonaws.com/cv-corpus-5.1-2020-06-22/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/thchs30/tts1/local/download_and_untar.sh | ESPnet_for_PyTorch/egs/commonvoice/asr1/local/download_and_untar.sh | https://common-voice-data-download.s3.amazonaws.com/cv_corpus_v1.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/commonvoice/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/cmu_indic/tts1/local/data_download.sh | ESPnet_for_PyTorch/egs/cmu_indic/tts1/local/data_download.sh | http://festvox.org/h2r_indic/cmu_indic_ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/cmu_indic/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_css/asr1/local/wer_output_filter | ESPnet_for_PyTorch/egs/chime6/asr1/local/wer_output_filter | jtrmal@gmail.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_css/asr1/local/wer_output_filter | ESPnet_for_PyTorch/egs/chime6/asr1/local/prepare_dict.sh | jtrmal@gmail.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/chime6/asr1/local/prepare_baseline_chime6_data.sh | ESPnet_for_PyTorch/egs/chime6/asr1/local/prepare_baseline_chime6_data.sh | http://spandh.dcs.shef.ac.uk/chime_challenge/data.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/chime6/asr1/local/prepare_baseline_chime6_data.sh | ESPnet_for_PyTorch/egs/chime6/asr1/local/prepare_baseline_chime6_data.sh | http://spandh.dcs.shef.ac.uk/chime_workshop/papers/CHiME_2018_paper_boeddecker.pdf | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/chime6/asr1/local/install_pb_chime5.sh | ESPnet_for_PyTorch/egs/chime6/asr1/local/install_pb_chime5.sh | https://stackoverflow.com/a/3796947/5766934 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/chime6/asr1/local/generate_chime6_data.sh | ESPnet_for_PyTorch/egs/chime6/asr1/local/generate_chime6_data.sh | https://github.com/chimechallenge/chime6-synchronisation | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/libri_css/asr1/local/wer_output_filter | ESPnet_for_PyTorch/egs/chime6/asr1/local/check_tools.sh | jtrmal@gmail.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/chime6/asr1/cmd.sh | 
http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/chime5/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/chime4/asr1_multich/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/chime4/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/blizzard17/tts1/run.sh | ESPnet_for_PyTorch/egs/blizzard17/tts1/run.sh | http://www.cstr.ed.ac.uk/projects/blizzard/2017/usborne_blizzard2017/license.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/blizzard17/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/babel/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/aurora4/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/arctic/vc1/path.sh | ESPnet_for_PyTorch/egs/arctic/vc1/path.sh | https://github.com/cybertronai/pytorch-lamb | 源码实现 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/arctic/vc1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/cmu_arctic/tts1/local/data_download.sh | ESPnet_for_PyTorch/egs/arctic/tts1/local/data_download.sh | http://festvox.org/cmu_arctic/cmu_arctic/packed/cmu_us_ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/arctic/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/an4/tts1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/an4/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/ami/asr1/local/ami_xml2text.sh | ESPnet_for_PyTorch/egs/ami/asr1/local/ami_xml2text.sh | http://groups.inf.ed.ac.uk/ami/AMICorpusAnnotations/ | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs/ami/asr1/local/ami_beamform.sh | ESPnet_for_PyTorch/egs/ami/asr1/local/ami_beamform.sh | http://groups.inf.ed.ac.uk/ami/corpus/dataproblems.shtml | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/ami/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/aishell2/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/aishell/asr1/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/egs2/laborotv/asr1/cmd.sh | ESPnet_for_PyTorch/egs/aidatatang_200zh/asr1/cmd.sh | 
http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/9.2/Dockerfile | ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/9.2/Dockerfile | https://gitlab.com/nvidia/cuda/blob/ubuntu18.04/9.2/base/Dockerfile | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/9.2/Dockerfile | ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/9.2/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1710/x86_64 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/8.0/Dockerfile | ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/9.2/Dockerfile | https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1604/x86_64 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/9.2/Dockerfile | ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/9.2/Dockerfile | https://gitlab.com/nvidia/cuda/blob/ubuntu18.04/9.2/devel/Dockerfile | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/9.2/Dockerfile | ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/9.2/Dockerfile | https://gitlab.com/nvidia/cuda/blob/ubuntu18.04/9.2/devel/cudnn7/Dockerfile | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/8.0/Dockerfile | ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/9.1/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1604/x86_64 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/8.0/Dockerfile | ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/9.1/Dockerfile | https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1604/x86_64 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/8.0/Dockerfile | ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/9.0/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1604/x86_64 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/8.0/Dockerfile | ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/9.0/Dockerfile | https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1604/x86_64 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/8.0/Dockerfile | ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/8.0/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1604/x86_64 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/8.0/Dockerfile | ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/8.0/Dockerfile | https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1604/x86_64 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/11.1/Dockerfile | ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/11.1/Dockerfile | https://gitlab.com/nvidia/container-images/cuda/-/blob/master/dist/11.1.1/ubuntu20.04-x86_64/base/Dockerfile | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/11.1/Dockerfile | ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/11.1/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/11.1/Dockerfile | ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/11.1/Dockerfile | 
https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu2004/x86_64 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/10.1/Dockerfile | ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/11.1/Dockerfile | https://docs.nvidia.com/cuda/eula/index.html#attachment-a | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/11.1/Dockerfile | ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/11.1/Dockerfile | https://gitlab.com/nvidia/container-images/cuda/-/blob/master/dist/11.1.1/ubuntu20.04-x86_64/devel/Dockerfile | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/10.2/Dockerfile | ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/11.1/Dockerfile | https://gitlab.com/nvidia/container-images/cuda/-/issues/88 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/10.1/Dockerfile | ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/10.2/Dockerfile | https://gitlab.com/nvidia/cuda/blob/ubuntu18.04/10.1/base/Dockerfile | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/10.1/Dockerfile | ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/10.2/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/10.1/Dockerfile | ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/10.2/Dockerfile | https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1804/x86_64 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/10.1/Dockerfile | ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/10.2/Dockerfile | https://docs.nvidia.com/cuda/eula/index.html#attachment-a | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/10.2/Dockerfile | ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/10.2/Dockerfile | https://gitlab.com/nvidia/cuda/blob/ubuntu18.04/10.1/runtime/Dockerfile | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/10.2/Dockerfile | ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/10.2/Dockerfile | https://gitlab.com/nvidia/cuda/blob/ubuntu18.04/10.1/devel/Dockerfile | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/10.2/Dockerfile | ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/10.2/Dockerfile | https://gitlab.com/nvidia/container-images/cuda/-/issues/88 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/10.1/Dockerfile | ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/10.1/Dockerfile | https://gitlab.com/nvidia/cuda/blob/ubuntu18.04/10.1/base/Dockerfile | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/10.1/Dockerfile | ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/10.1/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/10.1/Dockerfile | ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/10.1/Dockerfile | https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1804/x86_64 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/10.1/Dockerfile | ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/10.1/Dockerfile | https://docs.nvidia.com/cuda/eula/index.html#attachment-a | 模型相关说明 
| -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/10.1/Dockerfile | ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/10.0/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/10.1/Dockerfile | ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/10.0/Dockerfile | https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1804/x86_64 | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/docker/prebuilt/devel/gpu/10.1/Dockerfile | ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/10.0/Dockerfile | https://docs.nvidia.com/cuda/eula/index.html#attachment-a | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/.gitmodules | ESPnet_for_PyTorch/.gitmodules | https://github.com/espnet/notebook | 源码实现 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|--------------------------------------------------------------------------------------------------------------------|--------------------------------------------------------------------------------------------------------------------------------|---------------| +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/docker/prebuilt/devel/Dockerfile | nyalta21@gmail.com | maintainer邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/10.0/Dockerfile | nyalta21@gmail.com | maintainer邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/10.0/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64 / | cuda地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/10.0/Dockerfile | https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1804/x86_64 / | 下载相关依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/10.0/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64/7fa2af80.pub | 下载秘钥 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/10.1/Dockerfile | nyalta21@gmail.com | maintainer邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/10.1/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64 / | cuda地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/10.1/Dockerfile | https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1804/x86_64 / | 下载相关依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/10.1/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64/7fa2af80.pub | 下载秘钥 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/10.2/Dockerfile | nyalta21@gmail.com | maintainer邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/10.2/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64 / | cuda地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/10.2/Dockerfile | https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1804/x86_64 / | 下载相关依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/10.2/Dockerfile | 
https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64/7fa2af80.pub | 下载秘钥 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/11.1/Dockerfile | nyalta21@gmail.com | maintainer邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/11.1/Dockerfile | https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu2004/x86_64 / | 下载相关依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/11.1/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64 / | 下载相关依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/11.1/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2004/x86_64/7fa2af80.pub | 下载秘钥 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/8.0/Dockerfile | nyalta21@gmail.com | maintainer邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/8.0/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1604/x86_64 / | cuda地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/8.0/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1604/x86_64/7fa2af80.pub | 下载秘钥 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/8.0/Dockerfile | https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1604/x86_64 / | ubuntu镜像地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/9.0/Dockerfile | nyalta21@gmail.com | maintainer邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/9.0/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1604/x86_64 / | cuda地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/9.0/Dockerfile | https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1604/x86_64 / | 下载相关依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/9.0/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1604/x86_64/7fa2af80.pub | 下载秘钥 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/9.1/Dockerfile | nyalta21@gmail.com | maintainer邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/9.1/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1604/x86_64 / | cuda地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/9.1/Dockerfile | https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1604/x86_64 / | 下载相关依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/9.1/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1604/x86_64/7fa2af80.pub | 下载秘钥 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/9.2/Dockerfile | nyalta21@gmail.com | maintainer邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/9.2/Dockerfile | https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1604/x86_64 / | 下载相关依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/9.2/Dockerfile | 
https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1710/x86_64 / | 下载相关依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/docker/prebuilt/devel/gpu/9.2/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1710/x86_64/7fa2af81.pub | 下载秘钥 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/docker/prebuilt/Dockerfile | nyalta21@gmail.com | maintainer邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/docker/prebuilt/local/Dockerfile | nyalta21@gmail.com | maintainer邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/docker/prebuilt/runtime/Dockerfile | nyalta21@gmail.com | maintainer邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/docker/prebuilt/runtime/Dockerfile | https://repo.continuum.io/miniconda/Miniconda3-latest-Linux-x86_64.sh | miniconda链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/aidatatang_200zh/asr1/run.sh | www.openslr.org/resources/62 | 下载配置 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/aishell/asr1/run.sh | www.openslr.org/resources/33 | 下载配置 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/ami/asr1/local/ami_download.sh | http://groups.inf.ed.ac.uk/ami | 下载配置 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/ami/asr1/local/ami_download.sh | http://groups.inf.ed.ac.uk/ami/download/temp/amiBuild-04237-Sun-Jun-15-2014.manifest.txt | 下载配置 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/ami/asr1/local/ami_text_prep.sh | http://groups.inf.ed.ac.uk/ami | 下载配置 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/ami/asr1/local/ami_xml2text.sh | http://groups.inf.ed.ac.uk/ami/AMICorpusAnnotations/$annots.gzip | 下载配置 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/an4/asr1/run.sh | http://www.speech.cs.cmu.edu/databases/an4/ | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/an4/tts1/run.sh | http://www.speech.cs.cmu.edu/databases/an4/ | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/arctic/tts1/local/data_download.sh | http://festvox.org/cmu_arctic/cmu_arctic/packed/cmu_us_${spk}_arctic-0.95-release.tar.bz2 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/aurora4/asr1/local/aurora4_data_prep.sh | http://www.ldc.upenn.edu/Catalog/docs/LDC93S6A/wsj0-train-spkrinfo.txt | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/aurora4/asr1/local/wsj_data_prep.sh | https://catalog.ldc.upenn.edu/docs/LDC93S6A/wsj0-train-spkrinfo.txt | 下载数据集 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/blizzard17/tts1/local/download.sh | http://data.cstr.ed.ac.uk/blizzard2017-18/usborne/2018/2018_EH1/blizzard_release_2017_v2.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/chime4/asr1/local/clean_wsj0_data_prep.sh | http://www.ldc.upenn.edu/Catalog/docs/LDC93S6A/wsj0-train-spkrinfo.txt | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/chime6/asr1/local/prepare_baseline_chime6_data.sh | http://www.openslr.org/resources/28/rirs_noises.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/chime6/asr1/local/prepare_dict.sh | https://svn.code.sf.net/p/cmusphinx/code/trunk/cmudict $dir/cmudict | 相关设置 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/cmu_indic/tts1/local/data_download.sh | http://festvox.org/h2r_indic/cmu_indic_${spk}.tar.bz2 | 数据集链接 | 
+| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/commonvoice/asr1/run.sh | https://voice-prod-bundler-ee1969a6ce8178826482b88e843c335139bd3fb4.s3.amazonaws.com/cv-corpus-5.1-2020-06-22/${lang}.tar.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/covost2/asr1/run.sh | https://dl.fbaipublicfiles.com/covost/covost2.zip | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/covost2/asr1/run.sh | https://voice-prod-bundler-ee1969a6ce8178826482b88e843c335139bd3fb4.s3.amazonaws.com/cv-corpus-4-2019-12-10/${src_lang}.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/covost2/asr1/run.sh | https://dl.fbaipublicfiles.com/covost/covost_v2.${src_lang}_${tgt_lang}.tsv.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/covost2/mt1/run.sh | https://dl.fbaipublicfiles.com/covost/covost2.zip | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/covost2/mt1/run.sh | https://voice-prod-bundler-ee1969a6ce8178826482b88e843c335139bd3fb4.s3.amazonaws.com/cv-corpus-4-2019-12-10/${src_lang}.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/covost2/mt1/run.sh | https://dl.fbaipublicfiles.com/covost/covost_v2.${src_lang}_${tgt_lang}.tsv.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/covost2/st1/run.sh | https://dl.fbaipublicfiles.com/covost/covost2.zip | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/covost2/st1/run.sh | https://voice-prod-bundler-ee1969a6ce8178826482b88e843c335139bd3fb4.s3.amazonaws.com/cv-corpus-4-2019-12-10/${src_lang}.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/covost2/st1/run.sh | https://dl.fbaipublicfiles.com/covost/covost_v2.${src_lang}_${tgt_lang}.tsv.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/csmsc/tts1/local/data_download.sh | https://www.data-baker.com/open_source.html | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/dipco/asr1/local/download_data.sh | https://s3.amazonaws.com/dipco/DiPCo.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/dirha_wsj/asr1/local/wsj_data_prep.sh | https://catalog.ldc.upenn.edu/docs/LDC93S6A/wsj0-train-spkrinfo.txt | 下载数据集 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/fisher_swbd/asr1/local/fisher_swbd_prepare_dict.sh | https://svn.code.sf.net/p/cmusphinx/code/trunk/cmudict | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/fisher_swbd/asr1/local/swbd1_data_download.sh | http://www.openslr.org/resources/5/switchboard_word_alignments.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/fisher_swbd/asr1/local/swbd1_data_download.sh | http://www.isip.piconepress.com/projects/switchboard/releases/switchboard_word_alignments.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~mmueller/iwslt-corpus.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/preprocessed/IWSLT-SLT.tst2020.en-de.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | 
http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/preprocessed/IWSLT-SLT.tst2019.en-de.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/preprocessed/IWSLT-SLT.tst2018.en-de.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/preprocessed/IWSLT-SLT.tst2015.en-de.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/preprocessed/IWSLT-SLT.tst2014.en-de.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/preprocessed/IWSLT-SLT.tst2013.en-de.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/preprocessed/IWSLT-SLT.tst2010.en-de.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/iwslt18/st1/local/download_and_untar.sh | http://i13pc106.ira.uka.de/~jniehues/IWSLT-SLT/data/eval/en-de/preprocessed/IWSLT-SLT.dev2010.en-de.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/iwslt21/asr1/local/data_prep_wmt20.sh | https://www.cs.jhu.edu/~kevinduh/t/iwslt21/wmt20/wmt20-de-en-subset20m.incl_paracrawl.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/iwslt21/asr1/local/data_prep_wmt20.sh | https://www.cs.jhu.edu/~kevinduh/t/iwslt21/wmt20/wmt20-de-en-subset20m.incl_paracrawl.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/iwslt21/asr1/local/data_prep_wmt20.sh | https://www.cs.jhu.edu/~kevinduh/t/iwslt21/wmt20/wmt20-de-en-subset5m.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/iwslt21/asr1/run.sh | https://www.openslr.org/resources/11/librispeech-lm-norm.txt.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/iwslt21_low_resource/asr1/run.sh | https://iwslt.org/2021/low-resource | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/iwslt21_low_resource/st1/run.sh | https://iwslt.org/2021/low-resource | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/jesc/mt1/local/download_data.sh | https://nlp.stanford.edu/projects/jesc/data/split.tar.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/jnas/tts1/run.sh | http://kaldi-asr.org/models/8/0008_sitw_v2_1a.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/jsut/tts1/local/download.sh | http://ss-takashi.sakura.ne.jp/corpus/jsut_ver1.1.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/libri_css/asr1/local/download_xvector.sh | http://kaldi-asr.org/models/12/0012_diarization_v1.tar.gz | 下载依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/librispeech/asr1/run.sh | www.openslr.org/resources/12 | 下载配置 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/librispeech/asr1/run.sh | http://www.openslr.org/resources/11/librispeech-lm-norm.txt.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/libritts/tts1/run.sh | 
http://kaldi-asr.org/models/8/0008_sitw_v2_1a.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/libritts/tts1/run.sh | www.openslr.org/resources/60 | 下载配置 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/ljspeech/tts1/local/data_download.sh | http://data.keithito.com/data/speech/LJSpeech-1.1.tar.bz2 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/m_ailabs/tts1/local/download.sh | http://www.caito.de/data/Training/stt_tts/${lang}.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/mgb2/asr1/local/mgb_extract_data.sh | https://arabicspeech.org/mgb2 | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/mgb2/asr1/local/xml2stm.py | yzhang@qf.org.qa | 作者邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/mucs21_subtask1/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Odia_train.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/mucs21_subtask1/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Marathi_train.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/mucs21_subtask1/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Hindi_train.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/mucs21_subtask1/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Odia_test.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/mucs21_subtask1/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Marathi_test.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/mucs21_subtask1/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Hindi_test.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/mucs21_subtask2/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Hindi-English_test.tar.gz | 下载数据集 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/mucs21_subtask2/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Hindi-English_test.tar.gz | 下载数据集 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/mucs21_subtask2/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Hindi-English_train.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/mucs21_subtask2/asr1/local/download_data.sh | http://www.ee.iisc.ac.in/new/people/faculty/prasantg/downloads/Bengali-English_train.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/puebla_nahuatl/asr1/run.sh | https://www.openslr.org/resources/92/Puebla-Nahuatl-Manifest.tgz Puebla-Nahuatl-Manifest.tgz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/puebla_nahuatl/asr1/run.sh | https://www.openslr.org/resources/92/Sound-Files-Puebla-Nahuatl.tgz.part0 Sound-Files-Puebla-Nahuatl.tgz.part0 9 | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/puebla_nahuatl/st1/run.sh | https://www.openslr.org/resources/92/Puebla-Nahuatl-Manifest.tgz Puebla-Nahuatl-Manifest.tgz | 数据集地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/puebla_nahuatl/st1/run.sh | https://www.openslr.org/resources/92/SpeechTranslation_Nahuatl_Manifest.tgz SpeechTranslation_Nahuatl_Manifest.tgz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/puebla_nahuatl/st1/run.sh | https://www.openslr.org/resources/92/Sound-Files-Puebla-Nahuatl.tgz.part0 Sound-Files-Puebla-Nahuatl.tgz.part0 9 | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/reverb/asr1/local/download_se_eval_tool.sh | https://reverb2014.dereverberation.com/tools/REVERB-SPEENHA.Release04Oct.zip | 下载工具脚本 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/reverb/asr1/local/generate_data.sh | http://reverb2014.dereverberation.com/tools/taskFiles_et.tgz | 下载工具脚本 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/reverb/asr1/local/generate_data.sh | http://reverb2014.dereverberation.com/tools/REVERB_TOOLS_FOR_ASR_ver2.1.tgz | 下载工具脚本 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/reverb/asr1/local/generate_data.sh | http://reverb2014.dereverberation.com/tools/reverb_tools_for_Generate_mcTrainData.tgz | 下载工具脚本 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/reverb/asr1_multich/local/download_se_eval_tool.sh | https://reverb2014.dereverberation.org/tools/taskFiles_et.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/reverb/asr1_multich/local/download_se_eval_tool.sh | https://reverb2014.dereverberation.com/tools/REVERB-SPEENHA.Release04Oct.zip | 下载工具脚本 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/ru_open_stt/asr1/local/ru_open_stt_download_data.sh | https://raw.githubusercontent.com/snakers4/open_stt/4bff5470a29dcca5c7175fa3b6fd106c6151b756/${f} | 模型相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/swbd/asr1/local/swbd1_data_download.sh | http://www.openslr.org/resources/5/switchboard_word_alignments.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/swbd/asr1/local/swbd1_data_download.sh | http://www.isip.piconepress.com/projects/switchboard/releases/switchboard_word_alignments.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/tedlium2/align1/local/download_data.sh | http://www.openslr.org/resources/19/TEDLIUM_release2.tar.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/tedlium2/asr1/local/download_data.sh | http://www.openslr.org/resources/19/TEDLIUM_release2.tar.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/tedlium3/asr1/local/download_data.sh | http://www.openslr.org/resources/51/TEDLIUM_release-3.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/vcc20/tts1_en_de/local/download.sh | http://data.solak.de/data/Training/stt_tts/${lang}.tgz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/vcc20/tts1_en_de/run.sh | http://kaldi-asr.org/models/8/0008_sitw_v2_1a.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/vcc20/tts1_en_fi/local/download.sh | http://data.solak.de/data/Training/stt_tts/${lang}.tgz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/vcc20/tts1_en_fi/run.sh | http://kaldi-asr.org/models/8/0008_sitw_v2_1a.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/vcc20/tts1_en_zh/local/download_csmsc.sh | 
https://weixinxcxdb.oss-cn-beijing.aliyuncs.com/gwYinPinKu/BZNSYP.rar | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/vcc20/tts1_en_zh/local/download_mailabs.sh | http://data.solak.de/data/Training/stt_tts/${lang}.tgz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/vcc20/tts1_en_zh/run.sh | http://kaldi-asr.org/models/8/0008_sitw_v2_1a.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/vcc20/vc1_task1/run.sh | http://kaldi-asr.org/models/8/0008_sitw_v2_1a.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/vcc20/vc1_task2/run.sh | http://kaldi-asr.org/models/8/0008_sitw_v2_1a.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/voxforge/asr1/local/getdata.sh | http://www.repository.voxforge1.org/downloads/SpeechCorpus/Trunk/Audio/Main/16kHz_16bit | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/voxforge/asr1/local/getdata.sh | http://www.repository.voxforge1.org/downloads/Russian/Trunk/Audio/Main/16kHz_16bit | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/voxforge/asr1/local/getdata.sh | http://www.repository.voxforge1.org/downloads/Dutch/Trunk/Audio/Main/16kHz_16bit | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/voxforge/asr1/local/getdata.sh | http://www.repository.voxforge1.org/downloads/$lang/Trunk/Audio/Main/16kHz_16bit | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/wsj/asr1/local/wsj_data_prep.sh | https://catalog.ldc.upenn.edu/docs/LDC93S6A/wsj0-train-spkrinfo.txt | 下载数据集 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/wsj_mix/asr1/local/wsj_data_prep.sh | https://catalog.ldc.upenn.edu/docs/LDC93S6A/wsj0-train-spkrinfo.txt | 下载数据集 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/wsj_mix/asr1/local/wsj0_create_mixture.sh | http://www.merl.com/demos/deep-clustering/create-speaker-mixtures.zip | 下载工具脚本 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/yesno/asr1/run.sh | http://www.openslr.org/resources/1/waves_yesno.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/yesno/tts1/run.sh | http://www.openslr.org/resources/1/waves_yesno.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/yoloxochitl_mixtec/asr1/run.sh | http://www.openslr.org/resources/89/Yoloxochitl-Mixtec-Manifest.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/egs/yoloxochitl_mixtec/asr1/run.sh | http://www.openslr.org/resources/89/Yoloxochitl-Mixtec-Data.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/espnet/nets/pytorch_backend/transducer/arguments.py | https://arxiv.org/abs/2010.11148 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/setup.py | shinjiw@ieee.org | 作者邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/tools/installers/install_k2.sh | https://k2-fsa.org/nightly/ | 三方库连接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/tools/installers/install_k2.sh | https://k2-fsa.org/nightly/ | 三方库连接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/tools/installers/install_k2.sh | https://k2-fsa.org/nightly/ | 三方库连接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/tools/installers/install_k2.sh | https://k2-fsa.org/nightly/ | 三方库连接 | +| 
ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/tools/installers/install_mwerSegmenter.sh | https://www-i6.informatik.rwth-aachen.de/web/Software/mwerSegmenter.tar.gz | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/tools/installers/install_nkf.sh | https://ja.osdn.net/dl/nkf/nkf-2.1.4.tar.gz | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/tools/installers/install_sctk.sh | ftp://jaguar.ncsl.nist.gov/pub/sctk-2.4.10-20151007-1312Z.tar.bz2 | 模型相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/tools/installers/install_sctk.sh | http://www.openslr.org/resources/4/sctk-2.4.10-20151007-1312Z.tar.bz2 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/tools/installers/install_sph2pipe.sh | http://www.openslr.org/resources/3/sph2pipe_v2.5.tar.gz | 模型相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/tools/installers/install_torch.sh | https://download.pytorch.org/whl/torch_stable.html | 三方库下载 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/tools/installers/install_torch.sh | https://download.pytorch.org/whl/torch_stable.html | 三方库下载 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/tools/installers/install_torch.sh | https://download.pytorch.org/whl/torch_stable.html | 三方库下载 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/tools/installers/install_torch.sh | https://download.pytorch.org/whl/torch_stable.html | 三方库下载 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/tools/installers/install_torch.sh | https://download.pytorch.org/whl/torch_stable.html | 三方库下载 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/tools/installers/install_torch.sh | https://download.pytorch.org/whl/torch_stable.html | 三方库下载 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/tools/installers/install_torch.sh | https://download.pytorch.org/whl/torch_stable.html | 三方库下载 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/tools/installers/install_torch.sh | https://download.pytorch.org/whl/torch_stable.html | 三方库下载 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/tools/setup_anaconda.sh | https://repo.continuum.io/miniconda/Miniconda3-latest-Linux-x86_64.sh | miniconda链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/utils/pack_model.sh | shinjiw@ieee.org | 作者邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/ESPnet_for_PyTorch/utils/synth_wav.sh | http://kaldi-asr.org/models/8/0008_sitw_v2_1a.tar.gz | 模型相关说明 | \ No newline at end of file diff --git a/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/public_address_statement.md b/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/public_address_statement.md index 432b1cb76ab478e553de68e5d31222f8cb2ca943..aad84a12dd91bafdb15e65e99414eb24b5e17050 100644 --- a/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/public_address_statement.md @@ -1,225 +1,109 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ------ |------|--------------------------------|------------------|---------| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-wmt18en2de.sh | Vgg_Transformer_for_PyTorch/examples/backtranslation/prepare-wmt18en2de.sh | https://github.com/facebookresearch/MIXER/blob/master/prepareData.sh | 源码实现 | -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/main/examples/fast_noisy_channel/README.md | Vgg_Transformer_for_PyTorch/examples/backtranslation/prepare-wmt18en2de.sh | https://github.com/moses-smt/mosesdecoder.git | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-wmt18en2de.sh | Vgg_Transformer_for_PyTorch/examples/backtranslation/prepare-wmt18en2de.sh | https://github.com/rsennrich/subword-nmt.git | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-de-monolingual.sh | Vgg_Transformer_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2007.de.shuffled.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-de-monolingual.sh | Vgg_Transformer_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2008.de.shuffled.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-de-monolingual.sh | Vgg_Transformer_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2009.de.shuffled.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | Vgg_Transformer_for_PyTorch/examples/backtranslation/prepare-wmt18en2de.sh | http://statmt.org/wmt13/training-parallel-europarl-v7.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-de-monolingual.sh | Vgg_Transformer_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2010.de.shuffled.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | Vgg_Transformer_for_PyTorch/examples/backtranslation/prepare-wmt18en2de.sh | http://statmt.org/wmt13/training-parallel-commoncrawl.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-de-monolingual.sh | Vgg_Transformer_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2011.de.shuffled.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | Vgg_Transformer_for_PyTorch/examples/backtranslation/prepare-wmt18en2de.sh | http://data.statmt.org/wmt18/translation-task/training-parallel-nc-v13.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-de-monolingual.sh | Vgg_Transformer_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2012.de.shuffled.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | Vgg_Transformer_for_PyTorch/examples/backtranslation/prepare-wmt18en2de.sh | http://data.statmt.org/wmt18/translation-task/rapid2016.tgz | 模型相关说明 | -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-de-monolingual.sh | Vgg_Transformer_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2013.de.shuffled.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | Vgg_Transformer_for_PyTorch/examples/backtranslation/prepare-wmt18en2de.sh | http://data.statmt.org/wmt17/translation-task/dev.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-de-monolingual.sh | Vgg_Transformer_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh | http://www.statmt.org/wmt15/training-monolingual-news-crawl-v2/news.2014.de.shuffled.v2.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | Vgg_Transformer_for_PyTorch/examples/backtranslation/prepare-wmt18en2de.sh | http://statmt.org/wmt14/test-full.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-de-monolingual.sh | Vgg_Transformer_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh | http://data.statmt.org/wmt16/translation-task/news.2015.de.shuffled.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-de-monolingual.sh | Vgg_Transformer_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh | http://data.statmt.org/wmt17/translation-task/news.2016.de.shuffled.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-de-monolingual.sh | Vgg_Transformer_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh | http://data.statmt.org/wmt18/translation-task/news.2017.de.shuffled.deduped.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-wmt18en2de.sh | Vgg_Transformer_for_PyTorch/examples/backtranslation/sacrebleu.sh | https://github.com/rsennrich/subword-nmt.git | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-wmt18en2de.sh | Vgg_Transformer_for_PyTorch/examples/backtranslation/tokenized_bleu.sh | https://github.com/rsennrich/subword-nmt.git | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/byte_level_bpe/get_data.sh | Vgg_Transformer_for_PyTorch/examples/byte_level_bpe/get_data.sh | https://wit3.fbk.eu/archive/2017-01-trnted/texts/fr/en/fr-en.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/unsupervised_quality_estimation/README.md | Vgg_Transformer_for_PyTorch/examples/criss/download_and_preprocess_flores_test.sh | https://github.com/facebookresearch/flores | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/laser/README.md | Vgg_Transformer_for_PyTorch/examples/criss/download_and_preprocess_tatoeba.sh | https://github.com/facebookresearch/LASER | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_flores_data.sh | Vgg_Transformer_for_PyTorch/examples/criss/download_and_preprocess_flores_test.sh | https://github.com/facebookresearch/flores/raw/master/data/wikipedia_en_ne_si_test_sets.tgz | 源码实现 | -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/main/examples/fast_noisy_channel/README.md | Vgg_Transformer_for_PyTorch/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | https://github.com/moses-smt/mosesdecoder.git | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | Vgg_Transformer_for_PyTorch/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | http://statmt.org/wmt13/training-parallel-europarl-v7.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | Vgg_Transformer_for_PyTorch/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | http://statmt.org/wmt13/training-parallel-commoncrawl.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | Vgg_Transformer_for_PyTorch/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | http://data.statmt.org/wmt18/translation-task/training-parallel-nc-v13.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | Vgg_Transformer_for_PyTorch/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | http://data.statmt.org/wmt18/translation-task/rapid2016.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-wmt18en2de.sh | Vgg_Transformer_for_PyTorch/examples/language_model/prepare-wikitext-103.sh | https://github.com/facebookresearch/MIXER/blob/master/prepareData.sh | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | Vgg_Transformer_for_PyTorch/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | http://data.statmt.org/wmt17/translation-task/dev.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | Vgg_Transformer_for_PyTorch/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | http://statmt.org/wmt14/test-full.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/language_model/prepare-wikitext-103.sh | Vgg_Transformer_for_PyTorch/examples/language_model/prepare-wikitext-103.sh | https://s3.amazonaws.com/research.metamind.io/wikitext/wikitext-103-v1.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/fast_noisy_channel/README.md | Vgg_Transformer_for_PyTorch/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | https://github.com/glample/fastBPE.git | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/fast_noisy_channel/README.md | Vgg_Transformer_for_PyTorch/examples/m2m_100/install_dependecies.sh | https://github.com/moses-smt/mosesdecoder.git | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/m2m_100/install_dependecies.sh | 
Vgg_Transformer_for_PyTorch/examples/m2m_100/install_dependecies.sh | https://github.com/rsennrich/wmt16-scripts.git | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/m2m_100/install_dependecies.sh | Vgg_Transformer_for_PyTorch/examples/m2m_100/install_dependecies.sh | https://github.com/neubig/kytea.git | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/m2m_100/install_dependecies.sh | Vgg_Transformer_for_PyTorch/examples/m2m_100/install_dependecies.sh | https://bitbucket.org/eunjeon/mecab-ko/downloads/mecab-0.996-ko-0.9.2.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/m2m_100/install_dependecies.sh | Vgg_Transformer_for_PyTorch/examples/m2m_100/install_dependecies.sh | https://bitbucket.org/eunjeon/mecab-ko-dic/downloads/mecab-ko-dic-2.1.1-20180720.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/m2m_100/install_dependecies.sh | Vgg_Transformer_for_PyTorch/examples/m2m_100/install_dependecies.sh | https://github.com/anoopkunchukuttan/indic_nlp_resources.git | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/m2m_100/install_dependecies.sh | Vgg_Transformer_for_PyTorch/examples/m2m_100/install_dependecies.sh | http://lotus.kuee.kyoto-u.ac.jp/WAT/my-en-data/wat2020.my-en.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/roberta/preprocess_GLUE_tasks.sh | Vgg_Transformer_for_PyTorch/examples/roberta/multiprocessing_bpe_encoder.py | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/encoder.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/roberta/preprocess_GLUE_tasks.sh | Vgg_Transformer_for_PyTorch/examples/roberta/multiprocessing_bpe_encoder.py | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/vocab.bpe | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/roberta/preprocess_GLUE_tasks.sh | Vgg_Transformer_for_PyTorch/examples/roberta/preprocess_GLUE_tasks.sh | https://gist.github.com/W4ngatang/60c2bdb54d156a41194446737ce03e2e | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/roberta/preprocess_GLUE_tasks.sh | Vgg_Transformer_for_PyTorch/examples/roberta/preprocess_GLUE_tasks.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/encoder.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/roberta/preprocess_GLUE_tasks.sh | Vgg_Transformer_for_PyTorch/examples/roberta/preprocess_GLUE_tasks.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/vocab.bpe | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/roberta/preprocess_GLUE_tasks.sh | Vgg_Transformer_for_PyTorch/examples/roberta/preprocess_GLUE_tasks.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/dict.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/roberta/preprocess_GLUE_tasks.sh | Vgg_Transformer_for_PyTorch/examples/roberta/preprocess_RACE.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/encoder.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/roberta/preprocess_GLUE_tasks.sh | Vgg_Transformer_for_PyTorch/examples/roberta/preprocess_RACE.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/vocab.bpe | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/roberta/preprocess_GLUE_tasks.sh | 
Vgg_Transformer_for_PyTorch/examples/roberta/preprocess_RACE.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/dict.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/wav2vec/xlsr/README.md | Vgg_Transformer_for_PyTorch/examples/speech_to_text/prep_covost_data.py | https://github.com/facebookresearch/covost | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/speech_to_text/prep_covost_data.py | Vgg_Transformer_for_PyTorch/examples/speech_to_text/prep_covost_data.py | https://dl.fbaipublicfiles.com/covost/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-wmt18en2de.sh | Vgg_Transformer_for_PyTorch/examples/translation/prepare-iwslt14.sh | https://github.com/facebookresearch/MIXER/blob/master/prepareData.sh | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/fast_noisy_channel/README.md | Vgg_Transformer_for_PyTorch/examples/translation/prepare-iwslt14.sh | https://github.com/moses-smt/mosesdecoder.git | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-wmt18en2de.sh | Vgg_Transformer_for_PyTorch/examples/translation/prepare-iwslt14.sh | https://github.com/rsennrich/subword-nmt.git | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/pay_less_attention_paper/README.md | Vgg_Transformer_for_PyTorch/examples/translation/prepare-iwslt14.sh | https://wit3.fbk.eu/archive/2014-01/texts/de/en/de-en.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-wmt18en2de.sh | Vgg_Transformer_for_PyTorch/examples/translation/prepare-wmt14en2de.sh | https://github.com/facebookresearch/MIXER/blob/master/prepareData.sh | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/fast_noisy_channel/README.md | Vgg_Transformer_for_PyTorch/examples/translation/prepare-wmt14en2de.sh | https://github.com/moses-smt/mosesdecoder.git | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-iwslt17-multilingual.sh | Vgg_Transformer_for_PyTorch/examples/translation/prepare-iwslt17-multilingual.sh | https://wit3.fbk.eu/archive/2017-01-trnted/texts/de/en/de-en.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-wmt18en2de.sh | Vgg_Transformer_for_PyTorch/examples/translation/prepare-wmt14en2de.sh | https://github.com/rsennrich/subword-nmt.git | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/byte_level_bpe/get_data.sh | Vgg_Transformer_for_PyTorch/examples/translation/prepare-iwslt17-multilingual.sh | https://wit3.fbk.eu/archive/2017-01-trnted/texts/fr/en/fr-en.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | Vgg_Transformer_for_PyTorch/examples/translation/prepare-wmt14en2de.sh | http://statmt.org/wmt13/training-parallel-europarl-v7.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | Vgg_Transformer_for_PyTorch/examples/translation/prepare-wmt14en2de.sh | http://statmt.org/wmt13/training-parallel-commoncrawl.tgz | 模型相关说明 | -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py | Vgg_Transformer_for_PyTorch/examples/translation/prepare-wmt14en2de.sh | http://data.statmt.org/wmt17/translation-task/training-parallel-nc-v12.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | Vgg_Transformer_for_PyTorch/examples/translation/prepare-wmt14en2de.sh | http://data.statmt.org/wmt17/translation-task/dev.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | Vgg_Transformer_for_PyTorch/examples/translation/prepare-wmt14en2de.sh | http://statmt.org/wmt14/test-full.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/conv_seq2seq/README.md | Vgg_Transformer_for_PyTorch/examples/translation/prepare-wmt14en2de.sh | https://arxiv.org/abs/1705.03122 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2de.sh | Vgg_Transformer_for_PyTorch/examples/translation/prepare-wmt14en2de.sh | http://statmt.org/wmt14/training-parallel-nc-v9.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-wmt18en2de.sh | Vgg_Transformer_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh | https://github.com/facebookresearch/MIXER/blob/master/prepareData.sh | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/fast_noisy_channel/README.md | Vgg_Transformer_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh | https://github.com/moses-smt/mosesdecoder.git | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-wmt18en2de.sh | Vgg_Transformer_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh | https://github.com/rsennrich/subword-nmt.git | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | Vgg_Transformer_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh | http://statmt.org/wmt13/training-parallel-europarl-v7.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | Vgg_Transformer_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh | http://statmt.org/wmt13/training-parallel-commoncrawl.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh | Vgg_Transformer_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh | http://statmt.org/wmt13/training-parallel-un.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2de.sh | Vgg_Transformer_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh | http://statmt.org/wmt14/training-parallel-nc-v9.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh | Vgg_Transformer_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh | http://statmt.org/wmt10/training-giga-fren.tar | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | 
Vgg_Transformer_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh | http://statmt.org/wmt14/test-full.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation_moe/README.md | Vgg_Transformer_for_PyTorch/examples/translation_moe/score.py | https://arxiv.org/abs/1902.07816 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/modules/adaptive_softmax.py | Vgg_Transformer_for_PyTorch/fairseq/criterions/adaptive_loss.py | http://arxiv.org/abs/1609.04309 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/fairseq_incremental_decoder.py | Vgg_Transformer_for_PyTorch/fairseq/models/fairseq_incremental_decoder.py | http://www.telesens.co/2019/04/21/understanding-incremental-decoding-in-fairseq/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/conv_seq2seq/README.md | Vgg_Transformer_for_PyTorch/fairseq/models/fconv.py | https://arxiv.org/abs/1705.03122 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/conv_seq2seq/README.md | Vgg_Transformer_for_PyTorch/fairseq/models/fconv.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt14.v2.en-fr.fconv-py.tar.bz2 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/conv_seq2seq/README.md | Vgg_Transformer_for_PyTorch/fairseq/models/fconv.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt14.en-de.fconv-py.tar.bz2 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/conv_seq2seq/README.md | Vgg_Transformer_for_PyTorch/fairseq/models/fconv.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt17.v2.en-de.fconv-py.tar.bz2 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/fconv_self_att.py | Vgg_Transformer_for_PyTorch/fairseq/models/fconv_self_att.py | https://dl.fbaipublicfiles.com/fairseq/models/stories_checkpoint.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/fconv_self_att.py | Vgg_Transformer_for_PyTorch/fairseq/models/fconv_self_att.py | https://dl.fbaipublicfiles.com/fairseq/models/stories_checkpoint.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/stories/README.md | Vgg_Transformer_for_PyTorch/fairseq/models/fconv_self_att.py | https://dl.fbaipublicfiles.com/fairseq/data/stories_test.tar.bz2 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/lightconv.py | Vgg_Transformer_for_PyTorch/fairseq/models/lightconv.py | https://openreview.net/pdf?id=SkVhlh09tX | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/pay_less_attention_paper/README.md | Vgg_Transformer_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/iwslt14.de-en.lightconv.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/pay_less_attention_paper/README.md | Vgg_Transformer_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/iwslt14.de-en.dynamicconv.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/pay_less_attention_paper/README.md | Vgg_Transformer_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.lightconv.tar.gz | 模型相关说明 | -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/main/examples/pay_less_attention_paper/README.md | Vgg_Transformer_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.dynamicconv.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/pay_less_attention_paper/README.md | Vgg_Transformer_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.lightconv-glu.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/pay_less_attention_paper/README.md | Vgg_Transformer_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.dynamicconv-glu.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/pay_less_attention_paper/README.md | Vgg_Transformer_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.lightconv-glu.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/pay_less_attention_paper/README.md | Vgg_Transformer_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.dynamicconv-glu.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/pay_less_attention_paper/README.md | Vgg_Transformer_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt14.en-fr.joined-dict.lightconv-glu.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/pay_less_attention_paper/README.md | Vgg_Transformer_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt14.en-fr.joined-dict.dynamicconv-glu.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/pay_less_attention_paper/README.md | Vgg_Transformer_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt17.zh-en.lightconv-glu.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/pay_less_attention_paper/README.md | Vgg_Transformer_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt17.zh-en.dynamicconv-glu.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/pointer_generator/pointer_generator_src/transformer_pg.py | Vgg_Transformer_for_PyTorch/fairseq/models/transformer.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/README.md | Vgg_Transformer_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt14.en-fr.joined-dict.transformer.tar.bz2 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/README.md | Vgg_Transformer_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt16.en-de.joined-dict.transformer.tar.bz2 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/README.md | Vgg_Transformer_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt18.en-de.ensemble.tar.gz | 模型相关说明 | -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/main/examples/wmt19/README.md | Vgg_Transformer_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-de.joined-dict.ensemble.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/wmt19/README.md | Vgg_Transformer_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-ru.ensemble.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/wmt19/README.md | Vgg_Transformer_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.de-en.joined-dict.ensemble.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/wmt19/README.md | Vgg_Transformer_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.ru-en.ensemble.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer/transformer_legacy.py | Vgg_Transformer_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-de.joined-dict.single_model.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer/transformer_legacy.py | Vgg_Transformer_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-ru.single_model.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer/transformer_legacy.py | Vgg_Transformer_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.de-en.joined-dict.single_model.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer/transformer_legacy.py | Vgg_Transformer_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.ru-en.single_model.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/language_model/README.adaptive_inputs.md | Vgg_Transformer_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/adaptive_lm_gbw_huge.tar.bz2 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/language_model/README.adaptive_inputs.md | Vgg_Transformer_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/adaptive_lm_wiki103.v2.tar.bz2 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer_lm.py | Vgg_Transformer_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.en.tar.bz2 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer_lm.py | Vgg_Transformer_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.de.tar.bz2 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer_lm.py | Vgg_Transformer_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.ru.tar.bz2 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/modules/adaptive_softmax.py | Vgg_Transformer_for_PyTorch/fairseq/modules/adaptive_softmax.py | http://arxiv.org/abs/1609.04309 | 参考论文地址 | -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/main/fairseq/modules/character_token_embedder.py | Vgg_Transformer_for_PyTorch/fairseq/modules/character_token_embedder.py | https://arxiv.org/abs/1505.00387 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/nonautoregressive_translation/README.md | Vgg_Transformer_for_PyTorch/fairseq/modules/dynamic_crf_layer.py | https://arxiv.org/abs/1910.11555 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/modules/dynamic_crf_layer.py | Vgg_Transformer_for_PyTorch/fairseq/modules/dynamic_crf_layer.py | https://github.com/kmkurn/pytorch-crf/blob/master/torchcrf/__init__.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/modules/gelu.py | Vgg_Transformer_for_PyTorch/fairseq/modules/gelu.py | https://github.com/hendrycks/GELUs | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/layerdrop/README.md | Vgg_Transformer_for_PyTorch/fairseq/modules/layer_drop.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/modules/sparse_multihead_attention.py | Vgg_Transformer_for_PyTorch/fairseq/modules/sparse_multihead_attention.py | https://arxiv.org/pdf/1904.10509.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/modules/vggblock.py | Vgg_Transformer_for_PyTorch/fairseq/modules/vggblock.py | https://arxiv.org/pdf/1409.1556.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/optim/adafactor.py | Vgg_Transformer_for_PyTorch/fairseq/optim/adafactor.py | https://arxiv.org/abs/1804.04235 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/optim/adam.py | Vgg_Transformer_for_PyTorch/fairseq/optim/adam.py | https://arxiv.org/abs/1711.05101 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/optim/adam.py | Vgg_Transformer_for_PyTorch/fairseq/optim/adamax.py | https://arxiv.org/abs/1412.6980 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/optim/bmuf.py | Vgg_Transformer_for_PyTorch/fairseq/optim/bmuf.py | https://ieeexplore.ieee.org/document/7472805 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/optim/adam.py | Vgg_Transformer_for_PyTorch/fairseq/optim/adam.py | https://arxiv.org/abs/1412.6980 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/optim/adam.py | Vgg_Transformer_for_PyTorch/fairseq/optim/adam.py | https://openreview.net/forum?id=ryQu7f-RZ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/optim/adam.py | Vgg_Transformer_for_PyTorch/fairseq/optim/fused_adam.py | https://arxiv.org/abs/1412.6980 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/optim/adam.py | Vgg_Transformer_for_PyTorch/fairseq/optim/fused_adam.py | https://openreview.net/forum?id=ryQu7f-RZ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/scaling_nmt/README.md | Vgg_Transformer_for_PyTorch/fairseq/scoring/tokenizer.py | https://github.com/mjpost/sacrebleu | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/cross_lingual_language_model/README.md | Vgg_Transformer_for_PyTorch/fairseq/tasks/cross_lingual_lm.py | https://arxiv.org/pdf/1901.07291.pdf | 参考论文地址 | -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/main/examples/nonautoregressive_translation/README.md | Vgg_Transformer_for_PyTorch/fairseq/tasks/translation_lev.py | https://arxiv.org/abs/1905.11006 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/tasks/translation_lev.py | Vgg_Transformer_for_PyTorch/fairseq/tasks/translation_lev.py | https://www.aclweb.org/anthology/2020.acl-main.325/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/tests/speech_recognition/asr_test_base.py | Vgg_Transformer_for_PyTorch/tests/speech_recognition/asr_test_base.py | https://fburl.com/batch_first_example | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/tests/speech_recognition/asr_test_base.py | Vgg_Transformer_for_PyTorch/tests/speech_recognition/asr_test_base.py | https://fburl.com/batch_first_example | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/unsupervised_quality_estimation/README.md | Vgg_Transformer_for_PyTorch/examples/criss/unsupervised_mt/eval.sh | https://github.com/moses-smt/mosesdecoder | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/latent_depth/latent_depth_src/multilingual_translation_latent_depth.py | Vgg_Transformer_for_PyTorch/examples/latent_depth/latent_depth_src/multilingual_translation_latent_depth.py | https://arxiv.org/pdf/2009.13102.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/m2m_100/tokenizers/tokenizer_ar.sh | Vgg_Transformer_for_PyTorch/examples/m2m_100/tokenizers/tokenizer_ar.sh | http://alt.qcri.org/tools/arabic-normalizer/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/m2m_100/tokenizers/tokenizer_ar.sh | Vgg_Transformer_for_PyTorch/examples/m2m_100/tokenizers/tokenizer_ar.sh | http://alt.qcri.org/tools/arabic-normalizer/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/roberta/commonsense_qa/download_cqa_data.sh | Vgg_Transformer_for_PyTorch/examples/roberta/commonsense_qa/download_cqa_data.sh | https://s3.amazonaws.com/commensenseqa/train_rand_split.jsonl | 模型参数相关配置 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/roberta/commonsense_qa/download_cqa_data.sh | Vgg_Transformer_for_PyTorch/examples/roberta/commonsense_qa/download_cqa_data.sh | https://s3.amazonaws.com/commensenseqa/dev_rand_split.jsonl | 模型参数相关配置 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/pointer_generator/pointer_generator_src/transformer_pg.py | Vgg_Transformer_for_PyTorch/examples/pointer_generator/pointer_generator_src/transformer_pg.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/roberta/commonsense_qa/download_cqa_data.sh | Vgg_Transformer_for_PyTorch/examples/roberta/commonsense_qa/download_cqa_data.sh | https://s3.amazonaws.com/commensenseqa/test_rand_split_no_answers.jsonl | 模型参数相关配置 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/pointer_generator/README.md | Vgg_Transformer_for_PyTorch/examples/pointer_generator/pointer_generator_src/transformer_pg.py | https://arxiv.org/abs/1704.04368 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/roberta/preprocess_GLUE_tasks.sh | Vgg_Transformer_for_PyTorch/examples/roberta/commonsense_qa/download_cqa_data.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/dict.txt | 
模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/simultaneous_translation/utils/functions.py | Vgg_Transformer_for_PyTorch/examples/simultaneous_translation/utils/functions.py | https://arxiv.org/pdf/1712.05382.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/simultaneous_translation/utils/monotonic_attention.py | Vgg_Transformer_for_PyTorch/examples/simultaneous_translation/utils/latency.py | https://arxiv.org/abs/1906.05218 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/speech_recognition/models/vggtransformer.py | Vgg_Transformer_for_PyTorch/examples/speech_recognition/models/vggtransformer.py | https://arxiv.org/abs/1904.11660 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation_moe/README.md | Vgg_Transformer_for_PyTorch/examples/translation_moe/translation_moe_src/logsumexp_moe.py | https://arxiv.org/abs/1902.07816 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation_moe/README.md | Vgg_Transformer_for_PyTorch/examples/translation_moe/translation_moe_src/translation_moe.py | https://arxiv.org/abs/1902.07816 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/clib/libnat_cuda/binding.cpp | Vgg_Transformer_for_PyTorch/fairseq/clib/libnat_cuda/binding.cpp | https://github.com/1ytic/pytorch-edit-distance | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/roberta/preprocess_GLUE_tasks.sh | Vgg_Transformer_for_PyTorch/fairseq/data/encoders/gpt2_bpe.py | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/encoder.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/roberta/preprocess_GLUE_tasks.sh | Vgg_Transformer_for_PyTorch/fairseq/data/encoders/gpt2_bpe.py | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/vocab.bpe | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/data/encoders/gpt2_bpe_utils.py | Vgg_Transformer_for_PyTorch/fairseq/data/encoders/gpt2_bpe_utils.py | https://github.com/openai/gpt-2/blob/master/src/encoder.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/data/audio/speech_to_text_dataset.py | Vgg_Transformer_for_PyTorch/fairseq/data/audio/speech_to_text_dataset.py | https://arxiv.org/abs/1907.05019 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/bart/model.py | Vgg_Transformer_for_PyTorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.base.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/bart/model.py | Vgg_Transformer_for_PyTorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/bart/model.py | Vgg_Transformer_for_PyTorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.mnli.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/bart/model.py | Vgg_Transformer_for_PyTorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.cnn.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/bart/model.py | Vgg_Transformer_for_PyTorch/fairseq/models/bart/model.py | 
http://dl.fbaipublicfiles.com/fairseq/models/bart.large.xsum.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model.py | Vgg_Transformer_for_PyTorch/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.base.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model.py | Vgg_Transformer_for_PyTorch/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model.py | Vgg_Transformer_for_PyTorch/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.mnli.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_camembert.py | Vgg_Transformer_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model.py | Vgg_Transformer_for_PyTorch/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.wsc.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_camembert.py | Vgg_Transformer_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/hub_interface.py | Vgg_Transformer_for_PyTorch/fairseq/models/roberta/hub_interface.py | https://github.com/pytorch/fairseq/issues/1306 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_camembert.py | Vgg_Transformer_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_camembert.py | Vgg_Transformer_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-large.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_camembert.py | Vgg_Transformer_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-ccnet.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_camembert.py | Vgg_Transformer_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-ccnet-4gb.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_camembert.py | Vgg_Transformer_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-wikipedia-4gb.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_camembert.py | Vgg_Transformer_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-oscar-4gb.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_xlmr.py | 
Vgg_Transformer_for_PyTorch/fairseq/models/roberta/model_xlmr.py | http://dl.fbaipublicfiles.com/fairseq/models/xlmr.base.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_xlmr.py | Vgg_Transformer_for_PyTorch/fairseq/models/roberta/model_xlmr.py | http://dl.fbaipublicfiles.com/fairseq/models/xlmr.large.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/berard.py | Vgg_Transformer_for_PyTorch/fairseq/models/speech_to_text/berard.py | https://arxiv.org/abs/1802.04200 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/berard.py | Vgg_Transformer_for_PyTorch/fairseq/models/speech_to_text/berard.py | https://github.com/eske/seq2seq | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/berard.py | Vgg_Transformer_for_PyTorch/fairseq/models/speech_to_text/berard.py | https://github.com/eske/seq2seq/blob/master/config/LibriSpeech/AST.yaml | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/berard.py | Vgg_Transformer_for_PyTorch/fairseq/models/speech_to_text/berard.py | https://github.com/eske/seq2seq/blob/master/translate/models.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/berard.py | Vgg_Transformer_for_PyTorch/fairseq/models/speech_to_text/berard.py | https://arxiv.org/abs/1409.0473 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/modules/convolution.py | Vgg_Transformer_for_PyTorch/fairseq/models/speech_to_text/s2t_transformer.py | https://arxiv.org/abs/1911.08460 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/pointer_generator/pointer_generator_src/transformer_pg.py | Vgg_Transformer_for_PyTorch/fairseq/models/speech_to_text/s2t_transformer.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/berard.py | Vgg_Transformer_for_PyTorch/fairseq/models/speech_to_text/berard.py | https://arxiv.org/abs/1409.0473 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/berard.py | Vgg_Transformer_for_PyTorch/fairseq/models/speech_to_text/berard.py | https://arxiv.org/abs/1802.04200 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/speech_to_text/README.md | Vgg_Transformer_for_PyTorch/fairseq/models/speech_to_text/berard.py | https://arxiv.org/abs/1909.06515 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/berard.py | Vgg_Transformer_for_PyTorch/fairseq/models/speech_to_text/berard.py | https://arxiv.org/pdf/2002.01320.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/speech_to_text/README.md | Vgg_Transformer_for_PyTorch/fairseq/models/speech_to_text/berard.py | https://arxiv.org/abs/2006.12124 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/megatron_11b/README.md | Vgg_Transformer_for_PyTorch/fairseq/model_parallel/modules/multihead_attention.py | https://arxiv.org/pdf/1909.08053.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/megatron_11b/README.md | 
Vgg_Transformer_for_PyTorch/fairseq/model_parallel/modules/transformer_layer.py | https://arxiv.org/pdf/1909.08053.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/megatron_11b/README.md | Vgg_Transformer_for_PyTorch/fairseq/model_parallel/modules/transformer_layer.py | https://arxiv.org/pdf/1909.08053.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/optim/lr_scheduler/cosine_lr_scheduler.py | Vgg_Transformer_for_PyTorch/fairseq/optim/lr_scheduler/cosine_lr_scheduler.py | https://arxiv.org/pdf/1608.03983.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/optim/lr_scheduler/triangular_lr_scheduler.py | Vgg_Transformer_for_PyTorch/fairseq/optim/lr_scheduler/triangular_lr_scheduler.py | https://arxiv.org/pdf/1506.01186.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/optim/lr_scheduler/tri_stage_lr_scheduler.py | Vgg_Transformer_for_PyTorch/fairseq/optim/lr_scheduler/tri_stage_lr_scheduler.py | https://arxiv.org/pdf/1904.08779.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/.gitmodules | Vgg_Transformer_for_PyTorch/.gitmodules | https://github.com/ngoyal2707/Megatron-LM | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/setup.py | Vgg_Transformer_for_PyTorch/setup.py | https://stackoverflow.com/a/54128391 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/setup.py | Vgg_Transformer_for_PyTorch/setup.py | https://download.pytorch.org/whl/cpu/torch-1.7.0%2Bcpu-cp36-cp36m-linux_x86_64.whl | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/setup.py | Vgg_Transformer_for_PyTorch/setup.py | https://bit.ly/2NLVsgE | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/constrained_decoding/README.md | Vgg_Transformer_for_PyTorch/setup.py | https://github.com/pytorch/fairseq | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/docs/conf.py | Vgg_Transformer_for_PyTorch/docs/conf.py | http://docs.scipy.org/doc/numpy/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/speech_to_speech/benchmarking/README.md | Vgg_Transformer_for_PyTorch/docs/conf.py | https://docs.python.org/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/docs/conf.py | Vgg_Transformer_for_PyTorch/docs/conf.py | https://pytorch.org/docs/master/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/file_utils.py | Vgg_Transformer_for_PyTorch/fairseq/file_utils.py | https://github.com/allenai/allennlp | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/roberta/README.md | Vgg_Transformer_for_PyTorch/fairseq/file_utils.py | https://github.com/huggingface | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/layerdrop/README.md | Vgg_Transformer_for_PyTorch/fairseq/checkpoint_utils.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/constrained_decoding/README.md | Vgg_Transformer_for_PyTorch/fairseq/search.py | https://www.aclweb.org/anthology/N18-1119/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/constrained_decoding/README.md | Vgg_Transformer_for_PyTorch/fairseq/search.py | https://www.aclweb.org/anthology/N19-1090/ | 模型相关说明 | -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/main/docs/make.bat | Vgg_Transformer_for_PyTorch/docs/make.bat | http://sphinx-doc.org/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_dlm/sequence_generator/multichannel_search.py | Vgg_Transformer_for_PyTorch/fairseq/search.py | https://arxiv.org/abs/1904.09751 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/search.py | Vgg_Transformer_for_PyTorch/fairseq/search.py | https://arxiv.org/abs/1611.08562 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/scripts/build_sym_alignment.py | Vgg_Transformer_for_PyTorch/scripts/build_sym_alignment.py | http://github.com/clab/fast_align | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/scripts/build_sym_alignment.py | Vgg_Transformer_for_PyTorch/scripts/build_sym_alignment.py | http://github.com/moses-smt/mosesdecoder | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/scripts/build_sym_alignment.py | Vgg_Transformer_for_PyTorch/scripts/build_sym_alignment.py | http://www.statmt.org/moses/?n=Development.GetStarted | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/latent_depth/README.md | Vgg_Transformer_for_PyTorch/examples/latent_depth/latent_depth_src/models/latent_multilingual_transformer.py | https://arxiv.org/abs/2009.13102 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/latent_depth/README.md | Vgg_Transformer_for_PyTorch/examples/latent_depth/latent_depth_src/models/latent_transformer.py | https://arxiv.org/abs/2009.13102 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/latent_depth/README.md | Vgg_Transformer_for_PyTorch/examples/latent_depth/latent_depth_src/modules/latent_layers.py | https://arxiv.org/abs/2009.13102 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/latent_depth/README.md | Vgg_Transformer_for_PyTorch/examples/latent_depth/latent_depth_src/models/latent_transformer.py | https://arxiv.org/abs/2009.13102 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/data/audio/feature_transforms/specaugment.py | Vgg_Transformer_for_PyTorch/fairseq/data/audio/feature_transforms/specaugment.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|---------------------------------------------------------------------------------------------------------------------------------------------------------------|----------------------------------------------------------------------------------------------------------|-----------| +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh | http://www.statmt.org/wmt15/training-monolingual-news-crawl-v2/news.2014.de.shuffled.v2.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2013.de.shuffled.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2012.de.shuffled.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh | 
http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2011.de.shuffled.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2010.de.shuffled.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2009.de.shuffled.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2008.de.shuffled.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2007.de.shuffled.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh | http://data.statmt.org/wmt18/translation-task/news.2017.de.shuffled.deduped.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh | http://data.statmt.org/wmt17/translation-task/news.2016.de.shuffled.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh | http://data.statmt.org/wmt16/translation-task/news.2015.de.shuffled.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/examples/backtranslation/prepare-wmt18en2de.sh | http://statmt.org/wmt14/test-full.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/examples/backtranslation/prepare-wmt18en2de.sh | http://statmt.org/wmt13/training-parallel-europarl-v7.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/examples/backtranslation/prepare-wmt18en2de.sh | http://statmt.org/wmt13/training-parallel-commoncrawl.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/examples/backtranslation/prepare-wmt18en2de.sh | http://data.statmt.org/wmt18/translation-task/training-parallel-nc-v13.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/examples/backtranslation/prepare-wmt18en2de.sh | http://data.statmt.org/wmt18/translation-task/rapid2016.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/examples/backtranslation/prepare-wmt18en2de.sh | http://data.statmt.org/wmt17/translation-task/dev.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | http://statmt.org/wmt14/test-full.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | http://statmt.org/wmt13/training-parallel-europarl-v7.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | http://statmt.org/wmt13/training-parallel-commoncrawl.tgz | 模型相关说明 | +| 
ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | http://data.statmt.org/wmt18/translation-task/training-parallel-nc-v13.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | http://data.statmt.org/wmt18/translation-task/rapid2016.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | http://data.statmt.org/wmt17/translation-task/dev.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/examples/language_model/prepare-wikitext-103.sh | https://s3.amazonaws.com/research.metamind.io/wikitext/wikitext-103-v1.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/examples/m2m_100/install_dependecies.sh | http://lotus.kuee.kyoto-u.ac.jp/WAT/my-en-data/wat2020.my-en.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/examples/m2m_100/tokenizers/tokenizer_ar.sh | http://alt.qcri.org/tools/arabic-normalizer/ | 工具下载链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/examples/roberta/commonsense_qa/download_cqa_data.sh | https://s3.amazonaws.com/commensenseqa/dev_rand_split.jsonl | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/examples/roberta/commonsense_qa/download_cqa_data.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/dict.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/examples/roberta/commonsense_qa/download_cqa_data.sh | https://s6.amazonaws.com/commensenseqa/train_rand_split.jsonl | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/examples/roberta/commonsense_qa/download_cqa_data.sh | https://s3.amazonaws.com/commensenseqa/test_rand_split_no_answers.jsonl | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/examples/roberta/preprocess_GLUE_tasks.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/vocab.bpe | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/examples/roberta/preprocess_GLUE_tasks.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/encoder.json | 模型参数配置 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/examples/roberta/preprocess_GLUE_tasks.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/dict.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/examples/roberta/preprocess_RACE.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/vocab.bpe | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/examples/roberta/preprocess_RACE.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/encoder.json | 模型参数配置 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/examples/roberta/preprocess_RACE.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/dict.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/examples/speech_recognition/datasets/prepare-librispeech.sh | www.openslr.org/resources/12 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/examples/speech_to_text/prep_covost_data.py | https://dl.fbaipublicfiles.com/covost/ | 数据集链接 | +| 
ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/examples/translation/prepare-wmt14en2de.sh | http://statmt.org/wmt14/test-full.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/examples/translation/prepare-wmt14en2de.sh | http://statmt.org/wmt13/training-parallel-europarl-v7.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/examples/translation/prepare-wmt14en2de.sh | http://statmt.org/wmt13/training-parallel-commoncrawl.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/examples/translation/prepare-wmt14en2de.sh | http://data.statmt.org/wmt17/translation-task/training-parallel-nc-v12.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/examples/translation/prepare-wmt14en2de.sh | http://data.statmt.org/wmt17/translation-task/dev.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/examples/translation/prepare-wmt14en2de.sh | http://statmt.org/wmt14/training-parallel-nc-v9.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh | http://statmt.org/wmt14/test-full.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh | http://statmt.org/wmt14/training-parallel-nc-v9.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh | http://statmt.org/wmt13/training-parallel-un.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh | http://statmt.org/wmt13/training-parallel-europarl-v7.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh | http://statmt.org/wmt13/training-parallel-commoncrawl.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh | http://statmt.org/wmt10/training-giga-fren.tar | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/fairseq/data/encoders/gpt2_bpe.py | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/vocab.bpe | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/fairseq/data/encoders/gpt2_bpe.py | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/encoder.json | 模型参数配置 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.xsum.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.mnli.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.cnn.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.base.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/fairseq/models/fconv.py | 
https://dl.fbaipublicfiles.com/fairseq/models/wmt17.v2.en-de.fconv-py.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/fairseq/models/fconv.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt17.v2.en-de.fconv-py.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/fairseq/models/fconv.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt14.en-de.fconv-py.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/fairseq/models/fconv_self_att.py | https://dl.fbaipublicfiles.com/fairseq/models/stories_checkpoint.tar.gz | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/fairseq/models/fconv_self_att.py | https://dl.fbaipublicfiles.com/fairseq/models/stories_checkpoint.tar.gz | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/fairseq/models/fconv_self_att.py | https://dl.fbaipublicfiles.com/fairseq/data/stories_test.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.lightconv.tar.gz | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.dynamicconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.dynamicconv.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/iwslt14.de-en.dynamicconv.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt17.zh-en.dynamicconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt22.en-de.joined-dict.dynamicconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt14.en-fr.joined-dict.dynamicconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/iwslt14.de-en.lightconv.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt17.zh-en.lightconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.lightconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.lightconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt14.en-fr.joined-dict.lightconv-glu.tar.gz | 模型相关说明 | +| 
ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.base.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.wsc.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.mnli.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-large.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-wikipedia-4gb.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-oscar-4gb.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-ccnet-4gb.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-ccnet.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/fairseq/models/roberta/model_xlmr.py | http://dl.fbaipublicfiles.com/fairseq/models/xlmr.large.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/fairseq/models/roberta/model_xlmr.py | http://dl.fbaipublicfiles.com/fairseq/models/xlmr.base.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.ru-en.ensemble.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.ru-en.single_model.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-ru.ensemble.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-ru.single_model.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/fairseq/models/transformer.py | 
https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-ru.single_model.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-de.joined-dict.ensemble.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-de.joined-dict.single_model.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.de-en.joined-dict.ensemble.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.de-en.joined-dict.single_model.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt18.en-de.ensemble.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt16.en-de.joined-dict.transformer.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt14.en-fr.joined-dict.transformer.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/adaptive_lm_wiki103.v2.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/adaptive_lm_gbw_huge.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.ru.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.ru.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.ru.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Vgg_Transformer_for_PyTorch/setup.py | https://download.pytorch.org/whl/cpu/torch-1.7.0%2Bcpu-cp36-cp36m-linux_x86_64.whl | 模型相关说明 | \ No newline at end of file diff --git a/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/public_address_statement.md b/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/public_address_statement.md index bc472e56faf9102d8865601dd8d2824a8fbd5bb9..db23dc2ceaad78485d7487692608814d993e676e 100644 --- a/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/public_address_statement.md +++ b/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/public_address_statement.md @@ -1,315 +1,70 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|----------------------------------------|---------------------------------------------------------------------------------|------------------------------------------------------------------------------------------------------------------------------------------------------------|----------------| -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git | 
Wenet_Conformer_for_Pytorch/examples/aishell/s0/run.sh | www.openslr.org/resources/33 | aishell开源数据集下载 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git | Wenet_Conformer_for_Pytorch/runtime/android/app/src/main/AndroidManifest.xml | http://schemas.android.com/apk/res/android | 下载配置文件 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git | Wenet_Conformer_for_Pytorch/runtime/android/app/src/main/AndroidManifest.xml | http://schemas.android.com/tools | 下载配置文件 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git | Wenet_Conformer_for_Pytorch/runtime/binding/python/py/hub.py | https://github.com/wenet-e2e/wenet/releases/download/v2.0.1/chs.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git | Wenet_Conformer_for_Pytorch/runtime/binding/python/py/hub.py | https://github.com/wenet-e2e/wenet/releases/download/v2.0.1/en.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git | Wenet_Conformer_for_Pytorch/runtime/binding/python/setup.py | binbzha@qq.com | 邮箱 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git | Wenet_Conformer_for_Pytorch/runtime/binding/python/setup.py | https://github.com/wenet-e2e/wenet | 下载源码 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git | Wenet_Conformer_for_Pytorch/runtime/core/cmake/boost.cmake | https://boostorg.jfrog.io/artifactory/main/release/1.75.0/source/boost_1_75_0.tar.gz | 下载依赖 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git | Wenet_Conformer_for_Pytorch/runtime/core/cmake/bpu.cmake | https://github.com/xingchensong/toolchain_pkg/releases/download/easy_dnn/easy_dnn.0.4.11.tar.gz | 下载依赖 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git | Wenet_Conformer_for_Pytorch/runtime/core/cmake/gflags.cmake | https://github.com/gflags/gflags/archive/v2.2.2.zip | 下载依赖 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git | Wenet_Conformer_for_Pytorch/runtime/core/cmake/glog.cmake | https://github.com/google/glog/archive/v0.4.0.zip | 下载依赖 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git | Wenet_Conformer_for_Pytorch/runtime/core/cmake/grpc.cmake | https://github.com/grpc/grpc | 下载依赖 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git | Wenet_Conformer_for_Pytorch/runtime/core/cmake/gtest.cmake | https://github.com/google/googletest/archive/release-1.11.0.zip | 下载依赖 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git | Wenet_Conformer_for_Pytorch/runtime/core/cmake/ipex.cmake | https://download.pytorch.org/libtorch/cpu/libtorch-cxx11-abi-shared-with-deps-1.13.0%2Bcpu.zip | 下载依赖 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git | Wenet_Conformer_for_Pytorch/runtime/core/cmake/ipex.cmake | https://download.pytorch.org/libtorch/cpu/libtorch-shared-with-deps-1.13.0%2Bcpu.zip | 下载依赖 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git | Wenet_Conformer_for_Pytorch/runtime/core/cmake/ipex.cmake | http://intel-optimized-pytorch.s3.cn-north-1.amazonaws.com.cn/libipex/cpu/libintel-ext-pt-cxx11-abi-1.13.100%2Bcpu.run | 下载依赖 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git | Wenet_Conformer_for_Pytorch/runtime/core/cmake/ipex.cmake | http://intel-optimized-pytorch.s3.cn-north-1.amazonaws.com.cn/libipex/cpu/libintel-ext-pt-1.13.100%2Bcpu.run | 下载依赖 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git | Wenet_Conformer_for_Pytorch/runtime/core/cmake/libtorch.cmake | https://download.pytorch.org/libtorch/cpu/libtorch-win-shared-with-deps-1.13.0%2Bcpu.zip | 下载依赖 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git | Wenet_Conformer_for_Pytorch/runtime/core/cmake/libtorch.cmake | 
https://download.pytorch.org/libtorch/cpu/libtorch-win-shared-with-deps-debug-1.13.0%2Bcpu.zip | 下载依赖 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git | Wenet_Conformer_for_Pytorch/runtime/core/cmake/libtorch.cmake | https://download.pytorch.org/libtorch/cpu/libtorch-cxx11-abi-shared-with-deps-1.13.0%2Bcpu.zip | 下载依赖 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git | Wenet_Conformer_for_Pytorch/runtime/core/cmake/libtorch.cmake | https://download.pytorch.org/libtorch/cu113/libtorch-cxx11-abi-shared-with-deps-1.12.0%2Bcu113.zip | 下载依赖 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git | Wenet_Conformer_for_Pytorch/runtime/core/cmake/libtorch.cmake | https://download.pytorch.org/libtorch/cpu/libtorch-shared-with-deps-1.13.0%2Bcpu.zip | 下载依赖 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git | Wenet_Conformer_for_Pytorch/runtime/core/cmake/libtorch.cmake | https://download.pytorch.org/libtorch/cu113/libtorch-shared-with-deps-1.11.0%2Bcu113.zip | 下载依赖 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git | Wenet_Conformer_for_Pytorch/runtime/core/cmake/libtorch.cmake | https://download.pytorch.org/libtorch/cpu/libtorch-macos-1.13.0.zip | 下载依赖 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git | Wenet_Conformer_for_Pytorch/runtime/core/cmake/onnx.cmake | https://github.com/microsoft/onnxruntime/releases/download/v${ONNX_VERSION}/onnxruntime-win-x64-${ONNX_VERSION}.zip | 下载依赖 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git | Wenet_Conformer_for_Pytorch/runtime/core/cmake/onnx.cmake | https://github.com/microsoft/onnxruntime/releases/download/v${ONNX_VERSION}/onnxruntime-linux-aarch64-${ONNX_VERSION}.tgz | 下载依赖 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git | Wenet_Conformer_for_Pytorch/runtime/core/cmake/onnx.cmake | https://github.com/microsoft/onnxruntime/releases/download/v${ONNX_VERSION}/onnxruntime-linux-x64-${ONNX_VERSION}.tgz | 下载依赖 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git | Wenet_Conformer_for_Pytorch/runtime/core/cmake/onnx.cmake | https://github.com/microsoft/onnxruntime/releases/download/v${ONNX_VERSION}/onnxruntime-osx-x86_64-${ONNX_VERSION}.tgz | 下载依赖 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git | Wenet_Conformer_for_Pytorch/runtime/core/cmake/openfst.cmake | https://github.com/kkm000/openfst/archive/refs/tags/win/1.6.5.1.tar.gz | 下载依赖 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git | Wenet_Conformer_for_Pytorch/runtime/core/cmake/openvino.cmake | https://github.com/openvinotoolkit/openvino.git | 下载依赖 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git | Wenet_Conformer_for_Pytorch/runtime/core/cmake/openvino.cmake | https://storage.openvinotoolkit.org/repositories/openvino/packages/2022.3/windows/w_openvino_toolkit_windows_${VINO_VERSION}.9052.9752fafe8eb_x86_64.zip | 下载依赖 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git | Wenet_Conformer_for_Pytorch/runtime/core/cmake/openvino.cmake | https://storage.openvinotoolkit.org/repositories/openvino/packages/2022.3/linux/l_openvino_toolkit_ubuntu20_${VINO_VERSION}.9052.9752fafe8eb_x86_64.tgz | 下载依赖 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git | Wenet_Conformer_for_Pytorch/runtime/core/cmake/openvino.cmake | https://storage.openvinotoolkit.org/repositories/openvino/packages/2022.3/linux/l_openvino_toolkit_ubuntu18_${VINO_VERSION}.9052.9752fafe8eb_x86_64.tgz | 下载依赖 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git | Wenet_Conformer_for_Pytorch/runtime/core/cmake/openvino.cmake | 
https://storage.openvinotoolkit.org/repositories/openvino/packages/2022.3/linux/l_openvino_toolkit_centos7_${VINO_VERSION}.9052.9752fafe8eb_x86_64.tgz | 下载依赖 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git | Wenet_Conformer_for_Pytorch/runtime/core/cmake/openvino.cmake | https://storage.openvinotoolkit.org/repositories/openvino/packages/2022.3/linux/l_openvino_toolkit_rhel8_${VINO_VERSION}.9052.9752fafe8eb_x86_64.tgz | 下载依赖 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git | Wenet_Conformer_for_Pytorch/runtime/core/cmake/openvino.cmake | https://storage.openvinotoolkit.org/repositories/openvino/packages/2022.3/linux/l_openvino_toolkit_debian9_${VINO_VERSION}.9052.9752fafe8eb_arm64.tgz | 下载依赖 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git | Wenet_Conformer_for_Pytorch/runtime/core/cmake/openvino.cmake | https://storage.openvinotoolkit.org/repositories/openvino/packages/2022.3/macos/m_openvino_toolkit_macos_10_15_${VINO_VERSION}.9052.9752fafe8eb_x86_64.tgz | 下载依赖 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git | Wenet_Conformer_for_Pytorch/runtime/core/cmake/pybind11.cmake | https://github.com/pybind/pybind11/archive/refs/tags/v2.9.2.zip | 下载依赖 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git | Wenet_Conformer_for_Pytorch/runtime/gpu/cuda_decoders/run.sh | https://wenet-1256283475.cos.ap-shanghai.myqcloud.com/models/aishell/20211025_conformer_exp.tar.gz | 下载预训练模型 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git | Wenet_Conformer_for_Pytorch/runtime/gpu/cuda_decoders/run.sh | https://huggingface.co/yuekai/aishell1_tlg_essentials.git | 下载源码 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git | Wenet_Conformer_for_Pytorch/runtime/gpu/Dockerfile/Dockerfile.server | https://github.com/Slyne/ctc_decoder.git | 下载源码 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git | Wenet_Conformer_for_Pytorch/runtime/gpu/scripts/run_qa.sh | https://wenet-1256283475.cos.ap-shanghai.myqcloud.com/models/aishell/20211025_conformer_exp.tar.gz | 下载预训练模型 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git | Wenet_Conformer_for_Pytorch/runtime/gpu/tensorrt/requirements.txt | https://pypi.ngc.nvidia.com | 下载第三方库 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git | Wenet_Conformer_for_Pytorch/runtime/gpu/tensorrt/run_streaming_small_model.sh | http://mobvoi-speech-public.ufile.ucloud.cn/public/wenet/aishell2/20210618_u2pp_conformer_exp.tar.gz | 下载预训练模型 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git | Wenet_Conformer_for_Pytorch/runtime/gpu/tensorrt_fastertransformer/run.sh | https://wenet-1256283475.cos.ap-shanghai.myqcloud.com/models/aishell/20211025_conformer_exp.tar.gz | 下载预训练模型 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git | Wenet_Conformer_for_Pytorch/runtime/gpu/tensorrt_fastertransformer/run_large.sh | https://wenet-1256283475.cos.ap-shanghai.myqcloud.com/models/wenetspeech/20211025_conformer_exp.tar.gz | 下载预训练模型 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git | Wenet_Conformer_for_Pytorch/runtime/libtorch/docker/Dockerfile | zhendong.peng@qq.com | 邮箱 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git | Wenet_Conformer_for_Pytorch/runtime/libtorch/docker/Dockerfile | https://github.com/wenet-e2e/wenet.git | 下载源码 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git | Wenet_Conformer_for_Pytorch/runtime/libtorch/docker/Dockerfile | https://wenet-1256283475.cos.ap-shanghai.myqcloud.com/models/aishell2/ | 下载预训练模型 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git | Wenet_Conformer_for_Pytorch/tools/install_srilm.sh | 
https://github.com/BitSpeech/SRILM/archive/refs/tags/1.7.3.tar.gz | 下载依赖 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git | Wenet_Conformer_for_Pytorch/tools/k2/make_hlg.sh | https://github.com/k2-fsa/icefall.git | 下载依赖 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git | Wenet_Conformer_for_Pytorch/tools/k2/prepare_mmi.sh | https://github.com/k2-fsa/icefall.git | 下载依赖 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git | Wenet_Conformer_for_Pytorch/tools/setup_anaconda.sh | https://repo.continuum.io/miniconda/Miniconda3-latest-Linux-x86_64.sh | 下载依赖 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git | Wenet_Conformer_for_Pytorch/tools/sph2wav.sh | https://www.openslr.org/resources/3/sph2pipe_${sph2pipe_version}.tar.gz | 下载工具脚本 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git | Wenet_Conformer_for_Pytorch/tools/sph2wav.sh | https://sourceforge.net/projects/kaldi/files/sph2pipe_${sph2pipe_version}.tar.gz | 下载工具脚本 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/core/decoder/onnx_asr_model.h | Wenet_Conformer_for_Pytorch/runtime/core/decoder/onnx_asr_model.cc | lizexuan@huya.com | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/examples/vkw2021/s0/local/run_finetune_5h.sh | Wenet_Conformer_for_Pytorch/examples/swbd/s0/run.sh | https://pytorch.org/tutorials/intermediate/dist_tuto.html | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/core/decoder/ctc_prefix_beam_search.cc | Wenet_Conformer_for_Pytorch/runtime/core/decoder/ctc_prefix_beam_search.cc | https://robin1001.github.io/2020/12/11/ctc-search | 源码实现 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/squeezeformer/subsampling.py | Wenet_Conformer_for_Pytorch/wenet/squeezeformer/subsampling.py | https://github.com/kssteven418/Squeezeformer | 源码实现 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/tools/validate_text.pl | Wenet_Conformer_for_Pytorch/tools/validate_dict_dir.pl | jtrmal@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/kunlun/xpu/xpu_util.h | Wenet_Conformer_for_Pytorch/runtime/kunlun/xpu/xpu_asr_model.h | yanzikui@baidu.com | 邮箱地址 | -| 开发引入 | / | Wenet_Conformer_for_Pytorch/examples/wenetspeech/s0/run.sh | https://github.com/wenet-e2e/WenetSpeech | 源码实现 | -| 开发引入 | / | Wenet_Conformer_for_Pytorch/wenet/dataset/processor.py | https://en.wikipedia.org/wiki/CJK_Unified_Ideographs_ | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/examples/aishell2/s0/local/train_lms.sh | Wenet_Conformer_for_Pytorch/examples/aishell/s0/local/aishell_train_lms.sh | http://www.speech.sri.com/projects/srilm/download.html | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/tools/validate_data_dir.sh | Wenet_Conformer_for_Pytorch/examples/timit/local/validate_data_dir.sh | http://kaldi-asr.org/doc/data_prep.html | 相关说明 | -| 开发引入 | / | Wenet_Conformer_for_Pytorch/examples/librispeech/rnnt/run.sh | http://www.openslr.org/resources/11/librispeech-lexicon.txt | 数据集地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/paraformer/paraformer.py | Wenet_Conformer_for_Pytorch/wenet/paraformer/search/ctc_prefix_score.py | https://github.com/alibaba-damo-academy/FunASR | 源码实现 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/utils/scheduler.py | Wenet_Conformer_for_Pytorch/wenet/transformer/encoder.py | https://github.com/espnet/espnet | 源码实现 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/tools/fst/prepare_dict.py | Wenet_Conformer_for_Pytorch/tools/fst/prepare_dict.py | https://github.com/wenet-e2e/wenet/pull/1693 | 源码实现 | -| 
开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/cli/hub.py | Wenet_Conformer_for_Pytorch/runtime/core/decoder/onnx_asr_model.cc | hamddct@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/ios/WenetDemo/WenetDemo.xcodeproj/project.xcworkspace/xcshareddata/IDEWorkspaceChecks.plist | Wenet_Conformer_for_Pytorch/runtime/ios/WenetDemo/WenetDemo.xcodeproj/project.xcworkspace/xcshareddata/IDEWorkspaceChecks.plist | http://www.apple.com/DTDs/PropertyList-1.0.dtd | 相关说明 | -| 开发引入 | / | Wenet_Conformer_for_Pytorch/examples/librispeech/rnnt/run.sh | http://www.openslr.org/resources/11/ | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/transformer/encoder_layer.py | Wenet_Conformer_for_Pytorch/wenet/efficient_conformer/encoder.py | sxc19@mails.tsinghua.edu.cn | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/core/kaldi/util/basic-filebuf.h | Wenet_Conformer_for_Pytorch/runtime/core/kaldi/util/basic-filebuf.h | andrew.c.morrow@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/squeezeformer/subsampling.py | Wenet_Conformer_for_Pytorch/wenet/squeezeformer/encoder.py | https://github.com/kssteven418/Squeezeformer | 源码实现 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/utils/scheduler.py | Wenet_Conformer_for_Pytorch/wenet/transformer/subsampling.py | https://github.com/espnet/espnet | 源码实现 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/core/utils/json.h | Wenet_Conformer_for_Pytorch/runtime/core/utils/json.h | https://github.com/nbsdx/SimpleJSON | 源码实现 | -| 开发引入 | / | Wenet_Conformer_for_Pytorch/runtime/core/toolchains/ios.toolchain.cmake | https://github.com/gerstrong/ios-cmake.git | 源码实现 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/examples/vkw2021/s0/local/run_finetune_5h.sh | Wenet_Conformer_for_Pytorch/examples/aishell2/s0/run.sh | https://pytorch.org/tutorials/intermediate/dist_tuto.html | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/paraformer/paraformer.py | Wenet_Conformer_for_Pytorch/wenet/cif/predictor.py | https://github.com/alibaba-damo-academy/FunASR | 源码实现 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/transformer/asr_model.py | Wenet_Conformer_for_Pytorch/wenet/transformer/asr_model.py | https://github.com/wenet-e2e/wenet/issues/1113 | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/core/kaldi/util/basic-filebuf.h | Wenet_Conformer_for_Pytorch/runtime/core/kaldi/util/basic-filebuf.h | zhangxiongpang@gmail.com | 邮箱地址 | -| 开发引入 | / | Wenet_Conformer_for_Pytorch/examples/aishell/s0/run.sh | https://docs.nvidia.com/deeplearning/nccl/user-guide/docs/env.html | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/core/kaldi/util/basic-filebuf.h | Wenet_Conformer_for_Pytorch/runtime/core/kaldi/util/basic-filebuf.h | bruce.mitchener@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/android/gradle.properties | Wenet_Conformer_for_Pytorch/runtime/android/gradle.properties | https://developer.android.com/topic/libraries/support-library/androidx-rn | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/examples/vkw2021/s0/local/run_finetune_5h.sh | Wenet_Conformer_for_Pytorch/examples/gigaspeech/s0/run.sh | https://pytorch.org/tutorials/intermediate/dist_tuto.html | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/core/kaldi/util/basic-filebuf.h | Wenet_Conformer_for_Pytorch/runtime/core/kaldi/util/basic-filebuf.h | mimomorin@gmail.com | 邮箱地址 | -| 开发引入 | / | 
Wenet_Conformer_for_Pytorch/examples/multi_cn/s0/run.sh | https://docs.nvidia.com/deeplearning/nccl/user-guide/docs/env.html | 相关说明 | -| 开发引入 | / | Wenet_Conformer_for_Pytorch/runtime/libtorch/web/static/image/voice-dictation.svg | http://www.w3.org/2000/svg","http://www.w3.org/1999/xlink | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/examples/aishell2/s0/run.sh | Wenet_Conformer_for_Pytorch/examples/aishell2/s0/run.sh | http://aishell-eval.oss-cn-beijing.aliyuncs.com/TEST%26DEV%20DATA.zip | 数据集地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/core/decoder/onnx_asr_model.h | Wenet_Conformer_for_Pytorch/runtime/core/decoder/onnx_asr_model.h | lizexuan@huya.com | 邮箱地址 | -| 开发引入 | / | Wenet_Conformer_for_Pytorch/runtime/android/app/src/test/java/com/mobvoi/wenet/ExampleUnitTest.java | http://d.android.com/tools/testing | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/paraformer/paraformer.py | Wenet_Conformer_for_Pytorch/wenet/paraformer/paraformer.py | https://github.com/alibaba-damo-academy/FunASR | 源码实现 | -| 开发引入 | / | Wenet_Conformer_for_Pytorch/examples/tedlium3/s0/local/download_data.sh | http://www.openslr.org/resources/51/TEDLIUM_release-3.tgz | 数据集地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/core/cmake/bpu.cmake | Wenet_Conformer_for_Pytorch/runtime/core/cmake/bpu.cmake | https://stackoverflow.com/questions/59915966/unknown-gcc-linker-error-but-builds-sucessfully/59916438#59916438 | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/branchformer/cgmlp.py | Wenet_Conformer_for_Pytorch/wenet/branchformer/cgmlp.py | https://openreview.net/forum?id=RA-zVvZLYIy | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/tools/k2/prepare_mmi.sh | Wenet_Conformer_for_Pytorch/tools/k2/make_hlg.sh | https://github.com/k2-fsa/k2/ | 源码实现 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/kunlun/xpu/xpu_util.h | Wenet_Conformer_for_Pytorch/runtime/kunlun/xpu/conformer_test.cpp | lichaolin@baidu.com | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/core/utils/utils.cc | Wenet_Conformer_for_Pytorch/runtime/core/utils/utils.cc | https://github.com/pytorch/pytorch/blob/master/caffe2/operators/top_k.cc | 源码实现 | -| 开发引入 | / | Wenet_Conformer_for_Pytorch/runtime/core/toolchains/ios.toolchain.cmake | https://github.com/leetal/ios-cmake.git | 源码实现 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/android/gradle.properties | Wenet_Conformer_for_Pytorch/runtime/android/gradle.properties | http://www.gradle.org/docs/current/userguide/multi_project_builds.html#sec:decoupled_projects | 相关说明 | -| 开发引入 | / | Wenet_Conformer_for_Pytorch/examples/aishell2/s0/run.sh | https://github.com/aishell-foundation/DaCiDian.git | 源码实现 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/transformer/encoder_layer.py | Wenet_Conformer_for_Pytorch/runtime/core/post_processor/post_processor.h | sxc19@mails.tsinghua.edu.cn | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/tools/subset_data_dir.sh | Wenet_Conformer_for_Pytorch/tools/subset_data_dir.sh | http://kaldi-asr.org/doc/data_prep.html#data_prep_data | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/kunlun/xpu/xpu_util.h | Wenet_Conformer_for_Pytorch/runtime/kunlun/xpu/xpu_conformer.cpp | lichaolin@baidu.com | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/transformer/encoder_layer.py | Wenet_Conformer_for_Pytorch/wenet/efficient_conformer/attention.py | sxc19@mails.tsinghua.edu.cn | 邮箱地址 | -| 开源代码引入 | 
https://github.com/wenet-e2e/wenet.git/runtime/kunlun/xpu/xpu_util.h | Wenet_Conformer_for_Pytorch/runtime/kunlun/xpu/xpu_asr_model.cc | qihan@baidu.com | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/examples/commonvoice/fr/local/download_data.sh | Wenet_Conformer_for_Pytorch/examples/commonvoice/fr/local/download_data.sh | https://mozilla-common-voice-datasets.s3.dualstack.us-west-2.amazonaws.com/cv-corpus-8.0-2022-01-19/cv-corpus-8.0-2022-01-19-fr.tar.gz | 数据集地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/examples/vkw2021/s0/local/run_finetune_5h.sh | Wenet_Conformer_for_Pytorch/examples/aishell4/s0/run.sh | https://pytorch.org/tutorials/intermediate/dist_tuto.html | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/utils/scheduler.py | Wenet_Conformer_for_Pytorch/wenet/utils/common.py | https://github.com/espnet/espnet | 源码实现 | -| 开发引入 | / | Wenet_Conformer_for_Pytorch/examples/wsj/s0/run.sh | https://docs.nvidia.com/deeplearning/nccl/user-guide/docs/env.html | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/tools/compute-wer.py | Wenet_Conformer_for_Pytorch/tools/compute-wer.py | https://unicodebook.readthedocs.io/unicode.html#unicode-categories | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/core/frontend/fbank.h | Wenet_Conformer_for_Pytorch/runtime/core/frontend/fbank.h | https://github.com/kaldi-asr/kaldi/blob/master/src/feat/feature-fbank.cc | 源码实现 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/core/kaldi/util/basic-filebuf.h | Wenet_Conformer_for_Pytorch/runtime/core/kaldi/util/basic-filebuf.h | holgerar@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/examples/aishell4/s0/run.sh | Wenet_Conformer_for_Pytorch/examples/aishell4/s0/run.sh | https://www.openslr.org/resources/111 | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/tools/fst/prepare_dict.py | Wenet_Conformer_for_Pytorch/tools/fst/prepare_dict.py | https://github.com/wenet-e2e/wenet/issues/1653 | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/core/kaldi/util/basic-filebuf.h | Wenet_Conformer_for_Pytorch/runtime/core/kaldi/util/basic-filebuf.h | csilvers@google.com | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/utils/scheduler.py | Wenet_Conformer_for_Pytorch/wenet/squeezeformer/convolution.py | https://github.com/espnet/espnet | 源码实现 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/core/kaldi/util/basic-filebuf.h | Wenet_Conformer_for_Pytorch/runtime/core/kaldi/util/basic-filebuf.h | jyasskin@google.com | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/transformer/embedding.py | Wenet_Conformer_for_Pytorch/wenet/efficient_conformer/attention.py | https://arxiv.org/abs/1901.02860 | 论文地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/utils/scheduler.py | Wenet_Conformer_for_Pytorch/tools/setup_anaconda.sh | https://github.com/espnet/espnet | 源码实现 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/core/post_processor/post_processor.h | Wenet_Conformer_for_Pytorch/runtime/core/decoder/asr_decoder.cc | https://github.com/wenet-e2e/wenet/issues/583#issuecomment-907994058 | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/examples/swbd/s0/local/eval2000_data_prep.sh | Wenet_Conformer_for_Pytorch/examples/swbd/s0/local/eval2000_data_prep.sh | http://www.ldc.upenn.edu/Catalog/CatalogEntry.jsp?catalogId=LDC2002T43 | 数据集地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/kunlun/xpu/xpu_util.h | 
Wenet_Conformer_for_Pytorch/runtime/kunlun/xpu/xpu_asr_model.cc | panhehe@baidu.com | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/utils/scheduler.py | Wenet_Conformer_for_Pytorch/wenet/efficient_conformer/encoder_layer.py | https://github.com/espnet/espnet | 源码实现 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/tools/spm_train | Wenet_Conformer_for_Pytorch/tools/spm_encode | https://github.com/pytorch/fairseq/blob/master/LICENSE | license地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/kunlun/xpu/xpu_util.h | Wenet_Conformer_for_Pytorch/runtime/kunlun/xpu/xpu_conformer.cpp | yanzikui@baidu.com | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/gpu/cuda_decoders/run_streaming.sh | Wenet_Conformer_for_Pytorch/runtime/gpu/cuda_decoders/run.sh | https://github.com/triton-inference-server/server/blob/main/docs/user_guide/model_configuration.md | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/core/patch/openfst/src/lib/flags.cc | Wenet_Conformer_for_Pytorch/runtime/core/patch/openfst/src/lib/flags.cc | https://github.com/kkm000/openfst/pull/23 | 源码实现 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/core/kaldi/util/parse-options.cc | Wenet_Conformer_for_Pytorch/runtime/core/kaldi/util/parse-options.cc | http://www.redhat.com/mirrors/LDP/LDP/abs/html/quotingvar.html | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/core/kaldi/util/basic-filebuf.h | Wenet_Conformer_for_Pytorch/runtime/core/kaldi/util/basic-filebuf.h | jyasskin@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/tools/compute-wer.py | Wenet_Conformer_for_Pytorch/tools/compute-cer.py | https://unicodebook.readthedocs.io/unicode.html#unicode-categories | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/core/cmake/bpu.cmake | Wenet_Conformer_for_Pytorch/runtime/core/cmake/bpu.cmake | https://github.com/tensorflow/tensorflow/issues/47849 | 相关说明 | -| 开发引入 | / | Wenet_Conformer_for_Pytorch/examples/tedlium3/s0/run.sh | https://docs.nvidia.com/deeplearning/nccl/user-guide/docs/env.html | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/utils/scheduler.py | Wenet_Conformer_for_Pytorch/wenet/squeezeformer/encoder.py | https://github.com/NVIDIA/NeMo | 源码实现 | -| 开发引入 | / | Wenet_Conformer_for_Pytorch/examples/librispeech/s0/run.sh | http://www.openslr.org/resources/11/librispeech-lexicon.txt | 数据集地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/examples/librispeech/rnnt/run.sh | Wenet_Conformer_for_Pytorch/examples/librispeech/rnnt/run.sh | https://us.openslr.org/resources/12 | 相关说明 | -| 开发引入 | / | Wenet_Conformer_for_Pytorch/examples/aishell2/s0/run.sh | https://docs.nvidia.com/deeplearning/nccl/user-guide/docs/env.html | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/docs/make.bat | Wenet_Conformer_for_Pytorch/docs/make.bat | http://sphinx-doc.org/ | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/core/kaldi/util/basic-filebuf.h | Wenet_Conformer_for_Pytorch/runtime/core/kaldi/util/basic-filebuf.h | hhinnant@apple.com | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/core/kaldi/util/basic-filebuf.h | Wenet_Conformer_for_Pytorch/runtime/core/kaldi/util/basic-filebuf.h | joerg@NetBSD.org | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/.gitmodules | Wenet_Conformer_for_Pytorch/.gitmodules | https://github.com/NVIDIA/FasterTransformer.git | 源码实现 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/transducer/predictor.py | 
Wenet_Conformer_for_Pytorch/wenet/transducer/predictor.py | https://arxiv.org/pdf/2109.07513.pdf | 论文地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/bin/alignment.py | Wenet_Conformer_for_Pytorch/wenet/bin/alignment.py | https://www.fon.hum.uva.nl/praat/ | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/libtorch/web/static/fonts/fontawesome-webfont.svg | Wenet_Conformer_for_Pytorch/runtime/libtorch/web/static/fonts/fontawesome-webfont.svg | http://www.w3.org/Graphics/SVG/1.1/DTD/svg11.dtd | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/transformer/encoder_layer.py | Wenet_Conformer_for_Pytorch/wenet/transformer/encoder_layer.py | sxc19@mails.tsinghua.edu.cn | 邮箱地址 | -| 开发引入 | / | Wenet_Conformer_for_Pytorch/runtime/core/toolchains/ios.toolchain.cmake | https://code.google.com/p/ios-cmake/ | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/utils/scheduler.py | Wenet_Conformer_for_Pytorch/wenet/transformer/convolution.py | https://github.com/espnet/espnet | 源码实现 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/kunlun/xpu/xpu_util.h | Wenet_Conformer_for_Pytorch/runtime/kunlun/xpu/xpu_util.h | yanzikui@baidu.com | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/transformer/encoder_layer.py | Wenet_Conformer_for_Pytorch/runtime/core/test/post_processor_test.cc | sxc19@mails.tsinghua.edu.cn | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/transformer/encoder_layer.py | Wenet_Conformer_for_Pytorch/runtime/core/decoder/onnx_asr_model.h | sxc19@mails.tsinghua.edu.cn | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/squeezeformer/subsampling.py | Wenet_Conformer_for_Pytorch/wenet/squeezeformer/subsampling.py | https://github.com/upskyy/Squeezeformer | 源码实现 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/core/post_processor/post_processor.h | Wenet_Conformer_for_Pytorch/runtime/core/post_processor/post_processor.h | https://github.com/wenet-e2e/wenet/issues/583#issuecomment-907994058 | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/kunlun/xpu/xpu_util.h | Wenet_Conformer_for_Pytorch/runtime/kunlun/xpu/xpu_conformer.cpp | panhehe@baidu.com | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/examples/vkw2021/s0/local/run_finetune_5h.sh | Wenet_Conformer_for_Pytorch/examples/hkust/s0/run.sh | https://pytorch.org/tutorials/intermediate/dist_tuto.html | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/examples/aishell2/s0/run.sh | Wenet_Conformer_for_Pytorch/examples/aishell2/rnnt/run.sh | http://aishell-eval.oss-cn-beijing.aliyuncs.com/TEST%26DEV%20DATA.zip | 数据集地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/examples/vkw2021/s0/local/run_finetune_5h.sh | Wenet_Conformer_for_Pytorch/examples/aishell2/rnnt/run.sh | https://pytorch.org/tutorials/intermediate/dist_tuto.html | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/efficient_conformer/encoder.py | Wenet_Conformer_for_Pytorch/wenet/efficient_conformer/encoder.py | https://github.com/burchim/EfficientConformer | 源码实现 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/transformer/encoder_layer.py | Wenet_Conformer_for_Pytorch/wenet/bin/export_onnx_cpu.py | sxc19@mails.tsinghua.edu.cn | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/tools/validate_data_dir.sh | Wenet_Conformer_for_Pytorch/tools/validate_data_dir.sh | http://kaldi-asr.org/doc/data_prep.html | 相关说明 | -| 开源代码引入 | 
https://github.com/wenet-e2e/wenet.git/wenet/transformer/encoder_layer.py | Wenet_Conformer_for_Pytorch/runtime/core/post_processor/post_processor.cc | sxc19@mails.tsinghua.edu.cn | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/examples/swbd/s0/local/swbd1_data_download.sh | Wenet_Conformer_for_Pytorch/examples/swbd/s0/local/swbd1_data_download.sh | http://www.isip.piconepress.com/projects/switchboard/releases/switchboard_word_alignments.tar.gz | 数据集地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/efficient_conformer/encoder.py | Wenet_Conformer_for_Pytorch/wenet/efficient_conformer/attention.py | https://arxiv.org/abs/2109.01163 | 论文地址 | -| 开发引入 | / | Wenet_Conformer_for_Pytorch/examples/swbd/s0/run.sh | https://docs.nvidia.com/deeplearning/nccl/user-guide/docs/env.html | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/core/kaldi/util/basic-filebuf.h | Wenet_Conformer_for_Pytorch/runtime/core/kaldi/util/basic-filebuf.h | nico.rieck@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/tools/validate_data_dir.sh | Wenet_Conformer_for_Pytorch/examples/aishell4/s0/local/validate_data_dir.sh | http://kaldi-asr.org/doc/data_prep.html | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/squeezeformer/subsampling.py | Wenet_Conformer_for_Pytorch/wenet/squeezeformer/encoder.py | https://github.com/upskyy/Squeezeformer | 源码实现 | -| 开发引入 | / | Wenet_Conformer_for_Pytorch/examples/gigaspeech/s0/run.sh | gigaspeech@speechcolab.orgf | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/utils/scheduler.py | Wenet_Conformer_for_Pytorch/wenet/branchformer/cgmlp.py | https://github.com/espnet/espnet | 源码实现 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/branchformer/cgmlp.py | Wenet_Conformer_for_Pytorch/wenet/branchformer/cgmlp.py | https://arxiv.org/abs/2105.08050 | 论文地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/core/patch/openfst/src/lib/flags.cc | Wenet_Conformer_for_Pytorch/runtime/core/patch/openfst/src/lib/flags.cc | https://github.com/kkm000/openfst/pull/32 | 源码实现 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/paraformer/paraformer.py | Wenet_Conformer_for_Pytorch/wenet/paraformer/paraformer.py | https://arxiv.org/pdf/2206.08317.pdf | 论文地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/efficient_conformer/encoder.py | Wenet_Conformer_for_Pytorch/wenet/efficient_conformer/encoder.py | https://arxiv.org/abs/2109.01163 | 论文地址 | -| 开发引入 | / | Wenet_Conformer_for_Pytorch/examples/librispeech/s0/run.sh | http://www.openslr.org/resources/11/ | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/core/utils/string.cc | Wenet_Conformer_for_Pytorch/runtime/core/utils/string.cc | https://github.com/wenet-e2e/wenet/issues/745 | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/utils/scheduler.py | Wenet_Conformer_for_Pytorch/wenet/efficient_conformer/subsampling.py | https://github.com/espnet/espnet | 源码实现 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/kunlun/xpu/xpu_util.h | Wenet_Conformer_for_Pytorch/runtime/kunlun/xpu/xpu_util.cpp | qihan@baidu.com | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/gpu/scripts/benchmark_onnx_throughput.py | Wenet_Conformer_for_Pytorch/runtime/gpu/scripts/benchmark_onnx_throughput.py | https://github.com/microsoft/onnxruntime/blob/main/onnxruntime/python/tools/transformers/onnx_exporter.py | 源码实现 | -| 开发引入 | / | Wenet_Conformer_for_Pytorch/runtime/core/decoder/torch_asr_model.cc | 
https://pytorch.org/docs/stable/notes/cpu_ | 相关说明 | -| 开发引入 | / | Wenet_Conformer_for_Pytorch/examples/aishell/paraformer/run.sh | https://docs.nvidia.com/deeplearning/nccl/user-guide/docs/env.html | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/kunlun/xpu/xpu_util.h | Wenet_Conformer_for_Pytorch/runtime/kunlun/xpu/conformer_test.cpp | qihan@baidu.com | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/core/decoder/asr_model.h | Wenet_Conformer_for_Pytorch/runtime/core/decoder/asr_model.h | binbin.zhang@horizon.ai | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/examples/vkw2021/s0/local/run_finetune_5h.sh | Wenet_Conformer_for_Pytorch/examples/multi_cn/s0/run.sh | https://pytorch.org/tutorials/intermediate/dist_tuto.html | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/tools/subset_data_dir.sh | Wenet_Conformer_for_Pytorch/tools/combine_data.sh | http://kaldi-asr.org/doc/data_prep.html#data_prep_data | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/utils/executor.py | Wenet_Conformer_for_Pytorch/wenet/utils/executor.py | https://pytorch.org/docs/stable/notes/amp_examples.html | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/examples/aishell4/s0/run.sh | Wenet_Conformer_for_Pytorch/examples/aishell4/s0/local/download_and_untar.sh | https://www.openslr.org/resources/111 | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/ios/WenetDemo/WenetDemo/wenet/WenetDemo-Bridging-Header.h | Wenet_Conformer_for_Pytorch/runtime/ios/WenetDemo/WenetDemo/AppDelegate.swift | 1067837450@qq.com | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/utils/scheduler.py | Wenet_Conformer_for_Pytorch/wenet/transformer/asr_model.py | https://github.com/espnet/espnet | 源码实现 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/dataset/processor.py | Wenet_Conformer_for_Pytorch/wenet/dataset/processor.py | https://arxiv.org/abs/2106.05642 | 论文地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/docs/conf.py | Wenet_Conformer_for_Pytorch/docs/conf.py | https://www.sphinx-doc.org/en/master/usage/configuration.html | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/core/kaldi/util/basic-filebuf.h | Wenet_Conformer_for_Pytorch/runtime/core/kaldi/util/basic-filebuf.h | dimitry@andric.com | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/kunlun/xpu/xpu_util.h | Wenet_Conformer_for_Pytorch/runtime/kunlun/xpu/xpu_util.h | panhehe@baidu.com | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/examples/swbd/s0/local/swbd1_fix_speakerid.pl | Wenet_Conformer_for_Pytorch/examples/swbd/s0/local/swbd1_fix_speakerid.pl | pengqi@cs.stanford.edu | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/tools/validate_text.pl | Wenet_Conformer_for_Pytorch/examples/aishell4/s0/local/validate_text.pl | jtrmal@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/kunlun/xpu/xpu_util.h | Wenet_Conformer_for_Pytorch/runtime/kunlun/xpu/xpu_asr_model.h | panhehe@baidu.com | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/transformer/encoder_layer.py | Wenet_Conformer_for_Pytorch/wenet/transformer/attention.py | sxc19@mails.tsinghua.edu.cn | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/kunlun/xpu/xpu_util.h | Wenet_Conformer_for_Pytorch/runtime/kunlun/xpu/xpu_asr_model.cc | yanzikui@baidu.com | 邮箱地址 | -| 开发引入 | / | Wenet_Conformer_for_Pytorch/examples/aishell4/s0/run.sh | 
https://docs.nvidia.com/deeplearning/nccl/user-guide/docs/env.html | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/transformer/encoder_layer.py | Wenet_Conformer_for_Pytorch/runtime/horizonbpu/bpu/bpu_asr_model.cc | sxc19@mails.tsinghua.edu.cn | 邮箱地址 | -| 开发引入 | / | Wenet_Conformer_for_Pytorch/runtime/libtorch/web/static/fonts/fontawesome-webfont.eot | http://fontawesome.io/license/ | license地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/examples/openasr2021/s0/run.sh | Wenet_Conformer_for_Pytorch/examples/openasr2021/s0/run.sh | https://arxiv.org/pdf/2107.04734.pdf | 论文地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/examples/csj/s0/run.sh | Wenet_Conformer_for_Pytorch/examples/csj/s0/run.sh | https://ccd.ninjal.ac.jp/csj/en/ | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/kunlun/xpu/xpu_util.h | Wenet_Conformer_for_Pytorch/runtime/kunlun/xpu/xpu_conformer.h | yanzikui@baidu.com | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/examples/vkw2021/s0/local/run_finetune_5h.sh | Wenet_Conformer_for_Pytorch/examples/vkw2021/s0/local/run_finetune_5h.sh | https://pytorch.org/tutorials/intermediate/dist_tuto.html | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/utils/scheduler.py | Wenet_Conformer_for_Pytorch/wenet/efficient_conformer/convolution.py | https://github.com/espnet/espnet | 源码实现 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/core/kaldi/util/basic-filebuf.h | Wenet_Conformer_for_Pytorch/runtime/core/kaldi/util/basic-filebuf.h | tuhertz@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/transducer/predictor.py | Wenet_Conformer_for_Pytorch/wenet/transducer/predictor.py | https://github.com/Mddct/neural-lm/blob/main/models/gru_cell.py | 源码实现 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/paraformer/paraformer.py | Wenet_Conformer_for_Pytorch/wenet/paraformer/utils.py | https://github.com/alibaba-damo-academy/FunASR | 源码实现 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/transformer/embedding.py | Wenet_Conformer_for_Pytorch/wenet/squeezeformer/attention.py | https://arxiv.org/abs/1901.02860 | 论文地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/transformer/encoder_layer.py | Wenet_Conformer_for_Pytorch/wenet/efficient_conformer/encoder_layer.py | sxc19@mails.tsinghua.edu.cn | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/tools/spm_train | Wenet_Conformer_for_Pytorch/tools/spm_decode | https://github.com/pytorch/fairseq/blob/master/LICENSE | license地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/core/decoder/ctc_prefix_beam_search.cc | Wenet_Conformer_for_Pytorch/runtime/core/test/ctc_prefix_beam_search_test.cc | https://robin1001.github.io/2020/12/11/ctc-search | 源码实现 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/kunlun/xpu/xpu_util.h | Wenet_Conformer_for_Pytorch/runtime/kunlun/xpu/xpu_util.h | qihan@baidu.com | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/examples/chime4/s0/local/clean_wsj0_data_prep.sh | Wenet_Conformer_for_Pytorch/examples/chime4/s0/local/clean_wsj0_data_prep.sh | https://sourceforge.net/projects/kaldi/files/wsj0-train-spkrinfo.txt | 数据集地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/core/kaldi/util/basic-filebuf.h | Wenet_Conformer_for_Pytorch/runtime/core/kaldi/util/basic-filebuf.h | mclow.lists@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/examples/swbd/s0/local/swbd1_data_download.sh | 
Wenet_Conformer_for_Pytorch/examples/swbd/s0/local/swbd1_data_download.sh | http://www.openslr.org/resources/5/switchboard_word_alignments.tar.gz | 数据集地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/core/utils/string.h | Wenet_Conformer_for_Pytorch/runtime/core/utils/string.h | https://en.wikipedia.org/wiki/UTF-8#Encoding | 相关说明 | -| 开发引入 | / | Wenet_Conformer_for_Pytorch/runtime/android/app/src/androidTest/java/com/mobvoi/wenet/ExampleInstrumentedTest.java | http://d.android.com/tools/testing | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/utils/scheduler.py | Wenet_Conformer_for_Pytorch/wenet/transformer/decoder.py | https://github.com/espnet/espnet | 源码实现 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/core/kaldi/util/basic-filebuf.h | Wenet_Conformer_for_Pytorch/runtime/core/kaldi/util/basic-filebuf.h | lichray@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/ios/WenetDemo/WenetDemo/wenet/WenetDemo-Bridging-Header.h | Wenet_Conformer_for_Pytorch/runtime/ios/WenetDemo/WenetDemo/SceneDelegate.swift | 1067837450@qq.com | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/transformer/embedding.py | Wenet_Conformer_for_Pytorch/wenet/transformer/attention.py | https://arxiv.org/abs/1901.02860 | 论文地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/examples/chime4/s0/local/clean_wsj1_data_prep.sh | Wenet_Conformer_for_Pytorch/examples/chime4/s0/local/clean_wsj1_data_prep.sh | https://www.openslr.org/resources/3/sph2pipe_v2.5.tar.gz | 数据集地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/utils/scheduler.py | Wenet_Conformer_for_Pytorch/wenet/paraformer/paraformer.py | https://github.com/espnet/espnet | 源码实现 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/examples/chime4/s0/local/clean_wsj0_data_prep.sh | Wenet_Conformer_for_Pytorch/examples/chime4/s0/local/clean_wsj0_data_prep.sh | http://www.ldc.upenn.edu/Catalog/docs/LDC93S6A/wsj0-train-spkrinfo.txt | 数据集地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/kunlun/xpu/xpu_util.h | Wenet_Conformer_for_Pytorch/runtime/kunlun/xpu/xpu_util.h | lichaolin@baidu.com | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/tools/k2/prepare_mmi.sh | Wenet_Conformer_for_Pytorch/tools/k2/prepare_mmi.sh | https://github.com/k2-fsa/k2/ | 源码实现 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/examples/vkw2021/s0/local/run_finetune_5h.sh | Wenet_Conformer_for_Pytorch/examples/tedlium3/s0/run.sh | https://pytorch.org/tutorials/intermediate/dist_tuto.html | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/bin/export_onnx_bpu.py | Wenet_Conformer_for_Pytorch/tools/onnx2horizonbin.py | sxc19@tsinghua.org.cn | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/paraformer/paraformer.py | Wenet_Conformer_for_Pytorch/wenet/paraformer/search/beam_search.py | https://github.com/alibaba-damo-academy/FunASR | 源码实现 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/kunlun/xpu/xpu_util.h | Wenet_Conformer_for_Pytorch/runtime/kunlun/xpu/conformer_test.cpp | panhehe@baidu.com | 邮箱地址 | -| 开发引入 | / | Wenet_Conformer_for_Pytorch/wenet/paraformer/search/ctc.py | https://arxiv.org/abs/2006.14941 | 论文地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/examples/librispeech/rnnt/run.sh | Wenet_Conformer_for_Pytorch/examples/librispeech/rnnt/run.sh | https://openslr.elda.org/resources/12 | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/android/gradle.properties | 
Wenet_Conformer_for_Pytorch/runtime/android/gradle.properties | http://www.gradle.org/docs/current/userguide/build_environment.html | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/kunlun/xpu/xpu_util.h | Wenet_Conformer_for_Pytorch/runtime/kunlun/xpu/xpu_conformer.cpp | qihan@baidu.com | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/transformer/embedding.py | Wenet_Conformer_for_Pytorch/wenet/transformer/embedding.py | https://arxiv.org/abs/1901.02860 | 论文地址 | -| 开发引入 | / | Wenet_Conformer_for_Pytorch/examples/aishell/NST/run_nst.sh | https://docs.nvidia.com/deeplearning/nccl/user-guide/docs/env.html | 相关说明 | -| 开发引入 | / | Wenet_Conformer_for_Pytorch/runtime/core/toolchains/ios.toolchain.cmake | https://github.com/cristeab/ios-cmake.git | 源码实现 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/core/kaldi/util/basic-filebuf.h | Wenet_Conformer_for_Pytorch/runtime/core/kaldi/util/basic-filebuf.h | http://llvm.org | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/android/app/proguard-rules.pro | Wenet_Conformer_for_Pytorch/runtime/android/app/proguard-rules.pro | http://developer.android.com/guide/developing/tools/proguard.html | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/tools/decode.sh | Wenet_Conformer_for_Pytorch/tools/decode.sh | binbinzhang@mobvoi.com | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/core/kaldi/util/basic-filebuf.h | Wenet_Conformer_for_Pytorch/runtime/core/kaldi/util/basic-filebuf.h | matthew@dempsky.org | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/kunlun/xpu/xpu_util.h | Wenet_Conformer_for_Pytorch/runtime/kunlun/xpu/xpu_util.cpp | panhehe@baidu.com | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/core/toolchains/ios.toolchain.cmake | Wenet_Conformer_for_Pytorch/runtime/core/toolchains/ios.toolchain.cmake | https://github.com/leetal/ios-cmake | 源码实现 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/ios/WenetDemo/WenetDemo.xcodeproj/project.xcworkspace/xcshareddata/IDEWorkspaceChecks.plist | Wenet_Conformer_for_Pytorch/runtime/ios/WenetDemo/WenetDemo/Info.plist | http://www.apple.com/DTDs/PropertyList-1.0.dtd | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/core/kaldi/util/basic-filebuf.h | Wenet_Conformer_for_Pytorch/runtime/core/kaldi/util/basic-filebuf.h | st@quanttec.com | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/utils/scheduler.py | Wenet_Conformer_for_Pytorch/wenet/transformer/embedding.py | https://github.com/espnet/espnet | 源码实现 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/examples/aishell2/s0/local/train_lms.sh | Wenet_Conformer_for_Pytorch/examples/aishell2/s0/local/train_lms.sh | http://www.speech.sri.com/projects/srilm/download.html | 相关说明 | -| 开发引入 | / | Wenet_Conformer_for_Pytorch/examples/hkust/s0/run.sh | https://docs.nvidia.com/deeplearning/nccl/user-guide/docs/env.html | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/utils/scheduler.py | Wenet_Conformer_for_Pytorch/wenet/branchformer/encoder_layer.py | https://github.com/espnet/espnet | 源码实现 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/cli/hub.py | Wenet_Conformer_for_Pytorch/runtime/core/decoder/onnx_asr_model.h | hamddct@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/examples/chime4/s0/local/clean_wsj1_data_prep.sh | Wenet_Conformer_for_Pytorch/examples/chime4/s0/local/clean_wsj0_data_prep.sh | 
https://www.openslr.org/resources/3/sph2pipe_v2.5.tar.gz | 数据集地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/kunlun/xpu/xpu_util.h | Wenet_Conformer_for_Pytorch/runtime/kunlun/xpu/xpu_conformer.h | panhehe@baidu.com | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/examples/swbd/s0/local/eval2000_data_prep.sh | Wenet_Conformer_for_Pytorch/examples/swbd/s0/local/eval2000_data_prep.sh | http://www.ldc.upenn.edu/Catalog/catalogEntry.jsp?catalogId=LDC2002S09 | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/examples/gigaspeech/s0/run.sh | Wenet_Conformer_for_Pytorch/examples/gigaspeech/s0/run.sh | https://github.com/SpeechColab/GigaSpeech | 源码实现 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/bin/export_onnx_bpu.py | Wenet_Conformer_for_Pytorch/wenet/bin/export_onnx_bpu.py | sxc19@tsinghua.org.cn | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/tools/spm_train | Wenet_Conformer_for_Pytorch/tools/spm_train | https://github.com/pytorch/fairseq/blob/master/LICENSE | license地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/utils/scheduler.py | Wenet_Conformer_for_Pytorch/wenet/squeezeformer/subsampling.py | https://github.com/NVIDIA/NeMo | 源码实现 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/gpu/cuda_decoders/model_repo_cuda_decoder/scoring/1/model.py | Wenet_Conformer_for_Pytorch/runtime/gpu/cuda_decoders/model_repo_cuda_decoder/scoring/1/model.py | https://github.com/k2-fsa/icefall/blob/master/icefall/decode.py#L1072-L1075 | 源码实现 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/libtorch/web/static/image/voice-dictation.svg | Wenet_Conformer_for_Pytorch/runtime/libtorch/web/static/image/voice-dictation.svg | https://sketchapp.com | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/ios/WenetDemo/WenetDemo/wenet/WenetDemo-Bridging-Header.h | Wenet_Conformer_for_Pytorch/runtime/ios/WenetDemo/WenetDemo/wenet/WenetDemo-Bridging-Header.h | 1067837450@qq.com | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/core/kaldi/util/basic-filebuf.h | Wenet_Conformer_for_Pytorch/runtime/core/kaldi/util/basic-filebuf.h | compnerd@compnerd.org | 邮箱地址 | -| 开发引入 | / | Wenet_Conformer_for_Pytorch/examples/commonvoice/fr/run.sh | https://docs.nvidia.com/deeplearning/nccl/user-guide/docs/env.html | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/gpu/client/decode_manifest_triton.py | Wenet_Conformer_for_Pytorch/runtime/gpu/client/decode_manifest_triton.py | https://huggingface.co/csukuangfj/aishell-test-dev-manifests | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/examples/aishell2/rnnt/run.sh | Wenet_Conformer_for_Pytorch/examples/aishell2/rnnt/run.sh | boji123@aliyun.com | 邮箱地址 | -| 开发引入 | / | Wenet_Conformer_for_Pytorch/wenet/utils/file_utils.py | https://github.com/wenet-e2e/wenet/pull/819 | 源码实现 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/paraformer/paraformer.py | Wenet_Conformer_for_Pytorch/wenet/paraformer/search/scorer_interface.py | https://github.com/alibaba-damo-academy/FunASR | 源码实现 | -| 开发引入 | / | Wenet_Conformer_for_Pytorch/runtime/libtorch/web/static/fonts/FontAwesome.otf | http://fontawesome.iohttp://fontawesome.io/license/ | license地址 | -| 开发引入 | / | Wenet_Conformer_for_Pytorch/runtime/binding/python/setup.py | https://github.com/wenet-e2e/wenet/issues/new | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/tools/validate_text.pl | Wenet_Conformer_for_Pytorch/tools/validate_text.pl | jtrmal@gmail.com | 邮箱地址 | -| 开源代码引入 | 
https://github.com/wenet-e2e/wenet.git/wenet/utils/scheduler.py | Wenet_Conformer_for_Pytorch/wenet/utils/scheduler.py | https://github.com/espnet/espnet | 源码实现 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/transformer/encoder_layer.py | Wenet_Conformer_for_Pytorch/wenet/squeezeformer/attention.py | sxc19@mails.tsinghua.edu.cn | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/utils/scheduler.py | Wenet_Conformer_for_Pytorch/wenet/transformer/ctc.py | https://github.com/espnet/espnet | 源码实现 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/dataset/processor.py | Wenet_Conformer_for_Pytorch/wenet/dataset/processor.py | https://arxiv.org/abs/2211.00522 | 论文地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/core/kaldi/util/basic-filebuf.h | Wenet_Conformer_for_Pytorch/runtime/core/kaldi/util/basic-filebuf.h | william.w.fisher@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/cli/hub.py | Wenet_Conformer_for_Pytorch/runtime/binding/python/py/hub.py | hamddct@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/utils/scheduler.py | Wenet_Conformer_for_Pytorch/wenet/utils/scheduler.py | https://github.com/NVIDIA/NeMo | 源码实现 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/libtorch/web/app.py | Wenet_Conformer_for_Pytorch/runtime/libtorch/web/app.py | zhendong.peng@mobvoi.com | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/utils/scheduler.py | Wenet_Conformer_for_Pytorch/wenet/utils/scheduler.py | https://arxiv.org/abs/2206.00888 | 论文地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/ios/WenetDemo/WenetDemo/wenet/WenetDemo-Bridging-Header.h | Wenet_Conformer_for_Pytorch/runtime/ios/WenetDemo/WenetDemo/wenet/wenet.h | 1067837450@qq.com | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/core/patch/openfst/src/lib/flags.cc | Wenet_Conformer_for_Pytorch/runtime/core/patch/openfst/src/lib/flags.cc | https://github.com/kkm000/openfst/issues/20 | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/core/kaldi/util/basic-filebuf.h | Wenet_Conformer_for_Pytorch/runtime/core/kaldi/util/basic-filebuf.h | xingxue@ca.ibm.com | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/core/websocket/websocket_client.cc | Wenet_Conformer_for_Pytorch/runtime/core/websocket/websocket_client.cc | https://tools.ietf.org/html/rfc7230#section-5.4 | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/core/kaldi/util/basic-filebuf.h | Wenet_Conformer_for_Pytorch/runtime/core/kaldi/util/basic-filebuf.h | http://libcxx.llvm.org/ | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/core/toolchains/ios.toolchain.cmake | Wenet_Conformer_for_Pytorch/runtime/core/toolchains/ios.toolchain.cmake | alexs.mac@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/paraformer/paraformer.py | Wenet_Conformer_for_Pytorch/wenet/paraformer/search/ctc.py | https://github.com/alibaba-damo-academy/FunASR | 源码实现 | -| 开发引入 | / | Wenet_Conformer_for_Pytorch/runtime/libtorch/web/static/fonts/fontawesome-webfont.eot | http://fontawesome.io | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/transformer/encoder_layer.py | Wenet_Conformer_for_Pytorch/wenet/transformer/encoder.py | sxc19@mails.tsinghua.edu.cn | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/kunlun/xpu/xpu_util.h | Wenet_Conformer_for_Pytorch/runtime/kunlun/xpu/xpu_conformer.h | lichaolin@baidu.com | 邮箱地址 | -| 开源代码引入 | 
https://github.com/wenet-e2e/wenet.git/wenet/transformer/encoder_layer.py | Wenet_Conformer_for_Pytorch/test/test_file_utils.py | sxc19@mails.tsinghua.edu.cn | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/kunlun/xpu/xpu_util.h | Wenet_Conformer_for_Pytorch/runtime/kunlun/xpu/xpu_asr_model.h | qihan@baidu.com | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/kunlun/xpu/xpu_util.h | Wenet_Conformer_for_Pytorch/runtime/kunlun/xpu/xpu_util.cpp | yanzikui@baidu.com | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/core/decoder/asr_model.h | Wenet_Conformer_for_Pytorch/runtime/core/decoder/asr_model.cc | binbin.zhang@horizon.ai | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/utils/scheduler.py | Wenet_Conformer_for_Pytorch/wenet/branchformer/encoder.py | https://github.com/espnet/espnet | 源码实现 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/utils/scheduler.py | Wenet_Conformer_for_Pytorch/wenet/transformer/encoder_layer.py | https://github.com/espnet/espnet | 源码实现 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/bin/export_onnx_cpu.py | Wenet_Conformer_for_Pytorch/wenet/bin/export_onnx_cpu.py | https://github.com/wenet-e2e/wenet/pull/1174 | 源码实现 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/kunlun/xpu/xpu_util.h | Wenet_Conformer_for_Pytorch/runtime/kunlun/xpu/xpu_util.cpp | lichaolin@baidu.com | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/ios/WenetDemo/WenetDemo/wenet/WenetDemo-Bridging-Header.h | Wenet_Conformer_for_Pytorch/runtime/ios/WenetDemo/WenetDemo/wenet/wenet.mm | 1067837450@qq.com | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/core/kaldi/util/basic-filebuf.h | Wenet_Conformer_for_Pytorch/runtime/core/kaldi/util/basic-filebuf.h | marshall@idio.com | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/core/kaldi/util/basic-filebuf.h | Wenet_Conformer_for_Pytorch/runtime/core/kaldi/util/basic-filebuf.h | kyrtzidis@apple.com | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/transformer/encoder_layer.py | Wenet_Conformer_for_Pytorch/runtime/core/decoder/onnx_asr_model.cc | sxc19@mails.tsinghua.edu.cn | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/examples/swbd/s0/local/swbd1_data_prep.sh | Wenet_Conformer_for_Pytorch/examples/swbd/s0/local/swbd1_data_prep.sh | http://www.ldc.upenn.edu/Catalog/desc/addenda/swb-multi-annot.summary | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/transformer/embedding.py | Wenet_Conformer_for_Pytorch/wenet/transformer/embedding.py | https://github.com/pytorch/pytorch/issues/69434 | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/kunlun/xpu/xpu_util.h | Wenet_Conformer_for_Pytorch/runtime/kunlun/xpu/conformer_test.cpp | yanzikui@baidu.com | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/core/kaldi/util/basic-filebuf.h | Wenet_Conformer_for_Pytorch/runtime/core/kaldi/util/basic-filebuf.h | breese@users.sourceforge.net | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/dataset/kaldi_io.py | Wenet_Conformer_for_Pytorch/wenet/dataset/kaldi_io.py | https://github.com/kaldi-asr/kaldi/blob/master/src/matrix/compressed-matrix.h | 源码实现 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/examples/aishell/rnnt/run.sh | Wenet_Conformer_for_Pytorch/examples/aishell/rnnt/run.sh | binbizha@qq.com | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/examples/vkw2021/s0/local/run_finetune_5h.sh | 
Wenet_Conformer_for_Pytorch/examples/vkw2021/s0/run.sh | https://pytorch.org/tutorials/intermediate/dist_tuto.html | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/wenet/transformer/encoder_layer.py | Wenet_Conformer_for_Pytorch/runtime/horizonbpu/bpu/bpu_asr_model.h | sxc19@mails.tsinghua.edu.cn | 邮箱地址 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/kunlun/xpu/xpu_util.h | Wenet_Conformer_for_Pytorch/runtime/kunlun/xpu/xpu_conformer.h | qihan@baidu.com | 邮箱地址 | -| 开发引入 | / | Wenet_Conformer_for_Pytorch/examples/timit/run.sh | https://docs.nvidia.com/deeplearning/nccl/user-guide/docs/env.html | 相关说明 | -| 开源代码引入 | https://github.com/wenet-e2e/wenet.git/runtime/ios/WenetDemo/WenetDemo/wenet/WenetDemo-Bridging-Header.h | Wenet_Conformer_for_Pytorch/runtime/ios/WenetDemo/WenetDemo/ViewController.swift | 1067837450@qq.com | 邮箱地址 | -| 开发引入 | / | Wenet_Conformer_for_Pytorch/runtime/gpu/tensorrt/requirements.txt | https://pypi.ngc.nvidia.com | 相关依赖 | +| 文件位置 | 公网地址 | 公网地址用途 | +|-------------------------------------------------------------------------------------------------------------------------------|------------------------------------------------------------------------------------------------------------------------------------------------------------|---------------| +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/.github/workflows/android.yml | https://wenet-1256283475.cos.ap-shanghai.myqcloud.com/models/aishell/20210601_u2%2B%2B_conformer_libtorch.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/examples/aishell/paraformer/run.sh | www.openslr.org/resources/33 | 下载配置 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/examples/aishell/rnnt/run.sh | www.openslr.org/resources/33 | 下载配置 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/examples/aishell/s0/local/aishell_train_lms.sh | http://www.speech.sri.com/projects/srilm/download.html | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/examples/aishell/s0/run.sh | www.openslr.org/resources/33 | 下载配置 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/examples/aishell/s0/run_whisper.sh | www.openslr.org/resources/33 | 下载配置 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/examples/aishell2/s0/local/train_lms.sh | http://www.speech.sri.com/projects/srilm/download.html | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/examples/aishell4/s0/local/validate_data_dir.sh | http://kaldi-asr.org/doc/data_prep.html | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/examples/aishell4/s0/run.sh | https://www.openslr.org/resources/111 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/examples/chime4/s0/local/clean_wsj0_data_prep.sh | http://www.ldc.upenn.edu/Catalog/docs/LDC93S6A/wsj0-train-spkrinfo.txt | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/examples/chime4/s0/local/clean_wsj0_data_prep.sh | https://www.openslr.org/resources/3/sph2pipe_v2.5.tar.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/examples/chime4/s0/local/clean_wsj1_data_prep.sh | https://www.openslr.org/resources/3/sph2pipe_v2.5.tar.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/examples/commonvoice/fr/local/download_data.sh | 
https://mozilla-common-voice-datasets.s3.dualstack.us-west-2.amazonaws.com/cv-corpus-8.0-2022-01-19/cv-corpus-8.0-2022-01-19-fr.tar.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/examples/librispeech/rnnt/run.sh | www.openslr.org/resources/12 | 下载配置 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/examples/librispeech/rnnt/run.sh | http://www.openslr.org/resources/11/librispeech-lexicon.txt | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/examples/librispeech/rnnt/run.sh | http://www.openslr.org/resources/11/${which_lm} | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/examples/librispeech/rnnt/run.sh | https://openslr.elda.org/resources/12 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/examples/librispeech/s0/run.sh | www.openslr.org/resources/12 | 下载配置 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/examples/librispeech/s0/run.sh | http://www.openslr.org/resources/11/librispeech-lexicon.txt | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/examples/librispeech/s0/run.sh | http://www.openslr.org/resources/11/${which_lm} | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/examples/multi_cn/s0/run.sh | www.openslr.org/resources/68 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/examples/multi_cn/s0/run.sh | www.openslr.org/resources/62 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/examples/multi_cn/s0/run.sh | www.openslr.org/resources/18 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/examples/multi_cn/s0/run.sh | www.openslr.org/resources/38 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/examples/multi_cn/s0/run.sh | www.openslr.org/resources/47 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/examples/multi_cn/s0/run.sh | www.openslr.org/resources/33 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/examples/openasr2021/s0/local/prepare_data.sh | https://www.openslr.org/resources/3/sph2pipe_${sph2pipe_version}.tar.gz | 模型相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/examples/openasr2021/s0/run.sh | www.openslr.org/resources/33 | 下载配置 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/examples/swbd/s0/local/swbd1_data_download.sh | http://www.openslr.org/resources/5/switchboard_word_alignments.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/examples/swbd/s0/local/swbd1_data_download.sh | http://www.isip.piconepress.com/projects/switchboard/releases/switchboard_word_alignments.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/examples/tedlium3/s0/local/download_data.sh | http://www.openslr.org/resources/51/TEDLIUM_release-3.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/examples/timit/local/timit_data_prep.sh | https://www.openslr.org/resources/3/sph2pipe_${sph2pipe_version}.tar.gz | 模型相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/examples/timit/local/validate_data_dir.sh | http://kaldi-asr.org/doc/data_prep.html | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/runtime/binding/python/setup.py | binbzha@qq.com | 作者邮箱 | +| 
ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/runtime/core/cmake/boost.cmake | https://boostorg.jfrog.io/artifactory/main/release/1.75.0/source/boost_1_75_0.tar.gz | 下载依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/runtime/core/cmake/ipex.cmake | https://download.pytorch.org/libtorch/cpu/libtorch-shared-with-deps-1.13.0%2Bcpu.zip | 下载依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/runtime/core/cmake/ipex.cmake | https://download.pytorch.org/libtorch/cpu/libtorch-cxx11-abi-shared-with-deps-1.13.0%2Bcpu.zip | 下载依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/runtime/core/cmake/ipex.cmake | http://intel-optimized-pytorch.s3.cn-north-1.amazonaws.com.cn/libipex/cpu/libintel-ext-pt-cxx11-abi-1.13.100%2Bcpu.run | 下载依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/runtime/core/cmake/ipex.cmake | http://intel-optimized-pytorch.s3.cn-north-1.amazonaws.com.cn/libipex/cpu/libintel-ext-pt-1.13.100%2Bcpu.run | 下载依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/runtime/core/cmake/libtorch.cmake | https://download.pytorch.org/libtorch/cpu/libtorch-macos-1.13.0.zip | 下载依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/runtime/core/cmake/libtorch.cmake | https://download.pytorch.org/libtorch/cpu/libtorch-win-shared-with-deps-debug-1.13.0%2Bcpu.zip | 下载依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/runtime/core/cmake/libtorch.cmake | https://download.pytorch.org/libtorch/cpu/libtorch-win-shared-with-deps-1.13.0%2Bcpu.zip | 下载依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/runtime/core/cmake/libtorch.cmake | https://download.pytorch.org/libtorch/cu113/libtorch-shared-with-deps-1.11.0%2Bcu113.zip | 下载依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/runtime/core/cmake/libtorch.cmake | https://download.pytorch.org/libtorch/cu113/libtorch-cxx11-abi-shared-with-deps-1.12.0%2Bcu113.zip | 下载依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/runtime/core/cmake/libtorch.cmake | https://download.pytorch.org/libtorch/cpu/libtorch-shared-with-deps-1.13.0%2Bcpu.zip | 下载依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/runtime/core/cmake/libtorch.cmake | https://download.pytorch.org/libtorch/cpu/libtorch-cxx11-abi-shared-with-deps-1.13.0%2Bcpu.zip | 下载依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/runtime/core/cmake/openvino.cmake | https://storage.openvinotoolkit.org/repositories/openvino/packages/2022.3/windows/w_openvino_toolkit_windows_${VINO_VERSION}.9052.9752fafe8eb_x86_64.zip | 下载依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/runtime/core/cmake/openvino.cmake | https://storage.openvinotoolkit.org/repositories/openvino/packages/2022.3/macos/m_openvino_toolkit_macos_10_15_${VINO_VERSION}.9052.9752fafe8eb_x86_64.tgz | 下载依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/runtime/core/cmake/openvino.cmake | https://storage.openvinotoolkit.org/repositories/openvino/packages/2022.3/linux/l_openvino_toolkit_ubuntu20_${VINO_VERSION}.9052.9752fafe8eb_x86_64.tgz | 下载依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/runtime/core/cmake/openvino.cmake | https://storage.openvinotoolkit.org/repositories/openvino/packages/2022.3/linux/l_openvino_toolkit_ubuntu18_${VINO_VERSION}.9052.9752fafe8eb_x86_64.tgz | 
下载依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/runtime/core/cmake/openvino.cmake | https://storage.openvinotoolkit.org/repositories/openvino/packages/2022.3/linux/l_openvino_toolkit_rhel8_${VINO_VERSION}.9052.9752fafe8eb_x86_64.tgz | 下载依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/runtime/core/cmake/openvino.cmake | https://storage.openvinotoolkit.org/repositories/openvino/packages/2022.3/linux/l_openvino_toolkit_debian9_${VINO_VERSION}.9052.9752fafe8eb_arm64.tgz | 下载依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/runtime/core/cmake/openvino.cmake | https://storage.openvinotoolkit.org/repositories/openvino/packages/2022.3/linux/l_openvino_toolkit_centos7_${VINO_VERSION}.9052.9752fafe8eb_x86_64.tgz | 下载依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/runtime/gpu/cuda_decoders/run.sh | https://wenet-1256283475.cos.ap-shanghai.myqcloud.com/models/aishell/20211025_conformer_exp.tar.gz | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/runtime/gpu/scripts/run_qa.sh | https://wenet-1256283475.cos.ap-shanghai.myqcloud.com/models/aishell/20211025_conformer_exp.tar.gz | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/runtime/gpu/tensorrt/run_streaming_small_model.sh | http://mobvoi-speech-public.ufile.ucloud.cn/public/wenet/aishell2/20210618_u2pp_conformer_exp.tar.gz | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/runtime/gpu/tensorrt_fastertransformer/run.sh | https://wenet-1256283475.cos.ap-shanghai.myqcloud.com/models/aishell/20211025_conformer_exp.tar.gz | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/runtime/gpu/tensorrt_fastertransformer/run_large.sh | https://wenet-1256283475.cos.ap-shanghai.myqcloud.com/models/wenetspeech/20211025_conformer_exp.tar.gz | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/runtime/libtorch/docker/Dockerfile | zhendong.peng@qq.com | maintainer邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/runtime/libtorch/docker/Dockerfile | https://wenet-1256283475.cos.ap-shanghai.myqcloud.com/models/aishell2/$model | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/runtime/libtorch/web/static/fonts/fontawesome-webfont.eot | http://fontawesome.io/license/ | license地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/runtime/libtorch/web/static/fonts/fontawesome-webfont.eot | http://fontawesome.io | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/runtime/libtorch/web/static/fonts/fontawesome-webfont.ttf | http://fontawesome.io/license/ | license地址 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/runtime/libtorch/web/static/fonts/fontawesome-webfont.ttf | http://fontawesome.io | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/runtime/libtorch/web/static/js/recorder/engine/mp3.js | http://www.mp3dev.org/ | html相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/tools/setup_anaconda.sh | https://repo.continuum.io/miniconda/Miniconda3-latest-Linux-x86_64.sh | miniconda链接 | +| ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/tools/sph2wav.sh | https://www.openslr.org/resources/3/sph2pipe_${sph2pipe_version}.tar.gz | 模型相关配置 | +| 
ModelZoo-PyTorch/PyTorch/built-in/audio/Wenet_Conformer_for_Pytorch/tools/validate_data_dir.sh | http://kaldi-asr.org/doc/data_prep.html | 相关说明 | \ No newline at end of file diff --git a/PyTorch/built-in/autonoumous_driving/BEVFormer/README_ORI.md b/PyTorch/built-in/autonoumous_driving/BEVFormer/README_ORI.md index 402b87eeacb7edda9d56d26498babfc66874d436..59908d966536d357fbb2fbd221ef0b53e0731506 100644 --- a/PyTorch/built-in/autonoumous_driving/BEVFormer/README_ORI.md +++ b/PyTorch/built-in/autonoumous_driving/BEVFormer/README_ORI.md @@ -38,19 +38,19 @@ The proposed approach achieves the new state-of-the-art **56.9\%** in terms of N # Model Zoo -| Backbone | Method | Lr Schd | NDS| mAP|memroy | Config | Download | -| :---: | :---: | :---: | :---: | :---:|:---:| :---: | :---: | -| R50 | BEVFormer-tiny_fp16 | 24ep | 35.9|25.7 | - |[config](projects/configs/bevformer_fp16/bevformer_tiny_fp16.py) |[model](https://github.com/zhiqi-li/storage/releases/download/v1.0/bevformer_tiny_fp16_epoch_24.pth)/[log](https://github.com/zhiqi-li/storage/releases/download/v1.0/bevformer_tiny_fp16_epoch_24.log) | -| R50 | BEVFormer-tiny | 24ep | 35.4|25.2 | 6500M |[config](projects/configs/bevformer/bevformer_tiny.py) |[model](https://github.com/zhiqi-li/storage/releases/download/v1.0/bevformer_tiny_epoch_24.pth)/[log](https://github.com/zhiqi-li/storage/releases/download/v1.0/bevformer_tiny_epoch_24.log) | -| [R101-DCN](https://github.com/zhiqi-li/storage/releases/download/v1.0/r101_dcn_fcos3d_pretrain.pth) | BEVFormer-small | 24ep | 47.9|37.0 | 10500M |[config](projects/configs/bevformer/bevformer_small.py) |[model](https://github.com/zhiqi-li/storage/releases/download/v1.0/bevformer_small_epoch_24.pth)/[log](https://github.com/zhiqi-li/storage/releases/download/v1.0/bevformer_small_epoch_24.log) | -| [R101-DCN](https://github.com/zhiqi-li/storage/releases/download/v1.0/r101_dcn_fcos3d_pretrain.pth) | BEVFormer-base | 24ep | 51.7|41.6 |28500M |[config](projects/configs/bevformer/bevformer_base.py) | [model](https://github.com/zhiqi-li/storage/releases/download/v1.0/bevformer_r101_dcn_24ep.pth)/[log](https://github.com/zhiqi-li/storage/releases/download/v1.0/bevformer_r101_dcn_24ep.log) | -| [R50](https://pan.baidu.com/s/1Jh5Aq2YwcD6tdj7Sl5BB3g?pwd=5rij) | BEVformerV2-t1-base | 24ep | 42.6 | 35.1 | 23952M |[config](projects/configs/bevformerv2/bevformerv2-r50-t1-base-24ep.py) | [model/log](https://pan.baidu.com/s/1ynzlAt1DQbH8NkqmisatTw?pwd=fdcv) | -| [R50](https://pan.baidu.com/s/1Jh5Aq2YwcD6tdj7Sl5BB3g?pwd=5rij) | BEVformerV2-t1-base | 48ep | 43.9 | 35.9 | 23952M |[config](projects/configs/bevformerv2/bevformerv2-r50-t1-base-48ep.py) | [model/log](https://pan.baidu.com/s/1ynzlAt1DQbH8NkqmisatTw?pwd=fdcv) | -| [R50](https://pan.baidu.com/s/1Jh5Aq2YwcD6tdj7Sl5BB3g?pwd=5rij) | BEVformerV2-t1 | 24ep | 45.3 | 38.1 | 37579M |[config](projects/configs/bevformerv2/bevformerv2-r50-t1-24ep.py) | [model/log](https://pan.baidu.com/s/1ynzlAt1DQbH8NkqmisatTw?pwd=fdcv) | -| [R50](https://pan.baidu.com/s/1Jh5Aq2YwcD6tdj7Sl5BB3g?pwd=5rij) | BEVformerV2-t1 | 48ep | 46.5 | 39.5 | 37579M |[config](projects/configs/bevformerv2/bevformerv2-r50-t1-48ep.py) | [model/log](https://pan.baidu.com/s/1ynzlAt1DQbH8NkqmisatTw?pwd=fdcv) | -| [R50](https://pan.baidu.com/s/1Jh5Aq2YwcD6tdj7Sl5BB3g?pwd=5rij) | BEVformerV2-t2 | 24ep | 51.8 | 42.0 | 38954M |[config](projects/configs/bevformerv2/bevformerv2-r50-t2-24ep.py) | [model/log](https://pan.baidu.com/s/1ynzlAt1DQbH8NkqmisatTw?pwd=fdcv) | -| 
[R50](https://pan.baidu.com/s/1Jh5Aq2YwcD6tdj7Sl5BB3g?pwd=5rij) | BEVformerV2-t2 | 48ep | 52.6 | 43.1 | 38954M |[config](projects/configs/bevformerv2/bevformerv2-r50-t2-48ep.py) | [model/log](https://pan.baidu.com/s/1ynzlAt1DQbH8NkqmisatTw?pwd=fdcv) | -| [R50](https://pan.baidu.com/s/1Jh5Aq2YwcD6tdj7Sl5BB3g?pwd=5rij) | BEVformerV2-t8 | 24ep | 55.3 | 46.0 | 40392M |[config](projects/configs/bevformerv2/bevformerv2-r50-t8-24ep.py) | [model/log](https://pan.baidu.com/s/1ynzlAt1DQbH8NkqmisatTw?pwd=fdcv) | +| Backbone | Method | Lr Schd | NDS| mAP|memroy | Config | +| :---: | :---: | :---: | :---: | :---:|:---:| :---: | +| R50 | BEVFormer-tiny_fp16 | 24ep | 35.9|25.7 | - |[config](projects/configs/bevformer_fp16/bevformer_tiny_fp16.py) | +| R50 | BEVFormer-tiny | 24ep | 35.4|25.2 | 6500M |[config](projects/configs/bevformer/bevformer_tiny.py) | +| [R101-DCN] | BEVFormer-small | 24ep | 47.9|37.0 | 10500M |[config](projects/configs/bevformer/bevformer_small.py) | +| [R101-DCN] | BEVFormer-base | 24ep | 51.7|41.6 |28500M |[config](projects/configs/bevformer/bevformer_base.py) | +| [R50] | BEVformerV2-t1-base | 24ep | 42.6 | 35.1 | 23952M |[config](projects/configs/bevformerv2/bevformerv2-r50-t1-base-24ep.py) | +| [R50] | BEVformerV2-t1-base | 48ep | 43.9 | 35.9 | 23952M |[config](projects/configs/bevformerv2/bevformerv2-r50-t1-base-48ep.py) | +| [R50] | BEVformerV2-t1 | 24ep | 45.3 | 38.1 | 37579M |[config](projects/configs/bevformerv2/bevformerv2-r50-t1-24ep.py) | +| [R50] | BEVformerV2-t1 | 48ep | 46.5 | 39.5 | 37579M |[config](projects/configs/bevformerv2/bevformerv2-r50-t1-48ep.py) | +| [R50] | BEVformerV2-t2 | 24ep | 51.8 | 42.0 | 38954M |[config](projects/configs/bevformerv2/bevformerv2-r50-t2-24ep.py) | +| [R50] | BEVformerV2-t2 | 48ep | 52.6 | 43.1 | 38954M |[config](projects/configs/bevformerv2/bevformerv2-r50-t2-48ep.py) | +| [R50] | BEVformerV2-t8 | 24ep | 55.3 | 46.0 | 40392M |[config](projects/configs/bevformerv2/bevformerv2-r50-t8-24ep.py) | # Catalog - [ ] BEVFormerV2 HyperQuery diff --git a/PyTorch/built-in/autonoumous_driving/MatrixVT/public_address_statement.md b/PyTorch/built-in/autonoumous_driving/MatrixVT/public_address_statement.md index aaca5bef0fdce9cc0420a6b38b41593b1cadd033..162b202794418df19aa2520f579cf047a87f69ea 100644 --- a/PyTorch/built-in/autonoumous_driving/MatrixVT/public_address_statement.md +++ b/PyTorch/built-in/autonoumous_driving/MatrixVT/public_address_statement.md @@ -1,11 +1,5 @@ - -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱 | 用途说明 | -|:------:|:-------------------------:|:---------------------------------------------------------------------------------------------:|:--------------------:|:-----------------:| -| 开源代码引入 | \BEVDepth\bevdepth\models | \PyTorch\built-in\autonoumous_driving\BEVDepth\bevdepth\models\fusion_bev_depth.py | https://arxiv.org/abs/2112.11790 | BEVDepth的论文链接 | -| 开源代码引入 | \bevdepth\evaluators | \PyTorch\built-in\autonoumous_driving\BEVDepth\bevdepth\evaluators\det_evaluators.py | https://github.com/open-mmlab/mmdetection3d/issues/449 | 评估nusence数据集的方法参考 | -| 开源代码引入 | \bevdepth\callbacks | \PyTorch\built-in\autonoumous_driving\BEVDepth\bevdepth\callbacks\ema.py | https://github.com/rwightman/ | EMA的定义 | -| 开源代码引入 | \bevdepth\callbacks | \PyTorch\built-in\autonoumous_driving\BEVDepth\bevdepth\callbacks\ema.py | https://www.tensorflow.org/api_docs/python/tf/train/ | 参考实现 | -| 开源代码引入 |\bevdepth\layers\backbones\ | \PyTorch\built-in\autonoumous_driving\BEVDepth\bevdepth\layers\backbones\bevstereo_lss_fpn.py | 
https://github.com/nv-tlabs/lift-splat-shoot | 传参确认 | -| 开源代码引入 | \bevdepth\layers\backbones\ | \PyTorch\built-in\autonoumous_driving\BEVDepth\bevdepth\layers\backbones\base_lss_fpn.py | https://github.com/nv-tlabs/lift-splat-shoot | 参考实现 | -| 开源代码引入 | \bevdepth\layers\heads\ | \PyTorch\built-in\autonoumous_driving\BEVDepth\bevdepth\layers\heads\bev_depth_head.py | https://github.com/open-mmlab/mmdetection3d/blob/master/mmdet3d/models/dense_heads/centerpoint_head.py | 功能继承该源码 | -| 开源代码引入 | \bevdepth\models\ | \PyTorch\built-in\autonoumous_driving\BEVDepth\bevdepth\models\bev_stereo.py | https://arxiv.org/abs/2209.10248 | BEVStereo的论文链接 | +| 文件位置 | 公网地址 | 公网地址用途 | +|-------------------------------------------------------------------------------------------------------|------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/built-in/autonoumous_driving/MatrixVT/code_for_change/setup.py | openmmlab@gmail.com | 作者邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/autonoumous_driving/MatrixVT/code_for_change/training_epoch_loop.py | https://pytorch-lightning.readthedocs.io/en/stable/advanced/fault_tolerant_training.html | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/autonoumous_driving/MatrixVT/setup.py | liyinhao@megvii.com | 作者邮箱 | \ No newline at end of file diff --git a/PyTorch/built-in/autonoumous_driving/OpenPCDet/public_address_statement.md b/PyTorch/built-in/autonoumous_driving/OpenPCDet/public_address_statement.md index 45ca0cc5d0419c74ef6fac198c5c7641306624e0..8b3b6edd68090747baee567fcf8d93ebc5b3c7ed 100644 --- a/PyTorch/built-in/autonoumous_driving/OpenPCDet/public_address_statement.md +++ b/PyTorch/built-in/autonoumous_driving/OpenPCDet/public_address_statement.md @@ -1,54 +1,4 @@ -| 类型 | 开源代码地址 | 文件名 | 地址/公网URL地址/域名/邮箱 | 用途说明 | -| ------------ | ------------------------------------------------------------ | ------------------------------------------------------------ | ------------------------------------------------------------ | --------------------------- | -| 开源代码引入 | https://github.com/open-mmlab/OpenPCDet/blob/master/setup.py | OpenPCDet/setup.py | [shaoshuaics@gmail.com](mailto:shaoshuaics@gmail.com) | 原仓作者邮箱 | -| 开源代码引入 | https://github.com/open-mmlab/OpenPCDet/blob/master/pcdet/datasets/kitti/kitti_object_eval_python/rotate_iou.py | OpenPCDet/pcdet/datasets/kitti/kitti_object_eval_python/rotate_iou.py | [scrim@foxmail.com](mailto:scrim@foxmail.com) | rotate_iou源码作者邮箱 | -| 开源代码引入 | https://github.com/open-mmlab/OpenPCDet/blob/master/tools/train_utils/optimization/fastai_optim.py | OpenPCDet/tools/train_utils/optimization/fastai_optim.py | https://github.com/traveller59/second.pytorch | 参考代码地址 | -| 开源代码引入 | https://github.com/open-mmlab/OpenPCDet/blob/master/tools/train_utils/optimization/learning_schedules_fastai.py | OpenPCDet/tools/train_utils/optimization/learning_schedules_fastai.py | https://github.com/traveller59/second.pytorch | 参考代码地址 | -| 开源代码引入 | https://github.com/open-mmlab/OpenPCDet/blob/master/pcdet/utils/transform_utils.py | OpenPCDet/pcdet/utils/transform_utils.py | https://arxiv.org/pdf/2005.13423.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/OpenPCDet/blob/master/pcdet/utils/loss_utils.py | OpenPCDet/pcdet/utils/loss_utils.py | https://www.tensorflow.org/api_docs/python/tf/nn/sigmoid_cross_entropy_with_logits | 参考代码地址 | -| 开源代码引入 | https://github.com/open-mmlab/OpenPCDet/blob/master/pcdet/utils/loss_utils.py | OpenPCDet/pcdet/utils/loss_utils.py | 
https://github.com/facebookresearch/fvcore/blob/master/fvcore/nn/smooth_l1_loss.py | 参考代码地址 | -| 开源代码引入 | https://github.com/open-mmlab/OpenPCDet/blob/master/pcdet/utils/loss_utils.py | OpenPCDet/pcdet/utils/loss_utils.py | https://github.com/tianweiy/CenterPoint | 参考代码地址 | -| 开源代码引入 | https://github.com/open-mmlab/OpenPCDet/blob/master/pcdet/utils/loss_utils.py | OpenPCDet/pcdet/utils/loss_utils.py | https://arxiv.org/abs/1808.01244 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/OpenPCDet/blob/master/pcdet/utils/loss_utils.py | OpenPCDet/pcdet/utils/loss_utils.py | https://github.com/princeton-vl/CornerNet/blob/master/models/py_utils/kp_utils.py#L152 | 参考代码地址 | -| 开源代码引入 | https://github.com/open-mmlab/OpenPCDet/blob/master/pcdet/utils/common_utils.py | OpenPCDet/pcdet/utils/common_utils.py | https://github.com/open-mmlab/mmdetection | 参考代码地址 | -| 开源代码引入 | https://github.com/open-mmlab/OpenPCDet/blob/master/pcdet/utils/box_utils.py | OpenPCDet/pcdet/utils/box_utils.py | https://github.com/kuangliu/torchcv/blob/master/torchcv/utils/box.py | 源码地址 | -| 开源代码引入 | https://github.com/open-mmlab/OpenPCDet/blob/master/pcdet/utils/box_utils.py | OpenPCDet/pcdet/utils/box_utils.py | https://github.com/agent-sgs/PillarNet/blob/master/det3d/core/utils/center_utils.py | 源码地址 | -| 开源代码引入 | https://github.com/open-mmlab/OpenPCDet/blob/master/pcdet/ops/roiaware_pool3d/src/roiaware_pool3d.cpp | OpenPCDet/pcdet/ops/roiaware_pool3d/src/roiaware_pool3d.cpp | https://arxiv.org/abs/1907.03670 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/OpenPCDet/blob/master/pcdet/ops/pointnet2/pointnet2_stack/src/vector_pool_gpu.cu | OpenPCDet/pcdet/ops/pointnet2/pointnet2_stack/src/vector_pool_gpu.cu | https://arxiv.org/abs/2102.00463 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/OpenPCDet/blob/master/pcdet/ops/pointnet2/pointnet2_stack/src/vector_pool.cpp | OpenPCDet/pcdet/ops/pointnet2/pointnet3_stack/src/vector_pool_gpu.h | https://arxiv.org/abs/2102.00463 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/OpenPCDet/blob/master/pcdet/ops/pointnet2/pointnet2_stack/src/vector_pool_gpu.h | OpenPCDet/pcdet/ops/pointnet2/pointnet4_stack/src/vector_pool_gpu.cpp | https://arxiv.org/abs/2102.00463 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/OpenPCDet/blob/master/pcdet/datasets/lyft/lyft_utils.py | OpenPCDet/pcdet/datasets/lyft/lyft_utils.py | https://github.com/poodarchu/Det3D | 参考代码地址 | -| 开源代码引入 | https://github.com/open-mmlab/OpenPCDet/blob/master/pcdet/datasets/lyft/lyft_mAP_eval/lyft_eval.py | OpenPCDet/pcdet/datasets/lyft/lyft_mAP_eval/lyft_eval.py | https://github.com/lyft/nuscenes-devkit.git | 参考代码地址 | -| 开源代码引入 | https://github.com/open-mmlab/OpenPCDet/blob/master/pcdet/datasets/waymo/waymo_eval.py | OpenPCDet/pcdet/datasets/waymo/waymo_eval.py | https://github.com/open-mmlab/OpenPCDet | 参考代码地址 | -| 开源代码引入 | https://github.com/open-mmlab/OpenPCDet/blob/master/pcdet/datasets/waymo/waymo_utils.py | OpenPCDet/pcdet/datasets/waymo/waymo_utils.py | https://github.com/open-mmlab/OpenPCDet | 参考代码地址 | -| 开源代码引入 | https://github.com/open-mmlab/OpenPCDet/blob/master/pcdet/datasets/waymo/waymo_dataset.py | OpenPCDet/pcdet/datasets/waymo/waymo_dataset.py | https://github.com/open-mmlab/OpenPCDet | 参考代码地址 | -| 开源代码引入 | https://github.com/open-mmlab/OpenPCDet/blob/master/pcdet/datasets/nuscenes/nuscenes_dataset.py | OpenPCDet/pcdet/datasets/nuscenes/nuscenes_dataset.py | https://arxiv.org/abs/1908.09492 | 参考论文地址 | -| 开源代码引入 | 
https://github.com/open-mmlab/OpenPCDet/blob/master/pcdet/datasets/nuscenes/nuscenes_utils.py | OpenPCDet/pcdet/datasets/nuscenes/nuscenes_utils.py | [https://github.com/traveller59/second.pytorch ](https://github.com/traveller59/second.pytorch) | 参考代码地址 | -| 开源代码引入 | https://github.com/open-mmlab/OpenPCDet/blob/master/pcdet/datasets/nuscenes/nuscenes_utils.py | OpenPCDet/pcdet/datasets/nuscenes/nuscenes_utils.py | https://github.com/poodarchu/Det3D | 参考代码地址 | -| 开源代码引入 | https://github.com/open-mmlab/OpenPCDet/blob/master/pcdet/datasets/once/once_eval/iou_utils.py | OpenPCDet/pcdet/datasets/once/once_eval/iou_utils.py | https://github.com/hongzhenwang/RRPN-revise | 参考代码地址 | -| 开源代码引入 | https://github.com/open-mmlab/OpenPCDet/blob/master/pcdet/datasets/once/once_eval/iou_utils.py | OpenPCDet/pcdet/datasets/once/once_eval/iou_utils.py | https://github.com/hongzhenwang/RRPN-revise/tree/master/pcdet/rotation | 参考代码地址 | -| 开源代码引入 | https://github.com/open-mmlab/OpenPCDet/blob/master/pcdet/datasets/kitti/kitti_object_eval_python/rotate_iou.py | OpenPCDet/pcdet/datasets/kitti/kitti_object_eval_python/rotate_iou.py | https://github.com/hongzhenwang/RRPN-revise | 参考代码地址 | -| 开源代码引入 | https://github.com/open-mmlab/OpenPCDet/blob/master/pcdet/datasets/kitti/kitti_object_eval_python/rotate_iou.py | OpenPCDet/pcdet/datasets/kitti/kitti_object_eval_python/rotate_iou.py | https://github.com/hongzhenwang/RRPN-revise/tree/master/pcdet/rotation | 参考代码地址 | -| 开源代码引入 | https://github.com/open-mmlab/OpenPCDet/blob/master/pcdet/datasets/argo2/argo2_utils/so3.py | OpenPCDet/pcdet/datasets/argo2/argo2_utils/so3.py | https://en.wikipedia.org/wiki/Conversion_between_quaternions_and_Euler_angles#Source_code_2 | 参考资料地址 | -| 开源代码引入 | [https://github.com/open-mmlab/OpenPCDet/blob/master/pcdet/datasets/argo2/argo2_utils/so4.py](https://github.com/open-mmlab/OpenPCDet/blob/master/pcdet/datasets/argo2/argo2_utils/so3.py) | OpenPCDet/pcdet/datasets/argo2/argo2_utils/so4.py | https://en.wikipedia.org/wiki/Conversion_between_quaternions_and_Euler_angles#Source_code_2 | 参考资料地址 | -| 开源代码引入 | https://github.com/open-mmlab/OpenPCDet/blob/master/pcdet/models/view_transforms/depth_lss.py | OpenPCDet/pcdet/models/view_transforms/depth_lss.py | https://github.com/mit-han-lab/bevfusion/ | 参考代码地址 | -| 开源代码引入 | https://github.com/open-mmlab/OpenPCDet/blob/master/pcdet/models/dense_heads/target_assigner/atss_target_assigner.py | OpenPCDet/pcdet/models/dense_heads/target_assigner/atss_target_assigner.py | https://arxiv.org/abs/1912.02424 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/OpenPCDet/blob/master/pcdet/models/dense_heads/point_intra_part_head.py | OpenPCDet/pcdet/models/dense_heads/point_intra_part_head.py | https://arxiv.org/abs/1907.03670 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/OpenPCDet/blob/master/pcdet/models/dense_heads/transfusion_head.py | OpenPCDet/pcdet/models/dense_heads/transfusion_head.py | https://github.com/mit-han-lab/bevfusion/ | 参考代码地址 | -| 开源代码引入 | https://github.com/open-mmlab/OpenPCDet/blob/master/pcdet/models/dense_heads/point_head_simple.py | OpenPCDet/pcdet/models/dense_heads/point_head_simple.py | https://arxiv.org/abs/1912.13192 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/OpenPCDet/blob/master/pcdet/models/dense_heads/point_head_box.py | OpenPCDet/pcdet/models/dense_heads/point_head_box.py | https://arxiv.org/abs/1812.04244 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/OpenPCDet/blob/master/pcdet/models/model_utils/centernet_utils.py | 
OpenPCDet/pcdet/models/model_utils/centernet_utils.py | https://github.com/tianweiy/CenterPoint | 参考代码地址 | -| 开源代码引入 | https://github.com/open-mmlab/OpenPCDet/blob/master/pcdet/models/model_utils/swin_utils.py | OpenPCDet/pcdet/models/model_utils/swin_utils.py | https://github.com/open-mmlab/mmdetection/blob/ecac3a77becc63f23d9f6980b2a36f86acd00a8a/mmdet/models/layers/transformer/utils.py | 参考代码地址 | -| 开源代码引入 | https://github.com/open-mmlab/OpenPCDet/blob/master/pcdet/models/model_utils/swin_utils.py | OpenPCDet/pcdet/models/model_utils/swin_utils.py | https://people.sc.fsu.edu/~jburkardt/presentations/truncated_normal.pdf | 参考文档地址 | -| 开源代码引入 | https://github.com/open-mmlab/OpenPCDet/blob/master/pcdet/models/model_utils/swin_utils.py | OpenPCDet/pcdet/models/model_utils/swin_utils.py | https://github.com/pytorch/pytorch/blob/master/torch/nn/init.py | 参考代码地址 | -| 开源代码引入 | https://github.com/open-mmlab/OpenPCDet/blob/master/pcdet/models/model_utils/swin_utils.py | OpenPCDet/pcdet/models/model_utils/swin_utils.py | https://github.com/pytorch/pytorch/blob/master/torch/nn/init.py | 参考代码地址 | -| 开源代码引入 | https://github.com/open-mmlab/OpenPCDet/blob/master/pcdet/models/model_utils/swin_utils.py | OpenPCDet/pcdet/models/model_utils/swin_utils.py | https://github.com/rwightman/pytorch-image-models/blob/a2727c1bf78ba0d7b5727f5f95e37fb7f8866b1f/timm/models/layers/drop.py | 参考代码地址 | -| 开源代码引入 | https://github.com/open-mmlab/OpenPCDet/blob/master/pcdet/models/model_utils/swin_utils.py | OpenPCDet/pcdet/models/model_utils/swin_utils.py | https://github.com/rwightman/pytorch-image-models/blob/a2727c1bf78ba0d7b5727f5f95e37fb7f8866b1f/timm/models/layers/drop.py | 参考代码地址 | -| 开源代码引入 | https://github.com/open-mmlab/OpenPCDet/blob/master/pcdet/models/model_utils/swin_utils.py | OpenPCDet/pcdet/models/model_utils/swin_utils.py | https://pytorch.org/docs/stable/generated/torch.nn.Conv2d.html | 参考代码地址 | -| 开源代码引入 | https://github.com/open-mmlab/OpenPCDet/blob/master/pcdet/models/backbones_image/swin.py | OpenPCDet/pcdet/models/backbones_image/swin.py | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/models/backbones/swin.py | 参考代码地址 | -| 开源代码引入 | https://github.com/open-mmlab/OpenPCDet/blob/master/pcdet/models/backbones_image/swin.py | OpenPCDet/pcdet/models/backbones_image/swin.py | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/models/backbones/swin.py | 参考代码地址 | -| 开源代码引入 | https://github.com/open-mmlab/OpenPCDet/blob/master/pcdet/models/backbones_image/swin.py | OpenPCDet/pcdet/models/backbones_image/swin.py | https://arxiv.org/abs/2103.14030 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/OpenPCDet/blob/master/pcdet/models/backbones_image/img_neck/generalized_lss.py | OpenPCDet/pcdet/models/backbones_image/img_neck/generalized_lss.py | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/models/necks/fpn.py | 参考代码地址 | -| 开源代码引入 | https://github.com/open-mmlab/OpenPCDet/blob/master/pcdet/models/backbones_3d/dsvt.py | OpenPCDet/pcdet/models/backbones_3d/dsvt.py | https://arxiv.org/abs/2301.06051 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/OpenPCDet/blob/master/pcdet/models/backbones_3d/spconv_unet.py | OpenPCDet/pcdet/models/backbones_3d/spconv_unet.py | https://arxiv.org/abs/1907.03670 | 参考论文地址 | -| 配置文件引入 | | OpenPCDet/tools/ckpt_config.json | https://download.pytorch.org/models/ | 从pytorch官网下载ckpt的地址 | +| 文件位置 | 公网地址 | 公网地址用途 | +|----------------------------------------------------------------------------------------|--------------------------------------|---------| 
+| ModelZoo-PyTorch/PyTorch/built-in/autonoumous_driving/OpenPCDet/setup.py | shaoshuaics@gmail.com | 作者邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/autonoumous_driving/OpenPCDet/tools/ckpt_config.json | https://download.pytorch.org/models/ | 权重地址 | \ No newline at end of file diff --git a/PyTorch/built-in/autonoumous_driving/SurroundOcc/public_address_statement.md b/PyTorch/built-in/autonoumous_driving/SurroundOcc/public_address_statement.md index 7792b06e3b862dc375e28e3bfb8c29e90cdf2c72..ea781b264e61bf0b466074728abc192d124d4fa3 100644 --- a/PyTorch/built-in/autonoumous_driving/SurroundOcc/public_address_statement.md +++ b/PyTorch/built-in/autonoumous_driving/SurroundOcc/public_address_statement.md @@ -1,40 +1,5 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ------------ | ------------------------------------------------------------ | ------------------------------------------------------------ | ------------------------------------------------------------ | ---------------- | -| 开源代码引入 | https://pytorch.org/docs/stable/amp.html#torch.cuda.amp.GradScaler | patch\mmcv\optimizer.py | https://github.com/open-mmlab/mmcv/blob/1.x/mmcv/runner/hooks/optimizer.py | 代码实现参考链接 | -| 开源代码引入 | https://pytorch.org/docs/stable/amp.html#torch.cuda.amp.GradScaler. | patch\mmcv\optimizer.py | https://github.com/open-mmlab/mmcv/blob/1.x/mmcv/runner/hooks/optimizer.py | 代码实现参考链接 | -| 开源代码引入 | https://arxiv.org/abs/1710.03740 | patch\mmcv\optimizer.py | https://github.com/open-mmlab/mmcv/blob/1.x/mmcv/runner/hooks/optimizer.py | 代码实现参考链接 | -| 开源代码引入 | https://en.wikipedia.org/wiki/Cross-correlation | patch\torch\conv.py | https://github.com/pytorch/pytorch/blob/v1.11.0/torch/nn/modules/conv.py | 代码实现参考链接 | -| 开源代码引入 | https://github.com/vdumoulin/conv_arithmetic/blob/master/README.md | patch\torch\conv.py | https://github.com/pytorch/pytorch/blob/v1.11.0/torch/nn/modules/conv.py | 代码实现参考链接 | -| 开源代码引入 | https://www.matthewzeiler.com/mattzeiler/deconvolutionalnetworks.pdf | patch\torch\conv.py | https://github.com/pytorch/pytorch/blob/v1.11.0/torch/nn/modules/conv.py | 代码实现参考链接 | -| 开源代码引入 | https://mmcv.readthedocs.io/en/latest/api.html#mmcv.runner.LoggerHook | projects\configs\_base_\default_runtime.py | https://github.com/weiyithu/SurroundOcc/blob/main/projects/configs/_base_/default_runtime.py | 代码实现参考链接 | -| 开源代码引入 | https://mmcv.readthedocs.io/en/latest/api.html#mmcv.fileio.FileClient | projects\configs\\_base_\datasets\kitti-3d-3class.py | https://github.com/weiyithu/SurroundOcc/blob/main/projects/configs/_base_/datasets/kitti-3d-3class.py | 代码实现参考链接 | -| 开源代码引入 | https://mmcv.readthedocs.io/en/latest/api.html#mmcv.fileio.FileClient | projects\configs\\_base_\datasets\kitti-3d-car.py | https://github.com/weiyithu/SurroundOcc/blob/main/projects/configs/_base_/datasets/kitti-3d-car.py | 代码实现参考链接 | -| 开源代码引入 | https://mmcv.readthedocs.io/en/latest/api.html#mmcv.fileio.FileClient | projects\configs\\_base_\datasets\lyft-3d.py | https://github.com/weiyithu/SurroundOcc/blob/main/projects/configs/_base_/datasets/lyft-3d.py | 代码实现参考链接 | -| 开源代码引入 | https://mmcv.readthedocs.io/en/latest/api.html#mmcv.fileio.FileClient | projects\configs\\_base_\datasets\nus-3d.py | https://github.com/weiyithu/SurroundOcc/blob/main/projects/configs/_base_/datasets/nus-3d.py | 代码实现参考链接 | -| 开源代码引入 | https://mmcv.readthedocs.io/en/latest/api.html#mmcv.fileio.FileClient | projects\configs\\_base_\datasets\range100_lyft-3d.py | 
https://github.com/weiyithu/SurroundOcc/blob/main/projects/configs/_base_/datasets/range100_lyft-3d.py | 代码实现参考链接 | -| 开源代码引入 | https://mmcv.readthedocs.io/en/latest/api.html#mmcv.fileio.FileClient | projects\configs\\_base_\datasets\waymoD5-3d-3class.py | https://github.com/weiyithu/SurroundOcc/blob/main/projects/configs/_base_/datasets/waymoD5-3d-3class.py | 代码实现参考链接 | -| 开源代码引入 | https://mmcv.readthedocs.io/en/latest/api.html#mmcv.fileio.FileClient | projects\configs\\_base_\datasets\waymoD5-3d-car.py | https://github.com/weiyithu/SurroundOcc/blob/main/projects/configs/_base_/datasets/waymoD5-3d-car.py | 代码实现参考链接 | -| 开源代码引入 | https://github.com/traveller59/second.pytorch/blob/3aba19c9688274f75ebb5e576f65cfe54773c021/torchplus/train/learning_schedules_fastai.py | projects\configs\\_base_\schedules\cyclic_40e.py | https://github.com/weiyithu/SurroundOcc/blob/main/projects/configs/_base_/schedules/cyclic_40e.py | 代码实现参考链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmcv/blob/f48241a65aebfe07db122e9db320c31b685dc674/mmcv/runner/hooks/lr_updater.py | projects\configs\\_base_\schedules\cyclic_40e.py | https://github.com/weiyithu/SurroundOcc/blob/main/projects/configs/_base_/schedules/cyclic_40e.py | 代码实现参考链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmcv/blob/f48241a65aebfe07db122e9db320c31b685dc674/mmcv/runner/hooks/momentum_updater.py | projects\configs\\_base_\schedules\cyclic_40e.py | https://github.com/weiyithu/SurroundOcc/blob/main/projects/configs/_base_/schedules/cyclic_40e.py | 代码实现参考链接 | -| 开源代码引入 | https://mmcv.readthedocs.io/en/latest/api.html#mmcv.fileio.FileClient | projects\configs\datasets\custom_lyft-3d.py | https://github.com/weiyithu/SurroundOcc/blob/main/projects/configs/datasets/custom_lyft-3d.py | 代码实现参考链接 | -| 开源代码引入 | https://mmcv.readthedocs.io/en/latest/api.html#mmcv.fileio.FileClient | projects\configs\datasets\custom_waymo-3d.py | https://github.com/weiyithu/SurroundOcc/blob/main/projects/configs/datasets/custom_waymo-3d.py | 代码实现参考链接 | -| 开源代码引入 | https://github.com/caizhongang/waymo_kitti_converter | projects\mmdet3d_plugin\core\evaluation\kitti2waymo.py | https://github.com/weiyithu/SurroundOcc/blob/main/projects/mmdet3d_plugin/core/evaluation/kitti2waymo.py | 代码实现参考链接 | -| 开源代码引入 | https://github.com/pytorch/pytorch/issues/973 | projects\mmdet3d_plugin\datasets\builder.py | https://github.com/weiyithu/SurroundOcc/blob/main/projects/mmdet3d_plugin/datasets/builder.py | 代码实现参考链接 | -| 开源代码引入 | https://arxiv.org/abs/1412.6980 | projects\mmdet3d_plugin\models\opt\adamw.py | https://github.com/weiyithu/SurroundOcc/blob/main/projects/mmdet3d_plugin/models/opt/adamw.py | 代码实现参考链接 | -| 开源代码引入 | https://arxiv.org/abs/1711.05101 | projects\mmdet3d_plugin\models\opt\adamw.py | https://github.com/weiyithu/SurroundOcc/blob/main/projects/mmdet3d_plugin/models/opt/adamw.py | 代码实现参考链接 | -| 开源代码引入 | https://openreview.net/forum?id=ryQu7f-RZ | projects\mmdet3d_plugin\models\opt\adamw.py | https://github.com/weiyithu/SurroundOcc/blob/main/projects/mmdet3d_plugin/models/opt/adamw.py | 代码实现参考链接 | -| 开源代码引入 | https://arxiv.org/abs/2002.04745 | projects\mmdet3d_plugin\surroundocc\modules\custom_base_transformer_layer.py | https://github.com/weiyithu/SurroundOcc/blob/main/projects/mmdet3d_plugin/surroundocc/modules/custom_base_transformer_layer.py | 代码实现参考链接 | -| 开源代码引入 | https://arxiv.org/pdf/2010.04159.pdf | projects\mmdet3d_plugin\surroundocc\modules\spatial_cross_attention.py | 
https://github.com/weiyithu/SurroundOcc/blob/main/projects/mmdet3d_plugin/surroundocc/modules/spatial_cross_attention.py | 代码实现参考链接 | -| 开源代码引入 | https://arxiv.org/abs/1706.02677 | PyTorch\built-in\autonoumous_driving\SurroundOcc\tools\train.py | https://github.com/weiyithu/SurroundOcc/blob/main/tools/train.py | 代码实现参考链接 | -| 开源代码引入 | https://mmdetection3d.readthedocs.io/en/latest/tutorials/customize_runtime.html#customize-workflow | PyTorch\built-in\autonoumous_driving\SurroundOcc\tools\train.py | https://github.com/weiyithu/SurroundOcc/blob/main/tools/train.py | 代码实现参考链接 | -| 开源代码引入 | https://github.com/charlesq34/pointnet2/blob/master/scannet/scannet_dataset.py#L24 | tools\data_converter\indoor_converter.py | https://github.com/weiyithu/SurroundOcc/blob/main/tools/data_converter/indoor_converter.py | 代码实现参考链接 | -| 开源代码引入 | https://www.kaggle.com/c/3d-object-detection-for-autonomous-vehicles/discussion/110000 | tools\data_converter\lyft_data_fixer.py | https://github.com/weiyithu/SurroundOcc/blob/main/tools/data_converter/lyft_data_fixer.py | 代码实现参考链接 | -| 开源代码引入 | https://arxiv.org/abs/2006.12356 | tools\data_converter\s3dis_data_utils.py | https://github.com/weiyithu/SurroundOcc/blob/main/tools/data_converter/s3dis_data_utils.py | 代码实现参考链接 | -| 开源代码引入 | https://github.com/charlesq34/pointnet2/blob/master/scannet/scannet_dataset.py#L24 | tools\data_converter\s3dis_data_utils.py | https://github.com/weiyithu/SurroundOcc/blob/main/tools/data_converter/s3dis_data_utils.py | 代码实现参考链接 | -| 开源代码引入 | https://github.com/charlesq34/pointnet2/blob/master/scannet/scannet_dataset.py#L24 | tools\data_converter\scannet_data_utils.py | https://github.com/weiyithu/SurroundOcc/blob/main/tools/data_converter/scannet_data_utils.py | 代码实现参考链接 | -| 开源代码引入 | https://github.com/caizhongang/waymo_kitti_converter | tools\data_converter\waymo_converter.py | https://github.com/weiyithu/SurroundOcc/blob/main/tools/data_converter/waymo_converter.py | 代码实现参考链接 | -| 开源代码引入 | https://arxiv.org/abs/1706.02677 | tools\fp16\train.py | https://github.com/weiyithu/SurroundOcc/blob/main/tools/fp16/train.py | 代码实现参考链接 | -| 开源代码引入 | https://mmdetection3d.readthedocs.io/en/latest/tutorials/customize_runtime.html#customize-workflow | tools\fp16\train.py | https://github.com/weiyithu/SurroundOcc/blob/main/tools/fp16/train.py | 代码实现参考链接 | -| | | | | | - +| 文件位置 | 公网地址 | 公网地址用途 | +|---------------------------------------------------------------------------------------|-----------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/built-in/autonoumous_driving/SurroundOcc/patch/torch/conv.py | https://www.matthewzeiler.com/mattzeiler/deconvolutionalnetworks.pdf | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/autonoumous_driving/SurroundOcc/patch/torch/conv.py | https://www.matthewzeiler.com/mattzeiler/deconvolutionalnetworks.pdf | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/autonoumous_driving/SurroundOcc/patch/torch/conv.py | https://www.matthewzeiler.com/mattzeiler/deconvolutionalnetworks.pdf | 相关说明 | \ No newline at end of file diff --git a/PyTorch/built-in/autonoumous_driving/UniAD/public_address_statement.md b/PyTorch/built-in/autonoumous_driving/UniAD/public_address_statement.md index a04d286115939bee6eb7603bd3aa10b1b68a3b79..14a85c6a49696ee057498a038fedaef205bd6cde 100644 --- a/PyTorch/built-in/autonoumous_driving/UniAD/public_address_statement.md +++ b/PyTorch/built-in/autonoumous_driving/UniAD/public_address_statement.md @@ -1,31 +1,4 @@ - -| 类型 | 开源代码地址 | 文件名 | 
公网IP地址/公网URL地址/域名/邮箱 | 用途说明 | -|:------:|:--------------------------------------------------------------------------------------------------------------------------------------:|:------------------------------------------------------------------------------------------:|:--------------------:|:-------------------------:| -| 开源代码引入 | https://github.com/OpenDriveLab/UniAD/blob/main/projects/mmdet3d_plugin/models/opt/adamw.py | /PyTorch/built-in/autonoumous_driving/UniAD/projects/mmdet3d_plugin/models/opt/adamw.py | https://arxiv.org/abs/1412.6980 | Adam的论文链接 | -| 开源代码引入 | https://github.com/OpenDriveLab/UniAD/blob/main/projects/mmdet3d_plugin/models/opt/adamw.py | /PyTorch/built-in/autonoumous_driving/UniAD/projects/mmdet3d_plugin/models/opt/adamw.py | https://arxiv.org/abs/1711.05101 | weight decay的论文链接 | -| 开源代码引入 | https://github.com/OpenDriveLab/UniAD/blob/main/projects/mmdet3d_plugin/models/opt/adamw.py | /PyTorch/built-in/autonoumous_driving/UniAD/projects/mmdet3d_plugin/models/opt/adamw.py | https://openreview.net/forum?id=ryQu7f-RZ | Adam convergence的论文链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmcv/blob/1.x/mmcv/ops/multi_scale_deform_attn.py | /PyTorch/built-in/autonoumous_driving/UniAD/mmcv_need/multi_scale_deform_attn.py | https://arxiv.org/pdf/2010.04159.pdf | DEFORMABLE DETR的论文链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmcv/blob/1.x/mmcv/runner/epoch_based_runner.py | /PyTorch/built-in/autonoumous_driving/UniAD/mmcv_need/epoch_based_runner.py | https://github.com/open-mmlab/mmcv/pull/1108 | mmcv相关pr参考 | -| 开源代码引入 | https://github.com/open-mmlab/mmcv/blob/1.x/mmcv/runner/epoch_based_runner.py | /PyTorch/built-in/autonoumous_driving/UniAD/test/epoch_based_runner_perf.py | https://github.com/open-mmlab/mmcv/pull/1108 | mmcv相关pr参考 | -| 开源代码引入 | https://github.com/open-mmlab/mmcv/blob/1.x/mmcv/runner/hooks/optimizer.py | /PyTorch/built-in/autonoumous_driving/UniAD/mmcv_need/optimizer.py | https://pytorch.org/docs/stable/amp.html#torch.cuda.amp.GradScaler | clip_grad_norm_fused优化器相关 | -| 开源代码引入 | https://github.com/open-mmlab/mmcv/blob/1.x/mmcv/runner/hooks/optimizer.py | /PyTorch/built-in/autonoumous_driving/UniAD/mmcv_need/optimizer.py | https://arxiv.org/abs/1710.03740 | clip_grad_norm_fused优化器相关 | -| 开源代码引入 | https://github.com/OpenDriveLab/UniAD/blob/main/tools/train.py | /PyTorch/built-in/autonoumous_driving/UniAD/tools/train.py | https://arxiv.org/abs/1706.02677 | 代码实现参考链接 | -| 开源代码引入 | https://github.com/OpenDriveLab/UniAD/blob/main/tools/train.py | /PyTorch/built-in/autonoumous_driving/UniAD/tools/train.py | https://mmdetection3d.readthedocs.io/en/latest/tutorials/customize_runtime.html#customize-workflow | 代码实现参考链接 | -| 开源代码引入 | https://github.com/OpenDriveLab/UniAD/blob/main/projects/mmdet3d_plugin/losses/occflow_loss.py | /PyTorch/built-in/autonoumous_driving/UniAD/projects/mmdet3d_plugin/losses/occflow_loss.py | https://arxiv.org/abs/1606.04797 | 代码实现参考链接 | -| 开源代码引入 | https://github.com/OpenDriveLab/UniAD/blob/main/projects/mmdet3d_plugin/datasets/builder.py | /PyTorch/built-in/autonoumous_driving/UniAD/projects/mmdet3d_plugin/datasets/builder.py | https://github.com/pytorch/pytorch/issues/973 | dataloader相关issue | -| 开源代码引入 | https://github.com/OpenDriveLab/UniAD/blob/main/projects/mmdet3d_plugin/datasets/eval_utils/eval_utils.py | /PyTorch/built-in/autonoumous_driving/UniAD/projects/mmdet3d_plugin/datasets/eval_utils/eval_utils.py | https://www.nuscenes.org/object-detection | 代码实现参考链接 | -| 开源代码引入 | 
https://github.com/OpenDriveLab/UniAD/blob/main/projects/mmdet3d_plugin/datasets/eval_utils/map_api.py | /PyTorch/built-in/autonoumous_driving/UniAD/projects/mmdet3d_plugin/datasets/eval_utils/map_api.py | https://www.nuscenes.org/download | 代码实现参考链接 | -| 开源代码引入 | https://github.com/OpenDriveLab/UniAD/blob/main/projects/mmdet3d_plugin/datasets/pipelines/loading.py | /PyTorch/built-in/autonoumous_driving/UniAD/projects/mmdet3d_plugin/datasets/pipelines/loading.py | https://github.com/open-mmlab/mmcv/blob/master/mmcv/fileio/file_client.py | 代码实现参考链接 | -| 开源代码引入 | https://github.com/OpenDriveLab/UniAD/blob/main/projects/mmdet3d_plugin/datasets/data_utils/trajectory_api.py | /PyTorch/built-in/autonoumous_driving/UniAD/projects/mmdet3d_plugin/datasets/data_utils/trajectory_api.py | https://forum.nuscenes.org/t/dimensions-of-the-ego-vehicle-used-to-gather-data/550 | 代码实现参考链接 | -| 开源代码引入 | https://github.com/OpenDriveLab/UniAD/blob/main/projects/mmdet3d_plugin/uniad/modules/spatial_cross_attention.py | /PyTorch/built-in/autonoumous_driving/UniAD/projects/mmdet3d_plugin/uniad/modules/spatial_cross_attention.py | https://arxiv.org/pdf/2010.04159.pdf | DEFORMABLE DETR的论文链接 | -| 开源代码引入 | https://github.com/OpenDriveLab/UniAD/blob/main/projects/mmdet3d_plugin/uniad/modules/decoder.py | /PyTorch/built-in/autonoumous_driving/UniAD/projects/mmdet3d_plugin/uniad/modules/decoder.py | https://arxiv.org/pdf/2010.04159.pdf | DEFORMABLE DETR的论文链接 | -| 开源代码引入 | https://github.com/OpenDriveLab/UniAD/blob/main/projects/mmdet3d_plugin/uniad/modules/temporal_self_attention.py | /PyTorch/built-in/autonoumous_driving/UniAD/projects/mmdet3d_plugin/uniad/modules/temporal_self_attention.py | https://arxiv.org/pdf/2010.04159.pdf | DEFORMABLE DETR的论文链接 | -| 开源代码引入 | https://github.com/OpenDriveLab/UniAD/blob/main/projects/mmdet3d_plugin/uniad/dense_heads/motion_head_plugin/motion_deformable_attn.py | /PyTorch/built-in/autonoumous_driving/UniAD/projects/mmdet3d_plugin/uniad/dense_heads/motion_head_plugin/motion_deformable_attn.py | https://arxiv.org/pdf/2010.04159.pdf | DEFORMABLE DETR的论文链接 | -| 开源代码引入 | https://github.com/OpenDriveLab/UniAD/blob/main/projects/mmdet3d_plugin/uniad/dense_heads/seg_head_plugin/seg_mask_head.py | /PyTorch/built-in/autonoumous_driving/UniAD/projects/mmdet3d_plugin/uniad/dense_heads/seg_head_plugin/seg_mask_head.py | https://github.com/tensorflow/tpu/issues/494#issuecomment-53296self.num_heads956 | 代码实现参考链接 | -| 开源代码引入 | https://github.com/OpenDriveLab/UniAD/blob/main/projects/mmdet3d_plugin/uniad/dense_heads/seg_head_plugin/seg_detr_head.py | /PyTorch/built-in/autonoumous_driving/UniAD/projects/mmdet3d_plugin/uniad/dense_heads/seg_head_plugin/seg_detr_head.py | https://arxiv.org/pdf/2005.12872 | 代码实现参考链接 | -| 开源代码引入 | https://github.com/OpenDriveLab/UniAD/blob/main/projects/mmdet3d_plugin/uniad/dense_heads/motion_head_plugin/motion_optimization.py | /PyTorch/built-in/autonoumous_driving/UniAD/projects/mmdet3d_plugin/uniad/dense_heads/motion_head_plugin/motion_optimization.py | https://github.com/motional/nuplan-devkit | 代码实现参考链接 | -| 开源代码引入 | https://github.com/OpenDriveLab/UniAD/blob/main/projects/mmdet3d_plugin/uniad/dense_heads/motion_head_plugin/motion_deformable_attn.py | /PyTorch/built-in/autonoumous_driving/UniAD/projects/mmdet3d_plugin/uniad/dense_heads/motion_head_plugin/motion_deformable_attn.py | https://arxiv.org/abs/2002.04745 | 代码实现参考链接 | -| 开源代码引入 | 
https://github.com/OpenDriveLab/UniAD/blob/main/projects/mmdet3d_plugin/uniad/dense_heads/planning_head_plugin/collision_optimization.py | /PyTorch/built-in/autonoumous_driving/UniAD/projects/mmdet3d_plugin/uniad/dense_heads/planning_head_plugin/collision_optimization.py | https://github.com/motional/nuplan-devkit | 代码实现参考链接 | -| 开源代码引入 | https://github.com/OpenDriveLab/UniAD/blob/main/projects/mmdet3d_plugin/uniad/dense_heads/panseg_head.py | /PyTorch/built-in/autonoumous_driving/UniAD/projects/mmdet3d_plugin/uniad/dense_heads/panseg_head.py | https://github.com/open-mmlab/mmdetection | 代码实现参考链接 | -| 开源代码引入 | https://github.com/OpenDriveLab/UniAD/blob/main/projects/configs/_base_/default_runtime.py | /PyTorch/built-in/autonoumous_driving/UniAD/projects/configs/_base_/default_runtime.py | https://mmcv.readthedocs.io/en/latest/api.html#mmcv.runner.LoggerHook | 代码实现参考链接 | -| 开源代码引入 | https://github.com/OpenDriveLab/UniAD/blob/main/projects/configs/_base_/datasets/nus-3d.py | /PyTorch/built-in/autonoumous_driving/UniAD/projects/configs/_base_/datasets/nus-3d.py | https://mmcv.readthedocs.io/en/latest/api.html#mmcv.fileio.FileClient | 代码实现参考链接 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|-----------------------------------------------------------------------------------------------------------------------|-------------------------------------------|-----------| +| ModelZoo-PyTorch/PyTorch/built-in/autonoumous_driving/UniAD/projects/mmdet3d_plugin/datasets/eval_utils/eval_utils.py | https://www.nuscenes.org/object-detection | 代码实现参考链接 | +| ModelZoo-PyTorch/PyTorch/built-in/autonoumous_driving/UniAD/projects/mmdet3d_plugin/datasets/eval_utils/map_api.py | https://www.nuscenes.org/download | 数据集链接 | \ No newline at end of file diff --git a/PyTorch/built-in/cv/classification/Beit2_for_PyTorch/public_address_statement.md b/PyTorch/built-in/cv/classification/Beit2_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..adeb1ffebaf37cfabcec77599855de71ca181213 --- /dev/null +++ b/PyTorch/built-in/cv/classification/Beit2_for_PyTorch/public_address_statement.md @@ -0,0 +1,11 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|-------------------------------------------------------------------------------------------------|-----------------------------------------------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Beit2_for_PyTorch/vqkd_teacher/clip/clip.py | https://openaipublic.azureedge.net/clip/models/3035c92b350959924f9f00213499208652fc7ea050643e8b385c2dac08641f02/ViT-L-14-336px.pt | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Beit2_for_PyTorch/vqkd_teacher/clip/clip.py | https://openaipublic.azureedge.net/clip/models/b8cca3fd41ae0c99ba7e8951adf17d267cdb84cd88be6f7c2e0eca1737a03836/ViT-L-14.pt | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Beit2_for_PyTorch/vqkd_teacher/clip/clip.py | https://openaipublic.azureedge.net/clip/models/40d365715913c9da98579312b702a82c18be219cc2a73407c4526f58eba950af/ViT-B-32.pt | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Beit2_for_PyTorch/vqkd_teacher/clip/clip.py | https://openaipublic.azureedge.net/clip/models/5806e77cd80f8b59890b7e101eabd078d9fb84e6937f9e85e4ecb61988df416f/ViT-B-16.pt | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Beit2_for_PyTorch/vqkd_teacher/clip/clip.py | 
https://openaipublic.azureedge.net/clip/models/7e526bd135e493cef0776de27d5f42653e6b4c8bf9e0f653bb11773263205fdd/RN50x4.pt | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Beit2_for_PyTorch/vqkd_teacher/clip/clip.py | https://openaipublic.azureedge.net/clip/models/52378b407f34354e150460fe41077663dd5b39c54cd0bfd2b27167a4a06ec9aa/RN50x16.pt | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Beit2_for_PyTorch/vqkd_teacher/clip/clip.py | https://openaipublic.azureedge.net/clip/models/afeb0e10f9e5a86da6080e35cf09123aca3b358a0c3e3b6c78a7b63bc04b6762/RN50.pt | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Beit2_for_PyTorch/vqkd_teacher/clip/clip.py | https://openaipublic.azureedge.net/clip/models/8fa8567bab74a42d41c5915025a8e4538c3bdbe8804a470a72f30b0d94fab599/RN101.pt | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Beit2_for_PyTorch/vqkd_teacher/dino.py | https://dl.fbaipublicfiles.com/dino/dino_vitbase16_pretrain/dino_vitbase16_pretrain.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/built-in/cv/classification/CRNN_for_PyTorch/public_address_statement.md b/PyTorch/built-in/cv/classification/CRNN_for_PyTorch/public_address_statement.md index 83b35f8d45459654d83647667262846124574e06..e68e09933d769c3622e44d52ad764fc88e1b7f2e 100644 --- a/PyTorch/built-in/cv/classification/CRNN_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/cv/classification/CRNN_for_PyTorch/public_address_statement.md @@ -1,4 +1,6 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|------|--------|-------------------------------------------|------------------------|--------| -| 开发引入 | / | CRNN_for_PyTorch/LMDB_anycard_config.yaml | 127.0.0.1 | 本机IP地址 | -| 开发引入 | / | CRNN_for_PyTorch/infer/requirements.txt | https://github.com/NVIDIA/dllogger.git | 相关依赖 | +| 文件位置 | 公网地址 | 公网地址用途 | +|-----------------------------------------------------------------------------------------------|--------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/CRNN_for_PyTorch/LMDB_8p_config.yaml | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/CRNN_for_PyTorch/LMDB_anycard_config.yaml | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/CRNN_for_PyTorch/LMDB_config.yaml | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/CRNN_for_PyTorch/LMDB_config_pr.yaml | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | \ No newline at end of file diff --git a/PyTorch/built-in/cv/classification/DINO_ID3436_for_PyTorch/public_address_statement.md b/PyTorch/built-in/cv/classification/DINO_ID3436_for_PyTorch/public_address_statement.md index 8e357e28935fcce8b2722ec40f8dee5aee074566..f891b1af48b39ab3ae7433f51d0ce788dd3e2272 100644 --- a/PyTorch/built-in/cv/classification/DINO_ID3436_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/cv/classification/DINO_ID3436_for_PyTorch/public_address_statement.md @@ -1,33 +1,23 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|----------------------------------------------|----------------------------------------------------|-------------------------------------------------------------------------------------------------------------------------|---------| -| 开源代码引入 | https://github.com/facebookresearch/dino.git | DINO_ID3436_for_PyTorch/eval_image_retrieval.py | 
https://dl.fbaipublicfiles.com/dino/dino_vitsmall16_googlelandmark_pretrain/dino_vitsmall16_googlelandmark_pretrain.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/dino.git | DINO_ID3436_for_PyTorch/eval_video_segmentation.py | https://raw.githubusercontent.com/Liusifei/UVC/master/libs/data/palette.txt | 下载数据集 | -| 开源代码引入 | https://github.com/facebookresearch/dino.git | DINO_ID3436_for_PyTorch/hubconf.py | https://dl.fbaipublicfiles.com/dino/dino_deitsmall16_pretrain/dino_deitsmall16_pretrain.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/dino.git | DINO_ID3436_for_PyTorch/hubconf.py | https://dl.fbaipublicfiles.com/dino/dino_deitsmall8_pretrain/dino_deitsmall8_pretrain.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/dino.git | DINO_ID3436_for_PyTorch/hubconf.py | https://dl.fbaipublicfiles.com/dino/dino_vitbase16_pretrain/dino_vitbase16_pretrain.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/dino.git | DINO_ID3436_for_PyTorch/hubconf.py | https://dl.fbaipublicfiles.com/dino/dino_vitbase8_pretrain/dino_vitbase8_pretrain.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/dino.git | DINO_ID3436_for_PyTorch/hubconf.py | https://dl.fbaipublicfiles.com/dino/dino_resnet50_pretrain/dino_resnet50_pretrain.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/dino.git | DINO_ID3436_for_PyTorch/hubconf.py | https://dl.fbaipublicfiles.com/dino/dino_xcit_small_12_p16_pretrain/dino_xcit_small_12_p16_pretrain.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/dino.git | DINO_ID3436_for_PyTorch/hubconf.py | https://dl.fbaipublicfiles.com/dino/dino_xcit_small_12_p8_pretrain/dino_xcit_small_12_p8_pretrain.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/dino.git | DINO_ID3436_for_PyTorch/hubconf.py | https://dl.fbaipublicfiles.com/dino/dino_xcit_medium_24_p16_pretrain/dino_xcit_medium_24_p16_pretrain.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/dino.git | DINO_ID3436_for_PyTorch/hubconf.py | https://dl.fbaipublicfiles.com/dino/dino_xcit_medium_24_p8_pretrain/dino_xcit_medium_24_p8_pretrain.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/dino.git | DINO_ID3436_for_PyTorch/utils.py | https://dl.fbaipublicfiles.com/dino/ | 下载预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/dino.git | DINO_ID3436_for_PyTorch/video_generation.py | https://dl.fbaipublicfiles.com/dino/ | 下载预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/dino.git | DINO_ID3436_for_PyTorch/visualize_attention.py | https://dl.fbaipublicfiles.com/dino/ | 下载预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/dino.git | DINO_ID3436_for_PyTorch/visualize_attention.py | https://dl.fbaipublicfiles.com/dino/img.png | 下载输入数据 | -| 开源代码引入 | https://github.com/facebookresearch/dino.git/utils.py | DINO_ID3436_for_PyTorch/utils.py | https://people.sc.fsu.edu/~jburkardt/presentations/truncated_normal.pdf | 论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/dino.git/eval_video_segmentation.py | DINO_ID3436_for_PyTorch/eval_video_segmentation.py | https://github.com/Liusifei/UVC | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/dino.git/utils.py | DINO_ID3436_for_PyTorch/utils.py | https://github.com/facebookresearch/barlowtwins/blob/main/main.py | 源码实现 | -| 开发引入 | / | DINO_ID3436_for_PyTorch/utils.py | https://beesbuzz.biz/code/16-hsv-color-transforms | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/dino.git/main_dino.py | DINO_ID3436_for_PyTorch/eval_knn.py | 
https://pytorch.org/docs/stable/distributed.html | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/dino.git/utils.py | DINO_ID3436_for_PyTorch/utils.py | https://github.com/facebookresearch/xcit/blob/master/xcit.py#L404-L405 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/dino.git/vision_transformer.py | DINO_ID3436_for_PyTorch/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/vision_transformer.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/dino.git/utils.py | DINO_ID3436_for_PyTorch/utils.py | https://github.com/facebookresearch/detr/blob/master/util/misc.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/dino.git/README.md | DINO_ID3436_for_PyTorch/eval_copy_detection.py | https://lear.inrialpes.fr/~jegou/data.php#copydays | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/dino.git/main_dino.py | DINO_ID3436_for_PyTorch/main_dino.py | https://pytorch.org/docs/stable/distributed.html | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/dino.git/vision_transformer.py | DINO_ID3436_for_PyTorch/vision_transformer.py | https://github.com/facebookresearch/dino.git/issues/8 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/dino.git/main_dino.py | DINO_ID3436_for_PyTorch/eval_copy_detection.py | https://pytorch.org/docs/stable/distributed.html | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/dino.git/run_with_submitit.py | DINO_ID3436_for_PyTorch/run_with_submitit.py | https://github.com/facebookresearch/deit/blob/main/run_with_submitit.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/dino.git/main_dino.py | DINO_ID3436_for_PyTorch/eval_image_retrieval.py | https://pytorch.org/docs/stable/distributed.html | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/dino.git/main_dino.py | DINO_ID3436_for_PyTorch/eval_linear.py | https://pytorch.org/docs/stable/distributed.html | 相关说明 | -| 开发引入 | / | DINO_ID3436_for_PyTorch/main_dino.py | https://github.com/rwightman/pytorch-image-models | 源码实现 | +| 文件位置 | 公网地址 | 公网地址用途 | +|--------------------------------------------------------------------------------------------------------|-------------------------------------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DINO_ID3436_for_PyTorch/eval_copy_detection.py | https://pytorch.org/docs/stable/distributed.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DINO_ID3436_for_PyTorch/eval_image_retrieval.py | https://dl.fbaipublicfiles.com/dino/dino_vitsmall16_googlelandmark_pretrain/dino_vitsmall16_googlelandmark_pretrain.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DINO_ID3436_for_PyTorch/eval_image_retrieval.py | https://pytorch.org/docs/stable/distributed.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DINO_ID3436_for_PyTorch/eval_knn.py | https://pytorch.org/docs/stable/distributed.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DINO_ID3436_for_PyTorch/eval_linear.py | https://pytorch.org/docs/stable/distributed.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DINO_ID3436_for_PyTorch/eval_video_segmentation.py | http://archive.ics.uci.edu/ml/machine-learning-databases/semeion/semeion.data | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DINO_ID3436_for_PyTorch/hubconf.py | 
https://dl.fbaipublicfiles.com/dino/dino_xcit_small_12_p8_pretrain/dino_xcit_small_12_p8_pretrain.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DINO_ID3436_for_PyTorch/hubconf.py | https://dl.fbaipublicfiles.com/dino/dino_xcit_small_12_p16_pretrain/dino_xcit_small_12_p16_pretrain.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DINO_ID3436_for_PyTorch/hubconf.py | https://dl.fbaipublicfiles.com/dino/dino_xcit_medium_24_p8_pretrain/dino_xcit_medium_24_p8_pretrain.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DINO_ID3436_for_PyTorch/hubconf.py | https://dl.fbaipublicfiles.com/dino/dino_xcit_medium_24_p16_pretrain/dino_xcit_medium_24_p16_pretrain.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DINO_ID3436_for_PyTorch/hubconf.py | https://dl.fbaipublicfiles.com/dino/dino_vitbase8_pretrain/dino_vitbase8_pretrain.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DINO_ID3436_for_PyTorch/hubconf.py | https://dl.fbaipublicfiles.com/dino/dino_vitbase16_pretrain/dino_vitbase16_pretrain.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DINO_ID3436_for_PyTorch/hubconf.py | https://dl.fbaipublicfiles.com/dino/dino_resnet50_pretrain/dino_resnet50_pretrain.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DINO_ID3436_for_PyTorch/hubconf.py | https://dl.fbaipublicfiles.com/dino/dino_deitsmall8_pretrain/dino_deitsmall8_pretrain.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DINO_ID3436_for_PyTorch/hubconf.py | https://dl.fbaipublicfiles.com/dino/dino_deitsmall16_pretrain/dino_deitsmall16_pretrain.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DINO_ID3436_for_PyTorch/main_dino.py | https://pytorch.org/docs/stable/distributed.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DINO_ID3436_for_PyTorch/utils.py | https://dl.fbaipublicfiles.com/dino/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DINO_ID3436_for_PyTorch/utils.py | https://dl.fbaipublicfiles.com/dino/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DINO_ID3436_for_PyTorch/video_generation.py | https://dl.fbaipublicfiles.com/dino/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DINO_ID3436_for_PyTorch/visualize_attention.py | https://dl.fbaipublicfiles.com/dino/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DINO_ID3436_for_PyTorch/visualize_attention.py | https://dl.fbaipublicfiles.com/dino/img.png | 数据集链接 | \ No newline at end of file diff --git a/PyTorch/built-in/cv/classification/DeepMar_for_PyTorch/public_address_statement.md b/PyTorch/built-in/cv/classification/DeepMar_for_PyTorch/public_address_statement.md index 248e1508e545380f3cccc8de328362b5ae63e1cb..218b7da33cb89f17e259c87035de7664d457fbe1 100644 --- a/PyTorch/built-in/cv/classification/DeepMar_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/cv/classification/DeepMar_for_PyTorch/public_address_statement.md @@ -1,9 +1,8 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|---------------------------------------------------------------------------|----------------------------------------------|------------------------------------------------------------|---------| -| 开源代码引入 | https://github.com/dangweili/pedestrian-attribute-recognition-pytorch.git | DeepMar_for_PyTorch/baseline/model/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 下载预训练模型 | -| 开源代码引入 | 
https://github.com/dangweili/pedestrian-attribute-recognition-pytorch.git | DeepMar_for_PyTorch/baseline/model/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/dangweili/pedestrian-attribute-recognition-pytorch.git | DeepMar_for_PyTorch/baseline/model/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/dangweili/pedestrian-attribute-recognition-pytorch.git | DeepMar_for_PyTorch/baseline/model/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/dangweili/pedestrian-attribute-recognition-pytorch.git | DeepMar_for_PyTorch/baseline/model/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/dangweili/pedestrian-attribute-recognition-pytorch.git | DeepMar_for_PyTorch/baseline/model/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 下载预训练模型 | -| 开发引入 | / | DeepMar_for_PyTorch/constant.py | 127.0.0.1 | 本机IP地址 | +| 文件位置 | 公网地址 | 公网地址用途 | +|------------------------------------------------------------------------------------------------------|------------------------------------------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DeepMar_for_PyTorch/baseline/model/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DeepMar_for_PyTorch/baseline/model/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DeepMar_for_PyTorch/baseline/model/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DeepMar_for_PyTorch/baseline/model/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DeepMar_for_PyTorch/baseline/model/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DeepMar_for_PyTorch/train_deepmar_resnet50_8p.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | \ No newline at end of file diff --git a/PyTorch/built-in/cv/classification/DenseNet161_ID0455_for_PyTorch/public_address_statement.md b/PyTorch/built-in/cv/classification/DenseNet161_ID0455_for_PyTorch/public_address_statement.md index 5aaba7b7013a7b83de62c29f3a40d25fc0d8266b..7c62ebbd7bd38dc75b2ed6c9cb0cbe7f4e10d351 100644 --- a/PyTorch/built-in/cv/classification/DenseNet161_ID0455_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/cv/classification/DenseNet161_ID0455_for_PyTorch/public_address_statement.md @@ -1,47 +1,26 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|------|--------|----------------------------------------------------------------|------------------------------------------------------------------------------|--------| -| 开发引入 | / | DenseNet161_ID0455_for_PyTorch/.circleci/config.yml | https://repo.anaconda.com/miniconda/Miniconda3-latest-MacOSX-x86_64.sh | 下载依赖 | -| 开发引入 | / | DenseNet161_ID0455_for_PyTorch/.circleci/config.yml.in | https://repo.anaconda.com/miniconda/Miniconda3-latest-MacOSX-x86_64.sh | 下载依赖 | -| 开发引入 | / | DenseNet161_ID0455_for_PyTorch/.travis.yml | https://repo.continuum.io/miniconda/Miniconda3-latest-Linux-x86_64.sh | 下载依赖 | -| 开发引入 | / | DenseNet161_ID0455_for_PyTorch/docs/Makefile 
| http://pytorch.org/vision/ | 下载依赖 | -| 开发引入 | / | DenseNet161_ID0455_for_PyTorch/url.ini | https://docs.python.org/ | 下载依赖 | -| 开发引入 | / | DenseNet161_ID0455_for_PyTorch/url.ini | http://docs.scipy.org/doc/numpy/ | 下载依赖 | -| 开发引入 | / | DenseNet161_ID0455_for_PyTorch/url.ini | https://fonts.googleapis.com/css?family=Lato | 下载配置 | -| 开发引入 | / | DenseNet161_ID0455_for_PyTorch/url.ini | https://repo.continuum.io/miniconda/Miniconda3-latest-Windows-x86_64.exe | 下载依赖 | -| 开发引入 | / | DenseNet161_ID0455_for_PyTorch/url.ini | https://download.pytorch.org/whl/torch_stable.html | 下载依赖 | -| 开发引入 | / | DenseNet161_ID0455_for_PyTorch/url.ini | https://download.pytorch.org/whl/nightly/torch_nightly.html | 下载依赖 | -| 开发引入 | / | DenseNet161_ID0455_for_PyTorch/packaging/torchvision/meta.yaml | https://github.com/pytorch/vision | 下载依赖 | -| 开发引入 | / | DenseNet161_ID0455_for_PyTorch/url.ini | https://download.pytorch.org/whl/nightly/ | 下载依赖 | -| 开发引入 | / | DenseNet161_ID0455_for_PyTorch/url.ini | https://github.com/pytorch/vision | 下载依赖 | -| 开发引入 | / | DenseNet161_ID0455_for_PyTorch/url.ini | https://repo.anaconda.com/miniconda/Miniconda3-latest-MacOSX-x86_64.sh | 下载依赖 | -| 开发引入 | / | DenseNet161_ID0455_for_PyTorch/url.ini | https://download.pytorch.org/whl/nightly/cpu/torch_nightly.html | 下载依赖 | -| 开发引入 | / | DenseNet161_ID0455_for_PyTorch/url.ini | https://download.pytorch.org/whl/ | 下载依赖 | -| 开发引入 | / | DenseNet161_ID0455_for_PyTorch/url.ini | https://www.7-zip.org/a/7z1805-x64.exe | 下载依赖 | -| 开发引入 | / | DenseNet161_ID0455_for_PyTorch/url.ini | https://dev.azure.com/pytorch | 下载依赖 | -| 开发引入 | / | DenseNet161_ID0455_for_PyTorch/url.ini | https://www.dropbox.com/s/z5b7ryz0zrimntl/cuda_9.0.176_windows.7z?dl=1 | 下载依赖 | -| 开发引入 | / | DenseNet161_ID0455_for_PyTorch/url.ini | https://www.dropbox.com/s/6p0xyqh472nu8m1/cudnn-9.0-windows7-x64-v7.zip?dl=1 | 下载依赖 | -| 开发引入 | / | DenseNet161_ID0455_for_PyTorch/url.ini | https://www.dropbox.com/s/7a4sbq0dln6v7t2/cuda_9.1.85_windows.7z?dl=1 | 下载依赖 | -| 开发引入 | / | DenseNet161_ID0455_for_PyTorch/url.ini | https://www.dropbox.com/s/e0prhgsrbyfi4ov/cudnn-9.1-windows7-x64-v7.zip?dl=1 | 下载依赖 | -| 开发引入 | / | DenseNet161_ID0455_for_PyTorch/url.ini | https://www.dropbox.com/s/9mcolalfdj4n979/NvToolsExt.7z?dl=1 | 下载依赖 | -| 开发引入 | / | DenseNet161_ID0455_for_PyTorch/url.ini | https://aka.ms/vs/15/release/vs_buildtools.exe | 下载工具脚本 | -| 开发引入 | / | DenseNet161_ID0455_for_PyTorch/url.ini | soumith@pytorch.org | 邮箱 | -| 开发引入 | / | DenseNet161_ID0455_for_PyTorch/url.ini | http://github.com/pytorch/vision/archive/master.zip | 下载依赖 | -| 开发引入 | / | DenseNet161_ID0455_for_PyTorch/url.ini | https://github.com/pytorch/vision/archive/master.zip | 下载依赖 | -| 开发引入 | / | DenseNet161_ID0455_for_PyTorch/url.ini | http://github.com/pytorch/vision/archive/this_doesnt_exist.zip | 下载配置 | -| 开发引入 | / | DenseNet161_ID0455_for_PyTorch/url.ini | https://download.pytorch.org/vision_tests/io/ | 下载配置 | -| 开发引入 | / | DenseNet161_ID0455_for_PyTorch/modelarts/train_start.py | https://www.github.com/nvidia/apex | 相关说明 | -| 开发引入 | / | DenseNet161_ID0455_for_PyTorch/docs/make.bat | http://sphinx-doc.org/ | 相关说明 | -| 开发引入 | / | DenseNet161_ID0455_for_PyTorch/test/test_cpp_models.py | https://github.com/pytorch/vision/issues/1191 | 相关说明 | -| 开发引入 | / | DenseNet161_ID0455_for_PyTorch/packaging/pkg_helpers.bash | https://github.com/pytorch/pytorch/pull/23408 | 源码实现 | -| 开发引入 | / | DenseNet161_ID0455_for_PyTorch/references/similarity/loss.py | https://github.com/omoindrot/tensorflow-triplet-loss | 源码实现 | -| 开发引入 | / | 
DenseNet161_ID0455_for_PyTorch/packaging/windows/internal/auth.bat | https://docs.microsoft.com/en-us/azure/devops/pipelines/build/triggers?tabs=yaml&view=vsts#my-build-didnt-run-what-happened | 相关说明 | -| 开发引入 | / | DenseNet161_ID0455_for_PyTorch/packaging/conda/build_vision.sh | https://github.com/conda/conda-build/issues/3285 | 相关说明 | -| 开发引入 | / | DenseNet161_ID0455_for_PyTorch/references/classification/train.py | https://www.github.com/nvidia/apex | 相关说明 | -| 开发引入 | / | DenseNet161_ID0455_for_PyTorch/modelarts/train_start.py | https://github.com/NVIDIA/apex/tree/master/examples/imagenet | 源码实现 | -| 开发引入 | / | DenseNet161_ID0455_for_PyTorch/references/video_classification/sampler.py | https://github.com/pytorch/pytorch/issues/23430 | 相关说明 | -| 开发引入 | / | DenseNet161_ID0455_for_PyTorch/references/video_classification/train.py | https://www.github.com/nvidia/apex | 相关说明 | -| 开发引入 | / | DenseNet161_ID0455_for_PyTorch/references/similarity/loss.py | https://omoindrot.github.io/triplet-loss | 相关说明 | -| 开发引入 | / | DenseNet161_ID0455_for_PyTorch/packaging/windows/internal/clone.bat | https://github.com/%PYTORCH_REPO%/%MODULE_NAME% | 源码实现 | -| 开发引入 | / | DenseNet161_ID0455_for_PyTorch/references/video_classification/train.py | https://github.com/NVIDIA/apex/tree/master/examples/imagenet | 源码实现 | -| 开发引入 | / | DenseNet161_ID0455_for_PyTorch/docs/source/conf.py | http://stackoverflow.com/a/41184353/3343043 | 相关说明 | -| 开发引入 | / | DenseNet161_ID0455_for_PyTorch/references/classification/train.py | https://github.com/NVIDIA/apex/tree/master/examples/imagenet | 源码实现 | +| 文件位置 | 公网地址 | 公网地址用途 | +|--------------------------------------------------------------------------------------------------------------------|---------------------------------------------------------------------------------------|----------------| +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet161_ID0455_for_PyTorch/.circleci/config.yml | https://repo.anaconda.com/miniconda/Miniconda3-latest-MacOSX-x86_64.sh | miniconda下载链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet161_ID0455_for_PyTorch/.circleci/config.yml | https://repo.anaconda.com/miniconda/Miniconda3-latest-MacOSX-x86_64.sh | miniconda下载链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet161_ID0455_for_PyTorch/.circleci/config.yml.in | https://repo.anaconda.com/miniconda/Miniconda3-latest-MacOSX-x86_64.sh | miniconda下载链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet161_ID0455_for_PyTorch/.circleci/config.yml.in | https://repo.anaconda.com/miniconda/Miniconda3-latest-MacOSX-x86_64.sh | miniconda下载链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet161_ID0455_for_PyTorch/.travis.yml | https://repo.continuum.io/miniconda/Miniconda3-latest-Linux-x86_64.sh -O miniconda.sh | miniconda链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet161_ID0455_for_PyTorch/packaging/conda/build_vision.sh | https://repo.continuum.io/miniconda/Miniconda3-latest-Windows-x86_64.exe | miniconda下载链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet161_ID0455_for_PyTorch/url.ini | soumith@pytorch.org | 邮箱地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet161_ID0455_for_PyTorch/url.ini | https://download.pytorch.org/whl/torch_stable.html | 三方库地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet161_ID0455_for_PyTorch/url.ini | https://download.pytorch.org/whl/nightly/torch_nightly.html | 三方库地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet161_ID0455_for_PyTorch/url.ini | https://download.pytorch.org/whl/nightly/cpu/torch_nightly.html | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet161_ID0455_for_PyTorch/url.ini | https://www.7-zip.org/a/7z1805-x64.exe | 依赖地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet161_ID0455_for_PyTorch/url.ini | http://docs.scipy.org/doc/numpy/ | 相关依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet161_ID0455_for_PyTorch/url.ini | https://download.pytorch.org/whl/ | 三方库连接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet161_ID0455_for_PyTorch/url.ini | https://www.dropbox.com/s/9mcolalfdj4n979/NvToolsExt.7z?dl=1 | 下载依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet161_ID0455_for_PyTorch/url.ini | https://download.pytorch.org/whl/nightly/ | 三方库连接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet161_ID0455_for_PyTorch/url.ini | https://repo.continuum.io/miniconda/Miniconda3-latest-Windows-x86_64.exe | miniconda下载链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet161_ID0455_for_PyTorch/url.ini | https://repo.anaconda.com/miniconda/Miniconda3-latest-MacOSX-x86_64.sh | miniconda下载链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet161_ID0455_for_PyTorch/url.ini | https://dev.azure.com/pytorch | 下载依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet161_ID0455_for_PyTorch/url.ini | https://www.dropbox.com/s/e0prhgsrbyfi4ov/cudnn-9.1-windows7-x64-v7.zip?dl=1 | 下载依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet161_ID0455_for_PyTorch/url.ini | https://www.dropbox.com/s/6p0xyqh472nu8m1/cudnn-9.0-windows7-x64-v7.zip?dl=1 | 下载依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet161_ID0455_for_PyTorch/url.ini | https://download.pytorch.org/vision_tests/io/ | 下载配置 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet161_ID0455_for_PyTorch/url.ini | https://www.dropbox.com/s/7a4sbq0dln6v7t2/cuda_9.1.85_windows.7z?dl=1 | cuda下载链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet161_ID0455_for_PyTorch/url.ini | https://www.dropbox.com/s/z5b7ryz0zrimntl/cuda_9.0.176_windows.7z?dl=1 | cuda下载链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet161_ID0455_for_PyTorch/url.ini | https://fonts.googleapis.com/css?family=Lato | 下载配置 | \ No newline at end of file diff --git a/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/public_address_statement.md b/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/public_address_statement.md index 6480719e719002b452e0eda6bd8446f073e5628c..8e61c2848ea078b886f6e94924b0e859c374ebb1 100644 --- a/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/public_address_statement.md @@ -1,207 +1,108 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|---------------------------------------|--------------------------------------------------------------------------------|----------------------------------------------------------------------------------------------------|---------| -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/.travis.yml | https://repo.continuum.io/miniconda/Miniconda3-latest-Linux-x86_64.sh | 下载依赖 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/docs/Makefile | 
http://pytorch.org/vision/ | 下载依赖 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/docs/source/conf.py | https://docs.python.org/ | 下载依赖 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/docs/source/conf.py | http://docs.scipy.org/doc/numpy/ | 下载依赖 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/docs/source/conf.py | https://fonts.googleapis.com/css?family=Lato | 下载配置文件 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/packaging/conda/build_vision.sh | https://github.com/pytorch/vision | 下载依赖 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/url.ini | https://repo.continuum.io/miniconda/Miniconda3-latest-Windows-x86_64.exe | 下载依赖 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/packaging/pkg_helpers.bash | https://download.pytorch.org/whl/nightly/${WHEEL_DIR}torch_nightly.html | 下载依赖 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/url.ini | https://download.pytorch.org/whl/torch_stable.html | 下载依赖 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/url.ini | https://download.pytorch.org/whl/nightly/torch_nightly.html | 下载依赖 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/packaging/torchvision/meta.yaml | https://github.com/pytorch/vision | 下载依赖 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/packaging/wheel/linux_manywheel.sh | https://github.com/pytorch/vision | 下载依赖 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/packaging/wheel/linux_manywheel.sh | https://download.pytorch.org/whl/nightly/$CUVER/torch_nightly.html | 下载依赖 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/url.ini | https://repo.anaconda.com/miniconda/Miniconda3-latest-MacOSX-x86_64.sh | 下载依赖 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/packaging/wheel/osx_wheel.sh | https://github.com/pytorch/vision | 下载依赖 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/packaging/wheel/osx_wheel.sh | https://download.pytorch.org/whl/nightly/cpu/torch_nightly.html | 下载依赖 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/url.ini | https://download.pytorch.org/whl | 下载依赖 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/url.ini | https://repo.continuum.io/miniconda/Miniconda3-latest-Windows-x86_64.exe | 下载依赖 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/url.ini | https://www.7-zip.org/a/7z1805-x64.exe | 下载依赖 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/url.ini | https://dev.azure.com/pytorch | 下载依赖 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/url.ini | https://www.dropbox.com/s/z5b7ryz0zrimntl/cuda_9.0.176_windows.7z?dl=1 | 下载依赖 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/url.ini | https://www.dropbox.com/s/6p0xyqh472nu8m1/cudnn-9.0-windows7-x64-v7.zip?dl=1 | 下载依赖 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/url.ini | https://www.dropbox.com/s/7a4sbq0dln6v7t2/cuda_9.1.85_windows.7z?dl=1 | 下载依赖 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/url.ini | https://www.dropbox.com/s/e0prhgsrbyfi4ov/cudnn-9.1-windows7-x64-v7.zip?dl=1 | 下载依赖 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/url.ini | https://www.dropbox.com/s/9mcolalfdj4n979/NvToolsExt.7z?dl=1 | 下载依赖 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/url.ini | https://aka.ms/vs/15/release/vs_buildtools.exe | 下载依赖 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/setup.py | soumith@pytorch.org | 邮箱 | -| 
开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/setup.py | https://github.com/pytorch/vision | 下载依赖 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/test/test_datasets_utils.py | http://github.com/pytorch/vision/archive/master.zip | 下载依赖 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/test/test_datasets_utils.py | https://github.com/pytorch/vision/archive/master.zip | 下载依赖 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/test/test_datasets_utils.py | http://github.com/pytorch/vision/archive/this_doesnt_exist.zip | 下载依赖 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/url.ini | https://download.pytorch.org/vision_tests/io/ | 下载配置 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/url.ini | http://www.vision.caltech.edu/Image_Datasets/Caltech101/101_ObjectCategories.tar.gz | 下载数据集 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/url.ini | http://www.vision.caltech.edu/Image_Datasets/Caltech101/Annotations.tar | 下载数据集 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/url.ini | http://www.vision.caltech.edu/Image_Datasets/Caltech256/256_ObjectCategories.tar | 下载数据集 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/cifar.py | https://www.cs.toronto.edu/~kriz/cifar-10-python.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/cifar.py | https://www.cs.toronto.edu/~kriz/cifar-100-python.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/hmdb51.py | http://serre-lab.clps.brown.edu/wp-content/uploads/2013/10/hmdb51_org.rar | 下载数据集 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/hmdb51.py | https://serre-lab.clps.brown.edu/wp-content/uploads/2013/10/test_train_splits.rar | 下载数据集 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/imagenet.py | http://www.image-net.org/challenges/LSVRC/2012/nnoupb/ILSVRC2012_img_train.tar | 下载数据集 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/imagenet.py | http://www.image-net.org/challenges/LSVRC/2012/nnoupb/ILSVRC2012_img_val.tar | 下载数据集 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/imagenet.py | http://www.image-net.org/challenges/LSVRC/2012/nnoupb/ILSVRC2012_devkit_t12.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/mnist.py | https://raw.githubusercontent.com/facebookresearch/qmnist/master/qmnist-train-images-idx3-ubyte.gz | 下载数据集 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/mnist.py | https://raw.githubusercontent.com/facebookresearch/qmnist/master/qmnist-train-labels-idx2-int.gz | 下载数据集 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/mnist.py | https://raw.githubusercontent.com/facebookresearch/qmnist/master/qmnist-test-images-idx3-ubyte.gz | 下载数据集 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/mnist.py | https://raw.githubusercontent.com/facebookresearch/qmnist/master/qmnist-test-labels-idx2-int.gz | 下载数据集 | -| 开源代码引入 | https://github.com/pytorch/vision.git | 
DenseNet169_ID0454_for_PyTorch/torchvision/datasets/mnist.py | https://raw.githubusercontent.com/facebookresearch/qmnist/master/xnist-images-idx3-ubyte.xz | 下载数据集 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/mnist.py | https://raw.githubusercontent.com/facebookresearch/qmnist/master/xnist-labels-idx2-int.xz | 下载数据集 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/mnist.py | http://yann.lecun.com/exdb/mnist/train-images-idx3-ubyte.gz | 下载数据集 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/mnist.py | http://yann.lecun.com/exdb/mnist/train-labels-idx1-ubyte.gz | 下载数据集 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/mnist.py | http://yann.lecun.com/exdb/mnist/t10k-images-idx3-ubyte.gz | 下载数据集 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/mnist.py | http://yann.lecun.com/exdb/mnist/t10k-labels-idx1-ubyte.gz | 下载数据集 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/mnist.py | http://fashion-mnist.s3-website.eu-central-1.amazonaws.com/train-images-idx3-ubyte.gz | 下载数据集 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/mnist.py | http://fashion-mnist.s3-website.eu-central-1.amazonaws.com/train-labels-idx1-ubyte.gz | 下载数据集 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/mnist.py | http://fashion-mnist.s3-website.eu-central-1.amazonaws.com/t10k-images-idx3-ubyte.gz | 下载数据集 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/mnist.py | http://fashion-mnist.s3-website.eu-central-1.amazonaws.com/t10k-labels-idx1-ubyte.gz | 下载数据集 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/mnist.py | http://codh.rois.ac.jp/kmnist/dataset/kmnist/train-images-idx3-ubyte.gz | 下载数据集 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/mnist.py | http://codh.rois.ac.jp/kmnist/dataset/kmnist/train-labels-idx1-ubyte.gz | 下载数据集 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/mnist.py | http://codh.rois.ac.jp/kmnist/dataset/kmnist/t10k-images-idx3-ubyte.gz | 下载数据集 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/mnist.py | http://codh.rois.ac.jp/kmnist/dataset/kmnist/t10k-labels-idx1-ubyte.gz | 下载数据集 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/mnist.py | https://cloudstor.aarnet.edu.au/plus/index.php/s/54h3OuGJhFLwAlQ/download | 下载数据集 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/url.ini | https://github.com/brendenlake/omniglot/raw/master/python | 下载依赖 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/phototour.py | http://matthewalunbrown.com/patchdata/notredame_harris.zip | 下载数据集 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/phototour.py | http://matthewalunbrown.com/patchdata/yosemite_harris.zip | 下载数据集 | -| 开源代码引入 | https://github.com/pytorch/vision.git | 
DenseNet169_ID0454_for_PyTorch/torchvision/datasets/phototour.py | http://matthewalunbrown.com/patchdata/liberty_harris.zip | 下载数据集 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/phototour.py | http://icvl.ee.ic.ac.uk/vbalnt/notredame.zip | 下载数据集 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/phototour.py | http://icvl.ee.ic.ac.uk/vbalnt/yosemite.zip | 下载数据集 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/phototour.py | http://icvl.ee.ic.ac.uk/vbalnt/liberty.zip | 下载数据集 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/sbd.py | http://www.eecs.berkeley.edu/Research/Projects/CS/vision/grouping/semantic_contours/benchmark.tgz | 下载数据集 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/sbd.py | http://home.bharathh.info/pubs/codes/SBD/train_noval.txt | 下载数据集 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/url.ini | http://www.cs.virginia.edu/~vicente/sbucaptions/SBUCaptionedPhotoDataset.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/semeion.py | http://archive.ics.uci.edu/ml/machine-learning-databases/semeion/semeion.data | 下载数据集 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/stl10.py | http://ai.stanford.edu/~acoates/stl10/stl10_binary.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/svhn.py | http://ufldl.stanford.edu/housenumbers/train_32x32.mat | 下载数据集 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/svhn.py | http://ufldl.stanford.edu/housenumbers/test_32x32.mat | 下载数据集 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/svhn.py | http://ufldl.stanford.edu/housenumbers/extra_32x32.mat | 下载数据集 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/usps.py | https://www.csie.ntu.edu.tw/~cjlin/libsvmtools/datasets/multiclass/usps.bz2 | 下载数据集 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/usps.py | https://www.csie.ntu.edu.tw/~cjlin/libsvmtools/datasets/multiclass/usps.t.bz2 | 下载数据集 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/url.ini | https://docs.google.com/uc?export=download | 下载链接 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/voc.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2012/VOCtrainval_11-May-2012.tar | 下载数据集 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/voc.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2011/VOCtrainval_25-May-2011.tar | 下载数据集 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/voc.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2010/VOCtrainval_03-May-2010.tar | 下载数据集 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/voc.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2009/VOCtrainval_11-May-2009.tar | 下载数据集 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/voc.py | 
http://host.robots.ox.ac.uk/pascal/VOC/voc2008/VOCtrainval_14-Jul-2008.tar | 下载数据集 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/voc.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2007/VOCtrainval_06-Nov-2007.tar | 下载数据集 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/url.ini | https://download.pytorch.org/models/alexnet-owt-4df8aa71.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/torchvision/models/densenet.py | https://download.pytorch.org/models/densenet121-a639ec97.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/torchvision/models/densenet.py | https://download.pytorch.org/models/densenet169-b2777c0a.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/torchvision/models/densenet.py | https://download.pytorch.org/models/densenet201-c1103571.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/torchvision/models/densenet.py | https://download.pytorch.org/models/densenet161-8d451a50.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/torchvision/models/detection/faster_rcnn.py | https://download.pytorch.org/models/fasterrcnn_resnet50_fpn_coco-258fb6c6.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/torchvision/models/detection/keypoint_rcnn.py | https://download.pytorch.org/models/keypointrcnn_resnet50_fpn_coco-9f466800.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/torchvision/models/detection/mask_rcnn.py | https://download.pytorch.org/models/maskrcnn_resnet50_fpn_coco-bf2d0c1e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/torchvision/models/googlenet.py | https://download.pytorch.org/models/googlenet-1378be20.pth | 下载预训练模型 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/url.ini | https://download.pytorch.org/models/inception_v3_google-1a9a5a14.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/torchvision/models/mobilenet.py | https://download.pytorch.org/models/mobilenet_v2-b0353104.pth | 下载预训练模型 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/url.ini | https://download.pytorch.org/models/resnet18-5c106cde.pth | 下载预训练模型 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/url.ini | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 下载预训练模型 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/url.ini | https://download.pytorch.org/models/resnet50-19c8e357.pth | 下载预训练模型 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/url.ini | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 下载预训练模型 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/url.ini | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 下载预训练模型 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/url.ini | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 下载预训练模型 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/url.ini | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 下载预训练模型 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/url.ini | https://download.pytorch.org/models/wide_resnet50_2-95faca4d.pth | 下载预训练模型 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/url.ini | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/pytorch/vision.git | 
DenseNet169_ID0454_for_PyTorch/torchvision/models/segmentation/segmentation.py | https://download.pytorch.org/models/fcn_resnet101_coco-7ecb50ca.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/torchvision/models/segmentation/segmentation.py | https://download.pytorch.org/models/deeplabv3_resnet101_coco-586e9e4e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/torchvision/models/shufflenetv2.py | https://download.pytorch.org/models/shufflenetv2_x0.5-f707e7126e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/torchvision/models/shufflenetv2.py | https://download.pytorch.org/models/shufflenetv2_x1-5666bf0f80.pth | 下载预训练模型 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/url.ini | https://download.pytorch.org/models/squeezenet1_0-a815701f.pth | 下载预训练模型 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/url.ini | https://download.pytorch.org/models/squeezenet1_1-f364aa15.pth | 下载预训练模型 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/url.ini | https://download.pytorch.org/models/vgg11-bbd30ac9.pth | 下载预训练模型 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/url.ini | https://download.pytorch.org/models/vgg13-c768596a.pth | 下载预训练模型 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/url.ini | https://download.pytorch.org/models/vgg16-397923af.pth | 下载预训练模型 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/url.ini | https://download.pytorch.org/models/vgg19-dcbb9e9d.pth | 下载预训练模型 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/url.ini | https://download.pytorch.org/models/vgg11_bn-6002323d.pth | 下载预训练模型 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/url.ini | https://download.pytorch.org/models/vgg13_bn-abd245e5.pth | 下载预训练模型 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/url.ini | https://download.pytorch.org/models/vgg16_bn-6c64b313.pth | 下载预训练模型 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/url.ini | https://download.pytorch.org/models/vgg19_bn-c79401a0.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/torchvision/models/video/resnet.py | https://download.pytorch.org/models/r3d_18-b3b3357e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/torchvision/models/video/resnet.py | https://download.pytorch.org/models/mc3_18-a90a0ba3.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet169_ID0454_for_PyTorch/torchvision/models/video/resnet.py | https://download.pytorch.org/models/r2plus1d_18-91a641e6.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/pytorch/vision.git/torchvision/datasets/samplers/clip_sampler.py | DenseNet169_ID0454_for_PyTorch/references/video_classification/sampler.py | https://github.com/pytorch/pytorch/issues/23430 | 相关说明 | -| 开源代码引入 | https://github.com/pytorch/vision.git/torchvision/models/densenet.py | DenseNet169_ID0454_for_PyTorch/torchvision/csrc/models/densenet.h | https://arxiv.org/pdf/1608.06993.pdf | 论文地址 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/hmdb51.py | http://serre-lab.clps.brown.edu/resource/hmdb-a-large-human-motion-database/ | 相关说明 | -| 开源代码引入 | https://github.com/pytorch/vision.git/torchvision/models/alexnet.py | DenseNet169_ID0454_for_PyTorch/torchvision/models/alexnet.py | https://arxiv.org/abs/1404.5997 | 论文地址 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/mnist.py | https://github.com/rois-codh/kmnist | 源码实现 | -| 开源代码引入 | 
https://github.com/pytorch/vision.git/torchvision/datasets/phototour.py | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/phototour.py | http://phototour.cs.washington.edu/patches/default.htm | 相关说明 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/references/classification/train_mp.py | https://github.com/NVIDIA/apex/tree/master/examples/imagenet | 源码实现 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/torchvision/models/vgg.py | https://arxiv.org/pdf/1409.1556.pdf | 论文地址 | -| 开源代码引入 | https://github.com/pytorch/vision.git/torchvision/prototype/datasets/_builtin/sbd.py | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/sbd.py | http://home.bharathh.info/pubs/codes/SBD/download.html | 相关说明 | -| 开源代码引入 | https://github.com/pytorch/vision.git/torchvision/models/resnet.py | DenseNet169_ID0454_for_PyTorch/torchvision/csrc/models/resnet.h | https://arxiv.org/abs/1706.02677 | 论文地址 | -| 开源代码引入 | https://github.com/pytorch/vision.git/torchvision/models/segmentation/deeplabv3.py | DenseNet169_ID0454_for_PyTorch/torchvision/models/segmentation/deeplabv3.py | https://arxiv.org/abs/1706.05587 | 论文地址 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/packaging/pkg_helpers.bash | https://github.com/pytorch/pytorch/pull/23408 | 源码实现 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/torchvision/ops/misc.py | https://github.com/pytorch/pytorch/issues/12013 | 相关说明 | -| 开源代码引入 | https://github.com/pytorch/vision.git/torchvision/models/_utils.py | DenseNet169_ID0454_for_PyTorch/torchvision/models/mobilenet.py | https://github.com/tensorflow/models/blob/master/research/slim/nets/mobilenet/mobilenet.py | 源码实现 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/packaging/conda/build_vision.sh | https://github.com/conda/conda-build/issues/3285 | 相关说明 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/torchvision/utils.py | https://gist.github.com/anonymous/bf16430f7750c023141c562f3e9f2a91 | 相关说明 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/modelarts/train_start.py | https://www.github.com/nvidia/apex | 相关说明 | -| 开源代码引入 | https://github.com/pytorch/vision.git/references/similarity/loss.py | DenseNet169_ID0454_for_PyTorch/references/similarity/loss.py | https://omoindrot.github.io/triplet-loss | 相关说明 | -| 开源代码引入 | https://github.com/pytorch/vision.git/torchvision/models/video/resnet.py | DenseNet169_ID0454_for_PyTorch/torchvision/models/video/resnet.py | https://arxiv.org/abs/1711.11248 | 论文地址 | -| 开源代码引入 | https://github.com/pytorch/vision.git/torchvision/datasets/utils.py | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/utils.py | https://stackoverflow.com/questions/38511444/python-download-files-from-google-drive-using-url | 相关说明 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/semeion.py | http://archive.ics.uci.edu/ml/datasets/semeion+handwritten+digit | 相关说明 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/flickr.py | http://nlp.cs.illinois.edu/HockenmaierGroup/8k-pictures.html | 相关说明 | -| 开源代码引入 | https://github.com/pytorch/vision.git/torchvision/models/quantization/inception.py | DenseNet169_ID0454_for_PyTorch/torchvision/models/inception.py | http://arxiv.org/abs/1512.00567 | 论文地址 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/modelarts/train_start.py | https://github.com/NVIDIA/apex/tree/master/examples/imagenet | 源码实现 | -| 开源代码引入 | https://github.com/pytorch/vision.git/docs/make.bat | DenseNet169_ID0454_for_PyTorch/docs/make.bat | http://sphinx-doc.org/ | 相关说明 | -| 开源代码引入 | https://github.com/pytorch/vision.git/torchvision/models/squeezenet.py | 
DenseNet169_ID0454_for_PyTorch/torchvision/csrc/models/squeezenet.h | https://github.com/DeepScale/SqueezeNet/tree/master/SqueezeNet_v1.1 | 源码实现 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/lsun.py | http://lsun.cs.princeton.edu | 相关说明 | -| 开源代码引入 | https://github.com/pytorch/vision.git/torchvision/transforms/v2/_geometry.py | DenseNet169_ID0454_for_PyTorch/torchvision/transforms/functional.py | https://pillow.readthedocs.io/en/latest/handbook/concepts.html#filters | 相关说明 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/mnist.py | https://www.westernsydney.edu.au/bens/home/reproducible_research/emnist | 相关说明 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/packaging/windows/internal/auth.bat | https://docs.microsoft.com/en-us/azure/devops/pipelines/build/triggers?tabs=yaml&view=vsts#my-build-didnt-run-what-happened | 相关说明 | -| 开源代码引入 | https://github.com/pytorch/vision.git/torchvision/models/quantization/googlenet.py | DenseNet169_ID0454_for_PyTorch/torchvision/models/googlenet.py | http://arxiv.org/abs/1409.4842 | 论文地址 | -| 开源代码引入 | https://github.com/pytorch/vision.git/torchvision/transforms/v2/_type_conversion.py | DenseNet169_ID0454_for_PyTorch/torchvision/transforms/functional.py | https://pillow.readthedocs.io/en/latest/handbook/concepts.html#concept-modes | 相关说明 | -| 开源代码引入 | https://github.com/pytorch/vision.git/torchvision/models/squeezenet.py | DenseNet169_ID0454_for_PyTorch/torchvision/models/squeezenet.py | https://github.com/DeepScale/SqueezeNet/tree/master/SqueezeNet_v1.1 | 源码实现 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/test/test_cpp_models.py | https://github.com/pytorch/vision/issues/1191 | 相关说明 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/torchvision/models/mnasnet.py | https://download.pytorch.org/models/mnasnet0.5_top1_67.592-7c6cb539b9.pth | 预训练模型 | -| 开源代码引入 | https://github.com/pytorch/vision.git/references/similarity/loss.py | DenseNet169_ID0454_for_PyTorch/references/similarity/loss.py | https://github.com/omoindrot/tensorflow-triplet-loss | 源码实现 | -| 开源代码引入 | https://github.com/pytorch/vision.git/torchvision/models/densenet.py | DenseNet169_ID0454_for_PyTorch/torchvision/models/densenet.py | https://arxiv.org/pdf/1707.06990.pdf | 论文地址 | -| 开源代码引入 | https://github.com/pytorch/vision.git/docs/source/conf.py | DenseNet169_ID0454_for_PyTorch/docs/source/conf.py | http://stackoverflow.com/a/41184353/3343043 | 相关说明 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/flickr.py | http://web.engr.illinois.edu/~bplumme2/Flickr30kEntities/ | 相关说明 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/references/classification/train_mp.py | https://www.github.com/nvidia/apex | 相关说明 | -| 开源代码引入 | https://github.com/pytorch/vision.git/docs/source/models/squeezenet.rst | DenseNet169_ID0454_for_PyTorch/torchvision/models/squeezenet.py | https://arxiv.org/abs/1602.07360 | 论文地址 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/kinetics.py | https://deepmind.com/research/open-source/open-source-datasets/kinetics/ | 相关说明 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/torchvision/io/video.py | https://github.com/mikeboers/PyAV#installation | 源码实现 | -| 开源代码引入 | https://github.com/pytorch/vision.git/torchvision/models/quantization/mobilenetv2.py | DenseNet169_ID0454_for_PyTorch/torchvision/models/mobilenet.py | https://arxiv.org/abs/1801.04381 | 论文地址 | -| 开源代码引入 | https://github.com/pytorch/vision.git/torchvision/prototype/datasets/_builtin/mnist.py | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/mnist.py | 
https://github.com/zalandoresearch/fashion-mnist | 源码实现 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/packaging/windows/internal/clone.bat | https://github.com/%PYTORCH_REPO%/%MODULE_NAME% | 源码实现 | -| 开源代码引入 | https://github.com/pytorch/vision.git/torchvision/models/densenet.py | DenseNet169_ID0454_for_PyTorch/torchvision/models/densenet.py | https://arxiv.org/pdf/1608.06993.pdf | 论文地址 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/sbd.py | https://docs.scipy.org/doc/ | 相关说明 | -| 开源代码引入 | https://github.com/pytorch/vision.git/torchvision/io/video.py | DenseNet169_ID0454_for_PyTorch/torchvision/io/video.py | https://github.com/FFmpeg/FFmpeg/commit/d5a21172283572af587b3d939eba0091484d3263 | 源码实现 | -| 开源代码引入 | https://github.com/pytorch/vision.git/torchvision/models/mnasnet.py | DenseNet169_ID0454_for_PyTorch/torchvision/models/mnasnet.py | https://download.pytorch.org/models/mnasnet1.0_top1_73.512-f206786ef8.pth | 预训练模型 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/torchvision/models/mnasnet.py | https://arxiv.org/pdf/1807.11626.pdf | 论文地址 | -| 开源代码引入 | https://github.com/pytorch/vision.git/torchvision/models/resnet.py | DenseNet169_ID0454_for_PyTorch/torchvision/models/resnet.py | https://arxiv.org/abs/1706.02677 | 论文地址 | -| 开源代码引入 | https://github.com/pytorch/vision.git/torchvision/models/shufflenetv2.py | DenseNet169_ID0454_for_PyTorch/torchvision/models/shufflenetv2.py | https://arxiv.org/abs/1807.11164 | 论文地址 | -| 开源代码引入 | https://github.com/pytorch/vision.git/docs/source/models/squeezenet.rst | DenseNet169_ID0454_for_PyTorch/torchvision/csrc/models/squeezenet.h | https://arxiv.org/abs/1602.07360 | 论文地址 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/coco.py | http://mscoco.org/dataset/#detections-challenge2016 | 相关说明 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/torchvision/models/resnet.py | https://arxiv.org/pdf/1611.05431.pdf | 论文地址 | -| 开源代码引入 | https://github.com/pytorch/vision.git/torchvision/datasets/folder.py | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/folder.py | https://github.com/python-pillow/Pillow/issues/835 | 相关说明 | -| 开源代码引入 | https://github.com/pytorch/vision.git/torchvision/transforms/v2/_geometry.py | DenseNet169_ID0454_for_PyTorch/torchvision/transforms/transforms.py | https://pillow.readthedocs.io/en/latest/handbook/concepts.html#filters | 相关说明 | -| 开源代码引入 | https://github.com/pytorch/vision.git/torchvision/prototype/datasets/_builtin/mnist.py | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/mnist.py | https://github.com/facebookresearch/qmnist | 源码实现 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/references/classification/train.py | https://github.com/NVIDIA/apex/tree/master/examples/imagenet | 源码实现 | -| 开源代码引入 | https://github.com/pytorch/vision.git/torchvision/datasets/cityscapes.py | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/cityscapes.py | https://github.com/mcordts/cityscapesScripts | 源码实现 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/ucf101.py | https://www.crcv.ucf.edu/data/UCF101.php | 相关说明 | -| 开源代码引入 | https://github.com/pytorch/vision.git/torchvision/transforms/functional.py | DenseNet169_ID0454_for_PyTorch/torchvision/transforms/functional.py | https://en.wikipedia.org/wiki/Gamma_correction | 相关说明 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/coco.py | http://mscoco.org/dataset/#captions-challenge2015 | 相关说明 | -| 开源代码引入 | https://github.com/pytorch/vision.git/torchvision/transforms/v2/_type_conversion.py | 
DenseNet169_ID0454_for_PyTorch/torchvision/transforms/transforms.py | https://pillow.readthedocs.io/en/latest/handbook/concepts.html#concept-modes | 相关说明 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/torchvision/transforms/transforms.py | https://arxiv.org/pdf/1708.04896.pdf | 论文地址 | -| 开源代码引入 | https://github.com/pytorch/vision.git/torchvision/ops/boxes.py | DenseNet169_ID0454_for_PyTorch/torchvision/ops/boxes.py | https://github.com/kuangliu/torchcv/blob/master/torchcv/utils/box.py | 源码实现 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/references/video_classification/train.py | https://www.github.com/nvidia/apex | 相关说明 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/hmdb51.py | http://serre-lab.clps.brown.edu/wp-content/uploads/2013/10/test_train_splits.rar | 相关说明 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/references/classification/train.py | https://www.github.com/nvidia/apex | 相关说明 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/celeba.py | http://mmlab.ie.cuhk.edu.hk/projects/CelebA.html | 相关说明 | -| 开源代码引入 | https://github.com/pytorch/vision.git/torchvision/prototype/datasets/_builtin/cifar.py | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/cifar.py | https://www.cs.toronto.edu/~kriz/cifar.html | 相关说明 | -| 开源代码引入 | https://github.com/pytorch/vision.git/torchvision/transforms/functional.py | DenseNet169_ID0454_for_PyTorch/torchvision/transforms/functional.py | https://en.wikipedia.org/wiki/Hue | 相关说明 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/references/video_classification/train.py | https://github.com/NVIDIA/apex/tree/master/examples/imagenet | 源码实现 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/torchvision/models/resnet.py | https://arxiv.org/pdf/1512.03385.pdf | 论文地址 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/torchvision/models/resnet.py | https://arxiv.org/pdf/1605.07146.pdf | 论文地址 | -| 开源代码引入 | https://github.com/pytorch/vision.git/torchvision/ops/poolers.py | DenseNet169_ID0454_for_PyTorch/torchvision/ops/feature_pyramid_network.py | https://arxiv.org/abs/1612.03144 | 论文地址 | -| 开源代码引入 | https://github.com/pytorch/vision.git/torchvision/prototype/datasets/_builtin/usps.py | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/usps.py | https://www.csie.ntu.edu.tw/~cjlin/libsvmtools/datasets/multiclass.html#usps | 相关说明 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/cityscapes.py | http://www.cityscapes-dataset.com/ | 相关说明 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/stl10.py | https://cs.stanford.edu/~acoates/stl10/ | 相关说明 | -| 开发引入 | / | DenseNet169_ID0454_for_PyTorch/torchvision/datasets/imagenet.py | http://image-net.org/ | 相关说明 | -| 开源代码引入 | https://github.com/pytorch/vision.git/torchvision/models/alexnet.py | DenseNet169_ID0454_for_PyTorch/torchvision/csrc/models/alexnet.h | https://arxiv.org/abs/1404.5997 | 论文地址 | -| 开源代码引入 | https://github.com/pytorch/vision.git/torchvision/models/quantization/inception.py | DenseNet169_ID0454_for_PyTorch/torchvision/csrc/models/inception.h | http://arxiv.org/abs/1512.00567 | 论文地址 | +| 文件位置 | 公网地址 | 公网地址用途 | +|------------------------------------------------------------------------------------------------------------------------------------|----------------------------------------------------------------------------------------------------|----------------| +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/.travis.yml | https://repo.continuum.io/miniconda/Miniconda3-latest-Linux-x86_64.sh -O miniconda.sh | 
miniconda链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/packaging/pkg_helpers.bash | https://download.pytorch.org/whl/nightly/${WHEEL_DIR}torch_nightly.html | 三方库连接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/packaging/wheel/linux_manywheel.sh | https://download.pytorch.org/whl/nightly/$CUVER/torch_nightly.html | 三方库连接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/packaging/wheel/osx_wheel.sh | https://download.pytorch.org/whl/nightly/cpu/torch_nightly.html | 三方库连接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/setup.py | soumith@pytorch.org | 作者邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/datasets/cifar.py | https://www.cs.toronto.edu/~kriz/cifar-10-python.tar.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/datasets/cifar.py | https://www.cs.toronto.edu/~kriz/cifar-100-python.tar.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/datasets/hmdb51.py | https://raw.githubusercontent.com/Liusifei/UVC/master/libs/data/palette.txt | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/datasets/hmdb51.py | http://serre-lab.clps.brown.edu/wp-content/uploads/2013/10/test_train_splits.rar | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/datasets/imagenet.py | http://www.image-net.org/challenges/LSVRC/2012/nnoupb/ILSVRC2012_devkit_t12.tar.gz | 下载数据集 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/datasets/imagenet.py | http://www.image-net.org/challenges/LSVRC/2012/nnoupb/ILSVRC2012_img_train.tar | 下载数据集 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/datasets/imagenet.py | http://www.image-net.org/challenges/LSVRC/2012/nnoupb/ILSVRC2012_img_val.tar | 下载数据集 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/datasets/mnist.py | http://yann.lecun.com/exdb/mnist/train-labels-idx1-ubyte.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/datasets/mnist.py | http://yann.lecun.com/exdb/mnist/train-images-idx3-ubyte.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/datasets/mnist.py | http://yann.lecun.com/exdb/mnist/t10k-labels-idx1-ubyte.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/datasets/mnist.py | http://yann.lecun.com/exdb/mnist/t10k-images-idx3-ubyte.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/datasets/mnist.py | https://raw.githubusercontent.com/facebookresearch/qmnist/master/qmnist-train-images-idx3-ubyte.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/datasets/mnist.py | https://raw.githubusercontent.com/facebookresearch/qmnist/master/qmnist-test-images-idx3-ubyte.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/datasets/mnist.py | http://fashion-mnist.s3-website.eu-central-1.amazonaws.com/train-labels-idx1-ubyte.gz | 
数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/datasets/mnist.py | http://fashion-mnist.s3-website.eu-central-1.amazonaws.com/train-images-idx3-ubyte.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/datasets/mnist.py | http://fashion-mnist.s3-website.eu-central-1.amazonaws.com/t10k-labels-idx1-ubyte.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/datasets/mnist.py | http://fashion-mnist.s3-website.eu-central-1.amazonaws.com/t10k-images-idx3-ubyte.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/datasets/mnist.py | http://codh.rois.ac.jp/kmnist/dataset/kmnist/train-labels-idx1-ubyte.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/datasets/mnist.py | http://codh.rois.ac.jp/kmnist/dataset/kmnist/train-images-idx3-ubyte.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/datasets/mnist.py | http://codh.rois.ac.jp/kmnist/dataset/kmnist/t10k-labels-idx1-ubyte.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/datasets/mnist.py | http://codh.rois.ac.jp/kmnist/dataset/kmnist/t10k-images-idx3-ubyte.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/datasets/mnist.py | https://raw.githubusercontent.com/facebookresearch/qmnist/master/qmnist-test-labels-idx2-int.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/datasets/mnist.py | https://raw.githubusercontent.com/facebookresearch/qmnist/master/qmnist-train-labels-idx2-int.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/datasets/mnist.py | https://cloudstor.aarnet.edu.au/plus/index.php/s/54h3OuGJhFLwAlQ/download | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/datasets/mnist.py | https://raw.githubusercontent.com/facebookresearch/qmnist/master/xnist-images-idx3-ubyte.xz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/datasets/mnist.py | https://raw.githubusercontent.com/facebookresearch/qmnist/master/xnist-labels-idx2-int.xz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/datasets/phototour.py | http://matthewalunbrown.com/patchdata/yosemite_harris.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/datasets/phototour.py | http://matthewalunbrown.com/patchdata/notredame_harris.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/datasets/phototour.py | http://matthewalunbrown.com/patchdata/liberty_harris.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/datasets/phototour.py | http://icvl.ee.ic.ac.uk/vbalnt/yosemite.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/datasets/phototour.py | http://icvl.ee.ic.ac.uk/vbalnt/notredame.zip | 数据集链接 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/datasets/phototour.py | http://icvl.ee.ic.ac.uk/vbalnt/liberty.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/datasets/sbd.py | http://home.bharathh.info/pubs/codes/SBD/train_noval.txt | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/datasets/sbd.py | http://www.eecs.berkeley.edu/Research/Projects/CS/vision/grouping/semantic_contours/benchmark.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/datasets/semeion.py | http://archive.ics.uci.edu/ml/machine-learning-databases/semeion/semeion.data | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/datasets/stl10.py | http://ai.stanford.edu/~acoates/stl10/stl10_binary.tar.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/datasets/svhn.py | http://ufldl.stanford.edu/housenumbers/train_32x32.mat | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/datasets/svhn.py | http://ufldl.stanford.edu/housenumbers/test_32x32.mat | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/datasets/svhn.py | http://ufldl.stanford.edu/housenumbers/extra_32x32.mat | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/datasets/voc.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2007/VOCtrainval_06-Nov-2007.tar | 下载数据集 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/datasets/voc.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2008/VOCtrainval_14-Jul-2008.tar | 下载数据集 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/datasets/voc.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2009/VOCtrainval_11-May-2009.tar | 下载数据集 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/datasets/voc.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2010/VOCtrainval_03-May-2010.tar | 下载数据集 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/datasets/voc.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2011/VOCtrainval_25-May-2011.tar | 下载数据集 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/datasets/voc.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2012/VOCtrainval_11-May-2012.tar | 下载数据集 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/models/densenet.py | https://download.pytorch.org/models/densenet201-c1103571.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/models/densenet.py | https://download.pytorch.org/models/densenet169-b2777c0a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/models/densenet.py | https://download.pytorch.org/models/densenet161-8d451a50.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/models/densenet.py | https://download.pytorch.org/models/densenet121-a639ec97.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/models/detection/faster_rcnn.py | https://download.pytorch.org/models/fasterrcnn_resnet50_fpn_coco-258fb6c6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/models/detection/keypoint_rcnn.py | https://download.pytorch.org/models/keypointrcnn_resnet50_fpn_coco-9f466800.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/models/detection/mask_rcnn.py | https://download.pytorch.org/models/maskrcnn_resnet50_fpn_coco-bf2d0c1e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/models/googlenet.py | https://download.pytorch.org/models/googlenet-1378be20.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/models/mnasnet.py | https://download.pytorch.org/models/mnasnet1.0_top1_73.512-f206786ef8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/models/mnasnet.py | https://download.pytorch.org/models/mnasnet0.5_top1_67.592-7c6cb539b9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/models/mobilenet.py | https://download.pytorch.org/models/mobilenet_v2-b0353104.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/models/segmentation/segmentation.py | https://download.pytorch.org/models/fcn_resnet101_coco-7ecb50ca.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/models/segmentation/segmentation.py | https://download.pytorch.org/models/deeplabv3_resnet101_coco-586e9e4e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/models/shufflenetv2.py | https://download.pytorch.org/models/shufflenetv2_x1-5666bf0f80.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/models/shufflenetv2.py | https://download.pytorch.org/models/shufflenetv2_x0.5-f707e7126e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/models/video/resnet.py | https://download.pytorch.org/models/r3d_18-b3b3357e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/models/video/resnet.py | https://download.pytorch.org/models/r2plus1d_18-91a641e6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/torchvision/models/video/resnet.py | https://download.pytorch.org/models/mc3_18-a90a0ba3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/url.ini | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/url.ini | https://download.pytorch.org/models/wide_resnet50_2-95faca4d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/url.ini | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/url.ini | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/url.ini | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/url.ini | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/url.ini | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/url.ini | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/url.ini | https://download.pytorch.org/models/vgg19-dcbb9e9d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/url.ini | https://download.pytorch.org/models/vgg19_bn-c79401a0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/url.ini | https://download.pytorch.org/models/vgg16-397923af.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/url.ini | https://download.pytorch.org/models/vgg16_bn-6c64b313.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/url.ini | https://download.pytorch.org/models/vgg13-c768596a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/url.ini | https://download.pytorch.org/models/vgg13_bn-abd245e5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/url.ini | https://download.pytorch.org/models/vgg11-bbd30ac9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/url.ini | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/url.ini | https://download.pytorch.org/models/vgg11_bn-6002323d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/url.ini | https://download.pytorch.org/whl/torch_stable.html | 三方库地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/url.ini | https://download.pytorch.org/whl/nightly/torch_nightly.html | 三方库地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/url.ini | https://download.pytorch.org/models/squeezenet1_1-f364aa15.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/url.ini | https://download.pytorch.org/models/squeezenet1_0-a815701f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/url.ini | https://www.7-zip.org/a/7z1805-x64.exe | 依赖地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/url.ini | http://www.cs.virginia.edu/~vicente/sbucaptions/SBUCaptionedPhotoDataset.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/url.ini | https://download.pytorch.org/whl/ | 三方库连接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/url.ini | https://www.dropbox.com/s/9mcolalfdj4n979/NvToolsExt.7z?dl=1 | 下载依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/url.ini | 
https://repo.continuum.io/miniconda/Miniconda3-latest-Windows-x86_64.exe | miniconda下载链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/url.ini | https://repo.anaconda.com/miniconda/Miniconda3-latest-MacOSX-x86_64.sh | miniconda下载链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/url.ini | https://dev.azure.com/pytorch | 下载依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/url.ini | https://download.pytorch.org/models/inception_v3_google-1a9a5a14.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/url.ini | https://download.pytorch.org/models/alexnet-owt-4df8aa71.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/url.ini | http://www.vision.caltech.edu/Image_Datasets/Caltech101/101_ObjectCategories.tar.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/url.ini | https://www.dropbox.com/s/e0prhgsrbyfi4ov/cudnn-9.1-windows7-x64-v7.zip?dl=1 | 下载依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/url.ini | https://www.dropbox.com/s/6p0xyqh472nu8m1/cudnn-9.0-windows7-x64-v7.zip?dl=1 | 下载依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/url.ini | http://www.vision.caltech.edu/Image_Datasets/Caltech256/256_ObjectCategories.tar | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/url.ini | http://www.vision.caltech.edu/Image_Datasets/Caltech256/256_ObjectCategories.tar | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/url.ini | https://download.pytorch.org/vision_tests/io/ | 下载配置 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/url.ini | https://www.dropbox.com/s/7a4sbq0dln6v7t2/cuda_9.1.85_windows.7z?dl=1 | cuda下载链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet169_ID0454_for_PyTorch/url.ini | https://www.dropbox.com/s/z5b7ryz0zrimntl/cuda_9.0.176_windows.7z?dl=1 | cuda下载链接 | \ No newline at end of file diff --git a/PyTorch/built-in/cv/classification/DenseNet201_ID0453_for_PyTorch/public_address_statement.md b/PyTorch/built-in/cv/classification/DenseNet201_ID0453_for_PyTorch/public_address_statement.md index 6a6abe14097ce1b0c501ecb7ab99e52c00e84502..e4d4168dabc6e37ca6a4f7f1d63111ee1ab34861 100644 --- a/PyTorch/built-in/cv/classification/DenseNet201_ID0453_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/cv/classification/DenseNet201_ID0453_for_PyTorch/public_address_statement.md @@ -1,12 +1,6 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|---------------------------------------|--------------------------------------------|--------------------------------------------------------------|---------| -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet201_ID0453_for_PyTorch/densenet.py | https://download.pytorch.org/models/densenet121-a639ec97.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet201_ID0453_for_PyTorch/densenet.py | https://download.pytorch.org/models/densenet169-b2777c0a.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/pytorch/vision.git | DenseNet201_ID0453_for_PyTorch/densenet.py | https://download.pytorch.org/models/densenet201-c1103571.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/pytorch/vision.git | 
DenseNet201_ID0453_for_PyTorch/densenet.py | https://download.pytorch.org/models/densenet161-8d451a50.pth | 下载预训练模型 | -| 开发引入 | / | DenseNet201_ID0453_for_PyTorch/train.py | https://www.github.com/nvidia/apex | 相关说明 | -| 开发引入 | / | DenseNet201_ID0453_for_PyTorch/modelarts/train_start.py | https://www.github.com/nvidia/apex | 相关说明 | -| 开发引入 | / | DenseNet201_ID0453_for_PyTorch/train.py | https://github.com/NVIDIA/apex/tree/master/examples/imagenet | 源码实现 | -| 开发引入 | / | DenseNet201_ID0453_for_PyTorch/modelarts/train_start.py | https://github.com/NVIDIA/apex/tree/master/examples/imagenet | 源码实现 | -| 开源代码引入 | https://github.com/pytorch/vision.git/torchvision/models/densenet.py | DenseNet201_ID0453_for_PyTorch/densenet.py | https://arxiv.org/pdf/1608.06993.pdf | 论文地址 | -| 开源代码引入 | https://github.com/pytorch/vision.git/torchvision/models/densenet.py | DenseNet201_ID0453_for_PyTorch/densenet.py | https://arxiv.org/pdf/1707.06990.pdf | 论文地址 | +| 文件位置 | 公网地址 | 公网地址用途 | +|------------------------------------------------------------------------------------------------|--------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet201_ID0453_for_PyTorch/densenet.py | https://download.pytorch.org/models/densenet201-c1103571.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet201_ID0453_for_PyTorch/densenet.py | https://download.pytorch.org/models/densenet169-b2777c0a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet201_ID0453_for_PyTorch/densenet.py | https://download.pytorch.org/models/densenet161-8d451a50.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/DenseNet201_ID0453_for_PyTorch/densenet.py | https://download.pytorch.org/models/densenet121-a639ec97.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/built-in/cv/classification/Densenet121_for_PyTorch/public_address_statement.md b/PyTorch/built-in/cv/classification/Densenet121_for_PyTorch/public_address_statement.md index ea6d0b5cea1b2a66912b12266e7c836596befeee..329341e22d53a7b6cc9ce6831428222bd6ebe574 100644 --- a/PyTorch/built-in/cv/classification/Densenet121_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/cv/classification/Densenet121_for_PyTorch/public_address_statement.md @@ -1,14 +1,9 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|------|--------|-------------------------------------------|--------------------------------------------------------------|---------| -| 开发引入 | / | Densenet121_for_PyTorch/densenet_0_2_2.py | https://download.pytorch.org/models/densenet121-a639ec97.pth | 下载预训练模型 | -| 开发引入 | / | Densenet121_for_PyTorch/densenet_0_2_2.py | https://download.pytorch.org/models/densenet169-b2777c0a.pth | 下载预训练模型 | -| 开发引入 | / | Densenet121_for_PyTorch/densenet_0_2_2.py | https://download.pytorch.org/models/densenet201-c1103571.pth | 下载预训练模型 | -| 开发引入 | / | Densenet121_for_PyTorch/densenet_0_2_2.py | https://download.pytorch.org/models/densenet161-8d451a50.pth | 下载预训练模型 | -| 开发引入 | / | Densenet121_for_PyTorch/densenet_0_5_0.py | https://download.pytorch.org/models/densenet121-a639ec97.pth | 下载预训练模型 | -| 开发引入 | / | Densenet121_for_PyTorch/densenet_0_5_0.py | https://download.pytorch.org/models/densenet169-b2777c0a.pth | 下载预训练模型 | -| 开发引入 | / | Densenet121_for_PyTorch/densenet_0_5_0.py | https://download.pytorch.org/models/densenet201-c1103571.pth | 下载预训练模型 | -| 开发引入 | / | Densenet121_for_PyTorch/densenet_0_5_0.py | 
https://download.pytorch.org/models/densenet161-8d451a50.pth | 下载预训练模型 | -| 开发引入 | / | Densenet121_for_PyTorch/sdk_infer/sdk_run_infer/models/densenet121/imagenet1000_clsidx_to_labels.names | https://gist.github.com/yrevar/942d3a0ac09ec9e5eb3a | 相关说明 | -| 开发引入 | / | Densenet121_for_PyTorch/densenet_0_5_0.py | https://arxiv.org/pdf/1608.06993.pdf | 论文地址 | -| 开发引入 | / | Densenet121_for_PyTorch/densenet_0_2_2.py | https://arxiv.org/pdf/1608.06993.pdf | 论文地址 | -| 开发引入 | / | Densenet121_for_PyTorch/densenet_0_5_0.py | https://arxiv.org/pdf/1707.06990.pdf | 论文地址 | +| 文件位置 | 公网地址 | 公网地址用途 | +|------------------------------------------------------------------------------------------------------|--------------------------------------------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Densenet121_for_PyTorch/densenet121_1p_main.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Densenet121_for_PyTorch/main.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Densenet121_for_PyTorch/modelarts/train_start.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Densenet121_for_PyTorch/url.ini | https://download.pytorch.org/models/densenet201-c1103571.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Densenet121_for_PyTorch/url.ini | https://download.pytorch.org/models/densenet169-b2777c0a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Densenet121_for_PyTorch/url.ini | https://download.pytorch.org/models/densenet161-8d451a50.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Densenet121_for_PyTorch/url.ini | https://download.pytorch.org/models/densenet121-a639ec97.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/built-in/cv/classification/EfficientNet-B1_ID1713_for_PyTorch/public_address_statement.md b/PyTorch/built-in/cv/classification/EfficientNet-B1_ID1713_for_PyTorch/public_address_statement.md index cc84ccddc6b8b8b0cd1728b1b3b2c0d8bb1e32c1..9ee286714c30a5bed2865103debf52558a3c2461 100644 --- a/PyTorch/built-in/cv/classification/EfficientNet-B1_ID1713_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/cv/classification/EfficientNet-B1_ID1713_for_PyTorch/public_address_statement.md @@ -1,33 +1,5 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|-------------------------------------------------------|------------------------------------------------------------------|----------------------------------------------------------------------------------------------------------|---------| -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B1_ID1713_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/efficientnet-b0-355c32eb.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B1_ID1713_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/efficientnet-b1-f1951068.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B1_ID1713_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/efficientnet-b2-8bb594d6.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | 
EfficientNet-B1_ID1713_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/efficientnet-b3-5fb5a3c3.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B1_ID1713_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/efficientnet-b4-6ed6700e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B1_ID1713_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/efficientnet-b5-b6417697.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B1_ID1713_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/efficientnet-b6-c76e70fd.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B1_ID1713_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/efficientnet-b7-dcc49843.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B1_ID1713_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b0-b64d5a18.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B1_ID1713_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b1-0f3ce85a.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B1_ID1713_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b2-6e9d97e5.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B1_ID1713_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b3-cdd7c0f4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B1_ID1713_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b4-44fb3a87.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B1_ID1713_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b5-86493f6b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B1_ID1713_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b6-ac80338e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B1_ID1713_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b7-4652b6dd.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B1_ID1713_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b8-22a8fe65.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B1_ID1713_for_PyTorch/setup.py | 
https://github.com/lukemelas/EfficientNet-PyTorch | 下载源码 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B1_ID1713_for_PyTorch/setup.py | lmelaskyriazi@college.harvard.edu | 邮箱 | -| 开发引入 | / | EfficientNet-B1_ID1713_for_PyTorch/efficientnet_pytorch/model.py | https://arxiv.org/abs/1801.04381 | 论文地址 | -| 开发引入 | / | EfficientNet-B1_ID1713_for_PyTorch/efficientnet_pytorch/rmsprop_tf.py | https://arxiv.org/pdf/1308.0850v5.pdf | 论文地址 | -| 开发引入 | / | EfficientNet-B1_ID1713_for_PyTorch/efficientnet_pytorch/rmsprop_tf.py | http://www.cs.toronto.edu/~tijmen/csc321/slides/lecture_slides_lec6.pdf | 论文地址 | -| 开发引入 | / | EfficientNet-B1_ID1713_for_PyTorch/efficientnet_pytorch/auto_augment.py | https://github.com/google-research/augmix/blob/master/imagenet.py | 源码实现 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git/setup.py | EfficientNet-B1_ID1713_for_PyTorch/setup.py | https://pypi.python.org/pypi?%3Aaction=list_classifiers | 相关依赖 | -| 开发引入 | / | EfficientNet-B1_ID1713_for_PyTorch/efficientnet_pytorch/model.py | https://arxiv.org/abs/1905.02244 | 论文地址 | -| 开发引入 | / | EfficientNet-B1_ID1713_for_PyTorch/efficientnet_pytorch/auto_augment.py | https://arxiv.org/abs/1805.09501 | 论文地址 | -| 开发引入 | / | EfficientNet-B1_ID1713_for_PyTorch/efficientnet_pytorch/rmsprop_tf.py | https://arxiv.org/abs/1711.05101 | 论文地址 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git/tf_to_pytorch/convert_tf_to_pt/original_tf/efficientnet_model.py | EfficientNet-B1_ID1713_for_PyTorch/efficientnet_pytorch/model.py | https://arxiv.org/abs/1905.11946 | 论文地址 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git/tf_to_pytorch/convert_tf_to_pt/original_tf/efficientnet_model.py | EfficientNet-B1_ID1713_for_PyTorch/hubconf.py | https://arxiv.org/abs/1905.11946 | 论文地址 | -| 开发引入 | / | EfficientNet-B1_ID1713_for_PyTorch/efficientnet_pytorch/model.py | https://arxiv.org/abs/1704.04861 | 论文地址 | -| 开发引入 | / | EfficientNet-B1_ID1713_for_PyTorch/efficientnet_pytorch/auto_augment.py | https://arxiv.org/abs/1912.02781 | 论文地址 | +| 文件位置 | 公网地址 | 公网地址用途 | +|------------------------------------------------------------------------------------------------------------------|-----------------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/EfficientNet-B1_ID1713_for_PyTorch/examples/imagenet/main.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/EfficientNet-B1_ID1713_for_PyTorch/modelarts/train_start.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/EfficientNet-B1_ID1713_for_PyTorch/setup.py | lmelaskyriazi@college.harvard.edu | 作者邮箱 | \ No newline at end of file diff --git a/PyTorch/built-in/cv/classification/EfficientNet-B2_ID1714_for_PyTorch/public_address_statement.md b/PyTorch/built-in/cv/classification/EfficientNet-B2_ID1714_for_PyTorch/public_address_statement.md index 6405f9874796fa674de991e311e195bf865d7a2d..f25d000a772f1c0f6906f4565950a108f18fd799 100644 --- a/PyTorch/built-in/cv/classification/EfficientNet-B2_ID1714_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/cv/classification/EfficientNet-B2_ID1714_for_PyTorch/public_address_statement.md @@ -1,33 +1,5 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | 
-|--------|-------------------------------------------------------|------------------------------------------------------------------|----------------------------------------------------------------------------------------------------------|---------| -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B2_ID1714_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/efficientnet-b0-355c32eb.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B2_ID1714_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/efficientnet-b1-f1951068.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B2_ID1714_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/efficientnet-b2-8bb594d6.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B2_ID1714_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/efficientnet-b3-5fb5a3c3.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B2_ID1714_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/efficientnet-b4-6ed6700e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B2_ID1714_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/efficientnet-b5-b6417697.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B2_ID1714_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/efficientnet-b6-c76e70fd.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B2_ID1714_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/efficientnet-b7-dcc49843.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B2_ID1714_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b0-b64d5a18.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B2_ID1714_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b1-0f3ce85a.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B2_ID1714_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b2-6e9d97e5.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B2_ID1714_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b3-cdd7c0f4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B2_ID1714_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b4-44fb3a87.pth | 下载预训练模型 | -| 开源代码引入 | 
https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B2_ID1714_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b5-86493f6b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B2_ID1714_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b6-ac80338e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B2_ID1714_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b7-4652b6dd.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B2_ID1714_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b8-22a8fe65.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B2_ID1714_for_PyTorch/setup.py | https://github.com/lukemelas/EfficientNet-PyTorch | 下载源码 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B2_ID1714_for_PyTorch/setup.py | lmelaskyriazi@college.harvard.edu | 邮箱 | -| 开发引入 | / | EfficientNet-B2_ID1714_for_PyTorch/efficientnet_pytorch/rmsprop_tf.py | http://www.cs.toronto.edu/~tijmen/csc321/slides/lecture_slides_lec6.pdf | 论文地址 | -| 开发引入 | / | EfficientNet-B2_ID1714_for_PyTorch/efficientnet_pytorch/model.py | https://arxiv.org/abs/1801.04381 | 论文地址 | -| 开发引入 | / | EfficientNet-B2_ID1714_for_PyTorch/efficientnet_pytorch/rmsprop_tf.py | https://arxiv.org/abs/1711.05101 | 论文地址 | -| 开发引入 | / | EfficientNet-B2_ID1714_for_PyTorch/efficientnet_pytorch/auto_augment.py | https://arxiv.org/abs/1912.02781 | 论文地址 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git/tf_to_pytorch/convert_tf_to_pt/original_tf/efficientnet_model.py | EfficientNet-B2_ID1714_for_PyTorch/efficientnet_pytorch/model.py | https://arxiv.org/abs/1905.11946 | 论文地址 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git/tf_to_pytorch/convert_tf_to_pt/original_tf/efficientnet_model.py | EfficientNet-B2_ID1714_for_PyTorch/hubconf.py | https://arxiv.org/abs/1905.11946 | 论文地址 | -| 开发引入 | / | EfficientNet-B2_ID1714_for_PyTorch/efficientnet_pytorch/auto_augment.py | https://arxiv.org/abs/1805.09501 | 论文地址 | -| 开发引入 | / | EfficientNet-B2_ID1714_for_PyTorch/efficientnet_pytorch/model.py | https://arxiv.org/abs/1905.02244 | 论文地址 | -| 开发引入 | / | EfficientNet-B2_ID1714_for_PyTorch/efficientnet_pytorch/auto_augment.py | https://github.com/google-research/augmix/blob/master/imagenet.py | 源码实现 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git/setup.py | EfficientNet-B2_ID1714_for_PyTorch/setup.py | https://pypi.python.org/pypi?%3Aaction=list_classifiers | 相关依赖 | -| 开发引入 | / | EfficientNet-B2_ID1714_for_PyTorch/efficientnet_pytorch/model.py | https://arxiv.org/abs/1704.04861 | 论文地址 | -| 开发引入 | / | EfficientNet-B2_ID1714_for_PyTorch/efficientnet_pytorch/rmsprop_tf.py | https://arxiv.org/pdf/1308.0850v5.pdf | 论文地址 | +| 文件位置 | 公网地址 | 公网地址用途 | +|------------------------------------------------------------------------------------------------------------------|-----------------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/EfficientNet-B2_ID1714_for_PyTorch/examples/imagenet/main.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/classification/EfficientNet-B2_ID1714_for_PyTorch/modelarts/train_start.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/EfficientNet-B2_ID1714_for_PyTorch/setup.py | lmelaskyriazi@college.harvard.edu | 作者邮箱 | \ No newline at end of file diff --git a/PyTorch/built-in/cv/classification/EfficientNet-B3_ID0450_for_PyTorch/public_address_statement.md b/PyTorch/built-in/cv/classification/EfficientNet-B3_ID0450_for_PyTorch/public_address_statement.md index 8734d4452eae7f031e8253b5337b5093f9eb5cee..2ad74395328d4edb4d78003fe64f50a8d23990db 100644 --- a/PyTorch/built-in/cv/classification/EfficientNet-B3_ID0450_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/cv/classification/EfficientNet-B3_ID0450_for_PyTorch/public_address_statement.md @@ -1,33 +1,5 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|-------------------------------------------------------|------------------------------------------------------------------|----------------------------------------------------------------------------------------------------------|---------| -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B3_ID0450_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/efficientnet-b0-355c32eb.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B3_ID0450_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/efficientnet-b1-f1951068.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B3_ID0450_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/efficientnet-b2-8bb594d6.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B3_ID0450_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/efficientnet-b3-5fb5a3c3.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B3_ID0450_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/efficientnet-b4-6ed6700e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B3_ID0450_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/efficientnet-b5-b6417697.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B3_ID0450_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/efficientnet-b6-c76e70fd.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B3_ID0450_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/efficientnet-b7-dcc49843.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B3_ID0450_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b0-b64d5a18.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B3_ID0450_for_PyTorch/efficientnet_pytorch/utils.py | 
https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b1-0f3ce85a.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B3_ID0450_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b2-6e9d97e5.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B3_ID0450_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b3-cdd7c0f4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B3_ID0450_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b4-44fb3a87.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B3_ID0450_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b5-86493f6b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B3_ID0450_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b6-ac80338e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B3_ID0450_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b7-4652b6dd.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B3_ID0450_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b8-22a8fe65.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B3_ID0450_for_PyTorch/setup.py | https://github.com/lukemelas/EfficientNet-PyTorch | 下载源码 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B3_ID0450_for_PyTorch/setup.py | lmelaskyriazi@college.harvard.edu | 邮箱 | -| 开发引入 | / | EfficientNet-B3_ID0450_for_PyTorch/efficientnet_pytorch/model.py | https://arxiv.org/abs/1801.04381 | 论文地址 | -| 开发引入 | / | EfficientNet-B3_ID0450_for_PyTorch/efficientnet_pytorch/rmsprop_tf.py | https://arxiv.org/abs/1711.05101 | 论文地址 | -| 开发引入 | / | EfficientNet-B3_ID0450_for_PyTorch/efficientnet_pytorch/model.py | https://arxiv.org/abs/1704.04861 | 论文地址 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git/tf_to_pytorch/convert_tf_to_pt/original_tf/efficientnet_model.py | EfficientNet-B3_ID0450_for_PyTorch/efficientnet_pytorch/model.py | https://arxiv.org/abs/1905.11946 | 论文地址 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git/tf_to_pytorch/convert_tf_to_pt/original_tf/efficientnet_model.py | EfficientNet-B3_ID0450_for_PyTorch/hubconf.py | https://arxiv.org/abs/1905.11946 | 论文地址 | -| 开发引入 | / | EfficientNet-B3_ID0450_for_PyTorch/efficientnet_pytorch/model.py | https://arxiv.org/abs/1905.02244 | 论文地址 | -| 开发引入 | / | EfficientNet-B3_ID0450_for_PyTorch/efficientnet_pytorch/auto_augment.py | https://github.com/google-research/augmix/blob/master/imagenet.py | 源码实现 | -| 开发引入 | / | EfficientNet-B3_ID0450_for_PyTorch/efficientnet_pytorch/auto_augment.py | https://arxiv.org/abs/1912.02781 | 论文地址 | -| 开发引入 | / | EfficientNet-B3_ID0450_for_PyTorch/efficientnet_pytorch/rmsprop_tf.py | 
https://arxiv.org/pdf/1308.0850v5.pdf | 论文地址 | -| 开发引入 | / | EfficientNet-B3_ID0450_for_PyTorch/efficientnet_pytorch/rmsprop_tf.py | http://www.cs.toronto.edu/~tijmen/csc321/slides/lecture_slides_lec6.pdf | 论文地址 | -| 开发引入 | / | EfficientNet-B3_ID0450_for_PyTorch/efficientnet_pytorch/auto_augment.py | https://arxiv.org/abs/1805.09501 | 论文地址 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git/setup.py | EfficientNet-B3_ID0450_for_PyTorch/setup.py | https://pypi.python.org/pypi?%3Aaction=list_classifiers | 相关依赖 | +| 文件位置 | 公网地址 | 公网地址用途 | +|------------------------------------------------------------------------------------------------------------------|-----------------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/EfficientNet-B3_ID0450_for_PyTorch/examples/imagenet/main.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/EfficientNet-B3_ID0450_for_PyTorch/modelarts/train_start.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/EfficientNet-B3_ID0450_for_PyTorch/setup.py | lmelaskyriazi@college.harvard.edu | 作者邮箱 | \ No newline at end of file diff --git a/PyTorch/built-in/cv/classification/EfficientNet-B4_ID1632_for_PyTorch/public_address_statement.md b/PyTorch/built-in/cv/classification/EfficientNet-B4_ID1632_for_PyTorch/public_address_statement.md index b5c9c3f416ab3d2003091d94a34a7222754ee3d2..788e8aa7cc33d39d511022df9ec683490d5286a6 100644 --- a/PyTorch/built-in/cv/classification/EfficientNet-B4_ID1632_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/cv/classification/EfficientNet-B4_ID1632_for_PyTorch/public_address_statement.md @@ -1,33 +1,5 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|-------------------------------------------------------|------------------------------------------------------------------|----------------------------------------------------------------------------------------------------------|---------| -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B4_ID1632_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/efficientnet-b0-355c32eb.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B4_ID1632_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/efficientnet-b1-f1951068.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B4_ID1632_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/efficientnet-b2-8bb594d6.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B4_ID1632_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/efficientnet-b3-5fb5a3c3.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B4_ID1632_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/efficientnet-b4-6ed6700e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B4_ID1632_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/efficientnet-b5-b6417697.pth | 下载预训练模型 | -| 开源代码引入 | 
https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B4_ID1632_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/efficientnet-b6-c76e70fd.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B4_ID1632_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/efficientnet-b7-dcc49843.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B4_ID1632_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b0-b64d5a18.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B4_ID1632_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b1-0f3ce85a.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B4_ID1632_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b2-6e9d97e5.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B4_ID1632_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b3-cdd7c0f4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B4_ID1632_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b4-44fb3a87.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B4_ID1632_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b5-86493f6b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B4_ID1632_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b6-ac80338e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B4_ID1632_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b7-4652b6dd.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B4_ID1632_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b8-22a8fe65.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B4_ID1632_for_PyTorch/setup.py | https://github.com/lukemelas/EfficientNet-PyTorch | 下载源码 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B4_ID1632_for_PyTorch/setup.py | lmelaskyriazi@college.harvard.edu | 邮箱 | -| 开发引入 | / | EfficientNet-B4_ID1632_for_PyTorch/efficientnet_pytorch/model.py | https://arxiv.org/abs/1801.04381 | 论文地址 | -| 开发引入 | / | EfficientNet-B4_ID1632_for_PyTorch/efficientnet_pytorch/auto_augment.py | https://arxiv.org/abs/1805.09501 | 论文地址 | -| 开发引入 | / | EfficientNet-B4_ID1632_for_PyTorch/efficientnet_pytorch/model.py | https://arxiv.org/abs/1704.04861 | 论文地址 | -| 开源代码引入 | 
https://github.com/lukemelas/EfficientNet-PyTorch.git/tf_to_pytorch/convert_tf_to_pt/original_tf/efficientnet_model.py | EfficientNet-B4_ID1632_for_PyTorch/efficientnet_pytorch/model.py | https://arxiv.org/abs/1905.11946 | 论文地址 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git/setup.py | EfficientNet-B4_ID1632_for_PyTorch/setup.py | https://pypi.python.org/pypi?%3Aaction=list_classifiers | 相关依赖 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git/tf_to_pytorch/convert_tf_to_pt/original_tf/efficientnet_model.py | EfficientNet-B4_ID1632_for_PyTorch/hubconf.py | https://arxiv.org/abs/1905.11946 | 论文地址 | -| 开发引入 | / | EfficientNet-B4_ID1632_for_PyTorch/efficientnet_pytorch/rmsprop_tf.py | https://arxiv.org/pdf/1308.0850v5.pdf | 论文地址 | -| 开发引入 | / | EfficientNet-B4_ID1632_for_PyTorch/efficientnet_pytorch/rmsprop_tf.py | http://www.cs.toronto.edu/~tijmen/csc321/slides/lecture_slides_lec6.pdf | 论文地址 | -| 开发引入 | / | EfficientNet-B4_ID1632_for_PyTorch/efficientnet_pytorch/auto_augment.py | https://arxiv.org/abs/1912.02781 | 论文地址 | -| 开发引入 | / | EfficientNet-B4_ID1632_for_PyTorch/efficientnet_pytorch/auto_augment.py | https://github.com/google-research/augmix/blob/master/imagenet.py | 源码实现 | -| 开发引入 | / | EfficientNet-B4_ID1632_for_PyTorch/efficientnet_pytorch/model.py | https://arxiv.org/abs/1905.02244 | 论文地址 | -| 开发引入 | / | EfficientNet-B4_ID1632_for_PyTorch/efficientnet_pytorch/rmsprop_tf.py | https://arxiv.org/abs/1711.05101 | 论文地址 | +| 文件位置 | 公网地址 | 公网地址用途 | +|------------------------------------------------------------------------------------------------------------------|-----------------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/EfficientNet-B4_ID1632_for_PyTorch/examples/imagenet/main.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/EfficientNet-B4_ID1632_for_PyTorch/modelarts/train_start.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/EfficientNet-B4_ID1632_for_PyTorch/setup.py | lmelaskyriazi@college.harvard.edu | 作者邮箱 | \ No newline at end of file diff --git a/PyTorch/built-in/cv/classification/EfficientNet-B5_ID1633_for_PyTorch/public_address_statement.md b/PyTorch/built-in/cv/classification/EfficientNet-B5_ID1633_for_PyTorch/public_address_statement.md index 9fb8015503bc9a55da5b11a43ee31321429bf7bc..c4c75afd5e2d28a667ed82fde46496d4392d2a4b 100644 --- a/PyTorch/built-in/cv/classification/EfficientNet-B5_ID1633_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/cv/classification/EfficientNet-B5_ID1633_for_PyTorch/public_address_statement.md @@ -1,33 +1,5 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|----------------------------------------------------------|------------------------------------------------------------------|----------------------------------------------------------------------------------------------------------|---------| -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B5_ID1633_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/efficientnet-b0-355c32eb.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B5_ID1633_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/efficientnet-b1-f1951068.pth | 下载预训练模型 | -| 开源代码引入 | 
https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B5_ID1633_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/efficientnet-b2-8bb594d6.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B5_ID1633_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/efficientnet-b3-5fb5a3c3.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B5_ID1633_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/efficientnet-b4-6ed6700e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B5_ID1633_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/efficientnet-b5-b6417697.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B5_ID1633_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/efficientnet-b6-c76e70fd.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B5_ID1633_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/efficientnet-b7-dcc49843.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B5_ID1633_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b0-b64d5a18.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B5_ID1633_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b1-0f3ce85a.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B5_ID1633_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b2-6e9d97e5.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B5_ID1633_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b3-cdd7c0f4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B5_ID1633_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b4-44fb3a87.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B5_ID1633_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b5-86493f6b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B5_ID1633_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b6-ac80338e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B5_ID1633_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b7-4652b6dd.pth | 下载预训练模型 | -| 开源代码引入 | 
https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B5_ID1633_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b8-22a8fe65.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B5_ID1633_for_PyTorch/setup.py | https://github.com/lukemelas/EfficientNet-PyTorch | 下载源码 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet-B5_ID1633_for_PyTorch/setup.py | lmelaskyriazi@college.harvard.edu | 邮箱 | -| 开发引入 | / | EfficientNet-B5_ID1633_for_PyTorch/efficientnet_pytorch/rmsprop_tf.py | https://arxiv.org/abs/1711.05101 | 论文地址 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git/setup.py | EfficientNet-B5_ID1633_for_PyTorch/setup.py | https://pypi.python.org/pypi?%3Aaction=list_classifiers | 相关依赖 | -| 开发引入 | / | EfficientNet-B5_ID1633_for_PyTorch/efficientnet_pytorch/model.py | https://arxiv.org/abs/1801.04381 | 论文地址 | -| 开发引入 | / | EfficientNet-B5_ID1633_for_PyTorch/efficientnet_pytorch/model.py | https://arxiv.org/abs/1704.04861 | 论文地址 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git/tf_to_pytorch/convert_tf_to_pt/original_tf/efficientnet_model.py | EfficientNet-B5_ID1633_for_PyTorch/efficientnet_pytorch/model.py | https://arxiv.org/abs/1905.11946 | 论文地址 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git/tf_to_pytorch/convert_tf_to_pt/original_tf/efficientnet_model.py | EfficientNet-B5_ID1633_for_PyTorch/hubconf.py | https://arxiv.org/abs/1905.11946 | 论文地址 | -| 开发引入 | / | EfficientNet-B5_ID1633_for_PyTorch/efficientnet_pytorch/model.py | https://arxiv.org/abs/1905.02244 | 论文地址 | -| 开发引入 | / | EfficientNet-B5_ID1633_for_PyTorch/efficientnet_pytorch/rmsprop_tf.py | https://arxiv.org/pdf/1308.0850v5.pdf | 论文地址 | -| 开发引入 | / | EfficientNet-B5_ID1633_for_PyTorch/efficientnet_pytorch/rmsprop_tf.py | http://www.cs.toronto.edu/~tijmen/csc321/slides/lecture_slides_lec6.pdf | 论文地址 | -| 开发引入 | / | EfficientNet-B5_ID1633_for_PyTorch/efficientnet_pytorch/auto_augment.py | https://github.com/google-research/augmix/blob/master/imagenet.py | 源码实现 | -| 开发引入 | / | EfficientNet-B5_ID1633_for_PyTorch/efficientnet_pytorch/auto_augment.py | https://arxiv.org/abs/1912.02781 | 论文地址 | -| 开发引入 | / | EfficientNet-B5_ID1633_for_PyTorch/efficientnet_pytorch/auto_augment.py | https://arxiv.org/abs/1805.09501 | 论文地址 | +| 文件位置 | 公网地址 | 公网地址用途 | +|------------------------------------------------------------------------------------------------------------------|-----------------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/EfficientNet-B5_ID1633_for_PyTorch/examples/imagenet/main.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/EfficientNet-B5_ID1633_for_PyTorch/modelarts/train_start.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/EfficientNet-B5_ID1633_for_PyTorch/setup.py | lmelaskyriazi@college.harvard.edu | 作者邮箱 | \ No newline at end of file diff --git a/PyTorch/built-in/cv/classification/EfficientNet_for_PyTorch/public_address_statement.md b/PyTorch/built-in/cv/classification/EfficientNet_for_PyTorch/public_address_statement.md index 8250fcfe47943280b57b9b7efc615111a17625d2..c01e590e5efd1e8b216da107fcf941ff0f70aa3e 100644 --- a/PyTorch/built-in/cv/classification/EfficientNet_for_PyTorch/public_address_statement.md +++ 
b/PyTorch/built-in/cv/classification/EfficientNet_for_PyTorch/public_address_statement.md @@ -1,34 +1,5 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|-------------------------------------------------------|--------------------------------------------------------|----------------------------------------------------------------------------------------------------------|---------| -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/efficientnet-b0-355c32eb.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/efficientnet-b1-f1951068.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/efficientnet-b2-8bb594d6.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/efficientnet-b3-5fb5a3c3.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/efficientnet-b4-6ed6700e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/efficientnet-b5-b6417697.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/efficientnet-b6-c76e70fd.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/efficientnet-b7-dcc49843.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b0-b64d5a18.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b1-0f3ce85a.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b2-6e9d97e5.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b3-cdd7c0f4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b4-44fb3a87.pth | 下载预训练模型 | -| 开源代码引入 | 
https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b5-86493f6b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b6-ac80338e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b7-4652b6dd.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b8-22a8fe65.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet_for_PyTorch/setup.py | https://github.com/lukemelas/EfficientNet-PyTorch | 下载源码 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet_for_PyTorch/setup.py | lmelaskyriazi@college.harvard.edu | 邮箱 | -| 开发引入 | / | EfficientNet_for_PyTorch/efficientnet_pytorch/auto_augment.py | https://arxiv.org/abs/1912.02781 | 论文地址 | -| 开发引入 | / | EfficientNet_for_PyTorch/efficientnet_pytorch/auto_augment.py | https://arxiv.org/abs/1805.09501 | 论文地址 | -| 开发引入 | / | EfficientNet_for_PyTorch/efficientnet_pytorch/rmsprop_tf.py | https://arxiv.org/abs/1711.05101 | 论文地址 | -| 开发引入 | / | EfficientNet_for_PyTorch/efficientnet_pytorch/model.py | https://arxiv.org/abs/1905.02244 | 论文地址 | -| 开发引入 | / | EfficientNet_for_PyTorch/efficientnet_pytorch/auto_augment.py | https://github.com/google-research/augmix/blob/master/imagenet.py | 源码实现 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git/setup.py | EfficientNet_for_PyTorch/setup.py | https://pypi.python.org/pypi?%3Aaction=list_classifiers | 相关依赖 | -| 开发引入 | / | EfficientNet_for_PyTorch/efficientnet_pytorch/model.py | https://arxiv.org/abs/1704.04861 | 论文地址 | -| 开发引入 | / | EfficientNet_for_PyTorch/efficientnet_pytorch/rmsprop_tf.py | https://arxiv.org/pdf/1308.0850v5.pdf | 论文地址 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git/tf_to_pytorch/convert_tf_to_pt/original_tf/efficientnet_model.py | EfficientNet_for_PyTorch/efficientnet_pytorch/model.py | https://arxiv.org/abs/1905.11946 | 论文地址 | -| 开发引入 | / | EfficientNet_for_PyTorch/efficientnet_pytorch/rmsprop_tf.py | http://www.cs.toronto.edu/~tijmen/csc321/slides/lecture_slides_lec6.pdf | 论文地址 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git/tf_to_pytorch/convert_tf_to_pt/original_tf/efficientnet_model.py | EfficientNet_for_PyTorch/hubconf.py | https://arxiv.org/abs/1905.11946 | 论文地址 | -| 开发引入 | / | EfficientNet_for_PyTorch/infer/sdk/models/efficientnet/imagenet1000_clsidx_to_labels.names | https://gist.github.com/yrevar/942d3a0ac09ec9e5eb3a | 源码实现 | -| 开发引入 | / | EfficientNet_for_PyTorch/efficientnet_pytorch/model.py | https://arxiv.org/abs/1801.04381 | 论文地址 | +| 文件位置 | 公网地址 | 公网地址用途 | +|--------------------------------------------------------------------------------------------------------|-----------------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/EfficientNet_for_PyTorch/examples/imagenet/main.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/classification/EfficientNet_for_PyTorch/modelarts/train_start.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/EfficientNet_for_PyTorch/setup.py | lmelaskyriazi@college.harvard.edu | 作者邮箱 | \ No newline at end of file diff --git a/PyTorch/built-in/cv/classification/FaceNet_for_PyTorch/public_address_statement.md b/PyTorch/built-in/cv/classification/FaceNet_for_PyTorch/public_address_statement.md index 91ef9b5bb21516c7b127eb48ded5248e3f691d11..bc3c0917c7347c82b6d90e677ca3ec65e9ba39f9 100644 --- a/PyTorch/built-in/cv/classification/FaceNet_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/cv/classification/FaceNet_for_PyTorch/public_address_statement.md @@ -1,10 +1,5 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|-------------------------------------------------|---------------------------------------------------|-------------------------------------------------------------------------------------------------------|---------| -| 开源代码引入 | https://github.com/timesler/facenet-pytorch.git | FaceNet_for_PyTorch/.github/FUNDING.yml | https://xscode.com/timesler/facenet-pytorch | 下载源码 | -| 开源代码引入 | https://github.com/timesler/facenet-pytorch.git | FaceNet_for_PyTorch/.gitmodules | https://github.com/davidsandberg/facenet.git | 下载源码 | -| 开源代码引入 | https://github.com/timesler/facenet-pytorch.git | FaceNet_for_PyTorch/models/inception_resnet_v1.py | https://github.com/timesler/facenet-pytorch/releases/download/v2.2.9/20180402-114759-vggface2.pt | 下载预训练模型 | -| 开源代码引入 | https://github.com/timesler/facenet-pytorch.git | FaceNet_for_PyTorch/models/inception_resnet_v1.py | https://github.com/timesler/facenet-pytorch/releases/download/v2.2.9/20180408-102900-casia-webface.pt | 下载预训练模型 | -| 开源代码引入 | https://github.com/timesler/facenet-pytorch.git | FaceNet_for_PyTorch/setup.py | tim.esler@gmail.com | 邮箱 | -| 开源代码引入 | https://github.com/timesler/facenet-pytorch.git | FaceNet_for_PyTorch/setup.py | https://github.com/timesler/facenet-pytorch | 下载源码 | -| 开发引入 | / | FaceNet_for_PyTorch/constant.py | 127.0.0.1 | 本机IP地址 | -| 开发引入 | / | FaceNet_for_PyTorch/models/utils/download.py | https://s3.amazonaws.com/pytorch/models/resnet18-5c106cde.pth | 预训练模型 | +| 文件位置 | 公网地址 | 公网地址用途 | +|---------------------------------------------------------------------------------------------|---------------------------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/FaceNet_for_PyTorch/.github/FUNDING.yml | https://xscode.com/timesler/facenet-pytorch | 源码地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/FaceNet_for_PyTorch/fine_tune_new_8p.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/FaceNet_for_PyTorch/setup.py | tim.esler@gmail.com | 作者邮箱 | \ No newline at end of file diff --git a/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/public_address_statement.md b/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/public_address_statement.md index 09e533ca98c828f1b5a751621afcea53bf900976..aef122d36383cd415f2e7cde01da7189ffd8a3b0 100644 --- a/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/public_address_statement.md @@ -1,751 +1,153 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | 
-|--------|---------------------------------------------------------|-------------------------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------------------|--------| -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/mkdocs.yml | https://github.com/rwightman/pytorch-image-models | 开源代码仓 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/mkdocs.yml | https://cdnjs.cloudflare.com/ajax/libs/mathjax/2.7.0/MathJax.js?config=TeX-MML-AM_CHTML | 开源引用声明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/mkdocs.yml | https://cdnjs.cloudflare.com/ajax/libs/tablesort/5.2.1/tablesort.min.js | 开源引用声明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/model-index.yml | https://rwightman.github.io/pytorch-image-models/ | 开源代码仓 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/model-index.yml | https://github.com/rwightman/pytorch-image-models | 开源代码仓 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/setup.py | https://rwightman.github.io/pytorch-image-models/ | 开源代码仓 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/setup.py | hello@rwightman.com | 邮箱 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/sotabench.py | https://github.com/mehtadushy/SelecSLS-Pytorch | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/sotabench.py | https://github.com/mehtadushy/SelecSLS-Pytorch | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/sotabench.py | https://github.com/mehtadushy/SelecSLS-Pytorch | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/sotabench.py | https://github.com/zhanghang1989/ResNeSt | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/sotabench_setup.sh | https://github.com/mrT23/pillow-simd/zipball/simd/7.0.x | 下载第三方库 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/sotabench_setup.sh | https://onedrive.hyper.ai/down/ImageNet/data/ImageNet2012/ILSVRC2012_img_val.tar | 下载数据集 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-ger-weights/gernet_s-756b4751.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-ger-weights/gernet_m-0873c53a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-ger-weights/gernet_l-f31e2e8d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/byobnet.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_a2-c1ee6d2b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b0-80ac3f1b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b1-77ca2989.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b1g4-abde5d92.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b2-25b7494e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b2g4-165a85f2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b3-199bc50d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b3g4-73c370bf.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet51q_ra2-d47dcc76.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XXS24_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XXS24_384.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XXS36_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XXS36_384.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XS24_384.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/S24_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/S24_384.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/cait.py | 
https://dl.fbaipublicfiles.com/deit/S36_384.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/M36_384.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/M48_448.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/coat.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-coat-weights/coat_tiny-473c2a20.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/coat.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-coat-weights/coat_mini-2c6baf49.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/coat.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-coat-weights/coat_lite_tiny-461b07a7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/coat.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-coat-weights/coat_lite_mini-d7842000.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/coat.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-coat-weights/coat_lite_small-fea1d5a1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/convit.py | https://dl.fbaipublicfiles.com/convit/convit_tiny.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/convit.py | https://dl.fbaipublicfiles.com/convit/convit_small.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/convit.py | https://dl.fbaipublicfiles.com/convit/convit_base.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/cspnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/cspresnet50_ra-d3e8d487.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/cspnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/cspresnext50_ra_224-648b4713.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/cspnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/cspdarknet53_ra_256-d05c7c21.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/densenet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/densenet121_ra-50efcf5c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/densenet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/densenetblur121d_ra-100dcfbc.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | 
Gluon_ResNet50_v1b_for_PyTorch/timm/models/densenet.py | https://download.pytorch.org/models/densenet169-b2777c0a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/densenet.py | https://download.pytorch.org/models/densenet201-c1103571.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/densenet.py | https://download.pytorch.org/models/densenet161-8d451a50.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/densenet.py | https://download.pytorch.org/models/densenet121-a639ec97.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla34-ba72cf86.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla46_c-2bfd52c3.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla46x_c-d761bae7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60x_c-b870c45c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60-24839fc4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60x-d15cacda.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102-d94d9790.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102x-ad62be81.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102x2-262837b6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla169-0914e092.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/dla.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net_dla60_4s-d88db7f9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/dla.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2next_dla60_4s-d327927b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/dpn.py | https://github.com/rwightman/pytorch-dpn-pretrained/releases/download/v0.1/dpn68-66bebafa7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/dpn.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/dpn68b_ra-a31ca160.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/dpn.py | https://github.com/rwightman/pytorch-dpn-pretrained/releases/download/v0.1/dpn92_extra-b040e4a9b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/dpn.py | https://github.com/rwightman/pytorch-dpn-pretrained/releases/download/v0.1/dpn98-5b90dec4d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/dpn.py | https://github.com/rwightman/pytorch-dpn-pretrained/releases/download/v0.1/dpn131-71dfe43e0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/dpn.py | https://github.com/rwightman/pytorch-dpn-pretrained/releases/download/v0.1/dpn107_extra-1ac7121e2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mnasnet_b1-74cb7081.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mnasnet_a1-d9418771.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv2_100_ra-b33bc2c4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv2_110d_ra-77090ade.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv2_120d_ra-5987e2ed.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv2_140_ra-21a4e913.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/fbnetc_100-c345b898.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/spnasnet_100-048bc3f4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b0_ra-3dd342df.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b1-533bc792.pth | 
下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b2_ra-bcdf34b7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b3_ra2-cf984f9c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b4_ra2_320-7eb33cd5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_es_ra-f111e99c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_em_ra2-66250f76.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/DeGirum/pruned-models/releases/download/efficientnet_v1.0/efficientnet_el.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/DeGirum/pruned-models/releases/download/efficientnet_v1.0/efficientnet_es_pruned75.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/DeGirum/pruned-models/releases/download/efficientnet_v1.0/efficientnet_el_pruned70.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_lite0_ra-37913777.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb1_pruned_9ebb3fe6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb2_pruned_203f55bc.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb3_pruned_5abcc29f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_v2s_ra2_288-a6477665.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnetv2_rw_m_agc-3d90cb1e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b0_aa-827b6e33.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b1_aa-ea7a6ee0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b2_aa-60c94f97.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b3_aa-84b4657e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b4_aa-818f208c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b5_ra-9a3e5369.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b6_aa-80ba17e4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b7_ra-6c08e654.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b8_ra-572d5dd9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b0_ap-f262efe1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b1_ap-44ef0a3d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b2_ap-2f8e7636.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b3_ap-aad25bdd.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | 
Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b4_ap-dedb23e6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b5_ap-9e82fae8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b6_ap-4ffb161f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b7_ap-ddb28fec.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b8_ap-00e169fa.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b0_ns-c0e6a31c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b1_ns-99dd0c41.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b2_ns-00306e48.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b3_ns-9d44bf68.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b4_ns-d6313a46.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b5_ns-6f26d0cf.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b6_ns-51548356.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b7_ns-1dbc32de.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_l2_ns_475-bebbd00a.pth | 下载权重文件 | -| 
开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_l2_ns-df73bb44.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_es-ca1afbfe.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_em-e78cfe58.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_el-5143854e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_cc_b0_4e-4362b6b2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_cc_b0_8e-66184a25.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_cc_b1_8e-f7c79ae1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite0-0aa007d2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite1-bde8b488.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite2-dcccb7df.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite3-b733e338.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite4-741542c3.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_s-eb54923e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_m-cc09e0cd.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_l-d664b728.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_s_21ft1k-d7dafa41.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_m_21ft1k-bf41664a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_l_21ft1k-60127a9d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_s_21k-6337ad01.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_m_21k-361418a2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_l_21k-91a19ec9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_b0-c7cc451f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_b1-be6e41b0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_b2-847de54e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_b3-57773f13.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mixnet_s-a907afbc.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mixnet_m-4647fc68.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mixnet_l-5a9a2ed8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mixnet_xl_ra-aac3c00c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mixnet_s-89d3354b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mixnet_m-0f4d8805.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mixnet_l-6c92e0c8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/ghostnet.py | https://github.com/huawei-noah/CV-backbones/releases/download/ghostnet_pth/ghostnet_1x.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet18_v1b-0757602b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet34_v1b-c6d82d59.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1b-0ebe02e2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1b-3b017079.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1b-c1edb0dd.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1c-48092f55.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1c-1f26822a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1c-a3bb0b98.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1d-818a1b1b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1d-0f9c8644.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1d-bd354e12.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1s-1762acc0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1s-60fe0cc1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1s-dcc41b81.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnext50_32x4d-e6a097c1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnext101_32x4d-b253c8c4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnext101_64x4d-f9a8e184.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_seresnext50_32x4d-90cf2d6e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_seresnext101_32x4d-cf52900d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_seresnext101_64x4d-f9926f93.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_senet154-70a1a3c0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/gluon_xception.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gluon_xception-7015a15c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_A_Green_38ms_75.9_23474aeb.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_B_Green_40ms_76.5_1f882d1e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_C_Green_44ms_77.1_d4148c9e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_D_Green_50ms_77.4_23e3cdde.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_E_Green_55ms_77.9_90f20e8a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_F_Green_60ms_78.1_2855edf1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnet_w18_small_v1-f460c6bc.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnet_w18_small_v2-4c50a8cb.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w18-8cb57bb9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w30-8d7f8dab.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w32-90d8c5fb.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w40-7cd397a4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w44-c9ac8c18.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/hrnet.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w48-abd2e6ab.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w64-b47cc881.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/inception_resnet_v2.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/inception_resnet_v2-940b1cd6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/inception_resnet_v2.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ens_adv_inception_resnet_v2-2592a550.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/inception_v3.py | https://download.pytorch.org/models/inception_v3_google-1a9a5a14.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/inception_v3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_inception_v3-e0069de4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/inception_v3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/adv_inception_v3-9e27bd63.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/inception_v3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gluon_inception_v3-9f746940.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/inception_v4.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/inceptionv4-8e4777a0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-128S-96703c44.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-128-b88c2750.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-192-92712e41.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-256-13b5763e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-384-9bdaf2e2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/mlp_mixer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_mixer_b16_224-76587d61.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/mlp_mixer.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_mixer_b16_224_in21k-617b3de2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/mlp_mixer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_mixer_l16_224-92f9adc4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/mlp_mixer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_mixer_l16_224_in21k-846aa33c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/mlp_mixer.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/mixer_b16_224_miil_in21k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/mlp_mixer.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/mixer_b16_224_miil.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/mlp_mixer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gmixer_24_224_raa-7daf7ae6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_12_no_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_24_no_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_36_no_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlpB_24_no_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_12_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_24_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_36_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlpB_24_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlpB_24_22k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv3_large_100_ra-f55367f5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/mobilenetv3.py | 
https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/mobilenetv3_large_100_1k_miil_78_0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/mobilenetv3.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/mobilenetv3_large_100_in21k_miil.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv3_100-35495452.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_large_075-150ee8b0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_large_100-427764d5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_large_minimal_100-8596ae28.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_small_075-da427f52.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_small_100-37f49e2b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_small_minimal_100-922a7843.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/nasnet.py | http://data.lip6.fr/cadene/pretrainedmodels/nasnetalarge-a1897284.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f0-604f9c3a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f1-fc540f82.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f2-89875923.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f3-d74ab3aa.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/nfnet.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f4-0ac5b10b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f5-ecb20ab1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f6-e0f12116.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/nfnet_l0_ra2-45c6688d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecanfnet_l0_ra2-e3e9ac50.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecanfnet_l1_ra2-7dce93cd.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecanfnet_l2_ra3-da781a61.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/nf_regnet_b1_256_ra2-ad85cfef.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/nf_resnet50_ra2-9f236009.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/pit.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-pit-weights/pit_ti_730.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/pit.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-pit-weights/pit_xs_781.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/pit.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-pit-weights/pit_s_809.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/pit.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-pit-weights/pit_b_820.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/pit.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-pit-weights/pit_ti_distill_746.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/pit.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-pit-weights/pit_xs_distill_791.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/pit.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-pit-weights/pit_s_distill_819.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/pit.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-pit-weights/pit_b_distill_840.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/pnasnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/pnasnet5large-bf079911.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_002-e7e85e5c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_004-7d0e9424.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_006-85ec1baa.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_008-d8b470eb.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_016-65ca972a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_032-ed0c7f7e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_040-73c2a654.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_064-29278baa.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_080-7c7fcab1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_120-65d5521e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_160-c98c4112.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/regnet.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_320-8ea38b93.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_002-e68ca334.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_004-0db870e6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_006-c67e57ec.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_008-dc900dbe.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_016-54367f74.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/regnety_032_ra-7f2439f9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_040-f0d569f9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_064-0a48325c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_080-e7f3eb93.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_120-721ba79a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/regnet.py | https://dl.fbaipublicfiles.com/deit/regnety_160-a5fe301d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_320-ba464b29.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net50_26w_4s-06e79181.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net50_48w_2s-afed724a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | 
Gluon_ResNet50_v1b_for_PyTorch/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net50_14w_8s-6527dddc.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net50_26w_6s-19041792.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net50_26w_8s-2c7c9f12.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net101_26w_4s-02a759a1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2next50_4s-6ef7e7bf.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gluon_resnest14-9c8fe254.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gluon_resnest26-50eb607c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest50-528c19ca.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest101-22405ba7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest200-75117900.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest269-0cc87c48.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest50_fast_4s2x40d-41d14ed0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest50_fast_1s4x24d-d4a4f76f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet18d_ra2-48a79e06.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet34-43635321.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet34d_ra2-f8dcfcaf.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet26-9aa10e23.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet26d-69e92c46.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet50_ram-a26f946b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet50d_ra2-464e36ba.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet101d_ra2-2803ffab.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet152d_ra2-5cac0439.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet200d_ra2-bdba9bf9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/wide_resnet50_racm-8234f177.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnext50_32x4d_ra-d733960d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnext50d_32x4d-103e99f8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x8-c38310e5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x16-c6f796b0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x32-e4b90b00.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x48-3e41cc8a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet18-d92f0530.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet50-08389792.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext50_32x4-ddb3e555.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x4-dc43570a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x8-2cfe2f8b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x16-15fffa57.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet18-118f1556.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | 
https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet50-16a12f1b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext50_32x4-72679e44.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x4-3f87e46b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x8-b4712904.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x16-f3559a9c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnet50_ra_224-8efdb4bb.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnet152d_ra2-04464dd2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext26d_32x4d-80fa48a3.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext26tn_32x4d-569cb627.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext50_32x4d_racm-a304a460.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecaresnet26t_ra2-46609757.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNetLight_4f34b35b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet50D_833caf58.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45899/outputs/ECAResNet50D_P_9c67f710.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecaresnet50t_ra2-f7ac63c4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet101D_281c5844.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45610/outputs/ECAResNet101D_P_75a3370e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecaresnet269d_320_ra2-7baa55cb.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnetblur50-84f4748f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs50_ema-6b53758b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs101_i192_ema-1509bbf6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs152_i256_ema-a9aff7f9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs200_ema-623d2f59.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs270_ema-b40e674c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs350_i256_ema-5a1aa8f1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs420_ema-972dee69.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x1-ILSVRC2012.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x3-ILSVRC2012.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x1-ILSVRC2012.npz | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x3-ILSVRC2012.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x2-ILSVRC2012.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x4-ILSVRC2012.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x1.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x3.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x1.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x3.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x2.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x4.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/distill/R50x1_224.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/distill/R152x2_T_224.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/distill/R152x2_T_384.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/rexnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rexnet/rexnetv1_100-1b4dddf4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/rexnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rexnet/rexnetv1_130-590d768e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/rexnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rexnet/rexnetv1_150-bd1a6aa8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/rexnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rexnet/rexnetv1_200-8c0b7f2d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/selecsls.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-selecsls/selecsls42b-8af30141.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/selecsls.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-selecsls/selecsls60-bbf87526.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/selecsls.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-selecsls/selecsls60b-94e619b5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/senet.py | http://data.lip6.fr/cadene/pretrainedmodels/senet154-c7b49a05.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/senet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnet18-4bb0ce65.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/senet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnet34-a4004e63.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/senet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/se_resnet50-ce0d4300.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/senet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/se_resnet101-7e38fcc6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/senet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/se_resnet152-d17c99b7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/senet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext26_32x4d-65ebdb501.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/senet.py | http://data.lip6.fr/cadene/pretrainedmodels/se_resnext50_32x4d-a260b3a4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/senet.py | http://data.lip6.fr/cadene/pretrainedmodels/se_resnext101_32x4d-3b2fe3d8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/sknet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/skresnet18_ra-4eec2804.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/sknet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/skresnet34_ra-bdc0ccde.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/sknet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/skresnext50_ra-f40e40bf.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/swin_transformer.py | 
https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_base_patch4_window12_384_22kto1k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/swin_transformer.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_base_patch4_window7_224_22kto1k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/swin_transformer.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_large_patch4_window12_384_22kto1k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/swin_transformer.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_large_patch4_window7_224_22kto1k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/swin_transformer.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_small_patch4_window7_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/swin_transformer.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_tiny_patch4_window7_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/swin_transformer.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_base_patch4_window12_384_22k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/swin_transformer.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_base_patch4_window7_224_22k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/swin_transformer.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_large_patch4_window12_384_22k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/swin_transformer.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_large_patch4_window7_224_22k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/tnt.py | https://github.com/contrastive/pytorch-image-models/releases/download/TNT/tnt_s_patch16_224.pth.tar | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/tresnet.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/tresnet_m_1k_miil_83_1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/tresnet.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/tresnet_m_miil_in21k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/tresnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-tresnet/tresnet_l_81_5-235b486c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/tresnet.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-tresnet/tresnet_xl_82_0-a2d51b00.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/tresnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-tresnet/tresnet_m_448-bc359d10.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/tresnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-tresnet/tresnet_l_448-940d0cd1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/tresnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-tresnet/tresnet_xl_448-8c1815de.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/twins.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vt3p-weights/twins_pcpvt_small-e70e7e7a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/twins.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vt3p-weights/twins_pcpvt_base-e5ecb09b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/twins.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vt3p-weights/twins_pcpvt_large-d273f802.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/twins.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vt3p-weights/twins_svt_small-42e5f78c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/twins.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vt3p-weights/twins_svt_base-c2265010.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/twins.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vt3p-weights/twins_svt_large-90f6aaa9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg11-bbd30ac9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg13-c768596a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg16-397923af.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg19-dcbb9e9d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg11_bn-6002323d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg13_bn-abd245e5.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg16_bn-6c64b313.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg19_bn-c79401a0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/visformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vt3p-weights/visformer_small-839e1f5b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_large_p32_384-9b920ba8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/Ti_16-i21k-300ep-lr_0.001-aug_none-wd_0.03-do_0.0-sd_0.0.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/S_32-i21k-300ep-lr_0.001-aug_light1-wd_0.03-do_0.0-sd_0.0.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/S_16-i21k-300ep-lr_0.001-aug_light1-wd_0.03-do_0.0-sd_0.0.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/B_32-i21k-300ep-lr_0.001-aug_medium1-wd_0.03-do_0.0-sd_0.0.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/B_16-i21k-300ep-lr_0.001-aug_medium1-wd_0.1-do_0.0-sd_0.0.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_large_patch32_224_in21k-9046d2e7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/L_16-i21k-300ep-lr_0.001-aug_medium1-wd_0.1-do_0.1-sd_0.1.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/imagenet21k/ViT-H_14.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_tiny_patch16_224-a1311bcf.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_small_patch16_224-cd65a155.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_224-b5f2ef4d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_384-8de9b5d1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_tiny_distilled_patch16_224-b40b3cf7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_small_distilled_patch16_224-649709d9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_224-df68dfff.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_384-d0272ac0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/vit_base_patch16_224_in21k_miil.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/ | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_resnet50_384-9fd3c705.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/R_Ti_16-i21k-300ep-lr_0.001-aug_none-wd_0.03-do_0.0-sd_0.0.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/R26_S_32-i21k-300ep-lr_0.001-aug_medium2-wd_0.03-do_0.0-sd_0.0.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_resnet50_224_in21k-6f7c7740.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer_hybrid.py | 
https://storage.googleapis.com/vit_models/augreg/R50_L_32-i21k-300ep-lr_0.001-aug_medium2-wd_0.1-do_0.0-sd_0.0.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/vovnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ese_vovnet19b_dw-a8741004.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/vovnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ese_vovnet39b-f912fe73.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/xception.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/xception-43020ad28.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/xception_aligned.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_xception_41-e6439c97.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/xception_aligned.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_xception_65-c9ae96e8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1b_for_PyTorch/timm/models/xception_aligned.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_xception_71-8eec7df1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/vision_transformer_hybrid.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer.py | https://github.com/google-research/vision_transformer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/utils/agc.py | Gluon_ResNet50_v1b_for_PyTorch/timm/utils/agc.py | https://gist.github.com/lucidrains/0d6560077edac419ab5d3aa29e674d5c | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/activations_jit.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/layers/activations_jit.py | https://arxiv.org/abs/1710.05941 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/vision_transformer_hybrid.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/layers/patch_embed.py | https://github.com/google-research/vision_transformer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/efficientnet.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/abs/1812.03443 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/vision_transformer.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer.py | https://github.com/lucidrains/vit-pytorch | 源码实现 | -| 开发引入 | / | Gluon_ResNet50_v1b_for_PyTorch/timm/models/inception_resnet_v2.py | https://github.com/tensorflow/models/tree/master/research/adv_imagenet_models | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/activations_me.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/layers/activations_jit.py | https://arxiv.org/abs/1908.08681 | 论文地址 | -| 开发引入 | / | Gluon_ResNet50_v1b_for_PyTorch/timm/models/levit.py | https://github.com/facebookresearch/LeViT | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/scheduler/cosine_lr.py 
| Gluon_ResNet50_v1b_for_PyTorch/timm/scheduler/cosine_lr.py | https://github.com/allenai/allennlp/blob/master/allennlp/training/learning_rate_schedulers/cosine.py | 源码实现 | -| 开发引入 | / | Gluon_ResNet50_v1b_for_PyTorch/timm/models/layers/involution.py | https://github.com/d-li14/involution/blob/main/cls/mmcls/models/utils/involution_naive.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/resnet.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/densenet.py | https://github.com/pytorch/vision | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/vision_transformer.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer.py | https://github.com/Alibaba-MIIL/ImageNet21K | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/halo_attn.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/layers/bottleneck_attn.py | https://arxiv.org/abs/1904.09925 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/eca.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/layers/eca.py | https://github.com/BangguWu/ECANet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/eca.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/layers/eca.py | https://arxiv.org/pdf/1910.03151.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/cait.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/cait.py | https://arxiv.org/pdf/2003.02436v1.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/efficientnet.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/abs/1904.04971 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/docs/archived_changes.md | Gluon_ResNet50_v1b_for_PyTorch/timm/models/layers/swin_attn.py | https://github.com/microsoft/Swin-Transformer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/senet.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/senet.py | https://github.com/creafz | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/resnetv2.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnetv2.py | https://arxiv.org/abs/2106.05237 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/data/readers/reader_tfds.py | Gluon_ResNet50_v1b_for_PyTorch/timm/data/parsers/parser_tfds.py | https://pytorch.org/docs/stable/data.html#multi-process-data-loading | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/data/real_labels.py | Gluon_ResNet50_v1b_for_PyTorch/timm/data/real_labels.py | https://arxiv.org/abs/2006.07159 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/_efficientnet_blocks.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet_blocks.py | https://arxiv.org/abs/2102.05610 | 论文地址 | -| 开发引入 | / | Gluon_ResNet50_v1b_for_PyTorch/timm/models/gluon_xception.py | https://gluon-cv.mxnet.io/_modules/gluoncv/model_zoo/xception.html | 相关说明 | -| 开发引入 | / | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://github.com/facebookresearch/semi-supervised-ImageNet1K-models/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/metaformer.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/abs/1801.04381 | 论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models.git/timm/models/inception_resnet_v2.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/inception_resnet_v2.py | http://download.tensorflow.org/models/inception_resnet_v2_2016_08_30.tar.gz | 下载链接 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/README.md | Gluon_ResNet50_v1b_for_PyTorch/timm/models/dpn.py | https://github.com/cypw/DPNs | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/rmsprop_tf.py | Gluon_ResNet50_v1b_for_PyTorch/timm/optim/rmsprop_tf.py | https://arxiv.org/abs/1711.05101 | 论文地址 | -| 开发引入 | / | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://arxiv.org/pdf/1812.01187 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/efficientnet.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/facebookresearch/maskrcnn-benchmark/blob/master/maskrcnn_benchmark/modeling/backbone/fbnet_modeldef.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/data/mixup.py | Gluon_ResNet50_v1b_for_PyTorch/timm/data/mixup.py | https://arxiv.org/abs/1905.04899 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/data/mixup.py | Gluon_ResNet50_v1b_for_PyTorch/timm/data/mixup.py | https://arxiv.org/abs/1710.09412 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/docs/archived_changes.md | Gluon_ResNet50_v1b_for_PyTorch/timm/models/pit.py | https://github.com/naver-ai/pit | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/levit.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/levit.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/vision_transformer.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/swin_transformer.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/swin_transformer.py | https://arxiv.org/pdf/2103.14030 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/non_local_attn.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/layers/non_local_attn.py | https://github.com/BA-Transform/BAT-Image-Classification | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/inception_v3.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/inception_v3.py | http://download.tensorflow.org/models/adv_inception_v3_2017_08_18.tar.gz | 下载链接 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/global_context.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/layers/global_context.py | https://github.com/xvjiarui/GCNet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/hrnet.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/hrnet.py | sunk@mail.ustc.edu.cn | 邮箱地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/_efficientnet_blocks.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet_blocks.py | https://arxiv.org/abs/1905.11946 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/halo_attn.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/layers/halo_attn.py | https://arxiv.org/abs/2103.12731 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/_efficientnet_builder.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | 
https://github.com/tensorflow/tpu/blob/master/models/official/efficientnet/efficientnet_model.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/data/auto_augment.py | Gluon_ResNet50_v1b_for_PyTorch/timm/data/auto_augment.py | https://arxiv.org/abs/1906.11172 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/inplace_abn.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/layers/inplace_abn.py | https://github.com/mapillary/inplace_abn.git@v1.0.12 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/halo_attn.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/layers/halo_attn.py | https://gist.github.com/aravindsrinivas/56359b79f0ce4449bcb04ab4b56a57a2 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/_efficientnet_builder.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet_builder.py | https://github.com/tensorflow/tpu/blob/master/models/official/efficientnet/efficientnet_model.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/cond_conv2d.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/layers/cond_conv2d.py | https://github.com/pytorch/pytorch/issues/17983 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/densenet.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/densenet.py | https://arxiv.org/pdf/1707.06990.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/rmsprop_tf.py | Gluon_ResNet50_v1b_for_PyTorch/timm/optim/rmsprop_tf.py | https://arxiv.org/pdf/1308.0850v5.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/coat.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/coat.py | https://github.com/mlpc-ucsd/CoaT | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/adamw.py | Gluon_ResNet50_v1b_for_PyTorch/timm/optim/adamw.py | https://arxiv.org/abs/1412.6980 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/data/auto_augment.py | Gluon_ResNet50_v1b_for_PyTorch/timm/data/auto_augment.py | https://github.com/tensorflow/tpu/blob/master/models/official/efficientnet/autoaugment.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/nadamw.py | Gluon_ResNet50_v1b_for_PyTorch/timm/optim/adamw.py | https://openreview.net/forum?id=ryQu7f-RZ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/_efficientnet_builder.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet_builder.py | https://github.com/tensorflow/tpu/blob/master/models/official/mnasnet/mnasnet_models.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/activations_me.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/layers/activations_jit.py | https://github.com/digantamisra98/H-Mish/blob/0da20d4bc58e696b6803f2523c58d3c8a82782d0/README.md | 相关说明 | -| 开发引入 | / | Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer.py | https://arxiv.org/abs/2106.TODO | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/activations_me.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/layers/activations.py | https://github.com/digantamisra98/H-Mish/blob/0da20d4bc58e696b6803f2523c58d3c8a82782d0/README.md | 相关说明 | -| 开发引入 | / | Gluon_ResNet50_v1b_for_PyTorch/timm/models/gluon_xception.py | 
https://github.com/jfzhang95/pytorch-deeplab-xception | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/coat.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/coat.py | https://arxiv.org/abs/2104.06399 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/data/loader.py | Gluon_ResNet50_v1b_for_PyTorch/timm/data/loader.py | https://github.com/NVIDIA/apex/commit/d5e2bb4bdeedd27b1dfaf5bb2b24d6c000dee9be#diff-cf86c282ff7fba81fad27a559379d5bf | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/inception_v3.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/inception_v3.py | http://download.tensorflow.org/models/inception_v3_2016_08_28.tar.gz | 下载链接 | -| 开发引入 | / | Gluon_ResNet50_v1b_for_PyTorch/timm/models/sknet.py | https://github.com/clovaai/assembled-cnn | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/mlp_mixer.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/mlp_mixer.py | https://arxiv.org/abs/2105.08050 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/eca.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/layers/eca.py | https://github.com/pytorch/pytorch/pull/17240 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/rmsprop_tf.py | Gluon_ResNet50_v1b_for_PyTorch/timm/optim/rmsprop_tf.py | https://github.com/pytorch/pytorch/blob/master/LICENSE | license下载链接 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/resnest.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/layers/split_attn.py | https://arxiv.org/abs/2004.08955 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/res2net.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/res2net.py | https://arxiv.org/abs/1904.01169 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/regnet.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/regnet.py | https://github.com/facebookresearch/pycls/blob/master/pycls/models/regnet.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/nadam.py | Gluon_ResNet50_v1b_for_PyTorch/timm/optim/nadam.py | https://github.com/pytorch/pytorch/pull/1408 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/vision_transformer_sam.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer.py | https://arxiv.org/abs/2010.11929 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/visformer.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/visformer.py | https://github.com/danczs/Visformer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/_efficientnet_blocks.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/mobilenetv3.py | https://arxiv.org/abs/1905.02244 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/byobnet.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/byobnet.py | https://github.com/DingXiaoH/RepVGG | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/hrnet.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/hrnet.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/docs/archived_changes.md | Gluon_ResNet50_v1b_for_PyTorch/timm/models/swin_transformer.py | https://github.com/microsoft/Swin-Transformer | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models.git/timm/models/vision_transformer.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer.py | https://github.com/karpathy/minGPT | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/mlp_mixer.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/mlp_mixer.py | https://arxiv.org/abs/2105.01601 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/activations_me.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/layers/activations_me.py | https://github.com/digantamisra98/H-Mish/blob/0da20d4bc58e696b6803f2523c58d3c8a82782d0/README.md | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/efficientnet.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/tensorflow/models/blob/master/research/slim/nets/mobilenet/mobilenet_v2.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/data/readers/reader_tfds.py | Gluon_ResNet50_v1b_for_PyTorch/timm/data/parsers/parser_tfds.py | https://www.tensorflow.org/datasets/catalog/overview#image_classification | 数据集地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/byobnet.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/byobnet.py | https://arxiv.org/abs/2101.03697 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/res2net.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/dla.py | https://arxiv.org/abs/1904.01169 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/resnet.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/pdf/2002.08258.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/scheduler/tanh_lr.py | Gluon_ResNet50_v1b_for_PyTorch/timm/scheduler/tanh_lr.py | https://arxiv.org/abs/1806.01593 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/hfdocs/source/models.mdx | Gluon_ResNet50_v1b_for_PyTorch/timm/models/byobnet.py | https://github.com/idstcv/GPU-Efficient-Networks | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/halo_attn.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/layers/bottleneck_attn.py | https://gist.github.com/aravindsrinivas/56359b79f0ce4449bcb04ab4b56a57a2 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/sgdp.py | Gluon_ResNet50_v1b_for_PyTorch/timm/optim/adamp.py | https://github.com/clovaai/AdamP | 源码实现 | -| 开发引入 | / | Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://arxiv.org/abs/2106.TODO | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/mlp_mixer.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/mlp_mixer.py | https://arxiv.org/abs/2105.03404 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/xcit.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/cait.py | https://arxiv.org/abs/2103.17239 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/eca.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/layers/eca.py | https://arxiv.org/abs/1910.03151 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/vovnet.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/vovnet.py | https://github.com/youngwanLEE/vovnet-detectron2 | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models.git/timm/models/resnest.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnest.py | https://github.com/zhanghang1989/ResNeSt/blob/master/ablation.md | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/xception.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/xception.py | https://github.com/tstandley/Xception-PyTorch | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/sknet.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/sknet.py | https://arxiv.org/abs/1903.06586 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/nadam.py | Gluon_ResNet50_v1b_for_PyTorch/timm/optim/nadam.py | http://www.cs.toronto.edu/~fritz/absps/momentum.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/lambda_layer.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/layers/lambda_layer.py | https://github.com/lucidrains/lambda-networks | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/twins.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/twins.py | https://arxiv.org/pdf/2104.13840.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/resnet.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://github.com/facebookresearch/semi-supervised-ImageNet1K-models | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/resnet.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://github.com/facebookresearch/WSL-Images | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/data/random_erasing.py | Gluon_ResNet50_v1b_for_PyTorch/timm/data/random_erasing.py | https://github.com/pytorch/pytorch/issues/19508 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/drop.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/layers/drop.py | https://github.com/tensorflow/tpu/blob/master/models/official/resnet/resnet_model.py#L74 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/nvnovograd.py | Gluon_ResNet50_v1b_for_PyTorch/timm/optim/nvnovograd.py | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechRecognition/Jasper | 源码实现 | -| 开发引入 | / | Gluon_ResNet50_v1b_for_PyTorch/sotabench_setup.sh | https://onedrive.hyper.ai/down/ImageNet/data/ImageNet2012/ILSVRC2012_devkit_t12.tar.gz | 下载链接 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/adamp.py | Gluon_ResNet50_v1b_for_PyTorch/timm/optim/adamp.py | https://github.com/clovaai/AdamP/blob/master/adamp/adamp.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/README.md | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://arxiv.org/abs/1805.00932 | 论文地址 | -| 开发引入 | / | Gluon_ResNet50_v1b_for_PyTorch/timm/models/convit.py | https://github.com/facebookresearch/convit | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/resnetv2.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnetv2.py | https://github.com/KaimingHe/resnet-1k-layers/blob/master/resnet-pre-act.lua | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/dpn.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/dpn.py | https://github.com/oyam/pytorch-DPNs | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models.git/timm/models/efficientnet.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/abs/1911.04252 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/cbam.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/layers/cbam.py | https://arxiv.org/abs/1807.06521 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/adabelief.py | Gluon_ResNet50_v1b_for_PyTorch/timm/optim/adabelief.py | https://github.com/juntang-zhuang/Adabelief-Optimizer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/data/random_erasing.py | Gluon_ResNet50_v1b_for_PyTorch/timm/data/random_erasing.py | https://arxiv.org/pdf/1708.04896.pdf | 论文地址 | -| 开发引入 | / | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://pytorch.org/hub/facebookresearch_WSL-Images_resnext/ | 相关说明 | -| 开发引入 | / | Gluon_ResNet50_v1b_for_PyTorch/timm/models/layers/drop.py | https://github.com/tensorflow/tpu/issues/494#issuecomment-532968956 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/activations_me.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/layers/activations_me.py | https://arxiv.org/abs/1908.08681 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/efficientnet.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/abs/1904.02877 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/efficientnet.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/tensorflow/tpu/tree/master/models/official/mnasnet/mixnet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/adafactor.py | Gluon_ResNet50_v1b_for_PyTorch/timm/optim/adafactor.py | https://github.com/pytorch/fairseq/blob/master/fairseq/optim/adafactor.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/lambda_layer.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/layers/lambda_layer.py | https://arxiv.org/abs/2102.08602 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/loss/jsd.py | Gluon_ResNet50_v1b_for_PyTorch/timm/loss/jsd.py | https://github.com/google-research/augmix/blob/master/imagenet.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/utils/agc.py | Gluon_ResNet50_v1b_for_PyTorch/timm/utils/agc.py | https://github.com/deepmind/deepmind-research/tree/master/nfnets | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/mlp_mixer.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer.py | https://github.com/facebookresearch/deit | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/vision_transformer.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/mlp_mixer.py | https://github.com/Alibaba-MIIL/ImageNet21K | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/senet.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/senet.py | https://github.com/pytorch/vision/blob/master/torchvision/models/resnet.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/train.py | Gluon_ResNet50_v1b_for_PyTorch/train.py | https://github.com/NVIDIA/apex/tree/master/examples/imagenet | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models.git/timm/models/_efficientnet_builder.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet_builder.py | https://github.com/facebookresearch/maskrcnn-benchmark/blob/master/maskrcnn_benchmark/modeling/backbone/fbnet_builder.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/efficientnet.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/google/automl/tree/master/efficientnetv2 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/cond_conv2d.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/layers/cond_conv2d.py | https://github.com/tensorflow/tpu/blob/master/models/official/efficientnet/condconv/condconv_layers.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/xception_aligned.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/xception_aligned.py | https://github.com/tensorflow/models/blob/master/research/deeplab/g3doc/model_zoo.md | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/efficientnet.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/tensorflow/tpu/tree/master/models/official/mnasnet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/sknet.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/sknet.py | https://arxiv.org/abs/2001.06268 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/nvnovograd.py | Gluon_ResNet50_v1b_for_PyTorch/timm/optim/nvnovograd.py | https://arxiv.org/abs/1905.11286 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/data/auto_augment.py | Gluon_ResNet50_v1b_for_PyTorch/timm/data/auto_augment.py | https://arxiv.org/abs/1805.09501 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/train.py | Gluon_ResNet50_v1b_for_PyTorch/modelarts/start.py | https://github.com/NVIDIA/apex/tree/master/examples/imagenet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/pit.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/pit.py | https://arxiv.org/abs/2103.16302 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/rexnet.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/rexnet.py | https://github.com/clovaai/rexnet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/squeeze_excite.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/layers/squeeze_excite.py | https://arxiv.org/abs/1709.01507 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/_features.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/features.py | https://github.com/pytorch/vision/blob/d88d8961ae51507d0cb680329d985b1488b1b76b/torchvision/models/_utils.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/drop.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/layers/drop.py | https://arxiv.org/pdf/1810.12890.pdf | 论文地址 | -| 开发引入 | / | Gluon_ResNet50_v1b_for_PyTorch/timm/models/inception_resnet_v2.py | https://github.com/Cadene/tensorflow-model-zoo.torch | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/radam.py | Gluon_ResNet50_v1b_for_PyTorch/timm/optim/radam.py | https://arxiv.org/abs/1908.03265 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/efficientnet.py | 
Gluon_ResNet50_v1b_for_PyTorch/timm/models/layers/cond_conv2d.py | https://arxiv.org/abs/1904.04971 | 论文地址 | -| 开发引入 | / | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet_blocks.py | https://arxiv.org/abs/1801.04381v4 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/scheduler/scheduler.py | Gluon_ResNet50_v1b_for_PyTorch/timm/scheduler/scheduler.py | https://github.com/allenai/allennlp/tree/master/allennlp/training/learning_rate_schedulers | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/tresnet.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/tresnet.py | https://arxiv.org/pdf/2003.13630.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/scheduler/cosine_lr.py | Gluon_ResNet50_v1b_for_PyTorch/timm/scheduler/cosine_lr.py | https://arxiv.org/abs/1608.03983 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/_efficientnet_blocks.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet_blocks.py | https://ai.googleblog.com/2019/08/efficientnet-edgetpu-creating.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/resnest.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnest.py | https://arxiv.org/abs/2004.08955 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/sgdp.py | Gluon_ResNet50_v1b_for_PyTorch/timm/optim/sgdp.py | https://github.com/clovaai/AdamP | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/vgg.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/vgg.py | https://arxiv.org/pdf/1409.1556.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/activations_me.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/layers/activations_me.py | https://twitter.com/jeremyphoward/status/1188251041835315200 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/adahessian.py | Gluon_ResNet50_v1b_for_PyTorch/timm/optim/adahessian.py | https://github.com/davda54/ada-hessian/blob/master/ada_hessian.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/mlp_mixer.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/cait.py | https://github.com/facebookresearch/deit | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/utils/misc.py | Gluon_ResNet50_v1b_for_PyTorch/timm/utils/misc.py | http://www.codinghorror.com/blog/archives/001018.html | 相关说明 | -| 开发引入 | / | Gluon_ResNet50_v1b_for_PyTorch/timm/optim/novograd.py | https://github.com/convergence-lab/novograd | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/vision_transformer_hybrid.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://github.com/google-research/vision_transformer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/mlp_mixer.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/mlp_mixer.py | https://github.com/facebookresearch/deit | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/halo_attn.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/layers/halo_attn.py | https://arxiv.org/abs/1904.09925 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/eca.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/layers/eca.py | https://github.com/VRandme | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models.git/timm/models/convit.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/convit.py | https://arxiv.org/abs/2103.10697 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/vision_transformer_hybrid.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnetv2.py | https://github.com/google-research/vision_transformer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/efficientnet.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/pdf/1807.11626.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/non_local_attn.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/layers/non_local_attn.py | https://github.com/facebookresearch/video-nonlocal-net | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/bottleneck_attn.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/layers/bottleneck_attn.py | https://arxiv.org/abs/2101.11605 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/senet.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/senet.py | https://github.com/hujie-frank/SENet | 源码实现 | -| 开发引入 | / | Gluon_ResNet50_v1b_for_PyTorch/timm/models/inception_v4.py | https://github.com/Cadene/tensorflow-model-zoo.torch | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/resnetv2.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnetv2.py | https://arxiv.org/abs/1912.11370 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/resnet.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/pytorch/vision | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/hfdocs/source/models.mdx | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnetv2.py | https://github.com/google-research/big_transfer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/efficientnet.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet/condconv | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/adabelief.py | Gluon_ResNet50_v1b_for_PyTorch/timm/optim/adabelief.py | https://gist.github.com/juntang-zhuang/0a501dd51c02278d952cf159bc233037 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/gather_excite.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/layers/gather_excite.py | https://arxiv.org/abs/1810.12348 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/deit.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer.py | https://arxiv.org/abs/2012.12877 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/regnet.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/regnet.py | https://arxiv.org/abs/2003.13678 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/senet.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/pnasnet.py | https://github.com/creafz | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/vovnet.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/vovnet.py | https://github.com/stigma0617/VoVNet.pytorch/blob/master/models_vovnet/vovnet.py | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models.git/train.py | Gluon_ResNet50_v1b_for_PyTorch/train.py | https://github.com/pytorch/examples/tree/master/imagenet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/rmsprop_tf.py | Gluon_ResNet50_v1b_for_PyTorch/timm/optim/rmsprop_tf.py | http://www.cs.toronto.edu/~tijmen/csc321/slides/lecture_slides_lec6.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/drop.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/layers/drop.py | https://arxiv.org/abs/1810.12890 | 论文地址 | -| 开发引入 | / | Gluon_ResNet50_v1b_for_PyTorch/timm/models/inception_resnet_v2.py | https://arxiv.org/abs/1705.07204 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/efficientnet.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet/edgetpu | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/dla.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/dla.py | https://arxiv.org/abs/1707.06484 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/ghostnet.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/ghostnet.py | https://arxiv.org/abs/1911.11907 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/README.md | Gluon_ResNet50_v1b_for_PyTorch/timm/models/inception_resnet_v2.py | https://arxiv.org/abs/1602.07261 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/rmsprop_tf.py | Gluon_ResNet50_v1b_for_PyTorch/timm/optim/adamw.py | https://arxiv.org/abs/1711.05101 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/nasnet.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/nasnet.py | https://github.com/Cadene/pretrained-models.pytorch | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/data/readers/reader_tfds.py | Gluon_ResNet50_v1b_for_PyTorch/timm/data/parsers/parser_tfds.py | https://github.com/tensorflow/datasets | 数据集地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/sgdp.py | Gluon_ResNet50_v1b_for_PyTorch/timm/optim/sgdp.py | https://github.com/clovaai/AdamP/blob/master/adamp/sgdp.py | 源码实现 | -| 开发引入 | / | Gluon_ResNet50_v1b_for_PyTorch/timm/models/twins.py | https://github.com/Meituan-AutoML/Twins | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/std_conv.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/layers/std_conv.py | https://arxiv.org/abs/1903.10520v2 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/utils/model.py | Gluon_ResNet50_v1b_for_PyTorch/timm/utils/model.py | https://arxiv.org/abs/2101.08692 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/densenet.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/densenet.py | https://arxiv.org/pdf/1608.06993.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/tnt.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/tnt.py | https://arxiv.org/abs/2103.00112 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/tresnet.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/tresnet.py | https://github.com/mrT23/TResNet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/vision_transformer_sam.py | 
Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://arxiv.org/abs/2010.11929 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/pnasnet.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/pnasnet.py | https://github.com/Cadene/pretrained-models.pytorch/blob/master/pretrainedmodels/models/pnasnet.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/nvnovograd.py | Gluon_ResNet50_v1b_for_PyTorch/timm/optim/novograd.py | https://arxiv.org/abs/1905.11286 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/loss/jsd.py | Gluon_ResNet50_v1b_for_PyTorch/timm/data/auto_augment.py | https://github.com/google-research/augmix/blob/master/imagenet.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/sknet.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/layers/selective_kernel.py | https://arxiv.org/abs/1903.06586 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/resnet.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://github.com/pytorch/vision | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/data/auto_augment.py | Gluon_ResNet50_v1b_for_PyTorch/timm/data/auto_augment.py | https://arxiv.org/abs/1909.13719 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/loss/jsd.py | Gluon_ResNet50_v1b_for_PyTorch/timm/data/auto_augment.py | https://arxiv.org/abs/1912.02781 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/mixed_conv2d.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/layers/mixed_conv2d.py | https://github.com/tensorflow/tpu/blob/master/models/official/mnasnet/mixnet/custom_layers.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/efficientnet.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet/lite | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/adafactor.py | Gluon_ResNet50_v1b_for_PyTorch/timm/optim/adafactor.py | https://arxiv.org/abs/1804.04235 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/_efficientnet_blocks.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet_blocks.py | https://arxiv.org/abs/2004.14525 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/levit.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/convit.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/vision_transformer.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/utils/agc.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/nfnet.py | https://github.com/deepmind/deepmind-research/tree/master/nfnets | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/coat.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/coat.py | https://discuss.pytorch.org/t/how-to-keep-the-shape-of-input-and-output-same-when-dilation-conv/14338 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/vision_transformer_sam.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnetv2.py | https://arxiv.org/abs/2010.11929 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/drop.py | 
Gluon_ResNet50_v1b_for_PyTorch/timm/models/layers/drop.py | https://github.com/clovaai/assembled-cnn/blob/master/nets/blocks.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/hrnet.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/hrnet.py | https://github.com/HRNet/HRNet-Image-Classification | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/_efficientnet_blocks.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet_blocks.py | https://arxiv.org/abs/1807.11626 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/_efficientnet_blocks.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/abs/2104.00298 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/resnet.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://arxiv.org/abs/2103.07579 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/rmsprop_tf.py | Gluon_ResNet50_v1b_for_PyTorch/timm/optim/rmsprop_tf.py | https://github.com/pytorch/pytorch/blob/063946d2b3f3f1e953a2a3b54e0b34f1393de295/torch/optim/rmsprop.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/utils/agc.py | Gluon_ResNet50_v1b_for_PyTorch/timm/utils/agc.py | https://arxiv.org/abs/2102.06171 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/rexnet.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/rexnet.py | https://arxiv.org/abs/2007.00992 | 论文地址 | -| 开发引入 | / | Gluon_ResNet50_v1b_for_PyTorch/timm/models/layers/swin_attn.py | https://arxiv.org/pdf/2103.14030.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/byobnet.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/byobnet.py | https://arxiv.org/abs/2006.14090 | 论文地址 | -| 开发引入 | / | Gluon_ResNet50_v1b_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/dmlc/gluon-cv/blob/master/gluoncv/model_zoo/resnet.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/utils/model.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/nfnet.py | https://arxiv.org/abs/2101.08692 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/res2net.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/dla.py | https://github.com/gasvn/Res2Net/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/resnet.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://arxiv.org/pdf/2002.08258.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/data/readers/reader_tfds.py | Gluon_ResNet50_v1b_for_PyTorch/timm/data/parsers/parser_tfds.py | https://github.com/pytorch/pytorch/issues/33413 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/lookahead.py | Gluon_ResNet50_v1b_for_PyTorch/timm/optim/lookahead.py | https://arxiv.org/abs/1907.08610 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/nadam.py | Gluon_ResNet50_v1b_for_PyTorch/timm/optim/nadam.py | http://cs229.stanford.edu/proj2015/054_report.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/twins.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/twins.py | https://github.com/whai362/PVT.git | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/utils/model.py | 
Gluon_ResNet50_v1b_for_PyTorch/timm/utils/model.py | https://gist.github.com/amaarora/6e56942fcb46e67ba203f3009b30d950 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/activations_jit.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/layers/activations.py | https://arxiv.org/abs/1710.05941 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/efficientnet.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/abs/1911.09665 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/radam.py | Gluon_ResNet50_v1b_for_PyTorch/timm/optim/radam.py | https://github.com/LiyuanLucasLiu/RAdam | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/utils/model.py | Gluon_ResNet50_v1b_for_PyTorch/timm/utils/model.py | https://docs.fast.ai/callback.hook.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/sgdp.py | Gluon_ResNet50_v1b_for_PyTorch/timm/optim/sgdp.py | https://arxiv.org/abs/2006.08217 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/inception_v3.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/inception_v3.py | https://gluon-cv.mxnet.io/model_zoo/classification.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/hardcorenas.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/hardcorenas.py | https://github.com/Alibaba-MIIL/HardCoReNAS | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/inception_v3.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/inception_v3.py | https://github.com/pytorch/vision/blob/master/LICENSE | license地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/lookahead.py | Gluon_ResNet50_v1b_for_PyTorch/timm/optim/lookahead.py | https://github.com/alphadl/lookahead.pytorch | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/vovnet.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/layers/squeeze_excite.py | https://arxiv.org/abs/1911.06667 | 论文地址 | -| 开发引入 | / | Gluon_ResNet50_v1b_for_PyTorch/timm/models/layers/mlp.py | https://arxiv.org/abs/1612.08083","https://arxiv.org/abs/2002.05202 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/hardcorenas.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/hardcorenas.py | https://arxiv.org/abs/2102.11646 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/levit.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/levit.py | https://arxiv.org/abs/2104.01136 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/ghostnet.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/ghostnet.py | https://github.com/huawei-noah/CV-backbones/tree/master/ghostnet_pytorch | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/data/real_labels.py | Gluon_ResNet50_v1b_for_PyTorch/timm/data/real_labels.py | https://github.com/google-research/reassessed-imagenet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/_efficientnet_blocks.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet_blocks.py | https://arxiv.org/abs/2104.00298 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/vision_transformer.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer.py | 
https://github.com/google-research/vision_transformer/blob/00883dd691c63a6830751563748663526e811cee/vit_jax/checkpoint.py#L224 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/tnt.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/tnt.py | https://gitee.com/mindspore/mindspore/tree/master/model_zoo/research/cv/TNT | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/sgdp.py | Gluon_ResNet50_v1b_for_PyTorch/timm/optim/adamp.py | https://arxiv.org/abs/2006.08217 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/res2net.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/res2net.py | https://github.com/gasvn/Res2Net/blob/master/res2net.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/weight_init.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/layers/weight_init.py | https://people.sc.fsu.edu/~jburkardt/presentations/truncated_normal.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/inception_resnet_v2.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/inception_resnet_v2.py | http://download.tensorflow.org/models/ens_adv_inception_resnet_v2_2017_08_18.tar.gz | 下载链接 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/vision_transformer.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/mobilenetv3.py | https://github.com/Alibaba-MIIL/ImageNet21K | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/resnet.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://github.com/tensorflow/tpu/tree/bee9c4f6/models/official/resnet/resnet_rs | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/activations_me.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/layers/activations.py | https://arxiv.org/abs/1908.08681 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/visformer.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/visformer.py | https://arxiv.org/abs/2104.12533 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/drop.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/layers/drop.py | https://arxiv.org/abs/1603.09382 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/efficientnet.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://github.com/DeGirum/pruned-models/releases/tag/efficientnet_v1.0 | 源码实现 | -| 开发引入 | / | Gluon_ResNet50_v1b_for_PyTorch/timm/data/random_erasing.py | https://github.com/zhunzhong07/Random-Erasing | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/cspnet.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/cspnet.py | https://arxiv.org/abs/1911.11929 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/mobilenetv3.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/mobilenetv3.py | https://arxiv.org/abs/2006.02049 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/mlp_mixer.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/mlp_mixer.py | https://github.com/google-research/vision_transformer/blob/linen/vit_jax/models_mixer.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/std_conv.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/layers/std_conv.py | https://github.com/joe-siyuan-qiao/WeightStandardization | 源码实现 | 
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/twins.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/twins.py | https://arxiv.org/abs/2102.10882 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/senet.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/senet.py | https://github.com/Cadene/pretrained-models.pytorch/blob/master/pretrainedmodels/models/senet.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/README.md | Gluon_ResNet50_v1b_for_PyTorch/timm/models/layers/involution.py | https://arxiv.org/abs/2103.06255 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/utils/agc.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/nfnet.py | https://arxiv.org/abs/2102.06171 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/_efficientnet_blocks.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/abs/1905.11946 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/cspnet.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/cspnet.py | https://github.com/WongKinYiu/CrossStagePartialNetworks | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/scheduler/scheduler.py | Gluon_ResNet50_v1b_for_PyTorch/timm/scheduler/scheduler.py | https://github.com/pytorch/fairseq/tree/master/fairseq/optim/lr_scheduler | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/xception.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/xception.py | https://arxiv.org/pdf/1610.02357.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/README.md | Gluon_ResNet50_v1b_for_PyTorch/timm/models/pnasnet.py | https://arxiv.org/abs/1712.00559 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/train.py | Gluon_ResNet50_v1b_for_PyTorch/modelarts/start.py | https://github.com/pytorch/examples/tree/master/imagenet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/mlp_mixer.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/mlp_mixer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resmlp_24_224_raa-a8256759.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/adabelief.py | Gluon_ResNet50_v1b_for_PyTorch/timm/optim/adabelief.py | https://gist.github.com/juntang-zhuang/517ce3c27022b908bb93f78e4f786dc3 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/loss/jsd.py | Gluon_ResNet50_v1b_for_PyTorch/timm/loss/jsd.py | https://arxiv.org/abs/1912.02781 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/data/auto_augment.py | Gluon_ResNet50_v1b_for_PyTorch/timm/data/auto_augment.py | https://github.com/google-research/augmix | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/_efficientnet_blocks.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/abs/1807.11626 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/_efficientnet_builder.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet_builder.py | https://github.com/tensorflow/tpu/blob/master/models/official/mnasnet/mnasnet_model.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/global_context.py | 
Gluon_ResNet50_v1b_for_PyTorch/timm/models/layers/global_context.py | https://arxiv.org/abs/1904.11492 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/_efficientnet_blocks.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet_blocks.py | https://arxiv.org/abs/1905.02244 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/selecsls.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/selecsls.py | https://arxiv.org/abs/1907.00837 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/selecsls.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/selecsls.py | https://creativecommons.org/licenses/by/4.0/legalcode | license地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/utils/agc.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/layers/std_conv.py | https://github.com/deepmind/deepmind-research/tree/master/nfnets | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/resnet.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/vgg.py | https://github.com/pytorch/vision | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/utils/model_ema.py | Gluon_ResNet50_v1b_for_PyTorch/timm/utils/model_ema.py | https://www.tensorflow.org/api_docs/python/tf/train/ExponentialMovingAverage | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/levit.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/cait.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/vision_transformer.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/README.md | Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://arxiv.org/abs/1905.00546 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/res2net.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/res2net.py | https://github.com/gasvn/Res2Net/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/non_local_attn.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/layers/non_local_attn.py | https://openaccess.thecvf.com/content_CVPR_2020/html/Chi_Non-Local_Neural_Networks_With_Grouped_Bilinear_Attentional_Transforms_CVPR_2020_paper.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/efficientnet.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/layers/mixed_conv2d.py | https://arxiv.org/abs/1907.09595 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/gather_excite.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/layers/gather_excite.py | https://github.com/hujie-frank/GENet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/vovnet.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/vovnet.py | https://arxiv.org/abs/1904.09730 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/efficientnet.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/abs/1907.09595 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/dla.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/dla.py | https://github.com/gasvn/Res2Net/blob/master/dla.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/data/mixup.py | Gluon_ResNet50_v1b_for_PyTorch/timm/data/mixup.py | https://github.com/clovaai/CutMix-PyTorch | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models.git/timm/models/vovnet.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/vovnet.py | https://arxiv.org/abs/1911.06667 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/utils/model.py | Gluon_ResNet50_v1b_for_PyTorch/timm/models/layers/std_conv.py | https://arxiv.org/abs/2101.08692 | 论文地址 | +| 文件位置 | 公网地址 | 公网地址用途 | +|-----------------------------------------------------------------------------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/mkdocs.yml | https://cdnjs.cloudflare.com/ajax/libs/tablesort/5.2.1/tablesort.min.js | 开源引用声明 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/mkdocs.yml | https://cdnjs.cloudflare.com/ajax/libs/mathjax/2.7.0/MathJax.js?config=TeX-MML-AM_CHTML | 开源引用声明 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/setup.py | hello@rwightman.com | 作者邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/M36_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/M48_448.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/S24_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/S24_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/S36_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XS24_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XXS24_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XXS24_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XXS36_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XXS36_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/convit.py | https://dl.fbaipublicfiles.com/convit/convit_base.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/convit.py | https://dl.fbaipublicfiles.com/convit/convit_small.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/convit.py | https://dl.fbaipublicfiles.com/convit/convit_tiny.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/densenet.py | https://download.pytorch.org/models/densenet121-a639ec97.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/densenet.py | https://download.pytorch.org/models/densenet201-c1103571.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/densenet.py | https://download.pytorch.org/models/densenet169-b2777c0a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/densenet.py | https://download.pytorch.org/models/densenet161-8d451a50.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60x_c-b870c45c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60x-d15cacda.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60-24839fc4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla46x_c-d761bae7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla46_c-2bfd52c3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla34-ba72cf86.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla169-0914e092.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102x2-262837b6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102x-ad62be81.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102-d94d9790.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb1_pruned_9ebb3fe6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb2_pruned_203f55bc.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb3_pruned_5abcc29f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_F_Green_60ms_78.1_2855edf1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_E_Green_55ms_77.9_90f20e8a.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_D_Green_50ms_77.4_23e3cdde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_C_Green_44ms_77.1_d4148c9e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_B_Green_40ms_76.5_1f882d1e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_A_Green_38ms_75.9_23474aeb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/inception_v3.py | https://download.pytorch.org/models/inception_v3_google-1a9a5a14.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-384-9bdaf2e2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-128-b88c2750.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-128S-96703c44.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-192-92712e41.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-256-13b5763e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_12_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_12_no_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_24_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_24_no_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_36_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_36_no_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlpB_24_22k.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlpB_24_dist.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlpB_24_no_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/mlp_mixer.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/mixer_b16_224_miil.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/mlp_mixer.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/mixer_b16_224_miil_in21k.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/mobilenetv3.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/mobilenetv3_large_100_1k_miil_78_0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/mobilenetv3.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/mobilenetv3_large_100_in21k_miil.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/regnet.py | https://dl.fbaipublicfiles.com/deit/regnety_160-a5fe301d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x8-b4712904.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet18-d92f0530.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet50-08389792.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x16-15fffa57.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x4-dc43570a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x8-2cfe2f8b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext50_32x4-ddb3e555.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet18-118f1556.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet50-16a12f1b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x16-f3559a9c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x4-3f87e46b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext50_32x4-72679e44.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet101D_281c5844.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet50D_833caf58.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNetLight_4f34b35b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45610/outputs/ECAResNet101D_P_75a3370e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45899/outputs/ECAResNet50D_P_9c67f710.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x8-c38310e5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | 
https://download.pytorch.org/models/ig_resnext101_32x48-3e41cc8a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x32-e4b90b00.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x16-c6f796b0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x1.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x1-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x3.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x3-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x2.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x2-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x4.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x4-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x1.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x1-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x3.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x3-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/distill/R152x2_T_224.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/distill/R152x2_T_384.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/distill/R50x1_224.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/tresnet.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/tresnet_m_1k_miil_83_1.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/tresnet.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/tresnet_m_miil_in21k.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg19-dcbb9e9d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg19_bn-c79401a0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg16-397923af.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg16_bn-6c64b313.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg13-c768596a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg13_bn-abd245e5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg11-bbd30ac9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg11_bn-6002323d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer.py | 
https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/B_32-i21k-300ep-lr_0.001-aug_medium1-wd_0.03-do_0.0-sd_0.0.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_224-df68dfff.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_384-d0272ac0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_224-b5f2ef4d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_384-8de9b5d1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_small_distilled_patch16_224-649709d9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_small_patch16_224-cd65a155.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_tiny_distilled_patch16_224-b40b3cf7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_tiny_patch16_224-a1311bcf.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/vit_base_patch16_224_in21k_miil.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/B_16-i21k-300ep-lr_0.001-aug_medium1-wd_0.1-do_0.0-sd_0.0.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/L_16-i21k-300ep-lr_0.001-aug_medium1-wd_0.1-do_0.1-sd_0.1.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/S_16-i21k-300ep-lr_0.001-aug_light1-wd_0.03-do_0.0-sd_0.0.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer.py | 
https://storage.googleapis.com/vit_models/augreg/S_32-i21k-300ep-lr_0.001-aug_light1-wd_0.03-do_0.0-sd_0.0.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/Ti_16-i21k-300ep-lr_0.001-aug_none-wd_0.03-do_0.0-sd_0.0.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/imagenet21k/ViT-H_14.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/R_Ti_16-i21k-300ep-lr_0.001-aug_none-wd_0.03-do_0.0-sd_0.0.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/R26_S_32-i21k-300ep-lr_0.001-aug_medium2-wd_0.03-do_0.0-sd_0.0.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1b_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/R50_L_32-i21k-300ep-lr_0.001-aug_medium2-wd_0.1-do_0.0-sd_0.0.npz | 权重地址 | diff --git a/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/public_address_statement.md b/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/public_address_statement.md index 5908f378379f84a615f5e10a5bcbe94242e833f4..992399110f1822574212c8e6ec419b7ed9a59cf8 100644 --- a/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/public_address_statement.md @@ -1,751 +1,153 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|---------------------------------------------------------|-------------------------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------------------|--------| -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/mkdocs.yml | https://github.com/rwightman/pytorch-image-models | 开源代码仓 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | 
Gluon_ResNet50_v1c_for_PyTorch/mkdocs.yml | https://cdnjs.cloudflare.com/ajax/libs/mathjax/2.7.0/MathJax.js?config=TeX-MML-AM_CHTML | 开源引用声明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/mkdocs.yml | https://cdnjs.cloudflare.com/ajax/libs/tablesort/5.2.1/tablesort.min.js | 开源引用声明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/model-index.yml | https://rwightman.github.io/pytorch-image-models/ | 开源代码仓 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/model-index.yml | https://github.com/rwightman/pytorch-image-models | 开源代码仓 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/setup.py | https://rwightman.github.io/pytorch-image-models/ | 开源代码仓 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/setup.py | hello@rwightman.com | 邮箱 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/sotabench.py | https://github.com/mehtadushy/SelecSLS-Pytorch | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/sotabench.py | https://github.com/mehtadushy/SelecSLS-Pytorch | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/sotabench.py | https://github.com/mehtadushy/SelecSLS-Pytorch | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/sotabench.py | https://github.com/zhanghang1989/ResNeSt | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/sotabench_setup.sh | https://github.com/mrT23/pillow-simd/zipball/simd/7.0.x | 下载第三方库 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/sotabench_setup.sh | https://onedrive.hyper.ai/down/ImageNet/data/ImageNet2012/ILSVRC2012_img_val.tar | 下载数据集 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-ger-weights/gernet_s-756b4751.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-ger-weights/gernet_m-0873c53a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-ger-weights/gernet_l-f31e2e8d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_a2-c1ee6d2b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b0-80ac3f1b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/byobnet.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b1-77ca2989.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b1g4-abde5d92.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b2-25b7494e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b2g4-165a85f2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b3-199bc50d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b3g4-73c370bf.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet51q_ra2-d47dcc76.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XXS24_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XXS24_384.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XXS36_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XXS36_384.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XS24_384.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/S24_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/S24_384.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/S36_384.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/M36_384.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/M48_448.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | 
Gluon_ResNet50_v1c_for_PyTorch/timm/models/coat.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-coat-weights/coat_tiny-473c2a20.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/coat.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-coat-weights/coat_mini-2c6baf49.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/coat.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-coat-weights/coat_lite_tiny-461b07a7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/coat.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-coat-weights/coat_lite_mini-d7842000.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/coat.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-coat-weights/coat_lite_small-fea1d5a1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/convit.py | https://dl.fbaipublicfiles.com/convit/convit_tiny.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/convit.py | https://dl.fbaipublicfiles.com/convit/convit_small.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/convit.py | https://dl.fbaipublicfiles.com/convit/convit_base.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/cspnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/cspresnet50_ra-d3e8d487.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/cspnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/cspresnext50_ra_224-648b4713.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/cspnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/cspdarknet53_ra_256-d05c7c21.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/densenet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/densenet121_ra-50efcf5c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/densenet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/densenetblur121d_ra-100dcfbc.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/densenet.py | https://download.pytorch.org/models/densenet169-b2777c0a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/densenet.py | https://download.pytorch.org/models/densenet201-c1103571.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/densenet.py | 
https://download.pytorch.org/models/densenet161-8d451a50.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/densenet.py | https://download.pytorch.org/models/densenet121-a639ec97.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla34-ba72cf86.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla46_c-2bfd52c3.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla46x_c-d761bae7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60x_c-b870c45c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60-24839fc4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60x-d15cacda.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102-d94d9790.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102x-ad62be81.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102x2-262837b6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla169-0914e092.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/dla.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net_dla60_4s-d88db7f9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/dla.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2next_dla60_4s-d327927b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/dpn.py | https://github.com/rwightman/pytorch-dpn-pretrained/releases/download/v0.1/dpn68-66bebafa7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/dpn.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/dpn68b_ra-a31ca160.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/dpn.py | https://github.com/rwightman/pytorch-dpn-pretrained/releases/download/v0.1/dpn92_extra-b040e4a9b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/dpn.py | 
https://github.com/rwightman/pytorch-dpn-pretrained/releases/download/v0.1/dpn98-5b90dec4d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/dpn.py | https://github.com/rwightman/pytorch-dpn-pretrained/releases/download/v0.1/dpn131-71dfe43e0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/dpn.py | https://github.com/rwightman/pytorch-dpn-pretrained/releases/download/v0.1/dpn107_extra-1ac7121e2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mnasnet_b1-74cb7081.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mnasnet_a1-d9418771.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv2_100_ra-b33bc2c4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv2_110d_ra-77090ade.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv2_120d_ra-5987e2ed.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv2_140_ra-21a4e913.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/fbnetc_100-c345b898.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/spnasnet_100-048bc3f4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b0_ra-3dd342df.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b1-533bc792.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b2_ra-bcdf34b7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b3_ra2-cf984f9c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b4_ra2_320-7eb33cd5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_es_ra-f111e99c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_em_ra2-66250f76.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/DeGirum/pruned-models/releases/download/efficientnet_v1.0/efficientnet_el.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/DeGirum/pruned-models/releases/download/efficientnet_v1.0/efficientnet_es_pruned75.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/DeGirum/pruned-models/releases/download/efficientnet_v1.0/efficientnet_el_pruned70.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_lite0_ra-37913777.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb1_pruned_9ebb3fe6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb2_pruned_203f55bc.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb3_pruned_5abcc29f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_v2s_ra2_288-a6477665.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnetv2_rw_m_agc-3d90cb1e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b0_aa-827b6e33.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | 
Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b1_aa-ea7a6ee0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b2_aa-60c94f97.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b3_aa-84b4657e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b4_aa-818f208c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b5_ra-9a3e5369.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b6_aa-80ba17e4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b7_ra-6c08e654.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b8_ra-572d5dd9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b0_ap-f262efe1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b1_ap-44ef0a3d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b2_ap-2f8e7636.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b3_ap-aad25bdd.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b4_ap-dedb23e6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b5_ap-9e82fae8.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b6_ap-4ffb161f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b7_ap-ddb28fec.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b8_ap-00e169fa.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b0_ns-c0e6a31c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b1_ns-99dd0c41.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b2_ns-00306e48.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b3_ns-9d44bf68.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b4_ns-d6313a46.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b5_ns-6f26d0cf.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b6_ns-51548356.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b7_ns-1dbc32de.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_l2_ns_475-bebbd00a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_l2_ns-df73bb44.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_es-ca1afbfe.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_em-e78cfe58.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_el-5143854e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_cc_b0_4e-4362b6b2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_cc_b0_8e-66184a25.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_cc_b1_8e-f7c79ae1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite0-0aa007d2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite1-bde8b488.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite2-dcccb7df.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite3-b733e338.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite4-741542c3.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_s-eb54923e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_m-cc09e0cd.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_l-d664b728.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_s_21ft1k-d7dafa41.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_m_21ft1k-bf41664a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_l_21ft1k-60127a9d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_s_21k-6337ad01.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_m_21k-361418a2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_l_21k-91a19ec9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_b0-c7cc451f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_b1-be6e41b0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_b2-847de54e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_b3-57773f13.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mixnet_s-a907afbc.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mixnet_m-4647fc68.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mixnet_l-5a9a2ed8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mixnet_xl_ra-aac3c00c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mixnet_s-89d3354b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mixnet_m-0f4d8805.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mixnet_l-6c92e0c8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/ghostnet.py | https://github.com/huawei-noah/CV-backbones/releases/download/ghostnet_pth/ghostnet_1x.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet18_v1b-0757602b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet34_v1b-c6d82d59.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1b-0ebe02e2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1b-3b017079.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1b-c1edb0dd.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1c-48092f55.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1c-1f26822a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1c-a3bb0b98.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1d-818a1b1b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/gluon_resnet.py | 
https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1d-0f9c8644.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1d-bd354e12.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1s-1762acc0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1s-60fe0cc1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1s-dcc41b81.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnext50_32x4d-e6a097c1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnext101_32x4d-b253c8c4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnext101_64x4d-f9a8e184.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_seresnext50_32x4d-90cf2d6e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_seresnext101_32x4d-cf52900d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_seresnext101_64x4d-f9926f93.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_senet154-70a1a3c0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/gluon_xception.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gluon_xception-7015a15c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_A_Green_38ms_75.9_23474aeb.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_B_Green_40ms_76.5_1f882d1e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_C_Green_44ms_77.1_d4148c9e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_D_Green_50ms_77.4_23e3cdde.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_E_Green_55ms_77.9_90f20e8a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_F_Green_60ms_78.1_2855edf1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnet_w18_small_v1-f460c6bc.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnet_w18_small_v2-4c50a8cb.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w18-8cb57bb9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w30-8d7f8dab.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w32-90d8c5fb.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w40-7cd397a4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w44-c9ac8c18.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w48-abd2e6ab.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w64-b47cc881.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | 
Gluon_ResNet50_v1c_for_PyTorch/timm/models/inception_resnet_v2.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/inception_resnet_v2-940b1cd6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/inception_resnet_v2.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ens_adv_inception_resnet_v2-2592a550.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/inception_v3.py | https://download.pytorch.org/models/inception_v3_google-1a9a5a14.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/inception_v3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_inception_v3-e0069de4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/inception_v3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/adv_inception_v3-9e27bd63.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/inception_v3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gluon_inception_v3-9f746940.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/inception_v4.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/inceptionv4-8e4777a0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-128S-96703c44.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-128-b88c2750.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-192-92712e41.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-256-13b5763e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-384-9bdaf2e2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/mlp_mixer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_mixer_b16_224-76587d61.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/mlp_mixer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_mixer_b16_224_in21k-617b3de2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/mlp_mixer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_mixer_l16_224-92f9adc4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/mlp_mixer.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_mixer_l16_224_in21k-846aa33c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/mlp_mixer.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/mixer_b16_224_miil_in21k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/mlp_mixer.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/mixer_b16_224_miil.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/mlp_mixer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gmixer_24_224_raa-7daf7ae6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_12_no_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_24_no_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_36_no_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlpB_24_no_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_12_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_24_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_36_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlpB_24_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlpB_24_22k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv3_large_100_ra-f55367f5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/mobilenetv3.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/mobilenetv3_large_100_1k_miil_78_0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/mobilenetv3.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/mobilenetv3_large_100_in21k_miil.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | 
Gluon_ResNet50_v1c_for_PyTorch/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv3_100-35495452.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_large_075-150ee8b0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_large_100-427764d5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_large_minimal_100-8596ae28.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_small_075-da427f52.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_small_100-37f49e2b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_small_minimal_100-922a7843.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/nasnet.py | http://data.lip6.fr/cadene/pretrainedmodels/nasnetalarge-a1897284.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f0-604f9c3a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f1-fc540f82.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f2-89875923.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f3-d74ab3aa.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f4-0ac5b10b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f5-ecb20ab1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/nfnet.py 
| https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f6-e0f12116.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/nfnet_l0_ra2-45c6688d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecanfnet_l0_ra2-e3e9ac50.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecanfnet_l1_ra2-7dce93cd.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecanfnet_l2_ra3-da781a61.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/nf_regnet_b1_256_ra2-ad85cfef.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/nf_resnet50_ra2-9f236009.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/pit.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-pit-weights/pit_ti_730.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/pit.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-pit-weights/pit_xs_781.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/pit.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-pit-weights/pit_s_809.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/pit.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-pit-weights/pit_b_820.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/pit.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-pit-weights/pit_ti_distill_746.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/pit.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-pit-weights/pit_xs_distill_791.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/pit.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-pit-weights/pit_s_distill_819.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/pit.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-pit-weights/pit_b_distill_840.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git 
| Gluon_ResNet50_v1c_for_PyTorch/timm/models/pnasnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/pnasnet5large-bf079911.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_002-e7e85e5c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_004-7d0e9424.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_006-85ec1baa.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_008-d8b470eb.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_016-65ca972a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_032-ed0c7f7e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_040-73c2a654.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_064-29278baa.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_080-7c7fcab1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_120-65d5521e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_160-c98c4112.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_320-8ea38b93.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_002-e68ca334.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_004-0db870e6.pth | 下载权重文件 | -| 
开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_006-c67e57ec.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_008-dc900dbe.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_016-54367f74.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/regnety_032_ra-7f2439f9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_040-f0d569f9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_064-0a48325c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_080-e7f3eb93.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_120-721ba79a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/regnet.py | https://dl.fbaipublicfiles.com/deit/regnety_160-a5fe301d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_320-ba464b29.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net50_26w_4s-06e79181.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net50_48w_2s-afed724a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net50_14w_8s-6527dddc.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net50_26w_6s-19041792.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/res2net.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net50_26w_8s-2c7c9f12.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net101_26w_4s-02a759a1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2next50_4s-6ef7e7bf.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gluon_resnest14-9c8fe254.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gluon_resnest26-50eb607c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest50-528c19ca.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest101-22405ba7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest200-75117900.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest269-0cc87c48.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest50_fast_4s2x40d-41d14ed0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest50_fast_1s4x24d-d4a4f76f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet18d_ra2-48a79e06.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet34-43635321.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet34d_ra2-f8dcfcaf.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet26-9aa10e23.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet26d-69e92c46.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet50_ram-a26f946b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet50d_ra2-464e36ba.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet101d_ra2-2803ffab.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet152d_ra2-5cac0439.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet200d_ra2-bdba9bf9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/wide_resnet50_racm-8234f177.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnext50_32x4d_ra-d733960d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnext50d_32x4d-103e99f8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | 
https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x8-c38310e5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x16-c6f796b0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x32-e4b90b00.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x48-3e41cc8a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet18-d92f0530.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet50-08389792.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext50_32x4-ddb3e555.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x4-dc43570a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x8-2cfe2f8b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x16-15fffa57.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet18-118f1556.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet50-16a12f1b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext50_32x4-72679e44.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x4-3f87e46b.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x8-b4712904.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x16-f3559a9c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnet50_ra_224-8efdb4bb.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnet152d_ra2-04464dd2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext26d_32x4d-80fa48a3.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext26tn_32x4d-569cb627.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext50_32x4d_racm-a304a460.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecaresnet26t_ra2-46609757.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNetLight_4f34b35b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet50D_833caf58.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45899/outputs/ECAResNet50D_P_9c67f710.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecaresnet50t_ra2-f7ac63c4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet101D_281c5844.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45610/outputs/ECAResNet101D_P_75a3370e.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecaresnet269d_320_ra2-7baa55cb.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnetblur50-84f4748f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs50_ema-6b53758b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs101_i192_ema-1509bbf6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs152_i256_ema-a9aff7f9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs200_ema-623d2f59.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs270_ema-b40e674c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs350_i256_ema-5a1aa8f1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs420_ema-972dee69.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x1-ILSVRC2012.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x3-ILSVRC2012.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x1-ILSVRC2012.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x3-ILSVRC2012.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x2-ILSVRC2012.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x4-ILSVRC2012.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git 
| Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x1.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x3.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x1.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x3.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x2.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x4.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/distill/R50x1_224.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/distill/R152x2_T_224.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/distill/R152x2_T_384.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/rexnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rexnet/rexnetv1_100-1b4dddf4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/rexnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rexnet/rexnetv1_130-590d768e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/rexnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rexnet/rexnetv1_150-bd1a6aa8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/rexnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rexnet/rexnetv1_200-8c0b7f2d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/selecsls.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-selecsls/selecsls42b-8af30141.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/selecsls.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-selecsls/selecsls60-bbf87526.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/selecsls.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-selecsls/selecsls60b-94e619b5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/senet.py | 
http://data.lip6.fr/cadene/pretrainedmodels/senet154-c7b49a05.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/senet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnet18-4bb0ce65.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/senet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnet34-a4004e63.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/senet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/se_resnet50-ce0d4300.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/senet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/se_resnet101-7e38fcc6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/senet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/se_resnet152-d17c99b7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/senet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext26_32x4d-65ebdb501.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/senet.py | http://data.lip6.fr/cadene/pretrainedmodels/se_resnext50_32x4d-a260b3a4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/senet.py | http://data.lip6.fr/cadene/pretrainedmodels/se_resnext101_32x4d-3b2fe3d8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/sknet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/skresnet18_ra-4eec2804.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/sknet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/skresnet34_ra-bdc0ccde.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/sknet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/skresnext50_ra-f40e40bf.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/swin_transformer.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_base_patch4_window12_384_22kto1k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/swin_transformer.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_base_patch4_window7_224_22kto1k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/swin_transformer.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_large_patch4_window12_384_22kto1k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | 
Gluon_ResNet50_v1c_for_PyTorch/timm/models/swin_transformer.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_large_patch4_window7_224_22kto1k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/swin_transformer.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_small_patch4_window7_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/swin_transformer.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_tiny_patch4_window7_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/swin_transformer.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_base_patch4_window12_384_22k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/swin_transformer.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_base_patch4_window7_224_22k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/swin_transformer.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_large_patch4_window12_384_22k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/swin_transformer.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_large_patch4_window7_224_22k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/tnt.py | https://github.com/contrastive/pytorch-image-models/releases/download/TNT/tnt_s_patch16_224.pth.tar | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/tresnet.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/tresnet_m_1k_miil_83_1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/tresnet.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/tresnet_m_miil_in21k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/tresnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-tresnet/tresnet_l_81_5-235b486c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/tresnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-tresnet/tresnet_xl_82_0-a2d51b00.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/tresnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-tresnet/tresnet_m_448-bc359d10.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/tresnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-tresnet/tresnet_l_448-940d0cd1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/tresnet.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-tresnet/tresnet_xl_448-8c1815de.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/twins.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vt3p-weights/twins_pcpvt_small-e70e7e7a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/twins.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vt3p-weights/twins_pcpvt_base-e5ecb09b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/twins.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vt3p-weights/twins_pcpvt_large-d273f802.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/twins.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vt3p-weights/twins_svt_small-42e5f78c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/twins.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vt3p-weights/twins_svt_base-c2265010.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/twins.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vt3p-weights/twins_svt_large-90f6aaa9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg11-bbd30ac9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg13-c768596a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg16-397923af.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg19-dcbb9e9d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg11_bn-6002323d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg13_bn-abd245e5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg16_bn-6c64b313.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg19_bn-c79401a0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/visformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vt3p-weights/visformer_small-839e1f5b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer.py | 
https://storage.googleapis.com/vit_models/augreg/ | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_large_p32_384-9b920ba8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/Ti_16-i21k-300ep-lr_0.001-aug_none-wd_0.03-do_0.0-sd_0.0.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/S_32-i21k-300ep-lr_0.001-aug_light1-wd_0.03-do_0.0-sd_0.0.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/S_16-i21k-300ep-lr_0.001-aug_light1-wd_0.03-do_0.0-sd_0.0.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/B_32-i21k-300ep-lr_0.001-aug_medium1-wd_0.03-do_0.0-sd_0.0.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/B_16-i21k-300ep-lr_0.001-aug_medium1-wd_0.1-do_0.0-sd_0.0.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_large_patch32_224_in21k-9046d2e7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/L_16-i21k-300ep-lr_0.001-aug_medium1-wd_0.1-do_0.1-sd_0.1.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/imagenet21k/ViT-H_14.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_tiny_patch16_224-a1311bcf.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_small_patch16_224-cd65a155.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_224-b5f2ef4d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_384-8de9b5d1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | 
Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_tiny_distilled_patch16_224-b40b3cf7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_small_distilled_patch16_224-649709d9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_224-df68dfff.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_384-d0272ac0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/vit_base_patch16_224_in21k_miil.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/ | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_resnet50_384-9fd3c705.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/R_Ti_16-i21k-300ep-lr_0.001-aug_none-wd_0.03-do_0.0-sd_0.0.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/R26_S_32-i21k-300ep-lr_0.001-aug_medium2-wd_0.03-do_0.0-sd_0.0.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_resnet50_224_in21k-6f7c7740.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/R50_L_32-i21k-300ep-lr_0.001-aug_medium2-wd_0.1-do_0.0-sd_0.0.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/vovnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ese_vovnet19b_dw-a8741004.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/vovnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ese_vovnet39b-f912fe73.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/xception.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/xception-43020ad28.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/xception_aligned.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_xception_41-e6439c97.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/xception_aligned.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_xception_65-c9ae96e8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | Gluon_ResNet50_v1c_for_PyTorch/timm/models/xception_aligned.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_xception_71-8eec7df1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/metaformer.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/abs/1801.04381 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/utils/model.py | Gluon_ResNet50_v1c_for_PyTorch/timm/utils/model.py | https://docs.fast.ai/callback.hook.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/README.md | Gluon_ResNet50_v1c_for_PyTorch/timm/models/pnasnet.py | https://arxiv.org/abs/1712.00559 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/efficientnet.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/layers/mixed_conv2d.py | https://arxiv.org/abs/1907.09595 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/res2net.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/res2net.py | https://github.com/gasvn/Res2Net/blob/master/res2net.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/regnet.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/regnet.py | https://arxiv.org/abs/2003.13678 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/activations_me.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/layers/activations_me.py | https://twitter.com/jeremyphoward/status/1188251041835315200 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/efficientnet.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/abs/1911.09665 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/data/real_labels.py | Gluon_ResNet50_v1c_for_PyTorch/timm/data/real_labels.py | https://arxiv.org/abs/2006.07159 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/adamw.py | Gluon_ResNet50_v1c_for_PyTorch/timm/optim/adamw.py | https://arxiv.org/abs/1412.6980 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/halo_attn.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/layers/halo_attn.py | https://arxiv.org/abs/2103.12731 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/byobnet.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/byobnet.py | https://arxiv.org/abs/2101.03697 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/data/mixup.py | Gluon_ResNet50_v1c_for_PyTorch/timm/data/mixup.py | https://arxiv.org/abs/1905.04899 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/deit.py | 
Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer.py | https://arxiv.org/abs/2012.12877 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/global_context.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/layers/global_context.py | https://github.com/xvjiarui/GCNet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/resnet.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/vgg.py | https://github.com/pytorch/vision | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/mlp_mixer.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/mlp_mixer.py | https://github.com/google-research/vision_transformer/blob/linen/vit_jax/models_mixer.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/utils/agc.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/layers/std_conv.py | https://github.com/deepmind/deepmind-research/tree/master/nfnets | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/drop.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/layers/drop.py | https://arxiv.org/abs/1603.09382 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/data/loader.py | Gluon_ResNet50_v1c_for_PyTorch/timm/data/loader.py | https://github.com/NVIDIA/apex/commit/d5e2bb4bdeedd27b1dfaf5bb2b24d6c000dee9be#diff-cf86c282ff7fba81fad27a559379d5bf | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/levit.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/levit.py | https://arxiv.org/abs/2104.01136 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/inception_v3.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/inception_v3.py | https://gluon-cv.mxnet.io/model_zoo/classification.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/efficientnet.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/abs/1911.04252 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/utils/misc.py | Gluon_ResNet50_v1c_for_PyTorch/timm/utils/misc.py | http://www.codinghorror.com/blog/archives/001018.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/tresnet.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/tresnet.py | https://arxiv.org/pdf/2003.13630.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/sknet.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/sknet.py | https://arxiv.org/abs/1903.06586 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/README.md | Gluon_ResNet50_v1c_for_PyTorch/timm/models/layers/involution.py | https://arxiv.org/abs/2103.06255 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/eca.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/layers/eca.py | https://github.com/BangguWu/ECANet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/dpn.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/dpn.py | https://github.com/oyam/pytorch-DPNs | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/activations_me.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/layers/activations_jit.py | https://arxiv.org/abs/1908.08681 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/data/readers/reader_tfds.py | 
Gluon_ResNet50_v1c_for_PyTorch/timm/data/parsers/parser_tfds.py | https://pytorch.org/docs/stable/data.html#multi-process-data-loading | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/data/random_erasing.py | Gluon_ResNet50_v1c_for_PyTorch/timm/data/random_erasing.py | https://arxiv.org/pdf/1708.04896.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/resnetv2.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnetv2.py | https://arxiv.org/abs/2106.05237 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/_efficientnet_builder.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet_builder.py | https://github.com/tensorflow/tpu/blob/master/models/official/efficientnet/efficientnet_model.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/senet.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/senet.py | https://github.com/hujie-frank/SENet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/sknet.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/layers/selective_kernel.py | https://arxiv.org/abs/1903.06586 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/mlp_mixer.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/mlp_mixer.py | https://arxiv.org/abs/2105.03404 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/scheduler/cosine_lr.py | Gluon_ResNet50_v1c_for_PyTorch/timm/scheduler/cosine_lr.py | https://github.com/allenai/allennlp/blob/master/allennlp/training/learning_rate_schedulers/cosine.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/_efficientnet_blocks.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet_blocks.py | https://arxiv.org/abs/1905.11946 | 论文地址 | -| 开发引入 | / | Gluon_ResNet50_v1c_for_PyTorch/timm/optim/novograd.py | https://github.com/convergence-lab/novograd | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/efficientnet.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/layers/cond_conv2d.py | https://arxiv.org/abs/1904.04971 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/_efficientnet_builder.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet_builder.py | https://github.com/tensorflow/tpu/blob/master/models/official/mnasnet/mnasnet_model.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/_efficientnet_blocks.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet_blocks.py | https://arxiv.org/abs/2102.05610 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/activations_jit.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/layers/activations.py | https://arxiv.org/abs/1710.05941 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/selecsls.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/selecsls.py | https://creativecommons.org/licenses/by/4.0/legalcode | license地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/visformer.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/visformer.py | https://arxiv.org/abs/2104.12533 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/resnet.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | 
https://github.com/facebookresearch/semi-supervised-ImageNet1K-models | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/efficientnet.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/facebookresearch/maskrcnn-benchmark/blob/master/maskrcnn_benchmark/modeling/backbone/fbnet_modeldef.py | 源码实现 | -| 开发引入 | / | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://github.com/facebookresearch/semi-supervised-ImageNet1K-models/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/vovnet.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/vovnet.py | https://arxiv.org/abs/1911.06667 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/docs/archived_changes.md | Gluon_ResNet50_v1c_for_PyTorch/timm/models/swin_transformer.py | https://github.com/microsoft/Swin-Transformer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/mlp_mixer.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/cait.py | https://github.com/facebookresearch/deit | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/drop.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/layers/drop.py | https://arxiv.org/abs/1810.12890 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/sgdp.py | Gluon_ResNet50_v1c_for_PyTorch/timm/optim/adamp.py | https://github.com/clovaai/AdamP | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/cbam.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/layers/cbam.py | https://arxiv.org/abs/1807.06521 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/inception_v3.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/inception_v3.py | http://download.tensorflow.org/models/adv_inception_v3_2017_08_18.tar.gz | 下载链接 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/_efficientnet_blocks.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/mobilenetv3.py | https://arxiv.org/abs/1905.02244 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/tnt.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/tnt.py | https://gitee.com/mindspore/mindspore/tree/master/model_zoo/research/cv/TNT | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/vovnet.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/vovnet.py | https://arxiv.org/abs/1904.09730 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/levit.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/convit.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/vision_transformer.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/vision_transformer_hybrid.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnetv2.py | https://github.com/google-research/vision_transformer | 源码实现 | -| 开发引入 | / | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://arxiv.org/pdf/1812.01187 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/mlp_mixer.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/mlp_mixer.py | https://github.com/facebookresearch/deit | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/bottleneck_attn.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/layers/bottleneck_attn.py | https://arxiv.org/abs/2101.11605 | 
论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/_efficientnet_blocks.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet_blocks.py | https://arxiv.org/abs/2004.14525 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/loss/jsd.py | Gluon_ResNet50_v1c_for_PyTorch/timm/loss/jsd.py | https://github.com/google-research/augmix/blob/master/imagenet.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/efficientnet.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/tensorflow/tpu/tree/master/models/official/mnasnet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/levit.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/cait.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/vision_transformer.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/efficientnet.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/abs/1904.02877 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/hrnet.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/hrnet.py | sunk@mail.ustc.edu.cn | 邮箱地址 | -| 开发引入 | / | Gluon_ResNet50_v1c_for_PyTorch/timm/data/random_erasing.py | https://github.com/zhunzhong07/Random-Erasing | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/std_conv.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/layers/std_conv.py | https://arxiv.org/abs/1903.10520v2 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/nvnovograd.py | Gluon_ResNet50_v1c_for_PyTorch/timm/optim/novograd.py | https://arxiv.org/abs/1905.11286 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/rexnet.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/rexnet.py | https://github.com/clovaai/rexnet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/vision_transformer.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/mlp_mixer.py | https://github.com/Alibaba-MIIL/ImageNet21K | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/res2net.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/dla.py | https://arxiv.org/abs/1904.01169 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/byobnet.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/byobnet.py | https://arxiv.org/abs/2006.14090 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/rexnet.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/rexnet.py | https://arxiv.org/abs/2007.00992 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/data/mixup.py | Gluon_ResNet50_v1c_for_PyTorch/timm/data/mixup.py | https://arxiv.org/abs/1710.09412 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/_efficientnet_blocks.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/abs/2104.00298 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/nasnet.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/nasnet.py | https://github.com/Cadene/pretrained-models.pytorch | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/halo_attn.py | 
Gluon_ResNet50_v1c_for_PyTorch/timm/models/layers/bottleneck_attn.py | https://arxiv.org/abs/1904.09925 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/activations_me.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/layers/activations.py | https://github.com/digantamisra98/H-Mish/blob/0da20d4bc58e696b6803f2523c58d3c8a82782d0/README.md | 相关实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/densenet.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/densenet.py | https://arxiv.org/pdf/1707.06990.pdf | 论文地址 | -| 开发引入 | / | Gluon_ResNet50_v1c_for_PyTorch/timm/models/layers/mlp.py | https://arxiv.org/abs/1612.08083","https://arxiv.org/abs/2002.05202 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/gather_excite.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/layers/gather_excite.py | https://arxiv.org/abs/1810.12348 | 论文地址 | -| 开发引入 | / | Gluon_ResNet50_v1c_for_PyTorch/timm/models/twins.py | https://github.com/Meituan-AutoML/Twins | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/efficientnet.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/abs/1904.04971 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/utils/model.py | Gluon_ResNet50_v1c_for_PyTorch/timm/utils/model.py | https://arxiv.org/abs/2101.08692 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/nadam.py | Gluon_ResNet50_v1c_for_PyTorch/timm/optim/nadam.py | http://cs229.stanford.edu/proj2015/054_report.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/resnet.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://arxiv.org/abs/2103.07579 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/nadam.py | Gluon_ResNet50_v1c_for_PyTorch/timm/optim/nadam.py | http://www.cs.toronto.edu/~fritz/absps/momentum.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/hfdocs/source/models.mdx | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnetv2.py | https://github.com/google-research/big_transfer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/adabelief.py | Gluon_ResNet50_v1c_for_PyTorch/timm/optim/adabelief.py | https://github.com/juntang-zhuang/Adabelief-Optimizer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/vision_transformer.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer.py | https://github.com/Alibaba-MIIL/ImageNet21K | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/res2net.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/res2net.py | https://github.com/gasvn/Res2Net/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/README.md | Gluon_ResNet50_v1c_for_PyTorch/timm/models/inception_resnet_v2.py | https://arxiv.org/abs/1602.07261 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/lookahead.py | Gluon_ResNet50_v1c_for_PyTorch/timm/optim/lookahead.py | https://arxiv.org/abs/1907.08610 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/_efficientnet_blocks.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet_blocks.py | https://arxiv.org/abs/1905.02244 | 论文地址 | -| 开发引入 | / | Gluon_ResNet50_v1c_for_PyTorch/sotabench_setup.sh | 
https://onedrive.hyper.ai/down/ImageNet/data/ImageNet2012/ILSVRC2012_devkit_t12.tar.gz | 下载链接 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/twins.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/twins.py | https://arxiv.org/abs/2102.10882 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/train.py | Gluon_ResNet50_v1c_for_PyTorch/modelarts/train_modelarts.py | https://github.com/NVIDIA/apex/tree/master/examples/imagenet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/xception.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/xception.py | https://github.com/tstandley/Xception-PyTorch | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/eca.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/layers/eca.py | https://arxiv.org/abs/1910.03151 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/activations_me.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/layers/activations.py | https://arxiv.org/abs/1908.08681 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/scheduler/scheduler.py | Gluon_ResNet50_v1c_for_PyTorch/timm/scheduler/scheduler.py | https://github.com/allenai/allennlp/tree/master/allennlp/training/learning_rate_schedulers | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/data/auto_augment.py | Gluon_ResNet50_v1c_for_PyTorch/timm/data/auto_augment.py | https://arxiv.org/abs/1909.13719 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/train.py | Gluon_ResNet50_v1c_for_PyTorch/modelarts/train_modelarts.py | https://github.com/pytorch/examples/tree/master/imagenet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/resnet.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/densenet.py | https://github.com/pytorch/vision | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/data/readers/reader_tfds.py | Gluon_ResNet50_v1c_for_PyTorch/timm/data/parsers/parser_tfds.py | https://github.com/tensorflow/datasets | 数据集地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/senet.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/senet.py | https://github.com/creafz | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/pit.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/pit.py | https://arxiv.org/abs/2103.16302 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/resnet.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://github.com/pytorch/vision | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/vision_transformer_sam.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://arxiv.org/abs/2010.11929 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/activations_me.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/layers/activations_jit.py | https://github.com/digantamisra98/H-Mish/blob/0da20d4bc58e696b6803f2523c58d3c8a82782d0/README.md | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/resnet.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/pdf/2002.08258.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/efficientnet.py | 
Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/abs/1907.09595 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/rmsprop_tf.py | Gluon_ResNet50_v1c_for_PyTorch/timm/optim/rmsprop_tf.py | https://arxiv.org/abs/1711.05101 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/dla.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/dla.py | https://arxiv.org/abs/1707.06484 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/tresnet.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/tresnet.py | https://github.com/mrT23/TResNet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/efficientnet.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/abs/1812.03443 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/data/auto_augment.py | Gluon_ResNet50_v1c_for_PyTorch/timm/data/auto_augment.py | https://arxiv.org/abs/1906.11172 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/coat.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/coat.py | https://arxiv.org/abs/2104.06399 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/mobilenetv3.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/mobilenetv3.py | https://arxiv.org/abs/2006.02049 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/sgdp.py | Gluon_ResNet50_v1c_for_PyTorch/timm/optim/sgdp.py | https://github.com/clovaai/AdamP/blob/master/adamp/sgdp.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/inception_resnet_v2.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/inception_resnet_v2.py | http://download.tensorflow.org/models/ens_adv_inception_resnet_v2_2017_08_18.tar.gz | 下载链接 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/_features.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/features.py | https://github.com/pytorch/vision/blob/d88d8961ae51507d0cb680329d985b1488b1b76b/torchvision/models/_utils.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/utils/agc.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/nfnet.py | https://github.com/deepmind/deepmind-research/tree/master/nfnets | 源码实现 | -| 开发引入 | / | Gluon_ResNet50_v1c_for_PyTorch/timm/models/gluon_xception.py | https://gluon-cv.mxnet.io/_modules/gluoncv/model_zoo/xception.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/lambda_layer.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/layers/lambda_layer.py | https://arxiv.org/abs/2102.08602 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/data/auto_augment.py | Gluon_ResNet50_v1c_for_PyTorch/timm/data/auto_augment.py | https://arxiv.org/abs/1805.09501 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/eca.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/layers/eca.py | https://github.com/VRandme | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/_efficientnet_builder.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet_builder.py | https://github.com/tensorflow/tpu/blob/master/models/official/mnasnet/mnasnet_models.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/adamp.py | 
Gluon_ResNet50_v1c_for_PyTorch/timm/optim/adamp.py | https://github.com/clovaai/AdamP/blob/master/adamp/adamp.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/drop.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/layers/drop.py | https://github.com/tensorflow/tpu/blob/master/models/official/resnet/resnet_model.py#L74 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/hfdocs/source/models.mdx | Gluon_ResNet50_v1c_for_PyTorch/timm/models/byobnet.py | https://github.com/idstcv/GPU-Efficient-Networks | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/train.py | Gluon_ResNet50_v1c_for_PyTorch/train.py | https://github.com/NVIDIA/apex/tree/master/examples/imagenet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/utils/model.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/nfnet.py | https://arxiv.org/abs/2101.08692 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/utils/agc.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/nfnet.py | https://arxiv.org/abs/2102.06171 | 论文地址 | -| 开发引入 | / | Gluon_ResNet50_v1c_for_PyTorch/timm/models/levit.py | https://github.com/facebookresearch/LeViT | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/levit.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/levit.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/vision_transformer.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/non_local_attn.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/layers/non_local_attn.py | https://github.com/facebookresearch/video-nonlocal-net | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/adabelief.py | Gluon_ResNet50_v1c_for_PyTorch/timm/optim/adabelief.py | https://gist.github.com/juntang-zhuang/0a501dd51c02278d952cf159bc233037 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/data/mixup.py | Gluon_ResNet50_v1c_for_PyTorch/timm/data/mixup.py | https://github.com/clovaai/CutMix-PyTorch | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/vision_transformer_sam.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnetv2.py | https://arxiv.org/abs/2010.11929 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/adahessian.py | Gluon_ResNet50_v1c_for_PyTorch/timm/optim/adahessian.py | https://github.com/davda54/ada-hessian/blob/master/ada_hessian.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/vision_transformer_hybrid.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer.py | https://github.com/google-research/vision_transformer | 源码实现 | -| 开发引入 | / | Gluon_ResNet50_v1c_for_PyTorch/timm/models/layers/involution.py | https://github.com/d-li14/involution/blob/main/cls/mmcls/models/utils/involution_naive.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/inplace_abn.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/layers/inplace_abn.py | https://github.com/mapillary/inplace_abn.git@v1.0.12 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/activations_me.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/layers/activations_me.py | https://github.com/digantamisra98/H-Mish/blob/0da20d4bc58e696b6803f2523c58d3c8a82782d0/README.md | 相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models.git/timm/optim/rmsprop_tf.py | Gluon_ResNet50_v1c_for_PyTorch/timm/optim/rmsprop_tf.py | https://github.com/pytorch/pytorch/blob/master/LICENSE | license地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/data/readers/reader_tfds.py | Gluon_ResNet50_v1c_for_PyTorch/timm/data/parsers/parser_tfds.py | https://www.tensorflow.org/datasets/catalog/overview#image_classification | 数据集地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/squeeze_excite.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/layers/squeeze_excite.py | https://arxiv.org/abs/1709.01507 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/drop.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/layers/drop.py | https://arxiv.org/pdf/1810.12890.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/densenet.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/densenet.py | https://arxiv.org/pdf/1608.06993.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/coat.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/coat.py | https://discuss.pytorch.org/t/how-to-keep-the-shape-of-input-and-output-same-when-dilation-conv/14338 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/halo_attn.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/layers/bottleneck_attn.py | https://gist.github.com/aravindsrinivas/56359b79f0ce4449bcb04ab4b56a57a2 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/_efficientnet_blocks.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet_blocks.py | https://arxiv.org/abs/1807.11626 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/_efficientnet_blocks.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/abs/1807.11626 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/data/real_labels.py | Gluon_ResNet50_v1c_for_PyTorch/timm/data/real_labels.py | https://github.com/google-research/reassessed-imagenet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/eca.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/layers/eca.py | https://github.com/pytorch/pytorch/pull/17240 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/scheduler/tanh_lr.py | Gluon_ResNet50_v1c_for_PyTorch/timm/scheduler/tanh_lr.py | https://arxiv.org/abs/1806.01593 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/senet.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/pnasnet.py | https://github.com/creafz | 源码实现 | -| 开发引入 | / | Gluon_ResNet50_v1c_for_PyTorch/timm/models/layers/drop.py | https://github.com/tensorflow/tpu/issues/494#issuecomment-532968956 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/sknet.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/sknet.py | https://arxiv.org/abs/2001.06268 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/_efficientnet_blocks.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet_blocks.py | https://ai.googleblog.com/2019/08/efficientnet-edgetpu-creating.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/tnt.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/tnt.py | 
https://arxiv.org/abs/2103.00112 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/xception.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/xception.py | https://arxiv.org/pdf/1610.02357.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/nadam.py | Gluon_ResNet50_v1c_for_PyTorch/timm/optim/nadam.py | https://github.com/pytorch/pytorch/pull/1408 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/inception_v3.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/inception_v3.py | http://download.tensorflow.org/models/inception_v3_2016_08_28.tar.gz | 下载链接 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/lambda_layer.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/layers/lambda_layer.py | https://github.com/lucidrains/lambda-networks | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/cspnet.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/cspnet.py | https://arxiv.org/abs/1911.11929 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/vision_transformer.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer.py | https://github.com/karpathy/minGPT | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/mlp_mixer.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/mlp_mixer.py | https://arxiv.org/abs/2105.01601 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/rmsprop_tf.py | Gluon_ResNet50_v1c_for_PyTorch/timm/optim/rmsprop_tf.py | http://www.cs.toronto.edu/~tijmen/csc321/slides/lecture_slides_lec6.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/vovnet.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/layers/squeeze_excite.py | https://arxiv.org/abs/1911.06667 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/resnet.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://github.com/tensorflow/tpu/tree/bee9c4f6/models/official/resnet/resnet_rs | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/efficientnet.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet/lite | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/nvnovograd.py | Gluon_ResNet50_v1c_for_PyTorch/timm/optim/nvnovograd.py | https://arxiv.org/abs/1905.11286 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/mlp_mixer.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer.py | https://github.com/facebookresearch/deit | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/cond_conv2d.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/layers/cond_conv2d.py | https://github.com/tensorflow/tpu/blob/master/models/official/efficientnet/condconv/condconv_layers.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/cspnet.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/cspnet.py | https://github.com/WongKinYiu/CrossStagePartialNetworks | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/gather_excite.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/layers/gather_excite.py | https://github.com/hujie-frank/GENet | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models.git/timm/models/senet.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/senet.py | https://github.com/pytorch/vision/blob/master/torchvision/models/resnet.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/efficientnet.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/google/automl/tree/master/efficientnetv2 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/cait.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/cait.py | https://arxiv.org/pdf/2003.02436v1.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/rmsprop_tf.py | Gluon_ResNet50_v1c_for_PyTorch/timm/optim/adamw.py | https://arxiv.org/abs/1711.05101 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/mlp_mixer.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/mlp_mixer.py | https://arxiv.org/abs/2105.08050 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/hardcorenas.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/hardcorenas.py | https://github.com/Alibaba-MIIL/HardCoReNAS | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/resnetv2.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnetv2.py | https://github.com/KaimingHe/resnet-1k-layers/blob/master/resnet-pre-act.lua | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/hardcorenas.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/hardcorenas.py | https://arxiv.org/abs/2102.11646 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/vovnet.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/vovnet.py | https://github.com/youngwanLEE/vovnet-detectron2 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/ghostnet.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/ghostnet.py | https://github.com/huawei-noah/CV-backbones/tree/master/ghostnet_pytorch | 源码实现 | -| 开发引入 | / | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet_blocks.py | https://arxiv.org/abs/1801.04381v4 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/efficientnet.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/pdf/1807.11626.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/vision_transformer.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer.py | https://github.com/google-research/vision_transformer/blob/00883dd691c63a6830751563748663526e811cee/vit_jax/checkpoint.py#L224 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/res2net.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/dla.py | https://github.com/gasvn/Res2Net/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/drop.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/layers/drop.py | https://github.com/clovaai/assembled-cnn/blob/master/nets/blocks.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/selecsls.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/selecsls.py | https://arxiv.org/abs/1907.00837 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/_efficientnet_builder.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet_builder.py | 
https://github.com/facebookresearch/maskrcnn-benchmark/blob/master/maskrcnn_benchmark/modeling/backbone/fbnet_builder.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/scheduler/scheduler.py | Gluon_ResNet50_v1c_for_PyTorch/timm/scheduler/scheduler.py | https://github.com/pytorch/fairseq/tree/master/fairseq/optim/lr_scheduler | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/efficientnet.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/DeGirum/pruned-models/releases/tag/efficientnet_v1.0 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/mlp_mixer.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/mlp_mixer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resmlp_24_224_raa-a8256759.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/data/auto_augment.py | Gluon_ResNet50_v1c_for_PyTorch/timm/data/auto_augment.py | https://github.com/google-research/augmix | 源码实现 | -| 开发引入 | / | Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://arxiv.org/abs/2106.TODO | 论文地址 | -| 开发引入 | / | Gluon_ResNet50_v1c_for_PyTorch/timm/models/inception_v4.py | https://github.com/Cadene/tensorflow-model-zoo.torch | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/sgdp.py | Gluon_ResNet50_v1c_for_PyTorch/timm/optim/sgdp.py | https://arxiv.org/abs/2006.08217 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/radam.py | Gluon_ResNet50_v1c_for_PyTorch/timm/optim/radam.py | https://arxiv.org/abs/1908.03265 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/vision_transformer_hybrid.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/layers/patch_embed.py | https://github.com/google-research/vision_transformer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/adabelief.py | Gluon_ResNet50_v1c_for_PyTorch/timm/optim/adabelief.py | https://gist.github.com/juntang-zhuang/517ce3c27022b908bb93f78e4f786dc3 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/adafactor.py | Gluon_ResNet50_v1c_for_PyTorch/timm/optim/adafactor.py | https://arxiv.org/abs/1804.04235 | 论文地址 | -| 开发引入 | / | Gluon_ResNet50_v1c_for_PyTorch/timm/models/gluon_xception.py | https://github.com/jfzhang95/pytorch-deeplab-xception | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/loss/jsd.py | Gluon_ResNet50_v1c_for_PyTorch/timm/data/auto_augment.py | https://arxiv.org/abs/1912.02781 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/senet.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/senet.py | https://github.com/Cadene/pretrained-models.pytorch/blob/master/pretrainedmodels/models/senet.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/README.md | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://arxiv.org/abs/1805.00932 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/nadamw.py | Gluon_ResNet50_v1c_for_PyTorch/timm/optim/adamw.py | https://openreview.net/forum?id=ryQu7f-RZ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/resnetv2.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnetv2.py | https://arxiv.org/abs/1912.11370 | 论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models.git/timm/models/vision_transformer_sam.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer.py | https://arxiv.org/abs/2010.11929 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/resnest.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnest.py | https://arxiv.org/abs/2004.08955 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/halo_attn.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/layers/halo_attn.py | https://gist.github.com/aravindsrinivas/56359b79f0ce4449bcb04ab4b56a57a2 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/_efficientnet_blocks.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/abs/1905.11946 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/docs/archived_changes.md | Gluon_ResNet50_v1c_for_PyTorch/timm/models/pit.py | https://github.com/naver-ai/pit | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/efficientnet.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/tensorflow/tpu/tree/master/models/official/mnasnet/mixnet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/utils/model.py | Gluon_ResNet50_v1c_for_PyTorch/timm/utils/model.py | https://gist.github.com/amaarora/6e56942fcb46e67ba203f3009b30d950 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/data/auto_augment.py | Gluon_ResNet50_v1c_for_PyTorch/timm/data/auto_augment.py | https://github.com/tensorflow/tpu/blob/master/models/official/efficientnet/autoaugment.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/utils/agc.py | Gluon_ResNet50_v1c_for_PyTorch/timm/utils/agc.py | https://gist.github.com/lucidrains/0d6560077edac419ab5d3aa29e674d5c | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/nvnovograd.py | Gluon_ResNet50_v1c_for_PyTorch/timm/optim/nvnovograd.py | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechRecognition/Jasper | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/docs/archived_changes.md | Gluon_ResNet50_v1c_for_PyTorch/timm/models/layers/swin_attn.py | https://github.com/microsoft/Swin-Transformer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/std_conv.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/layers/std_conv.py | https://github.com/joe-siyuan-qiao/WeightStandardization | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/xception_aligned.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/xception_aligned.py | https://github.com/tensorflow/models/blob/master/research/deeplab/g3doc/model_zoo.md | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/visformer.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/visformer.py | https://github.com/danczs/Visformer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/pnasnet.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/pnasnet.py | https://github.com/Cadene/pretrained-models.pytorch/blob/master/pretrainedmodels/models/pnasnet.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/utils/model_ema.py | Gluon_ResNet50_v1c_for_PyTorch/timm/utils/model_ema.py | 
https://www.tensorflow.org/api_docs/python/tf/train/ExponentialMovingAverage | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/resnet.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://github.com/facebookresearch/WSL-Images | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/data/readers/reader_tfds.py | Gluon_ResNet50_v1c_for_PyTorch/timm/data/parsers/parser_tfds.py | https://github.com/pytorch/pytorch/issues/33413 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/efficientnet.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/tensorflow/models/blob/master/research/slim/nets/mobilenet/mobilenet_v2.py | 源码实现 | -| 开发引入 | / | Gluon_ResNet50_v1c_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/dmlc/gluon-cv/blob/master/gluoncv/model_zoo/resnet.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/resnest.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/layers/split_attn.py | https://arxiv.org/abs/2004.08955 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/activations_me.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/layers/activations_me.py | https://arxiv.org/abs/1908.08681 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/cond_conv2d.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/layers/cond_conv2d.py | https://github.com/pytorch/pytorch/issues/17983 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/activations_jit.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/layers/activations_jit.py | https://arxiv.org/abs/1710.05941 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/vision_transformer.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/mobilenetv3.py | https://github.com/Alibaba-MIIL/ImageNet21K | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/sgdp.py | Gluon_ResNet50_v1c_for_PyTorch/timm/optim/sgdp.py | https://github.com/clovaai/AdamP | 源码实现 | -| 开发引入 | / | Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer.py | https://arxiv.org/abs/2106.TODO | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/halo_attn.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/layers/halo_attn.py | https://arxiv.org/abs/1904.09925 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/vovnet.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/vovnet.py | https://github.com/stigma0617/VoVNet.pytorch/blob/master/models_vovnet/vovnet.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/convit.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/convit.py | https://arxiv.org/abs/2103.10697 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/global_context.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/layers/global_context.py | https://arxiv.org/abs/1904.11492 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/vgg.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/vgg.py | https://arxiv.org/pdf/1409.1556.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/non_local_attn.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/layers/non_local_attn.py | 
https://openaccess.thecvf.com/content_CVPR_2020/html/Chi_Non-Local_Neural_Networks_With_Grouped_Bilinear_Attentional_Transforms_CVPR_2020_paper.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/non_local_attn.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/layers/non_local_attn.py | https://github.com/BA-Transform/BAT-Image-Classification | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/coat.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/coat.py | https://github.com/mlpc-ucsd/CoaT | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/README.md | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://arxiv.org/abs/1905.00546 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/efficientnet.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet/condconv | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/twins.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/twins.py | https://arxiv.org/pdf/2104.13840.pdf | 论文地址 | -| 开发引入 | / | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://pytorch.org/hub/facebookresearch_WSL-Images_resnext/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/efficientnet.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet/edgetpu | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/mixed_conv2d.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/layers/mixed_conv2d.py | https://github.com/tensorflow/tpu/blob/master/models/official/mnasnet/mixnet/custom_layers.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/rmsprop_tf.py | Gluon_ResNet50_v1c_for_PyTorch/timm/optim/rmsprop_tf.py | https://github.com/pytorch/pytorch/blob/063946d2b3f3f1e953a2a3b54e0b34f1393de295/torch/optim/rmsprop.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/twins.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/twins.py | https://github.com/whai362/PVT.git | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/README.md | Gluon_ResNet50_v1c_for_PyTorch/timm/models/dpn.py | https://github.com/cypw/DPNs | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/loss/jsd.py | Gluon_ResNet50_v1c_for_PyTorch/timm/loss/jsd.py | https://arxiv.org/abs/1912.02781 | 论文地址 | -| 开发引入 | / | Gluon_ResNet50_v1c_for_PyTorch/timm/models/inception_resnet_v2.py | https://github.com/Cadene/tensorflow-model-zoo.torch | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/resnet.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/pytorch/vision | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/utils/model.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/layers/std_conv.py | https://arxiv.org/abs/2101.08692 | 论文地址 | -| 开发引入 | / | Gluon_ResNet50_v1c_for_PyTorch/timm/models/convit.py | https://github.com/facebookresearch/convit | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/lookahead.py | Gluon_ResNet50_v1c_for_PyTorch/timm/optim/lookahead.py | https://github.com/alphadl/lookahead.pytorch | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models.git/timm/models/resnet.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://arxiv.org/pdf/2002.08258.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/xcit.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/cait.py | https://arxiv.org/abs/2103.17239 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/adafactor.py | Gluon_ResNet50_v1c_for_PyTorch/timm/optim/adafactor.py | https://github.com/pytorch/fairseq/blob/master/fairseq/optim/adafactor.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/utils/agc.py | Gluon_ResNet50_v1c_for_PyTorch/timm/utils/agc.py | https://arxiv.org/abs/2102.06171 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/eca.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/layers/eca.py | https://arxiv.org/pdf/1910.03151.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/ghostnet.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/ghostnet.py | https://arxiv.org/abs/1911.11907 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/inception_resnet_v2.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/inception_resnet_v2.py | http://download.tensorflow.org/models/inception_resnet_v2_2016_08_30.tar.gz | 下载链接 | -| 开发引入 | / | Gluon_ResNet50_v1c_for_PyTorch/timm/models/layers/swin_attn.py | https://arxiv.org/pdf/2103.14030.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/radam.py | Gluon_ResNet50_v1c_for_PyTorch/timm/optim/radam.py | https://github.com/LiyuanLucasLiu/RAdam | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/_efficientnet_builder.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://github.com/tensorflow/tpu/blob/master/models/official/efficientnet/efficientnet_model.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/train.py | Gluon_ResNet50_v1c_for_PyTorch/train.py | https://github.com/pytorch/examples/tree/master/imagenet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/vision_transformer_hybrid.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://github.com/google-research/vision_transformer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/rmsprop_tf.py | Gluon_ResNet50_v1c_for_PyTorch/timm/optim/rmsprop_tf.py | https://arxiv.org/pdf/1308.0850v5.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/inception_v3.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/inception_v3.py | https://github.com/pytorch/vision/blob/master/LICENSE | license地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/scheduler/cosine_lr.py | Gluon_ResNet50_v1c_for_PyTorch/timm/scheduler/cosine_lr.py | https://arxiv.org/abs/1608.03983 | 论文地址 | -| 开发引入 | / | Gluon_ResNet50_v1c_for_PyTorch/timm/models/inception_resnet_v2.py | https://github.com/tensorflow/models/tree/master/research/adv_imagenet_models | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/loss/jsd.py | Gluon_ResNet50_v1c_for_PyTorch/timm/data/auto_augment.py | https://github.com/google-research/augmix/blob/master/imagenet.py | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models.git/timm/models/res2net.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/res2net.py | https://arxiv.org/abs/1904.01169 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/dla.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/dla.py | https://github.com/gasvn/Res2Net/blob/master/dla.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/swin_transformer.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/swin_transformer.py | https://arxiv.org/pdf/2103.14030 | 论文地址 | -| 开发引入 | / | Gluon_ResNet50_v1c_for_PyTorch/timm/models/sknet.py | https://github.com/clovaai/assembled-cnn | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/sgdp.py | Gluon_ResNet50_v1c_for_PyTorch/timm/optim/adamp.py | https://arxiv.org/abs/2006.08217 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/hrnet.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/hrnet.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/data/random_erasing.py | Gluon_ResNet50_v1c_for_PyTorch/timm/data/random_erasing.py | https://github.com/pytorch/pytorch/issues/19508 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/vision_transformer.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer.py | https://github.com/lucidrains/vit-pytorch | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/resnest.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnest.py | https://github.com/zhanghang1989/ResNeSt/blob/master/ablation.md | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/weight_init.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/layers/weight_init.py | https://people.sc.fsu.edu/~jburkardt/presentations/truncated_normal.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/utils/agc.py | Gluon_ResNet50_v1c_for_PyTorch/timm/utils/agc.py | https://github.com/deepmind/deepmind-research/tree/master/nfnets | 源码实现 | -| 开发引入 | / | Gluon_ResNet50_v1c_for_PyTorch/timm/models/inception_resnet_v2.py | https://arxiv.org/abs/1705.07204 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/hrnet.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/hrnet.py | https://github.com/HRNet/HRNet-Image-Classification | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/byobnet.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/byobnet.py | https://github.com/DingXiaoH/RepVGG | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/regnet.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/regnet.py | https://github.com/facebookresearch/pycls/blob/master/pycls/models/regnet.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/_efficientnet_blocks.py | Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet_blocks.py | https://arxiv.org/abs/2104.00298 | 论文地址 | +| 文件位置 | 公网地址 | 公网地址用途 | +|-----------------------------------------------------------------------------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/mkdocs.yml 
| https://cdnjs.cloudflare.com/ajax/libs/tablesort/5.2.1/tablesort.min.js | 开源引用声明 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/mkdocs.yml | https://cdnjs.cloudflare.com/ajax/libs/mathjax/2.7.0/MathJax.js?config=TeX-MML-AM_CHTML | 开源引用声明 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/setup.py | hello@rwightman.com | 作者邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/M36_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/M48_448.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/S24_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/S24_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/S36_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XS24_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XXS24_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XXS24_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XXS36_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XXS36_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/convit.py | https://dl.fbaipublicfiles.com/convit/convit_base.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/convit.py | https://dl.fbaipublicfiles.com/convit/convit_small.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/convit.py | https://dl.fbaipublicfiles.com/convit/convit_tiny.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/densenet.py | https://download.pytorch.org/models/densenet121-a639ec97.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/densenet.py | https://download.pytorch.org/models/densenet201-c1103571.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/densenet.py | https://download.pytorch.org/models/densenet169-b2777c0a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/densenet.py | https://download.pytorch.org/models/densenet161-8d451a50.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60x_c-b870c45c.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60x-d15cacda.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60-24839fc4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla46x_c-d761bae7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla46_c-2bfd52c3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla34-ba72cf86.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla169-0914e092.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102x2-262837b6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102x-ad62be81.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102-d94d9790.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb1_pruned_9ebb3fe6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb2_pruned_203f55bc.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb3_pruned_5abcc29f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_F_Green_60ms_78.1_2855edf1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_E_Green_55ms_77.9_90f20e8a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_D_Green_50ms_77.4_23e3cdde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_C_Green_44ms_77.1_d4148c9e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_B_Green_40ms_76.5_1f882d1e.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_A_Green_38ms_75.9_23474aeb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/inception_v3.py | https://download.pytorch.org/models/inception_v3_google-1a9a5a14.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-384-9bdaf2e2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-128-b88c2750.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-128S-96703c44.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-192-92712e41.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-256-13b5763e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_12_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_12_no_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_24_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_24_no_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_36_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_36_no_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlpB_24_22k.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlpB_24_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlpB_24_no_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/mlp_mixer.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/mixer_b16_224_miil.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/mlp_mixer.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/mixer_b16_224_miil_in21k.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/mobilenetv3.py | 
https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/mobilenetv3_large_100_1k_miil_78_0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/mobilenetv3.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/mobilenetv3_large_100_in21k_miil.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/regnet.py | https://dl.fbaipublicfiles.com/deit/regnety_160-a5fe301d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x8-b4712904.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet18-d92f0530.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet50-08389792.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x16-15fffa57.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x4-dc43570a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x8-2cfe2f8b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | 
https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext50_32x4-ddb3e555.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet18-118f1556.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet50-16a12f1b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x16-f3559a9c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x4-3f87e46b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext50_32x4-72679e44.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet101D_281c5844.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet50D_833caf58.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNetLight_4f34b35b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45610/outputs/ECAResNet101D_P_75a3370e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45899/outputs/ECAResNet50D_P_9c67f710.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x8-c38310e5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x48-3e41cc8a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x32-e4b90b00.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x16-c6f796b0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x1.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnetv2.py | 
https://storage.googleapis.com/bit_models/BiT-M-R101x1-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x3.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x3-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x2.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x2-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x4.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x4-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x1.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x1-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x3.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x3-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/distill/R152x2_T_224.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/distill/R152x2_T_384.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/distill/R50x1_224.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/tresnet.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/tresnet_m_1k_miil_83_1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/tresnet.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/tresnet_m_miil_in21k.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg19-dcbb9e9d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg19_bn-c79401a0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg16-397923af.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg16_bn-6c64b313.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg13-c768596a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg13_bn-abd245e5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg11-bbd30ac9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg11_bn-6002323d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/B_32-i21k-300ep-lr_0.001-aug_medium1-wd_0.03-do_0.0-sd_0.0.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_224-df68dfff.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_384-d0272ac0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_224-b5f2ef4d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_384-8de9b5d1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_small_distilled_patch16_224-649709d9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_small_patch16_224-cd65a155.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_tiny_distilled_patch16_224-b40b3cf7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_tiny_patch16_224-a1311bcf.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/vit_base_patch16_224_in21k_miil.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/B_16-i21k-300ep-lr_0.001-aug_medium1-wd_0.1-do_0.0-sd_0.0.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/L_16-i21k-300ep-lr_0.001-aug_medium1-wd_0.1-do_0.1-sd_0.1.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/S_16-i21k-300ep-lr_0.001-aug_light1-wd_0.03-do_0.0-sd_0.0.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/S_32-i21k-300ep-lr_0.001-aug_light1-wd_0.03-do_0.0-sd_0.0.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/Ti_16-i21k-300ep-lr_0.001-aug_none-wd_0.03-do_0.0-sd_0.0.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/imagenet21k/ViT-H_14.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/R_Ti_16-i21k-300ep-lr_0.001-aug_none-wd_0.03-do_0.0-sd_0.0.npz | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/R26_S_32-i21k-300ep-lr_0.001-aug_medium2-wd_0.03-do_0.0-sd_0.0.npz | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1c_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/R50_L_32-i21k-300ep-lr_0.001-aug_medium2-wd_0.1-do_0.0-sd_0.0.npz | 权重地址 |
\ No newline at end of file
diff --git a/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/public_address_statement.md b/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/public_address_statement.md
index 0e1f7c4b200f054d81d80e8bf1802b6603293060..8e95038575ace0664655c9beba4b7a9e85fecd7b 100644
--- a/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/public_address_statement.md
+++ b/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/public_address_statement.md
@@ -1,772 +1,153 @@
-| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 |
-|--------|--------------------------------------------------------------------------------------------------------|-------------------------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------------------|--------|
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/mkdocs.yml | Gluon_ResNet50_v1d_for_PyTorch/mkdocs.yml | https://github.com/rwightman/pytorch-image-models | 开源代码仓 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/mkdocs.yml | Gluon_ResNet50_v1d_for_PyTorch/mkdocs.yml | https://cdnjs.cloudflare.com/ajax/libs/mathjax/2.7.0/MathJax.js?config=TeX-MML-AM_CHTML | 开源引用声明 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/mkdocs.yml | Gluon_ResNet50_v1d_for_PyTorch/mkdocs.yml | https://cdnjs.cloudflare.com/ajax/libs/tablesort/5.2.1/tablesort.min.js | 开源引用声明 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/model-index.yml | Gluon_ResNet50_v1d_for_PyTorch/model-index.yml | https://rwightman.github.io/pytorch-image-models/ | 开源代码仓 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/model-index.yml | 
Gluon_ResNet50_v1d_for_PyTorch/model-index.yml | https://github.com/rwightman/pytorch-image-models | 开源代码仓 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/model-index.yml | Gluon_ResNet50_v1d_for_PyTorch/model-index.yml | https://rwightman.github.io/pytorch-image-models/ | 开源代码仓 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/setup.py | Gluon_ResNet50_v1d_for_PyTorch/setup.py | https://rwightman.github.io/pytorch-image-models/ | 开源代码仓 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/setup.py | Gluon_ResNet50_v1d_for_PyTorch/setup.py | hello@rwightman.com | 作者邮箱 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/sotabench.py | Gluon_ResNet50_v1d_for_PyTorch/sotabench.py | https://github.com/mehtadushy/SelecSLS-Pytorch | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/sotabench.py | Gluon_ResNet50_v1d_for_PyTorch/sotabench.py | https://github.com/mehtadushy/SelecSLS-Pytorch | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/sotabench.py | Gluon_ResNet50_v1d_for_PyTorch/sotabench.py | https://github.com/mehtadushy/SelecSLS-Pytorch | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/sotabench.py | Gluon_ResNet50_v1d_for_PyTorch/sotabench.py | https://github.com/zhanghang1989/ResNeSt | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/sotabench.py | Gluon_ResNet50_v1d_for_PyTorch/sotabench.py | https://github.com/zhanghang1989/ResNeSt | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/sotabench.py | Gluon_ResNet50_v1d_for_PyTorch/sotabench.py | https://github.com/zhanghang1989/ResNeSt | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/sotabench.py | Gluon_ResNet50_v1d_for_PyTorch/sotabench.py | https://github.com/zhanghang1989/ResNeSt | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/sotabench.py | Gluon_ResNet50_v1d_for_PyTorch/sotabench.py | https://github.com/zhanghang1989/ResNeSt | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/sotabench.py | Gluon_ResNet50_v1d_for_PyTorch//sotabench.py | https://github.com/zhanghang1989/ResNeSt | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/sotabench_setup.sh | Gluon_ResNet50_v1d_for_PyTorch//sotabench_setup.sh | https://github.com/mrT23/pillow-simd/zipball/simd/7.0.x | 下载三方库 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/sotabench_setup.sh | Gluon_ResNet50_v1d_for_PyTorch//sotabench_setup.sh | https://onedrive.hyper.ai/down/ImageNet/data/ImageNet2012/ILSVRC2012_img_val.tar | 下载数据集 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/byobnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-ger-weights/gernet_s-756b4751.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/byobnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-ger-weights/gernet_m-0873c53a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/byobnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/byobnet.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-ger-weights/gernet_l-f31e2e8d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/byobnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_a2-c1ee6d2b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/byobnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b0-80ac3f1b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/byobnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b1-77ca2989.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/byobnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b1g4-abde5d92.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/byobnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b2-25b7494e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/byobnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b2g4-165a85f2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/byobnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b3-199bc50d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/byobnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b3g4-73c370bf.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/byobnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet51q_ra2-d47dcc76.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/cait.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XXS24_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/cait.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XXS24_384.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/cait.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XXS36_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/cait.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XXS36_384.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/cait.py | 
Gluon_ResNet50_v1d_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XS24_384.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/cait.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/S24_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/cait.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/S24_384.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/cait.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/S36_384.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/cait.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/M36_384.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/cait.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/M48_448.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/coat.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/coat.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-coat-weights/coat_tiny-473c2a20.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/coat.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/coat.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-coat-weights/coat_mini-2c6baf49.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/coat.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/coat.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-coat-weights/coat_lite_tiny-461b07a7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/coat.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/coat.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-coat-weights/coat_lite_mini-d7842000.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/coat.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/coat.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-coat-weights/coat_lite_small-fea1d5a1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/convit.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/convit.py | https://dl.fbaipublicfiles.com/convit/convit_tiny.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/convit.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/convit.py | https://dl.fbaipublicfiles.com/convit/convit_small.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/convit.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/convit.py | https://dl.fbaipublicfiles.com/convit/convit_base.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/cspnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/cspnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/cspresnet50_ra-d3e8d487.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/cspnet.py | 
Gluon_ResNet50_v1d_for_PyTorch/timm/models/cspnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/cspresnext50_ra_224-648b4713.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/cspnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/cspnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/cspdarknet53_ra_256-d05c7c21.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/densenet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/densenet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/densenet121_ra-50efcf5c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/densenet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/densenet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/densenetblur121d_ra-100dcfbc.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/densenet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/densenet.py | https://download.pytorch.org/models/densenet169-b2777c0a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/densenet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/densenet.py | https://download.pytorch.org/models/densenet201-c1103571.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/densenet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/densenet.py | https://download.pytorch.org/models/densenet161-8d451a50.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/densenet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/densenet.py | https://download.pytorch.org/models/densenet121-a639ec97.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/dla.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla34-ba72cf86.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/dla.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla46_c-2bfd52c3.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/dla.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla46x_c-d761bae7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/dla.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60x_c-b870c45c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/dla.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60-24839fc4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/dla.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60x-d15cacda.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/dla.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102-d94d9790.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/dla.py | 
Gluon_ResNet50_v1d_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102x-ad62be81.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/dla.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102x2-262837b6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/dla.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla169-0914e092.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/dla.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/dla.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net_dla60_4s-d88db7f9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/dla.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/dla.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2next_dla60_4s-d327927b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/dpn.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/dpn.py | https://github.com/rwightman/pytorch-dpn-pretrained/releases/download/v0.1/dpn68-66bebafa7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/dpn.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/dpn.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/dpn68b_ra-a31ca160.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/dpn.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/dpn.py | https://github.com/rwightman/pytorch-dpn-pretrained/releases/download/v0.1/dpn92_extra-b040e4a9b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/dpn.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/dpn.py | https://github.com/rwightman/pytorch-dpn-pretrained/releases/download/v0.1/dpn98-5b90dec4d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/dpn.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/dpn.py | https://github.com/rwightman/pytorch-dpn-pretrained/releases/download/v0.1/dpn131-71dfe43e0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/dpn.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/dpn.py | https://github.com/rwightman/pytorch-dpn-pretrained/releases/download/v0.1/dpn107_extra-1ac7121e2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mnasnet_b1-74cb7081.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mnasnet_a1-d9418771.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv2_100_ra-b33bc2c4.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv2_110d_ra-77090ade.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv2_120d_ra-5987e2ed.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv2_140_ra-21a4e913.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/fbnetc_100-c345b898.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/spnasnet_100-048bc3f4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b0_ra-3dd342df.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b1-533bc792.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b2_ra-bcdf34b7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b3_ra2-cf984f9c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b4_ra2_320-7eb33cd5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_es_ra-f111e99c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_em_ra2-66250f76.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | 
Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/DeGirum/pruned-models/releases/download/efficientnet_v1.0/efficientnet_el.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/DeGirum/pruned-models/releases/download/efficientnet_v1.0/efficientnet_es_pruned75.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/DeGirum/pruned-models/releases/download/efficientnet_v1.0/efficientnet_el_pruned70.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_lite0_ra-37913777.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb1_pruned_9ebb3fe6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb2_pruned_203f55bc.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb3_pruned_5abcc29f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_v2s_ra2_288-a6477665.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnetv2_rw_m_agc-3d90cb1e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b0_aa-827b6e33.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b1_aa-ea7a6ee0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b2_aa-60c94f97.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b3_aa-84b4657e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b4_aa-818f208c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b5_ra-9a3e5369.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b6_aa-80ba17e4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b7_ra-6c08e654.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b8_ra-572d5dd9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b0_ap-f262efe1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b1_ap-44ef0a3d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b2_ap-2f8e7636.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b3_ap-aad25bdd.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b4_ap-dedb23e6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b5_ap-9e82fae8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b6_ap-4ffb161f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b7_ap-ddb28fec.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b8_ap-00e169fa.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b0_ns-c0e6a31c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b1_ns-99dd0c41.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b2_ns-00306e48.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b3_ns-9d44bf68.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b4_ns-d6313a46.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b5_ns-6f26d0cf.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b6_ns-51548356.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b7_ns-1dbc32de.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_l2_ns_475-bebbd00a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_l2_ns-df73bb44.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_es-ca1afbfe.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_em-e78cfe58.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_el-5143854e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_cc_b0_4e-4362b6b2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_cc_b0_8e-66184a25.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_cc_b1_8e-f7c79ae1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite0-0aa007d2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite1-bde8b488.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite2-dcccb7df.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite3-b733e338.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite4-741542c3.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_s-eb54923e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_m-cc09e0cd.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_l-d664b728.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_s_21ft1k-d7dafa41.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_m_21ft1k-bf41664a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_l_21ft1k-60127a9d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_s_21k-6337ad01.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_m_21k-361418a2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_l_21k-91a19ec9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_b0-c7cc451f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_b1-be6e41b0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_b2-847de54e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | 
Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_b3-57773f13.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mixnet_s-a907afbc.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mixnet_m-4647fc68.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mixnet_l-5a9a2ed8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mixnet_xl_ra-aac3c00c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mixnet_s-89d3354b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mixnet_m-0f4d8805.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mixnet_l-6c92e0c8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/ghostnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/ghostnet.py | https://github.com/huawei-noah/CV-backbones/releases/download/ghostnet_pth/ghostnet_1x.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/gluon_resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet18_v1b-0757602b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/gluon_resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet34_v1b-c6d82d59.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/gluon_resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1b-0ebe02e2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/gluon_resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/gluon_resnet.py | 
https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1b-3b017079.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/gluon_resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1b-c1edb0dd.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/gluon_resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1c-48092f55.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/gluon_resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1c-1f26822a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/gluon_resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1c-a3bb0b98.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/gluon_resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1d-818a1b1b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/gluon_resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1d-0f9c8644.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/gluon_resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1d-bd354e12.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/gluon_resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1s-1762acc0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/gluon_resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1s-60fe0cc1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/gluon_resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1s-dcc41b81.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/gluon_resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnext50_32x4d-e6a097c1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/gluon_resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/gluon_resnet.py | 
https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnext101_32x4d-b253c8c4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/gluon_resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnext101_64x4d-f9a8e184.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/gluon_resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_seresnext50_32x4d-90cf2d6e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/gluon_resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_seresnext101_32x4d-cf52900d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/gluon_resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_seresnext101_64x4d-f9926f93.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/gluon_resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_senet154-70a1a3c0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/gluon_xception.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/gluon_xception.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gluon_xception-7015a15c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/hardcorenas.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_A_Green_38ms_75.9_23474aeb.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/hardcorenas.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_B_Green_40ms_76.5_1f882d1e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/hardcorenas.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_C_Green_44ms_77.1_d4148c9e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/hardcorenas.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_D_Green_50ms_77.4_23e3cdde.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/hardcorenas.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_E_Green_55ms_77.9_90f20e8a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/hardcorenas.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/hardcorenas.py | 
https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_F_Green_60ms_78.1_2855edf1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/hrnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnet_w18_small_v1-f460c6bc.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/hrnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnet_w18_small_v2-4c50a8cb.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/hrnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w18-8cb57bb9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/hrnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w30-8d7f8dab.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/hrnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w32-90d8c5fb.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/hrnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w40-7cd397a4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/hrnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w44-c9ac8c18.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/hrnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w48-abd2e6ab.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/hrnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w64-b47cc881.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/inception_resnet_v2.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/inception_resnet_v2.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/inception_resnet_v2-940b1cd6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/inception_resnet_v2.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/inception_resnet_v2.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ens_adv_inception_resnet_v2-2592a550.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/inception_v3.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/inception_v3.py | https://download.pytorch.org/models/inception_v3_google-1a9a5a14.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/inception_v3.py | 
Gluon_ResNet50_v1d_for_PyTorch/timm/models/inception_v3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_inception_v3-e0069de4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/inception_v3.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/inception_v3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/adv_inception_v3-9e27bd63.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/inception_v3.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/inception_v3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gluon_inception_v3-9f746940.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/inception_v4.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/inception_v4.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/inceptionv4-8e4777a0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/levit.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-128S-96703c44.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/levit.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-128-b88c2750.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/levit.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-192-92712e41.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/levit.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-256-13b5763e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/levit.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-384-9bdaf2e2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/mlp_mixer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/mlp_mixer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_mixer_b16_224-76587d61.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/mlp_mixer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/mlp_mixer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_mixer_b16_224_in21k-617b3de2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/mlp_mixer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/mlp_mixer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_mixer_l16_224-92f9adc4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/mlp_mixer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/mlp_mixer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_mixer_l16_224_in21k-846aa33c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/mlp_mixer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/mlp_mixer.py | 
https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/mixer_b16_224_miil_in21k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/mlp_mixer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/mlp_mixer.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/mixer_b16_224_miil.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/mlp_mixer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/mlp_mixer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gmixer_24_224_raa-7daf7ae6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/mlp_mixer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_12_no_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/mlp_mixer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_24_no_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/mlp_mixer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_36_no_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/mlp_mixer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlpB_24_no_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/mlp_mixer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_12_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/mlp_mixer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_24_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/mlp_mixer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_36_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/mlp_mixer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlpB_24_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/mlp_mixer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlpB_24_22k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/mobilenetv3.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv3_large_100_ra-f55367f5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/mobilenetv3.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/mobilenetv3.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/mobilenetv3_large_100_1k_miil_78_0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/mobilenetv3.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/mobilenetv3.py | 
https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/mobilenetv3_large_100_in21k_miil.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/mobilenetv3.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv3_100-35495452.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/mobilenetv3.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_large_075-150ee8b0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/mobilenetv3.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_large_100-427764d5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/mobilenetv3.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_large_minimal_100-8596ae28.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/mobilenetv3.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_small_075-da427f52.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/mobilenetv3.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_small_100-37f49e2b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/mobilenetv3.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_small_minimal_100-922a7843.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/nasnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/nasnet.py | http://data.lip6.fr/cadene/pretrainedmodels/nasnetalarge-a1897284.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/nfnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f0-604f9c3a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/nfnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f1-fc540f82.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/nfnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f2-89875923.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/nfnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f3-d74ab3aa.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/nfnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f4-0ac5b10b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/nfnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f5-ecb20ab1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/nfnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f6-e0f12116.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/nfnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/nfnet_l0_ra2-45c6688d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/nfnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecanfnet_l0_ra2-e3e9ac50.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/nfnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecanfnet_l1_ra2-7dce93cd.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/nfnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecanfnet_l2_ra3-da781a61.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/nfnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/nf_regnet_b1_256_ra2-ad85cfef.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/nfnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/nf_resnet50_ra2-9f236009.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/pit.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/pit.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-pit-weights/pit_ti_730.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/pit.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/pit.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-pit-weights/pit_xs_781.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/pit.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/pit.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-pit-weights/pit_s_809.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/pit.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/pit.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-pit-weights/pit_b_820.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/pit.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/pit.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-pit-weights/pit_ti_distill_746.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/pit.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/pit.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-pit-weights/pit_xs_distill_791.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/pit.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/pit.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-pit-weights/pit_s_distill_819.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/pit.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/pit.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-pit-weights/pit_b_distill_840.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/pnasnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/pnasnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/pnasnet5large-bf079911.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/regnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_002-e7e85e5c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/regnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_004-7d0e9424.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/regnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_006-85ec1baa.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/regnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_008-d8b470eb.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/regnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_016-65ca972a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/regnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_032-ed0c7f7e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/regnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_040-73c2a654.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/regnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_064-29278baa.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/regnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_080-7c7fcab1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/regnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_120-65d5521e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/regnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_160-c98c4112.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/regnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_320-8ea38b93.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/regnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_002-e68ca334.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/regnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_004-0db870e6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/regnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_006-c67e57ec.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/regnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_008-dc900dbe.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/regnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_016-54367f74.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/regnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/regnety_032_ra-7f2439f9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/regnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_040-f0d569f9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/regnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_064-0a48325c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/regnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_080-e7f3eb93.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/regnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_120-721ba79a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/regnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/regnet.py | https://dl.fbaipublicfiles.com/deit/regnety_160-a5fe301d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/regnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_320-ba464b29.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/res2net.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net50_26w_4s-06e79181.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/res2net.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net50_48w_2s-afed724a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/res2net.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net50_14w_8s-6527dddc.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/res2net.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net50_26w_6s-19041792.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/res2net.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net50_26w_8s-2c7c9f12.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/res2net.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net101_26w_4s-02a759a1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/res2net.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2next50_4s-6ef7e7bf.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnest.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gluon_resnest14-9c8fe254.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnest.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gluon_resnest26-50eb607c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnest.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest50-528c19ca.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnest.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest101-22405ba7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnest.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest200-75117900.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnest.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest269-0cc87c48.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnest.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest50_fast_4s2x40d-41d14ed0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnest.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest50_fast_1s4x24d-d4a4f76f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet18d_ra2-48a79e06.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet34-43635321.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet34d_ra2-f8dcfcaf.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet26-9aa10e23.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet26d-69e92c46.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet50_ram-a26f946b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet50d_ra2-464e36ba.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet101d_ra2-2803ffab.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet152d_ra2-5cac0439.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet200d_ra2-bdba9bf9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/wide_resnet50_racm-8234f177.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnext50_32x4d_ra-d733960d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnext50d_32x4d-103e99f8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x8-c38310e5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnet.py | 
Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x16-c6f796b0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x32-e4b90b00.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x48-3e41cc8a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet18-d92f0530.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet50-08389792.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext50_32x4-ddb3e555.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x4-dc43570a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x8-2cfe2f8b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x16-15fffa57.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet18-118f1556.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet50-16a12f1b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext50_32x4-72679e44.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x4-3f87e46b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | 
https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x8-b4712904.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x16-f3559a9c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnet50_ra_224-8efdb4bb.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnet152d_ra2-04464dd2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext26d_32x4d-80fa48a3.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext26tn_32x4d-569cb627.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext50_32x4d_racm-a304a460.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecaresnet26t_ra2-46609757.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNetLight_4f34b35b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet50D_833caf58.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45899/outputs/ECAResNet50D_P_9c67f710.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecaresnet50t_ra2-f7ac63c4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet101D_281c5844.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45610/outputs/ECAResNet101D_P_75a3370e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecaresnet269d_320_ra2-7baa55cb.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnetblur50-84f4748f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs50_ema-6b53758b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs101_i192_ema-1509bbf6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs152_i256_ema-a9aff7f9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs200_ema-623d2f59.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs270_ema-b40e674c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs350_i256_ema-5a1aa8f1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs420_ema-972dee69.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnetv2.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x1-ILSVRC2012.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnetv2.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x3-ILSVRC2012.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnetv2.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x1-ILSVRC2012.npz | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnetv2.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x3-ILSVRC2012.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnetv2.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x2-ILSVRC2012.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnetv2.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x4-ILSVRC2012.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnetv2.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x1.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnetv2.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x3.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnetv2.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x1.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnetv2.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x3.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnetv2.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x2.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnetv2.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x4.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnetv2.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/distill/R50x1_224.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnetv2.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/distill/R152x2_T_224.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/resnetv2.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/distill/R152x2_T_384.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/rexnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/rexnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rexnet/rexnetv1_100-1b4dddf4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/rexnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/rexnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rexnet/rexnetv1_130-590d768e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/rexnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/rexnet.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rexnet/rexnetv1_150-bd1a6aa8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/rexnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/rexnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rexnet/rexnetv1_200-8c0b7f2d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/selecsls.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/selecsls.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-selecsls/selecsls42b-8af30141.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/selecsls.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/selecsls.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-selecsls/selecsls60-bbf87526.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/selecsls.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/selecsls.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-selecsls/selecsls60b-94e619b5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/senet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/senet.py | http://data.lip6.fr/cadene/pretrainedmodels/senet154-c7b49a05.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/senet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/senet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnet18-4bb0ce65.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/senet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/senet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnet34-a4004e63.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/senet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/senet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/se_resnet50-ce0d4300.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/senet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/senet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/se_resnet101-7e38fcc6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/senet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/senet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/se_resnet152-d17c99b7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/senet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/senet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext26_32x4d-65ebdb501.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/senet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/senet.py | http://data.lip6.fr/cadene/pretrainedmodels/se_resnext50_32x4d-a260b3a4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/senet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/senet.py | http://data.lip6.fr/cadene/pretrainedmodels/se_resnext101_32x4d-3b2fe3d8.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/sknet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/sknet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/skresnet18_ra-4eec2804.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/sknet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/sknet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/skresnet34_ra-bdc0ccde.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/sknet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/sknet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/skresnext50_ra-f40e40bf.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/swin_transformer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/swin_transformer.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_base_patch4_window12_384_22kto1k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/swin_transformer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/swin_transformer.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_base_patch4_window7_224_22kto1k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/swin_transformer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/swin_transformer.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_large_patch4_window12_384_22kto1k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/swin_transformer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/swin_transformer.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_large_patch4_window7_224_22kto1k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/swin_transformer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/swin_transformer.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_small_patch4_window7_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/swin_transformer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/swin_transformer.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_tiny_patch4_window7_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/swin_transformer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/swin_transformer.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_base_patch4_window12_384_22k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/swin_transformer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/swin_transformer.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_base_patch4_window7_224_22k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/swin_transformer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/swin_transformer.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_large_patch4_window12_384_22k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/swin_transformer.py | 
Gluon_ResNet50_v1d_for_PyTorch/timm/models/swin_transformer.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_large_patch4_window7_224_22k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/tnt.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/tnt.py | https://github.com/contrastive/pytorch-image-models/releases/download/TNT/tnt_s_patch16_224.pth.tar | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/tresnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/tresnet.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/tresnet_m_1k_miil_83_1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/tresnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/tresnet.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/tresnet_m_miil_in21k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/tresnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/tresnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-tresnet/tresnet_l_81_5-235b486c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/tresnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/tresnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-tresnet/tresnet_xl_82_0-a2d51b00.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/tresnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/tresnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-tresnet/tresnet_m_448-bc359d10.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/tresnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/tresnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-tresnet/tresnet_l_448-940d0cd1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/tresnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/tresnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-tresnet/tresnet_xl_448-8c1815de.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/twins.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/twins.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vt3p-weights/twins_pcpvt_small-e70e7e7a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/twins.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/twins.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vt3p-weights/twins_pcpvt_base-e5ecb09b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/twins.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/twins.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vt3p-weights/twins_pcpvt_large-d273f802.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/twins.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/twins.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vt3p-weights/twins_svt_small-42e5f78c.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/twins.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/twins.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vt3p-weights/twins_svt_base-c2265010.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/twins.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/twins.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vt3p-weights/twins_svt_large-90f6aaa9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/vgg.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg11-bbd30ac9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/vgg.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg13-c768596a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/vgg.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg16-397923af.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/vgg.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg19-dcbb9e9d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/vgg.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg11_bn-6002323d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/vgg.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg13_bn-abd245e5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/vgg.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg16_bn-6c64b313.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/vgg.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg19_bn-c79401a0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/visformer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/visformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vt3p-weights/visformer_small-839e1f5b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/vision_transformer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/vision_transformer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/vision_transformer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/vision_transformer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/vision_transformer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/vision_transformer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/vision_transformer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/vision_transformer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/vision_transformer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/vision_transformer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/vision_transformer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_large_p32_384-9b920ba8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/vision_transformer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/vision_transformer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/vision_transformer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/Ti_16-i21k-300ep-lr_0.001-aug_none-wd_0.03-do_0.0-sd_0.0.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/vision_transformer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/S_32-i21k-300ep-lr_0.001-aug_light1-wd_0.03-do_0.0-sd_0.0.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/vision_transformer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/S_16-i21k-300ep-lr_0.001-aug_light1-wd_0.03-do_0.0-sd_0.0.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/vision_transformer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/B_32-i21k-300ep-lr_0.001-aug_medium1-wd_0.03-do_0.0-sd_0.0.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/vision_transformer.py | 
Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/B_16-i21k-300ep-lr_0.001-aug_medium1-wd_0.1-do_0.0-sd_0.0.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/vision_transformer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_large_patch32_224_in21k-9046d2e7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/vision_transformer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/L_16-i21k-300ep-lr_0.001-aug_medium1-wd_0.1-do_0.1-sd_0.1.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/vision_transformer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/imagenet21k/ViT-H_14.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/vision_transformer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_tiny_patch16_224-a1311bcf.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/vision_transformer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_small_patch16_224-cd65a155.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/vision_transformer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_224-b5f2ef4d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/vision_transformer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_384-8de9b5d1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/vision_transformer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_tiny_distilled_patch16_224-b40b3cf7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/vision_transformer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_small_distilled_patch16_224-649709d9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/vision_transformer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_224-df68dfff.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/vision_transformer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_384-d0272ac0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/vision_transformer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/vit_base_patch16_224_in21k_miil.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/vision_transformer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/vision_transformer_hybrid.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/ | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/vision_transformer_hybrid.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/ | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/vision_transformer_hybrid.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/ | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/vision_transformer_hybrid.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/ | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/vision_transformer_hybrid.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_resnet50_384-9fd3c705.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/vision_transformer_hybrid.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/ | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/vision_transformer_hybrid.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/ | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/vision_transformer_hybrid.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/R_Ti_16-i21k-300ep-lr_0.001-aug_none-wd_0.03-do_0.0-sd_0.0.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/vision_transformer_hybrid.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/R26_S_32-i21k-300ep-lr_0.001-aug_medium2-wd_0.03-do_0.0-sd_0.0.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/vision_transformer_hybrid.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_resnet50_224_in21k-6f7c7740.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/vision_transformer_hybrid.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/R50_L_32-i21k-300ep-lr_0.001-aug_medium2-wd_0.1-do_0.0-sd_0.0.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/vovnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vovnet.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ese_vovnet19b_dw-a8741004.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/vovnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vovnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ese_vovnet39b-f912fe73.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/xception.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/xception.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/xception-43020ad28.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/xception_aligned.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/xception_aligned.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_xception_41-e6439c97.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/xception_aligned.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/xception_aligned.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_xception_65-c9ae96e8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/tree/main/timm/models/xception_aligned.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/xception_aligned.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_xception_71-8eec7df1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/adahessian.py | Gluon_ResNet50_v1d_for_PyTorch/timm/optim/adahessian.py | https://github.com/davda54/ada-hessian/blob/master/ada_hessian.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://github.com/facebookresearch/WSL-Images | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/data/real_labels.py | Gluon_ResNet50_v1d_for_PyTorch/timm/data/real_labels.py | https://arxiv.org/abs/2006.07159 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/adamp.py | Gluon_ResNet50_v1d_for_PyTorch/timm/optim/adamp.py | https://github.com/clovaai/AdamP/blob/master/adamp/adamp.py | 源码实现 | -| 开发引入 | / | Gluon_ResNet50_v1d_for_PyTorch/timm/models/gluon_xception.py | https://github.com/jfzhang95/pytorch-deeplab-xception | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/sknet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/sknet.py | https://arxiv.org/abs/1903.06586 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/adafactor.py | Gluon_ResNet50_v1d_for_PyTorch/timm/optim/adafactor.py | https://github.com/pytorch/fairseq/blob/master/fairseq/optim/adafactor.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/resnest.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnest.py | https://arxiv.org/abs/2004.08955 | 论文地址 | -| 开发引入 | / | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://github.com/facebookresearch/semi-supervised-ImageNet1K-models/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/std_conv.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/layers/std_conv.py | https://github.com/joe-siyuan-qiao/WeightStandardization | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models.git/timm/data/auto_augment.py | Gluon_ResNet50_v1d_for_PyTorch/timm/data/auto_augment.py | https://github.com/google-research/augmix | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/resnest.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnest.py | https://github.com/zhanghang1989/ResNeSt/blob/master/ablation.md | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/sknet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/sknet.py | https://arxiv.org/abs/2001.06268 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/README.md | Gluon_ResNet50_v1d_for_PyTorch/timm/models/pnasnet.py | https://arxiv.org/abs/1712.00559 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/vision_transformer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://github.com/Alibaba-MIIL/ImageNet21K | 源码实现 | -| 开发引入 | / | Gluon_ResNet50_v1d_for_PyTorch/timm/data/random_erasing.py | https://github.com/zhunzhong07/Random-Erasing | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/data/loader.py | Gluon_ResNet50_v1d_for_PyTorch/timm/data/loader.py | https://github.com/NVIDIA/apex/commit/d5e2bb4bdeedd27b1dfaf5bb2b24d6c000dee9be#diff-cf86c282ff7fba81fad27a559379d5bf | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/vovnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vovnet.py | https://arxiv.org/abs/1904.09730 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/eca.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/layers/eca.py | https://arxiv.org/abs/1910.03151 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/eca.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/layers/eca.py | https://github.com/pytorch/pytorch/pull/17240 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/rmsprop_tf.py | Gluon_ResNet50_v1d_for_PyTorch/timm/optim/rmsprop_tf.py | https://github.com/pytorch/pytorch/blob/063946d2b3f3f1e953a2a3b54e0b34f1393de295/torch/optim/rmsprop.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/activations_me.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/layers/activations_me.py | https://twitter.com/jeremyphoward/status/1188251041835315200 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/utils/model.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/nfnet.py | https://arxiv.org/abs/2101.08692 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet/condconv | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/bottleneck_attn.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/layers/bottleneck_attn.py | https://arxiv.org/abs/2101.11605 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/utils/model.py | Gluon_ResNet50_v1d_for_PyTorch/timm/utils/model.py | https://docs.fast.ai/callback.hook.html | 相关说明 | -| 开发引入 | / | Gluon_ResNet50_v1d_for_PyTorch/timm/models/layers/drop.py | https://github.com/tensorflow/tpu/issues/494#issuecomment-532968956 | 相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models.git/timm/models/twins.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/twins.py | https://arxiv.org/pdf/2104.13840.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/_efficientnet_blocks.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/abs/1905.11946 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/docs/archived_changes.md | Gluon_ResNet50_v1d_for_PyTorch/timm/models/layers/swin_attn.py | https://github.com/microsoft/Swin-Transformer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/DeGirum/pruned-models/releases/tag/efficientnet_v1.0 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/inception_v3.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/inception_v3.py | https://github.com/pytorch/vision/blob/master/LICENSE | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/activations_me.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/layers/activations_me.py | https://arxiv.org/abs/1908.08681 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/lookahead.py | Gluon_ResNet50_v1d_for_PyTorch/timm/optim/lookahead.py | https://github.com/alphadl/lookahead.pytorch | 源码实现 | -| 开发引入 | / | Gluon_ResNet50_v1d_for_PyTorch/timm/models/sknet.py | https://github.com/clovaai/assembled-cnn | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/drop.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/layers/drop.py | https://arxiv.org/abs/1810.12890 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/_efficientnet_blocks.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet_blocks.py | https://arxiv.org/abs/2102.05610 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/nvnovograd.py | Gluon_ResNet50_v1d_for_PyTorch/timm/optim/nvnovograd.py | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechRecognition/Jasper | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/global_context.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/layers/global_context.py | https://github.com/xvjiarui/GCNet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/byobnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/byobnet.py | https://github.com/DingXiaoH/RepVGG | 源码实现 | -| 开发引入 | / | Gluon_ResNet50_v1d_for_PyTorch/timm/models/layers/mlp.py | https://arxiv.org/abs/1612.08083","https://arxiv.org/abs/2002.05202 | 论文地址 | -| 开发引入 | / | Gluon_ResNet50_v1d_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/dmlc/gluon-cv/blob/master/gluoncv/model_zoo/resnet.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/visformer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/visformer.py | https://arxiv.org/abs/2104.12533 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/data/random_erasing.py | Gluon_ResNet50_v1d_for_PyTorch/timm/data/random_erasing.py | https://arxiv.org/pdf/1708.04896.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/data/mixup.py | Gluon_ResNet50_v1d_for_PyTorch/timm/data/mixup.py | https://arxiv.org/abs/1905.04899 | 
论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/pit.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/pit.py | https://arxiv.org/abs/2103.16302 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/vovnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vovnet.py | https://arxiv.org/abs/1911.06667 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/twins.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/twins.py | https://github.com/whai362/PVT.git | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/vision_transformer_sam.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnetv2.py | https://arxiv.org/abs/2010.11929 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/lambda_layer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/layers/lambda_layer.py | https://arxiv.org/abs/2102.08602 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/hardcorenas.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/hardcorenas.py | https://github.com/Alibaba-MIIL/HardCoReNAS | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/nvnovograd.py | Gluon_ResNet50_v1d_for_PyTorch/timm/optim/nvnovograd.py | https://arxiv.org/abs/1905.11286 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/facebookresearch/maskrcnn-benchmark/blob/master/maskrcnn_benchmark/modeling/backbone/fbnet_modeldef.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/xception.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/xception.py | https://arxiv.org/pdf/1610.02357.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/_efficientnet_blocks.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/mobilenetv3.py | https://arxiv.org/abs/1905.02244 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/loss/jsd.py | Gluon_ResNet50_v1d_for_PyTorch/timm/loss/jsd.py | https://arxiv.org/abs/1912.02781 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/vovnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vovnet.py | https://github.com/youngwanLEE/vovnet-detectron2 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/README.md | Gluon_ResNet50_v1d_for_PyTorch/timm/models/dpn.py | https://github.com/cypw/DPNs | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vgg.py | https://github.com/pytorch/vision | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/ghostnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/ghostnet.py | https://arxiv.org/abs/1911.11907 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/nasnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/nasnet.py | https://github.com/Cadene/pretrained-models.pytorch | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/deit.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://arxiv.org/abs/2012.12877 | 论文地址 | -| 开发引入 | / | Gluon_ResNet50_v1d_for_PyTorch/timm/models/layers/involution.py | 
https://github.com/d-li14/involution/blob/main/cls/mmcls/models/utils/involution_naive.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/coat.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/coat.py | https://discuss.pytorch.org/t/how-to-keep-the-shape-of-input-and-output-same-when-dilation-conv/14338 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/resnetv2.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnetv2.py | https://github.com/KaimingHe/resnet-1k-layers/blob/master/resnet-pre-act.lua | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/vision_transformer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/mobilenetv3.py | https://github.com/Alibaba-MIIL/ImageNet21K | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/mixed_conv2d.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/layers/mixed_conv2d.py | https://github.com/tensorflow/tpu/blob/master/models/official/mnasnet/mixnet/custom_layers.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/inception_resnet_v2.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/inception_resnet_v2.py | http://download.tensorflow.org/models/ens_adv_inception_resnet_v2_2017_08_18.tar.gz | 下载链接 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/vovnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vovnet.py | https://github.com/stigma0617/VoVNet.pytorch/blob/master/models_vovnet/vovnet.py | 源码实现 | -| 开发引入 | / | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://arxiv.org/abs/2106.TODO | 论文地址 | -| 开发引入 | / | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://arxiv.org/pdf/1812.01187 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/data/auto_augment.py | Gluon_ResNet50_v1d_for_PyTorch/timm/data/auto_augment.py | https://arxiv.org/abs/1909.13719 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/eca.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/layers/eca.py | https://github.com/VRandme | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/nadamw.py | Gluon_ResNet50_v1d_for_PyTorch/timm/optim/adamw.py | https://openreview.net/forum?id=ryQu7f-RZ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/vision_transformer_sam.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://arxiv.org/abs/2010.11929 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/res2net.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/res2net.py | https://github.com/gasvn/Res2Net/blob/master/res2net.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/sgdp.py | Gluon_ResNet50_v1d_for_PyTorch/timm/optim/sgdp.py | https://arxiv.org/abs/2006.08217 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/non_local_attn.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/layers/non_local_attn.py | https://github.com/BA-Transform/BAT-Image-Classification | 源码实现 | -| 开发引入 | / | Gluon_ResNet50_v1d_for_PyTorch/timm/models/inception_resnet_v2.py | https://arxiv.org/abs/1705.07204 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | 
https://github.com/tensorflow/tpu/tree/bee9c4f6/models/official/resnet/resnet_rs | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://arxiv.org/abs/2103.07579 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/activations_me.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/layers/activations.py | https://github.com/digantamisra98/H-Mish/blob/0da20d4bc58e696b6803f2523c58d3c8a82782d0/README.md | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/_efficientnet_blocks.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/abs/2104.00298 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/rmsprop_tf.py | Gluon_ResNet50_v1d_for_PyTorch/timm/optim/adamw.py | https://arxiv.org/abs/1711.05101 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/vision_transformer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://github.com/google-research/vision_transformer/blob/00883dd691c63a6830751563748663526e811cee/vit_jax/checkpoint.py#L224 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/utils/agc.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/nfnet.py | https://arxiv.org/abs/2102.06171 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/levit.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/levit.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/vision_transformer.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/activations_jit.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/layers/activations_jit.py | https://arxiv.org/abs/1710.05941 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/sknet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/layers/selective_kernel.py | https://arxiv.org/abs/1903.06586 | 论文地址 | -| 开发引入 | / | Gluon_ResNet50_v1d_for_PyTorch/sotabench_setup.sh | https://onedrive.hyper.ai/down/ImageNet/data/ImageNet2012/ILSVRC2012_devkit_t12.tar.gz | 下载链接 | -| 开发引入 | / | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet_blocks.py | https://arxiv.org/abs/1801.04381v4 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/selecsls.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/selecsls.py | https://creativecommons.org/licenses/by/4.0/legalcode | license地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/resnetv2.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnetv2.py | https://arxiv.org/abs/2106.05237 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/data/random_erasing.py | Gluon_ResNet50_v1d_for_PyTorch/timm/data/random_erasing.py | https://github.com/pytorch/pytorch/issues/19508 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/vision_transformer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/mlp_mixer.py | https://github.com/Alibaba-MIIL/ImageNet21K | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/train.py | Gluon_ResNet50_v1d_for_PyTorch/train.py | https://github.com/NVIDIA/apex/tree/master/examples/imagenet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/utils/model.py | 
Gluon_ResNet50_v1d_for_PyTorch/timm/utils/model.py | https://arxiv.org/abs/2101.08692 | 论文地址 | -| 开发引入 | / | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://pytorch.org/hub/facebookresearch_WSL-Images_resnext/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/README.md | Gluon_ResNet50_v1d_for_PyTorch/timm/models/inception_resnet_v2.py | https://arxiv.org/abs/1602.07261 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/_efficientnet_builder.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/tensorflow/tpu/blob/master/models/official/efficientnet/efficientnet_model.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/dpn.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/dpn.py | https://github.com/oyam/pytorch-DPNs | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/loss/jsd.py | Gluon_ResNet50_v1d_for_PyTorch/timm/data/auto_augment.py | https://github.com/google-research/augmix/blob/master/imagenet.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/res2net.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/dla.py | https://arxiv.org/abs/1904.01169 | 论文地址 | -| 开发引入 | / | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://arxiv.org/abs/2106.TODO | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/ghostnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/ghostnet.py | https://github.com/huawei-noah/CV-backbones/tree/master/ghostnet_pytorch | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/squeeze_excite.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/layers/squeeze_excite.py | https://arxiv.org/abs/1709.01507 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/data/readers/reader_tfds.py | Gluon_ResNet50_v1d_for_PyTorch/timm/data/parsers/parser_tfds.py | https://github.com/tensorflow/datasets | 数据集地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/train.py | Gluon_ResNet50_v1d_for_PyTorch/modelarts/train_modelarts.py | https://github.com/NVIDIA/apex/tree/master/examples/imagenet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/vovnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/layers/squeeze_excite.py | https://arxiv.org/abs/1911.06667 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/data/readers/reader_tfds.py | Gluon_ResNet50_v1d_for_PyTorch/timm/data/parsers/parser_tfds.py | https://pytorch.org/docs/stable/data.html#multi-process-data-loading | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/halo_attn.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/layers/halo_attn.py | https://arxiv.org/abs/1904.09925 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/utils/misc.py | Gluon_ResNet50_v1d_for_PyTorch/timm/utils/misc.py | http://www.codinghorror.com/blog/archives/001018.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/_efficientnet_blocks.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet_blocks.py | https://ai.googleblog.com/2019/08/efficientnet-edgetpu-creating.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/cond_conv2d.py | 
Gluon_ResNet50_v1d_for_PyTorch/timm/models/layers/cond_conv2d.py | https://github.com/tensorflow/tpu/blob/master/models/official/efficientnet/condconv/condconv_layers.py | 源码实现 | -| 开发引入 | / | Gluon_ResNet50_v1d_for_PyTorch/timm/models/inception_resnet_v2.py | https://github.com/Cadene/tensorflow-model-zoo.torch | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/hrnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/hrnet.py | sunk@mail.ustc.edu.cn | 邮箱地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/_efficientnet_blocks.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/abs/1807.11626 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/rmsprop_tf.py | Gluon_ResNet50_v1d_for_PyTorch/timm/optim/rmsprop_tf.py | http://www.cs.toronto.edu/~tijmen/csc321/slides/lecture_slides_lec6.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/utils/agc.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/layers/std_conv.py | https://github.com/deepmind/deepmind-research/tree/master/nfnets | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/mlp_mixer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/mlp_mixer.py | https://arxiv.org/abs/2105.03404 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/senet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/senet.py | https://github.com/hujie-frank/SENet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/eca.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/layers/eca.py | https://arxiv.org/pdf/1910.03151.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/levit.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/convit.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/vision_transformer.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/non_local_attn.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/layers/non_local_attn.py | https://openaccess.thecvf.com/content_CVPR_2020/html/Chi_Non-Local_Neural_Networks_With_Grouped_Bilinear_Attentional_Transforms_CVPR_2020_paper.html | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/tnt.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/tnt.py | https://arxiv.org/abs/2103.00112 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/utils/agc.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/nfnet.py | https://github.com/deepmind/deepmind-research/tree/master/nfnets | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/drop.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/layers/drop.py | https://github.com/tensorflow/tpu/blob/master/models/official/resnet/resnet_model.py#L74 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/scheduler/cosine_lr.py | Gluon_ResNet50_v1d_for_PyTorch/timm/scheduler/cosine_lr.py | https://arxiv.org/abs/1608.03983 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/regnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/regnet.py | https://arxiv.org/abs/2003.13678 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/levit.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/levit.py | https://arxiv.org/abs/2104.01136 | 论文地址 | -| 
开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet/lite | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/nadam.py | Gluon_ResNet50_v1d_for_PyTorch/timm/optim/nadam.py | https://github.com/pytorch/pytorch/pull/1408 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/convit.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/convit.py | https://arxiv.org/abs/2103.10697 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/_efficientnet_builder.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet_builder.py | https://github.com/tensorflow/tpu/blob/master/models/official/mnasnet/mnasnet_model.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/byobnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/byobnet.py | https://arxiv.org/abs/2006.14090 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/sgdp.py | Gluon_ResNet50_v1d_for_PyTorch/timm/optim/sgdp.py | https://github.com/clovaai/AdamP | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/_efficientnet_blocks.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet_blocks.py | https://arxiv.org/abs/2004.14525 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/dla.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/dla.py | https://github.com/gasvn/Res2Net/blob/master/dla.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/xcit.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/cait.py | https://arxiv.org/abs/2103.17239 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/data/auto_augment.py | Gluon_ResNet50_v1d_for_PyTorch/timm/data/auto_augment.py | https://arxiv.org/abs/1805.09501 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/pytorch/vision | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/inception_v3.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/inception_v3.py | http://download.tensorflow.org/models/inception_v3_2016_08_28.tar.gz | 下载链接 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/train.py | Gluon_ResNet50_v1d_for_PyTorch/modelarts/train_modelarts.py | https://github.com/pytorch/examples/tree/master/imagenet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/vision_transformer_hybrid.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/layers/patch_embed.py | https://github.com/google-research/vision_transformer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/dla.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/dla.py | https://arxiv.org/abs/1707.06484 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/visformer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/visformer.py | https://github.com/danczs/Visformer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/mlp_mixer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/mlp_mixer.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resmlp_24_224_raa-a8256759.pth | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/res2net.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/dla.py | https://github.com/gasvn/Res2Net/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/data/auto_augment.py | Gluon_ResNet50_v1d_for_PyTorch/timm/data/auto_augment.py | https://arxiv.org/abs/1906.11172 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/eca.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/layers/eca.py | https://github.com/BangguWu/ECANet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/cspnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/cspnet.py | https://arxiv.org/abs/1911.11929 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/drop.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/layers/drop.py | https://arxiv.org/abs/1603.09382 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/densenet.py | https://github.com/pytorch/vision | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/sgdp.py | Gluon_ResNet50_v1d_for_PyTorch/timm/optim/adamp.py | https://arxiv.org/abs/2006.08217 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/data/mixup.py | Gluon_ResNet50_v1d_for_PyTorch/timm/data/mixup.py | https://github.com/clovaai/CutMix-PyTorch | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/radam.py | Gluon_ResNet50_v1d_for_PyTorch/timm/optim/radam.py | https://arxiv.org/abs/1908.03265 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/byobnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/byobnet.py | https://arxiv.org/abs/2101.03697 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/abs/1812.03443 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/cond_conv2d.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/layers/cond_conv2d.py | https://github.com/pytorch/pytorch/issues/17983 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/sgdp.py | Gluon_ResNet50_v1d_for_PyTorch/timm/optim/sgdp.py | https://github.com/clovaai/AdamP/blob/master/adamp/sgdp.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/inception_v3.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/inception_v3.py | https://gluon-cv.mxnet.io/model_zoo/classification.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/regnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/regnet.py | https://github.com/facebookresearch/pycls/blob/master/pycls/models/regnet.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/metaformer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/abs/1801.04381 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/scheduler/scheduler.py | Gluon_ResNet50_v1d_for_PyTorch/timm/scheduler/scheduler.py | 
https://github.com/allenai/allennlp/tree/master/allennlp/training/learning_rate_schedulers | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/gather_excite.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/layers/gather_excite.py | https://github.com/hujie-frank/GENet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/activations_jit.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/layers/activations.py | https://arxiv.org/abs/1710.05941 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/tensorflow/tpu/tree/master/models/official/mnasnet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/res2net.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/res2net.py | https://github.com/gasvn/Res2Net/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet/edgetpu | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/tresnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/tresnet.py | https://arxiv.org/pdf/2003.13630.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/utils/agc.py | Gluon_ResNet50_v1d_for_PyTorch/timm/utils/agc.py | https://github.com/deepmind/deepmind-research/tree/master/nfnets | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/inception_resnet_v2.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/inception_resnet_v2.py | http://download.tensorflow.org/models/inception_resnet_v2_2016_08_30.tar.gz | 下载链接 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/scheduler/tanh_lr.py | Gluon_ResNet50_v1d_for_PyTorch/timm/scheduler/tanh_lr.py | https://arxiv.org/abs/1806.01593 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/lookahead.py | Gluon_ResNet50_v1d_for_PyTorch/timm/optim/lookahead.py | https://arxiv.org/abs/1907.08610 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/vision_transformer_hybrid.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnetv2.py | https://github.com/google-research/vision_transformer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/layers/cond_conv2d.py | https://arxiv.org/abs/1904.04971 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/README.md | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://arxiv.org/abs/1805.00932 | 论文地址 | -| 开发引入 | / | Gluon_ResNet50_v1d_for_PyTorch/timm/models/levit.py | https://github.com/facebookresearch/LeViT | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/drop.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/layers/drop.py | https://arxiv.org/pdf/1810.12890.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/nvnovograd.py | Gluon_ResNet50_v1d_for_PyTorch/timm/optim/novograd.py | https://arxiv.org/abs/1905.11286 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | 
https://arxiv.org/abs/1911.04252 | 论文地址 | -| 开发引入 | / | Gluon_ResNet50_v1d_for_PyTorch/timm/models/inception_v4.py | https://github.com/Cadene/tensorflow-model-zoo.torch | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/global_context.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/layers/global_context.py | https://arxiv.org/abs/1904.11492 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/rmsprop_tf.py | Gluon_ResNet50_v1d_for_PyTorch/timm/optim/rmsprop_tf.py | https://github.com/pytorch/pytorch/blob/master/LICENSE | license地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/vgg.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vgg.py | https://arxiv.org/pdf/1409.1556.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/pnasnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/pnasnet.py | https://github.com/Cadene/pretrained-models.pytorch/blob/master/pretrainedmodels/models/pnasnet.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/abs/1904.02877 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/abs/1911.09665 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/resnetv2.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnetv2.py | https://arxiv.org/abs/1912.11370 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://github.com/facebookresearch/semi-supervised-ImageNet1K-models | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/_efficientnet_blocks.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet_blocks.py | https://arxiv.org/abs/2104.00298 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/_efficientnet_builder.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet_builder.py | https://github.com/tensorflow/tpu/blob/master/models/official/efficientnet/efficientnet_model.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/activations_me.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/layers/activations.py | https://arxiv.org/abs/1908.08681 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/senet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/senet.py | https://github.com/creafz | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/activations_me.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/layers/activations_jit.py | https://github.com/digantamisra98/H-Mish/blob/0da20d4bc58e696b6803f2523c58d3c8a82782d0/README.md | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/rexnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/rexnet.py | https://github.com/clovaai/rexnet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/rmsprop_tf.py | Gluon_ResNet50_v1d_for_PyTorch/timm/optim/rmsprop_tf.py | https://arxiv.org/pdf/1308.0850v5.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/vision_transformer.py | 
Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://github.com/lucidrains/vit-pytorch | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/utils/agc.py | Gluon_ResNet50_v1d_for_PyTorch/timm/utils/agc.py | https://arxiv.org/abs/2102.06171 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/vision_transformer_sam.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://arxiv.org/abs/2010.11929 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/weight_init.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/layers/weight_init.py | https://people.sc.fsu.edu/~jburkardt/presentations/truncated_normal.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/scheduler/cosine_lr.py | Gluon_ResNet50_v1d_for_PyTorch/timm/scheduler/cosine_lr.py | https://github.com/allenai/allennlp/blob/master/allennlp/training/learning_rate_schedulers/cosine.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/swin_transformer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/swin_transformer.py | https://arxiv.org/pdf/2103.14030 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/nadam.py | Gluon_ResNet50_v1d_for_PyTorch/timm/optim/nadam.py | http://www.cs.toronto.edu/~fritz/absps/momentum.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/senet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/pnasnet.py | https://github.com/creafz | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/coat.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/coat.py | https://arxiv.org/abs/2104.06399 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/data/auto_augment.py | Gluon_ResNet50_v1d_for_PyTorch/timm/data/auto_augment.py | https://github.com/tensorflow/tpu/blob/master/models/official/efficientnet/autoaugment.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/scheduler/scheduler.py | Gluon_ResNet50_v1d_for_PyTorch/timm/scheduler/scheduler.py | https://github.com/pytorch/fairseq/tree/master/fairseq/optim/lr_scheduler | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/loss/jsd.py | Gluon_ResNet50_v1d_for_PyTorch/timm/loss/jsd.py | https://github.com/google-research/augmix/blob/master/imagenet.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/train.py | Gluon_ResNet50_v1d_for_PyTorch/train.py | https://github.com/pytorch/examples/tree/master/imagenet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/halo_attn.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/layers/bottleneck_attn.py | https://gist.github.com/aravindsrinivas/56359b79f0ce4449bcb04ab4b56a57a2 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/xception_aligned.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/xception_aligned.py | https://github.com/tensorflow/models/blob/master/research/deeplab/g3doc/model_zoo.md | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/halo_attn.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/layers/halo_attn.py | https://arxiv.org/abs/2103.12731 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/selecsls.py | 
Gluon_ResNet50_v1d_for_PyTorch/timm/models/selecsls.py | https://arxiv.org/abs/1907.00837 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/utils/model.py | Gluon_ResNet50_v1d_for_PyTorch/timm/utils/model.py | https://gist.github.com/amaarora/6e56942fcb46e67ba203f3009b30d950 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/utils/model.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/layers/std_conv.py | https://arxiv.org/abs/2101.08692 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/inplace_abn.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/layers/inplace_abn.py | https://github.com/mapillary/inplace_abn.git@v1.0.12 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/mlp_mixer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/mlp_mixer.py | https://arxiv.org/abs/2105.01601 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/resnest.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/layers/split_attn.py | https://arxiv.org/abs/2004.08955 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/utils/agc.py | Gluon_ResNet50_v1d_for_PyTorch/timm/utils/agc.py | https://gist.github.com/lucidrains/0d6560077edac419ab5d3aa29e674d5c | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/rmsprop_tf.py | Gluon_ResNet50_v1d_for_PyTorch/timm/optim/rmsprop_tf.py | https://arxiv.org/abs/1711.05101 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://arxiv.org/pdf/2002.08258.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/docs/archived_changes.md | Gluon_ResNet50_v1d_for_PyTorch/timm/models/pit.py | https://github.com/naver-ai/pit | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/hfdocs/source/models.mdx | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnetv2.py | https://github.com/google-research/big_transfer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/std_conv.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/layers/std_conv.py | https://arxiv.org/abs/1903.10520v2 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/drop.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/layers/drop.py | https://github.com/clovaai/assembled-cnn/blob/master/nets/blocks.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/_efficientnet_blocks.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet_blocks.py | https://arxiv.org/abs/1807.11626 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/twins.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/twins.py | https://arxiv.org/abs/2102.10882 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/mobilenetv3.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/mobilenetv3.py | https://arxiv.org/abs/2006.02049 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/adabelief.py | Gluon_ResNet50_v1d_for_PyTorch/timm/optim/adabelief.py | https://gist.github.com/juntang-zhuang/517ce3c27022b908bb93f78e4f786dc3 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/inception_v3.py | 
Gluon_ResNet50_v1d_for_PyTorch/timm/models/inception_v3.py | http://download.tensorflow.org/models/adv_inception_v3_2017_08_18.tar.gz | 下载链接 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/senet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/senet.py | https://github.com/pytorch/vision/blob/master/torchvision/models/resnet.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/data/real_labels.py | Gluon_ResNet50_v1d_for_PyTorch/timm/data/real_labels.py | https://github.com/google-research/reassessed-imagenet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/non_local_attn.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/layers/non_local_attn.py | https://github.com/facebookresearch/video-nonlocal-net | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/pdf/2002.08258.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/mlp_mixer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/cait.py | https://github.com/facebookresearch/deit | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/data/readers/reader_tfds.py | Gluon_ResNet50_v1d_for_PyTorch/timm/data/parsers/parser_tfds.py | https://github.com/pytorch/pytorch/issues/33413 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/adamw.py | Gluon_ResNet50_v1d_for_PyTorch/timm/optim/adamw.py | https://arxiv.org/abs/1412.6980 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/pdf/1807.11626.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/hardcorenas.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/hardcorenas.py | https://arxiv.org/abs/2102.11646 | 论文地址 | -| 开发引入 | / | Gluon_ResNet50_v1d_for_PyTorch/timm/models/convit.py | https://github.com/facebookresearch/convit | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/data/mixup.py | Gluon_ResNet50_v1d_for_PyTorch/timm/data/mixup.py | https://arxiv.org/abs/1710.09412 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/_features.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/features.py | https://github.com/pytorch/vision/blob/d88d8961ae51507d0cb680329d985b1488b1b76b/torchvision/models/_utils.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/mlp_mixer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/mlp_mixer.py | https://github.com/facebookresearch/deit | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/sgdp.py | Gluon_ResNet50_v1d_for_PyTorch/timm/optim/adamp.py | https://github.com/clovaai/AdamP | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/vision_transformer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://github.com/karpathy/minGPT | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/utils/model_ema.py | Gluon_ResNet50_v1d_for_PyTorch/timm/utils/model_ema.py | https://www.tensorflow.org/api_docs/python/tf/train/ExponentialMovingAverage | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/mlp_mixer.py | 
Gluon_ResNet50_v1d_for_PyTorch/timm/models/mlp_mixer.py | https://arxiv.org/abs/2105.08050 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/_efficientnet_builder.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet_builder.py | https://github.com/facebookresearch/maskrcnn-benchmark/blob/master/maskrcnn_benchmark/modeling/backbone/fbnet_builder.py | 源码实现 | -| 开发引入 | / | Gluon_ResNet50_v1d_for_PyTorch/timm/optim/novograd.py | https://github.com/convergence-lab/novograd | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/_efficientnet_blocks.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet_blocks.py | https://arxiv.org/abs/1905.11946 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/data/readers/reader_tfds.py | Gluon_ResNet50_v1d_for_PyTorch/timm/data/parsers/parser_tfds.py | https://www.tensorflow.org/datasets/catalog/overview#image_classification | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/cait.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/cait.py | https://arxiv.org/pdf/2003.02436v1.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/docs/archived_changes.md | Gluon_ResNet50_v1d_for_PyTorch/timm/models/swin_transformer.py | https://github.com/microsoft/Swin-Transformer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/tensorflow/tpu/tree/master/models/official/mnasnet/mixnet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/README.md | Gluon_ResNet50_v1d_for_PyTorch/timm/models/layers/involution.py | https://arxiv.org/abs/2103.06255 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/levit.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/cait.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/vision_transformer.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/halo_attn.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/layers/halo_attn.py | https://gist.github.com/aravindsrinivas/56359b79f0ce4449bcb04ab4b56a57a2 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/activations_me.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/layers/activations_jit.py | https://arxiv.org/abs/1908.08681 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/adafactor.py | Gluon_ResNet50_v1d_for_PyTorch/timm/optim/adafactor.py | https://arxiv.org/abs/1804.04235 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/_efficientnet_blocks.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet_blocks.py | https://arxiv.org/abs/1905.02244 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/abs/1907.09595 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/nadam.py | Gluon_ResNet50_v1d_for_PyTorch/timm/optim/nadam.py | http://cs229.stanford.edu/proj2015/054_report.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/README.md | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://arxiv.org/abs/1905.00546 | 论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models.git/timm/models/coat.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/coat.py | https://github.com/mlpc-ucsd/CoaT | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/loss/jsd.py | Gluon_ResNet50_v1d_for_PyTorch/timm/data/auto_augment.py | https://arxiv.org/abs/1912.02781 | 论文地址 | -| 开发引入 | / | Gluon_ResNet50_v1d_for_PyTorch/timm/models/inception_resnet_v2.py | https://github.com/tensorflow/models/tree/master/research/adv_imagenet_models | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/_efficientnet_builder.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet_builder.py | https://github.com/tensorflow/tpu/blob/master/models/official/mnasnet/mnasnet_models.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/vision_transformer_hybrid.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://github.com/google-research/vision_transformer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/adabelief.py | Gluon_ResNet50_v1d_for_PyTorch/timm/optim/adabelief.py | https://github.com/juntang-zhuang/Adabelief-Optimizer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/adabelief.py | Gluon_ResNet50_v1d_for_PyTorch/timm/optim/adabelief.py | https://gist.github.com/juntang-zhuang/0a501dd51c02278d952cf159bc233037 | 源码实现 | -| 开发引入 | / | Gluon_ResNet50_v1d_for_PyTorch/timm/models/twins.py | https://github.com/Meituan-AutoML/Twins | 源码实现 | -| 开发引入 | / | Gluon_ResNet50_v1d_for_PyTorch/timm/models/gluon_xception.py | https://gluon-cv.mxnet.io/_modules/gluoncv/model_zoo/xception.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/tensorflow/models/blob/master/research/slim/nets/mobilenet/mobilenet_v2.py | 源码实现 | -| 开发引入 | / | Gluon_ResNet50_v1d_for_PyTorch/timm/models/layers/swin_attn.py | https://arxiv.org/pdf/2103.14030.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/cspnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/cspnet.py | https://github.com/WongKinYiu/CrossStagePartialNetworks | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/densenet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/densenet.py | https://arxiv.org/pdf/1707.06990.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/hfdocs/source/models.mdx | Gluon_ResNet50_v1d_for_PyTorch/timm/models/byobnet.py | https://github.com/idstcv/GPU-Efficient-Networks | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/hrnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/hrnet.py | https://github.com/HRNet/HRNet-Image-Classification | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/resnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://github.com/pytorch/vision | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/layers/mixed_conv2d.py | https://arxiv.org/abs/1907.09595 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/tnt.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/tnt.py | 
https://gitee.com/mindspore/mindspore/tree/master/model_zoo/research/cv/TNT | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/optim/radam.py | Gluon_ResNet50_v1d_for_PyTorch/timm/optim/radam.py | https://github.com/LiyuanLucasLiu/RAdam | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/senet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/senet.py | https://github.com/Cadene/pretrained-models.pytorch/blob/master/pretrainedmodels/models/senet.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/rexnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/rexnet.py | https://arxiv.org/abs/2007.00992 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/hrnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/hrnet.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/xception.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/xception.py | https://github.com/tstandley/Xception-PyTorch | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/lambda_layer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/layers/lambda_layer.py | https://github.com/lucidrains/lambda-networks | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/mlp_mixer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/mlp_mixer.py | https://github.com/google-research/vision_transformer/blob/linen/vit_jax/models_mixer.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://github.com/google/automl/tree/master/efficientnetv2 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/res2net.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/res2net.py | https://arxiv.org/abs/1904.01169 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/halo_attn.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/layers/bottleneck_attn.py | https://arxiv.org/abs/1904.09925 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/mlp_mixer.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://github.com/facebookresearch/deit | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/activations_me.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/layers/activations_me.py | https://github.com/digantamisra98/H-Mish/blob/0da20d4bc58e696b6803f2523c58d3c8a82782d0/README.md | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/tresnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/tresnet.py | https://github.com/mrT23/TResNet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/densenet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/densenet.py | https://arxiv.org/pdf/1608.06993.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/layers/cbam.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/layers/cbam.py | https://arxiv.org/abs/1807.06521 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/vision_transformer_hybrid.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://github.com/google-research/vision_transformer | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models.git/timm/layers/gather_excite.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/layers/gather_excite.py | https://arxiv.org/abs/1810.12348 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git/timm/models/efficientnet.py | Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/abs/1904.04971 | 论文地址 | +| 文件位置 | 公网地址 | 公网地址用途 | +|-----------------------------------------------------------------------------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/mkdocs.yml | https://cdnjs.cloudflare.com/ajax/libs/tablesort/5.2.1/tablesort.min.js | 开源引用声明 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/mkdocs.yml | https://cdnjs.cloudflare.com/ajax/libs/mathjax/2.7.0/MathJax.js?config=TeX-MML-AM_CHTML | 开源引用声明 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/setup.py | hello@rwightman.com | 作者邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/M36_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/M48_448.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/S24_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/S24_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/S36_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XS24_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XXS24_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XXS24_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XXS36_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XXS36_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/convit.py | https://dl.fbaipublicfiles.com/convit/convit_base.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/convit.py | https://dl.fbaipublicfiles.com/convit/convit_small.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/convit.py | https://dl.fbaipublicfiles.com/convit/convit_tiny.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/densenet.py | https://download.pytorch.org/models/densenet121-a639ec97.pth | 
权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/densenet.py | https://download.pytorch.org/models/densenet201-c1103571.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/densenet.py | https://download.pytorch.org/models/densenet169-b2777c0a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/densenet.py | https://download.pytorch.org/models/densenet161-8d451a50.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60x_c-b870c45c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60x-d15cacda.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60-24839fc4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla46x_c-d761bae7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla46_c-2bfd52c3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla34-ba72cf86.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla169-0914e092.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102x2-262837b6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102x-ad62be81.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102-d94d9790.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb1_pruned_9ebb3fe6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb2_pruned_203f55bc.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb3_pruned_5abcc29f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_F_Green_60ms_78.1_2855edf1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_E_Green_55ms_77.9_90f20e8a.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_D_Green_50ms_77.4_23e3cdde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_C_Green_44ms_77.1_d4148c9e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_B_Green_40ms_76.5_1f882d1e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_A_Green_38ms_75.9_23474aeb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/inception_v3.py | https://download.pytorch.org/models/inception_v3_google-1a9a5a14.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-384-9bdaf2e2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-128-b88c2750.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-128S-96703c44.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-192-92712e41.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-256-13b5763e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_12_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_12_no_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_24_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_24_no_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_36_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_36_no_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlpB_24_22k.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlpB_24_dist.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlpB_24_no_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/mlp_mixer.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/mixer_b16_224_miil.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/mlp_mixer.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/mixer_b16_224_miil_in21k.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/mobilenetv3.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/mobilenetv3_large_100_1k_miil_78_0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/mobilenetv3.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/mobilenetv3_large_100_in21k_miil.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/regnet.py | https://dl.fbaipublicfiles.com/deit/regnety_160-a5fe301d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x8-b4712904.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet18-d92f0530.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet50-08389792.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x16-15fffa57.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x4-dc43570a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x8-2cfe2f8b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext50_32x4-ddb3e555.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet18-118f1556.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet50-16a12f1b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x16-f3559a9c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x4-3f87e46b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext50_32x4-72679e44.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet101D_281c5844.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet50D_833caf58.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNetLight_4f34b35b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45610/outputs/ECAResNet101D_P_75a3370e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45899/outputs/ECAResNet50D_P_9c67f710.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x8-c38310e5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | 
https://download.pytorch.org/models/ig_resnext101_32x48-3e41cc8a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x32-e4b90b00.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x16-c6f796b0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x1.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x1-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x3.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x3-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x2.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x2-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x4.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x4-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x1.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x1-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x3.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x3-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/distill/R152x2_T_224.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/distill/R152x2_T_384.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/distill/R50x1_224.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/tresnet.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/tresnet_m_1k_miil_83_1.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/tresnet.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/tresnet_m_miil_in21k.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg19-dcbb9e9d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg19_bn-c79401a0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg16-397923af.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg16_bn-6c64b313.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg13-c768596a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg13_bn-abd245e5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg11-bbd30ac9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg11_bn-6002323d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | 
https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/B_32-i21k-300ep-lr_0.001-aug_medium1-wd_0.03-do_0.0-sd_0.0.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_224-df68dfff.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_384-d0272ac0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_224-b5f2ef4d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_384-8de9b5d1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_small_distilled_patch16_224-649709d9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_small_patch16_224-cd65a155.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_tiny_distilled_patch16_224-b40b3cf7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_tiny_patch16_224-a1311bcf.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/vit_base_patch16_224_in21k_miil.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/B_16-i21k-300ep-lr_0.001-aug_medium1-wd_0.1-do_0.0-sd_0.0.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/L_16-i21k-300ep-lr_0.001-aug_medium1-wd_0.1-do_0.1-sd_0.1.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/S_16-i21k-300ep-lr_0.001-aug_light1-wd_0.03-do_0.0-sd_0.0.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | 
https://storage.googleapis.com/vit_models/augreg/S_32-i21k-300ep-lr_0.001-aug_light1-wd_0.03-do_0.0-sd_0.0.npz | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/Ti_16-i21k-300ep-lr_0.001-aug_none-wd_0.03-do_0.0-sd_0.0.npz | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/imagenet21k/ViT-H_14.npz | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/R_Ti_16-i21k-300ep-lr_0.001-aug_none-wd_0.03-do_0.0-sd_0.0.npz | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/R26_S_32-i21k-300ep-lr_0.001-aug_medium2-wd_0.03-do_0.0-sd_0.0.npz | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Gluon_ResNet50_v1d_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/R50_L_32-i21k-300ep-lr_0.001-aug_medium2-wd_0.1-do_0.0-sd_0.0.npz | 权重地址 |
\ No newline at end of file
diff --git a/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/public_address_statement.md b/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/public_address_statement.md
index 6f72df9813c29a29caf8ae85cb3d5d01421962e3..547edb04e4ec2d1186325f093194a15981f485bd 100644
--- a/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/public_address_statement.md
+++ b/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/public_address_statement.md
@@ -1,193 +1,101 @@
-| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 |
-|--------|---------------------------------------------------------------------------------------------|------------------------------------------------------------------------------|----------------------------------------------------------------------------------------------------|-------------------------|
-| 开发引入 | / | url.ini | https://repo.anaconda.com/miniconda/Miniconda3-latest-MacOSX-x86_64.sh | conda安装脚本 |
-| 开发引入 | / | url.ini | https://repo.anaconda.com/miniconda/Miniconda3-latest-MacOSX-x86_64.sh 
| conda安装脚本 | -| 开发引入 | / | url.ini | https://repo.continuum.io/miniconda/Miniconda3-latest-Linux-x86_64.sh | conda安装脚本 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/docs/Makefile | Googlenet_ID0447_for_PyTorch/docs/Makefile | http://pytorch.org/vision/ | pytorch官网 | -| 开发引入 | / | url.ini | https://fonts.googleapis.com/css?family=Lato | 字体信息 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/docs/source/conf.py | Googlenet_ID0447_for_PyTorch/docs/source/conf.py | https://docs.python.org/ | python官网 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/docs/source/conf.py | Googlenet_ID0447_for_PyTorch/docs/source/conf.py | http://docs.scipy.org/doc/numpy/ | numpy官网 | -| 开发引入 | / | url.ini | https://repo.continuum.io/miniconda/Miniconda3-latest-Windows-x86_64.exe | conda安装包 | -| 开发引入 | / | url.ini | https://github.com/pytorch/vision | pytorch官网 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/packaging/pkg_helpers.bash | Googlenet_ID0447_for_PyTorch/packaging/pkg_helpers.bash | https://download.pytorch.org/whl/nightly/ | 三方库下载 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/packaging/pkg_helpers.bash | Googlenet_ID0447_for_PyTorch/packaging/pkg_helpers.bash | https://download.pytorch.org/whl/torch_stable.html | 三方库下载 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/packaging/pkg_helpers.bash | Googlenet_ID0447_for_PyTorch/packaging/pkg_helpers.bash | https://download.pytorch.org/whl/nightly/torch_nightly.html | 三方库下载 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/packaging/torchvision/meta.yaml | Googlenet_ID0447_for_PyTorch/packaging/torchvision/meta.yaml | https://github.com/pytorch/vision | pytorch官网 | -| 开发引入 | / | url.ini | https://github.com/pytorch/vision | pytorch官网 | -| 开发引入 | / | url.ini | https://download.pytorch.org/whl/nightly | 三方库下载 | -| 开发引入 | / | url.ini | https://repo.anaconda.com/miniconda/Miniconda3-latest-MacOSX-x86_64.sh | conda安装脚本 | -| 开发引入 | / | url.ini | https://github.com/pytorch/vision | pytorch官网 | -| 开发引入 | / | url.ini | https://download.pytorch.org/whl/nightly | 三方库下载 | -| 开发引入 | / | url.ini | https://download.pytorch.org/whl | 三方库下载 | -| 开发引入 | / | url.ini | https://repo.continuum.io/miniconda/Miniconda3-latest-Windows-x86_64.exe | conda安装包 | -| 开发引入 | / | url.ini | https://www.7-zip.org/a/7z1805-x64.exe | 7-zip下载 | -| 开发引入 | / | url.ini | https://dev.azure.com/pytorch | azure官网 | -| 开发引入 | / | url.ini | https://github.com | github官网 | -| 开发引入 | / | url.ini | https://www.dropbox.com/s/z5b7ryz0zrimntl/cuda_9.0.176_windows.7z?dl=1 | cuda9.0.176下载 | -| 开发引入 | / | url.ini | https://www.dropbox.com/s/6p0xyqh472nu8m1/cudnn-9.0-windows7-x64-v7.zip?dl=1 | cudnn9.0下载 | -| 开发引入 | / | url.ini | https://www.dropbox.com/s/7a4sbq0dln6v7t2/cuda_9.1.85_windows.7z?dl=1 | cuda9.1.85下载 | -| 开发引入 | / | url.ini | https://www.dropbox.com/s/e0prhgsrbyfi4ov/cudnn-9.1-windows7-x64-v7.zip?dl=1 | cudnn9.1下载 | -| 开发引入 | / | url.ini | https://www.dropbox.com/s/9mcolalfdj4n979/NvToolsExt.7z?dl=1 | NvToolsExt下载 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/packaging/windows/internal/vs_install.bat | Googlenet_ID0447_for_PyTorch/packaging/windows/internal/vs_install.bat | https://aka.ms/vs/15/release/vs_buildtools.exe | vs_buildtools下载 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/setup.py | Googlenet_ID0447_for_PyTorch/setup.py | soumith@pytorch.org | 作者邮箱 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/setup.py | Googlenet_ID0447_for_PyTorch/setup.py | https://github.com/pytorch/vision | 
vision代码仓 | -| 开发引入 | / | url.ini | http://github.com/pytorch/vision/archive/master.zip | vision代码下载 | -| 开发引入 | / | url.ini | https://github.com/pytorch/vision/archive/master.zip | vision代码下载 | -| 开发引入 | / | url.ini | http://github.com/pytorch/vision/archive/this_doesnt_exist.zip | this_doesnt_exist.zip下载 | -| 开发引入 | / | url.ini | https://download.pytorch.org/vision_tests/io/ | vision_tests下载 | -| 开发引入 | / | url.ini | http://www.vision.caltech.edu/Image_Datasets/Caltech101/101_ObjectCategories.tar.gz | 数据集下载 | -| 开发引入 | / | url.ini | http://www.vision.caltech.edu/Image_Datasets/Caltech101/Annotations.tar | 数据集下载 | -| 开发引入 | / | url.ini | http://www.vision.caltech.edu/Image_Datasets/Caltech256/256_ObjectCategories.tar | 数据集下载 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/datasets/cifar.py | Googlenet_ID0447_for_PyTorch/torchvision/datasets/cifar.py | https://www.cs.toronto.edu/~kriz/cifar-100-python.tar.gz | 数据集下载 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/datasets/cifar.py | Googlenet_ID0447_for_PyTorch/torchvision/datasets/cifar.py | https://www.cs.toronto.edu/~kriz/cifar-10-python.tar.gz | 数据集下载 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/datasets/hmdb51.py | Googlenet_ID0447_for_PyTorch/torchvision/datasets/hmdb51.py | http://serre-lab.clps.brown.edu/wp-content/uploads/2013/10/hmdb51_org.rar | 数据集下载 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/datasets/hmdb51.py | Googlenet_ID0447_for_PyTorch/torchvision/datasets/hmdb51.py | http://serre-lab.clps.brown.edu/wp-content/uploads/2013/10/test_train_splits.rar | 数据集下载 | -| 开发引入 | / | url.ini | http://www.image-net.org/challenges/LSVRC/2012/nnoupb/ILSVRC2012_img_train.tar | 数据集下载 | -| 开发引入 | / | url.ini | http://www.image-net.org/challenges/LSVRC/2012/nnoupb/ILSVRC2012_img_val.tar | 数据集下载 | -| 开发引入 | / | url.ini | http://www.image-net.org/challenges/LSVRC/2012/nnoupb/ILSVRC2012_devkit_t12.tar.gz | 数据集下载 | -| 开发引入 | / | url.ini | http://yann.lecun.com/exdb/mnist/ | 数据集下载 | -| 开发引入 | / | url.ini | http://fashion-mnist.s3-website.eu-central-1.amazonaws.com/ | 数据集下载 | -| 开发引入 | / | url.ini | http://codh.rois.ac.jp/kmnist/dataset/kmnist/ | 数据集下载 | -| 开发引入 | / | url.ini | https://cloudstor.aarnet.edu.au/plus/index.php/s/54h3OuGJhFLwAlQ/download | 数据集下载 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/datasets/mnist.py | Googlenet_ID0447_for_PyTorch/torchvision/datasets/mnist.py | https://raw.githubusercontent.com/facebookresearch/qmnist/master/qmnist-train-images-idx3-ubyte.gz | 数据集下载 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/datasets/mnist.py | Googlenet_ID0447_for_PyTorch/torchvision/datasets/mnist.py | https://raw.githubusercontent.com/facebookresearch/qmnist/master/qmnist-train-labels-idx2-int.gz | 数据集下载 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/datasets/mnist.py | Googlenet_ID0447_for_PyTorch/torchvision/datasets/mnist.py | https://raw.githubusercontent.com/facebookresearch/qmnist/master/qmnist-test-images-idx3-ubyte.gz | 数据集下载 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/datasets/mnist.py | Googlenet_ID0447_for_PyTorch/torchvision/datasets/mnist.py | https://raw.githubusercontent.com/facebookresearch/qmnist/master/qmnist-test-labels-idx2-int.gz | 数据集下载 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/datasets/mnist.py | Googlenet_ID0447_for_PyTorch/torchvision/datasets/mnist.py | 
https://raw.githubusercontent.com/facebookresearch/qmnist/master/xnist-images-idx3-ubyte.xz | 数据集下载 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/datasets/mnist.py | Googlenet_ID0447_for_PyTorch/torchvision/datasets/mnist.py | https://raw.githubusercontent.com/facebookresearch/qmnist/master/xnist-labels-idx2-int.xz | 数据集下载 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/datasets/omniglot.py | Googlenet_ID0447_for_PyTorch/torchvision/datasets/omniglot.py | https://github.com/brendenlake/omniglot/raw/master/python | 数据集下载 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/datasets/phototour.py | Googlenet_ID0447_for_PyTorch/torchvision/datasets/phototour.py | http://matthewalunbrown.com/patchdata/notredame_harris.zip | 数据集下载 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/datasets/phototour.py | Googlenet_ID0447_for_PyTorch/torchvision/datasets/phototour.py | http://matthewalunbrown.com/patchdata/yosemite_harris.zip | 数据集下载 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/datasets/phototour.py | Googlenet_ID0447_for_PyTorch/torchvision/datasets/phototour.py | http://matthewalunbrown.com/patchdata/liberty_harris.zip | 数据集下载 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/datasets/phototour.py | Googlenet_ID0447_for_PyTorch/torchvision/datasets/phototour.py | http://icvl.ee.ic.ac.uk/vbalnt/notredame.zip | 数据集下载 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/datasets/phototour.py | Googlenet_ID0447_for_PyTorch/torchvision/datasets/phototour.py | http://icvl.ee.ic.ac.uk/vbalnt/yosemite.zip | 数据集下载 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/datasets/phototour.py | Googlenet_ID0447_for_PyTorch/torchvision/datasets/phototour.py | http://icvl.ee.ic.ac.uk/vbalnt/liberty.zip | 数据集下载 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/datasets/sbd.py | Googlenet_ID0447_for_PyTorch/torchvision/datasets/sbd.py | http://www.eecs.berkeley.edu/Research/Projects/CS/vision/grouping/semantic_contours/benchmark.tgz | 数据集下载 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/datasets/sbd.py | Googlenet_ID0447_for_PyTorch/torchvision/datasets/sbd.py | http://home.bharathh.info/pubs/codes/SBD/train_noval.txt | 数据集下载 | -| 开发引入 | / | url.ini | http://www.cs.virginia.edu/~vicente/sbucaptions/SBUCaptionedPhotoDataset.tar.gz | 数据集下载 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/datasets/semeion.py | Googlenet_ID0447_for_PyTorch/torchvision/datasets/semeion.py | http://archive.ics.uci.edu/ml/machine-learning-databases/semeion/semeion.data | 数据集下载 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/datasets/stl10.py | Googlenet_ID0447_for_PyTorch/torchvision/datasets/stl10.py | http://ai.stanford.edu/~acoates/stl10/stl10_binary.tar.gz | 数据集下载 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/datasets/svhn.py | Googlenet_ID0447_for_PyTorch/torchvision/datasets/svhn.py | http://ufldl.stanford.edu/housenumbers/train_32x32.mat | 数据集下载 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/datasets/svhn.py | Googlenet_ID0447_for_PyTorch/torchvision/datasets/svhn.py | http://ufldl.stanford.edu/housenumbers/test_32x32.mat | 数据集下载 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/datasets/svhn.py | Googlenet_ID0447_for_PyTorch/torchvision/datasets/svhn.py | http://ufldl.stanford.edu/housenumbers/extra_32x32.mat | 数据集下载 | 
-| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/datasets/usps.py | Googlenet_ID0447_for_PyTorch/torchvision/datasets/usps.py | https://www.csie.ntu.edu.tw/~cjlin/libsvmtools/datasets/multiclass/usps.bz2 | 数据集下载 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/datasets/usps.py | Googlenet_ID0447_for_PyTorch/torchvision/datasets/usps.py | https://www.csie.ntu.edu.tw/~cjlin/libsvmtools/datasets/multiclass/usps.t.bz2 | 数据集下载 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/datasets/utils.py | Googlenet_ID0447_for_PyTorch/torchvision/datasets/utils.py | https://docs.google.com/uc?export=download | 驱动文件下载 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/datasets/voc.py | Googlenet_ID0447_for_PyTorch/torchvision/datasets/voc.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2012/VOCtrainval_11-May-2012.tar | 数据集下载 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/datasets/voc.py | Googlenet_ID0447_for_PyTorch/torchvision/datasets/voc.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2011/VOCtrainval_25-May-2011.tar | 数据集下载 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/datasets/voc.py | Googlenet_ID0447_for_PyTorch/torchvision/datasets/voc.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2010/VOCtrainval_03-May-2010.tar | 数据集下载 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/datasets/voc.py | Googlenet_ID0447_for_PyTorch/torchvision/datasets/voc.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2009/VOCtrainval_11-May-2009.tar | 数据集下载 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/datasets/voc.py | Googlenet_ID0447_for_PyTorch/torchvision/datasets/voc.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2008/VOCtrainval_14-Jul-2008.tar | 数据集下载 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/datasets/voc.py | Googlenet_ID0447_for_PyTorch/torchvision/datasets/voc.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2007/VOCtrainval_06-Nov-2007.tar | 数据集下载 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/alexnet-owt-4df8aa71.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/models/densenet.py | Googlenet_ID0447_for_PyTorch/torchvision/models/densenet.py | https://download.pytorch.org/models/densenet121-a639ec97.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/models/densenet.py | Googlenet_ID0447_for_PyTorch/torchvision/models/densenet.py | https://download.pytorch.org/models/densenet169-b2777c0a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/models/densenet.py | Googlenet_ID0447_for_PyTorch/torchvision/models/densenet.py | https://download.pytorch.org/models/densenet201-c1103571.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/models/densenet.py | Googlenet_ID0447_for_PyTorch/torchvision/models/densenet.py | https://download.pytorch.org/models/densenet161-8d451a50.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/models/detection/faster_rcnn.py | Googlenet_ID0447_for_PyTorch/torchvision/models/detection/faster_rcnn.py | https://download.pytorch.org/models/fasterrcnn_resnet50_fpn_coco-258fb6c6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/models/detection/keypoint_rcnn.py | Googlenet_ID0447_for_PyTorch/torchvision/models/detection/keypoint_rcnn.py | 
https://download.pytorch.org/models/keypointrcnn_resnet50_fpn_coco-9f466800.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/models/detection/mask_rcnn.py | Googlenet_ID0447_for_PyTorch/torchvision/models/detection/mask_rcnn.py | https://download.pytorch.org/models/maskrcnn_resnet50_fpn_coco-bf2d0c1e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/models/googlenet.py | Googlenet_ID0447_for_PyTorch/torchvision/models/googlenet.py | https://download.pytorch.org/models/googlenet-1378be20.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/inception_v3_google-1a9a5a14.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/models/mobilenet.py | Googlenet_ID0447_for_PyTorch/torchvision/models/mobilenet.py | https://download.pytorch.org/models/mobilenet_v2-b0353104.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnet18-5c106cde.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnet50-19c8e357.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/models/resnet.py | Googlenet_ID0447_for_PyTorch/torchvision/models/resnet.py | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/models/resnet.py | Googlenet_ID0447_for_PyTorch/torchvision/models/resnet.py | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/models/resnet.py | Googlenet_ID0447_for_PyTorch/torchvision/models/resnet.py | https://download.pytorch.org/models/wide_resnet50_2-95faca4d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/models/resnet.py | Googlenet_ID0447_for_PyTorch/torchvision/models/resnet.py | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/models/segmentation/segmentation.py | Googlenet_ID0447_for_PyTorch/torchvision/models/segmentation/segmentation.py | https://download.pytorch.org/models/fcn_resnet101_coco-7ecb50ca.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/models/segmentation/segmentation.py | Googlenet_ID0447_for_PyTorch/torchvision/models/segmentation/segmentation.py | https://download.pytorch.org/models/deeplabv3_resnet101_coco-586e9e4e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/models/shufflenetv2.py | Googlenet_ID0447_for_PyTorch/torchvision/models/shufflenetv2.py | https://download.pytorch.org/models/shufflenetv2_x0.5-f707e7126e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/models/shufflenetv2.py | Googlenet_ID0447_for_PyTorch/torchvision/models/shufflenetv2.py | https://download.pytorch.org/models/shufflenetv2_x1-5666bf0f80.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/vgg11-bbd30ac9.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/vgg13-c768596a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/models/vgg.py | 
Googlenet_ID0447_for_PyTorch/torchvision/models/vgg.py | https://download.pytorch.org/models/vgg16-397923af.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/models/vgg.py | Googlenet_ID0447_for_PyTorch/torchvision/models/vgg.py | https://download.pytorch.org/models/vgg19-dcbb9e9d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/models/vgg.py | Googlenet_ID0447_for_PyTorch/torchvision/models/vgg.py | https://download.pytorch.org/models/vgg11_bn-6002323d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/models/vgg.py | Googlenet_ID0447_for_PyTorch/torchvision/models/vgg.py | https://download.pytorch.org/models/vgg13_bn-abd245e5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/models/vgg.py | Googlenet_ID0447_for_PyTorch/torchvision/models/vgg.py | https://download.pytorch.org/models/vgg16_bn-6c64b313.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/models/vgg.py | Googlenet_ID0447_for_PyTorch/torchvision/models/vgg.py | https://download.pytorch.org/models/vgg19_bn-c79401a0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/models/video/resnet.py | Googlenet_ID0447_for_PyTorch/torchvision/models/video/resnet.py | https://download.pytorch.org/models/r3d_18-b3b3357e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/models/video/resnet.py | Googlenet_ID0447_for_PyTorch/torchvision/models/video/resnet.py | https://download.pytorch.org/models/mc3_18-a90a0ba3.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/models/video/resnet.py | Googlenet_ID0447_for_PyTorch/torchvision/models/video/resnet.py | https://download.pytorch.org/models/r2plus1d_18-91a641e6.pth | 下载权重文件 | -| 开发引入 | / | Googlenet_ID0447_for_PyTorch/torchvision/models/mnasnet.py | https://download.pytorch.org/models/mnasnet0.5_top1_67.592-7c6cb539b9.pth | 预训练模型 | -| 开发引入 | / | Googlenet_ID0447_for_PyTorch/torchvision/datasets/kinetics.py | https://deepmind.com/research/open-source/open-source-datasets/kinetics/ | 数据集地址 | -| 开发引入 | / | Googlenet_ID0447_for_PyTorch/torchvision/datasets/cityscapes.py | http://www.cityscapes-dataset.com/ | 数据集地址 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/models/squeezenet.py | Googlenet_ID0447_for_PyTorch/torchvision/csrc/models/squeezenet.h | https://github.com/DeepScale/SqueezeNet/tree/master/SqueezeNet_v1.1 | 源码实现 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/models/segmentation/deeplabv3.py | Googlenet_ID0447_for_PyTorch/torchvision/models/segmentation/deeplabv3.py | https://arxiv.org/abs/1706.05587 | 论文地址 | -| 开发引入 | / | Googlenet_ID0447_for_PyTorch/torchvision/models/resnet.py | https://arxiv.org/pdf/1512.03385.pdf | 论文地址 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/docs/source/models/squeezenet.rst | Googlenet_ID0447_for_PyTorch/torchvision/models/squeezenet.py | https://arxiv.org/abs/1602.07360 | 论文地址 | -| 开发引入 | / | Googlenet_ID0447_for_PyTorch/torchvision/models/resnet.py | https://arxiv.org/pdf/1611.05431.pdf | 论文地址 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/models/densenet.py | Googlenet_ID0447_for_PyTorch/torchvision/csrc/models/densenet.h | https://arxiv.org/pdf/1608.06993.pdf | 论文地址 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/models/resnet.py | Googlenet_ID0447_for_PyTorch/torchvision/models/resnet.py | 
https://arxiv.org/abs/1706.02677 | 论文地址 | -| 开发引入 | / | Googlenet_ID0447_for_PyTorch/torchvision/datasets/semeion.py | http://archive.ics.uci.edu/ml/datasets/semeion+handwritten+digit | 数据集地址 | -| 开发引入 | / | Googlenet_ID0447_for_PyTorch/packaging/pkg_helpers.bash | https://github.com/pytorch/pytorch/pull/23408 | 源码实现 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/docs/make.bat | Googlenet_ID0447_for_PyTorch/docs/make.bat | http://sphinx-doc.org/ | 相关说明 | -| 开发引入 | / | Googlenet_ID0447_for_PyTorch/torchvision/datasets/hmdb51.py | http://serre-lab.clps.brown.edu/resource/hmdb-a-large-human-motion-database/ | 数据集地址 | -| 开发引入 | / | Googlenet_ID0447_for_PyTorch/torchvision/transforms/transforms.py | https://arxiv.org/pdf/1708.04896.pdf | 论文地址 | -| 开发引入 | / | Googlenet_ID0447_for_PyTorch/torchvision/datasets/ucf101.py | https://www.crcv.ucf.edu/data/UCF101.php | 数据集地址 | -| 开发引入 | / | Googlenet_ID0447_for_PyTorch/torchvision/models/mnasnet.py | https://arxiv.org/pdf/1807.11626.pdf | 论文地址 | -| 开发引入 | / | Googlenet_ID0447_for_PyTorch/torchvision/datasets/flickr.py | http://web.engr.illinois.edu/~bplumme2/Flickr30kEntities/ | 数据集地址 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/datasets/utils.py | Googlenet_ID0447_for_PyTorch/torchvision/datasets/utils.py | https://stackoverflow.com/questions/38511444/python-download-files-from-google-drive-using-url | 相关说明 | -| 开发引入 | / | Googlenet_ID0447_for_PyTorch/torchvision/utils.py | https://gist.github.com/anonymous/bf16430f7750c023141c562f3e9f2a91 | 源码实现 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/prototype/datasets/_builtin/cifar.py | Googlenet_ID0447_for_PyTorch/torchvision/datasets/cifar.py | https://www.cs.toronto.edu/~kriz/cifar.html | 数据集地址 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/prototype/datasets/_builtin/sbd.py | Googlenet_ID0447_for_PyTorch/torchvision/datasets/sbd.py | http://home.bharathh.info/pubs/codes/SBD/download.html | 相关说明 | -| 开发引入 | / | Googlenet_ID0447_for_PyTorch/references/video_classification/train.py | https://www.github.com/nvidia/apex | 相关说明 | -| 开发引入 | / | Googlenet_ID0447_for_PyTorch/torchvision/datasets/stl10.py | https://cs.stanford.edu/~acoates/stl10/ | 数据集地址 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/transforms/v2/_type_conversion.py | Googlenet_ID0447_for_PyTorch/torchvision/transforms/functional.py | https://pillow.readthedocs.io/en/latest/handbook/concepts.html#concept-modes | 相关说明 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/datasets/cityscapes.py | Googlenet_ID0447_for_PyTorch/torchvision/datasets/cityscapes.py | https://github.com/mcordts/cityscapesScripts | 源码实现 | -| 开发引入 | / | Googlenet_ID0447_for_PyTorch/torchvision/datasets/coco.py | http://mscoco.org/dataset/#captions-challenge2015 | 数据集地址 | -| 开发引入 | / | Googlenet_ID0447_for_PyTorch/torchvision/datasets/mnist.py | https://www.westernsydney.edu.au/bens/home/reproducible_research/emnist | 相关说明 | -| 开发引入 | / | Googlenet_ID0447_for_PyTorch/references/video_classification/train.py | https://github.com/NVIDIA/apex/tree/master/examples/imagenet | 源码实现 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/models/squeezenet.py | Googlenet_ID0447_for_PyTorch/torchvision/models/squeezenet.py | https://github.com/DeepScale/SqueezeNet/tree/master/SqueezeNet_v1.1 | 源码实现 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/transforms/v2/_geometry.py | 
Googlenet_ID0447_for_PyTorch/torchvision/transforms/transforms.py | https://pillow.readthedocs.io/en/latest/handbook/concepts.html#filters | 相关说明 | -| 开发引入 | / | Googlenet_ID0447_for_PyTorch/torchvision/ops/misc.py | https://github.com/pytorch/pytorch/issues/12013 | 相关说明 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/transforms/functional.py | Googlenet_ID0447_for_PyTorch/torchvision/transforms/functional.py | https://en.wikipedia.org/wiki/Hue | 相关说明 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/models/video/resnet.py | Googlenet_ID0447_for_PyTorch/torchvision/models/video/resnet.py | https://arxiv.org/abs/1711.11248 | 论文地址 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/io/video.py | Googlenet_ID0447_for_PyTorch/torchvision/io/video.py | https://github.com/FFmpeg/FFmpeg/commit/d5a21172283572af587b3d939eba0091484d3263 | 源码实现 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/transforms/v2/_geometry.py | Googlenet_ID0447_for_PyTorch/torchvision/transforms/functional.py | https://pillow.readthedocs.io/en/latest/handbook/concepts.html#filters | 相关说明 | -| 开发引入 | / | Googlenet_ID0447_for_PyTorch/packaging/windows/internal/auth.bat | https://docs.microsoft.com/en-us/azure/devops/pipelines/build/triggers?tabs=yaml&view=vsts#my-build-didnt-run-what-happened | 相关说明 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/models/alexnet.py | Googlenet_ID0447_for_PyTorch/torchvision/models/alexnet.py | https://arxiv.org/abs/1404.5997 | 论文地址 | -| 开发引入 | / | Googlenet_ID0447_for_PyTorch/torchvision/datasets/lsun.py | http://lsun.cs.princeton.edu | 相关说明 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/prototype/datasets/_builtin/usps.py | Googlenet_ID0447_for_PyTorch/torchvision/datasets/usps.py | https://www.csie.ntu.edu.tw/~cjlin/libsvmtools/datasets/multiclass.html#usps | 相关说明 | -| 开发引入 | / | Googlenet_ID0447_for_PyTorch/references/classification/train.py | https://github.com/NVIDIA/apex/tree/master/examples/imagenet | 源码实现 | -| 开发引入 | / | Googlenet_ID0447_for_PyTorch/packaging/conda/build_vision.sh | https://github.com/conda/conda-build/issues/3285 | 相关说明 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/models/mnasnet.py | Googlenet_ID0447_for_PyTorch/torchvision/models/mnasnet.py | https://download.pytorch.org/models/mnasnet1.0_top1_73.512-f206786ef8.pth | 预训练模型 | -| 开发引入 | / | Googlenet_ID0447_for_PyTorch/references/classification/train.py | https://www.github.com/nvidia/apex | 相关说明 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/models/quantization/inception.py | Googlenet_ID0447_for_PyTorch/torchvision/csrc/models/inception.h | http://arxiv.org/abs/1512.00567 | 论文地址 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/models/densenet.py | Googlenet_ID0447_for_PyTorch/torchvision/models/densenet.py | https://arxiv.org/pdf/1608.06993.pdf | 论文地址 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/ops/poolers.py | Googlenet_ID0447_for_PyTorch/torchvision/ops/feature_pyramid_network.py | https://arxiv.org/abs/1612.03144 | 论文地址 | -| 开发引入 | / | Googlenet_ID0447_for_PyTorch/torchvision/datasets/sbd.py | https://docs.scipy.org/doc/ | 相关说明 | -| 开发引入 | / | Googlenet_ID0447_for_PyTorch/torchvision/models/vgg.py | https://arxiv.org/pdf/1409.1556.pdf | 论文地址 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/models/shufflenetv2.py | Googlenet_ID0447_for_PyTorch/torchvision/models/shufflenetv2.py 
| https://arxiv.org/abs/1807.11164 | 论文地址 | -| 开发引入 | / | Googlenet_ID0447_for_PyTorch/torchvision/datasets/imagenet.py | http://image-net.org/ | 相关说明 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/transforms/functional.py | Googlenet_ID0447_for_PyTorch/torchvision/transforms/functional.py | https://en.wikipedia.org/wiki/Gamma_correction | 相关说明 | -| 开发引入 | / | Googlenet_ID0447_for_PyTorch/torchvision/datasets/flickr.py | http://nlp.cs.illinois.edu/HockenmaierGroup/8k-pictures.html | 数据集地址 | -| 开发引入 | / | Googlenet_ID0447_for_PyTorch/test/test_cpp_models.py | https://github.com/pytorch/vision/issues/1191 | 相关说明 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/docs/source/conf.py | Googlenet_ID0447_for_PyTorch/docs/source/conf.py | http://stackoverflow.com/a/41184353/3343043 | 相关说明 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/models/densenet.py | Googlenet_ID0447_for_PyTorch/torchvision/models/densenet.py | https://arxiv.org/pdf/1707.06990.pdf | 论文地址 | -| 开发引入 | / | Googlenet_ID0447_for_PyTorch/torchvision/datasets/coco.py | http://mscoco.org/dataset/#detections-challenge2016 | 数据集地址 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/prototype/datasets/_builtin/mnist.py | Googlenet_ID0447_for_PyTorch/torchvision/datasets/mnist.py | https://github.com/facebookresearch/qmnist | 源码实现 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/models/_utils.py | Googlenet_ID0447_for_PyTorch/torchvision/models/mobilenet.py | https://github.com/tensorflow/models/blob/master/research/slim/nets/mobilenet/mobilenet.py | 源码实现 | -| 开发引入 | / | Googlenet_ID0447_for_PyTorch/torchvision/models/resnet.py | https://arxiv.org/pdf/1605.07146.pdf | 论文地址 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/models/quantization/mobilenetv2.py | Googlenet_ID0447_for_PyTorch/torchvision/models/mobilenet.py | https://arxiv.org/abs/1801.04381 | 论文地址 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/models/quantization/googlenet.py | Googlenet_ID0447_for_PyTorch/torchvision/models/googlenet.py | http://arxiv.org/abs/1409.4842 | 论文地址 | -| 开发引入 | / | Googlenet_ID0447_for_PyTorch/torchvision/datasets/celeba.py | http://mmlab.ie.cuhk.edu.hk/projects/CelebA.html | 相关说明 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/references/similarity/loss.py | Googlenet_ID0447_for_PyTorch/references/similarity/loss.py | https://omoindrot.github.io/triplet-loss | 相关说明 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/prototype/datasets/_builtin/mnist.py | Googlenet_ID0447_for_PyTorch/torchvision/datasets/mnist.py | https://github.com/zalandoresearch/fashion-mnist | 源码实现 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/datasets/samplers/clip_sampler.py | Googlenet_ID0447_for_PyTorch/references/video_classification/sampler.py | https://github.com/pytorch/pytorch/issues/23430 | 相关说明 | -| 开发引入 | / | Googlenet_ID0447_for_PyTorch/torchvision/io/video.py | https://github.com/mikeboers/PyAV#installation | 源码实现 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/models/resnet.py | Googlenet_ID0447_for_PyTorch/torchvision/csrc/models/resnet.h | https://arxiv.org/abs/1706.02677 | 论文地址 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/models/quantization/inception.py | Googlenet_ID0447_for_PyTorch/torchvision/models/inception.py | http://arxiv.org/abs/1512.00567 | 论文地址 | -| 开源代码引入 | 
https://github.com/pytorch/vision/blob/main/torchvision/transforms/v2/_type_conversion.py | Googlenet_ID0447_for_PyTorch/torchvision/transforms/transforms.py | https://pillow.readthedocs.io/en/latest/handbook/concepts.html#concept-modes | 相关说明 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/docs/source/models/squeezenet.rst | Googlenet_ID0447_for_PyTorch/torchvision/csrc/models/squeezenet.h | https://arxiv.org/abs/1602.07360 | 论文地址 | -| 开发引入 | / | Googlenet_ID0447_for_PyTorch/torchvision/datasets/mnist.py | https://github.com/rois-codh/kmnist | 源码实现 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/references/similarity/loss.py | Googlenet_ID0447_for_PyTorch/references/similarity/loss.py | https://github.com/omoindrot/tensorflow-triplet-loss | 源码实现 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/datasets/phototour.py | Googlenet_ID0447_for_PyTorch/torchvision/datasets/phototour.py | http://phototour.cs.washington.edu/patches/default.htm | 相关说明 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/ops/boxes.py | Googlenet_ID0447_for_PyTorch/torchvision/ops/boxes.py | https://github.com/kuangliu/torchcv/blob/master/torchcv/utils/box.py | 源码实现 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/models/alexnet.py | Googlenet_ID0447_for_PyTorch/torchvision/csrc/models/alexnet.h | https://arxiv.org/abs/1404.5997 | 论文地址 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/main/torchvision/datasets/folder.py | Googlenet_ID0447_for_PyTorch/torchvision/datasets/folder.py | https://github.com/python-pillow/Pillow/issues/835 | 相关说明 | +| 文件位置 | 公网地址 | 公网地址用途 | +|----------------------------------------------------------------------------------------------------------------------------------|----------------------------------------------------------------------------------------------------|----------------| +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/.circleci/config.yml | https://repo.anaconda.com/miniconda/Miniconda3-latest-MacOSX-x86_64.sh | miniconda下载链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/.circleci/config.yml.in | https://repo.anaconda.com/miniconda/Miniconda3-latest-MacOSX-x86_64.sh | miniconda下载链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/packaging/pkg_helpers.bash | https://download.pytorch.org/whl/nightly/${WHEEL_DIR}torch_nightly.html | 三方库连接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/packaging/pkg_helpers.bash | https://download.pytorch.org/whl/nightly/torch_nightly.html | 三方库连接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/packaging/pkg_helpers.bash | https://download.pytorch.org/whl/torch_stable.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/setup.py | soumith@pytorch.org | 作者邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/torchvision/datasets/cifar.py | https://www.cs.toronto.edu/~kriz/cifar-10-python.tar.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/torchvision/datasets/cifar.py | https://www.cs.toronto.edu/~kriz/cifar-100-python.tar.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/torchvision/datasets/hmdb51.py | http://serre-lab.clps.brown.edu/wp-content/uploads/2013/10/hmdb51_org.rar | 数据集链接 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/torchvision/datasets/hmdb51.py | http://serre-lab.clps.brown.edu/wp-content/uploads/2013/10/test_train_splits.rar | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/torchvision/datasets/mnist.py | https://raw.githubusercontent.com/facebookresearch/qmnist/master/qmnist-train-images-idx3-ubyte.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/torchvision/datasets/mnist.py | https://raw.githubusercontent.com/facebookresearch/qmnist/master/qmnist-test-images-idx3-ubyte.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/torchvision/datasets/mnist.py | https://raw.githubusercontent.com/facebookresearch/qmnist/master/qmnist-test-labels-idx2-int.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/torchvision/datasets/mnist.py | https://raw.githubusercontent.com/facebookresearch/qmnist/master/qmnist-train-labels-idx2-int.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/torchvision/datasets/mnist.py | https://raw.githubusercontent.com/facebookresearch/qmnist/master/xnist-images-idx3-ubyte.xz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/torchvision/datasets/mnist.py | https://raw.githubusercontent.com/facebookresearch/qmnist/master/xnist-labels-idx2-int.xz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/torchvision/datasets/phototour.py | http://matthewalunbrown.com/patchdata/yosemite_harris.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/torchvision/datasets/phototour.py | http://matthewalunbrown.com/patchdata/notredame_harris.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/torchvision/datasets/phototour.py | http://matthewalunbrown.com/patchdata/liberty_harris.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/torchvision/datasets/phototour.py | http://icvl.ee.ic.ac.uk/vbalnt/yosemite.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/torchvision/datasets/phototour.py | http://icvl.ee.ic.ac.uk/vbalnt/notredame.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/torchvision/datasets/phototour.py | http://icvl.ee.ic.ac.uk/vbalnt/liberty.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/torchvision/datasets/sbd.py | http://home.bharathh.info/pubs/codes/SBD/train_noval.txt | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/torchvision/datasets/sbd.py | http://www.eecs.berkeley.edu/Research/Projects/CS/vision/grouping/semantic_contours/benchmark.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/torchvision/datasets/semeion.py | http://archive.ics.uci.edu/ml/machine-learning-databases/semeion/semeion.data | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/torchvision/datasets/stl10.py | http://ai.stanford.edu/~acoates/stl10/stl10_binary.tar.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/torchvision/datasets/svhn.py | 
http://ufldl.stanford.edu/housenumbers/train_32x32.mat | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/torchvision/datasets/svhn.py | http://ufldl.stanford.edu/housenumbers/test_32x32.mat | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/torchvision/datasets/svhn.py | http://ufldl.stanford.edu/housenumbers/extra_32x32.mat | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/torchvision/datasets/voc.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2007/VOCtrainval_06-Nov-2007.tar | 下载数据集 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/torchvision/datasets/voc.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2008/VOCtrainval_14-Jul-2008.tar | 下载数据集 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/torchvision/datasets/voc.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2009/VOCtrainval_11-May-2009.tar | 下载数据集 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/torchvision/datasets/voc.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2010/VOCtrainval_03-May-2010.tar | 下载数据集 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/torchvision/datasets/voc.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2011/VOCtrainval_25-May-2011.tar | 下载数据集 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/torchvision/datasets/voc.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2012/VOCtrainval_11-May-2012.tar | 下载数据集 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/torchvision/models/densenet.py | https://download.pytorch.org/models/densenet201-c1103571.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/torchvision/models/densenet.py | https://download.pytorch.org/models/densenet169-b2777c0a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/torchvision/models/densenet.py | https://download.pytorch.org/models/densenet161-8d451a50.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/torchvision/models/densenet.py | https://download.pytorch.org/models/densenet121-a639ec97.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/torchvision/models/detection/faster_rcnn.py | https://download.pytorch.org/models/fasterrcnn_resnet50_fpn_coco-258fb6c6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/torchvision/models/detection/keypoint_rcnn.py | https://download.pytorch.org/models/keypointrcnn_resnet50_fpn_coco-9f466800.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/torchvision/models/detection/mask_rcnn.py | https://download.pytorch.org/models/maskrcnn_resnet50_fpn_coco-bf2d0c1e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/torchvision/models/googlenet.py | https://download.pytorch.org/models/googlenet-1378be20.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/torchvision/models/mnasnet.py | https://download.pytorch.org/models/mnasnet1.0_top1_73.512-f206786ef8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/torchvision/models/mnasnet.py | 
https://download.pytorch.org/models/mnasnet0.5_top1_67.592-7c6cb539b9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/torchvision/models/mobilenet.py | https://download.pytorch.org/models/mobilenet_v2-b0353104.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/torchvision/models/resnet.py | https://download.pytorch.org/models/wide_resnet50_2-95faca4d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/torchvision/models/resnet.py | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/torchvision/models/resnet.py | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/torchvision/models/resnet.py | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/torchvision/models/segmentation/segmentation.py | https://download.pytorch.org/models/fcn_resnet101_coco-7ecb50ca.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/torchvision/models/segmentation/segmentation.py | https://download.pytorch.org/models/deeplabv3_resnet101_coco-586e9e4e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/torchvision/models/shufflenetv2.py | https://download.pytorch.org/models/shufflenetv2_x1-5666bf0f80.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/torchvision/models/shufflenetv2.py | https://download.pytorch.org/models/shufflenetv2_x0.5-f707e7126e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/torchvision/models/vgg.py | https://download.pytorch.org/models/vgg19-dcbb9e9d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/torchvision/models/vgg.py | https://download.pytorch.org/models/vgg19_bn-c79401a0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/torchvision/models/vgg.py | https://download.pytorch.org/models/vgg16-397923af.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/torchvision/models/vgg.py | https://download.pytorch.org/models/vgg16_bn-6c64b313.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/torchvision/models/vgg.py | https://download.pytorch.org/models/vgg13_bn-abd245e5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/torchvision/models/vgg.py | https://download.pytorch.org/models/vgg11_bn-6002323d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/torchvision/models/video/resnet.py | https://download.pytorch.org/models/r3d_18-b3b3357e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/torchvision/models/video/resnet.py | https://download.pytorch.org/models/r2plus1d_18-91a641e6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/torchvision/models/video/resnet.py | https://download.pytorch.org/models/mc3_18-a90a0ba3.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/url.ini | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/url.ini | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/url.ini | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/url.ini | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/url.ini | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/url.ini | https://download.pytorch.org/models/vgg13-c768596a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/url.ini | https://download.pytorch.org/models/vgg11-bbd30ac9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/url.ini | https://download.pytorch.org/models/squeezenet1_1-f364aa15.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/url.ini | https://download.pytorch.org/models/squeezenet1_0-a815701f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/url.ini | https://www.7-zip.org/a/7z1805-x64.exe | 依赖地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/url.ini | https://download.pytorch.org/vision_tests/io/ | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/url.ini | http://www.vision.caltech.edu/Image_Datasets/Caltech256/256_ObjectCategories.tar | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/url.ini | http://www.vision.caltech.edu/Image_Datasets/Caltech101/101_ObjectCategories.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/url.ini | https://download.pytorch.org/whl | 安装依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/url.ini | http://www.cs.virginia.edu/~vicente/sbucaptions/SBUCaptionedPhotoDataset.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/url.ini | https://www.dropbox.com/s/9mcolalfdj4n979/NvToolsExt.7z?dl=1 | 下载依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/url.ini | https://download.pytorch.org/whl/nightly | 三方库连接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/url.ini | http://yann.lecun.com/exdb/mnist/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/url.ini | https://repo.continuum.io/miniconda/Miniconda3-latest-Windows-x86_64.exe | miniconda下载链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/url.ini | https://repo.anaconda.com/miniconda/Miniconda3-latest-MacOSX-x86_64.sh | miniconda下载链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/url.ini | https://dev.azure.com/pytorch | 下载依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/url.ini | https://download.pytorch.org/models/inception_v3_google-1a9a5a14.pth | 权重地址 
| +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/url.ini | https://download.pytorch.org/models/alexnet-owt-4df8aa71.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/url.ini | http://www.image-net.org/challenges/LSVRC/2012/nnoupb/ILSVRC2012_devkit_t12.tar.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/url.ini | https://www.dropbox.com/s/e0prhgsrbyfi4ov/cudnn-9.1-windows7-x64-v7.zip?dl=1 | 下载依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/url.ini | https://www.dropbox.com/s/6p0xyqh472nu8m1/cudnn-9.0-windows7-x64-v7.zip?dl=1 | 下载依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/url.ini | http://codh.rois.ac.jp/kmnist/dataset/kmnist/ | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/url.ini | ftp://ftp.wwpdb.org/pub/pdb/data/status/obsolete.dat | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/url.ini | http://www.image-net.org/challenges/LSVRC/2012/nnoupb/ILSVRC2012_img_val.tar | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/url.ini | http://www.image-net.org/challenges/LSVRC/2012/nnoupb/ILSVRC2012_img_train.tar | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/url.ini | https://fonts.googleapis.com/css?family=Lato | 下载配置 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/url.ini | http://fashion-mnist.s3-website.eu-central-1.amazonaws.com/ | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/url.ini | https://www.dropbox.com/s/7a4sbq0dln6v7t2/cuda_9.1.85_windows.7z?dl=1 | cuda下载链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/url.ini | https://www.dropbox.com/s/z5b7ryz0zrimntl/cuda_9.0.176_windows.7z?dl=1 | cuda下载链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/url.ini | https://repo.continuum.io/miniconda/Miniconda3-latest-Linux-x86_64.sh | miniconda链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Googlenet_ID0447_for_PyTorch/url.ini | https://cloudstor.aarnet.edu.au/plus/index.php/s/54h3OuGJhFLwAlQ/download | 数据集链接 | \ No newline at end of file diff --git a/PyTorch/built-in/cv/classification/MobileNetV2_for_PyTorch/public_address_statement.md b/PyTorch/built-in/cv/classification/MobileNetV2_for_PyTorch/public_address_statement.md index 7dc9d77fd30bca99d276eb75cb8cb7272ffcada9..f59edf67f5a95d3598e78470b61c91a757e3c6ca 100644 --- a/PyTorch/built-in/cv/classification/MobileNetV2_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/cv/classification/MobileNetV2_for_PyTorch/public_address_statement.md @@ -1,6 +1,4 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|------|--------|---------|----------------------------------------------------------------------|--------| -| 开发引入 | / | url.ini | https://download.pytorch.org/models/mobilenet_v2-b0353104.pth | 下载权重文件 | -| 开发引入 | / | MobileNetV2_for_PyTorch/train/mobilenet.py | https://github.com/tensorflow/models/blob/master/research/slim/nets/mobilenet/mobilenet.py | 源码实现 | -| 开发引入 | / | MobileNetV2_for_PyTorch/train/mobilenet.py | https://arxiv.org/abs/1801.04381 | 论文地址 | -| 开发引入 | / | MobileNetV2_for_PyTorch/infer/sdk/models/mobilenet/imagenet1000_clsidx_to_labels.names | 
https://gist.github.com/yrevar/942d3a0ac09ec9e5eb3a | 源码实现 | +| 文件位置 | 公网地址 | 公网地址用途 | +|------------------------------------------------------------------------------------------------------------------|---------------------------------------------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/MobileNetV2_for_PyTorch/train/mobilenetv2_8p_main_anycard.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/MobileNetV2_for_PyTorch/url.ini | https://download.pytorch.org/models/mobilenet_v2-b0353104.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/built-in/cv/classification/MobileNetV3-Large_ID1784_for_PyTorch/public_address_statement.md b/PyTorch/built-in/cv/classification/MobileNetV3-Large_ID1784_for_PyTorch/public_address_statement.md index 8486b5147a84956c489b45b047465e0440dcad04..2fffa32c88870ebd304bb1f26543f5a312ec796d 100644 --- a/PyTorch/built-in/cv/classification/MobileNetV3-Large_ID1784_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/cv/classification/MobileNetV3-Large_ID1784_for_PyTorch/public_address_statement.md @@ -1,752 +1,5 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|------|--------|---------|----------------------------------------------------------------------|--------| -| 开发引入 | / | url.ini | https://download.pytorch.org/models/mobilenet_v2-b0353104.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/mobilenet_v3_large-8738ca79.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/mobilenet_v3_small-047dcff4.pth | 下载权重文件 | -| 开发引入 | / | MobileNetV3-Large_ID1784_for_PyTorch/mobilenetv2.py | https://arxiv.org/abs/1801.04381 | 论文地址 | -| 开发引入 | / | MobileNetV3-Large_ID1784_for_PyTorch/mobilenetv2.py | https://github.com/tensorflow/models/blob/master/research/slim/nets/mobilenet/mobilenet.py | 源码实现 | -| 开发引入 | / | MobileNetV3-Large_ID1784_for_PyTorch/main.py | https://github.com/NVIDIA/apex/tree/master/examples/imagenet | 源码实现 | -| 开发引入 | / | MobileNetV3-Large_ID1784_for_PyTorch/auto_augment.py | https://arxiv.org/abs/1912.02781 | 论文地址 | -| 开发引入 | / | MobileNetV3-Large_ID1784_for_PyTorch/auto_augment.py | https://arxiv.org/abs/1805.09501 | 论文地址 | -| 开发引入 | / | MobileNetV3-Large_ID1784_for_PyTorch/auto_augment.py | https://github.com/google-research/augmix/blob/master/imagenet.py | 源码实现 | -| 开发引入 | / | MobileNetV3-Large_ID1784_for_PyTorch/mobilenetv3.py | https://arxiv.org/abs/1905.02244 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/.github/ISSUE_TEMPLATE/config.yml | MobileNetV3_large_100_for_PyTorch/validate.py | https://github.com/rwightman | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/train.py | MobileNetV3_large_100_for_PyTorch/train_8p.py | https://github.com/pytorch/examples/tree/master/imagenet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/train.py | MobileNetV3_large_100_for_PyTorch/train_8p.py | https://github.com/NVIDIA/apex/tree/master/examples/imagenet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/.github/ISSUE_TEMPLATE/config.yml | MobileNetV3_large_100_for_PyTorch/train_8p.py | https://github.com/rwightman | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/train.py | MobileNetV3_large_100_for_PyTorch/train_1p.py | https://github.com/pytorch/examples/tree/master/imagenet | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/main/train.py | MobileNetV3_large_100_for_PyTorch/train_1p.py | https://github.com/NVIDIA/apex/tree/master/examples/imagenet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/.github/ISSUE_TEMPLATE/config.yml | MobileNetV3_large_100_for_PyTorch/train_1p.py | https://github.com/rwightman | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/utils/model_ema.py | MobileNetV3_large_100_for_PyTorch/timm/utils/model_ema.py | https://www.tensorflow.org/api_docs/python/tf/train/ExponentialMovingAverage | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/utils/model_ema.py | MobileNetV3_large_100_for_PyTorch/timm/utils/model_ema.py | https://www.tensorflow.org/api_docs/python/tf/train/ExponentialMovingAverage | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/utils/misc.py | MobileNetV3_large_100_for_PyTorch/timm/utils/misc.py | http://www.codinghorror.com/blog/archives/001018.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/utils/agc.py | https://arxiv.org/abs/2102.06171 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/utils/agc.py | https://github.com/deepmind/deepmind-research/tree/master/nfnets | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/utils/agc.py | MobileNetV3_large_100_for_PyTorch/timm/utils/agc.py | https://gist.github.com/lucidrains/0d6560077edac419ab5d3aa29e674d5c | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/scheduler/tanh_lr.py | MobileNetV3_large_100_for_PyTorch/timm/scheduler/tanh_lr.py | https://arxiv.org/abs/1806.01593 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/README.md | MobileNetV3_large_100_for_PyTorch/timm/scheduler/scheduler.py | https://github.com/pytorch/fairseq/tree/master/fairseq/optim/lr_scheduler | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/README.md | MobileNetV3_large_100_for_PyTorch/timm/scheduler/scheduler.py | https://github.com/allenai/allennlp/tree/master/allennlp/training/learning_rate_schedulers | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/README.md | MobileNetV3_large_100_for_PyTorch/timm/scheduler/cosine_lr.py | https://arxiv.org/abs/1608.03983 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/scheduler/cosine_lr.py | MobileNetV3_large_100_for_PyTorch/timm/scheduler/cosine_lr.py | https://github.com/allenai/allennlp/blob/master/allennlp/training/learning_rate_schedulers/cosine.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/optim/sgdp.py | MobileNetV3_large_100_for_PyTorch/timm/optim/sgdp.py | https://github.com/clovaai/AdamP/blob/master/adamp/sgdp.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/README.md | MobileNetV3_large_100_for_PyTorch/timm/optim/sgdp.py | https://arxiv.org/abs/2006.08217 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/optim/adamp.py | MobileNetV3_large_100_for_PyTorch/timm/optim/sgdp.py | https://github.com/clovaai/AdamP | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/main/timm/optim/rmsprop_tf.py | MobileNetV3_large_100_for_PyTorch/timm/optim/rmsprop_tf.py | https://github.com/pytorch/pytorch/blob/063946d2b3f3f1e953a2a3b54e0b34f1393de295/torch/optim/rmsprop.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/optim/rmsprop_tf.py | MobileNetV3_large_100_for_PyTorch/timm/optim/rmsprop_tf.py | https://github.com/pytorch/pytorch/blob/master/LICENSE | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/optim/rmsprop_tf.py | MobileNetV3_large_100_for_PyTorch/timm/optim/rmsprop_tf.py | http://www.cs.toronto.edu/~tijmen/csc321/slides/lecture_slides_lec6.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/optim/rmsprop_tf.py | MobileNetV3_large_100_for_PyTorch/timm/optim/rmsprop_tf.py | https://arxiv.org/pdf/1308.0850v5.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/optim/adamw.py | MobileNetV3_large_100_for_PyTorch/timm/optim/rmsprop_tf.py | https://arxiv.org/abs/1711.05101 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/README.md | MobileNetV3_large_100_for_PyTorch/timm/optim/radam.py | https://github.com/LiyuanLucasLiu/RAdam | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/README.md | MobileNetV3_large_100_for_PyTorch/timm/optim/radam.py | https://arxiv.org/abs/1908.03265 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/optim/nvnovograd.py | MobileNetV3_large_100_for_PyTorch/timm/optim/nvnovograd.py | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechRecognition/Jasper | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/README.md | MobileNetV3_large_100_for_PyTorch/timm/optim/nvnovograd.py | https://arxiv.org/abs/1905.11286 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/optim/nadam.py | ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/optim/npu_fused_sgd.py | http://www.cs.toronto.edu/%7Ehinton/absps/momentum.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/README.md | MobileNetV3_large_100_for_PyTorch/timm/optim/novograd.py | https://github.com/convergence-lab/novograd | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/README.md | MobileNetV3_large_100_for_PyTorch/timm/optim/novograd.py | https://arxiv.org/abs/1905.11286 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/optim/nadam.py | MobileNetV3_large_100_for_PyTorch/timm/optim/nadam.py | http://cs229.stanford.edu/proj2015/054_report.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/optim/nadam.py | MobileNetV3_large_100_for_PyTorch/timm/optim/nadam.py | http://www.cs.toronto.edu/~fritz/absps/momentum.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/optim/nadam.py | MobileNetV3_large_100_for_PyTorch/timm/optim/nadam.py | https://github.com/pytorch/pytorch/pull/1408 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/README.md | MobileNetV3_large_100_for_PyTorch/timm/optim/lookahead.py | https://github.com/alphadl/lookahead.pytorch | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/main/README.md | MobileNetV3_large_100_for_PyTorch/timm/optim/lookahead.py | https://arxiv.org/abs/1907.08610 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/optim/adamw.py | MobileNetV3_large_100_for_PyTorch/timm/optim/adamw.py | https://arxiv.org/abs/1412.6980 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/optim/adamw.py | MobileNetV3_large_100_for_PyTorch/timm/optim/adamw.py | https://arxiv.org/abs/1711.05101 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/optim/adamw.py | MobileNetV3_large_100_for_PyTorch/timm/optim/adamw.py | https://openreview.net/forum?id=ryQu7f-RZ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/optim/adamp.py | MobileNetV3_large_100_for_PyTorch/timm/optim/adamp.py | https://github.com/clovaai/AdamP/blob/master/adamp/adamp.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/README.md | MobileNetV3_large_100_for_PyTorch/timm/optim/adamp.py | https://arxiv.org/abs/2006.08217 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/optim/adamp.py | MobileNetV3_large_100_for_PyTorch/timm/optim/adamp.py | https://github.com/clovaai/AdamP | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/optim/adahessian.py | MobileNetV3_large_100_for_PyTorch/timm/optim/adahessian.py | https://github.com/davda54/ada-hessian/blob/master/ada_hessian.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/README.md | MobileNetV3_large_100_for_PyTorch/timm/optim/adafactor.py | https://github.com/pytorch/fairseq/blob/master/fairseq/optim/adafactor.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/README.md | MobileNetV3_large_100_for_PyTorch/timm/optim/adafactor.py | https://arxiv.org/abs/1804.04235 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/xception_aligned.py | MobileNetV3_large_100_for_PyTorch/timm/models/xception_aligned.py | https://github.com/tensorflow/models/blob/master/research/deeplab/g3doc/model_zoo.md | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/xception.md | MobileNetV3_large_100_for_PyTorch/timm/models/xception_aligned.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_xception_41-e6439c97.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/xception.md | MobileNetV3_large_100_for_PyTorch/timm/models/xception_aligned.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_xception_65-c9ae96e8.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/xception.md | MobileNetV3_large_100_for_PyTorch/timm/models/xception_aligned.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_xception_71-8eec7df1.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/xception.py | MobileNetV3_large_100_for_PyTorch/timm/models/xception.py | https://github.com/tstandley/Xception-PyTorch | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/xception.py | 
MobileNetV3_large_100_for_PyTorch/timm/models/xception.py | https://arxiv.org/pdf/1610.02357.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/xception.md | MobileNetV3_large_100_for_PyTorch/timm/models/xception.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/xception-43020ad28.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/xception.py | MobileNetV3_large_100_for_PyTorch/timm/models/xception.py | https://arxiv.org/pdf/1610.02357.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vovnet.py | MobileNetV3_large_100_for_PyTorch/timm/models/vovnet.py | https://arxiv.org/abs/1904.09730 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/vovnet.py | https://arxiv.org/abs/1911.06667 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/vovnet.py | https://github.com/youngwanLEE/vovnet-detectron2 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vovnet.py | MobileNetV3_large_100_for_PyTorch/timm/models/vovnet.py | https://github.com/stigma0617/VoVNet.pytorch/blob/master/models_vovnet/vovnet.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/vovnet.py | https://github.com/youngwanLEE/vovnet-detectron2 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vovnet.py | MobileNetV3_large_100_for_PyTorch/timm/models/vovnet.py | https://github.com/stigma0617/VoVNet.pytorch/blob/master/models_vovnet/vovnet.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/ese-vovnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/vovnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ese_vovnet19b_dw-a8741004.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/ese-vovnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/vovnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ese_vovnet39b-f912fe73.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://arxiv.org/abs/2010.11929 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://github.com/google-research/vision_transformer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/layers/pos_embed_sincos.py | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://github.com/lucidrains/vit-pytorch | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vision_transformer.py | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://github.com/karpathy/minGPT | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | 
MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://github.com/facebookresearch/deit | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/README.md | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://arxiv.org/abs/2012.12877 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/vision-transformer.md | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/vit_small_p16_224-15ec54c9.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/vision-transformer.md | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_p16_224-80ecf9dd.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/vision-transformer.md | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_p16_384-83fb41ba.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/vision-transformer.md | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_p32_384-830016f5.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/vision-transformer.md | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_large_p16_224-4ee7a4dc.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/vision-transformer.md | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_large_p16_384-b3be5167.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vision_transformer.py | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_large_p32_384-9b920ba8.pth | 预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_patch16_224_in21k-e5005f0a.pth | 预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_patch32_224_in21k-8db57226.pth | 预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_large_patch16_224_in21k-606da67d.pth | 预训练模型 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vision_transformer.py | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_large_patch32_224_in21k-9046d2e7.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vision_transformer_hybrid.py | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_resnet50_224_in21k-6f7c7740.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/vision-transformer.md | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_resnet50_384-9fd3c705.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/deit.py | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_tiny_patch16_224-a1311bcf.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/deit.py | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_small_patch16_224-cd65a155.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/deit.py | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_224-b5f2ef4d.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/deit.py | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_384-8de9b5d1.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/deit.py | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_tiny_distilled_patch16_224-b40b3cf7.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/deit.py | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_small_distilled_patch16_224-649709d9.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/deit.py | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_224-df68dfff.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/deit.py | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_384-d0272ac0.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://arxiv.org/abs/2010.11929 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/README.md | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://arxiv.org/abs/2012.12877 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | 
https://github.com/facebookresearch/deit | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vision_transformer.py | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://github.com/google-research/vision_transformer/blob/00883dd691c63a6830751563748663526e811cee/vit_jax/checkpoint.py#L224 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://arxiv.org/abs/2010.11929 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://github.com/google-research/vision_transformer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://arxiv.org/abs/2010.11929 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://arxiv.org/abs/2010.11929 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://github.com/google-research/vision_transformer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://arxiv.org/abs/2010.11929 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://github.com/google-research/vision_transformer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://arxiv.org/abs/2010.11929 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://github.com/google-research/vision_transformer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://arxiv.org/abs/2010.11929 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://arxiv.org/abs/2010.11929 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://github.com/google-research/vision_transformer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://arxiv.org/abs/2010.11929 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://github.com/google-research/vision_transformer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | 
MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://arxiv.org/abs/2010.11929 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://github.com/google-research/vision_transformer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://arxiv.org/abs/2010.11929 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://github.com/google-research/vision_transformer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://arxiv.org/abs/2010.11929 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://github.com/google-research/vision_transformer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://arxiv.org/abs/2010.11929 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://github.com/google-research/vision_transformer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://arxiv.org/abs/2010.11929 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://github.com/google-research/vision_transformer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://arxiv.org/abs/2010.11929 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://github.com/google-research/vision_transformer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://arxiv.org/abs/2010.11929 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://github.com/google-research/vision_transformer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/README.md | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://arxiv.org/abs/2012.12877 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://github.com/facebookresearch/deit | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/README.md | 
MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://arxiv.org/abs/2012.12877 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://github.com/facebookresearch/deit | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/README.md | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://arxiv.org/abs/2012.12877 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://github.com/facebookresearch/deit | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/README.md | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://arxiv.org/abs/2012.12877 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://github.com/facebookresearch/deit | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/README.md | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://arxiv.org/abs/2012.12877 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://github.com/facebookresearch/deit | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/README.md | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://arxiv.org/abs/2012.12877 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://github.com/facebookresearch/deit | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/README.md | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://arxiv.org/abs/2012.12877 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://github.com/facebookresearch/deit | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/README.md | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://arxiv.org/abs/2012.12877 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://github.com/facebookresearch/deit | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/vgg.py | https://github.com/pytorch/vision | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg11-bbd30ac9.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/vgg.py | 
https://download.pytorch.org/models/vgg13-c768596a.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg16-397923af.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg19-dcbb9e9d.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg11_bn-6002323d.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg13_bn-abd245e5.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg16_bn-6c64b313.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg19_bn-c79401a0.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/vgg.py | https://arxiv.org/pdf/1409.1556.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/vgg.py | https://arxiv.org/pdf/1409.1556.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/vgg.py | https://arxiv.org/pdf/1409.1556.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/vgg.py | https://arxiv.org/pdf/1409.1556.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/vgg.py | https://arxiv.org/pdf/1409.1556.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/vgg.py | https://arxiv.org/pdf/1409.1556.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/vgg.py | https://arxiv.org/pdf/1409.1556.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/vgg.py | https://arxiv.org/pdf/1409.1556.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/tresnet.py | MobileNetV3_large_100_for_PyTorch/timm/models/tresnet.py | https://arxiv.org/pdf/2003.13630.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | 
MobileNetV3_large_100_for_PyTorch/timm/models/tresnet.py | https://github.com/mrT23/TResNet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/tresnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/tresnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-tresnet/tresnet_m_80_8-dbc13962.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/tresnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/tresnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-tresnet/tresnet_l_81_5-235b486c.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/tresnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/tresnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-tresnet/tresnet_xl_82_0-a2d51b00.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/tresnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/tresnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-tresnet/tresnet_m_448-bc359d10.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/tresnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/tresnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-tresnet/tresnet_l_448-940d0cd1.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/advprop.md | ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/tresnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-tresnet/tresnet_xl_448-8c1815de.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/sknet.py | https://arxiv.org/abs/1903.06586 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/sknet.py | https://arxiv.org/abs/2001.06268 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/sknet.py | https://github.com/clovaai/assembled-cnn | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/skresnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/sknet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/skresnet18_ra-4eec2804.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/skresnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/sknet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/skresnet34_ra-bdc0ccde.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/skresnext.md | MobileNetV3_large_100_for_PyTorch/timm/models/sknet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/skresnext50_ra-f40e40bf.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/senet.py | MobileNetV3_large_100_for_PyTorch/timm/models/senet.py | 
https://github.com/Cadene/pretrained-models.pytorch/blob/master/pretrainedmodels/models/senet.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/pnasnet.py | MobileNetV3_large_100_for_PyTorch/timm/models/senet.py | https://github.com/creafz | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/senet.py | MobileNetV3_large_100_for_PyTorch/timm/models/senet.py | https://github.com/hujie-frank/SENet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/senet.py | MobileNetV3_large_100_for_PyTorch/timm/models/senet.py | https://github.com/pytorch/vision/blob/master/torchvision/models/resnet.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/legacy-senet.md | MobileNetV3_large_100_for_PyTorch/timm/models/senet.py | http://data.lip6.fr/cadene/pretrainedmodels/senet154-c7b49a05.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/legacy-se-resnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/senet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnet18-4bb0ce65.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/legacy-se-resnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/senet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnet34-a4004e63.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/legacy-se-resnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/senet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/se_resnet50-ce0d4300.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/legacy-se-resnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/senet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/se_resnet101-7e38fcc6.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/legacy-se-resnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/senet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/se_resnet152-d17c99b7.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/legacy-se-resnext.md | MobileNetV3_large_100_for_PyTorch/timm/models/senet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext26_32x4d-65ebdb501.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/legacy-se-resnext.md | MobileNetV3_large_100_for_PyTorch/timm/models/senet.py | http://data.lip6.fr/cadene/pretrainedmodels/se_resnext50_32x4d-a260b3a4.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/legacy-se-resnext.md | MobileNetV3_large_100_for_PyTorch/timm/models/senet.py | http://data.lip6.fr/cadene/pretrainedmodels/se_resnext101_32x4d-3b2fe3d8.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/selecsls.py | MobileNetV3_large_100_for_PyTorch/timm/models/selecsls.py | https://creativecommons.org/licenses/by/4.0/legalcode | 
模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/selecsls.py | https://arxiv.org/abs/1907.00837 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/.github/ISSUE_TEMPLATE/config.yml | MobileNetV3_large_100_for_PyTorch/timm/models/selecsls.py | https://github.com/rwightman/pytorch-image-models | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/selecsls.py | https://github.com/mehtadushy/SelecSLS-Pytorch | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/selecsls.md | MobileNetV3_large_100_for_PyTorch/timm/models/selecsls.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-selecsls/selecsls42b-8af30141.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/selecsls.md | MobileNetV3_large_100_for_PyTorch/timm/models/selecsls.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-selecsls/selecsls60-bbf87526.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/selecsls.md | MobileNetV3_large_100_for_PyTorch/timm/models/selecsls.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-selecsls/selecsls60b-94e619b5.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/rexnet.py | https://arxiv.org/abs/2007.00992 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/rexnet.py | https://github.com/clovaai/rexnet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/rexnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/rexnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rexnet/rexnetv1_100-1b4dddf4.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/rexnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/rexnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rexnet/rexnetv1_130-590d768e.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/rexnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/rexnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rexnet/rexnetv1_150-bd1a6aa8.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/rexnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/rexnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rexnet/rexnetv1_200-8c0b7f2d.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnetv2.py | https://github.com/google-research/big_transfer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnetv2.py | https://github.com/google-research/vision_transformer | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnetv2.py | https://arxiv.org/abs/1912.11370 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnetv2.py | https://arxiv.org/abs/2010.11929 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/big-transfer.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x1-ILSVRC2012.npz | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/big-transfer.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x3-ILSVRC2012.npz | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/big-transfer.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x1-ILSVRC2012.npz | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/big-transfer.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x3-ILSVRC2012.npz | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/big-transfer.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x2-ILSVRC2012.npz | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/big-transfer.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x4-ILSVRC2012.npz | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/big-transfer.md | ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x1.npz | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/big-transfer.md | ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x3.npz | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/big-transfer.md | ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x1.npz | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/big-transfer.md | ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x3.npz | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/big-transfer.md | ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x2.npz | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/big-transfer.md | 
ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x4.npz | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/big-transfer.md | ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-S-R50x1.npz | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/big-transfer.md | ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-S-R50x3.npz | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/big-transfer.md | ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-S-R101x3.npz | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/big-transfer.md | ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-S-R101x3.npz | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/big-transfer.md | ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-S-R152x2.npz | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/big-transfer.md | ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-S-R152x4.npz | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnetv2.py | MobileNetV3_large_100_for_PyTorch/timm/models/resnetv2.py | https://github.com/KaimingHe/resnet-1k-layers/blob/master/resnet-pre-act.lua | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://github.com/pytorch/vision | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/resnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/resnet-d.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet18d_ra2-48a79e06.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/resnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet34-43635321.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/resnet-d.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet34d_ra2-f8dcfcaf.pth | 预训练模型 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/resnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet26-9aa10e23.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/resnet-d.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet26d-69e92c46.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/resnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet50_ram-a26f946b.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/resnet-d.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet50d_ra2-464e36ba.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/resnet-d.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet101d_ra2-2803ffab.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/resnet-d.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet152d_ra2-5cac0439.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/resnet-d.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet200d_ra2-bdba9bf9.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/resnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/resnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/resnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/resnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/wide-resnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/wide_resnet50_racm-8234f177.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/wide-resnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | 
https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/resnext.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnext50_32x4d_ra-d733960d.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/resnext.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnext50d_32x4d-103e99f8.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/resnext.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/resnext.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/README.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://github.com/facebookresearch/WSL-Images | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/ig-resnext.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x8-c38310e5.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/ig-resnext.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x16-c6f796b0.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/ig-resnext.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x32-e4b90b00.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/ig-resnext.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x48-3e41cc8a.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://github.com/facebookresearch/semi-supervised-ImageNet1K-models | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/ssl-resnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet18-d92f0530.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/ssl-resnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet50-08389792.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/ssl-resnext.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext50_32x4-ddb3e555.pth | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/ssl-resnext.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x4-dc43570a.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/ssl-resnext.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x8-2cfe2f8b.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/ssl-resnext.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x16-15fffa57.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://github.com/facebookresearch/semi-supervised-ImageNet1K-models | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/swsl-resnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet18-118f1556.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/swsl-resnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet50-16a12f1b.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/swsl-resnext.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext50_32x4-72679e44.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/swsl-resnext.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x4-3f87e46b.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/swsl-resnext.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x8-b4712904.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/swsl-resnext.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x16-f3559a9c.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/se-resnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnet50_ra_224-8efdb4bb.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/se-resnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnet152d_ra2-04464dd2.pth | 预训练模型 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/seresnext.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext26d_32x4d-80fa48a3.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/seresnext.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext26tn_32x4d-569cb627.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/seresnext.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext50_32x4d_racm-a304a460.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecaresnet26t_ra2-46609757.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/ecaresnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNetLight_4f34b35b.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/ecaresnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet50D_833caf58.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/ecaresnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45899/outputs/ECAResNet50D_P_9c67f710.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecaresnet50t_ra2-f7ac63c4.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/ecaresnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet101D_281c5844.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/ecaresnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45610/outputs/ECAResNet101D_P_75a3370e.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecaresnet269d_320_ra2-7baa55cb.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/resnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnetblur50-84f4748f.pth | 
预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://arxiv.org/pdf/1812.01187 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://arxiv.org/abs/1805.00932 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://pytorch.org/hub/facebookresearch_WSL-Images_resnext/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://arxiv.org/abs/1805.00932 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://pytorch.org/hub/facebookresearch_WSL-Images_resnext/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://arxiv.org/abs/1805.00932 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://pytorch.org/hub/facebookresearch_WSL-Images_resnext/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://arxiv.org/abs/1805.00932 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://pytorch.org/hub/facebookresearch_WSL-Images_resnext/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://arxiv.org/abs/1905.00546 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://github.com/facebookresearch/semi-supervised-ImageNet1K-models/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://arxiv.org/abs/1905.00546 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://github.com/facebookresearch/semi-supervised-ImageNet1K-models/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://arxiv.org/abs/1905.00546 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://github.com/facebookresearch/semi-supervised-ImageNet1K-models/ | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://arxiv.org/abs/1905.00546 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://github.com/facebookresearch/semi-supervised-ImageNet1K-models/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://arxiv.org/abs/1905.00546 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://github.com/facebookresearch/semi-supervised-ImageNet1K-models/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://arxiv.org/abs/1905.00546 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://github.com/facebookresearch/semi-supervised-ImageNet1K-models/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://arxiv.org/abs/1905.00546 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://github.com/facebookresearch/semi-supervised-ImageNet1K-models/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://arxiv.org/abs/1905.00546 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://github.com/facebookresearch/semi-supervised-ImageNet1K-models/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://arxiv.org/abs/1905.00546 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://github.com/facebookresearch/semi-supervised-ImageNet1K-models/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://arxiv.org/abs/1905.00546 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://github.com/facebookresearch/semi-supervised-ImageNet1K-models/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://arxiv.org/abs/1905.00546 | 参考论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://github.com/facebookresearch/semi-supervised-ImageNet1K-models/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://arxiv.org/abs/1905.00546 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://github.com/facebookresearch/semi-supervised-ImageNet1K-models/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://arxiv.org/pdf/2002.08258.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://arxiv.org/pdf/2002.08258.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnest.py | https://arxiv.org/abs/2004.08955 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnest.py | https://github.com/zhanghang1989/ResNeSt | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/resnest.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gluon_resnest14-9c8fe254.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/resnest.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gluon_resnest26-50eb607c.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/resnest.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest50-528c19ca.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/resnest.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest101-22405ba7.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest200-75117900.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/resnest.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest269-0cc87c48.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/resnest.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnest.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest50_fast_4s2x40d-41d14ed0.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/resnest.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest50_fast_1s4x24d-d4a4f76f.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnest.py | https://arxiv.org/abs/2004.08955 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnest.py | https://arxiv.org/abs/2004.08955 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnest.py | https://arxiv.org/abs/2004.08955 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/resnest.py | https://arxiv.org/abs/2004.08955 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnest.py | MobileNetV3_large_100_for_PyTorch/timm/models/resnest.py | https://github.com/zhanghang1989/ResNeSt/blob/master/ablation.md | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnest.py | MobileNetV3_large_100_for_PyTorch/timm/models/resnest.py | https://github.com/zhanghang1989/ResNeSt/blob/master/ablation.md | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/dla.py | MobileNetV3_large_100_for_PyTorch/timm/models/res2net.py | https://github.com/gasvn/Res2Net/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/res2net.py | https://arxiv.org/abs/1904.01169 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/res2net.md | MobileNetV3_large_100_for_PyTorch/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net50_26w_4s-06e79181.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/res2net.md | MobileNetV3_large_100_for_PyTorch/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net50_48w_2s-afed724a.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/res2net.md | MobileNetV3_large_100_for_PyTorch/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net50_14w_8s-6527dddc.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/res2net.md | MobileNetV3_large_100_for_PyTorch/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net50_26w_6s-19041792.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/res2net.md | MobileNetV3_large_100_for_PyTorch/timm/models/res2net.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net50_26w_8s-2c7c9f12.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/res2net.md | MobileNetV3_large_100_for_PyTorch/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net101_26w_4s-02a759a1.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/res2next.md | MobileNetV3_large_100_for_PyTorch/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2next50_4s-6ef7e7bf.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/res2net.py | MobileNetV3_large_100_for_PyTorch/timm/models/res2net.py | https://github.com/gasvn/Res2Net/blob/master/res2net.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/regnet.py | https://arxiv.org/abs/2003.13678 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/regnet.py | https://github.com/facebookresearch/pycls/blob/master/pycls/models/regnet.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/regnetx.md | MobileNetV3_large_100_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_002-e7e85e5c.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/regnetx.md | MobileNetV3_large_100_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_004-7d0e9424.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/regnetx.md | MobileNetV3_large_100_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_006-85ec1baa.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/regnetx.md | MobileNetV3_large_100_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_008-d8b470eb.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/regnetx.md | MobileNetV3_large_100_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_016-65ca972a.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/regnetx.md | MobileNetV3_large_100_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_032-ed0c7f7e.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/regnetx.md | MobileNetV3_large_100_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_040-73c2a654.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/regnetx.md | 
MobileNetV3_large_100_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_064-29278baa.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/regnetx.md | MobileNetV3_large_100_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_080-7c7fcab1.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/regnetx.md | MobileNetV3_large_100_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_120-65d5521e.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/regnetx.md | MobileNetV3_large_100_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_160-c98c4112.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/regnetx.md | MobileNetV3_large_100_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_320-8ea38b93.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/regnety.md | MobileNetV3_large_100_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_002-e68ca334.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/regnety.md | MobileNetV3_large_100_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_004-0db870e6.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/regnety.md | MobileNetV3_large_100_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_006-c67e57ec.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/regnety.md | MobileNetV3_large_100_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_008-dc900dbe.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/regnety.md | MobileNetV3_large_100_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_016-54367f74.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/regnety.md | MobileNetV3_large_100_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/regnety_032_ra-7f2439f9.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/regnety.md | MobileNetV3_large_100_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_040-f0d569f9.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/regnety.md | MobileNetV3_large_100_for_PyTorch/timm/models/regnet.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_064-0a48325c.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/regnety.md | MobileNetV3_large_100_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_080-e7f3eb93.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/regnety.md | MobileNetV3_large_100_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_120-721ba79a.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/regnety.md | MobileNetV3_large_100_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_160-d64013cd.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/regnety.md | MobileNetV3_large_100_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_320-ba464b29.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/regnet.py | https://arxiv.org/abs/2003.13678 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/regnet.py | https://github.com/facebookresearch/pycls/blob/master/pycls/models/regnet.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/pnasnet.py | MobileNetV3_large_100_for_PyTorch/timm/models/pnasnet.py | https://github.com/creafz | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/pnasnet.py | MobileNetV3_large_100_for_PyTorch/timm/models/pnasnet.py | https://github.com/Cadene/pretrained-models.pytorch/blob/master/pretrainedmodels/models/pnasnet.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/pnasnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/pnasnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/pnasnet5large-bf079911.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/pnasnet.py | https://arxiv.org/abs/1712.00559 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/nfnet.py | https://arxiv.org/abs/2101.08692 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/nfnet.py | https://arxiv.org/abs/2102.06171 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/nfnet.py | https://github.com/deepmind/deepmind-research/tree/master/nfnets | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/nfnet.py | MobileNetV3_large_100_for_PyTorch/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f0-604f9c3a.pth | 
预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/nfnet.py | MobileNetV3_large_100_for_PyTorch/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f1-fc540f82.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/nfnet.py | MobileNetV3_large_100_for_PyTorch/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f2-89875923.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/nfnet.py | MobileNetV3_large_100_for_PyTorch/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f3-d74ab3aa.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/nfnet.py | MobileNetV3_large_100_for_PyTorch/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f4-0ac5b10b.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/nfnet.py | MobileNetV3_large_100_for_PyTorch/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f5-ecb20ab1.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/nfnet.py | MobileNetV3_large_100_for_PyTorch/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f6-e0f12116.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/nfnet.py | MobileNetV3_large_100_for_PyTorch/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecanfnet_l0_ra2-e3e9ac50.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/nfnet.py | MobileNetV3_large_100_for_PyTorch/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/nf_regnet_b1_256_ra2-ad85cfef.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/nfnet.py | MobileNetV3_large_100_for_PyTorch/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/nf_resnet50_ra2-9f236009.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/nfnet.py | https://github.com/deepmind/deepmind-research/tree/master/nfnets | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/nfnet.py | https://arxiv.org/abs/2101.08692 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/nfnet.py | https://arxiv.org/abs/2102.06171 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/nfnet.py | https://arxiv.org/abs/2102.06171 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/nfnet.py | https://arxiv.org/abs/2102.06171 | 参考论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/nfnet.py | https://arxiv.org/abs/2102.06171 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/nfnet.py | https://arxiv.org/abs/2102.06171 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/nfnet.py | https://arxiv.org/abs/2102.06171 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/nfnet.py | https://arxiv.org/abs/2102.06171 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/nfnet.py | https://arxiv.org/abs/2102.06171 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/nfnet.py | https://arxiv.org/abs/2102.06171 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/nfnet.py | https://arxiv.org/abs/2102.06171 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/nfnet.py | https://arxiv.org/abs/2102.06171 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/nfnet.py | https://arxiv.org/abs/2102.06171 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/nfnet.py | https://arxiv.org/abs/2102.06171 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/nfnet.py | https://arxiv.org/abs/2102.06171 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/nfnet.py | https://arxiv.org/abs/2102.06171 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/nfnet.py | https://arxiv.org/abs/2102.06171 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/nfnet.py | https://arxiv.org/abs/2102.06171 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/nfnet.py | https://arxiv.org/abs/2102.06171 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/nfnet.py | https://arxiv.org/abs/2102.06171 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/nfnet.py | https://arxiv.org/abs/2102.06171 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | 
MobileNetV3_large_100_for_PyTorch/timm/models/nfnet.py | https://arxiv.org/abs/2102.06171 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/nfnet.py | https://arxiv.org/abs/2102.06171 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/nfnet.py | https://arxiv.org/abs/2102.06171 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/nfnet.py | https://arxiv.org/abs/2102.06171 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/nfnet.py | https://arxiv.org/abs/2101.08692 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/nfnet.py | https://arxiv.org/abs/2101.08692 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/nfnet.py | https://arxiv.org/abs/2101.08692 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/nfnet.py | https://arxiv.org/abs/2101.08692 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/nfnet.py | https://arxiv.org/abs/2101.08692 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/nfnet.py | https://arxiv.org/abs/2101.08692 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/nfnet.py | https://arxiv.org/abs/2101.08692 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/nfnet.py | https://arxiv.org/abs/2101.08692 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/nfnet.py | https://arxiv.org/abs/2101.08692 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/nasnet.py | https://github.com/Cadene/pretrained-models.pytorch | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/nasnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/nasnet.py | http://data.lip6.fr/cadene/pretrainedmodels/nasnetalarge-a1897284.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/mobilenetv3.py | https://arxiv.org/abs/1905.02244 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/mobilenet-v3.md | MobileNetV3_large_100_for_PyTorch/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv3_large_100_ra-f55367f5.pth | 预训练模型 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/mobilenet-v3.md | MobileNetV3_large_100_for_PyTorch/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv3_100-35495452.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/tf-mobilenet-v3.md | MobileNetV3_large_100_for_PyTorch/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_large_075-150ee8b0.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/tf-mobilenet-v3.md | MobileNetV3_large_100_for_PyTorch/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_large_100-427764d5.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/tf-mobilenet-v3.md | MobileNetV3_large_100_for_PyTorch/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_large_minimal_100-8596ae28.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/tf-mobilenet-v3.md | MobileNetV3_large_100_for_PyTorch/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_small_075-da427f52.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/tf-mobilenet-v3.md | MobileNetV3_large_100_for_PyTorch/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_small_100-37f49e2b.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/tf-mobilenet-v3.md | MobileNetV3_large_100_for_PyTorch/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_small_minimal_100-922a7843.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/layers/weight_init.py | MobileNetV3_large_100_for_PyTorch/timm/models/layers/weight_init.py | https://people.sc.fsu.edu/~jburkardt/presentations/truncated_normal.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/layers/std_conv.py | MobileNetV3_large_100_for_PyTorch/timm/models/layers/std_conv.py | https://arxiv.org/abs/1903.10520v2 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/layers/std_conv.py | MobileNetV3_large_100_for_PyTorch/timm/models/layers/std_conv.py | https://arxiv.org/abs/1903.10520v2 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/layers/std_conv.py | https://arxiv.org/abs/2101.08692 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/layers/std_conv.py | https://arxiv.org/abs/2101.08692 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/layers/split_attn.py | https://arxiv.org/abs/2004.08955 | 参考论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/layers/split_attn.py | https://github.com/zhanghang1989/ResNeSt | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/layers/selective_kernel.py | https://arxiv.org/abs/1903.06586 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/layers/selective_kernel.py | https://arxiv.org/abs/1903.06586 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/layers/se.py | https://arxiv.org/abs/1911.06667 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/layers/mixed_conv2d.py | https://arxiv.org/abs/1907.09595 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/layers/mixed_conv2d.py | MobileNetV3_large_100_for_PyTorch/timm/models/layers/mixed_conv2d.py | https://github.com/tensorflow/tpu/blob/master/models/official/mnasnet/mixnet/custom_layers.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/layers/inplace_abn.py | MobileNetV3_large_100_for_PyTorch/timm/models/layers/inplace_abn.py | https://github.com/mapillary/inplace_abn.git@v1.0.12 | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/layers/inplace_abn.py | MobileNetV3_large_100_for_PyTorch/timm/models/layers/inplace_abn.py | inplace_abn.git@v1.0.12 | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/layers/eca.py | https://arxiv.org/abs/1910.03151 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/layers/eca.py | https://github.com/BangguWu/ECANet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/layers/eca.py | https://github.com/VRandme | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/layers/eca.py | MobileNetV3_large_100_for_PyTorch/timm/models/layers/eca.py | https://arxiv.org/pdf/1910.03151.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/layers/eca.py | MobileNetV3_large_100_for_PyTorch/timm/models/layers/eca.py | https://arxiv.org/pdf/1910.03151.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/layers/eca.py | MobileNetV3_large_100_for_PyTorch/timm/models/layers/eca.py | https://github.com/pytorch/pytorch/pull/17240 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/README.md | MobileNetV3_large_100_for_PyTorch/timm/models/layers/drop.py | https://arxiv.org/abs/1810.12890 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/README.md | MobileNetV3_large_100_for_PyTorch/timm/models/layers/drop.py | https://arxiv.org/abs/1603.09382 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/layers/drop.py | MobileNetV3_large_100_for_PyTorch/timm/models/layers/drop.py | 
https://github.com/tensorflow/tpu/blob/master/models/official/resnet/resnet_model.py#L74 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/layers/drop.py | MobileNetV3_large_100_for_PyTorch/timm/models/layers/drop.py | https://github.com/clovaai/assembled-cnn/blob/master/nets/blocks.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/layers/drop.py | MobileNetV3_large_100_for_PyTorch/timm/models/layers/drop.py | https://arxiv.org/pdf/1810.12890.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/layers/drop.py | MobileNetV3_large_100_for_PyTorch/timm/models/layers/drop.py | https://arxiv.org/pdf/1810.12890.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/layers/drop.py | MobileNetV3_large_100_for_PyTorch/timm/models/layers/drop.py | https://arxiv.org/pdf/1810.12890.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/layers/drop.py | MobileNetV3_large_100_for_PyTorch/timm/models/layers/drop.py | https://github.com/tensorflow/tpu/issues/494#issuecomment-532968956 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/layers/cond_conv2d.py | MobileNetV3_large_100_for_PyTorch/timm/models/layers/cond_conv2d.py | https://arxiv.org/abs/1904.04971 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/layers/cond_conv2d.py | MobileNetV3_large_100_for_PyTorch/timm/models/layers/cond_conv2d.py | https://github.com/tensorflow/tpu/blob/master/models/official/efficientnet/condconv/condconv_layers.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/layers/cond_conv2d.py | MobileNetV3_large_100_for_PyTorch/timm/models/layers/cond_conv2d.py | https://github.com/pytorch/pytorch/issues/17983 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/README.md | MobileNetV3_large_100_for_PyTorch/timm/models/layers/cbam.py | https://arxiv.org/abs/1807.06521 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/layers/activations_me.py | MobileNetV3_large_100_for_PyTorch/timm/models/layers/activations_me.py | https://twitter.com/jeremyphoward/status/1188251041835315200 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/layers/activations.py | MobileNetV3_large_100_for_PyTorch/timm/models/layers/activations_me.py | https://arxiv.org/abs/1908.08681 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/layers/activations.py | MobileNetV3_large_100_for_PyTorch/timm/models/layers/activations_me.py | https://github.com/digantamisra98/H-Mish/blob/0da20d4bc58e696b6803f2523c58d3c8a82782d0/README.md | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/layers/activations.py | MobileNetV3_large_100_for_PyTorch/timm/models/layers/activations_jit.py | https://arxiv.org/abs/1710.05941 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/layers/activations.py | MobileNetV3_large_100_for_PyTorch/timm/models/layers/activations_jit.py | https://arxiv.org/abs/1908.08681 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/layers/activations.py | MobileNetV3_large_100_for_PyTorch/timm/models/layers/activations_jit.py | 
https://github.com/digantamisra98/H-Mish/blob/0da20d4bc58e696b6803f2523c58d3c8a82782d0/README.md | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/layers/activations.py | MobileNetV3_large_100_for_PyTorch/timm/models/layers/activations.py | https://arxiv.org/abs/1710.05941 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/layers/activations.py | MobileNetV3_large_100_for_PyTorch/timm/models/layers/activations.py | https://arxiv.org/abs/1908.08681 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/layers/activations.py | MobileNetV3_large_100_for_PyTorch/timm/models/layers/activations.py | https://arxiv.org/abs/1908.08681 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/layers/activations.py | MobileNetV3_large_100_for_PyTorch/timm/models/layers/activations.py | https://github.com/digantamisra98/H-Mish/blob/0da20d4bc58e696b6803f2523c58d3c8a82782d0/README.md | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/inception_resnet_v2.py | MobileNetV3_large_100_for_PyTorch/timm/models/inception_v4.py | https://github.com/Cadene/tensorflow-model-zoo.torch | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/inception-v4.md | MobileNetV3_large_100_for_PyTorch/timm/models/inception_v4.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/inceptionv4-8e4777a0.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/inception_v3.py | MobileNetV3_large_100_for_PyTorch/timm/models/inception_v3.py | https://github.com/pytorch/vision/blob/master/LICENSE | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/inception-v3.md | MobileNetV3_large_100_for_PyTorch/timm/models/inception_v3.py | https://download.pytorch.org/models/inception_v3_google-1a9a5a14.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/inception_v3.py | MobileNetV3_large_100_for_PyTorch/timm/models/inception_v3.py | http://download.tensorflow.org/models/inception_v3_2016_08_28.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/tf-inception-v3.md | MobileNetV3_large_100_for_PyTorch/timm/models/inception_v3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_inception_v3-e0069de4.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/inception_v3.py | MobileNetV3_large_100_for_PyTorch/timm/models/inception_v3.py | http://download.tensorflow.org/models/adv_inception_v3_2017_08_18.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/adversarial-inception-v3.md | MobileNetV3_large_100_for_PyTorch/timm/models/inception_v3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/adv_inception_v3-9e27bd63.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/inception_v3.py | MobileNetV3_large_100_for_PyTorch/timm/models/inception_v3.py | https://gluon-cv.mxnet.io/model_zoo/classification.html | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/gloun-inception-v3.md | MobileNetV3_large_100_for_PyTorch/timm/models/inception_v3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gluon_inception_v3-9f746940.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/inception_v3.py | MobileNetV3_large_100_for_PyTorch/timm/models/inception_v3.py | http://download.tensorflow.org/models/inception_v3_2016_08_28.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/inception_v3.py | MobileNetV3_large_100_for_PyTorch/timm/models/inception_v3.py | http://download.tensorflow.org/models/adv_inception_v3_2017_08_18.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/inception_v3.py | MobileNetV3_large_100_for_PyTorch/timm/models/inception_v3.py | https://gluon-cv.mxnet.io/model_zoo/classification.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/inception_resnet_v2.py | MobileNetV3_large_100_for_PyTorch/timm/models/inception_resnet_v2.py | https://github.com/Cadene/tensorflow-model-zoo.torch | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/inception_resnet_v2.py | MobileNetV3_large_100_for_PyTorch/timm/models/inception_resnet_v2.py | http://download.tensorflow.org/models/inception_resnet_v2_2016_08_30.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/inception-resnet-v2.md | MobileNetV3_large_100_for_PyTorch/timm/models/inception_resnet_v2.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/inception_resnet_v2-940b1cd6.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/inception_resnet_v2.py | MobileNetV3_large_100_for_PyTorch/timm/models/inception_resnet_v2.py | http://download.tensorflow.org/models/ens_adv_inception_resnet_v2_2017_08_18.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/ensemble-adversarial.md | MobileNetV3_large_100_for_PyTorch/timm/models/inception_resnet_v2.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ens_adv_inception_resnet_v2-2592a550.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/inception_resnet_v2.py | https://arxiv.org/abs/1602.07261 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/inception_resnet_v2.py | MobileNetV3_large_100_for_PyTorch/timm/models/inception_resnet_v2.py | https://arxiv.org/abs/1705.07204 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/inception_resnet_v2.py | https://github.com/tensorflow/models/tree/master/research/adv_imagenet_models | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/hrnet.py | https://github.com/HRNet/HRNet-Image-Classification | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/hrnet.py | 
MobileNetV3_large_100_for_PyTorch/timm/models/hrnet.py | Bin.Xiao@microsoft.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/hrnet.py | MobileNetV3_large_100_for_PyTorch/timm/models/hrnet.py | sunk@mail.ustc.edu.cn | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/hrnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnet_w18_small_v1-f460c6bc.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/hrnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnet_w18_small_v2-4c50a8cb.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/hrnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w18-8cb57bb9.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/hrnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w30-8d7f8dab.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/hrnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w32-90d8c5fb.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/hrnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w40-7cd397a4.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/hrnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w44-c9ac8c18.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/hrnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w48-abd2e6ab.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/hrnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w64-b47cc881.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/hardcorenas.py | MobileNetV3_large_100_for_PyTorch/timm/models/hardcorenas.py | https://github.com/Alibaba-MIIL/HardCoReNAS | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/README.md | MobileNetV3_large_100_for_PyTorch/timm/models/hardcorenas.py | https://arxiv.org/abs/2102.11646 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/inception_v3.py | ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/gluon_xception.py | 
https://gluon-cv.mxnet.io/_modules/gluoncv/model_zoo/xception.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/gluon_xception.py | https://github.com/jfzhang95/pytorch-deeplab-xception | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/gloun-xception.md | MobileNetV3_large_100_for_PyTorch/timm/models/gluon_xception.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gluon_xception-7015a15c.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/pytorch/vision | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/dmlc/gluon-cv/blob/master/gluoncv/model_zoo/resnet.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/gloun-resnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet18_v1b-0757602b.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/gloun-resnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet34_v1b-c6d82d59.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/gloun-resnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1b-0ebe02e2.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/gloun-resnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1b-3b017079.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/gloun-resnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1b-c1edb0dd.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/gloun-resnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1c-48092f55.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/gloun-resnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1c-1f26822a.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/gloun-resnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/gluon_resnet.py | 
https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1c-a3bb0b98.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/gloun-resnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1d-818a1b1b.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/gloun-resnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1d-0f9c8644.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/gloun-resnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1d-bd354e12.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/gloun-resnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1s-1762acc0.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/gloun-resnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1s-60fe0cc1.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/gloun-resnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1s-dcc41b81.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/gloun-resnext.md | MobileNetV3_large_100_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnext50_32x4d-e6a097c1.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/gloun-resnext.md | MobileNetV3_large_100_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnext101_32x4d-b253c8c4.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/gloun-resnext.md | MobileNetV3_large_100_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnext101_64x4d-f9a8e184.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/gloun-seresnext.md | MobileNetV3_large_100_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_seresnext50_32x4d-90cf2d6e.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/gloun-seresnext.md | MobileNetV3_large_100_for_PyTorch/timm/models/gluon_resnet.py | 
https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_seresnext101_32x4d-cf52900d.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/gloun-seresnext.md | MobileNetV3_large_100_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_seresnext101_64x4d-f9926f93.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/gloun-senet.md | MobileNetV3_large_100_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_senet154-70a1a3c0.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/_features.py | MobileNetV3_large_100_for_PyTorch/timm/models/features.py | https://github.com/pytorch/vision/blob/d88d8961ae51507d0cb680329d985b1488b1b76b/torchvision/models/_utils.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/_efficientnet_builder.py | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet_builder.py | https://github.com/tensorflow/tpu/blob/master/models/official/mnasnet/mnasnet_models.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/_efficientnet_builder.py | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet_builder.py | https://github.com/facebookresearch/maskrcnn-benchmark/blob/master/maskrcnn_benchmark/modeling/backbone/fbnet_builder.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/_efficientnet_builder.py | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet_builder.py | https://github.com/tensorflow/tpu/blob/master/models/official/mnasnet/mnasnet_model.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet_builder.py | https://github.com/tensorflow/tpu/blob/master/models/official/efficientnet/efficientnet_model.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/abs/1905.11946 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/layers/cond_conv2d.py | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/abs/1904.04971 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/abs/1911.09665 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/abs/1911.04252 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/abs/1907.09595 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/abs/1807.11626 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py 
| https://arxiv.org/abs/1812.03443 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/abs/1904.02877 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/mnasnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mnasnet_b1-74cb7081.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/mnasnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mnasnet_a1-d9418771.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/mobilenet-v2.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv2_100_ra-b33bc2c4.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/mobilenet-v2.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv2_110d_ra-77090ade.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/mobilenet-v2.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv2_120d_ra-5987e2ed.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/mobilenet-v2.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv2_140_ra-21a4e913.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/fbnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/fbnetc_100-c345b898.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/spnasnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/spnasnet_100-048bc3f4.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/efficientnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b0_ra-3dd342df.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/efficientnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b1-533bc792.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/efficientnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b2_ra-bcdf34b7.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/efficientnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b2_ra-bcdf34b7.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/efficientnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b3_ra2-cf984f9c.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/efficientnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b3_ra2-cf984f9c.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/efficientnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_es_ra-f111e99c.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/efficientnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_em_ra2-66250f76.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/DeGirum/pruned-models/releases/download/efficientnet_v1.0/efficientnet_el.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/DeGirum/pruned-models/releases/download/efficientnet_v1.0/efficientnet_es_pruned75.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/DeGirum/pruned-models/releases/download/efficientnet_v1.0/efficientnet_el_pruned70.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/efficientnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_lite0_ra-37913777.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/efficientnet-pruned.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb1_pruned_9ebb3fe6.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/efficientnet-pruned.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | 
https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb2_pruned_203f55bc.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/efficientnet-pruned.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb3_pruned_5abcc29f.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/tf-efficientnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b0_aa-827b6e33.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/tf-efficientnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b1_aa-ea7a6ee0.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/tf-efficientnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b2_aa-60c94f97.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/tf-efficientnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b3_aa-84b4657e.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/tf-efficientnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b4_aa-818f208c.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/tf-efficientnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b5_ra-9a3e5369.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/tf-efficientnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b6_aa-80ba17e4.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/tf-efficientnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b7_ra-6c08e654.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/tf-efficientnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b8_ra-572d5dd9.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/advprop.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b0_ap-f262efe1.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/advprop.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b1_ap-44ef0a3d.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/advprop.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b2_ap-2f8e7636.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/advprop.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b3_ap-aad25bdd.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/advprop.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b4_ap-dedb23e6.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/advprop.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b5_ap-9e82fae8.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/advprop.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b6_ap-4ffb161f.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/advprop.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b7_ap-ddb28fec.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/advprop.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b8_ap-00e169fa.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/noisy-student.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b0_ns-c0e6a31c.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/noisy-student.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b1_ns-99dd0c41.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/noisy-student.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b2_ns-00306e48.pth | 预训练模型 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/noisy-student.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b3_ns-9d44bf68.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/noisy-student.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b4_ns-d6313a46.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/noisy-student.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b5_ns-6f26d0cf.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/noisy-student.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b6_ns-51548356.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/noisy-student.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b7_ns-1dbc32de.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/tf-efficientnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_l2_ns_475-bebbd00a.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/noisy-student.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_l2_ns-df73bb44.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/tf-efficientnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_es-ca1afbfe.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/tf-efficientnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_em-e78cfe58.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/tf-efficientnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_el-5143854e.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/tf-efficientnet-condconv.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_cc_b0_4e-4362b6b2.pth | 预训练模型 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/tf-efficientnet-condconv.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_cc_b0_8e-66184a25.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/tf-efficientnet-condconv.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_cc_b1_8e-f7c79ae1.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/tf-efficientnet-lite.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite0-0aa007d2.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/tf-efficientnet-lite.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite1-bde8b488.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/tf-efficientnet-lite.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite2-dcccb7df.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/tf-efficientnet-lite.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite3-b733e338.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/tf-efficientnet-lite.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite4-741542c3.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/mixnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mixnet_s-a907afbc.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/mixnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mixnet_m-4647fc68.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/mixnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mixnet_l-5a9a2ed8.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/mixnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mixnet_xl_ra-aac3c00c.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/tf-mixnet.md | 
MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mixnet_s-89d3354b.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/tf-mixnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mixnet_m-0f4d8805.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/tf-mixnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mixnet_l-6c92e0c8.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/tensorflow/tpu/tree/master/models/official/mnasnet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/pdf/1807.11626.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/tensorflow/tpu/tree/master/models/official/mnasnet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/pdf/1807.11626.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/tensorflow/tpu/tree/master/models/official/mnasnet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/pdf/1807.11626.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/tensorflow/models/blob/master/research/slim/nets/mobilenet/mobilenet_v2.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/abs/1801.04381 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/abs/1812.03443 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/facebookresearch/maskrcnn-benchmark/blob/master/maskrcnn_benchmark/modeling/backbone/fbnet_modeldef.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/abs/1904.02877 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | 
https://github.com/tensorflow/tpu/blob/master/models/official/efficientnet/efficientnet_model.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/abs/1905.11946 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet/edgetpu | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet/condconv | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet/lite | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/abs/1905.11946 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/tensorflow/tpu/tree/master/models/official/mnasnet/mixnet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/abs/1907.09595 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/tensorflow/tpu/tree/master/models/official/mnasnet/mixnet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/abs/1907.09595 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/DeGirum/pruned-models/releases/tag/efficientnet_v1.0 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/DeGirum/pruned-models/releases/tag/efficientnet_v1.0 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/pdf/2002.08258.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/pdf/2002.08258.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/pdf/2002.08258.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/dpn.py | https://github.com/cypw/DPNs | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/dpn.py | MobileNetV3_large_100_for_PyTorch/timm/models/dpn.py | https://github.com/oyam/pytorch-DPNs | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/dpn.md | MobileNetV3_large_100_for_PyTorch/timm/models/dpn.py | https://github.com/rwightman/pytorch-dpn-pretrained/releases/download/v0.1/dpn68-66bebafa7.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/dpn.md | MobileNetV3_large_100_for_PyTorch/timm/models/dpn.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/dpn68b_ra-a31ca160.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/dpn.md | MobileNetV3_large_100_for_PyTorch/timm/models/dpn.py | https://github.com/rwightman/pytorch-dpn-pretrained/releases/download/v0.1/dpn92_extra-b040e4a9b.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/dpn.md | MobileNetV3_large_100_for_PyTorch/timm/models/dpn.py | https://github.com/rwightman/pytorch-dpn-pretrained/releases/download/v0.1/dpn98-5b90dec4d.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/dpn.md | MobileNetV3_large_100_for_PyTorch/timm/models/dpn.py | https://github.com/rwightman/pytorch-dpn-pretrained/releases/download/v0.1/dpn131-71dfe43e0.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/dpn.md | MobileNetV3_large_100_for_PyTorch/timm/models/dpn.py | https://github.com/rwightman/pytorch-dpn-pretrained/releases/download/v0.1/dpn107_extra-1ac7121e2.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/dla.py | https://arxiv.org/abs/1707.06484 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/dla.py | MobileNetV3_large_100_for_PyTorch/timm/models/dla.py | https://github.com/gasvn/Res2Net/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/dla.py | https://arxiv.org/abs/1904.01169 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/dla.md | MobileNetV3_large_100_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla34-ba72cf86.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/dla.md | MobileNetV3_large_100_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla46_c-2bfd52c3.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/dla.md | MobileNetV3_large_100_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla46x_c-d761bae7.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/dla.md | MobileNetV3_large_100_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60x_c-b870c45c.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/dla.md | MobileNetV3_large_100_for_PyTorch/timm/models/dla.py | 
http://dl.yf.io/dla/models/imagenet/dla60-24839fc4.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/dla.md | MobileNetV3_large_100_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60x-d15cacda.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/dla.md | MobileNetV3_large_100_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102-d94d9790.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/dla.md | MobileNetV3_large_100_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102x-ad62be81.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/dla.md | MobileNetV3_large_100_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102x2-262837b6.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/dla.md | MobileNetV3_large_100_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla169-0914e092.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/dla.md | MobileNetV3_large_100_for_PyTorch/timm/models/dla.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net_dla60_4s-d88db7f9.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/dla.md | MobileNetV3_large_100_for_PyTorch/timm/models/dla.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2next_dla60_4s-d327927b.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/dla.py | MobileNetV3_large_100_for_PyTorch/timm/models/dla.py | https://github.com/gasvn/Res2Net/blob/master/dla.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/densenet.py | https://github.com/pytorch/vision | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/densenet.md | MobileNetV3_large_100_for_PyTorch/timm/models/densenet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/densenet121_ra-50efcf5c.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/densenet.md | MobileNetV3_large_100_for_PyTorch/timm/models/densenet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/densenetblur121d_ra-100dcfbc.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/densenet.md | MobileNetV3_large_100_for_PyTorch/timm/models/densenet.py | https://download.pytorch.org/models/densenet169-b2777c0a.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/densenet.md | MobileNetV3_large_100_for_PyTorch/timm/models/densenet.py | https://download.pytorch.org/models/densenet201-c1103571.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/densenet.md | MobileNetV3_large_100_for_PyTorch/timm/models/densenet.py | 
https://download.pytorch.org/models/densenet161-8d451a50.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/densenet.md | MobileNetV3_large_100_for_PyTorch/timm/models/densenet.py | https://download.pytorch.org/models/densenet121-a639ec97.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/densenet.py | MobileNetV3_large_100_for_PyTorch/timm/models/densenet.py | https://arxiv.org/pdf/1608.06993.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/densenet.py | MobileNetV3_large_100_for_PyTorch/timm/models/densenet.py | https://arxiv.org/pdf/1707.06990.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/densenet.py | MobileNetV3_large_100_for_PyTorch/timm/models/densenet.py | https://arxiv.org/pdf/1608.06993.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/densenet.py | MobileNetV3_large_100_for_PyTorch/timm/models/densenet.py | https://arxiv.org/pdf/1608.06993.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/densenet.py | MobileNetV3_large_100_for_PyTorch/timm/models/densenet.py | https://arxiv.org/pdf/1608.06993.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/densenet.py | MobileNetV3_large_100_for_PyTorch/timm/models/densenet.py | https://arxiv.org/pdf/1608.06993.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/densenet.py | MobileNetV3_large_100_for_PyTorch/timm/models/densenet.py | https://arxiv.org/pdf/1608.06993.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/densenet.py | MobileNetV3_large_100_for_PyTorch/timm/models/densenet.py | https://arxiv.org/pdf/1608.06993.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/densenet.py | MobileNetV3_large_100_for_PyTorch/timm/models/densenet.py | https://arxiv.org/pdf/1608.06993.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/densenet.py | MobileNetV3_large_100_for_PyTorch/timm/models/densenet.py | https://arxiv.org/pdf/1608.06993.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/cspnet.py | https://arxiv.org/abs/1911.11929 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/cspnet.py | https://github.com/WongKinYiu/CrossStagePartialNetworks | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/csp-resnet.md | MobileNetV3_large_100_for_PyTorch/timm/models/cspnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/cspresnet50_ra-d3e8d487.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/csp-resnext.md | MobileNetV3_large_100_for_PyTorch/timm/models/cspnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/cspresnext50_ra_224-648b4713.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models/.templates/models/csp-darknet.md | 
MobileNetV3_large_100_for_PyTorch/timm/models/cspnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/cspdarknet53_ra_256-d05c7c21.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/cspnet.py | https://arxiv.org/abs/1911.11929 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/cspnet.py | https://github.com/WongKinYiu/CrossStagePartialNetworks | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/byobnet.py | https://arxiv.org/abs/2006.14090 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/byobnet.py | https://github.com/idstcv/GPU-Efficient-Networks | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/byobnet.py | https://arxiv.org/abs/2101.03697 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/byobnet.py | https://github.com/DingXiaoH/RepVGG | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-ger-weights/gernet_s-756b4751.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-ger-weights/gernet_m-0873c53a.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-ger-weights/gernet_l-f31e2e8d.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_a2-c1ee6d2b.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b0-80ac3f1b.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b1-77ca2989.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | 
ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b1g4-abde5d92.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b2-25b7494e.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b2g4-165a85f2.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b3-199bc50d.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b3g4-73c370bf.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/archived_changes.md | MobileNetV3_large_100_for_PyTorch/timm/models/byobnet.py | https://github.com/DingXiaoH/RepVGG | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/byobnet.py | https://arxiv.org/abs/2006.14090 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/byobnet.py | https://arxiv.org/abs/2006.14090 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/byobnet.py | https://arxiv.org/abs/2006.14090 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/byobnet.py | https://arxiv.org/abs/2101.03697 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/byobnet.py | https://arxiv.org/abs/2101.03697 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/byobnet.py | https://arxiv.org/abs/2101.03697 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/byobnet.py | https://arxiv.org/abs/2101.03697 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/byobnet.py | https://arxiv.org/abs/2101.03697 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/byobnet.py | https://arxiv.org/abs/2101.03697 | 参考论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/byobnet.py | https://arxiv.org/abs/2101.03697 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/docs/models.md | MobileNetV3_large_100_for_PyTorch/timm/models/byobnet.py | https://arxiv.org/abs/2101.03697 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/data/auto_augment.py | MobileNetV3_large_100_for_PyTorch/timm/loss/jsd.py | https://github.com/google-research/augmix/blob/master/imagenet.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/README.md | MobileNetV3_large_100_for_PyTorch/timm/loss/jsd.py | https://arxiv.org/abs/1912.02781 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/results/README.md | MobileNetV3_large_100_for_PyTorch/timm/data/real_labels.py | https://arxiv.org/abs/2006.07159 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/results/README.md | MobileNetV3_large_100_for_PyTorch/timm/data/real_labels.py | https://github.com/google-research/reassessed-imagenet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/README.md | MobileNetV3_large_100_for_PyTorch/timm/data/random_erasing.py | https://github.com/zhunzhong07/Random-Erasing | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/data/random_erasing.py | MobileNetV3_large_100_for_PyTorch/timm/data/random_erasing.py | https://github.com/pytorch/pytorch/issues/19508 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/data/random_erasing.py | MobileNetV3_large_100_for_PyTorch/timm/data/random_erasing.py | https://arxiv.org/pdf/1708.04896.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/data/readers/reader_tfds.py | MobileNetV3_large_100_for_PyTorch/timm/data/parsers/parser_tfds.py | https://github.com/tensorflow/datasets | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/data/readers/reader_tfds.py | MobileNetV3_large_100_for_PyTorch/timm/data/parsers/parser_tfds.py | https://www.tensorflow.org/datasets/catalog/overview#image_classification | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/data/readers/reader_tfds.py | MobileNetV3_large_100_for_PyTorch/timm/data/parsers/parser_tfds.py | https://github.com/pytorch/pytorch/issues/33413 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/data/readers/reader_tfds.py | MobileNetV3_large_100_for_PyTorch/timm/data/parsers/parser_tfds.py | https://pytorch.org/docs/stable/data.html#multi-process-data-loading | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/README.md | MobileNetV3_large_100_for_PyTorch/timm/data/mixup.py | https://arxiv.org/abs/1710.09412 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/README.md | MobileNetV3_large_100_for_PyTorch/timm/data/mixup.py | https://arxiv.org/abs/1905.04899 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/data/mixup.py | MobileNetV3_large_100_for_PyTorch/timm/data/mixup.py | https://github.com/clovaai/CutMix-PyTorch | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/data/loader.py | 
MobileNetV3_large_100_for_PyTorch/timm/data/loader.py | https://github.com/NVIDIA/apex/commit/d5e2bb4bdeedd27b1dfaf5bb2b24d6c000dee9be#diff-cf86c282ff7fba81fad27a559379d5bf | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/README.md | MobileNetV3_large_100_for_PyTorch/timm/data/auto_augment.py | https://github.com/tensorflow/tpu/blob/master/models/official/efficientnet/autoaugment.py | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/data/auto_augment.py | MobileNetV3_large_100_for_PyTorch/timm/data/auto_augment.py | https://github.com/google-research/augmix | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/README.md | MobileNetV3_large_100_for_PyTorch/timm/data/auto_augment.py | https://arxiv.org/abs/1805.09501 | 参考论文地址 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/data/auto_augment.py | MobileNetV3_large_100_for_PyTorch/timm/data/auto_augment.py | https://arxiv.org/abs/1906.11172 | 参考论文地址 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/README.md | MobileNetV3_large_100_for_PyTorch/timm/data/auto_augment.py | https://arxiv.org/abs/1909.13719 | 参考论文地址 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/README.md | MobileNetV3_large_100_for_PyTorch/timm/data/auto_augment.py | https://arxiv.org/abs/1912.02781 | 参考论文地址 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/README.md | MobileNetV3_large_100_for_PyTorch/timm/data/auto_augment.py | https://arxiv.org/abs/1805.09501 | 参考论文地址 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/README.md | MobileNetV3_large_100_for_PyTorch/timm/data/auto_augment.py | https://arxiv.org/abs/1805.09501 | 参考论文地址 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/data/auto_augment.py | MobileNetV3_large_100_for_PyTorch/timm/data/auto_augment.py | https://github.com/google-research/augmix/blob/master/imagenet.py | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/README.md | MobileNetV3_large_100_for_PyTorch/timm/data/auto_augment.py | https://arxiv.org/abs/1912.02781 | 参考论文地址 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/.github/ISSUE_TEMPLATE/config.yml | MobileNetV3_large_100_for_PyTorch/setup.py | https://github.com/rwightman/pytorch-image-models | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/train.py | MobileNetV3_large_100_for_PyTorch/modelarts/train_start.py | https://github.com/pytorch/examples/tree/master/imagenet | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/train.py | MobileNetV3_large_100_for_PyTorch/modelarts/train_start.py | https://github.com/NVIDIA/apex/tree/master/examples/imagenet | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/.github/ISSUE_TEMPLATE/config.yml | MobileNetV3_large_100_for_PyTorch/modelarts/train_start.py | https://github.com/rwightma | 源码实现 |
\ No newline at end of file
+| 文件位置 | 公网地址 | 公网地址用途 |
+|--------------------------------------------------------------------------------------------------|---------------------------------------------------------------------|---------|
+| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/MobileNetV3-Large_ID1784_for_PyTorch/url.ini | https://download.pytorch.org/models/mobilenet_v3_small-047dcff4.pth | 权重地址 |
+| 
ModelZoo-PyTorch/PyTorch/built-in/cv/classification/MobileNetV3-Large_ID1784_for_PyTorch/url.ini | https://download.pytorch.org/models/mobilenet_v3_large-8738ca79.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/MobileNetV3-Large_ID1784_for_PyTorch/url.ini | https://download.pytorch.org/models/mobilenet_v2-b0353104.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/built-in/cv/classification/ResNet1001_1202_for_PyTorch/public_address_statement.md b/PyTorch/built-in/cv/classification/ResNet1001_1202_for_PyTorch/public_address_statement.md index 044822b423444b5337ed16fc4954b36b658d11d7..08c762a20525715b8cd2649bb54d11dd511deff1 100644 --- a/PyTorch/built-in/cv/classification/ResNet1001_1202_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/cv/classification/ResNet1001_1202_for_PyTorch/public_address_statement.md @@ -1,58 +1,4 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ---- | ------------ | ------ | ------------------------------------ | -------- | -| 开源代码引入 | https://github.com/osmr/imgclsmob/blob/master/chainer_/chainercv2/models/common.py|ResNet1001_1202_for_PyTorch/common.py | https://arxiv.org/abs/1710.05941 | 引用论文参考地址 | -| 开源代码引入 | https://github.com/osmr/imgclsmob/blob/master/chainer_/chainercv2/models/common.py|ResNet1001_1202_for_PyTorch/common.py | https://arxiv.org/abs/1905.02244 | 引用论文参考地址 | -| 开源代码引入 | https://github.com/osmr/imgclsmob/blob/master/chainer_/chainercv2/models/common.py|ResNet1001_1202_for_PyTorch/common.py | https://arxiv.org/abs/1905.02244 | 引用论文参考地址 | -| 开源代码引入 | https://github.com/osmr/imgclsmob/blob/master/chainer_/chainercv2/models/resnet.py|ResNet1001_1202_for_PyTorch/resnet.py | https://arxiv.org/abs/1512.03385 | 引用论文参考地址 | -| 开源代码引入 | https://github.com/osmr/imgclsmob/blob/master/chainer_/chainercv2/models/resnet.py|ResNet1001_1202_for_PyTorch/resnet_cifar.py | https://arxiv.org/abs/1512.03385 | 引用论文参考地址 | -| 开源代码引入 | https://github.com/osmr/imgclsmob/blob/master/chainer_/chainercv2/models/resnet.py|ResNet1001_1202_for_PyTorch/resnet_cifar.py | https://arxiv.org/abs/1512.03385 | 引用论文参考地址 | -| 开源代码引入 | https://github.com/osmr/imgclsmob/blob/master/chainer_/chainercv2/models/resnet.py|ResNet1001_1202_for_PyTorch/resnet.py | https://arxiv.org/abs/1512.03385 | 引用论文参考地址 | -| 开源代码引入 | https://github.com/osmr/imgclsmob/blob/master/chainer_/chainercv2/models/resnet.py|ResNet1001_1202_for_PyTorch/resnet_cifar.py | https://arxiv.org/abs/1512.03385 | 引用论文参考地址 | -| 开源代码引入 | https://github.com/osmr/imgclsmob/blob/master/chainer_/chainercv2/models/resnet.py|ResNet1001_1202_for_PyTorch/resnet_cifar.py | https://arxiv.org/abs/1512.03385 | 引用论文参考地址 | -| 开源代码引入 | https://github.com/osmr/imgclsmob/blob/master/chainer_/chainercv2/models/resnet.py|ResNet1001_1202_for_PyTorch/resnet_cifar.py | https://arxiv.org/abs/1512.03385 | 引用论文参考地址 | -| 开源代码引入 | https://github.com/osmr/imgclsmob/blob/master/chainer_/chainercv2/models/resnet.py|ResNet1001_1202_for_PyTorch/resnet_cifar.py | https://arxiv.org/abs/1512.03385 | 引用论文参考地址 | -| 开源代码引入 | https://github.com/osmr/imgclsmob/blob/master/chainer_/chainercv2/models/resnet.py|ResNet1001_1202_for_PyTorch/resnet_cifar.py | https://arxiv.org/abs/1512.03385 | 引用论文参考地址 | -| 开源代码引入 | https://github.com/osmr/imgclsmob/blob/master/chainer_/chainercv2/models/resnet.py|ResNet1001_1202_for_PyTorch/resnet_cifar.py | https://arxiv.org/abs/1512.03385 | 引用论文参考地址 | -| 开源代码引入 | 
https://github.com/osmr/imgclsmob/blob/master/chainer_/chainercv2/models/resnet.py|ResNet1001_1202_for_PyTorch/resnet_cifar.py | https://arxiv.org/abs/1512.03385 | 引用论文参考地址 | -| 开源代码引入 | https://github.com/osmr/imgclsmob/blob/master/chainer_/chainercv2/models/resnet.py|ResNet1001_1202_for_PyTorch/resnet_cifar.py | https://arxiv.org/abs/1512.03385 | 引用论文参考地址 | -| 开源代码引入 | https://github.com/osmr/imgclsmob/blob/master/chainer_/chainercv2/models/resnet.py|ResNet1001_1202_for_PyTorch/resnet.py | https://arxiv.org/abs/1512.03385 | 引用论文参考地址 | -| 开源代码引入 | https://github.com/osmr/imgclsmob/blob/master/chainer_/chainercv2/models/resnet.py|ResNet1001_1202_for_PyTorch/resnet.py | https://arxiv.org/abs/1512.03385 | 引用论文参考地址 | -| 开源代码引入 | https://github.com/osmr/imgclsmob/blob/master/chainer_/chainercv2/models/resnet.py|ResNet1001_1202_for_PyTorch/resnet_cifar.py | https://arxiv.org/abs/1512.03385 | 引用论文参考地址 | -| 开源代码引入 | https://github.com/osmr/imgclsmob/blob/master/chainer_/chainercv2/models/resnet.py|ResNet1001_1202_for_PyTorch/resnet.py | https://arxiv.org/abs/1512.03385 | 引用论文参考地址 | -| 开源代码引入 | https://github.com/osmr/imgclsmob/blob/master/chainer_/chainercv2/models/resnet.py|ResNet1001_1202_for_PyTorch/resnet_cifar.py | https://arxiv.org/abs/1512.03385 | 引用论文参考地址 | -| 开源代码引入 | https://github.com/osmr/imgclsmob/blob/master/chainer_/chainercv2/models/resnet.py|ResNet1001_1202_for_PyTorch/resnet.py | https://arxiv.org/abs/1512.03385 | 引用论文参考地址 | -| 开源代码引入 | https://github.com/osmr/imgclsmob/blob/master/chainer_/chainercv2/models/resnet.py|ResNet1001_1202_for_PyTorch/resnet_cifar.py | https://arxiv.org/abs/1512.03385 | 引用论文参考地址 | -| 开源代码引入 | https://github.com/osmr/imgclsmob/blob/master/chainer_/chainercv2/models/resnet.py|ResNet1001_1202_for_PyTorch/resnet.py | https://arxiv.org/abs/1512.03385 | 引用论文参考地址 | -| 开源代码引入 | https://github.com/osmr/imgclsmob/blob/master/chainer_/chainercv2/models/resnet.py|ResNet1001_1202_for_PyTorch/resnet_cifar.py | https://arxiv.org/abs/1512.03385 | 引用论文参考地址 | -| 开源代码引入 | https://github.com/osmr/imgclsmob/blob/master/chainer_/chainercv2/models/resnet.py|ResNet1001_1202_for_PyTorch/resnet.py | https://arxiv.org/abs/1512.03385 | 引用论文参考地址 | -| 开源代码引入 | https://github.com/osmr/imgclsmob/blob/master/chainer_/chainercv2/models/resnet.py|ResNet1001_1202_for_PyTorch/resnet.py | https://arxiv.org/abs/1512.03385 | 引用论文参考地址 | -| 开源代码引入 | https://github.com/osmr/imgclsmob/blob/master/chainer_/chainercv2/models/resnet.py|ResNet1001_1202_for_PyTorch/resnet_cifar.py | https://arxiv.org/abs/1512.03385 | 引用论文参考地址 | -| 开源代码引入 | https://github.com/osmr/imgclsmob/blob/master/chainer_/chainercv2/models/resnet.py|ResNet1001_1202_for_PyTorch/resnet.py | https://arxiv.org/abs/1512.03385 | 引用论文参考地址 | -| 开源代码引入 | https://github.com/osmr/imgclsmob/blob/master/chainer_/chainercv2/models/resnet.py|ResNet1001_1202_for_PyTorch/resnet_cifar.py | https://arxiv.org/abs/1512.03385 | 引用论文参考地址 | -| 开源代码引入 | https://github.com/osmr/imgclsmob/blob/master/chainer_/chainercv2/models/resnet.py|ResNet1001_1202_for_PyTorch/resnet.py | https://arxiv.org/abs/1512.03385 | 引用论文参考地址 | -| 开源代码引入 | https://github.com/osmr/imgclsmob/blob/master/chainer_/chainercv2/models/resnet.py|ResNet1001_1202_for_PyTorch/resnet_cifar.py | https://arxiv.org/abs/1512.03385 | 引用论文参考地址 | -| 开源代码引入 | https://github.com/osmr/imgclsmob/blob/master/chainer_/chainercv2/models/resnet.py|ResNet1001_1202_for_PyTorch/resnet.py | https://arxiv.org/abs/1512.03385 | 引用论文参考地址 | -| 开源代码引入 | 
https://github.com/osmr/imgclsmob/blob/master/chainer_/chainercv2/models/resnet.py|ResNet1001_1202_for_PyTorch/resnet.py | https://arxiv.org/abs/1512.03385 | 引用论文参考地址 | -| 开源代码引入 | https://github.com/osmr/imgclsmob/blob/master/chainer_/chainercv2/models/resnet.py|ResNet1001_1202_for_PyTorch/resnet_cifar.py | https://arxiv.org/abs/1512.03385 | 引用论文参考地址 | -| 开源代码引入 | https://github.com/osmr/imgclsmob/blob/master/chainer_/chainercv2/models/resnet.py|ResNet1001_1202_for_PyTorch/resnet.py | https://arxiv.org/abs/1512.03385 | 引用论文参考地址 | -| 开源代码引入 | https://github.com/osmr/imgclsmob/blob/master/chainer_/chainercv2/models/resnet.py|ResNet1001_1202_for_PyTorch/resnet_cifar.py | https://arxiv.org/abs/1512.03385 | 引用论文参考地址 | -| 开源代码引入 | https://github.com/osmr/imgclsmob/blob/master/chainer_/chainercv2/models/resnet.py|ResNet1001_1202_for_PyTorch/resnet.py | https://arxiv.org/abs/1512.03385 | 引用论文参考地址 | -| 开源代码引入 | https://github.com/osmr/imgclsmob/blob/master/chainer_/chainercv2/models/resnet.py|ResNet1001_1202_for_PyTorch/resnet_cifar.py | https://arxiv.org/abs/1512.03385 | 引用论文参考地址 | -| 开源代码引入 | https://github.com/osmr/imgclsmob/blob/master/chainer_/chainercv2/models/resnet.py|ResNet1001_1202_for_PyTorch/resnet.py | https://arxiv.org/abs/1512.03385 | 引用论文参考地址 | -| 开源代码引入 | https://github.com/osmr/imgclsmob/blob/master/chainer_/chainercv2/models/resnet.py|ResNet1001_1202_for_PyTorch/resnet_cifar.py | https://arxiv.org/abs/1512.03385 | 引用论文参考地址 | -| 开源代码引入 | https://github.com/osmr/imgclsmob/blob/master/chainer_/chainercv2/models/resnet.py|ResNet1001_1202_for_PyTorch/resnet.py | https://arxiv.org/abs/1512.03385 | 引用论文参考地址 | -| 开源代码引入 | https://github.com/osmr/imgclsmob/blob/master/chainer_/chainercv2/models/resnet.py|ResNet1001_1202_for_PyTorch/resnet.py | https://arxiv.org/abs/1512.03385 | 引用论文参考地址 | -| 开源代码引入 | https://github.com/osmr/imgclsmob/blob/master/chainer_/chainercv2/models/resnet.py|ResNet1001_1202_for_PyTorch/resnet_cifar.py | https://arxiv.org/abs/1512.03385 | 引用论文参考地址 | -| 开源代码引入 | https://github.com/osmr/imgclsmob/blob/master/chainer_/chainercv2/models/resnet.py|ResNet1001_1202_for_PyTorch/resnet.py | https://arxiv.org/abs/1512.03385 | 引用论文参考地址 | -| 开源代码引入 | https://github.com/osmr/imgclsmob/blob/master/chainer_/chainercv2/models/resnet.py|ResNet1001_1202_for_PyTorch/resnet_cifar.py | https://arxiv.org/abs/1512.03385 | 引用论文参考地址 | -| 开源代码引入 | https://github.com/osmr/imgclsmob/blob/master/chainer_/chainercv2/models/resnet.py|ResNet1001_1202_for_PyTorch/resnet.py | https://arxiv.org/abs/1512.03385 | 引用论文参考地址 | -| 开源代码引入 | https://github.com/osmr/imgclsmob/blob/master/chainer_/chainercv2/models/resnet.py|ResNet1001_1202_for_PyTorch/resnet_cifar.py | https://arxiv.org/abs/1512.03385 | 引用论文参考地址 | -| 开源代码引入 | https://github.com/osmr/imgclsmob/blob/master/chainer_/chainercv2/models/resnet.py|ResNet1001_1202_for_PyTorch/resnet.py | https://arxiv.org/abs/1512.03385 | 引用论文参考地址 | -| 开源代码引入 | https://github.com/osmr/imgclsmob/blob/master/chainer_/chainercv2/models/resnet.py|ResNet1001_1202_for_PyTorch/resnet_cifar.py | https://arxiv.org/abs/1512.03385 | 引用论文参考地址 | -| 开源代码引入 | https://github.com/osmr/imgclsmob/blob/master/chainer_/chainercv2/models/resnet.py|ResNet1001_1202_for_PyTorch/resnet.py | https://arxiv.org/abs/1512.03385 | 引用论文参考地址 | -| 开源代码引入 | https://github.com/osmr/imgclsmob/blob/master/chainer_/chainercv2/models/common.py|ResNet1001_1202_for_PyTorch/common.py | https://arxiv.org/abs/1707.01083 | 引用论文参考地址 | -| 开源代码引入 | 
https://github.com/osmr/imgclsmob/blob/master/chainer_/chainercv2/models/resnet.py|ResNet1001_1202_for_PyTorch/resnet_cifar.py | https://arxiv.org/abs/1512.03385 | 引用论文参考地址 |
-| 开源代码引入 | https://github.com/osmr/imgclsmob/blob/master/chainer_/chainercv2/models/resnet.py|ResNet1001_1202_for_PyTorch/resnet.py | https://arxiv.org/abs/1512.03385 | 引用论文参考地址 |
-| 开源代码引入 | https://github.com/osmr/imgclsmob/blob/master/chainer_/chainercv2/models/common.py|ResNet1001_1202_for_PyTorch/common.py | https://arxiv.org/abs/1707.01083 | 引用论文参考地址 |
-| 开源代码引入 | https://github.com/osmr/imgclsmob/blob/master/chainer_/chainercv2/models/common.py|ResNet1001_1202_for_PyTorch/common.py | https://arxiv.org/abs/1709.01507 | 引用论文参考地址 |
-| 开源代码引入 | https://github.com/osmr/imgclsmob/blob/master/gluon/gluoncv2/models/common.py|ResNet1001_1202_for_PyTorch/common.py | https://arxiv.org/abs/1807.09441 | 引用论文参考地址 |
\ No newline at end of file
+| 文件位置 | 公网地址 | 公网地址用途 |
+|--------------------------------------------------------------------------------------------------------------------|--------------------------|--------------|
+| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/ResNet1001_1202_for_PyTorch/DistributedResnet/main_apex_npu.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 |
+| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/ResNet1001_1202_for_PyTorch/pytorch_resnet_apex.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 |
\ No newline at end of file
diff --git a/PyTorch/built-in/cv/classification/ResNet50_for_PyTorch/public_address_statement.md b/PyTorch/built-in/cv/classification/ResNet50_for_PyTorch/public_address_statement.md
index f114649980fbe393575d42e6de493de55f2dfdd6..4929c4bdbac8afe54221798344f2d9a3e5af9047 100644
--- a/PyTorch/built-in/cv/classification/ResNet50_for_PyTorch/public_address_statement.md
+++ b/PyTorch/built-in/cv/classification/ResNet50_for_PyTorch/public_address_statement.md
@@ -1,5 +1,4 @@
-| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 |
-| ---- | ------------ | ------ | ------------------------------------ | -------- |
-| 开发引入 | / | ResNet50_for_PyTorch/DistributedResnet50/image_classification/training.py | https://www.github.com/nvidia/ap | 模型相关说明 |
-| 开发引入 | / | ResNet50_for_PyTorch/infer/mxbase/imagenet1000_clsidx_to_labels.names | https://gist.github.com/yrevar/942d3a0ac09ec9e5eb | 模型相关说明 |
-| 开发引入 | / | ResNet50_for_PyTorch/infer/sdk/imagenet1000_clsidx_to_labels.names | https://gist.github.com/yrevar/942d3a0ac09ec9e5eb | 模型相关说明 |
\ No newline at end of file
+| 文件位置 | 公网地址 | 公网地址用途 |
+|-------------------------------------------------------------------------------------------------------------------|--------------------------|--------------|
+| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/ResNet50_for_PyTorch/DistributedResnet50/main_apex_d76_npu.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 |
+| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/ResNet50_for_PyTorch/pytorch_resnet50_apex.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 |
\ No newline at end of file
diff --git a/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/public_address_statement.md b/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/public_address_statement.md
index 485a3317a17258ca24f7c34f98d493401f8d62b8..92d78ec635f56a266514c07ab1d6c096ae88646c 100644
--- a/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/public_address_statement.md
+++ b/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/public_address_statement.md
@@ -1,600 +1,263 @@
-| 类型 | 开源代码地址 | 
文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|-------------------------------------------------------------------------------------------------------------------------------|-------------------------------------------------------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------------------------------|---------| -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/CITATION.cff | Resnet50_Cifar_for_PyTorch/CITATION.cff | https://github.com/open-mmlab/mmclassification | 开源代码仓说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/conformer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/conformer/metafile.yml | https://arxiv.org/abs/2105.03889 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/conformer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/conformer/metafile.yml | https://github.com/open-mmlab/mmclassification/blob/v0.19.0/mmcls/models/backbones/conformer.py | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/conformer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/conformer/metafile.yml | https://download.openmmlab.com/mmclassification/v0/conformer/conformer-tiny-p16_3rdparty_8xb128_in1k_20211206-f6860372.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/conformer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/conformer/metafile.yml | https://drive.google.com/file/d/19SxGhKcWOR5oQSxNUWUM2MGYiaWMrF1z/view?usp=sharing | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/conformer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/conformer/metafile.yml | https://github.com/pengzhiliang/Conformer/blob/main/models.py#L65 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/conformer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/conformer/metafile.yml | https://download.openmmlab.com/mmclassification/v0/conformer/conformer-small-p16_3rdparty_8xb128_in1k_20211206-3065dcf5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/conformer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/conformer/metafile.yml | https://drive.google.com/file/d/1mpOlbLaVxOfEwV4-ha78j_1Ebqzj2B83/view?usp=sharing | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/conformer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/conformer/metafile.yml | https://github.com/pengzhiliang/Conformer/blob/main/models.py#L73 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/conformer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/conformer/metafile.yml | https://download.openmmlab.com/mmclassification/v0/conformer/conformer-small-p32_8xb128_in1k_20211206-947a0816.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/conformer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/conformer/metafile.yml | https://download.openmmlab.com/mmclassification/v0/conformer/conformer-base-p16_3rdparty_8xb128_in1k_20211206-bfdf8637.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/conformer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/conformer/metafile.yml | https://drive.google.com/file/d/1oeQ9LSOGKEUaYGu7WTlUGl3KDsQIi0MA/view?usp=sharing | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/conformer/metafile.yml | 
Resnet50_Cifar_for_PyTorch/configs/conformer/metafile.yml | https://github.com/pengzhiliang/Conformer/blob/main/models.py#L89 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/convmixer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/convmixer/metafile.yml | https://arxiv.org/abs/2201.09792 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/convmixer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/convmixer/metafile.yml | https://download.openmmlab.com/mmclassification/v0/convmixer/convmixer-768-32_3rdparty_10xb64_in1k_20220323-bca1f7b8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/convmixer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/convmixer/metafile.yml | https://github.com/tmp-iclr/convmixer/releases/download/v1.0/convmixer_768_32_ks7_p7_relu.pth.tar | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/convmixer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/convmixer/metafile.yml | https://github.com/locuslab/convmixer | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/convmixer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/convmixer/metafile.yml | https://download.openmmlab.com/mmclassification/v0/convmixer/convmixer-1024-20_3rdparty_10xb64_in1k_20220323-48f8aeba.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/convmixer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/convmixer/metafile.yml | https://github.com/tmp-iclr/convmixer/releases/download/v1.0/convmixer_1024_20_ks9_p14.pth.tar | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/convmixer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/convmixer/metafile.yml | https://github.com/locuslab/convmixer | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/convmixer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/convmixer/metafile.yml | https://download.openmmlab.com/mmclassification/v0/convmixer/convmixer-1536_20_3rdparty_10xb64_in1k_20220323-ea5786f3.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/convmixer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/convmixer/metafile.yml | https://github.com/tmp-iclr/convmixer/releases/download/v1.0/convmixer_1536_20_ks9_p7.pth.tar | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/convmixer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/convmixer/metafile.yml | https://github.com/locuslab/convmixer | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/convnext/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://arxiv.org/abs/2201.03545v1 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/convnext/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://github.com/open-mmlab/mmclassification/blob/v0.20.1/mmcls/models/backbones/convnext.py | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/convnext/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://download.openmmlab.com/mmclassification/v0/convnext/convnext-tiny_3rdparty_32xb128_in1k_20220124-18abde00.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/convnext/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | 
https://dl.fbaipublicfiles.com/convnext/convnext_tiny_1k_224_ema.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/convnext/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://github.com/facebookresearch/ConvNeXt | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/convnext/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://download.openmmlab.com/mmclassification/v0/convnext/convnext-tiny_3rdparty_32xb128-noema_in1k_20220222-2908964a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/convnext/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://dl.fbaipublicfiles.com/convnext/convnext_tiny_1k_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/convnext/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://github.com/facebookresearch/ConvNeXt | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/convnext/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://download.openmmlab.com/mmclassification/v0/convnext/convnext-small_3rdparty_32xb128_in1k_20220124-d39b5192.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/convnext/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://dl.fbaipublicfiles.com/convnext/convnext_small_1k_224_ema.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/convnext/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://github.com/facebookresearch/ConvNeXt | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/convnext/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://download.openmmlab.com/mmclassification/v0/convnext/convnext-small_3rdparty_32xb128-noema_in1k_20220222-fa001ca5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/convnext/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://dl.fbaipublicfiles.com/convnext/convnext_small_1k_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/convnext/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://github.com/facebookresearch/ConvNeXt | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/convnext/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://download.openmmlab.com/mmclassification/v0/convnext/convnext-base_3rdparty_32xb128_in1k_20220124-d0915162.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/convnext/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://dl.fbaipublicfiles.com/convnext/convnext_base_1k_224_ema.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/convnext/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://github.com/facebookresearch/ConvNeXt | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/convnext/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://download.openmmlab.com/mmclassification/v0/convnext/convnext-base_3rdparty_32xb128-noema_in1k_20220222-dba4f95f.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmpretrain/blob/main/configs/convnext/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://dl.fbaipublicfiles.com/convnext/convnext_base_1k_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/convnext/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://github.com/facebookresearch/ConvNeXt | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/convnext/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://download.openmmlab.com/mmclassification/v0/convnext/convnext-base_3rdparty_in21k_20220124-13b83eec.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/convnext/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://dl.fbaipublicfiles.com/convnext/convnext_base_22k_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/convnext/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://github.com/facebookresearch/ConvNeXt | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/convnext/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://download.openmmlab.com/mmclassification/v0/convnext/convnext-base_in21k-pre-3rdparty_32xb128_in1k_20220124-eb2d6ada.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/convnext/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://dl.fbaipublicfiles.com/convnext/convnext_base_22k_1k_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/convnext/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://github.com/facebookresearch/ConvNeXt | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/convnext/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://download.openmmlab.com/mmclassification/v0/convnext/convnext-large_3rdparty_64xb64_in1k_20220124-f8a0ded0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/convnext/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://dl.fbaipublicfiles.com/convnext/convnext_large_1k_224_ema.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/convnext/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://github.com/facebookresearch/ConvNeXt | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/convnext/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://download.openmmlab.com/mmclassification/v0/convnext/convnext-large_3rdparty_in21k_20220124-41b5a79f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/convnext/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://dl.fbaipublicfiles.com/convnext/convnext_large_22k_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/convnext/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://github.com/facebookresearch/ConvNeXt | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/convnext/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | 
https://download.openmmlab.com/mmclassification/v0/convnext/convnext-large_in21k-pre-3rdparty_64xb64_in1k_20220124-2412403d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/convnext/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://dl.fbaipublicfiles.com/convnext/convnext_large_22k_1k_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/convnext/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://github.com/facebookresearch/ConvNeXt | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/convnext/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://download.openmmlab.com/mmclassification/v0/convnext/convnext-xlarge_3rdparty_in21k_20220124-f909bad7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/convnext/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://dl.fbaipublicfiles.com/convnext/convnext_xlarge_22k_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/convnext/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://github.com/facebookresearch/ConvNeXt | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/convnext/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://download.openmmlab.com/mmclassification/v0/convnext/convnext-xlarge_in21k-pre-3rdparty_64xb64_in1k_20220124-76b6863d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/convnext/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://dl.fbaipublicfiles.com/convnext/convnext_xlarge_22k_1k_224_ema.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/convnext/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://github.com/facebookresearch/ConvNeXt | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/cspnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/cspnet/metafile.yml | https://arxiv.org/abs/1911.11929 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/cspnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/cspnet/metafile.yml | https://github.com/open-mmlab/mmclassification/blob/v0.22.0/mmcls/models/backbones/cspnet.py | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/cspnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/cspnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/cspnet/cspdarknet50_3rdparty_8xb32_in1k_20220329-bd275287.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/cspnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/cspnet/metafile.yml | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/cspdarknet53_ra_256-d05c7c21.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/cspnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/cspnet/metafile.yml | https://github.com/rwightman/pytorch-image-models | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/cspnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/cspnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/cspnet/cspresnet50_3rdparty_8xb32_in1k_20220329-dd6dddfb.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmpretrain/blob/main/configs/cspnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/cspnet/metafile.yml | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/cspresnet50_ra-d3e8d487.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/cspnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/cspnet/metafile.yml | https://github.com/rwightman/pytorch-image-models | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/cspnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/cspnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/cspnet/cspresnext50_3rdparty_8xb32_in1k_20220329-2cc84d21.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/cspnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/cspnet/metafile.yml | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/cspresnext50_ra_224-648b4713.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/cspnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/cspnet/metafile.yml | https://github.com/rwightman/pytorch-image-models | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/deit/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/deit/metafile.yml | https://arxiv.org/abs/2012.12877 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/deit/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/deit/metafile.yml | https://github.com/open-mmlab/mmclassification/blob/v0.19.0/mmcls/models/backbones/deit.py | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/deit/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/deit/metafile.yml | https://download.openmmlab.com/mmclassification/v0/deit/deit-tiny_pt-4xb256_in1k_20220218-13b382a0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/deit/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/deit/metafile.yml | https://download.openmmlab.com/mmclassification/v0/deit/deit-tiny-distilled_3rdparty_pt-4xb256_in1k_20211216-c429839a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/deit/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/deit/metafile.yml | https://dl.fbaipublicfiles.com/deit/deit_tiny_distilled_patch16_224-b40b3cf7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/deit/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/deit/metafile.yml | https://github.com/facebookresearch/deit/blob/f5123946205daf72a88783dae94cabff98c49c55/models.py#L108 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/deit/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/deit/metafile.yml | https://download.openmmlab.com/mmclassification/v0/deit/deit-small_pt-4xb256_in1k_20220218-9425b9bb.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/deit/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/deit/metafile.yml | https://download.openmmlab.com/mmclassification/v0/deit/deit-small-distilled_3rdparty_pt-4xb256_in1k_20211216-4de1d725.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/deit/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/deit/metafile.yml | https://dl.fbaipublicfiles.com/deit/deit_small_distilled_patch16_224-649709d9.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmpretrain/blob/main/configs/deit/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/deit/metafile.yml | https://github.com/facebookresearch/deit/blob/f5123946205daf72a88783dae94cabff98c49c55/models.py#L123 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/deit/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/deit/metafile.yml | https://download.openmmlab.com/mmclassification/v0/deit/deit-base_pt-16xb64_in1k_20220216-db63c16c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/deit/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/deit/metafile.yml | https://download.openmmlab.com/mmclassification/v0/deit/deit-base_3rdparty_pt-16xb64_in1k_20211124-6f40c188.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/deit/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/deit/metafile.yml | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_224-b5f2ef4d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/deit/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/deit/metafile.yml | https://github.com/facebookresearch/deit/blob/f5123946205daf72a88783dae94cabff98c49c55/models.py#L93 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/deit/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/deit/metafile.yml | https://download.openmmlab.com/mmclassification/v0/deit/deit-base-distilled_3rdparty_pt-16xb64_in1k_20211216-42891296.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/deit/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/deit/metafile.yml | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_224-df68dfff.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/deit/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/deit/metafile.yml | https://github.com/facebookresearch/deit/blob/f5123946205daf72a88783dae94cabff98c49c55/models.py#L138 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/deit/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/deit/metafile.yml | https://download.openmmlab.com/mmclassification/v0/deit/deit-base_3rdparty_ft-16xb32_in1k-384px_20211124-822d02f2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/deit/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/deit/metafile.yml | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_384-8de9b5d1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/deit/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/deit/metafile.yml | https://github.com/facebookresearch/deit/blob/f5123946205daf72a88783dae94cabff98c49c55/models.py#L153 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/deit/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/deit/metafile.yml | https://download.openmmlab.com/mmclassification/v0/deit/deit-base-distilled_3rdparty_ft-16xb32_in1k-384px_20211216-e48d6000.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/deit/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/deit/metafile.yml | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_384-d0272ac0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/deit/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/deit/metafile.yml | 
https://github.com/facebookresearch/deit/blob/f5123946205daf72a88783dae94cabff98c49c55/models.py#L168 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/densenet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/densenet/metafile.yml | https://arxiv.org/abs/1608.06993 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/densenet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/densenet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/densenet/densenet121_4xb256_in1k_20220426-07450f99.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/densenet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/densenet/metafile.yml | https://download.pytorch.org/models/densenet121-a639ec97.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/densenet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/densenet/metafile.yml | https://github.com/pytorch/vision/blob/main/torchvision/models/densenet.py | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/densenet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/densenet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/densenet/densenet169_4xb256_in1k_20220426-a2889902.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/densenet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/densenet/metafile.yml | https://download.pytorch.org/models/densenet169-b2777c0a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/densenet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/densenet/metafile.yml | https://github.com/pytorch/vision/blob/main/torchvision/models/densenet.py | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/densenet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/densenet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/densenet/densenet201_4xb256_in1k_20220426-05cae4ef.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/densenet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/densenet/metafile.yml | https://download.pytorch.org/models/densenet201-c1103571.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/densenet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/densenet/metafile.yml | https://github.com/pytorch/vision/blob/main/torchvision/models/densenet.py | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/densenet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/densenet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/densenet/densenet161_4xb256_in1k_20220426-ee6a80a9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/densenet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/densenet/metafile.yml | https://download.pytorch.org/models/densenet161-8d451a50.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/densenet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/densenet/metafile.yml | https://github.com/pytorch/vision/blob/main/torchvision/models/densenet.py | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://arxiv.org/abs/1905.11946v5 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | 
Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://github.com/open-mmlab/mmclassification/blob/v0.20.1/mmcls/models/backbones/efficientnet.py | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/efficientnet/efficientnet-b0_3rdparty_8xb32_in1k_20220119-a7e2a0b1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://storage.googleapis.com/cloud-tpu-checkpoints/efficientnet/ckpts/efficientnet-b0.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/efficientnet/efficientnet-b0_3rdparty_8xb32-aa_in1k_20220119-8d939117.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://storage.googleapis.com/cloud-tpu-checkpoints/efficientnet/ckptsaug/efficientnet-b0.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/efficientnet/efficientnet-b0_3rdparty_8xb32-aa-advprop_in1k_20220119-26434485.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://storage.googleapis.com/cloud-tpu-checkpoints/efficientnet/advprop/efficientnet-b0.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/efficientnet/efficientnet-b1_3rdparty_8xb32_in1k_20220119-002556d9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://storage.googleapis.com/cloud-tpu-checkpoints/efficientnet/ckpts/efficientnet-b1.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | 
Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/efficientnet/efficientnet-b1_3rdparty_8xb32-aa_in1k_20220119-619d8ae3.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://storage.googleapis.com/cloud-tpu-checkpoints/efficientnet/ckptsaug/efficientnet-b1.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/efficientnet/efficientnet-b1_3rdparty_8xb32-aa-advprop_in1k_20220119-5715267d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://storage.googleapis.com/cloud-tpu-checkpoints/efficientnet/advprop/efficientnet-b1.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/efficientnet/efficientnet-b2_3rdparty_8xb32_in1k_20220119-ea374a30.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://storage.googleapis.com/cloud-tpu-checkpoints/efficientnet/ckpts/efficientnet-b2.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/efficientnet/efficientnet-b2_3rdparty_8xb32-aa_in1k_20220119-dd61e80b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://storage.googleapis.com/cloud-tpu-checkpoints/efficientnet/ckptsaug/efficientnet-b2.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/efficientnet/efficientnet-b2_3rdparty_8xb32-aa-advprop_in1k_20220119-1655338a.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://storage.googleapis.com/cloud-tpu-checkpoints/efficientnet/advprop/efficientnet-b2.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/efficientnet/efficientnet-b3_3rdparty_8xb32_in1k_20220119-4b4d7487.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://storage.googleapis.com/cloud-tpu-checkpoints/efficientnet/ckpts/efficientnet-b3.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/efficientnet/efficientnet-b3_3rdparty_8xb32-aa_in1k_20220119-5b4887a0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://storage.googleapis.com/cloud-tpu-checkpoints/efficientnet/ckptsaug/efficientnet-b3.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/efficientnet/efficientnet-b3_3rdparty_8xb32-aa-advprop_in1k_20220119-53b41118.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://storage.googleapis.com/cloud-tpu-checkpoints/efficientnet/advprop/efficientnet-b3.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/efficientnet/efficientnet-b4_3rdparty_8xb32_in1k_20220119-81fd4077.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://storage.googleapis.com/cloud-tpu-checkpoints/efficientnet/ckpts/efficientnet-b4.tar.gz | 下载权重文件 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/efficientnet/efficientnet-b4_3rdparty_8xb32-aa_in1k_20220119-45b8bd2b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://storage.googleapis.com/cloud-tpu-checkpoints/efficientnet/ckptsaug/efficientnet-b4.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/efficientnet/efficientnet-b4_3rdparty_8xb32-aa-advprop_in1k_20220119-38c2238c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://storage.googleapis.com/cloud-tpu-checkpoints/efficientnet/advprop/efficientnet-b4.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/efficientnet/efficientnet-b5_3rdparty_8xb32_in1k_20220119-e9814430.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://storage.googleapis.com/cloud-tpu-checkpoints/efficientnet/ckpts/efficientnet-b5.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/efficientnet/efficientnet-b5_3rdparty_8xb32-aa_in1k_20220119-2cab8b78.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://storage.googleapis.com/cloud-tpu-checkpoints/efficientnet/ckptsaug/efficientnet-b5.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet | 代码链接 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/efficientnet/efficientnet-b5_3rdparty_8xb32-aa-advprop_in1k_20220119-f57a895a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://storage.googleapis.com/cloud-tpu-checkpoints/efficientnet/advprop/efficientnet-b5.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/efficientnet/efficientnet-b6_3rdparty_8xb32-aa_in1k_20220119-45b03310.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://storage.googleapis.com/cloud-tpu-checkpoints/efficientnet/ckptsaug/efficientnet-b6.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/efficientnet/efficientnet-b6_3rdparty_8xb32-aa-advprop_in1k_20220119-bfe3485e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://storage.googleapis.com/cloud-tpu-checkpoints/efficientnet/advprop/efficientnet-b6.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/efficientnet/efficientnet-b7_3rdparty_8xb32-aa_in1k_20220119-bf03951c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://storage.googleapis.com/cloud-tpu-checkpoints/efficientnet/ckptsaug/efficientnet-b7.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | 
https://download.openmmlab.com/mmclassification/v0/efficientnet/efficientnet-b7_3rdparty_8xb32-aa-advprop_in1k_20220119-c6dbff10.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://storage.googleapis.com/cloud-tpu-checkpoints/efficientnet/advprop/efficientnet-b7.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/efficientnet/efficientnet-b8_3rdparty_8xb32-aa-advprop_in1k_20220119-297ce1b7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://storage.googleapis.com/cloud-tpu-checkpoints/efficientnet/advprop/efficientnet-b8.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/efficientnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/fp16/resnet50_b32x8_fp16_dynamic_imagenet.py | Resnet50_Cifar_for_PyTorch/configs/fp16/resnet50_b32x8_fp16_dynamic_imagenet.py | https://github.com/open-mmlab/mmclassification/pull/508 | 参考代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/fp16/resnet50_b32x8_fp16_imagenet.py | Resnet50_Cifar_for_PyTorch/configs/fp16/resnet50_b32x8_fp16_imagenet.py | https://github.com/open-mmlab/mmclassification/pull/508 | 参考代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/hrnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/hrnet/metafile.yml | https://arxiv.org/abs/1908.07919v2 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/hrnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/hrnet/metafile.yml | https://github.com/open-mmlab/mmclassification/blob/v0.20.1/mmcls/models/backbones/hrnet.py | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/hrnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/hrnet/hrnet-w18_3rdparty_8xb32_in1k_20220120-0c10b180.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/hrnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/hrnet/metafile.yml | https://1drv.ms/u/s!Aus8VCZ_C_33cMkPimlmClRvmpw | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/hrnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/hrnet/metafile.yml | https://github.com/HRNet/HRNet-Image-Classification | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/hrnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/hrnet/hrnet-w30_3rdparty_8xb32_in1k_20220120-8aa3832f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/hrnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/hrnet/metafile.yml | 
https://1drv.ms/u/s!Aus8VCZ_C_33cQoACCEfrzcSaVI | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/hrnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/hrnet/metafile.yml | https://github.com/HRNet/HRNet-Image-Classification | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/hrnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/hrnet/hrnet-w32_3rdparty_8xb32_in1k_20220120-c394f1ab.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/hrnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/hrnet/metafile.yml | https://1drv.ms/u/s!Aus8VCZ_C_33dYBMemi9xOUFR0w | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/hrnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/hrnet/metafile.yml | https://github.com/HRNet/HRNet-Image-Classification | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/hrnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/hrnet/hrnet-w40_3rdparty_8xb32_in1k_20220120-9a2dbfc5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/hrnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/hrnet/metafile.yml | https://1drv.ms/u/s!Aus8VCZ_C_33ck0gvo5jfoWBOPo | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/hrnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/hrnet/metafile.yml | https://github.com/HRNet/HRNet-Image-Classification | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/hrnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/hrnet/hrnet-w44_3rdparty_8xb32_in1k_20220120-35d07f73.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/hrnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/hrnet/metafile.yml | https://1drv.ms/u/s!Aus8VCZ_C_33czZQ0woUb980gRs | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/hrnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/hrnet/metafile.yml | https://github.com/HRNet/HRNet-Image-Classification | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/hrnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/hrnet/hrnet-w48_3rdparty_8xb32_in1k_20220120-e555ef50.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/hrnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/hrnet/metafile.yml | https://1drv.ms/u/s!Aus8VCZ_C_33dKvqI6pBZlifgJk | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/hrnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/hrnet/metafile.yml | https://github.com/HRNet/HRNet-Image-Classification | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/hrnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/hrnet/hrnet-w64_3rdparty_8xb32_in1k_20220120-19126642.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/hrnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/hrnet/metafile.yml | https://1drv.ms/u/s!Aus8VCZ_C_33gQbJsUPTIj3rQu99 | 下载权重文件 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmpretrain/blob/main/configs/hrnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/hrnet/metafile.yml | https://github.com/HRNet/HRNet-Image-Classification | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/hrnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/hrnet/hrnet-w18_3rdparty_8xb32-ssld_in1k_20220120-455f69ea.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/hrnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/hrnet/metafile.yml | https://github.com/HRNet/HRNet-Image-Classification/releases/download/PretrainedWeights/HRNet_W18_C_ssld_pretrained.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/hrnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/hrnet/metafile.yml | https://github.com/HRNet/HRNet-Image-Classification | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/hrnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/hrnet/hrnet-w48_3rdparty_8xb32-ssld_in1k_20220120-d0459c38.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/hrnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/hrnet/metafile.yml | https://github.com/HRNet/HRNet-Image-Classification/releases/download/PretrainedWeights/HRNet_W48_C_ssld_pretrained.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/hrnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/hrnet/metafile.yml | https://github.com/HRNet/HRNet-Image-Classification | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/mlp_mixer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/mlp_mixer/metafile.yml | https://arxiv.org/abs/2105.01601 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/mlp_mixer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/mlp_mixer/metafile.yml | https://github.com/open-mmlab/mmclassification/blob/v0.18.0/mmcls/models/backbones/mlp_mixer.py | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/mlp_mixer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/mlp_mixer/metafile.yml | https://download.openmmlab.com/mmclassification/v0/mlp-mixer/mixer-base-p16_3rdparty_64xb64_in1k_20211124-1377e3e0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/mlp_mixer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/mlp_mixer/metafile.yml | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_mixer_b16_224-76587d61.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/mlp_mixer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/mlp_mixer/metafile.yml | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/mlp_mixer.py#L70 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/mlp_mixer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/mlp_mixer/metafile.yml | https://download.openmmlab.com/mmclassification/v0/mlp-mixer/mixer-large-p16_3rdparty_64xb64_in1k_20211124-5a2519d2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/mlp_mixer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/mlp_mixer/metafile.yml | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_mixer_b16_224_in21k-617b3de2.pth 
| 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/mlp_mixer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/mlp_mixer/metafile.yml | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/mlp_mixer.py#L73 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/mobilenet_v2/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/mobilenet_v2/metafile.yml | https://arxiv.org/abs/1801.04381 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/mobilenet_v2/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/mobilenet_v2/metafile.yml | https://github.com/open-mmlab/mmclassification/blob/v0.15.0/mmcls/models/backbones/mobilenet_v2.py#L101 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/mobilenet_v2/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/mobilenet_v2/metafile.yml | https://download.openmmlab.com/mmclassification/v0/mobilenet_v2/mobilenet_v2_batch256_imagenet_20200708-3b2dc3af.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/mobilenet_v2/mobilenet_v2_b32x8_imagenet.py | Resnet50_Cifar_for_PyTorch/configs/mobilenet_v2/mobilenet_v2_b32x8_imagenet.py | https://github.com/open-mmlab/mmclassification/pull/508 | 参考代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/mobilenet_v3/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/mobilenet_v3/metafile.yml | https://arxiv.org/abs/1905.02244 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/mobilenet_v3/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/mobilenet_v3/metafile.yml | https://github.com/open-mmlab/mmclassification/blob/v0.15.0/mmcls/models/backbones/mobilenet_v3.py | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/mobilenet_v3/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/mobilenet_v3/metafile.yml | https://download.openmmlab.com/mmclassification/v0/mobilenet_v3/convert/mobilenet_v3_small-8427ecf0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/mobilenet_v3/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/mobilenet_v3/metafile.yml | https://download.openmmlab.com/mmclassification/v0/mobilenet_v3/convert/mobilenet_v3_large-3ea3c186.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/mobilenet_v3/mobilenet_v3_large_imagenet.py | Resnet50_Cifar_for_PyTorch/configs/mobilenet_v3/mobilenet_v3_large_imagenet.py | https://github.com/open-mmlab/mmclassification/pull/508 | 参考代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/mobilenet_v3/mobilenet_v3_small_cifar.py | Resnet50_Cifar_for_PyTorch/configs/mobilenet_v3/mobilenet_v3_small_cifar.py | https://github.com/open-mmlab/mmclassification/pull/508 | 参考代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/mobilenet_v3/mobilenet_v3_small_imagenet.py | Resnet50_Cifar_for_PyTorch/configs/mobilenet_v3/mobilenet_v3_small_imagenet.py | https://github.com/open-mmlab/mmclassification/pull/508 | 参考代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/poolformer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/poolformer/metafile.yml | https://arxiv.org/abs/2111.11418 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/poolformer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/poolformer/metafile.yml | 
https://github.com/open-mmlab/mmclassification/blob/v0.22.1/mmcls/models/backbones/poolformer.py | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/poolformer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/poolformer/metafile.yml | https://download.openmmlab.com/mmclassification/v0/poolformer/poolformer-s12_3rdparty_32xb128_in1k_20220414-f8d83051.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/poolformer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/poolformer/metafile.yml | https://github.com/sail-sg/poolformer/releases/download/v1.0/poolformer_s12.pth.tar | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/poolformer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/poolformer/metafile.yml | https://github.com/sail-sg/poolformer | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/poolformer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/poolformer/metafile.yml | https://download.openmmlab.com/mmclassification/v0/poolformer/poolformer-s24_3rdparty_32xb128_in1k_20220414-d7055904.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/poolformer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/poolformer/metafile.yml | https://github.com/sail-sg/poolformer/releases/download/v1.0/poolformer_s24.pth.tar | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/poolformer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/poolformer/metafile.yml | https://github.com/sail-sg/poolformer | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/poolformer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/poolformer/metafile.yml | https://download.openmmlab.com/mmclassification/v0/poolformer/poolformer-s36_3rdparty_32xb128_in1k_20220414-d78ff3e8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/poolformer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/poolformer/metafile.yml | https://github.com/sail-sg/poolformer/releases/download/v1.0/poolformer_s36.pth.tar | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/poolformer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/poolformer/metafile.yml | https://github.com/sail-sg/poolformer | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/poolformer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/poolformer/metafile.yml | https://download.openmmlab.com/mmclassification/v0/poolformer/poolformer-m36_3rdparty_32xb128_in1k_20220414-c55e0949.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/poolformer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/poolformer/metafile.yml | https://github.com/sail-sg/poolformer/releases/download/v1.0/poolformer_m36.pth.tar | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/poolformer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/poolformer/metafile.yml | https://github.com/sail-sg/poolformer | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/poolformer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/poolformer/metafile.yml | https://download.openmmlab.com/mmclassification/v0/poolformer/poolformer-m48_3rdparty_32xb128_in1k_20220414-9378f3eb.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/poolformer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/poolformer/metafile.yml | 
https://github.com/sail-sg/poolformer/releases/download/v1.0/poolformer_m48.pth.tar | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/poolformer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/poolformer/metafile.yml | https://github.com/sail-sg/poolformer | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/regnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/regnet/metafile.yml | https://arxiv.org/abs/2003.13678 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/regnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/regnet/metafile.yml | https://github.com/open-mmlab/mmclassification/blob/v0.18.0/mmcls/models/backbones/regnet.py | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/regnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/regnet/regnetx-400mf_8xb128_in1k_20211213-89bfc226.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/regnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/regnet/regnetx-800mf_8xb128_in1k_20211213-222b0f11.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/regnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/regnet/regnetx-1.6gf_8xb128_in1k_20211213-d1b89758.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/regnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/regnet/regnetx-3.2gf_8xb64_in1k_20211213-1fdd82ae.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/regnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/regnet/regnetx-4.0gf_8xb64_in1k_20211213-efed675c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/regnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/regnet/regnetx-6.4gf_8xb64_in1k_20211215-5c6089da.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/regnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/regnet/regnetx-8.0gf_8xb64_in1k_20211213-9a9fcc76.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/regnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/regnet/regnetx-12gf_8xb64_in1k_20211213-5df8c2f8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/repmlp/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/repmlp/metafile.yml | https://arxiv.org/abs/2105.01883 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/repmlp/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/repmlp/metafile.yml | https://github.com/open-mmlab/mmclassification/blob/v0.21.0/mmcls/models/backbones/repmlp.py | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/repmlp/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/repmlp/metafile.yml | 
https://download.openmmlab.com/mmclassification/v0/repmlp/repmlp-base_3rdparty_8xb64_in1k_20220330-1cb1f11b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/repmlp/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/repmlp/metafile.yml | https://github.com/DingXiaoH/RepMLP | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/repmlp/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/repmlp/metafile.yml | https://github.com/DingXiaoH/RepMLP/blob/072d8516beba83d75dfe6ebb12f625abad4b53d5/repmlpnet.py#L274 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/repmlp/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/repmlp/metafile.yml | https://download.openmmlab.com/mmclassification/v0/repmlp/repmlp-base_3rdparty_8xb64_in1k-256px_20220330-7c5a91ce.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/repmlp/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/repmlp/metafile.yml | https://github.com/DingXiaoH/RepMLP | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/repmlp/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/repmlp/metafile.yml | https://github.com/DingXiaoH/RepMLP/blob/072d8516beba83d75dfe6ebb12f625abad4b53d5/repmlpnet.py#L278 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/repvgg/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/repvgg/metafile.yml | https://arxiv.org/abs/2101.03697 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/repvgg/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/repvgg/metafile.yml | https://github.com/open-mmlab/mmclassification/blob/v0.16.0/mmcls/models/backbones/repvgg.py#L257 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/repvgg/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/repvgg/metafile.yml | https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-A0_3rdparty_4xb64-coslr-120e_in1k_20210909-883ab98c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/repvgg/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/repvgg/metafile.yml | https://drive.google.com/drive/folders/1Avome4KvNp0Lqh2QwhXO6L5URQjzCjUq | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/repvgg/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/repvgg/metafile.yml | https://github.com/DingXiaoH/RepVGG/blob/9f272318abfc47a2b702cd0e916fca8d25d683e7/repvgg.py#L196 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/repvgg/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/repvgg/metafile.yml | https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-A1_3rdparty_4xb64-coslr-120e_in1k_20210909-24003a24.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/repvgg/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/repvgg/metafile.yml | https://drive.google.com/drive/folders/1Avome4KvNp0Lqh2QwhXO6L5URQjzCjUq | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/repvgg/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/repvgg/metafile.yml | https://github.com/DingXiaoH/RepVGG/blob/9f272318abfc47a2b702cd0e916fca8d25d683e7/repvgg.py#L200 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/repvgg/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/repvgg/metafile.yml | 
https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-A2_3rdparty_4xb64-coslr-120e_in1k_20210909-97d7695a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/repvgg/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/repvgg/metafile.yml | https://drive.google.com/drive/folders/1Avome4KvNp0Lqh2QwhXO6L5URQjzCjUq | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/repvgg/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/repvgg/metafile.yml | https://github.com/DingXiaoH/RepVGG/blob/9f272318abfc47a2b702cd0e916fca8d25d683e7/repvgg.py#L204 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/repvgg/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/repvgg/metafile.yml | https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-B0_3rdparty_4xb64-coslr-120e_in1k_20210909-446375f4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/repvgg/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/repvgg/metafile.yml | https://drive.google.com/drive/folders/1Avome4KvNp0Lqh2QwhXO6L5URQjzCjUq | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/repvgg/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/repvgg/metafile.yml | https://github.com/DingXiaoH/RepVGG/blob/9f272318abfc47a2b702cd0e916fca8d25d683e7/repvgg.py#L208 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/repvgg/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/repvgg/metafile.yml | https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-B1_3rdparty_4xb64-coslr-120e_in1k_20210909-750cdf67.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/repvgg/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/repvgg/metafile.yml | https://drive.google.com/drive/folders/1Avome4KvNp0Lqh2QwhXO6L5URQjzCjUq | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/repvgg/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/repvgg/metafile.yml | https://github.com/DingXiaoH/RepVGG/blob/9f272318abfc47a2b702cd0e916fca8d25d683e7/repvgg.py#L212 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/repvgg/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/repvgg/metafile.yml | https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-B1g2_3rdparty_4xb64-coslr-120e_in1k_20210909-344f6422.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/repvgg/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/repvgg/metafile.yml | https://drive.google.com/drive/folders/1Avome4KvNp0Lqh2QwhXO6L5URQjzCjUq | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/repvgg/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/repvgg/metafile.yml | https://github.com/DingXiaoH/RepVGG/blob/9f272318abfc47a2b702cd0e916fca8d25d683e7/repvgg.py#L216 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/repvgg/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/repvgg/metafile.yml | https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-B1g4_3rdparty_4xb64-coslr-120e_in1k_20210909-d4c1a642.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/repvgg/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/repvgg/metafile.yml | https://drive.google.com/drive/folders/1Avome4KvNp0Lqh2QwhXO6L5URQjzCjUq | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/repvgg/metafile.yml | 
Resnet50_Cifar_for_PyTorch/configs/repvgg/metafile.yml | https://github.com/DingXiaoH/RepVGG/blob/9f272318abfc47a2b702cd0e916fca8d25d683e7/repvgg.py#L220 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/repvgg/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/repvgg/metafile.yml | https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-B2_3rdparty_4xb64-coslr-120e_in1k_20210909-bd6b937c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/repvgg/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/repvgg/metafile.yml | https://drive.google.com/drive/folders/1Avome4KvNp0Lqh2QwhXO6L5URQjzCjUq | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/repvgg/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/repvgg/metafile.yml | https://github.com/DingXiaoH/RepVGG/blob/9f272318abfc47a2b702cd0e916fca8d25d683e7/repvgg.py#L225 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/repvgg/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/repvgg/metafile.yml | https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-B2g4_3rdparty_4xb64-autoaug-lbs-mixup-coslr-200e_in1k_20210909-7b7955f0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/repvgg/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/repvgg/metafile.yml | https://drive.google.com/drive/folders/1Avome4KvNp0Lqh2QwhXO6L5URQjzCjUq | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/repvgg/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/repvgg/metafile.yml | https://github.com/DingXiaoH/RepVGG/blob/9f272318abfc47a2b702cd0e916fca8d25d683e7/repvgg.py#L229 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/repvgg/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/repvgg/metafile.yml | https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-B3_3rdparty_4xb64-autoaug-lbs-mixup-coslr-200e_in1k_20210909-dda968bf.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/repvgg/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/repvgg/metafile.yml | https://drive.google.com/drive/folders/1Avome4KvNp0Lqh2QwhXO6L5URQjzCjUq | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/repvgg/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/repvgg/metafile.yml | https://github.com/DingXiaoH/RepVGG/blob/9f272318abfc47a2b702cd0e916fca8d25d683e7/repvgg.py#L238 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/repvgg/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/repvgg/metafile.yml | https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-B3g4_3rdparty_4xb64-autoaug-lbs-mixup-coslr-200e_in1k_20210909-4e54846a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/repvgg/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/repvgg/metafile.yml | https://drive.google.com/drive/folders/1Avome4KvNp0Lqh2QwhXO6L5URQjzCjUq | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/repvgg/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/repvgg/metafile.yml | https://github.com/DingXiaoH/RepVGG/blob/9f272318abfc47a2b702cd0e916fca8d25d683e7/repvgg.py#L238 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/repvgg/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/repvgg/metafile.yml | 
https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-D2se_3rdparty_4xb64-autoaug-lbs-mixup-coslr-200e_in1k_20210909-cf3139b7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/repvgg/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/repvgg/metafile.yml | https://drive.google.com/drive/folders/1Avome4KvNp0Lqh2QwhXO6L5URQjzCjUq | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/repvgg/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/repvgg/metafile.yml | https://github.com/DingXiaoH/RepVGG/blob/9f272318abfc47a2b702cd0e916fca8d25d683e7/repvgg.py#L250 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/res2net/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/res2net/metafile.yml | https://arxiv.org/pdf/1904.01169.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/res2net/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/res2net/metafile.yml | https://github.com/open-mmlab/mmclassification/blob/v0.17.0/mmcls/models/backbones/res2net.py | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/res2net/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/res2net/metafile.yml | https://download.openmmlab.com/mmclassification/v0/res2net/res2net50-w14-s8_3rdparty_8xb32_in1k_20210927-bc967bf1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/res2net/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/res2net/metafile.yml | https://1drv.ms/u/s!AkxDDnOtroRPdOTqhF8ne_aakDI?e=EVb8Ri | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/res2net/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/res2net/metafile.yml | https://github.com/Res2Net/Res2Net-PretrainedModels/blob/master/res2net.py#L221 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/res2net/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/res2net/metafile.yml | https://download.openmmlab.com/mmclassification/v0/res2net/res2net50-w26-s8_3rdparty_8xb32_in1k_20210927-f547a94b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/res2net/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/res2net/metafile.yml | https://1drv.ms/u/s!AkxDDnOtroRPdTrAd_Afzc26Z7Q?e=slYqsR | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/res2net/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/res2net/metafile.yml | https://github.com/Res2Net/Res2Net-PretrainedModels/blob/master/res2net.py#L201 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/res2net/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/res2net/metafile.yml | https://download.openmmlab.com/mmclassification/v0/res2net/res2net101-w26-s4_3rdparty_8xb32_in1k_20210927-870b6c36.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/res2net/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/res2net/metafile.yml | https://1drv.ms/u/s!AkxDDnOtroRPcJRgTLkahL0cFYw?e=nwbnic | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/res2net/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/res2net/metafile.yml | https://github.com/Res2Net/Res2Net-PretrainedModels/blob/master/res2net.py#L181 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnest/resnest50_b64x32_imagenet.py | Resnet50_Cifar_for_PyTorch/configs/resnest/resnest50_b64x32_imagenet.py | 
https://github.com/open-mmlab/mmclassification/pull/508 | 参考代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnest/resnest101_b64x32_imagenet.py | Resnet50_Cifar_for_PyTorch/configs/resnest/resnest101_b64x32_imagenet.py | https://github.com/open-mmlab/mmclassification/pull/508 | 参考代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnest/resnest200_b32x64_imagenet.py | Resnet50_Cifar_for_PyTorch/configs/resnest/resnest200_b32x64_imagenet.py | https://github.com/open-mmlab/mmclassification/pull/508 | 参考代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnest/resnest269_b32x64_imagenet.py | Resnet50_Cifar_for_PyTorch/configs/resnest/resnest269_b32x64_imagenet.py | https://github.com/open-mmlab/mmclassification/pull/508 | 参考代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/resnet/metafile.yml | https://openaccess.thecvf.com/content_cvpr_2016/html/He_Deep_Residual_Learning_CVPR_2016_paper.html | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/resnet/metafile.yml | https://github.com/open-mmlab/mmclassification/blob/v0.15.0/mmcls/models/backbones/resnet.py#L383 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/resnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/resnet/resnet18_b16x8_cifar10_20210528-bd6371c8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/resnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/resnet/resnet34_b16x8_cifar10_20210528-a8aa36a6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/resnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/resnet/resnet50_b16x8_cifar10_20210528-f54bfad9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/resnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/resnet/resnet101_b16x8_cifar10_20210528-2d29e936.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/resnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/resnet/resnet152_b16x8_cifar10_20210528-3e8e9178.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/resnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/resnet/resnet50_b16x8_cifar100_20210528-67b58a1b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/resnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/resnet/resnet18_8xb32_in1k_20210831-fbbb1da6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/resnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/resnet/resnet34_8xb32_in1k_20210831-f257d4e6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnet/metafile.yml | 
Resnet50_Cifar_for_PyTorch/configs/resnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/resnet/resnet50_8xb32_in1k_20210831-ea4938fc.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/resnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/resnet/resnet101_8xb32_in1k_20210831-539c63f8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/resnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/resnet/resnet152_8xb32_in1k_20210901-4d7582fa.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/resnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/resnet/resnetv1d50_b32x8_imagenet_20210531-db14775a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/resnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/resnet/resnetv1d101_b32x8_imagenet_20210531-6e13bcd3.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/resnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/resnet/resnetv1d152_b32x8_imagenet_20210531-278cf22a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/resnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/fp16/resnet50_batch256_fp16_imagenet_20210320-b3964210.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/resnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/resnet/resnet50_8xb256-rsb-a1-600e_in1k_20211228-20e21305.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/resnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/resnet/resnet50_8xb256-rsb-a2-300e_in1k_20211228-0fd8be6e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/resnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/resnet/resnet50_8xb256-rsb-a3-100e_in1k_20211228-3493673c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/resnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/resnet/resnetv1c50_8xb32_in1k_20220214-3343eccd.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/resnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/resnet/resnetv1c101_8xb32_in1k_20220214-434fe45f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/resnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/resnet/resnetv1c152_8xb32_in1k_20220214-c013291f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/resnet/metafile.yml | 
https://download.openmmlab.com/mmclassification/v0/resnet/resnet50_3rdparty-mill_in21k_20220331-faac000b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/resnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/resnet/resnet50_8xb8_cub_20220307-57840e60.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/resnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/resnet/resnet50_3rdparty-mill_in21k_20220331-faac000b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/resnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/resnet/resnet50_8xb8_cars_20220812-9d85901a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnet/resnet101_b16x8_cifar10.py | Resnet50_Cifar_for_PyTorch/configs/resnet/resnet101_b16x8_cifar10.py | https://github.com/open-mmlab/mmclassification/pull/508 | 参考代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnet/resnet101_b32x8_imagenet.py | Resnet50_Cifar_for_PyTorch/configs/resnet/resnet101_b32x8_imagenet.py | https://github.com/open-mmlab/mmclassification/pull/508 | 参考代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnet/resnet152_b16x8_cifar10.py | Resnet50_Cifar_for_PyTorch/configs/resnet/resnet152_b16x8_cifar10.py | https://github.com/open-mmlab/mmclassification/pull/508 | 参考代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnet/resnet152_b32x8_imagenet.py | Resnet50_Cifar_for_PyTorch/configs/resnet/resnet152_b32x8_imagenet.py | https://github.com/open-mmlab/mmclassification/pull/508 | 参考代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnet/resnet18_b16x8_cifar10.py | Resnet50_Cifar_for_PyTorch/configs/resnet/resnet18_b16x8_cifar10.py | https://github.com/open-mmlab/mmclassification/pull/508 | 参考代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnet/resnet18_b32x8_imagenet.py | Resnet50_Cifar_for_PyTorch/configs/resnet/resnet18_b32x8_imagenet.py | https://github.com/open-mmlab/mmclassification/pull/508 | 参考代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnet/resnet34_b16x8_cifar10.py | Resnet50_Cifar_for_PyTorch/configs/resnet/resnet34_b16x8_cifar10.py | https://github.com/open-mmlab/mmclassification/pull/508 | 参考代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnet/resnet34_b32x8_imagenet.py | Resnet50_Cifar_for_PyTorch/configs/resnet/resnet34_b32x8_imagenet.py | https://github.com/open-mmlab/mmclassification/pull/508 | 参考代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnet/resnet50_8xb8_cub.py | Resnet50_Cifar_for_PyTorch/configs/resnet/resnet50_8xb8_cub.py | https://download.openmmlab.com/mmclassification/v0/resnet/resnet50_3rdparty-mill_in21k_20220331-faac000b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnet/resnet50_b16x8_cifar10.py | Resnet50_Cifar_for_PyTorch/configs/resnet/resnet50_b16x8_cifar10.py | https://github.com/open-mmlab/mmclassification/pull/508 | 参考代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnet/resnet50_b16x8_cifar10_mixup.py | 
Resnet50_Cifar_for_PyTorch/configs/resnet/resnet50_b16x8_cifar10_mixup.py | https://github.com/open-mmlab/mmclassification/pull/508 | 参考代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnet/resnet50_b16x8_cifar100.py | Resnet50_Cifar_for_PyTorch/configs/resnet/resnet50_b16x8_cifar100.py | https://github.com/open-mmlab/mmclassification/pull/508 | 参考代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnet/resnet50_b32x8_coslr_imagenet.py | Resnet50_Cifar_for_PyTorch/configs/resnet/resnet50_b32x8_coslr_imagenet.py | https://github.com/open-mmlab/mmclassification/pull/508 | 参考代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnet/resnet50_b32x8_cutmix_imagenet.py | Resnet50_Cifar_for_PyTorch/configs/resnet/resnet50_b32x8_cutmix_imagenet.py | https://github.com/open-mmlab/mmclassification/pull/508 | 参考代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnet/resnet50_b32x8_imagenet.py | Resnet50_Cifar_for_PyTorch/configs/resnet/resnet50_b32x8_imagenet.py | https://github.com/open-mmlab/mmclassification/pull/508 | 参考代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnet/resnet50_b32x8_label_smooth_imagenet.py | Resnet50_Cifar_for_PyTorch/configs/resnet/resnet50_b32x8_label_smooth_imagenet.py | https://github.com/open-mmlab/mmclassification/pull/508 | 参考代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnet/resnet50_b32x8_mixup_imagenet.py | Resnet50_Cifar_for_PyTorch/configs/resnet/resnet50_b32x8_mixup_imagenet.py | https://github.com/open-mmlab/mmclassification/pull/508 | 参考代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnet/resnet50_b64x32_warmup_coslr_imagenet.py | Resnet50_Cifar_for_PyTorch/configs/resnet/resnet50_b64x32_warmup_coslr_imagenet.py | https://github.com/open-mmlab/mmclassification/pull/508 | 参考代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnet/resnet50_b64x32_warmup_imagenet.py | Resnet50_Cifar_for_PyTorch/configs/resnet/resnet50_b64x32_warmup_imagenet.py | https://github.com/open-mmlab/mmclassification/pull/508 | 参考代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnet/resnet50_b64x32_warmup_label_smooth_imagenet.py | Resnet50_Cifar_for_PyTorch/configs/resnet/resnet50_b64x32_warmup_label_smooth_imagenet.py | https://github.com/open-mmlab/mmclassification/pull/508 | 参考代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnet/resnetv1d101_b32x8_imagenet.py | Resnet50_Cifar_for_PyTorch/configs/resnet/resnetv1d101_b32x8_imagenet.py | https://github.com/open-mmlab/mmclassification/pull/508 | 参考代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnet/resnetv1d152_b32x8_imagenet.py | Resnet50_Cifar_for_PyTorch/configs/resnet/resnetv1d152_b32x8_imagenet.py | https://github.com/open-mmlab/mmclassification/pull/508 | 参考代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnet/resnetv1d50_b32x8_imagenet.py | Resnet50_Cifar_for_PyTorch/configs/resnet/resnetv1d50_b32x8_imagenet.py | https://github.com/open-mmlab/mmclassification/pull/508 | 参考代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnext/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/resnext/metafile.yml | https://openaccess.thecvf.com/content_cvpr_2017/html/Xie_Aggregated_Residual_Transformations_CVPR_2017_paper.html | 论文地址 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnext/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/resnext/metafile.yml | https://github.com/open-mmlab/mmclassification/blob/v0.15.0/mmcls/models/backbones/resnext.py#L90 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnext/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/resnext/metafile.yml | https://download.openmmlab.com/mmclassification/v0/resnext/resnext50_32x4d_b32x8_imagenet_20210429-56066e27.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnext/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/resnext/metafile.yml | https://download.openmmlab.com/mmclassification/v0/resnext/resnext101_32x4d_b32x8_imagenet_20210506-e0fa3dd5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnext/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/resnext/metafile.yml | https://download.openmmlab.com/mmclassification/v0/resnext/resnext101_32x8d_b32x8_imagenet_20210506-23a247d5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnext/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/resnext/metafile.yml | https://download.openmmlab.com/mmclassification/v0/resnext/resnext152_32x4d_b32x8_imagenet_20210524-927787be.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnext/resnext50_32x4d_b32x8_imagenet.py | Resnet50_Cifar_for_PyTorch/configs/resnext/resnext50_32x4d_b32x8_imagenet.py | https://github.com/open-mmlab/mmclassification/pull/508 | 参考代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnext/resnext101_32x4d_b32x8_imagenet.py | Resnet50_Cifar_for_PyTorch/configs/resnext/resnext101_32x4d_b32x8_imagenet.py | https://github.com/open-mmlab/mmclassification/pull/508 | 参考代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnext/resnext101_32x8d_b32x8_imagenet.py | Resnet50_Cifar_for_PyTorch/configs/resnext/resnext101_32x8d_b32x8_imagenet.py | https://github.com/open-mmlab/mmclassification/pull/508 | 参考代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/resnext/resnext152_32x4d_b32x8_imagenet.py | Resnet50_Cifar_for_PyTorch/configs/resnext/resnext152_32x4d_b32x8_imagenet.py | https://github.com/open-mmlab/mmclassification/pull/508 | 参考代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/seresnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/seresnet/metafile.yml | https://openaccess.thecvf.com/content_cvpr_2018/html/Hu_Squeeze-and-Excitation_Networks_CVPR_2018_paper.html | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/seresnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/seresnet/metafile.yml | https://github.com/open-mmlab/mmclassification/blob/v0.15.0/mmcls/models/backbones/seresnet.py#L58 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/seresnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/seresnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/se-resnet/se-resnet50_batch256_imagenet_20200804-ae206104.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/seresnet/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/seresnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/se-resnet/se-resnet101_batch256_imagenet_20200804-ba5b51d4.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmpretrain/blob/main/configs/seresnet/seresnet50_b32x8_imagenet.py | Resnet50_Cifar_for_PyTorch/configs/seresnet/seresnet50_b32x8_imagenet.py | https://github.com/open-mmlab/mmclassification/pull/508 | 参考代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/seresnet/seresnet101_b32x8_imagenet.py | Resnet50_Cifar_for_PyTorch/configs/seresnet/seresnet101_b32x8_imagenet.py | https://github.com/open-mmlab/mmclassification/pull/508 | 参考代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/seresnet/seresnext50_32x4d_b32x8_imagenet.py | Resnet50_Cifar_for_PyTorch/configs/seresnet/seresnext50_32x4d_b32x8_imagenet.py | https://github.com/open-mmlab/mmclassification/pull/508 | 参考代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/seresnet/seresnext101_32x4d_b32x8_imagenet.py | Resnet50_Cifar_for_PyTorch/configs/seresnet/seresnext101_32x4d_b32x8_imagenet.py | https://github.com/open-mmlab/mmclassification/pull/508 | 参考代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/shufflenet_v1/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/shufflenet_v1/metafile.yml | https://openaccess.thecvf.com/content_cvpr_2018/html/Zhang_ShuffleNet_An_Extremely_CVPR_2018_paper.html | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/shufflenet_v1/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/shufflenet_v1/metafile.yml | https://github.com/open-mmlab/mmclassification/blob/v0.15.0/mmcls/models/backbones/shufflenet_v1.py#L152 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/shufflenet_v1/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/shufflenet_v1/metafile.yml | https://download.openmmlab.com/mmclassification/v0/shufflenet_v1/shufflenet_v1_batch1024_imagenet_20200804-5d6cec73.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/shufflenet_v1/shufflenet_v1_1x_b64x16_linearlr_bn_nowd_imagenet.py | Resnet50_Cifar_for_PyTorch/configs/shufflenet_v1/shufflenet_v1_1x_b64x16_linearlr_bn_nowd_imagenet.py | https://github.com/open-mmlab/mmclassification/pull/508 | 参考代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/shufflenet_v2/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/shufflenet_v2/metafile.yml | https://openaccess.thecvf.com/content_ECCV_2018/papers/Ningning_Light-weight_CNN_Architecture_ECCV_2018_paper.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/shufflenet_v2/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/shufflenet_v2/metafile.yml | https://github.com/open-mmlab/mmclassification/blob/v0.15.0/mmcls/models/backbones/shufflenet_v2.py#L134 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/shufflenet_v2/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/shufflenet_v2/metafile.yml | https://download.openmmlab.com/mmclassification/v0/shufflenet_v2/shufflenet_v2_batch1024_imagenet_20200812-5bf4721e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/shufflenet_v2/shufflenet_v2_1x_b64x16_linearlr_bn_nowd_imagenet.py | Resnet50_Cifar_for_PyTorch/configs/shufflenet_v2/shufflenet_v2_1x_b64x16_linearlr_bn_nowd_imagenet.py | https://github.com/open-mmlab/mmclassification/pull/508 | 参考代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/swin_transformer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/swin_transformer/metafile.yml | 
https://arxiv.org/pdf/2103.14030.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/swin_transformer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/swin_transformer/metafile.yml | https://github.com/open-mmlab/mmclassification/blob/v0.15.0/mmcls/models/backbones/swin_transformer.py#L176 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/swin_transformer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/swin_transformer/metafile.yml | https://download.openmmlab.com/mmclassification/v0/swin-transformer/swin_tiny_224_b16x64_300e_imagenet_20210616_090925-66df6be6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/swin_transformer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/swin_transformer/metafile.yml | https://download.openmmlab.com/mmclassification/v0/swin-transformer/swin_small_224_b16x64_300e_imagenet_20210615_110219-7f9d988b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/swin_transformer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/swin_transformer/metafile.yml | https://download.openmmlab.com/mmclassification/v0/swin-transformer/swin_base_224_b16x64_300e_imagenet_20210616_190742-93230b0d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/swin_transformer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/swin_transformer/metafile.yml | https://download.openmmlab.com/mmclassification/v0/swin-transformer/convert/swin_tiny_patch4_window7_224-160bb0a5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/swin_transformer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/swin_transformer/metafile.yml | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_tiny_patch4_window7_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/swin_transformer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/swin_transformer/metafile.yml | https://github.com/microsoft/Swin-Transformer/blob/777f6c66604bb5579086c4447efe3620344d95a9/models/swin_transformer.py#L458 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/swin_transformer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/swin_transformer/metafile.yml | https://download.openmmlab.com/mmclassification/v0/swin-transformer/convert/swin_small_patch4_window7_224-cc7a01c9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/swin_transformer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/swin_transformer/metafile.yml | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_small_patch4_window7_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/swin_transformer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/swin_transformer/metafile.yml | https://github.com/microsoft/Swin-Transformer/blob/777f6c66604bb5579086c4447efe3620344d95a9/models/swin_transformer.py#L458 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/swin_transformer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/swin_transformer/metafile.yml | https://download.openmmlab.com/mmclassification/v0/swin-transformer/convert/swin_base_patch4_window7_224-4670dd19.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/swin_transformer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/swin_transformer/metafile.yml | 
https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_base_patch4_window7_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/swin_transformer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/swin_transformer/metafile.yml | https://github.com/microsoft/Swin-Transformer/blob/777f6c66604bb5579086c4447efe3620344d95a9/models/swin_transformer.py#L458 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/swin_transformer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/swin_transformer/metafile.yml | https://download.openmmlab.com/mmclassification/v0/swin-transformer/convert/swin_base_patch4_window12_384-02c598a4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/swin_transformer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/swin_transformer/metafile.yml | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_base_patch4_window12_384.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/swin_transformer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/swin_transformer/metafile.yml | https://github.com/microsoft/Swin-Transformer/blob/777f6c66604bb5579086c4447efe3620344d95a9/models/swin_transformer.py#L458 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/swin_transformer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/swin_transformer/metafile.yml | https://download.openmmlab.com/mmclassification/v0/swin-transformer/convert/swin_base_patch4_window7_224_22kto1k-f967f799.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/swin_transformer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/swin_transformer/metafile.yml | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_base_patch4_window7_224_22kto1k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/swin_transformer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/swin_transformer/metafile.yml | https://github.com/microsoft/Swin-Transformer/blob/777f6c66604bb5579086c4447efe3620344d95a9/models/swin_transformer.py#L458 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/swin_transformer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/swin_transformer/metafile.yml | https://download.openmmlab.com/mmclassification/v0/swin-transformer/convert/swin_base_patch4_window12_384_22kto1k-d59b0d1d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/swin_transformer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/swin_transformer/metafile.yml | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_base_patch4_window12_384_22kto1k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/swin_transformer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/swin_transformer/metafile.yml | https://github.com/microsoft/Swin-Transformer/blob/777f6c66604bb5579086c4447efe3620344d95a9/models/swin_transformer.py#L458 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/swin_transformer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/swin_transformer/metafile.yml | https://download.openmmlab.com/mmclassification/v0/swin-transformer/convert/swin_large_patch4_window7_224_22kto1k-5f0996db.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/swin_transformer/metafile.yml | 
Resnet50_Cifar_for_PyTorch/configs/swin_transformer/metafile.yml | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_large_patch4_window7_224_22kto1k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/swin_transformer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/swin_transformer/metafile.yml | https://github.com/microsoft/Swin-Transformer/blob/777f6c66604bb5579086c4447efe3620344d95a9/models/swin_transformer.py#L458 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/swin_transformer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/swin_transformer/metafile.yml | https://download.openmmlab.com/mmclassification/v0/swin-transformer/convert/swin_large_patch4_window12_384_22kto1k-0a40944b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/swin_transformer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/swin_transformer/metafile.yml | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_large_patch4_window12_384_22kto1k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/swin_transformer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/swin_transformer/metafile.yml | https://github.com/microsoft/Swin-Transformer/blob/777f6c66604bb5579086c4447efe3620344d95a9/models/swin_transformer.py#L458 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/swin_transformer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/swin_transformer/metafile.yml | https://download.openmmlab.com/mmclassification/v0/swin-transformer/convert/swin-large_3rdparty_in21k-384px.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/swin_transformer/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/swin_transformer/metafile.yml | https://download.openmmlab.com/mmclassification/v0/swin-transformer/swin-large_8xb8_cub_384px_20220307-1bbaee6a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/swin_transformer/swin_base_224_b16x64_300e_imagenet.py | Resnet50_Cifar_for_PyTorch/configs/swin_transformer/swin_base_224_b16x64_300e_imagenet.py | https://github.com/open-mmlab/mmclassification/pull/508 | 参考代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/swin_transformer/swin_base_384_evalonly_imagenet.py | Resnet50_Cifar_for_PyTorch/configs/swin_transformer/swin_base_384_evalonly_imagenet.py | https://github.com/open-mmlab/mmclassification/pull/508 | 参考代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/swin_transformer/swin_large_224_evalonly_imagenet.py | Resnet50_Cifar_for_PyTorch/configs/swin_transformer/swin_large_224_evalonly_imagenet.py | https://github.com/open-mmlab/mmclassification/pull/508 | 参考代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/swin_transformer/swin_large_384_evalonly_imagenet.py | Resnet50_Cifar_for_PyTorch/configs/swin_transformer/swin_large_384_evalonly_imagenet.py | https://github.com/open-mmlab/mmclassification/pull/508 | 参考代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/swin_transformer/swin_small_224_b16x64_300e_imagenet.py | Resnet50_Cifar_for_PyTorch/configs/swin_transformer/swin_small_224_b16x64_300e_imagenet.py | https://github.com/open-mmlab/mmclassification/pull/508 | 参考代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/swin_transformer/swin_tiny_224_b16x64_300e_imagenet.py | 
Resnet50_Cifar_for_PyTorch/configs/swin_transformer/swin_tiny_224_b16x64_300e_imagenet.py | https://github.com/open-mmlab/mmclassification/pull/508 | 参考代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/swin_transformer/swin-large_8xb8_cub_384px.py | Resnet50_Cifar_for_PyTorch/configs/swin_transformer/swin-large_8xb8_cub_384px.py | https://download.openmmlab.com/mmclassification/v0/swin-transformer/convert/swin-large_3rdparty_in21k-384px.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/t2t_vit/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/t2t_vit/metafile.yml | https://arxiv.org/abs/2101.11986 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/t2t_vit/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/t2t_vit/metafile.yml | https://github.com/open-mmlab/mmclassification/blob/v0.17.0/mmcls/models/backbones/t2t_vit.py | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/t2t_vit/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/t2t_vit/metafile.yml | https://download.openmmlab.com/mmclassification/v0/t2t-vit/t2t-vit-t-14_8xb64_in1k_20211220-f7378dd5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/t2t_vit/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/t2t_vit/metafile.yml | https://download.openmmlab.com/mmclassification/v0/t2t-vit/t2t-vit-t-19_8xb64_in1k_20211214-7f5e3aaf.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/t2t_vit/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/t2t_vit/metafile.yml | https://download.openmmlab.com/mmclassification/v0/t2t-vit/t2t-vit-t-24_8xb64_in1k_20211214-b2a68ae3.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/tnt/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/tnt/metafile.yml | https://arxiv.org/abs/2103.00112 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/tnt/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/tnt/metafile.yml | https://github.com/open-mmlab/mmclassification/blob/v0.15.0/mmcls/models/backbones/tnt.py#L203 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/tnt/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/tnt/metafile.yml | https://download.openmmlab.com/mmclassification/v0/tnt/tnt-small-p16_3rdparty_in1k_20210903-c56ee7df.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/tnt/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/tnt/metafile.yml | https://github.com/contrastive/pytorch-image-models/releases/download/TNT/tnt_s_patch16_224.pth.tar | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/tnt/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/tnt/metafile.yml | https://github.com/contrastive/pytorch-image-models/blob/809271b0f3e5d9be4e11c0c5cec1dbba8b5e2c60/timm/models/tnt.py#L144 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/tnt/tnt_s_patch16_224_evalonly_imagenet.py | Resnet50_Cifar_for_PyTorch/configs/tnt/tnt_s_patch16_224_evalonly_imagenet.py | https://github.com/open-mmlab/mmclassification/pull/508 | 参考代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/twins/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/twins/metafile.yml | http://arxiv-export-lb.library.cornell.edu/abs/2104.13840 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/twins/metafile.yml | 
Resnet50_Cifar_for_PyTorch/configs/twins/metafile.yml | https://github.com/open-mmlab/mmclassification/blob/v0.20.1/mmcls/models/backbones/twins.py | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/twins/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/twins/metafile.yml | https://download.openmmlab.com/mmclassification/v0/twins/twins-pcpvt-small_3rdparty_8xb128_in1k_20220126-ef23c132.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/twins/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/twins/metafile.yml | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vt3p-weights/twins_pcpvt_small-e70e7e7a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/twins/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/twins/metafile.yml | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/twins.py | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/twins/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/twins/metafile.yml | https://download.openmmlab.com/mmclassification/v0/twins/twins-pcpvt-base_3rdparty_8xb128_in1k_20220126-f8c4b0d5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/twins/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/twins/metafile.yml | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vt3p-weights/twins_pcpvt_small-e70e7e7a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/twins/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/twins/metafile.yml | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/twins.py | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/twins/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/twins/metafile.yml | https://download.openmmlab.com/mmclassification/v0/twins/twins-pcpvt-large_3rdparty_16xb64_in1k_20220126-c1ef8d80.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/twins/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/twins/metafile.yml | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vt3p-weights/twins_pcpvt_small-e70e7e7a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/twins/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/twins/metafile.yml | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/twins.py | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/twins/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/twins/metafile.yml | https://download.openmmlab.com/mmclassification/v0/twins/twins-svt-small_3rdparty_8xb128_in1k_20220126-8fe5205b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/twins/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/twins/metafile.yml | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vt3p-weights/twins_pcpvt_small-e70e7e7a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/twins/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/twins/metafile.yml | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/twins.py | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/twins/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/twins/metafile.yml | 
https://download.openmmlab.com/mmclassification/v0/twins/twins-svt-base_3rdparty_8xb128_in1k_20220126-e31cc8e9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/twins/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/twins/metafile.yml | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vt3p-weights/twins_pcpvt_small-e70e7e7a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/twins/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/twins/metafile.yml | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/twins.py | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/twins/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/twins/metafile.yml | https://download.openmmlab.com/mmclassification/v0/twins/twins-svt-large_3rdparty_16xb64_in1k_20220126-4817645f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/twins/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/twins/metafile.yml | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vt3p-weights/twins_pcpvt_small-e70e7e7a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/twins/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/twins/metafile.yml | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/twins.py | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/van/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/van/metafile.yml | https://arxiv.org/pdf/2202.09741v2.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/van/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/van/metafile.yml | https://github.com/open-mmlab/mmclassification/blob/v0.23.0/mmcls/models/backbones/van.py | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/van/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/van/metafile.yml | https://download.openmmlab.com/mmclassification/v0/van/van-tiny_8xb128_in1k_20220501-385941af.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/van/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/van/metafile.yml | https://download.openmmlab.com/mmclassification/v0/van/van-small_8xb128_in1k_20220501-17bc91aa.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/van/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/van/metafile.yml | https://download.openmmlab.com/mmclassification/v0/van/van-base_8xb128_in1k_20220501-6a4cc31b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/van/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/van/metafile.yml | https://download.openmmlab.com/mmclassification/v0/van/van-large_8xb128_in1k_20220501-f212ba21.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/vgg/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/vgg/metafile.yml | https://arxiv.org/abs/1409.1556 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/vgg/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/vgg/metafile.yml | https://github.com/open-mmlab/mmclassification/blob/v0.15.0/mmcls/models/backbones/vgg.py#L39 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/vgg/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/vgg/metafile.yml | 
https://download.openmmlab.com/mmclassification/v0/vgg/vgg11_batch256_imagenet_20210208-4271cd6c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/vgg/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/vgg/metafile.yml | https://download.openmmlab.com/mmclassification/v0/vgg/vgg13_batch256_imagenet_20210208-4d1d6080.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/vgg/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/vgg/metafile.yml | https://download.openmmlab.com/mmclassification/v0/vgg/vgg16_batch256_imagenet_20210208-db26f1a5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/vgg/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/vgg/metafile.yml | https://download.openmmlab.com/mmclassification/v0/vgg/vgg19_batch256_imagenet_20210208-e6920e4a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/vgg/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/vgg/metafile.yml | https://download.openmmlab.com/mmclassification/v0/vgg/vgg11_bn_batch256_imagenet_20210207-f244902c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/vgg/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/vgg/metafile.yml | https://download.openmmlab.com/mmclassification/v0/vgg/vgg13_bn_batch256_imagenet_20210207-1a8b7864.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/vgg/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/vgg/metafile.yml | https://download.openmmlab.com/mmclassification/v0/vgg/vgg16_bn_batch256_imagenet_20210208-7e55cd29.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/vgg/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/vgg/metafile.yml | https://download.openmmlab.com/mmclassification/v0/vgg/vgg19_bn_batch256_imagenet_20210208-da620c4f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/vgg/vgg11_b32x8_imagenet.py | Resnet50_Cifar_for_PyTorch/configs/vgg/vgg11_b32x8_imagenet.py | https://github.com/open-mmlab/mmclassification/pull/508 | 参考代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/vgg/vgg11bn_b32x8_imagenet.py | Resnet50_Cifar_for_PyTorch/configs/vgg/vgg11bn_b32x8_imagenet.py | https://github.com/open-mmlab/mmclassification/pull/508 | 参考代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/vgg/vgg13_b32x8_imagenet.py | Resnet50_Cifar_for_PyTorch/configs/vgg/vgg13_b32x8_imagenet.py | https://github.com/open-mmlab/mmclassification/pull/508 | 参考代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/vgg/vgg13bn_b32x8_imagenet.py | Resnet50_Cifar_for_PyTorch/configs/vgg/vgg13bn_b32x8_imagenet.py | https://github.com/open-mmlab/mmclassification/pull/508 | 参考代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/vgg/vgg16_8xb16_voc.py | Resnet50_Cifar_for_PyTorch/configs/vgg/vgg16_8xb16_voc.py | https://download.openmmlab.com/mmclassification/v0/vgg/vgg16_batch256_imagenet_20210208-db26f1a5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/vgg/vgg16_b16x8_voc.py | Resnet50_Cifar_for_PyTorch/configs/vgg/vgg16_b16x8_voc.py | https://github.com/open-mmlab/mmclassification/pull/508 | 参考代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/vgg/vgg16_b32x8_imagenet.py | Resnet50_Cifar_for_PyTorch/configs/vgg/vgg16_b32x8_imagenet.py | 
https://github.com/open-mmlab/mmclassification/pull/508 | 参考代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/vgg/vgg16bn_b32x8_imagenet.py | Resnet50_Cifar_for_PyTorch/configs/vgg/vgg16bn_b32x8_imagenet.py | https://github.com/open-mmlab/mmclassification/pull/508 | 参考代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/vgg/vgg19_b32x8_imagenet.py | Resnet50_Cifar_for_PyTorch/configs/vgg/vgg19_b32x8_imagenet.py | https://github.com/open-mmlab/mmclassification/pull/508 | 参考代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/vgg/vgg19bn_b32x8_imagenet.py | Resnet50_Cifar_for_PyTorch/configs/vgg/vgg19bn_b32x8_imagenet.py | https://github.com/open-mmlab/mmclassification/pull/508 | 参考代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/vision_transformer/vit-base-p16_ft-4xb544-ipu_in1k.py | Resnet50_Cifar_for_PyTorch/configs/vision_transformer/vit-base-p16_ft-4xb544-ipu_in1k.py | https://download.openmmlab.com/mmclassification/v0/vit/pretrain/vit-base-p16_3rdparty_pt-64xb64_in1k-224_20210928-02284250.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/wrn/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/wrn/metafile.yml | https://arxiv.org/abs/1605.07146 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/wrn/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/wrn/metafile.yml | https://github.com/open-mmlab/mmclassification/blob/v0.20.1/mmcls/models/backbones/resnet.py#L383 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/wrn/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/wrn/metafile.yml | https://download.openmmlab.com/mmclassification/v0/wrn/wide-resnet50_3rdparty_8xb32_in1k_20220304-66678344.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/wrn/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/wrn/metafile.yml | https://download.pytorch.org/models/wide_resnet50_2-95faca4d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/wrn/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/wrn/metafile.yml | https://github.com/pytorch/vision/blob/main/torchvision/models/resnet.py | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/wrn/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/wrn/metafile.yml | https://download.openmmlab.com/mmclassification/v0/wrn/wide-resnet101_3rdparty_8xb32_in1k_20220304-8d5f9d61.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/wrn/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/wrn/metafile.yml | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/wrn/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/wrn/metafile.yml | https://github.com/pytorch/vision/blob/main/torchvision/models/resnet.py | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/wrn/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/wrn/metafile.yml | https://download.openmmlab.com/mmclassification/v0/wrn/wide-resnet50_3rdparty-timm_8xb32_in1k_20220304-83ae4399.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/configs/wrn/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/wrn/metafile.yml | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/wide_resnet50_racm-8234f177.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmpretrain/blob/main/configs/wrn/metafile.yml | Resnet50_Cifar_for_PyTorch/configs/wrn/metafile.yml | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/resnet.py | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/docker/Dockerfile | Resnet50_Cifar_for_PyTorch/docker/Dockerfile | https://github.com/open-mmlab/mmclassification.git | 开源代码仓说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/docker/serve/Dockerfile | Resnet50_Cifar_for_PyTorch/docker/serve/Dockerfile | https://download.openmmlab.com/mmcv/dist/cu${CUDA//./}/torch${PYTORCH}/index.html | 指定下载链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/mmcls/datasets/cifar.py | Resnet50_Cifar_for_PyTorch/mmcls/datasets/cifar.py | https://www.cs.toronto.edu/~kriz/cifar-10-python.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/mmcls/datasets/cifar.py | Resnet50_Cifar_for_PyTorch/mmcls/datasets/cifar.py | https://www.cs.toronto.edu/~kriz/cifar-100-python.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/mmcls/datasets/mnist.py | Resnet50_Cifar_for_PyTorch/mmcls/datasets/mnist.py | http://yann.lecun.com/exdb/mnist/ | 下载数据集 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/mmcls/datasets/mnist.py | Resnet50_Cifar_for_PyTorch/mmcls/datasets/mnist.py | http://fashion-mnist.s3-website.eu-central-1.amazonaws.com/ | 下载数据集 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/setup.py | Resnet50_Cifar_for_PyTorch/setup.py | https://github.com/open-mmlab/mmclassification | 开源代码仓说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/setup.py | Resnet50_Cifar_for_PyTorch/setup.py | openmmlab@gmail.com | 作者邮箱 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/tools/deployment/onnx2tensorrt.py | Resnet50_Cifar_for_PyTorch/tools/deployment/onnx2tensorrt.py | https://github.com/open-mmlab/mmdeploy | 开源代码仓说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/tools/deployment/pytorch2onnx.py | Resnet50_Cifar_for_PyTorch/tools/deployment/pytorch2onnx.py | https://github.com/open-mmlab/mmdeploy | 开源代码仓说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/tools/deployment/test.py | Resnet50_Cifar_for_PyTorch/tools/deployment/test.py | https://github.com/open-mmlab/mmdeploy | 开源代码仓说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/mmpretrain/engine/optimizers/lamb.py | Resnet50_Cifar_for_PyTorch/mmcls/core/optimizers/lamb.py | https://arxiv.org/abs/1904.00962 | 论文地址 | -| 开发引入 | / | Resnet50_Cifar_for_PyTorch/mmcls/models/backbones/seresnet.py | https://arxiv.org/abs/1709.01507 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/mmpretrain/models/backbones/resnet.py | Resnet50_Cifar_for_PyTorch/mmcls/models/backbones/resnet.py | https://arxiv.org/pdf/1812.01187.pdf | 论文地址 | -| 开发引入 | / | Resnet50_Cifar_for_PyTorch/mmcls/core/visualization/image.py | https://github.com/matplotlib/matplotlib/issues/15363 | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/mmpretrain/datasets/transforms/processing.py | Resnet50_Cifar_for_PyTorch/mmcls/datasets/pipelines/transforms.py | https://arxiv.org/pdf/1708.04896.pdf | 论文地址 | -| 开发引入 | / | Resnet50_Cifar_for_PyTorch/mmcls/datasets/cub.py | http://www.vision.caltech.edu/visipedia/CUB-200-2011.html | 数据集地址 | -| 开发引入 | / | Resnet50_Cifar_for_PyTorch/mmcls/apis/train.py | https://github.com/open-mmlab/mmdetection/issues/6339 | 相关说明 
| -| 开发引入 | / | Resnet50_Cifar_for_PyTorch/mmcls/datasets/pipelines/transforms.py | https://github.com/tensorflow/tpu/blob/master/models/official/efficientnet/preprocessing.py#L118 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/mmpretrain/configs/mobilenet_v3/mobilenet_v3_small_8xb128_in1k.py | Resnet50_Cifar_for_PyTorch/configs/mobilenet_v3/mobilenet-v3-large_8xb32_in1k.py | https://pytorch.org/blog/ml-models-torchvision-v0.9/#classification | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/mmpretrain/engine/hooks/precise_bn_hook.py | Resnet50_Cifar_for_PyTorch/mmcls/core/hook/precise_bn_hook.py | https://arxiv.org/abs/2105.07576 | 论文地址 | -| 开发引入 | / | Resnet50_Cifar_for_PyTorch/mmcv_need/setup.py | https://github.com/open-mmlab/mmcv/pull/1463 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/tests/test_models/test_backbones/utils.py | Resnet50_Cifar_for_PyTorch/tests/test_models/test_backbones/utils.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/vision_transformer.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/mmpretrain/models/utils/position_encoding.py | Resnet50_Cifar_for_PyTorch/mmcls/models/utils/position_encoding.py | https://arxiv.org/abs/2102.10882 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/mmpretrain/models/backbones/densenet.py | Resnet50_Cifar_for_PyTorch/mmcls/models/backbones/densenet.py | https://arxiv.org/pdf/1608.06993.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/mmpretrain/models/backbones/van.py | Resnet50_Cifar_for_PyTorch/mmcls/models/backbones/van.py | https://github.com/Visual-Attention-Network/VAN-Classification | 源码实现 | -| 开发引入 | / | Resnet50_Cifar_for_PyTorch/mmcls/models/losses/asymmetric_loss.py | https://arxiv.org/abs/2009.14119 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/mmpretrain/models/backbones/xcit.py | Resnet50_Cifar_for_PyTorch/mmcls/models/backbones/t2t_vit.py | https://arxiv.org/abs/1706.03762 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/mmpretrain/engine/optimizers/lamb.py | Resnet50_Cifar_for_PyTorch/mmcls/core/optimizers/lamb.py | https://github.com/cybertronai/pytorch-lamb | 源码实现 | -| 开发引入 | / | Resnet50_Cifar_for_PyTorch/mmcls/datasets/pipelines/transforms.py | https://github.com/kakaobrain/fast-autoaugment/blob/master/FastAutoAugment/data.py | 源码实现 | -| 开发引入 | / | Resnet50_Cifar_for_PyTorch/mmcls/utils/setup_env.py | https://github.com/pytorch/pytorch/blob/master/torch/distributed/run.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/mmpretrain/datasets/mnist.py | Resnet50_Cifar_for_PyTorch/mmcls/datasets/mnist.py | https://github.com/zalandoresearch/fashion-mnist | 源码实现 | -| 开发引入 | / | Resnet50_Cifar_for_PyTorch/mmcls/datasets/dataset_wrappers.py | https://arxiv.org/pdf/1908.03195.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/mmpretrain/models/backbones/hrnet.py | Resnet50_Cifar_for_PyTorch/mmcls/models/backbones/hrnet.py | https://arxiv.org/abs/1904.04514 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/mmpretrain/models/backbones/mlp_mixer.py | Resnet50_Cifar_for_PyTorch/mmcls/models/backbones/mlp_mixer.py | https://arxiv.org/pdf/2105.01601.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/mmpretrain/datasets/cifar.py | Resnet50_Cifar_for_PyTorch/mmcls/datasets/cifar.py | 
https://github.com/pytorch/vision/blob/master/torchvision/datasets/cifar.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/docs/zh_CN/locales/zh_CN/LC_MESSAGES/api.po | Resnet50_Cifar_for_PyTorch/mmcls/models/backbones/convnext.py | https://arxiv.org/pdf/2201.03545.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/mmpretrain/models/backbones/densenet.py | Resnet50_Cifar_for_PyTorch/mmcls/models/backbones/densenet.py | https://arxiv.org/pdf/1707.06990.pdf | 论文地址 | -| 开发引入 | / | Resnet50_Cifar_for_PyTorch/mmcls/core/hook/wandblogger_hook.py | https://docs.wandb.ai/ref/python/init | 相关说明 | -| 开发引入 | / | Resnet50_Cifar_for_PyTorch/mmcls/models/backbones/seresnext.py | https://arxiv.org/abs/1709.01507 | 论文地址 | -| 开发引入 | / | Resnet50_Cifar_for_PyTorch/mmcls/models/backbones/resnext.py | https://arxiv.org/abs/1611.05431 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/docs/zh_CN/user_guides/dataset_prepare.md | Resnet50_Cifar_for_PyTorch/mmcls/datasets/voc.py | http://host.robots.ox.ac.uk/pascal/VOC/ | 数据集地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/mmpretrain/configs/mobilenet_v3/mobilenet_v3_small_8xb128_in1k.py | Resnet50_Cifar_for_PyTorch/configs/mobilenet_v3/mobilenet-v3-small_8xb32_in1k.py | https://pytorch.org/blog/ml-models-torchvision-v0.9/#classification | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/setup.py | Resnet50_Cifar_for_PyTorch/setup.py | http://setuptools.readthedocs.io/en/latest/setuptools.html#declaring-platform-specific-dependencies | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/mmpretrain/datasets/transforms/processing.py | Resnet50_Cifar_for_PyTorch/mmcls/datasets/pipelines/transforms.py | https://albumentations.readthedocs.io | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/mmpretrain/models/backbones/convmixer.py | Resnet50_Cifar_for_PyTorch/mmcls/models/backbones/convmixer.py | https://arxiv.org/pdf/2201.09792.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/docs/zh_CN/user_guides/dataset_prepare.md | Resnet50_Cifar_for_PyTorch/mmcls/datasets/cifar.py | https://www.cs.toronto.edu/~kriz/cifar.html | 数据集地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/mmpretrain/engine/optimizers/lamb.py | Resnet50_Cifar_for_PyTorch/mmcls/core/optimizers/lamb.py | https://github.com/HabanaAI/Model-References/blob/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/mmpretrain/models/utils/helpers.py | Resnet50_Cifar_for_PyTorch/mmcls/models/utils/helpers.py | https://github.com/pytorch/pytorch/issues/42448 | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/mmpretrain/datasets/transforms/auto_augment.py | Resnet50_Cifar_for_PyTorch/mmcls/datasets/pipelines/auto_augment.py | https://arxiv.org/abs/1805.09501 | 论文地址 | -| 开发引入 | / | Resnet50_Cifar_for_PyTorch/mmcls/core/hook/wandblogger_hook.py | https://docs.wandb.ai/guides/artifacts/model-versioning | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/setup.py | Resnet50_Cifar_for_PyTorch/mmcv_need/setup.py | http://setuptools.readthedocs.io/en/latest/setuptools.html#declaring-platform-specific-dependencies | 相关说明 | -| 开发引入 | / | Resnet50_Cifar_for_PyTorch/mmcls/core/hook/precise_bn_hook.py | https://github.com/facebookresearch/pycls/blob/f8cd962737e33ce9e19b3083a33551da95c2d9c0/pycls/core/net.py | 源码实现 | -| 开发引入 | / | 
Resnet50_Cifar_for_PyTorch/mmcls/models/backbones/lenet.py | https://en.wikipedia.org/wiki/LeNet | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/mmpretrain/engine/optimizers/lamb.py | Resnet50_Cifar_for_PyTorch/mmcls/core/optimizers/lamb.py | https://github.com/pytorch/pytorch/issues/9190 | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/mmpretrain/apis/model.py | Resnet50_Cifar_for_PyTorch/mmcls/apis/inference.py | https://github.com/open-mmlab/mmdetection/pull/6405 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/mmpretrain/models/backbones/twins.py | Resnet50_Cifar_for_PyTorch/mmcls/models/backbones/twins.py | https://arxiv.org/abs/1512.03385 | 论文地址 | -| 开发引入 | / | Resnet50_Cifar_for_PyTorch/mmcls/core/hook/wandblogger_hook.py | https://docs.wandb.ai/guides/data-vis | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/mmpretrain/models/backbones/densenet.py | Resnet50_Cifar_for_PyTorch/mmcls/models/backbones/densenet.py | https://github.com/liuzhuang13/DenseNet | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/mmpretrain/models/backbones/convnext.py | Resnet50_Cifar_for_PyTorch/mmcls/models/backbones/convnext.py | https://github.com/facebookresearch/ConvNeXt/blob/main/models/convnext.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/mmpretrain/models/backbones/twins.py | Resnet50_Cifar_for_PyTorch/mmcls/models/backbones/resnet.py | https://arxiv.org/abs/1512.03385 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/mmpretrain/datasets/transforms/auto_augment.py | Resnet50_Cifar_for_PyTorch/configs/_base_/datasets/pipelines/auto_aug.py | https://github.com/DeepVoltaire/AutoAugment/blame/master/autoaugment.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/mmpretrain/models/backbones/tnt.py | Resnet50_Cifar_for_PyTorch/mmcls/models/backbones/tnt.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/tnt.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/mmpretrain/engine/optimizers/lamb.py | Resnet50_Cifar_for_PyTorch/mmcls/core/optimizers/lamb.py | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/mmpretrain/models/backbones/convmixer.py | Resnet50_Cifar_for_PyTorch/mmcls/models/backbones/convmixer.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/convmixer.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/mmpretrain/models/backbones/poolformer.py | Resnet50_Cifar_for_PyTorch/mmcls/models/backbones/poolformer.py | https://github.com/sail-sg/poolformer/blob/main/models/poolformer.py | 源码实现 | -| 开发引入 | / | Resnet50_Cifar_for_PyTorch/mmcls/datasets/imagenet.py | http://www.image-net.org | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/mmpretrain/models/backbones/timm_backbone.py | Resnet50_Cifar_for_PyTorch/mmcls/models/backbones/timm_backbone.py | https://github.com/rwightman/pytorch-image-models/issues/488 | 相关说明 | -| 开发引入 | / | Resnet50_Cifar_for_PyTorch/mmcls/datasets/builder.py | https://github.com/pytorch/pytorch/issues/973 | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/mmpretrain/models/losses/label_smooth_loss.py | Resnet50_Cifar_for_PyTorch/mmcls/models/losses/label_smooth_loss.py | https://arxiv.org/abs/1512.00567 | 论文地址 | -| 开发引入 | / | 
Resnet50_Cifar_for_PyTorch/mmcls/models/backbones/alexnet.py | https://en.wikipedia.org/wiki/AlexNet | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/mmpretrain/datasets/transforms/auto_augment.py | Resnet50_Cifar_for_PyTorch/mmcls/datasets/pipelines/auto_augment.py | https://arxiv.org/abs/1909.13719 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/mmpretrain/models/utils/batch_augments/mixup.py | Resnet50_Cifar_for_PyTorch/mmcls/models/utils/augment/mixup.py | https://arxiv.org/abs/1710.09412 | 论文地址 | -| 开发引入 | / | Resnet50_Cifar_for_PyTorch/mmcv_need/setup.py | https://github.com/pytorch/pytorch/pull/45956 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/mmpretrain/datasets/mnist.py | Resnet50_Cifar_for_PyTorch/mmcls/datasets/mnist.py | https://github.com/pytorch/vision/blob/master/torchvision/datasets/mnist.py | 源码实现 | -| 开发引入 | / | Resnet50_Cifar_for_PyTorch/mmcls/apis/train.py | https://github.com/open-mmlab/mmcv/issues/1261 | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/mmpretrain/models/backbones/convmixer.py | Resnet50_Cifar_for_PyTorch/mmcls/models/backbones/convmixer.py | https://github.com/locuslab/convmixer/blob/main/convmixer.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/docs/zh_CN/locales/zh_CN/LC_MESSAGES/papers.po | Resnet50_Cifar_for_PyTorch/configs/resnet/resnet50_8xb8_cub.py | https://github.com/Alibaba-MIIL/ImageNet21K | 源码实现 | -| 开发引入 | / | Resnet50_Cifar_for_PyTorch/mmcls/models/backbones/resnest.py | https://arxiv.org/pdf/2004.08955.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/mmpretrain/models/utils/batch_augments/resizemix.py | Resnet50_Cifar_for_PyTorch/mmcls/models/utils/augment/resizemix.py | https://arxiv.org/abs/2012.11101 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/mmpretrain/models/backbones/convnext.py | Resnet50_Cifar_for_PyTorch/mmcls/models/backbones/convnext.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/convnext.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/mmpretrain/models/backbones/resnet_cifar.py | Resnet50_Cifar_for_PyTorch/mmcls/models/backbones/resnet_cifar.py | https://github.com/kuangliu/pytorch-cifar/blob/master/models/resnet.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/mmpretrain/models/backbones/timm_backbone.py | Resnet50_Cifar_for_PyTorch/mmcls/models/backbones/timm_backbone.py | https://rwightman.github.io/pytorch-image-models/feature_extraction/ | 相关说明 | -| 开发引入 | / | Resnet50_Cifar_for_PyTorch/mmcls/datasets/dataset_wrappers.py | https://github.com/facebookresearch/detectron2/blob/41d475b75a230221e21d9cac5d69655e3415e3a4/detectron2/data/samplers/distributed_sampler.py#L57 | 源码实现 | -| 开发引入 | / | Resnet50_Cifar_for_PyTorch/mmcls/datasets/cub.py | http://www.vision.caltech.edu/visipedia/CUB-200.html | 数据集地址 | -| 开发引入 | / | Resnet50_Cifar_for_PyTorch/mmcv_need/setup.py | https://github.com/open-mmlab/mmcv | 源码实现 | -| 开发引入 | / | Resnet50_Cifar_for_PyTorch/mmcls/core/visualization/image.py | https://github.com/matplotlib/matplotlib/blob/v3.5.x/lib/matplotlib/_blocking_input.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/mmpretrain/models/utils/batch_augments/cutmix.py | Resnet50_Cifar_for_PyTorch/mmcls/models/utils/augment/cutmix.py | https://arxiv.org/abs/1905.04899 | 论文地址 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmpretrain/blob/main/mmpretrain/datasets/samplers/repeat_aug.py | Resnet50_Cifar_for_PyTorch/mmcls/datasets/samplers/repeat_aug.py | https://github.com/facebookresearch/deit/blob/0c4b8f60/samplers.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/mmpretrain/engine/optimizers/lamb.py | Resnet50_Cifar_for_PyTorch/mmcls/core/optimizers/lamb.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/optim/lamb.py | 源码实现 | -| 开发引入 | / | Resnet50_Cifar_for_PyTorch/mmcls/utils/distribution.py | https://pytorch.org/docs/stable/generated/torch.nn.parallel | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/mmpretrain/models/selfsup/mocov3.py | Resnet50_Cifar_for_PyTorch/mmcls/models/backbones/vision_transformer.py | https://arxiv.org/abs/2010.11929 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/mmpretrain/models/losses/seesaw_loss.py | Resnet50_Cifar_for_PyTorch/mmcls/models/losses/seesaw_loss.py | https://arxiv.org/abs/2008.10032 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpretrain/blob/main/mmpretrain/models/backbones/swin_transformer.py | Resnet50_Cifar_for_PyTorch/mmcls/models/backbones/swin_transformer.py | https://arxiv.org/abs/2103.14030 | 论文地址 | -| 开发引入 | / | Resnet50_Cifar_for_PyTorch/requirements/docs.txt | https://github.com/open-mmlab/pytorch_sphinx_theme.git#egg=pytorch_sphinx_theme | 相关依赖 | +| 文件位置 | 公网地址 | 公网地址用途 | +|----------------------------------------------------------------------------------------------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/conformer/metafile.yml | https://download.openmmlab.com/mmclassification/v0/conformer/conformer-tiny-p16_3rdparty_8xb128_in1k_20211206-f6860372.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/conformer/metafile.yml | https://download.openmmlab.com/mmclassification/v0/conformer/conformer-small-p32_8xb128_in1k_20211206-947a0816.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/conformer/metafile.yml | https://download.openmmlab.com/mmclassification/v0/conformer/conformer-small-p16_3rdparty_8xb128_in1k_20211206-3065dcf5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/conformer/metafile.yml | https://download.openmmlab.com/mmclassification/v0/conformer/conformer-base-p16_3rdparty_8xb128_in1k_20211206-bfdf8637.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/conformer/metafile.yml | https://arxiv.org/abs/2105.03889 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/convmixer/metafile.yml | https://download.openmmlab.com/mmclassification/v0/convmixer/convmixer-768-32_3rdparty_10xb64_in1k_20220323-bca1f7b8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/convmixer/metafile.yml | https://download.openmmlab.com/mmclassification/v0/convmixer/convmixer-1536_20_3rdparty_10xb64_in1k_20220323-ea5786f3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/convmixer/metafile.yml | 
https://download.openmmlab.com/mmclassification/v0/convmixer/convmixer-1024-20_3rdparty_10xb64_in1k_20220323-48f8aeba.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/convmixer/metafile.yml | https://arxiv.org/abs/2201.09792 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://download.openmmlab.com/mmclassification/v0/convnext/convnext-xlarge_in21k-pre-3rdparty_64xb64_in1k_20220124-76b6863d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://download.openmmlab.com/mmclassification/v0/convnext/convnext-xlarge_3rdparty_in21k_20220124-f909bad7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://download.openmmlab.com/mmclassification/v0/convnext/convnext-tiny_3rdparty_32xb128-noema_in1k_20220222-2908964a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://download.openmmlab.com/mmclassification/v0/convnext/convnext-tiny_3rdparty_32xb128_in1k_20220124-18abde00.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://download.openmmlab.com/mmclassification/v0/convnext/convnext-small_3rdparty_32xb128-noema_in1k_20220222-fa001ca5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://download.openmmlab.com/mmclassification/v0/convnext/convnext-small_3rdparty_32xb128_in1k_20220124-d39b5192.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://download.openmmlab.com/mmclassification/v0/convnext/convnext-large_in21k-pre-3rdparty_64xb64_in1k_20220124-2412403d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://download.openmmlab.com/mmclassification/v0/convnext/convnext-large_3rdparty_in21k_20220124-41b5a79f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://download.openmmlab.com/mmclassification/v0/convnext/convnext-large_3rdparty_64xb64_in1k_20220124-f8a0ded0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://download.openmmlab.com/mmclassification/v0/convnext/convnext-base_in21k-pre-3rdparty_32xb128_in1k_20220124-eb2d6ada.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://download.openmmlab.com/mmclassification/v0/convnext/convnext-base_3rdparty_in21k_20220124-13b83eec.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://download.openmmlab.com/mmclassification/v0/convnext/convnext-base_3rdparty_32xb128-noema_in1k_20220222-dba4f95f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://download.openmmlab.com/mmclassification/v0/convnext/convnext-base_3rdparty_32xb128_in1k_20220124-d0915162.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://dl.fbaipublicfiles.com/convnext/convnext_xlarge_22k_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://dl.fbaipublicfiles.com/convnext/convnext_xlarge_22k_1k_224_ema.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://dl.fbaipublicfiles.com/convnext/convnext_tiny_1k_224_ema.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://dl.fbaipublicfiles.com/convnext/convnext_tiny_1k_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://dl.fbaipublicfiles.com/convnext/convnext_small_1k_224_ema.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://dl.fbaipublicfiles.com/convnext/convnext_small_1k_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://dl.fbaipublicfiles.com/convnext/convnext_large_22k_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://dl.fbaipublicfiles.com/convnext/convnext_large_22k_1k_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://dl.fbaipublicfiles.com/convnext/convnext_large_1k_224_ema.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://dl.fbaipublicfiles.com/convnext/convnext_base_22k_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://dl.fbaipublicfiles.com/convnext/convnext_base_22k_1k_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://dl.fbaipublicfiles.com/convnext/convnext_base_1k_224_ema.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://dl.fbaipublicfiles.com/convnext/convnext_base_1k_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/convnext/metafile.yml | https://arxiv.org/abs/2201.03545v1 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/cspnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/cspnet/cspresnext50_3rdparty_8xb32_in1k_20220329-2cc84d21.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/cspnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/cspnet/cspresnet50_3rdparty_8xb32_in1k_20220329-dd6dddfb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/cspnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/cspnet/cspdarknet50_3rdparty_8xb32_in1k_20220329-bd275287.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/cspnet/metafile.yml | https://arxiv.org/abs/1911.11929 | 论文地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/deit/metafile.yml | https://download.openmmlab.com/mmclassification/v0/deit/deit-tiny-distilled_3rdparty_pt-4xb256_in1k_20211216-c429839a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/deit/metafile.yml | https://download.openmmlab.com/mmclassification/v0/deit/deit-tiny_pt-4xb256_in1k_20220218-13b382a0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/deit/metafile.yml | https://download.openmmlab.com/mmclassification/v0/deit/deit-small-distilled_3rdparty_pt-4xb256_in1k_20211216-4de1d725.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/deit/metafile.yml | https://download.openmmlab.com/mmclassification/v0/deit/deit-small_pt-4xb256_in1k_20220218-9425b9bb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/deit/metafile.yml | https://download.openmmlab.com/mmclassification/v0/deit/deit-base-distilled_3rdparty_pt-16xb64_in1k_20211216-42891296.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/deit/metafile.yml | https://download.openmmlab.com/mmclassification/v0/deit/deit-base-distilled_3rdparty_ft-16xb32_in1k-384px_20211216-e48d6000.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/deit/metafile.yml | https://download.openmmlab.com/mmclassification/v0/deit/deit-base_pt-16xb64_in1k_20220216-db63c16c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/deit/metafile.yml | https://download.openmmlab.com/mmclassification/v0/deit/deit-base_3rdparty_pt-16xb64_in1k_20211124-6f40c188.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/deit/metafile.yml | https://download.openmmlab.com/mmclassification/v0/deit/deit-base_3rdparty_ft-16xb32_in1k-384px_20211124-822d02f2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/deit/metafile.yml | https://dl.fbaipublicfiles.com/deit/deit_tiny_distilled_patch16_224-b40b3cf7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/deit/metafile.yml | https://dl.fbaipublicfiles.com/deit/deit_small_distilled_patch16_224-649709d9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/deit/metafile.yml | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_384-8de9b5d1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/deit/metafile.yml | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_224-b5f2ef4d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/deit/metafile.yml | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_384-d0272ac0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/deit/metafile.yml | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_224-df68dfff.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/deit/metafile.yml | https://arxiv.org/abs/2012.12877 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/densenet/metafile.yml | 
https://download.openmmlab.com/mmclassification/v0/densenet/densenet201_4xb256_in1k_20220426-05cae4ef.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/densenet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/densenet/densenet169_4xb256_in1k_20220426-a2889902.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/densenet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/densenet/densenet161_4xb256_in1k_20220426-ee6a80a9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/densenet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/densenet/densenet121_4xb256_in1k_20220426-07450f99.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/densenet/metafile.yml | https://download.pytorch.org/models/densenet201-c1103571.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/densenet/metafile.yml | https://download.pytorch.org/models/densenet169-b2777c0a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/densenet/metafile.yml | https://download.pytorch.org/models/densenet161-8d451a50.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/densenet/metafile.yml | https://download.pytorch.org/models/densenet121-a639ec97.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/densenet/metafile.yml | https://arxiv.org/abs/1608.06993 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/efficientnet/efficientnet-b8_3rdparty_8xb32-aa-advprop_in1k_20220119-297ce1b7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/efficientnet/efficientnet-b7_3rdparty_8xb32-aa-advprop_in1k_20220119-c6dbff10.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/efficientnet/efficientnet-b7_3rdparty_8xb32-aa_in1k_20220119-bf03951c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/efficientnet/efficientnet-b6_3rdparty_8xb32-aa-advprop_in1k_20220119-bfe3485e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/efficientnet/efficientnet-b6_3rdparty_8xb32-aa_in1k_20220119-45b03310.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/efficientnet/efficientnet-b5_3rdparty_8xb32-aa-advprop_in1k_20220119-f57a895a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/efficientnet/efficientnet-b5_3rdparty_8xb32-aa_in1k_20220119-2cab8b78.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/efficientnet/efficientnet-b5_3rdparty_8xb32_in1k_20220119-e9814430.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/efficientnet/efficientnet-b4_3rdparty_8xb32-aa-advprop_in1k_20220119-38c2238c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/efficientnet/efficientnet-b4_3rdparty_8xb32-aa_in1k_20220119-45b8bd2b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/efficientnet/efficientnet-b4_3rdparty_8xb32_in1k_20220119-81fd4077.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/efficientnet/efficientnet-b3_3rdparty_8xb32-aa-advprop_in1k_20220119-53b41118.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/efficientnet/efficientnet-b3_3rdparty_8xb32-aa_in1k_20220119-5b4887a0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/efficientnet/efficientnet-b3_3rdparty_8xb32_in1k_20220119-4b4d7487.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/efficientnet/efficientnet-b2_3rdparty_8xb32-aa-advprop_in1k_20220119-1655338a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/efficientnet/efficientnet-b2_3rdparty_8xb32-aa_in1k_20220119-dd61e80b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/efficientnet/efficientnet-b2_3rdparty_8xb32_in1k_20220119-ea374a30.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/efficientnet/efficientnet-b1_3rdparty_8xb32-aa-advprop_in1k_20220119-5715267d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/efficientnet/efficientnet-b1_3rdparty_8xb32-aa_in1k_20220119-619d8ae3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/efficientnet/efficientnet-b1_3rdparty_8xb32_in1k_20220119-002556d9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/efficientnet/efficientnet-b0_3rdparty_8xb32-aa-advprop_in1k_20220119-26434485.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/efficientnet/efficientnet-b0_3rdparty_8xb32-aa_in1k_20220119-8d939117.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/efficientnet/efficientnet-b0_3rdparty_8xb32_in1k_20220119-a7e2a0b1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://storage.googleapis.com/cloud-tpu-checkpoints/efficientnet/ckptsaug/efficientnet-b7.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://storage.googleapis.com/cloud-tpu-checkpoints/efficientnet/ckptsaug/efficientnet-b6.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://storage.googleapis.com/cloud-tpu-checkpoints/efficientnet/ckptsaug/efficientnet-b5.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://storage.googleapis.com/cloud-tpu-checkpoints/efficientnet/ckptsaug/efficientnet-b4.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://storage.googleapis.com/cloud-tpu-checkpoints/efficientnet/ckptsaug/efficientnet-b3.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://storage.googleapis.com/cloud-tpu-checkpoints/efficientnet/ckptsaug/efficientnet-b2.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://storage.googleapis.com/cloud-tpu-checkpoints/efficientnet/ckptsaug/efficientnet-b1.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://storage.googleapis.com/cloud-tpu-checkpoints/efficientnet/ckptsaug/efficientnet-b0.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://storage.googleapis.com/cloud-tpu-checkpoints/efficientnet/ckpts/efficientnet-b5.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://storage.googleapis.com/cloud-tpu-checkpoints/efficientnet/ckpts/efficientnet-b4.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://storage.googleapis.com/cloud-tpu-checkpoints/efficientnet/ckpts/efficientnet-b3.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://storage.googleapis.com/cloud-tpu-checkpoints/efficientnet/ckpts/efficientnet-b2.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://storage.googleapis.com/cloud-tpu-checkpoints/efficientnet/ckpts/efficientnet-b1.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | 
https://storage.googleapis.com/cloud-tpu-checkpoints/efficientnet/ckpts/efficientnet-b0.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://storage.googleapis.com/cloud-tpu-checkpoints/efficientnet/advprop/efficientnet-b8.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://storage.googleapis.com/cloud-tpu-checkpoints/efficientnet/advprop/efficientnet-b7.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://storage.googleapis.com/cloud-tpu-checkpoints/efficientnet/advprop/efficientnet-b6.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://storage.googleapis.com/cloud-tpu-checkpoints/efficientnet/advprop/efficientnet-b5.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://storage.googleapis.com/cloud-tpu-checkpoints/efficientnet/advprop/efficientnet-b4.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://storage.googleapis.com/cloud-tpu-checkpoints/efficientnet/advprop/efficientnet-b3.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://storage.googleapis.com/cloud-tpu-checkpoints/efficientnet/advprop/efficientnet-b2.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://storage.googleapis.com/cloud-tpu-checkpoints/efficientnet/advprop/efficientnet-b1.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://storage.googleapis.com/cloud-tpu-checkpoints/efficientnet/advprop/efficientnet-b0.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/efficientnet/metafile.yml | https://arxiv.org/abs/1905.11946v5 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/hrnet/hrnet-w64_3rdparty_8xb32_in1k_20220120-19126642.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/hrnet/hrnet-w48_3rdparty_8xb32-ssld_in1k_20220120-d0459c38.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/hrnet/hrnet-w48_3rdparty_8xb32_in1k_20220120-e555ef50.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/hrnet/hrnet-w44_3rdparty_8xb32_in1k_20220120-35d07f73.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/hrnet/hrnet-w40_3rdparty_8xb32_in1k_20220120-9a2dbfc5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/hrnet/metafile.yml | 
https://download.openmmlab.com/mmclassification/v0/hrnet/hrnet-w32_3rdparty_8xb32_in1k_20220120-c394f1ab.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/hrnet/hrnet-w30_3rdparty_8xb32_in1k_20220120-8aa3832f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/hrnet/hrnet-w18_3rdparty_8xb32-ssld_in1k_20220120-455f69ea.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/hrnet/hrnet-w18_3rdparty_8xb32_in1k_20220120-0c10b180.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/hrnet/metafile.yml | https://arxiv.org/abs/1908.07919v2 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/mlp_mixer/metafile.yml | https://download.openmmlab.com/mmclassification/v0/mlp-mixer/mixer-large-p16_3rdparty_64xb64_in1k_20211124-5a2519d2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/mlp_mixer/metafile.yml | https://download.openmmlab.com/mmclassification/v0/mlp-mixer/mixer-base-p16_3rdparty_64xb64_in1k_20211124-1377e3e0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/mlp_mixer/metafile.yml | https://arxiv.org/abs/2105.01601 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/mobilenet_v2/metafile.yml | https://download.openmmlab.com/mmclassification/v0/mobilenet_v2/mobilenet_v2_batch256_imagenet_20200708-3b2dc3af.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/mobilenet_v2/metafile.yml | https://arxiv.org/abs/1801.04381 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/mobilenet_v3/metafile.yml | https://download.openmmlab.com/mmclassification/v0/mobilenet_v3/convert/mobilenet_v3_small-8427ecf0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/mobilenet_v3/metafile.yml | https://download.openmmlab.com/mmclassification/v0/mobilenet_v3/convert/mobilenet_v3_large-3ea3c186.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/mobilenet_v3/metafile.yml | https://arxiv.org/abs/1905.02244 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/poolformer/metafile.yml | https://download.openmmlab.com/mmclassification/v0/poolformer/poolformer-s36_3rdparty_32xb128_in1k_20220414-d78ff3e8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/poolformer/metafile.yml | https://download.openmmlab.com/mmclassification/v0/poolformer/poolformer-s24_3rdparty_32xb128_in1k_20220414-d7055904.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/poolformer/metafile.yml | https://download.openmmlab.com/mmclassification/v0/poolformer/poolformer-s12_3rdparty_32xb128_in1k_20220414-f8d83051.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/poolformer/metafile.yml | 
https://download.openmmlab.com/mmclassification/v0/poolformer/poolformer-m48_3rdparty_32xb128_in1k_20220414-9378f3eb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/poolformer/metafile.yml | https://download.openmmlab.com/mmclassification/v0/poolformer/poolformer-m36_3rdparty_32xb128_in1k_20220414-c55e0949.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/poolformer/metafile.yml | https://arxiv.org/abs/2111.11418 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/regnet/metafile.yml | https://arxiv.org/abs/2003.13678 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/regnet/regnetx-800mf_8xb128_in1k_20211213-222b0f11.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/regnet/regnetx-8.0gf_8xb64_in1k_20211213-9a9fcc76.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/regnet/regnetx-6.4gf_8xb64_in1k_20211215-5c6089da.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/regnet/regnetx-400mf_8xb128_in1k_20211213-89bfc226.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/regnet/regnetx-4.0gf_8xb64_in1k_20211213-efed675c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/regnet/regnetx-3.2gf_8xb64_in1k_20211213-1fdd82ae.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/regnet/regnetx-12gf_8xb64_in1k_20211213-5df8c2f8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/regnet/regnetx-1.6gf_8xb128_in1k_20211213-d1b89758.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/repmlp/metafile.yml | https://download.openmmlab.com/mmclassification/v0/repmlp/repmlp-base_3rdparty_8xb64_in1k-256px_20220330-7c5a91ce.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/repmlp/metafile.yml | https://download.openmmlab.com/mmclassification/v0/repmlp/repmlp-base_3rdparty_8xb64_in1k_20220330-1cb1f11b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/repmlp/metafile.yml | https://arxiv.org/abs/2105.01883 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/repvgg/metafile.yml | https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-D2se_3rdparty_4xb64-autoaug-lbs-mixup-coslr-200e_in1k_20210909-cf3139b7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/repvgg/metafile.yml | 
https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-B3g4_3rdparty_4xb64-autoaug-lbs-mixup-coslr-200e_in1k_20210909-4e54846a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/repvgg/metafile.yml | https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-B3_3rdparty_4xb64-autoaug-lbs-mixup-coslr-200e_in1k_20210909-dda968bf.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/repvgg/metafile.yml | https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-B2g4_3rdparty_4xb64-autoaug-lbs-mixup-coslr-200e_in1k_20210909-7b7955f0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/repvgg/metafile.yml | https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-B2_3rdparty_4xb64-coslr-120e_in1k_20210909-bd6b937c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/repvgg/metafile.yml | https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-B1g4_3rdparty_4xb64-coslr-120e_in1k_20210909-d4c1a642.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/repvgg/metafile.yml | https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-B1g2_3rdparty_4xb64-coslr-120e_in1k_20210909-344f6422.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/repvgg/metafile.yml | https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-B1_3rdparty_4xb64-coslr-120e_in1k_20210909-750cdf67.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/repvgg/metafile.yml | https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-B0_3rdparty_4xb64-coslr-120e_in1k_20210909-446375f4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/repvgg/metafile.yml | https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-A2_3rdparty_4xb64-coslr-120e_in1k_20210909-97d7695a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/repvgg/metafile.yml | https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-A1_3rdparty_4xb64-coslr-120e_in1k_20210909-24003a24.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/repvgg/metafile.yml | https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-A0_3rdparty_4xb64-coslr-120e_in1k_20210909-883ab98c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/repvgg/metafile.yml | https://arxiv.org/abs/2101.03697 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/res2net/metafile.yml | https://download.openmmlab.com/mmclassification/v0/res2net/res2net50-w26-s8_3rdparty_8xb32_in1k_20210927-f547a94b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/res2net/metafile.yml | https://download.openmmlab.com/mmclassification/v0/res2net/res2net50-w14-s8_3rdparty_8xb32_in1k_20210927-bc967bf1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/res2net/metafile.yml | https://download.openmmlab.com/mmclassification/v0/res2net/res2net101-w26-s4_3rdparty_8xb32_in1k_20210927-870b6c36.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/res2net/metafile.yml | https://arxiv.org/pdf/1904.01169.pdf | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/resnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/resnet/resnetv1d50_b32x8_imagenet_20210531-db14775a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/resnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/resnet/resnetv1d152_b32x8_imagenet_20210531-278cf22a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/resnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/resnet/resnetv1d101_b32x8_imagenet_20210531-6e13bcd3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/resnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/resnet/resnetv1c50_8xb32_in1k_20220214-3343eccd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/resnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/resnet/resnetv1c152_8xb32_in1k_20220214-c013291f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/resnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/resnet/resnetv1c101_8xb32_in1k_20220214-434fe45f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/resnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/resnet/resnet50_b16x8_cifar100_20210528-67b58a1b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/resnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/resnet/resnet50_b16x8_cifar10_20210528-f54bfad9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/resnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/resnet/resnet50_8xb8_cub_20220307-57840e60.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/resnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/resnet/resnet50_8xb8_cars_20220812-9d85901a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/resnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/resnet/resnet50_8xb32_in1k_20210831-ea4938fc.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/resnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/resnet/resnet50_8xb256-rsb-a3-100e_in1k_20211228-3493673c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/resnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/resnet/resnet50_8xb256-rsb-a2-300e_in1k_20211228-0fd8be6e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/resnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/resnet/resnet50_8xb256-rsb-a1-600e_in1k_20211228-20e21305.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/resnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/resnet/resnet34_b16x8_cifar10_20210528-a8aa36a6.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/resnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/resnet/resnet34_8xb32_in1k_20210831-f257d4e6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/resnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/resnet/resnet18_b16x8_cifar10_20210528-bd6371c8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/resnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/resnet/resnet18_8xb32_in1k_20210831-fbbb1da6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/resnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/resnet/resnet152_b16x8_cifar10_20210528-3e8e9178.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/resnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/resnet/resnet152_8xb32_in1k_20210901-4d7582fa.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/resnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/resnet/resnet101_b16x8_cifar10_20210528-2d29e936.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/resnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/resnet/resnet101_8xb32_in1k_20210831-539c63f8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/resnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/fp16/resnet50_batch256_fp16_imagenet_20210320-b3964210.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/resnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/resnet/resnet50_3rdparty-mill_in21k_20220331-faac000b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/resnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/resnet/resnet50_3rdparty-mill_in21k_20220331-faac000b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/resnet/metafile.yml | https://openaccess.thecvf.com/content_cvpr_2016/html/He_Deep_Residual_Learning_CVPR_2016_paper.html | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/resnet/resnet50_8xb8_cub.py | https://download.openmmlab.com/mmclassification/v0/resnet/resnet50_3rdparty-mill_in21k_20220331-faac000b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/resnext/metafile.yml | https://download.openmmlab.com/mmclassification/v0/resnext/resnext50_32x4d_b32x8_imagenet_20210429-56066e27.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/resnext/metafile.yml | https://download.openmmlab.com/mmclassification/v0/resnext/resnext152_32x4d_b32x8_imagenet_20210524-927787be.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/resnext/metafile.yml | https://download.openmmlab.com/mmclassification/v0/resnext/resnext101_32x8d_b32x8_imagenet_20210506-23a247d5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/resnext/metafile.yml | 
https://download.openmmlab.com/mmclassification/v0/resnext/resnext101_32x4d_b32x8_imagenet_20210506-e0fa3dd5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/resnext/metafile.yml | https://openaccess.thecvf.com/content_cvpr_2017/html/Xie_Aggregated_Residual_Transformations_CVPR_2017_paper.html | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/seresnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/se-resnet/se-resnet50_batch256_imagenet_20200804-ae206104.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/seresnet/metafile.yml | https://download.openmmlab.com/mmclassification/v0/se-resnet/se-resnet101_batch256_imagenet_20200804-ba5b51d4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/seresnet/metafile.yml | https://openaccess.thecvf.com/content_cvpr_2018/html/Hu_Squeeze-and-Excitation_Networks_CVPR_2018_paper.html | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/shufflenet_v1/metafile.yml | https://download.openmmlab.com/mmclassification/v0/shufflenet_v1/shufflenet_v1_batch1024_imagenet_20200804-5d6cec73.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/shufflenet_v1/metafile.yml | https://openaccess.thecvf.com/content_cvpr_2018/html/Zhang_ShuffleNet_An_Extremely_CVPR_2018_paper.html | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/shufflenet_v2/metafile.yml | https://download.openmmlab.com/mmclassification/v0/shufflenet_v2/shufflenet_v2_batch1024_imagenet_20200812-5bf4721e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/shufflenet_v2/metafile.yml | https://openaccess.thecvf.com/content_ECCV_2018/papers/Ningning_Light-weight_CNN_Architecture_ECCV_2018_paper.pdf | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/swin_transformer/metafile.yml | https://download.openmmlab.com/mmclassification/v0/swin-transformer/swin-large_8xb8_cub_384px_20220307-1bbaee6a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/swin_transformer/metafile.yml | https://download.openmmlab.com/mmclassification/v0/swin-transformer/swin_tiny_224_b16x64_300e_imagenet_20210616_090925-66df6be6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/swin_transformer/metafile.yml | https://download.openmmlab.com/mmclassification/v0/swin-transformer/swin_small_224_b16x64_300e_imagenet_20210615_110219-7f9d988b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/swin_transformer/metafile.yml | https://download.openmmlab.com/mmclassification/v0/swin-transformer/swin_base_224_b16x64_300e_imagenet_20210616_190742-93230b0d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/swin_transformer/metafile.yml | https://download.openmmlab.com/mmclassification/v0/swin-transformer/convert/swin_tiny_patch4_window7_224-160bb0a5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/swin_transformer/metafile.yml | 
https://download.openmmlab.com/mmclassification/v0/swin-transformer/convert/swin_small_patch4_window7_224-cc7a01c9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/swin_transformer/metafile.yml | https://download.openmmlab.com/mmclassification/v0/swin-transformer/convert/swin_large_patch4_window7_224_22kto1k-5f0996db.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/swin_transformer/metafile.yml | https://download.openmmlab.com/mmclassification/v0/swin-transformer/convert/swin_large_patch4_window12_384_22kto1k-0a40944b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/swin_transformer/metafile.yml | https://download.openmmlab.com/mmclassification/v0/swin-transformer/convert/swin_base_patch4_window7_224-4670dd19.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/swin_transformer/metafile.yml | https://download.openmmlab.com/mmclassification/v0/swin-transformer/convert/swin_base_patch4_window7_224_22kto1k-f967f799.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/swin_transformer/metafile.yml | https://download.openmmlab.com/mmclassification/v0/swin-transformer/convert/swin_base_patch4_window12_384-02c598a4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/swin_transformer/metafile.yml | https://download.openmmlab.com/mmclassification/v0/swin-transformer/convert/swin_base_patch4_window12_384_22kto1k-d59b0d1d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/swin_transformer/metafile.yml | https://arxiv.org/pdf/2103.14030.pdf | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/swin_transformer/metafile.yml | https://download.openmmlab.com/mmclassification/v0/swin-transformer/convert/swin-large_3rdparty_in21k-384px.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/swin_transformer/swin-large_8xb8_cub_384px.py | https://download.openmmlab.com/mmclassification/v0/swin-transformer/convert/swin-large_3rdparty_in21k-384px.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/t2t_vit/metafile.yml | https://download.openmmlab.com/mmclassification/v0/t2t-vit/t2t-vit-t-24_8xb64_in1k_20211214-b2a68ae3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/t2t_vit/metafile.yml | https://download.openmmlab.com/mmclassification/v0/t2t-vit/t2t-vit-t-19_8xb64_in1k_20211214-7f5e3aaf.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/t2t_vit/metafile.yml | https://download.openmmlab.com/mmclassification/v0/t2t-vit/t2t-vit-t-14_8xb64_in1k_20211220-f7378dd5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/t2t_vit/metafile.yml | https://arxiv.org/abs/2101.11986 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/tnt/metafile.yml | https://download.openmmlab.com/mmclassification/v0/tnt/tnt-small-p16_3rdparty_in1k_20210903-c56ee7df.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/tnt/metafile.yml | https://arxiv.org/abs/2103.00112 | 论文地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/twins/metafile.yml | https://download.openmmlab.com/mmclassification/v0/twins/twins-svt-small_3rdparty_8xb128_in1k_20220126-8fe5205b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/twins/metafile.yml | https://download.openmmlab.com/mmclassification/v0/twins/twins-svt-large_3rdparty_16xb64_in1k_20220126-4817645f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/twins/metafile.yml | https://download.openmmlab.com/mmclassification/v0/twins/twins-svt-base_3rdparty_8xb128_in1k_20220126-e31cc8e9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/twins/metafile.yml | https://download.openmmlab.com/mmclassification/v0/twins/twins-pcpvt-small_3rdparty_8xb128_in1k_20220126-ef23c132.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/twins/metafile.yml | https://download.openmmlab.com/mmclassification/v0/twins/twins-pcpvt-large_3rdparty_16xb64_in1k_20220126-c1ef8d80.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/twins/metafile.yml | https://download.openmmlab.com/mmclassification/v0/twins/twins-pcpvt-base_3rdparty_8xb128_in1k_20220126-f8c4b0d5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/twins/metafile.yml | http://arxiv-export-lb.library.cornell.edu/abs/2104.13840 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/van/metafile.yml | https://download.openmmlab.com/mmclassification/v0/van/van-tiny_8xb128_in1k_20220501-385941af.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/van/metafile.yml | https://download.openmmlab.com/mmclassification/v0/van/van-small_8xb128_in1k_20220501-17bc91aa.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/van/metafile.yml | https://download.openmmlab.com/mmclassification/v0/van/van-large_8xb128_in1k_20220501-f212ba21.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/van/metafile.yml | https://download.openmmlab.com/mmclassification/v0/van/van-base_8xb128_in1k_20220501-6a4cc31b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/van/metafile.yml | https://arxiv.org/pdf/2202.09741v2.pdf | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/vgg/metafile.yml | https://download.openmmlab.com/mmclassification/v0/vgg/vgg19_bn_batch256_imagenet_20210208-da620c4f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/vgg/metafile.yml | https://download.openmmlab.com/mmclassification/v0/vgg/vgg19_batch256_imagenet_20210208-e6920e4a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/vgg/metafile.yml | https://download.openmmlab.com/mmclassification/v0/vgg/vgg16_bn_batch256_imagenet_20210208-7e55cd29.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/vgg/metafile.yml | https://download.openmmlab.com/mmclassification/v0/vgg/vgg16_batch256_imagenet_20210208-db26f1a5.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/vgg/metafile.yml | https://download.openmmlab.com/mmclassification/v0/vgg/vgg13_bn_batch256_imagenet_20210207-1a8b7864.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/vgg/metafile.yml | https://download.openmmlab.com/mmclassification/v0/vgg/vgg13_batch256_imagenet_20210208-4d1d6080.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/vgg/metafile.yml | https://download.openmmlab.com/mmclassification/v0/vgg/vgg11_bn_batch256_imagenet_20210207-f244902c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/vgg/metafile.yml | https://download.openmmlab.com/mmclassification/v0/vgg/vgg11_batch256_imagenet_20210208-4271cd6c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/vgg/metafile.yml | https://arxiv.org/abs/1409.1556 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/vgg/vgg16_8xb16_voc.py | https://download.openmmlab.com/mmclassification/v0/vgg/vgg16_batch256_imagenet_20210208-db26f1a5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/vision_transformer/vit-base-p16_ft-4xb544-ipu_in1k.py | https://download.openmmlab.com/mmclassification/v0/vit/pretrain/vit-base-p16_3rdparty_pt-64xb64_in1k-224_20210928-02284250.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/wrn/metafile.yml | https://download.pytorch.org/models/wide_resnet50_2-95faca4d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/wrn/metafile.yml | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/wrn/metafile.yml | https://download.openmmlab.com/mmclassification/v0/wrn/wide-resnet50_3rdparty-timm_8xb32_in1k_20220304-83ae4399.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/wrn/metafile.yml | https://download.openmmlab.com/mmclassification/v0/wrn/wide-resnet50_3rdparty_8xb32_in1k_20220304-66678344.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/wrn/metafile.yml | https://download.openmmlab.com/mmclassification/v0/wrn/wide-resnet101_3rdparty_8xb32_in1k_20220304-8d5f9d61.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/configs/wrn/metafile.yml | https://arxiv.org/abs/1605.07146 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/docker/serve/Dockerfile | https://download.openmmlab.com/mmcv/dist/cu${CUDA//./}/torch${PYTORCH}/index.html | 相关依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/mmcls/datasets/cifar.py | https://www.cs.toronto.edu/~kriz/cifar-10-python.tar.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/mmcls/datasets/cifar.py | https://www.cs.toronto.edu/~kriz/cifar-100-python.tar.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/mmcls/datasets/mnist.py | http://yann.lecun.com/exdb/mnist/ | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/mmcls/datasets/mnist.py | 
http://fashion-mnist.s3-website.eu-central-1.amazonaws.com/ | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/mmcv_need/setup.py | openmmlab@gmail.com | 作者邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Resnet50_Cifar_for_PyTorch/setup.py | openmmlab@gmail.com | 作者邮箱 | \ No newline at end of file diff --git a/PyTorch/built-in/cv/classification/Shufflenetv2_for_PyTorch/public_address_statement.md b/PyTorch/built-in/cv/classification/Shufflenetv2_for_PyTorch/public_address_statement.md index fee2cd6d4c98ffef709fd8478c504fcdfce3b45c..ae392bbfed543b2c79722111abe533b7d6f227df 100644 --- a/PyTorch/built-in/cv/classification/Shufflenetv2_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/cv/classification/Shufflenetv2_for_PyTorch/public_address_statement.md @@ -1,6 +1,7 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ---- | ------------ | ------ | ------------------------------------ | -------- | -| 开发引入 | / | url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 下载测试图片 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/shufflenetv2_x0.5-f707e7126e.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/shufflenetv2_x1-5666bf0f80.pth | 下载权重文件 | - +| 文件位置 | 公网地址 | 公网地址用途 | +|-------------------------------------------------------------------------------------------------------|-----------------------------------------------------------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Shufflenetv2_for_PyTorch/8p_main_med.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Shufflenetv2_for_PyTorch/modelarts/8p_main_med.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Shufflenetv2_for_PyTorch/url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Shufflenetv2_for_PyTorch/url.ini | https://download.pytorch.org/models/shufflenetv2_x1-5666bf0f80.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/classification/Shufflenetv2_for_PyTorch/url.ini | https://download.pytorch.org/models/shufflenetv2_x0.5-f707e7126e.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/built-in/cv/detection/ABINet_for_PyTorch/public_address_statement.md b/PyTorch/built-in/cv/detection/ABINet_for_PyTorch/public_address_statement.md index dec22816449c9770a82f68e0ee6ac8c369acb7b0..ab6a66a24047724861c8f7ea383b2a62fa3ab3ca 100644 --- a/PyTorch/built-in/cv/detection/ABINet_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/cv/detection/ABINet_for_PyTorch/public_address_statement.md @@ -1,52 +1,5 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ---- |----------------------------------------------------------------------------------------------------------------------| ------ | ------------------------------------ |---------| -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/a4fe6bb67f066bbb5023e38d404c1210b1b3bab2/mmocr/core/visualize.py | ABINet_for_PyTorch/mmocr/core/visualize.py | https://download.openmmlab.com/mmocr/data/font.TTF | 下载字体文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/a4fe6bb67f066bbb5023e38d404c1210b1b3bab2/mmocr/datasets/utils/parser.py | ABINet_for_PyTorch/mmocr/datasets/utils/parser.py | https://mmocr.readthedocs.io/en/latest/ | 相关文档 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmocr/blob/a4fe6bb67f066bbb5023e38d404c1210b1b3bab2/requirements/docs.txt | ABINet_for_PyTorch/requirements/docs.txt | https://github.com/open-mmlab/pytorch_sphinx_theme.git#egg=pytorch_sphinx_theme | 下载安装包源码 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/a4fe6bb67f066bbb5023e38d404c1210b1b3bab2/toosls/deployment/deploy_test.py | ABINet_for_PyTorch/toosls/deployment/deploy_test.py | https://github.com/open-mmlab/mmdeploy | 第三方包源码 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/a4fe6bb67f066bbb5023e38d404c1210b1b3bab2/toosls/deployment/onnx2tensorrt.py | ABINet_for_PyTorch/toosls/deployment/onnx2tensorrt.py | https://github.com/open-mmlab/mmdeploy | 第三方包源码 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/a4fe6bb67f066bbb5023e38d404c1210b1b3bab2/toosls/deployment/pytorch2onnx.py | ABINet_for_PyTorch/toosls/deployment/pytorch2onnx.py | https://github.com/open-mmlab/mmdeploy | 第三方包源码 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/a4fe6bb67f066bbb5023e38d404c1210b1b3bab2/mmocr/models/textrecog/recognizer/nrtr.py | ABINet_for_PyTorch/mmocr/models/textrecog/recognizer/nrtr.py | https://arxiv.org/pdf/1806.00926.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/a4fe6bb67f066bbb5023e38d404c1210b1b3bab2/mmocr/models/textrecog/encoders/transformer.py | ABINet_for_PyTorch/mmocr/datasets/pipelines/transforms.py | https://github.com/FangShancheng/ABINet | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/a4fe6bb67f066bbb5023e38d404c1210b1b3bab2/mmocr/models/textrecog/decoders/sequence_attention_decoder.py | ABINet_for_PyTorch/mmocr/models/textrecog/decoders/position_attention_decoder.py | https://arxiv.org/abs/2007.07542 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/a4fe6bb67f066bbb5023e38d404c1210b1b3bab2/mmocr/models/textrecog/recognizer/satrn.py | ABINet_for_PyTorch/mmocr/models/textrecog/encoders/abinet_vision_model.py | https://arxiv.org/abs/1910.04396 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/a4fe6bb67f066bbb5023e38d404c1210b1b3bab2/mmocr/utils/model.py | ABINet_for_PyTorch/mmocr/utils/model.py | https://github.com/pytorch/pytorch/issues/41081#issuecomment-783961547 | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/a4fe6bb67f066bbb5023e38d404c1210b1b3bab2/mmocr/models/textrecog/preprocessor/tps_preprocessor.py | ABINet_for_PyTorch/mmocr/models/textrecog/preprocessor/tps_preprocessor.py | https://arxiv.org/pdf/1603.03915.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/a4fe6bb67f066bbb5023e38d404c1210b1b3bab2/mmocr/models/textrecog/decoders/abinet_vision_decoder.py | ABINet_for_PyTorch/mmocr/models/textrecog/decoders/abinet_vision_decoder.py | https://arxiv.org/pdf/2103.06495 | 论文地址 | -| 开发引入 | / | ABINet_for_PyTorch/mmocr/core/evaluation/ocr_metric.py | https://rrc.cvc.uab.es/?ch=14&com=tasks | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/a4fe6bb67f066bbb5023e38d404c1210b1b3bab2/mmocr/models/common/modules/transformer_module.py | ABINet_for_PyTorch/mmocr/models/common/modules/transformer_module.py | https://github.com/jadore801120/attention-is-all-you-need-pytorch | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/a4fe6bb67f066bbb5023e38d404c1210b1b3bab2/mmocr/models/textrecog/recognizer/sar.py | ABINet_for_PyTorch/mmocr/models/textrecog/recognizer/sar.py | https://arxiv.org/abs/1811.00751 | 论文地址 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmocr/blob/a4fe6bb67f066bbb5023e38d404c1210b1b3bab2/mmocr/models/textrecog/decoders/abinet_vision_decoder.py | ABINet_for_PyTorch/mmocr/models/textrecog/decoders/abinet_language_decoder.py | https://arxiv.org/pdf/2103.06495 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/a4fe6bb67f066bbb5023e38d404c1210b1b3bab2/mmocr/models/textrecog/backbones/resnet31_ocr.py | ABINet_for_PyTorch/mmocr/models/textrecog/backbones/resnet_abi.py | https://arxiv.org/pdf/1512.03385.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/a4fe6bb67f066bbb5023e38d404c1210b1b3bab2/mmocr/models/textrecog/recognizer/satrn.py | ABINet_for_PyTorch/mmocr/models/textrecog/fusers/abi_fuser.py | https://arxiv.org/abs/1910.04396 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/a4fe6bb67f066bbb5023e38d404c1210b1b3bab2/mmocr/models/textrecog/decoders/sequence_attention_decoder.py | ABINet_for_PyTorch/mmocr/models/textrecog/decoders/robust_scanner_decoder.py | https://arxiv.org/abs/2007.07542 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/a4fe6bb67f066bbb5023e38d404c1210b1b3bab2/mmocr/models/textrecog/recognizer/satrn.py | ABINet_for_PyTorch/mmocr/models/textrecog/layers/satrn_layers.py | https://arxiv.org/abs/1910.04396 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/a4fe6bb67f066bbb5023e38d404c1210b1b3bab2/mmocr/models/textrecog/recognizer/abinet.py | ABINet_for_PyTorch/mmocr/models/textrecog/recognizer/abinet.py | https://arxiv.org/pdf/2103.06495.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/a4fe6bb67f066bbb5023e38d404c1210b1b3bab2/mmocr/models/textrecog/preprocessor/tps_preprocessor.py | ABINet_for_PyTorch/mmocr/models/textrecog/preprocessor/tps_preprocessor.py | https://github.com/clovaai/deep-text-recognition-benchmark | 源码实现 | -| 开发引入 | / | ABINet_for_PyTorch/mmcv_need/epoch_based_runner.py | https://github.com/open-mmlab/mmcv/pull/1108 | 源码实现 | -| 开发引入 | / | ABINet_for_PyTorch/mmocr/utils/setup_env.py | https://github.com/pytorch/pytorch/blob/master/torch/distributed/run.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/a4fe6bb67f066bbb5023e38d404c1210b1b3bab2/setup.py | ABINet_for_PyTorch/setup.py | http://setuptools.readthedocs.io/en/latest/setuptools.html#declaring-platform-specific-dependencies | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/a4fe6bb67f066bbb5023e38d404c1210b1b3bab2/mmocr/models/textrecog/layers/satrn_layers.py | ABINet_for_PyTorch/mmocr/models/textrecog/layers/satrn_layers.py | https://github.com/Media-Smart/vedastr | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/a4fe6bb67f066bbb5023e38d404c1210b1b3bab2/mmocr/models/textrecog/decoders/master_decoder.py | ABINet_for_PyTorch/mmocr/models/textrecog/decoders/master_decoder.py | https://github.com/wenwenyu/MASTER-pytorch | 源码实现 | -| 开发引入 | / | ABINet_for_PyTorch/mmocr/apis/train.py | https://www.github.com/nvidia/apex | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/a4fe6bb67f066bbb5023e38d404c1210b1b3bab2/mmocr/models/textrecog/recognizer/satrn.py | ABINet_for_PyTorch/mmocr/models/textrecog/recognizer/satrn.py | https://arxiv.org/abs/1910.04396 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/a4fe6bb67f066bbb5023e38d404c1210b1b3bab2/mmocr/models/textrecog/recognizer/satrn.py | ABINet_for_PyTorch/mmocr/models/textrecog/encoders/satrn_encoder.py | https://arxiv.org/abs/1910.04396 | 论文地址 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmocr/blob/a4fe6bb67f066bbb5023e38d404c1210b1b3bab2/mmocr/utils/box_util.py | ABINet_for_PyTorch/mmocr/utils/box_util.py | https://github.com/novioleo/Savior/blob/master/Utils/GeometryUtils.py | 源码实现 | -| 开发引入 | / | ABINet_for_PyTorch/mmocr/datasets/utils/backend.py | https://mmocr.readthedocs.io/en/latest/tools.html# | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/a4fe6bb67f066bbb5023e38d404c1210b1b3bab2/mmocr/models/textrecog/recognizer/master.py | ABINet_for_PyTorch/mmocr/models/textrecog/recognizer/master.py | https://arxiv.org/abs/1910.02562 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/a4fe6bb67f066bbb5023e38d404c1210b1b3bab2/mmocr/models/textrecog/backbones/very_deep_vgg.py | ABINet_for_PyTorch/mmocr/models/textrecog/backbones/very_deep_vgg.py | https://arxiv.org/pdf/1409.1556.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/a4fe6bb67f066bbb5023e38d404c1210b1b3bab2/mmocr/apis/train.py | ABINet_for_PyTorch/mmocr/apis/train.py | https://github.com/open-mmlab/mmdetection/issues/6339 | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/a4fe6bb67f066bbb5023e38d404c1210b1b3bab2/mmocr/models/textrecog/recognizer/robust_scanner.py | ABINet_for_PyTorch/mmocr/models/textrecog/recognizer/robust_scanner.py | https://arxiv.org/pdf/2007.07542.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/a4fe6bb67f066bbb5023e38d404c1210b1b3bab2/mmocr/models/textrecog/recognizer/sar.py | ABINet_for_PyTorch/mmocr/models/textrecog/decoders/sar_decoder.py | https://arxiv.org/abs/1811.00751 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/a4fe6bb67f066bbb5023e38d404c1210b1b3bab2/mmocr/models/textrecog/backbones/resnet31_ocr.py | ABINet_for_PyTorch/mmocr/models/textrecog/backbones/resnet31_ocr.py | https://arxiv.org/pdf/1512.03385.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/a4fe6bb67f066bbb5023e38d404c1210b1b3bab2/mmocr/utils/box_util.py | ABINet_for_PyTorch/mmocr/utils/box_util.py | https://github.com/faustomorales/keras-ocr/issues/22 | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/a4fe6bb67f066bbb5023e38d404c1210b1b3bab2/mmocr/models/common/backbones/unet.py | ABINet_for_PyTorch/mmocr/models/common/backbones/unet.py | https://arxiv.org/pdf/1505.04597.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/a4fe6bb67f066bbb5023e38d404c1210b1b3bab2/mmocr/models/textrecog/encoders/transformer.py | ABINet_for_PyTorch/mmocr/models/textrecog/encoders/transformer.py | https://github.com/FangShancheng/ABINet | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/a4fe6bb67f066bbb5023e38d404c1210b1b3bab2/mmocr/models/textrecog/recognizer/sar.py | ABINet_for_PyTorch/mmocr/models/textrecog/losses/ce_loss.py | https://arxiv.org/abs/1811.00751 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/a4fe6bb67f066bbb5023e38d404c1210b1b3bab2/mmocr/models/textrecog/encoders/transformer.py | ABINet_for_PyTorch/mmocr/models/textrecog/backbones/resnet_abi.py | https://github.com/FangShancheng/ABINet | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/a4fe6bb67f066bbb5023e38d404c1210b1b3bab2/mmocr/models/textrecog/recognizer/master.py | ABINet_for_PyTorch/mmocr/models/textrecog/decoders/master_decoder.py | https://arxiv.org/abs/1910.02562 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/a4fe6bb67f066bbb5023e38d404c1210b1b3bab2/mmocr/models/textrecog/backbones/shallow_cnn.py | ABINet_for_PyTorch/mmocr/models/textrecog/backbones/shallow_cnn.py 
| https://arxiv.org/pdf/1910.04396.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/a4fe6bb67f066bbb5023e38d404c1210b1b3bab2/mmocr/models/textrecog/recognizer/sar.py | ABINet_for_PyTorch/mmocr/models/textrecog/encoders/sar_encoder.py | https://arxiv.org/abs/1811.00751 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/a4fe6bb67f066bbb5023e38d404c1210b1b3bab2/mmocr/models/textrecog/decoders/sequence_attention_decoder.py | ABINet_for_PyTorch/mmocr/models/textrecog/decoders/sequence_attention_decoder.py | https://arxiv.org/abs/2007.07542 | 论文地址 | -| 开发引入 | / | ABINet_for_PyTorch/requirements/docs.txt | https://github.com/open-mmlab/pytorch_sphinx_theme.git#egg=pytorch_sphinx_theme | 相关依赖 | -| 开发引入 | / | ABINet_for_PyTorch/requirements/docs.txt | https://github.com/open-mmlab/pytorch_sphinx_theme.git#egg=pytorch_sphinx_theme | 相关依赖 | +| 文件位置 | 公网地址 | 公网地址用途 | +|---------------------------------------------------------------------------------------------------|----------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/ABINet_for_PyTorch/mmocr/core/visualize.py | https://download.openmmlab.com/mmocr/data/font.TTF | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/ABINet_for_PyTorch/mmocr/datasets/utils/backend.py | https://mmocr.readthedocs.io/en/latest/tools.html | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/ABINet_for_PyTorch/mmocr/datasets/utils/parser.py | https://mmocr.readthedocs.io/en/latest/ | 相关文档 | \ No newline at end of file diff --git a/PyTorch/built-in/cv/detection/DAL_ID2732_for_PyTorch/public_address_statement.md b/PyTorch/built-in/cv/detection/DAL_ID2732_for_PyTorch/public_address_statement.md index 6682049ada5c9505b1c560eeabfe7932922a2e0b..fa7c808b8ea0240d1b7df4db0a2e22b9751a78a1 100644 --- a/PyTorch/built-in/cv/detection/DAL_ID2732_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/cv/detection/DAL_ID2732_for_PyTorch/public_address_statement.md @@ -1,22 +1,3 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ---- |---------------------------------------------------------------------------------------| ------ | ------------------------------------ |------| -| 开源代码引入 | https://github.com/ming71/DAL/blob/48cd29fdbf5eeea1b5b642bd1f04bbf1863b31e3/datasets/DOTA_devkit/DOTA2COCO.py | DAL_ID2732_for_PyTorch/datasets/DOTA_devkit/DOTA2COCO.py | http://captain.whu.edu.cn/DOTAweb/ | 文档地址 | -| 开源代码引入 | https://github.com/ming71/DAL/blob/48cd29fdbf5eeea1b5b642bd1f04bbf1863b31e3/utils/augment.py | DAL_ID2732_for_PyTorch/utils/augment.py | https://medium.com/uruvideo/dataset-augmentation-with-random-homographies-a8f4b44830d4 | 数据集地址 | -| 开源代码引入 | https://github.com/ming71/DAL/blob/48cd29fdbf5eeea1b5b642bd1f04bbf1863b31e3/datasets/DOTA_devkit/ResultMerge_multi_process.py | DAL_ID2732_for_PyTorch/datasets/DOTA_devkit/ResultMerge.py | http://captain.whu.edu.cn/DOTAweb/tasks.html | 数据集地址 | -| 开源代码引入 | https://github.com/ming71/DAL/blob/48cd29fdbf5eeea1b5b642bd1f04bbf1863b31e3/utils/overlaps_cuda/setup.py | DAL_ID2732_for_PyTorch/datasets/DOTA_devkit/poly_nms_gpu/setup.py | http://code.activestate.com/recipes/52224-find-a-file-given-a-search-path/ | 相关说明 | -| 开源代码引入 | https://github.com/ming71/DAL/blob/48cd29fdbf5eeea1b5b642bd1f04bbf1863b31e3/utils/augment.py | DAL_ID2732_for_PyTorch/utils/augment.py | https://arxiv.org/abs/1708.04552 | 论文地址 | -| 开源代码引入 | https://github.com/ming71/DAL/blob/48cd29fdbf5eeea1b5b642bd1f04bbf1863b31e3/utils/bbox.py | 
DAL_ID2732_for_PyTorch/utils/bbox.py | http://fromwiz.com/share/s/34GeEW1RFx7x2iIM0z1ZXVvc2yLl5t2fTkEg2ZVhJR2n50xg | 数据集地址 | -| 开源代码引入 | https://github.com/ming71/DAL/blob/48cd29fdbf5eeea1b5b642bd1f04bbf1863b31e3/utils/augment.py | DAL_ID2732_for_PyTorch/utils/augment.py | https://towardsdatascience.com/when-conventional-wisdom-fails-revisiting-data-augmentation-for-self-driving-cars-4831998c5509 | 数据集地址 | -| 开源代码引入 | https://github.com/ming71/DAL/blob/48cd29fdbf5eeea1b5b642bd1f04bbf1863b31e3/utils/overlaps_cuda/setup.py | DAL_ID2732_for_PyTorch/utils/overlaps_cuda/setup.py | http://code.activestate.com/recipes/52224-find-a-file-given-a-search-path/ | 相关说明 | -| 开源代码引入 | https://github.com/ming71/DAL/blob/48cd29fdbf5eeea1b5b642bd1f04bbf1863b31e3/datasets/DOTA_devkit/ResultMerge_multi_process.py | DAL_ID2732_for_PyTorch/datasets/DOTA_devkit/dota_evaluation_task2.py | http://captain.whu.edu.cn/DOTAweb/tasks.html | 数据集地址 | -| 开源代码引入 | https://github.com/ming71/DAL/blob/48cd29fdbf5eeea1b5b642bd1f04bbf1863b31e3/datasets/DOTA_devkit/ResultMerge_multi_process.py | DAL_ID2732_for_PyTorch/datasets/DOTA_devkit/dota-v1.5_evaluation_task1.py | http://captain.whu.edu.cn/DOTAweb/tasks.html | 数据集地址 | -| 开源代码引入 | https://github.com/ming71/DAL/blob/48cd29fdbf5eeea1b5b642bd1f04bbf1863b31e3/models/losses.py | DAL_ID2732_for_PyTorch/models/losses.py | https://github.com/facebookresearch/maskrcnn-benchmark | 源码实现 | -| 开源代码引入 | https://github.com/ming71/DAL/blob/48cd29fdbf5eeea1b5b642bd1f04bbf1863b31e3/datasets/DOTA_devkit/ResultMerge_multi_process.py | DAL_ID2732_for_PyTorch/datasets/DOTA_devkit/dota_evaluation_task1.py | http://captain.whu.edu.cn/DOTAweb/tasks.html | 数据集地址 | -| 开源代码引入 | https://github.com/ming71/DAL/blob/48cd29fdbf5eeea1b5b642bd1f04bbf1863b31e3/datasets/DOTA_devkit/ResultMerge_multi_process.py | DAL_ID2732_for_PyTorch/datasets/DOTA_devkit/dota-v1.5_evaluation_task2.py | http://captain.whu.edu.cn/DOTAweb/tasks.html | 数据集地址 | -| 开源代码引入 | https://github.com/ming71/DAL/blob/48cd29fdbf5eeea1b5b642bd1f04bbf1863b31e3/models/losses.py | DAL_ID2732_for_PyTorch/models/main_losses.py | https://github.com/facebookresearch/maskrcnn-benchmark | 源码实现 | -| 开源代码引入 | https://github.com/ming71/DAL/blob/48cd29fdbf5eeea1b5b642bd1f04bbf1863b31e3/utils/utils.py | DAL_ID2732_for_PyTorch/utils/utils.py | https://github.com/pytorch/vision/issues/223 | 相关说明 | -| 开源代码引入 | https://github.com/ming71/DAL/blob/48cd29fdbf5eeea1b5b642bd1f04bbf1863b31e3/utils/utils.py | DAL_ID2732_for_PyTorch/utils/utils.py | https://pytorch.org/docs/stable/notes/randomness.html | 相关说明 | -| 开源代码引入 | https://github.com/ming71/DAL/blob/48cd29fdbf5eeea1b5b642bd1f04bbf1863b31e3/datasets/DOTA_devkit/ResultMerge_multi_process.py | DAL_ID2732_for_PyTorch/datasets/DOTA_devkit/ResultMerge_multi_process.py | http://captain.whu.edu.cn/DOTAweb/tasks.html | 数据集地址 | -| 开发引入 | / | DAL_ID2732_for_PyTorch/models/losses.py | https://arxiv.org/pdf/1904.02701.pdf | 论文地址 | -| 开源代码引入 | https://github.com/ming71/DAL/blob/48cd29fdbf5eeea1b5b642bd1f04bbf1863b31e3/utils/augment.py | DAL_ID2732_for_PyTorch/utils/augment.py | https://github.com/hysts/pytorch_cutout/blob/master/dataloader.py | 源码实现 | -| 开源代码引入 | https://github.com/ming71/DAL/blob/48cd29fdbf5eeea1b5b642bd1f04bbf1863b31e3/datasets/DOTA_devkit/polyiou.py | DAL_ID2732_for_PyTorch/datasets/DOTA_devkit/polyiou.py | http://www.swig.org | 相关说明 | +| 文件位置 | 公网地址 | 公网地址用途 | 
+|----------------------------------------------------------------------------------------------------------|------------------------------------|---------|
+| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/DAL_ID2732_for_PyTorch/datasets/DOTA_devkit/DOTA2COCO.py | http://captain.whu.edu.cn/DOTAweb/ | 数据集链接 |
\ No newline at end of file
diff --git a/PyTorch/built-in/cv/detection/DB_ID0706_for_PyTorch/public_address_statement.md b/PyTorch/built-in/cv/detection/DB_ID0706_for_PyTorch/public_address_statement.md
index 78b14a63b46aaf47f2004220a4e932f282d4fa4b..e81a6134c17190f804dceb8f9110297eec3afae4 100644
--- a/PyTorch/built-in/cv/detection/DB_ID0706_for_PyTorch/public_address_statement.md
+++ b/PyTorch/built-in/cv/detection/DB_ID0706_for_PyTorch/public_address_statement.md
@@ -1,14 +1,9 @@
-| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 |
-| ---- |-------------------------------------------------------------------------------------------|-------------------------------------------| ------------------------------------ |--------|
-| 开源代码引入 | https://github.com/MhLiao/DB/4ac194d0357fd102ac871e37986cb8027ecf094e/backbones/resnet.py | DB_ID0706_for_PyTorch/backbones/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/MhLiao/DB/4ac194d0357fd102ac871e37986cb8027ecf094e/backbones/resnet.py | DB_ID0706_for_PyTorch/backbones/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/MhLiao/DB/4ac194d0357fd102ac871e37986cb8027ecf094e/backbones/resnet.py | DB_ID0706_for_PyTorch/backbones/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/MhLiao/DB/4ac194d0357fd102ac871e37986cb8027ecf094e/backbones/resnet.py | DB_ID0706_for_PyTorch/backbones/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/MhLiao/DB/4ac194d0357fd102ac871e37986cb8027ecf094e/backbones/resnet.py | DB_ID0706_for_PyTorch/backbones/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 下载权重文件 |
-| 开发引入 | / | DB_ID0706_for_PyTorch/Dockerfile | http://download.osgeo.org/geos/geos-3.8.1.tar.bz2 | 下载第三方包 |
-| 开源代码引入 | https://github.com/MhLiao/DB/4ac194d0357fd102ac871e37986cb8027ecf094e/backbones/mobilenetv3.py | DB_ID0706_for_PyTorch/backbones/mobilenetv3.py | https://github.com/kuan-wang/pytorch-mobilenet-v3 | 源码实现 |
-| 开源代码引入 | https://github.com/MhLiao/DB/4ac194d0357fd102ac871e37986cb8027ecf094e/backbones/mobilenetv3.py | DB_ID0706_for_PyTorch/backbones/mobilenetv3.py | https://github.com/PaddlePaddle/PaddleOCR/blob/release/2.1/ppocr/modeling/backbones/det_mobilenet_v3.py | 源码实现 |
-| 开源代码引入 | https://github.com/MhLiao/DB/4ac194d0357fd102ac871e37986cb8027ecf094e/data/processes/random_crop_data.py | DB_ID0706_for_PyTorch/data/processes/random_crop_data.py | https://github.com/argman/EAST | 源码实现 |
-| 开源代码引入 | https://github.com/MhLiao/DB/4ac194d0357fd102ac871e37986cb8027ecf094e/decoders/dice_loss.py | DB_ID0706_for_PyTorch/decoders/dice_loss.py | https://arxiv.org/abs/1707.03237 | 论文地址 |
-| 开源代码引入 | https://github.com/MhLiao/DB/4ac194d0357fd102ac871e37986cb8027ecf094e/concern/__init__.py | DB_ID0706_for_PyTorch/concern/visualizer.py | wanzhaoyi@megvii.com | 邮箱地址 |
-| 开源代码引入 | https://github.com/MhLiao/DB/4ac194d0357fd102ac871e37986cb8027ecf094e/concern/__init__.py | DB_ID0706_for_PyTorch/concern/__init__.py | wanzhaoyi@megvii.com | 邮箱地址 |
+| 文件位置 | 公网地址 | 公网地址用途 |
+|------------------------------------------------------------------------------------------|------------------------------------------------------------|--------------|
+| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/DB_ID0706_for_PyTorch/backbones/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/DB_ID0706_for_PyTorch/backbones/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/DB_ID0706_for_PyTorch/backbones/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/DB_ID0706_for_PyTorch/backbones/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/DB_ID0706_for_PyTorch/backbones/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/DB_ID0706_for_PyTorch/train.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 |
+| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/DB_ID0706_for_PyTorch/url.ini | http://download.osgeo.org/geos/geos-3.8.1.tar.bz2 | 模型相关说明 |
\ No newline at end of file
diff --git a/PyTorch/built-in/cv/detection/DBpp_ID4145_for_PyTorch/public_address_statement.md b/PyTorch/built-in/cv/detection/DBpp_ID4145_for_PyTorch/public_address_statement.md
index 5b28caa9d2c60afbc9fcef752370c1bd27d77cb9..66f48cd9298a759216f65a304115d3e3b21f74f4 100644
--- a/PyTorch/built-in/cv/detection/DBpp_ID4145_for_PyTorch/public_address_statement.md
+++ b/PyTorch/built-in/cv/detection/DBpp_ID4145_for_PyTorch/public_address_statement.md
@@ -1,122 +1,12 @@
-| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 |
-| ---- |------------------------------------------------------------------------------------------------------------------------|-------------------------------------------| ------------------------------------ |--------|
-| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/CITATION.cff | DBpp_ID4145_for_PyTorch/CITATION.cff | https://github.com/open-mmlab/mmocr | 源码地址 |
-| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/configs/textdet/dbnetpp/metafile.yml | DBpp_ID4145_for_PyTorch/configs/textdet/dbnetpp/metafile.yml | https://download.openmmlab.com/mmocr/textdet/dbnet/dbnetpp_r50dcnv2_fpnc_1200e_icdar2015-20220502-d7a76fff.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/core/visualize.py | DBpp_ID4145_for_PyTorch/mmocr/core/visualize.py | https://download.openmmlab.com/mmocr/data/font.TTF | 下载字体文件 |
-| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/utils/ocr.py | DBpp_ID4145_for_PyTorch/mmocr/utils/ocr.py | https://download.openmmlab.com/mmocr/textdet/ | 下载权重文件 |
-| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/utils/ocr.py | DBpp_ID4145_for_PyTorch/mmocr/utils/ocr.py | https://download.openmmlab.com/mmocr/ | 下载权重文件 |
-| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/requirements/docs.txt | DBpp_ID4145_for_PyTorch/requirements/docs.txt | https://github.com/open-mmlab/pytorch_sphinx_theme.git#egg=pytorch_sphinx_theme | 下载第三方包 |
-| 开源代码引入 |
https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/tools/deployment/deploy_test.py | DBpp_ID4145_for_PyTorch/tools/deployment/deploy_test.py | https://github.com/open-mmlab/mmdeploy | 第三方包源码 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/tools/deployment/onnx2tensorrt.py | DBpp_ID4145_for_PyTorch/tools/deployment/onnx2tensorrt.py | https://github.com/open-mmlab/mmdeploy | 第三方包源码 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/tools/deployment/pytorch2onnx.py | DBpp_ID4145_for_PyTorch/tools/deployment/pytorch2onnx.py | https://github.com/open-mmlab/mmdeploy | 第三方包源码 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textdet/necks/fpn_unet.py | DBpp_ID4145_for_PyTorch/mmocr/models/textdet/detectors/textsnake.py | https://arxiv.org/abs/1807.01544 | 论文地址 | -| 开发引入 | / | DBpp_ID4145_for_PyTorch/mmocr/models/textdet/postprocess/utils.py | https://github.com/GXYM/DRRG | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/apis/train.py | DBpp_ID4145_for_PyTorch/mmocr/apis/train.py | https://github.com/open-mmlab/mmdetection/issues/6339 | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textdet/postprocess/db_postprocessor.py | DBpp_ID4145_for_PyTorch/mmocr/models/textdet/postprocess/db_postprocessor.py | https://github.com/MhLiao/DB | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textdet/postprocess/pse_postprocessor.py | DBpp_ID4145_for_PyTorch/mmocr/models/textdet/losses/pse_loss.py | https://github.com/whai362/PSENet | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/kie/losses/sdmgr_loss.py | DBpp_ID4145_for_PyTorch/mmocr/models/kie/extractors/sdmgr.py | https://arxiv.org/abs/2103.14470 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textdet/dense_heads/textsnake_head.py | DBpp_ID4145_for_PyTorch/mmocr/models/textdet/dense_heads/fce_head.py | https://github.com/open-mmlab/mmocr/pull/640 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textrecog/backbones/very_deep_vgg.py | DBpp_ID4145_for_PyTorch/mmocr/models/textrecog/backbones/very_deep_vgg.py | https://arxiv.org/pdf/1409.1556.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textdet/necks/fpn_unet.py | DBpp_ID4145_for_PyTorch/mmocr/models/textdet/dense_heads/textsnake_head.py | https://arxiv.org/abs/1807.01544 | 论文地址 | -| 开发引入 | / | DBpp_ID4145_for_PyTorch/mmcv_need/setup.py | https://github.com/pytorch/pytorch/pull/45956 | 源码实现 | -| 开发引入 | / | DBpp_ID4145_for_PyTorch/mmocr/datasets/pipelines/textdet_targets/drrg_targets.py | https://github.com/GXYM/DRRG | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textdet/postprocess/textsnake_postprocessor.py | DBpp_ID4145_for_PyTorch/mmocr/datasets/pipelines/textdet_targets/textsnake_targets.py | https://github.com/princewang1994/TextSnake.pytorch | 源码实现 | -| 开发引入 | / | DBpp_ID4145_for_PyTorch/mmocr/datasets/utils/backend.py | https://mmocr.readthedocs.io/en/latest/tools.html# | 相关说明 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textdet/dense_heads/textsnake_head.py | DBpp_ID4145_for_PyTorch/mmocr/models/textdet/dense_heads/db_head.py | https://github.com/open-mmlab/mmocr/pull/640 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textdet/postprocess/pan_postprocessor.py | DBpp_ID4145_for_PyTorch/mmocr/datasets/pipelines/textdet_targets/panet_targets.py | https://github.com/WenmuZhou/PAN.pytorch | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textdet/postprocess/textsnake_postprocessor.py | DBpp_ID4145_for_PyTorch/mmocr/models/textdet/losses/textsnake_loss.py | https://github.com/princewang1994/TextSnake.pytorch | 源码实现 | -| 开发引入 | / | DBpp_ID4145_for_PyTorch/mmocr/models/textdet/losses/drrg_loss.py | https://github.com/GXYM/DRRG | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textrecog/recognizer/sar.py | DBpp_ID4145_for_PyTorch/mmocr/models/textrecog/decoders/sar_decoder.py | https://arxiv.org/abs/1811.00751 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textrecog/layers/satrn_layers.py | DBpp_ID4145_for_PyTorch/mmocr/models/textrecog/layers/satrn_layers.py | https://github.com/Media-Smart/vedastr | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textrecog/encoders/transformer.py | DBpp_ID4145_for_PyTorch/mmocr/models/textrecog/backbones/resnet_abi.py | https://github.com/FangShancheng/ABINet | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/ner/utils/bert.py | DBpp_ID4145_for_PyTorch/mmocr/models/ner/utils/bert.py | https://github.com/lonePatient/BERT-NER-Pytorch | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/docs/zh_cn/conf.py | DBpp_ID4145_for_PyTorch/mmcv_need/setup.py | https://github.com/open-mmlab/mmcv | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textdet/postprocess/db_postprocessor.py | DBpp_ID4145_for_PyTorch/mmocr/models/textdet/dense_heads/db_head.py | https://github.com/MhLiao/DB | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/ner/utils/bert.py | DBpp_ID4145_for_PyTorch/mmocr/models/ner/utils/activations.py | https://github.com/lonePatient/BERT-NER-Pytorch | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textdet/postprocess/pse_postprocessor.py | DBpp_ID4145_for_PyTorch/mmocr/datasets/pipelines/textdet_targets/psenet_targets.py | https://github.com/whai362/PSENet | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textrecog/recognizer/satrn.py | DBpp_ID4145_for_PyTorch/mmocr/models/textrecog/encoders/satrn_encoder.py | https://arxiv.org/abs/1910.04396 | 论文地址 | -| 开发引入 | / | DBpp_ID4145_for_PyTorch/mmocr/core/evaluation/ocr_metric.py | https://rrc.cvc.uab.es/?ch=14&com=tasks | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textdet/necks/fpn_unet.py | DBpp_ID4145_for_PyTorch/mmocr/models/textdet/modules/local_graph.py | 
https://arxiv.org/abs/2003.07493 | 论文地址 | -| 开发引入 | / | DBpp_ID4145_for_PyTorch/mmcv_need/setup.py | https://github.com/open-mmlab/mmcv/pull/1463 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/utils/ocr.py | DBpp_ID4145_for_PyTorch/mmocr/utils/ocr.py | https://github.com/UB-Mannheim/tesseract/wiki | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textrecog/recognizer/abinet.py | DBpp_ID4145_for_PyTorch/mmocr/models/textrecog/recognizer/abinet.py | https://arxiv.org/pdf/2103.06495.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textdet/postprocess/textsnake_postprocessor.py | DBpp_ID4145_for_PyTorch/mmocr/models/textdet/postprocess/textsnake_postprocessor.py | https://github.com/princewang1994/TextSnake.pytorch | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textrecog/decoders/sequence_attention_decoder.py | DBpp_ID4145_for_PyTorch/mmocr/models/textrecog/decoders/robust_scanner_decoder.py | https://arxiv.org/abs/2007.07542 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textdet/necks/fpn_unet.py | DBpp_ID4145_for_PyTorch/mmocr/models/textdet/necks/fpn_unet.py | https://arxiv.org/abs/2003.07493 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textrecog/recognizer/satrn.py | DBpp_ID4145_for_PyTorch/mmocr/models/textrecog/fusers/abi_fuser.py | https://arxiv.org/abs/1910.04396 | 论文地址 | -| 开发引入 | / | DBpp_ID4145_for_PyTorch/mmocr/models/textdet/modules/proposal_local_graph.py | https://github.com/GXYM/DRRG | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textdet/necks/fpn_unet.py | DBpp_ID4145_for_PyTorch/mmocr/models/textdet/dense_heads/drrg_head.py | https://arxiv.org/abs/2003.07493 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textrecog/backbones/shallow_cnn.py | DBpp_ID4145_for_PyTorch/mmocr/models/textrecog/backbones/shallow_cnn.py | https://arxiv.org/pdf/1910.04396.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/configs/textdet/psenet/README.md | DBpp_ID4145_for_PyTorch/mmocr/datasets/pipelines/textdet_targets/psenet_targets.py | https://arxiv.org/abs/1903.12473 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textrecog/decoders/sequence_attention_decoder.py | DBpp_ID4145_for_PyTorch/mmocr/models/textrecog/decoders/sequence_attention_decoder.py | https://arxiv.org/abs/2007.07542 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textdet/necks/fpn_unet.py | DBpp_ID4145_for_PyTorch/mmocr/models/textdet/detectors/drrg.py | https://arxiv.org/abs/2003.07493 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textrecog/recognizer/robust_scanner.py | DBpp_ID4145_for_PyTorch/mmocr/models/textrecog/recognizer/robust_scanner.py | https://arxiv.org/pdf/2007.07542.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textdet/detectors/dbnet.py | 
DBpp_ID4145_for_PyTorch/mmocr/models/textdet/detectors/dbnet.py | https://arxiv.org/abs/1911.08947 | 论文地址 | -| 开发引入 | / | DBpp_ID4145_for_PyTorch/mmocr/models/ner/utils/bert.py | https://github.com/lonePatient/BERT-NER- | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textdet/losses/pan_loss.py | DBpp_ID4145_for_PyTorch/mmocr/datasets/pipelines/textdet_targets/panet_targets.py | https://arxiv.org/abs/1908.05900 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/ner/utils/activations.py | DBpp_ID4145_for_PyTorch/mmocr/models/ner/utils/activations.py | https://arxiv.org/abs/1606.08415 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/utils/box_util.py | DBpp_ID4145_for_PyTorch/mmocr/utils/box_util.py | https://github.com/faustomorales/keras-ocr/issues/22 | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textrecog/decoders/sequence_attention_decoder.py | DBpp_ID4145_for_PyTorch/mmocr/models/textrecog/decoders/position_attention_decoder.py | https://arxiv.org/abs/2007.07542 | 论文地址 | -| 开发引入 | / | DBpp_ID4145_for_PyTorch/mmocr/models/textdet/modules/utils.py | https://github.com/GXYM/DRRG | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textdet/postprocess/pan_postprocessor.py | DBpp_ID4145_for_PyTorch/mmocr/models/textdet/losses/pan_loss.py | https://github.com/WenmuZhou/PAN.pytorch | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textdet/losses/pse_loss.py | DBpp_ID4145_for_PyTorch/mmocr/models/textdet/detectors/psenet.py | https://arxiv.org/abs/1806.02559 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textrecog/encoders/transformer.py | DBpp_ID4145_for_PyTorch/mmocr/models/textrecog/encoders/transformer.py | https://github.com/FangShancheng/ABINet | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textrecog/recognizer/sar.py | DBpp_ID4145_for_PyTorch/mmocr/models/textrecog/recognizer/sar.py | https://arxiv.org/abs/1811.00751 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textdet/postprocess/db_postprocessor.py | DBpp_ID4145_for_PyTorch/mmocr/models/textdet/losses/db_loss.py | https://github.com/MhLiao/DB | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textrecog/preprocessor/tps_preprocessor.py | DBpp_ID4145_for_PyTorch/mmocr/models/textrecog/preprocessor/tps_preprocessor.py | https://arxiv.org/pdf/1603.03915.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textdet/necks/fpn_cat.py | DBpp_ID4145_for_PyTorch/mmocr/models/textdet/necks/fpn_cat.py | https://github.com/WenmuZhou/DBNet.pytorch | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/common/backbones/unet.py | DBpp_ID4145_for_PyTorch/mmocr/models/common/backbones/unet.py | https://arxiv.org/pdf/1505.04597.pdf | 论文地址 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textrecog/backbones/resnet31_ocr.py | DBpp_ID4145_for_PyTorch/mmocr/models/textrecog/backbones/resnet_abi.py | https://arxiv.org/pdf/1512.03385.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textdet/postprocess/db_postprocessor.py | DBpp_ID4145_for_PyTorch/mmocr/datasets/pipelines/textdet_targets/dbnet_targets.py | https://github.com/MhLiao/DB | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textrecog/backbones/resnet31_ocr.py | DBpp_ID4145_for_PyTorch/mmocr/models/textrecog/backbones/resnet31_ocr.py | https://arxiv.org/pdf/1512.03385.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textdet/necks/fpn_unet.py | DBpp_ID4145_for_PyTorch/mmocr/models/textdet/necks/fpn_unet.py | https://arxiv.org/abs/1807.01544 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textrecog/encoders/transformer.py | DBpp_ID4145_for_PyTorch/mmocr/datasets/pipelines/transforms.py | https://github.com/FangShancheng/ABINet | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textdet/postprocess/pan_postprocessor.py | DBpp_ID4145_for_PyTorch/mmocr/models/textdet/postprocess/pan_postprocessor.py | https://github.com/WenmuZhou/PAN.pytorch | 源码实现 | -| 开发引入 | / | DBpp_ID4145_for_PyTorch/mmocr/datasets/pipelines/textdet_targets/base_textdet_targets.py | https://en.wikipedia.org/wiki/Green%27s_theorem | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textrecog/recognizer/satrn.py | DBpp_ID4145_for_PyTorch/mmocr/models/textrecog/layers/satrn_layers.py | https://arxiv.org/abs/1910.04396 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textdet/necks/fpn_unet.py | DBpp_ID4145_for_PyTorch/mmocr/models/textdet/losses/textsnake_loss.py | https://arxiv.org/abs/1807.01544 | 论文地址 | -| 开发引入 | / | DBpp_ID4145_for_PyTorch/mmocr/utils/setup_env.py | https://github.com/pytorch/pytorch/blob/master/torch/distributed/run.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textdet/losses/fce_loss.py | DBpp_ID4145_for_PyTorch/mmocr/models/textdet/dense_heads/fce_head.py | https://arxiv.org/abs/2104.10442 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textdet/dense_heads/textsnake_head.py | DBpp_ID4145_for_PyTorch/mmocr/models/textdet/dense_heads/textsnake_head.py | https://github.com/open-mmlab/mmocr/pull/640 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textdet/necks/fpn_unet.py | DBpp_ID4145_for_PyTorch/mmocr/datasets/pipelines/textdet_targets/drrg_targets.py | https://arxiv.org/abs/2003.07493 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textrecog/decoders/abinet_vision_decoder.py | DBpp_ID4145_for_PyTorch/mmocr/models/textrecog/decoders/abinet_language_decoder.py | https://arxiv.org/pdf/2103.06495 | 论文地址 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textdet/postprocess/pan_postprocessor.py | DBpp_ID4145_for_PyTorch/mmocr/models/textdet/necks/fpem_ffm.py | https://github.com/WenmuZhou/PAN.pytorch | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textrecog/preprocessor/tps_preprocessor.py | DBpp_ID4145_for_PyTorch/mmocr/models/textrecog/preprocessor/tps_preprocessor.py | https://github.com/clovaai/deep-text-recognition-benchmark | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textdet/postprocess/db_postprocessor.py | DBpp_ID4145_for_PyTorch/mmocr/models/textdet/necks/fpn_cat.py | https://github.com/MhLiao/DB | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textrecog/recognizer/master.py | DBpp_ID4145_for_PyTorch/mmocr/models/textrecog/recognizer/master.py | https://arxiv.org/abs/1910.02562 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textrecog/recognizer/satrn.py | DBpp_ID4145_for_PyTorch/mmocr/models/textrecog/recognizer/satrn.py | https://arxiv.org/abs/1910.04396 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textdet/losses/fce_loss.py | DBpp_ID4145_for_PyTorch/mmocr/datasets/pipelines/textdet_targets/fcenet_targets.py | https://arxiv.org/abs/2104.10442 | 论文地址 | -| 开发引入 | / | DBpp_ID4145_for_PyTorch/mmocr/models/textdet/modules/local_graph.py | https://github.com/GXYM/DRRG | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textdet/necks/fpn_unet.py | DBpp_ID4145_for_PyTorch/mmocr/datasets/pipelines/textdet_targets/textsnake_targets.py | https://arxiv.org/abs/1807.01544 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textdet/detectors/dbnet.py | DBpp_ID4145_for_PyTorch/mmocr/datasets/pipelines/textdet_targets/dbnet_targets.py | https://arxiv.org/abs/1911.08947 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textdet/postprocess/pse_postprocessor.py | DBpp_ID4145_for_PyTorch/mmocr/models/textdet/postprocess/pse_postprocessor.py | https://github.com/whai362/PSENet | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textdet/necks/fpn_unet.py | DBpp_ID4145_for_PyTorch/mmocr/models/textdet/modules/proposal_local_graph.py | https://arxiv.org/abs/2003.07493 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textrecog/recognizer/sar.py | DBpp_ID4145_for_PyTorch/mmocr/models/textrecog/encoders/sar_encoder.py | https://arxiv.org/abs/1811.00751 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textdet/losses/fce_loss.py | DBpp_ID4145_for_PyTorch/mmocr/models/textdet/detectors/fcenet.py | https://arxiv.org/abs/2104.10442 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/utils/box_util.py | DBpp_ID4145_for_PyTorch/mmocr/utils/box_util.py | https://github.com/novioleo/Savior/blob/master/Utils/GeometryUtils.py | 源码实现 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textdet/losses/fce_loss.py | DBpp_ID4145_for_PyTorch/mmocr/models/textdet/losses/fce_loss.py | https://arxiv.org/abs/2104.10442 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/utils/model.py | DBpp_ID4145_for_PyTorch/mmocr/utils/model.py | https://github.com/pytorch/pytorch/issues/41081#issuecomment-783961547 | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textdet/losses/pan_loss.py | DBpp_ID4145_for_PyTorch/mmocr/models/textdet/detectors/panet.py | https://arxiv.org/abs/1908.05900 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/common/modules/transformer_module.py | DBpp_ID4145_for_PyTorch/mmocr/models/common/modules/transformer_module.py | https://github.com/jadore801120/attention-is-all-you-need-pytorch | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/kie/losses/sdmgr_loss.py | DBpp_ID4145_for_PyTorch/mmocr/models/kie/losses/sdmgr_loss.py | https://arxiv.org/abs/2103.14470 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textdet/dense_heads/textsnake_head.py | DBpp_ID4145_for_PyTorch/mmocr/models/textdet/dense_heads/drrg_head.py | https://github.com/open-mmlab/mmocr/pull/640 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textrecog/recognizer/nrtr.py | DBpp_ID4145_for_PyTorch/mmocr/models/textrecog/recognizer/nrtr.py | https://arxiv.org/pdf/1806.00926.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/ner/utils/bert.py | DBpp_ID4145_for_PyTorch/mmocr/models/ner/utils/bert.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/utils/ocr.py | DBpp_ID4145_for_PyTorch/mmocr/utils/ocr.py | https://github.com/sirfz/tesserocr | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textrecog/recognizer/sar.py | DBpp_ID4145_for_PyTorch/mmocr/models/textrecog/losses/ce_loss.py | https://arxiv.org/abs/1811.00751 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textdet/dense_heads/textsnake_head.py | DBpp_ID4145_for_PyTorch/mmocr/models/textdet/dense_heads/pan_head.py | https://github.com/open-mmlab/mmocr/pull/640 | 源码实现 | -| 开发引入 | / | DBpp_ID4145_for_PyTorch/mmocr/apis/train.py | https://www.github.com/nvidia/apex | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/datasets/utils/parser.py | DBpp_ID4145_for_PyTorch/mmocr/datasets/utils/parser.py | https://mmocr.readthedocs.io/en/latest/ | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textdet/losses/pse_loss.py | DBpp_ID4145_for_PyTorch/mmocr/models/textdet/losses/pse_loss.py | https://arxiv.org/abs/1806.02559 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textrecog/recognizer/satrn.py | DBpp_ID4145_for_PyTorch/mmocr/models/textrecog/encoders/abinet_vision_model.py | 
https://arxiv.org/abs/1910.04396 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textdet/losses/pan_loss.py | DBpp_ID4145_for_PyTorch/mmocr/models/textdet/losses/pan_loss.py | https://arxiv.org/abs/1908.05900 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textrecog/decoders/master_decoder.py | DBpp_ID4145_for_PyTorch/mmocr/models/textrecog/decoders/master_decoder.py | https://github.com/wenwenyu/MASTER-pytorch | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textrecog/decoders/abinet_vision_decoder.py | DBpp_ID4145_for_PyTorch/mmocr/models/textrecog/decoders/abinet_vision_decoder.py | https://arxiv.org/pdf/2103.06495 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/datasets/pipelines/dbnet_transforms.py | DBpp_ID4145_for_PyTorch/mmocr/datasets/pipelines/dbnet_transforms.py | https://github.com/aleju/imgaug | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textdet/postprocess/db_postprocessor.py | DBpp_ID4145_for_PyTorch/mmocr/datasets/pipelines/textdet_targets/base_textdet_targets.py | https://github.com/MhLiao/DB | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/setup.py | DBpp_ID4145_for_PyTorch/mmcv_need/setup.py | openmmlab@gmail.com | 邮箱地址 | -| 开发引入 | / | DBpp_ID4145_for_PyTorch/mmocr/models/textdet/modules/gcn.py | https://github.com/Zhongdao/gcn_clustering | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textdet/losses/pan_loss.py | DBpp_ID4145_for_PyTorch/mmocr/models/textdet/losses/drrg_loss.py | https://arxiv.org/abs/1908.05900 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/setup.py | DBpp_ID4145_for_PyTorch/mmcv_need/setup.py | http://setuptools.readthedocs.io/en/latest/setuptools.html#declaring-platform-specific-dependencies | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmocr/blob/26bc4713d4a451ed510a67be0a4fdd9903fd9011/mmocr/models/textrecog/recognizer/master.py | DBpp_ID4145_for_PyTorch/mmocr/models/textrecog/decoders/master_decoder.py | https://arxiv.org/abs/1910.02562 | 论文地址 | -| 开发引入 | / | DBpp_ID4145_for_PyTorch/requirements/docs.txt | https://github.com/open-mmlab/pytorch_sphinx_theme.git#egg=pytorch_sphinx_theme | 相关依赖 | +| 文件位置 | 公网地址 | 公网地址用途 | +|-------------------------------------------------------------------------------------------------------------|----------------------------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/DBpp_ID4145_for_PyTorch/configs/textdet/dbnetpp/metafile.yml | https://download.openmmlab.com/mmocr/textdet/dbnet/dbnetpp_r50dcnv2_fpnc_1200e_icdar2015-20220502-d7a76fff.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/DBpp_ID4145_for_PyTorch/configs/textdet/dbnetpp/metafile.yml | https://arxiv.org/abs/2202.10304 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/DBpp_ID4145_for_PyTorch/mmcv_need/setup.py | openmmlab@gmail.com | 作者邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/DBpp_ID4145_for_PyTorch/mmocr/core/visualize.py | http://files.grouplens.org/datasets/movielens/ml-100k/u.data | 数据集链接 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/detection/DBpp_ID4145_for_PyTorch/mmocr/datasets/utils/backend.py | https://mmocr.readthedocs.io/en/latest/tools.html | 相关说明 |
+| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/DBpp_ID4145_for_PyTorch/mmocr/datasets/utils/parser.py | https://mmocr.readthedocs.io/en/latest/ | 相关文档 |
+| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/DBpp_ID4145_for_PyTorch/mmocr/utils/ocr.py | https://download.openmmlab.com/mmocr/ | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/DBpp_ID4145_for_PyTorch/mmocr/utils/ocr.py | https://download.openmmlab.com/mmocr/ | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/DBpp_ID4145_for_PyTorch/mmocr/utils/ocr.py | https://download.openmmlab.com/mmocr/textdet/ | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/DBpp_ID4145_for_PyTorch/url.ini | http://download.openmmlab.com/mmcv/dist/npu/torch1.8.0/index.html | 三方库链接 |
\ No newline at end of file
diff --git a/PyTorch/built-in/cv/detection/Faster_Mask_RCNN_for_PyTorch/public_address_statement.md b/PyTorch/built-in/cv/detection/Faster_Mask_RCNN_for_PyTorch/public_address_statement.md
index ce433390c2d16662ccc337e0a08d41489922b5c5..67232aae6efe9040a44e19c22ee07641e60c972e 100644
--- a/PyTorch/built-in/cv/detection/Faster_Mask_RCNN_for_PyTorch/public_address_statement.md
+++ b/PyTorch/built-in/cv/detection/Faster_Mask_RCNN_for_PyTorch/public_address_statement.md
@@ -1,109 +1,14 @@
-| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 |
-| ---- |---------------------------------------------------------------------------------------------------------------------------------|---------------------------------------------------------------|-------------------------------------------|----------|
-| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/datasets/prepare_for_tests.sh | Faster_Mask_RCNN_for_PyTorch/datasets/prepare_for_tests.sh | https://dl.fbaipublicfiles.com/detectron2 | 下载测试数据 |
-| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/datasets/prepare_panoptic_fpn.py | Faster_Mask_RCNN_for_PyTorch/datasets/prepare_panoptic_fpn.py | https://dl.fbaipublicfiles.com/detectron2/ | 下载测试数据 |
-| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/checkpoint/catalog.py | Faster_Mask_RCNN_for_PyTorch/detectron2/checkpoint/catalog.py | https://dl.fbaipublicfiles.com/detectron | 下载权重文件 |
-| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/checkpoint/catalog.py | Faster_Mask_RCNN_for_PyTorch/detectron2/checkpoint/catalog.py | https://dl.fbaipublicfiles.com/detectron2/ | 下载权重文件 |
-| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/model_zoo/model_zoo.py | Faster_Mask_RCNN_for_PyTorch/detectron2/model_zoo/model_zoo.py | https://dl.fbaipublicfiles.com/detectron2/ | 下载权重文件 |
-| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/dev/packaging/build_wheel.sh | Faster_Mask_RCNN_for_PyTorch/dev/packaging/build_wheel.sh | https://download.pytorch.org/whl/ | 下载第三方包 |
-| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/dev/packaging/gen_install_table.py | Faster_Mask_RCNN_for_PyTorch/dev/packaging/gen_install_table.py | https://dl.fbaipublicfiles.com/detectron2/wheels/ | 下载第三方包 |
-| 开源代码引入 |
https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/docker/Dockerfile | Faster_Mask_RCNN_for_PyTorch/docker/Dockerfile | https://bootstrap.pypa.io/get-pip.py | 下载第三方包 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/docker/Dockerfile | Faster_Mask_RCNN_for_PyTorch/docker/Dockerfile | https://download.pytorch.org/whl/cu101/torch_stable.html | 下载第三方包 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/docker/Dockerfile | Faster_Mask_RCNN_for_PyTorch/docker/Dockerfile | https://github.com/facebookresearch/fvcore | 下载第三方包 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/docker/Dockerfile | Faster_Mask_RCNN_for_PyTorch/docker/Dockerfile | https://github.com/facebookresearch/detectron2 | 下载第三方包 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/docker/Dockerfile-circleci | Faster_Mask_RCNN_for_PyTorch/docker/Dockerfile-circleci | https://bootstrap.pypa.io/get-pip.py | 下载第三方包 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/docker/Dockerfile-circleci | Faster_Mask_RCNN_for_PyTorch/docker/Dockerfile-circleci | https://download.pytorch.org/whl/cu101/torch_stable.html | 下载第三方包 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/docs/conf.py | Faster_Mask_RCNN_for_PyTorch/docs/conf.py | https://github.com/facebookresearch/detectron2/blob/master/ | 源码地址 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/docs/conf.py | Faster_Mask_RCNN_for_PyTorch/docs/conf.py | https://docs.python.org/3.6 | 第三方包说明文档 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/docs/conf.py | Faster_Mask_RCNN_for_PyTorch/docs/conf.py | https://docs.scipy.org/doc/numpy/ | 第三方包说明文档 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/docs/conf.py | Faster_Mask_RCNN_for_PyTorch/docs/conf.py | https://pytorch.org/docs/master/ | 第三方包说明文档 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/docs/conf.py | Faster_Mask_RCNN_for_PyTorch/docs/conf.py | https://arxiv.org/abs/ | 论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/setup.py | Faster_Mask_RCNN_for_PyTorch/setup.py | https://github.com/facebookresearch/detectron2 | 源码地址 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/setup.py | Faster_Mask_RCNN_for_PyTorch/setup.py | https://github.com/psf/black@673327449f86fce558adde153bb6cbe54bfebad2 | 源码地址 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/tests/data/test_coco_evaluation.py | Faster_Mask_RCNN_for_PyTorch/tests/data/test_coco_evaluation.py | http://images.cocodataset.org/val2017/000000000285.jpg | 数据地址 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/tests/data/test_coco_evaluation.py | Faster_Mask_RCNN_for_PyTorch/tests/data/test_coco_evaluation.py | http://farm8.staticflickr.com/7434/9138147604_c6225224b8_z.jpg | 标注数据地址 | -| 开源代码引入 | 
https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/tests/data/test_coco_evaluation.py | Faster_Mask_RCNN_for_PyTorch/tests/data/test_coco_evaluation.py | http://images.cocodataset.org/val2017/000000000139.jpg | 数据地址 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/tests/data/test_coco_evaluation.py | Faster_Mask_RCNN_for_PyTorch/tests/data/test_coco_evaluation.py | http://farm9.staticflickr.com/8035/8024364858_9c41dc1666_z.jpg | 标注数据地址 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/tests/test_model_zoo.py | Faster_Mask_RCNN_for_PyTorch/tests/test_model_zoo.py | https://dl.fbaipublicfiles.com/detectron2/Misc/scratch_mask_rcnn_R_50_FPN_3x_gn/138602908/model_final_01ca85.pkl | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/tools/convert-torchvision-to-d2.py | Faster_Mask_RCNN_for_PyTorch/tools/convert-torchvision-to-d2.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/utils/env.py | Faster_Mask_RCNN_for_PyTorch/detectron2/utils/env.py | https://stackoverflow.com/questions/67631/how-to-import-a-module-given-the-full-path | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/export/torchscript.py | Faster_Mask_RCNN_for_PyTorch/detectron2/export/torchscript.py | https://docs.python.org/3/library/importlib.html#importing-a-source-file-directly | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/data/datasets/coco.py | Faster_Mask_RCNN_for_PyTorch/detectron2/data/datasets/coco.py | https://github.com/facebookresearch/detectron2/pull/175#issuecomment-551202163 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/utils/visualizer.py | Faster_Mask_RCNN_for_PyTorch/detectron2/utils/visualizer.py | https://github.com/matplotlib/matplotlib/issues/15363 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/layers/wrappers.py | Faster_Mask_RCNN_for_PyTorch/detectron2/layers/wrappers.py | https://github.com/pytorch/pytorch/issues/38718 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/projects/DensePose/densepose/data/datasets/coco.py | Faster_Mask_RCNN_for_PyTorch/detectron2/data/datasets/lvis.py | http://farm6.staticflickr.com/5454/9413846304_881d5e5c3b_z.jpg | 图片地址 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/modeling/proposal_generator/rpn.py | Faster_Mask_RCNN_for_PyTorch/detectron2/modeling/proposal_generator/rpn.py | https://github.com/pytorch/pytorch/pull/41371 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/structures/image_list.py | Faster_Mask_RCNN_for_PyTorch/detectron2/structures/image_list.py | https://github.com/pytorch/pytorch/issues/31734 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/model_zoo/__init__.py | 
Faster_Mask_RCNN_for_PyTorch/detectron2/model_zoo/__init__.py | https://github.com/facebookresearch/detectron2/blob/master/MODEL_ZOO.md | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/tests/modeling/test_matcher.py | Faster_Mask_RCNN_for_PyTorch/detectron2/export/torchscript.py | https://github.com/pytorch/pytorch/issues/38964 | 相关说明 | -| 开发引入 | / | Faster_Mask_RCNN_for_PyTorch/tests/data/test_coco_evaluation.py | http://images.cocodataset.org/val2017/000000000285.jpg","http://farm8.staticflickr.com/7434/9138147604_c6225224b8_z.jpg","http://images.cocodataset.org/val2017/000000000139.jpg","http://farm9.staticflickr.com/8035/8024364858_9c41dc1666_z.jpg | 图片地址 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/layers/batch_norm.py | Faster_Mask_RCNN_for_PyTorch/detectron2/layers/batch_norm.py | https://github.com/pytorch/pytorch/blob/master/torch/nn/modules/batchnorm.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/modeling/proposal_generator/rrpn.py | Faster_Mask_RCNN_for_PyTorch/detectron2/modeling/proposal_generator/rrpn.py | https://github.com/facebookresearch/Detectron/issues/459 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/data/transforms/augmentation_impl.py | Faster_Mask_RCNN_for_PyTorch/detectron2/data/transforms/augmentation_impl.py | https://pillow.readthedocs.io/en/3.0.x/reference/ImageEnhance.html | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/modeling/proposal_generator/rpn.py | Faster_Mask_RCNN_for_PyTorch/detectron2/export/torchscript.py | https://github.com/pytorch/pytorch/issues/41449 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/projects/DensePose/densepose/data/datasets/coco.py | Faster_Mask_RCNN_for_PyTorch/detectron2/data/datasets/coco.py | http://farm6.staticflickr.com/5454/9413846304_881d5e5c3b_z.jpg | 图片地址 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/utils/logger.py | Faster_Mask_RCNN_for_PyTorch/detectron2/utils/logger.py | https://github.com/abseil/abseil-py/blob/master/absl/logging/__init__.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/config/defaults.py | Faster_Mask_RCNN_for_PyTorch/detectron2/config/defaults.py | https://pillow.readthedocs.io/en/stable/handbook/concepts.html#concept-modes | 相关说明 | -| 开发引入 | / | Faster_Mask_RCNN_for_PyTorch/detectron2/evaluation/coco_evaluation.py | http://cocodataset.org/#detection-eval | 数据集地址 | -| 开发引入 | / | Faster_Mask_RCNN_for_PyTorch/detectron2/data/datasets/cityscapes.py | https://github.com/mcordts/cityscapesScripts/blob/master/cityscapesscripts/evaluation/instances2dict.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/utils/serialize.py | Faster_Mask_RCNN_for_PyTorch/detectron2/utils/serialize.py | https://github.com/joblib/joblib/blob/master/joblib/externals/loky/cloudpickle_wrapper.py | 源码实现 | -| 开发引入 | / | Faster_Mask_RCNN_for_PyTorch/detectron2/evaluation/lvis_evaluation.py | 
https://github.com/facebookresearch/Detectron/blob/a6a835f5b8208c45d0dce217ce9bbda915f44df7/detectron/datasets/json_dataset_evaluator.py#L255 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/layers/csrc/deformable/deform_conv_cuda_kernel.cu | Faster_Mask_RCNN_for_PyTorch/detectron2/layers/csrc/deformable/deform_conv_cuda_kernel.cu | https://github.com/chengdazhi/Deformable-Convolution-V2-PyTorch/blob/mmdetection/mmdet/ops/dcn/src/deform_conv_cuda_kernel.cu | 源码实现 | -| 开发引入 | / | Faster_Mask_RCNN_for_PyTorch/detectron2/evaluation/cityscapes_evaluation.py | https://github.com/mcordts/cityscapesScripts/blob/master/cityscapesscripts/evaluation/evalPixelLevelSemanticLabeling.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/layers/csrc/deformable/deform_conv_cuda_kernel.cu | Faster_Mask_RCNN_for_PyTorch/detectron2/layers/csrc/deformable/deform_conv_cuda_kernel.cu | https://arxiv.org/abs/1703.06211 | 论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/data/datasets/coco.py | Faster_Mask_RCNN_for_PyTorch/detectron2/data/datasets/coco.py | https://detectron2.readthedocs.io/tutorials/datasets.html#register-a-dataset | 数据集地址 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/layers/wrappers.py | Faster_Mask_RCNN_for_PyTorch/detectron2/layers/wrappers.py | https://github.com/pytorch/pytorch/issues/12013 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/docs/conf.py | Faster_Mask_RCNN_for_PyTorch/docs/conf.py | https://github.com/readthedocs/recommonmark/blob/ddd56e7717e9745f11300059e4268e204138a6b1/recommonmark/parser.py#L152-L155 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/config/defaults.py | Faster_Mask_RCNN_for_PyTorch/detectron2/config/defaults.py | https://arxiv.org/abs/1811.11168 | 论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/data/datasets/lvis.py | Faster_Mask_RCNN_for_PyTorch/detectron2/data/datasets/lvis.py | http://images.cocodataset.org/train2017/000000155379.jpg | 图片地址 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/tests/modeling/test_matcher.py | Faster_Mask_RCNN_for_PyTorch/tests/modeling/test_matcher.py | https://github.com/pytorch/pytorch/pull/38378 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/layers/batch_norm.py | Faster_Mask_RCNN_for_PyTorch/detectron2/layers/batch_norm.py | https://github.com/pytorch/pytorch/pull/36382 | 源码实现 | -| 开发引入 | / | Faster_Mask_RCNN_for_PyTorch/detectron2/evaluation/cityscapes_evaluation.py | https://github.com/mcordts/cityscapesScripts/blob/master/cityscapesscripts/evaluation/evalInstanceLevelSemanticLabeling.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/layers/csrc/vision.cpp | Faster_Mask_RCNN_for_PyTorch/detectron2/layers/csrc/vision.cpp | https://github.com/pytorch/pytorch/blob/master/aten/src/ATen/cuda/detail/CUDAHooks.cpp#L231 | 源码实现 | -| 开源代码引入 | 
https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/evaluation/sem_seg_evaluation.py | Faster_Mask_RCNN_for_PyTorch/detectron2/evaluation/sem_seg_evaluation.py | http://cocodataset.org/#format-results | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/modeling/poolers.py | Faster_Mask_RCNN_for_PyTorch/detectron2/modeling/poolers.py | https://github.com/pytorch/pytorch/issues/41412 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/layers/csrc/vision.cpp | Faster_Mask_RCNN_for_PyTorch/detectron2/layers/csrc/vision.cpp | https://github.com/pytorch/pytorch/blob/master/aten/src/ATen/Version.cpp | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/structures/boxes.py | Faster_Mask_RCNN_for_PyTorch/detectron2/structures/boxes.py | https://github.com/kuangliu/torchcv/blob/master/torchcv/utils/box.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/tests/modeling/test_matcher.py | Faster_Mask_RCNN_for_PyTorch/tests/modeling/test_matcher.py | https://github.com/pytorch/pytorch/issues/38964 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/data/detection_utils.py | Faster_Mask_RCNN_for_PyTorch/detectron2/data/detection_utils.py | https://github.com/wkentaro/labelme/blob/v4.5.4/labelme/utils/image.py#L59 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/modeling/proposal_generator/rrpn.py | Faster_Mask_RCNN_for_PyTorch/detectron2/modeling/proposal_generator/rrpn.py | https://github.com/pytorch/pytorch/issues/22812 | 相关说明 | -| 开发引入 | / | Faster_Mask_RCNN_for_PyTorch/detectron2/evaluation/coco_evaluation.py | https://github.com/facebookresearch/Detectron/blob/a6a835f5b8208c45d0dce217ce9bbda915f44df7/detectron/datasets/json_dataset_evaluator.py#L255 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/docs/notes/compatibility.md | Faster_Mask_RCNN_for_PyTorch/detectron2/modeling/anchor_generator.py | https://github.com/facebookresearch/Detectron/issues/227 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/data/detection_utils.py | Faster_Mask_RCNN_for_PyTorch/detectron2/data/detection_utils.py | https://en.wikipedia.org/wiki/YUV#SDTV_with_BT.601 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/evaluation/pascal_voc_evaluation.py | Faster_Mask_RCNN_for_PyTorch/detectron2/evaluation/pascal_voc_evaluation.py | https://github.com/rbgirshick/py-faster-rcnn/blob/master/lib/datasets/voc_eval.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/engine/launch.py | Faster_Mask_RCNN_for_PyTorch/detectron2/engine/launch.py | https://github.com/facebookresearch/maskrcnn-benchmark/issues/172 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/data/detection_utils.py | Faster_Mask_RCNN_for_PyTorch/detectron2/data/detection_utils.py | https://github.com/python-pillow/Pillow/issues/3973 | 
相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/docs/tutorials/datasets.md | Faster_Mask_RCNN_for_PyTorch/detectron2/evaluation/coco_evaluation.py | http://cocodataset.org/#keypoints-eval | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/tests/structures/test_boxes.py | Faster_Mask_RCNN_for_PyTorch/tests/structures/test_boxes.py | https://github.com/pytorch/pytorch/pull/39336 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/structures/boxes.py | Faster_Mask_RCNN_for_PyTorch/detectron2/structures/boxes.py | https://github.com/pytorch/pytorch/issues/18627 | 相关说明 | -| 开发引入 | / | Faster_Mask_RCNN_for_PyTorch/detectron2/evaluation/coco_evaluation.py | https://github.com/facebookresearch/Detectron/blob/a6a835f5b8208c45d0dce217ce9bbda915f44df7/detectron/datasets/json_dataset_evaluator.py#L222-L252 | 源码实现 | -| 开发引入 | / | Faster_Mask_RCNN_for_PyTorch/detectron2/engine/defaults.py | https://pytorch.org/docs/stable/distributed.html | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/layers/csrc/deformable/deform_conv_cuda.cu | Faster_Mask_RCNN_for_PyTorch/detectron2/layers/csrc/deformable/deform_conv_cuda.cu | https://github.com/chengdazhi/Deformable-Convolution-V2-PyTorch/blob/mmdetection/mmdet/ops/dcn/src/deform_conv_cuda.c | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/tests/layers/test_roi_align.py | Faster_Mask_RCNN_for_PyTorch/tests/layers/test_roi_align.py | https://github.com/tensorflow/tensorflow/issues/26278 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/layers/wrappers.py | Faster_Mask_RCNN_for_PyTorch/detectron2/layers/wrappers.py | https://github.com/pytorch/pytorch/issues/34202 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/engine/train_loop.py | Faster_Mask_RCNN_for_PyTorch/detectron2/engine/train_loop.py | http://engineering.hearsaysocial.com/2013/06/16/circular-references-in-python/ | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/data/datasets/register_coco.py | Faster_Mask_RCNN_for_PyTorch/detectron2/data/datasets/register_coco.py | http://cocodataset.org/#format-data | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/docs/conf.py | Faster_Mask_RCNN_for_PyTorch/docs/conf.py | http://www.sphinx-doc.org/en/master/config | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/engine/defaults.py | Faster_Mask_RCNN_for_PyTorch/detectron2/engine/defaults.py | https://github.com/sphinx-doc/sphinx/issues/4258 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/export/shared.py | Faster_Mask_RCNN_for_PyTorch/detectron2/export/shared.py | https://www.geeksforgeeks.org/find-paths-given-source-destination/ | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/data/transforms/transform.py | Faster_Mask_RCNN_for_PyTorch/detectron2/data/transforms/transform.py 
| https://github.com/opencv/opencv/issues/11784 | 相关说明 | -| 开发引入 | / | Faster_Mask_RCNN_for_PyTorch/detectron2/layers/aspp.py | https://github.com/tensorflow/models/blob/21b73d22f3ed05b650e85ac50849408dd36de32e/research/deeplab/model.py#L532 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/data/detection_utils.py | Faster_Mask_RCNN_for_PyTorch/detectron2/data/detection_utils.py | https://github.com/python-pillow/Pillow/blob/7.1.2/src/PIL/ImageOps.py#L527 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/structures/masks.py | Faster_Mask_RCNN_for_PyTorch/detectron2/structures/masks.py | https://stackoverflow.com/questions/24467972/calculate-area-of-polygon-given-x-y-coordinates | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/data/datasets/builtin_meta.py | Faster_Mask_RCNN_for_PyTorch/detectron2/data/datasets/builtin_meta.py | https://github.com/cocodataset/panopticapi/blob/master/panoptic_coco_categories.json | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/evaluation/sem_seg_evaluation.py | Faster_Mask_RCNN_for_PyTorch/detectron2/evaluation/sem_seg_evaluation.py | http://cocodataset.org/#stuff-eval | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/layers/csrc/deformable/deform_conv_cuda_kernel.cu | Faster_Mask_RCNN_for_PyTorch/detectron2/layers/csrc/deformable/deform_conv_cuda_kernel.cu | https://github.com/open-mmlab/mmdetection/blob/master/mmdet/ops/dcn/src/deform_conv_cuda_kernel.cu | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/docs/tutorials/datasets.md | Faster_Mask_RCNN_for_PyTorch/detectron2/config/defaults.py | http://cocodataset.org/#keypoints-eval | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/engine/launch.py | Faster_Mask_RCNN_for_PyTorch/detectron2/engine/launch.py | https://github.com/pytorch/pytorch/pull/14391 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/layers/csrc/deformable/deform_conv_cuda.cu | Faster_Mask_RCNN_for_PyTorch/detectron2/layers/csrc/deformable/deform_conv_cuda.cu | https://github.com/open-mmlab/mmdetection/blob/master/mmdet/ops/dcn/src/deform_conv_cuda.cpp | 源码实现 | -| 开发引入 | / | Faster_Mask_RCNN_for_PyTorch/detectron2/engine/train_loop.py | https://arxiv.org/abs/2006.15704 | 论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/data/transforms/transform.py | Faster_Mask_RCNN_for_PyTorch/detectron2/data/transforms/transform.py | https://pillow.readthedocs.io/en/latest/PIL.html#PIL.ImageTransform.ExtentTransform | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/modeling/proposal_generator/rpn.py | Faster_Mask_RCNN_for_PyTorch/detectron2/modeling/proposal_generator/rpn.py | https://github.com/pytorch/pytorch/issues/41449 | 相关说明 | -| 开发引入 | / | Faster_Mask_RCNN_for_PyTorch/detectron2/data/datasets/cityscapes.py | https://github.com/mcordts/cityscapesScripts/blob/master/cityscapesscripts/preparation/json2instanceImg.py | 源码实现 | 
-| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/datasets/prepare_cocofied_lvis.py | Faster_Mask_RCNN_for_PyTorch/datasets/prepare_cocofied_lvis.py | https://github.com/lvis-dataset/lvis-api/blob/master/data/coco_to_synset.json | 源码实现 | -| 开发引入 | / | Faster_Mask_RCNN_for_PyTorch/docker/Dockerfile | http://images.cocodataset.org/val2017/000000439715.jpg | 图片地址 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/data/datasets/register_coco.py | Faster_Mask_RCNN_for_PyTorch/detectron2/data/datasets/coco.py | http://cocodataset.org/#format-data | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/dev/packaging/build_wheel.sh | Faster_Mask_RCNN_for_PyTorch/dev/packaging/build_wheel.sh | https://github.com/NVIDIA/nvidia-docker/issues/854 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/structures/image_list.py | Faster_Mask_RCNN_for_PyTorch/detectron2/structures/image_list.py | https://github.com/pytorch/pytorch/issues/39308 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/data/transforms/transform.py | Faster_Mask_RCNN_for_PyTorch/detectron2/data/transforms/transform.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/data/detection_utils.py | Faster_Mask_RCNN_for_PyTorch/detectron2/data/detection_utils.py | https://www.exiv2.org/tags.html | 相关说明 | +| 文件位置 | 公网地址 | 公网地址用途 | +|----------------------------------------------------------------------------------------------------------------|---------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/Faster_Mask_RCNN_for_PyTorch/datasets/prepare_for_tests.sh | https://dl.fbaipublicfiles.com/detectron2 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/Faster_Mask_RCNN_for_PyTorch/datasets/prepare_panoptic_fpn.py | https://dl.fbaipublicfiles.com/detectron2/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/Faster_Mask_RCNN_for_PyTorch/detectron2/data/datasets/coco.py | https://detectron2.readthedocs.io/tutorials/datasets.html | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/Faster_Mask_RCNN_for_PyTorch/detectron2/engine/defaults.py | https://pytorch.org/docs/stable/distributed.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/Faster_Mask_RCNN_for_PyTorch/detectron2/model_zoo/model_zoo.py | https://dl.fbaipublicfiles.com/detectron2/ | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/Faster_Mask_RCNN_for_PyTorch/dev/packaging/build_wheel.sh | https://download.pytorch.org/whl/"$CU_VERSION"/torch_stable.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/Faster_Mask_RCNN_for_PyTorch/dev/packaging/gen_install_table.py | https://dl.fbaipublicfiles.com/detectron2/wheels/{cuda}/torch{torch}/index.html | 下载依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/Faster_Mask_RCNN_for_PyTorch/docker/Dockerfile | https://bootstrap.pypa.io/get-pip.py | 三方库连接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/Faster_Mask_RCNN_for_PyTorch/docker/Dockerfile | https://download.pytorch.org/whl/cu101/torch_stable.html | 三方库连接 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/detection/Faster_Mask_RCNN_for_PyTorch/docker/Dockerfile-circleci | https://bootstrap.pypa.io/get-pip.py | 三方库连接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/Faster_Mask_RCNN_for_PyTorch/docker/Dockerfile-circleci | https://download.pytorch.org/whl/cu101/torch_stable.html | 三方库连接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/Faster_Mask_RCNN_for_PyTorch/tools/convert-torchvision-to-d2.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/built-in/cv/detection/PSENet_for_PyTorch/public_address_statement.md b/PyTorch/built-in/cv/detection/PSENet_for_PyTorch/public_address_statement.md index 379e522ca0bc25f3733d57d98b06745a0126648a..b5ba72992b41aeb4274ad613e1bb3dd219a22a74 100644 --- a/PyTorch/built-in/cv/detection/PSENet_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/cv/detection/PSENet_for_PyTorch/public_address_statement.md @@ -1,28 +1,7 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|------|--------|------------------------------------------------------|-------------------------------------------------------------|--------| -| 开发引入 | / | PSENet_for_PyTorch/NPU/src/models/fpn_resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 下载权重文件 | -| 开发引入 | / | PSENet_for_PyTorch/NPU/src/models/fpn_resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 下载权重文件 | -| 开发引入 | / | PSENet_for_PyTorch/NPU/src/models/fpn_resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 下载权重文件 | -| 开发引入 | / | PSENet_for_PyTorch/NPU/src/models/fpn_resnet.py | https://download.pytorch.org/models/resnet101-5d3mb4d8f.pth | 下载权重文件 | -| 开发引入 | / | PSENet_for_PyTorch/NPU/src/models/fpn_resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 下载权重文件 | -| 开发引入 | / | PSENet_for_PyTorch/NPU/src/onnx_models/fpn_resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 下载权重文件 | -| 开发引入 | / | PSENet_for_PyTorch/NPU/src/onnx_models/fpn_resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 下载权重文件 | -| 开发引入 | / | PSENet_for_PyTorch/NPU/src/onnx_models/fpn_resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 下载权重文件 | -| 开发引入 | / | PSENet_for_PyTorch/NPU/src/onnx_models/fpn_resnet.py | https://download.pytorch.org/models/resnet101-5d3mb4d8f.pth | 下载权重文件 | -| 开发引入 | / | PSENet_for_PyTorch/NPU/src/onnx_models/fpn_resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 下载权重文件 | -| 开发引入 | / | PSENet_for_PyTorch/NPU/test/models/fpn_resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 下载权重文件 | -| 开发引入 | / | PSENet_for_PyTorch/NPU/test/models/fpn_resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 下载权重文件 | -| 开发引入 | / | PSENet_for_PyTorch/NPU/test/models/fpn_resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 下载权重文件 | -| 开发引入 | / | PSENet_for_PyTorch/NPU/test/models/fpn_resnet.py | https://download.pytorch.org/models/resnet101-5d3mb4d8f.pth | 下载权重文件 | -| 开发引入 | / | PSENet_for_PyTorch/NPU/test/models/fpn_resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 下载权重文件 | -| 开发引入 | / | PSENet_for_PyTorch/NPU/src/util/img.py | http://docs.opencv.org/2.4/modules/imgproc/doc/structural_analysis_and_shape_descriptors.html?highlight=pointpolygontest#cv.PointPolygonTest | 相关说明 | -| 开发引入 | / | PSENet_for_PyTorch/NPU/test/metrics.py | https://github.com/wkentaro/pytorch-fcn/blob/master/torchfcn/utils.py | 
源码实现 | -| 开发引入 | / | PSENet_for_PyTorch/NPU/src/util/tf.py | http://stackoverflow.com/questions/38559755/how-to-get-current-available-gpus-in-tensorflow | 相关说明 | -| 开发引入 | / | PSENet_for_PyTorch/NPU/src/util/tf.py | https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/platform/test.py | 源码实现 | -| 开发引入 | / | PSENet_for_PyTorch/NPU/test/util/tf.py | http://stackoverflow.com/questions/38559755/how-to-get-current-available-gpus-in-tensorflow | 相关说明 | -| 开发引入 | / | PSENet_for_PyTorch/NPU/test/util/feature.py | https://github.com/scikit-image/scikit-image/blob/master/skimage/feature/_hog.py | 源码实现 | -| 开发引入 | / | PSENet_for_PyTorch/NPU/src/util/feature.py | https://github.com/scikit-image/scikit-image/blob/master/skimage/feature/_hog.py | 源码实现 | -| 开发引入 | / | PSENet_for_PyTorch/NPU/test/util/tf.py | https://github.com/tensorflow/tensorflow/blob/master/tensorflow/python/platform/test.py | 源码实现 | -| 开发引入 | / | PSENet_for_PyTorch/NPU/src/util/img.py | https://www.oschina.net/translate/opencv-rotation | 相关说明 | -| 开发引入 | / | PSENet_for_PyTorch/NPU/test/util/img.py | https://www.oschina.net/translate/opencv-rotation | 相关说明 | -| 开发引入 | / | PSENet_for_PyTorch/NPU/test/util/img.py | http://docs.opencv.org/2.4/modules/imgproc/doc/structural_analysis_and_shape_descriptors.html?highlight=pointpolygontest#cv.PointPolygonTest | 相关说明 | +| 文件位置 | 公网地址 | 公网地址用途 | +|---------------------------------------------------------------------------|-------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/PSENet_for_PyTorch/url.ini | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/PSENet_for_PyTorch/url.ini | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/PSENet_for_PyTorch/url.ini | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/PSENet_for_PyTorch/url.ini | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/PSENet_for_PyTorch/url.ini | https://download.pytorch.org/models/resnet101-5d3mb4d8f.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/built-in/cv/detection/RFCN_ID0418_for_PyTorch/public_address_statement.md b/PyTorch/built-in/cv/detection/RFCN_ID0418_for_PyTorch/public_address_statement.md index 1baea7747a1f39ddb90b4bcaee6129e4a5ed8eb2..6182aeab30942300545347a8fe559f9b90d75bf6 100644 --- a/PyTorch/built-in/cv/detection/RFCN_ID0418_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/cv/detection/RFCN_ID0418_for_PyTorch/public_address_statement.md @@ -1,20 +1,12 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|-----------------------------------------------------------------------------------------|---------------------------------------------------------|----------------------------------------------------------------|--------| -| 开源代码引入 | https://github.com/RebornL/RFCN-pytorch.1.0/blob/master/lib/model/rfcn/resnet_atrous.py | RFCN_ID0418_for_PyTorch/lib/model/rfcn/resnet_atrous.py | https://s3.amazonaws.com/pytorch/models/resnet18-5c106cde.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/RebornL/RFCN-pytorch.1.0/blob/master/lib/model/rfcn/resnet_atrous.py | RFCN_ID0418_for_PyTorch/lib/model/rfcn/resnet_atrous.py | https://s3.amazonaws.com/pytorch/models/resnet34-333f7ec4.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/RebornL/RFCN-pytorch.1.0/blob/master/lib/model/rfcn/resnet_atrous.py | RFCN_ID0418_for_PyTorch/lib/model/rfcn/resnet_atrous.py | https://s3.amazonaws.com/pytorch/models/resnet50-19c8e357.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/RebornL/RFCN-pytorch.1.0/blob/master/lib/model/rfcn/resnet_atrous.py | RFCN_ID0418_for_PyTorch/lib/model/rfcn/resnet_atrous.py | https://s3.amazonaws.com/pytorch/models/resnet101-5d3b4d8f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/RebornL/RFCN-pytorch.1.0/blob/master/lib/model/rfcn/resnet_atrous.py | RFCN_ID0418_for_PyTorch/lib/model/rfcn/resnet_atrous.py | https://s3.amazonaws.com/pytorch/models/resnet152-b121ed2d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/RebornL/RFCN-pytorch.1.0/blob/master/lib/model/faster_rcnn/resnet.py | RFCN_ID0418_for_PyTorch/lib/model/faster_rcnn/resnet.py | https://s3.amazonaws.com/pytorch/models/resnet18-5c106cde.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/RebornL/RFCN-pytorch.1.0/blob/master/lib/model/faster_rcnn/resnet.py | RFCN_ID0418_for_PyTorch/lib/model/faster_rcnn/resnet.py | https://s3.amazonaws.com/pytorch/models/resnet34-333f7ec4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/RebornL/RFCN-pytorch.1.0/blob/master/lib/model/faster_rcnn/resnet.py | RFCN_ID0418_for_PyTorch/lib/model/faster_rcnn/resnet.py | https://s3.amazonaws.com/pytorch/models/resnet50-19c8e357.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/RebornL/RFCN-pytorch.1.0/blob/master/lib/model/faster_rcnn/resnet.py | RFCN_ID0418_for_PyTorch/lib/model/faster_rcnn/resnet.py | https://s3.amazonaws.com/pytorch/models/resnet101-5d3b4d8f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/RebornL/RFCN-pytorch.1.0/blob/master/lib/model/faster_rcnn/resnet.py | RFCN_ID0418_for_PyTorch/lib/model/faster_rcnn/resnet.py | https://s3.amazonaws.com/pytorch/models/resnet152-b121ed2d.pth | 下载权重文件 | -| 开发引入 | / | constant.py | 127.0.0.1 | 本机IP地址 | -| 开发引入 | / | RFCN_ID0418_for_PyTorch/lib/model/rpn/proposal_target_layer_cascade.py | https://github.com/pytorch/pytorch/issues/1868 | 相关说明 | -| 开发引入 | / | RFCN_ID0418_for_PyTorch/lib/model/rpn/anchor_target_layer.py | https://github.com/pytorch/pytorch/issues/1868 | 相关说明 | -| 开源代码引入 | https://github.com/RebornL/RFCN-pytorch.1.0/blob/master/lib/datasets/tools/mcg_munge.py | RFCN_ID0418_for_PyTorch/lib/datasets/tools/mcg_munge.py | http://www.eecs.berkeley.edu/Research/Projects/CS/vision/grouping/mcg/ | 相关说明 | -| 开源代码引入 | https://github.com/RebornL/RFCN-pytorch.1.0/blob/master/lib/model/utils/net_utils.py | RFCN_ID0418_for_PyTorch/lib/model/utils/net_utils.py | https://github.com/ruotianluo/pytorch-faster-rcnn | 源码实现 | -| 开源代码引入 | https://github.com/RebornL/RFCN-pytorch.1.0/blob/master/lib/model/utils/logger.py | RFCN_ID0418_for_PyTorch/lib/model/utils/logger.py | https://gist.github.com/gyglim/1f8dfb1b5c82627ae3efcfbbadb9f514 | 相关说明 | -| 开发引入 | / | RFCN_ID0418_for_PyTorch/lib/datasets/tools/mcg_munge.py | http://www.mpi-inf.mpg.de/departments/computer-vision-and-multimodal- | 相关说明 | -| 开源代码引入 | https://github.com/RebornL/RFCN-pytorch.1.0/blob/master/lib/model/csrc/cuda/PSROIAlign_cuda.cu | RFCN_ID0418_for_PyTorch/lib/model/csrc/cuda/PSROIAlign_cuda.cu | http://blog.prince2015.club/2018/07/13/R-FCN/ | 相关说明 | +| 文件位置 | 公网地址 | 公网地址用途 | +|--------------------------------------------------------------------------------------------------------|----------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/RFCN_ID0418_for_PyTorch/lib/model/faster_rcnn/resnet.py | 
https://s3.amazonaws.com/pytorch/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/RFCN_ID0418_for_PyTorch/lib/model/faster_rcnn/resnet.py | https://s3.amazonaws.com/pytorch/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/RFCN_ID0418_for_PyTorch/lib/model/faster_rcnn/resnet.py | https://s3.amazonaws.com/pytorch/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/RFCN_ID0418_for_PyTorch/lib/model/faster_rcnn/resnet.py | https://s3.amazonaws.com/pytorch/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/RFCN_ID0418_for_PyTorch/lib/model/faster_rcnn/resnet.py | https://s3.amazonaws.com/pytorch/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/RFCN_ID0418_for_PyTorch/lib/model/rfcn/resnet_atrous.py | https://s3.amazonaws.com/pytorch/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/RFCN_ID0418_for_PyTorch/lib/model/rfcn/resnet_atrous.py | https://s3.amazonaws.com/pytorch/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/RFCN_ID0418_for_PyTorch/lib/model/rfcn/resnet_atrous.py | https://s3.amazonaws.com/pytorch/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/RFCN_ID0418_for_PyTorch/lib/model/rfcn/resnet_atrous.py | https://s3.amazonaws.com/pytorch/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/RFCN_ID0418_for_PyTorch/lib/model/rfcn/resnet_atrous.py | https://s3.amazonaws.com/pytorch/models/resnet152-b121ed2d.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/built-in/cv/detection/RetinaNet_for_PyTorch/public_address_statement.md b/PyTorch/built-in/cv/detection/RetinaNet_for_PyTorch/public_address_statement.md index b3012b7b6fd5d9d4a4c80938c53b2e7e7ce29720..1ffddfc7b3f05695f145b3e5f2246bdd902b5747 100644 --- a/PyTorch/built-in/cv/detection/RetinaNet_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/cv/detection/RetinaNet_for_PyTorch/public_address_statement.md @@ -1,162 +1,9 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------|-----------------------------------------------------------------------------------------------------------|--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|----------| -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/.pre-commit-config.yaml | RetinaNet_for_PyTorch/.pre-commit-config.yaml | https://gitlab.com/pycqa/flake8.git | 第三方包源码 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/.pre-commit-config.yaml | RetinaNet_for_PyTorch/.pre-commit-config.yaml | https://github.com/asottile/seed-isort-config | 第三方包源码 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/.pre-commit-config.yaml | RetinaNet_for_PyTorch/.pre-commit-config.yaml | https://github.com/timothycrosley/isort | 第三方包源码 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/.pre-commit-config.yaml | 
RetinaNet_for_PyTorch/.pre-commit-config.yaml | https://github.com/pre-commit/mirrors-yapf | 第三方包源码 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/.pre-commit-config.yaml | RetinaNet_for_PyTorch/.pre-commit-config.yaml | https://github.com/pre-commit/pre-commit-hooks | 第三方包源码 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/.pre-commit-config.yaml | RetinaNet_for_PyTorch/.pre-commit-config.yaml | https://github.com/jumanjihouse/pre-commit-hooks | 第三方包源码 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/.pre-commit-config.yaml | RetinaNet_for_PyTorch/.pre-commit-config.yaml | https://github.com/myint/docformatter | 第三方包源码 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/configs/cityscapes/faster_rcnn_r50_fpn_1x_cityscapes.py | RetinaNet_for_PyTorch/configs/cityscapes/faster_rcnn_r50_fpn_1x_cityscapes.py | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_fpn_1x_coco/faster_rcnn_r50_fpn_1x_coco_20200130-047c8118.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/configs/cityscapes/mask_rcnn_r50_fpn_1x_cityscapes.py | RetinaNet_for_PyTorch/configs/cityscapes/mask_rcnn_r50_fpn_1x_cityscapes.py | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r50_fpn_1x_coco/mask_rcnn_r50_fpn_1x_coco_20200205-d4b0c5d6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/configs/faster_rcnn/faster_rcnn_r50_caffe_fpn_mstrain_1x_coco-person.py | RetinaNet_for_PyTorch/configs/faster_rcnn/faster_rcnn_r50_caffe_fpn_mstrain_1x_coco-person.py | http://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_caffe_fpn_mstrain_3x_coco/faster_rcnn_r50_caffe_fpn_mstrain_3x_coco_bbox_mAP-0.398_20200504_163323-30042637.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/configs/faster_rcnn/faster_rcnn_r50_caffe_fpn_mstrain_1x_coco-person-bicycle-car.py | RetinaNet_for_PyTorch/configs/faster_rcnn/faster_rcnn_r50_caffe_fpn_mstrain_1x_coco-person-bicycle-car.py | http://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_caffe_fpn_mstrain_3x_coco/faster_rcnn_r50_caffe_fpn_mstrain_3x_coco_bbox_mAP-0.398_20200504_163323-30042637.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/docker/Dockerfile | RetinaNet_for_PyTorch/docker/Dockerfile | https://openmmlab.oss-accelerate.aliyuncs.com/mmcv/dist/index.html | 下载第三方包 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/docker/Dockerfile | RetinaNet_for_PyTorch/docker/Dockerfile | https://github.com/open-mmlab/mmdetection.git | 下载第三方包 | -| 开发引入 | / | url.ini | https://github.com/open-mmlab/mmcv.git | 下载第三方包 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/docs/stat.py | RetinaNet_for_PyTorch/docs/stat.py | https://github.com/open-mmlab/mmdetection/blob/master/ | 第三方包源码地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/datasets/lvis.py | RetinaNet_for_PyTorch/mmdet/datasets/lvis.py | http://images.cocodataset.org/ | 数据集地址 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/setup.py | RetinaNet_for_PyTorch/setup.py | openmmlab@gmail.com | 作者邮箱地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/tests/async_benchmark.py | RetinaNet_for_PyTorch/tests/async_benchmark.py | http://download.openmmlab.com/mmdetection/v2.0 | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/necks/nasfcos_fpn.py | RetinaNet_for_PyTorch/mmdet/models/necks/nasfcos_fpn.py | https://arxiv.org/abs/1906.04423 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/backbones/hourglass.py | RetinaNet_for_PyTorch/mmdet/models/backbones/hourglass.py | https://arxiv.org/abs/1603.06937 | 论文地址 | -| 开发引入 | / | RetinaNet_for_PyTorch/mmdet/datasets/dataset_wrappers.py | https://github.com/facebookresearch/detectron2/blob/41d475b75a230221e21d9cac5d69655e3415e3a4/detectron2/data/samplers/distributed_sampler.py#L57 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/losses/iou_loss.py | RetinaNet_for_PyTorch/mmdet/models/losses/iou_loss.py | https://arxiv.org/abs/1911.08287 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/losses/ghm_loss.py | RetinaNet_for_PyTorch/mmdet/models/losses/ghm_loss.py | https://arxiv.org/abs/1811.05181 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/detectors/faster_rcnn.py | RetinaNet_for_PyTorch/mmdet/models/detectors/faster_rcnn.py | https://arxiv.org/abs/1506.01497 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/necks/hrfpn.py | RetinaNet_for_PyTorch/mmdet/models/necks/hrfpn.py | https://arxiv.org/abs/1904.04514 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/roi_heads/dynamic_roi_head.py | RetinaNet_for_PyTorch/mmdet/models/roi_heads/dynamic_roi_head.py | https://arxiv.org/abs/2004.06002 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/detectors/mask_rcnn.py | RetinaNet_for_PyTorch/mmdet/models/detectors/mask_rcnn.py | https://arxiv.org/abs/1703.06870 | 论文地址 | -| 开发引入 | / | RetinaNet_for_PyTorch/mmdet/models/roi_heads/point_rend_roi_head.py | https://github.com/facebookresearch/detectron2/tree/master/projects/PointRend | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/core/bbox/demodata.py | RetinaNet_for_PyTorch/mmdet/core/bbox/demodata.py | https://gitlab.kitware.com/computer-vision/kwimage/blob/master/kwimage/structs/boxes.py#L1390 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/detectors/fsaf.py | RetinaNet_for_PyTorch/mmdet/models/detectors/fsaf.py | https://arxiv.org/abs/1903.00621 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/utils/res_layer.py | RetinaNet_for_PyTorch/mmdet/models/roi_heads/mask_heads/global_context_head.py | https://arxiv.org/abs/2012.10150 | 论文地址 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/dense_heads/ssd_head.py | RetinaNet_for_PyTorch/mmdet/models/dense_heads/ssd_head.py | https://arxiv.org/abs/1512.02325 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/roi_heads/mask_heads/fcn_mask_head.py | RetinaNet_for_PyTorch/mmdet/datasets/lvis.py | https://github.com/facebookresearch/detectron2/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/dense_heads/cascade_rpn_head.py | RetinaNet_for_PyTorch/mmdet/models/dense_heads/cascade_rpn_head.py | https://arxiv.org/abs/1909.06720 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/roi_heads/grid_roi_head.py | RetinaNet_for_PyTorch/mmdet/models/detectors/grid_rcnn.py | https://arxiv.org/abs/1811.12030 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/datasets/pipelines/transforms.py | RetinaNet_for_PyTorch/mmdet/datasets/pipelines/transforms.py | https://arxiv.org/abs/1708.04552 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/core/mask/structures.py | RetinaNet_for_PyTorch/mmdet/core/mask/structures.py | https://stackoverflow.com/questions/24467972/calculate-area-of-polygon-given-x-y-coordinates | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/roi_heads/mask_heads/grid_head.py | RetinaNet_for_PyTorch/mmdet/models/roi_heads/mask_heads/grid_head.py | https://arxiv.org/abs/1906.05688 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/losses/iou_loss.py | RetinaNet_for_PyTorch/mmdet/models/losses/iou_loss.py | https://arxiv.org/abs/1711.00164 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/roi_heads/sparse_roi_head.py | RetinaNet_for_PyTorch/mmdet/models/dense_heads/embedding_rpn_head.py | https://arxiv.org/abs/2011.12450 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/roi_heads/double_roi_head.py | RetinaNet_for_PyTorch/mmdet/models/roi_heads/double_roi_head.py | https://arxiv.org/abs/1904.06493 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/losses/gfocal_loss.py | RetinaNet_for_PyTorch/mmdet/models/losses/gfocal_loss.py | https://arxiv.org/abs/2006.04388 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/dense_heads/yolo_head.py | RetinaNet_for_PyTorch/mmdet/models/dense_heads/yolo_head.py | https://arxiv.org/abs/1804.02767 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/docs/conf.py | RetinaNet_for_PyTorch/docs/conf.py | https://www.sphinx-doc.org/en/master/usage/configuration.html | 相关说明 | -| 开发引入 | / | RetinaNet_for_PyTorch/mmdet/datasets/cityscapes.py | https://github.com/facebookresearch/detectron2/blob/master/detectron2/data/datasets/cityscapes.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/losses/gfocal_loss.py | 
RetinaNet_for_PyTorch/mmdet/models/dense_heads/gfl_head.py | https://arxiv.org/abs/2006.04388 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/detectors/paa.py | RetinaNet_for_PyTorch/mmdet/models/detectors/paa.py | https://arxiv.org/pdf/2007.08103.pdf | 论文地址 | -| 开发引入 | / | RetinaNet_for_PyTorch/mmdet/models/losses/gaussian_focal_loss.py | https://github.com/princeton-vl/CornerNet/blob/master/models/py_utils/kp_utils.py#L152 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/datasets/pipelines/transforms.py | RetinaNet_for_PyTorch/mmdet/datasets/pipelines/transforms.py | https://albumentations.readthedocs.io | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/roi_heads/sparse_roi_head.py | RetinaNet_for_PyTorch/mmdet/models/roi_heads/sparse_roi_head.py | https://arxiv.org/abs/2011.12450 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/losses/iou_loss.py | RetinaNet_for_PyTorch/mmdet/models/losses/iou_loss.py | https://arxiv.org/abs/1902.09630 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/detectors/fsaf.py | RetinaNet_for_PyTorch/mmdet/core/bbox/coder/tblr_bbox_coder.py | https://arxiv.org/abs/1903.00621 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/roi_heads/roi_extractors/single_level_roi_extractor.py | RetinaNet_for_PyTorch/mmdet/models/necks/fpn.py | https://arxiv.org/abs/1612.03144 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/datasets/wider_face.py | RetinaNet_for_PyTorch/mmdet/datasets/wider_face.py | https://github.com/sovrasov/wider-face-pascal-voc-annotations | 源码实现 | -| 开发引入 | / | RetinaNet_for_PyTorch/mmdet/models/losses/ae_loss.py | https://github.com/princeton-vl/CornerNet/blob/master/models/py_utils/kp_utils.py#L180 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/core/bbox/coder/legacy_delta_xywh_bbox_coder.py | RetinaNet_for_PyTorch/mmdet/core/bbox/coder/delta_xywh_bbox_coder.py | https://arxiv.org/abs/1311.2524 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/dense_heads/paa_head.py | RetinaNet_for_PyTorch/mmdet/models/dense_heads/paa_head.py | https://github.com/kkhoot/PAA/issues/9 | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/roi_heads/mask_scoring_roi_head.py | RetinaNet_for_PyTorch/mmdet/models/roi_heads/mask_scoring_roi_head.py | https://arxiv.org/abs/1903.00241 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/configs/guided_anchoring/README.md | RetinaNet_for_PyTorch/mmdet/models/dense_heads/guided_anchor_head.py | https://arxiv.org/abs/1901.03278 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/losses/iou_loss.py | RetinaNet_for_PyTorch/mmdet/models/losses/iou_loss.py | https://github.com/Zzh-tju/DIoU | 源码实现 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/dense_heads/free_anchor_retina_head.py | RetinaNet_for_PyTorch/mmdet/models/dense_heads/free_anchor_retina_head.py | https://arxiv.org/abs/1909.02466 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/datasets/lvis.py | RetinaNet_for_PyTorch/mmdet/datasets/lvis.py | http://images.cocodataset.org/train2017/000000391895.jpg | 图片地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/losses/gaussian_focal_loss.py | RetinaNet_for_PyTorch/mmdet/models/dense_heads/corner_head.py | https://arxiv.org/abs/1808.01244 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/detectors/fovea.py | RetinaNet_for_PyTorch/mmdet/models/dense_heads/fovea_head.py | https://arxiv.org/abs/1904.03797 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/core/mask/structures.py | RetinaNet_for_PyTorch/mmdet/core/mask/structures.py | https://github.com/facebookresearch/detectron2/blob/ffff8acc35ea88ad1cb1806ab0f00b4c1c5dbfd9/detectron2/structures/masks.py#L387 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/losses/focal_loss.py | RetinaNet_for_PyTorch/mmdet/models/detectors/retinanet.py | https://arxiv.org/abs/1708.02002 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/configs/sabl/README.md | RetinaNet_for_PyTorch/mmdet/models/roi_heads/bbox_heads/sabl_head.py | https://arxiv.org/abs/1912.04260 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/dense_heads/paa_head.py | RetinaNet_for_PyTorch/mmdet/models/dense_heads/paa_head.py | https://github.com/kkhoot/PAA/blob/master/paa_core | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/necks/rfp.py | RetinaNet_for_PyTorch/mmdet/models/backbones/detectors_resnet.py | https://arxiv.org/pdf/2006.02334.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/detectors/yolact.py | RetinaNet_for_PyTorch/mmdet/core/post_processing/bbox_nms.py | https://arxiv.org/abs/1904.02689 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/backbones/regnet.py | RetinaNet_for_PyTorch/mmdet/models/backbones/regnet.py | https://arxiv.org/abs/2003.13678 | 论文地址 | -| 开发引入 | / | RetinaNet_for_PyTorch/mmdet/models/roi_heads/mask_heads/mask_point_head.py | https://github.com/facebookresearch/detectron2/tree/master/projects/PointRend/point_head/point_head.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/necks/pafpn.py | RetinaNet_for_PyTorch/mmdet/models/necks/pafpn.py | https://arxiv.org/abs/1803.01534 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/detectors/atss.py | RetinaNet_for_PyTorch/mmdet/models/dense_heads/atss_head.py | https://arxiv.org/abs/1912.02424 | 论文地址 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/losses/focal_loss.py | RetinaNet_for_PyTorch/mmdet/models/losses/focal_loss.py | https://arxiv.org/abs/1708.02002 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/detectors/detr.py | RetinaNet_for_PyTorch/mmdet/models/detectors/detr.py | https://arxiv.org/pdf/2005.12872 | 论文地址 | -| 开发引入 | / | RetinaNet_for_PyTorch/mmdet/models/dense_heads/fcos_head.py | https://github.com/tianzhi0549/FCOS | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/losses/iou_loss.py | RetinaNet_for_PyTorch/mmdet/models/losses/iou_loss.py | https://arxiv.org/abs/2005.03572 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/detectors/atss.py | RetinaNet_for_PyTorch/mmdet/models/detectors/atss.py | https://arxiv.org/abs/1912.02424 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/roi_heads/sparse_roi_head.py | RetinaNet_for_PyTorch/mmdet/models/detectors/sparse_rcnn.py | https://arxiv.org/abs/2011.12450 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/dense_heads/fcos_head.py | RetinaNet_for_PyTorch/mmdet/models/dense_heads/fcos_head.py | https://github.com/tianzhi0549/FCOS/issues/89#issuecomment-516877042 | 相关说明 | -| 开发引入 | / | RetinaNet_for_PyTorch/mmdet/models/losses/ae_loss.py | https://arxiv.org/abs/1611.05424 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/roi_heads/htc_roi_head.py | RetinaNet_for_PyTorch/mmdet/models/detectors/htc.py | https://arxiv.org/abs/1901.07518 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/losses/focal_loss.py | RetinaNet_for_PyTorch/mmdet/models/losses/gaussian_focal_loss.py | https://arxiv.org/abs/1708.02002 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/roi_heads/mask_heads/fcn_mask_head.py | RetinaNet_for_PyTorch/mmdet/models/roi_heads/mask_heads/fcn_mask_head.py | https://github.com/facebookresearch/detectron2/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/dense_heads/corner_head.py | RetinaNet_for_PyTorch/mmdet/models/dense_heads/corner_head.py | https://github.com/princeton-vl/CornerNet/blob/master/models/py_utils/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/losses/balanced_l1_loss.py | RetinaNet_for_PyTorch/mmdet/models/losses/balanced_l1_loss.py | https://arxiv.org/pdf/1904.02701.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/detectors/fast_rcnn.py | RetinaNet_for_PyTorch/mmdet/models/detectors/fast_rcnn.py | https://arxiv.org/abs/1504.08083 | 论文地址 | -| 开发引入 | / | RetinaNet_for_PyTorch/mmdet/core/bbox/coder/yolo_bbox_coder.py | https://arxiv.org/abs/1506.02640 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/utils/gaussian_target.py | RetinaNet_for_PyTorch/mmdet/models/utils/gaussian_target.py | 
https://github.com/princeton-vl/CornerNet-Lite/blob/master/core/sample/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/roi_heads/roi_extractors/generic_roi_extractor.py | RetinaNet_for_PyTorch/mmdet/models/roi_heads/roi_extractors/generic_roi_extractor.py | https://arxiv.org/abs/2004.13665 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/docs/changelog.md | RetinaNet_for_PyTorch/mmdet/models/necks/fpn_carafe.py | https://arxiv.org/abs/1905.02188 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/datasets/pipelines/auto_augment.py | RetinaNet_for_PyTorch/mmdet/datasets/pipelines/auto_augment.py | https://arxiv.org/pdf/1906.11172 | 论文地址 | -| 开发引入 | / | RetinaNet_for_PyTorch/mmdet/models/dense_heads/paa_head.py | https://github.com/kkhoot/PAA/issues/8 | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/detectors/detr.py | RetinaNet_for_PyTorch/mmdet/models/utils/transformer.py | https://arxiv.org/pdf/2005.12872 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/roi_heads/mask_heads/grid_head.py | RetinaNet_for_PyTorch/mmdet/models/detectors/grid_rcnn.py | https://arxiv.org/abs/1906.05688 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/roi_heads/mask_scoring_roi_head.py | RetinaNet_for_PyTorch/mmdet/models/detectors/mask_scoring_rcnn.py | https://arxiv.org/abs/1903.00241 | 论文地址 | -| 开发引入 | / | RetinaNet_for_PyTorch/mmdet/models/necks/bfp.py | https://arxiv.org/abs/1904.02701 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/necks/rfp.py | RetinaNet_for_PyTorch/mmdet/models/necks/rfp.py | https://arxiv.org/pdf/2006.02334.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/detectors/fovea.py | RetinaNet_for_PyTorch/mmdet/models/detectors/fovea.py | https://arxiv.org/abs/1904.03797 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/core/bbox/demodata.py | RetinaNet_for_PyTorch/mmdet/core/bbox/demodata.py | https://gitlab.kitware.com/computer-vision/kwarray/blob/master/kwarray/util_random.py#L270 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/backbones/resnet.py | RetinaNet_for_PyTorch/mmdet/models/backbones/resnet.py | https://arxiv.org/pdf/1812.01187.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/losses/varifocal_loss.py | RetinaNet_for_PyTorch/mmdet/models/detectors/vfnet.py | https://arxiv.org/abs/2008.13367 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/detectors/trident_faster_rcnn.py | RetinaNet_for_PyTorch/mmdet/models/detectors/trident_faster_rcnn.py | https://arxiv.org/abs/1901.01892 | 论文地址 | -| 开发引入 | / | RetinaNet_for_PyTorch/mmcv_need/optimizer.py | https://arxiv.org/abs/1710.03740 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/docs/make.bat | RetinaNet_for_PyTorch/docs/make.bat | 
http://sphinx-doc.org/ | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/core/bbox/coder/legacy_delta_xywh_bbox_coder.py | RetinaNet_for_PyTorch/mmdet/core/bbox/coder/legacy_delta_xywh_bbox_coder.py | https://arxiv.org/abs/1311.2524 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/utils/res_layer.py | RetinaNet_for_PyTorch/mmdet/models/roi_heads/mask_heads/scnet_mask_head.py | https://arxiv.org/abs/2012.10150 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/datasets/builder.py | RetinaNet_for_PyTorch/mmdet/datasets/builder.py | https://github.com/pytorch/pytorch/issues/973 | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/core/bbox/samplers/ohem_sampler.py | RetinaNet_for_PyTorch/mmdet/core/bbox/samplers/ohem_sampler.py | https://arxiv.org/abs/1604.03540 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/losses/gaussian_focal_loss.py | RetinaNet_for_PyTorch/mmdet/models/detectors/cornernet.py | https://arxiv.org/abs/1808.01244 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/necks/nas_fpn.py | RetinaNet_for_PyTorch/mmdet/models/necks/nas_fpn.py | https://arxiv.org/abs/1904.07392 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/configs/instaboost/README.md | RetinaNet_for_PyTorch/mmdet/datasets/pipelines/instaboost.py | https://github.com/GothicAi/Instaboost | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/configs/sabl/README.md | RetinaNet_for_PyTorch/mmdet/core/bbox/coder/bucketing_bbox_coder.py | https://arxiv.org/abs/1912.04260 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/detectors/reppoints_detector.py | RetinaNet_for_PyTorch/mmdet/models/detectors/reppoints_detector.py | https://arxiv.org/pdf/1904.11490 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/detectors/yolact.py | RetinaNet_for_PyTorch/mmdet/models/detectors/yolact.py | https://arxiv.org/abs/1904.02689 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/losses/varifocal_loss.py | RetinaNet_for_PyTorch/mmdet/models/losses/varifocal_loss.py | https://arxiv.org/abs/2008.13367 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/datasets/pipelines/instaboost.py | RetinaNet_for_PyTorch/mmdet/datasets/pipelines/instaboost.py | https://arxiv.org/abs/1908.07801 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/detectors/nasfcos.py | RetinaNet_for_PyTorch/mmdet/models/detectors/nasfcos.py | https://arxiv.org/abs/1906.0442 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/dense_heads/retina_head.py | RetinaNet_for_PyTorch/mmdet/models/dense_heads/retina_head.py | https://arxiv.org/pdf/1708.02002.pdf | 论文地址 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/roi_heads/pisa_roi_head.py | RetinaNet_for_PyTorch/mmdet/core/bbox/samplers/score_hlr_sampler.py | https://arxiv.org/abs/1904.04821 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/necks/hrfpn.py | RetinaNet_for_PyTorch/mmdet/models/backbones/hrnet.py | https://arxiv.org/abs/1904.04514 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/utils/util_mixins.py | RetinaNet_for_PyTorch/mmdet/utils/util_mixins.py | https://github.com/Erotemic/ubelt | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/setup.py | RetinaNet_for_PyTorch/setup.py | http://setuptools.readthedocs.io/en/latest/setuptools.html#declaring-platform-specific-dependencies | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/detectors/detr.py | RetinaNet_for_PyTorch/mmdet/models/utils/positional_encoding.py | https://arxiv.org/pdf/2005.12872 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/detectors/cascade_rcnn.py | RetinaNet_for_PyTorch/mmdet/models/detectors/cascade_rcnn.py | https://arxiv.org/abs/1906.09756 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/necks/nasfcos_fpn.py | RetinaNet_for_PyTorch/mmdet/models/dense_heads/nasfcos_head.py | https://arxiv.org/abs/1906.04423 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/roi_heads/htc_roi_head.py | RetinaNet_for_PyTorch/mmdet/models/roi_heads/htc_roi_head.py | https://arxiv.org/abs/1901.07518 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/losses/gaussian_focal_loss.py | RetinaNet_for_PyTorch/mmdet/models/losses/gaussian_focal_loss.py | https://arxiv.org/abs/1808.01244 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/utils/res_layer.py | RetinaNet_for_PyTorch/mmdet/models/roi_heads/scnet_roi_head.py | https://arxiv.org/abs/2012.10150 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/losses/varifocal_loss.py | RetinaNet_for_PyTorch/mmdet/models/dense_heads/vfnet_head.py | https://arxiv.org/abs/2008.13367 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/losses/gaussian_focal_loss.py | RetinaNet_for_PyTorch/mmdet/models/losses/ae_loss.py | https://arxiv.org/abs/1808.01244 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/detectors/fcos.py | RetinaNet_for_PyTorch/mmdet/models/detectors/fcos.py | https://arxiv.org/abs/1904.01355 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/detectors/yolact.py | RetinaNet_for_PyTorch/mmdet/models/dense_heads/yolact_head.py | https://arxiv.org/abs/1904.02689 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/roi_heads/point_rend_roi_head.py | 
RetinaNet_for_PyTorch/mmdet/models/roi_heads/point_rend_roi_head.py | https://arxiv.org/abs/1912.08193 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/roi_heads/grid_roi_head.py | RetinaNet_for_PyTorch/mmdet/models/roi_heads/grid_roi_head.py | https://arxiv.org/abs/1811.12030 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/detectors/detr.py | RetinaNet_for_PyTorch/mmdet/models/dense_heads/transformer_head.py | https://arxiv.org/pdf/2005.12872 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/losses/balanced_l1_loss.py | RetinaNet_for_PyTorch/mmdet/core/bbox/samplers/iou_balanced_neg_sampler.py | https://arxiv.org/pdf/1904.02701.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/configs/sabl/README.md | RetinaNet_for_PyTorch/mmdet/models/dense_heads/sabl_retina_head.py | https://arxiv.org/abs/1912.04260 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/dense_heads/paa_head.py | RetinaNet_for_PyTorch/mmdet/models/dense_heads/paa_head.py | https://arxiv.org/abs/2007.08103 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/utils/res_layer.py | RetinaNet_for_PyTorch/mmdet/models/utils/res_layer.py | https://arxiv.org/abs/2012.10150 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/detectors/fsaf.py | RetinaNet_for_PyTorch/mmdet/models/dense_heads/fsaf_head.py | https://arxiv.org/abs/1903.00621 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/dense_heads/centripetal_head.py | RetinaNet_for_PyTorch/mmdet/models/dense_heads/centripetal_head.py | https://arxiv.org/abs/2003.09119 | 论文地址 | -| 开发引入 | / | RetinaNet_for_PyTorch/mmdet/datasets/cityscapes.py | https://github.com/mcordts/cityscapesScripts/blob/master/cityscapesscripts/evaluation/evalInstanceLevelSemanticLabeling.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/detectors/fcos.py | RetinaNet_for_PyTorch/mmdet/models/dense_heads/fcos_head.py | https://arxiv.org/abs/1904.01355 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/roi_heads/pisa_roi_head.py | RetinaNet_for_PyTorch/mmdet/models/roi_heads/pisa_roi_head.py | https://arxiv.org/abs/1904.04821 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/roi_heads/sparse_roi_head.py | RetinaNet_for_PyTorch/mmdet/models/roi_heads/bbox_heads/dii_head.py | https://arxiv.org/abs/2011.12450 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/roi_heads/point_rend_roi_head.py | RetinaNet_for_PyTorch/mmdet/models/detectors/point_rend.py | https://arxiv.org/abs/1912.08193 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/datasets/pipelines/transforms.py | RetinaNet_for_PyTorch/mmdet/datasets/pipelines/transforms.py | https://github.com/bethgelab/imagecorruptions | 源码实现 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/utils/res_layer.py | RetinaNet_for_PyTorch/mmdet/models/detectors/scnet.py | https://arxiv.org/abs/2012.10150 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/roi_heads/roi_extractors/single_level_roi_extractor.py | RetinaNet_for_PyTorch/mmdet/models/roi_heads/roi_extractors/single_level_roi_extractor.py | https://arxiv.org/abs/1612.03144 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/utils/res_layer.py | RetinaNet_for_PyTorch/mmdet/models/roi_heads/mask_heads/feature_relay_head.py | https://arxiv.org/abs/2012.10150 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/utils/res_layer.py | RetinaNet_for_PyTorch/mmdet/models/roi_heads/mask_heads/scnet_semantic_head.py | https://arxiv.org/abs/2012.10150 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/roi_heads/cascade_roi_head.py | RetinaNet_for_PyTorch/mmdet/models/roi_heads/cascade_roi_head.py | https://arxiv.org/abs/1712.00726 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/datasets/dataset_wrappers.py | RetinaNet_for_PyTorch/mmdet/datasets/dataset_wrappers.py | https://arxiv.org/abs/1908.03195 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/roi_heads/mask_heads/fcn_mask_head.py | RetinaNet_for_PyTorch/mmdet/datasets/coco.py | https://github.com/facebookresearch/detectron2/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/utils/transformer.py | RetinaNet_for_PyTorch/mmdet/models/utils/transformer.py | https://github.com/PeizeSun/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/losses/iou_loss.py | RetinaNet_for_PyTorch/mmdet/models/losses/iou_loss.py | https://github.com/Zzh-tju/CIoU | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/models/utils/res_layer.py | RetinaNet_for_PyTorch/mmdet/models/roi_heads/bbox_heads/scnet_bbox_head.py | https://arxiv.org/abs/2012.10150 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/core/bbox/assigners/atss_assigner.py | RetinaNet_for_PyTorch/mmdet/core/bbox/assigners/atss_assigner.py | https://github.com/sfzhang15/ATSS/blob/master/atss_core/modeling/rpn/atss/loss.py | 源码实现 | +| 文件位置 | 公网地址 | 公网地址用途 | +|----------------------------------------------------------------------------------------------------------------------------------------------------------|--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/RetinaNet_for_PyTorch/configs/cityscapes/faster_rcnn_r50_fpn_1x_cityscapes.py | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_fpn_1x_coco/faster_rcnn_r50_fpn_1x_coco_20200130-047c8118.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/detection/RetinaNet_for_PyTorch/configs/cityscapes/mask_rcnn_r50_fpn_1x_cityscapes.py | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r50_fpn_1x_coco/mask_rcnn_r50_fpn_1x_coco_20200205-d4b0c5d6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/RetinaNet_for_PyTorch/configs/faster_rcnn/faster_rcnn_r50_caffe_fpn_mstrain_1x_coco-person.py | http://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_caffe_fpn_mstrain_3x_coco/faster_rcnn_r50_caffe_fpn_mstrain_3x_coco_bbox_mAP-0.398_20200504_163323-30042637.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/RetinaNet_for_PyTorch/configs/faster_rcnn/faster_rcnn_r50_caffe_fpn_mstrain_1x_coco-person-bicycle-car.py | http://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_caffe_fpn_mstrain_3x_coco/faster_rcnn_r50_caffe_fpn_mstrain_3x_coco_bbox_mAP-0.398_20200504_163323-30042637.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/RetinaNet_for_PyTorch/docker/Dockerfile | https://openmmlab.oss-accelerate.aliyuncs.com/mmcv/dist/index.html | 三方库连接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/RetinaNet_for_PyTorch/mmdet/datasets/lvis.py | http://images.cocodataset.org/ | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/RetinaNet_for_PyTorch/setup.py | openmmlab@gmail.com | 作者邮箱 | \ No newline at end of file diff --git a/PyTorch/built-in/cv/detection/SCRFD_for_PyTorch/public_address_statement.md b/PyTorch/built-in/cv/detection/SCRFD_for_PyTorch/public_address_statement.md index 1ff68c41955f94ef993c7ca2d3eec2d8d564d207..021f0e5b3381d78f5d743f6cf9122c24d0cb2ca7 100644 --- a/PyTorch/built-in/cv/detection/SCRFD_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/cv/detection/SCRFD_for_PyTorch/public_address_statement.md @@ -1,136 +1,4 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|-------------------------------------------------------------------|-------------------------------------------------|----------------------------------------------------------------------|--------| -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/datasets/lvis.py | SCRFD_for_PyTorch/mmdet/datasets/lvis.py | http://images.cocodataset.org/ | 下载数据集 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/setup.py | SCRFD_for_PyTorch/setup.py | openmmlab@gmail.com | 作者邮箱 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/setup.py | SCRFD_for_PyTorch/setup.py | https://github.com/open-mmlab/mmdetection | 开源代码地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/roi_heads/grid_roi_head.py | SCRFD_for_PyTorch/mmdet/models/roi_heads/grid_roi_head.py | https://arxiv.org/abs/1811.12030 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/datasets/dataset_wrappers.py | SCRFD_for_PyTorch/mmdet/datasets/dataset_wrappers.py | https://arxiv.org/abs/1908.03195 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/roi_heads/mask_heads/grid_head.py | SCRFD_for_PyTorch/mmdet/models/roi_heads/mask_heads/grid_head.py | https://arxiv.org/abs/1906.05688 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/necks/rfp.py | SCRFD_for_PyTorch/mmdet/models/backbones/detectors_resnet.py | https://arxiv.org/pdf/2006.02334.pdf | 论文地址 | -| 开源代码引入 
| https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/roi_heads/pisa_roi_head.py | SCRFD_for_PyTorch/mmdet/models/roi_heads/pisa_roi_head.py | https://arxiv.org/abs/1904.04821 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/dense_heads/yolo_head.py | SCRFD_for_PyTorch/mmdet/models/dense_heads/yolo_head.py | https://arxiv.org/abs/1804.02767 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/roi_heads/pisa_roi_head.py | SCRFD_for_PyTorch/mmdet/core/bbox/samplers/score_hlr_sampler.py | https://arxiv.org/abs/1904.04821 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/detectors/detr.py | SCRFD_for_PyTorch/mmdet/models/utils/positional_encoding.py | https://arxiv.org/pdf/2005.12872 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/losses/iou_loss.py | SCRFD_for_PyTorch/mmdet/models/losses/iou_loss.py | https://github.com/Zzh-tju/DIoU | 源码实现 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/necks/nasfcos_fpn.py | SCRFD_for_PyTorch/mmdet/models/necks/nasfcos_fpn.py | https://arxiv.org/abs/1906.04423 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/dense_heads/paa_head.py | SCRFD_for_PyTorch/mmdet/models/dense_heads/paa_head.py | https://github.com/kkhoot/PAA/issues/9 | 相关说明 | -| 开发引入 | / | SCRFD_for_PyTorch/mmdet/core/bbox/coder/yolo_bbox_coder.py | https://arxiv.org/abs/1506.02640 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/detectors/fcos.py | SCRFD_for_PyTorch/mmdet/models/detectors/fcos.py | https://arxiv.org/abs/1904.01355 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/losses/iou_loss.py | SCRFD_for_PyTorch/mmdet/models/losses/iou_loss.py | https://arxiv.org/abs/1902.09630 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/losses/focal_loss.py | SCRFD_for_PyTorch/mmdet/models/losses/gaussian_focal_loss.py | https://arxiv.org/abs/1708.02002 | 论文地址 | -| 开发引入 | / | SCRFD_for_PyTorch/mmdet/datasets/pipelines/instaboost.py | https://github.com/GothicAi/Instaboost | 源码实现 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/detectors/fsaf.py | SCRFD_for_PyTorch/mmdet/core/bbox/coder/tblr_bbox_coder.py | https://arxiv.org/abs/1903.00621 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/losses/gfocal_loss.py | SCRFD_for_PyTorch/mmdet/models/dense_heads/scrfd_head.py | https://arxiv.org/abs/2006.04388 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/dense_heads/corner_head.py | SCRFD_for_PyTorch/mmdet/models/dense_heads/corner_head.py | https://github.com/princeton-vl/CornerNet/blob/master/models/py_utils/ | 源码实现 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/datasets/pipelines/transforms.py | SCRFD_for_PyTorch/mmdet/datasets/pipelines/transforms.py | https://arxiv.org/abs/1708.04552 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/roi_heads/roi_extractors/generic_roi_extractor.py | 
SCRFD_for_PyTorch/mmdet/models/roi_heads/roi_extractors/generic_roi_extractor.py | https://arxiv.org/abs/2004.13665 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/core/bbox/assigners/atss_assigner.py | SCRFD_for_PyTorch/mmdet/core/bbox/assigners/atss_assigner.py | https://github.com/sfzhang15/ATSS/blob/master/atss_core/modeling/rpn/atss/loss.py | 源码实现 | -| 开发引入 | / | SCRFD_for_PyTorch/mmdet/datasets/dataset_wrappers.py | https://github.com/facebookresearch/detectron2/blob/41d475b75a230221e21d9cac5d69655e3415e3a4/detectron2/data/samplers/distributed_sampler.py#L57 | 源码实现 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/roi_heads/mask_scoring_roi_head.py | SCRFD_for_PyTorch/mmdet/models/detectors/mask_scoring_rcnn.py | https://arxiv.org/abs/1903.00241 | 论文地址 | -| 开发引入 | / | SCRFD_for_PyTorch/mmcv_need/optimizer.py | https://arxiv.org/abs/1710.03740 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/necks/nas_fpn.py | SCRFD_for_PyTorch/mmdet/models/necks/nas_fpn.py | https://arxiv.org/abs/1904.07392 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/detectors/atss.py | SCRFD_for_PyTorch/mmdet/models/dense_heads/atss_head.py | https://arxiv.org/abs/1912.02424 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/core/bbox/demodata.py | SCRFD_for_PyTorch/mmdet/core/bbox/demodata.py | https://gitlab.kitware.com/computer-vision/kwimage/blob/master/kwimage/structs/boxes.py#L1390 | 源码实现 | -| 开发引入 | / | SCRFD_for_PyTorch/mmdet/models/dense_heads/fcos_head.py | https://github.com/tianzhi0549/FCOS | 源码实现 | -| 开发引入 | / | SCRFD_for_PyTorch/mmdet/datasets/cityscapes.py | https://github.com/mcordts/cityscapesScripts/blob/master/cityscapesscripts/evaluation/evalInstanceLevelSemanticLabeling.py | 源码实现 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/dense_heads/paa_head.py | SCRFD_for_PyTorch/mmdet/models/dense_heads/paa_head.py | https://arxiv.org/abs/2007.08103 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/losses/gfocal_loss.py | SCRFD_for_PyTorch/mmdet/models/losses/gfocal_loss.py | https://arxiv.org/abs/2006.04388 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/losses/gaussian_focal_loss.py | SCRFD_for_PyTorch/mmdet/models/detectors/cornernet.py | https://arxiv.org/abs/1808.01244 | 论文地址 | -| 开发引入 | / | SCRFD_for_PyTorch/mmdet/models/losses/ae_loss.py | https://github.com/princeton-vl/CornerNet/blob/master/models/py_utils/kp_utils.py#L180 | 源码实现 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/losses/iou_loss.py | SCRFD_for_PyTorch/mmdet/models/losses/iou_loss.py | https://github.com/Zzh-tju/CIoU | 源码实现 | -| 开发引入 | / | SCRFD_for_PyTorch/mmdet/models/losses/gaussian_focal_loss.py | https://github.com/princeton-vl/CornerNet/blob/master/models/py_utils/kp_utils.py#L152 | 源码实现 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/datasets/wider_face.py | SCRFD_for_PyTorch/mmdet/datasets/wider_face.py | https://github.com/sovrasov/wider-face-pascal-voc-annotations | 源码实现 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/detectors/detr.py | 
SCRFD_for_PyTorch/mmdet/models/detectors/detr.py | https://arxiv.org/pdf/2005.12872 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/roi_heads/point_rend_roi_head.py | SCRFD_for_PyTorch/mmdet/models/detectors/point_rend.py | https://arxiv.org/abs/1912.08193 | 论文地址 | -| 开发引入 | / | SCRFD_for_PyTorch/mmdet/models/dense_heads/guided_anchor_head.py | https://arxiv.org/abs/1901.03278 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/losses/gaussian_focal_loss.py | SCRFD_for_PyTorch/mmdet/models/losses/ae_loss.py | https://arxiv.org/abs/1808.01244 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/necks/nasfcos_fpn.py | SCRFD_for_PyTorch/mmdet/models/dense_heads/nasfcos_head.py | https://arxiv.org/abs/1906.04423 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/necks/pafpn.py | SCRFD_for_PyTorch/mmdet/models/necks/pafpn.py | https://arxiv.org/abs/1803.01534 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/losses/balanced_l1_loss.py | SCRFD_for_PyTorch/mmdet/models/losses/balanced_l1_loss.py | https://arxiv.org/pdf/1904.02701.pdf | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/core/bbox/samplers/ohem_sampler.py | SCRFD_for_PyTorch/mmdet/core/bbox/samplers/ohem_sampler.py | https://arxiv.org/abs/1604.03540 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/backbones/hourglass.py | SCRFD_for_PyTorch/mmdet/models/backbones/hourglass.py | https://arxiv.org/abs/1603.06937 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/losses/gaussian_focal_loss.py | SCRFD_for_PyTorch/mmdet/models/dense_heads/corner_head.py | https://arxiv.org/abs/1808.01244 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/detectors/detr.py | SCRFD_for_PyTorch/mmdet/models/utils/transformer.py | https://arxiv.org/pdf/2005.12872 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/detectors/fsaf.py | SCRFD_for_PyTorch/mmdet/models/detectors/fsaf.py | https://arxiv.org/abs/1903.00621 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/detectors/mask_rcnn.py | SCRFD_for_PyTorch/mmdet/models/detectors/mask_rcnn.py | https://arxiv.org/abs/1703.06870 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/detectors/atss.py | SCRFD_for_PyTorch/mmdet/models/detectors/atss.py | https://arxiv.org/abs/1912.02424 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/roi_heads/mask_heads/fcn_mask_head.py | SCRFD_for_PyTorch/mmdet/models/roi_heads/mask_heads/fcn_mask_head.py | https://github.com/facebookresearch/detectron2/ | 源码实现 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/losses/focal_loss.py | SCRFD_for_PyTorch/mmdet/models/losses/focal_loss.py | https://arxiv.org/abs/1708.02002 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/detectors/reppoints_detector.py | SCRFD_for_PyTorch/mmdet/models/detectors/reppoints_detector.py | 
https://arxiv.org/pdf/1904.11490 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/detectors/fcos.py | SCRFD_for_PyTorch/mmdet/models/dense_heads/fcos_head.py | https://arxiv.org/abs/1904.01355 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/core/bbox/demodata.py | SCRFD_for_PyTorch/mmdet/core/bbox/demodata.py | https://gitlab.kitware.com/computer-vision/kwarray/blob/master/kwarray/util_random.py#L270 | 源码实现 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/roi_heads/point_rend_roi_head.py | SCRFD_for_PyTorch/mmdet/models/roi_heads/point_rend_roi_head.py | https://arxiv.org/abs/1912.08193 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/roi_heads/grid_roi_head.py | SCRFD_for_PyTorch/mmdet/models/detectors/grid_rcnn.py | https://arxiv.org/abs/1811.12030 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/roi_heads/mask_heads/fcn_mask_head.py | SCRFD_for_PyTorch/mmdet/datasets/coco.py | https://github.com/facebookresearch/detectron2/ | 源码实现 | -| 开发引入 | / | SCRFD_for_PyTorch/mmcv_need/optimizer.py | https://pytorch.org/docs/stable/amp.html#torch.cuda.amp.GradScaler | 相关说明 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/detectors/faster_rcnn.py | SCRFD_for_PyTorch/mmdet/models/detectors/faster_rcnn.py | https://arxiv.org/abs/1506.01497 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/roi_heads/double_roi_head.py | SCRFD_for_PyTorch/mmdet/models/roi_heads/double_roi_head.py | https://arxiv.org/abs/1904.06493 | 论文地址 | -| 开发引入 | / | SCRFD_for_PyTorch/mmdet/models/roi_heads/mask_heads/mask_point_head.py | https://github.com/facebookresearch/detectron2/tree/master/projects/PointRend/point_head/point_head.py | 源码实现 | -| 开发引入 | / | SCRFD_for_PyTorch/mmdet/models/necks/fpn_carafe.py | https://arxiv.org/abs/1905.02188 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/detectors/yolact.py | SCRFD_for_PyTorch/mmdet/models/detectors/yolact.py | https://arxiv.org/abs/1904.02689 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/roi_heads/roi_extractors/single_level_roi_extractor.py | SCRFD_for_PyTorch/mmdet/models/necks/fpn.py | https://arxiv.org/abs/1612.03144 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/losses/iou_loss.py | SCRFD_for_PyTorch/mmdet/models/losses/iou_loss.py | https://arxiv.org/abs/2005.03572 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/roi_heads/mask_heads/fcn_mask_head.py | SCRFD_for_PyTorch/mmdet/datasets/lvis.py | https://github.com/facebookresearch/detectron2/ | 源码实现 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/losses/balanced_l1_loss.py | SCRFD_for_PyTorch/mmdet/core/bbox/samplers/iou_balanced_neg_sampler.py | https://arxiv.org/pdf/1904.02701.pdf | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/dense_heads/paa_head.py | SCRFD_for_PyTorch/mmdet/models/dense_heads/paa_head.py | https://github.com/kkhoot/PAA/blob/master/paa_core | 源码实现 | -| 开源代码引入 | 
https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/roi_heads/htc_roi_head.py | SCRFD_for_PyTorch/mmdet/models/roi_heads/htc_roi_head.py | https://arxiv.org/abs/1901.07518 | 论文地址 | -| 开发引入 | / | SCRFD_for_PyTorch/mmdet/models/roi_heads/point_rend_roi_head.py | https://github.com/facebookresearch/detectron2/tree/master/projects/PointRend | 源码实现 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/utils/gaussian_target.py | SCRFD_for_PyTorch/mmdet/models/utils/gaussian_target.py | https://github.com/princeton-vl/CornerNet-Lite/blob/master/core/sample/ | 源码实现 | -| 开发引入 | / | SCRFD_for_PyTorch/mmdet/core/bbox/coder/bucketing_bbox_coder.py | https://arxiv.org/abs/1912.04260 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/roi_heads/roi_extractors/single_level_roi_extractor.py | SCRFD_for_PyTorch/mmdet/models/roi_heads/roi_extractors/single_level_roi_extractor.py | https://arxiv.org/abs/1612.03144 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/detectors/fsaf.py | SCRFD_for_PyTorch/mmdet/models/dense_heads/fsaf_head.py | https://arxiv.org/abs/1903.00621 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/core/mask/structures.py | SCRFD_for_PyTorch/mmdet/core/mask/structures.py | https://stackoverflow.com/questions/24467972/calculate-area-of-polygon-given-x-y-coordinates | 相关说明 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/losses/gaussian_focal_loss.py | SCRFD_for_PyTorch/mmdet/models/losses/gaussian_focal_loss.py | https://arxiv.org/abs/1808.01244 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/roi_heads/mask_heads/grid_head.py | SCRFD_for_PyTorch/mmdet/models/detectors/grid_rcnn.py | https://arxiv.org/abs/1906.05688 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/core/mask/structures.py | SCRFD_for_PyTorch/mmdet/core/mask/structures.py | https://github.com/facebookresearch/detectron2/blob/ffff8acc35ea88ad1cb1806ab0f00b4c1c5dbfd9/detectron2/structures/masks.py#L387 | 源码实现 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/datasets/builder.py | SCRFD_for_PyTorch/mmdet/datasets/builder.py | https://github.com/pytorch/pytorch/issues/973 | 相关说明 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/core/bbox/coder/legacy_delta_xywh_bbox_coder.py | SCRFD_for_PyTorch/mmdet/core/bbox/coder/legacy_delta_xywh_bbox_coder.py | https://arxiv.org/abs/1311.2524 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/losses/iou_loss.py | SCRFD_for_PyTorch/mmdet/models/losses/iou_loss.py | https://arxiv.org/abs/1711.00164 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/detectors/fovea.py | SCRFD_for_PyTorch/mmdet/models/detectors/fovea.py | https://arxiv.org/abs/1904.03797 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/dense_heads/ssd_head.py | SCRFD_for_PyTorch/mmdet/models/dense_heads/ssd_head.py | https://arxiv.org/abs/1512.02325 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/necks/hrfpn.py | 
SCRFD_for_PyTorch/mmdet/models/backbones/hrnet.py | https://arxiv.org/abs/1904.04514 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/dense_heads/free_anchor_retina_head.py | SCRFD_for_PyTorch/mmdet/models/dense_heads/free_anchor_retina_head.py | https://arxiv.org/abs/1909.02466 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/backbones/regnet.py | SCRFD_for_PyTorch/mmdet/models/backbones/regnet.py | https://arxiv.org/abs/2003.13678 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/roi_heads/cascade_roi_head.py | SCRFD_for_PyTorch/mmdet/models/roi_heads/cascade_roi_head.py | https://arxiv.org/abs/1712.00726 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/detectors/detr.py | SCRFD_for_PyTorch/mmdet/models/dense_heads/transformer_head.py | https://arxiv.org/pdf/2005.12872 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/dense_heads/fcos_head.py | SCRFD_for_PyTorch/mmdet/models/dense_heads/fcos_head.py | https://github.com/tianzhi0549/FCOS/issues/89#issuecomment-516877042 | 相关说明 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/datasets/pipelines/transforms.py | SCRFD_for_PyTorch/mmdet/datasets/pipelines/transforms.py | https://github.com/bethgelab/imagecorruptions | 源码实现 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/detectors/cascade_rcnn.py | SCRFD_for_PyTorch/mmdet/models/detectors/cascade_rcnn.py | https://arxiv.org/abs/1906.09756 | 论文地址 | -| 开发引入 | / | SCRFD_for_PyTorch/mmdet/datasets/cityscapes.py | https://github.com/facebookresearch/detectron2/blob/master/detectron2/data/datasets/cityscapes.py | 源码实现 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/necks/hrfpn.py | SCRFD_for_PyTorch/mmdet/models/necks/hrfpn.py | https://arxiv.org/abs/1904.04514 | 论文地址 | -| 开发引入 | / | SCRFD_for_PyTorch/mmdet/models/necks/bfp.py | https://arxiv.org/abs/1904.02701 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/losses/focal_loss.py | SCRFD_for_PyTorch/mmdet/models/detectors/retinanet.py | https://arxiv.org/abs/1708.02002 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/utils/util_mixins.py | SCRFD_for_PyTorch/mmdet/utils/util_mixins.py | https://github.com/Erotemic/ubelt | 源码实现 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/losses/varifocal_loss.py | SCRFD_for_PyTorch/mmdet/models/losses/varifocal_loss.py | https://arxiv.org/abs/2008.13367 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/detectors/nasfcos.py | SCRFD_for_PyTorch/mmdet/models/detectors/nasfcos.py | https://arxiv.org/abs/1906.0442 | 论文地址 | -| 开发引入 | / | SCRFD_for_PyTorch/mmdet/models/dense_heads/paa_head.py | https://github.com/kkhoot/PAA/issues/8 | 相关说明 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/detectors/yolact.py | SCRFD_for_PyTorch/mmdet/models/dense_heads/yolact_head.py | https://arxiv.org/abs/1904.02689 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/detectors/paa.py | 
SCRFD_for_PyTorch/mmdet/models/detectors/paa.py | https://arxiv.org/pdf/2007.08103.pdf | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/losses/varifocal_loss.py | SCRFD_for_PyTorch/mmdet/models/detectors/vfnet.py | https://arxiv.org/abs/2008.13367 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/necks/rfp.py | SCRFD_for_PyTorch/mmdet/models/necks/rfp.py | https://arxiv.org/pdf/2006.02334.pdf | 论文地址 | -| 开发引入 | / | SCRFD_for_PyTorch/mmdet/models/roi_heads/bbox_heads/sabl_head.py | https://arxiv.org/abs/1912.04260 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/losses/varifocal_loss.py | SCRFD_for_PyTorch/mmdet/models/dense_heads/vfnet_head.py | https://arxiv.org/abs/2008.13367 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/roi_heads/mask_scoring_roi_head.py | SCRFD_for_PyTorch/mmdet/models/roi_heads/mask_scoring_roi_head.py | https://arxiv.org/abs/1903.00241 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/core/bbox/coder/legacy_delta_xywh_bbox_coder.py | SCRFD_for_PyTorch/mmdet/core/bbox/coder/delta_xywh_bbox_coder.py | https://arxiv.org/abs/1311.2524 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/detectors/fovea.py | SCRFD_for_PyTorch/mmdet/models/dense_heads/fovea_head.py | https://arxiv.org/abs/1904.03797 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/datasets/pipelines/auto_augment.py | SCRFD_for_PyTorch/mmdet/datasets/pipelines/auto_augment.py | https://arxiv.org/pdf/1906.11172 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/datasets/pipelines/instaboost.py | SCRFD_for_PyTorch/mmdet/datasets/pipelines/instaboost.py | https://arxiv.org/abs/1908.07801 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/dense_heads/retina_head.py | SCRFD_for_PyTorch/mmdet/models/dense_heads/retina_head.py | https://arxiv.org/pdf/1708.02002.pdf | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/detectors/trident_faster_rcnn.py | SCRFD_for_PyTorch/mmdet/models/detectors/trident_faster_rcnn.py | https://arxiv.org/abs/1901.01892 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/backbones/resnet.py | SCRFD_for_PyTorch/mmdet/models/backbones/resnet.py | https://arxiv.org/pdf/1812.01187.pdf | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/core/evaluation/widerface.py | SCRFD_for_PyTorch/mmdet/core/evaluation/widerface.py | tianhengcheng@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/roi_heads/htc_roi_head.py | SCRFD_for_PyTorch/mmdet/models/detectors/htc.py | https://arxiv.org/abs/1901.07518 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/datasets/lvis.py | SCRFD_for_PyTorch/mmdet/datasets/lvis.py | http://images.cocodataset.org/train2017/000000391895.jpg | 图片地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/datasets/pipelines/transforms.py | SCRFD_for_PyTorch/mmdet/datasets/pipelines/transforms.py | 
https://albumentations.readthedocs.io | 相关说明 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/losses/gfocal_loss.py | SCRFD_for_PyTorch/mmdet/models/dense_heads/gfl_head.py | https://arxiv.org/abs/2006.04388 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/losses/ghm_loss.py | SCRFD_for_PyTorch/mmdet/models/losses/ghm_loss.py | https://arxiv.org/abs/1811.05181 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/dense_heads/centripetal_head.py | SCRFD_for_PyTorch/mmdet/models/dense_heads/centripetal_head.py | https://arxiv.org/abs/2003.09119 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/roi_heads/roi_extractors/single_level_roi_extractor.py | SCRFD_for_PyTorch/mmdet/models/necks/lfpn.py | https://arxiv.org/abs/1612.03144 | 论文地址 | -| 开发引入 | / | SCRFD_for_PyTorch/mmdet/models/dense_heads/sabl_retina_head.py | https://arxiv.org/abs/1912.04260 | 论文地址 | -| 开发引入 | / | SCRFD_for_PyTorch/mmcv_need/epoch_based_runner.py | https://github.com/open-mmlab/mmcv/pull/1108 | 源码实现 | -| 开发引入 | / | SCRFD_for_PyTorch/mmdet/models/losses/ae_loss.py | https://arxiv.org/abs/1611.05424 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/detectors/fast_rcnn.py | SCRFD_for_PyTorch/mmdet/models/detectors/fast_rcnn.py | https://arxiv.org/abs/1504.08083 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/losses/iou_loss.py | SCRFD_for_PyTorch/mmdet/models/losses/iou_loss.py | https://arxiv.org/abs/1911.08287 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/roi_heads/dynamic_roi_head.py | SCRFD_for_PyTorch/mmdet/models/roi_heads/dynamic_roi_head.py | https://arxiv.org/abs/2004.06002 | 论文地址 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/setup.py | SCRFD_for_PyTorch/setup.py | http://setuptools.readthedocs.io/en/latest/setuptools.html#declaring-platform-specific-dependencies | 相关说明 | -| 开源代码引入 | https://github.com/deepinsight/insightface/tree/master/detection/scrfd/mmdet/models/detectors/yolact.py | SCRFD_for_PyTorch/mmdet/core/post_processing/bbox_nms.py | https://arxiv.org/abs/1904.02689 | 论文地址 | +| 文件位置 | 公网地址 | 公网地址用途 | +|-----------------------------------------------------------------------------------------|------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SCRFD_for_PyTorch/mmdet/datasets/lvis.py | https://labs.criteo.com/2014/02/kaggle-display-advertising-challenge-dataset | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SCRFD_for_PyTorch/setup.py | openmmlab@gmail.com | 作者邮箱 | \ No newline at end of file diff --git a/PyTorch/built-in/cv/detection/SSD_for_PyTorch/public_address_statement.md b/PyTorch/built-in/cv/detection/SSD_for_PyTorch/public_address_statement.md index 18e7d1db5e39cd83f7350e7aa15bd22318faa024..653edc21c4f2657eca9363d8164500b1b6b661e5 100644 --- a/PyTorch/built-in/cv/detection/SSD_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/cv/detection/SSD_for_PyTorch/public_address_statement.md @@ -1,1037 +1,625 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | 
-|--------|-------------------------------------------------------------------------------------------------------------------------------|--------------------------------------------------------------------------------------------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|---| -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/.circleci/config.yml | SSD_for_PyTorch/.circleci/config.yml | https://download.pytorch.org/whl/torch_stable.html | 下载三方库 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/.circleci/config.yml | SSD_for_PyTorch/.circleci/config.yml | https://download.openmmlab.com/mmcv/dist/cpu/torch<< parameters.torch >>/index.html | 下载三方库 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/.circleci/config.yml | SSD_for_PyTorch/.circleci/config.yml | https://github.com/cocodataset/panopticapi.git | 下载三方库 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/.circleci/config.yml | SSD_for_PyTorch/.circleci/config.yml | https://download.pytorch.org/whl/torch_stable.html | 下载三方库 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/.circleci/config.yml | SSD_for_PyTorch/.circleci/config.yml | https://download.openmmlab.com/mmcv/dist/cu101/torch1.6.0/index.html | 下载三方库 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/.circleci/config.yml | SSD_for_PyTorch/.circleci/config.yml | https://github.com/cocodataset/panopticapi.git | 下载三方库 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/.dev_scripts/gather_models.py | SSD_for_PyTorch/.dev_scripts/gather_models.py | https://download.openmmlab.com/mmdetection/v2.0/ | 开源代码下载 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/.pre-commit-config.yaml | SSD_for_PyTorch/.pre-commit-config.yaml | https://github.com/PyCQA/flake8 | 下载三方库 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/.pre-commit-config.yaml | SSD_for_PyTorch/.pre-commit-config.yaml | https://github.com/PyCQA/isort | 下载三方库 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/.pre-commit-config.yaml | SSD_for_PyTorch/.pre-commit-config.yaml | https://github.com/pre-commit/mirrors-yapf | 下载三方库 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/.pre-commit-config.yaml | SSD_for_PyTorch/.pre-commit-config.yaml | https://github.com/pre-commit/pre-commit-hooks | 下载三方库 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/.pre-commit-config.yaml | SSD_for_PyTorch/.pre-commit-config.yaml | https://github.com/codespell-project/codespell | 下载三方库 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/.pre-commit-config.yaml | SSD_for_PyTorch/.pre-commit-config.yaml | https://github.com/executablebooks/mdformat | 下载三方库 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/.pre-commit-config.yaml | SSD_for_PyTorch/.pre-commit-config.yaml | https://github.com/myint/docformatter | 下载三方库 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/.pre-commit-config.yaml | SSD_for_PyTorch/.pre-commit-config.yaml | https://github.com/open-mmlab/pre-commit-hooks | 下载三方库 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/CITATION.cff | SSD_for_PyTorch/CITATION.cff | 
https://github.com/open-mmlab/mmdetection | 开源代码地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/atss/metafile.yml | SSD_for_PyTorch/configs/atss/metafile.yml | https://arxiv.org/abs/1912.02424 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/atss/metafile.yml | SSD_for_PyTorch/configs/atss/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/detectors/atss.py#L6 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/atss/metafile.yml | SSD_for_PyTorch/configs/atss/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/atss/atss_r50_fpn_1x_coco/atss_r50_fpn_1x_coco_20200209-985f7bd0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/atss/metafile.yml | SSD_for_PyTorch/configs/atss/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/atss/atss_r101_fpn_1x_coco/atss_r101_fpn_1x_20200825-dfcadd6f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/autoassign/metafile.yml | SSD_for_PyTorch/configs/autoassign/metafile.yml | https://arxiv.org/abs/2007.03496 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/autoassign/metafile.yml | SSD_for_PyTorch/configs/autoassign/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.12.0/mmdet/models/detectors/autoassign.py#L6 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/autoassign/metafile.yml | SSD_for_PyTorch/configs/autoassign/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/autoassign/auto_assign_r50_fpn_1x_coco/auto_assign_r50_fpn_1x_coco_20210413_115540-5e17991f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/carafe/metafile.yml | SSD_for_PyTorch/configs/carafe/metafile.yml | https://arxiv.org/abs/1905.02188 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/carafe/metafile.yml | SSD_for_PyTorch/configs/carafe/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.12.0/mmdet/models/necks/fpn_carafe.py#L11 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/carafe/metafile.yml | SSD_for_PyTorch/configs/carafe/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/carafe/faster_rcnn_r50_fpn_carafe_1x_coco/faster_rcnn_r50_fpn_carafe_1x_coco_bbox_mAP-0.386_20200504_175733-385a75b7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/carafe/metafile.yml | SSD_for_PyTorch/configs/carafe/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/carafe/mask_rcnn_r50_fpn_carafe_1x_coco/mask_rcnn_r50_fpn_carafe_1x_coco_bbox_mAP-0.393__segm_mAP-0.358_20200503_135957-8687f195.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/cascade_rcnn/metafile.yml | SSD_for_PyTorch/configs/cascade_rcnn/metafile.yml | http://dx.doi.org/10.1109/tpami.2019.2956516 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/cascade_rcnn/metafile.yml | SSD_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/detectors/cascade_rcnn.py#L6 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/cascade_rcnn/metafile.yml | SSD_for_PyTorch/configs/cascade_rcnn/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_rcnn_r50_caffe_fpn_1x_coco/cascade_rcnn_r50_caffe_fpn_1x_coco_bbox_mAP-0.404_20200504_174853-b857be87.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/cascade_rcnn/metafile.yml | SSD_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_rcnn_r50_fpn_1x_coco/cascade_rcnn_r50_fpn_1x_coco_20200316-3dc56deb.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/cascade_rcnn/metafile.yml | SSD_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_rcnn_r50_fpn_20e_coco/cascade_rcnn_r50_fpn_20e_coco_bbox_mAP-0.41_20200504_175131-e9872a90.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/cascade_rcnn/metafile.yml | SSD_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_rcnn_r101_caffe_fpn_1x_coco/cascade_rcnn_r101_caffe_fpn_1x_coco_bbox_mAP-0.423_20200504_175649-cab8dbd5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/cascade_rcnn/metafile.yml | SSD_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_rcnn_r101_fpn_1x_coco/cascade_rcnn_r101_fpn_1x_coco_20200317-0b6a2fbf.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/cascade_rcnn/metafile.yml | SSD_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_rcnn_r101_fpn_20e_coco/cascade_rcnn_r101_fpn_20e_coco_bbox_mAP-0.425_20200504_231812-5057dcc5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/cascade_rcnn/metafile.yml | SSD_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_rcnn_x101_32x4d_fpn_1x_coco/cascade_rcnn_x101_32x4d_fpn_1x_coco_20200316-95c2deb6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/cascade_rcnn/metafile.yml | SSD_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_rcnn_x101_32x4d_fpn_20e_coco/cascade_rcnn_x101_32x4d_fpn_20e_coco_20200906_134608-9ae0a720.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/cascade_rcnn/metafile.yml | SSD_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_rcnn_x101_64x4d_fpn_1x_coco/cascade_rcnn_x101_64x4d_fpn_1x_coco_20200515_075702-43ce6a30.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/cascade_rcnn/metafile.yml | SSD_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_rcnn_x101_64x4d_fpn_20e_coco/cascade_rcnn_x101_64x4d_fpn_20e_coco_20200509_224357-051557b1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/cascade_rcnn/metafile.yml | SSD_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_r50_caffe_fpn_1x_coco/cascade_mask_rcnn_r50_caffe_fpn_1x_coco_bbox_mAP-0.412__segm_mAP-0.36_20200504_174659-5004b251.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/cascade_rcnn/metafile.yml | SSD_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_r50_fpn_1x_coco/cascade_mask_rcnn_r50_fpn_1x_coco_20200203-9d4dcb24.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/cascade_rcnn/metafile.yml | SSD_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_r50_fpn_20e_coco/cascade_mask_rcnn_r50_fpn_20e_coco_bbox_mAP-0.419__segm_mAP-0.365_20200504_174711-4af8e66e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/cascade_rcnn/metafile.yml | SSD_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_r101_caffe_fpn_1x_coco/cascade_mask_rcnn_r101_caffe_fpn_1x_coco_bbox_mAP-0.432__segm_mAP-0.376_20200504_174813-5c1e9599.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/cascade_rcnn/metafile.yml | SSD_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_r101_fpn_1x_coco/cascade_mask_rcnn_r101_fpn_1x_coco_20200203-befdf6ee.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/cascade_rcnn/metafile.yml | SSD_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_r101_fpn_20e_coco/cascade_mask_rcnn_r101_fpn_20e_coco_bbox_mAP-0.434__segm_mAP-0.378_20200504_174836-005947da.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/cascade_rcnn/metafile.yml | SSD_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_x101_32x4d_fpn_1x_coco/cascade_mask_rcnn_x101_32x4d_fpn_1x_coco_20200201-0f411b1f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/cascade_rcnn/metafile.yml | SSD_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_x101_32x4d_fpn_20e_coco/cascade_mask_rcnn_x101_32x4d_fpn_20e_coco_20200528_083917-ed1f4751.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/cascade_rcnn/metafile.yml | SSD_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_x101_64x4d_fpn_1x_coco/cascade_mask_rcnn_x101_64x4d_fpn_1x_coco_20200203-9a2db89d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/cascade_rcnn/metafile.yml | SSD_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_x101_64x4d_fpn_20e_coco/cascade_mask_rcnn_x101_64x4d_fpn_20e_coco_20200512_161033-bdb5126a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/cascade_rcnn/metafile.yml | SSD_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_r50_caffe_fpn_mstrain_3x_coco/cascade_mask_rcnn_r50_caffe_fpn_mstrain_3x_coco_20210707_002651-6e29b3a6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/cascade_rcnn/metafile.yml | 
SSD_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_r50_fpn_mstrain_3x_coco/cascade_mask_rcnn_r50_fpn_mstrain_3x_coco_20210628_164719-5bdc3824.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/cascade_rcnn/metafile.yml | SSD_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_r101_caffe_fpn_mstrain_3x_coco/cascade_mask_rcnn_r101_caffe_fpn_mstrain_3x_coco_20210707_002620-a5bd2389.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/cascade_rcnn/metafile.yml | SSD_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_r101_fpn_mstrain_3x_coco/cascade_mask_rcnn_r101_fpn_mstrain_3x_coco_20210628_165236-51a2d363.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/cascade_rcnn/metafile.yml | SSD_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_x101_32x4d_fpn_mstrain_3x_coco/cascade_mask_rcnn_x101_32x4d_fpn_mstrain_3x_coco_20210706_225234-40773067.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/cascade_rcnn/metafile.yml | SSD_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_x101_32x8d_fpn_mstrain_3x_coco/cascade_mask_rcnn_x101_32x8d_fpn_mstrain_3x_coco_20210719_180640-9ff7e76f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/cascade_rcnn/metafile.yml | SSD_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_x101_64x4d_fpn_mstrain_3x_coco/cascade_mask_rcnn_x101_64x4d_fpn_mstrain_3x_coco_20210719_210311-d3e64ba0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/cascade_rpn/metafile.yml | SSD_for_PyTorch/configs/cascade_rpn/metafile.yml | https://arxiv.org/abs/1909.06720 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/cascade_rpn/metafile.yml | SSD_for_PyTorch/configs/cascade_rpn/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.8.0/mmdet/models/dense_heads/cascade_rpn_head.py#L538 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/cascade_rpn/metafile.yml | SSD_for_PyTorch/configs/cascade_rpn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rpn/crpn_fast_rcnn_r50_caffe_fpn_1x_coco/crpn_fast_rcnn_r50_caffe_fpn_1x_coco-cb486e66.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/cascade_rpn/metafile.yml | SSD_for_PyTorch/configs/cascade_rpn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rpn/crpn_faster_rcnn_r50_caffe_fpn_1x_coco/crpn_faster_rcnn_r50_caffe_fpn_1x_coco-c8283cca.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/centernet/metafile.yml | SSD_for_PyTorch/configs/centernet/metafile.yml | https://arxiv.org/abs/1904.07850 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/centernet/metafile.yml | SSD_for_PyTorch/configs/centernet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.13.0/mmdet/models/detectors/centernet.py#L10 | 代码链接 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/centernet/metafile.yml | SSD_for_PyTorch/configs/centernet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/centernet/centernet_resnet18_dcnv2_140e_coco/centernet_resnet18_dcnv2_140e_coco_20210702_155131-c8cd631f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/centernet/metafile.yml | SSD_for_PyTorch/configs/centernet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/centernet/centernet_resnet18_140e_coco/centernet_resnet18_140e_coco_20210705_093630-bb5b3bf7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/centripetalnet/metafile.yml | SSD_for_PyTorch/configs/centripetalnet/metafile.yml | https://arxiv.org/abs/2003.09119 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/centripetalnet/metafile.yml | SSD_for_PyTorch/configs/centripetalnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.5.0/mmdet/models/detectors/cornernet.py#L9 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/centripetalnet/metafile.yml | SSD_for_PyTorch/configs/centripetalnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/centripetalnet/centripetalnet_hourglass104_mstest_16x6_210e_coco/centripetalnet_hourglass104_mstest_16x6_210e_coco_20200915_204804-3ccc61e5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/cityscapes/faster_rcnn_r50_fpn_1x_cityscapes.py | SSD_for_PyTorch/configs/cityscapes/faster_rcnn_r50_fpn_1x_cityscapes.py | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_fpn_1x_coco/faster_rcnn_r50_fpn_1x_coco_20200130-047c8118.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/cityscapes/mask_rcnn_r50_fpn_1x_cityscapes.py | SSD_for_PyTorch/configs/cityscapes/mask_rcnn_r50_fpn_1x_cityscapes.py | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r50_fpn_1x_coco/mask_rcnn_r50_fpn_1x_coco_20200205-d4b0c5d6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/convnext/cascade_mask_rcnn_convnext-s_p4_w7_fpn_giou_4conv1f_fp16_ms-crop_3x_coco.py | SSD_for_PyTorch/configs/convnext/cascade_mask_rcnn_convnext-s_p4_w7_fpn_giou_4conv1f_fp16_ms-crop_3x_coco.py | https://download.openmmlab.com/mmclassification/v0/convnext/downstream/convnext-small_3rdparty_32xb128-noema_in1k_20220301-303e75e3.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/convnext/cascade_mask_rcnn_convnext-t_p4_w7_fpn_giou_4conv1f_fp16_ms-crop_3x_coco.py | SSD_for_PyTorch/configs/convnext/cascade_mask_rcnn_convnext-t_p4_w7_fpn_giou_4conv1f_fp16_ms-crop_3x_coco.py | https://download.openmmlab.com/mmclassification/v0/convnext/downstream/convnext-tiny_3rdparty_32xb128-noema_in1k_20220301-795e9634.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/convnext/mask_rcnn_convnext-t_p4_w7_fpn_fp16_ms-crop_3x_coco.py | SSD_for_PyTorch/configs/convnext/mask_rcnn_convnext-t_p4_w7_fpn_fp16_ms-crop_3x_coco.py | https://download.openmmlab.com/mmclassification/v0/convnext/downstream/convnext-tiny_3rdparty_32xb128-noema_in1k_20220301-795e9634.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/convnext/metafile.yml | SSD_for_PyTorch/configs/convnext/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/convnext/mask_rcnn_convnext-t_p4_w7_fpn_fp16_ms-crop_3x_coco/mask_rcnn_convnext-t_p4_w7_fpn_fp16_ms-crop_3x_coco_20220426_154953-050731f4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/convnext/metafile.yml | SSD_for_PyTorch/configs/convnext/metafile.yml | https://arxiv.org/abs/2201.03545 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/convnext/metafile.yml | SSD_for_PyTorch/configs/convnext/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.16.0/mmdet/models/backbones/swin.py#L465 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/convnext/metafile.yml | SSD_for_PyTorch/configs/convnext/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/convnext/cascade_mask_rcnn_convnext-t_p4_w7_fpn_giou_4conv1f_fp16_ms-crop_3x_coco/cascade_mask_rcnn_convnext-t_p4_w7_fpn_giou_4conv1f_fp16_ms-crop_3x_coco_20220509_204200-8f07c40b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/convnext/metafile.yml | SSD_for_PyTorch/configs/convnext/metafile.yml | https://arxiv.org/abs/2201.03545 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/convnext/metafile.yml | SSD_for_PyTorch/configs/convnext/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.16.0/mmdet/models/backbones/swin.py#L465 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/convnext/metafile.yml | SSD_for_PyTorch/configs/convnext/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/convnext/cascade_mask_rcnn_convnext-s_p4_w7_fpn_giou_4conv1f_fp16_ms-crop_3x_coco/cascade_mask_rcnn_convnext-s_p4_w7_fpn_giou_4conv1f_fp16_ms-crop_3x_coco_20220510_201004-3d24f5a4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/convnext/metafile.yml | SSD_for_PyTorch/configs/convnext/metafile.yml | https://arxiv.org/abs/2201.03545 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/convnext/metafile.yml | SSD_for_PyTorch/configs/convnext/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.16.0/mmdet/models/backbones/swin.py#L465 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/cornernet/metafile.yml | SSD_for_PyTorch/configs/cornernet/metafile.yml | https://arxiv.org/abs/1808.01244 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/cornernet/metafile.yml | SSD_for_PyTorch/configs/cornernet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.3.0/mmdet/models/detectors/cornernet.py#L9 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/cornernet/metafile.yml | SSD_for_PyTorch/configs/cornernet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cornernet/cornernet_hourglass104_mstest_10x5_210e_coco/cornernet_hourglass104_mstest_10x5_210e_coco_20200824_185720-5fefbf1c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/cornernet/metafile.yml | SSD_for_PyTorch/configs/cornernet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cornernet/cornernet_hourglass104_mstest_8x6_210e_coco/cornernet_hourglass104_mstest_8x6_210e_coco_20200825_150618-79b44c30.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/cornernet/metafile.yml | SSD_for_PyTorch/configs/cornernet/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/cornernet/cornernet_hourglass104_mstest_32x3_210e_coco/cornernet_hourglass104_mstest_32x3_210e_coco_20200819_203110-1efaea91.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/dcn/metafile.yml | SSD_for_PyTorch/configs/dcn/metafile.yml | https://arxiv.org/abs/1703.06211 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/dcn/metafile.yml | SSD_for_PyTorch/configs/dcn/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/ops/dcn/deform_conv.py#L15 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/dcn/metafile.yml | SSD_for_PyTorch/configs/dcn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/dcn/faster_rcnn_r50_fpn_dconv_c3-c5_1x_coco/faster_rcnn_r50_fpn_dconv_c3-c5_1x_coco_20200130-d68aed1e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/dcn/metafile.yml | SSD_for_PyTorch/configs/dcn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/dcn/faster_rcnn_r50_fpn_dpool_1x_coco/faster_rcnn_r50_fpn_dpool_1x_coco_20200307-90d3c01d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/dcn/metafile.yml | SSD_for_PyTorch/configs/dcn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/dcn/faster_rcnn_r101_fpn_dconv_c3-c5_1x_coco/faster_rcnn_r101_fpn_dconv_c3-c5_1x_coco_20200203-1377f13d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/dcn/metafile.yml | SSD_for_PyTorch/configs/dcn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/dcn/faster_rcnn_x101_32x4d_fpn_dconv_c3-c5_1x_coco/faster_rcnn_x101_32x4d_fpn_dconv_c3-c5_1x_coco_20200203-4f85c69c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/dcn/metafile.yml | SSD_for_PyTorch/configs/dcn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/dcn/mask_rcnn_r50_fpn_dconv_c3-c5_1x_coco/mask_rcnn_r50_fpn_dconv_c3-c5_1x_coco_20200203-4d9ad43b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/dcn/metafile.yml | SSD_for_PyTorch/configs/dcn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fp16/mask_rcnn_r50_fpn_fp16_dconv_c3-c5_1x_coco/mask_rcnn_r50_fpn_fp16_dconv_c3-c5_1x_coco_20210520_180247-c06429d2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/dcn/metafile.yml | SSD_for_PyTorch/configs/dcn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/dcn/mask_rcnn_r101_fpn_dconv_c3-c5_1x_coco/mask_rcnn_r101_fpn_dconv_c3-c5_1x_coco_20200216-a71f5bce.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/dcn/metafile.yml | SSD_for_PyTorch/configs/dcn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/dcn/cascade_rcnn_r50_fpn_dconv_c3-c5_1x_coco/cascade_rcnn_r50_fpn_dconv_c3-c5_1x_coco_20200130-2f1fca44.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/dcn/metafile.yml | SSD_for_PyTorch/configs/dcn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/dcn/cascade_rcnn_r101_fpn_dconv_c3-c5_1x_coco/cascade_rcnn_r101_fpn_dconv_c3-c5_1x_coco_20200203-3b2f0594.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/dcn/metafile.yml | SSD_for_PyTorch/configs/dcn/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/dcn/cascade_mask_rcnn_r50_fpn_dconv_c3-c5_1x_coco/cascade_mask_rcnn_r50_fpn_dconv_c3-c5_1x_coco_20200202-42e767a2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/dcn/metafile.yml | SSD_for_PyTorch/configs/dcn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/dcn/cascade_mask_rcnn_r101_fpn_dconv_c3-c5_1x_coco/cascade_mask_rcnn_r101_fpn_dconv_c3-c5_1x_coco_20200204-df0c5f10.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/dcn/metafile.yml | SSD_for_PyTorch/configs/dcn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/dcn/cascade_mask_rcnn_x101_32x4d_fpn_dconv_c3-c5_1x_coco/cascade_mask_rcnn_x101_32x4d_fpn_dconv_c3-c5_1x_coco-e75f90c8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/dcnv2/metafile.yml | SSD_for_PyTorch/configs/dcnv2/metafile.yml | https://arxiv.org/abs/1811.11168 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/dcnv2/metafile.yml | SSD_for_PyTorch/configs/dcnv2/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/ops/dcn/deform_conv.py#L15 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/dcnv2/metafile.yml | SSD_for_PyTorch/configs/dcnv2/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/dcn/faster_rcnn_r50_fpn_mdconv_c3-c5_1x_coco/faster_rcnn_r50_fpn_mdconv_c3-c5_1x_coco_20200130-d099253b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/dcnv2/metafile.yml | SSD_for_PyTorch/configs/dcnv2/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/dcn/faster_rcnn_r50_fpn_mdconv_c3-c5_group4_1x_coco/faster_rcnn_r50_fpn_mdconv_c3-c5_group4_1x_coco_20200130-01262257.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/dcnv2/metafile.yml | SSD_for_PyTorch/configs/dcnv2/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/dcn/faster_rcnn_r50_fpn_mdpool_1x_coco/faster_rcnn_r50_fpn_mdpool_1x_coco_20200307-c0df27ff.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/dcnv2/metafile.yml | SSD_for_PyTorch/configs/dcnv2/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/dcn/mask_rcnn_r50_fpn_mdconv_c3-c5_1x_coco/mask_rcnn_r50_fpn_mdconv_c3-c5_1x_coco_20200203-ad97591f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/dcnv2/metafile.yml | SSD_for_PyTorch/configs/dcnv2/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fp16/mask_rcnn_r50_fpn_fp16_mdconv_c3-c5_1x_coco/mask_rcnn_r50_fpn_fp16_mdconv_c3-c5_1x_coco_20210520_180434-cf8fefa5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/ddod/metafile.yml | SSD_for_PyTorch/configs/ddod/metafile.yml | https://arxiv.org/pdf/2107.02963.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/ddod/metafile.yml | SSD_for_PyTorch/configs/ddod/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/detectors/ddod.py#L6 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/ddod/metafile.yml | SSD_for_PyTorch/configs/ddod/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/ddod/ddod_r50_fpn_1x_coco/ddod_r50_fpn_1x_coco_20220523_223737-29b2fc67.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/deformable_detr/metafile.yml | SSD_for_PyTorch/configs/deformable_detr/metafile.yml | https://openreview.net/forum?id=gZ9hCDWe6ke | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/deformable_detr/metafile.yml | SSD_for_PyTorch/configs/deformable_detr/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.12.0/mmdet/models/detectors/deformable_detr.py#L6 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/deformable_detr/metafile.yml | SSD_for_PyTorch/configs/deformable_detr/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/deformable_detr/deformable_detr_r50_16x2_50e_coco/deformable_detr_r50_16x2_50e_coco_20210419_220030-a12b9512.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/deformable_detr/metafile.yml | SSD_for_PyTorch/configs/deformable_detr/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/deformable_detr/deformable_detr_refine_r50_16x2_50e_coco/deformable_detr_refine_r50_16x2_50e_coco_20210419_220503-5f5dff21.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/deformable_detr/metafile.yml | SSD_for_PyTorch/configs/deformable_detr/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/deformable_detr/deformable_detr_twostage_refine_r50_16x2_50e_coco/deformable_detr_twostage_refine_r50_16x2_50e_coco_20210419_220613-9d28ab72.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/detectors/metafile.yml | SSD_for_PyTorch/configs/detectors/metafile.yml | https://arxiv.org/abs/2006.02334 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/detectors/metafile.yml | SSD_for_PyTorch/configs/detectors/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.2.0/mmdet/models/backbones/detectors_resnet.py#L205 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/detectors/metafile.yml | SSD_for_PyTorch/configs/detectors/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/detectors/cascade_rcnn_r50_rfp_1x_coco/cascade_rcnn_r50_rfp_1x_coco-8cf51bfd.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/detectors/metafile.yml | SSD_for_PyTorch/configs/detectors/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/detectors/cascade_rcnn_r50_sac_1x_coco/cascade_rcnn_r50_sac_1x_coco-24bfda62.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/detectors/metafile.yml | SSD_for_PyTorch/configs/detectors/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/detectors/detectors_cascade_rcnn_r50_1x_coco/detectors_cascade_rcnn_r50_1x_coco-32a10ba0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/detectors/metafile.yml | SSD_for_PyTorch/configs/detectors/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/detectors/htc_r50_rfp_1x_coco/htc_r50_rfp_1x_coco-8ff87c51.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/detectors/metafile.yml | SSD_for_PyTorch/configs/detectors/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/detectors/htc_r50_sac_1x_coco/htc_r50_sac_1x_coco-bfa60c54.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/detectors/metafile.yml | SSD_for_PyTorch/configs/detectors/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/detectors/detectors_htc_r50_1x_coco/detectors_htc_r50_1x_coco-329b1453.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/detr/metafile.yml | SSD_for_PyTorch/configs/detr/metafile.yml | https://arxiv.org/abs/2005.12872 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/detr/metafile.yml | SSD_for_PyTorch/configs/detr/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.7.0/mmdet/models/detectors/detr.py#L7 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/detr/metafile.yml | SSD_for_PyTorch/configs/detr/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/detr/detr_r50_8x2_150e_coco/detr_r50_8x2_150e_coco_20201130_194835-2c4b8974.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/double_heads/metafile.yml | SSD_for_PyTorch/configs/double_heads/metafile.yml | https://arxiv.org/pdf/1904.06493 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/double_heads/metafile.yml | SSD_for_PyTorch/configs/double_heads/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/roi_heads/double_roi_head.py#L6 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/double_heads/metafile.yml | SSD_for_PyTorch/configs/double_heads/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/double_heads/dh_faster_rcnn_r50_fpn_1x_coco/dh_faster_rcnn_r50_fpn_1x_coco_20200130-586b67df.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/dyhead/atss_swin-l-p4-w12_fpn_dyhead_mstrain_2x_coco.py | SSD_for_PyTorch/configs/dyhead/atss_swin-l-p4-w12_fpn_dyhead_mstrain_2x_coco.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_large_patch4_window12_384_22k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/dyhead/metafile.yml | SSD_for_PyTorch/configs/dyhead/metafile.yml | https://arxiv.org/abs/2106.08322 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/dyhead/metafile.yml | SSD_for_PyTorch/configs/dyhead/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.22.0/mmdet/models/necks/dyhead.py#L130 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/dyhead/metafile.yml | SSD_for_PyTorch/configs/dyhead/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/dyhead/atss_r50_fpn_dyhead_for_reproduction_1x_coco/atss_r50_fpn_dyhead_for_reproduction_4x4_1x_coco_20220107_213939-162888e6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/dyhead/metafile.yml | SSD_for_PyTorch/configs/dyhead/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/dyhead/atss_r50_fpn_dyhead_4x4_1x_coco/atss_r50_fpn_dyhead_4x4_1x_coco_20211219_023314-eaa620c6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/dyhead/metafile.yml | SSD_for_PyTorch/configs/dyhead/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/dyhead/atss_swin-l-p4-w12_fpn_dyhead_mstrain_2x_coco/atss_swin-l-p4-w12_fpn_dyhead_mstrain_2x_coco_20220509_100315-bc5b6516.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/dynamic_rcnn/metafile.yml | SSD_for_PyTorch/configs/dynamic_rcnn/metafile.yml | https://arxiv.org/pdf/2004.06002 | 论文地址 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/dynamic_rcnn/metafile.yml | SSD_for_PyTorch/configs/dynamic_rcnn/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.2.0/mmdet/models/roi_heads/dynamic_roi_head.py#L11 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/dynamic_rcnn/metafile.yml | SSD_for_PyTorch/configs/dynamic_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/dynamic_rcnn/dynamic_rcnn_r50_fpn_1x/dynamic_rcnn_r50_fpn_1x-62a3f276.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/efficientnet/metafile.yml | SSD_for_PyTorch/configs/efficientnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/efficientnet/retinanet_effb3_fpn_crop896_8x4_1x_coco/retinanet_effb3_fpn_crop896_8x4_1x_coco_20220322_234806-615a0dda.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/efficientnet/metafile.yml | SSD_for_PyTorch/configs/efficientnet/metafile.yml | https://arxiv.org/abs/1905.11946v5 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/efficientnet/metafile.yml | SSD_for_PyTorch/configs/efficientnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.23.0/mmdet/models/backbones/efficientnet.py#L159 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/efficientnet/retinanet_effb3_fpn_crop896_8x4_1x_coco.py | SSD_for_PyTorch/configs/efficientnet/retinanet_effb3_fpn_crop896_8x4_1x_coco.py | https://download.openmmlab.com/mmclassification/v0/efficientnet/efficientnet-b3_3rdparty_8xb32-aa_in1k_20220119-5b4887a0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/empirical_attention/metafile.yml | SSD_for_PyTorch/configs/empirical_attention/metafile.yml | https://arxiv.org/pdf/1904.05873 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/empirical_attention/metafile.yml | SSD_for_PyTorch/configs/empirical_attention/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/ops/generalized_attention.py#L10 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/empirical_attention/metafile.yml | SSD_for_PyTorch/configs/empirical_attention/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/empirical_attention/faster_rcnn_r50_fpn_attention_1111_1x_coco/faster_rcnn_r50_fpn_attention_1111_1x_coco_20200130-403cccba.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/empirical_attention/metafile.yml | SSD_for_PyTorch/configs/empirical_attention/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/empirical_attention/faster_rcnn_r50_fpn_attention_0010_1x_coco/faster_rcnn_r50_fpn_attention_0010_1x_coco_20200130-7cb0c14d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/empirical_attention/metafile.yml | SSD_for_PyTorch/configs/empirical_attention/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/empirical_attention/faster_rcnn_r50_fpn_attention_1111_dcn_1x_coco/faster_rcnn_r50_fpn_attention_1111_dcn_1x_coco_20200130-8b2523a6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/empirical_attention/metafile.yml | SSD_for_PyTorch/configs/empirical_attention/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/empirical_attention/faster_rcnn_r50_fpn_attention_0010_dcn_1x_coco/faster_rcnn_r50_fpn_attention_0010_dcn_1x_coco_20200130-1a2e831d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/faster_rcnn/faster_rcnn_r50_caffe_fpn_mstrain_1x_coco-person.py | SSD_for_PyTorch/configs/faster_rcnn/faster_rcnn_r50_caffe_fpn_mstrain_1x_coco-person.py | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_caffe_fpn_mstrain_3x_coco/faster_rcnn_r50_caffe_fpn_mstrain_3x_coco_bbox_mAP-0.398_20200504_163323-30042637.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/faster_rcnn/faster_rcnn_r50_caffe_fpn_mstrain_1x_coco-person-bicycle-car.py | SSD_for_PyTorch/configs/faster_rcnn/faster_rcnn_r50_caffe_fpn_mstrain_1x_coco-person-bicycle-car.py | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_caffe_fpn_mstrain_3x_coco/faster_rcnn_r50_caffe_fpn_mstrain_3x_coco_bbox_mAP-0.398_20200504_163323-30042637.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/faster_rcnn/faster_rcnn_r50_fpn_tnr-pretrain_1x_coco.py | SSD_for_PyTorch/configs/faster_rcnn/faster_rcnn_r50_fpn_tnr-pretrain_1x_coco.py | https://download.pytorch.org/models/resnet50-11ad3fa6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/faster_rcnn/metafile.yml | SSD_for_PyTorch/configs/faster_rcnn/metafile.yml | https://arxiv.org/abs/1506.01497 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/faster_rcnn/metafile.yml | SSD_for_PyTorch/configs/faster_rcnn/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/detectors/faster_rcnn.py#L6 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/faster_rcnn/metafile.yml | SSD_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_caffe_c4_1x_coco/faster_rcnn_r50_caffe_c4_1x_coco_20220316_150152-3f885b85.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/faster_rcnn/metafile.yml | SSD_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_caffe_c4_mstrain_1x_coco/faster_rcnn_r50_caffe_c4_mstrain_1x_coco_20220316_150527-db276fed.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/faster_rcnn/metafile.yml | SSD_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_caffe_dc5_1x_coco/faster_rcnn_r50_caffe_dc5_1x_coco_20201030_151909-531f0f43.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/faster_rcnn/metafile.yml | SSD_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_caffe_fpn_1x_coco/faster_rcnn_r50_caffe_fpn_1x_coco_bbox_mAP-0.378_20200504_180032-c5925ee5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/faster_rcnn/metafile.yml | SSD_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_fpn_1x_coco/faster_rcnn_r50_fpn_1x_coco_20200130-047c8118.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/faster_rcnn/metafile.yml | 
SSD_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fp16/faster_rcnn_r50_fpn_fp16_1x_coco/faster_rcnn_r50_fpn_fp16_1x_coco_20200204-d4dc1471.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/faster_rcnn/metafile.yml | SSD_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_fpn_2x_coco/faster_rcnn_r50_fpn_2x_coco_bbox_mAP-0.384_20200504_210434-a5d8aa15.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/faster_rcnn/metafile.yml | SSD_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r101_caffe_fpn_1x_coco/faster_rcnn_r101_caffe_fpn_1x_coco_bbox_mAP-0.398_20200504_180057-b269e9dd.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/faster_rcnn/metafile.yml | SSD_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r101_fpn_1x_coco/faster_rcnn_r101_fpn_1x_coco_20200130-f513f705.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/faster_rcnn/metafile.yml | SSD_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r101_fpn_2x_coco/faster_rcnn_r101_fpn_2x_coco_bbox_mAP-0.398_20200504_210455-1d2dac9c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/faster_rcnn/metafile.yml | SSD_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_x101_32x4d_fpn_1x_coco/faster_rcnn_x101_32x4d_fpn_1x_coco_20200203-cff10310.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/faster_rcnn/metafile.yml | SSD_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_x101_32x4d_fpn_2x_coco/faster_rcnn_x101_32x4d_fpn_2x_coco_bbox_mAP-0.412_20200506_041400-64a12c0b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/faster_rcnn/metafile.yml | SSD_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_x101_64x4d_fpn_1x_coco/faster_rcnn_x101_64x4d_fpn_1x_coco_20200204-833ee192.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/faster_rcnn/metafile.yml | SSD_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_x101_64x4d_fpn_2x_coco/faster_rcnn_x101_64x4d_fpn_2x_coco_20200512_161033-5961fa95.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/faster_rcnn/metafile.yml | SSD_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_fpn_1x_coco/faster_rcnn_r50_fpn_iou_1x_coco-fdd207f3.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/faster_rcnn/metafile.yml | SSD_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_fpn_1x_coco/faster_rcnn_r50_fpn_giou_1x_coco-0eada910.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/faster_rcnn/metafile.yml | SSD_for_PyTorch/configs/faster_rcnn/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_fpn_1x_coco/faster_rcnn_r50_fpn_bounded_iou_1x_coco-98ad993b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/faster_rcnn/metafile.yml | SSD_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_caffe_dc5_mstrain_1x_coco/faster_rcnn_r50_caffe_dc5_mstrain_1x_coco_20201028_233851-b33d21b9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/faster_rcnn/metafile.yml | SSD_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_caffe_dc5_mstrain_3x_coco/faster_rcnn_r50_caffe_dc5_mstrain_3x_coco_20201028_002107-34a53b2c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/faster_rcnn/metafile.yml | SSD_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_caffe_fpn_mstrain_2x_coco/faster_rcnn_r50_caffe_fpn_mstrain_2x_coco_bbox_mAP-0.397_20200504_231813-10b2de58.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/faster_rcnn/metafile.yml | SSD_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_caffe_fpn_mstrain_3x_coco/faster_rcnn_r50_caffe_fpn_mstrain_3x_coco_20210526_095054-1f77628b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/faster_rcnn/metafile.yml | SSD_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_fpn_mstrain_3x_coco/faster_rcnn_r50_fpn_mstrain_3x_coco_20210524_110822-e10bd31c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/faster_rcnn/metafile.yml | SSD_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r101_caffe_fpn_mstrain_3x_coco/faster_rcnn_r101_caffe_fpn_mstrain_3x_coco_20210526_095742-a7ae426d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/faster_rcnn/metafile.yml | SSD_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r101_fpn_mstrain_3x_coco/faster_rcnn_r101_fpn_mstrain_3x_coco_20210524_110822-4d4d2ca8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/faster_rcnn/metafile.yml | SSD_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_x101_32x4d_fpn_mstrain_3x_coco/faster_rcnn_x101_32x4d_fpn_mstrain_3x_coco_20210524_124151-16b9b260.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/faster_rcnn/metafile.yml | SSD_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_x101_32x8d_fpn_mstrain_3x_coco/faster_rcnn_x101_32x8d_fpn_mstrain_3x_coco_20210604_182954-002e082a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/faster_rcnn/metafile.yml | SSD_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_x101_64x4d_fpn_mstrain_3x_coco/faster_rcnn_x101_64x4d_fpn_mstrain_3x_coco_20210524_124528-26c63de6.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/faster_rcnn/metafile.yml | SSD_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_fpn_tnr-pretrain_1x_coco/faster_rcnn_r50_fpn_tnr-pretrain_1x_coco_20220320_085147-efedfda4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/fcos/metafile.yml | SSD_for_PyTorch/configs/fcos/metafile.yml | https://arxiv.org/abs/1904.01355 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/fcos/metafile.yml | SSD_for_PyTorch/configs/fcos/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/detectors/fcos.py#L6 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/fcos/metafile.yml | SSD_for_PyTorch/configs/fcos/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fcos/fcos_r50_caffe_fpn_gn-head_1x_coco/fcos_r50_caffe_fpn_gn-head_1x_coco-821213aa.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/fcos/metafile.yml | SSD_for_PyTorch/configs/fcos/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fcos/fcos_center-normbbox-centeronreg-giou_r50_caffe_fpn_gn-head_1x_coco/fcos_center-normbbox-centeronreg-giou_r50_caffe_fpn_gn-head_1x_coco-0a0d75a8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/fcos/metafile.yml | SSD_for_PyTorch/configs/fcos/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fcos/fcos_center-normbbox-centeronreg-giou_r50_caffe_fpn_gn-head_dcn_1x_coco/fcos_center-normbbox-centeronreg-giou_r50_caffe_fpn_gn-head_dcn_1x_coco-ae4d8b3d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/fcos/metafile.yml | SSD_for_PyTorch/configs/fcos/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fcos/fcos_r101_caffe_fpn_gn-head_1x_coco/fcos_r101_caffe_fpn_gn-head_1x_coco-0e37b982.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/fcos/metafile.yml | SSD_for_PyTorch/configs/fcos/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fcos/fcos_r50_caffe_fpn_gn-head_mstrain_640-800_2x_coco/fcos_r50_caffe_fpn_gn-head_mstrain_640-800_2x_coco-d92ceeea.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/fcos/metafile.yml | SSD_for_PyTorch/configs/fcos/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fcos/fcos_r101_caffe_fpn_gn-head_mstrain_640-800_2x_coco/fcos_r101_caffe_fpn_gn-head_mstrain_640-800_2x_coco-511424d6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/fcos/metafile.yml | SSD_for_PyTorch/configs/fcos/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fcos/fcos_x101_64x4d_fpn_gn-head_mstrain_640-800_2x_coco/fcos_x101_64x4d_fpn_gn-head_mstrain_640-800_2x_coco-ede514a8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/foveabox/metafile.yml | SSD_for_PyTorch/configs/foveabox/metafile.yml | https://arxiv.org/abs/1904.03797 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/foveabox/metafile.yml | SSD_for_PyTorch/configs/foveabox/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/detectors/fovea.py#L6 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/foveabox/metafile.yml | 
SSD_for_PyTorch/configs/foveabox/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/foveabox/fovea_r50_fpn_4x4_1x_coco/fovea_r50_fpn_4x4_1x_coco_20200219-ee4d5303.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/foveabox/metafile.yml | SSD_for_PyTorch/configs/foveabox/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/foveabox/fovea_r50_fpn_4x4_2x_coco/fovea_r50_fpn_4x4_2x_coco_20200203-2df792b1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/foveabox/metafile.yml | SSD_for_PyTorch/configs/foveabox/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/foveabox/fovea_align_r50_fpn_gn-head_4x4_2x_coco/fovea_align_r50_fpn_gn-head_4x4_2x_coco_20200203-8987880d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/foveabox/metafile.yml | SSD_for_PyTorch/configs/foveabox/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/foveabox/fovea_align_r50_fpn_gn-head_mstrain_640-800_4x4_2x_coco/fovea_align_r50_fpn_gn-head_mstrain_640-800_4x4_2x_coco_20200205-85ce26cb.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/foveabox/metafile.yml | SSD_for_PyTorch/configs/foveabox/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/foveabox/fovea_r101_fpn_4x4_1x_coco/fovea_r101_fpn_4x4_1x_coco_20200219-05e38f1c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/foveabox/metafile.yml | SSD_for_PyTorch/configs/foveabox/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/foveabox/fovea_r101_fpn_4x4_2x_coco/fovea_r101_fpn_4x4_2x_coco_20200208-02320ea4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/foveabox/metafile.yml | SSD_for_PyTorch/configs/foveabox/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/foveabox/fovea_align_r101_fpn_gn-head_4x4_2x_coco/fovea_align_r101_fpn_gn-head_4x4_2x_coco_20200208-c39a027a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/foveabox/metafile.yml | SSD_for_PyTorch/configs/foveabox/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/foveabox/fovea_align_r101_fpn_gn-head_mstrain_640-800_4x4_2x_coco/fovea_align_r101_fpn_gn-head_mstrain_640-800_4x4_2x_coco_20200208-649c5eb6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/fpg/metafile.yml | SSD_for_PyTorch/configs/fpg/metafile.yml | https://arxiv.org/abs/2004.03580 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/fpg/metafile.yml | SSD_for_PyTorch/configs/fpg/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.10.0/mmdet/models/necks/fpg.py#L101 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/fpg/metafile.yml | SSD_for_PyTorch/configs/fpg/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fpg/faster_rcnn_r50_fpg_crop640_50e_coco/faster_rcnn_r50_fpg_crop640_50e_coco_20220311_011856-74109f42.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/fpg/metafile.yml | SSD_for_PyTorch/configs/fpg/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fpg/faster_rcnn_r50_fpg-chn128_crop640_50e_coco/faster_rcnn_r50_fpg-chn128_crop640_50e_coco_20220311_011857-9376aa9d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/fpg/metafile.yml | 
SSD_for_PyTorch/configs/fpg/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fpg/mask_rcnn_r50_fpg_crop640_50e_coco/mask_rcnn_r50_fpg_crop640_50e_coco_20220311_011857-233b8334.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/fpg/metafile.yml | SSD_for_PyTorch/configs/fpg/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fpg/mask_rcnn_r50_fpg-chn128_crop640_50e_coco/mask_rcnn_r50_fpg-chn128_crop640_50e_coco_20220311_011859-043c9b4e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/fpg/metafile.yml | SSD_for_PyTorch/configs/fpg/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fpg/retinanet_r50_fpg_crop640_50e_coco/retinanet_r50_fpg_crop640_50e_coco_20220311_110809-b0bcf5f4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/fpg/metafile.yml | SSD_for_PyTorch/configs/fpg/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fpg/retinanet_r50_fpg-chn128_crop640_50e_coco/retinanet_r50_fpg-chn128_crop640_50e_coco_20220313_104829-ee99a686.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/free_anchor/metafile.yml | SSD_for_PyTorch/configs/free_anchor/metafile.yml | https://arxiv.org/abs/1909.02466 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/free_anchor/metafile.yml | SSD_for_PyTorch/configs/free_anchor/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/dense_heads/free_anchor_retina_head.py#L10 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/free_anchor/metafile.yml | SSD_for_PyTorch/configs/free_anchor/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/free_anchor/retinanet_free_anchor_r50_fpn_1x_coco/retinanet_free_anchor_r50_fpn_1x_coco_20200130-0f67375f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/free_anchor/metafile.yml | SSD_for_PyTorch/configs/free_anchor/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/free_anchor/retinanet_free_anchor_r101_fpn_1x_coco/retinanet_free_anchor_r101_fpn_1x_coco_20200130-358324e6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/free_anchor/metafile.yml | SSD_for_PyTorch/configs/free_anchor/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/free_anchor/retinanet_free_anchor_x101_32x4d_fpn_1x_coco/retinanet_free_anchor_x101_32x4d_fpn_1x_coco_20200130-d4846968.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/fsaf/metafile.yml | SSD_for_PyTorch/configs/fsaf/metafile.yml | https://arxiv.org/abs/1903.00621 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/fsaf/metafile.yml | SSD_for_PyTorch/configs/fsaf/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.1.0/mmdet/models/detectors/fsaf.py#L6 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/fsaf/metafile.yml | SSD_for_PyTorch/configs/fsaf/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fsaf/fsaf_r50_fpn_1x_coco/fsaf_r50_fpn_1x_coco-94ccc51f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/fsaf/metafile.yml | SSD_for_PyTorch/configs/fsaf/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fsaf/fsaf_r101_fpn_1x_coco/fsaf_r101_fpn_1x_coco-9e71098f.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/fsaf/metafile.yml | SSD_for_PyTorch/configs/fsaf/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fsaf/fsaf_x101_64x4d_fpn_1x_coco/fsaf_x101_64x4d_fpn_1x_coco-e3f6e6fd.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/gcnet/metafile.yml | SSD_for_PyTorch/configs/gcnet/metafile.yml | https://arxiv.org/abs/1904.11492 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/gcnet/metafile.yml | SSD_for_PyTorch/configs/gcnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/ops/context_block.py#L13 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/gcnet/metafile.yml | SSD_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_r50_fpn_r16_gcb_c3-c5_1x_coco/mask_rcnn_r50_fpn_r16_gcb_c3-c5_1x_coco_20200515_211915-187da160.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/gcnet/metafile.yml | SSD_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_r50_fpn_r4_gcb_c3-c5_1x_coco/mask_rcnn_r50_fpn_r4_gcb_c3-c5_1x_coco_20200204-17235656.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/gcnet/metafile.yml | SSD_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_r101_fpn_r16_gcb_c3-c5_1x_coco/mask_rcnn_r101_fpn_r16_gcb_c3-c5_1x_coco_20200205-e58ae947.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/gcnet/metafile.yml | SSD_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_r101_fpn_r4_gcb_c3-c5_1x_coco/mask_rcnn_r101_fpn_r4_gcb_c3-c5_1x_coco_20200206-af22dc9d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/gcnet/metafile.yml | SSD_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_r50_fpn_syncbn-backbone_1x_coco/mask_rcnn_r50_fpn_syncbn-backbone_1x_coco_20200202-bb3eb55c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/gcnet/metafile.yml | SSD_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_r50_fpn_syncbn-backbone_r16_gcb_c3-c5_1x_coco/mask_rcnn_r50_fpn_syncbn-backbone_r16_gcb_c3-c5_1x_coco_20200202-587b99aa.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/gcnet/metafile.yml | SSD_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_r50_fpn_syncbn-backbone_r4_gcb_c3-c5_1x_coco/mask_rcnn_r50_fpn_syncbn-backbone_r4_gcb_c3-c5_1x_coco_20200202-50b90e5c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/gcnet/metafile.yml | SSD_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_r101_fpn_syncbn-backbone_1x_coco/mask_rcnn_r101_fpn_syncbn-backbone_1x_coco_20200210-81658c8a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/gcnet/metafile.yml | SSD_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_r101_fpn_syncbn-backbone_r16_gcb_c3-c5_1x_coco/mask_rcnn_r101_fpn_syncbn-backbone_r16_gcb_c3-c5_1x_coco_20200207-945e77ca.pth | 
下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/gcnet/metafile.yml | SSD_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_r101_fpn_syncbn-backbone_r4_gcb_c3-c5_1x_coco/mask_rcnn_r101_fpn_syncbn-backbone_r4_gcb_c3-c5_1x_coco_20200206-8407a3f0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/gcnet/metafile.yml | SSD_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_x101_32x4d_fpn_syncbn-backbone_1x_coco/mask_rcnn_x101_32x4d_fpn_syncbn-backbone_1x_coco_20200211-7584841c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/gcnet/metafile.yml | SSD_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_x101_32x4d_fpn_syncbn-backbone_r16_gcb_c3-c5_1x_coco/mask_rcnn_x101_32x4d_fpn_syncbn-backbone_r16_gcb_c3-c5_1x_coco_20200211-cbed3d2c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/gcnet/metafile.yml | SSD_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_x101_32x4d_fpn_syncbn-backbone_r4_gcb_c3-c5_1x_coco/mask_rcnn_x101_32x4d_fpn_syncbn-backbone_r4_gcb_c3-c5_1x_coco_20200212-68164964.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/gcnet/metafile.yml | SSD_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/cascade_mask_rcnn_x101_32x4d_fpn_syncbn-backbone_1x_coco/cascade_mask_rcnn_x101_32x4d_fpn_syncbn-backbone_1x_coco_20200310-d5ad2a5e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/gcnet/metafile.yml | SSD_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/cascade_mask_rcnn_x101_32x4d_fpn_syncbn-backbone_r16_gcb_c3-c5_1x_coco/cascade_mask_rcnn_x101_32x4d_fpn_syncbn-backbone_r16_gcb_c3-c5_1x_coco_20200211-10bf2463.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/gcnet/metafile.yml | SSD_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/cascade_mask_rcnn_x101_32x4d_fpn_syncbn-backbone_r4_gcb_c3-c5_1x_coco/cascade_mask_rcnn_x101_32x4d_fpn_syncbn-backbone_r4_gcb_c3-c5_1x_coco_20200703_180653-ed035291.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/gcnet/metafile.yml | SSD_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/cascade_mask_rcnn_x101_32x4d_fpn_syncbn-backbone_dconv_c3-c5_1x_coco/cascade_mask_rcnn_x101_32x4d_fpn_syncbn-backbone_dconv_c3-c5_1x_coco_20210615_211019-abbc39ea.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/gcnet/metafile.yml | SSD_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/cascade_mask_rcnn_x101_32x4d_fpn_syncbn-backbone_dconv_c3-c5_r16_gcb_c3-c5_1x_coco/cascade_mask_rcnn_x101_32x4d_fpn_syncbn-backbone_dconv_c3-c5_r16_gcb_c3-c5_1x_coco_20210615_215648-44aa598a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/gcnet/metafile.yml | SSD_for_PyTorch/configs/gcnet/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/gcnet/cascade_mask_rcnn_x101_32x4d_fpn_syncbn-backbone_dconv_c3-c5_r4_gcb_c3-c5_1x_coco/cascade_mask_rcnn_x101_32x4d_fpn_syncbn-backbone_dconv_c3-c5_r4_gcb_c3-c5_1x_coco_20210615_161851-720338ec.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/gfl/metafile.yml | SSD_for_PyTorch/configs/gfl/metafile.yml | https://arxiv.org/abs/2006.04388 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/gfl/metafile.yml | SSD_for_PyTorch/configs/gfl/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.2.0/mmdet/models/detectors/gfl.py#L6 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/gfl/metafile.yml | SSD_for_PyTorch/configs/gfl/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gfl/gfl_r50_fpn_1x_coco/gfl_r50_fpn_1x_coco_20200629_121244-25944287.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/gfl/metafile.yml | SSD_for_PyTorch/configs/gfl/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gfl/gfl_r50_fpn_mstrain_2x_coco/gfl_r50_fpn_mstrain_2x_coco_20200629_213802-37bb1edc.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/gfl/metafile.yml | SSD_for_PyTorch/configs/gfl/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gfl/gfl_r101_fpn_mstrain_2x_coco/gfl_r101_fpn_mstrain_2x_coco_20200629_200126-dd12f847.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/gfl/metafile.yml | SSD_for_PyTorch/configs/gfl/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gfl/gfl_r101_fpn_dconv_c3-c5_mstrain_2x_coco/gfl_r101_fpn_dconv_c3-c5_mstrain_2x_coco_20200630_102002-134b07df.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/gfl/metafile.yml | SSD_for_PyTorch/configs/gfl/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gfl/gfl_x101_32x4d_fpn_mstrain_2x_coco/gfl_x101_32x4d_fpn_mstrain_2x_coco_20200630_102002-50c1ffdb.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/gfl/metafile.yml | SSD_for_PyTorch/configs/gfl/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gfl/gfl_x101_32x4d_fpn_dconv_c4-c5_mstrain_2x_coco/gfl_x101_32x4d_fpn_dconv_c4-c5_mstrain_2x_coco_20200630_102002-14a2bf25.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/ghm/metafile.yml | SSD_for_PyTorch/configs/ghm/metafile.yml | https://arxiv.org/abs/1811.05181 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/ghm/metafile.yml | SSD_for_PyTorch/configs/ghm/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/losses/ghm_loss.py#L21 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/ghm/metafile.yml | SSD_for_PyTorch/configs/ghm/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/ghm/retinanet_ghm_r50_fpn_1x_coco/retinanet_ghm_r50_fpn_1x_coco_20200130-a437fda3.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/ghm/metafile.yml | SSD_for_PyTorch/configs/ghm/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/ghm/retinanet_ghm_r101_fpn_1x_coco/retinanet_ghm_r101_fpn_1x_coco_20200130-c148ee8f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/ghm/metafile.yml | 
SSD_for_PyTorch/configs/ghm/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/ghm/retinanet_ghm_x101_32x4d_fpn_1x_coco/retinanet_ghm_x101_32x4d_fpn_1x_coco_20200131-e4333bd0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/ghm/metafile.yml | SSD_for_PyTorch/configs/ghm/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/ghm/retinanet_ghm_x101_64x4d_fpn_1x_coco/retinanet_ghm_x101_64x4d_fpn_1x_coco_20200131-dd381cef.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/gn/metafile.yml | SSD_for_PyTorch/configs/gn/metafile.yml | https://arxiv.org/abs/1803.08494 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/gn/metafile.yml | SSD_for_PyTorch/configs/gn/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/configs/gn/mask_rcnn_r50_fpn_gn-all_2x_coco.py | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/gn/metafile.yml | SSD_for_PyTorch/configs/gn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gn/mask_rcnn_r50_fpn_gn-all_2x_coco/mask_rcnn_r50_fpn_gn-all_2x_coco_20200206-8eee02a6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/gn/metafile.yml | SSD_for_PyTorch/configs/gn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gn/mask_rcnn_r50_fpn_gn-all_3x_coco/mask_rcnn_r50_fpn_gn-all_3x_coco_20200214-8b23b1e5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/gn/metafile.yml | SSD_for_PyTorch/configs/gn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gn/mask_rcnn_r101_fpn_gn-all_2x_coco/mask_rcnn_r101_fpn_gn-all_2x_coco_20200205-d96b1b50.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/gn/metafile.yml | SSD_for_PyTorch/configs/gn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gn/mask_rcnn_r101_fpn_gn-all_3x_coco/mask_rcnn_r101_fpn_gn-all_3x_coco_20200513_181609-0df864f4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/gn/metafile.yml | SSD_for_PyTorch/configs/gn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gn/mask_rcnn_r50_fpn_gn-all_contrib_2x_coco/mask_rcnn_r50_fpn_gn-all_contrib_2x_coco_20200207-20d3e849.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/gn/metafile.yml | SSD_for_PyTorch/configs/gn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gn/mask_rcnn_r50_fpn_gn-all_contrib_3x_coco/mask_rcnn_r50_fpn_gn-all_contrib_3x_coco_20200225-542aefbc.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/gn+ws/metafile.yml | SSD_for_PyTorch/configs/gn+ws/metafile.yml | https://arxiv.org/abs/1903.10520 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/gn+ws/metafile.yml | SSD_for_PyTorch/configs/gn+ws/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/configs/gn%2Bws/mask_rcnn_r50_fpn_gn_ws-all_2x_coco.py | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/gn+ws/metafile.yml | SSD_for_PyTorch/configs/gn+ws/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gn%2Bws/faster_rcnn_r50_fpn_gn_ws-all_1x_coco/faster_rcnn_r50_fpn_gn_ws-all_1x_coco_20200130-613d9fe2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/gn+ws/metafile.yml | 
SSD_for_PyTorch/configs/gn+ws/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gn%2Bws/faster_rcnn_r101_fpn_gn_ws-all_1x_coco/faster_rcnn_r101_fpn_gn_ws-all_1x_coco_20200205-a93b0d75.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/gn+ws/metafile.yml | SSD_for_PyTorch/configs/gn+ws/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gn%2Bws/faster_rcnn_x50_32x4d_fpn_gn_ws-all_1x_coco/faster_rcnn_x50_32x4d_fpn_gn_ws-all_1x_coco_20200203-839c5d9d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/gn+ws/metafile.yml | SSD_for_PyTorch/configs/gn+ws/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gn%2Bws/faster_rcnn_x101_32x4d_fpn_gn_ws-all_1x_coco/faster_rcnn_x101_32x4d_fpn_gn_ws-all_1x_coco_20200212-27da1bc2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/gn+ws/metafile.yml | SSD_for_PyTorch/configs/gn+ws/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gn%2Bws/mask_rcnn_r50_fpn_gn_ws-all_2x_coco/mask_rcnn_r50_fpn_gn_ws-all_2x_coco_20200226-16acb762.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/gn+ws/metafile.yml | SSD_for_PyTorch/configs/gn+ws/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gn%2Bws/mask_rcnn_r101_fpn_gn_ws-all_2x_coco/mask_rcnn_r101_fpn_gn_ws-all_2x_coco_20200212-ea357cd9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/gn+ws/metafile.yml | SSD_for_PyTorch/configs/gn+ws/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gn%2Bws/mask_rcnn_x50_32x4d_fpn_gn_ws-all_2x_coco/mask_rcnn_x50_32x4d_fpn_gn_ws-all_2x_coco_20200216-649fdb6f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/gn+ws/metafile.yml | SSD_for_PyTorch/configs/gn+ws/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gn%2Bws/mask_rcnn_x101_32x4d_fpn_gn_ws-all_2x_coco/mask_rcnn_x101_32x4d_fpn_gn_ws-all_2x_coco_20200319-33fb95b5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/gn+ws/metafile.yml | SSD_for_PyTorch/configs/gn+ws/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gn%2Bws/mask_rcnn_r50_fpn_gn_ws-all_20_23_24e_coco/mask_rcnn_r50_fpn_gn_ws-all_20_23_24e_coco_20200213-487d1283.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/gn+ws/metafile.yml | SSD_for_PyTorch/configs/gn+ws/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gn%2Bws/mask_rcnn_r101_fpn_gn_ws-all_20_23_24e_coco/mask_rcnn_r101_fpn_gn_ws-all_20_23_24e_coco_20200213-57b5a50f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/gn+ws/metafile.yml | SSD_for_PyTorch/configs/gn+ws/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gn%2Bws/mask_rcnn_x50_32x4d_fpn_gn_ws-all_20_23_24e_coco/mask_rcnn_x50_32x4d_fpn_gn_ws-all_20_23_24e_coco_20200226-969bcb2c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/gn+ws/metafile.yml | SSD_for_PyTorch/configs/gn+ws/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gn%2Bws/mask_rcnn_x101_32x4d_fpn_gn_ws-all_20_23_24e_coco/mask_rcnn_x101_32x4d_fpn_gn_ws-all_20_23_24e_coco_20200316-e6cd35ef.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/grid_rcnn/metafile.yml | SSD_for_PyTorch/configs/grid_rcnn/metafile.yml | 
https://arxiv.org/abs/1906.05688 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/grid_rcnn/metafile.yml | SSD_for_PyTorch/configs/grid_rcnn/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/detectors/grid_rcnn.py#L6 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/grid_rcnn/metafile.yml | SSD_for_PyTorch/configs/grid_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/grid_rcnn/grid_rcnn_r50_fpn_gn-head_2x_coco/grid_rcnn_r50_fpn_gn-head_2x_coco_20200130-6cca8223.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/grid_rcnn/metafile.yml | SSD_for_PyTorch/configs/grid_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/grid_rcnn/grid_rcnn_r101_fpn_gn-head_2x_coco/grid_rcnn_r101_fpn_gn-head_2x_coco_20200309-d6eca030.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/grid_rcnn/metafile.yml | SSD_for_PyTorch/configs/grid_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/grid_rcnn/grid_rcnn_x101_32x4d_fpn_gn-head_2x_coco/grid_rcnn_x101_32x4d_fpn_gn-head_2x_coco_20200130-d8f0e3ff.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/grid_rcnn/metafile.yml | SSD_for_PyTorch/configs/grid_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/grid_rcnn/grid_rcnn_x101_64x4d_fpn_gn-head_2x_coco/grid_rcnn_x101_64x4d_fpn_gn-head_2x_coco_20200204-ec76a754.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/groie/metafile.yml | SSD_for_PyTorch/configs/groie/metafile.yml | https://arxiv.org/abs/2004.13665 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/groie/metafile.yml | SSD_for_PyTorch/configs/groie/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.1.0/mmdet/models/roi_heads/roi_extractors/groie.py#L15 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/groie/metafile.yml | SSD_for_PyTorch/configs/groie/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/groie/faster_rcnn_r50_fpn_groie_1x_coco/faster_rcnn_r50_fpn_groie_1x_coco_20200604_211715-66ee9516.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/groie/metafile.yml | SSD_for_PyTorch/configs/groie/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/groie/mask_rcnn_r50_fpn_groie_1x_coco/mask_rcnn_r50_fpn_groie_1x_coco_20200604_211715-50d90c74.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/groie/metafile.yml | SSD_for_PyTorch/configs/groie/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/groie/mask_rcnn_r50_fpn_syncbn-backbone_r4_gcb_c3-c5_groie_1x_coco/mask_rcnn_r50_fpn_syncbn-backbone_r4_gcb_c3-c5_groie_1x_coco_20200604_211715-42eb79e1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/groie/metafile.yml | SSD_for_PyTorch/configs/groie/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/groie/mask_rcnn_r101_fpn_syncbn-backbone_r4_gcb_c3-c5_groie_1x_coco/mask_rcnn_r101_fpn_syncbn-backbone_r4_gcb_c3-c5_groie_1x_coco_20200607_224507-8daae01c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/guided_anchoring/metafile.yml | SSD_for_PyTorch/configs/guided_anchoring/metafile.yml | https://arxiv.org/abs/1901.03278 | 论文地址 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/guided_anchoring/metafile.yml | SSD_for_PyTorch/configs/guided_anchoring/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/dense_heads/ga_retina_head.py#L10 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/guided_anchoring/metafile.yml | SSD_for_PyTorch/configs/guided_anchoring/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/guided_anchoring/ga_rpn_r50_caffe_fpn_1x_coco/ga_rpn_r50_caffe_fpn_1x_coco_20200531-899008a6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/guided_anchoring/metafile.yml | SSD_for_PyTorch/configs/guided_anchoring/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/guided_anchoring/ga_rpn_r101_caffe_fpn_1x_coco/ga_rpn_r101_caffe_fpn_1x_coco_20200531-ca9ba8fb.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/guided_anchoring/metafile.yml | SSD_for_PyTorch/configs/guided_anchoring/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/guided_anchoring/ga_rpn_x101_32x4d_fpn_1x_coco/ga_rpn_x101_32x4d_fpn_1x_coco_20200220-c28d1b18.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/guided_anchoring/metafile.yml | SSD_for_PyTorch/configs/guided_anchoring/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/guided_anchoring/ga_rpn_x101_64x4d_fpn_1x_coco/ga_rpn_x101_64x4d_fpn_1x_coco_20200225-3c6e1aa2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/guided_anchoring/metafile.yml | SSD_for_PyTorch/configs/guided_anchoring/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/guided_anchoring/ga_faster_r50_caffe_fpn_1x_coco/ga_faster_r50_caffe_fpn_1x_coco_20200702_000718-a11ccfe6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/guided_anchoring/metafile.yml | SSD_for_PyTorch/configs/guided_anchoring/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/guided_anchoring/ga_faster_r101_caffe_fpn_1x_coco/ga_faster_r101_caffe_fpn_1x_coco_bbox_mAP-0.415_20200505_115528-fb82e499.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/guided_anchoring/metafile.yml | SSD_for_PyTorch/configs/guided_anchoring/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/guided_anchoring/ga_faster_x101_32x4d_fpn_1x_coco/ga_faster_x101_32x4d_fpn_1x_coco_20200215-1ded9da3.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/guided_anchoring/metafile.yml | SSD_for_PyTorch/configs/guided_anchoring/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/guided_anchoring/ga_faster_x101_64x4d_fpn_1x_coco/ga_faster_x101_64x4d_fpn_1x_coco_20200215-0fa7bde7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/guided_anchoring/metafile.yml | SSD_for_PyTorch/configs/guided_anchoring/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/guided_anchoring/ga_retinanet_r50_caffe_fpn_1x_coco/ga_retinanet_r50_caffe_fpn_1x_coco_20201020-39581c6f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/guided_anchoring/metafile.yml | SSD_for_PyTorch/configs/guided_anchoring/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/guided_anchoring/ga_retinanet_r101_caffe_fpn_1x_coco/ga_retinanet_r101_caffe_fpn_1x_coco_20200531-6266453c.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/guided_anchoring/metafile.yml | SSD_for_PyTorch/configs/guided_anchoring/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/guided_anchoring/ga_retinanet_x101_32x4d_fpn_1x_coco/ga_retinanet_x101_32x4d_fpn_1x_coco_20200219-40c56caa.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/guided_anchoring/metafile.yml | SSD_for_PyTorch/configs/guided_anchoring/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/guided_anchoring/ga_retinanet_x101_64x4d_fpn_1x_coco/ga_retinanet_x101_64x4d_fpn_1x_coco_20200226-ef9f7f1f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/faster_rcnn_hrnetv2p_w18_1x_coco/faster_rcnn_hrnetv2p_w18_1x_coco_20200130-56651a6d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://arxiv.org/abs/1904.04514 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/backbones/hrnet.py#L195 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/faster_rcnn_hrnetv2p_w18_2x_coco/faster_rcnn_hrnetv2p_w18_2x_coco_20200702_085731-a4ec0611.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://arxiv.org/abs/1904.04514 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/backbones/hrnet.py#L195 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/faster_rcnn_hrnetv2p_w32_1x_coco/faster_rcnn_hrnetv2p_w32_1x_coco_20200130-6e286425.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://arxiv.org/abs/1904.04514 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/backbones/hrnet.py#L195 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/faster_rcnn_hrnetv2p_w32_2x_coco/faster_rcnn_hrnetv2p_w32_2x_coco_20200529_015927-976a9c15.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://arxiv.org/abs/1904.04514 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/backbones/hrnet.py#L195 | 代码链接 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/faster_rcnn_hrnetv2p_w40_1x_coco/faster_rcnn_hrnetv2p_w40_1x_coco_20200210-95c1f5ce.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://arxiv.org/abs/1904.04514 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/backbones/hrnet.py#L195 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/faster_rcnn_hrnetv2p_w40_2x_coco/faster_rcnn_hrnetv2p_w40_2x_coco_20200512_161033-0f236ef4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://arxiv.org/abs/1904.04514 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/backbones/hrnet.py#L195 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/mask_rcnn_hrnetv2p_w18_1x_coco/mask_rcnn_hrnetv2p_w18_1x_coco_20200205-1c3d78ed.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://arxiv.org/abs/1904.04514 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/backbones/hrnet.py#L195 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/mask_rcnn_hrnetv2p_w18_2x_coco/mask_rcnn_hrnetv2p_w18_2x_coco_20200212-b3c825b1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://arxiv.org/abs/1904.04514 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/backbones/hrnet.py#L195 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/mask_rcnn_hrnetv2p_w32_1x_coco/mask_rcnn_hrnetv2p_w32_1x_coco_20200207-b29f616e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://arxiv.org/abs/1904.04514 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | 
https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/backbones/hrnet.py#L195 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/mask_rcnn_hrnetv2p_w32_2x_coco/mask_rcnn_hrnetv2p_w32_2x_coco_20200213-45b75b4d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://arxiv.org/abs/1904.04514 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/backbones/hrnet.py#L195 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/mask_rcnn_hrnetv2p_w40_1x_coco/mask_rcnn_hrnetv2p_w40_1x_coco_20200511_015646-66738b35.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://arxiv.org/abs/1904.04514 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/backbones/hrnet.py#L195 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/mask_rcnn_hrnetv2p_w40_2x_coco/mask_rcnn_hrnetv2p_w40_2x_coco_20200512_163732-aed5e4ab.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://arxiv.org/abs/1904.04514 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/backbones/hrnet.py#L195 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/cascade_rcnn_hrnetv2p_w18_20e_coco/cascade_rcnn_hrnetv2p_w18_20e_coco_20200210-434be9d7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://arxiv.org/abs/1904.04514 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/backbones/hrnet.py#L195 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/cascade_rcnn_hrnetv2p_w32_20e_coco/cascade_rcnn_hrnetv2p_w32_20e_coco_20200208-928455a4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://arxiv.org/abs/1904.04514 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | 
SSD_for_PyTorch/configs/hrnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/backbones/hrnet.py#L195 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/cascade_rcnn_hrnetv2p_w40_20e_coco/cascade_rcnn_hrnetv2p_w40_20e_coco_20200512_161112-75e47b04.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://arxiv.org/abs/1904.04514 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/backbones/hrnet.py#L195 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/cascade_mask_rcnn_hrnetv2p_w18_20e_coco/cascade_mask_rcnn_hrnetv2p_w18_20e_coco_20200210-b543cd2b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://arxiv.org/abs/1904.04514 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/backbones/hrnet.py#L195 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/cascade_mask_rcnn_hrnetv2p_w32_20e_coco/cascade_mask_rcnn_hrnetv2p_w32_20e_coco_20200512_154043-39d9cf7b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://arxiv.org/abs/1904.04514 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/backbones/hrnet.py#L195 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/cascade_mask_rcnn_hrnetv2p_w40_20e_coco/cascade_mask_rcnn_hrnetv2p_w40_20e_coco_20200527_204922-969c4610.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://arxiv.org/abs/1904.04514 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/backbones/hrnet.py#L195 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/htc_hrnetv2p_w18_20e_coco/htc_hrnetv2p_w18_20e_coco_20200210-b266988c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://arxiv.org/abs/1904.04514 | 论文地址 | -| 
开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/backbones/hrnet.py#L195 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/htc_hrnetv2p_w32_20e_coco/htc_hrnetv2p_w32_20e_coco_20200207-7639fa12.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://arxiv.org/abs/1904.04514 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/backbones/hrnet.py#L195 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/htc_hrnetv2p_w40_20e_coco/htc_hrnetv2p_w40_20e_coco_20200529_183411-417c4d5b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://arxiv.org/abs/1904.04514 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/backbones/hrnet.py#L195 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/fcos_hrnetv2p_w18_gn-head_4x4_1x_coco/fcos_hrnetv2p_w18_gn-head_4x4_1x_coco_20201212_100710-4ad151de.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://arxiv.org/abs/1904.04514 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/backbones/hrnet.py#L195 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/fcos_hrnetv2p_w18_gn-head_4x4_2x_coco/fcos_hrnetv2p_w18_gn-head_4x4_2x_coco_20201212_101110-5c575fa5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://arxiv.org/abs/1904.04514 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/backbones/hrnet.py#L195 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/fcos_hrnetv2p_w32_gn-head_4x4_1x_coco/fcos_hrnetv2p_w32_gn-head_4x4_1x_coco_20201211_134730-cb8055c0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | 
SSD_for_PyTorch/configs/hrnet/metafile.yml | https://arxiv.org/abs/1904.04514 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/backbones/hrnet.py#L195 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/fcos_hrnetv2p_w32_gn-head_4x4_2x_coco/fcos_hrnetv2p_w32_gn-head_4x4_2x_coco_20201212_112133-77b6b9bb.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://arxiv.org/abs/1904.04514 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/backbones/hrnet.py#L195 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/fcos_hrnetv2p_w18_gn-head_mstrain_640-800_4x4_2x_coco/fcos_hrnetv2p_w18_gn-head_mstrain_640-800_4x4_2x_coco_20201212_111651-441e9d9f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://arxiv.org/abs/1904.04514 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/backbones/hrnet.py#L195 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/fcos_hrnetv2p_w32_gn-head_mstrain_640-800_4x4_2x_coco/fcos_hrnetv2p_w32_gn-head_mstrain_640-800_4x4_2x_coco_20201212_090846-b6f2b49f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://arxiv.org/abs/1904.04514 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/backbones/hrnet.py#L195 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/fcos_hrnetv2p_w40_gn-head_mstrain_640-800_4x4_2x_coco/fcos_hrnetv2p_w40_gn-head_mstrain_640-800_4x4_2x_coco_20201212_124752-f22d2ce5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://arxiv.org/abs/1904.04514 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/hrnet/metafile.yml | SSD_for_PyTorch/configs/hrnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/backbones/hrnet.py#L195 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/htc/metafile.yml | SSD_for_PyTorch/configs/htc/metafile.yml | https://arxiv.org/abs/1901.07518 | 论文地址 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/htc/metafile.yml | SSD_for_PyTorch/configs/htc/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/detectors/htc.py#L6 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/htc/metafile.yml | SSD_for_PyTorch/configs/htc/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/htc/htc_r50_fpn_1x_coco/htc_r50_fpn_1x_coco_20200317-7332cf16.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/htc/metafile.yml | SSD_for_PyTorch/configs/htc/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/htc/htc_r50_fpn_20e_coco/htc_r50_fpn_20e_coco_20200319-fe28c577.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/htc/metafile.yml | SSD_for_PyTorch/configs/htc/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/htc/htc_r101_fpn_20e_coco/htc_r101_fpn_20e_coco_20200317-9b41b48f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/htc/metafile.yml | SSD_for_PyTorch/configs/htc/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/htc/htc_x101_32x4d_fpn_16x1_20e_coco/htc_x101_32x4d_fpn_16x1_20e_coco_20200318-de97ae01.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/htc/metafile.yml | SSD_for_PyTorch/configs/htc/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/htc/htc_x101_64x4d_fpn_16x1_20e_coco/htc_x101_64x4d_fpn_16x1_20e_coco_20200318-b181fd7a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/htc/metafile.yml | SSD_for_PyTorch/configs/htc/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/htc/htc_x101_64x4d_fpn_dconv_c3-c5_mstrain_400_1400_16x1_20e_coco/htc_x101_64x4d_fpn_dconv_c3-c5_mstrain_400_1400_16x1_20e_coco_20200312-946fd751.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/instaboost/metafile.yml | SSD_for_PyTorch/configs/instaboost/metafile.yml | https://arxiv.org/abs/1908.07801 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/instaboost/metafile.yml | SSD_for_PyTorch/configs/instaboost/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/datasets/pipelines/instaboost.py#L7 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/instaboost/metafile.yml | SSD_for_PyTorch/configs/instaboost/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/instaboost/mask_rcnn_r50_fpn_instaboost_4x_coco/mask_rcnn_r50_fpn_instaboost_4x_coco_20200307-d025f83a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/instaboost/metafile.yml | SSD_for_PyTorch/configs/instaboost/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/instaboost/mask_rcnn_r101_fpn_instaboost_4x_coco/mask_rcnn_r101_fpn_instaboost_4x_coco_20200703_235738-f23f3a5f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/instaboost/metafile.yml | SSD_for_PyTorch/configs/instaboost/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/instaboost/mask_rcnn_x101_64x4d_fpn_instaboost_4x_coco/mask_rcnn_x101_64x4d_fpn_instaboost_4x_coco_20200515_080947-8ed58c1b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/instaboost/metafile.yml | SSD_for_PyTorch/configs/instaboost/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/instaboost/cascade_mask_rcnn_r50_fpn_instaboost_4x_coco/cascade_mask_rcnn_r50_fpn_instaboost_4x_coco_20200307-c19d98d9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/lad/lad_r50_paa_r101_fpn_coco_1x.py | SSD_for_PyTorch/configs/lad/lad_r50_paa_r101_fpn_coco_1x.py | http://download.openmmlab.com/mmdetection/v2.0/paa/paa_r101_fpn_1x_coco/paa_r101_fpn_1x_coco_20200821-0a1825a4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/lad/lad_r101_paa_r50_fpn_coco_1x.py | SSD_for_PyTorch/configs/lad/lad_r101_paa_r50_fpn_coco_1x.py | https://download.openmmlab.com/mmdetection/v2.0/paa/paa_r50_fpn_1x_coco/paa_r50_fpn_1x_coco_20200821-936edec3.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/lad/metafile.yml | SSD_for_PyTorch/configs/lad/metafile.yml | https://arxiv.org/abs/2108.10520 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/lad/metafile.yml | SSD_for_PyTorch/configs/lad/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.19.0/mmdet/models/detectors/lad.py#L10 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/ld/ld_r18_gflv1_r101_fpn_coco_1x.py | SSD_for_PyTorch/configs/ld/ld_r18_gflv1_r101_fpn_coco_1x.py | https://download.openmmlab.com/mmdetection/v2.0/gfl/gfl_r101_fpn_mstrain_2x_coco/gfl_r101_fpn_mstrain_2x_coco_20200629_200126-dd12f847.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/ld/ld_r101_gflv1_r101dcn_fpn_coco_2x.py | SSD_for_PyTorch/configs/ld/ld_r101_gflv1_r101dcn_fpn_coco_2x.py | https://download.openmmlab.com/mmdetection/v2.0/gfl/gfl_r101_fpn_dconv_c3-c5_mstrain_2x_coco/gfl_r101_fpn_dconv_c3-c5_mstrain_2x_coco_20200630_102002-134b07df.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/ld/metafile.yml | SSD_for_PyTorch/configs/ld/metafile.yml | https://arxiv.org/abs/2102.12252 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/ld/metafile.yml | SSD_for_PyTorch/configs/ld/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.11.0/mmdet/models/dense_heads/ld_head.py#L11 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/libra_rcnn/metafile.yml | SSD_for_PyTorch/configs/libra_rcnn/metafile.yml | https://arxiv.org/abs/1904.02701 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/libra_rcnn/metafile.yml | SSD_for_PyTorch/configs/libra_rcnn/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/necks/bfp.py#L10 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/libra_rcnn/metafile.yml | SSD_for_PyTorch/configs/libra_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/libra_rcnn/libra_faster_rcnn_r50_fpn_1x_coco/libra_faster_rcnn_r50_fpn_1x_coco_20200130-3afee3a9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/libra_rcnn/metafile.yml | SSD_for_PyTorch/configs/libra_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/libra_rcnn/libra_faster_rcnn_r101_fpn_1x_coco/libra_faster_rcnn_r101_fpn_1x_coco_20200203-8dba6a5a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/libra_rcnn/metafile.yml | SSD_for_PyTorch/configs/libra_rcnn/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/libra_rcnn/libra_faster_rcnn_x101_64x4d_fpn_1x_coco/libra_faster_rcnn_x101_64x4d_fpn_1x_coco_20200315-3a7d0488.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/libra_rcnn/metafile.yml | SSD_for_PyTorch/configs/libra_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/libra_rcnn/libra_retinanet_r50_fpn_1x_coco/libra_retinanet_r50_fpn_1x_coco_20200205-804d94ce.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/mask_rcnn/metafile.yml | SSD_for_PyTorch/configs/mask_rcnn/metafile.yml | https://arxiv.org/abs/1703.06870v3 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/mask_rcnn/metafile.yml | SSD_for_PyTorch/configs/mask_rcnn/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/detectors/mask_rcnn.py#L6 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/mask_rcnn/metafile.yml | SSD_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r50_caffe_fpn_1x_coco/mask_rcnn_r50_caffe_fpn_1x_coco_bbox_mAP-0.38__segm_mAP-0.344_20200504_231812-0ebd1859.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/mask_rcnn/metafile.yml | SSD_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r50_fpn_1x_coco/mask_rcnn_r50_fpn_1x_coco_20200205-d4b0c5d6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/mask_rcnn/metafile.yml | SSD_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fp16/mask_rcnn_r50_fpn_fp16_1x_coco/mask_rcnn_r50_fpn_fp16_1x_coco_20200205-59faf7e4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/mask_rcnn/metafile.yml | SSD_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r50_fpn_2x_coco/mask_rcnn_r50_fpn_2x_coco_bbox_mAP-0.392__segm_mAP-0.354_20200505_003907-3e542a40.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/mask_rcnn/metafile.yml | SSD_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r101_caffe_fpn_1x_coco/mask_rcnn_r101_caffe_fpn_1x_coco_20200601_095758-805e06c1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/mask_rcnn/metafile.yml | SSD_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r101_fpn_1x_coco/mask_rcnn_r101_fpn_1x_coco_20200204-1efe0ed5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/mask_rcnn/metafile.yml | SSD_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r101_fpn_2x_coco/mask_rcnn_r101_fpn_2x_coco_bbox_mAP-0.408__segm_mAP-0.366_20200505_071027-14b391c7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/mask_rcnn/metafile.yml | SSD_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_x101_32x4d_fpn_1x_coco/mask_rcnn_x101_32x4d_fpn_1x_coco_20200205-478d0b67.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/mask_rcnn/metafile.yml | 
SSD_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_x101_32x4d_fpn_2x_coco/mask_rcnn_x101_32x4d_fpn_2x_coco_bbox_mAP-0.422__segm_mAP-0.378_20200506_004702-faef898c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/mask_rcnn/metafile.yml | SSD_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_x101_64x4d_fpn_1x_coco/mask_rcnn_x101_64x4d_fpn_1x_coco_20200201-9352eb0d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/mask_rcnn/metafile.yml | SSD_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_x101_64x4d_fpn_2x_coco/mask_rcnn_x101_64x4d_fpn_2x_coco_20200509_224208-39d6f70c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/mask_rcnn/metafile.yml | SSD_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r50_caffe_fpn_mstrain-poly_2x_coco/mask_rcnn_r50_caffe_fpn_mstrain-poly_2x_coco_bbox_mAP-0.403__segm_mAP-0.365_20200504_231822-a75c98ce.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/mask_rcnn/metafile.yml | SSD_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r50_caffe_fpn_mstrain-poly_3x_coco/mask_rcnn_r50_caffe_fpn_mstrain-poly_3x_coco_bbox_mAP-0.408__segm_mAP-0.37_20200504_163245-42aa3d00.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/mask_rcnn/metafile.yml | SSD_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r50_fpn_mstrain-poly_3x_coco/mask_rcnn_r50_fpn_mstrain-poly_3x_coco_20210524_201154-21b550bb.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/mask_rcnn/metafile.yml | SSD_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r101_fpn_mstrain-poly_3x_coco/mask_rcnn_r101_fpn_mstrain-poly_3x_coco_20210524_200244-5675c317.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/mask_rcnn/metafile.yml | SSD_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r101_caffe_fpn_mstrain-poly_3x_coco/mask_rcnn_r101_caffe_fpn_mstrain-poly_3x_coco_20210526_132339-3c33ce02.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/mask_rcnn/metafile.yml | SSD_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_x101_32x4d_fpn_mstrain-poly_3x_coco/mask_rcnn_x101_32x4d_fpn_mstrain-poly_3x_coco_20210524_201410-abcd7859.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/mask_rcnn/metafile.yml | SSD_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_x101_32x8d_fpn_mstrain-poly_3x_coco/mask_rcnn_x101_32x8d_fpn_mstrain-poly_3x_coco_20210607_161042-8bd2c639.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/mask_rcnn/metafile.yml | SSD_for_PyTorch/configs/mask_rcnn/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_x101_64x4d_fpn_mstrain-poly_3x_coco/mask_rcnn_x101_64x4d_fpn_mstrain-poly_3x_coco_20210526_120447-c376f129.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/mask2former/mask2former_swin-b-p4-w12-384-in21k_lsj_8x2_50e_coco-panoptic.py | SSD_for_PyTorch/configs/mask2former/mask2former_swin-b-p4-w12-384-in21k_lsj_8x2_50e_coco-panoptic.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_base_patch4_window12_384_22k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/mask2former/mask2former_swin-b-p4-w12-384_lsj_8x2_50e_coco-panoptic.py | SSD_for_PyTorch/configs/mask2former/mask2former_swin-b-p4-w12-384_lsj_8x2_50e_coco-panoptic.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_base_patch4_window12_384.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/mask2former/mask2former_swin-l-p4-w12-384-in21k_lsj_16x1_100e_coco-panoptic.py | SSD_for_PyTorch/configs/mask2former/mask2former_swin-l-p4-w12-384-in21k_lsj_16x1_100e_coco-panoptic.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_large_patch4_window12_384_22k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/mask2former/mask2former_swin-s-p4-w7-224_lsj_8x2_50e_coco.py | SSD_for_PyTorch/configs/mask2former/mask2former_swin-s-p4-w7-224_lsj_8x2_50e_coco.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_small_patch4_window7_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/mask2former/mask2former_swin-s-p4-w7-224_lsj_8x2_50e_coco-panoptic.py | SSD_for_PyTorch/configs/mask2former/mask2former_swin-s-p4-w7-224_lsj_8x2_50e_coco-panoptic.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_small_patch4_window7_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/mask2former/mask2former_swin-t-p4-w7-224_lsj_8x2_50e_coco.py | SSD_for_PyTorch/configs/mask2former/mask2former_swin-t-p4-w7-224_lsj_8x2_50e_coco.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_tiny_patch4_window7_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/mask2former/mask2former_swin-t-p4-w7-224_lsj_8x2_50e_coco-panoptic.py | SSD_for_PyTorch/configs/mask2former/mask2former_swin-t-p4-w7-224_lsj_8x2_50e_coco-panoptic.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_tiny_patch4_window7_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/mask2former/metafile.yml | SSD_for_PyTorch/configs/mask2former/metafile.yml | https://arxiv.org/pdf/2112.01527 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/mask2former/metafile.yml | SSD_for_PyTorch/configs/mask2former/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.23.0/mmdet/models/detectors/mask2former.py#L7 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/mask2former/metafile.yml | SSD_for_PyTorch/configs/mask2former/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask2former/mask2former_swin-s-p4-w7-224_lsj_8x2_50e_coco-panoptic/mask2former_swin-s-p4-w7-224_lsj_8x2_50e_coco-panoptic_20220329_225200-c7b94355.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/mask2former/metafile.yml | SSD_for_PyTorch/configs/mask2former/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask2former/mask2former_r101_lsj_8x2_50e_coco/mask2former_r101_lsj_8x2_50e_coco_20220426_100250-c50b6fa6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/mask2former/metafile.yml | SSD_for_PyTorch/configs/mask2former/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask2former/mask2former_r101_lsj_8x2_50e_coco-panoptic/mask2former_r101_lsj_8x2_50e_coco-panoptic_20220329_225104-c54e64c9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/mask2former/metafile.yml | SSD_for_PyTorch/configs/mask2former/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask2former/mask2former_r50_lsj_8x2_50e_coco-panoptic/mask2former_r50_lsj_8x2_50e_coco-panoptic_20220326_224516-11a44721.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/mask2former/metafile.yml | SSD_for_PyTorch/configs/mask2former/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask2former/mask2former_swin-t-p4-w7-224_lsj_8x2_50e_coco-panoptic/mask2former_swin-t-p4-w7-224_lsj_8x2_50e_coco-panoptic_20220326_224553-fc567107.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/mask2former/metafile.yml | SSD_for_PyTorch/configs/mask2former/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask2former/mask2former_r50_lsj_8x2_50e_coco/mask2former_r50_lsj_8x2_50e_coco_20220506_191028-8e96e88b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/mask2former/metafile.yml | SSD_for_PyTorch/configs/mask2former/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask2former/mask2former_swin-l-p4-w12-384-in21k_lsj_16x1_100e_coco-panoptic/mask2former_swin-l-p4-w12-384-in21k_lsj_16x1_100e_coco-panoptic_20220407_104949-d4919c44.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/mask2former/metafile.yml | SSD_for_PyTorch/configs/mask2former/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask2former/mask2former_swin-b-p4-w12-384-in21k_lsj_8x2_50e_coco-panoptic/mask2former_swin-b-p4-w12-384-in21k_lsj_8x2_50e_coco-panoptic_20220329_230021-3bb8b482.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/mask2former/metafile.yml | SSD_for_PyTorch/configs/mask2former/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask2former/mask2former_swin-b-p4-w12-384_lsj_8x2_50e_coco-panoptic/mask2former_swin-b-p4-w12-384_lsj_8x2_50e_coco-panoptic_20220331_002244-c149a9e9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/mask2former/metafile.yml | SSD_for_PyTorch/configs/mask2former/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask2former/mask2former_swin-t-p4-w7-224_lsj_8x2_50e_coco/mask2former_swin-t-p4-w7-224_lsj_8x2_50e_coco_20220508_091649-4a943037.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/mask2former/metafile.yml | SSD_for_PyTorch/configs/mask2former/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask2former/mask2former_swin-s-p4-w7-224_lsj_8x2_50e_coco/mask2former_swin-s-p4-w7-224_lsj_8x2_50e_coco_20220504_001756-743b7d99.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/maskformer/maskformer_swin-l-p4-w12_mstrain_64x1_300e_coco.py | SSD_for_PyTorch/configs/maskformer/maskformer_swin-l-p4-w12_mstrain_64x1_300e_coco.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_large_patch4_window12_384_22k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/maskformer/metafile.yml | SSD_for_PyTorch/configs/maskformer/metafile.yml | https://arxiv.org/pdf/2107.06278 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/maskformer/metafile.yml | SSD_for_PyTorch/configs/maskformer/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.22.0/mmdet/models/detectors/maskformer.py#L7 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/maskformer/metafile.yml | SSD_for_PyTorch/configs/maskformer/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/maskformer/maskformer_r50_mstrain_16x1_75e_coco/maskformer_r50_mstrain_16x1_75e_coco_20220221_141956-bc2699cb.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/maskformer/metafile.yml | SSD_for_PyTorch/configs/maskformer/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/maskformer/maskformer_swin-l-p4-w12_mstrain_64x1_300e_coco/maskformer_swin-l-p4-w12_mstrain_64x1_300e_coco_20220326_221612-061b4eb8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/ms_rcnn/metafile.yml | SSD_for_PyTorch/configs/ms_rcnn/metafile.yml | https://arxiv.org/abs/1903.00241 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/ms_rcnn/metafile.yml | SSD_for_PyTorch/configs/ms_rcnn/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/detectors/mask_scoring_rcnn.py#L6 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/ms_rcnn/metafile.yml | SSD_for_PyTorch/configs/ms_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/ms_rcnn/ms_rcnn_r50_caffe_fpn_1x_coco/ms_rcnn_r50_caffe_fpn_1x_coco_20200702_180848-61c9355e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/ms_rcnn/metafile.yml | SSD_for_PyTorch/configs/ms_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/ms_rcnn/ms_rcnn_r50_caffe_fpn_2x_coco/ms_rcnn_r50_caffe_fpn_2x_coco_bbox_mAP-0.388__segm_mAP-0.363_20200506_004738-ee87b137.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/ms_rcnn/metafile.yml | SSD_for_PyTorch/configs/ms_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/ms_rcnn/ms_rcnn_r101_caffe_fpn_1x_coco/ms_rcnn_r101_caffe_fpn_1x_coco_bbox_mAP-0.404__segm_mAP-0.376_20200506_004755-b9b12a37.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/ms_rcnn/metafile.yml | SSD_for_PyTorch/configs/ms_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/ms_rcnn/ms_rcnn_r101_caffe_fpn_2x_coco/ms_rcnn_r101_caffe_fpn_2x_coco_bbox_mAP-0.411__segm_mAP-0.381_20200506_011134-5f3cc74f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/ms_rcnn/metafile.yml | SSD_for_PyTorch/configs/ms_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/ms_rcnn/ms_rcnn_x101_32x4d_fpn_1x_coco/ms_rcnn_x101_32x4d_fpn_1x_coco_20200206-81fd1740.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/ms_rcnn/metafile.yml | SSD_for_PyTorch/configs/ms_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/ms_rcnn/ms_rcnn_x101_64x4d_fpn_1x_coco/ms_rcnn_x101_64x4d_fpn_1x_coco_20200206-86ba88d2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/ms_rcnn/metafile.yml | SSD_for_PyTorch/configs/ms_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/ms_rcnn/ms_rcnn_x101_64x4d_fpn_2x_coco/ms_rcnn_x101_64x4d_fpn_2x_coco_20200308-02a445e2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/nas_fcos/metafile.yml | SSD_for_PyTorch/configs/nas_fcos/metafile.yml | https://arxiv.org/abs/1906.04423 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/nas_fcos/metafile.yml | SSD_for_PyTorch/configs/nas_fcos/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.1.0/mmdet/models/detectors/nasfcos.py#L6 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/nas_fcos/metafile.yml | SSD_for_PyTorch/configs/nas_fcos/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/nas_fcos/nas_fcos_nashead_r50_caffe_fpn_gn-head_4x4_1x_coco/nas_fcos_nashead_r50_caffe_fpn_gn-head_4x4_1x_coco_20200520-1bdba3ce.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/nas_fcos/metafile.yml | SSD_for_PyTorch/configs/nas_fcos/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/nas_fcos/nas_fcos_fcoshead_r50_caffe_fpn_gn-head_4x4_1x_coco/nas_fcos_fcoshead_r50_caffe_fpn_gn-head_4x4_1x_coco_20200521-7fdcbce0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/nas_fpn/metafile.yml | SSD_for_PyTorch/configs/nas_fpn/metafile.yml | https://arxiv.org/abs/1904.07392 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/nas_fpn/metafile.yml | SSD_for_PyTorch/configs/nas_fpn/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/necks/nas_fpn.py#L67 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/nas_fpn/metafile.yml | SSD_for_PyTorch/configs/nas_fpn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/nas_fpn/retinanet_r50_fpn_crop640_50e_coco/retinanet_r50_fpn_crop640_50e_coco-9b953d76.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/nas_fpn/metafile.yml | SSD_for_PyTorch/configs/nas_fpn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/nas_fpn/retinanet_r50_nasfpn_crop640_50e_coco/retinanet_r50_nasfpn_crop640_50e_coco-0ad1f644.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/openimages/metafile.yml | SSD_for_PyTorch/configs/openimages/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/openimages/faster_rcnn_r50_fpn_32x2_1x_openimages/faster_rcnn_r50_fpn_32x2_1x_openimages_20211130_231159-e87ab7ce.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/openimages/metafile.yml | SSD_for_PyTorch/configs/openimages/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/openimages/retinanet_r50_fpn_32x2_1x_openimages/retinanet_r50_fpn_32x2_1x_openimages_20211223_071954-d2ae5462.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/openimages/metafile.yml | SSD_for_PyTorch/configs/openimages/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/openimages/ssd300_32x8_36e_openimages/ssd300_32x8_36e_openimages_20211224_000232-dce93846.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/openimages/metafile.yml | SSD_for_PyTorch/configs/openimages/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/openimages/faster_rcnn_r50_fpn_32x2_1x_openimages_challenge/faster_rcnn_r50_fpn_32x2_1x_openimages_challenge_20220114_045100-0e79e5df.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/openimages/metafile.yml | SSD_for_PyTorch/configs/openimages/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/openimages/faster_rcnn_r50_fpn_32x2_cas_1x_openimages/faster_rcnn_r50_fpn_32x2_cas_1x_openimages_20220306_202424-98c630e5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/openimages/metafile.yml | SSD_for_PyTorch/configs/openimages/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/openimages/faster_rcnn_r50_fpn_32x2_cas_1x_openimages_challenge/faster_rcnn_r50_fpn_32x2_cas_1x_openimages_challenge_20220221_192021-34c402d9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/paa/metafile.yml | SSD_for_PyTorch/configs/paa/metafile.yml | https://arxiv.org/abs/2007.08103 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/paa/metafile.yml | SSD_for_PyTorch/configs/paa/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.4.0/mmdet/models/detectors/paa.py#L6 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/paa/metafile.yml | SSD_for_PyTorch/configs/paa/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/paa/paa_r50_fpn_1x_coco/paa_r50_fpn_1x_coco_20200821-936edec3.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/paa/metafile.yml | SSD_for_PyTorch/configs/paa/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/paa/paa_r50_fpn_1.5x_coco/paa_r50_fpn_1.5x_coco_20200823-805d6078.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/paa/metafile.yml | SSD_for_PyTorch/configs/paa/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/paa/paa_r50_fpn_2x_coco/paa_r50_fpn_2x_coco_20200821-c98bfc4e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/paa/metafile.yml | SSD_for_PyTorch/configs/paa/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/paa/paa_r50_fpn_mstrain_3x_coco/paa_r50_fpn_mstrain_3x_coco_20210121_145722-06a6880b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/paa/metafile.yml | SSD_for_PyTorch/configs/paa/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/paa/paa_r101_fpn_1x_coco/paa_r101_fpn_1x_coco_20200821-0a1825a4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/paa/metafile.yml | SSD_for_PyTorch/configs/paa/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/paa/paa_r101_fpn_2x_coco/paa_r101_fpn_2x_coco_20200821-6829f96b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/paa/metafile.yml | SSD_for_PyTorch/configs/paa/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/paa/paa_r101_fpn_mstrain_3x_coco/paa_r101_fpn_mstrain_3x_coco_20210122_084202-83250d22.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/pafpn/metafile.yml | SSD_for_PyTorch/configs/pafpn/metafile.yml | https://arxiv.org/abs/1803.01534 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/pafpn/metafile.yml | SSD_for_PyTorch/configs/pafpn/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/necks/pafpn.py#L11 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/pafpn/metafile.yml | SSD_for_PyTorch/configs/pafpn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/pafpn/faster_rcnn_r50_pafpn_1x_coco/faster_rcnn_r50_pafpn_1x_coco_bbox_mAP-0.375_20200503_105836-b7b4b9bd.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/panoptic_fpn/metafile.yml | SSD_for_PyTorch/configs/panoptic_fpn/metafile.yml | https://arxiv.org/pdf/1901.02446 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/panoptic_fpn/metafile.yml | SSD_for_PyTorch/configs/panoptic_fpn/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.16.0/mmdet/models/detectors/panoptic_fpn.py#L7 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/panoptic_fpn/metafile.yml | SSD_for_PyTorch/configs/panoptic_fpn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/panoptic_fpn/panoptic_fpn_r50_fpn_1x_coco/panoptic_fpn_r50_fpn_1x_coco_20210821_101153-9668fd13.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/panoptic_fpn/metafile.yml | SSD_for_PyTorch/configs/panoptic_fpn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/panoptic_fpn/panoptic_fpn_r50_fpn_mstrain_3x_coco/panoptic_fpn_r50_fpn_mstrain_3x_coco_20210824_171155-5650f98b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/panoptic_fpn/metafile.yml | SSD_for_PyTorch/configs/panoptic_fpn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/panoptic_fpn/panoptic_fpn_r101_fpn_1x_coco/panoptic_fpn_r101_fpn_1x_coco_20210820_193950-ab9157a2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/panoptic_fpn/metafile.yml | SSD_for_PyTorch/configs/panoptic_fpn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/panoptic_fpn/panoptic_fpn_r101_fpn_mstrain_3x_coco/panoptic_fpn_r101_fpn_mstrain_3x_coco_20210823_114712-9c99acc4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/pisa/metafile.yml | SSD_for_PyTorch/configs/pisa/metafile.yml | https://arxiv.org/abs/1904.04821 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/pisa/metafile.yml | SSD_for_PyTorch/configs/pisa/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.1.0/mmdet/models/roi_heads/pisa_roi_head.py#L8 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/pisa/metafile.yml | SSD_for_PyTorch/configs/pisa/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/pisa/pisa_faster_rcnn_r50_fpn_1x_coco/pisa_faster_rcnn_r50_fpn_1x_coco-dea93523.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/pisa/metafile.yml | SSD_for_PyTorch/configs/pisa/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/pisa/pisa_faster_rcnn_x101_32x4d_fpn_1x_coco/pisa_faster_rcnn_x101_32x4d_fpn_1x_coco-e4accec4.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/pisa/metafile.yml | SSD_for_PyTorch/configs/pisa/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/pisa/pisa_mask_rcnn_r50_fpn_1x_coco/pisa_mask_rcnn_r50_fpn_1x_coco-dfcedba6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/pisa/metafile.yml | SSD_for_PyTorch/configs/pisa/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/pisa/pisa_retinanet_r50_fpn_1x_coco/pisa_retinanet_r50_fpn_1x_coco-76409952.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/pisa/metafile.yml | SSD_for_PyTorch/configs/pisa/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/pisa/pisa_retinanet_x101_32x4d_fpn_1x_coco/pisa_retinanet_x101_32x4d_fpn_1x_coco-a0c13c73.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/pisa/metafile.yml | SSD_for_PyTorch/configs/pisa/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/pisa/pisa_ssd300_coco/pisa_ssd300_coco-710e3ac9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/pisa/metafile.yml | SSD_for_PyTorch/configs/pisa/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/pisa/pisa_ssd512_coco/pisa_ssd512_coco-247addee.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/point_rend/metafile.yml | SSD_for_PyTorch/configs/point_rend/metafile.yml | https://arxiv.org/abs/1912.08193 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/point_rend/metafile.yml | SSD_for_PyTorch/configs/point_rend/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.2.0/mmdet/models/detectors/point_rend.py#L6 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/point_rend/metafile.yml | SSD_for_PyTorch/configs/point_rend/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/point_rend/point_rend_r50_caffe_fpn_mstrain_1x_coco/point_rend_r50_caffe_fpn_mstrain_1x_coco-1bcb5fb4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/point_rend/metafile.yml | SSD_for_PyTorch/configs/point_rend/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/point_rend/point_rend_r50_caffe_fpn_mstrain_3x_coco/point_rend_r50_caffe_fpn_mstrain_3x_coco-e0ebb6b7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/pvt/metafile.yml | SSD_for_PyTorch/configs/pvt/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/pvt/retinanet_pvt-t_fpn_1x_coco/retinanet_pvt-t_fpn_1x_coco_20210831_103110-17b566bd.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/pvt/metafile.yml | SSD_for_PyTorch/configs/pvt/metafile.yml | https://arxiv.org/abs/2102.12122 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/pvt/metafile.yml | SSD_for_PyTorch/configs/pvt/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.17.0/mmdet/models/backbones/pvt.py#L315 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/pvt/metafile.yml | SSD_for_PyTorch/configs/pvt/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/pvt/retinanet_pvt-s_fpn_1x_coco/retinanet_pvt-s_fpn_1x_coco_20210906_142921-b6c94a5b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/pvt/metafile.yml | SSD_for_PyTorch/configs/pvt/metafile.yml | 
https://arxiv.org/abs/2102.12122 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/pvt/metafile.yml | SSD_for_PyTorch/configs/pvt/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.17.0/mmdet/models/backbones/pvt.py#L315 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/pvt/metafile.yml | SSD_for_PyTorch/configs/pvt/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/pvt/retinanet_pvt-m_fpn_1x_coco/retinanet_pvt-m_fpn_1x_coco_20210831_103243-55effa1b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/pvt/metafile.yml | SSD_for_PyTorch/configs/pvt/metafile.yml | https://arxiv.org/abs/2102.12122 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/pvt/metafile.yml | SSD_for_PyTorch/configs/pvt/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.17.0/mmdet/models/backbones/pvt.py#L315 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/pvt/metafile.yml | SSD_for_PyTorch/configs/pvt/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/pvt/retinanet_pvtv2-b0_fpn_1x_coco/retinanet_pvtv2-b0_fpn_1x_coco_20210831_103157-13e9aabe.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/pvt/metafile.yml | SSD_for_PyTorch/configs/pvt/metafile.yml | https://arxiv.org/abs/2106.13797 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/pvt/metafile.yml | SSD_for_PyTorch/configs/pvt/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.17.0/mmdet/models/backbones/pvt.py#L543 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/pvt/metafile.yml | SSD_for_PyTorch/configs/pvt/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/pvt/retinanet_pvtv2-b1_fpn_1x_coco/retinanet_pvtv2-b1_fpn_1x_coco_20210831_103318-7e169a7d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/pvt/metafile.yml | SSD_for_PyTorch/configs/pvt/metafile.yml | https://arxiv.org/abs/2106.13797 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/pvt/metafile.yml | SSD_for_PyTorch/configs/pvt/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.17.0/mmdet/models/backbones/pvt.py#L543 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/pvt/metafile.yml | SSD_for_PyTorch/configs/pvt/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/pvt/retinanet_pvtv2-b2_fpn_1x_coco/retinanet_pvtv2-b2_fpn_1x_coco_20210901_174843-529f0b9a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/pvt/metafile.yml | SSD_for_PyTorch/configs/pvt/metafile.yml | https://arxiv.org/abs/2106.13797 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/pvt/metafile.yml | SSD_for_PyTorch/configs/pvt/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.17.0/mmdet/models/backbones/pvt.py#L543 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/pvt/metafile.yml | SSD_for_PyTorch/configs/pvt/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/pvt/retinanet_pvtv2-b3_fpn_1x_coco/retinanet_pvtv2-b3_fpn_1x_coco_20210903_151512-8357deff.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/pvt/metafile.yml | SSD_for_PyTorch/configs/pvt/metafile.yml | 
https://arxiv.org/abs/2106.13797 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/pvt/metafile.yml | SSD_for_PyTorch/configs/pvt/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.17.0/mmdet/models/backbones/pvt.py#L543 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/pvt/metafile.yml | SSD_for_PyTorch/configs/pvt/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/pvt/retinanet_pvtv2-b4_fpn_1x_coco/retinanet_pvtv2-b4_fpn_1x_coco_20210901_170151-83795c86.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/pvt/metafile.yml | SSD_for_PyTorch/configs/pvt/metafile.yml | https://arxiv.org/abs/2106.13797 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/pvt/metafile.yml | SSD_for_PyTorch/configs/pvt/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.17.0/mmdet/models/backbones/pvt.py#L543 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/pvt/metafile.yml | SSD_for_PyTorch/configs/pvt/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/pvt/retinanet_pvtv2-b5_fpn_1x_coco/retinanet_pvtv2-b5_fpn_1x_coco_20210902_201800-3420eb57.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/pvt/metafile.yml | SSD_for_PyTorch/configs/pvt/metafile.yml | https://arxiv.org/abs/2106.13797 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/pvt/metafile.yml | SSD_for_PyTorch/configs/pvt/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.17.0/mmdet/models/backbones/pvt.py#L543 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/pvt/retinanet_pvt-l_fpn_1x_coco.py | SSD_for_PyTorch/configs/pvt/retinanet_pvt-l_fpn_1x_coco.py | https://github.com/whai362/PVT/ | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/pvt/retinanet_pvt-m_fpn_1x_coco.py | SSD_for_PyTorch/configs/pvt/retinanet_pvt-m_fpn_1x_coco.py | https://github.com/whai362/PVT/ | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/pvt/retinanet_pvt-s_fpn_1x_coco.py | SSD_for_PyTorch/configs/pvt/retinanet_pvt-s_fpn_1x_coco.py | https://github.com/whai362/PVT/ | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/pvt/retinanet_pvt-t_fpn_1x_coco.py | SSD_for_PyTorch/configs/pvt/retinanet_pvt-t_fpn_1x_coco.py | https://github.com/whai362/PVT/ | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/pvt/retinanet_pvtv2-b0_fpn_1x_coco.py | SSD_for_PyTorch/configs/pvt/retinanet_pvtv2-b0_fpn_1x_coco.py | https://github.com/whai362/PVT/ | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/pvt/retinanet_pvtv2-b1_fpn_1x_coco.py | SSD_for_PyTorch/configs/pvt/retinanet_pvtv2-b1_fpn_1x_coco.py | https://github.com/whai362/PVT/ | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/pvt/retinanet_pvtv2-b2_fpn_1x_coco.py | SSD_for_PyTorch/configs/pvt/retinanet_pvtv2-b2_fpn_1x_coco.py | https://github.com/whai362/PVT/ | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/pvt/retinanet_pvtv2-b3_fpn_1x_coco.py | SSD_for_PyTorch/configs/pvt/retinanet_pvtv2-b3_fpn_1x_coco.py | https://github.com/whai362/PVT/ | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/pvt/retinanet_pvtv2-b4_fpn_1x_coco.py 
| SSD_for_PyTorch/configs/pvt/retinanet_pvtv2-b4_fpn_1x_coco.py | https://github.com/whai362/PVT/ | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/pvt/retinanet_pvtv2-b5_fpn_1x_coco.py | SSD_for_PyTorch/configs/pvt/retinanet_pvtv2-b5_fpn_1x_coco.py | https://github.com/whai362/PVT/ | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/queryinst/metafile.yml | SSD_for_PyTorch/configs/queryinst/metafile.yml | https://openaccess.thecvf.com/content/ICCV2021/papers/Fang_Instances_As_Queries_ICCV_2021_paper.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/queryinst/metafile.yml | SSD_for_PyTorch/configs/queryinst/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/master/mmdet/models/detectors/queryinst.py | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/queryinst/metafile.yml | SSD_for_PyTorch/configs/queryinst/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/queryinst/queryinst_r50_fpn_1x_coco/queryinst_r50_fpn_1x_coco_20210907_084916-5a8f1998.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/queryinst/metafile.yml | SSD_for_PyTorch/configs/queryinst/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/queryinst/queryinst_r50_fpn_mstrain_480-800_3x_coco/queryinst_r50_fpn_mstrain_480-800_3x_coco_20210901_103643-7837af86.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/queryinst/metafile.yml | SSD_for_PyTorch/configs/queryinst/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/queryinst/queryinst_r50_fpn_300_proposals_crop_mstrain_480-800_3x_coco/queryinst_r50_fpn_300_proposals_crop_mstrain_480-800_3x_coco_20210904_101802-85cffbd8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/queryinst/metafile.yml | SSD_for_PyTorch/configs/queryinst/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/queryinst/queryinst_r101_fpn_mstrain_480-800_3x_coco/queryinst_r101_fpn_mstrain_480-800_3x_coco_20210904_104048-91f9995b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/queryinst/metafile.yml | SSD_for_PyTorch/configs/queryinst/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/queryinst/queryinst_r101_fpn_300_proposals_crop_mstrain_480-800_3x_coco/queryinst_r101_fpn_300_proposals_crop_mstrain_480-800_3x_coco_20210904_153621-76cce59f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/mask_rcnn_regnetx-3.2GF_fpn_1x_coco/mask_rcnn_regnetx-3.2GF_fpn_1x_coco_20200520_163141-2a9d1814.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://arxiv.org/abs/2003.13678 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.1.0/mmdet/models/backbones/regnet.py#L11 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/regnet/mask_rcnn_regnetx-4GF_fpn_1x_coco/mask_rcnn_regnetx-4GF_fpn_1x_coco_20200517_180217-32e9c92d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://arxiv.org/abs/2003.13678 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.1.0/mmdet/models/backbones/regnet.py#L11 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/mask_rcnn_regnetx-6.4GF_fpn_1x_coco/mask_rcnn_regnetx-6.4GF_fpn_1x_coco_20200517_180439-3a7aae83.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://arxiv.org/abs/2003.13678 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.1.0/mmdet/models/backbones/regnet.py#L11 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/mask_rcnn_regnetx-8GF_fpn_1x_coco/mask_rcnn_regnetx-8GF_fpn_1x_coco_20200517_180515-09daa87e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://arxiv.org/abs/2003.13678 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.1.0/mmdet/models/backbones/regnet.py#L11 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/mask_rcnn_regnetx-12GF_fpn_1x_coco/mask_rcnn_regnetx-12GF_fpn_1x_coco_20200517_180552-b538bd8b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://arxiv.org/abs/2003.13678 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.1.0/mmdet/models/backbones/regnet.py#L11 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/mask_rcnn_regnetx-3.2GF_fpn_mdconv_c3-c5_1x_coco/mask_rcnn_regnetx-3.2GF_fpn_mdconv_c3-c5_1x_coco_20200520_172726-75f40794.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://arxiv.org/abs/2003.13678 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.1.0/mmdet/models/backbones/regnet.py#L11 
| 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/faster_rcnn_regnetx-3.2GF_fpn_1x_coco/faster_rcnn_regnetx-3.2GF_fpn_1x_coco_20200517_175927-126fd9bf.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://arxiv.org/abs/2003.13678 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.1.0/mmdet/models/backbones/regnet.py#L11 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/faster_rcnn_regnetx-3.2GF_fpn_2x_coco/faster_rcnn_regnetx-3.2GF_fpn_2x_coco_20200520_223955-e2081918.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://arxiv.org/abs/2003.13678 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.1.0/mmdet/models/backbones/regnet.py#L11 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/retinanet_regnetx-800MF_fpn_1x_coco/retinanet_regnetx-800MF_fpn_1x_coco_20200517_191403-f6f91d10.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://arxiv.org/abs/2003.13678 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.1.0/mmdet/models/backbones/regnet.py#L11 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/retinanet_regnetx-1.6GF_fpn_1x_coco/retinanet_regnetx-1.6GF_fpn_1x_coco_20200517_191403-37009a9d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://arxiv.org/abs/2003.13678 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.1.0/mmdet/models/backbones/regnet.py#L11 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/retinanet_regnetx-3.2GF_fpn_1x_coco/retinanet_regnetx-3.2GF_fpn_1x_coco_20200520_163141-cb1509e8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://arxiv.org/abs/2003.13678 | 论文地址 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.1.0/mmdet/models/backbones/regnet.py#L11 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/faster_rcnn_regnetx-400MF_fpn_mstrain_3x_coco/faster_rcnn_regnetx-400MF_fpn_mstrain_3x_coco_20210526_095112-e1967c37.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://arxiv.org/abs/2003.13678 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.1.0/mmdet/models/backbones/regnet.py#L11 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/faster_rcnn_regnetx-800MF_fpn_mstrain_3x_coco/faster_rcnn_regnetx-800MF_fpn_mstrain_3x_coco_20210526_095118-a2c70b20.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://arxiv.org/abs/2003.13678 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.1.0/mmdet/models/backbones/regnet.py#L11 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/faster_rcnn_regnetx-1.6GF_fpn_mstrain_3x_coco/faster_rcnn_regnetx-1_20210526_095325-94aa46cc.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://arxiv.org/abs/2003.13678 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.1.0/mmdet/models/backbones/regnet.py#L11 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/faster_rcnn_regnetx-3.2GF_fpn_mstrain_3x_coco/faster_rcnn_regnetx-3_20210526_095152-e16a5227.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://arxiv.org/abs/2003.13678 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.1.0/mmdet/models/backbones/regnet.py#L11 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/faster_rcnn_regnetx-4GF_fpn_mstrain_3x_coco/faster_rcnn_regnetx-4GF_fpn_mstrain_3x_coco_20210526_095201-65eaf841.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://arxiv.org/abs/2003.13678 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.1.0/mmdet/models/backbones/regnet.py#L11 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/mask_rcnn_regnetx-3.2GF_fpn_mstrain_3x_coco/mask_rcnn_regnetx-3.2GF_fpn_mstrain_3x_coco_20200521_202221-99879813.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://arxiv.org/abs/2003.13678 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.1.0/mmdet/models/backbones/regnet.py#L11 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/mask_rcnn_regnetx-400MF_fpn_mstrain-poly_3x_coco/mask_rcnn_regnetx-400MF_fpn_mstrain-poly_3x_coco_20210601_235443-8aac57a4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://arxiv.org/abs/2003.13678 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.1.0/mmdet/models/backbones/regnet.py#L11 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/mask_rcnn_regnetx-800MF_fpn_mstrain-poly_3x_coco/mask_rcnn_regnetx-800MF_fpn_mstrain-poly_3x_coco_20210602_210641-715d51f5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://arxiv.org/abs/2003.13678 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.1.0/mmdet/models/backbones/regnet.py#L11 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/mask_rcnn_regnetx-1.6GF_fpn_mstrain-poly_3x_coco/mask_rcnn_regnetx-1_20210602_210641-6764cff5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://arxiv.org/abs/2003.13678 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.1.0/mmdet/models/backbones/regnet.py#L11 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | 
SSD_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/mask_rcnn_regnetx-1.6GF_fpn_mstrain-poly_3x_coco/mask_rcnn_regnetx-1_20210602_210641-6e63e19c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://arxiv.org/abs/2003.13678 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.1.0/mmdet/models/backbones/regnet.py#L11 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/mask_rcnn_regnetx-4GF_fpn_mstrain-poly_3x_coco/mask_rcnn_regnetx-4GF_fpn_mstrain-poly_3x_coco_20210602_032621-00f0331c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://arxiv.org/abs/2003.13678 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.1.0/mmdet/models/backbones/regnet.py#L11 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/cascade_mask_rcnn_regnetx-400MF_fpn_mstrain_3x_coco/cascade_mask_rcnn_regnetx-400MF_fpn_mstrain_3x_coco_20210715_211619-5142f449.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://arxiv.org/abs/2003.13678 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.1.0/mmdet/models/backbones/regnet.py#L11 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/cascade_mask_rcnn_regnetx-800MF_fpn_mstrain_3x_coco/cascade_mask_rcnn_regnetx-800MF_fpn_mstrain_3x_coco_20210715_211616-dcbd13f4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://arxiv.org/abs/2003.13678 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.1.0/mmdet/models/backbones/regnet.py#L11 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/cascade_mask_rcnn_regnetx-1.6GF_fpn_mstrain_3x_coco/cascade_mask_rcnn_regnetx-1_20210715_211616-75f29a61.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://arxiv.org/abs/2003.13678 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | 
SSD_for_PyTorch/configs/regnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.1.0/mmdet/models/backbones/regnet.py#L11 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/cascade_mask_rcnn_regnetx-3.2GF_fpn_mstrain_3x_coco/cascade_mask_rcnn_regnetx-3_20210715_211616-b9c2c58b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://arxiv.org/abs/2003.13678 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.1.0/mmdet/models/backbones/regnet.py#L11 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/cascade_mask_rcnn_regnetx-4GF_fpn_mstrain_3x_coco/cascade_mask_rcnn_regnetx-4GF_fpn_mstrain_3x_coco_20210715_212034-cbb1be4c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://arxiv.org/abs/2003.13678 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/regnet/metafile.yml | SSD_for_PyTorch/configs/regnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.1.0/mmdet/models/backbones/regnet.py#L11 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/reppoints/metafile.yml | SSD_for_PyTorch/configs/reppoints/metafile.yml | https://arxiv.org/abs/1904.11490 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/reppoints/metafile.yml | SSD_for_PyTorch/configs/reppoints/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/detectors/reppoints_detector.py#L9 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/reppoints/metafile.yml | SSD_for_PyTorch/configs/reppoints/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/reppoints/bbox_r50_grid_fpn_gn-neck%2Bhead_1x_coco/bbox_r50_grid_fpn_gn-neck%2Bhead_1x_coco_20200329-c98bfa96.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/reppoints/metafile.yml | SSD_for_PyTorch/configs/reppoints/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/reppoints/bbox_r50_grid_center_fpn_gn-neck%2Bhead_1x_coco/bbox_r50_grid_center_fpn_gn-neck%2Bhead_1x_coco_20200330-00f73d58.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/reppoints/metafile.yml | SSD_for_PyTorch/configs/reppoints/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/reppoints/reppoints_moment_r50_fpn_1x_coco/reppoints_moment_r50_fpn_1x_coco_20200330-b73db8d1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/reppoints/metafile.yml | SSD_for_PyTorch/configs/reppoints/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/reppoints/reppoints_moment_r50_fpn_gn-neck%2Bhead_1x_coco/reppoints_moment_r50_fpn_gn-neck%2Bhead_1x_coco_20200329-4b38409a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/reppoints/metafile.yml | 
SSD_for_PyTorch/configs/reppoints/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/reppoints/reppoints_moment_r50_fpn_gn-neck%2Bhead_2x_coco/reppoints_moment_r50_fpn_gn-neck%2Bhead_2x_coco_20200329-91babaa2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/reppoints/metafile.yml | SSD_for_PyTorch/configs/reppoints/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/reppoints/reppoints_moment_r101_fpn_gn-neck%2Bhead_2x_coco/reppoints_moment_r101_fpn_gn-neck%2Bhead_2x_coco_20200329-4fbc7310.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/reppoints/metafile.yml | SSD_for_PyTorch/configs/reppoints/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/reppoints/reppoints_moment_r101_fpn_dconv_c3-c5_gn-neck%2Bhead_2x_coco/reppoints_moment_r101_fpn_dconv_c3-c5_gn-neck%2Bhead_2x_coco_20200329-3309fbf2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/reppoints/metafile.yml | SSD_for_PyTorch/configs/reppoints/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/reppoints/reppoints_moment_x101_fpn_dconv_c3-c5_gn-neck%2Bhead_2x_coco/reppoints_moment_x101_fpn_dconv_c3-c5_gn-neck%2Bhead_2x_coco_20200329-f87da1ea.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/res2net/metafile.yml | SSD_for_PyTorch/configs/res2net/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/res2net/faster_rcnn_r2_101_fpn_2x_coco/faster_rcnn_r2_101_fpn_2x_coco-175f1da6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/res2net/metafile.yml | SSD_for_PyTorch/configs/res2net/metafile.yml | https://arxiv.org/abs/1904.01169 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/res2net/metafile.yml | SSD_for_PyTorch/configs/res2net/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.1.0/mmdet/models/backbones/res2net.py#L239 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/res2net/metafile.yml | SSD_for_PyTorch/configs/res2net/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/res2net/mask_rcnn_r2_101_fpn_2x_coco/mask_rcnn_r2_101_fpn_2x_coco-17f061e8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/res2net/metafile.yml | SSD_for_PyTorch/configs/res2net/metafile.yml | https://arxiv.org/abs/1904.01169 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/res2net/metafile.yml | SSD_for_PyTorch/configs/res2net/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.1.0/mmdet/models/backbones/res2net.py#L239 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/res2net/metafile.yml | SSD_for_PyTorch/configs/res2net/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/res2net/cascade_rcnn_r2_101_fpn_20e_coco/cascade_rcnn_r2_101_fpn_20e_coco-f4b7b7db.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/res2net/metafile.yml | SSD_for_PyTorch/configs/res2net/metafile.yml | https://arxiv.org/abs/1904.01169 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/res2net/metafile.yml | SSD_for_PyTorch/configs/res2net/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.1.0/mmdet/models/backbones/res2net.py#L239 | 代码链接 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/res2net/metafile.yml | SSD_for_PyTorch/configs/res2net/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/res2net/cascade_mask_rcnn_r2_101_fpn_20e_coco/cascade_mask_rcnn_r2_101_fpn_20e_coco-8a7b41e1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/res2net/metafile.yml | SSD_for_PyTorch/configs/res2net/metafile.yml | https://arxiv.org/abs/1904.01169 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/res2net/metafile.yml | SSD_for_PyTorch/configs/res2net/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.1.0/mmdet/models/backbones/res2net.py#L239 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/res2net/metafile.yml | SSD_for_PyTorch/configs/res2net/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/res2net/htc_r2_101_fpn_20e_coco/htc_r2_101_fpn_20e_coco-3a8d2112.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/res2net/metafile.yml | SSD_for_PyTorch/configs/res2net/metafile.yml | https://arxiv.org/abs/1904.01169 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/res2net/metafile.yml | SSD_for_PyTorch/configs/res2net/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.1.0/mmdet/models/backbones/res2net.py#L239 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/resnest/metafile.yml | SSD_for_PyTorch/configs/resnest/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/resnest/faster_rcnn_s50_fpn_syncbn-backbone%2Bhead_mstrain-range_1x_coco/faster_rcnn_s50_fpn_syncbn-backbone%2Bhead_mstrain-range_1x_coco_20200926_125502-20289c16.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/resnest/metafile.yml | SSD_for_PyTorch/configs/resnest/metafile.yml | https://arxiv.org/abs/2004.08955 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/resnest/metafile.yml | SSD_for_PyTorch/configs/resnest/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.7.0/mmdet/models/backbones/resnest.py#L273 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/resnest/metafile.yml | SSD_for_PyTorch/configs/resnest/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/resnest/faster_rcnn_s101_fpn_syncbn-backbone%2Bhead_mstrain-range_1x_coco/faster_rcnn_s101_fpn_syncbn-backbone%2Bhead_mstrain-range_1x_coco_20201006_021058-421517f1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/resnest/metafile.yml | SSD_for_PyTorch/configs/resnest/metafile.yml | https://arxiv.org/abs/2004.08955 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/resnest/metafile.yml | SSD_for_PyTorch/configs/resnest/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.7.0/mmdet/models/backbones/resnest.py#L273 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/resnest/metafile.yml | SSD_for_PyTorch/configs/resnest/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/resnest/mask_rcnn_s50_fpn_syncbn-backbone%2Bhead_mstrain_1x_coco/mask_rcnn_s50_fpn_syncbn-backbone%2Bhead_mstrain_1x_coco_20200926_125503-8a2c3d47.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/resnest/metafile.yml | SSD_for_PyTorch/configs/resnest/metafile.yml | 
https://arxiv.org/abs/2004.08955 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/resnest/metafile.yml | SSD_for_PyTorch/configs/resnest/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.7.0/mmdet/models/backbones/resnest.py#L273 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/resnest/metafile.yml | SSD_for_PyTorch/configs/resnest/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/resnest/mask_rcnn_s101_fpn_syncbn-backbone%2Bhead_mstrain_1x_coco/mask_rcnn_s101_fpn_syncbn-backbone%2Bhead_mstrain_1x_coco_20201005_215831-af60cdf9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/resnest/metafile.yml | SSD_for_PyTorch/configs/resnest/metafile.yml | https://arxiv.org/abs/2004.08955 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/resnest/metafile.yml | SSD_for_PyTorch/configs/resnest/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.7.0/mmdet/models/backbones/resnest.py#L273 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/resnest/metafile.yml | SSD_for_PyTorch/configs/resnest/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/resnest/cascade_rcnn_s50_fpn_syncbn-backbone%2Bhead_mstrain-range_1x_coco/cascade_rcnn_s50_fpn_syncbn-backbone%2Bhead_mstrain-range_1x_coco_20201122_213640-763cc7b5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/resnest/metafile.yml | SSD_for_PyTorch/configs/resnest/metafile.yml | https://arxiv.org/abs/2004.08955 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/resnest/metafile.yml | SSD_for_PyTorch/configs/resnest/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.7.0/mmdet/models/backbones/resnest.py#L273 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/resnest/metafile.yml | SSD_for_PyTorch/configs/resnest/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/resnest/cascade_rcnn_s101_fpn_syncbn-backbone%2Bhead_mstrain-range_1x_coco/cascade_rcnn_s101_fpn_syncbn-backbone%2Bhead_mstrain-range_1x_coco_20201005_113242-b9459f8f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/resnest/metafile.yml | SSD_for_PyTorch/configs/resnest/metafile.yml | https://arxiv.org/abs/2004.08955 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/resnest/metafile.yml | SSD_for_PyTorch/configs/resnest/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.7.0/mmdet/models/backbones/resnest.py#L273 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/resnest/metafile.yml | SSD_for_PyTorch/configs/resnest/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/resnest/cascade_mask_rcnn_s50_fpn_syncbn-backbone%2Bhead_mstrain_1x_coco/cascade_mask_rcnn_s50_fpn_syncbn-backbone%2Bhead_mstrain_1x_coco_20201122_104428-99eca4c7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/resnest/metafile.yml | SSD_for_PyTorch/configs/resnest/metafile.yml | https://arxiv.org/abs/2004.08955 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/resnest/metafile.yml | SSD_for_PyTorch/configs/resnest/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.7.0/mmdet/models/backbones/resnest.py#L273 | 代码链接 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/resnest/metafile.yml | SSD_for_PyTorch/configs/resnest/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/resnest/cascade_mask_rcnn_s101_fpn_syncbn-backbone%2Bhead_mstrain_1x_coco/cascade_mask_rcnn_s101_fpn_syncbn-backbone%2Bhead_mstrain_1x_coco_20201005_113243-42607475.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/resnest/metafile.yml | SSD_for_PyTorch/configs/resnest/metafile.yml | https://arxiv.org/abs/2004.08955 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/resnest/metafile.yml | SSD_for_PyTorch/configs/resnest/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.7.0/mmdet/models/backbones/resnest.py#L273 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/resnet_strikes_back/cascade_mask_rcnn_r50_fpn_rsb-pretrain_1x_coco.py | SSD_for_PyTorch/configs/resnet_strikes_back/cascade_mask_rcnn_r50_fpn_rsb-pretrain_1x_coco.py | https://download.openmmlab.com/mmclassification/v0/resnet/resnet50_8xb256-rsb-a1-600e_in1k_20211228-20e21305.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/resnet_strikes_back/faster_rcnn_r50_fpn_rsb-pretrain_1x_coco.py | SSD_for_PyTorch/configs/resnet_strikes_back/faster_rcnn_r50_fpn_rsb-pretrain_1x_coco.py | https://download.openmmlab.com/mmclassification/v0/resnet/resnet50_8xb256-rsb-a1-600e_in1k_20211228-20e21305.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/resnet_strikes_back/mask_rcnn_r50_fpn_rsb-pretrain_1x_coco.py | SSD_for_PyTorch/configs/resnet_strikes_back/mask_rcnn_r50_fpn_rsb-pretrain_1x_coco.py | https://download.openmmlab.com/mmclassification/v0/resnet/resnet50_8xb256-rsb-a1-600e_in1k_20211228-20e21305.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/resnet_strikes_back/metafile.yml | SSD_for_PyTorch/configs/resnet_strikes_back/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/resnet_strikes_back/faster_rcnn_r50_fpn_rsb-pretrain_1x_coco/faster_rcnn_r50_fpn_rsb-pretrain_1x_coco_20220113_162229-32ae82a9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/resnet_strikes_back/metafile.yml | SSD_for_PyTorch/configs/resnet_strikes_back/metafile.yml | https://arxiv.org/abs/2110.00476 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/resnet_strikes_back/metafile.yml | SSD_for_PyTorch/configs/resnet_strikes_back/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.22.0/configs/resnet_strikes_back/README.md | 开源代码说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/resnet_strikes_back/metafile.yml | SSD_for_PyTorch/configs/resnet_strikes_back/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/resnet_strikes_back/cascade_mask_rcnn_r50_fpn_rsb-pretrain_1x_coco/cascade_mask_rcnn_r50_fpn_rsb-pretrain_1x_coco_20220113_193636-8b9ad50f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/resnet_strikes_back/metafile.yml | SSD_for_PyTorch/configs/resnet_strikes_back/metafile.yml | https://arxiv.org/abs/2110.00476 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/resnet_strikes_back/metafile.yml | SSD_for_PyTorch/configs/resnet_strikes_back/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.22.0/configs/resnet_strikes_back/README.md 
| 开源代码说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/resnet_strikes_back/metafile.yml | SSD_for_PyTorch/configs/resnet_strikes_back/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/resnet_strikes_back/retinanet_r50_fpn_rsb-pretrain_1x_coco/retinanet_r50_fpn_rsb-pretrain_1x_coco_20220113_175432-bd24aae9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/resnet_strikes_back/metafile.yml | SSD_for_PyTorch/configs/resnet_strikes_back/metafile.yml | https://arxiv.org/abs/2110.00476 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/resnet_strikes_back/metafile.yml | SSD_for_PyTorch/configs/resnet_strikes_back/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.22.0/configs/resnet_strikes_back/README.md | 开源代码说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/resnet_strikes_back/metafile.yml | SSD_for_PyTorch/configs/resnet_strikes_back/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/resnet_strikes_back/mask_rcnn_r50_fpn_rsb-pretrain_1x_coco/mask_rcnn_r50_fpn_rsb-pretrain_1x_coco_20220113_174054-06ce8ba0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/resnet_strikes_back/metafile.yml | SSD_for_PyTorch/configs/resnet_strikes_back/metafile.yml | https://arxiv.org/abs/2110.00476 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/resnet_strikes_back/metafile.yml | SSD_for_PyTorch/configs/resnet_strikes_back/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.22.0/configs/resnet_strikes_back/README.md | 开源代码说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/resnet_strikes_back/retinanet_r50_fpn_rsb-pretrain_1x_coco.py | SSD_for_PyTorch/configs/resnet_strikes_back/retinanet_r50_fpn_rsb-pretrain_1x_coco.py | https://download.openmmlab.com/mmclassification/v0/resnet/resnet50_8xb256-rsb-a1-600e_in1k_20211228-20e21305.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/retinanet/metafile.yml | SSD_for_PyTorch/configs/retinanet/metafile.yml | https://arxiv.org/abs/1708.02002 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/retinanet/metafile.yml | SSD_for_PyTorch/configs/retinanet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/detectors/retinanet.py#L6 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/retinanet/metafile.yml | SSD_for_PyTorch/configs/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_r18_fpn_1x_coco/retinanet_r18_fpn_1x_coco_20220407_171055-614fd399.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/retinanet/metafile.yml | SSD_for_PyTorch/configs/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_r18_fpn_1x8_1x_coco/retinanet_r18_fpn_1x8_1x_coco_20220407_171255-4ea310d7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/retinanet/metafile.yml | SSD_for_PyTorch/configs/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_r50_caffe_fpn_1x_coco/retinanet_r50_caffe_fpn_1x_coco_20200531-f11027c5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/retinanet/metafile.yml | SSD_for_PyTorch/configs/retinanet/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_r50_fpn_1x_coco/retinanet_r50_fpn_1x_coco_20200130-c2398f9e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/retinanet/metafile.yml | SSD_for_PyTorch/configs/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fp16/retinanet_r50_fpn_fp16_1x_coco/retinanet_r50_fpn_fp16_1x_coco_20200702-0dbfb212.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/retinanet/metafile.yml | SSD_for_PyTorch/configs/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_r50_fpn_2x_coco/retinanet_r50_fpn_2x_coco_20200131-fdb43119.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/retinanet/metafile.yml | SSD_for_PyTorch/configs/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_r50_fpn_mstrain_3x_coco/retinanet_r50_fpn_mstrain_3x_coco_20210718_220633-88476508.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/retinanet/metafile.yml | SSD_for_PyTorch/configs/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_r101_caffe_fpn_1x_coco/retinanet_r101_caffe_fpn_1x_coco_20200531-b428fa0f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/retinanet/metafile.yml | SSD_for_PyTorch/configs/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_r101_caffe_fpn_mstrain_3x_coco/retinanet_r101_caffe_fpn_mstrain_3x_coco_20210721_063439-88a8a944.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/retinanet/metafile.yml | SSD_for_PyTorch/configs/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_r101_fpn_1x_coco/retinanet_r101_fpn_1x_coco_20200130-7a93545f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/retinanet/metafile.yml | SSD_for_PyTorch/configs/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_r101_fpn_2x_coco/retinanet_r101_fpn_2x_coco_20200131-5560aee8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/retinanet/metafile.yml | SSD_for_PyTorch/configs/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_r101_fpn_mstrain_3x_coco/retinanet_r101_fpn_mstrain_3x_coco_20210720_214650-7ee888e0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/retinanet/metafile.yml | SSD_for_PyTorch/configs/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_x101_32x4d_fpn_1x_coco/retinanet_x101_32x4d_fpn_1x_coco_20200130-5c8b7ec4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/retinanet/metafile.yml | SSD_for_PyTorch/configs/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_x101_32x4d_fpn_2x_coco/retinanet_x101_32x4d_fpn_2x_coco_20200131-237fc5e1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/retinanet/metafile.yml | SSD_for_PyTorch/configs/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_x101_64x4d_fpn_1x_coco/retinanet_x101_64x4d_fpn_1x_coco_20200130-366f5af1.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/retinanet/metafile.yml | SSD_for_PyTorch/configs/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_x101_64x4d_fpn_2x_coco/retinanet_x101_64x4d_fpn_2x_coco_20200131-bca068ab.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/retinanet/metafile.yml | SSD_for_PyTorch/configs/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_x101_64x4d_fpn_mstrain_3x_coco/retinanet_x101_64x4d_fpn_mstrain_3x_coco_20210719_051838-022c2187.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/sabl/metafile.yml | SSD_for_PyTorch/configs/sabl/metafile.yml | https://arxiv.org/abs/1912.04260 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/sabl/metafile.yml | SSD_for_PyTorch/configs/sabl/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.4.0/mmdet/models/roi_heads/bbox_heads/sabl_head.py#L14 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/sabl/metafile.yml | SSD_for_PyTorch/configs/sabl/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/sabl/sabl_faster_rcnn_r50_fpn_1x_coco/sabl_faster_rcnn_r50_fpn_1x_coco-e867595b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/sabl/metafile.yml | SSD_for_PyTorch/configs/sabl/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/sabl/sabl_faster_rcnn_r101_fpn_1x_coco/sabl_faster_rcnn_r101_fpn_1x_coco-f804c6c1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/sabl/metafile.yml | SSD_for_PyTorch/configs/sabl/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/sabl/sabl_cascade_rcnn_r50_fpn_1x_coco/sabl_cascade_rcnn_r50_fpn_1x_coco-e1748e5e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/sabl/metafile.yml | SSD_for_PyTorch/configs/sabl/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/sabl/sabl_cascade_rcnn_r101_fpn_1x_coco/sabl_cascade_rcnn_r101_fpn_1x_coco-2b83e87c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/sabl/metafile.yml | SSD_for_PyTorch/configs/sabl/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/sabl/sabl_retinanet_r50_fpn_1x_coco/sabl_retinanet_r50_fpn_1x_coco-6c54fd4f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/sabl/metafile.yml | SSD_for_PyTorch/configs/sabl/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/sabl/sabl_retinanet_r50_fpn_gn_1x_coco/sabl_retinanet_r50_fpn_gn_1x_coco-e16dfcf1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/sabl/metafile.yml | SSD_for_PyTorch/configs/sabl/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/sabl/sabl_retinanet_r101_fpn_1x_coco/sabl_retinanet_r101_fpn_1x_coco-42026904.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/sabl/metafile.yml | SSD_for_PyTorch/configs/sabl/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/sabl/sabl_retinanet_r101_fpn_gn_1x_coco/sabl_retinanet_r101_fpn_gn_1x_coco-40a893e8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/sabl/metafile.yml | SSD_for_PyTorch/configs/sabl/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/sabl/sabl_retinanet_r101_fpn_gn_2x_ms_640_800_coco/sabl_retinanet_r101_fpn_gn_2x_ms_640_800_coco-1e63382c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/sabl/metafile.yml | SSD_for_PyTorch/configs/sabl/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/sabl/sabl_retinanet_r101_fpn_gn_2x_ms_480_960_coco/sabl_retinanet_r101_fpn_gn_2x_ms_480_960_coco-5342f857.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/scnet/metafile.yml | SSD_for_PyTorch/configs/scnet/metafile.yml | https://arxiv.org/abs/2012.10150 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/scnet/metafile.yml | SSD_for_PyTorch/configs/scnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.9.0/mmdet/models/detectors/scnet.py#L6 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/scnet/metafile.yml | SSD_for_PyTorch/configs/scnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/scnet/scnet_r50_fpn_1x_coco/scnet_r50_fpn_1x_coco-c3f09857.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/scnet/metafile.yml | SSD_for_PyTorch/configs/scnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/scnet/scnet_r50_fpn_20e_coco/scnet_r50_fpn_20e_coco-a569f645.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/scnet/metafile.yml | SSD_for_PyTorch/configs/scnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/scnet/scnet_r101_fpn_20e_coco/scnet_r101_fpn_20e_coco-294e312c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/scnet/metafile.yml | SSD_for_PyTorch/configs/scnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/scnet/scnet_x101_64x4d_fpn_20e_coco/scnet_x101_64x4d_fpn_20e_coco-fb09dec9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/scratch/metafile.yml | SSD_for_PyTorch/configs/scratch/metafile.yml | https://arxiv.org/abs/1811.08883 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/scratch/metafile.yml | SSD_for_PyTorch/configs/scratch/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/configs/scratch/faster_rcnn_r50_fpn_gn-all_scratch_6x_coco.py | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/scratch/metafile.yml | SSD_for_PyTorch/configs/scratch/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/scratch/faster_rcnn_r50_fpn_gn-all_scratch_6x_coco/scratch_faster_rcnn_r50_fpn_gn_6x_bbox_mAP-0.407_20200201_193013-90813d01.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/scratch/metafile.yml | SSD_for_PyTorch/configs/scratch/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/scratch/mask_rcnn_r50_fpn_gn-all_scratch_6x_coco/scratch_mask_rcnn_r50_fpn_gn_6x_bbox_mAP-0.412__segm_mAP-0.374_20200201_193051-1e190a40.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/seesaw_loss/metafile.yml | SSD_for_PyTorch/configs/seesaw_loss/metafile.yml | https://arxiv.org/abs/2008.10032 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/seesaw_loss/metafile.yml | SSD_for_PyTorch/configs/seesaw_loss/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/seesaw_loss/mask_rcnn_r50_fpn_random_seesaw_loss_mstrain_2x_lvis_v1-a698dd3d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/seesaw_loss/metafile.yml | SSD_for_PyTorch/configs/seesaw_loss/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/seesaw_loss/mask_rcnn_r50_fpn_random_seesaw_loss_normed_mask_mstrain_2x_lvis_v1-a1c11314.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/seesaw_loss/metafile.yml | SSD_for_PyTorch/configs/seesaw_loss/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/seesaw_loss/mask_rcnn_r101_fpn_random_seesaw_loss_mstrain_2x_lvis_v1-8e6e6dd5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/seesaw_loss/metafile.yml | SSD_for_PyTorch/configs/seesaw_loss/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/seesaw_loss/mask_rcnn_r101_fpn_random_seesaw_loss_normed_mask_mstrain_2x_lvis_v1-a0b59c42.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/seesaw_loss/metafile.yml | SSD_for_PyTorch/configs/seesaw_loss/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/seesaw_loss/mask_rcnn_r50_fpn_sample1e-3_seesaw_loss_mstrain_2x_lvis_v1-392a804b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/seesaw_loss/metafile.yml | SSD_for_PyTorch/configs/seesaw_loss/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/seesaw_loss/mask_rcnn_r50_fpn_sample1e-3_seesaw_loss_normed_mask_mstrain_2x_lvis_v1-cd0f6a12.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/seesaw_loss/metafile.yml | SSD_for_PyTorch/configs/seesaw_loss/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/seesaw_loss/mask_rcnn_r101_fpn_sample1e-3_seesaw_loss_mstrain_2x_lvis_v1-e68eb464.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/seesaw_loss/metafile.yml | SSD_for_PyTorch/configs/seesaw_loss/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/seesaw_loss/mask_rcnn_r101_fpn_sample1e-3_seesaw_loss_normed_mask_mstrain_2x_lvis_v1-1d817139.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/seesaw_loss/metafile.yml | SSD_for_PyTorch/configs/seesaw_loss/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/seesaw_loss/cascade_mask_rcnn_r101_fpn_random_seesaw_loss_mstrain_2x_lvis_v1-71e2215e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/seesaw_loss/metafile.yml | SSD_for_PyTorch/configs/seesaw_loss/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/seesaw_loss/cascade_mask_rcnn_r101_fpn_random_seesaw_loss_normed_mask_mstrain_2x_lvis_v1-8b5a6745.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/seesaw_loss/metafile.yml | SSD_for_PyTorch/configs/seesaw_loss/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/seesaw_loss/cascade_mask_rcnn_r101_fpn_sample1e-3_seesaw_loss_mstrain_2x_lvis_v1-5d8ca2a4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/seesaw_loss/metafile.yml | SSD_for_PyTorch/configs/seesaw_loss/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/seesaw_loss/cascade_mask_rcnn_r101_fpn_sample1e-3_seesaw_loss_normed_mask_mstrain_2x_lvis_v1-c8551505.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/simple_copy_paste/metafile.yml | SSD_for_PyTorch/configs/simple_copy_paste/metafile.yml | https://arxiv.org/abs/2012.07177 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/simple_copy_paste/metafile.yml | SSD_for_PyTorch/configs/simple_copy_paste/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/datasets/pipelines/transforms.py#L2762 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/simple_copy_paste/metafile.yml | SSD_for_PyTorch/configs/simple_copy_paste/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/simple_copy_paste/mask_rcnn_r50_fpn_syncbn-all_rpn-2conv_ssj_32x2_270k_coco/mask_rcnn_r50_fpn_syncbn-all_rpn-2conv_ssj_32x2_270k_coco_20220324_182940-33a100c5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/simple_copy_paste/metafile.yml | SSD_for_PyTorch/configs/simple_copy_paste/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/simple_copy_paste/mask_rcnn_r50_fpn_syncbn-all_rpn-2conv_ssj_32x2_90k_coco/mask_rcnn_r50_fpn_syncbn-all_rpn-2conv_ssj_32x2_90k_coco_20220316_181409-f79c84c5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/simple_copy_paste/metafile.yml | SSD_for_PyTorch/configs/simple_copy_paste/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/simple_copy_paste/mask_rcnn_r50_fpn_syncbn-all_rpn-2conv_ssj_scp_32x2_270k_coco/mask_rcnn_r50_fpn_syncbn-all_rpn-2conv_ssj_scp_32x2_270k_coco_20220324_201229-80ee90b7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/simple_copy_paste/metafile.yml | SSD_for_PyTorch/configs/simple_copy_paste/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/simple_copy_paste/mask_rcnn_r50_fpn_syncbn-all_rpn-2conv_ssj_scp_32x2_90k_coco/mask_rcnn_r50_fpn_syncbn-all_rpn-2conv_ssj_scp_32x2_90k_coco_20220316_181307-6bc5726f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/solo/metafile.yml | SSD_for_PyTorch/configs/solo/metafile.yml | https://arxiv.org/abs/1912.04488 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/solo/metafile.yml | SSD_for_PyTorch/configs/solo/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/solo/decoupled_solo_r50_fpn_1x_coco/decoupled_solo_r50_fpn_1x_coco_20210820_233348-6337c589.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/solo/metafile.yml | SSD_for_PyTorch/configs/solo/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/solo/decoupled_solo_r50_fpn_3x_coco/decoupled_solo_r50_fpn_3x_coco_20210821_042504-7b3301ec.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/solo/metafile.yml | SSD_for_PyTorch/configs/solo/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/solo/decoupled_solo_light_r50_fpn_3x_coco/decoupled_solo_light_r50_fpn_3x_coco_20210906_142703-e70e226f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/solo/metafile.yml | SSD_for_PyTorch/configs/solo/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/solo/solo_r50_fpn_3x_coco/solo_r50_fpn_3x_coco_20210901_012353-11d224d7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/solo/metafile.yml | SSD_for_PyTorch/configs/solo/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/solo/solo_r50_fpn_1x_coco/solo_r50_fpn_1x_coco_20210821_035055-2290a6b8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/solov2/metafile.yml | SSD_for_PyTorch/configs/solov2/metafile.yml | https://arxiv.org/abs/2003.10152 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/solov2/metafile.yml | SSD_for_PyTorch/configs/solov2/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/solov2/solov2_r50_fpn_1x_coco/solov2_r50_fpn_1x_coco_20220512_125858-a357fa23.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/solov2/metafile.yml | SSD_for_PyTorch/configs/solov2/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/solov2/solov2_r50_fpn_3x_coco/solov2_r50_fpn_3x_coco_20220512_125856-fed092d4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/solov2/metafile.yml | SSD_for_PyTorch/configs/solov2/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/solov2/solov2_r101_fpn_3x_coco/solov2_r101_fpn_3x_coco_20220511_095119-c559a076.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/solov2/metafile.yml | SSD_for_PyTorch/configs/solov2/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/solov2/solov2_r101_dcn_fpn_3x_coco/solov2_r101_dcn_fpn_3x_coco_20220513_214734-16c966cb.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/solov2/metafile.yml | SSD_for_PyTorch/configs/solov2/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/solov2/solov2_x101_dcn_fpn_3x_coco/solov2_x101_dcn_fpn_3x_coco_20220513_214337-aef41095.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/solov2/metafile.yml | SSD_for_PyTorch/configs/solov2/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/solov2/solov2_light_r18_fpn_3x_coco/solov2_light_r18_fpn_3x_coco_20220511_083717-75fa355b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/solov2/metafile.yml | SSD_for_PyTorch/configs/solov2/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/solov2/solov2_light_r34_fpn_3x_coco/solov2_light_r34_fpn_3x_coco_20220511_091839-e51659d3.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/solov2/metafile.yml | SSD_for_PyTorch/configs/solov2/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/solov2/solov2_light_r50_fpn_3x_coco/solov2_light_r50_fpn_3x_coco_20220512_165256-c93a6074.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/sparse_rcnn/metafile.yml | SSD_for_PyTorch/configs/sparse_rcnn/metafile.yml | https://arxiv.org/abs/2011.12450 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/sparse_rcnn/metafile.yml | SSD_for_PyTorch/configs/sparse_rcnn/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.9.0/mmdet/models/detectors/sparse_rcnn.py#L6 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/sparse_rcnn/metafile.yml | SSD_for_PyTorch/configs/sparse_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/sparse_rcnn/sparse_rcnn_r50_fpn_1x_coco/sparse_rcnn_r50_fpn_1x_coco_20201222_214453-dc79b137.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/sparse_rcnn/metafile.yml | 
SSD_for_PyTorch/configs/sparse_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/sparse_rcnn/sparse_rcnn_r50_fpn_mstrain_480-800_3x_coco/sparse_rcnn_r50_fpn_mstrain_480-800_3x_coco_20201218_154234-7bc5c054.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/sparse_rcnn/metafile.yml | SSD_for_PyTorch/configs/sparse_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/sparse_rcnn/sparse_rcnn_r50_fpn_300_proposals_crop_mstrain_480-800_3x_coco/sparse_rcnn_r50_fpn_300_proposals_crop_mstrain_480-800_3x_coco_20201223_024605-9fe92701.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/sparse_rcnn/metafile.yml | SSD_for_PyTorch/configs/sparse_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/sparse_rcnn/sparse_rcnn_r101_fpn_mstrain_480-800_3x_coco/sparse_rcnn_r101_fpn_mstrain_480-800_3x_coco_20201223_121552-6c46c9d6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/sparse_rcnn/metafile.yml | SSD_for_PyTorch/configs/sparse_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/sparse_rcnn/sparse_rcnn_r101_fpn_300_proposals_crop_mstrain_480-800_3x_coco/sparse_rcnn_r101_fpn_300_proposals_crop_mstrain_480-800_3x_coco_20201223_023452-c23c3564.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/ssd/metafile.yml | SSD_for_PyTorch/configs/ssd/metafile.yml | https://arxiv.org/abs/1512.02325 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/ssd/metafile.yml | SSD_for_PyTorch/configs/ssd/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.14.0/mmdet/models/dense_heads/ssd_head.py#L16 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/ssd/metafile.yml | SSD_for_PyTorch/configs/ssd/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/ssd/ssd300_coco/ssd300_coco_20210803_015428-d231a06e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/ssd/metafile.yml | SSD_for_PyTorch/configs/ssd/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/ssd/ssd512_coco/ssd512_coco_20210803_022849-0a47a1ca.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/ssd/metafile.yml | SSD_for_PyTorch/configs/ssd/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/ssd/ssdlite_mobilenetv2_scratch_600e_coco/ssdlite_mobilenetv2_scratch_600e_coco_20210629_110627-974d9307.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/swin/mask_rcnn_swin-s-p4-w7_fpn_fp16_ms-crop-3x_coco.py | SSD_for_PyTorch/configs/swin/mask_rcnn_swin-s-p4-w7_fpn_fp16_ms-crop-3x_coco.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_small_patch4_window7_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/swin/mask_rcnn_swin-t-p4-w7_fpn_1x_coco.py | SSD_for_PyTorch/configs/swin/mask_rcnn_swin-t-p4-w7_fpn_1x_coco.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_tiny_patch4_window7_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/swin/mask_rcnn_swin-t-p4-w7_fpn_ms-crop-3x_coco.py | SSD_for_PyTorch/configs/swin/mask_rcnn_swin-t-p4-w7_fpn_ms-crop-3x_coco.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_tiny_patch4_window7_224.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/swin/retinanet_swin-t-p4-w7_fpn_1x_coco.py | SSD_for_PyTorch/configs/swin/retinanet_swin-t-p4-w7_fpn_1x_coco.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_tiny_patch4_window7_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/swin/metafile.yml | SSD_for_PyTorch/configs/swin/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/swin/mask_rcnn_swin-s-p4-w7_fpn_fp16_ms-crop-3x_coco/mask_rcnn_swin-s-p4-w7_fpn_fp16_ms-crop-3x_coco_20210903_104808-b92c91f1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/swin/metafile.yml | SSD_for_PyTorch/configs/swin/metafile.yml | https://arxiv.org/abs/2107.08430 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/swin/metafile.yml | SSD_for_PyTorch/configs/swin/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.16.0/mmdet/models/backbones/swin.py#L465 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/swin/metafile.yml | SSD_for_PyTorch/configs/swin/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/swin/mask_rcnn_swin-t-p4-w7_fpn_ms-crop-3x_coco/mask_rcnn_swin-t-p4-w7_fpn_ms-crop-3x_coco_20210906_131725-bacf6f7b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/swin/metafile.yml | SSD_for_PyTorch/configs/swin/metafile.yml | https://arxiv.org/abs/2107.08430 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/swin/metafile.yml | SSD_for_PyTorch/configs/swin/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.16.0/mmdet/models/backbones/swin.py#L465 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/swin/metafile.yml | SSD_for_PyTorch/configs/swin/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/swin/mask_rcnn_swin-t-p4-w7_fpn_1x_coco/mask_rcnn_swin-t-p4-w7_fpn_1x_coco_20210902_120937-9d6b7cfa.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/swin/metafile.yml | SSD_for_PyTorch/configs/swin/metafile.yml | https://arxiv.org/abs/2107.08430 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/swin/metafile.yml | SSD_for_PyTorch/configs/swin/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.16.0/mmdet/models/backbones/swin.py#L465 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/swin/metafile.yml | SSD_for_PyTorch/configs/swin/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/swin/mask_rcnn_swin-t-p4-w7_fpn_fp16_ms-crop-3x_coco/mask_rcnn_swin-t-p4-w7_fpn_fp16_ms-crop-3x_coco_20210908_165006-90a4008c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/swin/metafile.yml | SSD_for_PyTorch/configs/swin/metafile.yml | https://arxiv.org/abs/2107.08430 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/swin/metafile.yml | SSD_for_PyTorch/configs/swin/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.16.0/mmdet/models/backbones/swin.py#L465 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/tood/metafile.yml | SSD_for_PyTorch/configs/tood/metafile.yml | https://arxiv.org/abs/2108.07755 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/tood/metafile.yml | 
SSD_for_PyTorch/configs/tood/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.20.0/mmdet/models/detectors/tood.py#L7 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/tood/metafile.yml | SSD_for_PyTorch/configs/tood/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/tood/tood_r101_fpn_mstrain_2x_coco/tood_r101_fpn_mstrain_2x_coco_20211210_144232-a18f53c8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/tood/metafile.yml | SSD_for_PyTorch/configs/tood/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/tood/tood_x101_64x4d_fpn_mstrain_2x_coco/tood_x101_64x4d_fpn_mstrain_2x_coco_20211211_003519-a4f36113.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/tood/metafile.yml | SSD_for_PyTorch/configs/tood/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/tood/tood_r101_fpn_dconv_c3-c5_mstrain_2x_coco/tood_r101_fpn_dconv_c3-c5_mstrain_2x_coco_20211210_213728-4a824142.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/tood/metafile.yml | SSD_for_PyTorch/configs/tood/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/tood/tood_r50_fpn_anchor_based_1x_coco/tood_r50_fpn_anchor_based_1x_coco_20211214_100105-b776c134.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/tood/metafile.yml | SSD_for_PyTorch/configs/tood/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/tood/tood_r50_fpn_1x_coco/tood_r50_fpn_1x_coco_20211210_103425-20e20746.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/tood/metafile.yml | SSD_for_PyTorch/configs/tood/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/tood/tood_r50_fpn_mstrain_2x_coco/tood_r50_fpn_mstrain_2x_coco_20211210_144231-3b23174c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/tridentnet/metafile.yml | SSD_for_PyTorch/configs/tridentnet/metafile.yml | https://arxiv.org/abs/1901.01892 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/tridentnet/metafile.yml | SSD_for_PyTorch/configs/tridentnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.8.0/mmdet/models/detectors/trident_faster_rcnn.py#L6 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/tridentnet/metafile.yml | SSD_for_PyTorch/configs/tridentnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/tridentnet/tridentnet_r50_caffe_1x_coco/tridentnet_r50_caffe_1x_coco_20201230_141838-2ec0b530.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/tridentnet/metafile.yml | SSD_for_PyTorch/configs/tridentnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/tridentnet/tridentnet_r50_caffe_mstrain_1x_coco/tridentnet_r50_caffe_mstrain_1x_coco_20201230_141839-6ce55ccb.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/tridentnet/metafile.yml | SSD_for_PyTorch/configs/tridentnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/tridentnet/tridentnet_r50_caffe_mstrain_3x_coco/tridentnet_r50_caffe_mstrain_3x_coco_20201130_100539-46d227ba.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/vfnet/metafile.yml | SSD_for_PyTorch/configs/vfnet/metafile.yml | https://arxiv.org/abs/2008.13367 | 论文地址 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/vfnet/metafile.yml | SSD_for_PyTorch/configs/vfnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.6.0/mmdet/models/detectors/vfnet.py#L6 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/vfnet/metafile.yml | SSD_for_PyTorch/configs/vfnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/vfnet/vfnet_r50_fpn_1x_coco/vfnet_r50_fpn_1x_coco_20201027-38db6f58.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/vfnet/metafile.yml | SSD_for_PyTorch/configs/vfnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/vfnet/vfnet_r50_fpn_mstrain_2x_coco/vfnet_r50_fpn_mstrain_2x_coco_20201027-7cc75bd2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/vfnet/metafile.yml | SSD_for_PyTorch/configs/vfnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/vfnet/vfnet_r50_fpn_mdconv_c3-c5_mstrain_2x_coco/vfnet_r50_fpn_mdconv_c3-c5_mstrain_2x_coco_20201027pth-6879c318.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/vfnet/metafile.yml | SSD_for_PyTorch/configs/vfnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/vfnet/vfnet_r101_fpn_1x_coco/vfnet_r101_fpn_1x_coco_20201027pth-c831ece7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/vfnet/metafile.yml | SSD_for_PyTorch/configs/vfnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/vfnet/vfnet_r101_fpn_mstrain_2x_coco/vfnet_r101_fpn_mstrain_2x_coco_20201027pth-4a5d53f1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/vfnet/metafile.yml | SSD_for_PyTorch/configs/vfnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/vfnet/vfnet_r101_fpn_mdconv_c3-c5_mstrain_2x_coco/vfnet_r101_fpn_mdconv_c3-c5_mstrain_2x_coco_20201027pth-7729adb5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/vfnet/metafile.yml | SSD_for_PyTorch/configs/vfnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/vfnet/vfnet_x101_32x4d_fpn_mdconv_c3-c5_mstrain_2x_coco/vfnet_x101_32x4d_fpn_mdconv_c3-c5_mstrain_2x_coco_20201027pth-d300a6fc.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/vfnet/metafile.yml | SSD_for_PyTorch/configs/vfnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/vfnet/vfnet_x101_64x4d_fpn_mdconv_c3-c5_mstrain_2x_coco/vfnet_x101_64x4d_fpn_mdconv_c3-c5_mstrain_2x_coco_20201027pth-b5f6da5e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/yolact/metafile.yml | SSD_for_PyTorch/configs/yolact/metafile.yml | https://arxiv.org/abs/1904.02689 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/yolact/metafile.yml | SSD_for_PyTorch/configs/yolact/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.5.0/mmdet/models/detectors/yolact.py#L9 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/yolact/metafile.yml | SSD_for_PyTorch/configs/yolact/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/yolact/yolact_r50_1x8_coco/yolact_r50_1x8_coco_20200908-f38d58df.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/yolact/metafile.yml | SSD_for_PyTorch/configs/yolact/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/yolact/yolact_r50_8x8_coco/yolact_r50_8x8_coco_20200908-ca34f5db.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/yolact/metafile.yml | SSD_for_PyTorch/configs/yolact/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/yolact/yolact_r101_1x8_coco/yolact_r101_1x8_coco_20200908-4cbe9101.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/yolo/metafile.yml | SSD_for_PyTorch/configs/yolo/metafile.yml | https://arxiv.org/abs/1804.02767 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/yolo/metafile.yml | SSD_for_PyTorch/configs/yolo/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.4.0/mmdet/models/detectors/yolo.py#L8 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/yolo/metafile.yml | SSD_for_PyTorch/configs/yolo/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/yolo/yolov3_d53_320_273e_coco/yolov3_d53_320_273e_coco-421362b6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/yolo/metafile.yml | SSD_for_PyTorch/configs/yolo/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/yolo/yolov3_d53_mstrain-416_273e_coco/yolov3_d53_mstrain-416_273e_coco-2b60fcd9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/yolo/metafile.yml | SSD_for_PyTorch/configs/yolo/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/yolo/yolov3_d53_mstrain-608_273e_coco/yolov3_d53_mstrain-608_273e_coco_20210518_115020-a2c3acb8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/yolo/metafile.yml | SSD_for_PyTorch/configs/yolo/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/yolo/yolov3_d53_fp16_mstrain-608_273e_coco/yolov3_d53_fp16_mstrain-608_273e_coco_20210517_213542-4bc34944.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/yolo/metafile.yml | SSD_for_PyTorch/configs/yolo/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/yolo/yolov3_mobilenetv2_320_300e_coco/yolov3_mobilenetv2_320_300e_coco_20210719_215349-d18dff72.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/yolo/metafile.yml | SSD_for_PyTorch/configs/yolo/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/yolo/yolov3_mobilenetv2_mstrain-416_300e_coco/yolov3_mobilenetv2_mstrain-416_300e_coco_20210718_010823-f68a07b3.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/yolof/metafile.yml | SSD_for_PyTorch/configs/yolof/metafile.yml | https://arxiv.org/abs/2103.09460 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/yolof/metafile.yml | SSD_for_PyTorch/configs/yolof/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.12.0/mmdet/models/detectors/yolof.py#L6 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/yolof/metafile.yml | SSD_for_PyTorch/configs/yolof/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/yolof/yolof_r50_c5_8x8_1x_coco/yolof_r50_c5_8x8_1x_coco_20210425_024427-8e864411.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/yolox/metafile.yml | SSD_for_PyTorch/configs/yolox/metafile.yml | https://arxiv.org/abs/2107.08430 | 论文地址 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/yolox/metafile.yml | SSD_for_PyTorch/configs/yolox/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.15.1/mmdet/models/detectors/yolox.py#L6 | 代码链接 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/yolox/metafile.yml | SSD_for_PyTorch/configs/yolox/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/yolox/yolox_s_8x8_300e_coco/yolox_s_8x8_300e_coco_20211121_095711-4592a793.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/yolox/metafile.yml | SSD_for_PyTorch/configs/yolox/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/yolox/yolox_l_8x8_300e_coco/yolox_l_8x8_300e_coco_20211126_140236-d3bd2b23.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/yolox/metafile.yml | SSD_for_PyTorch/configs/yolox/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/yolox/yolox_x_8x8_300e_coco/yolox_x_8x8_300e_coco_20211126_140254-1ef88d67.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/yolox/metafile.yml | SSD_for_PyTorch/configs/yolox/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/yolox/yolox_tiny_8x8_300e_coco/yolox_tiny_8x8_300e_coco_20211124_171234-b4047906.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/docker/Dockerfile | SSD_for_PyTorch/docker/Dockerfile | https://download.openmmlab.com/mmcv/dist/cu101/torch1.6.0/index.html | 下载三方库 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/docker/Dockerfile | SSD_for_PyTorch/docker/Dockerfile | https://github.com/open-mmlab/mmdetection.git | 开源代码地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/docker/serve/Dockerfile | SSD_for_PyTorch/docker/serve/Dockerfile | https://download.openmmlab.com/mmcv/dist/cu${CUDA//./}/torch${PYTORCH}/index.html | 下载三方库 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/docs/en/conf.py | SSD_for_PyTorch/docs/en/conf.py | https://github.com/open-mmlab/mmdetection | 开源代码地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/docs/en/stat.py | SSD_for_PyTorch/docs/en/stat.py | https://github.com/open-mmlab/mmdetection/blob/master/configs | 开源代码地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/docs/zh_cn/conf.py | SSD_for_PyTorch/docs/zh_cn/conf.py | https://github.com/open-mmlab/mmdetection | 开源代码地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/docs/zh_cn/stat.py | SSD_for_PyTorch/docs/zh_cn/stat.py | https://github.com/open-mmlab/mmdetection/blob/master/ | 开源代码地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/datasets/lvis.py | SSD_for_PyTorch/mmdet/datasets/lvis.py | http://images.cocodataset.org/ | 下载数据集 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/setup.py | SSD_for_PyTorch/setup.py | https://github.com/open-mmlab/mmdetection | 开源代码地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/setup.py | SSD_for_PyTorch/setup.py | openmmlab@gmail.com | 作者邮箱 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/tests/test_runtime/async_benchmark.py | SSD_for_PyTorch/tests/test_runtime/async_benchmark.py | https://download.openmmlab.com/mmdetection/v2.0 | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/tools/misc/download_dataset.py | SSD_for_PyTorch/tools/misc/download_dataset.py | 
http://images.cocodataset.org/zips/train2017.zip | 下载数据集 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/tools/misc/download_dataset.py | SSD_for_PyTorch/tools/misc/download_dataset.py | http://images.cocodataset.org/zips/val2017.zip | 下载数据集 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/tools/misc/download_dataset.py | SSD_for_PyTorch/tools/misc/download_dataset.py | http://images.cocodataset.org/zips/test2017.zip | 下载数据集 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/tools/misc/download_dataset.py | SSD_for_PyTorch/tools/misc/download_dataset.py | http://images.cocodataset.org/annotations/ | 下载数据集 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/tools/misc/download_dataset.py | SSD_for_PyTorch/tools/misc/download_dataset.py | https://s3-us-west-2.amazonaws.com/dl.fbaipublicfiles.com/LVIS/lvis_v1_train.json.zip | 下载数据集 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/tools/misc/download_dataset.py | SSD_for_PyTorch/tools/misc/download_dataset.py | https://s3-us-west-2.amazonaws.com/dl.fbaipublicfiles.com/LVIS/lvis_v1_train.json.zip | 下载数据集 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/tools/misc/download_dataset.py | SSD_for_PyTorch/tools/misc/download_dataset.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2007/VOCtrainval_06-Nov-2007.tar | 下载数据集 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/tools/misc/download_dataset.py | SSD_for_PyTorch/tools/misc/download_dataset.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2007/VOCtest_06-Nov-2007.tar | 下载数据集 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/tools/misc/download_dataset.py | SSD_for_PyTorch/tools/misc/download_dataset.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2007/VOCdevkit_08-Jun-2007.tar | 下载数据集 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/core/hook/wandblogger_hook.py | SSD_for_PyTorch/mmdet/core/hook/wandblogger_hook.py | https://docs.wandb.ai/guides/artifacts/model-versioning | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/docs/zh_cn/make.bat | SSD_for_PyTorch/docs/zh_cn/make.bat | http://sphinx-doc.org/ | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/utils/util_distribution.py | SSD_for_PyTorch/mmdet/utils/util_distribution.py | https://pytorch.org/docs/stable/generated/torch.nn.parallel | 相关说明 | -| 开发引入 | / | SSD_for_PyTorch/mmdet/models/detectors/two_stage.py | https://mmdetection.readthedocs.io/en/latest/tutorials/pytorch2onnx.html#list-of-supported-models-exportable-to-onnx | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/dense_heads/retina_head.py | SSD_for_PyTorch/mmdet/models/dense_heads/retina_head.py | https://arxiv.org/pdf/1708.02002.pdf | 论文地址 | -| 开发引入 | / | SSD_for_PyTorch/mmdet/core/mask/structures.py | https://gitlab.kitware.com/computer-vision/kwimage/-/blob/928cae35ca8/kwimage/structs/polygon.py#L379 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/utils/util_mixins.py | SSD_for_PyTorch/mmdet/utils/util_mixins.py | https://github.com/Erotemic/ubelt | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/roi_heads/grid_roi_head.py | SSD_for_PyTorch/mmdet/models/roi_heads/grid_roi_head.py | https://arxiv.org/abs/1811.12030 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/detectors/reppoints_detector.py | 
SSD_for_PyTorch/mmdet/models/detectors/reppoints_detector.py | https://arxiv.org/pdf/1904.11490 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/tools/analysis_tools/optimize_anchors.py | SSD_for_PyTorch/tools/analysis_tools/optimize_anchors.py | https://github.com/AlexeyAB/darknet/blob/master/src/detector.c | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/core/visualization/image.py | SSD_for_PyTorch/mmdet/core/visualization/image.py | https://github.com/matplotlib/matplotlib/issues/15363 | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/core/mask/structures.py | SSD_for_PyTorch/mmdet/core/mask/structures.py | https://github.com/facebookresearch/detectron2/blob/ffff8acc35ea88ad1cb1806ab0f00b4c1c5dbfd9/detectron2/structures/masks.py#L387 | 源码实现 | -| 开发引入 | / | SSD_for_PyTorch/mmdet/models/losses/cross_entropy_loss.py | https://github.com/pytorch/pytorch/blob/56b43f4fec1f76953f15a627694d4bba34588969/torch/nn/functional.py#L2660 | 源码实现 | -| 开发引入 | / | SSD_for_PyTorch/mmdet/apis/train.py | https://github.com/open-mmlab/mmcv/pull/1193 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/backbones/pvt.py | SSD_for_PyTorch/mmdet/models/backbones/pvt.py | https://arxiv.org/pdf/2102.12122.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/dense_heads/tood_head.py | SSD_for_PyTorch/mmdet/models/dense_heads/tood_head.py | https://github.com/open-mmlab/mmdetection/pull/6268/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/core/bbox/demodata.py | SSD_for_PyTorch/mmdet/core/bbox/demodata.py | https://gitlab.kitware.com/computer-vision/kwimage/blob/master/kwimage/structs/boxes.py#L1390 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/datasets/pipelines/transforms.py | SSD_for_PyTorch/mmdet/datasets/pipelines/transforms.py | https://albumentations.readthedocs.io | 相关说明 | -| 开发引入 | / | SSD_for_PyTorch/mmdet/utils/replace_cfg_vals.py | https://github.com/microsoft/SoftTeacher/blob/main/ssod/utils/vars.py | 源码实现 | -| 开发引入 | / | SSD_for_PyTorch/mmdet/models/losses/gaussian_focal_loss.py | https://github.com/princeton-vl/CornerNet/blob/master/models/py_utils/kp_utils.py#L152 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/backbones/swin.py | SSD_for_PyTorch/mmdet/models/backbones/swin.py | https://github.com/microsoft/Swin-Transformer | 源码实现 | -| 开发引入 | / | SSD_for_PyTorch/mmdet/models/plugins/dropblock.py | https://arxiv.org/abs/1810.12890 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/datasets/samplers/infinite_sampler.py | SSD_for_PyTorch/mmdet/datasets/samplers/infinite_sampler.py | https://github.com/facebookresearch/detectron2/blob/main/detectron2/data/samplers/grouped_batch_sampler.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/core/bbox/samplers/mask_sampling_result.py | SSD_for_PyTorch/mmdet/core/bbox/samplers/mask_pseudo_sampler.py | https://github.com/ZwwWayne/K-Net/blob/main/knet/det/mask_pseudo_sampler.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/datasets/samplers/class_aware_sampler.py | SSD_for_PyTorch/mmdet/datasets/samplers/class_aware_sampler.py | https://github.com/Sense-X/TSD/blob/master/mmdet/datasets/samplers/distributed_classaware_sampler.py | 源码实现 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/backbones/hourglass.py | SSD_for_PyTorch/mmdet/models/backbones/hourglass.py | https://arxiv.org/abs/1603.06937 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/core/mask/structures.py | SSD_for_PyTorch/mmdet/core/mask/structures.py | https://stackoverflow.com/questions/24467972/calculate-area-of-polygon-given-x-y-coordinates | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/losses/iou_loss.py | SSD_for_PyTorch/mmdet/models/losses/iou_loss.py | https://arxiv.org/abs/1711.00164 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/core/hook/wandblogger_hook.py | SSD_for_PyTorch/mmdet/core/hook/wandblogger_hook.py | https://github.com/wandb/client/issues/2837 | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/datasets/builder.py | SSD_for_PyTorch/mmdet/datasets/builder.py | https://github.com/pytorch/pytorch/issues/973 | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/dense_heads/paa_head.py | SSD_for_PyTorch/mmdet/models/dense_heads/paa_head.py | https://github.com/kkhoot/PAA/blob/master/paa_core | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/losses/iou_loss.py | SSD_for_PyTorch/mmdet/models/losses/iou_loss.py | https://github.com/Zzh-tju/DIoU | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/losses/iou_loss.py | SSD_for_PyTorch/mmdet/models/losses/iou_loss.py | https://arxiv.org/abs/1902.09630 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/necks/rfp.py | SSD_for_PyTorch/mmdet/models/backbones/detectors_resnet.py | https://arxiv.org/pdf/2006.02334.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/detectors/fast_rcnn.py | SSD_for_PyTorch/mmdet/models/detectors/fast_rcnn.py | https://arxiv.org/abs/1504.08083 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/dense_heads/tood_head.py | SSD_for_PyTorch/mmdet/models/dense_heads/fovea_head.py | https://github.com/open-mmlab/mmdetection/pull/6268/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/apis/train.py | SSD_for_PyTorch/mmdet/apis/train.py | https://github.com/open-mmlab/mmdetection/issues/6339 | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/detectors/cascade_rcnn.py | SSD_for_PyTorch/mmdet/models/detectors/cascade_rcnn.py | https://arxiv.org/abs/1906.09756 | 论文地址 | -| 开发引入 | / | SSD_for_PyTorch/mmdet/models/utils/transformer.py | https://github.com/open-mmlab/mmdetection/blob/master/mmdet/models/dense_heads/deformable_detr_head.py#L241 | 源码实现 | -| 开发引入 | / | SSD_for_PyTorch/mmdet/datasets/pipelines/transforms.py | https://github.com/Megvii- | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/roi_heads/mask_heads/fcn_mask_head.py | SSD_for_PyTorch/mmdet/datasets/coco.py | https://github.com/facebookresearch/detectron2/ | 源码实现 | -| 开发引入 | / | SSD_for_PyTorch/mmdet/utils/split_batch.py | https://github.com/microsoft/SoftTeacher/blob/main/ssod/utils/structure_utils.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/roi_heads/mask_heads/fcn_mask_head.py | SSD_for_PyTorch/mmdet/datasets/lvis.py | https://github.com/facebookresearch/detectron2/ | 源码实现 | -| 开源代码引入 
| https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/necks/dyhead.py | SSD_for_PyTorch/mmdet/models/necks/dyhead.py | https://github.com/jshilong/SEPC | 源码实现 | -| 开发引入 | / | SSD_for_PyTorch/mmdet/core/visualization/image.py | https://github.com/opencv/opencv-python/issues/46 | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/roi_heads/cascade_roi_head.py | SSD_for_PyTorch/mmdet/models/roi_heads/cascade_roi_head.py | https://arxiv.org/abs/1712.00726 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/detectors/kd_one_stage.py | SSD_for_PyTorch/mmdet/models/detectors/kd_one_stage.py | https://arxiv.org/abs/1503.02531 | 论文地址 | -| 开发引入 | / | SSD_for_PyTorch/mmdet/datasets/cityscapes.py | https://github.com/mcordts/cityscapesScripts/blob/master/cityscapesscripts/evaluation/evalInstanceLevelSemanticLabeling.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/core/bbox/coder/legacy_delta_xywh_bbox_coder.py | SSD_for_PyTorch/mmdet/core/bbox/coder/delta_xywh_bbox_coder.py | https://arxiv.org/abs/1311.2524 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/docs/zh_cn/conf.py | SSD_for_PyTorch/docs/zh_cn/conf.py | https://www.sphinx-doc.org/en/master/usage/configuration.html | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/necks/rfp.py | SSD_for_PyTorch/mmdet/models/necks/rfp.py | https://arxiv.org/pdf/2006.02334.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/core/hook/wandblogger_hook.py | SSD_for_PyTorch/mmdet/core/hook/wandblogger_hook.py | https://docs.wandb.ai/guides/integrations/mmdetection | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/detectors/detr.py | SSD_for_PyTorch/mmdet/models/utils/positional_encoding.py | https://arxiv.org/pdf/2005.12872 | 论文地址 | -| 开发引入 | / | SSD_for_PyTorch/mmdet/models/roi_heads/point_rend_roi_head.py | https://github.com/facebookresearch/detectron2/tree/master/projects/PointRend | 源码实现 | -| 开发引入 | / | SSD_for_PyTorch/mmcv_need/optimizer.py | https://pytorch.org/docs/stable/amp.html#torch.cuda.amp.GradScaler | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/dense_heads/tood_head.py | SSD_for_PyTorch/mmdet/models/dense_heads/reppoints_head.py | https://github.com/open-mmlab/mmdetection/pull/6268/ | 源码实现 | -| 开发引入 | / | SSD_for_PyTorch/mmdet/models/losses/ae_loss.py | https://arxiv.org/abs/1611.05424 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/core/bbox/coder/legacy_delta_xywh_bbox_coder.py | SSD_for_PyTorch/mmdet/core/bbox/coder/legacy_delta_xywh_bbox_coder.py | https://arxiv.org/abs/1311.2524 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/core/bbox/assigners/atss_assigner.py | SSD_for_PyTorch/mmdet/core/bbox/assigners/atss_assigner.py | https://github.com/sfzhang15/ATSS/blob/master/atss_core/modeling/rpn/atss/loss.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/tools/deployment/test.py | SSD_for_PyTorch/tools/deployment/onnx2tensorrt.py | https://github.com/open-mmlab/mmdeploy | 源码实现 | -| 开发引入 | / | SSD_for_PyTorch/mmdet/models/losses/ae_loss.py | https://github.com/princeton-vl/CornerNet/blob/master/models/py_utils/kp_utils.py#L180 | 源码实现 | -| 开发引入 | / | SSD_for_PyTorch/mmdet/core/utils/dist_utils.py | https://github.com/Megvii- | 源码实现 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/dense_heads/corner_head.py | SSD_for_PyTorch/mmdet/models/dense_heads/corner_head.py | https://github.com/princeton-vl/CornerNet/blob/master/models/py_utils/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/backbones/swin.py | SSD_for_PyTorch/mmdet/models/backbones/swin.py | https://arxiv.org/abs/2103.14030 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/datasets/voc.py | SSD_for_PyTorch/mmdet/core/evaluation/bbox_overlaps.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2012/VOCdevkit_18-May-2011.tar | 数据集地址 | -| 开发引入 | / | SSD_for_PyTorch/configs/strong_baselines/mask_rcnn_r50_fpn_syncbn-all_rpn-2conv_lsj_100e_coco.py | https://github.com/pytorch/pytorch/issues/36530 | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/strong_baselines/mask_rcnn_r50_fpn_syncbn-all_rpn-2conv_lsj_100e_coco.py | SSD_for_PyTorch/configs/strong_baselines/mask_rcnn_r50_fpn_syncbn-all_rpn-2conv_lsj_100e_coco.py | https://github.com/open-mmlab/mmcv/pull/1205 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/dense_heads/yolo_head.py | SSD_for_PyTorch/mmdet/models/dense_heads/yolo_head.py | https://github.com/ultralytics/yolov3 | 源码实现 | -| 开发引入 | / | SSD_for_PyTorch/configs/strong_baselines/mask_rcnn_r50_caffe_fpn_syncbn-all_rpn-2conv_lsj_100e_coco.py | https://github.com/pytorch/pytorch/issues/36530 | 相关说明 | -| 开发引入 | / | SSD_for_PyTorch/configs/simple_copy_paste/mask_rcnn_r50_fpn_syncbn-all_rpn-2conv_ssj_32x2_270k_coco.py | https://github.com/pytorch/pytorch/issues/36530 | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/docs/zh_cn/make.bat | SSD_for_PyTorch/docs/en/make.bat | http://sphinx-doc.org/ | 相关说明 | -| 开发引入 | / | SSD_for_PyTorch/mmdet/core/export/onnx_helper.py | https://github.com/NVIDIA/TensorRT/issues/1134 | 相关说明 | -| 开发引入 | / | SSD_for_PyTorch/mmdet/datasets/dataset_wrappers.py | https://github.com/facebookresearch/detectron2/blob/41d475b75a230221e21d9cac5d69655e3415e3a4/detectron2/data/samplers/distributed_sampler.py#L57 | 源码实现 | -| 开发引入 | / | SSD_for_PyTorch/mmdet/models/dense_heads/yolo_head.py | https://github.com/NVIDIA/TensorRT/issues/1134 | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/tools/deployment/test.py | SSD_for_PyTorch/tools/deployment/test.py | https://github.com/open-mmlab/mmdeploy | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/detectors/detr.py | SSD_for_PyTorch/mmdet/models/dense_heads/detr_head.py | https://arxiv.org/pdf/2005.12872 | 论文地址 | -| 开发引入 | / | SSD_for_PyTorch/mmdet/datasets/cityscapes.py | https://github.com/facebookresearch/detectron2/blob/master/detectron2/data/datasets/cityscapes.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/setup.py | SSD_for_PyTorch/setup.py | http://setuptools.readthedocs.io/en/latest/setuptools.html#declaring-platform-specific-dependencies | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/dense_heads/solov2_head.py | SSD_for_PyTorch/mmdet/models/dense_heads/solov2_head.py | https://arxiv.org/pdf/2003.10152 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/utils/se_layer.py | SSD_for_PyTorch/mmdet/models/utils/se_layer.py | https://github.com/microsoft/DynamicHead/blob/master/dyhead/dyrelu.py | 源码实现 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/datasets/voc.py | SSD_for_PyTorch/mmdet/datasets/voc.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2012/VOCdevkit_18-May-2011.tar | 数据集地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/roi_heads/sparse_roi_head.py | SSD_for_PyTorch/mmdet/models/roi_heads/sparse_roi_head.py | http://arxiv.org/abs/2105.01928 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/dense_heads/deformable_detr_head.py | SSD_for_PyTorch/mmdet/models/dense_heads/deformable_detr_head.py | https://arxiv.org/abs/2010.04159 | 论文地址 | -| 开发引入 | / | SSD_for_PyTorch/configs/simple_copy_paste/mask_rcnn_r50_fpn_syncbn-all_rpn-2conv_ssj_scp_32x2_270k_coco.py | https://github.com/pytorch/pytorch/issues/36530 | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/core/bbox/assigners/max_iou_assigner.py | SSD_for_PyTorch/mmdet/core/bbox/assigners/max_iou_assigner.py | https://github.com/open-mmlab/mmdetection/pull/7464 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/backbones/pvt.py | SSD_for_PyTorch/mmdet/models/backbones/pvt.py | https://arxiv.org/pdf/2106.13797.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/necks/dyhead.py | SSD_for_PyTorch/mmdet/models/necks/dyhead.py | https://github.com/microsoft/DynamicHead/issues/25 | 相关说明 | -| 开发引入 | / | SSD_for_PyTorch/mmdet/models/roi_heads/mask_heads/mask_point_head.py | https://github.com/facebookresearch/detectron2/tree/master/projects/PointRend/point_head/point_head.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/dense_heads/fcos_head.py | SSD_for_PyTorch/mmdet/models/dense_heads/fcos_head.py | https://github.com/tianzhi0549/FCOS/issues/89#issuecomment-516877042 | 相关说明 | -| 开发引入 | / | SSD_for_PyTorch/mmdet/datasets/coco.py | https://github.com/cocodataset/cocoapi/blob/master/PythonAPI/pycocotools/coco.py#L331 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/losses/iou_loss.py | SSD_for_PyTorch/mmdet/models/losses/iou_loss.py | https://arxiv.org/abs/2005.03572 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/tools/deployment/test.py | SSD_for_PyTorch/tools/deployment/pytorch2onnx.py | https://github.com/open-mmlab/mmdeploy | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/roi_heads/roi_extractors/single_level_roi_extractor.py | SSD_for_PyTorch/mmdet/models/necks/fpn.py | https://arxiv.org/abs/1612.03144 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/roi_heads/sparse_roi_head.py | SSD_for_PyTorch/mmdet/models/roi_heads/mask_heads/dynamic_mask_head.py | http://arxiv.org/abs/2105.01928 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/detectors/detr.py | SSD_for_PyTorch/mmdet/models/utils/transformer.py | https://arxiv.org/pdf/2005.12872 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/datasets/wider_face.py | SSD_for_PyTorch/mmdet/datasets/wider_face.py | https://github.com/sovrasov/wider-face-pascal-voc-annotations | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/datasets/lvis.py | SSD_for_PyTorch/mmdet/datasets/lvis.py | http://images.cocodataset.org/train2017/000000391895.jpg | 图片地址 | -| 开发引入 | / | 
SSD_for_PyTorch/mmdet/core/hook/wandblogger_hook.py | https://docs.wandb.ai/guides/data-vis | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/utils/memory.py | SSD_for_PyTorch/mmdet/utils/memory.py | https://github.com/facebookresearch/detectron2/blob/main/detectron2/utils/memory.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/roi_heads/sparse_roi_head.py | SSD_for_PyTorch/mmdet/models/detectors/queryinst.py | http://arxiv.org/abs/2105.01928 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/core/mask/structures.py | SSD_for_PyTorch/mmdet/core/mask/structures.py | https://stackoverflow.com/questions/1709283/how-can-i-sort-a-coordinate-list-for-a-rectangle-counterclockwise | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/dense_heads/paa_head.py | SSD_for_PyTorch/mmdet/models/dense_heads/paa_head.py | https://github.com/kkhoot/PAA/issues/9 | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/dense_heads/yolact_head.py | SSD_for_PyTorch/mmdet/models/dense_heads/yolact_head.py | https://github.com/open-mmlab/mmdetection/issues/5978 | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/detectors/lad.py | SSD_for_PyTorch/mmdet/models/dense_heads/lad_head.py | https://arxiv.org/pdf/2108.10520.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/strong_baselines/mask_rcnn_r50_fpn_syncbn-all_rpn-2conv_lsj_100e_coco.py | SSD_for_PyTorch/configs/strong_baselines/mask_rcnn_r50_caffe_fpn_syncbn-all_rpn-2conv_lsj_100e_coco.py | https://github.com/open-mmlab/mmcv/pull/1205 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/roi_heads/grid_roi_head.py | SSD_for_PyTorch/mmdet/models/detectors/grid_rcnn.py | https://arxiv.org/abs/1811.12030 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/docs/zh_cn/get_started.md | SSD_for_PyTorch/Dockerfile | https://github.com/open-mmlab/mmcv.git | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/utils/transformer.py | SSD_for_PyTorch/mmdet/models/utils/transformer.py | https://pytorch.org/docs/stable/generated/torch.nn.Conv2d.html | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/utils/misc.py | SSD_for_PyTorch/mmdet/utils/misc.py | https://github.com/microsoft/SoftTeacher | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/.dev_scripts/check_links.py | SSD_for_PyTorch/.dev_scripts/check_links.py | https://github.com/allenai/allennlp/blob/main/scripts/check_links.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/datasets/samplers/class_aware_sampler.py | SSD_for_PyTorch/mmdet/datasets/samplers/class_aware_sampler.py | https://arxiv.org/abs/1512.05830 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/detectors/detr.py | SSD_for_PyTorch/mmdet/models/detectors/detr.py | https://arxiv.org/pdf/2005.12872 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/losses/dice_loss.py | SSD_for_PyTorch/mmdet/models/losses/dice_loss.py | https://arxiv.org/abs/1606.04797 | 论文地址 | -| 开发引入 | / | SSD_for_PyTorch/mmdet/models/utils/se_layer.py | https://arxiv.org/abs/2003.10027 | 论文地址 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/roi_heads/double_roi_head.py | SSD_for_PyTorch/mmdet/models/roi_heads/double_roi_head.py | https://arxiv.org/abs/1904.06493 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/core/bbox/iou_calculators/iou2d_calculator.py | SSD_for_PyTorch/mmdet/core/bbox/iou_calculators/iou2d_calculator.py | https://github.com/open-mmlab/mmdetection/pull/4889 | 源码实现 | -| 开发引入 | / | SSD_for_PyTorch/mmdet/utils/setup_env.py | https://github.com/pytorch/pytorch/blob/master/torch/distributed/run.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/dense_heads/tood_head.py | SSD_for_PyTorch/mmdet/models/dense_heads/gfl_head.py | https://github.com/open-mmlab/mmdetection/pull/6268/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/roi_heads/roi_extractors/single_level_roi_extractor.py | SSD_for_PyTorch/mmdet/models/roi_heads/roi_extractors/single_level_roi_extractor.py | https://arxiv.org/abs/1612.03144 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/dense_heads/tood_head.py | SSD_for_PyTorch/mmdet/models/dense_heads/base_dense_head.py | https://github.com/open-mmlab/mmdetection/pull/6268/ | 源码实现 | -| 开发引入 | / | SSD_for_PyTorch/mmdet/utils/util_random.py | https://gitlab.kitware.com/computer-vision/kwarray/blob/master/kwarray/util_random.py#L270 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/core/mask/structures.py | SSD_for_PyTorch/mmdet/core/mask/structures.py | https://docs.scipy.org/doc/scipy/reference/generated/scipy.stats.truncnorm.html | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/datasets/dataset_wrappers.py | SSD_for_PyTorch/mmdet/datasets/dataset_wrappers.py | https://arxiv.org/abs/1908.03195 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/core/bbox/samplers/mask_sampling_result.py | SSD_for_PyTorch/mmdet/core/bbox/samplers/mask_sampling_result.py | https://github.com/ZwwWayne/K-Net/blob/main/knet/det/mask_pseudo_sampler.py | 源码实现 | -| 开发引入 | / | SSD_for_PyTorch/mmdet/models/utils/make_divisible.py | https://github.com/tensorflow/models/blob/master/research/slim/nets/mobilenet/mobilenet.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/utils/se_layer.py | SSD_for_PyTorch/mmdet/models/necks/dyhead.py | https://github.com/microsoft/DynamicHead/blob/master/dyhead/dyrelu.py | 源码实现 | -| 开发引入 | / | SSD_for_PyTorch/mmdet/models/dense_heads/base_dense_head.py | https://github.com/NVIDIA/TensorRT/issues/1134 | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/losses/balanced_l1_loss.py | SSD_for_PyTorch/mmdet/models/losses/balanced_l1_loss.py | https://arxiv.org/pdf/1904.02701.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/detectors/paa.py | SSD_for_PyTorch/mmdet/models/detectors/paa.py | https://arxiv.org/pdf/2007.08103.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/necks/dyhead.py | SSD_for_PyTorch/mmdet/models/necks/dyhead.py | https://github.com/microsoft/DynamicHead | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/core/hook/wandblogger_hook.py | SSD_for_PyTorch/mmdet/core/hook/wandblogger_hook.py | https://docs.wandb.ai/ref/python/init | 相关说明 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/core/bbox/assigners/max_iou_assigner.py | SSD_for_PyTorch/mmdet/core/bbox/assigners/ascend_max_iou_assigner.py | https://github.com/open-mmlab/mmdetection/pull/7464 | 源码实现 | -| 开发引入 | / | SSD_for_PyTorch/mmdet/core/bbox/coder/yolo_bbox_coder.py | https://arxiv.org/abs/1506.02640 | 论文地址 | -| 开发引入 | / | SSD_for_PyTorch/mmcv_need/optimizer.py | https://arxiv.org/abs/1710.03740 | 论文地址 | -| 开发引入 | / | SSD_for_PyTorch/mmdet/models/dense_heads/fcos_head.py | https://github.com/tianzhi0549/FCOS | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/backbones/resnet.py | SSD_for_PyTorch/mmdet/models/backbones/resnet.py | https://arxiv.org/pdf/1812.01187.pdf | 论文地址 | -| 开发引入 | / | SSD_for_PyTorch/mmdet/core/data_structures/instance_data.py | https://github.com/facebookresearch/detectron2/blob/master/detectron2/structures/instances.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/apis/train.py | SSD_for_PyTorch/mmdet/apis/train.py | https://arxiv.org/abs/1706.02677 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/docs/zh_cn/get_started.md | SSD_for_PyTorch/mmdet/datasets/lvis.py | https://github.com/lvis-dataset/lvis-api.git | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/configs/instaboost/README.md | SSD_for_PyTorch/mmdet/datasets/pipelines/instaboost.py | https://github.com/GothicAi/Instaboost | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/utils/gaussian_target.py | SSD_for_PyTorch/mmdet/models/utils/gaussian_target.py | https://github.com/princeton-vl/CornerNet-Lite/blob/master/core/sample/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/losses/balanced_l1_loss.py | SSD_for_PyTorch/mmdet/core/bbox/samplers/iou_balanced_neg_sampler.py | https://arxiv.org/pdf/1904.02701.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/docs/zh_cn/compatibility.md | SSD_for_PyTorch/mmdet/models/backbones/pvt.py | https://github.com/open-mmlab/mmcv/pull/1418 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/datasets/pipelines/transforms.py | SSD_for_PyTorch/mmdet/datasets/pipelines/transforms.py | https://arxiv.org/abs/1708.04552 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/detectors/lad.py | SSD_for_PyTorch/mmdet/models/detectors/lad.py | https://arxiv.org/pdf/2108.10520.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/core/bbox/samplers/random_sampler.py | SSD_for_PyTorch/mmdet/core/bbox/samplers/random_sampler.py | https://github.com/open-mmlab/mmdetection/pull/5014 | 源码实现 | -| 开发引入 | / | SSD_for_PyTorch/mmdet/models/dense_heads/paa_head.py | https://github.com/kkhoot/PAA/issues/8 | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/dense_heads/ddod_head.py | SSD_for_PyTorch/mmdet/models/dense_heads/ddod_head.py | https://arxiv.org/abs/2107.02963 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/dense_heads/tood_head.py | SSD_for_PyTorch/mmdet/models/dense_heads/sabl_retina_head.py | https://github.com/open-mmlab/mmdetection/pull/6268/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/datasets/samplers/class_aware_sampler.py | SSD_for_PyTorch/mmdet/datasets/samplers/class_aware_sampler.py | 
https://github.com/wutong16/DistributionBalancedLoss/blob/master/mllt/datasets/loader/sampler.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/datasets/pipelines/transforms.py | SSD_for_PyTorch/mmdet/datasets/pipelines/transforms.py | https://github.com/bethgelab/imagecorruptions | 源码实现 | -| 开发引入 | / | SSD_for_PyTorch/Dockerfile | https://github.com/pytorch/vision.git | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/utils/transformer.py | SSD_for_PyTorch/mmdet/models/utils/transformer.py | https://github.com/PeizeSun/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/docs/zh_cn/conf.py | SSD_for_PyTorch/docs/en/conf.py | https://www.sphinx-doc.org/en/master/usage/configuration.html | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/losses/iou_loss.py | SSD_for_PyTorch/mmdet/models/losses/iou_loss.py | https://github.com/Zzh-tju/CIoU | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/datasets/pipelines/auto_augment.py | SSD_for_PyTorch/mmdet/datasets/pipelines/auto_augment.py | https://arxiv.org/pdf/1906.11172 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/losses/iou_loss.py | SSD_for_PyTorch/mmdet/models/losses/iou_loss.py | https://arxiv.org/abs/1911.08287 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/roi_heads/mask_heads/fcn_mask_head.py | SSD_for_PyTorch/mmdet/models/roi_heads/mask_heads/fcn_mask_head.py | https://github.com/open-mmlab/mmdetection/pull/5191 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/roi_heads/mask_heads/fcn_mask_head.py | SSD_for_PyTorch/mmdet/models/roi_heads/mask_heads/fcn_mask_head.py | https://github.com/facebookresearch/detectron2/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/roi_heads/dynamic_roi_head.py | SSD_for_PyTorch/mmdet/models/roi_heads/dynamic_roi_head.py | https://arxiv.org/abs/2004.06002 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/models/dense_heads/deformable_detr_head.py | SSD_for_PyTorch/mmdet/models/dense_heads/deformable_detr_head.py | https://github.com/fundamentalvision/Deformable-DETR | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/v2.25.0/mmdet/core/bbox/samplers/ohem_sampler.py | SSD_for_PyTorch/mmdet/core/bbox/samplers/ohem_sampler.py | https://arxiv.org/abs/1604.03540 | 论文地址 | -| 开发引入 | / | SSD_for_PyTorch/requirements/docs.txt | https://github.com/open-mmlab/pytorch_sphinx_theme.git#egg=pytorch_sphinx_theme | 相关依赖 | -| 开发引入 | / | SSD_for_PyTorch/requirements/tests.txt | https://github.com/open-mmlab/mmtracking#egg=mmtrack | 相关依赖 | +| 文件位置 | 公网地址 | 公网地址用途 | +|-------------------------------------------------------------------------------------------------------------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|-----------| +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/.circleci/config.yml | https://download.pytorch.org/whl/torch_stable.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/.circleci/config.yml | 
https://download.pytorch.org/whl/torch_stable.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/.circleci/config.yml | https://download.openmmlab.com/mmcv/dist/cu101/torch1.6.0/index.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/.circleci/config.yml | https://download.openmmlab.com/mmcv/dist/cpu/torch<< parameters.torch >>/index.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/.dev_scripts/gather_models.py | https://download.openmmlab.com/mmdetection/v2.0/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/atss/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/atss/atss_r50_fpn_1x_coco/atss_r50_fpn_1x_coco_20200209-985f7bd0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/atss/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/atss/atss_r101_fpn_1x_coco/atss_r101_fpn_1x_20200825-dfcadd6f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/atss/metafile.yml | https://arxiv.org/abs/1912.02424 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/autoassign/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/autoassign/auto_assign_r50_fpn_1x_coco/auto_assign_r50_fpn_1x_coco_20210413_115540-5e17991f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/autoassign/metafile.yml | https://arxiv.org/abs/2007.03496 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/carafe/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/carafe/mask_rcnn_r50_fpn_carafe_1x_coco/mask_rcnn_r50_fpn_carafe_1x_coco_bbox_mAP-0.393__segm_mAP-0.358_20200503_135957-8687f195.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/carafe/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/carafe/faster_rcnn_r50_fpn_carafe_1x_coco/faster_rcnn_r50_fpn_carafe_1x_coco_bbox_mAP-0.386_20200504_175733-385a75b7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/carafe/metafile.yml | https://arxiv.org/abs/1905.02188 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_rcnn_x101_64x4d_fpn_20e_coco/cascade_rcnn_x101_64x4d_fpn_20e_coco_20200509_224357-051557b1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_rcnn_x101_64x4d_fpn_1x_coco/cascade_rcnn_x101_64x4d_fpn_1x_coco_20200515_075702-43ce6a30.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_rcnn_x101_32x4d_fpn_20e_coco/cascade_rcnn_x101_32x4d_fpn_20e_coco_20200906_134608-9ae0a720.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_rcnn_x101_32x4d_fpn_1x_coco/cascade_rcnn_x101_32x4d_fpn_1x_coco_20200316-95c2deb6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/cascade_rcnn/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_rcnn_r50_fpn_20e_coco/cascade_rcnn_r50_fpn_20e_coco_bbox_mAP-0.41_20200504_175131-e9872a90.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_rcnn_r50_fpn_1x_coco/cascade_rcnn_r50_fpn_1x_coco_20200316-3dc56deb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_rcnn_r50_caffe_fpn_1x_coco/cascade_rcnn_r50_caffe_fpn_1x_coco_bbox_mAP-0.404_20200504_174853-b857be87.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_rcnn_r101_fpn_20e_coco/cascade_rcnn_r101_fpn_20e_coco_bbox_mAP-0.425_20200504_231812-5057dcc5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_rcnn_r101_fpn_1x_coco/cascade_rcnn_r101_fpn_1x_coco_20200317-0b6a2fbf.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_rcnn_r101_caffe_fpn_1x_coco/cascade_rcnn_r101_caffe_fpn_1x_coco_bbox_mAP-0.423_20200504_175649-cab8dbd5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_x101_64x4d_fpn_mstrain_3x_coco/cascade_mask_rcnn_x101_64x4d_fpn_mstrain_3x_coco_20210719_210311-d3e64ba0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_x101_64x4d_fpn_20e_coco/cascade_mask_rcnn_x101_64x4d_fpn_20e_coco_20200512_161033-bdb5126a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_x101_64x4d_fpn_1x_coco/cascade_mask_rcnn_x101_64x4d_fpn_1x_coco_20200203-9a2db89d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_x101_32x8d_fpn_mstrain_3x_coco/cascade_mask_rcnn_x101_32x8d_fpn_mstrain_3x_coco_20210719_180640-9ff7e76f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_x101_32x4d_fpn_mstrain_3x_coco/cascade_mask_rcnn_x101_32x4d_fpn_mstrain_3x_coco_20210706_225234-40773067.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_x101_32x4d_fpn_20e_coco/cascade_mask_rcnn_x101_32x4d_fpn_20e_coco_20200528_083917-ed1f4751.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_x101_32x4d_fpn_1x_coco/cascade_mask_rcnn_x101_32x4d_fpn_1x_coco_20200201-0f411b1f.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_r50_fpn_mstrain_3x_coco/cascade_mask_rcnn_r50_fpn_mstrain_3x_coco_20210628_164719-5bdc3824.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_r50_fpn_20e_coco/cascade_mask_rcnn_r50_fpn_20e_coco_bbox_mAP-0.419__segm_mAP-0.365_20200504_174711-4af8e66e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_r50_fpn_1x_coco/cascade_mask_rcnn_r50_fpn_1x_coco_20200203-9d4dcb24.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_r50_caffe_fpn_mstrain_3x_coco/cascade_mask_rcnn_r50_caffe_fpn_mstrain_3x_coco_20210707_002651-6e29b3a6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_r50_caffe_fpn_1x_coco/cascade_mask_rcnn_r50_caffe_fpn_1x_coco_bbox_mAP-0.412__segm_mAP-0.36_20200504_174659-5004b251.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_r101_fpn_mstrain_3x_coco/cascade_mask_rcnn_r101_fpn_mstrain_3x_coco_20210628_165236-51a2d363.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_r101_fpn_20e_coco/cascade_mask_rcnn_r101_fpn_20e_coco_bbox_mAP-0.434__segm_mAP-0.378_20200504_174836-005947da.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_r101_fpn_1x_coco/cascade_mask_rcnn_r101_fpn_1x_coco_20200203-befdf6ee.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_r101_caffe_fpn_mstrain_3x_coco/cascade_mask_rcnn_r101_caffe_fpn_mstrain_3x_coco_20210707_002620-a5bd2389.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_r101_caffe_fpn_1x_coco/cascade_mask_rcnn_r101_caffe_fpn_1x_coco_bbox_mAP-0.432__segm_mAP-0.376_20200504_174813-5c1e9599.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/cascade_rcnn/metafile.yml | http://dx.doi.org/10.1109/tpami.2019.2956516 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/cascade_rpn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rpn/crpn_faster_rcnn_r50_caffe_fpn_1x_coco/crpn_faster_rcnn_r50_caffe_fpn_1x_coco-c8283cca.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/cascade_rpn/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/cascade_rpn/crpn_fast_rcnn_r50_caffe_fpn_1x_coco/crpn_fast_rcnn_r50_caffe_fpn_1x_coco-cb486e66.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/cascade_rpn/metafile.yml | https://arxiv.org/abs/1909.06720 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/centernet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/centernet/centernet_resnet18_dcnv2_140e_coco/centernet_resnet18_dcnv2_140e_coco_20210702_155131-c8cd631f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/centernet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/centernet/centernet_resnet18_140e_coco/centernet_resnet18_140e_coco_20210705_093630-bb5b3bf7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/centernet/metafile.yml | https://arxiv.org/abs/1904.07850 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/centripetalnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/centripetalnet/centripetalnet_hourglass104_mstest_16x6_210e_coco/centripetalnet_hourglass104_mstest_16x6_210e_coco_20200915_204804-3ccc61e5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/centripetalnet/metafile.yml | https://arxiv.org/abs/2003.09119 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/cityscapes/faster_rcnn_r50_fpn_1x_cityscapes.py | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_fpn_1x_coco/faster_rcnn_r50_fpn_1x_coco_20200130-047c8118.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/cityscapes/mask_rcnn_r50_fpn_1x_cityscapes.py | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r50_fpn_1x_coco/mask_rcnn_r50_fpn_1x_coco_20200205-d4b0c5d6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/convnext/cascade_mask_rcnn_convnext-s_p4_w7_fpn_giou_4conv1f_fp16_ms-crop_3x_coco.py | https://download.openmmlab.com/mmclassification/v0/convnext/downstream/convnext-small_3rdparty_32xb128-noema_in1k_20220301-303e75e3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/convnext/cascade_mask_rcnn_convnext-t_p4_w7_fpn_giou_4conv1f_fp16_ms-crop_3x_coco.py | https://download.openmmlab.com/mmclassification/v0/convnext/downstream/convnext-tiny_3rdparty_32xb128-noema_in1k_20220301-795e9634.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/convnext/mask_rcnn_convnext-t_p4_w7_fpn_fp16_ms-crop_3x_coco.py | https://download.openmmlab.com/mmclassification/v0/convnext/downstream/convnext-tiny_3rdparty_32xb128-noema_in1k_20220301-795e9634.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/convnext/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/convnext/mask_rcnn_convnext-t_p4_w7_fpn_fp16_ms-crop_3x_coco/mask_rcnn_convnext-t_p4_w7_fpn_fp16_ms-crop_3x_coco_20220426_154953-050731f4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/convnext/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/convnext/cascade_mask_rcnn_convnext-t_p4_w7_fpn_giou_4conv1f_fp16_ms-crop_3x_coco/cascade_mask_rcnn_convnext-t_p4_w7_fpn_giou_4conv1f_fp16_ms-crop_3x_coco_20220509_204200-8f07c40b.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/convnext/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/convnext/cascade_mask_rcnn_convnext-s_p4_w7_fpn_giou_4conv1f_fp16_ms-crop_3x_coco/cascade_mask_rcnn_convnext-s_p4_w7_fpn_giou_4conv1f_fp16_ms-crop_3x_coco_20220510_201004-3d24f5a4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/convnext/metafile.yml | https://arxiv.org/abs/2201.03545 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/convnext/metafile.yml | https://arxiv.org/abs/2201.03545 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/convnext/metafile.yml | https://arxiv.org/abs/2201.03545 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/cornernet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cornernet/cornernet_hourglass104_mstest_8x6_210e_coco/cornernet_hourglass104_mstest_8x6_210e_coco_20200825_150618-79b44c30.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/cornernet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cornernet/cornernet_hourglass104_mstest_32x3_210e_coco/cornernet_hourglass104_mstest_32x3_210e_coco_20200819_203110-1efaea91.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/cornernet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cornernet/cornernet_hourglass104_mstest_10x5_210e_coco/cornernet_hourglass104_mstest_10x5_210e_coco_20200824_185720-5fefbf1c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/cornernet/metafile.yml | https://arxiv.org/abs/1808.01244 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/dcn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fp16/mask_rcnn_r50_fpn_fp16_dconv_c3-c5_1x_coco/mask_rcnn_r50_fpn_fp16_dconv_c3-c5_1x_coco_20210520_180247-c06429d2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/dcn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/dcn/mask_rcnn_r50_fpn_dconv_c3-c5_1x_coco/mask_rcnn_r50_fpn_dconv_c3-c5_1x_coco_20200203-4d9ad43b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/dcn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/dcn/mask_rcnn_r101_fpn_dconv_c3-c5_1x_coco/mask_rcnn_r101_fpn_dconv_c3-c5_1x_coco_20200216-a71f5bce.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/dcn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/dcn/faster_rcnn_x101_32x4d_fpn_dconv_c3-c5_1x_coco/faster_rcnn_x101_32x4d_fpn_dconv_c3-c5_1x_coco_20200203-4f85c69c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/dcn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/dcn/faster_rcnn_r50_fpn_dpool_1x_coco/faster_rcnn_r50_fpn_dpool_1x_coco_20200307-90d3c01d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/dcn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/dcn/faster_rcnn_r50_fpn_dconv_c3-c5_1x_coco/faster_rcnn_r50_fpn_dconv_c3-c5_1x_coco_20200130-d68aed1e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/dcn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/dcn/faster_rcnn_r101_fpn_dconv_c3-c5_1x_coco/faster_rcnn_r101_fpn_dconv_c3-c5_1x_coco_20200203-1377f13d.pth | 权重地址 | 
+| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/dcn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/dcn/cascade_rcnn_r50_fpn_dconv_c3-c5_1x_coco/cascade_rcnn_r50_fpn_dconv_c3-c5_1x_coco_20200130-2f1fca44.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/dcn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/dcn/cascade_rcnn_r101_fpn_dconv_c3-c5_1x_coco/cascade_rcnn_r101_fpn_dconv_c3-c5_1x_coco_20200203-3b2f0594.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/dcn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/dcn/cascade_mask_rcnn_x101_32x4d_fpn_dconv_c3-c5_1x_coco/cascade_mask_rcnn_x101_32x4d_fpn_dconv_c3-c5_1x_coco-e75f90c8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/dcn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/dcn/cascade_mask_rcnn_r50_fpn_dconv_c3-c5_1x_coco/cascade_mask_rcnn_r50_fpn_dconv_c3-c5_1x_coco_20200202-42e767a2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/dcn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/dcn/cascade_mask_rcnn_r101_fpn_dconv_c3-c5_1x_coco/cascade_mask_rcnn_r101_fpn_dconv_c3-c5_1x_coco_20200204-df0c5f10.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/dcn/metafile.yml | https://arxiv.org/abs/1703.06211 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/dcnv2/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fp16/mask_rcnn_r50_fpn_fp16_mdconv_c3-c5_1x_coco/mask_rcnn_r50_fpn_fp16_mdconv_c3-c5_1x_coco_20210520_180434-cf8fefa5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/dcnv2/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/dcn/mask_rcnn_r50_fpn_mdconv_c3-c5_1x_coco/mask_rcnn_r50_fpn_mdconv_c3-c5_1x_coco_20200203-ad97591f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/dcnv2/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/dcn/faster_rcnn_r50_fpn_mdpool_1x_coco/faster_rcnn_r50_fpn_mdpool_1x_coco_20200307-c0df27ff.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/dcnv2/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/dcn/faster_rcnn_r50_fpn_mdconv_c3-c5_group4_1x_coco/faster_rcnn_r50_fpn_mdconv_c3-c5_group4_1x_coco_20200130-01262257.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/dcnv2/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/dcn/faster_rcnn_r50_fpn_mdconv_c3-c5_1x_coco/faster_rcnn_r50_fpn_mdconv_c3-c5_1x_coco_20200130-d099253b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/dcnv2/metafile.yml | https://arxiv.org/abs/1811.11168 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/ddod/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/ddod/ddod_r50_fpn_1x_coco/ddod_r50_fpn_1x_coco_20220523_223737-29b2fc67.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/ddod/metafile.yml | https://arxiv.org/pdf/2107.02963.pdf | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/deformable_detr/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/deformable_detr/deformable_detr_twostage_refine_r50_16x2_50e_coco/deformable_detr_twostage_refine_r50_16x2_50e_coco_20210419_220613-9d28ab72.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/deformable_detr/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/deformable_detr/deformable_detr_refine_r50_16x2_50e_coco/deformable_detr_refine_r50_16x2_50e_coco_20210419_220503-5f5dff21.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/deformable_detr/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/deformable_detr/deformable_detr_r50_16x2_50e_coco/deformable_detr_r50_16x2_50e_coco_20210419_220030-a12b9512.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/deformable_detr/metafile.yml | https://openreview.net/forum?id=gZ9hCDWe6ke | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/detectors/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/detectors/htc_r50_sac_1x_coco/htc_r50_sac_1x_coco-bfa60c54.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/detectors/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/detectors/htc_r50_rfp_1x_coco/htc_r50_rfp_1x_coco-8ff87c51.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/detectors/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/detectors/detectors_htc_r50_1x_coco/detectors_htc_r50_1x_coco-329b1453.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/detectors/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/detectors/detectors_cascade_rcnn_r50_1x_coco/detectors_cascade_rcnn_r50_1x_coco-32a10ba0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/detectors/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/detectors/cascade_rcnn_r50_sac_1x_coco/cascade_rcnn_r50_sac_1x_coco-24bfda62.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/detectors/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/detectors/cascade_rcnn_r50_rfp_1x_coco/cascade_rcnn_r50_rfp_1x_coco-8cf51bfd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/detectors/metafile.yml | https://arxiv.org/abs/2006.02334 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/detr/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/detr/detr_r50_8x2_150e_coco/detr_r50_8x2_150e_coco_20201130_194835-2c4b8974.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/detr/metafile.yml | https://arxiv.org/abs/2005.12872 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/double_heads/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/double_heads/dh_faster_rcnn_r50_fpn_1x_coco/dh_faster_rcnn_r50_fpn_1x_coco_20200130-586b67df.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/double_heads/metafile.yml | https://arxiv.org/pdf/1904.06493 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/dyhead/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/dyhead/atss_swin-l-p4-w12_fpn_dyhead_mstrain_2x_coco/atss_swin-l-p4-w12_fpn_dyhead_mstrain_2x_coco_20220509_100315-bc5b6516.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/dyhead/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/dyhead/atss_r50_fpn_dyhead_for_reproduction_1x_coco/atss_r50_fpn_dyhead_for_reproduction_4x4_1x_coco_20220107_213939-162888e6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/dyhead/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/dyhead/atss_r50_fpn_dyhead_4x4_1x_coco/atss_r50_fpn_dyhead_4x4_1x_coco_20211219_023314-eaa620c6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/dyhead/metafile.yml | https://arxiv.org/abs/2106.08322 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/dynamic_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/dynamic_rcnn/dynamic_rcnn_r50_fpn_1x/dynamic_rcnn_r50_fpn_1x-62a3f276.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/dynamic_rcnn/metafile.yml | https://arxiv.org/pdf/2004.06002 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/efficientnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/efficientnet/retinanet_effb3_fpn_crop896_8x4_1x_coco/retinanet_effb3_fpn_crop896_8x4_1x_coco_20220322_234806-615a0dda.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/efficientnet/metafile.yml | https://arxiv.org/abs/1905.11946v5 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/efficientnet/retinanet_effb3_fpn_crop896_8x4_1x_coco.py | https://download.openmmlab.com/mmclassification/v0/efficientnet/efficientnet-b3_3rdparty_8xb32-aa_in1k_20220119-5b4887a0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/empirical_attention/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/empirical_attention/faster_rcnn_r50_fpn_attention_1111_dcn_1x_coco/faster_rcnn_r50_fpn_attention_1111_dcn_1x_coco_20200130-8b2523a6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/empirical_attention/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/empirical_attention/faster_rcnn_r50_fpn_attention_1111_1x_coco/faster_rcnn_r50_fpn_attention_1111_1x_coco_20200130-403cccba.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/empirical_attention/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/empirical_attention/faster_rcnn_r50_fpn_attention_0010_dcn_1x_coco/faster_rcnn_r50_fpn_attention_0010_dcn_1x_coco_20200130-1a2e831d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/empirical_attention/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/empirical_attention/faster_rcnn_r50_fpn_attention_0010_1x_coco/faster_rcnn_r50_fpn_attention_0010_1x_coco_20200130-7cb0c14d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/empirical_attention/metafile.yml | https://arxiv.org/pdf/1904.05873 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/faster_rcnn/faster_rcnn_r50_caffe_fpn_mstrain_1x_coco-person.py | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_caffe_fpn_mstrain_3x_coco/faster_rcnn_r50_caffe_fpn_mstrain_3x_coco_bbox_mAP-0.398_20200504_163323-30042637.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/faster_rcnn/faster_rcnn_r50_caffe_fpn_mstrain_1x_coco-person-bicycle-car.py | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_caffe_fpn_mstrain_3x_coco/faster_rcnn_r50_caffe_fpn_mstrain_3x_coco_bbox_mAP-0.398_20200504_163323-30042637.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/faster_rcnn/faster_rcnn_r50_fpn_tnr-pretrain_1x_coco.py | https://download.pytorch.org/models/resnet50-11ad3fa6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fp16/faster_rcnn_r50_fpn_fp16_1x_coco/faster_rcnn_r50_fpn_fp16_1x_coco_20200204-d4dc1471.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_x101_64x4d_fpn_mstrain_3x_coco/faster_rcnn_x101_64x4d_fpn_mstrain_3x_coco_20210524_124528-26c63de6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_x101_64x4d_fpn_2x_coco/faster_rcnn_x101_64x4d_fpn_2x_coco_20200512_161033-5961fa95.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_x101_64x4d_fpn_1x_coco/faster_rcnn_x101_64x4d_fpn_1x_coco_20200204-833ee192.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_x101_32x8d_fpn_mstrain_3x_coco/faster_rcnn_x101_32x8d_fpn_mstrain_3x_coco_20210604_182954-002e082a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_x101_32x4d_fpn_mstrain_3x_coco/faster_rcnn_x101_32x4d_fpn_mstrain_3x_coco_20210524_124151-16b9b260.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_x101_32x4d_fpn_2x_coco/faster_rcnn_x101_32x4d_fpn_2x_coco_bbox_mAP-0.412_20200506_041400-64a12c0b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_x101_32x4d_fpn_1x_coco/faster_rcnn_x101_32x4d_fpn_1x_coco_20200203-cff10310.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_fpn_mstrain_3x_coco/faster_rcnn_r50_fpn_mstrain_3x_coco_20210524_110822-e10bd31c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_fpn_2x_coco/faster_rcnn_r50_fpn_2x_coco_bbox_mAP-0.384_20200504_210434-a5d8aa15.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_fpn_1x_coco/faster_rcnn_r50_fpn_iou_1x_coco-fdd207f3.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_fpn_1x_coco/faster_rcnn_r50_fpn_giou_1x_coco-0eada910.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_fpn_1x_coco/faster_rcnn_r50_fpn_bounded_iou_1x_coco-98ad993b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_fpn_1x_coco/faster_rcnn_r50_fpn_1x_coco_20200130-047c8118.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_caffe_fpn_mstrain_3x_coco/faster_rcnn_r50_caffe_fpn_mstrain_3x_coco_20210526_095054-1f77628b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_caffe_fpn_mstrain_2x_coco/faster_rcnn_r50_caffe_fpn_mstrain_2x_coco_bbox_mAP-0.397_20200504_231813-10b2de58.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_caffe_fpn_1x_coco/faster_rcnn_r50_caffe_fpn_1x_coco_bbox_mAP-0.378_20200504_180032-c5925ee5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_caffe_dc5_mstrain_3x_coco/faster_rcnn_r50_caffe_dc5_mstrain_3x_coco_20201028_002107-34a53b2c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_caffe_dc5_mstrain_1x_coco/faster_rcnn_r50_caffe_dc5_mstrain_1x_coco_20201028_233851-b33d21b9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_caffe_dc5_1x_coco/faster_rcnn_r50_caffe_dc5_1x_coco_20201030_151909-531f0f43.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_caffe_c4_mstrain_1x_coco/faster_rcnn_r50_caffe_c4_mstrain_1x_coco_20220316_150527-db276fed.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_caffe_c4_1x_coco/faster_rcnn_r50_caffe_c4_1x_coco_20220316_150152-3f885b85.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r101_fpn_mstrain_3x_coco/faster_rcnn_r101_fpn_mstrain_3x_coco_20210524_110822-4d4d2ca8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r101_fpn_2x_coco/faster_rcnn_r101_fpn_2x_coco_bbox_mAP-0.398_20200504_210455-1d2dac9c.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r101_fpn_1x_coco/faster_rcnn_r101_fpn_1x_coco_20200130-f513f705.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r101_caffe_fpn_mstrain_3x_coco/faster_rcnn_r101_caffe_fpn_mstrain_3x_coco_20210526_095742-a7ae426d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r101_caffe_fpn_1x_coco/faster_rcnn_r101_caffe_fpn_1x_coco_bbox_mAP-0.398_20200504_180057-b269e9dd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_fpn_tnr-pretrain_1x_coco/faster_rcnn_r50_fpn_tnr-pretrain_1x_coco_20220320_085147-efedfda4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/faster_rcnn/metafile.yml | https://arxiv.org/abs/1506.01497 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/fcos/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fcos/fcos_x101_64x4d_fpn_gn-head_mstrain_640-800_2x_coco/fcos_x101_64x4d_fpn_gn-head_mstrain_640-800_2x_coco-ede514a8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/fcos/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fcos/fcos_r50_caffe_fpn_gn-head_mstrain_640-800_2x_coco/fcos_r50_caffe_fpn_gn-head_mstrain_640-800_2x_coco-d92ceeea.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/fcos/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fcos/fcos_r50_caffe_fpn_gn-head_1x_coco/fcos_r50_caffe_fpn_gn-head_1x_coco-821213aa.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/fcos/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fcos/fcos_r101_caffe_fpn_gn-head_mstrain_640-800_2x_coco/fcos_r101_caffe_fpn_gn-head_mstrain_640-800_2x_coco-511424d6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/fcos/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fcos/fcos_r101_caffe_fpn_gn-head_1x_coco/fcos_r101_caffe_fpn_gn-head_1x_coco-0e37b982.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/fcos/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fcos/fcos_center-normbbox-centeronreg-giou_r50_caffe_fpn_gn-head_dcn_1x_coco/fcos_center-normbbox-centeronreg-giou_r50_caffe_fpn_gn-head_dcn_1x_coco-ae4d8b3d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/fcos/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fcos/fcos_center-normbbox-centeronreg-giou_r50_caffe_fpn_gn-head_1x_coco/fcos_center-normbbox-centeronreg-giou_r50_caffe_fpn_gn-head_1x_coco-0a0d75a8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/fcos/metafile.yml | https://arxiv.org/abs/1904.01355 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/foveabox/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/foveabox/fovea_r50_fpn_4x4_2x_coco/fovea_r50_fpn_4x4_2x_coco_20200203-2df792b1.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/foveabox/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/foveabox/fovea_r50_fpn_4x4_1x_coco/fovea_r50_fpn_4x4_1x_coco_20200219-ee4d5303.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/foveabox/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/foveabox/fovea_r101_fpn_4x4_2x_coco/fovea_r101_fpn_4x4_2x_coco_20200208-02320ea4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/foveabox/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/foveabox/fovea_r101_fpn_4x4_1x_coco/fovea_r101_fpn_4x4_1x_coco_20200219-05e38f1c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/foveabox/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/foveabox/fovea_align_r50_fpn_gn-head_mstrain_640-800_4x4_2x_coco/fovea_align_r50_fpn_gn-head_mstrain_640-800_4x4_2x_coco_20200205-85ce26cb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/foveabox/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/foveabox/fovea_align_r50_fpn_gn-head_4x4_2x_coco/fovea_align_r50_fpn_gn-head_4x4_2x_coco_20200203-8987880d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/foveabox/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/foveabox/fovea_align_r101_fpn_gn-head_mstrain_640-800_4x4_2x_coco/fovea_align_r101_fpn_gn-head_mstrain_640-800_4x4_2x_coco_20200208-649c5eb6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/foveabox/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/foveabox/fovea_align_r101_fpn_gn-head_4x4_2x_coco/fovea_align_r101_fpn_gn-head_4x4_2x_coco_20200208-c39a027a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/foveabox/metafile.yml | https://arxiv.org/abs/1904.03797 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/fpg/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fpg/retinanet_r50_fpg_crop640_50e_coco/retinanet_r50_fpg_crop640_50e_coco_20220311_110809-b0bcf5f4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/fpg/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fpg/mask_rcnn_r50_fpg-chn128_crop640_50e_coco/mask_rcnn_r50_fpg-chn128_crop640_50e_coco_20220311_011859-043c9b4e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/fpg/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fpg/mask_rcnn_r50_fpg_crop640_50e_coco/mask_rcnn_r50_fpg_crop640_50e_coco_20220311_011857-233b8334.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/fpg/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fpg/faster_rcnn_r50_fpg-chn128_crop640_50e_coco/faster_rcnn_r50_fpg-chn128_crop640_50e_coco_20220311_011857-9376aa9d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/fpg/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fpg/faster_rcnn_r50_fpg_crop640_50e_coco/faster_rcnn_r50_fpg_crop640_50e_coco_20220311_011856-74109f42.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/fpg/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fpg/retinanet_r50_fpg-chn128_crop640_50e_coco/retinanet_r50_fpg-chn128_crop640_50e_coco_20220313_104829-ee99a686.pth | 权重地址 | 
+| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/fpg/metafile.yml | https://arxiv.org/abs/2004.03580 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/free_anchor/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/free_anchor/retinanet_free_anchor_x101_32x4d_fpn_1x_coco/retinanet_free_anchor_x101_32x4d_fpn_1x_coco_20200130-d4846968.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/free_anchor/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/free_anchor/retinanet_free_anchor_r50_fpn_1x_coco/retinanet_free_anchor_r50_fpn_1x_coco_20200130-0f67375f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/free_anchor/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/free_anchor/retinanet_free_anchor_r101_fpn_1x_coco/retinanet_free_anchor_r101_fpn_1x_coco_20200130-358324e6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/free_anchor/metafile.yml | https://arxiv.org/abs/1909.02466 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/fsaf/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fsaf/fsaf_x101_64x4d_fpn_1x_coco/fsaf_x101_64x4d_fpn_1x_coco-e3f6e6fd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/fsaf/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fsaf/fsaf_r50_fpn_1x_coco/fsaf_r50_fpn_1x_coco-94ccc51f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/fsaf/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fsaf/fsaf_r101_fpn_1x_coco/fsaf_r101_fpn_1x_coco-9e71098f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/fsaf/metafile.yml | https://arxiv.org/abs/1903.00621 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_x101_32x4d_fpn_syncbn-backbone_r4_gcb_c3-c5_1x_coco/mask_rcnn_x101_32x4d_fpn_syncbn-backbone_r4_gcb_c3-c5_1x_coco_20200212-68164964.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_x101_32x4d_fpn_syncbn-backbone_r16_gcb_c3-c5_1x_coco/mask_rcnn_x101_32x4d_fpn_syncbn-backbone_r16_gcb_c3-c5_1x_coco_20200211-cbed3d2c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_x101_32x4d_fpn_syncbn-backbone_1x_coco/mask_rcnn_x101_32x4d_fpn_syncbn-backbone_1x_coco_20200211-7584841c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_r50_fpn_syncbn-backbone_r4_gcb_c3-c5_1x_coco/mask_rcnn_r50_fpn_syncbn-backbone_r4_gcb_c3-c5_1x_coco_20200202-50b90e5c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_r50_fpn_syncbn-backbone_r16_gcb_c3-c5_1x_coco/mask_rcnn_r50_fpn_syncbn-backbone_r16_gcb_c3-c5_1x_coco_20200202-587b99aa.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/gcnet/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_r50_fpn_syncbn-backbone_1x_coco/mask_rcnn_r50_fpn_syncbn-backbone_1x_coco_20200202-bb3eb55c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_r50_fpn_r4_gcb_c3-c5_1x_coco/mask_rcnn_r50_fpn_r4_gcb_c3-c5_1x_coco_20200204-17235656.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_r50_fpn_r16_gcb_c3-c5_1x_coco/mask_rcnn_r50_fpn_r16_gcb_c3-c5_1x_coco_20200515_211915-187da160.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_r101_fpn_syncbn-backbone_r4_gcb_c3-c5_1x_coco/mask_rcnn_r101_fpn_syncbn-backbone_r4_gcb_c3-c5_1x_coco_20200206-8407a3f0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_r101_fpn_syncbn-backbone_r16_gcb_c3-c5_1x_coco/mask_rcnn_r101_fpn_syncbn-backbone_r16_gcb_c3-c5_1x_coco_20200207-945e77ca.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_r101_fpn_syncbn-backbone_1x_coco/mask_rcnn_r101_fpn_syncbn-backbone_1x_coco_20200210-81658c8a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_r101_fpn_r4_gcb_c3-c5_1x_coco/mask_rcnn_r101_fpn_r4_gcb_c3-c5_1x_coco_20200206-af22dc9d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_r101_fpn_r16_gcb_c3-c5_1x_coco/mask_rcnn_r101_fpn_r16_gcb_c3-c5_1x_coco_20200205-e58ae947.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/cascade_mask_rcnn_x101_32x4d_fpn_syncbn-backbone_r4_gcb_c3-c5_1x_coco/cascade_mask_rcnn_x101_32x4d_fpn_syncbn-backbone_r4_gcb_c3-c5_1x_coco_20200703_180653-ed035291.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/cascade_mask_rcnn_x101_32x4d_fpn_syncbn-backbone_r16_gcb_c3-c5_1x_coco/cascade_mask_rcnn_x101_32x4d_fpn_syncbn-backbone_r16_gcb_c3-c5_1x_coco_20200211-10bf2463.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/cascade_mask_rcnn_x101_32x4d_fpn_syncbn-backbone_dconv_c3-c5_r4_gcb_c3-c5_1x_coco/cascade_mask_rcnn_x101_32x4d_fpn_syncbn-backbone_dconv_c3-c5_r4_gcb_c3-c5_1x_coco_20210615_161851-720338ec.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/cascade_mask_rcnn_x101_32x4d_fpn_syncbn-backbone_dconv_c3-c5_r16_gcb_c3-c5_1x_coco/cascade_mask_rcnn_x101_32x4d_fpn_syncbn-backbone_dconv_c3-c5_r16_gcb_c3-c5_1x_coco_20210615_215648-44aa598a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/gcnet/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/gcnet/cascade_mask_rcnn_x101_32x4d_fpn_syncbn-backbone_dconv_c3-c5_1x_coco/cascade_mask_rcnn_x101_32x4d_fpn_syncbn-backbone_dconv_c3-c5_1x_coco_20210615_211019-abbc39ea.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/cascade_mask_rcnn_x101_32x4d_fpn_syncbn-backbone_1x_coco/cascade_mask_rcnn_x101_32x4d_fpn_syncbn-backbone_1x_coco_20200310-d5ad2a5e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/gcnet/metafile.yml | https://arxiv.org/abs/1904.11492 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/gfl/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gfl/gfl_x101_32x4d_fpn_mstrain_2x_coco/gfl_x101_32x4d_fpn_mstrain_2x_coco_20200630_102002-50c1ffdb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/gfl/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gfl/gfl_x101_32x4d_fpn_dconv_c4-c5_mstrain_2x_coco/gfl_x101_32x4d_fpn_dconv_c4-c5_mstrain_2x_coco_20200630_102002-14a2bf25.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/gfl/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gfl/gfl_r50_fpn_mstrain_2x_coco/gfl_r50_fpn_mstrain_2x_coco_20200629_213802-37bb1edc.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/gfl/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gfl/gfl_r50_fpn_1x_coco/gfl_r50_fpn_1x_coco_20200629_121244-25944287.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/gfl/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gfl/gfl_r101_fpn_mstrain_2x_coco/gfl_r101_fpn_mstrain_2x_coco_20200629_200126-dd12f847.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/gfl/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gfl/gfl_r101_fpn_dconv_c3-c5_mstrain_2x_coco/gfl_r101_fpn_dconv_c3-c5_mstrain_2x_coco_20200630_102002-134b07df.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/gfl/metafile.yml | https://arxiv.org/abs/2006.04388 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/ghm/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/ghm/retinanet_ghm_x101_64x4d_fpn_1x_coco/retinanet_ghm_x101_64x4d_fpn_1x_coco_20200131-dd381cef.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/ghm/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/ghm/retinanet_ghm_x101_32x4d_fpn_1x_coco/retinanet_ghm_x101_32x4d_fpn_1x_coco_20200131-e4333bd0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/ghm/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/ghm/retinanet_ghm_r50_fpn_1x_coco/retinanet_ghm_r50_fpn_1x_coco_20200130-a437fda3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/ghm/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/ghm/retinanet_ghm_r101_fpn_1x_coco/retinanet_ghm_r101_fpn_1x_coco_20200130-c148ee8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/ghm/metafile.yml | https://arxiv.org/abs/1811.05181 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/gn/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/gn/mask_rcnn_r50_fpn_gn-all_contrib_3x_coco/mask_rcnn_r50_fpn_gn-all_contrib_3x_coco_20200225-542aefbc.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/gn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gn/mask_rcnn_r50_fpn_gn-all_contrib_2x_coco/mask_rcnn_r50_fpn_gn-all_contrib_2x_coco_20200207-20d3e849.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/gn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gn/mask_rcnn_r50_fpn_gn-all_3x_coco/mask_rcnn_r50_fpn_gn-all_3x_coco_20200214-8b23b1e5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/gn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gn/mask_rcnn_r50_fpn_gn-all_2x_coco/mask_rcnn_r50_fpn_gn-all_2x_coco_20200206-8eee02a6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/gn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gn/mask_rcnn_r101_fpn_gn-all_3x_coco/mask_rcnn_r101_fpn_gn-all_3x_coco_20200513_181609-0df864f4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/gn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gn/mask_rcnn_r101_fpn_gn-all_2x_coco/mask_rcnn_r101_fpn_gn-all_2x_coco_20200205-d96b1b50.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/gn/metafile.yml | https://arxiv.org/abs/1803.08494 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/gn+ws/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gn%2Bws/mask_rcnn_x50_32x4d_fpn_gn_ws-all_2x_coco/mask_rcnn_x50_32x4d_fpn_gn_ws-all_2x_coco_20200216-649fdb6f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/gn+ws/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gn%2Bws/mask_rcnn_x50_32x4d_fpn_gn_ws-all_20_23_24e_coco/mask_rcnn_x50_32x4d_fpn_gn_ws-all_20_23_24e_coco_20200226-969bcb2c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/gn+ws/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gn%2Bws/mask_rcnn_x101_32x4d_fpn_gn_ws-all_2x_coco/mask_rcnn_x101_32x4d_fpn_gn_ws-all_2x_coco_20200319-33fb95b5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/gn+ws/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gn%2Bws/mask_rcnn_x101_32x4d_fpn_gn_ws-all_20_23_24e_coco/mask_rcnn_x101_32x4d_fpn_gn_ws-all_20_23_24e_coco_20200316-e6cd35ef.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/gn+ws/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gn%2Bws/mask_rcnn_r50_fpn_gn_ws-all_2x_coco/mask_rcnn_r50_fpn_gn_ws-all_2x_coco_20200226-16acb762.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/gn+ws/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gn%2Bws/mask_rcnn_r50_fpn_gn_ws-all_20_23_24e_coco/mask_rcnn_r50_fpn_gn_ws-all_20_23_24e_coco_20200213-487d1283.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/gn+ws/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gn%2Bws/mask_rcnn_r101_fpn_gn_ws-all_2x_coco/mask_rcnn_r101_fpn_gn_ws-all_2x_coco_20200212-ea357cd9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/gn+ws/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/gn%2Bws/mask_rcnn_r101_fpn_gn_ws-all_20_23_24e_coco/mask_rcnn_r101_fpn_gn_ws-all_20_23_24e_coco_20200213-57b5a50f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/gn+ws/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gn%2Bws/faster_rcnn_x50_32x4d_fpn_gn_ws-all_1x_coco/faster_rcnn_x50_32x4d_fpn_gn_ws-all_1x_coco_20200203-839c5d9d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/gn+ws/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gn%2Bws/faster_rcnn_x101_32x4d_fpn_gn_ws-all_1x_coco/faster_rcnn_x101_32x4d_fpn_gn_ws-all_1x_coco_20200212-27da1bc2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/gn+ws/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gn%2Bws/faster_rcnn_r50_fpn_gn_ws-all_1x_coco/faster_rcnn_r50_fpn_gn_ws-all_1x_coco_20200130-613d9fe2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/gn+ws/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gn%2Bws/faster_rcnn_r101_fpn_gn_ws-all_1x_coco/faster_rcnn_r101_fpn_gn_ws-all_1x_coco_20200205-a93b0d75.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/gn+ws/metafile.yml | https://arxiv.org/abs/1903.10520 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/grid_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/grid_rcnn/grid_rcnn_x101_64x4d_fpn_gn-head_2x_coco/grid_rcnn_x101_64x4d_fpn_gn-head_2x_coco_20200204-ec76a754.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/grid_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/grid_rcnn/grid_rcnn_x101_32x4d_fpn_gn-head_2x_coco/grid_rcnn_x101_32x4d_fpn_gn-head_2x_coco_20200130-d8f0e3ff.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/grid_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/grid_rcnn/grid_rcnn_r50_fpn_gn-head_2x_coco/grid_rcnn_r50_fpn_gn-head_2x_coco_20200130-6cca8223.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/grid_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/grid_rcnn/grid_rcnn_r101_fpn_gn-head_2x_coco/grid_rcnn_r101_fpn_gn-head_2x_coco_20200309-d6eca030.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/grid_rcnn/metafile.yml | https://arxiv.org/abs/1906.05688 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/groie/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/groie/mask_rcnn_r50_fpn_syncbn-backbone_r4_gcb_c3-c5_groie_1x_coco/mask_rcnn_r50_fpn_syncbn-backbone_r4_gcb_c3-c5_groie_1x_coco_20200604_211715-42eb79e1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/groie/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/groie/mask_rcnn_r50_fpn_groie_1x_coco/mask_rcnn_r50_fpn_groie_1x_coco_20200604_211715-50d90c74.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/groie/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/groie/mask_rcnn_r101_fpn_syncbn-backbone_r4_gcb_c3-c5_groie_1x_coco/mask_rcnn_r101_fpn_syncbn-backbone_r4_gcb_c3-c5_groie_1x_coco_20200607_224507-8daae01c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/groie/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/groie/faster_rcnn_r50_fpn_groie_1x_coco/faster_rcnn_r50_fpn_groie_1x_coco_20200604_211715-66ee9516.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/groie/metafile.yml | https://arxiv.org/abs/2004.13665 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/guided_anchoring/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/guided_anchoring/ga_rpn_x101_64x4d_fpn_1x_coco/ga_rpn_x101_64x4d_fpn_1x_coco_20200225-3c6e1aa2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/guided_anchoring/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/guided_anchoring/ga_rpn_x101_32x4d_fpn_1x_coco/ga_rpn_x101_32x4d_fpn_1x_coco_20200220-c28d1b18.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/guided_anchoring/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/guided_anchoring/ga_rpn_r50_caffe_fpn_1x_coco/ga_rpn_r50_caffe_fpn_1x_coco_20200531-899008a6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/guided_anchoring/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/guided_anchoring/ga_rpn_r101_caffe_fpn_1x_coco/ga_rpn_r101_caffe_fpn_1x_coco_20200531-ca9ba8fb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/guided_anchoring/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/guided_anchoring/ga_retinanet_x101_64x4d_fpn_1x_coco/ga_retinanet_x101_64x4d_fpn_1x_coco_20200226-ef9f7f1f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/guided_anchoring/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/guided_anchoring/ga_retinanet_x101_32x4d_fpn_1x_coco/ga_retinanet_x101_32x4d_fpn_1x_coco_20200219-40c56caa.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/guided_anchoring/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/guided_anchoring/ga_retinanet_r50_caffe_fpn_1x_coco/ga_retinanet_r50_caffe_fpn_1x_coco_20201020-39581c6f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/guided_anchoring/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/guided_anchoring/ga_retinanet_r101_caffe_fpn_1x_coco/ga_retinanet_r101_caffe_fpn_1x_coco_20200531-6266453c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/guided_anchoring/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/guided_anchoring/ga_faster_x101_64x4d_fpn_1x_coco/ga_faster_x101_64x4d_fpn_1x_coco_20200215-0fa7bde7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/guided_anchoring/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/guided_anchoring/ga_faster_x101_32x4d_fpn_1x_coco/ga_faster_x101_32x4d_fpn_1x_coco_20200215-1ded9da3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/guided_anchoring/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/guided_anchoring/ga_faster_r50_caffe_fpn_1x_coco/ga_faster_r50_caffe_fpn_1x_coco_20200702_000718-a11ccfe6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/guided_anchoring/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/guided_anchoring/ga_faster_r101_caffe_fpn_1x_coco/ga_faster_r101_caffe_fpn_1x_coco_bbox_mAP-0.415_20200505_115528-fb82e499.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/guided_anchoring/metafile.yml | https://arxiv.org/abs/1901.03278 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/hrnet/metafile.yml | https://arxiv.org/abs/1904.04514 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/mask_rcnn_hrnetv2p_w40_2x_coco/mask_rcnn_hrnetv2p_w40_2x_coco_20200512_163732-aed5e4ab.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/mask_rcnn_hrnetv2p_w40_1x_coco/mask_rcnn_hrnetv2p_w40_1x_coco_20200511_015646-66738b35.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/mask_rcnn_hrnetv2p_w32_2x_coco/mask_rcnn_hrnetv2p_w32_2x_coco_20200213-45b75b4d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/mask_rcnn_hrnetv2p_w32_1x_coco/mask_rcnn_hrnetv2p_w32_1x_coco_20200207-b29f616e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/mask_rcnn_hrnetv2p_w18_2x_coco/mask_rcnn_hrnetv2p_w18_2x_coco_20200212-b3c825b1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/mask_rcnn_hrnetv2p_w18_1x_coco/mask_rcnn_hrnetv2p_w18_1x_coco_20200205-1c3d78ed.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/htc_hrnetv2p_w40_20e_coco/htc_hrnetv2p_w40_20e_coco_20200529_183411-417c4d5b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/htc_hrnetv2p_w32_20e_coco/htc_hrnetv2p_w32_20e_coco_20200207-7639fa12.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/htc_hrnetv2p_w18_20e_coco/htc_hrnetv2p_w18_20e_coco_20200210-b266988c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/fcos_hrnetv2p_w40_gn-head_mstrain_640-800_4x4_2x_coco/fcos_hrnetv2p_w40_gn-head_mstrain_640-800_4x4_2x_coco_20201212_124752-f22d2ce5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/fcos_hrnetv2p_w32_gn-head_mstrain_640-800_4x4_2x_coco/fcos_hrnetv2p_w32_gn-head_mstrain_640-800_4x4_2x_coco_20201212_090846-b6f2b49f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/fcos_hrnetv2p_w32_gn-head_4x4_2x_coco/fcos_hrnetv2p_w32_gn-head_4x4_2x_coco_20201212_112133-77b6b9bb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/hrnet/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/hrnet/fcos_hrnetv2p_w32_gn-head_4x4_1x_coco/fcos_hrnetv2p_w32_gn-head_4x4_1x_coco_20201211_134730-cb8055c0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/fcos_hrnetv2p_w18_gn-head_mstrain_640-800_4x4_2x_coco/fcos_hrnetv2p_w18_gn-head_mstrain_640-800_4x4_2x_coco_20201212_111651-441e9d9f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/fcos_hrnetv2p_w18_gn-head_4x4_2x_coco/fcos_hrnetv2p_w18_gn-head_4x4_2x_coco_20201212_101110-5c575fa5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/fcos_hrnetv2p_w18_gn-head_4x4_1x_coco/fcos_hrnetv2p_w18_gn-head_4x4_1x_coco_20201212_100710-4ad151de.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/faster_rcnn_hrnetv2p_w40_2x_coco/faster_rcnn_hrnetv2p_w40_2x_coco_20200512_161033-0f236ef4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/faster_rcnn_hrnetv2p_w40_1x_coco/faster_rcnn_hrnetv2p_w40_1x_coco_20200210-95c1f5ce.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/faster_rcnn_hrnetv2p_w32_2x_coco/faster_rcnn_hrnetv2p_w32_2x_coco_20200529_015927-976a9c15.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/faster_rcnn_hrnetv2p_w32_1x_coco/faster_rcnn_hrnetv2p_w32_1x_coco_20200130-6e286425.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/faster_rcnn_hrnetv2p_w18_2x_coco/faster_rcnn_hrnetv2p_w18_2x_coco_20200702_085731-a4ec0611.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/faster_rcnn_hrnetv2p_w18_1x_coco/faster_rcnn_hrnetv2p_w18_1x_coco_20200130-56651a6d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/cascade_rcnn_hrnetv2p_w40_20e_coco/cascade_rcnn_hrnetv2p_w40_20e_coco_20200512_161112-75e47b04.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/cascade_rcnn_hrnetv2p_w32_20e_coco/cascade_rcnn_hrnetv2p_w32_20e_coco_20200208-928455a4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/cascade_rcnn_hrnetv2p_w18_20e_coco/cascade_rcnn_hrnetv2p_w18_20e_coco_20200210-434be9d7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/cascade_mask_rcnn_hrnetv2p_w40_20e_coco/cascade_mask_rcnn_hrnetv2p_w40_20e_coco_20200527_204922-969c4610.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/cascade_mask_rcnn_hrnetv2p_w32_20e_coco/cascade_mask_rcnn_hrnetv2p_w32_20e_coco_20200512_154043-39d9cf7b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/cascade_mask_rcnn_hrnetv2p_w18_20e_coco/cascade_mask_rcnn_hrnetv2p_w18_20e_coco_20200210-b543cd2b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/htc/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/htc/htc_x101_64x4d_fpn_dconv_c3-c5_mstrain_400_1400_16x1_20e_coco/htc_x101_64x4d_fpn_dconv_c3-c5_mstrain_400_1400_16x1_20e_coco_20200312-946fd751.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/htc/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/htc/htc_x101_64x4d_fpn_16x1_20e_coco/htc_x101_64x4d_fpn_16x1_20e_coco_20200318-b181fd7a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/htc/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/htc/htc_x101_32x4d_fpn_16x1_20e_coco/htc_x101_32x4d_fpn_16x1_20e_coco_20200318-de97ae01.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/htc/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/htc/htc_r50_fpn_20e_coco/htc_r50_fpn_20e_coco_20200319-fe28c577.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/htc/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/htc/htc_r50_fpn_1x_coco/htc_r50_fpn_1x_coco_20200317-7332cf16.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/htc/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/htc/htc_r101_fpn_20e_coco/htc_r101_fpn_20e_coco_20200317-9b41b48f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/htc/metafile.yml | https://arxiv.org/abs/1901.07518 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/instaboost/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/instaboost/mask_rcnn_x101_64x4d_fpn_instaboost_4x_coco/mask_rcnn_x101_64x4d_fpn_instaboost_4x_coco_20200515_080947-8ed58c1b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/instaboost/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/instaboost/mask_rcnn_r50_fpn_instaboost_4x_coco/mask_rcnn_r50_fpn_instaboost_4x_coco_20200307-d025f83a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/instaboost/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/instaboost/mask_rcnn_r101_fpn_instaboost_4x_coco/mask_rcnn_r101_fpn_instaboost_4x_coco_20200703_235738-f23f3a5f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/instaboost/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/instaboost/cascade_mask_rcnn_r50_fpn_instaboost_4x_coco/cascade_mask_rcnn_r50_fpn_instaboost_4x_coco_20200307-c19d98d9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/instaboost/metafile.yml | https://arxiv.org/abs/1908.07801 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/lad/lad_r101_paa_r50_fpn_coco_1x.py | 
https://download.openmmlab.com/mmdetection/v2.0/paa/paa_r50_fpn_1x_coco/paa_r50_fpn_1x_coco_20200821-936edec3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/lad/lad_r50_paa_r101_fpn_coco_1x.py | http://download.openmmlab.com/mmdetection/v2.0/paa/paa_r101_fpn_1x_coco/paa_r101_fpn_1x_coco_20200821-0a1825a4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/lad/metafile.yml | https://arxiv.org/abs/2108.10520 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/ld/ld_r101_gflv1_r101dcn_fpn_coco_2x.py | https://download.openmmlab.com/mmdetection/v2.0/gfl/gfl_r101_fpn_dconv_c3-c5_mstrain_2x_coco/gfl_r101_fpn_dconv_c3-c5_mstrain_2x_coco_20200630_102002-134b07df.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/ld/ld_r18_gflv1_r101_fpn_coco_1x.py | https://download.openmmlab.com/mmdetection/v2.0/gfl/gfl_r101_fpn_mstrain_2x_coco/gfl_r101_fpn_mstrain_2x_coco_20200629_200126-dd12f847.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/ld/metafile.yml | https://arxiv.org/abs/2102.12252 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/libra_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/libra_rcnn/libra_retinanet_r50_fpn_1x_coco/libra_retinanet_r50_fpn_1x_coco_20200205-804d94ce.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/libra_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/libra_rcnn/libra_faster_rcnn_x101_64x4d_fpn_1x_coco/libra_faster_rcnn_x101_64x4d_fpn_1x_coco_20200315-3a7d0488.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/libra_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/libra_rcnn/libra_faster_rcnn_r50_fpn_1x_coco/libra_faster_rcnn_r50_fpn_1x_coco_20200130-3afee3a9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/libra_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/libra_rcnn/libra_faster_rcnn_r101_fpn_1x_coco/libra_faster_rcnn_r101_fpn_1x_coco_20200203-8dba6a5a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/libra_rcnn/metafile.yml | https://arxiv.org/abs/1904.02701 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_x101_64x4d_fpn_mstrain-poly_3x_coco/mask_rcnn_x101_64x4d_fpn_mstrain-poly_3x_coco_20210526_120447-c376f129.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_x101_64x4d_fpn_2x_coco/mask_rcnn_x101_64x4d_fpn_2x_coco_20200509_224208-39d6f70c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_x101_64x4d_fpn_1x_coco/mask_rcnn_x101_64x4d_fpn_1x_coco_20200201-9352eb0d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_x101_32x8d_fpn_mstrain-poly_3x_coco/mask_rcnn_x101_32x8d_fpn_mstrain-poly_3x_coco_20210607_161042-8bd2c639.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/mask_rcnn/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_x101_32x4d_fpn_mstrain-poly_3x_coco/mask_rcnn_x101_32x4d_fpn_mstrain-poly_3x_coco_20210524_201410-abcd7859.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_x101_32x4d_fpn_2x_coco/mask_rcnn_x101_32x4d_fpn_2x_coco_bbox_mAP-0.422__segm_mAP-0.378_20200506_004702-faef898c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_x101_32x4d_fpn_1x_coco/mask_rcnn_x101_32x4d_fpn_1x_coco_20200205-478d0b67.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r50_fpn_mstrain-poly_3x_coco/mask_rcnn_r50_fpn_mstrain-poly_3x_coco_20210524_201154-21b550bb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r50_fpn_2x_coco/mask_rcnn_r50_fpn_2x_coco_bbox_mAP-0.392__segm_mAP-0.354_20200505_003907-3e542a40.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r50_fpn_1x_coco/mask_rcnn_r50_fpn_1x_coco_20200205-d4b0c5d6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r50_caffe_fpn_mstrain-poly_3x_coco/mask_rcnn_r50_caffe_fpn_mstrain-poly_3x_coco_bbox_mAP-0.408__segm_mAP-0.37_20200504_163245-42aa3d00.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r50_caffe_fpn_mstrain-poly_2x_coco/mask_rcnn_r50_caffe_fpn_mstrain-poly_2x_coco_bbox_mAP-0.403__segm_mAP-0.365_20200504_231822-a75c98ce.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r50_caffe_fpn_1x_coco/mask_rcnn_r50_caffe_fpn_1x_coco_bbox_mAP-0.38__segm_mAP-0.344_20200504_231812-0ebd1859.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r101_fpn_mstrain-poly_3x_coco/mask_rcnn_r101_fpn_mstrain-poly_3x_coco_20210524_200244-5675c317.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r101_fpn_2x_coco/mask_rcnn_r101_fpn_2x_coco_bbox_mAP-0.408__segm_mAP-0.366_20200505_071027-14b391c7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r101_fpn_1x_coco/mask_rcnn_r101_fpn_1x_coco_20200204-1efe0ed5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r101_caffe_fpn_mstrain-poly_3x_coco/mask_rcnn_r101_caffe_fpn_mstrain-poly_3x_coco_20210526_132339-3c33ce02.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r101_caffe_fpn_1x_coco/mask_rcnn_r101_caffe_fpn_1x_coco_20200601_095758-805e06c1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fp16/mask_rcnn_r50_fpn_fp16_1x_coco/mask_rcnn_r50_fpn_fp16_1x_coco_20200205-59faf7e4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/mask_rcnn/metafile.yml | https://arxiv.org/abs/1703.06870v3 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/mask2former/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask2former/mask2former_swin-t-p4-w7-224_lsj_8x2_50e_coco-panoptic/mask2former_swin-t-p4-w7-224_lsj_8x2_50e_coco-panoptic_20220326_224553-fc567107.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/mask2former/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask2former/mask2former_swin-t-p4-w7-224_lsj_8x2_50e_coco/mask2former_swin-t-p4-w7-224_lsj_8x2_50e_coco_20220508_091649-4a943037.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/mask2former/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask2former/mask2former_swin-s-p4-w7-224_lsj_8x2_50e_coco-panoptic/mask2former_swin-s-p4-w7-224_lsj_8x2_50e_coco-panoptic_20220329_225200-c7b94355.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/mask2former/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask2former/mask2former_swin-s-p4-w7-224_lsj_8x2_50e_coco/mask2former_swin-s-p4-w7-224_lsj_8x2_50e_coco_20220504_001756-743b7d99.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/mask2former/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask2former/mask2former_swin-l-p4-w12-384-in21k_lsj_16x1_100e_coco-panoptic/mask2former_swin-l-p4-w12-384-in21k_lsj_16x1_100e_coco-panoptic_20220407_104949-d4919c44.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/mask2former/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask2former/mask2former_swin-b-p4-w12-384-in21k_lsj_8x2_50e_coco-panoptic/mask2former_swin-b-p4-w12-384-in21k_lsj_8x2_50e_coco-panoptic_20220329_230021-3bb8b482.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/mask2former/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask2former/mask2former_swin-b-p4-w12-384_lsj_8x2_50e_coco-panoptic/mask2former_swin-b-p4-w12-384_lsj_8x2_50e_coco-panoptic_20220331_002244-c149a9e9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/mask2former/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask2former/mask2former_r50_lsj_8x2_50e_coco-panoptic/mask2former_r50_lsj_8x2_50e_coco-panoptic_20220326_224516-11a44721.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/mask2former/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask2former/mask2former_r50_lsj_8x2_50e_coco/mask2former_r50_lsj_8x2_50e_coco_20220506_191028-8e96e88b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/mask2former/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/mask2former/mask2former_r101_lsj_8x2_50e_coco-panoptic/mask2former_r101_lsj_8x2_50e_coco-panoptic_20220329_225104-c54e64c9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/mask2former/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask2former/mask2former_r101_lsj_8x2_50e_coco/mask2former_r101_lsj_8x2_50e_coco_20220426_100250-c50b6fa6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/mask2former/metafile.yml | https://arxiv.org/pdf/2112.01527 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/maskformer/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/maskformer/maskformer_swin-l-p4-w12_mstrain_64x1_300e_coco/maskformer_swin-l-p4-w12_mstrain_64x1_300e_coco_20220326_221612-061b4eb8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/maskformer/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/maskformer/maskformer_r50_mstrain_16x1_75e_coco/maskformer_r50_mstrain_16x1_75e_coco_20220221_141956-bc2699cb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/maskformer/metafile.yml | https://arxiv.org/pdf/2107.06278 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/ms_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/ms_rcnn/ms_rcnn_x101_64x4d_fpn_2x_coco/ms_rcnn_x101_64x4d_fpn_2x_coco_20200308-02a445e2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/ms_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/ms_rcnn/ms_rcnn_x101_64x4d_fpn_1x_coco/ms_rcnn_x101_64x4d_fpn_1x_coco_20200206-86ba88d2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/ms_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/ms_rcnn/ms_rcnn_x101_32x4d_fpn_1x_coco/ms_rcnn_x101_32x4d_fpn_1x_coco_20200206-81fd1740.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/ms_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/ms_rcnn/ms_rcnn_r50_caffe_fpn_2x_coco/ms_rcnn_r50_caffe_fpn_2x_coco_bbox_mAP-0.388__segm_mAP-0.363_20200506_004738-ee87b137.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/ms_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/ms_rcnn/ms_rcnn_r50_caffe_fpn_1x_coco/ms_rcnn_r50_caffe_fpn_1x_coco_20200702_180848-61c9355e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/ms_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/ms_rcnn/ms_rcnn_r101_caffe_fpn_2x_coco/ms_rcnn_r101_caffe_fpn_2x_coco_bbox_mAP-0.411__segm_mAP-0.381_20200506_011134-5f3cc74f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/ms_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/ms_rcnn/ms_rcnn_r101_caffe_fpn_1x_coco/ms_rcnn_r101_caffe_fpn_1x_coco_bbox_mAP-0.404__segm_mAP-0.376_20200506_004755-b9b12a37.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/ms_rcnn/metafile.yml | https://arxiv.org/abs/1903.00241 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/nas_fcos/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/nas_fcos/nas_fcos_nashead_r50_caffe_fpn_gn-head_4x4_1x_coco/nas_fcos_nashead_r50_caffe_fpn_gn-head_4x4_1x_coco_20200520-1bdba3ce.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/nas_fcos/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/nas_fcos/nas_fcos_fcoshead_r50_caffe_fpn_gn-head_4x4_1x_coco/nas_fcos_fcoshead_r50_caffe_fpn_gn-head_4x4_1x_coco_20200521-7fdcbce0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/nas_fcos/metafile.yml | https://arxiv.org/abs/1906.04423 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/nas_fpn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/nas_fpn/retinanet_r50_nasfpn_crop640_50e_coco/retinanet_r50_nasfpn_crop640_50e_coco-0ad1f644.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/nas_fpn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/nas_fpn/retinanet_r50_fpn_crop640_50e_coco/retinanet_r50_fpn_crop640_50e_coco-9b953d76.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/nas_fpn/metafile.yml | https://arxiv.org/abs/1904.07392 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/openimages/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/openimages/ssd300_32x8_36e_openimages/ssd300_32x8_36e_openimages_20211224_000232-dce93846.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/openimages/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/openimages/retinanet_r50_fpn_32x2_1x_openimages/retinanet_r50_fpn_32x2_1x_openimages_20211223_071954-d2ae5462.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/openimages/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/openimages/faster_rcnn_r50_fpn_32x2_cas_1x_openimages_challenge/faster_rcnn_r50_fpn_32x2_cas_1x_openimages_challenge_20220221_192021-34c402d9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/openimages/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/openimages/faster_rcnn_r50_fpn_32x2_cas_1x_openimages/faster_rcnn_r50_fpn_32x2_cas_1x_openimages_20220306_202424-98c630e5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/openimages/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/openimages/faster_rcnn_r50_fpn_32x2_1x_openimages_challenge/faster_rcnn_r50_fpn_32x2_1x_openimages_challenge_20220114_045100-0e79e5df.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/openimages/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/openimages/faster_rcnn_r50_fpn_32x2_1x_openimages/faster_rcnn_r50_fpn_32x2_1x_openimages_20211130_231159-e87ab7ce.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/paa/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/paa/paa_r50_fpn_mstrain_3x_coco/paa_r50_fpn_mstrain_3x_coco_20210121_145722-06a6880b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/paa/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/paa/paa_r50_fpn_2x_coco/paa_r50_fpn_2x_coco_20200821-c98bfc4e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/paa/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/paa/paa_r50_fpn_1x_coco/paa_r50_fpn_1x_coco_20200821-936edec3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/paa/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/paa/paa_r50_fpn_1.5x_coco/paa_r50_fpn_1.5x_coco_20200823-805d6078.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/paa/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/paa/paa_r101_fpn_mstrain_3x_coco/paa_r101_fpn_mstrain_3x_coco_20210122_084202-83250d22.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/paa/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/paa/paa_r101_fpn_2x_coco/paa_r101_fpn_2x_coco_20200821-6829f96b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/paa/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/paa/paa_r101_fpn_1x_coco/paa_r101_fpn_1x_coco_20200821-0a1825a4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/paa/metafile.yml | https://arxiv.org/abs/2007.08103 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/pafpn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/pafpn/faster_rcnn_r50_pafpn_1x_coco/faster_rcnn_r50_pafpn_1x_coco_bbox_mAP-0.375_20200503_105836-b7b4b9bd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/pafpn/metafile.yml | https://arxiv.org/abs/1803.01534 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/panoptic_fpn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/panoptic_fpn/panoptic_fpn_r50_fpn_mstrain_3x_coco/panoptic_fpn_r50_fpn_mstrain_3x_coco_20210824_171155-5650f98b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/panoptic_fpn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/panoptic_fpn/panoptic_fpn_r50_fpn_1x_coco/panoptic_fpn_r50_fpn_1x_coco_20210821_101153-9668fd13.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/panoptic_fpn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/panoptic_fpn/panoptic_fpn_r101_fpn_mstrain_3x_coco/panoptic_fpn_r101_fpn_mstrain_3x_coco_20210823_114712-9c99acc4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/panoptic_fpn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/panoptic_fpn/panoptic_fpn_r101_fpn_1x_coco/panoptic_fpn_r101_fpn_1x_coco_20210820_193950-ab9157a2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/panoptic_fpn/metafile.yml | https://arxiv.org/pdf/1901.02446 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/pisa/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/pisa/pisa_ssd512_coco/pisa_ssd512_coco-247addee.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/pisa/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/pisa/pisa_ssd300_coco/pisa_ssd300_coco-710e3ac9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/pisa/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/pisa/pisa_retinanet_x101_32x4d_fpn_1x_coco/pisa_retinanet_x101_32x4d_fpn_1x_coco-a0c13c73.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/pisa/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/pisa/pisa_retinanet_r50_fpn_1x_coco/pisa_retinanet_r50_fpn_1x_coco-76409952.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/pisa/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/pisa/pisa_mask_rcnn_r50_fpn_1x_coco/pisa_mask_rcnn_r50_fpn_1x_coco-dfcedba6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/pisa/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/pisa/pisa_faster_rcnn_x101_32x4d_fpn_1x_coco/pisa_faster_rcnn_x101_32x4d_fpn_1x_coco-e4accec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/pisa/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/pisa/pisa_faster_rcnn_r50_fpn_1x_coco/pisa_faster_rcnn_r50_fpn_1x_coco-dea93523.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/pisa/metafile.yml | https://arxiv.org/abs/1904.04821 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/point_rend/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/point_rend/point_rend_r50_caffe_fpn_mstrain_3x_coco/point_rend_r50_caffe_fpn_mstrain_3x_coco-e0ebb6b7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/point_rend/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/point_rend/point_rend_r50_caffe_fpn_mstrain_1x_coco/point_rend_r50_caffe_fpn_mstrain_1x_coco-1bcb5fb4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/point_rend/metafile.yml | https://arxiv.org/abs/1912.08193 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/pvt/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/pvt/retinanet_pvtv2-b5_fpn_1x_coco/retinanet_pvtv2-b5_fpn_1x_coco_20210902_201800-3420eb57.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/pvt/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/pvt/retinanet_pvtv2-b4_fpn_1x_coco/retinanet_pvtv2-b4_fpn_1x_coco_20210901_170151-83795c86.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/pvt/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/pvt/retinanet_pvtv2-b3_fpn_1x_coco/retinanet_pvtv2-b3_fpn_1x_coco_20210903_151512-8357deff.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/pvt/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/pvt/retinanet_pvtv2-b2_fpn_1x_coco/retinanet_pvtv2-b2_fpn_1x_coco_20210901_174843-529f0b9a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/pvt/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/pvt/retinanet_pvtv2-b1_fpn_1x_coco/retinanet_pvtv2-b1_fpn_1x_coco_20210831_103318-7e169a7d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/pvt/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/pvt/retinanet_pvtv2-b0_fpn_1x_coco/retinanet_pvtv2-b0_fpn_1x_coco_20210831_103157-13e9aabe.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/pvt/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/pvt/retinanet_pvt-t_fpn_1x_coco/retinanet_pvt-t_fpn_1x_coco_20210831_103110-17b566bd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/pvt/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/pvt/retinanet_pvt-s_fpn_1x_coco/retinanet_pvt-s_fpn_1x_coco_20210906_142921-b6c94a5b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/pvt/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/pvt/retinanet_pvt-m_fpn_1x_coco/retinanet_pvt-m_fpn_1x_coco_20210831_103243-55effa1b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/pvt/metafile.yml | https://arxiv.org/abs/2106.13797 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/pvt/metafile.yml | https://arxiv.org/abs/2106.13797 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/pvt/metafile.yml | https://arxiv.org/abs/2106.13797 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/pvt/metafile.yml | https://arxiv.org/abs/2106.13797 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/pvt/metafile.yml | https://arxiv.org/abs/2106.13797 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/pvt/metafile.yml | https://arxiv.org/abs/2106.13797 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/pvt/metafile.yml | https://arxiv.org/abs/2102.12122 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/pvt/metafile.yml | https://arxiv.org/abs/2102.12122 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/pvt/metafile.yml | https://arxiv.org/abs/2102.12122 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/queryinst/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/queryinst/queryinst_r50_fpn_mstrain_480-800_3x_coco/queryinst_r50_fpn_mstrain_480-800_3x_coco_20210901_103643-7837af86.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/queryinst/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/queryinst/queryinst_r50_fpn_300_proposals_crop_mstrain_480-800_3x_coco/queryinst_r50_fpn_300_proposals_crop_mstrain_480-800_3x_coco_20210904_101802-85cffbd8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/queryinst/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/queryinst/queryinst_r50_fpn_1x_coco/queryinst_r50_fpn_1x_coco_20210907_084916-5a8f1998.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/queryinst/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/queryinst/queryinst_r101_fpn_mstrain_480-800_3x_coco/queryinst_r101_fpn_mstrain_480-800_3x_coco_20210904_104048-91f9995b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/queryinst/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/queryinst/queryinst_r101_fpn_300_proposals_crop_mstrain_480-800_3x_coco/queryinst_r101_fpn_300_proposals_crop_mstrain_480-800_3x_coco_20210904_153621-76cce59f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/queryinst/metafile.yml | https://openaccess.thecvf.com/content/ICCV2021/papers/Fang_Instances_As_Queries_ICCV_2021_paper.pdf | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/regnet/metafile.yml | https://arxiv.org/abs/2003.13678 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/regnet/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/regnet/retinanet_regnetx-800MF_fpn_1x_coco/retinanet_regnetx-800MF_fpn_1x_coco_20200517_191403-f6f91d10.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/retinanet_regnetx-3.2GF_fpn_1x_coco/retinanet_regnetx-3.2GF_fpn_1x_coco_20200520_163141-cb1509e8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/retinanet_regnetx-1.6GF_fpn_1x_coco/retinanet_regnetx-1.6GF_fpn_1x_coco_20200517_191403-37009a9d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/mask_rcnn_regnetx-8GF_fpn_1x_coco/mask_rcnn_regnetx-8GF_fpn_1x_coco_20200517_180515-09daa87e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/mask_rcnn_regnetx-800MF_fpn_mstrain-poly_3x_coco/mask_rcnn_regnetx-800MF_fpn_mstrain-poly_3x_coco_20210602_210641-715d51f5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/mask_rcnn_regnetx-6.4GF_fpn_1x_coco/mask_rcnn_regnetx-6.4GF_fpn_1x_coco_20200517_180439-3a7aae83.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/mask_rcnn_regnetx-4GF_fpn_mstrain-poly_3x_coco/mask_rcnn_regnetx-4GF_fpn_mstrain-poly_3x_coco_20210602_032621-00f0331c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/mask_rcnn_regnetx-4GF_fpn_1x_coco/mask_rcnn_regnetx-4GF_fpn_1x_coco_20200517_180217-32e9c92d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/mask_rcnn_regnetx-400MF_fpn_mstrain-poly_3x_coco/mask_rcnn_regnetx-400MF_fpn_mstrain-poly_3x_coco_20210601_235443-8aac57a4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/mask_rcnn_regnetx-3.2GF_fpn_mstrain_3x_coco/mask_rcnn_regnetx-3.2GF_fpn_mstrain_3x_coco_20200521_202221-99879813.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/mask_rcnn_regnetx-3.2GF_fpn_mdconv_c3-c5_1x_coco/mask_rcnn_regnetx-3.2GF_fpn_mdconv_c3-c5_1x_coco_20200520_172726-75f40794.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/mask_rcnn_regnetx-3.2GF_fpn_1x_coco/mask_rcnn_regnetx-3.2GF_fpn_1x_coco_20200520_163141-2a9d1814.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/mask_rcnn_regnetx-12GF_fpn_1x_coco/mask_rcnn_regnetx-12GF_fpn_1x_coco_20200517_180552-b538bd8b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/regnet/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/regnet/mask_rcnn_regnetx-1.6GF_fpn_mstrain-poly_3x_coco/mask_rcnn_regnetx-1_20210602_210641-6e63e19c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/mask_rcnn_regnetx-1.6GF_fpn_mstrain-poly_3x_coco/mask_rcnn_regnetx-1_20210602_210641-6764cff5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/faster_rcnn_regnetx-800MF_fpn_mstrain_3x_coco/faster_rcnn_regnetx-800MF_fpn_mstrain_3x_coco_20210526_095118-a2c70b20.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/faster_rcnn_regnetx-4GF_fpn_mstrain_3x_coco/faster_rcnn_regnetx-4GF_fpn_mstrain_3x_coco_20210526_095201-65eaf841.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/faster_rcnn_regnetx-400MF_fpn_mstrain_3x_coco/faster_rcnn_regnetx-400MF_fpn_mstrain_3x_coco_20210526_095112-e1967c37.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/faster_rcnn_regnetx-3.2GF_fpn_mstrain_3x_coco/faster_rcnn_regnetx-3_20210526_095152-e16a5227.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/faster_rcnn_regnetx-3.2GF_fpn_2x_coco/faster_rcnn_regnetx-3.2GF_fpn_2x_coco_20200520_223955-e2081918.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/faster_rcnn_regnetx-3.2GF_fpn_1x_coco/faster_rcnn_regnetx-3.2GF_fpn_1x_coco_20200517_175927-126fd9bf.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/faster_rcnn_regnetx-1.6GF_fpn_mstrain_3x_coco/faster_rcnn_regnetx-1_20210526_095325-94aa46cc.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/cascade_mask_rcnn_regnetx-800MF_fpn_mstrain_3x_coco/cascade_mask_rcnn_regnetx-800MF_fpn_mstrain_3x_coco_20210715_211616-dcbd13f4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/cascade_mask_rcnn_regnetx-4GF_fpn_mstrain_3x_coco/cascade_mask_rcnn_regnetx-4GF_fpn_mstrain_3x_coco_20210715_212034-cbb1be4c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/cascade_mask_rcnn_regnetx-400MF_fpn_mstrain_3x_coco/cascade_mask_rcnn_regnetx-400MF_fpn_mstrain_3x_coco_20210715_211619-5142f449.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/cascade_mask_rcnn_regnetx-3.2GF_fpn_mstrain_3x_coco/cascade_mask_rcnn_regnetx-3_20210715_211616-b9c2c58b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/regnet/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/regnet/cascade_mask_rcnn_regnetx-1.6GF_fpn_mstrain_3x_coco/cascade_mask_rcnn_regnetx-1_20210715_211616-75f29a61.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/reppoints/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/reppoints/reppoints_moment_x101_fpn_dconv_c3-c5_gn-neck%2Bhead_2x_coco/reppoints_moment_x101_fpn_dconv_c3-c5_gn-neck%2Bhead_2x_coco_20200329-f87da1ea.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/reppoints/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/reppoints/reppoints_moment_r50_fpn_gn-neck%2Bhead_2x_coco/reppoints_moment_r50_fpn_gn-neck%2Bhead_2x_coco_20200329-91babaa2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/reppoints/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/reppoints/reppoints_moment_r50_fpn_gn-neck%2Bhead_1x_coco/reppoints_moment_r50_fpn_gn-neck%2Bhead_1x_coco_20200329-4b38409a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/reppoints/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/reppoints/reppoints_moment_r50_fpn_1x_coco/reppoints_moment_r50_fpn_1x_coco_20200330-b73db8d1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/reppoints/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/reppoints/reppoints_moment_r101_fpn_gn-neck%2Bhead_2x_coco/reppoints_moment_r101_fpn_gn-neck%2Bhead_2x_coco_20200329-4fbc7310.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/reppoints/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/reppoints/reppoints_moment_r101_fpn_dconv_c3-c5_gn-neck%2Bhead_2x_coco/reppoints_moment_r101_fpn_dconv_c3-c5_gn-neck%2Bhead_2x_coco_20200329-3309fbf2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/reppoints/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/reppoints/bbox_r50_grid_fpn_gn-neck%2Bhead_1x_coco/bbox_r50_grid_fpn_gn-neck%2Bhead_1x_coco_20200329-c98bfa96.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/reppoints/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/reppoints/bbox_r50_grid_center_fpn_gn-neck%2Bhead_1x_coco/bbox_r50_grid_center_fpn_gn-neck%2Bhead_1x_coco_20200330-00f73d58.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/reppoints/metafile.yml | https://arxiv.org/abs/1904.11490 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/res2net/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/res2net/mask_rcnn_r2_101_fpn_2x_coco/mask_rcnn_r2_101_fpn_2x_coco-17f061e8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/res2net/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/res2net/htc_r2_101_fpn_20e_coco/htc_r2_101_fpn_20e_coco-3a8d2112.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/res2net/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/res2net/faster_rcnn_r2_101_fpn_2x_coco/faster_rcnn_r2_101_fpn_2x_coco-175f1da6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/res2net/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/res2net/cascade_rcnn_r2_101_fpn_20e_coco/cascade_rcnn_r2_101_fpn_20e_coco-f4b7b7db.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/res2net/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/res2net/cascade_mask_rcnn_r2_101_fpn_20e_coco/cascade_mask_rcnn_r2_101_fpn_20e_coco-8a7b41e1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/res2net/metafile.yml | https://arxiv.org/abs/1904.01169 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/res2net/metafile.yml | https://arxiv.org/abs/1904.01169 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/res2net/metafile.yml | https://arxiv.org/abs/1904.01169 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/res2net/metafile.yml | https://arxiv.org/abs/1904.01169 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/res2net/metafile.yml | https://arxiv.org/abs/1904.01169 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/resnest/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/resnest/mask_rcnn_s50_fpn_syncbn-backbone%2Bhead_mstrain_1x_coco/mask_rcnn_s50_fpn_syncbn-backbone%2Bhead_mstrain_1x_coco_20200926_125503-8a2c3d47.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/resnest/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/resnest/mask_rcnn_s101_fpn_syncbn-backbone%2Bhead_mstrain_1x_coco/mask_rcnn_s101_fpn_syncbn-backbone%2Bhead_mstrain_1x_coco_20201005_215831-af60cdf9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/resnest/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/resnest/faster_rcnn_s50_fpn_syncbn-backbone%2Bhead_mstrain-range_1x_coco/faster_rcnn_s50_fpn_syncbn-backbone%2Bhead_mstrain-range_1x_coco_20200926_125502-20289c16.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/resnest/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/resnest/faster_rcnn_s101_fpn_syncbn-backbone%2Bhead_mstrain-range_1x_coco/faster_rcnn_s101_fpn_syncbn-backbone%2Bhead_mstrain-range_1x_coco_20201006_021058-421517f1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/resnest/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/resnest/cascade_rcnn_s50_fpn_syncbn-backbone%2Bhead_mstrain-range_1x_coco/cascade_rcnn_s50_fpn_syncbn-backbone%2Bhead_mstrain-range_1x_coco_20201122_213640-763cc7b5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/resnest/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/resnest/cascade_rcnn_s101_fpn_syncbn-backbone%2Bhead_mstrain-range_1x_coco/cascade_rcnn_s101_fpn_syncbn-backbone%2Bhead_mstrain-range_1x_coco_20201005_113242-b9459f8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/resnest/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/resnest/cascade_mask_rcnn_s50_fpn_syncbn-backbone%2Bhead_mstrain_1x_coco/cascade_mask_rcnn_s50_fpn_syncbn-backbone%2Bhead_mstrain_1x_coco_20201122_104428-99eca4c7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/resnest/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/resnest/cascade_mask_rcnn_s101_fpn_syncbn-backbone%2Bhead_mstrain_1x_coco/cascade_mask_rcnn_s101_fpn_syncbn-backbone%2Bhead_mstrain_1x_coco_20201005_113243-42607475.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/resnest/metafile.yml | https://arxiv.org/abs/2004.08955 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/resnest/metafile.yml | https://arxiv.org/abs/2004.08955 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/resnest/metafile.yml | https://arxiv.org/abs/2004.08955 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/resnest/metafile.yml | https://arxiv.org/abs/2004.08955 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/resnest/metafile.yml | https://arxiv.org/abs/2004.08955 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/resnest/metafile.yml | https://arxiv.org/abs/2004.08955 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/resnest/metafile.yml | https://arxiv.org/abs/2004.08955 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/resnest/metafile.yml | https://arxiv.org/abs/2004.08955 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/resnet_strikes_back/cascade_mask_rcnn_r50_fpn_rsb-pretrain_1x_coco.py | https://download.openmmlab.com/mmclassification/v0/resnet/resnet50_8xb256-rsb-a1-600e_in1k_20211228-20e21305.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/resnet_strikes_back/faster_rcnn_r50_fpn_rsb-pretrain_1x_coco.py | https://download.openmmlab.com/mmclassification/v0/resnet/resnet50_8xb256-rsb-a1-600e_in1k_20211228-20e21305.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/resnet_strikes_back/mask_rcnn_r50_fpn_rsb-pretrain_1x_coco.py | https://download.openmmlab.com/mmclassification/v0/resnet/resnet50_8xb256-rsb-a1-600e_in1k_20211228-20e21305.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/resnet_strikes_back/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/resnet_strikes_back/retinanet_r50_fpn_rsb-pretrain_1x_coco/retinanet_r50_fpn_rsb-pretrain_1x_coco_20220113_175432-bd24aae9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/resnet_strikes_back/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/resnet_strikes_back/mask_rcnn_r50_fpn_rsb-pretrain_1x_coco/mask_rcnn_r50_fpn_rsb-pretrain_1x_coco_20220113_174054-06ce8ba0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/resnet_strikes_back/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/resnet_strikes_back/faster_rcnn_r50_fpn_rsb-pretrain_1x_coco/faster_rcnn_r50_fpn_rsb-pretrain_1x_coco_20220113_162229-32ae82a9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/resnet_strikes_back/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/resnet_strikes_back/cascade_mask_rcnn_r50_fpn_rsb-pretrain_1x_coco/cascade_mask_rcnn_r50_fpn_rsb-pretrain_1x_coco_20220113_193636-8b9ad50f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/resnet_strikes_back/metafile.yml | https://arxiv.org/abs/2110.00476 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/resnet_strikes_back/metafile.yml | https://arxiv.org/abs/2110.00476 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/resnet_strikes_back/metafile.yml | https://arxiv.org/abs/2110.00476 | 论文地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/resnet_strikes_back/metafile.yml | https://arxiv.org/abs/2110.00476 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/resnet_strikes_back/retinanet_r50_fpn_rsb-pretrain_1x_coco.py | https://download.openmmlab.com/mmclassification/v0/resnet/resnet50_8xb256-rsb-a1-600e_in1k_20211228-20e21305.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_x101_64x4d_fpn_mstrain_3x_coco/retinanet_x101_64x4d_fpn_mstrain_3x_coco_20210719_051838-022c2187.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_x101_64x4d_fpn_2x_coco/retinanet_x101_64x4d_fpn_2x_coco_20200131-bca068ab.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_x101_64x4d_fpn_1x_coco/retinanet_x101_64x4d_fpn_1x_coco_20200130-366f5af1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_x101_32x4d_fpn_2x_coco/retinanet_x101_32x4d_fpn_2x_coco_20200131-237fc5e1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_x101_32x4d_fpn_1x_coco/retinanet_x101_32x4d_fpn_1x_coco_20200130-5c8b7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_r50_fpn_mstrain_3x_coco/retinanet_r50_fpn_mstrain_3x_coco_20210718_220633-88476508.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_r50_fpn_2x_coco/retinanet_r50_fpn_2x_coco_20200131-fdb43119.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_r50_fpn_1x_coco/retinanet_r50_fpn_1x_coco_20200130-c2398f9e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_r50_caffe_fpn_1x_coco/retinanet_r50_caffe_fpn_1x_coco_20200531-f11027c5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_r18_fpn_1x8_1x_coco/retinanet_r18_fpn_1x8_1x_coco_20220407_171255-4ea310d7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_r18_fpn_1x_coco/retinanet_r18_fpn_1x_coco_20220407_171055-614fd399.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_r101_fpn_mstrain_3x_coco/retinanet_r101_fpn_mstrain_3x_coco_20210720_214650-7ee888e0.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_r101_fpn_2x_coco/retinanet_r101_fpn_2x_coco_20200131-5560aee8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_r101_fpn_1x_coco/retinanet_r101_fpn_1x_coco_20200130-7a93545f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_r101_caffe_fpn_mstrain_3x_coco/retinanet_r101_caffe_fpn_mstrain_3x_coco_20210721_063439-88a8a944.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_r101_caffe_fpn_1x_coco/retinanet_r101_caffe_fpn_1x_coco_20200531-b428fa0f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fp16/retinanet_r50_fpn_fp16_1x_coco/retinanet_r50_fpn_fp16_1x_coco_20200702-0dbfb212.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/retinanet/metafile.yml | https://arxiv.org/abs/1708.02002 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/sabl/metafile.yml | https://arxiv.org/abs/1912.04260 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/sabl/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/sabl/sabl_retinanet_r50_fpn_gn_1x_coco/sabl_retinanet_r50_fpn_gn_1x_coco-e16dfcf1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/sabl/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/sabl/sabl_retinanet_r50_fpn_1x_coco/sabl_retinanet_r50_fpn_1x_coco-6c54fd4f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/sabl/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/sabl/sabl_retinanet_r101_fpn_gn_2x_ms_640_800_coco/sabl_retinanet_r101_fpn_gn_2x_ms_640_800_coco-1e63382c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/sabl/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/sabl/sabl_retinanet_r101_fpn_gn_2x_ms_480_960_coco/sabl_retinanet_r101_fpn_gn_2x_ms_480_960_coco-5342f857.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/sabl/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/sabl/sabl_retinanet_r101_fpn_gn_1x_coco/sabl_retinanet_r101_fpn_gn_1x_coco-40a893e8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/sabl/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/sabl/sabl_retinanet_r101_fpn_1x_coco/sabl_retinanet_r101_fpn_1x_coco-42026904.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/sabl/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/sabl/sabl_faster_rcnn_r50_fpn_1x_coco/sabl_faster_rcnn_r50_fpn_1x_coco-e867595b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/sabl/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/sabl/sabl_faster_rcnn_r101_fpn_1x_coco/sabl_faster_rcnn_r101_fpn_1x_coco-f804c6c1.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/sabl/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/sabl/sabl_cascade_rcnn_r50_fpn_1x_coco/sabl_cascade_rcnn_r50_fpn_1x_coco-e1748e5e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/sabl/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/sabl/sabl_cascade_rcnn_r101_fpn_1x_coco/sabl_cascade_rcnn_r101_fpn_1x_coco-2b83e87c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/scnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/scnet/scnet_x101_64x4d_fpn_20e_coco/scnet_x101_64x4d_fpn_20e_coco-fb09dec9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/scnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/scnet/scnet_r50_fpn_20e_coco/scnet_r50_fpn_20e_coco-a569f645.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/scnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/scnet/scnet_r50_fpn_1x_coco/scnet_r50_fpn_1x_coco-c3f09857.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/scnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/scnet/scnet_r101_fpn_20e_coco/scnet_r101_fpn_20e_coco-294e312c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/scnet/metafile.yml | https://arxiv.org/abs/2012.10150 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/scratch/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/scratch/mask_rcnn_r50_fpn_gn-all_scratch_6x_coco/scratch_mask_rcnn_r50_fpn_gn_6x_bbox_mAP-0.412__segm_mAP-0.374_20200201_193051-1e190a40.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/scratch/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/scratch/faster_rcnn_r50_fpn_gn-all_scratch_6x_coco/scratch_faster_rcnn_r50_fpn_gn_6x_bbox_mAP-0.407_20200201_193013-90813d01.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/scratch/metafile.yml | https://arxiv.org/abs/1811.08883 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/seesaw_loss/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/seesaw_loss/mask_rcnn_r50_fpn_sample1e-3_seesaw_loss_normed_mask_mstrain_2x_lvis_v1-cd0f6a12.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/seesaw_loss/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/seesaw_loss/mask_rcnn_r50_fpn_sample1e-3_seesaw_loss_mstrain_2x_lvis_v1-392a804b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/seesaw_loss/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/seesaw_loss/mask_rcnn_r50_fpn_random_seesaw_loss_normed_mask_mstrain_2x_lvis_v1-a1c11314.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/seesaw_loss/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/seesaw_loss/mask_rcnn_r50_fpn_random_seesaw_loss_mstrain_2x_lvis_v1-a698dd3d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/seesaw_loss/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/seesaw_loss/mask_rcnn_r101_fpn_sample1e-3_seesaw_loss_normed_mask_mstrain_2x_lvis_v1-1d817139.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/seesaw_loss/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/seesaw_loss/mask_rcnn_r101_fpn_sample1e-3_seesaw_loss_mstrain_2x_lvis_v1-e68eb464.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/seesaw_loss/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/seesaw_loss/mask_rcnn_r101_fpn_random_seesaw_loss_normed_mask_mstrain_2x_lvis_v1-a0b59c42.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/seesaw_loss/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/seesaw_loss/mask_rcnn_r101_fpn_random_seesaw_loss_mstrain_2x_lvis_v1-8e6e6dd5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/seesaw_loss/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/seesaw_loss/cascade_mask_rcnn_r101_fpn_sample1e-3_seesaw_loss_normed_mask_mstrain_2x_lvis_v1-c8551505.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/seesaw_loss/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/seesaw_loss/cascade_mask_rcnn_r101_fpn_sample1e-3_seesaw_loss_mstrain_2x_lvis_v1-5d8ca2a4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/seesaw_loss/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/seesaw_loss/cascade_mask_rcnn_r101_fpn_random_seesaw_loss_normed_mask_mstrain_2x_lvis_v1-8b5a6745.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/seesaw_loss/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/seesaw_loss/cascade_mask_rcnn_r101_fpn_random_seesaw_loss_mstrain_2x_lvis_v1-71e2215e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/seesaw_loss/metafile.yml | https://arxiv.org/abs/2008.10032 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/simple_copy_paste/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/simple_copy_paste/mask_rcnn_r50_fpn_syncbn-all_rpn-2conv_ssj_scp_32x2_90k_coco/mask_rcnn_r50_fpn_syncbn-all_rpn-2conv_ssj_scp_32x2_90k_coco_20220316_181307-6bc5726f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/simple_copy_paste/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/simple_copy_paste/mask_rcnn_r50_fpn_syncbn-all_rpn-2conv_ssj_scp_32x2_270k_coco/mask_rcnn_r50_fpn_syncbn-all_rpn-2conv_ssj_scp_32x2_270k_coco_20220324_201229-80ee90b7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/simple_copy_paste/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/simple_copy_paste/mask_rcnn_r50_fpn_syncbn-all_rpn-2conv_ssj_32x2_90k_coco/mask_rcnn_r50_fpn_syncbn-all_rpn-2conv_ssj_32x2_90k_coco_20220316_181409-f79c84c5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/simple_copy_paste/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/simple_copy_paste/mask_rcnn_r50_fpn_syncbn-all_rpn-2conv_ssj_32x2_270k_coco/mask_rcnn_r50_fpn_syncbn-all_rpn-2conv_ssj_32x2_270k_coco_20220324_182940-33a100c5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/simple_copy_paste/metafile.yml | https://arxiv.org/abs/2012.07177 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/solo/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/solo/solo_r50_fpn_3x_coco/solo_r50_fpn_3x_coco_20210901_012353-11d224d7.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/solo/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/solo/solo_r50_fpn_1x_coco/solo_r50_fpn_1x_coco_20210821_035055-2290a6b8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/solo/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/solo/decoupled_solo_r50_fpn_3x_coco/decoupled_solo_r50_fpn_3x_coco_20210821_042504-7b3301ec.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/solo/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/solo/decoupled_solo_r50_fpn_1x_coco/decoupled_solo_r50_fpn_1x_coco_20210820_233348-6337c589.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/solo/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/solo/decoupled_solo_light_r50_fpn_3x_coco/decoupled_solo_light_r50_fpn_3x_coco_20210906_142703-e70e226f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/solov2/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/solov2/solov2_x101_dcn_fpn_3x_coco/solov2_x101_dcn_fpn_3x_coco_20220513_214337-aef41095.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/solov2/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/solov2/solov2_r50_fpn_3x_coco/solov2_r50_fpn_3x_coco_20220512_125856-fed092d4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/solov2/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/solov2/solov2_r50_fpn_1x_coco/solov2_r50_fpn_1x_coco_20220512_125858-a357fa23.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/solov2/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/solov2/solov2_r101_fpn_3x_coco/solov2_r101_fpn_3x_coco_20220511_095119-c559a076.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/solov2/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/solov2/solov2_r101_dcn_fpn_3x_coco/solov2_r101_dcn_fpn_3x_coco_20220513_214734-16c966cb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/solov2/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/solov2/solov2_light_r50_fpn_3x_coco/solov2_light_r50_fpn_3x_coco_20220512_165256-c93a6074.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/solov2/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/solov2/solov2_light_r34_fpn_3x_coco/solov2_light_r34_fpn_3x_coco_20220511_091839-e51659d3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/solov2/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/solov2/solov2_light_r18_fpn_3x_coco/solov2_light_r18_fpn_3x_coco_20220511_083717-75fa355b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/sparse_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/sparse_rcnn/sparse_rcnn_r50_fpn_mstrain_480-800_3x_coco/sparse_rcnn_r50_fpn_mstrain_480-800_3x_coco_20201218_154234-7bc5c054.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/sparse_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/sparse_rcnn/sparse_rcnn_r50_fpn_300_proposals_crop_mstrain_480-800_3x_coco/sparse_rcnn_r50_fpn_300_proposals_crop_mstrain_480-800_3x_coco_20201223_024605-9fe92701.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/sparse_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/sparse_rcnn/sparse_rcnn_r50_fpn_1x_coco/sparse_rcnn_r50_fpn_1x_coco_20201222_214453-dc79b137.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/sparse_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/sparse_rcnn/sparse_rcnn_r101_fpn_mstrain_480-800_3x_coco/sparse_rcnn_r101_fpn_mstrain_480-800_3x_coco_20201223_121552-6c46c9d6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/sparse_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/sparse_rcnn/sparse_rcnn_r101_fpn_300_proposals_crop_mstrain_480-800_3x_coco/sparse_rcnn_r101_fpn_300_proposals_crop_mstrain_480-800_3x_coco_20201223_023452-c23c3564.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/sparse_rcnn/metafile.yml | https://arxiv.org/abs/2011.12450 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/ssd/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/ssd/ssdlite_mobilenetv2_scratch_600e_coco/ssdlite_mobilenetv2_scratch_600e_coco_20210629_110627-974d9307.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/ssd/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/ssd/ssd512_coco/ssd512_coco_20210803_022849-0a47a1ca.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/ssd/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/ssd/ssd300_coco/ssd300_coco_20210803_015428-d231a06e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/ssd/metafile.yml | https://arxiv.org/abs/1512.02325 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/swin/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/swin/mask_rcnn_swin-t-p4-w7_fpn_ms-crop-3x_coco/mask_rcnn_swin-t-p4-w7_fpn_ms-crop-3x_coco_20210906_131725-bacf6f7b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/swin/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/swin/mask_rcnn_swin-t-p4-w7_fpn_fp16_ms-crop-3x_coco/mask_rcnn_swin-t-p4-w7_fpn_fp16_ms-crop-3x_coco_20210908_165006-90a4008c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/swin/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/swin/mask_rcnn_swin-t-p4-w7_fpn_1x_coco/mask_rcnn_swin-t-p4-w7_fpn_1x_coco_20210902_120937-9d6b7cfa.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/swin/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/swin/mask_rcnn_swin-s-p4-w7_fpn_fp16_ms-crop-3x_coco/mask_rcnn_swin-s-p4-w7_fpn_fp16_ms-crop-3x_coco_20210903_104808-b92c91f1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/swin/metafile.yml | https://arxiv.org/abs/2107.08430 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/swin/metafile.yml | https://arxiv.org/abs/2107.08430 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/swin/metafile.yml | https://arxiv.org/abs/2107.08430 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/swin/metafile.yml | https://arxiv.org/abs/2107.08430 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/tood/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/tood/tood_x101_64x4d_fpn_mstrain_2x_coco/tood_x101_64x4d_fpn_mstrain_2x_coco_20211211_003519-a4f36113.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/tood/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/tood/tood_r50_fpn_mstrain_2x_coco/tood_r50_fpn_mstrain_2x_coco_20211210_144231-3b23174c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/tood/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/tood/tood_r50_fpn_anchor_based_1x_coco/tood_r50_fpn_anchor_based_1x_coco_20211214_100105-b776c134.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/tood/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/tood/tood_r50_fpn_1x_coco/tood_r50_fpn_1x_coco_20211210_103425-20e20746.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/tood/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/tood/tood_r101_fpn_mstrain_2x_coco/tood_r101_fpn_mstrain_2x_coco_20211210_144232-a18f53c8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/tood/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/tood/tood_r101_fpn_dconv_c3-c5_mstrain_2x_coco/tood_r101_fpn_dconv_c3-c5_mstrain_2x_coco_20211210_213728-4a824142.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/tood/metafile.yml | https://arxiv.org/abs/2108.07755 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/tridentnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/tridentnet/tridentnet_r50_caffe_mstrain_3x_coco/tridentnet_r50_caffe_mstrain_3x_coco_20201130_100539-46d227ba.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/tridentnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/tridentnet/tridentnet_r50_caffe_mstrain_1x_coco/tridentnet_r50_caffe_mstrain_1x_coco_20201230_141839-6ce55ccb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/tridentnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/tridentnet/tridentnet_r50_caffe_1x_coco/tridentnet_r50_caffe_1x_coco_20201230_141838-2ec0b530.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/tridentnet/metafile.yml | https://arxiv.org/abs/1901.01892 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/vfnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/vfnet/vfnet_x101_64x4d_fpn_mdconv_c3-c5_mstrain_2x_coco/vfnet_x101_64x4d_fpn_mdconv_c3-c5_mstrain_2x_coco_20201027pth-b5f6da5e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/vfnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/vfnet/vfnet_x101_32x4d_fpn_mdconv_c3-c5_mstrain_2x_coco/vfnet_x101_32x4d_fpn_mdconv_c3-c5_mstrain_2x_coco_20201027pth-d300a6fc.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/vfnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/vfnet/vfnet_r50_fpn_mstrain_2x_coco/vfnet_r50_fpn_mstrain_2x_coco_20201027-7cc75bd2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/vfnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/vfnet/vfnet_r50_fpn_mdconv_c3-c5_mstrain_2x_coco/vfnet_r50_fpn_mdconv_c3-c5_mstrain_2x_coco_20201027pth-6879c318.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/vfnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/vfnet/vfnet_r50_fpn_1x_coco/vfnet_r50_fpn_1x_coco_20201027-38db6f58.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/vfnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/vfnet/vfnet_r101_fpn_mstrain_2x_coco/vfnet_r101_fpn_mstrain_2x_coco_20201027pth-4a5d53f1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/vfnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/vfnet/vfnet_r101_fpn_mdconv_c3-c5_mstrain_2x_coco/vfnet_r101_fpn_mdconv_c3-c5_mstrain_2x_coco_20201027pth-7729adb5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/vfnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/vfnet/vfnet_r101_fpn_1x_coco/vfnet_r101_fpn_1x_coco_20201027pth-c831ece7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/vfnet/metafile.yml | https://arxiv.org/abs/2008.13367 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/yolact/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/yolact/yolact_r50_8x8_coco/yolact_r50_8x8_coco_20200908-ca34f5db.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/yolact/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/yolact/yolact_r50_1x8_coco/yolact_r50_1x8_coco_20200908-f38d58df.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/yolact/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/yolact/yolact_r101_1x8_coco/yolact_r101_1x8_coco_20200908-4cbe9101.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/yolact/metafile.yml | https://arxiv.org/abs/1904.02689 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/yolo/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/yolo/yolov3_mobilenetv2_mstrain-416_300e_coco/yolov3_mobilenetv2_mstrain-416_300e_coco_20210718_010823-f68a07b3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/yolo/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/yolo/yolov3_mobilenetv2_320_300e_coco/yolov3_mobilenetv2_320_300e_coco_20210719_215349-d18dff72.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/yolo/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/yolo/yolov3_d53_mstrain-608_273e_coco/yolov3_d53_mstrain-608_273e_coco_20210518_115020-a2c3acb8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/yolo/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/yolo/yolov3_d53_mstrain-416_273e_coco/yolov3_d53_mstrain-416_273e_coco-2b60fcd9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/yolo/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/yolo/yolov3_d53_fp16_mstrain-608_273e_coco/yolov3_d53_fp16_mstrain-608_273e_coco_20210517_213542-4bc34944.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/yolo/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/yolo/yolov3_d53_320_273e_coco/yolov3_d53_320_273e_coco-421362b6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/yolo/metafile.yml | https://arxiv.org/abs/1804.02767 | 论文地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/yolof/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/yolof/yolof_r50_c5_8x8_1x_coco/yolof_r50_c5_8x8_1x_coco_20210425_024427-8e864411.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/yolof/metafile.yml | https://arxiv.org/abs/2103.09460 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/yolox/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/yolox/yolox_x_8x8_300e_coco/yolox_x_8x8_300e_coco_20211126_140254-1ef88d67.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/yolox/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/yolox/yolox_tiny_8x8_300e_coco/yolox_tiny_8x8_300e_coco_20211124_171234-b4047906.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/yolox/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/yolox/yolox_s_8x8_300e_coco/yolox_s_8x8_300e_coco_20211121_095711-4592a793.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/yolox/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/yolox/yolox_l_8x8_300e_coco/yolox_l_8x8_300e_coco_20211126_140236-d3bd2b23.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/configs/yolox/metafile.yml | https://arxiv.org/abs/2107.08430 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/docker/Dockerfile | https://download.openmmlab.com/mmcv/dist/cu101/torch1.6.0/index.html | 三方库连接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/docker/serve/Dockerfile | https://download.openmmlab.com/mmcv/dist/cu${CUDA//./}/torch${PYTORCH}/index.html | 三方库连接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/mmdet/datasets/lvis.py | http://images.cocodataset.org/ | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/mmdet/models/detectors/two_stage.py | https://mmdetection.readthedocs.io/en/latest/tutorials/pytorch2onnx.html | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/setup.py | openmmlab@gmail.com | 作者邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/tools/misc/download_dataset.py | http://images.cocodataset.org/zips/val2017.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/tools/misc/download_dataset.py | http://images.cocodataset.org/zips/train2017.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/tools/misc/download_dataset.py | http://images.cocodataset.org/zips/test2017.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/tools/misc/download_dataset.py | https://s3-us-west-2.amazonaws.com/dl.fbaipublicfiles.com/LVIS/lvis_v1_train.json.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/tools/misc/download_dataset.py | https://s3-us-west-2.amazonaws.com/dl.fbaipublicfiles.com/LVIS/lvis_v1_train.json.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/tools/misc/download_dataset.py | https://download.openmmlab.com/mmocr/data/font.TTF | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/tools/misc/download_dataset.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2007/VOCtrainval_06-Nov-2007.tar | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/tools/misc/download_dataset.py | 
http://host.robots.ox.ac.uk/pascal/VOC/voc2007/VOCtest_06-Nov-2007.tar | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/SSD_for_PyTorch/tools/misc/download_dataset.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2007/VOCdevkit_08-Jun-2007.tar | 数据集链接 | \ No newline at end of file diff --git a/PyTorch/built-in/cv/detection/YOLOV4_ID0396_for_PyTorch/public_address_statement.md b/PyTorch/built-in/cv/detection/YOLOV4_ID0396_for_PyTorch/public_address_statement.md index fcd1544f912494fc13e6748789034c25b142e255..89a8ee4401dfe7e68c1b85d2380093a4ff285bf2 100644 --- a/PyTorch/built-in/cv/detection/YOLOV4_ID0396_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/cv/detection/YOLOV4_ID0396_for_PyTorch/public_address_statement.md @@ -1,14 +1,10 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|-------------------------------------------------------------------|-------------------------------------------------|----------------------------------------------------------------------|--------| -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/blob/master/data/get_coco2014.sh | YOLOV4_ID0396_for_PyTorch/data/get_coco2014.sh | https://drive.google.com/uc?export=download | 下载标签 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/blob/master/data/get_coco2014.sh | YOLOV4_ID0396_for_PyTorch/data/get_coco2014.sh | https://drive.google.com/uc?export=download | 下载标签 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/blob/master/data/get_coco2014.sh | YOLOV4_ID0396_for_PyTorch/data/get_coco2014.sh | http://images.cocodataset.org/zips | 下载数据集 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/blob/master/data/get_coco2014.sh | YOLOV4_ID0396_for_PyTorch/data/get_coco2014.sh | http://images.cocodataset.org/zips | 下载数据集 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/blob/master/data/get_coco2017.sh | YOLOV4_ID0396_for_PyTorch/data/get_coco2017.sh | https://drive.google.com/uc?export=download | 下载数据集 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/blob/master/data/get_coco2017.sh | YOLOV4_ID0396_for_PyTorch/data/get_coco2017.sh | https://drive.google.com/uc?export=download | 下载数据集 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/blob/master/data/get_coco2017.sh | YOLOV4_ID0396_for_PyTorch/data/get_coco2017.sh | http://images.cocodataset.org/zips | 下载数据集 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/blob/master/data/get_coco2017.sh | YOLOV4_ID0396_for_PyTorch/data/get_coco2017.sh | http://images.cocodataset.org/zips | 下载数据集 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/blob/master/utils/gcp.sh | YOLOV4_ID0396_for_PyTorch/utils/gcp.sh | https://github.com/ultralytics/yolov3 | 开源代码链接 | -| 开发引入 | / | url.ini | https://storage.googleapis.com | 下载标签文件 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/blob/master/utils/utils.py | YOLOV4_ID0396_for_PyTorch/utils/utils.py | https://storage.googleapis.com | 下载标签文件 | -| 开发引入 | / | constant.py | 127.0.0.1 | 本机IP地址 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|-----------------------------------------------------------------------------------------------|-------------------------------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/YOLOV4_ID0396_for_PyTorch/data/get_coco2014.sh | http://images.cocodataset.org/zips/$f | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/YOLOV4_ID0396_for_PyTorch/data/get_coco2014.sh | http://images.cocodataset.org/zips/$f | 数据集地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/detection/YOLOV4_ID0396_for_PyTorch/data/get_coco2017.sh | http://images.cocodataset.org/zips/$f | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/YOLOV4_ID0396_for_PyTorch/data/get_coco2017.sh | http://images.cocodataset.org/zips/$f | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/YOLOV4_ID0396_for_PyTorch/main.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/YOLOV4_ID0396_for_PyTorch/models/models.py | https://pjreddie.com/media/files/ | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/YOLOV4_ID0396_for_PyTorch/url.ini | https://storage.googleapis.com | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/YOLOV4_ID0396_for_PyTorch/utils/utils.py | https://storage.googleapis.com/%s/results%g.txt | 相关说明 | \ No newline at end of file diff --git a/PyTorch/built-in/cv/detection/YOLOV4_eb5f166_for_PyTorch/public_address_statement.md b/PyTorch/built-in/cv/detection/YOLOV4_eb5f166_for_PyTorch/public_address_statement.md index c3dfc73cf3829b89817c3a30d45ebf38953d5698..c9c01a4960584253b654ca2723ac51cb43a9acdc 100644 --- a/PyTorch/built-in/cv/detection/YOLOV4_eb5f166_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/cv/detection/YOLOV4_eb5f166_for_PyTorch/public_address_statement.md @@ -1,81 +1,8 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ---- | ------------ | ------ | ------------------------------------ | -------- | -|开源代码引入|https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master|data/get_coco2014.sh|https://drive.google.com/uc?export=download&id=1s6-CmF5_SElM28r52P1OUrCcuXZN-SFo|下载数据集| -|开源代码引入|https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master|data/get_coco2014.sh|https://drive.google.com/uc?export=download&confirm=`awk '/download/ {print $NF}' ./cookie`&id=1s6-CmF5_SElM28r52P1OUrCcuXZN-SFo|下载数据集| -|开源代码引入|https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master|data/get_coco2014.sh|http://images.cocodataset.org/zips/train2014.zip|下载数据集| -|开源代码引入|https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master|data/get_coco2014.sh|http://images.cocodataset.org/zips/val2014.zip|下载数据集| -|开源代码引入|https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master|data/get_coco2017.sh|https://drive.google.com/uc?export=download&id=1cXZR_ckHki6nddOmcysCuuJFM--T-Q6L|下载数据集| -|开源代码引入|https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master|data/get_coco2017.sh|https://drive.google.com/uc?export=download&confirm=`awk '/download/ {print $NF}' ./cookie`&id=1cXZR_ckHki6nddOmcysCuuJFM--T-Q6L|下载数据集| -|开源代码引入|https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master|data/get_coco2017.sh|http://images.cocodataset.org/zips/train2017.zip|下载数据集| -|开源代码引入|https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master|data/get_coco2017.sh|http://images.cocodataset.org/zips/val2017.zip|下载数据集| -|开源代码引入|https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master|utils/gcp.sh|https://github.com/ultralytics/yolov3|下载依赖文件| -|开源代码引入|https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master|utils/google_utils.py|https://github.com/WongKinYiu/ScaledYOLOv4/releases/download/v1.0/|下载权重文件| -|开源代码引入|https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master|utils/utils.py|https://storage.googleapis.com/%s/results%g.txt|下载结果文件| -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master/utils/utils.py | YOLOV4_eb5f166_for_PyTorch/utils/utils.py | https://github.com/ultralytics/yolov3/issues/1139 | 相关说明 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master/utils/layers.py | 
YOLOV4_eb5f166_for_PyTorch/utils/layers.py | https://github.com/digantamisra98/Mish | 源码实现 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master/train.py | YOLOV4_eb5f166_for_PyTorch/train.py | https://github.com/ultralytics/yolov5/pull/1120 | 源码实现 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master/utils/utils.py | YOLOV4_eb5f166_for_PyTorch/utils/utils.py | https://github.com/ultralytics/yolov3/issues/238#issuecomment-598028441 | 相关说明 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master/models/models.py | YOLOV4_eb5f166_for_PyTorch/models/models.py | https://github.com/ultralytics/yolov3/issues/931 | 相关说明 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master/utils/layers.py | YOLOV4_eb5f166_for_PyTorch/utils/layers.py | https://arxiv.org/pdf/1905.02244.pdf | 论文地址 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master/utils/utils.py | YOLOV4_eb5f166_for_PyTorch/utils/general.py | https://tech.amikelive.com/node-718/what-object-categories-labels-are-in-coco-dataset/ | 数据集地址 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master/models/models.py | YOLOV4_eb5f166_for_PyTorch/models/models.py | https://github.com/ultralytics/yolov3/issues/441 | 相关说明 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master/utils/general.py | YOLOV4_eb5f166_for_PyTorch/utils/general.py | https://arxiv.org/abs/2101.08158 | 论文地址 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master/utils/utils.py | YOLOV4_eb5f166_for_PyTorch/utils/utils.py | https://github.com/ultralytics/yolov3/issues/898 | 相关说明 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master/utils/utils.py | YOLOV4_eb5f166_for_PyTorch/utils/metrics.py | https://github.com/rbgirshick/py-faster-rcnn | 源码实现 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master/utils/utils.py | YOLOV4_eb5f166_for_PyTorch/utils/plots.py | https://github.com/ultralytics/yolov3/issues/168 | 相关说明 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master/utils/utils.py | YOLOV4_eb5f166_for_PyTorch/utils/utils.py | https://github.com/rbgirshick/py-faster-rcnn | 源码实现 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master/utils/utils.py | YOLOV4_eb5f166_for_PyTorch/utils/loss.py | https://github.com/ultralytics/yolov3/issues/238#issuecomment-598028441 | 相关说明 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master/utils/utils.py | YOLOV4_eb5f166_for_PyTorch/utils/utils.py | https://github.com/tensorflow/addons/blob/v0.7.1/tensorflow_addons/losses/focal_loss.py | 源码实现 | -| 开发引入 | / | YOLOV4_eb5f166_for_PyTorch/utils/gcp.sh | https://github.com/NVIDIA/apex | 源码实现 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master/models/models.py | YOLOV4_eb5f166_for_PyTorch/models/models.py | https://pytorch.org/docs/stable/torchvision/models.html#classification | 相关说明 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master/utils/utils.py | YOLOV4_eb5f166_for_PyTorch/utils/general.py | https://github.com/pytorch/vision/blob/master/torchvision/ops/boxes.py | 源码实现 | -| 开发引入 | / | YOLOV4_eb5f166_for_PyTorch/utils/datasets.py | http://wmccpinetop.axiscam.net/mjpg/video.mjpg | 图片地址 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master/utils/adabound.py | YOLOV4_eb5f166_for_PyTorch/utils/adabound.py | https://openreview.net/forum?id=Bkg3g2R9FX | 相关说明 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master/models/models.py | 
YOLOV4_eb5f166_for_PyTorch/models/models.py | https://drive.google.com/open?id=1LezFG5g3BCW6iYaV89B2i64cqEUZD7e0 | 下载链接 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master/utils/utils.py | YOLOV4_eb5f166_for_PyTorch/utils/loss.py | https://github.com/tensorflow/addons/blob/v0.7.1/tensorflow_addons/losses/focal_loss.py | 源码实现 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master/models/models.py | YOLOV4_eb5f166_for_PyTorch/models/models.py | https://arxiv.org/abs/1911.09516 | 论文地址 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master/utils/utils.py | YOLOV4_eb5f166_for_PyTorch/utils/utils.py | https://github.com/rafaelpadilla/Object-Detection-Metrics | 源码实现 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master/test.py | YOLOV4_eb5f166_for_PyTorch/test.py | https://github.com/cocodataset/cocoapi/blob/master/PythonAPI/pycocoEvalDemo.ipynb | 源码实现 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master/utils/google_utils.py | YOLOV4_eb5f166_for_PyTorch/utils/google_utils.py | https://cloud.google.com/storage/docs/reference/libraries | 相关说明 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master/utils/torch_utils.py | YOLOV4_eb5f166_for_PyTorch/utils/torch_utils.py | https://tehnokv.com/posts/fusing-batchnorm-and-conv/ | 相关说明 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master/train.py | YOLOV4_eb5f166_for_PyTorch/train.py | https://pytorch.org/docs/stable/_modules/torch/optim/lr_scheduler.html#OneCycleLR | 相关说明 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master/train.py | YOLOV4_eb5f166_for_PyTorch/train.py | https://arxiv.org/pdf/1812.01187.pdf | 论文地址 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master/utils/datasets.py | YOLOV4_eb5f166_for_PyTorch/utils/datasets.py | https://arxiv.org/pdf/1710.09412.pdf | 论文地址 | -| 开发引入 | / | YOLOV4_eb5f166_for_PyTorch/utils/parse_config.py | https://github.com/ultralytics/yolov3/issues/631 | 相关说明 | -| 开发引入 | / | YOLOV4_eb5f166_for_PyTorch/models/models.py | https://pjreddie.com/media/files/ | 相关说明 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master/utils/torch_utils.py | YOLOV4_eb5f166_for_PyTorch/utils/torch_utils.py | https://github.com/rwightman/pytorch-image-models | 源码实现 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master/utils/utils.py | YOLOV4_eb5f166_for_PyTorch/utils/metrics.py | https://github.com/rafaelpadilla/Object-Detection-Metrics | 源码实现 | -| 开发引入 | / | YOLOV4_eb5f166_for_PyTorch/utils/utils.py | https://arxiv.org/pdf/1902.04103.pdf | 论文地址 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master/utils/google_utils.py | YOLOV4_eb5f166_for_PyTorch/utils/google_utils.py | https://cloud.google.com/storage/docs/uploading-objects#storage-upload-object-python | 相关说明 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master/utils/utils.py | YOLOV4_eb5f166_for_PyTorch/utils/utils.py | https://arxiv.org/abs/1911.08287v1 | 论文地址 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master/utils/datasets.py | YOLOV4_eb5f166_for_PyTorch/utils/datasets.py | https://arxiv.org/abs/1708.04552 | 论文地址 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master/utils/layers.py | YOLOV4_eb5f166_for_PyTorch/utils/layers.py | https://github.com/tensorflow/models/blob/master/research/slim/nets/mobilenet/mobilenet.py | 源码实现 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master/utils/utils.py | 
YOLOV4_eb5f166_for_PyTorch/utils/utils.py | https://arxiv.org/pdf/1902.09630.pdf | 论文地址 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master/utils/plots.py | YOLOV4_eb5f166_for_PyTorch/utils/plots.py | https://stackoverflow.com/questions/51350872/python-from-color-name-to-rgb | 相关说明 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master/utils/utils.py | YOLOV4_eb5f166_for_PyTorch/test.py | mAP@0.5 | 邮箱地址 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master/utils/utils.py | YOLOV4_eb5f166_for_PyTorch/utils/general.py | https://arxiv.org/abs/1911.08287v1 | 论文地址 | -| 开发引入 | / | YOLOV4_eb5f166_for_PyTorch/utils/google_utils.py | https://drive.google.com/uc?export=download&id=%s | 下载链接 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master/utils/utils.py | YOLOV4_eb5f166_for_PyTorch/utils/utils.py | https://github.com/ultralytics/yolov3#training | 源码实现 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master/utils/layers.py | YOLOV4_eb5f166_for_PyTorch/utils/layers.py | https://arxiv.org/abs/1911.09070 | 论文地址 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master/utils/utils.py | YOLOV4_eb5f166_for_PyTorch/utils/general.py | https://github.com/Zzh-tju/DIoU-SSD-pytorch/blob/master/utils/box/box_utils.py#L47 | 源码实现 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master/utils/datasets.py | YOLOV4_eb5f166_for_PyTorch/utils/datasets.py | https://github.com/ultralytics/yolov3/issues/232 | 相关说明 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master/utils/google_utils.py | YOLOV4_eb5f166_for_PyTorch/utils/google_utils.py | https://cloud.google.com/storage/docs/gsutil/commands/du | 相关说明 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master/utils/layers.py | YOLOV4_eb5f166_for_PyTorch/utils/activations.py | https://arxiv.org/pdf/1905.02244.pdf | 论文地址 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master/utils/utils.py | YOLOV4_eb5f166_for_PyTorch/utils/utils.py | https://github.com/pytorch/vision/blob/master/torchvision/ops/boxes.py | 源码实现 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master/models/export.py | YOLOV4_eb5f166_for_PyTorch/models/export.py | https://github.com/lutzroeder/netron | 源码实现 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master/utils/datasets.py | YOLOV4_eb5f166_for_PyTorch/utils/datasets.py | https://github.com/ultralytics/yolov5/wiki/Train-Custom-Data | 源码实现 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master/utils/torch_utils.py | YOLOV4_eb5f166_for_PyTorch/utils/torch_utils.py | https://www.tensorflow.org/api_docs/python/tf/train/ExponentialMovingAverage | 相关说明 | -| 开发引入 | / | YOLOV4_eb5f166_for_PyTorch/models/models.py | https://arxiv.org/pdf/1708.02002.pdf | 论文地址 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master/models/models.py | YOLOV4_eb5f166_for_PyTorch/models/models.py | https://github.com/AlexeyAB/darknet/issues/2914#issuecomment-496675346 | 相关说明 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master/utils/torch_utils.py | YOLOV4_eb5f166_for_PyTorch/utils/torch_utils.py | https://pytorch.org/docs/stable/notes/randomness.html | 相关说明 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master/utils/utils.py | YOLOV4_eb5f166_for_PyTorch/utils/utils.py | https://tech.amikelive.com/node-718/what-object-categories-labels-are-in-coco-dataset/ | 相关说明 | -| 开发引入 | / | YOLOV4_eb5f166_for_PyTorch/utils/loss.py | 
https://arxiv.org/pdf/1902.04103.pdf | 论文地址 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master/utils/layers.py | YOLOV4_eb5f166_for_PyTorch/utils/layers.py | https://arxiv.org/abs/1907.09595 | 论文地址 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master/utils/plots.py | YOLOV4_eb5f166_for_PyTorch/utils/plots.py | https://stackoverflow.com/questions/28536191/how-to-filter-smooth-with-scipy-numpy | 相关说明 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master/utils/utils.py | YOLOV4_eb5f166_for_PyTorch/utils/utils.py | https://github.com/ultralytics/yolov3/wiki/Train-Custom-Data | 源码实现 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master/utils/utils.py | YOLOV4_eb5f166_for_PyTorch/utils/general.py | https://arxiv.org/pdf/1902.09630.pdf | 论文地址 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master/utils/activations.py | YOLOV4_eb5f166_for_PyTorch/utils/activations.py | https://arxiv.org/abs/2007.11824 | 论文地址 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master/utils/utils.py | YOLOV4_eb5f166_for_PyTorch/utils/utils.py | https://github.com/ultralytics/yolov3/issues/168 | 相关说明 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master/utils/utils.py | YOLOV4_eb5f166_for_PyTorch/utils/metrics.py | https://github.com/ultralytics/yolov3/issues/898 | 相关说明 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master/utils/utils.py | YOLOV4_eb5f166_for_PyTorch/utils/utils.py | https://github.com/Zzh-tju/DIoU-SSD-pytorch/blob/master/utils/box/box_utils.py#L47 | 源码实现 | -| 开源代码引入 | https://github.com/WongKinYiu/PyTorch_YOLOv4/tree/master/utils/layers.py | YOLOV4_eb5f166_for_PyTorch/utils/activations.py | https://github.com/digantamisra98/Mish | 源码实现 | +| 文件位置 | 公网地址 | 公网地址用途 | +|------------------------------------------------------------------------------------------------|-------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/YOLOV4_eb5f166_for_PyTorch/data/get_coco2014.sh | http://images.cocodataset.org/zips/$f | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/YOLOV4_eb5f166_for_PyTorch/data/get_coco2014.sh | http://images.cocodataset.org/zips/$f | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/YOLOV4_eb5f166_for_PyTorch/data/get_coco2017.sh | http://images.cocodataset.org/zips/$f | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/YOLOV4_eb5f166_for_PyTorch/data/get_coco2017.sh | http://images.cocodataset.org/zips/$f | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/YOLOV4_eb5f166_for_PyTorch/models/models.py | https://pjreddie.com/media/files/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/YOLOV4_eb5f166_for_PyTorch/utils/utils.py | https://storage.googleapis.com/%s/results%g.txt | 相关说明 | \ No newline at end of file diff --git a/PyTorch/built-in/cv/detection/YOLOV9_for_PyTorch/public_address_statement.md b/PyTorch/built-in/cv/detection/YOLOV9_for_PyTorch/public_address_statement.md index 90edf4f3124dcc7130beab9cf26e6ea60b274c37..450f80f143cd55d8fa06b21a003f53b452abbeff 100644 --- a/PyTorch/built-in/cv/detection/YOLOV9_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/cv/detection/YOLOV9_for_PyTorch/public_address_statement.md @@ -1,53 +1,11 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | 
-|--------|-------------------------------------------------------------------|-------------------------------------------------|----------------------------------------------------------------------|--------| -| 开发引入 | / | YOLOV9_for_PyTorch/utils/loggers/__init__.py | http://localhost:6006/ | 本机IP地址 | -| 开源代码引入 | https://github.com/WongKinYiu/yolov9/blob/main/utils/loggers/wandb/sweep.yaml | YOLOV9_for_PyTorch/utils/loggers/wandb/sweep.yaml | https://docs.wandb.ai/guides/sweeps/configuration | 开源代码链接 | -| 开源代码引入 | https://github.com/WongKinYiu/yolov9/blob/main/scripts/get_coco.sh | YOLOV9_for_PyTorch/scripts/get_coco.sh | http://cocodataset.org | 下载数据集 | -| 开源代码引入 | https://github.com/WongKinYiu/yolov9/blob/main/scripts/get_coco.sh | YOLOV9_for_PyTorch/scripts/get_coco.sh | http://images.cocodataset.org/zips/ | 下载数据集 | -| 开源代码引入 | https://github.com/WongKinYiu/yolov9/blob/main/scripts/get_coco.sh | YOLOV9_for_PyTorch/scripts/get_coco.sh | https://github.com/ultralytics/yolov5/releases/download/v1.0 | 下载数据集 | -| 开源代码引入 | https://github.com/WongKinYiu/yolov9/blob/main/LICENSE.md | YOLOV9_for_PyTorch/LICENSE.md | https://fsf.org/ | 开源代码链接 | -| 开源代码引入 | https://github.com/WongKinYiu/yolov9/blob/main/hubconf.py | YOLOV9_for_PyTorch/hubconf.py | https://ultralytics.com/images/zidane.jpg | 开源代码链接 | -| 开源代码引入 | https://github.com/WongKinYiu/yolov9/blob/main/export.py | YOLOV9_for_PyTorch/export.py | https://pypi.ngc.nvidia.com | 开源代码链接 | -| 开源代码引入 | https://github.com/WongKinYiu/yolov9/blob/main/export.py | YOLOV9_for_PyTorch/export.py | https://coral.ai/docs/edgetpu/compiler/ | 开源代码链接 | -| 开源代码引入 | https://github.com/WongKinYiu/yolov9/blob/main/export.py | YOLOV9_for_PyTorch/export.py | https://packages.cloud.google.com/apt/doc/apt-key.gpg | 开源代码链接 | -| 开源代码引入 | https://github.com/WongKinYiu/yolov9/blob/main/export.py | YOLOV9_for_PyTorch/export.py | https://packages.cloud.google.com/apt | 开源代码链接 | -| 开源代码引入 | https://github.com/WongKinYiu/yolov9/blob/main/export.py | YOLOV9_for_PyTorch/export.py | https://netron.app | 开源代码链接 | -| 开源代码引入 | https://github.com/WongKinYiu/yolov9/blob/main/classify/train.py | YOLOV9_for_PyTorch/classify/train.py | https://github.com/ultralytics/yolov5/releases/download/v1.0 | 下载数据集 | -| 开源代码引入 | https://github.com/WongKinYiu/yolov9/blob/main/classify/train.py | YOLOV9_for_PyTorch/classify/train.py | https://pytorch.org/vision/stable/models.html | 开源代码链接 | -| 开源代码引入 | https://github.com/WongKinYiu/yolov9/blob/main/classify/train.py | YOLOV9_for_PyTorch/classify/train.py | https://netron.app | 开源代码链接 | -| 开源代码引入 | https://github.com/WongKinYiu/yolov9/blob/main/utils/coco_utils.py | YOLOV9_for_PyTorch/utils/coco_utils.py | https://tech.amikelive.com/node-718/what-object-categories-labels-are-in-coco-dataset/ | 开源代码链接 | -| 开源代码引入 | https://github.com/WongKinYiu/yolov9/blob/main/utils/coco_utils.py | YOLOV9_for_PyTorch/utils/coco_utils.py | https://github.com/cocodataset/panopticapi/blob/master/panoptic_coco_categories.json | 开源代码链接 | -| 开源代码引入 | https://github.com/WongKinYiu/yolov9/blob/main/utils/dataloaders.py | YOLOV9_for_PyTorch/utils/dataloaders.py | https://github.com/ultralytics/yolov5/wiki/Train-Custom-Data | 开源代码链接 | -| 开源代码引入 | https://github.com/WongKinYiu/yolov9/blob/main/utils/downloads.py | YOLOV9_for_PyTorch/utils/downloads.py | https://api.github.com/repos/ultralytics/yolov5/releases/latest | 开源代码链接 | -| 开源代码引入 | https://github.com/WongKinYiu/yolov9/blob/main/utils/downloads.py | YOLOV9_for_PyTorch/utils/downloads.py | 
https://github.com/ultralytics/yolov5/releases/download/tags/v7.0/efficientnet_b0.pt | 下载数据集 | -| 开源代码引入 | https://github.com/WongKinYiu/yolov9/blob/main/utils/downloads.py | YOLOV9_for_PyTorch/utils/downloads.py | https://url.com/file.txt?auth | 开源代码链接 | -| 开源代码引入 | https://github.com/WongKinYiu/yolov9/blob/main/utils/general.py | YOLOV9_for_PyTorch/utils/general.py | https://github.com/WongKinYiu/yolov9 | 开源代码链接 | -| 开源代码引入 | https://github.com/WongKinYiu/yolov9/blob/main/utils/general.py | YOLOV9_for_PyTorch/utils/general.py | https://ultralytics.com/assets/Arial.ttf | 下载字符集 | -| 开源代码引入 | https://github.com/WongKinYiu/yolov9/blob/main/utils/general.py | YOLOV9_for_PyTorch/utils/general.py | https://url.com/file.txt?auth | 开源代码链接 | -| 开发引入 | - | YOLOV9_for_PyTorch/README.md | https://github.com/WongKinYiu/yolov9.git | 模型源代码链接 | -| 开发引入 | - | YOLOV9_for_PyTorch/README.md | https://ultralytics.com/assets/Arial.ttf | 下载字符集 | -| 开发引入 | - | YOLOV9_for_PyTorch/README.md | https://github.com/ultralytics/yolov5/releases/download/v1.0/coco2017labels-segments.zip | 下载数据集 | -| 开发引入 | - | YOLOV9_for_PyTorch/README.md | http://images.cocodataset.org/zips/val2017.zip | 下载数据集 | -| 开发引入 | - | YOLOV9_for_PyTorch/README.md | http://images.cocodataset.org/zips/test2017.zip | 下载数据集 | -| 开发引入 | - | YOLOV9_for_PyTorch/README.md | http://images.cocodataset.org/zips/train2017.zip | 下载数据集 | -| 开发引入 | - | YOLOV9_for_PyTorch/README_en.md | https://github.com/WongKinYiu/yolov9.git | 模型源代码链接 | -| 开源代码引入 | https://github.com/WongKinYiu/yolov9/blob/main/README.md | YOLOV9_for_PyTorch/README_en.md | http://images.cocodataset.org/zips/train2017.zip | 下载数据集 | -| 开源代码引入 | https://github.com/WongKinYiu/yolov9/blob/main/README.md | YOLOV9_for_PyTorch/README_en.md | http://images.cocodataset.org/zips/val2017.zip | 下载数据集 | -| 开源代码引入 | https://github.com/WongKinYiu/yolov9/blob/main/README.md | YOLOV9_for_PyTorch/README_en.md | http://images.cocodataset.org/zips/test2017.zip | 下载数据集 | -| 开源代码引入 | https://github.com/WongKinYiu/yolov9/blob/main/README.md | YOLOV9_for_PyTorch/README_en.md | https://github.com/WongKinYiu/yolov7/releases/download/v0.1/coco2017labels-segments.zip | 下载数据集 | -| 开源代码引入 | https://github.com/WongKinYiu/yolov9/blob/main/README.md | YOLOV9_for_PyTorch/README_en.md | https://github.com/WongKinYiu/yolov9/releases/download/v0.1/yolov9-t-converted.pt | 下载数据集 | -| 开源代码引入 | https://github.com/WongKinYiu/yolov9/blob/main/README.md | YOLOV9_for_PyTorch/README_en.md | https://github.com/WongKinYiu/yolov9/releases/download/v0.1/yolov9-s-converted.pt | 下载数据集 | -| 开源代码引入 | https://github.com/WongKinYiu/yolov9/blob/main/README.md | YOLOV9_for_PyTorch/README_en.md | https://github.com/WongKinYiu/yolov9/releases/download/v0.1/yolov9-m-converted.pt | 下载数据集 | -| 开源代码引入 | https://github.com/WongKinYiu/yolov9/blob/main/README.md | YOLOV9_for_PyTorch/README_en.md | https://github.com/WongKinYiu/yolov9/releases/download/v0.1/yolov9-c-converted.pt | 下载数据集 | -| 开源代码引入 | https://github.com/WongKinYiu/yolov9/blob/main/README.md | YOLOV9_for_PyTorch/README_en.md | https://github.com/WongKinYiu/yolov9/releases/download/v0.1/yolov9-e-converted.pt | 下载数据集 | -| 开源代码引入 | https://github.com/WongKinYiu/yolov9/blob/main/README.md | YOLOV9_for_PyTorch/README_en.md | https://github.com/WongKinYiu/yolov9/releases/download/v0.1/yolov9-s.pt | 下载数据集 | -| 开源代码引入 | https://github.com/WongKinYiu/yolov9/blob/main/README.md | YOLOV9_for_PyTorch/README_en.md | https://github.com/WongKinYiu/yolov9/releases/download/v0.1/yolov9-m.pt | 下载数据集 | -| 开源代码引入 | 
https://github.com/WongKinYiu/yolov9/blob/main/README.md | YOLOV9_for_PyTorch/README_en.md | https://github.com/WongKinYiu/yolov9/releases/download/v0.1/yolov9-c.pt | 下载数据集 | -| 开源代码引入 | https://github.com/WongKinYiu/yolov9/blob/main/README.md | YOLOV9_for_PyTorch/README_en.md | https://github.com/WongKinYiu/yolov9/releases/download/v0.1/yolov9-e.pt | 下载数据集 | -| 开源代码引入 | https://github.com/WongKinYiu/yolov9/blob/main/README.md | YOLOV9_for_PyTorch/README_en.md | https://github.com/WongKinYiu/yolov9/releases/download/v0.1/gelan-s.pt | 下载数据集 | -| 开源代码引入 | https://github.com/WongKinYiu/yolov9/blob/main/README.md | YOLOV9_for_PyTorch/README_en.md | https://github.com/WongKinYiu/yolov9/releases/download/v0.1/gelan-m.pt | 下载数据集 | -| 开源代码引入 | https://github.com/WongKinYiu/yolov9/blob/main/README.md | YOLOV9_for_PyTorch/README_en.md | https://github.com/WongKinYiu/yolov9/releases/download/v0.1/gelan-c.pt | 下载数据集 | -| 开源代码引入 | https://github.com/WongKinYiu/yolov9/blob/main/README.md | YOLOV9_for_PyTorch/README_en.md | https://github.com/WongKinYiu/yolov9/releases/download/v0.1/gelan-e.pt | 下载数据集 | -| 开源代码引入 | https://github.com/WongKinYiu/yolov9/blob/main/README.md | YOLOV9_for_PyTorch/README_en.md | https://github.com/WongKinYiu/yolov9/releases/download/v0.1/gelan-c-det.pt | 下载数据集 | -| 开源代码引入 | https://github.com/WongKinYiu/yolov9/blob/main/README.md | YOLOV9_for_PyTorch/README_en.md | https://github.com/WongKinYiu/yolov9/releases/download/v0.1/gelan-c-seg.pt | 下载数据集 | -| 开源代码引入 | https://github.com/WongKinYiu/yolov9/blob/main/README.md | YOLOV9_for_PyTorch/README_en.md | https://github.com/WongKinYiu/yolov9/releases/download/v0.1/gelan-c-pan.pt | 下载数据集 | +| 文件位置 | 公网地址 | 公网地址用途 | +|---------------------------------------------------------------------------------------|-----------------------------------------------|------------------| +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/YOLOV9_for_PyTorch/classify/train.py | https://pytorch.org/vision/stable/models.html | torchvision官网精度 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/YOLOV9_for_PyTorch/classify/train.py | https://netron.app | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/YOLOV9_for_PyTorch/export.py | https://pypi.ngc.nvidia.com | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/YOLOV9_for_PyTorch/export.py | https://netron.app | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/YOLOV9_for_PyTorch/export.py | https://netron.app | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/YOLOV9_for_PyTorch/hubconf.py | https://ultralytics.com/images/zidane.jpg | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/YOLOV9_for_PyTorch/scripts/get_coco.sh | http://images.cocodataset.org/zips/ | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/YOLOV9_for_PyTorch/utils/general.py | https://url.com/file.txt?auth | 开源代码链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/YOLOV9_for_PyTorch/utils/general.py | https://ultralytics.com/assets/{font.name} | 下载相关配置 | \ No newline at end of file diff --git a/PyTorch/built-in/cv/detection/YoloV3_ID1790_for_PyTorch/public_address_statement.md b/PyTorch/built-in/cv/detection/YoloV3_ID1790_for_PyTorch/public_address_statement.md index 66f0ac836b19ec4aeca8629d7301b7dc54f9472b..330eaadc654f81310dc880a2a0fd07b9efc9b330 100644 --- a/PyTorch/built-in/cv/detection/YoloV3_ID1790_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/cv/detection/YoloV3_ID1790_for_PyTorch/public_address_statement.md @@ -1,146 +1,8 @@ -| 类型 | 开源代码地址 | 文件名 | 
公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ---- | ------------ | ------ | ------------------------------------ | -------- | -|开源代码引入|https://github.com/open-mmlab/mmdetection/tree/master/configs/yolo|.pre-commit-config.yaml|https://gitlab.com/pycqa/flake8.git|代码规范检查| -|开源代码引入|https://github.com/open-mmlab/mmdetection/tree/master/configs/yolo|.pre-commit-config.yaml|https://github.com/asottile/seed-isort-config|代码规范检查| -|开源代码引入|https://github.com/open-mmlab/mmdetection/tree/master/configs/yolo|.pre-commit-config.yaml|https://github.com/timothycrosley/isort|代码规范检查| -|开源代码引入|https://github.com/open-mmlab/mmdetection/tree/master/configs/yolo|.pre-commit-config.yaml|https://github.com/pre-commit/mirrors-yapf|代码规范检查| -|开源代码引入|https://github.com/open-mmlab/mmdetection/tree/master/configs/yolo|.pre-commit-config.yaml|https://github.com/pre-commit/pre-commit-hooks|代码规范检查| -|开源代码引入|https://github.com/open-mmlab/mmdetection/tree/master/configs/yolo|.pre-commit-config.yaml|https://github.com/myint/docformatter|代码规范检查| -|开源代码引入|https://github.com/open-mmlab/mmdetection/tree/master/configs/yolo|configs/cityscapes/faster_rcnn_r50_fpn_1x_cityscapes.py|https://open-mmlab.s3.ap-northeast-2.amazonaws.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_fpn_1x_coco/faster_rcnn_r50_fpn_1x_coco_20200130-047c8118.pth|下载权重文件| -|开源代码引入|https://github.com/open-mmlab/mmdetection/tree/master/configs/yolo|configs/cityscapes/mask_rcnn_r50_fpn_1x_cityscapes.py|https://open-mmlab.s3.ap-northeast-2.amazonaws.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r50_fpn_1x_coco/mask_rcnn_r50_fpn_1x_coco_20200205-d4b0c5d6.pth|下载权重文件| -|开源代码引入|https://github.com/open-mmlab/mmdetection/tree/master/configs/yolo|configs/faster_rcnn/faster_rcnn_r50_fpn_1x_coco-person-bicycle-car.py|https://s3.ap-northeast-2.amazonaws.com/open-mmlab/mmdetection/models/faster_rcnn_r50_fpn_1x_20181010-3d1b3351.pth|下载权重文件| -|开源代码引入|https://github.com/open-mmlab/mmdetection/tree/master/configs/yolo|docker/Dockerfile|https://openmmlab.oss-accelerate.aliyuncs.com/mmcv/dist/index.html|安装三方依赖库| -|开源代码引入|https://github.com/open-mmlab/mmdetection/tree/master/configs/yolo|docker/Dockerfile|https://github.com/open-mmlab/mmdetection.git|下载源码| -|开源代码引入|https://github.com/open-mmlab/mmdetection/tree/master/configs/yolo|Dockerfile|https://github.com/open-mmlab/mmcv.git|下载源码| -|开发引入|不涉及|url.ini|https://github.com/open-mmlab/mmcv.git|下载源码| -|开源代码引入|https://github.com/open-mmlab/mmdetection/tree/master/configs/yolo|setup.py|openmmlab@gmail.com|作者邮箱| -|开源代码引入|https://github.com/open-mmlab/mmdetection/tree/master/configs/yolo|setup.py|https://github.com/open-mmlab/mmdetection|下载源码| -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/necks/bfp.py | https://arxiv.org/abs/1904.02701 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/dense_heads/sabl_retina_head.py | https://arxiv.org/abs/1912.04260 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/core/bbox/samplers/iou_balanced_neg_sampler.py | https://arxiv.org/pdf/1904.02701.pdf | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/detectors/cascade_rcnn.py | https://arxiv.org/abs/1906.09756 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/core/bbox/coder/bucketing_bbox_coder.py | https://arxiv.org/abs/1912.04260 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/detectors/grid_rcnn.py | https://arxiv.org/abs/1811.12030 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/roi_heads/dynamic_roi_head.py | https://arxiv.org/abs/2004.06002 | 论文地址 | -| 开发引入 | / | 
YoloV3_ID1790_for_PyTorch/mmdet/models/necks/nasfcos_fpn.py | https://arxiv.org/abs/1906.04423 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/datasets/cityscapes.py | https://github.com/facebookresearch/detectron2/blob/master/detectron2/data/datasets/cityscapes.py | 源码实现 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/core/bbox/samplers/score_hlr_sampler.py | https://arxiv.org/abs/1904.04821 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/roi_heads/point_rend_roi_head.py | https://github.com/facebookresearch/detectron2/tree/master/projects/PointRend | 源码实现 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/losses/ae_loss.py | https://github.com/princeton-vl/CornerNet/blob/master/models/py_utils/kp_utils.py#L180 | 源码实现 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/datasets/dataset_wrappers.py | https://github.com/facebookresearch/detectron2/blob/41d475b75a230221e21d9cac5d69655e3415e3a4/detectron2/data/samplers/distributed_sampler.py#L57 | 源码实现 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/core/bbox/demodata.py | https://gitlab.kitware.com/computer-vision/kwimage/blob/master/kwimage/structs/boxes.py#L1390 | 源码实现 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/docs/conf.py | https://www.sphinx-doc.org/en/master/usage/configuration.html | 相关说明 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/datasets/pipelines/transforms.py | https://albumentations.readthedocs.io | 相关说明 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/datasets/wider_face.py | https://github.com/sovrasov/wider-face-pascal-voc-annotations | 源码实现 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/core/bbox/samplers/ohem_sampler.py | https://arxiv.org/abs/1604.03540 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/backbones/resnet.py | https://arxiv.org/pdf/1812.01187.pdf | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/datasets/coco.py | https://github.com/facebookresearch/detectron2/ | 源码实现 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/dense_heads/nasfcos_head.py | https://arxiv.org/abs/1906.04423 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/backbones/regnet.py | https://arxiv.org/abs/2003.13678 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/datasets/builder.py | https://github.com/pytorch/pytorch/issues/973 | 相关说明 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/detectors/retinanet.py | https://arxiv.org/abs/1708.02002 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/utils/transformer.py | https://arxiv.org/pdf/2005.12872 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/detectors/point_rend.py | https://arxiv.org/abs/1912.08193 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/losses/gaussian_focal_loss.py | https://github.com/princeton-vl/CornerNet/blob/master/models/py_utils/kp_utils.py#L152 | 源码实现 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/necks/nas_fpn.py | https://arxiv.org/abs/1904.07392 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/datasets/dataset_wrappers.py | https://arxiv.org/abs/1908.03195 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/core/bbox/coder/legacy_delta_xywh_bbox_coder.py | https://arxiv.org/abs/1311.2524 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/dense_heads/guided_anchor_head.py | https://arxiv.org/abs/1901.03278 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/losses/varifocal_loss.py | https://arxiv.org/abs/2008.13367 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/necks/fpn_carafe.py | 
https://arxiv.org/abs/1905.02188 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/dense_heads/paa_head.py | https://github.com/kkhoot/PAA/issues/8 | 相关说明 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/necks/rfp.py | https://arxiv.org/pdf/2006.02334.pdf | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/losses/focal_loss.py | https://arxiv.org/abs/1708.02002 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/detectors/yolact.py | https://arxiv.org/abs/1904.02689 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/core/bbox/coder/tblr_bbox_coder.py | https://arxiv.org/abs/1903.00621 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/datasets/pipelines/transforms.py | https://github.com/bethgelab/imagecorruptions | 源码实现 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/roi_heads/double_roi_head.py | https://arxiv.org/abs/1904.06493 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/detectors/mask_rcnn.py | https://arxiv.org/abs/1703.06870 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/backbones/detectors_resnet.py | https://arxiv.org/pdf/2006.02334.pdf | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/losses/ae_loss.py | https://arxiv.org/abs/1611.05424 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/losses/gaussian_focal_loss.py | https://arxiv.org/abs/1708.02002 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/roi_heads/roi_extractors/generic_roi_extractor.py | https://arxiv.org/abs/2004.13665 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/roi_heads/cascade_roi_head.py | https://arxiv.org/abs/1712.00726 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/dense_heads/paa_head.py | https://github.com/kkhoot/PAA/blob/master/paa_core | 源码实现 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/core/bbox/coder/yolo_bbox_coder.py | https://arxiv.org/abs/1506.02640 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/detectors/vfnet.py | https://arxiv.org/abs/2008.13367 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/detectors/fovea.py | https://arxiv.org/abs/1904.03797 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/roi_heads/mask_scoring_roi_head.py | https://arxiv.org/abs/1903.00241 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/dense_heads/paa_head.py | https://arxiv.org/abs/2007.08103 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/detectors/cornernet.py | https://arxiv.org/abs/1808.01244 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/core/bbox/assigners/atss_assigner.py | https://github.com/sfzhang15/ATSS/blob/master/atss_core/modeling/rpn/atss/loss.py | 源码实现 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/detectors/mask_scoring_rcnn.py | https://arxiv.org/abs/1903.00241 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/dense_heads/transformer_head.py | https://arxiv.org/pdf/2005.12872 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/detectors/fcos.py | https://arxiv.org/abs/1904.01355 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/roi_heads/bbox_heads/sabl_head.py | https://arxiv.org/abs/1912.04260 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/datasets/lvis.py | https://github.com/facebookresearch/detectron2/ | 源码实现 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/core/bbox/coder/delta_xywh_bbox_coder.py | https://arxiv.org/abs/1311.2524 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/detectors/fsaf.py | 
https://arxiv.org/abs/1903.00621 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/dense_heads/fcos_head.py | https://github.com/tianzhi0549/FCOS/issues/89#issuecomment-516877042 | 相关说明 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/detectors/atss.py | https://arxiv.org/abs/1912.02424 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/dense_heads/fovea_head.py | https://arxiv.org/abs/1904.03797 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/dense_heads/free_anchor_retina_head.py | https://arxiv.org/abs/1909.02466 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/dense_heads/fsaf_head.py | https://arxiv.org/abs/1903.00621 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/dense_heads/fcos_head.py | https://github.com/tianzhi0549/FCOS | 源码实现 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/losses/ghm_loss.py | https://arxiv.org/abs/1811.05181 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/necks/hrfpn.py | https://arxiv.org/abs/1904.04514 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/datasets/lvis.py | http://images.cocodataset.org/train2017/000000391895.jpg | 图片地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/utils/util_mixins.py | https://github.com/Erotemic/ubelt | 源码实现 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/losses/iou_loss.py | https://arxiv.org/abs/2005.03572 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/losses/iou_loss.py | https://arxiv.org/abs/1902.09630 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/roi_heads/grid_roi_head.py | https://arxiv.org/abs/1811.12030 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/dense_heads/fcos_head.py | https://arxiv.org/abs/1904.01355 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/dense_heads/paa_head.py | https://github.com/kkhoot/PAA/issues/9 | 相关说明 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/utils/gaussian_target.py | https://github.com/princeton-vl/CornerNet-Lite/blob/master/core/sample/ | 源码实现 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/utils/positional_encoding.py | https://arxiv.org/pdf/2005.12872 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/dense_heads/atss_head.py | https://arxiv.org/abs/1912.02424 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/core/mask/structures.py | https://github.com/facebookresearch/detectron2/blob/ffff8acc35ea88ad1cb1806ab0f00b4c1c5dbfd9/detectron2/structures/masks.py#L387 | 源码实现 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/losses/iou_loss.py | https://arxiv.org/abs/1911.08287 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/datasets/pipelines/transforms.py | https://arxiv.org/abs/1708.04552 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/roi_heads/htc_roi_head.py | https://arxiv.org/abs/1901.07518 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/losses/iou_loss.py | https://arxiv.org/abs/1711.00164 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/detectors/nasfcos.py | https://arxiv.org/abs/1906.0442 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/core/post_processing/bbox_nms.py | https://arxiv.org/abs/1904.02689 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/datasets/pipelines/instaboost.py | https://github.com/GothicAi/Instaboost | 源码实现 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/losses/iou_loss.py | https://github.com/Zzh-tju/CIoU | 源码实现 | -| 开发引入 | / | 
YoloV3_ID1790_for_PyTorch/mmdet/models/roi_heads/mask_heads/grid_head.py | https://arxiv.org/abs/1906.05688 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/dense_heads/ssd_head.py | https://arxiv.org/abs/1512.02325 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/losses/iou_loss.py | https://github.com/Zzh-tju/DIoU | 源码实现 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/roi_heads/mask_heads/mask_point_head.py | https://github.com/facebookresearch/detectron2/tree/master/projects/PointRend/point_head/point_head.py | 源码实现 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/roi_heads/mask_heads/fcn_mask_head.py | https://github.com/facebookresearch/detectron2/ | 源码实现 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/detectors/detr.py | https://arxiv.org/pdf/2005.12872 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/roi_heads/pisa_roi_head.py | https://arxiv.org/abs/1904.04821 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/datasets/pipelines/instaboost.py | https://arxiv.org/abs/1908.07801 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/roi_heads/roi_extractors/single_level_roi_extractor.py | https://arxiv.org/abs/1612.03144 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/dense_heads/corner_head.py | https://github.com/princeton-vl/CornerNet/blob/master/models/py_utils/ | 源码实现 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/roi_heads/point_rend_roi_head.py | https://arxiv.org/abs/1912.08193 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/backbones/hourglass.py | https://arxiv.org/abs/1603.06937 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/setup.py | http://setuptools.readthedocs.io/en/latest/setuptools.html#declaring-platform-specific-dependencies | 相关说明 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/losses/gaussian_focal_loss.py | https://arxiv.org/abs/1808.01244 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/detectors/htc.py | https://arxiv.org/abs/1901.07518 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/docs/make.bat | http://sphinx-doc.org/ | 相关说明 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/detectors/reppoints_detector.py | https://arxiv.org/pdf/1904.11490 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/core/mask/structures.py | https://stackoverflow.com/questions/24467972/calculate-area-of-polygon-given-x-y-coordinates | 相关说明 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/detectors/fast_rcnn.py | https://arxiv.org/abs/1504.08083 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/detectors/grid_rcnn.py | https://arxiv.org/abs/1906.05688 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmcv_need/optimizer.py | https://arxiv.org/abs/1710.03740 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/losses/balanced_l1_loss.py | https://arxiv.org/pdf/1904.02701.pdf | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/dense_heads/gfl_head.py | https://arxiv.org/abs/2006.04388 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/losses/gfocal_loss.py | https://arxiv.org/abs/2006.04388 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/backbones/hrnet.py | https://arxiv.org/abs/1904.04514 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/dense_heads/centripetal_head.py | https://arxiv.org/abs/2003.09119 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/dense_heads/vfnet_head.py | https://arxiv.org/abs/2008.13367 | 论文地址 | -| 开发引入 | / | 
YoloV3_ID1790_for_PyTorch/docs/stat.py | https://github.com/open-mmlab/mmdetection/blob/master/ | 源码实现 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/dense_heads/corner_head.py | https://arxiv.org/abs/1808.01244 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/datasets/cityscapes.py | https://github.com/mcordts/cityscapesScripts/blob/master/cityscapesscripts/evaluation/evalInstanceLevelSemanticLabeling.py | 源码实现 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/dense_heads/yolact_head.py | https://arxiv.org/abs/1904.02689 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/necks/fpn.py | https://arxiv.org/abs/1612.03144 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/datasets/pipelines/auto_augment.py | https://arxiv.org/pdf/1906.11172 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/tree/master/configs/yolo/README.md | YoloV3_ID1790_for_PyTorch/mmdet/models/dense_heads/yolo_head.py | https://arxiv.org/abs/1804.02767 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/detectors/faster_rcnn.py | https://arxiv.org/abs/1506.01497 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/necks/pafpn.py | https://arxiv.org/abs/1803.01534 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/core/bbox/demodata.py | https://gitlab.kitware.com/computer-vision/kwarray/blob/master/kwarray/util_random.py#L270 | 源码实现 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/detectors/paa.py | https://arxiv.org/pdf/2007.08103.pdf | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/losses/ae_loss.py | https://arxiv.org/abs/1808.01244 | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/models/dense_heads/retina_head.py | https://arxiv.org/pdf/1708.02002.pdf | 论文地址 | -| 开发引入 | / | YoloV3_ID1790_for_PyTorch/mmdet/datasets/lvis.py | http://images.cocodataset.org/ | 数据集地址 | +| 文件位置 | 公网地址 | 公网地址用途 | +|------------------------------------------------------------------------------------------------------------------------------------------------|---------------------------------------------------------------------------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/YoloV3_ID1790_for_PyTorch/configs/cityscapes/faster_rcnn_r50_fpn_1x_cityscapes.py | https://open-mmlab.s3.ap-northeast-2.amazonaws.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_fpn_1x_coco/faster_rcnn_r50_fpn_1x_coco_20200130-047c8118.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/YoloV3_ID1790_for_PyTorch/configs/cityscapes/mask_rcnn_r50_fpn_1x_cityscapes.py | https://open-mmlab.s3.ap-northeast-2.amazonaws.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r50_fpn_1x_coco/mask_rcnn_r50_fpn_1x_coco_20200205-d4b0c5d6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/YoloV3_ID1790_for_PyTorch/configs/faster_rcnn/faster_rcnn_r50_fpn_1x_coco-person-bicycle-car.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/mmdetection/models/faster_rcnn_r50_fpn_1x_20181010-3d1b3351.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/YoloV3_ID1790_for_PyTorch/docker/Dockerfile | https://openmmlab.oss-accelerate.aliyuncs.com/mmcv/dist/index.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/YoloV3_ID1790_for_PyTorch/mmdet/datasets/lvis.py | http://images.cocodataset.org/ | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/detection/YoloV3_ID1790_for_PyTorch/setup.py | openmmlab@gmail.com | 作者邮箱 | \ No newline at end of file diff --git 
a/PyTorch/built-in/cv/object_tracking/SiamMask_for_Pytorch/public_address_statement.md b/PyTorch/built-in/cv/object_tracking/SiamMask_for_Pytorch/public_address_statement.md index 5e2168d970d733a06b79c35ed6e9e5675c599c09..e6ec1c74b626e90e4671e77f4ea1fe3a645ec62b 100644 --- a/PyTorch/built-in/cv/object_tracking/SiamMask_for_Pytorch/public_address_statement.md +++ b/PyTorch/built-in/cv/object_tracking/SiamMask_for_Pytorch/public_address_statement.md @@ -1,69 +1,27 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------------|-----------------------------------------------------------|--------------------------------------------------------------------------------|---------| -| 开源代码引入 | https://github.com/foolwood/SiamMask/blob/master/data/get_test_data.sh | SiamMask_for_Pytorch/data/get_test_data.sh | https://github.com/jvlmdr/trackdat.git | 下载数据集 | -| 开源代码引入 | https://github.com/foolwood/SiamMask/blob/master/data/get_test_data.sh | SiamMask_for_Pytorch/data/get_test_data.sh | http://www.robots.ox.ac.uk/~qwang/VOT2016.json | 下载数据集 | -| 开源代码引入 | https://github.com/foolwood/SiamMask/blob/master/data/get_test_data.sh | SiamMask_for_Pytorch/data/get_test_data.sh | http://www.robots.ox.ac.uk/~qwang/VOT2018.json | 下载数据集 | -| 开源代码引入 | https://github.com/foolwood/SiamMask/blob/master/data/get_test_data.sh | SiamMask_for_Pytorch/data/get_test_data.sh | https://data.vision.ee.ethz.ch/csergi/share/davis/DAVIS-2017-trainval-480p.zip | 下载数据集 | -| 开发引入 | / | url.ini | https://github.com/jvlmdr/trackdat.git | 下载数据集 | -| 开发引入 | / | url.ini | http://www.robots.ox.ac.uk/~qwang/VOT2018.json | 下载数据集 | -| 开源代码引入 | https://github.com/foolwood/SiamMask/blob/master/data/ytb_vos/download_from_gdrive.py | SiamMask_for_Pytorch/data/ytb_vos/download_from_gdrive.py | https://docs.google.com | 下载数据集 | -| 开源代码引入 | https://github.com/foolwood/SiamMask/blob/master/experiments/siammask_base/resnet.py | SiamMask_for_Pytorch/experiments/siammask_base/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/foolwood/SiamMask/blob/master/experiments/siammask_base/resnet.py | SiamMask_for_Pytorch/experiments/siammask_base/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/foolwood/SiamMask/blob/master/experiments/siammask_base/resnet.py | SiamMask_for_Pytorch/experiments/siammask_base/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/foolwood/SiamMask/blob/master/experiments/siammask_base/resnet.py | SiamMask_for_Pytorch/experiments/siammask_base/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/foolwood/SiamMask/blob/master/experiments/siammask_base/resnet.py | SiamMask_for_Pytorch/experiments/siammask_base/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/foolwood/SiamMask/blob/master/experiments/siammask_sharp/resnet.py | SiamMask_for_Pytorch/experiments/siammask_sharp/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/foolwood/SiamMask/blob/master/experiments/siammask_sharp/resnet.py | SiamMask_for_Pytorch/experiments/siammask_sharp/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/foolwood/SiamMask/blob/master/experiments/siammask_sharp/resnet.py | 
SiamMask_for_Pytorch/experiments/siammask_sharp/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/foolwood/SiamMask/blob/master/experiments/siammask_sharp/resnet.py | SiamMask_for_Pytorch/experiments/siammask_sharp/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/foolwood/SiamMask/blob/master/experiments/siammask_sharp/resnet.py | SiamMask_for_Pytorch/experiments/siammask_sharp/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/foolwood/SiamMask/blob/master/experiments/siamrpn_resnet/resnet.py | SiamMask_for_Pytorch/experiments/siamrpn_resnet/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/foolwood/SiamMask/blob/master/experiments/siamrpn_resnet/resnet.py | SiamMask_for_Pytorch/experiments/siamrpn_resnet/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/foolwood/SiamMask/blob/master/experiments/siamrpn_resnet/resnet.py | SiamMask_for_Pytorch/experiments/siamrpn_resnet/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/foolwood/SiamMask/blob/master/experiments/siamrpn_resnet/resnet.py | SiamMask_for_Pytorch/experiments/siamrpn_resnet/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/foolwood/SiamMask/blob/master/experiments/siamrpn_resnet/resnet.py | SiamMask_for_Pytorch/experiments/siamrpn_resnet/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnet18-5c106cde.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnet50-19c8e357.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/foolwood/SiamMask/blob/master/utils/pysot/datasets/vot.py | SiamMask_for_Pytorch/utils/pysot/datasets/vot.py | http://www.robots.ox.ac.uk/~qwang/VOT2016.json | 下载数据集 | -| 开源代码引入 | https://github.com/foolwood/SiamMask/blob/master/utils/pysot/datasets/vot.py | SiamMask_for_Pytorch/utils/pysot/datasets/vot.py | http://www.robots.ox.ac.uk/~qwang/VOT2018.json | 下载数据集 | -| 开源代码引入 | https://github.com/foolwood/SiamMask/blob/master/utils/pyvotkit/__init__.py | SiamMask_for_Pytorch/utils/pysot/evaluation/eao_benchmark.py | https://github.com/StrangerZhang/pysot-toolkit.git | 源码实现 | -| 开源代码引入 | https://github.com/foolwood/SiamMask/blob/master/utils/pyvotkit/__init__.py | SiamMask_for_Pytorch/utils/pysot/utils/__init__.py | fangyi.zhang@vipl.ict.ac.cn | 邮箱地址 | -| 开源代码引入 | https://github.com/foolwood/SiamMask/blob/master/utils/pyvotkit/__init__.py | SiamMask_for_Pytorch/utils/pysot/utils/misc.py | fangyi.zhang@vipl.ict.ac.cn | 邮箱地址 | -| 开源代码引入 | https://github.com/foolwood/SiamMask/blob/master/data/coco/pycocotools/_mask.pyx | SiamMask_for_Pytorch/data/coco/pycocotools/common/maskApi.c | http://mscoco.org/ | 相关说明 | -| 开源代码引入 | https://github.com/foolwood/SiamMask/blob/master/utils/pyvotkit/__init__.py | SiamMask_for_Pytorch/utils/pysot/utils/region.pyx | fangyi.zhang@vipl.ict.ac.cn | 邮箱地址 | -| 开源代码引入 | 
https://github.com/foolwood/SiamMask/blob/master/utils/pyvotkit/__init__.py | SiamMask_for_Pytorch/utils/pysot/datasets/dataset.py | https://github.com/StrangerZhang/pysot-toolkit.git | 源码实现 | -| 开源代码引入 | https://github.com/foolwood/SiamMask/blob/master/utils/pyvotkit/__init__.py | SiamMask_for_Pytorch/utils/pysot/utils/region.pyx | https://github.com/StrangerZhang/pysot-toolkit.git | 源码实现 | -| 开源代码引入 | https://github.com/foolwood/SiamMask/blob/master/data/coco/pycocotools/_mask.pyx | SiamMask_for_Pytorch/data/coco/pycocotools/common/maskApi.h | http://mscoco.org/ | 相关说明 | -| 开源代码引入 | https://github.com/foolwood/SiamMask/blob/master/utils/pyvotkit/__init__.py | SiamMask_for_Pytorch/utils/pysot/utils/statistics.py | https://github.com/StrangerZhang/pysot-toolkit.git | 源码实现 | -| 开源代码引入 | https://github.com/foolwood/SiamMask/blob/master/utils/pyvotkit/__init__.py | SiamMask_for_Pytorch/utils/pysot/evaluation/ar_benchmark.py | fangyi.zhang@vipl.ict.ac.cn | 邮箱地址 | -| 开源代码引入 | https://github.com/foolwood/SiamMask/blob/master/utils/pyvotkit/__init__.py | SiamMask_for_Pytorch/utils/pysot/evaluation/__init__.py | https://github.com/StrangerZhang/pysot-toolkit.git | 源码实现 | -| 开源代码引入 | https://github.com/foolwood/SiamMask/blob/master/data/coco/pycocotools/_mask.pyx | SiamMask_for_Pytorch/data/coco/pycocotools/coco.py | http://mscoco.org/ | 相关说明 | -| 开源代码引入 | https://github.com/foolwood/SiamMask/blob/master/utils/pyvotkit/__init__.py | SiamMask_for_Pytorch/utils/pysot/utils/setup.py | fangyi.zhang@vipl.ict.ac.cn | 邮箱地址 | -| 开源代码引入 | https://github.com/foolwood/SiamMask/blob/master/utils/pyvotkit/__init__.py | SiamMask_for_Pytorch/utils/pysot/datasets/video.py | fangyi.zhang@vipl.ict.ac.cn | 邮箱地址 | -| 开发引入 | / | SiamMask_for_Pytorch/data/coco/pycocotools/common/gason.cpp | https://github.com/vivkin/gason | 源码实现 | -| 开发引入 | / | SiamMask_for_Pytorch/data/coco/pycocotools/common/gason.h | https://github.com/vivkin/gason | 源码实现 | -| 开源代码引入 | https://github.com/foolwood/SiamMask/blob/master/data/coco/pycocotools/_mask.pyx | SiamMask_for_Pytorch/data/coco/pycocotools/cocoeval.py | http://mscoco.org/ | 相关说明 | -| 开源代码引入 | https://github.com/foolwood/SiamMask/blob/master/utils/pyvotkit/__init__.py | SiamMask_for_Pytorch/utils/pysot/utils/__init__.py | https://github.com/StrangerZhang/pysot-toolkit.git | 源码实现 | -| 开源代码引入 | https://github.com/foolwood/SiamMask/blob/master/utils/pyvotkit/__init__.py | SiamMask_for_Pytorch/utils/pysot/utils/setup.py | https://github.com/StrangerZhang/pysot-toolkit.git | 源码实现 | -| 开源代码引入 | https://github.com/foolwood/SiamMask/blob/master/utils/pyvotkit/__init__.py | SiamMask_for_Pytorch/tools/eval.py | fangyi.zhang@vipl.ict.ac.cn | 邮箱地址 | -| 开源代码引入 | https://github.com/foolwood/SiamMask/blob/master/utils/pyvotkit/__init__.py | SiamMask_for_Pytorch/tools/eval.py | https://github.com/StrangerZhang/pysot-toolkit.git | 源码实现 | -| 开源代码引入 | https://github.com/foolwood/SiamMask/blob/master/utils/pyvotkit/__init__.py | SiamMask_for_Pytorch/utils/pysot/datasets/__init__.py | https://github.com/StrangerZhang/pysot-toolkit.git | 源码实现 | -| 开源代码引入 | https://github.com/foolwood/SiamMask/blob/master/utils/pyvotkit/__init__.py | SiamMask_for_Pytorch/utils/pysot/datasets/vot.py | https://github.com/StrangerZhang/pysot-toolkit.git | 源码实现 | -| 开源代码引入 | https://github.com/foolwood/SiamMask/blob/master/utils/pyvotkit/__init__.py | SiamMask_for_Pytorch/utils/pysot/evaluation/ar_benchmark.py | https://github.com/StrangerZhang/pysot-toolkit.git | 源码实现 | -| 开源代码引入 | 
https://github.com/foolwood/SiamMask/blob/master/utils/pyvotkit/__init__.py | SiamMask_for_Pytorch/utils/pyvotkit/setup.py | fangyi.zhang@vipl.ict.ac.cn | 邮箱地址 | -| 开源代码引入 | https://github.com/foolwood/SiamMask/blob/master/utils/pyvotkit/__init__.py | SiamMask_for_Pytorch/utils/pyvotkit/__init__.py | https://github.com/StrangerZhang/pysot-toolkit.git | 源码实现 | -| 开源代码引入 | https://github.com/foolwood/SiamMask/blob/master/data/coco/pycocotools/_mask.pyx | SiamMask_for_Pytorch/data/coco/pycocotools/mask.py | http://mscoco.org/ | 相关说明 | -| 开源代码引入 | https://github.com/foolwood/SiamMask/blob/master/data/coco/pycocotools/_mask.pyx | SiamMask_for_Pytorch/data/coco/pycocotools/_mask.pyx | http://mscoco.org/ | 相关说明 | -| 开源代码引入 | https://github.com/foolwood/SiamMask/blob/master/utils/pyvotkit/__init__.py | SiamMask_for_Pytorch/utils/pysot/datasets/dataset.py | fangyi.zhang@vipl.ict.ac.cn | 邮箱地址 | -| 开源代码引入 | https://github.com/foolwood/SiamMask/blob/master/utils/pyvotkit/__init__.py | SiamMask_for_Pytorch/utils/pysot/datasets/vot.py | fangyi.zhang@vipl.ict.ac.cn | 邮箱地址 | -| 开源代码引入 | https://github.com/foolwood/SiamMask/blob/master/utils/pyvotkit/__init__.py | SiamMask_for_Pytorch/utils/pysot/utils/misc.py | https://github.com/StrangerZhang/pysot-toolkit.git | 源码实现 | -| 开源代码引入 | https://github.com/foolwood/SiamMask/blob/master/utils/pyvotkit/__init__.py | SiamMask_for_Pytorch/utils/pysot/datasets/video.py | https://github.com/StrangerZhang/pysot-toolkit.git | 源码实现 | -| 开源代码引入 | https://github.com/foolwood/SiamMask/blob/master/utils/pyvotkit/__init__.py | SiamMask_for_Pytorch/utils/pysot/datasets/__init__.py | fangyi.zhang@vipl.ict.ac.cn | 邮箱地址 | -| 开源代码引入 | https://github.com/foolwood/SiamMask/blob/master/utils/pyvotkit/__init__.py | SiamMask_for_Pytorch/utils/pysot/utils/statistics.py | fangyi.zhang@vipl.ict.ac.cn | 邮箱地址 | -| 开源代码引入 | https://github.com/foolwood/SiamMask/blob/master/utils/pyvotkit/__init__.py | SiamMask_for_Pytorch/utils/pyvotkit/setup.py | https://github.com/StrangerZhang/pysot-toolkit.git | 源码实现 | -| 开源代码引入 | https://github.com/foolwood/SiamMask/blob/master/utils/pyvotkit/__init__.py | SiamMask_for_Pytorch/utils/pysot/evaluation/__init__.py | fangyi.zhang@vipl.ict.ac.cn | 邮箱地址 | -| 开源代码引入 | https://github.com/foolwood/SiamMask/blob/master/utils/pyvotkit/__init__.py | SiamMask_for_Pytorch/utils/pyvotkit/__init__.py | fangyi.zhang@vipl.ict.ac.cn | 邮箱地址 | -| 开源代码引入 | https://github.com/foolwood/SiamMask/blob/master/utils/pyvotkit/__init__.py | SiamMask_for_Pytorch/utils/pysot/evaluation/eao_benchmark.py | fangyi.zhang@vipl.ict.ac.cn | 邮箱地址 | +| 文件位置 | 公网地址 | 公网地址用途 | +|----------------------------------------------------------------------------------------------------------------|------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/built-in/cv/object_tracking/SiamMask_for_Pytorch/data/get_test_data.sh | http://www.robots.ox.ac.uk/~qwang/VOT2018.json | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/object_tracking/SiamMask_for_Pytorch/data/get_test_data.sh | http://www.robots.ox.ac.uk/~qwang/VOT2016.json | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/object_tracking/SiamMask_for_Pytorch/experiments/siammask_base/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/object_tracking/SiamMask_for_Pytorch/experiments/siammask_base/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/object_tracking/SiamMask_for_Pytorch/experiments/siammask_base/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/object_tracking/SiamMask_for_Pytorch/experiments/siammask_base/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/object_tracking/SiamMask_for_Pytorch/experiments/siammask_base/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/object_tracking/SiamMask_for_Pytorch/experiments/siammask_sharp/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/object_tracking/SiamMask_for_Pytorch/experiments/siammask_sharp/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/object_tracking/SiamMask_for_Pytorch/experiments/siammask_sharp/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/object_tracking/SiamMask_for_Pytorch/experiments/siammask_sharp/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/object_tracking/SiamMask_for_Pytorch/experiments/siammask_sharp/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/object_tracking/SiamMask_for_Pytorch/experiments/siamrpn_resnet/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/object_tracking/SiamMask_for_Pytorch/experiments/siamrpn_resnet/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/object_tracking/SiamMask_for_Pytorch/experiments/siamrpn_resnet/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/object_tracking/SiamMask_for_Pytorch/experiments/siamrpn_resnet/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/object_tracking/SiamMask_for_Pytorch/experiments/siamrpn_resnet/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/object_tracking/SiamMask_for_Pytorch/url.ini | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/object_tracking/SiamMask_for_Pytorch/url.ini | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/object_tracking/SiamMask_for_Pytorch/url.ini | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/object_tracking/SiamMask_for_Pytorch/url.ini | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/object_tracking/SiamMask_for_Pytorch/url.ini | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/object_tracking/SiamMask_for_Pytorch/url.ini | http://www.robots.ox.ac.uk/~qwang/VOT2018.json | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/object_tracking/SiamMask_for_Pytorch/utils/pysot/datasets/vot.py | http://www.robots.ox.ac.uk/~qwang/VOT2018.json | 数据集链接 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/object_tracking/SiamMask_for_Pytorch/utils/pysot/datasets/vot.py | http://www.robots.ox.ac.uk/~qwang/VOT2016.json | 数据集链接 | \ No newline at end of file diff --git a/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/public_address_statement.md b/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/public_address_statement.md index 82ced5570687f0abb03a03efab392bc01ad6d9cf..f6ea53cd2d1f179728ebb1c67a79f0e0511ee16e 100644 --- a/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/public_address_statement.md @@ -1,495 +1,933 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ---- | ------------ | ------ | ------------------------------------ | -------- | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/face_2d_keypoint/rtmpose/face6/rtmpose_face6.md| HRNet_MMPose_for_PyTorch/configs/_base_/datasets/300w.py | https://ibug.doc.ic.ac.uk/resources/300-W/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/_base_/datasets/aflw.py| HRNet_MMPose_for_PyTorch/configs/_base_/datasets/aflw.py | https://www.tugraz.at/institute/icg/research/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/_base_/datasets/aic.py| HRNet_MMPose_for_PyTorch/configs/_base_/datasets/aic.py | https://github.com/AIChallenger/AI_Challenger_2017 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/_base_/datasets/aic.py| HRNet_MMPose_for_PyTorch/configs/_base_/datasets/aic.py | https://github.com/AIChallenger/AI_Challenger_2017/blob/master/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/_base_/datasets/animalpose.py| HRNet_MMPose_for_PyTorch/configs/_base_/datasets/animalpose.py | https://sites.google.com/view/animal-pose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/_base_/datasets/ap10k.py| HRNet_MMPose_for_PyTorch/configs/_base_/datasets/ap10k.py | https://github.com/AlexTheBad/AP-10K | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/_base_/datasets/animalpose.py| HRNet_MMPose_for_PyTorch/configs/_base_/datasets/animalpose.py | https://github.com/cocodataset/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/_base_/datasets/atrw.py| HRNet_MMPose_for_PyTorch/configs/_base_/datasets/atrw.py | https://cvwc2019.github.io/challenge.html | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/_base_/datasets/campus.py| HRNet_MMPose_for_PyTorch/configs/_base_/datasets/campus.py | http://campar.in.tum.de/Chair/MultiHumanPose | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/_base_/datasets/coco.py| HRNet_MMPose_for_PyTorch/configs/_base_/datasets/coco.py | http://cocodataset.org/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/face_2d_keypoint/rtmpose/face6/rtmpose_face6.md| HRNet_MMPose_for_PyTorch/configs/_base_/datasets/coco_wholebody.py | https://github.com/jin-s13/COCO-WholeBody/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/face_2d_keypoint/rtmpose/face6/rtmpose_face6.md| HRNet_MMPose_for_PyTorch/configs/_base_/datasets/coco_wholebody_face.py | https://github.com/jin-s13/COCO-WholeBody/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/face_2d_keypoint/rtmpose/face6/rtmpose_face6.md| 
HRNet_MMPose_for_PyTorch/configs/_base_/datasets/coco_wholebody_hand.py | https://github.com/jin-s13/COCO-WholeBody/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/face_2d_keypoint/rtmpose/face6/rtmpose_face6.md| HRNet_MMPose_for_PyTorch/configs/_base_/datasets/cofw.py | http://www.vision.caltech.edu/xpburgos/ICCV13/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/_base_/datasets/coco_wholebody.py| HRNet_MMPose_for_PyTorch/configs/_base_/datasets/coco_wholebody_face.py | https://github.com/jin-s13/COCO-WholeBody/blob/master/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/_base_/datasets/crowdpose.py| HRNet_MMPose_for_PyTorch/configs/_base_/datasets/crowdpose.py | https://github.com/Jeff-sjtu/CrowdPose | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/_base_/datasets/deepfashion2.py| HRNet_MMPose_for_PyTorch/configs/_base_/datasets/deepfashion2.py | https://github.com/switchablenorms/DeepFashion2 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/_base_/datasets/deepfashion_full.py| HRNet_MMPose_for_PyTorch/configs/_base_/datasets/deepfashion_full.py | http://mmlab.ie.cuhk.edu.hk/projects/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/_base_/datasets/deepfashion_full.py| HRNet_MMPose_for_PyTorch/configs/_base_/datasets/deepfashion_lower.py | http://mmlab.ie.cuhk.edu.hk/projects/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/_base_/datasets/deepfashion_full.py| HRNet_MMPose_for_PyTorch/configs/_base_/datasets/deepfashion_upper.py | http://mmlab.ie.cuhk.edu.hk/projects/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/_base_/datasets/fly.py| HRNet_MMPose_for_PyTorch/configs/_base_/datasets/fly.py | https://github.com/jgraving/DeepPoseKit-Data | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/_base_/datasets/coco_wholebody.py| HRNet_MMPose_for_PyTorch/configs/_base_/datasets/coco_wholebody.py | https://github.com/jin-s13/COCO-WholeBody/blob/master/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/hand_2d_keypoint/rtmpose/hand5/rtmpose_hand5.md| HRNet_MMPose_for_PyTorch/configs/_base_/datasets/freihand2d.py | https://lmb.informatik.uni-freiburg.de/projects/freihand/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/_base_/datasets/h36m.py| HRNet_MMPose_for_PyTorch/configs/_base_/datasets/h36m.py | http://vision.imar.ro/human3.6m/description.php | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/body_2d_keypoint/rtmpose/body8/rtmpose_body8-halpe26.md| HRNet_MMPose_for_PyTorch/configs/_base_/datasets/halpe.py | https://github.com/Fang-Haoshu/Halpe-FullBody/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/_base_/datasets/horse10.py| HRNet_MMPose_for_PyTorch/configs/_base_/datasets/horse10.py | http://www.mackenziemathislab.org/horse10 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/_base_/datasets/interhand2d.py| HRNet_MMPose_for_PyTorch/configs/_base_/datasets/interhand2d.py | https://mks0601.github.io/InterHand2.6M/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/_base_/datasets/interhand2d.py| HRNet_MMPose_for_PyTorch/configs/_base_/datasets/interhand3d.py | https://mks0601.github.io/InterHand2.6M/ | 模型相关说明 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmpose/blob/main/configs/_base_/datasets/jhmdb.py| HRNet_MMPose_for_PyTorch/configs/_base_/datasets/jhmdb.py | http://jhmdb.is.tue.mpg.de/dataset | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/_base_/datasets/halpe.py| HRNet_MMPose_for_PyTorch/configs/_base_/datasets/halpe.py | https://github.com/Fang-Haoshu/Halpe-FullBody/blob/master/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/_base_/datasets/fly.py| HRNet_MMPose_for_PyTorch/configs/_base_/datasets/locust.py | https://github.com/jgraving/DeepPoseKit-Data | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/_base_/datasets/macaque.py| HRNet_MMPose_for_PyTorch/configs/_base_/datasets/macaque.py | http://www.pri.kyoto-u.ac.jp/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/_base_/datasets/mhp.py| HRNet_MMPose_for_PyTorch/configs/_base_/datasets/mhp.py | https://lv-mhp.github.io/dataset | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/_base_/datasets/mpii.py| HRNet_MMPose_for_PyTorch/configs/_base_/datasets/mpii.py | http://human-pose.mpi-inf.mpg.de/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/_base_/datasets/mpii_trb.py| HRNet_MMPose_for_PyTorch/configs/_base_/datasets/mpii_trb.py | https://github.com/kennymckormick/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/_base_/datasets/mpi_inf_3dhp.py| HRNet_MMPose_for_PyTorch/configs/_base_/datasets/mpi_inf_3dhp.py | http://gvv.mpi-inf.mpg.de/3dhp-dataset | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/_base_/datasets/ochuman.py| HRNet_MMPose_for_PyTorch/configs/_base_/datasets/ochuman.py | https://github.com/liruilong940607/OCHumanApi | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/hand_2d_keypoint/rtmpose/hand5/rtmpose_hand5.md| HRNet_MMPose_for_PyTorch/configs/_base_/datasets/onehand10k.py | https://www.yangangwang.com/papers/WANG-MCC-2018-10.html | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/_base_/datasets/panoptic_body3d.py| HRNet_MMPose_for_PyTorch/configs/_base_/datasets/panoptic_body3d.py | http://domedb.perception.cs.cmu.edu | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/hand_2d_keypoint/rtmpose/hand5/rtmpose_hand5.md| HRNet_MMPose_for_PyTorch/configs/_base_/datasets/rhd2d.py | https://lmb.informatik.uni-freiburg.de/resources/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/_base_/datasets/panoptic_hand2d.py| HRNet_MMPose_for_PyTorch/configs/_base_/datasets/panoptic_hand2d.py | http://domedb.perception.cs.cmu.edu/handdb.html | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/_base_/datasets/posetrack18.py| HRNet_MMPose_for_PyTorch/configs/_base_/datasets/posetrack18.py | https://posetrack.net/users/download.php | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/hand_2d_keypoint/rtmpose/hand5/rtmpose_hand5.md| HRNet_MMPose_for_PyTorch/configs/_base_/datasets/rhd2d.py | https://lmb.informatik.uni-freiburg.de/resources/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/_base_/datasets/campus.py| HRNet_MMPose_for_PyTorch/configs/_base_/datasets/shelf.py | http://campar.in.tum.de/Chair/MultiHumanPose | 模型相关说明 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmpose/blob/main/configs/face_2d_keypoint/rtmpose/face6/rtmpose_face6.md| HRNet_MMPose_for_PyTorch/configs/_base_/datasets/wflw.py | https://wywu.github.io/projects/LAB/WFLW.html | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/_base_/datasets/fly.py| HRNet_MMPose_for_PyTorch/configs/_base_/datasets/zebra.py | https://github.com/jgraving/DeepPoseKit-Data | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose_ap10k.md | HRNet_MMPose_for_PyTorch/configs/_base_/filters/smoothnet_t16_h36m.py | https://download.openmmlab.com/mmpose/plugin/smoothnet/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose_ap10k.md | HRNet_MMPose_for_PyTorch/configs/_base_/filters/smoothnet_t32_h36m.py | https://download.openmmlab.com/mmpose/plugin/smoothnet/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose_ap10k.md | HRNet_MMPose_for_PyTorch/configs/_base_/filters/smoothnet_t64_h36m.py | https://download.openmmlab.com/mmpose/plugin/smoothnet/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose_ap10k.md | HRNet_MMPose_for_PyTorch/configs/_base_/filters/smoothnet_t8_h36m.py | https://download.openmmlab.com/mmpose/plugin/smoothnet/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/docs/en/dataset_zoo/3d_body_mesh.md| HRNet_MMPose_for_PyTorch/mmpose/core/evaluation/mesh_eval.py | https://github.com/akanazawa/hmr | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/body_2d_keypoint/topdown_heatmap/coco/hrnet_fp16_coco.md| HRNet_MMPose_for_PyTorch/mmpose/core/fp16/hooks.py | https://arxiv.org/abs/1710.03740 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/mmpose/evaluation/functional/nms.py| HRNet_MMPose_for_PyTorch/mmpose/core/post_processing/nms.py | https://github.com/leoxiaobin/deep-high-resolution-net.pytorch | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/mmpose/evaluation/functional/nms.py| HRNet_MMPose_for_PyTorch/mmpose/core/post_processing/post_transforms.py | https://github.com/leoxiaobin/deep-high-resolution-net.pytorch | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/_base_/datasets/mpi_inf_3dhp.py | HRNet_MMPose_for_PyTorch/mmpose/core/post_processing/one_euro_filter.py | http://gvv.mpi-inf.mpg.de/projects/VNect/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/datasets/transforms/common_transforms.py | HRNet_MMPose_for_PyTorch/mmpose/datasets/pipelines/shared_transform.py | https://albumentations.readthedocs.io | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/mmpose/datasets/transforms/common_transforms.py| HRNet_MMPose_for_PyTorch/mmpose/datasets/pipelines/shared_transform.py | https://github.com/albumentations-team/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/mmpose/models/backbones/alexnet.py| HRNet_MMPose_for_PyTorch/mmpose/models/backbones/alexnet.py | https://en.wikipedia.org/wiki/AlexNet | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/mmpose/models/backbones/cpm.py| HRNet_MMPose_for_PyTorch/mmpose/models/backbones/cpm.py | https://arxiv.org/abs/1602.00134 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/mmpose/models/backbones/hourglass.py| 
HRNet_MMPose_for_PyTorch/mmpose/models/backbones/hourglass.py | https://arxiv.org/abs/1603.06937 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/body_2d_keypoint/associative_embedding/coco/hrnet_coco.md| HRNet_MMPose_for_PyTorch/mmpose/models/backbones/hourglass_ae.py | https://arxiv.org/abs/1611.05424 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/mmpose/models/backbones/hrformer.py| HRNet_MMPose_for_PyTorch/mmpose/models/backbones/hrformer.py | https://arxiv.org/abs/1907.12273 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/mmpose/models/backbones/litehrnet.py| HRNet_MMPose_for_PyTorch/mmpose/models/backbones/litehrnet.py | https://github.com/HRNet/Lite-HRNet | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/mmpose/models/backbones/hrnet.py| HRNet_MMPose_for_PyTorch/mmpose/models/backbones/hrnet.py | https://arxiv.org/abs/1904.04514 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/mmpose/models/backbones/hrformer.py| HRNet_MMPose_for_PyTorch/mmpose/models/backbones/hrformer.py | https://arxiv.org/abs/2110.09408 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/body_2d_keypoint/topdown_heatmap/coco/litehrnet_coco.md| HRNet_MMPose_for_PyTorch/mmpose/models/backbones/litehrnet.py | https://arxiv.org/abs/2104.06403 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/mmpose/models/backbones/litehrnet.py| HRNet_MMPose_for_PyTorch/mmpose/models/backbones/litehrnet.py | https://github.com/HRNet/Lite-HRNet | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/mmpose/models/backbones/regnet.py| HRNet_MMPose_for_PyTorch/mmpose/models/backbones/regnet.py | https://arxiv.org/abs/2003.13678 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/mmpose/models/backbones/pvt.py| HRNet_MMPose_for_PyTorch/mmpose/models/backbones/pvt.py | https://github.com/open-mmlab/mmcv/pull/1418 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/mmpose/models/backbones/pvt.py| HRNet_MMPose_for_PyTorch/mmpose/models/backbones/pvt.py | https://arxiv.org/pdf/2102.12122.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/docs/en/notes/changelog.md| HRNet_MMPose_for_PyTorch/mmpose/models/backbones/resnest.py | https://arxiv.org/pdf/2004.08955.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/mmpose/models/backbones/pvt.py| HRNet_MMPose_for_PyTorch/mmpose/models/backbones/pvt.py | https://arxiv.org/pdf/2106.13797.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/mmpose/models/backbones/resnext.py| HRNet_MMPose_for_PyTorch/mmpose/models/backbones/resnext.py | https://arxiv.org/abs/1611.05431 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/mmpose/models/backbones/seresnet.py| HRNet_MMPose_for_PyTorch/mmpose/models/backbones/seresnet.py | https://arxiv.org/abs/1709.01507 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/mmpose/models/backbones/scnet.py| HRNet_MMPose_for_PyTorch/mmpose/models/backbones/scnet.py | http://mftp.mmcheng.net/Papers/20cvprSCNet.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/mmpose/models/backbones/resnet.py| HRNet_MMPose_for_PyTorch/mmpose/models/backbones/resnet.py | https://arxiv.org/abs/1512.03385 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/mmpose/models/backbones/seresnet.py| 
HRNet_MMPose_for_PyTorch/mmpose/models/backbones/seresnext.py | https://arxiv.org/abs/1709.01507 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/mmpose/models/backbones/resnet.py| HRNet_MMPose_for_PyTorch/mmpose/models/backbones/resnet.py | https://arxiv.org/pdf/1812.01187.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/body_2d_keypoint/topdown_heatmap/coco/swin_coco.md| HRNet_MMPose_for_PyTorch/mmpose/models/backbones/swin.py | https://arxiv.org/abs/2103.14030 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/mmpose/models/backbones/swin.py| HRNet_MMPose_for_PyTorch/mmpose/models/backbones/swin.py | https://github.com/microsoft/Swin-Transformer | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/mmpose/models/backbones/tcn.py| HRNet_MMPose_for_PyTorch/mmpose/models/backbones/tcn.py | https://arxiv.org/abs/1811.11742 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/mmpose/models/backbones/v2v_net.py| HRNet_MMPose_for_PyTorch/mmpose/models/backbones/v2v_net.py | https://github.com/microsoft/voxelpose-pytorch/blob/main/lib/models/v2v_net.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/body_2d_keypoint/simcc/coco/vipnas_coco.md| HRNet_MMPose_for_PyTorch/mmpose/models/backbones/vipnas_mbv3.py | https://arxiv.org/abs/2105.10154 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/mmpose/models/backbones/v2v_net.py| HRNet_MMPose_for_PyTorch/mmpose/models/backbones/v2v_net.py | https://arxiv.org/abs/1711.07399 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/docs/en/notes/changelog.md| HRNet_MMPose_for_PyTorch/mmpose/models/detectors/associative_embedding.py | https://github.com/open-mmlab/mmpose/pull/382 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/body_2d_keypoint/simcc/coco/vipnas_coco.md| HRNet_MMPose_for_PyTorch/mmpose/models/backbones/vipnas_resnet.py | https://arxiv.org/abs/2105.10154 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/docs/en/notes/changelog.md| HRNet_MMPose_for_PyTorch/mmpose/models/detectors/cid.py | https://github.com/open-mmlab/mmpose/pull/382 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/docs/en/dataset_zoo/3d_body_keypoint.md| HRNet_MMPose_for_PyTorch/mmpose/models/detectors/multiview_pose.py | https://github.com/microsoft/voxelpose- | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/docs/en/notes/changelog.md| HRNet_MMPose_for_PyTorch/mmpose/models/detectors/posewarper.py | https://arxiv.org/abs/1906.04016 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/docs/en/dataset_zoo/3d_body_keypoint.md| HRNet_MMPose_for_PyTorch/mmpose/models/detectors/multiview_pose.py | https://arxiv.org/abs/2004.06239 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/docs/en/notes/changelog.md| HRNet_MMPose_for_PyTorch/mmpose/models/detectors/top_down.py | https://github.com/open-mmlab/mmpose/pull/382 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/docs/en/dataset_zoo/3d_body_keypoint.md| HRNet_MMPose_for_PyTorch/mmpose/models/detectors/multiview_pose.py | https://arxiv.org/abs/2004.06239 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/body_2d_keypoint/simcc/coco/vipnas_coco.md| HRNet_MMPose_for_PyTorch/mmpose/models/heads/vipnas_heatmap_simple_head.py | https://arxiv.org/abs/2105.10154 | 参考论文地址 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmpose/blob/main/mmpose/models/backbones/v2v_net.py| HRNet_MMPose_for_PyTorch/mmpose/models/heads/voxelpose_head.py | https://github.com/microsoft/voxelpose-pytorch/blob/main/lib/models | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/docs/en/dataset_zoo/3d_body_keypoint.md| HRNet_MMPose_for_PyTorch/mmpose/models/heads/voxelpose_head.py | https://arxiv.org/abs/2004.06239 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/docs/en/dataset_zoo/2d_body_keypoint.md| HRNet_MMPose_for_PyTorch/mmpose/models/losses/multi_loss_factory.py | https://github.com/HRNet/HigherHRNet-Human-Pose-Estimation | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/body_2d_keypoint/topdown_regression/coco/mobilenetv2_rle_coco.md| HRNet_MMPose_for_PyTorch/mmpose/models/losses/regression_loss.py | https://arxiv.org/abs/2107.11291 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/mmpose/models/losses/regression_loss.py| HRNet_MMPose_for_PyTorch/mmpose/models/losses/regression_loss.py | https://github.com/Jeff-sjtu/res-loglikelihood-regression | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/docs/en/dataset_zoo/3d_body_mesh.md| HRNet_MMPose_for_PyTorch/mmpose/models/misc/discriminator.py | https://github.com/akanazawa/hmr | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/mmpose/models/necks/fpn.py| HRNet_MMPose_for_PyTorch/mmpose/models/necks/fpn.py | https://arxiv.org/abs/1612.03144 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/docs/en/notes/changelog.md| HRNet_MMPose_for_PyTorch/mmpose/models/necks/posewarper_neck.py | https://arxiv.org/abs/1906.04016 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/mmpose/models/necks/posewarper_neck.py| HRNet_MMPose_for_PyTorch/mmpose/models/necks/posewarper_neck.py | https://github.com/open-mmlab/mmcv/issues/1440 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/mmpose/models/utils/realnvp.py| HRNet_MMPose_for_PyTorch/mmpose/models/utils/realnvp.py | https://arxiv.org/abs/1605.08803 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/mmpose/models/losses/regression_loss.py| HRNet_MMPose_for_PyTorch/mmpose/models/utils/realnvp.py | https://github.com/Jeff-sjtu/res-loglikelihood-regression | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/docs/src/papers/algorithms/dekr.md| HRNet_MMPose_for_PyTorch/mmpose/models/utils/rescore.py | https://github.com/HRNet/DEKR | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/mmpose/models/utils/realnvp.py| HRNet_MMPose_for_PyTorch/mmpose/models/utils/realnvp.py | https://github.com/senya-ashukha/real-nvp-pytorch | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/docs/en/dataset_zoo/3d_body_keypoint.md| HRNet_MMPose_for_PyTorch/mmpose/models/utils/smpl.py | https://github.com/vchoutas/smplx | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/mmpose/models/utils/transformer.py| HRNet_MMPose_for_PyTorch/mmpose/models/utils/transformer.py | https://pytorch.org/docs/stable/generated/torch.nn.Conv2d.html | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/docs/en/dataset_zoo/3d_body_mesh.md | HRNet_MMPose_for_PyTorch/mmpose/core/post_processing/temporal_filters/gaussian_filter.py | https://github.com/akanazawa/human_dynamics/blob/mas | 源码实现 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose_ap10k.md| HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/animal/animal_ap10k_dataset.py | https://arxiv.org/abs/2108.12617 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/docs/en/notes/changelog.md| HRNet_MMPose_for_PyTorch/mmpose/core/post_processing/temporal_filters/smoothnet_filter.py | https://arxiv.org/abs/2112.13715 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/animal/animal_ap10k_dataset.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/docs/en/dataset_zoo/2d_animal_keypoint.md| HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/animal/animal_atrw_dataset.py | https://arxiv.org/abs/1906.05586 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/animal/animal_base_dataset.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/animal/animal_atrw_dataset.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/mmpose/datasets/datasets/animal/fly_dataset.py| HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/animal/animal_fly_dataset.py | https://www.biorxiv.org/content/biorxiv/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/docs/en/notes/changelog.md| HRNet_MMPose_for_PyTorch/mmpose/core/post_processing/temporal_filters/smoothnet_filter.py | https://arxiv.org/abs/2112.13715 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/animal/animal_fly_dataset.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/mmpose/datasets/datasets/animal/horse10_dataset.py| HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/animal/animal_horse10_dataset.py | https://arxiv.org/pdf/1909.11229.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/animal/animal_horse10_dataset.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/animal/animal_locust_dataset.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/mmpose/datasets/datasets/animal/macaque_dataset.py| HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/animal/animal_macaque_dataset.py | https://www.biorxiv.org/content/10.1101/2020.07.30.229989v1 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/mmpose/datasets/datasets/animal/animalpose_dataset.py| HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/animal/animal_pose_dataset.py | https://arxiv.org/abs/1908.05806 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/animal/animal_macaque_dataset.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/animal/animal_pose_dataset.py | 
https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/animal/animal_zebra_dataset.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/base/kpt_2d_sview_rgb_img_bottom_up_dataset.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/base/kpt_2d_sview_rgb_img_top_down_dataset.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/base/kpt_2d_sview_rgb_vid_top_down_dataset.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/base/kpt_3d_mview_rgb_img_direct_dataset.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/base/kpt_3d_sview_kpt_2d_dataset.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/base/kpt_3d_sview_rgb_img_top_down_dataset.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/body3d/body3d_base_dataset.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/mmpose/datasets/datasets/body3d/h36m_dataset.py| HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/body3d/body3d_h36m_dataset.py | http://vision.imar.ro/human3.6m/pami-h36m.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/docs/en/dataset_zoo/3d_body_keypoint.md| HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/body3d/body3d_mview_direct_campus_dataset.py | https://github.com/microsoft/voxelpose-pytorch | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/body3d/body3d_h36m_dataset.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/docs/en/dataset_zoo/3d_body_keypoint.md| HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/body3d/body3d_mview_direct_campus_dataset.py | http://campar.in.tum.de/pub/belagiannis2014cvpr/belagiannis2014cvpr.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/body3d/body3d_mpi_inf_3dhp_dataset.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/body3d/body3d_mview_direct_campus_dataset.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/body3d/body3d_mview_direct_panoptic_dataset.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmpose/blob/main/docs/en/dataset_zoo/3d_body_keypoint.md| HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/body3d/body3d_mview_direct_shelf_dataset.py | https://github.com/microsoft/voxelpose-pytorch | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/docs/en/dataset_zoo/3d_body_keypoint.md| HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/body3d/body3d_mview_direct_shelf_dataset.py | http://campar.in.tum.de/pub/belagiannis2014cvpr/belagiannis2014cvpr.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/body3d/body3d_mview_direct_shelf_dataset.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/body_2d_keypoint/topdown_heatmap/aic/hrnet_aic.md| HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/bottom_up/bottom_up_aic.py | https://arxiv.org/abs/1711.06475 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/bottom_up/bottom_up_aic.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/bottom_up/bottom_up_base_dataset.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/mmpose/datasets/datasets/face/coco_wholebody_face_dataset.py| HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/bottom_up/bottom_up_coco_wholebody.py | https://arxiv.org/abs/2007.11858 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/docs/en/dataset_zoo/2d_body_keypoint.md| HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/bottom_up/bottom_up_crowdpose.py | https://arxiv.org/abs/1812.00324 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/bottom_up/bottom_up_coco.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/bottom_up/bottom_up_coco_wholebody.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/bottom_up/bottom_up_crowdpose.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/mmpose/datasets/datasets/body/mhp_dataset.py| HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/bottom_up/bottom_up_mhp.py | https://arxiv.org/abs/1804.03287 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/bottom_up/bottom_up_mhp.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/face_2d_keypoint/rtmpose/face6/rtmpose_face6.md| HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/face/face_300w_dataset.py | https://ibug.doc.ic.ac.uk/resources/300-W/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/face/face_base_dataset.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/_base_/datasets/aflw.py| 
HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/face/face_aflw_dataset.py | https://www.tugraz.at/institute/icg/research | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/face/face_300w_dataset.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/face/face_aflw_dataset.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/mmpose/datasets/datasets/face/coco_wholebody_face_dataset.py| HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/face/face_coco_wholebody_dataset.py | https://arxiv.org/abs/2007.11858 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/face_2d_keypoint/rtmpose/face6/rtmpose_face6.md| HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/face/face_cofw_dataset.py | http://www.vision.caltech.edu/xpburgos/ICCV13/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/face/face_coco_wholebody_dataset.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/face/face_cofw_dataset.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/face_2d_keypoint/rtmpose/face6/rtmpose_face6.md| HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/face/face_wflw_dataset.py | https://wywu.github.io/projects/LAB/WFLW.html | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/face/face_wflw_dataset.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/fashion/fashion_base_dataset.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/fashion/deepfashion_dataset.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/fashion/deepfashion_dataset.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/gesture/nvgesture_dataset.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/hand/hand_base_dataset.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/mmpose/datasets/datasets/hand/freihand_dataset.py| HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/hand/freihand_dataset.py | https://arxiv.org/pdf/1909.04349.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/hand/freihand_dataset.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/mmpose/datasets/datasets/face/coco_wholebody_face_dataset.py| 
HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/hand/hand_coco_wholebody_dataset.py | https://arxiv.org/abs/2007.11858 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/mmpose/datasets/datasets/hand/interhand2d_double_dataset.py| HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/hand/interhand2d_dataset.py | https://arxiv.org/pdf/2008.09309.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/mmpose/datasets/datasets/hand/interhand2d_double_dataset.py| HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/hand/interhand3d_dataset.py | https://arxiv.org/pdf/2008.09309.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/hand/hand_coco_wholebody_dataset.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/hand/interhand2d_dataset.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/hand/interhand3d_dataset.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/docs/en/notes/changelog.md | HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/hand/interhand2d_dataset.py | https://github.com/facebookresearch/InterHand2.6M/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/docs/en/notes/changelog.md | HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/hand/interhand3d_dataset.py | https://github.com/facebookresearch/InterHand2.6M/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/mmpose/datasets/datasets/hand/onehand10k_dataset.py| HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/hand/onehand10k_dataset.py | https://www.yangangwang.com/papers/WANG-MCC-2018-10.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/docs/en/dataset_zoo/2d_hand_keypoint.md| HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/hand/panoptic_hand2d_dataset.py | https://arxiv.org/abs/1704.07809 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/hand/onehand10k_dataset.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/hand/panoptic_hand2d_dataset.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/mmpose/datasets/datasets/hand/rhd2d_dataset.py| HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/hand/rhd2d_dataset.py | https://arxiv.org/pdf/1705.01389.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/hand/rhd2d_dataset.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/body_2d_keypoint/topdown_heatmap/aic/hrnet_aic.md| HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/top_down/topdown_aic_dataset.py | https://arxiv.org/abs/1711.06475 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/top_down/topdown_base_dataset.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/top_down/topdown_aic_dataset.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/mmpose/datasets/datasets/body/coco_dataset.py| HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/top_down/topdown_coco_dataset.py | https://arxiv.org/abs/1405.0312 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/mmpose/datasets/datasets/face/coco_wholebody_face_dataset.py| HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/top_down/topdown_coco_wholebody_dataset.py | https://arxiv.org/abs/2007.11858 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/docs/en/dataset_zoo/2d_body_keypoint.md| HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/top_down/topdown_crowdpose_dataset.py | https://arxiv.org/abs/1812.00324 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/top_down/topdown_coco_wholebody_dataset.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/top_down/topdown_coco_dataset.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/top_down/topdown_crowdpose_dataset.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/mmpose/datasets/datasets/body3d/h36m_dataset.py| HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/top_down/topdown_h36m_dataset.py | http://vision.imar.ro/human3.6m/pami-h36m.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/top_down/topdown_h36m_dataset.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/body_2d_keypoint/rtmpose/body8/rtmpose_body8-halpe26.md| HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/top_down/topdown_halpe_dataset.py | https://github.com/Fang-Haoshu/Halpe-FullBody | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/top_down/topdown_halpe_dataset.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/mmpose/datasets/datasets/body/jhmdb_dataset.py| HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/top_down/topdown_jhmdb_dataset.py | https://openaccess.thecvf.com/content_iccv_2013/papers/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/top_down/topdown_jhmdb_dataset.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/mmpose/datasets/datasets/body/mhp_dataset.py| HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/top_down/topdown_mhp_dataset.py | https://arxiv.org/abs/1804.03287 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/mmpose/datasets/datasets/body/mpii_dataset.py| HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/top_down/topdown_mpii_dataset.py | http://human-pose.mpi-inf.mpg.de/contents/andriluka14cvpr.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | 
HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/top_down/topdown_mhp_dataset.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/top_down/topdown_mpii_dataset.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/mmpose/evaluation/functional/nms.py| HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/top_down/topdown_mpii_dataset.py | https://github.com/leoxiaobin/deep-high-resolution-net.pytorch | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/mmpose/datasets/datasets/body/mpii_trb_dataset.py| HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/top_down/topdown_mpii_trb_dataset.py | https://arxiv.org/abs/1910.11535 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/mmpose/datasets/datasets/body/ochuman_dataset.py| HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/top_down/topdown_ochuman_dataset.py | https://arxiv.org/abs/1803.10683 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/top_down/topdown_mpii_trb_dataset.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/top_down/topdown_ochuman_dataset.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/mmpose/datasets/datasets/body/posetrack18_dataset.py| HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/top_down/topdown_posetrack18_dataset.py | https://arxiv.org/abs/1710.10000 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/mmpose/datasets/datasets/body/posetrack18_dataset.py| HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/top_down/topdown_posetrack18_video_dataset.py | https://arxiv.org/abs/1710.10000 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/datasets/datasets/top_down/topdown_posetrack18_dataset.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/animalpose/hrnet_w32_animalpose_256x256.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/animalpose/hrnet_w48_animalpose_256x256.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/ap10k/hrnet_w32_ap10k_256x256.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/ap10k/hrnet_w48_ap10k_256x256.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/atrw/hrnet_w32_atrw_256x256.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/atrw/hrnet_w48_atrw_256x256.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/horse10/hrnet_w32_horse10_256x256-split1.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/horse10/hrnet_w32_horse10_256x256-split2.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/horse10/hrnet_w32_horse10_256x256-split3.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/horse10/hrnet_w48_horse10_256x256-split1.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/horse10/hrnet_w48_horse10_256x256-split2.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/horse10/hrnet_w48_horse10_256x256-split3.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/macaque/hrnet_w32_macaque_256x192.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/macaque/hrnet_w48_macaque_256x192.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/aic/higherhrnet_w32_aic_512x512.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| 
HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/aic/higherhrnet_w32_aic_512x512_udp.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/aic/hrnet_w32_aic_512x512.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/coco/higherhrnet_w32_coco_512x512.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/coco/higherhrnet_w32_coco_512x512_udp.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/coco/higherhrnet_w32_coco_640x640.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/coco/higherhrnet_w32_coco_640x640_udp.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/coco/higherhrnet_w48_coco_512x512.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/coco/higherhrnet_w48_coco_512x512_udp.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/coco/hrnet_w32_coco_512x512.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/coco/hrnet_w32_coco_512x512_udp.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/coco/hrnet_w32_coco_640x640.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/coco/hrnet_w32_coco_640x640_udp.py | https://download.openmmlab.com/mmpose/ | 
模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/coco/hrnet_w48_coco_512x512.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/coco/hrnet_w48_coco_512x512_udp.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/coco/hrnet_w48_coco_640x640.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/coco/hrnet_w48_coco_640x640_udp.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/crowdpose/higherhrnet_w32_crowdpose_512x512.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/crowdpose/higherhrnet_w32_crowdpose_512x512_udp.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/crowdpose/higherhrnet_w32_crowdpose_640x640.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/crowdpose/higherhrnet_w32_crowdpose_640x640_udp.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/crowdpose/higherhrnet_w48_crowdpose_512x512.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/crowdpose/higherhrnet_w48_crowdpose_512x512_udp.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/mhp/hrnet_w48_mhp_512x512.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/cid/coco/hrnet_w32_coco_512x512.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/cid/coco/hrnet_w48_coco_512x512.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/dekr/coco/hrnet_w32_coco_512x512.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/dekr/coco/hrnet_w32_coco_512x512.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/dekr/coco/hrnet_w48_coco_640x640.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/dekr/coco/hrnet_w48_coco_640x640.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/dekr/crowdpose/hrnet_w32_crowdpose_512x512.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/dekr/crowdpose/hrnet_w32_crowdpose_512x512.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/dekr/crowdpose/hrnet_w48_crowdpose_640x640.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/dekr/crowdpose/hrnet_w48_crowdpose_640x640.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/aic/hrnet_w32_aic_256x192.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/aic/hrnet_w32_aic_384x288.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/aic/hrnet_w48_aic_256x192.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/aic/hrnet_w48_aic_384x288.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrformer_base_coco_256x192.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrformer_base_coco_384x288.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrnetv2_w64_coco_384x288_udp.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrformer_small_coco_256x192.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrformer_small_coco_384x288.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrnet_w32_coco_256x192.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/body_2d_keypoint/topdown_heatmap/aic/hrnet_aic.md| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrnet_w32_coco_256x192_coarsedropout.py | https://download.openmmlab.com/mmpose/top_down/hrnet/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrnet_w32_coco_256x192_dark.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/body_2d_keypoint/topdown_heatmap/aic/hrnet_aic.md| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrnet_w32_coco_256x192_gridmask.py | https://download.openmmlab.com/mmpose/top_down/hrnet/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/body_2d_keypoint/topdown_heatmap/aic/hrnet_aic.md| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrnet_w32_coco_256x192_photometric.py | 
https://download.openmmlab.com/mmpose/top_down/hrnet/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrnet_w32_coco_256x192_udp.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrnet_w32_coco_256x192_udp_regress.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrnet_w32_coco_384x288.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrnet_w32_coco_384x288_dark.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrnet_w32_coco_384x288_udp.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrnet_w48_coco_256x192.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrnet_w48_coco_256x192_dark.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrnet_w48_coco_256x192_udp.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrnet_w48_coco_384x288.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrnet_w48_coco_384x288_dark.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrnet_w48_coco_384x288_udp.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/body_2d_keypoint/topdown_heatmap/coco/td-hm_pvt-s_8xb64-210e_coco-256x192.py| 
HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/pvt-s_coco_256x192.py | https://github.com/whai362/PVT/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/body_2d_keypoint/topdown_heatmap/coco/td-hm_pvt-s_8xb64-210e_coco-256x192.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/pvtv2-b2_coco_256x192.py | https://github.com/whai362/PVT/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/scnet101_coco_256x192.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/scnet101_coco_384x288.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/scnet50_coco_256x192.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/scnet50_coco_384x288.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/body_2d_keypoint/topdown_heatmap/coco/td-hm_swin-b-p4-w7_8xb32-210e_coco-256x192.py | HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/swin_b_p4_w7_coco_256x192.py | https://github.com/SwinTransformer/storage/releases/download | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/body_2d_keypoint/topdown_heatmap/coco/td-hm_swin-b-p4-w7_8xb32-210e_coco-256x192.py | HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/swin_b_p4_w7_coco_384x288.py | https://github.com/SwinTransformer/storage/releases/download | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/body_2d_keypoint/topdown_heatmap/coco/td-hm_swin-b-p4-w7_8xb32-210e_coco-256x192.py | HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/swin_b_p4_w7_fpn_coco_256x192.py | https://github.com/SwinTransformer/storage/releases/download | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/body_2d_keypoint/topdown_heatmap/coco/td-hm_swin-b-p4-w7_8xb32-210e_coco-256x192.py | HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/swin_b_p4_w7_fpn_coco_384x288.py | https://github.com/SwinTransformer/storage/releases/download | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/body_2d_keypoint/topdown_heatmap/coco/td-hm_swin-b-p4-w7_8xb32-210e_coco-256x192.py | HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/swin_l_p4_w7_coco_384x288.py | https://github.com/SwinTransformer/storage/releases/download | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/body_2d_keypoint/topdown_heatmap/coco/td-hm_swin-b-p4-w7_8xb32-210e_coco-256x192.py | 
HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/swin_l_p4_w7_coco_256x192.py | https://github.com/SwinTransformer/storage/releases/download | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/body_2d_keypoint/topdown_heatmap/coco/td-hm_swin-b-p4-w7_8xb32-210e_coco-256x192.py | HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/swin_t_p4_w7_coco_256x192.py | https://github.com/SwinTransformer/storage/releases/download | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/crowdpose/hrnet_w32_crowdpose_256x192.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/crowdpose/hrnet_w32_crowdpose_384x288.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/crowdpose/hrnet_w48_crowdpose_256x192.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/crowdpose/hrnet_w48_crowdpose_384x288.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/h36m/hrnet_w32_h36m_256x256.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/h36m/hrnet_w48_h36m_256x256.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/body_2d_keypoint/topdown_heatmap/mpii/cpm_mpii.md| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/jhmdb/cpm_jhmdb_sub1_368x368.py | https://download.openmmlab.com/mmpose/top_down/cpm/cpm_mpii_368x368-116e62b8_20200822.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/body_2d_keypoint/topdown_heatmap/mpii/cpm_mpii.md| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/jhmdb/cpm_jhmdb_sub2_368x368.py | https://download.openmmlab.com/mmpose/top_down/cpm/cpm_mpii_368x368-116e62b8_20200822.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/body_2d_keypoint/topdown_heatmap/mpii/cpm_mpii.md| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/jhmdb/cpm_jhmdb_sub3_368x368.py | https://download.openmmlab.com/mmpose/top_down/cpm/cpm_mpii_368x368-116e62b8_20200822.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/body_2d_keypoint/topdown_heatmap/jhmdb/td-hm_res50-2deconv_8xb64-40e_jhmdb-sub1-256x256.py| 
HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/jhmdb/res50_2deconv_jhmdb_sub1_256x256.py | https://download.openmmlab.com/mmpose/top_down/resnet/res50_mpii_256x256-418ffc88_20200812.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/body_2d_keypoint/topdown_heatmap/jhmdb/td-hm_res50-2deconv_8xb64-40e_jhmdb-sub1-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/jhmdb/res50_2deconv_jhmdb_sub2_256x256.py | https://download.openmmlab.com/mmpose/top_down/resnet/res50_mpii_256x256-418ffc88_20200812.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/body_2d_keypoint/topdown_heatmap/jhmdb/td-hm_res50-2deconv_8xb64-40e_jhmdb-sub1-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/jhmdb/res50_2deconv_jhmdb_sub3_256x256.py | https://download.openmmlab.com/mmpose/top_down/resnet/res50_mpii_256x256-418ffc88_20200812.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/body_2d_keypoint/topdown_heatmap/jhmdb/td-hm_res50-2deconv_8xb64-40e_jhmdb-sub1-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/jhmdb/res50_jhmdb_sub1_256x256.py | https://download.openmmlab.com/mmpose/top_down/resnet/res50_mpii_256x256-418ffc88_20200812.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/body_2d_keypoint/topdown_heatmap/jhmdb/td-hm_res50-2deconv_8xb64-40e_jhmdb-sub1-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/jhmdb/res50_jhmdb_sub2_256x256.py | https://download.openmmlab.com/mmpose/top_down/resnet/res50_mpii_256x256-418ffc88_20200812.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/body_2d_keypoint/topdown_heatmap/jhmdb/td-hm_res50-2deconv_8xb64-40e_jhmdb-sub1-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/jhmdb/res50_jhmdb_sub3_256x256.py | https://download.openmmlab.com/mmpose/top_down/resnet/res50_mpii_256x256-418ffc88_20200812.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/mpii/hrnet_w32_mpii_256x256.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/mpii/hrnet_w32_mpii_256x256_dark.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/mpii/hrnet_w32_mpii_256x256_udp.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/mpii/hrnet_w48_mpii_256x256.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| 
HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/mpii/hrnet_w48_mpii_256x256_dark.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/mpii/hrnet_w48_mpii_256x256_udp.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/mpii/scnet101_mpii_256x256.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/mpii/scnet50_mpii_256x256.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/ochuman/hrnet_w32_ochuman_256x192.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/ochuman/hrnet_w32_ochuman_384x288.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/ochuman/hrnet_w48_ochuman_256x192.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/ochuman/hrnet_w48_ochuman_384x288.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/demo/MMPose_Tutorial.ipynb| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/posetrack18/hrnet_w32_posetrack18_256x192.py | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w32_coco_256x192-c78dce93_20200708.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/posetrack18/hrnet_w32_posetrack18_384x288.py | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w32_coco_384x288-d9f0d786_20200708.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/demo/docs/en/2d_human_pose_demo.md| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/posetrack18/hrnet_w48_posetrack18_256x192.py | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w48_coco_256x192-b9e0b3ab_20200708.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/posetrack18/hrnet_w48_posetrack18_384x288.py | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w48_coco_384x288-314c8528_20200708.pth | 模型相关说明 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmpose/blob/main/configs/body_2d_keypoint/topdown_heatmap/posetrack18/td-hm_res50_8xb64-20e_posetrack18-256x192.py| HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/posetrack18/res50_posetrack18_256x192.py | https://download.openmmlab.com/mmpose/top_down/resnet/res50_coco_256x192-ec54d7f3_20200709.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_vid/posewarper/posetrack18/hrnet_w48_posetrack18_384x288_posewarper_stage1.py | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w48_coco_384x288-314c8528_20200708.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_vid/posewarper/posetrack18/hrnet_w48_posetrack18_384x288_posewarper_stage2.py | https://download.openmmlab.com/mmpose/top_down/posewarper/hrnet_w48_posetrack18_384x288_posewarper_stage1-08b632aa_20211130.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/face/2d_kpt_sview_rgb_img/topdown_heatmap/coco_wholebody_face/scnet50_coco_wholebody_face_256x256.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/fashion/2d_kpt_sview_rgb_img/topdown_heatmap/deepfashion/hrnet_w32_deepfashion_full_256x192.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/fashion/2d_kpt_sview_rgb_img/topdown_heatmap/deepfashion/hrnet_w32_deepfashion_full_256x192_udp.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/fashion/2d_kpt_sview_rgb_img/topdown_heatmap/deepfashion/hrnet_w32_deepfashion_lower_256x192.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/fashion/2d_kpt_sview_rgb_img/topdown_heatmap/deepfashion/hrnet_w32_deepfashion_lower_256x192_udp.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/fashion/2d_kpt_sview_rgb_img/topdown_heatmap/deepfashion/hrnet_w32_deepfashion_upper_256x192.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/fashion/2d_kpt_sview_rgb_img/topdown_heatmap/deepfashion/hrnet_w32_deepfashion_upper_256x192_udp.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/fashion/2d_kpt_sview_rgb_img/topdown_heatmap/deepfashion/hrnet_w48_deepfashion_full_256x192.py | 
https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/fashion/2d_kpt_sview_rgb_img/topdown_heatmap/deepfashion/hrnet_w48_deepfashion_full_256x192_udp.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/fashion/2d_kpt_sview_rgb_img/topdown_heatmap/deepfashion/hrnet_w48_deepfashion_lower_256x192.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/fashion/2d_kpt_sview_rgb_img/topdown_heatmap/deepfashion/hrnet_w48_deepfashion_lower_256x192_udp.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/fashion/2d_kpt_sview_rgb_img/topdown_heatmap/deepfashion/hrnet_w48_deepfashion_upper_256x192.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/fashion/2d_kpt_sview_rgb_img/topdown_heatmap/deepfashion/hrnet_w48_deepfashion_upper_256x192_udp.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/coco_wholebody_hand/scnet50_coco_wholebody_hand_256x256.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/associative_embedding/coco-wholebody/higherhrnet_w32_coco_wholebody_512x512.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/associative_embedding/coco-wholebody/higherhrnet_w32_coco_wholebody_640x640.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/associative_embedding/coco-wholebody/higherhrnet_w48_coco_wholebody_512x512.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/associative_embedding/coco-wholebody/higherhrnet_w48_coco_wholebody_640x640.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| 
HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/associative_embedding/coco-wholebody/hrnet_w32_coco_wholebody_512x512.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/associative_embedding/coco-wholebody/hrnet_w32_coco_wholebody_640x640.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/associative_embedding/coco-wholebody/hrnet_w48_coco_wholebody_512x512.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/associative_embedding/coco-wholebody/hrnet_w48_coco_wholebody_640x640.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/topdown_heatmap/coco-wholebody/hrnet_w32_coco_wholebody_256x192.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/topdown_heatmap/coco-wholebody/hrnet_w32_coco_wholebody_256x192_dark.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/topdown_heatmap/coco-wholebody/hrnet_w32_coco_wholebody_384x288.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/topdown_heatmap/coco-wholebody/hrnet_w32_coco_wholebody_384x288_dark.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/topdown_heatmap/coco-wholebody/hrnet_w48_coco_wholebody_256x192.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/topdown_heatmap/coco-wholebody/hrnet_w48_coco_wholebody_256x192_dark.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/topdown_heatmap/coco-wholebody/hrnet_w48_coco_wholebody_384x288.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/topdown_heatmap/coco-wholebody/hrnet_w48_coco_wholebody_384x288_dark.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/topdown_heatmap/coco-wholebody/hrnet_w48_coco_wholebody_384x288_dark_plus.py | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w48_coco_384x288_dark-741844ba_20200812.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/topdown_heatmap/coco-wholebody/tcformer_coco_wholebody_256x192.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/topdown_heatmap/halpe/hrnet_w32_halpe_256x192.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose_ap10k.md | HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/topdown_heatmap/halpe/hrnet_w48_halpe_384x288_dark_plus.py | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w48_coco_384x288_dark-741844ba_20200812.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/demo/docs/en/2d_animal_demo.md| HRNet_MMPose_for_PyTorch/mmpose/apis/webcam/nodes/model_nodes/detector_node.py | https://mmdetection.readthedocs.io/en | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/mmpose/apis/webcam/nodes/model_nodes/detector_node.py | https://download.openmmlab.com | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/mmpose/apis/webcam/nodes/model_nodes/pose_estimator_node.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/mmpose/apis/webcam/nodes/model_nodes/hand_gesture_node.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/demo/docs/en/2d_animal_demo.md| HRNet_MMPose_for_PyTorch/mmpose/apis/webcam/nodes/model_nodes/pose_tracker_node.py | https://mmdetection.readthedocs.io/en | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/mmpose/apis/webcam/nodes/model_nodes/pose_tracker_node.py | https://download.openmmlab.com | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/mmpose/apis/webcam/nodes/model_nodes/pose_tracker_node.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/topdown_heatmap/README.md| 
HRNet_MMPose_for_PyTorch/mmpose/apis/webcam/nodes/visualizer_nodes/sunglasses_effect_node.py | https://user-images.githubusercontent.com/15977946/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/setup.py| HRNet_MMPose_for_PyTorch/setup.py | http://setuptools.readthedocs.io/en/latest/setuptools.html#declaring-platform-specific-dependencies | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/setup.py| HRNet_MMPose_for_PyTorch/setup.py | openmmlab@gmail.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/.git/config| HRNet_MMPose_for_PyTorch/setup.py | https://github.com/open-mmlab/mmpose | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/demo/body3d_multiview_detect_and_regress_img_demo.py | https://download.openmmlab.com/mmpose/demo/panoptic_body3d_demo.tar | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/demo/bottom_up_img_demo.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/demo/body3d_two_stage_img_demo.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/demo/body3d_two_stage_img_demo.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/demo/bottom_up_pose_tracking_demo.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/demo/bottom_up_video_demo.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/demo/face_img_demo.py | https://github.com/ageitgey/face_recognition | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/demo/body3d_two_stage_video_demo.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/demo/face_img_demo.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/demo/face_video_demo.py | https://github.com/ageitgey/face_recognition | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/demo/body3d_two_stage_video_demo.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/demo/face_video_demo.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/demo/top_down_img_demo.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/demo/top_down_img_demo_with_mmdet.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/demo/top_down_pose_tracking_demo_with_mmdet.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | 
HRNet_MMPose_for_PyTorch/demo/top_down_pose_tracking_demo_with_mmtracking.py | https://github.com/open-mmlab/mmtracking/pull/300 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/demo/top_down_video_demo_full_frame_without_det.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/demo/top_down_pose_tracking_demo_with_mmtracking.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/demo/top_down_video_demo_full_frame_without_det_gpuaccel.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/.circleci/docker/Dockerfile| HRNet_MMPose_for_PyTorch/docker/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64/3bf863cc.pub | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/.circleci/docker/Dockerfile| HRNet_MMPose_for_PyTorch/docker/Dockerfile | https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1804/x86_64/7fa2af80.pub | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/demo/top_down_video_demo_with_mmdet.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/docs/en/notes/changelog.md| HRNet_MMPose_for_PyTorch/mmpose/deprecated.py | https://github.com/open-mmlab/mmpose/pull/202 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose_ap10k.md | HRNet_MMPose_for_PyTorch/docker/Dockerfile | https://download.openmmlab.com/mmcv/dist/cu101/torch1.6.0/index.html | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/.git/config| HRNet_MMPose_for_PyTorch/docker/Dockerfile | https://github.com/open-mmlab/mmpose.git | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/docs/en/notes/changelog.md| HRNet_MMPose_for_PyTorch/mmpose/deprecated.py | https://github.com/open-mmlab/mmpose/pull/202 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/docs/en/notes/changelog.md| HRNet_MMPose_for_PyTorch/mmpose/deprecated.py | https://github.com/open-mmlab/mmpose/pull/202 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/.circleci/test.yml| HRNet_MMPose_for_PyTorch/mmpose/deprecated.py | https://github.com/open- | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/deprecated.py | https://github.com/open-mmlab/mmpose/pull/656 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/deprecated.py | https://github.com/open-mmlab/mmpose/pull/656 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/deprecated.py | https://github.com/open-mmlab/mmpose/pull/656 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/deprecated.py | https://github.com/open-mmlab/mmpose/pull/656 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/deprecated.py | https://github.com/open-mmlab/mmpose/pull/656 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | 
HRNet_MMPose_for_PyTorch/mmpose/deprecated.py | https://github.com/open-mmlab/mmpose/pull/656 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/deprecated.py | https://github.com/open-mmlab/mmpose/pull/656 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/deprecated.py | https://github.com/open-mmlab/mmpose/pull/656 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/deprecated.py | https://github.com/open-mmlab/mmpose/pull/656 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/deprecated.py | https://github.com/open-mmlab/mmpose/pull/656 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/deprecated.py | https://github.com/open-mmlab/mmpose/pull/656 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/deprecated.py | https://github.com/open-mmlab/mmpose/pull/656 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/deprecated.py | https://github.com/open-mmlab/mmpose/pull/656 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/deprecated.py | https://github.com/open-mmlab/mmpose/pull/656 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/deprecated.py | https://github.com/open-mmlab/mmpose/pull/656 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/.circleci/test.yml| HRNet_MMPose_for_PyTorch/.dev_scripts/develop/create_ceph_configs.py | https://download | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/.github/pull_request_template.md| HRNet_MMPose_for_PyTorch/.dev_scripts/github/update_model_index.py | https://github.com/open-mmlab/mmpose/blob/master/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/demo/mmtracking_cfg/deepsort_faster-rcnn_fpn_4e_mot17-private-half.py| HRNet_MMPose_for_PyTorch/demo/mmtracking_cfg/deepsort_faster-rcnn_fpn_4e_mot17-private-half.py | https://download.openmmlab.com/mmtracking/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/topdown_heatmap/ap10k/cspnext-m_udp_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/demo/mmdetection_cfg/ssdlite_mobilenetv2_scratch_600e_onehand.py | https://download.openmmlab.com/mmdetection/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/demo/mmtracking_cfg/deepsort_faster-rcnn_fpn_4e_mot17-private-half.py| HRNet_MMPose_for_PyTorch/demo/mmtracking_cfg/deepsort_faster-rcnn_fpn_4e_mot17-private-half.py | https://download.openmmlab.com/mmtracking/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/demo/webcam_cfg/gesture_recognition.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/demo/webcam_cfg/gesture_recognition.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmpose/blob/main/demo/mmtracking_cfg/deepsort_faster-rcnn_fpn_4e_mot17-private-half.py| HRNet_MMPose_for_PyTorch/demo/mmtracking_cfg/tracktor_faster-rcnn_r50_fpn_4e_mot17-private.py | https://download.openmmlab.com/mmtracking/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/demo/mmtracking_cfg/tracktor_faster-rcnn_r50_fpn_4e_mot17-private.py| HRNet_MMPose_for_PyTorch/demo/mmtracking_cfg/tracktor_faster-rcnn_r50_fpn_4e_mot17-private.py | https://download.openmmlab.com/mmtracking/mot/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/demo/webcam_cfg/pose_estimation.py | https://download.openmmlab.com | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/body_2d_keypoint/topdown_heatmap/aic/hrnet_aic.md| HRNet_MMPose_for_PyTorch/demo/webcam_cfg/pose_estimation.py | https://download.openmmlab.com/mmpose/top_down/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/topdown_heatmap/animalpose/hrnet_animalpose.md| HRNet_MMPose_for_PyTorch/demo/webcam_cfg/pose_estimation.py | https://download.openmmlab.com/mmpose/animal/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/demo/webcam_cfg/pose_tracking.py | https://download.openmmlab.com | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/demo/webcam_cfg/pose_tracking.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/demo/MMPose_Tutorial.ipynb| HRNet_MMPose_for_PyTorch/docker/serve/Dockerfile | https://download.openmmlab.com/mmcv/dist/cu | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/demo/MMPose_Tutorial.ipynb| HRNet_MMPose_for_PyTorch/docker/serve/Dockerfile_mmcls | https://download.openmmlab.com/mmcv/dist/cu | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/docs/en/conf.py| HRNet_MMPose_for_PyTorch/docs/en/conf.py | https://www.sphinx-doc.org/en/master/usage/configuration.html | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/demo/MMPose_Tutorial.ipynb| HRNet_MMPose_for_PyTorch/docs/en/conf.py | https://colab.research.google.com/github/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/.git/config| HRNet_MMPose_for_PyTorch/docs/en/conf.py | https://github.com/open-mmlab/mmpose | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/docs/en/merge_docs.sh | https://github.com/open-mmlab/mmpose/tree/master/=g | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/docs/en/merge_docs.sh | https://github.com/open-mmlab/mmpose/tree/master/=g | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/docs/en/merge_docs.sh | https://github.com/open-mmlab/mmpose/tree/master/=g | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/docs/en/merge_docs.sh | https://github.com/open-mmlab/mmpose/tree/master/=g | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/docs/en/merge_docs.sh | 
https://github.com/open-mmlab/mmpose/tree/master/=g | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/docs/en/merge_docs.sh | https://github.com/open-mmlab/mmpose/tree/master/=g | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/docs/en/merge_docs.sh | https://github.com/open-mmlab/mmpose/tree/master/=g | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/docs/en/merge_docs.sh | https://github.com/open-mmlab/mmpose/tree/master/=g | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/docs/en/merge_docs.sh | https://github.com/open-mmlab/mmpose/tree/master/=g | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/docs/en/merge_docs.sh | https://github.com/open-mmlab/mmpose/tree/master/=g | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/docs/en/merge_docs.sh | https://github.com/open-mmlab/mmpose/tree/master/=g | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/.circleci/test.yml| HRNet_MMPose_for_PyTorch/docs/en/stats.py | https://download | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/.circleci/test.yml| HRNet_MMPose_for_PyTorch/docs/en/stats.py | https://download | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/docs/en/conf.py| HRNet_MMPose_for_PyTorch/docs/zh_cn/conf.py | https://www.sphinx-doc.org/en/master/usage/configuration.html | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/demo/MMPose_Tutorial.ipynb| HRNet_MMPose_for_PyTorch/docs/zh_cn/conf.py | https://colab.research.google.com/github/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/.git/config| HRNet_MMPose_for_PyTorch/docs/zh_cn/conf.py | https://github.com/open-mmlab/mmpose | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/docs/zh_cn/merge_docs.sh | https://github.com/open-mmlab/mmpose/tree/master/=g | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/docs/zh_cn/merge_docs.sh | https://github.com/open-mmlab/mmpose/tree/master/=g | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/docs/zh_cn/merge_docs.sh | https://github.com/open-mmlab/mmpose/tree/master/=g | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/docs/zh_cn/merge_docs.sh | https://github.com/open-mmlab/mmpose/tree/master/=g | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/docs/zh_cn/merge_docs.sh | https://github.com/open-mmlab/mmpose/tree/master/=g | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/docs/zh_cn/merge_docs.sh | https://github.com/open-mmlab/mmpose/tree/master/=g | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/docs/zh_cn/merge_docs.sh | https://github.com/open-mmlab/mmpose/tree/master/=g | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/docs/zh_cn/merge_docs.sh | https://github.com/open-mmlab/mmpose/tree/master/=g | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | 
HRNet_MMPose_for_PyTorch/docs/zh_cn/merge_docs.sh | https://github.com/open-mmlab/mmpose/tree/master/=g | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/docs/zh_cn/merge_docs.sh | https://github.com/open-mmlab/mmpose/tree/master/=g | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/docs/zh_cn/merge_docs.sh | https://github.com/open-mmlab/mmpose/tree/master/=g | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/docs/en/make.bat| HRNet_MMPose_for_PyTorch/docs/en/make.bat | http://sphinx-doc.org/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/apis/inference.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/apis/inference.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/apis/inference_3d.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/apis/inference.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/apis/inference_3d.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/apis/inference.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/apis/inference_tracking.py | https://github.com/open-mmlab/mmpose/pull/663 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/docs/en/make.bat| HRNet_MMPose_for_PyTorch/docs/zh_cn/make.bat | http://sphinx-doc.org/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/README.md | HRNet_MMPose_for_PyTorch/mmpose/apis/train.py | https://github.com/open-mmlab/mmdetection/issues/6339 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/docs/en/notes/changelog.md| HRNet_MMPose_for_PyTorch/mmpose/apis/train.py | https://github.com/open-mmlab/mmpose/pull/1157 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/mmpose/datasets/builder.py| HRNet_MMPose_for_PyTorch/mmpose/datasets/builder.py | https://github.com/pytorch/pytorch/issues/973 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/mmpose/utils/hooks.py| HRNet_MMPose_for_PyTorch/mmpose/utils/hooks.py | https://stackoverflow.com/questions/31174295/getattr-and-setattr-on-nested-objects | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/mmpose/utils/setup_env.py| HRNet_MMPose_for_PyTorch/mmpose/utils/setup_env.py | https://github.com/pytorch/pytorch/blob/master/torch/distributed/run.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/body_2d_keypoint/topdown_heatmap/coco/td-hm_pvt-s_8xb64-210e_coco-256x192.py| HRNet_MMPose_for_PyTorch/tests/test_backbones/test_pvt.py | https://github.com/whai362/PVT/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/body_2d_keypoint/topdown_heatmap/coco/td-hm_pvt-s_8xb64-210e_coco-256x192.py| HRNet_MMPose_for_PyTorch/tests/test_backbones/test_pvt.py | 
https://github.com/whai362/PVT/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/tests/test_backbones/test_tcformer.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/_base_/datasets/fly.py| HRNet_MMPose_for_PyTorch/tests/test_datasets/test_dataset_info.py | https://github.com/jgraving/DeepPoseKit-Data | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/animal_2d_keypoint/rtmpose/ap10k/rtmpose-m_8xb64-210e_ap10k-256x256.py| HRNet_MMPose_for_PyTorch/tests/test_models/test_bottom_up_forward.py | https://download.openmmlab.com/mmpose/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/mmpose/models/necks/posewarper_neck.py| HRNet_MMPose_for_PyTorch/tests/test_necks/test_posewarper_neck.py | https://github.com/open-mmlab/mmcv/issues/1440 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/tools/dataset_converters/preprocess_h36m.py| HRNet_MMPose_for_PyTorch/tools/dataset/preprocess_h36m.py | https://github.com/anibali/h36m-fetch | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/docs/en/notes/changelog.md| HRNet_MMPose_for_PyTorch/tools/deployment/mmpose2torchserve.py | https://github.com/open-mmlab/mmdeploy | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/docs/en/notes/changelog.md| HRNet_MMPose_for_PyTorch/tools/deployment/test_torchserver.py | https://github.com/open-mmlab/mmdeploy | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/docs/en/notes/changelog.md| HRNet_MMPose_for_PyTorch/tools/deployment/pytorch2onnx.py | https://github.com/open-mmlab/mmdeplo | 源码实现 | -| 开发引入 | /| HRNet_MMPose_for_PyTorch/requirements/docs.txt | https://github.com/gaotongxiao/pytorch_sphinx_theme.git#egg=pytorch_sphinx_theme | 相关依赖 | -| 开发引入 | /| HRNet_MMPose_for_PyTorch/requirements/poseval.txt | https://github.com/svenkreiss/poseval.git | 相关依赖 | -| 开发引入 | /| HRNet_MMPose_for_PyTorch/requirements/readthedocs.txt | https://github.com/svenkreiss/poseval.git | 相关依赖 | +| 文件位置 | 公网地址 | 公网地址用途 | +|-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------------------------------------------------------------|-------------| +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.circleci/test.yml | https://download.pytorch.org/whl/torch_stable.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.circleci/test.yml | https://download.pytorch.org/whl/torch_stable.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.circleci/test.yml | https://download.openmmlab.com/mmcv/dist/cpu/torch<< parameters.torch >>/index.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.circleci/test.yml | https://download.openmmlab.com/mmcv/dist/<< parameters.cuda >>/torch<< parameters.torch >>/index.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | 
https://download.openmmlab.com/mmpose/top_down/vipnas/vipnas_res50_coco_256x192-cc43b466_20210624.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/top_down/scnet/scnet50_mpii_256x256-a54b6af5_20200812.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/top_down/scnet/scnet50_coco_384x288-9cacd0ea_20200709.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/top_down/scnet/scnet50_coco_256x192-6920f829_20200709.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/top_down/scnet/scnet101_mpii_256x256-b4c2d184_20200812.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/top_down/scnet/scnet101_coco_384x288-0b6e631b_20200709.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/top_down/scnet/scnet101_coco_256x192-6d348ef9_20200709.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/top_down/rsn/rsn18_coco_256x192-72f4b4a7_20201127.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/top_down/resnet/res50_coco_384x288_dark-33d3e5e5_20210203.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/top_down/resnet/res50_coco_256x192-ec54d7f3_20200709.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/top_down/resnet/res50_coco_256x192_dark-43379d20_20200709.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/top_down/resnet/res152_coco_384x288_dark-d3b8ebd7_20210203.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/top_down/resnet/res152_coco_256x192_dark-ab4840d5_20200812.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/top_down/resnet/res101_coco_384x288_dark-cb45c88d_20210203.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/top_down/resnet/res101_coco_256x192_dark-64d433e6_20200812.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | 
https://download.openmmlab.com/mmpose/top_down/posewarper/hrnet_w48_posetrack18_384x288_posewarper_stage2-4abf88db_20211130.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/top_down/mspn/mspn50_coco_256x192-8fbfb5d0_20201123.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/top_down/mspn/4xmspn50_coco_256x192-7b837afb_20201123.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/top_down/mspn/3xmspn50_coco_256x192-e348f18e_20201123.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/top_down/mspn/2xmspn50_coco_256x192-c8765a5c_20201123.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/top_down/litehrnet/litehrnet30_mpii_256x256-faae8bd8_20210622.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/top_down/litehrnet/litehrnet30_coco_384x288-a3aef5c4_20210626.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/top_down/litehrnet/litehrnet30_coco_256x192-4176555b_20210626.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/top_down/litehrnet/litehrnet18_mpii_256x256-cabd7984_20210623.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w48_mpii_256x256_dark-0decd39f_20200927.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w48_coco_wholebody_384x288_dark-f5726563_20200918.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w48_coco_384x288-314c8528_20200708.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w48_coco_384x288_dark-e881a4b6_20210203.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w48_coco_256x192-b9e0b3ab_20200708.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w48_coco_256x192_dark-8cba3197_20200812.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | 
https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w32_mpii_256x256_dark-f1601c5b_20200927.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w32_coco_wholebody_256x192_dark-469327ef_20200922.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w32_coco_384x288-d9f0d786_20200708.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w32_coco_384x288_dark-307dafc2_20210203.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w32_coco_256x192_dark-07f147eb_20200812.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/top_down/hrformer/hrformer_small_coco_384x288-98d237ed_20220316.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/top_down/hrformer/hrformer_small_coco_256x192-5310d898_20220316.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/top_down/hrformer/hrformer_base_coco_384x288-ecf0758d_20220316.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/top_down/hrformer/hrformer_base_coco_256x192-6f5f1169_20220316.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/top_down/hourglass/hourglass52_mpii_384x384-04090bc3_20200812.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/top_down/hourglass/hourglass52_mpii_256x256-ae358435_20200812.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/top_down/hourglass/hourglass52_coco_384x384-be91ba2b_20200812.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/top_down/hourglass/hourglass52_coco_256x256-4ec713ba_20200709.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/top_down/deeppose/deeppose_res50_mpii_256x256-c63cd0b6_20210203.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/top_down/deeppose/deeppose_res50_coco_256x192-f6de6c0e_20210205.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/top_down/deeppose/deeppose_res50_coco_256x192-f6de6c0e_20210205.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/top_down/deeppose/deeppose_res152_mpii_256x256-15f5e6f9_20210205.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/top_down/deeppose/deeppose_res152_coco_256x192-7df89a88_20210205.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/top_down/deeppose/deeppose_res101_mpii_256x256-87516a90_20210205.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/top_down/deeppose/deeppose_res101_coco_256x192-2f247111_20210205.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/top_down/cpm/cpm_mpii_368x368-116e62b8_20200822.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/top_down/cpm/cpm_jhmdb_sub1_368x368-2d2585c9_20201122.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/top_down/cpm/cpm_coco_256x192-aa4ba095_20200817.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/hand3d/internet/res50_intehand3dv1.0_all_256x256-42b7f2ac_20210702.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/hand/udp/hrnetv2_w18_rhd2d_256x256_udp-63ba6007_20210330.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/hand/udp/hrnetv2_w18_panoptic_256x256_udp-f9e15948_20210330.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/hand/udp/hrnetv2_w18_onehand10k_256x256_udp-0d1b515d_20210330.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/hand/hrnetv2/hrnetv2_w18_rhd2d_256x256-95b20dd8_20210330.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/hand/hrnetv2/hrnetv2_w18_onehand10k_256x256-30bc9c6b_20210330.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/hand/dark/hrnetv2_w18_rhd2d_256x256_dark-4df3a347_20210330.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/hand/dark/hrnetv2_w18_panoptic_256x256_dark-1f1e4b74_20210330.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/hand/dark/hrnetv2_w18_onehand10k_256x256_dark-a2f80c64_20210330.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/hand/dark/hrnetv2_w18_onehand10k_256x256_dark-a2f80c64_20210330.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/face/hrnetv2/hrnetv2_w18_wflw_256x256-2bf032a6_20210125.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/face/hrnetv2/hrnetv2_w18_wflw_256x256_awing-5af5055c_20211212.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/face/hrnetv2/hrnetv2_w18_aflw_256x256-f2bbc62b_20210125.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/face/deeppose/deeppose_res50_wflw_256x256_wingloss-f82a5e53_20210303.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/face/deeppose/deeppose_res50_wflw_256x256_softwingloss-4d34f22a_20211212.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/face/darkpose/hrnetv2_w18_wflw_256x256_dark-3f8e0c2c_20210125.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/face/darkpose/hrnetv2_w18_aflw_256x256_dark-219606c0_20210125.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/bottom_up/res50_coco_640x640-2046f9cb_20200822.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/bottom_up/res50_coco_512x512-5521bead_20200816.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/bottom_up/res152_coco_512x512-364eb38d_20200822.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/bottom_up/res101_coco_512x512-e0c95157_20200816.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/bottom_up/mobilenetv2_coco_512x512-4d96e309_20200816.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/bottom_up/hrnet_w32_coco_512x512-bcb8c247_20200816.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/bottom_up/hrnet_w32_coco_512x512_udp-91663bf9_20210220.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/bottom_up/higher_hrnet48_coco_wholebody_512x512_plus-934f08aa_20210517.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/bottom_up/higher_hrnet48_coco_512x512-60fedcbc_20200712.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/bottom_up/higher_hrnet48_coco_512x512_udp-7cad61ef_20210222.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/bottom_up/higher_hrnet32_crowdpose_512x512-1aa4a132_20201017.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/bottom_up/higher_hrnet32_crowdpose_512x512-1aa4a132_20201017.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/bottom_up/higher_hrnet32_coco_wholebody_512x512_plus-2fa137ab_20210517.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/bottom_up/higher_hrnet32_coco_640x640-a22fe938_20200712.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/bottom_up/higher_hrnet32_coco_512x512-8ae85183_20200713.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/bottom_up/higher_hrnet32_coco_512x512_udp-8cc64794_20210222.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/bottom_up/higher_hrnet32_aic_512x512-9a674c33_20210130.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/bottom_up/higher_hrnet32_aic_512x512-9a674c33_20210130.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/body3d/voxelpose/voxelpose_prn64x64x64_cpn80x80x20_shelf_cam5-f406fefe_20220323.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | 
https://download.openmmlab.com/mmpose/body3d/voxelpose/voxelpose_prn64x64x64_cpn80x80x20_panoptic_cam5-545c150e_20211103.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/body3d/voxelpose/voxelpose_prn64x64x64_cpn80x80x20_campus_cam3-d8decbf7_20220323.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/body3d/voxelpose/voxelpose_prn32x32x32_cpn80x80x20_campus_cam3-3ecee30e_20220323.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/body3d/voxelpose/voxelpose_prn32x32x32_cpn48x48x12_shelf_cam5-24721ec7_20220323.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/body3d/videopose/videopose_mpi-inf-3dhp_1frame_fullconv_supervised_gt-d6ed21ef_20210603.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/body3d/videopose/videopose_h36m_81frames_fullconv_supervised-1f2d1104_20210527.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/body3d/videopose/videopose_h36m_27frames_fullconv_supervised-fe8fbba9_20210527.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/body3d/videopose/videopose_h36m_27frames_fullconv_semi-supervised-54aef83b_20210527.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/body3d/videopose/videopose_h36m_27frames_fullconv_semi-supervised_cpn_ft-71be9cde_20210527.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/body3d/videopose/videopose_h36m_243frames_fullconv_supervised_cpn_ft-88f5abbb_20210527.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/body3d/videopose/videopose_h36m_243frames_fullconv_supervised_cpn_ft-88f5abbb_20210527.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/body3d/videopose/videopose_h36m_1frame_fullconv_supervised_cpn_ft-5c3afaed_20210527.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/body3d/simple_baseline/simplebaseline3d_mpi-inf-3dhp-b75546f6_20210603.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/body3d/simple_baseline/simple3Dbaseline_h36m-f0ad73a4_20210419.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg.yaml | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w32_coco_256x192-c78dce93_20200708.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/vipnas/vipnas_res50_coco_256x192-cc43b466_20210624.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/vipnas/vipnas_mbv3_coco_256x192-7018731a_20211122.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/vgg/vgg16_bn_coco_256x192-7e7c58d6_20210517.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/shufflenetv2/shufflenetv2_coco_384x288-fb38ac3a_20200921.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/shufflenetv2/shufflenetv2_coco_256x192-0aba71c7_20200921.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/shufflenetv1/shufflenetv1_coco_384x288-b2930b24_20200804.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/shufflenetv1/shufflenetv1_coco_256x192-353bc02c_20200727.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/seresnet/seresnet50_coco_384x288-bc0b7680_20200727.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/seresnet/seresnet50_coco_256x192-25058b66_20200727.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/seresnet/seresnet152_coco_384x288-58b23ee8_20200727.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/seresnet/seresnet152_coco_256x192-1c628d79_20200727.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/seresnet/seresnet101_coco_384x288-48de1709_20200727.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/seresnet/seresnet101_coco_256x192-83f29c4d_20200727.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/scnet/scnet50_coco_384x288-9cacd0ea_20200709.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/scnet/scnet50_coco_256x192-6920f829_20200709.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/scnet/scnet101_coco_384x288-0b6e631b_20200709.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/scnet/scnet101_coco_256x192-6d348ef9_20200709.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/rsn/rsn50_coco_256x192-72ffe709_20201127.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/rsn/rsn18_coco_256x192-72f4b4a7_20201127.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/rsn/3xrsn50_coco_256x192-58f57a68_20201127.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/rsn/2xrsn50_coco_256x192-50648f0e_20201127.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/resnext/resnext50_coco_384x288-412c848f_20200727.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/resnext/resnext50_coco_256x192-dcff15f6_20200727.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/resnext/resnext152_coco_384x288-806176df_20200727.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/resnext/resnext152_coco_256x192-102449aa_20200727.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/resnext/resnext101_coco_384x288-f5eabcd6_20200727.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/resnext/resnext101_coco_256x192-c7eba365_20200727.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | 
https://download.openmmlab.com/mmpose/top_down/resnetv1d/resnetv1d50_coco_384x288-01f3fbb9_20200727.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/resnetv1d/resnetv1d50_coco_256x192-a243b840_20200727.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/resnetv1d/resnetv1d152_coco_384x288-626c622d_20200730.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/resnetv1d/resnetv1d152_coco_256x192-c4df51dc_20200727.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/resnetv1d/resnetv1d101_coco_384x288-5f9e421d_20200730.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/resnetv1d/resnetv1d101_coco_256x192-5bd08cab_20200727.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/resnet/res50_coco_384x288-e6f795e9_20200709.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/resnet/res50_coco_256x192-ec54d7f3_20200709.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/resnet/res152_coco_384x288-3860d4c9_20200709.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/resnet/res152_coco_256x192-f6e307c2_20200709.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/resnet/res101_coco_384x288-8c71bdc9_20200709.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/resnet/res101_coco_256x192-6e6babf0_20200708.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/resnest/resnest50_coco_384x288-dcd20436_20210320.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/resnest/resnest50_coco_256x192-6e65eece_20210320.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/resnest/resnest269_coco_384x288-b142b9fb_20210517.pth | 权重地址 
| +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/resnest/resnest269_coco_256x192-2a7882ac_20210517.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/resnest/resnest200_coco_384x288-b5bb76cb_20210517.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/resnest/resnest200_coco_256x192-db007a48_20210517.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/resnest/resnest101_coco_384x288-80660658_20210320.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/resnest/resnest101_coco_256x192-2ffcdc9d_20210320.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/mspn/mspn50_coco_256x192-8fbfb5d0_20201123.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/mspn/4xmspn50_coco_256x192-7b837afb_20201123.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/mspn/3xmspn50_coco_256x192-e348f18e_20201123.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/mspn/2xmspn50_coco_256x192-c8765a5c_20201123.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/mobilenetv2/mobilenetv2_coco_384x288-26be4816_20200727.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/mobilenetv2/mobilenetv2_coco_256x192-d1e58e7b_20200727.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/litehrnet/litehrnet30_coco_384x288-a3aef5c4_20210626.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/litehrnet/litehrnet30_coco_256x192-4176555b_20210626.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w48_coco_384x288-314c8528_20200708.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w48_coco_256x192-b9e0b3ab_20200708.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w32_coco_384x288-d9f0d786_20200708.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w32_coco_256x192-c78dce93_20200708.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/hourglass/hourglass52_coco_384x384-be91ba2b_20200812.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/hourglass/hourglass52_coco_256x256-4ec713ba_20200709.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/deeppose/deeppose_res50_coco_256x192-f6de6c0e_20210205.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/deeppose/deeppose_res152_coco_256x192-7df89a88_20210205.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/deeppose/deeppose_res101_coco_256x192-2f247111_20210205.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/cpm/cpm_coco_384x288-80feb4bc_20200821.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/cpm/cpm_coco_256x192-aa4ba095_20200817.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/top_down/alexnet/alexnet_coco_256x192-a7b1fd15_20200727.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/face/hourglass_ae/hourglass_ae_coco_512x512-90af499f_20210920.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/bottom_up/res50_coco_640x640-2046f9cb_20200822.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/bottom_up/res50_coco_512x512-5521bead_20200816.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/bottom_up/res152_coco_512x512-364eb38d_20200822.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/bottom_up/res101_coco_512x512-e0c95157_20200816.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/bottom_up/mobilenetv2_coco_512x512-4d96e309_20200816.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/bottom_up/hrnet_w48_coco_512x512-cf72fcdf_20200816.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/bottom_up/hrnet_w32_coco_512x512-bcb8c247_20200816.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/bottom_up/higher_hrnet48_coco_512x512-60fedcbc_20200712.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/bottom_up/higher_hrnet32_coco_640x640-a22fe938_20200712.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_cfg_flops_speed.yaml | https://download.openmmlab.com/mmpose/bottom_up/higher_hrnet32_coco_512x512-8ae85183_20200713.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_regression_cfg_tmpl.yaml | https://download.openmmlab.com/mmpose/top_down/vipnas/vipnas_res50_coco_256x192-cc43b466_20210624.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_regression_cfg_tmpl.yaml | https://download.openmmlab.com/mmpose/top_down/resnet/res50_coco_256x192-ec54d7f3_20200709.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_regression_cfg_tmpl.yaml | https://download.openmmlab.com/mmpose/top_down/mobilenetv2/mobilenetv2_coco_256x192-d1e58e7b_20200727.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_regression_cfg_tmpl.yaml | https://download.openmmlab.com/mmpose/top_down/litehrnet/litehrnet30_coco_256x192-4176555b_20210626.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_regression_cfg_tmpl.yaml | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w32_mpii_256x256-6c4f923f_20200812.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_regression_cfg_tmpl.yaml | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w32_coco_256x192-c78dce93_20200708.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_regression_cfg_tmpl.yaml | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w32_coco_256x192_dark-07f147eb_20200812.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.dev_scripts/benchmark/benchmark_regression_cfg_tmpl.yaml | https://download.openmmlab.com/mmpose/bottom_up/hrnet_w32_coco_512x512-bcb8c247_20200816.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.github/ISSUE_TEMPLATE/config.yml | https://mmpose.readthedocs.io/en/latest/faq.html | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.github/ISSUE_TEMPLATE/config.yml | https://mmpose.readthedocs.io/en/latest/ | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.github/workflows/build.yml | https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1804/x86_64/7fa2af80.pub | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.github/workflows/build.yml | https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1804/x86_64/7fa2af80.pub | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.github/workflows/build.yml | https://download.pytorch.org/whl/lts/1.8/torch_lts.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.github/workflows/build.yml | https://download.pytorch.org/whl/torch_stable.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.github/workflows/build.yml | https://download.pytorch.org/whl/torch_stable.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.github/workflows/build.yml | https://download.openmmlab.com/mmcv/dist/cpu/torch1.8/index.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.github/workflows/build.yml | https://download.pytorch.org/whl/torch_stable.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.github/workflows/build.yml | https://download.openmmlab.com/mmcv/dist/cu102/${{matrix.torch_version}}/index.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.github/workflows/build.yml | https://download.openmmlab.com/mmcv/dist/cu101/${{matrix.torch_version}}/index.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.github/workflows/build.yml | https://download.openmmlab.com/mmcv/dist/cpu/${{matrix.torch_version}}/index.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.github/workflows/build.yml | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64/3bf863cc.pub | ubuntu镜像地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/.github/workflows/build.yml | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64/3bf863cc.pub | ubuntu镜像地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/_base_/datasets/300w.py | https://ibug.doc.ic.ac.uk/resources/300-W/ | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/_base_/datasets/campus.py | http://campar.in.tum.de/Chair/MultiHumanPose | 数据集链接 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/_base_/datasets/coco.py | http://gvv.mpi-inf.mpg.de/3dhp-dataset | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/_base_/datasets/cofw.py | http://www.vision.caltech.edu/xpburgos/ICCV13/ | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/_base_/datasets/freihand2d.py | https://lmb.informatik.uni-freiburg.de/projects/freihand/ | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/_base_/datasets/horse10.py | http://www.mackenziemathislab.org/horse10 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/_base_/datasets/jhmdb.py | http://www.pri.kyoto-u.ac.jp/datasets/ | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/_base_/datasets/macaque.py | http://images.cocodataset.org/val2017/000000039769.jpg | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/_base_/datasets/mpi_inf_3dhp.py | http://jhmdb.is.tue.mpg.de/dataset | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/_base_/datasets/mpii.py | http://human-pose.mpi-inf.mpg.de/ | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/_base_/datasets/nvgesture.py | https://research.nvidia.com/publication/2016-06_online- | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/_base_/datasets/onehand10k.py | https://www.yangangwang.com/papers/WANG-MCC-2018-10.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/_base_/datasets/panoptic_body3d.py | http://domedb.perception.cs.cmu.edu | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/_base_/datasets/panoptic_hand2d.py | http://domedb.perception.cs.cmu.edu/handdb.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/_base_/datasets/posetrack18.py | https://posetrack.net/users/download.php | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/_base_/datasets/rhd2d.py | https://lmb.informatik.uni-freiburg.de/resources/ | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/_base_/datasets/shelf.py | http://campar.in.tum.de/Chair/MultiHumanPose | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/_base_/filters/smoothnet_t16_h36m.py | https://download.openmmlab.com/mmpose/plugin/smoothnet/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/_base_/filters/smoothnet_t32_h36m.py | https://download.openmmlab.com/mmpose/plugin/smoothnet/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/_base_/filters/smoothnet_t64_h36m.py | https://download.openmmlab.com/mmpose/plugin/smoothnet/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/_base_/filters/smoothnet_t8_h36m.py | https://download.openmmlab.com/mmpose/plugin/smoothnet/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/animalpose/hrnet_animalpose.yml | 
https://download.openmmlab.com/mmpose/animal/hrnet/hrnet_w48_animalpose_256x256-34644726_20210426.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/animalpose/hrnet_animalpose.yml | https://download.openmmlab.com/mmpose/animal/hrnet/hrnet_w32_animalpose_256x256-1aa7f075_20210426.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/animalpose/hrnet_animalpose.yml | http://openaccess.thecvf.com/content_CVPR_2019/html/Sun_Deep_High-Resolution_Representation_Learning_for_Human_Pose_Estimation_CVPR_2019_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/animalpose/hrnet_w32_animalpose_256x256.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/animalpose/hrnet_w48_animalpose_256x256.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/animalpose/resnet_animalpose.yml | http://openaccess.thecvf.com/content_ECCV_2018/html/Bin_Xiao_Simple_Baselines_for_ECCV_2018_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/animalpose/resnet_animalpose.yml | https://download.openmmlab.com/mmpose/animal/resnet/res50_animalpose_256x256-e1f30bff_20210426.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/animalpose/resnet_animalpose.yml | https://download.openmmlab.com/mmpose/animal/resnet/res152_animalpose_256x256-a0a7506c_20210426.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/animalpose/resnet_animalpose.yml | https://download.openmmlab.com/mmpose/animal/resnet/res101_animalpose_256x256-85563f4a_20210426.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/ap10k/hrnet_ap10k.yml | https://download.openmmlab.com/mmpose/animal/hrnet/hrnet_w48_ap10k_256x256-d95ab412_20211029.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/ap10k/hrnet_ap10k.yml | https://download.openmmlab.com/mmpose/animal/hrnet/hrnet_w32_ap10k_256x256-18aac840_20211029.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/ap10k/hrnet_ap10k.yml | http://openaccess.thecvf.com/content_CVPR_2019/html/Sun_Deep_High-Resolution_Representation_Learning_for_Human_Pose_Estimation_CVPR_2019_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/ap10k/hrnet_w32_ap10k_256x256.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/ap10k/hrnet_w48_ap10k_256x256.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/ap10k/resnet_ap10k.yml | http://openaccess.thecvf.com/content_ECCV_2018/html/Bin_Xiao_Simple_Baselines_for_ECCV_2018_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/ap10k/resnet_ap10k.yml | https://download.openmmlab.com/mmpose/animal/resnet/res50_ap10k_256x256-35760eb8_20211029.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/ap10k/resnet_ap10k.yml | https://download.openmmlab.com/mmpose/animal/resnet/res101_ap10k_256x256-9edfafb9_20211029.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/atrw/hrnet_atrw.yml | https://download.openmmlab.com/mmpose/animal/hrnet/hrnet_w48_atrw_256x256-ac088892_20210414.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/atrw/hrnet_atrw.yml | https://download.openmmlab.com/mmpose/animal/hrnet/hrnet_w32_atrw_256x256-f027f09a_20210414.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/atrw/hrnet_atrw.yml | http://openaccess.thecvf.com/content_CVPR_2019/html/Sun_Deep_High-Resolution_Representation_Learning_for_Human_Pose_Estimation_CVPR_2019_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/atrw/hrnet_w32_atrw_256x256.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/atrw/hrnet_w48_atrw_256x256.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/atrw/resnet_atrw.yml | http://openaccess.thecvf.com/content_ECCV_2018/html/Bin_Xiao_Simple_Baselines_for_ECCV_2018_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/atrw/resnet_atrw.yml | https://download.openmmlab.com/mmpose/animal/resnet/res50_atrw_256x256-546c4594_20210414.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/atrw/resnet_atrw.yml | https://download.openmmlab.com/mmpose/animal/resnet/res152_atrw_256x256-2bb8e162_20210414.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/atrw/resnet_atrw.yml | https://download.openmmlab.com/mmpose/animal/resnet/res101_atrw_256x256-da93f371_20210414.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/fly/resnet_fly.yml | http://openaccess.thecvf.com/content_ECCV_2018/html/Bin_Xiao_Simple_Baselines_for_ECCV_2018_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/fly/resnet_fly.yml | 
https://download.openmmlab.com/mmpose/animal/resnet/res50_fly_192x192-5d0ee2d9_20210407.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/fly/resnet_fly.yml | https://download.openmmlab.com/mmpose/animal/resnet/res152_fly_192x192-fcafbd5a_20210407.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/fly/resnet_fly.yml | https://download.openmmlab.com/mmpose/animal/resnet/res101_fly_192x192-41a7a6cc_20210407.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/horse10/hrnet_horse10.yml | https://download.openmmlab.com/mmpose/animal/hrnet/hrnet_w48_horse10_256x256_split3-0232ec47_20210405.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/horse10/hrnet_horse10.yml | https://download.openmmlab.com/mmpose/animal/hrnet/hrnet_w48_horse10_256x256_split2-8ef72b5d_20210405.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/horse10/hrnet_horse10.yml | https://download.openmmlab.com/mmpose/animal/hrnet/hrnet_w48_horse10_256x256_split1-3c950d3b_20210405.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/horse10/hrnet_horse10.yml | https://download.openmmlab.com/mmpose/animal/hrnet/hrnet_w32_horse10_256x256_split3-4db47400_20210405.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/horse10/hrnet_horse10.yml | https://download.openmmlab.com/mmpose/animal/hrnet/hrnet_w32_horse10_256x256_split2-04840523_20210405.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/horse10/hrnet_horse10.yml | https://download.openmmlab.com/mmpose/animal/hrnet/hrnet_w32_horse10_256x256_split1-401d901a_20210405.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/horse10/hrnet_horse10.yml | http://openaccess.thecvf.com/content_CVPR_2019/html/Sun_Deep_High-Resolution_Representation_Learning_for_Human_Pose_Estimation_CVPR_2019_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/horse10/hrnet_w32_horse10_256x256-split1.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/horse10/hrnet_w32_horse10_256x256-split2.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/horse10/hrnet_w32_horse10_256x256-split3.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/horse10/hrnet_w48_horse10_256x256-split1.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/horse10/hrnet_w48_horse10_256x256-split2.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/horse10/hrnet_w48_horse10_256x256-split3.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/horse10/resnet_horse10.yml | http://openaccess.thecvf.com/content_ECCV_2018/html/Bin_Xiao_Simple_Baselines_for_ECCV_2018_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/horse10/resnet_horse10.yml | https://download.openmmlab.com/mmpose/animal/resnet/res50_horse10_256x256_split3-9637d4eb_20210405.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/horse10/resnet_horse10.yml | https://download.openmmlab.com/mmpose/animal/resnet/res50_horse10_256x256_split2-65e2a508_20210405.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/horse10/resnet_horse10.yml | https://download.openmmlab.com/mmpose/animal/resnet/res50_horse10_256x256_split1-3a3dc37e_20210405.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/horse10/resnet_horse10.yml | https://download.openmmlab.com/mmpose/animal/resnet/res152_horse10_256x256_split3-c957dac5_20210405.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/horse10/resnet_horse10.yml | https://download.openmmlab.com/mmpose/animal/resnet/res152_horse10_256x256_split2-3b3404a3_20210405.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/horse10/resnet_horse10.yml | https://download.openmmlab.com/mmpose/animal/resnet/res152_horse10_256x256_split1-7e81fe2d_20210405.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/horse10/resnet_horse10.yml | https://download.openmmlab.com/mmpose/animal/resnet/res101_horse10_256x256_split3-2eea5bb1_20210405.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/horse10/resnet_horse10.yml | https://download.openmmlab.com/mmpose/animal/resnet/res101_horse10_256x256_split2-30e2fa87_20210405.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/horse10/resnet_horse10.yml | https://download.openmmlab.com/mmpose/animal/resnet/res101_horse10_256x256_split1-1b7c259c_20210405.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/locust/resnet_locust.yml | http://openaccess.thecvf.com/content_ECCV_2018/html/Bin_Xiao_Simple_Baselines_for_ECCV_2018_paper.html | 论文地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/locust/resnet_locust.yml | https://download.openmmlab.com/mmpose/animal/resnet/res50_locust_160x160-9efca22b_20210407.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/locust/resnet_locust.yml | https://download.openmmlab.com/mmpose/animal/resnet/res152_locust_160x160-4ea9b372_20210407.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/locust/resnet_locust.yml | https://download.openmmlab.com/mmpose/animal/resnet/res101_locust_160x160-d77986b3_20210407.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/macaque/hrnet_macaque.yml | https://download.openmmlab.com/mmpose/animal/hrnet/hrnet_w48_macaque_256x192-9b34b02a_20210407.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/macaque/hrnet_macaque.yml | https://download.openmmlab.com/mmpose/animal/hrnet/hrnet_w32_macaque_256x192-f7e9e04f_20210407.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/macaque/hrnet_macaque.yml | http://openaccess.thecvf.com/content_CVPR_2019/html/Sun_Deep_High-Resolution_Representation_Learning_for_Human_Pose_Estimation_CVPR_2019_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/macaque/hrnet_w32_macaque_256x192.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/macaque/hrnet_w48_macaque_256x192.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/macaque/resnet_macaque.yml | http://openaccess.thecvf.com/content_ECCV_2018/html/Bin_Xiao_Simple_Baselines_for_ECCV_2018_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/macaque/resnet_macaque.yml | https://download.openmmlab.com/mmpose/animal/resnet/res50_macaque_256x192-98f1dd3a_20210407.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/macaque/resnet_macaque.yml | https://download.openmmlab.com/mmpose/animal/resnet/res152_macaque_256x192-c42abc02_20210407.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/macaque/resnet_macaque.yml | https://download.openmmlab.com/mmpose/animal/resnet/res101_macaque_256x192-e3b9c6bb_20210407.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/zebra/resnet_zebra.yml | http://openaccess.thecvf.com/content_ECCV_2018/html/Bin_Xiao_Simple_Baselines_for_ECCV_2018_paper.html | 论文地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/zebra/resnet_zebra.yml | https://download.openmmlab.com/mmpose/animal/resnet/res50_zebra_160x160-5a104833_20210407.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/zebra/resnet_zebra.yml | https://download.openmmlab.com/mmpose/animal/resnet/res152_zebra_160x160-05de71dd_20210407.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/animal/2d_kpt_sview_rgb_img/topdown_heatmap/zebra/resnet_zebra.yml | https://download.openmmlab.com/mmpose/animal/resnet/res101_zebra_160x160-e8cb2010_20210407.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/aic/higherhrnet_aic.yml | https://download.openmmlab.com/mmpose/bottom_up/higher_hrnet32_aic_512x512-9a674c33_20210130.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/aic/higherhrnet_aic.yml | https://download.openmmlab.com/mmpose/bottom_up/higher_hrnet32_aic_512x512-9a674c33_20210130.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/aic/higherhrnet_aic.yml | http://openaccess.thecvf.com/content_CVPR_2020/html/Cheng_HigherHRNet_Scale-Aware_Representation_Learning_for_Bottom-Up_Human_Pose_Estimation_CVPR_2020_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/aic/higherhrnet_w32_aic_512x512.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/aic/higherhrnet_w32_aic_512x512_udp.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/aic/hrnet_aic.yml | https://download.openmmlab.com/mmpose/bottom_up/hrnet_w32_aic_512x512-77e2a98a_20210131.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/aic/hrnet_aic.yml | https://download.openmmlab.com/mmpose/bottom_up/hrnet_w32_aic_512x512-77e2a98a_20210131.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/aic/hrnet_aic.yml | http://openaccess.thecvf.com/content_CVPR_2019/html/Sun_Deep_High-Resolution_Representation_Learning_for_Human_Pose_Estimation_CVPR_2019_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/aic/hrnet_w32_aic_512x512.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/coco/higherhrnet_coco.yml | https://download.openmmlab.com/mmpose/bottom_up/higher_hrnet48_coco_512x512-60fedcbc_20200712.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/coco/higherhrnet_coco.yml | https://download.openmmlab.com/mmpose/bottom_up/higher_hrnet48_coco_512x512-60fedcbc_20200712.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/coco/higherhrnet_coco.yml | https://download.openmmlab.com/mmpose/bottom_up/higher_hrnet32_coco_640x640-a22fe938_20200712.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/coco/higherhrnet_coco.yml | https://download.openmmlab.com/mmpose/bottom_up/higher_hrnet32_coco_640x640-a22fe938_20200712.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/coco/higherhrnet_coco.yml | https://download.openmmlab.com/mmpose/bottom_up/higher_hrnet32_coco_512x512-8ae85183_20200713.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/coco/higherhrnet_coco.yml | https://download.openmmlab.com/mmpose/bottom_up/higher_hrnet32_coco_512x512-8ae85183_20200713.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/coco/higherhrnet_coco.yml | http://openaccess.thecvf.com/content_CVPR_2020/html/Cheng_HigherHRNet_Scale-Aware_Representation_Learning_for_Bottom-Up_Human_Pose_Estimation_CVPR_2020_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/coco/higherhrnet_udp_coco.yml | https://download.openmmlab.com/mmpose/bottom_up/higher_hrnet48_coco_512x512_udp-7cad61ef_20210222.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/coco/higherhrnet_udp_coco.yml | https://download.openmmlab.com/mmpose/bottom_up/higher_hrnet32_coco_512x512_udp-8cc64794_20210222.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/coco/higherhrnet_udp_coco.yml | http://openaccess.thecvf.com/content_CVPR_2020/html/Cheng_HigherHRNet_Scale-Aware_Representation_Learning_for_Bottom-Up_Human_Pose_Estimation_CVPR_2020_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/coco/higherhrnet_w32_coco_512x512.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/coco/higherhrnet_w32_coco_512x512_udp.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/coco/higherhrnet_w32_coco_640x640.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/coco/higherhrnet_w32_coco_640x640_udp.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/coco/higherhrnet_w48_coco_512x512.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/coco/higherhrnet_w48_coco_512x512_udp.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/coco/hourglass_ae_coco.yml | https://download.openmmlab.com/mmpose/bottom_up/hourglass_ae/hourglass_ae_coco_512x512-90af499f_20210920.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/coco/hourglass_ae_coco.yml | https://download.openmmlab.com/mmpose/bottom_up/hourglass_ae/hourglass_ae_coco_512x512-90af499f_20210920.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/coco/hourglass_ae_coco.yml | https://arxiv.org/abs/1611.05424 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/coco/hrnet_coco.yml | https://download.openmmlab.com/mmpose/bottom_up/hrnet_w48_coco_512x512-cf72fcdf_20200816.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/coco/hrnet_coco.yml | https://download.openmmlab.com/mmpose/bottom_up/hrnet_w48_coco_512x512-cf72fcdf_20200816.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/coco/hrnet_coco.yml | https://download.openmmlab.com/mmpose/bottom_up/hrnet_w32_coco_512x512-bcb8c247_20200816.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/coco/hrnet_coco.yml | https://download.openmmlab.com/mmpose/bottom_up/hrnet_w32_coco_512x512-bcb8c247_20200816.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/coco/hrnet_coco.yml | http://openaccess.thecvf.com/content_CVPR_2019/html/Sun_Deep_High-Resolution_Representation_Learning_for_Human_Pose_Estimation_CVPR_2019_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/coco/hrnet_udp_coco.yml | https://download.openmmlab.com/mmpose/bottom_up/hrnet_w48_coco_512x512_udp-de08fd8c_20210222.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/coco/hrnet_udp_coco.yml | https://download.openmmlab.com/mmpose/bottom_up/hrnet_w32_coco_512x512_udp-91663bf9_20210220.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/coco/hrnet_udp_coco.yml | http://openaccess.thecvf.com/content_CVPR_2020/html/Huang_The_Devil_Is_in_the_Details_Delving_Into_Unbiased_Data_CVPR_2020_paper.html | 论文地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/coco/hrnet_w32_coco_512x512.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/coco/hrnet_w32_coco_512x512_udp.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/coco/hrnet_w32_coco_640x640.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/coco/hrnet_w32_coco_640x640_udp.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/coco/hrnet_w48_coco_512x512.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/coco/hrnet_w48_coco_512x512_udp.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/coco/hrnet_w48_coco_640x640.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/coco/hrnet_w48_coco_640x640_udp.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/coco/mobilenetv2_coco.yml | https://download.openmmlab.com/mmpose/bottom_up/mobilenetv2_coco_512x512-4d96e309_20200816.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/coco/mobilenetv2_coco.yml | https://download.openmmlab.com/mmpose/bottom_up/mobilenetv2_coco_512x512-4d96e309_20200816.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/coco/mobilenetv2_coco.yml | http://openaccess.thecvf.com/content_cvpr_2018/html/Sandler_MobileNetV2_Inverted_Residuals_CVPR_2018_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/coco/resnet_coco.yml | https://download.openmmlab.com/mmpose/bottom_up/res50_coco_640x640-2046f9cb_20200822.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/coco/resnet_coco.yml | https://download.openmmlab.com/mmpose/bottom_up/res50_coco_640x640-2046f9cb_20200822.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/coco/resnet_coco.yml | https://download.openmmlab.com/mmpose/bottom_up/res50_coco_512x512-5521bead_20200816.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/coco/resnet_coco.yml | 
https://download.openmmlab.com/mmpose/bottom_up/res50_coco_512x512-5521bead_20200816.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/coco/resnet_coco.yml | https://download.openmmlab.com/mmpose/bottom_up/res152_coco_512x512-364eb38d_20200822.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/coco/resnet_coco.yml | https://download.openmmlab.com/mmpose/bottom_up/res152_coco_512x512-364eb38d_20200822.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/coco/resnet_coco.yml | https://download.openmmlab.com/mmpose/bottom_up/res101_coco_512x512-e0c95157_20200816.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/coco/resnet_coco.yml | https://download.openmmlab.com/mmpose/bottom_up/res101_coco_512x512-e0c95157_20200816.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/coco/resnet_coco.yml | https://arxiv.org/abs/1611.05424 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/crowdpose/higherhrnet_crowdpose.yml | https://download.openmmlab.com/mmpose/bottom_up/higher_hrnet32_crowdpose_512x512-1aa4a132_20201017.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/crowdpose/higherhrnet_crowdpose.yml | https://download.openmmlab.com/mmpose/bottom_up/higher_hrnet32_crowdpose_512x512-1aa4a132_20201017.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/crowdpose/higherhrnet_crowdpose.yml | http://openaccess.thecvf.com/content_CVPR_2020/html/Cheng_HigherHRNet_Scale-Aware_Representation_Learning_for_Bottom-Up_Human_Pose_Estimation_CVPR_2020_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/crowdpose/higherhrnet_w32_crowdpose_512x512.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/crowdpose/higherhrnet_w32_crowdpose_512x512_udp.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/crowdpose/higherhrnet_w32_crowdpose_640x640.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/crowdpose/higherhrnet_w32_crowdpose_640x640_udp.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/crowdpose/higherhrnet_w48_crowdpose_512x512.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/crowdpose/higherhrnet_w48_crowdpose_512x512_udp.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/mhp/hrnet_mhp.yml | https://download.openmmlab.com/mmpose/bottom_up/hrnet_w48_mhp_512x512-85a6ab6f_20201229.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/mhp/hrnet_mhp.yml | https://download.openmmlab.com/mmpose/bottom_up/hrnet_w48_mhp_512x512-85a6ab6f_20201229.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/mhp/hrnet_mhp.yml | http://openaccess.thecvf.com/content_CVPR_2019/html/Sun_Deep_High-Resolution_Representation_Learning_for_Human_Pose_Estimation_CVPR_2019_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/associative_embedding/mhp/hrnet_w48_mhp_512x512.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/cid/coco/hrnet_coco.yml | https://download.openmmlab.com/mmpose/bottom_up/cid/hrnet_w48_coco_512x512-af545767_20221109.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/cid/coco/hrnet_coco.yml | https://download.openmmlab.com/mmpose/bottom_up/cid/hrnet_w32_coco_512x512-867b9659_20220928.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/cid/coco/hrnet_coco.yml | https://openaccess.thecvf.com/content/CVPR2022/html/Wang_Contextual_Instance_Decoupling_for_Robust_Multi-Person_Pose_Estimation_CVPR_2022_paper.html | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/cid/coco/hrnet_w32_coco_512x512.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/cid/coco/hrnet_w48_coco_512x512.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/deeppose/coco/resnet_coco.yml | https://download.openmmlab.com/mmpose/top_down/deeppose/deeppose_res50_coco_256x192-f6de6c0e_20210205.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/deeppose/coco/resnet_coco.yml | https://download.openmmlab.com/mmpose/top_down/deeppose/deeppose_res152_coco_256x192-7df89a88_20210205.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/deeppose/coco/resnet_coco.yml | https://download.openmmlab.com/mmpose/top_down/deeppose/deeppose_res101_coco_256x192-2f247111_20210205.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/deeppose/coco/resnet_coco.yml | http://openaccess.thecvf.com/content_cvpr_2016/html/He_Deep_Residual_Learning_CVPR_2016_paper.html | 论文地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/deeppose/coco/resnet_rle_coco.yml | https://download.openmmlab.com/mmpose/top_down/deeppose/deeppose_res50_coco_256x192_rle-2ea9bb4a_20220616.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/deeppose/coco/resnet_rle_coco.yml | https://download.openmmlab.com/mmpose/top_down/deeppose/deeppose_res152_coco_384x288_rle-b77c4c37_20220624.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/deeppose/coco/resnet_rle_coco.yml | https://download.openmmlab.com/mmpose/top_down/deeppose/deeppose_res152_coco_256x192_rle-c05bdccf_20220615.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/deeppose/coco/resnet_rle_coco.yml | https://download.openmmlab.com/mmpose/top_down/deeppose/deeppose_res101_coco_256x192_rle-16c3d461_20220615.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/deeppose/coco/resnet_rle_coco.yml | https://arxiv.org/abs/2107.11291 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/deeppose/mpii/resnet_mpii.yml | https://download.openmmlab.com/mmpose/top_down/deeppose/deeppose_res50_mpii_256x256-c63cd0b6_20210203.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/deeppose/mpii/resnet_mpii.yml | https://download.openmmlab.com/mmpose/top_down/deeppose/deeppose_res152_mpii_256x256-15f5e6f9_20210205.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/deeppose/mpii/resnet_mpii.yml | https://download.openmmlab.com/mmpose/top_down/deeppose/deeppose_res101_mpii_256x256-87516a90_20210205.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/deeppose/mpii/resnet_mpii.yml | http://openaccess.thecvf.com/content_cvpr_2016/html/He_Deep_Residual_Learning_CVPR_2016_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/deeppose/mpii/resnet_rle_mpii.yml | https://download.openmmlab.com/mmpose/top_down/deeppose/deeppose_res50_mpii_256x256_rle-5f92a619_20220504.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/deeppose/mpii/resnet_rle_mpii.yml | https://arxiv.org/abs/2107.11291 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/dekr/coco/hrnet_coco.yml | https://download.openmmlab.com/mmpose/bottom_up/dekr/hrnet_w48_coco_640x640-8854b2f1_20220930.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/dekr/coco/hrnet_coco.yml | https://download.openmmlab.com/mmpose/bottom_up/dekr/hrnet_w48_coco_640x640-8854b2f1_20220930.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/dekr/coco/hrnet_coco.yml | https://download.openmmlab.com/mmpose/bottom_up/dekr/hrnet_w32_coco_512x512-2a3056de_20220928.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/dekr/coco/hrnet_coco.yml | https://download.openmmlab.com/mmpose/bottom_up/dekr/hrnet_w32_coco_512x512-2a3056de_20220928.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/dekr/coco/hrnet_coco.yml | https://arxiv.org/abs/2104.02300 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/dekr/coco/hrnet_w32_coco_512x512.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/dekr/coco/hrnet_w32_coco_512x512.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/dekr/coco/hrnet_w48_coco_640x640.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/dekr/coco/hrnet_w48_coco_640x640.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/dekr/crowdpose/hrnet_crowdpose.yml | https://download.openmmlab.com/mmpose/bottom_up/dekr/hrnet_w48_crowdpose_640x640-ef6b6040_20220930.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/dekr/crowdpose/hrnet_crowdpose.yml | https://download.openmmlab.com/mmpose/bottom_up/dekr/hrnet_w48_crowdpose_640x640-ef6b6040_20220930.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/dekr/crowdpose/hrnet_crowdpose.yml | https://download.openmmlab.com/mmpose/bottom_up/dekr/hrnet_w32_crowdpose_512x512-685aff75_20220924.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/dekr/crowdpose/hrnet_crowdpose.yml | https://download.openmmlab.com/mmpose/bottom_up/dekr/hrnet_w32_crowdpose_512x512-685aff75_20220924.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/dekr/crowdpose/hrnet_crowdpose.yml | https://arxiv.org/abs/2104.02300 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/dekr/crowdpose/hrnet_w32_crowdpose_512x512.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/dekr/crowdpose/hrnet_w32_crowdpose_512x512.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/dekr/crowdpose/hrnet_w48_crowdpose_640x640.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/dekr/crowdpose/hrnet_w48_crowdpose_640x640.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/aic/hrnet_aic.yml | 
https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w32_aic_256x192-30a4e465_20200826.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/aic/hrnet_aic.yml | http://openaccess.thecvf.com/content_CVPR_2019/html/Sun_Deep_High-Resolution_Representation_Learning_for_Human_Pose_Estimation_CVPR_2019_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/aic/hrnet_w32_aic_256x192.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/aic/hrnet_w32_aic_384x288.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/aic/hrnet_w48_aic_256x192.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/aic/hrnet_w48_aic_384x288.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/aic/resnet_aic.yml | http://openaccess.thecvf.com/content_ECCV_2018/html/Bin_Xiao_Simple_Baselines_for_ECCV_2018_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/aic/resnet_aic.yml | https://download.openmmlab.com/mmpose/top_down/resnet/res101_aic_256x192-79b35445_20200826.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/alexnet_coco.yml | https://download.openmmlab.com/mmpose/top_down/alexnet/alexnet_coco_256x192-a7b1fd15_20200727.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/cpm_coco.yml | https://download.openmmlab.com/mmpose/top_down/cpm/cpm_coco_384x288-80feb4bc_20200821.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/cpm_coco.yml | https://download.openmmlab.com/mmpose/top_down/cpm/cpm_coco_256x192-aa4ba095_20200817.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/cpm_coco.yml | http://openaccess.thecvf.com/content_cvpr_2016/html/Wei_Convolutional_Pose_Machines_CVPR_2016_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hourglass_coco.yml | https://download.openmmlab.com/mmpose/top_down/hourglass/hourglass52_coco_384x384-be91ba2b_20200812.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hourglass_coco.yml | https://download.openmmlab.com/mmpose/top_down/hourglass/hourglass52_coco_256x256-4ec713ba_20200709.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hourglass_coco.yml | 
https://link.springer.com/chapter/10.1007/978-3-319-46484-8_29 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrformer_base_coco_256x192.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrformer_base_coco_384x288.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrformer_coco.yml | https://download.openmmlab.com/mmpose/top_down/hrformer/hrformer_small_coco_384x288-98d237ed_20220316.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrformer_coco.yml | https://download.openmmlab.com/mmpose/top_down/hrformer/hrformer_small_coco_256x192-5310d898_20220316.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrformer_coco.yml | https://download.openmmlab.com/mmpose/top_down/hrformer/hrformer_base_coco_384x288-ecf0758d_20220316.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrformer_coco.yml | https://download.openmmlab.com/mmpose/top_down/hrformer/hrformer_base_coco_256x192-6f5f1169_20220316.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrformer_small_coco_256x192.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrformer_small_coco_384x288.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrnet_augmentation_coco.yml | https://download.openmmlab.com/mmpose/top_down/augmentation/hrnet_w32_coco_256x192_photometric-308cf591_20210320.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrnet_augmentation_coco.yml | https://download.openmmlab.com/mmpose/top_down/augmentation/hrnet_w32_coco_256x192_gridmask-868180df_20210320.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrnet_augmentation_coco.yml | https://download.openmmlab.com/mmpose/top_down/augmentation/hrnet_w32_coco_256x192_coarsedropout-0f16a0ce_20210320.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrnet_augmentation_coco.yml | https://www.mdpi.com/649002 | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrnet_coco.yml | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w48_coco_384x288-314c8528_20200708.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrnet_coco.yml | 
https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w48_coco_256x192-b9e0b3ab_20200708.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrnet_coco.yml | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w32_coco_384x288-d9f0d786_20200708.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrnet_coco.yml | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w32_coco_256x192-c78dce93_20200708.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrnet_coco.yml | http://openaccess.thecvf.com/content_CVPR_2019/html/Sun_Deep_High-Resolution_Representation_Learning_for_Human_Pose_Estimation_CVPR_2019_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrnet_dark_coco.yml | http://openaccess.thecvf.com/content_CVPR_2020/html/Zhang_Distribution-Aware_Coordinate_Representation_for_Human_Pose_Estimation_CVPR_2020_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrnet_dark_coco.yml | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w48_coco_384x288_dark-e881a4b6_20210203.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrnet_dark_coco.yml | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w48_coco_256x192_dark-8cba3197_20200812.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrnet_dark_coco.yml | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w32_coco_384x288_dark-307dafc2_20210203.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrnet_dark_coco.yml | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w32_coco_256x192_dark-07f147eb_20200812.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrnet_fp16_coco.yml | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w32_coco_256x192_fp16_dynamic-290efc2e_20210430.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrnet_fp16_coco.yml | http://openaccess.thecvf.com/content_CVPR_2019/html/Sun_Deep_High-Resolution_Representation_Learning_for_Human_Pose_Estimation_CVPR_2019_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrnet_udp_coco.yml | https://download.openmmlab.com/mmpose/top_down/udp/hrnet_w48_coco_384x288_udp-0f89c63e_20210223.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrnet_udp_coco.yml | https://download.openmmlab.com/mmpose/top_down/udp/hrnet_w48_coco_256x192_udp-2554c524_20210223.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrnet_udp_coco.yml | https://download.openmmlab.com/mmpose/top_down/udp/hrnet_w32_coco_384x288_udp-e97c1a0f_20210223.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrnet_udp_coco.yml | https://download.openmmlab.com/mmpose/top_down/udp/hrnet_w32_coco_256x192_udp-aba0be42_20210220.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrnet_udp_coco.yml | https://download.openmmlab.com/mmpose/top_down/udp/hrnet_w32_coco_256x192_udp_regress-be2dbba4_20210222.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrnet_udp_coco.yml | http://openaccess.thecvf.com/content_CVPR_2020/html/Huang_The_Devil_Is_in_the_Details_Delving_Into_Unbiased_Data_CVPR_2020_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrnet_w32_coco_256x192.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrnet_w32_coco_256x192_coarsedropout.py | https://download.openmmlab.com/mmpose/top_down/hrnet/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrnet_w32_coco_256x192_dark.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrnet_w32_coco_256x192_gridmask.py | https://download.openmmlab.com/mmpose/top_down/hrnet/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrnet_w32_coco_256x192_photometric.py | https://download.openmmlab.com/mmpose/top_down/hrnet/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrnet_w32_coco_256x192_udp.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrnet_w32_coco_256x192_udp_regress.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrnet_w32_coco_384x288.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrnet_w32_coco_384x288_dark.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrnet_w32_coco_384x288_udp.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrnet_w48_coco_256x192.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrnet_w48_coco_256x192_dark.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrnet_w48_coco_256x192_udp.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrnet_w48_coco_384x288.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrnet_w48_coco_384x288_dark.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrnet_w48_coco_384x288_udp.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/hrnetv2_w64_coco_384x288_udp.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/litehrnet_coco.yml | https://download.openmmlab.com/mmpose/top_down/litehrnet/litehrnet30_coco_384x288-a3aef5c4_20210626.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/litehrnet_coco.yml | https://download.openmmlab.com/mmpose/top_down/litehrnet/litehrnet30_coco_256x192-4176555b_20210626.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/litehrnet_coco.yml | https://download.openmmlab.com/mmpose/top_down/litehrnet/litehrnet18_coco_384x288-8d4dac48_20211230.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/litehrnet_coco.yml | https://download.openmmlab.com/mmpose/top_down/litehrnet/litehrnet18_coco_256x192-6bace359_20211230.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/litehrnet_coco.yml | https://arxiv.org/abs/2104.06403 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/mobilenetv2_coco.yml | https://download.openmmlab.com/mmpose/top_down/mobilenetv2/mobilenetv2_coco_384x288-26be4816_20200727.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/mobilenetv2_coco.yml | https://download.openmmlab.com/mmpose/top_down/mobilenetv2/mobilenetv2_coco_256x192-d1e58e7b_20200727.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/mobilenetv2_coco.yml | http://openaccess.thecvf.com/content_cvpr_2018/html/Sandler_MobileNetV2_Inverted_Residuals_CVPR_2018_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/mspn_coco.yml | 
https://download.openmmlab.com/mmpose/top_down/mspn/mspn50_coco_256x192-8fbfb5d0_20201123.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/mspn_coco.yml | https://download.openmmlab.com/mmpose/top_down/mspn/4xmspn50_coco_256x192-7b837afb_20201123.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/mspn_coco.yml | https://download.openmmlab.com/mmpose/top_down/mspn/3xmspn50_coco_256x192-e348f18e_20201123.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/mspn_coco.yml | https://download.openmmlab.com/mmpose/top_down/mspn/2xmspn50_coco_256x192-c8765a5c_20201123.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/mspn_coco.yml | https://arxiv.org/abs/1901.00148 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/pvt_coco.yml | https://download.openmmlab.com/mmpose/top_down/pvt/pvtv2_b2_coco_256x192-b4212737_20220501.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/pvt_coco.yml | https://download.openmmlab.com/mmpose/top_down/pvt/pvt_small_coco_256x192-4324a49d_20220501.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/pvt_coco.yml | https://arxiv.org/abs/2102.12122 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/resnest_coco.yml | https://download.openmmlab.com/mmpose/top_down/resnest/resnest50_coco_384x288-dcd20436_20210320.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/resnest_coco.yml | https://download.openmmlab.com/mmpose/top_down/resnest/resnest50_coco_256x192-6e65eece_20210320.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/resnest_coco.yml | https://download.openmmlab.com/mmpose/top_down/resnest/resnest269_coco_384x288-b142b9fb_20210517.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/resnest_coco.yml | https://download.openmmlab.com/mmpose/top_down/resnest/resnest269_coco_256x192-2a7882ac_20210517.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/resnest_coco.yml | https://download.openmmlab.com/mmpose/top_down/resnest/resnest200_coco_384x288-b5bb76cb_20210517.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/resnest_coco.yml | https://download.openmmlab.com/mmpose/top_down/resnest/resnest200_coco_256x192-db007a48_20210517.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/resnest_coco.yml | 
https://download.openmmlab.com/mmpose/top_down/resnest/resnest101_coco_384x288-80660658_20210320.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/resnest_coco.yml | https://download.openmmlab.com/mmpose/top_down/resnest/resnest101_coco_256x192-2ffcdc9d_20210320.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/resnest_coco.yml | https://arxiv.org/abs/2004.08955 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/resnet_coco.yml | http://openaccess.thecvf.com/content_ECCV_2018/html/Bin_Xiao_Simple_Baselines_for_ECCV_2018_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/resnet_coco.yml | https://download.openmmlab.com/mmpose/top_down/resnet/res50_coco_384x288-e6f795e9_20200709.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/resnet_coco.yml | https://download.openmmlab.com/mmpose/top_down/resnet/res50_coco_256x192-ec54d7f3_20200709.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/resnet_coco.yml | https://download.openmmlab.com/mmpose/top_down/resnet/res152_coco_384x288-3860d4c9_20200709.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/resnet_coco.yml | https://download.openmmlab.com/mmpose/top_down/resnet/res152_coco_256x192-f6e307c2_20200709.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/resnet_coco.yml | https://download.openmmlab.com/mmpose/top_down/resnet/res101_coco_384x288-8c71bdc9_20200709.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/resnet_coco.yml | https://download.openmmlab.com/mmpose/top_down/resnet/res101_coco_256x192-6e6babf0_20200708.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/resnet_dark_coco.yml | http://openaccess.thecvf.com/content_CVPR_2020/html/Zhang_Distribution-Aware_Coordinate_Representation_for_Human_Pose_Estimation_CVPR_2020_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/resnet_dark_coco.yml | https://download.openmmlab.com/mmpose/top_down/resnet/res50_coco_384x288_dark-33d3e5e5_20210203.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/resnet_dark_coco.yml | https://download.openmmlab.com/mmpose/top_down/resnet/res50_coco_256x192_dark-43379d20_20200709.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/resnet_dark_coco.yml | https://download.openmmlab.com/mmpose/top_down/resnet/res152_coco_384x288_dark-d3b8ebd7_20210203.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/resnet_dark_coco.yml | https://download.openmmlab.com/mmpose/top_down/resnet/res152_coco_256x192_dark-ab4840d5_20200812.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/resnet_dark_coco.yml | https://download.openmmlab.com/mmpose/top_down/resnet/res101_coco_384x288_dark-cb45c88d_20210203.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/resnet_dark_coco.yml | https://download.openmmlab.com/mmpose/top_down/resnet/res101_coco_256x192_dark-64d433e6_20200812.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/resnet_fp16_coco.yml | http://openaccess.thecvf.com/content_ECCV_2018/html/Bin_Xiao_Simple_Baselines_for_ECCV_2018_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/resnet_fp16_coco.yml | https://download.openmmlab.com/mmpose/top_down/resnet/res50_coco_256x192_fp16_dynamic-6edb79f3_20210430.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/resnetv1d_coco.yml | https://download.openmmlab.com/mmpose/top_down/resnetv1d/resnetv1d50_coco_384x288-01f3fbb9_20200727.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/resnetv1d_coco.yml | https://download.openmmlab.com/mmpose/top_down/resnetv1d/resnetv1d50_coco_256x192-a243b840_20200727.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/resnetv1d_coco.yml | https://download.openmmlab.com/mmpose/top_down/resnetv1d/resnetv1d152_coco_384x288-626c622d_20200730.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/resnetv1d_coco.yml | https://download.openmmlab.com/mmpose/top_down/resnetv1d/resnetv1d152_coco_256x192-c4df51dc_20200727.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/resnetv1d_coco.yml | https://download.openmmlab.com/mmpose/top_down/resnetv1d/resnetv1d101_coco_384x288-5f9e421d_20200730.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/resnetv1d_coco.yml | https://download.openmmlab.com/mmpose/top_down/resnetv1d/resnetv1d101_coco_256x192-5bd08cab_20200727.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/resnetv1d_coco.yml | http://openaccess.thecvf.com/content_CVPR_2019/html/He_Bag_of_Tricks_for_Image_Classification_with_Convolutional_Neural_Networks_CVPR_2019_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/resnext_coco.yml | https://download.openmmlab.com/mmpose/top_down/resnext/resnext50_coco_384x288-412c848f_20200727.pth | 权重地址 | 
+| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/resnext_coco.yml | https://download.openmmlab.com/mmpose/top_down/resnext/resnext50_coco_256x192-dcff15f6_20200727.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/resnext_coco.yml | https://download.openmmlab.com/mmpose/top_down/resnext/resnext152_coco_384x288-806176df_20200727.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/resnext_coco.yml | https://download.openmmlab.com/mmpose/top_down/resnext/resnext152_coco_256x192-102449aa_20200727.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/resnext_coco.yml | https://download.openmmlab.com/mmpose/top_down/resnext/resnext101_coco_384x288-f5eabcd6_20200727.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/resnext_coco.yml | https://download.openmmlab.com/mmpose/top_down/resnext/resnext101_coco_256x192-c7eba365_20200727.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/resnext_coco.yml | http://openaccess.thecvf.com/content_cvpr_2017/html/Xie_Aggregated_Residual_Transformations_CVPR_2017_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/rsn_coco.yml | https://download.openmmlab.com/mmpose/top_down/rsn/rsn50_coco_256x192-72ffe709_20201127.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/rsn_coco.yml | https://download.openmmlab.com/mmpose/top_down/rsn/rsn18_coco_256x192-72f4b4a7_20201127.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/rsn_coco.yml | https://download.openmmlab.com/mmpose/top_down/rsn/3xrsn50_coco_256x192-58f57a68_20201127.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/rsn_coco.yml | https://download.openmmlab.com/mmpose/top_down/rsn/2xrsn50_coco_256x192-50648f0e_20201127.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/rsn_coco.yml | https://link.springer.com/chapter/10.1007/978-3-030-58580-8_27 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/scnet_coco.yml | https://download.openmmlab.com/mmpose/top_down/scnet/scnet50_coco_384x288-9cacd0ea_20200709.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/scnet_coco.yml | https://download.openmmlab.com/mmpose/top_down/scnet/scnet50_coco_256x192-6920f829_20200709.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/scnet_coco.yml | 
https://download.openmmlab.com/mmpose/top_down/scnet/scnet101_coco_384x288-0b6e631b_20200709.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/scnet_coco.yml | https://download.openmmlab.com/mmpose/top_down/scnet/scnet101_coco_256x192-6d348ef9_20200709.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/scnet_coco.yml | http://openaccess.thecvf.com/content_CVPR_2020/html/Liu_Improving_Convolutional_Networks_With_Self-Calibrated_Convolutions_CVPR_2020_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/scnet101_coco_256x192.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/scnet101_coco_384x288.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/scnet50_coco_256x192.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/scnet50_coco_384x288.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/seresnet_coco.yml | https://download.openmmlab.com/mmpose/top_down/seresnet/seresnet50_coco_384x288-bc0b7680_20200727.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/seresnet_coco.yml | https://download.openmmlab.com/mmpose/top_down/seresnet/seresnet50_coco_256x192-25058b66_20200727.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/seresnet_coco.yml | https://download.openmmlab.com/mmpose/top_down/seresnet/seresnet152_coco_384x288-58b23ee8_20200727.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/seresnet_coco.yml | https://download.openmmlab.com/mmpose/top_down/seresnet/seresnet152_coco_256x192-1c628d79_20200727.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/seresnet_coco.yml | https://download.openmmlab.com/mmpose/top_down/seresnet/seresnet101_coco_384x288-48de1709_20200727.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/seresnet_coco.yml | https://download.openmmlab.com/mmpose/top_down/seresnet/seresnet101_coco_256x192-83f29c4d_20200727.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/seresnet_coco.yml | http://openaccess.thecvf.com/content_cvpr_2018/html/Hu_Squeeze-and-Excitation_Networks_CVPR_2018_paper | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/shufflenetv1_coco.yml | 
https://download.openmmlab.com/mmpose/top_down/shufflenetv1/shufflenetv1_coco_384x288-b2930b24_20200804.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/shufflenetv1_coco.yml | https://download.openmmlab.com/mmpose/top_down/shufflenetv1/shufflenetv1_coco_256x192-353bc02c_20200727.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/shufflenetv1_coco.yml | http://openaccess.thecvf.com/content_cvpr_2018/html/Zhang_ShuffleNet_An_Extremely_CVPR_2018_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/shufflenetv2_coco.yml | https://download.openmmlab.com/mmpose/top_down/shufflenetv2/shufflenetv2_coco_384x288-fb38ac3a_20200921.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/shufflenetv2_coco.yml | https://download.openmmlab.com/mmpose/top_down/shufflenetv2/shufflenetv2_coco_256x192-0aba71c7_20200921.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/shufflenetv2_coco.yml | http://openaccess.thecvf.com/content_ECCV_2018/html/Ningning_Light-weight_CNN_Architecture_ECCV_2018_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/swin_coco.yml | https://download.openmmlab.com/mmpose/top_down/swin/swin_t_p4_w7_coco_256x192-eaefe010_20220503.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/swin_coco.yml | https://download.openmmlab.com/mmpose/top_down/swin/swin_l_p4_w7_coco_384x288-c36b7845_20220705.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/swin_coco.yml | https://download.openmmlab.com/mmpose/top_down/swin/swin_l_p4_w7_coco_256x192-642a89db_20220705.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/swin_coco.yml | https://download.openmmlab.com/mmpose/top_down/swin/swin_b_p4_w7_fpn_coco_256x192-a3b91c45_20220705.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/swin_coco.yml | https://download.openmmlab.com/mmpose/top_down/swin/swin_b_p4_w7_coco_384x288-3abf54f9_20220705.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/swin_coco.yml | https://download.openmmlab.com/mmpose/top_down/swin/swin_b_p4_w7_coco_256x192-7432be9e_20220705.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/swin_coco.yml | https://arxiv.org/abs/2103.14030 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/vgg_coco.yml | https://download.openmmlab.com/mmpose/top_down/vgg/vgg16_bn_coco_256x192-7e7c58d6_20210517.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/vgg_coco.yml | https://arxiv.org/abs/1409.1556 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/vipnas_coco.yml | https://download.openmmlab.com/mmpose/top_down/vipnas/vipnas_res50_coco_256x192-cc43b466_20210624.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/vipnas_coco.yml | https://download.openmmlab.com/mmpose/top_down/vipnas/vipnas_mbv3_coco_256x192-7018731a_20211122.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/coco/vipnas_coco.yml | https://arxiv.org/abs/2105.10154 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/crowdpose/hrnet_crowdpose.yml | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w32_crowdpose_256x192-960be101_20201227.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/crowdpose/hrnet_crowdpose.yml | http://openaccess.thecvf.com/content_CVPR_2019/html/Sun_Deep_High-Resolution_Representation_Learning_for_Human_Pose_Estimation_CVPR_2019_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/crowdpose/hrnet_w32_crowdpose_256x192.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/crowdpose/hrnet_w32_crowdpose_384x288.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/crowdpose/hrnet_w48_crowdpose_256x192.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/crowdpose/hrnet_w48_crowdpose_384x288.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/crowdpose/resnet_crowdpose.yml | http://openaccess.thecvf.com/content_ECCV_2018/html/Bin_Xiao_Simple_Baselines_for_ECCV_2018_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/crowdpose/resnet_crowdpose.yml | https://download.openmmlab.com/mmpose/top_down/resnet/res50_crowdpose_256x192-c6a526b6_20201227.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/crowdpose/resnet_crowdpose.yml | https://download.openmmlab.com/mmpose/top_down/resnet/res152_crowdpose_256x192-dbd49aba_20201227.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/crowdpose/resnet_crowdpose.yml | https://download.openmmlab.com/mmpose/top_down/resnet/res101_crowdpose_320x256-c88c512a_20201227.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/crowdpose/resnet_crowdpose.yml | https://download.openmmlab.com/mmpose/top_down/resnet/res101_crowdpose_256x192-8f5870f4_20201227.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/h36m/hrnet_h36m.yml | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w48_h36m_256x256-78e88d08_20210621.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/h36m/hrnet_h36m.yml | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w32_h36m_256x256-d3206675_20210621.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/h36m/hrnet_h36m.yml | http://openaccess.thecvf.com/content_CVPR_2019/html/Sun_Deep_High-Resolution_Representation_Learning_for_Human_Pose_Estimation_CVPR_2019_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/h36m/hrnet_w32_h36m_256x256.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/h36m/hrnet_w48_h36m_256x256.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/jhmdb/cpm_jhmdb.yml | https://download.openmmlab.com/mmpose/top_down/cpm/cpm_jhmdb_sub3_368x368-49337155_20201122.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/jhmdb/cpm_jhmdb.yml | https://download.openmmlab.com/mmpose/top_down/cpm/cpm_jhmdb_sub3_368x368-49337155_20201122.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/jhmdb/cpm_jhmdb.yml | https://download.openmmlab.com/mmpose/top_down/cpm/cpm_jhmdb_sub2_368x368-fc742f1f_20201122.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/jhmdb/cpm_jhmdb.yml | https://download.openmmlab.com/mmpose/top_down/cpm/cpm_jhmdb_sub2_368x368-fc742f1f_20201122.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/jhmdb/cpm_jhmdb.yml | https://download.openmmlab.com/mmpose/top_down/cpm/cpm_jhmdb_sub1_368x368-2d2585c9_20201122.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/jhmdb/cpm_jhmdb.yml | https://download.openmmlab.com/mmpose/top_down/cpm/cpm_jhmdb_sub1_368x368-2d2585c9_20201122.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/jhmdb/cpm_jhmdb.yml | http://openaccess.thecvf.com/content_cvpr_2016/html/Wei_Convolutional_Pose_Machines_CVPR_2016_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/jhmdb/cpm_jhmdb_sub1_368x368.py | 
https://download.openmmlab.com/mmpose/top_down/cpm/cpm_mpii_368x368-116e62b8_20200822.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/jhmdb/cpm_jhmdb_sub2_368x368.py | https://download.openmmlab.com/mmpose/top_down/cpm/cpm_mpii_368x368-116e62b8_20200822.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/jhmdb/cpm_jhmdb_sub3_368x368.py | https://download.openmmlab.com/mmpose/top_down/cpm/cpm_mpii_368x368-116e62b8_20200822.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/jhmdb/res50_2deconv_jhmdb_sub1_256x256.py | https://download.openmmlab.com/mmpose/top_down/resnet/res50_mpii_256x256-418ffc88_20200812.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/jhmdb/res50_2deconv_jhmdb_sub2_256x256.py | https://download.openmmlab.com/mmpose/top_down/resnet/res50_mpii_256x256-418ffc88_20200812.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/jhmdb/res50_2deconv_jhmdb_sub3_256x256.py | https://download.openmmlab.com/mmpose/top_down/resnet/res50_mpii_256x256-418ffc88_20200812.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/jhmdb/res50_jhmdb_sub1_256x256.py | https://download.openmmlab.com/mmpose/top_down/resnet/res50_mpii_256x256-418ffc88_20200812.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/jhmdb/res50_jhmdb_sub2_256x256.py | https://download.openmmlab.com/mmpose/top_down/resnet/res50_mpii_256x256-418ffc88_20200812.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/jhmdb/res50_jhmdb_sub3_256x256.py | https://download.openmmlab.com/mmpose/top_down/resnet/res50_mpii_256x256-418ffc88_20200812.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/jhmdb/resnet_jhmdb.yml | http://openaccess.thecvf.com/content_ECCV_2018/html/Bin_Xiao_Simple_Baselines_for_ECCV_2018_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/jhmdb/resnet_jhmdb.yml | https://download.openmmlab.com/mmpose/top_down/resnet/res50_jhmdb_sub3_256x256-c4ec1a0b_20201122.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/jhmdb/resnet_jhmdb.yml | https://download.openmmlab.com/mmpose/top_down/resnet/res50_jhmdb_sub3_256x256-c4ec1a0b_20201122.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/jhmdb/resnet_jhmdb.yml | https://download.openmmlab.com/mmpose/top_down/resnet/res50_jhmdb_sub2_256x256-83d606f7_20201122.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/jhmdb/resnet_jhmdb.yml | 
https://download.openmmlab.com/mmpose/top_down/resnet/res50_jhmdb_sub2_256x256-83d606f7_20201122.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/jhmdb/resnet_jhmdb.yml | https://download.openmmlab.com/mmpose/top_down/resnet/res50_jhmdb_sub1_256x256-932cb3b4_20201122.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/jhmdb/resnet_jhmdb.yml | https://download.openmmlab.com/mmpose/top_down/resnet/res50_jhmdb_sub1_256x256-932cb3b4_20201122.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/jhmdb/resnet_jhmdb.yml | https://download.openmmlab.com/mmpose/top_down/resnet/res50_2deconv_jhmdb_sub3_256x256-c4bc2ddb_20201122.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/jhmdb/resnet_jhmdb.yml | https://download.openmmlab.com/mmpose/top_down/resnet/res50_2deconv_jhmdb_sub3_256x256-c4bc2ddb_20201122.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/jhmdb/resnet_jhmdb.yml | https://download.openmmlab.com/mmpose/top_down/resnet/res50_2deconv_jhmdb_sub2_256x256-f63af0ff_20201122.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/jhmdb/resnet_jhmdb.yml | https://download.openmmlab.com/mmpose/top_down/resnet/res50_2deconv_jhmdb_sub2_256x256-f63af0ff_20201122.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/jhmdb/resnet_jhmdb.yml | https://download.openmmlab.com/mmpose/top_down/resnet/res50_2deconv_jhmdb_sub1_256x256-f0574a52_20201122.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/jhmdb/resnet_jhmdb.yml | https://download.openmmlab.com/mmpose/top_down/resnet/res50_2deconv_jhmdb_sub1_256x256-f0574a52_20201122.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/mhp/resnet_mhp.yml | http://openaccess.thecvf.com/content_ECCV_2018/html/Bin_Xiao_Simple_Baselines_for_ECCV_2018_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/mhp/resnet_mhp.yml | https://download.openmmlab.com/mmpose/top_down/resnet/res50_mhp_256x192-28c5b818_20201229.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/mpii/cpm_mpii.yml | https://download.openmmlab.com/mmpose/top_down/cpm/cpm_mpii_368x368-116e62b8_20200822.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/mpii/cpm_mpii.yml | http://openaccess.thecvf.com/content_cvpr_2016/html/Wei_Convolutional_Pose_Machines_CVPR_2016_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/mpii/hourglass_mpii.yml | 
https://download.openmmlab.com/mmpose/top_down/hourglass/hourglass52_mpii_384x384-04090bc3_20200812.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/mpii/hourglass_mpii.yml | https://download.openmmlab.com/mmpose/top_down/hourglass/hourglass52_mpii_256x256-ae358435_20200812.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/mpii/hourglass_mpii.yml | https://link.springer.com/chapter/10.1007/978-3-319-46484-8_29 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/mpii/hrnet_dark_mpii.yml | http://openaccess.thecvf.com/content_CVPR_2020/html/Zhang_Distribution-Aware_Coordinate_Representation_for_Human_Pose_Estimation_CVPR_2020_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/mpii/hrnet_dark_mpii.yml | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w48_mpii_256x256_dark-0decd39f_20200927.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/mpii/hrnet_dark_mpii.yml | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w32_mpii_256x256_dark-f1601c5b_20200927.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/mpii/hrnet_mpii.yml | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w48_mpii_256x256-92cab7bd_20200812.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/mpii/hrnet_mpii.yml | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w32_mpii_256x256-6c4f923f_20200812.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/mpii/hrnet_mpii.yml | http://openaccess.thecvf.com/content_CVPR_2019/html/Sun_Deep_High-Resolution_Representation_Learning_for_Human_Pose_Estimation_CVPR_2019_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/mpii/hrnet_w32_mpii_256x256.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/mpii/hrnet_w32_mpii_256x256_dark.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/mpii/hrnet_w32_mpii_256x256_udp.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/mpii/hrnet_w48_mpii_256x256.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/mpii/hrnet_w48_mpii_256x256_dark.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/mpii/hrnet_w48_mpii_256x256_udp.py | 
https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/mpii/litehrnet_mpii.yml | https://download.openmmlab.com/mmpose/top_down/litehrnet/litehrnet30_mpii_256x256-faae8bd8_20210622.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/mpii/litehrnet_mpii.yml | https://download.openmmlab.com/mmpose/top_down/litehrnet/litehrnet18_mpii_256x256-cabd7984_20210623.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/mpii/litehrnet_mpii.yml | https://arxiv.org/abs/2104.06403 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/mpii/mobilenetv2_mpii.yml | https://download.openmmlab.com/mmpose/top_down/mobilenetv2/mobilenetv2_mpii_256x256-e068afa7_20200812.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/mpii/mobilenetv2_mpii.yml | http://openaccess.thecvf.com/content_cvpr_2018/html/Sandler_MobileNetV2_Inverted_Residuals_CVPR_2018_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/mpii/resnet_mpii.yml | https://download.openmmlab.com/mmpose/top_down/resnet/res50_mpii_256x256-418ffc88_20200812.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/mpii/resnet_mpii.yml | http://openaccess.thecvf.com/content_ECCV_2018/html/Bin_Xiao_Simple_Baselines_for_ECCV_2018_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/mpii/resnet_mpii.yml | https://download.openmmlab.com/mmpose/top_down/resnet/res152_mpii_256x256-3ecba29d_20200812.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/mpii/resnet_mpii.yml | https://download.openmmlab.com/mmpose/top_down/resnet/res101_mpii_256x256-416f5d71_20200812.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/mpii/resnetv1d_mpii.yml | https://download.openmmlab.com/mmpose/top_down/resnetv1d/resnetv1d50_mpii_256x256-2337a92e_20200812.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/mpii/resnetv1d_mpii.yml | https://download.openmmlab.com/mmpose/top_down/resnetv1d/resnetv1d152_mpii_256x256-8b10a87c_20200812.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/mpii/resnetv1d_mpii.yml | https://download.openmmlab.com/mmpose/top_down/resnetv1d/resnetv1d101_mpii_256x256-2851d710_20200812.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/mpii/resnetv1d_mpii.yml | http://openaccess.thecvf.com/content_CVPR_2019/html/He_Bag_of_Tricks_for_Image_Classification_with_Convolutional_Neural_Networks_CVPR_2019_paper.html | 论文地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/mpii/resnext_mpii.yml | https://download.openmmlab.com/mmpose/top_down/resnext/resnext152_mpii_256x256-df302719_20200927.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/mpii/resnext_mpii.yml | http://openaccess.thecvf.com/content_cvpr_2017/html/Xie_Aggregated_Residual_Transformations_CVPR_2017_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/mpii/scnet_mpii.yml | https://download.openmmlab.com/mmpose/top_down/scnet/scnet50_mpii_256x256-a54b6af5_20200812.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/mpii/scnet_mpii.yml | https://download.openmmlab.com/mmpose/top_down/scnet/scnet101_mpii_256x256-b4c2d184_20200812.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/mpii/scnet_mpii.yml | http://openaccess.thecvf.com/content_CVPR_2020/html/Liu_Improving_Convolutional_Networks_With_Self-Calibrated_Convolutions_CVPR_2020_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/mpii/scnet101_mpii_256x256.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/mpii/scnet50_mpii_256x256.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/mpii/seresnet_mpii.yml | https://download.openmmlab.com/mmpose/top_down/seresnet/seresnet50_mpii_256x256-1bb21f79_20200927.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/mpii/seresnet_mpii.yml | https://download.openmmlab.com/mmpose/top_down/seresnet/seresnet152_mpii_256x256-6ea1e774_20200927.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/mpii/seresnet_mpii.yml | https://download.openmmlab.com/mmpose/top_down/seresnet/seresnet101_mpii_256x256-0ba14ff5_20200927.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/mpii/seresnet_mpii.yml | http://openaccess.thecvf.com/content_cvpr_2018/html/Hu_Squeeze-and-Excitation_Networks_CVPR_2018_paper | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/mpii/shufflenetv1_mpii.yml | https://download.openmmlab.com/mmpose/top_down/shufflenetv1/shufflenetv1_mpii_256x256-dcc1c896_20200925.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/mpii/shufflenetv1_mpii.yml | http://openaccess.thecvf.com/content_cvpr_2018/html/Zhang_ShuffleNet_An_Extremely_CVPR_2018_paper.html | 论文地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/mpii/shufflenetv2_mpii.yml | https://download.openmmlab.com/mmpose/top_down/shufflenetv2/shufflenetv2_mpii_256x256-4fb9df2d_20200925.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/mpii/shufflenetv2_mpii.yml | http://openaccess.thecvf.com/content_ECCV_2018/html/Ningning_Light-weight_CNN_Architecture_ECCV_2018_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/mpii_trb/resnet_mpii_trb.yml | http://openaccess.thecvf.com/content_ECCV_2018/html/Bin_Xiao_Simple_Baselines_for_ECCV_2018_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/mpii_trb/resnet_mpii_trb.yml | https://download.openmmlab.com/mmpose/top_down/resnet/res50_mpii_trb_256x256-896036b8_20200812.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/mpii_trb/resnet_mpii_trb.yml | https://download.openmmlab.com/mmpose/top_down/resnet/res152_mpii_trb_256x256-dd369ce6_20200812.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/mpii_trb/resnet_mpii_trb.yml | https://download.openmmlab.com/mmpose/top_down/resnet/res101_mpii_trb_256x256-cfad2f05_20200812.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/ochuman/hrnet_ochuman.yml | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w48_coco_384x288-314c8528_20200708.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/ochuman/hrnet_ochuman.yml | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w48_coco_256x192-b9e0b3ab_20200708.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/ochuman/hrnet_ochuman.yml | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w32_coco_384x288-d9f0d786_20200708.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/ochuman/hrnet_ochuman.yml | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w32_coco_256x192-c78dce93_20200708.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/ochuman/hrnet_ochuman.yml | http://openaccess.thecvf.com/content_CVPR_2019/html/Sun_Deep_High-Resolution_Representation_Learning_for_Human_Pose_Estimation_CVPR_2019_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/ochuman/hrnet_w32_ochuman_256x192.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/ochuman/hrnet_w32_ochuman_384x288.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/ochuman/hrnet_w48_ochuman_256x192.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/ochuman/hrnet_w48_ochuman_384x288.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/ochuman/resnet_ochuman.yml | http://openaccess.thecvf.com/content_ECCV_2018/html/Bin_Xiao_Simple_Baselines_for_ECCV_2018_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/ochuman/resnet_ochuman.yml | https://download.openmmlab.com/mmpose/top_down/resnet/res50_coco_384x288-e6f795e9_20200709.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/ochuman/resnet_ochuman.yml | https://download.openmmlab.com/mmpose/top_down/resnet/res50_coco_256x192-ec54d7f3_20200709.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/ochuman/resnet_ochuman.yml | https://download.openmmlab.com/mmpose/top_down/resnet/res152_coco_384x288-3860d4c9_20200709.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/ochuman/resnet_ochuman.yml | https://download.openmmlab.com/mmpose/top_down/resnet/res152_coco_256x192-f6e307c2_20200709.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/ochuman/resnet_ochuman.yml | https://download.openmmlab.com/mmpose/top_down/resnet/res101_coco_384x288-8c71bdc9_20200709.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/ochuman/resnet_ochuman.yml | https://download.openmmlab.com/mmpose/top_down/resnet/res101_coco_256x192-6e6babf0_20200708.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/posetrack18/hrnet_posetrack18.yml | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w48_posetrack18_384x288-5fd6d3ff_20211130.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/posetrack18/hrnet_posetrack18.yml | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w48_posetrack18_384x288-5fd6d3ff_20211130.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/posetrack18/hrnet_posetrack18.yml | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w48_posetrack18_256x192-b5d9b3f1_20211130.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/posetrack18/hrnet_posetrack18.yml | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w48_posetrack18_256x192-b5d9b3f1_20211130.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/posetrack18/hrnet_posetrack18.yml | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w32_posetrack18_384x288-806f00a3_20211130.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/posetrack18/hrnet_posetrack18.yml | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w32_posetrack18_384x288-806f00a3_20211130.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/posetrack18/hrnet_posetrack18.yml | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w32_posetrack18_256x192-1ee951c4_20201028.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/posetrack18/hrnet_posetrack18.yml | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w32_posetrack18_256x192-1ee951c4_20201028.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/posetrack18/hrnet_posetrack18.yml | http://openaccess.thecvf.com/content_CVPR_2019/html/Sun_Deep_High-Resolution_Representation_Learning_for_Human_Pose_Estimation_CVPR_2019_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/posetrack18/hrnet_w32_posetrack18_256x192.py | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w32_coco_256x192-c78dce93_20200708.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/posetrack18/hrnet_w32_posetrack18_384x288.py | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w32_coco_384x288-d9f0d786_20200708.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/posetrack18/hrnet_w48_posetrack18_256x192.py | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w48_coco_256x192-b9e0b3ab_20200708.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/posetrack18/hrnet_w48_posetrack18_384x288.py | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w48_coco_384x288-314c8528_20200708.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/posetrack18/res50_posetrack18_256x192.py | https://download.openmmlab.com/mmpose/top_down/resnet/res50_coco_256x192-ec54d7f3_20200709.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/posetrack18/resnet_posetrack18.yml | http://openaccess.thecvf.com/content_ECCV_2018/html/Bin_Xiao_Simple_Baselines_for_ECCV_2018_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/posetrack18/resnet_posetrack18.yml | https://download.openmmlab.com/mmpose/top_down/resnet/res50_posetrack18_256x192-a62807c7_20201028.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_img/topdown_heatmap/posetrack18/resnet_posetrack18.yml | https://download.openmmlab.com/mmpose/top_down/resnet/res50_posetrack18_256x192-a62807c7_20201028.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_vid/posewarper/posetrack18/hrnet_posetrack18_posewarper.yml | https://download.openmmlab.com/mmpose/top_down/posewarper/hrnet_w48_posetrack18_384x288_posewarper_stage2-4abf88db_20211130.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_vid/posewarper/posetrack18/hrnet_posetrack18_posewarper.yml | https://download.openmmlab.com/mmpose/top_down/posewarper/hrnet_w48_posetrack18_384x288_posewarper_stage2-4abf88db_20211130.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_vid/posewarper/posetrack18/hrnet_posetrack18_posewarper.yml | https://arxiv.org/abs/1906.04016 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_vid/posewarper/posetrack18/hrnet_w48_posetrack18_384x288_posewarper_stage1.py | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w48_coco_384x288-314c8528_20200708.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/2d_kpt_sview_rgb_vid/posewarper/posetrack18/hrnet_w48_posetrack18_384x288_posewarper_stage2.py | https://download.openmmlab.com/mmpose/top_down/posewarper/hrnet_w48_posetrack18_384x288_posewarper_stage1-08b632aa_20211130.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/3d_kpt_mview_rgb_img/voxelpose/campus/voxelpose_campus.yml | https://download.openmmlab.com/mmpose/body3d/voxelpose/voxelpose_prn64x64x64_cpn80x80x20_campus_cam3-d8decbf7_20220323.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/3d_kpt_mview_rgb_img/voxelpose/campus/voxelpose_campus.yml | https://download.openmmlab.com/mmpose/body3d/voxelpose/voxelpose_prn32x32x32_cpn80x80x20_campus_cam3-3ecee30e_20220323.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/3d_kpt_mview_rgb_img/voxelpose/campus/voxelpose_campus.yml | https://www.ecva.net/papers/eccv_2020/papers_ECCV/papers/123460188.pdf | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/3d_kpt_mview_rgb_img/voxelpose/panoptic/voxelpose_prn64x64x64_cpn80x80x20_panoptic_cam5.yml | https://download.openmmlab.com/mmpose/body3d/voxelpose/voxelpose_prn64x64x64_cpn80x80x20_panoptic_cam5-545c150e_20211103.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/3d_kpt_mview_rgb_img/voxelpose/panoptic/voxelpose_prn64x64x64_cpn80x80x20_panoptic_cam5.yml | https://www.ecva.net/papers/eccv_2020/papers_ECCV/papers/123460188.pdf | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/3d_kpt_mview_rgb_img/voxelpose/shelf/voxelpose_shelf.yml | https://download.openmmlab.com/mmpose/body3d/voxelpose/voxelpose_prn64x64x64_cpn80x80x20_shelf_cam5-f406fefe_20220323.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/3d_kpt_mview_rgb_img/voxelpose/shelf/voxelpose_shelf.yml | https://download.openmmlab.com/mmpose/body3d/voxelpose/voxelpose_prn32x32x32_cpn48x48x12_shelf_cam5-24721ec7_20220323.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/3d_kpt_mview_rgb_img/voxelpose/shelf/voxelpose_shelf.yml | https://www.ecva.net/papers/eccv_2020/papers_ECCV/papers/123460188.pdf | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/3d_kpt_sview_rgb_img/pose_lift/h36m/simplebaseline3d_h36m.yml | https://download.openmmlab.com/mmpose/body3d/simple_baseline/simple3Dbaseline_h36m-f0ad73a4_20210419.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/3d_kpt_sview_rgb_img/pose_lift/h36m/simplebaseline3d_h36m.yml | http://openaccess.thecvf.com/content_iccv_2017/html/Martinez_A_Simple_yet_ICCV_2017_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/3d_kpt_sview_rgb_img/pose_lift/mpi_inf_3dhp/simplebaseline3d_mpi-inf-3dhp.yml | https://download.openmmlab.com/mmpose/body3d/simple_baseline/simplebaseline3d_mpi-inf-3dhp-b75546f6_20210603.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/3d_kpt_sview_rgb_img/pose_lift/mpi_inf_3dhp/simplebaseline3d_mpi-inf-3dhp.yml | http://openaccess.thecvf.com/content_iccv_2017/html/Martinez_A_Simple_yet_ICCV_2017_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/3d_kpt_sview_rgb_vid/video_pose_lift/h36m/videopose3d_h36m.yml | https://download.openmmlab.com/mmpose/body3d/videopose/videopose_h36m_81frames_fullconv_supervised-1f2d1104_20210527.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/3d_kpt_sview_rgb_vid/video_pose_lift/h36m/videopose3d_h36m.yml | https://download.openmmlab.com/mmpose/body3d/videopose/videopose_h36m_27frames_fullconv_supervised-fe8fbba9_20210527.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/3d_kpt_sview_rgb_vid/video_pose_lift/h36m/videopose3d_h36m.yml | https://download.openmmlab.com/mmpose/body3d/videopose/videopose_h36m_27frames_fullconv_semi-supervised-54aef83b_20210527.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/3d_kpt_sview_rgb_vid/video_pose_lift/h36m/videopose3d_h36m.yml | https://download.openmmlab.com/mmpose/body3d/videopose/videopose_h36m_27frames_fullconv_semi-supervised_cpn_ft-71be9cde_20210527.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/3d_kpt_sview_rgb_vid/video_pose_lift/h36m/videopose3d_h36m.yml | https://download.openmmlab.com/mmpose/body3d/videopose/videopose_h36m_243frames_fullconv_supervised-880bea25_20210527.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/3d_kpt_sview_rgb_vid/video_pose_lift/h36m/videopose3d_h36m.yml | https://download.openmmlab.com/mmpose/body3d/videopose/videopose_h36m_243frames_fullconv_supervised_cpn_ft-88f5abbb_20210527.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/3d_kpt_sview_rgb_vid/video_pose_lift/h36m/videopose3d_h36m.yml | 
https://download.openmmlab.com/mmpose/body3d/videopose/videopose_h36m_1frame_fullconv_supervised_cpn_ft-5c3afaed_20210527.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/3d_kpt_sview_rgb_vid/video_pose_lift/h36m/videopose3d_h36m.yml | http://openaccess.thecvf.com/content_CVPR_2019/html/Pavllo_3D_Human_Pose_Estimation_in_Video_With_Temporal_Convolutions_and_CVPR_2019_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/3d_kpt_sview_rgb_vid/video_pose_lift/mpi_inf_3dhp/videopose3d_mpi-inf-3dhp.yml | https://download.openmmlab.com/mmpose/body3d/videopose/videopose_mpi-inf-3dhp_1frame_fullconv_supervised_gt-d6ed21ef_20210603.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/3d_kpt_sview_rgb_vid/video_pose_lift/mpi_inf_3dhp/videopose3d_mpi-inf-3dhp.yml | http://openaccess.thecvf.com/content_CVPR_2019/html/Pavllo_3D_Human_Pose_Estimation_in_Video_With_Temporal_Convolutions_and_CVPR_2019_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/3d_mesh_sview_rgb_img/hmr/mixed/resnet_mixed.yml | https://download.openmmlab.com/mmpose/mesh/hmr/hmr_mesh_224x224-c21e8229_20201015.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/body/3d_mesh_sview_rgb_img/hmr/mixed/resnet_mixed.yml | http://openaccess.thecvf.com/content_cvpr_2018/html/Kanazawa_End-to-End_Recovery_of_CVPR_2018_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/face/2d_kpt_sview_rgb_img/deeppose/wflw/resnet_softwingloss_wflw.yml | https://download.openmmlab.com/mmpose/face/deeppose/deeppose_res50_wflw_256x256_softwingloss-4d34f22a_20211212.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/face/2d_kpt_sview_rgb_img/deeppose/wflw/resnet_softwingloss_wflw.yml | https://ieeexplore.ieee.org/document/9442331/ | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/face/2d_kpt_sview_rgb_img/deeppose/wflw/resnet_wflw.yml | https://download.openmmlab.com/mmpose/face/deeppose/deeppose_res50_wflw_256x256-92d0ba7f_20210303.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/face/2d_kpt_sview_rgb_img/deeppose/wflw/resnet_wflw.yml | http://openaccess.thecvf.com/content_cvpr_2016/html/He_Deep_Residual_Learning_CVPR_2016_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/face/2d_kpt_sview_rgb_img/deeppose/wflw/resnet_wingloss_wflw.yml | https://download.openmmlab.com/mmpose/face/deeppose/deeppose_res50_wflw_256x256_wingloss-f82a5e53_20210303.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/face/2d_kpt_sview_rgb_img/deeppose/wflw/resnet_wingloss_wflw.yml | http://openaccess.thecvf.com/content_cvpr_2018/html/Feng_Wing_Loss_for_CVPR_2018_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/face/2d_kpt_sview_rgb_img/topdown_heatmap/300w/hrnetv2_300w.yml | https://ieeexplore.ieee.org/abstract/document/9052469/ | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/face/2d_kpt_sview_rgb_img/topdown_heatmap/300w/hrnetv2_300w.yml | 
https://download.openmmlab.com/mmpose/face/hrnetv2/hrnetv2_w18_300w_256x256-eea53406_20211019.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/face/2d_kpt_sview_rgb_img/topdown_heatmap/aflw/hrnetv2_aflw.yml | https://ieeexplore.ieee.org/abstract/document/9052469/ | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/face/2d_kpt_sview_rgb_img/topdown_heatmap/aflw/hrnetv2_aflw.yml | https://download.openmmlab.com/mmpose/face/hrnetv2/hrnetv2_w18_aflw_256x256-f2bbc62b_20210125.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/face/2d_kpt_sview_rgb_img/topdown_heatmap/aflw/hrnetv2_dark_aflw.yml | http://openaccess.thecvf.com/content_CVPR_2020/html/Zhang_Distribution-Aware_Coordinate_Representation_for_Human_Pose_Estimation_CVPR_2020_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/face/2d_kpt_sview_rgb_img/topdown_heatmap/aflw/hrnetv2_dark_aflw.yml | https://download.openmmlab.com/mmpose/face/darkpose/hrnetv2_w18_aflw_256x256_dark-219606c0_20210125.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/face/2d_kpt_sview_rgb_img/topdown_heatmap/coco_wholebody_face/hourglass_coco_wholebody_face.yml | https://download.openmmlab.com/mmpose/face/hourglass/hourglass52_coco_wholebody_face_256x256-6994cf2e_20210909.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/face/2d_kpt_sview_rgb_img/topdown_heatmap/coco_wholebody_face/hourglass_coco_wholebody_face.yml | https://link.springer.com/chapter/10.1007/978-3-319-46484-8_29 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/face/2d_kpt_sview_rgb_img/topdown_heatmap/coco_wholebody_face/hrnetv2_coco_wholebody_face.yml | https://ieeexplore.ieee.org/abstract/document/9052469/ | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/face/2d_kpt_sview_rgb_img/topdown_heatmap/coco_wholebody_face/hrnetv2_coco_wholebody_face.yml | https://download.openmmlab.com/mmpose/face/hrnetv2/hrnetv2_w18_coco_wholebody_face_256x256-c1ca469b_20210909.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/face/2d_kpt_sview_rgb_img/topdown_heatmap/coco_wholebody_face/hrnetv2_dark_coco_wholebody_face.yml | http://openaccess.thecvf.com/content_CVPR_2020/html/Zhang_Distribution-Aware_Coordinate_Representation_for_Human_Pose_Estimation_CVPR_2020_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/face/2d_kpt_sview_rgb_img/topdown_heatmap/coco_wholebody_face/hrnetv2_dark_coco_wholebody_face.yml | https://download.openmmlab.com/mmpose/face/darkpose/hrnetv2_w18_coco_wholebody_face_256x256_dark-3d9a334e_20210909.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/face/2d_kpt_sview_rgb_img/topdown_heatmap/coco_wholebody_face/mobilenetv2_coco_wholebody_face.yml | https://download.openmmlab.com/mmpose/face/mobilenetv2/mobilenetv2_coco_wholebody_face_256x256-4a3f096e_20210909.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/face/2d_kpt_sview_rgb_img/topdown_heatmap/coco_wholebody_face/mobilenetv2_coco_wholebody_face.yml | 
http://openaccess.thecvf.com/content_cvpr_2018/html/Sandler_MobileNetV2_Inverted_Residuals_CVPR_2018_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/face/2d_kpt_sview_rgb_img/topdown_heatmap/coco_wholebody_face/resnet_coco_wholebody_face.yml | http://openaccess.thecvf.com/content_ECCV_2018/html/Bin_Xiao_Simple_Baselines_for_ECCV_2018_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/face/2d_kpt_sview_rgb_img/topdown_heatmap/coco_wholebody_face/resnet_coco_wholebody_face.yml | https://download.openmmlab.com/mmpose/face/resnet/res50_coco_wholebody_face_256x256-5128edf5_20210909.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/face/2d_kpt_sview_rgb_img/topdown_heatmap/coco_wholebody_face/scnet_coco_wholebody_face.yml | https://download.openmmlab.com/mmpose/face/scnet/scnet50_coco_wholebody_face_256x256-a0183f5f_20210909.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/face/2d_kpt_sview_rgb_img/topdown_heatmap/coco_wholebody_face/scnet_coco_wholebody_face.yml | http://openaccess.thecvf.com/content_CVPR_2020/html/Liu_Improving_Convolutional_Networks_With_Self-Calibrated_Convolutions_CVPR_2020_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/face/2d_kpt_sview_rgb_img/topdown_heatmap/coco_wholebody_face/scnet50_coco_wholebody_face_256x256.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/face/2d_kpt_sview_rgb_img/topdown_heatmap/cofw/hrnetv2_cofw.yml | https://ieeexplore.ieee.org/abstract/document/9052469/ | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/face/2d_kpt_sview_rgb_img/topdown_heatmap/cofw/hrnetv2_cofw.yml | https://download.openmmlab.com/mmpose/face/hrnetv2/hrnetv2_w18_cofw_256x256-49243ab8_20211019.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/face/2d_kpt_sview_rgb_img/topdown_heatmap/wflw/hrnetv2_awing_wflw.yml | https://ieeexplore.ieee.org/abstract/document/9052469/ | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/face/2d_kpt_sview_rgb_img/topdown_heatmap/wflw/hrnetv2_awing_wflw.yml | https://download.openmmlab.com/mmpose/face/hrnetv2/hrnetv2_w18_wflw_256x256_awing-5af5055c_20211212.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/face/2d_kpt_sview_rgb_img/topdown_heatmap/wflw/hrnetv2_dark_wflw.yml | http://openaccess.thecvf.com/content_CVPR_2020/html/Zhang_Distribution-Aware_Coordinate_Representation_for_Human_Pose_Estimation_CVPR_2020_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/face/2d_kpt_sview_rgb_img/topdown_heatmap/wflw/hrnetv2_dark_wflw.yml | https://download.openmmlab.com/mmpose/face/darkpose/hrnetv2_w18_wflw_256x256_dark-3f8e0c2c_20210125.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/face/2d_kpt_sview_rgb_img/topdown_heatmap/wflw/hrnetv2_wflw.yml | https://ieeexplore.ieee.org/abstract/document/9052469/ | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/face/2d_kpt_sview_rgb_img/topdown_heatmap/wflw/hrnetv2_wflw.yml | 
https://download.openmmlab.com/mmpose/face/hrnetv2/hrnetv2_w18_wflw_256x256-2bf032a6_20210125.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/fashion/2d_kpt_sview_rgb_img/deeppose/deepfashion/resnet_deepfashion.yml | https://download.openmmlab.com/mmpose/fashion/deeppose/deeppose_res50_deepfashion_upper_256x192-497799fb_20210309.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/fashion/2d_kpt_sview_rgb_img/deeppose/deepfashion/resnet_deepfashion.yml | https://download.openmmlab.com/mmpose/fashion/deeppose/deeppose_res50_deepfashion_lower_256x192-94e0e653_20210309.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/fashion/2d_kpt_sview_rgb_img/deeppose/deepfashion/resnet_deepfashion.yml | https://download.openmmlab.com/mmpose/fashion/deeppose/deeppose_res50_deepfashion_full_256x192-4e0273e2_20210309.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/fashion/2d_kpt_sview_rgb_img/deeppose/deepfashion/resnet_deepfashion.yml | http://openaccess.thecvf.com/content_cvpr_2016/html/He_Deep_Residual_Learning_CVPR_2016_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/fashion/2d_kpt_sview_rgb_img/topdown_heatmap/deepfashion/hrnet_w32_deepfashion_full_256x192.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/fashion/2d_kpt_sview_rgb_img/topdown_heatmap/deepfashion/hrnet_w32_deepfashion_full_256x192_udp.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/fashion/2d_kpt_sview_rgb_img/topdown_heatmap/deepfashion/hrnet_w32_deepfashion_lower_256x192.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/fashion/2d_kpt_sview_rgb_img/topdown_heatmap/deepfashion/hrnet_w32_deepfashion_lower_256x192_udp.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/fashion/2d_kpt_sview_rgb_img/topdown_heatmap/deepfashion/hrnet_w32_deepfashion_upper_256x192.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/fashion/2d_kpt_sview_rgb_img/topdown_heatmap/deepfashion/hrnet_w32_deepfashion_upper_256x192_udp.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/fashion/2d_kpt_sview_rgb_img/topdown_heatmap/deepfashion/hrnet_w48_deepfashion_full_256x192.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/fashion/2d_kpt_sview_rgb_img/topdown_heatmap/deepfashion/hrnet_w48_deepfashion_full_256x192_udp.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/fashion/2d_kpt_sview_rgb_img/topdown_heatmap/deepfashion/hrnet_w48_deepfashion_lower_256x192.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/fashion/2d_kpt_sview_rgb_img/topdown_heatmap/deepfashion/hrnet_w48_deepfashion_lower_256x192_udp.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/fashion/2d_kpt_sview_rgb_img/topdown_heatmap/deepfashion/hrnet_w48_deepfashion_upper_256x192.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/fashion/2d_kpt_sview_rgb_img/topdown_heatmap/deepfashion/hrnet_w48_deepfashion_upper_256x192_udp.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/fashion/2d_kpt_sview_rgb_img/topdown_heatmap/deepfashion/resnet_deepfashion.yml | http://openaccess.thecvf.com/content_ECCV_2018/html/Bin_Xiao_Simple_Baselines_for_ECCV_2018_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/fashion/2d_kpt_sview_rgb_img/topdown_heatmap/deepfashion/resnet_deepfashion.yml | https://download.openmmlab.com/mmpose/fashion/resnet/res50_deepfashion_upper_256x192-41794f03_20210124.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/fashion/2d_kpt_sview_rgb_img/topdown_heatmap/deepfashion/resnet_deepfashion.yml | https://download.openmmlab.com/mmpose/fashion/resnet/res50_deepfashion_lower_256x192-1292a839_20210124.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/fashion/2d_kpt_sview_rgb_img/topdown_heatmap/deepfashion/resnet_deepfashion.yml | https://download.openmmlab.com/mmpose/fashion/resnet/res50_deepfashion_full_256x192-0dbd6e42_20210124.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/fashion/2d_kpt_sview_rgb_img/topdown_heatmap/deepfashion2/resnet_deepfashion2.yml | http://openaccess.thecvf.com/content_ECCV_2018/html/Bin_Xiao_Simple_Baselines_for_ECCV_2018_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/fashion/2d_kpt_sview_rgb_img/topdown_heatmap/deepfashion2/resnet_deepfashion2.yml | https://download.openmmlab.com/mmpose/fashion/resnet/res50_deepfashion2_vest_dress_256x192-fb3fbd6f_20221208.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/fashion/2d_kpt_sview_rgb_img/topdown_heatmap/deepfashion2/resnet_deepfashion2.yml | https://download.openmmlab.com/mmpose/fashion/resnet/res50_deepfashion2_vest_256x192-4c48d05c_20221208.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/fashion/2d_kpt_sview_rgb_img/topdown_heatmap/deepfashion2/resnet_deepfashion2.yml | https://download.openmmlab.com/mmpose/fashion/resnet/res50_deepfashion2_trousers_256x192-3e632257_20221208.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/fashion/2d_kpt_sview_rgb_img/topdown_heatmap/deepfashion2/resnet_deepfashion2.yml | https://download.openmmlab.com/mmpose/fashion/resnet/res50_deepfashion2_sling_dress_256x192-8ebae0eb_20221208.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/fashion/2d_kpt_sview_rgb_img/topdown_heatmap/deepfashion2/resnet_deepfashion2.yml | 
https://download.openmmlab.com/mmpose/fashion/resnet/res50_deepfashion2_sling_256x192-ebb2b736_20221208.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/fashion/2d_kpt_sview_rgb_img/topdown_heatmap/deepfashion2/resnet_deepfashion2.yml | https://download.openmmlab.com/mmpose/fashion/resnet/res50_deepfashion2_skirt_256x192-09573469_20221208.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/fashion/2d_kpt_sview_rgb_img/topdown_heatmap/deepfashion2/resnet_deepfashion2.yml | https://download.openmmlab.com/mmpose/fashion/resnet/res50_deepfashion2_shorts_256x192-9ab23592_20221208.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/fashion/2d_kpt_sview_rgb_img/topdown_heatmap/deepfashion2/resnet_deepfashion2.yml | https://download.openmmlab.com/mmpose/fashion/resnet/res50_deepfashion2_short_sleeved_shirt_256x192-21e1c5da_20221208.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/fashion/2d_kpt_sview_rgb_img/topdown_heatmap/deepfashion2/resnet_deepfashion2.yml | https://download.openmmlab.com/mmpose/fashion/resnet/res50_deepfashion2_short_sleeved_outwear_256x192-a04c1298_20221208.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/fashion/2d_kpt_sview_rgb_img/topdown_heatmap/deepfashion2/resnet_deepfashion2.yml | https://download.openmmlab.com/mmpose/fashion/resnet/res50_deepfashion2_short_sleeved_dress_256x192-1345b07a_20221208.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/fashion/2d_kpt_sview_rgb_img/topdown_heatmap/deepfashion2/resnet_deepfashion2.yml | https://download.openmmlab.com/mmpose/fashion/resnet/res50_deepfashion2_long_sleeved_shirt_256x192-8679e7e3_20221208.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/fashion/2d_kpt_sview_rgb_img/topdown_heatmap/deepfashion2/resnet_deepfashion2.yml | https://download.openmmlab.com/mmpose/fashion/resnet/res50_deepfashion2_long_sleeved_outwear_256x192-31fbaecf_20221208.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/fashion/2d_kpt_sview_rgb_img/topdown_heatmap/deepfashion2/resnet_deepfashion2.yml | https://download.openmmlab.com/mmpose/fashion/resnet/res50_deepfashion2_long_sleeved_dress_256x192-87bac74e_20221208.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/deeppose/onehand10k/resnet_onehand10k.yml | https://download.openmmlab.com/mmpose/hand/deeppose/deeppose_res50_onehand10k_256x256-cbddf43a_20210330.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/deeppose/onehand10k/resnet_onehand10k.yml | http://openaccess.thecvf.com/content_cvpr_2016/html/He_Deep_Residual_Learning_CVPR_2016_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/deeppose/panoptic2d/resnet_panoptic2d.yml | https://download.openmmlab.com/mmpose/hand/deeppose/deeppose_res50_panoptic_256x256-8a745183_20210330.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/deeppose/panoptic2d/resnet_panoptic2d.yml | 
http://openaccess.thecvf.com/content_cvpr_2016/html/He_Deep_Residual_Learning_CVPR_2016_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/deeppose/rhd2d/resnet_rhd2d.yml | https://download.openmmlab.com/mmpose/hand/deeppose/deeppose_res50_rhd2d_256x256-37f1c4d3_20210330.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/deeppose/rhd2d/resnet_rhd2d.yml | http://openaccess.thecvf.com/content_cvpr_2016/html/He_Deep_Residual_Learning_CVPR_2016_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/coco_wholebody_hand/hourglass_coco_wholebody_hand.yml | https://download.openmmlab.com/mmpose/hand/hourglass/hourglass52_coco_wholebody_hand_256x256-7b05c6db_20210909.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/coco_wholebody_hand/hourglass_coco_wholebody_hand.yml | https://link.springer.com/chapter/10.1007/978-3-319-46484-8_29 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/coco_wholebody_hand/hrnetv2_coco_wholebody_hand.yml | https://ieeexplore.ieee.org/abstract/document/9052469/ | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/coco_wholebody_hand/hrnetv2_coco_wholebody_hand.yml | https://download.openmmlab.com/mmpose/hand/hrnetv2/hrnetv2_w18_coco_wholebody_hand_256x256-1c028db7_20210908.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/coco_wholebody_hand/hrnetv2_dark_coco_wholebody_hand.yml | http://openaccess.thecvf.com/content_CVPR_2020/html/Zhang_Distribution-Aware_Coordinate_Representation_for_Human_Pose_Estimation_CVPR_2020_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/coco_wholebody_hand/hrnetv2_dark_coco_wholebody_hand.yml | https://download.openmmlab.com/mmpose/hand/dark/hrnetv2_w18_coco_wholebody_hand_256x256_dark-a9228c9c_20210908.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/coco_wholebody_hand/litehrnet_coco_wholebody_hand.yml | https://download.openmmlab.com/mmpose/hand/litehrnet/litehrnet_w18_coco_wholebody_hand_256x256-d6945e6a_20210908.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/coco_wholebody_hand/litehrnet_coco_wholebody_hand.yml | https://arxiv.org/abs/2104.06403 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/coco_wholebody_hand/mobilenetv2_coco_wholebody_hand.yml | https://download.openmmlab.com/mmpose/hand/mobilenetv2/mobilenetv2_coco_wholebody_hand_256x256-06b8c877_20210909.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/coco_wholebody_hand/mobilenetv2_coco_wholebody_hand.yml | 
http://openaccess.thecvf.com/content_cvpr_2018/html/Sandler_MobileNetV2_Inverted_Residuals_CVPR_2018_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/coco_wholebody_hand/resnet_coco_wholebody_hand.yml | http://openaccess.thecvf.com/content_ECCV_2018/html/Bin_Xiao_Simple_Baselines_for_ECCV_2018_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/coco_wholebody_hand/resnet_coco_wholebody_hand.yml | https://download.openmmlab.com/mmpose/hand/resnet/res50_coco_wholebody_hand_256x256-8dbc750c_20210908.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/coco_wholebody_hand/scnet_coco_wholebody_hand.yml | https://download.openmmlab.com/mmpose/hand/scnet/scnet50_coco_wholebody_hand_256x256-e73414c7_20210909.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/coco_wholebody_hand/scnet_coco_wholebody_hand.yml | http://openaccess.thecvf.com/content_CVPR_2020/html/Liu_Improving_Convolutional_Networks_With_Self-Calibrated_Convolutions_CVPR_2020_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/coco_wholebody_hand/scnet50_coco_wholebody_hand_256x256.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/freihand2d/resnet_freihand2d.yml | http://openaccess.thecvf.com/content_ECCV_2018/html/Bin_Xiao_Simple_Baselines_for_ECCV_2018_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/freihand2d/resnet_freihand2d.yml | https://download.openmmlab.com/mmpose/hand/resnet/res50_freihand_224x224-ff0799bc_20200914.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/freihand2d/resnet_freihand2d.yml | https://download.openmmlab.com/mmpose/hand/resnet/res50_freihand_224x224-ff0799bc_20200914.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/interhand2d/resnet_interhand2d.yml | http://openaccess.thecvf.com/content_ECCV_2018/html/Bin_Xiao_Simple_Baselines_for_ECCV_2018_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/interhand2d/resnet_interhand2d.yml | https://download.openmmlab.com/mmpose/hand/resnet/res50_interhand2d_256x256_machine-8f3efe9a_20201102.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/interhand2d/resnet_interhand2d.yml | https://download.openmmlab.com/mmpose/hand/resnet/res50_interhand2d_256x256_machine-8f3efe9a_20201102.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/interhand2d/resnet_interhand2d.yml | https://download.openmmlab.com/mmpose/hand/resnet/res50_interhand2d_256x256_machine-8f3efe9a_20201102.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/interhand2d/resnet_interhand2d.yml | https://download.openmmlab.com/mmpose/hand/resnet/res50_interhand2d_256x256_machine-8f3efe9a_20201102.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/interhand2d/resnet_interhand2d.yml | https://download.openmmlab.com/mmpose/hand/resnet/res50_interhand2d_256x256_human-77b27d1a_20201029.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/interhand2d/resnet_interhand2d.yml | https://download.openmmlab.com/mmpose/hand/resnet/res50_interhand2d_256x256_human-77b27d1a_20201029.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/interhand2d/resnet_interhand2d.yml | https://download.openmmlab.com/mmpose/hand/resnet/res50_interhand2d_256x256_human-77b27d1a_20201029.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/interhand2d/resnet_interhand2d.yml | https://download.openmmlab.com/mmpose/hand/resnet/res50_interhand2d_256x256_human-77b27d1a_20201029.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/interhand2d/resnet_interhand2d.yml | https://download.openmmlab.com/mmpose/hand/resnet/res50_interhand2d_256x256_all-78cc95d4_20201102.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/interhand2d/resnet_interhand2d.yml | https://download.openmmlab.com/mmpose/hand/resnet/res50_interhand2d_256x256_all-78cc95d4_20201102.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/interhand2d/resnet_interhand2d.yml | https://download.openmmlab.com/mmpose/hand/resnet/res50_interhand2d_256x256_all-78cc95d4_20201102.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/interhand2d/resnet_interhand2d.yml | https://download.openmmlab.com/mmpose/hand/resnet/res50_interhand2d_256x256_all-78cc95d4_20201102.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/onehand10k/hrnetv2_dark_onehand10k.yml | http://openaccess.thecvf.com/content_CVPR_2020/html/Zhang_Distribution-Aware_Coordinate_Representation_for_Human_Pose_Estimation_CVPR_2020_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/onehand10k/hrnetv2_dark_onehand10k.yml | https://download.openmmlab.com/mmpose/hand/dark/hrnetv2_w18_onehand10k_256x256_dark-a2f80c64_20210330.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/onehand10k/hrnetv2_onehand10k.yml | https://ieeexplore.ieee.org/abstract/document/9052469/ | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/onehand10k/hrnetv2_onehand10k.yml | 
https://download.openmmlab.com/mmpose/hand/hrnetv2/hrnetv2_w18_onehand10k_256x256-30bc9c6b_20210330.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/onehand10k/hrnetv2_udp_onehand10k.yml | https://download.openmmlab.com/mmpose/hand/udp/hrnetv2_w18_onehand10k_256x256_udp-0d1b515d_20210330.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/onehand10k/hrnetv2_udp_onehand10k.yml | http://openaccess.thecvf.com/content_CVPR_2020/html/Huang_The_Devil_Is_in_the_Details_Delving_Into_Unbiased_Data_CVPR_2020_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/onehand10k/mobilenetv2_onehand10k.yml | https://download.openmmlab.com/mmpose/hand/mobilenetv2/mobilenetv2_onehand10k_256x256-f3a3d90e_20210330.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/onehand10k/mobilenetv2_onehand10k.yml | http://openaccess.thecvf.com/content_cvpr_2018/html/Sandler_MobileNetV2_Inverted_Residuals_CVPR_2018_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/onehand10k/resnet_onehand10k.yml | http://openaccess.thecvf.com/content_ECCV_2018/html/Bin_Xiao_Simple_Baselines_for_ECCV_2018_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/onehand10k/resnet_onehand10k.yml | https://download.openmmlab.com/mmpose/hand/resnet/res50_onehand10k_256x256-739c8639_20210330.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/panoptic2d/hrnetv2_dark_panoptic2d.yml | http://openaccess.thecvf.com/content_CVPR_2020/html/Zhang_Distribution-Aware_Coordinate_Representation_for_Human_Pose_Estimation_CVPR_2020_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/panoptic2d/hrnetv2_dark_panoptic2d.yml | https://download.openmmlab.com/mmpose/hand/dark/hrnetv2_w18_panoptic_256x256_dark-1f1e4b74_20210330.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/panoptic2d/hrnetv2_panoptic2d.yml | https://ieeexplore.ieee.org/abstract/document/9052469/ | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/panoptic2d/hrnetv2_panoptic2d.yml | https://download.openmmlab.com/mmpose/hand/hrnetv2/hrnetv2_w18_panoptic_256x256-53b12345_20210330.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/panoptic2d/hrnetv2_udp_panoptic2d.yml | https://download.openmmlab.com/mmpose/hand/udp/hrnetv2_w18_panoptic_256x256_udp-f9e15948_20210330.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/panoptic2d/hrnetv2_udp_panoptic2d.yml | http://openaccess.thecvf.com/content_CVPR_2020/html/Huang_The_Devil_Is_in_the_Details_Delving_Into_Unbiased_Data_CVPR_2020_paper.html | 
论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/panoptic2d/mobilenetv2_panoptic2d.yml | https://download.openmmlab.com/mmpose/hand/mobilenetv2/mobilenetv2_panoptic_256x256-b733d98c_20210330.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/panoptic2d/mobilenetv2_panoptic2d.yml | http://openaccess.thecvf.com/content_cvpr_2018/html/Sandler_MobileNetV2_Inverted_Residuals_CVPR_2018_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/panoptic2d/resnet_panoptic2d.yml | http://openaccess.thecvf.com/content_ECCV_2018/html/Bin_Xiao_Simple_Baselines_for_ECCV_2018_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/panoptic2d/resnet_panoptic2d.yml | https://download.openmmlab.com/mmpose/hand/resnet/res50_panoptic_256x256-4eafc561_20210330.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/rhd2d/hrnetv2_dark_rhd2d.yml | http://openaccess.thecvf.com/content_CVPR_2020/html/Zhang_Distribution-Aware_Coordinate_Representation_for_Human_Pose_Estimation_CVPR_2020_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/rhd2d/hrnetv2_dark_rhd2d.yml | https://download.openmmlab.com/mmpose/hand/dark/hrnetv2_w18_rhd2d_256x256_dark-4df3a347_20210330.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/rhd2d/hrnetv2_rhd2d.yml | https://ieeexplore.ieee.org/abstract/document/9052469/ | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/rhd2d/hrnetv2_rhd2d.yml | https://download.openmmlab.com/mmpose/hand/hrnetv2/hrnetv2_w18_rhd2d_256x256-95b20dd8_20210330.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/rhd2d/hrnetv2_udp_rhd2d.yml | https://download.openmmlab.com/mmpose/hand/udp/hrnetv2_w18_rhd2d_256x256_udp-63ba6007_20210330.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/rhd2d/hrnetv2_udp_rhd2d.yml | http://openaccess.thecvf.com/content_CVPR_2020/html/Huang_The_Devil_Is_in_the_Details_Delving_Into_Unbiased_Data_CVPR_2020_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/rhd2d/mobilenetv2_rhd2d.yml | https://download.openmmlab.com/mmpose/hand/mobilenetv2/mobilenetv2_rhd2d_256x256-85fa02db_20210330.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/rhd2d/mobilenetv2_rhd2d.yml | http://openaccess.thecvf.com/content_cvpr_2018/html/Sandler_MobileNetV2_Inverted_Residuals_CVPR_2018_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/rhd2d/resnet_rhd2d.yml | 
http://openaccess.thecvf.com/content_ECCV_2018/html/Bin_Xiao_Simple_Baselines_for_ECCV_2018_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/2d_kpt_sview_rgb_img/topdown_heatmap/rhd2d/resnet_rhd2d.yml | https://download.openmmlab.com/mmpose/hand/resnet/res50_rhd2d_256x256-5dc7e4cc_20210330.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/3d_kpt_sview_rgb_img/internet/interhand3d/internet_interhand3d.yml | https://download.openmmlab.com/mmpose/hand3d/internet/res50_intehand3dv1.0_all_256x256-42b7f2ac_20210702.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/3d_kpt_sview_rgb_img/internet/interhand3d/internet_interhand3d.yml | https://download.openmmlab.com/mmpose/hand3d/internet/res50_intehand3dv1.0_all_256x256-42b7f2ac_20210702.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/3d_kpt_sview_rgb_img/internet/interhand3d/internet_interhand3d.yml | https://link.springer.com/content/pdf/10.1007/978-3-030-58565-5_33.pdf | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/gesture_sview_rgbd_vid/mtut/nvgesture/i3d_nvgesture.yml | https://download.openmmlab.com/mmpose/gesture/mtut/i3d_nvgesture_bbox_224x224_fps30-98a8f288_20220530.pthh | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/gesture_sview_rgbd_vid/mtut/nvgesture/i3d_nvgesture.yml | https://download.openmmlab.com/mmpose/gesture/mtut/i3d_nvgesture_bbox_112x112_fps15-363b5956_20220530.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/gesture_sview_rgbd_vid/mtut/nvgesture/i3d_nvgesture.yml | https://download.openmmlab.com/mmpose/gesture/mtut/i3d_nvgesture_224x224_fps30-b7abf574_20220530.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/hand/gesture_sview_rgbd_vid/mtut/nvgesture/i3d_nvgesture.yml | https://openaccess.thecvf.com/content_CVPR_2019/html/Abavisani_Improving_the_Performance_of_Unimodal_Dynamic_Hand-Gesture_Recognition_With_Multimodal_CVPR_2019_paper.html | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/associative_embedding/coco-wholebody/higherhrnet_coco-wholebody.yml | https://download.openmmlab.com/mmpose/bottom_up/higher_hrnet48_coco_wholebody_512x512_plus-934f08aa_20210517.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/associative_embedding/coco-wholebody/higherhrnet_coco-wholebody.yml | https://download.openmmlab.com/mmpose/bottom_up/higher_hrnet32_coco_wholebody_512x512_plus-2fa137ab_20210517.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/associative_embedding/coco-wholebody/higherhrnet_coco-wholebody.yml | http://openaccess.thecvf.com/content_CVPR_2020/html/Cheng_HigherHRNet_Scale-Aware_Representation_Learning_for_Bottom-Up_Human_Pose_Estimation_CVPR_2020_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/associative_embedding/coco-wholebody/higherhrnet_w32_coco_wholebody_512x512.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/associative_embedding/coco-wholebody/higherhrnet_w32_coco_wholebody_640x640.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/associative_embedding/coco-wholebody/higherhrnet_w48_coco_wholebody_512x512.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/associative_embedding/coco-wholebody/higherhrnet_w48_coco_wholebody_640x640.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/associative_embedding/coco-wholebody/hrnet_coco-wholebody.yml | https://download.openmmlab.com/mmpose/bottom_up/hrnet_w48_coco_wholebody_512x512_plus-4de8a695_20210517.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/associative_embedding/coco-wholebody/hrnet_coco-wholebody.yml | https://download.openmmlab.com/mmpose/bottom_up/hrnet_w32_coco_wholebody_512x512_plus-f1f1185c_20210517.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/associative_embedding/coco-wholebody/hrnet_coco-wholebody.yml | http://openaccess.thecvf.com/content_CVPR_2019/html/Sun_Deep_High-Resolution_Representation_Learning_for_Human_Pose_Estimation_CVPR_2019_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/associative_embedding/coco-wholebody/hrnet_w32_coco_wholebody_512x512.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/associative_embedding/coco-wholebody/hrnet_w32_coco_wholebody_640x640.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/associative_embedding/coco-wholebody/hrnet_w48_coco_wholebody_512x512.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/associative_embedding/coco-wholebody/hrnet_w48_coco_wholebody_640x640.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/topdown_heatmap/coco-wholebody/hrnet_coco-wholebody.yml | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w48_coco_wholebody_384x288-6e061c6a_20200922.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/topdown_heatmap/coco-wholebody/hrnet_coco-wholebody.yml | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w48_coco_wholebody_256x192-643e18cb_20200922.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/topdown_heatmap/coco-wholebody/hrnet_coco-wholebody.yml | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w32_coco_wholebody_384x288-78cacac3_20200922.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/topdown_heatmap/coco-wholebody/hrnet_coco-wholebody.yml | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w32_coco_wholebody_256x192-853765cd_20200918.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/topdown_heatmap/coco-wholebody/hrnet_coco-wholebody.yml | http://openaccess.thecvf.com/content_CVPR_2019/html/Sun_Deep_High-Resolution_Representation_Learning_for_Human_Pose_Estimation_CVPR_2019_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/topdown_heatmap/coco-wholebody/hrnet_dark_coco-wholebody.yml | http://openaccess.thecvf.com/content_CVPR_2020/html/Zhang_Distribution-Aware_Coordinate_Representation_for_Human_Pose_Estimation_CVPR_2020_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/topdown_heatmap/coco-wholebody/hrnet_dark_coco-wholebody.yml | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w48_coco_wholebody_384x288_dark-f5726563_20200918.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/topdown_heatmap/coco-wholebody/hrnet_dark_coco-wholebody.yml | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w32_coco_wholebody_256x192_dark-469327ef_20200922.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/topdown_heatmap/coco-wholebody/hrnet_w32_coco_wholebody_256x192.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/topdown_heatmap/coco-wholebody/hrnet_w32_coco_wholebody_256x192_dark.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/topdown_heatmap/coco-wholebody/hrnet_w32_coco_wholebody_384x288.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/topdown_heatmap/coco-wholebody/hrnet_w32_coco_wholebody_384x288_dark.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/topdown_heatmap/coco-wholebody/hrnet_w48_coco_wholebody_256x192.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/topdown_heatmap/coco-wholebody/hrnet_w48_coco_wholebody_256x192_dark.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/topdown_heatmap/coco-wholebody/hrnet_w48_coco_wholebody_384x288.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/topdown_heatmap/coco-wholebody/hrnet_w48_coco_wholebody_384x288_dark.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/topdown_heatmap/coco-wholebody/hrnet_w48_coco_wholebody_384x288_dark_plus.py | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w48_coco_384x288_dark-741844ba_20200812.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/topdown_heatmap/coco-wholebody/resnet_coco-wholebody.yml | http://openaccess.thecvf.com/content_ECCV_2018/html/Bin_Xiao_Simple_Baselines_for_ECCV_2018_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/topdown_heatmap/coco-wholebody/resnet_coco-wholebody.yml | https://download.openmmlab.com/mmpose/top_down/resnet/res50_coco_wholebody_384x288-ce11e294_20201004.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/topdown_heatmap/coco-wholebody/resnet_coco-wholebody.yml | https://download.openmmlab.com/mmpose/top_down/resnet/res50_coco_wholebody_256x192-9e37ed88_20201004.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/topdown_heatmap/coco-wholebody/resnet_coco-wholebody.yml | https://download.openmmlab.com/mmpose/top_down/resnet/res152_coco_wholebody_384x288-eab8caa8_20201004.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/topdown_heatmap/coco-wholebody/resnet_coco-wholebody.yml | https://download.openmmlab.com/mmpose/top_down/resnet/res152_coco_wholebody_256x192-5de8ae23_20201004.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/topdown_heatmap/coco-wholebody/resnet_coco-wholebody.yml | https://download.openmmlab.com/mmpose/top_down/resnet/res101_coco_wholebody_384x288-6c137b9a_20201004.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/topdown_heatmap/coco-wholebody/resnet_coco-wholebody.yml | https://download.openmmlab.com/mmpose/top_down/resnet/res101_coco_wholebody_256x192-7325f982_20201004.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/topdown_heatmap/coco-wholebody/tcformer_coco_wholebody_256x192.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/topdown_heatmap/coco-wholebody/tcformer_coco-wholebody.yml | https://download.openmmlab.com/mmpose/top_down/tcformer/tcformer_coco-wholebody_256x192-a0720efa_20220627.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/topdown_heatmap/coco-wholebody/tcformer_coco-wholebody.yml | https://openaccess.thecvf.com/content/CVPR2022/html/Zeng_Not_All_Tokens_Are_Equal_Human-Centric_Visual_Analysis_via_Token_CVPR_2022_paper.html | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/topdown_heatmap/coco-wholebody/vipnas_coco-wholebody.yml | https://download.openmmlab.com/mmpose/top_down/vipnas/vipnas_res50_wholebody_256x192-49e1c3a4_20211112.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/topdown_heatmap/coco-wholebody/vipnas_coco-wholebody.yml | https://download.openmmlab.com/mmpose/top_down/vipnas/vipnas_mbv3_coco_wholebody_256x192-0fee581a_20211205.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/topdown_heatmap/coco-wholebody/vipnas_coco-wholebody.yml | https://arxiv.org/abs/2105.10154 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/topdown_heatmap/coco-wholebody/vipnas_dark_coco-wholebody.yml | https://download.openmmlab.com/mmpose/top_down/vipnas/vipnas_res50_wholebody_256x192_dark-67c0ce35_20211112.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/topdown_heatmap/coco-wholebody/vipnas_dark_coco-wholebody.yml | https://download.openmmlab.com/mmpose/top_down/vipnas/vipnas_mbv3_coco_wholebody_256x192_dark-e2158108_20211205.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/topdown_heatmap/coco-wholebody/vipnas_dark_coco-wholebody.yml | https://arxiv.org/abs/2105.10154 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/topdown_heatmap/halpe/hrnet_dark_halpe.yml | http://openaccess.thecvf.com/content_CVPR_2020/html/Zhang_Distribution-Aware_Coordinate_Representation_for_Human_Pose_Estimation_CVPR_2020_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/topdown_heatmap/halpe/hrnet_dark_halpe.yml | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w48_halpe_384x288_dark_plus-d13c2588_20211021.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/topdown_heatmap/halpe/hrnet_w32_halpe_256x192.py | https://download.openmmlab.com/mmpose/ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/configs/wholebody/2d_kpt_sview_rgb_img/topdown_heatmap/halpe/hrnet_w48_halpe_384x288_dark_plus.py | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w48_coco_384x288_dark-741844ba_20200812.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/demo/mmdetection_cfg/ssdlite_mobilenetv2_scratch_600e_onehand.py | https://download.openmmlab.com/mmdetection/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/demo/mmtracking_cfg/deepsort_faster-rcnn_fpn_4e_mot17-private-half.py | https://download.openmmlab.com/mmtracking/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/demo/mmtracking_cfg/deepsort_faster-rcnn_fpn_4e_mot17-private-half.py | https://download.openmmlab.com/mmtracking/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/demo/mmtracking_cfg/tracktor_faster-rcnn_r50_fpn_4e_mot17-private.py | https://download.openmmlab.com/mmtracking/mot/ | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/demo/mmtracking_cfg/tracktor_faster-rcnn_r50_fpn_4e_mot17-private.py | https://download.openmmlab.com/mmtracking/ | 模型相关说明 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/demo/webcam_cfg/gesture_recognition.py | https://download.openmmlab.com/mmpose/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/demo/webcam_cfg/gesture_recognition.py | https://download.openmmlab.com/mmpose/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/demo/webcam_cfg/pose_estimation.py | https://download.openmmlab.com/mmpose/top_down/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/demo/webcam_cfg/pose_estimation.py | https://download.openmmlab.com/mmpose/animal/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/demo/webcam_cfg/pose_estimation.py | https://download.openmmlab.com | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/demo/webcam_cfg/pose_tracking.py | https://download.openmmlab.com/mmpose/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/demo/webcam_cfg/pose_tracking.py | https://download.openmmlab.com | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/docker/Dockerfile | https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1804/x86_64/7fa2af80.pub | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/docker/Dockerfile | https://download.openmmlab.com/mmcv/dist/cu101/torch1.6.0/index.html | 三方库连接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/docker/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64/3bf863cc.pub | 公钥链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/docker/serve/Dockerfile | https://download.openmmlab.com/mmcv/dist/cu${CUDA//./}/torch${PYTORCH}/index.html | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/docker/serve/Dockerfile_mmcls | https://download.openmmlab.com/mmcv/dist/cu${CUDA//./}/torch${PYTORCH}/index.html | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/mmpose/apis/webcam/nodes/visualizer_nodes/sunglasses_effect_node.py | https://user-images.githubusercontent.com/15977946/ | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/pose_estimation/HRNet_MMPose_for_PyTorch/setup.py | openmmlab@gmail.com | 作者邮箱 | \ No newline at end of file diff --git a/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/public_address_statement.md b/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/public_address_statement.md index 0192e9008d7f759369ca869b2bb778c9c4db2bd6..a1495f9b1839a75a8c2496c892d7b6ad634aeb2f 100644 --- a/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/public_address_statement.md @@ -1,170 +1,723 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ---- | ------------ | ------ | ------------------------------------ | -------- | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/deeplabv3plus/README.md | BiseNetV1_for_PyTorch/configs/_base_/datasets/isaid.py | https://arxiv.org/pdf/2103.06564.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/_base_/models/fpn_poolformer_s12.py | BiseNetV1_for_PyTorch/configs/_base_/models/fpn_poolformer_s12.py | 
https://download.openmmlab.com/mmclassification/v0/poolformer/poolformer-s12_3rdparty_32xb128_in1k_20220414-f8d83051.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/_base_/models/segmenter_vit-b16_mask.py | BiseNetV1_for_PyTorch/configs/_base_/models/segmenter_vit-b16_mask.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segmenter/vit_base_p16_384_20220308-96dfe169.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/_base_/models/twins_pcpvt-s_fpn.py | BiseNetV1_for_PyTorch/configs/_base_/models/twins_pcpvt-s_fpn.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/twins/pcpvt_small_20220308-e638c41c.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/_base_/models/twins_pcpvt-s_fpn.py | BiseNetV1_for_PyTorch/configs/_base_/models/twins_pcpvt-s_upernet.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/twins/pcpvt_small_20220308-e638c41c.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/convnext/README.md | BiseNetV1_for_PyTorch/configs/_base_/models/upernet_convnext.py | https://download.openmmlab.com/mmclassification/v0/convnext/downstream/convnext-base_3rdparty_32xb128-noema_in1k_20220301-2a0ee547.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/datasets/transforms/loading.py | BiseNetV1_for_PyTorch/mmseg/datasets/pipelines/loading.py | https://github.com/open-mmlab/mmsegmentation/pull/1445/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/bisenetv1/metafile.yaml | BiseNetV1_for_PyTorch/mmseg/models/backbones/bisenetv1.py | https://arxiv.org/abs/1808.00897 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/backbones/beit.py | BiseNetV1_for_PyTorch/mmseg/models/backbones/beit.py | https://github.com/microsoft/unilm/blob/master/beit/semantic_segmentation/mmcv_custom/checkpoint.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/backbones/beit.py | BiseNetV1_for_PyTorch/mmseg/models/backbones/beit.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/vision_transformer.py#L353 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/cgnet/metafile.yaml | BiseNetV1_for_PyTorch/mmseg/models/backbones/cgnet.py | https://arxiv.org/abs/1811.08201 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/bisenetv2/metafile.yaml | BiseNetV1_for_PyTorch/mmseg/models/backbones/bisenetv2.py | https://arxiv.org/abs/2004.02147 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/datasets/transforms/transforms.py | BiseNetV1_for_PyTorch/mmseg/datasets/pipelines/transforms.py | https://arxiv.org/abs/1708.04552 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/backbones/erfnet.py | BiseNetV1_for_PyTorch/mmseg/models/backbones/erfnet.py | https://ieeexplore.ieee.org/document/8063438 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/fastscnn/metafile.yaml | BiseNetV1_for_PyTorch/mmseg/models/backbones/fast_scnn.py | https://arxiv.org/abs/1902.04502 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/icnet/metafile.yaml | BiseNetV1_for_PyTorch/mmseg/models/backbones/icnet.py | https://arxiv.org/abs/1704.08545 | 参考论文地址 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/backbones/hrnet.py | BiseNetV1_for_PyTorch/mmseg/models/backbones/hrnet.py | https://arxiv.org/abs/1904.04514 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/mobilenet_v2/metafile.yaml | BiseNetV1_for_PyTorch/mmseg/models/backbones/mobilenet_v2.py | https://arxiv.org/abs/1801.04381 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/backbones/mae.py | BiseNetV1_for_PyTorch/mmseg/models/backbones/mae.py | https://github.com/microsoft/unilm/blob/master/beit/modeling_pretrain.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/backbones/mit.py | BiseNetV1_for_PyTorch/mmseg/models/backbones/mit.py | https://github.com/open-mmlab/mmcv/pull/1418 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/backbones/beit.py | BiseNetV1_for_PyTorch/mmseg/models/backbones/mae.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/vision_transformer.py#L353 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/backbones/mit.py | BiseNetV1_for_PyTorch/mmseg/models/backbones/mit.py | https://github.com/pytorch/pytorch/issues/37583 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/segformer/metafile.yaml | BiseNetV1_for_PyTorch/mmseg/models/backbones/mit.py | https://arxiv.org/abs/2105.15203 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/backbones/mobilenet_v3.py | BiseNetV1_for_PyTorch/mmseg/models/backbones/mobilenet_v3.py | https://ieeexplore.ieee.org/document/9008835 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/resnest/metafile.yaml | BiseNetV1_for_PyTorch/mmseg/models/backbones/resnest.py | https://arxiv.org/abs/2004.08955 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/backbones/resnext.py | BiseNetV1_for_PyTorch/mmseg/models/backbones/resnext.py | https://arxiv.org/abs/1611.05431 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/stdc/README.md | BiseNetV1_for_PyTorch/mmseg/models/backbones/stdc.py | https://github.com/MichaelFan01/STDC-Seg | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/dpt/README.md | BiseNetV1_for_PyTorch/mmseg/models/backbones/timm_backbone.py | https://github.com/rwightman/pytorch-image-models | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/backbones/resnet.py | BiseNetV1_for_PyTorch/mmseg/models/backbones/resnet.py | https://arxiv.org/abs/1512.03385 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/stdc/metafile.yaml | BiseNetV1_for_PyTorch/mmseg/models/backbones/stdc.py | https://arxiv.org/abs/2104.13188 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/swin/metafile.yaml | BiseNetV1_for_PyTorch/mmseg/models/backbones/swin.py | https://arxiv.org/abs/2103.14030 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/swin/README.md | BiseNetV1_for_PyTorch/mmseg/models/backbones/swin.py | https://github.com/microsoft/Swin-Transformer | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/backbones/twins.py | BiseNetV1_for_PyTorch/mmseg/models/backbones/twins.py | https://arxiv.org/abs/2102.10882 | 参考论文地址 
| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/backbones/resnet.py | BiseNetV1_for_PyTorch/mmseg/models/backbones/resnet.py | https://arxiv.org/abs/1812.01187 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/backbones/resnet.py | BiseNetV1_for_PyTorch/mmseg/models/backbones/twins.py | https://arxiv.org/abs/1512.03385 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/unet/metafile.yaml | BiseNetV1_for_PyTorch/mmseg/models/backbones/unet.py | https://arxiv.org/abs/1505.04597 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/backbones/vit.py | BiseNetV1_for_PyTorch/mmseg/models/backbones/vit.py | https://arxiv.org/abs/2010.11929 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/backbones/beit.py | BiseNetV1_for_PyTorch/mmseg/models/backbones/vit.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/vision_transformer.py#L353 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/decode_heads/apc_head.py | BiseNetV1_for_PyTorch/mmseg/models/decode_heads/apc_head.py | https://openaccess.thecvf.com/content_CVPR_2019/papers/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/deeplabv3/metafile.yaml | BiseNetV1_for_PyTorch/mmseg/models/decode_heads/aspp_head.py | https://arxiv.org/abs/1706.05587 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/ann/metafile.yaml | BiseNetV1_for_PyTorch/mmseg/models/decode_heads/ann_head.py | https://arxiv.org/abs/1908.07678 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/ccnet/metafile.yaml | BiseNetV1_for_PyTorch/mmseg/models/decode_heads/cc_head.py | https://arxiv.org/abs/1811.11721 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/danet/metafile.yaml | BiseNetV1_for_PyTorch/mmseg/models/decode_heads/da_head.py | https://arxiv.org/abs/1809.02983 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/dmnet/metafile.yaml | BiseNetV1_for_PyTorch/mmseg/models/decode_heads/dm_head.py | https://openaccess.thecvf.com/content_ICCV_2019/papers/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/dnlnet/metafile.yaml | BiseNetV1_for_PyTorch/mmseg/models/decode_heads/dnl_head.py | https://arxiv.org/abs/2006.06668 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/encnet/metafile.yaml | BiseNetV1_for_PyTorch/mmseg/models/decode_heads/enc_head.py | https://arxiv.org/abs/1803.08904 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/emanet/metafile.yaml | BiseNetV1_for_PyTorch/mmseg/models/decode_heads/ema_head.py | https://arxiv.org/abs/1907.13426 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/fcn/metafile.yaml | BiseNetV1_for_PyTorch/mmseg/models/decode_heads/fcn_head.py | https://arxiv.org/abs/1411.4038 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/dpt/metafile.yaml | BiseNetV1_for_PyTorch/mmseg/models/decode_heads/dpt_head.py | https://arxiv.org/abs/2103.13413 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/sem_fpn/metafile.yaml | BiseNetV1_for_PyTorch/mmseg/models/decode_heads/fpn_head.py | https://arxiv.org/abs/1901.02446 | 参考论文地址 | 
-| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/gcnet/metafile.yaml | BiseNetV1_for_PyTorch/mmseg/models/decode_heads/gc_head.py | https://arxiv.org/abs/1904.11492 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/isanet/metafile.yaml | BiseNetV1_for_PyTorch/mmseg/models/decode_heads/isa_head.py | https://arxiv.org/abs/1907.12273 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/backbones/mobilenet_v3.py | BiseNetV1_for_PyTorch/mmseg/models/decode_heads/lraspp_head.py | https://ieeexplore.ieee.org/document/9008835 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/nonlocal_net/metafile.yaml | BiseNetV1_for_PyTorch/mmseg/models/decode_heads/nl_head.py | https://arxiv.org/abs/1711.07971 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/decode_heads/point_head.py | BiseNetV1_for_PyTorch/mmseg/models/decode_heads/point_head.py | https://github.com/facebookresearch/detectron2/tree/master/projects/PointRend/point_head/point_head.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/decode_heads/psa_head.py | BiseNetV1_for_PyTorch/mmseg/models/decode_heads/psa_head.py | https://hszhao.github.io/papers/eccv18_psanet.pdf | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/point_rend/metafile.yaml | BiseNetV1_for_PyTorch/mmseg/models/decode_heads/point_head.py | https://arxiv.org/abs/1912.08193 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/ocrnet/metafile.yaml | BiseNetV1_for_PyTorch/mmseg/models/decode_heads/ocr_head.py | https://arxiv.org/abs/1909.11065 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/pspnet/metafile.yaml | BiseNetV1_for_PyTorch/mmseg/models/decode_heads/psp_head.py | https://arxiv.org/abs/1612.01105 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/knet/metafile.yaml | BiseNetV1_for_PyTorch/mmseg/models/decode_heads/knet_head.py | https://arxiv.org/abs/2106.14855 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/segmenter/metafile.yaml | BiseNetV1_for_PyTorch/mmseg/models/decode_heads/segmenter_mask_head.py | https://arxiv.org/abs/2105.05633 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/segformer/metafile.yaml | BiseNetV1_for_PyTorch/mmseg/models/decode_heads/segformer_head.py | https://arxiv.org/abs/2105.15203 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/deeplabv3plus/metafile.yaml | BiseNetV1_for_PyTorch/mmseg/models/decode_heads/sep_aspp_head.py | https://arxiv.org/abs/1802.02611 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/fastscnn/metafile.yaml | BiseNetV1_for_PyTorch/mmseg/models/decode_heads/sep_fcn_head.py | https://arxiv.org/abs/1902.04502 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/decode_heads/setr_mla_head.py | BiseNetV1_for_PyTorch/mmseg/models/decode_heads/setr_mla_head.py | https://arxiv.org/pdf/2012.15840.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/decode_heads/setr_mla_head.py | BiseNetV1_for_PyTorch/mmseg/models/decode_heads/setr_up_head.py | https://arxiv.org/pdf/2012.15840.pdf | 参考论文地址 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmsegmentation/blob/main/configs/stdc/metafile.yaml | BiseNetV1_for_PyTorch/mmseg/models/decode_heads/stdc_head.py | https://arxiv.org/abs/2104.13188 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/decode_heads/uper_head.py | BiseNetV1_for_PyTorch/mmseg/models/decode_heads/uper_head.py | https://arxiv.org/abs/1807.10221 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/losses/cross_entropy_loss.py | BiseNetV1_for_PyTorch/mmseg/models/losses/cross_entropy_loss.py | https://github.com/pytorch/pytorch/blob/56b43f4fec1f76953f15a627694d4bba34588969/torch/nn/functional.py#L2660 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/losses/dice_loss.py | BiseNetV1_for_PyTorch/mmseg/models/losses/dice_loss.py | https://arxiv.org/abs/1606.04797 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/mask2former/metafile.yaml | BiseNetV1_for_PyTorch/mmseg/models/losses/focal_loss.py | https://github.com/open-mmlab/mmdetection | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/losses/lovasz_loss.py | BiseNetV1_for_PyTorch/mmseg/models/losses/lovasz_loss.py | https://github.com/bermanmaxim/LovaszSoftmax/blob/master/pytor | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/losses/focal_loss.py | BiseNetV1_for_PyTorch/mmseg/models/losses/focal_loss.py | https://arxiv.org/abs/1708.02002 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/losses/tversky_loss.py | BiseNetV1_for_PyTorch/mmseg/models/losses/tversky_loss.py | https://github.com/JunMa11/SegLoss/blob/master/losses_pytorch/dice_loss.py#L333 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/losses/tversky_loss.py | BiseNetV1_for_PyTorch/mmseg/models/losses/tversky_loss.py | https://arxiv.org/abs/1706.05721 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/losses/lovasz_loss.py | BiseNetV1_for_PyTorch/mmseg/models/losses/lovasz_loss.py | https://arxiv.org/abs/1705.08790 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/necks/fpn.py | BiseNetV1_for_PyTorch/mmseg/models/necks/fpn.py | https://arxiv.org/abs/1612.03144 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/fastfcn/metafile.yaml | BiseNetV1_for_PyTorch/mmseg/models/necks/jpu.py | https://arxiv.org/abs/1903.11816 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/icnet/metafile.yaml | BiseNetV1_for_PyTorch/mmseg/models/necks/ic_neck.py | https://arxiv.org/abs/1704.08545 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/setr/metafile.yaml | BiseNetV1_for_PyTorch/mmseg/models/necks/mla_neck.py | https://arxiv.org/abs/2012.15840 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/datasets/basesegdataset.py | BiseNetV1_for_PyTorch/mmseg/models/segmentors/base.py | https://github.com/open-mmlab/mmdetection/issues/5844 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/utils/embed.py | BiseNetV1_for_PyTorch/mmseg/models/utils/embed.py | https://pytorch.org/docs/stable/generated/torch.nn.Conv2d.html | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/utils/make_divisible.py | 
BiseNetV1_for_PyTorch/mmseg/models/utils/make_divisible.py | https://github.com/tensorflow/models/blob/master/research/slim/nets/mobilenet/mobilenet.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/utils/self_attention_block.py | BiseNetV1_for_PyTorch/mmseg/models/utils/self_attention_block.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tests/test_models/test_backbones/test_blocks.py | BiseNetV1_for_PyTorch/tests/test_models/test_backbones/test_blocks.py | https://github.com/open-mmlab/mmcv/pull/1709 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/structures/sampler/ohem_pixel_sampler.py | BiseNetV1_for_PyTorch/mmseg/core/seg/sampler/ohem_pixel_sampler.py | https://github.com/pytorch/pytorch/issues/22812 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/.dev_scripts/benchmark_inference.py | BiseNetV1_for_PyTorch/.dev/benchmark_inference.py | https://download.openmmlab.com/mmsegmentation/v0.5/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/.dev_scripts/benchmark_inference.py | BiseNetV1_for_PyTorch/.dev/check_urls.py | https://download.openmmlab.com/mmsegmentation/v0.5/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/.dev_scripts/upload_modelzoo.py | BiseNetV1_for_PyTorch/.dev/upload_modelzoo.py | https://oss-accelerate.aliyuncs.com | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/.dev_scripts/benchmark_inference.py | BiseNetV1_for_PyTorch/.dev/md2yml.py | https://download.openmmlab.com | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/.circleci/docker/Dockerfile | BiseNetV1_for_PyTorch/docker/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64/3bf863cc.pub | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/.circleci/docker/Dockerfile | BiseNetV1_for_PyTorch/docker/Dockerfile | https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1804/x86_64/7fa2af80.pub | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/docs/en/get_started.md | BiseNetV1_for_PyTorch/docker/Dockerfile | https://download.openmmlab.com/mmcv/dist/cu | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/demo/MMSegmentation_Tutorial.ipynb | BiseNetV1_for_PyTorch/docker/Dockerfile | https://github.com/open-mmlab/mmsegmentation.git | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/.circleci/test.yml | BiseNetV1_for_PyTorch/tools/analyze_logs.py | https://github.com/open-mmlab/mmsegmentation | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/docs/en/user_guides/5_deployment.md | BiseNetV1_for_PyTorch/tools/deploy_test.py | https://github.com/open-mmlab/mmdeploy | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/docs/en/user_guides/5_deployment.md | BiseNetV1_for_PyTorch/tools/onnx2tensorrt.py | https://github.com/open-mmlab/mmdeploy | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/docs/en/user_guides/5_deployment.md | BiseNetV1_for_PyTorch/tools/pytorch2onnx.py | https://github.com/open-mmlab/mmdeploy | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/.dev_scripts/log_collector/utils.py | BiseNetV1_for_PyTorch/.dev/log_collector/utils.py | https://github.dev/open-mmlab/mmcv | 源码实现 | -| 
开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/convnext/convnext-base_upernet_8xb2-amp-160k_ade20k-640x640.py | BiseNetV1_for_PyTorch/configs/convnext/upernet_convnext_base_fp16_640x640_160k_ade20k.py | https://download.openmmlab.com/mmclassification/v0/convnext/downstream/convnext-base_3rdparty_in21k_20220301-262fd037.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/convnext/convnext-large_upernet_8xb2-amp-160k_ade20k-640x640.py | BiseNetV1_for_PyTorch/configs/convnext/upernet_convnext_large_fp16_640x640_160k_ade20k.py | https://download.openmmlab.com/mmclassification/v0/convnext/downstream/convnext-large_3rdparty_in21k_20220301-e6e0ea0a.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/convnext/convnext-small_upernet_8xb2-amp-160k_ade20k-512x512.py | BiseNetV1_for_PyTorch/configs/convnext/upernet_convnext_small_fp16_512x512_160k_ade20k.py | https://download.openmmlab.com/mmclassification/v0/convnext/downstream/convnext-small_3rdparty_32xb128-noema_in1k_20220301-303e75e3.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/convnext/convnext-tiny_upernet_8xb2-amp-160k_ade20k-512x512.py | BiseNetV1_for_PyTorch/configs/convnext/upernet_convnext_tiny_fp16_512x512_160k_ade20k.py | https://download.openmmlab.com/mmclassification/v0/convnext/downstream/convnext-tiny_3rdparty_32xb128-noema_in1k_20220301-795e9634.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/convnext/convnext-xlarge_upernet_8xb2-amp-160k_ade20k-640x640.py | BiseNetV1_for_PyTorch/configs/convnext/upernet_convnext_xlarge_fp16_640x640_160k_ade20k.py | https://download.openmmlab.com/mmclassification/v0/convnext/downstream/convnext-xlarge_3rdparty_in21k_20220301-08aa5ddc.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/knet/knet-s3_swin-l_upernet_8xb2-adamw-80k_ade20k-512x512.py | BiseNetV1_for_PyTorch/configs/knet/knet_s3_upernet_swin-l_8x2_512x512_adamw_80k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/swin/swin_large_patch4_window7_224_22k_20220308-d5bdebaf.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/knet/knet-s3_swin-l_upernet_8xb2-adamw-80k_ade20k-512x512.py | BiseNetV1_for_PyTorch/configs/knet/knet_s3_upernet_swin-l_8x2_640x640_adamw_80k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/swin/swin_large_patch4_window7_224_22k_20220308-d5bdebaf.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/knet/knet-s3_swin-t_upernet_8xb2-adamw-80k_ade20k-512x512.py | BiseNetV1_for_PyTorch/configs/knet/knet_s3_upernet_swin-t_8x2_512x512_adamw_80k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/swin/swin_tiny_patch4_window7_224_20220308-f41b89d3.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/poolformer/fpn_poolformer_m36_8xb4-40k_ade20k-512x512.py | BiseNetV1_for_PyTorch/configs/poolformer/fpn_poolformer_m36_8x4_512x512_40k_ade20k.py | https://download.openmmlab.com/mmclassification/v0/poolformer/poolformer-m36_3rdparty_32xb128_in1k_20220414-c55e0949.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/poolformer/fpn_poolformer_m48_8xb4-40k_ade20k-512x512.py | BiseNetV1_for_PyTorch/configs/poolformer/fpn_poolformer_m48_8x4_512x512_40k_ade20k.py | 
https://download.openmmlab.com/mmclassification/v0/poolformer/poolformer-m48_3rdparty_32xb128_in1k_20220414-9378f3eb.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/poolformer/fpn_poolformer_s24_8xb4-40k_ade20k-512x512.py | BiseNetV1_for_PyTorch/configs/poolformer/fpn_poolformer_s24_8x4_512x512_40k_ade20k.py | https://download.openmmlab.com/mmclassification/v0/poolformer/poolformer-s24_3rdparty_32xb128_in1k_20220414-d7055904.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/poolformer/fpn_poolformer_s36_8x4_512x512_40k_ade20k.py | BiseNetV1_for_PyTorch/configs/poolformer/fpn_poolformer_s36_8x4_512x512_40k_ade20k.py | https://download.openmmlab.com/mmclassification/v0/poolformer/poolformer-s36_3rdparty_32xb128_in1k_20220414-d78ff3e8.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/pspnet/pspnet_r50-d32_rsb_4xb2-adamw-80k_cityscapes-512x1024.py | BiseNetV1_for_PyTorch/configs/pspnet/pspnet_r50-d32_rsb-pretrain_512x1024_adamw_80k_cityscapes.py | https://download.openmmlab.com/mmclassification/v0/resnet/resnet50_8xb256-rsb-a1-600e_in1k_20211228-20e21305.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/pspnet/pspnet_r50-d32_rsb_4xb2-adamw-80k_cityscapes-512x1024.py | BiseNetV1_for_PyTorch/configs/pspnet/pspnet_r50-d8_rsb-pretrain_512x1024_adamw_80k_cityscapes.py | https://download.openmmlab.com/mmclassification/v0/resnet/resnet50_8xb256-rsb-a1-600e_in1k_20211228-20e21305.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/segformer/segformer_mit-b0_8xb1-160k_cityscapes-1024x1024.py | BiseNetV1_for_PyTorch/configs/segformer/segformer_mit-b0_512x512_160k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segformer/mit_b0_20220624-7e0fe6dd.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/segformer/segformer_mit-b0_8xb1-160k_cityscapes-1024x1024.py | BiseNetV1_for_PyTorch/configs/segformer/segformer_mit-b0_8x1_1024x1024_160k_cityscapes.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segformer/mit_b0_20220624-7e0fe6dd.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/segformer/segformer_mit-b1_8xb1-160k_cityscapes-1024x1024.py | BiseNetV1_for_PyTorch/configs/segformer/segformer_mit-b1_512x512_160k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segformer/mit_b1_20220624-02e5a6a1.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/segformer/segformer_mit-b1_8xb1-160k_cityscapes-1024x1024.py | BiseNetV1_for_PyTorch/configs/segformer/segformer_mit-b1_8x1_1024x1024_160k_cityscapes.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segformer/mit_b1_20220624-02e5a6a1.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/segformer/segformer_mit-b2_8xb1-160k_cityscapes-1024x1024.py | BiseNetV1_for_PyTorch/configs/segformer/segformer_mit-b2_512x512_160k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segformer/mit_b2_20220624-66e8bf70.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/segformer/segformer_mit-b2_8xb1-160k_cityscapes-1024x1024.py | BiseNetV1_for_PyTorch/configs/segformer/segformer_mit-b2_8x1_1024x1024_160k_cityscapes.py | 
https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segformer/mit_b2_20220624-66e8bf70.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/segformer/segformer_mit-b3_8xb1-160k_cityscapes-1024x1024.py | BiseNetV1_for_PyTorch/configs/segformer/segformer_mit-b3_512x512_160k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segformer/mit_b3_20220624-13b1141c.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/segformer/segformer_mit-b3_8xb1-160k_cityscapes-1024x1024.py | BiseNetV1_for_PyTorch/configs/segformer/segformer_mit-b3_8x1_1024x1024_160k_cityscapes.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segformer/mit_b3_20220624-13b1141c.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/segformer/segformer_mit-b4_8xb1-160k_cityscapes-1024x1024.py | BiseNetV1_for_PyTorch/configs/segformer/segformer_mit-b4_512x512_160k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segformer/mit_b4_20220624-d588d980.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/segformer/segformer_mit-b4_8xb1-160k_cityscapes-1024x1024.py | BiseNetV1_for_PyTorch/configs/segformer/segformer_mit-b4_8x1_1024x1024_160k_cityscapes.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segformer/mit_b4_20220624-d588d980.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/segformer/segformer_mit-b5_8xb1-160k_cityscapes-1024x1024.py | BiseNetV1_for_PyTorch/configs/segformer/segformer_mit-b5_512x512_160k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segformer/mit_b5_20220624-658746d9.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/segformer/segformer_mit-b5_8xb1-160k_cityscapes-1024x1024.py | BiseNetV1_for_PyTorch/configs/segformer/segformer_mit-b5_640x640_160k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segformer/mit_b5_20220624-658746d9.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/segformer/segformer_mit-b5_8xb1-160k_cityscapes-1024x1024.py | BiseNetV1_for_PyTorch/configs/segformer/segformer_mit-b5_8x1_1024x1024_160k_cityscapes.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segformer/mit_b5_20220624-658746d9.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/segmenter/segmenter_vit-l_mask_8xb1-160k_ade20k-512x512.py | BiseNetV1_for_PyTorch/configs/segmenter/segmenter_vit-l_mask_8x1_640x640_160k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segmenter/vit_large_p16_384_20220308-d4efb41d.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/segmenter/segmenter_vit-s_mask_8xb1-160k_ade20k-512x512.py | BiseNetV1_for_PyTorch/configs/segmenter/segmenter_vit-s_mask_8x1_512x512_160k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segmenter/vit_small_p16_384_20220308-410f6037.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/segmenter/segmenter_vit-t_mask_8xb1-160k_ade20k-512x512.py | BiseNetV1_for_PyTorch/configs/segmenter/segmenter_vit-t_mask_8x1_512x512_160k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segmenter/vit_tiny_p16_384_20220308-cce8c795.pth | 模型相关说明 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmsegmentation/blob/main/configs/stdc/stdc1_in1k-pre_4xb12-80k_cityscapes-512x1024.py | BiseNetV1_for_PyTorch/configs/stdc/stdc1_in1k-pre_512x1024_80k_cityscapes.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/stdc/stdc1_20220308-5368626c.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/stdc/stdc2_in1k-pre_4xb12-80k_cityscapes-512x1024.py | BiseNetV1_for_PyTorch/configs/stdc/stdc2_in1k-pre_512x1024_80k_cityscapes.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/stdc/stdc2_20220308-7dbd9127.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/mask2former/mask2former_swin-b-in1k-384x384-pre_8xb2-160k_ade20k-640x640.py | BiseNetV1_for_PyTorch/configs/swin/upernet_swin_base_patch4_window12_512x512_160k_ade20k_pretrain_384x384_1K.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/swin/swin_base_patch4_window12_384_20220317-55b0104a.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/mask2former/mask2former_swin-b-in22k-384x384-pre_8xb2-160k_ade20k-640x640.py | BiseNetV1_for_PyTorch/configs/swin/upernet_swin_base_patch4_window12_512x512_160k_ade20k_pretrain_384x384_22K.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/swin/swin_base_patch4_window12_384_22k_20220317-e5c09f74.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/swin/swin-base-patch4-window7-in1k-pre_upernet_8xb2-160k_ade20k-512x512.py | BiseNetV1_for_PyTorch/configs/swin/upernet_swin_base_patch4_window7_512x512_160k_ade20k_pretrain_224x224_1K.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/swin/swin_base_patch4_window7_224_20220317-e9b98025.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/swin/swin-base-patch4-window7-in22k-pre_upernet_8xb2-160k_ade20k-512x512.py | BiseNetV1_for_PyTorch/configs/swin/upernet_swin_base_patch4_window7_512x512_160k_ade20k_pretrain_224x224_22K.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/swin/swin_base_patch4_window7_224_22k_20220317-4f79f7c0.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/mask2former/mask2former_swin-l-in22k-384x384-pre_8xb2-160k_ade20k-640x640.py | BiseNetV1_for_PyTorch/configs/swin/upernet_swin_large_patch4_window12_512x512_pretrain_384x384_22K_160k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/swin/swin_large_patch4_window12_384_22k_20220412-6580f57d.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/swin/swin-large-patch4-window7-in22k-pre_upernet_8xb2-160k_ade20k-512x512.py | BiseNetV1_for_PyTorch/configs/swin/upernet_swin_large_patch4_window7_512x512_pretrain_224x224_22K_160k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/swin/swin_large_patch4_window7_224_22k_20220412-aeecf2aa.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/mask2former/mask2former_swin-s_8xb2-160k_ade20k-512x512.py | BiseNetV1_for_PyTorch/configs/swin/upernet_swin_small_patch4_window7_512x512_160k_ade20k_pretrain_224x224_1K.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/swin/swin_small_patch4_window7_224_20220317-7ba6d6dd.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/mask2former/mask2former_swin-t_8xb2-160k_ade20k-512x512.py | 
BiseNetV1_for_PyTorch/configs/swin/upernet_swin_tiny_patch4_window7_512x512_160k_ade20k_pretrain_224x224_1K.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/swin/swin_tiny_patch4_window7_224_20220317-1cdeb081.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/twins/twins_pcpvt-b_fpn_fpnhead_8xb4-80k_ade20k-512x512.py | BiseNetV1_for_PyTorch/configs/twins/twins_pcpvt-b_fpn_fpnhead_8x4_512x512_80k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/twins/pcpvt_base_20220308-0621964c.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/twins/twins_pcpvt-b_fpn_fpnhead_8xb4-80k_ade20k-512x512.py | BiseNetV1_for_PyTorch/configs/twins/twins_pcpvt-b_uperhead_8x2_512x512_160k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/twins/pcpvt_base_20220308-0621964c.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/twins/twins_pcpvt-l_fpn_fpnhead_8xb4-80k_ade20k-512x512.py | BiseNetV1_for_PyTorch/configs/twins/twins_pcpvt-l_fpn_fpnhead_8x4_512x512_80k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/twins/pcpvt_large_20220308-37579dc6.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/twins/twins_pcpvt-l_fpn_fpnhead_8xb4-80k_ade20k-512x512.py | BiseNetV1_for_PyTorch/configs/twins/twins_pcpvt-l_uperhead_8x2_512x512_160k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/twins/pcpvt_large_20220308-37579dc6.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/twins/twins_svt-b_fpn_fpnhead_8xb4-80k_ade20k-512x512.py | BiseNetV1_for_PyTorch/configs/twins/twins_svt-b_fpn_fpnhead_8x4_512x512_80k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/twins/alt_gvt_base_20220308-1b7eb711.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/twins/twins_svt-b_fpn_fpnhead_8xb4-80k_ade20k-512x512.py | BiseNetV1_for_PyTorch/configs/twins/twins_svt-b_uperhead_8x2_512x512_160k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/twins/alt_gvt_base_20220308-1b7eb711.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/twins/twins_svt-l_fpn_fpnhead_8xb4-80k_ade20k-512x512.py | BiseNetV1_for_PyTorch/configs/twins/twins_svt-l_fpn_fpnhead_8x4_512x512_80k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/twins/alt_gvt_large_20220308-fb5936f3.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/twins/twins_svt-l_fpn_fpnhead_8xb4-80k_ade20k-512x512.py | BiseNetV1_for_PyTorch/configs/twins/twins_svt-l_uperhead_8x2_512x512_160k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/twins/alt_gvt_large_20220308-fb5936f3.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/twins/twins_svt-s_fpn_fpnhead_8xb4-80k_ade20k-512x512.py | BiseNetV1_for_PyTorch/configs/twins/twins_svt-s_fpn_fpnhead_8x4_512x512_80k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/twins/alt_gvt_small_20220308-7e1c3695.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/twins/twins_svt-s_fpn_fpnhead_8xb4-80k_ade20k-512x512.py | BiseNetV1_for_PyTorch/configs/twins/twins_svt-s_uperhead_8x2_512x512_160k_ade20k.py | 
https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/twins/alt_gvt_small_20220308-7e1c3695.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/.circleci/docker/Dockerfile | BiseNetV1_for_PyTorch/docker/serve/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64/3bf863cc.pub | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/.circleci/docker/Dockerfile | BiseNetV1_for_PyTorch/docker/serve/Dockerfile | https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1804/x86_64/7fa2af80.pub | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/docs/en/get_started.md | BiseNetV1_for_PyTorch/docker/serve/Dockerfile | https://download.openmmlab.com/mmcv/dist/cu | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/docs/en/conf.py | BiseNetV1_for_PyTorch/docs/en/conf.py | https://www.sphinx-doc.org/en/master/usage/configuration.html | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/dsdl/README.md | BiseNetV1_for_PyTorch/docs/en/conf.py | https://mmsegmentation.readthedocs.io/en/latest/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/emanet/README.md | BiseNetV1_for_PyTorch/docs/en/conf.py | https://github.com/open-mmlab/mmsegmentation/blob/master/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/CITATION.cff | BiseNetV1_for_PyTorch/docs/en/conf.py | https://github.com/open-mmlab/mmsegmentation | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/docs/en/advanced_guides/transforms.md | BiseNetV1_for_PyTorch/docs/en/conf.py | https://github.com/open-mmlab/mmcv | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/emanet/README.md | BiseNetV1_for_PyTorch/docs/en/stat.py | https://github.com/open-mmlab/mmsegmentation/blob/master/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/docs/en/conf.py | BiseNetV1_for_PyTorch/docs/zh_cn/conf.py | https://www.sphinx-doc.org/en/master/usage/configuration.html | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/docs/zh_cn/conf.py | BiseNetV1_for_PyTorch/docs/zh_cn/conf.py | https://mmsegmentation.readthedocs.io/zh-CN/latest/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/emanet/README.md | BiseNetV1_for_PyTorch/docs/zh_cn/conf.py | https://github.com/open-mmlab/mmsegmentation/blob/master/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/CITATION.cff | BiseNetV1_for_PyTorch/docs/zh_cn/conf.py | https://github.com/open-mmlab/mmsegmentation | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/docs/en/advanced_guides/transforms.md | BiseNetV1_for_PyTorch/docs/zh_cn/conf.py | https://github.com/open-mmlab/mmcv | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/emanet/README.md | BiseNetV1_for_PyTorch/docs/zh_cn/stat.py | https://github.com/open-mmlab/mmsegmentation/blob/master/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/docs/en/make.bat | BiseNetV1_for_PyTorch/docs/en/make.bat | http://sphinx-doc.org/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/datasets/basesegdataset.py | BiseNetV1_for_PyTorch/mmseg/datasets/custom.py | https://github.com/open-mmlab/mmdetection/issues/5844 | 模型相关说明 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmsegmentation/blob/main/docs/en/make.bat | BiseNetV1_for_PyTorch/docs/zh_cn/make.bat | http://sphinx-doc.org/ | 模型相关说明 | -| 开发引入 | / | BiseNetV1_for_PyTorch/requirements/docs.txt | https://github.com/gaotongxiao/pytorch_sphinx_theme.git#egg=pytorch_sphinx_theme | 相关依赖 | +| 文件位置 | 公网地址 | 公网地址用途 | +|------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|-------------| +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/.circleci/config.yml | https://download.pytorch.org/whl/torch_stable.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/.circleci/config.yml | https://download.pytorch.org/whl/torch_stable.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/.circleci/config.yml | https://download.openmmlab.com/mmcv/dist/cu101/torch1.6.0/index.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/.circleci/config.yml | https://download.openmmlab.com/mmcv/dist/cpu/torch<< parameters.torch >>/index.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/.dev/benchmark_inference.py | https://download.openmmlab.com/mmsegmentation/v0.5/{model_name}/{config_name}/{checkpoint_name} | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/.dev/check_urls.py | https://download.openmmlab.com/mmsegmentation/v0.5/{model_name}/{config_name}/{config_name}-{model_time}.log.json | 模型参数相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/.dev/check_urls.py | https://download.openmmlab.com/mmsegmentation/v0.5/{model_name}/{config_name}/{config_name}_{model_time}.log.json | 模型参数相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/.dev/md2yml.py | https://download.openmmlab.com | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/.dev/upload_modelzoo.py | https://oss-accelerate.aliyuncs.com | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/.github/ISSUE_TEMPLATE/config.yml | https://mmsegmentation.readthedocs.io | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/.github/workflows/build.yml | https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1804/x86_64/7fa2af80.pub | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/.github/workflows/build.yml | https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1804/x86_64/7fa2af80.pub | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/.github/workflows/build.yml | https://download.pytorch.org/whl/torch_stable.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/.github/workflows/build.yml | https://download.pytorch.org/whl/torch_stable.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/.github/workflows/build.yml | 
https://download.pytorch.org/whl/lts/1.8/torch_lts.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/.github/workflows/build.yml | https://download.pytorch.org/whl/torch_stable.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/.github/workflows/build.yml | https://download.openmmlab.com/mmcv/dist/cu102/${{matrix.torch_version}}/index.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/.github/workflows/build.yml | https://download.openmmlab.com/mmcv/dist/cu101/${{matrix.torch_version}}/index.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/.github/workflows/build.yml | https://download.openmmlab.com/mmcv/dist/cpu/torch1.8/index.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/.github/workflows/build.yml | https://download.openmmlab.com/mmcv/dist/cpu/${{matrix.torch_version}}/index.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/.github/workflows/build.yml | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64/3bf863cc.pub | ubuntu镜像地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/.github/workflows/build.yml | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64/3bf863cc.pub | ubuntu镜像地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/_base_/models/fpn_poolformer_s12.py | https://download.openmmlab.com/mmclassification/v0/poolformer/poolformer-s12_3rdparty_32xb128_in1k_20220414-f8d83051.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/_base_/models/segmenter_vit-b16_mask.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segmenter/vit_base_p16_384_20220308-96dfe169.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/_base_/models/twins_pcpvt-s_fpn.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/twins/pcpvt_small_20220308-e638c41c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/_base_/models/twins_pcpvt-s_upernet.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/twins/pcpvt_small_20220308-e638c41c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/_base_/models/upernet_convnext.py | https://download.openmmlab.com/mmclassification/v0/convnext/downstream/convnext-base_3rdparty_32xb128-noema_in1k_20220301-2a0ee547.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/ann/ann.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r50-d8_769x769_80k_cityscapes/ann_r50-d8_769x769_80k_cityscapes_20200607_044426-cc7ff323.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/ann/ann.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r50-d8_769x769_40k_cityscapes/ann_r50-d8_769x769_40k_cityscapes_20200530_025712-2b46b04d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/ann/ann.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r50-d8_512x512_80k_ade20k/ann_r50-d8_512x512_80k_ade20k_20200615_014818-26f75e11.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/ann/ann.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r50-d8_512x512_40k_voc12aug/ann_r50-d8_512x512_40k_voc12aug_20200613_231314-b5dac322.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/ann/ann.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r50-d8_512x512_20k_voc12aug/ann_r50-d8_512x512_20k_voc12aug_20200617_222246-dfcb1c62.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/ann/ann.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r50-d8_512x512_160k_ade20k/ann_r50-d8_512x512_160k_ade20k_20200615_231733-892247bc.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/ann/ann.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r50-d8_512x1024_80k_cityscapes/ann_r50-d8_512x1024_80k_cityscapes_20200607_101911-5a9ad545.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/ann/ann.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r50-d8_512x1024_40k_cityscapes/ann_r50-d8_512x1024_40k_cityscapes_20200605_095211-049fc292.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/ann/ann.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r101-d8_769x769_80k_cityscapes/ann_r101-d8_769x769_80k_cityscapes_20200607_013713-a9d4be8d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/ann/ann.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r101-d8_769x769_40k_cityscapes/ann_r101-d8_769x769_40k_cityscapes_20200530_025720-059bff28.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/ann/ann.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r101-d8_512x512_80k_ade20k/ann_r101-d8_512x512_80k_ade20k_20200615_014818-c0153543.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/ann/ann.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r101-d8_512x512_40k_voc12aug/ann_r101-d8_512x512_40k_voc12aug_20200613_231314-bd205bbe.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/ann/ann.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r101-d8_512x512_20k_voc12aug/ann_r101-d8_512x512_20k_voc12aug_20200617_222246-2fad0042.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/ann/ann.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r101-d8_512x512_160k_ade20k/ann_r101-d8_512x512_160k_ade20k_20200615_231733-955eb1ec.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/ann/ann.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r101-d8_512x1024_80k_cityscapes/ann_r101-d8_512x1024_80k_cityscapes_20200607_013728-aceccc6e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/ann/ann.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r101-d8_512x1024_40k_cityscapes/ann_r101-d8_512x1024_40k_cityscapes_20200605_095243-adf6eece.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/ann/ann.yml | https://arxiv.org/abs/1908.07678 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/apcnet/apcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/apcnet/apcnet_r50-d8_769x769_80k_cityscapes/apcnet_r50-d8_769x769_80k_cityscapes_20201214_115718-7ea9fa12.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/apcnet/apcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/apcnet/apcnet_r50-d8_769x769_40k_cityscapes/apcnet_r50-d8_769x769_40k_cityscapes_20201214_115717-2a2628d7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/apcnet/apcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/apcnet/apcnet_r50-d8_512x512_80k_ade20k/apcnet_r50-d8_512x512_80k_ade20k_20201214_115705-a8626293.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/apcnet/apcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/apcnet/apcnet_r50-d8_512x512_160k_ade20k/apcnet_r50-d8_512x512_160k_ade20k_20201214_115706-25fb92c2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/apcnet/apcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/apcnet/apcnet_r50-d8_512x1024_80k_cityscapes/apcnet_r50-d8_512x1024_80k_cityscapes_20201214_115716-987f51e3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/apcnet/apcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/apcnet/apcnet_r50-d8_512x1024_40k_cityscapes/apcnet_r50-d8_512x1024_40k_cityscapes_20201214_115717-5e88fa33.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/apcnet/apcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/apcnet/apcnet_r101-d8_769x769_80k_cityscapes/apcnet_r101-d8_769x769_80k_cityscapes_20201214_115716-a7fbc2ab.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/apcnet/apcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/apcnet/apcnet_r101-d8_769x769_40k_cityscapes/apcnet_r101-d8_769x769_40k_cityscapes_20201214_115718-b650de90.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/apcnet/apcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/apcnet/apcnet_r101-d8_512x512_80k_ade20k/apcnet_r101-d8_512x512_80k_ade20k_20201214_115704-c656c3fb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/apcnet/apcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/apcnet/apcnet_r101-d8_512x512_160k_ade20k/apcnet_r101-d8_512x512_160k_ade20k_20201214_115705-73f9a8d7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/apcnet/apcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/apcnet/apcnet_r101-d8_512x1024_80k_cityscapes/apcnet_r101-d8_512x1024_80k_cityscapes_20201214_115705-b1ff208a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/apcnet/apcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/apcnet/apcnet_r101-d8_512x1024_40k_cityscapes/apcnet_r101-d8_512x1024_40k_cityscapes_20201214_115716-abc9d111.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/apcnet/apcnet.yml | https://openaccess.thecvf.com/content_CVPR_2019/html/He_Adaptive_Pyramid_Context_Network_for_Semantic_Segmentation_CVPR_2019_paper.html | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/beit/beit.yml | https://download.openmmlab.com/mmsegmentation/v0.5/beit/upernet_beit-large_fp16_8x1_640x640_160k_ade20k/upernet_beit-large_fp16_8x1_640x640_160k_ade20k-8fc0dd5d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/beit/beit.yml | https://download.openmmlab.com/mmsegmentation/v0.5/beit/upernet_beit-base_8x2_640x640_160k_ade20k/upernet_beit-base_8x2_640x640_160k_ade20k-eead221d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/bisenetv1/bisenetv1.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv1/bisenetv1_r50-d32_lr5e-3_4x4_512x512_160k_coco-stuff164k/bisenetv1_r50-d32_lr5e-3_4x4_512x512_160k_coco-stuff164k_20211101_040616-d2bb0df4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/bisenetv1/bisenetv1.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv1/bisenetv1_r50-d32_in1k-pre_lr5e-3_4x4_512x512_160k_coco-stuff164k/bisenetv1_r50-d32_in1k-pre_lr5e-3_4x4_512x512_160k_coco-stuff164k_20211101_181932-66747911.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/bisenetv1/bisenetv1.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv1/bisenetv1_r50-d32_in1k-pre_4x4_1024x1024_160k_cityscapes/bisenetv1_r50-d32_in1k-pre_4x4_1024x1024_160k_cityscapes_20210917_234628-8b304447.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/bisenetv1/bisenetv1.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv1/bisenetv1_r50-d32_4x4_1024x1024_160k_cityscapes/bisenetv1_r50-d32_4x4_1024x1024_160k_cityscapes_20210923_222639-7b28a2a6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/bisenetv1/bisenetv1.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv1/bisenetv1_r18-d32_lr5e-3_4x4_512x512_160k_coco-stuff164k/bisenetv1_r18-d32_lr5e-3_4x4_512x512_160k_coco-stuff164k_20211022_054328-046aa2f2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/bisenetv1/bisenetv1.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv1/bisenetv1_r18-d32_in1k-pre_lr5e-3_4x4_512x512_160k_coco-stuff164k/bisenetv1_r18-d32_in1k-pre_lr5e-3_4x4_512x512_160k_coco-stuff164k_20211023_013100-f700dbf7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/bisenetv1/bisenetv1.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv1/bisenetv1_r18-d32_in1k-pre_4x8_1024x1024_160k_cityscapes/bisenetv1_r18-d32_in1k-pre_4x8_1024x1024_160k_cityscapes_20210905_220322-bb8db75f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/bisenetv1/bisenetv1.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv1/bisenetv1_r18-d32_in1k-pre_4x4_1024x1024_160k_cityscapes/bisenetv1_r18-d32_in1k-pre_4x4_1024x1024_160k_cityscapes_20210905_220251-8ba80eff.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/bisenetv1/bisenetv1.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv1/bisenetv1_r18-d32_4x4_1024x1024_160k_cityscapes/bisenetv1_r18-d32_4x4_1024x1024_160k_cityscapes_20210922_172239-c55e78e2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/bisenetv1/bisenetv1.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv1/bisenetv1_r101-d32_lr5e-3_4x4_512x512_160k_coco-stuff164k/bisenetv1_r101-d32_lr5e-3_4x4_512x512_160k_coco-stuff164k_20211102_164147-c6b32c3b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/bisenetv1/bisenetv1.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv1/bisenetv1_r101-d32_in1k-pre_lr5e-3_4x4_512x512_160k_coco-stuff164k/bisenetv1_r101-d32_in1k-pre_lr5e-3_4x4_512x512_160k_coco-stuff164k_20211101_225220-28c8f092.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/bisenetv1/bisenetv1.yml | https://arxiv.org/abs/1808.00897 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/bisenetv2/bisenetv2.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv2/bisenetv2_fcn_ohem_4x4_1024x1024_160k_cityscapes/bisenetv2_fcn_ohem_4x4_1024x1024_160k_cityscapes_20220808_172324-8bf0aaba.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/bisenetv2/bisenetv2.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv2/bisenetv2_fcn_fp16_4x4_1024x1024_160k_cityscapes/bisenetv2_fcn_fp16_4x4_1024x1024_160k_cityscapes_20210902_045942-b979777b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/bisenetv2/bisenetv2.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv2/bisenetv2_fcn_4x8_1024x1024_160k_cityscapes/bisenetv2_fcn_4x8_1024x1024_160k_cityscapes_20210903_000032-e1a2eed6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/bisenetv2/bisenetv2.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv2/bisenetv2_fcn_4x4_1024x1024_160k_cityscapes/bisenetv2_fcn_4x4_1024x1024_160k_cityscapes_20210902_015551-bcf10f09.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/bisenetv2/bisenetv2.yml | https://arxiv.org/abs/2004.02147 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/ccnet/ccnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r50-d8_769x769_80k_cityscapes/ccnet_r50-d8_769x769_80k_cityscapes_20200617_010421-73eed8ca.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/ccnet/ccnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r50-d8_769x769_40k_cityscapes/ccnet_r50-d8_769x769_40k_cityscapes_20200616_145125-76d11884.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/ccnet/ccnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r50-d8_512x512_80k_ade20k/ccnet_r50-d8_512x512_80k_ade20k_20200615_014848-aa37f61e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/ccnet/ccnet.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r50-d8_512x512_40k_voc12aug/ccnet_r50-d8_512x512_40k_voc12aug_20200613_232127-c2a15f02.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/ccnet/ccnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r50-d8_512x512_20k_voc12aug/ccnet_r50-d8_512x512_20k_voc12aug_20200617_193212-fad81784.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/ccnet/ccnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r50-d8_512x512_160k_ade20k/ccnet_r50-d8_512x512_160k_ade20k_20200616_084435-7c97193b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/ccnet/ccnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r50-d8_512x1024_80k_cityscapes/ccnet_r50-d8_512x1024_80k_cityscapes_20200617_010421-869a3423.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/ccnet/ccnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r50-d8_512x1024_40k_cityscapes/ccnet_r50-d8_512x1024_40k_cityscapes_20200616_142517-4123f401.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/ccnet/ccnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r101-d8_769x769_80k_cityscapes/ccnet_r101-d8_769x769_80k_cityscapes_20200618_011502-ad3cd481.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/ccnet/ccnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r101-d8_769x769_40k_cityscapes/ccnet_r101-d8_769x769_40k_cityscapes_20200617_101428-4f57c8d0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/ccnet/ccnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r101-d8_512x512_80k_ade20k/ccnet_r101-d8_512x512_80k_ade20k_20200615_014848-1f4929a3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/ccnet/ccnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r101-d8_512x512_40k_voc12aug/ccnet_r101-d8_512x512_40k_voc12aug_20200613_232127-c30da577.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/ccnet/ccnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r101-d8_512x512_20k_voc12aug/ccnet_r101-d8_512x512_20k_voc12aug_20200617_193212-0007b61d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/ccnet/ccnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r101-d8_512x512_160k_ade20k/ccnet_r101-d8_512x512_160k_ade20k_20200616_000644-e849e007.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/ccnet/ccnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r101-d8_512x1024_80k_cityscapes/ccnet_r101-d8_512x1024_80k_cityscapes_20200617_203935-ffae8917.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/ccnet/ccnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r101-d8_512x1024_40k_cityscapes/ccnet_r101-d8_512x1024_40k_cityscapes_20200616_142540-a3b84ba6.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/ccnet/ccnet.yml | https://arxiv.org/abs/1811.11721 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/cgnet/cgnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/cgnet/cgnet_680x680_60k_cityscapes/cgnet_680x680_60k_cityscapes_20201101_110253-4c0b2f2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/cgnet/cgnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/cgnet/cgnet_512x1024_60k_cityscapes/cgnet_512x1024_60k_cityscapes_20201101_110254-124ea03b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/cgnet/cgnet.yml | https://arxiv.org/abs/1811.08201 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/convnext/convnext.yml | https://download.openmmlab.com/mmsegmentation/v0.5/convnext/upernet_convnext_xlarge_fp16_640x640_160k_ade20k/upernet_convnext_xlarge_fp16_640x640_160k_ade20k_20220226_080344-95fc38c2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/convnext/convnext.yml | https://download.openmmlab.com/mmsegmentation/v0.5/convnext/upernet_convnext_tiny_fp16_512x512_160k_ade20k/upernet_convnext_tiny_fp16_512x512_160k_ade20k_20220227_124553-cad485de.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/convnext/convnext.yml | https://download.openmmlab.com/mmsegmentation/v0.5/convnext/upernet_convnext_small_fp16_512x512_160k_ade20k/upernet_convnext_small_fp16_512x512_160k_ade20k_20220227_131208-1b1e394f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/convnext/convnext.yml | https://download.openmmlab.com/mmsegmentation/v0.5/convnext/upernet_convnext_large_fp16_640x640_160k_ade20k/upernet_convnext_large_fp16_640x640_160k_ade20k_20220226_040532-e57aa54d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/convnext/convnext.yml | https://download.openmmlab.com/mmsegmentation/v0.5/convnext/upernet_convnext_base_fp16_640x640_160k_ade20k/upernet_convnext_base_fp16_640x640_160k_ade20k_20220227_182859-9280e39b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/convnext/convnext.yml | https://download.openmmlab.com/mmsegmentation/v0.5/convnext/upernet_convnext_base_fp16_512x512_160k_ade20k/upernet_convnext_base_fp16_512x512_160k_ade20k_20220227_181227-02a24fc6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/convnext/upernet_convnext_base_fp16_640x640_160k_ade20k.py | https://download.openmmlab.com/mmclassification/v0/convnext/downstream/convnext-base_3rdparty_in21k_20220301-262fd037.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/convnext/upernet_convnext_large_fp16_640x640_160k_ade20k.py | https://download.openmmlab.com/mmclassification/v0/convnext/downstream/convnext-large_3rdparty_in21k_20220301-e6e0ea0a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/convnext/upernet_convnext_small_fp16_512x512_160k_ade20k.py | https://download.openmmlab.com/mmclassification/v0/convnext/downstream/convnext-small_3rdparty_32xb128-noema_in1k_20220301-303e75e3.pth | 权重地址 | 
+| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/convnext/upernet_convnext_tiny_fp16_512x512_160k_ade20k.py | https://download.openmmlab.com/mmclassification/v0/convnext/downstream/convnext-tiny_3rdparty_32xb128-noema_in1k_20220301-795e9634.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/convnext/upernet_convnext_xlarge_fp16_640x640_160k_ade20k.py | https://download.openmmlab.com/mmclassification/v0/convnext/downstream/convnext-xlarge_3rdparty_in21k_20220301-08aa5ddc.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/danet/danet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r50-d8_769x769_80k_cityscapes/danet_r50-d8_769x769_80k_cityscapes_20200607_132954-495689b4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/danet/danet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r50-d8_769x769_40k_cityscapes/danet_r50-d8_769x769_40k_cityscapes_20200530_025703-76681c60.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/danet/danet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r50-d8_512x512_80k_ade20k/danet_r50-d8_512x512_80k_ade20k_20200615_015125-edb18e08.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/danet/danet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r50-d8_512x512_40k_voc12aug/danet_r50-d8_512x512_40k_voc12aug_20200613_235526-426e3a64.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/danet/danet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r50-d8_512x512_20k_voc12aug/danet_r50-d8_512x512_20k_voc12aug_20200618_070026-9e9e3ab3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/danet/danet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r50-d8_512x512_160k_ade20k/danet_r50-d8_512x512_160k_ade20k_20200616_082340-9cb35dcd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/danet/danet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r50-d8_512x1024_80k_cityscapes/danet_r50-d8_512x1024_80k_cityscapes_20200607_133029-2bfa2293.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/danet/danet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r50-d8_512x1024_40k_cityscapes/danet_r50-d8_512x1024_40k_cityscapes_20200605_191324-c0dbfa5f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/danet/danet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r101-d8_769x769_80k_cityscapes/danet_r101-d8_769x769_80k_cityscapes_20200607_132918-f3a929e7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/danet/danet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r101-d8_769x769_40k_cityscapes/danet_r101-d8_769x769_40k_cityscapes_20200530_025717-dcb7fd4e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/danet/danet.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r101-d8_512x512_80k_ade20k/danet_r101-d8_512x512_80k_ade20k_20200615_015126-d0357c73.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/danet/danet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r101-d8_512x512_40k_voc12aug/danet_r101-d8_512x512_40k_voc12aug_20200613_223031-788e232a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/danet/danet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r101-d8_512x512_20k_voc12aug/danet_r101-d8_512x512_20k_voc12aug_20200618_070026-d48d23b2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/danet/danet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r101-d8_512x512_160k_ade20k/danet_r101-d8_512x512_160k_ade20k_20200616_082348-23bf12f9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/danet/danet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r101-d8_512x1024_80k_cityscapes/danet_r101-d8_512x1024_80k_cityscapes_20200607_132918-955e6350.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/danet/danet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r101-d8_512x1024_40k_cityscapes/danet_r101-d8_512x1024_40k_cityscapes_20200605_200831-c57a7157.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/danet/danet.yml | https://arxiv.org/abs/1809.02983 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50-d8_769x769_80k_cityscapes/deeplabv3_r50-d8_769x769_80k_cityscapes_20200606_221338-788d6228.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50-d8_769x769_40k_cityscapes/deeplabv3_r50-d8_769x769_40k_cityscapes_20200606_113723-7eda553c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50-d8_512x512_80k_ade20k/deeplabv3_r50-d8_512x512_80k_ade20k_20200614_185028-0bb3f844.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50-d8_512x512_4x4_80k_coco-stuff164k/deeplabv3_r50-d8_512x512_4x4_80k_coco-stuff164k_20210709_163016-88675c24.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50-d8_512x512_4x4_40k_coco-stuff10k/deeplabv3_r50-d8_512x512_4x4_40k_coco-stuff10k_20210821_043305-dc76f3ff.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50-d8_512x512_4x4_320k_coco-stuff164k/deeplabv3_r50-d8_512x512_4x4_320k_coco-stuff164k_20210709_155403-51b21115.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50-d8_512x512_4x4_20k_coco-stuff10k/deeplabv3_r50-d8_512x512_4x4_20k_coco-stuff10k_20210821_043025-b35f789d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50-d8_512x512_4x4_160k_coco-stuff164k/deeplabv3_r50-d8_512x512_4x4_160k_coco-stuff164k_20210709_163016-49f2812b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50-d8_512x512_40k_voc12aug/deeplabv3_r50-d8_512x512_40k_voc12aug_20200613_161546-2ae96e7e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50-d8_512x512_20k_voc12aug/deeplabv3_r50-d8_512x512_20k_voc12aug_20200617_010906-596905ef.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50-d8_512x512_160k_ade20k/deeplabv3_r50-d8_512x512_160k_ade20k_20200615_123227-5d0ee427.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50-d8_512x1024_80k_cityscapes/deeplabv3_r50-d8_512x1024_80k_cityscapes_20200606_113404-b92cfdd4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50-d8_512x1024_40k_cityscapes/deeplabv3_r50-d8_512x1024_40k_cityscapes_20200605_022449-acadc2f8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50b-d8_769x769_80k_cityscapes/deeplabv3_r50b-d8_769x769_80k_cityscapes_20201225_155404-87fb0cf4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50b-d8_512x1024_80k_cityscapes/deeplabv3_r50b-d8_512x1024_80k_cityscapes_20201225_155148-ec368954.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r18-d8_769x769_80k_cityscapes/deeplabv3_r18-d8_769x769_80k_cityscapes_20201225_021506-6452126a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r18-d8_512x1024_80k_cityscapes/deeplabv3_r18-d8_512x1024_80k_cityscapes_20201225_021506-23dffbe2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3/deeplabv3.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r18b-d8_769x769_80k_cityscapes/deeplabv3_r18b-d8_769x769_80k_cityscapes_20201225_094144-fdc985d9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r18b-d8_512x1024_80k_cityscapes/deeplabv3_r18b-d8_512x1024_80k_cityscapes_20201225_094144-46040cef.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_fp16_512x1024_80k_cityscapes/deeplabv3_r101-d8_fp16_512x1024_80k_cityscapes_20200717_230920-774d9cec.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_769x769_80k_cityscapes/deeplabv3_r101-d8_769x769_80k_cityscapes_20200607_013353-60e95418.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_769x769_40k_cityscapes/deeplabv3_r101-d8_769x769_40k_cityscapes_20200606_113809-c64f889f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_512x512_80k_ade20k/deeplabv3_r101-d8_512x512_80k_ade20k_20200615_021256-d89c7fa4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_512x512_4x4_80k_coco-stuff164k/deeplabv3_r101-d8_512x512_4x4_80k_coco-stuff164k_20210709_201252-13600dc2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_512x512_4x4_40k_coco-stuff10k/deeplabv3_r101-d8_512x512_4x4_40k_coco-stuff10k_20210821_043305-636cb433.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_512x512_4x4_320k_coco-stuff164k/deeplabv3_r101-d8_512x512_4x4_320k_coco-stuff164k_20210709_155402-3cbca14d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_512x512_4x4_20k_coco-stuff10k/deeplabv3_r101-d8_512x512_4x4_20k_coco-stuff10k_20210821_043025-c49752cb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_512x512_4x4_160k_coco-stuff164k/deeplabv3_r101-d8_512x512_4x4_160k_coco-stuff164k_20210709_155402-f035acfd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_512x512_40k_voc12aug/deeplabv3_r101-d8_512x512_40k_voc12aug_20200613_161432-0017d784.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_512x512_20k_voc12aug/deeplabv3_r101-d8_512x512_20k_voc12aug_20200617_010932-8d13832f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_512x512_160k_ade20k/deeplabv3_r101-d8_512x512_160k_ade20k_20200615_105816-b1f72b3b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_512x1024_80k_cityscapes/deeplabv3_r101-d8_512x1024_80k_cityscapes_20200606_113503-9e428899.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_512x1024_40k_cityscapes/deeplabv3_r101-d8_512x1024_40k_cityscapes_20200605_012241-7fd3f799.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_480x480_80k_pascal_context_59/deeplabv3_r101-d8_480x480_80k_pascal_context_59_20210416_113002-26303993.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_480x480_80k_pascal_context/deeplabv3_r101-d8_480x480_80k_pascal_context_20200911_170155-2a21fff3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_480x480_40k_pascal_context_59/deeplabv3_r101-d8_480x480_40k_pascal_context_59_20210416_110332-cb08ea46.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_480x480_40k_pascal_context/deeplabv3_r101-d8_480x480_40k_pascal_context_20200911_204118-1aa27336.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d16-mg124_512x1024_80k_cityscapes/deeplabv3_r101-d16-mg124_512x1024_80k_cityscapes_20200908_005644-57bb8425.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101b-d8_769x769_80k_cityscapes/deeplabv3_r101b-d8_769x769_80k_cityscapes_20201226_190843-9142ee57.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101b-d8_512x1024_80k_cityscapes/deeplabv3_r101b-d8_512x1024_80k_cityscapes_20201226_171821-8fd49503.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3/deeplabv3.yml | https://arxiv.org/abs/1706.05587 | 论文地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50-d8_769x769_80k_cityscapes/deeplabv3plus_r50-d8_769x769_80k_cityscapes_20200606_210233-0e9dfdc4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50-d8_769x769_40k_cityscapes/deeplabv3plus_r50-d8_769x769_40k_cityscapes_20200606_114143-1dcb0e3c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50-d8_512x512_80k_potsdam/deeplabv3plus_r50-d8_512x512_80k_potsdam_20211219_031508-7e7a2b24.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50-d8_512x512_80k_loveda/deeplabv3plus_r50-d8_512x512_80k_loveda_20211105_080442-f0720392.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50-d8_512x512_80k_ade20k/deeplabv3plus_r50-d8_512x512_80k_ade20k_20200614_185028-bf1400d8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50-d8_512x512_40k_voc12aug/deeplabv3plus_r50-d8_512x512_40k_voc12aug_20200613_161759-e1b43aa9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50-d8_512x512_20k_voc12aug/deeplabv3plus_r50-d8_512x512_20k_voc12aug_20200617_102323-aad58ef1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50-d8_512x512_160k_ade20k/deeplabv3plus_r50-d8_512x512_160k_ade20k_20200615_124504-6135c7e0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50-d8_512x1024_80k_cityscapes/deeplabv3plus_r50-d8_512x1024_80k_cityscapes_20200606_114049-f9fb496d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50-d8_512x1024_40k_cityscapes/deeplabv3plus_r50-d8_512x1024_40k_cityscapes_20200605_094610-d222ffcd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50-d8_4x4_896x896_80k_isaid/deeplabv3plus_r50-d8_4x4_896x896_80k_isaid_20220110_180526-598be439.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50-d8_4x4_512x512_80k_vaihingen/deeplabv3plus_r50-d8_4x4_512x512_80k_vaihingen_20211231_230816-5040938d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50b-d8_769x769_80k_cityscapes/deeplabv3plus_r50b-d8_769x769_80k_cityscapes_20201225_224655-8b596d1c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50b-d8_512x1024_80k_cityscapes/deeplabv3plus_r50b-d8_512x1024_80k_cityscapes_20201225_213645-a97e4e43.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r18-d8_769x769_80k_cityscapes/deeplabv3plus_r18-d8_769x769_80k_cityscapes_20201226_083346-f326e06a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r18-d8_512x512_80k_potsdam/deeplabv3plus_r18-d8_512x512_80k_potsdam_20211219_020601-75fd5bc3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r18-d8_512x512_80k_loveda/deeplabv3plus_r18-d8_512x512_80k_loveda_20211104_132800-ce0fa0ca.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r18-d8_512x1024_80k_cityscapes/deeplabv3plus_r18-d8_512x1024_80k_cityscapes_20201226_080942-cff257fe.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r18-d8_4x4_896x896_80k_isaid/deeplabv3plus_r18-d8_4x4_896x896_80k_isaid_20220110_180526-7059991d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r18-d8_4x4_512x512_80k_vaihingen/deeplabv3plus_r18-d8_4x4_512x512_80k_vaihingen_20211231_230805-7626a263.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r18b-d8_769x769_80k_cityscapes/deeplabv3plus_r18b-d8_769x769_80k_cityscapes_20201226_151312-2c868aff.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r18b-d8_512x1024_80k_cityscapes/deeplabv3plus_r18b-d8_512x1024_80k_cityscapes_20201226_090828-e451abd9.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_fp16_512x1024_80k_cityscapes/deeplabv3plus_r101-d8_fp16_512x1024_80k_cityscapes_20200717_230920-f1104f4b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_769x769_80k_cityscapes/deeplabv3plus_r101-d8_769x769_80k_cityscapes_20220406_154720-dfcc0b68.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_769x769_40k_cityscapes/deeplabv3plus_r101-d8_769x769_40k_cityscapes_20200606_114304-ff414b9e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_512x512_80k_potsdam/deeplabv3plus_r101-d8_512x512_80k_potsdam_20211219_031508-8b112708.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_512x512_80k_loveda/deeplabv3plus_r101-d8_512x512_80k_loveda_20211105_110759-4c1f297e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_512x512_80k_ade20k/deeplabv3plus_r101-d8_512x512_80k_ade20k_20200615_014139-d5730af7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_512x512_40k_voc12aug/deeplabv3plus_r101-d8_512x512_40k_voc12aug_20200613_205333-faf03387.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_512x512_20k_voc12aug/deeplabv3plus_r101-d8_512x512_20k_voc12aug_20200617_102345-c7ff3d56.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_512x512_160k_ade20k/deeplabv3plus_r101-d8_512x512_160k_ade20k_20200615_123232-38ed86bb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_512x1024_80k_cityscapes/deeplabv3plus_r101-d8_512x1024_80k_cityscapes_20200606_114143-068fcfe9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_512x1024_40k_cityscapes/deeplabv3plus_r101-d8_512x1024_40k_cityscapes_20200605_094614-3769eecf.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_4x4_512x512_80k_vaihingen/deeplabv3plus_r101-d8_4x4_512x512_80k_vaihingen_20211231_230816-8a095afa.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_480x480_80k_pascal_context_59/deeplabv3plus_r101-d8_480x480_80k_pascal_context_59_20210416_111127-7ca0331d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_480x480_80k_pascal_context/deeplabv3plus_r101-d8_480x480_80k_pascal_context_20200911_155322-145d3ee8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_480x480_40k_pascal_context_59/deeplabv3plus_r101-d8_480x480_40k_pascal_context_59_20210416_111233-ed937f15.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_480x480_40k_pascal_context/deeplabv3plus_r101-d8_480x480_40k_pascal_context_20200911_165459-d3c8a29e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d16-mg124_512x1024_80k_cityscapes/deeplabv3plus_r101-d16-mg124_512x1024_80k_cityscapes_20200908_005644-ee6158e0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d16-mg124_512x1024_40k_cityscapes/deeplabv3plus_r101-d16-mg124_512x1024_40k_cityscapes_20200908_005644-cf9ce186.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101b-d8_769x769_80k_cityscapes/deeplabv3plus_r101b-d8_769x769_80k_cityscapes_20201226_205041-227cdf7c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101b-d8_512x1024_80k_cityscapes/deeplabv3plus_r101b-d8_512x1024_80k_cityscapes_20201226_190843-9c3c93a4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/deeplabv3plus/deeplabv3plus.yml | https://arxiv.org/abs/1802.02611 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/dmnet/dmnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dmnet/dmnet_r50-d8_769x769_80k_cityscapes/dmnet_r50-d8_769x769_80k_cityscapes_20201215_034006-6060840e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/dmnet/dmnet.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/dmnet/dmnet_r50-d8_769x769_40k_cityscapes/dmnet_r50-d8_769x769_40k_cityscapes_20201215_093706-e7f0e23e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/dmnet/dmnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dmnet/dmnet_r50-d8_512x512_80k_ade20k/dmnet_r50-d8_512x512_80k_ade20k_20201215_144744-f89092a6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/dmnet/dmnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dmnet/dmnet_r50-d8_512x512_160k_ade20k/dmnet_r50-d8_512x512_160k_ade20k_20201215_115313-025ab3f9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/dmnet/dmnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dmnet/dmnet_r50-d8_512x1024_80k_cityscapes/dmnet_r50-d8_512x1024_80k_cityscapes_20201215_053728-3c8893b9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/dmnet/dmnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dmnet/dmnet_r50-d8_512x1024_40k_cityscapes/dmnet_r50-d8_512x1024_40k_cityscapes_20201215_042326-615373cf.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/dmnet/dmnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dmnet/dmnet_r101-d8_769x769_80k_cityscapes/dmnet_r101-d8_769x769_80k_cityscapes_20201215_082810-7f0de59a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/dmnet/dmnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dmnet/dmnet_r101-d8_769x769_40k_cityscapes/dmnet_r101-d8_769x769_40k_cityscapes_20201215_081348-a74261f6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/dmnet/dmnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dmnet/dmnet_r101-d8_512x512_80k_ade20k/dmnet_r101-d8_512x512_80k_ade20k_20201215_104812-bfa45311.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/dmnet/dmnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dmnet/dmnet_r101-d8_512x512_160k_ade20k/dmnet_r101-d8_512x512_160k_ade20k_20201215_111145-a0bc02ef.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/dmnet/dmnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dmnet/dmnet_r101-d8_512x1024_80k_cityscapes/dmnet_r101-d8_512x1024_80k_cityscapes_20201215_031718-fa081cb8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/dmnet/dmnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dmnet/dmnet_r101-d8_512x1024_40k_cityscapes/dmnet_r101-d8_512x1024_40k_cityscapes_20201215_043100-8291e976.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/dmnet/dmnet.yml | https://openaccess.thecvf.com/content_ICCV_2019/papers/He_Dynamic_Multi-Scale_Filters_for_Semantic_Segmentation_ICCV_2019_paper.pdf | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/dnlnet/dnlnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dnlnet/dnl_r50-d8_769x769_80k_cityscapes/dnl_r50-d8_769x769_80k_cityscapes_20200820_011925-366bc4c7.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/dnlnet/dnlnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dnlnet/dnl_r50-d8_769x769_40k_cityscapes/dnl_r50-d8_769x769_40k_cityscapes_20200820_232206-0f283785.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/dnlnet/dnlnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dnlnet/dnl_r50-d8_512x512_80k_ade20k/dnl_r50-d8_512x512_80k_ade20k_20200826_183354-1cf6e0c1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/dnlnet/dnlnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dnlnet/dnl_r50-d8_512x512_160k_ade20k/dnl_r50-d8_512x512_160k_ade20k_20200826_183350-37837798.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/dnlnet/dnlnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dnlnet/dnl_r50-d8_512x1024_80k_cityscapes/dnl_r50-d8_512x1024_80k_cityscapes_20200904_233629-58b2f778.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/dnlnet/dnlnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dnlnet/dnl_r50-d8_512x1024_40k_cityscapes/dnl_r50-d8_512x1024_40k_cityscapes_20200904_233629-53d4ea93.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/dnlnet/dnlnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dnlnet/dnl_r101-d8_769x769_80k_cityscapes/dnl_r101-d8_769x769_80k_cityscapes_20200821_051111-95ff84ab.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/dnlnet/dnlnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dnlnet/dnl_r101-d8_769x769_40k_cityscapes/dnl_r101-d8_769x769_40k_cityscapes_20200820_171256-76c596df.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/dnlnet/dnlnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dnlnet/dnl_r101-d8_512x512_80k_ade20k/dnl_r101-d8_512x512_80k_ade20k_20200826_183354-d820d6ea.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/dnlnet/dnlnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dnlnet/dnl_r101-d8_512x512_160k_ade20k/dnl_r101-d8_512x512_160k_ade20k_20200826_183350-ed522c61.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/dnlnet/dnlnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dnlnet/dnl_r101-d8_512x1024_80k_cityscapes/dnl_r101-d8_512x1024_80k_cityscapes_20200904_233629-758e2dd4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/dnlnet/dnlnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dnlnet/dnl_r101-d8_512x1024_40k_cityscapes/dnl_r101-d8_512x1024_40k_cityscapes_20200904_233629-9928ffef.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/dnlnet/dnlnet.yml | https://arxiv.org/abs/2006.06668 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/dpt/dpt.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dpt/dpt_vit-b16_512x512_160k_ade20k/dpt_vit-b16_512x512_160k_ade20k-db31cf52.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/dpt/dpt.yml | 
https://arxiv.org/abs/2103.13413 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/emanet/emanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/emanet/emanet_r50-d8_769x769_80k_cityscapes/emanet_r50-d8_769x769_80k_cityscapes_20200901_100301-16f8de52.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/emanet/emanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/emanet/emanet_r50-d8_512x1024_80k_cityscapes/emanet_r50-d8_512x1024_80k_cityscapes_20200901_100301-c43fcef1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/emanet/emanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/emanet/emanet_r101-d8_769x769_80k_cityscapes/emanet_r101-d8_769x769_80k_cityscapes_20200901_100301-47a324ce.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/emanet/emanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/emanet/emanet_r101-d8_512x1024_80k_cityscapes/emanet_r101-d8_512x1024_80k_cityscapes_20200901_100301-2d970745.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/emanet/emanet.yml | https://arxiv.org/abs/1907.13426 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/encnet/encnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/encnet/encnet_r50-d8_769x769_80k_cityscapes/encnet_r50-d8_769x769_80k_cityscapes_20200622_003554-55096dcb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/encnet/encnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/encnet/encnet_r50-d8_769x769_40k_cityscapes/encnet_r50-d8_769x769_40k_cityscapes_20200621_220958-3bcd2884.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/encnet/encnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/encnet/encnet_r50-d8_512x512_80k_ade20k/encnet_r50-d8_512x512_80k_ade20k_20200622_042412-44b46b04.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/encnet/encnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/encnet/encnet_r50-d8_512x512_160k_ade20k/encnet_r50-d8_512x512_160k_ade20k_20200622_101059-b2db95e0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/encnet/encnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/encnet/encnet_r50-d8_512x1024_80k_cityscapes/encnet_r50-d8_512x1024_80k_cityscapes_20200622_003554-fc5c5624.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/encnet/encnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/encnet/encnet_r50-d8_512x1024_40k_cityscapes/encnet_r50-d8_512x1024_40k_cityscapes_20200621_220958-68638a47.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/encnet/encnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/encnet/encnet_r101-d8_769x769_80k_cityscapes/encnet_r101-d8_769x769_80k_cityscapes_20200622_003555-470ef79d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/encnet/encnet.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/encnet/encnet_r101-d8_769x769_40k_cityscapes/encnet_r101-d8_769x769_40k_cityscapes_20200621_220933-2fafed55.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/encnet/encnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/encnet/encnet_r101-d8_512x512_80k_ade20k/encnet_r101-d8_512x512_80k_ade20k_20200622_101128-dd35e237.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/encnet/encnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/encnet/encnet_r101-d8_512x512_160k_ade20k/encnet_r101-d8_512x512_160k_ade20k_20200622_073348-7989641f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/encnet/encnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/encnet/encnet_r101-d8_512x1024_80k_cityscapes/encnet_r101-d8_512x1024_80k_cityscapes_20200622_003555-1de64bec.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/encnet/encnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/encnet/encnet_r101-d8_512x1024_40k_cityscapes/encnet_r101-d8_512x1024_40k_cityscapes_20200621_220933-35e0a3e8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/encnet/encnet.yml | https://arxiv.org/abs/1803.08904 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/erfnet/erfnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/erfnet/erfnet_fcn_4x4_512x1024_160k_cityscapes/erfnet_fcn_4x4_512x1024_160k_cityscapes_20220704_162145-dc90157a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/fastfcn/fastfcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fastfcn/fastfcn_r50-d32_jpu_psp_512x512_80k_ade20k/fastfcn_r50-d32_jpu_psp_512x512_80k_ade20k_20210930_225137-993d07c8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/fastfcn/fastfcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fastfcn/fastfcn_r50-d32_jpu_psp_512x512_160k_ade20k/fastfcn_r50-d32_jpu_psp_512x512_160k_ade20k_20211008_105455-e8f5a2fd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/fastfcn/fastfcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fastfcn/fastfcn_r50-d32_jpu_psp_512x1024_80k_cityscapes/fastfcn_r50-d32_jpu_psp_512x1024_80k_cityscapes_20210928_053722-57749bed.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/fastfcn/fastfcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fastfcn/fastfcn_r50-d32_jpu_psp_4x4_512x1024_80k_cityscapes/fastfcn_r50-d32_jpu_psp_4x4_512x1024_80k_cityscapes_20210925_061841-77e87b0a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/fastfcn/fastfcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fastfcn/fastfcn_r50-d32_jpu_enc_512x512_80k_ade20k/fastfcn_r50-d32_jpu_enc_512x512_80k_ade20k_20210930_225214-65aef6dd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/fastfcn/fastfcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fastfcn/fastfcn_r50-d32_jpu_enc_512x512_160k_ade20k/fastfcn_r50-d32_jpu_enc_512x512_160k_ade20k_20211008_105456-d875ce3c.pth | 权重地址 | 
+| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/fastfcn/fastfcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fastfcn/fastfcn_r50-d32_jpu_enc_512x1024_80k_cityscapes/fastfcn_r50-d32_jpu_enc_512x1024_80k_cityscapes_20210928_030036-78da5046.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/fastfcn/fastfcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fastfcn/fastfcn_r50-d32_jpu_enc_4x4_512x1024_80k_cityscapes/fastfcn_r50-d32_jpu_enc_4x4_512x1024_80k_cityscapes_20210926_093217-e1eb6dbb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/fastfcn/fastfcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fastfcn/fastfcn_r50-d32_jpu_aspp_512x512_80k_ade20k/fastfcn_r50-d32_jpu_aspp_512x512_80k_ade20k_20211013_190619-3aa40f2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/fastfcn/fastfcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fastfcn/fastfcn_r50-d32_jpu_aspp_512x512_160k_ade20k/fastfcn_r50-d32_jpu_aspp_512x512_160k_ade20k_20211008_152246-27036aee.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/fastfcn/fastfcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fastfcn/fastfcn_r50-d32_jpu_aspp_512x1024_80k_cityscapes/fastfcn_r50-d32_jpu_aspp_512x1024_80k_cityscapes_20210928_053722-5d1a2648.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/fastfcn/fastfcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fastfcn/fastfcn_r50-d32_jpu_aspp_4x4_512x1024_80k_cityscapes/fastfcn_r50-d32_jpu_aspp_4x4_512x1024_80k_cityscapes_20210924_214357-72220849.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/fastfcn/fastfcn.yml | https://arxiv.org/abs/1903.11816 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/fastscnn/fastscnn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fast_scnn/fast_scnn_lr0.12_8x4_160k_cityscapes/fast_scnn_lr0.12_8x4_160k_cityscapes_20210630_164853-0cec9937.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/fastscnn/fastscnn.yml | https://arxiv.org/abs/1902.04502 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r50-d8_769x769_80k_cityscapes/fcn_r50-d8_769x769_80k_cityscapes_20200606_195749-f5caeabc.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r50-d8_769x769_40k_cityscapes/fcn_r50-d8_769x769_40k_cityscapes_20200606_113104-977b5d02.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r50-d8_512x512_80k_ade20k/fcn_r50-d8_512x512_80k_ade20k_20200614_144016-f8ac5082.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r50-d8_512x512_40k_voc12aug/fcn_r50-d8_512x512_40k_voc12aug_20200613_161222-5e2dbf40.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r50-d8_512x512_20k_voc12aug/fcn_r50-d8_512x512_20k_voc12aug_20200617_010715-52dc5306.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r50-d8_512x512_160k_ade20k/fcn_r50-d8_512x512_160k_ade20k_20200615_100713-4edbc3b4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r50-d8_512x1024_80k_cityscapes/fcn_r50-d8_512x1024_80k_cityscapes_20200606_113019-03aa804d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r50-d8_512x1024_40k_cityscapes/fcn_r50-d8_512x1024_40k_cityscapes_20200604_192608-efe53f0d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r50b-d8_769x769_80k_cityscapes/fcn_r50b-d8_769x769_80k_cityscapes_20201225_094223-94552d38.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r50b-d8_512x1024_80k_cityscapes/fcn_r50b-d8_512x1024_80k_cityscapes_20201225_094221-82957416.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r18-d8_769x769_80k_cityscapes/fcn_r18-d8_769x769_80k_cityscapes_20201225_021451-9739d1b8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r18-d8_512x1024_80k_cityscapes/fcn_r18-d8_512x1024_80k_cityscapes_20201225_021327-6c50f8b4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r18b-d8_769x769_80k_cityscapes/fcn_r18b-d8_769x769_80k_cityscapes_20201226_004430-32d504e5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r18b-d8_512x1024_80k_cityscapes/fcn_r18b-d8_512x1024_80k_cityscapes_20201225_230143-92c0f445.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_fp16_512x1024_80k_cityscapes/fcn_r101-d8_fp16_512x1024_80k_cityscapes_20200717_230921-fb13e883.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_769x769_80k_cityscapes/fcn_r101-d8_769x769_80k_cityscapes_20200606_214354-45cbac68.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_769x769_40k_cityscapes/fcn_r101-d8_769x769_40k_cityscapes_20200606_113208-7d4ab69c.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_512x512_80k_ade20k/fcn_r101-d8_512x512_80k_ade20k_20200615_014143-bc1809f7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_512x512_40k_voc12aug/fcn_r101-d8_512x512_40k_voc12aug_20200613_161240-4c8bcefd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_512x512_20k_voc12aug/fcn_r101-d8_512x512_20k_voc12aug_20200617_010842-0bb4e798.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_512x512_160k_ade20k/fcn_r101-d8_512x512_160k_ade20k_20200615_105816-fd192bd5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_512x1024_80k_cityscapes/fcn_r101-d8_512x1024_80k_cityscapes_20200606_113038-3fb937eb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_512x1024_40k_cityscapes/fcn_r101-d8_512x1024_40k_cityscapes_20200604_181852-a883d3a1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_480x480_80k_pascal_context_59/fcn_r101-d8_480x480_80k_pascal_context_59_20210416_110804-9a6f2c94.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_480x480_80k_pascal_context/fcn_r101-d8_480x480_80k_pascal_context_20210421_163310-4711813f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_480x480_40k_pascal_context_59/fcn_r101-d8_480x480_40k_pascal_context_59_20210415_230724-8cf83682.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_480x480_40k_pascal_context/fcn_r101-d8_480x480_40k_pascal_context_20210421_154757-b5e97937.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101b-d8_769x769_80k_cityscapes/fcn_r101b-d8_769x769_80k_cityscapes_20201226_170012-82be37e2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101b-d8_512x1024_80k_cityscapes/fcn_r101b-d8_512x1024_80k_cityscapes_20201226_160213-4543858f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_d6_r50-d16_769x769_80k_cityscapes/fcn_d6_r50-d16_769x769_80k_cityscapes_20210305_200413-109d88eb.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_d6_r50-d16_769x769_40k_cityscapes/fcn_d6_r50-d16_769x769_40k_cityscapes_20210305_185744-1aab18ed.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_d6_r50-d16_512x1024_80k_cityscapes/fcn_d6_r50-d16_512x1024_80k_cityscapes_20210306_115604-133c292f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_d6_r50-d16_512x1024_40k_cityscapes/fcn_d6_r50-d16_512x1024_40k_cityscapes_20210305_130133-98d5d1bc.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_d6_r50b-d16_769x769_80k_cityscapes/fcn_d6_r50b-d16_769x769_80k_cityscapes_20210311_131012-d665f231.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_d6_r50b-d16_512x1024_80k_cityscapes/fcn_d6_r50b-d16_512x1024_80k_cityscapes_20210311_125550-6a0b62e9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_d6_r101-d16_769x769_80k_cityscapes/fcn_d6_r101-d16_769x769_80k_cityscapes_20210306_120016-e33adc4f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_d6_r101-d16_769x769_40k_cityscapes/fcn_d6_r101-d16_769x769_40k_cityscapes_20210308_102453-60b114e9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_d6_r101-d16_512x1024_80k_cityscapes/fcn_d6_r101-d16_512x1024_80k_cityscapes_20210308_102747-cb336445.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_d6_r101-d16_512x1024_40k_cityscapes/fcn_d6_r101-d16_512x1024_40k_cityscapes_20210305_130337-9cf2b450.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_d6_r101b-d16_769x769_80k_cityscapes/fcn_d6_r101b-d16_769x769_80k_cityscapes_20210311_154527-c4d8bfbc.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_d6_r101b-d16_512x1024_80k_cityscapes/fcn_d6_r101b-d16_512x1024_80k_cityscapes_20210311_144305-3f2eb5b4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/fcn/fcn.yml | https://arxiv.org/abs/1411.4038 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/gcnet/gcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r50-d8_769x769_80k_cityscapes/gcnet_r50-d8_769x769_80k_cityscapes_20200619_092516-4839565b.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/gcnet/gcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r50-d8_769x769_40k_cityscapes/gcnet_r50-d8_769x769_40k_cityscapes_20200618_182814-a26f4471.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/gcnet/gcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r50-d8_512x512_80k_ade20k/gcnet_r50-d8_512x512_80k_ade20k_20200614_185146-91a6da41.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/gcnet/gcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r50-d8_512x512_40k_voc12aug/gcnet_r50-d8_512x512_40k_voc12aug_20200613_195105-9797336d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/gcnet/gcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r50-d8_512x512_20k_voc12aug/gcnet_r50-d8_512x512_20k_voc12aug_20200617_165701-3cbfdab1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/gcnet/gcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r50-d8_512x512_160k_ade20k/gcnet_r50-d8_512x512_160k_ade20k_20200615_224122-d95f3e1f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/gcnet/gcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r50-d8_512x1024_80k_cityscapes/gcnet_r50-d8_512x1024_80k_cityscapes_20200618_074450-ef8f069b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/gcnet/gcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r50-d8_512x1024_40k_cityscapes/gcnet_r50-d8_512x1024_40k_cityscapes_20200618_074436-4b0fd17b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/gcnet/gcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r101-d8_769x769_80k_cityscapes/gcnet_r101-d8_769x769_80k_cityscapes_20200619_092628-8e043423.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/gcnet/gcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r101-d8_769x769_40k_cityscapes/gcnet_r101-d8_769x769_40k_cityscapes_20200619_092550-ca4f0a84.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/gcnet/gcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r101-d8_512x512_80k_ade20k/gcnet_r101-d8_512x512_80k_ade20k_20200615_020811-c3fcb6dd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/gcnet/gcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r101-d8_512x512_40k_voc12aug/gcnet_r101-d8_512x512_40k_voc12aug_20200613_185806-1e38208d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/gcnet/gcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r101-d8_512x512_20k_voc12aug/gcnet_r101-d8_512x512_20k_voc12aug_20200617_165713-6c720aa9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/gcnet/gcnet.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r101-d8_512x512_160k_ade20k/gcnet_r101-d8_512x512_160k_ade20k_20200615_225406-615528d7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/gcnet/gcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r101-d8_512x1024_80k_cityscapes/gcnet_r101-d8_512x1024_80k_cityscapes_20200618_074450-778ebf69.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/gcnet/gcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r101-d8_512x1024_40k_cityscapes/gcnet_r101-d8_512x1024_40k_cityscapes_20200618_074436-5e62567f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/gcnet/gcnet.yml | https://arxiv.org/abs/1904.11492 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_512x512_80k_potsdam/fcn_hr48_512x512_80k_potsdam_20211219_020601-97434c78.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_512x512_80k_loveda/fcn_hr48_512x512_80k_loveda_20211211_044756-67072f55.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_512x512_80k_ade20k/fcn_hr48_512x512_80k_ade20k_20200614_193946-7ba5258d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_512x512_40k_voc12aug/fcn_hr48_512x512_40k_voc12aug_20200613_222111-1b0f18bc.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_512x512_20k_voc12aug/fcn_hr48_512x512_20k_voc12aug_20200617_224419-89de05cd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_512x512_160k_ade20k/fcn_hr48_512x512_160k_ade20k_20200614_214407-a52fc02c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_512x1024_80k_cityscapes/fcn_hr48_512x1024_80k_cityscapes_20200601_202606-58ea95d6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_512x1024_40k_cityscapes/fcn_hr48_512x1024_40k_cityscapes_20200601_014240-a989b146.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_512x1024_160k_cityscapes/fcn_hr48_512x1024_160k_cityscapes_20200602_190946-59b7973e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_4x4_896x896_80k_isaid/fcn_hr48_4x4_896x896_80k_isaid_20220114_174643-547fc420.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_4x4_512x512_80k_vaihingen/fcn_hr48_4x4_512x512_80k_vaihingen_20211231_231244-7133cb22.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_480x480_80k_pascal_context_59/fcn_hr48_480x480_80k_pascal_context_59_20210411_003240-3ae7081e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_480x480_80k_pascal_context/fcn_hr48_480x480_80k_pascal_context_20200911_155322-847a6711.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_480x480_40k_pascal_context_59/fcn_hr48_480x480_40k_pascal_context_59_20210410_122738-b808b8b2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_480x480_40k_pascal_context/fcn_hr48_480x480_40k_pascal_context_20200911_164852-667d00b0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18s_512x512_80k_potsdam/fcn_hr18s_512x512_80k_potsdam_20211218_205517-ba32af63.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18s_512x512_80k_loveda/fcn_hr18s_512x512_80k_loveda_20211210_203228-60a86a7a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18s_512x512_80k_ade20k/fcn_hr18s_512x512_80k_ade20k_20200614_144345-77fc814a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18s_512x512_40k_voc12aug/fcn_hr18s_512x512_40k_voc12aug_20200614_000648-4f8d6e7f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18s_512x512_20k_voc12aug/fcn_hr18s_512x512_20k_voc12aug_20210829_174910-0aceadb4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18s_512x512_160k_ade20k/fcn_hr18s_512x512_160k_ade20k_20210829_174739-f1e7c2e7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18s_512x1024_80k_cityscapes/fcn_hr18s_512x1024_80k_cityscapes_20200601_202700-1462b75d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18s_512x1024_40k_cityscapes/fcn_hr18s_512x1024_40k_cityscapes_20200601_014216-93db27d0.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18s_512x1024_160k_cityscapes/fcn_hr18s_512x1024_160k_cityscapes_20200602_190901-4a0797ea.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18s_4x4_896x896_80k_isaid/fcn_hr18s_4x4_896x896_80k_isaid_20220118_001603-3cc0769b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18s_4x4_512x512_80k_vaihingen/fcn_hr18s_4x4_512x512_80k_vaihingen_20211231_230909-b23aae02.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18_512x512_80k_potsdam/fcn_hr18_512x512_80k_potsdam_20211218_205517-5d0387ad.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18_512x512_80k_loveda/fcn_hr18_512x512_80k_loveda_20211210_203952-93d9c3b3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18_512x512_80k_ade20k/fcn_hr18_512x512_80k_ade20k_20210827_114910-6c9382c0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18_512x512_40k_voc12aug/fcn_hr18_512x512_40k_voc12aug_20200613_224401-1b4b76cd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18_512x512_20k_voc12aug/fcn_hr18_512x512_20k_voc12aug_20200617_224503-488d45f7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18_512x512_160k_ade20k/fcn_hr18_512x512_160k_ade20k_20200614_214426-ca961836.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18_512x1024_80k_cityscapes/fcn_hr18_512x1024_80k_cityscapes_20200601_223255-4e7b345e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18_512x1024_40k_cityscapes/fcn_hr18_512x1024_40k_cityscapes_20200601_014216-f196fb4e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18_512x1024_160k_cityscapes/fcn_hr18_512x1024_160k_cityscapes_20200602_190822-221e4a4f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18_4x4_896x896_80k_isaid/fcn_hr18_4x4_896x896_80k_isaid_20220110_182230-49bf752e.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18_4x4_512x512_80k_vaihingen/fcn_hr18_4x4_512x512_80k_vaihingen_20211231_231216-2ec3ae8a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/icnet/icnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/icnet/icnet_r50-d8_in1k-pre_832x832_80k_cityscapes/icnet_r50-d8_in1k-pre_832x832_80k_cityscapes_20210926_032943-1743dc7b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/icnet/icnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/icnet/icnet_r50-d8_in1k-pre_832x832_160k_cityscapes/icnet_r50-d8_in1k-pre_832x832_160k_cityscapes_20210926_042715-ce310aea.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/icnet/icnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/icnet/icnet_r50-d8_832x832_80k_cityscapes/icnet_r50-d8_832x832_80k_cityscapes_20210926_044625-c6407341.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/icnet/icnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/icnet/icnet_r50-d8_832x832_160k_cityscapes/icnet_r50-d8_832x832_160k_cityscapes_20210925_232612-a95f0d4e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/icnet/icnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/icnet/icnet_r18-d8_in1k-pre_832x832_80k_cityscapes/icnet_r18-d8_in1k-pre_832x832_80k_cityscapes_20210925_230354-1cbe3022.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/icnet/icnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/icnet/icnet_r18-d8_in1k-pre_832x832_160k_cityscapes/icnet_r18-d8_in1k-pre_832x832_160k_cityscapes_20210926_052702-619c8ae1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/icnet/icnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/icnet/icnet_r18-d8_832x832_80k_cityscapes/icnet_r18-d8_832x832_80k_cityscapes_20210925_225521-2e36638d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/icnet/icnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/icnet/icnet_r18-d8_832x832_160k_cityscapes/icnet_r18-d8_832x832_160k_cityscapes_20210925_230153-2c6eb6e0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/icnet/icnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/icnet/icnet_r101-d8_in1k-pre_832x832_80k_cityscapes/icnet_r101-d8_in1k-pre_832x832_80k_cityscapes_20210926_020414-7ceb12c5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/icnet/icnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/icnet/icnet_r101-d8_in1k-pre_832x832_160k_cityscapes/icnet_r101-d8_in1k-pre_832x832_160k_cityscapes_20210925_232612-9484ae8a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/icnet/icnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/icnet/icnet_r101-d8_832x832_80k_cityscapes/icnet_r101-d8_832x832_80k_cityscapes_20210926_072447-b52f936e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/icnet/icnet.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/icnet/icnet_r101-d8_832x832_160k_cityscapes/icnet_r101-d8_832x832_160k_cityscapes_20210926_092350-3a1ebf1a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/icnet/icnet.yml | https://arxiv.org/abs/1704.08545 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/isanet/isanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/isanet/isanet_r50-d8_769x769_80k_cityscapes/isanet_r50-d8_769x769_80k_cityscapes_20210903_101126-99b54519.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/isanet/isanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/isanet/isanet_r50-d8_769x769_40k_cityscapes/isanet_r50-d8_769x769_40k_cityscapes_20210903_050200-4ae7e65b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/isanet/isanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/isanet/isanet_r50-d8_512x512_80k_ade20k/isanet_r50-d8_512x512_80k_ade20k_20210903_124557-6ed83a0c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/isanet/isanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/isanet/isanet_r50-d8_512x512_40k_voc12aug/isanet_r50-d8_512x512_40k_voc12aug_20210901_151349-7d08a54e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/isanet/isanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/isanet/isanet_r50-d8_512x512_20k_voc12aug/isanet_r50-d8_512x512_20k_voc12aug_20210901_164838-79d59b80.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/isanet/isanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/isanet/isanet_r50-d8_512x512_160k_ade20k/isanet_r50-d8_512x512_160k_ade20k_20210903_104850-f752d0a3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/isanet/isanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/isanet/isanet_r50-d8_512x1024_80k_cityscapes/isanet_r50-d8_512x1024_80k_cityscapes_20210901_074202-89384497.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/isanet/isanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/isanet/isanet_r50-d8_512x1024_40k_cityscapes/isanet_r50-d8_512x1024_40k_cityscapes_20210901_054739-981bd763.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/isanet/isanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/isanet/isanet_r101-d8_769x769_80k_cityscapes/isanet_r101-d8_769x769_80k_cityscapes_20210903_111319-24f71dfa.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/isanet/isanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/isanet/isanet_r101-d8_769x769_40k_cityscapes/isanet_r101-d8_769x769_40k_cityscapes_20210903_111320-509e7224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/isanet/isanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/isanet/isanet_r101-d8_512x512_80k_ade20k/isanet_r101-d8_512x512_80k_ade20k_20210903_162056-68b235c2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/isanet/isanet.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/isanet/isanet_r101-d8_512x512_40k_voc12aug/isanet_r101-d8_512x512_40k_voc12aug_20210901_145814-bc71233b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/isanet/isanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/isanet/isanet_r101-d8_512x512_20k_voc12aug/isanet_r101-d8_512x512_20k_voc12aug_20210901_115805-3ccbf355.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/isanet/isanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/isanet/isanet_r101-d8_512x512_160k_ade20k/isanet_r101-d8_512x512_160k_ade20k_20210903_211431-a7879dcd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/isanet/isanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/isanet/isanet_r101-d8_512x1024_80k_cityscapes/isanet_r101-d8_512x1024_80k_cityscapes_20210901_145243-5b99c9b2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/isanet/isanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/isanet/isanet_r101-d8_512x1024_40k_cityscapes/isanet_r101-d8_512x1024_40k_cityscapes_20210901_145553-293e6bd6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/isanet/isanet.yml | https://arxiv.org/abs/1907.12273 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/knet/knet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/knet/knet_s3_upernet_swin-t_8x2_512x512_adamw_80k_ade20k/knet_s3_upernet_swin-t_8x2_512x512_adamw_80k_ade20k_20220303_133059-7545e1dc.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/knet/knet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/knet/knet_s3_upernet_swin-l_8x2_640x640_adamw_80k_ade20k/knet_s3_upernet_swin-l_8x2_640x640_adamw_80k_ade20k_20220720_165636-cbcaed32.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/knet/knet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/knet/knet_s3_upernet_swin-l_8x2_512x512_adamw_80k_ade20k/knet_s3_upernet_swin-l_8x2_512x512_adamw_80k_ade20k_20220303_154559-d8da9a90.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/knet/knet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/knet/knet_s3_upernet_r50-d8_8x2_512x512_adamw_80k_ade20k/knet_s3_upernet_r50-d8_8x2_512x512_adamw_80k_ade20k_20220304_125657-215753b0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/knet/knet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/knet/knet_s3_pspnet_r50-d8_8x2_512x512_adamw_80k_ade20k/knet_s3_pspnet_r50-d8_8x2_512x512_adamw_80k_ade20k_20220228_054634-d2c72240.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/knet/knet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/knet/knet_s3_fcn_r50-d8_8x2_512x512_adamw_80k_ade20k/knet_s3_fcn_r50-d8_8x2_512x512_adamw_80k_ade20k_20220228_043751-abcab920.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/knet/knet.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/knet/knet_s3_deeplabv3_r50-d8_8x2_512x512_adamw_80k_ade20k/knet_s3_deeplabv3_r50-d8_8x2_512x512_adamw_80k_ade20k_20220228_041642-00c8fbeb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/knet/knet.yml | https://arxiv.org/abs/2106.14855 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/knet/knet_s3_upernet_swin-l_8x2_512x512_adamw_80k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/swin/swin_large_patch4_window7_224_22k_20220308-d5bdebaf.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/knet/knet_s3_upernet_swin-l_8x2_640x640_adamw_80k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/swin/swin_large_patch4_window7_224_22k_20220308-d5bdebaf.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/knet/knet_s3_upernet_swin-t_8x2_512x512_adamw_80k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/swin/swin_tiny_patch4_window7_224_20220308-f41b89d3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/mae/mae.yml | https://download.openmmlab.com/mmsegmentation/v0.5/mae/upernet_mae-base_fp16_8x2_512x512_160k_ade20k/upernet_mae-base_fp16_8x2_512x512_160k_ade20k_20220426_174752-f92a2975.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/mobilenet_v2/mobilenet_v2.yml | https://download.openmmlab.com/mmsegmentation/v0.5/mobilenet_v2/pspnet_m-v2-d8_512x512_160k_ade20k/pspnet_m-v2-d8_512x512_160k_ade20k_20200825_214953-f5942f7a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/mobilenet_v2/mobilenet_v2.yml | https://download.openmmlab.com/mmsegmentation/v0.5/mobilenet_v2/pspnet_m-v2-d8_512x1024_80k_cityscapes/pspnet_m-v2-d8_512x1024_80k_cityscapes_20200825_124817-19e81d51.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/mobilenet_v2/mobilenet_v2.yml | https://download.openmmlab.com/mmsegmentation/v0.5/mobilenet_v2/fcn_m-v2-d8_512x512_160k_ade20k/fcn_m-v2-d8_512x512_160k_ade20k_20200825_214953-c40e1095.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/mobilenet_v2/mobilenet_v2.yml | https://download.openmmlab.com/mmsegmentation/v0.5/mobilenet_v2/fcn_m-v2-d8_512x1024_80k_cityscapes/fcn_m-v2-d8_512x1024_80k_cityscapes_20200825_124817-d24c28c1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/mobilenet_v2/mobilenet_v2.yml | https://download.openmmlab.com/mmsegmentation/v0.5/mobilenet_v2/deeplabv3plus_m-v2-d8_512x512_160k_ade20k/deeplabv3plus_m-v2-d8_512x512_160k_ade20k_20200825_223255-465a01d4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/mobilenet_v2/mobilenet_v2.yml | https://download.openmmlab.com/mmsegmentation/v0.5/mobilenet_v2/deeplabv3plus_m-v2-d8_512x1024_80k_cityscapes/deeplabv3plus_m-v2-d8_512x1024_80k_cityscapes_20200825_124836-d256dd4b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/mobilenet_v2/mobilenet_v2.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/mobilenet_v2/deeplabv3_m-v2-d8_512x512_160k_ade20k/deeplabv3_m-v2-d8_512x512_160k_ade20k_20200825_223255-63986343.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/mobilenet_v2/mobilenet_v2.yml | https://download.openmmlab.com/mmsegmentation/v0.5/mobilenet_v2/deeplabv3_m-v2-d8_512x1024_80k_cityscapes/deeplabv3_m-v2-d8_512x1024_80k_cityscapes_20200825_124836-bef03590.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/mobilenet_v3/mobilenet_v3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/mobilenet_v3/lraspp_m-v3s-d8_scratch_512x1024_320k_cityscapes/lraspp_m-v3s-d8_scratch_512x1024_320k_cityscapes_20201224_223935-03daeabb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/mobilenet_v3/mobilenet_v3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/mobilenet_v3/lraspp_m-v3s-d8_512x1024_320k_cityscapes/lraspp_m-v3s-d8_512x1024_320k_cityscapes_20201224_223935-61565b34.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/mobilenet_v3/mobilenet_v3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/mobilenet_v3/lraspp_m-v3-d8_scratch_512x1024_320k_cityscapes/lraspp_m-v3-d8_scratch_512x1024_320k_cityscapes_20201224_220337-9f29cd72.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/mobilenet_v3/mobilenet_v3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/mobilenet_v3/lraspp_m-v3-d8_512x1024_320k_cityscapes/lraspp_m-v3-d8_512x1024_320k_cityscapes_20201224_220337-cfe8fb07.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/mobilenet_v3/mobilenet_v3.yml | https://arxiv.org/abs/1905.02244 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/nonlocal_net/nonlocal_net.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r50-d8_769x769_80k_cityscapes/nonlocal_r50-d8_769x769_80k_cityscapes_20200607_193506-1f9792f6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/nonlocal_net/nonlocal_net.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r50-d8_769x769_40k_cityscapes/nonlocal_r50-d8_769x769_40k_cityscapes_20200530_045243-82ef6749.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/nonlocal_net/nonlocal_net.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r50-d8_512x512_80k_ade20k/nonlocal_r50-d8_512x512_80k_ade20k_20200615_015801-5ae0aa33.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/nonlocal_net/nonlocal_net.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r50-d8_512x512_40k_voc12aug/nonlocal_r50-d8_512x512_40k_voc12aug_20200614_000028-0139d4a9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/nonlocal_net/nonlocal_net.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r50-d8_512x512_20k_voc12aug/nonlocal_r50-d8_512x512_20k_voc12aug_20200617_222613-07f2a57c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/nonlocal_net/nonlocal_net.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r50-d8_512x512_160k_ade20k/nonlocal_r50-d8_512x512_160k_ade20k_20200616_005410-baef45e3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/nonlocal_net/nonlocal_net.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r50-d8_512x1024_80k_cityscapes/nonlocal_r50-d8_512x1024_80k_cityscapes_20200607_193518-d6839fae.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/nonlocal_net/nonlocal_net.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r50-d8_512x1024_40k_cityscapes/nonlocal_r50-d8_512x1024_40k_cityscapes_20200605_210748-c75e81e3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/nonlocal_net/nonlocal_net.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r101-d8_769x769_80k_cityscapes/nonlocal_r101-d8_769x769_80k_cityscapes_20200607_183428-0e1fa4f9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/nonlocal_net/nonlocal_net.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r101-d8_769x769_40k_cityscapes/nonlocal_r101-d8_769x769_40k_cityscapes_20200530_045348-8fe9a9dc.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/nonlocal_net/nonlocal_net.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r101-d8_512x512_80k_ade20k/nonlocal_r101-d8_512x512_80k_ade20k_20200615_015758-24105919.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/nonlocal_net/nonlocal_net.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r101-d8_512x512_40k_voc12aug/nonlocal_r101-d8_512x512_40k_voc12aug_20200614_000028-7e5ff470.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/nonlocal_net/nonlocal_net.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r101-d8_512x512_20k_voc12aug/nonlocal_r101-d8_512x512_20k_voc12aug_20200617_222615-948c68ab.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/nonlocal_net/nonlocal_net.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r101-d8_512x512_160k_ade20k/nonlocal_r101-d8_512x512_160k_ade20k_20210827_221502-7881aa1a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/nonlocal_net/nonlocal_net.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r101-d8_512x1024_80k_cityscapes/nonlocal_r101-d8_512x1024_80k_cityscapes_20200607_183411-32700183.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/nonlocal_net/nonlocal_net.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r101-d8_512x1024_40k_cityscapes/nonlocal_r101-d8_512x1024_40k_cityscapes_20200605_210748-d63729fa.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/nonlocal_net/nonlocal_net.yml | https://arxiv.org/abs/1711.07971 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/ocrnet/ocrnet.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_r101-d8_512x1024_80k_b16_cityscapes/ocrnet_r101-d8_512x1024_80k_b16_cityscapes_20200723_192421-78688424.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_r101-d8_512x1024_40k_b8_cityscapes/ocrnet_r101-d8_512x1024_40k_b8_cityscapes_20200717_110721-02ac0f13.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_r101-d8_512x1024_40k_b16_cityscapes/ocrnet_r101-d8_512x1024_40k_b16_cityscapes_20200723_193726-db500f80.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr48_512x512_80k_ade20k/ocrnet_hr48_512x512_80k_ade20k_20200615_021518-d168c2d1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr48_512x512_40k_voc12aug/ocrnet_hr48_512x512_40k_voc12aug_20200614_015958-255bc5ce.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr48_512x512_20k_voc12aug/ocrnet_hr48_512x512_20k_voc12aug_20200617_233932-9e82080a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr48_512x512_160k_ade20k/ocrnet_hr48_512x512_160k_ade20k_20200615_184705-a073726d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr48_512x1024_80k_cityscapes/ocrnet_hr48_512x1024_80k_cityscapes_20200601_222752-9076bcdf.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr48_512x1024_40k_cityscapes/ocrnet_hr48_512x1024_40k_cityscapes_20200601_033336-55b32491.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr48_512x1024_160k_cityscapes/ocrnet_hr48_512x1024_160k_cityscapes_20200602_191037-dfbf1b0c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18s_512x512_80k_ade20k/ocrnet_hr18s_512x512_80k_ade20k_20200615_055600-e80b62af.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18s_512x512_40k_voc12aug/ocrnet_hr18s_512x512_40k_voc12aug_20200614_002025-42b587ac.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18s_512x512_20k_voc12aug/ocrnet_hr18s_512x512_20k_voc12aug_20200617_233913-02b04fcb.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18s_512x512_160k_ade20k/ocrnet_hr18s_512x512_160k_ade20k_20200615_184505-8e913058.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18s_512x1024_80k_cityscapes/ocrnet_hr18s_512x1024_80k_cityscapes_20200601_222735-55979e63.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18s_512x1024_40k_cityscapes/ocrnet_hr18s_512x1024_40k_cityscapes_20200601_033304-fa2436c2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18s_512x1024_160k_cityscapes/ocrnet_hr18s_512x1024_160k_cityscapes_20200602_191005-f4a7af28.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18_512x512_80k_ade20k/ocrnet_hr18_512x512_80k_ade20k_20200615_053157-d173d83b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18_512x512_40k_voc12aug/ocrnet_hr18_512x512_40k_voc12aug_20200614_015958-714302be.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18_512x512_20k_voc12aug/ocrnet_hr18_512x512_20k_voc12aug_20200617_233932-8954cbb7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18_512x512_160k_ade20k/ocrnet_hr18_512x512_160k_ade20k_20200615_200940-d8fcd9d1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18_512x1024_80k_cityscapes/ocrnet_hr18_512x1024_80k_cityscapes_20200614_230521-c2e1dd4a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18_512x1024_40k_cityscapes/ocrnet_hr18_512x1024_40k_cityscapes_20200601_033320-401c5bdd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18_512x1024_160k_cityscapes/ocrnet_hr18_512x1024_160k_cityscapes_20200602_191001-b9172d0c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/ocrnet/ocrnet.yml | https://arxiv.org/abs/1909.11065 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/point_rend/point_rend.yml | https://download.openmmlab.com/mmsegmentation/v0.5/point_rend/pointrend_r50_512x512_160k_ade20k/pointrend_r50_512x512_160k_ade20k_20200807_232644-ac3febf2.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/point_rend/point_rend.yml | https://download.openmmlab.com/mmsegmentation/v0.5/point_rend/pointrend_r50_512x1024_80k_cityscapes/pointrend_r50_512x1024_80k_cityscapes_20200711_015821-bb1ff523.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/point_rend/point_rend.yml | https://download.openmmlab.com/mmsegmentation/v0.5/point_rend/pointrend_r101_512x512_160k_ade20k/pointrend_r101_512x512_160k_ade20k_20200808_030852-8834902a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/point_rend/point_rend.yml | https://download.openmmlab.com/mmsegmentation/v0.5/point_rend/pointrend_r101_512x1024_80k_cityscapes/pointrend_r101_512x1024_80k_cityscapes_20200711_170850-d0ca84be.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/point_rend/point_rend.yml | https://arxiv.org/abs/1912.08193 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/poolformer/fpn_poolformer_m36_8x4_512x512_40k_ade20k.py | https://download.openmmlab.com/mmclassification/v0/poolformer/poolformer-m36_3rdparty_32xb128_in1k_20220414-c55e0949.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/poolformer/fpn_poolformer_m48_8x4_512x512_40k_ade20k.py | https://download.openmmlab.com/mmclassification/v0/poolformer/poolformer-m48_3rdparty_32xb128_in1k_20220414-9378f3eb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/poolformer/fpn_poolformer_s24_8x4_512x512_40k_ade20k.py | https://download.openmmlab.com/mmclassification/v0/poolformer/poolformer-s24_3rdparty_32xb128_in1k_20220414-d7055904.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/poolformer/fpn_poolformer_s36_8x4_512x512_40k_ade20k.py | https://download.openmmlab.com/mmclassification/v0/poolformer/poolformer-s36_3rdparty_32xb128_in1k_20220414-d78ff3e8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/poolformer/poolformer.yml | https://download.openmmlab.com/mmsegmentation/v0.5/poolformer/fpn_poolformer_s36_8x4_512x512_40k_ade20k/fpn_poolformer_s36_8x4_512x512_40k_ade20k_20220501_151122-b47e607d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/poolformer/poolformer.yml | https://download.openmmlab.com/mmsegmentation/v0.5/poolformer/fpn_poolformer_s24_8x4_512x512_40k_ade20k/fpn_poolformer_s24_8x4_512x512_40k_ade20k_20220503_222049-394a7cf7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/poolformer/poolformer.yml | https://download.openmmlab.com/mmsegmentation/v0.5/poolformer/fpn_poolformer_s12_8x4_512x512_40k_ade20k/fpn_poolformer_s12_8x4_512x512_40k_ade20k_20220501_115154-b5aa2f49.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/poolformer/poolformer.yml | https://download.openmmlab.com/mmsegmentation/v0.5/poolformer/fpn_poolformer_m48_8x4_512x512_40k_ade20k/fpn_poolformer_m48_8x4_512x512_40k_ade20k_20220504_003923-64168d3b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/poolformer/poolformer.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/poolformer/fpn_poolformer_m36_8x4_512x512_40k_ade20k/fpn_poolformer_m36_8x4_512x512_40k_ade20k_20220501_164230-3dc83921.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/psanet/psanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r50-d8_769x769_80k_cityscapes/psanet_r50-d8_769x769_80k_cityscapes_20200606_225134-fe42f49e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/psanet/psanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r50-d8_769x769_40k_cityscapes/psanet_r50-d8_769x769_40k_cityscapes_20200530_033717-d5365506.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/psanet/psanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r50-d8_512x512_80k_ade20k/psanet_r50-d8_512x512_80k_ade20k_20200614_144141-835e4b97.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/psanet/psanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r50-d8_512x512_40k_voc12aug/psanet_r50-d8_512x512_40k_voc12aug_20200613_161946-f596afb5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/psanet/psanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r50-d8_512x512_20k_voc12aug/psanet_r50-d8_512x512_20k_voc12aug_20200617_102413-2f1bbaa1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/psanet/psanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r50-d8_512x512_160k_ade20k/psanet_r50-d8_512x512_160k_ade20k_20200615_161258-148077dd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/psanet/psanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r50-d8_512x1024_80k_cityscapes/psanet_r50-d8_512x1024_80k_cityscapes_20200606_161842-ab60a24f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/psanet/psanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r50-d8_512x1024_40k_cityscapes/psanet_r50-d8_512x1024_40k_cityscapes_20200606_103117-99fac37c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/psanet/psanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r101-d8_769x769_80k_cityscapes/psanet_r101-d8_769x769_80k_cityscapes_20200606_214550-7665827b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/psanet/psanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r101-d8_769x769_40k_cityscapes/psanet_r101-d8_769x769_40k_cityscapes_20200530_035107-997da1e6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/psanet/psanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r101-d8_512x512_80k_ade20k/psanet_r101-d8_512x512_80k_ade20k_20200614_185117-1fab60d4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/psanet/psanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r101-d8_512x512_40k_voc12aug/psanet_r101-d8_512x512_40k_voc12aug_20200613_161946-1f560f9e.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/psanet/psanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r101-d8_512x512_20k_voc12aug/psanet_r101-d8_512x512_20k_voc12aug_20200617_110624-946fef11.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/psanet/psanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r101-d8_512x512_160k_ade20k/psanet_r101-d8_512x512_160k_ade20k_20200615_161537-dbfa564c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/psanet/psanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r101-d8_512x1024_80k_cityscapes/psanet_r101-d8_512x1024_80k_cityscapes_20200606_161823-0f73a169.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/psanet/psanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r101-d8_512x1024_40k_cityscapes/psanet_r101-d8_512x1024_40k_cityscapes_20200606_001418-27b9cfa7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/psanet/psanet.yml | https://openaccess.thecvf.com/content_ECCV_2018/papers/Hengshuang_Zhao_PSANet_Point-wise_Spatial_ECCV_2018_paper.pdf | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_rsb-pretrain_512x1024_adamw_80k_cityscapes/pspnet_r50-d8_rsb-pretrain_512x1024_adamw_80k_cityscapes_20220315_123238-588c30be.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_769x769_80k_cityscapes/pspnet_r50-d8_769x769_80k_cityscapes_20200606_210121-5ccf03dd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_769x769_40k_cityscapes/pspnet_r50-d8_769x769_40k_cityscapes_20200606_112725-86638686.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_512x512_80k_loveda/pspnet_r50-d8_512x512_80k_loveda_20211104_155728-88610f9f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_512x512_80k_ade20k/pspnet_r50-d8_512x512_80k_ade20k_20200615_014128-15a8b914.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_512x512_4x4_80k_coco-stuff164k/pspnet_r50-d8_512x512_4x4_80k_coco-stuff164k_20210707_152034-0e41b2db.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_512x512_4x4_40k_coco-stuff10k/pspnet_r50-d8_512x512_4x4_40k_coco-stuff10k_20210821_030857-92e2902b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/pspnet/pspnet.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_512x512_4x4_320k_coco-stuff164k/pspnet_r50-d8_512x512_4x4_320k_coco-stuff164k_20210707_152004-be9610cc.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_512x512_4x4_20k_coco-stuff10k/pspnet_r50-d8_512x512_4x4_20k_coco-stuff10k_20210820_203258-b88df27f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_512x512_4x4_160k_coco-stuff164k/pspnet_r50-d8_512x512_4x4_160k_coco-stuff164k_20210707_152004-51276a57.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_512x512_40k_voc12aug/pspnet_r50-d8_512x512_40k_voc12aug_20200613_161222-ae9c1b8c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_512x512_20k_voc12aug/pspnet_r50-d8_512x512_20k_voc12aug_20200617_101958-ed5dfbd9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_512x512_160k_ade20k/pspnet_r50-d8_512x512_160k_ade20k_20200615_184358-1890b0bd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_512x1024_80k_cityscapes/pspnet_r50-d8_512x1024_80k_cityscapes_20200606_112131-2376f12b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_512x1024_40k_cityscapes/pspnet_r50-d8_512x1024_40k_cityscapes_20200605_003338-2966598c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_4x4_896x896_80k_isaid/pspnet_r50-d8_4x4_896x896_80k_isaid_20220110_180629-1f21dc32.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_4x4_512x512_80k_vaihingen/pspnet_r50-d8_4x4_512x512_80k_vaihingen_20211228_160355-382f8f5b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_4x4_512x512_80k_potsdam/pspnet_r50-d8_4x4_512x512_80k_potsdam_20211219_043541-2dd5fe67.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d32_rsb-pretrain_512x1024_adamw_80k_cityscapes/pspnet_r50-d32_rsb-pretrain_512x1024_adamw_80k_cityscapes_20220316_141229-dd9c9610.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/pspnet/pspnet.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d32_512x1024_80k_cityscapes/pspnet_r50-d32_512x1024_80k_cityscapes_20220316_224840-9092b254.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50b-d8_769x769_80k_cityscapes/pspnet_r50b-d8_769x769_80k_cityscapes_20201225_094316-4c643cf6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50b-d8_512x1024_80k_cityscapes/pspnet_r50b-d8_512x1024_80k_cityscapes_20201225_094315-6344287a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50b-d32_512x1024_80k_cityscapes/pspnet_r50b-d32_512x1024_80k_cityscapes_20220311_152152-23bcaf8c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r18-d8_769x769_80k_cityscapes/pspnet_r18-d8_769x769_80k_cityscapes_20201225_021458-3deefc62.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r18-d8_512x512_80k_loveda/pspnet_r18-d8_512x512_80k_loveda_20211105_052100-b97697f1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r18-d8_512x1024_80k_cityscapes/pspnet_r18-d8_512x1024_80k_cityscapes_20201225_021458-09ffa746.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r18-d8_4x4_896x896_80k_isaid/pspnet_r18-d8_4x4_896x896_80k_isaid_20220110_180526-e84c0b6a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r18-d8_4x4_512x512_80k_vaihingen/pspnet_r18-d8_4x4_512x512_80k_vaihingen_20211228_160355-52a8a6f6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r18-d8_4x4_512x512_80k_potsdam/pspnet_r18-d8_4x4_512x512_80k_potsdam_20211220_125612-7cd046e1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r18b-d8_769x769_80k_cityscapes/pspnet_r18b-d8_769x769_80k_cityscapes_20201226_080942-bf98d186.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r18b-d8_512x1024_80k_cityscapes/pspnet_r18b-d8_512x1024_80k_cityscapes_20201226_063116-26928a60.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/pspnet/pspnet.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_fp16_512x1024_80k_cityscapes/pspnet_r101-d8_fp16_512x1024_80k_cityscapes_20200717_230919-a0875e5c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_769x769_80k_cityscapes/pspnet_r101-d8_769x769_80k_cityscapes_20200606_225055-dba412fa.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_769x769_40k_cityscapes/pspnet_r101-d8_769x769_40k_cityscapes_20200606_112753-61c6f5be.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_512x512_80k_loveda/pspnet_r101-d8_512x512_80k_loveda_20211104_153212-1c06c6a8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_512x512_80k_ade20k/pspnet_r101-d8_512x512_80k_ade20k_20200614_031423-b6e782f0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_512x512_4x4_80k_coco-stuff164k/pspnet_r101-d8_512x512_4x4_80k_coco-stuff164k_20210707_152034-7eb41789.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_512x512_4x4_40k_coco-stuff10k/pspnet_r101-d8_512x512_4x4_40k_coco-stuff10k_20210821_014022-831aec95.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_512x512_4x4_320k_coco-stuff164k/pspnet_r101-d8_512x512_4x4_320k_coco-stuff164k_20210707_152004-72220c60.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_512x512_4x4_20k_coco-stuff10k/pspnet_r101-d8_512x512_4x4_20k_coco-stuff10k_20210820_232135-76aae482.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_512x512_4x4_160k_coco-stuff164k/pspnet_r101-d8_512x512_4x4_160k_coco-stuff164k_20210707_152004-4af9621b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_512x512_40k_voc12aug/pspnet_r101-d8_512x512_40k_voc12aug_20200613_161222-bc933b18.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_512x512_20k_voc12aug/pspnet_r101-d8_512x512_20k_voc12aug_20200617_102003-4aef3c9a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/pspnet/pspnet.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_512x512_160k_ade20k/pspnet_r101-d8_512x512_160k_ade20k_20200615_100650-967c316f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_512x1024_80k_cityscapes/pspnet_r101-d8_512x1024_80k_cityscapes_20200606_112211-e1e1100f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_512x1024_40k_cityscapes/pspnet_r101-d8_512x1024_40k_cityscapes_20200604_232751-467e7cf4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_4x4_512x512_80k_vaihingen/pspnet_r101-d8_4x4_512x512_80k_vaihingen_20211231_230806-8eba0a09.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_4x4_512x512_80k_potsdam/pspnet_r101-d8_4x4_512x512_80k_potsdam_20211220_125612-aed036c4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_480x480_80k_pascal_context_59/pspnet_r101-d8_480x480_80k_pascal_context_59_20210416_114418-fa6caaa2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_480x480_80k_pascal_context/pspnet_r101-d8_480x480_80k_pascal_context_20200911_190530-c86d6233.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_480x480_40k_pascal_context_59/pspnet_r101-d8_480x480_40k_pascal_context_59_20210416_114524-86d44cd4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_480x480_40k_pascal_context/pspnet_r101-d8_480x480_40k_pascal_context_20200911_211210-bf0f5d7c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101b-d8_769x769_80k_cityscapes/pspnet_r101b-d8_769x769_80k_cityscapes_20201226_171823-f0e7c293.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101b-d8_512x1024_80k_cityscapes/pspnet_r101b-d8_512x1024_80k_cityscapes_20201226_170012-3a4d38ab.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/pspnet/pspnet.yml | https://arxiv.org/abs/1612.01105 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/pspnet/pspnet_r50-d32_rsb-pretrain_512x1024_adamw_80k_cityscapes.py | https://download.openmmlab.com/mmclassification/v0/resnet/resnet50_8xb256-rsb-a1-600e_in1k_20211228-20e21305.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/pspnet/pspnet_r50-d8_rsb-pretrain_512x1024_adamw_80k_cityscapes.py | https://download.openmmlab.com/mmclassification/v0/resnet/resnet50_8xb256-rsb-a1-600e_in1k_20211228-20e21305.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/resnest/resnest.yml | https://download.openmmlab.com/mmsegmentation/v0.5/resnest/pspnet_s101-d8_512x512_160k_ade20k/pspnet_s101-d8_512x512_160k_ade20k_20200807_145416-a6daa92a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/resnest/resnest.yml | https://download.openmmlab.com/mmsegmentation/v0.5/resnest/pspnet_s101-d8_512x1024_80k_cityscapes/pspnet_s101-d8_512x1024_80k_cityscapes_20200807_140631-c75f3b99.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/resnest/resnest.yml | https://download.openmmlab.com/mmsegmentation/v0.5/resnest/fcn_s101-d8_512x512_160k_ade20k/fcn_s101-d8_512x512_160k_ade20k_20200807_145416-d3160329.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/resnest/resnest.yml | https://download.openmmlab.com/mmsegmentation/v0.5/resnest/fcn_s101-d8_512x1024_80k_cityscapes/fcn_s101-d8_512x1024_80k_cityscapes_20200807_140631-f8d155b3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/resnest/resnest.yml | https://download.openmmlab.com/mmsegmentation/v0.5/resnest/deeplabv3plus_s101-d8_512x512_160k_ade20k/deeplabv3plus_s101-d8_512x512_160k_ade20k_20200807_144503-27b26226.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/resnest/resnest.yml | https://download.openmmlab.com/mmsegmentation/v0.5/resnest/deeplabv3plus_s101-d8_512x1024_80k_cityscapes/deeplabv3plus_s101-d8_512x1024_80k_cityscapes_20200807_144429-1239eb43.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/resnest/resnest.yml | https://download.openmmlab.com/mmsegmentation/v0.5/resnest/deeplabv3_s101-d8_512x512_160k_ade20k/deeplabv3_s101-d8_512x512_160k_ade20k_20200807_144503-17ecabe5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/resnest/resnest.yml | https://download.openmmlab.com/mmsegmentation/v0.5/resnest/deeplabv3_s101-d8_512x1024_80k_cityscapes/deeplabv3_s101-d8_512x1024_80k_cityscapes_20200807_144429-b73c4270.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/segformer/segformer.yml | https://download.openmmlab.com/mmsegmentation/v0.5/segformer/segformer_mit-b5_8x1_1024x1024_160k_cityscapes/segformer_mit-b5_8x1_1024x1024_160k_cityscapes_20211206_072934-87a052ec.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/segformer/segformer.yml | https://download.openmmlab.com/mmsegmentation/v0.5/segformer/segformer_mit-b5_640x640_160k_ade20k/segformer_mit-b5_640x640_160k_ade20k_20220617_203542-940a6bd8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/segformer/segformer.yml | https://download.openmmlab.com/mmsegmentation/v0.5/segformer/segformer_mit-b5_512x512_160k_ade20k/segformer_mit-b5_512x512_160k_ade20k_20210726_145235-94cedf59.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/segformer/segformer.yml | https://download.openmmlab.com/mmsegmentation/v0.5/segformer/segformer_mit-b4_8x1_1024x1024_160k_cityscapes/segformer_mit-b4_8x1_1024x1024_160k_cityscapes_20211207_080709-07f6c333.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/segformer/segformer.yml | https://download.openmmlab.com/mmsegmentation/v0.5/segformer/segformer_mit-b4_512x512_160k_ade20k/segformer_mit-b4_512x512_160k_ade20k_20220620_112216-4fa4f58f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/segformer/segformer.yml | https://download.openmmlab.com/mmsegmentation/v0.5/segformer/segformer_mit-b3_8x1_1024x1024_160k_cityscapes/segformer_mit-b3_8x1_1024x1024_160k_cityscapes_20211206_224823-a8f8a177.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/segformer/segformer.yml | https://download.openmmlab.com/mmsegmentation/v0.5/segformer/segformer_mit-b3_512x512_160k_ade20k/segformer_mit-b3_512x512_160k_ade20k_20220617_162254-3a4b7363.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/segformer/segformer.yml | https://download.openmmlab.com/mmsegmentation/v0.5/segformer/segformer_mit-b2_8x1_1024x1024_160k_cityscapes/segformer_mit-b2_8x1_1024x1024_160k_cityscapes_20211207_134205-6096669a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/segformer/segformer.yml | https://download.openmmlab.com/mmsegmentation/v0.5/segformer/segformer_mit-b2_512x512_160k_ade20k/segformer_mit-b2_512x512_160k_ade20k_20220620_114047-64e4feca.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/segformer/segformer.yml | https://download.openmmlab.com/mmsegmentation/v0.5/segformer/segformer_mit-b1_8x1_1024x1024_160k_cityscapes/segformer_mit-b1_8x1_1024x1024_160k_cityscapes_20211208_064213-655c7b3f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/segformer/segformer.yml | https://download.openmmlab.com/mmsegmentation/v0.5/segformer/segformer_mit-b1_512x512_160k_ade20k/segformer_mit-b1_512x512_160k_ade20k_20220620_112037-c3f39e00.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/segformer/segformer.yml | https://download.openmmlab.com/mmsegmentation/v0.5/segformer/segformer_mit-b0_8x1_1024x1024_160k_cityscapes/segformer_mit-b0_8x1_1024x1024_160k_cityscapes_20211208_101857-e7f88502.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/segformer/segformer.yml | https://download.openmmlab.com/mmsegmentation/v0.5/segformer/segformer_mit-b0_512x512_160k_ade20k/segformer_mit-b0_512x512_160k_ade20k_20220617_162207-c00b9603.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/segformer/segformer.yml | https://arxiv.org/abs/2105.15203 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/segformer/segformer_mit-b0_512x512_160k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segformer/mit_b0_20220624-7e0fe6dd.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/segformer/segformer_mit-b0_8x1_1024x1024_160k_cityscapes.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segformer/mit_b0_20220624-7e0fe6dd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/segformer/segformer_mit-b1_512x512_160k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segformer/mit_b1_20220624-02e5a6a1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/segformer/segformer_mit-b1_8x1_1024x1024_160k_cityscapes.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segformer/mit_b1_20220624-02e5a6a1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/segformer/segformer_mit-b2_512x512_160k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segformer/mit_b2_20220624-66e8bf70.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/segformer/segformer_mit-b2_8x1_1024x1024_160k_cityscapes.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segformer/mit_b2_20220624-66e8bf70.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/segformer/segformer_mit-b3_512x512_160k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segformer/mit_b3_20220624-13b1141c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/segformer/segformer_mit-b3_8x1_1024x1024_160k_cityscapes.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segformer/mit_b3_20220624-13b1141c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/segformer/segformer_mit-b4_512x512_160k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segformer/mit_b4_20220624-d588d980.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/segformer/segformer_mit-b4_8x1_1024x1024_160k_cityscapes.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segformer/mit_b4_20220624-d588d980.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/segformer/segformer_mit-b5_512x512_160k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segformer/mit_b5_20220624-658746d9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/segformer/segformer_mit-b5_640x640_160k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segformer/mit_b5_20220624-658746d9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/segformer/segformer_mit-b5_8x1_1024x1024_160k_cityscapes.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segformer/mit_b5_20220624-658746d9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/segmenter/segmenter.yml | https://download.openmmlab.com/mmsegmentation/v0.5/segmenter/segmenter_vit-t_mask_8x1_512x512_160k_ade20k/segmenter_vit-t_mask_8x1_512x512_160k_ade20k_20220105_151706-ffcf7509.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/segmenter/segmenter.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/segmenter/segmenter_vit-s_mask_8x1_512x512_160k_ade20k/segmenter_vit-s_mask_8x1_512x512_160k_ade20k_20220105_151706-511bb103.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/segmenter/segmenter.yml | https://download.openmmlab.com/mmsegmentation/v0.5/segmenter/segmenter_vit-s_linear_8x1_512x512_160k_ade20k/segmenter_vit-s_linear_8x1_512x512_160k_ade20k_20220105_151713-39658c46.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/segmenter/segmenter.yml | https://download.openmmlab.com/mmsegmentation/v0.5/segmenter/segmenter_vit-l_mask_8x1_640x640_160k_ade20k/segmenter_vit-l_mask_8x1_640x640_160k_ade20k_20220614_024513-4783a347.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/segmenter/segmenter.yml | https://download.openmmlab.com/mmsegmentation/v0.5/segmenter/segmenter_vit-b_mask_8x1_512x512_160k_ade20k/segmenter_vit-b_mask_8x1_512x512_160k_ade20k_20220105_151706-bc533b08.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/segmenter/segmenter.yml | https://arxiv.org/abs/2105.05633 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/segmenter/segmenter_vit-l_mask_8x1_640x640_160k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segmenter/vit_large_p16_384_20220308-d4efb41d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/segmenter/segmenter_vit-s_mask_8x1_512x512_160k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segmenter/vit_small_p16_384_20220308-410f6037.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/segmenter/segmenter_vit-t_mask_8x1_512x512_160k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segmenter/vit_tiny_p16_384_20220308-cce8c795.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/sem_fpn/sem_fpn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/sem_fpn/fpn_r50_512x512_160k_ade20k/fpn_r50_512x512_160k_ade20k_20200718_131734-5b5a6ab9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/sem_fpn/sem_fpn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/sem_fpn/fpn_r50_512x1024_80k_cityscapes/fpn_r50_512x1024_80k_cityscapes_20200717_021437-94018a0d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/sem_fpn/sem_fpn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/sem_fpn/fpn_r101_512x512_160k_ade20k/fpn_r101_512x512_160k_ade20k_20200718_131734-306b5004.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/sem_fpn/sem_fpn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/sem_fpn/fpn_r101_512x1024_80k_cityscapes/fpn_r101_512x1024_80k_cityscapes_20200717_012416-c5800d4c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/sem_fpn/sem_fpn.yml | https://arxiv.org/abs/1901.02446 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/setr/setr.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/setr/setr_pup_vit-large_8x1_768x768_80k_cityscapes/setr_pup_vit-large_8x1_768x768_80k_cityscapes_20211122_155115-f6f37b8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/setr/setr.yml | https://download.openmmlab.com/mmsegmentation/v0.5/setr/setr_pup_512x512_160k_b16_ade20k/setr_pup_512x512_160k_b16_ade20k_20210619_191343-7e0ce826.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/setr/setr.yml | https://download.openmmlab.com/mmsegmentation/v0.5/setr/setr_naive_vit-large_8x1_768x768_80k_cityscapes/setr_naive_vit-large_8x1_768x768_80k_cityscapes_20211123_000505-20728e80.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/setr/setr.yml | https://download.openmmlab.com/mmsegmentation/v0.5/setr/setr_naive_512x512_160k_b16_ade20k/setr_naive_512x512_160k_b16_ade20k_20210619_191258-061f24f5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/setr/setr.yml | https://download.openmmlab.com/mmsegmentation/v0.5/setr/setr_mla_vit-large_8x1_768x768_80k_cityscapes/setr_mla_vit-large_8x1_768x768_80k_cityscapes_20211119_101003-7f8dccbe.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/setr/setr.yml | https://download.openmmlab.com/mmsegmentation/v0.5/setr/setr_mla_512x512_160k_b8_ade20k/setr_mla_512x512_160k_b8_ade20k_20210619_191118-c6d21df0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/setr/setr.yml | https://download.openmmlab.com/mmsegmentation/v0.5/setr/setr_mla_512x512_160k_b16_ade20k/setr_mla_512x512_160k_b16_ade20k_20210619_191057-f9741de7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/setr/setr.yml | https://arxiv.org/abs/2012.15840 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/stdc/stdc.yml | https://download.openmmlab.com/mmsegmentation/v0.5/stdc/stdc2_in1k-pre_512x1024_80k_cityscapes/stdc2_in1k-pre_512x1024_80k_cityscapes_20220224_073048-1f8f0f6c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/stdc/stdc.yml | https://download.openmmlab.com/mmsegmentation/v0.5/stdc/stdc2_512x1024_80k_cityscapes/stdc2_512x1024_80k_cityscapes_20220222_132015-fb1e3a1a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/stdc/stdc.yml | https://download.openmmlab.com/mmsegmentation/v0.5/stdc/stdc1_in1k-pre_512x1024_80k_cityscapes/stdc1_in1k-pre_512x1024_80k_cityscapes_20220224_141648-3d4c2981.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/stdc/stdc.yml | https://download.openmmlab.com/mmsegmentation/v0.5/stdc/stdc1_512x1024_80k_cityscapes/stdc1_512x1024_80k_cityscapes_20220224_073048-74e6920a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/stdc/stdc.yml | https://arxiv.org/abs/2104.13188 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/stdc/stdc1_in1k-pre_512x1024_80k_cityscapes.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/stdc/stdc1_20220308-5368626c.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/stdc/stdc2_in1k-pre_512x1024_80k_cityscapes.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/stdc/stdc2_20220308-7dbd9127.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/swin/swin.yml | https://download.openmmlab.com/mmsegmentation/v0.5/swin/upernet_swin_tiny_patch4_window7_512x512_160k_ade20k_pretrain_224x224_1K/upernet_swin_tiny_patch4_window7_512x512_160k_ade20k_pretrain_224x224_1K_20210531_112542-e380ad3e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/swin/swin.yml | https://download.openmmlab.com/mmsegmentation/v0.5/swin/upernet_swin_small_patch4_window7_512x512_160k_ade20k_pretrain_224x224_1K/upernet_swin_small_patch4_window7_512x512_160k_ade20k_pretrain_224x224_1K_20210526_192015-ee2fff1c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/swin/swin.yml | https://download.openmmlab.com/mmsegmentation/v0.5/swin/upernet_swin_large_patch4_window7_512x512_pretrain_224x224_22K_160k_ade20k/upernet_swin_large_patch4_window7_512x512_pretrain_224x224_22K_160k_ade20k_20220318_015320-48d180dd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/swin/swin.yml | https://download.openmmlab.com/mmsegmentation/v0.5/swin/upernet_swin_large_patch4_window12_512x512_pretrain_384x384_22K_160k_ade20k/upernet_swin_large_patch4_window12_512x512_pretrain_384x384_22K_160k_ade20k_20220318_091743-9ba68901.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/swin/swin.yml | https://download.openmmlab.com/mmsegmentation/v0.5/swin/upernet_swin_base_patch4_window7_512x512_160k_ade20k_pretrain_224x224_22K/upernet_swin_base_patch4_window7_512x512_160k_ade20k_pretrain_224x224_22K_20210526_211650-762e2178.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/swin/swin.yml | https://download.openmmlab.com/mmsegmentation/v0.5/swin/upernet_swin_base_patch4_window7_512x512_160k_ade20k_pretrain_224x224_1K/upernet_swin_base_patch4_window7_512x512_160k_ade20k_pretrain_224x224_1K_20210526_192340-593b0e13.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/swin/swin.yml | https://download.openmmlab.com/mmsegmentation/v0.5/swin/upernet_swin_base_patch4_window12_512x512_160k_ade20k_pretrain_384x384_22K/upernet_swin_base_patch4_window12_512x512_160k_ade20k_pretrain_384x384_22K_20210531_125459-429057bf.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/swin/swin.yml | https://download.openmmlab.com/mmsegmentation/v0.5/swin/upernet_swin_base_patch4_window12_512x512_160k_ade20k_pretrain_384x384_1K/upernet_swin_base_patch4_window12_512x512_160k_ade20k_pretrain_384x384_1K_20210531_132020-05b22ea4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/swin/upernet_swin_base_patch4_window12_512x512_160k_ade20k_pretrain_384x384_1K.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/swin/swin_base_patch4_window12_384_20220317-55b0104a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/swin/upernet_swin_base_patch4_window12_512x512_160k_ade20k_pretrain_384x384_22K.py | 
https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/swin/swin_base_patch4_window12_384_22k_20220317-e5c09f74.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/swin/upernet_swin_base_patch4_window7_512x512_160k_ade20k_pretrain_224x224_1K.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/swin/swin_base_patch4_window7_224_20220317-e9b98025.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/swin/upernet_swin_base_patch4_window7_512x512_160k_ade20k_pretrain_224x224_22K.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/swin/swin_base_patch4_window7_224_22k_20220317-4f79f7c0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/swin/upernet_swin_large_patch4_window12_512x512_pretrain_384x384_22K_160k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/swin/swin_large_patch4_window12_384_22k_20220412-6580f57d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/swin/upernet_swin_large_patch4_window7_512x512_pretrain_224x224_22K_160k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/swin/swin_large_patch4_window7_224_22k_20220412-aeecf2aa.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/swin/upernet_swin_small_patch4_window7_512x512_160k_ade20k_pretrain_224x224_1K.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/swin/swin_small_patch4_window7_224_20220317-7ba6d6dd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/swin/upernet_swin_tiny_patch4_window7_512x512_160k_ade20k_pretrain_224x224_1K.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/swin/swin_tiny_patch4_window7_224_20220317-1cdeb081.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/twins/twins.yml | https://download.openmmlab.com/mmsegmentation/v0.5/twins/twins_svt-s_uperhead_8x2_512x512_160k_ade20k/twins_svt-s_uperhead_8x2_512x512_160k_ade20k_20211130_141005-e48a2d94.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/twins/twins.yml | https://download.openmmlab.com/mmsegmentation/v0.5/twins/twins_svt-s_fpn_fpnhead_8x4_512x512_80k_ade20k/twins_svt-s_fpn_fpnhead_8x4_512x512_80k_ade20k_20211130_141006-0a0d3317.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/twins/twins.yml | https://download.openmmlab.com/mmsegmentation/v0.5/twins/twins_svt-l_uperhead_8x2_512x512_160k_ade20k/twins_svt-l_uperhead_8x2_512x512_160k_ade20k_20211130_141005-3e2cae61.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/twins/twins.yml | https://download.openmmlab.com/mmsegmentation/v0.5/twins/twins_svt-l_fpn_fpnhead_8x4_512x512_80k_ade20k/twins_svt-l_fpn_fpnhead_8x4_512x512_80k_ade20k_20211130_141005-1d59bee2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/twins/twins.yml | https://download.openmmlab.com/mmsegmentation/v0.5/twins/twins_svt-b_uperhead_8x2_512x512_160k_ade20k/twins_svt-b_uperhead_8x2_512x512_160k_ade20k_20211202_040826-0943a1f1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/twins/twins.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/twins/twins_svt-b_fpn_fpnhead_8x4_512x512_80k_ade20k/twins_svt-b_fpn_fpnhead_8x4_512x512_80k_ade20k_20211201_113849-88b2907c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/twins/twins.yml | https://download.openmmlab.com/mmsegmentation/v0.5/twins/twins_pcpvt-s_uperhead_8x4_512x512_160k_ade20k/twins_pcpvt-s_uperhead_8x4_512x512_160k_ade20k_20211201_233537-8e99c07a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/twins/twins.yml | https://download.openmmlab.com/mmsegmentation/v0.5/twins/twins_pcpvt-s_fpn_fpnhead_8x4_512x512_80k_ade20k/twins_pcpvt-s_fpn_fpnhead_8x4_512x512_80k_ade20k_20211201_204132-41acd132.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/twins/twins.yml | https://download.openmmlab.com/mmsegmentation/v0.5/twins/twins_pcpvt-l_uperhead_8x2_512x512_160k_ade20k/twins_pcpvt-l_uperhead_8x2_512x512_160k_ade20k_20211201_075053-c6095c07.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/twins/twins.yml | https://download.openmmlab.com/mmsegmentation/v0.5/twins/twins_pcpvt-l_fpn_fpnhead_8x4_512x512_80k_ade20k/twins_pcpvt-l_fpn_fpnhead_8x4_512x512_80k_ade20k_20211201_105226-bc6d61dc.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/twins/twins.yml | https://download.openmmlab.com/mmsegmentation/v0.5/twins/twins_pcpvt-b_uperhead_8x2_512x512_160k_ade20k/twins_pcpvt-b_uperhead_8x2_512x512_160k_ade20k_20211130_141020-02094ea5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/twins/twins.yml | https://download.openmmlab.com/mmsegmentation/v0.5/twins/twins_pcpvt-b_fpn_fpnhead_8x4_512x512_80k_ade20k/twins_pcpvt-b_fpn_fpnhead_8x4_512x512_80k_ade20k_20211130_141019-d396db72.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/twins/twins_pcpvt-b_fpn_fpnhead_8x4_512x512_80k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/twins/pcpvt_base_20220308-0621964c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/twins/twins_pcpvt-b_uperhead_8x2_512x512_160k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/twins/pcpvt_base_20220308-0621964c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/twins/twins_pcpvt-l_fpn_fpnhead_8x4_512x512_80k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/twins/pcpvt_large_20220308-37579dc6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/twins/twins_pcpvt-l_uperhead_8x2_512x512_160k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/twins/pcpvt_large_20220308-37579dc6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/twins/twins_svt-b_fpn_fpnhead_8x4_512x512_80k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/twins/alt_gvt_base_20220308-1b7eb711.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/twins/twins_svt-b_uperhead_8x2_512x512_160k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/twins/alt_gvt_base_20220308-1b7eb711.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/twins/twins_svt-l_fpn_fpnhead_8x4_512x512_80k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/twins/alt_gvt_large_20220308-fb5936f3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/twins/twins_svt-l_uperhead_8x2_512x512_160k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/twins/alt_gvt_large_20220308-fb5936f3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/twins/twins_svt-s_fpn_fpnhead_8x4_512x512_80k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/twins/alt_gvt_small_20220308-7e1c3695.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/twins/twins_svt-s_uperhead_8x2_512x512_160k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/twins/alt_gvt_small_20220308-7e1c3695.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/pspnet_unet_s5-d16_ce-1.0-dice-3.0_64x64_40k_drive/pspnet_unet_s5-d16_ce-1.0-dice-3.0_64x64_40k_drive_20211210_201821-22b3e3ba.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/pspnet_unet_s5-d16_ce-1.0-dice-3.0_256x256_40k_hrf/pspnet_unet_s5-d16_ce-1.0-dice-3.0_256x256_40k_hrf_20211210_201823-53d492fa.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/pspnet_unet_s5-d16_ce-1.0-dice-3.0_128x128_40k_stare/pspnet_unet_s5-d16_ce-1.0-dice-3.0_128x128_40k_stare_20211210_201823-f1063ef7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/pspnet_unet_s5-d16_ce-1.0-dice-3.0_128x128_40k_chase-db1/pspnet_unet_s5-d16_ce-1.0-dice-3.0_128x128_40k_chase-db1_20211210_201823-c0802c4d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/pspnet_unet_s5-d16_64x64_40k_drive/pspnet_unet_s5-d16_64x64_40k_drive_20201227_181818-aac73387.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/pspnet_unet_s5-d16_256x256_40k_hrf/pspnet_unet_s5-d16_256x256_40k_hrf_20201227_181818-fdb7e29b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/pspnet_unet_s5-d16_128x128_40k_stare/pspnet_unet_s5-d16_128x128_40k_stare_20201227_181818-3c2923c4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/pspnet_unet_s5-d16_128x128_40k_chase_db1/pspnet_unet_s5-d16_128x128_40k_chase_db1_20201227_181818-68d4e609.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/unet/unet.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/unet/fcn_unet_s5-d16_ce-1.0-dice-3.0_64x64_40k_drive/fcn_unet_s5-d16_ce-1.0-dice-3.0_64x64_40k_drive_20211210_201820-785de5c2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/fcn_unet_s5-d16_ce-1.0-dice-3.0_256x256_40k_hrf/fcn_unet_s5-d16_ce-1.0-dice-3.0_256x256_40k_hrf_20211210_201821-c314da8a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/fcn_unet_s5-d16_ce-1.0-dice-3.0_128x128_40k_stare/fcn_unet_s5-d16_ce-1.0-dice-3.0_128x128_40k_stare_20211210_201821-f75705a9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/fcn_unet_s5-d16_ce-1.0-dice-3.0_128x128_40k_chase-db1/fcn_unet_s5-d16_ce-1.0-dice-3.0_128x128_40k_chase-db1_20211210_201821-1c4eb7cf.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/fcn_unet_s5-d16_64x64_40k_drive/fcn_unet_s5-d16_64x64_40k_drive_20201223_191051-5daf6d3b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/fcn_unet_s5-d16_4x4_512x1024_160k_cityscapes/fcn_unet_s5-d16_4x4_512x1024_160k_cityscapes_20211210_145204-6860854e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/fcn_unet_s5-d16_256x256_40k_hrf/fcn_unet_s5-d16_256x256_40k_hrf_20201223_173724-d89cf1ed.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/fcn_unet_s5-d16_128x128_40k_stare/fcn_unet_s5-d16_128x128_40k_stare_20201223_191051-7d77e78b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/fcn_unet_s5-d16_128x128_40k_chase_db1/fcn_unet_s5-d16_128x128_40k_chase_db1_20201223_191051-11543527.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/deeplabv3_unet_s5-d16_ce-1.0-dice-3.0_64x64_40k_drive/deeplabv3_unet_s5-d16_ce-1.0-dice-3.0_64x64_40k_drive_20211210_201825-6bf0efd7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/deeplabv3_unet_s5-d16_ce-1.0-dice-3.0_256x256_40k_hrf/deeplabv3_unet_s5-d16_ce-1.0-dice-3.0_256x256_40k_hrf_20211210_202032-59daf7a4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/deeplabv3_unet_s5-d16_ce-1.0-dice-3.0_128x128_40k_stare/deeplabv3_unet_s5-d16_ce-1.0-dice-3.0_128x128_40k_stare_20211210_201825-21db614c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/unet/unet.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/unet/deeplabv3_unet_s5-d16_ce-1.0-dice-3.0_128x128_40k_chase-db1/deeplabv3_unet_s5-d16_ce-1.0-dice-3.0_128x128_40k_chase-db1_20211210_201825-4ef29df5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/deeplabv3_unet_s5-d16_64x64_40k_drive/deeplabv3_unet_s5-d16_64x64_40k_drive_20201226_094047-0671ff20.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/deeplabv3_unet_s5-d16_256x256_40k_hrf/deeplabv3_unet_s5-d16_256x256_40k_hrf_20201226_094047-3a1fdf85.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/deeplabv3_unet_s5-d16_128x128_40k_stare/deeplabv3_unet_s5-d16_128x128_40k_stare_20201226_094047-93dcb93c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/deeplabv3_unet_s5-d16_128x128_40k_chase_db1/deeplabv3_unet_s5-d16_128x128_40k_chase_db1_20201226_094047-4c5aefa3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/unet/unet.yml | https://arxiv.org/abs/1505.04597 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/unet/unet.yml | http://lmb.informatik.uni-freiburg.de/people/ronneber/u-net | 源码链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r50_769x769_80k_cityscapes/upernet_r50_769x769_80k_cityscapes_20200607_005107-82ae7d15.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r50_769x769_40k_cityscapes/upernet_r50_769x769_40k_cityscapes_20200530_033048-92d21539.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r50_512x512_80k_ade20k/upernet_r50_512x512_80k_ade20k_20200614_144127-ecc8377b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r50_512x512_40k_voc12aug/upernet_r50_512x512_40k_voc12aug_20200613_162257-ca9bcc6b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r50_512x512_20k_voc12aug/upernet_r50_512x512_20k_voc12aug_20200617_165330-5b5890a7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r50_512x512_160k_ade20k/upernet_r50_512x512_160k_ade20k_20200615_184328-8534de8d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/upernet/upernet.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r50_512x1024_80k_cityscapes/upernet_r50_512x1024_80k_cityscapes_20200607_052207-848beca8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r50_512x1024_40k_cityscapes/upernet_r50_512x1024_40k_cityscapes_20200605_094827-aa54cb54.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r18_512x512_80k_ade20k/upernet_r18_512x512_80k_ade20k_20220614_110319-22e81719.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r18_512x512_40k_voc12aug/upernet_r18_512x512_40k_voc12aug_20220614_153605-fafeb868.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r18_512x512_20k_voc12aug/upernet_r18_512x512_20k_voc12aug_20220614_123910-ed66e455.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r18_512x512_160k_ade20k/upernet_r18_512x512_160k_ade20k_20220615_113300-791c3f3e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r18_512x1024_80k_cityscapes/upernet_r18_512x1024_80k_cityscapes_20220614_110712-c89a9188.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r18_512x1024_40k_cityscapes/upernet_r18_512x1024_40k_cityscapes_20220615_113231-12ee861d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r101_769x769_80k_cityscapes/upernet_r101_769x769_80k_cityscapes_20200607_001014-082fc334.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r101_769x769_40k_cityscapes/upernet_r101_769x769_40k_cityscapes_20200530_040819-83c95d01.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r101_512x512_80k_ade20k/upernet_r101_512x512_80k_ade20k_20200614_185117-32e4db94.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r101_512x512_40k_voc12aug/upernet_r101_512x512_40k_voc12aug_20200613_163549-e26476ac.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r101_512x512_20k_voc12aug/upernet_r101_512x512_20k_voc12aug_20200617_165629-f14e7f27.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r101_512x512_160k_ade20k/upernet_r101_512x512_160k_ade20k_20200615_161951-91b32684.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r101_512x1024_80k_cityscapes/upernet_r101_512x1024_80k_cityscapes_20200607_002403-f05f2345.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r101_512x1024_40k_cityscapes/upernet_r101_512x1024_40k_cityscapes_20200605_094933-ebce3b10.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/upernet/upernet.yml | https://arxiv.org/pdf/1807.10221.pdf | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/vit/vit.yml | https://download.openmmlab.com/mmsegmentation/v0.5/vit/upernet_vit-b16_mln_512x512_80k_ade20k/upernet_vit-b16_mln_512x512_80k_ade20k_20210624_130547-0403cee1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/vit/vit.yml | https://download.openmmlab.com/mmsegmentation/v0.5/vit/upernet_vit-b16_mln_512x512_160k_ade20k/upernet_vit-b16_mln_512x512_160k_ade20k_20210624_130547-852fa768.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/vit/vit.yml | https://download.openmmlab.com/mmsegmentation/v0.5/vit/upernet_vit-b16_ln_mln_512x512_160k_ade20k/upernet_vit-b16_ln_mln_512x512_160k_ade20k_20210621_172828-f444c077.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/vit/vit.yml | https://download.openmmlab.com/mmsegmentation/v0.5/vit/upernet_deit-s16_mln_512x512_160k_ade20k/upernet_deit-s16_mln_512x512_160k_ade20k_20210621_161021-fb9a5dfb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/vit/vit.yml | https://download.openmmlab.com/mmsegmentation/v0.5/vit/upernet_deit-s16_ln_mln_512x512_160k_ade20k/upernet_deit-s16_ln_mln_512x512_160k_ade20k_20210621_161021-c0cd652f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/vit/vit.yml | https://download.openmmlab.com/mmsegmentation/v0.5/vit/upernet_deit-s16_512x512_80k_ade20k/upernet_deit-s16_512x512_80k_ade20k_20210624_095228-afc93ec2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/vit/vit.yml | https://download.openmmlab.com/mmsegmentation/v0.5/vit/upernet_deit-s16_512x512_160k_ade20k/upernet_deit-s16_512x512_160k_ade20k_20210621_160903-5110d916.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/vit/vit.yml | https://download.openmmlab.com/mmsegmentation/v0.5/vit/upernet_deit-b16_mln_512x512_160k_ade20k/upernet_deit-b16_mln_512x512_160k_ade20k_20210621_191949-4e1450f3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/vit/vit.yml | https://download.openmmlab.com/mmsegmentation/v0.5/vit/upernet_deit-b16_ln_mln_512x512_160k_ade20k/upernet_deit-b16_ln_mln_512x512_160k_ade20k_20210623_153535-8a959c14.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/vit/vit.yml | https://download.openmmlab.com/mmsegmentation/v0.5/vit/upernet_deit-b16_512x512_80k_ade20k/upernet_deit-b16_512x512_80k_ade20k_20210624_130529-1e090789.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/configs/vit/vit.yml | https://download.openmmlab.com/mmsegmentation/v0.5/vit/upernet_deit-b16_512x512_160k_ade20k/upernet_deit-b16_512x512_160k_ade20k_20210621_180100-828705d7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/docker/Dockerfile | https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1804/x86_64/7fa2af80.pub | 公钥链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/docker/Dockerfile | https://download.openmmlab.com/mmcv/dist/cu${CUDA//./}/torch${PYTORCH}/index.html | 依赖安装地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/docker/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64/3bf863cc.pub | 公钥链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/docker/serve/Dockerfile | https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1804/x86_64/7fa2af80.pub | 公钥链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/docker/serve/Dockerfile | https://download.openmmlab.com/mmcv/dist/cu${CUDA//./}/torch${PYTORCH}/index.html | 依赖安装地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/docker/serve/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64/3bf863cc.pub | 公钥链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/BiseNetV1_for_PyTorch/setup.py | openmmlab@gmail.com | 作者邮箱 | \ No newline at end of file diff --git a/PyTorch/built-in/cv/semantic_segmentation/DeepLabv3+_ID1695_for_PyTorch/public_address_statement.md b/PyTorch/built-in/cv/semantic_segmentation/DeepLabv3+_ID1695_for_PyTorch/public_address_statement.md index 538746f054e45b3e5d65f7a562d3812a2c2a546f..f610c6ed356e476dfbd9bdeb3e646b9f5eee5a6c 100644 --- a/PyTorch/built-in/cv/semantic_segmentation/DeepLabv3+_ID1695_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/cv/semantic_segmentation/DeepLabv3+_ID1695_for_PyTorch/public_address_statement.md @@ -1,28 +1,5 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|---------------------------------------------------------------------------------------------------------------------------|-----------------------------------------------------|------------------------------------------------------------|--------| -| 开源代码引入 | https://github.com/jfzhang95/pytorch-deeplab-xception/blob/9135e104a7a51ea9effa9c6676a2fcffe6a6a2e6/doc/deeplab_resnet.py | DeepLabv3+_ID1695_for_PyTorch/doc/deeplab_resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/jfzhang95/pytorch-deeplab-xception/blob/9135e104a7a51ea9effa9c6676a2fcffe6a6a2e6/doc/deeplab_xception.py | DeepLabv3+_ID1695_for_PyTorch/doc/deeplab_xception.py | http://data.lip6.fr/cadene/pretrainedmodels/xception-b5690688.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/jfzhang95/pytorch-deeplab-xception/blob/9135e104a7a51ea9effa9c6676a2fcffe6a6a2e6/modeling/backbone/drn.py | DeepLabv3+_ID1695_for_PyTorch/modeling/backbone/drn.py | http://dl.yf.io/drn/ | 下载权重文件 | -| 开源代码引入 | 
https://github.com/jfzhang95/pytorch-deeplab-xception/blob/9135e104a7a51ea9effa9c6676a2fcffe6a6a2e6/modeling/backbone/drn.py | DeepLabv3+_ID1695_for_PyTorch/modeling/backbone/drn.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/jfzhang95/pytorch-deeplab-xception/blob/9135e104a7a51ea9effa9c6676a2fcffe6a6a2e6/modeling/backbone/mobilenet.py | DeepLabv3+_ID1695_for_PyTorch/modeling/backbone/mobilenet.py | http://jeff95.me/models/mobilenet_v2-6a65762b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/jfzhang95/pytorch-deeplab-xception/blob/9135e104a7a51ea9effa9c6676a2fcffe6a6a2e6/modeling/backbone/resnet.py | DeepLabv3+_ID1695_for_PyTorch/modeling/backbone/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/jfzhang95/pytorch-deeplab-xception/blob/9135e104a7a51ea9effa9c6676a2fcffe6a6a2e6/modeling/backbone/xception.py | DeepLabv3+_ID1695_for_PyTorch/modeling/backbone/xception.py | http://data.lip6.fr/cadene/pretrainedmodels/xception-b5690688.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/jfzhang95/pytorch-deeplab-xception/blob/master/doc/deeplab_resnet.py|DeepLabv3+_ID1695_for_PyTorch/doc/deeplab_resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/jfzhang95/pytorch-deeplab-xception/blob/master/doc/deeplab_xception.py|DeepLabv3+_ID1695_for_PyTorch/doc/deeplab_xception.py | http://data.lip6.fr/cadene/pretrainedmodels/xception-b5690688.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/jfzhang95/pytorch-deeplab-xception/blob/master/utils/lr_scheduler.py|DeepLabv3+_ID1695_for_PyTorch/utils/lr_scheduler.py | zhang.hang@rutgers.edu | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/jfzhang95/pytorch-deeplab-xception/blob/master/modeling/backbone/drn.py|DeepLabv3+_ID1695_for_PyTorch/modeling/backbone/drn.py | http://dl.yf.io/drn | 模型相关说明 | -| 开源代码引入 | https://github.com/jfzhang95/pytorch-deeplab-xception/blob/master/modeling/backbone/drn.py|DeepLabv3+_ID1695_for_PyTorch/modeling/backbone/drn.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/jfzhang95/pytorch-deeplab-xception/blob/master/modeling/backbone/mobilenet.py|DeepLabv3+_ID1695_for_PyTorch/modeling/backbone/mobilenet.py | http://jeff95.me/models/mobilenet_v2-6a65762b.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/jfzhang95/pytorch-deeplab-xception/blob/master/modeling/sync_batchnorm/batchnorm.py|DeepLabv3+_ID1695_for_PyTorch/modeling/sync_batchnorm/batchnorm.py | maojiayuan@gmail.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/jfzhang95/pytorch-deeplab-xception/blob/master/modeling/sync_batchnorm/batchnorm.py|DeepLabv3+_ID1695_for_PyTorch/modeling/sync_batchnorm/batchnorm.py | https://github.com/vacancy/Synchronized-BatchNorm-PyTorch | 源码实现 | -| 开源代码引入 | https://github.com/jfzhang95/pytorch-deeplab-xception/blob/master/doc/deeplab_resnet.py|DeepLabv3+_ID1695_for_PyTorch/modeling/backbone/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/jfzhang95/pytorch-deeplab-xception/blob/master/modeling/sync_batchnorm/batchnorm.py|DeepLabv3+_ID1695_for_PyTorch/modeling/sync_batchnorm/batchnorm.py | http://tetexiao.co | 模型相关说明 | -| 开源代码引入 | https://github.com/jfzhang95/pytorch-deeplab-xception/blob/master/modeling/sync_batchnorm/batchnorm.py|DeepLabv3+_ID1695_for_PyTorch/modeling/sync_batchnorm/comm.py | maojiayuan@gmail.com | 开发者邮箱配置 | -| 开源代码引入 | 
https://github.com/jfzhang95/pytorch-deeplab-xception/blob/master/modeling/sync_batchnorm/batchnorm.py|DeepLabv3+_ID1695_for_PyTorch/modeling/sync_batchnorm/comm.py | https://github.com/vacancy/Synchronized-BatchNorm-PyTorch | 源码实现 | -| 开源代码引入 | https://github.com/jfzhang95/pytorch-deeplab-xception/blob/master/modeling/sync_batchnorm/batchnorm.py|DeepLabv3+_ID1695_for_PyTorch/modeling/sync_batchnorm/replicate.py | maojiayuan@gmail.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/jfzhang95/pytorch-deeplab-xception/blob/master/modeling/sync_batchnorm/batchnorm.py|DeepLabv3+_ID1695_for_PyTorch/modeling/sync_batchnorm/replicate.py | https://github.com/vacancy/Synchronized-BatchNorm-PyTorch | 源码实现 | -| 开源代码引入 | https://github.com/jfzhang95/pytorch-deeplab-xception/blob/master/doc/deeplab_xception.py|DeepLabv3+_ID1695_for_PyTorch/modeling/backbone/xception.py | http://data.lip6.fr/cadene/pretrainedmodels/xception-b5690688.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/jfzhang95/pytorch-deeplab-xception/blob/master/modeling/sync_batchnorm/batchnorm.py|DeepLabv3+_ID1695_for_PyTorch/modeling/sync_batchnorm/unittest.py | maojiayuan@gmail.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/jfzhang95/pytorch-deeplab-xception/blob/master/modeling/sync_batchnorm/batchnorm.py|DeepLabv3+_ID1695_for_PyTorch/modeling/sync_batchnorm/unittest.py | https://github.com/vacancy/Synchronized-BatchNorm-PyTorch | 源码实现 | -| 开源代码引入 | https://github.com/jfzhang95/pytorch-deeplab-xception/blob/master/modeling/sync_batchnorm/batchnorm.py|DeepLabv3+_ID1695_for_PyTorch/modeling/sync_batchnorm/__init__.py | maojiayuan@gmail.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/jfzhang95/pytorch-deeplab-xception/blob/master/modeling/sync_batchnorm/batchnorm.py|DeepLabv3+_ID1695_for_PyTorch/modeling/sync_batchnorm/__init__.py | https://github.com/vacancy/Synchronized-BatchNorm-PyTorch | 源码实现 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|----------------------------------------------------------------------------------------------------------------------|------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/DeepLabv3+_ID1695_for_PyTorch/modeling/backbone/drn.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/DeepLabv3+_ID1695_for_PyTorch/modeling/backbone/drn.py | http://dl.yf.io/drn/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/DeepLabv3+_ID1695_for_PyTorch/modeling/backbone/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/built-in/cv/semantic_segmentation/DynamicUNet_for_Pytorch/public_address_statement.md b/PyTorch/built-in/cv/semantic_segmentation/DynamicUNet_for_Pytorch/public_address_statement.md index 9777c6567dc164dbfb3bba47a1c60c65e4e45b84..84f273cf7f8ecb9c665af87287e96e823ae0a3d3 100644 --- a/PyTorch/built-in/cv/semantic_segmentation/DynamicUNet_for_Pytorch/public_address_statement.md +++ b/PyTorch/built-in/cv/semantic_segmentation/DynamicUNet_for_Pytorch/public_address_statement.md @@ -1,65 +1,46 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | 
-|--------|------------------------------------------------------------------------------------------------------------------|----------------------------------------------------------------------------------------------------|--------------------------------------------------------------------------------------------------|--------| -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/blob/master/core/data/downloader/ade20k.py | DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/data/downloader/ade20k.py | http://data.csail.mit.edu/places/ADEchallenge/ADEChallengeData2016.zip | 下载数据集 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/blob/master/core/data/downloader/ade20k.py | DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/data/downloader/ade20k.py | http://data.csail.mit.edu/places/ADEchallenge/release_test.zip | 下载数据集 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/blob/master/core/data/downloader/mscoco.py | DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/data/downloader/mscoco.py | http://images.cocodataset.org/zips/train2017.zip | 下载数据集 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/blob/master/core/data/downloader/mscoco.py | DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/data/downloader/mscoco.py | http://images.cocodataset.org/annotations/annotations_trainval2017.zip | 下载数据集 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/blob/master/core/data/downloader/mscoco.py | DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/data/downloader/mscoco.py | http://images.cocodataset.org/zips/val2017.zip | 下载数据集 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/blob/master/core/data/downloader/pascal_voc.py | DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/data/downloader/pascal_voc.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2007/VOCtrainval_06-Nov-2007.tar | 下载数据集 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/blob/master/core/data/downloader/pascal_voc.py | DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/data/downloader/pascal_voc.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2007/VOCtest_06-Nov-2007.tar | 下载数据集 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/blob/master/core/data/downloader/pascal_voc.py | DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/data/downloader/pascal_voc.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2012/VOCtrainval_11-May-2012.tar | 下载数据集 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/blob/master/core/data/downloader/pascal_voc.py | DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/data/downloader/pascal_voc.py | http://www.eecs.berkeley.edu/Research/Projects/CS/vision/grouping/semantic_contours/benchmark.tgz | 下载数据集 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/blob/master/core/data/downloader/sbu_shadow.py | DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/data/downloader/sbu_shadow.py | http://www3.cs.stonybrook.edu/~cvl/content/datasets/shadow_db/SBU-shadow.zip | 下载数据集 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/blob/master/core/models/base_models/densenet.py | 
DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/models/base_models/densenet.py | https://download.pytorch.org/models/densenet121-a639ec97.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/blob/master/core/models/base_models/densenet.py | DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/models/base_models/densenet.py | https://download.pytorch.org/models/densenet169-b2777c0a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/blob/master/core/models/base_models/densenet.py | DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/models/base_models/densenet.py | https://download.pytorch.org/models/densenet201-c1103571.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/blob/master/core/models/base_models/densenet.py | DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/models/base_models/densenet.py | https://download.pytorch.org/models/densenet161-8d451a50.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/blob/master/core/models/base_models/resnet.py | DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/models/base_models/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/blob/master/core/models/base_models/resnet.py | DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/models/base_models/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/blob/master/core/models/base_models/resnet.py | DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/models/base_models/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/blob/master/core/models/base_models/resnet.py | DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/models/base_models/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/blob/master/core/models/base_models/resnet.py | DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/models/base_models/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/blob/master/core/models/base_models/resnetv1b.py | DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/models/base_models/resnetv1b.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/blob/master/core/models/base_models/resnetv1b.py | DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/models/base_models/resnetv1b.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/blob/master/core/models/base_models/resnetv1b.py | DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/models/base_models/resnetv1b.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/blob/master/core/models/base_models/resnetv1b.py | 
DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/models/base_models/resnetv1b.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/blob/master/core/models/base_models/resnetv1b.py | DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/models/base_models/resnetv1b.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/blob/master/core/models/base_models/resnext.py | DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/models/base_models/resnext.py | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/blob/master/core/models/base_models/resnext.py | DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/models/base_models/resnext.py | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/blob/master/core/models/base_models/vgg.py | DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/models/base_models/vgg.py | https://download.pytorch.org/models/vgg11-bbd30ac9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/blob/master/core/models/base_models/vgg.py | DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/models/base_models/vgg.py | https://download.pytorch.org/models/vgg13-c768596a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/blob/master/core/models/base_models/vgg.py | DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/models/base_models/vgg.py | https://download.pytorch.org/models/vgg16-397923af.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/blob/master/core/models/base_models/vgg.py | DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/models/base_models/vgg.py | https://download.pytorch.org/models/vgg19-dcbb9e9d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/blob/master/core/models/base_models/vgg.py | DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/models/base_models/vgg.py | https://download.pytorch.org/models/vgg11_bn-6002323d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/blob/master/core/models/base_models/vgg.py | DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/models/base_models/vgg.py | https://download.pytorch.org/models/vgg13_bn-abd245e5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/blob/master/core/models/base_models/vgg.py | DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/models/base_models/vgg.py | https://download.pytorch.org/models/vgg16_bn-6c64b313.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/blob/master/core/models/base_models/vgg.py | DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/models/base_models/vgg.py | https://download.pytorch.org/models/vgg19_bn-c79401a0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/blob/master/core/utils/filesystem.py | DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/utils/filesystem.py | 
https://github.com/zhreshold/cocoapi.git#subdirectory=PythonAPI | 下载第三方包 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnet18-5c106cde.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnet50-19c8e357.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/wide_resnet50_2-95faca4d.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 下载权重文件 | -| 开发引入 | / | DynamicUNet_for_Pytorch/resnet.py | https://arxiv.org/pdf/1611.05431.pdf | 论文地址 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/blob/master/core/utils/filesystem.py | DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/utils/filesystem.py | http://github.com/user/repo/tarball/master/egginfo=xxx | 源码实现 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/blob/master/core/utils/lr_scheduler.py | DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/utils/lr_scheduler.py | https://github.com/facebookresearch/maskrcnn-benchmark/blob/master/maskrcnn_benchmark/solver/lr_scheduler.py | 源码实现 | -| 开发引入 | / | DynamicUNet_for_Pytorch/resnet.py | https://ngc.nvidia.com/catalog/model-scripts/nvidia:resnet_50_v1_5_for_pytorch | 相关说明 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/blob/master/core/utils/loss.py | DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/utils/loss.py | https://github.com/zhanghang1989/PyTorch-Encoding/blob/master/encoding/nn/loss.py | 源码实现 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/blob/master/core/utils/parallel.py | DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/utils/parallel.py | https://github.com/zhanghang1989/PyTorch-Encoding/blob/master/encoding/parallel.py | 源码实现 | -| 开发引入 | / | DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/models/base_models/densenet.py | https://arxiv.org/pdf/1608.06993.pdf | 论文地址 | -| 开发引入 | / | DynamicUNet_for_Pytorch/resnet.py | https://arxiv.org/abs/1512.03385 | 论文地址 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/blob/master/core/utils/logger.py | DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/utils/logger.py | https://github.com/facebookresearch/maskrcnn-benchmark/blob/master/maskrcnn_benchmark/utils/logger.py | 源码实现 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/blob/master/core/utils/distributed.py | DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/utils/distributed.py | https://github.com/facebookresearch/maskrcnn-benchmark/blob/master/maskrcnn_benchmark/utils/comm.py | 源码实现 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/blob/master/core/models/model_store.py | DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/models/model_store.py | https://hangzh.s3.amazonaws.com/ | 相关说明 | -| 开源代码引入 | 
https://github.com/Tramac/awesome-semantic-segmentation-pytorch/blob/master/core/data/downloader/mscoco.py | DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/data/downloader/mscoco.py | http://images.cocodataset.org/zips/test2017.zip | 数据集地址 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/blob/master/core/nn/jpu.py | DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/nn/jpu.py | https://github.com/wuhuikai/FastFCN/blob/master/encoding/nn/customize.py | 源码实现 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/blob/master/core/models/base_models/resnet.py | DynamicUNet_for_Pytorch/resnet.py | https://arxiv.org/abs/1706.02677 | 论文地址 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/blob/master/core/models/base_models/resnet.py | DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/models/base_models/resnet.py | https://arxiv.org/abs/1706.02677 | 论文地址 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/blob/master/core/data/downloader/mscoco.py | DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/data/downloader/mscoco.py | http://images.cocodataset.org/annotations/stuff_annotations_trainval2017.zip | 数据集地址 | -| 开发引入 | / | DynamicUNet_for_Pytorch/resnet.py | https://arxiv.org/pdf/1512.03385.pdf | 论文地址 | -| 开发引入 | / | DynamicUNet_for_Pytorch/resnet.py | https://arxiv.org/pdf/1605.07146.pdf | 论文地址 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/blob/master/core/utils/distributed.py | DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/utils/distributed.py | https://github.com/facebookresearch/maskrcnn-benchmark/blob/master/maskrcnn_benchmark/data/samplers/distributed.py | 源码实现 | +| 文件位置 | 公网地址 | 公网地址用途 | +|---------------------------------------------------------------------------------------------------------------------------------------------------------------|---------------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/data/downloader/ade20k.py | http://data.csail.mit.edu/places/ADEchallenge/ADEChallengeData2016.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/data/downloader/ade20k.py | http://data.csail.mit.edu/places/ADEchallenge/release_test.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/data/downloader/mscoco.py | http://images.cocodataset.org/zips/val2017.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/data/downloader/mscoco.py | http://images.cocodataset.org/zips/train2017.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/data/downloader/mscoco.py | http://images.cocodataset.org/annotations/annotations_trainval2017.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/data/downloader/pascal_voc.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2012/VOCtrainval_11-May-2012.tar | 数据集链接 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/data/downloader/pascal_voc.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2007/VOCtrainval_06-Nov-2007.tar | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/data/downloader/pascal_voc.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2007/VOCtest_06-Nov-2007.tar | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/data/downloader/pascal_voc.py | http://www.eecs.berkeley.edu/Research/Projects/CS/vision/grouping/semantic_contours/benchmark.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/data/downloader/sbu_shadow.py | http://www3.cs.stonybrook.edu/~cvl/content/datasets/shadow_db/SBU-shadow.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/models/base_models/densenet.py | https://download.pytorch.org/models/densenet201-c1103571.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/models/base_models/densenet.py | https://download.pytorch.org/models/densenet169-b2777c0a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/models/base_models/densenet.py | https://download.pytorch.org/models/densenet161-8d451a50.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/models/base_models/densenet.py | https://download.pytorch.org/models/densenet121-a639ec97.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/models/base_models/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/models/base_models/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/models/base_models/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/models/base_models/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/models/base_models/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/models/base_models/resnetv1b.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/models/base_models/resnetv1b.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/models/base_models/resnetv1b.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/models/base_models/resnetv1b.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/models/base_models/resnetv1b.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/models/base_models/resnext.py | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/models/base_models/resnext.py | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/models/base_models/vgg.py | https://download.pytorch.org/models/vgg19-dcbb9e9d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/models/base_models/vgg.py | https://download.pytorch.org/models/vgg19_bn-c79401a0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/models/base_models/vgg.py | https://download.pytorch.org/models/vgg16-397923af.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/models/base_models/vgg.py | https://download.pytorch.org/models/vgg16_bn-6c64b313.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/models/base_models/vgg.py | https://download.pytorch.org/models/vgg13-c768596a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/models/base_models/vgg.py | https://download.pytorch.org/models/vgg13_bn-abd245e5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/models/base_models/vgg.py | https://download.pytorch.org/models/vgg11-bbd30ac9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/models/base_models/vgg.py | https://download.pytorch.org/models/vgg11_bn-6002323d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/DynamicUNet_for_Pytorch/awesome-semantic-segmentation-pytorch/core/models/model_store.py | https://hangzh.s3.amazonaws.com/ | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/DynamicUNet_for_Pytorch/url.ini | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/DynamicUNet_for_Pytorch/url.ini | https://download.pytorch.org/models/wide_resnet50_2-95faca4d.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/DynamicUNet_for_Pytorch/url.ini | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/DynamicUNet_for_Pytorch/url.ini | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/DynamicUNet_for_Pytorch/url.ini | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/DynamicUNet_for_Pytorch/url.ini | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/DynamicUNet_for_Pytorch/url.ini | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/DynamicUNet_for_Pytorch/url.ini | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/cv/semantic_segmentation/DynamicUNet_for_Pytorch/url.ini | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/built-in/diffusion/diffusers/examples/SD3/public_address_statement.md b/PyTorch/built-in/diffusion/diffusers/examples/SD3/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..f7e99835c1fe59507cf517e19b5976da872c2b51 --- /dev/null +++ b/PyTorch/built-in/diffusion/diffusers/examples/SD3/public_address_statement.md @@ -0,0 +1,6 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|-------------------------------------------------------------------------------------------------|---------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers/examples/SD3/train_dreambooth_lora_sd3.py | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers/examples/SD3/train_dreambooth_lora_sd3.py | https://www.tensorflow.org/tensorboard | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers/examples/SD3/train_dreambooth_sd3.py | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers/examples/SD3/train_dreambooth_sd3.py | https://www.tensorflow.org/tensorboard | 设置说明 | \ No newline at end of file diff --git a/PyTorch/built-in/diffusion/diffusers0.17.0/public_address_statement.md b/PyTorch/built-in/diffusion/diffusers0.17.0/public_address_statement.md index 69f33d7a81f24bb0481ecad40d6822a8412db5d5..7cb4da8a04058d44feaf80870b28192039235d94 100644 --- a/PyTorch/built-in/diffusion/diffusers0.17.0/public_address_statement.md +++ b/PyTorch/built-in/diffusion/diffusers0.17.0/public_address_statement.md @@ -1,1235 +1,81 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ------- |------|---------------------------------------------------------------------|--------------------------------------------------|---------| -| 开源代码引入 | https://knn.laion.ai/knn-service|\examples\custom_diffusion\retrieve.py |https://knn.laion.ai/knn-service | 下载依赖 | -| 开源代码引入 | https://huggingface.co/docs/accelerate/package_reference/accelerator#accelerate.Accelerator.save_state| \examples\custom_diffusion\train_custom_diffusion.py|https://huggingface.co | 设置说明| -| 开源代码引入 | 
https://huggingface.co/docs/accelerate/package_reference/accelerator#accelerate.Accelerator.save_state| \examples\dreambooth\train_dreambooth.py |https://huggingface.co | 设置说明| -| 开源代码引入 | https://www.tensorflow.org/tensorboard | \examples\custom_diffusion\train_custom_diffusion.py |https://www.tensorflow.org|设置说明| -| 开源代码引入 | https://www.tensorflow.org/tensorboard | \examples\dreambooth\train_dreambooth_flax.py |https://www.tensorflow.org|设置说明| -| 开源代码引入 | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices|\examples\custom_diffusion\train_custom_diffusion.py|https://pytorch.org |设置说明| -| 开源代码引入 | https://pytorch.org/docs/stable/generated/torch.optim.Optimizer.zero_grad.html |\examples\custom_diffusion\train_custom_diffusion.py|https://pytorch.org |设置说明| -| 开源代码引入 | https://huggingface.co/docs/diffusers/main/en/optimization/xformers |\examples\dreambooth\train_dreambooth_lora.py|https://huggingface.co/ |设置说明| -| 开源代码引入 | https://huggingface.co/docs/diffusers/main/en/optimization/xformers |\examples\custom_diffusion\train_custom_diffusion.py;|https://huggingface.co/ |设置说明| -| 开源代码引入 | https://huggingface.co/docs/diffusers/main/en/training/dreambooth#performing-inference-using-a-saved-checkpoint| \examples\dreambooth\train_dreambooth.py |https://huggingface.co/|设置说明| -| 开源代码引入 | https://www.crosslabs.org//blog/diffusion-with-offset-noise | \examples\dreambooth\train_dreambooth.py |https://www.crosslabs.org|设置说明| -| 开源代码引入 | https://huggingface.co/docs/datasets/image_dataset#imagefolder | \examples\instruct_pix2pix\train_instruct_pix2pix.py |https://huggingface.co/|设置说明| -| 开源代码引入 | https://arxiv.org/abs/2211.09800 | \examples\instruct_pix2pix\train_instruct_pix2pix.py |https://arxiv.org |设置说明| -| 开源代码引入 | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | \examples\instruct_pix2pix\train_instruct_pix2pix.py |https://pytorch.org|设置说明| -| 开源代码引入 | https://www.tensorflow.org/tensorboard |\examples\instruct_pix2pix\train_instruct_pix2pix.py|https://www.tensorflow.org/|设置说明| -| 开源代码引入 | https://huggingface.co/docs/accelerate/package_reference/accelerator#accelerate.Accelerator.save_state|\examples\instruct_pix2pix\train_instruct_pix2pix.py|https://huggingface.co|设置说明| -| 开源代码引入 | https://huggingface.co/docs/diffusers/main/en/optimization/xformers |\examples\instruct_pix2pix\train_instruct_pix2pix.py| https://huggingface.co |设置说明| -| 开源代码引入 | https://www.tensorflow.org/tensorboard |\examples\research_projects\colossalai\train_dreambooth_colossalai.py | https://www.tensorflow.org/ |设置说明| -| 开源代码引入 | https://www.tensorflow.org/tensorboard|\examples\research_projects\dreambooth_inpaint\train_dreambooth_inpaint_lora.py|https://www.tensorflow.org/|设置说明| -| 开源代码引入 | https://huggingface.co/docs/accelerate/package_reference/accelerator#accelerate.Accelerator.save_state|\examples\research_projects\dreambooth_inpaint\train_dreambooth_inpaint_lora.py|https://huggingface.co/|设置说明| -| 开源代码引入 | https://www.tensorflow.org/tensorboard |\examples\research_projects\dreambooth_inpaint\train_dreambooth_inpaint.py |https://www.tensorflow.org |设置说明 | -| 开源代码引入 | https://huggingface.co/docs/accelerate/package_reference/accelerator#accelerate.Accelerator.save_state |\examples\research_projects\dreambooth_inpaint\train_dreambooth_inpaint.py |https://huggingface.co |设置说明 | -| 开源代码引入 | https://www.tensorflow.org/tensorboard| \examples\research_projects\intel_opts\textual_inversion\textual_inversion_bf16.py| https://www.tensorflow.org|设置说明 | -| 开源代码引入 | 
https://www.tensorflow.org/tensorboard |\examples\research_projects\intel_opts\textual_inversion_dfq\textual_inversion.py | https://www.tensorflow.org|设置说明 | -| 开源代码引入 | git+https://github.com/huggingface/peft.git |\examples\research_projects\lora\requirements.txt |git+https://github.com/huggingface/peft.git |下载依赖 | -| 开源代码引入 | https://huggingface.co/docs/datasets/image_dataset#imagefolder |\examples\research_projects\lora\train_text_to_image_lora.py | https://huggingface.co|设置说明 | -| 开源代码引入 | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | \examples\research_projects\lora\train_text_to_image_lora.py|https://pytorch.org |设置说明 | -| 开源代码引入 | https://www.tensorflow.org/tensorboard | \examples\research_projects\lora\train_text_to_image_lora.py| https://www.tensorflow.org|设置说明 | -| 开源代码引入 | https://huggingface.co/docs/accelerate/package_reference/accelerator#accelerate.Accelerator.save_state | \examples\research_projects\lora\train_text_to_image_lora.py| https://huggingface.co|设置说明 | -| 开源代码引入 | https://huggingface.co/docs/diffusers/main/en/optimization/xformers | \examples\research_projects\lora\train_text_to_image_lora.py| https://huggingface.co/|设置说明 | -| 开源代码引入 | https://www.tensorflow.org/tensorboard|\examples\research_projects\mulit_token_textual_inversion\textual_inversion_flax.py | https://www.tensorflow.org|设置说明 | -| 开源代码引入 | https://www.tensorflow.org/tensorboard|\examples\research_projects\mulit_token_textual_inversion\textual_inversion.py | https://www.tensorflow.org|设置说明 | -| 开源代码引入 | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices|\examples\research_projects\mulit_token_textual_inversion\textual_inversion.py | https://pytorch.org|设置说明 | -| 开源代码引入 | https://huggingface.co/docs/accelerate/package_reference/accelerator#accelerate.Accelerator.save_state| \examples\research_projects\mulit_token_textual_inversion\textual_inversion.py| https://huggingface.co|设置说明 | -| 开源代码引入 | https://huggingface.co/docs/diffusers/main/en/optimization/xformers| \examples\research_projects\mulit_token_textual_inversion\textual_inversion.py| https://huggingface.co/docs/diffusers/main/en/optimization/xformers|设置说明 | -| 开源代码引入 | https://huggingface.co/docs/accelerate/package_reference/accelerator#accelerate.Accelerator.save_state| \examples\research_projects\multi_subject_dreambooth\train_multi_subject_dreambooth.py|https://huggingface.co |设置说明 | -| 开源代码引入 | https://www.tensorflow.org/tensorboard| \examples\research_projects\multi_subject_dreambooth\train_multi_subject_dreambooth.py| https://www.tensorflow.org|设置说明 | -| 开源代码引入 | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices| \examples\research_projects\multi_subject_dreambooth\train_multi_subject_dreambooth.py| https://pytorch.org|设置说明 | -| 开源代码引入 | https://huggingface.co/docs/datasets/image_dataset#imagefolder| \examples\research_projects\onnxruntime\text_to_image\train_text_to_image.py| https://huggingface.co|设置说明 | -| 开源代码引入 | https://arxiv.org/abs/2303.09556| \examples\research_projects\onnxruntime\text_to_image\train_text_to_image.py| https://arxiv.org|设置说明 | -| 开源代码引入 | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices| \examples\research_projects\onnxruntime\text_to_image\train_text_to_image.py | https://pytorch.org|设置说明 | -| 开源代码引入 | https://www.tensorflow.org/tensorboard| \examples\research_projects\onnxruntime\text_to_image\train_text_to_image.py| https://www.tensorflow.org|设置说明 | -| 开源代码引入 | 
https://huggingface.co/docs/accelerate/package_reference/accelerator#accelerate.Accelerator.save_state| \examples\research_projects\onnxruntime\text_to_image\train_text_to_image.py|https://huggingface.co |设置说明 | -| 开源代码引入 | https://huggingface.co/docs/accelerate/v0.17.0/en/package_reference/accelerator#accelerate.Accelerator| \examples\research_projects\onnxruntime\text_to_image\train_text_to_image.py|https://huggingface.co |设置说明 | -| 开源代码引入 | https://huggingface.co/docs/diffusers/main/en/optimization/xformers| \examples\research_projects\onnxruntime\text_to_image\train_text_to_image.py|https://huggingface.co |设置说明 | -| 开源代码引入 | https://www.tensorflow.org/tensorboard|\examples\research_projects\onnxruntime\textual_inversion\textual_inversion.py | https://www.tensorflow.org|设置说明 | -| 开源代码引入 | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices|\examples\research_projects\onnxruntime\textual_inversion\textual_inversion.py |https://pytorch.org |设置说明 | -| 开源代码引入 | https://huggingface.co/docs/accelerate/package_reference/accelerator#accelerate.Accelerator.save_state|\examples\research_projects\onnxruntime\textual_inversion\textual_inversion.py | https://huggingface.co/ |设置说明 | -| 开源代码引入 | https://huggingface.co/docs/diffusers/main/en/optimization/xformers|\examples\research_projects\onnxruntime\textual_inversion\textual_inversion.py | https://huggingface.co|设置说明 | -| 开源代码引入 | https://huggingface.co/docs/datasets/image_dataset#imagefolder|\examples\research_projects\onnxruntime\unconditional_image_generation\train_unconditional.py | https://huggingface.co|设置说明 | -| 开源代码引入 | https://www.tensorflow.org/tensorboard|\examples\research_projects\onnxruntime\unconditional_image_generation\train_unconditional.py | https://www.tensorflow.org|设置说明 | -| 开源代码引入 | https://huggingface.co/docs/accelerate/package_reference/accelerator#accelerate.Accelerator.save_state| \examples\research_projects\onnxruntime\unconditional_image_generation\train_unconditional.py| https://huggingface.co|设置说明 | -| 开源代码引入 | https://huggingface.co/docs/diffusers/main/en/optimization/xformers| \examples\research_projects\onnxruntime\unconditional_image_generation\train_unconditional.py| https://huggingface.co|设置说明 | -| 开源代码引入 | https://huggingface.co/docs/datasets/image_dataset#imagefolder| \examples\text_to_image\train_text_to_image_flax.py| https://huggingface.co|设置说明 | -| 开源代码引入 | https://www.tensorflow.org/tensorboard| \examples\text_to_image\train_text_to_image_flax.py|https://www.tensorflow.org |设置说明 | -| 开源代码引入 | https://huggingface.co/docs/datasets/image_dataset#imagefolder|\examples\text_to_image\train_text_to_image_lora.py | https://huggingface.co|设置说明 | -| 开源代码引入 | https://arxiv.org/abs/2303.09556|\examples\text_to_image\train_text_to_image_lora.py | https://arxiv.org|设置说明 | -| 开源代码引入 | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices|\examples\text_to_image\train_text_to_image_lora.py | https://pytorch.org|设置说明 | -| 开源代码引入 | https://www.tensorflow.org/tensorboard| \examples\text_to_image\train_text_to_image_lora.py| https://www.tensorflow.org|设置说明 | -| 开源代码引入 | https://huggingface.co/docs/accelerate/package_reference/accelerator#accelerate.Accelerator.save_state| \examples\text_to_image\train_text_to_image_lora.py| https://huggingface.co|设置说明 | -| 开源代码引入 | https://huggingface.co/docs/diffusers/main/en/optimization/xformers| \examples\text_to_image\train_text_to_image_lora.py| https://huggingface.co|设置说明 | -| 开源代码引入 | 
https://huggingface.co/docs/datasets/image_dataset#imagefolder| \examples\text_to_image\train_text_to_image.py|https://huggingface.co |设置说明 | -| 开源代码引入 | https://arxiv.org/abs/2303.09556| \examples\text_to_image\train_text_to_image.py|https://arxiv.org |设置说明 | -| 开源代码引入 | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices| \examples\text_to_image\train_text_to_image.py| https://pytorch.org|设置说明 | -| 开源代码引入 | https://www.tensorflow.org/tensorboard| \examples\text_to_image\train_text_to_image.py| https://www.tensorflow.org|设置说明 | -| 开源代码引入 | https://huggingface.co/docs/accelerate/package_reference/accelerator#accelerate.Accelerator.save_state| \examples\text_to_image\train_text_to_image.py| https://huggingface.co|设置说明 | -| 开源代码引入 | https://huggingface.co/docs/accelerate/v0.17.0/en/package_reference/accelerator#accelerate.Accelerator| \examples\text_to_image\train_text_to_image.py| https://huggingface.co|设置说明 | -| 开源代码引入 | https://huggingface.co/docs/diffusers/main/en/optimization/xformers| \examples\text_to_image\train_text_to_image.py| https://huggingface.co|设置说明 | -| 开源代码引入 | https://huggingface.co/docs/diffusers/main/en/optimization/xformers| \examples\text_to_image\train_text_to_image.py| https://huggingface.co/|设置说明 | -| 开源代码引入 | https://www.tensorflow.org/tensorboard| \examples\textual_inversion\textual_inversion_flax.py | https://www.tensorflow.org |设置说明 | -| 开源代码引入 | https://www.tensorflow.org/tensorboard| \examples\textual_inversion\textual_inversion.py| https://www.tensorflow.org|设置说明 | -| 开源代码引入 | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices| \examples\textual_inversion\textual_inversion.py| https://pytorch.org|设置说明 | -| 开源代码引入 | https://huggingface.co/docs/accelerate/package_reference/accelerator#accelerate.Accelerator.save_state| \examples\textual_inversion\textual_inversion.py|https://huggingface.co/ |设置说明 | -| 开源代码引入 | https://huggingface.co/docs/diffusers/main/en/optimization/xformers| \examples\textual_inversion\textual_inversion.py| https://huggingface.co|设置说明 | -| 开源代码引入 | https://huggingface.co/docs/datasets/image_dataset#imagefolder| \examples\unconditional_image_generation\train_unconditional.py| https://huggingface.co|设置说明 | -| 开源代码引入 | https://www.tensorflow.org/tensorboard| \examples\unconditional_image_generation\train_unconditional.py| https://www.tensorflow.org|设置说明 | -| 开源代码引入 | https://www.wandb.ai| \examples\unconditional_image_generation\train_unconditional.py| https://www.wandb.ai|设置说明 | -| 开源代码引入 | https://huggingface.co/docs/accelerate/package_reference/accelerator#accelerate.Accelerator.save_state| \examples\unconditional_image_generation\train_unconditional.py| https://huggingface.co |设置说明 | -| 开源代码引入 | https://huggingface.co/docs/diffusers/main/en/optimization/xformers| \examples\unconditional_image_generation\train_unconditional.py| https://huggingface.co|设置说明 | -| 开源代码引入 | https://model-server.zqevans2.workers.dev/gwf-440k.ckpt| \scripts\convert_dance_diffusion_to_diffusers.py | https://model-server.zqevans2.workers.dev | 下载权重 -| 开源代码引入 | https://model-server.zqevans2.workers.dev/jmann-small-190k.ckpt| \scripts\convert_dance_diffusion_to_diffusers.py| https://model-server.zqevans2.workers.dev|下载权重 | -| 开源代码引入 | https://model-server.zqevans2.workers.dev/jmann-large-580k.ckpt| \scripts\convert_dance_diffusion_to_diffusers.py| https://model-server.zqevans2.workers.dev|下载权重 | -| 开源代码引入 | https://model-server.zqevans2.workers.dev/maestro-uncond-150k.ckpt| 
\scripts\convert_dance_diffusion_to_diffusers.py| https://model-server.zqevans2.workers.dev|下载权重 | -| 开源代码引入 | https://model-server.zqevans2.workers.dev/unlocked-uncond-250k.ckpt| \scripts\convert_dance_diffusion_to_diffusers.py| https://model-server.zqevans2.workers.dev/|下载权重 | -| 开源代码引入 | https://model-server.zqevans2.workers.dev/honk-140k.ckpt| \scripts\convert_dance_diffusion_to_diffusers.py| https://model-server.zqevans2.workers.dev|下载权重 | -| 开源代码引入 | https://dl.fbaipublicfiles.com/DiT/models/{model_name} | \scripts\convert_dit_to_diffusers.py| https://dl.fbaipublicfiles.com|下载权重 | -| 开源代码引入 | https://huggingface.co/kakaobrain/karlo-v1-alpha/tree/main/prior| \scripts\convert_original_stable_diffusion_to_diffusers.py| https://huggingface.co|设置说明 | -| 开源代码引入 | https://raw.githubusercontent.com/CompVis/stable-diffusion/main/configs/stable-diffusion/v1-inference.yaml| \scripts\convert_vae_pt_to_diffusers.py| https://raw.githubusercontent.com|下载配置 | -| 开源代码引入 | https://github.com/facebookresearch/xformers|\src\diffusers\models\attention_processor.py | https://github.com|设置说明 | -| 开源代码引入 | https://huggingface.co/models| \src\diffusers\models\modeling_flax_utils.py| https://huggingface.co|设置说明 | -| 开源代码引入 | https://huggingface.co/{pretrained_model_name_or_path}|\src\diffusers\models\modeling_flax_utils.py |https://huggingface.co |设置说明 | -| 开源代码引入 | https://huggingface.co/docs/transformers/installation#offline-mode| \src\diffusers\models\modeling_flax_utils.py| https://huggingface.co|设置说明 | -| 开源代码引入 | https://huggingface.co/models| \src\diffusers\models\modeling_flax_utils.py| https://huggingface.co |设置说明 | -| 开源代码引入 | https://pytorch.org/ and https://flax.readthedocs.io/en/latest/installation.html | \src\diffusers\models\modeling_pytorch_flax_utils.py| https://pytorch.org/|设置说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/issues/1619#issuecomment-1345604389| \src\diffusers\models\modeling_utils.py| https://github.com/|设置说明 | -| 开源代码引入 | https://raw.githubusercontent.com/CompVis/stable-diffusion/main/assets/stable-samples/img2img/sketch-mountains-input.jpg| \src\diffusers\pipelines\alt_diffusion\pipeline_alt_diffusion_img2img.py|https://raw.githubusercontent.com |设置说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/pull/254 | \src\diffusers\pipelines\alt_diffusion\pipeline_alt_diffusion_img2img.py| https://github.com|设置说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/pull/254| \src\diffusers\pipelines\alt_diffusion\pipeline_alt_diffusion.py| https://github.com|设置说明 | -| 开源代码引入 | https://hf.co/datasets/huggingface/documentation-images/resolve/main/diffusers/input_image_vermeer.png|\src\diffusers\pipelines\controlnet\pipeline_controlnet_img2img.py | https://hf.co/|设置说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/pull/254| \src\diffusers\pipelines\controlnet\pipeline_controlnet_img2img.py| https://github.com|设置说明 | -| 开源代码引入 | https://huggingface.co/datasets/diffusers/test-arrays/resolve/main/stable_diffusion_inpaint/boy.png| \src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint.py| https://huggingface.co|设置说明 | -| 开源代码引入 | https://huggingface.co/datasets/diffusers/test-arrays/resolve/main/stable_diffusion_inpaint/boy_mask.png| \src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint.py| https://huggingface.co|设置说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/pull/254| \src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint.py| https://github.com|设置说明 | -| 开源代码引入 | 
https://hf.co/datasets/huggingface/documentation-images/resolve/main/diffusers/input_image_vermeer.png| \src\diffusers\pipelines\controlnet\pipeline_controlnet.py| https://hf.co/ |设置说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/pull/254| \src\diffusers\pipelines\controlnet\pipeline_controlnet.py| https://github.com/|设置说明 | -| 开源代码引入 | https://huggingface.co/datasets/YiYiXu/test-doc-assets/resolve/main/blog_post_cell_10_output_0.jpeg|\src\diffusers\pipelines\controlnet\pipeline_flax_controlnet.py |https://huggingface.co|设置说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/pull/254|\src\diffusers\pipelines\controlnet\pipeline_flax_controlnet.py | https://github.com|设置说明 | -| 开源代码引入 | https://raw.githubusercontent.com/CompVis/stable-diffusion/main/assets/stable-samples/img2img/sketch-mountains-input.jpg|\src\diffusers\pipelines\deepfloyd_if\pipeline_if_img2img_superresolution.py | https://raw.githubusercontent.com|设置说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/pull/254|\src\diffusers\pipelines\deepfloyd_if\pipeline_if_img2img_superresolution.py | https://github.com/ |设置说明 | -| 开源代码引入 | https://raw.githubusercontent.com/CompVis/stable-diffusion/main/assets/stable-samples/img2img/sketch-mountains-input.jpg| \src\diffusers\pipelines\deepfloyd_if\pipeline_if_img2img.py| https://raw.githubusercontent.com|设置说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/pull/254 | \src\diffusers\pipelines\deepfloyd_if\pipeline_if_img2img.py| https://github.com/|设置说明 | -| 开源代码引入 | https://huggingface.co/datasets/diffusers/docs-images/resolve/main/if/person.png| \src\diffusers\pipelines\deepfloyd_if\pipeline_if_inpainting_superresolution.py|https://huggingface.co |设置说明 | -| 开源代码引入 | https://huggingface.co/datasets/diffusers/docs-images/resolve/main/if/glasses_mask.png|\src\diffusers\pipelines\deepfloyd_if\pipeline_if_inpainting_superresolution.py | https://huggingface.co/|设置说明 | -| 开源代码引入 | https://huggingface.co/datasets/diffusers/docs-images/resolve/main/if/person.png| \src\diffusers\pipelines\deepfloyd_if\pipeline_if_inpainting.py|https://huggingface.co |设置说明 | -| 开源代码引入 | https://huggingface.co/datasets/diffusers/docs-images/resolve/main/if/glasses_mask.png| \src\diffusers\pipelines\deepfloyd_if\pipeline_if_inpainting.py| https://huggingface.co/|设置说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/pull/254| \src\diffusers\pipelines\deepfloyd_if\pipeline_if_inpainting.py| https://github.com |设置说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/pull/254| \src\diffusers\pipelines\deepfloyd_if\pipeline_if_superresolution.py| https://github.com/|设置说明 | -| 开源代码引入 | https://huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main| \src\diffusers\pipelines\kandinsky\pipeline_kandinsky_img2img.py| https://huggingface.co|设置说明 | -| 开源代码引入 | https://huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main| \src\diffusers\pipelines\kandinsky\pipeline_kandinsky_inpaint.py|https://huggingface.co |设置说明 | -| 开源代码引入 | https://huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main| \src\diffusers\pipelines\kandinsky\pipeline_kandinsky_prior.py|https://huggingface.co |设置说明 | -| 开源代码引入 | https://huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main| \src\diffusers\pipelines\kandinsky\pipeline_kandinsky_prior.py| https://huggingface.co|设置说明 | -| 开源代码引入 | https://huggingface.co/valhalla/ldm-bert/blob/main/config.json| \src\diffusers\pipelines\latent_diffusion\pipeline_latent_diffusion.py| 
https://huggingface.co/|配置文件| -| 开源代码引入 | https://github.com/huggingface/diffusers/pull/254|\src\diffusers\pipelines\semantic_stable_diffusion\pipeline_semantic_stable_diffusion.py | https://github.com|设置说明 | -| 开源代码引入 | https://raw.githubusercontent.com/CompVis/stable-diffusion/main/configs/stable-diffusion/v1-inference.yaml|\src\diffusers\pipelines\stable_diffusion\convert_from_ckpt.py | https://raw.githubusercontent |配置文件 | -| 开源代码引入 | https://raw.githubusercontent.com/Stability-AI/stablediffusion/main/configs/stable-diffusion/v2-inference-v.yaml| \src\diffusers\pipelines\stable_diffusion\convert_from_ckpt.py | https://raw.githubusercontent.com/|配置文件 | -| 开源代码引入 | https://github.com/huggingface/diffusers/pull/254 | \src\diffusers\pipelines\stable_diffusion\pipeline_cycle_diffusion.py| https://github.com|设置说明 | -| 开源代码引入 | https://raw.githubusercontent.com/CompVis/stable-diffusion/main/assets/stable-samples/img2img/sketch-mountains-input.jpg|\src\diffusers\pipelines\stable_diffusion\pipeline_flax_stable_diffusion_img2img.py |https://raw.githubusercontent.com |设置说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/pull/254|\src\diffusers\pipelines\stable_diffusion\pipeline_flax_stable_diffusion_img2img.py |https://github.com/ |设置说明 | -| 开源代码引入 | https://raw.githubusercontent.com/CompVis/latent-diffusion/main/data/inpainting_examples/overture-creations-5sI6fQgYIuo.png| \src\diffusers\pipelines\stable_diffusion\pipeline_flax_stable_diffusion_inpaint.py| https://raw.githubusercontent.com/|设置说明 | -| 开源代码引入 | https://raw.githubusercontent.com/CompVis/latent-diffusion/main/data/inpainting_examples/overture-creations-5sI6fQgYIuo_mask.png| \src\diffusers\pipelines\stable_diffusion\pipeline_flax_stable_diffusion_inpaint.py| https://raw.githubusercontent.com/|设置说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/pull/254|\src\diffusers\pipelines\stable_diffusion\pipeline_flax_stable_diffusion_inpaint.py | https://github.com/ |设置说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/pull/254| \src\diffusers\pipelines\stable_diffusion\pipeline_flax_stable_diffusion.py|https://github.com/ |设置说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/pull/254| \src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_img2img.py|https://github.com/ |设置说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/pull/254| \src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_inpaint_legacy.py| https://github.com|设置说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/pull/254| \src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_inpaint.py| https://github.com/|设置说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/pull/254 |\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion.py | https://github.com |设置说明 | -| 开源代码引入 | https://github.com/Xiang-cd/DiffEdit-stable-diffusion/raw/main/assets/origin.png|\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_diffedit.py | https://github.com/|设置说明 | -| 开源代码引入 | https://github.com/Xiang-cd/DiffEdit-stable-diffusion/raw/main/assets/origin.png|\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_diffedit.py | https://github.com/|设置说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/pull/254|\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_diffedit.py |https://github.com/ |设置说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/pull/254| 
\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_image_variation.py| https://github.com|设置说明 | -| 开源代码引入 | https://raw.githubusercontent.com/CompVis/stable-diffusion/main/assets/stable-samples/img2img/sketch-mountains-input.jpg|\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_instruct_pix2pix.py |https://raw.githubusercontent.com |设置说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/pull/254 |\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_instruct_pix2pix.py | https://github.com|设置说明 | -| 开源代码引入 | https://huggingface.co/docs/diffusers/api/schedulers#implemented-schedulers| \src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_k_diffusion.py|https://huggingface.co |设置说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/pull/254| \src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_model_editing.py| https://github.com/|设置说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/pull/254| \src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_panorama.py |https://github.com/ |设置说明 | -| 开源代码引入 | https://hf.co/datasets/sayakpaul/sample-datasets/resolve/main/cat.pt|\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_pix2pix_zero.py | https://hf.co/|设置说明 | -| 开源代码引入 | https://hf.co/datasets/sayakpaul/sample-datasets/resolve/main/dog.pt| \src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_pix2pix_zero.py| https://hf.co/|设置说明 | -| 开源代码引入 | https://github.com/pix2pixzero/pix2pix-zero/raw/main/assets/test_images/cats/cat_6.png| \src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_pix2pix_zero.py| https://github.com/|设置说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/pull/254 |\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_pix2pix_zero.py | https://github.com/|设置说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/pull/254|\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion.py | https://github.com/|设置说明 | -| 开源代码引入 | https://raw.githubusercontent.com/CompVis/stable-diffusion/main/assets/stable-samples/img2img/sketch-mountains-input.jpg|\src\diffusers\pipelines\stable_diffusion\pipeline_stable_unclip_img2img.py |https://raw.githubusercontent.com/ |设置说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/pull/254 | \src\diffusers\pipelines\stable_diffusion_safe\pipeline_stable_diffusion_safe.py|https://github.com/ |设置说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/issues/new|\src\diffusers\pipelines\pipeline_utils.py | https://github.com/|设置说明 | -| 开源代码引入 | https://huggingface.co/runwayml/stable-diffusion-inpainting| \src\diffusers\pipelines\pipeline_utils.py|https://huggingface.co |设置说明 | -| 开源代码引入 | https://huggingface.co/runwayml/stable-diffusion-inpainting | \src\diffusers\pipelines\pipeline_utils.py|https://huggingface.co |设置说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/blob/main/src/diffusers/pipelines/pipeline_pndm.py|\src\diffusers\schedulers\scheduling_pndm.py |https://github.com/ |设置说明 | -| 开源代码引入 | https://huggingface.co/docs/diffusers/installation#install-from-source|\src\diffusers\utils\__init__.py |https://huggingface.co |设置说明 | -| 开源代码引入 | https://huggingface.co|\src\diffusers\utils\constants.py |"https://huggingface.co |设置说明 | -| 开源代码引入 | https://raw.githubusercontent.com/huggingface/diffusers/{revision}/examples/community/{pipeline}.py|\src\diffusers\utils\dynamic_modules_utils.py | https://raw.githubusercontent.com |下载依赖 | -| 开源代码引入 | 
https://pypi.org/pypi/diffusers/json| \src\diffusers\utils\dynamic_modules_utils.py|https://pypi.org/ |设置说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/issues/new/choose|\src\diffusers\utils\hub_utils.py |https://github.com/ |设置说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/issues/new|\src\diffusers\utils\hub_utils.py | https://github.com/ |设置说明 | -| 开源代码引入 | https://huggingface.co/models|\src\diffusers\utils\hub_utils.py | https://huggingface.co/|设置说明 | -| 开源代码引入 | https://huggingface.co/{pretrained_model_name_or_path}|\src\diffusers\utils\hub_utils.py |https://huggingface.co/ |设置说明 | -| 开源代码引入 | https://huggingface.co/docs/diffusers/installation#offline-mode| \src\diffusers\utils\hub_utils.py|https://huggingface.co |设置说明 | -| 开源代码引入 | https://huggingface.co/models| \src\diffusers\utils\hub_utils.py|https://huggingface.co/ |设置说明| -| 开源代码引入 | https://github.com/google/flax|\src\diffusers\utils\import_utils.py | https://github.com|设置说明 | -| 开源代码引入 | https://pytorch.org/get-started/locally/| \src\diffusers\utils\import_utils.py| https://pytorch.org|设置说明 | -| 开源代码引入 | https://librosa.org/doc/latest/install.html| \src\diffusers\utils\import_utils.py| https://librosa.org|设置说明 | -| 开源代码引入 | https://github.com/rspeer/python-ftfy/tree/master#installing|\src\diffusers\utils\import_utils.py | https://github.com/|设置说明 | -| 开源代码引入 | https://huggingface.co/datasets/fusing/diffusers-testing/resolve/main|\src\diffusers\utils\testing_utils.py | https://huggingface.co |下载依赖 | -| 开源代码引入 | https://huggingface.co/models| \src\diffusers\configuration_utils.py| https://huggingface.co|设置说明 | -| 开源代码引入 | https://huggingface.co/{pretrained_model_name_or_path}| \src\diffusers\configuration_utils.py|https://huggingface.co |设置说明 | -| 开源代码引入 | https://huggingface.co/docs/diffusers/installation#offline-mode| \src\diffusers\configuration_utils.py| https://huggingface.co/|设置说明 | -| 开源代码引入 | https://huggingface.co/docs/diffusers/main/en/api/loaders#diffusers.loaders.LoraLoaderMixin.load_lora_weights|\src\diffusers\loaders.py |https://huggingface.co|设置说明 | -| 开源代码引入 | https://huggingface.co/|\src\diffusers\loaders.py |https://huggingface.co/ |下载依赖 | -| 开源代码引入 | huggingface.co/|\src\diffusers\loaders.py |huggingface.co/ |下载依赖 | -| 开源代码引入 | hf.co/|\src\diffusers\loaders.py|hf.co/ |下载依赖 | -| 开源代码引入 | https://hf.co/|\src\diffusers\loaders.py |https://hf.co/ |下载依赖 | -| 开源代码引入 | https://huggingface.co/{ckpt_name}|\utils\check_config_docstrings.py |https://huggingface.co/ |下载依赖 | -| 开源代码引入 | https://huggingface\.co/.+?)\|\utils\check_config_docstrings.py | https://huggingface.co|下载依赖 | -| 开源代码引入 | git+https://github.com/huggingface/doc-builder| \utils\check_repo.py | git+https://github.com/huggingface/doc-builder|下载依赖 | -| 开源代码引入 | https://huggingface.co/docs/diffusers/main/model_doc|\utils\release.py | https://huggingface.co/|下载依赖 | -| 开源代码引入 | https://huggingface.co/docs/diffusers/model_doc| \utils\release.py| https://huggingface.co|下载依赖 | -| 开源代码引入 | https://github.com/huggingface/diffusers/blob/main/CONTRIBUTING.md|\utils\stale.py |https://github.com/ |设置说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers|\CITATION.cff | https://github.com/|配置设置 | -| 开源代码引入 | https://github.com/huggingface/diffusers|\setup.py | https://github.com/ |配置设置 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/text_inpainting.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/diffusers/src/diffusers/utils/testing_utils.py | diffusers0.17.0/src/diffusers/utils/testing_utils.py | https://github.com/pytest-dev/pytest/blob/897f151e/src/_pytest/terminal.py#L814 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_attend_and_excite.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_upscale.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_pix2pix_zero.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.17.0/src/diffusers/pipelines/deepfloyd_if/pipeline_if.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/vae_flax.py | diffusers0.17.0/src/diffusers/models/vae_flax.py | https://github.com/CompVis/taming-transformers | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/alt_diffusion/pipeline_alt_diffusion.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.17.0/src/diffusers/pipelines/controlnet/pipeline_controlnet.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.17.0/examples/community/stable_diffusion_mega.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/unclip/pipeline_unclip.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/stable_diffusion_repaint.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/models/prior_transformer.py | https://arxiv.org/abs/2204.06125 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/deepfloyd_if/pipeline_if_superresolution.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_img2img.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/alt_diffusion/pipeline_alt_diffusion_img2img.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开发引入 | / | 
diffusers0.17.0/src/diffusers/pipelines/stable_diffusion_safe/pipeline_stable_diffusion_safe.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/attention_processor.py | diffusers0.17.0/src/diffusers/models/attention_processor.py | https://arxiv.org/abs/2209.09002 | 论文地址 | -| 开发引入 | / | diffusers0.17.0/examples/community/clip_guided_stable_diffusion.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/deepfloyd_if/pipeline_if_img2img_superresolution.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.17.0/examples/community/unclip_image_interpolation.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_unclip.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_ddpm.py | https://arxiv.org/pdf/2006.11239.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/vae_flax.py | diffusers0.17.0/src/diffusers/models/vae_flax.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/schedulers/scheduling_unipc_multistep.py | https://arxiv.org/abs/2211.01095 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_inpaint_legacy.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/deepfloyd_if/pipeline_if_inpainting_superresolution.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.17.0/examples/community/stable_diffusion_ipex.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/scripts/convert_kakao_brain_unclip_to_diffusers.py | diffusers0.17.0/scripts/convert_kakao_brain_unclip_to_diffusers.py | https://arena.kakaocdn.net/brainrepo/models/karlo-public/v1.0.0.alpha/efdf6206d8ed593961593dc029a8affa/decoder-ckpt-step%3D01000000-of-01000000.ckpt | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/diffusers/scripts/convert_kandinsky_to_diffusers.py | diffusers0.17.0/scripts/convert_kandinsky_to_diffusers.py | https://huggingface.co/ai-forever/Kandinsky_2.1/blob/main/prior_fp16.ckpt | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_img2img.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.17.0/examples/community/stable_diffusion_mega.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/models/modeling_pytorch_flax_utils.py | 
https://pytorch.org/","https://flax.readthedocs.io/en/latest/installation.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/examples/community/clip_guided_images_mixing_stable_diffusion.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_img2img.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/src/diffusers/pipelines/controlnet/pipeline_controlnet_img2img.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/attention_flax.py | diffusers0.17.0/src/diffusers/models/attention_flax.py | https://arxiv.org/abs/1706.03762 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/controlnet/train_controlnet_flax.py | diffusers0.17.0/examples/controlnet/train_controlnet_flax.py | https://huggingface.co/docs/datasets/dataset_script | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion_image_variation.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/multilingual_stable_diffusion.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.17.0/src/diffusers/pipelines/text_to_video_synthesis/pipeline_text_to_video_synth.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/audioldm/pipeline_audioldm.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开发引入 | / | diffusers0.17.0/examples/community/sd_text2img_k_diffusion.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/seed_resize_stable_diffusion.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/training_utils.py | diffusers0.17.0/src/diffusers/training_utils.py | https://github.com/fadel/pytorch_ema/blob/master/torch_ema/ema.py#L14 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.17.0/examples/community/composable_stable_diffusion.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_diffedit.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 
开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_upscale.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_k_diffusion.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/stable_diffusion_tensorrt_txt2img.py | diffusers0.17.0/examples/community/stable_diffusion_tensorrt_inpaint.py | https://pypi.ngc.nvidia.com | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/unclip_image_interpolation.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/scripts/convert_original_audioldm_to_diffusers.py | diffusers0.17.0/scripts/convert_original_audioldm_to_diffusers.py | https://huggingface.co/spaces/haoheliu/audioldm-text-to-audio-generation/blob/84a0384742a22bd80c44e903e241f0623e874f1d/audioldm/utils.py#L72-L73 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion_upscale.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.17.0/src/diffusers/pipelines/alt_diffusion/pipeline_alt_diffusion.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_pndm_flax.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_pndm_flax.py | https://github.com/ermongroup/ddim | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/scripts/convert_kakao_brain_unclip_to_diffusers.py | diffusers0.17.0/scripts/convert_kakao_brain_unclip_to_diffusers.py | https://arena.kakaocdn.net/brainrepo/models/karlo-public/v1.0.0.alpha/85626483eaca9f581e2a78d31ff905ca/prior-ckpt-step%3D01000000-of-01000000.ckpt | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.17.0/src/diffusers/pipelines/controlnet/pipeline_controlnet.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_pndm_flax.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_ddpm.py | https://imagen.research.google/video/paper.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.17.0/src/diffusers/pipelines/deepfloyd_if/pipeline_if_inpainting_superresolution.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开发引入 | / | diffusers0.17.0/examples/community/lpw_stable_diffusion_onnx.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/interpolate_stable_diffusion.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.17.0/examples/community/img2img_inpainting.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开发引入 | / | diffusers0.17.0/examples/community/stable_diffusion_controlnet_inpaint_img2img.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_lcm.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_ddim_flax.py | https://github.com/hojonathanho/diffusion | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_depth2img.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/stable_diffusion_tensorrt_inpaint.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/deepfloyd_if/pipeline_if_img2img.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/controlnet/pipeline_controlnet_inpaint.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_unipc_multistep.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_dpmsolver_multistep_inverse.py | https://arxiv.org/abs/2205.11487 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_image_variation.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_instruct_pix2pix.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_instruct_pix2pix.py | https://huggingface.co/datasets/diffusers/diffusers-images-docs/resolve/main/mountain.png | 图片地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.17.0/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/pipeline_utils.py | http://hostname | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/latent_diffusion/pipeline_latent_diffusion.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开发引入 | / | diffusers0.17.0/scripts/convert_vq_diffusion_to_diffusers.py | https://facevcstandard.blob.core.windows.net/v-zhictang/Improved-VQ-Diffusion_model_release/ithq_learnable.pth?sv=2020-10-02&st=2022-05-30T10%3A22%3A06Z&se=2030-05-31T10%3A22%3A00Z&sr=b&sp=r&sig=GOE%2 | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/vae_flax.py | diffusers0.17.0/src/diffusers/models/unet_2d_condition_flax.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/unidiffuser/pipeline_unidiffuser.py | 
https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPVisionModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/ddim_noise_comparative_analysis.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.17.0/src/diffusers/pipelines/unidiffuser/pipeline_unidiffuser.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_image_variation.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_attend_and_excite.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/vae_flax.py | diffusers0.17.0/src/diffusers/models/controlnet_flax.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_image_variation.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/stable_diffusion_controlnet_img2img.py | diffusers0.17.0/examples/community/stable_diffusion_controlnet_img2img.py | https://hf.co/datasets/huggingface/documentation-images/resolve/main/diffusers/vermeer_canny_edged.png | 图片地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/controlnet/pipeline_controlnet_img2img.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/examples/community/seed_resize_stable_diffusion.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_k_diffusion.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/schedulers/scheduling_lms_discrete.py | https://arxiv.org/pdf/2206.00364.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_inpaint_legacy.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/src/diffusers/pipelines/alt_diffusion/pipeline_alt_diffusion.py | https://arxiv.org/pdf/2305.08891.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_pndm_flax.py | 
diffusers0.17.0/src/diffusers/schedulers/scheduling_pndm.py | https://imagen.research.google/video/paper.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.17.0/src/diffusers/pipelines/controlnet/pipeline_controlnet_inpaint.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/deepfloyd_if/pipeline_if_img2img.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_latent_upscale.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_model_editing.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/utils/dynamic_modules_utils.py | diffusers0.17.0/src/diffusers/models/modeling_utils.py | https://huggingface.co/docs/hub/models-gated#gated-models | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/vae_flax.py | diffusers0.17.0/src/diffusers/models/vae_flax.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/schedulers/scheduling_karras_ve_flax.py | https://arxiv.org/abs/2206.00364 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/modeling_flax_pytorch_utils.py | diffusers0.17.0/src/diffusers/models/modeling_flax_pytorch_utils.py | https://github.com/patil-suraj/stable-diffusion-jax/blob/main/stable_diffusion_jax/convert_diffusers_to_jax.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_unipc_multistep.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_ddpm.py | https://arxiv.org/abs/2205.11487 | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion_upscale.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/pipeline_utils.py | diffusers0.17.0/src/diffusers/models/modeling_utils.py | https://facebookresearch.github.io/xformers/components/ops.html#xformers.ops.memory_efficient_attention | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_depth2img.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/text_to_video_synthesis/pipeline_text_to_video_zero.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_unipc_multistep.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_dpmsolver_singlestep.py | https://arxiv.org/abs/2205.11487 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/utils/dynamic_modules_utils.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_utils.py | https://huggingface.co/docs/hub/models-gated#gated-models | 相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_unclip.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_unclip.py | https://arxiv.org/pdf/2006.11239.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion_safe/pipeline_stable_diffusion_safe.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/custom_diffusion/train_custom_diffusion.py | diffusers0.17.0/examples/custom_diffusion/train_custom_diffusion.py | https://github.com/huggingface/diffusers/blob/main/examples/textual_inversion/textual_inversion.py | 源码实现 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion_inpaint.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/alt_diffusion/pipeline_alt_diffusion_img2img.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.RobertaSeriesModelWithTransformation | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.17.0/examples/community/stable_diffusion_tensorrt_img2img.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开发引入 | / | diffusers0.17.0/examples/community/lpw_stable_diffusion.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_unipc_multistep.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_unipc_multistep.py | https://arxiv.org/abs/2205.11487 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/unclip/pipeline_unclip.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/latent_diffusion/pipeline_latent_diffusion_superresolution.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion_dual_guided.py | https://huggingface.co/docs/transformers/model_doc/bert#transformers.BertTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_cycle_diffusion.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/schedulers/scheduling_dpmsolver_multistep.py | https://arxiv.org/pdf/2206.00364.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/lpw_stable_diffusion_onnx.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/textual_inversion/textual_inversion_flax.py | 
diffusers0.17.0/examples/research_projects/mulit_token_textual_inversion/textual_inversion_flax.py | https://github.com/deepmind/optax/issues/159#issuecomment-896459491 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/unidiffuser/modeling_uvit.py | diffusers0.17.0/src/diffusers/pipelines/unidiffuser/modeling_uvit.py | https://github.com/baofff/U-ViT | 源码实现 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/unidiffuser/modeling_uvit.py | https://arxiv.org/pdf/2303.06555.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/examples/community/stable_diffusion_ipex.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/examples/community/img2img_inpainting.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_model_editing.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_dpmsolver_singlestep.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_dpmsolver_multistep_flax.py | https://github.com/LuChengTHU/dpm-solver | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.17.0/examples/community/stable_diffusion_ipex.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_attend_and_excite.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion_dual_guided.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/pipeline_utils.py | diffusers0.17.0/src/diffusers/pipelines/pipeline_utils.py | https://huggingface.co/docs/diffusers/using-diffusers/custom_pipeline_overview | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/kandinsky/pipeline_kandinsky.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_panorama.py | 
https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion_inpaint_legacy.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/deepfloyd_if/pipeline_if_superresolution.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_k_diffusion.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.17.0/examples/community/stable_diffusion_repaint.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_dpmsolver_singlestep.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_dpmsolver_singlestep.py | https://github.com/LuChengTHU/dpm-solver | 源码实现 | -| 开发引入 | / | diffusers0.17.0/examples/community/ddim_noise_comparative_analysis.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/src/diffusers/pipelines/paint_by_example/pipeline_paint_by_example.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/deepfloyd_if/pipeline_if_img2img_superresolution.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/text_inpainting.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/alt_diffusion/pipeline_alt_diffusion.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.XLMRobertaTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/deepfloyd_if/pipeline_if_inpainting.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/examples/community/mixture.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_diffedit.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion_inpaint.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/src/diffusers/pipelines/deepfloyd_if/pipeline_if_img2img_superresolution.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/stable_diffusion_repaint.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/text_to_video_synthesis/pipeline_text_to_video_zero.py | diffusers0.17.0/src/diffusers/pipelines/text_to_video_synthesis/pipeline_text_to_video_zero.py | https://github.com/princeton-vl/RAFT/blob/master/core/utils/utils.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion_image_variation.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开发引入 | / | diffusers0.17.0/examples/community/stable_diffusion_ipex.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/imagic_stable_diffusion.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/models/modeling_flax_utils.py | http://hostname | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/README.md | diffusers0.17.0/examples/community/wildcard_stable_diffusion.py | https://huggingface.co/CompVis/stable-diffusion-v1-4 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_upscale.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_upscale.py | https://huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main/sd2-upscale/low_res_cat.png | 图片地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/schedulers/scheduling_k_dpm_2_ancestral_discrete.py | https://github.com/crowsonkb/k-diffusion/blob/5b3af030dd83e0297272d861c19477735d0317ec/k_diffusion/sampling.py#L188 | 源码实现 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/latent_diffusion/pipeline_latent_diffusion.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/README.md | diffusers0.17.0/examples/community/interpolate_stable_diffusion.py | https://huggingface.co/CompVis/stable-diffusion-v1-4 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.17.0/examples/community/seed_resize_stable_diffusion.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/docs/source/en/api/schedulers/dpm_discrete_ancestral.md | diffusers0.17.0/src/diffusers/schedulers/scheduling_k_dpm_2_discrete.py | https://github.com/crowsonkb/k-diffusion | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/stable_diffusion_controlnet_inpaint_img2img.py | diffusers0.17.0/examples/community/stable_diffusion_controlnet_inpaint.py | https://github.com/CompVis/latent-diffusion/raw/main/data/inpainting_examples/overture-creations-5sI6fQgYIuo.png | 图片地址 | -| 开源代码引入 | 
https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_upscale.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开发引入 | / | diffusers0.17.0/examples/community/mixture_tiling.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_unclip.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/stable_diffusion_controlnet_inpaint_img2img.py | diffusers0.17.0/examples/community/stable_diffusion_controlnet_inpaint_img2img.py | https://github.com/CompVis/latent-diffusion/raw/main/data/inpainting_examples/overture-creations-5sI6fQgYIuo_mask.png | 图片地址 | -| 开发引入 | / | diffusers0.17.0/examples/community/composable_stable_diffusion.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/bit_diffusion.py | diffusers0.17.0/examples/community/bit_diffusion.py | https://github.com/lucidrains/bit-diffusion/blob/main/bit_diffusion/bit_diffusion.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_depth2img.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_depth2img.py | http://images.cocodataset.org/val2017/000000039769.jpg | 图片地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/utils/dynamic_modules_utils.py | diffusers0.17.0/src/diffusers/pipelines/pipeline_flax_utils.py | https://huggingface.co/docs/hub/models-gated#gated-models | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/kandinsky/pipeline_kandinsky_prior.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/alt_diffusion/pipeline_alt_diffusion.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开发引入 | / | diffusers0.17.0/setup.py | https://testpypi.python.org/pypi | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/controlnet/pipeline_controlnet.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/unclip_image_interpolation.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_image_variation.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPVisionModelWithProjection | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.17.0/examples/community/stable_diffusion_tensorrt_img2img.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_repaint.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_cycle_diffusion.py | https://arxiv.org/pdf/2010.02502.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_ddim.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.17.0/examples/community/lpw_stable_diffusion.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddpm_parallel.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_ddpm.py | https://arxiv.org/abs/2205.09991 | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/schedulers/scheduling_utils_flax.py | https://huggingface.co/transformers/installation.html#offline-mode | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/kandinsky/pipeline_kandinsky_prior.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion_dual_guided.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion_safe/pipeline_stable_diffusion_safe.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/utils/import_utils.py | diffusers0.17.0/src/diffusers/utils/import_utils.py | https://github.com/huggingface/accelerate/blob/874c4967d94badd24f893064cc3bef45f57cadf7/src/accelerate/utils/versions.py#L319 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_flax_stable_diffusion_inpaint.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.17.0/src/diffusers/pipelines/controlnet/pipeline_controlnet_img2img.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/tests/pipelines/controlnet/test_controlnet_img2img.py | diffusers0.17.0/examples/community/stable_diffusion_controlnet_inpaint.py | https://github.com/haofanwang/ControlNet-for-Diffusers/ | 源码实现 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/schedulers/scheduling_dpmsolver_multistep_inverse.py | https://arxiv.org/abs/2206.00927 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/unidiffuser/pipeline_unidiffuser.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_flax_stable_diffusion_img2img.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.17.0/examples/community/stable_diffusion_comparison.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/examples/community/sd_text2img_k_diffusion.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_attend_and_excite.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | https://huggingface.co/thibaud/controlnet-canny-sd21 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/text_inpainting.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_pndm_flax.py | diffusers0.17.0/src/diffusers/pipelines/pndm/pipeline_pndm.py | https://arxiv.org/pdf/2202.09778.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_latent_upscale.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/examples/community/mixture_tiling.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/schedulers/scheduling_dpmsolver_multistep_flax.py | https://arxiv.org/abs/2206.00927 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_attend_and_excite.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/deepfloyd_if/pipeline_if_superresolution.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_sde_ve_flax.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_sde_ve.py | https://arxiv.org/abs/2011.13456 | 论文地址 | -| 开发引入 | / | diffusers0.17.0/examples/community/stable_diffusion_tensorrt_inpaint.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/multilingual_stable_diffusion.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/clip_guided_stable_diffusion_img2img.py | diffusers0.17.0/examples/community/clip_guided_stable_diffusion.py | https://github.dev/crowsonkb/k-diffusion | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_pndm_flax.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_ddim_inverse.py | https://imagen.research.google/video/paper.pdf | 论文地址 | -| 开发引入 | / | 
diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_unclip.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/text_to_video_synthesis/pipeline_text_to_video_synth.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/examples/community/stable_diffusion_controlnet_inpaint_img2img.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/src/diffusers/pipelines/alt_diffusion/pipeline_alt_diffusion_img2img.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/vq_diffusion/pipeline_vq_diffusion.py | diffusers0.17.0/src/diffusers/pipelines/vq_diffusion/pipeline_vq_diffusion.py | https://github.com/huggingface/transformers/blob/d92e22d1f28324f513f3080e5c47c071a3916721/src/transformers/models/clip/modeling_clip.py#L1052-L1053 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/deepfloyd_if/pipeline_if.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/t2i_adapter/train_t2i_adapter_sdxl.py | diffusers0.17.0/examples/controlnet/train_controlnet_flax.py | https://huggingface.co/docs/datasets/v2.0.0/en/dataset_script | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/training_utils.py | diffusers0.17.0/src/diffusers/training_utils.py | https://pytorch.org/tutorials/beginner/saving_loading_models.html#what-is-a-state-dict | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/audioldm/pipeline_audioldm.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/t5_film_transformer.py | diffusers0.17.0/src/diffusers/models/t5_film_transformer.py | https://arxiv.org/abs/1606.08415 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/tiled_upscaling.py | diffusers0.17.0/examples/community/tiled_upscaling.py | peter@codebuffet.co | 邮箱地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/attention_flax.py | diffusers0.17.0/src/diffusers/models/attention_flax.py | https://arxiv.org/pdf/1506.02025.pdf | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/dit/pipeline_dit.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_cycle_diffusion.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/docs/source/ko/using-diffusers/weighted_prompts.md | diffusers0.17.0/src/diffusers/utils/testing_utils.py | https://github.com/damian0815/compel | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_sde_vp.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_sde_vp.py | https://github.com/yang-song/score_sde_pytorch | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | 
diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_attend_and_excite.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/schedulers/scheduling_dpmsolver_multistep_inverse.py | https://arxiv.org/pdf/2206.00364.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_cycle_diffusion.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_flax_stable_diffusion.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/examples/community/lpw_stable_diffusion_onnx.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion_text_to_image.py | https://huggingface.co/docs/transformers/model_doc/bert#transformers.BertTokenizer | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_k_diffusion.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_inpaint.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/schedulers/scheduling_dpmsolver_sde.py | https://github.com/crowsonkb/k-diffusion/blob/41b4cb6df0506694a7776af31349acf082bf6091/k_diffusion/sampling.py#L543 | 源码实现 | -| 开发引入 | / | diffusers0.17.0/examples/community/wildcard_stable_diffusion.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_flax_stable_diffusion_img2img.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_model_editing.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.17.0/examples/community/text_inpainting.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/latent_diffusion/pipeline_latent_diffusion.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开发引入 | / | diffusers0.17.0/examples/community/multilingual_stable_diffusion.py | https://huggingface.co/docs/transformers/model_doc/mbart | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_attend_and_excite.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_inpaint.py | 
https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_depth2img.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion_inpaint.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开发引入 | / | diffusers0.17.0/examples/community/imagic_stable_diffusion.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/examples/community/imagic_stable_diffusion.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.17.0/examples/community/sd_text2img_k_diffusion.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_lms_discrete_flax.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_lms_discrete_flax.py | https://github.com/crowsonkb/k-diffusion/blob/481677d114f6ea445aa009cf5bd7a9cdee909e47/k_diffusion/sampling.py#L181 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion_inpaint_legacy.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.17.0/src/diffusers/pipelines/deepfloyd_if/pipeline_if_inpainting.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开发引入 | / | diffusers0.17.0/examples/community/tiled_upscaling.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.17.0/examples/community/stable_diffusion_controlnet_inpaint_img2img.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/src/diffusers/pipelines/deepfloyd_if/pipeline_if.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion_inpaint.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_sag.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/alt_diffusion/pipeline_alt_diffusion_img2img.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_sag.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开发引入 | / | 
diffusers0.17.0/examples/community/stable_diffusion_controlnet_inpaint_img2img.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/docs/source/_config.py | diffusers0.17.0/docs/source/_config.py | https://github.com/huggingface/diffusers.git | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/wuerstchen/text_to_image/train_text_to_image_prior.py | diffusers0.17.0/examples/research_projects/onnxruntime/text_to_image/train_text_to_image.py | https://huggingface.co/docs/datasets/v2.4.0/en/image_load#imagefolder | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_flax_stable_diffusion_img2img.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.FlaxCLIPTextModel | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_flax_stable_diffusion.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.FlaxCLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/stable_diffusion_controlnet_inpaint.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/src/diffusers/pipelines/deepfloyd_if/pipeline_if_img2img.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/paint_by_example/pipeline_paint_by_example.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/stable_diffusion_mega.py | diffusers0.17.0/examples/community/stable_diffusion_mega.py | https://huggingface.co/docs/diffusers/api/pipelines/stable_diffusion#diffusers.StableDiffusionPipeline | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/models/vq_model.py | https://arxiv.org/abs/2112.10752 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_flax_stable_diffusion_img2img.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/unet_2d_condition_flax.py | diffusers0.17.0/src/diffusers/models/attention_flax.py | https://arxiv.org/abs/2112.05682 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_panorama.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_diffedit.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/training_utils.py | diffusers0.17.0/examples/research_projects/onnxruntime/text_to_image/train_text_to_image.py | 
https://github.com/TiankaiHang/Min-SNR-Diffusion-Training/blob/521b624bd70c67cee4bdf49225915f5945a872e3/guided_diffusion/gaussian_diffusion.py#L1026 | 源码实现 | -| 开发引入 | / | diffusers0.17.0/examples/community/stable_diffusion_tensorrt_img2img.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.17.0/examples/community/stable_diffusion_controlnet_reference.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_flax_stable_diffusion.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_img2img.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/deepfloyd_if/pipeline_if_img2img.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/alt_diffusion/pipeline_alt_diffusion_img2img.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.XLMRobertaTokenizer | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/pndm/pipeline_pndm.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/textual_inversion/textual_inversion_flax.py | diffusers0.17.0/examples/textual_inversion/textual_inversion_flax.py | https://github.com/deepmind/optax/issues/159#issuecomment-896459491 | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/schedulers/scheduling_dpmsolver_singlestep.py | https://arxiv.org/pdf/2206.00364.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/stable_diffusion_reference.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/stable_diffusion_tensorrt_img2img.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/ddim/pipeline_ddim.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/modeling_flax_utils.py | diffusers0.17.0/src/diffusers/models/modeling_flax_utils.py | https://github.com/deepmind/jmp/blob/3a8318abc3292be38582794dbf7b094e6583b192/jmp/_src/policy.py#L27 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | https://huggingface.co/thibaud/controlnet-sd21/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/examples/community/stable_diffusion_reference.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/README.md | 
diffusers0.17.0/examples/community/lpw_stable_diffusion.py | https://huggingface.co/CompVis/stable-diffusion-v1-4 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion_text_to_image.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/dreambooth/train_dreambooth.py | diffusers0.17.0/examples/dreambooth/train_dreambooth.py | https://dreambooth.github.io/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_dpmsolver_singlestep.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_dpmsolver_multistep_inverse.py | https://github.com/LuChengTHU/dpm-solver | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.17.0/src/diffusers/pipelines/unclip/pipeline_unclip_image_variation.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_sag.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_panorama.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/stochastic_karras_ve/pipeline_stochastic_karras_ve.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开发引入 | / | diffusers0.17.0/examples/community/stable_diffusion_reference.py | https://github.com/Mikubill/sd-webui-controlnet/discussions/1236","https://github.com/Mikubill/sd-webui-controlnet/discussions/1280 | 源码实现 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/unclip/pipeline_unclip_image_variation.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_img2img.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/tiled_upscaling.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_model_editing.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.17.0/src/diffusers/pipelines/alt_diffusion/pipeline_alt_diffusion_img2img.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/stable_diffusion_tensorrt_img2img.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/instruct_pix2pix/train_instruct_pix2pix_sdxl.py | 
diffusers0.17.0/examples/instruct_pix2pix/train_instruct_pix2pix.py | https://huggingface.co/docs/datasets/main/en/image_load#imagefolder | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.17.0/examples/community/multilingual_stable_diffusion.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/clip_guided_stable_diffusion_img2img.py | diffusers0.17.0/examples/community/clip_guided_stable_diffusion_img2img.py | https://github.dev/crowsonkb/k-diffusion | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_repaint.py | diffusers0.17.0/examples/community/stable_diffusion_repaint.py | https://arxiv.org/pdf/2201.09865.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_latent_upscale.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_pndm_flax.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_pndm.py | https://arxiv.org/abs/2202.09778 | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/unidiffuser/pipeline_unidiffuser.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPImageProcessor | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_lcm.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_ddim_inverse.py | https://github.com/pesser/pytorch_diffusion | 源码实现 | -| 开发引入 | / | diffusers0.17.0/examples/community/stable_diffusion_tensorrt_txt2img.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/unclip/pipeline_unclip_image_variation.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_instruct_pix2pix.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.17.0/src/diffusers/pipelines/paint_by_example/pipeline_paint_by_example.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_repaint.py | diffusers0.17.0/examples/community/clip_guided_images_mixing_stable_diffusion.py | https://arxiv.org/pdf/2010.02502.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_inpaint.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/img2img_inpainting.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_unclip_img2img.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_instruct_pix2pix.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/src/diffusers/pipelines/alt_diffusion/pipeline_alt_diffusion.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_repaint.py | diffusers0.17.0/examples/community/clip_guided_stable_diffusion_img2img.py | https://arxiv.org/pdf/2010.02502.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/modeling_utils.py | diffusers0.17.0/src/diffusers/configuration_utils.py | https://pytorch.org/docs/stable/_modules/torch/nn/modules/module.html#Module | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.17.0/src/diffusers/pipelines/alt_diffusion/pipeline_alt_diffusion_img2img.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开发引入 | / | diffusers0.17.0/examples/unconditional_image_generation/train_unconditional.py | https://www.tensorflow.org/tensorboard","https://www.wandb.ai | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_pix2pix_zero.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddpm_wuerstchen.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_ddpm.py | https://arxiv.org/abs/2006.11239 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_pndm_flax.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_unipc_multistep.py | https://imagen.research.google/video/paper.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/utils/testing_utils.py | diffusers0.17.0/src/diffusers/utils/testing_utils.py | https://github.com/huggingface/transformers/blob/3658488ff77ff8d45101293e749263acf437f4d5/src/transformers/testing_utils.py#L1787 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.17.0/src/diffusers/pipelines/text_to_video_synthesis/pipeline_text_to_video_zero.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_repaint.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_ddim.py | https://arxiv.org/pdf/2010.02502.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion_text_to_image.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/attention_flax.py | 
diffusers0.17.0/src/diffusers/models/attention_flax.py | https://github.com/AminRezaei0x443/memory-efficient-attention | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion_img2img.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/alt_diffusion/pipeline_alt_diffusion.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/scripts/convert_kakao_brain_unclip_to_diffusers.py | diffusers0.17.0/scripts/convert_kakao_brain_unclip_to_diffusers.py | https://arena.kakaocdn.net/brainrepo/models/karlo-public/v1.0.0.alpha/0b62380a75e56f073e2844ab5199153d/ViT-L-14_stats.th | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/stable_diffusion_comparison.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/deepfloyd_if/pipeline_if_inpainting_superresolution.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/text_to_video_synthesis/pipeline_text_to_video_zero.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion_inpaint_legacy.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.17.0/src/diffusers/pipelines/deepfloyd_if/pipeline_if_img2img_superresolution.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/vae_flax.py | diffusers0.17.0/src/diffusers/models/vae_flax.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_diffedit.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/schedulers/scheduling_utils_flax.py | http://hostname | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/lora.py | diffusers0.17.0/src/diffusers/models/attention_processor.py | https://github.com/darkstorm2150/sd-scripts/blob/main/docs/train_network_README-en.md#execute-learning | 源码实现 | -| 开发引入 | / | diffusers0.17.0/examples/community/stable_diffusion_repaint.py | 
https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_sde_ve_flax.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_sde_vp.py | https://arxiv.org/abs/2011.13456 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/img2img_inpainting.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.17.0/examples/community/composable_stable_diffusion.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/deepfloyd_if/pipeline_if_img2img_superresolution.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/stable_diffusion_reference.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_depth2img.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/vae_flax.py | diffusers0.17.0/src/diffusers/models/controlnet_flax.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_inpaint_legacy.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/unclip/pipeline_unclip_image_variation.py | diffusers0.17.0/examples/community/unclip_image_interpolation.py | https://huggingface.co/fusing/karlo-image-variations-diffusers/blob/main/feature_extractor/preprocessor_config.json | 相关配置 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/unidiffuser/modeling_uvit.py | diffusers0.17.0/src/diffusers/pipelines/unidiffuser/modeling_uvit.py | https://github.com/thu-ml/unidiffuser/blob/main/libs/uvit_multi_post_ln_v1.py#L104 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_lms_discrete_flax.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_lms_discrete.py | https://github.com/crowsonkb/k-diffusion/blob/481677d114f6ea445aa009cf5bd7a9cdee909e47/k_diffusion/sampling.py#L181 | 源码实现 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion_image_variation.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/schedulers/scheduling_dpmsolver_multistep_inverse.py | https://arxiv.org/abs/2211.01095 | 论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/unclip/pipeline_unclip.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_cycle_diffusion.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_image_variation.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_image_variation.py | https://huggingface.co/lambdalabs/sd-image-variations-diffusers/blob/main/feature_extractor/preprocessor_config.json | 相关配置 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/vq_diffusion/pipeline_vq_diffusion.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.17.0/examples/community/interpolate_stable_diffusion.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开发引入 | / | diffusers0.17.0/examples/community/stable_diffusion_controlnet_img2img.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/stable_diffusion_controlnet_img2img.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/stable_diffusion_tensorrt_txt2img.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/schedulers/scheduling_dpmsolver_singlestep.py | https://arxiv.org/abs/2211.01095 | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_flax_stable_diffusion_inpaint.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_unclip_img2img.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.17.0/examples/community/stable_diffusion_tensorrt_txt2img.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/stable_diffusion_ipex.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/unclip/pipeline_unclip_image_variation.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion_image_variation.py | diffusers0.17.0/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion_image_variation.py | https://huggingface.co/datasets/diffusers/images/resolve/main/benz.jpg | 图片地址 | -| 开源代码引入 | 
https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_instruct_pix2pix.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/img2img_inpainting.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/imagic_stable_diffusion.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/schedulers/scheduling_euler_ancestral_discrete.py | https://github.com/crowsonkb/k-diffusion/blob/481677d114f6ea445aa009cf5bd7a9cdee909e47/k_diffusion/sampling.py#L72 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_pndm_flax.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_pndm_flax.py | https://arxiv.org/pdf/2202.09778.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_upscale.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/stable_diffusion_tensorrt_txt2img.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/unclip/pipeline_unclip.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_latent_upscale.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/schedulers/scheduling_heun_discrete.py | https://github.com/crowsonkb/k-diffusion/blob/481677d114f6ea445aa009cf5bd7a9cdee909e47/k_diffusion/sampling.py#L90 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_unclip.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_instruct_pix2pix.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion_img2img.py | 
https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/unclip_text_interpolation.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/paint_by_example/pipeline_paint_by_example.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_depth2img.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.17.0/src/diffusers/pipelines/alt_diffusion/pipeline_alt_diffusion.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_pix2pix_zero.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_repaint.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_ddim_flax.py | https://arxiv.org/pdf/2010.02502.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/embeddings.py | diffusers0.17.0/src/diffusers/models/embeddings.py | https://github.com/deep-floyd/IF/blob/2f91391f27dd3c468bf174be5805b4cc92980c0b/deepfloyd_if/model/nn.py#L54 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/docs/source/ko/using-diffusers/reproducibility.md | diffusers0.17.0/src/diffusers/utils/testing_utils.py | https://pytorch.org/docs/stable/notes/randomness.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.17.0/examples/community/imagic_stable_diffusion.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/schedulers/scheduling_deis_multistep.py | https://arxiv.org/abs/2204.13902 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/pipeline_utils.py | diffusers0.17.0/src/diffusers/models/modeling_utils.py | https://hf.co/docs/accelerate/main/en/usage_guides/big_modeling#designing-a-device-map | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/examples/community/lpw_stable_diffusion.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_unipc_multistep.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_ddim.py | https://arxiv.org/abs/2205.11487 | 论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddpm_wuerstchen.py | diffusers0.17.0/src/diffusers/models/embeddings_flax.py | https://arxiv.org/abs/2006.11239 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/unet_2d_blocks_flax.py | diffusers0.17.0/src/diffusers/models/unet_2d_blocks_flax.py | https://arxiv.org/abs/2103.06104 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_lcm.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_ddim.py | https://github.com/hojonathanho/diffusion | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.17.0/examples/community/sd_text2img_k_diffusion.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/composable_stable_diffusion.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_inpaint_legacy.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_pix2pix_zero.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion_img2img.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/latent_diffusion/pipeline_latent_diffusion_superresolution.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/unclip/pipeline_unclip_image_variation.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/controlnet/pipeline_controlnet_inpaint.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_unclip_img2img.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/controlnet/pipeline_controlnet_inpaint.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_unclip.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion_upscale.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/scripts/convert_vq_diffusion_to_diffusers.py | diffusers0.17.0/scripts/convert_vq_diffusion_to_diffusers.py | https://github.com/microsoft/VQ-Diffusion/blob/3c98e77f721db7c787b76304fa2c96a36c7b00af/inference_VQ_Diffusion.py#L65 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.17.0/examples/community/stable_diffusion_comparison.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/wuerstchen/text_to_image/train_text_to_image_prior.py | diffusers0.17.0/examples/text_to_image/train_text_to_image_flax.py | https://huggingface.co/docs/datasets/v2.4.0/en/image_load#imagefolder | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/wuerstchen/text_to_image/train_text_to_image_prior.py | diffusers0.17.0/examples/research_projects/lora/train_text_to_image_lora.py | https://huggingface.co/docs/datasets/v2.4.0/en/image_load#imagefolder | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/text_to_video_synthesis/pipeline_text_to_video_synth.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/unidiffuser/modeling_uvit.py | diffusers0.17.0/src/diffusers/pipelines/unidiffuser/modeling_uvit.py | https://people.sc.fsu.edu/~jburkardt/presentations/truncated_normal.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion_safe/pipeline_stable_diffusion_safe.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion_dual_guided.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_pndm_flax.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_heun_discrete.py | https://imagen.research.google/video/paper.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_sag.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_sag.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_sag.py | https://arxiv.org/pdf/2210.00939.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.17.0/src/diffusers/pipelines/controlnet/pipeline_controlnet_inpaint.py | 
https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/stable_diffusion_controlnet_inpaint_img2img.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/experimental/rl/value_guided_sampling.py | https://github.com/jannerm/diffuser | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_image_variation.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_diffedit.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_repaint.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_ddim_inverse.py | https://arxiv.org/pdf/2010.02502.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/README.md | diffusers0.17.0/src/diffusers/pipelines/semantic_stable_diffusion/pipeline_semantic_stable_diffusion.py | https://huggingface.co/CompVis/stable-diffusion-v1-4 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.17.0/examples/community/text_inpainting.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion_text_to_image.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/composable_stable_diffusion.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_pndm_flax.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_dpmsolver_multistep.py | https://imagen.research.google/video/paper.pdf | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion_inpaint.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.17.0/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_img2img.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/deepfloyd_if/pipeline_if_inpainting.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | 
diffusers0.17.0/src/diffusers/pipelines/controlnet/pipeline_controlnet_inpaint.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开发引入 | / | diffusers0.17.0/examples/custom_diffusion/train_custom_diffusion.py | https://www.cs.cmu.edu/~custom-diffusion | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/examples/community/tiled_upscaling.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/configuration_utils.py | http://hostname | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_unipc_multistep.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_deis_multistep.py | https://arxiv.org/abs/2205.11487 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/imagic_stable_diffusion.py | diffusers0.17.0/examples/community/imagic_stable_diffusion.py | https://arxiv.org/pdf/2210.09276.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_diffedit.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/controlnet/pipeline_controlnet.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/utils/import_utils.py | diffusers0.17.0/src/diffusers/utils/import_utils.py | https://github.com/huggingface/accelerate/blob/874c4967d94badd24f893064cc3bef45f57cadf7/src/accelerate/utils/versions.py#L338 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/stable_diffusion_tensorrt_txt2img.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/training_utils.py | diffusers0.17.0/examples/text_to_image/train_text_to_image_lora.py | https://github.com/TiankaiHang/Min-SNR-Diffusion-Training/blob/521b624bd70c67cee4bdf49225915f5945a872e3/guided_diffusion/gaussian_diffusion.py#L1026 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_sag.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_diffedit.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.17.0/src/diffusers/pipelines/controlnet/pipeline_controlnet_inpaint.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | 
diffusers0.17.0/examples/community/stable_diffusion_comparison.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_pix2pix_zero.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/ddim/pipeline_ddim.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_pndm_flax.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_ipndm.py | https://arxiv.org/abs/2202.09778 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/unclip_image_interpolation.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/examples/community/stable_diffusion_controlnet_reference.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/README.md | diffusers0.17.0/examples/community/imagic_stable_diffusion.py | https://huggingface.co/CompVis/stable-diffusion-v1-4 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion_image_variation.py | diffusers0.17.0/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | https://huggingface.co/datasets/diffusers/images/resolve/main/benz.jpg | 图片地址 | -| 开发引入 | / | diffusers0.17.0/examples/community/sd_text2img_k_diffusion.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_inpaint_legacy.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开发引入 | / | diffusers0.17.0/examples/community/stable_diffusion_comparison.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_unclip.py | diffusers0.17.0/examples/community/bit_diffusion.py | https://arxiv.org/pdf/2006.11239.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/modeling_flax_utils.py | diffusers0.17.0/src/diffusers/models/modeling_flax_utils.py | https://github.com/google/flax/issues/1261 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion_inpaint.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_flax_stable_diffusion_img2img.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/seed_resize_stable_diffusion.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/stable_diffusion_controlnet_inpaint_img2img.py | 
https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion_image_variation.py | https://huggingface.co/docs/transformers/model_doc/bert | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_inpaint.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/schedulers/scheduling_dpmsolver_multistep.py | https://arxiv.org/abs/2206.00927 | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/vq_diffusion/pipeline_vq_diffusion.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_pndm_flax.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_pndm.py | https://github.com/CompVis/latent-diffusion/pull/51 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/src/diffusers/pipelines/controlnet/pipeline_controlnet.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/utils/dynamic_modules_utils.py | diffusers0.17.0/src/diffusers/configuration_utils.py | https://huggingface.co/docs/hub/models-gated#gated-models | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/research_projects/mulit_token_textual_inversion/multi_token_clip.py | diffusers0.17.0/examples/research_projects/mulit_token_textual_inversion/multi_token_clip.py | https://github.com/rinongal/textual_inversion/pull/119 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.17.0/examples/community/sd_text2img_k_diffusion.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/repaint/pipeline_repaint.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/loaders.py | https://github.com/huggingface/diffusers/pull/3490#issuecomment-1555059060 | 源码实现 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/loaders.py | https://huggingface.co/","https://hf.co/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.17.0/examples/community/imagic_stable_diffusion.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开发引入 | / | diffusers0.17.0/examples/community/clip_guided_images_mixing_stable_diffusion.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/vq_diffusion/pipeline_vq_diffusion.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.17.0/src/diffusers/pipelines/semantic_stable_diffusion/pipeline_semantic_stable_diffusion.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_instruct_pix2pix.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion_inpaint.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/repaint/pipeline_repaint.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_image_variation.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/setup.py | diffusers0.17.0/setup.py | https://test.pypi.org/legacy/ | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/utils/dynamic_modules_utils.py | http://hostname | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/src/diffusers/pipelines/audioldm/pipeline_audioldm.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/utils/dynamic_modules_utils.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_utils_flax.py | https://huggingface.co/docs/hub/models-gated#gated-models | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/models/attention_processor.py | https://facebookresearch.github.io/xformers/components/ops.html#xformers.ops.AttentionOpBase | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.17.0/examples/community/multilingual_stable_diffusion.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/composable_stable_diffusion.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.17.0/src/diffusers/pipelines/controlnet/pipeline_controlnet.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/sd_text2img_k_diffusion.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/setup.py | diffusers0.17.0/setup.py | patrick@huggingface.co | 邮箱地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_model_editing.py | 
https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/examples/community/stable_diffusion_controlnet_img2img.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_attend_and_excite.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/loaders.py | diffusers0.17.0/src/diffusers/loaders.py | https://github.com/huggingface/diffusers/pull/2918 | 源码实现 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/loaders.py | https://civitai.com/models/3036?modelVersionId=9857 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/controlnet/train_controlnet_flax.py | diffusers0.17.0/examples/controlnet/train_controlnet_flax.py | https://huggingface.co/docs/datasets/package_reference/main_classes#datasets.Dataset.load_from_disk | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/vae_flax.py | diffusers0.17.0/src/diffusers/models/controlnet_flax.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/lpw_stable_diffusion.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/multilingual_stable_diffusion.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/vae_flax.py | diffusers0.17.0/src/diffusers/models/vae_flax.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/scripts/convert_diffusers_to_original_stable_diffusion.py | diffusers0.17.0/scripts/convert_diffusers_to_original_stable_diffusion.py | https://github.com/pytorch/pytorch/blob/master/test/cpp/api/modules.cpp | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.17.0/examples/community/stable_diffusion_repaint.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/setup.py | diffusers0.17.0/setup.py | https://github.com/allenai/allennlp/blob/main/setup.py | 源码实现 | -| 开发引入 | / | diffusers0.17.0/examples/community/text_inpainting.py | https://huggingface.co/docs/transformers/model_doc/clipseg | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.17.0/src/diffusers/pipelines/audioldm/pipeline_audioldm.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_sde_ve_flax.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_karras_ve.py | https://arxiv.org/abs/2011.13456 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/stable_diffusion_controlnet_reference.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 
| -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_pix2pix_zero.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_img2img.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_panorama.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/utils/dynamic_modules_utils.py | diffusers0.17.0/src/diffusers/loaders.py | https://huggingface.co/docs/hub/models-gated#gated-models | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/lora.py | diffusers0.17.0/src/diffusers/loaders.py | https://github.com/darkstorm2150/sd-scripts/blob/main/docs/train_network_README-en.md#execute-learning | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/paint_by_example/pipeline_paint_by_example.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/docs/source/en/api/schedulers/ipndm.md | diffusers0.17.0/src/diffusers/schedulers/scheduling_ipndm.py | https://github.com/crowsonkb/v-diffusion-pytorch/blob/987f8985e38208345c1959b0ea767a625831cc9b/diffusion/sampling.py#L296 | 源码实现 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/deepfloyd_if/pipeline_if_img2img_superresolution.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_unipc_multistep.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_unipc_multistep.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/schedulers/scheduling_dpmsolver_multistep.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_repaint.py | diffusers0.17.0/src/diffusers/pipelines/repaint/pipeline_repaint.py | https://arxiv.org/pdf/2201.09865.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/stable_diffusion_repaint.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/audioldm/pipeline_audioldm.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_k_diffusion.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/clip_guided_stable_diffusion_img2img.py | diffusers0.17.0/examples/community/clip_guided_stable_diffusion.py | https://github.com/Jack000/glid-3-xl | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | 
diffusers0.17.0/examples/community/stable_diffusion_comparison.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_pndm_flax.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_k_dpm_2_ancestral_discrete.py | https://imagen.research.google/video/paper.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_inpaint_legacy.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/t2i_adapter/train_t2i_adapter_sdxl.py | diffusers0.17.0/examples/controlnet/train_controlnet.py | https://huggingface.co/docs/datasets/v2.0.0/en/dataset_script | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/examples/community/composable_stable_diffusion.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/text_to_video_synthesis/pipeline_text_to_video_synth_img2img.py | diffusers0.17.0/src/diffusers/pipelines/text_to_video_synthesis/pipeline_text_to_video_synth.py | https://github.com/modelscope/modelscope/blob/1509fdb973e5871f37148a4b5e5964cafd43e64d/modelscope/pipelines/multi_modal/text_to_video_synthesis_pipeline.py#L78 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.17.0/examples/community/wildcard_stable_diffusion.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.17.0/examples/community/stable_diffusion_tensorrt_txt2img.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.17.0/src/diffusers/pipelines/text_to_video_synthesis/pipeline_text_to_video_zero.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开发引入 | / | diffusers0.17.0/examples/community/stable_diffusion_controlnet_reference.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_pndm_flax.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_ddpm_flax.py | https://github.com/ermongroup/ddim | 源码实现 | -| 开发引入 | / | diffusers0.17.0/examples/community/unclip_image_interpolation.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_lcm.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_ddim.py | https://github.com/huggingface/diffusers/blob/74fd735eb073eb1d774b1ab4154a0876eb82f055/examples/dreambooth/train_dreambooth.py#L506 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/pipeline_utils.py | diffusers0.17.0/src/diffusers/pipelines/pipeline_utils.py | https://huggingface.co/docs/hub/security-tokens | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_panorama.py | 
diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_panorama.py | https://arxiv.org/abs/2302.08113 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/README.md | diffusers0.17.0/examples/community/seed_resize_stable_diffusion.py | https://huggingface.co/CompVis/stable-diffusion-v1-4 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_inpaint.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_latent_upscale.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/pipeline_flax_utils.py | https://huggingface.co/diffusers/installation.html#offline-mode | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/imagic_stable_diffusion.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/score_sde_ve/pipeline_score_sde_ve.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/wuerstchen/text_to_image/train_text_to_image_prior.py | diffusers0.17.0/examples/text_to_image/train_text_to_image.py | https://huggingface.co/docs/datasets/v2.4.0/en/image_load#imagefolder | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion_safe/pipeline_stable_diffusion_safe.py | https://arxiv.org/abs/2211.05105 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/docs/source/ko/using-diffusers/controlling_generation.md | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_sag.py | https://arxiv.org/abs/2210.00939 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.17.0/src/diffusers/pipelines/deepfloyd_if/pipeline_if_img2img.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开发引入 | / | diffusers0.17.0/examples/community/stable_diffusion_reference.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.17.0/src/diffusers/pipelines/vq_diffusion/pipeline_vq_diffusion.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/schedulers/scheduling_karras_ve.py | https://arxiv.org/abs/2206.00364 | 论文地址 | -| 开发引入 | / | diffusers0.17.0/examples/community/unclip_image_interpolation.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_panorama.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion_image_variation.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/deepfloyd_if/pipeline_if_inpainting_superresolution.py | https://arxiv.org/pdf/2205.11487.pdf | 
论文地址 | -| 开发引入 | / | diffusers0.17.0/examples/community/stable_diffusion_repaint.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/vae_flax.py | diffusers0.17.0/src/diffusers/models/unet_2d_condition_flax.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/training_utils.py | diffusers0.17.0/src/diffusers/loaders.py | https://pytorch.org/tutorials/beginner/saving_loading_models.html#what-is-a-state-dict | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.17.0/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_diffedit.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_model_editing.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/audioldm/pipeline_audioldm.py | https://huggingface.co/docs/transformers/model_doc/roberta#transformers.RobertaTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/stable_diffusion_mega.py | diffusers0.17.0/examples/community/stable_diffusion_mega.py | https://huggingface.co/docs/diffusers/api/pipelines/stable_diffusion#diffusers.StableDiffusionImg2ImgPipeline | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.17.0/src/diffusers/pipelines/controlnet/pipeline_controlnet.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_pndm_flax.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_dpmsolver_sde.py | https://imagen.research.google/video/paper.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/embeddings.py | diffusers0.17.0/src/diffusers/models/embeddings.py | https://arxiv.org/abs/2102.12092 | 论文地址 | -| 开发引入 | / | diffusers0.17.0/examples/community/lpw_stable_diffusion.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_ddim.py | https://arxiv.org/pdf/2305.08891.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_k_diffusion.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/stable_diffusion_controlnet_img2img.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion_img2img.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/schedulers/scheduling_dpmsolver_singlestep.py | https://arxiv.org/abs/2206.00927 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/src/diffusers/pipelines/deepfloyd_if/pipeline_if_superresolution.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.17.0/examples/community/tiled_upscaling.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.17.0/src/diffusers/pipelines/versatile_diffusion/modeling_text_unet.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/score_sde_ve/pipeline_score_sde_ve.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.17.0/src/diffusers/pipelines/controlnet/pipeline_controlnet_img2img.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开发引入 | / | diffusers0.17.0/examples/community/unclip_text_interpolation.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/pipeline_flax_utils.py | http://hostname | 相关说明 | -| 开发引入 | / | diffusers0.17.0/examples/community/speech_to_image_diffusion.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开发引入 | / | diffusers0.17.0/examples/community/multilingual_stable_diffusion.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_sde_vp.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_sde_ve_flax.py | https://github.com/yang-song/score_sde_pytorch | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/unidiffuser/modeling_uvit.py | diffusers0.17.0/src/diffusers/pipelines/unidiffuser/pipeline_unidiffuser.py | https://github.com/baofff/U-ViT | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/docs/source/ko/using-diffusers/controlling_generation.md | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_pix2pix_zero.py | https://arxiv.org/abs/2302.03027 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_unipc_multistep.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_dpmsolver_multistep.py | https://arxiv.org/abs/2205.11487 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/tests/pipelines/stable_diffusion/test_stable_diffusion.py | 
diffusers0.17.0/src/diffusers/loaders.py | https://huggingface.co/runwayml/stable-diffusion-v1-5/blob/main/v1-5-pruned-emaonly.ckpt | 预训练模型 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_unclip_img2img.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_inpaint_legacy.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/clip_guided_stable_diffusion_img2img.py | diffusers0.17.0/examples/community/clip_guided_stable_diffusion_img2img.py | https://github.com/Jack000/glid-3-xl | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_panorama.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.17.0/src/diffusers/pipelines/controlnet/pipeline_flax_controlnet.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_ddim_flax.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion_image_variation.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/unidiffuser/pipeline_unidiffuser.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/src/diffusers/pipelines/semantic_stable_diffusion/pipeline_semantic_stable_diffusion.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/unet_2d_condition_flax.py | diffusers0.17.0/src/diffusers/models/unet_2d_blocks_flax.py | https://arxiv.org/abs/2112.05682 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_flax_stable_diffusion.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_model_editing.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_sde_ve_flax.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_karras_ve_flax.py | https://arxiv.org/abs/2011.13456 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/interpolate_stable_diffusion.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_diffedit.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/unidiffuser/pipeline_unidiffuser.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/examples/community/stable_diffusion_repaint.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_diffedit.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/wildcard_stable_diffusion.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/wildcard_stable_diffusion.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.17.0/examples/community/stable_diffusion_tensorrt_img2img.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_pndm_flax.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_pndm_flax.py | https://imagen.research.google/video/paper.pdf | 论文地址 | -| 开发引入 | / | diffusers0.17.0/examples/community/stable_diffusion_ipex.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/training_utils.py | diffusers0.17.0/examples/research_projects/onnxruntime/text_to_image/train_text_to_image.py | https://github.com/TiankaiHang/Min-SNR-Diffusion-Training/blob/521b624bd70c67cee4bdf49225915f5945a872e3/guided_diffusion/gaussian_diffusion.py#L847-L849 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.17.0/src/diffusers/pipelines/controlnet/pipeline_flax_controlnet.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/examples/community/stable_diffusion_comparison.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/stable_diffusion_tensorrt_img2img.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/pipeline_utils.py | diffusers0.17.0/src/diffusers/pipelines/pipeline_utils.py | https://hf.co/docs/accelerate/main/en/usage_guides/big_modeling#designing-a-device-map | 相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/seed_resize_stable_diffusion.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_depth2img.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion_upscale.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion_image_variation.py | https://huggingface.co/docs/transformers/model_doc/bert#transformers.BertTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/examples/community/clip_guided_stable_diffusion_img2img.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.17.0/examples/community/lpw_stable_diffusion.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/configuration_utils.py | diffusers0.17.0/src/diffusers/configuration_utils.py | https://github.com/huggingface/diffusers/pull/3129 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_flax_stable_diffusion_img2img.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/docs/source/en/api/pipelines/vq_diffusion.md | diffusers0.17.0/scripts/convert_vq_diffusion_to_diffusers.py | https://github.com/microsoft/VQ-Diffusion | 源码实现 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/audioldm/pipeline_audioldm.py | https://huggingface.co/docs/transformers/main/en/model_doc/speecht5#transformers.SpeechT5HifiGan | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.17.0/examples/community/lpw_stable_diffusion.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/schedulers/scheduling_dpmsolver_multistep_inverse.py | https://github.com/openai/guided-diffusion | 源码实现 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/controlnet/pipeline_controlnet_img2img.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_pndm_flax.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_pndm_flax.py | https://github.com/CompVis/latent-diffusion/pull/51 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/pipeline_utils.py | diffusers0.17.0/src/diffusers/pipelines/pipeline_flax_utils.py | https://huggingface.co/docs/hub/security-tokens | 相关说明 | -| 开发引入 | / | diffusers0.17.0/examples/research_projects/onnxruntime/unconditional_image_generation/train_unconditional.py | https://www.tensorflow.org/tensorboard","https://www.wandb.ai | 相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_unclip.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_repaint.py | https://arxiv.org/pdf/2006.11239.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_pndm_flax.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_euler_ancestral_discrete.py | https://imagen.research.google/video/paper.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_dpmsolver_singlestep.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_dpmsolver_multistep.py | https://github.com/LuChengTHU/dpm-solver | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_sag.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/schedulers/scheduling_dpmsolver_multistep_flax.py | https://arxiv.org/abs/2211.01095 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.17.0/examples/community/text_inpainting.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/sd_text2img_k_diffusion.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/vq_diffusion/pipeline_vq_diffusion.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/utils/dynamic_modules_utils.py | diffusers0.17.0/src/diffusers/utils/dynamic_modules_utils.py | https://huggingface.co/docs/hub/models-gated#gated-models | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/controlnet/pipeline_controlnet.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/examples/community/stable_diffusion_controlnet_inpaint.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_pndm_flax.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_ddim.py | https://imagen.research.google/video/paper.pdf | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/models/modeling_utils.py | https://huggingface.co/diffusers/installation.html#offline-mode | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_pndm_flax.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_lms_discrete.py | https://imagen.research.google/video/paper.pdf | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/deepfloyd_if/pipeline_if_inpainting_superresolution.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/modeling_flax_pytorch_utils.py | diffusers0.17.0/src/diffusers/models/modeling_flax_pytorch_utils.py | https://github.com/huggingface/transformers/blob/c603c80f46881ae18b2ca50770ef65fa4033eacd/src/transformers/modeling_flax_pytorch_utils.py#L69 | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/diffusers/docs/source/ko/using-diffusers/custom_pipeline_overview.md | diffusers0.17.0/src/diffusers/models/controlnet_flax.py | https://arxiv.org/abs/2302.05543 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/vae_flax.py | diffusers0.17.0/src/diffusers/models/controlnet_flax.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/text_to_video_synthesis/pipeline_text_to_video_zero.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_flax_stable_diffusion_inpaint.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/stable_diffusion_tensorrt_txt2img.py | diffusers0.17.0/examples/community/stable_diffusion_tensorrt_txt2img.py | https://pypi.ngc.nvidia.com | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/models/vae_flax.py | https://arxiv.org/abs/2112.10752 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/controlnet/pipeline_controlnet_inpaint.py | diffusers0.17.0/src/diffusers/pipelines/controlnet/pipeline_controlnet_inpaint.py | https://huggingface.co/lllyasviel/control_v11p_sd15_inpaint | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_lcm.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_ddim_flax.py | https://github.com/pesser/pytorch_diffusion | 源码实现 | -| 开发引入 | / | diffusers0.17.0/examples/community/stable_diffusion_controlnet_inpaint.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开发引入 | / | diffusers0.17.0/examples/community/stable_diffusion_comparison.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/imagic_stable_diffusion.py | diffusers0.17.0/examples/community/imagic_stable_diffusion.py | https://github.com/justinpinkney/stable-diffusion/blob/main/notebooks/imagic.ipynb | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_euler_discrete_flax.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_euler_discrete.py | https://github.com/crowsonkb/k-diffusion/blob/481677d114f6ea445aa009cf5bd7a9cdee909e47/k_diffusion/sampling.py#L51 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_instruct_pix2pix.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/training_utils.py | diffusers0.17.0/examples/text_to_image/train_text_to_image_lora.py | https://github.com/TiankaiHang/Min-SNR-Diffusion-Training/blob/521b624bd70c67cee4bdf49225915f5945a872e3/guided_diffusion/gaussian_diffusion.py#L847-L849 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/utils/check_table.py | diffusers0.17.0/utils/check_table.py | https://stackoverflow.com/questions/29916065/how-to-do-camelcase-split-in-python | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | 
diffusers0.17.0/src/diffusers/pipelines/controlnet/pipeline_controlnet_img2img.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.17.0/src/diffusers/pipelines/alt_diffusion/pipeline_alt_diffusion_img2img.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_attend_and_excite.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_cycle_diffusion.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_panorama.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_upscale.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/kandinsky/pipeline_kandinsky_img2img.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/checkpoint_merger.py | diffusers0.17.0/examples/community/checkpoint_merger.py | https://github.com/huggingface/diffusers/issues/877 | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/schedulers/scheduling_utils.py | https://huggingface.co/transformers/installation.html#offline-mode | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/pipeline_utils.py | https://huggingface.co/diffusers/installation.html#offline-mode | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/schedulers/scheduling_heun_discrete.py | https://arxiv.org/pdf/2206.00364.pdf | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion_inpaint_legacy.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_flax_stable_diffusion_inpaint.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/semantic_stable_diffusion/pipeline_semantic_stable_diffusion.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_inpaint.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/deepfloyd_if/pipeline_if_inpainting.py | 
https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.17.0/examples/community/stable_diffusion_tensorrt_inpaint.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/deepfloyd_if/pipeline_if_img2img.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_inpaint.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_unclip_img2img.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/t5_film_transformer.py | diffusers0.17.0/src/diffusers/models/attention.py | https://arxiv.org/abs/1606.08415 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/bit_diffusion.py | diffusers0.17.0/examples/community/bit_diffusion.py | https://github.com/pytorch/pytorch/issues/27072 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/tiled_upscaling.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/tiled_upscaling.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_pix2pix_zero.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_unclip_img2img.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/utils/testing_utils.py | diffusers0.17.0/src/diffusers/utils/testing_utils.py | https://github.com/pytest-dev/pytest/blob/897f151e/src/_pytest/runner.py#L66 | 源码实现 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_latent_upscale.py | https://huggingface.co/docs/transformers/main/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/kandinsky/pipeline_kandinsky_img2img.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/stochastic_karras_ve/pipeline_stochastic_karras_ve.py | https://arxiv.org/abs/2206.00364 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | 
diffusers0.17.0/src/diffusers/schedulers/scheduling_ddim.py | https://arxiv.org/abs/2210.05559 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/stable_diffusion_comparison.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_pndm_flax.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_lms_discrete_flax.py | https://imagen.research.google/video/paper.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/tests/pipelines/controlnet/test_controlnet_img2img.py | diffusers0.17.0/examples/community/stable_diffusion_controlnet_inpaint_img2img.py | https://github.com/haofanwang/ControlNet-for-Diffusers/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/vae_flax.py | diffusers0.17.0/src/diffusers/models/unet_2d_condition_flax.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 相关说明 | -| 开发引入 | / | diffusers0.17.0/examples/community/unclip_text_interpolation.py | https://en.wikipedia.org/wiki/Slerp | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/resnet.py | diffusers0.17.0/src/diffusers/models/resnet.py | https://github.com/pytorch/pytorch/issues/86679 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/docs/source/en/api/schedulers/dpm_discrete_ancestral.md | diffusers0.17.0/src/diffusers/schedulers/scheduling_k_dpm_2_ancestral_discrete.py | https://github.com/crowsonkb/k-diffusion | 源码实现 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion_text_to_image.py | https://huggingface.co/docs/transformers/model_doc/bert | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/kandinsky/pipeline_kandinsky_prior.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion_img2img.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_sag.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_model_editing.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_model_editing.py | https://arxiv.org/abs/2303.08084 | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_img2img.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_ddim_inverse.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stochastic_karras_ve/pipeline_stochastic_karras_ve.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | 
diffusers0.17.0/src/diffusers/pipelines/alt_diffusion/pipeline_alt_diffusion_img2img.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/attention_flax.py | diffusers0.17.0/src/diffusers/models/attention_flax.py | https://arxiv.org/abs/2002.05202 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/src/diffusers/pipelines/deepfloyd_if/pipeline_if_inpainting_superresolution.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/vq_diffusion/pipeline_vq_diffusion.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_flax_stable_diffusion.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/seed_resize_stable_diffusion.py | diffusers0.17.0/examples/community/seed_resize_stable_diffusion.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/semantic_stable_diffusion/pipeline_semantic_stable_diffusion.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/dance_diffusion/pipeline_dance_diffusion.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/audioldm/pipeline_audioldm.py | https://huggingface.co/laion/clap-htsat-unfused | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/training_utils.py | diffusers0.17.0/examples/text_to_image/train_text_to_image.py | https://github.com/TiankaiHang/Min-SNR-Diffusion-Training/blob/521b624bd70c67cee4bdf49225915f5945a872e3/guided_diffusion/gaussian_diffusion.py#L1026 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/scripts/convert_unidiffuser_to_diffusers.py | diffusers0.17.0/scripts/convert_unidiffuser_to_diffusers.py | https://huggingface.co/thu-ml/unidiffuser-v1 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/unconditional_image_generation/train_unconditional.py | diffusers0.17.0/examples/unconditional_image_generation/train_unconditional.py | https://github.com/huggingface/accelerate/pull/962/files | 源码实现 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_depth2img.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_pix2pix_zero.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_diffedit.py | https://arxiv.org/pdf/2210.11427.pdf | 论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_inpaint.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/examples/community/clip_guided_stable_diffusion.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/unidiffuser/pipeline_unidiffuser.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/controlnet/pipeline_controlnet.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/latent_diffusion/pipeline_latent_diffusion.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/wuerstchen/text_to_image/train_text_to_image_prior.py | diffusers0.17.0/examples/unconditional_image_generation/train_unconditional.py | https://huggingface.co/docs/datasets/v2.4.0/en/image_load#imagefolder | 相关说明 | -| 开发引入 | / | diffusers0.17.0/examples/community/text_inpainting.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.17.0/examples/community/tiled_upscaling.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/attention_flax.py | diffusers0.17.0/src/diffusers/models/attention.py | https://arxiv.org/abs/2002.05202 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_inpaint_legacy.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_inpaint_legacy.py | https://github.com/huggingface/diffusers/pull/3533 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/controlnet/pipeline_controlnet_img2img.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.17.0/examples/community/stable_diffusion_tensorrt_inpaint.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.17.0/examples/community/stable_diffusion_controlnet_inpaint.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.17.0/src/diffusers/pipelines/deepfloyd_if/pipeline_if_superresolution.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/controlnet/pipeline_flax_controlnet.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion_safe/pipeline_stable_diffusion_safe.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | 
diffusers0.17.0/src/diffusers/pipelines/stable_diffusion_safe/pipeline_stable_diffusion_safe.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion_safe/pipeline_stable_diffusion_safe.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_latent_upscale.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion.py | https://arxiv.org/pdf/2305.08891.pdf | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/models/modeling_utils.py | http://hostname | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/schedulers/scheduling_dpmsolver_multistep_flax.py | https://arxiv.org/abs/2206.00927","https://arxiv.org/abs/2211.01095 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.17.0/examples/community/interpolate_stable_diffusion.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_unclip.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_ddpm_flax.py | https://arxiv.org/pdf/2006.11239.pdf | 论文地址 | -| 开发引入 | / | diffusers0.17.0/examples/community/unclip_image_interpolation.py | https://en.wikipedia.org/wiki/Slerp | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_repaint.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_repaint.py | https://arxiv.org/pdf/2010.02502.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_cycle_diffusion.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开发引入 | / | diffusers0.17.0/examples/community/seed_resize_stable_diffusion.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/controlnet/pipeline_flax_controlnet.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.FlaxCLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/wuerstchen/text_to_image/train_text_to_image_prior.py | diffusers0.17.0/examples/research_projects/onnxruntime/unconditional_image_generation/train_unconditional.py | https://huggingface.co/docs/datasets/v2.4.0/en/image_load#imagefolder | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/docs/source/en/api/schedulers/deis.md | diffusers0.17.0/src/diffusers/schedulers/scheduling_deis_multistep.py | https://github.com/qsh-zh/deis | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_pndm_flax.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_euler_discrete.py | https://imagen.research.google/video/paper.pdf | 论文地址 | -| 开发引入 | / | diffusers0.17.0/examples/community/interpolate_stable_diffusion.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开发引入 | / | 
diffusers0.17.0/src/diffusers/pipelines/ddpm/pipeline_ddpm.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/alt_diffusion/pipeline_alt_diffusion_img2img.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_inpaint_legacy.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/kandinsky/pipeline_kandinsky.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_k_diffusion.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/deepfloyd_if/pipeline_if_superresolution.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_flax_stable_diffusion_inpaint.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/controlnet/pipeline_controlnet_img2img.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.17.0/src/diffusers/models/unet_3d_condition.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/controlnet/pipeline_controlnet.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_inpaint.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_pndm_flax.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_dpmsolver_multistep_inverse.py | https://imagen.research.google/video/paper.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_pndm_flax.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_pndm.py | https://arxiv.org/pdf/2202.09778.pdf | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_inpaint.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/src/diffusers/pipelines/deepfloyd_if/pipeline_if_inpainting.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddpm_wuerstchen.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_ddpm_flax.py | https://arxiv.org/abs/2006.11239 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | 
diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_depth2img.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开发引入 | / | diffusers0.17.0/examples/community/mixture.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开发引入 | / | diffusers0.17.0/examples/community/composable_stable_diffusion.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.17.0/src/diffusers/pipelines/alt_diffusion/pipeline_alt_diffusion.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_img2img.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/training_utils.py | diffusers0.17.0/examples/controlnet/train_controlnet_flax.py | https://github.com/TiankaiHang/Min-SNR-Diffusion-Training/blob/521b624bd70c67cee4bdf49225915f5945a872e3/guided_diffusion/gaussian_diffusion.py#L847-L849 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.17.0/src/diffusers/loaders.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/semantic_stable_diffusion/pipeline_semantic_stable_diffusion.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_upscale.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_unipc_multistep.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_dpmsolver_multistep_flax.py | https://arxiv.org/abs/2205.11487 | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_panorama.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_inpaint_legacy.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion_img2img.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_unclip.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开发引入 | / | diffusers0.17.0/examples/community/unclip_text_interpolation.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/unclip_image_interpolation.py | diffusers0.17.0/examples/community/unclip_image_interpolation.py | 
https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPVisionModelWithProjection | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_image_variation.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddpm_parallel.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_ddpm_flax.py | https://arxiv.org/abs/2205.09991 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_repaint.py | diffusers0.17.0/examples/community/clip_guided_stable_diffusion.py | https://arxiv.org/pdf/2010.02502.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_repaint.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_repaint.py | https://arxiv.org/pdf/2201.09865.pdf | 论文地址 | -| 开发引入 | / | diffusers0.17.0/examples/community/seed_resize_stable_diffusion.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/schedulers/scheduling_dpmsolver_multistep.py | https://arxiv.org/abs/2206.00927","https://arxiv.org/abs/2211.01095 | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/latent_diffusion_uncond/pipeline_latent_diffusion_uncond.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.17.0/examples/community/wildcard_stable_diffusion.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/dreambooth/train_dreambooth.py | diffusers0.17.0/examples/dreambooth/train_dreambooth_lora.py | https://dreambooth.github.io/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.17.0/src/diffusers/loaders.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/schedulers/scheduling_utils.py | http://hostname | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/loaders.py | diffusers0.17.0/src/diffusers/loaders.py | https://huggingface.co/WarriorMama777/OrangeMixs/blob/main/Models/AbyssOrangeMix/AbyssOrangeMix.safetensors | 相关说明 | -| 开发引入 | / | diffusers0.17.0/scripts/convert_vq_diffusion_to_diffusers.py | https://facevcstandard.blob.core.windows.net/v-zhictang/Improved-VQ-Diffusion_model_release/ithq_vqvae.pth?sv=2020-10-02&st=2022-05-30T15%3A17%3A18Z&se=2030-05-31T15%3A17%3A00Z&sr=b&sp=r&sig=1jVavHFPp | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_euler_discrete.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_euler_discrete.py | https://github.com/crowsonkb/k-diffusion/blob/686dbad0f39640ea25c8a8c6a6e56bb40eacefa2/k_diffusion/sampling.py#L17 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/tests/pipelines/controlnet/test_controlnet_img2img.py | diffusers0.17.0/src/diffusers/pipelines/controlnet/pipeline_controlnet_inpaint.py | https://github.com/haofanwang/ControlNet-for-Diffusers/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | 
diffusers0.17.0/examples/community/stable_diffusion_ipex.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.17.0/examples/community/stable_diffusion_repaint.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.17.0/examples/community/seed_resize_stable_diffusion.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion_inpaint.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/pipeline_utils.py | diffusers0.17.0/src/diffusers/pipelines/pipeline_utils.py | https://huggingface.co/docs/diffusers/main/en/using-diffusers/contribute_pipeline | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/unconditional_image_generation/train_unconditional.py | diffusers0.17.0/examples/research_projects/onnxruntime/unconditional_image_generation/train_unconditional.py | https://github.com/huggingface/accelerate/pull/962/files | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/unclip_text_interpolation.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/src/diffusers/pipelines/text_to_video_synthesis/pipeline_text_to_video_synth.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/unidiffuser/modeling_text_decoder.py | https://arxiv.org/pdf/2303.06555.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/text_to_video_synthesis/pipeline_text_to_video_synth.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/ddpm/pipeline_ddpm.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/interpolate_stable_diffusion.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开发引入 | / | diffusers0.17.0/examples/community/lpw_stable_diffusion_onnx.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_sag.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_img2img.py | 
https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/schedulers/scheduling_vq_diffusion.py | https://arxiv.org/abs/2111.14822 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion_inpaint_legacy.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/semantic_stable_diffusion/pipeline_semantic_stable_diffusion.py | https://arxiv.org/pdf/2301.12247.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_pndm_flax.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_k_dpm_2_discrete.py | https://imagen.research.google/video/paper.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/stable_diffusion_controlnet_inpaint_img2img.py | diffusers0.17.0/examples/community/stable_diffusion_controlnet_inpaint.py | https://github.com/CompVis/latent-diffusion/raw/main/data/inpainting_examples/overture-creations-5sI6fQgYIuo_mask.png | 图片地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion_safe/pipeline_stable_diffusion_safe.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.17.0/src/diffusers/models/unet_2d_condition.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/.github/PULL_REQUEST_TEMPLATE.md | diffusers0.17.0/docs/source/en/using-diffusers/using_safetensors | https://github.com/huggingface/safetensors | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/scripts/convert_unidiffuser_to_diffusers.py | diffusers0.17.0/scripts/convert_unidiffuser_to_diffusers.py | https://github.com/thu-ml/unidiffuser/blob/main/configs/sample_unidiffuser_v1.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/kandinsky/pipeline_kandinsky_inpaint.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/optimization.py | diffusers0.17.0/src/diffusers/optimization.py | https://github.com/google-research/bert/blob/f39e881b169b9d53bea03d2d341b31707a6c052b/optimization.py#L37 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/stable_diffusion_controlnet_inpaint.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_repaint.py | diffusers0.17.0/examples/community/bit_diffusion.py | https://arxiv.org/pdf/2010.02502.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_lcm.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_ddim.py | 
https://github.com/pesser/pytorch_diffusion | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion_img2img.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_flax_stable_diffusion.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_pndm_flax.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_pndm.py | https://github.com/ermongroup/ddim | 源码实现 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/latent_diffusion/pipeline_latent_diffusion.py | https://huggingface.co/docs/transformers/model_doc/bert#transformers.BertTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/vae_flax.py | diffusers0.17.0/src/diffusers/models/unet_2d_condition_flax.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/scripts/convert_stable_diffusion_controlnet_to_onnx.py | diffusers0.17.0/scripts/convert_stable_diffusion_checkpoint_to_onnx.py | https://github.com/huggingface/transformers/pull/18515/files | 源码实现 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/schedulers/scheduling_euler_discrete.py | https://arxiv.org/abs/2206.00364 | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/schedulers/scheduling_unipc_multistep.py | https://arxiv.org/abs/2302.04867","https://github.com/wl-zhao/UniPC | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/training_utils.py | diffusers0.17.0/examples/text_to_image/train_text_to_image.py | https://github.com/TiankaiHang/Min-SNR-Diffusion-Training/blob/521b624bd70c67cee4bdf49225915f5945a872e3/guided_diffusion/gaussian_diffusion.py#L847-L849 | 源码实现 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/unclip/text_proj.py | https://arxiv.org/abs/2204.06125 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_image_variation.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/utils/dynamic_modules_utils.py | diffusers0.17.0/src/diffusers/pipelines/pipeline_utils.py | https://huggingface.co/docs/hub/models-gated#gated-models | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_upscale.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/utils/stale.py | diffusers0.17.0/utils/stale.py | https://github.com/allenai/allennlp | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/README.md | diffusers0.17.0/src/diffusers/pipelines/pipeline_utils.py | https://github.com/huggingface/diffusers/tree/main/examples/community | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_sag.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 
相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/unclip/pipeline_unclip_image_variation.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/text_to_video_synthesis/pipeline_text_to_video_synth.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/unclip_image_interpolation.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion_dual_guided.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/kandinsky/pipeline_kandinsky_prior.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/models/autoencoder_kl.py | https://arxiv.org/abs/2112.10752 | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/kandinsky/pipeline_kandinsky_inpaint.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/pndm/pipeline_pndm.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/unclip/pipeline_unclip_image_variation.py | diffusers0.17.0/src/diffusers/pipelines/unclip/pipeline_unclip_image_variation.py | https://huggingface.co/fusing/karlo-image-variations-diffusers/blob/main/feature_extractor/preprocessor_config.json | 相关配置 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion_inpaint_legacy.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion_img2img.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/deepfloyd_if/pipeline_if.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/lpw_stable_diffusion.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_unipc_multistep.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_ddim.py | https://arxiv.org/abs/2305.08891 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/src/diffusers/pipelines/text_to_video_synthesis/pipeline_text_to_video_zero.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/schedulers/scheduling_dpmsolver_multistep.py | https://arxiv.org/abs/2211.01095 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | 
diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_panorama.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.17.0/examples/community/composable_stable_diffusion.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/unidiffuser/modeling_text_decoder.py | diffusers0.17.0/src/diffusers/pipelines/unidiffuser/modeling_text_decoder.py | https://github.com/thu-ml/unidiffuser/blob/main/libs/caption_decoder.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/unclip_image_interpolation.py | diffusers0.17.0/src/diffusers/pipelines/unclip/pipeline_unclip_image_variation.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPVisionModelWithProjection | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/semantic_stable_diffusion/pipeline_semantic_stable_diffusion.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/stable_diffusion_tensorrt_inpaint.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/paint_by_example/pipeline_paint_by_example.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/schedulers/scheduling_dpmsolver_singlestep.py | https://arxiv.org/abs/2206.00927","https://arxiv.org/abs/2211.01095 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion_image_variation.py | diffusers0.17.0/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion_dual_guided.py | https://huggingface.co/datasets/diffusers/images/resolve/main/benz.jpg | 图片地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_cycle_diffusion.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/latent_diffusion/pipeline_latent_diffusion.py | diffusers0.17.0/src/diffusers/pipelines/latent_diffusion/pipeline_latent_diffusion.py | https://huggingface.co/models?filter=ldmbert | 相关说明 | -| 开发引入 | / | diffusers0.17.0/examples/community/clip_guided_stable_diffusion_img2img.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_depth2img.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/unidiffuser/pipeline_unidiffuser.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/deepfloyd_if/pipeline_if.py | 
https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion_safe/pipeline_stable_diffusion_safe.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.17.0/examples/community/stable_diffusion_ipex.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.17.0/examples/community/multilingual_stable_diffusion.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/kandinsky/pipeline_kandinsky_inpaint.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/wuerstchen/text_to_image/train_text_to_image_prior.py | diffusers0.17.0/examples/text_to_image/train_text_to_image_lora.py | https://huggingface.co/docs/datasets/v2.4.0/en/image_load#imagefolder | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/semantic_stable_diffusion/pipeline_semantic_stable_diffusion.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/deepfloyd_if/pipeline_if.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/lpw_stable_diffusion.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/configuration_utils.py | https://huggingface.co/transformers/installation.html#offline-mode | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_flax_stable_diffusion_inpaint.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.FlaxCLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_cycle_diffusion.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/controlnet/pipeline_controlnet_inpaint.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/attention_flax.py | diffusers0.17.0/src/diffusers/models/attention_flax.py | https://arxiv.org/abs/2112.05682v2 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/controlnet/train_controlnet_flax.py | diffusers0.17.0/examples/controlnet/train_controlnet_flax.py | https://github.com/python-pillow/Pillow/issues/5610 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_sde_ve_flax.py | 
diffusers0.17.0/src/diffusers/pipelines/stochastic_karras_ve/pipeline_stochastic_karras_ve.py | https://arxiv.org/abs/2011.13456 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/docs/source/ko/using-diffusers/custom_pipeline_overview.md | diffusers0.17.0/src/diffusers/models/controlnet.py | https://arxiv.org/abs/2302.05543 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/examples/community/speech_to_image_diffusion.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/configuration_utils.py | diffusers0.17.0/src/diffusers/models/modeling_utils.py | https://github.com/huggingface/diffusers/pull/3129 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/dit/pipeline_dit.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_k_diffusion.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/controlnet/pipeline_flax_controlnet.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/examples/community/wildcard_stable_diffusion.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_instruct_pix2pix.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/unidiffuser/pipeline_unidiffuser.py | https://huggingface.co/docs/transformers/model_doc/gpt2#transformers.GPT2Tokenizer | 相关说明 | -| 开发引入 | / | diffusers0.17.0/examples/community/stable_diffusion_controlnet_reference.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_unipc_multistep.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_deis_multistep.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/schedulers/scheduling_dpmsolver_multistep.py | 源码实现 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/schedulers/scheduling_dpmsolver_sde.py | https://arxiv.org/pdf/2206.00364.pdf | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/checkpoint_merger.py | diffusers0.17.0/examples/community/checkpoint_merger.py | https://en.wikipedia.org/wiki/Smoothstep | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_pix2pix_zero.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/diffusers/src/diffusers/pipelines/deepfloyd_if/watermark.py | diffusers0.17.0/src/diffusers/pipelines/deepfloyd_if/watermark.py | https://github.com/deep-floyd/IF/blob/b77482e36ca2031cb94dbca1001fc1e6400bf4ab/deepfloyd_if/modules/base.py#L287 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_cycle_diffusion.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/vae_flax.py | diffusers0.17.0/src/diffusers/models/vae_flax.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/alt_diffusion/pipeline_alt_diffusion.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/unclip/pipeline_unclip.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开发引入 | / | diffusers0.17.0/scripts/convert_unidiffuser_to_diffusers.py | https://huggingface.co/gpt2/blob/main/config.json | 相关配置 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion_inpaint_legacy.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/lpw_stable_diffusion_onnx.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/unidiffuser/modeling_text_decoder.py | diffusers0.17.0/src/diffusers/pipelines/unidiffuser/modeling_text_decoder.py | https://github.com/thu-ml/unidiffuser/blob/main/libs/caption_decoder.py#L89 | 源码实现 | -| 开发引入 | / | diffusers0.17.0/examples/community/stable_diffusion_reference.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/sd_text2img_k_diffusion.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/alt_diffusion/pipeline_alt_diffusion.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.RobertaSeriesModelWithTransformation | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/wildcard_stable_diffusion.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/scripts/convert_kakao_brain_unclip_to_diffusers.py | diffusers0.17.0/scripts/convert_kakao_brain_unclip_to_diffusers.py | https://arena.kakaocdn.net/brainrepo/models/karlo-public/v1.0.0.alpha/4226b831ae0279020d134281f3c31590/improved-sr-ckpt-step%3D1.2M.ckpt | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/training_utils.py | diffusers0.17.0/examples/research_projects/intel_opts/textual_inversion_dfq/textual_inversion.py | 
https://github.com/fadel/pytorch_ema/blob/master/torch_ema/ema.py#L14 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.17.0/examples/community/stable_diffusion_mega.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.17.0/examples/community/stable_diffusion_controlnet_img2img.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开发引入 | / | diffusers0.17.0/examples/community/interpolate_stable_diffusion.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/custom_diffusion/train_custom_diffusion.py | diffusers0.17.0/examples/custom_diffusion/train_custom_diffusion.py | https://github.com/huggingface/diffusers/blob/main/examples/custom_diffusion | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/unclip_text_interpolation.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_pndm_flax.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_pndm_flax.py | https://arxiv.org/abs/2202.09778 | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/schedulers/scheduling_deis_multistep.py | https://arxiv.org/abs/2204.13902","https://github.com/qsh-zh/deis | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/resnet.py | diffusers0.17.0/src/diffusers/models/resnet.py | https://github.com/huggingface/diffusers/issues/984 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.17.0/examples/community/img2img_inpainting.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/stable_diffusion_controlnet_inpaint_img2img.py | diffusers0.17.0/examples/community/stable_diffusion_controlnet_inpaint_img2img.py | https://github.com/CompVis/latent-diffusion/raw/main/data/inpainting_examples/overture-creations-5sI6fQgYIuo.png | 图片地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/models/t5_film_transformer.py | https://arxiv.org/abs/1910.07467 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_k_diffusion.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/text_to_video_synthesis/pipeline_text_to_video_zero.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/stable_diffusion_tensorrt_inpaint.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/unidiffuser/pipeline_unidiffuser.py | https://arxiv.org/pdf/2303.06555.pdf | 论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.17.0/examples/community/stable_diffusion_ipex.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_unclip.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/src/diffusers/pipelines/latent_diffusion/pipeline_latent_diffusion_superresolution.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/deepfloyd_if/pipeline_if_inpainting.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/schedulers/scheduling_unipc_multistep.py | https://arxiv.org/abs/2302.04867 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/vq_diffusion/pipeline_vq_diffusion.py | diffusers0.17.0/src/diffusers/pipelines/vq_diffusion/pipeline_vq_diffusion.py | https://huggingface.co/openai/clip-vit-base-patch32 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/examples/community/multilingual_stable_diffusion.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_sag.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/src/diffusers/pipelines/controlnet/pipeline_controlnet_inpaint.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开发引入 | / | diffusers0.17.0/examples/community/wildcard_stable_diffusion.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开发引入 | / | diffusers0.17.0/examples/community/text_inpainting.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/examples/community/text_inpainting.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/examples/community/interpolate_stable_diffusion.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_pndm_flax.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_ipndm.py | https://arxiv.org/pdf/2202.09778.pdf | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion_text_to_image.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.17.0/src/diffusers/pipelines/text_to_video_synthesis/pipeline_text_to_video_zero.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | 
diffusers0.17.0/src/diffusers/pipelines/controlnet/pipeline_controlnet_img2img.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_model_editing.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/kandinsky/pipeline_kandinsky.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/schedulers/scheduling_dpmsolver_multistep.py | https://github.com/openai/guided-diffusion | 源码实现 | -| 开发引入 | / | diffusers0.17.0/examples/community/img2img_inpainting.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_attend_and_excite.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion_dual_guided.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_instruct_pix2pix.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_upscale.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/controlnet/pipeline_controlnet_inpaint.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/stable_diffusion_ipex.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/resnet.py | diffusers0.17.0/src/diffusers/models/resnet.py | https://github.com/modelscope/modelscope/blob/1509fdb973e5871f37148a4b5e5964cafd43e64d/modelscope/models/multi_modal/video_synthesis/unet_sd.py#L1016 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_sde_ve_flax.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_sde_ve_flax.py | https://arxiv.org/abs/2011.13456 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.17.0/src/diffusers/pipelines/semantic_stable_diffusion/pipeline_semantic_stable_diffusion.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/unet_2d_condition_flax.py | diffusers0.17.0/src/diffusers/models/unet_2d_condition_flax.py | https://arxiv.org/abs/2112.05682 | 论文地址 | -| 开发引入 | / | diffusers0.17.0/examples/community/img2img_inpainting.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/text_to_video_synthesis/pipeline_text_to_video_zero.py | diffusers0.17.0/src/diffusers/pipelines/text_to_video_synthesis/pipeline_text_to_video_zero.py | https://arxiv.org/abs/2303.13439 | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/kandinsky/pipeline_kandinsky_img2img.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/schedulers/scheduling_euler_discrete.py | https://arxiv.org/pdf/2206.00364.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.17.0/examples/community/stable_diffusion_tensorrt_inpaint.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/stable_diffusion_tensorrt_txt2img.py | diffusers0.17.0/examples/community/stable_diffusion_tensorrt_img2img.py | https://pypi.ngc.nvidia.com | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/utils/check_config_docstrings.py | diffusers0.17.0/utils/check_config_docstrings.py | https://huggingface.co/bert-base-uncased | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/modeling_utils.py | diffusers0.17.0/src/diffusers/models/modeling_utils.py | https://pytorch.org/docs/stable/_modules/torch/nn/modules/module.html#Module | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_model_editing.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/pipeline_utils.py | diffusers0.17.0/src/diffusers/pipelines/pipeline_utils.py | https://facebookresearch.github.io/xformers/components/ops.html#xformers.ops.memory_efficient_attention | 相关说明 | -| 开发引入 | / | diffusers0.17.0/examples/community/multilingual_stable_diffusion.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开发引入 | / | diffusers0.17.0/examples/community/stable_diffusion_controlnet_img2img.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/modeling_pytorch_flax_utils.py | diffusers0.17.0/src/diffusers/models/modeling_pytorch_flax_utils.py | https://github.com/huggingface/transformers/blob/main/src/transformers/modeling_flax_pytorch_utils.py#L224-L352 | 源码实现 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion_dual_guided.py | https://huggingface.co/docs/transformers/model_doc/bert | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_panorama.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 
开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_latent_upscale.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/controlnet/train_controlnet_flax.py | diffusers0.17.0/examples/controlnet/train_controlnet_flax.py | https://github.com/borisdayma/dalle-mini/blob/d2be512d4a6a9cda2d63ba04afc33038f98f705f/src/dalle_mini/data.py#L370 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_sde_vp.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_sde_ve.py | https://github.com/yang-song/score_sde_pytorch | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/stable_diffusion_mega.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/examples/community/stable_diffusion_controlnet_reference.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.17.0/examples/community/stable_diffusion_tensorrt_txt2img.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/paint_by_example/pipeline_paint_by_example.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/controlnet/pipeline_flax_controlnet.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/vae_flax.py | diffusers0.17.0/src/diffusers/models/controlnet_flax.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.17.0/examples/community/img2img_inpainting.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/tests/pipelines/controlnet/test_controlnet_img2img.py | diffusers0.17.0/examples/community/stable_diffusion_controlnet_img2img.py | https://github.com/haofanwang/ControlNet-for-Diffusers/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/README.md | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_cycle_diffusion.py | https://huggingface.co/CompVis/stable-diffusion-v1-4 | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/schedulers/scheduling_dpmsolver_singlestep.py | https://github.com/openai/guided-diffusion | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_upscale.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/controlnet/pipeline_controlnet_img2img.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | 
diffusers0.17.0/examples/community/stable_diffusion_reference.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_model_editing.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/vae_flax.py | diffusers0.17.0/src/diffusers/models/unet_2d_condition_flax.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 相关说明 | -| 开发引入 | / | diffusers0.17.0/examples/community/stable_diffusion_controlnet_inpaint.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion_inpaint_legacy.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_instruct_pix2pix.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_unclip_img2img.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_pndm_flax.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_ddpm.py | https://github.com/ermongroup/ddim | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_inpaint_legacy.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开发引入 | / | diffusers0.17.0/scripts/convert_vq_diffusion_to_diffusers.py | https://raw.githubusercontent.com/microsoft/VQ-Diffusion/main/configs/ithq.yaml | 相关说明 | -| 开发引入 | / | diffusers0.17.0/examples/community/imagic_stable_diffusion.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开发引入 | / | diffusers0.17.0/scripts/convert_vq_diffusion_to_diffusers.py | https://raw.githubusercontent.com/microsoft/VQ-Diffusion/main/OUTPUT/pretrained_model/taming_dvae/config.yaml | 相关说明 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/latent_diffusion/pipeline_latent_diffusion.py | https://huggingface.co/docs/transformers/model_doc/bert | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_unclip.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/schedulers/scheduling_k_dpm_2_discrete.py | https://github.com/crowsonkb/k-diffusion/blob/5b3af030dd83e0297272d861c19477735d0317ec/k_diffusion/sampling.py#L188 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_lcm.py | diffusers0.17.0/src/diffusers/schedulers/scheduling_ddim_inverse.py | https://github.com/hojonathanho/diffusion | 源码实现 | -| 开发引入 | / | diffusers0.17.0/src/diffusers/pipelines/audioldm/pipeline_audioldm.py | 
https://huggingface.co/docs/transformers/main/model_doc/clap#transformers.ClapTextModelWithProjection | 相关说明 | -| 开发引入 | / | diffusers0.17.0/examples/community/stable_diffusion_controlnet_reference.py | https://github.com/Mikubill/sd-webui-controlnet/discussions/1236","https://github.com/Mikubill/sd-webui-controlnet/discussions/1280 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion_text_to_image.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.17.0/src/diffusers/pipelines/unidiffuser/pipeline_unidiffuser.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.17.0/src/diffusers/pipelines/latent_diffusion_uncond/pipeline_latent_diffusion_uncond.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开发引入 | / | diffusers0.17.0/examples/research_projects/lora/requirements.txt | https://github.com/huggingface/peft.git | 相关依赖 | +| 文件位置 | 公网地址 | 公网地址用途 | +|----------------------------------------------------------------------------------------------------------------------------------------------------------|---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/.github/actions/setup-miniconda/action.yml | https://repo.anaconda.com/miniconda/Miniconda3-py39_${MINICONDA_VERSION}-${MINICONDA_ARCH}.sh | miniconda链接 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/.github/workflows/nightly_tests.yml | https://download.pytorch.org/whl/cpu | pip相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/examples/community/clip_guided_stable_diffusion_img2img.py | https://raw.githubusercontent.com/CompVis/stable-diffusion/main/assets/stable-samples/img2img/sketch-mountains-input.jpg | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/examples/community/stable_diffusion_tensorrt_img2img.py | https://pypi.ngc.nvidia.com | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/examples/community/stable_diffusion_tensorrt_inpaint.py | https://pypi.ngc.nvidia.com | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/examples/community/stable_diffusion_tensorrt_txt2img.py | https://pypi.ngc.nvidia.com | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/examples/controlnet/train_controlnet.py | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/examples/controlnet/train_controlnet.py | https://www.tensorflow.org/tensorboard | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/examples/controlnet/train_controlnet.py | https://pytorch.org/docs/stable/generated/torch.optim.Optimizer.zero_grad.html | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/examples/controlnet/train_controlnet_flax.py | https://arxiv.org/abs/2303.09556 | 论文地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/examples/custom_diffusion/train_custom_diffusion.py | https://www.cs.cmu.edu/~custom-diffusion | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/examples/custom_diffusion/train_custom_diffusion.py | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/examples/custom_diffusion/train_custom_diffusion.py | https://www.tensorflow.org/tensorboard | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/examples/custom_diffusion/train_custom_diffusion.py | https://pytorch.org/docs/stable/generated/torch.optim.Optimizer.zero_grad.html | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/examples/dreambooth/train_dreambooth.py | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/examples/dreambooth/train_dreambooth.py | https://www.tensorflow.org/tensorboard | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/examples/dreambooth/train_dreambooth.py | https://pytorch.org/docs/stable/generated/torch.optim.Optimizer.zero_grad.html | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/examples/dreambooth/train_dreambooth.py | https://www.crosslabs.org//blog/diffusion-with-offset-noise | 问题引导 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/examples/dreambooth/train_dreambooth_flax.py | https://www.tensorflow.org/tensorboard | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/examples/dreambooth/train_dreambooth_lora.py | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/examples/dreambooth/train_dreambooth_lora.py | https://www.tensorflow.org/tensorboard | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/examples/instruct_pix2pix/train_instruct_pix2pix.py | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/examples/instruct_pix2pix/train_instruct_pix2pix.py | https://www.tensorflow.org/tensorboard | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/examples/instruct_pix2pix/train_instruct_pix2pix.py | https://arxiv.org/abs/2211.09800 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/examples/research_projects/colossalai/train_dreambooth_colossalai.py | https://www.tensorflow.org/tensorboard | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/examples/research_projects/dreambooth_inpaint/train_dreambooth_inpaint.py | https://www.tensorflow.org/tensorboard | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/examples/research_projects/dreambooth_inpaint/train_dreambooth_inpaint_lora.py | https://www.tensorflow.org/tensorboard | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/examples/research_projects/intel_opts/textual_inversion/textual_inversion_bf16.py | https://www.tensorflow.org/tensorboard | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/examples/research_projects/intel_opts/textual_inversion_dfq/textual_inversion.py | https://www.tensorflow.org/tensorboard | 设置说明 | +| 
ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/examples/research_projects/lora/train_text_to_image_lora.py | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/examples/research_projects/lora/train_text_to_image_lora.py | https://www.tensorflow.org/tensorboard | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/examples/research_projects/mulit_token_textual_inversion/textual_inversion.py | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/examples/research_projects/mulit_token_textual_inversion/textual_inversion.py | https://www.tensorflow.org/tensorboard | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/examples/research_projects/mulit_token_textual_inversion/textual_inversion_flax.py | https://www.tensorflow.org/tensorboard | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/examples/research_projects/multi_subject_dreambooth/train_multi_subject_dreambooth.py | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/examples/research_projects/multi_subject_dreambooth/train_multi_subject_dreambooth.py | https://www.tensorflow.org/tensorboard | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/examples/research_projects/onnxruntime/text_to_image/train_text_to_image.py | https://arxiv.org/abs/2303.09556 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/examples/research_projects/onnxruntime/text_to_image/train_text_to_image.py | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/examples/research_projects/onnxruntime/text_to_image/train_text_to_image.py | https://www.tensorflow.org/tensorboard | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/examples/research_projects/onnxruntime/textual_inversion/textual_inversion.py | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/examples/research_projects/onnxruntime/textual_inversion/textual_inversion.py | https://www.tensorflow.org/tensorboard | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/examples/research_projects/onnxruntime/unconditional_image_generation/train_unconditional.py | https://www.tensorflow.org/tensorboard | 三方库说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/examples/research_projects/onnxruntime/unconditional_image_generation/train_unconditional.py | https://www.wandb.ai | 三方库说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/examples/research_projects/onnxruntime/unconditional_image_generation/train_unconditional.py | https://www.tensorflow.org/tensorboard | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/examples/text_to_image/train_text_to_image.py | https://arxiv.org/abs/2303.09556 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/examples/text_to_image/train_text_to_image.py | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 设置说明 | +| 
ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/examples/text_to_image/train_text_to_image.py | https://www.tensorflow.org/tensorboard | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/examples/text_to_image/train_text_to_image_flax.py | https://www.tensorflow.org/tensorboard | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/examples/text_to_image/train_text_to_image_lora.py | https://arxiv.org/abs/2303.09556 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/examples/text_to_image/train_text_to_image_lora.py | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/examples/text_to_image/train_text_to_image_lora.py | https://www.tensorflow.org/tensorboard | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/examples/textual_inversion/textual_inversion.py | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/examples/textual_inversion/textual_inversion.py | https://www.tensorflow.org/tensorboard | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/examples/textual_inversion/textual_inversion_flax.py | https://www.tensorflow.org/tensorboard | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/examples/unconditional_image_generation/train_unconditional.py | https://www.tensorflow.org/tensorboard | 三方库说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/examples/unconditional_image_generation/train_unconditional.py | https://www.wandb.ai | 三方库说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/examples/unconditional_image_generation/train_unconditional.py | https://www.tensorflow.org/tensorboard | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/scripts/convert_dit_to_diffusers.py | https://dl.fbaipublicfiles.com/DiT/models/{model_name} | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/scripts/convert_kakao_brain_unclip_to_diffusers.py | https://arena.kakaocdn.net/brainrepo/models/karlo-public/v1.0.0.alpha/efdf6206d8ed593961593dc029a8affa/decoder-ckpt-step%3D01000000-of-01000000.ckpt | 模型相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/scripts/convert_kakao_brain_unclip_to_diffusers.py | https://arena.kakaocdn.net/brainrepo/models/karlo-public/v1.0.0.alpha/85626483eaca9f581e2a78d31ff905ca/prior-ckpt-step%3D01000000-of-01000000.ckpt | 模型相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/scripts/convert_kakao_brain_unclip_to_diffusers.py | https://arena.kakaocdn.net/brainrepo/models/karlo-public/v1.0.0.alpha/4226b831ae0279020d134281f3c31590/improved-sr-ckpt-step%3D1.2M.ckpt | 模型相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/scripts/convert_kakao_brain_unclip_to_diffusers.py | https://arena.kakaocdn.net/brainrepo/models/karlo-public/v1.0.0.alpha/0b62380a75e56f073e2844ab5199153d/ViT-L-14_stats.th | 模型相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/scripts/convert_vae_pt_to_diffusers.py | https://raw.githubusercontent.com/CompVis/stable-diffusion/main/configs/stable-diffusion/v1-inference.yaml | 下载配置 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/scripts/convert_vq_diffusion_to_diffusers.py | 
https://raw.githubusercontent.com/microsoft/VQ-Diffusion/main/OUTPUT/pretrained_model/taming_dvae/config.yaml | 模型相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/scripts/convert_vq_diffusion_to_diffusers.py | https://raw.githubusercontent.com/microsoft/VQ-Diffusion/main/configs/ithq.yaml | 模型相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/scripts/convert_vq_diffusion_to_diffusers.py | https://facevcstandard.blob.core.windows.net/v-zhictang/Improved-VQ-Diffusion_model_release/ithq_vqvae.pth?sv=2020-10-02&st=2022-05-30T15%3A17%3A18Z&se=2030-05-31T15%3A17%3A00Z&sr=b&sp=r&sig=1jVavHFPpUjDs%2FTO1V3PTezaNbPp2Nx8MxiWI7y6fEY%3D | 模型相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/scripts/convert_vq_diffusion_to_diffusers.py | https://facevcstandard.blob.core.windows.net/v-zhictang/Improved-VQ-Diffusion_model_release/ithq_learnable.pth?sv=2020-10-02&st=2022-05-30T10%3A22%3A06Z&se=2030-05-31T10%3A22%3A00Z&sr=b&sp=r&sig=GOE%2Bza02%2FPnGxYVOOPtwrTR4RA3%2F5NVgMxdW4kjaEZ8%3D | 模型相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/setup.py | patrick@huggingface.co | 作者邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/src/diffusers/models/controlnet_flax.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/src/diffusers/models/modeling_pytorch_flax_utils.py | https://pytorch.org/ | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/src/diffusers/models/unet_2d_condition_flax.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/src/diffusers/models/vae_flax.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/src/diffusers/models/vae_flax.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/src/diffusers/models/vae_flax.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | https://raw.githubusercontent.com/CompVis/stable-diffusion/main/configs/stable-diffusion/v1-inference.yaml | 配置文件 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | https://raw.githubusercontent.com/Stability-AI/stablediffusion/main/configs/stable-diffusion/v2-inference-v.yaml | 配置文件 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/src/diffusers/utils/dynamic_modules_utils.py | https://raw.githubusercontent.com/huggingface/diffusers/{revision}/examples/community/{pipeline}.py | 下载依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/src/diffusers/utils/import_utils.py | https://pytorch.org/get-started/locally/ | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.17.0/src/diffusers/utils/import_utils.py | https://librosa.org/doc/latest/install.html | 三方库链接 | \ No newline at end of file diff --git a/PyTorch/built-in/diffusion/diffusers0.18.1/public_address_statement.md b/PyTorch/built-in/diffusion/diffusers0.18.1/public_address_statement.md index 5a431f047932891c6a1a0ac1aed5e1836b059d08..da554e07cc3d85ac53cc6d49e8cccbf1af8cd1fb 100644 --- a/PyTorch/built-in/diffusion/diffusers0.18.1/public_address_statement.md +++ 
b/PyTorch/built-in/diffusion/diffusers0.18.1/public_address_statement.md @@ -1,1371 +1,82 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ------- |------|---------------------------------------------------------------------|--------------------------------------------------|---------| -| 开源代码引入 | https://knn.laion.ai/knn-service|.\examples\custom_diffusion\retrieve.py |https://knn.laion.ai/knn-service | 下载依赖 | -| 开源代码引入 | https://huggingface.co/docs/accelerate/package_reference/accelerator#accelerate.Accelerator.save_state| \examples\custom_diffusion\train_custom_diffusion.py|https://huggingface.co | 设置说明| -| 开源代码引入 | https://huggingface.co/docs/accelerate/package_reference/accelerator#accelerate.Accelerator.save_state| \examples\dreambooth\train_dreambooth.py |https://huggingface.co | 设置说明| -| 开源代码引入 | https://www.tensorflow.org/tensorboard | \examples\custom_diffusion\train_custom_diffusion.py |https://www.tensorflow.org|设置说明| -| 开源代码引入 | https://www.tensorflow.org/tensorboard | \examples\dreambooth\train_dreambooth_flax.py |https://www.tensorflow.org|设置说明| -| 开源代码引入 | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices|\examples\custom_diffusion\train_custom_diffusion.py|https://pytorch.org |设置说明| -| 开源代码引入 | https://pytorch.org/docs/stable/generated/torch.optim.Optimizer.zero_grad.html |\examples\custom_diffusion\train_custom_diffusion.py|https://pytorch.org |设置说明| -| 开源代码引入 | https://huggingface.co/docs/diffusers/main/en/optimization/xformers |\examples\dreambooth\train_dreambooth_lora.py|https://huggingface.co/ |设置说明| -| 开源代码引入 | https://huggingface.co/docs/diffusers/main/en/optimization/xformers |\examples\custom_diffusion\train_custom_diffusion.py;|https://huggingface.co/ |设置说明| -| 开源代码引入 | https://huggingface.co/docs/diffusers/main/en/training/dreambooth#performing-inference-using-a-saved-checkpoint| \examples\dreambooth\train_dreambooth.py |https://huggingface.co/|设置说明| -| 开源代码引入 | https://www.crosslabs.org//blog/diffusion-with-offset-noise | \examples\dreambooth\train_dreambooth.py |https://www.crosslabs.org|设置说明| -| 开源代码引入 | https://huggingface.co/docs/datasets/image_dataset#imagefolder | \examples\instruct_pix2pix\train_instruct_pix2pix.py |https://huggingface.co/|设置说明| -| 开源代码引入 | https://arxiv.org/abs/2211.09800 | \examples\instruct_pix2pix\train_instruct_pix2pix.py |https://arxiv.org |设置说明| -| 开源代码引入 | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | \examples\instruct_pix2pix\train_instruct_pix2pix.py |https://pytorch.org|设置说明| -| 开源代码引入 | https://www.tensorflow.org/tensorboard |\examples\instruct_pix2pix\train_instruct_pix2pix.py|https://www.tensorflow.org/|设置说明| -| 开源代码引入 | https://huggingface.co/docs/accelerate/package_reference/accelerator#accelerate.Accelerator.save_state|\examples\instruct_pix2pix\train_instruct_pix2pix.py|https://huggingface.co|设置说明| -| 开源代码引入 | https://huggingface.co/docs/diffusers/main/en/optimization/xformers |\examples\instruct_pix2pix\train_instruct_pix2pix.py| https://huggingface.co |设置说明| -| 开源代码引入 | https://www.tensorflow.org/tensorboard |\examples\research_projects\colossalai\train_dreambooth_colossalai.py | https://www.tensorflow.org/ |设置说明| -| 开源代码引入 | https://www.tensorflow.org/tensorboard|\examples\research_projects\dreambooth_inpaint\train_dreambooth_inpaint_lora.py|https://www.tensorflow.org/|设置说明| -| 开源代码引入 | 
https://huggingface.co/docs/accelerate/package_reference/accelerator#accelerate.Accelerator.save_state|\examples\research_projects\dreambooth_inpaint\train_dreambooth_inpaint_lora.py|https://huggingface.co/|设置说明| -| 开源代码引入 | https://www.tensorflow.org/tensorboard |\examples\research_projects\dreambooth_inpaint\train_dreambooth_inpaint.py |https://www.tensorflow.org |设置说明 | -| 开源代码引入 | https://huggingface.co/docs/accelerate/package_reference/accelerator#accelerate.Accelerator.save_state |\examples\research_projects\dreambooth_inpaint\train_dreambooth_inpaint.py |https://huggingface.co |设置说明 | -| 开源代码引入 | https://www.tensorflow.org/tensorboard| \examples\research_projects\intel_opts\textual_inversion\textual_inversion_bf16.py| https://www.tensorflow.org|设置说明 | -| 开源代码引入 | https://www.tensorflow.org/tensorboard |\examples\research_projects\intel_opts\textual_inversion_dfq\textual_inversion.py | https://www.tensorflow.org|设置说明 | -| 开源代码引入 | git+https://github.com/huggingface/peft.git |\examples\research_projects\lora\requirements.txt |git+https://github.com/huggingface/peft.git |下载依赖 | -| 开源代码引入 | https://huggingface.co/docs/datasets/image_dataset#imagefolder |\examples\research_projects\lora\train_text_to_image_lora.py | https://huggingface.co|设置说明 | -| 开源代码引入 | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | \examples\research_projects\lora\train_text_to_image_lora.py|https://pytorch.org |设置说明 | -| 开源代码引入 | https://www.tensorflow.org/tensorboard | \examples\research_projects\lora\train_text_to_image_lora.py| https://www.tensorflow.org|设置说明 | -| 开源代码引入 | https://huggingface.co/docs/accelerate/package_reference/accelerator#accelerate.Accelerator.save_state | \examples\research_projects\lora\train_text_to_image_lora.py| https://huggingface.co|设置说明 | -| 开源代码引入 | https://huggingface.co/docs/diffusers/main/en/optimization/xformers | \examples\research_projects\lora\train_text_to_image_lora.py| https://huggingface.co/|设置说明 | -| 开源代码引入 | https://www.tensorflow.org/tensorboard|\examples\research_projects\mulit_token_textual_inversion\textual_inversion_flax.py | https://www.tensorflow.org|设置说明 | -| 开源代码引入 | https://www.tensorflow.org/tensorboard|\examples\research_projects\mulit_token_textual_inversion\textual_inversion.py | https://www.tensorflow.org|设置说明 | -| 开源代码引入 | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices|\examples\research_projects\mulit_token_textual_inversion\textual_inversion.py | https://pytorch.org|设置说明 | -| 开源代码引入 | https://huggingface.co/docs/accelerate/package_reference/accelerator#accelerate.Accelerator.save_state| \examples\research_projects\mulit_token_textual_inversion\textual_inversion.py| https://huggingface.co|设置说明 | -| 开源代码引入 | https://huggingface.co/docs/diffusers/main/en/optimization/xformers| \examples\research_projects\mulit_token_textual_inversion\textual_inversion.py| https://huggingface.co/docs/diffusers/main/en/optimization/xformers|设置说明 | -| 开源代码引入 | https://huggingface.co/docs/accelerate/package_reference/accelerator#accelerate.Accelerator.save_state| \examples\research_projects\multi_subject_dreambooth\train_multi_subject_dreambooth.py|https://huggingface.co |设置说明 | -| 开源代码引入 | https://www.tensorflow.org/tensorboard| \examples\research_projects\multi_subject_dreambooth\train_multi_subject_dreambooth.py| https://www.tensorflow.org|设置说明 | -| 开源代码引入 | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices| 
\examples\research_projects\multi_subject_dreambooth\train_multi_subject_dreambooth.py| https://pytorch.org|设置说明 | -| 开源代码引入 | https://huggingface.co/docs/datasets/image_dataset#imagefolder| \examples\research_projects\onnxruntime\text_to_image\train_text_to_image.py| https://huggingface.co|设置说明 | -| 开源代码引入 | https://arxiv.org/abs/2303.09556| \examples\research_projects\onnxruntime\text_to_image\train_text_to_image.py| https://arxiv.org|设置说明 | -| 开源代码引入 | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices| \examples\research_projects\onnxruntime\text_to_image\train_text_to_image.py | https://pytorch.org|设置说明 | -| 开源代码引入 | https://www.tensorflow.org/tensorboard| \examples\research_projects\onnxruntime\text_to_image\train_text_to_image.py| https://www.tensorflow.org|设置说明 | -| 开源代码引入 | https://huggingface.co/docs/accelerate/package_reference/accelerator#accelerate.Accelerator.save_state| \examples\research_projects\onnxruntime\text_to_image\train_text_to_image.py|https://huggingface.co |设置说明 | -| 开源代码引入 | https://huggingface.co/docs/accelerate/v0.17.0/en/package_reference/accelerator#accelerate.Accelerator| \examples\research_projects\onnxruntime\text_to_image\train_text_to_image.py|https://huggingface.co |设置说明 | -| 开源代码引入 | https://huggingface.co/docs/diffusers/main/en/optimization/xformers| \examples\research_projects\onnxruntime\text_to_image\train_text_to_image.py|https://huggingface.co |设置说明 | -| 开源代码引入 | https://www.tensorflow.org/tensorboard|\examples\research_projects\onnxruntime\textual_inversion\textual_inversion.py | https://www.tensorflow.org|设置说明 | -| 开源代码引入 | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices|\examples\research_projects\onnxruntime\textual_inversion\textual_inversion.py |https://pytorch.org |设置说明 | -| 开源代码引入 | https://huggingface.co/docs/accelerate/package_reference/accelerator#accelerate.Accelerator.save_state|\examples\research_projects\onnxruntime\textual_inversion\textual_inversion.py | https://huggingface.co/ |设置说明 | -| 开源代码引入 | https://huggingface.co/docs/diffusers/main/en/optimization/xformers|\examples\research_projects\onnxruntime\textual_inversion\textual_inversion.py | https://huggingface.co|设置说明 | -| 开源代码引入 | https://huggingface.co/docs/datasets/image_dataset#imagefolder|\examples\research_projects\onnxruntime\unconditional_image_generation\train_unconditional.py | https://huggingface.co|设置说明 | -| 开源代码引入 | https://www.tensorflow.org/tensorboard|\examples\research_projects\onnxruntime\unconditional_image_generation\train_unconditional.py | https://www.tensorflow.org|设置说明 | -| 开源代码引入 | https://huggingface.co/docs/accelerate/package_reference/accelerator#accelerate.Accelerator.save_state| \examples\research_projects\onnxruntime\unconditional_image_generation\train_unconditional.py| https://huggingface.co|设置说明 | -| 开源代码引入 | https://huggingface.co/docs/diffusers/main/en/optimization/xformers| \examples\research_projects\onnxruntime\unconditional_image_generation\train_unconditional.py| https://huggingface.co|设置说明 | -| 开源代码引入 | https://huggingface.co/docs/datasets/image_dataset#imagefolder| \examples\text_to_image\train_text_to_image_flax.py| https://huggingface.co|设置说明 | -| 开源代码引入 | https://www.tensorflow.org/tensorboard| \examples\text_to_image\train_text_to_image_flax.py|https://www.tensorflow.org |设置说明 | -| 开源代码引入 | https://huggingface.co/docs/datasets/image_dataset#imagefolder|\examples\text_to_image\train_text_to_image_lora.py | https://huggingface.co|设置说明 | -| 开源代码引入 | 
https://arxiv.org/abs/2303.09556|\examples\text_to_image\train_text_to_image_lora.py | https://arxiv.org|设置说明 | -| 开源代码引入 | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices|\examples\text_to_image\train_text_to_image_lora.py | https://pytorch.org|设置说明 | -| 开源代码引入 | https://www.tensorflow.org/tensorboard| \examples\text_to_image\train_text_to_image_lora.py| https://www.tensorflow.org|设置说明 | -| 开源代码引入 | https://huggingface.co/docs/accelerate/package_reference/accelerator#accelerate.Accelerator.save_state| \examples\text_to_image\train_text_to_image_lora.py| https://huggingface.co|设置说明 | -| 开源代码引入 | https://huggingface.co/docs/diffusers/main/en/optimization/xformers| \examples\text_to_image\train_text_to_image_lora.py| https://huggingface.co|设置说明 | -| 开源代码引入 | https://huggingface.co/docs/datasets/image_dataset#imagefolder| \examples\text_to_image\train_text_to_image.py|https://huggingface.co |设置说明 | -| 开源代码引入 | https://arxiv.org/abs/2303.09556| \examples\text_to_image\train_text_to_image.py|https://arxiv.org |设置说明 | -| 开源代码引入 | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices| \examples\text_to_image\train_text_to_image.py| https://pytorch.org|设置说明 | -| 开源代码引入 | https://www.tensorflow.org/tensorboard| \examples\text_to_image\train_text_to_image.py| https://www.tensorflow.org|设置说明 | -| 开源代码引入 | https://huggingface.co/docs/accelerate/package_reference/accelerator#accelerate.Accelerator.save_state| \examples\text_to_image\train_text_to_image.py| https://huggingface.co|设置说明 | -| 开源代码引入 | https://huggingface.co/docs/accelerate/v0.17.0/en/package_reference/accelerator#accelerate.Accelerator| \examples\text_to_image\train_text_to_image.py| https://huggingface.co|设置说明 | -| 开源代码引入 | https://huggingface.co/docs/diffusers/main/en/optimization/xformers| \examples\text_to_image\train_text_to_image.py| https://huggingface.co|设置说明 | -| 开源代码引入 | https://huggingface.co/docs/diffusers/main/en/optimization/xformers| \examples\text_to_image\train_text_to_image.py| https://huggingface.co/|设置说明 | -| 开源代码引入 | https://www.tensorflow.org/tensorboard| \examples\textual_inversion\textual_inversion_flax.py | https://www.tensorflow.org |设置说明 | -| 开源代码引入 | https://www.tensorflow.org/tensorboard| \examples\textual_inversion\textual_inversion.py| https://www.tensorflow.org|设置说明 | -| 开源代码引入 | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices| \examples\textual_inversion\textual_inversion.py| https://pytorch.org|设置说明 | -| 开源代码引入 | https://huggingface.co/docs/accelerate/package_reference/accelerator#accelerate.Accelerator.save_state| \examples\textual_inversion\textual_inversion.py|https://huggingface.co/ |设置说明 | -| 开源代码引入 | https://huggingface.co/docs/diffusers/main/en/optimization/xformers| \examples\textual_inversion\textual_inversion.py| https://huggingface.co|设置说明 | -| 开源代码引入 | https://huggingface.co/docs/datasets/image_dataset#imagefolder| \examples\unconditional_image_generation\train_unconditional.py| https://huggingface.co|设置说明 | -| 开源代码引入 | https://www.tensorflow.org/tensorboard| \examples\unconditional_image_generation\train_unconditional.py| https://www.tensorflow.org|设置说明 | -| 开源代码引入 | https://www.wandb.ai| \examples\unconditional_image_generation\train_unconditional.py| https://www.wandb.ai|设置说明 | -| 开源代码引入 | https://huggingface.co/docs/accelerate/package_reference/accelerator#accelerate.Accelerator.save_state| \examples\unconditional_image_generation\train_unconditional.py| https://huggingface.co |设置说明 | 
-| 开源代码引入 | https://huggingface.co/docs/diffusers/main/en/optimization/xformers| \examples\unconditional_image_generation\train_unconditional.py| https://huggingface.co|设置说明 | -| 开源代码引入 | https://model-server.zqevans2.workers.dev/gwf-440k.ckpt| .\scripts\convert_dance_diffusion_to_diffusers.py | https://model-server.zqevans2.workers.dev | 下载权重 -| 开源代码引入 | https://model-server.zqevans2.workers.dev/jmann-small-190k.ckpt| \scripts\convert_dance_diffusion_to_diffusers.py| https://model-server.zqevans2.workers.dev|下载权重 | -| 开源代码引入 | https://model-server.zqevans2.workers.dev/jmann-large-580k.ckpt| \scripts\convert_dance_diffusion_to_diffusers.py| https://model-server.zqevans2.workers.dev|下载权重 | -| 开源代码引入 | https://model-server.zqevans2.workers.dev/maestro-uncond-150k.ckpt| \scripts\convert_dance_diffusion_to_diffusers.py| https://model-server.zqevans2.workers.dev|下载权重 | -| 开源代码引入 | https://model-server.zqevans2.workers.dev/unlocked-uncond-250k.ckpt| \scripts\convert_dance_diffusion_to_diffusers.py| https://model-server.zqevans2.workers.dev/|下载权重 | -| 开源代码引入 | https://model-server.zqevans2.workers.dev/honk-140k.ckpt| \scripts\convert_dance_diffusion_to_diffusers.py| https://model-server.zqevans2.workers.dev|下载权重 | -| 开源代码引入 | https://dl.fbaipublicfiles.com/DiT/models/{model_name} | \scripts\convert_dit_to_diffusers.py| https://dl.fbaipublicfiles.com|下载权重 | -| 开源代码引入 | https://huggingface.co/kakaobrain/karlo-v1-alpha/tree/main/prior| \scripts\convert_original_stable_diffusion_to_diffusers.py| https://huggingface.co|设置说明 | -| 开源代码引入 | https://raw.githubusercontent.com/CompVis/stable-diffusion/main/configs/stable-diffusion/v1-inference.yaml| \scripts\convert_vae_pt_to_diffusers.py| https://raw.githubusercontent.com|下载配置 | -| 开源代码引入 | https://github.com/facebookresearch/xformers|\src\diffusers\models\attention_processor.py | https://github.com|设置说明 | -| 开源代码引入 | https://huggingface.co/models| \src\diffusers\models\modeling_flax_utils.py| https://huggingface.co|设置说明 | -| 开源代码引入 | https://huggingface.co/{pretrained_model_name_or_path}|\src\diffusers\models\modeling_flax_utils.py |https://huggingface.co |设置说明 | -| 开源代码引入 | https://huggingface.co/docs/transformers/installation#offline-mode| \src\diffusers\models\modeling_flax_utils.py| https://huggingface.co|设置说明 | -| 开源代码引入 | https://huggingface.co/models| \src\diffusers\models\modeling_flax_utils.py| https://huggingface.co |设置说明 | -| 开源代码引入 | https://pytorch.org/ and https://flax.readthedocs.io/en/latest/installation.html | \src\diffusers\models\modeling_pytorch_flax_utils.py| https://pytorch.org/|设置说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/issues/1619#issuecomment-1345604389| \src\diffusers\models\modeling_utils.py| https://github.com/|设置说明 | -| 开源代码引入 | https://raw.githubusercontent.com/CompVis/stable-diffusion/main/assets/stable-samples/img2img/sketch-mountains-input.jpg| \src\diffusers\pipelines\alt_diffusion\pipeline_alt_diffusion_img2img.py|https://raw.githubusercontent.com |设置说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/pull/254 | \src\diffusers\pipelines\alt_diffusion\pipeline_alt_diffusion_img2img.py| https://github.com|设置说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/pull/254| \src\diffusers\pipelines\alt_diffusion\pipeline_alt_diffusion.py| https://github.com|设置说明 | -| 开源代码引入 | https://hf.co/datasets/huggingface/documentation-images/resolve/main/diffusers/input_image_vermeer.png|\src\diffusers\pipelines\controlnet\pipeline_controlnet_img2img.py | https://hf.co/|设置说明 | -| 开源代码引入 | 
https://github.com/huggingface/diffusers/pull/254| \src\diffusers\pipelines\controlnet\pipeline_controlnet_img2img.py| https://github.com|设置说明 | -| 开源代码引入 | https://huggingface.co/datasets/diffusers/test-arrays/resolve/main/stable_diffusion_inpaint/boy.png| \src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint.py| https://huggingface.co|设置说明 | -| 开源代码引入 | https://huggingface.co/datasets/diffusers/test-arrays/resolve/main/stable_diffusion_inpaint/boy_mask.png| \src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint.py| https://huggingface.co|设置说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/pull/254| \src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint.py| https://github.com|设置说明 | -| 开源代码引入 | https://hf.co/datasets/huggingface/documentation-images/resolve/main/diffusers/input_image_vermeer.png| \src\diffusers\pipelines\controlnet\pipeline_controlnet.py| https://hf.co/ |设置说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/pull/254| \src\diffusers\pipelines\controlnet\pipeline_controlnet.py| https://github.com/|设置说明 | -| 开源代码引入 | https://huggingface.co/datasets/YiYiXu/test-doc-assets/resolve/main/blog_post_cell_10_output_0.jpeg|\src\diffusers\pipelines\controlnet\pipeline_flax_controlnet.py |https://huggingface.co|设置说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/pull/254|\src\diffusers\pipelines\controlnet\pipeline_flax_controlnet.py | https://github.com|设置说明 | -| 开源代码引入 | https://raw.githubusercontent.com/CompVis/stable-diffusion/main/assets/stable-samples/img2img/sketch-mountains-input.jpg|\src\diffusers\pipelines\deepfloyd_if\pipeline_if_img2img_superresolution.py | https://raw.githubusercontent.com|设置说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/pull/254|\src\diffusers\pipelines\deepfloyd_if\pipeline_if_img2img_superresolution.py | https://github.com/ |设置说明 | -| 开源代码引入 | https://raw.githubusercontent.com/CompVis/stable-diffusion/main/assets/stable-samples/img2img/sketch-mountains-input.jpg| \src\diffusers\pipelines\deepfloyd_if\pipeline_if_img2img.py| https://raw.githubusercontent.com|设置说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/pull/254 | \src\diffusers\pipelines\deepfloyd_if\pipeline_if_img2img.py| https://github.com/|设置说明 | -| 开源代码引入 | https://huggingface.co/datasets/diffusers/docs-images/resolve/main/if/person.png| \src\diffusers\pipelines\deepfloyd_if\pipeline_if_inpainting_superresolution.py|https://huggingface.co |设置说明 | -| 开源代码引入 | https://huggingface.co/datasets/diffusers/docs-images/resolve/main/if/glasses_mask.png|\src\diffusers\pipelines\deepfloyd_if\pipeline_if_inpainting_superresolution.py | https://huggingface.co/|设置说明 | -| 开源代码引入 | https://huggingface.co/datasets/diffusers/docs-images/resolve/main/if/person.png| \src\diffusers\pipelines\deepfloyd_if\pipeline_if_inpainting.py|https://huggingface.co |设置说明 | -| 开源代码引入 | https://huggingface.co/datasets/diffusers/docs-images/resolve/main/if/glasses_mask.png| \src\diffusers\pipelines\deepfloyd_if\pipeline_if_inpainting.py| https://huggingface.co/|设置说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/pull/254| \src\diffusers\pipelines\deepfloyd_if\pipeline_if_inpainting.py| https://github.com |设置说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/pull/254| \src\diffusers\pipelines\deepfloyd_if\pipeline_if_superresolution.py| https://github.com/|设置说明 | -| 开源代码引入 | https://huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main| \src\diffusers\pipelines\kandinsky\pipeline_kandinsky_img2img.py| 
https://huggingface.co|设置说明 | -| 开源代码引入 | https://huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main| \src\diffusers\pipelines\kandinsky\pipeline_kandinsky_inpaint.py|https://huggingface.co |设置说明 | -| 开源代码引入 | https://huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main| \src\diffusers\pipelines\kandinsky\pipeline_kandinsky_prior.py|https://huggingface.co |设置说明 | -| 开源代码引入 | https://huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main| \src\diffusers\pipelines\kandinsky\pipeline_kandinsky_prior.py| https://huggingface.co|设置说明 | -| 开源代码引入 | https://huggingface.co/valhalla/ldm-bert/blob/main/config.json| \src\diffusers\pipelines\latent_diffusion\pipeline_latent_diffusion.py| https://huggingface.co/|配置文件| -| 开源代码引入 | https://github.com/huggingface/diffusers/pull/254|\src\diffusers\pipelines\semantic_stable_diffusion\pipeline_semantic_stable_diffusion.py | https://github.com|设置说明 | -| 开源代码引入 | https://raw.githubusercontent.com/CompVis/stable-diffusion/main/configs/stable-diffusion/v1-inference.yaml|\src\diffusers\pipelines\stable_diffusion\convert_from_ckpt.py | https://raw.githubusercontent |配置文件 | -| 开源代码引入 | https://raw.githubusercontent.com/Stability-AI/stablediffusion/main/configs/stable-diffusion/v2-inference-v.yaml| \src\diffusers\pipelines\stable_diffusion\convert_from_ckpt.py | https://raw.githubusercontent.com/|配置文件 | -| 开源代码引入 | https://github.com/huggingface/diffusers/pull/254 | \src\diffusers\pipelines\stable_diffusion\pipeline_cycle_diffusion.py| https://github.com|设置说明 | -| 开源代码引入 | https://raw.githubusercontent.com/CompVis/stable-diffusion/main/assets/stable-samples/img2img/sketch-mountains-input.jpg|\src\diffusers\pipelines\stable_diffusion\pipeline_flax_stable_diffusion_img2img.py |https://raw.githubusercontent.com |设置说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/pull/254|\src\diffusers\pipelines\stable_diffusion\pipeline_flax_stable_diffusion_img2img.py |https://github.com/ |设置说明 | -| 开源代码引入 | https://raw.githubusercontent.com/CompVis/latent-diffusion/main/data/inpainting_examples/overture-creations-5sI6fQgYIuo.png| \src\diffusers\pipelines\stable_diffusion\pipeline_flax_stable_diffusion_inpaint.py| https://raw.githubusercontent.com/|设置说明 | -| 开源代码引入 | https://raw.githubusercontent.com/CompVis/latent-diffusion/main/data/inpainting_examples/overture-creations-5sI6fQgYIuo_mask.png| \src\diffusers\pipelines\stable_diffusion\pipeline_flax_stable_diffusion_inpaint.py| https://raw.githubusercontent.com/|设置说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/pull/254|\src\diffusers\pipelines\stable_diffusion\pipeline_flax_stable_diffusion_inpaint.py | https://github.com/ |设置说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/pull/254| \src\diffusers\pipelines\stable_diffusion\pipeline_flax_stable_diffusion.py|https://github.com/ |设置说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/pull/254| \src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_img2img.py|https://github.com/ |设置说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/pull/254| \src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_inpaint_legacy.py| https://github.com|设置说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/pull/254| \src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_inpaint.py| https://github.com/|设置说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/pull/254 |\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion.py | 
https://github.com |设置说明 | -| 开源代码引入 | https://github.com/Xiang-cd/DiffEdit-stable-diffusion/raw/main/assets/origin.png|\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_diffedit.py | https://github.com/|设置说明 | -| 开源代码引入 | https://github.com/Xiang-cd/DiffEdit-stable-diffusion/raw/main/assets/origin.png|\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_diffedit.py | https://github.com/|设置说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/pull/254|\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_diffedit.py |https://github.com/ |设置说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/pull/254| \src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_image_variation.py| https://github.com|设置说明 | -| 开源代码引入 | https://raw.githubusercontent.com/CompVis/stable-diffusion/main/assets/stable-samples/img2img/sketch-mountains-input.jpg|\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_instruct_pix2pix.py |https://raw.githubusercontent.com |设置说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/pull/254 |\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_instruct_pix2pix.py | https://github.com|设置说明 | -| 开源代码引入 | https://huggingface.co/docs/diffusers/api/schedulers#implemented-schedulers| \src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_k_diffusion.py|https://huggingface.co |设置说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/pull/254| \src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_model_editing.py| https://github.com/|设置说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/pull/254| \src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_panorama.py |https://github.com/ |设置说明 | -| 开源代码引入 | https://hf.co/datasets/sayakpaul/sample-datasets/resolve/main/cat.pt|\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_pix2pix_zero.py | https://hf.co/|设置说明 | -| 开源代码引入 | https://hf.co/datasets/sayakpaul/sample-datasets/resolve/main/dog.pt| \src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_pix2pix_zero.py| https://hf.co/|设置说明 | -| 开源代码引入 | https://github.com/pix2pixzero/pix2pix-zero/raw/main/assets/test_images/cats/cat_6.png| \src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_pix2pix_zero.py| https://github.com/|设置说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/pull/254 |\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_pix2pix_zero.py | https://github.com/|设置说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/pull/254|\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion.py | https://github.com/|设置说明 | -| 开源代码引入 | https://raw.githubusercontent.com/CompVis/stable-diffusion/main/assets/stable-samples/img2img/sketch-mountains-input.jpg|\src\diffusers\pipelines\stable_diffusion\pipeline_stable_unclip_img2img.py |https://raw.githubusercontent.com/ |设置说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/pull/254 | \src\diffusers\pipelines\stable_diffusion_safe\pipeline_stable_diffusion_safe.py|https://github.com/ |设置说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/issues/new|\src\diffusers\pipelines\pipeline_utils.py | https://github.com/|设置说明 | -| 开源代码引入 | https://huggingface.co/runwayml/stable-diffusion-inpainting| \src\diffusers\pipelines\pipeline_utils.py|https://huggingface.co |设置说明 | -| 开源代码引入 | https://huggingface.co/runwayml/stable-diffusion-inpainting | \src\diffusers\pipelines\pipeline_utils.py|https://huggingface.co 
|设置说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/blob/main/src/diffusers/pipelines/pipeline_pndm.py|\src\diffusers\schedulers\scheduling_pndm.py |https://github.com/ |设置说明 | -| 开源代码引入 | https://huggingface.co/docs/diffusers/installation#install-from-source|\src\diffusers\utils\__init__.py |https://huggingface.co |设置说明 | -| 开源代码引入 | https://huggingface.co|\src\diffusers\utils\constants.py |"https://huggingface.co |设置说明 | -| 开源代码引入 | https://raw.githubusercontent.com/huggingface/diffusers/{revision}/examples/community/{pipeline}.py|\src\diffusers\utils\dynamic_modules_utils.py | https://raw.githubusercontent.com |下载依赖 | -| 开源代码引入 | https://pypi.org/pypi/diffusers/json| \src\diffusers\utils\dynamic_modules_utils.py|https://pypi.org/ |设置说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/issues/new/choose|\src\diffusers\utils\hub_utils.py |https://github.com/ |设置说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/issues/new|\src\diffusers\utils\hub_utils.py | https://github.com/ |设置说明 | -| 开源代码引入 | https://huggingface.co/models|\src\diffusers\utils\hub_utils.py | https://huggingface.co/|设置说明 | -| 开源代码引入 | https://huggingface.co/{pretrained_model_name_or_path}|\src\diffusers\utils\hub_utils.py |https://huggingface.co/ |设置说明 | -| 开源代码引入 | https://huggingface.co/docs/diffusers/installation#offline-mode| \src\diffusers\utils\hub_utils.py|https://huggingface.co |设置说明 | -| 开源代码引入 | https://huggingface.co/models| \src\diffusers\utils\hub_utils.py|https://huggingface.co/ |设置说明| -| 开源代码引入 | https://github.com/google/flax|\src\diffusers\utils\import_utils.py | https://github.com|设置说明 | -| 开源代码引入 | https://pytorch.org/get-started/locally/| \src\diffusers\utils\import_utils.py| https://pytorch.org|设置说明 | -| 开源代码引入 | https://librosa.org/doc/latest/install.html| \src\diffusers\utils\import_utils.py| https://librosa.org|设置说明 | -| 开源代码引入 | https://github.com/rspeer/python-ftfy/tree/master#installing|\src\diffusers\utils\import_utils.py | https://github.com/|设置说明 | -| 开源代码引入 | https://huggingface.co/datasets/fusing/diffusers-testing/resolve/main|\src\diffusers\utils\testing_utils.py | https://huggingface.co |下载依赖 | -| 开源代码引入 | https://huggingface.co/models| \src\diffusers\configuration_utils.py| https://huggingface.co|设置说明 | -| 开源代码引入 | https://huggingface.co/{pretrained_model_name_or_path}| \src\diffusers\configuration_utils.py|https://huggingface.co |设置说明 | -| 开源代码引入 | https://huggingface.co/docs/diffusers/installation#offline-mode| \src\diffusers\configuration_utils.py| https://huggingface.co/|设置说明 | -| 开源代码引入 | https://huggingface.co/docs/diffusers/main/en/api/loaders#diffusers.loaders.LoraLoaderMixin.load_lora_weights|\src\diffusers\loaders.py |https://huggingface.co|设置说明 | -| 开源代码引入 | https://huggingface.co/|\src\diffusers\loaders.py |https://huggingface.co/ |下载依赖 | -| 开源代码引入 | huggingface.co/|\src\diffusers\loaders.py |huggingface.co/ |下载依赖 | -| 开源代码引入 | hf.co/|\src\diffusers\loaders.py|hf.co/ |下载依赖 | -| 开源代码引入 | https://hf.co/|\src\diffusers\loaders.py |https://hf.co/ |下载依赖 | -| 开源代码引入 | https://huggingface.co/{ckpt_name}|\utils\check_config_docstrings.py |https://huggingface.co/ |下载依赖 | -| 开源代码引入 | https://huggingface\.co/.+?)\|\utils\check_config_docstrings.py | https://huggingface.co|下载依赖 | -| 开源代码引入 | git+https://github.com/huggingface/doc-builder| \utils\check_repo.py | git+https://github.com/huggingface/doc-builder|下载依赖 | -| 开源代码引入 | https://huggingface.co/docs/diffusers/main/model_doc|\utils\release.py | https://huggingface.co/|下载依赖 | -| 开源代码引入 | 
https://huggingface.co/docs/diffusers/model_doc| \utils\release.py| https://huggingface.co|下载依赖 | -| 开源代码引入 | https://github.com/huggingface/diffusers/blob/main/CONTRIBUTING.md|\utils\stale.py |https://github.com/ |设置说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers|\CITATION.cff | https://github.com/|配置设置 | -| 开源代码引入 | https://github.com/huggingface/diffusers|\setup.py | https://github.com/ |配置设置 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/deepfloyd_if/watermark.py | diffusers0.18.1/src/diffusers/pipelines/deepfloyd_if/watermark.py | https://github.com/deep-floyd/IF/blob/b77482e36ca2031cb94dbca1001fc1e6400bf4ab/deepfloyd_if/modules/base.py#L287 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/examples/community/lpw_stable_diffusion_onnx.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开发引入 | / | diffusers0.18.1/scripts/convert_vq_diffusion_to_diffusers.py | https://raw.githubusercontent.com/microsoft/VQ-Diffusion/main/configs/ithq.yaml | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/clip_guided_stable_diffusion_img2img.py | diffusers0.18.1/examples/community/clip_guided_stable_diffusion.py | https://github.dev/crowsonkb/k-diffusion | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion_inpaint.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/models/vq_model.py | https://arxiv.org/abs/2112.10752 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/deepfloyd_if/pipeline_if_superresolution.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_latent_upscale.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/kandinsky2_2/pipeline_kandinsky2_2_prior.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/kandinsky/pipeline_kandinsky.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/custom_diffusion/train_custom_diffusion.py | diffusers0.18.1/examples/custom_diffusion/train_custom_diffusion.py | https://github.com/huggingface/diffusers/blob/main/examples/custom_diffusion | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/controlnet/train_controlnet_flax.py | diffusers0.18.1/examples/controlnet/train_controlnet_flax.py | https://github.com/python-pillow/Pillow/issues/5610 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/paint_by_example/pipeline_paint_by_example.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/controlnet/pipeline_controlnet_img2img.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_diffedit.py | https://arxiv.org/pdf/2210.11427.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/pipelines/latent_diffusion/pipeline_latent_diffusion_superresolution.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion_inpaint.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion_safe/pipeline_stable_diffusion_safe.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/training_utils.py | diffusers0.18.1/examples/text_to_image/train_text_to_image_lora.py | https://github.com/TiankaiHang/Min-SNR-Diffusion-Training/blob/521b624bd70c67cee4bdf49225915f5945a872e3/guided_diffusion/gaussian_diffusion.py#L1026 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/stable_diffusion_controlnet_img2img.py | diffusers0.18.1/examples/community/stable_diffusion_controlnet_img2img.py | https://hf.co/datasets/huggingface/documentation-images/resolve/main/diffusers/vermeer_canny_edged.png | 图片地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_flax_stable_diffusion.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/pndm/pipeline_pndm.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/unclip/pipeline_unclip_image_variation.py | diffusers0.18.1/src/diffusers/pipelines/unclip/pipeline_unclip_image_variation.py | https://huggingface.co/fusing/karlo-image-variations-diffusers/blob/main/feature_extractor/preprocessor_config.json | 相关配置 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_lcm.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_ddim_parallel.py | https://github.com/huggingface/diffusers/blob/74fd735eb073eb1d774b1ab4154a0876eb82f055/examples/dreambooth/train_dreambooth.py#L506 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.18.1/examples/community/composable_stable_diffusion.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_unclip_img2img.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/bit_diffusion.py | diffusers0.18.1/examples/community/bit_diffusion.py | https://github.com/pytorch/pytorch/issues/27072 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_unclip.py 
| diffusers0.18.1/src/diffusers/schedulers/scheduling_unclip.py | https://arxiv.org/pdf/2006.11239.pdf | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion_dual_guided.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/configuration_utils.py | diffusers0.18.1/src/diffusers/models/modeling_utils.py | https://github.com/huggingface/diffusers/pull/3129 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/training_utils.py | diffusers0.18.1/src/diffusers/loaders.py | https://pytorch.org/tutorials/beginner/saving_loading_models.html#what-is-a-state-dict | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_sag.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/textual_inversion/textual_inversion_flax.py | diffusers0.18.1/examples/research_projects/mulit_token_textual_inversion/textual_inversion_flax.py | https://github.com/deepmind/optax/issues/159#issuecomment-896459491 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/docs/source/ko/using-diffusers/reproducibility.md | diffusers0.18.1/src/diffusers/utils/testing_utils.py | https://pytorch.org/docs/stable/notes/randomness.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/pipelines/audioldm/pipeline_audioldm.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddpm_parallel.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_ddpm_parallel.py | https://arxiv.org/abs/2205.09991 | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/models/modeling_utils.py | https://huggingface.co/diffusers/installation.html#offline-mode | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/models/prior_transformer.py | https://arxiv.org/abs/2204.06125 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_sag.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion_text_to_image.py | https://huggingface.co/docs/transformers/model_doc/bert#transformers.BertTokenizer | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/schedulers/scheduling_dpmsolver_singlestep.py | https://github.com/openai/guided-diffusion | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_instruct_pix2pix.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/shap_e/pipeline_shap_e_img2img.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | 
diffusers0.18.1/src/diffusers/pipelines/stable_diffusion_xl/pipeline_stable_diffusion_xl_img2img.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/loaders.py | http://hostname | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/schedulers/scheduling_dpmsolver_singlestep.py | https://arxiv.org/abs/2211.01095 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/scripts/convert_unidiffuser_to_diffusers.py | diffusers0.18.1/scripts/convert_unidiffuser_to_diffusers.py | https://github.com/thu-ml/unidiffuser/blob/main/configs/sample_unidiffuser_v1.py | 源码实现 | -| 开发引入 | / | diffusers0.18.1/examples/community/text_inpainting.py | https://huggingface.co/docs/transformers/model_doc/clipseg | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/deepfloyd_if/pipeline_if_inpainting_superresolution.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/unet_2d_condition_flax.py | diffusers0.18.1/src/diffusers/models/unet_2d_condition_flax.py | https://arxiv.org/abs/2112.05682 | 论文地址 | -| 开发引入 | / | diffusers0.18.1/examples/community/stable_diffusion_controlnet_reference.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_unipc_multistep.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_unipc_multistep.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/schedulers/scheduling_dpmsolver_multistep.py | 源码实现 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/schedulers/scheduling_karras_ve_flax.py | https://arxiv.org/abs/2206.00364 | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/controlnet/pipeline_controlnet.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/schedulers/scheduling_dpmsolver_multistep.py | https://arxiv.org/pdf/2206.00364.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/attention_flax.py | diffusers0.18.1/src/diffusers/models/attention_flax.py | https://github.com/AminRezaei0x443/memory-efficient-attention | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion_xl/watermark.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion_xl/watermark.py | https://github.com/Stability-AI/generative-models/blob/613af104c6b85184091d42d374fef420eddb356d/scripts/demo/streamlit_helpers.py#L66 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/kandinsky2_2/pipeline_kandinsky2_2_controlnet.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/controlnet/pipeline_controlnet_img2img.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/configuration_utils.py | http://hostname | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/t2i_adapter/train_t2i_adapter_sdxl.py | diffusers0.18.1/examples/controlnet/train_controlnet.py | https://huggingface.co/docs/datasets/v2.0.0/en/dataset_script | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_panorama.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | 
-| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/examples/community/stable_diffusion_ipex.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/controlnet/pipeline_flax_controlnet.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/configuration_utils.py | diffusers0.18.1/src/diffusers/configuration_utils.py | https://github.com/huggingface/diffusers/pull/3129 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_model_editing.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_model_editing.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/models/attention_processor.py | https://facebookresearch.github.io/xformers/components/ops.html#xformers.ops.AttentionOpBase | 相关说明 | -| 开发引入 | / | diffusers0.18.1/examples/community/seed_resize_stable_diffusion.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_diffedit.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.18.1/examples/community/stable_diffusion_comparison.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/unclip/pipeline_unclip_image_variation.py | diffusers0.18.1/examples/community/unclip_image_interpolation.py | https://huggingface.co/fusing/karlo-image-variations-diffusers/blob/main/feature_extractor/preprocessor_config.json | 相关配置 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/seed_resize_stable_diffusion.py | diffusers0.18.1/examples/community/seed_resize_stable_diffusion.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/examples/community/multilingual_stable_diffusion.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_sag.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_sag.py | https://arxiv.org/pdf/2210.00939.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/training_utils.py | diffusers0.18.1/examples/controlnet/train_controlnet_flax.py | https://github.com/TiankaiHang/Min-SNR-Diffusion-Training/blob/521b624bd70c67cee4bdf49225915f5945a872e3/guided_diffusion/gaussian_diffusion.py#L847-L849 | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_pndm_flax.py | diffusers0.18.1/src/diffusers/pipelines/pndm/pipeline_pndm.py | https://arxiv.org/pdf/2202.09778.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/examples/community/stable_diffusion_comparison.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_sag.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_unipc_multistep.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_dpmsolver_multistep_inverse.py | https://arxiv.org/abs/2205.11487 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.18.1/examples/community/stable_diffusion_controlnet_inpaint_img2img.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.18.1/examples/community/tiled_upscaling.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_flax_stable_diffusion_img2img.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_upscale.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion_img2img.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/kandinsky2_2/pipeline_kandinsky2_2_prior.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_repaint.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_ddim.py | https://arxiv.org/pdf/2010.02502.pdf | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/deepfloyd_if/pipeline_if_img2img.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/stable_diffusion_controlnet_inpaint_img2img.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/schedulers/scheduling_utils.py | http://hostname | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/wildcard_stable_diffusion.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/audioldm/pipeline_audioldm.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/examples/community/stable_diffusion_reference.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_repaint.py | diffusers0.18.1/src/diffusers/pipelines/repaint/pipeline_repaint.py | https://arxiv.org/pdf/2201.09865.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_pndm_flax.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_pndm.py | https://arxiv.org/pdf/2202.09778.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_pndm_flax.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_pndm.py | https://arxiv.org/abs/2202.09778 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_k_diffusion.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/modeling_utils.py | diffusers0.18.1/src/diffusers/models/modeling_utils.py | https://pytorch.org/docs/stable/_modules/torch/nn/modules/module.html#Module | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/latent_diffusion/pipeline_latent_diffusion.py | diffusers0.18.1/src/diffusers/pipelines/latent_diffusion/pipeline_latent_diffusion.py | https://huggingface.co/models?filter=ldmbert | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/docs/source/_config.py | diffusers0.18.1/docs/source/_config.py | https://github.com/huggingface/diffusers.git | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.18.1/examples/community/stable_diffusion_ipex.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/stable_diffusion_repaint.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_unclip.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_repaint.py | https://arxiv.org/pdf/2006.11239.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/imagic_stable_diffusion.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/modeling_flax_pytorch_utils.py | diffusers0.18.1/src/diffusers/models/modeling_flax_pytorch_utils.py | https://github.com/patil-suraj/stable-diffusion-jax/blob/main/stable_diffusion_jax/convert_diffusers_to_jax.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | 
https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/lpw_stable_diffusion_onnx.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_pix2pix_zero.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开发引入 | / | diffusers0.18.1/examples/community/interpolate_stable_diffusion.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/training_utils.py | diffusers0.18.1/examples/research_projects/intel_opts/textual_inversion_dfq/textual_inversion.py | https://github.com/fadel/pytorch_ema/blob/master/torch_ema/ema.py#L14 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.18.1/src/diffusers/pipelines/text_to_video_synthesis/pipeline_text_to_video_synth.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_unipc_multistep.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_dpmsolver_sde.py | https://arxiv.org/abs/2305.08891 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_sde_ve_flax.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_karras_ve.py | https://arxiv.org/abs/2011.13456 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/vae_flax.py | diffusers0.18.1/src/diffusers/models/controlnet_flax.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_ldm3d.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_ldm3d.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_ddim_parallel.py | https://arxiv.org/pdf/2305.08891.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_unipc_multistep.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_euler_ancestral_discrete.py | https://arxiv.org/abs/2305.08891 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.18.1/examples/community/multilingual_stable_diffusion.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_repaint.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_ddim_flax.py | 
https://arxiv.org/pdf/2010.02502.pdf | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/deepfloyd_if/pipeline_if_inpainting_superresolution.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_panorama.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/kandinsky2_2/pipeline_kandinsky2_2_controlnet.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.18.1/src/diffusers/pipelines/controlnet/pipeline_flax_controlnet.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/unclip_text_interpolation.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_sde_ve_flax.py | diffusers0.18.1/src/diffusers/pipelines/stochastic_karras_ve/pipeline_stochastic_karras_ve.py | https://arxiv.org/abs/2011.13456 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/imagic_stable_diffusion.py | diffusers0.18.1/examples/community/imagic_stable_diffusion.py | https://github.com/justinpinkney/stable-diffusion/blob/main/notebooks/imagic.ipynb | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.18.1/examples/community/wildcard_stable_diffusion.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/utils/check_table.py | diffusers0.18.1/utils/check_table.py | https://stackoverflow.com/questions/29916065/how-to-do-camelcase-split-in-python | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion_xl/pipeline_stable_diffusion_xl_img2img.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_latent_upscale.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/deepfloyd_if/pipeline_if_inpainting.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_pndm_flax.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_dpmsolver_sde.py | https://imagen.research.google/video/paper.pdf | 论文地址 | -| 开发引入 | / | diffusers0.18.1/examples/community/stable_diffusion_controlnet_inpaint_img2img.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/schedulers/scheduling_heun_discrete.py | https://arxiv.org/pdf/2206.00364.pdf | 论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_upscale.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/utils/dynamic_modules_utils.py | diffusers0.18.1/src/diffusers/pipelines/pipeline_flax_utils.py | https://huggingface.co/docs/hub/models-gated#gated-models | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/paint_by_example/pipeline_paint_by_example.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/shap_e/pipeline_shap_e_img2img.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/paint_by_example/pipeline_paint_by_example.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.18.1/examples/community/text_inpainting.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/README.md | diffusers0.18.1/examples/community/lpw_stable_diffusion.py | https://huggingface.co/CompVis/stable-diffusion-v1-4 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_panorama.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_ddim_parallel.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/alt_diffusion/pipeline_alt_diffusion.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/deepfloyd_if/pipeline_if_inpainting.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_flax_stable_diffusion_img2img.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/schedulers/scheduling_dpmsolver_multistep.py | https://arxiv.org/abs/2211.01095 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_unclip.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | 
diffusers0.18.1/src/diffusers/pipelines/latent_diffusion_uncond/pipeline_latent_diffusion_uncond.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/text_to_video_synthesis/pipeline_text_to_video_synth_img2img.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_unipc_multistep.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_lms_discrete.py | https://arxiv.org/abs/2305.08891 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/examples/community/stable_diffusion_controlnet_inpaint.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/vq_diffusion/pipeline_vq_diffusion.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/schedulers/scheduling_k_dpm_2_discrete.py | https://github.com/crowsonkb/k-diffusion/blob/5b3af030dd83e0297272d861c19477735d0317ec/k_diffusion/sampling.py#L188 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion_xl/pipeline_stable_diffusion_xl.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_pix2pix_zero.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/controlnet/pipeline_controlnet_img2img.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_upscale.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/stable_diffusion_comparison.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.18.1/examples/community/lpw_stable_diffusion.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/deepfloyd_if/pipeline_if_img2img_superresolution.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_depth2img.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | 
diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_model_editing.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/docs/source/ko/using-diffusers/weighted_prompts.md | diffusers0.18.1/src/diffusers/utils/testing_utils.py | https://github.com/damian0815/compel | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.18.1/src/diffusers/pipelines/semantic_stable_diffusion/pipeline_semantic_stable_diffusion.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/shap_e/pipeline_shap_e_img2img.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.18.1/src/diffusers/pipelines/controlnet/pipeline_controlnet_img2img.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion_text_to_image.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_latent_upscale.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/unclip/pipeline_unclip.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_k_diffusion.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/attention_flax.py | diffusers0.18.1/src/diffusers/models/attention_flax.py | https://arxiv.org/abs/2002.05202 | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/schedulers/scheduling_euler_discrete.py | https://arxiv.org/abs/2206.00364 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/lpw_stable_diffusion.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_depth2img.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/examples/community/text_inpainting.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开发引入 | / | diffusers0.18.1/examples/community/unclip_image_interpolation.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_repaint.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_repaint.py | https://arxiv.org/pdf/2201.09865.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_inpaint_legacy.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/audioldm/pipeline_audioldm.py | https://huggingface.co/laion/clap-htsat-unfused | 相关说明 
| -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion_image_variation.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/dreambooth/train_dreambooth.py | diffusers0.18.1/examples/dreambooth/train_dreambooth_lora.py | https://dreambooth.github.io/ | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/unidiffuser/pipeline_unidiffuser.py | https://arxiv.org/pdf/2303.06555.pdf | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/controlnet/pipeline_controlnet_img2img.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/docs/source/ko/using-diffusers/custom_pipeline_overview.md | diffusers0.18.1/src/diffusers/models/controlnet.py | https://arxiv.org/abs/2302.05543 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_panorama.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion_inpaint.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/schedulers/scheduling_dpmsolver_multistep.py | https://arxiv.org/abs/2206.00927 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/shap_e/pipeline_shap_e.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_instruct_pix2pix.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion_xl/pipeline_stable_diffusion_xl_img2img.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.18.1/src/diffusers/pipelines/unidiffuser/pipeline_unidiffuser.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/deepfloyd_if/pipeline_if.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/stable_diffusion_controlnet_reference.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/utils/check_config_docstrings.py | diffusers0.18.1/utils/check_config_docstrings.py | https://huggingface.co/bert-base-uncased | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_lcm.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_ddim.py | https://github.com/hojonathanho/diffusion | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | 
diffusers0.18.1/examples/community/imagic_stable_diffusion.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/modeling_pytorch_flax_utils.py | diffusers0.18.1/src/diffusers/models/modeling_pytorch_flax_utils.py | https://github.com/huggingface/transformers/blob/main/src/transformers/modeling_flax_pytorch_utils.py#L224-L352 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion_xl/pipeline_stable_diffusion_xl.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/models/modeling_utils.py | http://hostname | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/stable_diffusion_controlnet_inpaint_img2img.py | diffusers0.18.1/examples/community/stable_diffusion_controlnet_inpaint.py | https://github.com/CompVis/latent-diffusion/raw/main/data/inpainting_examples/overture-creations-5sI6fQgYIuo.png | 图片地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/schedulers/scheduling_dpmsolver_multistep_inverse.py | https://arxiv.org/abs/2211.01095 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_pndm_flax.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_ddpm_flax.py | https://github.com/ermongroup/ddim | 源码实现 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_paradigms.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_model_editing.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_unipc_multistep.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_ddim_parallel.py | https://arxiv.org/abs/2205.11487 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_inpaint_legacy.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_cycle_diffusion.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/lora.py | diffusers0.18.1/src/diffusers/loaders.py | https://github.com/darkstorm2150/sd-scripts/blob/main/docs/train_network_README-en.md#execute-learning | 源码实现 | -| 开发引入 | / | diffusers0.18.1/examples/community/stable_diffusion_controlnet_img2img.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开发引入 | / | diffusers0.18.1/examples/community/wildcard_stable_diffusion.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/stable_diffusion_controlnet_inpaint_img2img.py | diffusers0.18.1/examples/community/stable_diffusion_controlnet_inpaint_img2img.py | 
https://github.com/CompVis/latent-diffusion/raw/main/data/inpainting_examples/overture-creations-5sI6fQgYIuo_mask.png | 图片地址 | -| 开发引入 | / | diffusers0.18.1/examples/community/lpw_stable_diffusion_onnx.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_sag.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/img2img_inpainting.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/stable_diffusion_tensorrt_txt2img.py | diffusers0.18.1/examples/community/stable_diffusion_tensorrt_img2img.py | https://pypi.ngc.nvidia.com | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/text_inpainting.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/schedulers/scheduling_unipc_multistep.py | https://arxiv.org/abs/2211.01095 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_upscale.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_paradigms.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_lcm.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_ddim_parallel.py | https://github.com/pesser/pytorch_diffusion | 源码实现 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_panorama.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/semantic_stable_diffusion/pipeline_semantic_stable_diffusion.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_img2img.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.18.1/src/diffusers/pipelines/controlnet/pipeline_controlnet.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_inpaint_legacy.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_model_editing.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_depth2img.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/controlnet/pipeline_controlnet_inpaint.py | diffusers0.18.1/src/diffusers/pipelines/controlnet/pipeline_controlnet_inpaint.py | https://huggingface.co/lllyasviel/control_v11p_sd15_inpaint | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/pipelines/controlnet/pipeline_controlnet_img2img.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion_inpaint_legacy.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/unconditional_image_generation/train_unconditional.py | diffusers0.18.1/examples/unconditional_image_generation/train_unconditional.py | https://github.com/huggingface/accelerate/pull/962/files | 源码实现 | -| 开发引入 | / | diffusers0.18.1/examples/community/stable_diffusion_controlnet_reference.py | https://github.com/Mikubill/sd-webui-controlnet/discussions/1236","https://github.com/Mikubill/sd-webui-controlnet/discussions/1280 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/multilingual_stable_diffusion.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/deepfloyd_if/pipeline_if_inpainting_superresolution.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/latent_diffusion/pipeline_latent_diffusion.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/bit_diffusion.py | diffusers0.18.1/examples/community/bit_diffusion.py | https://github.com/lucidrains/bit-diffusion/blob/main/bit_diffusion/bit_diffusion.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/deepfloyd_if/pipeline_if_inpainting.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/README.md | diffusers0.18.1/examples/community/imagic_stable_diffusion.py | https://huggingface.co/CompVis/stable-diffusion-v1-4 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/pipelines/text_to_video_synthesis/pipeline_text_to_video_synth_img2img.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_inpaint.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开发引入 | / | 
diffusers0.18.1/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion_image_variation.py | https://huggingface.co/docs/transformers/model_doc/bert#transformers.BertTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/unclip_image_interpolation.py | diffusers0.18.1/examples/community/unclip_image_interpolation.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPVisionModelWithProjection | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_pndm_flax.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_pndm_flax.py | https://arxiv.org/abs/2202.09778 | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/latent_diffusion/pipeline_latent_diffusion.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.18.1/src/diffusers/pipelines/text_to_video_synthesis/pipeline_text_to_video_zero.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_pndm_flax.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_pndm_flax.py | https://github.com/ermongroup/ddim | 源码实现 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/kandinsky/pipeline_kandinsky_inpaint.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/controlnet/pipeline_controlnet_inpaint.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_unipc_multistep.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_dpmsolver_singlestep.py | https://arxiv.org/abs/2205.11487 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_pix2pix_zero.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/pipeline_utils.py | diffusers0.18.1/src/diffusers/models/modeling_utils.py | https://facebookresearch.github.io/xformers/components/ops.html#xformers.ops.memory_efficient_attention | 相关说明 | -| 开发引入 | / | diffusers0.18.1/examples/community/stable_diffusion_repaint.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/unclip/text_proj.py | https://arxiv.org/abs/2204.06125 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/controlnet/pipeline_flax_controlnet.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/examples/community/interpolate_stable_diffusion.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion_text_to_image.py | https://huggingface.co/docs/transformers/model_doc/bert | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/unclip_image_interpolation.py | 
https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/seed_resize_stable_diffusion.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_img2img.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/unidiffuser/modeling_uvit.py | diffusers0.18.1/src/diffusers/pipelines/unidiffuser/modeling_uvit.py | https://github.com/thu-ml/unidiffuser/blob/main/libs/uvit_multi_post_ln_v1.py#L104 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_panorama.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/controlnet/pipeline_controlnet_img2img.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/training_utils.py | diffusers0.18.1/examples/text_to_image/train_text_to_image.py | https://github.com/TiankaiHang/Min-SNR-Diffusion-Training/blob/521b624bd70c67cee4bdf49225915f5945a872e3/guided_diffusion/gaussian_diffusion.py#L847-L849 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/stable_diffusion_controlnet_inpaint_img2img.py | diffusers0.18.1/examples/community/stable_diffusion_controlnet_inpaint_img2img.py | https://github.com/CompVis/latent-diffusion/raw/main/data/inpainting_examples/overture-creations-5sI6fQgYIuo.png | 图片地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.18.1/examples/community/stable_diffusion_ipex.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/text_to_video_synthesis/pipeline_text_to_video_synth_img2img.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_paradigms.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/controlnet/train_controlnet_flax.py | diffusers0.18.1/examples/controlnet/train_controlnet_flax.py | https://huggingface.co/docs/datasets/dataset_script | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_image_variation.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/vae_flax.py | diffusers0.18.1/src/diffusers/models/vae_flax.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_unclip.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion_upscale.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/consistency_models/pipeline_consistency_models.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_img2img.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/text_to_video_synthesis/pipeline_text_to_video_synth.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.18.1/src/diffusers/pipelines/alt_diffusion/pipeline_alt_diffusion.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | https://raw.githubusercontent.com/Stability-AI/generative-models/main/configs/inference/sd_xl_base.yaml | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/pipeline_utils.py | diffusers0.18.1/src/diffusers/pipelines/controlnet/multicontrolnet.py | https://hf.co/docs/accelerate/main/en/usage_guides/big_modeling#designing-a-device-map | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_unclip.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_ddpm_parallel.py | https://arxiv.org/pdf/2006.11239.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_upscale.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/examples/community/stable_diffusion_reference.py | https://arxiv.org/pdf/2305.08891.pdf | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_img2img.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.18.1/examples/community/stable_diffusion_controlnet_reference.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_pix2pix_zero.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开发引入 | / | diffusers0.18.1/examples/community/unclip_image_interpolation.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/deepfloyd_if/pipeline_if_img2img_superresolution.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/deepfloyd_if/pipeline_if_superresolution.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.18.1/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_diffedit.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_model_editing.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_model_editing.py | https://arxiv.org/abs/2303.08084 | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_latent_upscale.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/pipelines/alt_diffusion/pipeline_alt_diffusion.py | https://arxiv.org/pdf/2305.08891.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion_safe/pipeline_stable_diffusion_safe.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/stable_diffusion_reference.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.18.1/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开发引入 | / | diffusers0.18.1/examples/community/clip_guided_images_mixing_stable_diffusion.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion_xl/pipeline_stable_diffusion_xl_img2img.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/scripts/convert_shap_e_to_diffusers.py | diffusers0.18.1/scripts/convert_shap_e_to_diffusers.py | https://openaipublic.azureedge.net/main/shap-e/text_cond.pt | 预训练模型 | -| 开源代码引入 | 
https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/kandinsky2_2/pipeline_kandinsky2_2.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/controlnet/pipeline_controlnet_inpaint.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/pipelines/deepfloyd_if/pipeline_if_inpainting_superresolution.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.18.1/src/diffusers/pipelines/paint_by_example/pipeline_paint_by_example.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/alt_diffusion/pipeline_alt_diffusion.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/consistency_models/pipeline_consistency_models.py | https://arxiv.org/pdf/2303.01469 | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/kandinsky2_2/pipeline_kandinsky2_2_inpainting.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_unclip.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/imagic_stable_diffusion.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/kandinsky/pipeline_kandinsky.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/controlnet/train_controlnet_flax.py | diffusers0.18.1/examples/controlnet/train_controlnet_flax.py | https://huggingface.co/docs/datasets/package_reference/main_classes#datasets.Dataset.load_from_disk | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_pndm_flax.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_dpmsolver_multistep.py | https://imagen.research.google/video/paper.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion_image_variation.py | diffusers0.18.1/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion_dual_guided.py | https://huggingface.co/datasets/diffusers/images/resolve/main/benz.jpg | 图片地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/latent_diffusion_uncond/pipeline_latent_diffusion_uncond.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.18.1/src/diffusers/pipelines/alt_diffusion/pipeline_alt_diffusion_img2img.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/models/modeling_flax_utils.py | http://hostname | 相关说明 | -| 开发引入 | / | diffusers0.18.1/examples/community/stable_diffusion_reference.py | 
https://github.com/Mikubill/sd-webui-controlnet/discussions/1236","https://github.com/Mikubill/sd-webui-controlnet/discussions/1280 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/unidiffuser/pipeline_unidiffuser.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/modeling_text_unet.py | diffusers0.18.1/src/diffusers/pipelines/versatile_diffusion/modeling_text_unet.py | https://github.com/huggingface/diffusers/issues/2011#issuecomment-1547958131 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_pix2pix_zero.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_sag.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion_inpaint_legacy.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.18.1/examples/community/stable_diffusion_repaint.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_repaint.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_repaint.py | https://arxiv.org/pdf/2010.02502.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.18.1/examples/community/stable_diffusion_tensorrt_img2img.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_k_diffusion.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion_xl/pipeline_stable_diffusion_xl_img2img.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/pipeline_utils.py | diffusers0.18.1/src/diffusers/pipelines/pipeline_utils.py | https://facebookresearch.github.io/xformers/components/ops.html#xformers.ops.memory_efficient_attention | 相关说明 | -| 开发引入 | / | diffusers0.18.1/examples/community/imagic_stable_diffusion.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_img2img.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/loaders.py | 
diffusers0.18.1/src/diffusers/loaders.py | https://huggingface.co/WarriorMama777/OrangeMixs/blob/main/Models/AbyssOrangeMix/AbyssOrangeMix.safetensors | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_unclip_img2img.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/unclip_image_interpolation.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/composable_stable_diffusion.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddpm_parallel.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_ddpm_flax.py | https://arxiv.org/abs/2205.09991 | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_flax_stable_diffusion_inpaint.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.FlaxCLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/pipelines/alt_diffusion/pipeline_alt_diffusion.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion_upscale.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion_img2img.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开发引入 | / | diffusers0.18.1/examples/community/stable_diffusion_ipex.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/stable_diffusion_tensorrt_txt2img.py | diffusers0.18.1/examples/community/stable_diffusion_tensorrt_inpaint.py | https://pypi.ngc.nvidia.com | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_attend_and_excite.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开发引入 | / | diffusers0.18.1/scripts/convert_vq_diffusion_to_diffusers.py | https://facevcstandard.blob.core.windows.net/v-zhictang/Improved-VQ-Diffusion_model_release/ithq_vqvae.pth?sv=2020-10-02&st=2022-05-30T15%3A17%3A18Z&se=2030-05-31T15%3A17%3A00Z&sr=b&sp=r&sig=1jVavHFPp | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/pipelines/paint_by_example/pipeline_paint_by_example.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/schedulers/scheduling_unipc_multistep.py | https://arxiv.org/abs/2302.04867 | 论文地址 | 
-| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.18.1/src/diffusers/pipelines/controlnet/pipeline_flax_controlnet.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/attention_flax.py | diffusers0.18.1/src/diffusers/models/attention_flax.py | https://arxiv.org/abs/2112.05682v2 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/utils/dynamic_modules_utils.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_utils.py | https://huggingface.co/docs/hub/models-gated#gated-models | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/unclip/pipeline_unclip_image_variation.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/modeling_text_unet.py | diffusers0.18.1/src/diffusers/models/controlnet.py | https://github.com/huggingface/diffusers/issues/2011#issuecomment-1547958131 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_instruct_pix2pix.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开发引入 | / | diffusers0.18.1/examples/custom_diffusion/train_custom_diffusion.py | https://www.cs.cmu.edu/~custom-diffusion | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.18.1/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_dpmsolver_singlestep.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_dpmsolver_multistep_flax.py | https://github.com/LuChengTHU/dpm-solver | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_flax_stable_diffusion.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.18.1/examples/community/text_inpainting.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_sag.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.18.1/examples/community/stable_diffusion_controlnet_inpaint.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion_img2img.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_paradigms.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_attend_and_excite.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_depth2img.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/vq_diffusion/pipeline_vq_diffusion.py | diffusers0.18.1/src/diffusers/pipelines/vq_diffusion/pipeline_vq_diffusion.py | https://huggingface.co/openai/clip-vit-base-patch32 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/sd_text2img_k_diffusion.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开发引入 | / | diffusers0.18.1/examples/community/lpw_stable_diffusion.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/vae_flax.py | diffusers0.18.1/src/diffusers/models/vae_flax.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/stable_diffusion_ipex.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_upscale.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_lcm.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_ddim_flax.py | https://github.com/hojonathanho/diffusion | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.18.1/src/diffusers/pipelines/deepfloyd_if/pipeline_if_inpainting_superresolution.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开发引入 | / | diffusers0.18.1/examples/community/lpw_stable_diffusion_onnx.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.18.1/src/diffusers/pipelines/text_to_video_synthesis/pipeline_text_to_video_zero.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/unet_2d_condition_flax.py | 
diffusers0.18.1/src/diffusers/models/unet_2d_blocks_flax.py | https://arxiv.org/abs/2112.05682 | 论文地址 | -| 开发引入 | / | diffusers0.18.1/examples/community/wildcard_stable_diffusion.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/ddim_noise_comparative_analysis.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_latent_upscale.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/README.md | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_cycle_diffusion.py | https://huggingface.co/CompVis/stable-diffusion-v1-4 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_unipc_multistep.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_ddpm_parallel.py | https://arxiv.org/abs/2205.11487 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.18.1/src/diffusers/pipelines/text_to_video_synthesis/pipeline_text_to_video_zero.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/text_to_video_synthesis/pipeline_text_to_video_zero.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/mixture_canvas.py | diffusers0.18.1/examples/community/mixture_canvas.py | https://github.com/huggingface/diffusers/blob/1138d63b519e37f0ce04e027b9f4a3261d27c628/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_img2img.py#L44 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion_dual_guided.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_panorama.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开发引入 | / | diffusers0.18.1/examples/community/multilingual_stable_diffusion.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/kandinsky2_2/pipeline_kandinsky2_2.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/experimental/rl/value_guided_sampling.py | https://github.com/jannerm/diffuser | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_pndm_flax.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_ddim_parallel.py | https://imagen.research.google/video/paper.pdf | 论文地址 | -| 开发引入 | / | 
diffusers0.18.1/src/diffusers/pipelines/unidiffuser/pipeline_unidiffuser.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPImageProcessor | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/deepfloyd_if/pipeline_if_img2img.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开发引入 | / | diffusers0.18.1/examples/community/interpolate_stable_diffusion.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/latent_diffusion/pipeline_latent_diffusion_superresolution.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_diffedit.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.18.1/src/diffusers/pipelines/deepfloyd_if/pipeline_if.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_flax_stable_diffusion_inpaint.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion_inpaint_legacy.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/alt_diffusion/pipeline_alt_diffusion_img2img.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/pipeline_utils.py | diffusers0.18.1/src/diffusers/pipelines/pipeline_utils.py | https://hf.co/docs/accelerate/main/en/usage_guides/big_modeling#designing-a-device-map | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.18.1/examples/community/stable_diffusion_tensorrt_img2img.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | https://raw.githubusercontent.com/Stability-AI/generative-models/main/configs/inference/sd_xl_refiner.yaml | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/seed_resize_stable_diffusion.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_pndm_flax.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_euler_ancestral_discrete.py | https://imagen.research.google/video/paper.pdf | 论文地址 | -| 开发引入 | / | 
diffusers0.18.1/src/diffusers/pipelines/unclip/pipeline_unclip.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/vae_flax.py | diffusers0.18.1/src/diffusers/models/vae_flax.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_unipc_multistep.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_unipc_multistep.py | https://arxiv.org/abs/2305.08891 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_model_editing.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/controlnet/pipeline_controlnet.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.18.1/examples/community/stable_diffusion_tensorrt_inpaint.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/audioldm/pipeline_audioldm.py | https://huggingface.co/docs/transformers/main/model_doc/clap#transformers.ClapTextModelWithProjection | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/text_inpainting.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/examples/community/stable_diffusion_controlnet_inpaint_img2img.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.18.1/src/diffusers/pipelines/controlnet/pipeline_controlnet_img2img.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/schedulers/scheduling_utils_flax.py | https://huggingface.co/transformers/installation.html#offline-mode | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_img2img.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/scripts/convert_kakao_brain_unclip_to_diffusers.py | diffusers0.18.1/scripts/convert_kakao_brain_unclip_to_diffusers.py | https://arena.kakaocdn.net/brainrepo/models/karlo-public/v1.0.0.alpha/85626483eaca9f581e2a78d31ff905ca/prior-ckpt-step%3D01000000-of-01000000.ckpt | 预训练模型 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/schedulers/scheduling_dpmsolver_multistep_flax.py | https://arxiv.org/abs/2211.01095 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_image_variation.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_image_variation.py | https://huggingface.co/lambdalabs/sd-image-variations-diffusers/blob/main/feature_extractor/preprocessor_config.json | 相关配置 | -| 开发引入 | / | 
diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_diffedit.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.18.1/src/diffusers/pipelines/controlnet/pipeline_controlnet_inpaint.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_unclip.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_repaint.py | diffusers0.18.1/examples/community/bit_diffusion.py | https://arxiv.org/pdf/2010.02502.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_inpaint.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/training_utils.py | diffusers0.18.1/examples/text_to_image/train_text_to_image_lora.py | https://github.com/TiankaiHang/Min-SNR-Diffusion-Training/blob/521b624bd70c67cee4bdf49225915f5945a872e3/guided_diffusion/gaussian_diffusion.py#L847-L849 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_lms_discrete_flax.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_lms_discrete_flax.py | https://github.com/crowsonkb/k-diffusion/blob/481677d114f6ea445aa009cf5bd7a9cdee909e47/k_diffusion/sampling.py#L181 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/modeling_flax_pytorch_utils.py | diffusers0.18.1/src/diffusers/models/modeling_flax_pytorch_utils.py | https://github.com/huggingface/transformers/blob/c603c80f46881ae18b2ca50770ef65fa4033eacd/src/transformers/modeling_flax_pytorch_utils.py#L69 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/stable_diffusion_tensorrt_inpaint.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_depth2img.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_flax_stable_diffusion_img2img.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.FlaxCLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.18.1/examples/community/interpolate_stable_diffusion.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | 
diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_instruct_pix2pix.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_sag.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/stable_diffusion_controlnet_inpaint_img2img.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion_xl/pipeline_stable_diffusion_xl.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/kandinsky2_2/pipeline_kandinsky2_2_img2img.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/text_to_video_synthesis/pipeline_text_to_video_synth.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/utils/dynamic_modules_utils.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_utils_flax.py | https://huggingface.co/docs/hub/models-gated#gated-models | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/vq_diffusion/pipeline_vq_diffusion.py | diffusers0.18.1/src/diffusers/pipelines/vq_diffusion/pipeline_vq_diffusion.py | https://github.com/huggingface/transformers/blob/d92e22d1f28324f513f3080e5c47c071a3916721/src/transformers/models/clip/modeling_clip.py#L1052-L1053 | 源码实现 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/schedulers/scheduling_dpmsolver_sde.py | https://github.com/crowsonkb/k-diffusion/blob/41b4cb6df0506694a7776af31349acf082bf6091/k_diffusion/sampling.py#L543 | 源码实现 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion_dual_guided.py | https://huggingface.co/docs/transformers/model_doc/bert#transformers.BertTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stochastic_karras_ve/pipeline_stochastic_karras_ve.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/unet_3d_condition.py | diffusers0.18.1/src/diffusers/models/unet_3d_condition.py | https://huggingface.co/blog/reformer#2-chunked-feed-forward-layers | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/stable_diffusion_controlnet_inpaint_img2img.py | diffusers0.18.1/examples/community/stable_diffusion_controlnet_inpaint.py | https://github.com/CompVis/latent-diffusion/raw/main/data/inpainting_examples/overture-creations-5sI6fQgYIuo_mask.png | 图片地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/examples/community/img2img_inpainting.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/dreambooth/train_dreambooth.py 
| diffusers0.18.1/examples/dreambooth/train_dreambooth.py | https://dreambooth.github.io/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/README.md | diffusers0.18.1/examples/community/seed_resize_stable_diffusion.py | https://huggingface.co/CompVis/stable-diffusion-v1-4 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/examples/community/mixture_tiling.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/training_utils.py | diffusers0.18.1/src/diffusers/training_utils.py | https://pytorch.org/tutorials/beginner/saving_loading_models.html#what-is-a-state-dict | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_depth2img.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddpm_wuerstchen.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_ddpm_flax.py | https://arxiv.org/abs/2006.11239 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/utils/import_utils.py | diffusers0.18.1/src/diffusers/utils/import_utils.py | https://github.com/huggingface/accelerate/blob/874c4967d94badd24f893064cc3bef45f57cadf7/src/accelerate/utils/versions.py#L338 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/attention_flax.py | diffusers0.18.1/src/diffusers/models/attention.py | https://arxiv.org/abs/2002.05202 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/vae_flax.py | diffusers0.18.1/src/diffusers/models/unet_2d_condition_flax.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/attention_flax.py | diffusers0.18.1/src/diffusers/models/attention_flax.py | https://arxiv.org/abs/1706.03762 | 论文地址 | -| 开发引入 | / | diffusers0.18.1/examples/community/clip_guided_stable_diffusion.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion_safe/pipeline_stable_diffusion_safe.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/setup.py | diffusers0.18.1/setup.py | https://test.pypi.org/legacy/ | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/models/autoencoder_kl.py | https://arxiv.org/abs/2112.10752 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddpm_wuerstchen.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_ddpm_parallel.py | https://arxiv.org/abs/2006.11239 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_model_editing.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/modeling_flax_utils.py | diffusers0.18.1/src/diffusers/models/modeling_flax_utils.py | 
https://github.com/deepmind/jmp/blob/3a8318abc3292be38582794dbf7b094e6583b192/jmp/_src/policy.py#L27 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_unipc_multistep.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_heun_discrete.py | https://arxiv.org/abs/2305.08891 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/scripts/convert_unidiffuser_to_diffusers.py | diffusers0.18.1/scripts/convert_unidiffuser_to_diffusers.py | https://huggingface.co/thu-ml/unidiffuser-v1 | 相关说明 | -| 开发引入 | / | diffusers0.18.1/examples/community/tiled_upscaling.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_pndm_flax.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_ddpm.py | https://github.com/ermongroup/ddim | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/text_inpainting.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.18.1/examples/community/img2img_inpainting.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/schedulers/scheduling_dpmsolver_multistep_inverse.py | https://github.com/openai/guided-diffusion | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_ldm3d.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/pndm/pipeline_pndm.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/examples/community/clip_guided_images_mixing_stable_diffusion.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_unipc_multistep.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_deis_multistep.py | https://arxiv.org/abs/2305.08891 | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion_inpaint_legacy.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/pipeline_utils.py | diffusers0.18.1/src/diffusers/pipelines/pipeline_utils.py | https://huggingface.co/docs/diffusers/using-diffusers/custom_pipeline_overview | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/score_sde_ve/pipeline_score_sde_ve.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_flax_stable_diffusion_inpaint.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_sde_ve_flax.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_sde_vp.py | https://arxiv.org/abs/2011.13456 | 论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_inpaint_legacy.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_inpaint_legacy.py | https://github.com/huggingface/diffusers/pull/3533 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_consistency_models.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_consistency_models.py | https://github.com/openai/consistency_models/blob/main/cm/karras_diffusion.py#L675 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_flax_stable_diffusion.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.18.1/examples/community/tiled_upscaling.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.18.1/examples/community/unclip_image_interpolation.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/semantic_stable_diffusion/pipeline_semantic_stable_diffusion.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/pipelines/controlnet/pipeline_controlnet.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开发引入 | / | diffusers0.18.1/examples/community/sd_text2img_k_diffusion.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/schedulers/scheduling_dpmsolver_sde.py | https://arxiv.org/pdf/2206.00364.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/pipelines/text_to_video_synthesis/pipeline_text_to_video_zero.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_sde_ve_flax.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_karras_ve_flax.py | https://arxiv.org/abs/2011.13456 | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/vq_diffusion/pipeline_vq_diffusion.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_paradigms.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/text_to_video_synthesis/pipeline_text_to_video_zero.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | https://huggingface.co/thibaud/controlnet-canny-sd21 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | 
diffusers0.18.1/src/diffusers/pipelines/controlnet/pipeline_controlnet.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开发引入 | / | diffusers0.18.1/examples/community/composable_stable_diffusion.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/unidiffuser/modeling_uvit.py | https://arxiv.org/pdf/2303.06555.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/examples/community/tiled_upscaling.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/examples/community/stable_diffusion_controlnet_img2img.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_pndm_flax.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_dpmsolver_multistep_inverse.py | https://imagen.research.google/video/paper.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/docs/source/en/api/schedulers/dpm_discrete_ancestral.md | diffusers0.18.1/src/diffusers/schedulers/scheduling_k_dpm_2_ancestral_discrete.py | https://github.com/crowsonkb/k-diffusion | 源码实现 | -| 开发引入 | / | diffusers0.18.1/examples/community/stable_diffusion_reference.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_cycle_diffusion.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/vae_flax.py | diffusers0.18.1/src/diffusers/models/vae_flax.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/vae_flax.py | diffusers0.18.1/src/diffusers/models/controlnet_flax.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/scripts/convert_diffusers_to_original_stable_diffusion.py | diffusers0.18.1/scripts/convert_diffusers_to_original_stable_diffusion.py | https://github.com/pytorch/pytorch/blob/master/test/cpp/api/modules.cpp | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_unclip.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.18.1/src/diffusers/pipelines/text_to_video_synthesis/pipeline_text_to_video_synth_img2img.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_depth2img.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/paint_by_example/pipeline_paint_by_example.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_cycle_diffusion.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_lcm.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_ddim.py | https://github.com/pesser/pytorch_diffusion | 源码实现 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/latent_diffusion/pipeline_latent_diffusion.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_pix2pix_zero.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.18.1/examples/community/stable_diffusion_mega.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/stable_diffusion_controlnet_img2img.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/modeling_text_unet.py | diffusers0.18.1/src/diffusers/models/unet_2d_condition.py | https://github.com/huggingface/diffusers/issues/2011#issuecomment-1547958131 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/examples/community/imagic_stable_diffusion.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_lcm.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_ddim_inverse.py | https://github.com/pesser/pytorch_diffusion | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_pndm_flax.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_lms_discrete_flax.py | https://imagen.research.google/video/paper.pdf | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/schedulers/scheduling_dpmsolver_singlestep.py | https://arxiv.org/abs/2206.00927 | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/schedulers/scheduling_deis_multistep.py | https://arxiv.org/pdf/2206.00364.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_unclip_img2img.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_diffedit.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/diffusers/src/diffusers/pipelines/shap_e/pipeline_shap_e_img2img.py | diffusers0.18.1/src/diffusers/pipelines/shap_e/pipeline_shap_e_img2img.py | https://hf.co/datasets/diffusers/docs-images/resolve/main/shap-e/corgi.png | 图片地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_diffedit.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion_inpaint.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/utils/dynamic_modules_utils.py | http://hostname | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_ddim.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/pipeline_utils.py | diffusers0.18.1/src/diffusers/pipelines/pipeline_utils.py | https://huggingface.co/docs/hub/security-tokens | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.18.1/src/diffusers/pipelines/controlnet/pipeline_controlnet_img2img.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_repaint.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_ddim_inverse.py | https://arxiv.org/pdf/2010.02502.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/examples/community/speech_to_image_diffusion.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/utils/testing_utils.py | diffusers0.18.1/src/diffusers/utils/testing_utils.py | https://github.com/pytest-dev/pytest/blob/897f151e/src/_pytest/terminal.py#L814 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.18.1/examples/community/stable_diffusion_reference.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion_inpaint.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion_xl/pipeline_stable_diffusion_xl.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_pndm_flax.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_pndm.py | https://github.com/ermongroup/ddim | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.18.1/examples/community/stable_diffusion_ipex.py | 
https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开发引入 | / | diffusers0.18.1/examples/community/clip_guided_stable_diffusion_img2img.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/schedulers/scheduling_dpmsolver_multistep.py | https://arxiv.org/abs/2206.00927","https://arxiv.org/abs/2211.01095 | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion_safe/pipeline_stable_diffusion_safe.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/custom_diffusion/train_custom_diffusion.py | diffusers0.18.1/examples/custom_diffusion/train_custom_diffusion.py | https://github.com/huggingface/diffusers/blob/main/examples/textual_inversion/textual_inversion.py | 源码实现 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion_safe/pipeline_stable_diffusion_safe.py | https://arxiv.org/abs/2211.05105 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/tests/pipelines/controlnet/test_controlnet_img2img.py | diffusers0.18.1/examples/community/stable_diffusion_controlnet_inpaint_img2img.py | https://github.com/haofanwang/ControlNet-for-Diffusers/ | 源码实现 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/schedulers/scheduling_k_dpm_2_ancestral_discrete.py | https://github.com/crowsonkb/k-diffusion/blob/5b3af030dd83e0297272d861c19477735d0317ec/k_diffusion/sampling.py#L188 | 源码实现 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/schedulers/scheduling_heun_discrete.py | https://github.com/crowsonkb/k-diffusion/blob/481677d114f6ea445aa009cf5bd7a9cdee909e47/k_diffusion/sampling.py#L90 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_dpmsolver_singlestep.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_dpmsolver_multistep_inverse.py | https://github.com/LuChengTHU/dpm-solver | 源码实现 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion_image_variation.py | https://huggingface.co/docs/transformers/model_doc/bert | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_image_variation.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion_img2img.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开发引入 | / | diffusers0.18.1/examples/community/text_inpainting.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开发引入 | / | diffusers0.18.1/examples/community/stable_diffusion_reference.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_pndm_flax.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_pndm_flax.py | https://imagen.research.google/video/paper.pdf | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/audioldm/pipeline_audioldm.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/alt_diffusion/pipeline_alt_diffusion.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.XLMRobertaTokenizer | 相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/interpolate_stable_diffusion.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion_safe/pipeline_stable_diffusion_safe.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_unclip.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_ddpm_flax.py | https://arxiv.org/pdf/2006.11239.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_repaint.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_ddim_parallel.py | https://arxiv.org/pdf/2010.02502.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/tiled_upscaling.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开发引入 | / | diffusers0.18.1/examples/community/imagic_stable_diffusion.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.18.1/examples/community/stable_diffusion_tensorrt_txt2img.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/utils/dynamic_modules_utils.py | diffusers0.18.1/src/diffusers/pipelines/pipeline_utils.py | https://huggingface.co/docs/hub/models-gated#gated-models | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_attend_and_excite.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/docs/source/en/api/schedulers/deis.md | diffusers0.18.1/src/diffusers/schedulers/scheduling_deis_multistep.py | https://github.com/qsh-zh/deis | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/text_to_video_synthesis/pipeline_text_to_video_zero.py | diffusers0.18.1/src/diffusers/pipelines/text_to_video_synthesis/pipeline_text_to_video_zero.py | https://arxiv.org/abs/2303.13439 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_euler_discrete_flax.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_euler_discrete.py | https://github.com/crowsonkb/k-diffusion/blob/481677d114f6ea445aa009cf5bd7a9cdee909e47/k_diffusion/sampling.py#L51 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/training_utils.py | diffusers0.18.1/src/diffusers/training_utils.py | https://github.com/fadel/pytorch_ema/blob/master/torch_ema/ema.py#L14 | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_pix2pix_zero.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/schedulers/scheduling_deis_multistep.py | https://arxiv.org/abs/2204.13902","https://github.com/qsh-zh/deis | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/kandinsky/pipeline_kandinsky_prior.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/pipelines/semantic_stable_diffusion/pipeline_semantic_stable_diffusion.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/controlnet/pipeline_controlnet_inpaint.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/optimization.py | diffusers0.18.1/src/diffusers/optimization.py | https://github.com/google-research/bert/blob/f39e881b169b9d53bea03d2d341b31707a6c052b/optimization.py#L37 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.18.1/examples/community/stable_diffusion_tensorrt_inpaint.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/img2img_inpainting.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/scripts/convert_kandinsky_to_diffusers.py | diffusers0.18.1/scripts/convert_kandinsky_to_diffusers.py | https://huggingface.co/ai-forever/Kandinsky_2.1/blob/main/prior_fp16.ckpt | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/examples/community/sd_text2img_k_diffusion.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_latent_upscale.py | https://huggingface.co/docs/transformers/main/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion_image_variation.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/controlnet/pipeline_controlnet.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | 
diffusers0.18.1/src/diffusers/pipelines/shap_e/pipeline_shap_e_img2img.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/tests/pipelines/controlnet/test_controlnet_img2img.py | diffusers0.18.1/src/diffusers/pipelines/controlnet/pipeline_controlnet_inpaint.py | https://github.com/haofanwang/ControlNet-for-Diffusers/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/unet_2d_blocks_flax.py | diffusers0.18.1/src/diffusers/models/unet_2d_blocks_flax.py | https://arxiv.org/abs/2103.06104 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/unet_2d_condition_flax.py | diffusers0.18.1/src/diffusers/models/attention_flax.py | https://arxiv.org/abs/2112.05682 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/stable_diffusion_controlnet_inpaint.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开发引入 | / | diffusers0.18.1/examples/community/stable_diffusion_controlnet_reference.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.18.1/examples/community/stable_diffusion_mega.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_instruct_pix2pix.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/unidiffuser/pipeline_unidiffuser.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/resnet.py | diffusers0.18.1/src/diffusers/models/resnet.py | https://github.com/modelscope/modelscope/blob/1509fdb973e5871f37148a4b5e5964cafd43e64d/modelscope/models/multi_modal/video_synthesis/unet_sd.py#L1016 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_model_editing.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_unipc_multistep.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_ddim.py | https://arxiv.org/abs/2205.11487 | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_ldm3d.py | https://arxiv.org/abs/2305.10853 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_flax_stable_diffusion.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/kandinsky/pipeline_kandinsky_img2img.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/diffusers/scripts/convert_vq_diffusion_to_diffusers.py | diffusers0.18.1/scripts/convert_vq_diffusion_to_diffusers.py | https://github.com/microsoft/VQ-Diffusion/blob/3c98e77f721db7c787b76304fa2c96a36c7b00af/inference_VQ_Diffusion.py#L65 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/stable_diffusion_tensorrt_img2img.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_pndm_flax.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_euler_discrete.py | https://imagen.research.google/video/paper.pdf | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_ldm3d.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion_inpaint.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开发引入 | / | diffusers0.18.1/examples/community/stable_diffusion_comparison.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/text_to_video_synthesis/pipeline_text_to_video_synth_img2img.py | diffusers0.18.1/src/diffusers/pipelines/text_to_video_synthesis/pipeline_text_to_video_synth.py | https://github.com/modelscope/modelscope/blob/1509fdb973e5871f37148a4b5e5964cafd43e64d/modelscope/pipelines/multi_modal/text_to_video_synthesis_pipeline.py#L78 | 源码实现 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/semantic_stable_diffusion/pipeline_semantic_stable_diffusion.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/kandinsky2_2/pipeline_kandinsky2_2_controlnet_img2img.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/pipelines/controlnet/pipeline_controlnet_inpaint.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/paint_by_example/pipeline_paint_by_example.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/unclip/pipeline_unclip_image_variation.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.18.1/examples/community/stable_diffusion_repaint.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/text_to_video_synthesis/pipeline_text_to_video_zero.py | diffusers0.18.1/src/diffusers/pipelines/text_to_video_synthesis/pipeline_text_to_video_zero.py | https://github.com/princeton-vl/RAFT/blob/master/core/utils/utils.py | 
源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_pndm_flax.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_ddpm.py | https://imagen.research.google/video/paper.pdf | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_paradigms.py | https://arxiv.org/abs/2305.16317 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_repaint.py | diffusers0.18.1/examples/community/clip_guided_images_mixing_stable_diffusion.py | https://arxiv.org/pdf/2010.02502.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_latent_upscale.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_img2img.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/pipeline_utils.py | diffusers0.18.1/src/diffusers/pipelines/pipeline_utils.py | https://huggingface.co/docs/diffusers/main/en/using-diffusers/contribute_pipeline | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/kandinsky2_2/pipeline_kandinsky2_2_prior_emb2emb.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_panorama.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_sde_ve_flax.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_sde_ve_flax.py | https://arxiv.org/abs/2011.13456 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_cycle_diffusion.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/resnet.py | diffusers0.18.1/src/diffusers/models/resnet.py | https://github.com/huggingface/diffusers/issues/984 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.18.1/examples/community/multilingual_stable_diffusion.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/utils/dynamic_modules_utils.py | diffusers0.18.1/src/diffusers/models/modeling_utils.py | https://huggingface.co/docs/hub/models-gated#gated-models | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/schedulers/scheduling_euler_ancestral_discrete.py | https://github.com/crowsonkb/k-diffusion/blob/481677d114f6ea445aa009cf5bd7a9cdee909e47/k_diffusion/sampling.py#L72 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/pipeline_utils.py | diffusers0.18.1/src/diffusers/models/modeling_utils.py | https://facebookresearch.github.io/xformers/ | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/schedulers/scheduling_consistency_models.py | https://arxiv.org/pdf/2303.01469 | 论文地址 | -| 开发引入 | / | 
diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_upscale.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion_dual_guided.py | https://huggingface.co/docs/transformers/model_doc/bert | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/vae_flax.py | diffusers0.18.1/src/diffusers/models/unet_2d_condition_flax.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 相关说明 | -| 开发引入 | / | diffusers0.18.1/examples/community/unclip_image_interpolation.py | https://en.wikipedia.org/wiki/Slerp | 相关说明 | -| 开发引入 | / | diffusers0.18.1/examples/community/stable_diffusion_tensorrt_txt2img.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/kandinsky/pipeline_kandinsky_prior.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.18.1/src/diffusers/pipelines/alt_diffusion/pipeline_alt_diffusion_img2img.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/deepfloyd_if/pipeline_if_inpainting.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_inpaint.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_inpaint_legacy.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/stable_diffusion_controlnet_img2img.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion_text_to_image.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_attend_and_excite.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_pndm_flax.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_ddim_inverse.py | https://imagen.research.google/video/paper.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/clip_guided_stable_diffusion_img2img.py | diffusers0.18.1/examples/community/clip_guided_stable_diffusion_img2img.py | https://github.com/Jack000/glid-3-xl | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion_xl/pipeline_stable_diffusion_xl_img2img.py | 
https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/stable_diffusion_tensorrt_txt2img.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/training_utils.py | diffusers0.18.1/examples/research_projects/onnxruntime/text_to_image/train_text_to_image.py | https://github.com/TiankaiHang/Min-SNR-Diffusion-Training/blob/521b624bd70c67cee4bdf49225915f5945a872e3/guided_diffusion/gaussian_diffusion.py#L847-L849 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/examples/community/stable_diffusion_controlnet_reference.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/alt_diffusion/pipeline_alt_diffusion.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_attend_and_excite.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/latent_diffusion/pipeline_latent_diffusion.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/pipelines/deepfloyd_if/pipeline_if_img2img.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/schedulers/scheduling_dpmsolver_multistep_inverse.py | https://arxiv.org/pdf/2206.00364.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/kandinsky2_2/pipeline_kandinsky2_2_img2img.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/embeddings.py | diffusers0.18.1/src/diffusers/models/embeddings.py | https://github.com/deep-floyd/IF/blob/2f91391f27dd3c468bf174be5805b4cc92980c0b/deepfloyd_if/model/nn.py#L54 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/pipelines/deepfloyd_if/pipeline_if.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/deepfloyd_if/pipeline_if_img2img_superresolution.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddpm_parallel.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_ddpm.py | https://arxiv.org/abs/2205.09991 | 论文地址 | -| 开发引入 | / | diffusers0.18.1/examples/community/composable_stable_diffusion.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_ddim_flax.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.18.1/examples/community/stable_diffusion_tensorrt_txt2img.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion_dual_guided.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.18.1/examples/community/interpolate_stable_diffusion.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/schedulers/scheduling_dpmsolver_multistep.py | https://github.com/openai/guided-diffusion | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/tiled_upscaling.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开发引入 | / | diffusers0.18.1/examples/community/unclip_text_interpolation.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/unidiffuser/modeling_uvit.py | diffusers0.18.1/src/diffusers/pipelines/unidiffuser/modeling_uvit.py | https://github.com/baofff/U-ViT | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion_safe/pipeline_stable_diffusion_safe.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_flax_stable_diffusion_img2img.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/utils/import_utils.py | diffusers0.18.1/src/diffusers/utils/import_utils.py | https://github.com/huggingface/accelerate/blob/874c4967d94badd24f893064cc3bef45f57cadf7/src/accelerate/utils/versions.py#L319 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_inpaint.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/modeling_flax_utils.py | diffusers0.18.1/src/diffusers/models/modeling_flax_utils.py | https://github.com/google/flax/issues/1261 | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stochastic_karras_ve/pipeline_stochastic_karras_ve.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.18.1/src/diffusers/pipelines/semantic_stable_diffusion/pipeline_semantic_stable_diffusion.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/controlnet/pipeline_controlnet_inpaint.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_inpaint_legacy.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/vae_flax.py | diffusers0.18.1/src/diffusers/models/unet_2d_condition_flax.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_diffedit.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/lora.py | diffusers0.18.1/src/diffusers/models/attention_processor.py | https://github.com/darkstorm2150/sd-scripts/blob/main/docs/train_network_README-en.md#execute-learning | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion_xl/pipeline_stable_diffusion_xl_img2img.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_pix2pix_zero.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/deepfloyd_if/pipeline_if_img2img.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion_xl/pipeline_stable_diffusion_xl_img2img.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion_xl/pipeline_stable_diffusion_xl_img2img.py | https://huggingface.co/datasets/patrickvonplaten/images/resolve/main/aa_xl/000000009.png | 图片地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion_image_variation.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_model_editing.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_flax_stable_diffusion.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.18.1/examples/community/multilingual_stable_diffusion.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | 
diffusers0.18.1/src/diffusers/pipelines/kandinsky2_2/pipeline_kandinsky2_2_controlnet_img2img.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_dpmsolver_singlestep.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_dpmsolver_singlestep.py | https://github.com/LuChengTHU/dpm-solver | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/docs/source/en/api/schedulers/dpm_discrete_ancestral.md | diffusers0.18.1/src/diffusers/schedulers/scheduling_k_dpm_2_discrete.py | https://github.com/crowsonkb/k-diffusion | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/training_utils.py | diffusers0.18.1/examples/text_to_image/train_text_to_image.py | https://github.com/TiankaiHang/Min-SNR-Diffusion-Training/blob/521b624bd70c67cee4bdf49225915f5945a872e3/guided_diffusion/gaussian_diffusion.py#L1026 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_paradigms.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_attend_and_excite.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/text_to_video_synthesis/pipeline_text_to_video_zero.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/schedulers/scheduling_dpmsolver_multistep_flax.py | https://arxiv.org/abs/2206.00927 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_dpmsolver_singlestep.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_dpmsolver_multistep.py | https://github.com/LuChengTHU/dpm-solver | 源码实现 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/unidiffuser/pipeline_unidiffuser.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.18.1/src/diffusers/pipelines/alt_diffusion/pipeline_alt_diffusion_img2img.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_instruct_pix2pix.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_instruct_pix2pix.py | https://huggingface.co/datasets/diffusers/diffusers-images-docs/resolve/main/mountain.png | 图片地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_inpaint.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion_inpaint_legacy.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | 
diffusers0.18.1/examples/community/clip_guided_stable_diffusion_img2img.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/kandinsky/pipeline_kandinsky_prior.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/unclip/pipeline_unclip.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/examples/community/wildcard_stable_diffusion.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/alt_diffusion/pipeline_alt_diffusion_img2img.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/stable_diffusion_mega.py | diffusers0.18.1/examples/community/stable_diffusion_mega.py | https://huggingface.co/docs/diffusers/api/pipelines/stable_diffusion#diffusers.StableDiffusionPipeline | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/semantic_stable_diffusion/pipeline_semantic_stable_diffusion.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion_image_variation.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/utils/testing_utils.py | diffusers0.18.1/src/diffusers/utils/testing_utils.py | https://github.com/pytest-dev/pytest/blob/897f151e/src/_pytest/runner.py#L66 | 源码实现 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/alt_diffusion/pipeline_alt_diffusion_img2img.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.18.1/examples/community/stable_diffusion_comparison.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.18.1/examples/community/seed_resize_stable_diffusion.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_paradigms.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/composable_stable_diffusion.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/vae_flax.py | diffusers0.18.1/src/diffusers/models/controlnet_flax.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_unipc_multistep.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_ddpm_parallel.py | https://arxiv.org/abs/2305.08891 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/README.md | diffusers0.18.1/examples/community/wildcard_stable_diffusion.py | https://huggingface.co/CompVis/stable-diffusion-v1-4 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_ddim_inverse.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_euler_discrete.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_euler_discrete.py | https://github.com/crowsonkb/k-diffusion/blob/686dbad0f39640ea25c8a8c6a6e56bb40eacefa2/k_diffusion/sampling.py#L17 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_panorama.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/wildcard_stable_diffusion.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_ldm3d.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/stable_diffusion_repaint.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/text_to_video_synthesis/pipeline_text_to_video_synth_img2img.py | diffusers0.18.1/src/diffusers/pipelines/text_to_video_synthesis/pipeline_text_to_video_synth_img2img.py | https://github.com/modelscope/modelscope/blob/1509fdb973e5871f37148a4b5e5964cafd43e64d/modelscope/pipelines/multi_modal/text_to_video_synthesis_pipeline.py#L78 | 源码实现 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_flax_stable_diffusion.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.FlaxCLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_pndm_flax.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_unipc_multistep.py | https://imagen.research.google/video/paper.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/wuerstchen/text_to_image/train_text_to_image_prior.py | diffusers0.18.1/examples/research_projects/onnxruntime/unconditional_image_generation/train_unconditional.py | https://huggingface.co/docs/datasets/v2.4.0/en/image_load#imagefolder | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_pndm_flax.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_pndm.py | https://imagen.research.google/video/paper.pdf | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_sag.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_unipc_multistep.py | 
diffusers0.18.1/src/diffusers/schedulers/scheduling_ddim_parallel.py | https://arxiv.org/abs/2305.08891 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/unclip_image_interpolation.py | diffusers0.18.1/src/diffusers/pipelines/unclip/pipeline_unclip_image_variation.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPVisionModelWithProjection | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/pipeline_flax_utils.py | http://hostname | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/seed_resize_stable_diffusion.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_lcm.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_ddim_inverse.py | https://github.com/hojonathanho/diffusion | 源码实现 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/multilingual_stable_diffusion.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/unclip_text_interpolation.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.18.1/src/diffusers/pipelines/controlnet/pipeline_controlnet_inpaint.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.18.1/examples/community/stable_diffusion_controlnet_img2img.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion_dual_guided.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/pipeline_flax_utils.py | https://huggingface.co/diffusers/installation.html#offline-mode | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_unipc_multistep.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_ddim.py | https://arxiv.org/abs/2305.08891 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/docs/source/ko/using-diffusers/controlling_generation.md | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_pix2pix_zero.py | https://arxiv.org/abs/2302.03027 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_paradigms.py | 
https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/stable_diffusion_reference.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_ddim.py | https://arxiv.org/abs/2210.05559 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/utils/testing_utils.py | diffusers0.18.1/src/diffusers/utils/testing_utils.py | https://github.com/huggingface/transformers/blob/3658488ff77ff8d45101293e749263acf437f4d5/src/transformers/testing_utils.py#L1787 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/pipelines/deepfloyd_if/pipeline_if_inpainting.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.18.1/examples/community/sd_text2img_k_diffusion.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/docs/source/ko/using-diffusers/controlling_generation.md | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_sag.py | https://arxiv.org/abs/2210.00939 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.18.1/src/diffusers/pipelines/controlnet/pipeline_controlnet.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.18.1/examples/community/stable_diffusion_tensorrt_txt2img.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_repaint.py | diffusers0.18.1/examples/community/clip_guided_stable_diffusion_img2img.py | https://arxiv.org/pdf/2010.02502.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/tests/pipelines/controlnet/test_controlnet_img2img.py | diffusers0.18.1/examples/community/stable_diffusion_controlnet_img2img.py | https://github.com/haofanwang/ControlNet-for-Diffusers/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_inpaint.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_flax_stable_diffusion_inpaint.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | 
diffusers0.18.1/src/diffusers/pipelines/kandinsky2_2/pipeline_kandinsky2_2_prior.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_model_editing.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/unidiffuser/modeling_uvit.py | diffusers0.18.1/src/diffusers/pipelines/unidiffuser/pipeline_unidiffuser.py | https://github.com/baofff/U-ViT | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion.py | https://arxiv.org/pdf/2305.08891.pdf | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/alt_diffusion/pipeline_alt_diffusion_img2img.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.XLMRobertaTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.18.1/src/diffusers/loaders.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/semantic_stable_diffusion/pipeline_semantic_stable_diffusion.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion_inpaint_legacy.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/stable_diffusion_controlnet_inpaint.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/unclip_image_interpolation.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/setup.py | diffusers0.18.1/setup.py | patrick@huggingface.co | 邮箱地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/stable_diffusion_ipex.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/attention_processor.py | diffusers0.18.1/src/diffusers/models/attention_processor.py | https://arxiv.org/abs/2209.09002 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion_text_to_image.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/schedulers/scheduling_unipc_multistep.py | https://arxiv.org/abs/2302.04867","https://github.com/wl-zhao/UniPC | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/unidiffuser/modeling_text_decoder.py | diffusers0.18.1/src/diffusers/pipelines/unidiffuser/modeling_text_decoder.py | https://github.com/thu-ml/unidiffuser/blob/main/libs/caption_decoder.py#L89 | 源码实现 | 
-| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_k_diffusion.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/wildcard_stable_diffusion.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_upscale.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_paradigms.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_image_variation.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_cycle_diffusion.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/stable_diffusion_tensorrt_inpaint.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_pndm_flax.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_ipndm.py | https://arxiv.org/abs/2202.09778 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/controlnet/pipeline_controlnet_inpaint.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion_img2img.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.18.1/src/diffusers/pipelines/controlnet/pipeline_controlnet_inpaint.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_lcm.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_ddim_parallel.py | https://github.com/hojonathanho/diffusion | 源码实现 | -| 开发引入 | / | diffusers0.18.1/examples/community/stable_diffusion_ipex.py | https://arxiv.org/pdf/2205.11487.pdf 
| 论文地址 | -| 开发引入 | / | diffusers0.18.1/examples/community/lpw_stable_diffusion.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.18.1/examples/community/stable_diffusion_comparison.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/latent_diffusion/pipeline_latent_diffusion_superresolution.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_k_diffusion.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/stable_diffusion_tensorrt_img2img.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开发引入 | / | diffusers0.18.1/examples/community/mixture_tiling.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/kandinsky2_2/pipeline_kandinsky2_2_img2img.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_pix2pix_zero.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_flax_stable_diffusion_inpaint.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.18.1/src/diffusers/pipelines/alt_diffusion/pipeline_alt_diffusion.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_lcm.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_ddim.py | https://github.com/huggingface/diffusers/blob/74fd735eb073eb1d774b1ab4154a0876eb82f055/examples/dreambooth/train_dreambooth.py#L506 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_unipc_multistep.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_pndm.py | https://arxiv.org/abs/2305.08891 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/text_to_video_synthesis/pipeline_text_to_video_zero.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_unipc_multistep.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_dpmsolver_multistep_flax.py | https://arxiv.org/abs/2205.11487 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_attend_and_excite.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_sde_ve_flax.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_sde_ve.py | https://arxiv.org/abs/2011.13456 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_k_diffusion.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.18.1/examples/community/text_inpainting.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/stable_diffusion_ipex.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion_image_variation.py | diffusers0.18.1/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | https://huggingface.co/datasets/diffusers/images/resolve/main/benz.jpg | 图片地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_img2img.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/consistency_models/pipeline_consistency_models.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/models/vae_flax.py | https://arxiv.org/abs/2112.10752 | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_image_variation.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/schedulers/scheduling_dpmsolver_singlestep.py | https://arxiv.org/abs/2206.00927","https://arxiv.org/abs/2211.01095 | 论文地址 | -| 开发引入 | / | diffusers0.18.1/examples/community/img2img_inpainting.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_inpaint.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/vae_flax.py | diffusers0.18.1/src/diffusers/models/controlnet_flax.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/docs/source/en/api/schedulers/ipndm.md | diffusers0.18.1/src/diffusers/schedulers/scheduling_ipndm.py | https://github.com/crowsonkb/v-diffusion-pytorch/blob/987f8985e38208345c1959b0ea767a625831cc9b/diffusion/sampling.py#L296 | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_image_variation.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/repaint/pipeline_repaint.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/kandinsky2_2/pipeline_kandinsky2_2_inpainting.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.18.1/examples/community/stable_diffusion_ipex.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_unclip.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_ddpm.py | https://arxiv.org/pdf/2006.11239.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/sd_text2img_k_diffusion.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/stable_diffusion_controlnet_reference.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/ddim/pipeline_ddim.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/kandinsky2_2/pipeline_kandinsky2_2_prior_emb2emb.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开发引入 | / | diffusers0.18.1/examples/community/seed_resize_stable_diffusion.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_pndm_flax.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_ddpm_parallel.py | https://imagen.research.google/video/paper.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_diffedit.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/schedulers/scheduling_vq_diffusion.py | https://arxiv.org/abs/2111.14822 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.18.1/examples/community/stable_diffusion_mega.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/unidiffuser/pipeline_unidiffuser.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/text_to_video_synthesis/pipeline_text_to_video_synth.py | 
https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stochastic_karras_ve/pipeline_stochastic_karras_ve.py | https://arxiv.org/abs/2206.00364 | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_sag.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/unconditional_image_generation/train_unconditional.py | diffusers0.18.1/examples/research_projects/onnxruntime/unconditional_image_generation/train_unconditional.py | https://github.com/huggingface/accelerate/pull/962/files | 源码实现 | -| 开发引入 | / | diffusers0.18.1/examples/community/multilingual_stable_diffusion.py | https://huggingface.co/docs/transformers/model_doc/mbart | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/repaint/pipeline_repaint.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/schedulers/scheduling_euler_discrete.py | https://arxiv.org/pdf/2206.00364.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_sde_vp.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_sde_ve.py | https://github.com/yang-song/score_sde_pytorch | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_inpaint.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/deepfloyd_if/pipeline_if.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_pndm_flax.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_pndm_flax.py | https://github.com/CompVis/latent-diffusion/pull/51 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/sd_text2img_k_diffusion.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_pndm_flax.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_ddpm_parallel.py | https://github.com/ermongroup/ddim | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/pipelines/unidiffuser/pipeline_unidiffuser.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/t5_film_transformer.py | diffusers0.18.1/src/diffusers/models/attention.py | https://arxiv.org/abs/1606.08415 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/stable_diffusion_mega.py | diffusers0.18.1/examples/community/stable_diffusion_mega.py | https://huggingface.co/docs/diffusers/api/pipelines/stable_diffusion#diffusers.StableDiffusionImg2ImgPipeline | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/multilingual_stable_diffusion.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_unipc_multistep.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_dpmsolver_multistep.py | https://arxiv.org/abs/2305.08891 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/unidiffuser/modeling_text_decoder.py | diffusers0.18.1/src/diffusers/pipelines/unidiffuser/modeling_text_decoder.py | https://github.com/thu-ml/unidiffuser/blob/main/libs/caption_decoder.py | 源码实现 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/schedulers/scheduling_utils_flax.py | http://hostname | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion_upscale.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/wuerstchen/text_to_image/train_text_to_image_prior.py | diffusers0.18.1/examples/unconditional_image_generation/train_unconditional.py | https://huggingface.co/docs/datasets/v2.4.0/en/image_load#imagefolder | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/stable_diffusion_mega.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_ldm3d.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/schedulers/scheduling_unipc_multistep.py | https://arxiv.org/pdf/2206.00364.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/modeling_utils.py | diffusers0.18.1/src/diffusers/configuration_utils.py | https://pytorch.org/docs/stable/_modules/torch/nn/modules/module.html#Module | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/loaders.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/unclip/pipeline_unclip_image_variation.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/vq_diffusion/pipeline_vq_diffusion.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_image_variation.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_diffedit.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开发引入 | / | 
diffusers0.18.1/src/diffusers/pipelines/kandinsky/pipeline_kandinsky_img2img.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/alt_diffusion/pipeline_alt_diffusion_img2img.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/models/t5_film_transformer.py | https://arxiv.org/abs/1910.07467 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_repaint.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_cycle_diffusion.py | https://arxiv.org/pdf/2010.02502.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/audioldm/pipeline_audioldm.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion_xl/pipeline_stable_diffusion_xl.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.18.1/src/diffusers/pipelines/deepfloyd_if/pipeline_if_inpainting.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/pipelines/deepfloyd_if/pipeline_if_img2img_superresolution.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/models/modeling_pytorch_flax_utils.py | https://pytorch.org/","https://flax.readthedocs.io/en/latest/installation.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/scripts/convert_kakao_brain_unclip_to_diffusers.py | diffusers0.18.1/scripts/convert_kakao_brain_unclip_to_diffusers.py | https://arena.kakaocdn.net/brainrepo/models/karlo-public/v1.0.0.alpha/4226b831ae0279020d134281f3c31590/improved-sr-ckpt-step%3D1.2M.ckpt | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/diffusers/scripts/convert_kakao_brain_unclip_to_diffusers.py | diffusers0.18.1/scripts/convert_kakao_brain_unclip_to_diffusers.py | https://arena.kakaocdn.net/brainrepo/models/karlo-public/v1.0.0.alpha/0b62380a75e56f073e2844ab5199153d/ViT-L-14_stats.th | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/lpw_stable_diffusion.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开发引入 | / | diffusers0.18.1/scripts/convert_vq_diffusion_to_diffusers.py | https://raw.githubusercontent.com/microsoft/VQ-Diffusion/main/OUTPUT/pretrained_model/taming_dvae/config.yaml | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_pndm_flax.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_k_dpm_2_ancestral_discrete.py | https://imagen.research.google/video/paper.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_repaint.py | diffusers0.18.1/examples/community/clip_guided_stable_diffusion.py | https://arxiv.org/pdf/2010.02502.pdf | 论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/diffusers/src/diffusers/models/attention_flax.py | diffusers0.18.1/src/diffusers/models/attention_flax.py | https://arxiv.org/pdf/1506.02025.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_attend_and_excite.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/unidiffuser/modeling_uvit.py | diffusers0.18.1/src/diffusers/pipelines/unidiffuser/modeling_uvit.py | https://people.sc.fsu.edu/~jburkardt/presentations/truncated_normal.pdf | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/semantic_stable_diffusion/pipeline_semantic_stable_diffusion.py | https://arxiv.org/pdf/2301.12247.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.18.1/src/diffusers/pipelines/deepfloyd_if/pipeline_if_img2img_superresolution.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_pndm_flax.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_heun_discrete.py | https://imagen.research.google/video/paper.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/pipelines/alt_diffusion/pipeline_alt_diffusion_img2img.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/examples/community/lpw_stable_diffusion.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_attend_and_excite.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/schedulers/scheduling_utils.py | https://huggingface.co/transformers/installation.html#offline-mode | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_latent_upscale.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/pipeline_utils.py | diffusers0.18.1/src/diffusers/pipelines/pipeline_flax_utils.py | https://huggingface.co/docs/hub/security-tokens | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_unipc_multistep.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_ddpm.py | https://arxiv.org/abs/2205.11487 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_lcm.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_ddim_flax.py | https://github.com/pesser/pytorch_diffusion | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/stable_diffusion_tensorrt_img2img.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/text_to_video_synthesis/pipeline_text_to_video_synth_img2img.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 
开源代码引入 | https://github.com/huggingface/diffusers/examples/community/imagic_stable_diffusion.py | diffusers0.18.1/examples/community/imagic_stable_diffusion.py | https://arxiv.org/pdf/2210.09276.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/examples/community/clip_guided_stable_diffusion.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/latent_diffusion/pipeline_latent_diffusion.py | https://huggingface.co/docs/transformers/model_doc/bert | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.18.1/src/diffusers/pipelines/unclip/pipeline_unclip_image_variation.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/shap_e/pipeline_shap_e.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/deepfloyd_if/pipeline_if_inpainting_superresolution.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/audioldm/pipeline_audioldm.py | https://huggingface.co/docs/transformers/main/en/model_doc/speecht5#transformers.SpeechT5HifiGan | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/vae_flax.py | diffusers0.18.1/src/diffusers/models/unet_2d_condition_flax.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/unclip/pipeline_unclip_image_variation.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/kandinsky/pipeline_kandinsky.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/kandinsky2_2/pipeline_kandinsky2_2_prior.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/resnet.py | diffusers0.18.1/src/diffusers/models/resnet.py | https://github.com/pytorch/pytorch/issues/86679 | 相关说明 | -| 开发引入 | / | diffusers0.18.1/scripts/convert_vq_diffusion_to_diffusers.py | https://facevcstandard.blob.core.windows.net/v-zhictang/Improved-VQ-Diffusion_model_release/ithq_learnable.pth?sv=2020-10-02&st=2022-05-30T10%3A22%3A06Z&se=2030-05-31T10%3A22%3A00Z&sr=b&sp=r&sig=GOE%2 | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/tiled_upscaling.py | diffusers0.18.1/examples/community/tiled_upscaling.py | peter@codebuffet.co | 邮箱地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.18.1/src/diffusers/pipelines/controlnet/pipeline_controlnet.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_pndm_flax.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_lms_discrete.py | https://imagen.research.google/video/paper.pdf | 论文地址 | -| 
开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_inpaint_legacy.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开发引入 | / | diffusers0.18.1/examples/unconditional_image_generation/train_unconditional.py | https://www.tensorflow.org/tensorboard","https://www.wandb.ai | 相关说明 | -| 开发引入 | / | diffusers0.18.1/examples/community/unclip_text_interpolation.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/scripts/convert_consistency_to_diffusers.py | diffusers0.18.1/scripts/convert_consistency_to_diffusers.py | https://stackoverflow.com/questions/15008758/parsing-boolean-values-with-argparse | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_flax_stable_diffusion_img2img.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_upscale.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_upscale.py | https://huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main/sd2-upscale/low_res_cat.png | 图片地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/schedulers/scheduling_lms_discrete.py | https://arxiv.org/pdf/2206.00364.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion_safe/pipeline_stable_diffusion_safe.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/shap_e/pipeline_shap_e.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/unidiffuser/pipeline_unidiffuser.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/controlnet/pipeline_controlnet.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开发引入 | / | diffusers0.18.1/examples/community/text_inpainting.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_attend_and_excite.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_k_diffusion.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/kandinsky2_2/pipeline_kandinsky2_2_controlnet.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/scripts/convert_original_audioldm_to_diffusers.py | diffusers0.18.1/scripts/convert_original_audioldm_to_diffusers.py | 
https://huggingface.co/spaces/haoheliu/audioldm-text-to-audio-generation/blob/84a0384742a22bd80c44e903e241f0623e874f1d/audioldm/utils.py#L72-L73 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.18.1/examples/community/lpw_stable_diffusion.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/loaders.py | https://github.com/huggingface/diffusers/pull/3490#issuecomment-1555059060 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/textual_inversion/textual_inversion_flax.py | diffusers0.18.1/examples/textual_inversion/textual_inversion_flax.py | https://github.com/deepmind/optax/issues/159#issuecomment-896459491 | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/unidiffuser/pipeline_unidiffuser.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPVisionModel | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion_img2img.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_cycle_diffusion.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/deepfloyd_if/pipeline_if_superresolution.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/ddim/pipeline_ddim.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion_img2img.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/tests/pipelines/controlnet/test_controlnet_img2img.py | diffusers0.18.1/examples/community/stable_diffusion_controlnet_inpaint.py | https://github.com/haofanwang/ControlNet-for-Diffusers/ | 源码实现 | -| 开发引入 | / | diffusers0.18.1/examples/community/stable_diffusion_comparison.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/instruct_pix2pix/train_instruct_pix2pix_sdxl.py | diffusers0.18.1/examples/instruct_pix2pix/train_instruct_pix2pix.py | https://huggingface.co/docs/datasets/main/en/image_load#imagefolder | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/loaders.py | https://huggingface.co/","https://hf.co/ | 相关说明 | -| 开发引入 | / | diffusers0.18.1/examples/community/stable_diffusion_repaint.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/controlnet/pipeline_flax_controlnet.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion_xl/pipeline_stable_diffusion_xl_img2img.py | https://arxiv.org/pdf/2305.08891.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/loaders.py | diffusers0.18.1/src/diffusers/loaders.py | 
https://github.com/huggingface/diffusers/pull/2918 | 源码实现 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/vq_diffusion/pipeline_vq_diffusion.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/pipelines/text_to_video_synthesis/pipeline_text_to_video_synth.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/vae_flax.py | diffusers0.18.1/src/diffusers/models/unet_2d_condition_flax.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_ddim.py | https://arxiv.org/pdf/2305.08891.pdf | 论文地址 | -| 开发引入 | / | diffusers0.18.1/examples/research_projects/onnxruntime/unconditional_image_generation/train_unconditional.py | https://www.tensorflow.org/tensorboard","https://www.wandb.ai | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_repaint.py | diffusers0.18.1/examples/community/stable_diffusion_repaint.py | https://arxiv.org/pdf/2201.09865.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_instruct_pix2pix.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开发引入 | / | diffusers0.18.1/examples/community/stable_diffusion_controlnet_img2img.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_inpaint.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/pipelines/deepfloyd_if/pipeline_if_superresolution.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/controlnet/train_controlnet_flax.py | diffusers0.18.1/examples/controlnet/train_controlnet_flax.py | https://github.com/borisdayma/dalle-mini/blob/d2be512d4a6a9cda2d63ba04afc33038f98f705f/src/dalle_mini/data.py#L370 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/stable_diffusion_comparison.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/schedulers/scheduling_consistency_models.py | https://arxiv.org/abs/2206.00364 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.18.1/examples/community/lpw_stable_diffusion.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/checkpoint_merger.py | diffusers0.18.1/examples/community/checkpoint_merger.py | https://github.com/huggingface/diffusers/issues/877 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_unipc_multistep.py | 
diffusers0.18.1/src/diffusers/schedulers/scheduling_deis_multistep.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/schedulers/scheduling_dpmsolver_multistep.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/kandinsky/pipeline_kandinsky_inpaint.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_unipc_multistep.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_euler_discrete.py | https://arxiv.org/abs/2305.08891 | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/deepfloyd_if/pipeline_if_superresolution.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/scripts/convert_kakao_brain_unclip_to_diffusers.py | diffusers0.18.1/scripts/convert_kakao_brain_unclip_to_diffusers.py | https://arena.kakaocdn.net/brainrepo/models/karlo-public/v1.0.0.alpha/efdf6206d8ed593961593dc029a8affa/decoder-ckpt-step%3D01000000-of-01000000.ckpt | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_pndm_flax.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_pndm_flax.py | https://arxiv.org/pdf/2202.09778.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_unipc_multistep.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_dpmsolver_multistep.py | https://arxiv.org/abs/2205.11487 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/kandinsky2_2/pipeline_kandinsky2_2_prior_emb2emb.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/t2i_adapter/train_t2i_adapter_sdxl.py | diffusers0.18.1/examples/controlnet/train_controlnet_flax.py | https://huggingface.co/docs/datasets/v2.0.0/en/dataset_script | 相关说明 | -| 开发引入 | / | diffusers0.18.1/examples/community/ddim_noise_comparative_analysis.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/consistency_models/pipeline_consistency_models.py | diffusers0.18.1/src/diffusers/pipelines/consistency_models/pipeline_consistency_models.py | https://github.com/openai/consistency_models/blob/main/scripts/launch.sh#L77 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion_xl/pipeline_stable_diffusion_xl.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion_xl/pipeline_stable_diffusion_xl.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_pndm_flax.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_ddim.py | https://imagen.research.google/video/paper.pdf | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/unclip/pipeline_unclip.py | 
https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/lpw_stable_diffusion_onnx.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_pndm_flax.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_k_dpm_2_discrete.py | https://imagen.research.google/video/paper.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/pipeline_utils.py | diffusers0.18.1/src/diffusers/pipelines/pipeline_utils.py | https://facebookresearch.github.io/xformers/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_instruct_pix2pix.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/modeling_text_unet.py | diffusers0.18.1/src/diffusers/models/unet_3d_condition.py | https://github.com/huggingface/diffusers/issues/2011#issuecomment-1547958131 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/unclip/pipeline_unclip.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion_inpaint.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开发引入 | / | diffusers0.18.1/examples/community/multilingual_stable_diffusion.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开发引入 | / | diffusers0.18.1/scripts/convert_unidiffuser_to_diffusers.py | https://huggingface.co/gpt2/blob/main/config.json | 相关配置 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/clip_guided_stable_diffusion_img2img.py | diffusers0.18.1/examples/community/clip_guided_stable_diffusion_img2img.py | https://github.dev/crowsonkb/k-diffusion | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/ddpm/pipeline_ddpm.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/deepfloyd_if/pipeline_if.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.18.1/src/diffusers/pipelines/controlnet/pipeline_controlnet_inpaint.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_unipc_multistep.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_k_dpm_2_discrete.py | https://arxiv.org/abs/2305.08891 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_unclip_img2img.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开发引入 | / | 
diffusers0.18.1/src/diffusers/pipelines/dit/pipeline_dit.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/modeling_text_unet.py | diffusers0.18.1/src/diffusers/models/unet_2d_condition_flax.py | https://github.com/huggingface/diffusers/issues/2011#issuecomment-1547958131 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion_inpaint_legacy.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.18.1/src/diffusers/loaders.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_sde_vp.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_sde_vp.py | https://github.com/yang-song/score_sde_pytorch | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/kandinsky2_2/pipeline_kandinsky2_2_prior_emb2emb.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/scripts/convert_stable_diffusion_controlnet_to_onnx.py | diffusers0.18.1/scripts/convert_stable_diffusion_checkpoint_to_onnx.py | https://github.com/huggingface/transformers/pull/18515/files | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/tiled_upscaling.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/examples/community/seed_resize_stable_diffusion.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开发引入 | / | diffusers0.18.1/setup.py | https://testpypi.python.org/pypi | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.18.1/src/diffusers/pipelines/vq_diffusion/pipeline_vq_diffusion.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_ldm3d.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_sag.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/kandinsky2_2/pipeline_kandinsky2_2_controlnet_img2img.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/t5_film_transformer.py | diffusers0.18.1/src/diffusers/models/t5_film_transformer.py | https://arxiv.org/abs/1606.08415 | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/unidiffuser/pipeline_unidiffuser.py | 
https://huggingface.co/docs/transformers/model_doc/gpt2#transformers.GPT2Tokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion_inpaint.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_cycle_diffusion.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/wuerstchen/text_to_image/train_text_to_image_prior.py | diffusers0.18.1/examples/text_to_image/train_text_to_image_flax.py | https://huggingface.co/docs/datasets/v2.4.0/en/image_load#imagefolder | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/stable_diffusion_tensorrt_txt2img.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/utils/stale.py | diffusers0.18.1/utils/stale.py | https://github.com/allenai/allennlp | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.18.1/src/diffusers/pipelines/controlnet/pipeline_controlnet_img2img.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_upscale.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/unclip_text_interpolation.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion_image_variation.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.18.1/examples/community/stable_diffusion_tensorrt_inpaint.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/interpolate_stable_diffusion.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/README.md | diffusers0.18.1/src/diffusers/pipelines/pipeline_utils.py | https://github.com/huggingface/diffusers/tree/main/examples/community | 源码实现 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/audioldm/pipeline_audioldm.py | https://huggingface.co/docs/transformers/model_doc/roberta#transformers.RobertaTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | 
diffusers0.18.1/examples/community/seed_resize_stable_diffusion.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开发引入 | / | diffusers0.18.1/examples/community/img2img_inpainting.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_pndm_flax.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_pndm.py | https://github.com/CompVis/latent-diffusion/pull/51 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_unipc_multistep.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_k_dpm_2_ancestral_discrete.py | https://arxiv.org/abs/2305.08891 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_img2img.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_depth2img.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/modeling_text_unet.py | diffusers0.18.1/src/diffusers/models/controlnet_flax.py | https://github.com/huggingface/diffusers/issues/2011#issuecomment-1547958131 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.18.1/examples/community/sd_text2img_k_diffusion.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开发引入 | / | diffusers0.18.1/examples/community/stable_diffusion_tensorrt_img2img.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_unclip_img2img.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion_safe/pipeline_stable_diffusion_safe.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/unclip_image_interpolation.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_image_variation.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPVisionModelWithProjection | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/schedulers/scheduling_deis_multistep.py | https://arxiv.org/abs/2204.13902 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/text_to_video_synthesis/pipeline_text_to_video_synth_img2img.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_diffedit.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/diffusers/tests/pipelines/stable_diffusion/test_stable_diffusion.py | diffusers0.18.1/src/diffusers/loaders.py | https://huggingface.co/runwayml/stable-diffusion-v1-5/blob/main/v1-5-pruned-emaonly.ckpt | 预训练模型 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/unclip/pipeline_unclip_image_variation.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开发引入 | / | diffusers0.18.1/examples/community/stable_diffusion_controlnet_inpaint.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_instruct_pix2pix.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/controlnet/pipeline_controlnet.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion_dual_guided.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/kandinsky2_2/pipeline_kandinsky2_2.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/schedulers/scheduling_karras_ve.py | https://arxiv.org/abs/2206.00364 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/setup.py | diffusers0.18.1/setup.py | https://github.com/allenai/allennlp/blob/main/setup.py | 源码实现 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/kandinsky/pipeline_kandinsky_prior.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/research_projects/mulit_token_textual_inversion/multi_token_clip.py | diffusers0.18.1/examples/research_projects/mulit_token_textual_inversion/multi_token_clip.py | https://github.com/rinongal/textual_inversion/pull/119 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/stable_diffusion_tensorrt_txt2img.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开发引入 | / | diffusers0.18.1/examples/community/stable_diffusion_tensorrt_inpaint.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/schedulers/scheduling_dpmsolver_singlestep.py | https://arxiv.org/pdf/2206.00364.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/vq_diffusion/pipeline_vq_diffusion.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/wuerstchen/text_to_image/train_text_to_image_prior.py | diffusers0.18.1/examples/text_to_image/train_text_to_image.py | https://huggingface.co/docs/datasets/v2.4.0/en/image_load#imagefolder | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion_image_variation.py | diffusers0.18.1/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion_image_variation.py | https://huggingface.co/datasets/diffusers/images/resolve/main/benz.jpg | 图片地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/loaders.py | https://civitai.com/models/3036?modelVersionId=9857 | 相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/diffusers/src/diffusers/training_utils.py | diffusers0.18.1/examples/research_projects/onnxruntime/text_to_image/train_text_to_image.py | https://github.com/TiankaiHang/Min-SNR-Diffusion-Training/blob/521b624bd70c67cee4bdf49225915f5945a872e3/guided_diffusion/gaussian_diffusion.py#L1026 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/text_to_video_synthesis/pipeline_text_to_video_synth.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.18.1/examples/community/composable_stable_diffusion.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion_xl/pipeline_stable_diffusion_xl.py | https://arxiv.org/pdf/2305.08891.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/stable_diffusion_comparison.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/unidiffuser/modeling_text_decoder.py | https://arxiv.org/pdf/2303.06555.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_lms_discrete_flax.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_lms_discrete.py | https://github.com/crowsonkb/k-diffusion/blob/481677d114f6ea445aa009cf5bd7a9cdee909e47/k_diffusion/sampling.py#L181 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_paradigms.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/docs/source/en/api/pipelines/vq_diffusion.md | diffusers0.18.1/scripts/convert_vq_diffusion_to_diffusers.py | https://github.com/microsoft/VQ-Diffusion | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.18.1/examples/community/imagic_stable_diffusion.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/deepfloyd_if/pipeline_if_img2img.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/pipeline_utils.py | http://hostname | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/deepfloyd_if/pipeline_if.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/alt_diffusion/pipeline_alt_diffusion.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/vae_flax.py | diffusers0.18.1/src/diffusers/models/vae_flax.py | https://github.com/CompVis/taming-transformers | 源码实现 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion_upscale.py | 
https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_pndm_flax.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_ipndm.py | https://arxiv.org/pdf/2202.09778.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.18.1/src/diffusers/pipelines/audioldm/pipeline_audioldm.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/vae_flax.py | diffusers0.18.1/src/diffusers/models/controlnet_flax.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/stable_diffusion_tensorrt_inpaint.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/stable_diffusion_repaint.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_cycle_diffusion.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/alt_diffusion/pipeline_alt_diffusion_img2img.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.RobertaSeriesModelWithTransformation | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/alt_diffusion/pipeline_alt_diffusion.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.RobertaSeriesModelWithTransformation | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/controlnet/pipeline_flax_controlnet.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.FlaxCLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.18.1/src/diffusers/loaders.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开发引入 | / | diffusers0.18.1/examples/community/unclip_text_interpolation.py | https://en.wikipedia.org/wiki/Slerp | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/lpw_stable_diffusion.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/composable_stable_diffusion.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_ddim_parallel.py | https://arxiv.org/abs/2210.05559 | 论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_panorama.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/kandinsky2_2/pipeline_kandinsky2_2_inpainting.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/kandinsky/pipeline_kandinsky_inpaint.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_unclip.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/stable_diffusion_tensorrt_txt2img.py | diffusers0.18.1/examples/community/stable_diffusion_tensorrt_txt2img.py | https://pypi.ngc.nvidia.com | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion_xl/pipeline_stable_diffusion_xl_img2img.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/checkpoint_merger.py | diffusers0.18.1/examples/community/checkpoint_merger.py | https://en.wikipedia.org/wiki/Smoothstep | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.18.1/src/diffusers/pipelines/deepfloyd_if/pipeline_if_img2img.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开发引入 | / | diffusers0.18.1/examples/community/stable_diffusion_controlnet_inpaint.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion_xl/pipeline_stable_diffusion_xl.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_panorama.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_panorama.py | https://arxiv.org/abs/2302.08113 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_unipc_multistep.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_deis_multistep.py | https://arxiv.org/abs/2205.11487 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_depth2img.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_depth2img.py | http://images.cocodataset.org/val2017/000000039769.jpg | 图片地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_unipc_multistep.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_unipc_multistep.py | https://arxiv.org/abs/2205.11487 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | 
diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_depth2img.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion_img2img.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_cycle_diffusion.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_unclip.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/examples/community/stable_diffusion_repaint.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/interpolate_stable_diffusion.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_inpaint_legacy.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_ldm3d.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_unclip.py | diffusers0.18.1/examples/community/bit_diffusion.py | https://arxiv.org/pdf/2006.11239.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/clip_guided_stable_diffusion_img2img.py | diffusers0.18.1/examples/community/clip_guided_stable_diffusion.py | https://github.com/Jack000/glid-3-xl | 源码实现 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_unclip_img2img.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.18.1/examples/community/sd_text2img_k_diffusion.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddpm_wuerstchen.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_ddpm.py | https://arxiv.org/abs/2006.11239 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.18.1/examples/community/wildcard_stable_diffusion.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.18.1/examples/community/stable_diffusion_repaint.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | 
diffusers0.18.1/examples/community/composable_stable_diffusion.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/shap_e/pipeline_shap_e.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/wuerstchen/text_to_image/train_text_to_image_prior.py | diffusers0.18.1/examples/research_projects/onnxruntime/text_to_image/train_text_to_image.py | https://huggingface.co/docs/datasets/v2.4.0/en/image_load#imagefolder | 相关说明 | -| 开发引入 | / | diffusers0.18.1/examples/community/stable_diffusion_controlnet_inpaint_img2img.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/imagic_stable_diffusion.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/deepfloyd_if/pipeline_if_img2img_superresolution.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/utils/dynamic_modules_utils.py | diffusers0.18.1/src/diffusers/utils/dynamic_modules_utils.py | https://huggingface.co/docs/hub/models-gated#gated-models | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_inpaint_legacy.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/utils/dynamic_modules_utils.py | diffusers0.18.1/src/diffusers/loaders.py | https://huggingface.co/docs/hub/models-gated#gated-models | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/latent_diffusion/pipeline_latent_diffusion.py | https://huggingface.co/docs/transformers/model_doc/bert#transformers.BertTokenizer | 相关说明 | -| 开发引入 | / | diffusers0.18.1/examples/community/mixture_canvas.py | https://en.wikipedia.org/wiki/Kernel_ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_instruct_pix2pix.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion_text_to_image.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_inpaint_legacy.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_flax_stable_diffusion_img2img.py | 
https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_unipc_multistep.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_ddpm.py | https://arxiv.org/abs/2305.08891 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/README.md | diffusers0.18.1/src/diffusers/pipelines/semantic_stable_diffusion/pipeline_semantic_stable_diffusion.py | https://huggingface.co/CompVis/stable-diffusion-v1-4 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/wuerstchen/text_to_image/train_text_to_image_prior.py | diffusers0.18.1/examples/research_projects/lora/train_text_to_image_lora.py | https://huggingface.co/docs/datasets/v2.4.0/en/image_load#imagefolder | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_inpaint_legacy.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddpm_wuerstchen.py | diffusers0.18.1/src/diffusers/models/embeddings_flax.py | https://arxiv.org/abs/2006.11239 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.18.1/examples/community/stable_diffusion_tensorrt_img2img.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开发引入 | / | diffusers0.18.1/examples/community/speech_to_image_diffusion.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.18.1/examples/community/img2img_inpainting.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_unclip_img2img.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_img2img.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/examples/community/img2img_inpainting.py | https://arxiv.org/abs/2207.12598 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/examples/community/composable_stable_diffusion.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/shap_e/renderer.py | diffusers0.18.1/src/diffusers/pipelines/shap_e/renderer.py | https://arxiv.org/pdf/2210.04628.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | https://huggingface.co/thibaud/controlnet-sd21/ | 相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.18.1/src/diffusers/pipelines/alt_diffusion/pipeline_alt_diffusion.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/score_sde_ve/pipeline_score_sde_ve.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/vae_flax.py | diffusers0.18.1/src/diffusers/models/vae_flax.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_inpaint.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开发引入 | / | diffusers0.18.1/examples/community/sd_text2img_k_diffusion.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/kandinsky/pipeline_kandinsky_img2img.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/wuerstchen/text_to_image/train_text_to_image_prior.py | diffusers0.18.1/examples/text_to_image/train_text_to_image_lora.py | https://huggingface.co/docs/datasets/v2.4.0/en/image_load#imagefolder | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/schedulers/scheduling_dpmsolver_multistep_inverse.py | https://arxiv.org/abs/2206.00927 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion_xl/pipeline_stable_diffusion_xl.py | https://huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/.github/PULL_REQUEST_TEMPLATE.md | diffusers0.18.1/docs/source/en/using-diffusers/using_safetensors | https://github.com/huggingface/safetensors | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/dit/pipeline_dit.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion_text_to_image.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | diffusers0.18.1/src/diffusers/pipelines/dance_diffusion/pipeline_dance_diffusion.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.18.1/examples/community/img2img_inpainting.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/wuerstchen/pipeline_wuerstchen_prior.py | 
diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_ldm3d.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/ddpm/pipeline_ddpm.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_onnx_stable_diffusion_inpaint_legacy.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/models/embeddings.py | diffusers0.18.1/src/diffusers/models/embeddings.py | https://arxiv.org/abs/2102.12092 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/pipeline_utils.py | diffusers0.18.1/src/diffusers/models/modeling_utils.py | https://hf.co/docs/accelerate/main/en/usage_guides/big_modeling#designing-a-device-map | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_ddim_parallel.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion_xl/pipeline_stable_diffusion_xl_img2img.py | https://arxiv.org/abs/2010.02502 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_ldm3d.py | https://huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/README.md | diffusers0.18.1/examples/community/interpolate_stable_diffusion.py | https://huggingface.co/CompVis/stable-diffusion-v1-4 | 相关说明 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion_safe/pipeline_stable_diffusion_safe.py | https://arxiv.org/pdf/2205.11487.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_image_variation.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/schedulers/scheduling_sde_vp.py | diffusers0.18.1/src/diffusers/schedulers/scheduling_sde_ve_flax.py | https://github.com/yang-song/score_sde_pytorch | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/versatile_diffusion/pipeline_versatile_diffusion.py | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_k_diffusion.py | https://huggingface.co/openai/clip-vit-large-patch14 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/src/diffusers/pipelines/stable_diffusion/README.md | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_flax_stable_diffusion_inpaint.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/diffusers/examples/community/pipeline_zero1to3.py | diffusers0.18.1/src/diffusers/pipelines/deepfloyd_if/pipeline_if_superresolution.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/cross_attention.py | 源码实现 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/schedulers/scheduling_dpmsolver_multistep_flax.py | https://arxiv.org/abs/2206.00927","https://arxiv.org/abs/2211.01095 | 论文地址 | -| 开发引入 | / | diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_k_diffusion.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开发引入 | / | 
diffusers0.18.1/examples/research_projects/lora/requirements.txt | https://github.com/huggingface/peft.git | 相关依赖 | +| 文件位置 | 公网地址 | 公网地址用途 | +|----------------------------------------------------------------------------------------------------------------------------------------------------------|-------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|--------------------------| +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/.github/actions/setup-miniconda/action.yml | https://repo.anaconda.com/miniconda/Miniconda3-py39_${MINICONDA_VERSION}-${MINICONDA_ARCH}.sh | miniconda链接 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/.github/workflows/nightly_tests.yml | https://download.pytorch.org/whl/cpu | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/examples/community/clip_guided_stable_diffusion_img2img.py | https://raw.githubusercontent.com/CompVis/stable-diffusion/main/assets/stable-samples/img2img/sketch-mountains-input.jpg | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/examples/community/stable_diffusion_tensorrt_img2img.py | https://pypi.ngc.nvidia.com | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/examples/community/stable_diffusion_tensorrt_inpaint.py | https://pypi.ngc.nvidia.com | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/examples/community/stable_diffusion_tensorrt_txt2img.py | https://pypi.ngc.nvidia.com | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/examples/controlnet/train_controlnet.py | https://www.tensorflow.org/tensorboard | tensorboard地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/examples/controlnet/train_controlnet.py | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/examples/controlnet/train_controlnet.py | https://pytorch.org/docs/stable/generated/torch.optim.Optimizer.zero_grad.html | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/examples/controlnet/train_controlnet_flax.py | https://arxiv.org/abs/2303.09556. 
| SNR weighting gamma论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/examples/custom_diffusion/train_custom_diffusion.py | https://www.tensorflow.org/tensorboard | tensorboard地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/examples/custom_diffusion/train_custom_diffusion.py | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/examples/custom_diffusion/train_custom_diffusion.py | https://pytorch.org/docs/stable/generated/torch.optim.Optimizer.zero_grad.html | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/examples/dreambooth/train_dreambooth.py | https://www.tensorflow.org/tensorboard | tensorboard地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/examples/dreambooth/train_dreambooth.py | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/examples/dreambooth/train_dreambooth.py | https://pytorch.org/docs/stable/generated/torch.optim.Optimizer.zero_grad.html | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/examples/dreambooth/train_dreambooth.py | https://www.crosslabs.org//blog/ | 问题引导 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/examples/dreambooth/train_dreambooth_flax.py | https://www.tensorflow.org/tensorboard | tensorboard地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/examples/dreambooth/train_dreambooth_lora.py | https://www.tensorflow.org/tensorboard | tensorboard地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/examples/dreambooth/train_dreambooth_lora.py | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/examples/instruct_pix2pix/train_instruct_pix2pix.py | https://www.tensorflow.org/tensorboard | tensorboard地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/examples/instruct_pix2pix/train_instruct_pix2pix.py | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/examples/instruct_pix2pix/train_instruct_pix2pix.py | https://arxiv.org/abs/2211.09800 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/examples/research_projects/colossalai/train_dreambooth_colossalai.py | https://www.tensorflow.org/tensorboard | tensorboard地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/examples/research_projects/dreambooth_inpaint/train_dreambooth_inpaint.py | https://www.tensorflow.org/tensorboard | tensorboard地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/examples/research_projects/dreambooth_inpaint/train_dreambooth_inpaint_lora.py | https://www.tensorflow.org/tensorboard | tensorboard地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/examples/research_projects/intel_opts/textual_inversion/textual_inversion_bf16.py | https://www.tensorflow.org/tensorboard | tensorboard地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/examples/research_projects/intel_opts/textual_inversion_dfq/textual_inversion.py | https://www.tensorflow.org/tensorboard | tensorboard地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/examples/research_projects/lora/train_text_to_image_lora.py | 
https://www.tensorflow.org/tensorboard | tensorboard地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/examples/research_projects/lora/train_text_to_image_lora.py | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/examples/research_projects/mulit_token_textual_inversion/textual_inversion.py | https://www.tensorflow.org/tensorboard | tensorboard地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/examples/research_projects/mulit_token_textual_inversion/textual_inversion.py | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/examples/research_projects/mulit_token_textual_inversion/textual_inversion_flax.py | https://www.tensorflow.org/tensorboard | tensorboard地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/examples/research_projects/multi_subject_dreambooth/train_multi_subject_dreambooth.py | https://www.tensorflow.org/tensorboard | tensorboard地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/examples/research_projects/multi_subject_dreambooth/train_multi_subject_dreambooth.py | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/examples/research_projects/multi_subject_dreambooth/train_multi_subject_dreambooth.py | https://pytorch.org/docs/stable/generated/torch.optim.Optimizer.zero_grad.html | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/examples/research_projects/onnxruntime/text_to_image/train_text_to_image.py | https://www.tensorflow.org/tensorboard | tensorboard地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/examples/research_projects/onnxruntime/text_to_image/train_text_to_image.py | https://arxiv.org/abs/2303.09556. 
| SNR weighting gamma论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/examples/research_projects/onnxruntime/text_to_image/train_text_to_image.py | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/examples/research_projects/onnxruntime/textual_inversion/textual_inversion.py | https://www.tensorflow.org/tensorboard | tensorboard地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/examples/research_projects/onnxruntime/textual_inversion/textual_inversion.py | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/examples/research_projects/onnxruntime/unconditional_image_generation/train_unconditional.py | https://www.tensorflow.org/tensorboard | tensorboard地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/examples/research_projects/onnxruntime/unconditional_image_generation/train_unconditional.py | https://www.tensorflow.org/tensorboard | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/examples/research_projects/onnxruntime/unconditional_image_generation/train_unconditional.py | https://www.wandb.ai | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/examples/text_to_image/pretrain_text_to_image.py | https://www.tensorflow.org/tensorboard | tensorboard地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/examples/text_to_image/pretrain_text_to_image.py | https://arxiv.org/abs/2303.09556. | SNR weighting gamma论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/examples/text_to_image/pretrain_text_to_image.py | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/examples/text_to_image/train_text_to_image.py | https://www.tensorflow.org/tensorboard | tensorboard地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/examples/text_to_image/train_text_to_image.py | https://arxiv.org/abs/2303.09556. | SNR weighting gamma论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/examples/text_to_image/train_text_to_image.py | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/examples/text_to_image/train_text_to_image_flax.py | https://www.tensorflow.org/tensorboard | tensorboard地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/examples/text_to_image/train_text_to_image_lora.py | https://www.tensorflow.org/tensorboard | tensorboard地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/examples/text_to_image/train_text_to_image_lora.py | https://arxiv.org/abs/2303.09556. 
| SNR weighting gamma论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/examples/text_to_image/train_text_to_image_lora.py | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/examples/textual_inversion/textual_inversion.py | https://www.tensorflow.org/tensorboard | tensorboard地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/examples/textual_inversion/textual_inversion.py | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/examples/textual_inversion/textual_inversion_flax.py | https://www.tensorflow.org/tensorboard | tensorboard地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/examples/unconditional_image_generation/train_unconditional.py | https://www.tensorflow.org/tensorboard | tensorboard地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/examples/unconditional_image_generation/train_unconditional.py | https://www.tensorflow.org/tensorboard | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/examples/unconditional_image_generation/train_unconditional.py | https://www.wandb.ai | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/scripts/convert_dit_to_diffusers.py | https://dl.fbaipublicfiles.com/DiT/models/{model_name} | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/scripts/convert_kakao_brain_unclip_to_diffusers.py | https://arena.kakaocdn.net/brainrepo/models/karlo-public/v1.0.0.alpha/efdf6206d8ed593961593dc029a8affa/decoder-ckpt-step%3D01000000-of-01000000.ckpt | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/scripts/convert_kakao_brain_unclip_to_diffusers.py | https://arena.kakaocdn.net/brainrepo/models/karlo-public/v1.0.0.alpha/85626483eaca9f581e2a78d31ff905ca/prior-ckpt-step%3D01000000-of-01000000.ckpt | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/scripts/convert_kakao_brain_unclip_to_diffusers.py | https://arena.kakaocdn.net/brainrepo/models/karlo-public/v1.0.0.alpha/4226b831ae0279020d134281f3c31590/improved-sr-ckpt-step%3D1.2M.ckpt | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/scripts/convert_kakao_brain_unclip_to_diffusers.py | https://arena.kakaocdn.net/brainrepo/models/karlo-public/v1.0.0.alpha/0b62380a75e56f073e2844ab5199153d/ViT-L-14_stats.th | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/scripts/convert_shap_e_to_diffusers.py | https://openaipublic.azureedge.net/main/shap-e/text_cond.pt | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/scripts/convert_vae_pt_to_diffusers.py | https://raw.githubusercontent.com/CompVis/stable-diffusion/main/configs/stable-diffusion/v1-inference.yaml | 配置文件 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/scripts/convert_vq_diffusion_to_diffusers.py | https://facevcstandard.blob.core.windows.net/v-zhictang/Improved-VQ-Diffusion_model_release/ithq_vqvae.pth?sv=2020-10-02&st=2022-05-30T15%3A17%3A18Z&se=2030-05-31T15%3A17%3A00Z&sr=b&sp=r&sig=1jVavHFPpUjDs%2FTO1V3PTezaNbPp2Nx8MxiWI7y6fEY%3D -O ithq_vqvae.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/scripts/convert_vq_diffusion_to_diffusers.py | 
https://facevcstandard.blob.core.windows.net/v-zhictang/Improved-VQ-Diffusion_model_release/ithq_learnable.pth?sv=2020-10-02&st=2022-05-30T10%3A22%3A06Z&se=2030-05-31T10%3A22%3A00Z&sr=b&sp=r&sig=GOE%2Bza02%2FPnGxYVOOPtwrTR4RA3%2F5NVgMxdW4kjaEZ8%3D -O ithq_learnable.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/setup.py | patrick@huggingface.co | 作者邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/src/diffusers/models/modeling_pytorch_flax_utils.py | httphttps://flax.readthedocs.io/en/latest/installation.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | https://raw.githubusercontent.com/CompVis/stable-diffusion/main/configs/stable-diffusion/v1-inference.yaml | 配置文件 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | https://raw.githubusercontent.com/Stability-AI/stablediffusion/main/configs/stable-diffusion/v2-inference-v.yaml | 配置文件 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | https://raw.githubusercontent.com/Stability-AI/generative-models/main/configs/inference/sd_xl_refiner.yaml | 配置文件 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | https://raw.githubusercontent.com/Stability-AI/generative-models/main/configs/inference/sd_xl_base.yaml | 配置文件 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_depth2img.py | http://images.cocodataset.org/val2017/000000039769.jpg | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/src/diffusers/utils/dynamic_modules_utils.py | https://raw.githubusercontent.com/huggingface/diffusers/{revision}/examples/community/{pipeline}.py | 下载依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/src/diffusers/utils/dynamic_modules_utils.py | https://raw.githubusercontent.com/huggingface/diffusers/{revision}/examples/community/{pipeline}.py | 下载依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/src/diffusers/utils/import_utils.py | https://librosa.org/doc/latest/install.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.18.1/src/diffusers/utils/import_utils.py | https://pytorch.org/get-started/locally/ | 三方库链接 | \ No newline at end of file diff --git a/PyTorch/built-in/diffusion/diffusers0.21.0/public_address_statement.md b/PyTorch/built-in/diffusion/diffusers0.21.0/public_address_statement.md index 93d7662a2f14d7a12a1271c95793b9a20f194886..255835c7ff79a067ae4742658b52dfca6c92683b 100644 --- a/PyTorch/built-in/diffusion/diffusers0.21.0/public_address_statement.md +++ b/PyTorch/built-in/diffusion/diffusers0.21.0/public_address_statement.md @@ -1,1707 +1,51 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ------- |------|---------------------------------------------------------------------|--------------------------------------------------|---------| -| 开源代码引入 | .\CITATION.cff | repository-code 'https //github.com/huggingface/diffusers' | 问题引导 | -| 开源代码引入 | .\examples\controlnet\train_controlnet.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\examples\controlnet\train_controlnet.py | "See https://huggingface.co/docs/diffusers/main/en/training/dreambooth#performing-inference-using-a-saved-checkpoint for step by 
step" //huggingface.co/docs/diffusers/main/en/training/dreambooth#performing-inference-using-a-saved-checkpoint for step by step" | 问题引导 | -| 开源代码引入 | .\examples\controlnet\train_controlnet.py | "[TensorBoard](https://www.tensorflow.org/tensorboard) log directory. Will default to" //www.tensorflow.org/tensorboard) log directory. Will default to" | 问题引导 | -| 开源代码引入 | .\examples\controlnet\train_controlnet.py | " https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices" //pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices" | 问题引导 | -| 开源代码引入 | .\examples\controlnet\train_controlnet.py | " https://pytorch.org/docs/stable/generated/torch.optim.Optimizer.zero_grad.html" //pytorch.org/docs/stable/generated/torch.optim.Optimizer.zero_grad.html" | 问题引导 | -| 开源代码引入 | .\examples\controlnet\train_controlnet.py | " https://huggingface.co/docs/datasets/image_dataset#imagefolder. In particular, a `metadata.jsonl` file" //huggingface.co/docs/datasets/image_dataset#imagefolder. In particular, a `metadata.jsonl` file" | 问题引导 | -| 开源代码引入 | .\examples\controlnet\train_controlnet.py | " more information see https://huggingface.co/docs/accelerate/v0.17.0/en/package_reference/accelerator#accelerate.Accelerator" //huggingface.co/docs/accelerate/v0.17.0/en/package_reference/accelerator#accelerate.Accelerator" | 问题引导 | -| 开源代码引入 | .\examples\controlnet\train_controlnet.py | # https //huggingface.co/docs/datasets/v2.0.0/en/dataset_script | 问题引导 | -| 开源代码引入 | .\examples\controlnet\train_controlnet.py | "xFormers 0.0.16 cannot be used for training in some GPUs. If you observe problems during training, please update xFormers to at least 0.0.17. See https://huggingface.co/docs/diffusers/main/en/optimization/xformers for more details." //huggingface.co/docs/diffusers/main/en/optimization/xformers for more details." | 问题引导 | -| 开源代码引入 | .\examples\controlnet\train_controlnet.py | # cf https //pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 问题引导 | -| 开源代码引入 | .\examples\controlnet\train_controlnet_flax.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\examples\controlnet\train_controlnet_flax.py | # see more https //github.com/python-pillow/Pillow/issues/5610 | 问题引导 | -| 开源代码引入 | .\examples\controlnet\train_controlnet_flax.py | "More details here: https://arxiv.org/abs/2303.09556.", https //arxiv.org/abs/2303.09556.", | 问题引导 | -| 开源代码引入 | .\examples\controlnet\train_controlnet_flax.py | "Folder must contain a dataset script as described here https://huggingface.co/docs/datasets/dataset_script) ." //huggingface.co/docs/datasets/dataset_script) ." 
| 问题引导 | -| 开源代码引入 | .\examples\controlnet\train_controlnet_flax.py | "See more https://huggingface.co/docs/datasets/package_reference/main_classes#datasets.Dataset.load_from_disk" //huggingface.co/docs/datasets/package_reference/main_classes#datasets.Dataset.load_from_disk" | 问题引导 | -| 开源代码引入 | .\examples\controlnet\train_controlnet_flax.py | # https //github.com/borisdayma/dalle-mini/blob/d2be512d4a6a9cda2d63ba04afc33038f98f705f/src/dalle_mini/data.py#L370 | 问题引导 | -| 开源代码引入 | .\examples\controlnet\train_controlnet_flax.py | # https //huggingface.co/docs/datasets/v2.0.0/en/dataset_script | 问题引导 | -| 开源代码引入 | .\examples\controlnet\train_controlnet_flax.py | Computes SNR as per https //github.com/TiankaiHang/Min-SNR-Diffusion-Training/blob/521b624bd70c67cee4bdf49225915f5945a872e3/guided_diffusion/gaussian_diffusion.py#L847-L849 | 问题引导 | -| 开源代码引入 | .\examples\controlnet\train_controlnet_sdxl.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\examples\controlnet\train_controlnet_sdxl.py | help="Path to an improved VAE to stabilize training. For more details check out: https://github.com/huggingface/diffusers/pull/4038.", https //github.com/huggingface/diffusers/pull/4038.", | 问题引导 | -| 开源代码引入 | .\examples\controlnet\train_controlnet_sdxl.py | "See https://huggingface.co/docs/diffusers/main/en/training/dreambooth#performing-inference-using-a-saved-checkpoint for step by step" //huggingface.co/docs/diffusers/main/en/training/dreambooth#performing-inference-using-a-saved-checkpoint for step by step" | 问题引导 | -| 开源代码引入 | .\examples\controlnet\train_controlnet_sdxl.py | "[TensorBoard](https://www.tensorflow.org/tensorboard) log directory. Will default to" //www.tensorflow.org/tensorboard) log directory. Will default to" | 问题引导 | -| 开源代码引入 | .\examples\controlnet\train_controlnet_sdxl.py | " https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices" //pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices" | 问题引导 | -| 开源代码引入 | .\examples\controlnet\train_controlnet_sdxl.py | " https://pytorch.org/docs/stable/generated/torch.optim.Optimizer.zero_grad.html" //pytorch.org/docs/stable/generated/torch.optim.Optimizer.zero_grad.html" | 问题引导 | -| 开源代码引入 | .\examples\controlnet\train_controlnet_sdxl.py | " https://huggingface.co/docs/datasets/image_dataset#imagefolder. In particular, a `metadata.jsonl` file" //huggingface.co/docs/datasets/image_dataset#imagefolder. In particular, a `metadata.jsonl` file" | 问题引导 | -| 开源代码引入 | .\examples\controlnet\train_controlnet_sdxl.py | " more information see https://huggingface.co/docs/accelerate/v0.17.0/en/package_reference/accelerator#accelerate.Accelerator" //huggingface.co/docs/accelerate/v0.17.0/en/package_reference/accelerator#accelerate.Accelerator" | 问题引导 | -| 开源代码引入 | .\examples\controlnet\train_controlnet_sdxl.py | # https //huggingface.co/docs/datasets/v2.0.0/en/dataset_script | 问题引导 | -| 开源代码引入 | .\examples\controlnet\train_controlnet_sdxl.py | "xFormers 0.0.16 cannot be used for training in some GPUs. If you observe problems during training, please update xFormers to at least 0.0.17. See https://huggingface.co/docs/diffusers/main/en/optimization/xformers for more details." //huggingface.co/docs/diffusers/main/en/optimization/xformers for more details." 
| 问题引导 | -| 开源代码引入 | .\examples\controlnet\train_controlnet_sdxl.py | # cf https //pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 问题引导 | -| 开源代码引入 | .\examples\controlnet\train_controlnet_sdxl.py | # details https //github.com/huggingface/diffusers/pull/4038#discussion_r1266078401 | 问题引导 | -| 开源代码引入 | .\examples\dreambooth\train_dreambooth.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\examples\dreambooth\train_dreambooth.py | This is a dreambooth model derived from {base_model}. The weights were trained on {prompt} using [DreamBooth](https //dreambooth.github.io/). | 下载预训练模型 | -| 开源代码引入 | .\examples\dreambooth\train_dreambooth.py | "See https://huggingface.co/docs/diffusers/main/en/training/dreambooth#performing-inference-using-a-saved-checkpoint for step by step" //huggingface.co/docs/diffusers/main/en/training/dreambooth#performing-inference-using-a-saved-checkpoint for step by step" | 问题引导 | -| 开源代码引入 | .\examples\dreambooth\train_dreambooth.py | " See Accelerator::save_state https://huggingface.co/docs/accelerate/package_reference/accelerator#accelerate.Accelerator.save_state" save_state https | 问题引导 | -| 开源代码引入 | .\examples\dreambooth\train_dreambooth.py | "[TensorBoard](https://www.tensorflow.org/tensorboard) log directory. Will default to" //www.tensorflow.org/tensorboard) log directory. Will default to" | 问题引导 | -| 开源代码引入 | .\examples\dreambooth\train_dreambooth.py | " https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices" //pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices" | 问题引导 | -| 开源代码引入 | .\examples\dreambooth\train_dreambooth.py | " https://pytorch.org/docs/stable/generated/torch.optim.Optimizer.zero_grad.html" //pytorch.org/docs/stable/generated/torch.optim.Optimizer.zero_grad.html" | 问题引导 | -| 开源代码引入 | .\examples\dreambooth\train_dreambooth.py | " See: https://www.crosslabs.org//blog/diffusion-with-offset-noise for more information." https //www.crosslabs.org//blog/diffusion-with-offset-noise for more information." | 问题引导 | -| 开源代码引入 | .\examples\dreambooth\train_dreambooth.py | "xFormers 0.0.16 cannot be used for training in some GPUs. If you observe problems during training, please update xFormers to at least 0.0.17. See https://huggingface.co/docs/diffusers/main/en/optimization/xformers for more details." //huggingface.co/docs/diffusers/main/en/optimization/xformers for more details." | 问题引导 | -| 开源代码引入 | .\examples\dreambooth\train_dreambooth.py | # cf https //pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 问题引导 | -| 开源代码引入 | .\examples\dreambooth\train_dreambooth_flax.py | "[TensorBoard](https://www.tensorflow.org/tensorboard) log directory. Will default to" //www.tensorflow.org/tensorboard) log directory. Will default to" | 问题引导 | -| 开源代码引入 | .\examples\dreambooth\train_dreambooth_lora.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\examples\dreambooth\train_dreambooth_lora.py | These are LoRA adaption weights for {base_model}. The weights were trained on {prompt} using [DreamBooth](https //dreambooth.github.io/). You can find some example images in the following. \n | 下载预训练模型 | -| 开源代码引入 | .\examples\dreambooth\train_dreambooth_lora.py | "[TensorBoard](https://www.tensorflow.org/tensorboard) log directory. Will default to" //www.tensorflow.org/tensorboard) log directory. 
Will default to" | 问题引导 | -| 开源代码引入 | .\examples\dreambooth\train_dreambooth_lora.py | " https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices" //pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices" | 问题引导 | -| 开源代码引入 | .\examples\dreambooth\train_dreambooth_lora.py | "xFormers 0.0.16 cannot be used for training in some GPUs. If you observe problems during training, please update xFormers to at least 0.0.17. See https://huggingface.co/docs/diffusers/main/en/optimization/xformers for more details." //huggingface.co/docs/diffusers/main/en/optimization/xformers for more details." | 问题引导 | -| 开源代码引入 | .\examples\dreambooth\train_dreambooth_lora.py | # cf https //pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 问题引导 | -| 开源代码引入 | .\examples\dreambooth\train_dreambooth_lora_sdxl.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\examples\dreambooth\train_dreambooth_lora_sdxl.py | These are LoRA adaption weights for {base_model}. The weights were trained on {prompt} using [DreamBooth](https //dreambooth.github.io/). You can find some example images in the following. \n | 下载预训练模型 | -| 开源代码引入 | .\examples\dreambooth\train_dreambooth_lora_sdxl.py | help="Path to pretrained VAE model with better numerical stability. More details: https://github.com/huggingface/diffusers/pull/4038.", https //github.com/huggingface/diffusers/pull/4038.", | 问题引导 | -| 开源代码引入 | .\examples\dreambooth\train_dreambooth_lora_sdxl.py | "[TensorBoard](https://www.tensorflow.org/tensorboard) log directory. Will default to" //www.tensorflow.org/tensorboard) log directory. Will default to" | 问题引导 | -| 开源代码引入 | .\examples\dreambooth\train_dreambooth_lora_sdxl.py | " https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices" //pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices" | 问题引导 | -| 开源代码引入 | .\examples\dreambooth\train_dreambooth_lora_sdxl.py | "xFormers 0.0.16 cannot be used for training in some GPUs. If you observe problems during training, please update xFormers to at least 0.0.17. See https://huggingface.co/docs/diffusers/main/en/optimization/xformers for more details." //huggingface.co/docs/diffusers/main/en/optimization/xformers for more details." | 问题引导 | -| 开源代码引入 | .\examples\dreambooth\train_dreambooth_lora_sdxl.py | # cf https //pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image.py | " https://huggingface.co/docs/datasets/image_dataset#imagefolder. In particular, a `metadata.jsonl` file" //huggingface.co/docs/datasets/image_dataset#imagefolder. In particular, a `metadata.jsonl` file" | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image.py | "More details here: https://arxiv.org/abs/2303.09556.", https //arxiv.org/abs/2303.09556.", | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image.py | " https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices" //pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices" | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image.py | "[TensorBoard](https://www.tensorflow.org/tensorboard) log directory. Will default to" //www.tensorflow.org/tensorboard) log directory. 
Will default to" | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image.py | " more information see https://huggingface.co/docs/accelerate/v0.17.0/en/package_reference/accelerator#accelerate.Accelerator" //huggingface.co/docs/accelerate/v0.17.0/en/package_reference/accelerator#accelerate.Accelerator" | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image.py | "xFormers 0.0.16 cannot be used for training in some GPUs. If you observe problems during training, please update xFormers to at least 0.0.17. See https://huggingface.co/docs/diffusers/main/en/optimization/xformers for more details." //huggingface.co/docs/diffusers/main/en/optimization/xformers for more details." | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image.py | Computes SNR as per https //github.com/TiankaiHang/Min-SNR-Diffusion-Training/blob/521b624bd70c67cee4bdf49225915f5945a872e3/guided_diffusion/gaussian_diffusion.py#L847-L849 | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image.py | # Adapted from https //github.com/TiankaiHang/Min-SNR-Diffusion-Training/blob/521b624bd70c67cee4bdf49225915f5945a872e3/guided_diffusion/gaussian_diffusion.py#L1026 | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image.py | # cf https //pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image.py | # https //huggingface.co/docs/datasets/v2.4.0/en/image_load#imagefolder | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image.py | # https //www.crosslabs.org//blog/diffusion-with-offset-noise | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image.py | # Compute loss-weights as per Section 3.4 of https //arxiv.org/abs/2303.09556. | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_flax.py | " https://huggingface.co/docs/datasets/image_dataset#imagefolder. In particular, a `metadata.jsonl` file" //huggingface.co/docs/datasets/image_dataset#imagefolder. In particular, a `metadata.jsonl` file" | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_flax.py | "[TensorBoard](https://www.tensorflow.org/tensorboard) log directory. Will default to" //www.tensorflow.org/tensorboard) log directory. Will default to" | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_flax.py | # https //huggingface.co/docs/datasets/v2.4.0/en/image_load#imagefolder | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_lora.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_lora.py | " https://huggingface.co/docs/datasets/image_dataset#imagefolder. In particular, a `metadata.jsonl` file" //huggingface.co/docs/datasets/image_dataset#imagefolder. In particular, a `metadata.jsonl` file" | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_lora.py | "More details here: https://arxiv.org/abs/2303.09556.", https //arxiv.org/abs/2303.09556.", | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_lora.py | " https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices" //pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices" | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_lora.py | "[TensorBoard](https://www.tensorflow.org/tensorboard) log directory. Will default to" //www.tensorflow.org/tensorboard) log directory. 
Will default to" | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_lora.py | "xFormers 0.0.16 cannot be used for training in some GPUs. If you observe problems during training, please update xFormers to at least 0.0.17. See https://huggingface.co/docs/diffusers/main/en/optimization/xformers for more details." //huggingface.co/docs/diffusers/main/en/optimization/xformers for more details." | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_lora.py | Computes SNR as per https //github.com/TiankaiHang/Min-SNR-Diffusion-Training/blob/521b624bd70c67cee4bdf49225915f5945a872e3/guided_diffusion/gaussian_diffusion.py#L847-L849 | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_lora.py | # Adapted from https //github.com/TiankaiHang/Min-SNR-Diffusion-Training/blob/521b624bd70c67cee4bdf49225915f5945a872e3/guided_diffusion/gaussian_diffusion.py#L1026 | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_lora.py | # cf https //pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_lora.py | # https //huggingface.co/docs/datasets/v2.4.0/en/image_load#imagefolder | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_lora.py | # https //www.crosslabs.org//blog/diffusion-with-offset-noise | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_lora.py | # Compute loss-weights as per Section 3.4 of https //arxiv.org/abs/2303.09556. | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_lora_sdxl.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_lora_sdxl.py | help="Path to pretrained VAE model with better numerical stability. More details: https://github.com/huggingface/diffusers/pull/4038.", https //github.com/huggingface/diffusers/pull/4038.", | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_lora_sdxl.py | " https://huggingface.co/docs/datasets/image_dataset#imagefolder. In particular, a `metadata.jsonl` file" //huggingface.co/docs/datasets/image_dataset#imagefolder. In particular, a `metadata.jsonl` file" | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_lora_sdxl.py | "More details here: https://arxiv.org/abs/2303.09556.", https //arxiv.org/abs/2303.09556.", | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_lora_sdxl.py | " https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices" //pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices" | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_lora_sdxl.py | "[TensorBoard](https://www.tensorflow.org/tensorboard) log directory. Will default to" //www.tensorflow.org/tensorboard) log directory. Will default to" | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_lora_sdxl.py | "xFormers 0.0.16 cannot be used for training in some GPUs. If you observe problems during training, please update xFormers to at least 0.0.17. See https://huggingface.co/docs/diffusers/main/en/optimization/xformers for more details." //huggingface.co/docs/diffusers/main/en/optimization/xformers for more details." 
| 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_lora_sdxl.py | Computes SNR as per https //github.com/TiankaiHang/Min-SNR-Diffusion-Training/blob/521b624bd70c67cee4bdf49225915f5945a872e3/guided_diffusion/gaussian_diffusion.py#L847-L849 | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_lora_sdxl.py | # Adapted from https //github.com/TiankaiHang/Min-SNR-Diffusion-Training/blob/521b624bd70c67cee4bdf49225915f5945a872e3/guided_diffusion/gaussian_diffusion.py#L1026 | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_lora_sdxl.py | # cf https //pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_lora_sdxl.py | # https //huggingface.co/docs/datasets/v2.4.0/en/image_load#imagefolder | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_lora_sdxl.py | # https //www.crosslabs.org//blog/diffusion-with-offset-noise | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_lora_sdxl.py | # Compute loss-weights as per Section 3.4 of https //arxiv.org/abs/2303.09556. | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_sdxl.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_sdxl.py | help="Path to pretrained VAE model with better numerical stability. More details: https://github.com/huggingface/diffusers/pull/4038.", https //github.com/huggingface/diffusers/pull/4038.", | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_sdxl.py | " https://huggingface.co/docs/datasets/image_dataset#imagefolder. In particular, a `metadata.jsonl` file" //huggingface.co/docs/datasets/image_dataset#imagefolder. In particular, a `metadata.jsonl` file" | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_sdxl.py | "More details here: https://arxiv.org/abs/2303.09556.", https //arxiv.org/abs/2303.09556.", | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_sdxl.py | " https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices" //pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices" | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_sdxl.py | "[TensorBoard](https://www.tensorflow.org/tensorboard) log directory. Will default to" //www.tensorflow.org/tensorboard) log directory. Will default to" | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_sdxl.py | "xFormers 0.0.16 cannot be used for training in some GPUs. If you observe problems during training, please update xFormers to at least 0.0.17. See https://huggingface.co/docs/diffusers/main/en/optimization/xformers for more details." //huggingface.co/docs/diffusers/main/en/optimization/xformers for more details." 
| 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_sdxl.py | Computes SNR as per https //github.com/TiankaiHang/Min-SNR-Diffusion-Training/blob/521b624bd70c67cee4bdf49225915f5945a872e3/guided_diffusion/gaussian_diffusion.py#L847-L849 | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_sdxl.py | # Adapted from https //github.com/TiankaiHang/Min-SNR-Diffusion-Training/blob/521b624bd70c67cee4bdf49225915f5945a872e3/guided_diffusion/gaussian_diffusion.py#L1026 | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_sdxl.py | # cf https //pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_sdxl.py | # https //huggingface.co/docs/datasets/v2.4.0/en/image_load#imagefolder | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_sdxl.py | # details https //github.com/huggingface/diffusers/pull/4038#discussion_r1266078401 | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_sdxl.py | # https //www.crosslabs.org//blog/diffusion-with-offset-noise | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_sdxl.py | # Compute loss-weights as per Section 3.4 of https //arxiv.org/abs/2303.09556. | 问题引导 | -| 开源代码引入 | .\setup.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\setup.py | Simple check list from AllenNLP repo https //github.com/allenai/allennlp/blob/main/setup.py | 问题引导 | -| 开源代码引入 | .\setup.py | twine upload dist/* -r pypitest --repository-url=https //test.pypi.org/legacy/ | 问题引导 | -| 开源代码引入 | .\setup.py | pip install -i https //testpypi.python.org/pypi diffusers | 问题引导 | -| 开源代码引入 | .\setup.py | pip install -i https //testpypi.python.org/pypi diffusers | 问题引导 | -| 开源代码引入 | .\setup.py | url="https://github.com/huggingface/diffusers", //github.com/huggingface/diffusers", | 问题引导 | -| 开源代码引入 | .\setup.py | # twine upload dist/* -r pypitest --repository-url=https //test.pypi.org/legacy/ | 问题引导 | -| 开源代码引入 | .\setup.py | # pip install -i https //testpypi.python.org/pypi diffusers | 问题引导 | -| 开源代码引入 | .\src\diffusers\commands\diffusers_cli.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\commands\env.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\commands\fp16_safetensors.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\commands\fp16_safetensors.py | " CLI](https://github.com/huggingface/diffusers/blob/main/src/diffusers/commands/fp16_safetensors.py)." //github.com/huggingface/diffusers/blob/main/src/diffusers/commands/fp16_safetensors.py)." | 问题引导 | -| 开源代码引入 | .\src\diffusers\commands\__init__.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\configuration_utils.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\configuration_utils.py | config attributes directly. 
See https //github.com/huggingface/diffusers/pull/3129 | 问题引导 | -| 开源代码引入 | .\src\diffusers\configuration_utils.py | https //pytorch.org/docs/stable/_modules/torch/nn/modules/module.html#Module | 问题引导 | -| 开源代码引入 | .\src\diffusers\configuration_utils.py | 'http //hostname' 'foo.bar | 问题引导 | -| 开源代码引入 | .\src\diffusers\configuration_utils.py | " listed on 'https://huggingface.co/models'\nIf this is a private repository, make sure to pass a" //huggingface.co/models'\nIf this is a private repository, make sure to pass a" | 问题引导 | -| 开源代码引入 | .\src\diffusers\configuration_utils.py | f" 'https://huggingface.co/{pretrained_model_name_or_path}' for available revisions." //huggingface.co/{pretrained_model_name_or_path}' for available revisions." | 下载预训练模型 | -| 开源代码引入 | .\src\diffusers\configuration_utils.py | " 'https://huggingface.co/docs/diffusers/installation#offline-mode'." //huggingface.co/docs/diffusers/installation#offline-mode'." | 问题引导 | -| 开源代码引入 | .\src\diffusers\configuration_utils.py | "'https://huggingface.co/models', make sure you don't have a local directory with the same name. " //huggingface.co/models', make sure you don't have a local directory with the same name. " | 问题引导 | -| 开源代码引入 | .\src\diffusers\dependency_versions_check.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\image_processor.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | [`attention_processor.py`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py) | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | dict](https //pytorch.org/tutorials/beginner/saving_loading_models.html#what-is-a-state-dict). | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | 'http //hostname' 'foo.bar | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | # See https //github.com/darkstorm2150/sd-scripts/blob/main/docs/train_network_README-en.md#execute-learning | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | warn_message = "The state_dict contains LoRA params corresponding to the text encoder which are not being used here. To use both UNet and text encoder related LoRA params, use [`pipe.load_lora_weights()`](https://huggingface.co/docs/diffusers/main/en/api/loaders#diffusers.loaders.LoraLoaderMixin.load_lora_weights)." //huggingface.co/docs/diffusers/main/en/api/loaders#diffusers.loaders.LoraLoaderMixin.load_lora_weights)." | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | dict](https //pytorch.org/tutorials/beginner/saving_loading_models.html#what-is-a-state-dict). | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | 'http //hostname' 'foo.bar | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | (for example from [civitAI](https //civitai.com/models/3036?modelVersionId=9857)) and then load the vector | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | [`CLIPTextModel`](https //huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel). | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | dict](https //pytorch.org/tutorials/beginner/saving_loading_models.html#what-is-a-state-dict). 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | 'http //hostname' 'foo.bar | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | # If the serialization format is new (introduced in https //github.com/huggingface/diffusers/pull/2918), | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | # If the serialization format is new (introduced in https //github.com/huggingface/diffusers/pull/2918), | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | [`pipe.fuse_lora()`](https //huggingface.co/docs/diffusers/main/en/api/loaders#diffusers.loaders.LoraLoaderMixin.fuse_lora). | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | `"https://huggingface.co//blob/main/.ckpt"`) on the Hub. //huggingface.co//blob/main/.ckpt"`) on the Hub. | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | 'http //hostname' 'foo.bar | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | [clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14) variant. If this | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | ... "https://huggingface.co/WarriorMama777/OrangeMixs/blob/main/Models/AbyssOrangeMix/AbyssOrangeMix.safetensors" //huggingface.co/WarriorMama777/OrangeMixs/blob/main/Models/AbyssOrangeMix/AbyssOrangeMix.safetensors" | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | ... "https://huggingface.co/runwayml/stable-diffusion-v1-5/blob/main/v1-5-pruned-emaonly.ckpt", //huggingface.co/runwayml/stable-diffusion-v1-5/blob/main/v1-5-pruned-emaonly.ckpt", | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | valid_url_prefixes = ["https://huggingface.co/", "huggingface.co/", "hf.co/", "https://hf.co/"] //huggingface.co/", "huggingface.co/", "hf.co/", "https://hf.co/"] //hf.co/"] | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | `"https://huggingface.co//blob/main/.ckpt"`) on the Hub. //huggingface.co//blob/main/.ckpt"`) on the Hub. | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | 'http //hostname' 'foo.bar | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | Image Synthesis with Latent Diffusion Models](https //arxiv.org/abs/2112.10752) paper. | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | url = "https://huggingface.co/stabilityai/sd-vae-ft-mse-original/blob/main/vae-ft-mse-840000-ema-pruned.safetensors" # can also be local file //huggingface.co/stabilityai/sd-vae-ft-mse-original/blob/main/vae-ft-mse-840000-ema-pruned.safetensors" # can also be local file | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | for prefix in ["https://huggingface.co/", "huggingface.co/", "hf.co/", "https://hf.co/"]: //huggingface.co/", "huggingface.co/", "hf.co/", "https://hf.co/"]: //hf.co/"]: | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | config_url = "https://raw.githubusercontent.com/CompVis/stable-diffusion/main/configs/stable-diffusion/v1-inference.yaml" //raw.githubusercontent.com/CompVis/stable-diffusion/main/configs/stable-diffusion/v1-inference.yaml" | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | `"https://huggingface.co//blob/main/.ckpt"`) on the Hub. //huggingface.co//blob/main/.ckpt"`) on the Hub. 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | 'http //hostname' 'foo.bar | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | url = "https://huggingface.co/lllyasviel/ControlNet-v1-1/blob/main/control_v11p_sd15_canny.pth" # can also be a local path //huggingface.co/lllyasviel/ControlNet-v1-1/blob/main/control_v11p_sd15_canny.pth" # can also be a local path | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | url = "https://huggingface.co/runwayml/stable-diffusion-v1-5/blob/main/v1-5-pruned.safetensors" # can also be a local path //huggingface.co/runwayml/stable-diffusion-v1-5/blob/main/v1-5-pruned.safetensors" # can also be a local path | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | for prefix in ["https://huggingface.co/", "huggingface.co/", "hf.co/", "https://hf.co/"]: //huggingface.co/", "huggingface.co/", "hf.co/", "https://hf.co/"]: //hf.co/"]: | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | config_url = "https://raw.githubusercontent.com/lllyasviel/ControlNet/main/models/cldm_v15.yaml" //raw.githubusercontent.com/lllyasviel/ControlNet/main/models/cldm_v15.yaml" | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\adapter.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\adapter.py | map](https //hf.co/docs/accelerate/main/en/usage_guides/big_modeling#designing-a-device-map). | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\adapter.py | [Adapter](https //github.com/TencentARC/T2I-Adapter/blob/686de4681515662c0ac2ffa07bf5dda83af1038a/ldm/modules/encoders/adapter.py#L97) | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\adapter.py | [AdapterLight](https //github.com/TencentARC/T2I-Adapter/blob/686de4681515662c0ac2ffa07bf5dda83af1038a/ldm/modules/encoders/adapter.py#L235). | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\attention.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\attention.py | A variant of the gated linear unit activation function from https //arxiv.org/abs/2002.05202. | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\attention.py | For more details, see section 2 https //arxiv.org/abs/1606.08415 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\attention_flax.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\attention_flax.py | Flax Memory-efficient multi-head dot product attention. https //arxiv.org/abs/2112.05682v2 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\attention_flax.py | https //github.com/AminRezaei0x443/memory-efficient-attention | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\attention_flax.py | A Flax multi-head attention module as described in https //arxiv.org/abs/1706.03762 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\attention_flax.py | enable memory efficient attention https //arxiv.org/abs/2112.05682 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\attention_flax.py | https //arxiv.org/abs/1706.03762 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\attention_flax.py | enable memory efficient attention https //arxiv.org/abs/2112.05682 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\attention_flax.py | https //arxiv.org/pdf/1506.02025.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\attention_flax.py | enable memory efficient attention https //arxiv.org/abs/2112.05682 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\attention_flax.py | https //arxiv.org/abs/2002.05202 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\attention_flax.py | https //arxiv.org/abs/2002.05202. 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\models\attention_processor.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\attention_processor.py | "Refer to https://github.com/facebookresearch/xformers for more information on how to install" //github.com/facebookresearch/xformers for more information on how to install" | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\attention_processor.py | [operator](https //facebookresearch.github.io/xformers/components/ops.html#xformers.ops.AttentionOpBase) to | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\attention_processor.py | [operator](https //facebookresearch.github.io/xformers/components/ops.html#xformers.ops.AttentionOpBase) to | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\attention_processor.py | [operator](https //facebookresearch.github.io/xformers/components/ops.html#xformers.ops.AttentionOpBase) to use | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\attention_processor.py | Spatially conditioned normalization as defined in https //arxiv.org/abs/2209.09002 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\attention_processor.py | [operator](https //facebookresearch.github.io/xformers/components/ops.html#xformers.ops.AttentionOpBase) to | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\autoencoder_asym_kl.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\autoencoder_asym_kl.py | Designing a Better Asymmetric VQGAN for StableDiffusion https //arxiv.org/abs/2306.04632 . A VAE model with KL loss | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\autoencoder_asym_kl.py | Synthesis with Latent Diffusion Models](https //arxiv.org/abs/2112.10752) paper. | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\autoencoder_kl.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\autoencoder_kl.py | Synthesis with Latent Diffusion Models](https //arxiv.org/abs/2112.10752) paper. | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\autoencoder_kl.py | `force_upcast` can be set to `False` - see https //huggingface.co/madebyollin/sdxl-vae-fp16-fix | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\autoencoder_tiny.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\autoencoder_tiny.py | Synthesis with Latent Diffusion Models](https //arxiv.org/abs/2112.10752) paper. For this Autoencoder, | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\autoencoder_tiny.py | [AutoEncoder](https //huggingface.co/madebyollin/sdxl-vae-fp16-fix)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\controlnet.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\controlnet.py | Quoting from https //arxiv.org/abs/2302.05543 "Stable Diffusion uses a pre-processing method similar to VQ-GAN | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\controlnet.py | # when this library was created. 
The incorrect naming was only discovered much later in https //github.com/huggingface/diffusers/issues/2011#issuecomment-1547958131 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\controlnet_flax.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\controlnet_flax.py | This model is also a Flax Linen [`flax.linen.Module`](https //flax.readthedocs.io/en/latest/flax.linen.html#module) | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\controlnet_flax.py | - [Just-In-Time (JIT) compilation](https //jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit) | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\controlnet_flax.py | - [Automatic Differentiation](https //jax.readthedocs.io/en/latest/jax.html#automatic-differentiation) | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\controlnet_flax.py | - [Vectorization](https //jax.readthedocs.io/en/latest/jax.html#vectorization-vmap) | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\controlnet_flax.py | - [Parallelization](https //jax.readthedocs.io/en/latest/jax.html#parallelization-pmap) | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\controlnet_flax.py | # when this library was created. The incorrect naming was only discovered much later in https //github.com/huggingface/diffusers/issues/2011#issuecomment-1547958131 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\dual_transformer_2d.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\embeddings.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\embeddings.py | For more details, see figure 10 of the dall-e paper https //arxiv.org/abs/2102.12092 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\embeddings.py | # Copied from https //github.com/deep-floyd/IF/blob/2f91391f27dd3c468bf174be5805b4cc92980c0b/deepfloyd_if/model/nn.py#L54 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\embeddings_flax.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\embeddings_flax.py | Wrapper Module for sinusoidal Time step Embeddings as described in https //arxiv.org/abs/2006.11239 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\lora.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\lora.py | # See https //github.com/darkstorm2150/sd-scripts/blob/main/docs/train_network_README-en.md#execute-learning | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\lora.py | # # see https //github.com/bmaltais/kohya_ss/blob/2accb1305979ba62f5077a23aabac23b4c37e935/networks/lora_diffusers.py#L129 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\lora.py | # See https //github.com/darkstorm2150/sd-scripts/blob/main/docs/train_network_README-en.md#execute-learning | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\lora.py | # see https //github.com/huggingface/diffusers/pull/4315 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\modeling_flax_pytorch_utils.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\modeling_flax_pytorch_utils.py | # Adapted from https //github.com/huggingface/transformers/blob/c603c80f46881ae18b2ca50770ef65fa4033eacd/src/transformers/modeling_flax_pytorch_utils.py#L69 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\modeling_flax_pytorch_utils.py | # and https //github.com/patil-suraj/stable-diffusion-jax/blob/main/stable_diffusion_jax/convert_diffusers_to_jax.py | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\modeling_flax_utils.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | 
.\src\diffusers\models\modeling_flax_utils.py | # taken from https //github.com/deepmind/jmp/blob/3a8318abc3292be38582794dbf7b094e6583b192/jmp/_src/policy.py#L27 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\modeling_flax_utils.py | 'http //hostname' 'foo.bar | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\modeling_flax_utils.py | "listed on 'https://huggingface.co/models'\nIf this is a private repository, make sure to pass a " //huggingface.co/models'\nIf this is a private repository, make sure to pass a " | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\modeling_flax_utils.py | f"'https://huggingface.co/{pretrained_model_name_or_path}' for available revisions." //huggingface.co/{pretrained_model_name_or_path}' for available revisions." | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\modeling_flax_utils.py | " 'https://huggingface.co/docs/transformers/installation#offline-mode'." //huggingface.co/docs/transformers/installation#offline-mode'." | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\modeling_flax_utils.py | "'https://huggingface.co/models', make sure you don't have a local directory with the same name. " //huggingface.co/models', make sure you don't have a local directory with the same name. " | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\modeling_flax_utils.py | # https //github.com/google/flax/issues/1261 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\modeling_pytorch_flax_utils.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\modeling_pytorch_flax_utils.py | # from https //github.com/huggingface/transformers/blob/main/src/transformers/modeling_flax_pytorch_utils.py#L224-L352 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\modeling_pytorch_flax_utils.py | " https://pytorch.org/ and https://flax.readthedocs.io/en/latest/installation.html for installation" //pytorch.org/ and https //flax.readthedocs.io/en/latest/installation.html for installation" | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\modeling_utils.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\modeling_utils.py | f"Cannot load {model_name_or_path_str}because {param_name} expected shape {empty_state_dict[param_name]}, but got {param.shape}. If you want to instead overwrite randomly initialized weights, please make sure to pass both `low_cpu_mem_usage=False` and `ignore_mismatched_sizes=True`. For more information, see also: https://github.com/huggingface/diffusers/issues/1619#issuecomment-1345604389 as an example." https //github.com/huggingface/diffusers/issues/1619#issuecomment-1345604389 as an example." | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\modeling_utils.py | config attributes directly. See https //github.com/huggingface/diffusers/pull/3129 We need to overwrite | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\modeling_utils.py | https //pytorch.org/docs/stable/_modules/torch/nn/modules/module.html#Module | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\modeling_utils.py | # call PyTorch's https //pytorch.org/docs/stable/_modules/torch/nn/modules/module.html#Module | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\modeling_utils.py | Enable memory efficient attention from [xFormers](https //facebookresearch.github.io/xformers/). 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\models\modeling_utils.py | [`memory_efficient_attention()`](https //facebookresearch.github.io/xformers/components/ops.html#xformers.ops.memory_efficient_attention) | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\modeling_utils.py | Disable memory efficient attention from [xFormers](https //facebookresearch.github.io/xformers/). | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\modeling_utils.py | 'http //hostname' 'foo.bar | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\modeling_utils.py | map](https //hf.co/docs/accelerate/main/en/usage_guides/big_modeling#designing-a-device-map). | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\modeling_utils.py | To use private or [gated models](https //huggingface.co/docs/hub/models-gated#gated-models), log-in with | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\modeling_utils.py | ["offline-mode"](https://huggingface.co/diffusers/installation.html#offline-mode) to use this method in a //huggingface.co/diffusers/installation.html#offline-mode) to use this method in a | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\prior_transformer.py | https //arxiv.org/abs/2204.06125 If it is `None`, no additional embeddings will be prepended. | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\resnet.py | # https //github.com/pytorch/pytorch/issues/86679 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\resnet.py | # upsample_nearest_nhwc fails with large batch sizes. see https //github.com/huggingface/diffusers/issues/984 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\resnet.py | # upsample_nearest_nhwc fails with large batch sizes. see https //github.com/huggingface/diffusers/issues/984 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\resnet.py | https //github.com/modelscope/modelscope/blob/1509fdb973e5871f37148a4b5e5964cafd43e64d/modelscope/models/multi_modal/video_synthesis/unet_sd.py#L1016 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\resnet_flax.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\t5_film_transformer.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\t5_film_transformer.py | # Square Layer Normalization https //arxiv.org/abs/1910.07467 thus variance is calculated | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\t5_film_transformer.py | the Gaussian Error Linear Units paper https //arxiv.org/abs/1606.08415 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\transformer_2d.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\transformer_temporal.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\unet_1d.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\unet_1d_blocks.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\unet_2d.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\unet_2d_blocks.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\unet_2d_blocks_flax.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\unet_2d_blocks_flax.py | https //arxiv.org/abs/2103.06104 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\unet_2d_blocks_flax.py | enable memory efficient attention https //arxiv.org/abs/2112.05682 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\unet_2d_blocks_flax.py | https //arxiv.org/abs/2103.06104 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\unet_2d_blocks_flax.py | enable memory efficient 
attention https //arxiv.org/abs/2112.05682 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\unet_2d_blocks_flax.py | Cross Attention 2D Mid-level block - original architecture from Unet transformers https //arxiv.org/abs/2103.06104 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\unet_2d_blocks_flax.py | enable memory efficient attention https //arxiv.org/abs/2112.05682 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\unet_2d_condition.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\unet_2d_condition.py | "At the moment it is not possible to define the number of attention heads via `num_attention_heads` because of a naming issue as described in https://github.com/huggingface/diffusers/issues/2011#issuecomment-1547958131. Passing `num_attention_heads` will only be supported in diffusers v0.19." //github.com/huggingface/diffusers/issues/2011#issuecomment-1547958131. Passing `num_attention_heads` will only be supported in diffusers v0.19." | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\unet_2d_condition.py | # when this library was created. The incorrect naming was only discovered much later in https //github.com/huggingface/diffusers/issues/2011#issuecomment-1547958131 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\unet_2d_condition_flax.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\unet_2d_condition_flax.py | This model is also a Flax Linen [flax.linen.Module](https //flax.readthedocs.io/en/latest/flax.linen.html#module) | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\unet_2d_condition_flax.py | - [Just-In-Time (JIT) compilation](https //jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit) | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\unet_2d_condition_flax.py | - [Automatic Differentiation](https //jax.readthedocs.io/en/latest/jax.html#automatic-differentiation) | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\unet_2d_condition_flax.py | - [Vectorization](https //jax.readthedocs.io/en/latest/jax.html#vectorization-vmap) | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\unet_2d_condition_flax.py | - [Parallelization](https //jax.readthedocs.io/en/latest/jax.html#parallelization-pmap) | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\unet_2d_condition_flax.py | Enable memory efficient attention as described [here](https //arxiv.org/abs/2112.05682). | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\unet_2d_condition_flax.py | "At the moment it is not possible to define the number of attention heads via `num_attention_heads` because of a naming issue as described in https://github.com/huggingface/diffusers/issues/2011#issuecomment-1547958131. Passing `num_attention_heads` will only be supported in diffusers v0.19." //github.com/huggingface/diffusers/issues/2011#issuecomment-1547958131. Passing `num_attention_heads` will only be supported in diffusers v0.19." | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\unet_2d_condition_flax.py | # when this library was created. 
The incorrect naming was only discovered much later in https //github.com/huggingface/diffusers/issues/2011#issuecomment-1547958131 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\unet_3d_blocks.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\unet_3d_condition.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\unet_3d_condition.py | "At the moment it is not possible to define the number of attention heads via `num_attention_heads` because of a naming issue as described in https://github.com/huggingface/diffusers/issues/2011#issuecomment-1547958131. Passing `num_attention_heads` will only be supported in diffusers v0.19." //github.com/huggingface/diffusers/issues/2011#issuecomment-1547958131. Passing `num_attention_heads` will only be supported in diffusers v0.19." | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\unet_3d_condition.py | # when this library was created. The incorrect naming was only discovered much later in https //github.com/huggingface/diffusers/issues/2011#issuecomment-1547958131 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\unet_3d_condition.py | chunking](https //huggingface.co/blog/reformer#2-chunked-feed-forward-layers). | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\vae.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\vae_flax.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\vae_flax.py | # JAX implementation of VQGAN from taming-transformers https //github.com/CompVis/taming-transformers | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\vae_flax.py | This model is a Flax Linen [flax.linen.Module](https //flax.readthedocs.io/en/latest/flax.linen.html#module) | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\vae_flax.py | - [Just-In-Time (JIT) compilation](https //jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit) | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\vae_flax.py | - [Automatic Differentiation](https //jax.readthedocs.io/en/latest/jax.html#automatic-differentiation) | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\vae_flax.py | - [Vectorization](https //jax.readthedocs.io/en/latest/jax.html#vectorization-vmap) | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\vae_flax.py | - [Parallelization](https //jax.readthedocs.io/en/latest/jax.html#parallelization-pmap) | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\vae_flax.py | This model is a Flax Linen [flax.linen.Module](https //flax.readthedocs.io/en/latest/flax.linen.html#module) | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\vae_flax.py | - [Just-In-Time (JIT) compilation](https //jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit) | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\vae_flax.py | - [Automatic Differentiation](https //jax.readthedocs.io/en/latest/jax.html#automatic-differentiation) | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\vae_flax.py | - [Vectorization](https //jax.readthedocs.io/en/latest/jax.html#vectorization-vmap) | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\vae_flax.py | - [Parallelization](https //jax.readthedocs.io/en/latest/jax.html#parallelization-pmap) | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\vae_flax.py | This model is a Flax Linen [flax.linen.Module](https //flax.readthedocs.io/en/latest/flax.linen.html#module) | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\vae_flax.py | - [Just-In-Time (JIT) compilation](https //jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit) | 问题引导 | -| 开源代码引入 | 
.\src\diffusers\models\vae_flax.py | - [Automatic Differentiation](https //jax.readthedocs.io/en/latest/jax.html#automatic-differentiation) | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\vae_flax.py | - [Vectorization](https //jax.readthedocs.io/en/latest/jax.html#vectorization-vmap) | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\vae_flax.py | - [Parallelization](https //jax.readthedocs.io/en/latest/jax.html#parallelization-pmap) | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\vae_flax.py | Synthesis with Latent Diffusion Models](https //arxiv.org/abs/2112.10752) paper. | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\vq_model.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\vq_model.py | Synthesis with Latent Diffusion Models](https //arxiv.org/abs/2112.10752) paper. | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\__init__.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\optimization.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\optimization.py | https //github.com/google-research/bert/blob/f39e881b169b9d53bea03d2d341b31707a6c052b/optimization.py#L37 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\alt_diffusion\pipeline_alt_diffusion.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\alt_diffusion\pipeline_alt_diffusion.py | Sample Steps are Flawed](https //arxiv.org/pdf/2305.08891.pdf). See Section 3.4 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\alt_diffusion\pipeline_alt_diffusion.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\alt_diffusion\pipeline_alt_diffusion.py | Please refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for more details | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\alt_diffusion\pipeline_alt_diffusion.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\alt_diffusion\pipeline_alt_diffusion.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\alt_diffusion\pipeline_alt_diffusion.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\alt_diffusion\pipeline_alt_diffusion.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\alt_diffusion\pipeline_alt_diffusion.py | [`self.processor`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\alt_diffusion\pipeline_alt_diffusion.py | Flawed](https //arxiv.org/pdf/2305.08891.pdf). Guidance rescale factor should fix overexposure when | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\alt_diffusion\pipeline_alt_diffusion.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\alt_diffusion\pipeline_alt_diffusion.py | # Based on 3.4. 
in https //arxiv.org/pdf/2305.08891.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\alt_diffusion\pipeline_alt_diffusion_img2img.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\alt_diffusion\pipeline_alt_diffusion_img2img.py | >>> url = "https://raw.githubusercontent.com/CompVis/stable-diffusion/main/assets/stable-samples/img2img/sketch-mountains-input.jpg" //raw.githubusercontent.com/CompVis/stable-diffusion/main/assets/stable-samples/img2img/sketch-mountains-input.jpg" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\alt_diffusion\pipeline_alt_diffusion_img2img.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\alt_diffusion\pipeline_alt_diffusion_img2img.py | Please refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for more details | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\alt_diffusion\pipeline_alt_diffusion_img2img.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\alt_diffusion\pipeline_alt_diffusion_img2img.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\alt_diffusion\pipeline_alt_diffusion_img2img.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\alt_diffusion\pipeline_alt_diffusion_img2img.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\alt_diffusion\pipeline_alt_diffusion_img2img.py | [`self.processor`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\alt_diffusion\pipeline_alt_diffusion_img2img.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\audioldm\pipeline_audioldm.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\audioldm\pipeline_audioldm.py | [laion/clap-htsat-unfused](https //huggingface.co/laion/clap-htsat-unfused) variant. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\audioldm\pipeline_audioldm.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\audioldm\pipeline_audioldm.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\audioldm\pipeline_audioldm.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\audioldm\pipeline_audioldm.py | [`self.processor`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\audioldm\pipeline_audioldm.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . 
`guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\audioldm2\modeling_audioldm2.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\audioldm2\modeling_audioldm2.py | "At the moment it is not possible to define the number of attention heads via `num_attention_heads` because of a naming issue as described in https://github.com/huggingface/diffusers/issues/2011#issuecomment-1547958131. Passing `num_attention_heads` will only be supported in diffusers v0.19." //github.com/huggingface/diffusers/issues/2011#issuecomment-1547958131. Passing `num_attention_heads` will only be supported in diffusers v0.19." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\audioldm2\modeling_audioldm2.py | # when this library was created. The incorrect naming was only discovered much later in https //github.com/huggingface/diffusers/issues/2011#issuecomment-1547958131 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\audioldm2\pipeline_audioldm2.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\audioldm2\pipeline_audioldm2.py | [CLAP](https //huggingface.co/docs/transformers/model_doc/clap#transformers.CLAPTextModelWithProjection), | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\audioldm2\pipeline_audioldm2.py | specifically the [laion/clap-htsat-unfused](https //huggingface.co/laion/clap-htsat-unfused) variant. The | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\audioldm2\pipeline_audioldm2.py | [T5](https //huggingface.co/docs/transformers/model_doc/t5#transformers.T5EncoderModel), specifically the | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\audioldm2\pipeline_audioldm2.py | [google/flan-t5-large](https //huggingface.co/google/flan-t5-large) variant. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\audioldm2\pipeline_audioldm2.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\audioldm2\pipeline_audioldm2.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\audioldm2\pipeline_audioldm2.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\audioldm2\pipeline_audioldm2.py | [`self.processor`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\audioldm2\pipeline_audioldm2.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\audio_diffusion\mel.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\audio_diffusion\mel.py | An audio file that must be on disk due to [Librosa](https //librosa.org/) limitation. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\audio_diffusion\pipeline_audio_diffusion.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\audio_diffusion\pipeline_audio_diffusion.py | An audio file that must be on disk due to [Librosa](https //librosa.org/) limitation. 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\audio_diffusion\pipeline_audio_diffusion.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\audio_diffusion\pipeline_audio_diffusion.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) used to denoise. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\audio_diffusion\pipeline_audio_diffusion.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\auto_pipeline.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\auto_pipeline.py | 'http //hostname' 'foo.bar | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\auto_pipeline.py | map](https //hf.co/docs/accelerate/main/en/usage_guides/big_modeling#designing-a-device-map). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\auto_pipeline.py | To use private or [gated](https //huggingface.co/docs/hub/models-gated#gated-models) models, log-in with | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\auto_pipeline.py | 'http //hostname' 'foo.bar | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\auto_pipeline.py | map](https //hf.co/docs/accelerate/main/en/usage_guides/big_modeling#designing-a-device-map). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\auto_pipeline.py | To use private or [gated](https //huggingface.co/docs/hub/models-gated#gated-models) models, log-in with | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\auto_pipeline.py | 'http //hostname' 'foo.bar | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\auto_pipeline.py | map](https //hf.co/docs/accelerate/main/en/usage_guides/big_modeling#designing-a-device-map). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\auto_pipeline.py | To use private or [gated](https //huggingface.co/docs/hub/models-gated#gated-models) models, log-in with | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\consistency_models\pipeline_consistency_models.py | >>> # https //github.com/openai/consistency_models/blob/main/scripts/launch.sh#L77 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\consistency_models\pipeline_consistency_models.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\multicontrolnet.py | map](https //hf.co/docs/accelerate/main/en/usage_guides/big_modeling#designing-a-device-map). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet.py | ... "https://hf.co/datasets/huggingface/documentation-images/resolve/main/diffusers/input_image_vermeer.png" //hf.co/datasets/huggingface/documentation-images/resolve/main/diffusers/input_image_vermeer.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet.py | Please refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for more details | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet.py | [`self.processor`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_img2img.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_img2img.py | ... "https://hf.co/datasets/huggingface/documentation-images/resolve/main/diffusers/input_image_vermeer.png" //hf.co/datasets/huggingface/documentation-images/resolve/main/diffusers/input_image_vermeer.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_img2img.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_img2img.py | Please refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for more details | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_img2img.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_img2img.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_img2img.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_img2img.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_img2img.py | [`self.processor`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_img2img.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint.py | # This model implementation is heavily inspired by https //github.com/haofanwang/ControlNet-for-Diffusers/ | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint.py | ... 
"https://huggingface.co/datasets/diffusers/test-arrays/resolve/main/stable_diffusion_inpaint/boy.png" //huggingface.co/datasets/diffusers/test-arrays/resolve/main/stable_diffusion_inpaint/boy.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint.py | ... "https://huggingface.co/datasets/diffusers/test-arrays/resolve/main/stable_diffusion_inpaint/boy_mask.png" //huggingface.co/datasets/diffusers/test-arrays/resolve/main/stable_diffusion_inpaint/boy_mask.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint.py | ([runwayml/stable-diffusion-inpainting](https //huggingface.co/runwayml/stable-diffusion-inpainting)) as well as | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint.py | ([runwayml/stable-diffusion-v1-5](https //huggingface.co/runwayml/stable-diffusion-v1-5)). Default text-to-image | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint.py | [lllyasviel/control_v11p_sd15_inpaint](https //huggingface.co/lllyasviel/control_v11p_sd15_inpaint). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint.py | Please refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for more details | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint.py | [`self.processor`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint_sd_xl.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint_sd_xl.py | ... "https://huggingface.co/datasets/diffusers/test-arrays/resolve/main/stable_diffusion_inpaint/boy.png" //huggingface.co/datasets/diffusers/test-arrays/resolve/main/stable_diffusion_inpaint/boy.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint_sd_xl.py | ... "https://huggingface.co/datasets/diffusers/test-arrays/resolve/main/stable_diffusion_inpaint/boy_mask.png" //huggingface.co/datasets/diffusers/test-arrays/resolve/main/stable_diffusion_inpaint/boy_mask.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint_sd_xl.py | Sample Steps are Flawed](https //arxiv.org/pdf/2305.08891.pdf). 
See Section 3.4 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint_sd_xl.py | [CLIP](https //huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel), specifically | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint_sd_xl.py | the [clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14) variant. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint_sd_xl.py | [CLIP](https //huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModelWithProjection), | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint_sd_xl.py | [laion/CLIP-ViT-bigG-14-laion2B-39B-b160k](https //huggingface.co/laion/CLIP-ViT-bigG-14-laion2B-39B-b160k) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint_sd_xl.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint_sd_xl.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint_sd_xl.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint_sd_xl.py | Output**](https //huggingface.co/docs/diffusers/api/pipelines/stable_diffusion/stable_diffusion_xl#refining-the-image-output). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint_sd_xl.py | Output**](https //huggingface.co/docs/diffusers/api/pipelines/stable_diffusion/stable_diffusion_xl#refining-the-image-output). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint_sd_xl.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint_sd_xl.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint_sd_xl.py | Corresponds to parameter eta (畏) in the DDIM paper https //arxiv.org/abs/2010.02502. Only applies to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint_sd_xl.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint_sd_xl.py | [PIL](https //pillow.readthedocs.io/en/stable/) `PIL.Image.Image` or `np.array`. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint_sd_xl.py | [diffusers.models.attention_processor](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint_sd_xl.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint_sd_xl.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint_sd_xl.py | section 2.2 of [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint_sd_xl.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint_sd_xl.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). Can be used to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint_sd_xl.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint_sd_xl.py | # Based on 3.4. in https //arxiv.org/pdf/2305.08891.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl.py | ... "https://hf.co/datasets/hf-internal-testing/diffusers-images/resolve/main/sd_controlnet/hf-logo.png" //hf.co/datasets/hf-internal-testing/diffusers-images/resolve/main/sd_controlnet/hf-logo.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl.py | ([laion/CLIP-ViT-bigG-14-laion2B-39B-b160k](https //huggingface.co/laion/CLIP-ViT-bigG-14-laion2B-39B-b160k)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl.py | Whether to use the [invisible_watermark](https //github.com/ShieldMnt/invisible-watermark/) library to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl.py | [stabilityai/stable-diffusion-xl-base-1.0](https //huggingface.co/stabilityai/stable-diffusion-xl-base-1.0) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl.py | [stabilityai/stable-diffusion-xl-base-1.0](https //huggingface.co/stabilityai/stable-diffusion-xl-base-1.0) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl.py | [`self.processor`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl.py | section 2.2 of [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). For more | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl.py | information, refer to this issue thread https //github.com/huggingface/diffusers/issues/4208. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). For more | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl.py | information, refer to this issue thread https //github.com/huggingface/diffusers/issues/4208. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). For more | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl.py | information, refer to this issue thread https //github.com/huggingface/diffusers/issues/4208. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl_img2img.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl_img2img.py | ... "https://huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" //huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl_img2img.py | [CLIP](https //huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel), specifically | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl_img2img.py | the [clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14) variant. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl_img2img.py | [CLIP](https //huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModelWithProjection), | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl_img2img.py | [laion/CLIP-ViT-bigG-14-laion2B-39B-b160k](https //huggingface.co/laion/CLIP-ViT-bigG-14-laion2B-39B-b160k) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl_img2img.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl_img2img.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer). 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl_img2img.py | Whether to use the [invisible_watermark library](https //github.com/ShieldMnt/invisible-watermark/) to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl_img2img.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl_img2img.py | [stabilityai/stable-diffusion-xl-base-1.0](https //huggingface.co/stabilityai/stable-diffusion-xl-base-1.0) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl_img2img.py | [stabilityai/stable-diffusion-xl-base-1.0](https //huggingface.co/stabilityai/stable-diffusion-xl-base-1.0) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl_img2img.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl_img2img.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl_img2img.py | Corresponds to parameter eta (畏) in the DDIM paper https //arxiv.org/abs/2010.02502. Only applies to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl_img2img.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl_img2img.py | [PIL](https //pillow.readthedocs.io/en/stable/) `PIL.Image.Image` or `np.array`. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl_img2img.py | [diffusers.models.attention_processor](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl_img2img.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl_img2img.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl_img2img.py | section 2.2 of [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl_img2img.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). For more | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl_img2img.py | information, refer to this issue thread https //github.com/huggingface/diffusers/issues/4208. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl_img2img.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). For more | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl_img2img.py | information, refer to this issue thread https //github.com/huggingface/diffusers/issues/4208. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl_img2img.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). 
For more | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl_img2img.py | information, refer to this issue thread https //github.com/huggingface/diffusers/issues/4208. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl_img2img.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl_img2img.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). Can be used to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl_img2img.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_flax_controlnet.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_flax_controlnet.py | ... "https://huggingface.co/datasets/YiYiXu/test-doc-assets/resolve/main/blog_post_cell_10_output_0.jpeg" //huggingface.co/datasets/YiYiXu/test-doc-assets/resolve/main/blog_post_cell_10_output_0.jpeg" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_flax_controlnet.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_flax_controlnet.py | Please refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for more details | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_flax_controlnet.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\dance_diffusion\pipeline_dance_diffusion.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\dance_diffusion\pipeline_dance_diffusion.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\ddim\pipeline_ddim.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\ddim\pipeline_ddim.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\ddim\pipeline_ddim.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\ddpm\pipeline_ddpm.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\ddpm\pipeline_ddpm.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). 
Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if.py | Corresponds to parameter eta (畏) in the DDIM paper https //arxiv.org/abs/2010.02502. Only applies to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if.py | [PIL](https //pillow.readthedocs.io/en/stable/) `PIL.Image.Image` or `np.array`. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if.py | [diffusers.models.attention_processor](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_img2img.py | >>> url = "https://raw.githubusercontent.com/CompVis/stable-diffusion/main/assets/stable-samples/img2img/sketch-mountains-input.jpg" //raw.githubusercontent.com/CompVis/stable-diffusion/main/assets/stable-samples/img2img/sketch-mountains-input.jpg" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_img2img.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_img2img.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_img2img.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_img2img.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_img2img.py | Corresponds to parameter eta (畏) in the DDIM paper https //arxiv.org/abs/2010.02502. Only applies to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_img2img.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_img2img.py | [PIL](https //pillow.readthedocs.io/en/stable/) `PIL.Image.Image` or `np.array`. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_img2img.py | [diffusers.models.attention_processor](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_img2img.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_img2img_superresolution.py | >>> url = "https://raw.githubusercontent.com/CompVis/stable-diffusion/main/assets/stable-samples/img2img/sketch-mountains-input.jpg" //raw.githubusercontent.com/CompVis/stable-diffusion/main/assets/stable-samples/img2img/sketch-mountains-input.jpg" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_img2img_superresolution.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_img2img_superresolution.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_img2img_superresolution.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_img2img_superresolution.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_img2img_superresolution.py | Corresponds to parameter eta (畏) in the DDIM paper https //arxiv.org/abs/2010.02502. Only applies to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_img2img_superresolution.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_img2img_superresolution.py | [PIL](https //pillow.readthedocs.io/en/stable/) `PIL.Image.Image` or `np.array`. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_img2img_superresolution.py | [diffusers.models.attention_processor](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_img2img_superresolution.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_inpainting.py | >>> url = "https://huggingface.co/datasets/diffusers/docs-images/resolve/main/if/person.png" //huggingface.co/datasets/diffusers/docs-images/resolve/main/if/person.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_inpainting.py | >>> url = "https://huggingface.co/datasets/diffusers/docs-images/resolve/main/if/glasses_mask.png" //huggingface.co/datasets/diffusers/docs-images/resolve/main/if/glasses_mask.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_inpainting.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_inpainting.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_inpainting.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_inpainting.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_inpainting.py | Corresponds to parameter eta (畏) in the DDIM paper https //arxiv.org/abs/2010.02502. Only applies to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_inpainting.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_inpainting.py | [PIL](https //pillow.readthedocs.io/en/stable/) `PIL.Image.Image` or `np.array`. 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_inpainting.py | [diffusers.models.attention_processor](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_inpainting.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_inpainting_superresolution.py | >>> url = "https://huggingface.co/datasets/diffusers/docs-images/resolve/main/if/person.png" //huggingface.co/datasets/diffusers/docs-images/resolve/main/if/person.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_inpainting_superresolution.py | >>> url = "https://huggingface.co/datasets/diffusers/docs-images/resolve/main/if/glasses_mask.png" //huggingface.co/datasets/diffusers/docs-images/resolve/main/if/glasses_mask.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_inpainting_superresolution.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_inpainting_superresolution.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_inpainting_superresolution.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_inpainting_superresolution.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_inpainting_superresolution.py | Corresponds to parameter eta (畏) in the DDIM paper https //arxiv.org/abs/2010.02502. Only applies to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_inpainting_superresolution.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_inpainting_superresolution.py | [PIL](https //pillow.readthedocs.io/en/stable/) `PIL.Image.Image` or `np.array`. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_inpainting_superresolution.py | [diffusers.models.attention_processor](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_inpainting_superresolution.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_superresolution.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_superresolution.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_superresolution.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_superresolution.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). 
Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_superresolution.py | Corresponds to parameter eta (畏) in the DDIM paper https //arxiv.org/abs/2010.02502. Only applies to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_superresolution.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_superresolution.py | [PIL](https //pillow.readthedocs.io/en/stable/) `PIL.Image.Image` or `np.array`. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_superresolution.py | [diffusers.models.attention_processor](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_superresolution.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\watermark.py | # copied from https //github.com/deep-floyd/IF/blob/b77482e36ca2031cb94dbca1001fc1e6400bf4ab/deepfloyd_if/modules/base.py#L287 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\dit\pipeline_dit.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\dit\pipeline_dit.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_combined.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_combined.py | url = "https://raw.githubusercontent.com/CompVis/stable-diffusion/main/assets/stable-samples/img2img/sketch-mountains-input.jpg" //raw.githubusercontent.com/CompVis/stable-diffusion/main/assets/stable-samples/img2img/sketch-mountains-input.jpg" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_combined.py | "https://huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" "/kandinsky/cat.png" //huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" "/kandinsky/cat.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_combined.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_combined.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_combined.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). 
Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_combined.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_combined.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_combined.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_combined.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_combined.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_combined.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_combined.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_combined.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_combined.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_combined.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_combined.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_combined.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_combined.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_combined.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_combined.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_img2img.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_img2img.py | ... "https://huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" //huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_img2img.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_img2img.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_img2img.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_inpaint.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_inpaint.py | ... "https://huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" //huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_inpaint.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_inpaint.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_inpaint.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_inpaint.py | "THIS means that you HAVE to invert the input mask to have the same behavior as before as explained in https://github.com/huggingface/diffusers/pull/4207. " //github.com/huggingface/diffusers/pull/4207. " | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_prior.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_prior.py | ... "https://huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" //huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_prior.py | ... "https://huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" //huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_prior.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_prior.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_prior.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_prior.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_prior.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_prior.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_prior.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). 
Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_combined.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_combined.py | url = "https://raw.githubusercontent.com/CompVis/stable-diffusion/main/assets/stable-samples/img2img/sketch-mountains-input.jpg" //raw.githubusercontent.com/CompVis/stable-diffusion/main/assets/stable-samples/img2img/sketch-mountains-input.jpg" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_combined.py | "https://huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" "/kandinsky/cat.png" //huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" "/kandinsky/cat.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_combined.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_combined.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_combined.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_combined.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_combined.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_combined.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_combined.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_combined.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_combined.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_combined.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_combined.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_combined.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_combined.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_combined.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_combined.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_combined.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_combined.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_combined.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_controlnet.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_controlnet.py | ... "https://huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" //huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_controlnet.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_controlnet.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_controlnet.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_controlnet_img2img.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_controlnet_img2img.py | ... "https://huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" //huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_controlnet_img2img.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_controlnet_img2img.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). 
Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_controlnet_img2img.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_img2img.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_img2img.py | ... "https://huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" //huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_img2img.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_img2img.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_img2img.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_inpainting.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_inpainting.py | ... "https://huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" //huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_inpainting.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_inpainting.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_inpainting.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_inpainting.py | "THIS means that you HAVE to invert the input mask to have the same behavior as before as explained in https://github.com/huggingface/diffusers/pull/4207. " //github.com/huggingface/diffusers/pull/4207. " | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_prior.py | ... "https://huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" //huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_prior.py | ... "https://huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" //huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_prior.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer). 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_prior.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_prior.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_prior.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_prior.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_prior.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_prior.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_prior_emb2emb.py | ... "https://huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" //huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_prior_emb2emb.py | ... "https://huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" //huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_prior_emb2emb.py | ... "https://huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" //huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_prior_emb2emb.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_prior_emb2emb.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_prior_emb2emb.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_prior_emb2emb.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_prior_emb2emb.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_prior_emb2emb.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_prior_emb2emb.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). 
Guidance scale is enabled by setting `guidance_scale > | 问题引导 |
-| 开源代码引入 | .\src\diffusers\pipelines\latent_diffusion\pipeline_latent_diffusion.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 |
-| 开源代码引入 | .\src\diffusers\pipelines\latent_diffusion\pipeline_latent_diffusion.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 |
-| 开源代码引入 | .\src\diffusers\pipelines\latent_diffusion\pipeline_latent_diffusion.py | # See all LDMBert models at https //huggingface.co/models?filter=ldmbert | 问题引导 |
-| 开源代码引入 | .\src\diffusers\pipelines\latent_diffusion\pipeline_latent_diffusion.py | "ldm-bert": "https://huggingface.co/valhalla/ldm-bert/blob/main/config.json", "https://huggingface.co/valhalla/ldm-bert/blob/main/config.json", //huggingface.co/valhalla/ldm-bert/blob/main/config.json", | 问题引导 |
-| 开源代码引入 | .\src\diffusers\pipelines\latent_diffusion\pipeline_latent_diffusion_superresolution.py | Corresponds to parameter eta (η) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 |
-| 开源代码引入 | .\src\diffusers\pipelines\latent_diffusion\pipeline_latent_diffusion_superresolution.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 |
-| 开源代码引入 | .\src\diffusers\pipelines\latent_diffusion\pipeline_latent_diffusion_superresolution.py | ... "https://user-images.githubusercontent.com/38061659/199705896-b48e17b8-b231-47cd-a270-4ffa5a93fa3e.png" //user-images.githubusercontent.com/38061659/199705896-b48e17b8-b231-47cd-a270-4ffa5a93fa3e.png" | 问题引导 |
-| 开源代码引入 | .\src\diffusers\pipelines\latent_diffusion\pipeline_latent_diffusion_superresolution.py | # eta corresponds to η in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 |
-| 开源代码引入 | .\src\diffusers\pipelines\latent_diffusion_uncond\pipeline_latent_diffusion_uncond.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 |
-| 开源代码引入 | .\src\diffusers\pipelines\latent_diffusion_uncond\pipeline_latent_diffusion_uncond.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 |
-| 开源代码引入 | .\src\diffusers\pipelines\musicldm\pipeline_musicldm.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 |
-| 开源代码引入 | .\src\diffusers\pipelines\musicldm\pipeline_musicldm.py | [laion/clap-htsat-unfused](https //huggingface.co/laion/clap-htsat-unfused) variant. | 问题引导 |
-| 开源代码引入 | .\src\diffusers\pipelines\musicldm\pipeline_musicldm.py | # eta corresponds to η in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 |
-| 开源代码引入 | .\src\diffusers\pipelines\musicldm\pipeline_musicldm.py | Corresponds to parameter eta (η) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 |
-| 开源代码引入 | .\src\diffusers\pipelines\musicldm\pipeline_musicldm.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 |
-| 开源代码引入 | .\src\diffusers\pipelines\musicldm\pipeline_musicldm.py | [`self.processor`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 |
-| 开源代码引入 | .\src\diffusers\pipelines\musicldm\pipeline_musicldm.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 |
-| 开源代码引入 | .\src\diffusers\pipelines\onnx_utils.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 |
-| 开源代码引入 | .\src\diffusers\pipelines\paint_by_example\image_encoder.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 |
-| 开源代码引入 | .\src\diffusers\pipelines\paint_by_example\pipeline_paint_by_example.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 |
-| 开源代码引入 | .\src\diffusers\pipelines\paint_by_example\pipeline_paint_by_example.py | Please refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for more details | 问题引导 |
-| 开源代码引入 | .\src\diffusers\pipelines\paint_by_example\pipeline_paint_by_example.py | # eta corresponds to η in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 |
-| 开源代码引入 | .\src\diffusers\pipelines\paint_by_example\pipeline_paint_by_example.py | Corresponds to parameter eta (η) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 |
-| 开源代码引入 | .\src\diffusers\pipelines\paint_by_example\pipeline_paint_by_example.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 |
-| 开源代码引入 | .\src\diffusers\pipelines\paint_by_example\pipeline_paint_by_example.py | ... "https://raw.githubusercontent.com/Fantasy-Studio/Paint-by-Example/main/examples/image/example_1.png" //raw.githubusercontent.com/Fantasy-Studio/Paint-by-Example/main/examples/image/example_1.png" | 问题引导 |
-| 开源代码引入 | .\src\diffusers\pipelines\paint_by_example\pipeline_paint_by_example.py | ... "https://raw.githubusercontent.com/Fantasy-Studio/Paint-by-Example/main/examples/mask/example_1.png" //raw.githubusercontent.com/Fantasy-Studio/Paint-by-Example/main/examples/mask/example_1.png" | 问题引导 |
-| 开源代码引入 | .\src\diffusers\pipelines\paint_by_example\pipeline_paint_by_example.py | >>> example_url = "https://raw.githubusercontent.com/Fantasy-Studio/Paint-by-Example/main/examples/reference/example_1.jpg" //raw.githubusercontent.com/Fantasy-Studio/Paint-by-Example/main/examples/reference/example_1.jpg" | 问题引导 |
-| 开源代码引入 | .\src\diffusers\pipelines\paint_by_example\pipeline_paint_by_example.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 |
-| 开源代码引入 | .\src\diffusers\pipelines\pipeline_flax_utils.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 |
-| 开源代码引入 | .\src\diffusers\pipelines\pipeline_flax_utils.py | 'http //hostname' 'foo.bar | 问题引导 |
-| 开源代码引入 | .\src\diffusers\pipelines\pipeline_flax_utils.py | To use private or [gated models](https //huggingface.co/docs/hub/models-gated#gated-models), log-in with | 问题引导 |
-| 开源代码引入 | .\src\diffusers\pipelines\pipeline_flax_utils.py | [“offline-mode”](https //huggingface.co/diffusers/installation.html#offline-mode) to use this method in a | 问题引导 |
-| 开源代码引入 | .\src\diffusers\pipelines\pipeline_flax_utils.py | >>> # see more in [the documentation](https //huggingface.co/docs/hub/security-tokens) | 问题引导 |
-| 开源代码引入 | .\src\diffusers\pipelines\pipeline_utils.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 |
-| 开源代码引入 | .\src\diffusers\pipelines\pipeline_utils.py | f"You are loading the variant {revision} from {pretrained_model_name_or_path} via `revision='{revision}'`. This behavior is deprecated and will be removed in diffusers v1. One should use `variant='{revision}'` instead. However, it appears that {pretrained_model_name_or_path} currently does not have the required variant filenames in the 'main' branch.
\n The Diffusers team and community would be very grateful if you could open an issue: https://github.com/huggingface/diffusers/issues/new with the title '{pretrained_model_name_or_path} is missing {revision} files' so that the correct variant file can be added.", https //github.com/huggingface/diffusers/issues/new with the title '{pretrained_model_name_or_path} is missing {revision} files' so that the correct variant file can be added.", | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\pipeline_utils.py | [Community](https //github.com/huggingface/diffusers/tree/main/examples/community). Valid file | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\pipeline_utils.py | Pipelines](https //huggingface.co/docs/diffusers/using-diffusers/custom_pipeline_overview) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\pipeline_utils.py | 'http //hostname' 'foo.bar | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\pipeline_utils.py | map](https //hf.co/docs/accelerate/main/en/usage_guides/big_modeling#designing-a-device-map). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\pipeline_utils.py | To use private or [gated](https //huggingface.co/docs/hub/models-gated#gated-models) models, log-in with | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\pipeline_utils.py | >>> # of the documentation](https //huggingface.co/docs/hub/security-tokens) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\pipeline_utils.py | " checkpoint: https://huggingface.co/runwayml/stable-diffusion-inpainting instead or adapting your" https //huggingface.co/runwayml/stable-diffusion-inpainting instead or adapting your" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\pipeline_utils.py | " https://huggingface.co/runwayml/stable-diffusion-inpainting. Note that we do not actively maintain" //huggingface.co/runwayml/stable-diffusion-inpainting. Note that we do not actively maintain" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\pipeline_utils.py | [Community](https //github.com/huggingface/diffusers/tree/main/examples/community). Valid file | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\pipeline_utils.py | community pipeline](https //huggingface.co/docs/diffusers/main/en/using-diffusers/contribute_pipeline). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\pipeline_utils.py | 'http //hostname' 'foo.bar | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\pipeline_utils.py | To use private or [gated models](https //huggingface.co/docs/hub/models-gated#gated-models), log-in with | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\pipeline_utils.py | Enable memory efficient attention from [xFormers](https //facebookresearch.github.io/xformers/). When this | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\pipeline_utils.py | [`memory_efficient_attention()`](https //facebookresearch.github.io/xformers/components/ops.html#xformers.ops.memory_efficient_attention) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\pipeline_utils.py | Disable memory efficient attention from [xFormers](https //facebookresearch.github.io/xformers/). 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\pndm\pipeline_pndm.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\pndm\pipeline_pndm.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\pndm\pipeline_pndm.py | # the official paper https //arxiv.org/pdf/2202.09778.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\repaint\pipeline_repaint.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\repaint\pipeline_repaint.py | RePaint paper). Take a look at Figure 9 and 10 in the [paper](https //arxiv.org/pdf/2201.09865.pdf). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\repaint\pipeline_repaint.py | and 10 in the [paper](https //arxiv.org/pdf/2201.09865.pdf). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\repaint\pipeline_repaint.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\repaint\pipeline_repaint.py | >>> img_url = "https://huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main/repaint/celeba_hq_256.png" //huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main/repaint/celeba_hq_256.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\repaint\pipeline_repaint.py | >>> mask_url = "https://huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main/repaint/mask_256.png" //huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main/repaint/mask_256.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\score_sde_ve\pipeline_score_sde_ve.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\score_sde_ve\pipeline_score_sde_ve.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\semantic_stable_diffusion\pipeline_semantic_stable_diffusion.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\semantic_stable_diffusion\pipeline_semantic_stable_diffusion.py | Please refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for more details | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\semantic_stable_diffusion\pipeline_semantic_stable_diffusion.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\semantic_stable_diffusion\pipeline_semantic_stable_diffusion.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\semantic_stable_diffusion\pipeline_semantic_stable_diffusion.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\semantic_stable_diffusion\pipeline_semantic_stable_diffusion.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\semantic_stable_diffusion\pipeline_semantic_stable_diffusion.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . 
`guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\shap_e\camera.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\shap_e\pipeline_shap_e.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\shap_e\pipeline_shap_e.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\shap_e\pipeline_shap_e_img2img.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\shap_e\pipeline_shap_e_img2img.py | >>> image_url = "https://hf.co/datasets/diffusers/docs-images/resolve/main/shap-e/corgi.png" //hf.co/datasets/diffusers/docs-images/resolve/main/shap-e/corgi.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\shap_e\pipeline_shap_e_img2img.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\shap_e\renderer.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\shap_e\renderer.py | Reference https //arxiv.org/pdf/2210.04628.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\spectrogram_diffusion\continous_encoder.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\spectrogram_diffusion\midi_utils.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\spectrogram_diffusion\notes_encoder.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\spectrogram_diffusion\pipeline_spectrogram_diffusion.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\spectrogram_diffusion\pipeline_spectrogram_diffusion.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\spectrogram_diffusion\pipeline_spectrogram_diffusion.py | >>> # Download MIDI from wget http //www.piano-midi.de/midis/beethoven/beethoven_hammerklavier_2.mid | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\clip_image_project_model.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\convert_from_ckpt.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\convert_from_ckpt.py | # model components i.e. 
https //huggingface.co/thibaud/controlnet-sd21/ | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\convert_from_ckpt.py | An instance of [CLIP](https //huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\convert_from_ckpt.py | to use, specifically the [clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\convert_from_ckpt.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\convert_from_ckpt.py | # "state_dict" key https://huggingface.co/thibaud/controlnet-canny-sd21 //huggingface.co/thibaud/controlnet-canny-sd21 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\convert_from_ckpt.py | config_url = "https://raw.githubusercontent.com/CompVis/stable-diffusion/main/configs/stable-diffusion/v1-inference.yaml" //raw.githubusercontent.com/CompVis/stable-diffusion/main/configs/stable-diffusion/v1-inference.yaml" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\convert_from_ckpt.py | config_url = "https://raw.githubusercontent.com/Stability-AI/stablediffusion/main/configs/stable-diffusion/v2-inference-v.yaml" //raw.githubusercontent.com/Stability-AI/stablediffusion/main/configs/stable-diffusion/v2-inference-v.yaml" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\convert_from_ckpt.py | config_url = "https://raw.githubusercontent.com/Stability-AI/generative-models/main/configs/inference/sd_xl_base.yaml" //raw.githubusercontent.com/Stability-AI/generative-models/main/configs/inference/sd_xl_base.yaml" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\convert_from_ckpt.py | config_url = "https://raw.githubusercontent.com/Stability-AI/generative-models/main/configs/inference/sd_xl_refiner.yaml" //raw.githubusercontent.com/Stability-AI/generative-models/main/configs/inference/sd_xl_refiner.yaml" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\convert_from_ckpt.py | # "state_dict" key https://huggingface.co/thibaud/controlnet-canny-sd21 //huggingface.co/thibaud/controlnet-canny-sd21 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_cycle_diffusion.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_cycle_diffusion.py | # "predicted x_0" of formula (12) from https://arxiv.org/pdf/2010.02502.pdf //arxiv.org/pdf/2010.02502.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_cycle_diffusion.py | # 6. compute "direction pointing to x_t" of formula (12) from https://arxiv.org/pdf/2010.02502.pdf //arxiv.org/pdf/2010.02502.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_cycle_diffusion.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_cycle_diffusion.py | Please refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for more details | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_cycle_diffusion.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_cycle_diffusion.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_cycle_diffusion.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_cycle_diffusion.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_cycle_diffusion.py | [`self.processor`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_cycle_diffusion.py | url = "https://raw.githubusercontent.com/ChenWu98/cycle-diffusion/main/data/dalle2/An%20astronaut%20riding%20a%20horse.png" //raw.githubusercontent.com/ChenWu98/cycle-diffusion/main/data/dalle2/An%20astronaut%20riding%20a%20horse.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_cycle_diffusion.py | # See more samples at the original repo https //github.com/ChenWu98/cycle-diffusion | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_cycle_diffusion.py | "https://raw.githubusercontent.com/ChenWu98/cycle-diffusion/main/data/dalle2/A%20black%20colored%20car.png" //raw.githubusercontent.com/ChenWu98/cycle-diffusion/main/data/dalle2/A%20black%20colored%20car.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_cycle_diffusion.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_flax_stable_diffusion.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_flax_stable_diffusion.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_flax_stable_diffusion.py | Please refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for more details | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_flax_stable_diffusion.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_flax_stable_diffusion_controlnet.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_flax_stable_diffusion_img2img.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_flax_stable_diffusion_img2img.py | >>> url = "https://raw.githubusercontent.com/CompVis/stable-diffusion/main/assets/stable-samples/img2img/sketch-mountains-input.jpg" //raw.githubusercontent.com/CompVis/stable-diffusion/main/assets/stable-samples/img2img/sketch-mountains-input.jpg" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_flax_stable_diffusion_img2img.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_flax_stable_diffusion_img2img.py | Please refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for more details | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_flax_stable_diffusion_img2img.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_flax_stable_diffusion_inpaint.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_flax_stable_diffusion_inpaint.py | >>> img_url = "https://raw.githubusercontent.com/CompVis/latent-diffusion/main/data/inpainting_examples/overture-creations-5sI6fQgYIuo.png" //raw.githubusercontent.com/CompVis/latent-diffusion/main/data/inpainting_examples/overture-creations-5sI6fQgYIuo.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_flax_stable_diffusion_inpaint.py | >>> mask_url = "https://raw.githubusercontent.com/CompVis/latent-diffusion/main/data/inpainting_examples/overture-creations-5sI6fQgYIuo_mask.png" //raw.githubusercontent.com/CompVis/latent-diffusion/main/data/inpainting_examples/overture-creations-5sI6fQgYIuo_mask.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_flax_stable_diffusion_inpaint.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_flax_stable_diffusion_inpaint.py | Please refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for more details | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_flax_stable_diffusion_inpaint.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion.py | Corresponds to parameter eta (畏) in the DDIM paper https //arxiv.org/abs/2010.02502. Only applies to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion.py | [PIL](https //pillow.readthedocs.io/en/stable/) `PIL.Image.Image` or `np.array`. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . 
`guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_img2img.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_img2img.py | [CLIP](https //huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel), specifically | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_img2img.py | the [clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14) variant. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_img2img.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_img2img.py | Please, refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for details. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_img2img.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_img2img.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_img2img.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_img2img.py | Corresponds to parameter eta (畏) in the DDIM paper https //arxiv.org/abs/2010.02502. Only applies to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_img2img.py | [PIL](https //pillow.readthedocs.io/en/stable/) `PIL.Image.Image` or `np.array`. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_img2img.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_img2img.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_inpaint.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_inpaint.py | [CLIP](https //huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel), specifically | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_inpaint.py | the [clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14) variant. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_inpaint.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer). 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_inpaint.py | Please, refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for details. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_inpaint.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_inpaint.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_inpaint.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_inpaint.py | Corresponds to parameter eta (畏) in the DDIM paper https //arxiv.org/abs/2010.02502. Only applies to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_inpaint.py | [PIL](https //pillow.readthedocs.io/en/stable/) `PIL.Image.Image` or `np.array`. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_inpaint.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_inpaint.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_inpaint_legacy.py | [CLIP](https //huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel), specifically | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_inpaint_legacy.py | the [clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14) variant. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_inpaint_legacy.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_inpaint_legacy.py | Please, refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for details. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_inpaint_legacy.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_inpaint_legacy.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_inpaint_legacy.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_inpaint_legacy.py | Corresponds to parameter eta (?) in the DDIM paper https //arxiv.org/abs/2010.02502. 
Only applies to | 问题引导 |
-| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_inpaint_legacy.py | [PIL](https //pillow.readthedocs.io/en/stable/) `PIL.Image.Image` or `np.array`. | 问题引导 |
-| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_inpaint_legacy.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 |
-| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_inpaint_legacy.py | # eta corresponds to η in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 |
-| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_upscale.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 |
-| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_upscale.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 |
-| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_upscale.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 |
-| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_upscale.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 |
-| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_upscale.py | Corresponds to parameter eta (η) in the DDIM paper https //arxiv.org/abs/2010.02502. Only applies to | 问题引导 |
-| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_upscale.py | [PIL](https //pillow.readthedocs.io/en/stable/) `PIL.Image.Image` or `np.array`. | 问题引导 |
-| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_upscale.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 |
-| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 |
-| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion.py | Sample Steps are Flawed](https //arxiv.org/pdf/2305.08891.pdf). See Section 3.4 | 问题引导 |
-| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 |
-| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion.py | Please refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for more details | 问题引导 |
-| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 |
-| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion.py | # eta corresponds to η in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 |
-| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion.py | Corresponds to parameter eta (η) from the [DDIM](https //arxiv.org/abs/2010.02502) paper.
Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion.py | [`self.processor`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion.py | Flawed](https //arxiv.org/pdf/2305.08891.pdf). Guidance rescale factor should fix overexposure when | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion.py | # Based on 3.4. in https //arxiv.org/pdf/2305.08891.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_attend_and_excite.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_attend_and_excite.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_attend_and_excite.py | Please refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for more details | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_attend_and_excite.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_attend_and_excite.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_attend_and_excite.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_attend_and_excite.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_attend_and_excite.py | [`self.processor`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_attend_and_excite.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_controlnet.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_depth2img.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_depth2img.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_depth2img.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_depth2img.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_depth2img.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_depth2img.py | [`self.processor`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_depth2img.py | >>> url = "http://images.cocodataset.org/val2017/000000039769.jpg" //images.cocodataset.org/val2017/000000039769.jpg" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_depth2img.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_diffedit.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_diffedit.py | >>> img_url = "https://github.com/Xiang-cd/DiffEdit-stable-diffusion/raw/main/assets/origin.png" //github.com/Xiang-cd/DiffEdit-stable-diffusion/raw/main/assets/origin.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_diffedit.py | >>> img_url = "https://github.com/Xiang-cd/DiffEdit-stable-diffusion/raw/main/assets/origin.png" //github.com/Xiang-cd/DiffEdit-stable-diffusion/raw/main/assets/origin.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_diffedit.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_diffedit.py | Please refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for more details | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_diffedit.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_diffedit.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_diffedit.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_diffedit.py | [`self.processor`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_diffedit.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . 
`guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_diffedit.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_diffedit.py | [`self.processor`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_diffedit.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_diffedit.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_diffedit.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_diffedit.py | [`self.processor`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_diffedit.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_gligen.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_gligen.py | ... "https://hf.co/datasets/huggingface/documentation-images/resolve/main/diffusers/gligen/livingroom_modern.png" //hf.co/datasets/huggingface/documentation-images/resolve/main/diffusers/gligen/livingroom_modern.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_gligen.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_gligen.py | Please refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for more details | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_gligen.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_gligen.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_gligen.py | Generation](https //arxiv.org/pdf/2301.07093.pdf). Scheduled Sampling factor is only varied for | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_gligen.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_gligen.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_gligen.py | [`self.processor`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_gligen.py | Flawed](https //arxiv.org/pdf/2305.08891.pdf). Guidance rescale factor should fix overexposure when | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_gligen.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_gligen_text_image.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_gligen_text_image.py | ... "https://hf.co/datasets/huggingface/documentation-images/resolve/main/diffusers/gligen/livingroom_modern.png" //hf.co/datasets/huggingface/documentation-images/resolve/main/diffusers/gligen/livingroom_modern.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_gligen_text_image.py | ... "https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/diffusers/gligen/backpack.jpeg" //huggingface.co/datasets/huggingface/documentation-images/resolve/main/diffusers/gligen/backpack.jpeg" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_gligen_text_image.py | ... "https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/diffusers/gligen/pexels-pixabay-60597.jpg" //huggingface.co/datasets/huggingface/documentation-images/resolve/main/diffusers/gligen/pexels-pixabay-60597.jpg" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_gligen_text_image.py | ... "https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/diffusers/landscape.png" //huggingface.co/datasets/huggingface/documentation-images/resolve/main/diffusers/landscape.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_gligen_text_image.py | ... "https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/diffusers/landscape.png" //huggingface.co/datasets/huggingface/documentation-images/resolve/main/diffusers/landscape.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_gligen_text_image.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_gligen_text_image.py | Frozen image-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_gligen_text_image.py | Please refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for more details | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_gligen_text_image.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_gligen_text_image.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_gligen_text_image.py | Generation](https //arxiv.org/pdf/2301.07093.pdf). 
Scheduled Sampling factor is only varied for | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_gligen_text_image.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_gligen_text_image.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_gligen_text_image.py | [`self.processor`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_gligen_text_image.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_image_variation.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_image_variation.py | Frozen CLIP image-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_image_variation.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_image_variation.py | Please refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for more details | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_image_variation.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_image_variation.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_image_variation.py | [`CLIPImageProcessor`](https //huggingface.co/lambdalabs/sd-image-variations-diffusers/blob/main/feature_extractor/preprocessor_config.json). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_image_variation.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_image_variation.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_image_variation.py | url = "https://lh3.googleusercontent.com/y-iFOHfLTwkuQSUegpwDdgKmOjRSTvPxat63dQLB25xkTs4lhIbRUFeNBWZzYf370g=s1200" //lh3.googleusercontent.com/y-iFOHfLTwkuQSUegpwDdgKmOjRSTvPxat63dQLB25xkTs4lhIbRUFeNBWZzYf370g=s1200" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_image_variation.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . 
`guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_img2img.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_img2img.py | >>> url = "https://raw.githubusercontent.com/CompVis/stable-diffusion/main/assets/stable-samples/img2img/sketch-mountains-input.jpg" //raw.githubusercontent.com/CompVis/stable-diffusion/main/assets/stable-samples/img2img/sketch-mountains-input.jpg" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_img2img.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_img2img.py | Please refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for more details | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_img2img.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_img2img.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_img2img.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_img2img.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_img2img.py | [`self.processor`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_img2img.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_inpaint.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_inpaint.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_inpaint.py | Please refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for more details | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_inpaint.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_inpaint.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_inpaint.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. 
Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_inpaint.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_inpaint.py | [`self.processor`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_inpaint.py | >>> img_url = "https://raw.githubusercontent.com/CompVis/latent-diffusion/main/data/inpainting_examples/overture-creations-5sI6fQgYIuo.png" //raw.githubusercontent.com/CompVis/latent-diffusion/main/data/inpainting_examples/overture-creations-5sI6fQgYIuo.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_inpaint.py | >>> mask_url = "https://raw.githubusercontent.com/CompVis/latent-diffusion/main/data/inpainting_examples/overture-creations-5sI6fQgYIuo_mask.png" //raw.githubusercontent.com/CompVis/latent-diffusion/main/data/inpainting_examples/overture-creations-5sI6fQgYIuo_mask.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_inpaint.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_inpaint_legacy.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_inpaint_legacy.py | [CLIP](https //huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel), specifically | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_inpaint_legacy.py | the [clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14) variant. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_inpaint_legacy.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_inpaint_legacy.py | Please, refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for details. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_inpaint_legacy.py | "by loading your model into `StableDiffusionInpaintPipeline` instead. See https://github.com/huggingface/diffusers/pull/3533" //github.com/huggingface/diffusers/pull/3533" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_inpaint_legacy.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_inpaint_legacy.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_inpaint_legacy.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_inpaint_legacy.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). 
Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_inpaint_legacy.py | Corresponds to parameter eta (畏) in the DDIM paper https //arxiv.org/abs/2010.02502. Only applies to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_inpaint_legacy.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_inpaint_legacy.py | [PIL](https //pillow.readthedocs.io/en/stable/) `PIL.Image.Image` or `np.array`. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_inpaint_legacy.py | [diffusers.models.attention_processor](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_inpaint_legacy.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_instruct_pix2pix.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_instruct_pix2pix.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_instruct_pix2pix.py | Please refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for more details | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_instruct_pix2pix.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_instruct_pix2pix.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_instruct_pix2pix.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_instruct_pix2pix.py | >>> img_url = "https://huggingface.co/datasets/diffusers/diffusers-images-docs/resolve/main/mountain.png" //huggingface.co/datasets/diffusers/diffusers-images-docs/resolve/main/mountain.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_instruct_pix2pix.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . 
`guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_instruct_pix2pix.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_k_diffusion.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_k_diffusion.py | [CLIP](https //huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel), specifically | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_k_diffusion.py | the [clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14) variant. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_k_diffusion.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_k_diffusion.py | Please, refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for details. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_k_diffusion.py | " as defined in https://huggingface.co/docs/diffusers/api/schedulers#implemented-schedulers for" //huggingface.co/docs/diffusers/api/schedulers#implemented-schedulers for" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_k_diffusion.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_k_diffusion.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_k_diffusion.py | Corresponds to parameter eta (畏) in the DDIM paper https //arxiv.org/abs/2010.02502. Only applies to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_k_diffusion.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_k_diffusion.py | [PIL](https //pillow.readthedocs.io/en/stable/) `PIL.Image.Image` or `np.array`. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_k_diffusion.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_latent_upscale.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_latent_upscale.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_latent_upscale.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. 
Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_latent_upscale.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_latent_upscale.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_ldm3d.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_ldm3d.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_ldm3d.py | Please refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for more details | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_ldm3d.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_ldm3d.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_ldm3d.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_ldm3d.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_ldm3d.py | [`self.processor`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_ldm3d.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_model_editing.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_model_editing.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_model_editing.py | Please refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for more details | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_model_editing.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_model_editing.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_model_editing.py | Apply model editing via closed-form solution (see Eq. 5 in the TIME [paper](https //arxiv.org/abs/2303.08084)). 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_model_editing.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_model_editing.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_model_editing.py | [`self.processor`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_model_editing.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_panorama.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_panorama.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_panorama.py | Please refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for more details | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_panorama.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_panorama.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_panorama.py | # Here, we define the mappings F_i (see Eq. 7 in the MultiDiffusion paper https //arxiv.org/abs/2302.08113) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_panorama.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_panorama.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_panorama.py | [diffusers.models.attention_processor](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_panorama.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_panorama.py | # MultiDiffusion paper for more details https //arxiv.org/abs/2302.08113 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_panorama.py | # take the MultiDiffusion step. Eq. 
5 in MultiDiffusion paper https //arxiv.org/abs/2302.08113 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_paradigms.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_paradigms.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_paradigms.py | Please refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for more details | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_paradigms.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_paradigms.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_paradigms.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_paradigms.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_paradigms.py | [`self.processor`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_paradigms.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_pix2pix_zero.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_pix2pix_zero.py | >>> source_emb_url = "https://hf.co/datasets/sayakpaul/sample-datasets/resolve/main/cat.pt" //hf.co/datasets/sayakpaul/sample-datasets/resolve/main/cat.pt" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_pix2pix_zero.py | >>> target_emb_url = "https://hf.co/datasets/sayakpaul/sample-datasets/resolve/main/dog.pt" //hf.co/datasets/sayakpaul/sample-datasets/resolve/main/dog.pt" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_pix2pix_zero.py | >>> img_url = "https://github.com/pix2pixzero/pix2pix-zero/raw/main/assets/test_images/cats/cat_6.png" //github.com/pix2pixzero/pix2pix-zero/raw/main/assets/test_images/cats/cat_6.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_pix2pix_zero.py | [CLIP](https //huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel), specifically | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_pix2pix_zero.py | the [clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14) variant. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_pix2pix_zero.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer). 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_pix2pix_zero.py | Please, refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for details. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_pix2pix_zero.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_pix2pix_zero.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_pix2pix_zero.py | paper](https //arxiv.org/abs/2302.03027). Used in discovering the edit direction. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_pix2pix_zero.py | paper](https //arxiv.org/abs/2302.03027). Used in discovering the edit direction. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_pix2pix_zero.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_pix2pix_zero.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_pix2pix_zero.py | Corresponds to parameter eta (畏) in the DDIM paper https //arxiv.org/abs/2010.02502. Only applies to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_pix2pix_zero.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_pix2pix_zero.py | [PIL](https //pillow.readthedocs.io/en/stable/) `PIL.Image.Image` or `np.array`. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_pix2pix_zero.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_pix2pix_zero.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_pix2pix_zero.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_pix2pix_zero.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_pix2pix_zero.py | [PIL](https //pillow.readthedocs.io/en/stable/) `PIL.Image.Image` or `np.array`. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_pix2pix_zero.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . 
`guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_sag.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_sag.py | # Modified to get self-attention guidance scale in this paper (https //arxiv.org/pdf/2210.00939.pdf) as an input | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_sag.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_sag.py | Please refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for more details | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_sag.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_sag.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_sag.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_sag.py | [`self.processor`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_sag.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_sag.py | # of the self-attentnion guidance paper https //arxiv.org/pdf/2210.00939.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_sag.py | # in https //arxiv.org/pdf/2210.00939.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_sag.py | # Same masking process as in SAG paper https //arxiv.org/pdf/2210.00939.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_upscale.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_upscale.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_upscale.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_upscale.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_upscale.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_upscale.py | [`self.processor`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_upscale.py | >>> url = "https://huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main/sd2-upscale/low_res_cat.png" //huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main/sd2-upscale/low_res_cat.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_upscale.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_unclip.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_unclip.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_unclip.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_unclip.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_unclip.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_unclip.py | [`self.processor`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_unclip.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_unclip.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_unclip_img2img.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_unclip_img2img.py | >>> url = "https://raw.githubusercontent.com/CompVis/stable-diffusion/main/assets/stable-samples/img2img/sketch-mountains-input.jpg" //raw.githubusercontent.com/CompVis/stable-diffusion/main/assets/stable-samples/img2img/sketch-mountains-input.jpg" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_unclip_img2img.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_unclip_img2img.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_unclip_img2img.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_unclip_img2img.py | [`self.processor`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_unclip_img2img.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . 
`guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\safety_checker.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\safety_checker_flax.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\stable_unclip_image_normalizer.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_safe\pipeline_stable_diffusion_safe.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_safe\pipeline_stable_diffusion_safe.py | Please refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for more details | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_safe\pipeline_stable_diffusion_safe.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_safe\pipeline_stable_diffusion_safe.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_safe\pipeline_stable_diffusion_safe.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_safe\pipeline_stable_diffusion_safe.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_safe\pipeline_stable_diffusion_safe.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_safe\safety_checker.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | Sample Steps are Flawed](https //arxiv.org/pdf/2305.08891.pdf). See Section 3.4 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | [CLIP](https //huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel), specifically | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | the [clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14) variant. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | [CLIP](https //huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModelWithProjection), | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | [laion/CLIP-ViT-bigG-14-laion2B-39B-b160k](https //huggingface.co/laion/CLIP-ViT-bigG-14-laion2B-39B-b160k) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer). 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | Whether to use the [invisible_watermark library](https //github.com/ShieldMnt/invisible-watermark/) to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | [stabilityai/stable-diffusion-xl-base-1.0](https //huggingface.co/stabilityai/stable-diffusion-xl-base-1.0) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | [stabilityai/stable-diffusion-xl-base-1.0](https //huggingface.co/stabilityai/stable-diffusion-xl-base-1.0) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | Output**](https //huggingface.co/docs/diffusers/api/pipelines/stable_diffusion/stable_diffusion_xl#refining-the-image-output) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | Corresponds to parameter eta (畏) in the DDIM paper https //arxiv.org/abs/2010.02502. Only applies to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | [PIL](https //pillow.readthedocs.io/en/stable/) `PIL.Image.Image` or `np.array`. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | [diffusers.models.attention_processor](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | Flawed](https //arxiv.org/pdf/2305.08891.pdf) `guidance_scale` is defined as `蠁` in equation 16. of | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | [Common Diffusion Noise Schedules and Sample Steps are Flawed](https //arxiv.org/pdf/2305.08891.pdf). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | section 2.2 of [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). For more | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | information, refer to this issue thread https //github.com/huggingface/diffusers/issues/4208. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). For more | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | information, refer to this issue thread https //github.com/huggingface/diffusers/issues/4208. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). For more | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | information, refer to this issue thread https //github.com/huggingface/diffusers/issues/4208. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | # Based on 3.4. in https //arxiv.org/pdf/2305.08891.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | >>> url = "https://huggingface.co/datasets/patrickvonplaten/images/resolve/main/aa_xl/000000009.png" //huggingface.co/datasets/patrickvonplaten/images/resolve/main/aa_xl/000000009.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | Sample Steps are Flawed](https //arxiv.org/pdf/2305.08891.pdf). See Section 3.4 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | [CLIP](https //huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel), specifically | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | the [clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14) variant. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | [CLIP](https //huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModelWithProjection), | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | [laion/CLIP-ViT-bigG-14-laion2B-39B-b160k](https //huggingface.co/laion/CLIP-ViT-bigG-14-laion2B-39B-b160k) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer). 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | Whether to use the [invisible_watermark library](https //github.com/ShieldMnt/invisible-watermark/) to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | Output**](https //huggingface.co/docs/diffusers/api/pipelines/stable_diffusion/stable_diffusion_xl#refining-the-image-output). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | Output**](https //huggingface.co/docs/diffusers/api/pipelines/stable_diffusion/stable_diffusion_xl#refining-the-image-output). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | Corresponds to parameter eta (畏) in the DDIM paper https //arxiv.org/abs/2010.02502. Only applies to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | [PIL](https //pillow.readthedocs.io/en/stable/) `PIL.Image.Image` or `np.array`. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | [diffusers.models.attention_processor](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | Flawed](https //arxiv.org/pdf/2305.08891.pdf) `guidance_scale` is defined as `蠁` in equation 16. of | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | [Common Diffusion Noise Schedules and Sample Steps are Flawed](https //arxiv.org/pdf/2305.08891.pdf). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | section 2.2 of [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). 
For more | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | information, refer to this issue thread https //github.com/huggingface/diffusers/issues/4208. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). For more | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | information, refer to this issue thread https //github.com/huggingface/diffusers/issues/4208. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). For more | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | information, refer to this issue thread https //github.com/huggingface/diffusers/issues/4208. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). Can be used to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | # Based on 3.4. in https //arxiv.org/pdf/2305.08891.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | >>> img_url = "https://raw.githubusercontent.com/CompVis/latent-diffusion/main/data/inpainting_examples/overture-creations-5sI6fQgYIuo.png" //raw.githubusercontent.com/CompVis/latent-diffusion/main/data/inpainting_examples/overture-creations-5sI6fQgYIuo.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | >>> mask_url = "https://raw.githubusercontent.com/CompVis/latent-diffusion/main/data/inpainting_examples/overture-creations-5sI6fQgYIuo_mask.png" //raw.githubusercontent.com/CompVis/latent-diffusion/main/data/inpainting_examples/overture-creations-5sI6fQgYIuo_mask.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | Sample Steps are Flawed](https //arxiv.org/pdf/2305.08891.pdf). See Section 3.4 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | [CLIP](https //huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel), specifically | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | the [clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14) variant. 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | [CLIP](https //huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModelWithProjection), | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | [laion/CLIP-ViT-bigG-14-laion2B-39B-b160k](https //huggingface.co/laion/CLIP-ViT-bigG-14-laion2B-39B-b160k) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | Whether to use the [invisible_watermark library](https //github.com/ShieldMnt/invisible-watermark/) to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | [stabilityai/stable-diffusion-xl-base-1.0](https //huggingface.co/stabilityai/stable-diffusion-xl-base-1.0) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | [stabilityai/stable-diffusion-xl-base-1.0](https //huggingface.co/stabilityai/stable-diffusion-xl-base-1.0) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | Output**](https //huggingface.co/docs/diffusers/api/pipelines/stable_diffusion/stable_diffusion_xl#refining-the-image-output). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | Output**](https //huggingface.co/docs/diffusers/api/pipelines/stable_diffusion/stable_diffusion_xl#refining-the-image-output). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | Corresponds to parameter eta (畏) in the DDIM paper https //arxiv.org/abs/2010.02502. Only applies to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | [PIL](https //pillow.readthedocs.io/en/stable/) `PIL.Image.Image` or `np.array`. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | [diffusers.models.attention_processor](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | section 2.2 of [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). For more | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | information, refer to this issue thread https //github.com/huggingface/diffusers/issues/4208. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). For more | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | information, refer to this issue thread https //github.com/huggingface/diffusers/issues/4208. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). For more | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | information, refer to this issue thread https //github.com/huggingface/diffusers/issues/4208. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). Can be used to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | # Based on 3.4. in https //arxiv.org/pdf/2305.08891.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_instruct_pix2pix.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_instruct_pix2pix.py | ... "https://hf.co/datasets/diffusers/diffusers-images-docs/resolve/main/mountain.png" //hf.co/datasets/diffusers/diffusers-images-docs/resolve/main/mountain.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_instruct_pix2pix.py | Sample Steps are Flawed](https //arxiv.org/pdf/2305.08891.pdf). 
See Section 3.4 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_instruct_pix2pix.py | [CLIP](https //huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel), specifically | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_instruct_pix2pix.py | the [clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14) variant. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_instruct_pix2pix.py | [CLIP](https //huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModelWithProjection), | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_instruct_pix2pix.py | [laion/CLIP-ViT-bigG-14-laion2B-39B-b160k](https //huggingface.co/laion/CLIP-ViT-bigG-14-laion2B-39B-b160k) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_instruct_pix2pix.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_instruct_pix2pix.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_instruct_pix2pix.py | Whether to use the [invisible_watermark library](https //github.com/ShieldMnt/invisible-watermark/) to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_instruct_pix2pix.py | # eta corresponds to η in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_instruct_pix2pix.py | Output**](https //huggingface.co/docs/diffusers/api/pipelines/stable_diffusion/stable_diffusion_xl#refining-the-image-output) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_instruct_pix2pix.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_instruct_pix2pix.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_instruct_pix2pix.py | Corresponds to parameter eta (η) in the DDIM paper https //arxiv.org/abs/2010.02502. Only applies to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_instruct_pix2pix.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_instruct_pix2pix.py | [PIL](https //pillow.readthedocs.io/en/stable/) `PIL.Image.Image` or `np.array`. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_instruct_pix2pix.py | [diffusers.models.attention_processor](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_instruct_pix2pix.py | Flawed](https //arxiv.org/pdf/2305.08891.pdf) `guidance_scale` is defined as `φ` in equation 16. of | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_instruct_pix2pix.py | [Common Diffusion Noise Schedules and Sample Steps are Flawed](https //arxiv.org/pdf/2305.08891.pdf). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_instruct_pix2pix.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_instruct_pix2pix.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_instruct_pix2pix.py | section 2.2 of [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_instruct_pix2pix.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_instruct_pix2pix.py | # Based on 3.4. in https //arxiv.org/pdf/2305.08891.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\watermark.py | # Copied from https //github.com/Stability-AI/generative-models/blob/613af104c6b85184091d42d374fef420eddb356d/scripts/demo/streamlit_helpers.py#L66 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stochastic_karras_ve\pipeline_stochastic_karras_ve.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stochastic_karras_ve\pipeline_stochastic_karras_ve.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_adapter.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_adapter.py | ... "https://huggingface.co/datasets/diffusers/docs-images/resolve/main/t2i-adapter/color_ref.png" //huggingface.co/datasets/diffusers/docs-images/resolve/main/t2i-adapter/color_ref.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_adapter.py | https //arxiv.org/abs/2302.08453 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_adapter.py | [CLIP](https //huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel), specifically | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_adapter.py | the [clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14) variant. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_adapter.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_adapter.py | Please, refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for details. 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_adapter.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_adapter.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_adapter.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_adapter.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_adapter.py | Corresponds to parameter eta (畏) in the DDIM paper https //arxiv.org/abs/2010.02502. Only applies to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_adapter.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_adapter.py | [PIL](https //pillow.readthedocs.io/en/stable/) `PIL.Image.Image` or `np.array`. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_adapter.py | [diffusers.models.attention_processor](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_adapter.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | >>> sketch_image = load_image("https://huggingface.co/Adapter/t2iadapter/resolve/main/sketch.png").convert("L") //huggingface.co/Adapter/t2iadapter/resolve/main/sketch.png").convert("L") | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | Sample Steps are Flawed](https //arxiv.org/pdf/2305.08891.pdf). See Section 3.4 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | https //arxiv.org/abs/2302.08453 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | [CLIP](https //huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel), specifically | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | the [clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14) variant. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | Please, refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for details. 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | [stabilityai/stable-diffusion-xl-base-1.0](https //huggingface.co/stabilityai/stable-diffusion-xl-base-1.0) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | [stabilityai/stable-diffusion-xl-base-1.0](https //huggingface.co/stabilityai/stable-diffusion-xl-base-1.0) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | Output**](https //huggingface.co/docs/diffusers/api/pipelines/stable_diffusion/stable_diffusion_xl#refining-the-image-output) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | Corresponds to parameter eta (畏) in the DDIM paper https //arxiv.org/abs/2010.02502. Only applies to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | [PIL](https //pillow.readthedocs.io/en/stable/) `PIL.Image.Image` or `np.array`. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | [diffusers.models.attention_processor](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | Flawed](https //arxiv.org/pdf/2305.08891.pdf) `guidance_scale` is defined as `蠁` in equation 16. of | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | [Common Diffusion Noise Schedules and Sample Steps are Flawed](https //arxiv.org/pdf/2305.08891.pdf). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | section 2.2 of [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | section 2.2 of [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). 
For more | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | information, refer to this issue thread https //github.com/huggingface/diffusers/issues/4208. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). For more | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | information, refer to this issue thread https //github.com/huggingface/diffusers/issues/4208. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). For more | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | information, refer to this issue thread https //github.com/huggingface/diffusers/issues/4208. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | # Based on 3.4. in https //arxiv.org/pdf/2305.08891.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\text_to_video_synthesis\pipeline_text_to_video_synth.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\text_to_video_synthesis\pipeline_text_to_video_synth.py | # This code is copied from https //github.com/modelscope/modelscope/blob/1509fdb973e5871f37148a4b5e5964cafd43e64d/modelscope/pipelines/multi_modal/text_to_video_synthesis_pipeline.py#L78 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\text_to_video_synthesis\pipeline_text_to_video_synth.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\text_to_video_synthesis\pipeline_text_to_video_synth.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\text_to_video_synthesis\pipeline_text_to_video_synth.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\text_to_video_synthesis\pipeline_text_to_video_synth.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\text_to_video_synthesis\pipeline_text_to_video_synth.py | [`self.processor`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\text_to_video_synthesis\pipeline_text_to_video_synth.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . 
`guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\text_to_video_synthesis\pipeline_text_to_video_synth_img2img.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\text_to_video_synthesis\pipeline_text_to_video_synth_img2img.py | # This code is copied from https //github.com/modelscope/modelscope/blob/1509fdb973e5871f37148a4b5e5964cafd43e64d/modelscope/pipelines/multi_modal/text_to_video_synthesis_pipeline.py#L78 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\text_to_video_synthesis\pipeline_text_to_video_synth_img2img.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\text_to_video_synthesis\pipeline_text_to_video_synth_img2img.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\text_to_video_synthesis\pipeline_text_to_video_synth_img2img.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\text_to_video_synthesis\pipeline_text_to_video_synth_img2img.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\text_to_video_synthesis\pipeline_text_to_video_synth_img2img.py | [`self.processor`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\text_to_video_synthesis\pipeline_text_to_video_synth_img2img.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\text_to_video_synthesis\pipeline_text_to_video_zero.py | # Adapted from https //github.com/princeton-vl/RAFT/blob/master/core/utils/utils.py | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\text_to_video_synthesis\pipeline_text_to_video_zero.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\text_to_video_synthesis\pipeline_text_to_video_zero.py | Please refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for more details | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\text_to_video_synthesis\pipeline_text_to_video_zero.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\text_to_video_synthesis\pipeline_text_to_video_zero.py | [`self.processor`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\text_to_video_synthesis\pipeline_text_to_video_zero.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\text_to_video_synthesis\pipeline_text_to_video_zero.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\text_to_video_synthesis\pipeline_text_to_video_zero.py | Strength of motion in generated video along x-axis. See the [paper](https //arxiv.org/abs/2303.13439), | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\text_to_video_synthesis\pipeline_text_to_video_zero.py | Strength of motion in generated video along y-axis. 
See the [paper](https //arxiv.org/abs/2303.13439), | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\text_to_video_synthesis\pipeline_text_to_video_zero.py | [paper](https //arxiv.org/abs/2303.13439), Sect. 3.3.1. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\text_to_video_synthesis\pipeline_text_to_video_zero.py | [paper](https //arxiv.org/abs/2303.13439), Sect. 3.3.1. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\text_to_video_synthesis\pipeline_text_to_video_zero.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\unclip\pipeline_unclip.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\unclip\pipeline_unclip.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\unclip\pipeline_unclip_image_variation.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\unclip\pipeline_unclip_image_variation.py | Frozen CLIP image-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\unclip\pipeline_unclip_image_variation.py | [configuration](https //huggingface.co/fusing/karlo-image-variations-diffusers/blob/main/feature_extractor/preprocessor_config.json). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\unclip\pipeline_unclip_image_variation.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\unclip\text_proj.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\unclip\text_proj.py | For more details, see the original paper https //arxiv.org/abs/2204.06125 section 2.1 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\unidiffuser\modeling_text_decoder.py | # Modified from ClipCaptionModel in https //github.com/thu-ml/unidiffuser/blob/main/libs/caption_decoder.py | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\unidiffuser\modeling_text_decoder.py | Text decoder model for a image-text [UniDiffuser](https //arxiv.org/pdf/2303.06555.pdf) model. This is used to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\unidiffuser\modeling_text_decoder.py | code](https //github.com/thu-ml/unidiffuser/blob/main/libs/caption_decoder.py#L89). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\unidiffuser\modeling_uvit.py | # Method based on https //people.sc.fsu.edu/~jburkardt/presentations/truncated_normal.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\unidiffuser\modeling_uvit.py | implementation](https //github.com/thu-ml/unidiffuser/blob/main/libs/uvit_multi_post_ln_v1.py#L104). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\unidiffuser\modeling_uvit.py | # https //github.com/baofff/U-ViT | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\unidiffuser\modeling_uvit.py | Transformer model based on the [U-ViT](https //github.com/baofff/U-ViT) architecture for image-like data. Compared | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\unidiffuser\modeling_uvit.py | Transformer model for a image-text [UniDiffuser](https //arxiv.org/pdf/2303.06555.pdf) model. This is a | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\unidiffuser\pipeline_unidiffuser.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\unidiffuser\pipeline_unidiffuser.py | A [U-ViT](https //github.com/baofff/U-ViT) model with UNNet-style skip connections between transformer | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\unidiffuser\pipeline_unidiffuser.py | # eta corresponds to η in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\unidiffuser\pipeline_unidiffuser.py | [UniDiffuser-v1](https //huggingface.co/thu-ml/unidiffuser-v1) checkpoint. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\unidiffuser\pipeline_unidiffuser.py | Corresponds to parameter eta (η) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\unidiffuser\pipeline_unidiffuser.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\unidiffuser\pipeline_unidiffuser.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\versatile_diffusion\modeling_text_unet.py | " https://github.com/huggingface/diffusers/issues/2011#issuecomment-1547958131. Passing" //github.com/huggingface/diffusers/issues/2011#issuecomment-1547958131. Passing" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\versatile_diffusion\modeling_text_unet.py | # when this library was created. The incorrect naming was only discovered much later in https //github.com/huggingface/diffusers/issues/2011#issuecomment-1547958131 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\versatile_diffusion\pipeline_versatile_diffusion.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\versatile_diffusion\pipeline_versatile_diffusion.py | Please refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for more details | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\versatile_diffusion\pipeline_versatile_diffusion.py | Corresponds to parameter eta (η) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\versatile_diffusion\pipeline_versatile_diffusion.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\versatile_diffusion\pipeline_versatile_diffusion.py | >>> url = "https://huggingface.co/datasets/diffusers/images/resolve/main/benz.jpg" //huggingface.co/datasets/diffusers/images/resolve/main/benz.jpg" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\versatile_diffusion\pipeline_versatile_diffusion.py | Corresponds to parameter eta (η) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\versatile_diffusion\pipeline_versatile_diffusion.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\versatile_diffusion\pipeline_versatile_diffusion.py | Corresponds to parameter eta (η) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. 
Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\versatile_diffusion\pipeline_versatile_diffusion.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\versatile_diffusion\pipeline_versatile_diffusion.py | >>> url = "https://huggingface.co/datasets/diffusers/images/resolve/main/benz.jpg" //huggingface.co/datasets/diffusers/images/resolve/main/benz.jpg" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\versatile_diffusion\pipeline_versatile_diffusion_dual_guided.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\versatile_diffusion\pipeline_versatile_diffusion_dual_guided.py | # eta corresponds to η in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\versatile_diffusion\pipeline_versatile_diffusion_dual_guided.py | Corresponds to parameter eta (η) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\versatile_diffusion\pipeline_versatile_diffusion_dual_guided.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\versatile_diffusion\pipeline_versatile_diffusion_dual_guided.py | >>> url = "https://huggingface.co/datasets/diffusers/images/resolve/main/benz.jpg" //huggingface.co/datasets/diffusers/images/resolve/main/benz.jpg" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\versatile_diffusion\pipeline_versatile_diffusion_dual_guided.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\versatile_diffusion\pipeline_versatile_diffusion_image_variation.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\versatile_diffusion\pipeline_versatile_diffusion_image_variation.py | # eta corresponds to η in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\versatile_diffusion\pipeline_versatile_diffusion_image_variation.py | Corresponds to parameter eta (η) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\versatile_diffusion\pipeline_versatile_diffusion_image_variation.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\versatile_diffusion\pipeline_versatile_diffusion_image_variation.py | >>> url = "https://huggingface.co/datasets/diffusers/images/resolve/main/benz.jpg" //huggingface.co/datasets/diffusers/images/resolve/main/benz.jpg" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\versatile_diffusion\pipeline_versatile_diffusion_image_variation.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\versatile_diffusion\pipeline_versatile_diffusion_text_to_image.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\versatile_diffusion\pipeline_versatile_diffusion_text_to_image.py | # eta corresponds to η in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\versatile_diffusion\pipeline_versatile_diffusion_text_to_image.py | Corresponds to parameter eta (η) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. 
Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\versatile_diffusion\pipeline_versatile_diffusion_text_to_image.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\versatile_diffusion\pipeline_versatile_diffusion_text_to_image.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\vq_diffusion\pipeline_vq_diffusion.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\vq_diffusion\pipeline_vq_diffusion.py | Frozen text-encoder ([clip-vit-base-patch32](https //huggingface.co/openai/clip-vit-base-patch32)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\vq_diffusion\pipeline_vq_diffusion.py | # https //github.com/huggingface/transformers/blob/d92e22d1f28324f513f3080e5c47c071a3916721/src/transformers/models/clip/modeling_clip.py#L1052-L1053 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\vq_diffusion\pipeline_vq_diffusion.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\wuerstchen\modeling_paella_vq_model.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\wuerstchen\modeling_wuerstchen_common.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\wuerstchen\modeling_wuerstchen_common.py | # from https //github.com/facebookresearch/ConvNeXt-V2/blob/3608f67cc1dae164790c5d0aead7bf2d73d9719b/models/utils.py#L105 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\wuerstchen\modeling_wuerstchen_diffnext.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\wuerstchen\modeling_wuerstchen_prior.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\wuerstchen\pipeline_wuerstchen.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\wuerstchen\pipeline_wuerstchen.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\wuerstchen\pipeline_wuerstchen.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\wuerstchen\pipeline_wuerstchen.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\wuerstchen\pipeline_wuerstchen_combined.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\wuerstchen\pipeline_wuerstchen_combined.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\wuerstchen\pipeline_wuerstchen_combined.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\wuerstchen\pipeline_wuerstchen_combined.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\wuerstchen\pipeline_wuerstchen_combined.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). 
Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\wuerstchen\pipeline_wuerstchen_combined.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\wuerstchen\pipeline_wuerstchen_prior.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\wuerstchen\pipeline_wuerstchen_prior.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\wuerstchen\pipeline_wuerstchen_prior.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\wuerstchen\pipeline_wuerstchen_prior.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\wuerstchen\pipeline_wuerstchen_prior.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipeline_utils.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_consistency_models.py | [paper](https //huggingface.co/papers/2206.00364). Defaults to 0.5 from the original implementation. | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_consistency_models.py | [paper](https //huggingface.co/papers/2206.00364). Defaults to 7.0 from the original implementation. | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_consistency_models.py | # See https //github.com/openai/consistency_models/blob/main/cm/karras_diffusion.py#L675 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_consistency_models.py | [paper](https //huggingface.co/papers/2303.01469)) to enforce boundary condition. | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim.py | # DISCLAIMER This code is strongly influenced by https //github.com/pesser/pytorch_diffusion | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim.py | # and https //github.com/hojonathanho/diffusion | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim.py | Rescales betas to have zero terminal SNR Based on https //arxiv.org/pdf/2305.08891.pdf (Algorithm 1) | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim.py | Video](https //imagen.research.google/video/paper.pdf) paper). | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim.py | Sample Steps are Flawed](https //huggingface.co/papers/2305.08891) for more information. | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim.py | [`--offset_noise`](https //github.com/huggingface/diffusers/blob/74fd735eb073eb1d774b1ab4154a0876eb82f055/examples/dreambooth/train_dreambooth.py#L506). | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim.py | https //arxiv.org/abs/2205.11487 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim.py | # "linspace", "leading", "trailing" corresponds to annotation of Table 2. 
of https://arxiv.org/abs/2305.08891 //arxiv.org/abs/2305.08891 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim.py | # See formulas (12) and (16) of DDIM paper https //arxiv.org/pdf/2010.02502.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim.py | # "predicted x_0" of formula (12) from https://arxiv.org/pdf/2010.02502.pdf //arxiv.org/pdf/2010.02502.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim.py | # 6. compute "direction pointing to x_t" of formula (12) from https://arxiv.org/pdf/2010.02502.pdf //arxiv.org/pdf/2010.02502.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim.py | # 7. compute x_t without "random noise" of formula (12) from https://arxiv.org/pdf/2010.02502.pdf //arxiv.org/pdf/2010.02502.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_flax.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_flax.py | # DISCLAIMER This code is strongly influenced by https //github.com/pesser/pytorch_diffusion | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_flax.py | # and https //github.com/hojonathanho/diffusion | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_flax.py | For more details, see the original paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_flax.py | # See formulas (12) and (16) of DDIM paper https //arxiv.org/pdf/2010.02502.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_flax.py | # "predicted x_0" of formula (12) from https://arxiv.org/pdf/2010.02502.pdf //arxiv.org/pdf/2010.02502.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_flax.py | # 5. compute "direction pointing to x_t" of formula (12) from https://arxiv.org/pdf/2010.02502.pdf //arxiv.org/pdf/2010.02502.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_flax.py | # 6. compute x_t without "random noise" of formula (12) from https://arxiv.org/pdf/2010.02502.pdf //arxiv.org/pdf/2010.02502.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_inverse.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_inverse.py | # DISCLAIMER This code is strongly influenced by https //github.com/pesser/pytorch_diffusion | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_inverse.py | # and https //github.com/hojonathanho/diffusion | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_inverse.py | Rescales betas to have zero terminal SNR Based on https //arxiv.org/pdf/2305.08891.pdf (Algorithm 1) | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_inverse.py | Video](https //imagen.research.google/video/paper.pdf) paper). | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_inverse.py | Sample Steps are Flawed](https //huggingface.co/papers/2305.08891) for more information. | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_inverse.py | [`--offset_noise`](https //github.com/huggingface/diffusers/blob/74fd735eb073eb1d774b1ab4154a0876eb82f055/examples/dreambooth/train_dreambooth.py#L506). | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_inverse.py | # "leading" and "trailing" corresponds to annotation of Table 1. 
of https://arxiv.org/abs/2305.08891 //arxiv.org/abs/2305.08891 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_inverse.py | # "predicted x_0" of formula (12) from https://arxiv.org/pdf/2010.02502.pdf //arxiv.org/pdf/2010.02502.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_inverse.py | # 5. compute "direction pointing to x_t" of formula (12) from https://arxiv.org/pdf/2010.02502.pdf //arxiv.org/pdf/2010.02502.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_inverse.py | # 6. compute x_t without "random noise" of formula (12) from https://arxiv.org/pdf/2010.02502.pdf //arxiv.org/pdf/2010.02502.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_parallel.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_parallel.py | # DISCLAIMER This code is strongly influenced by https //github.com/pesser/pytorch_diffusion | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_parallel.py | # and https //github.com/hojonathanho/diffusion | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_parallel.py | Rescales betas to have zero terminal SNR Based on https //arxiv.org/pdf/2305.08891.pdf (Algorithm 1) | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_parallel.py | For more details, see the original paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_parallel.py | https //imagen.research.google/video/paper.pdf) | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_parallel.py | whether to use the "dynamic thresholding" method (introduced by Imagen, https://arxiv.org/abs/2205.11487). //arxiv.org/abs/2205.11487). | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_parallel.py | (https //arxiv.org/abs/2205.11487). Valid only when `thresholding=True`. | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_parallel.py | Steps are Flawed](https //arxiv.org/abs/2305.08891) for more information. | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_parallel.py | whether to rescale the betas to have zero terminal SNR (proposed by https //arxiv.org/pdf/2305.08891.pdf). | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_parallel.py | [`--offset_noise`](https //github.com/huggingface/diffusers/blob/74fd735eb073eb1d774b1ab4154a0876eb82f055/examples/dreambooth/train_dreambooth.py#L506). | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_parallel.py | https //arxiv.org/abs/2205.11487 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_parallel.py | # "linspace", "leading", "trailing" corresponds to annotation of Table 2. of https://arxiv.org/abs/2305.08891 //arxiv.org/abs/2305.08891 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_parallel.py | CycleDiffusion. (https //arxiv.org/abs/2210.05559) | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_parallel.py | # See formulas (12) and (16) of DDIM paper https //arxiv.org/pdf/2010.02502.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_parallel.py | # "predicted x_0" of formula (12) from https://arxiv.org/pdf/2010.02502.pdf //arxiv.org/pdf/2010.02502.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_parallel.py | # 6. 
compute "direction pointing to x_t" of formula (12) from https://arxiv.org/pdf/2010.02502.pdf //arxiv.org/pdf/2010.02502.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_parallel.py | # 7. compute x_t without "random noise" of formula (12) from https://arxiv.org/pdf/2010.02502.pdf //arxiv.org/pdf/2010.02502.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_parallel.py | # See formulas (12) and (16) of DDIM paper https //arxiv.org/pdf/2010.02502.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_parallel.py | # "predicted x_0" of formula (12) from https://arxiv.org/pdf/2010.02502.pdf //arxiv.org/pdf/2010.02502.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_parallel.py | # 6. compute "direction pointing to x_t" of formula (12) from https://arxiv.org/pdf/2010.02502.pdf //arxiv.org/pdf/2010.02502.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_parallel.py | # 7. compute x_t without "random noise" of formula (12) from https://arxiv.org/pdf/2010.02502.pdf //arxiv.org/pdf/2010.02502.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm.py | # DISCLAIMER This file is strongly influenced by https //github.com/ermongroup/ddim | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm.py | Video](https //imagen.research.google/video/paper.pdf) paper). | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm.py | Sample Steps are Flawed](https //huggingface.co/papers/2305.08891) for more information. | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm.py | # "linspace", "leading", "trailing" corresponds to annotation of Table 2. 
of https://arxiv.org/abs/2305.08891 //arxiv.org/abs/2305.08891 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm.py | # For t > 0, compute predicted variance βt (see formula (6) and (7) from https //arxiv.org/pdf/2006.11239.pdf) | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm.py | # for rl-diffuser https //arxiv.org/abs/2205.09991 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm.py | https //arxiv.org/abs/2205.11487 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm.py | # "predicted x_0" of formula (15) from https://arxiv.org/pdf/2006.11239.pdf //arxiv.org/pdf/2006.11239.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm.py | # See formula (7) from https //arxiv.org/pdf/2006.11239.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm.py | # See formula (7) from https //arxiv.org/pdf/2006.11239.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm_flax.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm_flax.py | # DISCLAIMER This file is strongly influenced by https //github.com/ermongroup/ddim | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm_flax.py | For more details, see the original paper https //arxiv.org/abs/2006.11239 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm_flax.py | # For t > 0, compute predicted variance βt (see formula (6) and (7) from https //arxiv.org/pdf/2006.11239.pdf) | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm_flax.py | # for rl-diffuser https //arxiv.org/abs/2205.09991 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm_flax.py | # "predicted x_0" of formula (15) from https://arxiv.org/pdf/2006.11239.pdf //arxiv.org/pdf/2006.11239.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm_flax.py | # See formula (7) from https //arxiv.org/pdf/2006.11239.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm_flax.py | # See formula (7) from https //arxiv.org/pdf/2006.11239.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm_parallel.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm_parallel.py | # DISCLAIMER This file is strongly influenced by https //github.com/ermongroup/ddim | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm_parallel.py | For more details, see the original paper https //arxiv.org/abs/2006.11239 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm_parallel.py | https //imagen.research.google/video/paper.pdf) | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm_parallel.py | whether to use the "dynamic thresholding" method (introduced by Imagen, https://arxiv.org/abs/2205.11487). //arxiv.org/abs/2205.11487). | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm_parallel.py | (https //arxiv.org/abs/2205.11487). Valid only when `thresholding=True`. | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm_parallel.py | Steps are Flawed](https //arxiv.org/abs/2305.08891) for more information. | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm_parallel.py | # "linspace", "leading", "trailing" corresponds to annotation of Table 2. 
of https://arxiv.org/abs/2305.08891 //arxiv.org/abs/2305.08891 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm_parallel.py | # For t > 0, compute predicted variance βt (see formula (6) and (7) from https //arxiv.org/pdf/2006.11239.pdf) | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm_parallel.py | # for rl-diffuser https //arxiv.org/abs/2205.09991 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm_parallel.py | https //arxiv.org/abs/2205.11487 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm_parallel.py | # "predicted x_0" of formula (15) from https://arxiv.org/pdf/2006.11239.pdf //arxiv.org/pdf/2006.11239.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm_parallel.py | # See formula (7) from https //arxiv.org/pdf/2006.11239.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm_parallel.py | # See formula (7) from https //arxiv.org/pdf/2006.11239.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm_parallel.py | # "predicted x_0" of formula (15) from https://arxiv.org/pdf/2006.11239.pdf //arxiv.org/pdf/2006.11239.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm_parallel.py | # See formula (7) from https //arxiv.org/pdf/2006.11239.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm_parallel.py | # See formula (7) from https //arxiv.org/pdf/2006.11239.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm_wuerstchen.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm_wuerstchen.py | # DISCLAIMER This file is strongly influenced by https //github.com/ermongroup/ddim | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm_wuerstchen.py | For more details, see the original paper https //arxiv.org/abs/2006.11239 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_deis_multistep.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_deis_multistep.py | # DISCLAIMER check https //arxiv.org/abs/2204.13902 and https | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_deis_multistep.py | # The codebase is modified based on https //github.com/huggingface/diffusers/blob/main/src/diffusers/schedulers/scheduling_dpmsolver_multistep.py | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_deis_multistep.py | Video](https //imagen.research.google/video/paper.pdf) paper). | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_deis_multistep.py | Sample Steps are Flawed](https //huggingface.co/papers/2305.08891) for more information. | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_deis_multistep.py | # "linspace", "leading", "trailing" corresponds to annotation of Table 2. of https://arxiv.org/abs/2305.08891 //arxiv.org/abs/2305.08891 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_deis_multistep.py | https //arxiv.org/abs/2205.11487 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep.py | # DISCLAIMER This file is strongly influenced by https //github.com/LuChengTHU/dpm-solver | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep.py | Video](https //imagen.research.google/video/paper.pdf) paper). 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep.py | `dpmsolver` type implements the algorithms in the [DPMSolver](https //huggingface.co/papers/2206.00927) | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep.py | [DPMSolver++](https //huggingface.co/papers/2211.01095) paper. It is recommended to use `dpmsolver++` or | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep.py | Sample Steps are Flawed](https //huggingface.co/papers/2305.08891) for more information. | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep.py | # "linspace", "leading", "trailing" corresponds to annotation of Table 2. of https://arxiv.org/abs/2305.08891 //arxiv.org/abs/2305.08891 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep.py | https //arxiv.org/abs/2205.11487 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep.py | # See https //arxiv.org/abs/2211.01095 for detailed derivations | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep.py | # See https //arxiv.org/abs/2206.00927 for detailed derivations | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep.py | # See https //arxiv.org/abs/2206.00927 for detailed derivations | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep.py | # See https //arxiv.org/abs/2206.00927 for detailed derivations | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep_flax.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep_flax.py | # DISCLAIMER This file is strongly influenced by https //github.com/LuChengTHU/dpm-solver | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep_flax.py | For more details, see the original paper https //arxiv.org/abs/2206.00927 and https | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep_flax.py | We also support the "dynamic thresholding" method in Imagen (https://arxiv.org/abs/2205.11487). For pixel-space //arxiv.org/abs/2205.11487). For pixel-space | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep_flax.py | For more details, see the original paper https //arxiv.org/abs/2206.00927 and https | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep_flax.py | whether to use the "dynamic thresholding" method (introduced by Imagen, https://arxiv.org/abs/2205.11487). //arxiv.org/abs/2205.11487). | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep_flax.py | (https //arxiv.org/abs/2205.11487). | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep_flax.py | algorithms in https //arxiv.org/abs/2206.00927, and the `dpmsolver++` type implements the algorithms in | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep_flax.py | https //arxiv.org/abs/2211.01095. We recommend to use `dpmsolver++` with `solver_order=2` for guided | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep_flax.py | # Dynamic thresholding in https //arxiv.org/abs/2205.11487 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep_flax.py | See https //arxiv.org/abs/2206.00927 for the detailed derivation. 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep_flax.py | # See https //arxiv.org/abs/2211.01095 for detailed derivations | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep_flax.py | # See https //arxiv.org/abs/2206.00927 for detailed derivations | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep_flax.py | # See https //arxiv.org/abs/2206.00927 for detailed derivations | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep_flax.py | # See https //arxiv.org/abs/2206.00927 for detailed derivations | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep_inverse.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep_inverse.py | # DISCLAIMER This file is strongly influenced by https //github.com/LuChengTHU/dpm-solver | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep_inverse.py | Video](https //imagen.research.google/video/paper.pdf) paper). | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep_inverse.py | `dpmsolver` type implements the algorithms in the [DPMSolver](https //huggingface.co/papers/2206.00927) | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep_inverse.py | [DPMSolver++](https //huggingface.co/papers/2211.01095) paper. It is recommended to use `dpmsolver++` or | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep_inverse.py | Sample Steps are Flawed](https //huggingface.co/papers/2305.08891) for more information. | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep_inverse.py | # "linspace", "leading", "trailing" corresponds to annotation of Table 2. of https://arxiv.org/abs/2305.08891 //arxiv.org/abs/2305.08891 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep_inverse.py | https //arxiv.org/abs/2205.11487 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep_inverse.py | # See https //arxiv.org/abs/2211.01095 for detailed derivations | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep_inverse.py | # See https //arxiv.org/abs/2206.00927 for detailed derivations | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep_inverse.py | # See https //arxiv.org/abs/2206.00927 for detailed derivations | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep_inverse.py | # See https //arxiv.org/abs/2206.00927 for detailed derivations | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_sde.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_sde.py | Generative Models](https //huggingface.co/papers/2206.00364) paper. | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_sde.py | Video](https //imagen.research.google/video/paper.pdf) paper). | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_sde.py | Sample Steps are Flawed](https //huggingface.co/papers/2305.08891) for more information. | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_sde.py | # "linspace", "leading", "trailing" corresponds to annotation of Table 2. 
of https://arxiv.org/abs/2305.08891 //arxiv.org/abs/2305.08891 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_singlestep.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_singlestep.py | # DISCLAIMER This file is strongly influenced by https //github.com/LuChengTHU/dpm-solver | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_singlestep.py | Video](https //imagen.research.google/video/paper.pdf) paper). | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_singlestep.py | `dpmsolver` type implements the algorithms in the [DPMSolver](https //huggingface.co/papers/2206.00927) | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_singlestep.py | [DPMSolver++](https //huggingface.co/papers/2211.01095) paper. It is recommended to use `dpmsolver++` or | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_singlestep.py | https //arxiv.org/abs/2205.11487 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_singlestep.py | # See https //arxiv.org/abs/2211.01095 for detailed derivations | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_singlestep.py | # See https //arxiv.org/abs/2206.00927 for detailed derivations | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_singlestep.py | # See https //arxiv.org/abs/2206.00927 for detailed derivations | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_singlestep.py | # See https //arxiv.org/abs/2206.00927 for detailed derivations | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_euler_ancestral_discrete.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_euler_ancestral_discrete.py | Video](https //imagen.research.google/video/paper.pdf) paper). | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_euler_ancestral_discrete.py | Sample Steps are Flawed](https //huggingface.co/papers/2305.08891) for more information. | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_euler_ancestral_discrete.py | # "linspace", "leading", "trailing" corresponds to annotation of Table 2. of https://arxiv.org/abs/2305.08891 //arxiv.org/abs/2305.08891 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_euler_discrete.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_euler_discrete.py | Video](https //imagen.research.google/video/paper.pdf) paper). | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_euler_discrete.py | Sample Steps are Flawed](https //huggingface.co/papers/2305.08891) for more information. | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_euler_discrete.py | # "linspace", "leading", "trailing" corresponds to annotation of Table 2. of https://arxiv.org/abs/2305.08891 //arxiv.org/abs/2305.08891 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_euler_discrete.py | # Copied from https //github.com/crowsonkb/k-diffusion/blob/686dbad0f39640ea25c8a8c6a6e56bb40eacefa2/k_diffusion/sampling.py#L17 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_heun_discrete.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_heun_discrete.py | Video](https //imagen.research.google/video/paper.pdf) paper). 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_heun_discrete.py | Sample Steps are Flawed](https //huggingface.co/papers/2305.08891) for more information. | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_heun_discrete.py | # "linspace", "leading", "trailing" corresponds to annotation of Table 2. of https://arxiv.org/abs/2305.08891 //arxiv.org/abs/2305.08891 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ipndm.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ipndm.py | # For more information on the algorithm please take a look at the paper https //arxiv.org/pdf/2202.09778.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_karras_ve.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_karras_ve.py | For more details on the parameters, see [Appendix E](https //arxiv.org/abs/2206.00364). The grid search values used | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_karras_ve_flax.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_karras_ve_flax.py | https //arxiv.org/abs/2206.00364 [2] Song, Yang, et al. "Score-based generative modeling through stochastic | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_karras_ve_flax.py | differential equations." https://arxiv.org/abs/2011.13456 //arxiv.org/abs/2011.13456 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_karras_ve_flax.py | Diffusion-Based Generative Models." https://arxiv.org/abs/2206.00364. The grid search values used to find the //arxiv.org/abs/2206.00364. The grid search values used to find the | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_k_dpm_2_ancestral_discrete.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_k_dpm_2_ancestral_discrete.py | the Design Space of Diffusion-Based Generative Models](https //huggingface.co/papers/2206.00364) paper. | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_k_dpm_2_ancestral_discrete.py | Video](https //imagen.research.google/video/paper.pdf) paper). | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_k_dpm_2_ancestral_discrete.py | Sample Steps are Flawed](https //huggingface.co/papers/2305.08891) for more information. | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_k_dpm_2_ancestral_discrete.py | # "linspace", "leading", "trailing" corresponds to annotation of Table 2. of https://arxiv.org/abs/2305.08891 //arxiv.org/abs/2305.08891 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_k_dpm_2_discrete.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_k_dpm_2_discrete.py | Diffusion-Based Generative Models](https //huggingface.co/papers/2206.00364) paper. | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_k_dpm_2_discrete.py | Video](https //imagen.research.google/video/paper.pdf) paper). | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_k_dpm_2_discrete.py | Sample Steps are Flawed](https //huggingface.co/papers/2305.08891) for more information. | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_k_dpm_2_discrete.py | # "linspace", "leading", "trailing" corresponds to annotation of Table 2. 
of https://arxiv.org/abs/2305.08891 //arxiv.org/abs/2305.08891 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_lms_discrete.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_lms_discrete.py | Video](https //imagen.research.google/video/paper.pdf) paper). | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_lms_discrete.py | Sample Steps are Flawed](https //huggingface.co/papers/2305.08891) for more information. | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_lms_discrete.py | # "linspace", "leading", "trailing" corresponds to annotation of Table 2. of https://arxiv.org/abs/2305.08891 //arxiv.org/abs/2305.08891 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_lms_discrete_flax.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_lms_discrete_flax.py | https //github.com/crowsonkb/k-diffusion/blob/481677d114f6ea445aa009cf5bd7a9cdee909e47/k_diffusion/sampling.py#L181 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_lms_discrete_flax.py | https //imagen.research.google/video/paper.pdf) | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_pndm.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_pndm.py | # DISCLAIMER This file is strongly influenced by https //github.com/ermongroup/ddim | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_pndm.py | or `v_prediction` (see section 2.4 of [Imagen Video](https //imagen.research.google/video/paper.pdf) | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_pndm.py | Sample Steps are Flawed](https //huggingface.co/papers/2305.08891) for more information. | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_pndm.py | # For more information on the algorithm please take a look at the paper https //arxiv.org/pdf/2202.09778.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_pndm.py | # "linspace", "leading", "trailing" corresponds to annotation of Table 2. 
of https://arxiv.org/abs/2305.08891 //arxiv.org/abs/2305.08891 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_pndm.py | # is based on crowsonkb's PLMS sampler implementation https //github.com/CompVis/latent-diffusion/pull/51 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_pndm.py | "See: https://github.com/huggingface/diffusers/blob/main/src/diffusers/pipelines/pipeline_pndm.py " https //github.com/huggingface/diffusers/blob/main/src/diffusers/pipelines/pipeline_pndm.py " | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_pndm.py | # See formula (9) of PNDM paper https //arxiv.org/pdf/2202.09778.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_pndm_flax.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_pndm_flax.py | # DISCLAIMER This file is strongly influenced by https //github.com/ermongroup/ddim | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_pndm_flax.py | For more details, see the original paper https //arxiv.org/abs/2202.09778 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_pndm_flax.py | https //imagen.research.google/video/paper.pdf) | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_pndm_flax.py | # For more information on the algorithm please take a look at the paper https //arxiv.org/pdf/2202.09778.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_pndm_flax.py | # is based on crowsonkb's PLMS sampler implementation https //github.com/CompVis/latent-diffusion/pull/51 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_pndm_flax.py | # See formula (9) of PNDM paper https //arxiv.org/pdf/2202.09778.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_repaint.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_repaint.py | # https //arxiv.org/pdf/2006.11239.pdf) and sample from it to get | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_repaint.py | # Is equivalent to formula (16) in https //arxiv.org/pdf/2010.02502.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_repaint.py | # "predicted x_0" of formula (15) from https://arxiv.org/pdf/2006.11239.pdf //arxiv.org/pdf/2006.11239.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_repaint.py | # from https //arxiv.org/pdf/2010.02502.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_repaint.py | # 7. compute x_{t-1} of formula (12) from https //arxiv.org/pdf/2010.02502.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_repaint.py | # 8. Algorithm 1 Line 5 https //arxiv.org/pdf/2201.09865.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_repaint.py | # 9. Algorithm 1 Line 8 https //arxiv.org/pdf/2201.09865.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_repaint.py | # 10. 
Algorithm 1 Line 10 https //arxiv.org/pdf/2201.09865.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_sde_ve.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_sde_ve.py | # DISCLAIMER This file is strongly influenced by https //github.com/yang-song/score_sde_pytorch | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_sde_ve_flax.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_sde_ve_flax.py | # DISCLAIMER This file is strongly influenced by https //github.com/yang-song/score_sde_pytorch | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_sde_ve_flax.py | For more information, see the original paper https //arxiv.org/abs/2011.13456 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_sde_vp.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_sde_vp.py | # DISCLAIMER This file is strongly influenced by https //github.com/yang-song/score_sde_pytorch | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_unclip.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_unclip.py | # For t > 0, compute predicted variance 尾t (see formula (6) and (7) from https //arxiv.org/pdf/2006.11239.pdf) | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_unclip.py | # "predicted x_0" of formula (15) from https://arxiv.org/pdf/2006.11239.pdf //arxiv.org/pdf/2006.11239.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_unclip.py | # See formula (7) from https //arxiv.org/pdf/2006.11239.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_unclip.py | # See formula (7) from https //arxiv.org/pdf/2006.11239.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_unipc_multistep.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_unipc_multistep.py | # DISCLAIMER check https //arxiv.org/abs/2302.04867 and https | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_unipc_multistep.py | # The codebase is modified based on https //github.com/huggingface/diffusers/blob/main/src/diffusers/schedulers/scheduling_dpmsolver_multistep.py | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_unipc_multistep.py | Video](https //imagen.research.google/video/paper.pdf) paper). | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_unipc_multistep.py | Sample Steps are Flawed](https //huggingface.co/papers/2305.08891) for more information. | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_unipc_multistep.py | # "linspace", "leading", "trailing" corresponds to annotation of Table 2. 
of https://arxiv.org/abs/2305.08891 //arxiv.org/abs/2305.08891 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_unipc_multistep.py | https //arxiv.org/abs/2205.11487 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_utils.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_utils.py | 'http //hostname' 'foo.bar | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_utils.py | To use private or [gated models](https //huggingface.co/docs/hub/models-gated#gated-models), log-in with | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_utils.py | ["offline-mode"](https://huggingface.co/diffusers/installation.html#offline-mode) to use this method in a //huggingface.co/diffusers/installation.html#offline-mode) to use this method in a | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_utils_flax.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_utils_flax.py | 'http //hostname' 'foo.bar | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_utils_flax.py | models](https //huggingface.co/docs/hub/models-gated#gated-models). | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_utils_flax.py | Activate the special ["offline-mode"](https://huggingface.co/transformers/installation.html#offline-mode) to //huggingface.co/transformers/installation.html#offline-mode) to | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_vq_diffusion.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\__init__.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\training_utils.py | # Adapted from torch-ema https //github.com/fadel/pytorch_ema/blob/master/torch_ema/ema.py#L14 | 问题引导 | -| 开源代码引入 | .\src\diffusers\training_utils.py | # https //pytorch.org/tutorials/beginner/saving_loading_models.html#what-is-a-state-dict | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\accelerate_utils.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\constants.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\constants.py | HUGGINGFACE_CO_RESOLVE_ENDPOINT = "https://huggingface.co" //huggingface.co" | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\doc_utils.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\dynamic_modules_utils.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\dynamic_modules_utils.py | "https://raw.githubusercontent.com/huggingface/diffusers/{revision}/examples/community/{pipeline}.py" //raw.githubusercontent.com/huggingface/diffusers/{revision}/examples/community/{pipeline}.py" | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\dynamic_modules_utils.py | url = "https://pypi.org/pypi/diffusers/json" //pypi.org/pypi/diffusers/json" | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\dynamic_modules_utils.py | 'http //hostname' 'foo.bar | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\dynamic_modules_utils.py | or [gated models](https //huggingface.co/docs/hub/models-gated#gated-models). | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\dynamic_modules_utils.py | 'http //hostname' 'foo.bar | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\dynamic_modules_utils.py | or [gated models](https //huggingface.co/docs/hub/models-gated#gated-models). 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\hub_utils.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\hub_utils.py | "file an issue at https://github.com/huggingface/diffusers/issues/new/choose, copy paste this whole " //github.com/huggingface/diffusers/issues/new/choose, copy paste this whole " | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\hub_utils.py | f"You are loading the variant {revision} from {pretrained_model_name_or_path} via `revision='{revision}'`. This behavior is deprecated and will be removed in diffusers v1. One should use `variant='{revision}'` instead. However, it appears that {pretrained_model_name_or_path} currently does not have a {_add_variant(weights_name, revision)} file in the 'main' branch of {pretrained_model_name_or_path}. \n The Diffusers team and community would be very grateful if you could open an issue: https://github.com/huggingface/diffusers/issues/new with the title '{pretrained_model_name_or_path} is missing {_add_variant(weights_name, revision)}' so that the correct variant file can be added.", https //github.com/huggingface/diffusers/issues/new with the title '{pretrained_model_name_or_path} is missing {_add_variant(weights_name, revision)}' so that the correct variant file can be added.", | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\hub_utils.py | "listed on 'https://huggingface.co/models'\nIf this is a private repository, make sure to pass a " //huggingface.co/models'\nIf this is a private repository, make sure to pass a " | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\hub_utils.py | f"'https://huggingface.co/{pretrained_model_name_or_path}' for available revisions." //huggingface.co/{pretrained_model_name_or_path}' for available revisions." | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\hub_utils.py | " offline mode at 'https://huggingface.co/docs/diffusers/installation#offline-mode'." //huggingface.co/docs/diffusers/installation#offline-mode'." | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\hub_utils.py | "'https://huggingface.co/models', make sure you don't have a local directory with the same name. " //huggingface.co/models', make sure you don't have a local directory with the same name. " | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\import_utils.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\import_utils.py | installation page https //github.com/google/flax and follow the ones that match your environment. | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\import_utils.py | installation page https //pytorch.org/get-started/locally/ and follow the ones that match your environment. | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\import_utils.py | installation page https //librosa.org/doc/latest/install.html and follow the ones that match your environment. 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\import_utils.py | installation section https //github.com/rspeer/python-ftfy/tree/master#installing and follow the ones | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\import_utils.py | # This function was copied from https //github.com/huggingface/accelerate/blob/874c4967d94badd24f893064cc3bef45f57cadf7/src/accelerate/utils/versions.py#L319 | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\import_utils.py | # This function was copied from https //github.com/huggingface/accelerate/blob/874c4967d94badd24f893064cc3bef45f57cadf7/src/accelerate/utils/versions.py#L338 | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\import_utils.py | # https //github.com/optuna/optuna/blob/master/optuna/integration/__init__.py | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\loading_utils.py | if image.startswith("http://") or image.startswith("https://"): //") or image.startswith("https://"): //"): | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\loading_utils.py | f"Incorrect path or url, URLs must start with `http://` or `https://`, and {image} is not a valid path" //` or `https //`, and {image} is not a valid path" | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\logging.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\outputs.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\testing_utils.py | Decorator marking a test that requires compel https //github.com/damian0815/compel. These tests are skipped when | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\testing_utils.py | elif arry.startswith("http://") or arry.startswith("https://"): //") or arry.startswith("https://"): //"): | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\testing_utils.py | f"Incorrect path or url, URLs must start with `http://` or `https://`, and {arry} is not a valid path" //` or `https //`, and {arry} is not a valid path" | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\testing_utils.py | if image.startswith("http://") or image.startswith("https://"): //") or image.startswith("https://"): //"): | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\testing_utils.py | f"Incorrect path or url, URLs must start with `http://` or `https://`, and {image} is not a valid path" //` or `https //`, and {image} is not a valid path" | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\testing_utils.py | if not path.startswith("http://") or path.startswith("https://"): //") or path.startswith("https://"): //"): | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\testing_utils.py | "https://huggingface.co/datasets/fusing/diffusers-testing/resolve/main", urllib.parse.quote(path) //huggingface.co/datasets/fusing/diffusers-testing/resolve/main", urllib.parse.quote(path) | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\testing_utils.py | # adapted from https //github.com/pytest-dev/pytest/blob/897f151e/src/_pytest/runner.py#L66 | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\testing_utils.py | # adapted from https //github.com/pytest-dev/pytest/blob/897f151e/src/_pytest/terminal.py#L814 | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\testing_utils.py | # Taken from https //github.com/huggingface/transformers/blob/3658488ff77ff8d45101293e749263acf437f4d5/src/transformers/testing_utils.py#L1787 | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\testing_utils.py | - https //pytorch.org/docs/stable/notes/randomness.html for pytorch | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\torch_utils.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\__init__.py | # http //www.apache.org/licenses/LICENSE-2.0 | 
问题引导 | -| 开源代码引入 | .\src\diffusers\utils\__init__.py | "`https://huggingface.co/docs/diffusers/installation#install-from-source`)," //huggingface.co/docs/diffusers/installation#install-from-source`)," | 问题引导 | -| 开源代码引入 | .\src\diffusers\__init__.py | # https //github.com/huggingface/transformers/blob/main/src/transformers/__init__.py | 问题引导 | -| 开源代码引入 | .\utils\check_config_docstrings.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\utils\check_config_docstrings.py | # For example, `[bert-base-uncased](https //huggingface.co/bert-base-uncased)` | 问题引导 | -| 开源代码引入 | .\utils\check_config_docstrings.py | _re_checkpoint = re.compile("\[(.+?)\]\((https://huggingface\.co/.+?)\)") //huggingface\.co/.+?)\)") | 问题引导 | -| 开源代码引入 | .\utils\check_config_docstrings.py | # For example, `('bert-base-uncased', 'https //huggingface.co/bert-base-uncased')` | 问题引导 | -| 开源代码引入 | .\utils\check_config_docstrings.py | ckpt_link_from_name = f"https://huggingface.co/{ckpt_name}" //huggingface.co/{ckpt_name}" | 下载预训练模型 | -| 开源代码引入 | .\utils\check_copies.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\utils\check_doc_toc.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\utils\check_dummies.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\utils\check_inits.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\utils\check_repo.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\utils\check_repo.py | "(`pip install git+https://github.com/huggingface/doc-builder`)" //github.com/huggingface/doc-builder`)" | 问题引导 | -| 开源代码引入 | .\utils\check_table.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\utils\check_table.py | # Thanks to https //stackoverflow.com/questions/29916065/how-to-do-camelcase-split-in-python | 问题引导 | -| 开源代码引入 | .\utils\custom_init_isort.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\utils\get_modified_files.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\utils\overwrite_expected_slice.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\utils\print_env.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\utils\release.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\utils\release.py | "https://huggingface.co/docs/diffusers/main/model_doc", //huggingface.co/docs/diffusers/main/model_doc", | 问题引导 | -| 开源代码引入 | .\utils\release.py | "https://huggingface.co/docs/diffusers/model_doc", //huggingface.co/docs/diffusers/model_doc", | 问题引导 | -| 开源代码引入 | .\utils\stale.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\utils\stale.py | https //github.com/allenai/allennlp. 
| 问题引导 | -| 开源代码引入 | .\utils\stale.py | "[contributing guidelines](https://github.com/huggingface/diffusers/blob/main/CONTRIBUTING.md) " //github.com/huggingface/diffusers/blob/main/CONTRIBUTING.md) " | 问题引导 | -| 开源代码引入 | .\_typos.toml | # Instruction https //github.com/marketplace/actions/typos-action#getting-started | 问题引导 | +| 文件位置 | 公网地址 | 公网地址用途 | +|---------------------------------------------------------------------------------------------------------------------------------------------------|--------------------------------------------------------------------------------------------------------------------------|--------------------------| +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.21.0/examples/controlnet/train_controlnet.py | https://www.tensorflow.org/tensorboard | tensorboard地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.21.0/examples/controlnet/train_controlnet.py | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.21.0/examples/controlnet/train_controlnet.py | https://pytorch.org/docs/stable/generated/torch.optim.Optimizer.zero_grad.html | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.21.0/examples/controlnet/train_controlnet_flax.py | https://arxiv.org/abs/2303.09556. | SNR weighting gamma论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.21.0/examples/controlnet/train_controlnet_sdxl.py | https://www.tensorflow.org/tensorboard | tensorboard地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.21.0/examples/controlnet/train_controlnet_sdxl.py | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.21.0/examples/controlnet/train_controlnet_sdxl.py | https://pytorch.org/docs/stable/generated/torch.optim.Optimizer.zero_grad.html | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.21.0/examples/dreambooth/train_dreambooth.py | https://www.tensorflow.org/tensorboard | tensorboard地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.21.0/examples/dreambooth/train_dreambooth.py | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.21.0/examples/dreambooth/train_dreambooth.py | https://pytorch.org/docs/stable/generated/torch.optim.Optimizer.zero_grad.html | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.21.0/examples/dreambooth/train_dreambooth.py | https://www.crosslabs.org//blog/ | 问题引导 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.21.0/examples/dreambooth/train_dreambooth_flax.py | https://www.tensorflow.org/tensorboard | tensorboard地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.21.0/examples/dreambooth/train_dreambooth_lora.py | https://www.tensorflow.org/tensorboard | tensorboard地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.21.0/examples/dreambooth/train_dreambooth_lora.py | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.21.0/examples/dreambooth/train_dreambooth_lora_sdxl.py | https://www.tensorflow.org/tensorboard | tensorboard地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.21.0/examples/dreambooth/train_dreambooth_lora_sdxl.py | 
https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.21.0/examples/text_to_image/train_text_to_image.py | https://www.tensorflow.org/tensorboard | tensorboard地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.21.0/examples/text_to_image/train_text_to_image.py | https://arxiv.org/abs/2303.09556. | SNR weighting gamma论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.21.0/examples/text_to_image/train_text_to_image.py | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.21.0/examples/text_to_image/train_text_to_image_flax.py | https://www.tensorflow.org/tensorboard | tensorboard地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.21.0/examples/text_to_image/train_text_to_image_lora.py | https://www.tensorflow.org/tensorboard | tensorboard地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.21.0/examples/text_to_image/train_text_to_image_lora.py | https://arxiv.org/abs/2303.09556. | SNR weighting gamma论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.21.0/examples/text_to_image/train_text_to_image_lora.py | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.21.0/examples/text_to_image/train_text_to_image_lora_sdxl.py | https://www.tensorflow.org/tensorboard | tensorboard地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.21.0/examples/text_to_image/train_text_to_image_lora_sdxl.py | https://arxiv.org/abs/2303.09556. | SNR weighting gamma论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.21.0/examples/text_to_image/train_text_to_image_lora_sdxl.py | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.21.0/examples/text_to_image/train_text_to_image_sdxl.py | https://www.tensorflow.org/tensorboard | tensorboard地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.21.0/examples/text_to_image/train_text_to_image_sdxl.py | https://arxiv.org/abs/2303.09556. 
| SNR weighting gamma论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.21.0/examples/text_to_image/train_text_to_image_sdxl.py | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.21.0/setup.py | patrick@huggingface.co | 作者邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.21.0/src/diffusers/loaders.py | https://raw.githubusercontent.com/lllyasviel/ControlNet/main/models/cldm_v15.yaml | 配置文件 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.21.0/src/diffusers/loaders.py | https://raw.githubusercontent.com/CompVis/stable-diffusion/main/configs/stable-diffusion/v1-inference.yaml | 配置文件 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.21.0/src/diffusers/models/controlnet_flax.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.21.0/src/diffusers/models/modeling_pytorch_flax_utils.py | https://pytorch.org/ | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.21.0/src/diffusers/models/unet_2d_condition_flax.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.21.0/src/diffusers/models/vae_flax.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.21.0/src/diffusers/models/vae_flax.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.21.0/src/diffusers/models/vae_flax.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.21.0/src/diffusers/pipelines/kandinsky/pipeline_kandinsky_combined.py | https://raw.githubusercontent.com/CompVis/stable-diffusion/main/assets/stable-samples/img2img/sketch-mountains-input.jpg | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.21.0/src/diffusers/pipelines/kandinsky2_2/pipeline_kandinsky2_2_combined.py | https://raw.githubusercontent.com/CompVis/stable-diffusion/main/assets/stable-samples/img2img/sketch-mountains-input.jpg | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.21.0/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | https://raw.githubusercontent.com/CompVis/stable-diffusion/main/configs/stable-diffusion/v1-inference.yaml | 配置文件 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.21.0/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | https://raw.githubusercontent.com/Stability-AI/stablediffusion/main/configs/stable-diffusion/v2-inference-v.yaml | 配置文件 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.21.0/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | https://raw.githubusercontent.com/Stability-AI/generative-models/main/configs/inference/sd_xl_refiner.yaml | 配置文件 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.21.0/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | https://raw.githubusercontent.com/Stability-AI/generative-models/main/configs/inference/sd_xl_base.yaml | 配置文件 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.21.0/src/diffusers/pipelines/stable_diffusion/pipeline_cycle_diffusion.py | https://raw.githubusercontent.com/ChenWu98/cycle-diffusion/main/data/dalle2/An%20astronaut%20riding%20a%20horse.png | 问题引导 | +| 
ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.21.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_image_variation.py | https://lh3.googleusercontent.com/y-iFOHfLTwkuQSUegpwDdgKmOjRSTvPxat63dQLB25xkTs4lhIbRUFeNBWZzYf370g=s1200 | 问题引导 |
+| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.21.0/src/diffusers/utils/dynamic_modules_utils.py | https://raw.githubusercontent.com/huggingface/diffusers/{revision}/examples/community/{pipeline}.py | 下载依赖 |
+| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.21.0/src/diffusers/utils/import_utils.py | https://pytorch.org/get-started/locally/ | 三方库链接 |
+| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.21.0/src/diffusers/utils/import_utils.py | https://librosa.org/doc/latest/install.html | 三方库链接 |
\ No newline at end of file
diff --git a/PyTorch/built-in/diffusion/diffusers0.25.0/public_address_statement.md b/PyTorch/built-in/diffusion/diffusers0.25.0/public_address_statement.md
index 93d7662a2f14d7a12a1271c95793b9a20f194886..06e4f751d337d5c40409e2b05bfd681e68e80869 100644
--- a/PyTorch/built-in/diffusion/diffusers0.25.0/public_address_statement.md
+++ b/PyTorch/built-in/diffusion/diffusers0.25.0/public_address_statement.md
@@ -1,1707 +1,54 @@
-| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 |
-| ------- |------|---------------------------------------------------------------------|--------------------------------------------------|---------|
-| 开源代码引入 | .\CITATION.cff | repository-code 'https //github.com/huggingface/diffusers' | 问题引导 |
-| 开源代码引入 | .\examples\controlnet\train_controlnet.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 |
-| 开源代码引入 | .\examples\controlnet\train_controlnet.py | "See https://huggingface.co/docs/diffusers/main/en/training/dreambooth#performing-inference-using-a-saved-checkpoint for step by step" //huggingface.co/docs/diffusers/main/en/training/dreambooth#performing-inference-using-a-saved-checkpoint for step by step" | 问题引导 |
-| 开源代码引入 | .\examples\controlnet\train_controlnet.py | "[TensorBoard](https://www.tensorflow.org/tensorboard) log directory. Will default to" //www.tensorflow.org/tensorboard) log directory. Will default to" | 问题引导 |
-| 开源代码引入 | .\examples\controlnet\train_controlnet.py | " https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices" //pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices" | 问题引导 |
-| 开源代码引入 | .\examples\controlnet\train_controlnet.py | " https://pytorch.org/docs/stable/generated/torch.optim.Optimizer.zero_grad.html" //pytorch.org/docs/stable/generated/torch.optim.Optimizer.zero_grad.html" | 问题引导 |
-| 开源代码引入 | .\examples\controlnet\train_controlnet.py | " https://huggingface.co/docs/datasets/image_dataset#imagefolder. In particular, a `metadata.jsonl` file" //huggingface.co/docs/datasets/image_dataset#imagefolder. In particular, a `metadata.jsonl` file" | 问题引导 |
-| 开源代码引入 | .\examples\controlnet\train_controlnet.py | " more information see https://huggingface.co/docs/accelerate/v0.17.0/en/package_reference/accelerator#accelerate.Accelerator" //huggingface.co/docs/accelerate/v0.17.0/en/package_reference/accelerator#accelerate.Accelerator" | 问题引导 |
-| 开源代码引入 | .\examples\controlnet\train_controlnet.py | # https //huggingface.co/docs/datasets/v2.0.0/en/dataset_script | 问题引导 |
-| 开源代码引入 | .\examples\controlnet\train_controlnet.py | "xFormers 0.0.16 cannot be used for training in some GPUs. If you observe problems during training, please update xFormers to at least 0.0.17. 
See https://huggingface.co/docs/diffusers/main/en/optimization/xformers for more details." //huggingface.co/docs/diffusers/main/en/optimization/xformers for more details." | 问题引导 | -| 开源代码引入 | .\examples\controlnet\train_controlnet.py | # cf https //pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 问题引导 | -| 开源代码引入 | .\examples\controlnet\train_controlnet_flax.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\examples\controlnet\train_controlnet_flax.py | # see more https //github.com/python-pillow/Pillow/issues/5610 | 问题引导 | -| 开源代码引入 | .\examples\controlnet\train_controlnet_flax.py | "More details here: https://arxiv.org/abs/2303.09556.", https //arxiv.org/abs/2303.09556.", | 问题引导 | -| 开源代码引入 | .\examples\controlnet\train_controlnet_flax.py | "Folder must contain a dataset script as described here https://huggingface.co/docs/datasets/dataset_script) ." //huggingface.co/docs/datasets/dataset_script) ." | 问题引导 | -| 开源代码引入 | .\examples\controlnet\train_controlnet_flax.py | "See more https://huggingface.co/docs/datasets/package_reference/main_classes#datasets.Dataset.load_from_disk" //huggingface.co/docs/datasets/package_reference/main_classes#datasets.Dataset.load_from_disk" | 问题引导 | -| 开源代码引入 | .\examples\controlnet\train_controlnet_flax.py | # https //github.com/borisdayma/dalle-mini/blob/d2be512d4a6a9cda2d63ba04afc33038f98f705f/src/dalle_mini/data.py#L370 | 问题引导 | -| 开源代码引入 | .\examples\controlnet\train_controlnet_flax.py | # https //huggingface.co/docs/datasets/v2.0.0/en/dataset_script | 问题引导 | -| 开源代码引入 | .\examples\controlnet\train_controlnet_flax.py | Computes SNR as per https //github.com/TiankaiHang/Min-SNR-Diffusion-Training/blob/521b624bd70c67cee4bdf49225915f5945a872e3/guided_diffusion/gaussian_diffusion.py#L847-L849 | 问题引导 | -| 开源代码引入 | .\examples\controlnet\train_controlnet_sdxl.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\examples\controlnet\train_controlnet_sdxl.py | help="Path to an improved VAE to stabilize training. For more details check out: https://github.com/huggingface/diffusers/pull/4038.", https //github.com/huggingface/diffusers/pull/4038.", | 问题引导 | -| 开源代码引入 | .\examples\controlnet\train_controlnet_sdxl.py | "See https://huggingface.co/docs/diffusers/main/en/training/dreambooth#performing-inference-using-a-saved-checkpoint for step by step" //huggingface.co/docs/diffusers/main/en/training/dreambooth#performing-inference-using-a-saved-checkpoint for step by step" | 问题引导 | -| 开源代码引入 | .\examples\controlnet\train_controlnet_sdxl.py | "[TensorBoard](https://www.tensorflow.org/tensorboard) log directory. Will default to" //www.tensorflow.org/tensorboard) log directory. Will default to" | 问题引导 | -| 开源代码引入 | .\examples\controlnet\train_controlnet_sdxl.py | " https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices" //pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices" | 问题引导 | -| 开源代码引入 | .\examples\controlnet\train_controlnet_sdxl.py | " https://pytorch.org/docs/stable/generated/torch.optim.Optimizer.zero_grad.html" //pytorch.org/docs/stable/generated/torch.optim.Optimizer.zero_grad.html" | 问题引导 | -| 开源代码引入 | .\examples\controlnet\train_controlnet_sdxl.py | " https://huggingface.co/docs/datasets/image_dataset#imagefolder. In particular, a `metadata.jsonl` file" //huggingface.co/docs/datasets/image_dataset#imagefolder. 
In particular, a `metadata.jsonl` file" | 问题引导 | -| 开源代码引入 | .\examples\controlnet\train_controlnet_sdxl.py | " more information see https://huggingface.co/docs/accelerate/v0.17.0/en/package_reference/accelerator#accelerate.Accelerator" //huggingface.co/docs/accelerate/v0.17.0/en/package_reference/accelerator#accelerate.Accelerator" | 问题引导 | -| 开源代码引入 | .\examples\controlnet\train_controlnet_sdxl.py | # https //huggingface.co/docs/datasets/v2.0.0/en/dataset_script | 问题引导 | -| 开源代码引入 | .\examples\controlnet\train_controlnet_sdxl.py | "xFormers 0.0.16 cannot be used for training in some GPUs. If you observe problems during training, please update xFormers to at least 0.0.17. See https://huggingface.co/docs/diffusers/main/en/optimization/xformers for more details." //huggingface.co/docs/diffusers/main/en/optimization/xformers for more details." | 问题引导 | -| 开源代码引入 | .\examples\controlnet\train_controlnet_sdxl.py | # cf https //pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 问题引导 | -| 开源代码引入 | .\examples\controlnet\train_controlnet_sdxl.py | # details https //github.com/huggingface/diffusers/pull/4038#discussion_r1266078401 | 问题引导 | -| 开源代码引入 | .\examples\dreambooth\train_dreambooth.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\examples\dreambooth\train_dreambooth.py | This is a dreambooth model derived from {base_model}. The weights were trained on {prompt} using [DreamBooth](https //dreambooth.github.io/). | 下载预训练模型 | -| 开源代码引入 | .\examples\dreambooth\train_dreambooth.py | "See https://huggingface.co/docs/diffusers/main/en/training/dreambooth#performing-inference-using-a-saved-checkpoint for step by step" //huggingface.co/docs/diffusers/main/en/training/dreambooth#performing-inference-using-a-saved-checkpoint for step by step" | 问题引导 | -| 开源代码引入 | .\examples\dreambooth\train_dreambooth.py | " See Accelerator::save_state https://huggingface.co/docs/accelerate/package_reference/accelerator#accelerate.Accelerator.save_state" save_state https | 问题引导 | -| 开源代码引入 | .\examples\dreambooth\train_dreambooth.py | "[TensorBoard](https://www.tensorflow.org/tensorboard) log directory. Will default to" //www.tensorflow.org/tensorboard) log directory. Will default to" | 问题引导 | -| 开源代码引入 | .\examples\dreambooth\train_dreambooth.py | " https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices" //pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices" | 问题引导 | -| 开源代码引入 | .\examples\dreambooth\train_dreambooth.py | " https://pytorch.org/docs/stable/generated/torch.optim.Optimizer.zero_grad.html" //pytorch.org/docs/stable/generated/torch.optim.Optimizer.zero_grad.html" | 问题引导 | -| 开源代码引入 | .\examples\dreambooth\train_dreambooth.py | " See: https://www.crosslabs.org//blog/diffusion-with-offset-noise for more information." https //www.crosslabs.org//blog/diffusion-with-offset-noise for more information." | 问题引导 | -| 开源代码引入 | .\examples\dreambooth\train_dreambooth.py | "xFormers 0.0.16 cannot be used for training in some GPUs. If you observe problems during training, please update xFormers to at least 0.0.17. See https://huggingface.co/docs/diffusers/main/en/optimization/xformers for more details." //huggingface.co/docs/diffusers/main/en/optimization/xformers for more details." 
| 问题引导 | -| 开源代码引入 | .\examples\dreambooth\train_dreambooth.py | # cf https //pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 问题引导 | -| 开源代码引入 | .\examples\dreambooth\train_dreambooth_flax.py | "[TensorBoard](https://www.tensorflow.org/tensorboard) log directory. Will default to" //www.tensorflow.org/tensorboard) log directory. Will default to" | 问题引导 | -| 开源代码引入 | .\examples\dreambooth\train_dreambooth_lora.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\examples\dreambooth\train_dreambooth_lora.py | These are LoRA adaption weights for {base_model}. The weights were trained on {prompt} using [DreamBooth](https //dreambooth.github.io/). You can find some example images in the following. \n | 下载预训练模型 | -| 开源代码引入 | .\examples\dreambooth\train_dreambooth_lora.py | "[TensorBoard](https://www.tensorflow.org/tensorboard) log directory. Will default to" //www.tensorflow.org/tensorboard) log directory. Will default to" | 问题引导 | -| 开源代码引入 | .\examples\dreambooth\train_dreambooth_lora.py | " https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices" //pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices" | 问题引导 | -| 开源代码引入 | .\examples\dreambooth\train_dreambooth_lora.py | "xFormers 0.0.16 cannot be used for training in some GPUs. If you observe problems during training, please update xFormers to at least 0.0.17. See https://huggingface.co/docs/diffusers/main/en/optimization/xformers for more details." //huggingface.co/docs/diffusers/main/en/optimization/xformers for more details." | 问题引导 | -| 开源代码引入 | .\examples\dreambooth\train_dreambooth_lora.py | # cf https //pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 问题引导 | -| 开源代码引入 | .\examples\dreambooth\train_dreambooth_lora_sdxl.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\examples\dreambooth\train_dreambooth_lora_sdxl.py | These are LoRA adaption weights for {base_model}. The weights were trained on {prompt} using [DreamBooth](https //dreambooth.github.io/). You can find some example images in the following. \n | 下载预训练模型 | -| 开源代码引入 | .\examples\dreambooth\train_dreambooth_lora_sdxl.py | help="Path to pretrained VAE model with better numerical stability. More details: https://github.com/huggingface/diffusers/pull/4038.", https //github.com/huggingface/diffusers/pull/4038.", | 问题引导 | -| 开源代码引入 | .\examples\dreambooth\train_dreambooth_lora_sdxl.py | "[TensorBoard](https://www.tensorflow.org/tensorboard) log directory. Will default to" //www.tensorflow.org/tensorboard) log directory. Will default to" | 问题引导 | -| 开源代码引入 | .\examples\dreambooth\train_dreambooth_lora_sdxl.py | " https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices" //pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices" | 问题引导 | -| 开源代码引入 | .\examples\dreambooth\train_dreambooth_lora_sdxl.py | "xFormers 0.0.16 cannot be used for training in some GPUs. If you observe problems during training, please update xFormers to at least 0.0.17. See https://huggingface.co/docs/diffusers/main/en/optimization/xformers for more details." //huggingface.co/docs/diffusers/main/en/optimization/xformers for more details." 
| 问题引导 | -| 开源代码引入 | .\examples\dreambooth\train_dreambooth_lora_sdxl.py | # cf https //pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image.py | " https://huggingface.co/docs/datasets/image_dataset#imagefolder. In particular, a `metadata.jsonl` file" //huggingface.co/docs/datasets/image_dataset#imagefolder. In particular, a `metadata.jsonl` file" | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image.py | "More details here: https://arxiv.org/abs/2303.09556.", https //arxiv.org/abs/2303.09556.", | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image.py | " https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices" //pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices" | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image.py | "[TensorBoard](https://www.tensorflow.org/tensorboard) log directory. Will default to" //www.tensorflow.org/tensorboard) log directory. Will default to" | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image.py | " more information see https://huggingface.co/docs/accelerate/v0.17.0/en/package_reference/accelerator#accelerate.Accelerator" //huggingface.co/docs/accelerate/v0.17.0/en/package_reference/accelerator#accelerate.Accelerator" | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image.py | "xFormers 0.0.16 cannot be used for training in some GPUs. If you observe problems during training, please update xFormers to at least 0.0.17. See https://huggingface.co/docs/diffusers/main/en/optimization/xformers for more details." //huggingface.co/docs/diffusers/main/en/optimization/xformers for more details." | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image.py | Computes SNR as per https //github.com/TiankaiHang/Min-SNR-Diffusion-Training/blob/521b624bd70c67cee4bdf49225915f5945a872e3/guided_diffusion/gaussian_diffusion.py#L847-L849 | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image.py | # Adapted from https //github.com/TiankaiHang/Min-SNR-Diffusion-Training/blob/521b624bd70c67cee4bdf49225915f5945a872e3/guided_diffusion/gaussian_diffusion.py#L1026 | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image.py | # cf https //pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image.py | # https //huggingface.co/docs/datasets/v2.4.0/en/image_load#imagefolder | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image.py | # https //www.crosslabs.org//blog/diffusion-with-offset-noise | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image.py | # Compute loss-weights as per Section 3.4 of https //arxiv.org/abs/2303.09556. | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_flax.py | " https://huggingface.co/docs/datasets/image_dataset#imagefolder. In particular, a `metadata.jsonl` file" //huggingface.co/docs/datasets/image_dataset#imagefolder. In particular, a `metadata.jsonl` file" | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_flax.py | "[TensorBoard](https://www.tensorflow.org/tensorboard) log directory. Will default to" //www.tensorflow.org/tensorboard) log directory. 
Will default to" | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_flax.py | # https //huggingface.co/docs/datasets/v2.4.0/en/image_load#imagefolder | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_lora.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_lora.py | " https://huggingface.co/docs/datasets/image_dataset#imagefolder. In particular, a `metadata.jsonl` file" //huggingface.co/docs/datasets/image_dataset#imagefolder. In particular, a `metadata.jsonl` file" | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_lora.py | "More details here: https://arxiv.org/abs/2303.09556.", https //arxiv.org/abs/2303.09556.", | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_lora.py | " https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices" //pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices" | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_lora.py | "[TensorBoard](https://www.tensorflow.org/tensorboard) log directory. Will default to" //www.tensorflow.org/tensorboard) log directory. Will default to" | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_lora.py | "xFormers 0.0.16 cannot be used for training in some GPUs. If you observe problems during training, please update xFormers to at least 0.0.17. See https://huggingface.co/docs/diffusers/main/en/optimization/xformers for more details." //huggingface.co/docs/diffusers/main/en/optimization/xformers for more details." | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_lora.py | Computes SNR as per https //github.com/TiankaiHang/Min-SNR-Diffusion-Training/blob/521b624bd70c67cee4bdf49225915f5945a872e3/guided_diffusion/gaussian_diffusion.py#L847-L849 | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_lora.py | # Adapted from https //github.com/TiankaiHang/Min-SNR-Diffusion-Training/blob/521b624bd70c67cee4bdf49225915f5945a872e3/guided_diffusion/gaussian_diffusion.py#L1026 | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_lora.py | # cf https //pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_lora.py | # https //huggingface.co/docs/datasets/v2.4.0/en/image_load#imagefolder | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_lora.py | # https //www.crosslabs.org//blog/diffusion-with-offset-noise | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_lora.py | # Compute loss-weights as per Section 3.4 of https //arxiv.org/abs/2303.09556. | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_lora_sdxl.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_lora_sdxl.py | help="Path to pretrained VAE model with better numerical stability. More details: https://github.com/huggingface/diffusers/pull/4038.", https //github.com/huggingface/diffusers/pull/4038.", | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_lora_sdxl.py | " https://huggingface.co/docs/datasets/image_dataset#imagefolder. In particular, a `metadata.jsonl` file" //huggingface.co/docs/datasets/image_dataset#imagefolder. 
In particular, a `metadata.jsonl` file" | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_lora_sdxl.py | "More details here: https://arxiv.org/abs/2303.09556.", https //arxiv.org/abs/2303.09556.", | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_lora_sdxl.py | " https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices" //pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices" | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_lora_sdxl.py | "[TensorBoard](https://www.tensorflow.org/tensorboard) log directory. Will default to" //www.tensorflow.org/tensorboard) log directory. Will default to" | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_lora_sdxl.py | "xFormers 0.0.16 cannot be used for training in some GPUs. If you observe problems during training, please update xFormers to at least 0.0.17. See https://huggingface.co/docs/diffusers/main/en/optimization/xformers for more details." //huggingface.co/docs/diffusers/main/en/optimization/xformers for more details." | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_lora_sdxl.py | Computes SNR as per https //github.com/TiankaiHang/Min-SNR-Diffusion-Training/blob/521b624bd70c67cee4bdf49225915f5945a872e3/guided_diffusion/gaussian_diffusion.py#L847-L849 | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_lora_sdxl.py | # Adapted from https //github.com/TiankaiHang/Min-SNR-Diffusion-Training/blob/521b624bd70c67cee4bdf49225915f5945a872e3/guided_diffusion/gaussian_diffusion.py#L1026 | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_lora_sdxl.py | # cf https //pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_lora_sdxl.py | # https //huggingface.co/docs/datasets/v2.4.0/en/image_load#imagefolder | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_lora_sdxl.py | # https //www.crosslabs.org//blog/diffusion-with-offset-noise | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_lora_sdxl.py | # Compute loss-weights as per Section 3.4 of https //arxiv.org/abs/2303.09556. | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_sdxl.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_sdxl.py | help="Path to pretrained VAE model with better numerical stability. More details: https://github.com/huggingface/diffusers/pull/4038.", https //github.com/huggingface/diffusers/pull/4038.", | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_sdxl.py | " https://huggingface.co/docs/datasets/image_dataset#imagefolder. In particular, a `metadata.jsonl` file" //huggingface.co/docs/datasets/image_dataset#imagefolder. In particular, a `metadata.jsonl` file" | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_sdxl.py | "More details here: https://arxiv.org/abs/2303.09556.", https //arxiv.org/abs/2303.09556.", | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_sdxl.py | " https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices" //pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices" | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_sdxl.py | "[TensorBoard](https://www.tensorflow.org/tensorboard) log directory. Will default to" //www.tensorflow.org/tensorboard) log directory. 
Will default to" | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_sdxl.py | "xFormers 0.0.16 cannot be used for training in some GPUs. If you observe problems during training, please update xFormers to at least 0.0.17. See https://huggingface.co/docs/diffusers/main/en/optimization/xformers for more details." //huggingface.co/docs/diffusers/main/en/optimization/xformers for more details." | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_sdxl.py | Computes SNR as per https //github.com/TiankaiHang/Min-SNR-Diffusion-Training/blob/521b624bd70c67cee4bdf49225915f5945a872e3/guided_diffusion/gaussian_diffusion.py#L847-L849 | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_sdxl.py | # Adapted from https //github.com/TiankaiHang/Min-SNR-Diffusion-Training/blob/521b624bd70c67cee4bdf49225915f5945a872e3/guided_diffusion/gaussian_diffusion.py#L1026 | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_sdxl.py | # cf https //pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_sdxl.py | # https //huggingface.co/docs/datasets/v2.4.0/en/image_load#imagefolder | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_sdxl.py | # details https //github.com/huggingface/diffusers/pull/4038#discussion_r1266078401 | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_sdxl.py | # https //www.crosslabs.org//blog/diffusion-with-offset-noise | 问题引导 | -| 开源代码引入 | .\examples\text_to_image\train_text_to_image_sdxl.py | # Compute loss-weights as per Section 3.4 of https //arxiv.org/abs/2303.09556. | 问题引导 | -| 开源代码引入 | .\setup.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\setup.py | Simple check list from AllenNLP repo https //github.com/allenai/allennlp/blob/main/setup.py | 问题引导 | -| 开源代码引入 | .\setup.py | twine upload dist/* -r pypitest --repository-url=https //test.pypi.org/legacy/ | 问题引导 | -| 开源代码引入 | .\setup.py | pip install -i https //testpypi.python.org/pypi diffusers | 问题引导 | -| 开源代码引入 | .\setup.py | pip install -i https //testpypi.python.org/pypi diffusers | 问题引导 | -| 开源代码引入 | .\setup.py | url="https://github.com/huggingface/diffusers", //github.com/huggingface/diffusers", | 问题引导 | -| 开源代码引入 | .\setup.py | # twine upload dist/* -r pypitest --repository-url=https //test.pypi.org/legacy/ | 问题引导 | -| 开源代码引入 | .\setup.py | # pip install -i https //testpypi.python.org/pypi diffusers | 问题引导 | -| 开源代码引入 | .\src\diffusers\commands\diffusers_cli.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\commands\env.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\commands\fp16_safetensors.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\commands\fp16_safetensors.py | " CLI](https://github.com/huggingface/diffusers/blob/main/src/diffusers/commands/fp16_safetensors.py)." //github.com/huggingface/diffusers/blob/main/src/diffusers/commands/fp16_safetensors.py)." | 问题引导 | -| 开源代码引入 | .\src\diffusers\commands\__init__.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\configuration_utils.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\configuration_utils.py | config attributes directly. 
See https //github.com/huggingface/diffusers/pull/3129 | 问题引导 | -| 开源代码引入 | .\src\diffusers\configuration_utils.py | https //pytorch.org/docs/stable/_modules/torch/nn/modules/module.html#Module | 问题引导 | -| 开源代码引入 | .\src\diffusers\configuration_utils.py | 'http //hostname' 'foo.bar | 问题引导 | -| 开源代码引入 | .\src\diffusers\configuration_utils.py | " listed on 'https://huggingface.co/models'\nIf this is a private repository, make sure to pass a" //huggingface.co/models'\nIf this is a private repository, make sure to pass a" | 问题引导 | -| 开源代码引入 | .\src\diffusers\configuration_utils.py | f" 'https://huggingface.co/{pretrained_model_name_or_path}' for available revisions." //huggingface.co/{pretrained_model_name_or_path}' for available revisions." | 下载预训练模型 | -| 开源代码引入 | .\src\diffusers\configuration_utils.py | " 'https://huggingface.co/docs/diffusers/installation#offline-mode'." //huggingface.co/docs/diffusers/installation#offline-mode'." | 问题引导 | -| 开源代码引入 | .\src\diffusers\configuration_utils.py | "'https://huggingface.co/models', make sure you don't have a local directory with the same name. " //huggingface.co/models', make sure you don't have a local directory with the same name. " | 问题引导 | -| 开源代码引入 | .\src\diffusers\dependency_versions_check.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\image_processor.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | [`attention_processor.py`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py) | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | dict](https //pytorch.org/tutorials/beginner/saving_loading_models.html#what-is-a-state-dict). | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | 'http //hostname' 'foo.bar | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | # See https //github.com/darkstorm2150/sd-scripts/blob/main/docs/train_network_README-en.md#execute-learning | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | warn_message = "The state_dict contains LoRA params corresponding to the text encoder which are not being used here. To use both UNet and text encoder related LoRA params, use [`pipe.load_lora_weights()`](https://huggingface.co/docs/diffusers/main/en/api/loaders#diffusers.loaders.LoraLoaderMixin.load_lora_weights)." //huggingface.co/docs/diffusers/main/en/api/loaders#diffusers.loaders.LoraLoaderMixin.load_lora_weights)." | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | dict](https //pytorch.org/tutorials/beginner/saving_loading_models.html#what-is-a-state-dict). | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | 'http //hostname' 'foo.bar | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | (for example from [civitAI](https //civitai.com/models/3036?modelVersionId=9857)) and then load the vector | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | [`CLIPTextModel`](https //huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel). | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | dict](https //pytorch.org/tutorials/beginner/saving_loading_models.html#what-is-a-state-dict). 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | 'http //hostname' 'foo.bar | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | # If the serialization format is new (introduced in https //github.com/huggingface/diffusers/pull/2918), | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | # If the serialization format is new (introduced in https //github.com/huggingface/diffusers/pull/2918), | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | [`pipe.fuse_lora()`](https //huggingface.co/docs/diffusers/main/en/api/loaders#diffusers.loaders.LoraLoaderMixin.fuse_lora). | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | `"https://huggingface.co//blob/main/.ckpt"`) on the Hub. //huggingface.co//blob/main/.ckpt"`) on the Hub. | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | 'http //hostname' 'foo.bar | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | [clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14) variant. If this | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | ... "https://huggingface.co/WarriorMama777/OrangeMixs/blob/main/Models/AbyssOrangeMix/AbyssOrangeMix.safetensors" //huggingface.co/WarriorMama777/OrangeMixs/blob/main/Models/AbyssOrangeMix/AbyssOrangeMix.safetensors" | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | ... "https://huggingface.co/runwayml/stable-diffusion-v1-5/blob/main/v1-5-pruned-emaonly.ckpt", //huggingface.co/runwayml/stable-diffusion-v1-5/blob/main/v1-5-pruned-emaonly.ckpt", | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | valid_url_prefixes = ["https://huggingface.co/", "huggingface.co/", "hf.co/", "https://hf.co/"] //huggingface.co/", "huggingface.co/", "hf.co/", "https://hf.co/"] //hf.co/"] | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | `"https://huggingface.co//blob/main/.ckpt"`) on the Hub. //huggingface.co//blob/main/.ckpt"`) on the Hub. | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | 'http //hostname' 'foo.bar | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | Image Synthesis with Latent Diffusion Models](https //arxiv.org/abs/2112.10752) paper. | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | url = "https://huggingface.co/stabilityai/sd-vae-ft-mse-original/blob/main/vae-ft-mse-840000-ema-pruned.safetensors" # can also be local file //huggingface.co/stabilityai/sd-vae-ft-mse-original/blob/main/vae-ft-mse-840000-ema-pruned.safetensors" # can also be local file | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | for prefix in ["https://huggingface.co/", "huggingface.co/", "hf.co/", "https://hf.co/"]: //huggingface.co/", "huggingface.co/", "hf.co/", "https://hf.co/"]: //hf.co/"]: | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | config_url = "https://raw.githubusercontent.com/CompVis/stable-diffusion/main/configs/stable-diffusion/v1-inference.yaml" //raw.githubusercontent.com/CompVis/stable-diffusion/main/configs/stable-diffusion/v1-inference.yaml" | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | `"https://huggingface.co//blob/main/.ckpt"`) on the Hub. //huggingface.co//blob/main/.ckpt"`) on the Hub. 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | 'http //hostname' 'foo.bar | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | url = "https://huggingface.co/lllyasviel/ControlNet-v1-1/blob/main/control_v11p_sd15_canny.pth" # can also be a local path //huggingface.co/lllyasviel/ControlNet-v1-1/blob/main/control_v11p_sd15_canny.pth" # can also be a local path | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | url = "https://huggingface.co/runwayml/stable-diffusion-v1-5/blob/main/v1-5-pruned.safetensors" # can also be a local path //huggingface.co/runwayml/stable-diffusion-v1-5/blob/main/v1-5-pruned.safetensors" # can also be a local path | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | for prefix in ["https://huggingface.co/", "huggingface.co/", "hf.co/", "https://hf.co/"]: //huggingface.co/", "huggingface.co/", "hf.co/", "https://hf.co/"]: //hf.co/"]: | 问题引导 | -| 开源代码引入 | .\src\diffusers\loaders.py | config_url = "https://raw.githubusercontent.com/lllyasviel/ControlNet/main/models/cldm_v15.yaml" //raw.githubusercontent.com/lllyasviel/ControlNet/main/models/cldm_v15.yaml" | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\adapter.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\adapter.py | map](https //hf.co/docs/accelerate/main/en/usage_guides/big_modeling#designing-a-device-map). | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\adapter.py | [Adapter](https //github.com/TencentARC/T2I-Adapter/blob/686de4681515662c0ac2ffa07bf5dda83af1038a/ldm/modules/encoders/adapter.py#L97) | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\adapter.py | [AdapterLight](https //github.com/TencentARC/T2I-Adapter/blob/686de4681515662c0ac2ffa07bf5dda83af1038a/ldm/modules/encoders/adapter.py#L235). | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\attention.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\attention.py | A variant of the gated linear unit activation function from https //arxiv.org/abs/2002.05202. | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\attention.py | For more details, see section 2 https //arxiv.org/abs/1606.08415 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\attention_flax.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\attention_flax.py | Flax Memory-efficient multi-head dot product attention. https //arxiv.org/abs/2112.05682v2 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\attention_flax.py | https //github.com/AminRezaei0x443/memory-efficient-attention | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\attention_flax.py | A Flax multi-head attention module as described in https //arxiv.org/abs/1706.03762 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\attention_flax.py | enable memory efficient attention https //arxiv.org/abs/2112.05682 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\attention_flax.py | https //arxiv.org/abs/1706.03762 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\attention_flax.py | enable memory efficient attention https //arxiv.org/abs/2112.05682 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\attention_flax.py | https //arxiv.org/pdf/1506.02025.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\attention_flax.py | enable memory efficient attention https //arxiv.org/abs/2112.05682 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\attention_flax.py | https //arxiv.org/abs/2002.05202 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\attention_flax.py | https //arxiv.org/abs/2002.05202. 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\models\attention_processor.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\attention_processor.py | "Refer to https://github.com/facebookresearch/xformers for more information on how to install" //github.com/facebookresearch/xformers for more information on how to install" | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\attention_processor.py | [operator](https //facebookresearch.github.io/xformers/components/ops.html#xformers.ops.AttentionOpBase) to | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\attention_processor.py | [operator](https //facebookresearch.github.io/xformers/components/ops.html#xformers.ops.AttentionOpBase) to | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\attention_processor.py | [operator](https //facebookresearch.github.io/xformers/components/ops.html#xformers.ops.AttentionOpBase) to use | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\attention_processor.py | Spatially conditioned normalization as defined in https //arxiv.org/abs/2209.09002 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\attention_processor.py | [operator](https //facebookresearch.github.io/xformers/components/ops.html#xformers.ops.AttentionOpBase) to | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\autoencoder_asym_kl.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\autoencoder_asym_kl.py | Designing a Better Asymmetric VQGAN for StableDiffusion https //arxiv.org/abs/2306.04632 . A VAE model with KL loss | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\autoencoder_asym_kl.py | Synthesis with Latent Diffusion Models](https //arxiv.org/abs/2112.10752) paper. | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\autoencoder_kl.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\autoencoder_kl.py | Synthesis with Latent Diffusion Models](https //arxiv.org/abs/2112.10752) paper. | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\autoencoder_kl.py | `force_upcast` can be set to `False` - see https //huggingface.co/madebyollin/sdxl-vae-fp16-fix | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\autoencoder_tiny.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\autoencoder_tiny.py | Synthesis with Latent Diffusion Models](https //arxiv.org/abs/2112.10752) paper. For this Autoencoder, | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\autoencoder_tiny.py | [AutoEncoder](https //huggingface.co/madebyollin/sdxl-vae-fp16-fix)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\controlnet.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\controlnet.py | Quoting from https //arxiv.org/abs/2302.05543 "Stable Diffusion uses a pre-processing method similar to VQ-GAN | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\controlnet.py | # when this library was created. 
The incorrect naming was only discovered much later in https //github.com/huggingface/diffusers/issues/2011#issuecomment-1547958131 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\controlnet_flax.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\controlnet_flax.py | This model is also a Flax Linen [`flax.linen.Module`](https //flax.readthedocs.io/en/latest/flax.linen.html#module) | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\controlnet_flax.py | - [Just-In-Time (JIT) compilation](https //jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit) | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\controlnet_flax.py | - [Automatic Differentiation](https //jax.readthedocs.io/en/latest/jax.html#automatic-differentiation) | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\controlnet_flax.py | - [Vectorization](https //jax.readthedocs.io/en/latest/jax.html#vectorization-vmap) | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\controlnet_flax.py | - [Parallelization](https //jax.readthedocs.io/en/latest/jax.html#parallelization-pmap) | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\controlnet_flax.py | # when this library was created. The incorrect naming was only discovered much later in https //github.com/huggingface/diffusers/issues/2011#issuecomment-1547958131 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\dual_transformer_2d.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\embeddings.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\embeddings.py | For more details, see figure 10 of the dall-e paper https //arxiv.org/abs/2102.12092 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\embeddings.py | # Copied from https //github.com/deep-floyd/IF/blob/2f91391f27dd3c468bf174be5805b4cc92980c0b/deepfloyd_if/model/nn.py#L54 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\embeddings_flax.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\embeddings_flax.py | Wrapper Module for sinusoidal Time step Embeddings as described in https //arxiv.org/abs/2006.11239 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\lora.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\lora.py | # See https //github.com/darkstorm2150/sd-scripts/blob/main/docs/train_network_README-en.md#execute-learning | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\lora.py | # # see https //github.com/bmaltais/kohya_ss/blob/2accb1305979ba62f5077a23aabac23b4c37e935/networks/lora_diffusers.py#L129 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\lora.py | # See https //github.com/darkstorm2150/sd-scripts/blob/main/docs/train_network_README-en.md#execute-learning | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\lora.py | # see https //github.com/huggingface/diffusers/pull/4315 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\modeling_flax_pytorch_utils.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\modeling_flax_pytorch_utils.py | # Adapted from https //github.com/huggingface/transformers/blob/c603c80f46881ae18b2ca50770ef65fa4033eacd/src/transformers/modeling_flax_pytorch_utils.py#L69 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\modeling_flax_pytorch_utils.py | # and https //github.com/patil-suraj/stable-diffusion-jax/blob/main/stable_diffusion_jax/convert_diffusers_to_jax.py | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\modeling_flax_utils.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | 
.\src\diffusers\models\modeling_flax_utils.py | # taken from https //github.com/deepmind/jmp/blob/3a8318abc3292be38582794dbf7b094e6583b192/jmp/_src/policy.py#L27 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\modeling_flax_utils.py | 'http //hostname' 'foo.bar | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\modeling_flax_utils.py | "listed on 'https://huggingface.co/models'\nIf this is a private repository, make sure to pass a " //huggingface.co/models'\nIf this is a private repository, make sure to pass a " | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\modeling_flax_utils.py | f"'https://huggingface.co/{pretrained_model_name_or_path}' for available revisions." //huggingface.co/{pretrained_model_name_or_path}' for available revisions." | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\modeling_flax_utils.py | " 'https://huggingface.co/docs/transformers/installation#offline-mode'." //huggingface.co/docs/transformers/installation#offline-mode'." | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\modeling_flax_utils.py | "'https://huggingface.co/models', make sure you don't have a local directory with the same name. " //huggingface.co/models', make sure you don't have a local directory with the same name. " | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\modeling_flax_utils.py | # https //github.com/google/flax/issues/1261 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\modeling_pytorch_flax_utils.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\modeling_pytorch_flax_utils.py | # from https //github.com/huggingface/transformers/blob/main/src/transformers/modeling_flax_pytorch_utils.py#L224-L352 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\modeling_pytorch_flax_utils.py | " https://pytorch.org/ and https://flax.readthedocs.io/en/latest/installation.html for installation" //pytorch.org/ and https //flax.readthedocs.io/en/latest/installation.html for installation" | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\modeling_utils.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\modeling_utils.py | f"Cannot load {model_name_or_path_str}because {param_name} expected shape {empty_state_dict[param_name]}, but got {param.shape}. If you want to instead overwrite randomly initialized weights, please make sure to pass both `low_cpu_mem_usage=False` and `ignore_mismatched_sizes=True`. For more information, see also: https://github.com/huggingface/diffusers/issues/1619#issuecomment-1345604389 as an example." https //github.com/huggingface/diffusers/issues/1619#issuecomment-1345604389 as an example." | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\modeling_utils.py | config attributes directly. See https //github.com/huggingface/diffusers/pull/3129 We need to overwrite | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\modeling_utils.py | https //pytorch.org/docs/stable/_modules/torch/nn/modules/module.html#Module | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\modeling_utils.py | # call PyTorch's https //pytorch.org/docs/stable/_modules/torch/nn/modules/module.html#Module | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\modeling_utils.py | Enable memory efficient attention from [xFormers](https //facebookresearch.github.io/xformers/). 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\models\modeling_utils.py | [`memory_efficient_attention()`](https //facebookresearch.github.io/xformers/components/ops.html#xformers.ops.memory_efficient_attention) | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\modeling_utils.py | Disable memory efficient attention from [xFormers](https //facebookresearch.github.io/xformers/). | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\modeling_utils.py | 'http //hostname' 'foo.bar | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\modeling_utils.py | map](https //hf.co/docs/accelerate/main/en/usage_guides/big_modeling#designing-a-device-map). | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\modeling_utils.py | To use private or [gated models](https //huggingface.co/docs/hub/models-gated#gated-models), log-in with | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\modeling_utils.py | ["offline-mode"](https://huggingface.co/diffusers/installation.html#offline-mode) to use this method in a //huggingface.co/diffusers/installation.html#offline-mode) to use this method in a | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\prior_transformer.py | https //arxiv.org/abs/2204.06125 If it is `None`, no additional embeddings will be prepended. | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\resnet.py | # https //github.com/pytorch/pytorch/issues/86679 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\resnet.py | # upsample_nearest_nhwc fails with large batch sizes. see https //github.com/huggingface/diffusers/issues/984 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\resnet.py | # upsample_nearest_nhwc fails with large batch sizes. see https //github.com/huggingface/diffusers/issues/984 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\resnet.py | https //github.com/modelscope/modelscope/blob/1509fdb973e5871f37148a4b5e5964cafd43e64d/modelscope/models/multi_modal/video_synthesis/unet_sd.py#L1016 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\resnet_flax.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\t5_film_transformer.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\t5_film_transformer.py | # Square Layer Normalization https //arxiv.org/abs/1910.07467 thus variance is calculated | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\t5_film_transformer.py | the Gaussian Error Linear Units paper https //arxiv.org/abs/1606.08415 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\transformer_2d.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\transformer_temporal.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\unet_1d.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\unet_1d_blocks.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\unet_2d.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\unet_2d_blocks.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\unet_2d_blocks_flax.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\unet_2d_blocks_flax.py | https //arxiv.org/abs/2103.06104 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\unet_2d_blocks_flax.py | enable memory efficient attention https //arxiv.org/abs/2112.05682 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\unet_2d_blocks_flax.py | https //arxiv.org/abs/2103.06104 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\unet_2d_blocks_flax.py | enable memory efficient 
attention https //arxiv.org/abs/2112.05682 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\unet_2d_blocks_flax.py | Cross Attention 2D Mid-level block - original architecture from Unet transformers https //arxiv.org/abs/2103.06104 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\unet_2d_blocks_flax.py | enable memory efficient attention https //arxiv.org/abs/2112.05682 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\unet_2d_condition.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\unet_2d_condition.py | "At the moment it is not possible to define the number of attention heads via `num_attention_heads` because of a naming issue as described in https://github.com/huggingface/diffusers/issues/2011#issuecomment-1547958131. Passing `num_attention_heads` will only be supported in diffusers v0.19." //github.com/huggingface/diffusers/issues/2011#issuecomment-1547958131. Passing `num_attention_heads` will only be supported in diffusers v0.19." | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\unet_2d_condition.py | # when this library was created. The incorrect naming was only discovered much later in https //github.com/huggingface/diffusers/issues/2011#issuecomment-1547958131 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\unet_2d_condition_flax.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\unet_2d_condition_flax.py | This model is also a Flax Linen [flax.linen.Module](https //flax.readthedocs.io/en/latest/flax.linen.html#module) | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\unet_2d_condition_flax.py | - [Just-In-Time (JIT) compilation](https //jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit) | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\unet_2d_condition_flax.py | - [Automatic Differentiation](https //jax.readthedocs.io/en/latest/jax.html#automatic-differentiation) | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\unet_2d_condition_flax.py | - [Vectorization](https //jax.readthedocs.io/en/latest/jax.html#vectorization-vmap) | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\unet_2d_condition_flax.py | - [Parallelization](https //jax.readthedocs.io/en/latest/jax.html#parallelization-pmap) | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\unet_2d_condition_flax.py | Enable memory efficient attention as described [here](https //arxiv.org/abs/2112.05682). | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\unet_2d_condition_flax.py | "At the moment it is not possible to define the number of attention heads via `num_attention_heads` because of a naming issue as described in https://github.com/huggingface/diffusers/issues/2011#issuecomment-1547958131. Passing `num_attention_heads` will only be supported in diffusers v0.19." //github.com/huggingface/diffusers/issues/2011#issuecomment-1547958131. Passing `num_attention_heads` will only be supported in diffusers v0.19." | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\unet_2d_condition_flax.py | # when this library was created. 
The incorrect naming was only discovered much later in https //github.com/huggingface/diffusers/issues/2011#issuecomment-1547958131 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\unet_3d_blocks.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\unet_3d_condition.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\unet_3d_condition.py | "At the moment it is not possible to define the number of attention heads via `num_attention_heads` because of a naming issue as described in https://github.com/huggingface/diffusers/issues/2011#issuecomment-1547958131. Passing `num_attention_heads` will only be supported in diffusers v0.19." //github.com/huggingface/diffusers/issues/2011#issuecomment-1547958131. Passing `num_attention_heads` will only be supported in diffusers v0.19." | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\unet_3d_condition.py | # when this library was created. The incorrect naming was only discovered much later in https //github.com/huggingface/diffusers/issues/2011#issuecomment-1547958131 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\unet_3d_condition.py | chunking](https //huggingface.co/blog/reformer#2-chunked-feed-forward-layers). | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\vae.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\vae_flax.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\vae_flax.py | # JAX implementation of VQGAN from taming-transformers https //github.com/CompVis/taming-transformers | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\vae_flax.py | This model is a Flax Linen [flax.linen.Module](https //flax.readthedocs.io/en/latest/flax.linen.html#module) | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\vae_flax.py | - [Just-In-Time (JIT) compilation](https //jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit) | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\vae_flax.py | - [Automatic Differentiation](https //jax.readthedocs.io/en/latest/jax.html#automatic-differentiation) | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\vae_flax.py | - [Vectorization](https //jax.readthedocs.io/en/latest/jax.html#vectorization-vmap) | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\vae_flax.py | - [Parallelization](https //jax.readthedocs.io/en/latest/jax.html#parallelization-pmap) | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\vae_flax.py | This model is a Flax Linen [flax.linen.Module](https //flax.readthedocs.io/en/latest/flax.linen.html#module) | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\vae_flax.py | - [Just-In-Time (JIT) compilation](https //jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit) | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\vae_flax.py | - [Automatic Differentiation](https //jax.readthedocs.io/en/latest/jax.html#automatic-differentiation) | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\vae_flax.py | - [Vectorization](https //jax.readthedocs.io/en/latest/jax.html#vectorization-vmap) | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\vae_flax.py | - [Parallelization](https //jax.readthedocs.io/en/latest/jax.html#parallelization-pmap) | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\vae_flax.py | This model is a Flax Linen [flax.linen.Module](https //flax.readthedocs.io/en/latest/flax.linen.html#module) | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\vae_flax.py | - [Just-In-Time (JIT) compilation](https //jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit) | 问题引导 | -| 开源代码引入 | 
.\src\diffusers\models\vae_flax.py | - [Automatic Differentiation](https //jax.readthedocs.io/en/latest/jax.html#automatic-differentiation) | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\vae_flax.py | - [Vectorization](https //jax.readthedocs.io/en/latest/jax.html#vectorization-vmap) | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\vae_flax.py | - [Parallelization](https //jax.readthedocs.io/en/latest/jax.html#parallelization-pmap) | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\vae_flax.py | Synthesis with Latent Diffusion Models](https //arxiv.org/abs/2112.10752) paper. | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\vq_model.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\vq_model.py | Synthesis with Latent Diffusion Models](https //arxiv.org/abs/2112.10752) paper. | 问题引导 | -| 开源代码引入 | .\src\diffusers\models\__init__.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\optimization.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\optimization.py | https //github.com/google-research/bert/blob/f39e881b169b9d53bea03d2d341b31707a6c052b/optimization.py#L37 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\alt_diffusion\pipeline_alt_diffusion.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\alt_diffusion\pipeline_alt_diffusion.py | Sample Steps are Flawed](https //arxiv.org/pdf/2305.08891.pdf). See Section 3.4 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\alt_diffusion\pipeline_alt_diffusion.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\alt_diffusion\pipeline_alt_diffusion.py | Please refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for more details | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\alt_diffusion\pipeline_alt_diffusion.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\alt_diffusion\pipeline_alt_diffusion.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\alt_diffusion\pipeline_alt_diffusion.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\alt_diffusion\pipeline_alt_diffusion.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\alt_diffusion\pipeline_alt_diffusion.py | [`self.processor`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\alt_diffusion\pipeline_alt_diffusion.py | Flawed](https //arxiv.org/pdf/2305.08891.pdf). Guidance rescale factor should fix overexposure when | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\alt_diffusion\pipeline_alt_diffusion.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\alt_diffusion\pipeline_alt_diffusion.py | # Based on 3.4. 
in https //arxiv.org/pdf/2305.08891.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\alt_diffusion\pipeline_alt_diffusion_img2img.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\alt_diffusion\pipeline_alt_diffusion_img2img.py | >>> url = "https://raw.githubusercontent.com/CompVis/stable-diffusion/main/assets/stable-samples/img2img/sketch-mountains-input.jpg" //raw.githubusercontent.com/CompVis/stable-diffusion/main/assets/stable-samples/img2img/sketch-mountains-input.jpg" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\alt_diffusion\pipeline_alt_diffusion_img2img.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\alt_diffusion\pipeline_alt_diffusion_img2img.py | Please refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for more details | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\alt_diffusion\pipeline_alt_diffusion_img2img.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\alt_diffusion\pipeline_alt_diffusion_img2img.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\alt_diffusion\pipeline_alt_diffusion_img2img.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\alt_diffusion\pipeline_alt_diffusion_img2img.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\alt_diffusion\pipeline_alt_diffusion_img2img.py | [`self.processor`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\alt_diffusion\pipeline_alt_diffusion_img2img.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\audioldm\pipeline_audioldm.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\audioldm\pipeline_audioldm.py | [laion/clap-htsat-unfused](https //huggingface.co/laion/clap-htsat-unfused) variant. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\audioldm\pipeline_audioldm.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\audioldm\pipeline_audioldm.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\audioldm\pipeline_audioldm.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\audioldm\pipeline_audioldm.py | [`self.processor`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\audioldm\pipeline_audioldm.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . 
`guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\audioldm2\modeling_audioldm2.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\audioldm2\modeling_audioldm2.py | "At the moment it is not possible to define the number of attention heads via `num_attention_heads` because of a naming issue as described in https://github.com/huggingface/diffusers/issues/2011#issuecomment-1547958131. Passing `num_attention_heads` will only be supported in diffusers v0.19." //github.com/huggingface/diffusers/issues/2011#issuecomment-1547958131. Passing `num_attention_heads` will only be supported in diffusers v0.19." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\audioldm2\modeling_audioldm2.py | # when this library was created. The incorrect naming was only discovered much later in https //github.com/huggingface/diffusers/issues/2011#issuecomment-1547958131 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\audioldm2\pipeline_audioldm2.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\audioldm2\pipeline_audioldm2.py | [CLAP](https //huggingface.co/docs/transformers/model_doc/clap#transformers.CLAPTextModelWithProjection), | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\audioldm2\pipeline_audioldm2.py | specifically the [laion/clap-htsat-unfused](https //huggingface.co/laion/clap-htsat-unfused) variant. The | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\audioldm2\pipeline_audioldm2.py | [T5](https //huggingface.co/docs/transformers/model_doc/t5#transformers.T5EncoderModel), specifically the | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\audioldm2\pipeline_audioldm2.py | [google/flan-t5-large](https //huggingface.co/google/flan-t5-large) variant. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\audioldm2\pipeline_audioldm2.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\audioldm2\pipeline_audioldm2.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\audioldm2\pipeline_audioldm2.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\audioldm2\pipeline_audioldm2.py | [`self.processor`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\audioldm2\pipeline_audioldm2.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\audio_diffusion\mel.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\audio_diffusion\mel.py | An audio file that must be on disk due to [Librosa](https //librosa.org/) limitation. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\audio_diffusion\pipeline_audio_diffusion.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\audio_diffusion\pipeline_audio_diffusion.py | An audio file that must be on disk due to [Librosa](https //librosa.org/) limitation. 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\audio_diffusion\pipeline_audio_diffusion.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\audio_diffusion\pipeline_audio_diffusion.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) used to denoise. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\audio_diffusion\pipeline_audio_diffusion.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\auto_pipeline.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\auto_pipeline.py | 'http //hostname' 'foo.bar | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\auto_pipeline.py | map](https //hf.co/docs/accelerate/main/en/usage_guides/big_modeling#designing-a-device-map). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\auto_pipeline.py | To use private or [gated](https //huggingface.co/docs/hub/models-gated#gated-models) models, log-in with | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\auto_pipeline.py | 'http //hostname' 'foo.bar | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\auto_pipeline.py | map](https //hf.co/docs/accelerate/main/en/usage_guides/big_modeling#designing-a-device-map). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\auto_pipeline.py | To use private or [gated](https //huggingface.co/docs/hub/models-gated#gated-models) models, log-in with | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\auto_pipeline.py | 'http //hostname' 'foo.bar | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\auto_pipeline.py | map](https //hf.co/docs/accelerate/main/en/usage_guides/big_modeling#designing-a-device-map). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\auto_pipeline.py | To use private or [gated](https //huggingface.co/docs/hub/models-gated#gated-models) models, log-in with | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\consistency_models\pipeline_consistency_models.py | >>> # https //github.com/openai/consistency_models/blob/main/scripts/launch.sh#L77 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\consistency_models\pipeline_consistency_models.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\multicontrolnet.py | map](https //hf.co/docs/accelerate/main/en/usage_guides/big_modeling#designing-a-device-map). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet.py | ... "https://hf.co/datasets/huggingface/documentation-images/resolve/main/diffusers/input_image_vermeer.png" //hf.co/datasets/huggingface/documentation-images/resolve/main/diffusers/input_image_vermeer.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet.py | Please refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for more details | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet.py | [`self.processor`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_img2img.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_img2img.py | ... "https://hf.co/datasets/huggingface/documentation-images/resolve/main/diffusers/input_image_vermeer.png" //hf.co/datasets/huggingface/documentation-images/resolve/main/diffusers/input_image_vermeer.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_img2img.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_img2img.py | Please refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for more details | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_img2img.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_img2img.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_img2img.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_img2img.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_img2img.py | [`self.processor`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_img2img.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint.py | # This model implementation is heavily inspired by https //github.com/haofanwang/ControlNet-for-Diffusers/ | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint.py | ... 
"https://huggingface.co/datasets/diffusers/test-arrays/resolve/main/stable_diffusion_inpaint/boy.png" //huggingface.co/datasets/diffusers/test-arrays/resolve/main/stable_diffusion_inpaint/boy.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint.py | ... "https://huggingface.co/datasets/diffusers/test-arrays/resolve/main/stable_diffusion_inpaint/boy_mask.png" //huggingface.co/datasets/diffusers/test-arrays/resolve/main/stable_diffusion_inpaint/boy_mask.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint.py | ([runwayml/stable-diffusion-inpainting](https //huggingface.co/runwayml/stable-diffusion-inpainting)) as well as | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint.py | ([runwayml/stable-diffusion-v1-5](https //huggingface.co/runwayml/stable-diffusion-v1-5)). Default text-to-image | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint.py | [lllyasviel/control_v11p_sd15_inpaint](https //huggingface.co/lllyasviel/control_v11p_sd15_inpaint). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint.py | Please refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for more details | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint.py | [`self.processor`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint_sd_xl.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint_sd_xl.py | ... "https://huggingface.co/datasets/diffusers/test-arrays/resolve/main/stable_diffusion_inpaint/boy.png" //huggingface.co/datasets/diffusers/test-arrays/resolve/main/stable_diffusion_inpaint/boy.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint_sd_xl.py | ... "https://huggingface.co/datasets/diffusers/test-arrays/resolve/main/stable_diffusion_inpaint/boy_mask.png" //huggingface.co/datasets/diffusers/test-arrays/resolve/main/stable_diffusion_inpaint/boy_mask.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint_sd_xl.py | Sample Steps are Flawed](https //arxiv.org/pdf/2305.08891.pdf). 
See Section 3.4 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint_sd_xl.py | [CLIP](https //huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel), specifically | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint_sd_xl.py | the [clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14) variant. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint_sd_xl.py | [CLIP](https //huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModelWithProjection), | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint_sd_xl.py | [laion/CLIP-ViT-bigG-14-laion2B-39B-b160k](https //huggingface.co/laion/CLIP-ViT-bigG-14-laion2B-39B-b160k) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint_sd_xl.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint_sd_xl.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint_sd_xl.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint_sd_xl.py | Output**](https //huggingface.co/docs/diffusers/api/pipelines/stable_diffusion/stable_diffusion_xl#refining-the-image-output). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint_sd_xl.py | Output**](https //huggingface.co/docs/diffusers/api/pipelines/stable_diffusion/stable_diffusion_xl#refining-the-image-output). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint_sd_xl.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint_sd_xl.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint_sd_xl.py | Corresponds to parameter eta (畏) in the DDIM paper https //arxiv.org/abs/2010.02502. Only applies to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint_sd_xl.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint_sd_xl.py | [PIL](https //pillow.readthedocs.io/en/stable/) `PIL.Image.Image` or `np.array`. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint_sd_xl.py | [diffusers.models.attention_processor](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint_sd_xl.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint_sd_xl.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint_sd_xl.py | section 2.2 of [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint_sd_xl.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint_sd_xl.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). Can be used to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint_sd_xl.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_inpaint_sd_xl.py | # Based on 3.4. in https //arxiv.org/pdf/2305.08891.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl.py | ... "https://hf.co/datasets/hf-internal-testing/diffusers-images/resolve/main/sd_controlnet/hf-logo.png" //hf.co/datasets/hf-internal-testing/diffusers-images/resolve/main/sd_controlnet/hf-logo.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl.py | ([laion/CLIP-ViT-bigG-14-laion2B-39B-b160k](https //huggingface.co/laion/CLIP-ViT-bigG-14-laion2B-39B-b160k)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl.py | Whether to use the [invisible_watermark](https //github.com/ShieldMnt/invisible-watermark/) library to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl.py | [stabilityai/stable-diffusion-xl-base-1.0](https //huggingface.co/stabilityai/stable-diffusion-xl-base-1.0) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl.py | [stabilityai/stable-diffusion-xl-base-1.0](https //huggingface.co/stabilityai/stable-diffusion-xl-base-1.0) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl.py | [`self.processor`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl.py | section 2.2 of [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). For more | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl.py | information, refer to this issue thread https //github.com/huggingface/diffusers/issues/4208. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). For more | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl.py | information, refer to this issue thread https //github.com/huggingface/diffusers/issues/4208. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). For more | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl.py | information, refer to this issue thread https //github.com/huggingface/diffusers/issues/4208. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl_img2img.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl_img2img.py | ... "https://huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" //huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl_img2img.py | [CLIP](https //huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel), specifically | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl_img2img.py | the [clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14) variant. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl_img2img.py | [CLIP](https //huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModelWithProjection), | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl_img2img.py | [laion/CLIP-ViT-bigG-14-laion2B-39B-b160k](https //huggingface.co/laion/CLIP-ViT-bigG-14-laion2B-39B-b160k) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl_img2img.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl_img2img.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer). 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl_img2img.py | Whether to use the [invisible_watermark library](https //github.com/ShieldMnt/invisible-watermark/) to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl_img2img.py | # eta corresponds to η in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl_img2img.py | [stabilityai/stable-diffusion-xl-base-1.0](https //huggingface.co/stabilityai/stable-diffusion-xl-base-1.0) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl_img2img.py | [stabilityai/stable-diffusion-xl-base-1.0](https //huggingface.co/stabilityai/stable-diffusion-xl-base-1.0) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl_img2img.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl_img2img.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl_img2img.py | Corresponds to parameter eta (η) in the DDIM paper https //arxiv.org/abs/2010.02502. Only applies to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl_img2img.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl_img2img.py | [PIL](https //pillow.readthedocs.io/en/stable/) `PIL.Image.Image` or `np.array`. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl_img2img.py | [diffusers.models.attention_processor](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl_img2img.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl_img2img.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl_img2img.py | section 2.2 of [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl_img2img.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). For more | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl_img2img.py | information, refer to this issue thread https //github.com/huggingface/diffusers/issues/4208. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl_img2img.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). For more | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl_img2img.py | information, refer to this issue thread https //github.com/huggingface/diffusers/issues/4208. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl_img2img.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). 
For more | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl_img2img.py | information, refer to this issue thread https //github.com/huggingface/diffusers/issues/4208. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl_img2img.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl_img2img.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). Can be used to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_controlnet_sd_xl_img2img.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_flax_controlnet.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_flax_controlnet.py | ... "https://huggingface.co/datasets/YiYiXu/test-doc-assets/resolve/main/blog_post_cell_10_output_0.jpeg" //huggingface.co/datasets/YiYiXu/test-doc-assets/resolve/main/blog_post_cell_10_output_0.jpeg" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_flax_controlnet.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_flax_controlnet.py | Please refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for more details | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\controlnet\pipeline_flax_controlnet.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\dance_diffusion\pipeline_dance_diffusion.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\dance_diffusion\pipeline_dance_diffusion.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\ddim\pipeline_ddim.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\ddim\pipeline_ddim.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\ddim\pipeline_ddim.py | Corresponds to parameter eta (η) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\ddpm\pipeline_ddpm.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\ddpm\pipeline_ddpm.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if.py | # eta corresponds to η in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). 
Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if.py | Corresponds to parameter eta (η) in the DDIM paper https //arxiv.org/abs/2010.02502. Only applies to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if.py | [PIL](https //pillow.readthedocs.io/en/stable/) `PIL.Image.Image` or `np.array`. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if.py | [diffusers.models.attention_processor](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_img2img.py | >>> url = "https://raw.githubusercontent.com/CompVis/stable-diffusion/main/assets/stable-samples/img2img/sketch-mountains-input.jpg" //raw.githubusercontent.com/CompVis/stable-diffusion/main/assets/stable-samples/img2img/sketch-mountains-input.jpg" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_img2img.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_img2img.py | # eta corresponds to η in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_img2img.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_img2img.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_img2img.py | Corresponds to parameter eta (η) in the DDIM paper https //arxiv.org/abs/2010.02502. Only applies to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_img2img.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_img2img.py | [PIL](https //pillow.readthedocs.io/en/stable/) `PIL.Image.Image` or `np.array`. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_img2img.py | [diffusers.models.attention_processor](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_img2img.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_img2img_superresolution.py | >>> url = "https://raw.githubusercontent.com/CompVis/stable-diffusion/main/assets/stable-samples/img2img/sketch-mountains-input.jpg" //raw.githubusercontent.com/CompVis/stable-diffusion/main/assets/stable-samples/img2img/sketch-mountains-input.jpg" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_img2img_superresolution.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_img2img_superresolution.py | # eta corresponds to η in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_img2img_superresolution.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_img2img_superresolution.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_img2img_superresolution.py | Corresponds to parameter eta (η) in the DDIM paper https //arxiv.org/abs/2010.02502. Only applies to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_img2img_superresolution.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_img2img_superresolution.py | [PIL](https //pillow.readthedocs.io/en/stable/) `PIL.Image.Image` or `np.array`. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_img2img_superresolution.py | [diffusers.models.attention_processor](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_img2img_superresolution.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_inpainting.py | >>> url = "https://huggingface.co/datasets/diffusers/docs-images/resolve/main/if/person.png" //huggingface.co/datasets/diffusers/docs-images/resolve/main/if/person.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_inpainting.py | >>> url = "https://huggingface.co/datasets/diffusers/docs-images/resolve/main/if/glasses_mask.png" //huggingface.co/datasets/diffusers/docs-images/resolve/main/if/glasses_mask.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_inpainting.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_inpainting.py | # eta corresponds to η in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_inpainting.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_inpainting.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_inpainting.py | Corresponds to parameter eta (η) in the DDIM paper https //arxiv.org/abs/2010.02502. Only applies to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_inpainting.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_inpainting.py | [PIL](https //pillow.readthedocs.io/en/stable/) `PIL.Image.Image` or `np.array`. 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_inpainting.py | [diffusers.models.attention_processor](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_inpainting.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_inpainting_superresolution.py | >>> url = "https://huggingface.co/datasets/diffusers/docs-images/resolve/main/if/person.png" //huggingface.co/datasets/diffusers/docs-images/resolve/main/if/person.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_inpainting_superresolution.py | >>> url = "https://huggingface.co/datasets/diffusers/docs-images/resolve/main/if/glasses_mask.png" //huggingface.co/datasets/diffusers/docs-images/resolve/main/if/glasses_mask.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_inpainting_superresolution.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_inpainting_superresolution.py | # eta corresponds to η in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_inpainting_superresolution.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_inpainting_superresolution.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_inpainting_superresolution.py | Corresponds to parameter eta (η) in the DDIM paper https //arxiv.org/abs/2010.02502. Only applies to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_inpainting_superresolution.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_inpainting_superresolution.py | [PIL](https //pillow.readthedocs.io/en/stable/) `PIL.Image.Image` or `np.array`. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_inpainting_superresolution.py | [diffusers.models.attention_processor](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_inpainting_superresolution.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_superresolution.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_superresolution.py | # eta corresponds to η in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_superresolution.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_superresolution.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). 
Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_superresolution.py | Corresponds to parameter eta (η) in the DDIM paper https //arxiv.org/abs/2010.02502. Only applies to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_superresolution.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_superresolution.py | [PIL](https //pillow.readthedocs.io/en/stable/) `PIL.Image.Image` or `np.array`. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_superresolution.py | [diffusers.models.attention_processor](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\pipeline_if_superresolution.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\deepfloyd_if\watermark.py | # copied from https //github.com/deep-floyd/IF/blob/b77482e36ca2031cb94dbca1001fc1e6400bf4ab/deepfloyd_if/modules/base.py#L287 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\dit\pipeline_dit.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\dit\pipeline_dit.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_combined.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_combined.py | url = "https://raw.githubusercontent.com/CompVis/stable-diffusion/main/assets/stable-samples/img2img/sketch-mountains-input.jpg" //raw.githubusercontent.com/CompVis/stable-diffusion/main/assets/stable-samples/img2img/sketch-mountains-input.jpg" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_combined.py | "https://huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" "/kandinsky/cat.png" //huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" "/kandinsky/cat.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_combined.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_combined.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_combined.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). 
Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_combined.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_combined.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_combined.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_combined.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_combined.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_combined.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_combined.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_combined.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_combined.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_combined.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_combined.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_combined.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_combined.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_combined.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_combined.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_img2img.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_img2img.py | ... "https://huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" //huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_img2img.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_img2img.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_img2img.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_inpaint.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_inpaint.py | ... "https://huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" //huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_inpaint.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_inpaint.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_inpaint.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_inpaint.py | "THIS means that you HAVE to invert the input mask to have the same behavior as before as explained in https://github.com/huggingface/diffusers/pull/4207. " //github.com/huggingface/diffusers/pull/4207. " | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_prior.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_prior.py | ... "https://huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" //huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_prior.py | ... "https://huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" //huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_prior.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_prior.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_prior.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_prior.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_prior.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_prior.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky\pipeline_kandinsky_prior.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). 
Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_combined.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_combined.py | url = "https://raw.githubusercontent.com/CompVis/stable-diffusion/main/assets/stable-samples/img2img/sketch-mountains-input.jpg" //raw.githubusercontent.com/CompVis/stable-diffusion/main/assets/stable-samples/img2img/sketch-mountains-input.jpg" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_combined.py | "https://huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" "/kandinsky/cat.png" //huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" "/kandinsky/cat.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_combined.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_combined.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_combined.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_combined.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_combined.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_combined.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_combined.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_combined.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_combined.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_combined.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_combined.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_combined.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_combined.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_combined.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_combined.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_combined.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_combined.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_combined.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_controlnet.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_controlnet.py | ... "https://huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" //huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_controlnet.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_controlnet.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_controlnet.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_controlnet_img2img.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_controlnet_img2img.py | ... "https://huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" //huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_controlnet_img2img.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_controlnet_img2img.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). 
Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_controlnet_img2img.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_img2img.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_img2img.py | ... "https://huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" //huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_img2img.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_img2img.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_img2img.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_inpainting.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_inpainting.py | ... "https://huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" //huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_inpainting.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_inpainting.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_inpainting.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_inpainting.py | "THIS means that you HAVE to invert the input mask to have the same behavior as before as explained in https://github.com/huggingface/diffusers/pull/4207. " //github.com/huggingface/diffusers/pull/4207. " | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_prior.py | ... "https://huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" //huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_prior.py | ... "https://huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" //huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_prior.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer). 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_prior.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_prior.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_prior.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_prior.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_prior.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_prior.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_prior_emb2emb.py | ... "https://huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" //huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_prior_emb2emb.py | ... "https://huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" //huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_prior_emb2emb.py | ... "https://huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" //huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_prior_emb2emb.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_prior_emb2emb.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_prior_emb2emb.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_prior_emb2emb.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_prior_emb2emb.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_prior_emb2emb.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\kandinsky2_2\pipeline_kandinsky2_2_prior_emb2emb.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). 
Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\latent_diffusion\pipeline_latent_diffusion.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\latent_diffusion\pipeline_latent_diffusion.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\latent_diffusion\pipeline_latent_diffusion.py | # See all LDMBert models at https //huggingface.co/models?filter=ldmbert | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\latent_diffusion\pipeline_latent_diffusion.py | "ldm-bert": "https://huggingface.co/valhalla/ldm-bert/blob/main/config.json", "https://huggingface.co/valhalla/ldm-bert/blob/main/config.json", //huggingface.co/valhalla/ldm-bert/blob/main/config.json", | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\latent_diffusion\pipeline_latent_diffusion_superresolution.py | Corresponds to parameter eta (η) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\latent_diffusion\pipeline_latent_diffusion_superresolution.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\latent_diffusion\pipeline_latent_diffusion_superresolution.py | ... "https://user-images.githubusercontent.com/38061659/199705896-b48e17b8-b231-47cd-a270-4ffa5a93fa3e.png" //user-images.githubusercontent.com/38061659/199705896-b48e17b8-b231-47cd-a270-4ffa5a93fa3e.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\latent_diffusion\pipeline_latent_diffusion_superresolution.py | # eta corresponds to η in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\latent_diffusion_uncond\pipeline_latent_diffusion_uncond.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\latent_diffusion_uncond\pipeline_latent_diffusion_uncond.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\musicldm\pipeline_musicldm.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\musicldm\pipeline_musicldm.py | [laion/clap-htsat-unfused](https //huggingface.co/laion/clap-htsat-unfused) variant. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\musicldm\pipeline_musicldm.py | # eta corresponds to η in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\musicldm\pipeline_musicldm.py | Corresponds to parameter eta (η) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\musicldm\pipeline_musicldm.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\musicldm\pipeline_musicldm.py | [`self.processor`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\musicldm\pipeline_musicldm.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . 
`guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\onnx_utils.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\paint_by_example\image_encoder.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\paint_by_example\pipeline_paint_by_example.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\paint_by_example\pipeline_paint_by_example.py | Please refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for more details | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\paint_by_example\pipeline_paint_by_example.py | # eta corresponds to η in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\paint_by_example\pipeline_paint_by_example.py | Corresponds to parameter eta (η) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\paint_by_example\pipeline_paint_by_example.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\paint_by_example\pipeline_paint_by_example.py | ... "https://raw.githubusercontent.com/Fantasy-Studio/Paint-by-Example/main/examples/image/example_1.png" //raw.githubusercontent.com/Fantasy-Studio/Paint-by-Example/main/examples/image/example_1.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\paint_by_example\pipeline_paint_by_example.py | ... "https://raw.githubusercontent.com/Fantasy-Studio/Paint-by-Example/main/examples/mask/example_1.png" //raw.githubusercontent.com/Fantasy-Studio/Paint-by-Example/main/examples/mask/example_1.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\paint_by_example\pipeline_paint_by_example.py | >>> example_url = "https://raw.githubusercontent.com/Fantasy-Studio/Paint-by-Example/main/examples/reference/example_1.jpg" //raw.githubusercontent.com/Fantasy-Studio/Paint-by-Example/main/examples/reference/example_1.jpg" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\paint_by_example\pipeline_paint_by_example.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\pipeline_flax_utils.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\pipeline_flax_utils.py | 'http //hostname' 'foo.bar | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\pipeline_flax_utils.py | To use private or [gated models](https //huggingface.co/docs/hub/models-gated#gated-models), log-in with | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\pipeline_flax_utils.py | ["offline-mode"](https //huggingface.co/diffusers/installation.html#offline-mode) to use this method in a | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\pipeline_flax_utils.py | >>> # see more in [the documentation](https //huggingface.co/docs/hub/security-tokens) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\pipeline_utils.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\pipeline_utils.py | f"You are loading the variant {revision} from {pretrained_model_name_or_path} via `revision='{revision}'`. This behavior is deprecated and will be removed in diffusers v1. One should use `variant='{revision}'` instead. However, it appears that {pretrained_model_name_or_path} currently does not have the required variant filenames in the 'main' branch. 
\n The Diffusers team and community would be very grateful if you could open an issue: https://github.com/huggingface/diffusers/issues/new with the title '{pretrained_model_name_or_path} is missing {revision} files' so that the correct variant file can be added.", https //github.com/huggingface/diffusers/issues/new with the title '{pretrained_model_name_or_path} is missing {revision} files' so that the correct variant file can be added.", | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\pipeline_utils.py | [Community](https //github.com/huggingface/diffusers/tree/main/examples/community). Valid file | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\pipeline_utils.py | Pipelines](https //huggingface.co/docs/diffusers/using-diffusers/custom_pipeline_overview) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\pipeline_utils.py | 'http //hostname' 'foo.bar | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\pipeline_utils.py | map](https //hf.co/docs/accelerate/main/en/usage_guides/big_modeling#designing-a-device-map). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\pipeline_utils.py | To use private or [gated](https //huggingface.co/docs/hub/models-gated#gated-models) models, log-in with | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\pipeline_utils.py | >>> # of the documentation](https //huggingface.co/docs/hub/security-tokens) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\pipeline_utils.py | " checkpoint: https://huggingface.co/runwayml/stable-diffusion-inpainting instead or adapting your" https //huggingface.co/runwayml/stable-diffusion-inpainting instead or adapting your" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\pipeline_utils.py | " https://huggingface.co/runwayml/stable-diffusion-inpainting. Note that we do not actively maintain" //huggingface.co/runwayml/stable-diffusion-inpainting. Note that we do not actively maintain" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\pipeline_utils.py | [Community](https //github.com/huggingface/diffusers/tree/main/examples/community). Valid file | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\pipeline_utils.py | community pipeline](https //huggingface.co/docs/diffusers/main/en/using-diffusers/contribute_pipeline). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\pipeline_utils.py | 'http //hostname' 'foo.bar | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\pipeline_utils.py | To use private or [gated models](https //huggingface.co/docs/hub/models-gated#gated-models), log-in with | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\pipeline_utils.py | Enable memory efficient attention from [xFormers](https //facebookresearch.github.io/xformers/). When this | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\pipeline_utils.py | [`memory_efficient_attention()`](https //facebookresearch.github.io/xformers/components/ops.html#xformers.ops.memory_efficient_attention) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\pipeline_utils.py | Disable memory efficient attention from [xFormers](https //facebookresearch.github.io/xformers/). 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\pndm\pipeline_pndm.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\pndm\pipeline_pndm.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\pndm\pipeline_pndm.py | # the official paper https //arxiv.org/pdf/2202.09778.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\repaint\pipeline_repaint.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\repaint\pipeline_repaint.py | RePaint paper). Take a look at Figure 9 and 10 in the [paper](https //arxiv.org/pdf/2201.09865.pdf). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\repaint\pipeline_repaint.py | and 10 in the [paper](https //arxiv.org/pdf/2201.09865.pdf). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\repaint\pipeline_repaint.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\repaint\pipeline_repaint.py | >>> img_url = "https://huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main/repaint/celeba_hq_256.png" //huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main/repaint/celeba_hq_256.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\repaint\pipeline_repaint.py | >>> mask_url = "https://huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main/repaint/mask_256.png" //huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main/repaint/mask_256.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\score_sde_ve\pipeline_score_sde_ve.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\score_sde_ve\pipeline_score_sde_ve.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\semantic_stable_diffusion\pipeline_semantic_stable_diffusion.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\semantic_stable_diffusion\pipeline_semantic_stable_diffusion.py | Please refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for more details | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\semantic_stable_diffusion\pipeline_semantic_stable_diffusion.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\semantic_stable_diffusion\pipeline_semantic_stable_diffusion.py | # eta corresponds to η in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\semantic_stable_diffusion\pipeline_semantic_stable_diffusion.py | Corresponds to parameter eta (η) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\semantic_stable_diffusion\pipeline_semantic_stable_diffusion.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\semantic_stable_diffusion\pipeline_semantic_stable_diffusion.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . 
`guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\shap_e\camera.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\shap_e\pipeline_shap_e.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\shap_e\pipeline_shap_e.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\shap_e\pipeline_shap_e_img2img.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\shap_e\pipeline_shap_e_img2img.py | >>> image_url = "https://hf.co/datasets/diffusers/docs-images/resolve/main/shap-e/corgi.png" //hf.co/datasets/diffusers/docs-images/resolve/main/shap-e/corgi.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\shap_e\pipeline_shap_e_img2img.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\shap_e\renderer.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\shap_e\renderer.py | Reference https //arxiv.org/pdf/2210.04628.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\spectrogram_diffusion\continous_encoder.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\spectrogram_diffusion\midi_utils.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\spectrogram_diffusion\notes_encoder.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\spectrogram_diffusion\pipeline_spectrogram_diffusion.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\spectrogram_diffusion\pipeline_spectrogram_diffusion.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\spectrogram_diffusion\pipeline_spectrogram_diffusion.py | >>> # Download MIDI from wget http //www.piano-midi.de/midis/beethoven/beethoven_hammerklavier_2.mid | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\clip_image_project_model.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\convert_from_ckpt.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\convert_from_ckpt.py | # model components i.e. 
https //huggingface.co/thibaud/controlnet-sd21/ | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\convert_from_ckpt.py | An instance of [CLIP](https //huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\convert_from_ckpt.py | to use, specifically the [clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\convert_from_ckpt.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\convert_from_ckpt.py | # "state_dict" key https://huggingface.co/thibaud/controlnet-canny-sd21 //huggingface.co/thibaud/controlnet-canny-sd21 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\convert_from_ckpt.py | config_url = "https://raw.githubusercontent.com/CompVis/stable-diffusion/main/configs/stable-diffusion/v1-inference.yaml" //raw.githubusercontent.com/CompVis/stable-diffusion/main/configs/stable-diffusion/v1-inference.yaml" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\convert_from_ckpt.py | config_url = "https://raw.githubusercontent.com/Stability-AI/stablediffusion/main/configs/stable-diffusion/v2-inference-v.yaml" //raw.githubusercontent.com/Stability-AI/stablediffusion/main/configs/stable-diffusion/v2-inference-v.yaml" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\convert_from_ckpt.py | config_url = "https://raw.githubusercontent.com/Stability-AI/generative-models/main/configs/inference/sd_xl_base.yaml" //raw.githubusercontent.com/Stability-AI/generative-models/main/configs/inference/sd_xl_base.yaml" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\convert_from_ckpt.py | config_url = "https://raw.githubusercontent.com/Stability-AI/generative-models/main/configs/inference/sd_xl_refiner.yaml" //raw.githubusercontent.com/Stability-AI/generative-models/main/configs/inference/sd_xl_refiner.yaml" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\convert_from_ckpt.py | # "state_dict" key https://huggingface.co/thibaud/controlnet-canny-sd21 //huggingface.co/thibaud/controlnet-canny-sd21 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_cycle_diffusion.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_cycle_diffusion.py | # "predicted x_0" of formula (12) from https://arxiv.org/pdf/2010.02502.pdf //arxiv.org/pdf/2010.02502.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_cycle_diffusion.py | # 6. compute "direction pointing to x_t" of formula (12) from https://arxiv.org/pdf/2010.02502.pdf //arxiv.org/pdf/2010.02502.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_cycle_diffusion.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_cycle_diffusion.py | Please refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for more details | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_cycle_diffusion.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_cycle_diffusion.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_cycle_diffusion.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_cycle_diffusion.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_cycle_diffusion.py | [`self.processor`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_cycle_diffusion.py | url = "https://raw.githubusercontent.com/ChenWu98/cycle-diffusion/main/data/dalle2/An%20astronaut%20riding%20a%20horse.png" //raw.githubusercontent.com/ChenWu98/cycle-diffusion/main/data/dalle2/An%20astronaut%20riding%20a%20horse.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_cycle_diffusion.py | # See more samples at the original repo https //github.com/ChenWu98/cycle-diffusion | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_cycle_diffusion.py | "https://raw.githubusercontent.com/ChenWu98/cycle-diffusion/main/data/dalle2/A%20black%20colored%20car.png" //raw.githubusercontent.com/ChenWu98/cycle-diffusion/main/data/dalle2/A%20black%20colored%20car.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_cycle_diffusion.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_flax_stable_diffusion.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_flax_stable_diffusion.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_flax_stable_diffusion.py | Please refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for more details | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_flax_stable_diffusion.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_flax_stable_diffusion_controlnet.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_flax_stable_diffusion_img2img.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_flax_stable_diffusion_img2img.py | >>> url = "https://raw.githubusercontent.com/CompVis/stable-diffusion/main/assets/stable-samples/img2img/sketch-mountains-input.jpg" //raw.githubusercontent.com/CompVis/stable-diffusion/main/assets/stable-samples/img2img/sketch-mountains-input.jpg" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_flax_stable_diffusion_img2img.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_flax_stable_diffusion_img2img.py | Please refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for more details | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_flax_stable_diffusion_img2img.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_flax_stable_diffusion_inpaint.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_flax_stable_diffusion_inpaint.py | >>> img_url = "https://raw.githubusercontent.com/CompVis/latent-diffusion/main/data/inpainting_examples/overture-creations-5sI6fQgYIuo.png" //raw.githubusercontent.com/CompVis/latent-diffusion/main/data/inpainting_examples/overture-creations-5sI6fQgYIuo.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_flax_stable_diffusion_inpaint.py | >>> mask_url = "https://raw.githubusercontent.com/CompVis/latent-diffusion/main/data/inpainting_examples/overture-creations-5sI6fQgYIuo_mask.png" //raw.githubusercontent.com/CompVis/latent-diffusion/main/data/inpainting_examples/overture-creations-5sI6fQgYIuo_mask.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_flax_stable_diffusion_inpaint.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_flax_stable_diffusion_inpaint.py | Please refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for more details | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_flax_stable_diffusion_inpaint.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion.py | Corresponds to parameter eta (畏) in the DDIM paper https //arxiv.org/abs/2010.02502. Only applies to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion.py | [PIL](https //pillow.readthedocs.io/en/stable/) `PIL.Image.Image` or `np.array`. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . 
`guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_img2img.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_img2img.py | [CLIP](https //huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel), specifically | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_img2img.py | the [clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14) variant. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_img2img.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_img2img.py | Please, refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for details. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_img2img.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_img2img.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_img2img.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_img2img.py | Corresponds to parameter eta (畏) in the DDIM paper https //arxiv.org/abs/2010.02502. Only applies to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_img2img.py | [PIL](https //pillow.readthedocs.io/en/stable/) `PIL.Image.Image` or `np.array`. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_img2img.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_img2img.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_inpaint.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_inpaint.py | [CLIP](https //huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel), specifically | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_inpaint.py | the [clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14) variant. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_inpaint.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer). 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_inpaint.py | Please, refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for details. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_inpaint.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_inpaint.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_inpaint.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_inpaint.py | Corresponds to parameter eta (畏) in the DDIM paper https //arxiv.org/abs/2010.02502. Only applies to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_inpaint.py | [PIL](https //pillow.readthedocs.io/en/stable/) `PIL.Image.Image` or `np.array`. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_inpaint.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_inpaint.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_inpaint_legacy.py | [CLIP](https //huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel), specifically | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_inpaint_legacy.py | the [clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14) variant. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_inpaint_legacy.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_inpaint_legacy.py | Please, refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for details. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_inpaint_legacy.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_inpaint_legacy.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_inpaint_legacy.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_inpaint_legacy.py | Corresponds to parameter eta (?) in the DDIM paper https //arxiv.org/abs/2010.02502. 
Only applies to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_inpaint_legacy.py | [PIL](https //pillow.readthedocs.io/en/stable/) `PIL.Image.Image` or `np.array`. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_inpaint_legacy.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_inpaint_legacy.py | # eta corresponds to ? in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_upscale.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_upscale.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_upscale.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_upscale.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_upscale.py | Corresponds to parameter eta (畏) in the DDIM paper https //arxiv.org/abs/2010.02502. Only applies to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_upscale.py | [PIL](https //pillow.readthedocs.io/en/stable/) `PIL.Image.Image` or `np.array`. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_onnx_stable_diffusion_upscale.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion.py | Sample Steps are Flawed](https //arxiv.org/pdf/2305.08891.pdf). See Section 3.4 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion.py | Please refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for more details | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. 
Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion.py | [`self.processor`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion.py | Flawed](https //arxiv.org/pdf/2305.08891.pdf). Guidance rescale factor should fix overexposure when | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion.py | # Based on 3.4. in https //arxiv.org/pdf/2305.08891.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_attend_and_excite.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_attend_and_excite.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_attend_and_excite.py | Please refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for more details | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_attend_and_excite.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_attend_and_excite.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_attend_and_excite.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_attend_and_excite.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_attend_and_excite.py | [`self.processor`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_attend_and_excite.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_controlnet.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_depth2img.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_depth2img.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_depth2img.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_depth2img.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_depth2img.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_depth2img.py | [`self.processor`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_depth2img.py | >>> url = "http://images.cocodataset.org/val2017/000000039769.jpg" //images.cocodataset.org/val2017/000000039769.jpg" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_depth2img.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_diffedit.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_diffedit.py | >>> img_url = "https://github.com/Xiang-cd/DiffEdit-stable-diffusion/raw/main/assets/origin.png" //github.com/Xiang-cd/DiffEdit-stable-diffusion/raw/main/assets/origin.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_diffedit.py | >>> img_url = "https://github.com/Xiang-cd/DiffEdit-stable-diffusion/raw/main/assets/origin.png" //github.com/Xiang-cd/DiffEdit-stable-diffusion/raw/main/assets/origin.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_diffedit.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_diffedit.py | Please refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for more details | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_diffedit.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_diffedit.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_diffedit.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_diffedit.py | [`self.processor`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_diffedit.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . 
`guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_diffedit.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_diffedit.py | [`self.processor`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_diffedit.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_diffedit.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_diffedit.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_diffedit.py | [`self.processor`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_diffedit.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_gligen.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_gligen.py | ... "https://hf.co/datasets/huggingface/documentation-images/resolve/main/diffusers/gligen/livingroom_modern.png" //hf.co/datasets/huggingface/documentation-images/resolve/main/diffusers/gligen/livingroom_modern.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_gligen.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_gligen.py | Please refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for more details | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_gligen.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_gligen.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_gligen.py | Generation](https //arxiv.org/pdf/2301.07093.pdf). Scheduled Sampling factor is only varied for | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_gligen.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_gligen.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_gligen.py | [`self.processor`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_gligen.py | Flawed](https //arxiv.org/pdf/2305.08891.pdf). Guidance rescale factor should fix overexposure when | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_gligen.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_gligen_text_image.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_gligen_text_image.py | ... "https://hf.co/datasets/huggingface/documentation-images/resolve/main/diffusers/gligen/livingroom_modern.png" //hf.co/datasets/huggingface/documentation-images/resolve/main/diffusers/gligen/livingroom_modern.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_gligen_text_image.py | ... "https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/diffusers/gligen/backpack.jpeg" //huggingface.co/datasets/huggingface/documentation-images/resolve/main/diffusers/gligen/backpack.jpeg" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_gligen_text_image.py | ... "https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/diffusers/gligen/pexels-pixabay-60597.jpg" //huggingface.co/datasets/huggingface/documentation-images/resolve/main/diffusers/gligen/pexels-pixabay-60597.jpg" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_gligen_text_image.py | ... "https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/diffusers/landscape.png" //huggingface.co/datasets/huggingface/documentation-images/resolve/main/diffusers/landscape.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_gligen_text_image.py | ... "https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/diffusers/landscape.png" //huggingface.co/datasets/huggingface/documentation-images/resolve/main/diffusers/landscape.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_gligen_text_image.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_gligen_text_image.py | Frozen image-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_gligen_text_image.py | Please refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for more details | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_gligen_text_image.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_gligen_text_image.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_gligen_text_image.py | Generation](https //arxiv.org/pdf/2301.07093.pdf). 
Scheduled Sampling factor is only varied for | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_gligen_text_image.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_gligen_text_image.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_gligen_text_image.py | [`self.processor`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_gligen_text_image.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_image_variation.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_image_variation.py | Frozen CLIP image-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_image_variation.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_image_variation.py | Please refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for more details | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_image_variation.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_image_variation.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_image_variation.py | [`CLIPImageProcessor`](https //huggingface.co/lambdalabs/sd-image-variations-diffusers/blob/main/feature_extractor/preprocessor_config.json). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_image_variation.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_image_variation.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_image_variation.py | url = "https://lh3.googleusercontent.com/y-iFOHfLTwkuQSUegpwDdgKmOjRSTvPxat63dQLB25xkTs4lhIbRUFeNBWZzYf370g=s1200" //lh3.googleusercontent.com/y-iFOHfLTwkuQSUegpwDdgKmOjRSTvPxat63dQLB25xkTs4lhIbRUFeNBWZzYf370g=s1200" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_image_variation.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . 
`guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_img2img.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_img2img.py | >>> url = "https://raw.githubusercontent.com/CompVis/stable-diffusion/main/assets/stable-samples/img2img/sketch-mountains-input.jpg" //raw.githubusercontent.com/CompVis/stable-diffusion/main/assets/stable-samples/img2img/sketch-mountains-input.jpg" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_img2img.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_img2img.py | Please refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for more details | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_img2img.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_img2img.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_img2img.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_img2img.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_img2img.py | [`self.processor`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_img2img.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_inpaint.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_inpaint.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_inpaint.py | Please refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for more details | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_inpaint.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_inpaint.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_inpaint.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. 
Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_inpaint.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_inpaint.py | [`self.processor`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_inpaint.py | >>> img_url = "https://raw.githubusercontent.com/CompVis/latent-diffusion/main/data/inpainting_examples/overture-creations-5sI6fQgYIuo.png" //raw.githubusercontent.com/CompVis/latent-diffusion/main/data/inpainting_examples/overture-creations-5sI6fQgYIuo.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_inpaint.py | >>> mask_url = "https://raw.githubusercontent.com/CompVis/latent-diffusion/main/data/inpainting_examples/overture-creations-5sI6fQgYIuo_mask.png" //raw.githubusercontent.com/CompVis/latent-diffusion/main/data/inpainting_examples/overture-creations-5sI6fQgYIuo_mask.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_inpaint.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_inpaint_legacy.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_inpaint_legacy.py | [CLIP](https //huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel), specifically | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_inpaint_legacy.py | the [clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14) variant. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_inpaint_legacy.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_inpaint_legacy.py | Please, refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for details. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_inpaint_legacy.py | "by loading your model into `StableDiffusionInpaintPipeline` instead. See https://github.com/huggingface/diffusers/pull/3533" //github.com/huggingface/diffusers/pull/3533" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_inpaint_legacy.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_inpaint_legacy.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_inpaint_legacy.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_inpaint_legacy.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). 
Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_inpaint_legacy.py | Corresponds to parameter eta (畏) in the DDIM paper https //arxiv.org/abs/2010.02502. Only applies to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_inpaint_legacy.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_inpaint_legacy.py | [PIL](https //pillow.readthedocs.io/en/stable/) `PIL.Image.Image` or `np.array`. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_inpaint_legacy.py | [diffusers.models.attention_processor](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_inpaint_legacy.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_instruct_pix2pix.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_instruct_pix2pix.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_instruct_pix2pix.py | Please refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for more details | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_instruct_pix2pix.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_instruct_pix2pix.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_instruct_pix2pix.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_instruct_pix2pix.py | >>> img_url = "https://huggingface.co/datasets/diffusers/diffusers-images-docs/resolve/main/mountain.png" //huggingface.co/datasets/diffusers/diffusers-images-docs/resolve/main/mountain.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_instruct_pix2pix.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . 
`guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_instruct_pix2pix.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_k_diffusion.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_k_diffusion.py | [CLIP](https //huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel), specifically | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_k_diffusion.py | the [clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14) variant. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_k_diffusion.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_k_diffusion.py | Please, refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for details. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_k_diffusion.py | " as defined in https://huggingface.co/docs/diffusers/api/schedulers#implemented-schedulers for" //huggingface.co/docs/diffusers/api/schedulers#implemented-schedulers for" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_k_diffusion.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_k_diffusion.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_k_diffusion.py | Corresponds to parameter eta (畏) in the DDIM paper https //arxiv.org/abs/2010.02502. Only applies to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_k_diffusion.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_k_diffusion.py | [PIL](https //pillow.readthedocs.io/en/stable/) `PIL.Image.Image` or `np.array`. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_k_diffusion.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_latent_upscale.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_latent_upscale.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_latent_upscale.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. 
Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_latent_upscale.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_latent_upscale.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_ldm3d.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_ldm3d.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_ldm3d.py | Please refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for more details | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_ldm3d.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_ldm3d.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_ldm3d.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_ldm3d.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_ldm3d.py | [`self.processor`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_ldm3d.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_model_editing.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_model_editing.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_model_editing.py | Please refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for more details | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_model_editing.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_model_editing.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_model_editing.py | Apply model editing via closed-form solution (see Eq. 5 in the TIME [paper](https //arxiv.org/abs/2303.08084)). 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_model_editing.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_model_editing.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_model_editing.py | [`self.processor`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_model_editing.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_panorama.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_panorama.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_panorama.py | Please refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for more details | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_panorama.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_panorama.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_panorama.py | # Here, we define the mappings F_i (see Eq. 7 in the MultiDiffusion paper https //arxiv.org/abs/2302.08113) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_panorama.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_panorama.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_panorama.py | [diffusers.models.attention_processor](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_panorama.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_panorama.py | # MultiDiffusion paper for more details https //arxiv.org/abs/2302.08113 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_panorama.py | # take the MultiDiffusion step. Eq. 
5 in MultiDiffusion paper https //arxiv.org/abs/2302.08113 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_paradigms.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_paradigms.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_paradigms.py | Please refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for more details | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_paradigms.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_paradigms.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_paradigms.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_paradigms.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_paradigms.py | [`self.processor`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_paradigms.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_pix2pix_zero.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_pix2pix_zero.py | >>> source_emb_url = "https://hf.co/datasets/sayakpaul/sample-datasets/resolve/main/cat.pt" //hf.co/datasets/sayakpaul/sample-datasets/resolve/main/cat.pt" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_pix2pix_zero.py | >>> target_emb_url = "https://hf.co/datasets/sayakpaul/sample-datasets/resolve/main/dog.pt" //hf.co/datasets/sayakpaul/sample-datasets/resolve/main/dog.pt" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_pix2pix_zero.py | >>> img_url = "https://github.com/pix2pixzero/pix2pix-zero/raw/main/assets/test_images/cats/cat_6.png" //github.com/pix2pixzero/pix2pix-zero/raw/main/assets/test_images/cats/cat_6.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_pix2pix_zero.py | [CLIP](https //huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel), specifically | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_pix2pix_zero.py | the [clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14) variant. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_pix2pix_zero.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer). 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_pix2pix_zero.py | Please, refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for details. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_pix2pix_zero.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_pix2pix_zero.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_pix2pix_zero.py | paper](https //arxiv.org/abs/2302.03027). Used in discovering the edit direction. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_pix2pix_zero.py | paper](https //arxiv.org/abs/2302.03027). Used in discovering the edit direction. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_pix2pix_zero.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_pix2pix_zero.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_pix2pix_zero.py | Corresponds to parameter eta (畏) in the DDIM paper https //arxiv.org/abs/2010.02502. Only applies to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_pix2pix_zero.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_pix2pix_zero.py | [PIL](https //pillow.readthedocs.io/en/stable/) `PIL.Image.Image` or `np.array`. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_pix2pix_zero.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_pix2pix_zero.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_pix2pix_zero.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_pix2pix_zero.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_pix2pix_zero.py | [PIL](https //pillow.readthedocs.io/en/stable/) `PIL.Image.Image` or `np.array`. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_pix2pix_zero.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . 
`guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_sag.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_sag.py | # Modified to get self-attention guidance scale in this paper (https //arxiv.org/pdf/2210.00939.pdf) as an input | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_sag.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_sag.py | Please refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for more details | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_sag.py | # eta corresponds to η in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_sag.py | Corresponds to parameter eta (η) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_sag.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_sag.py | [`self.processor`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_sag.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_sag.py | # of the self-attentnion guidance paper https //arxiv.org/pdf/2210.00939.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_sag.py | # in https //arxiv.org/pdf/2210.00939.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_sag.py | # Same masking process as in SAG paper https //arxiv.org/pdf/2210.00939.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_upscale.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_upscale.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_upscale.py | # eta corresponds to η in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_upscale.py | Corresponds to parameter eta (η) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_upscale.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_upscale.py | [`self.processor`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_upscale.py | >>> url = "https://huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main/sd2-upscale/low_res_cat.png" //huggingface.co/datasets/hf-internal-testing/diffusers-images/resolve/main/sd2-upscale/low_res_cat.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_diffusion_upscale.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_unclip.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_unclip.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_unclip.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_unclip.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_unclip.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_unclip.py | [`self.processor`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_unclip.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_unclip.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_unclip_img2img.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_unclip_img2img.py | >>> url = "https://raw.githubusercontent.com/CompVis/stable-diffusion/main/assets/stable-samples/img2img/sketch-mountains-input.jpg" //raw.githubusercontent.com/CompVis/stable-diffusion/main/assets/stable-samples/img2img/sketch-mountains-input.jpg" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_unclip_img2img.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_unclip_img2img.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_unclip_img2img.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_unclip_img2img.py | [`self.processor`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\pipeline_stable_unclip_img2img.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . 
`guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\safety_checker.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\safety_checker_flax.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion\stable_unclip_image_normalizer.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_safe\pipeline_stable_diffusion_safe.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_safe\pipeline_stable_diffusion_safe.py | Please refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for more details | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_safe\pipeline_stable_diffusion_safe.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_safe\pipeline_stable_diffusion_safe.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_safe\pipeline_stable_diffusion_safe.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_safe\pipeline_stable_diffusion_safe.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_safe\pipeline_stable_diffusion_safe.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_safe\safety_checker.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | Sample Steps are Flawed](https //arxiv.org/pdf/2305.08891.pdf). See Section 3.4 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | [CLIP](https //huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel), specifically | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | the [clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14) variant. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | [CLIP](https //huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModelWithProjection), | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | [laion/CLIP-ViT-bigG-14-laion2B-39B-b160k](https //huggingface.co/laion/CLIP-ViT-bigG-14-laion2B-39B-b160k) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer). 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | Whether to use the [invisible_watermark library](https //github.com/ShieldMnt/invisible-watermark/) to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | # eta corresponds to η in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | [stabilityai/stable-diffusion-xl-base-1.0](https //huggingface.co/stabilityai/stable-diffusion-xl-base-1.0) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | [stabilityai/stable-diffusion-xl-base-1.0](https //huggingface.co/stabilityai/stable-diffusion-xl-base-1.0) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | Output**](https //huggingface.co/docs/diffusers/api/pipelines/stable_diffusion/stable_diffusion_xl#refining-the-image-output) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | Corresponds to parameter eta (η) in the DDIM paper https //arxiv.org/abs/2010.02502. Only applies to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | [PIL](https //pillow.readthedocs.io/en/stable/) `PIL.Image.Image` or `np.array`. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | [diffusers.models.attention_processor](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | Flawed](https //arxiv.org/pdf/2305.08891.pdf) `guidance_scale` is defined as `φ` in equation 16. of | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | [Common Diffusion Noise Schedules and Sample Steps are Flawed](https //arxiv.org/pdf/2305.08891.pdf). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | section 2.2 of [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). For more | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | information, refer to this issue thread https //github.com/huggingface/diffusers/issues/4208. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). For more | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | information, refer to this issue thread https //github.com/huggingface/diffusers/issues/4208. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). For more | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | information, refer to this issue thread https //github.com/huggingface/diffusers/issues/4208. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl.py | # Based on 3.4. in https //arxiv.org/pdf/2305.08891.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | >>> url = "https://huggingface.co/datasets/patrickvonplaten/images/resolve/main/aa_xl/000000009.png" //huggingface.co/datasets/patrickvonplaten/images/resolve/main/aa_xl/000000009.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | Sample Steps are Flawed](https //arxiv.org/pdf/2305.08891.pdf). See Section 3.4 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | [CLIP](https //huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel), specifically | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | the [clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14) variant. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | [CLIP](https //huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModelWithProjection), | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | [laion/CLIP-ViT-bigG-14-laion2B-39B-b160k](https //huggingface.co/laion/CLIP-ViT-bigG-14-laion2B-39B-b160k) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer). 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | Whether to use the [invisible_watermark library](https //github.com/ShieldMnt/invisible-watermark/) to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | Output**](https //huggingface.co/docs/diffusers/api/pipelines/stable_diffusion/stable_diffusion_xl#refining-the-image-output). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | Output**](https //huggingface.co/docs/diffusers/api/pipelines/stable_diffusion/stable_diffusion_xl#refining-the-image-output). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | Corresponds to parameter eta (畏) in the DDIM paper https //arxiv.org/abs/2010.02502. Only applies to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | [PIL](https //pillow.readthedocs.io/en/stable/) `PIL.Image.Image` or `np.array`. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | [diffusers.models.attention_processor](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | Flawed](https //arxiv.org/pdf/2305.08891.pdf) `guidance_scale` is defined as `蠁` in equation 16. of | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | [Common Diffusion Noise Schedules and Sample Steps are Flawed](https //arxiv.org/pdf/2305.08891.pdf). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | section 2.2 of [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). 
For more | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | information, refer to this issue thread https //github.com/huggingface/diffusers/issues/4208. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). For more | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | information, refer to this issue thread https //github.com/huggingface/diffusers/issues/4208. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). For more | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | information, refer to this issue thread https //github.com/huggingface/diffusers/issues/4208. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). Can be used to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_img2img.py | # Based on 3.4. in https //arxiv.org/pdf/2305.08891.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | >>> img_url = "https://raw.githubusercontent.com/CompVis/latent-diffusion/main/data/inpainting_examples/overture-creations-5sI6fQgYIuo.png" //raw.githubusercontent.com/CompVis/latent-diffusion/main/data/inpainting_examples/overture-creations-5sI6fQgYIuo.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | >>> mask_url = "https://raw.githubusercontent.com/CompVis/latent-diffusion/main/data/inpainting_examples/overture-creations-5sI6fQgYIuo_mask.png" //raw.githubusercontent.com/CompVis/latent-diffusion/main/data/inpainting_examples/overture-creations-5sI6fQgYIuo_mask.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | Sample Steps are Flawed](https //arxiv.org/pdf/2305.08891.pdf). See Section 3.4 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | [CLIP](https //huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel), specifically | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | the [clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14) variant. 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | [CLIP](https //huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModelWithProjection), | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | [laion/CLIP-ViT-bigG-14-laion2B-39B-b160k](https //huggingface.co/laion/CLIP-ViT-bigG-14-laion2B-39B-b160k) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | Whether to use the [invisible_watermark library](https //github.com/ShieldMnt/invisible-watermark/) to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | [stabilityai/stable-diffusion-xl-base-1.0](https //huggingface.co/stabilityai/stable-diffusion-xl-base-1.0) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | [stabilityai/stable-diffusion-xl-base-1.0](https //huggingface.co/stabilityai/stable-diffusion-xl-base-1.0) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | Output**](https //huggingface.co/docs/diffusers/api/pipelines/stable_diffusion/stable_diffusion_xl#refining-the-image-output). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | Output**](https //huggingface.co/docs/diffusers/api/pipelines/stable_diffusion/stable_diffusion_xl#refining-the-image-output). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | Corresponds to parameter eta (畏) in the DDIM paper https //arxiv.org/abs/2010.02502. Only applies to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | [PIL](https //pillow.readthedocs.io/en/stable/) `PIL.Image.Image` or `np.array`. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | [diffusers.models.attention_processor](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | section 2.2 of [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). For more | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | information, refer to this issue thread https //github.com/huggingface/diffusers/issues/4208. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). For more | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | information, refer to this issue thread https //github.com/huggingface/diffusers/issues/4208. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). For more | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | information, refer to this issue thread https //github.com/huggingface/diffusers/issues/4208. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). Can be used to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_inpaint.py | # Based on 3.4. in https //arxiv.org/pdf/2305.08891.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_instruct_pix2pix.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_instruct_pix2pix.py | ... "https://hf.co/datasets/diffusers/diffusers-images-docs/resolve/main/mountain.png" //hf.co/datasets/diffusers/diffusers-images-docs/resolve/main/mountain.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_instruct_pix2pix.py | Sample Steps are Flawed](https //arxiv.org/pdf/2305.08891.pdf). 
See Section 3.4 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_instruct_pix2pix.py | [CLIP](https //huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel), specifically | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_instruct_pix2pix.py | the [clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14) variant. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_instruct_pix2pix.py | [CLIP](https //huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModelWithProjection), | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_instruct_pix2pix.py | [laion/CLIP-ViT-bigG-14-laion2B-39B-b160k](https //huggingface.co/laion/CLIP-ViT-bigG-14-laion2B-39B-b160k) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_instruct_pix2pix.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_instruct_pix2pix.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_instruct_pix2pix.py | Whether to use the [invisible_watermark library](https //github.com/ShieldMnt/invisible-watermark/) to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_instruct_pix2pix.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_instruct_pix2pix.py | Output**](https //huggingface.co/docs/diffusers/api/pipelines/stable_diffusion/stable_diffusion_xl#refining-the-image-output) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_instruct_pix2pix.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_instruct_pix2pix.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_instruct_pix2pix.py | Corresponds to parameter eta (畏) in the DDIM paper https //arxiv.org/abs/2010.02502. Only applies to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_instruct_pix2pix.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_instruct_pix2pix.py | [PIL](https //pillow.readthedocs.io/en/stable/) `PIL.Image.Image` or `np.array`. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_instruct_pix2pix.py | [diffusers.models.attention_processor](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_instruct_pix2pix.py | Flawed](https //arxiv.org/pdf/2305.08891.pdf) `guidance_scale` is defined as `蠁` in equation 16. of | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_instruct_pix2pix.py | [Common Diffusion Noise Schedules and Sample Steps are Flawed](https //arxiv.org/pdf/2305.08891.pdf). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_instruct_pix2pix.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_instruct_pix2pix.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_instruct_pix2pix.py | section 2.2 of [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_instruct_pix2pix.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\pipeline_stable_diffusion_xl_instruct_pix2pix.py | # Based on 3.4. in https //arxiv.org/pdf/2305.08891.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stable_diffusion_xl\watermark.py | # Copied from https //github.com/Stability-AI/generative-models/blob/613af104c6b85184091d42d374fef420eddb356d/scripts/demo/streamlit_helpers.py#L66 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stochastic_karras_ve\pipeline_stochastic_karras_ve.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\stochastic_karras_ve\pipeline_stochastic_karras_ve.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_adapter.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_adapter.py | ... "https://huggingface.co/datasets/diffusers/docs-images/resolve/main/t2i-adapter/color_ref.png" //huggingface.co/datasets/diffusers/docs-images/resolve/main/t2i-adapter/color_ref.png" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_adapter.py | https //arxiv.org/abs/2302.08453 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_adapter.py | [CLIP](https //huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel), specifically | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_adapter.py | the [clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14) variant. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_adapter.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_adapter.py | Please, refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for details. 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_adapter.py | " information, please have a look at https://github.com/huggingface/diffusers/pull/254 ." //github.com/huggingface/diffusers/pull/254 ." | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_adapter.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_adapter.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_adapter.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_adapter.py | Corresponds to parameter eta (畏) in the DDIM paper https //arxiv.org/abs/2010.02502. Only applies to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_adapter.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_adapter.py | [PIL](https //pillow.readthedocs.io/en/stable/) `PIL.Image.Image` or `np.array`. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_adapter.py | [diffusers.models.attention_processor](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_adapter.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | >>> sketch_image = load_image("https://huggingface.co/Adapter/t2iadapter/resolve/main/sketch.png").convert("L") //huggingface.co/Adapter/t2iadapter/resolve/main/sketch.png").convert("L") | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | Sample Steps are Flawed](https //arxiv.org/pdf/2305.08891.pdf). See Section 3.4 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | https //arxiv.org/abs/2302.08453 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | [CLIP](https //huggingface.co/docs/transformers/model_doc/clip#transformers.CLIPTextModel), specifically | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | the [clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14) variant. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | Please, refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for details. 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | [stabilityai/stable-diffusion-xl-base-1.0](https //huggingface.co/stabilityai/stable-diffusion-xl-base-1.0) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | [stabilityai/stable-diffusion-xl-base-1.0](https //huggingface.co/stabilityai/stable-diffusion-xl-base-1.0) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | Output**](https //huggingface.co/docs/diffusers/api/pipelines/stable_diffusion/stable_diffusion_xl#refining-the-image-output) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | Corresponds to parameter eta (畏) in the DDIM paper https //arxiv.org/abs/2010.02502. Only applies to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | [PIL](https //pillow.readthedocs.io/en/stable/) `PIL.Image.Image` or `np.array`. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | [diffusers.models.attention_processor](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | Flawed](https //arxiv.org/pdf/2305.08891.pdf) `guidance_scale` is defined as `蠁` in equation 16. of | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | [Common Diffusion Noise Schedules and Sample Steps are Flawed](https //arxiv.org/pdf/2305.08891.pdf). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | section 2.2 of [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | section 2.2 of [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). 
For more | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | information, refer to this issue thread https //github.com/huggingface/diffusers/issues/4208. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). For more | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | information, refer to this issue thread https //github.com/huggingface/diffusers/issues/4208. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | [https //huggingface.co/papers/2307.01952](https //huggingface.co/papers/2307.01952). For more | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | information, refer to this issue thread https //github.com/huggingface/diffusers/issues/4208. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\t2i_adapter\pipeline_stable_diffusion_xl_adapter.py | # Based on 3.4. in https //arxiv.org/pdf/2305.08891.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\text_to_video_synthesis\pipeline_text_to_video_synth.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\text_to_video_synthesis\pipeline_text_to_video_synth.py | # This code is copied from https //github.com/modelscope/modelscope/blob/1509fdb973e5871f37148a4b5e5964cafd43e64d/modelscope/pipelines/multi_modal/text_to_video_synthesis_pipeline.py#L78 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\text_to_video_synthesis\pipeline_text_to_video_synth.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\text_to_video_synthesis\pipeline_text_to_video_synth.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\text_to_video_synthesis\pipeline_text_to_video_synth.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\text_to_video_synthesis\pipeline_text_to_video_synth.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\text_to_video_synthesis\pipeline_text_to_video_synth.py | [`self.processor`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\text_to_video_synthesis\pipeline_text_to_video_synth.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . 
`guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\text_to_video_synthesis\pipeline_text_to_video_synth_img2img.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\text_to_video_synthesis\pipeline_text_to_video_synth_img2img.py | # This code is copied from https //github.com/modelscope/modelscope/blob/1509fdb973e5871f37148a4b5e5964cafd43e64d/modelscope/pipelines/multi_modal/text_to_video_synthesis_pipeline.py#L78 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\text_to_video_synthesis\pipeline_text_to_video_synth_img2img.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\text_to_video_synthesis\pipeline_text_to_video_synth_img2img.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\text_to_video_synthesis\pipeline_text_to_video_synth_img2img.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\text_to_video_synthesis\pipeline_text_to_video_synth_img2img.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\text_to_video_synthesis\pipeline_text_to_video_synth_img2img.py | [`self.processor`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\text_to_video_synthesis\pipeline_text_to_video_synth_img2img.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\text_to_video_synthesis\pipeline_text_to_video_zero.py | # Adapted from https //github.com/princeton-vl/RAFT/blob/master/core/utils/utils.py | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\text_to_video_synthesis\pipeline_text_to_video_zero.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\text_to_video_synthesis\pipeline_text_to_video_zero.py | Please refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for more details | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\text_to_video_synthesis\pipeline_text_to_video_zero.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\text_to_video_synthesis\pipeline_text_to_video_zero.py | [`self.processor`](https //github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\text_to_video_synthesis\pipeline_text_to_video_zero.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\text_to_video_synthesis\pipeline_text_to_video_zero.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\text_to_video_synthesis\pipeline_text_to_video_zero.py | Strength of motion in generated video along x-axis. See the [paper](https //arxiv.org/abs/2303.13439), | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\text_to_video_synthesis\pipeline_text_to_video_zero.py | Strength of motion in generated video along y-axis. 
See the [paper](https //arxiv.org/abs/2303.13439), | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\text_to_video_synthesis\pipeline_text_to_video_zero.py | [paper](https //arxiv.org/abs/2303.13439), Sect. 3.3.1. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\text_to_video_synthesis\pipeline_text_to_video_zero.py | [paper](https //arxiv.org/abs/2303.13439), Sect. 3.3.1. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\text_to_video_synthesis\pipeline_text_to_video_zero.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\unclip\pipeline_unclip.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\unclip\pipeline_unclip.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\unclip\pipeline_unclip_image_variation.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\unclip\pipeline_unclip_image_variation.py | Frozen CLIP image-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\unclip\pipeline_unclip_image_variation.py | [configuration](https //huggingface.co/fusing/karlo-image-variations-diffusers/blob/main/feature_extractor/preprocessor_config.json). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\unclip\pipeline_unclip_image_variation.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\unclip\text_proj.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\unclip\text_proj.py | For more details, see the original paper https //arxiv.org/abs/2204.06125 section 2.1 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\unidiffuser\modeling_text_decoder.py | # Modified from ClipCaptionModel in https //github.com/thu-ml/unidiffuser/blob/main/libs/caption_decoder.py | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\unidiffuser\modeling_text_decoder.py | Text decoder model for a image-text [UniDiffuser](https //arxiv.org/pdf/2303.06555.pdf) model. This is used to | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\unidiffuser\modeling_text_decoder.py | code](https //github.com/thu-ml/unidiffuser/blob/main/libs/caption_decoder.py#L89). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\unidiffuser\modeling_uvit.py | # Method based on https //people.sc.fsu.edu/~jburkardt/presentations/truncated_normal.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\unidiffuser\modeling_uvit.py | implementation](https //github.com/thu-ml/unidiffuser/blob/main/libs/uvit_multi_post_ln_v1.py#L104). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\unidiffuser\modeling_uvit.py | # https //github.com/baofff/U-ViT | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\unidiffuser\modeling_uvit.py | Transformer model based on the [U-ViT](https //github.com/baofff/U-ViT) architecture for image-like data. Compared | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\unidiffuser\modeling_uvit.py | Transformer model for a image-text [UniDiffuser](https //arxiv.org/pdf/2303.06555.pdf) model. This is a | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\unidiffuser\pipeline_unidiffuser.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\unidiffuser\pipeline_unidiffuser.py | A [U-ViT](https //github.com/baofff/U-ViT) model with UNNet-style skip connections between transformer | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\unidiffuser\pipeline_unidiffuser.py | # eta corresponds to 畏 in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\unidiffuser\pipeline_unidiffuser.py | [UniDiffuser-v1](https //huggingface.co/thu-ml/unidiffuser-v1) checkpoint. | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\unidiffuser\pipeline_unidiffuser.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\unidiffuser\pipeline_unidiffuser.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\unidiffuser\pipeline_unidiffuser.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\versatile_diffusion\modeling_text_unet.py | " https://github.com/huggingface/diffusers/issues/2011#issuecomment-1547958131. Passing" //github.com/huggingface/diffusers/issues/2011#issuecomment-1547958131. Passing" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\versatile_diffusion\modeling_text_unet.py | # when this library was created. The incorrect naming was only discovered much later in https //github.com/huggingface/diffusers/issues/2011#issuecomment-1547958131 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\versatile_diffusion\pipeline_versatile_diffusion.py | Frozen text-encoder ([clip-vit-large-patch14](https //huggingface.co/openai/clip-vit-large-patch14)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\versatile_diffusion\pipeline_versatile_diffusion.py | Please refer to the [model card](https //huggingface.co/runwayml/stable-diffusion-v1-5) for more details | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\versatile_diffusion\pipeline_versatile_diffusion.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\versatile_diffusion\pipeline_versatile_diffusion.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\versatile_diffusion\pipeline_versatile_diffusion.py | >>> url = "https://huggingface.co/datasets/diffusers/images/resolve/main/benz.jpg" //huggingface.co/datasets/diffusers/images/resolve/main/benz.jpg" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\versatile_diffusion\pipeline_versatile_diffusion.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\versatile_diffusion\pipeline_versatile_diffusion.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\versatile_diffusion\pipeline_versatile_diffusion.py | Corresponds to parameter eta (畏) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. 
Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\versatile_diffusion\pipeline_versatile_diffusion.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\versatile_diffusion\pipeline_versatile_diffusion.py | >>> url = "https://huggingface.co/datasets/diffusers/images/resolve/main/benz.jpg" //huggingface.co/datasets/diffusers/images/resolve/main/benz.jpg" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\versatile_diffusion\pipeline_versatile_diffusion_dual_guided.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\versatile_diffusion\pipeline_versatile_diffusion_dual_guided.py | # eta corresponds to η in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\versatile_diffusion\pipeline_versatile_diffusion_dual_guided.py | Corresponds to parameter eta (η) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\versatile_diffusion\pipeline_versatile_diffusion_dual_guided.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\versatile_diffusion\pipeline_versatile_diffusion_dual_guided.py | >>> url = "https://huggingface.co/datasets/diffusers/images/resolve/main/benz.jpg" //huggingface.co/datasets/diffusers/images/resolve/main/benz.jpg" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\versatile_diffusion\pipeline_versatile_diffusion_dual_guided.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\versatile_diffusion\pipeline_versatile_diffusion_image_variation.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\versatile_diffusion\pipeline_versatile_diffusion_image_variation.py | # eta corresponds to η in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\versatile_diffusion\pipeline_versatile_diffusion_image_variation.py | Corresponds to parameter eta (η) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\versatile_diffusion\pipeline_versatile_diffusion_image_variation.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\versatile_diffusion\pipeline_versatile_diffusion_image_variation.py | >>> url = "https://huggingface.co/datasets/diffusers/images/resolve/main/benz.jpg" //huggingface.co/datasets/diffusers/images/resolve/main/benz.jpg" | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\versatile_diffusion\pipeline_versatile_diffusion_image_variation.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\versatile_diffusion\pipeline_versatile_diffusion_text_to_image.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\versatile_diffusion\pipeline_versatile_diffusion_text_to_image.py | # eta corresponds to η in DDIM paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\versatile_diffusion\pipeline_versatile_diffusion_text_to_image.py | Corresponds to parameter eta (η) from the [DDIM](https //arxiv.org/abs/2010.02502) paper. 
Only applies | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\versatile_diffusion\pipeline_versatile_diffusion_text_to_image.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\versatile_diffusion\pipeline_versatile_diffusion_text_to_image.py | # of the Imagen paper https //arxiv.org/pdf/2205.11487.pdf . `guidance_scale = 1` | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\vq_diffusion\pipeline_vq_diffusion.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\vq_diffusion\pipeline_vq_diffusion.py | Frozen text-encoder ([clip-vit-base-patch32](https //huggingface.co/openai/clip-vit-base-patch32)). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\vq_diffusion\pipeline_vq_diffusion.py | # https //github.com/huggingface/transformers/blob/d92e22d1f28324f513f3080e5c47c071a3916721/src/transformers/models/clip/modeling_clip.py#L1052-L1053 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\vq_diffusion\pipeline_vq_diffusion.py | A [`torch.Generator`](https //pytorch.org/docs/stable/generated/torch.Generator.html) to make | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\wuerstchen\modeling_paella_vq_model.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\wuerstchen\modeling_wuerstchen_common.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\wuerstchen\modeling_wuerstchen_common.py | # from https //github.com/facebookresearch/ConvNeXt-V2/blob/3608f67cc1dae164790c5d0aead7bf2d73d9719b/models/utils.py#L105 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\wuerstchen\modeling_wuerstchen_diffnext.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\wuerstchen\modeling_wuerstchen_prior.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\wuerstchen\pipeline_wuerstchen.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\wuerstchen\pipeline_wuerstchen.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\wuerstchen\pipeline_wuerstchen.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\wuerstchen\pipeline_wuerstchen.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\wuerstchen\pipeline_wuerstchen_combined.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\wuerstchen\pipeline_wuerstchen_combined.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\wuerstchen\pipeline_wuerstchen_combined.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\wuerstchen\pipeline_wuerstchen_combined.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\wuerstchen\pipeline_wuerstchen_combined.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). 
Guidance scale is enabled by setting `guidance_scale > | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\wuerstchen\pipeline_wuerstchen_combined.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\wuerstchen\pipeline_wuerstchen_prior.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\wuerstchen\pipeline_wuerstchen_prior.py | [CLIPTokenizer](https //huggingface.co/docs/transformers/v4.21.0/en/model_doc/clip#transformers.CLIPTokenizer). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\wuerstchen\pipeline_wuerstchen_prior.py | Guidance scale as defined in [Classifier-Free Diffusion Guidance](https //arxiv.org/abs/2207.12598). | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\wuerstchen\pipeline_wuerstchen_prior.py | Paper](https //arxiv.org/pdf/2205.11487.pdf). Guidance scale is enabled by setting | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipelines\wuerstchen\pipeline_wuerstchen_prior.py | One or a list of [torch generator(s)](https //pytorch.org/docs/stable/generated/torch.Generator.html) | 问题引导 | -| 开源代码引入 | .\src\diffusers\pipeline_utils.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_consistency_models.py | [paper](https //huggingface.co/papers/2206.00364). Defaults to 0.5 from the original implementation. | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_consistency_models.py | [paper](https //huggingface.co/papers/2206.00364). Defaults to 7.0 from the original implementation. | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_consistency_models.py | # See https //github.com/openai/consistency_models/blob/main/cm/karras_diffusion.py#L675 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_consistency_models.py | [paper](https //huggingface.co/papers/2303.01469)) to enforce boundary condition. | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim.py | # DISCLAIMER This code is strongly influenced by https //github.com/pesser/pytorch_diffusion | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim.py | # and https //github.com/hojonathanho/diffusion | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim.py | Rescales betas to have zero terminal SNR Based on https //arxiv.org/pdf/2305.08891.pdf (Algorithm 1) | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim.py | Video](https //imagen.research.google/video/paper.pdf) paper). | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim.py | Sample Steps are Flawed](https //huggingface.co/papers/2305.08891) for more information. | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim.py | [`--offset_noise`](https //github.com/huggingface/diffusers/blob/74fd735eb073eb1d774b1ab4154a0876eb82f055/examples/dreambooth/train_dreambooth.py#L506). | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim.py | https //arxiv.org/abs/2205.11487 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim.py | # "linspace", "leading", "trailing" corresponds to annotation of Table 2. 
of https://arxiv.org/abs/2305.08891 //arxiv.org/abs/2305.08891 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim.py | # See formulas (12) and (16) of DDIM paper https //arxiv.org/pdf/2010.02502.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim.py | # "predicted x_0" of formula (12) from https://arxiv.org/pdf/2010.02502.pdf //arxiv.org/pdf/2010.02502.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim.py | # 6. compute "direction pointing to x_t" of formula (12) from https://arxiv.org/pdf/2010.02502.pdf //arxiv.org/pdf/2010.02502.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim.py | # 7. compute x_t without "random noise" of formula (12) from https://arxiv.org/pdf/2010.02502.pdf //arxiv.org/pdf/2010.02502.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_flax.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_flax.py | # DISCLAIMER This code is strongly influenced by https //github.com/pesser/pytorch_diffusion | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_flax.py | # and https //github.com/hojonathanho/diffusion | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_flax.py | For more details, see the original paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_flax.py | # See formulas (12) and (16) of DDIM paper https //arxiv.org/pdf/2010.02502.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_flax.py | # "predicted x_0" of formula (12) from https://arxiv.org/pdf/2010.02502.pdf //arxiv.org/pdf/2010.02502.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_flax.py | # 5. compute "direction pointing to x_t" of formula (12) from https://arxiv.org/pdf/2010.02502.pdf //arxiv.org/pdf/2010.02502.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_flax.py | # 6. compute x_t without "random noise" of formula (12) from https://arxiv.org/pdf/2010.02502.pdf //arxiv.org/pdf/2010.02502.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_inverse.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_inverse.py | # DISCLAIMER This code is strongly influenced by https //github.com/pesser/pytorch_diffusion | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_inverse.py | # and https //github.com/hojonathanho/diffusion | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_inverse.py | Rescales betas to have zero terminal SNR Based on https //arxiv.org/pdf/2305.08891.pdf (Algorithm 1) | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_inverse.py | Video](https //imagen.research.google/video/paper.pdf) paper). | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_inverse.py | Sample Steps are Flawed](https //huggingface.co/papers/2305.08891) for more information. | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_inverse.py | [`--offset_noise`](https //github.com/huggingface/diffusers/blob/74fd735eb073eb1d774b1ab4154a0876eb82f055/examples/dreambooth/train_dreambooth.py#L506). | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_inverse.py | # "leading" and "trailing" corresponds to annotation of Table 1. 
of https://arxiv.org/abs/2305.08891 //arxiv.org/abs/2305.08891 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_inverse.py | # "predicted x_0" of formula (12) from https://arxiv.org/pdf/2010.02502.pdf //arxiv.org/pdf/2010.02502.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_inverse.py | # 5. compute "direction pointing to x_t" of formula (12) from https://arxiv.org/pdf/2010.02502.pdf //arxiv.org/pdf/2010.02502.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_inverse.py | # 6. compute x_t without "random noise" of formula (12) from https://arxiv.org/pdf/2010.02502.pdf //arxiv.org/pdf/2010.02502.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_parallel.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_parallel.py | # DISCLAIMER This code is strongly influenced by https //github.com/pesser/pytorch_diffusion | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_parallel.py | # and https //github.com/hojonathanho/diffusion | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_parallel.py | Rescales betas to have zero terminal SNR Based on https //arxiv.org/pdf/2305.08891.pdf (Algorithm 1) | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_parallel.py | For more details, see the original paper https //arxiv.org/abs/2010.02502 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_parallel.py | https //imagen.research.google/video/paper.pdf) | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_parallel.py | whether to use the "dynamic thresholding" method (introduced by Imagen, https://arxiv.org/abs/2205.11487). //arxiv.org/abs/2205.11487). | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_parallel.py | (https //arxiv.org/abs/2205.11487). Valid only when `thresholding=True`. | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_parallel.py | Steps are Flawed](https //arxiv.org/abs/2305.08891) for more information. | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_parallel.py | whether to rescale the betas to have zero terminal SNR (proposed by https //arxiv.org/pdf/2305.08891.pdf). | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_parallel.py | [`--offset_noise`](https //github.com/huggingface/diffusers/blob/74fd735eb073eb1d774b1ab4154a0876eb82f055/examples/dreambooth/train_dreambooth.py#L506). | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_parallel.py | https //arxiv.org/abs/2205.11487 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_parallel.py | # "linspace", "leading", "trailing" corresponds to annotation of Table 2. of https://arxiv.org/abs/2305.08891 //arxiv.org/abs/2305.08891 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_parallel.py | CycleDiffusion. (https //arxiv.org/abs/2210.05559) | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_parallel.py | # See formulas (12) and (16) of DDIM paper https //arxiv.org/pdf/2010.02502.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_parallel.py | # "predicted x_0" of formula (12) from https://arxiv.org/pdf/2010.02502.pdf //arxiv.org/pdf/2010.02502.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_parallel.py | # 6. 
compute "direction pointing to x_t" of formula (12) from https://arxiv.org/pdf/2010.02502.pdf //arxiv.org/pdf/2010.02502.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_parallel.py | # 7. compute x_t without "random noise" of formula (12) from https://arxiv.org/pdf/2010.02502.pdf //arxiv.org/pdf/2010.02502.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_parallel.py | # See formulas (12) and (16) of DDIM paper https //arxiv.org/pdf/2010.02502.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_parallel.py | # "predicted x_0" of formula (12) from https://arxiv.org/pdf/2010.02502.pdf //arxiv.org/pdf/2010.02502.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_parallel.py | # 6. compute "direction pointing to x_t" of formula (12) from https://arxiv.org/pdf/2010.02502.pdf //arxiv.org/pdf/2010.02502.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddim_parallel.py | # 7. compute x_t without "random noise" of formula (12) from https://arxiv.org/pdf/2010.02502.pdf //arxiv.org/pdf/2010.02502.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm.py | # DISCLAIMER This file is strongly influenced by https //github.com/ermongroup/ddim | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm.py | Video](https //imagen.research.google/video/paper.pdf) paper). | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm.py | Sample Steps are Flawed](https //huggingface.co/papers/2305.08891) for more information. | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm.py | # "linspace", "leading", "trailing" corresponds to annotation of Table 2. 
of https://arxiv.org/abs/2305.08891 //arxiv.org/abs/2305.08891 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm.py | # For t > 0, compute predicted variance βt (see formula (6) and (7) from https //arxiv.org/pdf/2006.11239.pdf) | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm.py | # for rl-diffuser https //arxiv.org/abs/2205.09991 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm.py | https //arxiv.org/abs/2205.11487 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm.py | # "predicted x_0" of formula (15) from https://arxiv.org/pdf/2006.11239.pdf //arxiv.org/pdf/2006.11239.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm.py | # See formula (7) from https //arxiv.org/pdf/2006.11239.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm.py | # See formula (7) from https //arxiv.org/pdf/2006.11239.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm_flax.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm_flax.py | # DISCLAIMER This file is strongly influenced by https //github.com/ermongroup/ddim | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm_flax.py | For more details, see the original paper https //arxiv.org/abs/2006.11239 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm_flax.py | # For t > 0, compute predicted variance βt (see formula (6) and (7) from https //arxiv.org/pdf/2006.11239.pdf) | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm_flax.py | # for rl-diffuser https //arxiv.org/abs/2205.09991 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm_flax.py | # "predicted x_0" of formula (15) from https://arxiv.org/pdf/2006.11239.pdf //arxiv.org/pdf/2006.11239.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm_flax.py | # See formula (7) from https //arxiv.org/pdf/2006.11239.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm_flax.py | # See formula (7) from https //arxiv.org/pdf/2006.11239.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm_parallel.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm_parallel.py | # DISCLAIMER This file is strongly influenced by https //github.com/ermongroup/ddim | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm_parallel.py | For more details, see the original paper https //arxiv.org/abs/2006.11239 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm_parallel.py | https //imagen.research.google/video/paper.pdf) | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm_parallel.py | whether to use the "dynamic thresholding" method (introduced by Imagen, https://arxiv.org/abs/2205.11487). //arxiv.org/abs/2205.11487). | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm_parallel.py | (https //arxiv.org/abs/2205.11487). Valid only when `thresholding=True`. | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm_parallel.py | Steps are Flawed](https //arxiv.org/abs/2305.08891) for more information. | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm_parallel.py | # "linspace", "leading", "trailing" corresponds to annotation of Table 2. 
of https://arxiv.org/abs/2305.08891 //arxiv.org/abs/2305.08891 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm_parallel.py | # For t > 0, compute predicted variance βt (see formula (6) and (7) from https //arxiv.org/pdf/2006.11239.pdf) | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm_parallel.py | # for rl-diffuser https //arxiv.org/abs/2205.09991 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm_parallel.py | https //arxiv.org/abs/2205.11487 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm_parallel.py | # "predicted x_0" of formula (15) from https://arxiv.org/pdf/2006.11239.pdf //arxiv.org/pdf/2006.11239.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm_parallel.py | # See formula (7) from https //arxiv.org/pdf/2006.11239.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm_parallel.py | # See formula (7) from https //arxiv.org/pdf/2006.11239.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm_parallel.py | # "predicted x_0" of formula (15) from https://arxiv.org/pdf/2006.11239.pdf //arxiv.org/pdf/2006.11239.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm_parallel.py | # See formula (7) from https //arxiv.org/pdf/2006.11239.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm_parallel.py | # See formula (7) from https //arxiv.org/pdf/2006.11239.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm_wuerstchen.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm_wuerstchen.py | # DISCLAIMER This file is strongly influenced by https //github.com/ermongroup/ddim | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ddpm_wuerstchen.py | For more details, see the original paper https //arxiv.org/abs/2006.11239 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_deis_multistep.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_deis_multistep.py | # DISCLAIMER check https //arxiv.org/abs/2204.13902 and https | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_deis_multistep.py | # The codebase is modified based on https //github.com/huggingface/diffusers/blob/main/src/diffusers/schedulers/scheduling_dpmsolver_multistep.py | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_deis_multistep.py | Video](https //imagen.research.google/video/paper.pdf) paper). | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_deis_multistep.py | Sample Steps are Flawed](https //huggingface.co/papers/2305.08891) for more information. | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_deis_multistep.py | # "linspace", "leading", "trailing" corresponds to annotation of Table 2. of https://arxiv.org/abs/2305.08891 //arxiv.org/abs/2305.08891 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_deis_multistep.py | https //arxiv.org/abs/2205.11487 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep.py | # DISCLAIMER This file is strongly influenced by https //github.com/LuChengTHU/dpm-solver | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep.py | Video](https //imagen.research.google/video/paper.pdf) paper). 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep.py | `dpmsolver` type implements the algorithms in the [DPMSolver](https //huggingface.co/papers/2206.00927) | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep.py | [DPMSolver++](https //huggingface.co/papers/2211.01095) paper. It is recommended to use `dpmsolver++` or | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep.py | Sample Steps are Flawed](https //huggingface.co/papers/2305.08891) for more information. | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep.py | # "linspace", "leading", "trailing" corresponds to annotation of Table 2. of https://arxiv.org/abs/2305.08891 //arxiv.org/abs/2305.08891 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep.py | https //arxiv.org/abs/2205.11487 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep.py | # See https //arxiv.org/abs/2211.01095 for detailed derivations | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep.py | # See https //arxiv.org/abs/2206.00927 for detailed derivations | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep.py | # See https //arxiv.org/abs/2206.00927 for detailed derivations | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep.py | # See https //arxiv.org/abs/2206.00927 for detailed derivations | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep_flax.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep_flax.py | # DISCLAIMER This file is strongly influenced by https //github.com/LuChengTHU/dpm-solver | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep_flax.py | For more details, see the original paper https //arxiv.org/abs/2206.00927 and https | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep_flax.py | We also support the "dynamic thresholding" method in Imagen (https://arxiv.org/abs/2205.11487). For pixel-space //arxiv.org/abs/2205.11487). For pixel-space | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep_flax.py | For more details, see the original paper https //arxiv.org/abs/2206.00927 and https | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep_flax.py | whether to use the "dynamic thresholding" method (introduced by Imagen, https://arxiv.org/abs/2205.11487). //arxiv.org/abs/2205.11487). | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep_flax.py | (https //arxiv.org/abs/2205.11487). | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep_flax.py | algorithms in https //arxiv.org/abs/2206.00927, and the `dpmsolver++` type implements the algorithms in | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep_flax.py | https //arxiv.org/abs/2211.01095. We recommend to use `dpmsolver++` with `solver_order=2` for guided | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep_flax.py | # Dynamic thresholding in https //arxiv.org/abs/2205.11487 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep_flax.py | See https //arxiv.org/abs/2206.00927 for the detailed derivation. 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep_flax.py | # See https //arxiv.org/abs/2211.01095 for detailed derivations | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep_flax.py | # See https //arxiv.org/abs/2206.00927 for detailed derivations | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep_flax.py | # See https //arxiv.org/abs/2206.00927 for detailed derivations | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep_flax.py | # See https //arxiv.org/abs/2206.00927 for detailed derivations | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep_inverse.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep_inverse.py | # DISCLAIMER This file is strongly influenced by https //github.com/LuChengTHU/dpm-solver | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep_inverse.py | Video](https //imagen.research.google/video/paper.pdf) paper). | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep_inverse.py | `dpmsolver` type implements the algorithms in the [DPMSolver](https //huggingface.co/papers/2206.00927) | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep_inverse.py | [DPMSolver++](https //huggingface.co/papers/2211.01095) paper. It is recommended to use `dpmsolver++` or | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep_inverse.py | Sample Steps are Flawed](https //huggingface.co/papers/2305.08891) for more information. | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep_inverse.py | # "linspace", "leading", "trailing" corresponds to annotation of Table 2. of https://arxiv.org/abs/2305.08891 //arxiv.org/abs/2305.08891 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep_inverse.py | https //arxiv.org/abs/2205.11487 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep_inverse.py | # See https //arxiv.org/abs/2211.01095 for detailed derivations | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep_inverse.py | # See https //arxiv.org/abs/2206.00927 for detailed derivations | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep_inverse.py | # See https //arxiv.org/abs/2206.00927 for detailed derivations | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_multistep_inverse.py | # See https //arxiv.org/abs/2206.00927 for detailed derivations | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_sde.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_sde.py | Generative Models](https //huggingface.co/papers/2206.00364) paper. | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_sde.py | Video](https //imagen.research.google/video/paper.pdf) paper). | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_sde.py | Sample Steps are Flawed](https //huggingface.co/papers/2305.08891) for more information. | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_sde.py | # "linspace", "leading", "trailing" corresponds to annotation of Table 2. 
of https://arxiv.org/abs/2305.08891 //arxiv.org/abs/2305.08891 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_singlestep.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_singlestep.py | # DISCLAIMER This file is strongly influenced by https //github.com/LuChengTHU/dpm-solver | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_singlestep.py | Video](https //imagen.research.google/video/paper.pdf) paper). | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_singlestep.py | `dpmsolver` type implements the algorithms in the [DPMSolver](https //huggingface.co/papers/2206.00927) | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_singlestep.py | [DPMSolver++](https //huggingface.co/papers/2211.01095) paper. It is recommended to use `dpmsolver++` or | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_singlestep.py | https //arxiv.org/abs/2205.11487 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_singlestep.py | # See https //arxiv.org/abs/2211.01095 for detailed derivations | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_singlestep.py | # See https //arxiv.org/abs/2206.00927 for detailed derivations | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_singlestep.py | # See https //arxiv.org/abs/2206.00927 for detailed derivations | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_dpmsolver_singlestep.py | # See https //arxiv.org/abs/2206.00927 for detailed derivations | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_euler_ancestral_discrete.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_euler_ancestral_discrete.py | Video](https //imagen.research.google/video/paper.pdf) paper). | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_euler_ancestral_discrete.py | Sample Steps are Flawed](https //huggingface.co/papers/2305.08891) for more information. | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_euler_ancestral_discrete.py | # "linspace", "leading", "trailing" corresponds to annotation of Table 2. of https://arxiv.org/abs/2305.08891 //arxiv.org/abs/2305.08891 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_euler_discrete.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_euler_discrete.py | Video](https //imagen.research.google/video/paper.pdf) paper). | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_euler_discrete.py | Sample Steps are Flawed](https //huggingface.co/papers/2305.08891) for more information. | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_euler_discrete.py | # "linspace", "leading", "trailing" corresponds to annotation of Table 2. of https://arxiv.org/abs/2305.08891 //arxiv.org/abs/2305.08891 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_euler_discrete.py | # Copied from https //github.com/crowsonkb/k-diffusion/blob/686dbad0f39640ea25c8a8c6a6e56bb40eacefa2/k_diffusion/sampling.py#L17 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_heun_discrete.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_heun_discrete.py | Video](https //imagen.research.google/video/paper.pdf) paper). 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_heun_discrete.py | Sample Steps are Flawed](https //huggingface.co/papers/2305.08891) for more information. | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_heun_discrete.py | # "linspace", "leading", "trailing" corresponds to annotation of Table 2. of https://arxiv.org/abs/2305.08891 //arxiv.org/abs/2305.08891 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ipndm.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_ipndm.py | # For more information on the algorithm please take a look at the paper https //arxiv.org/pdf/2202.09778.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_karras_ve.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_karras_ve.py | For more details on the parameters, see [Appendix E](https //arxiv.org/abs/2206.00364). The grid search values used | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_karras_ve_flax.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_karras_ve_flax.py | https //arxiv.org/abs/2206.00364 [2] Song, Yang, et al. "Score-based generative modeling through stochastic | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_karras_ve_flax.py | differential equations." https://arxiv.org/abs/2011.13456 //arxiv.org/abs/2011.13456 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_karras_ve_flax.py | Diffusion-Based Generative Models." https://arxiv.org/abs/2206.00364. The grid search values used to find the //arxiv.org/abs/2206.00364. The grid search values used to find the | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_k_dpm_2_ancestral_discrete.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_k_dpm_2_ancestral_discrete.py | the Design Space of Diffusion-Based Generative Models](https //huggingface.co/papers/2206.00364) paper. | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_k_dpm_2_ancestral_discrete.py | Video](https //imagen.research.google/video/paper.pdf) paper). | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_k_dpm_2_ancestral_discrete.py | Sample Steps are Flawed](https //huggingface.co/papers/2305.08891) for more information. | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_k_dpm_2_ancestral_discrete.py | # "linspace", "leading", "trailing" corresponds to annotation of Table 2. of https://arxiv.org/abs/2305.08891 //arxiv.org/abs/2305.08891 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_k_dpm_2_discrete.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_k_dpm_2_discrete.py | Diffusion-Based Generative Models](https //huggingface.co/papers/2206.00364) paper. | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_k_dpm_2_discrete.py | Video](https //imagen.research.google/video/paper.pdf) paper). | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_k_dpm_2_discrete.py | Sample Steps are Flawed](https //huggingface.co/papers/2305.08891) for more information. | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_k_dpm_2_discrete.py | # "linspace", "leading", "trailing" corresponds to annotation of Table 2. 
of https://arxiv.org/abs/2305.08891 //arxiv.org/abs/2305.08891 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_lms_discrete.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_lms_discrete.py | Video](https //imagen.research.google/video/paper.pdf) paper). | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_lms_discrete.py | Sample Steps are Flawed](https //huggingface.co/papers/2305.08891) for more information. | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_lms_discrete.py | # "linspace", "leading", "trailing" corresponds to annotation of Table 2. of https://arxiv.org/abs/2305.08891 //arxiv.org/abs/2305.08891 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_lms_discrete_flax.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_lms_discrete_flax.py | https //github.com/crowsonkb/k-diffusion/blob/481677d114f6ea445aa009cf5bd7a9cdee909e47/k_diffusion/sampling.py#L181 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_lms_discrete_flax.py | https //imagen.research.google/video/paper.pdf) | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_pndm.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_pndm.py | # DISCLAIMER This file is strongly influenced by https //github.com/ermongroup/ddim | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_pndm.py | or `v_prediction` (see section 2.4 of [Imagen Video](https //imagen.research.google/video/paper.pdf) | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_pndm.py | Sample Steps are Flawed](https //huggingface.co/papers/2305.08891) for more information. | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_pndm.py | # For more information on the algorithm please take a look at the paper https //arxiv.org/pdf/2202.09778.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_pndm.py | # "linspace", "leading", "trailing" corresponds to annotation of Table 2. 
of https://arxiv.org/abs/2305.08891 //arxiv.org/abs/2305.08891 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_pndm.py | # is based on crowsonkb's PLMS sampler implementation https //github.com/CompVis/latent-diffusion/pull/51 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_pndm.py | "See: https://github.com/huggingface/diffusers/blob/main/src/diffusers/pipelines/pipeline_pndm.py " https //github.com/huggingface/diffusers/blob/main/src/diffusers/pipelines/pipeline_pndm.py " | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_pndm.py | # See formula (9) of PNDM paper https //arxiv.org/pdf/2202.09778.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_pndm_flax.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_pndm_flax.py | # DISCLAIMER This file is strongly influenced by https //github.com/ermongroup/ddim | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_pndm_flax.py | For more details, see the original paper https //arxiv.org/abs/2202.09778 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_pndm_flax.py | https //imagen.research.google/video/paper.pdf) | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_pndm_flax.py | # For more information on the algorithm please take a look at the paper https //arxiv.org/pdf/2202.09778.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_pndm_flax.py | # is based on crowsonkb's PLMS sampler implementation https //github.com/CompVis/latent-diffusion/pull/51 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_pndm_flax.py | # See formula (9) of PNDM paper https //arxiv.org/pdf/2202.09778.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_repaint.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_repaint.py | # https //arxiv.org/pdf/2006.11239.pdf) and sample from it to get | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_repaint.py | # Is equivalent to formula (16) in https //arxiv.org/pdf/2010.02502.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_repaint.py | # "predicted x_0" of formula (15) from https://arxiv.org/pdf/2006.11239.pdf //arxiv.org/pdf/2006.11239.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_repaint.py | # from https //arxiv.org/pdf/2010.02502.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_repaint.py | # 7. compute x_{t-1} of formula (12) from https //arxiv.org/pdf/2010.02502.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_repaint.py | # 8. Algorithm 1 Line 5 https //arxiv.org/pdf/2201.09865.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_repaint.py | # 9. Algorithm 1 Line 8 https //arxiv.org/pdf/2201.09865.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_repaint.py | # 10. 
Algorithm 1 Line 10 https //arxiv.org/pdf/2201.09865.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_sde_ve.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_sde_ve.py | # DISCLAIMER This file is strongly influenced by https //github.com/yang-song/score_sde_pytorch | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_sde_ve_flax.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_sde_ve_flax.py | # DISCLAIMER This file is strongly influenced by https //github.com/yang-song/score_sde_pytorch | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_sde_ve_flax.py | For more information, see the original paper https //arxiv.org/abs/2011.13456 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_sde_vp.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_sde_vp.py | # DISCLAIMER This file is strongly influenced by https //github.com/yang-song/score_sde_pytorch | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_unclip.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_unclip.py | # For t > 0, compute predicted variance βt (see formula (6) and (7) from https //arxiv.org/pdf/2006.11239.pdf) | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_unclip.py | # "predicted x_0" of formula (15) from https://arxiv.org/pdf/2006.11239.pdf //arxiv.org/pdf/2006.11239.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_unclip.py | # See formula (7) from https //arxiv.org/pdf/2006.11239.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_unclip.py | # See formula (7) from https //arxiv.org/pdf/2006.11239.pdf | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_unipc_multistep.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_unipc_multistep.py | # DISCLAIMER check https //arxiv.org/abs/2302.04867 and https | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_unipc_multistep.py | # The codebase is modified based on https //github.com/huggingface/diffusers/blob/main/src/diffusers/schedulers/scheduling_dpmsolver_multistep.py | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_unipc_multistep.py | Video](https //imagen.research.google/video/paper.pdf) paper). | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_unipc_multistep.py | Sample Steps are Flawed](https //huggingface.co/papers/2305.08891) for more information. | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_unipc_multistep.py | # "linspace", "leading", "trailing" corresponds to annotation of Table 2. 
of https://arxiv.org/abs/2305.08891 //arxiv.org/abs/2305.08891 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_unipc_multistep.py | https //arxiv.org/abs/2205.11487 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_utils.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_utils.py | 'http //hostname' 'foo.bar | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_utils.py | To use private or [gated models](https //huggingface.co/docs/hub/models-gated#gated-models), log-in with | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_utils.py | ["offline-mode"](https://huggingface.co/diffusers/installation.html#offline-mode) to use this method in a //huggingface.co/diffusers/installation.html#offline-mode) to use this method in a | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_utils_flax.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_utils_flax.py | 'http //hostname' 'foo.bar | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_utils_flax.py | models](https //huggingface.co/docs/hub/models-gated#gated-models). | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_utils_flax.py | Activate the special ["offline-mode"](https://huggingface.co/transformers/installation.html#offline-mode) to //huggingface.co/transformers/installation.html#offline-mode) to | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\scheduling_vq_diffusion.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\schedulers\__init__.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\training_utils.py | # Adapted from torch-ema https //github.com/fadel/pytorch_ema/blob/master/torch_ema/ema.py#L14 | 问题引导 | -| 开源代码引入 | .\src\diffusers\training_utils.py | # https //pytorch.org/tutorials/beginner/saving_loading_models.html#what-is-a-state-dict | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\accelerate_utils.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\constants.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\constants.py | HUGGINGFACE_CO_RESOLVE_ENDPOINT = "https://huggingface.co" //huggingface.co" | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\doc_utils.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\dynamic_modules_utils.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\dynamic_modules_utils.py | "https://raw.githubusercontent.com/huggingface/diffusers/{revision}/examples/community/{pipeline}.py" //raw.githubusercontent.com/huggingface/diffusers/{revision}/examples/community/{pipeline}.py" | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\dynamic_modules_utils.py | url = "https://pypi.org/pypi/diffusers/json" //pypi.org/pypi/diffusers/json" | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\dynamic_modules_utils.py | 'http //hostname' 'foo.bar | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\dynamic_modules_utils.py | or [gated models](https //huggingface.co/docs/hub/models-gated#gated-models). | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\dynamic_modules_utils.py | 'http //hostname' 'foo.bar | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\dynamic_modules_utils.py | or [gated models](https //huggingface.co/docs/hub/models-gated#gated-models). 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\hub_utils.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\hub_utils.py | "file an issue at https://github.com/huggingface/diffusers/issues/new/choose, copy paste this whole " //github.com/huggingface/diffusers/issues/new/choose, copy paste this whole " | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\hub_utils.py | f"You are loading the variant {revision} from {pretrained_model_name_or_path} via `revision='{revision}'`. This behavior is deprecated and will be removed in diffusers v1. One should use `variant='{revision}'` instead. However, it appears that {pretrained_model_name_or_path} currently does not have a {_add_variant(weights_name, revision)} file in the 'main' branch of {pretrained_model_name_or_path}. \n The Diffusers team and community would be very grateful if you could open an issue: https://github.com/huggingface/diffusers/issues/new with the title '{pretrained_model_name_or_path} is missing {_add_variant(weights_name, revision)}' so that the correct variant file can be added.", https //github.com/huggingface/diffusers/issues/new with the title '{pretrained_model_name_or_path} is missing {_add_variant(weights_name, revision)}' so that the correct variant file can be added.", | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\hub_utils.py | "listed on 'https://huggingface.co/models'\nIf this is a private repository, make sure to pass a " //huggingface.co/models'\nIf this is a private repository, make sure to pass a " | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\hub_utils.py | f"'https://huggingface.co/{pretrained_model_name_or_path}' for available revisions." //huggingface.co/{pretrained_model_name_or_path}' for available revisions." | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\hub_utils.py | " offline mode at 'https://huggingface.co/docs/diffusers/installation#offline-mode'." //huggingface.co/docs/diffusers/installation#offline-mode'." | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\hub_utils.py | "'https://huggingface.co/models', make sure you don't have a local directory with the same name. " //huggingface.co/models', make sure you don't have a local directory with the same name. " | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\import_utils.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\import_utils.py | installation page https //github.com/google/flax and follow the ones that match your environment. | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\import_utils.py | installation page https //pytorch.org/get-started/locally/ and follow the ones that match your environment. | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\import_utils.py | installation page https //librosa.org/doc/latest/install.html and follow the ones that match your environment. 
| 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\import_utils.py | installation section https //github.com/rspeer/python-ftfy/tree/master#installing and follow the ones | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\import_utils.py | # This function was copied from https //github.com/huggingface/accelerate/blob/874c4967d94badd24f893064cc3bef45f57cadf7/src/accelerate/utils/versions.py#L319 | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\import_utils.py | # This function was copied from https //github.com/huggingface/accelerate/blob/874c4967d94badd24f893064cc3bef45f57cadf7/src/accelerate/utils/versions.py#L338 | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\import_utils.py | # https //github.com/optuna/optuna/blob/master/optuna/integration/__init__.py | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\loading_utils.py | if image.startswith("http://") or image.startswith("https://"): //") or image.startswith("https://"): //"): | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\loading_utils.py | f"Incorrect path or url, URLs must start with `http://` or `https://`, and {image} is not a valid path" //` or `https //`, and {image} is not a valid path" | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\logging.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\outputs.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\testing_utils.py | Decorator marking a test that requires compel https //github.com/damian0815/compel. These tests are skipped when | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\testing_utils.py | elif arry.startswith("http://") or arry.startswith("https://"): //") or arry.startswith("https://"): //"): | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\testing_utils.py | f"Incorrect path or url, URLs must start with `http://` or `https://`, and {arry} is not a valid path" //` or `https //`, and {arry} is not a valid path" | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\testing_utils.py | if image.startswith("http://") or image.startswith("https://"): //") or image.startswith("https://"): //"): | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\testing_utils.py | f"Incorrect path or url, URLs must start with `http://` or `https://`, and {image} is not a valid path" //` or `https //`, and {image} is not a valid path" | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\testing_utils.py | if not path.startswith("http://") or path.startswith("https://"): //") or path.startswith("https://"): //"): | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\testing_utils.py | "https://huggingface.co/datasets/fusing/diffusers-testing/resolve/main", urllib.parse.quote(path) //huggingface.co/datasets/fusing/diffusers-testing/resolve/main", urllib.parse.quote(path) | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\testing_utils.py | # adapted from https //github.com/pytest-dev/pytest/blob/897f151e/src/_pytest/runner.py#L66 | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\testing_utils.py | # adapted from https //github.com/pytest-dev/pytest/blob/897f151e/src/_pytest/terminal.py#L814 | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\testing_utils.py | # Taken from https //github.com/huggingface/transformers/blob/3658488ff77ff8d45101293e749263acf437f4d5/src/transformers/testing_utils.py#L1787 | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\testing_utils.py | - https //pytorch.org/docs/stable/notes/randomness.html for pytorch | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\torch_utils.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\src\diffusers\utils\__init__.py | # http //www.apache.org/licenses/LICENSE-2.0 | 
问题引导 | -| 开源代码引入 | .\src\diffusers\utils\__init__.py | "`https://huggingface.co/docs/diffusers/installation#install-from-source`)," //huggingface.co/docs/diffusers/installation#install-from-source`)," | 问题引导 | -| 开源代码引入 | .\src\diffusers\__init__.py | # https //github.com/huggingface/transformers/blob/main/src/transformers/__init__.py | 问题引导 | -| 开源代码引入 | .\utils\check_config_docstrings.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\utils\check_config_docstrings.py | # For example, `[bert-base-uncased](https //huggingface.co/bert-base-uncased)` | 问题引导 | -| 开源代码引入 | .\utils\check_config_docstrings.py | _re_checkpoint = re.compile("\[(.+?)\]\((https://huggingface\.co/.+?)\)") //huggingface\.co/.+?)\)") | 问题引导 | -| 开源代码引入 | .\utils\check_config_docstrings.py | # For example, `('bert-base-uncased', 'https //huggingface.co/bert-base-uncased')` | 问题引导 | -| 开源代码引入 | .\utils\check_config_docstrings.py | ckpt_link_from_name = f"https://huggingface.co/{ckpt_name}" //huggingface.co/{ckpt_name}" | 下载预训练模型 | -| 开源代码引入 | .\utils\check_copies.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\utils\check_doc_toc.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\utils\check_dummies.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\utils\check_inits.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\utils\check_repo.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\utils\check_repo.py | "(`pip install git+https://github.com/huggingface/doc-builder`)" //github.com/huggingface/doc-builder`)" | 问题引导 | -| 开源代码引入 | .\utils\check_table.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\utils\check_table.py | # Thanks to https //stackoverflow.com/questions/29916065/how-to-do-camelcase-split-in-python | 问题引导 | -| 开源代码引入 | .\utils\custom_init_isort.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\utils\get_modified_files.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\utils\overwrite_expected_slice.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\utils\print_env.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\utils\release.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\utils\release.py | "https://huggingface.co/docs/diffusers/main/model_doc", //huggingface.co/docs/diffusers/main/model_doc", | 问题引导 | -| 开源代码引入 | .\utils\release.py | "https://huggingface.co/docs/diffusers/model_doc", //huggingface.co/docs/diffusers/model_doc", | 问题引导 | -| 开源代码引入 | .\utils\stale.py | # http //www.apache.org/licenses/LICENSE-2.0 | 问题引导 | -| 开源代码引入 | .\utils\stale.py | https //github.com/allenai/allennlp. 
| 问题引导 | -| 开源代码引入 | .\utils\stale.py | "[contributing guidelines](https://github.com/huggingface/diffusers/blob/main/CONTRIBUTING.md) " //github.com/huggingface/diffusers/blob/main/CONTRIBUTING.md) " | 问题引导 | -| 开源代码引入 | .\_typos.toml | # Instruction https //github.com/marketplace/actions/typos-action#getting-started | 问题引导 | +| 文件位置 | 公网地址 | 公网地址用途 | +|------------------------------------------------------------------------------------------------------------------------------------------------------|--------------------------------------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.25.0/examples/controlnet/train_controlnet.py | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.25.0/examples/controlnet/train_controlnet.py | https://www.tensorflow.org/tensorboard | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.25.0/examples/controlnet/train_controlnet.py | https://pytorch.org/docs/stable/generated/torch.optim.Optimizer.zero_grad.html | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.25.0/examples/controlnet/train_controlnet_flax.py | https://arxiv.org/abs/2303.09556 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.25.0/examples/controlnet/train_controlnet_sdxl.py | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.25.0/examples/controlnet/train_controlnet_sdxl.py | https://www.tensorflow.org/tensorboard | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.25.0/examples/controlnet/train_controlnet_sdxl.py | https://pytorch.org/docs/stable/generated/torch.optim.Optimizer.zero_grad.html | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.25.0/examples/dreambooth/train_dreambooth.py | https://arxiv.org/abs/2303.09556 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.25.0/examples/dreambooth/train_dreambooth.py | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.25.0/examples/dreambooth/train_dreambooth.py | https://www.tensorflow.org/tensorboard | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.25.0/examples/dreambooth/train_dreambooth.py | https://pytorch.org/docs/stable/generated/torch.optim.Optimizer.zero_grad.html | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.25.0/examples/dreambooth/train_dreambooth.py | https://www.crosslabs.org//blog/diffusion-with-offset-noise | 问题引导 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.25.0/examples/dreambooth/train_dreambooth_flax.py | https://www.tensorflow.org/tensorboard | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.25.0/examples/dreambooth/train_dreambooth_lora.py | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.25.0/examples/dreambooth/train_dreambooth_lora.py | https://www.tensorflow.org/tensorboard | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.25.0/examples/dreambooth/train_dreambooth_lora_sdxl.py | https://arxiv.org/abs/2303.09556 | 论文地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.25.0/examples/dreambooth/train_dreambooth_lora_sdxl.py | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.25.0/examples/dreambooth/train_dreambooth_lora_sdxl.py | https://www.tensorflow.org/tensorboard | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.25.0/examples/text_to_image/train_text_to_image.py | https://arxiv.org/abs/2303.09556 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.25.0/examples/text_to_image/train_text_to_image.py | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.25.0/examples/text_to_image/train_text_to_image.py | https://www.tensorflow.org/tensorboard | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.25.0/examples/text_to_image/train_text_to_image_flax.py | https://www.tensorflow.org/tensorboard | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.25.0/examples/text_to_image/train_text_to_image_lora.py | https://arxiv.org/abs/2303.09556 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.25.0/examples/text_to_image/train_text_to_image_lora.py | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.25.0/examples/text_to_image/train_text_to_image_lora.py | https://www.tensorflow.org/tensorboard | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.25.0/examples/text_to_image/train_text_to_image_lora_sdxl.py | https://arxiv.org/abs/2303.09556 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.25.0/examples/text_to_image/train_text_to_image_lora_sdxl.py | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.25.0/examples/text_to_image/train_text_to_image_lora_sdxl.py | https://www.tensorflow.org/tensorboard | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.25.0/examples/text_to_image/train_text_to_image_sdxl.py | https://arxiv.org/abs/2303.09556 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.25.0/examples/text_to_image/train_text_to_image_sdxl.py | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.25.0/examples/text_to_image/train_text_to_image_sdxl.py | https://www.tensorflow.org/tensorboard | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.25.0/setup.py | patrick@huggingface.co | 作者邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.25.0/src/diffusers/loaders/single_file.py | https://raw.githubusercontent.com/lllyasviel/ControlNet/main/models/cldm_v15.yaml | 配置文件 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.25.0/src/diffusers/loaders/single_file.py | https://raw.githubusercontent.com/CompVis/stable-diffusion/main/configs/stable-diffusion/v1-inference.yaml | 配置文件 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.25.0/src/diffusers/models/controlnet_flax.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.25.0/src/diffusers/models/modeling_pytorch_flax_utils.py | https://pytorch.org/ | 三方库链接 | +| 
ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.25.0/src/diffusers/models/unet_2d_condition_flax.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.25.0/src/diffusers/models/vae_flax.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.25.0/src/diffusers/models/vae_flax.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.25.0/src/diffusers/models/vae_flax.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.25.0/src/diffusers/pipelines/deprecated/stable_diffusion_variants/pipeline_cycle_diffusion.py | https://raw.githubusercontent.com/ChenWu98/cycle-diffusion/main/data/dalle2/An%20astronaut%20riding%20a%20horse.png | 问题引导 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.25.0/src/diffusers/pipelines/kandinsky/pipeline_kandinsky_combined.py | https://raw.githubusercontent.com/CompVis/stable-diffusion/main/assets/stable-samples/img2img/sketch-mountains-input.jpg | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.25.0/src/diffusers/pipelines/kandinsky2_2/pipeline_kandinsky2_2_combined.py | https://raw.githubusercontent.com/CompVis/stable-diffusion/main/assets/stable-samples/img2img/sketch-mountains-input.jpg | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.25.0/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | https://raw.githubusercontent.com/Stability-AI/stablediffusion/main/configs/stable-diffusion/x4-upscaling.yaml | 配置文件 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.25.0/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | https://raw.githubusercontent.com/CompVis/stable-diffusion/main/configs/stable-diffusion/v1-inference.yaml | 配置文件 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.25.0/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | https://raw.githubusercontent.com/Stability-AI/stablediffusion/main/configs/stable-diffusion/v2-inference-v.yaml | 配置文件 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.25.0/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | https://raw.githubusercontent.com/Stability-AI/generative-models/main/configs/inference/sd_xl_refiner.yaml | 配置文件 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.25.0/src/diffusers/pipelines/stable_diffusion/convert_from_ckpt.py | https://raw.githubusercontent.com/Stability-AI/generative-models/main/configs/inference/sd_xl_base.yaml | 配置文件 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.25.0/src/diffusers/pipelines/stable_diffusion/pipeline_stable_diffusion_image_variation.py | https://lh3.googleusercontent.com/y-iFOHfLTwkuQSUegpwDdgKmOjRSTvPxat63dQLB25xkTs4lhIbRUFeNBWZzYf370g=s1200 | 问题引导 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.25.0/src/diffusers/utils/dynamic_modules_utils.py | https://raw.githubusercontent.com/huggingface/diffusers/{revision}/examples/community/{pipeline}.py | 下载依赖 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.25.0/src/diffusers/utils/import_utils.py | https://pytorch.org/get-started/locally/ | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/diffusers0.25.0/src/diffusers/utils/import_utils.py | https://librosa.org/doc/latest/install.html | 三方库链接 | \ No newline at end of file diff --git 
a/PyTorch/built-in/diffusion/kolors/public_address_statement.md b/PyTorch/built-in/diffusion/kolors/public_address_statement.md index d47a47f20798dd74d2526443a53738876c4582e0..6fc29dc6363809efad042758dbf8594ef49d5f4f 100644 --- a/PyTorch/built-in/diffusion/kolors/public_address_statement.md +++ b/PyTorch/built-in/diffusion/kolors/public_address_statement.md @@ -1,13 +1,4 @@ - -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱 | 用途说明 | -|:------:|:-------------------------:|:---------------------------------------------------------------------------------------------:|:--------------------:|:-----------------:| -| 开源代码引入 | https://github.com/Kwai-Kolors/Kolors/blob/master/MODEL_LICENSE | ./MODEL_LICENSE | kwai-kolors@kuaishou.com | Model License | -| 开源代码引入 | https://github.com/Kwai-Kolors/Kolors/blob/master/MODEL_LICENSE | ./MODEL_LICENSE | kwai-kolors@kuaishou.com | Model License | -| 开源代码引入 | https://github.com/Kwai-Kolors/Kolors/blob/master/kolors/pipelines/pipeline_stable_diffusion_xl_chatglm_256_inpainting.py | ./kolors/pipelines/pipeline_stable_diffusion_xl_chatglm_256_inpainting.py | https://raw.githubusercontent.com/CompVis/latent-diffusion/main/data/inpainting_examples/overture-creations-5sI6fQgYIuo.png | examples image | -| 开源代码引入 | https://github.com/Kwai-Kolors/Kolors/blob/master/kolors/pipelines/pipeline_stable_diffusion_xl_chatglm_256_inpainting.py | ./kolors/pipelines/pipeline_stable_diffusion_xl_chatglm_256_inpainting.py | https://raw.githubusercontent.com/CompVis/latent-diffusion/main/data/inpainting_examples/overture-creations-5sI6fQgYIuo_mask.png | examples image | -| 开源代码引入 | https://github.com/Kwai-Kolors/Kolors/blob/master/dreambooth/train_dreambooth_lora.py | ./dreambooth/train_dreambooth_lora.py | https://www.tensorflow.org/tensorboard | help for args | -| 开源代码引入 | https://github.com/Kwai-Kolors/Kolors/blob/master/dreambooth/train_dreambooth_lora.py | ./dreambooth/train_dreambooth_lora.py | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | help for args | - - - - +| 文件位置 | 公网地址 | 公网地址用途 | +|----------------------------------------------------------------------------------------|---------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/kolors/dreambooth/train_dreambooth_lora.py | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/kolors/dreambooth/train_dreambooth_lora.py | https://www.tensorflow.org/tensorboard | 设置说明 | \ No newline at end of file diff --git a/PyTorch/built-in/diffusion/sd-scripts-xl/public_address_statement.md b/PyTorch/built-in/diffusion/sd-scripts-xl/public_address_statement.md index 06e50c243f26b0122bc16de2373daad736629e12..ffdc301b71ec395415fe3458ab72090ba77e0523 100644 --- a/PyTorch/built-in/diffusion/sd-scripts-xl/public_address_statement.md +++ b/PyTorch/built-in/diffusion/sd-scripts-xl/public_address_statement.md @@ -1,5 +1,3 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ---- | ------------ | ------ | ------------------------------------ |------| -|开源代码引入|https://github.com/kohya-ss/sd-scripts/blob/main/finetune/make_captions.py|make_captions.py|https://storage.googleapis.com/sfr-vision-language-research/BLIP/models/model_large_caption.pth| 下载模型文件 | -|开源代码引入|https://github.com/kohya-ss/sd-scripts/blob/main/library/custom_train_functions.py|custom_train_functions.py|https://arxiv.org/abs/2305.08891| 论文地址 | 
-|开源代码引入|https://github.com/kohya-ss/sd-scripts/blob/main/library/sai_model_spec.py|sai_model_spec.py|https://github.com/Stability-AI/generative-models| 开源代码仓 | +| 文件位置 | 公网地址 | 公网地址用途 | +|-------------------------------------------------------------------------------------|-------------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/built-in/diffusion/sd-scripts-xl/finetune/make_captions.py | https://storage.googleapis.com/sfr-vision-language-research/BLIP/models/model_large_caption.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/built-in/foundation/Aquila2/public_address_statement.md b/PyTorch/built-in/foundation/Aquila2/public_address_statement.md index d991c8c6b32599a235e3d50404f9fb27e48bc7cf..cc05099ff60895d6edaab47cdb1aed060e45f7d0 100644 --- a/PyTorch/built-in/foundation/Aquila2/public_address_statement.md +++ b/PyTorch/built-in/foundation/Aquila2/public_address_statement.md @@ -1,9 +1,8 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ---- | ------------ | ------ | ------------------------------------ | -------- | -| 开源代码引入 | https://github.com/FlagOpen/FlagScale/blob/release/v0.2/megatron/megatron/tokenizer/gpt2_tokenization.py | .\Aquila2\megatron\megatron\tokenizer\gpt2_tokenization.py | https://s3.amazonaws.com/models.huggingface.co/bert/gpt2-vocab.json | 词表文件公网下载地址 | -| 开源代码引入 | https://github.com/FlagOpen/FlagScale/blob/release/v0.2/megatron/megatron/tokenizer/gpt2_tokenization.py | .\Aquila2\megatron\megatron\tokenizer\gpt2_tokenization.py | https://s3.amazonaws.com/models.huggingface.co/bert/gpt2-merges.txt | 词表文件公网下载地址 | -| 开源代码引入 | https://github.com/FlagOpen/FlagScale/blob/release/v0.2/megatron/megatron/core/package_info.py | .\Aquila2\megatron\megatron\core\package_info.py | https://docs.nvidia.com/deeplearning/nemo/user-guide/docs/en/stable/ | 工具公网主页地址 | -| 开源代码引入 | https://github.com/FlagOpen/FlagScale/blob/release/v0.2/megatron/megatron/core/package_info.py | .\Aquila2\megatron\megatron\core\package_info.py | https://github.com/NVIDIA/Megatron-LM/megatron/core | 工具公网地址 | -| 开源代码引入 | https://github.com/FlagOpen/FlagScale/blob/release/v0.2/megatron/megatron/core/package_info.py | .\Aquila2\megatron\megatron\core\package_info.py | https://github.com/NVIDIA/Megatron-LM/releases | 工具公网地址 | -| 开源代码引入 | https://github.com/FlagOpen/FlagScale/blob/release/v0.2/megatron/examples/detxoify_lm/perspective_api.py | .\Aquila2\megatron\examples\detxoify_lm\perspective_api.py | https://commentanalyzer.googleapis.com/$discovery/rest?version=v1alpha1 | 云服务公网地址 | -| 开源代码引入 | https://github.com/FlagOpen/FlagScale/blob/release/v0.2/megatron/examples/detxoify_lm/annotations/perspective_api_annotate.py | .\Aquila2\megatron\examples\detxoify_lm\annotations\perspective_api_annotate.py | https://commentanalyzer.googleapis.com/$discovery/rest?version=v1alpha1 | 云服务公网地址 | +| 文件位置 | 公网地址 | 公网地址用途 | +|----------------------------------------------------------------------------------------------------------------------------|-------------------------------------------------------------------------|-----------------| +| ModelZoo-PyTorch/PyTorch/built-in/foundation/Aquila2/aquila/34B/finetune_aquila_34b_distributed_xpu.sh | 192.167.5.6 | MASTER ADDRESS | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/Aquila2/megatron/examples/detxoify_lm/annotations/perspective_api_annotate.py | https://commentanalyzer.googleapis.com/$discovery/rest?version=v1alpha1 | 云服务公网地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/foundation/Aquila2/megatron/examples/detxoify_lm/perspective_api.py | https://commentanalyzer.googleapis.com/$discovery/rest?version=v1alpha1 | 云服务公网地址 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/Aquila2/megatron/megatron/static/index.html | https://cdnjs.cloudflare.com/ajax/libs/jquery/3.5.1/jquery.min.js | html相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/Aquila2/megatron/megatron/tokenizer/gpt2_tokenization.py | https://s3.amazonaws.com/models.huggingface.co/bert/gpt2-vocab.json | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/Aquila2/megatron/megatron/tokenizer/gpt2_tokenization.py | https://s3.amazonaws.com/models.huggingface.co/bert/gpt2-merges.txt | 模型相关说明 | \ No newline at end of file diff --git a/PyTorch/built-in/foundation/ChatGLM-6B/public_address_statement.md b/PyTorch/built-in/foundation/ChatGLM-6B/public_address_statement.md index f10f8d2d3d383c99f53a1916ce1b56f56f138eff..1c860138129fc7f621c7183cd4030522e1dac4e3 100644 --- a/PyTorch/built-in/foundation/ChatGLM-6B/public_address_statement.md +++ b/PyTorch/built-in/foundation/ChatGLM-6B/public_address_statement.md @@ -1,56 +1,5 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ---- | ------------ | ------ | ------------------------------------ | -------- | -| 开源代码引入 | https://github.com/THUDM/ChatGLM-6B/blob/master/ptuning/web_demo.py|ChatGLM-6B/web_demo.py | https://github.com/GaiZhenbiao/ChuanhuChatGPT/ | 源码实现 | -| 开源代码引入 | https://github.com/THUDM/ChatGLM-6B/blob/master/ptuning/web_demo.py|ChatGLM-6B/web_demo_vision.py | https://github.com/GaiZhenbiao/ChuanhuChatGPT/ | 源码实现 | -| 开发引入 | / | ChatGLM-6B/fix/training_args.py | https://github.com/pytorch/xla/pull/3609 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/training_args_tf.py | ChatGLM-6B/fix/training_args.py | https://docs.python.org/3/library/argparse#module-argparse | 模型相关说明 | -| 开发引入 | / | ChatGLM-6B/fix/training_args.py | https://github.com/huggingface/transformers/tree/main/examples | 源码实现 | -| 开发引入 | / | ChatGLM-6B/fix/training_args.py | https://github.com/huggingface/transformers/tree/main/examples | 源码实现 | -| 开发引入 | / | ChatGLM-6B/fix/training_args.py | https://github.com/huggingface/transformers/tree/main/examples | 源码实现 | -| 开发引入 | / | ChatGLM-6B/fix/training_args.py | https://www.tensorflow.org/tensorboard | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/integrations/deepspeed.py | ChatGLM-6B/fix/deepspeed.py | https://github.com/microsoft/DeepSpeed/issues/1394#issuecomment-937405374 | 模型相关说明 | -| 开发引入 | / | ChatGLM-6B/fix/deepspeed.py | https://github.com/microsoft/DeepSpeed/issues/1612 | 模型相关说明 | -| 开源代码引入 | https://github.com/THUDM/ChatGLM-6B/blob/master/ptuning/trainer.py|ChatGLM-6B/fix/training_args.py | https://github.com/intel/intel-extension-for-pytorch | 源码实现 | -| 开发引入 | / | ChatGLM-6B/fix/training_args.py | https://nvidia.github.io/apex/amp | 模型相关说明 | -| 开发引入 | / | ChatGLM-6B/fix/training_args.py | https://huggingface.co/docs/transformers/performance#tf32 | 模型相关说明 | -| 开发引入 | / | ChatGLM-6B/fix/training_args.py | https://www.wandb.com/ | 模型相关说明 | -| 开发引入 | / | ChatGLM-6B/fix/training_args.py | https://www.mlflow.org/ | 模型相关说明 | -| 开发引入 | / | ChatGLM-6B/fix/training_args.py | https://github.com/facebookresearch/fairscale | 源码实现 | -| 开发引入 | / | ChatGLM-6B/model/modeling_chatglm.py | https://huggingface.co/models?filter=chatglm | 模型相关说明 | -| 开发引入 | / | ChatGLM-6B/model/modeling_chatglm.py | 
https://www.tensorflow.org/install/ | 模型相关说明 | -| 开发引入 | / | ChatGLM-6B/fix/training_args.py | https://github.com/pytorch/xla/blob/master/torch_xla/distributed/fsdp/xla_fully_sharded_data_parallel.py | 源码实现 | -| 开发引入 | / | ChatGLM-6B/fix/training_args.py | https://github.com/microsoft/deepspeed | 源码实现 | -| 开发引入 | / | ChatGLM-6B/fix/training_args.py | https://github.com/huggingface/transformers/tree/main/examples | 源码实现 | -| 开发引入 | / | ChatGLM-6B/fix/training_args.py | https://docs.ray.io/en/latest/tune/api_docs/analysis.html#ray.tune.ExperimentAnalysis.get_best_trial | 模型相关说明 | -| 开发引入 | / | ChatGLM-6B/fix/training_args.py | https://pytorch.org/docs/stable/distributed.html#torch.distributed.init_process_group | 模型相关说明 | -| 开发引入 | / | ChatGLM-6B/fix/training_args.py | https://pytorch.org/get-started/pytorch-2.0/ | 模型相关说明 | -| 开发引入 | / | ChatGLM-6B/fix/training_args.py | https://pytorch.org/docs/2.0/generated/torch.compile.html?highlight=torch+compile#torch.compile | 模型相关说明 | -| 开源代码引入 | https://github.com/Vision-CAIR/MiniGPT-4/blob/master/minigpt4/models/Qformer.py | ChatGLM-6B/fix/utils.py | https://github.com/huggingface/transformers/pull/21405 | 源码实现 | -| 开源代码引入 | https://github.com/THUDM/ChatGLM-6B/blob/master/ptuning/trainer.py|ChatGLM-6B/fix/training_args.py | https://github.com/intel/intel-extension-for-pytorch | 源码实现 | -| 开发引入 | / | ChatGLM-6B/fix/training_args.py | https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/generation/utils.py | ChatGLM-6B/fix/utils.py | https://github.com/huggingface/transformers/pull/5420/files | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/generation/utils.py | ChatGLM-6B/fix/utils.py | https://github.com/huggingface/transformers/pull/5420/files | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/aquila2/modeling_aquila.py | ChatGLM-6B/model/modeling_chatglm.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/rnn/attentions.py | ChatGLM-6B/model/modeling_chatglm.py | https://arxiv.org/abs/1706.03762 | 模型相关说明 | -| 开发引入 | / | ChatGLM-6B/fix/training_args.py | https://docs.ray.io/en/latest/tune/api_docs/analysis.html | 模型相关说明 | -| 开发引入 | / | ChatGLM-6B/fix/training_args.py | https://github.com/huggingface/transformers/issues/10628 | 模型相关说明 | -| 开源代码引入 | https://github.com/Vision-CAIR/MiniGPT-4/blob/master/minigpt4/models/Qformer.py | ChatGLM-6B/fix/utils.py | https://arxiv.org/abs/2010.00904 | 模型相关说明 | -| 开源代码引入 | https://huggingface.co | ChatGLM-6B/fix/utils.py | https://huggingface.co/docs/transformers/main_classes/text_generation | 模型相关说明 | -| 开源代码引入 | https://huggingface.co | ChatGLM-6B/fix/utils.py | https://huggingface.co/docs/transformers/main/en/main_classes/text_generation | 模型相关说明 | -| 开源代码引入 | https://huggingface.co | ChatGLM-6B/model/modeling_chatglm.py | https://huggingface.co/docs/transformers/main/en/main_classes/text_generation | 模型相关说明 | -| 开源代码引入 | https://github.com/pytorch/pytorch | ChatGLM-6B/fix/training_args.py | https://github.com/pytorch/pytorch/issues/82707 | 模型相关说明 | -| 开源代码引入 | https://github.com/THUDM/ChatGLM-6B/blob/master/ptuning/trainer.py|ChatGLM-6B/ptuning/trainer.py | https://huggingface.co/docs/transformers/model_doc/auto | 模型相关说明 | -| 开源代码引入 | https://github.com/THUDM/ChatGLM-6B/blob/master/ptuning/trainer.py|ChatGLM-6B/ptuning/trainer.py | 
https://github.com/huggingface/peft | 源码实现 | -| 开源代码引入 | https://github.com/THUDM/ChatGLM-6B/blob/master/ptuning/web_demo.py|ChatGLM-6B/ptuning/web_demo.py | https://github.com/GaiZhenbiao/ChuanhuChatGPT/ | 源码实现 | -| 开源代码引入 | https://github.com/THUDM/ChatGLM-6B/blob/master/ptuning/trainer.py|ChatGLM-6B/ptuning/trainer.py | https://www.github.com/nvidia/apex | 模型相关说明 | -| 开源代码引入 | https://github.com/THUDM/ChatGLM-6B/blob/master/ptuning/trainer.py|ChatGLM-6B/ptuning/trainer.py | https://github.com/pytorch/torchdistx | 源码实现 | -| 开源代码引入 | https://github.com/THUDM/ChatGLM-6B/blob/master/ptuning/trainer.py|ChatGLM-6B/ptuning/trainer.py | https://github.com/intel/intel-extension-for-pytorch | 源码实现 | -| 开源代码引入 | https://github.com/THUDM/ChatGLM-6B/blob/master/ptuning/trainer.py|ChatGLM-6B/ptuning/trainer.py | https://github.com/huggingface/transformers/pull/4659#issuecomment-643356021 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers | ChatGLM-6B/fix/utils.py | https://github.com/huggingface/transformers/pull/5420#discussion_r449779867 | 源码实现 | -| 开源代码引入 | https://github.com/THUDM/ChatGLM-6B/blob/master/ptuning/trainer.py|ChatGLM-6B/ptuning/trainer.py | https://github.com/pytorch/pytorch/issues/82963 | 模型相关说明 | -| 开源代码引入 | https://github.com/THUDM/ChatGLM-6B/blob/master/ptuning/trainer.py|ChatGLM-6B/ptuning/trainer.py | https://github.com/pytorch/pytorch/issues/82963 | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/docs/TUTORIAL_6_INSTRUCTIONS_FOR_PREDICTOR.md | ChatGLM-6B/fix/utils.py | http://arxiv.org/abs/1904.09751 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/generation/tf_utils.py | ChatGLM-6B/fix/utils.py | https://gist.github.com/thomwolf/1a5a29f6962089e871b94cbd09daf317 | 模型相关说明 | -| 开源代码引入 | https://github.com/THUDM/ChatGLM-6B/blob/master/ptuning/trainer.py|ChatGLM-6B/ptuning/trainer.py | https://optuna.readthedocs.io/en/stable/reference/generated/optuna.study.create_study.html | 模型相关说明 | -| 开源代码引入 | https://github.com/THUDM/ChatGLM-6B/blob/master/ptuning/trainer.py|ChatGLM-6B/ptuning/trainer.py | https://docs.ray.io/en/latest/tune/api_docs/execution.html#tune-run | 模型相关说明 | -| 开源代码引入 | https://github.com/THUDM/ChatGLM-6B/blob/master/ptuning/trainer.py|ChatGLM-6B/ptuning/trainer.py | https://app.sigopt.com/docs/endpoints/experiments/create | 模型相关说明 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|-----------------------------------------------------------------------------------|-----------------------------------------------------------|-------------| +| ModelZoo-PyTorch/PyTorch/built-in/foundation/ChatGLM-6B/fix/training_args.py | https://docs.ray.io/en/latest/tune/api_docs/analysis.html | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/ChatGLM-6B/model/modeling_chatglm.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/ChatGLM-6B/model/modeling_chatglm.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | \ No newline at end of file diff --git a/PyTorch/built-in/foundation/ChatGLM2-6B/public_address_statement.md b/PyTorch/built-in/foundation/ChatGLM2-6B/public_address_statement.md index 702c1fbff8d425a64e385d1db34407d9b9e5d251..a19d2b74f2c51198d89eaedba03234e622cf137a 100644 --- a/PyTorch/built-in/foundation/ChatGLM2-6B/public_address_statement.md +++ b/PyTorch/built-in/foundation/ChatGLM2-6B/public_address_statement.md @@ -1,45 +1,3 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ---- | 
------------ | ------ | ------------------------------------ | -------- | -| 开源代码引入 | https://github.com/THUDM/ChatGLM2-6B/blob/main/openai_api.py|ChatGLM2-6B/openai_api.py | https://platform.openai.com/docs/api-reference/ch | 模型相关说明 | -| 开源代码引入 | https://github.com/THUDM/ChatGLM2-6B/blob/main/openai_api.py|ChatGLM2-6B/openai_api.py | http://localhost:8000/do | 模型相关说明 | -| 开发引入 | / | ChatGLM2-6B/utils.py | https://github.com/THUDM/ChatGLM-6B/blob/main/utils. | 源码实现 | -| 开源代码引入 | https://github.com/THUDM/ChatGLM2-6B/blob/main/ptuning/web_demo.py|ChatGLM2-6B/web_demo.py | https://github.com/GaiZhenbiao/ChuanhuChatGP | 源码实现 | -| 开发引入 | / | ChatGLM2-6B/fix/training_args.py | https://github.com/pytorch/xla/pull/36 | 源码实现 | -| 开发引入 | / | ChatGLM2-6B/fix/training_args.py | https://docs.python.org/3/library/argparse#module-argpar | 模型相关说明 | -| 开发引入 | / | ChatGLM2-6B/fix/training_args.py | https://github.com/huggingface/transformers/tree/main/exampl | 源码实现 | -| 开发引入 | / | ChatGLM2-6B/fix/training_args.py | https://github.com/huggingface/transformers/tree/main/exampl | 源码实现 | -| 开发引入 | / | ChatGLM2-6B/fix/training_args.py | https://github.com/huggingface/transformers/tree/main/exampl | 源码实现 | -| 开发引入 | / | ChatGLM2-6B/fix/training_args.py | https://www.tensorflow.org/tensorboa | 模型相关说明 | -| 开发引入 | / | ChatGLM2-6B/fix/training_args.py | https://github.com/intel/intel-extension-for-pytor | 源码实现 | -| 开发引入 | / | ChatGLM2-6B/fix/training_args.py | https://nvidia.github.io/apex/a | 模型相关说明 | -| 开发引入 | / | ChatGLM2-6B/fix/training_args.py | https://huggingface.co/docs/transformers/performance#tf | 模型相关说明 | -| 开发引入 | / | ChatGLM2-6B/fix/training_args.py | https://www.wandb.co | 模型相关说明 | -| 开发引入 | / | ChatGLM2-6B/fix/training_args.py | https://www.mlflow.or | 模型相关说明 | -| 开发引入 | / | ChatGLM2-6B/fix/training_args.py | https://github.com/facebookresearch/fairsca | 源码实现 | -| 开发引入 | / | ChatGLM2-6B/fix/training_args.py | https://github.com/pytorch/xla/blob/master/torch_xla/distributed/fsdp/xla_fully_sharded_data_parallel. | 源码实现 | -| 开发引入 | / | ChatGLM2-6B/fix/training_args.py | https://github.com/microsoft/deepspe | 源码实现 | -| 开发引入 | / | ChatGLM2-6B/fix/training_args.py | https://github.com/huggingface/transformers/tree/main/exampl | 源码实现 | -| 开发引入 | / | ChatGLM2-6B/fix/training_args.py | https://docs.ray.io/en/latest/tune/api_docs/analysis.html#ray.tune.ExperimentAnalysis.get_best_tri | 模型相关说明 | -| 开发引入 | / | ChatGLM2-6B/fix/training_args.py | https://pytorch.org/docs/stable/distributed.html#torch.distributed.init_process_gro | 模型相关说明 | -| 开发引入 | / | ChatGLM2-6B/fix/training_args.py | https://pytorch.org/get-started/pytorch-2. 
| 模型相关说明 | -| 开发引入 | / | ChatGLM2-6B/fix/training_args.py | https://pytorch.org/docs/2.0/generated/torch.compile.html?highlight=torch+compile#torch.compi | 模型相关说明 | -| 开发引入 | / | ChatGLM2-6B/fix/utils.py | https://github.com/huggingface/transformers/pull/214 | 源码实现 | -| 开发引入 | / | ChatGLM2-6B/fix/training_args.py | https://github.com/intel/intel-extension-for-pytor | 源码实现 | -| 开发引入 | / | ChatGLM2-6B/fix/training_args.py | https://nvidia.github.io/apex/amp.ht | 模型相关说明 | -| 开发引入 | / | ChatGLM2-6B/fix/utils.py | https://github.com/huggingface/transformers/pull/5420/fil | 源码实现 | -| 开发引入 | / | ChatGLM2-6B/fix/utils.py | https://github.com/huggingface/transformers/pull/5420/fil | 源码实现 | -| 开发引入 | / | ChatGLM2-6B/fix/training_args.py | https://docs.ray.io/en/latest/tune/api_docs/analysis.ht | 模型相关说明 | -| 开发引入 | / | ChatGLM2-6B/fix/training_args.py | https://github.com/huggingface/transformers/issues/106 | 模型相关说明 | -| 开发引入 | / | ChatGLM2-6B/fix/utils.py | https://arxiv.org/abs/2010.009 | 引用论文参考地址 | -| 开发引入 | / | ChatGLM2-6B/fix/utils.py | https://huggingface.co/docs/transformers/main_classes/text_generati | 模型相关说明 | -| 开发引入 | / | ChatGLM2-6B/fix/utils.py | https://huggingface.co/docs/transformers/main/en/main_classes/text_generati | 模型相关说明 | -| 开发引入 | / | ChatGLM2-6B/fix/training_args.py | https://github.com/pytorch/pytorch/issues/827 | 模型相关说明 | -| 开发引入 | / | ChatGLM2-6B/model/modeling_chatglm.py | https://huggingface.co/models?filter=chatg | 模型相关说明 | -| 开发引入 | / | ChatGLM2-6B/model/modeling_chatglm.py | https://github.com/labmlai/annotated_deep_learning_paper_implementations/blob/master/labml_n | 源码实现 | -| 开发引入 | / | ChatGLM2-6B/model/modeling_chatglm.py | https://github.com/labmlai/annotated_deep_learning_paper_implementations/blob/master/licen | 源码实现 | -| 开发引入 | / | ChatGLM2-6B/model/modeling_chatglm.py | https://arxiv.org/pdf/2002.05202.p | 引用论文参考地址 | -| 开发引入 | / | ChatGLM2-6B/model/modeling_chatglm.py | https://huggingface.co/docs/transformers/main/en/main_classes/text_generati | 模型相关说明 | -| 开发引入 | / | ChatGLM2-6B/fix/utils.py | https://github.com/huggingface/transformers/pull/5420#discussion_r4497798 | 源码实现 | -| 开源代码引入 | https://github.com/THUDM/ChatGLM2-6B/blob/main/ptuning/web_demo.py|ChatGLM2-6B/ptuning/web_demo.py | https://github.com/GaiZhenbiao/ChuanhuChatGP | 源码实现 | -| 开发引入 | / | ChatGLM2-6B/fix/utils.py | http://arxiv.org/abs/1904.097 | 引用论文参考地址 | -| 开发引入 | / | ChatGLM2-6B/fix/utils.py | https://gist.github.com/thomwolf/1a5a29f6962089e871b94cbd09daf3 | 模型相关说明 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|-------------------------------------------------------------------------------|-----------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/built-in/foundation/ChatGLM2-6B/fix/training_args.py | https://docs.ray.io/en/latest/tune/api_docs/analysis.html | 模型相关说明 | \ No newline at end of file diff --git a/PyTorch/built-in/foundation/CodeGeeX2/public_address_statement.md b/PyTorch/built-in/foundation/CodeGeeX2/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..c9cd592ffc8c353fa93b3f5e8908994300b51539 --- /dev/null +++ b/PyTorch/built-in/foundation/CodeGeeX2/public_address_statement.md @@ -0,0 +1,8 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|-------------------------------------------------------------------------------------|---------------------------------------------------------|-----------| +| ModelZoo-PyTorch/PyTorch/built-in/foundation/CodeGeeX2/fix/modeling_utils.py | 
https://www.tensorflow.org/install/ | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/CodeGeeX2/fix/modeling_utils.py | https://pytorch.org/ | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/CodeGeeX2/fix/modeling_utils.py | https://pytorch.org/ | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/CodeGeeX2/fix/modeling_utils.py | https://flax.readthedocs.io/en/latest/installation.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/CodeGeeX2/ptuning/code_alpaca_20k.json | https://avatars.githubusercontent.com/u/22678055?v=4 | html相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/CodeGeeX2/scripts/hostlist | 172.0.0.1 | host地址 | \ No newline at end of file diff --git a/PyTorch/built-in/foundation/CodeShell-7B/public_address_statement.md b/PyTorch/built-in/foundation/CodeShell-7B/public_address_statement.md index 9a95d7edeec7a27ecade5758e7677a2aa334f083..6b044fb91fa1dcf78e7a6b84a876ed0791695f3d 100644 --- a/PyTorch/built-in/foundation/CodeShell-7B/public_address_statement.md +++ b/PyTorch/built-in/foundation/CodeShell-7B/public_address_statement.md @@ -1,9 +1,3 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ---- | ------------ |-------------------------------------------|-----------------------------------------------------------------------------| -------- | -| 开源代码引入 | https://github.com/WisdomShell/codeshell/blob/main/finetune/finetune.py| CodeShell-7B/finetune/finetune.py | / | 源码实现 | -| 开源代码引入 | https://github.com/WisdomShell/codeshell/blob/main/finetune/run_finetune.sh| CodeShell-7B/finetune/run_finetune.sh | / | 源码实现 | -| 开源代码引入 | https://huggingface.co/WisdomShell/CodeShell-7B/blob/main/modeling_codeshell.py| CodeShell-7B/model/modeling_codeshell.py | https://openai.com/blog/better-language-models/ | 源码实现 | -| 开源代码引入 | https://huggingface.co/WisdomShell/CodeShell-7B/blob/main/modeling_codeshell.py| CodeShell-7B/model/modeling_codeshell.py | https://github.com/NVIDIA/Megatron-LM/blob/main/megatron/model/gpt_model.py | 源码实现 | -| 开源代码引入 | https://huggingface.co/WisdomShell/CodeShell-7B/blob/main/modeling_codeshell.py| CodeShell-7B/model/modeling_codeshell.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://huggingface.co/WisdomShell/CodeShell-7B/blob/main/modeling_codeshell.py| CodeShell-7B/model/modeling_codeshell.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | -| 开源代码引入 | https://github.com/lm-sys/FastChat/blob/main/fastchat/data/convert_alpaca.py| CodeShell-7B/convert_alpaca.py | / | 源码实现 | +| 文件位置 | 公网地址 | 公网地址用途 | +|---------------------------------------------------------------------------------------|---------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/built-in/foundation/CodeShell-7B/model/modeling_codeshell.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | \ No newline at end of file diff --git a/PyTorch/built-in/foundation/FlagAI/public_address_statement.md b/PyTorch/built-in/foundation/FlagAI/public_address_statement.md index 63c05c4b83623b531b8ee9bd597df086393eb80d..e0cc643807b6fe8875006cc49969a37f199d7a72 100644 --- a/PyTorch/built-in/foundation/FlagAI/public_address_statement.md +++ b/PyTorch/built-in/foundation/FlagAI/public_address_statement.md @@ -1,288 +1,56 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ---- | ------------ | ------ | ------------------------------------ | -------- | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/Dockerfile| FlagAI/Dockerfile | 
https://download.pytorch.org/whl/cu117 | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/Dockerfile| FlagAI/Dockerfile | https://github.com/NVIDIA/apex | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/Dockerfile| FlagAI/Dockerfile | https://github.com/OpenBMB/BMTrain | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/.git/config| FlagAI/Dockerfile | https://github.com/FlagAI-Open/FlagAI.git | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/prepare_test.sh| FlagAI/prepare_test.sh | https://github.com/BAAI-OpenPlatform/checkpoints.git | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/setup.py| FlagAI/setup.py | open@baai.ac.cn | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/.git/config| FlagAI/setup.py | https://github.com/FlagAI-Open/FlagAI | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/dataset/indexed_dataset/preprocess_data_args.py| FlagAI/flagai/data/dataset/indexed_dataset/preprocess_data_args.py | https://stackoverflow.com/questions/33139531/preserve-empty-lines-with-nltks-punkt-tokenizer | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/docs/TUTORIAL_2_DATASET.md| FlagAI/flagai/data/dataset/superglue/dataset.py | https://super.gluebenchmark.com/tasks | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/dataset/superglue/control.py| FlagAI/flagai/data/dataset/superglue/control.py | https://dl.fbaipublicfiles.com/glue/superglue/data/v2/AX-b.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/dataset/superglue/control.py| FlagAI/flagai/data/dataset/superglue/control.py | https://dl.fbaipublicfiles.com/glue/superglue/data/v2/CB.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/dataset/superglue/control.py| FlagAI/flagai/data/dataset/superglue/control.py | https://dl.fbaipublicfiles.com/glue/superglue/data/v2/COPA.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/dataset/superglue/control.py| FlagAI/flagai/data/dataset/superglue/control.py | https://dl.fbaipublicfiles.com/glue/superglue/data/v2/MultiRC.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/dataset/superglue/control.py| FlagAI/flagai/data/dataset/superglue/control.py | https://dl.fbaipublicfiles.com/glue/superglue/data/v2/RTE.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/dataset/superglue/control.py| FlagAI/flagai/data/dataset/superglue/control.py | https://dl.fbaipublicfiles.com/glue/superglue/data/v2/WiC.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/dataset/superglue/control.py| FlagAI/flagai/data/dataset/superglue/control.py | https://dl.fbaipublicfiles.com/glue/superglue/data/v2/WSC.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/dataset/superglue/control.py| FlagAI/flagai/data/dataset/superglue/control.py | https://dl.fbaipublicfiles.com/glue/superglue/data/v2/BoolQ.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/dataset/superglue/control.py| FlagAI/flagai/data/dataset/superglue/control.py | https://dl.fbaipublicfiles.com/glue/superglue/data/v2/ReCoRD.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/dataset/superglue/control.py| 
FlagAI/flagai/data/dataset/superglue/control.py | https://dl.fbaipublicfiles.com/glue/superglue/data/v2/AX-g.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/dataset/superglue/control.py| FlagAI/flagai/data/dataset/superglue/control.py | https://storage.googleapis.com/cluebenchmark/tasks/afqmc_public.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/dataset/superglue/control.py| FlagAI/flagai/data/dataset/superglue/control.py | https://storage.googleapis.com/cluebenchmark/tasks/tnews_public.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/dataset/superglue/control.py| FlagAI/flagai/data/dataset/superglue/control.py | https://dl.fbaipublicfiles.com/glue/data/CoLA.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/dataset/superglue/control.py| FlagAI/flagai/data/dataset/superglue/control.py | https://storage.googleapis.com/cluebenchmark/tasks/cmrc2018_public.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/dataset/superglue/control.py| FlagAI/flagai/data/dataset/superglue/control.py | https://dl.fbaipublicfiles.com/glue/data/SST-2.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/dataset/superglue/control.py| FlagAI/flagai/data/dataset/superglue/control.py | https://www.microsoft.com/en-us/download/details.aspx?id=52398 | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/dataset/superglue/control.py| FlagAI/flagai/data/dataset/superglue/control.py | https://dl.fbaipublicfiles.com/glue/data/QQP-clean.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/dataset/superglue/control.py| FlagAI/flagai/data/dataset/superglue/control.py | https://dl.fbaipublicfiles.com/glue/data/MNLI.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/dataset/superglue/control.py| FlagAI/flagai/data/dataset/superglue/control.py | https://dl.fbaipublicfiles.com/glue/data/MNLI.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/dataset/superglue/control.py| FlagAI/flagai/data/dataset/superglue/control.py | https://dl.fbaipublicfiles.com/glue/data/QNLIv2.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/dataset/superglue/control.py| FlagAI/flagai/data/dataset/superglue/properties.py | https://dl.fbaipublicfiles.com/glue/superglue/data/v2/AX-b.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/dataset/superglue/control.py| FlagAI/flagai/data/dataset/superglue/properties.py | https://dl.fbaipublicfiles.com/glue/superglue/data/v2/CB.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/dataset/superglue/control.py| FlagAI/flagai/data/dataset/superglue/properties.py | https://dl.fbaipublicfiles.com/glue/superglue/data/v2/COPA.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/dataset/superglue/control.py| FlagAI/flagai/data/dataset/superglue/properties.py | https://dl.fbaipublicfiles.com/glue/superglue/data/v2/MultiRC.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/dataset/superglue/control.py| FlagAI/flagai/data/dataset/superglue/properties.py | https://dl.fbaipublicfiles.com/glue/superglue/data/v2/RTE.zip | 模型相关说明 | -| 开源代码引入 | 
https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/dataset/superglue/control.py| FlagAI/flagai/data/dataset/superglue/properties.py | https://dl.fbaipublicfiles.com/glue/superglue/data/v2/WiC.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/dataset/superglue/control.py| FlagAI/flagai/data/dataset/superglue/properties.py | https://dl.fbaipublicfiles.com/glue/superglue/data/v2/WSC.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/dataset/superglue/control.py| FlagAI/flagai/data/dataset/superglue/properties.py | https://dl.fbaipublicfiles.com/glue/superglue/data/v2/BoolQ.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/dataset/superglue/control.py| FlagAI/flagai/data/dataset/superglue/properties.py | https://dl.fbaipublicfiles.com/glue/superglue/data/v2/ReCoRD.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/dataset/superglue/control.py| FlagAI/flagai/data/dataset/superglue/properties.py | https://dl.fbaipublicfiles.com/glue/superglue/data/v2/AX-g.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/dataset/superglue/control.py| FlagAI/flagai/data/dataset/superglue/properties.py | https://storage.googleapis.com/cluebenchmark/tasks/afqmc_public.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/dataset/superglue/control.py| FlagAI/flagai/data/dataset/superglue/properties.py | https://storage.googleapis.com/cluebenchmark/tasks/tnews_public.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/dataset/superglue/control.py| FlagAI/flagai/data/dataset/superglue/properties.py | https://storage.googleapis.com/cluebenchmark/tasks/cmrc2018_public.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/tokenizer/bert/bert_tokenizer.py| FlagAI/flagai/data/tokenizer/bert/bert_tokenizer.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased-vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/tokenizer/bert/bert_tokenizer.py| FlagAI/flagai/data/tokenizer/bert/bert_tokenizer.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/tokenizer/bert/bert_tokenizer.py| FlagAI/flagai/data/tokenizer/bert/bert_tokenizer.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-uncased-vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/tokenizer/bert/bert_tokenizer.py| FlagAI/flagai/data/tokenizer/bert/bert_tokenizer.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-cased-vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/tokenizer/bert/bert_tokenizer.py| FlagAI/flagai/data/tokenizer/bert/bert_tokenizer.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-chinese-vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/tokenizer/bert/bert_tokenizer.py| FlagAI/flagai/data/tokenizer/bert/wordpiece.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased-vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/tokenizer/bert/bert_tokenizer.py| FlagAI/flagai/data/tokenizer/bert/wordpiece.py | 
https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/tokenizer/bert/bert_tokenizer.py| FlagAI/flagai/data/tokenizer/bert/wordpiece.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-uncased-vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/tokenizer/bert/bert_tokenizer.py| FlagAI/flagai/data/tokenizer/bert/wordpiece.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-cased-vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/tokenizer/bert/bert_tokenizer.py| FlagAI/flagai/data/tokenizer/bert/wordpiece.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-chinese-vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/tokenizer/bert/wordpiece.py| FlagAI/flagai/data/tokenizer/bert/wordpiece.py | https://huggingface.co/bert-base-uncased/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/tokenizer/bert/wordpiece.py| FlagAI/flagai/data/tokenizer/bert/wordpiece.py | https://huggingface.co/bert-large-uncased/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/tokenizer/bert/wordpiece.py| FlagAI/flagai/data/tokenizer/bert/wordpiece.py | https://huggingface.co/bert-base-chinese/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/tokenizer/bert/wordpiece.py| FlagAI/flagai/data/tokenizer/bert/wordpiece.py | https://en.wikipedia.org/wiki/CJK_Unified_Ideographs_ | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/examples/AltCLIP/README.md| FlagAI/flagai/data/tokenizer/clip/tokenizer.py | https://github.com/openai/CLIP | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/tokenizer/glm_10b_en/glm_10b_en_tokenizer.py| FlagAI/flagai/data/tokenizer/glm_10b_en/glm_10b_en_tokenizer.py | https://huggingface.co/gpt2/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/tokenizer/glm_10b_en/glm_10b_en_tokenizer.py| FlagAI/flagai/data/tokenizer/glm_10b_en/glm_10b_en_tokenizer.py | https://huggingface.co/gpt2/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/tokenizer/glm_large_ch/glm_large_ch.py| FlagAI/flagai/data/tokenizer/glm_large_ch/glm_large_ch.py | https://github.com/openai/gpt-2/ | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/tokenizer/glm_10b_en/glm_10b_en_bpe_tokenizer.py| FlagAI/flagai/data/tokenizer/glm_10b_en/glm_10b_en_bpe_tokenizer.py | https://github.com/huggingface/transformers/pull/2778 | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/tokenizer/glm_10b_en/glm_10b_en_bpe_tokenizer.py| FlagAI/flagai/data/tokenizer/glm_10b_en/glm_10b_en_bpe_tokenizer.py | https://github.com/huggingface/transformers/issues/3788 | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/tokenizer/glm_large_ch/glm_large_ch.py| FlagAI/flagai/data/tokenizer/glm_large_ch/glm_large_ch.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/tokenizer/glm_large_ch/glm_large_ch.py| 
FlagAI/flagai/data/tokenizer/glm_large_ch/glm_large_ch.py | https://github.com/google/sentencepiece.git | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/tokenizer/glm_large_en/wordpiece.py| FlagAI/flagai/data/tokenizer/glm_large_en/wordpiece.py | https://github.com/huggingface/pytorch-pretrained-BERT/blob/master/pytorch_pretrained_bert/tokenization.py | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/tokenizer/bert/wordpiece.py| FlagAI/flagai/data/tokenizer/glm_large_en/wordpiece.py | https://en.wikipedia.org/wiki/CJK_Unified_Ideographs_ | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/tokenizer/bert/wordpiece.py| FlagAI/flagai/data/tokenizer/t5/t5_tokenizer.py | https://en.wikipedia.org/wiki/CJK_Unified_Ideographs_ | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/tokenizer/bert/wordpiece.py| FlagAI/flagai/data/tokenizer/uni_tokenizer/diffusion_bert_tokenizer.py | https://en.wikipedia.org/wiki/CJK_Unified_Ideographs_ | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/tokenizer/uni_tokenizer/tokenizer.py| FlagAI/flagai/data/tokenizer/uni_tokenizer/tokenizer.py | https://en.wikipedia.org/wiki/Control_character | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/tokenizer/uni_tokenizer/tokenizer.py| FlagAI/flagai/data/tokenizer/uni_tokenizer/tokenizer.py | https://www.fileformat.info/info/unicode/category/Cc/index.htm | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/tokenizer/uni_tokenizer/tokenizer.py| FlagAI/flagai/data/tokenizer/uni_tokenizer/tokenizer.py | https://www.fileformat.info/info/unicode/category/Cf/index.htm | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/tokenizer/bert/wordpiece.py| FlagAI/flagai/data/tokenizer/uni_tokenizer/wp_tokenizer.py | https://en.wikipedia.org/wiki/CJK_Unified_Ideographs_ | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/examples/AltCLIP/README.md| FlagAI/flagai/model/mm/clip_guohua/clip.py | https://github.com/openai/CLIP | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/tokenizer/glm_10b_en/glm_10b_en_bpe_tokenizer.py| FlagAI/flagai/data/tokenizer/uni_tokenizer/tokenizer.py | https://github.com/huggingface/transformers/pull/2778 | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/tokenizer/glm_10b_en/glm_10b_en_bpe_tokenizer.py| FlagAI/flagai/data/tokenizer/uni_tokenizer/tokenizer.py | https://github.com/huggingface/transformers/issues/3788 | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/tokenizer/bert/wordpiece.py| FlagAI/flagai/model/mm/clip_guohua/bert_tokenizer.py | https://en.wikipedia.org/wiki/CJK_Unified_Ideographs_ | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/layers/activations.py| FlagAI/flagai/model/mm/clip_guohua/modeling_bert.py | https://arxiv.org/abs/1606.08415 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/layers/activations.py| FlagAI/flagai/model/mm/clip_guohua/modeling_bert.py | https://arxiv.org/abs/1606.08415 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/mm/lm/x_transformer.py| FlagAI/flagai/model/mm/lm/x_transformer.py | https://github.com/lucidrains/x-transformers/tree/main/x_transformers | 源码实现 | -| 开源代码引入 | 
https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/mm/modules/attention.py| FlagAI/flagai/model/mm/modules/attention.py | https://github.com/MatthieuTPHR/diffusers/blob/d80b531ff8060ec1ea982b65a1b8df70f73aa67c/src/diffusers/models/attention.py#L223 | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/bert_model.py| FlagAI/flagai/model/mm/clip_guohua/modeling_bert.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/mm/modules/diffusionmodules/openaimodel.py| FlagAI/flagai/model/mm/Unets/Unet.py | https://github.com/openai/CLIP/blob/main/clip/model.py | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/mm/modules/diffusionmodules/openaimodel.py| FlagAI/flagai/model/mm/Unets/Unet.py | https://github.com/hojonathanho/diffusion/blob/1e0dceb3b3495bbe19116a5e1b3596cd0706c543/diffusion_tf/models/unet.py#L66 | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/layers/activations.py| FlagAI/flagai/model/vision/layers/activations.py | https://arxiv.org/abs/1710.05941 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/layers/activations.py| FlagAI/flagai/model/vision/layers/activations.py | https://arxiv.org/abs/1908.08681 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/layers/activations.py| FlagAI/flagai/model/vision/layers/activations.py | https://arxiv.org/abs/1908.08681 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/activations.py| FlagAI/flagai/model/vision/layers/activations.py | https://github.com/digantamisra98/H-Mish/blob/0da20d4bc58e696b6803f2523c58d3c8a82782d0/README.md | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/layers/activations.py| FlagAI/flagai/model/vision/layers/activations_jit.py | https://arxiv.org/abs/1710.05941 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/layers/activations.py| FlagAI/flagai/model/vision/layers/activations_jit.py | https://arxiv.org/abs/1908.08681 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/activations.py| FlagAI/flagai/model/vision/layers/activations_jit.py | https://github.com/digantamisra98/H-Mish/blob/0da20d4bc58e696b6803f2523c58d3c8a82782d0/README.md | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/activations_me.py| FlagAI/flagai/model/vision/layers/activations_me.py | https://twitter.com/jeremyphoward/status/1188251041835315200 | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/layers/activations.py| FlagAI/flagai/model/vision/layers/activations_me.py | https://arxiv.org/abs/1908.08681 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/mm/dpm_solver/dpm_solver.py| FlagAI/flagai/model/mm/dpm_solver/dpm_solver.py | https://arxiv.org/abs/2006.11239 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/mm/dpm_solver/dpm_solver.py| FlagAI/flagai/model/mm/dpm_solver/dpm_solver.py | https://arxiv.org/abs/2011.13456 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/tools/peft/peft_model.py| FlagAI/flagai/model/tools/peft/peft_model.py | https://github.com/huggingface/accelerate/pull/873/ | 源码实现 | -| 开源代码引入 | 
https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/activations.py| FlagAI/flagai/model/vision/layers/activations_me.py | https://github.com/digantamisra98/H-Mish/blob/0da20d4bc58e696b6803f2523c58d3c8a82782d0/README.md | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/attention_pool2d.py| FlagAI/flagai/model/vision/layers/attention_pool2d.py | https://github.com/openai/CLIP/blob/3b473b0e682c091a9e53623eebc1ca1657385717/clip/model.py | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/attention_pool2d.py| FlagAI/flagai/model/vision/layers/attention_pool2d.py | https://github.com/openai/CLIP/blob/3b473b0e682c091a9e53623eebc1ca1657385717/clip/model.py | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/bottleneck_attn.py| FlagAI/flagai/model/vision/layers/bottleneck_attn.py | https://arxiv.org/abs/2101.11605 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/bottleneck_attn.py| FlagAI/flagai/model/vision/layers/bottleneck_attn.py | https://gist.github.com/aravindsrinivas/56359b79f0ce4449bcb04ab4b56a57a2 | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/bottleneck_attn.py| FlagAI/flagai/model/vision/layers/bottleneck_attn.py | https://gist.github.com/aravindsrinivas/56359b79f0ce4449bcb04ab4b56a57a2 | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/bottleneck_attn.py| FlagAI/flagai/model/vision/layers/bottleneck_attn.py | https://arxiv.org/abs/1904.09925 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/cbam.py| FlagAI/flagai/model/vision/layers/cbam.py | https://arxiv.org/abs/1807.06521 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/bottleneck_attn.py| FlagAI/flagai/model/vision/layers/bottleneck_attn.py | https://gist.github.com/aravindsrinivas/56359b79f0ce4449bcb04ab4b56a57a2 | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/bottleneck_attn.py| FlagAI/flagai/model/vision/layers/bottleneck_attn.py | https://arxiv.org/abs/1904.09925 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/attention_pool2d.py| FlagAI/flagai/model/vision/layers/attention_pool2d.py | https://github.com/openai/CLIP/blob/3b473b0e682c091a9e53623eebc1ca1657385717/clip/model.py | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/bottleneck_attn.py| FlagAI/flagai/model/vision/layers/bottleneck_attn.py | https://arxiv.org/abs/2101.11605 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/cond_conv2d.py| FlagAI/flagai/model/vision/layers/cond_conv2d.py | https://arxiv.org/abs/1904.04971 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/cond_conv2d.py| FlagAI/flagai/model/vision/layers/cond_conv2d.py | https://github.com/tensorflow/tpu/blob/master/models/official/efficientnet/condconv/condconv_layers.py | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/cond_conv2d.py| FlagAI/flagai/model/vision/layers/cond_conv2d.py | https://github.com/pytorch/pytorch/issues/17983 | 模型相关说明 | -| 开源代码引入 | 
https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/drop.py| FlagAI/flagai/model/vision/layers/drop.py | https://arxiv.org/abs/1810.12890 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/drop.py| FlagAI/flagai/model/vision/layers/drop.py | https://arxiv.org/abs/1603.09382 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/drop.py| FlagAI/flagai/model/vision/layers/drop.py | https://github.com/tensorflow/tpu/blob/master/models/official/resnet/resnet_model.py#L74 | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/drop.py| FlagAI/flagai/model/vision/layers/drop.py | https://github.com/clovaai/assembled-cnn/blob/master/nets/blocks.py | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/eca.py| FlagAI/flagai/model/vision/layers/eca.py | https://arxiv.org/abs/1910.03151 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/eca.py| FlagAI/flagai/model/vision/layers/eca.py | https://github.com/BangguWu/ECANet | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/drop.py| FlagAI/flagai/model/vision/layers/drop.py | https://arxiv.org/pdf/1810.12890.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/eca.py| FlagAI/flagai/model/vision/layers/eca.py | https://github.com/VRandme | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/eca.py| FlagAI/flagai/model/vision/layers/eca.py | https://arxiv.org/pdf/1910.03151.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/drop.py| FlagAI/flagai/model/vision/layers/drop.py | https://arxiv.org/pdf/1810.12890.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/evo_norm.py| FlagAI/flagai/model/vision/layers/evo_norm.py | https://arxiv.org/abs/2004.02967 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/evo_norm.py| FlagAI/flagai/model/vision/layers/evo_norm.py | https://proceedings.neurips.cc/paper/2020/file/9d4c03631b8b0c85ae08bf05eda37d0f-Paper.pdf | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/drop.py| FlagAI/flagai/model/vision/layers/drop.py | https://arxiv.org/pdf/1810.12890.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/eca.py| FlagAI/flagai/model/vision/layers/eca.py | https://arxiv.org/pdf/1910.03151.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/drop.py| FlagAI/flagai/model/vision/layers/drop.py | https://github.com/tensorflow/tpu/issues/494#issuecomment-532968956 | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/eca.py| FlagAI/flagai/model/vision/layers/eca.py | https://github.com/pytorch/pytorch/pull/17240 | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/filter_response_norm.py| FlagAI/flagai/model/vision/layers/filter_response_norm.py | https://arxiv.org/abs/1911.09737 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/gather_excite.py| FlagAI/flagai/model/vision/layers/gather_excite.py | 
https://arxiv.org/abs/1810.12348 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/gather_excite.py| FlagAI/flagai/model/vision/layers/gather_excite.py | https://github.com/hujie-frank/GENet | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/global_context.py| FlagAI/flagai/model/vision/layers/global_context.py | https://arxiv.org/abs/1904.11492 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/global_context.py| FlagAI/flagai/model/vision/layers/global_context.py | https://github.com/xvjiarui/GCNet | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/halo_attn.py| FlagAI/flagai/model/vision/layers/halo_attn.py | https://arxiv.org/abs/2103.12731 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/bottleneck_attn.py| FlagAI/flagai/model/vision/layers/halo_attn.py | https://gist.github.com/aravindsrinivas/56359b79f0ce4449bcb04ab4b56a57a2 | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/bottleneck_attn.py| FlagAI/flagai/model/vision/layers/halo_attn.py | https://arxiv.org/abs/1904.09925 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/bottleneck_attn.py| FlagAI/flagai/model/vision/layers/halo_attn.py | https://gist.github.com/aravindsrinivas/56359b79f0ce4449bcb04ab4b56a57a2 | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/bottleneck_attn.py| FlagAI/flagai/model/vision/layers/halo_attn.py | https://arxiv.org/abs/1904.09925 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/inplace_abn.py| FlagAI/flagai/model/vision/layers/inplace_abn.py | https://github.com/mapillary/inplace_abn.git@v1.0.12 | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/inplace_abn.py| FlagAI/flagai/model/vision/layers/inplace_abn.py | inplace_abn.git@v1.0.12 | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/halo_attn.py| FlagAI/flagai/model/vision/layers/halo_attn.py | https://arxiv.org/abs/2103.12731 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/lambda_layer.py| FlagAI/flagai/model/vision/layers/lambda_layer.py | https://arxiv.org/abs/2102.08602 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/lambda_layer.py| FlagAI/flagai/model/vision/layers/lambda_layer.py | https://github.com/lucidrains/lambda-networks | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/lambda_layer.py| FlagAI/flagai/model/vision/layers/lambda_layer.py | https://arxiv.org/abs/2102.08602 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/mixed_conv2d.py| FlagAI/flagai/model/vision/layers/mixed_conv2d.py | https://arxiv.org/abs/1907.09595 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/mixed_conv2d.py| FlagAI/flagai/model/vision/layers/mixed_conv2d.py | https://github.com/tensorflow/tpu/blob/master/models/official/mnasnet/mixnet/custom_layers.py | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/non_local_attn.py| 
FlagAI/flagai/model/vision/layers/non_local_attn.py | https://openaccess.thecvf.com/content_CVPR_2020/html/Chi_Non-Local_Neural_Networks_With_Grouped_Bilinear_Attentional_Transforms_CVPR_2020_paper.html | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/non_local_attn.py| FlagAI/flagai/model/vision/layers/non_local_attn.py | https://github.com/BA-Transform/BAT-Image-Classification | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/mlp.py | ModelZoo-PyTorch/PyTorch/built-in/foundation/FlagAI/flagai/model/vision/layers/mlp.py | https://arxiv.org/abs/1612.08083 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/mlp.py | ModelZoo-PyTorch/PyTorch/built-in/foundation/FlagAI/flagai/model/vision/layers/mlp.py | https://arxiv.org/abs/2002.05202 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/non_local_attn.py| FlagAI/flagai/model/vision/layers/non_local_attn.py | https://github.com/BA-Transform/BAT-Image-Classification | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/non_local_attn.py| FlagAI/flagai/model/vision/layers/non_local_attn.py | https://github.com/facebookresearch/video-nonlocal-net | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/non_local_attn.py| FlagAI/flagai/model/vision/layers/non_local_attn.py | https://github.com/BA-Transform/BAT-Image-Classification | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/examples/vit_cifar100/README.md| FlagAI/flagai/model/vision/layers/patch_embed.py | https://github.com/google-research/vision_transformer | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/selective_kernel.py| FlagAI/flagai/model/vision/layers/selective_kernel.py | https://arxiv.org/abs/1903.06586 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/selective_kernel.py| FlagAI/flagai/model/vision/layers/selective_kernel.py | https://arxiv.org/abs/1903.06586 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/split_attn.py| FlagAI/flagai/model/vision/layers/split_attn.py | https://arxiv.org/abs/2004.08955 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/split_attn.py| FlagAI/flagai/model/vision/layers/split_attn.py | https://github.com/zhanghang1989/ResNeSt | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/pos_embed.py| FlagAI/flagai/model/vision/layers/pos_embed.py | https://github.com/lucidrains/vit-pytorch/blob/6f3a5fcf0bca1c5ec33a35ef48d97213709df4ba/vit_pytorch/rvt.py | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/pos_embed.py| FlagAI/flagai/model/vision/layers/pos_embed.py | https://blog.eleuther.ai/rotary-embeddings/ | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/squeeze_excite.py| FlagAI/flagai/model/vision/layers/squeeze_excite.py | https://arxiv.org/abs/1709.01507 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/squeeze_excite.py| FlagAI/flagai/model/vision/layers/squeeze_excite.py | https://arxiv.org/abs/1911.06667 | 参考论文地址 | -| 开源代码引入 | 
https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/std_conv.py| FlagAI/flagai/model/vision/layers/std_conv.py | https://github.com/joe-siyuan-qiao/WeightStandardization | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/std_conv.py| FlagAI/flagai/model/vision/layers/std_conv.py | https://arxiv.org/abs/2101.08692 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/std_conv.py| FlagAI/flagai/model/vision/layers/std_conv.py | https://github.com/deepmind/deepmind-research/tree/master/nfnets | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/squeeze_excite.py| FlagAI/flagai/model/vision/layers/squeeze_excite.py | https://arxiv.org/abs/1911.06667 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/std_conv.py| FlagAI/flagai/model/vision/layers/std_conv.py | https://arxiv.org/abs/1903.10520v2 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/std_conv.py| FlagAI/flagai/model/vision/layers/std_conv.py | https://arxiv.org/abs/1903.10520v2 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/weight_init.py| FlagAI/flagai/model/vision/layers/weight_init.py | https://people.sc.fsu.edu/~jburkardt/presentations/truncated_normal.pdf | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/std_conv.py| FlagAI/flagai/model/vision/layers/std_conv.py | https://arxiv.org/abs/2101.08692 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/std_conv.py| FlagAI/flagai/model/vision/layers/std_conv.py | https://arxiv.org/abs/2101.08692 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/mm/modules/diffusionmodules/openaimodel.py| FlagAI/flagai/model/mm/modules/diffusionmodules/openaimodel.py | https://github.com/openai/CLIP/blob/main/clip/model.py | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/mm/modules/diffusionmodules/util.py| FlagAI/flagai/model/mm/modules/diffusionmodules/util.py | https://github.com/openai/improved-diffusion/blob/main/improved_diffusion/gaussian_diffusion.py | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/mm/modules/diffusionmodules/util.py| FlagAI/flagai/model/mm/modules/diffusionmodules/util.py | https://github.com/lucidrains/denoising-diffusion-pytorch/blob/7706bdfc6f527f58d33f84b7b522e61e6e3164b3/denoising_diffusion_pytorch/denoising_diffusion_pytorch.py | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/mm/modules/diffusionmodules/util.py| FlagAI/flagai/model/mm/modules/diffusionmodules/util.py | https://github.com/openai/guided-diffusion/blob/0ba878e517b276c45d1195eb29f6f5f72659a05b/guided_diffusion/nn.py | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/mm/modules/diffusionmodules/util.py| FlagAI/flagai/model/mm/modules/diffusionmodules/util.py | https://arxiv.org/abs/2010.02502 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/mm/modules/diffusionmodules/openaimodel.py|ModelZoo-PyTorch/PyTorch/built-in/diffusion/stablediffusion-2.1/ldm/modules/karlo/kakao/modules/unet.py | 
https://github.com/hojonathanho/diffusion/blob/1e0dceb3b3495bbe19116a5e1b3596cd0706c543/diffusion_tf/models/unet.py#L66 | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/mm/modules/attention.py| FlagAI/flagai/model/mm/modules/diffusionmodules/model.py | https://github.com/MatthieuTPHR/diffusers/blob/d80b531ff8060ec1ea982b65a1b8df70f73aa67c/src/diffusers/models/attention.py#L223 | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/mm/modules/diffusionmodules/openaimodel.py| FlagAI/flagai/model/mm/modules/diffusionmodules/openaimodel.py | https://github.com/hojonathanho/diffusion/blob/1e0dceb3b3495bbe19116a5e1b3596cd0706c543/diffusion_tf/models/unet.py#L66 | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/mm/AltDiffusion.py| FlagAI/flagai/model/mm/modules/distributions/distributions.py | https://github.com/openai/guided-diffusion/blob/27c20a8fab9cb472df5d6bdd6c8d11c8f430b924/guided_diffusion/losses.py#L12 | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/tools/peft/tuners/adalora.py| FlagAI/flagai/model/tools/peft/tuners/adalora.py | https://openreview.net/pdf?id=lq62uWRJjiY | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/tools/peft/tuners/adaption_prompt.py| FlagAI/flagai/model/tools/peft/tuners/adaption_prompt.py | https://github.com/huggingface/transformers/blob/1de8ce9ee1191ba761a593ac15d9ccbf5851bfc5/src/transformers/models/llama/modeling_llama.py#L126 | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/tools/peft/tuners/adaption_prompt.py| FlagAI/flagai/model/tools/peft/tuners/adaption_prompt.py | https://github.com/huggingface/transformers/blob/1de8ce9ee1191ba761a593ac15d9ccbf5851bfc5/src/transformers/models/llama/modeling_llama.py#L133 | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/tools/peft/tuners/adaption_prompt.py| FlagAI/flagai/model/tools/peft/tuners/adaption_prompt.py | https://github.com/huggingface/peft/pull/268 | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/tools/peft/tuners/adaption_prompt.py| FlagAI/flagai/model/tools/peft/tuners/adaption_prompt.py | https://arxiv.org/pdf/2303.16199.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/tools/peft/tuners/adaption_prompt.py| FlagAI/flagai/model/tools/peft/tuners/adaption_prompt.py | https://github.com/ZrrSkywalker/LLaMA-Adapter/blob/41c3546fe1997ab8a65809dc8d8f9252b19d9faf/llama/model.py#L234 | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/tools/peft/tuners/adaption_prompt.py| FlagAI/flagai/model/tools/peft/tuners/adaption_prompt.py | https://github.com/ZrrSkywalker/LLaMA-Adapter/blob/41c3546fe1997ab8a65809dc8d8f9252b19d9faf/llama/model.py#L141 | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/tools/peft/tuners/prefix_tuning.py| FlagAI/flagai/model/tools/peft/tuners/prefix_tuning.py | https://github.com/THUDM/P-tuning-v2/blob/main/model/prefix_encoder.py | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/tools/peft/tuners/adalora.py| FlagAI/flagai/model/tools/peft/tuners/adalora.py | https://openreview.net/pdf?id=lq62uWRJjiY | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/tools/peft/tuners/p_tuning.py| FlagAI/flagai/model/tools/peft/tuners/p_tuning.py | 
https://github.com/NVIDIA/NeMo/blob/main/nemo/collections/nlp/modules/common/prompt_encoder.py | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/tools/peft/utils/save_and_load.py| FlagAI/flagai/model/tools/peft/utils/save_and_load.py | https://github.com/microsoft/LoRA/blob/main/loralib/utils.py | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/tools/peft/tuners/lora.py| FlagAI/flagai/model/tools/peft/tuners/lora.py | https://github.com/microsoft/LoRA/blob/main/loralib/layers.py | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/tools/peft/tuners/lora.py| FlagAI/flagai/model/tools/peft/tuners/lora.py | https://github.com/bmaltais/kohya_ss/blob/feb6728762a8f463d15ba936d189d4c3abfaa1ab/networks/lora.py#L117 | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/schedulers.py| FlagAI/flagai/schedulers.py | https://openreview.net/pdf?id=BJYwwY9ll | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/trainer.py| FlagAI/flagai/trainer.py | https://www.deepspeed.ai/docs/config-json/ | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/trainer.py| FlagAI/flagai/trainer_v1.py | https://www.deepspeed.ai/docs/config-json/ | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/dataset/indexed_dataset/preprocess_data_args.py| FlagAI/script/preprocess_data_flagai_args.py | https://stackoverflow.com/questions/33139531/preserve-empty-lines-with-nltks-punkt-tokenizer | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/file_utils.py| FlagAI/flagai/data/file_utils.py | https://github.com/huggingface/pytorch-pretrained-BERT | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/file_utils.py| FlagAI/flagai/data/file_utils.py | https://github.com/allenai/allennlp | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/fp16/fp16.py| FlagAI/flagai/fp16/fp16.py | https://github.com/pytorch/pytorch/issues/7733 | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/fp16/fp16util.py| FlagAI/flagai/fp16/fp16util.py | http://on-demand.gputechconf.com/gtc/2018/video/S81012/ | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/fp16/fp16util.py| FlagAI/flagai/fp16/fp16util.py | http://pytorch.org/docs/master/_modules/torch/_utils.html | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/aquila2/modeling_aquila.py| FlagAI/flagai/model/aquila_modeling_hf.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/bert_model.py| FlagAI/flagai/model/bert_model.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/aquila2/modeling_aquila.py| FlagAI/flagai/model/aquila_modeling_hf.py | https://arxiv.org/abs/1910.13461 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/file_utils.py| FlagAI/flagai/model/file_utils.py | https://model.baai.ac.cn/api/downloadCode | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/file_utils.py| FlagAI/flagai/model/file_utils.py | https://model.baai.ac.cn/api/downloadCode | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/file_utils.py| 
FlagAI/flagai/model/file_utils.py | https://model.baai.ac.cn/api/downloadCode | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/file_utils.py| FlagAI/flagai/model/file_utils.py | https://model.baai.ac.cn/api/searchModleByName | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/file_utils.py| FlagAI/flagai/model/file_utils.py | https://model.baai.ac.cn/api/searchModelFileByName | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/opt_model.py| FlagAI/flagai/model/opt_model.py | https://huggingface.co/models?filter=opt | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/t5_model.py| FlagAI/flagai/model/t5_model.py | https://github.com/tensorflow/mesh/blob/fa19d69eafc9a482aff0b59ddd96b025c0cb207d/mesh_tensorflow/layers.py#L1624 | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/t5_model.py| FlagAI/flagai/model/t5_model.py | https://github.com/tensorflow/mesh/blob/master/mesh_tensorflow/transformer/transformer_layers.py#L56 | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/t5_model.py| FlagAI/flagai/model/t5_model.py | https://github.com/tensorflow/mesh/blob/fa19d69eafc9a482aff0b59ddd96b025c0cb207d/mesh_tensorflow/layers.py#L89 | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/t5_model.py| FlagAI/flagai/model/t5_model.py | https://github.com/tensorflow/mesh/blob/fa19d69eafc9a482aff0b59ddd96b025c0cb207d/mesh_tensorflow/transformer/attention.py#L136 | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/docs/TUTORIAL_8_ENVIRONMENT_SETUP.md| FlagAI/flagai/mpu/grads.py | https://github.com/pytorch/pytorch | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/docs/TUTORIAL_8_ENVIRONMENT_SETUP.md| FlagAI/flagai/mpu/random.py | https://github.com/pytorch/pytorch | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/t5_model.py| FlagAI/flagai/model/t5_model.py | https://github.com/tensorflow/mesh/blob/8d2465e9bc93129b913b5ccc6a59aa97abd96ec6/mesh_tensorflow | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/t5_model.py| FlagAI/flagai/model/t5_model.py | https://github.com/tensorflow/mesh/blob/fa19d69eafc9a482aff0b59ddd96b025c0cb207d/mesh_tensorflow/layers.py#L666 | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/t5_model.py| FlagAI/flagai/model/t5_model.py | https://github.com/tensorflow/mesh/blob/fa19d69eafc9a482aff0b59ddd96b025c0cb207d/mesh_tensorflow/layers.py#L666 | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/t5_model.py| FlagAI/flagai/model/t5_model.py | https://huggingface.co/t5-small | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/examples/Aquila/Aquila-server/post_request.py| FlagAI/examples/Aquila/Aquila-server/post_request.py | http://127.0.0.1 | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/examples/Aquila/Aquila-server/post_request.py| FlagAI/examples/Aquila/Aquila-server/post_request.py | http://127.0.0.1 | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/tokenizer/glm_10b_en/glm_10b_en_bpe_tokenizer.py| FlagAI/flagai/data/tokenizer/tokenizer.py | https://github.com/huggingface/transformers/pull/2778 | 源码实现 | -| 开源代码引入 | 
https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/data/tokenizer/glm_10b_en/glm_10b_en_bpe_tokenizer.py| FlagAI/flagai/data/tokenizer/tokenizer.py | https://github.com/huggingface/transformers/issues/3788 | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/layers/activations.py| FlagAI/flagai/model/layers/activations.py | https://arxiv.org/abs/1606.08415 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/layers/activations.py| FlagAI/flagai/model/layers/activations.py | https://arxiv.org/abs/1606.08415 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/layers/activations.py| FlagAI/flagai/model/layers/activations.py | https://github.com/hendrycks/GELUs | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/layers/activations.py| FlagAI/flagai/model/layers/activations.py | https://github.com/hendrycks/GELUs | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/layers/attentions.py| FlagAI/flagai/model/layers/attentions_bmt.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/layers/activations.py| FlagAI/flagai/model/layers/activations.py | https://arxiv.org/abs/2004.09602 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/layers/activations.py| FlagAI/flagai/model/layers/activations.py | https://arxiv.org/abs/1606.08415 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/layers/activations.py| FlagAI/flagai/model/layers/activations.py | https://arxiv.org/abs/1606.08415 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/layers/activations.py| FlagAI/flagai/model/layers/activations.py | https://arxiv.org/abs/1702.03118 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/layers/activations.py| FlagAI/flagai/model/layers/activations.py | https://arxiv.org/abs/1710.05941v1 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/layers/activations.py| FlagAI/flagai/model/layers/activations.py | https://arxiv.org/abs/1908.08681 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/layers/activations.py| FlagAI/flagai/model/layers/activations.py | https://github.com/digantamisra98/Mish | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/docs/TUTORIAL_8_ENVIRONMENT_SETUP.md| FlagAI/flagai/model/layers/embeddings.py | https://github.com/pytorch/pytorch | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/docs/TUTORIAL_8_ENVIRONMENT_SETUP.md| FlagAI/flagai/model/layers/embeddings_bmt.py | https://github.com/pytorch/pytorch | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/layers/attentions.py| FlagAI/flagai/model/layers/attentions.py | https://github.com/tensorflow/mesh/blob/0cb87fe07da627bf0b7e60475d59f95ed6b5be3d/mesh_tensorflow/transformer/transformer_layers.py#L593 | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/layers/global_pointer.py| FlagAI/flagai/model/layers/global_pointer.py | https://kexue.fm/archives/7359 | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/layers/layer_norm.py| FlagAI/flagai/model/layers/layer_norm_bmt.py | https://arxiv.org/abs/1607.06450 | 参考论文地址 | -| 开源代码引入 | 
https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/layers/layer_norm.py| FlagAI/flagai/model/layers/layer_norm_bmt.py | https://arxiv.org/abs/1910.07467 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/layers/layer_norm.py| FlagAI/flagai/model/layers/layer_norm.py | https://arxiv.org/abs/1607.06450 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/layers/layer_norm.py| FlagAI/flagai/model/layers/layer_norm.py | https://arxiv.org/abs/1910.07467 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/layers/attentions.py| FlagAI/flagai/model/layers/attentions.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/examples/AltCLIP/hf_altclip/modeling_altclip.py| FlagAI/flagai/model/mm/AltCLIP.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/examples/AltCLIP/hf_altclip/modeling_altclip.py| FlagAI/flagai/model/mm/AltCLIP.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/mm/autoencoders.py| FlagAI/flagai/model/mm/autoencoders.py | https://github.com/pytorch/pytorch/issues/37142 | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/examples/AltCLIP/README.md| FlagAI/flagai/model/mm/clip_model.py | https://github.com/openai/CLIP | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/mm/clip_model.py| FlagAI/flagai/model/mm/clip_model.py | https://sachinruk.github.io/blog/pytorch/pytorch%20lightning/loss%20function/gpu/2021/03/07/CLIP.html | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/examples/AltCLIP/README.md| FlagAI/flagai/model/mm/eva_clip_model.py | https://github.com/openai/CLIP | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/mm/eva_clip_model.py| FlagAI/flagai/model/mm/eva_clip_model.py | https://github.com/microsoft/unilm/tree/master/beit | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/mm/clip_model.py| FlagAI/flagai/model/mm/clip_model.py | https://arxiv.org/abs/2111.07991 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/mm/modeling_altclip.py| FlagAI/flagai/model/mm/modeling_altclip.py | https://huggingface.co/xlm-roberta-base | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/mm/modeling_altclip.py| FlagAI/flagai/model/mm/modeling_altclip.py | https://arxiv.org/abs/1803.02155 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/mm/modeling_altclip.py| FlagAI/flagai/model/mm/modeling_altclip.py | https://arxiv.org/abs/2009.13658 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/mm/modeling_altclip.py| FlagAI/flagai/model/mm/modeling_altclip.py | https://huggingface.co/models?filter=altclip | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/aquila2/modeling_aquila.py| FlagAI/flagai/model/mm/modeling_altclip.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/mm/AltDiffusion.py| FlagAI/flagai/model/mm/AltDiffusion.py | 
https://github.com/openai/guided-diffusion/blob/27c20a8fab9cb472df5d6bdd6c8d11c8f430b924/guided_diffusion/losses.py#L12 | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/mm/clip_model.py| FlagAI/flagai/model/mm/modeling_altclip.py | https://sachinruk.github.io/blog/pytorch/pytorch%20lightning/loss%20function/gpu/2021/03/07/CLIP.html | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/mm/utils.py| FlagAI/flagai/model/mm/utils.py | https://github.com/openai/guided-diffusion/blob/27c20a8fab9cb472df5d6bdd6c8d11c8f430b924/guided_diffusion/nn.py#L86 | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/mm/modules/diffusionmodules/util.py| FlagAI/flagai/model/mm/utils.py | https://arxiv.org/abs/2010.02502 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/layers/attentions.py| FlagAI/flagai/model/mm/modeling_altclip.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/mm/utils.py| FlagAI/flagai/model/mm/utils.py | https://gist.github.com/crowsonkb/65f7265353f403714fce3b2595e0b298 | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/predictor/README.md| FlagAI/flagai/model/predictor/simctg.py | https://github.com/yxuansu/SimCTG | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/docs/TUTORIAL_6_INSTRUCTIONS_FOR_PREDICTOR.md| FlagAI/flagai/model/predictor/utils.py | https://arxiv.org/pdf/1909.05858.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/examples/AltCLIP/hf_altclip/modeling_altclip.py| FlagAI/flagai/model/mm/modeling_altclip.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/examples/AltCLIP/hf_altclip/modeling_altclip.py| FlagAI/flagai/model/mm/modeling_altclip.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/swinv1.py| FlagAI/flagai/model/vision/swinv1.py | https://github.com/microsoft/Swin-Transformer | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/docs/TUTORIAL_6_INSTRUCTIONS_FOR_PREDICTOR.md| FlagAI/flagai/model/predictor/predictor.py | http://arxiv.org/abs/1904.09751 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/docs/TUTORIAL_6_INSTRUCTIONS_FOR_PREDICTOR.md| FlagAI/flagai/model/predictor/predictor.py | https://arxiv.org/pdf/1909.05858.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/predictor/utils.py| FlagAI/flagai/model/predictor/utils.py | https://github.com/facebookresearch/XLM/blob/9e6f6814d17be4fe5b15f2e6c43eb2b2d76daeb4/src/model/transformer.py#L529 | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/vit.py| FlagAI/flagai/model/vision/vit.py | https://arxiv.org/abs/2010.11929 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/vit.py| FlagAI/flagai/model/vision/vit.py | https://arxiv.org/abs/2106.10270 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/examples/vit_cifar100/README.md| FlagAI/flagai/model/vision/vit.py | https://github.com/google-research/vision_transformer | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/layers/pos_embed.py| FlagAI/flagai/model/vision/vit.py | 
https://github.com/lucidrains/vit-pytorch | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/vit.py| FlagAI/flagai/model/vision/vit.py | https://github.com/karpathy/minGPT | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/swinv1.py| FlagAI/flagai/model/vision/swinv1.py | https://arxiv.org/pdf/2103.14030 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/vit.py| FlagAI/flagai/model/vision/vit.py | https://arxiv.org/abs/2010.11929 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/docs/TUTORIAL_6_INSTRUCTIONS_FOR_PREDICTOR.md| FlagAI/flagai/model/predictor/utils.py | http://arxiv.org/abs/1904.09751 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/predictor/utils.py| FlagAI/flagai/model/predictor/utils.py | https://gist.github.com/thomwolf/1a5a29f6962089e871b94cbd09daf317 | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/swinv1.py| FlagAI/flagai/model/vision/swinv2.py | https://arxiv.org/pdf/2103.14030 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/flagai/model/vision/vit.py| FlagAI/flagai/model/vision/vit.py | https://github.com/google-research/vision_transformer/blob/00883dd691c63a6830751563748663526e811cee/vit_jax/checkpoint.py#L224 | 源码实现 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/examples/cpm3_generation/generation.py| FlagAI/flagai/model/predictor/utils.py | https://medium.com/huggingface/how-to-build-a-state-of-the-art-conversational-ai-with-transfer-learning-2d818ac26313 | 模型相关说明 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/examples/cpm3_generation/generation.py| FlagAI/flagai/model/predictor/utils.py | https://arxiv.org/abs/1909.05858 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/examples/cpm3_generation/generation.py| FlagAI/flagai/model/predictor/utils.py | https://arxiv.org/abs/1909.05858 | 参考论文地址 | -| 开源代码引入 | https://github.com/FlagAI-Open/FlagAI/blob/master/examples/cpm3_generation/generation.py| FlagAI/flagai/model/predictor/utils.py | https://github.com/pytorch/fairseq/blob/a07cb6f40480928c9e0548b737aadd36ee66ac76/fairseq/sequence_generator.py#L345 | 源码实现 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|----------------------------------------------------------------------------------------------------------|---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|-------------| +| ModelZoo-PyTorch/PyTorch/built-in/foundation/FlagAI/.github/workflows/python-app.yml | https://pypi.tuna.tsinghua.edu.cn/simple | pip相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/FlagAI/Dockerfile | https://download.pytorch.org/whl/cu117 | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/FlagAI/examples/Aquila/Aquila-chat/data/convo_samples.jsonl | https://woodwellclimate.org/climate-impacts/arctic-wildlife/\n2. https://www.nationalgeographic.com/animals/2018/05/arctic-species-climate-change-global-warming-news/\n3. 
https://wwf.panda.org/our_work/wildlife/arcticwildlife/climate_change/ | html相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/FlagAI/flagai/data/dataset/superglue/control.py | https://dl.fbaipublicfiles.com/glue/superglue/data/v2/WSC.zip | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/FlagAI/flagai/data/dataset/superglue/control.py | https://dl.fbaipublicfiles.com/glue/superglue/data/v2/WiC.zip | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/FlagAI/flagai/data/dataset/superglue/control.py | https://dl.fbaipublicfiles.com/glue/superglue/data/v2/RTE.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/FlagAI/flagai/data/dataset/superglue/control.py | https://dl.fbaipublicfiles.com/glue/superglue/data/v2/ReCoRD.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/FlagAI/flagai/data/dataset/superglue/control.py | https://dl.fbaipublicfiles.com/glue/superglue/data/v2/MultiRC.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/FlagAI/flagai/data/dataset/superglue/control.py | https://dl.fbaipublicfiles.com/glue/superglue/data/v2/COPA.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/FlagAI/flagai/data/dataset/superglue/control.py | https://dl.fbaipublicfiles.com/glue/superglue/data/v2/CB.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/FlagAI/flagai/data/dataset/superglue/control.py | https://dl.fbaipublicfiles.com/glue/superglue/data/v2/BoolQ.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/FlagAI/flagai/data/dataset/superglue/control.py | https://dl.fbaipublicfiles.com/glue/superglue/data/v2/AX-g.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/FlagAI/flagai/data/dataset/superglue/control.py | https://dl.fbaipublicfiles.com/glue/superglue/data/v2/AX-b.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/FlagAI/flagai/data/dataset/superglue/control.py | https://dl.fbaipublicfiles.com/glue/data/SST-2.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/FlagAI/flagai/data/dataset/superglue/control.py | https://dl.fbaipublicfiles.com/glue/data/QQP-clean.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/FlagAI/flagai/data/dataset/superglue/control.py | https://dl.fbaipublicfiles.com/glue/data/QNLIv2.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/FlagAI/flagai/data/dataset/superglue/control.py | https://dl.fbaipublicfiles.com/glue/data/MNLI.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/FlagAI/flagai/data/dataset/superglue/control.py | https://dl.fbaipublicfiles.com/glue/data/MNLI.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/FlagAI/flagai/data/dataset/superglue/control.py | https://storage.googleapis.com/cluebenchmark/tasks/tnews_public.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/FlagAI/flagai/data/dataset/superglue/control.py | https://storage.googleapis.com/cluebenchmark/tasks/cmrc2018_public.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/FlagAI/flagai/data/dataset/superglue/control.py | https://storage.googleapis.com/cluebenchmark/tasks/afqmc_public.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/FlagAI/flagai/data/dataset/superglue/control.py | https://dl.fbaipublicfiles.com/glue/data/CoLA.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/FlagAI/flagai/data/dataset/superglue/properties.py | https://dl.fbaipublicfiles.com/glue/superglue/data/v2/WSC.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/FlagAI/flagai/data/dataset/superglue/properties.py | 
https://dl.fbaipublicfiles.com/glue/superglue/data/v2/WiC.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/FlagAI/flagai/data/dataset/superglue/properties.py | https://dl.fbaipublicfiles.com/glue/superglue/data/v2/RTE.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/FlagAI/flagai/data/dataset/superglue/properties.py | https://dl.fbaipublicfiles.com/glue/superglue/data/v2/ReCoRD.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/FlagAI/flagai/data/dataset/superglue/properties.py | https://dl.fbaipublicfiles.com/glue/superglue/data/v2/MultiRC.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/FlagAI/flagai/data/dataset/superglue/properties.py | https://dl.fbaipublicfiles.com/glue/superglue/data/v2/COPA.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/FlagAI/flagai/data/dataset/superglue/properties.py | https://dl.fbaipublicfiles.com/glue/superglue/data/v2/CB.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/FlagAI/flagai/data/dataset/superglue/properties.py | https://dl.fbaipublicfiles.com/glue/superglue/data/v2/BoolQ.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/FlagAI/flagai/data/dataset/superglue/properties.py | https://dl.fbaipublicfiles.com/glue/superglue/data/v2/AX-g.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/FlagAI/flagai/data/dataset/superglue/properties.py | https://dl.fbaipublicfiles.com/glue/superglue/data/v2/AX-b.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/FlagAI/flagai/data/dataset/superglue/properties.py | https://storage.googleapis.com/cluebenchmark/tasks/tnews_public.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/FlagAI/flagai/data/dataset/superglue/properties.py | https://storage.googleapis.com/cluebenchmark/tasks/cmrc2018_public.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/FlagAI/flagai/data/dataset/superglue/properties.py | https://storage.googleapis.com/cluebenchmark/tasks/afqmc_public.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/FlagAI/flagai/data/tokenizer/bert/bert_tokenizer.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-uncased-vocab.txt | 相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/FlagAI/flagai/data/tokenizer/bert/bert_tokenizer.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-chinese-vocab.txt | 相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/FlagAI/flagai/data/tokenizer/bert/bert_tokenizer.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-vocab.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/FlagAI/flagai/data/tokenizer/bert/bert_tokenizer.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-cased-vocab.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/FlagAI/flagai/data/tokenizer/bert/bert_tokenizer.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased-vocab.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/FlagAI/flagai/data/tokenizer/bert/wordpiece.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-uncased-vocab.txt | 相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/FlagAI/flagai/data/tokenizer/bert/wordpiece.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-chinese-vocab.txt | 相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/FlagAI/flagai/data/tokenizer/bert/wordpiece.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-vocab.txt | 
模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/FlagAI/flagai/data/tokenizer/bert/wordpiece.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-cased-vocab.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/FlagAI/flagai/data/tokenizer/bert/wordpiece.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased-vocab.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/FlagAI/flagai/model/aquila_modeling_hf.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/FlagAI/flagai/model/aquila_modeling_hf.py | https://arxiv.org/abs/1910.13461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/FlagAI/flagai/model/file_utils.py | https://model.baai.ac.cn/api/downloadCode | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/FlagAI/flagai/model/file_utils.py | https://model.baai.ac.cn/api/downloadCode | baai资源下载链接 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/FlagAI/flagai/model/file_utils.py | https://model.baai.ac.cn/api/searchModleByName | baai资源下载链接 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/FlagAI/flagai/model/file_utils.py | https://model.baai.ac.cn/api/searchModelFileByName | baai资源下载链接 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/FlagAI/flagai/model/mm/modeling_altclip.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/FlagAI/setup.cfg | https://mirrors.aliyun.com/pypi/simple/ | 环境创建所使用的的源 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/FlagAI/setup.py | open@baai.ac.cn | 作者邮箱 | \ No newline at end of file diff --git a/PyTorch/built-in/foundation/GLM-10B/hostfile b/PyTorch/built-in/foundation/GLM-10B/hostfile index 8a3f0459ff7d503ac4680eb17fe6ff92c5811263..298d0e6a62d84d13a866e7fc2fbb344da1b80981 100644 --- a/PyTorch/built-in/foundation/GLM-10B/hostfile +++ b/PyTorch/built-in/foundation/GLM-10B/hostfile @@ -1,2 +1,2 @@ -90.90.3.37 slots=8 -90.90.3.39 slots=8 +x.x.x.x slots=8 +x.x.x.x slots=8 diff --git a/PyTorch/built-in/foundation/GLM-10B/public_address_statement.md b/PyTorch/built-in/foundation/GLM-10B/public_address_statement.md index b5e14e64f0559181253edb0f3b9d559b5c0e93c6..1c171071437b39bd30350a812c0fd89d494a9ac9 100644 --- a/PyTorch/built-in/foundation/GLM-10B/public_address_statement.md +++ b/PyTorch/built-in/foundation/GLM-10B/public_address_statement.md @@ -1,52 +1,16 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|------------------------------------------------------------------------------------------------------|-----------------------------------|------------------------------------------------------------------------------------------------|---------| -| 开源代码引入 | https://github.com/THUDM/GLM/blob/4f61ed7237a3b0187f4d62062429348276a78c84/data_utils/wordpiece.py | GLM-10B/data_utils/wordpiece.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-uncased-vocab.txt | 下载预训练文件 | -| 开源代码引入 | https://github.com/THUDM/GLM/blob/4f61ed7237a3b0187f4d62062429348276a78c84/data_utils/wordpiece.py | GLM-10B/data_utils/wordpiece.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-cased-vocab.txt | 下载预训练文件 | -| 开源代码引入 | https://github.com/THUDM/GLM/blob/4f61ed7237a3b0187f4d62062429348276a78c84/data_utils/wordpiece.py | GLM-10B/data_utils/wordpiece.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-chinese-vocab.txt | 下载预训练文件 | -| 开源代码引入 | 
https://github.com/THUDM/GLM/blob/4f61ed7237a3b0187f4d62062429348276a78c84/docker/cuda102.dockerfile | GLM-10B/docker/cuda102.dockerfile | http://www.mellanox.com/downloads/ofed/MLNX_OFED | 下载第三方包 | -| 开源代码引入 | https://github.com/THUDM/GLM/blob/4f61ed7237a3b0187f4d62062429348276a78c84/docker/cuda102.dockerfile | GLM-10B/docker/cuda102.dockerfile | https://github.com/Mellanox/nv_peer_memory.git | 下载第三方包 | -| 开源代码引入 | https://github.com/THUDM/GLM/blob/4f61ed7237a3b0187f4d62062429348276a78c84/docker/cuda102.dockerfile | GLM-10B/docker/cuda102.dockerfile | https://download.open-mpi.org/release/open-mpi/ | 下载第三方包 | -| 开源代码引入 | https://github.com/THUDM/GLM/blob/4f61ed7237a3b0187f4d62062429348276a78c84/docker/cuda102.dockerfile | GLM-10B/docker/cuda102.dockerfile | https://mirrors.tuna.tsinghua.edu.cn/anaconda/miniconda/Miniconda3-latest-Linux-x86_64.sh | 下载第三方包 | -| 开源代码引入 | https://github.com/THUDM/GLM/blob/4f61ed7237a3b0187f4d62062429348276a78c84/docker/cuda102.dockerfile | GLM-10B/docker/cuda102.dockerfile | https://tuna.moe/oh-my-tuna/oh-my-tuna.py | 下载第三方包 | -| 开源代码引入 | https://github.com/THUDM/GLM/blob/4f61ed7237a3b0187f4d62062429348276a78c84/docker/cuda102.dockerfile | GLM-10B/docker/cuda102.dockerfile | https://github.com/pytorch/pytorch | 下载第三方包 | -| 开源代码引入 | https://github.com/THUDM/GLM/blob/4f61ed7237a3b0187f4d62062429348276a78c84/docker/cuda102.dockerfile | GLM-10B/docker/cuda102.dockerfile | git clone https://github.com/NVIDIA/apex | 下载第三方包 | -| 开源代码引入 | https://github.com/THUDM/GLM/blob/4f61ed7237a3b0187f4d62062429348276a78c84/docker/cuda102.dockerfile | GLM-10B/docker/cuda102.dockerfile | https://github.com/microsoft/DeepSpeed.git | 下载第三方包 | -| 开源代码引入 | https://github.com/THUDM/GLM/blob/4f61ed7237a3b0187f4d62062429348276a78c84/docker/cuda112.dockerfile | GLM-10B/docker/cuda112.dockerfile | http://www.mellanox.com/downloads/ofed/MLNX_OFED | 下载第三方包 | -| 开源代码引入 | https://github.com/THUDM/GLM/blob/4f61ed7237a3b0187f4d62062429348276a78c84/docker/cuda112.dockerfile | GLM-10B/docker/cuda112.dockerfile | https://tuna.moe/oh-my-tuna/oh-my-tuna.py | 下载第三方包 | -| 开源代码引入 | https://github.com/THUDM/GLM/blob/4f61ed7237a3b0187f4d62062429348276a78c84/docker/cuda112.dockerfile | GLM-10B/docker/cuda112.dockerfile | https://github.com/microsoft/DeepSpeed.git | 下载第三方包 | -| 开源代码引入 | https://github.com/THUDM/GLM/blob/4f61ed7237a3b0187f4d62062429348276a78c84/model/modeling_bert.py | GLM-10B/model/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/THUDM/GLM/blob/4f61ed7237a3b0187f4d62062429348276a78c84/model/modeling_bert.py | GLM-10B/model/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/THUDM/GLM/blob/4f61ed7237a3b0187f4d62062429348276a78c84/model/modeling_bert.py | GLM-10B/model/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-uncased.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/THUDM/GLM/blob/4f61ed7237a3b0187f4d62062429348276a78c84/model/modeling_bert.py | GLM-10B/model/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-cased.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/THUDM/GLM/blob/4f61ed7237a3b0187f4d62062429348276a78c84/model/modeling_bert.py | GLM-10B/model/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-chinese.tar.gz | 下载权重文件 | -| 开源代码引入 | 
https://github.com/THUDM/GLM/blob/master/model/modeling_bert.py|GLM-10B/model/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased.tar. | 模型相关说明 | -| 开源代码引入 | https://github.com/THUDM/GLM/blob/master/model/modeling_bert.py|GLM-10B/model/modeling_bert.py | https://www.tensorflow.org/instal | 模型相关说明 | -| 开源代码引入 | https://github.com/THUDM/GLM/blob/master/model/modeling_bert.py|GLM-10B/model/modeling_bert.py | https://www.github.com/nvidia/ap | 模型相关说明 | -| 开源代码引入 | https://github.com/THUDM/GLM/blob/master/model/modeling_bert.py|GLM-10B/model/modeling_bert.py | https://github.com/pytorch/pytorch/pull/56 | 源码实现 | -| 开源代码引入 | https://github.com/THUDM/GLM/blob/master/fp16/fp16util.py|GLM-10B/fp16/fp16util.py | http://on-demand.gputechconf.com/gtc/2018/video/S8101 | 模型相关说明 | -| 开源代码引入 | https://github.com/THUDM/GLM/blob/master/fp16/fp16util.py|GLM-10B/fp16/fp16util.py | http://pytorch.org/docs/master/_modules/torch/_utils.ht | 模型相关说明 | -| 开源代码引入 | https://github.com/THUDM/GLM/blob/master/fp16/fp16.py|GLM-10B/fp16/fp16.py | https://github.com/pytorch/pytorch/issues/77 | 模型相关说明 | -| 开源代码引入 | https://github.com/THUDM/GLM/blob/master/fp16/fp16.py|GLM-10B/fp16/fp16.py | http://pytorch.org/docs/master/optim.html#optimizer-step-closu | 模型相关说明 | -| 开源代码引入 | https://github.com/THUDM/GLM/blob/master/docker/cuda102.dockerfile|GLM-10B/docker/cuda112.dockerfile | http://www.mellanox.com/downloads/ofed/MLNX_OFE | 模型相关说明 | -| 开源代码引入 | https://github.com/THUDM/GLM/blob/master/docker/cuda102.dockerfile|GLM-10B/docker/cuda112.dockerfile | https://download.open-mpi.org/release/open-mpi | 模型相关说明 | -| 开源代码引入 | https://github.com/THUDM/GLM/blob/master/docker/cuda102.dockerfile|GLM-10B/docker/cuda112.dockerfile | https://github.com/pytorch/vision.g | 源码实现 | -| 开源代码引入 | https://github.com/THUDM/GLM/blob/master/docker/cuda102.dockerfile|GLM-10B/docker/cuda112.dockerfile | https://stackoverflow.com/a/539268 | 模型相关说明 | -| 开源代码引入 | https://github.com/THUDM/GLM/blob/master/docker/cuda102.dockerfile|GLM-10B/docker/cuda102.dockerfile | http://www.mellanox.com/downloads/ofed/MLNX_OFE | 模型相关说明 | -| 开源代码引入 | https://github.com/THUDM/GLM/blob/master/docker/cuda102.dockerfile|GLM-10B/docker/cuda102.dockerfile | https://download.open-mpi.org/release/open-mpi | 模型相关说明 | -| 开源代码引入 | https://github.com/THUDM/GLM/blob/master/docker/cuda102.dockerfile|GLM-10B/docker/cuda102.dockerfile | https://github.com/pytorch/vision.g | 源码实现 | -| 开源代码引入 | https://github.com/THUDM/GLM/blob/master/docker/cuda102.dockerfile|GLM-10B/docker/cuda102.dockerfile | https://stackoverflow.com/a/539268 | 模型相关说明 | -| 开源代码引入 | https://github.com/THUDM/GLM/blob/master/data_utils/wordpiece.py|GLM-10B/data_utils/wordpiece.py | https://github.com/huggingface/pytorch-pretrained-BERT/blob/master/pytorch_pretrained_bert/tokenization. 
| 源码实现 | -| 开源代码引入 | https://github.com/THUDM/GLM/blob/master/data_utils/wordpiece.py|GLM-10B/data_utils/wordpiece.py | https://en.wikipedia.org/wiki/CJK_Unified_Ideograph | 模型相关说明 | -| 开源代码引入 | https://github.com/THUDM/GLM/blob/master/data_utils/tokenization.py|GLM-10B/data_utils/tokenization.py | https://github.com/huggingface/transformers/pull/27 | 源码实现 | -| 开源代码引入 | https://github.com/THUDM/GLM/blob/master/data_utils/tokenization.py|GLM-10B/data_utils/tokenization.py | https://github.com/huggingface/transformers/issues/37 | 模型相关说明 | -| 开源代码引入 | https://github.com/THUDM/GLM/blob/master/data_utils/tokenization.py|GLM-10B/data_utils/tokenization.py | https://github.com/huggingface/transformers/pull/27 | 源码实现 | -| 开源代码引入 | https://github.com/THUDM/GLM/blob/master/data_utils/tokenization.py|GLM-10B/data_utils/tokenization.py | https://github.com/huggingface/transformers/issues/37 | 模型相关说明 | -| 开源代码引入 | https://github.com/THUDM/GLM/blob/master/data_utils/sp_tokenizer.py|GLM-10B/data_utils/sp_tokenizer.py | https://github.com/openai/gpt- | 源码实现 | -| 开源代码引入 | https://github.com/THUDM/GLM/blob/master/data_utils/sp_tokenizer.py|GLM-10B/data_utils/sp_tokenizer.py | https://github.com/google/sentencepie | 源码实现 | -| 开源代码引入 | https://github.com/THUDM/GLM/blob/master/data_utils/sp_tokenizer.py|GLM-10B/data_utils/sp_tokenizer.py | https://github.com/google/sentencepiece.g | 源码实现 | -| 开源代码引入 | https://github.com/THUDM/GLM/blob/master/data_utils/file_utils.py|GLM-10B/data_utils/file_utils.py | https://github.com/huggingface/pytorch-pretrained-BE | 源码实现 | -| 开源代码引入 | https://github.com/THUDM/GLM/blob/master/data_utils/file_utils.py|GLM-10B/data_utils/file_utils.py | https://github.com/allenai/allenn | 源码实现 | -| 开源代码引入 | https://github.com/THUDM/GLM/blob/master/data_utils/datasets.py|GLM-10B/data_utils/datasets.py | https://github.com/google-research/bert/blob/master/create_pretraining_data.py#L248-L2 | 源码实现 | -| 开源代码引入 | https://github.com/THUDM/GLM/blob/master/data_utils/datasets.py|GLM-10B/data_utils/datasets.py | https://github.com/google-research/bert/blob/master/create_pretraining_data.py#L3 | 源码实现 | -| 开源代码引入 | https://github.com/THUDM/GLM/blob/master/data_utils/datasets.py|GLM-10B/data_utils/datasets.py | https://arxiv.org/pdf/1810.04805.p | 模型相关说明 | -| 开源代码引入 | https://github.com/THUDM/GLM/blob/master/data_utils/datasets.py|GLM-10B/data_utils/datasets.py | https://github.com/google-research/bert/blob/master/create_pretraining_data.py#L | 源码实现 | +| 文件位置 | 公网地址 | 公网地址用途 | +|--------------------------------------------------------------------------------|-----------------------------------------------------------------------------------------------------------------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/built-in/foundation/GLM-10B/data_utils/wordpiece.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-uncased-vocab.txt | 相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/GLM-10B/data_utils/wordpiece.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-cased-vocab.txt | 相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/GLM-10B/data_utils/wordpiece.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-chinese-vocab.txt | 相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/GLM-10B/docker/cuda102.dockerfile | http://www.mellanox.com/downloads/ofed/MLNX_OFED-${MLNX_OFED_VERSION}/MLNX_OFED_LINUX-${MLNX_OFED_VERSION}-ubuntu18.04-x86_64.tgz | 数据集链接 | +| 
ModelZoo-PyTorch/PyTorch/built-in/foundation/GLM-10B/docker/cuda102.dockerfile | https://download.open-mpi.org/release/open-mpi/v${OPENMPI_BASEVERSION}/openmpi-${OPENMPI_VERSION}.tar.gz | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/GLM-10B/docker/cuda102.dockerfile | https://mirrors.tuna.tsinghua.edu.cn/anaconda/miniconda/Miniconda3-latest-Linux-x86_64.sh | miniconda链接 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/GLM-10B/docker/cuda112.dockerfile | http://www.mellanox.com/downloads/ofed/MLNX_OFED-${MLNX_OFED_VERSION}/MLNX_OFED_LINUX-${MLNX_OFED_VERSION}-ubuntu20.04-x86_64.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/GLM-10B/model/modeling_bert.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/GLM-10B/model/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased.tar.gz | 相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/GLM-10B/model/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased.tar.gz | 相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/GLM-10B/model/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-uncased.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/GLM-10B/model/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-cased.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/GLM-10B/model/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-chinese.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/GLM-10B/model/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased.tar.gz | 权重地址 | \ No newline at end of file diff --git a/PyTorch/built-in/foundation/GPT-NeoX/public_address_statement.md b/PyTorch/built-in/foundation/GPT-NeoX/public_address_statement.md index 03688d68080064853e2f486f2e4f50086835776a..317156546961b35e1462f562e71c073d3c4a9da1 100644 --- a/PyTorch/built-in/foundation/GPT-NeoX/public_address_statement.md +++ b/PyTorch/built-in/foundation/GPT-NeoX/public_address_statement.md @@ -1,73 +1,11 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ---- | ------------ | ------ | ------------------------------------ | -------- | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/.clang-format|GPT-NeoX/.clang-format | http://releases.llvm.org/8.0.0/tools/clang/docs/ClangFormatStyleOptions.html | 模型相关说明 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/evaluate.py|GPT-NeoX/evaluate.py | https://github.com/EleutherAI/lm-evaluation-harness | 源码实现 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/Dockerfile|GPT-NeoX/Dockerfile | https://github.com/sudo-project/sudo/issues/42 | 模型相关说明 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/Dockerfile|GPT-NeoX/Dockerfile | https://download.open-mpi.org/release/open-mpi/v | 模型相关说明 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/Dockerfile|GPT-NeoX/Dockerfile | https://download.pytorch.org/whl/torch_stable.html | 模型相关说明 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/Dockerfile|GPT-NeoX/Dockerfile | https://github.com/NVIDIA/apex.git@a651e2c24ecf97cbf367fd3f330df36760e1c597 | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/evaluate.py|GPT-NeoX/eval_tasks/eval_adapter.py | https://github.com/EleutherAI/lm-evaluation-harness | 源码实现 | -| 开源代码引入 | 
https://github.com/EleutherAI/gpt-neox/blob/master/megatron/learning_rates.py|GPT-NeoX/megatron/learning_rates.py | https://openreview.net/pdf?id=BJYwwY9ll | 模型相关说明 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/megatron/optimizers.py|GPT-NeoX/megatron/optimizers.py | https://arxiv.org/abs/1901.11150 | 模型相关说明 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/configs/README.md|GPT-NeoX/megatron/optimizers.py | https://arxiv.org/abs/2101.11075 | 模型相关说明 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/megatron/text_generation_utils.py|GPT-NeoX/megatron/text_generation_utils.py | https://medium.com/huggingface/how-to-build-a-state-of-the-art-conversational-ai-with-transfer-learning-2d818ac26313 | 模型相关说明 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/configs/neox_arguments.md|GPT-NeoX/megatron/training.py | https://github.com/microsoft/mup | 源码实现 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/configs/neox_arguments.md|GPT-NeoX/megatron/training.py | https://github.com/microsoft/mup | 源码实现 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/configs/neox_arguments.md|GPT-NeoX/megatron/training.py | https://github.com/microsoft/mup | 源码实现 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/megatron/model/word_embeddings.py|GPT-NeoX/megatron/training.py | https://github.com/facebookresearch/bitsandbytes | 源码实现 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/configs/neox_arguments.md|GPT-NeoX/megatron/training.py | https://github.com/microsoft/mup | 源码实现 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/tests/common.py|GPT-NeoX/tests/common.py | https://github.com/EleutherAI/DeeperSpeed/blob/24026e5bb37c528a222b8635c46256b1e1825d2e/tests/unit/common.py#L16 | 源码实现 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/tools/convert_to_hf.py|GPT-NeoX/tools/convert_sequential_to_hf.py | https://github.com/EleutherAI/gpt-neox/pull/481 | 源码实现 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/tools/convert_to_hf.py|GPT-NeoX/tools/convert_to_hf.py | https://github.com/EleutherAI/gpt-neox/pull/481 | 源码实现 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/tools/convert_to_hf.py|GPT-NeoX/tools/convert_v1.0_to_hf.py | https://github.com/EleutherAI/gpt-neox/pull/481 | 源码实现 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/megatron/tokenizer/gpt2_tokenization.py|GPT-NeoX/tools/corpora.py | https://s3.amazonaws.com/models.huggingface.co/bert/gpt2-vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/megatron/tokenizer/gpt2_tokenization.py|GPT-NeoX/tools/corpora.py | https://s3.amazonaws.com/models.huggingface.co/bert/gpt2-merges.txt | 模型参数相关配置 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/tools/inspect_checkpoints.py|GPT-NeoX/tools/inspect_checkpoints.py | https://github.com/awaelchli/pytorch-lightning-snippets/blob/master/checkpoint/peek.py | 源码实现 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/tools/corpora.py|GPT-NeoX/tools/corpora.py | https://github.com/LuminosoInsight/python-ftfy | 源码实现 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/tools/corpora.py|GPT-NeoX/tools/corpora.py | http://eaidata.bmk.sh/data/enron_emails.jsonl.zst | 模型参数相关配置 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/tools/corpora.py|GPT-NeoX/tools/corpora.py | https://the-eye.eu/public/AI/pile/train/00.jsonl.zst | 模型参数相关配置 | -| 开源代码引入 | 
https://github.com/EleutherAI/gpt-neox/blob/master/tools/corpora.py|GPT-NeoX/tools/corpora.py | https://the-eye.eu/public/AI/pile/train/ | 模型相关说明 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/tools/corpora.py|GPT-NeoX/tools/corpora.py | http://eaidata.bmk.sh/data/github_small.jsonl.zst | 模型参数相关配置 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/tools/corpora.py|GPT-NeoX/tools/corpora.py | https://the-eye.eu/public/AI/pile_preliminary_components/2020-09-08-arxiv-extracts-nofallback-until-2007-068.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/tools/corpora.py|GPT-NeoX/tools/corpora.py | https://the-eye.eu/public/AI/pile_preliminary_components/EuroParliamentProceedings_1996_2011.jsonl.zst | 模型参数相关配置 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/tools/corpora.py|GPT-NeoX/tools/corpora.py | https://the-eye.eu/public/AI/pile_preliminary_components/FreeLaw_Opinions.jsonl.zst | 模型参数相关配置 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/tools/corpora.py|GPT-NeoX/tools/corpora.py | https://the-eye.eu/public/AI/pile_preliminary_components/NIH_ExPORTER_awarded_grant_text.jsonl.zst | 模型参数相关配置 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/tools/corpora.py|GPT-NeoX/tools/corpora.py | https://the-eye.eu/public/AI/pile_preliminary_components/PMC_extracts.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/tools/corpora.py|GPT-NeoX/tools/corpora.py | https://the-eye.eu/public/AI/pile_preliminary_components/books1.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/tools/corpora.py|GPT-NeoX/tools/corpora.py | https://the-eye.eu/public/AI/pile_preliminary_components/books3.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/tools/corpora.py|GPT-NeoX/tools/corpora.py | https://the-eye.eu/public/AI/pile_preliminary_components/hn.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/tools/corpora.py|GPT-NeoX/tools/corpora.py | https://the-eye.eu/public/AI/pile_preliminary_components/openwebtext2.jsonl.zst.tar | 模型参数相关配置 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/tools/corpora.py|GPT-NeoX/tools/corpora.py | https://the-eye.eu/public/AI/pile_preliminary_components/stackexchange_dataset.tar | 模型相关说明 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/tools/corpora.py|GPT-NeoX/tools/corpora.py | https://the-eye.eu/public/AI/pile_preliminary_components/ubuntu_irc_until_2020_9_1.jsonl.zst | 模型参数相关配置 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/tools/corpora.py|GPT-NeoX/tools/corpora.py | https://the-eye.eu/public/AI/pile_preliminary_components/yt_subs.jsonl.zst | 模型参数相关配置 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/tools/corpora.py|GPT-NeoX/tools/corpora.py | https://the-eye.eu/eleuther_staging/c4/en/c4-train | 模型相关说明 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/tools/corpora.py|GPT-NeoX/tools/corpora.py | https://the-eye.eu/eleuther_staging/c4/realnewslike/c4-train | 模型相关说明 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/tools/corpora.py|GPT-NeoX/tools/corpora.py | https://data.deepai.org/enwik8.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/configs/neox_arguments.md|GPT-NeoX/megatron/data/data_utils.py | https://arxiv.org/abs/1911.02116 | 模型相关说明 | -| 开源代码引入 | 
https://github.com/EleutherAI/gpt-neox/blob/master/Dockerfile|GPT-NeoX/megatron/fused_kernels/compat.h | https://github.com/NVIDIA/apex | 源码实现 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/megatron/fused_kernels/setup.py|GPT-NeoX/megatron/fused_kernels/setup.py | alejandro.molina@aleph-alpha.de | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/configs/neox_arguments.md|GPT-NeoX/megatron/gradient_noise_scale/gradient_noise_scale.py | https://arxiv.org/abs/1812.06162 | 模型相关说明 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/megatron/model/flash_attention.py|GPT-NeoX/megatron/model/flash_attention.py | https://github.com/HazyResearch/flash-attention/blob/4a6eaa9f27df6fff7ffb2c24e894938a687dd870/flash_attn/flash_attn_interface.py | 源码实现 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/megatron/model/init_functions.py|GPT-NeoX/megatron/model/init_functions.py | https://arxiv.org/pdf/math-ph/0609050.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/configs/neox_arguments.md|GPT-NeoX/megatron/model/init_functions.py | https://github.com/microsoft/mup | 源码实现 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/megatron/model/gpt2_model.py|GPT-NeoX/megatron/model/gpt2_model.py | https://github.com/microsoft/mup/issues/6#issuecomment-1082156274 | 模型相关说明 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/megatron/model/word_embeddings.py|GPT-NeoX/megatron/model/word_embeddings.py | https://github.com/facebookresearch/bitsandbytes | 源码实现 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/megatron/mpu/layers.py|GPT-NeoX/megatron/mpu/layers.py | https://github.com/pytorch/pytorch | 源码实现 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/megatron/mpu/layers.py | GPT-NeoX/megatron/mpu/layers.py | https://github.com/pytorch/pytorch/blob/c47cf9bc7f9e02f649ab4ed53fe4d35732c92ab6/torch/_refs/__init__.py#L2761 | 源码实现 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/megatron/mpu/layers.py|GPT-NeoX/megatron/mpu/layers.py | https://github.com/lucidrains/x-transformers/blob/6b93c21be0d0a679da6f7b9621d9bb638ab18428/x_transformers/x_transformers.py#L106 | 源码实现 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/configs/1-3B.yml|GPT-NeoX/megatron/neox_arguments/arguments.py | https://www.deepspeed.ai/docs/config-json/#zero-optimizations-for-fp16-training | 模型相关说明 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/configs/1-3B.yml|GPT-NeoX/megatron/neox_arguments/deepspeed_args.py | https://www.deepspeed.ai/docs/config-json/ | 模型相关说明 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/configs/neox_arguments.md|GPT-NeoX/megatron/neox_arguments/deepspeed_args.py | https://deepspeed.readthedocs.io/en/latest/schedulers.html | 模型相关说明 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/configs/neox_arguments.md|GPT-NeoX/megatron/neox_arguments/deepspeed_args.py | https://www.deepspeed.ai/docs/config-json/#automatic-mixed-precision-amp-training-options | 模型相关说明 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/configs/neox_arguments.md|GPT-NeoX/megatron/neox_arguments/deepspeed_args.py | https://www.deepspeed.ai/docs/config-json/#flops-profiler | 模型相关说明 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/configs/neox_arguments.md|GPT-NeoX/megatron/neox_arguments/deepspeed_args.py | https://github.com/microsoft/DeepSpeed/tree/master/deepspeed/autotuning | 源码实现 | -| 开源代码引入 | 
https://github.com/EleutherAI/gpt-neox/blob/master/configs/neox_arguments.md|GPT-NeoX/megatron/neox_arguments/neox_args.py | https://www.deepspeed.ai/docs/config-json/#sparse-attention | 模型相关说明 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/megatron/neox_arguments/arguments.py|GPT-NeoX/megatron/neox_arguments/arguments.py | https://www.deepspeed.ai/tutorials/autotuning/ | 模型相关说明 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/megatron/tokenizer/gpt2_tokenization.py|GPT-NeoX/megatron/tokenizer/gpt2_tokenization.py | https://s3.amazonaws.com/models.huggingface.co/bert/gpt2-vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/megatron/tokenizer/gpt2_tokenization.py|GPT-NeoX/megatron/tokenizer/gpt2_tokenization.py | https://s3.amazonaws.com/models.huggingface.co/bert/gpt2-merges.txt | 模型参数相关配置 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/configs/local_setup.yml|GPT-NeoX/megatron/neox_arguments/neox_args.py | https://api.wandb.ai | 模型相关说明 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/configs/neox_arguments.md|GPT-NeoX/megatron/neox_arguments/neox_args.py | https://arxiv.org/abs/1812.06162 | 模型相关说明 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/configs/neox_arguments.md|GPT-NeoX/megatron/neox_arguments/neox_args.py | https://arxiv.org/abs/1911.02116 | 模型相关说明 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/configs/neox_arguments.md|GPT-NeoX/megatron/neox_arguments/neox_args.py | https://github.com/microsoft/mup | 源码实现 | -| 开源代码引入 | https://github.com/EleutherAI/gpt-neox/blob/master/megatron/tokenizer/tokenizer.py|GPT-NeoX/megatron/tokenizer/tokenizer.py | https://github.com/openai/tiktoken | 源码实现 | -| 开发引入 | https://github.com/EleutherAI/gpt-neox/blob/master/megatron/tokenizer/tokenizer.py|GPT-NeoX/megatron/tokenizer/tokenizer.py | https://github.com/openai/tiktoken | 源码实现 | +| 文件位置 | 公网地址 | 公网地址用途 | +|-----------------------------------------------------------------------------------------------|----------------------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/built-in/foundation/GPT-NeoX/Dockerfile | https://download.pytorch.org/whl/torch_stable.html | 三方库下载 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/GPT-NeoX/Dockerfile | https://download.open-mpi.org/release/open-mpi/v${OPENMPI_BASEVERSION}/openmpi-${OPENMPI_VERSION}.tar.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/GPT-NeoX/megatron/fused_kernels/setup.py | alejandro.molina@aleph-alpha.de | 作者邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/GPT-NeoX/megatron/neox_arguments/neox_args.py | https://arxiv.org/abs/1812.06162 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/GPT-NeoX/megatron/tokenizer/gpt2_tokenization.py | https://s3.amazonaws.com/models.huggingface.co/bert/gpt2-vocab.json | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/GPT-NeoX/megatron/tokenizer/gpt2_tokenization.py | https://s3.amazonaws.com/models.huggingface.co/bert/gpt2-merges.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/GPT-NeoX/tools/corpora.py | https://s3.amazonaws.com/models.huggingface.co/bert/gpt2-vocab.json | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/GPT-NeoX/tools/corpora.py | https://data.deepai.org/enwik8.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/GPT-NeoX/tools/corpora.py | https://s3.amazonaws.com/models.huggingface.co/bert/gpt2-merges.txt | 模型相关说明 | \ 
No newline at end of file diff --git a/PyTorch/built-in/foundation/LLaMA-13B/public_address_statement.md b/PyTorch/built-in/foundation/LLaMA-13B/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..c891597b43e4aff26c3f53a66b1e171903a563bd --- /dev/null +++ b/PyTorch/built-in/foundation/LLaMA-13B/public_address_statement.md @@ -0,0 +1,36 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|----------------------------------------------------------------------------------------------|--------------------------------------------------------------------------------------------------------------------|-----------| +| ModelZoo-PyTorch/PyTorch/built-in/foundation/LLaMA-13B/accelerate_modify/dataclasses.py | https://dev-discuss.pytorch.org/t/rethinking-pytorch-fully-sharded-data-parallel-fsdp-from-first-principles/1019 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/LLaMA-13B/docker/Dockerfile | https://bootstrap.pypa.io/get-pip.py | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/LLaMA-13B/fastchat/model/model_registry.py | https://crfm.stanford.edu/2023/03/13/alpaca.html | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/LLaMA-13B/fastchat/model/model_registry.py | https://arxiv.org/abs/2302.13971 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/LLaMA-13B/fastchat/model/model_registry.py | https://www.databricks.com/blog/2023/04/12/dolly-first-open-commercially-viable-instruction-tuned-llm | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/LLaMA-13B/fastchat/model/model_registry.py | https://medium.com/vmware-data-ml-blog/starter-llm-for-the-enterprise-instruction-tuning-openllama-7b-d05fc3bbaccc | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/LLaMA-13B/fastchat/model/model_registry.py | https://openai.com/research/gpt-4 | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/LLaMA-13B/fastchat/model/model_registry.py | https://www.mosaicml.com/blog/mpt-30b | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/LLaMA-13B/fastchat/model/model_registry.py | https://www.anthropic.com/index/introducing-claude | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/LLaMA-13B/fastchat/model/model_registry.py | https://www.anthropic.com/index/introducing-claude | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/LLaMA-13B/fastchat/model/model_registry.py | https://www.anthropic.com/index/claude-2 | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/LLaMA-13B/fastchat/model/model_registry.py | https://open-assistant.io | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/LLaMA-13B/fastchat/model/model_registry.py | https://open-assistant.io | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/LLaMA-13B/fastchat/model/model_registry.py | https://openai.com/blog/chatgpt | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/LLaMA-13B/fastchat/model/model_registry.py | https://lmsys.org/blog/2023-03-30-vicuna/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/LLaMA-13B/fastchat/model/model_registry.py | https://chatglm.cn/blog | blog地址 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/LLaMA-13B/fastchat/model/model_registry.py | https://bair.berkeley.edu/blog/2023/04/03/koala | blog地址 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/LLaMA-13B/fastchat/model/model_registry.py | https://ai.meta.com/llama/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/LLaMA-13B/fastchat/model/model_registry.py | https://ai.meta.com/blog/code-llama-large-language-model-coding/ | 模型相关说明 | +| 
ModelZoo-PyTorch/PyTorch/built-in/foundation/LLaMA-13B/fastchat/serve/api_provider.py | https://api.openai.com/v1 | api链接 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/LLaMA-13B/fastchat/serve/gateway/nginx.conf | https://chat.lmsys.org$request_uri | 相关配置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/LLaMA-13B/fastchat/serve/gradio_web_server.py | https://www.kaggle.com/ | html相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/LLaMA-13B/fastchat/serve/gradio_web_server.py | https://mbzuai.ac.ae/ | html相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/LLaMA-13B/fastchat/serve/gradio_web_server.py | https://www.anyscale.com/ | html相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/LLaMA-13B/fastchat/serve/gradio_web_server.py | https://huggingface.co/ | html相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/LLaMA-13B/fastchat/serve/gradio_web_server.py | https://lmsys.org/donations/ | html相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/LLaMA-13B/fastchat/serve/gradio_web_server.py | https://upload.wikimedia.org/wikipedia/commons/thumb/7/7c/Kaggle_logo.png/400px-Kaggle_logo.png | 示例图片 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/LLaMA-13B/fastchat/serve/gradio_web_server.py | https://mma.prnewswire.com/media/1227419/MBZUAI_Logo.jpg?p=facebookg | 示例图片 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/LLaMA-13B/fastchat/serve/gradio_web_server.py | https://docs.anyscale.com/site-assets/logo.png | 示例图片 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/LLaMA-13B/tasks/modeling_llama.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/LLaMA-13B/tasks/modeling_llama.py | https://arxiv.org/abs/1910.13461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/LLaMA-13B/transformers_modify/modeling_llama.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/LLaMA-13B/transformers_modify/modeling_llama.py | https://arxiv.org/abs/1910.13461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/foundation/LLaMA-13B/transformers_modify/training_args.py | https://docs.ray.io/en/latest/tune/api_docs/analysis.html | 模型相关说明 | \ No newline at end of file diff --git a/PyTorch/built-in/foundation/Qwen-7B/public_address_statement.md b/PyTorch/built-in/foundation/Qwen-7B/public_address_statement.md index 7ebfc113c376e4fae7165706038c6832f972d36c..1f14c87a0f2d023b752f9b21b65076a06189e079 100644 --- a/PyTorch/built-in/foundation/Qwen-7B/public_address_statement.md +++ b/PyTorch/built-in/foundation/Qwen-7B/public_address_statement.md @@ -1,31 +1,3 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ---- | ------------ | ------ | ------------------------------------ | -------- | -| 开源代码引入 | https://github.com/THUDM/ChatGLM2-6B/blob/main/ptuning/web_demo.py|ChatGLM2-6B/web_demo.py | https://github.com/GaiZhenbiao/ChuanhuChatGP | 源码实现 | -| 开源代码引入 | https://github.com/hiyouga/LLaMA-Factory/setup.py|Qwen-7B/setup.py | https://github.com/hiyouga/LLaMA-Factory | 源码实现 | -| 开源代码引入 | https://github.com/hiyouga/LLaMA-Factory/data/belle_multiturn/belle_multiturn.py|Qwen-7B/data/belle_multiturn/belle_multiturn.py | https://huggingface.co/datasets/BelleGroup/multiturn_chat_0.8M | 源码实现 | -| 开源代码引入 | https://github.com/hiyouga/LLaMA-Factory/data/belle_multiturn/belle_multiturn.py|Qwen-7B/data/belle_multiturn/belle_multiturn.py | https://huggingface.co/datasets/BelleGroup/multiturn_chat_0.8M/resolve/main/multiturn_chat_0.8M.json | 源码实现 | -| 开源代码引入 | 
https://github.com/hiyouga/LLaMA-Factory/data/hh_rlhf_en/hh_rlhf_en.py|Qwen-7B/data/hh_rlhf_en/hh_rlhf_en.py | https://huggingface.co/datasets/Anthropic/hh-rlhf | 源码实现 | -| 开源代码引入 | https://github.com/hiyouga/LLaMA-Factory/data/hh_rlhf_en/hh_rlhf_en.py|Qwen-7B/data/hh_rlhf_en/hh_rlhf_en.py | https://huggingface.co/datasets/Anthropic/hh-rlhf/resolve/main/ | 源码实现 | -| 开源代码引入 | https://github.com/hiyouga/LLaMA-Factory/data/ultra_chat/ultra_chat.py|Qwen-7B/data/ultra_chat/ultra_chat.py | https://github.com/thunlp/ultrachat | 源码实现 | -| 开源代码引入 | https://github.com/hiyouga/LLaMA-Factory/data/ultra_chat/ultra_chat.py|Qwen-7B/data/ultra_chat/ultra_chat.py | https://huggingface.co/datasets/stingning/ultrachat | 源码实现 | -| 开源代码引入 | https://github.com/hiyouga/LLaMA-Factory/data/ultra_chat/ultra_chat.py|Qwen-7B/data/ultra_chat/ultra_chat.py | https://huggingface.co/datasets/stingning/ultrachat/resolve/main/train_{idx}.jsonl | 源码实现 | -| 开源代码引入 | https://github.com/hiyouga/LLaMA-Factory/evaluation/ceval/ceval.py|Qwen-7B/evaluation/ceval/ceval.py | https://github.com/thunlp/ultrachat | 源码实现 | -| 开源代码引入 | https://github.com/hiyouga/LLaMA-Factory/evaluation/cmmlu/cmmlu.py|Qwen-7B/evaluation/cmmlu/cmmlu.py | https://github.com/haonan-li/CMMLU | 源码实现 | -| 开源代码引入 | https://github.com/hiyouga/LLaMA-Factory/evaluation/mmlu/mmlu.py|Qwen-7B/evaluation/mmlu/mmlu.py | https://github.com/haonan-li/CMMLU | 源码实现 | -| 开源代码引入 | https://github.com/hiyouga/LLaMA-Factory/src/llmtuner/eval/evaluator.py|Qwen-7B/src/llmtuner/eval/evaluator.py | https://github.com/hendrycks/test/blob/master/evaluate_flan.py | 源码实现 | -| 开源代码引入 | https://github.com/hiyouga/LLaMA-Factory/src/llmtuner/extras/patches/llama_patch.py|Qwen-7B/src/llmtuner/extras/patches/llama_patch.py | https://github.com/huggingface/transformers/blob/main/src/transformers/models/llama/modeling_llama.py | 源码实现 | -| 开源代码引入 | https://github.com/hiyouga/LLaMA-Factory/src/llmtuner/model/utils.py|Qwen-7B/src/llmtuner/model/utils.py | https://github.com/huggingface/transformers/blob/v4.31.0/src/transformers/modeling_utils.py | 源码实现 | -| 开源代码引入 | https://github.com/hiyouga/LLaMA-Factory/src/llmtuner/model/utils.py|Qwen-7B/src/llmtuner/model/utils.py | https://github.com/huggingface/peft/blob/v0.2.0/src/peft/utils/other.py | 源码实现 | -| 开源代码引入 | https://github.com/hiyouga/LLaMA-Factory/src/llmtuner/train/utils.py|Qwen-7B/src/llmtuner/train/utils.py | https://github.com/huggingface/peft/issues/1090 | 源码实现 | -| 开源代码引入 | https://github.com/hiyouga/LLaMA-Factory/src/llmtuner/train/dpo/workflow.py|Qwen-7B/src/llmtuner/train/dpo/workflow.py | https://github.com/huggingface/trl/blob/main/examples/research_projects/stack_llama_2/scripts/dpo_llama2.py | 源码实现 | -| 开源代码引入 | https://github.com/hiyouga/LLaMA-Factory/src/llmtuner/train/ppo/workflow.py|Qwen-7B/src/llmtuner/train/ppo/workflow.py | https://github.com/lvwerra/trl/blob/main/examples/research_projects/stack_llama/scripts/rl_training.py | 源码实现 | -| 开源代码引入 | https://github.com/hiyouga/LLaMA-Factory/src/llmtuner/train/pt/workflow.py|Qwen-7B/src/llmtuner/train/pt/workflow.py | https://github.com/huggingface/transformers/blob/v4.30.2/src/transformers/trainer.py | 源码实现 | -| 开源代码引入 | https://github.com/hiyouga/LLaMA-Factory/src/llmtuner/train/rm/trainer.py|Qwen-7B/src/llmtuner/train/rm/trainer.py | https://github.com/huggingface/transformers/blob/v4.30.2/src/transformers/trainer.py | 源码实现 | -| 开源代码引入 | https://github.com/hiyouga/LLaMA-Factory/src/llmtuner/train/rm/workflow.py|Qwen-7B/src/llmtuner/train/rm/workflow.py | 
https://github.com/CarperAI/trlx/blob/main/examples/summarize_rlhf/reward_model/train_reward_model_gptj.py | 源码实现 | -| 开源代码引入 | https://github.com/hiyouga/LLaMA-Factory/src/llmtuner/train/sft/workflow.py|Qwen-7B/src/llmtuner/train/sft/workflow.py | https://github.com/huggingface/transformers/blob/v4.34.1/examples/pytorch/summarization/run_summarization.py | 源码实现 | -| 开源代码引入 | https://github.com/hiyouga/LLaMA-Factory/src/llmtuner/webui/interface.py|Qwen-7B/src/llmtuner/webui/interface.py | https://github.com/hiyouga/LLaMA-Factory | 源码实现 | -| 开源代码引入 | https://github.com/hiyouga/LLaMA-Factory/tests/cal_flops.py|Qwen-7B/tests/cal_flops.py | https://www.deepspeed.ai/tutorials/flops-profiler/ | 源码实现 | -| 开源代码引入 | https://github.com/hiyouga/LLaMA-Factory/tests/cal_lr.py|Qwen-7B/tests/cal_lr.py | https://github.com/imoneoi/openchat/blob/master/ochat/training_deepspeed/train.py | 源码实现 | -| 开源代码引入 | https://github.com/hiyouga/LLaMA-Factory/tests/llamafy_baichuan2.py|Qwen-7B/tests/llamafy_baichuan2.py | https://huggingface.co/fireballoon/baichuan-llama-7b/blob/main/convert_baichuan_to_llama.py | 源码实现 | -| 开源代码引入 | https://github.com/hiyouga/LLaMA-Factory/tests/llamafy_baichuan2.py|Qwen-7B/tests/llamafy_baichuan2.py | https://huggingface.co/hiyouga/Baichuan2-7B-Base-LLaMAfied | 源码实现 | -| 开源代码引入 | https://github.com/hiyouga/LLaMA-Factory/tests/quantize.py|Qwen-7B/tests/quantize.py | https://github.com/PanQiWei/AutoGPTQ | 源码实现 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|--------------------------------------------------------------------------------|----------------------------|---------| +| ModelZoo-PyTorch/PyTorch/built-in/foundation/Qwen-7B/evaluation/ceval/ceval.py | https://cevalbenchmark.com | 数据集链接 | \ No newline at end of file diff --git a/PyTorch/built-in/mm/AnimateDiff/public_address_statement.md b/PyTorch/built-in/mm/AnimateDiff/public_address_statement.md index de5610b0d4f1008f4aa622cfa62a5fc329784747..3a6070019dfc847fb5e7a0bb4a27af74c1dbd884 100644 --- a/PyTorch/built-in/mm/AnimateDiff/public_address_statement.md +++ b/PyTorch/built-in/mm/AnimateDiff/public_address_statement.md @@ -1,14 +1,13 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ------- |--------------------------------------------------------------------------------------------------------------------------------------------------------|-----------------------------------------------|------------------------|---------------------------| -| 开源代码引入 | https://github.com/guoyww/AnimateDiff/blob/main/animatediff/models/resnet.py | animatediff/models/resnet.py | https://github.com/guoyww/AnimateDiff/blob/main/animatediff/models/resnet.py | 代码实现参考连接 | -| 开源代码引入 | https://github.com/haotian-liu/LLaVA/blob/main/llava/eval/m4c_evaluator.py | animatediff/models/resnet.py | https://github.com/huggingface/diffusers/issues/984 | diffusers仓库issue | -| 开源代码引入 | https://github.com/guoyww/AnimateDiff/blob/main/animatediff/models/attention.py | animatediff/models/attention.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention.py | huggingface diffusers链接 | -| 开源代码引入 | https://github.com/guoyww/AnimateDiff/blob/main/animatediff/models/attention.py | animatediff/models/attention.py | https://github.com/facebookresearch/xformers | xformers库 | -| 开源代码引入 | https://github.com/guoyww/AnimateDiff/blob/main/animatediff/models/sparse_controlnet.py | animatediff/models/sparse_controlnet.py | https://github.com/huggingface/diffusers/issues/2011#issuecomment-1547958131 | diffusers仓库issue | -| 开源代码引入 | 
https://github.com/guoyww/AnimateDiff/blob/main/animatediff/models/unet.py | animatediff/models/unet.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/unet_2d_condition.py | huggingface diffusers链接 | -| 开源代码引入 | https://github.com/guoyww/AnimateDiff/blob/main/animatediff/models/unet_blocks.py | animatediff/models/unet_blocks.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/unet_2d_blocks.py. | huggingface diffusers链接 | -| 开源代码引入 | https://github.com/guoyww/AnimateDiff/blob/main/animatediff/pipelines/pipeline_animation.py | animatediff/pipelines/pipeline_animation.py | https://github.com/showlab/Tune-A-Video/blob/main/tuneavideo/pipelines/pipeline_tuneavideo.py | Tune-A-Video github链接 | -| 开源代码引入 | https://github.com/guoyww/AnimateDiff/blob/main/animatediff/pipelines/pipeline_animation.py | animatediff/pipelines/pipeline_animation.py | https://arxiv.org/abs/2010.02502 | arxiv论文 | -| 开源代码引入 | https://github.com/guoyww/AnimateDiff/blob/main/animatediff/pipelines/pipeline_animation.py | animatediff/pipelines/pipeline_animation.py | https://github.com/facebookresearch/llama/blob/main/MODEL_CARD.md | llama模型readme | -| 开源代码引入 | https://github.com/guoyww/AnimateDiff/blob/main/app.py | app.py | https://arxiv.org/abs/2307.04725 | arxiv论文 | - +| 文件位置 | 公网地址 | 公网地址用途 | +|---------------------------------------------------------------------------------------------|------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/built-in/mm/AnimateDiff/download_bashscripts/1-ToonYou.sh | https://civitai.com/api/download/models/78775 | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/AnimateDiff/download_bashscripts/2-Lyriel.sh | https://civitai.com/api/download/models/72396 | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/AnimateDiff/download_bashscripts/3-RcnzCartoon.sh | https://civitai.com/api/download/models/71009 | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/AnimateDiff/download_bashscripts/4-MajicMix.sh | https://civitai.com/api/download/models/79068 | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/AnimateDiff/download_bashscripts/5-RealisticVision.sh | https://civitai.com/api/download/models/130072 | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/AnimateDiff/download_bashscripts/6-Tusun.sh | https://civitai.com/api/download/models/97261 | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/AnimateDiff/download_bashscripts/6-Tusun.sh | https://civitai.com/api/download/models/50705 | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/AnimateDiff/download_bashscripts/7-FilmVelvia.sh | https://civitai.com/api/download/models/90115 | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/AnimateDiff/download_bashscripts/7-FilmVelvia.sh | https://civitai.com/api/download/models/55911 | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/AnimateDiff/download_bashscripts/8-GhibliBackground.sh | https://civitai.com/api/download/models/57618 | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/AnimateDiff/download_bashscripts/8-GhibliBackground.sh | https://civitai.com/api/download/models/102828 | 模型地址 | \ No newline at end of file diff --git a/PyTorch/built-in/mm/CLIP_for_PyTorch/public_address_statement.md b/PyTorch/built-in/mm/CLIP_for_PyTorch/public_address_statement.md index 776779358f6e304159b6fa06176a168dbc1a07d6..0f22cea8b0a30995cde5ca19e1399e13f5432ccf 100644 --- a/PyTorch/built-in/mm/CLIP_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/mm/CLIP_for_PyTorch/public_address_statement.md @@ -1,2894 +1,399 @@ -| 类型 | 开源代码地址 | 文件名 | 
公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ---- | ------------ | ------ | ------------------------------------ | -------- | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/update_metadata.py|CLIP_for_PyTorch/transformers/CITATION.cff |https://github.com/huggingface/transformers|CITATION文件中url的配置选项| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_zh-hant.md|CLIP_for_PyTorch/transformers/CITATION.cff |https://www.aclweb.org/anthology/2020.emnlp-demos.6|CITATION文件中url的配置选项| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/update_metadata.py|CLIP_for_PyTorch/transformers/setup.py |https://github.com/huggingface/transformers|segmenter在开源社区中的git链接| -| 开发引入 | / |CLIP_for_PyTorch/transformers/.circleci/config.yml |https://github.com/GoogleCloudPlatform/ml-testing-accelerators.git|model_list检查开源社区url链接引用| -| 开发引入 | / |CLIP_for_PyTorch/transformers/.circleci/config.yml |https://pytorch-geometric.com/whl/torch-1.11.0+cpu.html|config.yml文件中torch包的html下载链接| -| 开发引入 | / |CLIP_for_PyTorch/transformers/.circleci/config.yml |https://github.com/kpu/kenlm/archive/master.zip|config.yml文件中kenlm库的zip下载链接| -| 开发引入 | / |CLIP_for_PyTorch/transformers/.circleci/config.yml |https://pytorch-geometric.com/whl/torch-1.11.0+cpu.html|config.yml文件中torch包的html下载链接| -| 开发引入 | / |CLIP_for_PyTorch/transformers/.circleci/config.yml |https://github.com/kpu/kenlm/archive/master.zip|config.yml文件中kenlm库的zip下载链接| -| 开发引入 | / |CLIP_for_PyTorch/transformers/.circleci/config.yml |https://pytorch-geometric.com/whl/torch-1.11.0+cpu.html|config.yml文件中torch包的html下载链接| -| 开发引入 | / |CLIP_for_PyTorch/transformers/.circleci/config.yml |https://github.com/kpu/kenlm/archive/master.zip|config.yml文件中kenlm库的zip下载链接| -| 开发引入 | / |CLIP_for_PyTorch/transformers/.circleci/config.yml |https://pytorch-geometric.com/whl/torch-1.11.0+cpu.html|config.yml文件中torch包的html下载链接| -| 开发引入 | / |CLIP_for_PyTorch/transformers/.circleci/config.yml |https://github.com/kpu/kenlm/archive/master.zip|config.yml文件中kenlm库的zip下载链接| -| 开发引入 | / |CLIP_for_PyTorch/transformers/.circleci/config.yml |https://pytorch-geometric.com/whl/torch-1.11.0+cpu.html|config.yml文件中torch包的html下载链接| -| 开发引入 | / |CLIP_for_PyTorch/transformers/.circleci/config.yml |https://github.com/kpu/kenlm/archive/master.zip|config.yml文件中kenlm库的zip下载链接| -| 开发引入 | / |CLIP_for_PyTorch/transformers/.circleci/config.yml |https://pytorch-geometric.com/whl/torch-1.11.0+cpu.html|config.yml文件中torch包的html下载链接| -| 开发引入 | / |CLIP_for_PyTorch/transformers/.circleci/config.yml |https://github.com/kpu/kenlm/archive/master.zip|config.yml文件中kenlm库的zip下载链接| -| 开发引入 | / |CLIP_for_PyTorch/transformers/.circleci/config.yml |https://github.com/kpu/kenlm/archive/master.zip|config.yml文件中kenlm库的zip下载链接| -| 开发引入 | / |CLIP_for_PyTorch/transformers/.circleci/config.yml |https://github.com/kpu/kenlm/archive/master.zip|config.yml文件中kenlm库的zip下载链接| -| 开发引入 | / |CLIP_for_PyTorch/transformers/.circleci/config.yml |https://github.com/kpu/kenlm/archive/master.zip|config.yml文件中kenlm库的zip下载链接| -| 开发引入 | / |CLIP_for_PyTorch/transformers/.circleci/config.yml |https://github.com/kpu/kenlm/archive/master.zip|config.yml文件中kenlm库的zip下载链接| -| 开发引入 | / |CLIP_for_PyTorch/transformers/.circleci/config.yml |https://pytorch-geometric.com/whl/torch-1.11.0+cpu.html|config.yml文件中torch包的html下载链接| -| 开发引入 | / |CLIP_for_PyTorch/transformers/.circleci/config.yml |https://github.com/kpu/kenlm/archive/master.zip|config.yml文件中kenlm库的zip下载链接| -| 开发引入 | / |CLIP_for_PyTorch/transformers/.circleci/config.yml 
|https://pytorch-geometric.com/whl/torch-1.11.0+cpu.html|config.yml文件中torch包的html下载链接| -| 开发引入 | / |CLIP_for_PyTorch/transformers/.circleci/config.yml |https://github.com/kpu/kenlm/archive/master.zip|config.yml文件中kenlm库的zip下载链接| -| 开源代码引入| https://github.com/huggingface/transformers/blob/main/.circleci/create_circleci_config.py | CLIP_for_PyTorch/transformers/.circleci/config.yml | ci@dummy.com| config.yml中对usr.email的配置选项| -| 开源代码引入| https://github.com/huggingface/transformers/blob/main/.circleci/create_circleci_config.py | CLIP_for_PyTorch/transformers/.circleci/config.yml | ci@dummy.com| config.yml中对usr.email的配置选项| -| 开发引入 | / | CLIP_for_PyTorch/url.ini | thomas@huggingface.co | setuptools的author_email配置选项| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/fsner/setup.py | CLIP_for_PyTorch/transformers/examples/research_projects/fsner/setup.py | msi.sayef@gmail.com | setuptools的author_email配置选项| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/tasks/document_question_answering.mdx|CLIP_for_PyTorch/transformers/.circleci/config.yml |https://github.com/facebookresearch/detectron2.git|detectron2模型在开源社区中的源码链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/update_metadata.py|CLIP_for_PyTorch/transformers/docker/transformers-all-latest-gpu/Dockerfile |https://github.com/huggingface/transformers|Dockerfile文件中transformers的git链接| -| 开发引入 | / |CLIP_for_PyTorch/transformers/docker/transformers-all-latest-gpu/Dockerfile |https://data.pyg.org/whl/torch |Dockerfile文件中torch包的html链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/tasks/document_question_answering.mdx|CLIP_for_PyTorch/transformers/docker/transformers-all-latest-gpu/Dockerfile |https://github.com/facebookresearch/detectron2.git|kenlm库在开源社区中的zip包下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/update_metadata.py|CLIP_for_PyTorch/transformers/docker/transformers-doc-builder/Dockerfile |https://github.com/huggingface/transformers|Dockerfile文件中transformers的git链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/check_repo.py|CLIP_for_PyTorch/transformers/docker/transformers-doc-builder/Dockerfile |https://github.com/huggingface/doc-builder|Dockerfile文件中transformers的git链接| -| 开发引入 | / |CLIP_for_PyTorch/transformers/docker/transformers-doc-builder/Dockerfile |https://data.pyg.org/whl/torch |Dockerfile文件中torch的html链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/tasks/document_question_answering.mdx|CLIP_for_PyTorch/transformers/docker/transformers-doc-builder/Dockerfile |https://github.com/facebookresearch/detectron2.git|kenlm库在开源社区中的zip包下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/utils/import_utils.py|CLIP_for_PyTorch/transformers/docker/transformers-doc-builder/Dockerfile |https://pypi.ngc.nvidia.com|Dockerfile文件中pytorch-quantization的url链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/benchmark/benchmark.py|CLIP_for_PyTorch/transformers/docker/transformers-gpu/Dockerfile |https://github.com/NVIDIA/apex|Dockerfile文件中apex的git链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/update_metadata.py|CLIP_for_PyTorch/transformers/docker/transformers-pytorch-deepspeed-latest-gpu/Dockerfile |https://github.com/huggingface/transformers|Dockerfile文件中transformers的git链接| -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/tests/deepspeed/test_deepspeed.py|CLIP_for_PyTorch/transformers/docker/transformers-pytorch-deepspeed-latest-gpu/Dockerfile |https://github.com/microsoft/DeepSpeed|Dockerfile文件中apex在开源社区中的的git链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/update_metadata.py|CLIP_for_PyTorch/transformers/docker/transformers-pytorch-gpu/Dockerfile |https://github.com/huggingface/transformers|Dockerfile文件中transformers的git链接| -| 开发引入 | / |CLIP_for_PyTorch/transformers/docker/transformers-pytorch-gpu/Dockerfile |https://data.pyg.org/whl/torch |Dockerfile文件中torch包的html链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/tasks/document_question_answering.mdx|CLIP_for_PyTorch/transformers/docker/transformers-pytorch-gpu/Dockerfile |https://github.com/facebookresearch/detectron2.git|kenlm库在开源社区中的zip包下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docker/transformers-pytorch-tpu/Dockerfile|CLIP_for_PyTorch/transformers/docker/transformers-pytorch-tpu/Dockerfile |https://repo.anaconda.com/miniconda/Miniconda3-4.7.12-Linux-x86_64.sh|Dockerfile文件中miniconda在开源社区中的的sh链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/sagemaker/scripts/tensorflow/requirements.txt|CLIP_for_PyTorch/transformers/docker/transformers-pytorch-tpu/Dockerfile |https://github.com/huggingface/transformers.git|Dockerfile文件中transformers的git链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/update_metadata.py|CLIP_for_PyTorch/transformers/docker/transformers-tensorflow-gpu/Dockerfile |https://github.com/huggingface/transformers|Dockerfile文件中transformers的git链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/jax-projects/hybrid_clip/requirements.txt|CLIP_for_PyTorch/transformers/examples/flax/vision/requirements.txt |https://download.pytorch.org/whl/torch_stable.html|requirements文件中torch_stable在开源社区中的html链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/jax-projects/hybrid_clip/requirements.txt|CLIP_for_PyTorch/transformers/examples/flax/vision/requirements.txt |https://download.pytorch.org/whl/torch_stable.html|requirements文件中torch_stable在开源社区中的html链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/token-classification/run.sh|CLIP_for_PyTorch/transformers/examples/legacy/pytorch-lightning/run_ner.sh |https://drive.google.com/uc?export=download&id=1Jjhbal535VVz2ap4v4r_rN1UEHTdLK5P|将更改迁移到NLP数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/token-classification/run.sh|CLIP_for_PyTorch/transformers/examples/legacy/pytorch-lightning/run_ner.sh |https://drive.google.com/uc?export=download&id=1ZfRcQThdtAR5PPRjIDtrVP7BtXSCUBbm|将更改迁移到NLP数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/token-classification/run.sh|CLIP_for_PyTorch/transformers/examples/legacy/pytorch-lightning/run_ner.sh |https://drive.google.com/uc?export=download&id=1u9mb7kNJHWQCWyweMDRMuTFoOHOfeBTH|将更改迁移到NLP数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/token-classification/run_pos.sh|CLIP_for_PyTorch/transformers/examples/legacy/pytorch-lightning/run_pos.sh |https://github.com/UniversalDependencies/UD_English-EWT/raw/master/en_ewt-ud-dev.conllu|下载dev数据集| -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/examples/legacy/token-classification/run_pos.sh|CLIP_for_PyTorch/transformers/examples/legacy/pytorch-lightning/run_pos.sh |https://github.com/UniversalDependencies/UD_English-EWT/raw/master/en_ewt-ud-test.conllu|下载test数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/token-classification/run_pos.sh|CLIP_for_PyTorch/transformers/examples/legacy/pytorch-lightning/run_pos.sh |https://github.com/UniversalDependencies/UD_English-EWT/raw/master/en_ewt-ud-train.conllu|下载train数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/token-classification/run_chunk.sh|CLIP_for_PyTorch/transformers/examples/legacy/token-classification/run_chunk.sh |https://github.com/davidsbatista/NER-datasets/raw/master/CONLL2003/valid.txt|CONLL2003数据集在开源社区上的valid.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/token-classification/run_chunk.sh|CLIP_for_PyTorch/transformers/examples/legacy/token-classification/run_chunk.sh |https://github.com/davidsbatista/NER-datasets/raw/master/CONLL2003/test.txt|CONLL2003数据集在开源社区上的test.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/token-classification/run_chunk.sh|CLIP_for_PyTorch/transformers/examples/legacy/token-classification/run_chunk.sh |https://github.com/davidsbatista/NER-datasets/raw/master/CONLL2003/train.txt|CONLL2003数据集在开源社区上的train.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/token-classification/run_pos.sh|CLIP_for_PyTorch/transformers/examples/legacy/token-classification/run_pos.sh |https://github.com/UniversalDependencies/UD_English-EWT/raw/master/en_ewt-ud-dev.conllu|下载dev数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/token-classification/run_pos.sh|CLIP_for_PyTorch/transformers/examples/legacy/token-classification/run_pos.sh |https://github.com/UniversalDependencies/UD_English-EWT/raw/master/en_ewt-ud-test.conllu|下载test数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/token-classification/run_pos.sh|CLIP_for_PyTorch/transformers/examples/legacy/token-classification/run_pos.sh |https://github.com/UniversalDependencies/UD_English-EWT/raw/master/en_ewt-ud-train.conllu|下载train数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/token-classification/run.sh|CLIP_for_PyTorch/transformers/examples/legacy/token-classification/run.sh |https://drive.google.com/uc?export=download&id=1Jjhbal535VVz2ap4v4r_rN1UEHTdLK5P|将更改迁移到NLP数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/token-classification/run.sh|CLIP_for_PyTorch/transformers/examples/legacy/token-classification/run.sh |https://drive.google.com/uc?export=download&id=1ZfRcQThdtAR5PPRjIDtrVP7BtXSCUBbm|将更改迁移到NLP数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/token-classification/run.sh|CLIP_for_PyTorch/transformers/examples/legacy/token-classification/run.sh |https://drive.google.com/uc?export=download&id=1u9mb7kNJHWQCWyweMDRMuTFoOHOfeBTH|将更改迁移到NLP数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/bertabs/configuration_bertabs.py|CLIP_for_PyTorch/transformers/examples/research_projects/bertabs/configuration_bertabs.py 
|https://huggingface.co/remi/bertabs-finetuned-cnndm-extractive-abstractive-summarization/resolve/main/config.json|bertabs-finetuned-cnndm模型在开源社区上的config.json的下载链接| -| 开发引入 | / |CLIP_for_PyTorch/url.ini |https://github.com/huggingface/transformers/tree/master/examples/research_projects/fsner|setuptools的url配置选项| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/trainer/test_trainer.py|CLIP_for_PyTorch/transformers/examples/research_projects/fsner/setup.py |https://github.com/huggingface/transformers/issues|setuptools的Bug Tracker在开源社区中的链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/longform-qa/eli5_app.py|CLIP_for_PyTorch/transformers/examples/research_projects/longform-qa/eli5_app.py |https://huggingface.co/front/assets/huggingface_logo.svg|获取huggingface开源社区header_html| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/longform-qa/eli5_app.py|CLIP_for_PyTorch/transformers/examples/research_projects/longform-qa/eli5_app.py |https://en.wikipedia.org/wiki/{}".format(res[0].replace|获取wiki_url开源社区链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/sagemaker/scripts/tensorflow/requirements.txt|CLIP_for_PyTorch/transformers/examples/research_projects/lxmert/requirements.txt |https://github.com/huggingface/transformers.git|requirements文件中transformers在开源社区中的的git链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/utils/hub.py|CLIP_for_PyTorch/transformers/examples/research_projects/lxmert/utils.py |https://cdn.huggingface.co|获取huggingface开源社区版本| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/utils/hub.py|CLIP_for_PyTorch/transformers/examples/research_projects/lxmert/utils.py |https://s3.amazonaws.com/models.huggingface.co/bert|获取huggingface开源社区版本| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/requirements.txt|CLIP_for_PyTorch/transformers/examples/research_projects/movement-pruning/requirements.txt |https://github.com/huggingface/transformers.git@352d5472b0c1dec0f420d606d16747d851b4bda8#egg=transformers|requirements文件中transformers在开源社区中的的git链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/pplm/run_pplm.py|CLIP_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py |https://s3.amazonaws.com/models.huggingface.co/bert/pplm/bow/legal.txt|pplm在开源社区上的txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/pplm/run_pplm.py|CLIP_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py |https://s3.amazonaws.com/models.huggingface.co/bert/pplm/bow/military.txt|pplm在开源社区上的txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/pplm/run_pplm.py|CLIP_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py |https://s3.amazonaws.com/models.huggingface.co/bert/pplm/bow/politics.txt|pplm在开源社区上的txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/pplm/run_pplm.py|CLIP_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py |https://s3.amazonaws.com/models.huggingface.co/bert/pplm/bow/religion.txt|pplm在开源社区上的txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/pplm/run_pplm.py|CLIP_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py 
|https://s3.amazonaws.com/models.huggingface.co/bert/pplm/bow/science.txt|pplm在开源社区上的txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/pplm/run_pplm.py|CLIP_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py |https://s3.amazonaws.com/models.huggingface.co/bert/pplm/bow/space.txt|pplm在开源社区上的txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/pplm/run_pplm.py|CLIP_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py |https://s3.amazonaws.com/models.huggingface.co/bert/pplm/bow/technology.txt|pplm在开源社区上的txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/pplm/run_pplm.py|CLIP_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py |https://s3.amazonaws.com/models.huggingface.co/bert/pplm/discriminators/clickbait_classifier_head.pt|pplm在开源社区上的pt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/pplm/run_pplm.py|CLIP_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py |https://s3.amazonaws.com/models.huggingface.co/bert/pplm/discriminators/SST_classifier_head.pt|pplm在开源社区上的pt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/utils/import_utils.py|CLIP_for_PyTorch/transformers/examples/research_projects/quantization-qdqbert/Dockerfile |https://pypi.ngc.nvidia.com|Dockerfile文件中pytorch-quantization的url链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/seq2seq-distillation/_test_bash_script.py|CLIP_for_PyTorch/transformers/examples/research_projects/seq2seq-distillation/_test_bash_script.py |https://cdn-datasets.huggingface.co/translation/wmt_en_ro-tr40k-va0.5k-te0.5k.tar.gz|wmt数据集在开源社区中的tar.gz链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/seq2seq-distillation/finetune_bart_tiny.sh|CLIP_for_PyTorch/transformers/examples/research_projects/seq2seq-distillation/finetune_bart_tiny.sh |https://cdn-datasets.huggingface.co/summarization/cnn_tiny.tgz|cnn_tiny数据集在开源社区中的tgz链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/sagemaker/scripts/tensorflow/requirements.txt|CLIP_for_PyTorch/transformers/examples/research_projects/visual_bert/requirements.txt |https://github.com/huggingface/transformers.git|requirements文件中transformers在开源社区中的的git链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/utils/hub.py|CLIP_for_PyTorch/transformers/examples/research_projects/visual_bert/utils.py |https://cdn.huggingface.co|获取huggingface开源社区版本| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/utils/hub.py|CLIP_for_PyTorch/transformers/examples/research_projects/visual_bert/utils.py |https://s3.amazonaws.com/models.huggingface.co/bert|获取huggingface开源社区版本| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/convert-allenai-wmt16.sh|CLIP_for_PyTorch/transformers/scripts/fsmt/convert-allenai-wmt16.sh |https://drive.google.com/uc?id=1x_G2cjvM1nW5hjAB8-vWxRqtQTlmIaQU|将更改迁移到NLP数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/convert-allenai-wmt16.sh|CLIP_for_PyTorch/transformers/scripts/fsmt/convert-allenai-wmt16.sh |https://drive.google.com/uc?id=1oA2aqZlVNj5FarxBlNXEHpBS4lRetTzU|将更改迁移到NLP数据集| -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/scripts/fsmt/convert-allenai-wmt16.sh|CLIP_for_PyTorch/transformers/scripts/fsmt/convert-allenai-wmt16.sh |https://drive.google.com/uc?id=1Wup2D318QYBFPW_NKI1mfP_hXOfmUI9r|将更改迁移到NLP数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/convert-allenai-wmt16.sh|CLIP_for_PyTorch/transformers/scripts/fsmt/convert-allenai-wmt16.sh |https://drive.google.com/uc?id=1mNufoynJ9-Zy1kJh2TA_lHm2squji0i9|将更改迁移到NLP数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/convert-allenai-wmt16.sh|CLIP_for_PyTorch/transformers/scripts/fsmt/convert-allenai-wmt16.sh |https://drive.google.com/uc?id=1iO7um-HWoNoRKDtw27YUSgyeubn9uXqj|将更改迁移到NLP数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/convert-allenai-wmt19.sh|CLIP_for_PyTorch/transformers/scripts/fsmt/convert-allenai-wmt19.sh |https://drive.google.com/uc?id=1j6z9fYdlUyOYsh7KJoumRlr1yHczxR5T|将更改迁移到NLP数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/convert-allenai-wmt19.sh|CLIP_for_PyTorch/transformers/scripts/fsmt/convert-allenai-wmt19.sh |https://drive.google.com/uc?id=1yT7ZjqfvUYOBXvMjeY8uGRHQFWoSo8Q5|将更改迁移到NLP数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/convert-allenai-wmt19.sh|CLIP_for_PyTorch/transformers/scripts/fsmt/convert-allenai-wmt19.sh |https://drive.google.com/uc?id=15gAzHeRUCs-QV8vHeTReMPEh1j8excNE|将更改迁移到NLP数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/convert-facebook-wmt19.sh|CLIP_for_PyTorch/transformers/scripts/fsmt/convert-facebook-wmt19.sh |https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-de.joined-dict.ensemble.tar.gz|wmt19数据集在开源社区上的tar.gz下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/convert-facebook-wmt19.sh|CLIP_for_PyTorch/transformers/scripts/fsmt/convert-facebook-wmt19.sh |https://dl.fbaipublicfiles.com/fairseq/models/wmt19.de-en.joined-dict.ensemble.tar.gz|wmt19数据集在开源社区上的tar.gz下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/convert-facebook-wmt19.sh|CLIP_for_PyTorch/transformers/scripts/fsmt/convert-facebook-wmt19.sh |https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-ru.ensemble.tar.gz|wmt19数据集在开源社区上的tar.gz下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/convert-facebook-wmt19.sh|CLIP_for_PyTorch/transformers/scripts/fsmt/convert-facebook-wmt19.sh |https://dl.fbaipublicfiles.com/fairseq/models/wmt19.ru-en.ensemble.tar.gz|wmt19数据集在开源社区上的tar.gz下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/update_metadata.py|CLIP_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt16.py |https://github.com/huggingface/transformers|huggingface_transformers开源社区上的git链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/update_metadata.py|CLIP_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt19.py |https://github.com/huggingface/transformers|huggingface_transformers开源社区上的git链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/gen-card-facebook-wmt19.py|CLIP_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py |http://matrix.statmt.org/matrix/output/1907?run_id=6937 |wmt19数据集的语言转换配置在开源社区上的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/gen-card-facebook-wmt19.py|CLIP_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py 
|http://matrix.statmt.org/matrix/output/1914?run_id=6724 |wmt19数据集的语言转换配置在开源社区上的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/gen-card-facebook-wmt19.py|CLIP_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py |http://matrix.statmt.org/matrix/output/1909?run_id=6862 |wmt19数据集的语言转换配置在开源社区上的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/gen-card-facebook-wmt19.py|CLIP_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py |http://matrix.statmt.org/matrix/output/1902?run_id=6750 |wmt19数据集的语言转换配置在开源社区上的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/update_metadata.py|CLIP_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py |https://github.com/huggingface/transformers|wmt19数据集的transformers在开源社区上的git下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/tatoeba/upload_models.sh|CLIP_for_PyTorch/transformers/scripts/tatoeba/upload_models.sh |https://huggingface.co/Helsinki-NLP |Helsinki-NLP在开源社区上的git下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/utils/hub.py|CLIP_for_PyTorch/transformers/src/transformers/file_utils.py |https://s3.amazonaws.com/models.huggingface.co/bert|获取huggingface开源社区版本| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/utils/hub.py|CLIP_for_PyTorch/transformers/src/transformers/file_utils.py |https://cdn.huggingface.co|获取huggingface开源社区版本| -| 开发引入 | / |CLIP_for_PyTorch/url.ini |https://moon-staging.huggingface.co|获取huggingface开源社区版本| -| 开发引入 | / |CLIP_for_PyTorch/url.ini |https://huggingface.co/sgugger/my-finetuned-bert|my-finetuned-bert在开源社区上的repo链接| -| 开发引入 | / |CLIP_for_PyTorch/url.ini |https://huggingface.co/api/models |获取huggingface接口的开源社区版本| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapex/tokenization_tapex.py|CLIP_for_PyTorch/transformers/src/transformers/tokenization_utils.py |https://github.com/huggingface/transformers/pull/2674|CLIP模型的tokenization_utils开源社区url链接参考| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/commands/add_new_model_like.py|CLIP_for_PyTorch/transformers/src/transformers/commands/add_new_model_like.py |https://huggingface.co/{new_model_patterns.checkpoint}/resolve/main/config.json|使用json添加新model开源社区链接引用| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_{{cookiecutter.lowercase_modelname}}.py|CLIP_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_{{cookiecutter.lowercase_modelname}}.py |https://huggingface.co/{{cookiecutter.checkpoint_identifier}}/resolve/main/vocab.txt|"{{cookiecutter.checkpoint_identifier}}"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_{{cookiecutter.lowercase_modelname}}.py|CLIP_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_{{cookiecutter.lowercase_modelname}}.py |https://huggingface.co/{{cookiecutter.checkpoint_identifier}}/resolve/main/vocab.json|"{{cookiecutter.checkpoint_identifier}}"模型在开源社区上的vocab.json的下载链接| -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_{{cookiecutter.lowercase_modelname}}.py|CLIP_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_{{cookiecutter.lowercase_modelname}}.py |https://huggingface.co/{{cookiecutter.checkpoint_identifier}}/resolve/main/merges.txt|"{{cookiecutter.checkpoint_identifier}}"模型在开源社区上的merges.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_{{cookiecutter.lowercase_modelname}}.py|CLIP_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_{{cookiecutter.lowercase_modelname}}.py |https://huggingface.co/{{cookiecutter.checkpoint_identifier}}/resolve/main/vocab.txt|"{{cookiecutter.checkpoint_identifier}}"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_{{cookiecutter.lowercase_modelname}}.py|CLIP_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_fast_{{cookiecutter.lowercase_modelname}}.py |https://huggingface.co/{{cookiecutter.checkpoint_identifier}}/resolve/main/vocab.txt|"{{cookiecutter.checkpoint_identifier}}"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_{{cookiecutter.lowercase_modelname}}.py|CLIP_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_fast_{{cookiecutter.lowercase_modelname}}.py |https://huggingface.co/{{cookiecutter.checkpoint_identifier}}/resolve/main/vocab.json|"{{cookiecutter.checkpoint_identifier}}"模型在开源社区上的vocab.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_{{cookiecutter.lowercase_modelname}}.py|CLIP_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_fast_{{cookiecutter.lowercase_modelname}}.py |https://huggingface.co/{{cookiecutter.checkpoint_identifier}}/resolve/main/merges.txt|"{{cookiecutter.checkpoint_identifier}}"模型在开源社区上的merges.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_fast_{{cookiecutter.lowercase_modelname}}.py|CLIP_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_fast_{{cookiecutter.lowercase_modelname}}.py |https://huggingface.co/{{cookiecutter.checkpoint_identifier}}/resolve/main/tokenizer.json|"{{cookiecutter.checkpoint_identifier}}"模型在开源社区上的tokenizer.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_{{cookiecutter.lowercase_modelname}}.py|CLIP_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_fast_{{cookiecutter.lowercase_modelname}}.py 
|https://huggingface.co/{{cookiecutter.checkpoint_identifier}}/resolve/main/vocab.txt|"{{cookiecutter.checkpoint_identifier}}"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_fast_{{cookiecutter.lowercase_modelname}}.py|CLIP_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_fast_{{cookiecutter.lowercase_modelname}}.py |https://huggingface.co/{{cookiecutter.checkpoint_identifier}}/resolve/main/tokenizer.json|"{{cookiecutter.checkpoint_identifier}}"模型在开源社区上的tokenizer.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|CLIP_for_PyTorch/transformers/tests/clip/test_modeling_clip.py |http://images.cocodataset.org/val2017/000000039769.jpg|clip模型测试函数在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|CLIP_for_PyTorch/transformers/tests/clip/test_modeling_tf_clip.py |http://images.cocodataset.org/val2017/000000039769.jpg|clip模型测试函数在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|CLIP_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_classification.py |http://images.cocodataset.org/val2017/000000039769.jpg|图像分类pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|CLIP_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_classification.py |http://images.cocodataset.org/val2017/000000039769.jpg|图像分类pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|CLIP_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_classification.py |http://images.cocodataset.org/val2017/000000039769.jpg|图像分类pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|CLIP_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_classification.py |http://images.cocodataset.org/val2017/000000039769.jpg|图像分类pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|CLIP_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_classification.py |http://images.cocodataset.org/val2017/000000039769.jpg|图像分类pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|CLIP_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_classification.py |http://images.cocodataset.org/val2017/000000039769.jpg|图像分类pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|CLIP_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_classification.py |http://images.cocodataset.org/val2017/000000039769.jpg|图像分类pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|CLIP_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_classification.py 
|http://images.cocodataset.org/val2017/000000039769.jpg|图像分类pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|CLIP_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_classification.py |http://images.cocodataset.org/val2017/000000039769.jpg|图像分类pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|CLIP_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_classification.py |http://images.cocodataset.org/val2017/000000039769.jpg|图像分类pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|CLIP_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_classification.py |http://images.cocodataset.org/val2017/000000039769.jpg|图像分类pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|CLIP_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_segmentation.py |http://images.cocodataset.org/val2017/000000039769.jpg |图像分割pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|CLIP_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_segmentation.py |http://images.cocodataset.org/val2017/000000039769.jpg|图像分割pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|CLIP_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_segmentation.py |http://images.cocodataset.org/val2017/000000039769.jpg|图像分割pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|CLIP_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_segmentation.py |http://images.cocodataset.org/val2017/000000039769.jpg|图像分割pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|CLIP_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_segmentation.py |http://images.cocodataset.org/val2017/000000039769.jpg|图像分割pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|CLIP_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_segmentation.py |http://images.cocodataset.org/val2017/000000039769.jpg|图像分割pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|CLIP_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_segmentation.py |http://images.cocodataset.org/val2017/000000039769.jpg|图像分割pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|CLIP_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_segmentation.py |http://images.cocodataset.org/val2017/000000039769.jpg |图像分割pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|CLIP_for_PyTorch/transformers/tests/pipelines/test_pipelines_object_detection.py |http://images.cocodataset.org/val2017/000000039769.jpg|目标检测pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|CLIP_for_PyTorch/transformers/tests/pipelines/test_pipelines_object_detection.py |http://images.cocodataset.org/val2017/000000039769.jpg |目标检测pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|CLIP_for_PyTorch/transformers/tests/pipelines/test_pipelines_object_detection.py |http://images.cocodataset.org/val2017/000000039769.jpg|目标检测pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|CLIP_for_PyTorch/transformers/tests/pipelines/test_pipelines_object_detection.py |http://images.cocodataset.org/val2017/000000039769.jpg|目标检测pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|CLIP_for_PyTorch/transformers/tests/pipelines/test_pipelines_object_detection.py |http://images.cocodataset.org/val2017/000000039769.jpg|目标检测pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|CLIP_for_PyTorch/transformers/tests/pipelines/test_pipelines_object_detection.py |http://images.cocodataset.org/val2017/000000039769.jpg|目标检测pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|CLIP_for_PyTorch/transformers/tests/pipelines/test_pipelines_object_detection.py |http://images.cocodataset.org/val2017/000000039769.jpg|目标检测pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|CLIP_for_PyTorch/transformers/tests/pipelines/test_pipelines_object_detection.py |http://images.cocodataset.org/val2017/000000039769.jpg|目标检测pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|CLIP_for_PyTorch/transformers/tests/pipelines/test_pipelines_object_detection.py |http://images.cocodataset.org/val2017/000000039769.jpg|目标检测pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|CLIP_for_PyTorch/transformers/tests/pipelines/test_pipelines_object_detection.py |http://images.cocodataset.org/val2017/000000039769.jpg|目标检测pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|CLIP_for_PyTorch/transformers/tests/pipelines/test_pipelines_object_detection.py |http://images.cocodataset.org/val2017/000000039769.jpg |目标检测pipeline在开源社区上的验证集输入引用链接| -| 开发引入 | / |CLIP_for_PyTorch/url.ini |https://bogus |bogus音视频开源下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/ja/index.mdx|CLIP_for_PyTorch/transformers/utils/check_copies.py |https://huggingface.co/docs/transformers/master/|model列表检查开源社区url链接引用| -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/utils/release.py|CLIP_for_PyTorch/transformers/utils/check_copies.py |https://huggingface.co/docs/transformers/|model列表检查开源社区url链接引用| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/download_glue_data.py|CLIP_for_PyTorch/transformers/utils/download_glue_data.py |https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FCoLA.zip?alt=media&token=46d5e637-3411-4188-bc44-5809b5bfb5f4|"CoLA"任务数据集在开源社区上的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/download_glue_data.py|CLIP_for_PyTorch/transformers/utils/download_glue_data.py |https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FSST-2.zip?alt=media&token=aabc5f6b-e466-44a2-b9b4-cf6337f84ac8|"SST"任务数据集在开源社区上的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/download_glue_data.py|CLIP_for_PyTorch/transformers/utils/download_glue_data.py |https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2Fmrpc_dev_ids.tsv?alt=media&token=ec5c0836-31d5-48f4-b431-7480817f1adc|"MRPC"任务数据集在开源社区上的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/download_glue_data.py|CLIP_for_PyTorch/transformers/utils/download_glue_data.py |https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FQQP.zip?alt=media&token=700c6acf-160d-4d89-81d1-de4191d02cb5|"QQP"任务数据集在开源社区上的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/download_glue_data.py|CLIP_for_PyTorch/transformers/utils/download_glue_data.py |https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FSTS-B.zip?alt=media&token=bddb94a7-8706-4e0d-a694-1109e12273b5|"STS"任务数据集在开源社区上的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/download_glue_data.py|CLIP_for_PyTorch/transformers/utils/download_glue_data.py |https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FMNLI.zip?alt=media&token=50329ea1-e339-40e2-809c-10c40afff3ce|"MNLI"任务数据集在开源社区上的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/download_glue_data.py|CLIP_for_PyTorch/transformers/utils/download_glue_data.py |https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FSNLI.zip?alt=media&token=4afcfbb2-ff0c-4b2d-a09a-dbf07926f4df|"SNLI"任务数据集在开源社区上的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/download_glue_data.py|CLIP_for_PyTorch/transformers/utils/download_glue_data.py |https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FQNLIv2.zip?alt=media&token=6fdcf570-0fc5-4631-8456-9505272d1601|"QNLI"任务数据集在开源社区上的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/download_glue_data.py|CLIP_for_PyTorch/transformers/utils/download_glue_data.py |https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FRTE.zip?alt=media&token=5efa7e85-a0bb-4f19-8ea2-9e1840f077fb|"RTE"任务数据集在开源社区上的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/download_glue_data.py|CLIP_for_PyTorch/transformers/utils/download_glue_data.py |https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FWNLI.zip?alt=media&token=068ad0a0-ded7-4bd7-99a5-5e00222e0faf|"WNLI"任务数据集在开源社区上的下载链接| -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/utils/download_glue_data.py|CLIP_for_PyTorch/transformers/utils/download_glue_data.py |https://storage.googleapis.com/mtl-sentence-representations.appspot.com/tsvsWithoutLabels%2FAX.tsv?GoogleAccessId=firebase-adminsdk-0khhl@mtl-sentence-representations.iam.gserviceaccount.com&Expires=2498860800&Signature=DuQ2CSPt2Yfre0C%2BiISrVYrIFaZH1Lc7hBVZDD4ZyR7fZYOMNOUGpi8QxBmTNOrNPjR3z1cggo7WXFfrgECP6FBJSsURv8Ybrue8Ypt%2FTPxbuJ0Xc2FhDi%2BarnecCBFO77RSbfuz%2Bs95hRrYhTnByqu3U%2FYZPaj3tZt5QdfpH2IUROY8LiBXoXS46LE%2FgOQc%2FKN%2BA9SoscRDYsnxHfG0IjXGwHN%2Bf88q6hOmAxeNPx6moDulUF6XMUAaXCSFU%2BnRO2RDL9CapWxj%2BDl7syNyHhB7987hZ80B%2FwFkQ3MEs8auvt5XW1%2Bd4aCU7ytgM69r8JDCwibfhZxpaa4gd50QXQ%3D%3D|"diagnostic"任务数据集在开源社区上的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/download_glue_data.py|CLIP_for_PyTorch/transformers/utils/download_glue_data.py |https://dl.fbaipublicfiles.com/senteval/senteval_data/msr_paraphrase_train.txt|MRPC任务训练集分类文件列表在开源社区上的txt下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/download_glue_data.py|CLIP_for_PyTorch/transformers/utils/download_glue_data.py |https://dl.fbaipublicfiles.com/senteval/senteval_data/msr_paraphrase_test.txt|MRPC任务测试集分类文件列表在开源社区上的txt下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/notification_service_doc_tests.py|CLIP_for_PyTorch/transformers/utils/notification_service_deprecated.py |https://github.com/huggingface/transformers/actions/runs |测试huggingface相应run_id可用| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/notification_service_doc_tests.py|CLIP_for_PyTorch/transformers/utils/notification_service_doc_tests.py |https://github.com/huggingface/transformers/actions/runs |测试huggingface相应run_id可用| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/notification_service_doc_tests.py|CLIP_for_PyTorch/transformers/utils/notification_service_doc_tests.py |https://github.com/huggingface/transformers/actions/runs |测试huggingface相应run_id可用| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/notification_service_doc_tests.py|CLIP_for_PyTorch/transformers/utils/notification_service_doc_tests.py |https://github.com/huggingface/transformers/actions/runs |测试huggingface相应run_id可用| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/notification_service_doc_tests.py|CLIP_for_PyTorch/transformers/utils/notification_service_doc_tests.py |https://api.github.com/repos/huggingface/transformers/actions/runs/{run_id}/jobs?per_page=100|测试huggingface相应run_id可用| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/notification_service_doc_tests.py|CLIP_for_PyTorch/transformers/utils/notification_service.py |https://github.com/huggingface/transformers/actions/runs |测试huggingface相应run_id可用| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/notification_service_doc_tests.py|CLIP_for_PyTorch/transformers/utils/notification_service.py |https://github.com/huggingface/transformers/actions/runs |测试huggingface相应run_id可用| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/notification_service_doc_tests.py|CLIP_for_PyTorch/transformers/utils/notification_service.py |https://github.com/huggingface/transformers/actions/runs |测试huggingface相应run_id可用| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/notification_service_doc_tests.py|CLIP_for_PyTorch/transformers/utils/notification_service.py 
|https://api.github.com/repos/huggingface/transformers/actions/runs/{run_id}/jobs?per_page=100|测试huggingface相应run_id可用| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/ja/index.mdx|CLIP_for_PyTorch/transformers/utils/release.py |https://huggingface.co/docs/transformers/master/model_doc|transformers_model_doc开源社区url连接引用| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/release.py|CLIP_for_PyTorch/transformers/utils/release.py |https://huggingface.co/docs/transformers/model_doc|transformers_model_doc开源社区url连接引用| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/check_table.py | CLIP_for_PyTorch/transformers/utils/update_metadata.py | https://stackoverflow.com/questions/29916065/how-to-do-camelcase-split-in-python | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/distillation/README.md | CLIP_for_PyTorch/transformers/utils/update_metadata.py | https://github.com/huggingface/transformers/commit/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/notification_service.py | CLIP_for_PyTorch/transformers/utils/notification_service_doc_tests.py | https://github.com/huggingface/transformers/actions/runs/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/get_github_job_time.py | CLIP_for_PyTorch/transformers/utils/notification_service_doc_tests.py | https://api.github.com/repos/huggingface/transformers/actions/runs/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/notification_service.py | CLIP_for_PyTorch/transformers/utils/notification_service_deprecated.py | https://github.com/huggingface/transformers/actions/runs/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/notification_service.py | CLIP_for_PyTorch/transformers/utils/notification_service.py | https://github.com/huggingface/transformers/actions/runs/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/get_github_job_time.py | CLIP_for_PyTorch/transformers/utils/notification_service.py | https://api.github.com/repos/huggingface/transformers/actions/runs/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/download_glue_data.py | CLIP_for_PyTorch/transformers/utils/download_glue_data.py | https://gist.github.com/W4ngatang/60c2bdb54d156a41194446737ce03e2e | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/download_glue_data.py | CLIP_for_PyTorch/transformers/utils/download_glue_data.py | https://download.microsoft.com/download/D/4/6/D46FF87A-F6B9-4252-AA8B-3604ED519838/MSRParaphraseCorpus.msi | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/download_glue_data.py | CLIP_for_PyTorch/transformers/utils/download_glue_data.py | https://storage.googleapis.com/mtl-sentence-representations.appspot.com/tsvsWithoutLabels%2FAX.tsv?GoogleAccessId=firebase-adminsdk-0khhl@mtl-sentence-representations.iam.gserviceaccount.com&Expires=2 | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/download_glue_data.py | CLIP_for_PyTorch/transformers/utils/download_glue_data.py | firebase-adminsdk-0khhl@mtl-sentence-representations.iam.gserviceaccount.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/check_table.py | CLIP_for_PyTorch/transformers/utils/check_table.py | 
https://stackoverflow.com/questions/29916065/how-to-do-camelcase-split-in-python | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/check_repo.py | CLIP_for_PyTorch/transformers/utils/check_repo.py | https://github.com/huggingface/doc-builder | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/roberta/test_modeling_roberta.py | CLIP_for_PyTorch/transformers/tests/xlm_roberta_xl/test_modeling_xlm_roberta_xl.py | https://github.com/huggingface/transformers/issues/1761 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/roberta/test_tokenization_roberta.py | CLIP_for_PyTorch/transformers/tests/xlm/test_tokenization_xlm.py | https://github.com/rsennrich/subword-nmt | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/unispeech_sat/test_modeling_unispeech_sat.py | CLIP_for_PyTorch/transformers/tests/wavlm/test_modeling_wavlm.py | https://github.com/pytorch/fairseq/issues/3227 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/wav2vec2_with_lm/test_processor_wav2vec2_with_lm.py | CLIP_for_PyTorch/transformers/tests/wav2vec2_with_lm/test_processor_wav2vec2_with_lm.py | https://huggingface.co/hf-internal-testing/processor_with_lm/tree/main | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/wav2vec2_with_lm/test_processor_wav2vec2_with_lm.py | CLIP_for_PyTorch/transformers/tests/wav2vec2_with_lm/test_processor_wav2vec2_with_lm.py | https://huggingface.co/datasets/common_voice/viewer/en/train | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/wav2vec2_with_lm/test_processor_wav2vec2_with_lm.py | CLIP_for_PyTorch/transformers/tests/wav2vec2/test_tokenization_wav2vec2.py | https://huggingface.co/datasets/common_voice/viewer/en/train | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/unispeech_sat/test_modeling_unispeech_sat.py | CLIP_for_PyTorch/transformers/tests/wav2vec2/test_modeling_wav2vec2.py | https://github.com/pytorch/fairseq/issues/3227 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/unispeech_sat/test_modeling_unispeech_sat.py | CLIP_for_PyTorch/transformers/tests/wav2vec2/test_modeling_tf_wav2vec2.py | https://github.com/pytorch/fairseq/issues/3227 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/vision_text_dual_encoder/test_modeling_vision_text_dual_encoder.py | CLIP_for_PyTorch/transformers/tests/vision_text_dual_encoder/test_modeling_vision_text_dual_encoder.py | https://github.com/rwightman/pytorch-image-models/blob/b9bd960a032c75ca6b808ddeed76bee5f3ed4972/timm/models/layers/helpers.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/vision_text_dual_encoder/test_modeling_vision_text_dual_encoder.py | CLIP_for_PyTorch/transformers/tests/vision_text_dual_encoder/test_modeling_flax_vision_text_dual_encoder.py | https://github.com/rwightman/pytorch-image-models/blob/b9bd960a032c75ca6b808ddeed76bee5f3ed4972/timm/models/layers/helpers.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/encoder_decoder/modeling_encoder_decoder.py | CLIP_for_PyTorch/transformers/tests/vision_encoder_decoder/test_modeling_tf_vision_encoder_decoder.py | 
https://github.com/huggingface/transformers/pull/13222/commits/dbb3c9de76eee235791d2064094654637c99f36d#r697304245 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/vision_encoder_decoder/test_modeling_tf_vision_encoder_decoder.py | CLIP_for_PyTorch/transformers/tests/vision_encoder_decoder/test_modeling_tf_vision_encoder_decoder.py | https://github.com/huggingface/transformers/pull/14016 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/repo_utils/test_check_copies.py | CLIP_for_PyTorch/transformers/tests/utils/test_utils_check_copies.py | https://huggingface.co/transformers/model_doc/albert.ht | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/utils/test_model_card.py | CLIP_for_PyTorch/transformers/tests/utils/test_model_card.py | https://arxiv.org/pdf/1810.03993.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/speech-recognition/README.md | CLIP_for_PyTorch/transformers/tests/utils/test_add_new_model_like.py | https://huggingface.co/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/unispeech_sat/test_modeling_unispeech_sat.py | CLIP_for_PyTorch/transformers/tests/unispeech_sat/test_modeling_unispeech_sat.py | https://github.com/pytorch/fairseq/issues/3227 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/unispeech_sat/test_modeling_unispeech_sat.py | CLIP_for_PyTorch/transformers/tests/unispeech/test_modeling_unispeech.py | https://github.com/pytorch/fairseq/issues/3227 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/trainer/test_trainer.py | CLIP_for_PyTorch/transformers/tests/trainer/test_trainer.py | https://github.com/huggingface/transformers/issues/12970 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/tokenization/test_tokenization_fast.py | CLIP_for_PyTorch/transformers/tests/tokenization/test_tokenization_fast.py | https://github.com/huggingface/transformers/pull/12550 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/tokenization/test_tokenization_fast.py | CLIP_for_PyTorch/transformers/tests/tokenization/test_tokenization_fast.py | https://github.com/huggingface/tokenizers/issues/537 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/test_modeling_common.py | CLIP_for_PyTorch/transformers/tests/test_modeling_tf_common.py | https://github.com/huggingface/transformers/issues/14859 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/test_modeling_common.py | CLIP_for_PyTorch/transformers/tests/test_modeling_common.py | https://stackoverflow.com/questions/9541025/how-to-copy-a-python-class | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/test_modeling_common.py | CLIP_for_PyTorch/transformers/tests/test_modeling_common.py | https://github.com/huggingface/transformers/issues/14859 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/levit/test_modeling_levit.py | CLIP_for_PyTorch/transformers/tests/test_modeling_common.py | https://github.com/huggingface/transformers/issues/11780 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/mobilebert/test_tokenization_mobilebert.py | CLIP_for_PyTorch/transformers/tests/tapas/test_tokenization_tapas.py | 
https://github.com/huggingface/tokenizers/issues/340 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/tapas/test_modeling_tf_tapas.py | CLIP_for_PyTorch/transformers/tests/tapas/test_modeling_tf_tapas.py | https://github.com/google-research/tapas/blob/master/tapas/models/segmented_tensor_test.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/tapas/test_modeling_tf_tapas.py | CLIP_for_PyTorch/transformers/tests/tapas/test_modeling_tapas.py | https://github.com/google-research/tapas/blob/master/tapas/models/segmented_tensor_test.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/pegasus/test_modeling_flax_pegasus.py | CLIP_for_PyTorch/transformers/tests/t5/test_modeling_flax_t5.py | https://jax.readthedocs.io/en/latest/gpu_memory_allocation.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/unispeech_sat/test_modeling_unispeech_sat.py | CLIP_for_PyTorch/transformers/tests/sew_d/test_modeling_sew_d.py | https://github.com/pytorch/fairseq/issues/3227 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/unispeech_sat/test_modeling_unispeech_sat.py | CLIP_for_PyTorch/transformers/tests/sew/test_modeling_sew.py | https://github.com/pytorch/fairseq/issues/3227 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/image-captioning/README.md | CLIP_for_PyTorch/transformers/tests/sagemaker/scripts/pytorch/run_glue_model_parallelism.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/text-classification/run_flax_glue.py | CLIP_for_PyTorch/transformers/tests/sagemaker/scripts/pytorch/run_glue_model_parallelism.py | https://huggingface.co/docs/datasets/package_reference/main_classes.html#datasets.Dataset.unique | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/roberta/test_tokenization_roberta.py | CLIP_for_PyTorch/transformers/tests/roberta/test_tokenization_roberta.py | https://github.com/rsennrich/subword-nmt | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/roberta/test_modeling_roberta.py | CLIP_for_PyTorch/transformers/tests/roberta/test_modeling_roberta.py | https://github.com/huggingface/transformers/issues/1761 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/reformer/test_tokenization_reformer.py | CLIP_for_PyTorch/transformers/tests/reformer/test_tokenization_reformer.py | https://github.com/huggingface/transformers/pull/11737#issuecomment-850769064 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/perceiver/test_modeling_perceiver.py | CLIP_for_PyTorch/transformers/tests/reformer/test_modeling_reformer.py | https://github.com/pytorch/pytorch/issues/36035 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/mobilebert/test_tokenization_mobilebert.py | CLIP_for_PyTorch/transformers/tests/realm/test_tokenization_realm.py | https://github.com/huggingface/tokenizers/issues/340 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/plbart/test_tokenization_plbart.py | CLIP_for_PyTorch/transformers/tests/plbart/test_tokenization_plbart.py | https://gist.github.com/sshleifer/cba08bc2109361a74ac3760a7e30e4f4 | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot.py | CLIP_for_PyTorch/transformers/tests/pipelines/test_pipelines_zero_shot.py | https://github.com/huggingface/transformers/issues/13846 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot.py | CLIP_for_PyTorch/transformers/tests/pipelines/test_pipelines_zero_shot.py | https://github.com/huggingface/transformers/issues/13381#issuecomment-912343499 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_token_classification.py | CLIP_for_PyTorch/transformers/tests/pipelines/test_pipelines_token_classification.py | https://github.com/huggingface/transformers/pull/4987 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/roberta/test_tokenization_roberta.py | CLIP_for_PyTorch/transformers/tests/phobert/test_tokenization_phobert.py | https://github.com/rsennrich/subword-nmt | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/levit/test_modeling_levit.py | CLIP_for_PyTorch/transformers/tests/perceiver/test_modeling_perceiver.py | https://github.com/huggingface/transformers/issues/11780 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/perceiver/test_modeling_perceiver.py | CLIP_for_PyTorch/transformers/tests/perceiver/test_modeling_perceiver.py | https://github.com/pytorch/pytorch/issues/36035 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/pegasus/test_tokenization_pegasus.py | CLIP_for_PyTorch/transformers/tests/pegasus/test_tokenization_pegasus.py | https://github.com/google-research/bigbird/raw/master/bigbird/vocab/pegasus.model | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/pegasus/test_modeling_flax_pegasus.py | CLIP_for_PyTorch/transformers/tests/pegasus/test_modeling_flax_pegasus.py | https://jax.readthedocs.io/en/latest/gpu_memory_allocation.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/roberta/test_tokenization_roberta.py | CLIP_for_PyTorch/transformers/tests/openai/test_tokenization_openai.py | https://github.com/rsennrich/subword-nmt | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/plbart/test_tokenization_plbart.py | CLIP_for_PyTorch/transformers/tests/mbart50/test_tokenization_mbart50.py | https://gist.github.com/sshleifer/cba08bc2109361a74ac3760a7e30e4f4 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/plbart/test_tokenization_plbart.py | CLIP_for_PyTorch/transformers/tests/mbart/test_tokenization_mbart.py | https://gist.github.com/sshleifer/cba08bc2109361a74ac3760a7e30e4f4 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/pegasus/test_modeling_flax_pegasus.py | CLIP_for_PyTorch/transformers/tests/mbart/test_modeling_flax_mbart.py | https://jax.readthedocs.io/en/latest/gpu_memory_allocation.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/pegasus/test_modeling_flax_pegasus.py | CLIP_for_PyTorch/transformers/tests/marian/test_modeling_flax_marian.py | https://jax.readthedocs.io/en/latest/gpu_memory_allocation.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/plbart/test_tokenization_plbart.py | 
CLIP_for_PyTorch/transformers/tests/m2m_100/test_tokenization_m2m_100.py | https://gist.github.com/sshleifer/cba08bc2109361a74ac3760a7e30e4f4 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/led/test_modeling_led.py | CLIP_for_PyTorch/transformers/tests/led/test_modeling_led.py | https://github.com/allenai/longformer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/led/test_modeling_led.py | CLIP_for_PyTorch/transformers/tests/led/test_modeling_led.py | https://github.com/huggingface/transformers/pull/9278#issue-544709661 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/layoutxlm/test_processor_layoutxlm.py | CLIP_for_PyTorch/transformers/tests/layoutxlm/test_processor_layoutxlm.py | https://www.industrydocuments.ucsf.edu/docs/snbx0223 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/mobilebert/test_tokenization_mobilebert.py | CLIP_for_PyTorch/transformers/tests/layoutlmv2/test_tokenization_layoutlmv2.py | https://github.com/huggingface/tokenizers/issues/340 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/test_modeling_common.py | CLIP_for_PyTorch/transformers/tests/layoutlmv2/test_modeling_layoutlmv2.py | https://stackoverflow.com/questions/9541025/how-to-copy-a-python-class | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/roberta/test_modeling_roberta.py | CLIP_for_PyTorch/transformers/tests/ibert/test_modeling_ibert.py | https://github.com/huggingface/transformers/issues/1761 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/unispeech_sat/test_modeling_unispeech_sat.py | CLIP_for_PyTorch/transformers/tests/hubert/test_modeling_tf_hubert.py | https://github.com/pytorch/fairseq/issues/3227 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/unispeech_sat/test_modeling_unispeech_sat.py | CLIP_for_PyTorch/transformers/tests/hubert/test_modeling_hubert.py | https://github.com/pytorch/fairseq/issues/3227 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/hubert/test_modeling_hubert.py | CLIP_for_PyTorch/transformers/tests/hubert/test_modeling_hubert.py | https://github.com/pytorch/fairseq/pull/3572 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/roberta/test_tokenization_roberta.py | CLIP_for_PyTorch/transformers/tests/gpt2/test_tokenization_gpt2.py | https://github.com/rsennrich/subword-nmt | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/roberta/test_tokenization_roberta.py | CLIP_for_PyTorch/transformers/tests/fsmt/test_tokenization_fsmt.py | https://github.com/rsennrich/subword-nmt | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/fixtures/tests_samples/wiki_text/wiki_00 | CLIP_for_PyTorch/transformers/tests/fixtures/tests_samples/wiki_text/wiki_00 | https://en.wikipedia.org/wiki?curid=12 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/fixtures/tests_samples/wiki_text/wiki_00 | CLIP_for_PyTorch/transformers/tests/fixtures/tests_samples/wiki_text/wiki_00 | https://en.wikipedia.org/wiki?curid=25 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/encoder_decoder/modeling_encoder_decoder.py | 
CLIP_for_PyTorch/transformers/tests/encoder_decoder/test_modeling_tf_encoder_decoder.py | https://github.com/huggingface/transformers/pull/13222/commits/dbb3c9de76eee235791d2064094654637c99f36d#r697304245 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/levit/test_modeling_levit.py | CLIP_for_PyTorch/transformers/tests/deit/test_modeling_deit.py | https://github.com/huggingface/transformers/issues/11780 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/deepspeed/test_deepspeed.py | CLIP_for_PyTorch/transformers/tests/deepspeed/test_deepspeed.py | https://github.com/microsoft/DeepSpeed/issues/1612 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/roberta/test_tokenization_roberta.py | CLIP_for_PyTorch/transformers/tests/deberta/test_tokenization_deberta.py | https://github.com/rsennrich/subword-nmt | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/roberta/test_modeling_roberta.py | CLIP_for_PyTorch/transformers/tests/data2vec/test_modeling_data2vec_text.py | https://github.com/huggingface/transformers/issues/1761 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/unispeech_sat/test_modeling_unispeech_sat.py | CLIP_for_PyTorch/transformers/tests/data2vec/test_modeling_data2vec_audio.py | https://github.com/pytorch/fairseq/issues/3227 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/roberta/test_tokenization_roberta.py | CLIP_for_PyTorch/transformers/tests/ctrl/test_tokenization_ctrl.py | https://github.com/rsennrich/subword-nmt | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/pegasus/test_modeling_flax_pegasus.py | CLIP_for_PyTorch/transformers/tests/blenderbot_small/test_modeling_flax_blenderbot_small.py | https://jax.readthedocs.io/en/latest/gpu_memory_allocation.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/pegasus/test_modeling_flax_pegasus.py | CLIP_for_PyTorch/transformers/tests/blenderbot/test_modeling_flax_blenderbot.py | https://jax.readthedocs.io/en/latest/gpu_memory_allocation.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/big_bird/test_tokenization_big_bird.py | CLIP_for_PyTorch/transformers/tests/big_bird/test_tokenization_big_bird.py | https://github.com/google-research/bigbird/blob/master/bigbird/vocab/gpt2.model?raw=true | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/roberta/test_tokenization_roberta.py | CLIP_for_PyTorch/transformers/tests/bertweet/test_tokenization_bertweet.py | https://github.com/rsennrich/subword-nmt | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/mobilebert/test_tokenization_mobilebert.py | CLIP_for_PyTorch/transformers/tests/bert/test_tokenization_bert.py | https://github.com/huggingface/tokenizers/issues/340 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/barthez/test_tokenization_barthez.py | CLIP_for_PyTorch/transformers/tests/barthez/test_tokenization_barthez.py | https://github.com/huggingface/transformers/issues/11457 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/pegasus/test_modeling_flax_pegasus.py | CLIP_for_PyTorch/transformers/tests/bart/test_modeling_flax_bart.py | 
https://jax.readthedocs.io/en/latest/gpu_memory_allocation.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/auto/test_tokenization_auto.py | CLIP_for_PyTorch/transformers/tests/auto/test_tokenization_auto.py | https://github.com/huggingface/transformers/pull/13251 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/speech-recognition/README.md | CLIP_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_fast_{{cookiecutter.lowercase_modelname}}.py | https://huggingface.co/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/speech-recognition/README.md | CLIP_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_{{cookiecutter.lowercase_modelname}}.py | https://huggingface.co/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/README.md | CLIP_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | https://huggingface.co/models?filter= | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | https://github.com/tensorflow/mesh/blob/8d2465e9bc93129b913b5ccc6a59aa97abd96ec6/mesh_tensorflow/transformer/transformer_layers.py#L270 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | 
CLIP_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/language-modeling/README.md | CLIP_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | https://arxiv.org/abs/1910.13461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | https://github.com/google/flax/blob/491ce18759622506588784b4fca0e4bf05f8c8cd/flax/linen/attention.py#L252 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/README.md | CLIP_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_{{cookiecutter.lowercase_modelname}}.py | https://huggingface.co/models?filter= | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_{{cookiecutter.lowercase_modelname}}.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | 
CLIP_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_{{cookiecutter.lowercase_modelname}}.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_{{cookiecutter.lowercase_modelname}}.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | CLIP_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_{{cookiecutter.lowercase_modelname}}.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/language-modeling/README.md | CLIP_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_{{cookiecutter.lowercase_modelname}}.py | https://arxiv.org/abs/1910.13461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_{{cookiecutter.lowercase_modelname}}.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/speech-recognition/README.md | CLIP_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/configuration_{{cookiecutter.lowercase_modelname}}.py | https://huggingface.co/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/README.md | CLIP_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/configuration_{{cookiecutter.lowercase_modelname}}.py | https://huggingface.co/models?filter= | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/configuration_{{cookiecutter.lowercase_modelname}}.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/speech-recognition/README.md | CLIP_for_PyTorch/transformers/templates/adding_a_new_example_script/{{cookiecutter.directory_name}}/run_{{cookiecutter.example_shortcut}}.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/image-captioning/README.md | CLIP_for_PyTorch/transformers/templates/adding_a_new_example_script/{{cookiecutter.directory_name}}/run_{{cookiecutter.example_shortcut}}.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/utils/fx.py | CLIP_for_PyTorch/transformers/src/transformers/utils/fx.py | https://github.com/pytorch/pytorch/pull/55888 | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/training_args_tf.py | CLIP_for_PyTorch/transformers/src/transformers/training_args_tf.py | https://docs.python.org/3/library/argparse#module-argparse | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/pt/multilingual.md | CLIP_for_PyTorch/transformers/src/transformers/training_args_tf.py | https://github.com/huggingface/transformers/tree/master/examples | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/pytorch/README.md | CLIP_for_PyTorch/transformers/src/transformers/training_args_tf.py | https://www.tensorflow.org/tensorboard | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/pytorch-lightning/lightning_base.py | CLIP_for_PyTorch/transformers/src/transformers/training_args_tf.py | https://nvidia.github.io/apex/amp | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/training_args_tf.py | CLIP_for_PyTorch/transformers/src/transformers/training_args.py | https://docs.python.org/3/library/argparse#module-argparse | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/pt/multilingual.md | CLIP_for_PyTorch/transformers/src/transformers/training_args.py | https://github.com/huggingface/transformers/tree/master/examples | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/pytorch/README.md | CLIP_for_PyTorch/transformers/src/transformers/training_args.py | https://www.tensorflow.org/tensorboard | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/pytorch-lightning/lightning_base.py | CLIP_for_PyTorch/transformers/src/transformers/training_args.py | https://nvidia.github.io/apex/amp | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/main_classes/callback.md | CLIP_for_PyTorch/transformers/src/transformers/training_args.py | https://www.wandb.com/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/main_classes/callback.md | CLIP_for_PyTorch/transformers/src/transformers/training_args.py | https://www.mlflow.org/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/ko/perf_train_gpu_many.md | CLIP_for_PyTorch/transformers/src/transformers/training_args.py | https://github.com/facebookresearch/fairscale | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/main_classes/deepspeed.md | CLIP_for_PyTorch/transformers/src/transformers/training_args.py | https://github.com/microsoft/deepspeed | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/pytorch-lightning/lightning_base.py | CLIP_for_PyTorch/transformers/src/transformers/training_args.py | https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/training_args.py | CLIP_for_PyTorch/transformers/src/transformers/training_args.py | https://github.com/huggingface/transformers/issues/10628 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/trainer_tf.py | CLIP_for_PyTorch/transformers/src/transformers/trainer_tf.py | https://docs.wandb.com/huggingface | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/pytorch/README.md | CLIP_for_PyTorch/transformers/src/transformers/trainer_tf.py | 
https://www.comet.ml/docs/python-sdk/huggingface/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/trainer_tf.py | CLIP_for_PyTorch/transformers/src/transformers/trainer_tf.py | https://www.comet.ml/docs/python-sdk/advanced/#comet-configuration-variables | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/trainer_pt_utils.py | CLIP_for_PyTorch/transformers/src/transformers/trainer_pt_utils.py | https://github.com/numpy/numpy/blob/a47ecdea856986cd60eabbd53265c2ca5916ad5d/doc/source/user/basics.types.rst | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/trainer_pt_utils.py | CLIP_for_PyTorch/transformers/src/transformers/trainer_pt_utils.py | https://github.com/pytorch/pytorch/issues/16266 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/run_swag.py | CLIP_for_PyTorch/transformers/src/transformers/trainer.py | https://www.github.com/nvidia/apex | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/trainer.py | CLIP_for_PyTorch/transformers/src/transformers/trainer.py | https://github.com/huggingface/transformers/pull/4659#issuecomment-643356021 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/trainer.py | CLIP_for_PyTorch/transformers/src/transformers/trainer.py | https://optuna.readthedocs.io/en/stable/reference/generated/optuna.study.create_study.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/trainer.py | CLIP_for_PyTorch/transformers/src/transformers/trainer.py | https://docs.ray.io/en/latest/tune/api_docs/execution.html#tune-run | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/trainer.py | CLIP_for_PyTorch/transformers/src/transformers/trainer.py | https://app.sigopt.com/docs/endpoints/experiments/create | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/README.md | CLIP_for_PyTorch/transformers/src/transformers/tokenization_utils_base.py | https://huggingface.co/models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/speech-recognition/README.md | CLIP_for_PyTorch/transformers/src/transformers/tokenization_utils_base.py | https://huggingface.co/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/tokenization_utils.py | CLIP_for_PyTorch/transformers/src/transformers/tokenization_utils.py | https://en.wikipedia.org/wiki/Trie | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/big_bird/tokenization_big_bird.py | CLIP_for_PyTorch/transformers/src/transformers/tokenization_utils.py | https://github.com/huggingface/transformers/issues/1133 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/testing_utils.py | CLIP_for_PyTorch/transformers/src/transformers/testing_utils.py | https://github.com/fastai/fastai/blob/master/tests/utils/text.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/testing_utils.py | CLIP_for_PyTorch/transformers/src/transformers/testing_utils.py | https://stackoverflow.com/a/64789046/9201239 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/testing_utils.py | CLIP_for_PyTorch/transformers/src/transformers/testing_utils.py | 
https://stackoverflow.com/a/34333710/9201239 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/testing_utils.py | CLIP_for_PyTorch/transformers/src/transformers/testing_utils.py | https://github.com/pytest-dev/pytest/blob/897f151e/src/_pytest/runner.py#L66 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/testing_utils.py | CLIP_for_PyTorch/transformers/src/transformers/testing_utils.py | https://github.com/pytest-dev/pytest/blob/897f151e/src/_pytest/terminal.py#L814 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/testing_utils.py | CLIP_for_PyTorch/transformers/src/transformers/testing_utils.py | https://stackoverflow.com/a/59041913/9201239 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/testing_utils.py | CLIP_for_PyTorch/transformers/src/transformers/testing_utils.py | https://docs.python.org/3/library/asyncio-subprocess.html#asyncio.asyncio.subprocess.Process.wait | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/pipelines/zero_shot_image_classification.py | CLIP_for_PyTorch/transformers/src/transformers/pipelines/zero_shot_image_classification.py | https://huggingface.co/models?filter=zero-shot-image-classification | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/pipelines/zero_shot_classification.py | CLIP_for_PyTorch/transformers/src/transformers/pipelines/zero_shot_classification.py | https://huggingface.co/models?search=nli | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/pipelines/token_classification.py | CLIP_for_PyTorch/transformers/src/transformers/pipelines/token_classification.py | https://huggingface.co/models?filter=token-classification | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/pipelines/text2text_generation.py | CLIP_for_PyTorch/transformers/src/transformers/pipelines/text2text_generation.py | https://huggingface.co/models?filter=text2text-generation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/pipelines/text2text_generation.py | CLIP_for_PyTorch/transformers/src/transformers/pipelines/text2text_generation.py | https://huggingface.co/models?filter=summarization | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/pipelines/text2text_generation.py | CLIP_for_PyTorch/transformers/src/transformers/pipelines/text2text_generation.py | https://huggingface.co/models?filter=translation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/language-modeling/run_clm_flax.py | CLIP_for_PyTorch/transformers/src/transformers/pipelines/text_generation.py | https://huggingface.co/models?filter=text-generation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/pytorch/text-generation/run_generation.py | CLIP_for_PyTorch/transformers/src/transformers/pipelines/text_generation.py | https://github.com/rusiaaman/XLNet-gen#methodology | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/pytorch/text-generation/run_generation.py | CLIP_for_PyTorch/transformers/src/transformers/pipelines/text_generation.py | https://medium.com/@amanrusia/xlnet-speaks-comparison-to-gpt-2-ea1a4e9ba39e | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/pipelines/text_generation.py | CLIP_for_PyTorch/transformers/src/transformers/pipelines/text_generation.py | https://github.com/huggingface/transformers/issues/14033#issuecomment-948385227 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/pipelines/text_classification.py | CLIP_for_PyTorch/transformers/src/transformers/pipelines/text_classification.py | https://huggingface.co/models?filter=text-classification | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/pipelines/table_question_answering.py | CLIP_for_PyTorch/transformers/src/transformers/pipelines/table_question_answering.py | https://huggingface.co/models?filter=table-question-answering | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/pipelines/question_answering.py | CLIP_for_PyTorch/transformers/src/transformers/pipelines/question_answering.py | https://huggingface.co/models?filter=question-answering | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/pipelines/question_answering.py | CLIP_for_PyTorch/transformers/src/transformers/pipelines/question_answering.py | https://github.com/facebookresearch/DrQA | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/pipelines/object_detection.py | CLIP_for_PyTorch/transformers/src/transformers/pipelines/object_detection.py | https://huggingface.co/models?filter=object-detection | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/pipelines/image_segmentation.py | CLIP_for_PyTorch/transformers/src/transformers/pipelines/image_segmentation.py | https://huggingface.co/models?filter=image-segmentation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/tensorflow/image-classification/run_image_classification.py | CLIP_for_PyTorch/transformers/src/transformers/pipelines/image_classification.py | https://huggingface.co/models?filter=image-classification | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/language-modeling/run_mlm_flax.py | CLIP_for_PyTorch/transformers/src/transformers/pipelines/fill_mask.py | https://huggingface.co/models?filter=fill-mask | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/pipelines/fill_mask.py | CLIP_for_PyTorch/transformers/src/transformers/pipelines/fill_mask.py | https://github.com/huggingface/transformers/pull/10222 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/README.md | CLIP_for_PyTorch/transformers/src/transformers/pipelines/feature_extraction.py | https://huggingface.co/models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/pipelines/conversational.py | CLIP_for_PyTorch/transformers/src/transformers/pipelines/conversational.py | https://huggingface.co/models?filter=conversational | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/pipelines/base.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/tensorflow/image-classification/run_image_classification.py | CLIP_for_PyTorch/transformers/src/transformers/pipelines/base.py | https://pytorch.org/ | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/pipelines/base.py | CLIP_for_PyTorch/transformers/src/transformers/pipelines/base.py | https://huggingface.co/transformers/main_classes/pipelines.html#pipeline-batching | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/pipelines/audio_classification.py | CLIP_for_PyTorch/transformers/src/transformers/pipelines/audio_classification.py | https://huggingface.co/models?filter=audio-classification | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/speech-recognition/README.md | CLIP_for_PyTorch/transformers/src/transformers/pipelines/__init__.py | https://huggingface.co/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/optimization_tf.py | CLIP_for_PyTorch/transformers/src/transformers/optimization_tf.py | https://arxiv.org/abs/1711.05101 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/optimization_tf.py | CLIP_for_PyTorch/transformers/src/transformers/optimization_tf.py | https://arxiv.org/abs/1904.09237 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/optimization_tf.py | CLIP_for_PyTorch/transformers/src/transformers/optimization_tf.py | https://github.com/OpenNMT/OpenNMT-tf/blob/master/opennmt/optimizers/utils.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/optimization.py | CLIP_for_PyTorch/transformers/src/transformers/optimization.py | https://github.com/google-research/bert/blob/f39e881b169b9d53bea03d2d341b31707a6c052b/optimization.py#L37 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/optimization_tf.py | CLIP_for_PyTorch/transformers/src/transformers/optimization.py | https://arxiv.org/abs/1711.05101 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/optimization.py | CLIP_for_PyTorch/transformers/src/transformers/optimization.py | https://github.com/pytorch/fairseq/blob/master/fairseq/optim/adafactor.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/optimization.py | CLIP_for_PyTorch/transformers/src/transformers/optimization.py | https://arxiv.org/abs/1804.04235 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/optimization.py | CLIP_for_PyTorch/transformers/src/transformers/optimization.py | https://discuss.huggingface.co/t/t5-finetuning-tips/684/3 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/optimization.py | CLIP_for_PyTorch/transformers/src/transformers/optimization.py | https://github.com/huggingface/transformers/blob/8395f14de6068012787d83989c3627c3df6a252b/src/transformers/optimization.py#L505 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/yoso/configuration_yoso.py | CLIP_for_PyTorch/transformers/src/transformers/models/yoso/modeling_yoso.py | https://huggingface.co/models?filter=yoso | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/yoso/modeling_yoso.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/yoso/modeling_yoso.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/kernels/yoso/fast_lsh_cumulation_cuda.cu | CLIP_for_PyTorch/transformers/src/transformers/models/yoso/fast_lsh_cumulation_cuda.cu | https://github.com/mlpen/YOSO/blob/main/encoders/backbones/efficient_attentions/yoso/yoso_v1/cuda/fast_lsh_cumulation_cuda.cu | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/kernels/yoso/fast_lsh_cumulation.cu | CLIP_for_PyTorch/transformers/src/transformers/models/yoso/fast_lsh_cumulation.cu | https://github.com/mlpen/YOSO/blob/main/encoders/backbones/efficient_attentions/yoso/yoso_v1/cuda/fast_lsh_cumulation.cu | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/yoso.md | CLIP_for_PyTorch/transformers/src/transformers/models/yoso/convert_yoso_pytorch_to_pytorch.py | https://github.com/mlpen/YOSO | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/yoso/configuration_yoso.py | CLIP_for_PyTorch/transformers/src/transformers/models/yoso/configuration_yoso.py | https://huggingface.co/uw-madison/yoso-4096/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/yoso/configuration_yoso.py | CLIP_for_PyTorch/transformers/src/transformers/models/yoso/configuration_yoso.py | https://huggingface.co/models?filter=yoso | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/yoso/configuration_yoso.py | CLIP_for_PyTorch/transformers/src/transformers/models/yoso/configuration_yoso.py | https://huggingface.co/uw-madison/yoso-4096 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlnet/tokenization_xlnet.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlnet/tokenization_xlnet_fast.py | https://huggingface.co/xlnet-base-cased/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlnet/tokenization_xlnet.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlnet/tokenization_xlnet_fast.py | https://huggingface.co/xlnet-large-cased/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlnet/tokenization_xlnet_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlnet/tokenization_xlnet_fast.py | https://huggingface.co/xlnet-base-cased/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlnet/tokenization_xlnet_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlnet/tokenization_xlnet_fast.py | https://huggingface.co/xlnet-large-cased/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/pegasus/tokenization_pegasus_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlnet/tokenization_xlnet_fast.py | https://huggingface.co/docs/tokenizers/python/latest/components.html?highlight=unigram#models | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/docs/source/ko/model_doc/llama.md | CLIP_for_PyTorch/transformers/src/transformers/models/xlnet/tokenization_xlnet_fast.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlnet/tokenization_xlnet.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlnet/tokenization_xlnet.py | https://huggingface.co/xlnet-base-cased/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlnet/tokenization_xlnet.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlnet/tokenization_xlnet.py | https://huggingface.co/xlnet-large-cased/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/ko/model_doc/llama.md | CLIP_for_PyTorch/transformers/src/transformers/models/xlnet/tokenization_xlnet.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/code_llama/tokenization_code_llama.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlnet/tokenization_xlnet.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/xlnet.md | CLIP_for_PyTorch/transformers/src/transformers/models/xlnet/modeling_xlnet.py | https://huggingface.co/models?filter=xlnet | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/xlnet/modeling_xlnet.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlnet/modeling_xlnet.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlnet/modeling_xlnet.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/xlnet.md | CLIP_for_PyTorch/transformers/src/transformers/models/xlnet/modeling_tf_xlnet.py | https://huggingface.co/models?filter=xlnet | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/xlnet/modeling_tf_xlnet.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlnet/configuration_xlnet.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlnet/configuration_xlnet.py | https://huggingface.co/xlnet-base-cased/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlnet/configuration_xlnet.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlnet/configuration_xlnet.py | https://huggingface.co/xlnet-large-cased/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlnet/tokenization_xlnet.py | 
CLIP_for_PyTorch/transformers/src/transformers/models/xlnet/configuration_xlnet.py | https://huggingface.co/xlnet-large-cased | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlnet/configuration_xlnet.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlnet/configuration_xlnet.py | https://huggingface.co/transformers/quickstart.html#using-the-past | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlnet/configuration_xlnet.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlnet/configuration_xlnet.py | https://github.com/zihangdai/xlnet/issues/41#issuecomment-505102587 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta_xl/modeling_xlm_roberta_xl.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm_roberta_xl/modeling_xlm_roberta_xl.py | https://huggingface.co/models?filter=xlm-roberta-xl | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm_roberta_xl/modeling_xlm_roberta_xl.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm_roberta_xl/modeling_xlm_roberta_xl.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm_roberta_xl/modeling_xlm_roberta_xl.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta_xl/configuration_xlm_roberta_xl.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm_roberta_xl/configuration_xlm_roberta_xl.py | https://huggingface.co/facebook/xlm-roberta-xl/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta_xl/configuration_xlm_roberta_xl.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm_roberta_xl/configuration_xlm_roberta_xl.py | https://huggingface.co/facebook/xlm-roberta-xxl/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta_xl/modeling_xlm_roberta_xl.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm_roberta_xl/configuration_xlm_roberta_xl.py | https://huggingface.co/models?filter=xlm-roberta-xl | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/jax-projects/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/xlm_roberta_xl/configuration_xlm_roberta_xl.py | https://huggingface.co/bert-base-uncased | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/question-answering/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/xlm_roberta_xl/configuration_xlm_roberta_xl.py | https://arxiv.org/abs/1803.02155 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/question-answering/README.md | 
CLIP_for_PyTorch/transformers/src/transformers/models/xlm_roberta_xl/configuration_xlm_roberta_xl.py | https://arxiv.org/abs/2009.13658 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | https://huggingface.co/xlm-roberta-base/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | https://huggingface.co/xlm-roberta-large/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | https://huggingface.co/xlm-roberta-large-finetuned-conll02-dutch/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | https://huggingface.co/xlm-roberta-large-finetuned-conll02-spanish/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | https://huggingface.co/xlm-roberta-large-finetuned-conll03-english/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | https://huggingface.co/xlm-roberta-large-finetuned-conll03-german/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | https://huggingface.co/xlm-roberta-base/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | https://huggingface.co/xlm-roberta-large/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | https://huggingface.co/xlm-roberta-large-finetuned-conll02-dutch/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | https://huggingface.co/xlm-roberta-large-finetuned-conll02-spanish/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | https://huggingface.co/xlm-roberta-large-finetuned-conll03-english/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | https://huggingface.co/xlm-roberta-large-finetuned-conll03-german/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/layoutxlm/tokenization_layoutxlm_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | https://huggingface.co/docs/tokenizers/python/latest/components.html?highlight=BPE#models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | https://huggingface.co/xlm-roberta-base/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | https://huggingface.co/xlm-roberta-large/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | https://huggingface.co/xlm-roberta-large-finetuned-conll02-dutch/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | https://huggingface.co/xlm-roberta-large-finetuned-conll02-spanish/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | https://huggingface.co/xlm-roberta-large-finetuned-conll03-english/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | https://huggingface.co/xlm-roberta-large-finetuned-conll03-german/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/ko/model_doc/llama.md | CLIP_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/code_llama/tokenization_code_llama.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 
| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/xlm-roberta.md | CLIP_for_PyTorch/transformers/src/transformers/models/xlm_roberta/modeling_xlm_roberta.py | https://huggingface.co/models?filter=xlm-roberta | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm_roberta/modeling_xlm_roberta.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/xlm-roberta.md | CLIP_for_PyTorch/transformers/src/transformers/models/xlm_roberta/modeling_tf_xlm_roberta.py | https://huggingface.co/models?filter=xlm-roberta | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/xlm_roberta/modeling_tf_xlm_roberta.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/xlm-roberta.md | CLIP_for_PyTorch/transformers/src/transformers/models/xlm_roberta/modeling_flax_xlm_roberta.py | https://huggingface.co/models?filter=xlm-roberta | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm_roberta/modeling_flax_xlm_roberta.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm_roberta/modeling_flax_xlm_roberta.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm_roberta/modeling_flax_xlm_roberta.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm_roberta/modeling_flax_xlm_roberta.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm_roberta/modeling_flax_xlm_roberta.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/configuration_xlm_roberta.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm_roberta/configuration_xlm_roberta.py | https://huggingface.co/xlm-roberta-base/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/configuration_xlm_roberta.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm_roberta/configuration_xlm_roberta.py | https://huggingface.co/xlm-roberta-large/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/configuration_xlm_roberta.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm_roberta/configuration_xlm_roberta.py | https://huggingface.co/xlm-roberta-large-finetuned-conll02-dutch/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/configuration_xlm_roberta.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm_roberta/configuration_xlm_roberta.py | https://huggingface.co/xlm-roberta-large-finetuned-conll02-spanish/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/configuration_xlm_roberta.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm_roberta/configuration_xlm_roberta.py | https://huggingface.co/xlm-roberta-large-finetuned-conll03-english/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/configuration_xlm_roberta.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm_roberta/configuration_xlm_roberta.py | https://huggingface.co/xlm-roberta-large-finetuned-conll03-german/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_prophetnet/tokenization_xlm_prophetnet.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm_prophetnet/tokenization_xlm_prophetnet.py | https://huggingface.co/microsoft/xprophetnet-large-wiki100-cased/resolve/main/prophetnet.tokenizer | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/ko/model_doc/llama.md | CLIP_for_PyTorch/transformers/src/transformers/models/xlm_prophetnet/tokenization_xlm_prophetnet.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/code_llama/tokenization_code_llama.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm_prophetnet/tokenization_xlm_prophetnet.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/xlm-prophetnet.md | CLIP_for_PyTorch/transformers/src/transformers/models/xlm_prophetnet/modeling_xlm_prophetnet.py | https://huggingface.co/models?filter=xprophetnet | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_prophetnet/configuration_xlm_prophetnet.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm_prophetnet/configuration_xlm_prophetnet.py | https://huggingface.co/microsoft/xprophetnet-large-wiki100-cased/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-en-2048/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | 
CLIP_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-ende-1024/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-enfr-1024/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-enro-1024/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-tlm-xnli15-1024/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-xnli15-1024/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-clm-enfr-1024/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-clm-ende-1024/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-17-1280/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-100-1280/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-en-2048/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-ende-1024/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-enfr-1024/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-enro-1024/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py 
| CLIP_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-tlm-xnli15-1024/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-xnli15-1024/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-clm-enfr-1024/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-clm-ende-1024/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-17-1280/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-100-1280/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://github.com/facebookresearch/XLM/blob/master/tools/lowercase_and_remove_accent.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://github.com/moses-smt/mosesdecoder/blob/master/scripts/tokenizer/replace-unicode-punctuation.perl | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://github.com/moses-smt/mosesdecoder/blob/master/scripts/tokenizer/remove-non-printing-char.perl | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://github.com/rsennrich/wmt16-scripts/blob/master/preprocess/normalise-romanian.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://github.com/rsennrich/wmt16-scripts/blob/master/preprocess/remove-diacritics.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://github.com/neubig/kyt | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/seq2seq/romanian_postprocessing.md | CLIP_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | git@github.com | 开发者邮箱配置 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://github.com/alvations/sacremoses | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/awesome-transformers.md | CLIP_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://github.com/PyThaiNLP/pythainlp | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://github.com/chezou/Mykytea-python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://github.com/neubig/kytea | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://github.com/fxsjy/jieba | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://nlp.stanford.edu/software/stanford-segmenter-2018-10-16.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://github.com/facebookresearch/XLM/tree/master/tools | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/xlm.md | CLIP_for_PyTorch/transformers/src/transformers/models/xlm/modeling_xlm.py | https://huggingface.co/models?filter=xlm | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm/modeling_xlm.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/xlm.md | CLIP_for_PyTorch/transformers/src/transformers/models/xlm/modeling_tf_xlm.py | https://huggingface.co/models?filter=xlm | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/xlm/modeling_tf_xlm.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/configuration_xlm.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm/configuration_xlm.py | https://huggingface.co/xlm-mlm-en-2048/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/configuration_xlm.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm/configuration_xlm.py | https://huggingface.co/xlm-mlm-ende-1024/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/configuration_xlm.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm/configuration_xlm.py | https://huggingface.co/xlm-mlm-enfr-1024/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/configuration_xlm.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm/configuration_xlm.py | https://huggingface.co/xlm-mlm-enro-1024/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/configuration_xlm.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm/configuration_xlm.py | https://huggingface.co/xlm-mlm-tlm-xnli15-1024/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/configuration_xlm.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm/configuration_xlm.py | https://huggingface.co/xlm-mlm-xnli15-1024/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/configuration_xlm.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm/configuration_xlm.py | https://huggingface.co/xlm-clm-enfr-1024/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/configuration_xlm.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm/configuration_xlm.py | https://huggingface.co/xlm-clm-ende-1024/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/configuration_xlm.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm/configuration_xlm.py | https://huggingface.co/xlm-mlm-17-1280/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/configuration_xlm.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm/configuration_xlm.py | https://huggingface.co/xlm-mlm-100-1280/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/configuration_xlm.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm/configuration_xlm.py | https://huggingface.co/xlm-mlm-en-2048 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/configuration_xlm.py | CLIP_for_PyTorch/transformers/src/transformers/models/xlm/configuration_xlm.py | http://huggingface.co/transformers/multilingual.html#xlm-language-embeddings | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xglm/tokenization_xglm_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/xglm/tokenization_xglm_fast.py | https://huggingface.co/facebook/xglm-564M/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xglm/tokenization_xglm_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/xglm/tokenization_xglm_fast.py | https://huggingface.co/facebook/xglm-564M/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/layoutxlm/tokenization_layoutxlm_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/xglm/tokenization_xglm_fast.py | https://huggingface.co/docs/tokenizers/python/latest/components.html?highlight=BPE#models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xglm/tokenization_xglm_fast.py | 
CLIP_for_PyTorch/transformers/src/transformers/models/xglm/tokenization_xglm.py | https://huggingface.co/facebook/xglm-564M/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/ko/model_doc/llama.md | CLIP_for_PyTorch/transformers/src/transformers/models/xglm/tokenization_xglm.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/code_llama/tokenization_code_llama.py | CLIP_for_PyTorch/transformers/src/transformers/models/xglm/tokenization_xglm.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xglm/configuration_xglm.py | CLIP_for_PyTorch/transformers/src/transformers/models/xglm/modeling_xglm.py | https://huggingface.co/models?filter=xglm | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/xglm/modeling_xglm.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/xglm/modeling_xglm.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/xglm/modeling_flax_xglm.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/xglm/modeling_flax_xglm.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/xglm/modeling_flax_xglm.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/xglm/modeling_flax_xglm.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/xglm/modeling_flax_xglm.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/xglm/modeling_flax_xglm.py | https://github.com/google/flax/blob/491ce18759622506588784b4fca0e4bf05f8c8cd/flax/linen/attention.py#L252 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/xglm/modeling_flax_xglm.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xglm/configuration_xglm.py | CLIP_for_PyTorch/transformers/src/transformers/models/xglm/configuration_xglm.py | https://huggingface.co/facebook/xglm-564M/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xglm/configuration_xglm.py | CLIP_for_PyTorch/transformers/src/transformers/models/xglm/configuration_xglm.py | https://huggingface.co/models?filter=xglm | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xglm/tokenization_xglm_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/xglm/configuration_xglm.py | https://huggingface.co/facebook/xglm-564M | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/xglm/configuration_xglm.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/wavlm/modeling_wavlm.py | CLIP_for_PyTorch/transformers/src/transformers/models/wavlm/modeling_wavlm.py | https://huggingface.co/models?filter=wavlm | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | CLIP_for_PyTorch/transformers/src/transformers/models/wavlm/modeling_wavlm.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/wavlm/modeling_wavlm.py | CLIP_for_PyTorch/transformers/src/transformers/models/wavlm/modeling_wavlm.py | https://github.com/pytorch/pytorch/issues/32590 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/wavlm/modeling_wavlm.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | CLIP_for_PyTorch/transformers/src/transformers/models/wavlm/modeling_wavlm.py | https://arxiv.org/pdf/1611.01144.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | CLIP_for_PyTorch/transformers/src/transformers/models/wavlm/modeling_wavlm.py | https://pytorch.org/docs/stable/generated/torch.nn.Conv1d.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | 
CLIP_for_PyTorch/transformers/src/transformers/models/wavlm/modeling_wavlm.py | https://arxiv.org/abs/2101.07597 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/wavlm/modeling_wavlm.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/trocr.md | CLIP_for_PyTorch/transformers/src/transformers/models/wavlm/convert_wavlm_original_pytorch_checkpoint_to_pytorch.py | https://github.com/microsoft/unilm | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/wavlm/convert_wavlm_original_pytorch_checkpoint_to_pytorch.py | CLIP_for_PyTorch/transformers/src/transformers/models/wavlm/convert_wavlm_original_pytorch_checkpoint_to_pytorch.py | https://github.com/microsoft/unilm/commit/b94ec76c36f02fb2b0bf0dcb0b8554a2185173cd | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/wavlm/modeling_wavlm.py | CLIP_for_PyTorch/transformers/src/transformers/models/wavlm/configuration_wavlm.py | https://huggingface.co/models?filter=wavlm | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | CLIP_for_PyTorch/transformers/src/transformers/models/wavlm/configuration_wavlm.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/wav2vec2_with_lm/test_processor_wav2vec2_with_lm.py | CLIP_for_PyTorch/transformers/src/transformers/models/wav2vec2_with_lm/processing_wav2vec2_with_lm.py | https://huggingface.co/datasets/common_voice/viewer/en/train | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/wav2vec2_phoneme/tokenization_wav2vec2_phoneme.py | CLIP_for_PyTorch/transformers/src/transformers/models/wav2vec2_phoneme/tokenization_wav2vec2_phoneme.py | https://huggingface.co/facebook/wav2vec2-lv-60-espeak-cv-ft/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/wav2vec2_phoneme/tokenization_wav2vec2_phoneme.py | CLIP_for_PyTorch/transformers/src/transformers/models/wav2vec2_phoneme/tokenization_wav2vec2_phoneme.py | https://huggingface.co/facebook/wav2vec2-lv-60-espeak-cv-ft/resolve/main/tokenizer_config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/wav2vec2_phoneme/tokenization_wav2vec2_phoneme.py | CLIP_for_PyTorch/transformers/src/transformers/models/wav2vec2_phoneme/tokenization_wav2vec2_phoneme.py | https://github.com/bootphon/phonemizer#readme | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/wav2vec2/tokenization_wav2vec2.py | CLIP_for_PyTorch/transformers/src/transformers/models/wav2vec2/tokenization_wav2vec2.py | https://huggingface.co/facebook/wav2vec2-base-960h/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/wav2vec2/tokenization_wav2vec2.py | CLIP_for_PyTorch/transformers/src/transformers/models/wav2vec2/tokenization_wav2vec2.py | https://huggingface.co/facebook/wav2vec2-base-960h/resolve/main/tokenizer_config.json | 模型参数相关配置 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/tests/models/wav2vec2_with_lm/test_processor_wav2vec2_with_lm.py | CLIP_for_PyTorch/transformers/src/transformers/models/wav2vec2/tokenization_wav2vec2.py | https://huggingface.co/datasets/common_voice/viewer/en/train | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/wav2vec2/feature_extraction_wav2vec2.py | CLIP_for_PyTorch/transformers/src/transformers/models/wav2vec2/tokenization_wav2vec2.py | https://huggingface.co/models?search=lv60 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/jax-projects/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/wav2vec2/tokenization_wav2vec2.py | https://huggingface.co/facebook/wav2vec2-base-960h | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/wav2vec2/feature_extraction_wav2vec2.py | CLIP_for_PyTorch/transformers/src/transformers/models/wav2vec2/tokenization_wav2vec2.py | https://huggingface.co/facebook/wav2vec2-large-960h-lv60-self | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/wav2vec2/tokenization_wav2vec2.py | CLIP_for_PyTorch/transformers/src/transformers/models/wav2vec2/tokenization_wav2vec2.py | https://huggingface.co/facebook/wav2vec2-base-960h/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/wav2vec2_conformer/modeling_wav2vec2_conformer.py | CLIP_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_wav2vec2.py | https://huggingface.co/models?filter=wav2vec2 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/pytorch/speech-recognition/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_wav2vec2.py | https://arxiv.org/pdf/2006.11477.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | CLIP_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_wav2vec2.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_wav2vec2.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | CLIP_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_wav2vec2.py | https://arxiv.org/pdf/1611.01144.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | CLIP_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_wav2vec2.py | https://pytorch.org/docs/stable/generated/torch.nn.Conv1d.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/jax-projects/wav2vec2/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_wav2vec2.py | https://arxiv.org/abs/2006.11477 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | 
CLIP_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_wav2vec2.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/jax-projects/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_wav2vec2.py | https://huggingface.co/facebook/wav2vec2-base-960h | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/wav2vec2_conformer/modeling_wav2vec2_conformer.py | CLIP_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_tf_wav2vec2.py | https://huggingface.co/models?filter=wav2vec2 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/hubert/modeling_tf_hubert.py | CLIP_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_tf_wav2vec2.py | https://github.com/tensorflow/tensorflow/issues/9260 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/hubert/modeling_tf_hubert.py | CLIP_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_tf_wav2vec2.py | https://github.com/pytorch/fairseq/blob/e0788f7007a8473a76db573985031f3c94201e79/fairseq/data/data_utils.py#L376 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/hubert/modeling_tf_hubert.py | CLIP_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_tf_wav2vec2.py | https://www.tensorflow.org/addons/api_docs/python/tfa/layers/GroupNormalization | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/hubert/modeling_tf_hubert.py | CLIP_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_tf_wav2vec2.py | https://www.tensorflow.org/probability/api_docs/python/tfp/layers/weight_norm/WeightNorm | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_tf_wav2vec2.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | CLIP_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_tf_wav2vec2.py | https://pytorch.org/docs/stable/generated/torch.nn.Conv1d.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | CLIP_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_tf_wav2vec2.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_tf_wav2vec2.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/pytorch/speech-recognition/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_flax_wav2vec2.py | https://arxiv.org/pdf/2006.11477.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | CLIP_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_flax_wav2vec2.py | 
https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/jax-projects/wav2vec2/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_flax_wav2vec2.py | https://arxiv.org/abs/2006.11477 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_flax_wav2vec2.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_flax_wav2vec2.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_flax_wav2vec2.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_flax_wav2vec2.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_flax_wav2vec2.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/jax-projects/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_flax_wav2vec2.py | https://huggingface.co/facebook/wav2vec2-base-960h | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | CLIP_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_flax_wav2vec2.py | https://arxiv.org/pdf/1611.01144.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | CLIP_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_flax_wav2vec2.py | https://pytorch.org/docs/stable/generated/torch.nn.Conv1d.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/wav2vec2/feature_extraction_wav2vec2.py | CLIP_for_PyTorch/transformers/src/transformers/models/wav2vec2/feature_extraction_wav2vec2.py | https://huggingface.co/models?search=lv60 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/jax-projects/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/wav2vec2/feature_extraction_wav2vec2.py | https://huggingface.co/facebook/wav2vec2-base-960h | 
模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/wav2vec2/feature_extraction_wav2vec2.py | CLIP_for_PyTorch/transformers/src/transformers/models/wav2vec2/feature_extraction_wav2vec2.py | https://huggingface.co/facebook/wav2vec2-large-960h-lv60-self | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/wav2vec2/configuration_wav2vec2.py | CLIP_for_PyTorch/transformers/src/transformers/models/wav2vec2/configuration_wav2vec2.py | https://huggingface.co/facebook/wav2vec2-base-960h/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/wav2vec2_conformer/modeling_wav2vec2_conformer.py | CLIP_for_PyTorch/transformers/src/transformers/models/wav2vec2/configuration_wav2vec2.py | https://huggingface.co/models?filter=wav2vec2 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/jax-projects/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/wav2vec2/configuration_wav2vec2.py | https://huggingface.co/facebook/wav2vec2-base-960h | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | CLIP_for_PyTorch/transformers/src/transformers/models/wav2vec2/configuration_wav2vec2.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/vit_mae/modeling_vit_mae.py | CLIP_for_PyTorch/transformers/src/transformers/models/vit_mae/modeling_vit_mae.py | https://huggingface.co/models?filter=vit_mae | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/vit/modeling_tf_vit.py | CLIP_for_PyTorch/transformers/src/transformers/models/vit_mae/modeling_vit_mae.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/vision_transformer.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/vit_mae/modeling_vit_mae.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/vit_mae/modeling_vit_mae.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/pytorch/semantic-segmentation/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/vit_mae/modeling_vit_mae.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/pytorch/image-pretraining/run_mae.py | CLIP_for_PyTorch/transformers/src/transformers/models/vit_mae/convert_vit_mae_to_pytorch.py | https://github.com/facebookresearch/mae | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/vit_mae/convert_vit_mae_to_pytorch.py | CLIP_for_PyTorch/transformers/src/transformers/models/vit_mae/convert_vit_mae_to_pytorch.py | https://user-images.githubusercontent.com/11435359/147738734-196fd92f-9260-48d5-ba7e-bf103d29364d.jpg | 相关数据集图片地址 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/vit_mae/convert_vit_mae_to_pytorch.py | CLIP_for_PyTorch/transformers/src/transformers/models/vit_mae/convert_vit_mae_to_pytorch.py | https://dl.fbaipublicfiles.com/mae/visualize/mae_visualize_vit_base.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/vit_mae/configuration_vit_mae.py | CLIP_for_PyTorch/transformers/src/transformers/models/vit_mae/configuration_vit_mae.py | https://huggingface.co/facebook/vit-mae-base/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/vit_mae/configuration_vit_mae.py | CLIP_for_PyTorch/transformers/src/transformers/models/vit_mae/configuration_vit_mae.py | https://huggingface.co/models?filter=vit-mae | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/vit_mae/configuration_vit_mae.py | CLIP_for_PyTorch/transformers/src/transformers/models/vit_mae/configuration_vit_mae.py | https://huggingface.co/facebook/vit-mae-base | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/vision/run_image_classification.py | CLIP_for_PyTorch/transformers/src/transformers/models/vit/modeling_vit.py | https://huggingface.co/models?filter=vit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/vision_text_dual_encoder/test_modeling_vision_text_dual_encoder.py | CLIP_for_PyTorch/transformers/src/transformers/models/vit/modeling_vit.py | https://github.com/rwightman/pytorch-image-models/blob/b9bd960a032c75ca6b808ddeed76bee5f3ed4972/timm/models/layers/helpers.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/vit/modeling_tf_vit.py | CLIP_for_PyTorch/transformers/src/transformers/models/vit/modeling_vit.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/vision_transformer.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/vit_hybrid/modeling_vit_hybrid.py | CLIP_for_PyTorch/transformers/src/transformers/models/vit/modeling_vit.py | https://github.com/facebookresearch/dino/blob/de9ee3df6cf39fac952ab558447af1fa1365362a/vision_transformer.py#L174 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/vit_hybrid/modeling_vit_hybrid.py | CLIP_for_PyTorch/transformers/src/transformers/models/vit/modeling_vit.py | https://github.com/facebookresearch/dino/issues/8 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/vit/modeling_vit.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/vit/modeling_vit.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/pytorch/image-pretraining/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/vit/modeling_vit.py | https://arxiv.org/abs/2111.09886 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/pytorch/semantic-segmentation/README.md | 
CLIP_for_PyTorch/transformers/src/transformers/models/vit/modeling_vit.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/vision_text_dual_encoder/test_modeling_vision_text_dual_encoder.py | CLIP_for_PyTorch/transformers/src/transformers/models/vit/modeling_tf_vit.py | https://github.com/rwightman/pytorch-image-models/blob/b9bd960a032c75ca6b808ddeed76bee5f3ed4972/timm/models/layers/helpers.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/vit/modeling_tf_vit.py | CLIP_for_PyTorch/transformers/src/transformers/models/vit/modeling_tf_vit.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/vision_transformer.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/vit_hybrid/modeling_vit_hybrid.py | CLIP_for_PyTorch/transformers/src/transformers/models/vit/modeling_tf_vit.py | https://github.com/facebookresearch/dino/blob/de9ee3df6cf39fac952ab558447af1fa1365362a/vision_transformer.py#L174 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/vit/modeling_tf_vit.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/pytorch/semantic-segmentation/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/vit/modeling_tf_vit.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/vit/modeling_flax_vit.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/vit/modeling_flax_vit.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/vit/modeling_flax_vit.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/vit/modeling_flax_vit.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/vit/modeling_flax_vit.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/pytorch/semantic-segmentation/README.md | 
CLIP_for_PyTorch/transformers/src/transformers/models/vit/modeling_flax_vit.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/pytorch/semantic-segmentation/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/vit/convert_vit_timm_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/pytorch/semantic-segmentation/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/vit/convert_dino_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/vit/configuration_vit.py | CLIP_for_PyTorch/transformers/src/transformers/models/vit/configuration_vit.py | https://huggingface.co/vit-base-patch16-224/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/vision/run_image_classification.py | CLIP_for_PyTorch/transformers/src/transformers/models/vit/configuration_vit.py | https://huggingface.co/models?filter=vit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/tensorflow/image-classification/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/vit/configuration_vit.py | https://huggingface.co/google/vit-base-patch16-224 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/visual_bert/configuration_visual_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/visual_bert/modeling_visual_bert.py | https://huggingface.co/models?filter=visual_bert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/visual_bert/modeling_visual_bert.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/visual_bert/modeling_visual_bert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/visual_bert/modeling_visual_bert.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/visual_bert/configuration_visual_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/visual_bert/configuration_visual_bert.py | https://huggingface.co/uclanlp/visualbert-vqa/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/visual_bert/configuration_visual_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/visual_bert/configuration_visual_bert.py | https://huggingface.co/uclanlp/visualbert-vqa-pre/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/visual_bert/configuration_visual_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/visual_bert/configuration_visual_bert.py | 
https://huggingface.co/uclanlp/visualbert-vqa-coco-pre/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/visual_bert/configuration_visual_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/visual_bert/configuration_visual_bert.py | https://huggingface.co/uclanlp/visualbert-vcr/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/visual_bert/configuration_visual_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/visual_bert/configuration_visual_bert.py | https://huggingface.co/uclanlp/visualbert-vcr-pre/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/visual_bert/configuration_visual_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/visual_bert/configuration_visual_bert.py | https://huggingface.co/uclanlp/visualbert-vcr-coco-pre/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/visual_bert/configuration_visual_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/visual_bert/configuration_visual_bert.py | https://huggingface.co/uclanlp/visualbert-nlvr2/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/visual_bert/configuration_visual_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/visual_bert/configuration_visual_bert.py | https://huggingface.co/uclanlp/visualbert-nlvr2-pre/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/visual_bert/configuration_visual_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/visual_bert/configuration_visual_bert.py | https://huggingface.co/uclanlp/visualbert-nlvr2-coco-pre/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/visual_bert/configuration_visual_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/visual_bert/configuration_visual_bert.py | https://huggingface.co/models?filter=visual_bert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/visual_bert/configuration_visual_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/visual_bert/configuration_visual_bert.py | https://huggingface.co/uclanlp/visualbert-vqa-coco-pre | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/vision-text-dual-encoder.md | CLIP_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_vision_text_dual_encoder.py | https://arxiv.org/abs/2111.07991 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_vision_text_dual_encoder.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/pytorch/semantic-segmentation/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_vision_text_dual_encoder.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/vision_text_dual_encoder/modeling_flax_vision_text_dual_encoder.py | CLIP_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_vision_text_dual_encoder.py | https://farm3.staticflickr.com/2674/5850229113_4fe05d5265_z.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/vision-text-dual-encoder.md | CLIP_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_flax_vision_text_dual_encoder.py | https://arxiv.org/abs/2111.07991 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_flax_vision_text_dual_encoder.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_flax_vision_text_dual_encoder.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_flax_vision_text_dual_encoder.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_flax_vision_text_dual_encoder.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_flax_vision_text_dual_encoder.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/pytorch/semantic-segmentation/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_flax_vision_text_dual_encoder.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/vision_text_dual_encoder/modeling_flax_vision_text_dual_encoder.py | CLIP_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_flax_vision_text_dual_encoder.py | https://farm3.staticflickr.com/2674/5850229113_4fe05d5265_z.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_vision_encoder_decoder.py | https://arxiv.org/abs/1907.12461 | 参考论文地址 | -| 
开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_vision_encoder_decoder.py | https://arxiv.org/abs/2109.10282 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_vision_encoder_decoder.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/trocr.md | CLIP_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_vision_encoder_decoder.py | https://fki.tic.heia-fr.ch/static/img/a01-122-02.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_tf_vision_encoder_decoder.py | https://arxiv.org/abs/1907.12461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_tf_vision_encoder_decoder.py | https://arxiv.org/abs/2109.10282 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_tf_vision_encoder_decoder.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/pytorch/semantic-segmentation/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_tf_vision_encoder_decoder.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_flax_vision_encoder_decoder.py | https://arxiv.org/abs/1907.12461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_flax_vision_encoder_decoder.py | https://arxiv.org/abs/2109.10282 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_flax_vision_encoder_decoder.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/pytorch/semantic-segmentation/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_flax_vision_encoder_decoder.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/trocr/convert_trocr_unilm_to_pytorch.py | CLIP_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/convert_trocr_unilm_to_pytorch.py | https://fki.tic.heia-fr.ch/static/img/a01-122-02-00.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/trocr/convert_trocr_unilm_to_pytorch.py | 
CLIP_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/convert_trocr_unilm_to_pytorch.py | https://fki.tic.heia-fr.ch/static/img/a01-122-02-12.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/trocr/convert_trocr_unilm_to_pytorch.py | CLIP_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/convert_trocr_unilm_to_pytorch.py | https://fki.tic.heia-fr.ch/static/img/a01-122-02-10.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/trocr.md | CLIP_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/convert_trocr_unilm_to_pytorch.py | https://fki.tic.heia-fr.ch/static/img/a01-122-02.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/trocr/convert_trocr_unilm_to_pytorch.py | CLIP_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/convert_trocr_unilm_to_pytorch.py | https://fki.tic.heia-fr.ch/static/img/a01-122.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/trocr/convert_trocr_unilm_to_pytorch.py | CLIP_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/convert_trocr_unilm_to_pytorch.py | https://www.researchgate.net/profile/Dinh-Sang/publication/338099565/figure/fig8/AS:840413229350922@1577381536857/An-receipt-example-in-the-SROIE-2019-dataset_Q640.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/trocr/convert_trocr_unilm_to_pytorch.py | CLIP_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/convert_trocr_unilm_to_pytorch.py | https://layoutlm.blob.core.windows.net/trocr/model_zoo/fairseq/trocr-base-handwritten.pt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/vilt/modeling_vilt.py | CLIP_for_PyTorch/transformers/src/transformers/models/vilt/modeling_vilt.py | https://huggingface.co/models?filter=vilt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/vit/modeling_tf_vit.py | CLIP_for_PyTorch/transformers/src/transformers/models/vilt/modeling_vilt.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/vision_transformer.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/vilt/modeling_vilt.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/vilt/modeling_vilt.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/pytorch/semantic-segmentation/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/vilt/modeling_vilt.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/vilt/modeling_vilt.py | CLIP_for_PyTorch/transformers/src/transformers/models/vilt/modeling_vilt.py | https://github.com/jnhwkim/ban-vqa/blob/master/train.py#L19 | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/vilt/modeling_vilt.py | CLIP_for_PyTorch/transformers/src/transformers/models/vilt/modeling_vilt.py | https://lil.nlp.cornell.edu/nlvr/exs/ex0_0.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/vilt/modeling_vilt.py | CLIP_for_PyTorch/transformers/src/transformers/models/vilt/modeling_vilt.py | https://lil.nlp.cornell.edu/nlvr/exs/ex0_1.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/vilt/modeling_vilt.py | CLIP_for_PyTorch/transformers/src/transformers/models/vilt/convert_vilt_original_to_pytorch.py | https://lil.nlp.cornell.edu/nlvr/exs/ex0_0.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/pytorch/semantic-segmentation/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/vilt/convert_vilt_original_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/vilt/convert_vilt_original_to_pytorch.py | CLIP_for_PyTorch/transformers/src/transformers/models/vilt/convert_vilt_original_to_pytorch.py | https://github.com/dandelin/ViLT/releases/download/200k/vilt_200k_mlm_itm.ckpt | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/vilt/configuration_vilt.py | CLIP_for_PyTorch/transformers/src/transformers/models/vilt/configuration_vilt.py | https://huggingface.co/dandelin/vilt-b32-mlm/blob/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/vilt/configuration_vilt.py | CLIP_for_PyTorch/transformers/src/transformers/models/vilt/configuration_vilt.py | https://huggingface.co/dandelin/vilt-b32-mlm | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deprecated/van/modeling_van.py | CLIP_for_PyTorch/transformers/src/transformers/models/van/modeling_van.py | https://huggingface.co/models?filter=van | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/donut/modeling_donut_swin.py | CLIP_for_PyTorch/transformers/src/transformers/models/van/modeling_van.py | https://github.com/tensorflow/tpu/issues/494#issuecomment-532968956 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deprecated/van/modeling_van.py | CLIP_for_PyTorch/transformers/src/transformers/models/van/modeling_van.py | https://arxiv.org/abs/2106.13797 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/van/modeling_van.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/van.md | CLIP_for_PyTorch/transformers/src/transformers/models/van/convert_van_to_pytorch.py | https://github.com/Visual-Attention-Network/VAN-Classification | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deprecated/van/convert_van_to_pytorch.py | CLIP_for_PyTorch/transformers/src/transformers/models/van/convert_van_to_pytorch.py | https://huggingface.co/Visual-Attention-Network/VAN-Tiny-original/resolve/main/van_tiny_754.pth.tar | 模型相关说明 | 
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deprecated/van/convert_van_to_pytorch.py | CLIP_for_PyTorch/transformers/src/transformers/models/van/convert_van_to_pytorch.py | https://huggingface.co/Visual-Attention-Network/VAN-Small-original/resolve/main/van_small_811.pth.tar | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deprecated/van/convert_van_to_pytorch.py | CLIP_for_PyTorch/transformers/src/transformers/models/van/convert_van_to_pytorch.py | https://huggingface.co/Visual-Attention-Network/VAN-Base-original/resolve/main/van_base_828.pth.tar | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deprecated/van/convert_van_to_pytorch.py | CLIP_for_PyTorch/transformers/src/transformers/models/van/convert_van_to_pytorch.py | https://huggingface.co/Visual-Attention-Network/VAN-Large-original/resolve/main/van_large_839.pth.tar | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deprecated/van/configuration_van.py | CLIP_for_PyTorch/transformers/src/transformers/models/van/configuration_van.py | https://huggingface.co/Visual-Attention-Network/van-base/blob/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | CLIP_for_PyTorch/transformers/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | https://huggingface.co/models?filter=unispeech_sat | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/pytorch/speech-recognition/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | https://arxiv.org/pdf/2006.11477.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | CLIP_for_PyTorch/transformers/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | CLIP_for_PyTorch/transformers/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | https://arxiv.org/pdf/1611.01144.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | CLIP_for_PyTorch/transformers/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | https://pytorch.org/docs/stable/generated/torch.nn.Conv1d.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/jax-projects/wav2vec2/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | https://arxiv.org/abs/2006.11477 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | 
https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | CLIP_for_PyTorch/transformers/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | https://huggingface.co/models?filter=unispeech_sat | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | CLIP_for_PyTorch/transformers/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | CLIP_for_PyTorch/transformers/src/transformers/models/unispeech/modeling_unispeech.py | https://huggingface.co/models?filter=unispeech | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/pytorch/speech-recognition/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/unispeech/modeling_unispeech.py | https://arxiv.org/pdf/2006.11477.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | CLIP_for_PyTorch/transformers/src/transformers/models/unispeech/modeling_unispeech.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/unispeech/modeling_unispeech.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | CLIP_for_PyTorch/transformers/src/transformers/models/unispeech/modeling_unispeech.py | https://arxiv.org/pdf/1611.01144.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | CLIP_for_PyTorch/transformers/src/transformers/models/unispeech/modeling_unispeech.py | https://pytorch.org/docs/stable/generated/torch.nn.Conv1d.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/unispeech/modeling_unispeech.py | https://arxiv.org/abs/2101.07597 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/unispeech/modeling_unispeech.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | CLIP_for_PyTorch/transformers/src/transformers/models/unispeech/configuration_unispeech.py | https://huggingface.co/models?filter=unispeech | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | CLIP_for_PyTorch/transformers/src/transformers/models/unispeech/configuration_unispeech.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/trocr.md | 
CLIP_for_PyTorch/transformers/src/transformers/models/trocr/modeling_trocr.py | https://huggingface.co/models?filter=trocr | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/trocr/modeling_trocr.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/trocr/modeling_trocr.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/trocr.md | CLIP_for_PyTorch/transformers/src/transformers/models/trocr/configuration_trocr.py | https://huggingface.co/models?filter=trocr | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/trocr/configuration_trocr.py | CLIP_for_PyTorch/transformers/src/transformers/models/trocr/configuration_trocr.py | https://huggingface.co/microsoft/trocr-base | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/trocr/configuration_trocr.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/run_transfo_xl.py | CLIP_for_PyTorch/transformers/src/transformers/models/transfo_xl/tokenization_transfo_xl.py | https://github.com/kimiyoung/transformer-xl | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/transfo_xl/tokenization_transfo_xl.py | CLIP_for_PyTorch/transformers/src/transformers/models/transfo_xl/tokenization_transfo_xl.py | https://huggingface.co/transfo-xl-wt103/resolve/main/vocab.pkl | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/transfo_xl/tokenization_transfo_xl.py | CLIP_for_PyTorch/transformers/src/transformers/models/transfo_xl/tokenization_transfo_xl.py | https://huggingface.co/transfo-xl-wt103/resolve/main/corpus.bin | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/run_transfo_xl.py | CLIP_for_PyTorch/transformers/src/transformers/models/transfo_xl/modeling_transfo_xl_utilities.py | https://github.com/kimiyoung/transformer-xl | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/transfo_xl/modeling_transfo_xl_utilities.py | CLIP_for_PyTorch/transformers/src/transformers/models/transfo_xl/modeling_transfo_xl_utilities.py | https://github.com/pytorch/pytorch/blob/dbe6a7a9ff1a364a8706bf5df58a1ca96d2fd9da/torch/nn/modules/adaptive.py#L138 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/transfo_xl/modeling_transfo_xl_utilities.py | CLIP_for_PyTorch/transformers/src/transformers/models/transfo_xl/modeling_transfo_xl_utilities.py | https://github.com/pytorch/pytorch/blob/master/torch/nn/modules/adaptive.p | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/run_transfo_xl.py | CLIP_for_PyTorch/transformers/src/transformers/models/transfo_xl/modeling_transfo_xl.py | 
https://github.com/kimiyoung/transformer-xl | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/transfo_xl/modeling_transfo_xl.py | CLIP_for_PyTorch/transformers/src/transformers/models/transfo_xl/modeling_transfo_xl.py | https://github.com/kimiyoung/transformer-xl/blob/master/pytorch/mem_transformer.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/transfo-xl.md | CLIP_for_PyTorch/transformers/src/transformers/models/transfo_xl/modeling_transfo_xl.py | https://huggingface.co/models?filter=transfo-xl | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/transfo_xl/modeling_transfo_xl.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/transfo_xl/modeling_transfo_xl.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/transfo_xl/modeling_tf_transfo_xl.py | CLIP_for_PyTorch/transformers/src/transformers/models/transfo_xl/modeling_transfo_xl.py | https://github.com/huggingface/transformers/issues/3310 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/transfo-xl.md | CLIP_for_PyTorch/transformers/src/transformers/models/transfo_xl/modeling_tf_transfo_xl.py | https://huggingface.co/models?filter=transfo-xl | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/transfo_xl/modeling_tf_transfo_xl.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/transfo_xl/modeling_tf_transfo_xl.py | CLIP_for_PyTorch/transformers/src/transformers/models/transfo_xl/modeling_tf_transfo_xl.py | https://github.com/huggingface/transformers/issues/3310 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/transfo_xl/convert_transfo_xl_original_tf_checkpoint_to_pytorch.py | CLIP_for_PyTorch/transformers/src/transformers/models/transfo_xl/convert_transfo_xl_original_tf_checkpoint_to_pytorch.py | https://stackoverflow.com/questions/2121874/python-pickling-after-changing-a-modules-directory/2121918#2121918 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/transfo_xl/configuration_transfo_xl.py | CLIP_for_PyTorch/transformers/src/transformers/models/transfo_xl/configuration_transfo_xl.py | https://huggingface.co/transfo-xl-wt103/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/transfo_xl/tokenization_transfo_xl.py | CLIP_for_PyTorch/transformers/src/transformers/models/transfo_xl/configuration_transfo_xl.py | https://huggingface.co/transfo-xl-wt103 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | CLIP_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-large-finetuned-sqa/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | CLIP_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-large-finetuned-wtq/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | CLIP_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-large-finetuned-wikisql-supervised/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | CLIP_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-large-finetuned-tabfact/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | CLIP_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-base-finetuned-sqa/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | CLIP_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-base-finetuned-wtq/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | CLIP_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-base-finetuned-wikisql-supervised/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | CLIP_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-base-finetuned-tabfact/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | CLIP_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-medium-finetuned-sqa/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | CLIP_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-medium-finetuned-wtq/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | CLIP_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-medium-finetuned-wikisql-supervised/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | CLIP_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-medium-finetuned-tabfact/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | CLIP_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | 
https://huggingface.co/google/tapas-small-finetuned-sqa/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | CLIP_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-small-finetuned-wtq/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | CLIP_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-small-finetuned-wikisql-supervised/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | CLIP_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-small-finetuned-tabfact/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | CLIP_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-tiny-finetuned-sqa/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | CLIP_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-tiny-finetuned-wtq/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | CLIP_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-tiny-finetuned-wikisql-supervised/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | CLIP_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-tiny-finetuned-tabfact/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | CLIP_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-mini-finetuned-sqa/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | CLIP_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-mini-finetuned-wtq/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | CLIP_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-mini-finetuned-wikisql-supervised/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | CLIP_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-mini-finetuned-tabfact/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mobilebert/tokenization_mobilebert.py | 
CLIP_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://github.com/huggingface/transformers/issues/328 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | CLIP_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://github.com/google-research/tapas/blob/4908213eb4df7aa988573350278b44c4dbe3f71b/tapas/experiments/prediction_utils.py#L288 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/run_chinese_ref.py | CLIP_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://en.wikipedia.org/wiki/CJK_Unified_Ideographs_ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | CLIP_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://github.com/google-research/tapas/blob/master/tapas/utils/constants.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | CLIP_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://github.com/google-research/tapas/blob/master/tapas/utils/number_utils.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | CLIP_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://github.com/google-research/tapas/blob/master/tapas/utils/text_utils.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | CLIP_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://github.com/Microsoft/DynSP/blob/master/util.py#L414 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | CLIP_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://github.com/Microsoft/DynSP/blob/master/util.py#L293 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | CLIP_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://github.com/google-research/tapas/blob/master/tapas/utils/number_annotation_utils.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/tapas.md | CLIP_for_PyTorch/transformers/src/transformers/models/tapas/modeling_tf_tapas.py | https://github.com/tensorflow/probability | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/modeling_tapas.py | CLIP_for_PyTorch/transformers/src/transformers/models/tapas/modeling_tf_tapas.py | https://huggingface.co/models?filter=tapas | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/tapas/modeling_tf_tapas.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/modeling_tapas.py | CLIP_for_PyTorch/transformers/src/transformers/models/tapas/modeling_tapas.py | https://huggingface.co/models?filter=tapas | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | 
CLIP_for_PyTorch/transformers/src/transformers/models/tapas/modeling_tapas.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/tapas/modeling_tapas.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/tapas/modeling_tapas.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/tapas/modeling_tapas.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/configuration_tapas.py | CLIP_for_PyTorch/transformers/src/transformers/models/tapas/configuration_tapas.py | https://github.com/google-research/tapas/blob/master/tapas/run_task_main.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/configuration_tapas.py | CLIP_for_PyTorch/transformers/src/transformers/models/tapas/configuration_tapas.py | https://github.com/google-research/tapas/blob/master/tapas/utils/hparam_utils.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/configuration_tapas.py | CLIP_for_PyTorch/transformers/src/transformers/models/tapas/configuration_tapas.py | https://huggingface.co/google/tapas-base-finetuned-sqa/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/configuration_tapas.py | CLIP_for_PyTorch/transformers/src/transformers/models/tapas/configuration_tapas.py | https://huggingface.co/google/tapas-base-finetuned-wtq/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/configuration_tapas.py | CLIP_for_PyTorch/transformers/src/transformers/models/tapas/configuration_tapas.py | https://huggingface.co/google/tapas-base-finetuned-wikisql-supervised/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/configuration_tapas.py | CLIP_for_PyTorch/transformers/src/transformers/models/tapas/configuration_tapas.py | https://huggingface.co/google/tapas-base-finetuned-tabfact/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/configuration_tapas.py | CLIP_for_PyTorch/transformers/src/transformers/models/tapas/configuration_tapas.py | https://github.com/google-research/tapas/tree/master | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/tokenization_t5.py | CLIP_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5_fast.py | https://huggingface.co/t5-small/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/tokenization_t5.py | CLIP_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5_fast.py | 
https://huggingface.co/t5-base/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/tokenization_t5.py | CLIP_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5_fast.py | https://huggingface.co/t5-large/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/tokenization_t5.py | CLIP_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5_fast.py | https://huggingface.co/t5-3b/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/tokenization_t5.py | CLIP_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5_fast.py | https://huggingface.co/t5-11b/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/tokenization_t5_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5_fast.py | https://huggingface.co/t5-small/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/tokenization_t5_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5_fast.py | https://huggingface.co/t5-base/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/tokenization_t5_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5_fast.py | https://huggingface.co/t5-large/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/tokenization_t5_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5_fast.py | https://huggingface.co/t5-3b/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/tokenization_t5_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5_fast.py | https://huggingface.co/t5-11b/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/pegasus/tokenization_pegasus_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5_fast.py | https://huggingface.co/docs/tokenizers/python/latest/components.html?highlight=unigram#models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/ko/model_doc/llama.md | CLIP_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5_fast.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/byt5/tokenization_byt5.py | CLIP_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5_fast.py | https://github.com/google-research/text-to-text-transfer-transformer/blob/9fd7b14a769417be33bc6c850f9598764913c833/t5/data/preprocessors.py#L2117 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/tokenization_t5.py | CLIP_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5.py | https://huggingface.co/t5-small/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/tokenization_t5.py | 
CLIP_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5.py | https://huggingface.co/t5-base/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/tokenization_t5.py | CLIP_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5.py | https://huggingface.co/t5-large/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/tokenization_t5.py | CLIP_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5.py | https://huggingface.co/t5-3b/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/tokenization_t5.py | CLIP_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5.py | https://huggingface.co/t5-11b/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/ko/model_doc/llama.md | CLIP_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/byt5/tokenization_byt5.py | CLIP_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5.py | https://github.com/google-research/text-to-text-transfer-transformer/blob/9fd7b14a769417be33bc6c850f9598764913c833/t5/data/preprocessors.py#L2117 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/code_llama/tokenization_code_llama.py | CLIP_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/language-modeling/run_t5_mlm_flax.py | CLIP_for_PyTorch/transformers/src/transformers/models/t5/modeling_tf_t5.py | https://huggingface.co/models?filter=t5 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/switch_transformers/modeling_switch_transformers.py | CLIP_for_PyTorch/transformers/src/transformers/models/t5/modeling_tf_t5.py | https://github.com/tensorflow/mesh/blob/0cb87fe07da627bf0b7e60475d59f95ed6b5be3d/mesh_tensorflow/transformer/transformer_layers.py#L593 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/t5/modeling_tf_t5.py | https://github.com/tensorflow/mesh/blob/8d2465e9bc93129b913b5ccc6a59aa97abd96ec6/mesh_tensorflow/transformer/transformer_layers.py#L270 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/language-modeling/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/t5/modeling_tf_t5.py | https://arxiv.org/abs/1910.10683 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/t5/modeling_tf_t5.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/language-modeling/run_t5_mlm_flax.py | CLIP_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | https://huggingface.co/models?filter=t5 | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/open_model_proposals/ADD_BIG_BIRD.md | CLIP_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | https://medium.com/huggingface/from-tensorflow-to-pytorch-265f40ef2a28 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/switch_transformers/modeling_switch_transformers.py | CLIP_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | https://arxiv.org/abs/1910.07467 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/switch_transformers/modeling_switch_transformers.py | CLIP_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | https://github.com/tensorflow/mesh/blob/0cb87fe07da627bf0b7e60475d59f95ed6b5be3d/mesh_tensorflow/transformer/transformer_layers.py#L593 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/switch_transformers/modeling_switch_transformers.py | CLIP_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | https://github.com/tensorflow/mesh/blob/fa19d69eafc9a482aff0b59ddd96b025c0cb207d/mesh_tensorflow/layers.py#L1624 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/switch_transformers/modeling_switch_transformers.py | CLIP_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | https://github.com/tensorflow/mesh/blob/master/mesh_tensorflow/transformer/transformer_layers.py#L56 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/switch_transformers/modeling_switch_transformers.py | CLIP_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | https://github.com/tensorflow/mesh/blob/fa19d69eafc9a482aff0b59ddd96b025c0cb207d/mesh_tensorflow/layers.py#L89 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/switch_transformers/modeling_switch_transformers.py | CLIP_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | https://github.com/tensorflow/mesh/blob/fa19d69eafc9a482aff0b59ddd96b025c0cb207d/mesh_tensorflow/transformer/attention.py#L136 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/language-modeling/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | https://arxiv.org/abs/1910.10683 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/switch_transformers/modeling_switch_transformers.py | CLIP_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | https://github.com/tensorflow/mesh/blob/fa19d69eafc9a482aff0b59ddd96b025c0cb207d/mesh_tensorflow/transformer/transformer.py#L586 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mt5/modeling_mt5.py | CLIP_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | 
https://github.com/tensorflow/mesh/blob/fa19d69eafc9a482aff0b59ddd96b025c0cb207d/mesh_tensorflow/layers.py#L666 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/switch_transformers/modeling_switch_transformers.py | CLIP_for_PyTorch/transformers/src/transformers/models/t5/modeling_flax_t5.py | https://github.com/tensorflow/mesh/blob/0cb87fe07da627bf0b7e60475d59f95ed6b5be3d/mesh_tensorflow/transformer/transformer_layers.py#L593 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/t5/modeling_flax_t5.py | https://github.com/google/flax/blob/491ce18759622506588784b4fca0e4bf05f8c8cd/flax/linen/attention.py#L252 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/language-modeling/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/t5/modeling_flax_t5.py | https://arxiv.org/abs/1910.13461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/language-modeling/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/t5/modeling_flax_t5.py | https://arxiv.org/abs/1910.10683 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/t5/modeling_flax_t5.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/t5/modeling_flax_t5.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/t5/modeling_flax_t5.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/t5/modeling_flax_t5.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/t5/modeling_flax_t5.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/switch_transformers/modeling_switch_transformers.py | CLIP_for_PyTorch/transformers/src/transformers/models/t5/modeling_flax_t5.py | https://github.com/tensorflow/mesh/blob/fa19d69eafc9a482aff0b59ddd96b025c0cb207d/mesh_tensorflow/transformer/transformer.py#L586 | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/configuration_t5.py | CLIP_for_PyTorch/transformers/src/transformers/models/t5/configuration_t5.py | https://huggingface.co/t5-small/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/configuration_t5.py | CLIP_for_PyTorch/transformers/src/transformers/models/t5/configuration_t5.py | https://huggingface.co/t5-base/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/configuration_t5.py | CLIP_for_PyTorch/transformers/src/transformers/models/t5/configuration_t5.py | https://huggingface.co/t5-large/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/configuration_t5.py | CLIP_for_PyTorch/transformers/src/transformers/models/t5/configuration_t5.py | https://huggingface.co/t5-3b/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/configuration_t5.py | CLIP_for_PyTorch/transformers/src/transformers/models/t5/configuration_t5.py | https://huggingface.co/t5-11b/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/jax-projects/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/t5/configuration_t5.py | https://huggingface.co/t5-small | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/swin/modeling_swin.py | CLIP_for_PyTorch/transformers/src/transformers/models/swin/modeling_swin.py | https://huggingface.co/models?filter=swin | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/swin/modeling_swin.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/swin/modeling_swin.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/pytorch/image-pretraining/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/swin/modeling_swin.py | https://arxiv.org/abs/2111.09886 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/pytorch/semantic-segmentation/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/swin/modeling_swin.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/pytorch/semantic-segmentation/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/swin/convert_swin_timm_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/swin/configuration_swin.py | CLIP_for_PyTorch/transformers/src/transformers/models/swin/configuration_swin.py | https://huggingface.co/microsoft/swin-tiny-patch4-window7-224/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/swin/modeling_swin.py | 
CLIP_for_PyTorch/transformers/src/transformers/models/swin/configuration_swin.py | https://huggingface.co/models?filter=swin | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/swin/configuration_swin.py | CLIP_for_PyTorch/transformers/src/transformers/models/swin/configuration_swin.py | https://huggingface.co/microsoft/swin-tiny-patch4-window7-224 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/squeezebert/tokenization_squeezebert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/squeezebert/tokenization_squeezebert_fast.py | https://huggingface.co/squeezebert/squeezebert-uncased/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/squeezebert/tokenization_squeezebert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/squeezebert/tokenization_squeezebert_fast.py | https://huggingface.co/squeezebert/squeezebert-mnli/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/squeezebert/tokenization_squeezebert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/squeezebert/tokenization_squeezebert_fast.py | https://huggingface.co/squeezebert/squeezebert-mnli-headless/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/squeezebert/tokenization_squeezebert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/squeezebert/tokenization_squeezebert_fast.py | https://huggingface.co/squeezebert/squeezebert-uncased/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/squeezebert/tokenization_squeezebert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/squeezebert/tokenization_squeezebert_fast.py | https://huggingface.co/squeezebert/squeezebert-mnli/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/squeezebert/tokenization_squeezebert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/squeezebert/tokenization_squeezebert_fast.py | https://huggingface.co/squeezebert/squeezebert-mnli-headless/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/squeezebert/tokenization_squeezebert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/squeezebert/tokenization_squeezebert.py | https://huggingface.co/squeezebert/squeezebert-uncased/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/squeezebert/tokenization_squeezebert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/squeezebert/tokenization_squeezebert.py | https://huggingface.co/squeezebert/squeezebert-mnli/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/squeezebert/tokenization_squeezebert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/squeezebert/tokenization_squeezebert.py | https://huggingface.co/squeezebert/squeezebert-mnli-headless/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | 
CLIP_for_PyTorch/transformers/src/transformers/models/squeezebert/modeling_squeezebert.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/squeezebert/modeling_squeezebert.py | https://arxiv.org/abs/2006.11316 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/squeezebert/modeling_squeezebert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/squeezebert/configuration_squeezebert.py | CLIP_for_PyTorch/transformers/src/transformers/models/squeezebert/configuration_squeezebert.py | https://huggingface.co/squeezebert/squeezebert-uncased/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/squeezebert/configuration_squeezebert.py | CLIP_for_PyTorch/transformers/src/transformers/models/squeezebert/configuration_squeezebert.py | https://huggingface.co/squeezebert/squeezebert-mnli/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/squeezebert/configuration_squeezebert.py | CLIP_for_PyTorch/transformers/src/transformers/models/squeezebert/configuration_squeezebert.py | https://huggingface.co/squeezebert/squeezebert-mnli-headless/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/splinter/tokenization_splinter_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/splinter/tokenization_splinter_fast.py | https://huggingface.co/tau/splinter-base/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/splinter/tokenization_splinter_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/splinter/tokenization_splinter_fast.py | https://huggingface.co/tau/splinter-base-qass/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/splinter/tokenization_splinter_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/splinter/tokenization_splinter_fast.py | https://huggingface.co/tau/splinter-large/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/splinter/tokenization_splinter_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/splinter/tokenization_splinter_fast.py | https://huggingface.co/tau/splinter-large-qass/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mobilebert/tokenization_mobilebert.py | CLIP_for_PyTorch/transformers/src/transformers/models/splinter/tokenization_splinter_fast.py | https://github.com/huggingface/transformers/issues/328 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/splinter/tokenization_splinter_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/splinter/tokenization_splinter.py | https://huggingface.co/tau/splinter-base/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/splinter/tokenization_splinter_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/splinter/tokenization_splinter.py | https://huggingface.co/tau/splinter-base-qass/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/splinter/tokenization_splinter_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/splinter/tokenization_splinter.py | https://huggingface.co/tau/splinter-large/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/splinter/tokenization_splinter_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/splinter/tokenization_splinter.py | https://huggingface.co/tau/splinter-large-qass/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mobilebert/tokenization_mobilebert.py | CLIP_for_PyTorch/transformers/src/transformers/models/splinter/tokenization_splinter.py | https://github.com/huggingface/transformers/issues/328 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/run_chinese_ref.py | CLIP_for_PyTorch/transformers/src/transformers/models/splinter/tokenization_splinter.py | https://en.wikipedia.org/wiki/CJK_Unified_Ideographs_ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/splinter/configuration_splinter.py | CLIP_for_PyTorch/transformers/src/transformers/models/splinter/modeling_splinter.py | https://huggingface.co/models?filter=splinter | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/splinter/modeling_splinter.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/splinter/modeling_splinter.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/splinter/modeling_splinter.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/splinter/configuration_splinter.py | CLIP_for_PyTorch/transformers/src/transformers/models/splinter/configuration_splinter.py | https://huggingface.co/tau/splinter-base/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/splinter/configuration_splinter.py | CLIP_for_PyTorch/transformers/src/transformers/models/splinter/configuration_splinter.py | https://huggingface.co/tau/splinter-base-qass/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/splinter/configuration_splinter.py | CLIP_for_PyTorch/transformers/src/transformers/models/splinter/configuration_splinter.py | https://huggingface.co/tau/splinter-large/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/splinter/configuration_splinter.py | CLIP_for_PyTorch/transformers/src/transformers/models/splinter/configuration_splinter.py | https://huggingface.co/tau/splinter-large-qass/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/splinter/configuration_splinter.py | CLIP_for_PyTorch/transformers/src/transformers/models/splinter/configuration_splinter.py | https://huggingface.co/models?filter=splinter | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/splinter/tokenization_splinter_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/splinter/configuration_splinter.py | https://huggingface.co/tau/splinter-base | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/speech_to_text_2/tokenization_speech_to_text_2.py | CLIP_for_PyTorch/transformers/src/transformers/models/speech_to_text_2/tokenization_speech_to_text_2.py | https://huggingface.co/facebook/s2t-wav2vec2-large-en-de/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/speech_to_text_2/tokenization_speech_to_text_2.py | CLIP_for_PyTorch/transformers/src/transformers/models/speech_to_text_2/tokenization_speech_to_text_2.py | https://huggingface.co/facebook/s2t-wav2vec2-large-en-de/resolve/main/tokenizer_config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/speech_to_text_2/tokenization_speech_to_text_2.py | CLIP_for_PyTorch/transformers/src/transformers/models/speech_to_text_2/tokenization_speech_to_text_2.py | https://huggingface.co/facebook/s2t-wav2vec2-large-en-de/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/speech_to_text_2.md | CLIP_for_PyTorch/transformers/src/transformers/models/speech_to_text_2/modeling_speech_to_text_2.py | https://huggingface.co/models?filter=speech2text2 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/speech_to_text_2/modeling_speech_to_text_2.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/speech_to_text_2/modeling_speech_to_text_2.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/speech_to_text_2/configuration_speech_to_text_2.py | CLIP_for_PyTorch/transformers/src/transformers/models/speech_to_text_2/configuration_speech_to_text_2.py | https://huggingface.co/facebook/s2t-wav2vec2-large-en-de/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/speech_to_text_2.md | CLIP_for_PyTorch/transformers/src/transformers/models/speech_to_text_2/configuration_speech_to_text_2.py | https://huggingface.co/models?filter=speech2text2 | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/speech_to_text/configuration_speech_to_text.py | CLIP_for_PyTorch/transformers/src/transformers/models/speech_to_text_2/configuration_speech_to_text_2.py | https://huggingface.co/facebook/s2t-small-librispeech-asr | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/speech_to_text_2/configuration_speech_to_text_2.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/speech_to_text/tokenization_speech_to_text.py | CLIP_for_PyTorch/transformers/src/transformers/models/speech_to_text/tokenization_speech_to_text.py | https://huggingface.co/facebook/s2t-small-librispeech-asr/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/speech_to_text/tokenization_speech_to_text.py | CLIP_for_PyTorch/transformers/src/transformers/models/speech_to_text/tokenization_speech_to_text.py | https://huggingface.co/facebook/s2t-small-librispeech-asr/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/ko/model_doc/llama.md | CLIP_for_PyTorch/transformers/src/transformers/models/speech_to_text/tokenization_speech_to_text.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/code_llama/tokenization_code_llama.py | CLIP_for_PyTorch/transformers/src/transformers/models/speech_to_text/tokenization_speech_to_text.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/speech_to_text.md | CLIP_for_PyTorch/transformers/src/transformers/models/speech_to_text/modeling_tf_speech_to_text.py | https://huggingface.co/models?filter=speech_to_text | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/speech_to_text/modeling_tf_speech_to_text.py | CLIP_for_PyTorch/transformers/src/transformers/models/speech_to_text/modeling_tf_speech_to_text.py | https://arxiv.org/abs/1911.08460 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/pegasus/modeling_tf_pegasus.py | CLIP_for_PyTorch/transformers/src/transformers/models/speech_to_text/modeling_tf_speech_to_text.py | https://github.com/tensorflow/models/blob/a009f4fb9d2fc4949e32192a944688925ef78659/official/transformer/v2/embedding_layer.py#L24 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/speech_to_text/modeling_tf_speech_to_text.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/speech_to_text/modeling_tf_speech_to_text.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/speech_to_text.md | 
CLIP_for_PyTorch/transformers/src/transformers/models/speech_to_text/modeling_speech_to_text.py | https://huggingface.co/models?filter=speech_to_text | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/speech_to_text/modeling_tf_speech_to_text.py | CLIP_for_PyTorch/transformers/src/transformers/models/speech_to_text/modeling_speech_to_text.py | https://arxiv.org/abs/1911.08460 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/speech_to_text/modeling_speech_to_text.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/language-modeling/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/speech_to_text/modeling_speech_to_text.py | https://arxiv.org/abs/1910.13461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/speech_to_text/modeling_speech_to_text.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/speech_to_text/configuration_speech_to_text.py | CLIP_for_PyTorch/transformers/src/transformers/models/speech_to_text/configuration_speech_to_text.py | https://huggingface.co/facebook/s2t-small-librispeech-asr/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/speech_to_text.md | CLIP_for_PyTorch/transformers/src/transformers/models/speech_to_text/configuration_speech_to_text.py | https://huggingface.co/models?filter=speech_to_text | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/speech_to_text/configuration_speech_to_text.py | CLIP_for_PyTorch/transformers/src/transformers/models/speech_to_text/configuration_speech_to_text.py | https://huggingface.co/facebook/s2t-small-librispeech-asr | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/speech_to_text/configuration_speech_to_text.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/speech_encoder_decoder/modeling_speech_encoder_decoder.py | https://arxiv.org/abs/1907.12461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/speech_encoder_decoder/modeling_speech_encoder_decoder.py | https://arxiv.org/abs/2104.06678 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/speech_encoder_decoder/modeling_speech_encoder_decoder.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | 
CLIP_for_PyTorch/transformers/src/transformers/models/speech_encoder_decoder/modeling_flax_speech_encoder_decoder.py | https://arxiv.org/abs/1907.12461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/speech_encoder_decoder/modeling_flax_speech_encoder_decoder.py | https://arxiv.org/abs/2104.06678 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/speech_encoder_decoder/modeling_flax_speech_encoder_decoder.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | CLIP_for_PyTorch/transformers/src/transformers/models/speech_encoder_decoder/modeling_flax_speech_encoder_decoder.py | https://pytorch.org/docs/stable/generated/torch.nn.Conv1d.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/sew_d/modeling_sew_d.py | CLIP_for_PyTorch/transformers/src/transformers/models/sew_d/modeling_sew_d.py | https://huggingface.co/models?filter=sew-d | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | CLIP_for_PyTorch/transformers/src/transformers/models/sew_d/modeling_sew_d.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/sew_d/modeling_sew_d.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | CLIP_for_PyTorch/transformers/src/transformers/models/sew_d/modeling_sew_d.py | https://pytorch.org/docs/stable/generated/torch.nn.Conv1d.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/sew_d/modeling_sew_d.py | https://arxiv.org/abs/2109.06870 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/sew_d/modeling_sew_d.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/sew_d/configuration_sew_d.py | CLIP_for_PyTorch/transformers/src/transformers/models/sew_d/configuration_sew_d.py | https://huggingface.co/asapp/sew-d-tiny-100k/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/sew_d/modeling_sew_d.py | CLIP_for_PyTorch/transformers/src/transformers/models/sew_d/configuration_sew_d.py | https://huggingface.co/models?filter=sew-d | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/sew_d/configuration_sew_d.py | CLIP_for_PyTorch/transformers/src/transformers/models/sew_d/configuration_sew_d.py | https://huggingface.co/asapp/sew-d-tiny-100k | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | CLIP_for_PyTorch/transformers/src/transformers/models/sew_d/configuration_sew_d.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/sew_d/modeling_sew_d.py | CLIP_for_PyTorch/transformers/src/transformers/models/sew/modeling_sew.py | https://huggingface.co/models?filter=sew | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | CLIP_for_PyTorch/transformers/src/transformers/models/sew/modeling_sew.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/sew/modeling_sew.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/sew/modeling_sew.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | CLIP_for_PyTorch/transformers/src/transformers/models/sew/modeling_sew.py | https://pytorch.org/docs/stable/generated/torch.nn.Conv1d.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/sew/modeling_sew.py | https://arxiv.org/abs/2109.06870 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/sew/modeling_sew.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/sew/configuration_sew.py | CLIP_for_PyTorch/transformers/src/transformers/models/sew/configuration_sew.py | https://huggingface.co/asapp/sew-tiny-100k/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/sew_d/modeling_sew_d.py | CLIP_for_PyTorch/transformers/src/transformers/models/sew/configuration_sew.py | https://huggingface.co/models?filter=sew | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/create_dummy_models.py | CLIP_for_PyTorch/transformers/src/transformers/models/sew/configuration_sew.py | https://huggingface.co/asapp/sew-tiny-100k | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | CLIP_for_PyTorch/transformers/src/transformers/models/sew/configuration_sew.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/segformer/modeling_segformer.py | CLIP_for_PyTorch/transformers/src/transformers/models/segformer/modeling_segformer.py | https://huggingface.co/models?filter=segformer | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/tests/models/vision_text_dual_encoder/test_modeling_vision_text_dual_encoder.py | CLIP_for_PyTorch/transformers/src/transformers/models/segformer/modeling_segformer.py | https://github.com/rwightman/pytorch-image-models/blob/b9bd960a032c75ca6b808ddeed76bee5f3ed4972/timm/models/layers/helpers.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/donut/modeling_donut_swin.py | CLIP_for_PyTorch/transformers/src/transformers/models/segformer/modeling_segformer.py | https://github.com/tensorflow/tpu/issues/494#issuecomment-532968956 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/segformer/modeling_segformer.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/segformer/modeling_segformer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/pytorch/semantic-segmentation/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/segformer/modeling_segformer.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/pytorch/semantic-segmentation/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/segformer/convert_segformer_original_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/segformer/configuration_segformer.py | CLIP_for_PyTorch/transformers/src/transformers/models/segformer/configuration_segformer.py | https://huggingface.co/nvidia/segformer-b0-finetuned-ade-512-512/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/segformer/modeling_segformer.py | CLIP_for_PyTorch/transformers/src/transformers/models/segformer/configuration_segformer.py | https://huggingface.co/models?filter=segformer | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/segformer/configuration_segformer.py | https://huggingface.co/nvidia/segformer-b0-finetuned-ade-512-512 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/tokenization_roformer.py | CLIP_for_PyTorch/transformers/src/transformers/models/roformer/tokenization_utils.py | https://pypi.org/project/rjieba/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/tokenization_roformer.py | CLIP_for_PyTorch/transformers/src/transformers/models/roformer/tokenization_roformer_fast.py | https://huggingface.co/junnyu/roformer_chinese_small/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/tokenization_roformer.py | CLIP_for_PyTorch/transformers/src/transformers/models/roformer/tokenization_roformer_fast.py | https://huggingface.co/junnyu/roformer_chinese_base/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/tokenization_roformer.py | CLIP_for_PyTorch/transformers/src/transformers/models/roformer/tokenization_roformer_fast.py | https://huggingface.co/junnyu/roformer_chinese_char_small/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/tokenization_roformer.py | CLIP_for_PyTorch/transformers/src/transformers/models/roformer/tokenization_roformer_fast.py | https://huggingface.co/junnyu/roformer_chinese_char_base/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/tokenization_roformer.py | CLIP_for_PyTorch/transformers/src/transformers/models/roformer/tokenization_roformer_fast.py | https://huggingface.co/junnyu/roformer_small_discriminator/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/tokenization_roformer.py | CLIP_for_PyTorch/transformers/src/transformers/models/roformer/tokenization_roformer_fast.py | https://huggingface.co/junnyu/roformer_small_generator/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/tokenization_roformer.py | CLIP_for_PyTorch/transformers/src/transformers/models/roformer/tokenization_roformer.py | https://huggingface.co/junnyu/roformer_chinese_small/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/tokenization_roformer.py | CLIP_for_PyTorch/transformers/src/transformers/models/roformer/tokenization_roformer.py | https://huggingface.co/junnyu/roformer_chinese_base/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/tokenization_roformer.py | CLIP_for_PyTorch/transformers/src/transformers/models/roformer/tokenization_roformer.py | https://huggingface.co/junnyu/roformer_chinese_char_small/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/tokenization_roformer.py | CLIP_for_PyTorch/transformers/src/transformers/models/roformer/tokenization_roformer.py | https://huggingface.co/junnyu/roformer_chinese_char_base/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/tokenization_roformer.py | CLIP_for_PyTorch/transformers/src/transformers/models/roformer/tokenization_roformer.py | https://huggingface.co/junnyu/roformer_small_discriminator/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/tokenization_roformer.py | CLIP_for_PyTorch/transformers/src/transformers/models/roformer/tokenization_roformer.py | https://huggingface.co/junnyu/roformer_small_generator/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/tokenization_roformer.py | CLIP_for_PyTorch/transformers/src/transformers/models/roformer/tokenization_roformer.py | https://pypi.org/project/rjieba/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mobilebert/tokenization_mobilebert.py | CLIP_for_PyTorch/transformers/src/transformers/models/roformer/tokenization_roformer.py | 
https://github.com/huggingface/transformers/issues/328 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/modeling_roformer.py | CLIP_for_PyTorch/transformers/src/transformers/models/roformer/modeling_tf_roformer.py | https://huggingface.co/models?filter=roformer | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/pegasus/modeling_tf_pegasus.py | CLIP_for_PyTorch/transformers/src/transformers/models/roformer/modeling_tf_roformer.py | https://github.com/tensorflow/models/blob/a009f4fb9d2fc4949e32192a944688925ef78659/official/transformer/v2/embedding_layer.py#L24 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/modeling_roformer.py | CLIP_for_PyTorch/transformers/src/transformers/models/roformer/modeling_tf_roformer.py | https://kexue.fm/archives/8265 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/roformer/modeling_tf_roformer.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/modeling_roformer.py | CLIP_for_PyTorch/transformers/src/transformers/models/roformer/modeling_roformer.py | https://huggingface.co/models?filter=roformer | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/roformer/modeling_roformer.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/modeling_roformer.py | CLIP_for_PyTorch/transformers/src/transformers/models/roformer/modeling_roformer.py | https://kexue.fm/archives/8265 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/roformer/modeling_roformer.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/roformer/modeling_roformer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/roformer/modeling_roformer.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/modeling_roformer.py | CLIP_for_PyTorch/transformers/src/transformers/models/roformer/modeling_flax_roformer.py | https://huggingface.co/models?filter=roformer | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/roformer/modeling_flax_roformer.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/roformer/modeling_flax_roformer.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/roformer/modeling_flax_roformer.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/roformer/modeling_flax_roformer.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/roformer/modeling_flax_roformer.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/configuration_roformer.py | CLIP_for_PyTorch/transformers/src/transformers/models/roformer/configuration_roformer.py | https://huggingface.co/junnyu/roformer_chinese_small/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/configuration_roformer.py | CLIP_for_PyTorch/transformers/src/transformers/models/roformer/configuration_roformer.py | https://huggingface.co/junnyu/roformer_chinese_base/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/configuration_roformer.py | CLIP_for_PyTorch/transformers/src/transformers/models/roformer/configuration_roformer.py | https://huggingface.co/junnyu/roformer_chinese_char_small/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/configuration_roformer.py | CLIP_for_PyTorch/transformers/src/transformers/models/roformer/configuration_roformer.py | https://huggingface.co/junnyu/roformer_chinese_char_base/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/configuration_roformer.py | CLIP_for_PyTorch/transformers/src/transformers/models/roformer/configuration_roformer.py | https://huggingface.co/junnyu/roformer_small_discriminator/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/configuration_roformer.py | CLIP_for_PyTorch/transformers/src/transformers/models/roformer/configuration_roformer.py | https://huggingface.co/junnyu/roformer_small_generator/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/modeling_roformer.py | 
CLIP_for_PyTorch/transformers/src/transformers/models/roformer/configuration_roformer.py | https://huggingface.co/models?filter=roformer | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/configuration_roformer.py | CLIP_for_PyTorch/transformers/src/transformers/models/roformer/configuration_roformer.py | https://huggingface.co/junnyu/roformer_chinese_base | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/roberta-base/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/roberta-large/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/roberta-large-mnli/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/distilroberta-base/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/roberta-base-openai-detector/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/roberta-large-openai-detector/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/roberta-base/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/roberta-large/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/roberta-large-mnli/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/distilroberta-base/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/roberta-base-openai-detector/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/roberta-large-openai-detector/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/roberta-base/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/roberta-large/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/roberta-large-mnli/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/distilroberta-base/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/roberta-base-openai-detector/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/roberta-large-openai-detector/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://docs.python.org/3/library/stdtypes.html#bytes.decode | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta.py | https://huggingface.co/roberta-base/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta.py | https://huggingface.co/roberta-large/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta.py | 
https://huggingface.co/roberta-large-mnli/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta.py | https://huggingface.co/distilroberta-base/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta.py | https://huggingface.co/roberta-base-openai-detector/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta.py | https://huggingface.co/roberta-large-openai-detector/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta.py | https://huggingface.co/roberta-base/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta.py | https://huggingface.co/roberta-large/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta.py | https://huggingface.co/roberta-large-mnli/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta.py | https://huggingface.co/distilroberta-base/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta.py | https://huggingface.co/roberta-base-openai-detector/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta.py | https://huggingface.co/roberta-large-openai-detector/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta.py | https://docs.python.org/3/library/stdtypes.html#bytes.decode | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/roberta.md | CLIP_for_PyTorch/transformers/src/transformers/models/roberta/modeling_tf_roberta.py | https://huggingface.co/models?filter=roberta | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | 
CLIP_for_PyTorch/transformers/src/transformers/models/roberta/modeling_tf_roberta.py | https://github.com/tensorflow/mesh/blob/8d2465e9bc93129b913b5ccc6a59aa97abd96ec6/mesh_tensorflow/transformer/transformer_layers.py#L270 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/roberta/modeling_tf_roberta.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/roberta.md | CLIP_for_PyTorch/transformers/src/transformers/models/roberta/modeling_roberta.py | https://huggingface.co/models?filter=roberta | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/roberta/modeling_roberta.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/roberta/modeling_roberta.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/roberta/modeling_roberta.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/roberta/modeling_flax_roberta.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/roberta/modeling_flax_roberta.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/roberta/modeling_flax_roberta.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/roberta/modeling_flax_roberta.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/roberta/modeling_flax_roberta.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/configuration_roberta.py | 
CLIP_for_PyTorch/transformers/src/transformers/models/roberta/configuration_roberta.py | https://huggingface.co/roberta-base/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/configuration_roberta.py | CLIP_for_PyTorch/transformers/src/transformers/models/roberta/configuration_roberta.py | https://huggingface.co/roberta-large/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/configuration_roberta.py | CLIP_for_PyTorch/transformers/src/transformers/models/roberta/configuration_roberta.py | https://huggingface.co/roberta-large-mnli/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/configuration_roberta.py | CLIP_for_PyTorch/transformers/src/transformers/models/roberta/configuration_roberta.py | https://huggingface.co/distilroberta-base/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/configuration_roberta.py | CLIP_for_PyTorch/transformers/src/transformers/models/roberta/configuration_roberta.py | https://huggingface.co/roberta-base-openai-detector/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/configuration_roberta.py | CLIP_for_PyTorch/transformers/src/transformers/models/roberta/configuration_roberta.py | https://huggingface.co/roberta-large-openai-detector/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deprecated/retribert/tokenization_retribert.py | CLIP_for_PyTorch/transformers/src/transformers/models/retribert/tokenization_retribert_fast.py | https://huggingface.co/yjernite/retribert-base-uncased/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deprecated/retribert/tokenization_retribert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/retribert/tokenization_retribert_fast.py | https://huggingface.co/yjernite/retribert-base-uncased/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deprecated/retribert/tokenization_retribert.py | CLIP_for_PyTorch/transformers/src/transformers/models/retribert/tokenization_retribert.py | https://huggingface.co/yjernite/retribert-base-uncased/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deprecated/retribert/modeling_retribert.py | CLIP_for_PyTorch/transformers/src/transformers/models/retribert/modeling_retribert.py | https://huggingface.co/models?filter=retribert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/retribert/modeling_retribert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/configuration_distilbert.py | CLIP_for_PyTorch/transformers/src/transformers/models/retribert/configuration_retribert.py | https://huggingface.co/distilbert-base-uncased/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/resnet/modeling_resnet.py | CLIP_for_PyTorch/transformers/src/transformers/models/resnet/modeling_resnet.py | https://huggingface.co/models?filter=resnet | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/resnet/modeling_resnet.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/resnet/configuration_resnet.py | CLIP_for_PyTorch/transformers/src/transformers/models/resnet/configuration_resnet.py | https://huggingface.co/microsoft/resnet-50/blob/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/resnet/configuration_resnet.py | CLIP_for_PyTorch/transformers/src/transformers/models/resnet/configuration_resnet.py | https://huggingface.co/microsoft/resnet-50 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/rembert/tokenization_rembert.py | CLIP_for_PyTorch/transformers/src/transformers/models/rembert/tokenization_rembert_fast.py | https://huggingface.co/google/rembert/resolve/main/sentencepiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/rembert/tokenization_rembert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/rembert/tokenization_rembert_fast.py | https://huggingface.co/google/rembert/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/pegasus/tokenization_pegasus_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/rembert/tokenization_rembert_fast.py | https://huggingface.co/docs/tokenizers/python/latest/components.html?highlight=unigram#models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/ko/model_doc/llama.md | CLIP_for_PyTorch/transformers/src/transformers/models/rembert/tokenization_rembert_fast.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/rembert/tokenization_rembert.py | CLIP_for_PyTorch/transformers/src/transformers/models/rembert/tokenization_rembert.py | https://huggingface.co/google/rembert/resolve/main/sentencepiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/ko/model_doc/llama.md | CLIP_for_PyTorch/transformers/src/transformers/models/rembert/tokenization_rembert.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/rembert/configuration_rembert.py | CLIP_for_PyTorch/transformers/src/transformers/models/rembert/modeling_tf_rembert.py | https://huggingface.co/models?filter=rembert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/rembert/modeling_tf_rembert.py | https://github.com/tensorflow/mesh/blob/8d2465e9bc93129b913b5ccc6a59aa97abd96ec6/mesh_tensorflow/transformer/transformer_layers.py#L270 | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/rembert/modeling_tf_rembert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/rembert/configuration_rembert.py | CLIP_for_PyTorch/transformers/src/transformers/models/rembert/modeling_rembert.py | https://huggingface.co/models?filter=rembert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/rembert/modeling_rembert.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/rembert/modeling_rembert.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/rembert/modeling_rembert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/rembert/modeling_rembert.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/rembert/configuration_rembert.py | CLIP_for_PyTorch/transformers/src/transformers/models/rembert/configuration_rembert.py | https://huggingface.co/google/rembert/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/rembert/configuration_rembert.py | CLIP_for_PyTorch/transformers/src/transformers/models/rembert/configuration_rembert.py | https://huggingface.co/models?filter=rembert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/reformer/tokenization_reformer.py | CLIP_for_PyTorch/transformers/src/transformers/models/reformer/tokenization_reformer_fast.py | https://huggingface.co/google/reformer-crime-and-punishment/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/reformer/tokenization_reformer_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/reformer/tokenization_reformer_fast.py | https://huggingface.co/google/reformer-crime-and-punishment/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/pegasus/tokenization_pegasus_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/reformer/tokenization_reformer_fast.py | https://huggingface.co/docs/tokenizers/python/latest/components.html?highlight=unigram#models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/ko/model_doc/llama.md | CLIP_for_PyTorch/transformers/src/transformers/models/reformer/tokenization_reformer_fast.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/reformer/tokenization_reformer.py | CLIP_for_PyTorch/transformers/src/transformers/models/reformer/tokenization_reformer.py | 
https://huggingface.co/google/reformer-crime-and-punishment/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/ko/model_doc/llama.md | CLIP_for_PyTorch/transformers/src/transformers/models/reformer/tokenization_reformer.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/code_llama/tokenization_code_llama.py | CLIP_for_PyTorch/transformers/src/transformers/models/reformer/tokenization_reformer.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/reformer.md | CLIP_for_PyTorch/transformers/src/transformers/models/reformer/modeling_reformer.py | https://huggingface.co/models?filter=reformer | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/reformer/modeling_reformer.py | CLIP_for_PyTorch/transformers/src/transformers/models/reformer/modeling_reformer.py | https://arxiv.org/pdf/1509.02897.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/reformer/modeling_reformer.py | CLIP_for_PyTorch/transformers/src/transformers/models/reformer/modeling_reformer.py | https://arxiv.org/pdf/2001.04451.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/reformer/modeling_reformer.py | CLIP_for_PyTorch/transformers/src/transformers/models/reformer/modeling_reformer.py | https://towardsdatascience.com/illustrating-the-reformer-393575ac6ba0 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/reformer/modeling_reformer.py | CLIP_for_PyTorch/transformers/src/transformers/models/reformer/modeling_reformer.py | https://github.com/lucidrains/reformer-pytorch/blob/master/reformer_pytorch/reversible.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/reformer/modeling_reformer.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/reformer/modeling_reformer.py | https://arxiv.org/abs/2001.04451 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/reformer/modeling_reformer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/reformer/configuration_reformer.py | CLIP_for_PyTorch/transformers/src/transformers/models/reformer/configuration_reformer.py | https://huggingface.co/google/reformer-crime-and-punishment/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/reformer/configuration_reformer.py | CLIP_for_PyTorch/transformers/src/transformers/models/reformer/configuration_reformer.py | https://huggingface.co/google/reformer-enwik8/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm_fast.py | 
CLIP_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://huggingface.co/google/realm-cc-news-pretrained-embedder/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://huggingface.co/google/realm-cc-news-pretrained-encoder/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://huggingface.co/google/realm-cc-news-pretrained-scorer/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://huggingface.co/google/realm-cc-news-pretrained-openqa/aresolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://huggingface.co/google/realm-orqa-nq-openqa/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://huggingface.co/google/realm-orqa-nq-reader/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://huggingface.co/google/realm-orqa-wq-openqa/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://huggingface.co/google/realm-orqa-wq-reader/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://huggingface.co/google/realm-cc-news-pretrained-embedder/resolve/main/tokenizer.jsont | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://huggingface.co/google/realm-cc-news-pretrained-encoder/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://huggingface.co/google/realm-cc-news-pretrained-scorer/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | 
https://huggingface.co/google/realm-cc-news-pretrained-openqa/aresolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://huggingface.co/google/realm-orqa-nq-openqa/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://huggingface.co/google/realm-orqa-nq-reader/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://huggingface.co/google/realm-orqa-wq-openqa/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://huggingface.co/google/realm-orqa-wq-reader/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mobilebert/tokenization_mobilebert.py | CLIP_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://github.com/huggingface/transformers/issues/328 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm.py | https://huggingface.co/google/realm-cc-news-pretrained-embedder/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm.py | https://huggingface.co/google/realm-cc-news-pretrained-encoder/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm.py | https://huggingface.co/google/realm-cc-news-pretrained-scorer/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm.py | https://huggingface.co/google/realm-cc-news-pretrained-openqa/aresolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm.py | https://huggingface.co/google/realm-orqa-nq-openqa/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm.py | https://huggingface.co/google/realm-orqa-nq-reader/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm_fast.py | 
CLIP_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm.py | https://huggingface.co/google/realm-orqa-wq-openqa/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm.py | https://huggingface.co/google/realm-orqa-wq-reader/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mobilebert/tokenization_mobilebert.py | CLIP_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm.py | https://github.com/huggingface/transformers/issues/328 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/run_chinese_ref.py | CLIP_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm.py | https://en.wikipedia.org/wiki/CJK_Unified_Ideographs_ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/configuration_realm.py | CLIP_for_PyTorch/transformers/src/transformers/models/realm/modeling_realm.py | https://huggingface.co/models?filter=realm | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/realm/modeling_realm.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/realm/modeling_realm.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/realm/modeling_realm.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/configuration_realm.py | CLIP_for_PyTorch/transformers/src/transformers/models/realm/configuration_realm.py | https://huggingface.co/google/realm-cc-news-pretrained-embedder/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/configuration_realm.py | CLIP_for_PyTorch/transformers/src/transformers/models/realm/configuration_realm.py | https://huggingface.co/google/realm-cc-news-pretrained-encoder/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/configuration_realm.py | CLIP_for_PyTorch/transformers/src/transformers/models/realm/configuration_realm.py | https://huggingface.co/google/realm-cc-news-pretrained-scorer/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/configuration_realm.py | CLIP_for_PyTorch/transformers/src/transformers/models/realm/configuration_realm.py | https://huggingface.co/google/realm-cc-news-pretrained-openqa/aresolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/configuration_realm.py | CLIP_for_PyTorch/transformers/src/transformers/models/realm/configuration_realm.py | https://huggingface.co/google/realm-orqa-nq-openqa/resolve/main/config.json | 模型参数相关配置 | -| 
开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/configuration_realm.py | CLIP_for_PyTorch/transformers/src/transformers/models/realm/configuration_realm.py | https://huggingface.co/google/realm-orqa-nq-reader/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/configuration_realm.py | CLIP_for_PyTorch/transformers/src/transformers/models/realm/configuration_realm.py | https://huggingface.co/google/realm-orqa-wq-openqa/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/configuration_realm.py | CLIP_for_PyTorch/transformers/src/transformers/models/realm/configuration_realm.py | https://huggingface.co/google/realm-orqa-wq-reader/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/configuration_realm.py | CLIP_for_PyTorch/transformers/src/transformers/models/realm/configuration_realm.py | https://huggingface.co/models?filter=realm | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/realm/configuration_realm.py | https://huggingface.co/google/realm-cc-news-pretrained-embedder | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/rag/retrieval_rag.py | CLIP_for_PyTorch/transformers/src/transformers/models/rag/retrieval_rag.py | https://storage.googleapis.com/huggingface-nlp/datasets/wiki_dpr/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/rag/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/rag/retrieval_rag.py | https://github.com/facebookresearch/DPR | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/rag-end2end-retriever/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/rag/modeling_tf_rag.py | https://arxiv.org/abs/2005.11401 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/rag/modeling_tf_rag.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/rag/modeling_tf_rag.py | CLIP_for_PyTorch/transformers/src/transformers/models/rag/modeling_tf_rag.py | https://arxiv.org/pdf/2005.11401.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/rag/modeling_tf_rag.py | CLIP_for_PyTorch/transformers/src/transformers/models/rag/modeling_tf_rag.py | https://stackoverflow.com/questions/52129909/tensorflow-equivalent-of-torch-gather | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/rag-end2end-retriever/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/rag/modeling_rag.py | https://arxiv.org/abs/2005.11401 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/rag/modeling_rag.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/docs/source/ja/generation_strategies.md | CLIP_for_PyTorch/transformers/src/transformers/models/rag/modeling_rag.py | https://arxiv.org/pdf/1610.02424.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/whisper/modeling_whisper.py | CLIP_for_PyTorch/transformers/src/transformers/models/rag/modeling_rag.py | https://arxiv.org/abs/2010.00904 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/quantization-qdqbert/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/qdqbert/modeling_qdqbert.py | https://github.com/NVIDIA/TensorRT/tree/master/tools/pytorch-quantization | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/bert.md | CLIP_for_PyTorch/transformers/src/transformers/models/qdqbert/modeling_qdqbert.py | https://huggingface.co/models?filter=bert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/qdqbert/modeling_qdqbert.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/qdqbert/modeling_qdqbert.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/qdqbert/modeling_qdqbert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/qdqbert/modeling_qdqbert.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/qdqbert/configuration_qdqbert.py | https://huggingface.co/bert-base-uncased/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/bert.md | CLIP_for_PyTorch/transformers/src/transformers/models/qdqbert/configuration_qdqbert.py | https://huggingface.co/models?filter=bert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/jax-projects/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/qdqbert/configuration_qdqbert.py | https://huggingface.co/bert-base-uncased | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/prophetnet/tokenization_prophetnet.py | CLIP_for_PyTorch/transformers/src/transformers/models/prophetnet/tokenization_prophetnet.py | https://huggingface.co/microsoft/prophetnet-large-uncased/resolve/main/prophetnet.tokenizer | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mobilebert/tokenization_mobilebert.py | CLIP_for_PyTorch/transformers/src/transformers/models/prophetnet/tokenization_prophetnet.py | https://github.com/huggingface/transformers/issues/328 | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/prophetnet.md | CLIP_for_PyTorch/transformers/src/transformers/models/prophetnet/modeling_prophetnet.py | https://huggingface.co/models?filter=prophetnet | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/prophetnet.md | CLIP_for_PyTorch/transformers/src/transformers/models/prophetnet/modeling_prophetnet.py | https://github.com/microsoft/ProphetNet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/prophetnet/modeling_prophetnet.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/prophetnet/configuration_prophetnet.py | CLIP_for_PyTorch/transformers/src/transformers/models/prophetnet/configuration_prophetnet.py | https://huggingface.co/microsoft/prophetnet-large-uncased/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/language-modeling/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/prophetnet/configuration_prophetnet.py | https://arxiv.org/abs/1910.10683 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/poolformer/modeling_poolformer.py | CLIP_for_PyTorch/transformers/src/transformers/models/poolformer/modeling_poolformer.py | https://huggingface.co/models?filter=poolformer | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/donut/modeling_donut_swin.py | CLIP_for_PyTorch/transformers/src/transformers/models/poolformer/modeling_poolformer.py | https://github.com/tensorflow/tpu/issues/494#issuecomment-532968956 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/poolformer/modeling_poolformer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/poolformer.md | CLIP_for_PyTorch/transformers/src/transformers/models/poolformer/convert_poolformer_original_to_pytorch.py | https://github.com/sail-sg/poolformer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/pytorch/semantic-segmentation/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/poolformer/convert_poolformer_original_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/poolformer/configuration_poolformer.py | CLIP_for_PyTorch/transformers/src/transformers/models/poolformer/configuration_poolformer.py | https://huggingface.co/sail/poolformer_s12/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/poolformer/modeling_poolformer.py | CLIP_for_PyTorch/transformers/src/transformers/models/poolformer/configuration_poolformer.py | https://huggingface.co/models?filter=poolformer | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/poolformer/configuration_poolformer.py | 
CLIP_for_PyTorch/transformers/src/transformers/models/poolformer/configuration_poolformer.py | https://huggingface.co/sail/poolformer_s12 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/plbart/tokenization_plbart.py | CLIP_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | https://huggingface.co/uclanlp/plbart-base/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/plbart/tokenization_plbart.py | CLIP_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | https://huggingface.co/uclanlp/plbart-c-cpp-defect-detection/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/plbart/tokenization_plbart.py | CLIP_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | https://huggingface.co/uclanlp/plbart-cs-java/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/plbart/tokenization_plbart.py | CLIP_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | https://huggingface.co/uclanlp/plbart-en_XX-java/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/plbart/tokenization_plbart.py | CLIP_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | https://huggingface.co/uclanlp/plbart-go-en_XX/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/plbart/tokenization_plbart.py | CLIP_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | https://huggingface.co/uclanlp/plbart-java-clone-detection/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/plbart/tokenization_plbart.py | CLIP_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | https://huggingface.co/uclanlp/plbart-java-cs/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/plbart/tokenization_plbart.py | CLIP_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | https://huggingface.co/uclanlp/plbart-java-en_XX/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/plbart/tokenization_plbart.py | CLIP_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | https://huggingface.co/uclanlp/plbart-javascript-en_XX/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/plbart/tokenization_plbart.py | CLIP_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | https://huggingface.co/uclanlp/plbart-php-en_XX/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/plbart/tokenization_plbart.py | CLIP_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | https://huggingface.co/uclanlp/plbart-python-en_XX/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/plbart/tokenization_plbart.py | CLIP_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | https://huggingface.co/uclanlp/plbart-refine-java-medium/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/plbart/tokenization_plbart.py | CLIP_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | https://huggingface.co/uclanlp/plbart-refine-java-small/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/plbart/tokenization_plbart.py | CLIP_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | https://huggingface.co/uclanlp/plbart-ruby-en_XX/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/ko/model_doc/llama.md | CLIP_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/code_llama/tokenization_code_llama.py | CLIP_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/plbart/configuration_plbart.py | CLIP_for_PyTorch/transformers/src/transformers/models/plbart/modeling_plbart.py | https://huggingface.co/models?filter=plbart | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/plbart/modeling_plbart.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/plbart/modeling_plbart.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/plbart/configuration_plbart.py | CLIP_for_PyTorch/transformers/src/transformers/models/plbart/configuration_plbart.py | https://huggingface.co/uclanlp/plbart-base/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/plbart/configuration_plbart.py | CLIP_for_PyTorch/transformers/src/transformers/models/plbart/configuration_plbart.py | https://huggingface.co/models?filter=plbart | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/plbart/configuration_plbart.py | CLIP_for_PyTorch/transformers/src/transformers/models/plbart/configuration_plbart.py | https://huggingface.co/uclanlp/plbart-base | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/plbart/configuration_plbart.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/phobert/tokenization_phobert.py | CLIP_for_PyTorch/transformers/src/transformers/models/phobert/tokenization_phobert.py | https://huggingface.co/vinai/phobert-base/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/phobert/tokenization_phobert.py | CLIP_for_PyTorch/transformers/src/transformers/models/phobert/tokenization_phobert.py | https://huggingface.co/vinai/phobert-large/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/phobert/tokenization_phobert.py | CLIP_for_PyTorch/transformers/src/transformers/models/phobert/tokenization_phobert.py | https://huggingface.co/vinai/phobert-base/resolve/main/bpe.codes | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/phobert/tokenization_phobert.py | CLIP_for_PyTorch/transformers/src/transformers/models/phobert/tokenization_phobert.py | https://huggingface.co/vinai/phobert-large/resolve/main/bpe.codes | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/perceiver/modeling_perceiver.py | CLIP_for_PyTorch/transformers/src/transformers/models/perceiver/modeling_perceiver.py | https://huggingface.co/models?filter=perceiver | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/perceiver/modeling_perceiver.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/perceiver/modeling_perceiver.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/pytorch/semantic-segmentation/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/perceiver/modeling_perceiver.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/perceiver/modeling_perceiver.py | CLIP_for_PyTorch/transformers/src/transformers/models/perceiver/modeling_perceiver.py | https://discuss.pytorch.org/t/is-there-any-layer-like-tensorflows-space-to-depth-function/3487/15 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/perceiver/modeling_perceiver.py | CLIP_for_PyTorch/transformers/src/transformers/models/perceiver/modeling_perceiver.py | https://gist.github.com/sumanmichael/4de9dee93f972d47c80c4ade8e149ea6 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/perceiver/convert_perceiver_haiku_to_pytorch.py | CLIP_for_PyTorch/transformers/src/transformers/models/perceiver/convert_perceiver_haiku_to_pytorch.py | https://storage.googleapis.com/perceiver_io/dalmation.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/perceiver/configuration_perceiver.py | CLIP_for_PyTorch/transformers/src/transformers/models/perceiver/configuration_perceiver.py | https://huggingface.co/deepmind/language-perceiver/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/perceiver/modeling_perceiver.py | CLIP_for_PyTorch/transformers/src/transformers/models/perceiver/configuration_perceiver.py | https://huggingface.co/models?filter=perceiver | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/perceiver/configuration_perceiver.py | CLIP_for_PyTorch/transformers/src/transformers/models/perceiver/configuration_perceiver.py | https://huggingface.co/deepmind/language-perceiver | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/pegasus/tokenization_pegasus.py | CLIP_for_PyTorch/transformers/src/transformers/models/pegasus/tokenization_pegasus_fast.py | https://huggingface.co/google/pegasus-xsum/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/pegasus/tokenization_pegasus_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/pegasus/tokenization_pegasus_fast.py | https://huggingface.co/google/pegasus-xsum/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/pegasus/tokenization_pegasus_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/pegasus/tokenization_pegasus_fast.py | https://huggingface.co/docs/tokenizers/python/latest/components.html?highlight=unigram#models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/ko/model_doc/llama.md | CLIP_for_PyTorch/transformers/src/transformers/models/pegasus/tokenization_pegasus_fast.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/pegasus.md | CLIP_for_PyTorch/transformers/src/transformers/models/pegasus/tokenization_pegasus_fast.py | https://arxiv.org/pdf/1912.08777.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/pegasus/tokenization_pegasus.py | CLIP_for_PyTorch/transformers/src/transformers/models/pegasus/tokenization_pegasus_fast.py | https://github.com/google-research/pegasus/blob/939830367bcf411193d2b5eca2f2f90f3f9260ca/pegasus/ops/pretrain_parsing_ops.cc#L66 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/pegasus/tokenization_pegasus.py | CLIP_for_PyTorch/transformers/src/transformers/models/pegasus/tokenization_pegasus.py | https://huggingface.co/google/pegasus-xsum/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/ko/model_doc/llama.md | CLIP_for_PyTorch/transformers/src/transformers/models/pegasus/tokenization_pegasus.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/pegasus.md | CLIP_for_PyTorch/transformers/src/transformers/models/pegasus/tokenization_pegasus.py | https://arxiv.org/pdf/1912.08777.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/pegasus/tokenization_pegasus.py | CLIP_for_PyTorch/transformers/src/transformers/models/pegasus/tokenization_pegasus.py | https://github.com/google-research/pegasus/blob/939830367bcf411193d2b5eca2f2f90f3f9260ca/pegasus/ops/pretrain_parsing_ops.cc#L66 | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/code_llama/tokenization_code_llama.py | CLIP_for_PyTorch/transformers/src/transformers/models/pegasus/tokenization_pegasus.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/pegasus/modeling_tf_pegasus.py | CLIP_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_tf_pegasus.py | https://github.com/tensorflow/models/blob/a009f4fb9d2fc4949e32192a944688925ef78659/official/transformer/v2/embedding_layer.py#L24 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_tf_pegasus.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_tf_pegasus.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/pegasus.md | CLIP_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_pegasus.py | https://huggingface.co/models?filter=pegasus | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_pegasus.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_pegasus.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_flax_pegasus.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_flax_pegasus.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_flax_pegasus.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_flax_pegasus.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_flax_pegasus.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/language-modeling/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_flax_pegasus.py | https://arxiv.org/abs/1910.13461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_flax_pegasus.py | https://github.com/google/flax/blob/491ce18759622506588784b4fca0e4bf05f8c8cd/flax/linen/attention.py#L252 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_flax_pegasus.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/pegasus/configuration_pegasus.py | CLIP_for_PyTorch/transformers/src/transformers/models/pegasus/configuration_pegasus.py | https://huggingface.co/google/pegasus-large/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/pegasus.md | CLIP_for_PyTorch/transformers/src/transformers/models/pegasus/configuration_pegasus.py | https://huggingface.co/models?filter=pegasus | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/pegasus/configuration_pegasus.py | CLIP_for_PyTorch/transformers/src/transformers/models/pegasus/configuration_pegasus.py | https://huggingface.co/google/pegasus-large | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/pegasus/configuration_pegasus.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/openai/tokenization_openai_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/openai/tokenization_openai_fast.py | https://huggingface.co/openai-gpt/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/openai/tokenization_openai_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/openai/tokenization_openai_fast.py | https://huggingface.co/openai-gpt/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/openai/tokenization_openai_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/openai/tokenization_openai_fast.py | https://huggingface.co/openai-gpt/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/openai/tokenization_openai_fast.py | 
CLIP_for_PyTorch/transformers/src/transformers/models/openai/tokenization_openai.py | https://huggingface.co/openai-gpt/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/openai/tokenization_openai_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/openai/tokenization_openai.py | https://huggingface.co/openai-gpt/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/openai-gpt.md | CLIP_for_PyTorch/transformers/src/transformers/models/openai/modeling_tf_openai.py | https://huggingface.co/models?filter=openai-gpt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/openai/modeling_tf_openai.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/openai-gpt.md | CLIP_for_PyTorch/transformers/src/transformers/models/openai/modeling_openai.py | https://huggingface.co/models?filter=openai-gpt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/openai/modeling_openai.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/openai/modeling_openai.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/openai/configuration_openai.py | CLIP_for_PyTorch/transformers/src/transformers/models/openai/configuration_openai.py | https://huggingface.co/openai-gpt/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/openai/configuration_openai.py | CLIP_for_PyTorch/transformers/src/transformers/models/openai/configuration_openai.py | https://huggingface.co/openai-gpt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/nystromformer/configuration_nystromformer.py | CLIP_for_PyTorch/transformers/src/transformers/models/nystromformer/modeling_nystromformer.py | https://huggingface.co/models?filter=nystromformer | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/nystromformer/modeling_nystromformer.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/nystromformer/modeling_nystromformer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/nystromformer/configuration_nystromformer.py | CLIP_for_PyTorch/transformers/src/transformers/models/nystromformer/configuration_nystromformer.py | https://huggingface.co/uw-madison/nystromformer-512/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/nystromformer/configuration_nystromformer.py | CLIP_for_PyTorch/transformers/src/transformers/models/nystromformer/configuration_nystromformer.py | https://huggingface.co/models?filter=nystromformer | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/nystromformer/configuration_nystromformer.py | CLIP_for_PyTorch/transformers/src/transformers/models/nystromformer/configuration_nystromformer.py | https://huggingface.co/uw-madison/nystromformer-512 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/mt5.md | CLIP_for_PyTorch/transformers/src/transformers/models/mt5/configuration_mt5.py | https://huggingface.co/google/mt5-small | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mpnet/tokenization_mpnet_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/mpnet/tokenization_mpnet_fast.py | https://huggingface.co/microsoft/mpnet-base/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mpnet/tokenization_mpnet_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/mpnet/tokenization_mpnet_fast.py | https://huggingface.co/microsoft/mpnet-base/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mobilebert/tokenization_mobilebert.py | CLIP_for_PyTorch/transformers/src/transformers/models/mpnet/tokenization_mpnet_fast.py | https://github.com/huggingface/transformers/issues/328 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mpnet/tokenization_mpnet_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/mpnet/tokenization_mpnet.py | https://huggingface.co/microsoft/mpnet-base/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mobilebert/tokenization_mobilebert.py | CLIP_for_PyTorch/transformers/src/transformers/models/mpnet/tokenization_mpnet.py | https://github.com/huggingface/transformers/issues/328 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/run_chinese_ref.py | CLIP_for_PyTorch/transformers/src/transformers/models/mpnet/tokenization_mpnet.py | https://en.wikipedia.org/wiki/CJK_Unified_Ideographs_ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/mpnet/modeling_tf_mpnet.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/mpnet/modeling_mpnet.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/mpnet/modeling_mpnet.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mpnet/configuration_mpnet.py | CLIP_for_PyTorch/transformers/src/transformers/models/mpnet/configuration_mpnet.py | 
https://huggingface.co/microsoft/mpnet-base/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mobilebert/tokenization_mobilebert.py | CLIP_for_PyTorch/transformers/src/transformers/models/mobilebert/tokenization_mobilebert_fast.py | https://huggingface.co/google/mobilebert-uncased/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mobilebert/tokenization_mobilebert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/mobilebert/tokenization_mobilebert_fast.py | https://huggingface.co/google/mobilebert-uncased/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mobilebert/tokenization_mobilebert.py | CLIP_for_PyTorch/transformers/src/transformers/models/mobilebert/tokenization_mobilebert.py | https://huggingface.co/google/mobilebert-uncased/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mobilebert/modeling_tf_mobilebert.py | CLIP_for_PyTorch/transformers/src/transformers/models/mobilebert/modeling_tf_mobilebert.py | https://huggingface.co/models?filter=mobilebert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/mobilebert/modeling_tf_mobilebert.py | https://arxiv.org/abs/2004.02984 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/mobilebert/modeling_tf_mobilebert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/mobilebert/modeling_mobilebert.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/mobilebert/modeling_mobilebert.py | https://arxiv.org/abs/2004.02984 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/mobilebert/modeling_mobilebert.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/mobilebert/modeling_mobilebert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mobilebert/modeling_mobilebert.py | CLIP_for_PyTorch/transformers/src/transformers/models/mobilebert/modeling_mobilebert.py | https://arxiv.org/pdf/2004.02984.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mobilebert/configuration_mobilebert.py | CLIP_for_PyTorch/transformers/src/transformers/models/mobilebert/configuration_mobilebert.py | https://huggingface.co/google/mobilebert-uncased/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deprecated/mmbt/modeling_mmbt.py | 
CLIP_for_PyTorch/transformers/src/transformers/models/mmbt/modeling_mmbt.py | https://github.com/facebookresearch/mmbt | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/mmbt/modeling_mmbt.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mluke/tokenization_mluke.py | CLIP_for_PyTorch/transformers/src/transformers/models/mluke/tokenization_mluke.py | https://huggingface.co/studio-ousia/mluke-base/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mluke/tokenization_mluke.py | CLIP_for_PyTorch/transformers/src/transformers/models/mluke/tokenization_mluke.py | https://huggingface.co/studio-ousia/mluke-base/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mluke/tokenization_mluke.py | CLIP_for_PyTorch/transformers/src/transformers/models/mluke/tokenization_mluke.py | https://huggingface.co/studio-ousia/mluke-base/resolve/main/entity_vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/ko/model_doc/llama.md | CLIP_for_PyTorch/transformers/src/transformers/models/mluke/tokenization_mluke.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/code_llama/tokenization_code_llama.py | CLIP_for_PyTorch/transformers/src/transformers/models/mluke/tokenization_mluke.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mluke/tokenization_mluke.py | CLIP_for_PyTorch/transformers/src/transformers/models/mluke/tokenization_mluke.py | https://github.com/huggingface/transformers/pull/2778 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/layoutxlm/tokenization_layoutxlm.py | CLIP_for_PyTorch/transformers/src/transformers/models/mluke/tokenization_mluke.py | https://github.com/huggingface/transformers/pull/2674 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/codeparrot/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/megatron_gpt2/convert_megatron_gpt2_checkpoint.py | https://github.com/NVIDIA/Megatron-LM | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/ja/perf_train_gpu_one.md | CLIP_for_PyTorch/transformers/src/transformers/models/megatron_gpt2/convert_megatron_gpt2_checkpoint.py | https://github.com/microsoft/Megatron-DeepSpeed/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/megatron_bert/convert_megatron_bert_checkpoint.py | CLIP_for_PyTorch/transformers/src/transformers/models/megatron_gpt2/convert_megatron_gpt2_checkpoint.py | https://github.com/NVIDIA/Megatron-LM/blob/v2.4/megatron/checkpointing.py#L209 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/megatron_gpt2/convert_megatron_gpt2_checkpoint.py | CLIP_for_PyTorch/transformers/src/transformers/models/megatron_gpt2/convert_megatron_gpt2_checkpoint.py | https://github.com/huggingface/transformers/issues/13906 | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/megatron_bert/modeling_megatron_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/megatron_bert/modeling_megatron_bert.py | https://huggingface.co/models?filter=megatron_bert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/megatron_bert/modeling_megatron_bert.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/megatron_bert/modeling_megatron_bert.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/megatron_bert/modeling_megatron_bert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/megatron_bert/modeling_megatron_bert.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/codeparrot/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/megatron_bert/convert_megatron_bert_checkpoint.py | https://github.com/NVIDIA/Megatron-LM | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/ja/perf_train_gpu_one.md | CLIP_for_PyTorch/transformers/src/transformers/models/megatron_bert/convert_megatron_bert_checkpoint.py | https://github.com/microsoft/Megatron-DeepSpeed/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/megatron_bert/convert_megatron_bert_checkpoint.py | CLIP_for_PyTorch/transformers/src/transformers/models/megatron_bert/convert_megatron_bert_checkpoint.py | https://github.com/NVIDIA/Megatron-LM/blob/v2.4/megatron/checkpointing.py#L209 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/bert.md | CLIP_for_PyTorch/transformers/src/transformers/models/megatron_bert/configuration_megatron_bert.py | https://huggingface.co/models?filter=bert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/megatron_bert/configuration_megatron_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/megatron_bert/configuration_megatron_bert.py | https://huggingface.co/nvidia/megatron-bert-uncased-345m | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/question-answering/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/megatron_bert/configuration_megatron_bert.py | https://arxiv.org/abs/1803.02155 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/question-answering/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/megatron_bert/configuration_megatron_bert.py | https://arxiv.org/abs/2009.13658 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mbart50/tokenization_mbart50.py | CLIP_for_PyTorch/transformers/src/transformers/models/mbart50/tokenization_mbart50_fast.py 
| https://huggingface.co/facebook/mbart-large-50-one-to-many-mmt/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mbart50/tokenization_mbart50_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/mbart50/tokenization_mbart50_fast.py | https://huggingface.co/facebook/mbart-large-50-one-to-many-mmt/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/layoutxlm/tokenization_layoutxlm_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/mbart50/tokenization_mbart50_fast.py | https://huggingface.co/docs/tokenizers/python/latest/components.html?highlight=BPE#models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mbart50/tokenization_mbart50.py | CLIP_for_PyTorch/transformers/src/transformers/models/mbart50/tokenization_mbart50.py | https://huggingface.co/facebook/mbart-large-50-one-to-many-mmt/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/ko/model_doc/llama.md | CLIP_for_PyTorch/transformers/src/transformers/models/mbart50/tokenization_mbart50.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/code_llama/tokenization_code_llama.py | CLIP_for_PyTorch/transformers/src/transformers/models/mbart50/tokenization_mbart50.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mbart/tokenization_mbart.py | CLIP_for_PyTorch/transformers/src/transformers/models/mbart/tokenization_mbart_fast.py | https://huggingface.co/facebook/mbart-large-en-ro/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mbart/tokenization_mbart.py | CLIP_for_PyTorch/transformers/src/transformers/models/mbart/tokenization_mbart_fast.py | https://huggingface.co/facebook/mbart-large-cc25/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mbart/tokenization_mbart_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/mbart/tokenization_mbart_fast.py | https://huggingface.co/facebook/mbart-large-en-ro/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mbart/tokenization_mbart_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/mbart/tokenization_mbart_fast.py | https://huggingface.co/facebook/mbart-large-cc25/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/layoutxlm/tokenization_layoutxlm_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/mbart/tokenization_mbart_fast.py | https://huggingface.co/docs/tokenizers/python/latest/components.html?highlight=BPE#models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mbart/tokenization_mbart.py | CLIP_for_PyTorch/transformers/src/transformers/models/mbart/tokenization_mbart.py | https://huggingface.co/facebook/mbart-large-en-ro/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/mbart/tokenization_mbart.py | CLIP_for_PyTorch/transformers/src/transformers/models/mbart/tokenization_mbart.py | https://huggingface.co/facebook/mbart-large-cc25/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/ko/model_doc/llama.md | CLIP_for_PyTorch/transformers/src/transformers/models/mbart/tokenization_mbart.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/mbart/modeling_tf_mbart.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/mbart/modeling_tf_mbart.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/mbart.md | CLIP_for_PyTorch/transformers/src/transformers/models/mbart/modeling_mbart.py | https://huggingface.co/models?filter=mbart | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/mbart/modeling_mbart.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/mbart/modeling_mbart.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/mbart/modeling_flax_mbart.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/mbart/modeling_flax_mbart.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/mbart/modeling_flax_mbart.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/mbart/modeling_flax_mbart.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/mbart/modeling_flax_mbart.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/language-modeling/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/mbart/modeling_flax_mbart.py | https://arxiv.org/abs/1910.13461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/mbart/modeling_flax_mbart.py | https://github.com/google/flax/blob/491ce18759622506588784b4fca0e4bf05f8c8cd/flax/linen/attention.py#L252 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/mbart/modeling_flax_mbart.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mbart/configuration_mbart.py | CLIP_for_PyTorch/transformers/src/transformers/models/mbart/configuration_mbart.py | https://huggingface.co/facebook/mbart-large-cc25/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/mbart.md | CLIP_for_PyTorch/transformers/src/transformers/models/mbart/configuration_mbart.py | https://huggingface.co/models?filter=mbart | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/tensorflow/translation/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/mbart/configuration_mbart.py | https://huggingface.co/facebook/mbart-large-cc25 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/mbart/configuration_mbart.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/maskformer/modeling_maskformer.py | CLIP_for_PyTorch/transformers/src/transformers/models/maskformer/modeling_maskformer.py | https://huggingface.co/models?filter=maskformer | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/yolos/modeling_yolos.py | CLIP_for_PyTorch/transformers/src/transformers/models/maskformer/modeling_maskformer.py | https://arxiv.org/abs/1708.02002 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/maskformer/modeling_maskformer.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/maskformer/modeling_maskformer.py | https://arxiv.org/abs/2107.06278 | 参考论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/maskformer/modeling_maskformer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/maskformer/modeling_maskformer.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/pytorch/semantic-segmentation/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/maskformer/modeling_maskformer.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/pytorch/semantic-segmentation/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/maskformer/convert_maskformer_original_pytorch_checkpoint_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/maskformer.md | CLIP_for_PyTorch/transformers/src/transformers/models/maskformer/convert_maskformer_original_pytorch_checkpoint_to_pytorch.py | https://github.com/facebookresearch/MaskFormer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/maskformer/configuration_maskformer.py | CLIP_for_PyTorch/transformers/src/transformers/models/maskformer/configuration_maskformer.py | https://huggingface.co/facebook/maskformer-swin-base-ade/blob/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/maskformer/modeling_maskformer.py | CLIP_for_PyTorch/transformers/src/transformers/models/maskformer/configuration_maskformer.py | https://huggingface.co/models?filter=maskformer | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/pytorch/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/maskformer/configuration_maskformer.py | https://huggingface.co/datasets/scene_parse_150 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/maskformer/configuration_maskformer.py | CLIP_for_PyTorch/transformers/src/transformers/models/maskformer/configuration_maskformer.py | https://huggingface.co/microsoft/swin-base-patch4-window12-384-in22k | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/maskformer/configuration_maskformer.py | https://huggingface.co/facebook/detr-resnet-50 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/marian/tokenization_marian.py | CLIP_for_PyTorch/transformers/src/transformers/models/marian/tokenization_marian.py | https://huggingface.co/Helsinki-NLP/opus-mt-en-de/resolve/main/source.spm | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/marian/tokenization_marian.py | CLIP_for_PyTorch/transformers/src/transformers/models/marian/tokenization_marian.py | https://huggingface.co/Helsinki-NLP/opus-mt-en-de/resolve/main/target.spm | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/marian/tokenization_marian.py | 
CLIP_for_PyTorch/transformers/src/transformers/models/marian/tokenization_marian.py | https://huggingface.co/Helsinki-NLP/opus-mt-en-de/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/marian/tokenization_marian.py | CLIP_for_PyTorch/transformers/src/transformers/models/marian/tokenization_marian.py | https://huggingface.co/Helsinki-NLP/opus-mt-en-de/resolve/main/tokenizer_config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/ko/model_doc/llama.md | CLIP_for_PyTorch/transformers/src/transformers/models/marian/tokenization_marian.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/code_llama/tokenization_code_llama.py | CLIP_for_PyTorch/transformers/src/transformers/models/marian/tokenization_marian.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/pegasus/modeling_tf_pegasus.py | CLIP_for_PyTorch/transformers/src/transformers/models/marian/modeling_tf_marian.py | https://github.com/tensorflow/models/blob/a009f4fb9d2fc4949e32192a944688925ef78659/official/transformer/v2/embedding_layer.py#L24 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/marian/modeling_tf_marian.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/marian/modeling_marian.py | CLIP_for_PyTorch/transformers/src/transformers/models/marian/modeling_tf_marian.py | https://huggingface.co/models?search=Helsinki-NLP | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/marian/modeling_tf_marian.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/marian.md | CLIP_for_PyTorch/transformers/src/transformers/models/marian/modeling_marian.py | https://huggingface.co/models?filter=marian | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/marian/modeling_marian.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/marian/modeling_marian.py | CLIP_for_PyTorch/transformers/src/transformers/models/marian/modeling_marian.py | https://huggingface.co/models?search=Helsinki-NLP | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/marian/modeling_marian.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | 
CLIP_for_PyTorch/transformers/src/transformers/models/marian/modeling_flax_marian.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/marian/modeling_flax_marian.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/marian/modeling_flax_marian.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/marian/modeling_flax_marian.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/marian/modeling_flax_marian.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/language-modeling/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/marian/modeling_flax_marian.py | https://arxiv.org/abs/1910.13461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/marian/modeling_flax_marian.py | https://github.com/google/flax/blob/491ce18759622506588784b4fca0e4bf05f8c8cd/flax/linen/attention.py#L252 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/marian/modeling_flax_marian.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/marian/convert_marian_to_pytorch.py | CLIP_for_PyTorch/transformers/src/transformers/models/marian/convert_marian_to_pytorch.py | https://en.wikipedia.org/wiki/Insular_Celtic_languages | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/marian/convert_marian_to_pytorch.py | CLIP_for_PyTorch/transformers/src/transformers/models/marian/convert_marian_to_pytorch.py | https://github.com/Helsinki-NLP/OPUS-MT-train/blob/master/models/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/marian.md | CLIP_for_PyTorch/transformers/src/transformers/models/marian/convert_marian_to_pytorch.py | https://github.com/Helsinki-NLP/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/seq2seq/romanian_postprocessing.md | 
CLIP_for_PyTorch/transformers/src/transformers/models/marian/convert_marian_to_pytorch.py | git@github.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/marian/convert_marian_tatoeba_to_pytorch.py | CLIP_for_PyTorch/transformers/src/transformers/models/marian/convert_marian_tatoeba_to_pytorch.py | https://datahub.io/core/language-codes/r/language-codes-3b2.csv | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/tatoeba/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/marian/convert_marian_tatoeba_to_pytorch.py | https://cdn-datasets.huggingface.co/language_codes/iso-639-3.csv | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/marian/convert_marian_tatoeba_to_pytorch.py | CLIP_for_PyTorch/transformers/src/transformers/models/marian/convert_marian_tatoeba_to_pytorch.py | https://object.pouta.csc.fi/Tatoeba-MT-models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/seq2seq/romanian_postprocessing.md | CLIP_for_PyTorch/transformers/src/transformers/models/marian/convert_marian_tatoeba_to_pytorch.py | git@github.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/marian/configuration_marian.py | CLIP_for_PyTorch/transformers/src/transformers/models/marian/configuration_marian.py | https://huggingface.co/Helsinki-NLP/opus-mt-en-de/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/marian.md | CLIP_for_PyTorch/transformers/src/transformers/models/marian/configuration_marian.py | https://huggingface.co/models?filter=marian | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/marian/tokenization_marian.py | CLIP_for_PyTorch/transformers/src/transformers/models/marian/configuration_marian.py | https://huggingface.co/Helsinki-NLP/opus-mt-en-de | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/marian/configuration_marian.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/m2m_100/tokenization_m2m_100.py | CLIP_for_PyTorch/transformers/src/transformers/models/m2m_100/tokenization_m2m_100.py | https://huggingface.co/facebook/m2m100_418M/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/m2m_100/tokenization_m2m_100.py | CLIP_for_PyTorch/transformers/src/transformers/models/m2m_100/tokenization_m2m_100.py | https://huggingface.co/facebook/m2m100_1.2B/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/m2m_100/tokenization_m2m_100.py | CLIP_for_PyTorch/transformers/src/transformers/models/m2m_100/tokenization_m2m_100.py | https://huggingface.co/facebook/m2m100_418M/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/m2m_100/tokenization_m2m_100.py | CLIP_for_PyTorch/transformers/src/transformers/models/m2m_100/tokenization_m2m_100.py | 
https://huggingface.co/facebook/m2m100_1.2B/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/m2m_100/tokenization_m2m_100.py | CLIP_for_PyTorch/transformers/src/transformers/models/m2m_100/tokenization_m2m_100.py | https://huggingface.co/facebook/m2m100_418M/resolve/main/tokenizer_config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/m2m_100/tokenization_m2m_100.py | CLIP_for_PyTorch/transformers/src/transformers/models/m2m_100/tokenization_m2m_100.py | https://huggingface.co/facebook/m2m100_1.2B/resolve/main/tokenizer_config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/ko/model_doc/llama.md | CLIP_for_PyTorch/transformers/src/transformers/models/m2m_100/tokenization_m2m_100.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/code_llama/tokenization_code_llama.py | CLIP_for_PyTorch/transformers/src/transformers/models/m2m_100/tokenization_m2m_100.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/m2m_100/configuration_m2m_100.py | CLIP_for_PyTorch/transformers/src/transformers/models/m2m_100/modeling_m2m_100.py | https://huggingface.co/models?filter=m2m_100 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/m2m_100/modeling_m2m_100.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/m2m_100/modeling_m2m_100.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/m2m_100/configuration_m2m_100.py | CLIP_for_PyTorch/transformers/src/transformers/models/m2m_100/configuration_m2m_100.py | https://huggingface.co/facebook/m2m100_418M/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/m2m_100/configuration_m2m_100.py | CLIP_for_PyTorch/transformers/src/transformers/models/m2m_100/configuration_m2m_100.py | https://huggingface.co/models?filter=m2m_100 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/m2m_100/tokenization_m2m_100.py | CLIP_for_PyTorch/transformers/src/transformers/models/m2m_100/configuration_m2m_100.py | https://huggingface.co/facebook/m2m100_418M | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/m2m_100/configuration_m2m_100.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/lxmert/tokenization_lxmert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/lxmert/tokenization_lxmert_fast.py | 
https://huggingface.co/unc-nlp/lxmert-base-uncased/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/lxmert/tokenization_lxmert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/lxmert/tokenization_lxmert_fast.py | https://huggingface.co/unc-nlp/lxmert-base-uncased/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/lxmert/tokenization_lxmert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/lxmert/tokenization_lxmert.py | https://huggingface.co/unc-nlp/lxmert-base-uncased/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/lxmert/modeling_tf_lxmert.py | https://arxiv.org/abs/1908.07490 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/lxmert/modeling_tf_lxmert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/lxmert/modeling_lxmert.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/lxmert/modeling_lxmert.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/lxmert/modeling_lxmert.py | https://arxiv.org/abs/1908.07490 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/lxmert/modeling_lxmert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/luke/tokenization_luke.py | CLIP_for_PyTorch/transformers/src/transformers/models/luke/tokenization_luke.py | https://huggingface.co/studio-ousia/luke-base/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/luke/tokenization_luke.py | CLIP_for_PyTorch/transformers/src/transformers/models/luke/tokenization_luke.py | https://huggingface.co/studio-ousia/luke-large/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/luke/tokenization_luke.py | CLIP_for_PyTorch/transformers/src/transformers/models/luke/tokenization_luke.py | https://huggingface.co/studio-ousia/luke-base/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/luke/tokenization_luke.py | CLIP_for_PyTorch/transformers/src/transformers/models/luke/tokenization_luke.py | https://huggingface.co/studio-ousia/luke-large/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/luke/tokenization_luke.py | CLIP_for_PyTorch/transformers/src/transformers/models/luke/tokenization_luke.py | https://huggingface.co/studio-ousia/luke-base/resolve/main/entity_vocab.json | 模型参数相关配置 | -| 
开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/luke/tokenization_luke.py | CLIP_for_PyTorch/transformers/src/transformers/models/luke/tokenization_luke.py | https://huggingface.co/studio-ousia/luke-large/resolve/main/entity_vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mluke/tokenization_mluke.py | CLIP_for_PyTorch/transformers/src/transformers/models/luke/tokenization_luke.py | https://github.com/huggingface/transformers/pull/2778 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/layoutxlm/tokenization_layoutxlm.py | CLIP_for_PyTorch/transformers/src/transformers/models/luke/tokenization_luke.py | https://github.com/huggingface/transformers/pull/2674 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/luke/modeling_luke.py | CLIP_for_PyTorch/transformers/src/transformers/models/luke/modeling_luke.py | https://huggingface.co/models?filter=luke | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/luke/modeling_luke.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/luke/configuration_luke.py | CLIP_for_PyTorch/transformers/src/transformers/models/luke/configuration_luke.py | https://huggingface.co/studio-ousia/luke-base/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/luke/configuration_luke.py | CLIP_for_PyTorch/transformers/src/transformers/models/luke/configuration_luke.py | https://huggingface.co/studio-ousia/luke-large/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/luke/configuration_luke.py | https://arxiv.org/abs/2010.01057 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer_fast.py | https://huggingface.co/allenai/longformer-base-4096/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer_fast.py | https://huggingface.co/allenai/longformer-large-4096/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer_fast.py | https://huggingface.co/allenai/longformer-large-4096-finetuned-triviaqa/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer_fast.py | https://huggingface.co/allenai/longformer-base-4096-extra.pos.embd.only/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer_fast.py | https://huggingface.co/allenai/longformer-large-4096-extra.pos.embd.only/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer_fast.py | https://huggingface.co/allenai/longformer-base-4096/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer_fast.py | https://huggingface.co/allenai/longformer-large-4096/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer_fast.py | https://huggingface.co/allenai/longformer-large-4096-finetuned-triviaqa/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer_fast.py | https://huggingface.co/allenai/longformer-base-4096-extra.pos.embd.only/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer_fast.py | https://huggingface.co/allenai/longformer-large-4096-extra.pos.embd.only/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer_fast.py | https://huggingface.co/allenai/longformer-base-4096/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer_fast.py | https://huggingface.co/allenai/longformer-large-4096/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer_fast.py | https://huggingface.co/allenai/longformer-large-4096-finetuned-triviaqa/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer_fast.py | https://huggingface.co/allenai/longformer-base-4096-extra.pos.embd.only/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer_fast.py | 
CLIP_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer_fast.py | https://huggingface.co/allenai/longformer-large-4096-extra.pos.embd.only/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer.py | https://huggingface.co/allenai/longformer-base-4096/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer.py | https://huggingface.co/allenai/longformer-large-4096/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer.py | https://huggingface.co/allenai/longformer-large-4096-finetuned-triviaqa/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer.py | https://huggingface.co/allenai/longformer-base-4096-extra.pos.embd.only/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer.py | https://huggingface.co/allenai/longformer-large-4096-extra.pos.embd.only/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer.py | https://huggingface.co/allenai/longformer-base-4096/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer.py | https://huggingface.co/allenai/longformer-large-4096/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer.py | https://huggingface.co/allenai/longformer-large-4096-finetuned-triviaqa/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer.py | https://huggingface.co/allenai/longformer-base-4096-extra.pos.embd.only/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer.py | https://huggingface.co/allenai/longformer-large-4096-extra.pos.embd.only/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/longformer.md | CLIP_for_PyTorch/transformers/src/transformers/models/longformer/modeling_tf_longformer.py | https://huggingface.co/models?filter=longformer | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/longformer/modeling_tf_longformer.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/longformer/modeling_tf_longformer.py | https://arxiv.org/abs/2004.05150 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/longformer.md | CLIP_for_PyTorch/transformers/src/transformers/models/longformer/modeling_longformer.py | https://huggingface.co/models?filter=longformer | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/longformer/modeling_longformer.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/longformer/modeling_longformer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/longformer/modeling_longformer.py | https://arxiv.org/abs/2004.05150 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/configuration_longformer.py | CLIP_for_PyTorch/transformers/src/transformers/models/longformer/configuration_longformer.py | https://huggingface.co/allenai/longformer-base-4096/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/configuration_longformer.py | CLIP_for_PyTorch/transformers/src/transformers/models/longformer/configuration_longformer.py | https://huggingface.co/allenai/longformer-large-4096/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/configuration_longformer.py | CLIP_for_PyTorch/transformers/src/transformers/models/longformer/configuration_longformer.py | https://huggingface.co/allenai/longformer-large-4096-finetuned-triviaqa/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/configuration_longformer.py | CLIP_for_PyTorch/transformers/src/transformers/models/longformer/configuration_longformer.py | https://huggingface.co/allenai/longformer-base-4096-extra.pos.embd.only/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/configuration_longformer.py | CLIP_for_PyTorch/transformers/src/transformers/models/longformer/configuration_longformer.py | https://huggingface.co/allenai/longformer-large-4096-extra.pos.embd.only/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/language-modeling/README.md | 
CLIP_for_PyTorch/transformers/src/transformers/models/longformer/configuration_longformer.py | https://huggingface.co/roberta-base | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/led/tokenization_led_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/led/tokenization_led_fast.py | https://huggingface.co/allenai/led-base-16384/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/led/tokenization_led_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/led/tokenization_led_fast.py | https://huggingface.co/allenai/led-base-16384/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/led/tokenization_led_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/led/tokenization_led_fast.py | https://huggingface.co/allenai/led-base-16384/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/led/tokenization_led_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/led/tokenization_led.py | https://huggingface.co/allenai/led-base-16384/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/led/tokenization_led_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/led/tokenization_led.py | https://huggingface.co/allenai/led-base-16384/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/led/tokenization_led_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/led/tokenization_led.py | https://huggingface.co/allenai/led-base-16384/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/led/modeling_tf_led.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/led/modeling_tf_led.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/led/configuration_led.py | CLIP_for_PyTorch/transformers/src/transformers/models/led/modeling_led.py | https://huggingface.co/models?filter=led | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/led/modeling_led.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/language-modeling/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/led/modeling_led.py | https://arxiv.org/abs/1910.13461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/led/modeling_led.py | https://arxiv.org/abs/2004.05150 | 参考论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/led/modeling_led.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/led/modeling_led.py | CLIP_for_PyTorch/transformers/src/transformers/models/led/modeling_led.py | https://github.com/huggingface/transformers/blob/ac3cb660cad283163f7c73cad511124e845ca388/src/transformers/models/bart/modeling_bart.py#L1153 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/led/configuration_led.py | CLIP_for_PyTorch/transformers/src/transformers/models/led/configuration_led.py | https://huggingface.co/allenai/led-base-16384/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/led/configuration_led.py | CLIP_for_PyTorch/transformers/src/transformers/models/led/configuration_led.py | https://huggingface.co/models?filter=led | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/led/configuration_led.py | CLIP_for_PyTorch/transformers/src/transformers/models/led/configuration_led.py | https://huggingface.co/allenai/led-base-16384 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/led/configuration_led.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/layoutxlm/tokenization_layoutxlm_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/layoutxlm/tokenization_layoutxlm_fast.py | https://huggingface.co/docs/tokenizers/python/latest/components.html?highlight=BPE#models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/ko/model_doc/llama.md | CLIP_for_PyTorch/transformers/src/transformers/models/layoutxlm/tokenization_layoutxlm.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/code_llama/tokenization_code_llama.py | CLIP_for_PyTorch/transformers/src/transformers/models/layoutxlm/tokenization_layoutxlm.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/layoutxlm/tokenization_layoutxlm.py | CLIP_for_PyTorch/transformers/src/transformers/models/layoutxlm/tokenization_layoutxlm.py | https://github.com/huggingface/transformers/pull/2674 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/layoutlmv2/tokenization_layoutlmv2_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/layoutlmv2/tokenization_layoutlmv2_fast.py | https://huggingface.co/microsoft/layoutlmv2-base-uncased/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/layoutlmv2/tokenization_layoutlmv2_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/layoutlmv2/tokenization_layoutlmv2_fast.py | https://huggingface.co/microsoft/layoutlmv2-base-uncased/resolve/main/tokenizer.json | 
模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mobilebert/tokenization_mobilebert.py | CLIP_for_PyTorch/transformers/src/transformers/models/layoutlmv2/tokenization_layoutlmv2_fast.py | https://github.com/huggingface/transformers/issues/328 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/layoutlmv2/tokenization_layoutlmv2_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/layoutlmv2/tokenization_layoutlmv2.py | https://huggingface.co/microsoft/layoutlmv2-base-uncased/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/layoutlmv2/tokenization_layoutlmv2.py | CLIP_for_PyTorch/transformers/src/transformers/models/layoutlmv2/tokenization_layoutlmv2.py | https://huggingface.co/microsoft/layoutlmv2-large-uncased/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/layoutxlm/tokenization_layoutxlm.py | CLIP_for_PyTorch/transformers/src/transformers/models/layoutlmv2/tokenization_layoutlmv2.py | https://github.com/huggingface/transformers/pull/2674 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mobilebert/tokenization_mobilebert.py | CLIP_for_PyTorch/transformers/src/transformers/models/layoutlmv2/tokenization_layoutlmv2.py | https://github.com/huggingface/transformers/issues/328 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/run_chinese_ref.py | CLIP_for_PyTorch/transformers/src/transformers/models/layoutlmv2/tokenization_layoutlmv2.py | https://en.wikipedia.org/wiki/CJK_Unified_Ideographs_ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/layoutlmv2/configuration_layoutlmv2.py | CLIP_for_PyTorch/transformers/src/transformers/models/layoutlmv2/modeling_layoutlmv2.py | https://huggingface.co/models?filter=layoutlmv2 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/switch_transformers/modeling_switch_transformers.py | CLIP_for_PyTorch/transformers/src/transformers/models/layoutlmv2/modeling_layoutlmv2.py | https://github.com/tensorflow/mesh/blob/0cb87fe07da627bf0b7e60475d59f95ed6b5be3d/mesh_tensorflow/transformer/transformer_layers.py#L593 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/layoutlmv2/modeling_layoutlmv2.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/layoutlmv2/modeling_layoutlmv2.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/layoutlm.md | CLIP_for_PyTorch/transformers/src/transformers/models/layoutlmv2/modeling_layoutlmv2.py | https://www.cs.cmu.edu/~aharley/rvl-cdip/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/layoutlmv3/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/layoutlmv2/modeling_layoutlmv2.py | https://guillaumejaume.github.io/FUNS | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/examples/research_projects/layoutlmv3/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/layoutlmv2/modeling_layoutlmv2.py | https://github.com/clovaai/co | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/ko/tasks/document_question_answering.md | CLIP_for_PyTorch/transformers/src/transformers/models/layoutlmv2/modeling_layoutlmv2.py | https://rrc.cvc.uab.es/?ch=17 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/layoutlmv2/configuration_layoutlmv2.py | CLIP_for_PyTorch/transformers/src/transformers/models/layoutlmv2/configuration_layoutlmv2.py | https://huggingface.co/microsoft/layoutlmv2-base-uncased/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/layoutlmv2/configuration_layoutlmv2.py | CLIP_for_PyTorch/transformers/src/transformers/models/layoutlmv2/configuration_layoutlmv2.py | https://huggingface.co/microsoft/layoutlmv2-large-uncased/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/layoutlmv2/configuration_layoutlmv2.py | CLIP_for_PyTorch/transformers/src/transformers/models/layoutlmv2/configuration_layoutlmv2.py | https://huggingface.co/models?filter=layoutlmv2 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/ko/tasks/document_question_answering.md | CLIP_for_PyTorch/transformers/src/transformers/models/layoutlmv2/configuration_layoutlmv2.py | https://huggingface.co/microsoft/layoutlmv2-base-uncased | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/layoutlmv2/configuration_layoutlmv2.py | CLIP_for_PyTorch/transformers/src/transformers/models/layoutlmv2/configuration_layoutlmv2.py | https://github.com/microsoft/unilm/blob/master/layoutlmft/layoutlmft/models/layoutlmv2/detectron2_config.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/layoutlm/tokenization_layoutlm_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/layoutlm/tokenization_layoutlm_fast.py | https://huggingface.co/microsoft/layoutlm-base-uncased/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/layoutlm/tokenization_layoutlm_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/layoutlm/tokenization_layoutlm_fast.py | https://huggingface.co/microsoft/layoutlm-large-uncased/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/layoutlm/tokenization_layoutlm_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/layoutlm/tokenization_layoutlm_fast.py | https://huggingface.co/microsoft/layoutlm-base-uncased/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/layoutlm/tokenization_layoutlm_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/layoutlm/tokenization_layoutlm_fast.py | https://huggingface.co/microsoft/layoutlm-large-uncased/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/layoutlm/tokenization_layoutlm_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/layoutlm/tokenization_layoutlm.py | 
https://huggingface.co/microsoft/layoutlm-base-uncased/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/layoutlm/tokenization_layoutlm_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/layoutlm/tokenization_layoutlm.py | https://huggingface.co/microsoft/layoutlm-large-uncased/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/layoutlm/modeling_tf_layoutlm.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/layoutlm/modeling_layoutlm.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/layoutlm/modeling_layoutlm.py | https://arxiv.org/abs/1912.13318 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/layoutlm/modeling_layoutlm.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/layoutlm.md | CLIP_for_PyTorch/transformers/src/transformers/models/layoutlm/modeling_layoutlm.py | https://www.cs.cmu.edu/~aharley/rvl-cdip/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/layoutlmv3/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/layoutlm/modeling_layoutlm.py | https://guillaumejaume.github.io/FUNSD/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/layoutlm.md | CLIP_for_PyTorch/transformers/src/transformers/models/layoutlm/modeling_layoutlm.py | https://rrc.cvc.uab.es/?ch=13 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/layoutlm/configuration_layoutlm.py | CLIP_for_PyTorch/transformers/src/transformers/models/layoutlm/configuration_layoutlm.py | https://huggingface.co/microsoft/layoutlm-base-uncased/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/layoutlm/configuration_layoutlm.py | CLIP_for_PyTorch/transformers/src/transformers/models/layoutlm/configuration_layoutlm.py | https://huggingface.co/microsoft/layoutlm-large-uncased/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/layoutlm/tokenization_layoutlm_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/layoutlm/configuration_layoutlm.py | https://huggingface.co/microsoft/layoutlm-base-uncased | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/imagegpt/modeling_imagegpt.py | CLIP_for_PyTorch/transformers/src/transformers/models/imagegpt/modeling_imagegpt.py | https://huggingface.co/models?filter=imagegpt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/imagegpt/modeling_imagegpt.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 
开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/imagegpt/modeling_imagegpt.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/gpt2.md | CLIP_for_PyTorch/transformers/src/transformers/models/imagegpt/modeling_imagegpt.py | https://openai.com/blog/better-language-models/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/imagegpt/modeling_imagegpt.py | CLIP_for_PyTorch/transformers/src/transformers/models/imagegpt/modeling_imagegpt.py | https://github.com/NVIDIA/Megatron-LM/blob/main/megatron/model/gpt_model.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/imagegpt/modeling_imagegpt.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/pytorch/semantic-segmentation/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/imagegpt/modeling_imagegpt.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/ibert/modeling_ibert.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/ibert/modeling_ibert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/ibert/modeling_ibert.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/ibert/configuration_ibert.py | CLIP_for_PyTorch/transformers/src/transformers/models/ibert/configuration_ibert.py | https://huggingface.co/kssteven/ibert-roberta-base/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/ibert/configuration_ibert.py | CLIP_for_PyTorch/transformers/src/transformers/models/ibert/configuration_ibert.py | https://huggingface.co/kssteven/ibert-roberta-large/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/ibert/configuration_ibert.py | CLIP_for_PyTorch/transformers/src/transformers/models/ibert/configuration_ibert.py | https://huggingface.co/kssteven/ibert-roberta-large-mnli/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/question-answering/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/ibert/configuration_ibert.py | https://arxiv.org/abs/1803.02155 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/question-answering/README.md | 
CLIP_for_PyTorch/transformers/src/transformers/models/ibert/configuration_ibert.py | https://arxiv.org/abs/2009.13658 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/hubert/modeling_hubert.py | CLIP_for_PyTorch/transformers/src/transformers/models/hubert/modeling_tf_hubert.py | https://huggingface.co/models?filter=hubert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/hubert/modeling_tf_hubert.py | CLIP_for_PyTorch/transformers/src/transformers/models/hubert/modeling_tf_hubert.py | https://github.com/tensorflow/tensorflow/issues/9260 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/hubert/modeling_tf_hubert.py | CLIP_for_PyTorch/transformers/src/transformers/models/hubert/modeling_tf_hubert.py | https://github.com/pytorch/fairseq/blob/e0788f7007a8473a76db573985031f3c94201e79/fairseq/data/data_utils.py#L376 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/hubert/modeling_tf_hubert.py | CLIP_for_PyTorch/transformers/src/transformers/models/hubert/modeling_tf_hubert.py | https://www.tensorflow.org/addons/api_docs/python/tfa/layers/GroupNormalization | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/hubert/modeling_tf_hubert.py | CLIP_for_PyTorch/transformers/src/transformers/models/hubert/modeling_tf_hubert.py | https://www.tensorflow.org/probability/api_docs/python/tfp/layers/weight_norm/WeightNorm | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/hubert/modeling_tf_hubert.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | CLIP_for_PyTorch/transformers/src/transformers/models/hubert/modeling_tf_hubert.py | https://pytorch.org/docs/stable/generated/torch.nn.Conv1d.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | CLIP_for_PyTorch/transformers/src/transformers/models/hubert/modeling_tf_hubert.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/hubert/modeling_tf_hubert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/hubert/modeling_hubert.py | CLIP_for_PyTorch/transformers/src/transformers/models/hubert/modeling_hubert.py | https://huggingface.co/models?filter=hubert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | CLIP_for_PyTorch/transformers/src/transformers/models/hubert/modeling_hubert.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/hubert/modeling_hubert.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 
开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/hubert/modeling_hubert.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | CLIP_for_PyTorch/transformers/src/transformers/models/hubert/modeling_hubert.py | https://pytorch.org/docs/stable/generated/torch.nn.Conv1d.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/hubert/modeling_hubert.py | https://arxiv.org/abs/2106.07447 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/hubert/modeling_hubert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/pytorch/audio-classification/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/hubert/modeling_hubert.py | https://huggingface.co/facebook/hubert-base-ls960 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/hubert/configuration_hubert.py | CLIP_for_PyTorch/transformers/src/transformers/models/hubert/configuration_hubert.py | https://huggingface.co/facebook/hubert-base-ls960/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/hubert/modeling_hubert.py | CLIP_for_PyTorch/transformers/src/transformers/models/hubert/configuration_hubert.py | https://huggingface.co/models?filter=hubert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/pytorch/audio-classification/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/hubert/configuration_hubert.py | https://huggingface.co/facebook/hubert-base-ls960 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | CLIP_for_PyTorch/transformers/src/transformers/models/hubert/configuration_hubert.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/herbert/tokenization_herbert.py | CLIP_for_PyTorch/transformers/src/transformers/models/herbert/tokenization_herbert_fast.py | https://huggingface.co/allegro/herbert-base-cased/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/herbert/tokenization_herbert.py | CLIP_for_PyTorch/transformers/src/transformers/models/herbert/tokenization_herbert_fast.py | https://huggingface.co/allegro/herbert-base-cased/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/herbert/tokenization_herbert.py | CLIP_for_PyTorch/transformers/src/transformers/models/herbert/tokenization_herbert.py | https://huggingface.co/allegro/herbert-base-cased/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/herbert/tokenization_herbert.py | CLIP_for_PyTorch/transformers/src/transformers/models/herbert/tokenization_herbert.py | 
https://huggingface.co/allegro/herbert-base-cased/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gptj/modeling_tf_gptj.py | CLIP_for_PyTorch/transformers/src/transformers/models/gptj/modeling_gptj.py | https://huggingface.co/models?filter=gptj | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/gptj/modeling_gptj.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/gptj/modeling_gptj.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/codegen/modeling_codegen.py | CLIP_for_PyTorch/transformers/src/transformers/models/gptj/modeling_gptj.py | https://github.com/EleutherAI/gpt-neo/blob/89ce74164da2fb16179106f54e2269b5da8db333/models/gpt2/gpt2.py#L179 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/gptj/modeling_flax_gptj.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/gptj/modeling_flax_gptj.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/gptj/modeling_flax_gptj.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/gptj/modeling_flax_gptj.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/gptj/modeling_flax_gptj.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/gptj/modeling_flax_gptj.py | https://github.com/google/flax/blob/491ce18759622506588784b4fca0e4bf05f8c8cd/flax/linen/attention.py#L252 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gptj/configuration_gptj.py | 
CLIP_for_PyTorch/transformers/src/transformers/models/gptj/configuration_gptj.py | https://huggingface.co/EleutherAI/gpt-j-6B/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gptj/configuration_gptj.py | CLIP_for_PyTorch/transformers/src/transformers/models/gptj/configuration_gptj.py | https://huggingface.co/models?filter=gpt_j | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/es/tasks/language_modeling.md | CLIP_for_PyTorch/transformers/src/transformers/models/gptj/configuration_gptj.py | https://huggingface.co/EleutherAI/gpt-j-6B | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2.py | CLIP_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://huggingface.co/gpt2/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2.py | CLIP_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://huggingface.co/gpt2-medium/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2.py | CLIP_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://huggingface.co/gpt2-large/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2.py | CLIP_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://huggingface.co/gpt2-xl/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2.py | CLIP_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://huggingface.co/distilgpt2/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2.py | CLIP_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://huggingface.co/gpt2/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2.py | CLIP_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://huggingface.co/gpt2-medium/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2.py | CLIP_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://huggingface.co/gpt2-large/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2.py | CLIP_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://huggingface.co/gpt2-xl/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2.py | CLIP_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://huggingface.co/distilgpt2/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2_fast.py | 
CLIP_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://huggingface.co/gpt2/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://huggingface.co/gpt2-medium/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://huggingface.co/gpt2-large/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://huggingface.co/gpt2-xl/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://huggingface.co/distilgpt2/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://docs.python.org/3/library/stdtypes.html#bytes.decode | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2.py | CLIP_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2.py | https://huggingface.co/gpt2/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2.py | CLIP_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2.py | https://huggingface.co/gpt2-medium/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2.py | CLIP_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2.py | https://huggingface.co/gpt2-large/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2.py | CLIP_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2.py | https://huggingface.co/gpt2-xl/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2.py | CLIP_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2.py | https://huggingface.co/distilgpt2/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2.py | CLIP_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2.py | https://huggingface.co/gpt2/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2.py | CLIP_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2.py | https://huggingface.co/gpt2-medium/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2.py | CLIP_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2.py | https://huggingface.co/gpt2-large/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2.py | CLIP_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2.py | https://huggingface.co/gpt2-xl/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2.py | CLIP_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2.py | https://huggingface.co/distilgpt2/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2.py | https://docs.python.org/3/library/stdtypes.html#bytes.decode | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/gpt2.md | CLIP_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_tf_gpt2.py | https://huggingface.co/models?filter=gpt2 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_tf_gpt2.py | https://github.com/tensorflow/mesh/blob/8d2465e9bc93129b913b5ccc6a59aa97abd96ec6/mesh_tensorflow/transformer/transformer_layers.py#L270 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_tf_gpt2.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/gpt2.md | CLIP_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_gpt2.py | https://huggingface.co/models?filter=gpt2 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_gpt2.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_gpt2.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/gpt2.md | CLIP_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_gpt2.py | https://openai.com/blog/better-language-models/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/imagegpt/modeling_imagegpt.py | CLIP_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_gpt2.py | https://github.com/NVIDIA/Megatron-LM/blob/main/megatron/model/gpt_model.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_gpt2.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_flax_gpt2.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_flax_gpt2.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_flax_gpt2.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_flax_gpt2.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_flax_gpt2.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_flax_gpt2.py | https://github.com/google/flax/blob/491ce18759622506588784b4fca0e4bf05f8c8cd/flax/linen/attention.py#L252 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/configuration_gpt2.py | CLIP_for_PyTorch/transformers/src/transformers/models/gpt2/configuration_gpt2.py | https://huggingface.co/gpt2/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/configuration_gpt2.py | CLIP_for_PyTorch/transformers/src/transformers/models/gpt2/configuration_gpt2.py | https://huggingface.co/gpt2-medium/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/configuration_gpt2.py | CLIP_for_PyTorch/transformers/src/transformers/models/gpt2/configuration_gpt2.py | https://huggingface.co/gpt2-large/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/configuration_gpt2.py | CLIP_for_PyTorch/transformers/src/transformers/models/gpt2/configuration_gpt2.py | https://huggingface.co/gpt2-xl/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/configuration_gpt2.py | CLIP_for_PyTorch/transformers/src/transformers/models/gpt2/configuration_gpt2.py | https://huggingface.co/distilgpt2/resolve/main/config.json 
| 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/language-modeling/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/gpt2/configuration_gpt2.py | https://huggingface.co/gpt2 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt_neox/configuration_gpt_neox.py | CLIP_for_PyTorch/transformers/src/transformers/models/gpt_neo/modeling_gpt_neo.py | https://huggingface.co/models?filter=gpt_neo | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/gpt_neo/modeling_gpt_neo.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/gpt_neo/modeling_gpt_neo.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/gpt_neo/modeling_gpt_neo.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/codegen/modeling_codegen.py | CLIP_for_PyTorch/transformers/src/transformers/models/gpt_neo/modeling_gpt_neo.py | https://github.com/EleutherAI/gpt-neo/blob/89ce74164da2fb16179106f54e2269b5da8db333/models/gpt2/gpt2.py#L179 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/gpt_neo/modeling_flax_gpt_neo.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/gpt_neo/modeling_flax_gpt_neo.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/gpt_neo/modeling_flax_gpt_neo.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/gpt_neo/modeling_flax_gpt_neo.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/gpt_neo/modeling_flax_gpt_neo.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/gpt_neo/modeling_flax_gpt_neo.py | https://github.com/google/flax/blob/491ce18759622506588784b4fca0e4bf05f8c8cd/flax/linen/attention.py#L252 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt_neo/configuration_gpt_neo.py | CLIP_for_PyTorch/transformers/src/transformers/models/gpt_neo/configuration_gpt_neo.py | https://huggingface.co/EleutherAI/gpt-neo-1.3B/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt_neox/configuration_gpt_neox.py | CLIP_for_PyTorch/transformers/src/transformers/models/gpt_neo/configuration_gpt_neo.py | https://huggingface.co/models?filter=gpt_neo | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt_neo/configuration_gpt_neo.py | CLIP_for_PyTorch/transformers/src/transformers/models/gpt_neo/configuration_gpt_neo.py | https://huggingface.co/EleutherAI/gpt-neo-1.3B | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/small/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/small-base/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/medium/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/medium-base/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/intermediate/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/intermediate-base/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/large/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | 
CLIP_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/large-base/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/xlarge/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/xlarge-base/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/small/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/small-base/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/medium/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/medium-base/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/intermediate/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/intermediate-base/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/large/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/large-base/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | 
https://huggingface.co/funnel-transformer/xlarge/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/xlarge-base/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel.py | https://huggingface.co/funnel-transformer/small/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel.py | https://huggingface.co/funnel-transformer/small-base/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel.py | https://huggingface.co/funnel-transformer/medium/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel.py | https://huggingface.co/funnel-transformer/medium-base/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel.py | https://huggingface.co/funnel-transformer/intermediate/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel.py | https://huggingface.co/funnel-transformer/intermediate-base/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel.py | https://huggingface.co/funnel-transformer/large/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel.py | https://huggingface.co/funnel-transformer/large-base/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel.py | https://huggingface.co/funnel-transformer/xlarge/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel.py | https://huggingface.co/funnel-transformer/xlarge-base/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | 
CLIP_for_PyTorch/transformers/src/transformers/models/funnel/modeling_tf_funnel.py | https://arxiv.org/abs/2006.03236 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/funnel/modeling_tf_funnel.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/funnel/modeling_funnel.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/funnel/modeling_funnel.py | https://arxiv.org/abs/2006.03236 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/funnel/modeling_funnel.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/configuration_funnel.py | CLIP_for_PyTorch/transformers/src/transformers/models/funnel/configuration_funnel.py | https://huggingface.co/funnel-transformer/small/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/configuration_funnel.py | CLIP_for_PyTorch/transformers/src/transformers/models/funnel/configuration_funnel.py | https://huggingface.co/funnel-transformer/small-base/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/configuration_funnel.py | CLIP_for_PyTorch/transformers/src/transformers/models/funnel/configuration_funnel.py | https://huggingface.co/funnel-transformer/medium/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/configuration_funnel.py | CLIP_for_PyTorch/transformers/src/transformers/models/funnel/configuration_funnel.py | https://huggingface.co/funnel-transformer/medium-base/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/configuration_funnel.py | CLIP_for_PyTorch/transformers/src/transformers/models/funnel/configuration_funnel.py | https://huggingface.co/funnel-transformer/intermediate/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/configuration_funnel.py | CLIP_for_PyTorch/transformers/src/transformers/models/funnel/configuration_funnel.py | https://huggingface.co/funnel-transformer/intermediate-base/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/configuration_funnel.py | CLIP_for_PyTorch/transformers/src/transformers/models/funnel/configuration_funnel.py | https://huggingface.co/funnel-transformer/large/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/configuration_funnel.py | CLIP_for_PyTorch/transformers/src/transformers/models/funnel/configuration_funnel.py | https://huggingface.co/funnel-transformer/large-base/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/configuration_funnel.py | CLIP_for_PyTorch/transformers/src/transformers/models/funnel/configuration_funnel.py | https://huggingface.co/funnel-transformer/xlarge/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/configuration_funnel.py | CLIP_for_PyTorch/transformers/src/transformers/models/funnel/configuration_funnel.py | https://huggingface.co/funnel-transformer/xlarge-base/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/funnel/configuration_funnel.py | https://huggingface.co/funnel-transformer/small | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/fsmt/tokenization_fsmt.py | CLIP_for_PyTorch/transformers/src/transformers/models/fsmt/tokenization_fsmt.py | https://huggingface.co/stas/tiny-wmt19-en-de/resolve/main/vocab-src.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/fsmt/tokenization_fsmt.py | CLIP_for_PyTorch/transformers/src/transformers/models/fsmt/tokenization_fsmt.py | https://huggingface.co/stas/tiny-wmt19-en-de/resolve/main/vocab-tgt.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/fsmt/tokenization_fsmt.py | CLIP_for_PyTorch/transformers/src/transformers/models/fsmt/tokenization_fsmt.py | https://huggingface.co/stas/tiny-wmt19-en-de/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | CLIP_for_PyTorch/transformers/src/transformers/models/fsmt/tokenization_fsmt.py | https://github.com/moses-smt/mosesdecoder/blob/master/scripts/tokenizer/replace-unicode-punctuation.perl | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | CLIP_for_PyTorch/transformers/src/transformers/models/fsmt/tokenization_fsmt.py | https://github.com/moses-smt/mosesdecoder/blob/master/scripts/tokenizer/remove-non-printing-char.perl | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | CLIP_for_PyTorch/transformers/src/transformers/models/fsmt/tokenization_fsmt.py | https://github.com/alvations/sacremoses | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/fsmt.md | CLIP_for_PyTorch/transformers/src/transformers/models/fsmt/modeling_fsmt.py | https://github.com/pytorch/fairseq/tree/master/examples/wmt19 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/fsmt.md | CLIP_for_PyTorch/transformers/src/transformers/models/fsmt/modeling_fsmt.py | https://arxiv.org/abs/1907.06616 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/fsmt/modeling_fsmt.py | CLIP_for_PyTorch/transformers/src/transformers/models/fsmt/modeling_fsmt.py | https://huggingface.co/models?filter=fsmt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/eval-facebook-wmt19.sh | CLIP_for_PyTorch/transformers/src/transformers/models/fsmt/modeling_fsmt.py | http://matrix.statmt.org/matrix/output/1914?score_id=37605 | 模型相关说明 | -| 
开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/gen-card-facebook-wmt19.py | CLIP_for_PyTorch/transformers/src/transformers/models/fsmt/modeling_fsmt.py | http://matrix.statmt.org/matrix/output/1907?run_id=6937 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/gen-card-facebook-wmt19.py | CLIP_for_PyTorch/transformers/src/transformers/models/fsmt/modeling_fsmt.py | http://matrix.statmt.org/matrix/output/1902?run_id=6750 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/gen-card-facebook-wmt19.py | CLIP_for_PyTorch/transformers/src/transformers/models/fsmt/modeling_fsmt.py | http://matrix.statmt.org/matrix/output/1909?run_id=6862 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/fsmt/modeling_fsmt.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/fsmt/modeling_fsmt.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/fnet/tokenization_fnet.py | CLIP_for_PyTorch/transformers/src/transformers/models/fnet/tokenization_fnet_fast.py | https://huggingface.co/google/fnet-base/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/fnet/tokenization_fnet.py | CLIP_for_PyTorch/transformers/src/transformers/models/fnet/tokenization_fnet_fast.py | https://huggingface.co/google/fnet-large/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/fnet/tokenization_fnet_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/fnet/tokenization_fnet_fast.py | https://huggingface.co/google/fnet-base/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/fnet/tokenization_fnet_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/fnet/tokenization_fnet_fast.py | https://huggingface.co/google/fnet-large/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/pegasus/tokenization_pegasus_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/fnet/tokenization_fnet_fast.py | https://huggingface.co/docs/tokenizers/python/latest/components.html?highlight=unigram#models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/ko/model_doc/llama.md | CLIP_for_PyTorch/transformers/src/transformers/models/fnet/tokenization_fnet_fast.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/fnet/tokenization_fnet.py | CLIP_for_PyTorch/transformers/src/transformers/models/fnet/tokenization_fnet.py | https://huggingface.co/google/fnet-base/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/fnet/tokenization_fnet.py | CLIP_for_PyTorch/transformers/src/transformers/models/fnet/tokenization_fnet.py | 
https://huggingface.co/google/fnet-large/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/ko/model_doc/llama.md | CLIP_for_PyTorch/transformers/src/transformers/models/fnet/tokenization_fnet.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/code_llama/tokenization_code_llama.py | CLIP_for_PyTorch/transformers/src/transformers/models/fnet/tokenization_fnet.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/fnet/modeling_fnet.py | CLIP_for_PyTorch/transformers/src/transformers/models/fnet/modeling_fnet.py | https://huggingface.co/models?filter=fnet | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/fnet/modeling_fnet.py | CLIP_for_PyTorch/transformers/src/transformers/models/fnet/modeling_fnet.py | https://github.com/google-research/google-research/blob/master/f_net/fourier.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/fnet/modeling_fnet.py | CLIP_for_PyTorch/transformers/src/transformers/models/fnet/modeling_fnet.py | https://pytorch.org/docs/master/generated/torch.vmap.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/fnet/modeling_fnet.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/fnet/modeling_fnet.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/fnet/modeling_fnet.py | https://arxiv.org/abs/2105.03824 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/fnet/configuration_fnet.py | CLIP_for_PyTorch/transformers/src/transformers/models/fnet/configuration_fnet.py | https://huggingface.co/google/fnet-base/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/fnet/configuration_fnet.py | CLIP_for_PyTorch/transformers/src/transformers/models/fnet/configuration_fnet.py | https://huggingface.co/google/fnet-large/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/fnet/modeling_fnet.py | CLIP_for_PyTorch/transformers/src/transformers/models/fnet/configuration_fnet.py | https://huggingface.co/models?filter=fnet | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/fnet/tokenization_fnet.py | CLIP_for_PyTorch/transformers/src/transformers/models/fnet/configuration_fnet.py | https://huggingface.co/google/fnet-base | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/flaubert/tokenization_flaubert.py | CLIP_for_PyTorch/transformers/src/transformers/models/flaubert/tokenization_flaubert.py | https://huggingface.co/flaubert/flaubert_small_cased/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/flaubert/tokenization_flaubert.py | CLIP_for_PyTorch/transformers/src/transformers/models/flaubert/tokenization_flaubert.py | https://huggingface.co/flaubert/flaubert_base_uncased/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/flaubert/tokenization_flaubert.py | CLIP_for_PyTorch/transformers/src/transformers/models/flaubert/tokenization_flaubert.py | https://huggingface.co/flaubert/flaubert_base_cased/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/flaubert/tokenization_flaubert.py | CLIP_for_PyTorch/transformers/src/transformers/models/flaubert/tokenization_flaubert.py | https://huggingface.co/flaubert/flaubert_large_cased/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/flaubert/tokenization_flaubert.py | CLIP_for_PyTorch/transformers/src/transformers/models/flaubert/tokenization_flaubert.py | https://huggingface.co/flaubert/flaubert_small_cased/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/flaubert/tokenization_flaubert.py | CLIP_for_PyTorch/transformers/src/transformers/models/flaubert/tokenization_flaubert.py | https://huggingface.co/flaubert/flaubert_base_uncased/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/flaubert/tokenization_flaubert.py | CLIP_for_PyTorch/transformers/src/transformers/models/flaubert/tokenization_flaubert.py | https://huggingface.co/flaubert/flaubert_base_cased/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/flaubert/tokenization_flaubert.py | CLIP_for_PyTorch/transformers/src/transformers/models/flaubert/tokenization_flaubert.py | https://huggingface.co/flaubert/flaubert_large_cased/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | CLIP_for_PyTorch/transformers/src/transformers/models/flaubert/tokenization_flaubert.py | https://github.com/alvations/sacremoses | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/flaubert.md | CLIP_for_PyTorch/transformers/src/transformers/models/flaubert/modeling_tf_flaubert.py | https://huggingface.co/models?filter=flaubert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/flaubert/modeling_tf_flaubert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/flaubert.md | CLIP_for_PyTorch/transformers/src/transformers/models/flaubert/modeling_flaubert.py | https://huggingface.co/models?filter=flaubert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/flaubert/modeling_flaubert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/flaubert/configuration_flaubert.py | 
CLIP_for_PyTorch/transformers/src/transformers/models/flaubert/configuration_flaubert.py | https://huggingface.co/flaubert/flaubert_small_cased/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/flaubert/configuration_flaubert.py | CLIP_for_PyTorch/transformers/src/transformers/models/flaubert/configuration_flaubert.py | https://huggingface.co/flaubert/flaubert_base_uncased/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/flaubert/configuration_flaubert.py | CLIP_for_PyTorch/transformers/src/transformers/models/flaubert/configuration_flaubert.py | https://huggingface.co/flaubert/flaubert_base_cased/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/flaubert/configuration_flaubert.py | CLIP_for_PyTorch/transformers/src/transformers/models/flaubert/configuration_flaubert.py | https://huggingface.co/flaubert/flaubert_large_cased/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/configuration_xlm.py | CLIP_for_PyTorch/transformers/src/transformers/models/flaubert/configuration_flaubert.py | http://huggingface.co/transformers/multilingual.html#xlm-language-embeddings | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/encoder_decoder/modeling_tf_encoder_decoder.py | https://arxiv.org/abs/1907.12461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/encoder_decoder/modeling_tf_encoder_decoder.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/encoder_decoder/modeling_flax_encoder_decoder.py | https://arxiv.org/abs/1907.12461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/encoder_decoder/modeling_flax_encoder_decoder.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/encoder_decoder/modeling_encoder_decoder.py | https://arxiv.org/abs/1907.12461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/encoder_decoder/modeling_encoder_decoder.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra_fast.py | https://huggingface.co/google/electra-small-generator/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra_fast.py | 
CLIP_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra_fast.py | https://huggingface.co/google/electra-base-generator/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra_fast.py | https://huggingface.co/google/electra-large-generator/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra_fast.py | https://huggingface.co/google/electra-small-discriminator/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra_fast.py | https://huggingface.co/google/electra-base-discriminator/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra_fast.py | https://huggingface.co/google/electra-large-discriminator/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra_fast.py | https://huggingface.co/google/electra-small-generator/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra_fast.py | https://huggingface.co/google/electra-base-generator/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra_fast.py | https://huggingface.co/google/electra-large-generator/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra_fast.py | https://huggingface.co/google/electra-small-discriminator/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra_fast.py | https://huggingface.co/google/electra-base-discriminator/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra_fast.py | https://huggingface.co/google/electra-large-discriminator/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra_fast.py | 
CLIP_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra.py | https://huggingface.co/google/electra-small-generator/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra.py | https://huggingface.co/google/electra-base-generator/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra.py | https://huggingface.co/google/electra-large-generator/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra.py | https://huggingface.co/google/electra-small-discriminator/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra.py | https://huggingface.co/google/electra-base-discriminator/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra.py | https://huggingface.co/google/electra-large-discriminator/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/electra.md | CLIP_for_PyTorch/transformers/src/transformers/models/electra/modeling_tf_electra.py | https://huggingface.co/models?filter=electra | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/electra/modeling_tf_electra.py | https://github.com/tensorflow/mesh/blob/8d2465e9bc93129b913b5ccc6a59aa97abd96ec6/mesh_tensorflow/transformer/transformer_layers.py#L270 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/electra/modeling_tf_electra.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/electra/modeling_flax_electra.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/electra/modeling_flax_electra.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/electra/modeling_flax_electra.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/electra/modeling_flax_electra.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/electra/modeling_flax_electra.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/electra.md | CLIP_for_PyTorch/transformers/src/transformers/models/electra/modeling_electra.py | https://huggingface.co/models?filter=electra | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/electra/modeling_electra.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/electra/modeling_electra.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/electra/modeling_electra.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/configuration_electra.py | CLIP_for_PyTorch/transformers/src/transformers/models/electra/configuration_electra.py | https://huggingface.co/google/electra-small-generator/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/configuration_electra.py | CLIP_for_PyTorch/transformers/src/transformers/models/electra/configuration_electra.py | https://huggingface.co/google/electra-base-generator/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/configuration_electra.py | CLIP_for_PyTorch/transformers/src/transformers/models/electra/configuration_electra.py | https://huggingface.co/google/electra-large-generator/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/configuration_electra.py | CLIP_for_PyTorch/transformers/src/transformers/models/electra/configuration_electra.py | https://huggingface.co/google/electra-small-discriminator/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/configuration_electra.py | 
CLIP_for_PyTorch/transformers/src/transformers/models/electra/configuration_electra.py | https://huggingface.co/google/electra-base-discriminator/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/configuration_electra.py | CLIP_for_PyTorch/transformers/src/transformers/models/electra/configuration_electra.py | https://huggingface.co/google/electra-large-discriminator/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/configuration_electra.py | CLIP_for_PyTorch/transformers/src/transformers/models/electra/configuration_electra.py | https://huggingface.co/google/electra-small-discriminator | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/question-answering/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/electra/configuration_electra.py | https://arxiv.org/abs/1803.02155 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/question-answering/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/electra/configuration_electra.py | https://arxiv.org/abs/2009.13658 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr.py | CLIP_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr_fast.py | https://huggingface.co/facebook/dpr-ctx_encoder-single-nq-base/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr.py | CLIP_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr_fast.py | https://huggingface.co/facebook/dpr-ctx_encoder-multiset-base/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr.py | CLIP_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr_fast.py | https://huggingface.co/facebook/dpr-ctx_encoder-single-nq-base/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr.py | CLIP_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr_fast.py | https://huggingface.co/facebook/dpr-ctx_encoder-multiset-base/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr.py | CLIP_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr_fast.py | https://huggingface.co/facebook/dpr-question_encoder-single-nq-base/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr.py | CLIP_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr_fast.py | https://huggingface.co/facebook/dpr-question_encoder-multiset-base/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr.py | CLIP_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr_fast.py | https://huggingface.co/facebook/dpr-question_encoder-single-nq-base/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr.py | 
CLIP_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr_fast.py | https://huggingface.co/facebook/dpr-question_encoder-multiset-base/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr.py | CLIP_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr_fast.py | https://huggingface.co/facebook/dpr-reader-single-nq-base/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr.py | CLIP_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr_fast.py | https://huggingface.co/facebook/dpr-reader-multiset-base/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr.py | CLIP_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr_fast.py | https://huggingface.co/facebook/dpr-reader-single-nq-base/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr.py | CLIP_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr_fast.py | https://huggingface.co/facebook/dpr-reader-multiset-base/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr.py | CLIP_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr.py | https://huggingface.co/facebook/dpr-ctx_encoder-single-nq-base/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr.py | CLIP_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr.py | https://huggingface.co/facebook/dpr-ctx_encoder-multiset-base/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr.py | CLIP_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr.py | https://huggingface.co/facebook/dpr-ctx_encoder-single-nq-base/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr.py | CLIP_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr.py | https://huggingface.co/facebook/dpr-ctx_encoder-multiset-base/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr.py | CLIP_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr.py | https://huggingface.co/facebook/dpr-question_encoder-single-nq-base/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr.py | CLIP_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr.py | https://huggingface.co/facebook/dpr-question_encoder-multiset-base/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr.py | CLIP_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr.py | https://huggingface.co/facebook/dpr-question_encoder-single-nq-base/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr.py | CLIP_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr.py | https://huggingface.co/facebook/dpr-question_encoder-multiset-base/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr.py | CLIP_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr.py | https://huggingface.co/facebook/dpr-reader-single-nq-base/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr.py | CLIP_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr.py | https://huggingface.co/facebook/dpr-reader-multiset-base/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr.py | CLIP_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr.py | https://huggingface.co/facebook/dpr-reader-single-nq-base/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr.py | CLIP_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr.py | https://huggingface.co/facebook/dpr-reader-multiset-base/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/dpr/modeling_tf_dpr.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/dpr/modeling_dpr.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/dpr/modeling_dpr.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/convert_dpr_original_checkpoint_to_pytorch.py | CLIP_for_PyTorch/transformers/src/transformers/models/dpr/convert_dpr_original_checkpoint_to_pytorch.py | https://github.com/huggingface/transformers/commit/614fef1691edb806de976756d4948ecbcd0c0ca3 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/rag/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/dpr/convert_dpr_original_checkpoint_to_pytorch.py | https://github.com/facebookresearch/DPR | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/configuration_dpr.py | CLIP_for_PyTorch/transformers/src/transformers/models/dpr/configuration_dpr.py | https://huggingface.co/facebook/dpr-ctx_encoder-single-nq-base/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/configuration_dpr.py | CLIP_for_PyTorch/transformers/src/transformers/models/dpr/configuration_dpr.py | https://huggingface.co/facebook/dpr-question_encoder-single-nq-base/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/configuration_dpr.py | CLIP_for_PyTorch/transformers/src/transformers/models/dpr/configuration_dpr.py | https://huggingface.co/facebook/dpr-reader-single-nq-base/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/configuration_dpr.py | CLIP_for_PyTorch/transformers/src/transformers/models/dpr/configuration_dpr.py | https://huggingface.co/facebook/dpr-ctx_encoder-multiset-base/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/configuration_dpr.py | CLIP_for_PyTorch/transformers/src/transformers/models/dpr/configuration_dpr.py | https://huggingface.co/facebook/dpr-question_encoder-multiset-base/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/configuration_dpr.py | CLIP_for_PyTorch/transformers/src/transformers/models/dpr/configuration_dpr.py | https://huggingface.co/facebook/dpr-reader-multiset-base/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/question-answering/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/dpr/configuration_dpr.py | https://arxiv.org/abs/1803.02155 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/question-answering/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/dpr/configuration_dpr.py | https://arxiv.org/abs/2009.13658 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/pytorch/semantic-segmentation/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/dit/convert_dit_unilm_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dit/convert_dit_unilm_to_pytorch.py | CLIP_for_PyTorch/transformers/src/transformers/models/dit/convert_dit_unilm_to_pytorch.py | https://layoutlm.blob.core.windows.net/dit/dit-pts/dit-base-224-p16-500k-62d53a.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert_fast.py | https://huggingface.co/distilbert-base-uncased/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert_fast.py | https://huggingface.co/distilbert-base-uncased-distilled-squad/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert_fast.py | https://huggingface.co/distilbert-base-cased/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert_fast.py | https://huggingface.co/distilbert-base-cased-distilled-squad/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert_fast.py | https://huggingface.co/distilbert-base-german-cased/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert_fast.py | https://huggingface.co/distilbert-base-multilingual-cased/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert_fast.py | https://huggingface.co/distilbert-base-uncased/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert_fast.py | https://huggingface.co/distilbert-base-uncased-distilled-squad/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert_fast.py | https://huggingface.co/distilbert-base-cased/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert_fast.py | https://huggingface.co/distilbert-base-cased-distilled-squad/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert_fast.py | https://huggingface.co/distilbert-base-german-cased/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert_fast.py | https://huggingface.co/distilbert-base-multilingual-cased/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert.py | https://huggingface.co/distilbert-base-uncased/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert.py | https://huggingface.co/distilbert-base-uncased-distilled-squad/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert.py | https://huggingface.co/distilbert-base-cased/resolve/main/vocab.txt | 模型相关说明 | 
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert.py | https://huggingface.co/distilbert-base-cased-distilled-squad/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert.py | https://huggingface.co/distilbert-base-german-cased/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert.py | https://huggingface.co/distilbert-base-multilingual-cased/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/distilbert.md | CLIP_for_PyTorch/transformers/src/transformers/models/distilbert/modeling_tf_distilbert.py | https://huggingface.co/models?filter=distilbert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/distilbert/modeling_tf_distilbert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/distilbert/modeling_flax_distilbert.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/distilbert/modeling_flax_distilbert.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/distilbert/modeling_flax_distilbert.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/distilbert/modeling_flax_distilbert.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/distilbert/modeling_flax_distilbert.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/distillation/distiller.py | CLIP_for_PyTorch/transformers/src/transformers/models/distilbert/modeling_distilbert.py | 
https://github.com/facebookresearch/XLM | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/run_swag.py | CLIP_for_PyTorch/transformers/src/transformers/models/distilbert/modeling_distilbert.py | https://github.com/google-research/bert | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/distilbert.md | CLIP_for_PyTorch/transformers/src/transformers/models/distilbert/modeling_distilbert.py | https://huggingface.co/models?filter=distilbert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/distilbert/modeling_distilbert.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/distilbert/modeling_distilbert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/configuration_distilbert.py | CLIP_for_PyTorch/transformers/src/transformers/models/distilbert/configuration_distilbert.py | https://huggingface.co/distilbert-base-uncased/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/configuration_distilbert.py | CLIP_for_PyTorch/transformers/src/transformers/models/distilbert/configuration_distilbert.py | https://huggingface.co/distilbert-base-uncased-distilled-squad/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/configuration_distilbert.py | CLIP_for_PyTorch/transformers/src/transformers/models/distilbert/configuration_distilbert.py | https://huggingface.co/distilbert-base-cased/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/configuration_distilbert.py | CLIP_for_PyTorch/transformers/src/transformers/models/distilbert/configuration_distilbert.py | https://huggingface.co/distilbert-base-cased-distilled-squad/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/configuration_distilbert.py | CLIP_for_PyTorch/transformers/src/transformers/models/distilbert/configuration_distilbert.py | https://huggingface.co/distilbert-base-german-cased/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/configuration_distilbert.py | CLIP_for_PyTorch/transformers/src/transformers/models/distilbert/configuration_distilbert.py | https://huggingface.co/distilbert-base-multilingual-cased/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/configuration_distilbert.py | CLIP_for_PyTorch/transformers/src/transformers/models/distilbert/configuration_distilbert.py | https://huggingface.co/distilbert-base-uncased-finetuned-sst-2-english/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/distilbert/configuration_distilbert.py | 
https://huggingface.co/distilbert-base-uncased | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/detr/modeling_detr.py | CLIP_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | https://huggingface.co/models?filter=detr | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/detr/modeling_detr.py | CLIP_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | https://github.com/facebookresearch/detr/blob/master/backbone.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/pytorch/semantic-segmentation/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/yolos/modeling_yolos.py | CLIP_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | https://github.com/facebookresearch/detr/blob/master/models/detr.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/detr/modeling_detr.py | CLIP_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | https://github.com/facebookresearch/detr/blob/master/models/segmentation.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/yolos/modeling_yolos.py | CLIP_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | https://arxiv.org/abs/1708.02002 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/yolos/modeling_yolos.py | CLIP_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | https://github.com/facebookresearch/detr/issues/108#issuecomment-650269223 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/detr/modeling_detr.py | CLIP_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | https://github.com/facebookresearch/detr/blob/master/models/matcher.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/detr/modeling_detr.py | CLIP_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | https://github.com/facebookresearch/detr/blob/master/util/box_ops.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/detr.md | CLIP_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | https://giou.stanford.edu/ | 模型相关说明 | -| 开源代码引入 
| https://github.com/huggingface/transformers/blob/main/src/transformers/models/detr/modeling_detr.py | CLIP_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | https://github.com/facebookresearch/detr/blob/master/util/misc.py#L306 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/detr/modeling_detr.py | CLIP_for_PyTorch/transformers/src/transformers/models/detr/feature_extraction_detr.py | https://github.com/facebookresearch/detr/blob/master/util/box_ops.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/yolos/image_processing_yolos.py | CLIP_for_PyTorch/transformers/src/transformers/models/detr/feature_extraction_detr.py | https://github.com/pytorch/pytorch/issues/50276 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/image_transforms.py | CLIP_for_PyTorch/transformers/src/transformers/models/detr/feature_extraction_detr.py | https://github.com/cocodataset/panopticapi/blob/master/panopticapi/utils.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/detr/image_processing_detr.py | CLIP_for_PyTorch/transformers/src/transformers/models/detr/feature_extraction_detr.py | https://github.com/facebookresearch/detr/blob/master/datasets/coco.py#L33 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/detr/image_processing_detr.py | CLIP_for_PyTorch/transformers/src/transformers/models/detr/feature_extraction_detr.py | https://github.com/facebookresearch/detr/blob/master/datasets/coco.py#L50 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/detr/image_processing_detr.py | CLIP_for_PyTorch/transformers/src/transformers/models/detr/feature_extraction_detr.py | https://github.com/facebookresearch/detr/blob/master/models/detr.py#L258 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/detr/image_processing_detr.py | CLIP_for_PyTorch/transformers/src/transformers/models/detr/feature_extraction_detr.py | https://github.com/facebookresearch/detr/blob/master/models/segmentation.py#L218 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/detr/image_processing_detr.py | CLIP_for_PyTorch/transformers/src/transformers/models/detr/feature_extraction_detr.py | https://github.com/facebookresearch/detr/blob/master/models/segmentation.py#L241 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/pytorch/semantic-segmentation/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/detr/convert_detr_original_pytorch_checkpoint_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/detr/configuration_detr.py | CLIP_for_PyTorch/transformers/src/transformers/models/detr/configuration_detr.py | https://huggingface.co/facebook/detr-resnet-50/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/detr/modeling_detr.py | CLIP_for_PyTorch/transformers/src/transformers/models/detr/configuration_detr.py | https://huggingface.co/models?filter=detr | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | 
CLIP_for_PyTorch/transformers/src/transformers/models/detr/configuration_detr.py | https://huggingface.co/facebook/detr-resnet-50 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/detr/configuration_detr.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/detr/configuration_detr.py | CLIP_for_PyTorch/transformers/src/transformers/models/detr/configuration_detr.py | https://rwightman.github.io/pytorch-image-models/#load-a-pretrained-model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deit/modeling_deit.py | CLIP_for_PyTorch/transformers/src/transformers/models/deit/modeling_deit.py | https://huggingface.co/models?filter=deit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/vit/modeling_tf_vit.py | CLIP_for_PyTorch/transformers/src/transformers/models/deit/modeling_deit.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/vision_transformer.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/deit/modeling_deit.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/deit/modeling_deit.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/pytorch/image-pretraining/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/deit/modeling_deit.py | https://arxiv.org/abs/2111.09886 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/pytorch/semantic-segmentation/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/deit/modeling_deit.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/pytorch/semantic-segmentation/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/deit/convert_deit_timm_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deit/convert_deit_timm_to_pytorch.py | CLIP_for_PyTorch/transformers/src/transformers/models/deit/convert_deit_timm_to_pytorch.py | https://github.com/facebookresearch/deit/blob/ab5715372db8c6cad5740714b2216d55aeae052e/datasets.py#L103 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deit/configuration_deit.py | CLIP_for_PyTorch/transformers/src/transformers/models/deit/configuration_deit.py | https://huggingface.co/facebook/deit-base-patch16-224/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deit/modeling_deit.py | CLIP_for_PyTorch/transformers/src/transformers/models/deit/configuration_deit.py | https://huggingface.co/models?filter=deit | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/deit/configuration_deit.py | CLIP_for_PyTorch/transformers/src/transformers/models/deit/configuration_deit.py | https://huggingface.co/facebook/deit-base-distilled-patch16-224 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta_v2/tokenization_deberta_v2.py | CLIP_for_PyTorch/transformers/src/transformers/models/deberta_v2/tokenization_deberta_v2.py | https://huggingface.co/microsoft/deberta-v2-xlarge/resolve/main/spm.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta_v2/tokenization_deberta_v2.py | CLIP_for_PyTorch/transformers/src/transformers/models/deberta_v2/tokenization_deberta_v2.py | https://huggingface.co/microsoft/deberta-v2-xxlarge/resolve/main/spm.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta_v2/tokenization_deberta_v2.py | CLIP_for_PyTorch/transformers/src/transformers/models/deberta_v2/tokenization_deberta_v2.py | https://huggingface.co/microsoft/deberta-v2-xlarge-mnli/resolve/main/spm.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta_v2/tokenization_deberta_v2.py | CLIP_for_PyTorch/transformers/src/transformers/models/deberta_v2/tokenization_deberta_v2.py | https://huggingface.co/microsoft/deberta-v2-xxlarge-mnli/resolve/main/spm.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/ko/model_doc/llama.md | CLIP_for_PyTorch/transformers/src/transformers/models/deberta_v2/tokenization_deberta_v2.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/code_llama/tokenization_code_llama.py | CLIP_for_PyTorch/transformers/src/transformers/models/deberta_v2/tokenization_deberta_v2.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta_v2/modeling_tf_deberta_v2.py | CLIP_for_PyTorch/transformers/src/transformers/models/deberta_v2/modeling_tf_deberta_v2.py | https://huggingface.co/models?filter=deberta-v2 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/deberta_v2/modeling_tf_deberta_v2.py | https://arxiv.org/abs/2006.03654 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/deberta_v2/modeling_tf_deberta_v2.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/deberta_v2/modeling_deberta_v2.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/deberta_v2/modeling_deberta_v2.py | https://arxiv.org/abs/2006.03654 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/deberta_v2/modeling_deberta_v2.py | 
https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta_v2/configuration_deberta_v2.py | CLIP_for_PyTorch/transformers/src/transformers/models/deberta_v2/configuration_deberta_v2.py | https://huggingface.co/microsoft/deberta-v2-xlarge/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta_v2/configuration_deberta_v2.py | CLIP_for_PyTorch/transformers/src/transformers/models/deberta_v2/configuration_deberta_v2.py | https://huggingface.co/microsoft/deberta-v2-xxlarge/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta_v2/configuration_deberta_v2.py | CLIP_for_PyTorch/transformers/src/transformers/models/deberta_v2/configuration_deberta_v2.py | https://huggingface.co/microsoft/deberta-v2-xlarge-mnli/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta_v2/configuration_deberta_v2.py | CLIP_for_PyTorch/transformers/src/transformers/models/deberta_v2/configuration_deberta_v2.py | https://huggingface.co/microsoft/deberta-v2-xxlarge-mnli/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/deberta_v2/configuration_deberta_v2.py | https://huggingface.co/microsoft/deberta-base | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta_fast.py | https://huggingface.co/microsoft/deberta-base/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta_fast.py | https://huggingface.co/microsoft/deberta-large/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta_fast.py | https://huggingface.co/microsoft/deberta-xlarge/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta_fast.py | https://huggingface.co/microsoft/deberta-base-mnli/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta_fast.py | https://huggingface.co/microsoft/deberta-large-mnli/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta_fast.py | https://huggingface.co/microsoft/deberta-xlarge-mnli/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta_fast.py | https://huggingface.co/microsoft/deberta-base/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta_fast.py | https://huggingface.co/microsoft/deberta-large/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta_fast.py | https://huggingface.co/microsoft/deberta-xlarge/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta_fast.py | https://huggingface.co/microsoft/deberta-base-mnli/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta_fast.py | https://huggingface.co/microsoft/deberta-large-mnli/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta_fast.py | https://huggingface.co/microsoft/deberta-xlarge-mnli/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta.py | https://huggingface.co/microsoft/deberta-base/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta.py | https://huggingface.co/microsoft/deberta-large/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta.py | https://huggingface.co/microsoft/deberta-xlarge/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta.py | https://huggingface.co/microsoft/deberta-base-mnli/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta.py | https://huggingface.co/microsoft/deberta-large-mnli/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py | 
CLIP_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta.py | https://huggingface.co/microsoft/deberta-xlarge-mnli/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta.py | https://huggingface.co/microsoft/deberta-base/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta.py | https://huggingface.co/microsoft/deberta-large/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta.py | https://huggingface.co/microsoft/deberta-xlarge/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta.py | https://huggingface.co/microsoft/deberta-base-mnli/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta.py | https://huggingface.co/microsoft/deberta-large-mnli/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta.py | https://huggingface.co/microsoft/deberta-xlarge-mnli/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/modeling_tf_deberta.py | CLIP_for_PyTorch/transformers/src/transformers/models/deberta/modeling_tf_deberta.py | https://huggingface.co/models?filter=DeBERTa | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/deberta/modeling_tf_deberta.py | https://arxiv.org/abs/2006.03654 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/deberta/modeling_tf_deberta.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/deberta/modeling_deberta.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/deberta/modeling_deberta.py | https://arxiv.org/abs/2006.03654 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/deberta/modeling_deberta.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/configuration_deberta.py | CLIP_for_PyTorch/transformers/src/transformers/models/deberta/configuration_deberta.py | https://huggingface.co/microsoft/deberta-base/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/configuration_deberta.py | CLIP_for_PyTorch/transformers/src/transformers/models/deberta/configuration_deberta.py | https://huggingface.co/microsoft/deberta-large/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/configuration_deberta.py | CLIP_for_PyTorch/transformers/src/transformers/models/deberta/configuration_deberta.py | https://huggingface.co/microsoft/deberta-xlarge/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/configuration_deberta.py | CLIP_for_PyTorch/transformers/src/transformers/models/deberta/configuration_deberta.py | https://huggingface.co/microsoft/deberta-base-mnli/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/configuration_deberta.py | CLIP_for_PyTorch/transformers/src/transformers/models/deberta/configuration_deberta.py | https://huggingface.co/microsoft/deberta-large-mnli/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/configuration_deberta.py | CLIP_for_PyTorch/transformers/src/transformers/models/deberta/configuration_deberta.py | https://huggingface.co/microsoft/deberta-xlarge-mnli/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/deberta/configuration_deberta.py | https://huggingface.co/microsoft/deberta-base | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/data2vec/modeling_data2vec_text.py | CLIP_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_text.py | https://huggingface.co/models?filter=data2vec-text | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_text.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/data2vec.md | CLIP_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_text.py | https://arxiv.org/pdf/2202.03555 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_text.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_text.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/data2vec/modeling_data2vec_audio.py | CLIP_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_audio.py | https://huggingface.co/models?filter=data2vec-audio | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | CLIP_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_audio.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_audio.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | CLIP_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_audio.py | https://pytorch.org/docs/stable/generated/torch.nn.Conv1d.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/data2vec.md | CLIP_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_audio.py | https://arxiv.org/pdf/2202.03555 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_audio.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/data2vec/modeling_data2vec_audio.py | CLIP_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_audio.py | https://huggingface.co/facebook/data2vec-audio-base-960h | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/data2vec/convert_data2vec_text_original_pytorch_checkpoint_to_pytorch.py | CLIP_for_PyTorch/transformers/src/transformers/models/data2vec/convert_data2vec_text_original_pytorch_checkpoint_to_pytorch.py | https://dl.fbaipublicfiles.com/fairseq/models/roberta.large.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/data2vec/convert_data2vec_text_original_pytorch_checkpoint_to_pytorch.py | CLIP_for_PyTorch/transformers/src/transformers/models/data2vec/convert_data2vec_text_original_pytorch_checkpoint_to_pytorch.py | https://github.com/pytorch/fairseq/blob/main/examples/data2vec/models/data2vec_text.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/data2vec/convert_data2vec_audio_original_pytorch_checkpoint_to_pytorch.py | CLIP_for_PyTorch/transformers/src/transformers/models/data2vec/convert_data2vec_audio_original_pytorch_checkpoint_to_pytorch.py | https://github.com/pytorch/fairseq/blob/main/examples/data2vec/models/data2vec_audio.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/data2vec/configuration_data2vec_text.py | CLIP_for_PyTorch/transformers/src/transformers/models/data2vec/configuration_data2vec_text.py | https://huggingface.co/data2vec/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/data2vec/configuration_data2vec_text.py | CLIP_for_PyTorch/transformers/src/transformers/models/data2vec/configuration_data2vec_text.py | https://huggingface.co/facebook/data2vec-text-base | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/question-answering/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/data2vec/configuration_data2vec_text.py | https://arxiv.org/abs/1803.02155 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/question-answering/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/data2vec/configuration_data2vec_text.py | https://arxiv.org/abs/2009.13658 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/data2vec/configuration_data2vec_audio.py | CLIP_for_PyTorch/transformers/src/transformers/models/data2vec/configuration_data2vec_audio.py | https://huggingface.co/facebook/data2vec-audio-base-960h/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/data2vec/modeling_data2vec_audio.py | CLIP_for_PyTorch/transformers/src/transformers/models/data2vec/configuration_data2vec_audio.py | https://huggingface.co/models?filter=data2vec-audio | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/data2vec/modeling_data2vec_audio.py | CLIP_for_PyTorch/transformers/src/transformers/models/data2vec/configuration_data2vec_audio.py | https://huggingface.co/facebook/data2vec-audio-base-960h | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | CLIP_for_PyTorch/transformers/src/transformers/models/data2vec/configuration_data2vec_audio.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/ctrl/tokenization_ctrl.py | CLIP_for_PyTorch/transformers/src/transformers/models/ctrl/tokenization_ctrl.py | https://raw.githubusercontent.com/salesforce/ctrl/master/ctrl-vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/ctrl/tokenization_ctrl.py | CLIP_for_PyTorch/transformers/src/transformers/models/ctrl/tokenization_ctrl.py | https://raw.githubusercontent.com/salesforce/ctrl/master/ctrl-merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/ctrl.md | CLIP_for_PyTorch/transformers/src/transformers/models/ctrl/modeling_tf_ctrl.py | https://huggingface.co/models?filter=ctrl | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/ctrl/modeling_tf_ctrl.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/ctrl.md | CLIP_for_PyTorch/transformers/src/transformers/models/ctrl/modeling_ctrl.py | https://huggingface.co/models?filter=ctrl | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/ctrl/modeling_ctrl.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/ctrl/modeling_ctrl.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/cpm/tokenization_cpm.py | CLIP_for_PyTorch/transformers/src/transformers/models/cpm/tokenization_cpm_fast.py | https://huggingface.co/TsinghuaAI/CPM-Generate/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/cpm/tokenization_cpm_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/cpm/tokenization_cpm_fast.py | https://huggingface.co/TsinghuaAI/CPM-Generate/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/cpm/tokenization_cpm.py | CLIP_for_PyTorch/transformers/src/transformers/models/cpm/tokenization_cpm_fast.py | https://pypi.org/project/jieba/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/ko/model_doc/llama.md | CLIP_for_PyTorch/transformers/src/transformers/models/cpm/tokenization_cpm_fast.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/cpm/tokenization_cpm.py | CLIP_for_PyTorch/transformers/src/transformers/models/cpm/tokenization_cpm.py | https://huggingface.co/TsinghuaAI/CPM-Generate/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/cpm/tokenization_cpm.py | CLIP_for_PyTorch/transformers/src/transformers/models/cpm/tokenization_cpm.py | https://pypi.org/project/jieba/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/ko/model_doc/llama.md | CLIP_for_PyTorch/transformers/src/transformers/models/cpm/tokenization_cpm.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/convnext/modeling_tf_convnext.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/pytorch/semantic-segmentation/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/convnext/modeling_tf_convnext.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/configuration_convnext.py | CLIP_for_PyTorch/transformers/src/transformers/models/convnext/modeling_convnext.py | https://huggingface.co/models?filter=convnext | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/donut/modeling_donut_swin.py | CLIP_for_PyTorch/transformers/src/transformers/models/convnext/modeling_convnext.py | https://github.com/tensorflow/tpu/issues/494#issuecomment-532968956 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/convnext/modeling_convnext.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/convnext/modeling_convnext.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/convnextv2.md | CLIP_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://github.com/facebookresearch/ConvNeXt | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py | CLIP_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://github.com/google-research/big_transfer/issues/18 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/pytorch/semantic-segmentation/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py | CLIP_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_tiny_1k_224_ema.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py | CLIP_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_small_1k_224_ema.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py | CLIP_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_base_1k_224_ema.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py | CLIP_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_base_1k_384.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py | CLIP_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_large_1k_224_ema.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py | CLIP_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_large_1k_384.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py | CLIP_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_base_22k_224.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py | CLIP_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | 
https://dl.fbaipublicfiles.com/convnext/convnext_large_22k_224.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py | CLIP_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_xlarge_22k_224.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py | CLIP_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_base_22k_1k_224.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py | CLIP_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_base_22k_1k_384.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py | CLIP_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_large_22k_1k_224.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py | CLIP_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_large_22k_1k_384.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py | CLIP_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_xlarge_22k_1k_224_ema.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py | CLIP_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_xlarge_22k_1k_384_ema.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/configuration_convnext.py | CLIP_for_PyTorch/transformers/src/transformers/models/convnext/configuration_convnext.py | https://huggingface.co/facebook/convnext-tiny-224/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/configuration_convnext.py | CLIP_for_PyTorch/transformers/src/transformers/models/convnext/configuration_convnext.py | https://huggingface.co/models?filter=convnext | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/configuration_convnext.py | CLIP_for_PyTorch/transformers/src/transformers/models/convnext/configuration_convnext.py | https://huggingface.co/facebook/convnext-tiny-224 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convbert/tokenization_convbert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/convbert/tokenization_convbert_fast.py | https://huggingface.co/YituTech/conv-bert-base/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convbert/tokenization_convbert_fast.py | 
CLIP_for_PyTorch/transformers/src/transformers/models/convbert/tokenization_convbert_fast.py | https://huggingface.co/YituTech/conv-bert-medium-small/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convbert/tokenization_convbert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/convbert/tokenization_convbert_fast.py | https://huggingface.co/YituTech/conv-bert-small/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convbert/tokenization_convbert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/convbert/tokenization_convbert.py | https://huggingface.co/YituTech/conv-bert-base/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convbert/tokenization_convbert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/convbert/tokenization_convbert.py | https://huggingface.co/YituTech/conv-bert-medium-small/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convbert/tokenization_convbert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/convbert/tokenization_convbert.py | https://huggingface.co/YituTech/conv-bert-small/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/convbert.md | CLIP_for_PyTorch/transformers/src/transformers/models/convbert/modeling_tf_convbert.py | https://huggingface.co/models?filter=convbert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/convbert/modeling_tf_convbert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/convbert.md | CLIP_for_PyTorch/transformers/src/transformers/models/convbert/modeling_convbert.py | https://huggingface.co/models?filter=convbert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/convbert/modeling_convbert.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/convbert/modeling_convbert.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/convbert/modeling_convbert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convbert/configuration_convbert.py | CLIP_for_PyTorch/transformers/src/transformers/models/convbert/configuration_convbert.py | https://huggingface.co/YituTech/conv-bert-base/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convbert/configuration_convbert.py | CLIP_for_PyTorch/transformers/src/transformers/models/convbert/configuration_convbert.py | https://huggingface.co/YituTech/conv-bert-medium-small/resolve/main/config.json | 模型参数相关配置 
| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convbert/configuration_convbert.py | CLIP_for_PyTorch/transformers/src/transformers/models/convbert/configuration_convbert.py | https://huggingface.co/YituTech/conv-bert-small/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/convbert.md | CLIP_for_PyTorch/transformers/src/transformers/models/convbert/configuration_convbert.py | https://huggingface.co/models?filter=convbert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convbert/tokenization_convbert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/convbert/configuration_convbert.py | https://huggingface.co/YituTech/conv-bert-base | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/clip/tokenization_clip_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/clip/tokenization_clip_fast.py | https://huggingface.co/openai/clip-vit-base-patch32/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/clip/tokenization_clip_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/clip/tokenization_clip_fast.py | https://huggingface.co/openai/clip-vit-base-patch32/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/clip/tokenization_clip_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/clip/tokenization_clip_fast.py | https://huggingface.co/openai/clip-vit-base-patch32/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/clip/tokenization_clip_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/clip/tokenization_clip_fast.py | https://github.com/huggingface/tokenizers/issues/872 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/clip/tokenization_clip_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/clip/tokenization_clip.py | https://huggingface.co/openai/clip-vit-base-patch32/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/clip/tokenization_clip_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/clip/tokenization_clip.py | https://huggingface.co/openai/clip-vit-base-patch32/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/clip/tokenization_clip.py | https://docs.python.org/3/library/stdtypes.html#bytes.decode | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/tensorflow/contrastive-image-text/run_clip.py | CLIP_for_PyTorch/transformers/src/transformers/models/clip/modeling_tf_clip.py | https://huggingface.co/models?filter=clip | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/clip/modeling_tf_clip.py | CLIP_for_PyTorch/transformers/src/transformers/models/clip/modeling_tf_clip.py | https://sachinruk.github.io/blog/pytorch/pytorch%20lightning/loss%20function/gpu/2021/03/07/CLIP.html | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/clip/modeling_clip.py | CLIP_for_PyTorch/transformers/src/transformers/models/clip/modeling_tf_clip.py | https://github.com/openai/CLIP/blob/cfcffb90e69f37bf2ff1e988237a0fbe41f33c04/clip/model.py#L324 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/clip/modeling_tf_clip.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/pytorch/semantic-segmentation/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/clip/modeling_tf_clip.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/clip/modeling_flax_clip.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/clip/modeling_flax_clip.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/clip/modeling_flax_clip.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/clip/modeling_flax_clip.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/clip/modeling_flax_clip.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/pytorch/semantic-segmentation/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/clip/modeling_flax_clip.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/tensorflow/contrastive-image-text/run_clip.py | CLIP_for_PyTorch/transformers/src/transformers/models/clip/modeling_clip.py | https://huggingface.co/models?filter=clip | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/clip/modeling_tf_clip.py | CLIP_for_PyTorch/transformers/src/transformers/models/clip/modeling_clip.py | https://sachinruk.github.io/blog/pytorch/pytorch%20lightning/loss%20function/gpu/2021/03/07/CLIP.html | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/clip/modeling_clip.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/clip/modeling_clip.py | CLIP_for_PyTorch/transformers/src/transformers/models/clip/modeling_clip.py | https://github.com/openai/CLIP/blob/cfcffb90e69f37bf2ff1e988237a0fbe41f33c04/clip/model.py#L324 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/pytorch/semantic-segmentation/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/clip/modeling_clip.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/clip/configuration_clip.py | CLIP_for_PyTorch/transformers/src/transformers/models/clip/configuration_clip.py | https://huggingface.co/openai/clip-vit-base-patch32/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/tensorflow/contrastive-image-text/run_clip.py | CLIP_for_PyTorch/transformers/src/transformers/models/clip/configuration_clip.py | https://huggingface.co/models?filter=clip | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/clip/configuration_clip.py | CLIP_for_PyTorch/transformers/src/transformers/models/clip/configuration_clip.py | https://huggingface.co/openai/clip-vit-base-patch32 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/canine/tokenization_canine.py | CLIP_for_PyTorch/transformers/src/transformers/models/canine/tokenization_canine.py | https://github.com/google-research/language/blob/master/language/canine/special_codepoints.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/canine/modeling_canine.py | CLIP_for_PyTorch/transformers/src/transformers/models/canine/modeling_canine.py | https://huggingface.co/models?filter=canine | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/canine/modeling_canine.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/canine/modeling_canine.py | CLIP_for_PyTorch/transformers/src/transformers/models/canine/modeling_canine.py | https://github.com/google-research/big_transfer/blob/49afe42338b62af9fbe18f0258197a33ee578a6b/bit_tf2/models.py#L36-L38 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/canine/modeling_canine.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/canine/modeling_canine.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/canine/configuration_canine.py | CLIP_for_PyTorch/transformers/src/transformers/models/canine/configuration_canine.py | 
https://huggingface.co/google/canine-s/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/canine/modeling_canine.py | CLIP_for_PyTorch/transformers/src/transformers/models/canine/configuration_canine.py | https://huggingface.co/models?filter=canine | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/canine.md | CLIP_for_PyTorch/transformers/src/transformers/models/canine/configuration_canine.py | https://huggingface.co/google/canine-s | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/camembert/tokenization_camembert.py | CLIP_for_PyTorch/transformers/src/transformers/models/camembert/tokenization_camembert_fast.py | https://huggingface.co/camembert-base/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/camembert/tokenization_camembert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/camembert/tokenization_camembert_fast.py | https://huggingface.co/camembert-base/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/layoutxlm/tokenization_layoutxlm_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/camembert/tokenization_camembert_fast.py | https://huggingface.co/docs/tokenizers/python/latest/components.html?highlight=BPE#models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/ko/model_doc/llama.md | CLIP_for_PyTorch/transformers/src/transformers/models/camembert/tokenization_camembert_fast.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/camembert/tokenization_camembert.py | CLIP_for_PyTorch/transformers/src/transformers/models/camembert/tokenization_camembert.py | https://huggingface.co/camembert-base/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/ko/model_doc/llama.md | CLIP_for_PyTorch/transformers/src/transformers/models/camembert/tokenization_camembert.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/code_llama/tokenization_code_llama.py | CLIP_for_PyTorch/transformers/src/transformers/models/camembert/tokenization_camembert.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/camembert/modeling_camembert.py | CLIP_for_PyTorch/transformers/src/transformers/models/camembert/modeling_tf_camembert.py | https://huggingface.co/models?filter=camembert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/camembert/modeling_tf_camembert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/camembert/modeling_camembert.py | CLIP_for_PyTorch/transformers/src/transformers/models/camembert/modeling_camembert.py | https://huggingface.co/models?filter=camembert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | 
CLIP_for_PyTorch/transformers/src/transformers/models/camembert/modeling_camembert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/camembert/configuration_camembert.py | CLIP_for_PyTorch/transformers/src/transformers/models/camembert/configuration_camembert.py | https://huggingface.co/camembert-base/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/camembert/configuration_camembert.py | CLIP_for_PyTorch/transformers/src/transformers/models/camembert/configuration_camembert.py | https://huggingface.co/Musixmatch/umberto-commoncrawl-cased-v1/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/camembert/configuration_camembert.py | CLIP_for_PyTorch/transformers/src/transformers/models/camembert/configuration_camembert.py | https://huggingface.co/Musixmatch/umberto-wikipedia-uncased-v1/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/byt5/tokenization_byt5.py | CLIP_for_PyTorch/transformers/src/transformers/models/byt5/tokenization_byt5.py | https://github.com/google-research/text-to-text-transfer-transformer/blob/9fd7b14a769417be33bc6c850f9598764913c833/t5/data/preprocessors.py#L2117 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deprecated/bort/convert_bort_original_gluonnlp_checkpoint_to_pytorch.py | CLIP_for_PyTorch/transformers/src/transformers/models/bort/convert_bort_original_gluonnlp_checkpoint_to_pytorch.py | https://github.com/alexa/bort/blob/master/bort/bort.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot_small/tokenization_blenderbot_small_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/blenderbot_small/tokenization_blenderbot_small_fast.py | https://huggingface.co/facebook/blenderbot_small-90M/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot_small/tokenization_blenderbot_small_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/blenderbot_small/tokenization_blenderbot_small_fast.py | https://huggingface.co/facebook/blenderbot_small-90M/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot_small/tokenization_blenderbot_small_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/blenderbot_small/tokenization_blenderbot_small_fast.py | https://huggingface.co/facebook/blenderbot_small-90M/resolve/main/tokenizer_config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot_small/tokenization_blenderbot_small_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/blenderbot_small/tokenization_blenderbot_small.py | https://huggingface.co/facebook/blenderbot_small-90M/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot_small/tokenization_blenderbot_small_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/blenderbot_small/tokenization_blenderbot_small.py | https://huggingface.co/facebook/blenderbot_small-90M/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot_small/tokenization_blenderbot_small_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/blenderbot_small/tokenization_blenderbot_small.py | https://huggingface.co/facebook/blenderbot_small-90M/resolve/main/tokenizer_config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_tf_blenderbot_small.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_tf_blenderbot_small.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_flax_blenderbot_small.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_flax_blenderbot_small.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_flax_blenderbot_small.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_flax_blenderbot_small.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_flax_blenderbot_small.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/language-modeling/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_flax_blenderbot_small.py | https://arxiv.org/abs/1910.13461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_flax_blenderbot_small.py | 
https://github.com/google/flax/blob/491ce18759622506588784b4fca0e4bf05f8c8cd/flax/linen/attention.py#L252 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_flax_blenderbot_small.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot_small/configuration_blenderbot_small.py | CLIP_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_blenderbot_small.py | https://huggingface.co/models?filter=blenderbot_small | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_blenderbot_small.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_blenderbot_small.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot_small/configuration_blenderbot_small.py | CLIP_for_PyTorch/transformers/src/transformers/models/blenderbot_small/configuration_blenderbot_small.py | https://huggingface.co/facebook/blenderbot_small-90M/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot_small/configuration_blenderbot_small.py | CLIP_for_PyTorch/transformers/src/transformers/models/blenderbot_small/configuration_blenderbot_small.py | https://huggingface.co/models?filter=blenderbot_small | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot_small/configuration_blenderbot_small.py | CLIP_for_PyTorch/transformers/src/transformers/models/blenderbot_small/configuration_blenderbot_small.py | https://huggingface.co/facebook/blenderbot_small-90M | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/blenderbot_small/configuration_blenderbot_small.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot/tokenization_blenderbot_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/blenderbot/tokenization_blenderbot_fast.py | https://huggingface.co/facebook/blenderbot-3B/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot/tokenization_blenderbot_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/blenderbot/tokenization_blenderbot_fast.py | https://huggingface.co/facebook/blenderbot-3B/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot/tokenization_blenderbot_fast.py | 
CLIP_for_PyTorch/transformers/src/transformers/models/blenderbot/tokenization_blenderbot_fast.py | https://huggingface.co/facebook/blenderbot-3B/resolve/main/tokenizer_config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot/tokenization_blenderbot_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/blenderbot/tokenization_blenderbot.py | https://huggingface.co/facebook/blenderbot-3B/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot/tokenization_blenderbot_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/blenderbot/tokenization_blenderbot.py | https://huggingface.co/facebook/blenderbot-3B/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot/tokenization_blenderbot_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/blenderbot/tokenization_blenderbot.py | https://huggingface.co/facebook/blenderbot-3B/resolve/main/tokenizer_config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_tf_blenderbot.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_tf_blenderbot.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_flax_blenderbot.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_flax_blenderbot.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_flax_blenderbot.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_flax_blenderbot.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_flax_blenderbot.py | 
https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/language-modeling/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_flax_blenderbot.py | https://arxiv.org/abs/1910.13461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_flax_blenderbot.py | https://github.com/google/flax/blob/491ce18759622506588784b4fca0e4bf05f8c8cd/flax/linen/attention.py#L252 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_flax_blenderbot.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot_small/configuration_blenderbot_small.py | CLIP_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_blenderbot.py | https://huggingface.co/models?filter=blenderbot | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_blenderbot.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_blenderbot.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot/configuration_blenderbot.py | CLIP_for_PyTorch/transformers/src/transformers/models/blenderbot/configuration_blenderbot.py | https://huggingface.co/facebook/blenderbot-3B/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot_small/configuration_blenderbot_small.py | CLIP_for_PyTorch/transformers/src/transformers/models/blenderbot/configuration_blenderbot.py | https://huggingface.co/models?filter=blenderbot | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot/configuration_blenderbot.py | CLIP_for_PyTorch/transformers/src/transformers/models/blenderbot/configuration_blenderbot.py | https://huggingface.co/facebook/blenderbot-3B | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/blenderbot/configuration_blenderbot.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bigbird_pegasus/modeling_bigbird_pegasus.py | CLIP_for_PyTorch/transformers/src/transformers/models/bigbird_pegasus/modeling_bigbird_pegasus.py | https://huggingface.co/models?filter=bigbird_pegasus | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/bigbird_pegasus/modeling_bigbird_pegasus.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/language-modeling/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/bigbird_pegasus/modeling_bigbird_pegasus.py | https://arxiv.org/abs/1910.13461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/bigbird_pegasus/modeling_bigbird_pegasus.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bigbird_pegasus/configuration_bigbird_pegasus.py | CLIP_for_PyTorch/transformers/src/transformers/models/bigbird_pegasus/configuration_bigbird_pegasus.py | https://huggingface.co/google/bigbird-pegasus-large-arxiv/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bigbird_pegasus/configuration_bigbird_pegasus.py | CLIP_for_PyTorch/transformers/src/transformers/models/bigbird_pegasus/configuration_bigbird_pegasus.py | https://huggingface.co/google/bigbird-pegasus-large-pubmed/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bigbird_pegasus/configuration_bigbird_pegasus.py | CLIP_for_PyTorch/transformers/src/transformers/models/bigbird_pegasus/configuration_bigbird_pegasus.py | https://huggingface.co/google/bigbird-pegasus-large-bigpatent/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bigbird_pegasus/modeling_bigbird_pegasus.py | CLIP_for_PyTorch/transformers/src/transformers/models/bigbird_pegasus/configuration_bigbird_pegasus.py | https://huggingface.co/models?filter=bigbird_pegasus | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bigbird_pegasus/configuration_bigbird_pegasus.py | CLIP_for_PyTorch/transformers/src/transformers/models/bigbird_pegasus/configuration_bigbird_pegasus.py | https://huggingface.co/google/bigbird-pegasus-large-arxiv | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/bigbird_pegasus/configuration_bigbird_pegasus.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/big_bird/tokenization_big_bird.py | CLIP_for_PyTorch/transformers/src/transformers/models/big_bird/tokenization_big_bird_fast.py | https://huggingface.co/google/bigbird-roberta-base/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/big_bird/tokenization_big_bird.py | CLIP_for_PyTorch/transformers/src/transformers/models/big_bird/tokenization_big_bird_fast.py | https://huggingface.co/google/bigbird-roberta-large/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/big_bird/tokenization_big_bird.py | CLIP_for_PyTorch/transformers/src/transformers/models/big_bird/tokenization_big_bird_fast.py | https://huggingface.co/google/bigbird-base-trivia-itc/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/big_bird/tokenization_big_bird_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/big_bird/tokenization_big_bird_fast.py | https://huggingface.co/google/bigbird-roberta-base/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/big_bird/tokenization_big_bird_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/big_bird/tokenization_big_bird_fast.py | https://huggingface.co/google/bigbird-roberta-large/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/big_bird/tokenization_big_bird_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/big_bird/tokenization_big_bird_fast.py | https://huggingface.co/google/bigbird-base-trivia-itc/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/pegasus/tokenization_pegasus_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/big_bird/tokenization_big_bird_fast.py | https://huggingface.co/docs/tokenizers/python/latest/components.html?highlight=unigram#models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/ko/model_doc/llama.md | CLIP_for_PyTorch/transformers/src/transformers/models/big_bird/tokenization_big_bird_fast.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/big_bird/tokenization_big_bird.py | CLIP_for_PyTorch/transformers/src/transformers/models/big_bird/tokenization_big_bird.py | https://huggingface.co/google/bigbird-roberta-base/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/big_bird/tokenization_big_bird.py | CLIP_for_PyTorch/transformers/src/transformers/models/big_bird/tokenization_big_bird.py | https://huggingface.co/google/bigbird-roberta-large/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/big_bird/tokenization_big_bird.py | CLIP_for_PyTorch/transformers/src/transformers/models/big_bird/tokenization_big_bird.py | https://huggingface.co/google/bigbird-base-trivia-itc/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/ko/model_doc/llama.md | CLIP_for_PyTorch/transformers/src/transformers/models/big_bird/tokenization_big_bird.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/code_llama/tokenization_code_llama.py | CLIP_for_PyTorch/transformers/src/transformers/models/big_bird/tokenization_big_bird.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | 
CLIP_for_PyTorch/transformers/src/transformers/models/big_bird/modeling_flax_big_bird.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/big_bird/modeling_flax_big_bird.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/big_bird/modeling_flax_big_bird.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/big_bird/modeling_flax_big_bird.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/big_bird/modeling_flax_big_bird.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/big_bird/modeling_big_bird.py | CLIP_for_PyTorch/transformers/src/transformers/models/big_bird/modeling_big_bird.py | https://huggingface.co/models?filter=big_bird | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/big_bird/modeling_big_bird.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/big_bird/modeling_big_bird.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/big_bird/modeling_big_bird.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/big_bird/modeling_big_bird.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/big_bird/configuration_big_bird.py | CLIP_for_PyTorch/transformers/src/transformers/models/big_bird/configuration_big_bird.py | https://huggingface.co/google/bigbird-roberta-base/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/big_bird/configuration_big_bird.py | CLIP_for_PyTorch/transformers/src/transformers/models/big_bird/configuration_big_bird.py | 
https://huggingface.co/google/bigbird-roberta-large/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/big_bird/configuration_big_bird.py | CLIP_for_PyTorch/transformers/src/transformers/models/big_bird/configuration_big_bird.py | https://huggingface.co/google/bigbird-base-trivia-itc/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/big_bird/modeling_big_bird.py | CLIP_for_PyTorch/transformers/src/transformers/models/big_bird/configuration_big_bird.py | https://huggingface.co/models?filter=big_bird | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/big_bird/tokenization_big_bird.py | CLIP_for_PyTorch/transformers/src/transformers/models/big_bird/configuration_big_bird.py | https://huggingface.co/google/bigbird-roberta-base | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bertweet/tokenization_bertweet.py | CLIP_for_PyTorch/transformers/src/transformers/models/bertweet/tokenization_bertweet.py | https://huggingface.co/vinai/bertweet-base/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bertweet/tokenization_bertweet.py | CLIP_for_PyTorch/transformers/src/transformers/models/bertweet/tokenization_bertweet.py | https://huggingface.co/vinai/bertweet-base/resolve/main/bpe.codes | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bertweet/tokenization_bertweet.py | CLIP_for_PyTorch/transformers/src/transformers/models/bertweet/tokenization_bertweet.py | cgpotts@stanford.edu | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bertweet/tokenization_bertweet.py | CLIP_for_PyTorch/transformers/src/transformers/models/bertweet/tokenization_bertweet.py | ewan@inf.ed.ac.uk | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bertweet/tokenization_bertweet.py | CLIP_for_PyTorch/transformers/src/transformers/models/bertweet/tokenization_bertweet.py | http://nltk.org/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bertweet/tokenization_bertweet.py | CLIP_for_PyTorch/transformers/src/transformers/models/bertweet/tokenization_bertweet.py | https://github.com/nltk/nltk/issues/2409 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bertweet/tokenization_bertweet.py | CLIP_for_PyTorch/transformers/src/transformers/models/bertweet/tokenization_bertweet.py | http://en.wikipedia.org/wiki/List_of_emoticons | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bertweet/tokenization_bertweet.py | CLIP_for_PyTorch/transformers/src/transformers/models/bertweet/tokenization_bertweet.py | https://gist.github.com/winzig/8894715 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bertweet/tokenization_bertweet.py | CLIP_for_PyTorch/transformers/src/transformers/models/bertweet/tokenization_bertweet.py | foo.na@example.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bertweet/tokenization_bertweet.py | CLIP_for_PyTorch/transformers/src/transformers/models/bertweet/tokenization_bertweet.py | 
https://github.com/scrapy/w3lib/blob/master/w3lib/html.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bertweet/tokenization_bertweet.py | CLIP_for_PyTorch/transformers/src/transformers/models/bertweet/tokenization_bertweet.py | https://en.wikipedia.org/wiki/ISO/IEC_8859-1#Similar_character_sets | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert_japanese/tokenization_bert_japanese.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert_japanese/tokenization_bert_japanese.py | https://huggingface.co/cl-tohoku/bert-base-japanese/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert_japanese/tokenization_bert_japanese.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert_japanese/tokenization_bert_japanese.py | https://huggingface.co/cl-tohoku/bert-base-japanese-whole-word-masking/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert_japanese/tokenization_bert_japanese.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert_japanese/tokenization_bert_japanese.py | https://huggingface.co/cl-tohoku/bert-base-japanese-char/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert_japanese/tokenization_bert_japanese.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert_japanese/tokenization_bert_japanese.py | https://huggingface.co/cl-tohoku/bert-base-japanese-char-whole-word-masking/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert_japanese/tokenization_bert_japanese.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert_japanese/tokenization_bert_japanese.py | https://pypi.org/project/fugashi/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert_japanese/tokenization_bert_japanese.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert_japanese/tokenization_bert_japanese.py | https://github.com/polm/ipadic-py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert_japanese/tokenization_bert_japanese.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert_japanese/tokenization_bert_japanese.py | https://github.com/polm/unidic-lite | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert_japanese/tokenization_bert_japanese.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert_japanese/tokenization_bert_japanese.py | https://github.com/polm/unidic-py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert_generation/tokenization_bert_generation.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert_generation/tokenization_bert_generation.py | https://huggingface.co/google/bert_for_seq_generation_L-24_bbc_encoder/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/ko/model_doc/llama.md | CLIP_for_PyTorch/transformers/src/transformers/models/bert_generation/tokenization_bert_generation.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/code_llama/tokenization_code_llama.py | 
CLIP_for_PyTorch/transformers/src/transformers/models/bert_generation/tokenization_bert_generation.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/bert_generation/modeling_bert_generation.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert_generation/modeling_bert_generation.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert_generation/modeling_bert_generation.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert_generation/modeling_bert_generation.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/bert_generation/modeling_bert_generation.py | https://arxiv.org/abs/1907.12461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/question-answering/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/bert_generation/configuration_bert_generation.py | https://arxiv.org/abs/1803.02155 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/question-answering/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/bert_generation/configuration_bert_generation.py | https://arxiv.org/abs/2009.13658 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-base-uncased/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-large-uncased/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-base-cased/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-large-cased/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-base-multilingual-uncased/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-base-multilingual-cased/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-base-chinese/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-base-german-cased/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-large-uncased-whole-word-masking/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-large-cased-whole-word-masking/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-large-uncased-whole-word-masking-finetuned-squad/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-large-cased-whole-word-masking-finetuned-squad/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-base-cased-finetuned-mrpc/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-base-german-dbmdz-cased/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-base-german-dbmdz-uncased/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/TurkuNLP/bert-base-finnish-cased-v1/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/TurkuNLP/bert-base-finnish-uncased-v1/resolve/main/vocab.txt | 模型相关说明 | -| 
开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/wietsedv/bert-base-dutch-cased/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-base-uncased/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-large-uncased/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-base-cased/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-large-cased/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-base-multilingual-uncased/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-base-multilingual-cased/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-base-chinese/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-base-german-cased/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-large-uncased-whole-word-masking/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-large-cased-whole-word-masking/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | 
https://huggingface.co/bert-large-uncased-whole-word-masking-finetuned-squad/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-large-cased-whole-word-masking-finetuned-squad/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-base-cased-finetuned-mrpc/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-base-german-dbmdz-cased/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-base-german-dbmdz-uncased/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/TurkuNLP/bert-base-finnish-cased-v1/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/TurkuNLP/bert-base-finnish-uncased-v1/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/wietsedv/bert-base-dutch-cased/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mobilebert/tokenization_mobilebert.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://github.com/huggingface/transformers/issues/328 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | https://huggingface.co/bert-base-uncased/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | https://huggingface.co/bert-large-uncased/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | https://huggingface.co/bert-base-cased/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | 
CLIP_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | https://huggingface.co/bert-large-cased/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | https://huggingface.co/bert-base-multilingual-uncased/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | https://huggingface.co/bert-base-multilingual-cased/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | https://huggingface.co/bert-base-chinese/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | https://huggingface.co/bert-base-german-cased/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | https://huggingface.co/bert-large-uncased-whole-word-masking/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | https://huggingface.co/bert-large-cased-whole-word-masking/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | https://huggingface.co/bert-large-uncased-whole-word-masking-finetuned-squad/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | https://huggingface.co/bert-large-cased-whole-word-masking-finetuned-squad/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | https://huggingface.co/bert-base-cased-finetuned-mrpc/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | https://huggingface.co/bert-base-german-dbmdz-cased/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | https://huggingface.co/bert-base-german-dbmdz-uncased/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | 
https://huggingface.co/TurkuNLP/bert-base-finnish-cased-v1/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | https://huggingface.co/TurkuNLP/bert-base-finnish-uncased-v1/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | https://huggingface.co/wietsedv/bert-base-dutch-cased/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mobilebert/tokenization_mobilebert.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | https://github.com/huggingface/transformers/issues/328 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/run_chinese_ref.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | https://en.wikipedia.org/wiki/CJK_Unified_Ideographs_ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/bert.md | CLIP_for_PyTorch/transformers/src/transformers/models/bert/modeling_tf_bert.py | https://huggingface.co/models?filter=bert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/modeling_tf_bert.py | https://github.com/tensorflow/mesh/blob/8d2465e9bc93129b913b5ccc6a59aa97abd96ec6/mesh_tensorflow/transformer/transformer_layers.py#L270 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/bert/modeling_tf_bert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/modeling_flax_bert.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/modeling_flax_bert.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/modeling_flax_bert.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/modeling_flax_bert.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/modeling_flax_bert.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/bert.md | CLIP_for_PyTorch/transformers/src/transformers/models/bert/modeling_bert.py | https://huggingface.co/models?filter=bert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/bert/modeling_bert.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/modeling_bert.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/modeling_bert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/modeling_bert.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/bert-base-uncased/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/bert-large-uncased/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/bert-base-cased/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/bert-large-cased/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/bert-base-multilingual-uncased/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/bert-base-multilingual-cased/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | 
https://huggingface.co/bert-base-chinese/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/bert-base-german-cased/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/bert-large-uncased-whole-word-masking/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/bert-large-cased-whole-word-masking/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/bert-large-uncased-whole-word-masking-finetuned-squad/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/bert-large-cased-whole-word-masking-finetuned-squad/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/bert-base-cased-finetuned-mrpc/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/bert-base-german-dbmdz-cased/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/bert-base-german-dbmdz-uncased/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/cl-tohoku/bert-base-japanese/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/cl-tohoku/bert-base-japanese-whole-word-masking/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/cl-tohoku/bert-base-japanese-char/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | 
CLIP_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/cl-tohoku/bert-base-japanese-char-whole-word-masking/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/TurkuNLP/bert-base-finnish-cased-v1/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/TurkuNLP/bert-base-finnish-uncased-v1/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | CLIP_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/wietsedv/bert-base-dutch-cased/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/bert.md | CLIP_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/models?filter=bert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/jax-projects/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/bert-base-uncased | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/question-answering/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://arxiv.org/abs/1803.02155 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/question-answering/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://arxiv.org/abs/2009.13658 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/beit/modeling_flax_beit.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/beit/modeling_flax_beit.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/beit/modeling_flax_beit.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/beit/modeling_flax_beit.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/beit/modeling_flax_beit.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/pytorch/semantic-segmentation/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/beit/modeling_flax_beit.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/beit/modeling_beit.py | CLIP_for_PyTorch/transformers/src/transformers/models/beit/modeling_beit.py | https://huggingface.co/models?filter=beit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/models/vision_text_dual_encoder/test_modeling_vision_text_dual_encoder.py | CLIP_for_PyTorch/transformers/src/transformers/models/beit/modeling_beit.py | https://github.com/rwightman/pytorch-image-models/blob/b9bd960a032c75ca6b808ddeed76bee5f3ed4972/timm/models/layers/helpers.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/donut/modeling_donut_swin.py | CLIP_for_PyTorch/transformers/src/transformers/models/beit/modeling_beit.py | https://github.com/tensorflow/tpu/issues/494#issuecomment-532968956 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/vit/modeling_tf_vit.py | CLIP_for_PyTorch/transformers/src/transformers/models/beit/modeling_beit.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/vision_transformer.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/beit/modeling_beit.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/beit/modeling_beit.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/pytorch/semantic-segmentation/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/beit/modeling_beit.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/upernet.md | CLIP_for_PyTorch/transformers/src/transformers/models/beit/modeling_beit.py | https://github.com/open-mmlab/mmsegmentation | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/beit/modeling_beit.py | https://arxiv.org/abs/1807.10221 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/upernet/modeling_upernet.py | CLIP_for_PyTorch/transformers/src/transformers/models/beit/modeling_beit.py | https://arxiv.org/abs/1411.4038 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/pytorch/semantic-segmentation/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/beit/convert_beit_unilm_to_pytorch.py | 
http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py | CLIP_for_PyTorch/transformers/src/transformers/models/beit/convert_beit_unilm_to_pytorch.py | https://github.com/google-research/big_transfer/issues/18 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/beit/modeling_beit.py | CLIP_for_PyTorch/transformers/src/transformers/models/beit/configuration_beit.py | https://huggingface.co/models?filter=beit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bartpho/tokenization_bartpho.py | CLIP_for_PyTorch/transformers/src/transformers/models/bartpho/tokenization_bartpho.py | https://huggingface.co/vinai/bartpho-syllable/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bartpho/tokenization_bartpho.py | CLIP_for_PyTorch/transformers/src/transformers/models/bartpho/tokenization_bartpho.py | https://huggingface.co/vinai/bartpho-syllable/resolve/main/dict.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/ko/model_doc/llama.md | CLIP_for_PyTorch/transformers/src/transformers/models/bartpho/tokenization_bartpho.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/code_llama/tokenization_code_llama.py | CLIP_for_PyTorch/transformers/src/transformers/models/bartpho/tokenization_bartpho.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/barthez/tokenization_barthez.py | CLIP_for_PyTorch/transformers/src/transformers/models/barthez/tokenization_barthez_fast.py | https://huggingface.co/moussaKam/mbarthez/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/barthez/tokenization_barthez.py | CLIP_for_PyTorch/transformers/src/transformers/models/barthez/tokenization_barthez_fast.py | https://huggingface.co/moussaKam/barthez/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/barthez/tokenization_barthez.py | CLIP_for_PyTorch/transformers/src/transformers/models/barthez/tokenization_barthez_fast.py | https://huggingface.co/moussaKam/barthez-orangesum-title/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/barthez/tokenization_barthez_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/barthez/tokenization_barthez_fast.py | https://huggingface.co/moussaKam/mbarthez/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/barthez/tokenization_barthez_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/barthez/tokenization_barthez_fast.py | https://huggingface.co/moussaKam/barthez/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/barthez/tokenization_barthez_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/barthez/tokenization_barthez_fast.py | 
https://huggingface.co/moussaKam/barthez-orangesum-title/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/ko/model_doc/llama.md | CLIP_for_PyTorch/transformers/src/transformers/models/barthez/tokenization_barthez_fast.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/barthez/tokenization_barthez.py | CLIP_for_PyTorch/transformers/src/transformers/models/barthez/tokenization_barthez.py | https://huggingface.co/moussaKam/mbarthez/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/barthez/tokenization_barthez.py | CLIP_for_PyTorch/transformers/src/transformers/models/barthez/tokenization_barthez.py | https://huggingface.co/moussaKam/barthez/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/barthez/tokenization_barthez.py | CLIP_for_PyTorch/transformers/src/transformers/models/barthez/tokenization_barthez.py | https://huggingface.co/moussaKam/barthez-orangesum-title/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/ko/model_doc/llama.md | CLIP_for_PyTorch/transformers/src/transformers/models/barthez/tokenization_barthez.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/code_llama/tokenization_code_llama.py | CLIP_for_PyTorch/transformers/src/transformers/models/barthez/tokenization_barthez.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/language-modeling/run_bart_dlm_flax.py | CLIP_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | https://huggingface.co/models?filter=bart | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart.py | CLIP_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | https://huggingface.co/facebook/bart-base/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart.py | CLIP_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | https://huggingface.co/facebook/bart-large/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart.py | CLIP_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | https://huggingface.co/facebook/bart-large-mnli/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart.py | CLIP_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | https://huggingface.co/facebook/bart-large-cnn/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart.py | CLIP_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | https://huggingface.co/facebook/bart-large-xsum/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart.py | CLIP_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | https://huggingface.co/yjernite/bart_eli5/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart.py | CLIP_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | https://huggingface.co/facebook/bart-base/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart.py | CLIP_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | https://huggingface.co/facebook/bart-large/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart.py | CLIP_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | https://huggingface.co/facebook/bart-large-mnli/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart.py | CLIP_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | https://huggingface.co/facebook/bart-large-cnn/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart.py | CLIP_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | https://huggingface.co/facebook/bart-large-xsum/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart.py | CLIP_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | https://huggingface.co/yjernite/bart_eli5/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | https://huggingface.co/facebook/bart-base/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | https://huggingface.co/facebook/bart-large/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | https://huggingface.co/facebook/bart-large-mnli/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | https://huggingface.co/facebook/bart-large-cnn/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | https://huggingface.co/facebook/bart-large-xsum/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | https://huggingface.co/yjernite/bart_eli5/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | https://docs.python.org/3/library/stdtypes.html#bytes.decode | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/language-modeling/run_bart_dlm_flax.py | CLIP_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart.py | https://huggingface.co/models?filter=bart | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart.py | CLIP_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart.py | https://huggingface.co/facebook/bart-base/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart.py | CLIP_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart.py | https://huggingface.co/facebook/bart-large/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart.py | CLIP_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart.py | https://huggingface.co/facebook/bart-large-mnli/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart.py | CLIP_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart.py | https://huggingface.co/facebook/bart-large-cnn/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart.py | CLIP_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart.py | https://huggingface.co/facebook/bart-large-xsum/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart.py | CLIP_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart.py | https://huggingface.co/yjernite/bart_eli5/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart.py | CLIP_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart.py | https://huggingface.co/facebook/bart-base/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart.py | CLIP_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart.py | https://huggingface.co/facebook/bart-large/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart.py | CLIP_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart.py | https://huggingface.co/facebook/bart-large-mnli/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart.py | CLIP_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart.py | 
https://huggingface.co/facebook/bart-large-cnn/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart.py | CLIP_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart.py | https://huggingface.co/facebook/bart-large-xsum/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart.py | CLIP_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart.py | https://huggingface.co/yjernite/bart_eli5/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart.py | https://docs.python.org/3/library/stdtypes.html#bytes.decode | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/bart/modeling_tf_bart.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/bart/modeling_tf_bart.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/bart/modeling_flax_bart.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/bart/modeling_flax_bart.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/bart/modeling_flax_bart.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/bart/modeling_flax_bart.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/bart/modeling_flax_bart.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/language-modeling/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/bart/modeling_flax_bart.py | https://arxiv.org/abs/1910.13461 | 参考论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/bart/modeling_flax_bart.py | https://github.com/google/flax/blob/491ce18759622506588784b4fca0e4bf05f8c8cd/flax/linen/attention.py#L252 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/bart/modeling_flax_bart.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/language-modeling/run_bart_dlm_flax.py | CLIP_for_PyTorch/transformers/src/transformers/models/bart/modeling_bart.py | https://huggingface.co/models?filter=bart | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/bart/modeling_bart.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/language-modeling/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/bart/modeling_bart.py | https://arxiv.org/abs/1910.13461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/bart/modeling_bart.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/configuration_bart.py | CLIP_for_PyTorch/transformers/src/transformers/models/bart/configuration_bart.py | https://huggingface.co/facebook/bart-large/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/language-modeling/run_bart_dlm_flax.py | CLIP_for_PyTorch/transformers/src/transformers/models/bart/configuration_bart.py | https://huggingface.co/models?filter=bart | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/longform-qa/eli5_app.py | CLIP_for_PyTorch/transformers/src/transformers/models/bart/configuration_bart.py | https://huggingface.co/facebook/bart-large | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/bart/configuration_bart.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/test_tokenization_utils.py | CLIP_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py | https://huggingface.co/albert-base-v1/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py | https://huggingface.co/albert-large-v1/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py | https://huggingface.co/albert-xlarge-v1/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py | https://huggingface.co/albert-xxlarge-v1/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py | https://huggingface.co/albert-base-v2/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py | https://huggingface.co/albert-large-v2/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py | https://huggingface.co/albert-xlarge-v2/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py | https://huggingface.co/albert-xxlarge-v2/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py | https://huggingface.co/albert-base-v1/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py | https://huggingface.co/albert-large-v1/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py | https://huggingface.co/albert-xlarge-v1/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py | https://huggingface.co/albert-xxlarge-v1/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py | https://huggingface.co/albert-base-v2/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py | https://huggingface.co/albert-large-v2/resolve/main/tokenizer.json | 模型参数相关配置 | -| 
开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py | https://huggingface.co/albert-xlarge-v2/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py | https://huggingface.co/albert-xxlarge-v2/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/pegasus/tokenization_pegasus_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py | https://huggingface.co/docs/tokenizers/python/latest/components.html?highlight=unigram#models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/ko/model_doc/llama.md | CLIP_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/test_tokenization_utils.py | CLIP_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert.py | https://huggingface.co/albert-base-v1/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert.py | https://huggingface.co/albert-large-v1/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert.py | https://huggingface.co/albert-xlarge-v1/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert.py | https://huggingface.co/albert-xxlarge-v1/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert.py | https://huggingface.co/albert-base-v2/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert.py | https://huggingface.co/albert-large-v2/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert.py | https://huggingface.co/albert-xlarge-v2/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py | CLIP_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert.py | https://huggingface.co/albert-xxlarge-v2/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/ko/model_doc/llama.md | 
CLIP_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/code_llama/tokenization_code_llama.py | CLIP_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/albert.md | CLIP_for_PyTorch/transformers/src/transformers/models/albert/modeling_tf_albert.py | https://huggingface.co/models?filter=albert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/modeling_tf_albert.py | CLIP_for_PyTorch/transformers/src/transformers/models/albert/modeling_tf_albert.py | https://github.com/google-research/albert/blob/master/modeling.py#L971-L993 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/albert/modeling_tf_albert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/albert/modeling_flax_albert.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/albert/modeling_flax_albert.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/albert/modeling_flax_albert.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/albert/modeling_flax_albert.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/models/albert/modeling_flax_albert.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/albert.md | CLIP_for_PyTorch/transformers/src/transformers/models/albert/modeling_albert.py | https://huggingface.co/models?filter=albert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/models/albert/modeling_albert.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/albert/modeling_albert.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/src/transformers/models/albert/modeling_albert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/configuration_albert.py | CLIP_for_PyTorch/transformers/src/transformers/models/albert/configuration_albert.py | https://huggingface.co/albert-base-v1/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/configuration_albert.py | CLIP_for_PyTorch/transformers/src/transformers/models/albert/configuration_albert.py | https://huggingface.co/albert-large-v1/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/configuration_albert.py | CLIP_for_PyTorch/transformers/src/transformers/models/albert/configuration_albert.py | https://huggingface.co/albert-xlarge-v1/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/configuration_albert.py | CLIP_for_PyTorch/transformers/src/transformers/models/albert/configuration_albert.py | https://huggingface.co/albert-xxlarge-v1/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/configuration_albert.py | CLIP_for_PyTorch/transformers/src/transformers/models/albert/configuration_albert.py | https://huggingface.co/albert-base-v2/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/configuration_albert.py | CLIP_for_PyTorch/transformers/src/transformers/models/albert/configuration_albert.py | https://huggingface.co/albert-large-v2/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/configuration_albert.py | CLIP_for_PyTorch/transformers/src/transformers/models/albert/configuration_albert.py | https://huggingface.co/albert-xlarge-v2/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/configuration_albert.py | CLIP_for_PyTorch/transformers/src/transformers/models/albert/configuration_albert.py | https://huggingface.co/albert-xxlarge-v2/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/configuration_albert.py | CLIP_for_PyTorch/transformers/src/transformers/models/albert/configuration_albert.py | https://huggingface.co/albert-xxlarge-v2 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/question-answering/README.md | CLIP_for_PyTorch/transformers/src/transformers/models/albert/configuration_albert.py | https://arxiv.org/abs/1803.02155 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/question-answering/README.md | 
CLIP_for_PyTorch/transformers/src/transformers/models/albert/configuration_albert.py | https://arxiv.org/abs/2009.13658 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | CLIP_for_PyTorch/transformers/src/transformers/modeling_utils.py | https://github.com/tensorflow/mesh/blob/8d2465e9bc93129b913b5ccc6a59aa97abd96ec6/mesh_tensorflow | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/modeling_utils.py | CLIP_for_PyTorch/transformers/src/transformers/modeling_utils.py | https://arxiv.org/pdf/2001.08361.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/modeling_utils.py | CLIP_for_PyTorch/transformers/src/transformers/modeling_utils.py | https://github.com/huggingface/transformers/pull/11471 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/modeling_utils.py | CLIP_for_PyTorch/transformers/src/transformers/modeling_utils.py | https://huggingface.co/transformers/installation.html#offline-mode | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/README.md | CLIP_for_PyTorch/transformers/src/transformers/modeling_utils.py | https://huggingface.co/models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/speech-recognition/README.md | CLIP_for_PyTorch/transformers/src/transformers/modeling_utils.py | https://huggingface.co/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/utils/hub.py | CLIP_for_PyTorch/transformers/src/transformers/modeling_utils.py | https://huggingface.co/docs/transformers/installation#offline-mode | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/tensorflow/image-classification/run_image_classification.py | CLIP_for_PyTorch/transformers/src/transformers/modeling_utils.py | https://pytorch.or | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/modeling_utils.py | CLIP_for_PyTorch/transformers/src/transformers/modeling_utils.py | https://github.com/zihangdai/xlnet/blob/master/modeling.py#L253-L276 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/speech-recognition/README.md | CLIP_for_PyTorch/transformers/src/transformers/modeling_tf_utils.py | https://huggingface.co/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/modeling_tf_utils.py | CLIP_for_PyTorch/transformers/src/transformers/modeling_tf_utils.py | https://www.tensorflow.org/tfx/serving/serving_basic | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/README.md | CLIP_for_PyTorch/transformers/src/transformers/modeling_tf_utils.py | https://huggingface.co/models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/utils/hub.py | CLIP_for_PyTorch/transformers/src/transformers/modeling_tf_utils.py | https://huggingface.co/docs/transformers/installation#offline-mode | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/modeling_tf_utils.py | CLIP_for_PyTorch/transformers/src/transformers/modeling_tf_utils.py | 
https://github.com/tensorflow/tensorflow/blob/00fad90125b18b80fe054de1055770cfb8fe4ba3/tensorflow/python/keras/engine/network.py#L1339-L1357 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/pegasus/modeling_tf_pegasus.py | CLIP_for_PyTorch/transformers/src/transformers/modeling_tf_utils.py | https://github.com/tensorflow/models/blob/a009f4fb9d2fc4949e32192a944688925ef78659/official/transformer/v2/embedding_layer.py#L24 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/modeling_utils.py | CLIP_for_PyTorch/transformers/src/transformers/modeling_tf_utils.py | https://github.com/zihangdai/xlnet/blob/master/modeling.py#L253-L276 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/tensorflow/image-classification/run_image_classification.py | CLIP_for_PyTorch/transformers/src/transformers/modeling_tf_pytorch_utils.py | https://pytorch.or | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/modeling_tf_pytorch_utils.py | CLIP_for_PyTorch/transformers/src/transformers/modeling_tf_pytorch_utils.py | https://github.com/tensorflow/tensorflow/blob/ee16fcac960ae660e0e4496658a366e2f745e1f0/tensorflow/python/keras/engine/network.py#L1352-L1357 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/modeling_flax_utils.py | CLIP_for_PyTorch/transformers/src/transformers/modeling_flax_utils.py | https://github.com/deepmind/jmp/blob/3a8318abc3292be38582794dbf7b094e6583b192/jmp/_src/policy.py#L27 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/README.md | CLIP_for_PyTorch/transformers/src/transformers/modeling_flax_utils.py | https://huggingface.co/models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/speech-recognition/README.md | CLIP_for_PyTorch/transformers/src/transformers/modeling_flax_utils.py | https://huggingface.co/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/utils/hub.py | CLIP_for_PyTorch/transformers/src/transformers/modeling_flax_utils.py | https://huggingface.co/docs/transformers/installation#offline-mode | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/modeling_flax_utils.py | CLIP_for_PyTorch/transformers/src/transformers/modeling_flax_utils.py | https://github.com/google/flax/issues/1261 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/tensorflow/image-classification/run_image_classification.py | CLIP_for_PyTorch/transformers/src/transformers/modeling_flax_pytorch_utils.py | https://pytorch.or | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/modelcard.py | CLIP_for_PyTorch/transformers/src/transformers/modelcard.py | https://arxiv.org/abs/1810.03993 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/speech-recognition/README.md | CLIP_for_PyTorch/transformers/src/transformers/modelcard.py | https://huggingface.co/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/integrations/integration_utils.py | CLIP_for_PyTorch/transformers/src/transformers/integrations.py | https://github.com/huggingface/transformers/issues/11565 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/integrations/integration_utils.py | 
CLIP_for_PyTorch/transformers/src/transformers/integrations.py | https://app.sigopt.com/experiment/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/pytorch/README.md | CLIP_for_PyTorch/transformers/src/transformers/integrations.py | https://www.tensorflow.org/tensorboard | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/main_classes/callback.md | CLIP_for_PyTorch/transformers/src/transformers/integrations.py | https://www.wandb.com/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/pytorch/README.md | CLIP_for_PyTorch/transformers/src/transformers/integrations.py | https://docs.wandb.ai/integrations/huggingface | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/main_classes/callback.md | CLIP_for_PyTorch/transformers/src/transformers/integrations.py | https://www.comet.ml/site/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/trainer_tf.py | CLIP_for_PyTorch/transformers/src/transformers/integrations.py | https://www.comet.ml/docs/python-sdk/advanced/#comet-configuration-variables | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/main_classes/callback.md | CLIP_for_PyTorch/transformers/src/transformers/integrations.py | https://pypi.org/project/azureml-sdk/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/main_classes/callback.md | CLIP_for_PyTorch/transformers/src/transformers/integrations.py | https://www.mlflow.org/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/main_classes/callback.md | CLIP_for_PyTorch/transformers/src/transformers/integrations.py | https://neptune.ai | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/image_utils.py | CLIP_for_PyTorch/transformers/src/transformers/image_utils.py | https://pytorch.org/vision/stable/transforms.html#torchvision.transforms.Resize | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/hf_argparser.py | CLIP_for_PyTorch/transformers/src/transformers/hf_argparser.py | https://stackoverflow.com/questions/15008758/parsing-boolean-values-with-argparse | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/generation/utils.py | CLIP_for_PyTorch/transformers/src/transformers/generation_utils.py | https://github.com/huggingface/transformers/pull/5420/files | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/ko/llm_tutorial.md | CLIP_for_PyTorch/transformers/src/transformers/generation_utils.py | https://huggingface.co/blog/how-to-generate | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/generation/tf_logits_process.py | CLIP_for_PyTorch/transformers/src/transformers/generation_utils.py | https://arxiv.org/pdf/1909.05858.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/generation/configuration_utils.py | CLIP_for_PyTorch/transformers/src/transformers/generation_utils.py | https://github.com/huggingface/transformers/issues/14081 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/ja/generation_strategies.md | CLIP_for_PyTorch/transformers/src/transformers/generation_utils.py | https://arxiv.org/pdf/1610.02424.pdf | 参考论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/whisper/modeling_whisper.py | CLIP_for_PyTorch/transformers/src/transformers/generation_utils.py | https://arxiv.org/abs/2010.00904 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/generation/tf_utils.py | CLIP_for_PyTorch/transformers/src/transformers/generation_utils.py | http://arxiv.org/abs/1904.09751 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/generation/tf_utils.py | CLIP_for_PyTorch/transformers/src/transformers/generation_utils.py | https://gist.github.com/thomwolf/1a5a29f6962089e871b94cbd09daf317 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/generation/beam_search.py | CLIP_for_PyTorch/transformers/src/transformers/generation_tf_utils.py | https://github.com/facebookresearch/XLM/blob/9e6f6814d17be4fe5b15f2e6c43eb2b2d76daeb4/src/model/transformer.py#L529 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/ko/llm_tutorial.md | CLIP_for_PyTorch/transformers/src/transformers/generation_tf_utils.py | https://huggingface.co/blog/how-to-generate | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/generation/tf_logits_process.py | CLIP_for_PyTorch/transformers/src/transformers/generation_tf_utils.py | https://arxiv.org/pdf/1909.05858.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/generation_tf_utils.py | https://arxiv.org/abs/1909.05858 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/generation/tf_logits_process.py | CLIP_for_PyTorch/transformers/src/transformers/generation_tf_utils.py | https://github.com/pytorch/fairseq/blob/a07cb6f40480928c9e0548b737aadd36ee66ac76/fairseq/sequence_generator.py#L345 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/generation/utils.py | CLIP_for_PyTorch/transformers/src/transformers/generation_tf_utils.py | https://github.com/huggingface/transformers/pull/5420/files | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/generation/tf_utils.py | CLIP_for_PyTorch/transformers/src/transformers/generation_tf_utils.py | http://arxiv.org/abs/1904.09751 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/generation/tf_utils.py | CLIP_for_PyTorch/transformers/src/transformers/generation_tf_utils.py | https://gist.github.com/thomwolf/1a5a29f6962089e871b94cbd09daf317 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/hubert/modeling_tf_hubert.py | CLIP_for_PyTorch/transformers/src/transformers/generation_tf_utils.py | https://github.com/tensorflow/tensorflow/issues/9260 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/generation/tf_logits_process.py | CLIP_for_PyTorch/transformers/src/transformers/generation_tf_logits_process.py | https://arxiv.org/pdf/1909.05858.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/generation/tf_logits_process.py | CLIP_for_PyTorch/transformers/src/transformers/generation_tf_logits_process.py | https://github.com/pytorch/fairseq/blob/a07cb6f40480928c9e0548b737aadd36ee66ac76/fairseq/sequence_generator.py#L345 | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/generation/tf_logits_process.py | CLIP_for_PyTorch/transformers/src/transformers/generation_logits_process.py | https://arxiv.org/pdf/1909.05858.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/generation/tf_logits_process.py | CLIP_for_PyTorch/transformers/src/transformers/generation_logits_process.py | https://github.com/pytorch/fairseq/blob/a07cb6f40480928c9e0548b737aadd36ee66ac76/fairseq/sequence_generator.py#L345 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/generation/logits_process.py | CLIP_for_PyTorch/transformers/src/transformers/generation_logits_process.py | https://github.com/facebookresearch/ParlAI/blob/master/parlai/core/torch_generator_agent.py#L1350 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/whisper/modeling_whisper.py | CLIP_for_PyTorch/transformers/src/transformers/generation_logits_process.py | https://arxiv.org/abs/2010.00904 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/ja/generation_strategies.md | CLIP_for_PyTorch/transformers/src/transformers/generation_logits_process.py | https://arxiv.org/pdf/1610.02424.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/ko/llm_tutorial.md | CLIP_for_PyTorch/transformers/src/transformers/generation_flax_utils.py | https://huggingface.co/blog/how-to-generate | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/generation/utils.py | CLIP_for_PyTorch/transformers/src/transformers/generation_flax_utils.py | https://github.com/huggingface/transformers/pull/5420/files | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/generation/beam_search.py | CLIP_for_PyTorch/transformers/src/transformers/generation_beam_search.py | https://github.com/facebookresearch/XLM/blob/9e6f6814d17be4fe5b15f2e6c43eb2b2d76daeb4/src/model/transformer.py#L529 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/generation/beam_search.py | CLIP_for_PyTorch/transformers/src/transformers/generation_beam_search.py | https://github.com/ashwinkalyan/dbs/blob/master/dbs/beam_utils.lua | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/ja/generation_strategies.md | CLIP_for_PyTorch/transformers/src/transformers/generation_beam_search.py | https://arxiv.org/pdf/1610.02424.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/setup.py | CLIP_for_PyTorch/transformers/src/transformers/file_utils.py | https://github.com/allenai/allennlp | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/speech-recognition/README.md | CLIP_for_PyTorch/transformers/src/transformers/file_utils.py | https://huggingface.co | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/utils/import_utils.py | CLIP_for_PyTorch/transformers/src/transformers/file_utils.py | https://github.com/pytorch/pytorch/blob/2289a12f21c54da93bf5d696e3f9aea83dd9c10d/torch/testing/_internal/common_cuda.py#L51 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/utils/import_utils.py | CLIP_for_PyTorch/transformers/src/transformers/file_utils.py | https://github.com/tqdm/tqdm/blob/master/tqdm/autonotebook.py | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/utils/import_utils.py | CLIP_for_PyTorch/transformers/src/transformers/file_utils.py | https://github.com/google/sentencepiece#installation | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/utils/import_utils.py | CLIP_for_PyTorch/transformers/src/transformers/file_utils.py | https://github.com/protocolbuffers/protobuf/tree/master/python#installation | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/utils/import_utils.py | CLIP_for_PyTorch/transformers/src/transformers/file_utils.py | https://github.com/facebookresearch/faiss/blob/master/INSTALL.md | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/file_utils.py | https://pytorch.org/get-started/locally/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/file_utils.py | https://www.tensorflow.org/install | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/utils/import_utils.py | CLIP_for_PyTorch/transformers/src/transformers/file_utils.py | https://github.com/facebookresearch/detectron2/blob/master/INSTALL.md | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/speech-recognition/run_flax_speech_recognition_seq2seq.py | CLIP_for_PyTorch/transformers/src/transformers/file_utils.py | https://github.com/google/flax | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/utils/import_utils.py | CLIP_for_PyTorch/transformers/src/transformers/file_utils.py | https://github.com/rspeer/python-ftfy/tree/master#installing | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/quantization-qdqbert/Dockerfile | CLIP_for_PyTorch/transformers/src/transformers/file_utils.py | https://pypi.ngc.nvidia.com | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/tapas.md | CLIP_for_PyTorch/transformers/src/transformers/file_utils.py | https://github.com/tensorflow/probability | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/utils/import_utils.py | CLIP_for_PyTorch/transformers/src/transformers/file_utils.py | https://pandas.pydata.org/pandas-docs/stable/getting_started/install.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/README.md | CLIP_for_PyTorch/transformers/src/transformers/file_utils.py | https://huggingface.co/models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/speech-recognition/README.md | CLIP_for_PyTorch/transformers/src/transformers/file_utils.py | https://huggingface.co/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/file_utils.py | https://hf.co | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/utils/import_utils.py | CLIP_for_PyTorch/transformers/src/transformers/file_utils.py | https://github.com/optuna/optuna/blob/master/optuna/integration/__init__.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/utils/doc.py | CLIP_for_PyTorch/transformers/src/transformers/file_utils.py | 
http://stackoverflow.com/a/6528148/190597 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/README.md | CLIP_for_PyTorch/transformers/src/transformers/feature_extraction_utils.py | https://huggingface.co/models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/speech-recognition/README.md | CLIP_for_PyTorch/transformers/src/transformers/feature_extraction_utils.py | https://huggingface.co/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/utils/hub.py | CLIP_for_PyTorch/transformers/src/transformers/feature_extraction_utils.py | https://huggingface.co/docs/transformers/installation#offline-mode | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/integrations/deepspeed.py | CLIP_for_PyTorch/transformers/src/transformers/deepspeed.py | https://github.com/microsoft/DeepSpeed/issues/1394#issuecomment-937405374 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/deepspeed/test_deepspeed.py | CLIP_for_PyTorch/transformers/src/transformers/deepspeed.py | https://github.com/microsoft/DeepSpeed/issues/1612 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/data/processors/xnli.py | CLIP_for_PyTorch/transformers/src/transformers/data/processors/xnli.py | https://github.com/google-research/bert/blob/f39e881b169b9d53bea03d2d341b31707a6c052b/run_classifier.py#L207 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/tapex/run_tabfact_with_tapex.py | CLIP_for_PyTorch/transformers/src/transformers/data/processors/glue.py | https://github.com/huggingface/transformers/blob/master/examples/pytorch/text-classification/run_glue.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/tapex/run_tabfact_with_tapex.py | CLIP_for_PyTorch/transformers/src/transformers/data/metrics/__init__.py | https://github.com/huggingface/transformers/blob/master/examples/pytorch/text-classification/run_glue.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/tapex/run_tabfact_with_tapex.py | CLIP_for_PyTorch/transformers/src/transformers/data/datasets/glue.py | https://github.com/huggingface/transformers/blob/master/examples/pytorch/text-classification/run_glue.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/ko/model_doc/llama.md | CLIP_for_PyTorch/transformers/src/transformers/convert_slow_tokenizer.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/convert_graph_to_onnx.py | CLIP_for_PyTorch/transformers/src/transformers/convert_graph_to_onnx.py | https://github.com/microsoft/onnxruntime/tree/master/onnxruntime/python/tools/transformers | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/README.md | CLIP_for_PyTorch/transformers/src/transformers/configuration_utils.py | https://huggingface.co/models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/speech-recognition/README.md | CLIP_for_PyTorch/transformers/src/transformers/configuration_utils.py | https://huggingface.co/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/utils/hub.py | 
CLIP_for_PyTorch/transformers/src/transformers/configuration_utils.py | https://huggingface.co/docs/transformers/installation#offline-mode | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/jax-projects/README.md | CLIP_for_PyTorch/transformers/src/transformers/commands/user.py | https://git-lfs.github.com/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/commands/lfs.py | CLIP_for_PyTorch/transformers/src/transformers/commands/lfs.py | https://github.com/git-lfs/git-lfs/blob/master/docs/custom-transfers.md | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/commands/convert.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/speech-recognition/README.md | CLIP_for_PyTorch/transformers/src/transformers/commands/add_new_model_like.py | https://huggingface.co/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/README.md | CLIP_for_PyTorch/transformers/src/transformers/commands/add_new_model_like.py | https://huggingface.co/models?filter= | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/setup.py | CLIP_for_PyTorch/transformers/src/transformers/benchmark/benchmark_utils.py | https://github.com/allenai/allennlp | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/benchmark/benchmark_utils.py | CLIP_for_PyTorch/transformers/src/transformers/benchmark/benchmark_utils.py | https://github.com/pythonprofilers/memory_profiler/blob/895c4ac7a08020d66ae001e24067da6dcea42451/memory_profiler.py#L239 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/benchmark/benchmark_utils.py | CLIP_for_PyTorch/transformers/src/transformers/benchmark/benchmark_utils.py | https://psutil.readthedocs.io/en/latest/#psutil.Process.memory_info | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/benchmark/benchmark_utils.py | CLIP_for_PyTorch/transformers/src/transformers/benchmark/benchmark_utils.py | https://github.com/tensorflow/tensorflow/issues/20218#issuecomment-416771802 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/benchmark/benchmark_utils.py | CLIP_for_PyTorch/transformers/src/transformers/benchmark/benchmark_utils.py | https://github.com/pytorch/xla/issues/2180 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/benchmark/benchmark.py | CLIP_for_PyTorch/transformers/src/transformers/benchmark/benchmark_tf.py | https://docs.python.org/2/library/timeit.html#timeit.Timer.repeat | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/pytorch-lightning/lightning_base.py | CLIP_for_PyTorch/transformers/src/transformers/benchmark/benchmark_args.py | https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/benchmark/benchmark.py | CLIP_for_PyTorch/transformers/src/transformers/benchmark/benchmark.py | https://github.com/NVIDIA/apex/issues/439 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/benchmark/benchmark.py | CLIP_for_PyTorch/transformers/src/transformers/benchmark/benchmark.py | 
https://docs.python.org/2/library/timeit.html#timeit.Timer.repeat | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/activations_tf.py | CLIP_for_PyTorch/transformers/src/transformers/activations_tf.py | https://arxiv.org/abs/1606.08415 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/activations_tf.py | CLIP_for_PyTorch/transformers/src/transformers/activations_tf.py | https://arxiv.org/abs/1606.0841 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/activations_tf.py | https://arxiv.org/abs/2004.09602 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/activations_tf.py | CLIP_for_PyTorch/transformers/src/transformers/activations_tf.py | https://arxiv.org/abs/1612.08083 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/activations_tf.py | CLIP_for_PyTorch/transformers/src/transformers/activations.py | https://arxiv.org/abs/1606.08415 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/activations.py | CLIP_for_PyTorch/transformers/src/transformers/activations.py | https://github.com/hendrycks/GELUs | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_ru.md | CLIP_for_PyTorch/transformers/src/transformers/activations.py | https://arxiv.org/abs/2004.09602 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/activations.py | CLIP_for_PyTorch/transformers/src/transformers/activations.py | https://arxiv.org/abs/1702.03118 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/activations.py | CLIP_for_PyTorch/transformers/src/transformers/activations.py | https://arxiv.org/abs/1710.05941v1 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/activations.py | CLIP_for_PyTorch/transformers/src/transformers/activations.py | https://arxiv.org/abs/1908.08681 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/activations.py | CLIP_for_PyTorch/transformers/src/transformers/activations.py | https://github.com/digantamisra98/Mish | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/setup.py | CLIP_for_PyTorch/transformers/setup.py | https://test.pypi.org/legacy/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/setup.py | CLIP_for_PyTorch/transformers/setup.py | https://testpypi.python.org/pypi | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/setup.py | CLIP_for_PyTorch/transformers/setup.py | https://github.com/pypa/pip/issues/5466 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/marian.md | CLIP_for_PyTorch/transformers/scripts/tatoeba/upload_models.sh | https://huggingface.co/Helsinki-NLP/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/setup.py | CLIP_for_PyTorch/transformers/scripts/stale.py | https://github.com/allenai/allennlp | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/pegasus/build_test_sample_spm_no_bos.py | CLIP_for_PyTorch/transformers/scripts/pegasus/build_test_sample_spm_no_bos.py | https://raw.githubusercontent.com/google/sentencepiece/master/data/botchan.txt | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/scripts/fsmt/gen-card-facebook-wmt19.py | CLIP_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py | https://github.com/pytorch/fairseq/blob/master/examples/wmt19/README.md | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/fsmt.md | CLIP_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py | https://arxiv.org/abs/1907.06616 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/gen-card-facebook-wmt19.py | CLIP_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py | https://huggingface.co/facebook/wmt19-en-ru | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/gen-card-facebook-wmt19.py | CLIP_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py | https://huggingface.co/facebook/wmt19-ru-en | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/gen-card-facebook-wmt19.py | CLIP_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py | https://huggingface.co/facebook/wmt19-en-de | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/gen-card-facebook-wmt19.py | CLIP_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py | https://huggingface.co/facebook/wmt19-de-en | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/gen-card-facebook-wmt19.py | CLIP_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py | https://discuss.huggingface.co/t/issues-with-translating-inputs-containing-repeated-phrases/981 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/gen-card-allenai-wmt19.py | CLIP_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py | http://www.statmt.org/wmt19/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/gen-card-allenai-wmt19.py | CLIP_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py | http://matrix.statmt.org/test_sets/newstest2019.tgz?1556572561 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/gen-card-allenai-wmt19.py | CLIP_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt19.py | https://github.com/jungokasai/deep-shallow/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/gen-card-allenai-wmt19.py | CLIP_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt19.py | https://arxiv.org/abs/2006.10369 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/gen-card-allenai-wmt19.py | CLIP_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt19.py | https://huggingface.co/allenai/wmt19-de-en-6-6-big | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/gen-card-allenai-wmt19.py | CLIP_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt19.py | https://huggingface.co/allenai/wmt19-de-en-6-6-base | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/gen-card-allenai-wmt19.py | CLIP_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt19.py | http://www.statmt.org/wmt19/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/gen-card-allenai-wmt19.py | CLIP_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt19.py | 
http://matrix.statmt.org/test_sets/newstest2019.tgz?1556572561 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/gen-card-allenai-wmt19.py | CLIP_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt16.py | https://github.com/jungokasai/deep-shallow/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/gen-card-allenai-wmt19.py | CLIP_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt16.py | https://arxiv.org/abs/2006.10369 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/gen-card-allenai-wmt16.py | CLIP_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt16.py | https://huggingface.co/allenai/wmt16-en-de-dist-12-1 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/gen-card-allenai-wmt16.py | CLIP_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt16.py | https://huggingface.co/allenai/wmt16-en-de-dist-6-1 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/gen-card-allenai-wmt16.py | CLIP_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt16.py | https://huggingface.co/allenai/wmt16-en-de-12-1 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/gen-card-allenai-wmt16.py | CLIP_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt16.py | http://www.statmt.org/wmt16/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/gen-card-allenai-wmt16.py | CLIP_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt16.py | http://matrix.statmt.org/test_sets/newstest2016.tgz?1504722372 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/gen-card-facebook-wmt19.py | CLIP_for_PyTorch/transformers/scripts/fsmt/eval-facebook-wmt19.sh | http://matrix.statmt.org/matrix/output/1907?run_id=6937 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/eval-facebook-wmt19.sh | CLIP_for_PyTorch/transformers/scripts/fsmt/eval-facebook-wmt19.sh | http://matrix.statmt.org/matrix/output/1914?score_id=37605 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/gen-card-facebook-wmt19.py | CLIP_for_PyTorch/transformers/scripts/fsmt/eval-facebook-wmt19.sh | http://matrix.statmt.org/matrix/output/1909?run_id=6862 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/gen-card-facebook-wmt19.py | CLIP_for_PyTorch/transformers/scripts/fsmt/eval-facebook-wmt19.sh | http://matrix.statmt.org/matrix/output/1902?run_id=6750 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/speech-recognition/README.md | CLIP_for_PyTorch/transformers/examples/tensorflow/translation/run_translation.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/image-captioning/README.md | CLIP_for_PyTorch/transformers/examples/tensorflow/translation/run_translation.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/speech-recognition/README.md | CLIP_for_PyTorch/transformers/examples/tensorflow/token-classification/run_ner.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/image-captioning/README.md | 
CLIP_for_PyTorch/transformers/examples/tensorflow/token-classification/run_ner.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/image-captioning/README.md | CLIP_for_PyTorch/transformers/examples/tensorflow/text-classification/run_text_classification.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/text-classification/run_flax_glue.py | CLIP_for_PyTorch/transformers/examples/tensorflow/text-classification/run_text_classification.py | https://huggingface.co/docs/datasets/package_reference/main_classes.html#datasets.Dataset.unique | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/image-captioning/README.md | CLIP_for_PyTorch/transformers/examples/tensorflow/text-classification/run_glue.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/speech-recognition/README.md | CLIP_for_PyTorch/transformers/examples/tensorflow/summarization/run_summarization.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/image-captioning/README.md | CLIP_for_PyTorch/transformers/examples/tensorflow/summarization/run_summarization.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/speech-recognition/README.md | CLIP_for_PyTorch/transformers/examples/tensorflow/question-answering/run_qa.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/image-captioning/README.md | CLIP_for_PyTorch/transformers/examples/tensorflow/question-answering/run_qa.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/question-answering/run_qa.py | CLIP_for_PyTorch/transformers/examples/tensorflow/question-answering/run_qa.py | https://huggingface.co/transformers/index.html#supported-frameworks | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/speech-recognition/README.md | CLIP_for_PyTorch/transformers/examples/tensorflow/multiple-choice/run_swag.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/image-captioning/README.md | CLIP_for_PyTorch/transformers/examples/tensorflow/multiple-choice/run_swag.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/language-modeling/run_mlm_flax.py | CLIP_for_PyTorch/transformers/examples/tensorflow/language-modeling/run_mlm.py | https://huggingface.co/models?filter=fill-mask | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/speech-recognition/README.md | CLIP_for_PyTorch/transformers/examples/tensorflow/language-modeling/run_mlm.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/image-captioning/README.md | CLIP_for_PyTorch/transformers/examples/tensorflow/language-modeling/run_mlm.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/examples/flax/language-modeling/run_clm_flax.py | CLIP_for_PyTorch/transformers/examples/tensorflow/language-modeling/run_clm.py | https://huggingface.co/models?filter=text-generation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/speech-recognition/README.md | CLIP_for_PyTorch/transformers/examples/tensorflow/language-modeling/run_clm.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/image-captioning/README.md | CLIP_for_PyTorch/transformers/examples/tensorflow/language-modeling/run_clm.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/lxmert/modeling_frcnn.py | CLIP_for_PyTorch/transformers/examples/research_projects/visual_bert/modeling_frcnn.py | https://github.com/pytorch/pytorch/issues/22812 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/lxmert/modeling_frcnn.py | CLIP_for_PyTorch/transformers/examples/research_projects/visual_bert/modeling_frcnn.py | https://github.com/airsplay/py-bottom-up-attention/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/pytorch-lightning/lightning_base.py | CLIP_for_PyTorch/transformers/examples/research_projects/seq2seq-distillation/lightning_base.py | https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/seq2seq-distillation/finetune_pegasus_xsum.sh | CLIP_for_PyTorch/transformers/examples/research_projects/seq2seq-distillation/finetune_pegasus_xsum.sh | https://arxiv.org/abs/1912.08777 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/rag-end2end-retriever/use_own_knowledge_dataset.py | CLIP_for_PyTorch/transformers/examples/research_projects/rag-end2end-retriever/use_own_knowledge_dataset.py | https://huggingface.co/docs/datasets/loading_datasets.html?highlight=csv#csv-files | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/pytorch-lightning/lightning_base.py | CLIP_for_PyTorch/transformers/examples/research_projects/rag-end2end-retriever/lightning_base.py | https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/rag-end2end-retriever/finetune_rag.py | CLIP_for_PyTorch/transformers/examples/research_projects/rag-end2end-retriever/finetune_rag.py | https://github.com/PyTorchLightning/pytorch-lightning/issues/2424 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/rag-end2end-retriever/finetune_rag.py | CLIP_for_PyTorch/transformers/examples/research_projects/rag-end2end-retriever/finetune_rag.py | https://docs.ray.io/en/master/cluster/index.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/rag-end2end-retriever/finetune_rag.py | CLIP_for_PyTorch/transformers/examples/research_projects/rag-end2end-retriever/distributed_ray_retriever.py | https://docs.ray.io/en/master/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/rag-end2end-retriever/distributed_ray_retriever.py | 
CLIP_for_PyTorch/transformers/examples/research_projects/rag-end2end-retriever/distributed_ray_retriever.py | https://docs.ray.io/en/master/walkthrough.html#remote | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/rag-end2end-retriever/use_own_knowledge_dataset.py | CLIP_for_PyTorch/transformers/examples/research_projects/rag/use_own_knowledge_dataset.py | https://huggingface.co/docs/datasets/loading_datasets.html?highlight=csv#csv-files | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/rag/test_distributed_retriever.py | CLIP_for_PyTorch/transformers/examples/research_projects/rag/test_distributed_retriever.py | https://stackoverflow.com/questions/54338013/parallel-import-a-python-file-from-sibling-folder | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/pytorch-lightning/lightning_base.py | CLIP_for_PyTorch/transformers/examples/research_projects/rag/lightning_base.py | https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/rag-end2end-retriever/finetune_rag.py | CLIP_for_PyTorch/transformers/examples/research_projects/rag/finetune_rag.py | https://github.com/PyTorchLightning/pytorch-lightning/issues/2424 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/rag-end2end-retriever/finetune_rag.py | CLIP_for_PyTorch/transformers/examples/research_projects/rag/finetune_rag.py | https://docs.ray.io/en/master/cluster/index.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/rag-end2end-retriever/finetune_rag.py | CLIP_for_PyTorch/transformers/examples/research_projects/rag/distributed_ray_retriever.py | https://docs.ray.io/en/master/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/rag-end2end-retriever/distributed_ray_retriever.py | CLIP_for_PyTorch/transformers/examples/research_projects/rag/distributed_ray_retriever.py | https://docs.ray.io/en/master/walkthrough.html#remote | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/speech-recognition/README.md | CLIP_for_PyTorch/transformers/examples/research_projects/quantization-qdqbert/run_quant_qa.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/image-captioning/README.md | CLIP_for_PyTorch/transformers/examples/research_projects/quantization-qdqbert/run_quant_qa.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/question-answering/run_qa.py | CLIP_for_PyTorch/transformers/examples/research_projects/quantization-qdqbert/run_quant_qa.py | https://huggingface.co/transformers/index.html#supported-frameworks | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/speech-recognition/README.md | CLIP_for_PyTorch/transformers/examples/research_projects/quantization-qdqbert/evaluate-hf-trt-qa.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/image-captioning/README.md | CLIP_for_PyTorch/transformers/examples/research_projects/quantization-qdqbert/evaluate-hf-trt-qa.py | 
https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/language-modeling/run_mlm_flax.py | CLIP_for_PyTorch/transformers/examples/research_projects/performer/run_mlm_performer.py | https://huggingface.co/models?filter=fill-mask | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/speech-recognition/README.md | CLIP_for_PyTorch/transformers/examples/research_projects/performer/run_mlm_performer.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/image-captioning/README.md | CLIP_for_PyTorch/transformers/examples/research_projects/performer/run_mlm_performer.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/performer/modeling_flax_performer_utils.py | CLIP_for_PyTorch/transformers/examples/research_projects/performer/modeling_flax_performer_utils.py | https://github.com/google-research/google-research/blob/master/performer/fast_self_attention/fast_self_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/examples/research_projects/performer/modeling_flax_performer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/performer/modeling_flax_performer.py | CLIP_for_PyTorch/transformers/examples/research_projects/performer/modeling_flax_performer.py | https://arxiv.org/abs/1607.06450 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | CLIP_for_PyTorch/transformers/examples/research_projects/performer/modeling_flax_performer.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/onnx/summarization/bart_onnx/generation_onnx.py | CLIP_for_PyTorch/transformers/examples/research_projects/onnx/summarization/bart_onnx/generation_onnx.py | https://msdata.visualstudio.com/Vienna/_workitems/edit/1486599 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/run_swag.py | CLIP_for_PyTorch/transformers/examples/research_projects/movement-pruning/masked_run_squad.py | https://www.github.com/nvidia/apex | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/pytorch-lightning/lightning_base.py | CLIP_for_PyTorch/transformers/examples/research_projects/movement-pruning/masked_run_squad.py | https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/run_swag.py | CLIP_for_PyTorch/transformers/examples/research_projects/movement-pruning/masked_run_squad.py | https://code.visualstudio.com/docs/python/debugging#_attach-to-a-local-script | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/run_swag.py | CLIP_for_PyTorch/transformers/examples/research_projects/movement-pruning/masked_run_glue.py | https://www.github.com/nvidia/apex | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/pytorch-lightning/lightning_base.py | 
CLIP_for_PyTorch/transformers/examples/research_projects/movement-pruning/masked_run_glue.py | https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modules/binarizer.py | CLIP_for_PyTorch/transformers/examples/research_projects/movement-pruning/emmental/modules/binarizer.py | https://github.com/arunmallya/piggyback | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modules/binarizer.py | CLIP_for_PyTorch/transformers/examples/research_projects/movement-pruning/emmental/modules/binarizer.py | https://github.com/allenai/hidden-networks | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modules/binarizer.py | CLIP_for_PyTorch/transformers/examples/research_projects/movement-pruning/emmental/modules/binarizer.py | https://github.com/NervanaSystems/distiller/blob/2291fdcc2ea642a98d4e20629acb5a9e2e04b4e6/distiller/pruning/automated_gradual_pruner.py#L24 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | CLIP_for_PyTorch/transformers/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/run_swag.py | CLIP_for_PyTorch/transformers/examples/research_projects/mm-imdb/run_mmimdb.py | https://www.github.com/nvidia/apex | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/pytorch-lightning/lightning_base.py | CLIP_for_PyTorch/transformers/examples/research_projects/mm-imdb/run_mmimdb.py | https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/run_swag.py | CLIP_for_PyTorch/transformers/examples/research_projects/mm-imdb/run_mmimdb.py | https://code.visualstudio.com/docs/python/debugging#_attach-to-a-local-script | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/language-modeling/run_mlm_flax.py | CLIP_for_PyTorch/transformers/examples/research_projects/mlm_wwm/run_mlm_wwm.py | https://huggingface.co/models?filter=fill-mask | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/speech-recognition/README.md | CLIP_for_PyTorch/transformers/examples/research_projects/mlm_wwm/run_mlm_wwm.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/image-captioning/README.md | CLIP_for_PyTorch/transformers/examples/research_projects/mlm_wwm/run_mlm_wwm.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/run_chinese_ref.py | CLIP_for_PyTorch/transformers/examples/research_projects/mlm_wwm/run_chinese_ref.py | https://en.wikipedia.org/wiki/CJK_Unified_Ideographs_ | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/examples/legacy/run_chinese_ref.py | CLIP_for_PyTorch/transformers/examples/research_projects/mlm_wwm/run_chinese_ref.py | https://github.com/ymcui/Chinese-BERT-wwm | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/run_chinese_ref.py | CLIP_for_PyTorch/transformers/examples/research_projects/mlm_wwm/run_chinese_ref.py | https://github.com/HIT-SCIR/ltp | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/lxmert/modeling_frcnn.py | CLIP_for_PyTorch/transformers/examples/research_projects/lxmert/modeling_frcnn.py | https://github.com/pytorch/pytorch/issues/22812 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/lxmert/modeling_frcnn.py | CLIP_for_PyTorch/transformers/examples/research_projects/lxmert/modeling_frcnn.py | https://github.com/airsplay/py-bottom-up-attention/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/speech-recognition/README.md | CLIP_for_PyTorch/transformers/examples/research_projects/luke/run_luke_ner_no_trainer.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/image-captioning/README.md | CLIP_for_PyTorch/transformers/examples/research_projects/luke/run_luke_ner_no_trainer.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/longform-qa/README.md | CLIP_for_PyTorch/transformers/examples/research_projects/longform-qa/eli5_app.py | https://yjernite.github.io/lfqa.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/longform-qa/eli5_app.py | CLIP_for_PyTorch/transformers/examples/research_projects/longform-qa/eli5_app.py | https://research.google/pubs/pub49029/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/longform-qa/eli5_app.py | CLIP_for_PyTorch/transformers/examples/research_projects/longform-qa/eli5_app.py | https://arxiv.org/abs/1907.09190 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/longform-qa/eli5_app.py | CLIP_for_PyTorch/transformers/examples/research_projects/longform-qa/eli5_app.py | https://huggingface.co/facebook/bart-large | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/run_chinese_ref.py | CLIP_for_PyTorch/transformers/examples/research_projects/longform-qa/eli5_app.py | https://en.wikipedia.org/wiki/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/speech-recognition/README.md | CLIP_for_PyTorch/transformers/examples/research_projects/jax-projects/model_parallel/run_clm_mp.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/image-captioning/README.md | CLIP_for_PyTorch/transformers/examples/research_projects/jax-projects/model_parallel/run_clm_mp.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/jax-projects/model_parallel/partitions.py | CLIP_for_PyTorch/transformers/examples/research_projects/jax-projects/model_parallel/partitions.py | 
https://github.com/google-research/google-research/blob/master/flax_models/t5x/partitions.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/vision/run_image_classification.py | CLIP_for_PyTorch/transformers/examples/research_projects/jax-projects/hybrid_clip/run_hybrid_clip.py | https://huggingface.co/models?filter=v | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/language-modeling/run_mlm_flax.py | CLIP_for_PyTorch/transformers/examples/research_projects/jax-projects/hybrid_clip/run_hybrid_clip.py | https://huggingface.co/models?filter=fill-mask | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/language-modeling/run_mlm_flax.py | CLIP_for_PyTorch/transformers/examples/research_projects/jax-projects/dataset-streaming/run_mlm_flax_stream.py | https://huggingface.co/models?filter=fill-mask | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/speech-recognition/README.md | CLIP_for_PyTorch/transformers/examples/research_projects/jax-projects/dataset-streaming/run_mlm_flax_stream.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/fsner/README.md | CLIP_for_PyTorch/transformers/examples/research_projects/fsner/src/fsner/model.py | https://arxiv.org/abs/2008.10570 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/distillation/distiller.py | CLIP_for_PyTorch/transformers/examples/research_projects/distillation/utils.py | https://github.com/facebookresearch/XLM | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/pytorch-lightning/lightning_base.py | CLIP_for_PyTorch/transformers/examples/research_projects/distillation/train.py | https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/run_swag.py | CLIP_for_PyTorch/transformers/examples/research_projects/distillation/run_squad_w_distillation.py | https://www.github.com/nvidia/apex | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/pytorch-lightning/lightning_base.py | CLIP_for_PyTorch/transformers/examples/research_projects/distillation/run_squad_w_distillation.py | https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/run_swag.py | CLIP_for_PyTorch/transformers/examples/research_projects/distillation/run_squad_w_distillation.py | https://code.visualstudio.com/docs/python/debugging#_attach-to-a-local-script | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/distillation/distiller.py | CLIP_for_PyTorch/transformers/examples/research_projects/distillation/lm_seqs_dataset.py | https://github.com/facebookresearch/XLM | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/distillation/grouped_batch_sampler.py | CLIP_for_PyTorch/transformers/examples/research_projects/distillation/grouped_batch_sampler.py | https://github.com/pytorch/vision/blob/master/references/detection/group_by_aspect_ratio.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/distillation/distiller.py | CLIP_for_PyTorch/transformers/examples/research_projects/distillation/distiller.py | 
https://github.com/facebookresearch/XLM | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/run_swag.py | CLIP_for_PyTorch/transformers/examples/research_projects/distillation/distiller.py | https://www.github.com/nvidia/apex | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/distillation/distiller.py | CLIP_for_PyTorch/transformers/examples/research_projects/distillation/distiller.py | https://github.com/peterliht/knowledge-distillation-pytorch/blob/master/model/net.py#L100 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/distillation/distiller.py | CLIP_for_PyTorch/transformers/examples/research_projects/distillation/distiller.py | https://github.com/peterliht/knowledge-distillation-pytorch/issues/2 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/deebert/test_glue_deebert.py | CLIP_for_PyTorch/transformers/examples/research_projects/deebert/test_glue_deebert.py | https://github.com/huggingface/transformers/issues/10560 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/run_swag.py | CLIP_for_PyTorch/transformers/examples/research_projects/deebert/run_glue_deebert.py | https://www.github.com/nvidia/apex | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/pytorch-lightning/lightning_base.py | CLIP_for_PyTorch/transformers/examples/research_projects/deebert/run_glue_deebert.py | https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/run_swag.py | CLIP_for_PyTorch/transformers/examples/research_projects/deebert/run_glue_deebert.py | https://code.visualstudio.com/docs/python/debugging#_attach-to-a-local-script | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/codeparrot/scripts/human_eval.py | CLIP_for_PyTorch/transformers/examples/research_projects/codeparrot/scripts/human_eval.py | https://stackoverflow.com/questions/60804599/python-multiprocessing-keeps-spawning-the-whole-script | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/bertology/run_prune_gpt.py | CLIP_for_PyTorch/transformers/examples/research_projects/bertology/run_prune_gpt.py | https://github.com/huggingface/transformers/blob/783d7d2629e97c5f0c5f9ef01b8c66410275c204/examples/research_projects/bertology/run_bertology.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/bertology/run_prune_gpt.py | CLIP_for_PyTorch/transformers/examples/research_projects/bertology/run_prune_gpt.py | http://arxiv.org/abs/1905.10650 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/run_swag.py | CLIP_for_PyTorch/transformers/examples/research_projects/bertology/run_prune_gpt.py | https://code.visualstudio.com/docs/python/debugging#_attach-to-a-local-script | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/bertology/run_prune_gpt.py | CLIP_for_PyTorch/transformers/examples/research_projects/bertology/run_bertology.py | http://arxiv.org/abs/1905.10650 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/bertology/run_bertology.py | 
CLIP_for_PyTorch/transformers/examples/research_projects/bertology/run_bertology.py | https://github.com/pmichel31415/are-16-heads-really-better-than-1 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/run_swag.py | CLIP_for_PyTorch/transformers/examples/research_projects/bertology/run_bertology.py | https://code.visualstudio.com/docs/python/debugging#_attach-to-a-local-script | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/run_swag.py | CLIP_for_PyTorch/transformers/examples/research_projects/bert-loses-patience/run_glue_with_pabee.py | https://www.github.com/nvidia/apex | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/pytorch-lightning/lightning_base.py | CLIP_for_PyTorch/transformers/examples/research_projects/bert-loses-patience/run_glue_with_pabee.py | https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/run_swag.py | CLIP_for_PyTorch/transformers/examples/research_projects/bert-loses-patience/run_glue_with_pabee.py | https://code.visualstudio.com/docs/python/debugging#_attach-to-a-local-script | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | CLIP_for_PyTorch/transformers/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/bertabs/utils_summarization.py | CLIP_for_PyTorch/transformers/examples/research_projects/bertabs/utils_summarization.py | https://cs.nyu.edu/~kcho/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/bertabs/utils_summarization.py | CLIP_for_PyTorch/transformers/examples/research_projects/bertabs/utils_summarization.py | https://github.com/abisee/cnn-dailymail/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/bertabs/utils_summarization.py | CLIP_for_PyTorch/transformers/examples/research_projects/bertabs/utils_summarization.py | https://github.com/nlpyang/PreSumm | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/seq2seq/xla_spawn.py | CLIP_for_PyTorch/transformers/examples/pytorch/xla_spawn.py | https://github.com/pytorch/pytorch/blob/master/torch/distributed/launch.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/speech-recognition/README.md | CLIP_for_PyTorch/transformers/examples/pytorch/translation/run_translation_no_trainer.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/image-captioning/README.md | CLIP_for_PyTorch/transformers/examples/pytorch/translation/run_translation_no_trainer.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/speech-recognition/README.md | CLIP_for_PyTorch/transformers/examples/pytorch/translation/run_translation.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/image-captioning/README.md | CLIP_for_PyTorch/transformers/examples/pytorch/translation/run_translation.py | 
https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/speech-recognition/README.md | CLIP_for_PyTorch/transformers/examples/pytorch/token-classification/run_ner_no_trainer.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/image-captioning/README.md | CLIP_for_PyTorch/transformers/examples/pytorch/token-classification/run_ner_no_trainer.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/speech-recognition/README.md | CLIP_for_PyTorch/transformers/examples/pytorch/token-classification/run_ner.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/image-captioning/README.md | CLIP_for_PyTorch/transformers/examples/pytorch/token-classification/run_ner.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/question-answering/run_qa.py | CLIP_for_PyTorch/transformers/examples/pytorch/token-classification/run_ner.py | https://huggingface.co/transformers/index.html#supported-frameworks | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/pytorch/text-generation/run_generation.py | CLIP_for_PyTorch/transformers/examples/pytorch/text-generation/run_generation.py | https://github.com/rusiaaman/XLNet-gen#methodology | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/pytorch/text-generation/run_generation.py | CLIP_for_PyTorch/transformers/examples/pytorch/text-generation/run_generation.py | https://medium.com/@amanrusia/xlnet-speaks-comparison-to-gpt-2-ea1a4e9ba39e | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/run_swag.py | CLIP_for_PyTorch/transformers/examples/pytorch/text-classification/run_xnli.py | https://code.visualstudio.com/docs/python/debugging#_attach-to-a-local-script | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/image-captioning/README.md | CLIP_for_PyTorch/transformers/examples/pytorch/text-classification/run_glue_no_trainer.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/text-classification/run_flax_glue.py | CLIP_for_PyTorch/transformers/examples/pytorch/text-classification/run_glue_no_trainer.py | https://huggingface.co/docs/datasets/package_reference/main_classes.html#datasets.Dataset.unique | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/image-captioning/README.md | CLIP_for_PyTorch/transformers/examples/pytorch/text-classification/run_glue.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/text-classification/run_flax_glue.py | CLIP_for_PyTorch/transformers/examples/pytorch/text-classification/run_glue.py | https://huggingface.co/docs/datasets/package_reference/main_classes.html#datasets.Dataset.unique | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/speech-recognition/README.md | CLIP_for_PyTorch/transformers/examples/pytorch/summarization/run_summarization_no_trainer.py | 
https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/image-captioning/README.md | CLIP_for_PyTorch/transformers/examples/pytorch/summarization/run_summarization_no_trainer.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/speech-recognition/README.md | CLIP_for_PyTorch/transformers/examples/pytorch/summarization/run_summarization.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/image-captioning/README.md | CLIP_for_PyTorch/transformers/examples/pytorch/summarization/run_summarization.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/speech-recognition/README.md | CLIP_for_PyTorch/transformers/examples/pytorch/question-answering/run_seq2seq_qa.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/image-captioning/README.md | CLIP_for_PyTorch/transformers/examples/pytorch/question-answering/run_seq2seq_qa.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/speech-recognition/README.md | CLIP_for_PyTorch/transformers/examples/pytorch/question-answering/run_qa_no_trainer.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/image-captioning/README.md | CLIP_for_PyTorch/transformers/examples/pytorch/question-answering/run_qa_no_trainer.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/speech-recognition/README.md | CLIP_for_PyTorch/transformers/examples/pytorch/question-answering/run_qa_beam_search_no_trainer.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/image-captioning/README.md | CLIP_for_PyTorch/transformers/examples/pytorch/question-answering/run_qa_beam_search_no_trainer.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/speech-recognition/README.md | CLIP_for_PyTorch/transformers/examples/pytorch/question-answering/run_qa_beam_search.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/image-captioning/README.md | CLIP_for_PyTorch/transformers/examples/pytorch/question-answering/run_qa_beam_search.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/speech-recognition/README.md | CLIP_for_PyTorch/transformers/examples/pytorch/question-answering/run_qa.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/image-captioning/README.md | CLIP_for_PyTorch/transformers/examples/pytorch/question-answering/run_qa.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/question-answering/run_qa.py | 
CLIP_for_PyTorch/transformers/examples/pytorch/question-answering/run_qa.py | https://huggingface.co/transformers/index.html#supported-frameworks | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/speech-recognition/README.md | CLIP_for_PyTorch/transformers/examples/pytorch/multiple-choice/run_swag_no_trainer.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/image-captioning/README.md | CLIP_for_PyTorch/transformers/examples/pytorch/multiple-choice/run_swag_no_trainer.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/speech-recognition/README.md | CLIP_for_PyTorch/transformers/examples/pytorch/multiple-choice/run_swag.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/image-captioning/README.md | CLIP_for_PyTorch/transformers/examples/pytorch/multiple-choice/run_swag.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/speech-recognition/README.md | CLIP_for_PyTorch/transformers/examples/pytorch/language-modeling/run_plm.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/image-captioning/README.md | CLIP_for_PyTorch/transformers/examples/pytorch/language-modeling/run_plm.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/language-modeling/run_mlm_flax.py | CLIP_for_PyTorch/transformers/examples/pytorch/language-modeling/run_mlm_no_trainer.py | https://huggingface.co/models?filter=fill-mask | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/speech-recognition/README.md | CLIP_for_PyTorch/transformers/examples/pytorch/language-modeling/run_mlm_no_trainer.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/image-captioning/README.md | CLIP_for_PyTorch/transformers/examples/pytorch/language-modeling/run_mlm_no_trainer.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/language-modeling/run_mlm_flax.py | CLIP_for_PyTorch/transformers/examples/pytorch/language-modeling/run_mlm.py | https://huggingface.co/models?filter=fill-mask | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/speech-recognition/README.md | CLIP_for_PyTorch/transformers/examples/pytorch/language-modeling/run_mlm.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/image-captioning/README.md | CLIP_for_PyTorch/transformers/examples/pytorch/language-modeling/run_mlm.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/language-modeling/run_clm_flax.py | CLIP_for_PyTorch/transformers/examples/pytorch/language-modeling/run_clm_no_trainer.py | https://huggingface.co/models?filter=text-generation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/speech-recognition/README.md | 
CLIP_for_PyTorch/transformers/examples/pytorch/language-modeling/run_clm_no_trainer.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/image-captioning/README.md | CLIP_for_PyTorch/transformers/examples/pytorch/language-modeling/run_clm_no_trainer.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/language-modeling/run_clm_flax.py | CLIP_for_PyTorch/transformers/examples/pytorch/language-modeling/run_clm.py | https://huggingface.co/models?filter=text-generation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/speech-recognition/README.md | CLIP_for_PyTorch/transformers/examples/pytorch/language-modeling/run_clm.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/image-captioning/README.md | CLIP_for_PyTorch/transformers/examples/pytorch/language-modeling/run_clm.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/pytorch/image-pretraining/run_mim.py | CLIP_for_PyTorch/transformers/examples/pytorch/image-pretraining/run_mim.py | https://github.com/microsoft/SimMIM/blob/main/data/data_simmim.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/pytorch/image-pretraining/README.md | CLIP_for_PyTorch/transformers/examples/pytorch/image-pretraining/run_mae.py | https://arxiv.org/abs/2111.06377 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/pytorch/image-pretraining/run_mae.py | CLIP_for_PyTorch/transformers/examples/pytorch/image-pretraining/run_mae.py | https://github.com/facebookresearch/mae/blob/main/main_pretrain.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/vision/run_image_classification.py | CLIP_for_PyTorch/transformers/examples/pytorch/contrastive-image-text/run_clip.py | https://huggingface.co/models?filter=v | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/language-modeling/run_mlm_flax.py | CLIP_for_PyTorch/transformers/examples/pytorch/contrastive-image-text/run_clip.py | https://huggingface.co/models?filter=fill-mask | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/speech-recognition/README.md | CLIP_for_PyTorch/transformers/examples/pytorch/contrastive-image-text/run_clip.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/image-captioning/README.md | CLIP_for_PyTorch/transformers/examples/pytorch/contrastive-image-text/run_clip.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/pytorch-lightning/run_ner.sh | CLIP_for_PyTorch/transformers/examples/legacy/token-classification/run.sh | https://drive.google.com/drive/folders/1kC0I2UGl2ltrluI9NqDjaQJGw5iliw_J | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/seq2seq/xla_spawn.py | CLIP_for_PyTorch/transformers/examples/legacy/seq2seq/xla_spawn.py | https://github.com/pytorch/pytorch/blob/master/torch/distributed/launch.py | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/examples/legacy/run_transfo_xl.py | CLIP_for_PyTorch/transformers/examples/legacy/run_transfo_xl.py | https://github.com/kimiyoung/transformer-xl | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/run_transfo_xl.py | CLIP_for_PyTorch/transformers/examples/legacy/run_transfo_xl.py | https://github.com/kimiyoung/transformer-xl/blob/master/pytorch/eval.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/run_swag.py | CLIP_for_PyTorch/transformers/examples/legacy/run_transfo_xl.py | https://code.visualstudio.com/docs/python/debugging#_attach-to-a-local-script | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/run_swag.py | CLIP_for_PyTorch/transformers/examples/legacy/run_swag.py | https://github.com/google-research/bert/issues/38 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/run_swag.py | CLIP_for_PyTorch/transformers/examples/legacy/run_swag.py | https://www.github.com/nvidia/apex | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/pytorch-lightning/lightning_base.py | CLIP_for_PyTorch/transformers/examples/legacy/run_swag.py | https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/run_swag.py | CLIP_for_PyTorch/transformers/examples/legacy/run_swag.py | https://code.visualstudio.com/docs/python/debugging#_attach-to-a-local-script | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/run_openai_gpt.py | CLIP_for_PyTorch/transformers/examples/legacy/run_openai_gpt.py | https://github.com/huggingface/pytorch-openai-transformer-lm/blob/master/train.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/run_openai_gpt.py | CLIP_for_PyTorch/transformers/examples/legacy/run_openai_gpt.py | https://github.com/openai/finetune-transformer-lm/blob/master/train.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/run_swag.py | CLIP_for_PyTorch/transformers/examples/legacy/run_openai_gpt.py | https://code.visualstudio.com/docs/python/debugging#_attach-to-a-local-script | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/run_chinese_ref.py | CLIP_for_PyTorch/transformers/examples/legacy/run_chinese_ref.py | https://en.wikipedia.org/wiki/CJK_Unified_Ideographs_ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/run_chinese_ref.py | CLIP_for_PyTorch/transformers/examples/legacy/run_chinese_ref.py | https://github.com/ymcui/Chinese-BERT-wwm | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/run_chinese_ref.py | CLIP_for_PyTorch/transformers/examples/legacy/run_chinese_ref.py | https://github.com/HIT-SCIR/ltp | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/run_camembert.py | CLIP_for_PyTorch/transformers/examples/legacy/run_camembert.py | https://github.com/pytorch/fairseq/blob/master/fairseq/models/roberta/hub_interface.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/run_swag.py | CLIP_for_PyTorch/transformers/examples/legacy/question-answering/run_squad.py | https://www.github.com/nvidia/apex | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/examples/legacy/pytorch-lightning/lightning_base.py | CLIP_for_PyTorch/transformers/examples/legacy/question-answering/run_squad.py | https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/run_swag.py | CLIP_for_PyTorch/transformers/examples/legacy/question-answering/run_squad.py | https://code.visualstudio.com/docs/python/debugging#_attach-to-a-local-script | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/pytorch-lightning/run_ner.sh | CLIP_for_PyTorch/transformers/examples/legacy/pytorch-lightning/run_ner.sh | https://drive.google.com/drive/folders/1kC0I2UGl2ltrluI9NqDjaQJGw5iliw_J | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/pytorch-lightning/run_ner.py | CLIP_for_PyTorch/transformers/examples/legacy/pytorch-lightning/run_ner.py | https://github.com/PyTorchLightning/pytorch-lightning/blob/master/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/pytorch-lightning/run_ner.py | CLIP_for_PyTorch/transformers/examples/legacy/pytorch-lightning/run_ner.py | https://github.com/huggingface/transformers/issues/3159 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/pytorch-lightning/run_ner.py | CLIP_for_PyTorch/transformers/examples/legacy/pytorch-lightning/run_ner.py | https://github.com/PyTorchLightning/pytorch-lightning/blob/master | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/pytorch-lightning/lightning_base.py | CLIP_for_PyTorch/transformers/examples/legacy/pytorch-lightning/lightning_base.py | https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/vision/run_image_classification.py | CLIP_for_PyTorch/transformers/examples/flax/vision/run_image_classification.py | https://huggingface.co/models?filter=vit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/speech-recognition/README.md | CLIP_for_PyTorch/transformers/examples/flax/token-classification/run_flax_ner.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/image-captioning/README.md | CLIP_for_PyTorch/transformers/examples/flax/token-classification/run_flax_ner.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/image-captioning/README.md | CLIP_for_PyTorch/transformers/examples/flax/text-classification/run_flax_glue.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/text-classification/run_flax_glue.py | CLIP_for_PyTorch/transformers/examples/flax/text-classification/run_flax_glue.py | https://huggingface.co/docs/datasets/package_reference/main_classes.html#datasets.Dataset.unique | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/speech-recognition/README.md | CLIP_for_PyTorch/transformers/examples/flax/summarization/run_summarization_flax.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/image-captioning/README.md | 
CLIP_for_PyTorch/transformers/examples/flax/summarization/run_summarization_flax.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/speech-recognition/run_flax_speech_recognition_seq2seq.py | CLIP_for_PyTorch/transformers/examples/flax/summarization/run_summarization_flax.py | https://github.com/google/flax/blob/87a211135c6a377c8f29048a1cac3840e38b9da4/examples/wmt/train.py#L104 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/speech-recognition/README.md | CLIP_for_PyTorch/transformers/examples/flax/question-answering/run_qa.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/image-captioning/README.md | CLIP_for_PyTorch/transformers/examples/flax/question-answering/run_qa.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/question-answering/run_qa.py | CLIP_for_PyTorch/transformers/examples/flax/question-answering/run_qa.py | https://huggingface.co/transformers/index.html#supported-frameworks | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/language-modeling/t5_tokenizer_model.py | CLIP_for_PyTorch/transformers/examples/flax/language-modeling/t5_tokenizer_model.py | https://github.com/yandex-research/DeDLOC/blob/main/sahajbert/tokenizer/tokenizer_model.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/language-modeling/run_t5_mlm_flax.py | CLIP_for_PyTorch/transformers/examples/flax/language-modeling/run_t5_mlm_flax.py | https://huggingface.co/models?filter=t5 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/language-modeling/run_t5_mlm_flax.py | CLIP_for_PyTorch/transformers/examples/flax/language-modeling/run_t5_mlm_flax.py | https://github.com/google-research/text-to-text-transfer-transformer/blob/84f8bcc14b5f2c03de51bd3587609ba8f6bbd1cd/t5/data/preprocessors.py#L2466 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/language-modeling/run_t5_mlm_flax.py | CLIP_for_PyTorch/transformers/examples/flax/language-modeling/run_t5_mlm_flax.py | https://arxiv.org/pdf/1910.10683.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/language-modeling/run_t5_mlm_flax.py | CLIP_for_PyTorch/transformers/examples/flax/language-modeling/run_t5_mlm_flax.py | https://github.com/google-research/text-to-text-transfer-transformer/blob/master/t5/data/preprocessors.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/language-modeling/run_t5_mlm_flax.py | CLIP_for_PyTorch/transformers/examples/flax/language-modeling/run_t5_mlm_flax.py | https://github.com/google-research/text-to-text-transfer-transformer/blob/84f8bcc14b5f2c03de51bd3587609ba8f6bbd1cd/t5/data/preprocessors.py#L2682 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/speech-recognition/README.md | CLIP_for_PyTorch/transformers/examples/flax/language-modeling/run_t5_mlm_flax.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/image-captioning/README.md | CLIP_for_PyTorch/transformers/examples/flax/language-modeling/run_t5_mlm_flax.py | 
https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/language-modeling/run_mlm_flax.py | CLIP_for_PyTorch/transformers/examples/flax/language-modeling/run_t5_mlm_flax.py | https://github.com/deepmind/optax/blob/ed02befef9bf81cbbf236be3d2b0e032e9ed4a40/optax/_src/alias.py#L74 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/language-modeling/run_mlm_flax.py | CLIP_for_PyTorch/transformers/examples/flax/language-modeling/run_mlm_flax.py | https://huggingface.co/models?filter=fill-mask | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/speech-recognition/README.md | CLIP_for_PyTorch/transformers/examples/flax/language-modeling/run_mlm_flax.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/image-captioning/README.md | CLIP_for_PyTorch/transformers/examples/flax/language-modeling/run_mlm_flax.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/language-modeling/run_mlm_flax.py | CLIP_for_PyTorch/transformers/examples/flax/language-modeling/run_mlm_flax.py | https://github.com/deepmind/optax/blob/ed02befef9bf81cbbf236be3d2b0e032e9ed4a40/optax/_src/alias.py#L74 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/language-modeling/run_clm_flax.py | CLIP_for_PyTorch/transformers/examples/flax/language-modeling/run_clm_flax.py | https://huggingface.co/models?filter=text-generation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/speech-recognition/README.md | CLIP_for_PyTorch/transformers/examples/flax/language-modeling/run_clm_flax.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/image-captioning/README.md | CLIP_for_PyTorch/transformers/examples/flax/language-modeling/run_clm_flax.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/language-modeling/run_mlm_flax.py | CLIP_for_PyTorch/transformers/examples/flax/language-modeling/run_clm_flax.py | https://github.com/deepmind/optax/blob/ed02befef9bf81cbbf236be3d2b0e032e9ed4a40/optax/_src/alias.py#L74 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/speech-recognition/README.md | CLIP_for_PyTorch/transformers/examples/flax/image-captioning/run_image_captioning_flax.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/image-captioning/README.md | CLIP_for_PyTorch/transformers/examples/flax/image-captioning/run_image_captioning_flax.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/speech-recognition/run_flax_speech_recognition_seq2seq.py | CLIP_for_PyTorch/transformers/examples/flax/image-captioning/run_image_captioning_flax.py | https://github.com/google/flax/blob/87a211135c6a377c8f29048a1cac3840e38b9da4/examples/wmt/train.py#L104 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docker/transformers-pytorch-tpu/Dockerfile | CLIP_for_PyTorch/transformers/docker/transformers-pytorch-tpu/Dockerfile | 
https://github.com/conda/conda/issues/8385 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/.circleci/create_circleci_config.py | CLIP_for_PyTorch/transformers/docker/transformers-pytorch-gpu/Dockerfile | https://github.com/facebookresearch/detectron2.g | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/.circleci/create_circleci_config.py | CLIP_for_PyTorch/transformers/docker/transformers-doc-builder/Dockerfile | https://github.com/facebookresearch/detectron2.g | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/.circleci/create_circleci_config.py | CLIP_for_PyTorch/transformers/docker/transformers-all-latest-gpu/Dockerfile | https://github.com/facebookresearch/detectron2.g | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/vision/run_image_classification.py | CLIP_for_PyTorch/run_clip.py | https://huggingface.co/models?filter=v | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/language-modeling/run_mlm_flax.py | CLIP_for_PyTorch/run_clip.py | https://huggingface.co/models?filter=fill-mask | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/speech-recognition/README.md | CLIP_for_PyTorch/run_clip.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/flax/image-captioning/README.md | CLIP_for_PyTorch/run_clip.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开发引入 | / | CLIP_for_PyTorch/run_clip.py | https://huggingface.co/models?filter=vit | 模型相关说明 | -| 开发引入 | / | CLIP_for_PyTorch/run_clip.py | https://huggingface.co/models?filter=cl | 模型相关说明 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/docker/transformers-all-latest-gpu/Dockerfile | https://data.pyg.org/whl/torc | 模型相关说明 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/docker/transformers-all-latest-gpu/Dockerfile | https://github.com/kpu/kenlm/archive/master.z | 源码实现 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/docker/transformers-doc-builder/Dockerfile | https://data.pyg.org/whl/torc | 模型相关说明 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/docker/transformers-doc-builder/Dockerfile | https://github.com/kpu/kenlm/archive/master.z | 源码实现 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/docker/transformers-pytorch-gpu/Dockerfile | https://data.pyg.org/whl/torc | 模型相关说明 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/docker/transformers-pytorch-gpu/Dockerfile | https://github.com/kpu/kenlm/archive/master.z | 源码实现 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/examples/flax/language-modeling/run_clm_flax.py | https://huggingface.co/docs/datasets/package_reference/main_classes.html#datasets.Dataset.m | 模型相关说明 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/examples/flax/language-modeling/run_mlm_flax.py | https://huggingface.co/docs/datasets/package_reference/main_classes.html#datasets.Dataset.m | 模型相关说明 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/examples/flax/language-modeling/run_t5_mlm_flax.py | https://huggingface.co/docs/datasets/package_reference/main_classes.html#datasets.Dataset.m | 模型相关说明 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/examples/pytorch/contrastive-image-text/run_clip.py | https://huggingface.co/models?filter=vit | 模型相关说明 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/examples/pytorch/contrastive-image-text/run_clip.py | https://huggingface.co/models?filter=cl | 模型相关说明 | -| 开发引入 | / | 
CLIP_for_PyTorch/transformers/examples/pytorch/language-modeling/run_clm.py | https://huggingface.co/docs/datasets/package_reference/main_classes.html#datasets.Dataset.m | 模型相关说明 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/examples/pytorch/language-modeling/run_clm_no_trainer.py | https://huggingface.co/docs/datasets/package_reference/main_classes.html#datasets.Dataset.m | 模型相关说明 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/examples/pytorch/language-modeling/run_mlm.py | https://huggingface.co/docs/datasets/package_reference/main_classes.html#datasets.Dataset.m | 模型相关说明 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/examples/pytorch/language-modeling/run_mlm_no_trainer.py | https://huggingface.co/docs/datasets/package_reference/main_classes.html#datasets.Dataset.m | 模型相关说明 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/examples/pytorch/language-modeling/run_plm.py | https://huggingface.co/docs/datasets/package_reference/main_classes.html#datasets.Dataset.m | 模型相关说明 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/examples/research_projects/jax-projects/hybrid_clip/run_hybrid_clip.py | https://huggingface.co/models?filter=vit | 模型相关说明 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/examples/research_projects/jax-projects/hybrid_clip/run_hybrid_clip.py | https://huggingface.co/models?filter=cl | 模型相关说明 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/examples/research_projects/jax-projects/model_parallel/run_clm_mp.py | https://huggingface.co/docs/datasets/package_reference/main_classes.html#datasets.Dataset.m | 模型相关说明 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/examples/tensorflow/language-modeling/run_clm.py | https://huggingface.co/docs/datasets/package_reference/main_classes.html#datasets.Dataset.m | 模型相关说明 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/examples/tensorflow/language-modeling/run_mlm.py | https://huggingface.co/docs/datasets/package_reference/main_classes.html#datasets.Dataset.m | 模型相关说明 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/scripts/stale.py | https://github.com/huggingface/transformers/blob/master/CONTRIBUTING. | 源码实现 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/setup.py | https://github.com/allenai/allennlp/blob/master/setup. | 源码实现 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/src/transformers/data/datasets/language_modeling.py | https://github.com/huggingface/transformers/blob/master/examples/pytorch/language-modeling/run_mlm. | 源码实现 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/src/transformers/data/datasets/language_modeling.py | https://github.com/huggingface/transformers/blob/master/examples/pytorch/language-modeling/run_mlm_wwm. 
| 源码实现 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/src/transformers/file_utils.py | https://github.com/rusty1s/pytorch_scatt | 源码实现 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/src/transformers/file_utils.py | https://huggingface.co/sgugger/my-finetuned-be | 模型相关说明 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/src/transformers/generation_flax_utils.py | https://github.com/google/flax/blob/master/examples/wmt/train.py#L2 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/modeling_flax_pytorch_utils.py| CLIP_for_PyTorch/transformers/src/transformers/modeling_flax_pytorch_utils.py | https://pytorch.org/ | 模型相关说明 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/src/transformers/modeling_flax_pytorch_utils.py | https://flax.readthedocs.io/en/latest/installation.ht | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/modeling_flax_pytorch_utils.py| CLIP_for_PyTorch/transformers/src/transformers/modeling_tf_pytorch_utils.py | https://pytorch.org/ | 模型相关说明 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/src/transformers/modeling_tf_pytorch_utils.py | https://www.tensorflow.org/instal | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/modeling_flax_pytorch_utils.py| CLIP_for_PyTorch/transformers/src/transformers/modeling_utils.py | https://pytorch.org/ | 模型相关说明 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/src/transformers/modeling_utils.py | https://www.tensorflow.org/instal | 模型相关说明 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/src/transformers/modeling_utils.py | https://flax.readthedocs.io/en/latest/installation.ht | 模型相关说明 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/src/transformers/models/beit/configuration_beit.py | https://huggingface.co/microsoft/beit-base-patch16-224-in22k/resolve/main/config.js | 模型相关说明 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/src/transformers/models/beit/configuration_beit.py | https://huggingface.co/microsoft/beit-base-patch16-224-in2 | 模型相关说明 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/src/transformers/models/beit/convert_beit_unilm_to_pytorch.py | https://unilm.blob.core.windows.net/beit/beit_base_patch16_224_pt22k_ft22kto1k.p | 模型相关说明 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/src/transformers/models/beit/modeling_beit.py | https://github.com/rwightman/pytorch-image-models/blob/a2727c1bf78ba0d7b5727f5f95e37fb7f8866b1f/timm/models/layers/drop. | 源码实现 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/src/transformers/models/bert/convert_bert_original_tf2_checkpoint_to_pytorch.py | https://github.com/tensorflow/models/tree/master/official/nlp/be | 源码实现 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/src/transformers/models/convnext/modeling_convnext.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/layers/drop. 
| 源码实现 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/src/transformers/models/ctrl/configuration_ctrl.py | https://huggingface.co/ctrl/resolve/main/config.js | 模型相关说明 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/src/transformers/models/ctrl/configuration_ctrl.py | https://huggingface.co/ct | 模型相关说明 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/src/transformers/models/flaubert/tokenization_flaubert.py | https://github.com/benjaminp/s | 源码实现 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/src/transformers/models/imagegpt/configuration_imagegpt.py | https://huggingface.co/imageg | 模型相关说明 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/src/transformers/models/layoutlmv2/modeling_layoutlmv2.py | https://guillaumejaume.github.io/FUNSD/ | 模型相关说明 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/src/transformers/models/layoutlmv2/modeling_layoutlmv2.py | https://rrc.cvc.uab.es/?ch= | 模型相关说明 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/src/transformers/models/layoutlmv2/modeling_layoutlmv2.py | https://github.com/clovaai/cord | 源码实现 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/src/transformers/models/layoutlmv2/modeling_layoutlmv2.py | https://github.com/applicaai/kleister-n | 源码实现 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/src/transformers/models/mpnet/configuration_mpnet.py | https://huggingface.co/mpnet-ba | 模型相关说明 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/src/transformers/models/rag/modeling_tf_rag.py | https://github.com/huggingface/transformers/blob/master/src/transformers/models/dpr/modeling_tf_dpr.py#L | 源码实现 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/src/transformers/models/rag/modeling_tf_rag.py | https://github.com/huggingface/transformers/blob/master/src/transformers/modeling_tf_bart. | 源码实现 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/src/transformers/models/segformer/modeling_segformer.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/layers/drop. | 源码实现 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/src/transformers/models/tapas/modeling_tapas.py | https://github.com/rusty1s/pytorch_scatt | 源码实现 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/src/transformers/models/trocr/configuration_trocr.py | https://huggingface.co/microsoft/trocr-base/resolve/main/config.js | 模型相关说明 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/src/transformers/models/unispeech/configuration_unispeech.py | https://huggingface.co/facebook/unispeech-base-960h/resolve/main/config.js | 模型相关说明 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/src/transformers/models/unispeech/configuration_unispeech.py | https://huggingface.co/facebook/unispeech-base-96 | 模型相关说明 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | https://huggingface.co/facebook/unispeech_sat-base-960h/resolve/main/config.js | 模型相关说明 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | https://huggingface.co/facebook/unispeech_sat-base-96 | 模型相关说明 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | https://huggingface.co/facebook/unispeech_sat-base-96 | 模型相关说明 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/src/transformers/models/van/configuration_van.py | https://huggingface.co/van-ba | 模型相关说明 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/src/transformers/models/van/modeling_van.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/layers/drop. 
| 源码实现 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/src/transformers/models/vilt/feature_extraction_vilt.py | https://github.com/dandelin/ViLT/blob/3db8b5035464afee84d951bf6322e1b27f1d072d/vilt/transforms/utils.py# | 源码实现 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/src/transformers/models/wavlm/configuration_wavlm.py | https://huggingface.co/facebook/wavlm-base-960h/resolve/main/config.js | 模型相关说明 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/src/transformers/models/wavlm/configuration_wavlm.py | https://huggingface.co/facebook/wavlm-base-96 | 模型相关说明 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://github.com/chezou/Mykytea-pyth | 源码实现 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/src/transformers/pipelines/__init__.py | https://github.com/kpu/kenlm/archive/master.z | 源码实现 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/src/transformers/trainer_tf.py | https://github.com/huggingface/transformers/tree/master/examples/tensorfl | 源码实现 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/src/transformers/utils/__init__.py | https://huggingface.co/transformers/installation.html#installing-from-sour | 模型相关说明 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/src/transformers/utils/__init__.py | https://huggingface.co/transformers/examples.ht | 模型相关说明 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/src/transformers/utils/fx.py | https://github.com/pytorch/pytorch/pull/595 | 源码实现 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/tests/utils/test_utils_check_copies.py | https://huggingface.co/transformers/model_doc/albert.html | 模型相关说明 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/tests/utils/test_utils_check_copies.py | https://arxiv.org/abs/1909.11942 | 模型相关说明 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/tests/utils/test_utils_check_copies.py | https://huggingface.co/transformers/model_doc/distilbert.html | 模型相关说明 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/tests/utils/test_utils_check_copies.py | https://arxiv.org/abs/1910.01108 | 模型相关说明 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/tests/utils/test_utils_check_copies.py | https://github.com/huggingface/transformers/tree/master/examples/distillation | 源码实现 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/tests/utils/test_utils_check_copies.py | https://huggingface.co/transformers/model_doc/electra.html | 模型相关说明 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/tests/utils/test_utils_check_copies.py | https://arxiv.org/abs/2003.105 | 模型相关说明 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/tests/utils/test_utils_check_copies.py | https://arxiv.org/abs/1909.119 | 模型相关说明 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/tests/utils/test_utils_check_copies.py | https://huggingface.co/transformers/master/model_doc/albert.html | 模型相关说明 | -| 开发引入 | / | CLIP_for_PyTorch/transformers/tests/vit_mae/test_modeling_vit_mae.py | https://discuss.pytorch.org/t/random-seed-that-spans-across-devices/197 | 模型相关说明 | -| 开发引入 | / |CLIP_for_PyTorch/transformers/examples/flax/vision/requirements.txt | https://download.pytorch.org/whl/torch_stable.html | 相关依赖 | -| 开发引入 | / |CLIP_for_PyTorch/transformers/examples/research_projects/jax-projects/big_bird/requirements.txt | https://github.com/huggingface/transformers@master | 相关依赖 | -| 开发引入 | / |CLIP_for_PyTorch/transformers/examples/research_projects/jax-projects/hybrid_clip/requirements.txt | https://download.pytorch.org/whl/torch_stable.html | 相关依赖 | -| 开发引入 | / |CLIP_for_PyTorch/transformers/examples/research_projects/lxmert/requirements.txt | https://github.com/huggingface/transformers.git | 
相关依赖 | -| 开发引入 | / |CLIP_for_PyTorch/transformers/examples/research_projects/movement-pruning/requirements.txt | https://github.com/huggingface/transformers.git@352d5472b0c1dec0f420d606d16747d851b4bda8#egg=transformers | 相关依赖 | -| 开发引入 | / |CLIP_for_PyTorch/transformers/examples/research_projects/visual_bert/requirements.txt | https://github.com/huggingface/transformers.git | 相关依赖 | -| 开发引入 | / |CLIP_for_PyTorch/transformers/tests/sagemaker/scripts/pytorch/requirements.txt | https://github.com/huggingface/transformers.git@master | 相关依赖 | -| 开发引入 | / |CLIP_for_PyTorch/transformers/tests/sagemaker/scripts/tensorflow/requirements.txt | https://github.com/huggingface/transformers.git@master | 相关依赖 | +| 文件位置 | 公网地址 | 公网地址用途 | +|--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|----------------------| +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/.circleci/config.yml | ci@dummy.com | user.email配置邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/.circleci/config.yml | https://pytorch-geometric.com/whl/torch-1.11.0+cpu.html | 三方库下载 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/CITATION.cff | https://www.aclweb.org/anthology/2020.emnlp-demos.6 | 配置相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/docker/transformers-all-latest-gpu/Dockerfile | https://data.pyg.org/whl/torch-$(python3 -c "from torch import version; print(version.__version__.split(''+'')[0])")+cu102.html | 三方库连接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/docker/transformers-doc-builder/Dockerfile | https://data.pyg.org/whl/torch-$(python -c "from torch import version; print(version.__version__.split(''+'')[0])")+cpu.html | 三方库连接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/docker/transformers-doc-builder/Dockerfile | https://pypi.ngc.nvidia.com | 三方库连接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/docker/transformers-doc-builder/Dockerfile | https://pypi.ngc.nvidia.com | 三方库连接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/docker/transformers-pytorch-gpu/Dockerfile | https://data.pyg.org/whl/torch-$(python3 -c "from torch import version; print(version.__version__.split(''+'')[0])")+cu102.html | 三方库连接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/docker/transformers-pytorch-tpu/Dockerfile | https://repo.anaconda.com/miniconda/Miniconda3-4.7.12-Linux-x86_64.sh | 三方库连接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/examples/research_projects/fsner/setup.py | msi.sayef@gmail.com | 作者邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/examples/research_projects/lxmert/utils.py | 
https://s3.amazonaws.com/models.huggingface.co/bert | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/examples/research_projects/performer/modeling_flax_performer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py | https://s3.amazonaws.com/models.huggingface.co/bert/pplm/bow/technology.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py | https://s3.amazonaws.com/models.huggingface.co/bert/pplm/bow/space.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py | https://s3.amazonaws.com/models.huggingface.co/bert/pplm/bow/science.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py | https://s3.amazonaws.com/models.huggingface.co/bert/pplm/bow/religion.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py | https://s3.amazonaws.com/models.huggingface.co/bert/pplm/bow/politics.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py | https://s3.amazonaws.com/models.huggingface.co/bert/pplm/bow/military.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py | https://s3.amazonaws.com/models.huggingface.co/bert/pplm/bow/legal.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py | https://s3.amazonaws.com/models.huggingface.co/bert/pplm/discriminators/SST_classifier_head.pt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py | https://s3.amazonaws.com/models.huggingface.co/bert/pplm/discriminators/clickbait_classifier_head.pt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/examples/research_projects/quantization-qdqbert/Dockerfile | https://pypi.ngc.nvidia.com | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/examples/research_projects/rag/finetune_rag.py | https://docs.ray.io/en/master/cluster/index.html | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/examples/research_projects/rag-end2end-retriever/finetune_rag.py | https://docs.ray.io/en/master/cluster/index.html | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/examples/research_projects/visual_bert/utils.py | https://s3.amazonaws.com/models.huggingface.co/bert | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/scripts/fsmt/convert-facebook-wmt19.sh | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.ru-en.ensemble.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/scripts/fsmt/convert-facebook-wmt19.sh | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-de.joined-dict.ensemble.tar.gz | 模型相关说明 | +| 
ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py | http://matrix.statmt.org/matrix/output/1907?run_id=6937 | 数据集配置链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py | http://matrix.statmt.org/matrix/output/1914?run_id=6724 | 数据集配置链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py | http://matrix.statmt.org/matrix/output/1909?run_id=6862 | 数据集配置链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py | http://matrix.statmt.org/matrix/output/1902?run_id=6750 | 数据集配置链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/file_utils.py | https://s3.amazonaws.com/models.huggingface.co/bert | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/file_utils.py | https://pandas.pydata.org/pandas-docs/stable/getting_started/install.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/file_utils.py | https://pytorch.org/get-started/locally/ | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/file_utils.py | https://pypi.ngc.nvidia.com | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/file_utils.py | https://www.tensorflow.org/install | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/integrations.py | https://app.sigopt.com/experiment/{experiment.id} | 创建experiment | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/modeling_flax_pytorch_utils.py | https://flax.readthedocs.io/en/latest/installation.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/modeling_flax_pytorch_utils.py | https://flax.readthedocs.io/en/latest/installation.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/modeling_tf_pytorch_utils.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/modeling_tf_pytorch_utils.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/modeling_tf_pytorch_utils.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/modeling_tf_pytorch_utils.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/modeling_tf_pytorch_utils.py | https://pytorch.org/ | pytorch链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/modeling_tf_pytorch_utils.py | https://pytorch.org/ | pytorch链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/modeling_tf_pytorch_utils.py | https://pytorch.org/ | pytorch链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/modeling_tf_pytorch_utils.py | https://pytorch.org/ | pytorch链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/modeling_utils.py | https://www.tensorflow.org/install/ | 三方库install | +| 
ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/modeling_utils.py | https://pytorch.org/ | pytorch链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/modeling_utils.py | https://flax.readthedocs.io/en/latest/installation.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/albert/modeling_albert.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/albert/modeling_albert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/albert/modeling_tf_albert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/bart/modeling_bart.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/bart/modeling_bart.py | https://arxiv.org/abs/1910.13461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/bart/modeling_flax_bart.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/bart/modeling_flax_bart.py | https://arxiv.org/abs/1910.13461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/bart/modeling_flax_bart.py | https://arxiv.org/abs/1910.13461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/bart/modeling_tf_bart.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/beit/convert_beit_unilm_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/beit/convert_beit_unilm_to_pytorch.py | https://unilm.blob.core.windows.net/beit/beit_base_patch16_224_pt22k_ft22kto1k.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/beit/modeling_beit.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/beit/modeling_beit.py | http://images.cocodataset.org/val2017/000000039769.jpg | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/beit/modeling_beit.py | http://images.cocodataset.org/val2017/000000039769.jpg | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/beit/modeling_flax_beit.py | http://images.cocodataset.org/val2017/000000039769.jpg | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/beit/modeling_flax_beit.py | http://images.cocodataset.org/val2017/000000039769.jpg | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/beit/modeling_flax_beit.py | http://images.cocodataset.org/val2017/000000039769.jpg | 数据集地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/bert/modeling_bert.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/bert/modeling_bert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/bert/modeling_tf_bert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/bert_generation/modeling_bert_generation.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/bert_generation/modeling_bert_generation.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/big_bird/modeling_big_bird.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/big_bird/modeling_big_bird.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/bigbird_pegasus/modeling_bigbird_pegasus.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/bigbird_pegasus/modeling_bigbird_pegasus.py | https://arxiv.org/abs/1910.13461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_blenderbot.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_flax_blenderbot.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_flax_blenderbot.py | https://arxiv.org/abs/1910.13461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_flax_blenderbot.py | https://arxiv.org/abs/1910.13461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_tf_blenderbot.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_blenderbot_small.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_flax_blenderbot_small.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_flax_blenderbot_small.py | https://arxiv.org/abs/1910.13461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_flax_blenderbot_small.py | https://arxiv.org/abs/1910.13461 | 论文地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_tf_blenderbot_small.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/camembert/modeling_camembert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/camembert/modeling_tf_camembert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/canine/modeling_canine.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/canine/modeling_canine.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/clip/modeling_clip.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/clip/modeling_clip.py | http://images.cocodataset.org/val2017/000000039769.jpg | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/clip/modeling_clip.py | http://images.cocodataset.org/val2017/000000039769.jpg | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/clip/modeling_clip.py | http://images.cocodataset.org/val2017/000000039769.jpg | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/clip/modeling_flax_clip.py | http://images.cocodataset.org/val2017/000000039769.jpg | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/clip/modeling_flax_clip.py | http://images.cocodataset.org/val2017/000000039769.jpg | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/clip/modeling_flax_clip.py | http://images.cocodataset.org/val2017/000000039769.jpg | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/clip/modeling_tf_clip.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/clip/modeling_tf_clip.py | http://images.cocodataset.org/val2017/000000039769.jpg | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/clip/modeling_tf_clip.py | http://images.cocodataset.org/val2017/000000039769.jpg | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/clip/modeling_tf_clip.py | http://images.cocodataset.org/val2017/000000039769.jpg | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/convbert/modeling_convbert.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/convbert/modeling_convbert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/convbert/modeling_tf_convbert.py | 
https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_tiny_1k_224_ema.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_xlarge_22k_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_xlarge_22k_1k_384_ema.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_xlarge_22k_1k_224_ema.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_small_1k_224_ema.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_large_22k_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_large_22k_1k_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_large_22k_1k_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_large_1k_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_large_1k_224_ema.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_base_22k_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_base_22k_1k_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_base_22k_1k_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_base_1k_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_base_1k_224_ema.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_tiny_1k_224_ema.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/convnext/modeling_convnext.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/convnext/modeling_tf_convnext.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/convnext/modeling_tf_convnext.py | http://images.cocodataset.org/val2017/000000039769.jpg | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/convnext/modeling_tf_convnext.py | http://images.cocodataset.org/val2017/000000039769.jpg | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/ctrl/modeling_ctrl.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/ctrl/modeling_tf_ctrl.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/ctrl/tokenization_ctrl.py | https://raw.githubusercontent.com/salesforce/ctrl/master/ctrl-vocab.json | 模型参数相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/ctrl/tokenization_ctrl.py | https://raw.githubusercontent.com/salesforce/ctrl/master/ctrl-merges.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/ctrl/tokenization_ctrl.py | https://raw.githubusercontent.com/salesforce/ctrl/master/ctrl-merges.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_audio.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_audio.py | https://arxiv.org/pdf/2202.03555 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_text.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_text.py | https://arxiv.org/pdf/2202.03555 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/deberta/modeling_deberta.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/deberta/modeling_deberta.py | https://arxiv.org/abs/2006.03654 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/deberta/modeling_tf_deberta.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/deberta/modeling_tf_deberta.py | https://arxiv.org/abs/2006.03654 | 论文地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/deberta_v2/modeling_deberta_v2.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/deberta_v2/modeling_deberta_v2.py | https://arxiv.org/abs/2006.03654 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/deberta_v2/modeling_tf_deberta_v2.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/deberta_v2/modeling_tf_deberta_v2.py | https://arxiv.org/abs/2006.03654 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/deit/convert_deit_timm_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/deit/modeling_deit.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/deit/modeling_deit.py | http://images.cocodataset.org/val2017/000000039769.jpg | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/deit/modeling_deit.py | http://images.cocodataset.org/val2017/000000039769.jpg | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/detr/convert_detr_original_pytorch_checkpoint_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | http://images.cocodataset.org/val2017/000000039769.jpg | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | http://images.cocodataset.org/val2017/000000039769.jpg | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | http://images.cocodataset.org/val2017/000000039769.jpg | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/distilbert/modeling_distilbert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/distilbert/modeling_tf_distilbert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/dit/convert_dit_unilm_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/dit/convert_dit_unilm_to_pytorch.py | https://layoutlm.blob.core.windows.net/dit/dit-pts/dit-base-224-p16-500k-62d53a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/dpr/modeling_dpr.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| 
ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/dpr/modeling_tf_dpr.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/electra/modeling_electra.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/electra/modeling_electra.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/electra/modeling_flax_electra.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/electra/modeling_tf_electra.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/encoder_decoder/modeling_encoder_decoder.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/encoder_decoder/modeling_flax_encoder_decoder.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/encoder_decoder/modeling_tf_encoder_decoder.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/flaubert/modeling_flaubert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/flaubert/modeling_tf_flaubert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/fnet/modeling_fnet.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/fsmt/modeling_fsmt.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/funnel/modeling_funnel.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/funnel/modeling_funnel.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/funnel/modeling_funnel.py | https://arxiv.org/abs/2006.03236 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/funnel/modeling_tf_funnel.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/funnel/modeling_tf_funnel.py | https://arxiv.org/abs/2006.03236 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/gpt_neo/modeling_flax_gpt_neo.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明链接 | +| 
ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/gpt_neo/modeling_gpt_neo.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/gpt_neo/modeling_gpt_neo.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_flax_gpt2.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_gpt2.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_gpt2.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_tf_gpt2.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/gptj/modeling_flax_gptj.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/gptj/modeling_gptj.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/hubert/modeling_hubert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/hubert/modeling_hubert.py | https://arxiv.org/abs/2106.07447 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/hubert/modeling_tf_hubert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/ibert/modeling_ibert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/imagegpt/modeling_imagegpt.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/imagegpt/modeling_imagegpt.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/imagegpt/modeling_imagegpt.py | http://images.cocodataset.org/val2017/000000039769.jpg | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/imagegpt/modeling_imagegpt.py | http://images.cocodataset.org/val2017/000000039769.jpg | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/layoutlm/modeling_layoutlm.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/layoutlm/modeling_layoutlm.py | https://arxiv.org/abs/1912.13318 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/layoutlm/modeling_tf_layoutlm.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明链接 | +| 
ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/layoutlmv2/modeling_layoutlmv2.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/led/modeling_led.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/led/modeling_led.py | https://arxiv.org/abs/1910.13461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/led/modeling_led.py | https://arxiv.org/abs/2004.05150 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/led/modeling_tf_led.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/longformer/modeling_longformer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/longformer/modeling_longformer.py | https://arxiv.org/abs/2004.05150 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/longformer/modeling_tf_longformer.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/longformer/modeling_tf_longformer.py | https://arxiv.org/abs/2004.05150 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/luke/modeling_luke.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/lxmert/modeling_lxmert.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/lxmert/modeling_lxmert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/lxmert/modeling_lxmert.py | https://arxiv.org/abs/1908.07490 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/lxmert/modeling_tf_lxmert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/lxmert/modeling_tf_lxmert.py | https://arxiv.org/abs/1908.07490 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/m2m_100/modeling_m2m_100.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/marian/convert_marian_tatoeba_to_pytorch.py | https://datahub.io/core/language-codes/r/language-codes-3b2.csv | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/marian/modeling_flax_marian.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/marian/modeling_flax_marian.py | https://arxiv.org/abs/1910.13461 | 论文地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/marian/modeling_flax_marian.py | https://arxiv.org/abs/1910.13461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/marian/modeling_marian.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/marian/modeling_tf_marian.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/maskformer/convert_maskformer_original_pytorch_checkpoint_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/maskformer/modeling_maskformer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/maskformer/modeling_maskformer.py | http://images.cocodataset.org/val2017/000000039769.jpg | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/mbart/modeling_flax_mbart.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/mbart/modeling_flax_mbart.py | https://arxiv.org/abs/1910.13461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/mbart/modeling_flax_mbart.py | https://arxiv.org/abs/1910.13461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/mbart/modeling_mbart.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/mbart/modeling_tf_mbart.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/megatron_bert/modeling_megatron_bert.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/megatron_bert/modeling_megatron_bert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/mmbt/modeling_mmbt.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/mobilebert/modeling_mobilebert.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/mobilebert/modeling_mobilebert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/mobilebert/modeling_tf_mobilebert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/mpnet/modeling_mpnet.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| 
ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/mpnet/modeling_tf_mpnet.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/nystromformer/modeling_nystromformer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/openai/modeling_openai.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/openai/modeling_tf_openai.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_flax_pegasus.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_flax_pegasus.py | https://arxiv.org/abs/1910.13461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_flax_pegasus.py | https://arxiv.org/abs/1910.13461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_pegasus.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_tf_pegasus.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/perceiver/convert_perceiver_haiku_to_pytorch.py | https://storage.googleapis.com/perceiver_io/dalmation.jpg | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/perceiver/modeling_perceiver.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/perceiver/modeling_perceiver.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/perceiver/modeling_perceiver.py | http://images.cocodataset.org/val2017/000000039769.jpg | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/perceiver/modeling_perceiver.py | http://images.cocodataset.org/val2017/000000039769.jpg | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/perceiver/modeling_perceiver.py | http://images.cocodataset.org/val2017/000000039769.jpg | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/perceiver/modeling_perceiver.py | http://images.cocodataset.org/val2017/000000039769.jpg | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/plbart/modeling_plbart.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/poolformer/convert_poolformer_original_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 数据集地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/poolformer/modeling_poolformer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/prophetnet/modeling_prophetnet.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/qdqbert/modeling_qdqbert.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/qdqbert/modeling_qdqbert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/rag/modeling_rag.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/rag/modeling_tf_rag.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/rag/retrieval_rag.py | https://storage.googleapis.com/huggingface-nlp/datasets/wiki_dpr/ | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/realm/modeling_realm.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/realm/modeling_realm.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/reformer/modeling_reformer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/reformer/modeling_reformer.py | https://arxiv.org/abs/2001.04451 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/rembert/modeling_rembert.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/rembert/modeling_rembert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/rembert/modeling_tf_rembert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/resnet/modeling_resnet.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/retribert/modeling_retribert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/roberta/modeling_roberta.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/roberta/modeling_tf_roberta.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/roformer/modeling_roformer.py | https://www.tensorflow.org/install/ | 三方库install | +| 
ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/roformer/modeling_roformer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/roformer/modeling_tf_roformer.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/segformer/convert_segformer_original_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/segformer/modeling_segformer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/segformer/modeling_segformer.py | http://images.cocodataset.org/val2017/000000039769.jpg | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/sew/modeling_sew.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/sew/modeling_sew.py | https://arxiv.org/abs/2109.06870 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/sew_d/modeling_sew_d.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/sew_d/modeling_sew_d.py | https://arxiv.org/abs/2109.06870 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/speech_encoder_decoder/modeling_flax_speech_encoder_decoder.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/speech_encoder_decoder/modeling_flax_speech_encoder_decoder.py | https://arxiv.org/abs/2104.06678 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/speech_encoder_decoder/modeling_speech_encoder_decoder.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/speech_encoder_decoder/modeling_speech_encoder_decoder.py | https://arxiv.org/abs/2104.06678 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/speech_to_text/modeling_speech_to_text.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/speech_to_text/modeling_speech_to_text.py | https://arxiv.org/abs/1910.13461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/speech_to_text/modeling_tf_speech_to_text.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/speech_to_text_2/modeling_speech_to_text_2.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/splinter/modeling_splinter.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| 
ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/squeezebert/modeling_squeezebert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/squeezebert/modeling_squeezebert.py | https://arxiv.org/abs/2006.11316 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/swin/convert_swin_timm_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/swin/modeling_swin.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/swin/modeling_swin.py | http://images.cocodataset.org/val2017/000000039769.jpg | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/t5/modeling_flax_t5.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/t5/modeling_flax_t5.py | https://arxiv.org/abs/1910.13461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/t5/modeling_flax_t5.py | https://arxiv.org/abs/1910.10683 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | https://arxiv.org/abs/1910.10683 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/t5/modeling_tf_t5.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/t5/modeling_tf_t5.py | https://arxiv.org/abs/1910.10683 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/tapas/modeling_tapas.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/tapas/modeling_tapas.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/tapas/modeling_tf_tapas.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/transfo_xl/modeling_tf_transfo_xl.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/transfo_xl/modeling_transfo_xl.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/transfo_xl/modeling_transfo_xl.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/trocr/modeling_trocr.py 
| https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/unispeech/modeling_unispeech.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/unispeech/modeling_unispeech.py | https://arxiv.org/abs/2101.07597 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | https://arxiv.org/abs/2006.11477 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/van/modeling_van.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/vilt/convert_vilt_original_to_pytorch.py | https://www.researchgate.net/profile/Dinh-Sang/publication/338099565/figure/fig8/AS:840413229350922@1577381536857/An-receipt-example-in-the-SROIE-2019-dataset_Q640.jpg | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/vilt/convert_vilt_original_to_pytorch.py | https://lil.nlp.cornell.edu/nlvr/exs/ex0_0.jpg | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/vilt/convert_vilt_original_to_pytorch.py | https://lil.nlp.cornell.edu/nlvr/exs/ex0_0.jpg | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/vilt/modeling_vilt.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/vilt/modeling_vilt.py | http://images.cocodataset.org/val2017/000000039769.jpg | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/vilt/modeling_vilt.py | http://images.cocodataset.org/val2017/000000039769.jpg | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/vilt/modeling_vilt.py | http://images.cocodataset.org/val2017/000000039769.jpg | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/vilt/modeling_vilt.py | http://images.cocodataset.org/val2017/000000039769.jpg | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/convert_trocr_unilm_to_pytorch.py | https://layoutlm.blob.core.windows.net/trocr/model_zoo/fairseq/trocr-base-handwritten.pt | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/convert_trocr_unilm_to_pytorch.py | https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2Fmrpc_dev_ids.tsv?alt=media&token=ec5c0836-31d5-48f4-b431-7480817f1adc | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_flax_vision_encoder_decoder.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明链接 | +| 
ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_flax_vision_encoder_decoder.py | https://arxiv.org/abs/2109.10282 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_flax_vision_encoder_decoder.py | http://images.cocodataset.org/val2017/000000039769.jpg | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_flax_vision_encoder_decoder.py | http://images.cocodataset.org/val2017/000000039769.jpg | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_flax_vision_encoder_decoder.py | http://images.cocodataset.org/val2017/000000039769.jpg | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_tf_vision_encoder_decoder.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_tf_vision_encoder_decoder.py | https://arxiv.org/abs/2109.10282 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_tf_vision_encoder_decoder.py | http://images.cocodataset.org/val2017/000000039769.jpg | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_tf_vision_encoder_decoder.py | http://images.cocodataset.org/val2017/000000039769.jpg | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_vision_encoder_decoder.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_vision_encoder_decoder.py | https://arxiv.org/abs/2109.10282 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_flax_vision_text_dual_encoder.py | https://arxiv.org/abs/2111.07991 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_vision_text_dual_encoder.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_vision_text_dual_encoder.py | http://images.cocodataset.org/val2017/000000039769.jpg | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_vision_text_dual_encoder.py | https://arxiv.org/abs/2111.07991 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/visual_bert/modeling_visual_bert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/vit/convert_dino_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/vit/convert_vit_timm_to_pytorch.py | 
http://images.cocodataset.org/val2017/000000039769.jpg | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/vit/modeling_flax_vit.py | http://images.cocodataset.org/val2017/000000039769.jpg | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/vit/modeling_flax_vit.py | http://images.cocodataset.org/val2017/000000039769.jpg | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/vit/modeling_tf_vit.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/vit/modeling_tf_vit.py | http://images.cocodataset.org/val2017/000000039769.jpg | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/vit/modeling_tf_vit.py | http://images.cocodataset.org/val2017/000000039769.jpg | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/vit/modeling_vit.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/vit/modeling_vit.py | http://images.cocodataset.org/val2017/000000039769.jpg | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/vit_mae/convert_vit_mae_to_pytorch.py | https://dl.fbaipublicfiles.com/mae/visualize/mae_visualize_vit_base.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/vit_mae/convert_vit_mae_to_pytorch.py | https://user-images.githubusercontent.com/11435359/147738734-196fd92f-9260-48d5-ba7e-bf103d29364d.jpg | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/vit_mae/modeling_vit_mae.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/vit_mae/modeling_vit_mae.py | http://images.cocodataset.org/val2017/000000039769.jpg | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/vit_mae/modeling_vit_mae.py | http://images.cocodataset.org/val2017/000000039769.jpg | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_flax_wav2vec2.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_flax_wav2vec2.py | https://arxiv.org/abs/2006.11477 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_tf_wav2vec2.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_wav2vec2.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_wav2vec2.py | https://arxiv.org/abs/2006.11477 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/wavlm/modeling_wavlm.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| 
ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/wavlm/modeling_wavlm.py | https://arxiv.org/abs/2101.07597 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/xglm/modeling_flax_xglm.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/xglm/modeling_xglm.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/xlm/modeling_tf_xlm.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/xlm/modeling_xlm.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://nlp.stanford.edu/software/stanford-segmenter-2018-10-16.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/xlm_roberta/modeling_tf_xlm_roberta.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/xlm_roberta/modeling_xlm_roberta.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/xlm_roberta_xl/modeling_xlm_roberta_xl.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/xlnet/modeling_tf_xlnet.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/xlnet/modeling_xlnet.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/xlnet/modeling_xlnet.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/models/yoso/modeling_yoso.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/pipelines/base.py | https://www.tensorflow.org/install/ | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/pipelines/base.py | https://pytorch.org/ | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/src/transformers/trainer_tf.py | https://docs.wandb.com/huggingface | 相关配置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_{{cookiecutter.lowercase_modelname}}.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_{{cookiecutter.lowercase_modelname}}.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| 
ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_{{cookiecutter.lowercase_modelname}}.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_{{cookiecutter.lowercase_modelname}}.py | https://arxiv.org/abs/1910.13461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | https://arxiv.org/abs/1910.13461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | https://arxiv.org/abs/1910.13461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/utils/download_glue_data.py | firebase-adminsdk-0khhl@mtl-sentence-representations.iam.gserviceaccount.com | glue数据集diagnostic链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/utils/download_glue_data.py | https://dl.fbaipublicfiles.com/senteval/senteval_data/msr_paraphrase_train.txt | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/utils/download_glue_data.py | https://dl.fbaipublicfiles.com/senteval/senteval_data/msr_paraphrase_test.txt | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/utils/download_glue_data.py | https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FWNLI.zip?alt=media&token=068ad0a0-ded7-4bd7-99a5-5e00222e0faf | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/utils/download_glue_data.py | https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FSTS-B.zip?alt=media&token=bddb94a7-8706-4e0d-a694-1109e12273b5 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/utils/download_glue_data.py | https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FSST-2.zip?alt=media&token=aabc5f6b-e466-44a2-b9b4-cf6337f84ac8 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/utils/download_glue_data.py | https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FSNLI.zip?alt=media&token=4afcfbb2-ff0c-4b2d-a09a-dbf07926f4df | 数据集链接 | +| 
ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/utils/download_glue_data.py | https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FRTE.zip?alt=media&token=5efa7e85-a0bb-4f19-8ea2-9e1840f077fb | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/utils/download_glue_data.py | https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FQQP.zip?alt=media&token=700c6acf-160d-4d89-81d1-de4191d02cb5 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/utils/download_glue_data.py | https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FQNLIv2.zip?alt=media&token=6fdcf570-0fc5-4631-8456-9505272d1601 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/utils/download_glue_data.py | https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FMNLI.zip?alt=media&token=50329ea1-e339-40e2-809c-10c40afff3ce | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/utils/download_glue_data.py | https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FCoLA.zip?alt=media&token=46d5e637-3411-4188-bc44-5809b5bfb5f4 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/utils/download_glue_data.py | http://serre-lab.clps.brown.edu/wp-content/uploads/2013/10/hmdb51_org.rar | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/transformers/utils/download_glue_data.py | https://storage.googleapis.com/mtl-sentence-representations.appspot.com/tsvsWithoutLabels%2FAX.tsv?GoogleAccessId=firebase-adminsdk-0khhl@mtl-sentence-representations.iam.gserviceaccount.com&Expires=2498860800&Signature=DuQ2CSPt2Yfre0C%2BiISrVYrIFaZH1Lc7hBVZDD4ZyR7fZYOMNOUGpi8QxBmTNOrNPjR3z1cggo7WXFfrgECP6FBJSsURv8Ybrue8Ypt%2FTPxbuJ0Xc2FhDi%2BarnecCBFO77RSbfuz%2Bs95hRrYhTnByqu3U%2FYZPaj3tZt5QdfpH2IUROY8LiBXoXS46LE%2FgOQc%2FKN%2BA9SoscRDYsnxHfG0IjXGwHN%2Bf88q6hOmAxeNPx6moDulUF6XMUAaXCSFU%2BnRO2RDL9CapWxj%2BDl7syNyHhB7987hZ80B%2FwFkQ3MEs8auvt5XW1%2Bd4aCU7ytgM69r8JDCwibfhZxpaa4gd50QXQ%3D%3D | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/CLIP_for_PyTorch/url.ini | thomas@huggingface.co | 邮箱地址 | \ No newline at end of file diff --git a/PyTorch/built-in/mm/Chinese-CLIP_for_PyTorch/public_address_statement.md b/PyTorch/built-in/mm/Chinese-CLIP_for_PyTorch/public_address_statement.md index c0767056de7c698b3132cdb2002ad9d015d9f3ad..2fc9905086867d9d876e2584f1cce2d7fede3977 100644 --- a/PyTorch/built-in/mm/Chinese-CLIP_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/mm/Chinese-CLIP_for_PyTorch/public_address_statement.md @@ -1,25 +1,8 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ---- | ------------ | ------ | ------------------------------------ | -------- | -| 开源代码引入 | https://github.com/OFA-Sys/Chinese-CLIP/blob/master/deployment.md|Chinese-CLIP_for_PyTorch/setup.py | https://github.com/OFA-Sys/Chinese-CL | 源码实现 | -| 开源代码引入 | https://github.com/OFA-Sys/Chinese-CLIP/blob/master/cn_clip/training/params.py|Chinese-CLIP_for_PyTorch/cn_clip/training/params.py | https://arxiv.org/pdf/2103.00020.p | 模型相关说明 | -| 开源代码引入 | https://github.com/OFA-Sys/Chinese-CLIP/blob/master/cn_clip/training/main.py|Chinese-CLIP_for_PyTorch/cn_clip/training/main.py | https://github.com/openai/CLIP/issues/ | 模型相关说明 | -| 开源代码引入 | 
https://github.com/OFA-Sys/Chinese-CLIP/blob/master/cn_clip/eval/extract_features.py|Chinese-CLIP_for_PyTorch/cn_clip/training/main.py | https://discuss.pytorch.org/t/valueerror-attemting-to-unscale-fp16-gradients/813 | 模型相关说明 | -| 开源代码引入 | https://github.com/OFA-Sys/Chinese-CLIP/blob/master/cn_clip/eval/extract_features.py|Chinese-CLIP_for_PyTorch/cn_clip/eval/zeroshot_evaluation.py | https://discuss.pytorch.org/t/valueerror-attemting-to-unscale-fp16-gradients/813 | 模型相关说明 | -| 开源代码引入 | https://github.com/OFA-Sys/Chinese-CLIP/blob/master/cn_clip/eval/imagenet_zeroshot_templates.py|Chinese-CLIP_for_PyTorch/cn_clip/eval/imagenet_zeroshot_templates.py | https://github.com/mlfoundations/open_clip/blob/main/src/training/imagenet_zeroshot_data. | 源码实现 | -| 开源代码引入 | https://github.com/OFA-Sys/Chinese-CLIP/blob/master/cn_clip/eval/imagenet_zeroshot_templates.py|Chinese-CLIP_for_PyTorch/cn_clip/eval/imagenet_zeroshot_templates.py | https://gitee.com/mindspore/models/tree/master/research/mm/wuko | 模型相关说明 | -| 开源代码引入 | https://github.com/OFA-Sys/Chinese-CLIP/blob/master/cn_clip/eval/extract_features.py|Chinese-CLIP_for_PyTorch/cn_clip/eval/extract_features.py | https://discuss.pytorch.org/t/valueerror-attemting-to-unscale-fp16-gradients/813 | 模型相关说明 | -| 开源代码引入 | https://github.com/OFA-Sys/Chinese-CLIP/blob/master/cn_clip/deploy/tensorrt_utils.py|Chinese-CLIP_for_PyTorch/cn_clip/deploy/tensorrt_utils.py | https://github.com/NVIDIA/TensorRT/issues/11 | 模型相关说明 | -| 开源代码引入 | https://github.com/OFA-Sys/Chinese-CLIP/blob/master/cn_clip/deploy/tensorrt_utils.py|Chinese-CLIP_for_PyTorch/cn_clip/deploy/tensorrt_utils.py | https://github.com/onnx/onnx-tensorrt/issues/8 | 模型相关说明 | -| 开源代码引入 | https://github.com/OFA-Sys/Chinese-CLIP/blob/master/cn_clip/deploy/tensorrt_utils.py|Chinese-CLIP_for_PyTorch/cn_clip/deploy/tensorrt_utils.py | https://docs.nvidia.com/deeplearning/tensorrt/developer-guide/index.html#opt_profiles_bindin | 模型相关说明 | -| 开源代码引入 | https://github.com/OFA-Sys/Chinese-CLIP/blob/master/cn_clip/clip/utils.py|Chinese-CLIP_for_PyTorch/cn_clip/clip/utils.py | https://github.com/openai/CL | 源码实现 | -| 开源代码引入 | https://github.com/OFA-Sys/Chinese-CLIP/blob/master/cn_clip/clip/utils.py|Chinese-CLIP_for_PyTorch/cn_clip/clip/utils.py | https://clip-cn-beijing.oss-cn-beijing.aliyuncs.com/checkpoints/clip_cn_vit-b-16. | 模型相关说明 | -| 开源代码引入 | https://github.com/OFA-Sys/Chinese-CLIP/blob/master/cn_clip/clip/utils.py|Chinese-CLIP_for_PyTorch/cn_clip/clip/utils.py | https://clip-cn-beijing.oss-cn-beijing.aliyuncs.com/checkpoints/clip_cn_vit-l-14. | 模型相关说明 | -| 开源代码引入 | https://github.com/OFA-Sys/Chinese-CLIP/blob/master/cn_clip/clip/utils.py|Chinese-CLIP_for_PyTorch/cn_clip/clip/utils.py | https://clip-cn-beijing.oss-cn-beijing.aliyuncs.com/checkpoints/clip_cn_vit-l-14-336. | 模型相关说明 | -| 开源代码引入 | https://github.com/OFA-Sys/Chinese-CLIP/blob/master/cn_clip/clip/utils.py|Chinese-CLIP_for_PyTorch/cn_clip/clip/utils.py | https://clip-cn-beijing.oss-cn-beijing.aliyuncs.com/checkpoints/clip_cn_vit-h-14. | 模型相关说明 | -| 开源代码引入 | https://github.com/OFA-Sys/Chinese-CLIP/blob/master/cn_clip/clip/utils.py|Chinese-CLIP_for_PyTorch/cn_clip/clip/utils.py | https://clip-cn-beijing.oss-cn-beijing.aliyuncs.com/checkpoints/clip_cn_rn50. 
| 模型相关说明 | -| 开源代码引入 | https://github.com/OFA-Sys/Chinese-CLIP/blob/master/cn_clip/clip/modeling_bert.py|Chinese-CLIP_for_PyTorch/cn_clip/clip/modeling_bert.py | https://arxiv.org/abs/1606.084 | 模型相关说明 | -| 开源代码引入 | https://github.com/OFA-Sys/Chinese-CLIP/blob/master/cn_clip/clip/modeling_bert.py|Chinese-CLIP_for_PyTorch/cn_clip/clip/modeling_bert.py | https://arxiv.org/abs/1606.084 | 模型相关说明 | -| 开源代码引入 | https://github.com/OFA-Sys/Chinese-CLIP/blob/master/cn_clip/clip/modeling_bert.py|Chinese-CLIP_for_PyTorch/cn_clip/clip/modeling_bert.py | https://github.com/pytorch/pytorch/pull/56 | 源码实现 | -| 开源代码引入 | https://github.com/OFA-Sys/Chinese-CLIP/blob/master/cn_clip/clip/model.py|Chinese-CLIP_for_PyTorch/cn_clip/clip/model.py | https://github.com/HazyResearch/flash-attention/issues/ | 模型相关说明 | -| 开源代码引入 | https://github.com/OFA-Sys/Chinese-CLIP/blob/master/cn_clip/clip/bert_tokenizer.py|Chinese-CLIP_for_PyTorch/cn_clip/clip/bert_tokenizer.py | https://en.wikipedia.org/wiki/CJK_Unified_Ideograph | 模型相关说明 | -| 开源代码引入 | https://github.com/OFA-Sys/Chinese-CLIP/blob/master/assets/Chinese_CLIP_logo_tp_path.svg|Chinese-CLIP_for_PyTorch/assets/Chinese_CLIP_logo_tp_path.svg | http://www.w3.org/2000/s | 模型相关说明 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|------------------------------------------------------------------------------------------------|-----------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/built-in/mm/Chinese-CLIP_for_PyTorch/cn_clip/clip/utils.py | https://clip-cn-beijing.oss-cn-beijing.aliyuncs.com/checkpoints/clip_cn_vit-l-14-336.pt | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/Chinese-CLIP_for_PyTorch/cn_clip/clip/utils.py | https://clip-cn-beijing.oss-cn-beijing.aliyuncs.com/checkpoints/clip_cn_vit-l-14.pt | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/Chinese-CLIP_for_PyTorch/cn_clip/clip/utils.py | https://clip-cn-beijing.oss-cn-beijing.aliyuncs.com/checkpoints/clip_cn_vit-h-14.pt | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/Chinese-CLIP_for_PyTorch/cn_clip/clip/utils.py | https://clip-cn-beijing.oss-cn-beijing.aliyuncs.com/checkpoints/clip_cn_vit-b-16.pt | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/Chinese-CLIP_for_PyTorch/cn_clip/clip/utils.py | https://clip-cn-beijing.oss-cn-beijing.aliyuncs.com/checkpoints/clip_cn_rn50.pt | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/Chinese-CLIP_for_PyTorch/cn_clip/deploy/tensorrt_utils.py | https://docs.nvidia.com/deeplearning/tensorrt/developer-guide/index.html | 相关说明 | \ No newline at end of file diff --git a/PyTorch/built-in/mm/CogVideo/public_address_statement.md b/PyTorch/built-in/mm/CogVideo/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..42369e086ba0cb0012b999ee844343a851d267fc --- /dev/null +++ b/PyTorch/built-in/mm/CogVideo/public_address_statement.md @@ -0,0 +1,3 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|------------------------------------------------------------------------------------------|---------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/built-in/mm/CogVideo/sat/sgm/modules/autoencoding/lpips/util.py | https://heibox.uni-heidelberg.de/f/607503859c864bc1b30b/?dl=1 | 论文地址 | \ No newline at end of file diff --git a/PyTorch/built-in/mm/DiT/public_address_statement.md b/PyTorch/built-in/mm/DiT/public_address_statement.md index 5c001c4a134e4cd67f4a5d16d07c1fa314e10724..b844a57c00dfd1d391c08bf416f4e6270960e149 100644 --- 
a/PyTorch/built-in/mm/DiT/public_address_statement.md +++ b/PyTorch/built-in/mm/DiT/public_address_statement.md @@ -1,13 +1,3 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ------------ | ------------------------------------------------------------ | ------------------------------- | ------------------------------------------------------------ | -------------- | -| 开源代码引入 | https://github.com/facebookresearch/DiT/blob/main/download.py | download.py | https://dl.fbaipublicfiles.com/DiT/models/ | 下载预训练权重 | -| 开源代码引入 | https://github.com/facebookresearch/DiT/blob/main/model.py | models.py | https://github.com/openai/glide-text2im/blob/main/glide_text2im/nn.py | 引用说明 | -| 开源代码引入 | https://github.com/facebookresearch/DiT/blob/main/model.py | models.py | https://github.com/openai/glide-text2im/blob/main/notebooks/text2im.ipynb | 引用说明 | -| 开源代码引入 | https://github.com/facebookresearch/DiT/blob/main/model.py | models.py | https://github.com/facebookresearch/mae/blob/main/util/pos_embed.py | 引用说明 | -| 开源代码引入 | https://github.com/facebookresearch/DiT/blob/main/sample_ddp.py | sample_ddp.py | https://github.com/openai/guided-diffusion/tree/main/evaluations | 引用说明 | -| 开源代码引入 | https://github.com/facebookresearch/DiT/blob/main/train.py | train.py | https://github.com/openai/guided-diffusion/blob/8fb3ad9197f16bbc40620447b2742e13458d2831/guided_diffusion/image_datasets.py | 引用说明 | -| 开源代码引入 | https://github.com/facebookresearch/DiT/blob/main/diffusion/gaussian_diffusion.py | diffusion\gaussian_diffusion.py | https://github.com/hojonathanho/diffusion/blob/1e0dceb3b3495bbe19116a5e1b3596cd0706c543/diffusion_tf/diffusion_utils_2.py | 引用说明 | -| 开源代码引入 | https://github.com/facebookresearch/DiT/blob/main/diffusion/gaussian_diffusion.py | diffusion\gaussian_diffusion.py | https://github.com/openai/glide-text2im/blob/main/glide_text2im/gaussian_diffusion.py | 引用说明 | -| 开源代码引入 | https://github.com/facebookresearch/DiT/blob/main/diffusion/gaussian_diffusion.py | diffusion\gaussian_diffusion.py | https://github.com/openai/guided-diffusion/blob/main/guided_diffusion | 引用说明 | -| 开源代码引入 | https://github.com/facebookresearch/DiT/blob/main/diffusion/gaussian_diffusion.py | diffusion\gaussian_diffusion.py | https://github.com/openai/improved-diffusion/blob/main/improved_diffusion/gaussian_diffusion.py | 引用说明 | -| | | | | | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|------------------------------------------------------|--------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/built-in/mm/DiT/download.py | https://dl.fbaipublicfiles.com/DiT/models/{model_name} | 权重地址 | \ No newline at end of file diff --git a/PyTorch/built-in/mm/HunyuanDiT/public_address_statement.md b/PyTorch/built-in/mm/HunyuanDiT/public_address_statement.md index 226cffb32fe8b808c20bcc91d1a82afb71a5b3c6..8dc6a1f09c053eed944c4e3885116cbee6fbe83d 100644 --- a/PyTorch/built-in/mm/HunyuanDiT/public_address_statement.md +++ b/PyTorch/built-in/mm/HunyuanDiT/public_address_statement.md @@ -1,26 +1,8 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ------- |------------------------------------------------------------|---------------------------------|------------------------|---------------------| -| 开源代码引入 | https://github.com/Tencent/HunyuanDiT/utils/collect_env.py | ./utils/collect_env.py | https://github.com/open-mmlab/mmengine/issues/931 | 注释参考链接 | -| 开源代码引入 | https://github.com/Tencent/HunyuanDiT/mllm/llava/utils.py | ./mllm/llava/utils.py | https://api.openai.com/v1/moderations 
| 文件交互链接 | -| 开源代码引入 | https://github.com/Tencent/HunyuanDiT/mllm/llava/model/builder.py | ./mllm/llava/model/builder.py | https://github.com/haotian-liu/LLaVA#launch-a-model-worker-lora-weights-unmerged | 模型权重公网下载说明 | -| 开源代码引入 | https://github.com/Tencent/HunyuanDiT/lite/inference.py | ./lite/inference.py | https://huggingface.co/Tencent-Hunyuan | huggingface官仓说明链接 | -| 开源代码引入 | https://github.com/Tencent/HunyuanDiT/hydit/modules/posemb_layers.py | hydit/modules/posemb_layers.py | https://github.com/facebookresearch/llama/blob/main/llama/model.py | 注释参考链接 | -| 开源代码引入 | https://github.com/Tencent/HunyuanDiT/hydit/modules/posemb_layers.py | hydit/modules/posemb_layers.py | https://github.com/facebookresearch/mae/blob/main/util/pos_embed.py | 注释参考链接 | -| 开源代码引入 | https://github.com/Tencent/HunyuanDiT/hydit/modules/embedders.py | hydit/modules/embedders.py | https://github.com/openai/glide-text2im/blob/main/glide_text2im/nn.py | 注释参考链接 | -| 开源代码引入 | https://github.com/Tencent/HunyuanDiT/hydit/modules/embedders.py | hydit/modules/embedders.py | https://github.com/google-research/vision_transformer | 注释参考链接 | -| 开源代码引入 | https://github.com/Tencent/HunyuanDiT/hydit/lr_scheduler.py | hydit/lr_scheduler.py | https://arxiv.org/abs/1803.09820 | 参考论文链接 | -| 开源代码引入 | https://github.com/Tencent/HunyuanDiT/hydit/diffusion/pipeline_controlnet.py | hydit/diffusion/pipeline_controlnet.py | https://arxiv.org/pdf/2305.08891.pdf | 参考论文链接 | -| 开源代码引入 | https://github.com/Tencent/HunyuanDiT/hydit/diffusion/pipeline_controlnet.py | hydit/diffusion/pipeline_controlnet.py | https://arxiv.org/abs/2010.02502 | 参考论文链接 | -| 开源代码引入 | https://github.com/Tencent/HunyuanDiT/hydit/diffusion/pipeline_controlnet.py | hydit/diffusion/pipeline_controlnet.py | https://arxiv.org/pdf/2205.11487.pdf | 参考论文链接 | -| 开源代码引入 | https://github.com/Tencent/HunyuanDiT/hydit/diffusion/pipeline_controlnet.py | hydit/diffusion/pipeline_controlnet.py | https://huggingface.co/openai/clip-vit-large-patch14 | 注释参考链接 | -| 开源代码引入 | https://github.com/Tencent/HunyuanDiT/hydit/diffusion/pipeline_controlnet.py | hydit/diffusion/pipeline_controlnet.py | https://huggingface.co/runwayml/stable-diffusion-v1-5 | 注释参考链接 | -| 开源代码引入 | https://github.com/Tencent/HunyuanDiT/hydit/diffusion/pipeline_controlnet.py | hydit/diffusion/pipeline_controlnet.py | https://github.com/huggingface/diffusers/pull/254 | 注释参考链接 | -| 开源代码引入 | https://github.com/Tencent/HunyuanDiT/hydit/diffusion/pipeline_controlnet.py | hydit/diffusion/pipeline_controlnet.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 注释参考链接 | -| 开源代码引入 | https://github.com/Tencent/HunyuanDiT/hydit/diffusion/pipeline_controlnet.py | hydit/diffusion/pipeline_controlnet.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py | 注释参考链接 | -| 开源代码引入 | https://github.com/Tencent/HunyuanDiT/comfyui-hydit/hydit/diffusion/gaussian_diffusion.py | comfyui-hydit/hydit/diffusion/gaussian_diffusion.py | https://github.com/hojonathanho/diffusion/blob/1e0dceb3b3495bbe19116a5e1b3596cd0706c543/diffusion_tf/diffusion_utils_2.py | 注释参考链接 | -| 开源代码引入 | https://github.com/Tencent/HunyuanDiT/comfyui-hydit/hydit/diffusion/gaussian_diffusion.py | comfyui-hydit/hydit/diffusion/gaussian_diffusion.py | https://www.crosslabs.org/blog/diffusion-with-offset-noise | 注释参考链接 | -| 开源代码引入 | https://github.com/Tencent/HunyuanDiT/comfyui-hydit/hydit/diffusion/gaussian_diffusion.py | comfyui-hydit/hydit/diffusion/gaussian_diffusion.py | https://openreview.net/forum?id=PlKWVd2yBkY | 
注释参考链接 | -| 开源代码引入 | https://github.com/Tencent/HunyuanDiT/comfyui-hydit/hydit/annotator/glyph.py | comfyui-hydit/hydit/annotator/glyph.py | https://github.com/AIGText/GlyphControl-release | 注释参考链接 | -| 开源代码引入 | https://github.com/Tencent/HunyuanDiT/comfyui-hydit/hydit/annotator/dwpose/util.py | comfyui-hydit/hydit/annotator/dwpose/util.py | https://github.com/CMU-Perceptual-Computing-Lab/openpose/blob/master/src/openpose/hand/handDetector.cpp | 注释参考链接 | -| 开源代码引入 | https://github.com/Tencent/HunyuanDiT/comfyui-hydit/hydit/annotator/dwpose/__init__.py | comfyui-hydit/hydit/annotator/dwpose/__init__.py | https://github.com/CMU-Perceptual-Computing-Lab/openpose | 注释参考链接 | -| 开源代码引入 | https://github.com/Tencent/HunyuanDiT/comfyui-hydit/hydit/annotator/dwpose/__init__.py | comfyui-hydit/hydit/annotator/dwpose/__init__.py | https://github.com/Hzzone/pytorch-openpose | 注释参考链接 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|---------------------------------------------------------------------------------------|------------------------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/built-in/mm/HunyuanDiT/comfyui-hydit/hydit/config.py | http://arxiv.org/abs/2302.05442 for details | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/HunyuanDiT/comfyui-hydit/hydit/config_comfyui.py | http://arxiv.org/abs/2302.05442 for details | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/HunyuanDiT/comfyui-hydit/utils.py | https://raw.githubusercontent.com/CompVis/stable-diffusion/main/configs/stable-diffusion/v1-inference.yaml | 配置文件 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/HunyuanDiT/hydit/config.py | http://arxiv.org/abs/2302.05442 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/HunyuanDiT/IndexKits/setup.py | jarvizhang@tencent.com | 作者邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/HunyuanDiT/mllm/llava/utils.py | https://api.openai.com/v1/moderations | 模型相关说明 | \ No newline at end of file diff --git a/PyTorch/built-in/mm/InternVL1.5/public_address_statement.md b/PyTorch/built-in/mm/InternVL1.5/public_address_statement.md index b20ac0166487444088e0938f282789f0eab99ca9..43ef88d25777964b45cc01e5df1d462bc2fbd8a1 100644 --- a/PyTorch/built-in/mm/InternVL1.5/public_address_statement.md +++ b/PyTorch/built-in/mm/InternVL1.5/public_address_statement.md @@ -1,40 +1,6 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ------- |------------------------------------------------------------|---------------------------------|------------------------|---------------------| -| 开源代码引入 | https://github.com/OpenGVLab/InternVL/blob/main/internvl_chat/eval/tiny_lvlm/tools.py | internvl_chat/eval/tiny_lvlm/tools.py | https://github.com/OpenGVLab/Multi-Modality-Arena/blob/main/tiny_lvlm_evaluation/utils/tools.py | 注释参考链接 | -| 开源代码引入 | https://github.com/OpenGVLab/InternVL/blob/main/internvl_chat/eval/vqa/evaluate_vqa.py | internvl_chat/eval/vqa/evaluate_vqa.py | https://github.com/google-research/pix2struct/blob/main/pix2struct/metrics.py#L81 | 注释参考链接 | -| 开源代码引入 | https://github.com/OpenGVLab/InternVL/blob/main/internvl_chat/eval/vqa/evaluate_vqa.py | internvl_chat/eval/vqa/evaluate_vqa.py | https://arxiv.org/pdf/2203.10244.pdf | 注释参考链接 | -| 开源代码引入 | https://github.com/OpenGVLab/InternVL/blob/main/internvl_chat/eval/vqa/infographicsvqa_eval.py | internvl_chat/eval/vqa/infographicsvqa_eval.py | https://www.docvqa.org/datasets/infographicvqa | 注释参考链接 | -| 开源代码引入 | 
https://github.com/OpenGVLab/InternVL/blob/main/internvl_chat/eval/vqa/infographicsvqa_eval.py | internvl_chat/eval/vqa/infographicsvqa_eval.py | https://rrc.cvc.uab.es/?ch=17&com=introduction | 注释参考链接 | -| 开源代码引入 | https://github.com/OpenGVLab/InternVL/blob/main/internvl_chat/eval/vqa/textvqa_eval.py | internvl_chat/eval/vqa/textvqa_eval.py | https://github.com/haotian-liu/LLaVA/blob/main/llava/eval/m4c_evaluator.py | 注释参考链接 | -| 开源代码引入 | https://github.com/OpenGVLab/InternVL/blob/main/internvl_chat/eval/vqa/textvqa_eval.py | internvl_chat/eval/vqa/textvqa_eval.py | https://github.com/facebookresearch/mmf/blob/c46b3b3391275b4181567db80943473a89ab98ab/pythia/tasks/processors.py#L897 | 注释参考链接 | -| 开源代码引入 | https://github.com/OpenGVLab/InternVL/blob/main/internvl_chat/eval/vqa/textvqa_eval.py | internvl_chat/eval/vqa/textvqa_eval.py | https://github.com/ronghanghu/coco-caption.git@python23 | 注释参考链接 | -| 开源代码引入 | https://github.com/OpenGVLab/InternVL/blob/main/internvl_chat/eval/vqa/textvqa_eval.py | internvl_chat/eval/vqa/textvqa_eval.py | https://github.com/tylin/coco-caption | 注释参考链接 | -| 开源代码引入 | https://github.com/OpenGVLab/InternVL/blob/main/internvl_chat/eval/vqa/textvqa_eval.py | internvl_chat/eval/vqa/textvqa_eval.py | https://github.com/ronghanghu/coco-caption.git@python23 | 注释参考链接 | -| 开源代码引入 | https://github.com/OpenGVLab/InternVL/blob/main/internvl_chat/internvl/model/internlm2/configuration_internlm2.py | internvl_chat/internvl/model/internlm2/configuration_internlm2.py | https://arxiv.org/pdf/2305.13245.pdf | 注释参考链接 | -| 开源代码引入 | https://github.com/OpenGVLab/InternVL/blob/main/internvl_chat/internvl/model/internlm2/modeling_internlm2.py | internvl_chat/internvl/model/internlm2/modeling_internlm2.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 注释参考链接 | -| 开源代码引入 | https://github.com/OpenGVLab/InternVL/blob/main/internvl_chat/internvl/model/internlm2/modeling_internlm2.py | internvl_chat/internvl/model/internlm2/modeling_internlm2.py | https://arxiv.org/abs/1910.13461 | 注释参考链接 | -| 开源代码引入 | https://github.com/OpenGVLab/InternVL/blob/main/internvl_chat/internvl/model/internvl_chat/flash_attention.py | internvl_chat/internvl/model/internvl_chat/flash_attention.py | https://github.com/Dao-AILab/flash-attention/blob/v0.2.8/flash_attn/flash_attention.py | 注释参考链接 | -| 开源代码引入 | https://github.com/OpenGVLab/InternVL/blob/main/internvl_chat/internvl/model/phi3/configuration_phi3.py | internvl_chat/internvl/model/phi3/configuration_phi3.py | https://huggingface.co/microsoft/Phi-3-mini-4k-instruct/resolve/main/config.json | 注释参考链接 | -| 开源代码引入 | https://github.com/OpenGVLab/InternVL/blob/main/internvl_chat/internvl/model/phi3/configuration_phi3.py | internvl_chat/internvl/model/phi3/configuration_phi3.py | https://huggingface.co/microsoft/Phi-3-mini-128k-instruct/resolve/main/config.json | 注释参考链接 | -| 开源代码引入 | https://github.com/OpenGVLab/InternVL/blob/main/internvl_chat/internvl/model/phi3/configuration_phi3.py | internvl_chat/internvl/model/phi3/configuration_phi3.py | https://huggingface.co/microsoft/Phi-3-mini-4k-instruct | 注释参考链接 | -| 开源代码引入 | https://github.com/OpenGVLab/InternVL/blob/main/internvl_chat/internvl/model/phi3/configuration_phi3.py | internvl_chat/internvl/model/phi3/configuration_phi3.py | https://arxiv.org/pdf/2305.13245.pdf | 注释参考链接 | -| 开源代码引入 | https://github.com/OpenGVLab/InternVL/blob/main/internvl_chat/internvl/model/phi3/modeling_phi3.py | internvl_chat/internvl/model/phi3/modeling_phi3.py | https://huggingface.co/models?filter=Phi-3 | 注释参考链接 | -| 
开源代码引入 | https://github.com/OpenGVLab/InternVL/blob/main/internvl_chat/internvl/model/phi3/modeling_phi3.py | internvl_chat/internvl/model/phi3/modeling_phi3.py | https://github.com/huggingface/transformers/pull/29285 | 注释参考链接 | -| 开源代码引入 | https://github.com/OpenGVLab/InternVL/blob/main/internvl_chat/internvl/model/phi3/modeling_phi3.py | internvl_chat/internvl/model/phi3/modeling_phi3.py | https://github.com/huggingface/transformers/pull/29285 | 注释参考链接 | -| 开源代码引入 | https://github.com/OpenGVLab/InternVL/blob/main/internvl_chat/internvl/model/phi3/modeling_phi3.py | internvl_chat/internvl/model/phi3/modeling_phi3.py | https://github.com/huggingface/transformers/pull/29285 | 注释参考链接 | -| 开源代码引入 | https://github.com/OpenGVLab/InternVL/blob/main/internvl_chat/internvl/model/phi3/modeling_phi3.py | internvl_chat/internvl/model/phi3/modeling_phi3.py | https://github.com/Dao-AILab/flash-attention/releases/tag/v2.1.0 | 注释参考链接 | -| 开源代码引入 | https://github.com/OpenGVLab/InternVL/blob/main/internvl_chat/internvl/model/phi3/modeling_phi3.py | internvl_chat/internvl/model/phi3/modeling_phi3.py | https://github.com/pytorch/pytorch/issues/112577 | 注释参考链接 | -| 开源代码引入 | https://github.com/OpenGVLab/InternVL/blob/main/internvl_chat/internvl/model/phi3/modeling_phi3.py | internvl_chat/internvl/model/phi3/modeling_phi3.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 注释参考链接 | -| 开源代码引入 | https://github.com/OpenGVLab/InternVL/blob/main/internvl_chat/internvl/model/phi3/modeling_phi3.py | internvl_chat/internvl/model/phi3/modeling_phi3.py | https://arxiv.org/abs/1910.13461 | 注释参考链接 | -| 开源代码引入 | https://github.com/OpenGVLab/InternVL/blob/main/internvl_chat/internvl/patch/llama_flash_attn_monkey_patch.py | internvl_chat/internvl/patch/llama_flash_attn_monkey_patch.py | https://github.com/HazyResearch/flash-attention/blob/main/flash_attn/flash_attention.py | 注释参考链接 | -| 开源代码引入 | https://github.com/OpenGVLab/InternVL/blob/main/internvl_chat/internvl/patch/llama_flash_attn_monkey_patch.py | internvl_chat/internvl/patch/llama2_flash_attn_monkey_patch.py | https://github.com/lm-sys/FastChat | 注释参考链接 | -| 开源代码引入 | https://github.com/OpenGVLab/InternVL/blob/main/internvl_chat/internvl/patch/llama_flash_attn_monkey_patch.py | internvl_chat/internvl/patch/llama2_flash_attn_monkey_patch.py | https://github.com/HazyResearch/flash-attention/issues/190#issuecomment-1523359593 | 注释参考链接 | -| 开源代码引入 | https://github.com/OpenGVLab/InternVL/blob/main/internvl_chat/internvl/patch/train_sampler_patch.py | internvl_chat/internvl/patch/train_sampler_patch.py | https://github.com/haotian-liu/LLaVA/blob/main/llava/train/llava_trainer.py#L38 | 注释参考链接 | -| 开源代码引入 | https://github.com/OpenGVLab/InternVL/blob/main/internvl_chat/internvl/patch/train_sampler_patch.py | internvl_chat/internvl/patch/train_sampler_patch.py | https://github.com/haotian-liu/LLaVA/blob/main/llava/train/llava_trainer.py#L88 | 注释参考链接 | -| 开源代码引入 | https://github.com/OpenGVLab/InternVL/blob/main/internvl_chat/internvl/patch/train_sampler_patch.py | internvl_chat/internvl/patch/train_sampler_patch.py | https://github.com/haotian-liu/LLaVA/blob/main/llava/train/llava_trainer.py#L99 | 注释参考链接 | -| 开源代码引入 | https://github.com/OpenGVLab/InternVL/blob/main/internvl_chat/internvl/conversation.py | internvl_chat/internvl/conversation.py | https://huggingface.co/THUDM/chatglm-6b/blob/1d240ba371910e9282298d4592532d7f0f3e9f3e/modeling_chatglm.py#L1302-L1308 | 注释参考链接 | -| 开源代码引入 | 
https://github.com/OpenGVLab/InternVL/blob/main/internvl_chat/internvl/conversation.py | internvl_chat/internvl/conversation.py | https://huggingface.co/THUDM/chatglm2-6b/blob/e186c891cf64310ac66ef10a87e6635fa6c2a579/modeling_chatglm.py#L926 | 注释参考链接 | -| 开源代码引入 | https://github.com/OpenGVLab/InternVL/blob/main/internvl_chat/internvl/conversation.py | internvl_chat/internvl/conversation.py | https://huggingface.co/internlm/internlm-chat-7b-8k/blob/bd546fa984b4b0b86958f56bf37f94aa75ab8831/modeling_internlm.py#L771 | 注释参考链接 | -| 开源代码引入 | https://github.com/OpenGVLab/InternVL/blob/main/internvl_chat/internvl/dist_utils.py | internvl_chat/internvl/dist_utils.py | ://github.com/facebookresearch/detectron2/blob/main/detectron2/engine/launch.py | 注释参考链接 | -| 开源代码引入 | https://github.com/OpenGVLab/InternVL/blob/main/internvl_chat/pyproject.toml | internvl_chat/pyproject.toml | https://github.com/OpenGVLab/InternVL | 注释参考链接 | -| 开源代码引入 | https://github.com/OpenGVLab/InternVL/blob/main/internvl_chat/pyproject.toml | internvl_chat/pyproject.toml | https://github.com/OpenGVLab/InternVL/issues | 注释参考链接 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|---------------------------------------------------------------------------------------------------------------|---------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/built-in/mm/InternVL1.5/internvl_chat/internvl/model/internlm2/modeling_internlm2.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/InternVL1.5/internvl_chat/internvl/model/internlm2/modeling_internlm2.py | https://arxiv.org/abs/1910.13461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/InternVL1.5/internvl_chat/internvl/model/phi3/modeling_phi3.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/InternVL1.5/internvl_chat/internvl/model/phi3/modeling_phi3.py | https://arxiv.org/abs/1910.13461 | 论文地址 | \ No newline at end of file diff --git a/PyTorch/built-in/mm/LAVIS/public_address_statement.md b/PyTorch/built-in/mm/LAVIS/public_address_statement.md index bcd03fd4a9255778f21ef820a293b21267c7ae60..944e2abab479c4aa7b546001b76bed46a68a7fda 100644 --- a/PyTorch/built-in/mm/LAVIS/public_address_statement.md +++ b/PyTorch/built-in/mm/LAVIS/public_address_statement.md @@ -1,326 +1,280 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ---- | ------------ | ------ | ------------------------------------ | -------- | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/.gitignore | LAVIS/.gitignore | https://python-poetry.org/docs/basic-usage/#commit-your-poetrylock-file-to-version-control | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/setup.py | LAVIS/setup.py | https://download.pytorch.org/whl/torch_stable.html | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/ckpts/download.sh | LAVIS/lavis/common/annotator/ckpts/download.sh | https://huggingface.co/lllyasviel/ControlNet/resolve/main/annotator/ckpts/dpt_hybrid-midas-501f0c75.pt | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/ckpts/download.sh | LAVIS/lavis/common/annotator/ckpts/download.sh | https://huggingface.co/lllyasviel/ControlNet/resolve/main/annotator/ckpts/network-bsds500.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/ckpts/download.sh | LAVIS/lavis/common/annotator/hed/__init__.py | 
https://huggingface.co/lllyasviel/ControlNet/resolve/main/annotator/ckpts/network-bsds500.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/midas/api.py | LAVIS/lavis/common/annotator/midas/api.py | https://github.com/isl-org/MiDaS | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/ckpts/download.sh | LAVIS/lavis/common/annotator/midas/api.py | https://huggingface.co/lllyasviel/ControlNet/resolve/main/annotator/ckpts/dpt_hybrid-midas-501f0c75.pt | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/midas/api.py | LAVIS/lavis/common/annotator/midas/api.py | https://github.com/isl-org/MiDaS/blob/master/run.py | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/midas/api.py | LAVIS/lavis/common/annotator/midas/api.py | https://github.com/isl-org/MiDaS/blob/master/run.py | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/mlsd/__init__.py | LAVIS/lavis/common/annotator/mlsd/__init__.py | https://huggingface.co/lllyasviel/ControlNet/resolve/main/annotator/ckpts/mlsd_large_512_fp32.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/openpose/__init__.py | LAVIS/lavis/common/annotator/openpose/__init__.py | https://huggingface.co/lllyasviel/ControlNet/resolve/main/annotator/ckpts/body_pose_model.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/openpose/__init__.py | LAVIS/lavis/common/annotator/openpose/__init__.py | https://huggingface.co/lllyasviel/ControlNet/resolve/main/annotator/ckpts/hand_pose_model.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/openpose/util.py | LAVIS/lavis/common/annotator/openpose/util.py | https://github.com/CMU-Perceptual-Computing-Lab/openpose/blob/master/src/openpose/hand/handDetector.cpp | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/__init__.py | LAVIS/lavis/common/annotator/uniformer/__init__.py | https://huggingface.co/lllyasviel/ControlNet/resolve/main/annotator/ckpts/upernet_global_small.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/configs/datasets/aokvqa/defaults.yaml | LAVIS/lavis/configs/datasets/aokvqa/defaults.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/aokvqa/aokvqa_v1p0_train.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/configs/datasets/gqa/defaults.yaml | LAVIS/lavis/configs/datasets/aokvqa/defaults.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/aokvqa/aokvqa_v1p0_val.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/configs/datasets/avsd/defaults_dial.yaml | LAVIS/lavis/configs/datasets/avsd/defaults_dial.yaml | https://storage.googleapis.com/sfr-vision-language-research/datasets/avsd_dstc7_train.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/configs/datasets/aokvqa/defaults.yaml | LAVIS/lavis/configs/datasets/aokvqa/defaults.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/aokvqa/specialized_vocab_train.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/configs/datasets/avsd/defaults_dial.yaml | LAVIS/lavis/configs/datasets/avsd/defaults_dial.yaml | 
https://storage.googleapis.com/sfr-vision-language-research/datasets/avsd_dstc7_val.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/configs/datasets/avsd/defaults_dial.yaml | LAVIS/lavis/configs/datasets/avsd/defaults_dial.yaml | https://storage.googleapis.com/sfr-vision-language-research/datasets/avsd_dstc7_test.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/configs/datasets/gqa/defaults.yaml | LAVIS/lavis/configs/datasets/aokvqa/defaults.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/aokvqa/aokvqa_v1p0_test.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/configs/datasets/aokvqa/defaults.yaml | LAVIS/lavis/configs/datasets/aokvqa/defaults.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/aokvqa/specialized_vocab_train.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/configs/datasets/coco/defaults_cap.yaml | LAVIS/lavis/configs/datasets/coco/defaults_cap.yaml | https://storage.googleapis.com/sfr-vision-language-research/datasets/coco_karpathy_train.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/configs/datasets/coco/defaults_cap.yaml | LAVIS/lavis/configs/datasets/coco/defaults_cap.yaml | https://storage.googleapis.com/sfr-vision-language-research/datasets/coco_karpathy_val.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/configs/datasets/coco/defaults_cap.yaml | LAVIS/lavis/configs/datasets/coco/defaults_ret.yaml | https://storage.googleapis.com/sfr-vision-language-research/datasets/coco_karpathy_train.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/configs/datasets/coco/defaults_cap.yaml | LAVIS/lavis/configs/datasets/coco/defaults_cap.yaml | https://storage.googleapis.com/sfr-vision-language-research/datasets/coco_karpathy_test.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/configs/datasets/coco/defaults_cap.yaml | LAVIS/lavis/configs/datasets/coco/defaults_ret.yaml | https://storage.googleapis.com/sfr-vision-language-research/datasets/coco_karpathy_val.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/configs/datasets/coco/defaults_vqa.yaml | LAVIS/lavis/configs/datasets/coco/defaults_vqa.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/vqav2/vqa_train.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/configs/datasets/coco/defaults_vqa.yaml | LAVIS/lavis/configs/datasets/coco/defaults_vqa.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/vqav2/vqa_val.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/configs/datasets/coco/defaults_cap.yaml | LAVIS/lavis/configs/datasets/coco/defaults_ret.yaml | https://storage.googleapis.com/sfr-vision-language-research/datasets/coco_karpathy_test.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/configs/datasets/coco/defaults_vqa.yaml | LAVIS/lavis/configs/datasets/coco/defaults_vqa.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/vqav2/vqa_val_eval.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/midas/midas/midas_net_custom.py | LAVIS/lavis/common/annotator/midas/midas/midas_net.py | 
https://github.com/thomasjpfan/pytorch_refinenet/blob/master/pytorch_refinenet/refinenet/refinenet_4cascade.py | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/midas/midas/midas_net_custom.py | LAVIS/lavis/common/annotator/midas/midas/midas_net_custom.py | https://github.com/thomasjpfan/pytorch_refinenet/blob/master/pytorch_refinenet/refinenet/refinenet_4cascade.py | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmseg/models/utils/make_divisible.py | LAVIS/lavis/common/annotator/mlsd/models/mbv2_mlsd_large.py | https://github.com/tensorflow/models/blob/master/research/slim/nets/mobilenet/mobilenet.py | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmseg/models/utils/make_divisible.py | LAVIS/lavis/common/annotator/mlsd/models/mbv2_mlsd_tiny.py | https://github.com/tensorflow/models/blob/master/research/slim/nets/mobilenet/mobilenet.py | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/mlsd/models/mbv2_mlsd_tiny.py | LAVIS/lavis/common/annotator/mlsd/models/mbv2_mlsd_large.py | https://download.pytorch.org/models/mobilenet_v2-b0353104.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/mlsd/models/mbv2_mlsd_tiny.py | LAVIS/lavis/common/annotator/mlsd/models/mbv2_mlsd_tiny.py | https://download.pytorch.org/models/mobilenet_v2-b0353104.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/engine/test.py | LAVIS/lavis/common/annotator/uniformer/mmcv/engine/test.py | https://github.com/open-mmlab/mmcv/issues/985 | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/image/colorspace.py | LAVIS/lavis/common/annotator/uniformer/mmcv/image/colorspace.py | https://en.wikipedia.org/wiki/YCbCr#ITU-R_BT.601_conversion | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/image/colorspace.py | LAVIS/lavis/common/annotator/uniformer/mmcv/image/colorspace.py | https://en.wikipedia.org/wiki/YCbCr#JPEG_conversion | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/image/colorspace.py | LAVIS/lavis/common/annotator/uniformer/mmcv/image/colorspace.py | https://en.wikipedia.org/wiki/YCbCr#ITU-R_BT.601_conversion | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/image/colorspace.py | LAVIS/lavis/common/annotator/uniformer/mmcv/image/colorspace.py | https://en.wikipedia.org/wiki/YCbCr#JPEG_conversion | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/image/io.py | LAVIS/lavis/common/annotator/uniformer/mmcv/image/io.py | https://github.com/lilohuang/PyTurboJPEG | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/image/colorspace.py | LAVIS/lavis/common/annotator/uniformer/mmcv/image/colorspace.py | https://en.wikipedia.org/wiki/YCbCr#ITU-R_BT.601_conversion | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/image/colorspace.py | LAVIS/lavis/common/annotator/uniformer/mmcv/image/colorspace.py | https://en.wikipedia.org/wiki/YCbCr#JPEG_conversion | 模型相关说明 | -| 开源代码引入 | 
https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/image/colorspace.py | LAVIS/lavis/common/annotator/uniformer/mmcv/image/colorspace.py | https://en.wikipedia.org/wiki/YCbCr#ITU-R_BT.601_conversion | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/image/colorspace.py | LAVIS/lavis/common/annotator/uniformer/mmcv/image/colorspace.py | https://en.wikipedia.org/wiki/YCbCr#JPEG_conversion | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/image/photometric.py | LAVIS/lavis/common/annotator/uniformer/mmcv/image/photometric.py | https://dl.acm.org/doi/pdf/10.1145/3065386 | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/ops/assign_score_withk.py | LAVIS/lavis/common/annotator/uniformer/mmcv/ops/assign_score_withk.py | https://github.com/CVMI-Lab/PAConv/tree/main/ | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/ops/assign_score_withk.py | LAVIS/lavis/common/annotator/uniformer/mmcv/ops/assign_score_withk.py | https://arxiv.org/pdf/2103.14635.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/ops/assign_score_withk.py | LAVIS/lavis/common/annotator/uniformer/mmcv/ops/assign_score_withk.py | https://github.com/CVMI-Lab/PAConv/blob/main/scene_seg/model/ | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/ops/border_align.py | LAVIS/lavis/common/annotator/uniformer/mmcv/ops/border_align.py | https://github.com/Megvii-BaseDetection/cvpods/blob/master/cvpods/layers/border_align.py | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/ops/border_align.py | LAVIS/lavis/common/annotator/uniformer/mmcv/ops/border_align.py | https://arxiv.org/abs/2007.11056 | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/ops/cc_attention.py | LAVIS/lavis/common/annotator/uniformer/mmcv/ops/cc_attention.py | https://github.com/open-mmlab/mmcv/pull/1201 | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/ops/carafe.py | LAVIS/lavis/common/annotator/uniformer/mmcv/ops/carafe.py | https://arxiv.org/abs/1905.02188 | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/ops/corner_pool.py | LAVIS/lavis/common/annotator/uniformer/mmcv/ops/corner_pool.py | https://arxiv.org/abs/1808.01244 | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/ops/corner_pool.py | LAVIS/lavis/common/annotator/uniformer/mmcv/ops/corner_pool.py | https://github.com/princeton-vl/CornerNet-Lite | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/ops/carafe.py | LAVIS/lavis/common/annotator/uniformer/mmcv/ops/carafe.py | https://arxiv.org/abs/1905.02188 | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/ops/deform_conv.py | LAVIS/lavis/common/annotator/uniformer/mmcv/ops/deform_conv.py | https://arxiv.org/pdf/1703.06211.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/ops/deform_conv.py | LAVIS/lavis/common/annotator/uniformer/mmcv/ops/deform_conv.py | 
https://github.com/open-mmlab/mmcv/issues/1440 | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/ops/fused_bias_leakyrelu.py | LAVIS/lavis/common/annotator/uniformer/mmcv/ops/fused_bias_leakyrelu.py | https://github.com/rosinality/stylegan2-pytorch/blob/master/op/fused_act.py | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/ops/fused_bias_leakyrelu.py | LAVIS/lavis/common/annotator/uniformer/mmcv/ops/fused_bias_leakyrelu.py | http://arxiv.org/abs/1912.04958 | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/ops/fused_bias_leakyrelu.py | LAVIS/lavis/common/annotator/uniformer/mmcv/ops/fused_bias_leakyrelu.py | http://arxiv.org/abs/1912.04958 | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/ops/assign_score_withk.py | LAVIS/lavis/common/annotator/uniformer/mmcv/ops/knn.py | https://github.com/CVMI-Lab/PAConv/tree/main/ | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/ops/nms.py | LAVIS/lavis/common/annotator/uniformer/mmcv/ops/nms.py | https://github.com/pytorch/vision/ | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/ops/multi_scale_deform_attn.py | LAVIS/lavis/common/annotator/uniformer/mmcv/ops/multi_scale_deform_attn.py | https://arxiv.org/pdf/2010.04159.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/ops/points_in_boxes.py | LAVIS/lavis/common/annotator/uniformer/mmcv/ops/points_in_boxes.py | https://github.com/open-mmlab/mmdetection3d/issues/305 | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/ops/point_sample.py | LAVIS/lavis/common/annotator/uniformer/mmcv/ops/point_sample.py | https://github.com/facebookresearch/detectron2/tree/master/projects/PointRend | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/ops/nms.py | LAVIS/lavis/common/annotator/uniformer/mmcv/ops/nms.py | https://github.com/pytorch/vision/blob | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/ops/psa_mask.py | LAVIS/lavis/common/annotator/uniformer/mmcv/ops/psa_mask.py | https://github.com/hszhao/semseg/blob/master/lib/psa | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/ops/roiaware_pool3d.py | LAVIS/lavis/common/annotator/uniformer/mmcv/ops/roiaware_pool3d.py | https://arxiv.org/pdf/1907.03670.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/ops/roiaware_pool3d.py | LAVIS/lavis/common/annotator/uniformer/mmcv/ops/roipoint_pool3d.py | https://arxiv.org/pdf/1907.03670.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/ops/saconv.py | LAVIS/lavis/common/annotator/uniformer/mmcv/ops/saconv.py | https://arxiv.org/pdf/2006.02334.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/ops/roi_align_rotated.py | LAVIS/lavis/common/annotator/uniformer/mmcv/ops/roi_align.py | https://github.com/facebookresearch/detectron2/ | 源码实现 | -| 开源代码引入 | 
https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/ops/roi_align_rotated.py | LAVIS/lavis/common/annotator/uniformer/mmcv/ops/roi_align_rotated.py | https://github.com/facebookresearch/detectron2/ | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/ops/three_interpolate.py | LAVIS/lavis/common/annotator/uniformer/mmcv/ops/three_interpolate.py | https://arxiv.org/abs/1706.02413 | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/ops/three_interpolate.py | LAVIS/lavis/common/annotator/uniformer/mmcv/ops/three_nn.py | https://arxiv.org/abs/1706.02413 | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/ops/tin_shift.py | LAVIS/lavis/common/annotator/uniformer/mmcv/ops/tin_shift.py | https://github.com/deepcs233/TIN/blob/master/cuda_shift/rtc_wrap.py | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/ops/tin_shift.py | LAVIS/lavis/common/annotator/uniformer/mmcv/ops/tin_shift.py | shaoh19@mails.tsinghua.edu.cn | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/ops/tin_shift.py | LAVIS/lavis/common/annotator/uniformer/mmcv/ops/tin_shift.py | sjqian@cse.cuhk.edu.hk | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/ops/tin_shift.py | LAVIS/lavis/common/annotator/uniformer/mmcv/ops/tin_shift.py | yuliu@ee.cuhk.edu.hk | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/ops/upfirdn2d.py | LAVIS/lavis/common/annotator/uniformer/mmcv/ops/upfirdn2d.py | https://github.com/rosinality/stylegan2-pytorch/blob/master/op/upfirdn2d.py | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/ops/tin_shift.py | LAVIS/lavis/common/annotator/uniformer/mmcv/ops/tin_shift.py | https://arxiv.org/abs/2001.06499 | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/ops/tin_shift.py | LAVIS/lavis/common/annotator/uniformer/mmcv/ops/tin_shift.py | https://github.com/mit-han-lab/temporal-shift-module | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/ops/voxelize.py | LAVIS/lavis/common/annotator/uniformer/mmcv/ops/voxelize.py | https://arxiv.org/abs/1907.03739 | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/ops/upfirdn2d.py | LAVIS/lavis/common/annotator/uniformer/mmcv/ops/upfirdn2d.py | https://www.mathworks.com/help/signal/ref/upfirdn.html | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/runner/iter_based_runner.py | LAVIS/lavis/common/annotator/uniformer/mmcv/runner/epoch_based_runner.py | https://github.com/open-mmlab/mmcv/pull/1108 | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/runner/fp16_utils.py | LAVIS/lavis/common/annotator/uniformer/mmcv/runner/fp16_utils.py | https://github.com/NVIDIA/apex/blob/master/apex/fp16_utils/loss_scaler.py | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/runner/iter_based_runner.py | LAVIS/lavis/common/annotator/uniformer/mmcv/runner/iter_based_runner.py | 
https://github.com/open-mmlab/mmcv/pull/1108 | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/utils/registry.py | LAVIS/lavis/common/annotator/uniformer/mmcv/utils/registry.py | https://mmcv.readthedocs.io/en/latest/understand_mmcv/registry.html | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/utils/trace.py | LAVIS/lavis/common/annotator/uniformer/mmcv/utils/trace.py | https://github.com/pytorch/pytorch/issues/42448 | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/video/optflow.py | LAVIS/lavis/common/annotator/uniformer/mmcv/video/optflow.py | https://github.com/princeton-vl/RAFT/blob/224320502d66c356d88e6c712f38129e60661e80/core/utils/frame_utils.py#L102 | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmseg/datasets/builder.py | LAVIS/lavis/common/annotator/uniformer/mmseg/datasets/builder.py | https://github.com/pytorch/pytorch/issues/973 | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/cnn/bricks/context_block.py | LAVIS/lavis/common/annotator/uniformer/mmcv/cnn/bricks/context_block.py | https://arxiv.org/abs/1904.11492 | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/cnn/bricks/depthwise_separable_conv_module.py | LAVIS/lavis/common/annotator/uniformer/mmcv/cnn/bricks/depthwise_separable_conv_module.py | https://arxiv.org/pdf/1704.04861.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/cnn/bricks/conv_ws.py | LAVIS/lavis/common/annotator/uniformer/mmcv/cnn/bricks/conv_ws.py | https://arxiv.org/pdf/1903.10520.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/ops/saconv.py | LAVIS/lavis/common/annotator/uniformer/mmcv/cnn/bricks/conv_ws.py | https://arxiv.org/pdf/2006.02334.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/cnn/bricks/drop.py | LAVIS/lavis/common/annotator/uniformer/mmcv/cnn/bricks/drop.py | https://github.com/rwightman/pytorch-image-models/blob/a2727c1bf78ba0d7b5727f5f95e37fb7f8866b1f/timm/models/layers/drop.py | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/cnn/bricks/generalized_attention.py | LAVIS/lavis/common/annotator/uniformer/mmcv/cnn/bricks/generalized_attention.py | https://arxiv.org/abs/1711.07971 | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/cnn/bricks/drop.py | LAVIS/lavis/common/annotator/uniformer/mmcv/cnn/bricks/drop.py | https://github.com/rwightman/pytorch-image-models/blob/a2727c1bf78ba0d7b5727f5f95e37fb7f8866b1f/timm/models/layers/drop.py | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/cnn/bricks/generalized_attention.py | LAVIS/lavis/common/annotator/uniformer/mmcv/cnn/bricks/non_local.py | https://arxiv.org/abs/1711.07971 | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/cnn/bricks/non_local.py | LAVIS/lavis/common/annotator/uniformer/mmcv/cnn/bricks/non_local.py | https://github.com/AlexHex7/Non-local_pytorch | 源码实现 | -| 开源代码引入 | 
https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/cnn/bricks/plugin.py | LAVIS/lavis/common/annotator/uniformer/mmcv/cnn/bricks/plugin.py | https://inflection.readthedocs.io/en/latest/#inflection.underscore | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/cnn/bricks/wrappers.py | LAVIS/lavis/common/annotator/uniformer/mmcv/cnn/bricks/wrappers.py | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/wrappers.py | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/cnn/utils/flops_counter.py | LAVIS/lavis/common/annotator/uniformer/mmcv/cnn/utils/flops_counter.py | https://github.com/sovrasov/flops-counter.pytorch | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/cnn/utils/sync_bn.py | LAVIS/lavis/common/annotator/uniformer/mmcv/cnn/utils/sync_bn.py | https://github.com/pytorch/pytorch/issues/41081#issuecomment-783961547 | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/cnn/utils/sync_bn.py | LAVIS/lavis/common/annotator/uniformer/mmcv/cnn/utils/sync_bn.py | https://github.com/pytorch/pytorch/issues/41081#issuecomment-783961547 | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/cnn/bricks/transformer.py | LAVIS/lavis/common/annotator/uniformer/mmcv/cnn/bricks/transformer.py | https://arxiv.org/abs/2002.04745 | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/cnn/utils/weight_init.py | LAVIS/lavis/common/annotator/uniformer/mmcv/cnn/utils/weight_init.py | http://proceedings.mlr.press/v9/glorot10a/glorot10a.pdf | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/cnn/utils/weight_init.py | LAVIS/lavis/common/annotator/uniformer/mmcv/cnn/utils/weight_init.py | https://www.cv-foundation.org/openaccess/content_iccv_2015/ | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/cnn/utils/weight_init.py | LAVIS/lavis/common/annotator/uniformer/mmcv/cnn/utils/weight_init.py | http://download.openmmlab.com/mmdetection/v2.0/retinanet/ | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/clip_models/timm_model.py | LAVIS/lavis/common/annotator/uniformer/mmcv/cnn/utils/weight_init.py | https://people.sc.fsu.edu/~jburkardt/presentations/truncated_normal.pdf | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/cnn/utils/weight_init.py | LAVIS/lavis/common/annotator/uniformer/mmcv/cnn/utils/weight_init.py | https://github.com/pytorch/pytorch/blob/master/torch/nn/init.py | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/cnn/utils/weight_init.py | LAVIS/lavis/common/annotator/uniformer/mmcv/cnn/utils/weight_init.py | https://github.com/pytorch/pytorch/blob/master/torch/nn/init.py | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/runner/hooks/evaluation.py | LAVIS/lavis/common/annotator/uniformer/mmcv/runner/hooks/evaluation.py | https://github.com/open-mmlab/mmsegmentation/issues/694 | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/runner/hooks/optimizer.py | 
LAVIS/lavis/common/annotator/uniformer/mmcv/runner/hooks/optimizer.py | https://pytorch.org/docs/stable/amp.html#torch.cuda.amp.GradScaler | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/runner/hooks/lr_updater.py | LAVIS/lavis/common/annotator/uniformer/mmcv/runner/hooks/lr_updater.py | https://github.com/fastai/fastai/blob/master/fastai/callback/schedule.py#L128 | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/runner/hooks/evaluation.py | LAVIS/lavis/common/annotator/uniformer/mmcv/runner/hooks/evaluation.py | https://github.com/open-mmlab/mmdetection/issues/6265 | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/runner/hooks/optimizer.py | LAVIS/lavis/common/annotator/uniformer/mmcv/runner/hooks/optimizer.py | https://pytorch.org/docs/stable/amp.html#torch.cuda.amp.GradScaler | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/runner/hooks/lr_updater.py | LAVIS/lavis/common/annotator/uniformer/mmcv/runner/hooks/momentum_updater.py | https://arxiv.org/pdf/1708.07120.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/runner/hooks/lr_updater.py | LAVIS/lavis/common/annotator/uniformer/mmcv/runner/hooks/lr_updater.py | https://arxiv.org/pdf/1506.01186.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/runner/hooks/optimizer.py | LAVIS/lavis/common/annotator/uniformer/mmcv/runner/hooks/optimizer.py | https://arxiv.org/abs/1710.03740 | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/runner/hooks/profiler.py | LAVIS/lavis/common/annotator/uniformer/mmcv/runner/hooks/profiler.py | https://pytorch.org/docs/1.8.1/profiler.html#torch.profiler.profile | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/runner/hooks/lr_updater.py | LAVIS/lavis/common/annotator/uniformer/mmcv/runner/hooks/lr_updater.py | https://arxiv.org/pdf/1708.07120.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmseg/models/backbones/cgnet.py | LAVIS/lavis/common/annotator/uniformer/mmseg/models/backbones/cgnet.py | https://arxiv.org/abs/1811.08201 | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmseg/models/backbones/hrnet.py | LAVIS/lavis/common/annotator/uniformer/mmseg/models/backbones/hrnet.py | https://arxiv.org/abs/1904.04514 | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmseg/models/decode_heads/lraspp_head.py | LAVIS/lavis/common/annotator/uniformer/mmseg/models/backbones/mobilenet_v3.py | https://ieeexplore.ieee.org/document/9008835 | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/vit.py | LAVIS/lavis/common/annotator/uniformer/mmseg/models/backbones/vit.py | https://github.com/rwightman/pytorch-image- | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmseg/models/backbones/unet.py | LAVIS/lavis/common/annotator/uniformer/mmseg/models/backbones/unet.py | https://arxiv.org/pdf/1505.04597.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/vit.py | 
LAVIS/lavis/common/annotator/uniformer/mmseg/models/backbones/vit.py | https://arxiv.org/abs/2010.11929 | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/vit.py | LAVIS/lavis/common/annotator/uniformer/mmseg/models/backbones/uniformer.py | https://arxiv.org/abs/2010.11929 | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmseg/models/backbones/vit.py | LAVIS/lavis/common/annotator/uniformer/mmseg/models/backbones/vit.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/vision_transformer.py#L353 | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmseg/models/backbones/resnet.py | LAVIS/lavis/common/annotator/uniformer/mmseg/models/backbones/resnet.py | https://arxiv.org/pdf/1812.01187.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmseg/models/decode_heads/apc_head.py | LAVIS/lavis/common/annotator/uniformer/mmseg/models/decode_heads/apc_head.py | https://openaccess.thecvf.com/content_CVPR_2019/papers/ | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmseg/models/decode_heads/aspp_head.py | LAVIS/lavis/common/annotator/uniformer/mmseg/models/decode_heads/aspp_head.py | https://arxiv.org/abs/1706.05587 | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmseg/models/decode_heads/cc_head.py | LAVIS/lavis/common/annotator/uniformer/mmseg/models/decode_heads/cc_head.py | https://arxiv.org/abs/1811.11721 | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmseg/models/decode_heads/ann_head.py | LAVIS/lavis/common/annotator/uniformer/mmseg/models/decode_heads/ann_head.py | https://arxiv.org/abs/1908.07678 | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmseg/models/decode_heads/da_head.py | LAVIS/lavis/common/annotator/uniformer/mmseg/models/decode_heads/da_head.py | https://arxiv.org/abs/1809.02983 | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmseg/models/decode_heads/dm_head.py | LAVIS/lavis/common/annotator/uniformer/mmseg/models/decode_heads/dm_head.py | https://openaccess.thecvf.com/content_ICCV_2019/papers/ | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmseg/models/decode_heads/dnl_head.py | LAVIS/lavis/common/annotator/uniformer/mmseg/models/decode_heads/dnl_head.py | https://arxiv.org/abs/2006.06668 | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmseg/models/decode_heads/fcn_head.py | LAVIS/lavis/common/annotator/uniformer/mmseg/models/decode_heads/fcn_head.py | https://arxiv.org/abs/1411.4038 | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmseg/models/decode_heads/enc_head.py | LAVIS/lavis/common/annotator/uniformer/mmseg/models/decode_heads/enc_head.py | https://arxiv.org/abs/1803.08904 | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmseg/models/decode_heads/ema_head.py | LAVIS/lavis/common/annotator/uniformer/mmseg/models/decode_heads/ema_head.py | https://arxiv.org/abs/1907.13426 | 参考论文地址 | -| 开源代码引入 | 
https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmseg/models/decode_heads/fpn_head.py | LAVIS/lavis/common/annotator/uniformer/mmseg/models/decode_heads/fpn_head.py | https://arxiv.org/abs/1901.02446 | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/cnn/bricks/context_block.py | LAVIS/lavis/common/annotator/uniformer/mmseg/models/decode_heads/gc_head.py | https://arxiv.org/abs/1904.11492 | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/cnn/bricks/generalized_attention.py | LAVIS/lavis/common/annotator/uniformer/mmseg/models/decode_heads/nl_head.py | https://arxiv.org/abs/1711.07971 | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmseg/models/decode_heads/lraspp_head.py | LAVIS/lavis/common/annotator/uniformer/mmseg/models/decode_heads/lraspp_head.py | https://ieeexplore.ieee.org/document/9008835 | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmseg/models/decode_heads/point_head.py | LAVIS/lavis/common/annotator/uniformer/mmseg/models/decode_heads/point_head.py | https://github.com/facebookresearch/detectron2/tree/master/projects/PointRend/point_head/point_head.py | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmseg/models/decode_heads/psa_head.py | LAVIS/lavis/common/annotator/uniformer/mmseg/models/decode_heads/psa_head.py | https://hszhao.github.io/papers/eccv18_psanet.pdf | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmseg/models/decode_heads/ocr_head.py | LAVIS/lavis/common/annotator/uniformer/mmseg/models/decode_heads/ocr_head.py | https://arxiv.org/abs/1909.11065 | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmseg/models/decode_heads/psp_head.py | LAVIS/lavis/common/annotator/uniformer/mmseg/models/decode_heads/psp_head.py | https://arxiv.org/abs/1612.01105 | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmseg/models/decode_heads/sep_aspp_head.py | LAVIS/lavis/common/annotator/uniformer/mmseg/models/decode_heads/sep_aspp_head.py | https://arxiv.org/abs/1802.02611 | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmseg/models/decode_heads/uper_head.py | LAVIS/lavis/common/annotator/uniformer/mmseg/models/decode_heads/uper_head.py | https://arxiv.org/abs/1807.10221 | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmseg/models/losses/dice_loss.py | LAVIS/lavis/common/annotator/uniformer/mmseg/models/losses/dice_loss.py | https://github.com/LikeLy-Journey/SegmenTron/blob/master/ | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmseg/models/losses/lovasz_loss.py | LAVIS/lavis/common/annotator/uniformer/mmseg/models/losses/lovasz_loss.py | https://github.com/bermanmaxim/LovaszSoftmax/blob/master/pytor | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmseg/models/losses/dice_loss.py | LAVIS/lavis/common/annotator/uniformer/mmseg/models/losses/dice_loss.py | https://arxiv.org/abs/1606.04797 | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmseg/models/necks/fpn.py | 
LAVIS/lavis/common/annotator/uniformer/mmseg/models/necks/fpn.py | https://arxiv.org/abs/1612.03144 | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmseg/models/losses/lovasz_loss.py | LAVIS/lavis/common/annotator/uniformer/mmseg/models/losses/lovasz_loss.py | https://arxiv.org/abs/1705.08790 | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/vit.py | LAVIS/lavis/common/annotator/uniformer/mmseg/models/utils/drop.py | https://github.com/rwightman/pytorch-image- | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmseg/models/utils/make_divisible.py | LAVIS/lavis/common/annotator/uniformer/mmseg/models/utils/make_divisible.py | https://github.com/tensorflow/models/blob/master/research/slim/nets/mobilenet/mobilenet.py | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/blip_models/nlvr_encoder.py | LAVIS/lavis/common/annotator/uniformer/mmseg/models/utils/self_attention_block.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/vit.py | LAVIS/lavis/common/annotator/uniformer/mmseg/models/utils/weight_init.py | https://github.com/rwightman/pytorch-image- | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/clip_models/timm_model.py | LAVIS/lavis/common/annotator/uniformer/mmseg/models/utils/weight_init.py | https://people.sc.fsu.edu/~jburkardt/presentations | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/runner/hooks/logger/dvclive.py | LAVIS/lavis/common/annotator/uniformer/mmcv/runner/hooks/logger/dvclive.py | https://dvc.org/doc/dvclive | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/runner/hooks/logger/mlflow.py | LAVIS/lavis/common/annotator/uniformer/mmcv/runner/hooks/logger/mlflow.py | https://www.mlflow.org/docs/latest/index.html | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/runner/hooks/logger/neptune.py | LAVIS/lavis/common/annotator/uniformer/mmcv/runner/hooks/logger/neptune.py | https://docs.neptune.ai/api-reference/neptune#init | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/runner/hooks/logger/neptune.py | LAVIS/lavis/common/annotator/uniformer/mmcv/runner/hooks/logger/neptune.py | https://docs.neptune.ai/you-should-know/logging-metadata | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmseg/core/seg/sampler/ohem_pixel_sampler.py | LAVIS/lavis/common/annotator/uniformer/mmseg/core/seg/sampler/ohem_pixel_sampler.py | https://github.com/pytorch/pytorch/issues/22812 | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/app/calculate_coco_features.py | LAVIS/app/calculate_coco_features.py | https://storage.googleapis.com/sfr-vision-language-research/BLIP/demo.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/app/multimodal_search.py | LAVIS/app/calculate_coco_features.py | https://storage.googleapis.com/sfr-vision-language-research/BLIP/models/model_base.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/app/classification.py | LAVIS/app/classification.py | 
https://img.atlasobscura.com/yDJ86L8Ou6aIjBsxnlAy5f164w1rjTgcHZcx2yUs4mo/rt:fit/w:1200/q:81/sm:1/scp:1/ar:1/aHR0cHM6Ly9hdGxh/cy1kZXYuczMuYW1h/em9uYXdzLmNvbS91/cGxvYWRzL3BsYWNl/X2ltYWdlcy85MDll/MDRjOS0 | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/app/dataset_browser.py | LAVIS/app/dataset_browser.py | https://media.giphy.com/media/vFKqnCdLPNOKc/giphy.gif | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/app/multimodal_search.py | LAVIS/app/multimodal_search.py | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/assets/path2feat_coco_train2014.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/app/multimodal_search.py | LAVIS/app/multimodal_search.py | https://storage.googleapis.com/sfr-vision-language-research/BLIP/models/model_base.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/annotator/uniformer/mmcv/ops/nms.py | LAVIS/lavis/common/utils.py | https://github.com/pytorch/vision | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/utils.py | LAVIS/lavis/common/utils.py | https://github.com/facebookresearch/vissl/blob/main/vissl/utils/download.py | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/utils.py | LAVIS/lavis/common/utils.py | https://drive.google.com/file/d/137RyRjvTBkBiIfeYBNZBtViDHQ6_Ewsp/view | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/utils.py | LAVIS/lavis/common/utils.py | https://drive.google.com/uc?export=download&id=137RyRjvTBkBiIfeYBNZBtViDHQ6_Ewsp | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/utils.py | LAVIS/lavis/common/utils.py | https://drive.google.com/uc?export=download&id= | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/datasets/data_utils.py | LAVIS/lavis/datasets/data_utils.py | http://www.quest.dcs.shef.ac.uk/wmt16_files_mmt/validation.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/eva_vit.py | LAVIS/lavis/models/eva_vit.py | https://github.com/baaivision/EVA | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/vit.py | LAVIS/lavis/models/eva_vit.py | https://github.com/rwightman/pytorch-image-models/tree/master/timm | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/eva_vit.py | LAVIS/lavis/models/eva_vit.py | https://github.com/microsoft/unilm/tree/master/beit | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/eva_vit.py | LAVIS/lavis/models/eva_vit.py | https://github.com/facebookresearch/deit/ | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/eva_vit.py | LAVIS/lavis/models/eva_vit.py | https://github.com/facebookresearch/dino | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/med.py | LAVIS/lavis/models/med.py | https://github.com/huggingface/transformers/blob/v4.15.0/src/transformers/models/bert | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/vit.py | LAVIS/lavis/models/vit.py | https://github.com/rwightman/pytorch-image-models/tree/master/timm | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/clip_vit.py | LAVIS/lavis/models/clip_vit.py | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/models/BLIP2/clip_vit_L.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/vit.py | 
LAVIS/lavis/models/vit.py | https://arxiv.org/abs/2010.11929 | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/vit.py | LAVIS/lavis/models/vit.py | https://github.com/google-research/vision_transformer/blob/00883dd691c63a6830751563748663526e811cee/vit_jax/checkpoint.py#L224 | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/eva_vit.py | LAVIS/lavis/models/eva_vit.py | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/models/BLIP2/eva_vit_g.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/vit.py | LAVIS/lavis/models/vit.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_224-b5f2ef4d.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/blip_models/nlvr_encoder.py | LAVIS/lavis/models/med.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/blip_models/nlvr_encoder.py | LAVIS/lavis/models/med.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/runners/runner_base.py | LAVIS/lavis/runners/runner_base.py | https://github.com/salesforce/LAVIS/issues/449 | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/tasks/dialogue.py | LAVIS/lavis/tasks/captioning.py | https://storage.googleapis.com/sfr-vision-language-research/datasets/coco_karpathy_val_gt.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/tasks/dialogue.py | LAVIS/lavis/tasks/captioning.py | https://storage.googleapis.com/sfr-vision-language-research/datasets/coco_karpathy_test_gt.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/tasks/dialogue.py | LAVIS/lavis/tasks/dialogue.py | https://storage.googleapis.com/sfr-vision-language-research/datasets/coco_karpathy_val_gt.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/tasks/dialogue.py | LAVIS/lavis/tasks/dialogue.py | https://storage.googleapis.com/sfr-vision-language-research/datasets/coco_karpathy_test_gt.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/projects/img2llm-vqa/img2llm_vqa.py | LAVIS/projects/img2llm-vqa/img2llm_vqa.py | https://colab.research.google.com/github/anthonytmh/lavis-pnpvqa/blob/pnp_vqa/projects/pnp-vqa/pnp_vqa.ipynb | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/projects/img2llm-vqa/img2llm_vqa.py | LAVIS/projects/img2llm-vqa/img2llm_vqa.py | https://colab.research.google.com/assets/colab-badge.svg | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/runners/runner_base.py | LAVIS/projects/img2llm-vqa/img2llm_vqa.py | https://github.com/salesforce/LAVIS | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/projects/img2llm-vqa/img2llm_vqa.ipynb | LAVIS/projects/img2llm-vqa/img2llm_vqa.py | https://github.com/explosion/spacy-models/releases/download/en_core_web_sm-3.0.0/en_core_web_sm-3.0.0.tar.gz | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/projects/img2llm-vqa/img2llm_vqa.ipynb | LAVIS/projects/img2llm-vqa/img2llm_vqa.py | https://github.com/explosion/spacy-models/releases/download/en_core_web_sm-3.0.0/en_core_web_sm-3.0.0.tar.gz | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/projects/img2llm-vqa/img2llm_vqa.ipynb | LAVIS/projects/img2llm-vqa/img2llm_vqa.py | 
https://storage.googleapis.com/sfr-vision-language-research/LAVIS/projects/pnp-vqa/demo.png | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/tasks/vqa.py | LAVIS/lavis/tasks/vqa.py | https://github.com/allenai/aokvqa/blob/main/evaluation/eval_predictions.py#L45 | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/README.md | LAVIS/projects/blip2/model_card.pdf | https://arxiv.org/abs/2301.12597 | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/README.md | LAVIS/projects/blip2/model_card.pdf | https://github.com/salesforce/LAVIS/tree/main/projects/blip2 | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/README.md | LAVIS/projects/blip2/model_card.pdf | https://github.com/salesforce/LAVIS/tree/main/projects/blip2 | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/projects/blip2/model_card.pdf | LAVIS/projects/blip2/model_card.pdf | https://github.com/salesforce/CodeT5/blob/main/LICENSE.txt | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/projects/blip2/model_card.pdf | LAVIS/projects/blip2/model_card.pdf | junnan.li@salesforce.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/README.md | LAVIS/projects/blip2/model_card.pdf | https://arxiv.org/abs/2301.12597 | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/README.md | LAVIS/projects/blip2/model_card.pdf | https://arxiv.org/abs/2301.12597 | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/projects/blip2/model_card.pdf | LAVIS/projects/blip2/model_card.pdf | https://arxiv.org/abs/2210.11416 | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/projects/blip2/model_card.pdf | LAVIS/projects/blip2/model_card.pdf | https://arxiv.org/abs/2205.01068 | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/vqa_tools/vqa.py | LAVIS/lavis/common/vqa_tools/vqa.py | https://github.com/pdollar/coco/blob/master/PythonAPI/pycocotools/coco.py | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/common/vqa_tools/vqa_eval.py | LAVIS/lavis/common/vqa_tools/vqa_eval.py | https://github.com/tylin/coco-caption/blob/master/pycocoevalcap/eval.py | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/dataset_card/nlvr2.md | LAVIS/lavis/datasets/datasets/dataloader_utils.py | https://github.com/ChenRocks/UNITER | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/datasets/datasets/dataloader_utils.py | LAVIS/lavis/datasets/datasets/dataloader_utils.py | https://github.com/open-mmlab/mmcv/blob/master/mmcv/runner/iter_based_runner.py | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/datasets/download_scripts/download_coco.py | LAVIS/lavis/datasets/download_scripts/download_coco.py | http://images.cocodataset.org/zips/train2014.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/datasets/download_scripts/download_coco.py | LAVIS/lavis/datasets/download_scripts/download_coco.py | http://images.cocodataset.org/zips/val2014.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/datasets/download_scripts/download_coco.py | LAVIS/lavis/datasets/download_scripts/download_coco.py | http://images.cocodataset.org/zips/test2014.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/datasets/download_scripts/download_coco.py | LAVIS/lavis/datasets/download_scripts/download_coco.py | 
http://images.cocodataset.org/zips/test2015.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/datasets/download_scripts/download_didemo.py | LAVIS/lavis/datasets/download_scripts/download_didemo.py | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/didemo/didemo_videos.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/datasets/download_scripts/download_flickr.py | LAVIS/lavis/datasets/download_scripts/download_flickr.py | https://www.kaggle.com/datasets/hsankesara/flickr-image-dataset | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/datasets/download_scripts/download_gqa.py | LAVIS/lavis/datasets/download_scripts/download_gqa.py | https://downloads.cs.stanford.edu/nlp/data/gqa/images.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/datasets/download_scripts/download_flickr.py | LAVIS/lavis/datasets/download_scripts/download_flickr.py | https://www.kaggle.com/docs/api | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/datasets/download_scripts/download_msrvtt.py | LAVIS/lavis/datasets/download_scripts/download_msrvtt.py | https://www.mediafire.com/file/czh8sezbo9s4692/test_videos.zip/file | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/datasets/download_scripts/download_msrvtt.py | LAVIS/lavis/datasets/download_scripts/download_msrvtt.py | https://www.mediafire.com/file/x3rrbe4hwp04e6w/train_val_videos.zip/file | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/datasets/download_scripts/download_msrvtt.py | LAVIS/lavis/datasets/download_scripts/download_msrvtt.py | https://download1602.mediafire.com/xxxxxxxxxxxx/x3rrbe4hwp04e6w/train_val_videos.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/datasets/download_scripts/download_msrvtt.py | LAVIS/lavis/datasets/download_scripts/download_msrvtt.py | https://download2390.mediafire.com/xxxxxxxxxxxx/czh8sezbo9s4692/test_videos.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/datasets/download_scripts/download_msrvtt.py | LAVIS/lavis/datasets/download_scripts/download_msrvtt.py | https://download2295.mediafire.com/4bb7p74xrbgg/x3rrbe4hwp04e6w/train_val_videos.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/datasets/download_scripts/download_msrvtt.py | LAVIS/lavis/datasets/download_scripts/download_msrvtt.py | https://download2390.mediafire.com/79hfq3592lqg/czh8sezbo9s4692/test_videos.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/datasets/download_scripts/download_msvd.py | LAVIS/lavis/datasets/download_scripts/download_msvd.py | https://www.cs.utexas.edu/users/ml/clamp/videoDescription/YouTubeClips.tar | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/datasets/download_scripts/download_sbu.py | LAVIS/lavis/datasets/download_scripts/download_sbu.py | http://www.cs.rice.edu/~vo9/sbucaptions/sbu_images.tar | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/datasets/download_scripts/download_nocaps.py | LAVIS/lavis/datasets/download_scripts/download_nocaps.py | https://nocaps.s3.amazonaws.com/nocaps_val_4500_captions.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/datasets/download_scripts/download_nocaps.py | LAVIS/lavis/datasets/download_scripts/download_nocaps.py | https://s3.amazonaws.com/nocaps/nocaps_test_image_info.json | 
模型参数相关配置 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/datasets/download_scripts/download_vg.py | LAVIS/lavis/datasets/download_scripts/download_vg.py | https://cs.stanford.edu/people/rak248/VG_100K_2/images.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/datasets/download_scripts/download_vg.py | LAVIS/lavis/datasets/download_scripts/download_vg.py | https://cs.stanford.edu/people/rak248/VG_100K_2/images2.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/blip2_models/blip2_vicuna_instruct.py | LAVIS/lavis/models/blip2_models/blip2_t5_instruct.py | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/models/BLIP2/blip2_pretrained.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/blip2_models/blip2_vicuna_instruct.py | LAVIS/lavis/models/blip2_models/blip2_vicuna_instruct.py | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/models/BLIP2/blip2_pretrained.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/med.py | LAVIS/lavis/models/blip2_models/Qformer.py | https://github.com/huggingface/transformers/blob/v4.15.0/src/transformers/models/bert | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/blip2_models/modeling_t5.py | LAVIS/lavis/models/blip2_models/modeling_t5.py | https://huggingface.co/models?filter=t5 | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/blip2_models/modeling_opt.py | LAVIS/lavis/models/blip2_models/modeling_opt.py | https://huggingface.co/models?filter=opt | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/blip2_models/modeling_t5.py | LAVIS/lavis/models/blip2_models/modeling_t5.py | https://medium.com/huggingface/from-tensorflow-to-pytorch-265f40ef2a28 | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/blip2_models/modeling_t5.py | LAVIS/lavis/models/blip2_models/modeling_t5.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/blip2_models/modeling_opt.py | LAVIS/lavis/models/blip2_models/modeling_opt.py | https://github.com/huggingface/transformers/pull/17437 | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/blip2_models/modeling_t5.py | LAVIS/lavis/models/blip2_models/modeling_t5.py | https://arxiv.org/abs/1910.07467 | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/blip2_models/modeling_opt.py | LAVIS/lavis/models/blip2_models/modeling_opt.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/blip2_models/modeling_opt.py | LAVIS/lavis/models/blip2_models/modeling_llama.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/blip2_models/modeling_t5.py | LAVIS/lavis/models/blip2_models/modeling_t5.py | https://github.com/tensorflow/mesh/blob/0cb87fe07da627bf0b7e60475d59f95ed6b5be3d/mesh_tensorflow/transformer/transformer_layers.py#L593 | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/blip2_models/modeling_opt.py | LAVIS/lavis/models/blip2_models/modeling_opt.py | https://arxiv.org/abs/1910.13461 | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/blip2_models/modeling_opt.py | 
LAVIS/lavis/models/blip2_models/modeling_llama.py | https://arxiv.org/abs/1910.13461 | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/blip2_models/modeling_opt.py | LAVIS/lavis/models/blip2_models/modeling_opt.py | https://github.com/facebookresearch/metaseq/pull/164 | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/blip2_models/modeling_opt.py | LAVIS/lavis/models/blip2_models/modeling_opt.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/blip_models/nlvr_encoder.py | LAVIS/lavis/models/blip2_models/Qformer.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/blip_models/nlvr_encoder.py | LAVIS/lavis/models/blip2_models/Qformer.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/blip2_models/modeling_t5.py | LAVIS/lavis/models/blip2_models/modeling_t5.py | https://github.com/tensorflow/mesh/blob/fa19d69eafc9a482aff0b59ddd96b025c0cb207d/mesh_tensorflow/layers.py#L1624 | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/blip2_models/modeling_t5.py | LAVIS/lavis/models/blip2_models/modeling_t5.py | https://github.com/tensorflow/mesh/blob/master/mesh_tensorflow/transformer/transformer_layers.py#L56 | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/blip2_models/modeling_t5.py | LAVIS/lavis/models/blip2_models/modeling_t5.py | https://github.com/tensorflow/mesh/blob/fa19d69eafc9a482aff0b59ddd96b025c0cb207d/mesh_tensorflow/layers.py#L89 | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/blip2_models/modeling_t5.py | LAVIS/lavis/models/blip2_models/modeling_t5.py | https://github.com/tensorflow/mesh/blob/fa19d69eafc9a482aff0b59ddd96b025c0cb207d/mesh_tensorflow/transformer/attention.py#L136 | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/blip_diffusion_models/modeling_ctx_clip.py | LAVIS/lavis/models/blip_diffusion_models/modeling_ctx_clip.py | https://github.com/openai/CLIP/blob/cfcffb90e69f37bf2ff1e988237a0fbe41f33c04/clip/model.py#L324 | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/blip_diffusion_models/ptp_utils.py | LAVIS/lavis/models/blip_diffusion_models/ptp_utils.py | https://github.com/google/prompt-to-prompt/blob/main/ptp_utils.py | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/blip2_models/modeling_t5.py | LAVIS/lavis/models/blip2_models/modeling_t5.py | https://arxiv.org/abs/1910.10683 | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/blip2_models/modeling_opt.py | LAVIS/lavis/models/blip2_models/modeling_t5.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/blip2_models/modeling_t5.py | LAVIS/lavis/models/blip2_models/modeling_t5.py | https://github.com/tensorflow/mesh/blob/fa19d69eafc9a482aff0b59ddd96b025c0cb207d/mesh_tensorflow/transformer/transformer.py#L586 | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/clip_models/clip_outputs.py | LAVIS/lavis/models/clip_models/clip_outputs.py | https://github.com/mlfoundations/open_clip | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/clip_models/clip_outputs.py | LAVIS/lavis/models/clip_models/model.py | 
https://github.com/mlfoundations/open_clip | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/blip_diffusion_models/modeling_ctx_clip.py | LAVIS/lavis/models/clip_models/model.py | https://github.com/openai/CLIP | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/clip_models/clip_outputs.py | LAVIS/lavis/models/clip_models/pretrained.py | https://github.com/mlfoundations/open_clip | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/clip_models/pretrained.py | LAVIS/lavis/models/clip_models/pretrained.py | https://openaipublic.azureedge.net/clip/models/afeb0e10f9e5a86da6080e35cf09123aca3b358a0c3e3b6c78a7b63bc04b6762/RN50.pt | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/clip_models/pretrained.py | LAVIS/lavis/models/clip_models/pretrained.py | https://github.com/mlfoundations/open_clip/releases/download/v0.2-weights/rn50-quickgelu-yfcc15m-455df137.pt | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/clip_models/pretrained.py | LAVIS/lavis/models/clip_models/pretrained.py | https://github.com/mlfoundations/open_clip/releases/download/v0.2-weights/rn50-quickgelu-cc12m-f000538c.pt | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/clip_models/pretrained.py | LAVIS/lavis/models/clip_models/pretrained.py | https://openaipublic.azureedge.net/clip/models/afeb0e10f9e5a86da6080e35cf09123aca3b358a0c3e3b6c78a7b63bc04b6762/RN50.pt | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/clip_models/pretrained.py | LAVIS/lavis/models/clip_models/pretrained.py | https://github.com/mlfoundations/open_clip/releases/download/v0.2-weights/rn50-quickgelu-yfcc15m-455df137.pt | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/clip_models/pretrained.py | LAVIS/lavis/models/clip_models/pretrained.py | https://github.com/mlfoundations/open_clip/releases/download/v0.2-weights/rn50-quickgelu-cc12m-f000538c.pt | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/clip_models/pretrained.py | LAVIS/lavis/models/clip_models/pretrained.py | https://openaipublic.azureedge.net/clip/models/8fa8567bab74a42d41c5915025a8e4538c3bdbe8804a470a72f30b0d94fab599/RN101.pt | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/clip_models/pretrained.py | LAVIS/lavis/models/clip_models/pretrained.py | https://github.com/mlfoundations/open_clip/releases/download/v0.2-weights/rn101-quickgelu-yfcc15m-3e04b30e.pt | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/clip_models/pretrained.py | LAVIS/lavis/models/clip_models/pretrained.py | https://openaipublic.azureedge.net/clip/models/8fa8567bab74a42d41c5915025a8e4538c3bdbe8804a470a72f30b0d94fab599/RN101.pt | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/clip_models/pretrained.py | LAVIS/lavis/models/clip_models/pretrained.py | https://github.com/mlfoundations/open_clip/releases/download/v0.2-weights/rn101-quickgelu-yfcc15m-3e04b30e.pt | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/clip_models/clip_outputs.py | LAVIS/lavis/models/clip_models/timm_model.py | https://github.com/mlfoundations/open_clip | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/vit.py | LAVIS/lavis/models/clip_models/timm_model.py | https://github.com/rwightman/pytorch-image-models | 源码实现 | -| 开源代码引入 | 
https://github.com/salesforce/LAVIS/blob/main/lavis/models/clip_models/pretrained.py | LAVIS/lavis/models/clip_models/pretrained.py | https://openaipublic.azureedge.net/clip/models/7e526bd135e493cef0776de27d5f42653e6b4c8bf9e0f653bb11773263205fdd/RN50x4.pt | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/clip_models/pretrained.py | LAVIS/lavis/models/clip_models/pretrained.py | https://openaipublic.azureedge.net/clip/models/52378b407f34354e150460fe41077663dd5b39c54cd0bfd2b27167a4a06ec9aa/RN50x16.pt | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/clip_models/pretrained.py | LAVIS/lavis/models/clip_models/pretrained.py | https://openaipublic.azureedge.net/clip/models/be1cfb55d75a9666199fb2206c106743da0f6468c9d327f3e0d0a543a9919d9c/RN50x64.pt | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/clip_models/pretrained.py | LAVIS/lavis/models/clip_models/pretrained.py | https://openaipublic.azureedge.net/clip/models/40d365715913c9da98579312b702a82c18be219cc2a73407c4526f58eba950af/ViT-B-32.pt | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/clip_models/pretrained.py | LAVIS/lavis/models/clip_models/pretrained.py | https://github.com/mlfoundations/open_clip/releases/download/v0.2-weights/vit_b_32-quickgelu-laion400m_e31-d867053b.pt | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/clip_models/pretrained.py | LAVIS/lavis/models/clip_models/pretrained.py | https://github.com/mlfoundations/open_clip/releases/download/v0.2-weights/vit_b_32-quickgelu-laion400m_e32-46683a32.pt | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/clip_models/pretrained.py | LAVIS/lavis/models/clip_models/pretrained.py | https://github.com/mlfoundations/open_clip/releases/download/v0.2-weights/vit_b_32-quickgelu-laion400m_avg-8a00ab3c.pt | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/clip_models/pretrained.py | LAVIS/lavis/models/clip_models/pretrained.py | https://openaipublic.azureedge.net/clip/models/40d365715913c9da98579312b702a82c18be219cc2a73407c4526f58eba950af/ViT-B-32.pt | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/clip_models/pretrained.py | LAVIS/lavis/models/clip_models/pretrained.py | https://github.com/mlfoundations/open_clip/releases/download/v0.2-weights/vit_b_32-quickgelu-laion400m_e31-d867053b.pt | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/clip_models/pretrained.py | LAVIS/lavis/models/clip_models/pretrained.py | https://github.com/mlfoundations/open_clip/releases/download/v0.2-weights/vit_b_32-quickgelu-laion400m_e32-46683a32.pt | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/clip_models/pretrained.py | LAVIS/lavis/models/clip_models/pretrained.py | https://github.com/mlfoundations/open_clip/releases/download/v0.2-weights/vit_b_32-quickgelu-laion400m_avg-8a00ab3c.pt | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/clip_models/pretrained.py | LAVIS/lavis/models/clip_models/pretrained.py | https://openaipublic.azureedge.net/clip/models/5806e77cd80f8b59890b7e101eabd078d9fb84e6937f9e85e4ecb61988df416f/ViT-B-16.pt | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/clip_models/pretrained.py | LAVIS/lavis/models/clip_models/pretrained.py | 
https://openaipublic.azureedge.net/clip/models/b8cca3fd41ae0c99ba7e8951adf17d267cdb84cd88be6f7c2e0eca1737a03836/ViT-L-14.pt | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/clip_models/pretrained.py | LAVIS/lavis/models/clip_models/pretrained.py | https://openaipublic.azureedge.net/clip/models/3035c92b350959924f9f00213499208652fc7ea050643e8b385c2dac08641f02/ViT-L-14-336px.pt | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/vit.py | LAVIS/lavis/models/clip_models/timm_model.py | https://github.com/rwightman/pytorch-image-models | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/clip_models/timm_model.py | LAVIS/lavis/models/clip_models/timm_model.py | https://github.com/openai/CLIP/blob/3b473b0e682c091a9e53623eebc1ca1657385717/clip/model.py | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/clip_models/clip_outputs.py | LAVIS/lavis/models/clip_models/tokenizer.py | https://github.com/mlfoundations/open_clip | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/blip_diffusion_models/modeling_ctx_clip.py | LAVIS/lavis/models/clip_models/tokenizer.py | https://github.com/openai/CLIP | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/clip_models/timm_model.py | LAVIS/lavis/models/clip_models/timm_model.py | https://github.com/openai/CLIP/blob/3b473b0e682c091a9e53623eebc1ca1657385717/clip/model.py | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/blip_models/nlvr_encoder.py | LAVIS/lavis/models/blip_models/nlvr_encoder.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/blip_models/nlvr_encoder.py | LAVIS/lavis/models/blip_models/nlvr_encoder.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/clip_models/model.py | LAVIS/lavis/models/clip_models/model.py | https://arxiv.org/abs/2111.07991 | 参考论文地址 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/clip_models/clip_outputs.py | LAVIS/lavis/models/clip_models/transform.py | https://github.com/mlfoundations/open_clip | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/clip_models/timm_model.py | LAVIS/lavis/models/clip_models/timm_model.py | https://github.com/lucidrains/vit-pytorch/blob/6f3a5fcf0bca1c5ec33a35ef48d97213709df4ba/vit_pytorch/rvt.py | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/clip_models/timm_model.py | LAVIS/lavis/models/clip_models/timm_model.py | https://blog.eleuther.ai/rotary-embeddings/ | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/clip_models/timm_model.py | LAVIS/lavis/models/clip_models/timm_model.py | https://people.sc.fsu.edu/~jburkardt/presentations/truncated_normal.pdf | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/clip_models/clip_outputs.py | LAVIS/lavis/models/clip_models/utils.py | https://github.com/mlfoundations/open_clip | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/clip_models/utils.py | LAVIS/lavis/models/clip_models/utils.py | https://github.com/pytorch/pytorch/blob/a5895f85be0f10212791145bfedc0261d364f103/torch/nn/modules/batchnorm.py#L762 | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/clip_models/clip_outputs.py | 
LAVIS/lavis/models/clip_models/__init__.py | https://github.com/mlfoundations/open_clip | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/clip_models/__init__.py | LAVIS/lavis/models/clip_models/__init__.py | https://github.com/mlfoundations/open_clip | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/clip_models/__init__.py | LAVIS/lavis/models/clip_models/__init__.py | https://github.com/openai/CLIP | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/pnp_vqa_models/pnp_unifiedqav2_fid.py | LAVIS/lavis/models/pnp_vqa_models/pnp_unifiedqav2_fid.py | https://github.com/facebookresearch/FiD | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/clip_models/model.py | LAVIS/lavis/models/clip_models/model.py | https://discuss.pytorch.org/t/valueerror-attemting-to-unscale-fp16-gradients/81372 | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/timesformer/conv2d_same.py | LAVIS/lavis/models/timesformer/conv2d_same.py | https://github.com/facebookresearch/TimeSformer | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/timesformer/conv2d_same.py | LAVIS/lavis/models/timesformer/features.py | https://github.com/facebookresearch/TimeSformer | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/timesformer/conv2d_same.py | LAVIS/lavis/models/timesformer/helpers.py | https://github.com/facebookresearch/TimeSformer | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/img2prompt_models/img2prompt_vqa.py | LAVIS/lavis/models/img2prompt_models/img2prompt_vqa.py | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/projects/img2prompt/T5_large_QG.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/timesformer/conv2d_same.py | LAVIS/lavis/models/timesformer/vit.py | https://github.com/facebookresearch/TimeSformer | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/timesformer/vit.py | LAVIS/lavis/models/timesformer/vit.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_p16_224-80ecf9dd.pth | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/timesformer/conv2d_same.py | LAVIS/lavis/models/timesformer/vit_utils.py | https://github.com/facebookresearch/TimeSformer | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/timesformer/conv2d_same.py | LAVIS/lavis/models/timesformer/__init__.py | https://github.com/facebookresearch/TimeSformer | 源码实现 | -| 开源代码引入 | https://github.com/salesforce/LAVIS/blob/main/lavis/models/timesformer/vit_utils.py | LAVIS/lavis/models/timesformer/vit_utils.py | https://github.com/tensorflow/tpu/issues/494#issuecomment-532968956 | 模型相关说明 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|---------------------------------------------------------------------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|-------------| +| 
ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/app/__init__.py | https://storage.googleapis.com/sfr-vision-language-research/BLIP/demo.jpg | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/app/calculate_coco_features.py | https://storage.googleapis.com/sfr-vision-language-research/BLIP/models/model_base.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/app/calculate_coco_features.py | https://storage.googleapis.com/sfr-vision-language-research/BLIP/demo.jpg | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/app/classification.py | https://img.atlasobscura.com/yDJ86L8Ou6aIjBsxnlAy5f164w1rjTgcHZcx2yUs4mo/rt:fit/w:1200/q:81/sm:1/scp:1/ar:1/aHR0cHM6Ly9hdGxh/cy1kZXYuczMuYW1h/em9uYXdzLmNvbS91/cGxvYWRzL3BsYWNl/X2ltYWdlcy85MDll/MDRjOS00NTJjLTQx/NzQtYTY4MS02NmQw/MzI2YWIzNjk1ZGVk/MGZhMTJiMTM5MmZi/NGFfUmVhcl92aWV3/X29mX3RoZV9NZXJs/aW9uX3N0YXR1ZV9h/dF9NZXJsaW9uX1Bh/cmssX1NpbmdhcG9y/ZSxfd2l0aF9NYXJp/bmFfQmF5X1NhbmRz/X2luX3RoZV9kaXN0/YW5jZV8tXzIwMTQw/MzA3LmpwZw.jpg | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/app/dataset_browser.py | https://media.giphy.com/media/vFKqnCdLPNOKc/giphy.gif | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/app/multimodal_search.py | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/assets/path2feat_coco_train2014.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/app/multimodal_search.py | https://storage.googleapis.com/sfr-vision-language-research/BLIP/models/model_base.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/mlsd/models/mbv2_mlsd_large.py | https://download.pytorch.org/models/mobilenet_v2-b0353104.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/mlsd/models/mbv2_mlsd_tiny.py | https://download.pytorch.org/models/mobilenet_v2-b0353104.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg19_bn_batch256_imagenet_20210208-da620c4f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg19_batch256_imagenet_20210208-e6920e4a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg16_bn_batch256_imagenet_20210208-7e55cd29.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg16_batch256_imagenet_20210208-db26f1a5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg13_bn_batch256_imagenet_20210207-1a8b7864.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg13_batch256_imagenet_20210208-4d1d6080.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg11_bn_batch256_imagenet_20210207-f244902c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/mmcls.json | 
https://download.openmmlab.com/mmclassification/v0/vgg/vgg11_batch256_imagenet_20210208-4271cd6c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/shufflenet_v2/shufflenet_v2_batch1024_imagenet_20200812-5bf4721e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/shufflenet_v1/shufflenet_v1_batch1024_imagenet_20200804-5d6cec73.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/se-resnet/se-resnet50_batch256_imagenet_20200804-ae206104.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/se-resnet/se-resnet101_batch256_imagenet_20200804-ba5b51d4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnext/resnext50_32x4d_b32x8_imagenet_20210429-56066e27.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnext/resnext152_32x4d_b32x8_imagenet_20210524-927787be.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnext/resnext101_32x8d_b32x8_imagenet_20210506-23a247d5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnext/resnext101_32x4d_b32x8_imagenet_20210506-e0fa3dd5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnetv1d50_batch256_imagenet_20200708-1ad0ce94.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet50_batch256_imagenet_20200708-cfb998bf.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet34_batch256_imagenet_20200708-32ffb4f7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet18_batch256_imagenet_20200708-34ab8f90.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnetv1d152_batch256_imagenet_20200708-e79cb6a2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet152_batch256_imagenet_20200708-ec25b1f9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnetv1d101_batch256_imagenet_20200708-9cb302ef.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet101_batch256_imagenet_20200708-753f3608.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnest/resnest50_imagenet_converted-1ebf0afe.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnest/resnest269_imagenet_converted-59930960.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnest/resnest200_imagenet_converted-581a60f2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnest/resnest101_imagenet_converted-032caa52.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/mobilenet_v2/mobilenet_v2_batch256_imagenet_20200708-3b2dc3af.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/vgg16_caffe-292e1171.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext50-32x4d-0ab1a123.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_64x4d-ee2c6f71.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_32x4d-a5af3160.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_v1c-2cccc1ad.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet18_v1c-b5776b93.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_v1c-e67eebb6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnest50_d2-7497a55b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnest200_d2-ca88e41f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnest101_d2-f3b931b2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/res2net101_v1d_26w_4s_mmdetv2-f0a600f9.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_800mf-1f4be4c7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_8.0gf-3c68abe7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_6.4gf-006af45d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_400mf-a5b10d96.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_4.0gf-a88f671e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_3.2gf-c2599b0f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_12gf-4c2a3350.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_1.6gf-5791c176.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w48-d2186c55.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w40-ed0b031c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w32-dc9eeb4f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w18_small-b5a04e21.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w18-00eb2006.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/mmediting/third_party/vgg_state_dict.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/mmediting/third_party/model_best_resnet34_En_nomixup.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/mmediting/third_party/mobilenet_v2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/mmdetection/v2.0/third_party/mobilenet_v2_batch256_imagenet-ff34753d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/open_mmlab.json | 
https://download.openmmlab.com/pretrain/third_party/nl3d_r50_f32s2_k400-fa7e7caa.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/i3d_r50_f32s2_k400-2c57e077.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext50_32x4d_gn_ws-0d87ac85.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext50_32x4d_gn-c7e8b754.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_32x4d_gn_ws-34ac1a9e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_32x4d_gn-ac3bb84e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_gn_ws-15beedd8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_gn_ws-3e3c308c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_32x8d-1516f1aa.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_msra-5891d200.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_msra-6cc46731.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_gn-9186a21c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_caffe-788b5fa3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_gn-cac0ab98.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_caffe-3ad79236.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/darknet53-a628ea1b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_gn_thangvubk-ad1730dd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/mobilenet_v3_small-47085aa1.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/mobilenet_v3_large-bc2c3fd3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/bn_inception_caffe-ed2e8665.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/ops/upfirdn2d.py | https://www.mathworks.com/help/signal/ref/upfirdn.html | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/runner/hooks/logger/mlflow.py | https://www.mlflow.org/docs/latest/index.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/runner/hooks/profiler.py | https://pytorch.org/docs/1.8.1/profiler.html | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/common/annotator/uniformer/mmcv/utils/registry.py | https://mmcv.readthedocs.io/en/latest/understand_mmcv/registry.html | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/aokvqa/defaults.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/aokvqa/specialized_vocab_train.json | 模型参数相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/aokvqa/defaults.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/aokvqa/specialized_vocab_train.json | 模型参数相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/aokvqa/defaults.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/aokvqa/aokvqa_v1p0_val.json | 模型参数相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/aokvqa/defaults.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/aokvqa/aokvqa_v1p0_train.json | 模型参数相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/aokvqa/defaults.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/aokvqa/aokvqa_v1p0_test.json | 模型参数相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/avsd/defaults_dial.yaml | https://storage.googleapis.com/sfr-vision-language-research/datasets/avsd_dstc7_val.json | 模型相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/avsd/defaults_dial.yaml | https://storage.googleapis.com/sfr-vision-language-research/datasets/avsd_dstc7_train.json | 模型相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/avsd/defaults_dial.yaml | https://storage.googleapis.com/sfr-vision-language-research/datasets/avsd_dstc7_test.json | 模型相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/coco/defaults_cap.yaml | https://storage.googleapis.com/sfr-vision-language-research/datasets/coco_karpathy_val.json | 模型相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/coco/defaults_cap.yaml | https://storage.googleapis.com/sfr-vision-language-research/datasets/coco_karpathy_train.json | 模型相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/coco/defaults_cap.yaml | https://storage.googleapis.com/sfr-vision-language-research/datasets/coco_karpathy_test.json | 模型相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/coco/defaults_ret.yaml | https://storage.googleapis.com/sfr-vision-language-research/datasets/coco_karpathy_val.json | 模型相关配置 | +| 
ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/coco/defaults_ret.yaml | https://storage.googleapis.com/sfr-vision-language-research/datasets/coco_karpathy_train.json | 模型相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/coco/defaults_ret.yaml | https://storage.googleapis.com/sfr-vision-language-research/datasets/coco_karpathy_test.json | 模型相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/coco/defaults_vqa.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/vqav2/vqa_val_eval.json | 模型参数相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/coco/defaults_vqa.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/vqav2/vqa_val.json | 模型参数相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/coco/defaults_vqa.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/vqav2/vqa_train.json | 模型参数相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/coco/defaults_vqa.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/vqav2/vqa_test.json | 模型参数相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/coco/defaults_vqa.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/vqav2/v2_OpenEnded_mscoco_val2014_questions.json | 模型参数相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/coco/defaults_vqa.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/vqav2/v2_mscoco_val2014_annotations.json | 模型参数相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/coco/defaults_vqa.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/vqav2/answer_list.json | 模型参数相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/coco/defaults_vqa.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/vqav2/answer_list.json | 模型参数相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/coco/eval_vqa.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/vqav2/vqa_val_eval.json | 模型参数相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/coco/eval_vqa.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/vqav2/v2_OpenEnded_mscoco_val2014_questions.json | 模型参数相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/coco/eval_vqa.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/vqav2/v2_mscoco_val2014_annotations.json | 模型参数相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/coco/eval_vqa.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/vqav2/answer_list.json | 模型参数相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/didemo/defaults_ret.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/didemo/retrieval_val.json | 模型相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/didemo/defaults_ret.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/didemo/retrieval_train.json | 模型相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/didemo/defaults_ret.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/didemo/retrieval_test.json | 模型相关配置 | +| 
ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/flickr30k/defaults.yaml | https://storage.googleapis.com/sfr-vision-language-research/datasets/flickr30k_val.json | 模型相关配置 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/flickr30k/defaults.yaml | https://storage.googleapis.com/sfr-vision-language-research/datasets/flickr30k_train.json | 模型相关配置 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/flickr30k/defaults.yaml | https://storage.googleapis.com/sfr-vision-language-research/datasets/flickr30k_test.json | 模型相关配置 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/gqa/balanced_testdev.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/gqa/testdev_balanced_questions.json | 模型参数相关配置 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/gqa/balanced_testdev.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/gqa/train_balanced_questions.json | 模型参数相关配置 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/gqa/balanced_testdev.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/gqa/test_balanced_questions.json | 模型参数相关配置 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/gqa/balanced_val.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/gqa/val_balanced_questions.json | 模型参数相关配置 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/gqa/balanced_val.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/gqa/train_balanced_questions.json | 模型参数相关配置 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/gqa/balanced_val.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/gqa/test_balanced_questions.json | 模型参数相关配置 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/gqa/defaults.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/aokvqa/large_vocab_train_lavis.json | 模型参数相关配置 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/gqa/defaults.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/aokvqa/large_vocab_train_lavis.json | 模型参数相关配置 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/gqa/defaults.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/aokvqa/aokvqa_v1p0_val.json | 模型参数相关配置 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/gqa/defaults.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/aokvqa/aokvqa_v1p0_test.json | 模型参数相关配置 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/msrvtt/defaults_cap.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/msrvtt/cap_val.json | 模型相关配置 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/msrvtt/defaults_cap.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/msrvtt/cap_train.json | 模型相关配置 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/msrvtt/defaults_cap.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/msrvtt/cap_test.json | 模型相关配置 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/msrvtt/defaults_qa.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/msrvtt/train_ans2label.json | 模型相关配置 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/msrvtt/defaults_qa.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/msrvtt/qa_val.json | 模型相关配置 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/msrvtt/defaults_qa.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/msrvtt/qa_train.json | 模型相关配置 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/msrvtt/defaults_qa.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/msrvtt/qa_test.json | 模型相关配置 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/msrvtt/defaults_ret.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/msrvtt/retrieval_val.json | 模型相关配置 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/msrvtt/defaults_ret.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/msrvtt/retrieval_train.json | 模型相关配置 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/msrvtt/defaults_ret.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/msrvtt/retrieval_test.json | 模型相关配置 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/msvd/defaults_cap.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/msvd/cap_val.json | 模型相关配置 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/msvd/defaults_cap.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/msvd/cap_train.json | 模型相关配置 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/msvd/defaults_cap.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/msvd/cap_test.json | 模型相关配置 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/msvd/defaults_qa.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/msvd/train_ans2label.json | 模型相关配置 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/msvd/defaults_qa.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/msvd/qa_val.json | 模型相关配置 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/msvd/defaults_qa.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/msvd/qa_train.json | 模型相关配置 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/msvd/defaults_qa.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/msvd/qa_test.json | 模型相关配置 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/nlvr/defaults.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/nlvr/nlvr_train.json | 模型相关配置 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/nlvr/defaults.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/nlvr/nlvr_dev.json | 模型相关配置 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/nlvr/defaults.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/nlvr/nlvr_dev.json | 模型相关配置 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/nocaps/defaults.yaml | https://storage.googleapis.com/sfr-vision-language-research/datasets/nocaps_val.json | 模型相关配置 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/nocaps/defaults.yaml | https://storage.googleapis.com/sfr-vision-language-research/datasets/nocaps_test.json | 模型相关配置 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/okvqa/defaults.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/okvqa/OpenEnded_mscoco_val2014_questions.json | 模型参数相关配置 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/okvqa/defaults.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/okvqa/okvqa_val_eval.json | 模型参数相关配置 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/okvqa/defaults.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/okvqa/okvqa_train.json | 模型参数相关配置 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/okvqa/defaults.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/okvqa/okvqa_answer_list_train.json | 模型参数相关配置 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/okvqa/defaults.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/okvqa/mscoco_val2014_annotations.json | 模型参数相关配置 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/sbu_caption/defaults.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/sbu/sbu.json | 模型参数相关配置 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/vatex/defaults_cap.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/vatex/cap_val.json | 模型相关配置 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/vatex/defaults_cap.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/vatex/cap_train.json | 模型相关配置 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/vatex/defaults_cap.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/vatex/cap_private_test.json | 模型相关配置 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/vg/defaults_caption.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/visual_genome/vg_caption.json | 模型相关配置 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/datasets/vg/defaults_vqa.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/datasets/visual_genome/vg_qa.json | 模型相关配置 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/albef_classification_ve.yaml | https://storage.googleapis.com/sfr-pcl-data-research/ALBEF/ALBEF.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/albef_classification_ve.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/models/ALBEF/albef_snli_ve_lavis.pt | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/albef_feature_extractor.yaml | https://storage.googleapis.com/sfr-pcl-data-research/ALBEF/ALBEF.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/albef_nlvr.yaml | https://storage.googleapis.com/sfr-pcl-data-research/ALBEF/pretrain_model_nlvr.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/albef_nlvr.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/models/ALBEF/albef_nlvr_lavis.pt | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/albef_pretrain_base.yaml | https://storage.googleapis.com/sfr-pcl-data-research/ALBEF/ALBEF.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/albef_retrieval_coco.yaml | https://storage.googleapis.com/sfr-pcl-data-research/ALBEF/ALBEF.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/albef_retrieval_coco.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/models/ALBEF/albef_coco_retrieval_lavis.pt | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/albef_retrieval_flickr.yaml | https://storage.googleapis.com/sfr-pcl-data-research/ALBEF/ALBEF.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/albef_retrieval_flickr.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/models/ALBEF/albef_flickr_retrieval_lavis.pt | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/albef_vqav2.yaml | https://storage.googleapis.com/sfr-pcl-data-research/ALBEF/ALBEF.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/albef_vqav2.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/models/ALBEF/albef_vqav2_lavis.pt | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/alpro_qa_msrvtt.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/models/ALPRO/alpro_pretrain.pt | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/alpro_qa_msrvtt.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/models/ALPRO/alpro_msrvtt_qa.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/alpro_qa_msvd.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/models/ALPRO/alpro_pretrain.pt | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/alpro_qa_msvd.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/models/ALPRO/alpro_msvd_qa.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/alpro_retrieval_msrvtt.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/models/ALPRO/alpro_pretrain.pt | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/alpro_retrieval_msrvtt.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/models/ALPRO/alpro_msrvtt_retrieval.pt | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/blip_caption_base_coco.yaml | https://storage.googleapis.com/sfr-vision-language-research/BLIP/models/model_base_capfilt_large.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/blip_caption_base_coco.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/models/BLIP/blip_coco_caption_base.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/blip_caption_large_coco.yaml | https://storage.googleapis.com/sfr-vision-language-research/BLIP/models/model_large.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/blip_caption_large_coco.yaml | https://storage.googleapis.com/sfr-vision-language-research/BLIP/models/model_large_caption.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/blip_classification_base.yaml | https://storage.googleapis.com/sfr-vision-language-research/BLIP/models/model_base_caption_capfilt_large.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/blip_feature_extractor_base.yaml | https://storage.googleapis.com/sfr-vision-language-research/BLIP/models/model_base_capfilt_large.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/blip_itm_base.yaml | https://storage.googleapis.com/sfr-vision-language-research/BLIP/models/model_base_retrieval_coco.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/blip_itm_large.yaml | https://storage.googleapis.com/sfr-vision-language-research/BLIP/models/model_large_retrieval_coco.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/blip_nlvr.yaml | https://storage.googleapis.com/sfr-vision-language-research/BLIP/models/model_base_capfilt_large.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/blip_nlvr.yaml | https://storage.googleapis.com/sfr-vision-language-research/BLIP/models/model_base_nlvr.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/blip_pretrain_base.yaml | https://storage.googleapis.com/sfr-vision-language-research/BLIP/models/model_base_capfilt_large.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/blip_retrieval_coco.yaml | https://storage.googleapis.com/sfr-vision-language-research/BLIP/models/model_base_capfilt_large.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/blip_retrieval_coco.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/models/BLIP/blip_coco_retrieval.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/blip_retrieval_flickr.yaml | https://storage.googleapis.com/sfr-vision-language-research/BLIP/models/model_base_capfilt_large.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/blip_retrieval_flickr.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/models/BLIP/blip_flickr_retrieval.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/blip_vqa_aokvqa.yaml | https://storage.googleapis.com/sfr-vision-language-research/BLIP/models/model_base_vqa_capfilt_large.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/blip_vqa_aokvqa.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/models/BLIP/blip_aokvqa.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/blip_vqa_okvqa.yaml | https://storage.googleapis.com/sfr-vision-language-research/BLIP/models/model_base_vqa_capfilt_large.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/blip_vqa_okvqa.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/models/BLIP/blip_okvqa.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/blip_vqav2.yaml | https://storage.googleapis.com/sfr-vision-language-research/BLIP/models/model_base_capfilt_large.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/blip_vqav2.yaml | https://storage.googleapis.com/sfr-vision-language-research/BLIP/models/model_base_vqa_capfilt_large.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/blip2/blip2_caption_flant5xl.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/models/BLIP2/blip2_pretrained_flant5xl.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/blip2/blip2_caption_flant5xl.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/models/BLIP2/blip2_caption_flant5xl.pth | 权重地址 |
+| 
ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/blip2/blip2_caption_opt2.7b.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/models/BLIP2/blip2_pretrained_opt2.7b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/blip2/blip2_caption_opt2.7b.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/models/BLIP2/blip2_caption_opt2.7b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/blip2/blip2_caption_opt6.7b.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/models/BLIP2/blip2_pretrained_opt6.7b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/blip2/blip2_caption_opt6.7b.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/models/BLIP2/blip2_caption_opt6.7b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/blip2/blip2_coco.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/models/BLIP2/blip2_pretrained.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/blip2/blip2_coco.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/models/BLIP2/blip2_finetune_coco.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/blip2/blip2_instruct_flant5xl.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/models/InstructBLIP/instruct_blip_flanxl_trimmed.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/blip2/blip2_instruct_flant5xxl.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/models/InstructBLIP/instruct_blip_flanxxl_trimmed.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/blip2/blip2_instruct_vicuna13b.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/models/InstructBLIP/instruct_blip_vicuna13b_trimmed.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/blip2/blip2_pretrain.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/models/BLIP2/blip2_pretrained.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/blip2/blip2_pretrain_flant5xl.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/models/BLIP2/blip2_pretrained_flant5xl.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/blip2/blip2_pretrain_flant5xl_vitL.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/models/BLIP2/blip2_pretrained_flant5xl_vitL.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/blip2/blip2_pretrain_flant5xxl.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/models/BLIP2/blip2_pretrained_flant5xxl.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/blip2/blip2_pretrain_llama7b.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/models/BLIP2/blip2_pretrained.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/blip2/blip2_pretrain_opt2.7b.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/models/BLIP2/blip2_pretrained_opt2.7b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/blip2/blip2_pretrain_opt6.7b.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/models/BLIP2/blip2_pretrained_opt6.7b.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/blip2/blip2_pretrain_vitL.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/models/BLIP2/blip2_pretrained_vitL.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/blip-diffusion/blip_diffusion_base.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/models/BLIP-Diffusion/blip-diffusion.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/blip-diffusion/blip_diffusion_controlnet_canny.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/models/BLIP-Diffusion/blip-diffusion.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/blip-diffusion/blip_diffusion_controlnet_depth.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/models/BLIP-Diffusion/blip-diffusion-openimage.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/blip-diffusion/blip_diffusion_controlnet_hed.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/models/BLIP-Diffusion/blip-diffusion-openimage.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/img2prompt-vqa/img2prompt_vqa_base.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/projects/img2prompt/T5_large_QG.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/img2prompt-vqa/img2prompt_vqa_base.yaml | https://storage.googleapis.com/sfr-vision-language-research/BLIP/models/model_large_retrieval_coco_train2014.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/img2prompt-vqa/img2prompt_vqa_base.yaml | https://storage.googleapis.com/sfr-vision-language-research/BLIP/models/model_large_caption_coco_train2014.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/pnp-vqa/pnp_vqa_3b.yaml | https://storage.googleapis.com/sfr-vision-language-research/BLIP/models/model_large_retrieval_coco_train2014.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/pnp-vqa/pnp_vqa_3b.yaml | https://storage.googleapis.com/sfr-vision-language-research/BLIP/models/model_large_caption_coco_train2014.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/pnp-vqa/pnp_vqa_base.yaml | https://storage.googleapis.com/sfr-vision-language-research/BLIP/models/model_large_retrieval_coco_train2014.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/pnp-vqa/pnp_vqa_base.yaml | https://storage.googleapis.com/sfr-vision-language-research/BLIP/models/model_large_caption_coco_train2014.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/pnp-vqa/pnp_vqa_large.yaml | https://storage.googleapis.com/sfr-vision-language-research/BLIP/models/model_large_retrieval_coco_train2014.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/configs/models/pnp-vqa/pnp_vqa_large.yaml | https://storage.googleapis.com/sfr-vision-language-research/BLIP/models/model_large_caption_coco_train2014.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/datasets/download_scripts/download_coco.py | http://images.cocodataset.org/zips/val2014.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/datasets/download_scripts/download_coco.py | http://images.cocodataset.org/zips/train2014.zip | 数据集地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/datasets/download_scripts/download_coco.py | http://images.cocodataset.org/zips/test2015.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/datasets/download_scripts/download_coco.py | http://images.cocodataset.org/zips/test2014.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/datasets/download_scripts/download_flickr.py | https://www.kaggle.com/datasets/hsankesara/flickr-image-dataset | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/datasets/download_scripts/download_gqa.py | https://downloads.cs.stanford.edu/nlp/data/gqa/images.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/datasets/download_scripts/download_msrvtt.py | https://download2295.mediafire.com/4bb7p74xrbgg/x3rrbe4hwp04e6w/train_val_videos.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/datasets/download_scripts/download_msrvtt.py | https://download2390.mediafire.com/79hfq3592lqg/czh8sezbo9s4692/test_videos.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/datasets/download_scripts/download_msvd.py | https://www.cs.utexas.edu/users/ml/clamp/videoDescription/YouTubeClips.tar | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/datasets/download_scripts/download_nocaps.py | https://nocaps.s3.amazonaws.com/nocaps_val_4500_captions.json | 模型参数相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/datasets/download_scripts/download_nocaps.py | https://s3.amazonaws.com/nocaps/nocaps_test_image_info.json | 模型参数相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/datasets/download_scripts/download_vg.py | https://cs.stanford.edu/people/rak248/VG_100K_2/images2.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/datasets/download_scripts/download_vg.py | https://cs.stanford.edu/people/rak248/VG_100K_2/images.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/models/blip2_models/modeling_llama.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/models/blip2_models/modeling_llama.py | https://arxiv.org/abs/1910.13461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/models/blip2_models/modeling_opt.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/models/blip2_models/modeling_opt.py | https://arxiv.org/abs/1910.13461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/models/blip2_models/modeling_t5.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/models/blip2_models/modeling_t5.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/models/blip2_models/modeling_t5.py | https://arxiv.org/abs/1910.10683 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/models/clip_models/pretrained.py | https://openaipublic.azureedge.net/clip/models/be1cfb55d75a9666199fb2206c106743da0f6468c9d327f3e0d0a543a9919d9c/RN50x64.pt | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/models/clip_models/pretrained.py | https://openaipublic.azureedge.net/clip/models/b8cca3fd41ae0c99ba7e8951adf17d267cdb84cd88be6f7c2e0eca1737a03836/ViT-L-14.pt | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/models/clip_models/pretrained.py | https://openaipublic.azureedge.net/clip/models/afeb0e10f9e5a86da6080e35cf09123aca3b358a0c3e3b6c78a7b63bc04b6762/RN50.pt | 模型地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/models/clip_models/pretrained.py | https://openaipublic.azureedge.net/clip/models/afeb0e10f9e5a86da6080e35cf09123aca3b358a0c3e3b6c78a7b63bc04b6762/RN50.pt | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/models/clip_models/pretrained.py | https://openaipublic.azureedge.net/clip/models/8fa8567bab74a42d41c5915025a8e4538c3bdbe8804a470a72f30b0d94fab599/RN101.pt | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/models/clip_models/pretrained.py | https://openaipublic.azureedge.net/clip/models/8fa8567bab74a42d41c5915025a8e4538c3bdbe8804a470a72f30b0d94fab599/RN101.pt | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/models/clip_models/pretrained.py | https://openaipublic.azureedge.net/clip/models/7e526bd135e493cef0776de27d5f42653e6b4c8bf9e0f653bb11773263205fdd/RN50x4.pt | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/models/clip_models/pretrained.py | https://openaipublic.azureedge.net/clip/models/5806e77cd80f8b59890b7e101eabd078d9fb84e6937f9e85e4ecb61988df416f/ViT-B-16.pt | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/models/clip_models/pretrained.py | https://openaipublic.azureedge.net/clip/models/52378b407f34354e150460fe41077663dd5b39c54cd0bfd2b27167a4a06ec9aa/RN50x16.pt | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/models/clip_models/pretrained.py | https://openaipublic.azureedge.net/clip/models/40d365715913c9da98579312b702a82c18be219cc2a73407c4526f58eba950af/ViT-B-32.pt | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/models/clip_models/pretrained.py | https://openaipublic.azureedge.net/clip/models/40d365715913c9da98579312b702a82c18be219cc2a73407c4526f58eba950af/ViT-B-32.pt | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/models/clip_models/pretrained.py | https://openaipublic.azureedge.net/clip/models/3035c92b350959924f9f00213499208652fc7ea050643e8b385c2dac08641f02/ViT-L-14-336px.pt | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/models/clip_vit.py | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/models/BLIP2/clip_vit_L.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/models/eva_vit.py | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/models/BLIP2/eva_vit_g.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/models/img2prompt_models/img2prompt_vqa.py | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/projects/img2prompt/T5_large_QG.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/models/vit.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_224-b5f2ef4d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/projects/blip2/train/pretrain_stage2.yaml | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/models/BLIP2/blip2_pretrained.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/tasks/captioning.py | https://storage.googleapis.com/sfr-vision-language-research/datasets/coco_karpathy_val_gt.json | 模型参数相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/tasks/captioning.py | https://storage.googleapis.com/sfr-vision-language-research/datasets/coco_karpathy_test_gt.json | 模型参数相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/tasks/dialogue.py | https://storage.googleapis.com/sfr-vision-language-research/datasets/coco_karpathy_val_gt.json | 模型参数相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/lavis/tasks/dialogue.py | 
https://storage.googleapis.com/sfr-vision-language-research/datasets/coco_karpathy_test_gt.json | 模型参数相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LAVIS/setup.py | https://download.pytorch.org/whl/torch_stable.html | 三方库链接 | \ No newline at end of file diff --git a/PyTorch/built-in/mm/LLaVA/public_address_statement.md b/PyTorch/built-in/mm/LLaVA/public_address_statement.md index 84be5c6eaffbf1062f07121b174809529475dbf8..775ec38dbb3125eeaf561ac305ecbac736f2bdc9 100644 --- a/PyTorch/built-in/mm/LLaVA/public_address_statement.md +++ b/PyTorch/built-in/mm/LLaVA/public_address_statement.md @@ -1,16 +1,16 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ------- |--------------------------------------------------------------------------------------------------------------------------------------------------------|-----------------------------------------------|------------------------|-----------------------------| -| 开源代码引入 | https://github.com/haotian-liu/LLaVA/blob/main/llava/eval/m4c_evaluator.py | llava/eval/m4c_evaluator.py | https://github.com/facebookresearch/mmf/blob/c46b3b3391275b4181567db80943473a89ab98ab/pythia/tasks/processors.py#L897 | 代码实现参考连接 | -| 开源代码引入 | https://github.com/haotian-liu/LLaVA/blob/main/llava/eval/m4c_evaluator.py | llava/eval/m4c_evaluator.py | https://github.com/ronghanghu/coco-caption | 数据集下载 | -| 开源代码引入 | https://github.com/haotian-liu/LLaVA/blob/main/llava/eval/m4c_evaluator.py | llava/eval/m4c_evaluator.py | https://github.com/tylin/coco-caption | 数据集下载 | -| 开源代码引入 | https://github.com/haotian-liu/LLaVA/blob/main/llava/model/multimodal_encoder/clip_encoder.py | llava/model/multimodal_encoder/clip_encoder.py | https://github.com/bfshi/scaling_on_scales.git | s2wrapper包安装地址 | -| 开源代码引入 | https://github.com/haotian-liu/LLaVA/blob/main/llava/serve/gradio_web_server.py
https://github.com/haotian-liu/LLaVA/blob/main/llava/model/builder.py
https://github.com/haotian-liu/LLaVA/blob/main/docs/macOS.md| llava/serve/gradio_web_server.py
llava/model/builder.py
docs/macOS.md | https://github.com/haotian-liu/LLaVA | llava开源地址 | -| 开源代码引入 | https://github.com/haotian-liu/LLaVA/blob/main/llava/train/llama_flash_attn_monkey_patch.py | llava/train/llama_flash_attn_monkey_patch.py | https://github.com/HazyResearch/flash-attention | flashattention官方实现 | -| 开源代码引入 | https://github.com/haotian-liu/LLaVA/blob/main/llava/utils.py | llava/utils.py | https://api.openai.com/v1/moderations. | OpenAI moderation API | -| 开源代码引入 | https://github.com/haotian-liu/LLaVA/blob/main/llava/train/train.py | llava/train/train.py | https://github.com/lm-sys/FastChat | FastChat github链接 | -| 开源代码引入 | https://github.com/haotian-liu/LLaVA/blob/main/llava/train/llama_xformers_attn_monkey_patch.py | llava/train/llama_xformers_attn_monkey_patch.py | https://raw.githubusercontent.com/oobabooga/text-generation-webui/main/modules/llama_attn_hijack.py | text-generation-webui项目连接 | -| 开源代码引入 | https://github.com/haotian-liu/LLaVA/blob/main/llava/serve/gradio_web_server.py
https://github.com/haotian-liu/LLaVA/blob/main/predict.py | llava/serve/gradio_web_server.py
predict.py | https://github.com/facebookresearch/llama/blob/main/MODEL_CARD.md | llama模型readme | -| 开源代码引入 | https://github.com/haotian-liu/LLaVA/blob/main/docs/Windows.md | docs/Windows.md | https://learn.microsoft.com/en-us/windows/wsl/install | wsl2安装地址 | -| 开源代码引入 | https://github.com/haotian-liu/LLaVA/blob/main/docs/ScienceQA.md | docs/ScienceQA.md | https://github.com/lupantech/ScienceQA | sqa | -| 开源代码引入 | https://github.com/haotian-liu/LLaVA/blob/main/docs/MODEL_ZOO.md | docs/MODEL_ZOO.md | https://api.wandb.ai/links/lht/6orh56wc | wandb | -| 开源代码引入 | https://github.com/haotian-liu/LLaVA/blob/main/docs/MODEL_ZOO.md | docs/MODEL_ZOO.md | https://huggingface.co/liuhaotian/ | 模型权重地址 | +| 文件位置 | 公网地址 | 公网地址用途 | +|-----------------------------------------------------------------------------|---------------------------------------------------------------------------|-----------| +| ModelZoo-PyTorch/PyTorch/built-in/mm/LLaVA/llava/eval/webpage/index.html | https://openai.com | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LLaVA/llava/eval/webpage/index.html | https://maxcdn.bootstrapcdn.com/bootstrap/4.5.2/js/bootstrap.min.js | html相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LLaVA/llava/eval/webpage/index.html | https://code.jquery.com/jquery-3.5.1.slim.min.js | html相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LLaVA/llava/eval/webpage/index.html | https://cdn.jsdelivr.net/npm/marked@4.3.0/lib/marked.umd.min.js | html相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LLaVA/llava/eval/webpage/index.html | https://cdn.jsdelivr.net/npm/@popperjs/core@2.11.6/dist/umd/popper.min.js | html相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LLaVA/llava/eval/webpage/index.html | https://maxcdn.bootstrapcdn.com/bootstrap/4.5.2/css/bootstrap.min.css | html相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LLaVA/llava/eval/webpage/index.html | https://fonts.googleapis.com/icon?family=Material+Icons | html相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LLaVA/llava/eval/webpage/index.html | https://vicuna.lmsys.org | html相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LLaVA/llava/eval/webpage/index.html | https://chat.lmsys.org/ | html相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LLaVA/llava/serve/gradio_web_server.py | https://github.com/haotian-liu/LLaVA | 源码地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LLaVA/llava/serve/gradio_web_server.py | https://github.com/haotian-liu/LLaVA/blob/main/docs/MODEL_ZOO.md | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LLaVA/llava/serve/gradio_web_server.py | https://arxiv.org/abs/2304.08485 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LLaVA/llava/serve/gradio_web_server.py | https://llava-vl.github.io/blog/2024-01-30-llava-1-6/ | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/LLaVA/llava/utils.py | https://api.openai.com/v1/moderations | 模型相关说明 | \ No newline at end of file diff --git a/PyTorch/built-in/mm/Magvit2/public_address_statement.md b/PyTorch/built-in/mm/Magvit2/public_address_statement.md index 6b7f15b278b3418078b082a300cbbe74ab26259e..c4e2205cae5ecd3c41a5af75dd7020d457b5b5cc 100644 --- a/PyTorch/built-in/mm/Magvit2/public_address_statement.md +++ b/PyTorch/built-in/mm/Magvit2/public_address_statement.md @@ -1,9 +1,3 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ------- |-----------------------------------------------------------------------|-----------------------------------------|------------------------|----------| -| 开源代码引入 | https://github.com/lucidrains/magvit2-pytorch/blob/main/setup.py | .\setup.py | lucidrains@gmail.com | 作者邮箱 | 
-| 开源代码引入 | https://github.com/lucidrains/magvit2-pytorch/blob/main/.gitignore | .\.gitignore | https://python-poetry.org/docs/basic-usage/#commit-your-poetrylock-file-to-version-control | 文档地址 | -| 开源代码引入 | https://github.com/lucidrains/magvit2-pytorch/blob/main/.gitignore | .\.gitignore | https://pdm.fming.dev/#use-with-ide | 文档地址 | -| 开源代码引入 | https://github.com/lucidrains/magvit2-pytorch/blob/main/.gitignore | .\.gitignore | https://github.com/github/gitignore/blob/main/Global/JetBrains.gitignore | 文档地址 | -| 开源代码引入 | https://github.com/lucidrains/magvit2-pytorch/blob/main/magvit2_pytorch/magvit2_pytorch.py | .\magvit2_pytorch\magvit2_pytorch.py | https://arxiv.org/abs/2012.13375 | 参考论文地址 | -| 开源代码引入 | https://github.com/lucidrains/magvit2-pytorch/blob/main/magvit2_pytorch/magvit2_pytorch.py | .\magvit2_pytorch\magvit2_pytorch.py | https://arxiv.org/abs/2106.09681 | 参考论文地址 | - +| 文件位置 | 公网地址 | 公网地址用途 | +|-------------------------------------------------------|----------------------|---------| +| ModelZoo-PyTorch/PyTorch/built-in/mm/Magvit2/setup.py | lucidrains@gmail.com | 作者邮箱 | \ No newline at end of file diff --git a/PyTorch/built-in/mm/MiniCPM-V/public_address_statement.md b/PyTorch/built-in/mm/MiniCPM-V/public_address_statement.md index 395e94af3e6f9b259d86b480750f92c3fa84eade..43bbcf5fe7d54f9c369679bf61e8927823254078 100644 --- a/PyTorch/built-in/mm/MiniCPM-V/public_address_statement.md +++ b/PyTorch/built-in/mm/MiniCPM-V/public_address_statement.md @@ -1,14 +1,9 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ------------ | ------------------------------------------------------------ | ------------------------------------------------------------ | ------------------------------------------------------------ | ------------ | -| 开源代码引入 | https://github.com/OpenBMB/MiniCPM-V/blob/main/eval_mm/vlmevalkit/vlmeval/evaluate/vqa_eval.py | MiniCPM-V/eval_mm/vlmevalkit/vlmeval/evaluate/vqa_eval.py | https://github.com/GT-Vision-Lab/VQA | 模型相关说明 | -| 开源代码引入 | https://github.com/OpenBMB/MiniCPM-V/blob/main/eval_mm/vlmevalkit/vlmeval/evaluate/vqa_eval.py | MiniCPM-V/eval_mm/vlmevalkit/vlmeval/evaluate/vqa_eval.py | https://github.com/google-research/pix2struct/blob/main/pix2struct/metrics.py#L81 | 源码实现 | -| 开源代码引入 | https://github.com/OpenBMB/MiniCPM-V/blob/main/eval_mm/vlmevalkit/vlmeval/evaluate/vqa_eval.py | MiniCPM-V/eval_mm/vlmevalkit/vlmeval/evaluate/vqa_eval.py | https://arxiv.org/pdf/2203.10244.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/OpenBMB/MiniCPM-V/blob/main/eval_mm/vlmevalkit/vlmeval/api/gpt.py | MiniCPM-V/eval_mm/vlmevalkit/vlmeval/api/gpt.py | https://api.openai.com/v1/chat/completions | 模型相关说明 | -| 开源代码引入 | https://github.com/OpenBMB/MiniCPM-V/blob/main/eval_mm/vlmevalkit/vlmeval/api/gpt_int.py | MiniCPM-V/eval_mm/vlmevalkit/vlmeval/api/gpt_int.py | http://ecs.sv.us.alles-apin.openxlab.org.cn/v1/openai/v2/text/chat | 模型相关说明 | -| 开源代码引入 | https://github.com/OpenBMB/MiniCPM-V/blob/main/eval_mm/vlmevalkit/script/run_inference.sh | MiniCPM-V/eval_mm/vlmevalkit/script/run_inference.sh | https://hf-mirror.com | 模型相关说明 | -| 开源代码引入 | https://github.com/OpenBMB/MiniCPM-V/blob/main/omnilmm/utils.py | MiniCPM-V/omnilmm/utils.py | https://api.openai.com/v1/moderations | 模型相关说明 | -| 开源代码引入 | https://github.com/OpenBMB/MiniCPM-V/blob/main/omnilmm/model/resampler.py | MiniCPM-V/omnilmm/model/resampler.py | https://github.com/facebookresearch/mae/blob/efb2a8062c206524e35e47d04501ed4f544c0ae8/util/pos_embed.py#L20 | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/v4.44.0/src/transformers/models/idefics2/modeling_idefics2.py | MiniCPM-V/npu_patch/idefics2_conv_monkey_patch.py | https://arxiv.org/abs/2307.06304 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.44.0/src/transformers/models/llama/modeling_llama.py | MiniCPM-V/npu_patch/llama_flash_attn_monkey_patch.py | https://github.com/Dao-AILab/flash-attention/releases/tag/v2.1.0 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.44.0/src/transformers/models/idefics2/modeling_idefics2.py | MiniCPM-V/npu_patch/idefics2_flash_attn_monkey_patch.py | https://github.com/Dao-AILab/flash-attention/releases/tag/v2.1.0 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.44.0/src/transformers/models/llama/modeling_llama.py | MiniCPM-V/npu_patch/llama_rope_monkey_patch.py | https://github.com/huggingface/transformers/pull/29285 | 模型相关说明 | +| 文件位置 | 公网地址 | 公网地址用途 | +|-------------------------------------------------------------------------------------------|--------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/built-in/mm/MiniCPM-V/eval_mm/vlmevalkit/script/run_inference.sh | https://hf-mirror.com | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/MiniCPM-V/eval_mm/vlmevalkit/vlmeval/api/gpt.py | https://api.openai.com/v1/chat/completions | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/MiniCPM-V/eval_mm/vlmevalkit/vlmeval/api/gpt_int.py | http://ecs.sv.us.alles-apin.openxlab.org.cn/v1/openai/v2/text/chat | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/MiniCPM-V/omnilmm/utils.py | https://api.openai.com/v1/moderations | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/MiniCPM-V/web_demo_2.6.py | http://thunlp.oss-cn-qingdao.aliyuncs.com/multi_modal/never_delete/m_bear2.gif | 图片示例 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/MiniCPM-V/web_demo_2.6.py | http://thunlp.oss-cn-qingdao.aliyuncs.com/multi_modal/never_delete/fshot.gif | 图片示例 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/MiniCPM-V/web_demo_2.6.py | http://thunlp.oss-cn-qingdao.aliyuncs.com/multi_modal/never_delete/video2.gif | 图片示例 | \ No newline at end of file diff --git a/PyTorch/built-in/mm/MiniGPT-4/public_address_statement.md b/PyTorch/built-in/mm/MiniGPT-4/public_address_statement.md index ac224d1acc1ffe2dd16be1d09a75cb99e9516acd..676c498b5e57d678be2382e4649cb8f1f5bd5c70 100644 --- a/PyTorch/built-in/mm/MiniGPT-4/public_address_statement.md +++ b/PyTorch/built-in/mm/MiniGPT-4/public_address_statement.md @@ -1,37 +1,7 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ---- | ------------ | ------ | ------------------------------------ | -------- | -| 开源代码引入 | https://github.com/Vision-CAIR/MiniGPT-4/blob/master/README.md | MiniGPT-4/demo.py | https://minigpt-4.github.io | 参考论文地址 | -| 开源代码引入 | https://github.com/Vision-CAIR/MiniGPT-4/blob/master/README.md | MiniGPT-4/demo.py | https://img.shields.io/badge/Project-Page-Green | 参考论文地址 | -| 开源代码引入 | https://github.com/Vision-CAIR/MiniGPT-4/blob/master/README.md | MiniGPT-4/demo.py | https://github.com/Vision-CAIR/MiniGPT-4 | 参考论文地址 | -| 开源代码引入 | https://github.com/Vision-CAIR/MiniGPT-4/blob/master/demo.py | MiniGPT-4/demo.py | https://img.shields.io/badge/Github-Code-blue | 参考论文地址 | -| 开源代码引入 | https://github.com/Vision-CAIR/MiniGPT-4/blob/master/demo.py | MiniGPT-4/demo.py | https://raw.githubusercontent.com/Vision-CAIR/MiniGPT-4/main/MiniGPT_4.pdf | 参考论文地址 | -| 开源代码引入 | 
https://github.com/Vision-CAIR/MiniGPT-4/blob/master/demo.py | MiniGPT-4/demo.py | https://img.shields.io/badge/Paper-PDF-red | 参考论文地址 | -| 开源代码引入 | https://github.com/Vision-CAIR/MiniGPT-4/blob/master/minigpt4/models/Qformer.py | MiniGPT-4/transformers_modify/utils.py | https://github.com/huggingface/transformers/pull/21405 | 源码实现 | -| 开源代码引入 | https://github.com/Vision-CAIR/MiniGPT-4/blob/master/minigpt4/models/Qformer.py | MiniGPT-4/transformers_modify/utils.py | https://github.com/huggingface/transformers/pull/5420/files | 源码实现 | -| 开源代码引入 | https://github.com/Vision-CAIR/MiniGPT-4/blob/master/minigpt4/models/Qformer.py | MiniGPT-4/transformers_modify/utils.py | https://github.com/huggingface/transformers/pull/5420/files | 源码实现 | -| 开源代码引入 | https://github.com/Vision-CAIR/MiniGPT-4/blob/master/minigpt4/models/Qformer.py | MiniGPT-4/transformers_modify/utils.py | https://arxiv.org/abs/2010.00904 | 模型相关说明 | -| 开源代码引入 | https://github.com/Vision-CAIR/MiniGPT-4/blob/master/README.md | MiniGPT-4/transformers_modify/utils.py | https://huggingface.co/docs/transformers/main_classes/text_generation | 模型相关说明 | -| 开源代码引入 | https://github.com/Vision-CAIR/MiniGPT-4/blob/master/README.md | MiniGPT-4/transformers_modify/utils.py | https://huggingface.co/docs/transformers/main/en/main_classes/text_generation | 模型相关说明 | -| 开源代码引入 | https://github.com/Vision-CAIR/MiniGPT-4/blob/master/minigpt4/models/Qformer.py | MiniGPT-4/transformers_modify/utils.py | https://github.com/huggingface/transformers/pull/5420#discussion_r449779867 | 源码实现 | -| 开源代码引入 | https://github.com/Vision-CAIR/MiniGPT-4/blob/master/minigpt4/models/Qformer.py | MiniGPT-4/transformers_modify/utils.py | http://arxiv.org/abs/1904.09751 | 模型相关说明 | -| 开源代码引入 | https://github.com/Vision-CAIR/MiniGPT-4/blob/master/minigpt4/common/utils.py|MiniGPT-4/minigpt4/common/utils.py | https://github.com/pytorch/vision | 源码实现 | -| 开源代码引入 | https://github.com/Vision-CAIR/MiniGPT-4/blob/master/minigpt4/common/utils.py|MiniGPT-4/minigpt4/common/utils.py | https://github.com/facebookresearch/vissl/blob/main/vissl/utils/download.py | 源码实现 | -| 开源代码引入 | https://github.com/Vision-CAIR/MiniGPT-4/blob/master/minigpt4/common/utils.py|MiniGPT-4/minigpt4/common/utils.py | https://drive.google.com/file/d/137RyRjvTBkBiIfeYBNZBtViDHQ6_Ewsp/view | 模型相关说明 | -| 开源代码引入 | https://github.com/Vision-CAIR/MiniGPT-4/blob/master/minigpt4/common/utils.py|MiniGPT-4/minigpt4/common/utils.py | https://drive.google.com/uc?export=download&id=137RyRjvTBkBiIfeYBNZBtViDHQ6_Ewsp | 模型相关说明 | -| 开源代码引入 | https://github.com/Vision-CAIR/MiniGPT-4/blob/master/minigpt4/common/utils.py|MiniGPT-4/minigpt4/common/utils.py | https://drive.google.com/uc?export=download&id= | 模型相关说明 | -| 开源代码引入 | https://github.com/Vision-CAIR/MiniGPT-4/blob/master/minigpt4/models/eva_vit.py|MiniGPT-4/minigpt4/models/eva_vit.py | https://github.com/baaivision/EVA | 源码实现 | -| 开源代码引入 | https://github.com/Vision-CAIR/MiniGPT-4/blob/master/minigpt4/models/eva_vit.py|MiniGPT-4/minigpt4/models/eva_vit.py | https://github.com/rwightman/pytorch-image-models/tree/master/timm | 源码实现 | -| 开源代码引入 | https://github.com/Vision-CAIR/MiniGPT-4/blob/master/minigpt4/models/eva_vit.py|MiniGPT-4/minigpt4/models/eva_vit.py | https://github.com/microsoft/unilm/tree/master/beit | 源码实现 | -| 开源代码引入 | https://github.com/Vision-CAIR/MiniGPT-4/blob/master/minigpt4/models/eva_vit.py|MiniGPT-4/minigpt4/models/eva_vit.py | https://github.com/facebookresearch/deit/ | 源码实现 | -| 开源代码引入 | 
https://github.com/Vision-CAIR/MiniGPT-4/blob/master/minigpt4/models/eva_vit.py|MiniGPT-4/minigpt4/models/eva_vit.py | https://github.com/facebookresearch/dino | 源码实现 | -| 开源代码引入 | https://github.com/Vision-CAIR/MiniGPT-4/blob/master/minigpt4/models/minigpt4.py|MiniGPT-4/minigpt4/models/mini_gpt4.py | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/models/BLIP2/blip2_pretrained_flant5xxl.pth | 预训练模型 | -| 开源代码引入 | https://github.com/Vision-CAIR/MiniGPT-4/blob/master/minigpt4/models/Qformer.py | MiniGPT-4/minigpt4/models/modeling_llama.py | https://github.com/huggingface/transformers/blob/main/src/transformers/models/llama/modeling_llama.py | 源码实现 | -| 开源代码引入 | https://github.com/Vision-CAIR/MiniGPT-4/blob/master/minigpt4/models/Qformer.py|MiniGPT-4/minigpt4/models/Qformer.py | https://github.com/huggingface/transformers/blob/v4.15.0/src/transformers/models/bert | 源码实现 | -| 开源代码引入 | https://github.com/Vision-CAIR/MiniGPT-4/blob/master/minigpt4/models/minigpt4.py|MiniGPT-4/minigpt4/models/mini_gpt4.py | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/models/BLIP2/blip2_pretrained_flant5xxl.pth | 预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/mmaction/models/losses/cross_entropy_loss.py | MiniGPT-4/minigpt4/models/modeling_llama.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/Vision-CAIR/MiniGPT-4/blob/master/minigpt4/models/eva_vit.py|MiniGPT-4/minigpt4/models/eva_vit.py | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/models/BLIP2/eva_vit_g.pth | 预训练模型 | -| 开源代码引入 | https://github.com/Vision-CAIR/MiniGPT-4/blob/master/minigpt4/models/Qformer.py | MiniGPT-4/minigpt4/models/modeling_llama.py | https://arxiv.org/abs/1910.13461 | 模型相关说明 | -| 开源代码引入 | https://github.com/Vision-CAIR/MiniGPT-4/blob/master/minigpt4/models/Qformer.py|MiniGPT-4/minigpt4/models/Qformer.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/Vision-CAIR/MiniGPT-4/blob/master/minigpt4/models/Qformer.py|MiniGPT-4/minigpt4/models/Qformer.py | https://arxiv.org/abs/1706.03762 | 模型相关说明 | -| 开源代码引入 | https://github.com/Vision-CAIR/MiniGPT-4/blob/master/minigpt4/datasets/datasets/dataloader_utils.py|MiniGPT-4/minigpt4/datasets/datasets/dataloader_utils.py | https://github.com/ChenRocks/UNITER | 源码实现 | -| 开源代码引入 | https://github.com/Vision-CAIR/MiniGPT-4/blob/master/minigpt4/datasets/datasets/dataloader_utils.py|MiniGPT-4/minigpt4/datasets/datasets/dataloader_utils.py | https://github.com/open-mmlab/mmcv/blob/master/mmcv/runner/iter_based_runner.py | 源码实现 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|----------------------------------------------------------------------------------|---------------------------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/built-in/mm/MiniGPT-4/minigpt4/models/eva_vit.py | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/models/BLIP2/eva_vit_g.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/MiniGPT-4/minigpt4/models/mini_gpt4.py | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/models/BLIP2/blip2_pretrained_flant5xxl.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/MiniGPT-4/minigpt4/models/mini_gpt4.py | https://storage.googleapis.com/sfr-vision-language-research/LAVIS/models/BLIP2/blip2_pretrained_flant5xxl.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/mm/MiniGPT-4/minigpt4/models/modeling_llama.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/MiniGPT-4/minigpt4/models/modeling_llama.py | https://arxiv.org/abs/1910.13461 | 论文地址 | \ No newline at end of file diff --git a/PyTorch/built-in/mm/OpenCLIP_for_PyTorch/public_address_statement.md b/PyTorch/built-in/mm/OpenCLIP_for_PyTorch/public_address_statement.md index 2a188e251f7d1e2d3f9671ea79c50d8a01edb8e8..f09b668c3d93fad41dd90f6d43df7e87ec2cdb7c 100644 --- a/PyTorch/built-in/mm/OpenCLIP_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/mm/OpenCLIP_for_PyTorch/public_address_statement.md @@ -1,54 +1,14 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ------------ | ------------------------------------------------------------------------------- |--------------------------| ------------------------------------------------- |----------------------------| -| 开源代码引入 | https://github.com/mlfoundations/open_clip/blob/main/setup.py | ./setup.py | https://github.com/mlfoundations/open_clip | setuptools的setup函数的url参数入参 | -| 开源代码引入 | https://github.com/mlfoundations/open_clip/blob/main/src/open_clip/pretrained.py | ./src/open_clip/setup.py | https://openaipublic.azureedge.net/clip/models/afeb0e10f9e5a86da6080e35cf09123aca3b358a0c3e3b6c78a7b63bc04b6762/RN50.pt | 开源预训练模型下载链接 | -| 开源代码引入 | https://github.com/mlfoundations/open_clip/blob/main/src/open_clip/pretrained.py | ./src/open_clip/setup.py | https://github.com/mlfoundations/open_clip/releases/download/v0.2-weights/rn50-quickgelu-yfcc15m-455df137.pt | 开源预训练模型下载链接 | -| 开源代码引入 | https://github.com/mlfoundations/open_clip/blob/main/src/open_clip/pretrained.py | ./src/open_clip/setup.py | https://github.com/mlfoundations/open_clip/releases/download/v0.2-weights/rn50-quickgelu-cc12m-f000538c.pt | 开源预训练模型下载链接 | -| 开源代码引入 | https://github.com/mlfoundations/open_clip/blob/main/src/open_clip/pretrained.py | ./src/open_clip/setup.py | https://openaipublic.azureedge.net/clip/models/8fa8567bab74a42d41c5915025a8e4538c3bdbe8804a470a72f30b0d94fab599/RN101.pt | 开源预训练模型下载链接 | -| 开源代码引入 | https://github.com/mlfoundations/open_clip/blob/main/src/open_clip/pretrained.py | ./src/open_clip/setup.py | https://github.com/mlfoundations/open_clip/releases/download/v0.2-weights/rn101-quickgelu-yfcc15m-3e04b30e.pt | 开源预训练模型下载链接 | -| 开源代码引入 | https://github.com/mlfoundations/open_clip/blob/main/src/open_clip/pretrained.py | ./src/open_clip/setup.py | https://openaipublic.azureedge.net/clip/models/7e526bd135e493cef0776de27d5f42653e6b4c8bf9e0f653bb11773263205fdd/RN50x4.pt | 开源预训练模型下载链接 | -| 开源代码引入 | https://github.com/mlfoundations/open_clip/blob/main/src/open_clip/pretrained.py | ./src/open_clip/setup.py | https://openaipublic.azureedge.net/clip/models/52378b407f34354e150460fe41077663dd5b39c54cd0bfd2b27167a4a06ec9aa/RN50x16.pt | 开源预训练模型下载链接 | -| 开源代码引入 | https://github.com/mlfoundations/open_clip/blob/main/src/open_clip/pretrained.py | ./src/open_clip/setup.py | https://openaipublic.azureedge.net/clip/models/be1cfb55d75a9666199fb2206c106743da0f6468c9d327f3e0d0a543a9919d9c/RN50x64.pt | 开源预训练模型下载链接 | -| 开源代码引入 | https://github.com/mlfoundations/open_clip/blob/main/src/open_clip/pretrained.py | ./src/open_clip/setup.py | https://openaipublic.azureedge.net/clip/models/40d365715913c9da98579312b702a82c18be219cc2a73407c4526f58eba950af/ViT-B-32.pt | 开源预训练模型下载链接 | -| 开源代码引入 | https://github.com/mlfoundations/open_clip/blob/main/src/open_clip/pretrained.py | 
./src/open_clip/setup.py | https://github.com/mlfoundations/open_clip/releases/download/v0.2-weights/vit_b_32-quickgelu-laion400m_e31-d867053b.pt | 开源预训练模型下载链接 | -| 开源代码引入 | https://github.com/mlfoundations/open_clip/blob/main/src/open_clip/pretrained.py | ./src/open_clip/setup.py | https://github.com/mlfoundations/open_clip/releases/download/v0.2-weights/vit_b_32-quickgelu-laion400m_e32-46683a32.pt | 开源预训练模型下载链接 | -| 开源代码引入 | https://github.com/mlfoundations/open_clip/blob/main/src/open_clip/pretrained.py | ./src/open_clip/setup.py | https://github.com/mlfoundations/open_clip/releases/download/v0.2-weights/vit_b_32-laion2b_e16-af8dbd0c.pth | 开源预训练模型下载链接 | -| 开源代码引入 | https://github.com/mlfoundations/open_clip/blob/main/src/open_clip/pretrained.py | ./src/open_clip/setup.py | https://openaipublic.azureedge.net/clip/models/5806e77cd80f8b59890b7e101eabd078d9fb84e6937f9e85e4ecb61988df416f/ViT-B-16.pt | 开源预训练模型下载链接 | -| 开源代码引入 | https://github.com/mlfoundations/open_clip/blob/main/src/open_clip/pretrained.py | ./src/open_clip/setup.py | https://github.com/mlfoundations/open_clip/releases/download/v0.2-weights/vit_b_16-laion400m_e31-00efa78f.pt | 开源预训练模型下载链接 | -| 开源代码引入 | https://github.com/mlfoundations/open_clip/blob/main/src/open_clip/pretrained.py | ./src/open_clip/setup.py | https://github.com/mlfoundations/open_clip/releases/download/v0.2-weights/vit_b_16-laion400m_e32-55e67d44.pt | 开源预训练模型下载链接 | -| 开源代码引入 | https://github.com/mlfoundations/open_clip/blob/main/src/open_clip/pretrained.py | ./src/open_clip/setup.py | https://github.com/mlfoundations/open_clip/releases/download/v0.2-weights/vit_b_16_plus_240-laion400m_e31-8fb26589.pt | 开源预训练模型下载链接 | -| 开源代码引入 | https://github.com/mlfoundations/open_clip/blob/main/src/open_clip/pretrained.py | ./src/open_clip/setup.py | https://github.com/mlfoundations/open_clip/releases/download/v0.2-weights/vit_b_16_plus_240-laion400m_e32-699c4b84.pt | 开源预训练模型下载链接 | -| 开源代码引入 | https://github.com/mlfoundations/open_clip/blob/main/src/open_clip/pretrained.py | ./src/open_clip/setup.py | https://openaipublic.azureedge.net/clip/models/b8cca3fd41ae0c99ba7e8951adf17d267cdb84cd88be6f7c2e0eca1737a03836/ViT-L-14.pt | 开源预训练模型下载链接 | -| 开源代码引入 | https://github.com/mlfoundations/open_clip/blob/main/src/open_clip/pretrained.py | ./src/open_clip/setup.py | https://github.com/mlfoundations/open_clip/releases/download/v0.2-weights/vit_l_14-laion400m_e31-69988bb6.pt | 开源预训练模型下载链接 | -| 开源代码引入 | https://github.com/mlfoundations/open_clip/blob/main/src/open_clip/pretrained.py | ./src/open_clip/setup.py | https://github.com/mlfoundations/open_clip/releases/download/v0.2-weights/vit_l_14-laion400m_e32-3d133497.pt | 开源预训练模型下载链接 | -| 开源代码引入 | https://github.com/mlfoundations/open_clip/blob/main/src/open_clip/pretrained.py | ./src/open_clip/setup.py | https://openaipublic.azureedge.net/clip/models/3035c92b350959924f9f00213499208652fc7ea050643e8b385c2dac08641f02/ViT-L-14-336px.pt | 开源预训练模型下载链接 | -| 开源代码引入 | https://github.com/mlfoundations/open_clip/blob/master/tests/util_test.py | ./tests/util_test.py | https://github.com/mlfoundations/open_clip/issues/219 | open_clip源码地址公网来源说明 | -| 开源代码引入 | https://github.com/mlfoundations/open_clip/blob/master/tests/test_inference.py | ./tests/test_inference.py | https://github.com/mlfoundations/open_clip/issues/219 | open_clip源码地址公网来源说明 | -| 开源代码引入 | https://github.com/mlfoundations/open_clip/blob/master/tests/test_inference.py | ./tests/test_inference.py | https://github.com/pytorch/pytorch/issues/92073 | pytorch公网地址来源说明 | -| 开源代码引入 | 
https://github.com/mlfoundations/open_clip/blob/main/src/training/params.py | ./src/training/params.py | https://arxiv.org/pdf/2103.00020.pdf | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/mlfoundations/open_clip/blob/main/src/training/main.py | ./src/training/main.py | http://www.codinghorror.com/blog/archives/001018.html | codinghorror地址公网来源说明 | -| 开源代码引入 | https://github.com/mlfoundations/open_clip/blob/main/src/training/main.py | ./src/training/main.py | https://arxiv.org/abs/2111.07991 | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/mlfoundations/open_clip/blob/main/src/training/data.py | ./src/training/data.py | https://github.com/webdataset/webdataset/issues/169 | webdataset地址公网来源说明 | -| 开源代码引入 | https://github.com/mlfoundations/open_clip/blob/main/src/open_clip/zero_shot_metadata.py | ./src/open_clip/zero_shot_metadata.py | https://github.com/openai/CLIP/blob/main/notebooks/Prompt_Engineering_for_ImageNet.ipynb | openai/CLIP地址公网来源说明 | -| 开源代码引入 | https://github.com/mlfoundations/open_clip/blob/main/src/open_clip/utils.py | ./src/open_clip/utils.py | https://github.com/pytorch/pytorch/blob/a5895f85be0f10212791145bfedc0261d364f103/torch/nn/modules/batchnorm.py#L762 | pytorch.nn.modules公网来源说明 | -| 开源代码引入 | https://github.com/mlfoundations/open_clip/blob/main/src/open_clip/transformer.py | ./src/open_clip/transformer.py | https://arxiv.org/abs/2212.00794 | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/mlfoundations/open_clip/blob/main/src/open_clip/transformer.py | ./src/open_clip/transformer.py | https://github.com/pytorch/pytorch/issues/79887#issuecomment-1161758372 | pytorch/issues公网来源说明 | -| 开源代码引入 | https://github.com/mlfoundations/open_clip/blob/main/src/open_clip/transformer.py | ./src/open_clip/transformer.py | https://arxiv.org/abs/2302.01327v1 | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/mlfoundations/open_clip/blob/main/src/open_clip/transformer.py | ./src/open_clip/transformer.py | https://github.com/pytorch/pytorch/issues/79887#issuecomment-1161758372 | pytorch/issues公网来源说明 | -| 开源代码引入 | https://github.com/mlfoundations/open_clip/blob/main/src/open_clip/tokenizer.py | ./src/open_clip/tokenizer.py | https://github.com/openai/CLIP | openai/CLIP地址公网来源说明 | -| 开源代码引入 | https://github.com/mlfoundations/open_clip/blob/main/src/open_clip/tokenizer.py | ./src/open_clip/tokenizer.py | https://stackoverflow.com/q/62691279 | stackoverflow地址公网来源说明 | -| 开源代码引入 | https://github.com/mlfoundations/open_clip/blob/main/src/open_clip/timm_model.py | ./src/open_clip/timm_model.py | https://github.com/rwightman/pytorch-image-models | 源码地址公网来源说明 | -| 开源代码引入 | https://github.com/mlfoundations/open_clip/blob/main/src/open_clip/push_to_hf_hub.py | ./src/open_clip/push_to_hf_hub.py | https://huggingface.co/openai/clip-vit-large-patch14 | huggingface.co公网来源说明 | -| 开源代码引入 | https://github.com/mlfoundations/open_clip/blob/main/src/open_clip/openai.py | ./src/open_clip/openai.py | https://github.com/openai/CLIP | openai/CLIP地址公网来源说明 | -| 开源代码引入 | https://github.com/mlfoundations/open_clip/blob/main/src/open_clip/model.py | ./src/open_clip/model.py | https://github.com/openai/CLIP | openai/CLIP地址公网来源说明 | -| 开源代码引入 | https://github.com/mlfoundations/open_clip/blob/main/src/open_clip/model.py | ./src/open_clip/model.py | https://arxiv.org/abs/2205.01580 | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/mlfoundations/open_clip/blob/main/src/open_clip/model.py | ./src/open_clip/model.py | https://arxiv.org/abs/2111.07991 | 参考论文地址公网来源说明 | -| 开源代码引入 | 
https://github.com/mlfoundations/open_clip/blob/main/src/open_clip/model.py | ./src/open_clip/model.py | https://arxiv.org/abs/2111.07991 | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/mlfoundations/open_clip/blob/main/src/open_clip/hf_model.py | ./src/open_clip/hf_model.py | https://github.com/huggingface/transformers | huggingface公网来源说明 | -| 开源代码引入 | https://github.com/mlfoundations/open_clip/blob/main/src/open_clip/hf_configs.py | ./src/open_clip/hf_configs.py | https://huggingface.co/docs/transformers/model_doc/roberta#roberta | huggingface.co模型文档公网来源说明 | -| 开源代码引入 | https://github.com/mlfoundations/open_clip/blob/main/src/open_clip/hf_configs.py | ./src/open_clip/hf_configs.py | https://huggingface.co/docs/transformers/model_doc/xlm-roberta#transformers.XLMRobertaConfig | huggingface.co模型文档公网来源说明 | -| 开源代码引入 | https://github.com/mlfoundations/open_clip/blob/main/src/open_clip/hf_configs.py | ./src/open_clip/hf_configs.py | https://huggingface.co/docs/transformers/model_doc/mt5#mt5 | huggingface.co模型文档公网来源说明 | -| 开源代码引入 | https://github.com/mlfoundations/open_clip/blob/main/src/open_clip/hf_configs.py | ./src/open_clip/hf_configs.py | https://github.com/google-research/text-to-text-transfer-transformer/issues/273 | text-to-text-transfer-transformer公网来源说明 | -| 开源代码引入 | https://github.com/mlfoundations/open_clip/blob/main/src/open_clip/hf_configs.py | ./src/open_clip/hf_configs.py | https://github.com/huggingface/transformers/blob/v4.24.0/src/transformers/models/t5/modeling_t5.py#L374 | huggingface模型文件公网来源说明 | -| 开源代码引入 | https://github.com/mlfoundations/open_clip/blob/main/src/open_clip/hf_configs.py | ./src/open_clip/hf_configs.py | https://huggingface.co/docs/transformers/model_doc/bert | huggingface.co模型文档公网来源说明 | -| 开源代码引入 | https://github.com/mlfoundations/open_clip/blob/main/src/open_clip/coca_model.py | ./src/open_clip/coca_model.py | https://huggingface.co/docs/transformers/main/en/main_classes/text_generation | huggingface.co文档公网来源说明 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|---------------------------------------------------------------------------------------|-----------------------------------------------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenCLIP_for_PyTorch/src/open_clip/pretrained.py | https://openaipublic.azureedge.net/clip/models/be1cfb55d75a9666199fb2206c106743da0f6468c9d327f3e0d0a543a9919d9c/RN50x64.pt | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenCLIP_for_PyTorch/src/open_clip/pretrained.py | https://openaipublic.azureedge.net/clip/models/b8cca3fd41ae0c99ba7e8951adf17d267cdb84cd88be6f7c2e0eca1737a03836/ViT-L-14.pt | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenCLIP_for_PyTorch/src/open_clip/pretrained.py | https://openaipublic.azureedge.net/clip/models/afeb0e10f9e5a86da6080e35cf09123aca3b358a0c3e3b6c78a7b63bc04b6762/RN50.pt | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenCLIP_for_PyTorch/src/open_clip/pretrained.py | https://openaipublic.azureedge.net/clip/models/afeb0e10f9e5a86da6080e35cf09123aca3b358a0c3e3b6c78a7b63bc04b6762/RN50.pt | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenCLIP_for_PyTorch/src/open_clip/pretrained.py | https://openaipublic.azureedge.net/clip/models/8fa8567bab74a42d41c5915025a8e4538c3bdbe8804a470a72f30b0d94fab599/RN101.pt | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenCLIP_for_PyTorch/src/open_clip/pretrained.py | 
https://openaipublic.azureedge.net/clip/models/8fa8567bab74a42d41c5915025a8e4538c3bdbe8804a470a72f30b0d94fab599/RN101.pt | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenCLIP_for_PyTorch/src/open_clip/pretrained.py | https://openaipublic.azureedge.net/clip/models/7e526bd135e493cef0776de27d5f42653e6b4c8bf9e0f653bb11773263205fdd/RN50x4.pt | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenCLIP_for_PyTorch/src/open_clip/pretrained.py | https://openaipublic.azureedge.net/clip/models/5806e77cd80f8b59890b7e101eabd078d9fb84e6937f9e85e4ecb61988df416f/ViT-B-16.pt | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenCLIP_for_PyTorch/src/open_clip/pretrained.py | https://openaipublic.azureedge.net/clip/models/52378b407f34354e150460fe41077663dd5b39c54cd0bfd2b27167a4a06ec9aa/RN50x16.pt | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenCLIP_for_PyTorch/src/open_clip/pretrained.py | https://openaipublic.azureedge.net/clip/models/40d365715913c9da98579312b702a82c18be219cc2a73407c4526f58eba950af/ViT-B-32.pt | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenCLIP_for_PyTorch/src/open_clip/pretrained.py | https://openaipublic.azureedge.net/clip/models/40d365715913c9da98579312b702a82c18be219cc2a73407c4526f58eba950af/ViT-B-32.pt | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenCLIP_for_PyTorch/src/open_clip/pretrained.py | https://openaipublic.azureedge.net/clip/models/3035c92b350959924f9f00213499208652fc7ea050643e8b385c2dac08641f02/ViT-L-14-336px.pt | 模型地址 | \ No newline at end of file diff --git a/PyTorch/built-in/mm/OpenSora-master/public_address_statement.md b/PyTorch/built-in/mm/OpenSora-master/public_address_statement.md index 68fc2c5bee8ba8ed24c2dbd53c8cf957cb81c2c6..c673d0ed4263f33cdbd3e19d22e64119a0ce2560 100644 --- a/PyTorch/built-in/mm/OpenSora-master/public_address_statement.md +++ b/PyTorch/built-in/mm/OpenSora-master/public_address_statement.md @@ -1,16 +1,27 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ------- |-----------------------------------------------------------------------|---------------------------------------------------------------------|------------------------|--------------| -| 开源代码引入 | https://github.com/hpcaitech/Open-Sora/blob/main/opensora/utils/ckpt_utils.py | .\opensora\utils\ckpt_utils.py | https://dl.fbaipublicfiles.com/DiT/models/DiT-XL-2-512x512.pt | 模型权重公网下载地址 | -| 开源代码引入 | https://github.com/hpcaitech/Open-Sora/blob/main/opensora/utils/ckpt_utils.py | .\opensora\utils\ckpt_utils.py | https://dl.fbaipublicfiles.com/DiT/models/DiT-XL-2-256x256.pt | 模型权重公网下载地址 | -| 开源代码引入 | https://github.com/hpcaitech/Open-Sora/blob/main/opensora/utils/ckpt_utils.py | .\opensora\utils\ckpt_utils.py | https://huggingface.co/maxin-cn/Latte/resolve/main/ucf101.pt | 模型权重公网下载地址 | -| 开源代码引入 | https://github.com/hpcaitech/Open-Sora/blob/main/opensora/utils/ckpt_utils.py | .\opensora\utils\ckpt_utils.py | https://huggingface.co/PixArt-alpha/PixArt-alpha/resolve/main/PixArt-XL-2-256x256.pth | 模型权重公网下载地址 | -| 开源代码引入 | https://github.com/hpcaitech/Open-Sora/blob/main/opensora/utils/ckpt_utils.py | .\opensora\utils\ckpt_utils.py | https://huggingface.co/PixArt-alpha/PixArt-alpha/resolve/main/PixArt-XL-2-SAM-256x256.pth | 模型权重公网下载地址 | -| 开源代码引入 | https://github.com/hpcaitech/Open-Sora/blob/main/opensora/utils/ckpt_utils.py | .\opensora\utils\ckpt_utils.py | https://huggingface.co/PixArt-alpha/PixArt-alpha/resolve/main/PixArt-XL-2-512x512.pth | 模型权重公网下载地址 | -| 开源代码引入 | https://github.com/hpcaitech/Open-Sora/blob/main/opensora/utils/ckpt_utils.py | 
.\opensora\utils\ckpt_utils.py | https://huggingface.co/PixArt-alpha/PixArt-alpha/resolve/main/PixArt-XL-2-1024-MS.pth | 模型权重公网下载地址 | -| 开源代码引入 | https://github.com/hpcaitech/Open-Sora/blob/main/opensora/utils/ckpt_utils.py | .\opensora\utils\ckpt_utils.py | https://huggingface.co/hpcai-tech/Open-Sora/resolve/main/OpenSora-v1-16x256x256.pth | 模型权重公网下载地址 | -| 开源代码引入 | https://github.com/hpcaitech/Open-Sora/blob/main/opensora/utils/ckpt_utils.py | .\opensora\utils\ckpt_utils.py | https://huggingface.co/hpcai-tech/Open-Sora/resolve/main/OpenSora-v1-HQ-16x256x256.pth | 模型权重公网下载地址 | -| 开源代码引入 | https://github.com/hpcaitech/Open-Sora/blob/main/opensora/utils/ckpt_utils.py | .\opensora\utils\ckpt_utils.py | https://huggingface.co/hpcai-tech/Open-Sora/resolve/main/OpenSora-v1-HQ-16x512x512.pth | 模型权重公网下载地址 | -| 开源代码引入 | https://github.com/hpcaitech/Open-Sora/blob/main/opensora/utils/ckpt_utils.py | .\opensora\utils\ckpt_utils.py | https://huggingface.co/PixArt-alpha/PixArt-Sigma/resolve/main/PixArt-Sigma-XL-2-256x256.pth | 模型权重公网下载地址 | -| 开源代码引入 | https://github.com/hpcaitech/Open-Sora/blob/main/opensora/utils/ckpt_utils.py | .\opensora\utils\ckpt_utils.py | https://huggingface.co/PixArt-alpha/PixArt-Sigma/resolve/main/PixArt-Sigma-XL-2-512-MS.pth | 模型权重公网下载地址 | -| 开源代码引入 | https://github.com/hpcaitech/Open-Sora/blob/main/opensora/utils/ckpt_utils.py | .\opensora\utils\ckpt_utils.py | https://huggingface.co/PixArt-alpha/PixArt-Sigma/resolve/main/PixArt-Sigma-XL-2-1024-MS.pth | 模型权重公网下载地址 | -| 开源代码引入 | https://github.com/hpcaitech/Open-Sora/blob/main/opensora/utils/ckpt_utils.py | .\opensora\utils\ckpt_utils.py | https://huggingface.co/PixArt-alpha/PixArt-Sigma/resolve/main/PixArt-Sigma-XL-2-2K-MS.pth | 模型权重公网下载地址 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|----------------------------------------------------------------------------------------------------|------------------------------------------------------------------------------------------------|-----------| +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSora-master/configs/opensora-v1-1/inference/sample-ref.py | https://cdn.pixabay.com/video/2021/04/25/72171-542991404_large.mp4 | prompt地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSora-master/configs/opensora-v1-1/inference/sample-ref.py | https://cdn.openai.com/tmp/s/interp/d0.mp4 | prompt地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSora-master/docker/Dockerfile | https://download.pytorch.org/whl/cu121 | 三方库连接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSora-master/eval/vae/flolpips/flolpips.py | https://raw.githubusercontent.com/danier97/flolpips/main/weights/v0.1/alex.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSora-master/eval/vae/flolpips/pwcnet.py | http://content.sniklaus.com/github/pytorch-pwc/network- | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSora-master/gradio/app.py | https://github.com/hpcaitech/Open-Sora/stargazers | 相关html配置 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSora-master/gradio/app.py | https://img.shields.io/github/stars/hpcaitech/Open-Sora?style=social | 相关html配置 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSora-master/gradio/app.py | https://twitter.com/yangyou1991/status/1769411544083996787?s=61&t=jT0Dsx2d-MS5vS9rNM5e5g | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSora-master/gradio/app.py | https://img.shields.io/badge/Twitter-Discuss-blue?logo=twitter& | 图片示例 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSora-master/gradio/app.py | 
https://raw.githubusercontent.com/hpcaitech/public_assets/main/colossalai/img/WeChat.png | 图片示例 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSora-master/gradio/app.py | https://img.shields.io/badge/微信-小助手加群-green?logo=wechat& | 图片示例 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSora-master/gradio/app.py | https://join.slack.com/t/colossalaiworkspace/shared_invite/zt-247ipg9fk-KRRYmUl~u2ll2637WRURVA | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSora-master/gradio/app.py | https://img.shields.io/badge/Slack-ColossalAI-blueviolet?logo=slack& | 图片示例 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSora-master/gradio/app.py | https://hpcaitech.github.io/Open-Sora/ | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSora-master/gradio/app.py | https://img.shields.io/badge/Gallery-View-orange?logo=& | 图片示例 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSora-master/gradio/app.py | https://hpc-ai.com/blog/open-sora-v1.0 | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSora-master/gradio/app.py | https://img.shields.io/badge/Open_Sora-Blog-blue | 图片示例 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSora-master/gradio/app.py | https://discord.gg/kZakZzrSUT | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSora-master/gradio/app.py | https://img.shields.io/badge/Discord-join-blueviolet?logo=discord& | 图片示例 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSora-master/opensora/models/vae/lpips.py | https://heibox.uni-heidelberg.de/f/607503859c864bc1b30b/?dl=1 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSora-master/opensora/utils/ckpt_utils.py | https://dl.fbaipublicfiles.com/DiT/models/DiT-XL-2-512x512.pt | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSora-master/opensora/utils/ckpt_utils.py | https://dl.fbaipublicfiles.com/DiT/models/DiT-XL-2-256x256.pt | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSora-master/tools/caption/caption_gpt4.py | https://api.openai.com/v1/chat/completions | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSora-master/tools/frame_interpolation/utils/flow_utils.py | http://vision.middlebury.edu/flow/flowEval-iccv07.pdf | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSora-master/tools/scoring/ocr/dbnetpp.py | https://download.openmmlab.com/mmocr/textdet/dbnetpp/ | 权重地址 | \ No newline at end of file diff --git a/PyTorch/built-in/mm/OpenSora1.0/public_address_statement.md index a52610cb8073c3c616800d536ca65cfbb9b73b2e..151a0a8d725b501fcc8aa9ee3fe7dee29e673c38 100644 --- a/PyTorch/built-in/mm/OpenSora1.0/public_address_statement.md +++ b/PyTorch/built-in/mm/OpenSora1.0/public_address_statement.md @@ -1,12 +1,19 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ------- |-----------------------------------------------------------------------|---------------------------------------------------------------------|------------------------|--------------| -| 开源代码引入 | https://github.com/hpcaitech/Open-Sora/blob/main/opensora/utils/ckpt_utils.py | .\opensora\utils\ckpt_utils.py | https://dl.fbaipublicfiles.com/DiT/models/DiT-XL-2-512x512.pt | 模型权重公网下载地址 | -| 开源代码引入 | https://github.com/hpcaitech/Open-Sora/blob/main/opensora/utils/ckpt_utils.py | .\opensora\utils\ckpt_utils.py | https://dl.fbaipublicfiles.com/DiT/models/DiT-XL-2-256x256.pt | 模型权重公网下载地址 | -| 开源代码引入 | https://github.com/hpcaitech/Open-Sora/blob/main/opensora/utils/ckpt_utils.py | .\opensora\utils\ckpt_utils.py | https://huggingface.co/maxin-cn/Latte/resolve/main/ucf101.pt | 模型权重公网下载地址 | -| 开源代码引入 | 
https://github.com/hpcaitech/Open-Sora/blob/main/opensora/utils/ckpt_utils.py | .\opensora\utils\ckpt_utils.py | https://huggingface.co/PixArt-alpha/PixArt-alpha/resolve/main/PixArt-XL-2-256x256.pth | 模型权重公网下载地址 | -| 开源代码引入 | https://github.com/hpcaitech/Open-Sora/blob/main/opensora/utils/ckpt_utils.py | .\opensora\utils\ckpt_utils.py | https://huggingface.co/PixArt-alpha/PixArt-alpha/resolve/main/PixArt-XL-2-SAM-256x256.pth | 模型权重公网下载地址 | -| 开源代码引入 | https://github.com/hpcaitech/Open-Sora/blob/main/opensora/utils/ckpt_utils.py | .\opensora\utils\ckpt_utils.py | https://huggingface.co/PixArt-alpha/PixArt-alpha/resolve/main/PixArt-XL-2-512x512.pth | 模型权重公网下载地址 | -| 开源代码引入 | https://github.com/hpcaitech/Open-Sora/blob/main/opensora/utils/ckpt_utils.py | .\opensora\utils\ckpt_utils.py | https://huggingface.co/PixArt-alpha/PixArt-alpha/resolve/main/PixArt-XL-2-1024-MS.pth | 模型权重公网下载地址 | -| 开源代码引入 | https://github.com/hpcaitech/Open-Sora/blob/main/opensora/utils/ckpt_utils.py | .\opensora\utils\ckpt_utils.py | https://huggingface.co/hpcai-tech/Open-Sora/resolve/main/OpenSora-v1-16x256x256.pth | 模型权重公网下载地址 | -| 开源代码引入 | https://github.com/hpcaitech/Open-Sora/blob/main/opensora/utils/ckpt_utils.py | .\opensora\utils\ckpt_utils.py | https://huggingface.co/hpcai-tech/Open-Sora/resolve/main/OpenSora-v1-HQ-16x256x256.pth | 模型权重公网下载地址 | -| 开源代码引入 | https://github.com/hpcaitech/Open-Sora/blob/main/opensora/utils/ckpt_utils.py | .\opensora\utils\ckpt_utils.py | https://huggingface.co/hpcai-tech/Open-Sora/resolve/main/OpenSora-v1-HQ-16x512x512.pth | 模型权重公网下载地址 | +| 文件位置 | 公网地址 | 公网地址用途 | +|--------------------------------------------------------------------------------|------------------------------------------------------------------------------------------------|-----------| +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSora1.0/gradio/app.py | https://github.com/hpcaitech/Open-Sora/stargazers | 相关html配置 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSora1.0/gradio/app.py | https://img.shields.io/github/stars/hpcaitech/Open-Sora?style=social | 相关html配置 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSora1.0/gradio/app.py | https://twitter.com/yangyou1991/status/1769411544083996787?s=61&t=jT0Dsx2d-MS5vS9rNM5e5g | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSora1.0/gradio/app.py | https://img.shields.io/badge/Twitter-Discuss-blue?logo=twitter& | 图片示例 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSora1.0/gradio/app.py | https://raw.githubusercontent.com/hpcaitech/public_assets/main/colossalai/img/WeChat.png | 图片示例 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSora1.0/gradio/app.py | https://img.shields.io/badge/微信-小助手加群-green?logo=wechat& | 图片示例 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSora1.0/gradio/app.py | https://join.slack.com/t/colossalaiworkspace/shared_invite/zt-247ipg9fk-KRRYmUl~u2ll2637WRURVA | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSora1.0/gradio/app.py | https://img.shields.io/badge/Slack-ColossalAI-blueviolet?logo=slack& | 图片示例 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSora1.0/gradio/app.py | https://hpcaitech.github.io/Open-Sora/ | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSora1.0/gradio/app.py | https://img.shields.io/badge/Gallery-View-orange?logo=& | 图片示例 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSora1.0/gradio/app.py | https://hpc-ai.com/blog/open-sora-v1.0 | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSora1.0/gradio/app.py | https://img.shields.io/badge/Open_Sora-Blog-blue | 图片示例 | +| 
ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSora1.0/gradio/app.py | https://discord.gg/kZakZzrSUT | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSora1.0/gradio/app.py | https://img.shields.io/badge/Discord-join-blueviolet?logo=discord& | 图片示例 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSora1.0/opensora/utils/ckpt_utils.py | https://dl.fbaipublicfiles.com/DiT/models/DiT-XL-2-512x512.pt | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSora1.0/opensora/utils/ckpt_utils.py | https://dl.fbaipublicfiles.com/DiT/models/DiT-XL-2-256x256.pt | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSora1.0/tools/caption/caption_gpt4.py | https://api.openai.com/v1/chat/completions | 模型相关说明 | \ No newline at end of file diff --git a/PyTorch/built-in/mm/OpenSora1.1/public_address_statement.md b/PyTorch/built-in/mm/OpenSora1.1/public_address_statement.md index a52610cb8073c3c616800d536ca65cfbb9b73b2e..2f7c66d37700004be0d82ea4d72668cc24d837c1 100644 --- a/PyTorch/built-in/mm/OpenSora1.1/public_address_statement.md +++ b/PyTorch/built-in/mm/OpenSora1.1/public_address_statement.md @@ -1,12 +1,21 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ------- |-----------------------------------------------------------------------|---------------------------------------------------------------------|------------------------|--------------| -| 开源代码引入 | https://github.com/hpcaitech/Open-Sora/blob/main/opensora/utils/ckpt_utils.py | .\opensora\utils\ckpt_utils.py | https://dl.fbaipublicfiles.com/DiT/models/DiT-XL-2-512x512.pt | 模型权重公网下载地址 | -| 开源代码引入 | https://github.com/hpcaitech/Open-Sora/blob/main/opensora/utils/ckpt_utils.py | .\opensora\utils\ckpt_utils.py | https://dl.fbaipublicfiles.com/DiT/models/DiT-XL-2-256x256.pt | 模型权重公网下载地址 | -| 开源代码引入 | https://github.com/hpcaitech/Open-Sora/blob/main/opensora/utils/ckpt_utils.py | .\opensora\utils\ckpt_utils.py | https://huggingface.co/maxin-cn/Latte/resolve/main/ucf101.pt | 模型权重公网下载地址 | -| 开源代码引入 | https://github.com/hpcaitech/Open-Sora/blob/main/opensora/utils/ckpt_utils.py | .\opensora\utils\ckpt_utils.py | https://huggingface.co/PixArt-alpha/PixArt-alpha/resolve/main/PixArt-XL-2-256x256.pth | 模型权重公网下载地址 | -| 开源代码引入 | https://github.com/hpcaitech/Open-Sora/blob/main/opensora/utils/ckpt_utils.py | .\opensora\utils\ckpt_utils.py | https://huggingface.co/PixArt-alpha/PixArt-alpha/resolve/main/PixArt-XL-2-SAM-256x256.pth | 模型权重公网下载地址 | -| 开源代码引入 | https://github.com/hpcaitech/Open-Sora/blob/main/opensora/utils/ckpt_utils.py | .\opensora\utils\ckpt_utils.py | https://huggingface.co/PixArt-alpha/PixArt-alpha/resolve/main/PixArt-XL-2-512x512.pth | 模型权重公网下载地址 | -| 开源代码引入 | https://github.com/hpcaitech/Open-Sora/blob/main/opensora/utils/ckpt_utils.py | .\opensora\utils\ckpt_utils.py | https://huggingface.co/PixArt-alpha/PixArt-alpha/resolve/main/PixArt-XL-2-1024-MS.pth | 模型权重公网下载地址 | -| 开源代码引入 | https://github.com/hpcaitech/Open-Sora/blob/main/opensora/utils/ckpt_utils.py | .\opensora\utils\ckpt_utils.py | https://huggingface.co/hpcai-tech/Open-Sora/resolve/main/OpenSora-v1-16x256x256.pth | 模型权重公网下载地址 | -| 开源代码引入 | https://github.com/hpcaitech/Open-Sora/blob/main/opensora/utils/ckpt_utils.py | .\opensora\utils\ckpt_utils.py | https://huggingface.co/hpcai-tech/Open-Sora/resolve/main/OpenSora-v1-HQ-16x256x256.pth | 模型权重公网下载地址 | -| 开源代码引入 | https://github.com/hpcaitech/Open-Sora/blob/main/opensora/utils/ckpt_utils.py | .\opensora\utils\ckpt_utils.py | https://huggingface.co/hpcai-tech/Open-Sora/resolve/main/OpenSora-v1-HQ-16x512x512.pth | 模型权重公网下载地址 | +| 文件位置 | 
公网地址 | 公网地址用途 | +|------------------------------------------------------------------------------------------------|------------------------------------------------------------------------------------------------|-----------| +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSora1.1/gradio/app.py | https://github.com/hpcaitech/Open-Sora/stargazers | 相关html配置 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSora1.1/gradio/app.py | https://img.shields.io/github/stars/hpcaitech/Open-Sora?style=social | 相关html配置 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSora1.1/gradio/app.py | https://twitter.com/yangyou1991/status/1769411544083996787?s=61&t=jT0Dsx2d-MS5vS9rNM5e5g | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSora1.1/gradio/app.py | https://img.shields.io/badge/Twitter-Discuss-blue?logo=twitter& | 图片示例 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSora1.1/gradio/app.py | https://raw.githubusercontent.com/hpcaitech/public_assets/main/colossalai/img/WeChat.png | 图片示例 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSora1.1/gradio/app.py | https://img.shields.io/badge/微信-小助手加群-green?logo=wechat& | 图片示例 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSora1.1/gradio/app.py | https://join.slack.com/t/colossalaiworkspace/shared_invite/zt-247ipg9fk-KRRYmUl~u2ll2637WRURVA | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSora1.1/gradio/app.py | https://img.shields.io/badge/Slack-ColossalAI-blueviolet?logo=slack& | 图片示例 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSora1.1/gradio/app.py | https://hpcaitech.github.io/Open-Sora/ | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSora1.1/gradio/app.py | https://img.shields.io/badge/Gallery-View-orange?logo=& | 图片示例 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSora1.1/gradio/app.py | https://hpc-ai.com/blog/open-sora-v1.0 | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSora1.1/gradio/app.py | https://img.shields.io/badge/Open_Sora-Blog-blue | 图片示例 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSora1.1/gradio/app.py | https://discord.gg/kZakZzrSUT | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSora1.1/gradio/app.py | https://img.shields.io/badge/Discord-join-blueviolet?logo=discord& | 图片示例 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSora1.1/opensora/utils/ckpt_utils.py | https://dl.fbaipublicfiles.com/DiT/models/DiT-XL-2-512x512.pt | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSora1.1/opensora/utils/ckpt_utils.py | https://dl.fbaipublicfiles.com/DiT/models/DiT-XL-2-256x256.pt | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSora1.1/tools/caption/caption_gpt4.py | https://api.openai.com/v1/chat/completions | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSora1.1/tools/frame_interpolation/utils/flow_utils.py | http://vision.middlebury.edu/flow/flowEval-iccv07.pdf | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSora1.1/tools/scoring/ocr/dbnetpp.py | https://download.openmmlab.com/mmocr/textdet/dbnetpp/ | 权重地址 | \ No newline at end of file diff --git a/PyTorch/built-in/mm/OpenSoraPlan1.0/public_address_statement.md index 7ab246433e4edb6085e9f4f9ab7c83d2298bcba5..56e6e5f29472c536f66335d753cc238fc530d08d 100644 --- a/PyTorch/built-in/mm/OpenSoraPlan1.0/public_address_statement.md +++ b/PyTorch/built-in/mm/OpenSoraPlan1.0/public_address_statement.md @@ -1,163 +1,20 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ------- 
|------|---------------------------------------------------------------------|--------------------------------------------------|---------| -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/eval/flolpips/pwcnet.py | ./opensora/eval/flolpips/pwcnet.py | http://content.sniklaus.com/github/pytorch-pwc/network-' | 下载预训练文件 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/eval/fvd/videogpt/pytorch_i3d.py| ./opensora/eval/fvd/videogpt/pytorch_i3d.py | http://arxiv.org/pdf/1409.4842v1.pdf. | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/frame_interpolation/utils/flow_utils.py| ./opensora/models/frame_interpolation/utils/flow_utils.py | http://vision.middlebury.edu/flow/flowEval-iccv07.pdf | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/dataset/transform.py| ./opensora/dataset/transform.py | https://github.com/openai/guided-diffusion/blob/8fb3ad9197f16bbc40620447b2742e13458d2831/guided_diffusion/image_datasets.py#L126 | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/eval/eval_clip_score.py| ./opensora/eval/eval_clip_score.py | https://github.com/openai/CLIP | 项目引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/eval/eval_clip_score.py| ./opensora/eval/eval_clip_score.py | https://github.com/mseitzer/pytorch-fid | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/eval/eval_clip_score.py| ./opensora/eval/eval_clip_score.py | https://github.com/openai/guided-diffusion/blob/8fb3ad9197f16bbc40620447b2742e13458d2831/guided_diffusion/image_datasets.py#L126 | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/eval/eval_clip_score.py | ./opensora/eval/eval_clip_score.py | https://github.com/openai/CLIP | 项目引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/eval/eval_clip_score.py | ./opensora/eval/eval_clip_score.py | https://github.com/mseitzer/pytorch-fid | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/eval/fvd/styleganv/fvd.py| ./opensora/eval/fvd/styleganv/fvd.py | https://www.dropbox.com/s/ge9e5ujwgetktms/i3d_torchscript.pt | 预训练权重 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/eval/fvd/styleganv/fvd.py| ./opensora/eval/fvd/styleganv/fvd.py | https://github.com/universome/fvd-comparison | 项目引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/eval/fvd/styleganv/fvd.py| ./opensora/eval/fvd/styleganv/fvd.py | https://github.com/cvpr2022-stylegan-v/stylegan-v/blob/main/src/metrics/frechet_video_distance.py | 项目引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/eval/fvd/videogpt/fvd.py| ./opensora/eval/fvd/videogpt/fvd.py | https://onedrive.live.com/download?cid=78EEF3EB6AE7DBCB&resid=78EEF3EB6AE7DBCB%21199&authkey=AApKdFHPXzWLNyI" | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/eval/fvd/videogpt/fvd.py| ./opensora/eval/fvd/videogpt/fvd.py | https://github.com/tensorflow/gan/blob/de4b8da3853058ea380a6152bd3bd454013bf619/tensorflow_gan/python/eval/classifier_metrics.py#L161 | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/eval/fvd/videogpt/fvd.py| ./opensora/eval/fvd/videogpt/fvd.py | https://onedrive.live.com/download?cid=78EEF3EB6AE7DBCB&resid=78EEF3EB6AE7DBCB%21199&authkey=AApKdFHPXzWLNyI" | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/eval/fvd/videogpt/fvd.py| 
./opensora/eval/fvd/videogpt/fvd.py | https://github.com/tensorflow/gan/blob/de4b8da3853058ea380a6152bd3bd454013bf619/tensorflow_gan/python/eval/classifier_metrics.py#L400 | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/eval/fvd/videogpt/fvd.py| ./opensora/eval/fvd/videogpt/fvd.py | https://github.com/tensorflow/gan/blob/de4b8da3853058ea380a6152bd3bd454013bf619/tensorflow_gan/python/eval/classifier_metrics.py#L400 | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/eval/fvd/videogpt/pytorch_i3d.py| ./opensora/eval/fvd/videogpt/pytorch_i3d.py | https://github.com/piergiaj/pytorch-i3d | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/eval/fvd/videogpt/pytorch_i3d.py| ./opensora/eval/fvd/videogpt/pytorch_i3d.py | https://arxiv.org/pdf/1705.07750v1.pdf. | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/imagebase/vqvae/quantize.py| ./opensora/models/ae/imagebase/vqvae/quantize.py | https://github.com/MishaLaskin/vqvae/blob/d761a999e2267766400dc646d82d3ac3657771d4/models/quantizer.py | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/imagebase/vqvae/quantize.py| ./opensora/models/ae/imagebase/vqvae/quantize.py | https://github.com/karpathy/deep-vector-quantization/blob/main/model.py | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/imagebase/vqvae/quantize.py| ./opensora/models/ae/imagebase/vqvae/quantize.py | https://arxiv.org/abs/1611.01144 | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/imagebase/vqvae/quantize.py| ./opensora/models/ae/imagebase/vqvae/quantize.py | https://github.com/MishaLaskin/vqvae/blob/d761a999e2267766400dc646d82d3ac3657771d4/models/quantizer.py | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/imagebase/vqvae/quantize.py| ./opensora/models/ae/imagebase/vqvae/quantize.py | https://github.com/karpathy/deep-vector-quantization/blob/main/model.py | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/imagebase/vqvae/quantize.py | ./opensora/models/ae/imagebase/vqvae/quantize.py | https://arxiv.org/abs/1611.01144 | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/causal_vqvae/modeling_causalvqvae.py | ./opensora/models/ae/videobase/causal_vqvae/modeling_causalvqvae.py | https://github.com/wilson1yan/VideoGPT | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/causal_vqvae/modeling_causalvqvae.py | ./opensora/models/ae/videobase/causal_vqvae/modeling_causalvqvae.py | https://github.com/wilson1yan/VideoGPT | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/causal_vqvae/modeling_causalvqvae.py | ./opensora/models/ae/videobase/causal_vqvae/modeling_causalvqvae.py | https://github.com/wilson1yan/VideoGPT | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/causal_vqvae/modeling_causalvqvae.py | ./opensora/models/ae/videobase/causal_vqvae/modeling_causalvqvae.py | https://github.com/wilson1yan/VideoGPT | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/causal_vqvae/modeling_causalvqvae.py | ./opensora/models/ae/videobase/causal_vqvae/modeling_causalvqvae.py | https://github.com/wilson1yan/VideoGPT | 论文引导 | -| 开源代码引入 | 
https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/causal_vqvae/modeling_causalvqvae.py | ./opensora/models/ae/videobase/causal_vqvae/modeling_causalvqvae.py | https://github.com/wilson1yan/VideoGPT | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/causal_vqvae/modeling_causalvqvae.py | ./opensora/models/ae/videobase/causal_vqvae/modeling_causalvqvae.py | https://github.com/wilson1yan/VideoGPT | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/causal_vqvae/modeling_causalvqvae.py | ./opensora/models/ae/videobase/causal_vqvae/modeling_causalvqvae.py | https://github.com/wilson1yan/VideoGPT | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/causal_vqvae/modeling_causalvqvae.py | ./opensora/models/ae/videobase/causal_vqvae/modeling_causalvqvae.py | https://github.com/wilson1yan/VideoGPT | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/causal_vqvae/modeling_causalvqvae.py | ./opensora/models/ae/videobase/causal_vqvae/modeling_causalvqvae.py | https://github.com/wilson1yan/VideoGPT | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/causal_vqvae/modeling_causalvqvae.py | ./opensora/models/ae/videobase/causal_vqvae/modeling_causalvqvae.py | https://arxiv.org/abs/1904.10509 | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/causal_vqvae/modeling_causalvqvae.py | ./opensora/models/ae/videobase/causal_vqvae/modeling_causalvqvae.py | https://github.com/wilson1yan/VideoGPT | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/losses/discriminator.py| ./opensora/models/ae/videobase/losses/discriminator.py | https://github.com/junyanz/pytorch-CycleGAN-and-pix2pix/blob/master/models/networks.py | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/losses/lpips.py| ./opensora/models/ae/videobase/losses/lpips.py | https://github.com/richzhang/PerceptualSimilarity/tree/master/models""" | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/losses/perceptual_loss.py| ./opensora/models/ae/videobase/losses/perceptual_loss.py | https://github.com/karpathy/deep-vector-quantization/blob/main/model.py | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/modules/attention.py| ./opensora/models/ae/videobase/modules/attention.py | https://github.com/PKU-YuanGroup/Open-Sora-Plan/pull/172. 
| 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/vqvae/modeling_vqvae.py | ./opensora/models/ae/videobase/vqvae/modeling_vqvae.py | https://github.com/wilson1yan/VideoGPT | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/vqvae/modeling_vqvae.py | ./opensora/models/ae/videobase/vqvae/modeling_vqvae.py | https://github.com/wilson1yan/VideoGPT | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/vqvae/modeling_vqvae.py | ./opensora/models/ae/videobase/vqvae/modeling_vqvae.py | https://github.com/wilson1yan/VideoGPT | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/vqvae/modeling_vqvae.py | ./opensora/models/ae/videobase/vqvae/modeling_vqvae.py | https://github.com/wilson1yan/VideoGPT | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/vqvae/modeling_vqvae.py | ./opensora/models/ae/videobase/vqvae/modeling_vqvae.py | https://github.com/wilson1yan/VideoGPT | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/vqvae/modeling_vqvae.py | ./opensora/models/ae/videobase/vqvae/modeling_vqvae.py | https://github.com/wilson1yan/VideoGPT | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/vqvae/modeling_vqvae.py | ./opensora/models/ae/videobase/vqvae/modeling_vqvae.py | https://github.com/wilson1yan/VideoGPT | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/vqvae/modeling_vqvae.py | ./opensora/models/ae/videobase/vqvae/modeling_vqvae.py | https://github.com/wilson1yan/VideoGPT | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/vqvae/modeling_vqvae.py | ./opensora/models/ae/videobase/vqvae/modeling_vqvae.py | https://github.com/wilson1yan/VideoGPT | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/vqvae/modeling_vqvae.py | ./opensora/models/ae/videobase/vqvae/modeling_vqvae.py | https://github.com/wilson1yan/VideoGPT | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/vqvae/modeling_vqvae.py | ./opensora/models/ae/videobase/vqvae/modeling_vqvae.py | https://github.com/wilson1yan/VideoGPT | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/vqvae/modeling_vqvae.py | ./opensora/models/ae/videobase/vqvae/modeling_vqvae.py | https://github.com/wilson1yan/VideoGPT | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/vqvae/modeling_vqvae.py | ./opensora/models/ae/videobase/vqvae/modeling_vqvae.py | https://github.com/wilson1yan/VideoGPT | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/vqvae/modeling_vqvae.py | ./opensora/models/ae/videobase/vqvae/modeling_vqvae.py | https://github.com/wilson1yan/VideoGPT | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/vqvae/modeling_vqvae.py| ./opensora/models/ae/videobase/vqvae/modeling_vqvae.py | https://arxiv.org/abs/1904.10509 | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/vqvae/modeling_vqvae.py | /opensora/models/ae/videobase/vqvae/modeling_vqvae.py | https://github.com/wilson1yan/VideoGPT | 论文引导 | -| 开源代码引入 | 
https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/vqvae/modeling_vqvae.py | /opensora/models/ae/videobase/vqvae/modeling_vqvae.py | https://github.com/wilson1yan/VideoGPT | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/diffusion/__init__.py| ./opensora/models/diffusion/diffusion/__init__.py | https://github.com/openai/glide-text2im/blob/main/glide_text2im/gaussian_diffusion.py | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/diffusion/__init__.py| ./opensora/models/diffusion/diffusion/__init__.py | https://github.com/openai/guided-diffusion/blob/main/guided_diffusion | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/diffusion/__init__.py| ./opensora/models/diffusion/diffusion/__init__.py | https://github.com/openai/improved-diffusion/blob/main/improved_diffusion/gaussian_diffusion.py | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/diffusion/diffusion_utils.py| ./opensora/models/diffusion/diffusion/diffusion_utils.py | https://github.com/openai/glide-text2im/blob/main/glide_text2im/gaussian_diffusion.py | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/diffusion/diffusion_utils.py| ./opensora/models/diffusion/diffusion/diffusion_utils.py | https://github.com/openai/guided-diffusion/blob/main/guided_diffusion | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/diffusion/diffusion_utils.py| ./opensora/models/diffusion/diffusion/diffusion_utils.py | https://github.com/openai/improved-diffusion/blob/main/improved_diffusion/gaussian_diffusion.py | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/diffusion/gaussian_diffusion_t2v.py| ./opensora/models/diffusion/diffusion/gaussian_diffusion_t2v.py | https://github.com/openai/glide-text2im/blob/main/glide_text2im/gaussian_diffusion.py | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/diffusion/gaussian_diffusion_t2v.py| ./opensora/models/diffusion/diffusion/gaussian_diffusion_t2v.py | https://github.com/openai/guided-diffusion/blob/main/guided_diffusion | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/diffusion/gaussian_diffusion_t2v.py| ./opensora/models/diffusion/diffusion/gaussian_diffusion_t2v.py | https://github.com/openai/improved-diffusion/blob/main/improved_diffusion/gaussian_diffusion.py | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/diffusion/gaussian_diffusion.py| ./opensora/models/diffusion/diffusion/gaussian_diffusion.py | https://github.com/openai/glide-text2im/blob/main/glide_text2im/gaussian_diffusion.py | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/diffusion/gaussian_diffusion.py| ./opensora/models/diffusion/diffusion/gaussian_diffusion.py | https://github.com/openai/guided-diffusion/blob/main/guided_diffusion | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/diffusion/gaussian_diffusion.py| ./opensora/models/diffusion/diffusion/gaussian_diffusion.py | https://github.com/openai/improved-diffusion/blob/main/improved_diffusion/gaussian_diffusion.py | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/diffusion/respace.py| 
./opensora/models/diffusion/diffusion/respace.py | https://github.com/openai/glide-text2im/blob/main/glide_text2im/gaussian_diffusion.py | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/diffusion/respace.py| ./opensora/models/diffusion/diffusion/respace.py | https://github.com/openai/guided-diffusion/blob/main/guided_diffusion | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/diffusion/respace.py| ./opensora/models/diffusion/diffusion/respace.py | https://github.com/openai/improved-diffusion/blob/main/improved_diffusion/gaussian_diffusion.py | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/diffusion/timestep_sampler.py| ./opensora/models/diffusion/diffusion/timestep_sampler.py | https://github.com/openai/glide-text2im/blob/main/glide_text2im/gaussian_diffusion.py | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/diffusion/timestep_sampler.py| ./opensora/models/diffusion/diffusion/timestep_sampler.py | https://github.com/openai/guided-diffusion/blob/main/guided_diffusion | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/diffusion/timestep_sampler.py| ./opensora/models/diffusion/diffusion/timestep_sampler.py | https://github.com/openai/improved-diffusion/blob/main/improved_diffusion/gaussian_diffusion.py | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/diffusion/gaussian_diffusion_t2v.py| ./opensora/models/diffusion/diffusion/gaussian_diffusion_t2v.py | https://github.com/hojonathanho/diffusion/blob/1e0dceb3b3495bbe19116a5e1b3596cd0706c543/diffusion_tf/diffusion_utils_2.py#L42 | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/latte/modeling_latte.py| ./opensora/models/diffusion/latte/modeling_latte.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/latte/modeling_latte.py| ./opensora/models/diffusion/latte/modeling_latte.py | https://github.com/openai/glide-text2im/blob/main/notebooks/text2im.ipynb | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/latte/modeling_latte.py| ./opensora/models/diffusion/latte/modeling_latte.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/latte/modules.py| ./opensora/models/diffusion/latte/modules.py | https://github.com/PixArt-alpha/PixArt-alpha/blob/0f55e922376d8b797edd44d25d0e7464b260dcab/diffusion/model/nets/PixArtMS.py#L164C9-L168C29 | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/latte/modules.py| ./opensora/models/diffusion/latte/modules.py | https://github.com/PixArt-alpha/PixArt-alpha/blob/master/diffusion/model/nets/PixArt_blocks.py | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/latte/modules.py| ./opensora/models/diffusion/latte/modules.py | https://github.com/PixArt-alpha/PixArt-alpha/blob/0f55e922376d8b797edd44d25d0e7464b260dcab/diffusion/model/nets/PixArtMS.py#L162C151-L162C160 | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/latte/modules.py| 
./opensora/models/diffusion/latte/modules.py | https://github.com/facebookresearch/xformers | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/latte/modules.py| ./opensora/models/diffusion/latte/modules.py | https://github.com/PixArt-alpha/PixArt-alpha/blob/0f55e922376d8b797edd44d25d0e7464b260dcab/diffusion/model/nets/PixArtMS.py#L70C1-L76C103 | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/latte/modules.py| ./opensora/models/diffusion/latte/modules.py | https://github.com/PixArt-alpha/PixArt-alpha/blob/0f55e922376d8b797edd44d25d0e7464b260dcab/diffusion/model/nets/PixArtMS.py#L70C1-L76C103 | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/latte/modules.py| ./opensora/models/diffusion/latte/modules.py | https://arxiv.org/abs/2310.00426 | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/latte/pos.py| ./opensora/models/diffusion/latte/pos.py | https://github.com/facebookresearch/mae/blob/main/util/pos_embed.py | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/latte/pos.py| ./opensora/models/diffusion/latte/pos.py | https://github.com/baaivision/EVA | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/latte/pos.py| ./opensora/models/diffusion/latte/pos.py | https://spaces.ac.cn/archives/8265 | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/latte/pos.py| ./opensora/models/diffusion/latte/pos.py | https://spaces.ac.cn/archives/8397 | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/latte/pos.py| ./opensora/models/diffusion/latte/pos.py | https://spaces.ac.cn/archives/8397 | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/utils/pos_embed.py| ./opensora/models/diffusion/utils/pos_embed.py | https://github.com/naver/croco | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/utils/pos_embed.py| ./opensora/models/diffusion/utils/pos_embed.py | https://github.com/huggingface/diffusers | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/frame_interpolation/interpolation.py| ./opensora/models/frame_interpolation/interpolation.py | https://github.com/MCG-NKU/AMT/blob/main/demos/demo_2x.py | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/super_resolution/basicsr/archs/arch_util.py| ./opensora/models/super_resolution/basicsr/archs/arch_util.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/layers/weight_init.py | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/super_resolution/basicsr/archs/arch_util.py| ./opensora/models/super_resolution/basicsr/archs/arch_util.py | https://people.sc.fsu.edu/~jburkardt/presentations/truncated_normal.pdf | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/super_resolution/basicsr/archs/arch_util.py| ./opensora/models/super_resolution/basicsr/archs/arch_util.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/layers/weight_init.py | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/super_resolution/basicsr/archs/rgt_arch.py| ./opensora/models/super_resolution/basicsr/archs/rgt_arch.py | 
https://github.com/cheerss/CrossFormer/blob/main/models/crossformer.py | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/super_resolution/basicsr/archs/rgt_arch.py| ./opensora/models/super_resolution/basicsr/archs/rgt_arch.py | https://github.com/zhengchen1999/CAT/blob/main/basicsr/archs/cat_arch.py | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/super_resolution/basicsr/archs/rgt_arch.py| ./opensora/models/super_resolution/basicsr/archs/rgt_arch.py | https://github.com/microsoft/Swin-Transformer/blob/main/models/swin_transformer.py | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/super_resolution/basicsr/data/data_util.py| ./opensora/models/super_resolution/basicsr/data/data_util.py | https://lmdb.readthedocs.io/en/release/ | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/super_resolution/basicsr/data/prefetch_dataloader.py| ./opensora/models/super_resolution/basicsr/data/prefetch_dataloader.py | https://stackoverflow.com/questions/7323664/python-generator-pre-fetch | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/super_resolution/basicsr/data/prefetch_dataloader.py| ./opensora/models/super_resolution/basicsr/data/prefetch_dataloader.py | https://github.com/IgorSusmelj/pytorch-styleguide/issues/5# | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/super_resolution/basicsr/data/prefetch_dataloader.py| ./opensora/models/super_resolution/basicsr/data/prefetch_dataloader.py | https://github.com/NVIDIA/apex/issues/304# | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/super_resolution/basicsr/metrics/psnr_ssim.py| ./opensora/models/super_resolution/basicsr/metrics/psnr_ssim.py | https://en.wikipedia.org/wiki/Peak_signal-to-noise_ratio | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/super_resolution/basicsr/metrics/psnr_ssim.py| ./opensora/models/super_resolution/basicsr/metrics/psnr_ssim.py | https://ece.uwaterloo.ca/~z70wang/research/ssim/. | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/super_resolution/basicsr/utils/dist_util.py| ./opensora/models/super_resolution/basicsr/utils/dist_util.py | https://github.com/open-mmlab/mmcv/blob/master/mmcv/runner/dist_utils.py | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/super_resolution/basicsr/utils/file_client.py | ./opensora/models/super_resolution/basicsr/utils/file_client.py | https://github.com/open-mmlab/mmcv/blob/master/mmcv/fileio/file_client.py | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/super_resolution/basicsr/utils/matlab_functions.py| ./opensora/models/super_resolution/basicsr/utils/matlab_functions.py | https://en.wikipedia.org/wiki/YCbCr#ITU-R_BT.601_conversion. | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/super_resolution/basicsr/utils/matlab_functions.py| ./opensora/models/super_resolution/basicsr/utils/matlab_functions.py | https://en.wikipedia.org/wiki/YCbCr#JPEG_conversion. | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/super_resolution/basicsr/utils/matlab_functions.py| ./opensora/models/super_resolution/basicsr/utils/matlab_functions.py | https://en.wikipedia.org/wiki/YCbCr#ITU-R_BT.601_conversion. 
| 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/super_resolution/basicsr/utils/matlab_functions.py| ./opensora/models/super_resolution/basicsr/utils/matlab_functions.py | https://en.wikipedia.org/wiki/YCbCr#JPEG_conversion. | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/super_resolution/basicsr/utils/matlab_functions.py| ./opensora/models/super_resolution/basicsr/utils/matlab_functions.py | https://en.wikipedia.org/wiki/YCbCr#ITU-R_BT.601_conversion. | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/super_resolution/basicsr/utils/matlab_functions.py| ./opensora/models/super_resolution/basicsr/utils/matlab_functions.py | https://en.wikipedia.org/wiki/YCbCr#JPEG_conversion. | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/super_resolution/basicsr/utils/matlab_functions.py| ./opensora/models/super_resolution/basicsr/utils/matlab_functions.py | https://en.wikipedia.org/wiki/YCbCr#ITU-R_BT.601_conversion. | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/super_resolution/basicsr/utils/matlab_functions.py| ./opensora/models/super_resolution/basicsr/utils/matlab_functions.py | https://en.wikipedia.org/wiki/YCbCr#JPEG_conversion. | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/super_resolution/basicsr/utils/registry.py| ./opensora/models/super_resolution/basicsr/utils/registry.py | https://github.com/facebookresearch/fvcore/blob/master/fvcore/common/registry.py | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/sample/pipeline_videogen.py| ./opensora/sample/pipeline_videogen.py | https://huggingface.co/docs/transformers/model_doc/t5#transformers.T5EncoderModel | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/sample/pipeline_videogen.py| ./opensora/sample/pipeline_videogen.py | https://huggingface.co/PixArt-alpha/PixArt-alpha/tree/main/t5-v1_1-xxl | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/sample/pipeline_videogen.py| ./opensora/sample/pipeline_videogen.py | https://huggingface.co/docs/transformers/model_doc/t5#transformers.T5Tokenizer | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/sample/pipeline_videogen.py| ./opensora/sample/pipeline_videogen.py | https://github.com/PixArt-alpha/PixArt-alpha/blob/master/diffusion/model/utils.py | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/sample/pipeline_videogen.py| ./opensora/sample/pipeline_videogen.py | https://arxiv.org/abs/2010.02502 | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/sample/pipeline_videogen.py| ./opensora/sample/pipeline_videogen.py | https://arxiv.org/abs/2207.12598 | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/sample/pipeline_videogen.py| ./opensora/sample/pipeline_videogen.py | https://arxiv.org/pdf/2205.11487.pdf | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/sample/pipeline_videogen.py| ./opensora/sample/pipeline_videogen.py | https://arxiv.org/abs/2010.02502 | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/sample/pipeline_videogen.py| ./opensora/sample/pipeline_videogen.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/sample/pipeline_videogen.py| 
./opensora/sample/pipeline_videogen.py | https://pillow.readthedocs.io/en/stable/ | 图片示例 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/sample/pipeline_videogen.py| ./opensora/sample/pipeline_videogen.py | https://arxiv.org/pdf/2205.11487.pdf | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/sample/transport_sample.py| ./opensora/sample/transport_sample.py | https://github.com/rtqichen/torchdiffeq | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/serve/gradio_utils.py| ./opensora/serve/gradio_utils.py | https://www.pnglog.com/AOuPMh.png | 图片示例 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/serve/gradio_utils.py| ./opensora/serve/gradio_utils.py | https://github.com/PKU-YuanGroup/Open-Sora-Plan | 项目引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/serve/gradio_utils.py| ./opensora/serve/gradio_utils.py | https://github.com/PKU-YuanGroup/Open-Sora-Plan | 项目引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/serve/gradio_utils.py| ./opensora/serve/gradio_utils.py | https://img.shields.io/badge/Github-Code-blue | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/serve/gradio_utils.py| ./opensora/serve/gradio_utils.py | https://github.com/PKU-YuanGroup/Open-Sora-Plan | 项目引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/serve/gradio_utils.py| ./opensora/serve/gradio_utils.py | https://github.com/PKU-YuanGroup/Open-Sora-Plan | 项目引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/serve/gradio_utils.py| ./opensora/serve/gradio_utils.py | https://img.shields.io/badge/Github-Code-blue | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/serve/gradio_utils.py| ./opensora/serve/gradio_utils.py | https://github.com/PKU-YuanGroup/Open-Sora-Plan | 项目引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/serve/gradio_utils.py| ./opensora/serve/gradio_utils.py | https://github.com/PKU-YuanGroup/Open-Sora-Plan/stargazers | 项目引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/serve/gradio_utils.py| ./opensora/serve/gradio_utils.py | https://img.shields.io/github/stars/PKU-YuanGroup/Open-Sora-Plan.svg?style=social | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/serve/gradio_utils.py| ./opensora/serve/gradio_utils.py | https://github.com/PKU-YuanGroup/Open-Sora-Plan/stargazers | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/serve/gradio_utils.py| ./opensora/serve/gradio_utils.py | https://img.shields.io/github/stars/PKU-YuanGroup/Open-Sora-Plan.svg?style=social | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/train/train_t2v_feature.py | ./opensora/train/train_t2v_feature.py | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/train/train_t2v_feature.py | ./opensora/train/train_t2v_feature.py | https://arxiv.org/abs/2303.09556 | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/train/train_t2v_feature.py | ./opensora/train/train_t2v_feature.py | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/train/train_t2v_feature.py | ./opensora/train/train_t2v_feature.py | 
https://www.tensorflow.org/tensorboard | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/train/train_t2v_t5_feature.py | ./opensora/train/train_t2v_t5_feature.py | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/train/train_t2v_t5_feature.py | ./opensora/train/train_t2v_t5_feature.py | https://arxiv.org/abs/2303.09556| 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/train/train_t2v_t5_feature.py | ./opensora/train/train_t2v_t5_feature.py | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/train/train_t2v_t5_feature.py | ./opensora/train/train_t2v_t5_feature.py | https://www.tensorflow.org/tensorboard | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/train/train_t2v.py| ./opensora/train/train_t2v.py | https://pytorch.org/docs/stable/notes/cudahtml#tensorfloat-32-tf32-on-ampere-devices | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/train/train_t2v.py| ./opensora/train/train_t2v.py | https://arxiv.org/abs/2303.09556 | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/train/train_t2v.py| ./opensora/train/train_t2v.py | https://pytorch.org/docs/stable/notes/cudahtml#tensorfloat-32-tf32-on-ampere-devices | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/train/train_t2v.py| ./opensora/train/train_t2v.py | https://www.tensorflow.org/tensorboard | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/train/train.py| ./opensora/train/train.py | https://pytorch.org/docs/stable/notes/cudahtml#tensorfloat-32-tf32-on-ampere-devices | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/train/train.py| ./opensora/train/train.py | https://arxiv.org/abs/2303.09556 | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/train/train.py| ./opensora/train/train.py | https://pytorch.org/docs/stable/notes/cudahtml#tensorfloat-32-tf32-on-ampere-devices | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/train/train.py| ./opensora/train/train.py | https://www.tensorflow.org/tensorboard | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/utils/dataset_utils.py| ./opensora/utils/dataset_utils.py | https://github.com/dmlc/decord | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/utils/taming_download.py| ./opensora/utils/taming_download.py | https://github.com/CompVis/taming-transformers.git | 论文引导 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/utils/taming_download.py| ./opensora/utils/taming_download.py | https://heibox.uni-heidelberg.de/f/607503859c864bc1b30b/?dl=1 | 论文引导 | +| 文件位置 | 公网地址 | 公网地址用途 | +|--------------------------------------------------------------------------------------------------------------|--------------------------------------------------------------------------------------------------------------|--------------------------| +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSoraPlan1.0/opensora/eval/flolpips/pwcnet.py | http://content.sniklaus.com/github/pytorch-pwc/network- | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSoraPlan1.0/opensora/eval/fvd/styleganv/fvd.py | https://www.dropbox.com/s/ge9e5ujwgetktms/i3d_torchscript.pt | 
模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSoraPlan1.0/opensora/eval/fvd/videogpt/fvd.py | https://onedrive.live.com/download?cid=78EEF3EB6AE7DBCB&resid=78EEF3EB6AE7DBCB%21199&authkey=AApKdFHPXzWLNyI | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSoraPlan1.0/opensora/models/frame_interpolation/utils/flow_utils.py | http://vision.middlebury.edu/flow/flowEval-iccv07.pdf | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSoraPlan1.0/opensora/serve/gradio_utils.py | https://www.pnglog.com/AOuPMh.png | 图片示例 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSoraPlan1.0/opensora/train/train.py | https://www.tensorflow.org/tensorboard | tensorboard地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSoraPlan1.0/opensora/train/train.py | https://arxiv.org/abs/2303.09556. | SNR weighting gamma论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSoraPlan1.0/opensora/train/train.py | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSoraPlan1.0/opensora/train/train_t2v.py | https://www.tensorflow.org/tensorboard | tensorboard地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSoraPlan1.0/opensora/train/train_t2v.py | https://arxiv.org/abs/2303.09556. | SNR weighting gamma论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSoraPlan1.0/opensora/train/train_t2v.py | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSoraPlan1.0/opensora/train/train_t2v_feature.py | https://www.tensorflow.org/tensorboard | tensorboard地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSoraPlan1.0/opensora/train/train_t2v_feature.py | https://arxiv.org/abs/2303.09556. | SNR weighting gamma论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSoraPlan1.0/opensora/train/train_t2v_feature.py | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSoraPlan1.0/opensora/train/train_t2v_t5_feature.py | https://www.tensorflow.org/tensorboard | tensorboard地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSoraPlan1.0/opensora/train/train_t2v_t5_feature.py | https://arxiv.org/abs/2303.09556. 
| SNR weighting gamma论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSoraPlan1.0/opensora/train/train_t2v_t5_feature.py | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSoraPlan1.0/opensora/utils/taming_download.py | https://heibox.uni-heidelberg.de/f/607503859c864bc1b30b/?dl=1 | 论文地址 | \ No newline at end of file diff --git a/PyTorch/built-in/mm/OpenSoraPlan1.1/public_address_statement.md b/PyTorch/built-in/mm/OpenSoraPlan1.1/public_address_statement.md index 5461fa98ac4f95645806fa08deeb87b58af22e22..67e0bb922493fdda795bd9a98c690a871670f226 100644 --- a/PyTorch/built-in/mm/OpenSoraPlan1.1/public_address_statement.md +++ b/PyTorch/built-in/mm/OpenSoraPlan1.1/public_address_statement.md @@ -1,146 +1,11 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|----------------------------------------------------------------------------------------------------------------------|-----------------------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------------------|--------------------------------------------------------------| -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/dataset/transform.py | /opensora/dataset/transform.py | https://github.com/openai/guided-diffusion/blob/8fb3ad9197f16bbc40620447b2742e13458d2831/guided_diffusion/image_datasets.py | code from openai/guided-diffusion | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/eval/eval_clip_score.py | /opensora/eval/eval_clip_score.py | https://github.com/openai/CLIP | code from openai/CLIP. | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/eval/eval_clip_score.py | /opensora/eval/eval_clip_score.py | https://github.com/mseitzer/pytorch-fid | code from mseitzer/pytorch-fid | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/eval/eval_clip_score.py | /opensora/eval/eval_clip_score.py | https://github.com/openai/CLIP | code from openai/CLIP. | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/eval/eval_clip_score.py | /opensora/eval/eval_clip_score.py | http://www.apache.org/licenses/LICENSE-2.0 | license | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/eval/eval_common_metric.py | /opensora/eval/eval_common_metric.py | https://github.com/openai/CLIP | code from openai/CLIP. | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/eval/eval_common_metric.py | /opensora/eval/eval_common_metric.py | https://github.com/mseitzer/pytorch-fid | code from mseitzer/pytorch-fid | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/eval/eval_common_metric.py | /opensora/eval/eval_common_metric.py | https://github.com/openai/CLIP | code from openai/CLIP. 
| -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/eval/eval_common_metric.py | /opensora/eval/eval_common_metric.py | http://www.apache.org/licenses/LICENSE-2.0 | license | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/eval/flolpips/pwcnet.py | /opensora/eval/flolpips/pwcnet.py | http://content.sniklaus.com/github/pytorch-pwc/network-default.pytorch | download weights | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/eval/fvd/styleganv/fvd.py | /opensora/eval/fvd/styleganv/fvd.py | https://github.com/universome/fvd-comparison | code from universome/fvd-comparison | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/eval/fvd/styleganv/fvd.py | /opensora/eval/fvd/styleganv/fvd.py | https://www.dropbox.com/s/ge9e5ujwgetktms/i3d_torchscript.pt | download weights | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/eval/fvd/styleganv/fvd.py | /opensora/eval/fvd/styleganv/fvd.py | https://github.com/cvpr2022-stylegan-v/stylegan-v/blob/main/src/metrics/frechet_video_distance.py | code from cvpr2022-stylegan-v/stylegan-v | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/eval/fvd/videogpt/fvd.py | /opensora/eval/fvd/videogpt/fvd.py | https://onedrive.live.com/download?cid=78EEF3EB6AE7DBCB&resid=78EEF3EB6AE7DBCB%21199&authkey=AApKdFHPXzWLNyI | download weights | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/eval/fvd/videogpt/fvd.py | /opensora/eval/fvd/videogpt/fvd.py | https://github.com/tensorflow/gan/blob/de4b8da3853058ea380a6152bd3bd454013bf619/tensorflow_gan/python/eval/classifier_metrics.py | code from tensorflow/gan | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/eval/fvd/videogpt/fvd.py | /opensora/eval/fvd/videogpt/fvd.py | https://github.com/tensorflow/gan/blob/de4b8da3853058ea380a6152bd3bd454013bf619/tensorflow_gan/python/eval/classifier_metrics.py | code from tensorflow/gan | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/eval/fvd/videogpt/fvd.py | /opensora/eval/fvd/videogpt/fvd.py | https://discuss.pytorch.org/t/covariance-and-gradient-support/16217/2 | code from ModarTensai | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/eval/fvd/videogpt/pytorch_i3d.py | /opensora/eval/fvd/videogpt/pytorch_i3d.py | https://github.com/piergiaj/pytorch-i3d | code from piergiaj/pytorch-i3d | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/eval/fvd/videogpt/pytorch_i3d.py | /opensora/eval/fvd/videogpt/pytorch_i3d.py | https://arxiv.org/pdf/1705.07750v1.pdf | paper | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/eval/fvd/videogpt/pytorch_i3d.py | /opensora/eval/fvd/videogpt/pytorch_i3d.py | http://arxiv.org/pdf/1409.4842v1.pdf | paper | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/imagebase/vqvae/quantize.py | /opensora/models/ae/imagebase/vqvae/quantize.py | https://github.com/MishaLaskin/vqvae/blob/d761a999e2267766400dc646d82d3ac3657771d4/models/quantizer.py | code from MishaLaskin/vqvae | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/imagebase/vqvae/quantize.py | /opensora/models/ae/imagebase/vqvae/quantize.py | https://github.com/karpathy/deep-vector-quantization/blob/main/model.py | code from karpathy/deep-vector-quantization | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/imagebase/vqvae/quantize.py | /opensora/models/ae/imagebase/vqvae/quantize.py 
| https://arxiv.org/abs/1611.01144 | paper | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/causal_vqvae/modeling_causalvqvae.py | /opensora/models/ae/videobase/causal_vqvae/modeling_causalvqvae.py | https://github.com/wilson1yan/VideoGPT | code from wilson1yan/VideoGPT | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/causal_vqvae/modeling_causalvqvae.py | /opensora/models/ae/videobase/causal_vqvae/modeling_causalvqvae.py | https://github.com/wilson1yan/VideoGPT | code from wilson1yan/VideoGPT | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/causal_vqvae/modeling_causalvqvae.py | /opensora/models/ae/videobase/causal_vqvae/modeling_causalvqvae.py | https://github.com/wilson1yan/VideoGPT | code from wilson1yan/VideoGPT | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/causal_vqvae/modeling_causalvqvae.py | /opensora/models/ae/videobase/causal_vqvae/modeling_causalvqvae.py | https://github.com/wilson1yan/VideoGPT | code from wilson1yan/VideoGPT | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/causal_vqvae/modeling_causalvqvae.py | /opensora/models/ae/videobase/causal_vqvae/modeling_causalvqvae.py | https://github.com/wilson1yan/VideoGPT | code from wilson1yan/VideoGPT | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/causal_vqvae/modeling_causalvqvae.py | /opensora/models/ae/videobase/causal_vqvae/modeling_causalvqvae.py | https://github.com/wilson1yan/VideoGPT | code from wilson1yan/VideoGPT | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/causal_vqvae/modeling_causalvqvae.py | /opensora/models/ae/videobase/causal_vqvae/modeling_causalvqvae.py | https://github.com/wilson1yan/VideoGPT | code from wilson1yan/VideoGPT | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/causal_vqvae/modeling_causalvqvae.py | /opensora/models/ae/videobase/causal_vqvae/modeling_causalvqvae.py | https://github.com/wilson1yan/VideoGPT | code from wilson1yan/VideoGPT | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/causal_vqvae/modeling_causalvqvae.py | /opensora/models/ae/videobase/causal_vqvae/modeling_causalvqvae.py | https://github.com/wilson1yan/VideoGPT | code from wilson1yan/VideoGPT | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/causal_vqvae/modeling_causalvqvae.py | /opensora/models/ae/videobase/causal_vqvae/modeling_causalvqvae.py | https://github.com/wilson1yan/VideoGPT | code from wilson1yan/VideoGPT | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/causal_vqvae/modeling_causalvqvae.py | /opensora/models/ae/videobase/causal_vqvae/modeling_causalvqvae.py | https://github.com/wilson1yan/VideoGPT | code from wilson1yan/VideoGPT | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/causal_vqvae/modeling_causalvqvae.py | /opensora/models/ae/videobase/causal_vqvae/modeling_causalvqvae.py | https://github.com/wilson1yan/VideoGPT | code from wilson1yan/VideoGPT | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/causal_vqvae/modeling_causalvqvae.py | /opensora/models/ae/videobase/causal_vqvae/modeling_causalvqvae.py | https://arxiv.org/abs/1904.10509 | paper | -| 开源代码引入 | 
https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/causal_vqvae/modeling_causalvqvae.py | /opensora/models/ae/videobase/causal_vqvae/modeling_causalvqvae.py | https://github.com/wilson1yan/VideoGPT | code from wilson1yan/VideoGPT | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/losses/discriminator.py | /opensora/models/ae/videobase/losses/discriminator.py | https://github.com/junyanz/pytorch-CycleGAN-and-pix2pix/blob/master/models/networks.py | code from junyanz/pytorch-CycleGAN-and-pix2pix | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/losses/lpips.py | /opensora/models/ae/videobase/losses/lpips.py | https://github.com/richzhang/PerceptualSimilarity/tree/master/models | code from richzhang/PerceptualSimilarity | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/losses/perceptual_loss.py | /opensora/models/ae/videobase/losses/perceptual_loss.py | https://github.com/karpathy/deep-vector-quantization/blob/main/model.py | code from karpathy/deep-vector-quantization | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/modules/attention.py | /opensora/models/ae/videobase/modules/attention.py | https://github.com/PKU-YuanGroup/Open-Sora-Plan/pull/172 | code from PKU-YuanGroup/Open-Sora-Plan | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/vqvae/modeling_vqvae.py | /opensora/models/ae/videobase/vqvae/modeling_vqvae.py | https://github.com/wilson1yan/VideoGPT | code from wilson1yan/VideoGPT | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/vqvae/modeling_vqvae.py | /opensora/models/ae/videobase/vqvae/modeling_vqvae.py | https://github.com/wilson1yan/VideoGPT | code from wilson1yan/VideoGPT | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/vqvae/modeling_vqvae.py | /opensora/models/ae/videobase/vqvae/modeling_vqvae.py | https://github.com/wilson1yan/VideoGPT | code from wilson1yan/VideoGPT | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/vqvae/modeling_vqvae.py | /opensora/models/ae/videobase/vqvae/modeling_vqvae.py | https://github.com/wilson1yan/VideoGPT | code from wilson1yan/VideoGPT | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/vqvae/modeling_vqvae.py | /opensora/models/ae/videobase/vqvae/modeling_vqvae.py | https://github.com/wilson1yan/VideoGPT | code from wilson1yan/VideoGPT | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/vqvae/modeling_vqvae.py | /opensora/models/ae/videobase/vqvae/modeling_vqvae.py | https://github.com/wilson1yan/VideoGPT | code from wilson1yan/VideoGPT | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/vqvae/modeling_vqvae.py | /opensora/models/ae/videobase/vqvae/modeling_vqvae.py | https://github.com/wilson1yan/VideoGPT | code from wilson1yan/VideoGPT | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/vqvae/modeling_vqvae.py | /opensora/models/ae/videobase/vqvae/modeling_vqvae.py | https://github.com/wilson1yan/VideoGPT | code from wilson1yan/VideoGPT | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/vqvae/modeling_vqvae.py | /opensora/models/ae/videobase/vqvae/modeling_vqvae.py | 
https://github.com/wilson1yan/VideoGPT | code from wilson1yan/VideoGPT | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/vqvae/modeling_vqvae.py | /opensora/models/ae/videobase/vqvae/modeling_vqvae.py | https://github.com/wilson1yan/VideoGPT | code from wilson1yan/VideoGPT | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/vqvae/modeling_vqvae.py | /opensora/models/ae/videobase/vqvae/modeling_vqvae.py | https://github.com/wilson1yan/VideoGPT | code from wilson1yan/VideoGPT | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/vqvae/modeling_vqvae.py | /opensora/models/ae/videobase/vqvae/modeling_vqvae.py | https://github.com/wilson1yan/VideoGPT | code from wilson1yan/VideoGPT | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/vqvae/modeling_vqvae.py | /opensora/models/ae/videobase/vqvae/modeling_vqvae.py | https://github.com/wilson1yan/VideoGPT | code from wilson1yan/VideoGPT | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/vqvae/modeling_vqvae.py | /opensora/models/ae/videobase/vqvae/modeling_vqvae.py | https://github.com/wilson1yan/VideoGPT | code from wilson1yan/VideoGPT | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/vqvae/modeling_vqvae.py | /opensora/models/ae/videobase/vqvae/modeling_vqvae.py | https://arxiv.org/abs/1904.10509 | paper | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/vqvae/modeling_vqvae.py | /opensora/models/ae/videobase/vqvae/modeling_vqvae.py | https://github.com/wilson1yan/VideoGPT | code from wilson1yan/VideoGPT | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/ae/videobase/vqvae/modeling_vqvae.py | /opensora/models/ae/videobase/vqvae/modeling_vqvae.py | https://github.com/wilson1yan/VideoGPT | code from wilson1yan/VideoGPT | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/captioner/caption_refiner/demo_for_refiner.py | /opensora/models/captioner/caption_refiner/demo_for_refiner.py | https://one-api.bltcy.top/v1 | openai api | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/diffusion/diffusion_utils.py | /opensora/models/diffusion/diffusion/diffusion_utils.py | https://github.com/openai/glide-text2im/blob/main/glide_text2im/gaussian_diffusion.py | code from openai/glide-text2im | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/diffusion/diffusion_utils.py | /opensora/models/diffusion/diffusion/diffusion_utils.py | https://github.com/openai/guided-diffusion/blob/main/guided_diffusion | code from openai/guided-diffusion | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/diffusion/diffusion_utils.py | /opensora/models/diffusion/diffusion/diffusion_utils.py | https://github.com/openai/improved-diffusion/blob/main/improved_diffusion/gaussian_diffusion.py | code from openai/improved-diffusion | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/diffusion/gaussian_diffusion.py | /opensora/models/diffusion/diffusion/gaussian_diffusion.py | https://github.com/openai/glide-text2im/blob/main/glide_text2im/gaussian_diffusion.py | code from openai/glide-text2im | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/diffusion/gaussian_diffusion.py | 
/opensora/models/diffusion/diffusion/gaussian_diffusion.py | https://github.com/openai/guided-diffusion/blob/main/guided_diffusion | code from openai/guided-diffusion | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/diffusion/gaussian_diffusion.py | /opensora/models/diffusion/diffusion/gaussian_diffusion.py | https://github.com/openai/improved-diffusion/blob/main/improved_diffusion/gaussian_diffusion.py | code from openai/improved-diffusion | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/diffusion/gaussian_diffusion.py | /opensora/models/diffusion/diffusion/gaussian_diffusion.py | https://github.com/hojonathanho/diffusion/blob/1e0dceb3b3495bbe19116a5e1b3596cd0706c543/diffusion_tf/diffusion_utils_2.py | code from hojonathanho/diffusion | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/diffusion/gaussian_diffusion_t2v.py | /opensora/models/diffusion/diffusion/gaussian_diffusion_t2v.py | https://github.com/openai/glide-text2im/blob/main/glide_text2im/gaussian_diffusion.py | code from openai/glide-text2im | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/diffusion/gaussian_diffusion_t2v.py | /opensora/models/diffusion/diffusion/gaussian_diffusion_t2v.py | https://github.com/openai/guided-diffusion/blob/main/guided_diffusion | code from openai/guided-diffusion | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/diffusion/gaussian_diffusion_t2v.py | /opensora/models/diffusion/diffusion/gaussian_diffusion_t2v.py | https://github.com/openai/improved-diffusion/blob/main/improved_diffusion/gaussian_diffusion.py | code from openai/improved-diffusion | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/diffusion/gaussian_diffusion_t2v.py | /opensora/models/diffusion/diffusion/gaussian_diffusion_t2v.py | https://github.com/hojonathanho/diffusion/blob/1e0dceb3b3495bbe19116a5e1b3596cd0706c543/diffusion_tf/diffusion_utils_2.py | code from hojonathanho/diffusion | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/diffusion/respace.py | /opensora/models/diffusion/diffusion/respace.py | https://github.com/openai/glide-text2im/blob/main/glide_text2im/gaussian_diffusion.py | code from openai/glide-text2im | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/diffusion/respace.py | /opensora/models/diffusion/diffusion/respace.py | https://github.com/openai/guided-diffusion/blob/main/guided_diffusion | code from openai/guided-diffusion | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/diffusion/respace.py | /opensora/models/diffusion/diffusion/respace.py | https://github.com/openai/improved-diffusion/blob/main/improved_diffusion/gaussian_diffusion.py | code from openai/improved-diffusion | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/diffusion/timestep_sampler.py | /opensora/models/diffusion/diffusion/timestep_sampler.py | https://github.com/openai/glide-text2im/blob/main/glide_text2im/gaussian_diffusion.py | code from openai/glide-text2im | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/diffusion/timestep_sampler.py | /opensora/models/diffusion/diffusion/timestep_sampler.py | https://github.com/openai/guided-diffusion/blob/main/guided_diffusion | code from openai/guided-diffusion | -| 开源代码引入 | 
https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/diffusion/timestep_sampler.py | /opensora/models/diffusion/diffusion/timestep_sampler.py | https://github.com/openai/improved-diffusion/blob/main/improved_diffusion/gaussian_diffusion.py | code from openai/improved-diffusion | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/diffusion/__init__.py | /opensora/models/diffusion/diffusion/__init__.py | https://github.com/openai/glide-text2im/blob/main/glide_text2im/gaussian_diffusion.py | code from openai/glide-text2im | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/diffusion/__init__.py | /opensora/models/diffusion/diffusion/__init__.py | https://github.com/openai/guided-diffusion/blob/main/guided_diffusion | code from openai/guided-diffusion | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/diffusion/__init__.py | /opensora/models/diffusion/diffusion/__init__.py | https://github.com/openai/improved-diffusion/blob/main/improved_diffusion/gaussian_diffusion.py | code from openai/improved-diffusion | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/latte/modeling_latte.py | /opensora/models/diffusion/latte/modeling_latte.py | https://github.com/huggingface/diffusers/blob/main/src/diffusers/models/attention_processor.py | code from huggingface/diffusers | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/latte/modules.py | /opensora/models/diffusion/latte/modules.py | https://github.com/PixArt-alpha/PixArt-alpha/blob/0f55e922376d8b797edd44d25d0e7464b260dcab/diffusion/model/nets/PixArtMS.py | code from PixArt-alpha/PixArt-alpha | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/latte/modules.py | /opensora/models/diffusion/latte/modules.py | https://github.com/PixArt-alpha/PixArt-alpha/blob/master/diffusion/model/nets/PixArt_blocks.py | code from PixArt-alpha/PixArt-alpha | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/latte/modules.py | /opensora/models/diffusion/latte/modules.py | https://github.com/PixArt-alpha/PixArt-alpha/blob/0f55e922376d8b797edd44d25d0e7464b260dcab/diffusion/model/nets/PixArtMS.py | code from PixArt-alpha/PixArt-alpha | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/latte/modules.py | /opensora/models/diffusion/latte/modules.py | https://github.com/PixArt-alpha/PixArt-alpha/blob/0f55e922376d8b797edd44d25d0e7464b260dcab/diffusion/model/nets/PixArtMS.py | code from PixArt-alpha/PixArt-alpha | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/latte/modules.py | /opensora/models/diffusion/latte/modules.py | https://github.com/facebookresearch/xformers | code from facebookresearch/xformers | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/latte/modules.py | /opensora/models/diffusion/latte/modules.py | https://github.com/PixArt-alpha/PixArt-alpha/blob/0f55e922376d8b797edd44d25d0e7464b260dcab/diffusion/model/nets/PixArtMS.py | code from PixArt-alpha/PixArt-alpha | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/latte/modules.py | /opensora/models/diffusion/latte/modules.py | https://github.com/PixArt-alpha/PixArt-alpha/blob/0f55e922376d8b797edd44d25d0e7464b260dcab/diffusion/model/nets/PixArtMS.py | code from PixArt-alpha/PixArt-alpha | -| 开源代码引入 | 
https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/latte/modules.py | /opensora/models/diffusion/latte/modules.py | https://arxiv.org/abs/2310.00426 | paper | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/utils/pos_embed.py | /opensora/models/diffusion/utils/pos_embed.py | https://github.com/naver/croco | code from naver/croco | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/utils/pos_embed.py | /opensora/models/diffusion/utils/pos_embed.py | https://github.com/huggingface/diffusers | code from huggingface/diffusers | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/utils/pos_embed.py | /opensora/models/diffusion/utils/pos_embed.py | https://github.com/huggingface/transformers/blob/main/src/transformers/models/llama/modeling_llama.py | code from huggingface/transformers | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/diffusion/utils/pos_embed.py | /opensora/models/diffusion/utils/pos_embed.py | https://github.com/huggingface/transformers/blob/main/src/transformers/models/llama/modeling_llama.py | code from huggingface/transformers | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/frame_interpolation/interpolation.py | /opensora/models/frame_interpolation/interpolation.py | https://github.com/MCG-NKU/AMT/blob/main/demos/demo_2x.py | code from MCG-NKU/AMT | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/frame_interpolation/utils/flow_utils.py | /opensora/models/frame_interpolation/utils/flow_utils.py | http://vision.middlebury.edu/flow/flowEval-iccv07.pdf | paper A Database and Evaluation Methodology for Optical Flow | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/super_resolution/basicsr/archs/arch_util.py | /opensora/models/super_resolution/basicsr/archs/arch_util.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/layers/weight_init.py | code from rwightman/pytorch-image-models | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/super_resolution/basicsr/archs/arch_util.py | /opensora/models/super_resolution/basicsr/archs/arch_util.py | https://people.sc.fsu.edu/ | paper | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/super_resolution/basicsr/archs/arch_util.py | /opensora/models/super_resolution/basicsr/archs/arch_util.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/layers/weight_init.py | code from rwightman/pytorch-image-models | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/super_resolution/basicsr/archs/rgt_arch.py | /opensora/models/super_resolution/basicsr/archs/rgt_arch.py | https://github.com/cheerss/CrossFormer/blob/main/models/crossformer.py | code from cheerss/CrossFormer | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/super_resolution/basicsr/archs/rgt_arch.py | /opensora/models/super_resolution/basicsr/archs/rgt_arch.py | https://github.com/zhengchen1999/CAT/blob/main/basicsr/archs/cat_arch.py | code from zhengchen1999/CAT | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/super_resolution/basicsr/archs/rgt_arch.py | /opensora/models/super_resolution/basicsr/archs/rgt_arch.py | https://github.com/microsoft/Swin-Transformer/blob/main/models/swin_transformer.py | code from microsoft/Swin-Transformer | -| 开源代码引入 | 
https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/super_resolution/basicsr/data/data_util.py | /opensora/models/super_resolution/basicsr/data/data_util.py | https://lmdb.readthedocs.io/en/release/ | introduce imdb | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/super_resolution/basicsr/data/prefetch_dataloader.py | /opensora/models/super_resolution/basicsr/data/prefetch_dataloader.py | https://stackoverflow.com/questions/7323664/python-generator-pre-fetch | code from Winston Ewert | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/super_resolution/basicsr/data/prefetch_dataloader.py | /opensora/models/super_resolution/basicsr/data/prefetch_dataloader.py | https://github.com/IgorSusmelj/pytorch-styleguide/issues/5 | code from IgorSusmelj/pytorch-styleguide | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/super_resolution/basicsr/data/prefetch_dataloader.py | /opensora/models/super_resolution/basicsr/data/prefetch_dataloader.py | https://github.com/NVIDIA/apex/issues/304 | code from NVIDIA/apex | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/super_resolution/basicsr/metrics/psnr_ssim.py | /opensora/models/super_resolution/basicsr/metrics/psnr_ssim.py | https://en.wikipedia.org/wiki/Peak_signal-to-noise_ratio | wiki for Peak_signal-to-noise_ratio | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/super_resolution/basicsr/metrics/psnr_ssim.py | /opensora/models/super_resolution/basicsr/metrics/psnr_ssim.py | https://ece.uwaterloo.ca/~z70wang/research/ssim/ | introduce ssim | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/super_resolution/basicsr/utils/dist_util.py | /opensora/models/super_resolution/basicsr/utils/dist_util.py | https://github.com/open-mmlab/mmcv/blob/master/mmcv/runner/dist_utils.py | code from open-mmlab/mmcv | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/super_resolution/basicsr/utils/file_client.py | /opensora/models/super_resolution/basicsr/utils/file_client.py | https://github.com/open-mmlab/mmcv/blob/master/mmcv/fileio/file_client.py | code from open-mmlab/mmcv | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/super_resolution/basicsr/utils/matlab_functions.py | /opensora/models/super_resolution/basicsr/utils/matlab_functions.py | https://en.wikipedia.org/wiki/YCbCr | wiki for YCbCr | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/super_resolution/basicsr/utils/matlab_functions.py | /opensora/models/super_resolution/basicsr/utils/matlab_functions.py | https://en.wikipedia.org/wiki/YCbCr | wiki for YCbCr | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/super_resolution/basicsr/utils/matlab_functions.py | /opensora/models/super_resolution/basicsr/utils/matlab_functions.py | https://en.wikipedia.org/wiki/YCbCr | wiki for YCbCr | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/super_resolution/basicsr/utils/matlab_functions.py | /opensora/models/super_resolution/basicsr/utils/matlab_functions.py | https://en.wikipedia.org/wiki/YCbCr | wiki for YCbCr | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/super_resolution/basicsr/utils/matlab_functions.py | /opensora/models/super_resolution/basicsr/utils/matlab_functions.py | https://en.wikipedia.org/wiki/YCbCr | wiki for YCbCr | -| 开源代码引入 | 
https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/super_resolution/basicsr/utils/matlab_functions.py | /opensora/models/super_resolution/basicsr/utils/matlab_functions.py | https://en.wikipedia.org/wiki/YCbCr | wiki for YCbCr | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/super_resolution/basicsr/utils/matlab_functions.py | /opensora/models/super_resolution/basicsr/utils/matlab_functions.py | https://en.wikipedia.org/wiki/YCbCr | wiki for YCbCr | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/super_resolution/basicsr/utils/matlab_functions.py | /opensora/models/super_resolution/basicsr/utils/matlab_functions.py | https://en.wikipedia.org/wiki/YCbCr | wiki for YCbCr | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/models/super_resolution/basicsr/utils/registry.py | /opensora/models/super_resolution/basicsr/utils/registry.py | https://github.com/facebookresearch/fvcore/blob/master/fvcore/common/registry.py | code from facebookresearch/fvcore | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/sample/pipeline_videogen.py | /opensora/sample/pipeline_videogen.py | http://www.apache.org/licenses/LICENSE-2.0 | license | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/sample/pipeline_videogen.py | /opensora/sample/pipeline_videogen.py | https://huggingface.co/docs/transformers/model_doc/t5 | introduce t5 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/sample/pipeline_videogen.py | /opensora/sample/pipeline_videogen.py | https://huggingface.co/PixArt-alpha/PixArt-alpha/tree/main/t5-v1_1-xxl | download weights | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/sample/pipeline_videogen.py | /opensora/sample/pipeline_videogen.py | https://huggingface.co/docs/transformers/model_doc/t5 | introduce t5 | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/sample/pipeline_videogen.py | /opensora/sample/pipeline_videogen.py | https://github.com/PixArt-alpha/PixArt-alpha/blob/master/diffusion/model/utils.py | code from PixArt-alpha/PixArt-alpha | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/sample/pipeline_videogen.py | /opensora/sample/pipeline_videogen.py | https://arxiv.org/abs/2010.02502 | paper | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/sample/pipeline_videogen.py | /opensora/sample/pipeline_videogen.py | https://arxiv.org/abs/2207.12598 | paper | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/sample/pipeline_videogen.py | /opensora/sample/pipeline_videogen.py | https://arxiv.org/pdf/2205.11487.pdf | paper | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/sample/pipeline_videogen.py | /opensora/sample/pipeline_videogen.py | https://arxiv.org/abs/2010.02502 | paper | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/sample/pipeline_videogen.py | /opensora/sample/pipeline_videogen.py | https://pytorch.org/docs/stable/generated/torch.Generator.html | introduce torch.Generator | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/sample/pipeline_videogen.py | /opensora/sample/pipeline_videogen.py | https://pillow.readthedocs.io/en/stable/ | introduce pillow | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/sample/pipeline_videogen.py | /opensora/sample/pipeline_videogen.py | https://arxiv.org/pdf/2205.11487.pdf | paper | -| 开源代码引入 | 
https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/sample/transport_sample.py | /opensora/sample/transport_sample.py | https://github.com/rtqichen/torchdiffeq | code from rtqichen/torchdiffeq | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/serve/gradio_utils.py | /opensora/serve/gradio_utils.py | https://www.pnglog.com/AOuPMh.png | web server image | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/serve/gradio_utils.py | /opensora/serve/gradio_utils.py | https://github.com/PKU-YuanGroup/Open-Sora-Plan | code from PKU-YuanGroup/Open-Sora-Plan | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/serve/gradio_utils.py | /opensora/serve/gradio_utils.py | https://github.com/PKU-YuanGroup/Open-Sora-Plan | code from PKU-YuanGroup/Open-Sora-Plan | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/serve/gradio_utils.py | /opensora/serve/gradio_utils.py | https://github.com/PKU-YuanGroup/Open-Sora-Plan | code from PKU-YuanGroup/Open-Sora-Plan | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/serve/gradio_utils.py | /opensora/serve/gradio_utils.py | https://img.shields.io/badge/Github-Code-blue | web server image | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/serve/gradio_utils.py | /opensora/serve/gradio_utils.py | https://github.com/PKU-YuanGroup/Open-Sora-Plan/stargazers | code from PKU-YuanGroup/Open-Sora-Plan | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/serve/gradio_utils.py | /opensora/serve/gradio_utils.py | https://img.shields.io/github/stars/PKU-YuanGroup/Open-Sora-Plan.svg?style=social | web server image | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/train/train_t2v.py | /opensora/train/train_t2v.py | https://pytorch.org/docs/stable/notes/cuda.html | introduce TF32 on Ampere GPUs | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/train/train_t2v.py | /opensora/train/train_t2v.py | https://arxiv.org/abs/2303.09556 | paper | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/train/train_t2v.py | /opensora/train/train_t2v.py | https://pytorch.org/docs/stable/notes/cuda.html | introduce cuda | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/train/train_t2v.py | /opensora/train/train_t2v.py | https://www.tensorflow.org/tensorboard | introduce TF32 on Ampere GPUs | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/utils/dataset_utils.py | /opensora/utils/dataset_utils.py | https://github.com/dmlc/decord | code from dmlc/decord | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/utils/taming_download.py | /opensora/utils/taming_download.py | https://github.com/CompVis/taming-transformers.git | code from CompVis/taming-transformers.git | -| 开源代码引入 | https://github.com/PKU-YuanGroup/Open-Sora-Plan/opensora/utils/taming_download.py | /opensora/utils/taming_download.py | https://heibox.uni-heidelberg.de/f/607503859c864bc1b30b/?dl=1 | download weights | +| 文件位置 | 公网地址 | 公网地址用途 | +|--------------------------------------------------------------------------------------------------------------|--------------------------------------------------------------------------------------------------------------|--------------------------| +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSoraPlan1.1/opensora/eval/flolpips/pwcnet.py | http://content.sniklaus.com/github/pytorch-pwc/network- | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSoraPlan1.1/opensora/eval/fvd/styleganv/fvd.py | https://www.dropbox.com/s/ge9e5ujwgetktms/i3d_torchscript.pt | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSoraPlan1.1/opensora/eval/fvd/videogpt/fvd.py | https://onedrive.live.com/download?cid=78EEF3EB6AE7DBCB&resid=78EEF3EB6AE7DBCB%21199&authkey=AApKdFHPXzWLNyI | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSoraPlan1.1/opensora/models/frame_interpolation/utils/flow_utils.py | http://vision.middlebury.edu/flow/flowEval-iccv07.pdf | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSoraPlan1.1/opensora/serve/gradio_utils.py | https://www.pnglog.com/AOuPMh.png | 图片示例 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSoraPlan1.1/opensora/train/train_t2v.py | https://www.tensorflow.org/tensorboard | tensorboard地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSoraPlan1.1/opensora/train/train_t2v.py | https://arxiv.org/abs/2303.09556. | SNR weighting gamma论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSoraPlan1.1/opensora/train/train_t2v.py | https://pytorch.org/docs/stable/notes/cuda.html#tensorfloat-32-tf32-on-ampere-devices | 设置说明 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/OpenSoraPlan1.1/opensora/utils/taming_download.py | https://heibox.uni-heidelberg.de/f/607503859c864bc1b30b/?dl=1 | 论文地址 | \ No newline at end of file diff --git a/PyTorch/built-in/mm/PLLaVA/public_address_statement.md b/PyTorch/built-in/mm/PLLaVA/public_address_statement.md index 659095be2760ed204ef5a3d3231758880d499de6..6b777b428ba1ce1c75c2ad5ba859292d1cb1e0ac 100644 --- a/PyTorch/built-in/mm/PLLaVA/public_address_statement.md +++ b/PyTorch/built-in/mm/PLLaVA/public_address_statement.md @@ -1,10 +1,4 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ------- |-------------------------------------------------------------------------------------------------------------------------------------------------|-------------------------------------------------------------------------------------|------------------------|-----------------------------| -| 开源代码引入 | https://github.com/magic-research/PLLaVA/tree/main/utils/logger.py | pllava/utils/logger.py | https://github.com/facebookresearch/mmf/blob/master/mmf/utils/logger.py | 代码实现参考连接 | -| 开源代码引入 | https://github.com/magic-research/PLLaVA/tree/main/dataset/video_utils.py | pllava/dataset/video_utils.py | https://github.com/m-bain/frozen-in-time/blob/22a91d78405ec6032fdf521ae1ff5573358e632f/base/base_dataset.py | 代码实现参考连接 | -| 开源代码引入 | https://github.com/magic-research/PLLaVA/tree/main/dataset/video_utils.py | pllava/dataset/video_utils.py | https://github.com/facebookresearch/pytorchvideo/blob/main/pytorchvideo/data/utils.py#L54-L64 | 代码实现参考连接 | -| 开源代码引入 | https://github.com/magic-research/PLLaVA/tree/main/models/pllava/configuration_pllava.py | pllava/models/pllava/configuration_pllava.py | https://huggingface.co/llava-hf/llava-v1.5-7b/resolve/main/config.json | 数据集下载 | -| 开源代码引入 | https://github.com/magic-research/PLLaVA/tree/main/models/pllava/configuration_pllava.py | pllava/models/pllava/configuration_pllava.py | https://huggingface.co/llava-hf/llava-9b | 数据集下载 | -| 开源代码引入 | https://github.com/magic-research/PLLaVA/tree/main/models/pllava/modeling_pllava.py | pllava/models/pllava/modeling_pllava.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 代码实现参考连接 | -| 开源代码引入 | https://github.com/magic-research/PLLaVA/tree/main/models/pllava/modeling_pllava.py | pllava/models/pllava/modeling_pllava.py | https://arxiv.org/abs/1910.13461 | 代码实现参考连接 | -| 开源代码引入 | 
https://github.com/magic-research/PLLaVA/tree/main/utils/scheduler.py | pllava/utils/scheduler.py | https://github.com/huggingface/transformers/blob/v4.15.0/src/transformers/optimization.py | 代码实现参考连接 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|------------------------------------------------------------------------------|---------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/built-in/mm/PLLaVA/models/pllava/modeling_pllava.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/PLLaVA/models/pllava/modeling_pllava.py | https://arxiv.org/abs/1910.13461 | 论文地址 | \ No newline at end of file diff --git a/PyTorch/built-in/mm/Qwen-VL/public_address_statement.md b/PyTorch/built-in/mm/Qwen-VL/public_address_statement.md index b2a7a46fd8b0c5163e4ff8ccccfb56ed2b83f3a2..c2c84ffafed2604a857dba34e0ca3fc295fa6ecd 100644 --- a/PyTorch/built-in/mm/Qwen-VL/public_address_statement.md +++ b/PyTorch/built-in/mm/Qwen-VL/public_address_statement.md @@ -1,22 +1,6 @@ - -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱 | 用途说明 | -|:------:|:-------------------------:|:---------------------------------------------------------------------------------------------:|:--------------------:|:-----------------:| -| 开源代码引入 | https://github.com/QwenLM/Qwen-VL/blob/master/eval_mm/vqa_eval.py | ./eval_mm/vqa_eval.py | https://opensource.org/licenses/BSD-3-Clause | Modified BSD License | -| 开源代码引入 | https://github.com/QwenLM/Qwen-VL/blob/master/eval_mm/vqa_eval.py | ./eval_mm/vqa_eval.py | https://github.com/tylin/coco-caption/blob/master/pycocoevalcap/eval.py | written by Tsung-Yi Lin for MSCOCO Python API | -| 开源代码引入 | https://github.com/QwenLM/Qwen-VL/blob/master/web_demo_mm.py | ./web_demo_mm.py | https://modelscope.cn/api/v1/models/qwen/Qwen-7B-Chat/repo | Qwen-7B-Chat modelscope地址 | -| 开源代码引入 | https://github.com/QwenLM/Qwen-VL/blob/master/web_demo_mm.py | ./web_demo_mm.py | https://modelscope.cn/models/qwen/Qwen-VL/summary | Qwen-VL modelscope地址 | -| 开源代码引入 | https://github.com/QwenLM/Qwen-VL/blob/master/web_demo_mm.py | ./web_demo_mm.py | https://huggingface.co/Qwen/Qwen-VL | Qwen-VL modelscope地址 | -| 开源代码引入 | https://github.com/QwenLM/Qwen-VL/blob/master/web_demo_mm.py | ./web_demo_mm.py | https://modelscope.cn/models/qwen/Qwen-VL-Chat/summary | Qwen-VL-CHat modelscope地址 | -| 开源代码引入 | https://github.com/QwenLM/Qwen-VL/blob/master/web_demo_mm.py | ./web_demo_mm.py | https://huggingface.co/Qwen/Qwen-VL-Chat | Qwen-VL-Chat huggingface地址 | -| 开源代码引入 | https://github.com/QwenLM/Qwen-VL/blob/master/web_demo_mm.py | ./web_demo_mm.py | https://github.com/QwenLM/Qwen-VL | QwenLM 官方仓库地址 | -| 开源代码引入 | https://github.com/QwenLM/Qwen-VL/blob/master/eval_mm/seed_bench/trans.py | ./eval_mm/seed_bench/trans.py | https://huggingface.co/datasets/AILab-CVC/SEED-Bench/blob/main/SEED-Bench.json | a large-scale benchmark | -| 开源代码引入 | https://github.com/QwenLM/Qwen-VL/blob/master/eval_mm/seed_bench/trans.py | ./eval_mm/seed_bench/trans.py | https://github.com/AILab-CVC/SEED-Bench/blob/main/DATASET.md | readme for a large-scale benchmark | -| 开源代码引入 | https://github.com/QwenLM/Qwen-VL/blob/master/eval_mm/vqa.py | ./eval_mm/vqa.py | https://github.com/pdollar/coco/blob/master/PythonAPI/pycocotools/coco.py | written by Tsung-Yi Lin for MSCOCO Python API | -| 开源代码引入 | https://github.com/QwenLM/Qwen-VL/blob/master/eval_mm/vqa.py | ./eval_mm/vqa.py | https://opensource.org/licenses/BSD-3-Clause | Modified BSD License | -| 开源代码引入 | 
https://github.com/QwenLM/Qwen-VL/blob/master/eval_mm/evaluate_vqa.py | ./eval_mm/evaluate_vqa.py | https://github.com/google-research/pix2struct/blob/main/pix2struct/metrics.py#L81 | source code of google-research/pix2struct | -| 开源代码引入 | https://github.com/QwenLM/Qwen-VL/blob/master/eval_mm/evaluate_vqa.py | ./eval_mm/evaluate_vqa.py | https://arxiv.org/pdf/2203.10244.pdf | ChartQA: A Benchmark for Question Answering about Charts with Visual and Logical Reasoning | -| 开源代码引入 | https://github.com/QwenLM/Qwen-VL/blob/master/eval_mm/infographicsvqa_eval.py | ./eval_mm/infographicsvqa_eval.py | https://www.docvqa.org/datasets/infographicvqa | InfographicVQA dataset (2021 Challenge, task 3 dataset) | -| 开源代码引入 | https://github.com/QwenLM/Qwen-VL/blob/master/eval_mm/infographicsvqa_eval.py | ./eval_mm/infographicsvqa_eval.py | https://rrc.cvc.uab.es/?ch=17&com=introduction | Overview - Document Visual Question Answerin | -| 开源代码引入 | https://github.com/QwenLM/Qwen-VL/blob/master/openai_api.py | ./openai_api.py | https://platform.openai.com/docs/api-reference/chat | openAI api reference | -| 开源代码引入 | https://www.modelscope.cn/models/qwen/Qwen-VL-Chat/file/view/master?fileName=visual.py&status=1 | ./models/visual.py | https://github.com/facebookresearch/mae/blob/efb2a8062c206524e35e47d04501ed4f544c0ae8/util/pos_embed.py#L20 | source code of facebookresearch/mae | -|开源代码引入| https://github.com/QwenLM/Qwen-VL/blob/master/README.md | ./infer.py | https://qianwen-res.oss-cn-beijing.aliyuncs.com/Qwen-VL/assets/demo.jpeg | demo image for inference | +| 文件位置 | 公网地址 | 公网地址用途 | +|-------------------------------------------------------------|--------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/built-in/mm/Qwen-VL/infer.py | https://qianwen-res.oss-cn-beijing.aliyuncs.com/Qwen-VL/assets/demo.jpeg | 图片示例 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/Qwen-VL/web_demo_mm.py | https://modelscope.cn/models/qwen/Qwen-VL-Chat/summary | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/Qwen-VL/web_demo_mm.py | https://modelscope.cn/api/v1/models/qwen/Qwen-7B-Chat/repo? 
| 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/mm/Qwen-VL/web_demo_mm.py | https://modelscope.cn/models/qwen/Qwen-VL/summary | 模型地址 | \ No newline at end of file diff --git a/PyTorch/built-in/mm/VisualGLM/public_address_statement.md b/PyTorch/built-in/mm/VisualGLM/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..e933e84d3746e3b0c53a2b5807ce936af4a5a15b --- /dev/null +++ b/PyTorch/built-in/mm/VisualGLM/public_address_statement.md @@ -0,0 +1,3 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|-------------------------------------------------------------------|----------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/built-in/mm/VisualGLM/model/visualglm.py | https://cloud.tsinghua.edu.cn/d/dd80f9d39d454bc29ce4/files/?p=%2Fvisualglm-6b.zip&dl=1 | 权重地址 | \ No newline at end of file diff --git a/PyTorch/built-in/nlp/Bert-CRF_for_PyTorch/public_address_statement.md b/PyTorch/built-in/nlp/Bert-CRF_for_PyTorch/public_address_statement.md index 0953e1185464a4829ae46b3025729c4810094b9a..13f40835c1d953842b8b720dd88cd9d076b63272 100644 --- a/PyTorch/built-in/nlp/Bert-CRF_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/nlp/Bert-CRF_for_PyTorch/public_address_statement.md @@ -1,136 +1,22 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ---- | ------------ | ------ | ------------------------------------ | -------- | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/sentence_classfication/Sohu_2022_ABSA/README.md | Bert-CRF_for_PyTorch/examples/sentence_classfication/Sohu_2022_ABSA/top1/training.py | https://zhuanlan.zhihu.com/p/533808475 | 模型相关说明 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/sentence_classfication/Sohu_2022_ABSA/README.md | Bert-CRF_for_PyTorch/examples/sentence_classfication/Sohu_2022_ABSA/top1/training_bert.py | https://zhuanlan.zhihu.com/p/533808475 | 模型相关说明 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/serving/sanic_server/src/utils/configs.py | Bert-CRF_for_PyTorch/examples/serving/sanic_server/src/utils/configs.py | https://stackoverflow.com/a/52187065/3429596 | 模型相关说明 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/bert4torch/activations.py | Bert-CRF_for_PyTorch/bert4torch/activations.py | https://arxiv.org/abs/1606.08415 | 参考论文地址 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/bert4torch/activations.py | Bert-CRF_for_PyTorch/bert4torch/activations.py | https://arxiv.org/abs/1702.03118 | 参考论文地址 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/bert4torch/activations.py | Bert-CRF_for_PyTorch/bert4torch/activations.py | https://arxiv.org/abs/1710.05941v1 | 参考论文地址 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/bert4torch/activations.py | Bert-CRF_for_PyTorch/bert4torch/activations.py | https://arxiv.org/abs/1908.08681 | 参考论文地址 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/bert4torch/activations.py | Bert-CRF_for_PyTorch/bert4torch/activations.py | https://github.com/digantamisra98/Mish | 源码实现 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/bert4torch/losses.py | Bert-CRF_for_PyTorch/bert4torch/losses.py | https://kexue.fm/archives/7359 | 模型相关说明 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/bert4torch/losses.py | Bert-CRF_for_PyTorch/bert4torch/losses.py | https://www.sbert.net/docs/package_reference/losses.html | 模型相关说明 | -| 开源代码引入 | 
https://github.com/Tongjilibo/bert4torch/blob/master/examples/README.md | Bert-CRF_for_PyTorch/bert4torch/losses.py | https://github.com/dropreg/R-Drop | 源码实现 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/README.md | Bert-CRF_for_PyTorch/bert4torch/losses.py | https://arxiv.org/abs/1904.12848 | 参考论文地址 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/bert4torch/layers/attention.py | Bert-CRF_for_PyTorch/bert4torch/layers.py | https://arxiv.org/abs/2202.10447 | 参考论文地址 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/README.md | Bert-CRF_for_PyTorch/bert4torch/losses.py | https://github.com/s-laine/tempens | 源码实现 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/README.md | Bert-CRF_for_PyTorch/bert4torch/losses.py | https://github.com/ferretj/temporal-ensembling | 源码实现 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/bert4torch/layers/attention.py | Bert-CRF_for_PyTorch/bert4torch/layers.py | https://kexue.fm/archives/8823 | 模型相关说明 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/bert4torch/layers/transformer_block.py | Bert-CRF_for_PyTorch/bert4torch/layers.py | https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/modeling_t5.py | 源码实现 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/bert4torch/models/base.py | Bert-CRF_for_PyTorch/bert4torch/models.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/bert4torch/models/base.py | Bert-CRF_for_PyTorch/bert4torch/models.py | https://arxiv.org/abs/1905.03197 | 参考论文地址 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/bert4torch/layers/position_encoding.py | Bert-CRF_for_PyTorch/bert4torch/layers.py | https://kexue.fm/archives/8265 | 模型相关说明 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/bert4torch/layers/crf.py | Bert-CRF_for_PyTorch/bert4torch/layers.py | https://github.com/lonePatient/BERT-NER-Pytorch/blob/master/models/layers/crf.py | 源码实现 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/bert4torch/models/bert.py | Bert-CRF_for_PyTorch/bert4torch/models.py | https://github.com/huggingface/transformers/issues/3936 | 模型相关说明 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/bert4torch/models/bert.py | Bert-CRF_for_PyTorch/bert4torch/models.py | https://github.com/huggingface/transformers/issues/4189 | 模型相关说明 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/bert4torch/tokenizers.py | Bert-CRF_for_PyTorch/bert4torch/tokenizers.py | https://en.wikipedia.org/wiki/CJK_Unified_Ideographs_ | 模型相关说明 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/bert4torch/layers/position_encoding.py | Bert-CRF_for_PyTorch/bert4torch/models.py | https://kexue.fm/archives/8265 | 模型相关说明 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/bert4torch/layers/global_point.py | Bert-CRF_for_PyTorch/bert4torch/layers.py | https://kexue.fm/archives/8373 | 模型相关说明 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/bert4torch/layers/global_point.py | Bert-CRF_for_PyTorch/bert4torch/layers.py | https://kexue.fm/archives/8877 | 模型相关说明 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/README.md | Bert-CRF_for_PyTorch/bert4torch/models.py | https://github.com/PaddlePaddle/ERNIE | 源码实现 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/bert4torch/callbacks.py | 
Bert-CRF_for_PyTorch/bert4torch/snippets.py | https://github.com/namisan/mt-dnn/blob/v0.2/alum/adv_masked_lm.py | 源码实现 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/serving/elasticsearch/README.md | Bert-CRF_for_PyTorch/bert4torch/snippets.py | http://127.0.0.1 | 模型相关说明 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/README.md | Bert-CRF_for_PyTorch/bert4torch/models.py | https://github.com/imcaspar/gpt2-ml | 源码实现 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/bert4torch/models/transformer_xl.py | Bert-CRF_for_PyTorch/bert4torch/models.py | https://github.com/kimiyoung/transformer-xl | 源码实现 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/README.md | Bert-CRF_for_PyTorch/examples/basic/basic_language_model_CDial_GPT.py | https://github.com/thu-coai/CDial-GPT | 源码实现 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/README.md | Bert-CRF_for_PyTorch/examples/basic/basic_language_model_cpm_lm.py | https://github.com/TsinghuaAI/CPM-Generate | 源码实现 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/README.md | Bert-CRF_for_PyTorch/examples/basic/basic_language_model_gpt2_ml.py | https://github.com/imcaspar/gpt2-ml | 源码实现 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/basic/basic_language_model_nezha_gen_gpt.py | Bert-CRF_for_PyTorch/examples/basic/basic_language_model_nezha_gen_gpt.py | https://pan.baidu.com/s/1-FB0yl1uxYDCGIRvU1XNzQ | 模型相关说明 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/README.md | Bert-CRF_for_PyTorch/examples/basic/basic_language_model_simbert.py | https://github.com/ZhuiyiTechnology/simbert | 源码实现 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/README.md | Bert-CRF_for_PyTorch/examples/basic/basic_language_model_simbert.py | https://github.com/ZhuiyiTechnology/roformer-sim | 源码实现 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/README.md | Bert-CRF_for_PyTorch/examples/convert_script/convert_bart_fudanNLP.py | https://github.com/fastnlp/CPT | 源码实现 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/README.md | Bert-CRF_for_PyTorch/examples/convert_script/convert_bert-base-chinese.py | https://huggingface.co/bert-base-chinese | 模型相关说明 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/README.md | Bert-CRF_for_PyTorch/examples/convert_script/convert_bert-base-chinese.py | https://github.com/google-research/bert | 源码实现 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/README.md | Bert-CRF_for_PyTorch/examples/convert_script/convert_bert-base-chinese.py | https://huggingface.co/docs/transformers/converting_tensorflow_models | 模型相关说明 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/README.md | Bert-CRF_for_PyTorch/examples/convert_script/convert_GAU_alpha.py | https://github.com/ZhuiyiTechnology/GAU-alpha | 源码实现 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/README.md | Bert-CRF_for_PyTorch/examples/convert_script/convert_gpt2__cmp_lm_2.6b.py | https://github.com/TsinghuaAI/CPM-Generate | 源码实现 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/convert_script/convert_gpt2_cmp_lm_2.6b.py | Bert-CRF_for_PyTorch/examples/convert_script/convert_gpt2__cmp_lm_2.6b.py | https://huggingface.co/TsinghuaAI/CPM-Generate | 模型相关说明 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/README.md | 
Bert-CRF_for_PyTorch/examples/convert_script/convert_gpt2__gpt2-ml.py | https://github.com/imcaspar/gpt2-ml | 源码实现 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/README.md | Bert-CRF_for_PyTorch/examples/convert_script/convert_gpt2__gpt2-ml.py | https://github.com/ghosthamlet/gpt2-ml-torch | 源码实现 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/README.md | Bert-CRF_for_PyTorch/examples/convert_script/convert_gpt__CDial-GPT-LCCC.py | https://github.com/thu-coai/CDial-GPT | 源码实现 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/README.md | Bert-CRF_for_PyTorch/examples/convert_script/convert_roberta_chess.py | https://kexue.fm/archives/7877 | 模型相关说明 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/README.md | Bert-CRF_for_PyTorch/examples/convert_script/convert_t5_pegasus.py | https://github.com/ZhuiyiTechnology/t5-pegasus | 源码实现 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/others/task_conditional_language_model.py | Bert-CRF_for_PyTorch/examples/others/task_conditional_language_model.py | https://kexue.fm/archives/7124 | 模型相关说明 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/README.md | Bert-CRF_for_PyTorch/examples/others/task_iflytek_bert_of_theseus.py | https://kexue.fm/archives/7575 | 模型相关说明 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/README.md | Bert-CRF_for_PyTorch/examples/others/task_language_model_chinese_chess.py | https://kexue.fm/archives/7877 | 模型相关说明 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/others/task_iflytek_bert_of_theseus.py | Bert-CRF_for_PyTorch/examples/others/task_iflytek_bert_of_theseus.py | https://www.cluebenchmarks.com | 模型相关说明 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/others/task_nl2sql_baseline.py | Bert-CRF_for_PyTorch/examples/others/task_nl2sql_baseline.py | https://kexue.fm/archives/6771 | 模型相关说明 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/README.md | Bert-CRF_for_PyTorch/examples/relation_extraction/task_relation_extraction_CasRel.py | https://kexue.fm/archives/7161 | 模型相关说明 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/README.md | Bert-CRF_for_PyTorch/examples/relation_extraction/task_relation_extraction_CasRel.py | http://ai.baidu.com/broad/download?dataset=sked | 模型相关说明 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/README.md | Bert-CRF_for_PyTorch/examples/relation_extraction/task_relation_extraction_gplinker.py | https://kexue.fm/archives/8888 | 模型相关说明 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/README.md | Bert-CRF_for_PyTorch/examples/relation_extraction/task_relation_extraction_gplinker.py | http://ai.baidu.com/broad/download?dataset=sked | 模型相关说明 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/README.md | Bert-CRF_for_PyTorch/examples/relation_extraction/task_relation_extraction_tplinker.py | https://github.com/131250208/TPlinker-joint-extraction | 源码实现 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/README.md | Bert-CRF_for_PyTorch/examples/relation_extraction/task_relation_extraction_tplinker.py | http://ai.baidu.com/broad/download?dataset=sked | 模型相关说明 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/README.md | 
Bert-CRF_for_PyTorch/examples/relation_extraction/task_relation_extraction_tplinker_plus.py | https://github.com/131250208/TPlinker-joint-extraction | 源码实现 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/README.md | Bert-CRF_for_PyTorch/examples/relation_extraction/task_relation_extraction_tplinker_plus.py | http://ai.baidu.com/broad/download?dataset=sked | 模型相关说明 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/sentence_classfication/task_sentiment_classification_albert_lion.py | Bert-CRF_for_PyTorch/examples/sentence_classfication/task_sentiment_classification_albert.py | https://github.com/brightmart/albert_zh | 源码实现 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/README.md | Bert-CRF_for_PyTorch/examples/sentence_classfication/task_sentiment_classification_PET.py | https://github.com/bojone/Pattern-Exploiting-Training | 源码实现 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/README.md | Bert-CRF_for_PyTorch/examples/sentence_classfication/task_sentiment_classification_P_tuning.py | https://github.com/THUDM/P-tuning | 源码实现 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/README.md | Bert-CRF_for_PyTorch/examples/sentence_classfication/task_sentiment_classification_P_tuning.py | https://github.com/bojone/P-tuning | 源码实现 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/README.md | Bert-CRF_for_PyTorch/examples/sentence_classfication/task_sentiment_classification_roformer.py | https://github.com/ZhuiyiTechnology/roformer | 源码实现 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/README.md | Bert-CRF_for_PyTorch/examples/sentence_classfication/task_sentiment_classification_roformer_v2.py | https://github.com/ZhuiyiTechnology/roformer-v2 | 源码实现 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/README.md | Bert-CRF_for_PyTorch/examples/sentence_embedding/task_sentence_embedding_sup_CoSENT.py | https://kexue.fm/archives/8847 | 模型相关说明 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/README.md | Bert-CRF_for_PyTorch/examples/sentence_embedding/task_sentence_embedding_unsup_bert_whitening.py | https://github.com/bojone/BERT-whitening | 源码实现 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/sentence_embedding/task_sentence_embedding_unsup_DiffCSE.py | Bert-CRF_for_PyTorch/examples/sentence_embedding/task_sentence_embedding_unsup_DiffCSE.py | https://github.com/voidism/DiffCSE | 源码实现 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/README.md | Bert-CRF_for_PyTorch/examples/sentence_embedding/task_sentence_embedding_unsup_PromptBert.py | https://github.com/kongds/Prompt-BERT | 源码实现 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/README.md | Bert-CRF_for_PyTorch/examples/sentence_embedding/task_sentence_embedding_unsup_SimCSE.py | https://kexue.fm/archives/8348 | 模型相关说明 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/README.md | Bert-CRF_for_PyTorch/examples/seq2seq/task_kgclue_seq2seq.py | https://kexue.fm/archives/8802 | 模型相关说明 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/seq2seq/task_question_answer_generation_by_seq2seq.py | Bert-CRF_for_PyTorch/examples/seq2seq/task_question_answer_generation_by_seq2seq.py | https://github.com/bojone/dgcnn_for_reading_comprehension | 源码实现 | -| 开源代码引入 | 
https://github.com/Tongjilibo/bert4torch/blob/master/examples/seq2seq/task_question_answer_generation_by_seq2seq.py | Bert-CRF_for_PyTorch/examples/seq2seq/task_reading_comprehension_by_mlm.py | https://github.com/bojone/dgcnn_for_reading_comprehension | 源码实现 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/seq2seq/task_question_answer_generation_by_seq2seq.py | Bert-CRF_for_PyTorch/examples/seq2seq/task_reading_comprehension_by_seq2seq.py | https://github.com/bojone/dgcnn_for_reading_comprehension | 源码实现 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/README.md | Bert-CRF_for_PyTorch/examples/seq2seq/task_seq2seq_ape210k_math_word_problem.py | https://kexue.fm/archives/7809 | 模型相关说明 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/README.md | Bert-CRF_for_PyTorch/examples/seq2seq/task_seq2seq_autotitle.py | https://kexue.fm/archives/6933 | 模型相关说明 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/others/task_iflytek_bert_of_theseus.py | Bert-CRF_for_PyTorch/examples/seq2seq/task_kgclue_seq2seq.py | https://www.cluebenchmarks.com | 模型相关说明 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/README.md | Bert-CRF_for_PyTorch/examples/seq2seq/task_seq2seq_autotitle_csl_mt5.py | https://github.com/CLUEbenchmark/CLGE | 源码实现 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/seq2seq/task_seq2seq_autotitle_csl_mt5.py | Bert-CRF_for_PyTorch/examples/seq2seq/task_seq2seq_autotitle_csl_mt5.py | https://github.com/bojone/t5_in_bert4keras | 源码实现 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/README.md | Bert-CRF_for_PyTorch/examples/seq2seq/task_seq2seq_autotitle_csl_uer_t5.py | https://github.com/CLUEbenchmark/CLGE | 源码实现 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/README.md | Bert-CRF_for_PyTorch/examples/seq2seq/task_seq2seq_autotitle_csl_unilm.py | https://kexue.fm/archives/6933 | 模型相关说明 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/README.md | Bert-CRF_for_PyTorch/examples/seq2seq/task_seq2seq_autotitle_csl_unilm.py | https://github.com/CLUEbenchmark/CLGE | 源码实现 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/README.md | Bert-CRF_for_PyTorch/examples/seq2seq/task_seq2seq_simbert.py | https://github.com/ZhuiyiTechnology/simbert | 源码实现 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/sequence_labeling/task_sequence_labeling_ner_crf.py | Bert-CRF_for_PyTorch/examples/sequence_labeling/task_sequence_labeling_ner_cascade_crf.py | http://s3.bmio.net/kashgari/china-people-daily-ner-corpus.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/sequence_labeling/task_sequence_labeling_ner_crf.py | Bert-CRF_for_PyTorch/examples/sequence_labeling/task_sequence_labeling_ner_crf.py | http://s3.bmio.net/kashgari/china-people-daily-ner-corpus.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/sequence_labeling/task_sequence_labeling_ner_crf.py | Bert-CRF_for_PyTorch/examples/sequence_labeling/task_sequence_labeling_ner_crf_add_posseg.py | http://s3.bmio.net/kashgari/china-people-daily-ner-corpus.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/sequence_labeling/task_sequence_labeling_ner_crf.py | Bert-CRF_for_PyTorch/examples/sequence_labeling/task_sequence_labeling_ner_crf_freeze.py | 
http://s3.bmio.net/kashgari/china-people-daily-ner-corpus.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/sequence_labeling/task_sequence_labeling_ner_crf.py | Bert-CRF_for_PyTorch/examples/sequence_labeling/task_sequence_labeling_ner_efficient_global_pointer.py | http://s3.bmio.net/kashgari/china-people-daily-ner-corpus.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/bert4torch/layers/global_point.py | Bert-CRF_for_PyTorch/examples/sequence_labeling/task_sequence_labeling_ner_efficient_global_pointer.py | https://kexue.fm/archives/8373 | 模型相关说明 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/sequence_labeling/task_sequence_labeling_ner_crf.py | Bert-CRF_for_PyTorch/examples/sequence_labeling/task_sequence_labeling_ner_global_pointer.py | http://s3.bmio.net/kashgari/china-people-daily-ner-corpus.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/bert4torch/layers/global_point.py | Bert-CRF_for_PyTorch/examples/sequence_labeling/task_sequence_labeling_ner_global_pointer.py | https://kexue.fm/archives/8373 | 模型相关说明 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/sequence_labeling/task_sequence_labeling_ner_crf.py | Bert-CRF_for_PyTorch/examples/sequence_labeling/task_sequence_labeling_ner_mrc.py | http://s3.bmio.net/kashgari/china-people-daily-ner-corpus.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/sequence_labeling/task_sequence_labeling_ner_crf.py | Bert-CRF_for_PyTorch/examples/sequence_labeling/task_sequence_labeling_ner_span.py | http://s3.bmio.net/kashgari/china-people-daily-ner-corpus.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/sequence_labeling/task_sequence_labeling_ner_W2NER.py | Bert-CRF_for_PyTorch/examples/sequence_labeling/task_sequence_labeling_ner_W2NER.py | https://github.com/ljynlp/W2NER | 源码实现 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/sequence_labeling/task_sequence_labeling_ner_crf.py | Bert-CRF_for_PyTorch/examples/sequence_labeling/task_sequence_labeling_ner_W2NER.py | http://s3.bmio.net/kashgari/china-people-daily-ner-corpus.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/serving/basic_simple_web_serving_simbert.py | Bert-CRF_for_PyTorch/examples/serving/basic_simple_web_serving_simbert.py | https://github.com/bojone/bert4keras/blob/8ffb46a16a79f87aa8cdf045df7994036b4be47d/bert4keras/snippets.py#L580 | 源码实现 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/serving/elasticsearch/README.md | Bert-CRF_for_PyTorch/examples/serving/basic_simple_web_serving_simbert.py | http://127.0.0.1 | 模型相关说明 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/training_trick/task_sentiment_adversarial_training.py | Bert-CRF_for_PyTorch/examples/training_trick/task_sentiment_adversarial_training.py | https://kexue.fm/archives/7234 | 模型相关说明 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/training_trick/task_sentiment_adversarial_training.py | Bert-CRF_for_PyTorch/examples/training_trick/task_sentiment_adversarial_training.py | https://kexue.fm/archives/7466 | 模型相关说明 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/README.md | Bert-CRF_for_PyTorch/examples/training_trick/task_sentiment_R-Drop.py | https://github.com/dropreg/R-Drop | 源码实现 | -| 开源代码引入 | 
https://github.com/Tongjilibo/bert4torch/blob/master/examples/README.md | Bert-CRF_for_PyTorch/examples/training_trick/task_sentiment_TemporalEnsembling.py | https://github.com/s-laine/tempens | 源码实现 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/README.md | Bert-CRF_for_PyTorch/examples/training_trick/task_sentiment_TemporalEnsembling.py | https://github.com/ferretj/temporal-ensembling | 源码实现 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/README.md | Bert-CRF_for_PyTorch/examples/training_trick/task_sentiment_UDA.py | https://arxiv.org/abs/1904.12848 | 参考论文地址 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/README.md | Bert-CRF_for_PyTorch/examples/pretrain/roberta_pretrain/pretrain_roberta_mlm.py | https://github.com/Tongjilibo/bert4torch/blob/master/examples/training_trick/task_distributed_data_parallel.py | 源码实现 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/README.md | Bert-CRF_for_PyTorch/examples/pretrain/simbert_v2_pretrain/simbert_v2_stage1.py | https://github.com/ZhuiyiTechnology/roformer-sim | 源码实现 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/README.md | Bert-CRF_for_PyTorch/examples/pretrain/simbert_v2_pretrain/simbert_v2_stage2.py | https://github.com/ZhuiyiTechnology/roformer-sim | 源码实现 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/README.md | Bert-CRF_for_PyTorch/examples/pretrain/simbert_v2_pretrain/simbert_v2_supervised.py | https://github.com/ZhuiyiTechnology/roformer-sim | 源码实现 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/sentence_classfication/Tianchi_News_Classification/README.md | Bert-CRF_for_PyTorch/examples/sentence_classfication/Tianchi_News_Classification/training.py | https://github.com/kangyishuai/NEWS-TEXT-CLASSIFICATION | 源码实现 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/sequence_labeling/uie/convert.py | Bert-CRF_for_PyTorch/examples/sequence_labeling/uie/convert.py | https://bj.bcebos.com/paddlenlp/taskflow/information_extraction/uie_base_v0.1/model_state.pdparams | 模型相关说明 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/sequence_labeling/uie/convert.py | Bert-CRF_for_PyTorch/examples/sequence_labeling/uie/convert.py | https://bj.bcebos.com/paddlenlp/taskflow/information_extraction/uie_base/model_config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/sequence_labeling/uie/convert.py | Bert-CRF_for_PyTorch/examples/sequence_labeling/uie/convert.py | https://bj.bcebos.com/paddlenlp/taskflow/information_extraction/uie_base/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/sequence_labeling/uie/convert.py | Bert-CRF_for_PyTorch/examples/sequence_labeling/uie/convert.py | https://bj.bcebos.com/paddlenlp/taskflow/information_extraction/uie_base/special_tokens_map.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/sequence_labeling/uie/convert.py | Bert-CRF_for_PyTorch/examples/sequence_labeling/uie/convert.py | https://bj.bcebos.com/paddlenlp/taskflow/information_extraction/uie_base/tokenizer_config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/sequence_labeling/uie/convert.py | Bert-CRF_for_PyTorch/examples/sequence_labeling/uie/convert.py | 
https://bj.bcebos.com/paddlenlp/taskflow/information_extraction/uie_medium_v1.0/model_state.pdparams | 模型相关说明 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/sequence_labeling/uie/convert.py | Bert-CRF_for_PyTorch/examples/sequence_labeling/uie/convert.py | https://bj.bcebos.com/paddlenlp/taskflow/information_extraction/uie_medium/model_config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/sequence_labeling/uie/convert.py | Bert-CRF_for_PyTorch/examples/sequence_labeling/uie/convert.py | https://bj.bcebos.com/paddlenlp/taskflow/information_extraction/uie_mini_v1.0/model_state.pdparams | 模型相关说明 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/sequence_labeling/uie/convert.py | Bert-CRF_for_PyTorch/examples/sequence_labeling/uie/convert.py | https://bj.bcebos.com/paddlenlp/taskflow/information_extraction/uie_mini/model_config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/sequence_labeling/uie/convert.py | Bert-CRF_for_PyTorch/examples/sequence_labeling/uie/convert.py | https://bj.bcebos.com/paddlenlp/taskflow/information_extraction/uie_micro_v1.0/model_state.pdparams | 模型相关说明 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/sequence_labeling/uie/convert.py | Bert-CRF_for_PyTorch/examples/sequence_labeling/uie/convert.py | https://bj.bcebos.com/paddlenlp/taskflow/information_extraction/uie_micro/model_config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/sequence_labeling/uie/convert.py | Bert-CRF_for_PyTorch/examples/sequence_labeling/uie/convert.py | https://bj.bcebos.com/paddlenlp/taskflow/information_extraction/uie_nano_v1.0/model_state.pdparams | 模型相关说明 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/sequence_labeling/uie/convert.py | Bert-CRF_for_PyTorch/examples/sequence_labeling/uie/convert.py | https://bj.bcebos.com/paddlenlp/taskflow/information_extraction/uie_nano/model_config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/sequence_labeling/uie/convert.py | Bert-CRF_for_PyTorch/examples/sequence_labeling/uie/convert.py | https://bj.bcebos.com/paddlenlp/taskflow/information_extraction/uie_medical_base_v0.1/model_state.pdparams | 模型相关说明 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/sequence_labeling/uie/convert.py | Bert-CRF_for_PyTorch/examples/sequence_labeling/uie/convert.py | https://bj.bcebos.com/paddlenlp/taskflow/information_extraction/uie_tiny_v0.1/model_state.pdparams | 模型相关说明 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/sequence_labeling/uie/convert.py | Bert-CRF_for_PyTorch/examples/sequence_labeling/uie/convert.py | https://bj.bcebos.com/paddlenlp/taskflow/information_extraction/uie_tiny/model_config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/sequence_labeling/uie/convert.py | Bert-CRF_for_PyTorch/examples/sequence_labeling/uie/convert.py | https://bj.bcebos.com/paddlenlp/taskflow/information_extraction/uie_tiny/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/sequence_labeling/uie/convert.py | Bert-CRF_for_PyTorch/examples/sequence_labeling/uie/convert.py | https://bj.bcebos.com/paddlenlp/taskflow/information_extraction/uie_tiny/special_tokens_map.json | 模型参数相关配置 | -| 开源代码引入 | 
https://github.com/Tongjilibo/bert4torch/blob/master/examples/sequence_labeling/uie/convert.py | Bert-CRF_for_PyTorch/examples/sequence_labeling/uie/convert.py | https://bj.bcebos.com/paddlenlp/taskflow/information_extraction/uie_tiny/tokenizer_config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/serving/sanic_server/client.py | Bert-CRF_for_PyTorch/examples/serving/sanic_server/client.py | http://localhost:8082/recommendinfo | 模型相关说明 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/sequence_labeling/uie/utils.py | Bert-CRF_for_PyTorch/examples/sequence_labeling/uie/utils.py | https://blog.csdn.net/blmoistawinde/article/details/82379256 | 模型相关说明 | -| 开源代码引入 | https://github.com/Tongjilibo/bert4torch/blob/master/examples/others/task_language_model_chinese_chess.py | Bert-CRF_for_PyTorch/setup.py | https://github.com/Tongjilibo/bert4torch | 源码实现 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|------------------------------------------------------------------------------------------------------|------------------------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-CRF_for_PyTorch/examples/sequence_labeling/uie/convert.py | https://bj.bcebos.com/paddlenlp/taskflow/information_extraction/uie_base/model_config.json | 相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-CRF_for_PyTorch/examples/sequence_labeling/uie/convert.py | https://bj.bcebos.com/paddlenlp/taskflow/information_extraction/uie_base/special_tokens_map.json | 相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-CRF_for_PyTorch/examples/sequence_labeling/uie/convert.py | https://bj.bcebos.com/paddlenlp/taskflow/information_extraction/uie_base/tokenizer_config.json | 相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-CRF_for_PyTorch/examples/sequence_labeling/uie/convert.py | https://bj.bcebos.com/paddlenlp/taskflow/information_extraction/uie_base/vocab.txt | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-CRF_for_PyTorch/examples/sequence_labeling/uie/convert.py | https://bj.bcebos.com/paddlenlp/taskflow/information_extraction/uie_base_v0.1/model_state.pdparams | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-CRF_for_PyTorch/examples/sequence_labeling/uie/convert.py | https://bj.bcebos.com/paddlenlp/taskflow/information_extraction/uie_medical_base_v0.1/model_state.pdparams | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-CRF_for_PyTorch/examples/sequence_labeling/uie/convert.py | https://bj.bcebos.com/paddlenlp/taskflow/information_extraction/uie_medium/model_config.json | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-CRF_for_PyTorch/examples/sequence_labeling/uie/convert.py | https://bj.bcebos.com/paddlenlp/taskflow/information_extraction/uie_medium_v1.0/model_state.pdparams | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-CRF_for_PyTorch/examples/sequence_labeling/uie/convert.py | https://bj.bcebos.com/paddlenlp/taskflow/information_extraction/uie_micro/model_config.json | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-CRF_for_PyTorch/examples/sequence_labeling/uie/convert.py | https://bj.bcebos.com/paddlenlp/taskflow/information_extraction/uie_micro_v1.0/model_state.pdparams | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-CRF_for_PyTorch/examples/sequence_labeling/uie/convert.py | https://bj.bcebos.com/paddlenlp/taskflow/information_extraction/uie_mini/model_config.json | 相关说明 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-CRF_for_PyTorch/examples/sequence_labeling/uie/convert.py | https://bj.bcebos.com/paddlenlp/taskflow/information_extraction/uie_mini_v1.0/model_state.pdparams | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-CRF_for_PyTorch/examples/sequence_labeling/uie/convert.py | https://bj.bcebos.com/paddlenlp/taskflow/information_extraction/uie_nano/model_config.json | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-CRF_for_PyTorch/examples/sequence_labeling/uie/convert.py | https://bj.bcebos.com/paddlenlp/taskflow/information_extraction/uie_nano_v1.0/model_state.pdparams | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-CRF_for_PyTorch/examples/sequence_labeling/uie/convert.py | https://bj.bcebos.com/paddlenlp/taskflow/information_extraction/uie_tiny/model_config.json | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-CRF_for_PyTorch/examples/sequence_labeling/uie/convert.py | https://bj.bcebos.com/paddlenlp/taskflow/information_extraction/uie_tiny/special_tokens_map.json | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-CRF_for_PyTorch/examples/sequence_labeling/uie/convert.py | https://bj.bcebos.com/paddlenlp/taskflow/information_extraction/uie_tiny/tokenizer_config.json | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-CRF_for_PyTorch/examples/sequence_labeling/uie/convert.py | https://bj.bcebos.com/paddlenlp/taskflow/information_extraction/uie_tiny/vocab.txt | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-CRF_for_PyTorch/examples/sequence_labeling/uie/convert.py | https://bj.bcebos.com/paddlenlp/taskflow/information_extraction/uie_tiny_v0.1/model_state.pdparams | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-CRF_for_PyTorch/examples/sequence_labeling/uie/utils.py | https://blog.csdn.net/blmoistawinde/article/details/82379256 | 模型相关说明 | \ No newline at end of file diff --git a/PyTorch/built-in/nlp/Bert-Squad_ID0470_for_PyTorch/public_address_statement.md b/PyTorch/built-in/nlp/Bert-Squad_ID0470_for_PyTorch/public_address_statement.md index e44c6c5c0e71f05e737f7c3b0691ea720cd11da3..c096deea3b52ef5114385b09ab830ea9fd615e54 100644 --- a/PyTorch/built-in/nlp/Bert-Squad_ID0470_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/nlp/Bert-Squad_ID0470_for_PyTorch/public_address_statement.md @@ -1,56 +1,39 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ---- | ------------ | ------ | ------------------------------------ | -------- | -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/BERT/triton/Dockerfile | Bert-Squad_ID0470_for_PyTorch/Dockerfile |https://github.com/attardi/wikiextractor.git|attardi_wikiextractor在开源社区上的git下载链接| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/BERT/triton/Dockerfile |Bert-Squad_ID0470_for_PyTorch/Dockerfile |https://github.com/soskek/bookcorpus.git|bookcorpus在开源社区上的git下载链接| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/BERT/requirements.txt |Bert-Squad_ID0470_for_PyTorch/Dockerfile |https://github.com/NVIDIA/dllogger|NVIDIA_dllogger在开源社区上的git下载链接| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/BERT/modeling.py |Bert-Squad_ID0470_for_PyTorch/modeling.py |https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased.tar.gz|'bert-base-uncased'模型在开源社区上的bert-base-uncased.tar.gz的下载链接| -| 开源代码引入 | 
https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/BERT/modeling.py |Bert-Squad_ID0470_for_PyTorch/modeling.py |https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased.tar.gz|'bert-large-uncased'模型在开源社区上的bert-large-uncased.tar.gz的下载链接| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/BERT/modeling.py |Bert-Squad_ID0470_for_PyTorch/modeling.py |https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased.tar.gz|'bert-base-cased'模型在开源社区上的bert-base-cased.tar.gz的下载链接| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/BERT/modeling.py |Bert-Squad_ID0470_for_PyTorch/modeling.py |https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased.tar.gz|'bert-large-cased'模型在开源社区上的bert-large-cased.tar.gz的下载链接| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/BERT/modeling.py |Bert-Squad_ID0470_for_PyTorch/modeling.py |https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-uncased.tar.gz|'bert-base-multilingual-uncased'模型在开源社区上的bert-base-multilingual-uncased.tar.gz的下载链接| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/BERT/modeling.py |Bert-Squad_ID0470_for_PyTorch/modeling.py |https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-cased.tar.gz|'bert-base-multilingual-cased'模型在开源社区上的bert-base-multilingual-cased.tar.gz的下载链接| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/BERT/modeling.py|Bert-Squad_ID0470_for_PyTorch/modeling.py |https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-chinese.tar.gz|'bert-base-chinese'模型在开源社区上的bert-base-chinese.tar.gz的下载链接| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/BERT/tokenization.py |Bert-Squad_ID0470_for_PyTorch/tokenization.py |https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt|'bert-base-uncased'模型在开源社区上的bert-base-uncased-vocab.txt的下载链接| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/BERT/tokenization.py |Bert-Squad_ID0470_for_PyTorch/tokenization.py |https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-vocab.txt|'bert-large-uncased'模型在开源社区上的bert-large-uncased-vocab.txt的下载链接| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/BERT/tokenization.py |Bert-Squad_ID0470_for_PyTorch/tokenization.py |https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased-vocab.txt|'bert-base-cased'模型在开源社区上的bert-base-cased-vocab.txt的下载链接| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/BERT/tokenization.py |Bert-Squad_ID0470_for_PyTorch/tokenization.py |https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-vocab.txt|'bert-large-cased'模型在开源社区上的bert-large-cased-vocab.txt的下载链接| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/BERT/tokenization.py |Bert-Squad_ID0470_for_PyTorch/tokenization.py |https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-uncased-vocab.txt|'bert-base-multilingual-uncased'模型在开源社区上的bert-base-multilingual-uncased-vocab.txt的下载链接| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/BERT/tokenization.py |Bert-Squad_ID0470_for_PyTorch/tokenization.py 
|https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-cased-vocab.txt|'bert-base-multilingual-cased'模型在开源社区上的bert-base-multilingual-cased-vocab.txt的下载链接| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/BERT/tokenization.py |Bert-Squad_ID0470_for_PyTorch/tokenization.py |https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-chinese-vocab.txt|'bert-base-chinese'模型在开源社区上的bert-base-chinese-vocab.txt的下载链接| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/BERT/data/GooglePretrainedWeightDownloader.py |Bert-Squad_ID0470_for_PyTorch/data/GooglePretrainedWeightDownloader.py |https://storage.googleapis.com/bert_models/2018_10_18/uncased_L-12_H-768_A-12.zip |'bert_base_uncased'模型在开源社区上的uncased_L-12_H-768_A-12.zip下载链接| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/BERT/GooglePretrainedWeightDownloader.py |Bert-Squad_ID0470_for_PyTorch/data/GooglePretrainedWeightDownloader.py |https://storage.googleapis.com/bert_models/2018_10_18/uncased_L-24_H-1024_A-16.zip |'bert_large_uncased'模型在开源社区上的uncased_L-24_H-1024_A-16.zip下载链接| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/BERT/GooglePretrainedWeightDownloader.py |Bert-Squad_ID0470_for_PyTorch/data/GooglePretrainedWeightDownloader.py |https://storage.googleapis.com/bert_models/2018_10_18/cased_L-12_H-768_A-12.zip |'bert_base_cased'模型在开源社区上的cased_L-12_H-768_A-12.zip下载链接| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/BERT/GooglePretrainedWeightDownloader.py |Bert-Squad_ID0470_for_PyTorch/data/GooglePretrainedWeightDownloader.py |https://storage.googleapis.com/bert_models/2018_10_18/cased_L-24_H-1024_A-16.zip |'bert_large_cased'模型在开源社区上的cased_L-24_H-1024_A-16.zip下载链接| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/BERT/data/GooglePretrainedWeightDownloader.py |Bert-Squad_ID0470_for_PyTorch/data/GooglePretrainedWeightDownloader.py |https://storage.googleapis.com/bert_models/2018_11_23/multi_cased_L-12_H-768_A-12.zip |'bert_base_multilingual_cased'模型在开源社区上的multi_cased_L-12_H-768_A-12.zip下载链接| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/BERT/data/GooglePretrainedWeightDownloader.py |Bert-Squad_ID0470_for_PyTorch/data/GooglePretrainedWeightDownloader.py |https://storage.googleapis.com/bert_models/2018_11_03/multilingual_L-12_H-768_A-12.zip |'bert_large_multilingual_uncased'模型在开源社区上的multilingual_L-12_H-768_A-12.zip下载链接| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/BERT/data/GooglePretrainedWeightDownloader.py |Bert-Squad_ID0470_for_PyTorch/data/GooglePretrainedWeightDownloader.py |https://storage.googleapis.com/bert_models/2018_11_03/chinese_L-12_H-768_A-12.zip |'bert_base_chinese'模型在开源社区上的chinese_L-12_H-768_A-12.zip下载链接| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/BERT/data/SquadDownloader.py |Bert-Squad_ID0470_for_PyTorch/data/SquadDownloader.py |https://rajpurkar.github.io/SQuAD-explorer/dataset/train-v1.1.json |SQuAD模型在开源社区上的train-v1.1.json的下载链接| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/BERT/data/SquadDownloader.py |Bert-Squad_ID0470_for_PyTorch/data/SquadDownloader.py |https://rajpurkar.github.io/SQuAD-explorer/dataset/dev-v1.1.json 
|SQuAD模型在开源社区上的dev-v1.1.json的下载链接| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/BERT/data/SquadDownloader.py |Bert-Squad_ID0470_for_PyTorch/data/SquadDownloader.py |https://worksheets.codalab.org/rest/bundles/0xbcd57bee090b421c982906709c8c27e1/contents/blob/|SQuAD模型在开源社区上的evaluate-v1.1.py的下载链接| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/BERT/data/SquadDownloader.py |Bert-Squad_ID0470_for_PyTorch/data/SquadDownloader.py |https://rajpurkar.github.io/SQuAD-explorer/dataset/train-v2.0.json |SQuAD模型在开源社区上的train-v2.0.json的下载链接| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/BERT/data/SquadDownloader.py |Bert-Squad_ID0470_for_PyTorch/data/SquadDownloader.py |https://rajpurkar.github.io/SQuAD-explorer/dataset/dev-v2.0.json |SQuAD模型在开源社区上的dev-v2.0.json的下载链接| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/BERT/data/SquadDownloader.py |Bert-Squad_ID0470_for_PyTorch/data/SquadDownloader.py |https://worksheets.codalab.org/rest/bundles/0x6b567e1cf2e041ec80d7098f031c5c9e/contents/blob/'|SQuAD模型在开源社区上的evaluate-v2.0.py的下载链接| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/BERT/data/WikiDownloader.py | Bert-Squad_ID0470_for_PyTorch/data/WikiDownloader.py |https://dumps.wikimedia.org/enwiki/latest/enwiki-latest-pages-articles.xml.bz2|enwiki在开源社区上的enwiki-latest-pages-articles.xml.bz2的下载链接| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/BERT/data/WikiDownloader.py |Bert-Squad_ID0470_for_PyTorch/data/WikiDownloader.py |https://dumps.wikimedia.org/zhwiki/latest/zhwiki-latest-pages-articles.xml.bz2|zhwiki在开源社区上的zhwiki-latest-pages-articles.xml.bz2的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/transformers/docs/source/en/main_classes/processors.mdx |Bert-Squad_ID0470_for_PyTorch/data/squad/squad_download.sh | https://rajpurkar.github.io/SQuAD-explorer/dataset/train-v1.1.json | SQuAD_train-v1.1在开源社区上的json下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/transformers/docs/source/en/main_classes/processors.mdx |Bert-Squad_ID0470_for_PyTorch/data/squad/squad_download.sh | https://rajpurkar.github.io/SQuAD-explorer/dataset/dev-v1.1.json | SQuAD_dev-v1.1在开源社区上的json下载链接| -| 开发引入 | / |Bert-Squad_ID0470_for_PyTorch/url.ini | https://worksheets.codalab.org/rest/bundles/0xbcd57bee090b421c982906709c8c27e1/contents |SQuAD_evaluate-v1.1在开源社区上的py下载链接| -| 开发引入 | / |Bert-Squad_ID0470_for_PyTorch/url.ini | https://worksheets.codalab.org/rest/bundles/0x6b567e1cf2e041ec80d7098f031c5c9e/contents |SQuAD_evaluate-v2.0在开源社区上的py下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/transformers/docs/source/en/main_classes/processors.mdx |Bert-Squad_ID0470_for_PyTorch/data/squad/squad_download.sh | https://rajpurkar.github.io/SQuAD-explorer/dataset/train-v2.0.json | SQuAD_train-v2.0在开源社区上的json下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/transformers/docs/source/en/main_classes/processors.mdx |Bert-Squad_ID0470_for_PyTorch/data/squad/squad_download.sh | https://rajpurkar.github.io/SQuAD-explorer/dataset/dev-v2.0.json | SQuAD_dev-v2.0在开源社区上的json下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/transformers/docs/source/en/main_classes/processors.mdx |Bert-Squad_ID0470_for_PyTorch/data/squad/squad_download.sh | 
https://worksheets.codalab.org/rest/bundles/0x6b567e1cf2e041ec80d7098f031c5c9e/contents/blob |SQuAD_evaluate-v2.0在开源社区上的py下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/model_doc/jukebox.mdx |Bert-Squad_ID0470_for_PyTorch/data/GLUEDownloader.py | https://gist.githubusercontent.com/W4ngatang/60c2bdb54d156a41194446737ce03e2e/raw/17b8dd0d724281ed7c3b2aeeda662b92809aadd5/download_glue_data.py | SQuAD_download_glue_data在开源社区上的py下载链接| -| 开发引入 | / |Bert-Squad_ID0470_for_PyTorch/url.ini | https://github.com/rowanz/swagaf.git | SWAG在开源社区上的git下载链接| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/BERT/tokenization.py|Bert-Squad_ID0470_for_PyTorch/tokenization.py | https://en.wikipedia.org/wiki/CJK_Unified_Ideographs_ | 模型相关说明 | -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/BERT/run_swag.py|Bert-Squad_ID0470_for_PyTorch/run_swag.py | https://github.com/google-research/bert/issues/38 | 模型相关说明 | -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/BERT/run_glue.py|Bert-Squad_ID0470_for_PyTorch/run_swag.py | https://www.github.com/nvidia/apex | 模型相关说明 | -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/BERT/DGLPyTorch/DrugDiscovery/SE3Transformer/README.md|Bert-Squad_ID0470_for_PyTorch/run_swag.py | https://docs.nvidia.com/deeplearning/sdk/mixed-precision-training/index.html | 模型相关说明 | -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/BERT/run_glue.py|Bert-Squad_ID0470_for_PyTorch/run_squad.py | https://www.github.com/nvidia/apex | 模型相关说明 | -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/BERT/run_glue.py|Bert-Squad_ID0470_for_PyTorch/run_glue.py | https://www.github.com/nvidia/apex | 模型相关说明 | -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/BERT/run_glue.py|Bert-Squad_ID0470_for_PyTorch/run_glue.py | https://code.visualstudio.com/docs/python/debugging#_attach-to-a-local-script | 模型相关说明 | -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/BERT/PyTorch/LanguageModeling/BART/bart/modeling/modeling_t5.py|Bert-Squad_ID0470_for_PyTorch/modeling.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/BERT/run_glue.py|Bert-Squad_ID0470_for_PyTorch/modeling.py | https://www.github.com/nvidia/apex | 模型相关说明 | -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/BERT/modeling.py|Bert-Squad_ID0470_for_PyTorch/modeling.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/BERT/PyTorch/Detection/Efficientdet/utils/scheduler.py|Bert-Squad_ID0470_for_PyTorch/file_utils.py | https://github.com/allenai/allennlp | 源码实现 | -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/BERT/PaddlePaddle/LanguageModeling/BERT/data/squad/squad_download.sh|Bert-Squad_ID0470_for_PyTorch/data/SquadDownloader.py | https://worksheets.codalab.org/rest/bundles/0x6b567e1cf2e041ec80d7098f031c5c9e/contents/blob/ | 模型相关说明 | -| 开发引入 | / |Bert-Squad_ID0470_for_PyTorch/requirements.txt | requirements.txt | 相关依赖 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | 
+|--------------------------------------------------------------------------------------------------------------|--------------------------------------------------------------------------------------------------------------------------------------------------|-------------| +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-Squad_ID0470_for_PyTorch/data/GLUEDownloader.py | https://gist.githubusercontent.com/W4ngatang/60c2bdb54d156a41194446737ce03e2e/raw/17b8dd0d724281ed7c3b2aeeda662b92809aadd5/download_glue_data.py | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-Squad_ID0470_for_PyTorch/data/GooglePretrainedWeightDownloader.py | https://storage.googleapis.com/bert_models/2018_10_18/uncased_L-24_H-1024_A-16.zip | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-Squad_ID0470_for_PyTorch/data/GooglePretrainedWeightDownloader.py | https://storage.googleapis.com/bert_models/2018_11_03/multilingual_L-12_H-768_A-12.zip | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-Squad_ID0470_for_PyTorch/data/GooglePretrainedWeightDownloader.py | https://storage.googleapis.com/bert_models/2018_10_18/cased_L-24_H-1024_A-16.zip | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-Squad_ID0470_for_PyTorch/data/GooglePretrainedWeightDownloader.py | https://storage.googleapis.com/bert_models/2018_10_18/uncased_L-12_H-768_A-12.zip | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-Squad_ID0470_for_PyTorch/data/GooglePretrainedWeightDownloader.py | https://storage.googleapis.com/bert_models/2018_11_23/multi_cased_L-12_H-768_A-12.zip | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-Squad_ID0470_for_PyTorch/data/GooglePretrainedWeightDownloader.py | https://storage.googleapis.com/bert_models/2018_11_03/chinese_L-12_H-768_A-12.zip | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-Squad_ID0470_for_PyTorch/data/GooglePretrainedWeightDownloader.py | https://storage.googleapis.com/bert_models/2018_10_18/cased_L-12_H-768_A-12.zip | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-Squad_ID0470_for_PyTorch/data/SquadDownloader.py | https://worksheets.codalab.org/rest/bundles/0xbcd57bee090b421c982906709c8c27e1/contents/blob/ | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-Squad_ID0470_for_PyTorch/data/SquadDownloader.py | https://worksheets.codalab.org/rest/bundles/0x6b567e1cf2e041ec80d7098f031c5c9e/contents/blob/ | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-Squad_ID0470_for_PyTorch/data/WikiDownloader.py | https://dumps.wikimedia.org/zhwiki/latest/zhwiki-latest-pages-articles.xml.bz2 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-Squad_ID0470_for_PyTorch/data/WikiDownloader.py | https://dumps.wikimedia.org/enwiki/latest/enwiki-latest-pages-articles.xml.bz2 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-Squad_ID0470_for_PyTorch/modeling.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-Squad_ID0470_for_PyTorch/modeling.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased.tar.gz | 相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-Squad_ID0470_for_PyTorch/modeling.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased.tar.gz | 相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-Squad_ID0470_for_PyTorch/modeling.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased.tar.gz | 相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-Squad_ID0470_for_PyTorch/modeling.py | 
https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-uncased.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-Squad_ID0470_for_PyTorch/modeling.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-cased.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-Squad_ID0470_for_PyTorch/modeling.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-chinese.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-Squad_ID0470_for_PyTorch/modeling.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-Squad_ID0470_for_PyTorch/modeling_for_a1.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-Squad_ID0470_for_PyTorch/modeling_for_a1.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased.tar.gz | 相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-Squad_ID0470_for_PyTorch/modeling_for_a1.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased.tar.gz | 相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-Squad_ID0470_for_PyTorch/modeling_for_a1.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased.tar.gz | 相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-Squad_ID0470_for_PyTorch/modeling_for_a1.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-uncased.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-Squad_ID0470_for_PyTorch/modeling_for_a1.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-cased.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-Squad_ID0470_for_PyTorch/modeling_for_a1.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-chinese.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-Squad_ID0470_for_PyTorch/modeling_for_a1.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-Squad_ID0470_for_PyTorch/tokenization.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-vocab.txt | 相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-Squad_ID0470_for_PyTorch/tokenization.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-vocab.txt | 相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-Squad_ID0470_for_PyTorch/tokenization.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt | 相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-Squad_ID0470_for_PyTorch/tokenization.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-uncased-vocab.txt | 相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-Squad_ID0470_for_PyTorch/tokenization.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-cased-vocab.txt | 相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-Squad_ID0470_for_PyTorch/tokenization.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-chinese-vocab.txt | 相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-Squad_ID0470_for_PyTorch/tokenization.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased-vocab.txt | 相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-Squad_ID0470_for_PyTorch/url.ini | https://worksheets.codalab.org/rest/bundles/0x6b567e1cf2e041ec80d7098f031c5c9e/contents/blob | 模型相关说明 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-Squad_ID0470_for_PyTorch/url.ini | https://worksheets.codalab.org/rest/bundles/0xbcd57bee090b421c982906709c8c27e1/contents/blob | 模型相关说明 | \ No newline at end of file diff --git a/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/public_address_statement.md b/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/public_address_statement.md index 5f13fd9bfe69cda234e9844c320bae0c88f8a2cd..6e44cd115cea38ad00ed23f1928efa3321162e2c 100644 --- a/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/public_address_statement.md @@ -1,2878 +1,358 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ---- | ------------ | ------ | ------------------------------------ | -------- | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/configuration_albert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/albert/configuration_albert.py | https://huggingface.co/albert-base-v1/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/configuration_albert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/albert/configuration_albert.py | https://huggingface.co/albert-large-v1/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/configuration_albert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/albert/configuration_albert.py | https://huggingface.co/albert-xlarge-v1/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/configuration_albert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/albert/configuration_albert.py | https://huggingface.co/albert-xxlarge-v1/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/configuration_albert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/albert/configuration_albert.py | https://huggingface.co/albert-base-v2/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/configuration_albert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/albert/configuration_albert.py | https://huggingface.co/albert-large-v2/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/configuration_albert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/albert/configuration_albert.py | https://huggingface.co/albert-xlarge-v2/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/configuration_albert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/albert/configuration_albert.py | https://huggingface.co/albert-xxlarge-v2/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert.py | 
https://huggingface.co/albert-base-v1/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert.py | https://huggingface.co/albert-large-v1/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert.py | https://huggingface.co/albert-xlarge-v1/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert.py | https://huggingface.co/albert-xxlarge-v1/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert.py | https://huggingface.co/albert-base-v2/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert.py | https://huggingface.co/albert-large-v2/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert.py | https://huggingface.co/albert-xlarge-v2/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert.py | https://huggingface.co/albert-xxlarge-v2/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py | https://huggingface.co/albert-base-v1/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py | https://huggingface.co/albert-large-v1/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py | https://huggingface.co/albert-xlarge-v1/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py | https://huggingface.co/albert-xxlarge-v1/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py | https://huggingface.co/albert-base-v2/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py | https://huggingface.co/albert-large-v2/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py | https://huggingface.co/albert-xlarge-v2/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py | https://huggingface.co/albert-xxlarge-v2/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py | https://huggingface.co/albert-base-v1/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py | https://huggingface.co/albert-large-v1/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py | https://huggingface.co/albert-xlarge-v1/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py | https://huggingface.co/albert-xxlarge-v1/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py | https://huggingface.co/albert-base-v2/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py | https://huggingface.co/albert-large-v2/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py | https://huggingface.co/albert-xlarge-v2/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py | https://huggingface.co/albert-xxlarge-v2/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/configuration_bart.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bart/configuration_bart.py | https://huggingface.co/facebook/bart-large/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart.py | https://huggingface.co/facebook/bart-base/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart.py | https://huggingface.co/facebook/bart-large/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart.py | https://huggingface.co/facebook/bart-large-mnli/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart.py | https://huggingface.co/facebook/bart-large-cnn/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart.py | https://huggingface.co/facebook/bart-large-xsum/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart.py | https://huggingface.co/yjernite/bart_eli5/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart.py | https://huggingface.co/facebook/bart-base/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart.py | https://huggingface.co/facebook/bart-large/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart.py | https://huggingface.co/facebook/bart-large-mnli/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart.py | 
https://huggingface.co/facebook/bart-large-cnn/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart.py | https://huggingface.co/facebook/bart-large-xsum/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart.py | https://huggingface.co/yjernite/bart_eli5/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | https://huggingface.co/facebook/bart-base/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | https://huggingface.co/facebook/bart-large/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | https://huggingface.co/facebook/bart-large-mnli/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | https://huggingface.co/facebook/bart-large-cnn/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | https://huggingface.co/facebook/bart-large-xsum/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | https://huggingface.co/yjernite/bart_eli5/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | https://huggingface.co/facebook/bart-base/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | https://huggingface.co/facebook/bart-large/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | https://huggingface.co/facebook/bart-large-mnli/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | https://huggingface.co/facebook/bart-large-cnn/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | https://huggingface.co/facebook/bart-large-xsum/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | https://huggingface.co/yjernite/bart_eli5/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | https://huggingface.co/facebook/bart-base/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | https://huggingface.co/facebook/bart-large/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | https://huggingface.co/facebook/bart-large-mnli/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | https://huggingface.co/facebook/bart-large-cnn/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | https://huggingface.co/facebook/bart-large-xsum/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | https://huggingface.co/yjernite/bart_eli5/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/barthez/tokenization_barthez.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/barthez/tokenization_barthez.py | https://huggingface.co/moussaKam/mbarthez/resolve/main/sentencepiece.bpe.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/barthez/tokenization_barthez.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/barthez/tokenization_barthez.py | https://huggingface.co/moussaKam/barthez/resolve/main/sentencepiece.bpe.model | 下载预训练文件 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/barthez/tokenization_barthez.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/barthez/tokenization_barthez.py | https://huggingface.co/moussaKam/barthez-orangesum-title/resolve/main/sentencepiece.bpe.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/barthez/tokenization_barthez_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/barthez/tokenization_barthez_fast.py | https://huggingface.co/moussaKam/mbarthez/resolve/main/sentencepiece.bpe.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/barthez/tokenization_barthez_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/barthez/tokenization_barthez_fast.py | https://huggingface.co/moussaKam/barthez/resolve/main/sentencepiece.bpe.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/barthez/tokenization_barthez_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/barthez/tokenization_barthez_fast.py | https://huggingface.co/moussaKam/barthez-orangesum-title/resolve/main/sentencepiece.bpe.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/barthez/tokenization_barthez_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/barthez/tokenization_barthez_fast.py | https://huggingface.co/moussaKam/mbarthez/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/barthez/tokenization_barthez_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/barthez/tokenization_barthez_fast.py | https://huggingface.co/moussaKam/barthez/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/barthez/tokenization_barthez_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/barthez/tokenization_barthez_fast.py | https://huggingface.co/moussaKam/barthez-orangesum-title/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bartpho/tokenization_bartpho.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bartpho/tokenization_bartpho.py | https://huggingface.co/vinai/bartpho-syllable/resolve/main/sentencepiece.bpe.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bartpho/tokenization_bartpho.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bartpho/tokenization_bartpho.py | https://huggingface.co/vinai/bartpho-syllable/resolve/main/dict.txt | 下载词表文件 | -| 开发引入 | / | url.ini | https://huggingface.co/microsoft/beit-base-patch16-224-in22k/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/beit/convert_beit_unilm_to_pytorch.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/beit/convert_beit_unilm_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 下载数据集图片 | -| 开发引入 | / | url.ini | https://unilm.blob.core.windows.net/beit/beit_base_patch16_224_pt22k_ft22kto1k.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/bert-base-uncased/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/bert-large-uncased/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/bert-base-cased/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/bert-large-cased/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/bert-base-multilingual-uncased/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/bert-base-multilingual-cased/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/bert-base-chinese/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/bert-base-german-cased/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/bert-large-uncased-whole-word-masking/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/bert-large-cased-whole-word-masking/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/bert-large-uncased-whole-word-masking-finetuned-squad/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/bert-large-cased-whole-word-masking-finetuned-squad/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/bert-base-cased-finetuned-mrpc/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/bert-base-german-dbmdz-cased/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/bert-base-german-dbmdz-uncased/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/cl-tohoku/bert-base-japanese/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/cl-tohoku/bert-base-japanese-whole-word-masking/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/cl-tohoku/bert-base-japanese-char/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/cl-tohoku/bert-base-japanese-char-whole-word-masking/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/TurkuNLP/bert-base-finnish-cased-v1/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/TurkuNLP/bert-base-finnish-uncased-v1/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/wietsedv/bert-base-dutch-cased/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | https://huggingface.co/bert-base-uncased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | https://huggingface.co/bert-large-uncased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | https://huggingface.co/bert-base-cased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | https://huggingface.co/bert-large-cased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | https://huggingface.co/bert-base-multilingual-uncased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | https://huggingface.co/bert-base-multilingual-cased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | https://huggingface.co/bert-base-chinese/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | https://huggingface.co/bert-base-german-cased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | https://huggingface.co/bert-large-uncased-whole-word-masking/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | https://huggingface.co/bert-large-cased-whole-word-masking/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | https://huggingface.co/bert-large-uncased-whole-word-masking-finetuned-squad/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | https://huggingface.co/bert-large-cased-whole-word-masking-finetuned-squad/resolve/main/vocab.txt | 下载词表文件 
| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | https://huggingface.co/bert-base-cased-finetuned-mrpc/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | https://huggingface.co/bert-base-german-dbmdz-cased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | https://huggingface.co/bert-base-german-dbmdz-uncased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | https://huggingface.co/TurkuNLP/bert-base-finnish-cased-v1/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | https://huggingface.co/TurkuNLP/bert-base-finnish-uncased-v1/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | https://huggingface.co/wietsedv/bert-base-dutch-cased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-base-uncased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-large-uncased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-base-cased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-large-cased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-base-multilingual-uncased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-base-multilingual-cased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-base-chinese/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-base-german-cased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-large-uncased-whole-word-masking/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-large-cased-whole-word-masking/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-large-uncased-whole-word-masking-finetuned-squad/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-large-cased-whole-word-masking-finetuned-squad/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-base-cased-finetuned-mrpc/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-base-german-dbmdz-cased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-base-german-dbmdz-uncased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/TurkuNLP/bert-base-finnish-cased-v1/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/TurkuNLP/bert-base-finnish-uncased-v1/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/wietsedv/bert-base-dutch-cased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-base-uncased/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-large-uncased/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-base-cased/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-large-cased/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-base-multilingual-uncased/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-base-multilingual-cased/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-base-chinese/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-base-german-cased/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-large-uncased-whole-word-masking/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-large-cased-whole-word-masking/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-large-uncased-whole-word-masking-finetuned-squad/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-large-cased-whole-word-masking-finetuned-squad/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-base-cased-finetuned-mrpc/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-base-german-dbmdz-cased/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-base-german-dbmdz-uncased/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/TurkuNLP/bert-base-finnish-cased-v1/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/TurkuNLP/bert-base-finnish-uncased-v1/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/wietsedv/bert-base-dutch-cased/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bertweet/tokenization_bertweet.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bertweet/tokenization_bertweet.py | https://huggingface.co/vinai/bertweet-base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bertweet/tokenization_bertweet.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bertweet/tokenization_bertweet.py | https://huggingface.co/vinai/bertweet-base/resolve/main/bpe.codes | 下载词表文件 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert_generation/tokenization_bert_generation.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert_generation/tokenization_bert_generation.py | https://huggingface.co/google/bert_for_seq_generation_L-24_bbc_encoder/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert_japanese/tokenization_bert_japanese.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert_japanese/tokenization_bert_japanese.py | https://huggingface.co/cl-tohoku/bert-base-japanese/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert_japanese/tokenization_bert_japanese.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert_japanese/tokenization_bert_japanese.py | https://huggingface.co/cl-tohoku/bert-base-japanese-whole-word-masking/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert_japanese/tokenization_bert_japanese.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert_japanese/tokenization_bert_japanese.py | https://huggingface.co/cl-tohoku/bert-base-japanese-char/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert_japanese/tokenization_bert_japanese.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert_japanese/tokenization_bert_japanese.py | https://huggingface.co/cl-tohoku/bert-base-japanese-char-whole-word-masking/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bigbird_pegasus/configuration_bigbird_pegasus.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bigbird_pegasus/configuration_bigbird_pegasus.py | https://huggingface.co/google/bigbird-pegasus-large-arxiv/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bigbird_pegasus/configuration_bigbird_pegasus.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bigbird_pegasus/configuration_bigbird_pegasus.py | https://huggingface.co/google/bigbird-pegasus-large-pubmed/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bigbird_pegasus/configuration_bigbird_pegasus.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bigbird_pegasus/configuration_bigbird_pegasus.py | https://huggingface.co/google/bigbird-pegasus-large-bigpatent/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/big_bird/configuration_big_bird.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/big_bird/configuration_big_bird.py | https://huggingface.co/google/bigbird-roberta-base/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/big_bird/configuration_big_bird.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/big_bird/configuration_big_bird.py | https://huggingface.co/google/bigbird-roberta-large/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/big_bird/configuration_big_bird.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/big_bird/configuration_big_bird.py | https://huggingface.co/google/bigbird-base-trivia-itc/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/big_bird/tokenization_big_bird.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/big_bird/tokenization_big_bird.py | https://huggingface.co/google/bigbird-roberta-base/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/big_bird/tokenization_big_bird.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/big_bird/tokenization_big_bird.py | https://huggingface.co/google/bigbird-roberta-large/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/big_bird/tokenization_big_bird.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/big_bird/tokenization_big_bird.py | https://huggingface.co/google/bigbird-base-trivia-itc/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/big_bird/tokenization_big_bird_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/big_bird/tokenization_big_bird_fast.py | https://huggingface.co/google/bigbird-roberta-base/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/big_bird/tokenization_big_bird_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/big_bird/tokenization_big_bird_fast.py | https://huggingface.co/google/bigbird-roberta-large/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/big_bird/tokenization_big_bird_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/big_bird/tokenization_big_bird_fast.py | https://huggingface.co/google/bigbird-base-trivia-itc/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/big_bird/tokenization_big_bird_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/big_bird/tokenization_big_bird_fast.py | https://huggingface.co/google/bigbird-roberta-base/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/big_bird/tokenization_big_bird_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/big_bird/tokenization_big_bird_fast.py | https://huggingface.co/google/bigbird-roberta-large/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/big_bird/tokenization_big_bird_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/big_bird/tokenization_big_bird_fast.py | https://huggingface.co/google/bigbird-base-trivia-itc/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot/configuration_blenderbot.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/blenderbot/configuration_blenderbot.py | https://huggingface.co/facebook/blenderbot-3B/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot/tokenization_blenderbot.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/blenderbot/tokenization_blenderbot.py | https://huggingface.co/facebook/blenderbot-3B/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot/tokenization_blenderbot.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/blenderbot/tokenization_blenderbot.py | https://huggingface.co/facebook/blenderbot-3B/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot/tokenization_blenderbot.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/blenderbot/tokenization_blenderbot.py | https://huggingface.co/facebook/blenderbot-3B/resolve/main/tokenizer_config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot/tokenization_blenderbot_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/blenderbot/tokenization_blenderbot_fast.py | https://huggingface.co/facebook/blenderbot-3B/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot/tokenization_blenderbot_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/blenderbot/tokenization_blenderbot_fast.py | https://huggingface.co/facebook/blenderbot-3B/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot/tokenization_blenderbot_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/blenderbot/tokenization_blenderbot_fast.py | https://huggingface.co/facebook/blenderbot-3B/resolve/main/tokenizer_config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot_small/configuration_blenderbot_small.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/blenderbot_small/configuration_blenderbot_small.py | https://huggingface.co/facebook/blenderbot_small-90M/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot_small/tokenization_blenderbot_small.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/blenderbot_small/tokenization_blenderbot_small.py | https://huggingface.co/facebook/blenderbot_small-90M/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot_small/tokenization_blenderbot_small.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/blenderbot_small/tokenization_blenderbot_small.py | https://huggingface.co/facebook/blenderbot_small-90M/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot_small/tokenization_blenderbot_small.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/blenderbot_small/tokenization_blenderbot_small.py | 
https://huggingface.co/facebook/blenderbot_small-90M/resolve/main/tokenizer_config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot_small/tokenization_blenderbot_small_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/blenderbot_small/tokenization_blenderbot_small_fast.py | https://huggingface.co/facebook/blenderbot_small-90M/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot_small/tokenization_blenderbot_small_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/blenderbot_small/tokenization_blenderbot_small_fast.py | https://huggingface.co/facebook/blenderbot_small-90M/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot_small/tokenization_blenderbot_small_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/blenderbot_small/tokenization_blenderbot_small_fast.py | https://huggingface.co/facebook/blenderbot_small-90M/resolve/main/tokenizer_config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/camembert/configuration_camembert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/camembert/configuration_camembert.py | https://huggingface.co/camembert-base/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/camembert/configuration_camembert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/camembert/configuration_camembert.py | https://huggingface.co/Musixmatch/umberto-commoncrawl-cased-v1/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/camembert/configuration_camembert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/camembert/configuration_camembert.py | https://huggingface.co/Musixmatch/umberto-wikipedia-uncased-v1/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/camembert/tokenization_camembert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/camembert/tokenization_camembert.py | https://huggingface.co/camembert-base/resolve/main/sentencepiece.bpe.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/camembert/tokenization_camembert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/camembert/tokenization_camembert_fast.py | https://huggingface.co/camembert-base/resolve/main/sentencepiece.bpe.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/camembert/tokenization_camembert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/camembert/tokenization_camembert_fast.py | https://huggingface.co/camembert-base/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/canine/configuration_canine.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/canine/configuration_canine.py | https://huggingface.co/google/canine-s/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/clip/configuration_clip.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/clip/configuration_clip.py | https://huggingface.co/openai/clip-vit-base-patch32/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/clip/tokenization_clip.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/clip/tokenization_clip.py | https://huggingface.co/openai/clip-vit-base-patch32/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/clip/tokenization_clip.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/clip/tokenization_clip.py | https://huggingface.co/openai/clip-vit-base-patch32/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/clip/tokenization_clip_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/clip/tokenization_clip_fast.py | https://huggingface.co/openai/clip-vit-base-patch32/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/clip/tokenization_clip_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/clip/tokenization_clip_fast.py | https://huggingface.co/openai/clip-vit-base-patch32/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/clip/tokenization_clip_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/clip/tokenization_clip_fast.py | https://huggingface.co/openai/clip-vit-base-patch32/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convbert/configuration_convbert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convbert/configuration_convbert.py | https://huggingface.co/YituTech/conv-bert-base/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convbert/configuration_convbert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convbert/configuration_convbert.py | https://huggingface.co/YituTech/conv-bert-medium-small/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convbert/configuration_convbert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convbert/configuration_convbert.py | https://huggingface.co/YituTech/conv-bert-small/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convbert/tokenization_convbert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convbert/tokenization_convbert.py | https://huggingface.co/YituTech/conv-bert-base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convbert/tokenization_convbert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convbert/tokenization_convbert.py | https://huggingface.co/YituTech/conv-bert-medium-small/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/convbert/tokenization_convbert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convbert/tokenization_convbert.py | https://huggingface.co/YituTech/conv-bert-small/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convbert/tokenization_convbert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convbert/tokenization_convbert_fast.py | https://huggingface.co/YituTech/conv-bert-base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convbert/tokenization_convbert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convbert/tokenization_convbert_fast.py | https://huggingface.co/YituTech/conv-bert-medium-small/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convbert/tokenization_convbert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convbert/tokenization_convbert_fast.py | https://huggingface.co/YituTech/conv-bert-small/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/configuration_convnext.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convnext/configuration_convnext.py | https://huggingface.co/facebook/convnext-tiny-224/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 下载数据集图片 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_tiny_1k_224_ema.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_small_1k_224_ema.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_base_1k_224_ema.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_base_1k_384.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | 
https://dl.fbaipublicfiles.com/convnext/convnext_large_1k_224_ema.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_large_1k_384.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_base_22k_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_large_22k_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_xlarge_22k_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_base_22k_1k_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_base_22k_1k_384.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_large_22k_1k_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_large_22k_1k_384.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_xlarge_22k_1k_224_ema.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_xlarge_22k_1k_384_ema.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_tiny_1k_224_ema.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/cpm/tokenization_cpm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/cpm/tokenization_cpm.py | https://huggingface.co/TsinghuaAI/CPM-Generate/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/cpm/tokenization_cpm_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/cpm/tokenization_cpm_fast.py | https://huggingface.co/TsinghuaAI/CPM-Generate/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/cpm/tokenization_cpm_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/cpm/tokenization_cpm_fast.py | https://huggingface.co/TsinghuaAI/CPM-Generate/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/ctrl/configuration_ctrl.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/ctrl/configuration_ctrl.py | https://huggingface.co/ctrl/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/ctrl/tokenization_ctrl.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/ctrl/tokenization_ctrl.py | https://raw.githubusercontent.com/salesforce/ctrl/master/ctrl-vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/ctrl/tokenization_ctrl.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/ctrl/tokenization_ctrl.py | https://raw.githubusercontent.com/salesforce/ctrl/master/ctrl-merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/data2vec/configuration_data2vec_audio.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/data2vec/configuration_data2vec_audio.py | https://huggingface.co/facebook/data2vec-audio-base-960h/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/data2vec/configuration_data2vec_text.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/data2vec/configuration_data2vec_text.py | https://huggingface.co/data2vec/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/configuration_deberta.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta/configuration_deberta.py | https://huggingface.co/microsoft/deberta-base/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/configuration_deberta.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta/configuration_deberta.py | https://huggingface.co/microsoft/deberta-large/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/configuration_deberta.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta/configuration_deberta.py 
| https://huggingface.co/microsoft/deberta-xlarge/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/configuration_deberta.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta/configuration_deberta.py | https://huggingface.co/microsoft/deberta-base-mnli/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/configuration_deberta.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta/configuration_deberta.py | https://huggingface.co/microsoft/deberta-large-mnli/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/configuration_deberta.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta/configuration_deberta.py | https://huggingface.co/microsoft/deberta-xlarge-mnli/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta.py | https://huggingface.co/microsoft/deberta-base/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta.py | https://huggingface.co/microsoft/deberta-large/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta.py | https://huggingface.co/microsoft/deberta-xlarge/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta.py | https://huggingface.co/microsoft/deberta-base-mnli/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta.py | https://huggingface.co/microsoft/deberta-large-mnli/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta.py | https://huggingface.co/microsoft/deberta-xlarge-mnli/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta.py | https://huggingface.co/microsoft/deberta-base/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta.py | 
https://huggingface.co/microsoft/deberta-large/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta.py | https://huggingface.co/microsoft/deberta-xlarge/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta.py | https://huggingface.co/microsoft/deberta-base-mnli/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta.py | https://huggingface.co/microsoft/deberta-large-mnli/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta.py | https://huggingface.co/microsoft/deberta-xlarge-mnli/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta_fast.py | https://huggingface.co/microsoft/deberta-base/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta_fast.py | https://huggingface.co/microsoft/deberta-large/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta_fast.py | https://huggingface.co/microsoft/deberta-xlarge/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta_fast.py | https://huggingface.co/microsoft/deberta-base-mnli/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta_fast.py | https://huggingface.co/microsoft/deberta-large-mnli/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta_fast.py | https://huggingface.co/microsoft/deberta-xlarge-mnli/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta_fast.py | 
https://huggingface.co/microsoft/deberta-base/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta_fast.py | https://huggingface.co/microsoft/deberta-large/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta_fast.py | https://huggingface.co/microsoft/deberta-xlarge/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta_fast.py | https://huggingface.co/microsoft/deberta-base-mnli/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta_fast.py | https://huggingface.co/microsoft/deberta-large-mnli/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta_fast.py | https://huggingface.co/microsoft/deberta-xlarge-mnli/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta_v2/configuration_deberta_v2.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta_v2/configuration_deberta_v2.py | https://huggingface.co/microsoft/deberta-v2-xlarge/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta_v2/configuration_deberta_v2.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta_v2/configuration_deberta_v2.py | https://huggingface.co/microsoft/deberta-v2-xxlarge/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta_v2/configuration_deberta_v2.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta_v2/configuration_deberta_v2.py | https://huggingface.co/microsoft/deberta-v2-xlarge-mnli/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta_v2/configuration_deberta_v2.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta_v2/configuration_deberta_v2.py | https://huggingface.co/microsoft/deberta-v2-xxlarge-mnli/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta_v2/tokenization_deberta_v2.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta_v2/tokenization_deberta_v2.py | https://huggingface.co/microsoft/deberta-v2-xlarge/resolve/main/spm.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta_v2/tokenization_deberta_v2.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta_v2/tokenization_deberta_v2.py | https://huggingface.co/microsoft/deberta-v2-xxlarge/resolve/main/spm.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta_v2/tokenization_deberta_v2.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta_v2/tokenization_deberta_v2.py | https://huggingface.co/microsoft/deberta-v2-xlarge-mnli/resolve/main/spm.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta_v2/tokenization_deberta_v2.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta_v2/tokenization_deberta_v2.py | https://huggingface.co/microsoft/deberta-v2-xxlarge-mnli/resolve/main/spm.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deit/configuration_deit.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deit/configuration_deit.py | https://huggingface.co/facebook/deit-base-patch16-224/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deit/convert_deit_timm_to_pytorch.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deit/convert_deit_timm_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 下载数据集图片 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/detr/configuration_detr.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/detr/configuration_detr.py | https://huggingface.co/facebook/detr-resnet-50/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/detr/convert_detr_original_pytorch_checkpoint_to_pytorch.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/detr/convert_detr_original_pytorch_checkpoint_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 下载数据集图片 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/configuration_distilbert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/distilbert/configuration_distilbert.py | https://huggingface.co/distilbert-base-uncased/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/configuration_distilbert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/distilbert/configuration_distilbert.py | https://huggingface.co/distilbert-base-uncased-distilled-squad/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/configuration_distilbert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/distilbert/configuration_distilbert.py | https://huggingface.co/distilbert-base-cased/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/configuration_distilbert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/distilbert/configuration_distilbert.py | https://huggingface.co/distilbert-base-cased-distilled-squad/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/configuration_distilbert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/distilbert/configuration_distilbert.py | https://huggingface.co/distilbert-base-german-cased/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/configuration_distilbert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/distilbert/configuration_distilbert.py | https://huggingface.co/distilbert-base-multilingual-cased/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/configuration_distilbert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/distilbert/configuration_distilbert.py | https://huggingface.co/distilbert-base-uncased-finetuned-sst-2-english/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert.py | https://huggingface.co/distilbert-base-uncased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert.py | https://huggingface.co/distilbert-base-uncased-distilled-squad/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert.py | https://huggingface.co/distilbert-base-cased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert.py | https://huggingface.co/distilbert-base-cased-distilled-squad/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert.py | https://huggingface.co/distilbert-base-german-cased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert.py | https://huggingface.co/distilbert-base-multilingual-cased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert_fast.py | https://huggingface.co/distilbert-base-uncased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert_fast.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert_fast.py | https://huggingface.co/distilbert-base-uncased-distilled-squad/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert_fast.py | https://huggingface.co/distilbert-base-cased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert_fast.py | https://huggingface.co/distilbert-base-cased-distilled-squad/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert_fast.py | https://huggingface.co/distilbert-base-german-cased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert_fast.py | https://huggingface.co/distilbert-base-multilingual-cased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert_fast.py | https://huggingface.co/distilbert-base-uncased/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert_fast.py | https://huggingface.co/distilbert-base-uncased-distilled-squad/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert_fast.py | https://huggingface.co/distilbert-base-cased/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert_fast.py | https://huggingface.co/distilbert-base-cased-distilled-squad/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert_fast.py | https://huggingface.co/distilbert-base-german-cased/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert_fast.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert_fast.py | https://huggingface.co/distilbert-base-multilingual-cased/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dit/convert_dit_unilm_to_pytorch.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/dit/convert_dit_unilm_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 下载数据集图片 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dit/convert_dit_unilm_to_pytorch.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/dit/convert_dit_unilm_to_pytorch.py | https://layoutlm.blob.core.windows.net/dit/dit-pts/dit-base-224-p16-500k-62d53a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/configuration_dpr.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/dpr/configuration_dpr.py | https://huggingface.co/facebook/dpr-ctx_encoder-single-nq-base/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/configuration_dpr.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/dpr/configuration_dpr.py | https://huggingface.co/facebook/dpr-question_encoder-single-nq-base/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/configuration_dpr.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/dpr/configuration_dpr.py | https://huggingface.co/facebook/dpr-reader-single-nq-base/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/configuration_dpr.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/dpr/configuration_dpr.py | https://huggingface.co/facebook/dpr-ctx_encoder-multiset-base/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/configuration_dpr.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/dpr/configuration_dpr.py | https://huggingface.co/facebook/dpr-question_encoder-multiset-base/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/configuration_dpr.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/dpr/configuration_dpr.py | https://huggingface.co/facebook/dpr-reader-multiset-base/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr.py | https://huggingface.co/facebook/dpr-ctx_encoder-single-nq-base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr.py | https://huggingface.co/facebook/dpr-ctx_encoder-multiset-base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr.py | https://huggingface.co/facebook/dpr-ctx_encoder-single-nq-base/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr.py | https://huggingface.co/facebook/dpr-ctx_encoder-multiset-base/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr.py | https://huggingface.co/facebook/dpr-question_encoder-single-nq-base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr.py | https://huggingface.co/facebook/dpr-question_encoder-multiset-base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr.py | https://huggingface.co/facebook/dpr-question_encoder-single-nq-base/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr.py | https://huggingface.co/facebook/dpr-question_encoder-multiset-base/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr.py | https://huggingface.co/facebook/dpr-reader-single-nq-base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr.py | https://huggingface.co/facebook/dpr-reader-multiset-base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr.py | https://huggingface.co/facebook/dpr-reader-single-nq-base/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr.py | https://huggingface.co/facebook/dpr-reader-multiset-base/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr_fast.py | https://huggingface.co/facebook/dpr-ctx_encoder-single-nq-base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr_fast.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr_fast.py | https://huggingface.co/facebook/dpr-ctx_encoder-multiset-base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr_fast.py | https://huggingface.co/facebook/dpr-ctx_encoder-single-nq-base/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr_fast.py | https://huggingface.co/facebook/dpr-ctx_encoder-multiset-base/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr_fast.py | https://huggingface.co/facebook/dpr-question_encoder-single-nq-base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr_fast.py | https://huggingface.co/facebook/dpr-question_encoder-multiset-base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr_fast.py | https://huggingface.co/facebook/dpr-question_encoder-single-nq-base/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr_fast.py | https://huggingface.co/facebook/dpr-question_encoder-multiset-base/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr_fast.py | https://huggingface.co/facebook/dpr-reader-single-nq-base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr_fast.py | https://huggingface.co/facebook/dpr-reader-multiset-base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr_fast.py | https://huggingface.co/facebook/dpr-reader-single-nq-base/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr_fast.py | https://huggingface.co/facebook/dpr-reader-multiset-base/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/configuration_electra.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/electra/configuration_electra.py | https://huggingface.co/google/electra-small-generator/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/configuration_electra.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/electra/configuration_electra.py | https://huggingface.co/google/electra-base-generator/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/configuration_electra.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/electra/configuration_electra.py | https://huggingface.co/google/electra-large-generator/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/configuration_electra.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/electra/configuration_electra.py | https://huggingface.co/google/electra-small-discriminator/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/configuration_electra.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/electra/configuration_electra.py | https://huggingface.co/google/electra-base-discriminator/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/configuration_electra.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/electra/configuration_electra.py | https://huggingface.co/google/electra-large-discriminator/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra.py | https://huggingface.co/google/electra-small-generator/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra.py | https://huggingface.co/google/electra-base-generator/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra.py | https://huggingface.co/google/electra-large-generator/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra.py | https://huggingface.co/google/electra-small-discriminator/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra.py | https://huggingface.co/google/electra-base-discriminator/resolve/main/vocab.txt 
| 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra.py | https://huggingface.co/google/electra-large-discriminator/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra_fast.py | https://huggingface.co/google/electra-small-generator/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra_fast.py | https://huggingface.co/google/electra-base-generator/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra_fast.py | https://huggingface.co/google/electra-large-generator/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra_fast.py | https://huggingface.co/google/electra-small-discriminator/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra_fast.py | https://huggingface.co/google/electra-base-discriminator/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra_fast.py | https://huggingface.co/google/electra-large-discriminator/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra_fast.py | https://huggingface.co/google/electra-small-generator/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra_fast.py | https://huggingface.co/google/electra-base-generator/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra_fast.py | https://huggingface.co/google/electra-large-generator/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra_fast.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra_fast.py | https://huggingface.co/google/electra-small-discriminator/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra_fast.py | https://huggingface.co/google/electra-base-discriminator/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra_fast.py | https://huggingface.co/google/electra-large-discriminator/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/flaubert/configuration_flaubert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/flaubert/configuration_flaubert.py | https://huggingface.co/flaubert/flaubert_small_cased/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/flaubert/configuration_flaubert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/flaubert/configuration_flaubert.py | https://huggingface.co/flaubert/flaubert_base_uncased/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/flaubert/configuration_flaubert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/flaubert/configuration_flaubert.py | https://huggingface.co/flaubert/flaubert_base_cased/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/flaubert/configuration_flaubert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/flaubert/configuration_flaubert.py | https://huggingface.co/flaubert/flaubert_large_cased/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/flaubert/tokenization_flaubert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/flaubert/tokenization_flaubert.py | https://huggingface.co/flaubert/flaubert_small_cased/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/flaubert/tokenization_flaubert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/flaubert/tokenization_flaubert.py | https://huggingface.co/flaubert/flaubert_base_uncased/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/flaubert/tokenization_flaubert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/flaubert/tokenization_flaubert.py | https://huggingface.co/flaubert/flaubert_base_cased/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/flaubert/tokenization_flaubert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/flaubert/tokenization_flaubert.py | https://huggingface.co/flaubert/flaubert_large_cased/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/flaubert/tokenization_flaubert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/flaubert/tokenization_flaubert.py | https://huggingface.co/flaubert/flaubert_small_cased/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/flaubert/tokenization_flaubert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/flaubert/tokenization_flaubert.py | https://huggingface.co/flaubert/flaubert_base_uncased/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/flaubert/tokenization_flaubert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/flaubert/tokenization_flaubert.py | https://huggingface.co/flaubert/flaubert_base_cased/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/flaubert/tokenization_flaubert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/flaubert/tokenization_flaubert.py | https://huggingface.co/flaubert/flaubert_large_cased/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/fnet/configuration_fnet.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/fnet/configuration_fnet.py | https://huggingface.co/google/fnet-base/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/fnet/configuration_fnet.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/fnet/configuration_fnet.py | https://huggingface.co/google/fnet-large/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/fnet/tokenization_fnet.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/fnet/tokenization_fnet.py | https://huggingface.co/google/fnet-base/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/fnet/tokenization_fnet.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/fnet/tokenization_fnet.py | https://huggingface.co/google/fnet-large/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/fnet/tokenization_fnet_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/fnet/tokenization_fnet_fast.py | https://huggingface.co/google/fnet-base/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/fnet/tokenization_fnet_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/fnet/tokenization_fnet_fast.py | https://huggingface.co/google/fnet-large/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/fnet/tokenization_fnet_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/fnet/tokenization_fnet_fast.py | https://huggingface.co/google/fnet-base/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/fnet/tokenization_fnet_fast.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/fnet/tokenization_fnet_fast.py | https://huggingface.co/google/fnet-large/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/fsmt/tokenization_fsmt.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/fsmt/tokenization_fsmt.py | https://huggingface.co/stas/tiny-wmt19-en-de/resolve/main/vocab-src.json | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/fsmt/tokenization_fsmt.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/fsmt/tokenization_fsmt.py | https://huggingface.co/stas/tiny-wmt19-en-de/resolve/main/vocab-tgt.json | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/fsmt/tokenization_fsmt.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/fsmt/tokenization_fsmt.py | https://huggingface.co/stas/tiny-wmt19-en-de/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/configuration_funnel.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/funnel/configuration_funnel.py | https://huggingface.co/funnel-transformer/small/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/configuration_funnel.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/funnel/configuration_funnel.py | https://huggingface.co/funnel-transformer/small-base/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/configuration_funnel.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/funnel/configuration_funnel.py | https://huggingface.co/funnel-transformer/medium/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/configuration_funnel.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/funnel/configuration_funnel.py | https://huggingface.co/funnel-transformer/medium-base/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/configuration_funnel.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/funnel/configuration_funnel.py | https://huggingface.co/funnel-transformer/intermediate/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/configuration_funnel.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/funnel/configuration_funnel.py | https://huggingface.co/funnel-transformer/intermediate-base/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/configuration_funnel.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/funnel/configuration_funnel.py | https://huggingface.co/funnel-transformer/large/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/configuration_funnel.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/funnel/configuration_funnel.py | https://huggingface.co/funnel-transformer/large-base/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/configuration_funnel.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/funnel/configuration_funnel.py | https://huggingface.co/funnel-transformer/xlarge/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/configuration_funnel.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/funnel/configuration_funnel.py | https://huggingface.co/funnel-transformer/xlarge-base/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel.py | https://huggingface.co/funnel-transformer/small/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel.py | https://huggingface.co/funnel-transformer/small-base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel.py | https://huggingface.co/funnel-transformer/medium/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel.py | https://huggingface.co/funnel-transformer/medium-base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel.py | https://huggingface.co/funnel-transformer/intermediate/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel.py | https://huggingface.co/funnel-transformer/intermediate-base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel.py | https://huggingface.co/funnel-transformer/large/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel.py | https://huggingface.co/funnel-transformer/large-base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel.py | https://huggingface.co/funnel-transformer/xlarge/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel.py | https://huggingface.co/funnel-transformer/xlarge-base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/small/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/small-base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/medium/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/medium-base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/intermediate/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/intermediate-base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/large/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/large-base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/xlarge/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/xlarge-base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/small/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/small-base/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/medium/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/medium-base/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/intermediate/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/intermediate-base/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/large/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/large-base/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/xlarge/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/xlarge-base/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/configuration_gpt2.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt2/configuration_gpt2.py | https://huggingface.co/gpt2/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/configuration_gpt2.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt2/configuration_gpt2.py | https://huggingface.co/gpt2-medium/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/configuration_gpt2.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt2/configuration_gpt2.py | https://huggingface.co/gpt2-large/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/configuration_gpt2.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt2/configuration_gpt2.py | https://huggingface.co/gpt2-xl/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/configuration_gpt2.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt2/configuration_gpt2.py | https://huggingface.co/distilgpt2/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2.py | https://huggingface.co/gpt2/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2.py | https://huggingface.co/gpt2-medium/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2.py | https://huggingface.co/gpt2-large/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2.py | https://huggingface.co/gpt2-xl/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2.py | https://huggingface.co/distilgpt2/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2.py | https://huggingface.co/gpt2/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2.py | https://huggingface.co/gpt2-medium/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2.py | https://huggingface.co/gpt2-large/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2.py | https://huggingface.co/gpt2-xl/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2.py | https://huggingface.co/distilgpt2/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://huggingface.co/gpt2/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://huggingface.co/gpt2-medium/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://huggingface.co/gpt2-large/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://huggingface.co/gpt2-xl/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://huggingface.co/distilgpt2/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://huggingface.co/gpt2/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://huggingface.co/gpt2-medium/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://huggingface.co/gpt2-large/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://huggingface.co/gpt2-xl/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://huggingface.co/distilgpt2/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://huggingface.co/gpt2/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://huggingface.co/gpt2-medium/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://huggingface.co/gpt2-large/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://huggingface.co/gpt2-xl/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://huggingface.co/distilgpt2/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gptj/configuration_gptj.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gptj/configuration_gptj.py | https://huggingface.co/EleutherAI/gpt-j-6B/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt_neo/configuration_gpt_neo.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt_neo/configuration_gpt_neo.py | https://huggingface.co/EleutherAI/gpt-neo-1.3B/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/herbert/tokenization_herbert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/herbert/tokenization_herbert.py | https://huggingface.co/allegro/herbert-base-cased/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/herbert/tokenization_herbert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/herbert/tokenization_herbert.py | https://huggingface.co/allegro/herbert-base-cased/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/herbert/tokenization_herbert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/herbert/tokenization_herbert_fast.py | https://huggingface.co/allegro/herbert-base-cased/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/herbert/tokenization_herbert_fast.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/herbert/tokenization_herbert_fast.py | https://huggingface.co/allegro/herbert-base-cased/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/hubert/configuration_hubert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/hubert/configuration_hubert.py | https://huggingface.co/facebook/hubert-base-ls960/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/ibert/configuration_ibert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/ibert/configuration_ibert.py | https://huggingface.co/kssteven/ibert-roberta-base/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/ibert/configuration_ibert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/ibert/configuration_ibert.py | https://huggingface.co/kssteven/ibert-roberta-large/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/ibert/configuration_ibert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/ibert/configuration_ibert.py | https://huggingface.co/kssteven/ibert-roberta-large-mnli/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/layoutlm/configuration_layoutlm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/layoutlm/configuration_layoutlm.py | https://huggingface.co/microsoft/layoutlm-base-uncased/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/layoutlm/configuration_layoutlm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/layoutlm/configuration_layoutlm.py | https://huggingface.co/microsoft/layoutlm-large-uncased/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/layoutlm/tokenization_layoutlm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/layoutlm/tokenization_layoutlm.py | https://huggingface.co/microsoft/layoutlm-base-uncased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/layoutlm/tokenization_layoutlm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/layoutlm/tokenization_layoutlm.py | https://huggingface.co/microsoft/layoutlm-large-uncased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/layoutlm/tokenization_layoutlm_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/layoutlm/tokenization_layoutlm_fast.py | https://huggingface.co/microsoft/layoutlm-base-uncased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/layoutlm/tokenization_layoutlm_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/layoutlm/tokenization_layoutlm_fast.py | https://huggingface.co/microsoft/layoutlm-large-uncased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/layoutlm/tokenization_layoutlm_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/layoutlm/tokenization_layoutlm_fast.py | https://huggingface.co/microsoft/layoutlm-base-uncased/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/layoutlm/tokenization_layoutlm_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/layoutlm/tokenization_layoutlm_fast.py | https://huggingface.co/microsoft/layoutlm-large-uncased/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/layoutlmv2/configuration_layoutlmv2.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/layoutlmv2/configuration_layoutlmv2.py | https://huggingface.co/microsoft/layoutlmv2-base-uncased/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/layoutlmv2/configuration_layoutlmv2.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/layoutlmv2/configuration_layoutlmv2.py | https://huggingface.co/microsoft/layoutlmv2-large-uncased/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/layoutlmv2/tokenization_layoutlmv2.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/layoutlmv2/tokenization_layoutlmv2.py | https://huggingface.co/microsoft/layoutlmv2-base-uncased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/layoutlmv2/tokenization_layoutlmv2.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/layoutlmv2/tokenization_layoutlmv2.py | https://huggingface.co/microsoft/layoutlmv2-large-uncased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/layoutlmv2/tokenization_layoutlmv2_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/layoutlmv2/tokenization_layoutlmv2_fast.py | https://huggingface.co/microsoft/layoutlmv2-base-uncased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/layoutlmv2/tokenization_layoutlmv2_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/layoutlmv2/tokenization_layoutlmv2_fast.py | https://huggingface.co/microsoft/layoutlmv2-base-uncased/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/led/configuration_led.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/led/configuration_led.py | https://huggingface.co/allenai/led-base-16384/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/led/tokenization_led.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/led/tokenization_led.py | https://huggingface.co/allenai/led-base-16384/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/led/tokenization_led.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/led/tokenization_led.py | 
https://huggingface.co/allenai/led-base-16384/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/led/tokenization_led.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/led/tokenization_led.py | https://huggingface.co/allenai/led-base-16384/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/led/tokenization_led_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/led/tokenization_led_fast.py | https://huggingface.co/allenai/led-base-16384/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/led/tokenization_led_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/led/tokenization_led_fast.py | https://huggingface.co/allenai/led-base-16384/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/led/tokenization_led_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/led/tokenization_led_fast.py | https://huggingface.co/allenai/led-base-16384/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/configuration_longformer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/longformer/configuration_longformer.py | https://huggingface.co/allenai/longformer-base-4096/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/configuration_longformer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/longformer/configuration_longformer.py | https://huggingface.co/allenai/longformer-large-4096/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/configuration_longformer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/longformer/configuration_longformer.py | https://huggingface.co/allenai/longformer-large-4096-finetuned-triviaqa/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/configuration_longformer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/longformer/configuration_longformer.py | https://huggingface.co/allenai/longformer-base-4096-extra.pos.embd.only/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/configuration_longformer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/longformer/configuration_longformer.py | https://huggingface.co/allenai/longformer-large-4096-extra.pos.embd.only/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer.py | https://huggingface.co/allenai/longformer-base-4096/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer.py | https://huggingface.co/allenai/longformer-large-4096/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer.py | https://huggingface.co/allenai/longformer-large-4096-finetuned-triviaqa/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer.py | https://huggingface.co/allenai/longformer-base-4096-extra.pos.embd.only/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer.py | https://huggingface.co/allenai/longformer-large-4096-extra.pos.embd.only/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer.py | https://huggingface.co/allenai/longformer-base-4096/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer.py | https://huggingface.co/allenai/longformer-large-4096/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer.py | https://huggingface.co/allenai/longformer-large-4096-finetuned-triviaqa/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer.py | https://huggingface.co/allenai/longformer-base-4096-extra.pos.embd.only/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer.py | https://huggingface.co/allenai/longformer-large-4096-extra.pos.embd.only/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer_fast.py | https://huggingface.co/allenai/longformer-base-4096/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer_fast.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer_fast.py | https://huggingface.co/allenai/longformer-large-4096/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer_fast.py | https://huggingface.co/allenai/longformer-large-4096-finetuned-triviaqa/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer_fast.py | https://huggingface.co/allenai/longformer-base-4096-extra.pos.embd.only/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer_fast.py | https://huggingface.co/allenai/longformer-large-4096-extra.pos.embd.only/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer_fast.py | https://huggingface.co/allenai/longformer-base-4096/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer_fast.py | https://huggingface.co/allenai/longformer-large-4096/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer_fast.py | https://huggingface.co/allenai/longformer-large-4096-finetuned-triviaqa/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer_fast.py | https://huggingface.co/allenai/longformer-base-4096-extra.pos.embd.only/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer_fast.py | https://huggingface.co/allenai/longformer-large-4096-extra.pos.embd.only/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer_fast.py | https://huggingface.co/allenai/longformer-base-4096/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer_fast.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer_fast.py | https://huggingface.co/allenai/longformer-large-4096/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer_fast.py | https://huggingface.co/allenai/longformer-large-4096-finetuned-triviaqa/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer_fast.py | https://huggingface.co/allenai/longformer-base-4096-extra.pos.embd.only/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer_fast.py | https://huggingface.co/allenai/longformer-large-4096-extra.pos.embd.only/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/luke/configuration_luke.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/luke/configuration_luke.py | https://huggingface.co/studio-ousia/luke-base/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/luke/configuration_luke.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/luke/configuration_luke.py | https://huggingface.co/studio-ousia/luke-large/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/luke/tokenization_luke.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/luke/tokenization_luke.py | https://huggingface.co/studio-ousia/luke-base/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/luke/tokenization_luke.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/luke/tokenization_luke.py | https://huggingface.co/studio-ousia/luke-large/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/luke/tokenization_luke.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/luke/tokenization_luke.py | https://huggingface.co/studio-ousia/luke-base/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/luke/tokenization_luke.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/luke/tokenization_luke.py | https://huggingface.co/studio-ousia/luke-large/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/luke/tokenization_luke.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/luke/tokenization_luke.py | https://huggingface.co/studio-ousia/luke-base/resolve/main/entity_vocab.json | 下载词汇文件 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/luke/tokenization_luke.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/luke/tokenization_luke.py | https://huggingface.co/studio-ousia/luke-large/resolve/main/entity_vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/lxmert/tokenization_lxmert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/lxmert/tokenization_lxmert.py | https://huggingface.co/unc-nlp/lxmert-base-uncased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/lxmert/tokenization_lxmert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/lxmert/tokenization_lxmert_fast.py | https://huggingface.co/unc-nlp/lxmert-base-uncased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/lxmert/tokenization_lxmert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/lxmert/tokenization_lxmert_fast.py | https://huggingface.co/unc-nlp/lxmert-base-uncased/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/m2m_100/configuration_m2m_100.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/m2m_100/configuration_m2m_100.py | https://huggingface.co/facebook/m2m100_418M/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/m2m_100/tokenization_m2m_100.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/m2m_100/tokenization_m2m_100.py | https://huggingface.co/facebook/m2m100_418M/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/m2m_100/tokenization_m2m_100.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/m2m_100/tokenization_m2m_100.py | https://huggingface.co/facebook/m2m100_1.2B/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/m2m_100/tokenization_m2m_100.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/m2m_100/tokenization_m2m_100.py | https://huggingface.co/facebook/m2m100_418M/resolve/main/sentencepiece.bpe.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/m2m_100/tokenization_m2m_100.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/m2m_100/tokenization_m2m_100.py | https://huggingface.co/facebook/m2m100_1.2B/resolve/main/sentencepiece.bpe.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/m2m_100/tokenization_m2m_100.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/m2m_100/tokenization_m2m_100.py | https://huggingface.co/facebook/m2m100_418M/resolve/main/tokenizer_config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/m2m_100/tokenization_m2m_100.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/m2m_100/tokenization_m2m_100.py | https://huggingface.co/facebook/m2m100_1.2B/resolve/main/tokenizer_config.json | 下载预训练配置文件 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/marian/configuration_marian.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/marian/configuration_marian.py | https://huggingface.co/Helsinki-NLP/opus-mt-en-de/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/marian/convert_marian_tatoeba_to_pytorch.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/marian/convert_marian_tatoeba_to_pytorch.py | https://datahub.io/core/language-codes/r/language-codes-3b2.csv | 下载词汇编码表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/marian/convert_marian_tatoeba_to_pytorch.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/marian/convert_marian_tatoeba_to_pytorch.py | https://cdn-datasets.huggingface.co/language_codes/iso-639-3.csv | 下载词汇编码表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/marian/convert_marian_tatoeba_to_pytorch.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/marian/convert_marian_tatoeba_to_pytorch.py | https://object.pouta.csc.fi/Tatoeba-MT-models | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/marian/convert_marian_to_pytorch.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/marian/convert_marian_to_pytorch.py | https://github.com/Helsinki-NLP/OPUS-MT-train/blob/master/models/ | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/marian/convert_marian_to_pytorch.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/marian/convert_marian_to_pytorch.py | https://github.com/Helsinki-NLP/{repo_root}/tree/master/models/{opus_name}/README.md | 下载说明文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/marian/tokenization_marian.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/marian/tokenization_marian.py | https://huggingface.co/Helsinki-NLP/opus-mt-en-de/resolve/main/source.spm | 下载spm文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/marian/tokenization_marian.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/marian/tokenization_marian.py | https://huggingface.co/Helsinki-NLP/opus-mt-en-de/resolve/main/target.spm | 下载spm文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/marian/tokenization_marian.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/marian/tokenization_marian.py | https://huggingface.co/Helsinki-NLP/opus-mt-en-de/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/marian/tokenization_marian.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/marian/tokenization_marian.py | https://huggingface.co/Helsinki-NLP/opus-mt-en-de/resolve/main/tokenizer_config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/maskformer/configuration_maskformer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/maskformer/configuration_maskformer.py | https://huggingface.co/facebook/maskformer-swin-base-ade/blob/main/config.json | 下载预训练配置文件 | -| 开源代码引入 
| https://github.com/huggingface/transformers/blob/main/src/transformers/models/maskformer/convert_maskformer_original_pytorch_checkpoint_to_pytorch.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/maskformer/convert_maskformer_original_pytorch_checkpoint_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 下载数据集图片 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mbart/configuration_mbart.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mbart/configuration_mbart.py | https://huggingface.co/facebook/mbart-large-cc25/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mbart/tokenization_mbart.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mbart/tokenization_mbart.py | https://huggingface.co/facebook/mbart-large-en-ro/resolve/main/sentencepiece.bpe.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mbart/tokenization_mbart.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mbart/tokenization_mbart.py | https://huggingface.co/facebook/mbart-large-cc25/resolve/main/sentencepiece.bpe.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mbart/tokenization_mbart_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mbart/tokenization_mbart_fast.py | https://huggingface.co/facebook/mbart-large-en-ro/resolve/main/sentencepiece.bpe.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mbart/tokenization_mbart_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mbart/tokenization_mbart_fast.py | https://huggingface.co/facebook/mbart-large-cc25/resolve/main/sentencepiece.bpe.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mbart/tokenization_mbart_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mbart/tokenization_mbart_fast.py | https://huggingface.co/facebook/mbart-large-en-ro/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mbart/tokenization_mbart_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mbart/tokenization_mbart_fast.py | https://huggingface.co/facebook/mbart-large-cc25/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mbart50/tokenization_mbart50.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mbart50/tokenization_mbart50.py | https://huggingface.co/facebook/mbart-large-50-one-to-many-mmt/resolve/main/sentencepiece.bpe.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mbart50/tokenization_mbart50_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mbart50/tokenization_mbart50_fast.py | https://huggingface.co/facebook/mbart-large-50-one-to-many-mmt/resolve/main/sentencepiece.bpe.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mbart50/tokenization_mbart50_fast.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mbart50/tokenization_mbart50_fast.py | https://huggingface.co/facebook/mbart-large-50-one-to-many-mmt/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mluke/tokenization_mluke.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mluke/tokenization_mluke.py | https://huggingface.co/studio-ousia/mluke-base/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mluke/tokenization_mluke.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mluke/tokenization_mluke.py | https://huggingface.co/studio-ousia/mluke-base/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mluke/tokenization_mluke.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mluke/tokenization_mluke.py | https://huggingface.co/studio-ousia/mluke-base/resolve/main/entity_vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mobilebert/configuration_mobilebert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mobilebert/configuration_mobilebert.py | https://huggingface.co/google/mobilebert-uncased/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mobilebert/tokenization_mobilebert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mobilebert/tokenization_mobilebert.py | https://huggingface.co/google/mobilebert-uncased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mobilebert/tokenization_mobilebert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mobilebert/tokenization_mobilebert_fast.py | https://huggingface.co/google/mobilebert-uncased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mobilebert/tokenization_mobilebert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mobilebert/tokenization_mobilebert_fast.py | https://huggingface.co/google/mobilebert-uncased/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mpnet/configuration_mpnet.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mpnet/configuration_mpnet.py | https://huggingface.co/microsoft/mpnet-base/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mpnet/tokenization_mpnet.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mpnet/tokenization_mpnet.py | https://huggingface.co/microsoft/mpnet-base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mpnet/tokenization_mpnet_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mpnet/tokenization_mpnet_fast.py | https://huggingface.co/microsoft/mpnet-base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mpnet/tokenization_mpnet_fast.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mpnet/tokenization_mpnet_fast.py | https://huggingface.co/microsoft/mpnet-base/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/nystromformer/configuration_nystromformer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/nystromformer/configuration_nystromformer.py | https://huggingface.co/uw-madison/nystromformer-512/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/openai/configuration_openai.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/openai/configuration_openai.py | https://huggingface.co/openai-gpt/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/openai/tokenization_openai.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/openai/tokenization_openai.py | https://huggingface.co/openai-gpt/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/openai/tokenization_openai.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/openai/tokenization_openai.py | https://huggingface.co/openai-gpt/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/openai/tokenization_openai_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/openai/tokenization_openai_fast.py | https://huggingface.co/openai-gpt/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/openai/tokenization_openai_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/openai/tokenization_openai_fast.py | https://huggingface.co/openai-gpt/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/openai/tokenization_openai_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/openai/tokenization_openai_fast.py | https://huggingface.co/openai-gpt/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/pegasus/configuration_pegasus.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/pegasus/configuration_pegasus.py | https://huggingface.co/google/pegasus-large/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/pegasus/tokenization_pegasus.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/pegasus/tokenization_pegasus.py | https://huggingface.co/google/pegasus-xsum/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/pegasus/tokenization_pegasus_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/pegasus/tokenization_pegasus_fast.py | https://huggingface.co/google/pegasus-xsum/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/pegasus/tokenization_pegasus_fast.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/pegasus/tokenization_pegasus_fast.py | https://huggingface.co/google/pegasus-xsum/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/perceiver/configuration_perceiver.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/perceiver/configuration_perceiver.py | https://huggingface.co/deepmind/language-perceiver/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/perceiver/convert_perceiver_haiku_to_pytorch.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/perceiver/convert_perceiver_haiku_to_pytorch.py | https://storage.googleapis.com/perceiver_io/dalmation.jpg | 下载数据集图片 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/phobert/tokenization_phobert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/phobert/tokenization_phobert.py | https://huggingface.co/vinai/phobert-base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/phobert/tokenization_phobert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/phobert/tokenization_phobert.py | https://huggingface.co/vinai/phobert-large/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/phobert/tokenization_phobert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/phobert/tokenization_phobert.py | https://huggingface.co/vinai/phobert-base/resolve/main/bpe.codes | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/phobert/tokenization_phobert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/phobert/tokenization_phobert.py | https://huggingface.co/vinai/phobert-large/resolve/main/bpe.codes | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/plbart/configuration_plbart.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/plbart/configuration_plbart.py | https://huggingface.co/uclanlp/plbart-base/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/plbart/tokenization_plbart.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | https://huggingface.co/uclanlp/plbart-base/resolve/main/sentencepiece.bpe.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/plbart/tokenization_plbart.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | https://huggingface.co/uclanlp/plbart-c-cpp-defect-detection/resolve/main/sentencepiece.bpe.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/plbart/tokenization_plbart.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | https://huggingface.co/uclanlp/plbart-cs-java/resolve/main/sentencepiece.bpe.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/plbart/tokenization_plbart.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | https://huggingface.co/uclanlp/plbart-en_XX-java/resolve/main/sentencepiece.bpe.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/plbart/tokenization_plbart.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | https://huggingface.co/uclanlp/plbart-go-en_XX/resolve/main/sentencepiece.bpe.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/plbart/tokenization_plbart.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | https://huggingface.co/uclanlp/plbart-java-clone-detection/resolve/main/sentencepiece.bpe.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/plbart/tokenization_plbart.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | https://huggingface.co/uclanlp/plbart-java-cs/resolve/main/sentencepiece.bpe.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/plbart/tokenization_plbart.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | https://huggingface.co/uclanlp/plbart-java-en_XX/resolve/main/sentencepiece.bpe.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/plbart/tokenization_plbart.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | https://huggingface.co/uclanlp/plbart-javascript-en_XX/resolve/main/sentencepiece.bpe.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/plbart/tokenization_plbart.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | https://huggingface.co/uclanlp/plbart-php-en_XX/resolve/main/sentencepiece.bpe.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/plbart/tokenization_plbart.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | https://huggingface.co/uclanlp/plbart-python-en_XX/resolve/main/sentencepiece.bpe.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/plbart/tokenization_plbart.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | https://huggingface.co/uclanlp/plbart-refine-java-medium/resolve/main/sentencepiece.bpe.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/plbart/tokenization_plbart.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | https://huggingface.co/uclanlp/plbart-refine-java-small/resolve/main/sentencepiece.bpe.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/plbart/tokenization_plbart.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | https://huggingface.co/uclanlp/plbart-ruby-en_XX/resolve/main/sentencepiece.bpe.model | 下载预训练文件 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/poolformer/configuration_poolformer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/poolformer/configuration_poolformer.py | https://huggingface.co/sail/poolformer_s12/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/poolformer/convert_poolformer_original_to_pytorch.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/poolformer/convert_poolformer_original_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 下载数据集图片 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/prophetnet/configuration_prophetnet.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/prophetnet/configuration_prophetnet.py | https://huggingface.co/microsoft/prophetnet-large-uncased/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/prophetnet/tokenization_prophetnet.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/prophetnet/tokenization_prophetnet.py | https://huggingface.co/microsoft/prophetnet-large-uncased/resolve/main/prophetnet.tokenizer | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/qdqbert/configuration_qdqbert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/qdqbert/configuration_qdqbert.py | https://huggingface.co/bert-base-uncased/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/rag/retrieval_rag.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/rag/retrieval_rag.py | https://storage.googleapis.com/huggingface-nlp/datasets/wiki_dpr/ | 下载数据集 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/configuration_realm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/realm/configuration_realm.py | https://huggingface.co/google/realm-cc-news-pretrained-embedder/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/configuration_realm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/realm/configuration_realm.py | https://huggingface.co/google/realm-cc-news-pretrained-encoder/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/configuration_realm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/realm/configuration_realm.py | https://huggingface.co/google/realm-cc-news-pretrained-scorer/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/configuration_realm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/realm/configuration_realm.py | https://huggingface.co/google/realm-cc-news-pretrained-openqa/aresolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/configuration_realm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/realm/configuration_realm.py | 
https://huggingface.co/google/realm-orqa-nq-openqa/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/configuration_realm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/realm/configuration_realm.py | https://huggingface.co/google/realm-orqa-nq-reader/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/configuration_realm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/realm/configuration_realm.py | https://huggingface.co/google/realm-orqa-wq-openqa/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/configuration_realm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/realm/configuration_realm.py | https://huggingface.co/google/realm-orqa-wq-reader/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm.py | https://huggingface.co/google/realm-cc-news-pretrained-embedder/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm.py | https://huggingface.co/google/realm-cc-news-pretrained-encoder/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm.py | https://huggingface.co/google/realm-cc-news-pretrained-scorer/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm.py | https://huggingface.co/google/realm-cc-news-pretrained-openqa/aresolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm.py | https://huggingface.co/google/realm-orqa-nq-openqa/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm.py | https://huggingface.co/google/realm-orqa-nq-reader/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm.py | https://huggingface.co/google/realm-orqa-wq-openqa/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm.py | https://huggingface.co/google/realm-orqa-wq-reader/resolve/main/vocab.txt | 
下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://huggingface.co/google/realm-cc-news-pretrained-embedder/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://huggingface.co/google/realm-cc-news-pretrained-encoder/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://huggingface.co/google/realm-cc-news-pretrained-scorer/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://huggingface.co/google/realm-cc-news-pretrained-openqa/aresolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://huggingface.co/google/realm-orqa-nq-openqa/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://huggingface.co/google/realm-orqa-nq-reader/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://huggingface.co/google/realm-orqa-wq-openqa/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://huggingface.co/google/realm-orqa-wq-reader/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://huggingface.co/google/realm-cc-news-pretrained-embedder/resolve/main/tokenizer.jsont | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://huggingface.co/google/realm-cc-news-pretrained-encoder/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | 
https://huggingface.co/google/realm-cc-news-pretrained-scorer/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://huggingface.co/google/realm-cc-news-pretrained-openqa/aresolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://huggingface.co/google/realm-orqa-nq-openqa/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://huggingface.co/google/realm-orqa-nq-reader/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://huggingface.co/google/realm-orqa-wq-openqa/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://huggingface.co/google/realm-orqa-wq-reader/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/reformer/configuration_reformer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/reformer/configuration_reformer.py | https://huggingface.co/google/reformer-crime-and-punishment/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/reformer/configuration_reformer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/reformer/configuration_reformer.py | https://huggingface.co/google/reformer-enwik8/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/reformer/tokenization_reformer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/reformer/tokenization_reformer.py | https://huggingface.co/google/reformer-crime-and-punishment/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/reformer/tokenization_reformer_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/reformer/tokenization_reformer_fast.py | https://huggingface.co/google/reformer-crime-and-punishment/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/reformer/tokenization_reformer_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/reformer/tokenization_reformer_fast.py | https://huggingface.co/google/reformer-crime-and-punishment/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/rembert/configuration_rembert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/rembert/configuration_rembert.py | https://huggingface.co/google/rembert/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/rembert/tokenization_rembert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/rembert/tokenization_rembert.py | https://huggingface.co/google/rembert/resolve/main/sentencepiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/rembert/tokenization_rembert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/rembert/tokenization_rembert_fast.py | https://huggingface.co/google/rembert/resolve/main/sentencepiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/rembert/tokenization_rembert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/rembert/tokenization_rembert_fast.py | https://huggingface.co/google/rembert/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/resnet/configuration_resnet.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/resnet/configuration_resnet.py | https://huggingface.co/microsoft/resnet-50/blob/main/config.json | 下载预训练配置文件 | -| 开发引入 | / | url.ini | https://huggingface.co/distilbert-base-uncased/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/retribert/tokenization_retribert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/retribert/tokenization_retribert.py | https://huggingface.co/yjernite/retribert-base-uncased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/retribert/tokenization_retribert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/retribert/tokenization_retribert_fast.py | https://huggingface.co/yjernite/retribert-base-uncased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/retribert/tokenization_retribert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/retribert/tokenization_retribert_fast.py | https://huggingface.co/yjernite/retribert-base-uncased/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/configuration_roberta.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roberta/configuration_roberta.py | https://huggingface.co/roberta-base/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/configuration_roberta.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roberta/configuration_roberta.py | https://huggingface.co/roberta-large/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/configuration_roberta.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roberta/configuration_roberta.py | 
https://huggingface.co/roberta-large-mnli/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/configuration_roberta.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roberta/configuration_roberta.py | https://huggingface.co/distilroberta-base/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/configuration_roberta.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roberta/configuration_roberta.py | https://huggingface.co/roberta-base-openai-detector/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/configuration_roberta.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roberta/configuration_roberta.py | https://huggingface.co/roberta-large-openai-detector/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta.py | https://huggingface.co/roberta-base/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta.py | https://huggingface.co/roberta-large/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta.py | https://huggingface.co/roberta-large-mnli/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta.py | https://huggingface.co/distilroberta-base/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta.py | https://huggingface.co/roberta-base-openai-detector/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta.py | https://huggingface.co/roberta-large-openai-detector/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta.py | https://huggingface.co/roberta-base/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta.py | https://huggingface.co/roberta-large/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta.py | https://huggingface.co/roberta-large-mnli/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta.py | https://huggingface.co/distilroberta-base/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta.py | https://huggingface.co/roberta-base-openai-detector/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta.py | https://huggingface.co/roberta-large-openai-detector/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/roberta-base/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/roberta-large/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/roberta-large-mnli/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/distilroberta-base/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/roberta-base-openai-detector/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/roberta-large-openai-detector/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/roberta-base/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/roberta-large/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/roberta-large-mnli/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/distilroberta-base/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/roberta-base-openai-detector/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/roberta-large-openai-detector/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/roberta-base/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/roberta-large/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/roberta-large-mnli/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/distilroberta-base/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/roberta-base-openai-detector/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/roberta-large-openai-detector/resolve/main/tokenizer.json | 
下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/configuration_roformer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roformer/configuration_roformer.py | https://huggingface.co/junnyu/roformer_chinese_small/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/configuration_roformer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roformer/configuration_roformer.py | https://huggingface.co/junnyu/roformer_chinese_base/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/configuration_roformer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roformer/configuration_roformer.py | https://huggingface.co/junnyu/roformer_chinese_char_small/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/configuration_roformer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roformer/configuration_roformer.py | https://huggingface.co/junnyu/roformer_chinese_char_base/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/configuration_roformer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roformer/configuration_roformer.py | https://huggingface.co/junnyu/roformer_small_discriminator/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/configuration_roformer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roformer/configuration_roformer.py | https://huggingface.co/junnyu/roformer_small_generator/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/tokenization_roformer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roformer/tokenization_roformer.py | https://huggingface.co/junnyu/roformer_chinese_small/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/tokenization_roformer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roformer/tokenization_roformer.py | https://huggingface.co/junnyu/roformer_chinese_base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/tokenization_roformer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roformer/tokenization_roformer.py | https://huggingface.co/junnyu/roformer_chinese_char_small/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/tokenization_roformer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roformer/tokenization_roformer.py | https://huggingface.co/junnyu/roformer_chinese_char_base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/tokenization_roformer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roformer/tokenization_roformer.py | 
https://huggingface.co/junnyu/roformer_small_discriminator/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/tokenization_roformer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roformer/tokenization_roformer.py | https://huggingface.co/junnyu/roformer_small_generator/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/tokenization_roformer_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roformer/tokenization_roformer_fast.py | https://huggingface.co/junnyu/roformer_chinese_small/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/tokenization_roformer_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roformer/tokenization_roformer_fast.py | https://huggingface.co/junnyu/roformer_chinese_base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/tokenization_roformer_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roformer/tokenization_roformer_fast.py | https://huggingface.co/junnyu/roformer_chinese_char_small/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/tokenization_roformer_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roformer/tokenization_roformer_fast.py | https://huggingface.co/junnyu/roformer_chinese_char_base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/tokenization_roformer_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roformer/tokenization_roformer_fast.py | https://huggingface.co/junnyu/roformer_small_discriminator/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/tokenization_roformer_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roformer/tokenization_roformer_fast.py | https://huggingface.co/junnyu/roformer_small_generator/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/segformer/configuration_segformer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/segformer/configuration_segformer.py | https://huggingface.co/nvidia/segformer-b0-finetuned-ade-512-512/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/segformer/convert_segformer_original_to_pytorch.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/segformer/convert_segformer_original_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 下载数据集图片 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/sew/configuration_sew.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/sew/configuration_sew.py | https://huggingface.co/asapp/sew-tiny-100k/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/sew_d/configuration_sew_d.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/sew_d/configuration_sew_d.py | https://huggingface.co/asapp/sew-d-tiny-100k/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/speech_to_text/configuration_speech_to_text.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/speech_to_text/configuration_speech_to_text.py | https://huggingface.co/facebook/s2t-small-librispeech-asr/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/speech_to_text/tokenization_speech_to_text.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/speech_to_text/tokenization_speech_to_text.py | https://huggingface.co/facebook/s2t-small-librispeech-asr/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/speech_to_text/tokenization_speech_to_text.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/speech_to_text/tokenization_speech_to_text.py | https://huggingface.co/facebook/s2t-small-librispeech-asr/resolve/main/sentencepiece.bpe.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/speech_to_text_2/configuration_speech_to_text_2.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/speech_to_text_2/configuration_speech_to_text_2.py | https://huggingface.co/facebook/s2t-wav2vec2-large-en-de/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/speech_to_text_2/tokenization_speech_to_text_2.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/speech_to_text_2/tokenization_speech_to_text_2.py | https://huggingface.co/facebook/s2t-wav2vec2-large-en-de/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/speech_to_text_2/tokenization_speech_to_text_2.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/speech_to_text_2/tokenization_speech_to_text_2.py | https://huggingface.co/facebook/s2t-wav2vec2-large-en-de/resolve/main/tokenizer_config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/speech_to_text_2/tokenization_speech_to_text_2.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/speech_to_text_2/tokenization_speech_to_text_2.py | https://huggingface.co/facebook/s2t-wav2vec2-large-en-de/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/splinter/configuration_splinter.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/splinter/configuration_splinter.py | https://huggingface.co/tau/splinter-base/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/splinter/configuration_splinter.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/splinter/configuration_splinter.py | https://huggingface.co/tau/splinter-base-qass/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/splinter/configuration_splinter.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/splinter/configuration_splinter.py | https://huggingface.co/tau/splinter-large/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/splinter/configuration_splinter.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/splinter/configuration_splinter.py | https://huggingface.co/tau/splinter-large-qass/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/splinter/tokenization_splinter.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/splinter/tokenization_splinter.py | https://huggingface.co/tau/splinter-base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/splinter/tokenization_splinter.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/splinter/tokenization_splinter.py | https://huggingface.co/tau/splinter-base-qass/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/splinter/tokenization_splinter.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/splinter/tokenization_splinter.py | https://huggingface.co/tau/splinter-large/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/splinter/tokenization_splinter.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/splinter/tokenization_splinter.py | https://huggingface.co/tau/splinter-large-qass/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/splinter/tokenization_splinter_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/splinter/tokenization_splinter_fast.py | https://huggingface.co/tau/splinter-base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/splinter/tokenization_splinter_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/splinter/tokenization_splinter_fast.py | https://huggingface.co/tau/splinter-base-qass/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/splinter/tokenization_splinter_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/splinter/tokenization_splinter_fast.py | https://huggingface.co/tau/splinter-large/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/splinter/tokenization_splinter_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/splinter/tokenization_splinter_fast.py | https://huggingface.co/tau/splinter-large-qass/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/squeezebert/configuration_squeezebert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/squeezebert/configuration_squeezebert.py | https://huggingface.co/squeezebert/squeezebert-uncased/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/squeezebert/configuration_squeezebert.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/squeezebert/configuration_squeezebert.py | https://huggingface.co/squeezebert/squeezebert-mnli/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/squeezebert/configuration_squeezebert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/squeezebert/configuration_squeezebert.py | https://huggingface.co/squeezebert/squeezebert-mnli-headless/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/squeezebert/tokenization_squeezebert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/squeezebert/tokenization_squeezebert.py | https://huggingface.co/squeezebert/squeezebert-uncased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/squeezebert/tokenization_squeezebert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/squeezebert/tokenization_squeezebert.py | https://huggingface.co/squeezebert/squeezebert-mnli/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/squeezebert/tokenization_squeezebert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/squeezebert/tokenization_squeezebert.py | https://huggingface.co/squeezebert/squeezebert-mnli-headless/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/squeezebert/tokenization_squeezebert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/squeezebert/tokenization_squeezebert_fast.py | https://huggingface.co/squeezebert/squeezebert-uncased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/squeezebert/tokenization_squeezebert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/squeezebert/tokenization_squeezebert_fast.py | https://huggingface.co/squeezebert/squeezebert-mnli/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/squeezebert/tokenization_squeezebert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/squeezebert/tokenization_squeezebert_fast.py | https://huggingface.co/squeezebert/squeezebert-mnli-headless/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/squeezebert/tokenization_squeezebert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/squeezebert/tokenization_squeezebert_fast.py | https://huggingface.co/squeezebert/squeezebert-uncased/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/squeezebert/tokenization_squeezebert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/squeezebert/tokenization_squeezebert_fast.py | https://huggingface.co/squeezebert/squeezebert-mnli/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/squeezebert/tokenization_squeezebert_fast.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/squeezebert/tokenization_squeezebert_fast.py | https://huggingface.co/squeezebert/squeezebert-mnli-headless/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/swin/configuration_swin.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/swin/configuration_swin.py | https://huggingface.co/microsoft/swin-tiny-patch4-window7-224/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/swin/convert_swin_timm_to_pytorch.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/swin/convert_swin_timm_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 下载数据集图片 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/configuration_t5.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/configuration_t5.py | https://huggingface.co/t5-small/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/configuration_t5.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/configuration_t5.py | https://huggingface.co/t5-base/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/configuration_t5.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/configuration_t5.py | https://huggingface.co/t5-large/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/configuration_t5.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/configuration_t5.py | https://huggingface.co/t5-3b/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/configuration_t5.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/configuration_t5.py | https://huggingface.co/t5-11b/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/tokenization_t5.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5.py | https://huggingface.co/t5-small/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/tokenization_t5.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5.py | https://huggingface.co/t5-base/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/tokenization_t5.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5.py | https://huggingface.co/t5-large/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/tokenization_t5.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5.py | https://huggingface.co/t5-3b/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/tokenization_t5.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5.py | https://huggingface.co/t5-11b/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/tokenization_t5_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5_fast.py | https://huggingface.co/t5-small/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/tokenization_t5_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5_fast.py | https://huggingface.co/t5-base/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/tokenization_t5_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5_fast.py | https://huggingface.co/t5-large/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/tokenization_t5_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5_fast.py | https://huggingface.co/t5-3b/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/tokenization_t5_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5_fast.py | https://huggingface.co/t5-11b/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/tokenization_t5_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5_fast.py | https://huggingface.co/t5-small/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/tokenization_t5_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5_fast.py | https://huggingface.co/t5-base/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/tokenization_t5_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5_fast.py | https://huggingface.co/t5-large/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/tokenization_t5_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5_fast.py | https://huggingface.co/t5-3b/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/tokenization_t5_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5_fast.py | https://huggingface.co/t5-11b/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/configuration_tapas.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/tapas/configuration_tapas.py | https://huggingface.co/google/tapas-base-finetuned-sqa/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/configuration_tapas.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/tapas/configuration_tapas.py | https://huggingface.co/google/tapas-base-finetuned-wtq/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/configuration_tapas.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/tapas/configuration_tapas.py | https://huggingface.co/google/tapas-base-finetuned-wikisql-supervised/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/configuration_tapas.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/tapas/configuration_tapas.py | https://huggingface.co/google/tapas-base-finetuned-tabfact/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-large-finetuned-sqa/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-large-finetuned-wtq/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-large-finetuned-wikisql-supervised/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-large-finetuned-tabfact/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-base-finetuned-sqa/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-base-finetuned-wtq/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-base-finetuned-wikisql-supervised/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-base-finetuned-tabfact/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-medium-finetuned-sqa/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-medium-finetuned-wtq/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-medium-finetuned-wikisql-supervised/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-medium-finetuned-tabfact/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-small-finetuned-sqa/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-small-finetuned-wtq/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-small-finetuned-wikisql-supervised/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-small-finetuned-tabfact/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-tiny-finetuned-sqa/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-tiny-finetuned-wtq/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-tiny-finetuned-wikisql-supervised/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-tiny-finetuned-tabfact/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-mini-finetuned-sqa/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-mini-finetuned-wtq/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-mini-finetuned-wikisql-supervised/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-mini-finetuned-tabfact/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/transfo_xl/configuration_transfo_xl.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/transfo_xl/configuration_transfo_xl.py | https://huggingface.co/transfo-xl-wt103/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/transfo_xl/tokenization_transfo_xl.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/transfo_xl/tokenization_transfo_xl.py | https://huggingface.co/transfo-xl-wt103/resolve/main/vocab.pkl | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/transfo_xl/tokenization_transfo_xl.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/transfo_xl/tokenization_transfo_xl.py | https://huggingface.co/transfo-xl-wt103/resolve/main/corpus.bin | 下载权重文件 | -| 开发引入 | / | url.ini | https://huggingface.co/microsoft/trocr-base/resolve/main/config.json | 下载预训练配置文件 | -| 开发引入 | / | url.ini | https://huggingface.co/facebook/unispeech-base-960h/resolve/main/config.json | 下载预训练配置文件 | -| 开发引入 | / | url.ini | https://huggingface.co/facebook/unispeech_sat-base-960h/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/van/configuration_van.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/van/configuration_van.py | https://huggingface.co/Visual-Attention-Network/van-base/blob/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/van/convert_van_to_pytorch.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/van/convert_van_to_pytorch.py | https://huggingface.co/Visual-Attention-Network/VAN-Tiny-original/resolve/main/van_tiny_754.pth.tar | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/van/convert_van_to_pytorch.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/van/convert_van_to_pytorch.py | https://huggingface.co/Visual-Attention-Network/VAN-Small-original/resolve/main/van_small_811.pth.tar | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/van/convert_van_to_pytorch.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/van/convert_van_to_pytorch.py | https://huggingface.co/Visual-Attention-Network/VAN-Base-original/resolve/main/van_base_828.pth.tar | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/van/convert_van_to_pytorch.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/van/convert_van_to_pytorch.py | https://huggingface.co/Visual-Attention-Network/VAN-Large-original/resolve/main/van_large_839.pth.tar | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/vilt/configuration_vilt.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vilt/configuration_vilt.py | https://huggingface.co/dandelin/vilt-b32-mlm/blob/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/vilt/convert_vilt_original_to_pytorch.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vilt/convert_vilt_original_to_pytorch.py | https://lil.nlp.cornell.edu/nlvr/exs/ex0_0.jpg | 下载数据集图片 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/vilt/convert_vilt_original_to_pytorch.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vilt/convert_vilt_original_to_pytorch.py | https://lil.nlp.cornell.edu/nlvr/exs/ex0_0.jpg | 下载数据集图片 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/vilt/convert_vilt_original_to_pytorch.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vilt/convert_vilt_original_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 下载数据集图片 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/vilt/convert_vilt_original_to_pytorch.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vilt/convert_vilt_original_to_pytorch.py | https://github.com/dandelin/ViLT/releases/download/200k/vilt_200k_mlm_itm.ckpt | 下载权重文件 | -| 开发引入 | / | url.ini | https://fki.tic.heia-fr.ch/static/img/a01-122-02-00.jpg | 下载数据集图片 | -| 开发引入 | / | url.ini | https://www.researchgate.net/profile/Dinh-Sang/publication/338099565/figure/fig8/AS:840413229350922@1577381536857/An-receipt-example-in-the-SROIE-2019-dataset_Q640.jpg | 下载数据集图片 | -| 开发引入 | / | url.ini | https://layoutlm.blob.core.windows.net/trocr/model_zoo/fairseq/trocr-base-handwritten.pt | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/visual_bert/configuration_visual_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/visual_bert/configuration_visual_bert.py | https://huggingface.co/uclanlp/visualbert-vqa/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/visual_bert/configuration_visual_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/visual_bert/configuration_visual_bert.py | 
https://huggingface.co/uclanlp/visualbert-vqa-pre/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/visual_bert/configuration_visual_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/visual_bert/configuration_visual_bert.py | https://huggingface.co/uclanlp/visualbert-vqa-coco-pre/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/visual_bert/configuration_visual_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/visual_bert/configuration_visual_bert.py | https://huggingface.co/uclanlp/visualbert-vcr/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/visual_bert/configuration_visual_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/visual_bert/configuration_visual_bert.py | https://huggingface.co/uclanlp/visualbert-vcr-pre/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/visual_bert/configuration_visual_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/visual_bert/configuration_visual_bert.py | https://huggingface.co/uclanlp/visualbert-vcr-coco-pre/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/visual_bert/configuration_visual_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/visual_bert/configuration_visual_bert.py | https://huggingface.co/uclanlp/visualbert-nlvr2/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/visual_bert/configuration_visual_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/visual_bert/configuration_visual_bert.py | https://huggingface.co/uclanlp/visualbert-nlvr2-pre/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/visual_bert/configuration_visual_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/visual_bert/configuration_visual_bert.py | https://huggingface.co/uclanlp/visualbert-nlvr2-coco-pre/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/vit/configuration_vit.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vit/configuration_vit.py | https://huggingface.co/vit-base-patch16-224/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/vit/convert_dino_to_pytorch.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vit/convert_dino_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 下载数据集图片 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/vit/convert_vit_timm_to_pytorch.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vit/convert_vit_timm_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 下载数据集图片 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/vit_mae/configuration_vit_mae.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vit_mae/configuration_vit_mae.py | https://huggingface.co/facebook/vit-mae-base/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/vit_mae/convert_vit_mae_to_pytorch.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vit_mae/convert_vit_mae_to_pytorch.py | https://user-images.githubusercontent.com/11435359/147738734-196fd92f-9260-48d5-ba7e-bf103d29364d.jpg | 下载数据集图片 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/vit_mae/convert_vit_mae_to_pytorch.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vit_mae/convert_vit_mae_to_pytorch.py | https://dl.fbaipublicfiles.com/mae/visualize/mae_visualize_vit_base.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/wav2vec2/configuration_wav2vec2.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wav2vec2/configuration_wav2vec2.py | https://huggingface.co/facebook/wav2vec2-base-960h/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/wav2vec2/tokenization_wav2vec2.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wav2vec2/tokenization_wav2vec2.py | https://huggingface.co/facebook/wav2vec2-base-960h/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/wav2vec2/tokenization_wav2vec2.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wav2vec2/tokenization_wav2vec2.py | https://huggingface.co/facebook/wav2vec2-base-960h/resolve/main/tokenizer_config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/wav2vec2/tokenization_wav2vec2.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wav2vec2/tokenization_wav2vec2.py | https://huggingface.co/facebook/wav2vec2-base-960h/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/wav2vec2/tokenization_wav2vec2.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wav2vec2/tokenization_wav2vec2.py | https://huggingface.co/facebook/wav2vec2-base-960h/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/wav2vec2_phoneme/tokenization_wav2vec2_phoneme.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wav2vec2_phoneme/tokenization_wav2vec2_phoneme.py | https://huggingface.co/facebook/wav2vec2-lv-60-espeak-cv-ft/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/wav2vec2_phoneme/tokenization_wav2vec2_phoneme.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wav2vec2_phoneme/tokenization_wav2vec2_phoneme.py | https://huggingface.co/facebook/wav2vec2-lv-60-espeak-cv-ft/resolve/main/tokenizer_config.json | 下载预训练配置文件 | -| 开发引入 | / | url.ini | https://huggingface.co/facebook/wavlm-base-960h/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xglm/configuration_xglm.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xglm/configuration_xglm.py | https://huggingface.co/facebook/xglm-564M/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xglm/tokenization_xglm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xglm/tokenization_xglm.py | https://huggingface.co/facebook/xglm-564M/resolve/main/sentencepiece.bpe.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xglm/tokenization_xglm_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xglm/tokenization_xglm_fast.py | https://huggingface.co/facebook/xglm-564M/resolve/main/sentencepiece.bpe.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xglm/tokenization_xglm_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xglm/tokenization_xglm_fast.py | https://huggingface.co/facebook/xglm-564M/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/configuration_xlm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm/configuration_xlm.py | https://huggingface.co/xlm-mlm-en-2048/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/configuration_xlm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm/configuration_xlm.py | https://huggingface.co/xlm-mlm-ende-1024/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/configuration_xlm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm/configuration_xlm.py | https://huggingface.co/xlm-mlm-enfr-1024/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/configuration_xlm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm/configuration_xlm.py | https://huggingface.co/xlm-mlm-enro-1024/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/configuration_xlm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm/configuration_xlm.py | https://huggingface.co/xlm-mlm-tlm-xnli15-1024/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/configuration_xlm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm/configuration_xlm.py | https://huggingface.co/xlm-mlm-xnli15-1024/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/configuration_xlm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm/configuration_xlm.py | https://huggingface.co/xlm-clm-enfr-1024/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/configuration_xlm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm/configuration_xlm.py | https://huggingface.co/xlm-clm-ende-1024/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/configuration_xlm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm/configuration_xlm.py | https://huggingface.co/xlm-mlm-17-1280/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/configuration_xlm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm/configuration_xlm.py | https://huggingface.co/xlm-mlm-100-1280/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-en-2048/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-ende-1024/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-enfr-1024/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-enro-1024/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-tlm-xnli15-1024/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-xnli15-1024/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-clm-enfr-1024/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-clm-ende-1024/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-17-1280/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-100-1280/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-en-2048/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-ende-1024/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-enfr-1024/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-enro-1024/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-tlm-xnli15-1024/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-xnli15-1024/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-clm-enfr-1024/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-clm-ende-1024/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-17-1280/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-100-1280/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_prophetnet/configuration_xlm_prophetnet.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm_prophetnet/configuration_xlm_prophetnet.py | https://huggingface.co/microsoft/xprophetnet-large-wiki100-cased/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_prophetnet/tokenization_xlm_prophetnet.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm_prophetnet/tokenization_xlm_prophetnet.py | 
https://huggingface.co/microsoft/xprophetnet-large-wiki100-cased/resolve/main/prophetnet.tokenizer | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/configuration_xlm_roberta.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm_roberta/configuration_xlm_roberta.py | https://huggingface.co/xlm-roberta-base/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/configuration_xlm_roberta.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm_roberta/configuration_xlm_roberta.py | https://huggingface.co/xlm-roberta-large/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/configuration_xlm_roberta.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm_roberta/configuration_xlm_roberta.py | https://huggingface.co/xlm-roberta-large-finetuned-conll02-dutch/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/configuration_xlm_roberta.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm_roberta/configuration_xlm_roberta.py | https://huggingface.co/xlm-roberta-large-finetuned-conll02-spanish/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/configuration_xlm_roberta.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm_roberta/configuration_xlm_roberta.py | https://huggingface.co/xlm-roberta-large-finetuned-conll03-english/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/configuration_xlm_roberta.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm_roberta/configuration_xlm_roberta.py | https://huggingface.co/xlm-roberta-large-finetuned-conll03-german/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | https://huggingface.co/xlm-roberta-base/resolve/main/sentencepiece.bpe.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | https://huggingface.co/xlm-roberta-large/resolve/main/sentencepiece.bpe.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | https://huggingface.co/xlm-roberta-large-finetuned-conll02-dutch/resolve/main/sentencepiece.bpe.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | 
https://huggingface.co/xlm-roberta-large-finetuned-conll02-spanish/resolve/main/sentencepiece.bpe.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | https://huggingface.co/xlm-roberta-large-finetuned-conll03-english/resolve/main/sentencepiece.bpe.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | https://huggingface.co/xlm-roberta-large-finetuned-conll03-german/resolve/main/sentencepiece.bpe.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | https://huggingface.co/xlm-roberta-base/resolve/main/sentencepiece.bpe.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | https://huggingface.co/xlm-roberta-large/resolve/main/sentencepiece.bpe.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | https://huggingface.co/xlm-roberta-large-finetuned-conll02-dutch/resolve/main/sentencepiece.bpe.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | https://huggingface.co/xlm-roberta-large-finetuned-conll02-spanish/resolve/main/sentencepiece.bpe.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | https://huggingface.co/xlm-roberta-large-finetuned-conll03-english/resolve/main/sentencepiece.bpe.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | https://huggingface.co/xlm-roberta-large-finetuned-conll03-german/resolve/main/sentencepiece.bpe.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | https://huggingface.co/xlm-roberta-base/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | https://huggingface.co/xlm-roberta-large/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | https://huggingface.co/xlm-roberta-large-finetuned-conll02-dutch/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | https://huggingface.co/xlm-roberta-large-finetuned-conll02-spanish/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | https://huggingface.co/xlm-roberta-large-finetuned-conll03-english/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | https://huggingface.co/xlm-roberta-large-finetuned-conll03-german/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta_xl/configuration_xlm_roberta_xl.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm_roberta_xl/configuration_xlm_roberta_xl.py | https://huggingface.co/facebook/xlm-roberta-xl/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta_xl/configuration_xlm_roberta_xl.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm_roberta_xl/configuration_xlm_roberta_xl.py | https://huggingface.co/facebook/xlm-roberta-xxl/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlnet/configuration_xlnet.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlnet/configuration_xlnet.py | https://huggingface.co/xlnet-base-cased/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlnet/configuration_xlnet.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlnet/configuration_xlnet.py | https://huggingface.co/xlnet-large-cased/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlnet/tokenization_xlnet.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlnet/tokenization_xlnet.py | https://huggingface.co/xlnet-base-cased/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlnet/tokenization_xlnet.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlnet/tokenization_xlnet.py | 
https://huggingface.co/xlnet-large-cased/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlnet/tokenization_xlnet_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlnet/tokenization_xlnet_fast.py | https://huggingface.co/xlnet-base-cased/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlnet/tokenization_xlnet_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlnet/tokenization_xlnet_fast.py | https://huggingface.co/xlnet-large-cased/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlnet/tokenization_xlnet_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlnet/tokenization_xlnet_fast.py | https://huggingface.co/xlnet-base-cased/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlnet/tokenization_xlnet_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlnet/tokenization_xlnet_fast.py | https://huggingface.co/xlnet-large-cased/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/yoso/configuration_yoso.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/yoso/configuration_yoso.py | https://huggingface.co/uw-madison/yoso-4096/resolve/main/config.json | 下载预训练配置文件 | -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/.circleci/config.yml |https://github.com/GoogleCloudPlatform/ml-testing-accelerators.git|ACC函数中对url的设置| -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/.circleci/config.yml |https://pytorch-geometric.com/whl/torch-1.11.0+cpu.html|config.yml文件中torch包的html下载链接| -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/.circleci/config.yml |https://github.com/kpu/kenlm/archive/master.zip|config.yml文件中kenlm库的zip下载链接| -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/.circleci/config.yml |https://pytorch-geometric.com/whl/torch-1.11.0+cpu.html|config.yml文件中torch包的html下载链接| -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/.circleci/config.yml |https://github.com/kpu/kenlm/archive/master.zip|config.yml文件中kenlm库的zip下载链接| -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/.circleci/config.yml |https://pytorch-geometric.com/whl/torch-1.11.0+cpu.html|config.yml文件中torch包的html下载链接| -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/.circleci/config.yml |https://github.com/kpu/kenlm/archive/master.zip|config.yml文件中kenlm库的zip下载链接| -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/.circleci/config.yml |https://pytorch-geometric.com/whl/torch-1.11.0+cpu.html|config.yml文件中torch包的html下载链接| -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/.circleci/config.yml |https://github.com/kpu/kenlm/archive/master.zip|config.yml文件中kenlm库的zip下载链接| -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/.circleci/config.yml |https://pytorch-geometric.com/whl/torch-1.11.0+cpu.html|config.yml文件中torch包的html下载链接| -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/.circleci/config.yml |https://github.com/kpu/kenlm/archive/master.zip|config.yml文件中kenlm库的zip下载链接| -| 开发引入 | / 
|Bert-text-classification_for_PyTorch/transformers/.circleci/config.yml |https://pytorch-geometric.com/whl/torch-1.11.0+cpu.html|config.yml文件中torch包的html下载链接| -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/.circleci/config.yml |https://github.com/kpu/kenlm/archive/master.zip|config.yml文件中kenlm库的zip下载链接| -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/.circleci/config.yml |https://github.com/kpu/kenlm/archive/master.zip|config.yml文件中kenlm库的zip下载链接| -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/.circleci/config.yml |https://github.com/kpu/kenlm/archive/master.zip|config.yml文件中kenlm库的zip下载链接| -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/.circleci/config.yml |https://github.com/kpu/kenlm/archive/master.zip|config.yml文件中kenlm库的zip下载链接| -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/.circleci/config.yml |https://pytorch-geometric.com/whl/torch-1.11.0+cpu.html|config.yml文件中torch包的html下载链接| -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/.circleci/config.yml |https://github.com/kpu/kenlm/archive/master.zip|config.yml文件中kenlm库的zip下载链接| -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/.circleci/config.yml |https://pytorch-geometric.com/whl/torch-1.11.0+cpu.html|config.yml文件中torch包的html下载链接| -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/.circleci/config.yml |https://github.com/kpu/kenlm/archive/master.zip|config.yml文件中kenlm库的zip下载链接| -| 开源代码引入| https://github.com/huggingface/transformers/blob/main/.circleci/create_circleci_config.py | Bert-text-classification_for_PyTorch/transformers/.circleci/config.yml | ci@dummy.com | config.yml中对usr.email的配置选项| -| 开源代码引入| https://github.com/huggingface/transformers/blob/main/.circleci/create_circleci_config.py | Bert-text-classification_for_PyTorch/transformers/.circleci/config.yml | ci@dummy.com | config.yml中对usr.email的配置选项| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/tasks/document_question_answering.mdx|Bert-text-classification_for_PyTorch/transformers/.circleci/config.yml |https://github.com/facebookresearch/detectron2.git |detectron2模型在开源社区中的源码链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/update_metadata.py|Bert-text-classification_for_PyTorch/transformers/CITATION.cff |https://github.com/huggingface/transformers |CITATION文件中url的配置选项| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_zh-hant.md |Bert-text-classification_for_PyTorch/transformers/CITATION.cff |https://www.aclweb.org/anthology/2020.emnlp-demos.6 |CITATION文件中url的配置选项| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/update_metadata.py |Bert-text-classification_for_PyTorch/transformers/setup.py |https://github.com/huggingface/transformers |setuptools中对transformers的配置| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/update_metadata.py |Bert-text-classification_for_PyTorch/transformers/docker/transformers-all-latest-gpu/Dockerfile |https://github.com/huggingface/transformers |Dockerfile文件中transformers的git链接| -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/docker/transformers-all-latest-gpu/Dockerfile |https://data.pyg.org/whl/torch |Dockerfile文件中torch包的html链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/tasks/document_question_answering.mdx|Bert-text-classification_for_PyTorch/transformers/docker/transformers-all-latest-gpu/Dockerfile 
|https://github.com/facebookresearch/detectron2.git|kenlm库在开源社区中的zip包下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/update_metadata.py |Bert-text-classification_for_PyTorch/transformers/docker/transformers-doc-builder/Dockerfile |https://github.com/huggingface/transformers |Dockerfile文件中transformers的git链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/check_repo.py |Bert-text-classification_for_PyTorch/transformers/docker/transformers-doc-builder/Dockerfile |https://github.com/huggingface/doc-builder |Dockerfile文件中transformers的git链接| -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/docker/transformers-doc-builder/Dockerfile |https://data.pyg.org/whl/torch |Dockerfile文件中torch的html链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/tasks/document_question_answering.mdx|Bert-text-classification_for_PyTorch/transformers/docker/transformers-doc-builder/Dockerfile |https://github.com/facebookresearch/detectron2.git|kenlm库在开源社区中的zip包下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/utils/import_utils.py|Bert-text-classification_for_PyTorch/transformers/docker/transformers-doc-builder/Dockerfile |https://pypi.ngc.nvidia.com|Dockerfile文件中pytorch-quantization的url链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/benchmark/benchmark.py|Bert-text-classification_for_PyTorch/transformers/docker/transformers-gpu/Dockerfile |https://github.com/NVIDIA/apex|Dockerfile文件中apex的git链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/update_metadata.py|Bert-text-classification_for_PyTorch/transformers/docker/transformers-pytorch-deepspeed-latest-gpu/Dockerfile |https://github.com/huggingface/transformers |Dockerfile文件中transformers的git链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/deepspeed/test_deepspeed.py|Bert-text-classification_for_PyTorch/transformers/docker/transformers-pytorch-deepspeed-latest-gpu/Dockerfile |https://github.com/microsoft/DeepSpeed|Dockerfile文件中apex在开源社区中的的git链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/update_metadata.py|Bert-text-classification_for_PyTorch/transformers/docker/transformers-pytorch-gpu/Dockerfile |https://github.com/huggingface/transformers|Dockerfile文件中transformers的git链接| -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/docker/transformers-pytorch-gpu/Dockerfile |https://data.pyg.org/whl/torch |Dockerfile文件中torch包的html链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/tasks/document_question_answering.mdx|Bert-text-classification_for_PyTorch/transformers/docker/transformers-pytorch-gpu/Dockerfile |https://github.com/facebookresearch/detectron2.git |kenlm库在开源社区中的zip包下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docker/transformers-pytorch-tpu/Dockerfile|Bert-text-classification_for_PyTorch/transformers/docker/transformers-pytorch-tpu/Dockerfile |https://repo.anaconda.com/miniconda/Miniconda3-4.7.12-Linux-x86_64.sh |Dockerfile文件中miniconda在开源社区中的的sh链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/sagemaker/scripts/tensorflow/requirements.txt|Bert-text-classification_for_PyTorch/transformers/docker/transformers-pytorch-tpu/Dockerfile |https://github.com/huggingface/transformers.git|Dockerfile文件中transformers的git链接| -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/utils/update_metadata.py|Bert-text-classification_for_PyTorch/transformers/docker/transformers-tensorflow-gpu/Dockerfile |https://github.com/huggingface/transformers|Dockerfile文件中transformers的git链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/jax-projects/hybrid_clip/requirements.txt|Bert-text-classification_for_PyTorch/transformers/examples/flax/vision/requirements.txt |https://download.pytorch.org/whl/torch_stable.html|requirements文件中torch_stable在开源社区中的html链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/jax-projects/hybrid_clip/requirements.txt|Bert-text-classification_for_PyTorch/transformers/examples/flax/vision/requirements.txt |https://download.pytorch.org/whl/torch_stable.html|requirements文件中torch_stable在开源社区中的html链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/token-classification/run.sh|Bert-text-classification_for_PyTorch/transformers/examples/legacy/pytorch-lightning/run_ner.sh |https://drive.google.com/uc?export=download&id=1Jjhbal535VVz2ap4v4r_rN1UEHTdLK5P|将更改迁移到NLP数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/token-classification/run.sh|Bert-text-classification_for_PyTorch/transformers/examples/legacy/pytorch-lightning/run_ner.sh |https://drive.google.com/uc?export=download&id=1ZfRcQThdtAR5PPRjIDtrVP7BtXSCUBbm|将更改迁移到NLP数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/token-classification/run.sh|Bert-text-classification_for_PyTorch/transformers/examples/legacy/pytorch-lightning/run_ner.sh |https://drive.google.com/uc?export=download&id=1u9mb7kNJHWQCWyweMDRMuTFoOHOfeBTH|将更改迁移到NLP数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/token-classification/run_pos.sh|Bert-text-classification_for_PyTorch/transformers/examples/legacy/pytorch-lightning/run_pos.sh |https://github.com/UniversalDependencies/UD_English-EWT/raw/master/en_ewt-ud-dev.conllu|下载dev数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/token-classification/run_pos.sh|Bert-text-classification_for_PyTorch/transformers/examples/legacy/pytorch-lightning/run_pos.sh |https://github.com/UniversalDependencies/UD_English-EWT/raw/master/en_ewt-ud-test.conllu|下载test数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/token-classification/run_pos.sh|Bert-text-classification_for_PyTorch/transformers/examples/legacy/pytorch-lightning/run_pos.sh |https://github.com/UniversalDependencies/UD_English-EWT/raw/master/en_ewt-ud-train.conllu|下载train数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/token-classification/run_chunk.sh|Bert-text-classification_for_PyTorch/transformers/examples/legacy/token-classification/run_chunk.sh |https://github.com/davidsbatista/NER-datasets/raw/master/CONLL2003/valid.txt|CONLL2003数据集在开源社区上的valid.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/token-classification/run_chunk.sh|Bert-text-classification_for_PyTorch/transformers/examples/legacy/token-classification/run_chunk.sh |https://github.com/davidsbatista/NER-datasets/raw/master/CONLL2003/test.txt|CONLL2003数据集在开源社区上的test.txt的下载链接| -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/examples/legacy/token-classification/run_chunk.sh|Bert-text-classification_for_PyTorch/transformers/examples/legacy/token-classification/run_chunk.sh |https://github.com/davidsbatista/NER-datasets/raw/master/CONLL2003/train.txt|CONLL2003数据集在开源社区上的train.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/token-classification/run_pos.sh|Bert-text-classification_for_PyTorch/transformers/examples/legacy/token-classification/run_pos.sh |https://github.com/UniversalDependencies/UD_English-EWT/raw/master/en_ewt-ud-dev.conllu|下载dev数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/token-classification/run_pos.sh|Bert-text-classification_for_PyTorch/transformers/examples/legacy/token-classification/run_pos.sh |https://github.com/UniversalDependencies/UD_English-EWT/raw/master/en_ewt-ud-test.conllu|下载test数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/token-classification/run_pos.sh|Bert-text-classification_for_PyTorch/transformers/examples/legacy/token-classification/run_pos.sh |https://github.com/UniversalDependencies/UD_English-EWT/raw/master/en_ewt-ud-train.conllu|下载train数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/token-classification/run.sh|Bert-text-classification_for_PyTorch/transformers/examples/legacy/token-classification/run.sh |https://drive.google.com/uc?export=download&id=1Jjhbal535VVz2ap4v4r_rN1UEHTdLK5P|将更改迁移到NLP数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/token-classification/run.sh|Bert-text-classification_for_PyTorch/transformers/examples/legacy/token-classification/run.sh |https://drive.google.com/uc?export=download&id=1ZfRcQThdtAR5PPRjIDtrVP7BtXSCUBbm|将更改迁移到NLP数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/token-classification/run.sh|Bert-text-classification_for_PyTorch/transformers/examples/legacy/token-classification/run.sh |https://drive.google.com/uc?export=download&id=1u9mb7kNJHWQCWyweMDRMuTFoOHOfeBTH|将更改迁移到NLP数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/bertabs/configuration_bertabs.py|Bert-text-classification_for_PyTorch/transformers/examples/research_projects/bertabs/configuration_bertabs.py |https://huggingface.co/remi/bertabs-finetuned-cnndm-extractive-abstractive-summarization/resolve/main/config.json |bertabs-finetuned-cnndm模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/fsner/setup.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/fsner/setup.py | author_email="thomas@huggingface.co" |setuptools的author_email配置选项| -| 开发引入 | / | Bert_Chinese_ID3433_for_PyTorch/url.ini |https://github.com/huggingface/transformers/tree/master/examples/research_projects/fsner |setuptools的url配置选项| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/trainer/test_trainer.py|Bert-text-classification_for_PyTorch/transformers/examples/research_projects/fsner/setup.py |https://github.com/huggingface/transformers/issues |setuptools的Bug Tracker在开源社区中的链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/longform-qa/eli5_app.py|Bert-text-classification_for_PyTorch/transformers/examples/research_projects/longform-qa/eli5_app.py 
|https://huggingface.co/front/assets/huggingface_logo.svg|获取huggingface开源社区header_html| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/longform-qa/eli5_app.py|Bert-text-classification_for_PyTorch/transformers/examples/research_projects/longform-qa/eli5_app.py |https://en.wikipedia.org/wiki |获取wiki_url开源社区链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/sagemaker/scripts/tensorflow/requirements.txt|Bert-text-classification_for_PyTorch/transformers/examples/research_projects/lxmert/requirements.txt |https://github.com/huggingface/transformers.git |requirements文件中transformers在开源社区中的的git链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/utils/hub.py|Bert-text-classification_for_PyTorch/transformers/examples/research_projects/lxmert/utils.py |https://cdn.huggingface.co |获取huggingface开源社区版本| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/utils/hub.py |Bert-text-classification_for_PyTorch/transformers/examples/research_projects/lxmert/utils.py |https://s3.amazonaws.com/models.huggingface.co/bert |获取huggingface开源社区版本| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/requirements.txt |Bert-text-classification_for_PyTorch/transformers/examples/research_projects/movement-pruning/requirements.txt |https://github.com/huggingface/transformers.git@352d5472b0c1dec0f420d606d16747d851b4bda8#egg=transformers |requirements文件中transformers在开源社区中的的git链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/pplm/run_pplm.py|Bert-text-classification_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py |https://s3.amazonaws.com/models.huggingface.co/bert/pplm/bow/legal.txt|pplm在开源社区上的txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/pplm/run_pplm.py|Bert-text-classification_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py |https://github.com/Tongjilibo/bert4torch/blob/master//pplm/bow/military.txt|pplm在开源社区上的txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/pplm/run_pplm.py|Bert-text-classification_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py |https://s3.amazonaws.com/models.huggingface.co/bert/pplm/bow/politics.txt|pplm在开源社区上的txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/pplm/run_pplm.py|Bert-text-classification_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py |https://s3.amazonaws.com/models.huggingface.co/bert/pplm/bow/religion.txt|pplm在开源社区上的txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/pplm/run_pplm.py|Bert-text-classification_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py |https://s3.amazonaws.com/models.huggingface.co/bert/pplm/bow/science.txt|pplm在开源社区上的txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/pplm/run_pplm.py|Bert-text-classification_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py |https://s3.amazonaws.com/models.huggingface.co/bert/pplm/bow/space.txt|pplm在开源社区上的txt的下载链接| -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/examples/research_projects/pplm/run_pplm.py|Bert-text-classification_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py |https://s3.amazonaws.com/models.huggingface.co/bert/pplm/bow/technology.txt|pplm在开源社区上的txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/pplm/run_pplm.py|Bert-text-classification_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py |https://s3.amazonaws.com/models.huggingface.co/bert/pplm/discriminators/clickbait_classifier_head.pt|pplm在开源社区上的pt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/pplm/run_pplm.py|Bert-text-classification_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py |https://s3.amazonaws.com/models.huggingface.co/bert/pplm/discriminators/SST_classifier_head.pt|pplm在开源社区上的pt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/utils/import_utils.py |Bert-text-classification_for_PyTorch/transformers/examples/research_projects/quantization-qdqbert/Dockerfile |https://pypi.ngc.nvidia.com|Dockerfile文件中pytorch-quantization的url链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/seq2seq-distillation/finetune_bart_tiny.sh|Bert-text-classification_for_PyTorch/transformers/examples/research_projects/seq2seq-distillation/finetune_bart_tiny.sh |https://cdn-datasets.huggingface.co/summarization/cnn_tiny.tgz|cnn_tiny数据集在开源社区中的tgz链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/sagemaker/scripts/tensorflow/requirements.txt|Bert-text-classification_for_PyTorch/transformers/examples/research_projects/visual_bert/requirements.txt |https://github.com/huggingface/transformers.git|requirements文件中transformers在开源社区中的的git链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/utils/hub.py|Bert-text-classification_for_PyTorch/transformers/examples/research_projects/visual_bert/utils.py |https://cdn.huggingface.co|获取huggingface开源社区版本| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/utils/hub.py|Bert-text-classification_for_PyTorch/transformers/examples/research_projects/visual_bert/utils.py |https://s3.amazonaws.com/models.huggingface.co/bert|获取huggingface开源社区版本| -| 开发引入 | / | Bert-text-classification_for_PyTorch/url.ini | author_email="thomas@huggingface.co" | setuptools的author_email配置选项| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/convert-allenai-wmt16.sh|Bert-text-classification_for_PyTorch/transformers/scripts/fsmt/convert-allenai-wmt16.sh |https://drive.google.com/uc?id=1x_G2cjvM1nW5hjAB8-vWxRqtQTlmIaQU|将更改迁移到NLP数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/convert-allenai-wmt16.sh|Bert-text-classification_for_PyTorch/transformers/scripts/fsmt/convert-allenai-wmt16.sh |https://drive.google.com/uc?id=1oA2aqZlVNj5FarxBlNXEHpBS4lRetTzU|将更改迁移到NLP数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/convert-allenai-wmt16.sh|Bert-text-classification_for_PyTorch/transformers/scripts/fsmt/convert-allenai-wmt16.sh |https://drive.google.com/uc?id=1Wup2D318QYBFPW_NKI1mfP_hXOfmUI9r|将更改迁移到NLP数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/convert-allenai-wmt16.sh|Bert-text-classification_for_PyTorch/transformers/scripts/fsmt/convert-allenai-wmt16.sh 
|https://drive.google.com/uc?id=1mNufoynJ9-Zy1kJh2TA_lHm2squji0i9|将更改迁移到NLP数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/convert-allenai-wmt16.sh|Bert-text-classification_for_PyTorch/transformers/scripts/fsmt/convert-allenai-wmt16.sh |https://drive.google.com/uc?id=1iO7um-HWoNoRKDtw27YUSgyeubn9uXqj|将更改迁移到NLP数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/convert-allenai-wmt19.sh|Bert-text-classification_for_PyTorch/transformers/scripts/fsmt/convert-allenai-wmt19.sh |https://drive.google.com/uc?id=1j6z9fYdlUyOYsh7KJoumRlr1yHczxR5T|将更改迁移到NLP数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/convert-allenai-wmt19.sh|Bert-text-classification_for_PyTorch/transformers/scripts/fsmt/convert-allenai-wmt19.sh |https://drive.google.com/uc?id=1yT7ZjqfvUYOBXvMjeY8uGRHQFWoSo8Q5|将更改迁移到NLP数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/convert-allenai-wmt19.sh|Bert-text-classification_for_PyTorch/transformers/scripts/fsmt/convert-allenai-wmt19.sh |https://drive.google.com/uc?id=15gAzHeRUCs-QV8vHeTReMPEh1j8excNE|将更改迁移到NLP数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/convert-facebook-wmt19.sh|Bert-text-classification_for_PyTorch/transformers/scripts/fsmt/convert-facebook-wmt19.sh |https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-de.joined-dict.ensemble.tar.gz|wmt19数据集在开源社区上的tar.gz下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/convert-facebook-wmt19.sh|Bert-text-classification_for_PyTorch/transformers/scripts/fsmt/convert-facebook-wmt19.sh |https://dl.fbaipublicfiles.com/fairseq/models/wmt19.de-en.joined-dict.ensemble.tar.gz|wmt19数据集在开源社区上的tar.gz下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/convert-facebook-wmt19.sh|Bert-text-classification_for_PyTorch/transformers/scripts/fsmt/convert-facebook-wmt19.sh |https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-ru.ensemble.tar.gz|wmt19数据集在开源社区上的tar.gz下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/convert-facebook-wmt19.sh|Bert-text-classification_for_PyTorch/transformers/scripts/fsmt/convert-facebook-wmt19.sh |https://dl.fbaipublicfiles.com/fairseq/models/wmt19.ru-en.ensemble.tar.gz|wmt19数据集在开源社区上的tar.gz下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/gen-card-facebook-wmt19.py|Bert-text-classification_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt19.py |http://matrix.statmt.org/matrix/output/1907?run_id=6937 |wmt19数据集的语言转换配置在开源社区上的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/gen-card-facebook-wmt19.py|Bert-text-classification_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt19.py |http://matrix.statmt.org/matrix/output/1914?run_id=6724 |wmt19数据集的语言转换配置在开源社区上的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/gen-card-facebook-wmt19.py|Bert-text-classification_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt19.py |http://matrix.statmt.org/matrix/output/1909?run_id=6862 |wmt19数据集的语言转换配置在开源社区上的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/gen-card-facebook-wmt19.py|Bert-text-classification_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt19.py |http://matrix.statmt.org/matrix/output/1902?run_id=6750 |wmt19数据集的语言转换配置在开源社区上的下载链接| -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/utils/update_metadata.py|Bert-text-classification_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py |https://github.com/huggingface/transformers|wmt19数据集的transformers在开源社区上的git下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/tatoeba/upload_models.sh|Bert-text-classification_for_PyTorch/transformers/scripts/tatoeba/upload_models.sh |https://huggingface.co/Helsinki-NLP |Helsinki-NLP在开源社区上的git下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/utils/hub.py|Bert-text-classification_for_PyTorch/transformers/src/transformers/file_utils.py |https://s3.amazonaws.com/models.huggingface.co/bert|获取huggingface开源社区版本| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/utils/hub.py|Bert-text-classification_for_PyTorch/transformers/src/transformers/file_utils.py |https://cdn.huggingface.co|获取huggingface开源社区版本| -| 开发引入 | / |Bert-text-classification_for_PyTorch/url.ini |https://moon-staging.huggingface.co|moon-staging默认开源社区url链接配置| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/commands/add_new_model_like.py|Bert-text-classification_for_PyTorch/transformers/src/transformers/commands/add_new_model_like.py |https://huggingface.co/{new_model_patterns.checkpoint}/resolve/main/config.json|使用json添加新model开源社区链接引用| -| 开发引入 | / |Bert-text-classification_for_PyTorch/url.ini |https://huggingface.co/api/models|获取huggingface接口开源社区版本| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/configuration_{{cookiecutter.lowercase_modelname}}.py|Bert-text-classification_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/configuration_{{cookiecutter.lowercase_modelname}}.py |https://huggingface.co/{{cookiecutter.checkpoint_identifier}}/resolve/main/config.json|"{{cookiecutter.checkpoint_identifier}}"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_{{cookiecutter.lowercase_modelname}}.py|Bert-text-classification_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_{{cookiecutter.lowercase_modelname}}.py |https://huggingface.co/{{cookiecutter.checkpoint_identifier}}/resolve/main/vocab.txt|"{{cookiecutter.checkpoint_identifier}}"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_{{cookiecutter.lowercase_modelname}}.py|Bert-text-classification_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_{{cookiecutter.lowercase_modelname}}.py |https://huggingface.co/{{cookiecutter.checkpoint_identifier}}/resolve/main/vocab.json|"{{cookiecutter.checkpoint_identifier}}"模型在开源社区上的vocab.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_{{cookiecutter.lowercase_modelname}}.py|Bert-text-classification_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_{{cookiecutter.lowercase_modelname}}.py 
|https://huggingface.co/{{cookiecutter.checkpoint_identifier}}/resolve/main/merges.txt|"{{cookiecutter.checkpoint_identifier}}"模型在开源社区上的merges.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_{{cookiecutter.lowercase_modelname}}.py|Bert-text-classification_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_{{cookiecutter.lowercase_modelname}}.py |https://huggingface.co/{{cookiecutter.checkpoint_identifier}}/resolve/main/vocab.txt|"{{cookiecutter.checkpoint_identifier}}"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_{{cookiecutter.lowercase_modelname}}.py|Bert-text-classification_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_fast_{{cookiecutter.lowercase_modelname}}.py |https://huggingface.co/{{cookiecutter.checkpoint_identifier}}/resolve/main/vocab.txt|"{{cookiecutter.checkpoint_identifier}}"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_{{cookiecutter.lowercase_modelname}}.py|Bert-text-classification_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_fast_{{cookiecutter.lowercase_modelname}}.py |https://huggingface.co/{{cookiecutter.checkpoint_identifier}}/resolve/main/vocab.json|"{{cookiecutter.checkpoint_identifier}}"模型在开源社区上的vocab.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_{{cookiecutter.lowercase_modelname}}.py|Bert-text-classification_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_fast_{{cookiecutter.lowercase_modelname}}.py |https://huggingface.co/{{cookiecutter.checkpoint_identifier}}/resolve/main/merges.txt|"{{cookiecutter.checkpoint_identifier}}"模型在开源社区上的merges.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_fast_{{cookiecutter.lowercase_modelname}}.py|Bert-text-classification_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_fast_{{cookiecutter.lowercase_modelname}}.py |https://huggingface.co/{{cookiecutter.checkpoint_identifier}}/resolve/main/tokenizer.json|"{{cookiecutter.checkpoint_identifier}}"模型在开源社区上的tokenizer.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_{{cookiecutter.lowercase_modelname}}.py|Bert-text-classification_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_fast_{{cookiecutter.lowercase_modelname}}.py |https://huggingface.co/{{cookiecutter.checkpoint_identifier}}/resolve/main/vocab.txt|"{{cookiecutter.checkpoint_identifier}}"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_fast_{{cookiecutter.lowercase_modelname}}.py|Bert-text-classification_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_fast_{{cookiecutter.lowercase_modelname}}.py |https://huggingface.co/{{cookiecutter.checkpoint_identifier}}/resolve/main/tokenizer.json|"{{cookiecutter.checkpoint_identifier}}"模型在开源社区上的tokenizer.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|Bert-text-classification_for_PyTorch/transformers/tests/clip/test_modeling_clip.py |http://images.cocodataset.org/val2017/000000039769.jpg|clip模型测试函数在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|Bert-text-classification_for_PyTorch/transformers/tests/clip/test_modeling_tf_clip.py |http://images.cocodataset.org/val2017/000000039769.jpg|clip模型测试函数在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|Bert-text-classification_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_classification.py |http://images.cocodataset.org/val2017/000000039769.jpg|图片分类pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|Bert-text-classification_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_classification.py |http://images.cocodataset.org/val2017/000000039769.jpg|图片分类pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|Bert-text-classification_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_classification.py |http://images.cocodataset.org/val2017/000000039769.jpg|图片分类pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|Bert-text-classification_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_classification.py |http://images.cocodataset.org/val2017/000000039769.jpg|图片分类pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|Bert-text-classification_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_classification.py |http://images.cocodataset.org/val2017/000000039769.jpg|图片分类pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|Bert-text-classification_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_classification.py |http://images.cocodataset.org/val2017/000000039769.jpg|图片分类pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|Bert-text-classification_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_classification.py |http://images.cocodataset.org/val2017/000000039769.jpg|图片分类pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|Bert-text-classification_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_classification.py |http://images.cocodataset.org/val2017/000000039769.jpg|图片分类pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|Bert-text-classification_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_classification.py |http://images.cocodataset.org/val2017/000000039769.jpg|图片分类pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|Bert-text-classification_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_classification.py |http://images.cocodataset.org/val2017/000000039769.jpg|图片分类pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|Bert-text-classification_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_classification.py |http://images.cocodataset.org/val2017/000000039769.jpg|图片分类pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|Bert-text-classification_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_segmentation.py |http://images.cocodataset.org/val2017/000000039769.jpg |图片分割pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|Bert-text-classification_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_segmentation.py |http://images.cocodataset.org/val2017/000000039769.jpg|图片分割pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|Bert-text-classification_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_segmentation.py |http://images.cocodataset.org/val2017/000000039769.jpg|图片分割pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|Bert-text-classification_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_segmentation.py |http://images.cocodataset.org/val2017/000000039769.jpg|图片分割pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|Bert-text-classification_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_segmentation.py |http://images.cocodataset.org/val2017/000000039769.jpg|图片分割pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|Bert-text-classification_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_segmentation.py |http://images.cocodataset.org/val2017/000000039769.jpg|图片分割pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|Bert-text-classification_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_segmentation.py |http://images.cocodataset.org/val2017/000000039769.jpg|图片分割pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|Bert-text-classification_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_segmentation.py |http://images.cocodataset.org/val2017/000000039769.jpg |图片分割pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|Bert-text-classification_for_PyTorch/transformers/tests/pipelines/test_pipelines_object_detection.py |http://images.cocodataset.org/val2017/000000039769.jpg|目标检测pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|Bert-text-classification_for_PyTorch/transformers/tests/pipelines/test_pipelines_object_detection.py |http://images.cocodataset.org/val2017/000000039769.jpg |目标检测pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|Bert-text-classification_for_PyTorch/transformers/tests/pipelines/test_pipelines_object_detection.py |http://images.cocodataset.org/val2017/000000039769.jpg |目标检测pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|Bert-text-classification_for_PyTorch/transformers/tests/pipelines/test_pipelines_object_detection.py |http://images.cocodataset.org/val2017/000000039769.jpg |目标检测pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|Bert-text-classification_for_PyTorch/transformers/tests/pipelines/test_pipelines_object_detection.py |http://images.cocodataset.org/val2017/000000039769.jpg|目标检测pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|Bert-text-classification_for_PyTorch/transformers/tests/pipelines/test_pipelines_object_detection.py |http://images.cocodataset.org/val2017/000000039769.jpg|目标检测pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|Bert-text-classification_for_PyTorch/transformers/tests/pipelines/test_pipelines_object_detection.py |http://images.cocodataset.org/val2017/000000039769.jpg|目标检测pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|Bert-text-classification_for_PyTorch/transformers/tests/pipelines/test_pipelines_object_detection.py |http://images.cocodataset.org/val2017/000000039769.jpg|目标检测pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|Bert-text-classification_for_PyTorch/transformers/tests/pipelines/test_pipelines_object_detection.py |http://images.cocodataset.org/val2017/000000039769.jpg|目标检测pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|Bert-text-classification_for_PyTorch/transformers/tests/pipelines/test_pipelines_object_detection.py |http://images.cocodataset.org/val2017/000000039769.jpg|目标检测pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|Bert-text-classification_for_PyTorch/transformers/tests/pipelines/test_pipelines_object_detection.py |http://images.cocodataset.org/val2017/000000039769.jpg |目标检测pipeline在开源社区上的验证集输入引用链接| -| 开发引入 | / |Bert-text-classification_for_PyTorch/url.ini |https://bogus |bogus音视频开源下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/ja/index.mdx|Bert-text-classification_for_PyTorch/transformers/utils/check_copies.py |https://huggingface.co/docs/transformers/master/|model_list检查开源社区url链接引用| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/release.py|Bert-text-classification_for_PyTorch/transformers/utils/check_copies.py |https://huggingface.co/docs/transformers/|model_list检查开源社区url链接引用| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/download_glue_data.py|Bert-text-classification_for_PyTorch/transformers/utils/download_glue_data.py |https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FCoLA.zip?alt=media&token=46d5e637-3411-4188-bc44-5809b5bfb5f4|"CoLA"任务数据集在开源社区上的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/download_glue_data.py|Bert-text-classification_for_PyTorch/transformers/utils/download_glue_data.py |https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FSST-2.zip?alt=media&token=aabc5f6b-e466-44a2-b9b4-cf6337f84ac8|"SST"任务数据集在开源社区上的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/download_glue_data.py|Bert-text-classification_for_PyTorch/transformers/utils/download_glue_data.py |https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2Fmrpc_dev_ids.tsv?alt=media&token=ec5c0836-31d5-48f4-b431-7480817f1adc|"MRPC"任务数据集在开源社区上的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/download_glue_data.py|Bert-text-classification_for_PyTorch/transformers/utils/download_glue_data.py |https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FQQP.zip?alt=media&token=700c6acf-160d-4d89-81d1-de4191d02cb5|"QQP"任务数据集在开源社区上的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/download_glue_data.py|Bert-text-classification_for_PyTorch/transformers/utils/download_glue_data.py |https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FSTS-B.zip?alt=media&token=bddb94a7-8706-4e0d-a694-1109e12273b5|"STS"任务数据集在开源社区上的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/download_glue_data.py|Bert-text-classification_for_PyTorch/transformers/utils/download_glue_data.py |https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FMNLI.zip?alt=media&token=50329ea1-e339-40e2-809c-10c40afff3ce|"MNLI"任务数据集在开源社区上的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/download_glue_data.py|Bert-text-classification_for_PyTorch/transformers/utils/download_glue_data.py |https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FSNLI.zip?alt=media&token=4afcfbb2-ff0c-4b2d-a09a-dbf07926f4df|"SNLI"任务数据集在开源社区上的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/download_glue_data.py|Bert-text-classification_for_PyTorch/transformers/utils/download_glue_data.py 
|https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FQNLIv2.zip?alt=media&token=6fdcf570-0fc5-4631-8456-9505272d1601|"QNLI"任务数据集在开源社区上的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/download_glue_data.py|Bert-text-classification_for_PyTorch/transformers/utils/download_glue_data.py |https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FRTE.zip?alt=media&token=5efa7e85-a0bb-4f19-8ea2-9e1840f077fb|"RTE"任务数据集在开源社区上的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/download_glue_data.py|Bert-text-classification_for_PyTorch/transformers/utils/download_glue_data.py |https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FWNLI.zip?alt=media&token=068ad0a0-ded7-4bd7-99a5-5e00222e0faf|"WNLI"任务数据集在开源社区上的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/download_glue_data.py|Bert-text-classification_for_PyTorch/transformers/utils/download_glue_data.py |https://storage.googleapis.com/mtl-sentence-representations.appspot.com |"diagnostic"任务数据集在开源社区上的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/download_glue_data.py|Bert-text-classification_for_PyTorch/transformers/utils/download_glue_data.py |https://dl.fbaipublicfiles.com/senteval/senteval_data/msr_paraphrase_train.txt |MRPC任务训练集分类文件列表在开源社区上的txt下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/download_glue_data.py|Bert-text-classification_for_PyTorch/transformers/utils/download_glue_data.py |https://dl.fbaipublicfiles.com/senteval/senteval_data/msr_paraphrase_test.txt |MRPC任务测试集分类文件列表在开源社区上的txt下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/notification_service_doc_tests.py|Bert-text-classification_for_PyTorch/transformers/utils/notification_service_deprecated.py |https://github.com/huggingface/transformers/actions/runs |测试huggingface相应run_id可用| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/notification_service_doc_tests.py|Bert-text-classification_for_PyTorch/transformers/utils/notification_service_doc_tests.py |https://github.com/huggingface/transformers/actions/runs |测试huggingface相应run_id可用| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/notification_service_doc_tests.py|Bert-text-classification_for_PyTorch/transformers/utils/notification_service_doc_tests.py |https://github.com/huggingface/transformers/actions/runs |测试huggingface相应run_id可用| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/notification_service_doc_tests.py|Bert-text-classification_for_PyTorch/transformers/utils/notification_service_doc_tests.py |https://github.com/huggingface/transformers/actions/runs |测试huggingface相应run_id可用| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/notification_service_doc_tests.py|Bert-text-classification_for_PyTorch/transformers/utils/notification_service_doc_tests.py |https://api.github.com/repos/huggingface/transformers/actions/runs0 |测试huggingface相应run_id可用| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/notification_service_doc_tests.py|Bert-text-classification_for_PyTorch/transformers/utils/notification_service.py |https://github.com/huggingface/transformers/actions/runs |测试huggingface相应run_id可用| -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/utils/notification_service_doc_tests.py|Bert-text-classification_for_PyTorch/transformers/utils/notification_service.py |https://github.com/huggingface/transformers/actions/runs |测试huggingface相应run_id可用| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/notification_service_doc_tests.py|Bert-text-classification_for_PyTorch/transformers/utils/notification_service.py |https://github.com/huggingface/transformers/actions/runs |测试huggingface相应run_id可用| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/notification_service_doc_tests.py|Bert-text-classification_for_PyTorch/transformers/utils/notification_service.py |https://api.github.com/repos/huggingface/transformers/actions/runs |测试huggingface相应run_id可用| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/ja/index.mdx |Bert-text-classification_for_PyTorch/transformers/utils/release.py |https://huggingface.co/docs/transformers/master/model_doc |transformers_model_doc开源社区url连接引用| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/release.py|Bert-text-classification_for_PyTorch/transformers/utils/release.py |https://huggingface.co/docs/transformers/model_doc |transformers_model_doc开源社区url连接引用| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/utils/check_table.py | Bert-text-classification_for_PyTorch/transformers/utils/update_metadata.py | https://stackoverflow.com/questions/29916065/how-to-do-camelcase-split-in-python | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/distillation/README.md | Bert-text-classification_for_PyTorch/transformers/utils/update_metadata.py | https://github.com/huggingface/transformers/commit/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/utils/notification_service.py | Bert-text-classification_for_PyTorch/transformers/utils/notification_service_doc_tests.py | https://github.com/huggingface/transformers/actions/runs/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/utils/get_github_job_time.py | Bert-text-classification_for_PyTorch/transformers/utils/notification_service_doc_tests.py | https://api.github.com/repos/huggingface/transformers/actions/runs/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/utils/notification_service.py | Bert-text-classification_for_PyTorch/transformers/utils/notification_service_deprecated.py | https://github.com/huggingface/transformers/actions/runs/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/utils/notification_service.py | Bert-text-classification_for_PyTorch/transformers/utils/notification_service.py | https://github.com/huggingface/transformers/actions/runs/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/utils/get_github_job_time.py | Bert-text-classification_for_PyTorch/transformers/utils/notification_service.py | https://api.github.com/repos/huggingface/transformers/actions/runs/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/utils/download_glue_data.py | Bert-text-classification_for_PyTorch/transformers/utils/download_glue_data.py | https://gist.github.com/W4ngatang/60c2bdb54d156a41194446737ce03e2e | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/utils/download_glue_data.py | Bert-text-classification_for_PyTorch/transformers/utils/download_glue_data.py | 
https://download.microsoft.com/download/D/4/6/D46FF87A-F6B9-4252-AA8B-3604ED519838/MSRParaphraseCorpus.msi | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/utils/download_glue_data.py | Bert-text-classification_for_PyTorch/transformers/utils/download_glue_data.py | https://storage.googleapis.com/mtl-sentence-representations.appspot.com/tsvsWithoutLabels%2FAX.tsv?GoogleAccessId=firebase-adminsdk-0khhl@mtl-sentence-representations.iam.gserviceaccount.com&Expires=2 | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/utils/download_glue_data.py | Bert-text-classification_for_PyTorch/transformers/utils/download_glue_data.py | firebase-adminsdk-0khhl@mtl-sentence-representations.iam.gserviceaccount.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/utils/check_table.py | Bert-text-classification_for_PyTorch/transformers/utils/check_table.py | https://stackoverflow.com/questions/29916065/how-to-do-camelcase-split-in-python | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/utils/check_repo.py | Bert-text-classification_for_PyTorch/transformers/utils/check_repo.py | https://github.com/huggingface/doc-builder | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/roberta/test_modeling_roberta.py | Bert-text-classification_for_PyTorch/transformers/tests/xlm_roberta_xl/test_modeling_xlm_roberta_xl.py | https://github.com/huggingface/transformers/issues/1761 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/roberta/test_tokenization_roberta.py | Bert-text-classification_for_PyTorch/transformers/tests/xlm/test_tokenization_xlm.py | https://github.com/rsennrich/subword-nmt | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/unispeech_sat/test_modeling_unispeech_sat.py | Bert-text-classification_for_PyTorch/transformers/tests/wavlm/test_modeling_wavlm.py | https://github.com/pytorch/fairseq/issues/3227 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/wav2vec2_with_lm/test_processor_wav2vec2_with_lm.py | Bert-text-classification_for_PyTorch/transformers/tests/wav2vec2_with_lm/test_processor_wav2vec2_with_lm.py | https://huggingface.co/hf-internal-testing/processor_with_lm/tree/main | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/wav2vec2_with_lm/test_processor_wav2vec2_with_lm.py | Bert-text-classification_for_PyTorch/transformers/tests/wav2vec2_with_lm/test_processor_wav2vec2_with_lm.py | https://huggingface.co/datasets/common_voice/viewer/en/train | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/wav2vec2_with_lm/test_processor_wav2vec2_with_lm.py | Bert-text-classification_for_PyTorch/transformers/tests/wav2vec2/test_tokenization_wav2vec2.py | https://huggingface.co/datasets/common_voice/viewer/en/train | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/unispeech_sat/test_modeling_unispeech_sat.py | Bert-text-classification_for_PyTorch/transformers/tests/wav2vec2/test_modeling_wav2vec2.py | https://github.com/pytorch/fairseq/issues/3227 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/unispeech_sat/test_modeling_unispeech_sat.py | Bert-text-classification_for_PyTorch/transformers/tests/wav2vec2/test_modeling_tf_wav2vec2.py | 
https://github.com/pytorch/fairseq/issues/3227 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/vision_text_dual_encoder/test_modeling_vision_text_dual_encoder.py | Bert-text-classification_for_PyTorch/transformers/tests/vision_text_dual_encoder/test_modeling_vision_text_dual_encoder.py | https://github.com/rwightman/pytorch-image-models/blob/b9bd960a032c75ca6b808ddeed76bee5f3ed4972/timm/models/layers/helpers.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/vision_text_dual_encoder/test_modeling_vision_text_dual_encoder.py | Bert-text-classification_for_PyTorch/transformers/tests/vision_text_dual_encoder/test_modeling_flax_vision_text_dual_encoder.py | https://github.com/rwightman/pytorch-image-models/blob/b9bd960a032c75ca6b808ddeed76bee5f3ed4972/timm/models/layers/helpers.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/encoder_decoder/modeling_encoder_decoder.py | Bert-text-classification_for_PyTorch/transformers/tests/vision_encoder_decoder/test_modeling_tf_vision_encoder_decoder.py | https://github.com/huggingface/transformers/pull/13222/commits/dbb3c9de76eee235791d2064094654637c99f36d#r697304245 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/vision_encoder_decoder/test_modeling_tf_vision_encoder_decoder.py | Bert-text-classification_for_PyTorch/transformers/tests/vision_encoder_decoder/test_modeling_tf_vision_encoder_decoder.py | https://github.com/huggingface/transformers/pull/14016 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/repo_utils/test_check_copies.py | Bert-text-classification_for_PyTorch/transformers/tests/utils/test_utils_check_copies.py | https://huggingface.co/transformers/model_doc/albert.ht | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/utils/test_model_card.py | Bert-text-classification_for_PyTorch/transformers/tests/utils/test_model_card.py | https://arxiv.org/pdf/1810.03993.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert-text-classification_for_PyTorch/transformers/tests/utils/test_add_new_model_like.py | https://huggingface.co/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/unispeech_sat/test_modeling_unispeech_sat.py | Bert-text-classification_for_PyTorch/transformers/tests/unispeech_sat/test_modeling_unispeech_sat.py | https://github.com/pytorch/fairseq/issues/3227 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/unispeech_sat/test_modeling_unispeech_sat.py | Bert-text-classification_for_PyTorch/transformers/tests/unispeech/test_modeling_unispeech.py | https://github.com/pytorch/fairseq/issues/3227 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/trainer/test_trainer.py | Bert-text-classification_for_PyTorch/transformers/tests/trainer/test_trainer.py | https://github.com/huggingface/transformers/issues/12970 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/tokenization/test_tokenization_fast.py | Bert-text-classification_for_PyTorch/transformers/tests/tokenization/test_tokenization_fast.py | https://github.com/huggingface/transformers/pull/12550 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/tokenization/test_tokenization_fast.py | 
Bert-text-classification_for_PyTorch/transformers/tests/tokenization/test_tokenization_fast.py | https://github.com/huggingface/tokenizers/issues/537 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/test_modeling_common.py | Bert-text-classification_for_PyTorch/transformers/tests/test_modeling_tf_common.py | https://github.com/huggingface/transformers/issues/14859 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/test_modeling_common.py | Bert-text-classification_for_PyTorch/transformers/tests/test_modeling_common.py | https://stackoverflow.com/questions/9541025/how-to-copy-a-python-class | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/test_modeling_common.py | Bert-text-classification_for_PyTorch/transformers/tests/test_modeling_common.py | https://github.com/huggingface/transformers/issues/14859 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/levit/test_modeling_levit.py | Bert-text-classification_for_PyTorch/transformers/tests/test_modeling_common.py | https://github.com/huggingface/transformers/issues/11780 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/mobilebert/test_tokenization_mobilebert.py | Bert-text-classification_for_PyTorch/transformers/tests/tapas/test_tokenization_tapas.py | https://github.com/huggingface/tokenizers/issues/340 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/tapas/test_modeling_tf_tapas.py | Bert-text-classification_for_PyTorch/transformers/tests/tapas/test_modeling_tf_tapas.py | https://github.com/google-research/tapas/blob/master/tapas/models/segmented_tensor_test.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/tapas/test_modeling_tf_tapas.py | Bert-text-classification_for_PyTorch/transformers/tests/tapas/test_modeling_tapas.py | https://github.com/google-research/tapas/blob/master/tapas/models/segmented_tensor_test.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/pegasus/test_modeling_flax_pegasus.py | Bert-text-classification_for_PyTorch/transformers/tests/t5/test_modeling_flax_t5.py | https://jax.readthedocs.io/en/latest/gpu_memory_allocation.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/unispeech_sat/test_modeling_unispeech_sat.py | Bert-text-classification_for_PyTorch/transformers/tests/sew_d/test_modeling_sew_d.py | https://github.com/pytorch/fairseq/issues/3227 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/unispeech_sat/test_modeling_unispeech_sat.py | Bert-text-classification_for_PyTorch/transformers/tests/sew/test_modeling_sew.py | https://github.com/pytorch/fairseq/issues/3227 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert-text-classification_for_PyTorch/transformers/tests/sagemaker/scripts/pytorch/run_glue_model_parallelism.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/text-classification/run_flax_glue.py | Bert-text-classification_for_PyTorch/transformers/tests/sagemaker/scripts/pytorch/run_glue_model_parallelism.py | https://huggingface.co/docs/datasets/package_reference/main_classes.html#datasets.Dataset.unique | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/tests/models/roberta/test_tokenization_roberta.py | Bert-text-classification_for_PyTorch/transformers/tests/roberta/test_tokenization_roberta.py | https://github.com/rsennrich/subword-nmt | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/roberta/test_modeling_roberta.py | Bert-text-classification_for_PyTorch/transformers/tests/roberta/test_modeling_roberta.py | https://github.com/huggingface/transformers/issues/1761 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/reformer/test_tokenization_reformer.py | Bert-text-classification_for_PyTorch/transformers/tests/reformer/test_tokenization_reformer.py | https://github.com/huggingface/transformers/pull/11737#issuecomment-850769064 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/perceiver/test_modeling_perceiver.py | Bert-text-classification_for_PyTorch/transformers/tests/reformer/test_modeling_reformer.py | https://github.com/pytorch/pytorch/issues/36035 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/mobilebert/test_tokenization_mobilebert.py | Bert-text-classification_for_PyTorch/transformers/tests/realm/test_tokenization_realm.py | https://github.com/huggingface/tokenizers/issues/340 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/plbart/test_tokenization_plbart.py | Bert-text-classification_for_PyTorch/transformers/tests/plbart/test_tokenization_plbart.py | https://gist.github.com/sshleifer/cba08bc2109361a74ac3760a7e30e4f4 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/pipelines/test_pipelines_zero_shot.py | Bert-text-classification_for_PyTorch/transformers/tests/pipelines/test_pipelines_zero_shot.py | https://github.com/huggingface/transformers/issues/13846 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/pipelines/test_pipelines_zero_shot.py | Bert-text-classification_for_PyTorch/transformers/tests/pipelines/test_pipelines_zero_shot.py | https://github.com/huggingface/transformers/issues/13381#issuecomment-912343499 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/pipelines/test_pipelines_token_classification.py | Bert-text-classification_for_PyTorch/transformers/tests/pipelines/test_pipelines_token_classification.py | https://github.com/huggingface/transformers/pull/4987 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/roberta/test_tokenization_roberta.py | Bert-text-classification_for_PyTorch/transformers/tests/phobert/test_tokenization_phobert.py | https://github.com/rsennrich/subword-nmt | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/levit/test_modeling_levit.py | Bert-text-classification_for_PyTorch/transformers/tests/perceiver/test_modeling_perceiver.py | https://github.com/huggingface/transformers/issues/11780 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/perceiver/test_modeling_perceiver.py | Bert-text-classification_for_PyTorch/transformers/tests/perceiver/test_modeling_perceiver.py | https://github.com/pytorch/pytorch/issues/36035 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/pegasus/test_tokenization_pegasus.py | 
Bert-text-classification_for_PyTorch/transformers/tests/pegasus/test_tokenization_pegasus.py | https://github.com/google-research/bigbird/raw/master/bigbird/vocab/pegasus.model | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/pegasus/test_modeling_flax_pegasus.py | Bert-text-classification_for_PyTorch/transformers/tests/pegasus/test_modeling_flax_pegasus.py | https://jax.readthedocs.io/en/latest/gpu_memory_allocation.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/roberta/test_tokenization_roberta.py | Bert-text-classification_for_PyTorch/transformers/tests/openai/test_tokenization_openai.py | https://github.com/rsennrich/subword-nmt | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/plbart/test_tokenization_plbart.py | Bert-text-classification_for_PyTorch/transformers/tests/mbart50/test_tokenization_mbart50.py | https://gist.github.com/sshleifer/cba08bc2109361a74ac3760a7e30e4f4 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/plbart/test_tokenization_plbart.py | Bert-text-classification_for_PyTorch/transformers/tests/mbart/test_tokenization_mbart.py | https://gist.github.com/sshleifer/cba08bc2109361a74ac3760a7e30e4f4 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/pegasus/test_modeling_flax_pegasus.py | Bert-text-classification_for_PyTorch/transformers/tests/mbart/test_modeling_flax_mbart.py | https://jax.readthedocs.io/en/latest/gpu_memory_allocation.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/pegasus/test_modeling_flax_pegasus.py | Bert-text-classification_for_PyTorch/transformers/tests/marian/test_modeling_flax_marian.py | https://jax.readthedocs.io/en/latest/gpu_memory_allocation.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/plbart/test_tokenization_plbart.py | Bert-text-classification_for_PyTorch/transformers/tests/m2m_100/test_tokenization_m2m_100.py | https://gist.github.com/sshleifer/cba08bc2109361a74ac3760a7e30e4f4 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/led/test_modeling_led.py | Bert-text-classification_for_PyTorch/transformers/tests/led/test_modeling_led.py | https://github.com/allenai/longformer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/led/test_modeling_led.py | Bert-text-classification_for_PyTorch/transformers/tests/led/test_modeling_led.py | https://github.com/huggingface/transformers/pull/9278#issue-544709661 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/layoutxlm/test_processor_layoutxlm.py | Bert-text-classification_for_PyTorch/transformers/tests/layoutxlm/test_processor_layoutxlm.py | https://www.industrydocuments.ucsf.edu/docs/snbx0223 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/mobilebert/test_tokenization_mobilebert.py | Bert-text-classification_for_PyTorch/transformers/tests/layoutlmv2/test_tokenization_layoutlmv2.py | https://github.com/huggingface/tokenizers/issues/340 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/test_modeling_common.py | Bert-text-classification_for_PyTorch/transformers/tests/layoutlmv2/test_modeling_layoutlmv2.py | https://stackoverflow.com/questions/9541025/how-to-copy-a-python-class | 模型相关说明 | -| 
开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/roberta/test_modeling_roberta.py | Bert-text-classification_for_PyTorch/transformers/tests/ibert/test_modeling_ibert.py | https://github.com/huggingface/transformers/issues/1761 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/unispeech_sat/test_modeling_unispeech_sat.py | Bert-text-classification_for_PyTorch/transformers/tests/hubert/test_modeling_tf_hubert.py | https://github.com/pytorch/fairseq/issues/3227 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/unispeech_sat/test_modeling_unispeech_sat.py | Bert-text-classification_for_PyTorch/transformers/tests/hubert/test_modeling_hubert.py | https://github.com/pytorch/fairseq/issues/3227 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/hubert/test_modeling_hubert.py | Bert-text-classification_for_PyTorch/transformers/tests/hubert/test_modeling_hubert.py | https://github.com/pytorch/fairseq/pull/3572 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/roberta/test_tokenization_roberta.py | Bert-text-classification_for_PyTorch/transformers/tests/gpt2/test_tokenization_gpt2.py | https://github.com/rsennrich/subword-nmt | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/roberta/test_tokenization_roberta.py | Bert-text-classification_for_PyTorch/transformers/tests/fsmt/test_tokenization_fsmt.py | https://github.com/rsennrich/subword-nmt | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/fixtures/tests_samples/wiki_text/wiki_00 | Bert-text-classification_for_PyTorch/transformers/tests/fixtures/tests_samples/wiki_text/wiki_00 | https://en.wikipedia.org/wiki?curid=12 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/fixtures/tests_samples/wiki_text/wiki_00 | Bert-text-classification_for_PyTorch/transformers/tests/fixtures/tests_samples/wiki_text/wiki_00 | https://en.wikipedia.org/wiki?curid=25 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/encoder_decoder/modeling_encoder_decoder.py | Bert-text-classification_for_PyTorch/transformers/tests/encoder_decoder/test_modeling_tf_encoder_decoder.py | https://github.com/huggingface/transformers/pull/13222/commits/dbb3c9de76eee235791d2064094654637c99f36d#r697304245 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/levit/test_modeling_levit.py | Bert-text-classification_for_PyTorch/transformers/tests/deit/test_modeling_deit.py | https://github.com/huggingface/transformers/issues/11780 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/deepspeed/test_deepspeed.py | Bert-text-classification_for_PyTorch/transformers/tests/deepspeed/test_deepspeed.py | https://github.com/microsoft/DeepSpeed/issues/1612 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/roberta/test_tokenization_roberta.py | Bert-text-classification_for_PyTorch/transformers/tests/deberta/test_tokenization_deberta.py | https://github.com/rsennrich/subword-nmt | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/roberta/test_modeling_roberta.py | Bert-text-classification_for_PyTorch/transformers/tests/data2vec/test_modeling_data2vec_text.py | https://github.com/huggingface/transformers/issues/1761 | 
模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/unispeech_sat/test_modeling_unispeech_sat.py | Bert-text-classification_for_PyTorch/transformers/tests/data2vec/test_modeling_data2vec_audio.py | https://github.com/pytorch/fairseq/issues/3227 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/roberta/test_tokenization_roberta.py | Bert-text-classification_for_PyTorch/transformers/tests/ctrl/test_tokenization_ctrl.py | https://github.com/rsennrich/subword-nmt | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/pegasus/test_modeling_flax_pegasus.py | Bert-text-classification_for_PyTorch/transformers/tests/blenderbot_small/test_modeling_flax_blenderbot_small.py | https://jax.readthedocs.io/en/latest/gpu_memory_allocation.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/pegasus/test_modeling_flax_pegasus.py | Bert-text-classification_for_PyTorch/transformers/tests/blenderbot/test_modeling_flax_blenderbot.py | https://jax.readthedocs.io/en/latest/gpu_memory_allocation.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/big_bird/test_tokenization_big_bird.py | Bert-text-classification_for_PyTorch/transformers/tests/big_bird/test_tokenization_big_bird.py | https://github.com/google-research/bigbird/blob/master/bigbird/vocab/gpt2.model?raw=true | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/roberta/test_tokenization_roberta.py | Bert-text-classification_for_PyTorch/transformers/tests/bertweet/test_tokenization_bertweet.py | https://github.com/rsennrich/subword-nmt | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/mobilebert/test_tokenization_mobilebert.py | Bert-text-classification_for_PyTorch/transformers/tests/bert/test_tokenization_bert.py | https://github.com/huggingface/tokenizers/issues/340 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/barthez/test_tokenization_barthez.py | Bert-text-classification_for_PyTorch/transformers/tests/barthez/test_tokenization_barthez.py | https://github.com/huggingface/transformers/issues/11457 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/pegasus/test_modeling_flax_pegasus.py | Bert-text-classification_for_PyTorch/transformers/tests/bart/test_modeling_flax_bart.py | https://jax.readthedocs.io/en/latest/gpu_memory_allocation.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/auto/test_tokenization_auto.py | Bert-text-classification_for_PyTorch/transformers/tests/auto/test_tokenization_auto.py | https://github.com/huggingface/transformers/pull/13251 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert-text-classification_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_fast_{{cookiecutter.lowercase_modelname}}.py | https://huggingface.co/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert-text-classification_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_{{cookiecutter.lowercase_modelname}}.py | https://huggingface.co/ | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/examples/flax/README.md | Bert-text-classification_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | https://huggingface.co/models?filter= | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | https://github.com/tensorflow/mesh/blob/8d2465e9bc93129b913b5ccc6a59aa97abd96ec6/mesh_tensorflow/transformer/transformer_layers.py#L270 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/README.md | Bert-text-classification_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | https://arxiv.org/abs/1910.13461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | https://github.com/google/flax/blob/491ce18759622506588784b4fca0e4bf05f8c8cd/flax/linen/attention.py#L252 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/README.md | Bert-text-classification_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_{{cookiecutter.lowercase_modelname}}.py | https://huggingface.co/models?filter= | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_{{cookiecutter.lowercase_modelname}}.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_{{cookiecutter.lowercase_modelname}}.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_{{cookiecutter.lowercase_modelname}}.py | 
https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | Bert-text-classification_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_{{cookiecutter.lowercase_modelname}}.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/README.md | Bert-text-classification_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_{{cookiecutter.lowercase_modelname}}.py | https://arxiv.org/abs/1910.13461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_{{cookiecutter.lowercase_modelname}}.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert-text-classification_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/configuration_{{cookiecutter.lowercase_modelname}}.py | https://huggingface.co/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/README.md | Bert-text-classification_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/configuration_{{cookiecutter.lowercase_modelname}}.py | https://huggingface.co/models?filter= | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/configuration_{{cookiecutter.lowercase_modelname}}.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert-text-classification_for_PyTorch/transformers/templates/adding_a_new_example_script/{{cookiecutter.directory_name}}/run_{{cookiecutter.example_shortcut}}.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert-text-classification_for_PyTorch/transformers/templates/adding_a_new_example_script/{{cookiecutter.directory_name}}/run_{{cookiecutter.example_shortcut}}.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/utils/fx.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/utils/fx.py | https://github.com/pytorch/pytorch/pull/55888 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/training_args_tf.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/training_args_tf.py | https://docs.python.org/3/library/argparse#module-argparse | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/pt/multilingual.md | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/training_args_tf.py | https://github.com/huggingface/transformers/tree/master/examples | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/training_args_tf.py | https://www.tensorflow.org/tensorboard | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/lightning_base.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/training_args_tf.py | https://nvidia.github.io/apex/amp | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/training_args_tf.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/training_args.py | https://docs.python.org/3/library/argparse#module-argparse | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/pt/multilingual.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/training_args.py | https://github.com/huggingface/transformers/tree/master/examples | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/training_args.py | https://www.tensorflow.org/tensorboard | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/lightning_base.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/training_args.py | https://nvidia.github.io/apex/amp | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/main_classes/callback.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/training_args.py | https://www.wandb.com/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/main_classes/callback.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/training_args.py | https://www.mlflow.org/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/perf_train_gpu_many.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/training_args.py | https://github.com/facebookresearch/fairscale | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/main_classes/deepspeed.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/training_args.py | https://github.com/microsoft/deepspeed | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/lightning_base.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/training_args.py | https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/training_args.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/training_args.py | https://github.com/huggingface/transformers/issues/10628 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/trainer_tf.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/trainer_tf.py | https://docs.wandb.com/huggingface | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/trainer_tf.py | 
https://www.comet.ml/docs/python-sdk/huggingface/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/trainer_tf.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/trainer_tf.py | https://www.comet.ml/docs/python-sdk/advanced/#comet-configuration-variables | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/trainer_pt_utils.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/trainer_pt_utils.py | https://github.com/numpy/numpy/blob/a47ecdea856986cd60eabbd53265c2ca5916ad5d/doc/source/user/basics.types.rst | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/trainer_pt_utils.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/trainer_pt_utils.py | https://github.com/pytorch/pytorch/issues/16266 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/trainer.py | https://www.github.com/nvidia/apex | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/trainer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/trainer.py | https://github.com/huggingface/transformers/pull/4659#issuecomment-643356021 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/trainer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/trainer.py | https://optuna.readthedocs.io/en/stable/reference/generated/optuna.study.create_study.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/trainer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/trainer.py | https://docs.ray.io/en/latest/tune/api_docs/execution.html#tune-run | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/trainer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/trainer.py | https://app.sigopt.com/docs/endpoints/experiments/create | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/tokenization_utils_base.py | https://huggingface.co/models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/tokenization_utils_base.py | https://huggingface.co/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/tokenization_utils.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/tokenization_utils.py | https://en.wikipedia.org/wiki/Trie | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/layoutxlm/tokenization_layoutxlm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/tokenization_utils.py | https://github.com/huggingface/transformers/pull/2674 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/big_bird/tokenization_big_bird.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/tokenization_utils.py | https://github.com/huggingface/transformers/issues/1133 | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/src/transformers/testing_utils.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/testing_utils.py | https://github.com/fastai/fastai/blob/master/tests/utils/text.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/testing_utils.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/testing_utils.py | https://stackoverflow.com/a/64789046/9201239 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/testing_utils.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/testing_utils.py | https://stackoverflow.com/a/34333710/9201239 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/testing_utils.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/testing_utils.py | https://github.com/pytest-dev/pytest/blob/897f151e/src/_pytest/runner.py#L66 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/testing_utils.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/testing_utils.py | https://github.com/pytest-dev/pytest/blob/897f151e/src/_pytest/terminal.py#L814 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/testing_utils.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/testing_utils.py | https://stackoverflow.com/a/59041913/9201239 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/testing_utils.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/testing_utils.py | https://docs.python.org/3/library/asyncio-subprocess.html#asyncio.asyncio.subprocess.Process.wait | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/pipelines/zero_shot_image_classification.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/pipelines/zero_shot_image_classification.py | https://huggingface.co/models?filter=zero-shot-image-classification | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/pipelines/zero_shot_classification.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/pipelines/zero_shot_classification.py | https://huggingface.co/models?search=nli | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/pipelines/token_classification.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/pipelines/token_classification.py | https://huggingface.co/models?filter=token-classification | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/pipelines/text2text_generation.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/pipelines/text2text_generation.py | https://huggingface.co/models?filter=text2text-generation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/pipelines/text2text_generation.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/pipelines/text2text_generation.py | https://huggingface.co/models?filter=summarization | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/pipelines/text2text_generation.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/pipelines/text2text_generation.py | 
https://huggingface.co/models?filter=translation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_clm_flax.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/pipelines/text_generation.py | https://huggingface.co/models?filter=text-generation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/text-generation/run_generation.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/pipelines/text_generation.py | https://github.com/rusiaaman/XLNet-gen#methodology | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/text-generation/run_generation.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/pipelines/text_generation.py | https://medium.com/@amanrusia/xlnet-speaks-comparison-to-gpt-2-ea1a4e9ba39e | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/pipelines/text_generation.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/pipelines/text_generation.py | https://github.com/huggingface/transformers/issues/14033#issuecomment-948385227 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/pipelines/text_classification.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/pipelines/text_classification.py | https://huggingface.co/models?filter=text-classification | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/pipelines/table_question_answering.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/pipelines/table_question_answering.py | https://huggingface.co/models?filter=table-question-answering | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/pipelines/question_answering.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/pipelines/question_answering.py | https://huggingface.co/models?filter=question-answering | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/pipelines/question_answering.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/pipelines/question_answering.py | https://github.com/facebookresearch/DrQA | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/pipelines/object_detection.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/pipelines/object_detection.py | https://huggingface.co/models?filter=object-detection | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/pipelines/image_segmentation.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/pipelines/image_segmentation.py | https://huggingface.co/models?filter=image-segmentation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/tensorflow/image-classification/run_image_classification.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/pipelines/image_classification.py | https://huggingface.co/models?filter=image-classification | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_mlm_flax.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/pipelines/fill_mask.py | https://huggingface.co/models?filter=fill-mask | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/src/transformers/pipelines/fill_mask.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/pipelines/fill_mask.py | https://github.com/huggingface/transformers/pull/10222 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/pipelines/feature_extraction.py | https://huggingface.co/models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/pipelines/conversational.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/pipelines/conversational.py | https://huggingface.co/models?filter=conversational | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/pipelines/base.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/tensorflow/image-classification/run_image_classification.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/pipelines/base.py | https://pytorch.org/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/pipelines/base.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/pipelines/base.py | https://huggingface.co/transformers/main_classes/pipelines.html#pipeline-batching | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/pipelines/audio_classification.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/pipelines/audio_classification.py | https://huggingface.co/models?filter=audio-classification | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/pipelines/__init__.py | https://huggingface.co/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/optimization_tf.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/optimization_tf.py | https://arxiv.org/abs/1711.05101 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/optimization_tf.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/optimization_tf.py | https://arxiv.org/abs/1904.09237 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/optimization_tf.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/optimization_tf.py | https://github.com/OpenNMT/OpenNMT-tf/blob/master/opennmt/optimizers/utils.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/optimization.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/optimization.py | https://github.com/google-research/bert/blob/f39e881b169b9d53bea03d2d341b31707a6c052b/optimization.py#L37 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/optimization_tf.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/optimization.py | https://arxiv.org/abs/1711.05101 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/optimization.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/optimization.py | 
https://github.com/pytorch/fairseq/blob/master/fairseq/optim/adafactor.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/optimization.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/optimization.py | https://arxiv.org/abs/1804.04235 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/optimization.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/optimization.py | https://discuss.huggingface.co/t/t5-finetuning-tips/684/3 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/optimization.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/optimization.py | https://github.com/huggingface/transformers/blob/8395f14de6068012787d83989c3627c3df6a252b/src/transformers/optimization.py#L505 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/yoso/configuration_yoso.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/yoso/modeling_yoso.py | https://huggingface.co/models?filter=yoso | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/yoso/modeling_yoso.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/yoso/modeling_yoso.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/kernels/yoso/fast_lsh_cumulation_cuda.cu | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/yoso/fast_lsh_cumulation_cuda.cu | https://github.com/mlpen/YOSO/blob/main/encoders/backbones/efficient_attentions/yoso/yoso_v1/cuda/fast_lsh_cumulation_cuda.cu | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/kernels/yoso/fast_lsh_cumulation.cu | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/yoso/fast_lsh_cumulation.cu | https://github.com/mlpen/YOSO/blob/main/encoders/backbones/efficient_attentions/yoso/yoso_v1/cuda/fast_lsh_cumulation.cu | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/yoso.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/yoso/convert_yoso_pytorch_to_pytorch.py | https://github.com/mlpen/YOSO | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/yoso/configuration_yoso.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/yoso/configuration_yoso.py | https://huggingface.co/models?filter=yoso | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/yoso/configuration_yoso.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/yoso/configuration_yoso.py | https://huggingface.co/uw-madison/yoso-4096 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/pegasus/tokenization_pegasus_fast.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlnet/tokenization_xlnet_fast.py | https://huggingface.co/docs/tokenizers/python/latest/components.html?highlight=unigram#models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlnet/tokenization_xlnet_fast.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlnet/tokenization_xlnet.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlnet/tokenization_xlnet.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/xlnet.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlnet/modeling_xlnet.py | https://huggingface.co/models?filter=xlnet | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlnet/modeling_xlnet.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlnet/modeling_xlnet.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlnet/modeling_xlnet.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/xlnet.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlnet/modeling_tf_xlnet.py | https://huggingface.co/models?filter=xlnet | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlnet/modeling_tf_xlnet.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlnet/tokenization_xlnet.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlnet/configuration_xlnet.py | https://huggingface.co/xlnet-large-cased | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlnet/configuration_xlnet.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlnet/configuration_xlnet.py | https://huggingface.co/transformers/quickstart.html#using-the-past | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlnet/configuration_xlnet.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlnet/configuration_xlnet.py | https://github.com/zihangdai/xlnet/issues/41#issuecomment-505102587 | 模型相关说明 | -| 
开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm_roberta_xl/modeling_xlm_roberta_xl.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm_roberta_xl/modeling_xlm_roberta_xl.py | https://huggingface.co/models?filter=xlm-roberta-xl | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm_roberta_xl/modeling_xlm_roberta_xl.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm_roberta_xl/modeling_xlm_roberta_xl.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm_roberta_xl/modeling_xlm_roberta_xl.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm_roberta_xl/modeling_xlm_roberta_xl.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm_roberta_xl/configuration_xlm_roberta_xl.py | https://huggingface.co/models?filter=xlm-roberta-xl | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/jax-projects/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm_roberta_xl/configuration_xlm_roberta_xl.py | https://huggingface.co/bert-base-uncased | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/question-answering/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm_roberta_xl/configuration_xlm_roberta_xl.py | https://arxiv.org/abs/1803.02155 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/question-answering/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm_roberta_xl/configuration_xlm_roberta_xl.py | https://arxiv.org/abs/2009.13658 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/layoutxlm/tokenization_layoutxlm_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | https://huggingface.co/docs/tokenizers/python/latest/components.html?highlight=BPE#models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/xlm-roberta.md | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm_roberta/modeling_xlm_roberta.py | https://huggingface.co/models?filter=xlm-roberta | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm_roberta/modeling_xlm_roberta.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/xlm-roberta.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm_roberta/modeling_tf_xlm_roberta.py | https://huggingface.co/models?filter=xlm-roberta | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm_roberta/modeling_tf_xlm_roberta.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/xlm-roberta.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm_roberta/modeling_flax_xlm_roberta.py | https://huggingface.co/models?filter=xlm-roberta | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm_roberta/modeling_flax_xlm_roberta.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm_roberta/modeling_flax_xlm_roberta.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm_roberta/modeling_flax_xlm_roberta.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm_roberta/modeling_flax_xlm_roberta.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm_roberta/modeling_flax_xlm_roberta.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm_prophetnet/tokenization_xlm_prophetnet.py | 
https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm_prophetnet/tokenization_xlm_prophetnet.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/xlm-prophetnet.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm_prophetnet/modeling_xlm_prophetnet.py | https://huggingface.co/models?filter=xprophetnet | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://github.com/facebookresearch/XLM/blob/master/tools/lowercase_and_remove_accent.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://github.com/moses-smt/mosesdecoder/blob/master/scripts/tokenizer/replace-unicode-punctuation.perl | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://github.com/moses-smt/mosesdecoder/blob/master/scripts/tokenizer/remove-non-printing-char.perl | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://github.com/rsennrich/wmt16-scripts/blob/master/preprocess/normalise-romanian.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://github.com/rsennrich/wmt16-scripts/blob/master/preprocess/remove-diacritics.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://github.com/neubig/kyt | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/seq2seq/romanian_postprocessing.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | git@github.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://github.com/alvations/sacremoses | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/awesome-transformers.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://github.com/PyThaiNLP/pythainlp | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://github.com/chezou/Mykytea-python | 源码实现 | -| 
开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://github.com/neubig/kytea | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://github.com/fxsjy/jieba | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://nlp.stanford.edu/software/stanford-segmenter-2018-10-16.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://github.com/facebookresearch/XLM/tree/master/tools | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/xlm.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm/modeling_xlm.py | https://huggingface.co/models?filter=xlm | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm/modeling_xlm.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/xlm.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm/modeling_tf_xlm.py | https://huggingface.co/models?filter=xlm | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm/modeling_tf_xlm.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/configuration_xlm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm/configuration_xlm.py | https://huggingface.co/xlm-mlm-en-2048 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/configuration_xlm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm/configuration_xlm.py | http://huggingface.co/transformers/multilingual.html#xlm-language-embeddings | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/layoutxlm/tokenization_layoutxlm_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xglm/tokenization_xglm_fast.py | https://huggingface.co/docs/tokenizers/python/latest/components.html?highlight=BPE#models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xglm/tokenization_xglm.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xglm/tokenization_xglm.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xglm/configuration_xglm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xglm/modeling_xglm.py | https://huggingface.co/models?filter=xglm | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xglm/modeling_xglm.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xglm/modeling_xglm.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xglm/modeling_flax_xglm.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xglm/modeling_flax_xglm.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xglm/modeling_flax_xglm.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xglm/modeling_flax_xglm.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xglm/modeling_flax_xglm.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xglm/modeling_flax_xglm.py | https://github.com/google/flax/blob/491ce18759622506588784b4fca0e4bf05f8c8cd/flax/linen/attention.py#L252 | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xglm/modeling_flax_xglm.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xglm/configuration_xglm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xglm/configuration_xglm.py | https://huggingface.co/models?filter=xglm | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xglm/tokenization_xglm_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xglm/configuration_xglm.py | https://huggingface.co/facebook/xglm-564M | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xglm/configuration_xglm.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/wavlm/modeling_wavlm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wavlm/modeling_wavlm.py | https://huggingface.co/models?filter=wavlm | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wavlm/modeling_wavlm.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/wavlm/modeling_wavlm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wavlm/modeling_wavlm.py | https://github.com/pytorch/pytorch/issues/32590 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wavlm/modeling_wavlm.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wavlm/modeling_wavlm.py | https://arxiv.org/pdf/1611.01144.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wavlm/modeling_wavlm.py | https://pytorch.org/docs/stable/generated/torch.nn.Conv1d.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wavlm/modeling_wavlm.py | https://arxiv.org/abs/2101.07597 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wavlm/modeling_wavlm.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 
开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/trocr.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wavlm/convert_wavlm_original_pytorch_checkpoint_to_pytorch.py | https://github.com/microsoft/unilm | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/wavlm/convert_wavlm_original_pytorch_checkpoint_to_pytorch.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wavlm/convert_wavlm_original_pytorch_checkpoint_to_pytorch.py | https://github.com/microsoft/unilm/commit/b94ec76c36f02fb2b0bf0dcb0b8554a2185173cd | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/wavlm/modeling_wavlm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wavlm/configuration_wavlm.py | https://huggingface.co/models?filter=wavlm | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wavlm/configuration_wavlm.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/wav2vec2_with_lm/test_processor_wav2vec2_with_lm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wav2vec2_with_lm/processing_wav2vec2_with_lm.py | https://huggingface.co/datasets/common_voice/viewer/en/train | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/wav2vec2_phoneme/tokenization_wav2vec2_phoneme.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wav2vec2_phoneme/tokenization_wav2vec2_phoneme.py | https://github.com/bootphon/phonemizer#readme | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/wav2vec2_with_lm/test_processor_wav2vec2_with_lm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wav2vec2/tokenization_wav2vec2.py | https://huggingface.co/datasets/common_voice/viewer/en/train | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/wav2vec2/feature_extraction_wav2vec2.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wav2vec2/tokenization_wav2vec2.py | https://huggingface.co/models?search=lv60 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/jax-projects/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wav2vec2/tokenization_wav2vec2.py | https://huggingface.co/facebook/wav2vec2-base-960h | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/wav2vec2/feature_extraction_wav2vec2.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wav2vec2/tokenization_wav2vec2.py | https://huggingface.co/facebook/wav2vec2-large-960h-lv60-self | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/wav2vec2_conformer/modeling_wav2vec2_conformer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_wav2vec2.py | https://huggingface.co/models?filter=wav2vec2 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/speech-recognition/README.md | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_wav2vec2.py | https://arxiv.org/pdf/2006.11477.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_wav2vec2.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_wav2vec2.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_wav2vec2.py | https://arxiv.org/pdf/1611.01144.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_wav2vec2.py | https://pytorch.org/docs/stable/generated/torch.nn.Conv1d.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/jax-projects/wav2vec2/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_wav2vec2.py | https://arxiv.org/abs/2006.11477 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_wav2vec2.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/jax-projects/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_wav2vec2.py | https://huggingface.co/facebook/wav2vec2-base-960h | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/wav2vec2_conformer/modeling_wav2vec2_conformer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_tf_wav2vec2.py | https://huggingface.co/models?filter=wav2vec2 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/hubert/modeling_tf_hubert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_tf_wav2vec2.py | https://github.com/tensorflow/tensorflow/issues/9260 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/hubert/modeling_tf_hubert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_tf_wav2vec2.py | https://github.com/pytorch/fairseq/blob/e0788f7007a8473a76db573985031f3c94201e79/fairseq/data/data_utils.py#L376 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/hubert/modeling_tf_hubert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_tf_wav2vec2.py | https://www.tensorflow.org/addons/api_docs/python/tfa/layers/GroupNormalization 
| 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/hubert/modeling_tf_hubert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_tf_wav2vec2.py | https://www.tensorflow.org/probability/api_docs/python/tfp/layers/weight_norm/WeightNorm | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_tf_wav2vec2.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_tf_wav2vec2.py | https://pytorch.org/docs/stable/generated/torch.nn.Conv1d.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_tf_wav2vec2.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_tf_wav2vec2.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/speech-recognition/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_flax_wav2vec2.py | https://arxiv.org/pdf/2006.11477.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_flax_wav2vec2.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/jax-projects/wav2vec2/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_flax_wav2vec2.py | https://arxiv.org/abs/2006.11477 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_flax_wav2vec2.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_flax_wav2vec2.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_flax_wav2vec2.py | 
https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_flax_wav2vec2.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_flax_wav2vec2.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/jax-projects/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_flax_wav2vec2.py | https://huggingface.co/facebook/wav2vec2-base-960h | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_flax_wav2vec2.py | https://arxiv.org/pdf/1611.01144.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_flax_wav2vec2.py | https://pytorch.org/docs/stable/generated/torch.nn.Conv1d.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/wav2vec2/feature_extraction_wav2vec2.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wav2vec2/feature_extraction_wav2vec2.py | https://huggingface.co/models?search=lv60 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/jax-projects/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wav2vec2/feature_extraction_wav2vec2.py | https://huggingface.co/facebook/wav2vec2-base-960h | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/wav2vec2/feature_extraction_wav2vec2.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wav2vec2/feature_extraction_wav2vec2.py | https://huggingface.co/facebook/wav2vec2-large-960h-lv60-self | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/wav2vec2_conformer/modeling_wav2vec2_conformer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wav2vec2/configuration_wav2vec2.py | https://huggingface.co/models?filter=wav2vec2 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/jax-projects/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wav2vec2/configuration_wav2vec2.py | https://huggingface.co/facebook/wav2vec2-base-960h | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wav2vec2/configuration_wav2vec2.py | 
https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vit_mae/modeling_vit_mae.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vit_mae/modeling_vit_mae.py | https://huggingface.co/models?filter=vit_mae | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vit/modeling_tf_vit.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vit_mae/modeling_vit_mae.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/vision_transformer.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vit_mae/modeling_vit_mae.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vit_mae/modeling_vit_mae.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vit_mae/modeling_vit_mae.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/image-pretraining/run_mae.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vit_mae/convert_vit_mae_to_pytorch.py | https://github.com/facebookresearch/mae | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vit_mae/configuration_vit_mae.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vit_mae/configuration_vit_mae.py | https://huggingface.co/models?filter=vit-mae | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vit_mae/configuration_vit_mae.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vit_mae/configuration_vit_mae.py | https://huggingface.co/facebook/vit-mae-base | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/vision/run_image_classification.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vit/modeling_vit.py | https://huggingface.co/models?filter=vit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/vision_text_dual_encoder/test_modeling_vision_text_dual_encoder.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vit/modeling_vit.py | https://github.com/rwightman/pytorch-image-models/blob/b9bd960a032c75ca6b808ddeed76bee5f3ed4972/timm/models/layers/helpers.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vit/modeling_tf_vit.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vit/modeling_vit.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/vision_transformer.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vit_hybrid/modeling_vit_hybrid.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vit/modeling_vit.py | https://github.com/facebookresearch/dino/blob/de9ee3df6cf39fac952ab558447af1fa1365362a/vision_transformer.py#L174 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vit_hybrid/modeling_vit_hybrid.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vit/modeling_vit.py | https://github.com/facebookresearch/dino/issues/8 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vit/modeling_vit.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vit/modeling_vit.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/image-pretraining/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vit/modeling_vit.py | https://arxiv.org/abs/2111.09886 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vit/modeling_vit.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/vision_text_dual_encoder/test_modeling_vision_text_dual_encoder.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vit/modeling_tf_vit.py | https://github.com/rwightman/pytorch-image-models/blob/b9bd960a032c75ca6b808ddeed76bee5f3ed4972/timm/models/layers/helpers.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vit/modeling_tf_vit.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vit/modeling_tf_vit.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/vision_transformer.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vit_hybrid/modeling_vit_hybrid.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vit/modeling_tf_vit.py | https://github.com/facebookresearch/dino/blob/de9ee3df6cf39fac952ab558447af1fa1365362a/vision_transformer.py#L174 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vit/modeling_tf_vit.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vit/modeling_tf_vit.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vit/modeling_flax_vit.py | 
https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vit/modeling_flax_vit.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vit/modeling_flax_vit.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vit/modeling_flax_vit.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vit/modeling_flax_vit.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vit/modeling_flax_vit.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/vision/run_image_classification.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vit/configuration_vit.py | https://huggingface.co/models?filter=vit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/tensorflow/image-classification/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vit/configuration_vit.py | https://huggingface.co/google/vit-base-patch16-224 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/visual_bert/configuration_visual_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/visual_bert/modeling_visual_bert.py | https://huggingface.co/models?filter=visual_bert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/visual_bert/modeling_visual_bert.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/visual_bert/modeling_visual_bert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/visual_bert/modeling_visual_bert.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/visual_bert/configuration_visual_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/visual_bert/configuration_visual_bert.py | https://huggingface.co/models?filter=visual_bert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/visual_bert/configuration_visual_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/visual_bert/configuration_visual_bert.py | https://huggingface.co/uclanlp/visualbert-vqa-coco-pre | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/vision-text-dual-encoder.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_vision_text_dual_encoder.py | https://arxiv.org/abs/2111.07991 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_vision_text_dual_encoder.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_vision_text_dual_encoder.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vision_text_dual_encoder/modeling_flax_vision_text_dual_encoder.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_vision_text_dual_encoder.py | https://farm3.staticflickr.com/2674/5850229113_4fe05d5265_z.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/vision-text-dual-encoder.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_flax_vision_text_dual_encoder.py | https://arxiv.org/abs/2111.07991 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_flax_vision_text_dual_encoder.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_flax_vision_text_dual_encoder.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_flax_vision_text_dual_encoder.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_flax_vision_text_dual_encoder.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_flax_vision_text_dual_encoder.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_flax_vision_text_dual_encoder.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vision_text_dual_encoder/modeling_flax_vision_text_dual_encoder.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_flax_vision_text_dual_encoder.py | https://farm3.staticflickr.com/2674/5850229113_4fe05d5265_z.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_vision_encoder_decoder.py | https://arxiv.org/abs/1907.12461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_vision_encoder_decoder.py | https://arxiv.org/abs/2109.10282 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_vision_encoder_decoder.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/trocr.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_vision_encoder_decoder.py | https://fki.tic.heia-fr.ch/static/img/a01-122-02.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_tf_vision_encoder_decoder.py | https://arxiv.org/abs/1907.12461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_tf_vision_encoder_decoder.py | https://arxiv.org/abs/2109.10282 | 参考论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_tf_vision_encoder_decoder.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_tf_vision_encoder_decoder.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_flax_vision_encoder_decoder.py | https://arxiv.org/abs/1907.12461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_flax_vision_encoder_decoder.py | https://arxiv.org/abs/2109.10282 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_flax_vision_encoder_decoder.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_flax_vision_encoder_decoder.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/trocr/convert_trocr_unilm_to_pytorch.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/convert_trocr_unilm_to_pytorch.py | https://fki.tic.heia-fr.ch/static/img/a01-122-02-12.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/trocr/convert_trocr_unilm_to_pytorch.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/convert_trocr_unilm_to_pytorch.py | https://fki.tic.heia-fr.ch/static/img/a01-122-02-10.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/trocr.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/convert_trocr_unilm_to_pytorch.py | https://fki.tic.heia-fr.ch/static/img/a01-122-02.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/trocr/convert_trocr_unilm_to_pytorch.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/convert_trocr_unilm_to_pytorch.py | https://fki.tic.heia-fr.ch/static/img/a01-122.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vilt/modeling_vilt.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vilt/modeling_vilt.py | https://huggingface.co/models?filter=vilt | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/src/transformers/models/vit/modeling_tf_vit.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vilt/modeling_vilt.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/vision_transformer.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vilt/modeling_vilt.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vilt/modeling_vilt.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vilt/modeling_vilt.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vilt/modeling_vilt.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vilt/modeling_vilt.py | https://github.com/jnhwkim/ban-vqa/blob/master/train.py#L19 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vilt/modeling_vilt.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vilt/modeling_vilt.py | https://lil.nlp.cornell.edu/nlvr/exs/ex0_0.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vilt/modeling_vilt.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vilt/modeling_vilt.py | https://lil.nlp.cornell.edu/nlvr/exs/ex0_1.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vilt/configuration_vilt.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vilt/configuration_vilt.py | https://huggingface.co/dandelin/vilt-b32-mlm | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/deprecated/van/modeling_van.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/van/modeling_van.py | https://huggingface.co/models?filter=van | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/donut/modeling_donut_swin.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/van/modeling_van.py | https://github.com/tensorflow/tpu/issues/494#issuecomment-532968956 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/deprecated/van/modeling_van.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/van/modeling_van.py | https://arxiv.org/abs/2106.13797 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/van/modeling_van.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/van.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/van/convert_van_to_pytorch.py | https://github.com/Visual-Attention-Network/VAN-Classification | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | https://huggingface.co/models?filter=unispeech_sat | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/speech-recognition/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | https://arxiv.org/pdf/2006.11477.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | https://arxiv.org/pdf/1611.01144.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | https://pytorch.org/docs/stable/generated/torch.nn.Conv1d.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/jax-projects/wav2vec2/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | https://arxiv.org/abs/2006.11477 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | https://huggingface.co/models?filter=unispeech_sat | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/unispeech/modeling_unispeech.py | https://huggingface.co/models?filter=unispeech | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/speech-recognition/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/unispeech/modeling_unispeech.py | https://arxiv.org/pdf/2006.11477.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/unispeech/modeling_unispeech.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/unispeech/modeling_unispeech.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/unispeech/modeling_unispeech.py | https://arxiv.org/pdf/1611.01144.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/unispeech/modeling_unispeech.py | https://pytorch.org/docs/stable/generated/torch.nn.Conv1d.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/unispeech/modeling_unispeech.py | https://arxiv.org/abs/2101.07597 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/unispeech/modeling_unispeech.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/unispeech/configuration_unispeech.py | https://huggingface.co/models?filter=unispeech | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/unispeech/configuration_unispeech.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/trocr.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/trocr/modeling_trocr.py | https://huggingface.co/models?filter=trocr | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/trocr/modeling_trocr.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/trocr/modeling_trocr.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/trocr.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/trocr/configuration_trocr.py | https://huggingface.co/models?filter=trocr | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/trocr/configuration_trocr.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/trocr/configuration_trocr.py | https://huggingface.co/microsoft/trocr-base | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/trocr/configuration_trocr.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_transfo_xl.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/transfo_xl/tokenization_transfo_xl.py | https://github.com/kimiyoung/transformer-xl | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_transfo_xl.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/transfo_xl/modeling_transfo_xl_utilities.py | https://github.com/kimiyoung/transformer-xl | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/transfo_xl/modeling_transfo_xl_utilities.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/transfo_xl/modeling_transfo_xl_utilities.py | https://github.com/pytorch/pytorch/blob/dbe6a7a9ff1a364a8706bf5df58a1ca96d2fd9da/torch/nn/modules/adaptive.py#L138 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/transfo_xl/modeling_transfo_xl_utilities.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/transfo_xl/modeling_transfo_xl_utilities.py | https://github.com/pytorch/pytorch/blob/master/torch/nn/modules/adaptive.p | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_transfo_xl.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/transfo_xl/modeling_transfo_xl.py | https://github.com/kimiyoung/transformer-xl | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/transfo_xl/modeling_transfo_xl.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/transfo_xl/modeling_transfo_xl.py | https://github.com/kimiyoung/transformer-xl/blob/master/pytorch/mem_transformer.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/transfo-xl.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/transfo_xl/modeling_transfo_xl.py | https://huggingface.co/models?filter=transfo-xl | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/transfo_xl/modeling_transfo_xl.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/transfo_xl/modeling_transfo_xl.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/transfo_xl/modeling_tf_transfo_xl.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/transfo_xl/modeling_transfo_xl.py | https://github.com/huggingface/transformers/issues/3310 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/transfo-xl.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/transfo_xl/modeling_tf_transfo_xl.py | https://huggingface.co/models?filter=transfo-xl | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/transfo_xl/modeling_tf_transfo_xl.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/transfo_xl/modeling_tf_transfo_xl.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/transfo_xl/modeling_tf_transfo_xl.py | https://github.com/huggingface/transformers/issues/3310 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/transfo_xl/convert_transfo_xl_original_tf_checkpoint_to_pytorch.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/transfo_xl/convert_transfo_xl_original_tf_checkpoint_to_pytorch.py | https://stackoverflow.com/questions/2121874/python-pickling-after-changing-a-modules-directory/2121918#2121918 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/transfo_xl/tokenization_transfo_xl.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/transfo_xl/configuration_transfo_xl.py | https://huggingface.co/transfo-xl-wt103 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mobilebert/tokenization_mobilebert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://github.com/huggingface/transformers/issues/328 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/tokenization_tapas.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://github.com/google-research/tapas/blob/4908213eb4df7aa988573350278b44c4dbe3f71b/tapas/experiments/prediction_utils.py#L288 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_chinese_ref.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://en.wikipedia.org/wiki/CJK_Unified_Ideographs_ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/tokenization_tapas.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | 
https://github.com/google-research/tapas/blob/master/tapas/utils/constants.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/tokenization_tapas.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://github.com/google-research/tapas/blob/master/tapas/utils/number_utils.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/tokenization_tapas.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://github.com/google-research/tapas/blob/master/tapas/utils/text_utils.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/tokenization_tapas.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://github.com/Microsoft/DynSP/blob/master/util.py#L414 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/tokenization_tapas.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://github.com/Microsoft/DynSP/blob/master/util.py#L293 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/tokenization_tapas.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://github.com/google-research/tapas/blob/master/tapas/utils/number_annotation_utils.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/tapas.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/tapas/modeling_tf_tapas.py | https://github.com/tensorflow/probability | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/modeling_tapas.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/tapas/modeling_tf_tapas.py | https://huggingface.co/models?filter=tapas | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/tapas/modeling_tf_tapas.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/modeling_tapas.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/tapas/modeling_tapas.py | https://huggingface.co/models?filter=tapas | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/tapas/modeling_tapas.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/tapas/modeling_tapas.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/tapas/modeling_tapas.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/tapas/modeling_tapas.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/configuration_tapas.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/tapas/configuration_tapas.py | https://github.com/google-research/tapas/blob/master/tapas/run_task_main.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/configuration_tapas.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/tapas/configuration_tapas.py | https://github.com/google-research/tapas/blob/master/tapas/utils/hparam_utils.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/configuration_tapas.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/tapas/configuration_tapas.py | https://github.com/google-research/tapas/tree/master | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/pegasus/tokenization_pegasus_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5_fast.py | https://huggingface.co/docs/tokenizers/python/latest/components.html?highlight=unigram#models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5_fast.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/byt5/tokenization_byt5.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5_fast.py | https://github.com/google-research/text-to-text-transfer-transformer/blob/9fd7b14a769417be33bc6c850f9598764913c833/t5/data/preprocessors.py#L2117 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/byt5/tokenization_byt5.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5.py | https://github.com/google-research/text-to-text-transfer-transformer/blob/9fd7b14a769417be33bc6c850f9598764913c833/t5/data/preprocessors.py#L2117 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_t5_mlm_flax.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/modeling_tf_t5.py | https://huggingface.co/models?filter=t5 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/switch_transformers/modeling_switch_transformers.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/modeling_tf_t5.py | https://github.com/tensorflow/mesh/blob/0cb87fe07da627bf0b7e60475d59f95ed6b5be3d/mesh_tensorflow/transformer/transformer_layers.py#L593 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/modeling_tf_t5.py | https://github.com/tensorflow/mesh/blob/8d2465e9bc93129b913b5ccc6a59aa97abd96ec6/mesh_tensorflow/transformer/transformer_layers.py#L270 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/modeling_tf_t5.py | https://arxiv.org/abs/1910.10683 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/modeling_tf_t5.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_t5_mlm_flax.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | https://huggingface.co/models?filter=t5 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/open_model_proposals/ADD_BIG_BIRD.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | https://medium.com/huggingface/from-tensorflow-to-pytorch-265f40ef2a28 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/switch_transformers/modeling_switch_transformers.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | https://arxiv.org/abs/1910.07467 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/switch_transformers/modeling_switch_transformers.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | https://github.com/tensorflow/mesh/blob/0cb87fe07da627bf0b7e60475d59f95ed6b5be3d/mesh_tensorflow/transformer/transformer_layers.py#L593 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/switch_transformers/modeling_switch_transformers.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | https://github.com/tensorflow/mesh/blob/fa19d69eafc9a482aff0b59ddd96b025c0cb207d/mesh_tensorflow/layers.py#L1624 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/switch_transformers/modeling_switch_transformers.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | https://github.com/tensorflow/mesh/blob/master/mesh_tensorflow/transformer/transformer_layers.py#L56 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/switch_transformers/modeling_switch_transformers.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | https://github.com/tensorflow/mesh/blob/fa19d69eafc9a482aff0b59ddd96b025c0cb207d/mesh_tensorflow/layers.py#L89 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/switch_transformers/modeling_switch_transformers.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | https://github.com/tensorflow/mesh/blob/fa19d69eafc9a482aff0b59ddd96b025c0cb207d/mesh_tensorflow/transformer/attention.py#L136 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | https://arxiv.org/abs/1910.10683 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/switch_transformers/modeling_switch_transformers.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | https://github.com/tensorflow/mesh/blob/fa19d69eafc9a482aff0b59ddd96b025c0cb207d/mesh_tensorflow/transformer/transformer.py#L586 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mt5/modeling_mt5.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | https://github.com/tensorflow/mesh/blob/fa19d69eafc9a482aff0b59ddd96b025c0cb207d/mesh_tensorflow/layers.py#L666 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/switch_transformers/modeling_switch_transformers.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/modeling_flax_t5.py | https://github.com/tensorflow/mesh/blob/0cb87fe07da627bf0b7e60475d59f95ed6b5be3d/mesh_tensorflow/transformer/transformer_layers.py#L593 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/modeling_flax_t5.py | https://github.com/google/flax/blob/491ce18759622506588784b4fca0e4bf05f8c8cd/flax/linen/attention.py#L252 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/modeling_flax_t5.py | https://arxiv.org/abs/1910.13461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/modeling_flax_t5.py | https://arxiv.org/abs/1910.10683 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/modeling_flax_t5.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | 
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/modeling_flax_t5.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/modeling_flax_t5.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/modeling_flax_t5.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/modeling_flax_t5.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/switch_transformers/modeling_switch_transformers.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/modeling_flax_t5.py | https://github.com/tensorflow/mesh/blob/fa19d69eafc9a482aff0b59ddd96b025c0cb207d/mesh_tensorflow/transformer/transformer.py#L586 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/jax-projects/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/configuration_t5.py | https://huggingface.co/t5-small | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/swin/modeling_swin.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/swin/modeling_swin.py | https://huggingface.co/models?filter=swin | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/swin/modeling_swin.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/swin/modeling_swin.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/image-pretraining/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/swin/modeling_swin.py | https://arxiv.org/abs/2111.09886 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/swin/modeling_swin.py | 
http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/swin/modeling_swin.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/swin/configuration_swin.py | https://huggingface.co/models?filter=swin | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/swin/configuration_swin.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/swin/configuration_swin.py | https://huggingface.co/microsoft/swin-tiny-patch4-window7-224 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/squeezebert/modeling_squeezebert.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/squeezebert/modeling_squeezebert.py | https://arxiv.org/abs/2006.11316 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/squeezebert/modeling_squeezebert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mobilebert/tokenization_mobilebert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/splinter/tokenization_splinter_fast.py | https://github.com/huggingface/transformers/issues/328 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mobilebert/tokenization_mobilebert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/splinter/tokenization_splinter.py | https://github.com/huggingface/transformers/issues/328 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_chinese_ref.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/splinter/tokenization_splinter.py | https://en.wikipedia.org/wiki/CJK_Unified_Ideographs_ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/splinter/configuration_splinter.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/splinter/modeling_splinter.py | https://huggingface.co/models?filter=splinter | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/splinter/modeling_splinter.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/splinter/modeling_splinter.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/splinter/modeling_splinter.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/splinter/configuration_splinter.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/splinter/configuration_splinter.py | https://huggingface.co/models?filter=splinter | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/splinter/tokenization_splinter_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/splinter/configuration_splinter.py | https://huggingface.co/tau/splinter-base | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/speech_to_text_2.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/speech_to_text_2/modeling_speech_to_text_2.py | https://huggingface.co/models?filter=speech2text2 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/speech_to_text_2/modeling_speech_to_text_2.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/speech_to_text_2/modeling_speech_to_text_2.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/speech_to_text_2.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/speech_to_text_2/configuration_speech_to_text_2.py | https://huggingface.co/models?filter=speech2text2 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/speech_to_text/configuration_speech_to_text.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/speech_to_text_2/configuration_speech_to_text_2.py | https://huggingface.co/facebook/s2t-small-librispeech-asr | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/speech_to_text_2/configuration_speech_to_text_2.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/speech_to_text/tokenization_speech_to_text.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/speech_to_text/tokenization_speech_to_text.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/speech_to_text.md | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/speech_to_text/modeling_tf_speech_to_text.py | https://huggingface.co/models?filter=speech_to_text | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/speech_to_text/modeling_tf_speech_to_text.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/speech_to_text/modeling_tf_speech_to_text.py | https://arxiv.org/abs/1911.08460 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/pegasus/modeling_tf_pegasus.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/speech_to_text/modeling_tf_speech_to_text.py | https://github.com/tensorflow/models/blob/a009f4fb9d2fc4949e32192a944688925ef78659/official/transformer/v2/embedding_layer.py#L24 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/speech_to_text/modeling_tf_speech_to_text.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/speech_to_text/modeling_tf_speech_to_text.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/speech_to_text.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/speech_to_text/modeling_speech_to_text.py | https://huggingface.co/models?filter=speech_to_text | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/speech_to_text/modeling_tf_speech_to_text.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/speech_to_text/modeling_speech_to_text.py | https://arxiv.org/abs/1911.08460 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/speech_to_text/modeling_speech_to_text.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/speech_to_text/modeling_speech_to_text.py | https://arxiv.org/abs/1910.13461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/speech_to_text/modeling_speech_to_text.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/speech_to_text.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/speech_to_text/configuration_speech_to_text.py | https://huggingface.co/models?filter=speech_to_text | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/speech_to_text/configuration_speech_to_text.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/speech_to_text/configuration_speech_to_text.py | https://huggingface.co/facebook/s2t-small-librispeech-asr | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/speech_to_text/configuration_speech_to_text.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/speech_encoder_decoder/modeling_speech_encoder_decoder.py | https://arxiv.org/abs/1907.12461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/speech_encoder_decoder/modeling_speech_encoder_decoder.py | https://arxiv.org/abs/2104.06678 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/speech_encoder_decoder/modeling_speech_encoder_decoder.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/speech_encoder_decoder/modeling_flax_speech_encoder_decoder.py | https://arxiv.org/abs/1907.12461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/speech_encoder_decoder/modeling_flax_speech_encoder_decoder.py | https://arxiv.org/abs/2104.06678 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/speech_encoder_decoder/modeling_flax_speech_encoder_decoder.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/speech_encoder_decoder/modeling_flax_speech_encoder_decoder.py | https://pytorch.org/docs/stable/generated/torch.nn.Conv1d.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/sew_d/modeling_sew_d.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/sew_d/modeling_sew_d.py | https://huggingface.co/models?filter=sew-d | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/sew_d/modeling_sew_d.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/sew_d/modeling_sew_d.py | 
https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/sew_d/modeling_sew_d.py | https://pytorch.org/docs/stable/generated/torch.nn.Conv1d.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/sew_d/modeling_sew_d.py | https://arxiv.org/abs/2109.06870 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/sew_d/modeling_sew_d.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/sew_d/modeling_sew_d.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/sew_d/configuration_sew_d.py | https://huggingface.co/models?filter=sew-d | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/sew_d/configuration_sew_d.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/sew_d/configuration_sew_d.py | https://huggingface.co/asapp/sew-d-tiny-100k | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/sew_d/configuration_sew_d.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/sew_d/modeling_sew_d.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/sew/modeling_sew.py | https://huggingface.co/models?filter=sew | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/sew/modeling_sew.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/sew/modeling_sew.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/sew/modeling_sew.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/sew/modeling_sew.py | https://pytorch.org/docs/stable/generated/torch.nn.Conv1d.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/sew/modeling_sew.py | https://arxiv.org/abs/2109.06870 | 参考论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/sew/modeling_sew.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/sew_d/modeling_sew_d.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/sew/configuration_sew.py | https://huggingface.co/models?filter=sew | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/utils/create_dummy_models.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/sew/configuration_sew.py | https://huggingface.co/asapp/sew-tiny-100k | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/sew/configuration_sew.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/segformer/modeling_segformer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/segformer/modeling_segformer.py | https://huggingface.co/models?filter=segformer | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/vision_text_dual_encoder/test_modeling_vision_text_dual_encoder.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/segformer/modeling_segformer.py | https://github.com/rwightman/pytorch-image-models/blob/b9bd960a032c75ca6b808ddeed76bee5f3ed4972/timm/models/layers/helpers.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/donut/modeling_donut_swin.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/segformer/modeling_segformer.py | https://github.com/tensorflow/tpu/issues/494#issuecomment-532968956 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/segformer/modeling_segformer.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/segformer/modeling_segformer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/segformer/modeling_segformer.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/segformer/modeling_segformer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/segformer/configuration_segformer.py | https://huggingface.co/models?filter=segformer | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/segformer/configuration_segformer.py | 
https://huggingface.co/nvidia/segformer-b0-finetuned-ade-512-512 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roformer/tokenization_roformer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roformer/tokenization_utils.py | https://pypi.org/project/rjieba/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roformer/tokenization_roformer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roformer/tokenization_roformer.py | https://pypi.org/project/rjieba/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mobilebert/tokenization_mobilebert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roformer/tokenization_roformer.py | https://github.com/huggingface/transformers/issues/328 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roformer/modeling_roformer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roformer/modeling_tf_roformer.py | https://huggingface.co/models?filter=roformer | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/pegasus/modeling_tf_pegasus.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roformer/modeling_tf_roformer.py | https://github.com/tensorflow/models/blob/a009f4fb9d2fc4949e32192a944688925ef78659/official/transformer/v2/embedding_layer.py#L24 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roformer/modeling_roformer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roformer/modeling_tf_roformer.py | https://kexue.fm/archives/8265 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roformer/modeling_tf_roformer.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roformer/modeling_roformer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roformer/modeling_roformer.py | https://huggingface.co/models?filter=roformer | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roformer/modeling_roformer.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roformer/modeling_roformer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roformer/modeling_roformer.py | https://kexue.fm/archives/8265 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roformer/modeling_roformer.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roformer/modeling_roformer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module 
| 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roformer/modeling_roformer.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roformer/modeling_roformer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roformer/modeling_flax_roformer.py | https://huggingface.co/models?filter=roformer | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roformer/modeling_flax_roformer.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roformer/modeling_flax_roformer.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roformer/modeling_flax_roformer.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roformer/modeling_flax_roformer.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roformer/modeling_flax_roformer.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roformer/modeling_roformer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roformer/configuration_roformer.py | https://huggingface.co/models?filter=roformer | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roformer/configuration_roformer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roformer/configuration_roformer.py | https://huggingface.co/junnyu/roformer_chinese_base | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roberta/tokenization_roberta_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://docs.python.org/3/library/stdtypes.html#bytes.decode | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/src/transformers/models/roberta/tokenization_roberta_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta.py | https://docs.python.org/3/library/stdtypes.html#bytes.decode | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/roberta.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roberta/modeling_tf_roberta.py | https://huggingface.co/models?filter=roberta | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roberta/modeling_tf_roberta.py | https://github.com/tensorflow/mesh/blob/8d2465e9bc93129b913b5ccc6a59aa97abd96ec6/mesh_tensorflow/transformer/transformer_layers.py#L270 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roberta/modeling_tf_roberta.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/roberta.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roberta/modeling_roberta.py | https://huggingface.co/models?filter=roberta | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roberta/modeling_roberta.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roberta/modeling_roberta.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roberta/modeling_roberta.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roberta/modeling_flax_roberta.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roberta/modeling_flax_roberta.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roberta/modeling_flax_roberta.py | 
https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roberta/modeling_flax_roberta.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roberta/modeling_flax_roberta.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/deprecated/retribert/modeling_retribert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/retribert/modeling_retribert.py | https://huggingface.co/models?filter=retribert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/retribert/modeling_retribert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/resnet/modeling_resnet.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/resnet/modeling_resnet.py | https://huggingface.co/models?filter=resnet | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/resnet/modeling_resnet.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/resnet/configuration_resnet.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/resnet/configuration_resnet.py | https://huggingface.co/microsoft/resnet-50 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/pegasus/tokenization_pegasus_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/rembert/tokenization_rembert_fast.py | https://huggingface.co/docs/tokenizers/python/latest/components.html?highlight=unigram#models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/rembert/tokenization_rembert_fast.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/rembert/tokenization_rembert.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/rembert/configuration_rembert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/rembert/modeling_tf_rembert.py | https://huggingface.co/models?filter=rembert | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/rembert/modeling_tf_rembert.py | https://github.com/tensorflow/mesh/blob/8d2465e9bc93129b913b5ccc6a59aa97abd96ec6/mesh_tensorflow/transformer/transformer_layers.py#L270 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/rembert/modeling_tf_rembert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/rembert/configuration_rembert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/rembert/modeling_rembert.py | https://huggingface.co/models?filter=rembert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/rembert/modeling_rembert.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/rembert/modeling_rembert.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/rembert/modeling_rembert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/rembert/modeling_rembert.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/rembert/configuration_rembert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/rembert/configuration_rembert.py | https://huggingface.co/models?filter=rembert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/pegasus/tokenization_pegasus_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/reformer/tokenization_reformer_fast.py | https://huggingface.co/docs/tokenizers/python/latest/components.html?highlight=unigram#models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/reformer/tokenization_reformer_fast.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/reformer/tokenization_reformer.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/reformer/tokenization_reformer.py | 
https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/reformer.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/reformer/modeling_reformer.py | https://huggingface.co/models?filter=reformer | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/reformer/modeling_reformer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/reformer/modeling_reformer.py | https://arxiv.org/pdf/1509.02897.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/reformer/modeling_reformer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/reformer/modeling_reformer.py | https://arxiv.org/pdf/2001.04451.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/reformer/modeling_reformer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/reformer/modeling_reformer.py | https://towardsdatascience.com/illustrating-the-reformer-393575ac6ba0 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/reformer/modeling_reformer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/reformer/modeling_reformer.py | https://github.com/lucidrains/reformer-pytorch/blob/master/reformer_pytorch/reversible.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/reformer/modeling_reformer.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/reformer/modeling_reformer.py | https://arxiv.org/abs/2001.04451 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/reformer/modeling_reformer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mobilebert/tokenization_mobilebert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://github.com/huggingface/transformers/issues/328 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mobilebert/tokenization_mobilebert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm.py | https://github.com/huggingface/transformers/issues/328 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_chinese_ref.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm.py | https://en.wikipedia.org/wiki/CJK_Unified_Ideographs_ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/realm/configuration_realm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/realm/modeling_realm.py | https://huggingface.co/models?filter=realm | 模型相关说明 | 
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/realm/modeling_realm.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/realm/modeling_realm.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/realm/modeling_realm.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/realm/configuration_realm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/realm/configuration_realm.py | https://huggingface.co/models?filter=realm | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/realm/tokenization_realm_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/realm/configuration_realm.py | https://huggingface.co/google/realm-cc-news-pretrained-embedder | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/rag/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/rag/retrieval_rag.py | https://github.com/facebookresearch/DPR | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/rag-end2end-retriever/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/rag/modeling_tf_rag.py | https://arxiv.org/abs/2005.11401 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/rag/modeling_tf_rag.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/rag/modeling_tf_rag.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/rag/modeling_tf_rag.py | https://arxiv.org/pdf/2005.11401.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/rag/modeling_tf_rag.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/rag/modeling_tf_rag.py | https://stackoverflow.com/questions/52129909/tensorflow-equivalent-of-torch-gather | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/rag-end2end-retriever/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/rag/modeling_rag.py | https://arxiv.org/abs/2005.11401 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/rag/modeling_rag.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ja/generation_strategies.md | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/rag/modeling_rag.py | https://arxiv.org/pdf/1610.02424.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/whisper/modeling_whisper.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/rag/modeling_rag.py | https://arxiv.org/abs/2010.00904 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/quantization-qdqbert/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/qdqbert/modeling_qdqbert.py | https://github.com/NVIDIA/TensorRT/tree/master/tools/pytorch-quantization | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/bert.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/qdqbert/modeling_qdqbert.py | https://huggingface.co/models?filter=bert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/qdqbert/modeling_qdqbert.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/qdqbert/modeling_qdqbert.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/qdqbert/modeling_qdqbert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/qdqbert/modeling_qdqbert.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/bert.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/qdqbert/configuration_qdqbert.py | https://huggingface.co/models?filter=bert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/jax-projects/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/qdqbert/configuration_qdqbert.py | https://huggingface.co/bert-base-uncased | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mobilebert/tokenization_mobilebert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/prophetnet/tokenization_prophetnet.py | https://github.com/huggingface/transformers/issues/328 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/prophetnet.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/prophetnet/modeling_prophetnet.py | https://huggingface.co/models?filter=prophetnet | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/prophetnet.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/prophetnet/modeling_prophetnet.py | https://github.com/microsoft/ProphetNet | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/prophetnet/modeling_prophetnet.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/prophetnet/configuration_prophetnet.py | https://arxiv.org/abs/1910.10683 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/poolformer/modeling_poolformer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/poolformer/modeling_poolformer.py | https://huggingface.co/models?filter=poolformer | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/donut/modeling_donut_swin.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/poolformer/modeling_poolformer.py | https://github.com/tensorflow/tpu/issues/494#issuecomment-532968956 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/poolformer/modeling_poolformer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/poolformer.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/poolformer/convert_poolformer_original_to_pytorch.py | https://github.com/sail-sg/poolformer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/poolformer/modeling_poolformer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/poolformer/configuration_poolformer.py | https://huggingface.co/models?filter=poolformer | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/poolformer/configuration_poolformer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/poolformer/configuration_poolformer.py | https://huggingface.co/sail/poolformer_s12 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/plbart/configuration_plbart.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/plbart/modeling_plbart.py | https://huggingface.co/models?filter=plbart | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/plbart/modeling_plbart.py | 
https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/plbart/modeling_plbart.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/plbart/configuration_plbart.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/plbart/configuration_plbart.py | https://huggingface.co/models?filter=plbart | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/plbart/configuration_plbart.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/plbart/configuration_plbart.py | https://huggingface.co/uclanlp/plbart-base | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/plbart/configuration_plbart.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/perceiver/modeling_perceiver.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/perceiver/modeling_perceiver.py | https://huggingface.co/models?filter=perceiver | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/perceiver/modeling_perceiver.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/perceiver/modeling_perceiver.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/perceiver/modeling_perceiver.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/perceiver/modeling_perceiver.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/perceiver/modeling_perceiver.py | https://discuss.pytorch.org/t/is-there-any-layer-like-tensorflows-space-to-depth-function/3487/15 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/perceiver/modeling_perceiver.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/perceiver/modeling_perceiver.py | https://gist.github.com/sumanmichael/4de9dee93f972d47c80c4ade8e149ea6 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/perceiver/modeling_perceiver.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/perceiver/configuration_perceiver.py | https://huggingface.co/models?filter=perceiver | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/src/transformers/models/perceiver/configuration_perceiver.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/perceiver/configuration_perceiver.py | https://huggingface.co/deepmind/language-perceiver | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/pegasus/tokenization_pegasus_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/pegasus/tokenization_pegasus_fast.py | https://huggingface.co/docs/tokenizers/python/latest/components.html?highlight=unigram#models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/pegasus/tokenization_pegasus_fast.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/pegasus.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/pegasus/tokenization_pegasus_fast.py | https://arxiv.org/pdf/1912.08777.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/pegasus/tokenization_pegasus.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/pegasus/tokenization_pegasus_fast.py | https://github.com/google-research/pegasus/blob/939830367bcf411193d2b5eca2f2f90f3f9260ca/pegasus/ops/pretrain_parsing_ops.cc#L66 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/pegasus/tokenization_pegasus.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/pegasus.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/pegasus/tokenization_pegasus.py | https://arxiv.org/pdf/1912.08777.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/pegasus/tokenization_pegasus.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/pegasus/tokenization_pegasus.py | https://github.com/google-research/pegasus/blob/939830367bcf411193d2b5eca2f2f90f3f9260ca/pegasus/ops/pretrain_parsing_ops.cc#L66 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/pegasus/tokenization_pegasus.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/pegasus/modeling_tf_pegasus.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_tf_pegasus.py | https://github.com/tensorflow/models/blob/a009f4fb9d2fc4949e32192a944688925ef78659/official/transformer/v2/embedding_layer.py#L24 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_tf_pegasus.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_tf_pegasus.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/pegasus.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_pegasus.py | https://huggingface.co/models?filter=pegasus | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_pegasus.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_pegasus.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_flax_pegasus.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_flax_pegasus.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_flax_pegasus.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_flax_pegasus.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_flax_pegasus.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_flax_pegasus.py | https://arxiv.org/abs/1910.13461 | 参考论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_flax_pegasus.py | https://github.com/google/flax/blob/491ce18759622506588784b4fca0e4bf05f8c8cd/flax/linen/attention.py#L252 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_flax_pegasus.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/pegasus.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/pegasus/configuration_pegasus.py | https://huggingface.co/models?filter=pegasus | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/pegasus/configuration_pegasus.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/pegasus/configuration_pegasus.py | https://huggingface.co/google/pegasus-large | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/pegasus/configuration_pegasus.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/openai-gpt.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/openai/modeling_tf_openai.py | https://huggingface.co/models?filter=openai-gpt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/openai/modeling_tf_openai.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/openai-gpt.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/openai/modeling_openai.py | https://huggingface.co/models?filter=openai-gpt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/openai/modeling_openai.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/openai/modeling_openai.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/openai/configuration_openai.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/openai/configuration_openai.py | https://huggingface.co/openai-gpt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/nystromformer/configuration_nystromformer.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/nystromformer/modeling_nystromformer.py | https://huggingface.co/models?filter=nystromformer | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/nystromformer/modeling_nystromformer.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/nystromformer/modeling_nystromformer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/nystromformer/configuration_nystromformer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/nystromformer/configuration_nystromformer.py | https://huggingface.co/models?filter=nystromformer | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/nystromformer/configuration_nystromformer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/nystromformer/configuration_nystromformer.py | https://huggingface.co/uw-madison/nystromformer-512 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/mt5.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mt5/configuration_mt5.py | https://huggingface.co/google/mt5-small | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mobilebert/tokenization_mobilebert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mpnet/tokenization_mpnet_fast.py | https://github.com/huggingface/transformers/issues/328 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mobilebert/tokenization_mobilebert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mpnet/tokenization_mpnet.py | https://github.com/huggingface/transformers/issues/328 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_chinese_ref.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mpnet/tokenization_mpnet.py | https://en.wikipedia.org/wiki/CJK_Unified_Ideographs_ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mpnet/modeling_tf_mpnet.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mpnet/modeling_mpnet.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mpnet/modeling_mpnet.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/src/transformers/models/mobilebert/modeling_tf_mobilebert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mobilebert/modeling_tf_mobilebert.py | https://huggingface.co/models?filter=mobilebert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mobilebert/modeling_tf_mobilebert.py | https://arxiv.org/abs/2004.02984 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mobilebert/modeling_tf_mobilebert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mobilebert/modeling_mobilebert.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mobilebert/modeling_mobilebert.py | https://arxiv.org/abs/2004.02984 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mobilebert/modeling_mobilebert.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mobilebert/modeling_mobilebert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mobilebert/modeling_mobilebert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mobilebert/modeling_mobilebert.py | https://arxiv.org/pdf/2004.02984.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/deprecated/mmbt/modeling_mmbt.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mmbt/modeling_mmbt.py | https://github.com/facebookresearch/mmbt | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mmbt/modeling_mmbt.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mluke/tokenization_mluke.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mluke/tokenization_mluke.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mluke/tokenization_mluke.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mluke/tokenization_mluke.py | https://github.com/huggingface/transformers/pull/2778 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/layoutxlm/tokenization_layoutxlm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mluke/tokenization_mluke.py | https://github.com/huggingface/transformers/pull/2674 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/codeparrot/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/megatron_gpt2/convert_megatron_gpt2_checkpoint.py | https://github.com/NVIDIA/Megatron-LM | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ja/perf_train_gpu_one.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/megatron_gpt2/convert_megatron_gpt2_checkpoint.py | https://github.com/microsoft/Megatron-DeepSpeed/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/megatron_bert/convert_megatron_bert_checkpoint.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/megatron_gpt2/convert_megatron_gpt2_checkpoint.py | https://github.com/NVIDIA/Megatron-LM/blob/v2.4/megatron/checkpointing.py#L209 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/megatron_gpt2/convert_megatron_gpt2_checkpoint.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/megatron_gpt2/convert_megatron_gpt2_checkpoint.py | https://github.com/huggingface/transformers/issues/13906 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/megatron_bert/modeling_megatron_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/megatron_bert/modeling_megatron_bert.py | https://huggingface.co/models?filter=megatron_bert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/megatron_bert/modeling_megatron_bert.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/megatron_bert/modeling_megatron_bert.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/megatron_bert/modeling_megatron_bert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/megatron_bert/modeling_megatron_bert.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/codeparrot/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/megatron_bert/convert_megatron_bert_checkpoint.py | https://github.com/NVIDIA/Megatron-LM | 源码实现 | -| 开源代码引入 
| https://github.com/huggingface/transformers/blob/master/docs/source/ja/perf_train_gpu_one.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/megatron_bert/convert_megatron_bert_checkpoint.py | https://github.com/microsoft/Megatron-DeepSpeed/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/megatron_bert/convert_megatron_bert_checkpoint.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/megatron_bert/convert_megatron_bert_checkpoint.py | https://github.com/NVIDIA/Megatron-LM/blob/v2.4/megatron/checkpointing.py#L209 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/bert.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/megatron_bert/configuration_megatron_bert.py | https://huggingface.co/models?filter=bert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/megatron_bert/configuration_megatron_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/megatron_bert/configuration_megatron_bert.py | https://huggingface.co/nvidia/megatron-bert-uncased-345m | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/question-answering/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/megatron_bert/configuration_megatron_bert.py | https://arxiv.org/abs/1803.02155 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/question-answering/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/megatron_bert/configuration_megatron_bert.py | https://arxiv.org/abs/2009.13658 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/layoutxlm/tokenization_layoutxlm_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mbart50/tokenization_mbart50_fast.py | https://huggingface.co/docs/tokenizers/python/latest/components.html?highlight=BPE#models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mbart50/tokenization_mbart50.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mbart50/tokenization_mbart50.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/layoutxlm/tokenization_layoutxlm_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mbart/tokenization_mbart_fast.py | https://huggingface.co/docs/tokenizers/python/latest/components.html?highlight=BPE#models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mbart/tokenization_mbart.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mbart/modeling_tf_mbart.py | 
https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mbart/modeling_tf_mbart.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/mbart.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mbart/modeling_mbart.py | https://huggingface.co/models?filter=mbart | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mbart/modeling_mbart.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mbart/modeling_mbart.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mbart/modeling_flax_mbart.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mbart/modeling_flax_mbart.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mbart/modeling_flax_mbart.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mbart/modeling_flax_mbart.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mbart/modeling_flax_mbart.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mbart/modeling_flax_mbart.py | https://arxiv.org/abs/1910.13461 | 参考论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mbart/modeling_flax_mbart.py | https://github.com/google/flax/blob/491ce18759622506588784b4fca0e4bf05f8c8cd/flax/linen/attention.py#L252 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mbart/modeling_flax_mbart.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/mbart.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mbart/configuration_mbart.py | https://huggingface.co/models?filter=mbart | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/tensorflow/translation/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mbart/configuration_mbart.py | https://huggingface.co/facebook/mbart-large-cc25 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mbart/configuration_mbart.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/maskformer/modeling_maskformer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/maskformer/modeling_maskformer.py | https://huggingface.co/models?filter=maskformer | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/yolos/modeling_yolos.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/maskformer/modeling_maskformer.py | https://arxiv.org/abs/1708.02002 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/maskformer/modeling_maskformer.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/maskformer/modeling_maskformer.py | https://arxiv.org/abs/2107.06278 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/maskformer/modeling_maskformer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/maskformer/modeling_maskformer.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/maskformer/modeling_maskformer.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/maskformer.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/maskformer/convert_maskformer_original_pytorch_checkpoint_to_pytorch.py | https://github.com/facebookresearch/MaskFormer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/maskformer/modeling_maskformer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/maskformer/configuration_maskformer.py | https://huggingface.co/models?filter=maskformer | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/maskformer/configuration_maskformer.py | https://huggingface.co/datasets/scene_parse_150 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/maskformer/configuration_maskformer.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/maskformer/configuration_maskformer.py | https://huggingface.co/microsoft/swin-base-patch4-window12-384-in22k | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/maskformer/configuration_maskformer.py | https://huggingface.co/facebook/detr-resnet-50 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/marian/tokenization_marian.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/marian/tokenization_marian.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/pegasus/modeling_tf_pegasus.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/marian/modeling_tf_marian.py | https://github.com/tensorflow/models/blob/a009f4fb9d2fc4949e32192a944688925ef78659/official/transformer/v2/embedding_layer.py#L24 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/marian/modeling_tf_marian.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/marian/modeling_marian.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/marian/modeling_tf_marian.py | https://huggingface.co/models?search=Helsinki-NLP | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/marian/modeling_tf_marian.py | 
https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/marian.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/marian/modeling_marian.py | https://huggingface.co/models?filter=marian | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/marian/modeling_marian.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/marian/modeling_marian.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/marian/modeling_marian.py | https://huggingface.co/models?search=Helsinki-NLP | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/marian/modeling_marian.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/marian/modeling_flax_marian.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/marian/modeling_flax_marian.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/marian/modeling_flax_marian.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/marian/modeling_flax_marian.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/marian/modeling_flax_marian.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/marian/modeling_flax_marian.py | https://arxiv.org/abs/1910.13461 | 参考论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/marian/modeling_flax_marian.py | https://github.com/google/flax/blob/491ce18759622506588784b4fca0e4bf05f8c8cd/flax/linen/attention.py#L252 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/marian/modeling_flax_marian.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/marian/convert_marian_to_pytorch.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/marian/convert_marian_to_pytorch.py | https://en.wikipedia.org/wiki/Insular_Celtic_languages | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/marian.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/marian/convert_marian_to_pytorch.py | https://github.com/Helsinki-NLP/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/seq2seq/romanian_postprocessing.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/marian/convert_marian_to_pytorch.py | git@github.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/seq2seq/romanian_postprocessing.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/marian/convert_marian_tatoeba_to_pytorch.py | git@github.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/marian.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/marian/configuration_marian.py | https://huggingface.co/models?filter=marian | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/marian/tokenization_marian.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/marian/configuration_marian.py | https://huggingface.co/Helsinki-NLP/opus-mt-en-de | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/marian/configuration_marian.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/m2m_100/tokenization_m2m_100.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/m2m_100/tokenization_m2m_100.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/m2m_100/configuration_m2m_100.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/m2m_100/modeling_m2m_100.py | https://huggingface.co/models?filter=m2m_100 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/m2m_100/modeling_m2m_100.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/m2m_100/modeling_m2m_100.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/m2m_100/configuration_m2m_100.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/m2m_100/configuration_m2m_100.py | https://huggingface.co/models?filter=m2m_100 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/m2m_100/tokenization_m2m_100.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/m2m_100/configuration_m2m_100.py | https://huggingface.co/facebook/m2m100_418M | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/m2m_100/configuration_m2m_100.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/lxmert/modeling_tf_lxmert.py | https://arxiv.org/abs/1908.07490 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/lxmert/modeling_tf_lxmert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/lxmert/modeling_lxmert.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/lxmert/modeling_lxmert.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/lxmert/modeling_lxmert.py | https://arxiv.org/abs/1908.07490 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/lxmert/modeling_lxmert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mluke/tokenization_mluke.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/luke/tokenization_luke.py | 
https://github.com/huggingface/transformers/pull/2778 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/layoutxlm/tokenization_layoutxlm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/luke/tokenization_luke.py | https://github.com/huggingface/transformers/pull/2674 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/luke/modeling_luke.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/luke/modeling_luke.py | https://huggingface.co/models?filter=luke | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/luke/modeling_luke.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/luke/configuration_luke.py | https://arxiv.org/abs/2010.01057 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/longformer.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/longformer/modeling_tf_longformer.py | https://huggingface.co/models?filter=longformer | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/longformer/modeling_tf_longformer.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/longformer/modeling_tf_longformer.py | https://arxiv.org/abs/2004.05150 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/longformer.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/longformer/modeling_longformer.py | https://huggingface.co/models?filter=longformer | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/longformer/modeling_longformer.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/longformer/modeling_longformer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/longformer/modeling_longformer.py | https://arxiv.org/abs/2004.05150 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/longformer/configuration_longformer.py | https://huggingface.co/roberta-base | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/led/modeling_tf_led.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/led/modeling_tf_led.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/led/configuration_led.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/led/modeling_led.py | https://huggingface.co/models?filter=led | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/led/modeling_led.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/led/modeling_led.py | https://arxiv.org/abs/1910.13461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/led/modeling_led.py | https://arxiv.org/abs/2004.05150 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/led/modeling_led.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/led/modeling_led.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/led/modeling_led.py | https://github.com/huggingface/transformers/blob/ac3cb660cad283163f7c73cad511124e845ca388/src/transformers/models/bart/modeling_bart.py#L1153 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/led/configuration_led.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/led/configuration_led.py | https://huggingface.co/models?filter=led | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/led/configuration_led.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/led/configuration_led.py | https://huggingface.co/allenai/led-base-16384 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/led/configuration_led.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/layoutxlm/tokenization_layoutxlm_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/layoutxlm/tokenization_layoutxlm_fast.py | https://huggingface.co/docs/tokenizers/python/latest/components.html?highlight=BPE#models | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/layoutxlm/tokenization_layoutxlm.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/layoutxlm/tokenization_layoutxlm.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/layoutxlm/tokenization_layoutxlm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/layoutxlm/tokenization_layoutxlm.py | https://github.com/huggingface/transformers/pull/2674 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mobilebert/tokenization_mobilebert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/layoutlmv2/tokenization_layoutlmv2_fast.py | https://github.com/huggingface/transformers/issues/328 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/layoutxlm/tokenization_layoutxlm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/layoutlmv2/tokenization_layoutlmv2.py | https://github.com/huggingface/transformers/pull/2674 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mobilebert/tokenization_mobilebert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/layoutlmv2/tokenization_layoutlmv2.py | https://github.com/huggingface/transformers/issues/328 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_chinese_ref.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/layoutlmv2/tokenization_layoutlmv2.py | https://en.wikipedia.org/wiki/CJK_Unified_Ideographs_ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/layoutlmv2/configuration_layoutlmv2.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/layoutlmv2/modeling_layoutlmv2.py | https://huggingface.co/models?filter=layoutlmv2 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/switch_transformers/modeling_switch_transformers.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/layoutlmv2/modeling_layoutlmv2.py | https://github.com/tensorflow/mesh/blob/0cb87fe07da627bf0b7e60475d59f95ed6b5be3d/mesh_tensorflow/transformer/transformer_layers.py#L593 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/layoutlmv2/modeling_layoutlmv2.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/layoutlmv2/modeling_layoutlmv2.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/layoutlm.md | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/layoutlmv2/modeling_layoutlmv2.py | https://www.cs.cmu.edu/~aharley/rvl-cdip/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/layoutlmv3/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/layoutlmv2/modeling_layoutlmv2.py | https://guillaumejaume.github.io/FUNS | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/layoutlmv3/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/layoutlmv2/modeling_layoutlmv2.py | https://github.com/clovaai/co | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/tasks/document_question_answering.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/layoutlmv2/modeling_layoutlmv2.py | https://rrc.cvc.uab.es/?ch=17 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/layoutlmv2/configuration_layoutlmv2.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/layoutlmv2/configuration_layoutlmv2.py | https://huggingface.co/models?filter=layoutlmv2 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/tasks/document_question_answering.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/layoutlmv2/configuration_layoutlmv2.py | https://huggingface.co/microsoft/layoutlmv2-base-uncased | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/layoutlmv2/configuration_layoutlmv2.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/layoutlmv2/configuration_layoutlmv2.py | https://github.com/microsoft/unilm/blob/master/layoutlmft/layoutlmft/models/layoutlmv2/detectron2_config.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/layoutlm/modeling_tf_layoutlm.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/layoutlm/modeling_layoutlm.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/layoutlm/modeling_layoutlm.py | https://arxiv.org/abs/1912.13318 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/layoutlm/modeling_layoutlm.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/layoutlm.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/layoutlm/modeling_layoutlm.py | https://www.cs.cmu.edu/~aharley/rvl-cdip/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/layoutlmv3/README.md | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/layoutlm/modeling_layoutlm.py | https://guillaumejaume.github.io/FUNSD/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/layoutlm.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/layoutlm/modeling_layoutlm.py | https://rrc.cvc.uab.es/?ch=13 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/layoutlm/tokenization_layoutlm_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/layoutlm/configuration_layoutlm.py | https://huggingface.co/microsoft/layoutlm-base-uncased | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/imagegpt/modeling_imagegpt.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/imagegpt/modeling_imagegpt.py | https://huggingface.co/models?filter=imagegpt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/imagegpt/modeling_imagegpt.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/imagegpt/modeling_imagegpt.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/gpt2.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/imagegpt/modeling_imagegpt.py | https://openai.com/blog/better-language-models/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/imagegpt/modeling_imagegpt.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/imagegpt/modeling_imagegpt.py | https://github.com/NVIDIA/Megatron-LM/blob/main/megatron/model/gpt_model.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/imagegpt/modeling_imagegpt.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/imagegpt/modeling_imagegpt.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/ibert/modeling_ibert.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/ibert/modeling_ibert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/ibert/modeling_ibert.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/question-answering/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/ibert/configuration_ibert.py | https://arxiv.org/abs/1803.02155 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/question-answering/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/ibert/configuration_ibert.py | https://arxiv.org/abs/2009.13658 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/hubert/modeling_hubert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/hubert/modeling_tf_hubert.py | https://huggingface.co/models?filter=hubert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/hubert/modeling_tf_hubert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/hubert/modeling_tf_hubert.py | https://github.com/tensorflow/tensorflow/issues/9260 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/hubert/modeling_tf_hubert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/hubert/modeling_tf_hubert.py | https://github.com/pytorch/fairseq/blob/e0788f7007a8473a76db573985031f3c94201e79/fairseq/data/data_utils.py#L376 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/hubert/modeling_tf_hubert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/hubert/modeling_tf_hubert.py | https://www.tensorflow.org/addons/api_docs/python/tfa/layers/GroupNormalization | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/hubert/modeling_tf_hubert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/hubert/modeling_tf_hubert.py | https://www.tensorflow.org/probability/api_docs/python/tfp/layers/weight_norm/WeightNorm | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/hubert/modeling_tf_hubert.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/hubert/modeling_tf_hubert.py | https://pytorch.org/docs/stable/generated/torch.nn.Conv1d.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/hubert/modeling_tf_hubert.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/hubert/modeling_tf_hubert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/src/transformers/models/hubert/modeling_hubert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/hubert/modeling_hubert.py | https://huggingface.co/models?filter=hubert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/hubert/modeling_hubert.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/hubert/modeling_hubert.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/hubert/modeling_hubert.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/hubert/modeling_hubert.py | https://pytorch.org/docs/stable/generated/torch.nn.Conv1d.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/hubert/modeling_hubert.py | https://arxiv.org/abs/2106.07447 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/hubert/modeling_hubert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/audio-classification/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/hubert/modeling_hubert.py | https://huggingface.co/facebook/hubert-base-ls960 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/hubert/modeling_hubert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/hubert/configuration_hubert.py | https://huggingface.co/models?filter=hubert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/audio-classification/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/hubert/configuration_hubert.py | https://huggingface.co/facebook/hubert-base-ls960 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/hubert/configuration_hubert.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/gptj/modeling_tf_gptj.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gptj/modeling_gptj.py | https://huggingface.co/models?filter=gptj | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gptj/modeling_gptj.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gptj/modeling_gptj.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/codegen/modeling_codegen.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gptj/modeling_gptj.py | https://github.com/EleutherAI/gpt-neo/blob/89ce74164da2fb16179106f54e2269b5da8db333/models/gpt2/gpt2.py#L179 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gptj/modeling_flax_gptj.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gptj/modeling_flax_gptj.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gptj/modeling_flax_gptj.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gptj/modeling_flax_gptj.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gptj/modeling_flax_gptj.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gptj/modeling_flax_gptj.py | https://github.com/google/flax/blob/491ce18759622506588784b4fca0e4bf05f8c8cd/flax/linen/attention.py#L252 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/gptj/configuration_gptj.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gptj/configuration_gptj.py | https://huggingface.co/models?filter=gpt_j 
| 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/es/tasks/language_modeling.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gptj/configuration_gptj.py | https://huggingface.co/EleutherAI/gpt-j-6B | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roberta/tokenization_roberta_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://docs.python.org/3/library/stdtypes.html#bytes.decode | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roberta/tokenization_roberta_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2.py | https://docs.python.org/3/library/stdtypes.html#bytes.decode | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/gpt2.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_tf_gpt2.py | https://huggingface.co/models?filter=gpt2 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_tf_gpt2.py | https://github.com/tensorflow/mesh/blob/8d2465e9bc93129b913b5ccc6a59aa97abd96ec6/mesh_tensorflow/transformer/transformer_layers.py#L270 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_tf_gpt2.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/gpt2.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_gpt2.py | https://huggingface.co/models?filter=gpt2 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_gpt2.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_gpt2.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/gpt2.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_gpt2.py | https://openai.com/blog/better-language-models/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/imagegpt/modeling_imagegpt.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_gpt2.py | https://github.com/NVIDIA/Megatron-LM/blob/main/megatron/model/gpt_model.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_gpt2.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_flax_gpt2.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_flax_gpt2.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_flax_gpt2.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_flax_gpt2.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_flax_gpt2.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_flax_gpt2.py | https://github.com/google/flax/blob/491ce18759622506588784b4fca0e4bf05f8c8cd/flax/linen/attention.py#L252 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt2/configuration_gpt2.py | https://huggingface.co/gpt2 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/gpt_neox/configuration_gpt_neox.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt_neo/modeling_gpt_neo.py | https://huggingface.co/models?filter=gpt_neo | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt_neo/modeling_gpt_neo.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt_neo/modeling_gpt_neo.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt_neo/modeling_gpt_neo.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/codegen/modeling_codegen.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt_neo/modeling_gpt_neo.py | https://github.com/EleutherAI/gpt-neo/blob/89ce74164da2fb16179106f54e2269b5da8db333/models/gpt2/gpt2.py#L179 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt_neo/modeling_flax_gpt_neo.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt_neo/modeling_flax_gpt_neo.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt_neo/modeling_flax_gpt_neo.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt_neo/modeling_flax_gpt_neo.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt_neo/modeling_flax_gpt_neo.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt_neo/modeling_flax_gpt_neo.py | https://github.com/google/flax/blob/491ce18759622506588784b4fca0e4bf05f8c8cd/flax/linen/attention.py#L252 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/gpt_neox/configuration_gpt_neox.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt_neo/configuration_gpt_neo.py | https://huggingface.co/models?filter=gpt_neo | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/gpt_neo/configuration_gpt_neo.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt_neo/configuration_gpt_neo.py | https://huggingface.co/EleutherAI/gpt-neo-1.3B | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/funnel/modeling_tf_funnel.py | https://arxiv.org/abs/2006.03236 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/funnel/modeling_tf_funnel.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/funnel/modeling_funnel.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/funnel/modeling_funnel.py | https://arxiv.org/abs/2006.03236 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/funnel/modeling_funnel.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/funnel/tokenization_funnel_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/funnel/configuration_funnel.py | https://huggingface.co/funnel-transformer/small | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/fsmt/tokenization_fsmt.py | https://github.com/moses-smt/mosesdecoder/blob/master/scripts/tokenizer/replace-unicode-punctuation.perl | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/fsmt/tokenization_fsmt.py | https://github.com/moses-smt/mosesdecoder/blob/master/scripts/tokenizer/remove-non-printing-char.perl | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/fsmt/tokenization_fsmt.py | https://github.com/alvations/sacremoses | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/fsmt.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/fsmt/modeling_fsmt.py | https://github.com/pytorch/fairseq/tree/master/examples/wmt19 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/fsmt.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/fsmt/modeling_fsmt.py | https://arxiv.org/abs/1907.06616 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/fsmt/modeling_fsmt.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/fsmt/modeling_fsmt.py | https://huggingface.co/models?filter=fsmt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/eval-facebook-wmt19.sh | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/fsmt/modeling_fsmt.py | http://matrix.statmt.org/matrix/output/1914?score_id=37605 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-facebook-wmt19.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/fsmt/modeling_fsmt.py | http://matrix.statmt.org/matrix/output/1907?run_id=6937 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-facebook-wmt19.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/fsmt/modeling_fsmt.py | http://matrix.statmt.org/matrix/output/1902?run_id=6750 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-facebook-wmt19.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/fsmt/modeling_fsmt.py | http://matrix.statmt.org/matrix/output/1909?run_id=6862 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/fsmt/modeling_fsmt.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/fsmt/modeling_fsmt.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/pegasus/tokenization_pegasus_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/fnet/tokenization_fnet_fast.py | https://huggingface.co/docs/tokenizers/python/latest/components.html?highlight=unigram#models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/fnet/tokenization_fnet_fast.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/fnet/tokenization_fnet.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/fnet/tokenization_fnet.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/fnet/modeling_fnet.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/fnet/modeling_fnet.py | https://huggingface.co/models?filter=fnet | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/fnet/modeling_fnet.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/fnet/modeling_fnet.py | https://github.com/google-research/google-research/blob/master/f_net/fourier.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/fnet/modeling_fnet.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/fnet/modeling_fnet.py | https://pytorch.org/docs/master/generated/torch.vmap.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/fnet/modeling_fnet.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/fnet/modeling_fnet.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/fnet/modeling_fnet.py | https://arxiv.org/abs/2105.03824 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/fnet/modeling_fnet.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/fnet/configuration_fnet.py | https://huggingface.co/models?filter=fnet | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/fnet/tokenization_fnet.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/fnet/configuration_fnet.py | https://huggingface.co/google/fnet-base | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/flaubert/tokenization_flaubert.py | https://github.com/alvations/sacremoses | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/flaubert.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/flaubert/modeling_tf_flaubert.py | https://huggingface.co/models?filter=flaubert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/flaubert/modeling_tf_flaubert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/flaubert.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/flaubert/modeling_flaubert.py | https://huggingface.co/models?filter=flaubert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/flaubert/modeling_flaubert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/configuration_xlm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/flaubert/configuration_flaubert.py | http://huggingface.co/transformers/multilingual.html#xlm-language-embeddings | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/encoder_decoder/modeling_tf_encoder_decoder.py | https://arxiv.org/abs/1907.12461 | 参考论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/encoder_decoder/modeling_tf_encoder_decoder.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/encoder_decoder/modeling_flax_encoder_decoder.py | https://arxiv.org/abs/1907.12461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/encoder_decoder/modeling_flax_encoder_decoder.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/encoder_decoder/modeling_encoder_decoder.py | https://arxiv.org/abs/1907.12461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/encoder_decoder/modeling_encoder_decoder.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/electra.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/electra/modeling_tf_electra.py | https://huggingface.co/models?filter=electra | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/electra/modeling_tf_electra.py | https://github.com/tensorflow/mesh/blob/8d2465e9bc93129b913b5ccc6a59aa97abd96ec6/mesh_tensorflow/transformer/transformer_layers.py#L270 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/electra/modeling_tf_electra.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/electra/modeling_flax_electra.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/electra/modeling_flax_electra.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/electra/modeling_flax_electra.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/electra/modeling_flax_electra.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/electra/modeling_flax_electra.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/electra.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/electra/modeling_electra.py | https://huggingface.co/models?filter=electra | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/electra/modeling_electra.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/electra/modeling_electra.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/electra/modeling_electra.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/electra/configuration_electra.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/electra/configuration_electra.py | https://huggingface.co/google/electra-small-discriminator | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/question-answering/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/electra/configuration_electra.py | https://arxiv.org/abs/1803.02155 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/question-answering/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/electra/configuration_electra.py | https://arxiv.org/abs/2009.13658 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/dpr/modeling_tf_dpr.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/dpr/modeling_dpr.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/dpr/modeling_dpr.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/dpr/convert_dpr_original_checkpoint_to_pytorch.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/dpr/convert_dpr_original_checkpoint_to_pytorch.py | https://github.com/huggingface/transformers/commit/614fef1691edb806de976756d4948ecbcd0c0ca3 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/rag/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/dpr/convert_dpr_original_checkpoint_to_pytorch.py | https://github.com/facebookresearch/DPR | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/question-answering/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/dpr/configuration_dpr.py | https://arxiv.org/abs/1803.02155 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/question-answering/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/dpr/configuration_dpr.py | https://arxiv.org/abs/2009.13658 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/distilbert.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/distilbert/modeling_tf_distilbert.py | https://huggingface.co/models?filter=distilbert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/distilbert/modeling_tf_distilbert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/distilbert/modeling_flax_distilbert.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/distilbert/modeling_flax_distilbert.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/distilbert/modeling_flax_distilbert.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/distilbert/modeling_flax_distilbert.py | 
https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/distilbert/modeling_flax_distilbert.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/distillation/distiller.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/distilbert/modeling_distilbert.py | https://github.com/facebookresearch/XLM | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/distilbert/modeling_distilbert.py | https://github.com/google-research/bert | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/distilbert.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/distilbert/modeling_distilbert.py | https://huggingface.co/models?filter=distilbert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/distilbert/modeling_distilbert.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/distilbert/modeling_distilbert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/distilbert/configuration_distilbert.py | https://huggingface.co/distilbert-base-uncased | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/detr/modeling_detr.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | https://huggingface.co/models?filter=detr | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/detr/modeling_detr.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | https://github.com/facebookresearch/detr/blob/master/backbone.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/yolos/modeling_yolos.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | https://github.com/facebookresearch/detr/blob/master/models/detr.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/detr/modeling_detr.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | https://github.com/facebookresearch/detr/blob/master/models/segmentation.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/yolos/modeling_yolos.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | https://arxiv.org/abs/1708.02002 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/yolos/modeling_yolos.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | https://github.com/facebookresearch/detr/issues/108#issuecomment-650269223 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/detr/modeling_detr.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | https://github.com/facebookresearch/detr/blob/master/models/matcher.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/detr/modeling_detr.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | https://github.com/facebookresearch/detr/blob/master/util/box_ops.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/detr.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | https://giou.stanford.edu/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/detr/modeling_detr.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | https://github.com/facebookresearch/detr/blob/master/util/misc.py#L306 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/detr/modeling_detr.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/detr/feature_extraction_detr.py | https://github.com/facebookresearch/detr/blob/master/util/box_ops.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/yolos/image_processing_yolos.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/detr/feature_extraction_detr.py | https://github.com/pytorch/pytorch/issues/50276 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/image_transforms.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/detr/feature_extraction_detr.py | 
https://github.com/cocodataset/panopticapi/blob/master/panopticapi/utils.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/detr/image_processing_detr.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/detr/feature_extraction_detr.py | https://github.com/facebookresearch/detr/blob/master/datasets/coco.py#L33 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/detr/image_processing_detr.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/detr/feature_extraction_detr.py | https://github.com/facebookresearch/detr/blob/master/datasets/coco.py#L50 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/detr/image_processing_detr.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/detr/feature_extraction_detr.py | https://github.com/facebookresearch/detr/blob/master/models/detr.py#L258 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/detr/image_processing_detr.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/detr/feature_extraction_detr.py | https://github.com/facebookresearch/detr/blob/master/models/segmentation.py#L218 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/detr/image_processing_detr.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/detr/feature_extraction_detr.py | https://github.com/facebookresearch/detr/blob/master/models/segmentation.py#L241 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/detr/modeling_detr.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/detr/configuration_detr.py | https://huggingface.co/models?filter=detr | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/detr/configuration_detr.py | https://huggingface.co/facebook/detr-resnet-50 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/detr/configuration_detr.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/detr/configuration_detr.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/detr/configuration_detr.py | https://rwightman.github.io/pytorch-image-models/#load-a-pretrained-model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/deit/modeling_deit.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deit/modeling_deit.py | https://huggingface.co/models?filter=deit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vit/modeling_tf_vit.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deit/modeling_deit.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/vision_transformer.py | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deit/modeling_deit.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deit/modeling_deit.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/image-pretraining/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deit/modeling_deit.py | https://arxiv.org/abs/2111.09886 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deit/modeling_deit.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/deit/convert_deit_timm_to_pytorch.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deit/convert_deit_timm_to_pytorch.py | https://github.com/facebookresearch/deit/blob/ab5715372db8c6cad5740714b2216d55aeae052e/datasets.py#L103 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/deit/modeling_deit.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deit/configuration_deit.py | https://huggingface.co/models?filter=deit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/deit/configuration_deit.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deit/configuration_deit.py | https://huggingface.co/facebook/deit-base-distilled-patch16-224 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta_v2/tokenization_deberta_v2.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta_v2/tokenization_deberta_v2.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/deberta_v2/modeling_tf_deberta_v2.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta_v2/modeling_tf_deberta_v2.py | https://huggingface.co/models?filter=deberta-v2 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta_v2/modeling_tf_deberta_v2.py | https://arxiv.org/abs/2006.03654 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta_v2/modeling_tf_deberta_v2.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta_v2/modeling_deberta_v2.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta_v2/modeling_deberta_v2.py | https://arxiv.org/abs/2006.03654 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta_v2/modeling_deberta_v2.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/deberta/tokenization_deberta_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta_v2/configuration_deberta_v2.py | https://huggingface.co/microsoft/deberta-base | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/deberta/modeling_tf_deberta.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta/modeling_tf_deberta.py | https://huggingface.co/models?filter=DeBERTa | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta/modeling_tf_deberta.py | https://arxiv.org/abs/2006.03654 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta/modeling_tf_deberta.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta/modeling_deberta.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta/modeling_deberta.py | https://arxiv.org/abs/2006.03654 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta/modeling_deberta.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/deberta/tokenization_deberta_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta/configuration_deberta.py | https://huggingface.co/microsoft/deberta-base | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/data2vec/modeling_data2vec_text.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_text.py | https://huggingface.co/models?filter=data2vec-text | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_text.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/data2vec.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_text.py | https://arxiv.org/pdf/2202.03555 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_text.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_text.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/data2vec/modeling_data2vec_audio.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_audio.py | https://huggingface.co/models?filter=data2vec-audio | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_audio.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_audio.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_audio.py | https://pytorch.org/docs/stable/generated/torch.nn.Conv1d.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/data2vec.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_audio.py | https://arxiv.org/pdf/2202.03555 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_audio.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/data2vec/modeling_data2vec_audio.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_audio.py | https://huggingface.co/facebook/data2vec-audio-base-960h | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/data2vec/convert_data2vec_text_original_pytorch_checkpoint_to_pytorch.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/data2vec/convert_data2vec_text_original_pytorch_checkpoint_to_pytorch.py | 
https://dl.fbaipublicfiles.com/fairseq/models/roberta.large.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/data2vec/convert_data2vec_text_original_pytorch_checkpoint_to_pytorch.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/data2vec/convert_data2vec_text_original_pytorch_checkpoint_to_pytorch.py | https://github.com/pytorch/fairseq/blob/main/examples/data2vec/models/data2vec_text.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/data2vec/convert_data2vec_audio_original_pytorch_checkpoint_to_pytorch.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/data2vec/convert_data2vec_audio_original_pytorch_checkpoint_to_pytorch.py | https://github.com/pytorch/fairseq/blob/main/examples/data2vec/models/data2vec_audio.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/data2vec/configuration_data2vec_text.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/data2vec/configuration_data2vec_text.py | https://huggingface.co/facebook/data2vec-text-base | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/question-answering/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/data2vec/configuration_data2vec_text.py | https://arxiv.org/abs/1803.02155 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/question-answering/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/data2vec/configuration_data2vec_text.py | https://arxiv.org/abs/2009.13658 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/data2vec/modeling_data2vec_audio.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/data2vec/configuration_data2vec_audio.py | https://huggingface.co/models?filter=data2vec-audio | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/data2vec/modeling_data2vec_audio.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/data2vec/configuration_data2vec_audio.py | https://huggingface.co/facebook/data2vec-audio-base-960h | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/data2vec/configuration_data2vec_audio.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/ctrl.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/ctrl/modeling_tf_ctrl.py | https://huggingface.co/models?filter=ctrl | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/ctrl/modeling_tf_ctrl.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/ctrl.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/ctrl/modeling_ctrl.py | https://huggingface.co/models?filter=ctrl | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/ctrl/modeling_ctrl.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/ctrl/modeling_ctrl.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/cpm/tokenization_cpm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/cpm/tokenization_cpm_fast.py | https://pypi.org/project/jieba/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/cpm/tokenization_cpm_fast.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/cpm/tokenization_cpm.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/cpm/tokenization_cpm.py | https://pypi.org/project/jieba/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/cpm/tokenization_cpm.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convnext/modeling_tf_convnext.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convnext/modeling_tf_convnext.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/convnext/configuration_convnext.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convnext/modeling_convnext.py | https://huggingface.co/models?filter=convnext | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/donut/modeling_donut_swin.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convnext/modeling_convnext.py | https://github.com/tensorflow/tpu/issues/494#issuecomment-532968956 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convnext/modeling_convnext.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convnext/modeling_convnext.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/convnextv2.md | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://github.com/facebookresearch/ConvNeXt | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/convnext/convert_convnext_to_pytorch.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://github.com/google-research/big_transfer/issues/18 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/convnext/configuration_convnext.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convnext/configuration_convnext.py | https://huggingface.co/models?filter=convnext | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/convnext/configuration_convnext.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convnext/configuration_convnext.py | https://huggingface.co/facebook/convnext-tiny-224 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/convbert.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convbert/modeling_tf_convbert.py | https://huggingface.co/models?filter=convbert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convbert/modeling_tf_convbert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/convbert.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convbert/modeling_convbert.py | https://huggingface.co/models?filter=convbert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convbert/modeling_convbert.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convbert/modeling_convbert.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convbert/modeling_convbert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/convbert.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convbert/configuration_convbert.py | https://huggingface.co/models?filter=convbert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/convbert/tokenization_convbert_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convbert/configuration_convbert.py | https://huggingface.co/YituTech/conv-bert-base | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/clip/tokenization_clip_fast.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/clip/tokenization_clip_fast.py | https://github.com/huggingface/tokenizers/issues/872 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roberta/tokenization_roberta_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/clip/tokenization_clip.py | https://docs.python.org/3/library/stdtypes.html#bytes.decode | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/tensorflow/contrastive-image-text/run_clip.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/clip/modeling_tf_clip.py | https://huggingface.co/models?filter=clip | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/clip/modeling_tf_clip.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/clip/modeling_tf_clip.py | https://sachinruk.github.io/blog/pytorch/pytorch%20lightning/loss%20function/gpu/2021/03/07/CLIP.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/clip/modeling_clip.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/clip/modeling_tf_clip.py | https://github.com/openai/CLIP/blob/cfcffb90e69f37bf2ff1e988237a0fbe41f33c04/clip/model.py#L324 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/clip/modeling_tf_clip.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/clip/modeling_tf_clip.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/clip/modeling_flax_clip.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/clip/modeling_flax_clip.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/clip/modeling_flax_clip.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/clip/modeling_flax_clip.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/clip/modeling_flax_clip.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/clip/modeling_flax_clip.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/tensorflow/contrastive-image-text/run_clip.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/clip/modeling_clip.py | https://huggingface.co/models?filter=clip | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/clip/modeling_tf_clip.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/clip/modeling_clip.py | https://sachinruk.github.io/blog/pytorch/pytorch%20lightning/loss%20function/gpu/2021/03/07/CLIP.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/clip/modeling_clip.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/clip/modeling_clip.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/clip/modeling_clip.py | https://github.com/openai/CLIP/blob/cfcffb90e69f37bf2ff1e988237a0fbe41f33c04/clip/model.py#L324 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/clip/modeling_clip.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/tensorflow/contrastive-image-text/run_clip.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/clip/configuration_clip.py | https://huggingface.co/models?filter=clip | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/clip/configuration_clip.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/clip/configuration_clip.py | https://huggingface.co/openai/clip-vit-base-patch32 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/canine/tokenization_canine.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/canine/tokenization_canine.py | https://github.com/google-research/language/blob/master/language/canine/special_codepoints.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/canine/modeling_canine.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/canine/modeling_canine.py | https://huggingface.co/models?filter=canine | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/canine/modeling_canine.py | 
https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/canine/modeling_canine.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/canine/modeling_canine.py | https://github.com/google-research/big_transfer/blob/49afe42338b62af9fbe18f0258197a33ee578a6b/bit_tf2/models.py#L36-L38 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/canine/modeling_canine.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/canine/modeling_canine.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/canine/modeling_canine.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/canine/configuration_canine.py | https://huggingface.co/models?filter=canine | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/canine.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/canine/configuration_canine.py | https://huggingface.co/google/canine-s | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/layoutxlm/tokenization_layoutxlm_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/camembert/tokenization_camembert_fast.py | https://huggingface.co/docs/tokenizers/python/latest/components.html?highlight=BPE#models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/camembert/tokenization_camembert_fast.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/camembert/tokenization_camembert.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/camembert/tokenization_camembert.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/camembert/modeling_camembert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/camembert/modeling_tf_camembert.py | https://huggingface.co/models?filter=camembert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/camembert/modeling_tf_camembert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/camembert/modeling_camembert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/camembert/modeling_camembert.py | 
https://huggingface.co/models?filter=camembert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/camembert/modeling_camembert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/byt5/tokenization_byt5.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/byt5/tokenization_byt5.py | https://github.com/google-research/text-to-text-transfer-transformer/blob/9fd7b14a769417be33bc6c850f9598764913c833/t5/data/preprocessors.py#L2117 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/deprecated/bort/convert_bort_original_gluonnlp_checkpoint_to_pytorch.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bort/convert_bort_original_gluonnlp_checkpoint_to_pytorch.py | https://github.com/alexa/bort/blob/master/bort/bort.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_tf_blenderbot_small.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_tf_blenderbot_small.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_flax_blenderbot_small.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_flax_blenderbot_small.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_flax_blenderbot_small.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_flax_blenderbot_small.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_flax_blenderbot_small.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_flax_blenderbot_small.py | https://arxiv.org/abs/1910.13461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_flax_blenderbot_small.py | https://github.com/google/flax/blob/491ce18759622506588784b4fca0e4bf05f8c8cd/flax/linen/attention.py#L252 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_flax_blenderbot_small.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/blenderbot_small/configuration_blenderbot_small.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_blenderbot_small.py | https://huggingface.co/models?filter=blenderbot_small | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_blenderbot_small.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_blenderbot_small.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/blenderbot_small/configuration_blenderbot_small.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/blenderbot_small/configuration_blenderbot_small.py | https://huggingface.co/models?filter=blenderbot_small | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/blenderbot_small/configuration_blenderbot_small.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/blenderbot_small/configuration_blenderbot_small.py | https://huggingface.co/facebook/blenderbot_small-90M | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/blenderbot_small/configuration_blenderbot_small.py | 
https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_tf_blenderbot.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_tf_blenderbot.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_flax_blenderbot.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_flax_blenderbot.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_flax_blenderbot.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_flax_blenderbot.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_flax_blenderbot.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_flax_blenderbot.py | https://arxiv.org/abs/1910.13461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_flax_blenderbot.py | https://github.com/google/flax/blob/491ce18759622506588784b4fca0e4bf05f8c8cd/flax/linen/attention.py#L252 | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_flax_blenderbot.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/blenderbot_small/configuration_blenderbot_small.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_blenderbot.py | https://huggingface.co/models?filter=blenderbot | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_blenderbot.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_blenderbot.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/blenderbot_small/configuration_blenderbot_small.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/blenderbot/configuration_blenderbot.py | https://huggingface.co/models?filter=blenderbot | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/blenderbot/configuration_blenderbot.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/blenderbot/configuration_blenderbot.py | https://huggingface.co/facebook/blenderbot-3B | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/blenderbot/configuration_blenderbot.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/bigbird_pegasus/modeling_bigbird_pegasus.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bigbird_pegasus/modeling_bigbird_pegasus.py | https://huggingface.co/models?filter=bigbird_pegasus | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bigbird_pegasus/modeling_bigbird_pegasus.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bigbird_pegasus/modeling_bigbird_pegasus.py | https://arxiv.org/abs/1910.13461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bigbird_pegasus/modeling_bigbird_pegasus.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/bigbird_pegasus/modeling_bigbird_pegasus.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bigbird_pegasus/configuration_bigbird_pegasus.py | https://huggingface.co/models?filter=bigbird_pegasus | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/bigbird_pegasus/configuration_bigbird_pegasus.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bigbird_pegasus/configuration_bigbird_pegasus.py | https://huggingface.co/google/bigbird-pegasus-large-arxiv | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bigbird_pegasus/configuration_bigbird_pegasus.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/pegasus/tokenization_pegasus_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/big_bird/tokenization_big_bird_fast.py | https://huggingface.co/docs/tokenizers/python/latest/components.html?highlight=unigram#models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/big_bird/tokenization_big_bird_fast.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/big_bird/tokenization_big_bird.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/big_bird/tokenization_big_bird.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/big_bird/modeling_flax_big_bird.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/big_bird/modeling_flax_big_bird.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/big_bird/modeling_flax_big_bird.py | 
https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/big_bird/modeling_flax_big_bird.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/big_bird/modeling_flax_big_bird.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/big_bird/modeling_big_bird.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/big_bird/modeling_big_bird.py | https://huggingface.co/models?filter=big_bird | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/big_bird/modeling_big_bird.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/big_bird/modeling_big_bird.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/big_bird/modeling_big_bird.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/big_bird/modeling_big_bird.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/big_bird/modeling_big_bird.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/big_bird/configuration_big_bird.py | https://huggingface.co/models?filter=big_bird | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/big_bird/tokenization_big_bird.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/big_bird/configuration_big_bird.py | https://huggingface.co/google/bigbird-roberta-base | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/bertweet/tokenization_bertweet.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bertweet/tokenization_bertweet.py | cgpotts@stanford.edu | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/bertweet/tokenization_bertweet.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bertweet/tokenization_bertweet.py | ewan@inf.ed.ac.uk | 开发者邮箱配置 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/src/transformers/models/bertweet/tokenization_bertweet.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bertweet/tokenization_bertweet.py | http://nltk.org/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/bertweet/tokenization_bertweet.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bertweet/tokenization_bertweet.py | https://github.com/nltk/nltk/issues/2409 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/bertweet/tokenization_bertweet.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bertweet/tokenization_bertweet.py | http://en.wikipedia.org/wiki/List_of_emoticons | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/bertweet/tokenization_bertweet.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bertweet/tokenization_bertweet.py | https://gist.github.com/winzig/8894715 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/bertweet/tokenization_bertweet.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bertweet/tokenization_bertweet.py | foo.na@example.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/bertweet/tokenization_bertweet.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bertweet/tokenization_bertweet.py | https://github.com/scrapy/w3lib/blob/master/w3lib/html.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/bertweet/tokenization_bertweet.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bertweet/tokenization_bertweet.py | https://en.wikipedia.org/wiki/ISO/IEC_8859-1#Similar_character_sets | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/bert_japanese/tokenization_bert_japanese.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert_japanese/tokenization_bert_japanese.py | https://pypi.org/project/fugashi/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/bert_japanese/tokenization_bert_japanese.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert_japanese/tokenization_bert_japanese.py | https://github.com/polm/ipadic-py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/bert_japanese/tokenization_bert_japanese.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert_japanese/tokenization_bert_japanese.py | https://github.com/polm/unidic-lite | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/bert_japanese/tokenization_bert_japanese.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert_japanese/tokenization_bert_japanese.py | https://github.com/polm/unidic-py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert_generation/tokenization_bert_generation.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert_generation/tokenization_bert_generation.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert_generation/modeling_bert_generation.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert_generation/modeling_bert_generation.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert_generation/modeling_bert_generation.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert_generation/modeling_bert_generation.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert_generation/modeling_bert_generation.py | https://arxiv.org/abs/1907.12461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/question-answering/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert_generation/configuration_bert_generation.py | https://arxiv.org/abs/1803.02155 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/question-answering/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert_generation/configuration_bert_generation.py | https://arxiv.org/abs/2009.13658 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mobilebert/tokenization_mobilebert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://github.com/huggingface/transformers/issues/328 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mobilebert/tokenization_mobilebert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | https://github.com/huggingface/transformers/issues/328 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_chinese_ref.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | https://en.wikipedia.org/wiki/CJK_Unified_Ideographs_ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/bert.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/modeling_tf_bert.py | https://huggingface.co/models?filter=bert | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/modeling_tf_bert.py | https://github.com/tensorflow/mesh/blob/8d2465e9bc93129b913b5ccc6a59aa97abd96ec6/mesh_tensorflow/transformer/transformer_layers.py#L270 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/modeling_tf_bert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/modeling_flax_bert.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/modeling_flax_bert.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/modeling_flax_bert.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/modeling_flax_bert.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/modeling_flax_bert.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/bert.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/modeling_bert.py | https://huggingface.co/models?filter=bert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/modeling_bert.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/modeling_bert.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/modeling_bert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/modeling_bert.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/bert.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/models?filter=bert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/jax-projects/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/bert-base-uncased | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/question-answering/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://arxiv.org/abs/1803.02155 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/question-answering/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://arxiv.org/abs/2009.13658 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/beit/modeling_flax_beit.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/beit/modeling_flax_beit.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/beit/modeling_flax_beit.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/beit/modeling_flax_beit.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/beit/modeling_flax_beit.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/beit/modeling_flax_beit.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/beit/modeling_beit.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/beit/modeling_beit.py | https://huggingface.co/models?filter=beit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/vision_text_dual_encoder/test_modeling_vision_text_dual_encoder.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/beit/modeling_beit.py | https://github.com/rwightman/pytorch-image-models/blob/b9bd960a032c75ca6b808ddeed76bee5f3ed4972/timm/models/layers/helpers.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/donut/modeling_donut_swin.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/beit/modeling_beit.py | https://github.com/tensorflow/tpu/issues/494#issuecomment-532968956 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vit/modeling_tf_vit.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/beit/modeling_beit.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/vision_transformer.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/beit/modeling_beit.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/beit/modeling_beit.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/beit/modeling_beit.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/upernet.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/beit/modeling_beit.py | https://github.com/open-mmlab/mmsegmentation | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/beit/modeling_beit.py | https://arxiv.org/abs/1807.10221 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/upernet/modeling_upernet.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/beit/modeling_beit.py | https://arxiv.org/abs/1411.4038 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/convnext/convert_convnext_to_pytorch.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/beit/convert_beit_unilm_to_pytorch.py | https://github.com/google-research/big_transfer/issues/18 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/beit/modeling_beit.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/beit/configuration_beit.py | https://huggingface.co/models?filter=beit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bartpho/tokenization_bartpho.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bartpho/tokenization_bartpho.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/barthez/tokenization_barthez_fast.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/barthez/tokenization_barthez.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/barthez/tokenization_barthez.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_bart_dlm_flax.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | https://huggingface.co/models?filter=bart | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roberta/tokenization_roberta_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | https://docs.python.org/3/library/stdtypes.html#bytes.decode | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_bart_dlm_flax.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart.py | https://huggingface.co/models?filter=bart | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roberta/tokenization_roberta_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart.py | https://docs.python.org/3/library/stdtypes.html#bytes.decode | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bart/modeling_tf_bart.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bart/modeling_tf_bart.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bart/modeling_flax_bart.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bart/modeling_flax_bart.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bart/modeling_flax_bart.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bart/modeling_flax_bart.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bart/modeling_flax_bart.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bart/modeling_flax_bart.py | https://arxiv.org/abs/1910.13461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bart/modeling_flax_bart.py | https://github.com/google/flax/blob/491ce18759622506588784b4fca0e4bf05f8c8cd/flax/linen/attention.py#L252 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bart/modeling_flax_bart.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_bart_dlm_flax.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bart/modeling_bart.py | https://huggingface.co/models?filter=bart | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bart/modeling_bart.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bart/modeling_bart.py | 
https://arxiv.org/abs/1910.13461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bart/modeling_bart.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_bart_dlm_flax.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bart/configuration_bart.py | https://huggingface.co/models?filter=bart | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/longform-qa/eli5_app.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bart/configuration_bart.py | https://huggingface.co/facebook/bart-large | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bart/configuration_bart.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/pegasus/tokenization_pegasus_fast.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py | https://huggingface.co/docs/tokenizers/python/latest/components.html?highlight=unigram#models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/albert.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/albert/modeling_tf_albert.py | https://huggingface.co/models?filter=albert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/albert/modeling_tf_albert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/albert/modeling_tf_albert.py | https://github.com/google-research/albert/blob/master/modeling.py#L971-L993 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/albert/modeling_tf_albert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/albert/modeling_flax_albert.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/albert/modeling_flax_albert.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/albert/modeling_flax_albert.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/albert/modeling_flax_albert.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/albert/modeling_flax_albert.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/albert.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/albert/modeling_albert.py | https://huggingface.co/models?filter=albert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/albert/modeling_albert.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/albert/modeling_albert.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/albert/modeling_albert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/albert/configuration_albert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/albert/configuration_albert.py | https://huggingface.co/albert-xxlarge-v2 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/question-answering/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/models/albert/configuration_albert.py | https://arxiv.org/abs/1803.02155 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/question-answering/README.md | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/models/albert/configuration_albert.py | https://arxiv.org/abs/2009.13658 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/modeling_utils.py | https://github.com/tensorflow/mesh/blob/8d2465e9bc93129b913b5ccc6a59aa97abd96ec6/mesh_tensorflow | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/modeling_utils.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/modeling_utils.py | https://arxiv.org/pdf/2001.08361.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/modeling_utils.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/modeling_utils.py | https://github.com/huggingface/transformers/pull/11471 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/modeling_utils.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/modeling_utils.py | https://huggingface.co/transformers/installation.html#offline-mode | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/modeling_utils.py | https://huggingface.co/models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/modeling_utils.py | https://huggingface.co/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/utils/hub.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/modeling_utils.py | https://huggingface.co/docs/transformers/installation#offline-mode | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/tensorflow/image-classification/run_image_classification.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/modeling_utils.py | https://pytorch.or | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/modeling_utils.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/modeling_utils.py | https://github.com/zihangdai/xlnet/blob/master/modeling.py#L253-L276 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/modeling_tf_utils.py | https://huggingface.co/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/modeling_tf_utils.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/modeling_tf_utils.py | https://www.tensorflow.org/tfx/serving/serving_basic | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/modeling_tf_utils.py | https://huggingface.co/models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/utils/hub.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/modeling_tf_utils.py | https://huggingface.co/docs/transformers/installation#offline-mode | 模型相关说明 | -| 
开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/modeling_tf_utils.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/modeling_tf_utils.py | https://github.com/tensorflow/tensorflow/blob/00fad90125b18b80fe054de1055770cfb8fe4ba3/tensorflow/python/keras/engine/network.py#L1339-L1357 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/pegasus/modeling_tf_pegasus.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/modeling_tf_utils.py | https://github.com/tensorflow/models/blob/a009f4fb9d2fc4949e32192a944688925ef78659/official/transformer/v2/embedding_layer.py#L24 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/modeling_utils.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/modeling_tf_utils.py | https://github.com/zihangdai/xlnet/blob/master/modeling.py#L253-L276 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/tensorflow/image-classification/run_image_classification.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/modeling_tf_pytorch_utils.py | https://pytorch.or | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/modeling_tf_pytorch_utils.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/modeling_tf_pytorch_utils.py | https://github.com/tensorflow/tensorflow/blob/ee16fcac960ae660e0e4496658a366e2f745e1f0/tensorflow/python/keras/engine/network.py#L1352-L1357 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/modeling_flax_utils.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/modeling_flax_utils.py | https://github.com/deepmind/jmp/blob/3a8318abc3292be38582794dbf7b094e6583b192/jmp/_src/policy.py#L27 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/modeling_flax_utils.py | https://huggingface.co/models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/modeling_flax_utils.py | https://huggingface.co/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/utils/hub.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/modeling_flax_utils.py | https://huggingface.co/docs/transformers/installation#offline-mode | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/modeling_flax_utils.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/modeling_flax_utils.py | https://github.com/google/flax/issues/1261 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/tensorflow/image-classification/run_image_classification.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/modeling_flax_pytorch_utils.py | https://pytorch.or | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/modelcard.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/modelcard.py | https://arxiv.org/abs/1810.03993 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/modelcard.py | https://huggingface.co/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/integrations/integration_utils.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/integrations.py | https://github.com/huggingface/transformers/issues/11565 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/integrations/integration_utils.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/integrations.py | https://app.sigopt.com/experiment/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/integrations.py | https://www.tensorflow.org/tensorboard | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/main_classes/callback.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/integrations.py | https://www.wandb.com/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/integrations.py | https://docs.wandb.ai/integrations/huggingface | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/main_classes/callback.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/integrations.py | https://www.comet.ml/site/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/trainer_tf.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/integrations.py | https://www.comet.ml/docs/python-sdk/advanced/#comet-configuration-variables | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/main_classes/callback.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/integrations.py | https://pypi.org/project/azureml-sdk/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/main_classes/callback.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/integrations.py | https://www.mlflow.org/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/main_classes/callback.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/integrations.py | https://neptune.ai | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/.circleci/create_circleci_config.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/image_utils.py | http: | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/image_utils.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/image_utils.py | https://pytorch.org/vision/stable/transforms.html#torchvision.transforms.Resize | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/hf_argparser.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/hf_argparser.py | https://stackoverflow.com/questions/15008758/parsing-boolean-values-with-argparse | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/generation/utils.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/generation_utils.py | 
https://github.com/huggingface/transformers/pull/5420/files | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/llm_tutorial.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/generation_utils.py | https://huggingface.co/blog/how-to-generate | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/generation/tf_logits_process.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/generation_utils.py | https://arxiv.org/pdf/1909.05858.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/generation/configuration_utils.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/generation_utils.py | https://github.com/huggingface/transformers/issues/14081 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ja/generation_strategies.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/generation_utils.py | https://arxiv.org/pdf/1610.02424.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/whisper/modeling_whisper.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/generation_utils.py | https://arxiv.org/abs/2010.00904 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/generation/tf_utils.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/generation_utils.py | http://arxiv.org/abs/1904.09751 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/generation/tf_utils.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/generation_utils.py | https://gist.github.com/thomwolf/1a5a29f6962089e871b94cbd09daf317 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/generation/beam_search.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/generation_tf_utils.py | https://github.com/facebookresearch/XLM/blob/9e6f6814d17be4fe5b15f2e6c43eb2b2d76daeb4/src/model/transformer.py#L529 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/llm_tutorial.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/generation_tf_utils.py | https://huggingface.co/blog/how-to-generate | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/generation/tf_logits_process.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/generation_tf_utils.py | https://arxiv.org/pdf/1909.05858.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/generation_tf_utils.py | https://arxiv.org/abs/1909.05858 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/generation/tf_logits_process.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/generation_tf_utils.py | https://github.com/pytorch/fairseq/blob/a07cb6f40480928c9e0548b737aadd36ee66ac76/fairseq/sequence_generator.py#L345 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/generation/utils.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/generation_tf_utils.py | https://github.com/huggingface/transformers/pull/5420/files | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/src/transformers/generation/tf_utils.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/generation_tf_utils.py | http://arxiv.org/abs/1904.09751 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/generation/tf_utils.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/generation_tf_utils.py | https://gist.github.com/thomwolf/1a5a29f6962089e871b94cbd09daf317 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/hubert/modeling_tf_hubert.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/generation_tf_utils.py | https://github.com/tensorflow/tensorflow/issues/9260 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/generation/tf_logits_process.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/generation_tf_logits_process.py | https://arxiv.org/pdf/1909.05858.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/generation/tf_logits_process.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/generation_tf_logits_process.py | https://github.com/pytorch/fairseq/blob/a07cb6f40480928c9e0548b737aadd36ee66ac76/fairseq/sequence_generator.py#L345 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/generation/tf_logits_process.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/generation_logits_process.py | https://arxiv.org/pdf/1909.05858.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/generation/tf_logits_process.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/generation_logits_process.py | https://github.com/pytorch/fairseq/blob/a07cb6f40480928c9e0548b737aadd36ee66ac76/fairseq/sequence_generator.py#L345 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/generation/logits_process.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/generation_logits_process.py | https://github.com/facebookresearch/ParlAI/blob/master/parlai/core/torch_generator_agent.py#L1350 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/whisper/modeling_whisper.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/generation_logits_process.py | https://arxiv.org/abs/2010.00904 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ja/generation_strategies.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/generation_logits_process.py | https://arxiv.org/pdf/1610.02424.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/llm_tutorial.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/generation_flax_utils.py | https://huggingface.co/blog/how-to-generate | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/generation/utils.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/generation_flax_utils.py | https://github.com/huggingface/transformers/pull/5420/files | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/generation/beam_search.py | 
Bert-text-classification_for_PyTorch/transformers/src/transformers/generation_beam_search.py | https://github.com/facebookresearch/XLM/blob/9e6f6814d17be4fe5b15f2e6c43eb2b2d76daeb4/src/model/transformer.py#L529 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/generation/beam_search.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/generation_beam_search.py | https://github.com/ashwinkalyan/dbs/blob/master/dbs/beam_utils.lua | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ja/generation_strategies.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/generation_beam_search.py | https://arxiv.org/pdf/1610.02424.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/setup.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/file_utils.py | https://github.com/allenai/allennlp | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/file_utils.py | https://huggingface.co | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/utils/import_utils.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/file_utils.py | https://github.com/pytorch/pytorch/blob/2289a12f21c54da93bf5d696e3f9aea83dd9c10d/torch/testing/_internal/common_cuda.py#L51 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/utils/import_utils.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/file_utils.py | https://github.com/tqdm/tqdm/blob/master/tqdm/autonotebook.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/utils/import_utils.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/file_utils.py | https://github.com/google/sentencepiece#installation | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/utils/import_utils.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/file_utils.py | https://github.com/protocolbuffers/protobuf/tree/master/python#installation | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/utils/import_utils.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/file_utils.py | https://github.com/facebookresearch/faiss/blob/master/INSTALL.md | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/file_utils.py | https://pytorch.org/get-started/locally/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/file_utils.py | https://www.tensorflow.org/install | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/utils/import_utils.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/file_utils.py | https://github.com/facebookresearch/detectron2/blob/master/INSTALL.md | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/run_flax_speech_recognition_seq2seq.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/file_utils.py | https://github.com/google/flax | 源码实现 | -| 开源代码引入 
| https://github.com/huggingface/transformers/blob/master/src/transformers/utils/import_utils.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/file_utils.py | https://github.com/rspeer/python-ftfy/tree/master#installing | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/quantization-qdqbert/Dockerfile | Bert-text-classification_for_PyTorch/transformers/src/transformers/file_utils.py | https://pypi.ngc.nvidia.com | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/tapas.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/file_utils.py | https://github.com/tensorflow/probability | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/utils/import_utils.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/file_utils.py | https://pandas.pydata.org/pandas-docs/stable/getting_started/install.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/file_utils.py | https://huggingface.co/models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/file_utils.py | https://huggingface.co/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/file_utils.py | https://hf.co | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/utils/import_utils.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/file_utils.py | https://github.com/optuna/optuna/blob/master/optuna/integration/__init__.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/utils/doc.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/file_utils.py | http://stackoverflow.com/a/6528148/190597 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/feature_extraction_utils.py | https://huggingface.co/models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/feature_extraction_utils.py | https://huggingface.co/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/utils/hub.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/feature_extraction_utils.py | https://huggingface.co/docs/transformers/installation#offline-mode | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/integrations/deepspeed.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/deepspeed.py | https://github.com/microsoft/DeepSpeed/issues/1394#issuecomment-937405374 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/deepspeed/test_deepspeed.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/deepspeed.py | https://github.com/microsoft/DeepSpeed/issues/1612 | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/src/transformers/data/processors/xnli.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/data/processors/xnli.py | https://github.com/google-research/bert/blob/f39e881b169b9d53bea03d2d341b31707a6c052b/run_classifier.py#L207 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/tapex/run_tabfact_with_tapex.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/data/processors/glue.py | https://github.com/huggingface/transformers/blob/master/examples/pytorch/text-classification/run_glue.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/tapex/run_tabfact_with_tapex.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/data/metrics/__init__.py | https://github.com/huggingface/transformers/blob/master/examples/pytorch/text-classification/run_glue.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/tapex/run_tabfact_with_tapex.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/data/datasets/glue.py | https://github.com/huggingface/transformers/blob/master/examples/pytorch/text-classification/run_glue.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/convert_slow_tokenizer.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/convert_graph_to_onnx.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/convert_graph_to_onnx.py | https://github.com/microsoft/onnxruntime/tree/master/onnxruntime/python/tools/transformers | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/configuration_utils.py | https://huggingface.co/models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/configuration_utils.py | https://huggingface.co/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/utils/hub.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/configuration_utils.py | https://huggingface.co/docs/transformers/installation#offline-mode | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/jax-projects/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/commands/user.py | https://git-lfs.github.com/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/commands/lfs.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/commands/lfs.py | https://github.com/git-lfs/git-lfs/blob/master/docs/custom-transfers.md | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/commands/convert.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/commands/add_new_model_like.py | 
https://huggingface.co/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/README.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/commands/add_new_model_like.py | https://huggingface.co/models?filter= | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/setup.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/benchmark/benchmark_utils.py | https://github.com/allenai/allennlp | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/benchmark/benchmark_utils.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/benchmark/benchmark_utils.py | https://github.com/pythonprofilers/memory_profiler/blob/895c4ac7a08020d66ae001e24067da6dcea42451/memory_profiler.py#L239 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/benchmark/benchmark_utils.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/benchmark/benchmark_utils.py | https://psutil.readthedocs.io/en/latest/#psutil.Process.memory_info | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/benchmark/benchmark_utils.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/benchmark/benchmark_utils.py | https://github.com/tensorflow/tensorflow/issues/20218#issuecomment-416771802 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/benchmark/benchmark_utils.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/benchmark/benchmark_utils.py | https://github.com/pytorch/xla/issues/2180 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/benchmark/benchmark.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/benchmark/benchmark_tf.py | https://docs.python.org/2/library/timeit.html#timeit.Timer.repeat | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/lightning_base.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/benchmark/benchmark_args.py | https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/benchmark/benchmark.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/benchmark/benchmark.py | https://github.com/NVIDIA/apex/issues/439 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/benchmark/benchmark.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/benchmark/benchmark.py | https://docs.python.org/2/library/timeit.html#timeit.Timer.repeat | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/activations_tf.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/activations_tf.py | https://arxiv.org/abs/1606.08415 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/activations_tf.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/activations_tf.py | https://arxiv.org/abs/1606.0841 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/activations_tf.py | https://arxiv.org/abs/2004.09602 | 参考论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/src/transformers/activations_tf.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/activations_tf.py | https://arxiv.org/abs/1612.08083 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/activations_tf.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/activations.py | https://arxiv.org/abs/1606.08415 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/activations.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/activations.py | https://github.com/hendrycks/GELUs | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert-text-classification_for_PyTorch/transformers/src/transformers/activations.py | https://arxiv.org/abs/2004.09602 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/activations.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/activations.py | https://arxiv.org/abs/1702.03118 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/activations.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/activations.py | https://arxiv.org/abs/1710.05941v1 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/activations.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/activations.py | https://arxiv.org/abs/1908.08681 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/activations.py | Bert-text-classification_for_PyTorch/transformers/src/transformers/activations.py | https://github.com/digantamisra98/Mish | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/setup.py | Bert-text-classification_for_PyTorch/transformers/setup.py | https://test.pypi.org/legacy/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/setup.py | Bert-text-classification_for_PyTorch/transformers/setup.py | https://testpypi.python.org/pypi | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/setup.py | Bert-text-classification_for_PyTorch/transformers/setup.py | https://github.com/pypa/pip/issues/5466 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/marian.md | Bert-text-classification_for_PyTorch/transformers/scripts/tatoeba/upload_models.sh | https://huggingface.co/Helsinki-NLP/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/setup.py | Bert-text-classification_for_PyTorch/transformers/scripts/stale.py | https://github.com/allenai/allennlp | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/pegasus/build_test_sample_spm_no_bos.py | Bert-text-classification_for_PyTorch/transformers/scripts/pegasus/build_test_sample_spm_no_bos.py | https://raw.githubusercontent.com/google/sentencepiece/master/data/botchan.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-facebook-wmt19.py | Bert-text-classification_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py | http://matrix.statmt.org/matrix/output/1907?run_id=6937 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-facebook-wmt19.py | 
Bert-text-classification_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py | http://matrix.statmt.org/matrix/output/1914?run_id=6724 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-facebook-wmt19.py | Bert-text-classification_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py | http://matrix.statmt.org/matrix/output/1909?run_id=6862 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-facebook-wmt19.py | Bert-text-classification_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py | http://matrix.statmt.org/matrix/output/1902?run_id=6750 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-facebook-wmt19.py | Bert-text-classification_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py | https://github.com/pytorch/fairseq/blob/master/examples/wmt19/README.md | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/fsmt.md | Bert-text-classification_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py | https://arxiv.org/abs/1907.06616 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-facebook-wmt19.py | Bert-text-classification_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py | https://huggingface.co/facebook/wmt19-en-ru | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-facebook-wmt19.py | Bert-text-classification_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py | https://huggingface.co/facebook/wmt19-ru-en | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-facebook-wmt19.py | Bert-text-classification_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py | https://huggingface.co/facebook/wmt19-en-de | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-facebook-wmt19.py | Bert-text-classification_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py | https://huggingface.co/facebook/wmt19-de-en | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-facebook-wmt19.py | Bert-text-classification_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py | https://discuss.huggingface.co/t/issues-with-translating-inputs-containing-repeated-phrases/981 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-allenai-wmt19.py | Bert-text-classification_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py | http://www.statmt.org/wmt19/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-allenai-wmt19.py | Bert-text-classification_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py | http://matrix.statmt.org/test_sets/newstest2019.tgz?1556572561 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-allenai-wmt19.py | Bert-text-classification_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt19.py | https://github.com/jungokasai/deep-shallow/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-allenai-wmt19.py | Bert-text-classification_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt19.py | 
https://arxiv.org/abs/2006.10369 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-allenai-wmt19.py | Bert-text-classification_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt19.py | https://huggingface.co/allenai/wmt19-de-en-6-6-big | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-allenai-wmt19.py | Bert-text-classification_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt19.py | https://huggingface.co/allenai/wmt19-de-en-6-6-base | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/.circleci/TROUBLESHOOT.md | Bert-text-classification_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt19.py | https://github.com/huggingface/transformers | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-allenai-wmt19.py | Bert-text-classification_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt19.py | http://www.statmt.org/wmt19/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-allenai-wmt19.py | Bert-text-classification_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt19.py | http://matrix.statmt.org/test_sets/newstest2019.tgz?1556572561 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-allenai-wmt19.py | Bert-text-classification_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt16.py | https://github.com/jungokasai/deep-shallow/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-allenai-wmt19.py | Bert-text-classification_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt16.py | https://arxiv.org/abs/2006.10369 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-allenai-wmt16.py | Bert-text-classification_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt16.py | https://huggingface.co/allenai/wmt16-en-de-dist-12-1 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-allenai-wmt16.py | Bert-text-classification_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt16.py | https://huggingface.co/allenai/wmt16-en-de-dist-6-1 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-allenai-wmt16.py | Bert-text-classification_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt16.py | https://huggingface.co/allenai/wmt16-en-de-12-1 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/.circleci/TROUBLESHOOT.md | Bert-text-classification_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt16.py | https://github.com/huggingface/transformers | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-allenai-wmt16.py | Bert-text-classification_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt16.py | http://www.statmt.org/wmt16/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-allenai-wmt16.py | Bert-text-classification_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt16.py | http://matrix.statmt.org/test_sets/newstest2016.tgz?1504722372 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-facebook-wmt19.py | 
Bert-text-classification_for_PyTorch/transformers/scripts/fsmt/eval-facebook-wmt19.sh | http://matrix.statmt.org/matrix/output/1907?run_id=6937 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/eval-facebook-wmt19.sh | Bert-text-classification_for_PyTorch/transformers/scripts/fsmt/eval-facebook-wmt19.sh | http://matrix.statmt.org/matrix/output/1914?score_id=37605 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-facebook-wmt19.py | Bert-text-classification_for_PyTorch/transformers/scripts/fsmt/eval-facebook-wmt19.sh | http://matrix.statmt.org/matrix/output/1909?run_id=6862 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-facebook-wmt19.py | Bert-text-classification_for_PyTorch/transformers/scripts/fsmt/eval-facebook-wmt19.sh | http://matrix.statmt.org/matrix/output/1902?run_id=6750 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert-text-classification_for_PyTorch/transformers/examples/tensorflow/translation/run_translation.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert-text-classification_for_PyTorch/transformers/examples/tensorflow/translation/run_translation.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert-text-classification_for_PyTorch/transformers/examples/tensorflow/token-classification/run_ner.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert-text-classification_for_PyTorch/transformers/examples/tensorflow/token-classification/run_ner.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert-text-classification_for_PyTorch/transformers/examples/tensorflow/text-classification/run_text_classification.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/text-classification/run_flax_glue.py | Bert-text-classification_for_PyTorch/transformers/examples/tensorflow/text-classification/run_text_classification.py | https://huggingface.co/docs/datasets/package_reference/main_classes.html#datasets.Dataset.unique | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert-text-classification_for_PyTorch/transformers/examples/tensorflow/text-classification/run_glue.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert-text-classification_for_PyTorch/transformers/examples/tensorflow/summarization/run_summarization.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert-text-classification_for_PyTorch/transformers/examples/tensorflow/summarization/run_summarization.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert-text-classification_for_PyTorch/transformers/examples/tensorflow/question-answering/run_qa.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert-text-classification_for_PyTorch/transformers/examples/tensorflow/question-answering/run_qa.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/question-answering/run_qa.py | Bert-text-classification_for_PyTorch/transformers/examples/tensorflow/question-answering/run_qa.py | https://huggingface.co/transformers/index.html#supported-frameworks | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert-text-classification_for_PyTorch/transformers/examples/tensorflow/multiple-choice/run_swag.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert-text-classification_for_PyTorch/transformers/examples/tensorflow/multiple-choice/run_swag.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_mlm_flax.py | Bert-text-classification_for_PyTorch/transformers/examples/tensorflow/language-modeling/run_mlm.py | https://huggingface.co/models?filter=fill-mask | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert-text-classification_for_PyTorch/transformers/examples/tensorflow/language-modeling/run_mlm.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert-text-classification_for_PyTorch/transformers/examples/tensorflow/language-modeling/run_mlm.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_clm_flax.py | Bert-text-classification_for_PyTorch/transformers/examples/tensorflow/language-modeling/run_clm.py | https://huggingface.co/models?filter=text-generation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert-text-classification_for_PyTorch/transformers/examples/tensorflow/language-modeling/run_clm.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert-text-classification_for_PyTorch/transformers/examples/tensorflow/language-modeling/run_clm.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/lxmert/modeling_frcnn.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/visual_bert/modeling_frcnn.py | https://github.com/pytorch/pytorch/issues/22812 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/lxmert/modeling_frcnn.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/visual_bert/modeling_frcnn.py | 
https://github.com/airsplay/py-bottom-up-attention/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/lightning_base.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/seq2seq-distillation/lightning_base.py | https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/seq2seq-distillation/finetune_pegasus_xsum.sh | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/seq2seq-distillation/finetune_pegasus_xsum.sh | https://arxiv.org/abs/1912.08777 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/seq2seq-distillation/_test_bash_script.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/seq2seq-distillation/_test_bash_script.py | https://cdn-datasets.huggingface.co/translation/wmt_en_ro-tr40k-va0.5k-te0.5k.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/rag-end2end-retriever/use_own_knowledge_dataset.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/rag-end2end-retriever/use_own_knowledge_dataset.py | https://huggingface.co/docs/datasets/loading_datasets.html?highlight=csv#csv-files | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/lightning_base.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/rag-end2end-retriever/lightning_base.py | https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/rag-end2end-retriever/finetune_rag.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/rag-end2end-retriever/finetune_rag.py | https://github.com/PyTorchLightning/pytorch-lightning/issues/2424 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/rag-end2end-retriever/finetune_rag.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/rag-end2end-retriever/finetune_rag.py | https://docs.ray.io/en/master/cluster/index.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/rag-end2end-retriever/finetune_rag.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/rag-end2end-retriever/distributed_ray_retriever.py | https://docs.ray.io/en/master/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/rag-end2end-retriever/distributed_ray_retriever.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/rag-end2end-retriever/distributed_ray_retriever.py | https://docs.ray.io/en/master/walkthrough.html#remote | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/rag-end2end-retriever/use_own_knowledge_dataset.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/rag/use_own_knowledge_dataset.py | https://huggingface.co/docs/datasets/loading_datasets.html?highlight=csv#csv-files | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/rag/test_distributed_retriever.py | 
Bert-text-classification_for_PyTorch/transformers/examples/research_projects/rag/test_distributed_retriever.py | https://stackoverflow.com/questions/54338013/parallel-import-a-python-file-from-sibling-folder | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/lightning_base.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/rag/lightning_base.py | https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/rag-end2end-retriever/finetune_rag.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/rag/finetune_rag.py | https://github.com/PyTorchLightning/pytorch-lightning/issues/2424 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/rag-end2end-retriever/finetune_rag.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/rag/finetune_rag.py | https://docs.ray.io/en/master/cluster/index.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/rag-end2end-retriever/finetune_rag.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/rag/distributed_ray_retriever.py | https://docs.ray.io/en/master/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/rag-end2end-retriever/distributed_ray_retriever.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/rag/distributed_ray_retriever.py | https://docs.ray.io/en/master/walkthrough.html#remote | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/quantization-qdqbert/run_quant_qa.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/quantization-qdqbert/run_quant_qa.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/question-answering/run_qa.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/quantization-qdqbert/run_quant_qa.py | https://huggingface.co/transformers/index.html#supported-frameworks | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/quantization-qdqbert/evaluate-hf-trt-qa.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/quantization-qdqbert/evaluate-hf-trt-qa.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/pplm/run_pplm.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py | https://s3.amazonaws.com/models.huggingface.co/bert/pplm/bow/military.txt | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_mlm_flax.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/performer/run_mlm_performer.py | https://huggingface.co/models?filter=fill-mask | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/performer/run_mlm_performer.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/performer/run_mlm_performer.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/performer/modeling_flax_performer_utils.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/performer/modeling_flax_performer_utils.py | https://github.com/google-research/google-research/blob/master/performer/fast_self_attention/fast_self_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/performer/modeling_flax_performer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/performer/modeling_flax_performer.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/performer/modeling_flax_performer.py | https://arxiv.org/abs/1607.06450 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/performer/modeling_flax_performer.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/onnx/summarization/bart_onnx/generation_onnx.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/onnx/summarization/bart_onnx/generation_onnx.py | https://msdata.visualstudio.com/Vienna/_workitems/edit/1486599 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/movement-pruning/masked_run_squad.py | https://www.github.com/nvidia/apex | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/lightning_base.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/movement-pruning/masked_run_squad.py | https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/movement-pruning/masked_run_squad.py | https://code.visualstudio.com/docs/python/debugging#_attach-to-a-local-script | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | 
Bert-text-classification_for_PyTorch/transformers/examples/research_projects/movement-pruning/masked_run_glue.py | https://www.github.com/nvidia/apex | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/lightning_base.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/movement-pruning/masked_run_glue.py | https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modules/binarizer.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/movement-pruning/emmental/modules/binarizer.py | https://github.com/arunmallya/piggyback | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modules/binarizer.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/movement-pruning/emmental/modules/binarizer.py | https://github.com/allenai/hidden-networks | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modules/binarizer.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/movement-pruning/emmental/modules/binarizer.py | https://github.com/NervanaSystems/distiller/blob/2291fdcc2ea642a98d4e20629acb5a9e2e04b4e6/distiller/pruning/automated_gradual_pruner.py#L24 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/mm-imdb/run_mmimdb.py | https://www.github.com/nvidia/apex | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/lightning_base.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/mm-imdb/run_mmimdb.py | https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/mm-imdb/run_mmimdb.py | https://code.visualstudio.com/docs/python/debugging#_attach-to-a-local-script | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_mlm_flax.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/mlm_wwm/run_mlm_wwm.py | https://huggingface.co/models?filter=fill-mask | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/mlm_wwm/run_mlm_wwm.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/mlm_wwm/run_mlm_wwm.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_chinese_ref.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/mlm_wwm/run_chinese_ref.py | https://en.wikipedia.org/wiki/CJK_Unified_Ideographs_ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_chinese_ref.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/mlm_wwm/run_chinese_ref.py | https://github.com/ymcui/Chinese-BERT-wwm | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_chinese_ref.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/mlm_wwm/run_chinese_ref.py | https://github.com/HIT-SCIR/ltp | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/lxmert/modeling_frcnn.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/lxmert/modeling_frcnn.py | https://github.com/pytorch/pytorch/issues/22812 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/lxmert/modeling_frcnn.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/lxmert/modeling_frcnn.py | https://github.com/airsplay/py-bottom-up-attention/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/luke/run_luke_ner_no_trainer.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/luke/run_luke_ner_no_trainer.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/longform-qa/README.md | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/longform-qa/eli5_app.py | https://yjernite.github.io/lfqa.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/longform-qa/eli5_app.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/longform-qa/eli5_app.py | https://research.google/pubs/pub49029/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/longform-qa/eli5_app.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/longform-qa/eli5_app.py | https://arxiv.org/abs/1907.09190 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/longform-qa/eli5_app.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/longform-qa/eli5_app.py | https://huggingface.co/facebook/bart-large | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_chinese_ref.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/longform-qa/eli5_app.py | https://en.wikipedia.org/wiki/ | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/jax-projects/model_parallel/run_clm_mp.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/jax-projects/model_parallel/run_clm_mp.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/jax-projects/model_parallel/partitions.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/jax-projects/model_parallel/partitions.py | https://github.com/google-research/google-research/blob/master/flax_models/t5x/partitions.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/vision/run_image_classification.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/jax-projects/hybrid_clip/run_hybrid_clip.py | https://huggingface.co/models?filter=v | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_mlm_flax.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/jax-projects/hybrid_clip/run_hybrid_clip.py | https://huggingface.co/models?filter=fill-mask | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_mlm_flax.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/jax-projects/dataset-streaming/run_mlm_flax_stream.py | https://huggingface.co/models?filter=fill-mask | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/jax-projects/dataset-streaming/run_mlm_flax_stream.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/fsner/README.md | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/fsner/src/fsner/model.py | https://arxiv.org/abs/2008.10570 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/fsner/setup.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/fsner/setup.py | msi.sayef@gmail.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/distillation/distiller.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/distillation/utils.py | https://github.com/facebookresearch/XLM | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/lightning_base.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/distillation/train.py | https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/distillation/run_squad_w_distillation.py | https://www.github.com/nvidia/apex | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/lightning_base.py | 
Bert-text-classification_for_PyTorch/transformers/examples/research_projects/distillation/run_squad_w_distillation.py | https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/distillation/run_squad_w_distillation.py | https://code.visualstudio.com/docs/python/debugging#_attach-to-a-local-script | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/distillation/distiller.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/distillation/lm_seqs_dataset.py | https://github.com/facebookresearch/XLM | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/distillation/grouped_batch_sampler.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/distillation/grouped_batch_sampler.py | https://github.com/pytorch/vision/blob/master/references/detection/group_by_aspect_ratio.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/distillation/distiller.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/distillation/distiller.py | https://github.com/facebookresearch/XLM | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/distillation/distiller.py | https://www.github.com/nvidia/apex | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/distillation/distiller.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/distillation/distiller.py | https://github.com/peterliht/knowledge-distillation-pytorch/blob/master/model/net.py#L100 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/distillation/distiller.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/distillation/distiller.py | https://github.com/peterliht/knowledge-distillation-pytorch/issues/2 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/deebert/test_glue_deebert.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/deebert/test_glue_deebert.py | https://github.com/huggingface/transformers/issues/10560 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/deebert/run_glue_deebert.py | https://www.github.com/nvidia/apex | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/lightning_base.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/deebert/run_glue_deebert.py | https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/deebert/run_glue_deebert.py | https://code.visualstudio.com/docs/python/debugging#_attach-to-a-local-script | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/codeparrot/scripts/human_eval.py | 
Bert-text-classification_for_PyTorch/transformers/examples/research_projects/codeparrot/scripts/human_eval.py | https://stackoverflow.com/questions/60804599/python-multiprocessing-keeps-spawning-the-whole-script | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bertology/run_prune_gpt.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/bertology/run_prune_gpt.py | https://github.com/huggingface/transformers/blob/783d7d2629e97c5f0c5f9ef01b8c66410275c204/examples/research_projects/bertology/run_bertology.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bertology/run_prune_gpt.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/bertology/run_prune_gpt.py | http://arxiv.org/abs/1905.10650 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/bertology/run_prune_gpt.py | https://code.visualstudio.com/docs/python/debugging#_attach-to-a-local-script | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bertology/run_prune_gpt.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/bertology/run_bertology.py | http://arxiv.org/abs/1905.10650 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bertology/run_bertology.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/bertology/run_bertology.py | https://github.com/pmichel31415/are-16-heads-really-better-than-1 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/bertology/run_bertology.py | https://code.visualstudio.com/docs/python/debugging#_attach-to-a-local-script | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/bert-loses-patience/run_glue_with_pabee.py | https://www.github.com/nvidia/apex | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/lightning_base.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/bert-loses-patience/run_glue_with_pabee.py | https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/bert-loses-patience/run_glue_with_pabee.py | https://code.visualstudio.com/docs/python/debugging#_attach-to-a-local-script | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bertabs/utils_summarization.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/bertabs/utils_summarization.py | https://cs.nyu.edu/~kcho/ | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/examples/research_projects/bertabs/utils_summarization.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/bertabs/utils_summarization.py | https://github.com/abisee/cnn-dailymail/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bertabs/utils_summarization.py | Bert-text-classification_for_PyTorch/transformers/examples/research_projects/bertabs/utils_summarization.py | https://github.com/nlpyang/PreSumm | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/seq2seq/xla_spawn.py | Bert-text-classification_for_PyTorch/transformers/examples/pytorch/xla_spawn.py | https://github.com/pytorch/pytorch/blob/master/torch/distributed/launch.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert-text-classification_for_PyTorch/transformers/examples/pytorch/translation/run_translation_no_trainer.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert-text-classification_for_PyTorch/transformers/examples/pytorch/translation/run_translation_no_trainer.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert-text-classification_for_PyTorch/transformers/examples/pytorch/translation/run_translation.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert-text-classification_for_PyTorch/transformers/examples/pytorch/translation/run_translation.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert-text-classification_for_PyTorch/transformers/examples/pytorch/token-classification/run_ner_no_trainer.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert-text-classification_for_PyTorch/transformers/examples/pytorch/token-classification/run_ner_no_trainer.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert-text-classification_for_PyTorch/transformers/examples/pytorch/token-classification/run_ner.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert-text-classification_for_PyTorch/transformers/examples/pytorch/token-classification/run_ner.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/question-answering/run_qa.py | Bert-text-classification_for_PyTorch/transformers/examples/pytorch/token-classification/run_ner.py | https://huggingface.co/transformers/index.html#supported-frameworks | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/text-generation/run_generation.py | Bert-text-classification_for_PyTorch/transformers/examples/pytorch/text-generation/run_generation.py | 
https://github.com/rusiaaman/XLNet-gen#methodology | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/text-generation/run_generation.py | Bert-text-classification_for_PyTorch/transformers/examples/pytorch/text-generation/run_generation.py | https://medium.com/@amanrusia/xlnet-speaks-comparison-to-gpt-2-ea1a4e9ba39e | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | Bert-text-classification_for_PyTorch/transformers/examples/pytorch/text-classification/run_xnli.py | https://code.visualstudio.com/docs/python/debugging#_attach-to-a-local-script | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert-text-classification_for_PyTorch/transformers/examples/pytorch/text-classification/run_glue_no_trainer.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/text-classification/run_flax_glue.py | Bert-text-classification_for_PyTorch/transformers/examples/pytorch/text-classification/run_glue_no_trainer.py | https://huggingface.co/docs/datasets/package_reference/main_classes.html#datasets.Dataset.unique | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert-text-classification_for_PyTorch/transformers/examples/pytorch/text-classification/run_glue.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/text-classification/run_flax_glue.py | Bert-text-classification_for_PyTorch/transformers/examples/pytorch/text-classification/run_glue.py | https://huggingface.co/docs/datasets/package_reference/main_classes.html#datasets.Dataset.unique | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert-text-classification_for_PyTorch/transformers/examples/pytorch/summarization/run_summarization_no_trainer.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert-text-classification_for_PyTorch/transformers/examples/pytorch/summarization/run_summarization_no_trainer.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert-text-classification_for_PyTorch/transformers/examples/pytorch/summarization/run_summarization.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert-text-classification_for_PyTorch/transformers/examples/pytorch/summarization/run_summarization.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert-text-classification_for_PyTorch/transformers/examples/pytorch/question-answering/run_seq2seq_qa.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert-text-classification_for_PyTorch/transformers/examples/pytorch/question-answering/run_seq2seq_qa.py | 
https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert-text-classification_for_PyTorch/transformers/examples/pytorch/question-answering/run_qa_no_trainer.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert-text-classification_for_PyTorch/transformers/examples/pytorch/question-answering/run_qa_no_trainer.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert-text-classification_for_PyTorch/transformers/examples/pytorch/question-answering/run_qa_beam_search_no_trainer.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert-text-classification_for_PyTorch/transformers/examples/pytorch/question-answering/run_qa_beam_search_no_trainer.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert-text-classification_for_PyTorch/transformers/examples/pytorch/question-answering/run_qa_beam_search.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert-text-classification_for_PyTorch/transformers/examples/pytorch/question-answering/run_qa_beam_search.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert-text-classification_for_PyTorch/transformers/examples/pytorch/question-answering/run_qa.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert-text-classification_for_PyTorch/transformers/examples/pytorch/question-answering/run_qa.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/question-answering/run_qa.py | Bert-text-classification_for_PyTorch/transformers/examples/pytorch/question-answering/run_qa.py | https://huggingface.co/transformers/index.html#supported-frameworks | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert-text-classification_for_PyTorch/transformers/examples/pytorch/multiple-choice/run_swag_no_trainer.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert-text-classification_for_PyTorch/transformers/examples/pytorch/multiple-choice/run_swag_no_trainer.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert-text-classification_for_PyTorch/transformers/examples/pytorch/multiple-choice/run_swag.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | 
Bert-text-classification_for_PyTorch/transformers/examples/pytorch/multiple-choice/run_swag.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert-text-classification_for_PyTorch/transformers/examples/pytorch/language-modeling/run_plm.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert-text-classification_for_PyTorch/transformers/examples/pytorch/language-modeling/run_plm.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_mlm_flax.py | Bert-text-classification_for_PyTorch/transformers/examples/pytorch/language-modeling/run_mlm_no_trainer.py | https://huggingface.co/models?filter=fill-mask | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert-text-classification_for_PyTorch/transformers/examples/pytorch/language-modeling/run_mlm_no_trainer.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert-text-classification_for_PyTorch/transformers/examples/pytorch/language-modeling/run_mlm_no_trainer.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_mlm_flax.py | Bert-text-classification_for_PyTorch/transformers/examples/pytorch/language-modeling/run_mlm.py | https://huggingface.co/models?filter=fill-mask | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert-text-classification_for_PyTorch/transformers/examples/pytorch/language-modeling/run_mlm.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert-text-classification_for_PyTorch/transformers/examples/pytorch/language-modeling/run_mlm.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_clm_flax.py | Bert-text-classification_for_PyTorch/transformers/examples/pytorch/language-modeling/run_clm_no_trainer.py | https://huggingface.co/models?filter=text-generation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert-text-classification_for_PyTorch/transformers/examples/pytorch/language-modeling/run_clm_no_trainer.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert-text-classification_for_PyTorch/transformers/examples/pytorch/language-modeling/run_clm_no_trainer.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_clm_flax.py | Bert-text-classification_for_PyTorch/transformers/examples/pytorch/language-modeling/run_clm.py | https://huggingface.co/models?filter=text-generation | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert-text-classification_for_PyTorch/transformers/examples/pytorch/language-modeling/run_clm.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert-text-classification_for_PyTorch/transformers/examples/pytorch/language-modeling/run_clm.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/image-pretraining/run_mim.py | Bert-text-classification_for_PyTorch/transformers/examples/pytorch/image-pretraining/run_mim.py | https://github.com/microsoft/SimMIM/blob/main/data/data_simmim.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/image-pretraining/README.md | Bert-text-classification_for_PyTorch/transformers/examples/pytorch/image-pretraining/run_mae.py | https://arxiv.org/abs/2111.06377 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/image-pretraining/run_mae.py | Bert-text-classification_for_PyTorch/transformers/examples/pytorch/image-pretraining/run_mae.py | https://github.com/facebookresearch/mae/blob/main/main_pretrain.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/vision/run_image_classification.py | Bert-text-classification_for_PyTorch/transformers/examples/pytorch/contrastive-image-text/run_clip.py | https://huggingface.co/models?filter=v | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_mlm_flax.py | Bert-text-classification_for_PyTorch/transformers/examples/pytorch/contrastive-image-text/run_clip.py | https://huggingface.co/models?filter=fill-mask | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert-text-classification_for_PyTorch/transformers/examples/pytorch/contrastive-image-text/run_clip.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert-text-classification_for_PyTorch/transformers/examples/pytorch/contrastive-image-text/run_clip.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/run_ner.sh | Bert-text-classification_for_PyTorch/transformers/examples/legacy/token-classification/run.sh | https://drive.google.com/drive/folders/1kC0I2UGl2ltrluI9NqDjaQJGw5iliw_J | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/seq2seq/xla_spawn.py | Bert-text-classification_for_PyTorch/transformers/examples/legacy/seq2seq/xla_spawn.py | https://github.com/pytorch/pytorch/blob/master/torch/distributed/launch.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_transfo_xl.py | Bert-text-classification_for_PyTorch/transformers/examples/legacy/run_transfo_xl.py | https://github.com/kimiyoung/transformer-xl | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_transfo_xl.py | Bert-text-classification_for_PyTorch/transformers/examples/legacy/run_transfo_xl.py | https://github.com/kimiyoung/transformer-xl/blob/master/pytorch/eval.py | 源码实现 | -| 
开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | Bert-text-classification_for_PyTorch/transformers/examples/legacy/run_transfo_xl.py | https://code.visualstudio.com/docs/python/debugging#_attach-to-a-local-script | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | Bert-text-classification_for_PyTorch/transformers/examples/legacy/run_swag.py | https://github.com/google-research/bert/issues/38 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | Bert-text-classification_for_PyTorch/transformers/examples/legacy/run_swag.py | https://www.github.com/nvidia/apex | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/lightning_base.py | Bert-text-classification_for_PyTorch/transformers/examples/legacy/run_swag.py | https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | Bert-text-classification_for_PyTorch/transformers/examples/legacy/run_swag.py | https://code.visualstudio.com/docs/python/debugging#_attach-to-a-local-script | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_openai_gpt.py | Bert-text-classification_for_PyTorch/transformers/examples/legacy/run_openai_gpt.py | https://github.com/huggingface/pytorch-openai-transformer-lm/blob/master/train.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_openai_gpt.py | Bert-text-classification_for_PyTorch/transformers/examples/legacy/run_openai_gpt.py | https://github.com/openai/finetune-transformer-lm/blob/master/train.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | Bert-text-classification_for_PyTorch/transformers/examples/legacy/run_openai_gpt.py | https://code.visualstudio.com/docs/python/debugging#_attach-to-a-local-script | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_chinese_ref.py | Bert-text-classification_for_PyTorch/transformers/examples/legacy/run_chinese_ref.py | https://en.wikipedia.org/wiki/CJK_Unified_Ideographs_ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_chinese_ref.py | Bert-text-classification_for_PyTorch/transformers/examples/legacy/run_chinese_ref.py | https://github.com/ymcui/Chinese-BERT-wwm | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_chinese_ref.py | Bert-text-classification_for_PyTorch/transformers/examples/legacy/run_chinese_ref.py | https://github.com/HIT-SCIR/ltp | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_camembert.py | Bert-text-classification_for_PyTorch/transformers/examples/legacy/run_camembert.py | https://github.com/pytorch/fairseq/blob/master/fairseq/models/roberta/hub_interface.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | Bert-text-classification_for_PyTorch/transformers/examples/legacy/question-answering/run_squad.py | https://www.github.com/nvidia/apex | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/lightning_base.py | 
Bert-text-classification_for_PyTorch/transformers/examples/legacy/question-answering/run_squad.py | https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | Bert-text-classification_for_PyTorch/transformers/examples/legacy/question-answering/run_squad.py | https://code.visualstudio.com/docs/python/debugging#_attach-to-a-local-script | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/run_ner.sh | Bert-text-classification_for_PyTorch/transformers/examples/legacy/pytorch-lightning/run_ner.sh | https://drive.google.com/drive/folders/1kC0I2UGl2ltrluI9NqDjaQJGw5iliw_J | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/run_ner.py | Bert-text-classification_for_PyTorch/transformers/examples/legacy/pytorch-lightning/run_ner.py | https://github.com/PyTorchLightning/pytorch-lightning/blob/master/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/run_ner.py | Bert-text-classification_for_PyTorch/transformers/examples/legacy/pytorch-lightning/run_ner.py | https://github.com/huggingface/transformers/issues/3159 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/run_ner.py | Bert-text-classification_for_PyTorch/transformers/examples/legacy/pytorch-lightning/run_ner.py | https://github.com/PyTorchLightning/pytorch-lightning/blob/master | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/lightning_base.py | Bert-text-classification_for_PyTorch/transformers/examples/legacy/pytorch-lightning/lightning_base.py | https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/vision/run_image_classification.py | Bert-text-classification_for_PyTorch/transformers/examples/flax/vision/run_image_classification.py | https://huggingface.co/models?filter=vit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert-text-classification_for_PyTorch/transformers/examples/flax/token-classification/run_flax_ner.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert-text-classification_for_PyTorch/transformers/examples/flax/token-classification/run_flax_ner.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert-text-classification_for_PyTorch/transformers/examples/flax/text-classification/run_flax_glue.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/text-classification/run_flax_glue.py | Bert-text-classification_for_PyTorch/transformers/examples/flax/text-classification/run_flax_glue.py | https://huggingface.co/docs/datasets/package_reference/main_classes.html#datasets.Dataset.unique | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert-text-classification_for_PyTorch/transformers/examples/flax/summarization/run_summarization_flax.py | https://huggingface.co/datasets/ | 
模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert-text-classification_for_PyTorch/transformers/examples/flax/summarization/run_summarization_flax.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/run_flax_speech_recognition_seq2seq.py | Bert-text-classification_for_PyTorch/transformers/examples/flax/summarization/run_summarization_flax.py | https://github.com/google/flax/blob/87a211135c6a377c8f29048a1cac3840e38b9da4/examples/wmt/train.py#L104 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert-text-classification_for_PyTorch/transformers/examples/flax/question-answering/run_qa.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert-text-classification_for_PyTorch/transformers/examples/flax/question-answering/run_qa.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/question-answering/run_qa.py | Bert-text-classification_for_PyTorch/transformers/examples/flax/question-answering/run_qa.py | https://huggingface.co/transformers/index.html#supported-frameworks | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/t5_tokenizer_model.py | Bert-text-classification_for_PyTorch/transformers/examples/flax/language-modeling/t5_tokenizer_model.py | https://github.com/yandex-research/DeDLOC/blob/main/sahajbert/tokenizer/tokenizer_model.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_t5_mlm_flax.py | Bert-text-classification_for_PyTorch/transformers/examples/flax/language-modeling/run_t5_mlm_flax.py | https://huggingface.co/models?filter=t5 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_t5_mlm_flax.py | Bert-text-classification_for_PyTorch/transformers/examples/flax/language-modeling/run_t5_mlm_flax.py | https://github.com/google-research/text-to-text-transfer-transformer/blob/84f8bcc14b5f2c03de51bd3587609ba8f6bbd1cd/t5/data/preprocessors.py#L2466 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_t5_mlm_flax.py | Bert-text-classification_for_PyTorch/transformers/examples/flax/language-modeling/run_t5_mlm_flax.py | https://arxiv.org/pdf/1910.10683.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_t5_mlm_flax.py | Bert-text-classification_for_PyTorch/transformers/examples/flax/language-modeling/run_t5_mlm_flax.py | https://github.com/google-research/text-to-text-transfer-transformer/blob/master/t5/data/preprocessors.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_t5_mlm_flax.py | Bert-text-classification_for_PyTorch/transformers/examples/flax/language-modeling/run_t5_mlm_flax.py | https://github.com/google-research/text-to-text-transfer-transformer/blob/84f8bcc14b5f2c03de51bd3587609ba8f6bbd1cd/t5/data/preprocessors.py#L2682 | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert-text-classification_for_PyTorch/transformers/examples/flax/language-modeling/run_t5_mlm_flax.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert-text-classification_for_PyTorch/transformers/examples/flax/language-modeling/run_t5_mlm_flax.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_mlm_flax.py | Bert-text-classification_for_PyTorch/transformers/examples/flax/language-modeling/run_t5_mlm_flax.py | https://github.com/deepmind/optax/blob/ed02befef9bf81cbbf236be3d2b0e032e9ed4a40/optax/_src/alias.py#L74 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_mlm_flax.py | Bert-text-classification_for_PyTorch/transformers/examples/flax/language-modeling/run_mlm_flax.py | https://huggingface.co/models?filter=fill-mask | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert-text-classification_for_PyTorch/transformers/examples/flax/language-modeling/run_mlm_flax.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert-text-classification_for_PyTorch/transformers/examples/flax/language-modeling/run_mlm_flax.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_mlm_flax.py | Bert-text-classification_for_PyTorch/transformers/examples/flax/language-modeling/run_mlm_flax.py | https://github.com/deepmind/optax/blob/ed02befef9bf81cbbf236be3d2b0e032e9ed4a40/optax/_src/alias.py#L74 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_clm_flax.py | Bert-text-classification_for_PyTorch/transformers/examples/flax/language-modeling/run_clm_flax.py | https://huggingface.co/models?filter=text-generation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert-text-classification_for_PyTorch/transformers/examples/flax/language-modeling/run_clm_flax.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert-text-classification_for_PyTorch/transformers/examples/flax/language-modeling/run_clm_flax.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_mlm_flax.py | Bert-text-classification_for_PyTorch/transformers/examples/flax/language-modeling/run_clm_flax.py | https://github.com/deepmind/optax/blob/ed02befef9bf81cbbf236be3d2b0e032e9ed4a40/optax/_src/alias.py#L74 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert-text-classification_for_PyTorch/transformers/examples/flax/image-captioning/run_image_captioning_flax.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | 
Bert-text-classification_for_PyTorch/transformers/examples/flax/image-captioning/run_image_captioning_flax.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/run_flax_speech_recognition_seq2seq.py | Bert-text-classification_for_PyTorch/transformers/examples/flax/image-captioning/run_image_captioning_flax.py | https://github.com/google/flax/blob/87a211135c6a377c8f29048a1cac3840e38b9da4/examples/wmt/train.py#L104 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docker/transformers-pytorch-tpu/Dockerfile | Bert-text-classification_for_PyTorch/transformers/docker/transformers-pytorch-tpu/Dockerfile | https://github.com/conda/conda/issues/8385 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/.circleci/create_circleci_config.py | Bert-text-classification_for_PyTorch/transformers/docker/transformers-pytorch-gpu/Dockerfile | https://github.com/facebookresearch/detectron2.g | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/.circleci/create_circleci_config.py | Bert-text-classification_for_PyTorch/transformers/docker/transformers-doc-builder/Dockerfile | https://github.com/facebookresearch/detectron2.g | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/.circleci/create_circleci_config.py | Bert-text-classification_for_PyTorch/transformers/docker/transformers-all-latest-gpu/Dockerfile | https://github.com/facebookresearch/detectron2.g | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | Bert-text-classification_for_PyTorch/run_xnli.py | https://code.visualstudio.com/docs/python/debugging#_attach-to-a-local-script | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert-text-classification_for_PyTorch/run_glue_no_trainer.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/text-classification/run_flax_glue.py | Bert-text-classification_for_PyTorch/run_glue_no_trainer.py | https://huggingface.co/docs/datasets/package_reference/main_classes.html#datasets.Dataset.unique | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert-text-classification_for_PyTorch/run_glue.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/text-classification/run_flax_glue.py | Bert-text-classification_for_PyTorch/run_glue.py | https://huggingface.co/docs/datasets/package_reference/main_classes.html#datasets.Dataset.unique | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/text-classification/README.md | Bert-text-classification_for_PyTorch/glue.py | https://gluebenchmark.com/ | 模型相关说明 | -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/tests/vit_mae/test_modeling_vit_mae.py | https://discuss.pytorch.org/t/random-seed-that-spans-across-devices/19735 | 模型相关说明 | -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/tests/utils/test_utils_check_copies.py | https://huggingface.co/transformers/master/model_doc/albert.ht | 模型相关说明 | -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/src/transformers/utils/fx.py | 
https://github.com/pytorch/pytorch/pull/59569 | 源码实现 | -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/src/transformers/utils/__init__.py | https://huggingface.co/transformers/installation.html#installing-from-source | 模型相关说明 | -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/src/transformers/utils/__init__.py | https://huggingface.co/transformers/examples.html | 模型相关说明 | -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/src/transformers/trainer_tf.py | https://github.com/huggingface/transformers/tree/master/examples/tensorflow | 源码实现 | -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/src/transformers/pipelines/__init__.py | https://github.com/kpu/kenlm/archive/master.zip | 源码实现 | -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wavlm/configuration_wavlm.py | https://huggingface.co/facebook/wavlm-base-960h | 模型相关说明 | -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vilt/feature_extraction_vilt.py | https://github.com/dandelin/ViLT/blob/3db8b5035464afee84d951bf6322e1b27f1d072d/vilt/transforms/utils.py#L5 | 源码实现 | -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/src/transformers/models/van/modeling_van.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/layers/drop.py | 源码实现 | -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/src/transformers/models/van/configuration_van.py | https://huggingface.co/van-base | 模型相关说明 | -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | https://huggingface.co/facebook/unispeech_sat-base-960h | 模型相关说明 | -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | https://huggingface.co/facebook/unispeech_sat-base-960h | 模型相关说明 | -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/src/transformers/models/unispeech/configuration_unispeech.py | https://huggingface.co/facebook/unispeech-base-960h | 模型相关说明 | -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/src/transformers/models/tapas/modeling_tapas.py | https://github.com/rusty1s/pytorch_scatter | 源码实现 | -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/src/transformers/models/segformer/modeling_segformer.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/layers/drop.py | 源码实现 | -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/src/transformers/models/rag/modeling_tf_rag.py | https://github.com/huggingface/transformers/blob/master/src/transformers/models/dpr/modeling_tf_dpr.py#L91 | 源码实现 | -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/src/transformers/models/rag/modeling_tf_rag.py | https://github.com/huggingface/transformers/blob/master/src/transformers/modeling_tf_bart.py | 源码实现 | -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mpnet/configuration_mpnet.py | https://huggingface.co/mpnet-base | 模型相关说明 | -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/src/transformers/models/imagegpt/configuration_imagegpt.py | https://huggingface.co/imagegpt | 模型相关说明 | -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/src/transformers/models/flaubert/tokenization_flaubert.py | https://github.com/benjaminp/six | 源码实现 | -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/src/transformers/models/ctrl/configuration_ctrl.py | 
https://huggingface.co/ctrl | 模型相关说明 | -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convnext/modeling_convnext.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/layers/drop.py | 源码实现 | -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/convert_bert_original_tf2_checkpoint_to_pytorch.py | https://github.com/tensorflow/models/tree/master/official/nlp/bert | 源码实现 | -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/src/transformers/models/beit/modeling_beit.py | https://github.com/rwightman/pytorch-image-models/blob/a2727c1bf78ba0d7b5727f5f95e37fb7f8866b1f/timm/models/layers/drop.py | 源码实现 | -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/src/transformers/models/beit/configuration_beit.py | https://huggingface.co/microsoft/beit-base-patch16-224-in22k | 模型相关说明 | -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/src/transformers/generation_flax_utils.py | https://github.com/google/flax/blob/master/examples/wmt/train.py#L254 | 源码实现 | -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/src/transformers/file_utils.py | https://github.com/rusty1s/pytorch_scatter | 源码实现 | -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/src/transformers/file_utils.py | https://github.com/tensorflow/tensorflow/blob/00fad90125b18b80fe054de1055770cfb8fe4ba3/tensorflow/python/keras/engine/network.py#L1380 | 源码实现 | -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/src/transformers/file_utils.py | https://huggingface.co/sgugger/my-finetuned-bert | 模型相关说明 | -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/src/transformers/data/datasets/language_modeling.py | https://github.com/huggingface/transformers/blob/master/examples/pytorch/language-modeling/run_mlm.py | 源码实现 | -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/src/transformers/data/datasets/language_modeling.py | https://github.com/huggingface/transformers/blob/master/examples/pytorch/language-modeling/run_mlm_wwm.py | 源码实现 | -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/setup.py | https://github.com/allenai/allennlp/blob/master/setup.py | 源码实现 | -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/scripts/stale.py | https://github.com/huggingface/transformers/blob/master/CONTRIBUTING.md | 源码实现 | -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/examples/tensorflow/language-modeling/run_mlm.py | https://huggingface.co/docs/datasets/package_reference/main_classes.html#datasets.Dataset.map | 模型相关说明 | -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/examples/tensorflow/language-modeling/run_clm.py | https://huggingface.co/docs/datasets/package_reference/main_classes.html#datasets.Dataset.map | 模型相关说明 | -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/examples/research_projects/jax-projects/model_parallel/run_clm_mp.py | https://huggingface.co/docs/datasets/package_reference/main_classes.html#datasets.Dataset.map | 模型相关说明 | -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/examples/pytorch/language-modeling/run_plm.py | https://huggingface.co/docs/datasets/package_reference/main_classes.html#datasets.Dataset.map | 模型相关说明 | -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/examples/pytorch/language-modeling/run_mlm_no_trainer.py | https://huggingface.co/docs/datasets/package_reference/main_classes.html#datasets.Dataset.map | 模型相关说明 | -| 开发引入 | / 
|Bert-text-classification_for_PyTorch/transformers/examples/pytorch/language-modeling/run_mlm.py | https://huggingface.co/docs/datasets/package_reference/main_classes.html#datasets.Dataset.map | 模型相关说明 | -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/examples/pytorch/language-modeling/run_clm_no_trainer.py | https://huggingface.co/docs/datasets/package_reference/main_classes.html#datasets.Dataset.map | 模型相关说明 | -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/examples/pytorch/language-modeling/run_clm.py | https://huggingface.co/docs/datasets/package_reference/main_classes.html#datasets.Dataset.map | 模型相关说明 | -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/examples/flax/language-modeling/run_t5_mlm_flax.py | https://huggingface.co/docs/datasets/package_reference/main_classes.html#datasets.Dataset.map | 模型相关说明 | -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/examples/flax/language-modeling/run_mlm_flax.py | https://huggingface.co/docs/datasets/package_reference/main_classes.html#datasets.Dataset.map | 模型相关说明 | -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/examples/flax/language-modeling/run_clm_flax.py | https://huggingface.co/docs/datasets/package_reference/main_classes.html#datasets.Dataset.map | 模型相关说明 | -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/docker/transformers-pytorch-gpu/Dockerfile | https://data.pyg.org/whl/torch- | 模型相关说明 | -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/docker/transformers-doc-builder/Dockerfile | https://data.pyg.org/whl/torch- | 模型相关说明 | -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/docker/transformers-all-latest-gpu/Dockerfile | https://data.pyg.org/whl/torch- | 模型相关说明 | -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/examples/flax/vision/requirements.txt | https://download.pytorch.org/whl/torch_stable.html | 相关依赖 | -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/examples/research_projects/lxmert/requirements.txt | https://github.com/huggingface/transformers.git | 相关依赖 | -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/examples/research_projects/movement-pruning/requirements.txt | https://github.com/huggingface/transformers.git@352d5472b0c1dec0f420d606d16747d851b4bda8#egg=transformers | 相关依赖 | -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/examples/research_projects/jax-projects/big_bird/requirements.txt | https://github.com/huggingface/transformers@master | 相关依赖 | -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/examples/research_projects/jax-projects/hybrid_clip/requirements.txt | https://download.pytorch.org/whl/torch_stable.html | 相关依赖 | -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/examples/research_projects/visual_bert/requirements.txt | https://github.com/huggingface/transformers.git | 相关依赖 | -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/tests/sagemaker/scripts/pytorch/requirements.txt | https://github.com/huggingface/transformers.git@master | 相关依赖 | -| 开发引入 | / |Bert-text-classification_for_PyTorch/transformers/tests/sagemaker/scripts/tensorflow/requirements.txt | https://github.com/huggingface/transformers.git@master | 相关依赖 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | 
+|---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|----------------------| +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/glue.py | https://gluebenchmark.com/ | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/.circleci/config.yml | ci@dummy.com | user.email配置邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/.circleci/config.yml | https://pytorch-geometric.com/whl/torch-1.11.0+cpu.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/CITATION.cff | https://www.aclweb.org/anthology/2020.emnlp-demos.6 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/docker/transformers-all-latest-gpu/Dockerfile | https://data.pyg.org/whl/torch-$(python3 -c "from torch import version; print(version.__version__.split('+')[0])")+cu102.html | 三方库下载 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/docker/transformers-doc-builder/Dockerfile | https://data.pyg.org/whl/torch-$(python -c "from torch import version; print(version.__version__.split('+')[0])")+cpu.html | 三方库下载 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/docker/transformers-doc-builder/Dockerfile | https://pypi.ngc.nvidia.com | 三方库下载 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/docker/transformers-pytorch-gpu/Dockerfile | https://data.pyg.org/whl/torch-$(python3 -c "from torch import version; print(version.__version__.split('+')[0])")+cu102.html | 三方库下载 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/docker/transformers-pytorch-tpu/Dockerfile | https://repo.anaconda.com/miniconda/Miniconda3-4.7.12-Linux-x86_64.sh | miniconda链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/examples/research_projects/fsner/setup.py | msi.sayef@gmail.com | 作者邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/examples/research_projects/lxmert/utils.py | https://s3.amazonaws.com/models.huggingface.co/bert | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/examples/research_projects/performer/modeling_flax_performer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/examples/research_projects/performer/modeling_flax_performer.py | https://arxiv.org/abs/1607.06450 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py | https://s3.amazonaws.com/models.huggingface.co/bert/pplm/discriminators/clickbait_classifier_head.pt | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py | https://s3.amazonaws.com/models.huggingface.co/bert/pplm/discriminators/SST_classifier_head.pt | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py | https://s3.amazonaws.com/models.huggingface.co/bert/pplm/bow/technology.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py | https://s3.amazonaws.com/models.huggingface.co/bert/pplm/bow/space.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py | https://s3.amazonaws.com/models.huggingface.co/bert/pplm/bow/science.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py | https://s3.amazonaws.com/models.huggingface.co/bert/pplm/bow/religion.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py | https://s3.amazonaws.com/models.huggingface.co/bert/pplm/bow/politics.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py | https://s3.amazonaws.com/models.huggingface.co/bert/pplm/bow/military.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py | https://s3.amazonaws.com/models.huggingface.co/bert/pplm/bow/legal.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/examples/research_projects/quantization-qdqbert/Dockerfile | https://pypi.ngc.nvidia.com | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/examples/research_projects/rag/finetune_rag.py | https://docs.ray.io/en/master/cluster/index.html | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/examples/research_projects/rag-end2end-retriever/finetune_rag.py | https://docs.ray.io/en/master/cluster/index.html | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/examples/research_projects/visual_bert/utils.py | https://s3.amazonaws.com/models.huggingface.co/bert | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/scripts/fsmt/convert-facebook-wmt19.sh | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.ru-en.ensemble.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/scripts/fsmt/convert-facebook-wmt19.sh | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.ru-en.ensemble.tar.gz | 模型相关说明 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/scripts/fsmt/convert-facebook-wmt19.sh | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-de.joined-dict.ensemble.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/scripts/fsmt/convert-facebook-wmt19.sh | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-de.joined-dict.ensemble.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py | http://matrix.statmt.org/matrix/output/1902?run_id=6750 | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py | http://matrix.statmt.org/matrix/output/1909?run_id=6862 | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py | http://matrix.statmt.org/matrix/output/1914?run_id=6724 | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py | http://matrix.statmt.org/matrix/output/1907?run_id=6937 | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/commands/convert.py | https://www.tensorflow.org/install/ | 三方库下载 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/file_utils.py | https://s3.amazonaws.com/models.huggingface.co/bert | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/file_utils.py | https://pytorch.org/get-started/locally/ | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/file_utils.py | https://pandas.pydata.org/pandas-docs/stable/getting_started/install.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/file_utils.py | https://www.tensorflow.org/install | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/file_utils.py | https://pypi.ngc.nvidia.com | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/integrations.py | https://app.sigopt.com/experiment/{experiment.id} | experiment地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/modeling_flax_pytorch_utils.py | https://pytorch.org/ | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/modeling_flax_pytorch_utils.py | https://pytorch.org/ | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/modeling_flax_pytorch_utils.py | https://flax.readthedocs.io/en/latest/installation.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/modeling_flax_pytorch_utils.py | https://flax.readthedocs.io/en/latest/installation.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/modeling_tf_pytorch_utils.py | https://www.tensorflow.org/install/ | 三方库链接 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/modeling_tf_pytorch_utils.py | https://pytorch.org/ | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/modeling_utils.py | https://www.tensorflow.org/install/ | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/modeling_utils.py | https://pytorch.org/ | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/modeling_utils.py | https://pytorch.org/ | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/modeling_utils.py | https://flax.readthedocs.io/en/latest/installation.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/albert/modeling_albert.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/albert/modeling_albert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/albert/modeling_flax_albert.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/albert/modeling_tf_albert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bart/modeling_bart.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bart/modeling_bart.py | https://arxiv.org/abs/1910.13461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bart/modeling_flax_bart.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bart/modeling_tf_bart.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/beit/convert_beit_unilm_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 图片地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/beit/modeling_beit.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/beit/modeling_flax_beit.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/modeling_bert.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/modeling_bert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/modeling_flax_bert.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert/modeling_tf_bert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert_generation/modeling_bert_generation.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bert_generation/modeling_bert_generation.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/big_bird/modeling_big_bird.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/big_bird/modeling_big_bird.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/big_bird/modeling_flax_big_bird.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/bigbird_pegasus/modeling_bigbird_pegasus.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_blenderbot.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_flax_blenderbot.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_tf_blenderbot.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_blenderbot_small.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_flax_blenderbot_small.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_tf_blenderbot_small.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/camembert/modeling_camembert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/camembert/modeling_tf_camembert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/canine/modeling_canine.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/canine/modeling_canine.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/clip/modeling_clip.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/clip/modeling_flax_clip.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/clip/modeling_tf_clip.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convbert/modeling_convbert.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convbert/modeling_convbert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convbert/modeling_tf_convbert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 图片地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_xlarge_22k_225.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_xlarge_22k_1k_384_ema.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_xlarge_22k_1k_224_ema.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_small_1k_224_ema.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_large_22k_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_large_22k_1k_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_large_22k_1k_224.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_large_1k_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_large_1k_224_ema.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_base_22k_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_base_22k_1k_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_base_22k_1k_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_base_1k_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_base_1k_224_ema.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_tiny_1k_224_ema.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_tiny_1k_224_ema.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convnext/modeling_convnext.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/convnext/modeling_tf_convnext.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/ctrl/modeling_ctrl.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/ctrl/modeling_tf_ctrl.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/ctrl/tokenization_ctrl.py | https://raw.githubusercontent.com/salesforce/ctrl/master/ctrl-vocab.json | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/ctrl/tokenization_ctrl.py | https://raw.githubusercontent.com/salesforce/ctrl/master/ctrl-merges.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_audio.py | 
https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_audio.py | https://arxiv.org/pdf/2202.03555 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_text.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_text.py | https://arxiv.org/pdf/2202.03555 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta/modeling_deberta.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta/modeling_deberta.py | https://arxiv.org/abs/2006.03654 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta/modeling_tf_deberta.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta/modeling_tf_deberta.py | https://arxiv.org/abs/2006.03654 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta_v2/modeling_deberta_v2.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta_v2/modeling_deberta_v2.py | https://arxiv.org/abs/2006.03654 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta_v2/modeling_tf_deberta_v2.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deberta_v2/modeling_tf_deberta_v2.py | https://arxiv.org/abs/2006.03654 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deit/convert_deit_timm_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 图片地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/deit/modeling_deit.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/detr/convert_detr_original_pytorch_checkpoint_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 图片地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/distilbert/modeling_distilbert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/distilbert/modeling_flax_distilbert.py | 
https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/distilbert/modeling_tf_distilbert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/dit/convert_dit_unilm_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 图片地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/dit/convert_dit_unilm_to_pytorch.py | https://layoutlm.blob.core.windows.net/dit/dit-pts/dit-base-224-p16-500k-62d53a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/dpr/modeling_dpr.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/dpr/modeling_tf_dpr.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/electra/modeling_electra.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/electra/modeling_electra.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/electra/modeling_flax_electra.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/electra/modeling_tf_electra.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/encoder_decoder/modeling_encoder_decoder.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/encoder_decoder/modeling_encoder_decoder.py | https://arxiv.org/abs/1907.12461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/encoder_decoder/modeling_flax_encoder_decoder.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/encoder_decoder/modeling_flax_encoder_decoder.py | https://arxiv.org/abs/1907.12461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/encoder_decoder/modeling_tf_encoder_decoder.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/encoder_decoder/modeling_tf_encoder_decoder.py | https://arxiv.org/abs/1907.12461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/flaubert/modeling_flaubert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/flaubert/modeling_tf_flaubert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/fnet/modeling_fnet.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/fsmt/modeling_fsmt.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/funnel/modeling_funnel.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/funnel/modeling_funnel.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/funnel/modeling_funnel.py | https://arxiv.org/abs/2006.03236 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/funnel/modeling_tf_funnel.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/funnel/modeling_tf_funnel.py | https://arxiv.org/abs/2006.03236 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt_neo/modeling_flax_gpt_neo.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt_neo/modeling_gpt_neo.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt_neo/modeling_gpt_neo.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_flax_gpt2.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_gpt2.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_gpt2.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_tf_gpt2.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gptj/modeling_flax_gptj.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/gptj/modeling_gptj.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/hubert/modeling_hubert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/hubert/modeling_hubert.py | https://arxiv.org/abs/2106.07448 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/hubert/modeling_tf_hubert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/ibert/modeling_ibert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/imagegpt/modeling_imagegpt.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/imagegpt/modeling_imagegpt.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/layoutlm/modeling_layoutlm.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/layoutlm/modeling_layoutlm.py | https://arxiv.org/abs/1912.13318 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/layoutlm/modeling_tf_layoutlm.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/layoutlmv2/modeling_layoutlmv2.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/led/modeling_led.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/led/modeling_led.py | https://arxiv.org/abs/1910.13461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/led/modeling_tf_led.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/longformer/modeling_longformer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/longformer/modeling_longformer.py | https://arxiv.org/abs/2004.05150 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/longformer/modeling_tf_longformer.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/longformer/modeling_tf_longformer.py | https://arxiv.org/abs/2004.05150 | 论文地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/longformer/modeling_tf_longformer.py | https://arxiv.org/abs/2004.05150 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/luke/modeling_luke.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/lxmert/modeling_lxmert.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/lxmert/modeling_lxmert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/lxmert/modeling_lxmert.py | https://arxiv.org/abs/1908.07490 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/lxmert/modeling_tf_lxmert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/lxmert/modeling_tf_lxmert.py | https://arxiv.org/abs/1908.07490 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/m2m_100/modeling_m2m_100.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/marian/convert_marian_tatoeba_to_pytorch.py | https://datahub.io/core/language-codes/r/language-codes-3b2.csv | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/marian/modeling_flax_marian.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/marian/modeling_marian.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/marian/modeling_tf_marian.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/maskformer/convert_maskformer_original_pytorch_checkpoint_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 图片地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/maskformer/modeling_maskformer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mbart/modeling_flax_mbart.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mbart/modeling_mbart.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mbart/modeling_tf_mbart.py | 
https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/megatron_bert/modeling_megatron_bert.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/megatron_bert/modeling_megatron_bert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mmbt/modeling_mmbt.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mobilebert/modeling_mobilebert.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mobilebert/modeling_mobilebert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mobilebert/modeling_tf_mobilebert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mpnet/modeling_mpnet.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/mpnet/modeling_tf_mpnet.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/nystromformer/modeling_nystromformer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/openai/modeling_openai.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/openai/modeling_tf_openai.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_flax_pegasus.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_pegasus.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_tf_pegasus.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/perceiver/convert_perceiver_haiku_to_pytorch.py | https://storage.googleapis.com/perceiver_io/dalmation.jpg | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/perceiver/modeling_perceiver.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/perceiver/modeling_perceiver.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/plbart/modeling_plbart.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/poolformer/convert_poolformer_original_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 图片地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/poolformer/modeling_poolformer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/prophetnet/modeling_prophetnet.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/qdqbert/modeling_qdqbert.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/qdqbert/modeling_qdqbert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/rag/modeling_rag.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/rag/modeling_tf_rag.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/rag/retrieval_rag.py | https://storage.googleapis.com/huggingface-nlp/datasets/wiki_dpr/ | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/realm/modeling_realm.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/realm/modeling_realm.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/reformer/modeling_reformer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/reformer/modeling_reformer.py | https://arxiv.org/abs/2001.04451 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/rembert/modeling_rembert.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/rembert/modeling_rembert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/rembert/modeling_tf_rembert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/resnet/modeling_resnet.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/retribert/modeling_retribert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roberta/modeling_flax_roberta.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roberta/modeling_roberta.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roberta/modeling_tf_roberta.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roformer/modeling_flax_roformer.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roformer/modeling_roformer.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roformer/modeling_roformer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/roformer/modeling_tf_roformer.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/segformer/convert_segformer_original_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 图片地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/segformer/modeling_segformer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/sew/modeling_sew.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/sew/modeling_sew.py | https://arxiv.org/abs/2109.06870 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/sew_d/modeling_sew_d.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/sew_d/modeling_sew_d.py | https://arxiv.org/abs/2109.06870 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/speech_encoder_decoder/modeling_flax_speech_encoder_decoder.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/speech_encoder_decoder/modeling_flax_speech_encoder_decoder.py | https://arxiv.org/abs/1907.12461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/speech_encoder_decoder/modeling_flax_speech_encoder_decoder.py | https://arxiv.org/abs/2104.06678 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/speech_encoder_decoder/modeling_speech_encoder_decoder.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/speech_encoder_decoder/modeling_speech_encoder_decoder.py | https://arxiv.org/abs/1907.12461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/speech_encoder_decoder/modeling_speech_encoder_decoder.py | https://arxiv.org/abs/2104.06678 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/speech_to_text/modeling_speech_to_text.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/speech_to_text/modeling_speech_to_text.py | https://arxiv.org/abs/1910.13461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/speech_to_text/modeling_tf_speech_to_text.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/speech_to_text_2/modeling_speech_to_text_2.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/splinter/modeling_splinter.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/squeezebert/modeling_squeezebert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/squeezebert/modeling_squeezebert.py | https://arxiv.org/abs/2006.11316 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/swin/convert_swin_timm_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 图片地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/swin/modeling_swin.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/modeling_flax_t5.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/modeling_flax_t5.py | https://arxiv.org/abs/1910.10683 | 论文地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | https://arxiv.org/abs/1910.10683 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/modeling_tf_t5.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/t5/modeling_tf_t5.py | https://arxiv.org/abs/1910.10683 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/tapas/modeling_tapas.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/tapas/modeling_tapas.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/tapas/modeling_tf_tapas.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/transfo_xl/modeling_tf_transfo_xl.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/transfo_xl/modeling_transfo_xl.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/transfo_xl/modeling_transfo_xl.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/trocr/modeling_trocr.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/unispeech/modeling_unispeech.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/unispeech/modeling_unispeech.py | https://arxiv.org/abs/2101.07597 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | https://arxiv.org/abs/2006.11477 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/van/modeling_van.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vilt/convert_vilt_original_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 图片地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vilt/convert_vilt_original_to_pytorch.py | https://lil.nlp.cornell.edu/nlvr/exs/ex0_0.jpg | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vilt/convert_vilt_original_to_pytorch.py | https://lil.nlp.cornell.edu/nlvr/exs/ex0_0.jpg | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vilt/modeling_vilt.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_flax_vision_encoder_decoder.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_flax_vision_encoder_decoder.py | https://arxiv.org/abs/1907.12461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_flax_vision_encoder_decoder.py | https://arxiv.org/abs/2109.10282 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_tf_vision_encoder_decoder.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_tf_vision_encoder_decoder.py | https://arxiv.org/abs/1907.12461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_tf_vision_encoder_decoder.py | https://arxiv.org/abs/2109.10282 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_vision_encoder_decoder.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_vision_encoder_decoder.py | https://arxiv.org/abs/1907.12461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_vision_encoder_decoder.py | https://arxiv.org/abs/2109.10282 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_flax_vision_text_dual_encoder.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_flax_vision_text_dual_encoder.py | https://arxiv.org/abs/2111.07991 | 论文地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_flax_vision_text_dual_encoder.py | https://farm3.staticflickr.com/2674/5850229113_4fe05d5265_z.jpg | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_vision_text_dual_encoder.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_vision_text_dual_encoder.py | https://arxiv.org/abs/2111.07991 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/visual_bert/modeling_visual_bert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vit/convert_dino_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 图片地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vit/convert_vit_timm_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 图片地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vit/modeling_flax_vit.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vit/modeling_tf_vit.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vit/modeling_vit.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vit_mae/convert_vit_mae_to_pytorch.py | https://dl.fbaipublicfiles.com/mae/visualize/mae_visualize_vit_base.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vit_mae/convert_vit_mae_to_pytorch.py | https://user-images.githubusercontent.com/11435359/147738734-196fd92f-9260-48d5-ba7e-bf103d29364d.jpg | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/vit_mae/modeling_vit_mae.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_flax_wav2vec2.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_flax_wav2vec2.py | https://arxiv.org/abs/2006.11477 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_tf_wav2vec2.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_wav2vec2.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | 
+| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_wav2vec2.py | https://arxiv.org/abs/2006.11477 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wavlm/modeling_wavlm.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/wavlm/modeling_wavlm.py | https://arxiv.org/abs/2101.07597 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xglm/modeling_flax_xglm.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xglm/modeling_xglm.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm/modeling_tf_xlm.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm/modeling_xlm.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm_roberta/modeling_flax_xlm_roberta.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm_roberta/modeling_tf_xlm_roberta.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm_roberta/modeling_xlm_roberta.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlm_roberta_xl/modeling_xlm_roberta_xl.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlnet/modeling_tf_xlnet.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlnet/modeling_xlnet.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/xlnet/modeling_xlnet.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/models/yoso/modeling_yoso.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/pipelines/base.py | https://www.tensorflow.org/install/ | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/pipelines/base.py | https://pytorch.org/ | 三方库说明 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/src/transformers/trainer_tf.py | https://docs.wandb.com/huggingface | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_{{cookiecutter.lowercase_modelname}}.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_{{cookiecutter.lowercase_modelname}}.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_{{cookiecutter.lowercase_modelname}}.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_{{cookiecutter.lowercase_modelname}}.py | https://arxiv.org/abs/1910.13461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | https://arxiv.org/abs/1910.13461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | https://arxiv.org/abs/1910.13461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/utils/download_glue_data.py | firebase-adminsdk-0khhl@mtl-sentence-representations.iam.gserviceaccount.com | glue数据集diagnostic链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/utils/download_glue_data.py | https://dl.fbaipublicfiles.com/senteval/senteval_data/msr_paraphrase_train.txt | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/utils/download_glue_data.py | 
https://dl.fbaipublicfiles.com/senteval/senteval_data/msr_paraphrase_test.txt | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/utils/download_glue_data.py | https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FWNLI.zip?alt=media&token=068ad0a0-ded7-4bd7-99a5-5e00222e0faf | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/utils/download_glue_data.py | https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FSTS-B.zip?alt=media&token=bddb94a7-8706-4e0d-a694-1109e12273b5 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/utils/download_glue_data.py | https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FSST-2.zip?alt=media&token=aabc5f6b-e466-44a2-b9b4-cf6337f84ac8 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/utils/download_glue_data.py | https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FSNLI.zip?alt=media&token=4afcfbb2-ff0c-4b2d-a09a-dbf07926f4df | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/utils/download_glue_data.py | https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FRTE.zip?alt=media&token=5efa7e85-a0bb-4f19-8ea2-9e1840f077fb | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/utils/download_glue_data.py | https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FQQP.zip?alt=media&token=700c6acf-160d-4d89-81d1-de4191d02cb5 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/utils/download_glue_data.py | https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FQNLIv2.zip?alt=media&token=6fdcf570-0fc5-4631-8456-9505272d1601 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/utils/download_glue_data.py | https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FMNLI.zip?alt=media&token=50329ea1-e339-40e2-809c-10c40afff3ce | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/utils/download_glue_data.py | https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FCoLA.zip?alt=media&token=46d5e637-3411-4188-bc44-5809b5bfb5f4 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/utils/download_glue_data.py | https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2Fmrpc_dev_ids.tsv?alt=media&token=ec5c0836-31d5-48f4-b431-7480817f1adc | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/transformers/utils/download_glue_data.py | 
https://storage.googleapis.com/mtl-sentence-representations.appspot.com/tsvsWithoutLabels%2FAX.tsv?GoogleAccessId=firebase-adminsdk-0khhl@mtl-sentence-representations.iam.gserviceaccount.com&Expires=2498860800&Signature=DuQ2CSPt2Yfre0C%2BiISrVYrIFaZH1Lc7hBVZDD4ZyR7fZYOMNOUGpi8QxBmTNOrNPjR3z1cggo7WXFfrgECP6FBJSsURv8Ybrue8Ypt%2FTPxbuJ0Xc2FhDi%2BarnecCBFO77RSbfuz%2Bs95hRrYhTnByqu3U%2FYZPaj3tZt5QdfpH2IUROY8LiBXoXS46LE%2FgOQc%2FKN%2BA9SoscRDYsnxHfG0IjXGwHN%2Bf88q6hOmAxeNPx6moDulUF6XMUAaXCSFU%2BnRO2RDL9CapWxj%2BDl7syNyHhB7987hZ80B%2FwFkQ3MEs8auvt5XW1%2Bd4aCU7ytgM69r8JDCwibfhZxpaa4gd50QXQ%3D%3D | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/url.ini | thomas@huggingface.co | 邮箱地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/url.ini | https://unilm.blob.core.windows.net/beit/beit_base_patch16_224_pt22k_ft22kto1k.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/url.ini | https://www.researchgate.net/profile/Dinh-Sang/publication/338099565/figure/fig8/AS:840413229350922@1577381536857/An-receipt-example-in-the-SROIE-2019-dataset_Q640.jpg | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert-text-classification_for_PyTorch/url.ini | https://layoutlm.blob.core.windows.net/trocr/model_zoo/fairseq/trocr-base-handwritten.pt | 模型相关说明 | \ No newline at end of file diff --git a/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/public_address_statement.md b/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/public_address_statement.md index e5fa59007696ed87b6fe4fae4b71e7a36cfee477..c1b7de468ad1fb6bc5fb7b49b4097466acd225aa 100644 --- a/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/public_address_statement.md @@ -1,2822 +1,363 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ---- | ------------ | ------ | ------------------------------------ | -------- | -| 开发引入 | / | Bert_Chinese_ID3433_for_PyTorch/url.ini |https://scikit-learn.org/stable/modules/generated/sklearn.metrics.accuracy_score.html|ACC函数中对url的设置| -| 开发引入 | / | Bert_Chinese_ID3433_for_PyTorch/transformers/.circleci/config.yml |https://github.com/GoogleCloudPlatform/ml-testing-accelerators.git|ml-testing-accelerators工具包的git下载链接| -| 开发引入 | / | Bert_Chinese_ID3433_for_PyTorch/transformers/.circleci/config.yml |https://pytorch-geometric.com/whl/torch-1.11.0+cpu.html|config.yml文件中torch包的html下载链接| -| 开发引入 | / | Bert_Chinese_ID3433_for_PyTorch/transformers/.circleci/config.yml |https://github.com/kpu/kenlm/archive/master.zip|config.yml文件中kenlm库的zip下载链接| -| 开发引入 | / | Bert_Chinese_ID3433_for_PyTorch/transformers/.circleci/config.yml |https://pytorch-geometric.com/whl/torch-1.11.0+cpu.html|config.yml文件中torch包的html下载链接| -| 开发引入 | / | Bert_Chinese_ID3433_for_PyTorch/transformers/.circleci/config.yml |https://github.com/kpu/kenlm/archive/master.zip|config.yml文件中kenlm库的zip下载链接| -| 开发引入 | / | Bert_Chinese_ID3433_for_PyTorch/transformers/.circleci/config.yml |https://pytorch-geometric.com/whl/torch-1.11.0+cpu.html|config.yml文件中torch包的html下载链接| -| 开发引入 | / | Bert_Chinese_ID3433_for_PyTorch/transformers/.circleci/config.yml |https://github.com/kpu/kenlm/archive/master.zip|config.yml文件中kenlm库的zip下载链接| -| 开发引入 | / | Bert_Chinese_ID3433_for_PyTorch/transformers/.circleci/config.yml |https://pytorch-geometric.com/whl/torch-1.11.0+cpu.html|config.yml文件中torch包的html下载链接| -| 开发引入 | / | Bert_Chinese_ID3433_for_PyTorch/transformers/.circleci/config.yml 
|https://github.com/kpu/kenlm/archive/master.zip|config.yml文件中kenlm库的zip下载链接| -| 开发引入 | / | Bert_Chinese_ID3433_for_PyTorch/transformers/.circleci/config.yml |https://pytorch-geometric.com/whl/torch-1.11.0+cpu.html|config.yml文件中torch包的html下载链接| -| 开发引入 | / | Bert_Chinese_ID3433_for_PyTorch/transformers/.circleci/config.yml |https://github.com/kpu/kenlm/archive/master.zip|config.yml文件中kenlm库的zip下载链接| -| 开发引入 | / | Bert_Chinese_ID3433_for_PyTorch/transformers/.circleci/config.yml |https://pytorch-geometric.com/whl/torch-1.11.0+cpu.html|config.yml文件中torch包的html下载链接| -| 开发引入 | / | Bert_Chinese_ID3433_for_PyTorch/transformers/.circleci/config.yml |https://github.com/kpu/kenlm/archive/master.zip|config.yml文件中kenlm库的zip下载链接| -| 开发引入 | / | Bert_Chinese_ID3433_for_PyTorch/transformers/.circleci/config.yml |https://github.com/kpu/kenlm/archive/master.zip|config.yml文件中kenlm库的zip下载链接| -| 开发引入 | / | Bert_Chinese_ID3433_for_PyTorch/transformers/.circleci/config.yml |https://github.com/kpu/kenlm/archive/master.zip|config.yml文件中kenlm库的zip下载链接| -| 开发引入 | / | Bert_Chinese_ID3433_for_PyTorch/transformers/.circleci/config.yml |https://github.com/kpu/kenlm/archive/master.zip|config.yml文件中kenlm库的zip下载链接| -| 开发引入 | / | Bert_Chinese_ID3433_for_PyTorch/transformers/.circleci/config.yml |https://github.com/kpu/kenlm/archive/master.zip|config.yml文件中kenlm库的zip下载链接| -| 开发引入 | / | Bert_Chinese_ID3433_for_PyTorch/transformers/.circleci/config.yml |https://pytorch-geometric.com/whl/torch-1.11.0+cpu.html|config.yml文件中torch包的html下载链接| -| 开发引入 | / | Bert_Chinese_ID3433_for_PyTorch/transformers/.circleci/config.yml |https://github.com/kpu/kenlm/archive/master.zip|config.yml文件中kenlm库的zip下载链接| -| 开发引入 | / | Bert_Chinese_ID3433_for_PyTorch/transformers/.circleci/config.yml |https://pytorch-geometric.com/whl/torch-1.11.0+cpu.html|config.yml文件中torch包的html下载链接| -| 开发引入 | / | Bert_Chinese_ID3433_for_PyTorch/transformers/.circleci/config.yml |https://github.com/kpu/kenlm/archive/master.zip|config.yml文件中kenlm库的zip下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/.circleci/create_circleci_config.py | Bert_Chinese_ID3433_for_PyTorch/transformers/.circleci/config.yml | ci@dummy.com | config.yml中对usr.email的配置选项| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/tasks/document_question_answering.mdx| Bert_Chinese_ID3433_for_PyTorch/transformers/.circleci/config.yml |https://github.com/facebookresearch/detectron2.git|detectron2模型在开源社区中的源码链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/update_metadata.py| Bert_Chinese_ID3433_for_PyTorch/transformers/CITATION.cff |https://github.com/huggingface/transformers|CITATION文件中url的配置选项| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_zh-hant.md| Bert_Chinese_ID3433_for_PyTorch/transformers/CITATION.cff |https://www.aclweb.org/anthology/2020.emnlp-demos.6|CITATION文件中url的配置选项| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/update_metadata.py| Bert_Chinese_ID3433_for_PyTorch/transformers/docker/transformers-all-latest-gpu/Dockerfile |https://github.com/huggingface/transformers |Dockerfile文件中transformers的git链接| -| 开发引入 | / | Bert_Chinese_ID3433_for_PyTorch/transformers/docker/transformers-all-latest-gpu/Dockerfile |https://data.pyg.org/whl/torch|Dockerfile文件中torch包的html链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/tasks/document_question_answering.mdx| 
Bert_Chinese_ID3433_for_PyTorch/transformers/docker/transformers-all-latest-gpu/Dockerfile |https://github.com/kpu/kenlm/archive/master.zip|kenlm库在开源社区中的zip包下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/update_metadata.py| Bert_Chinese_ID3433_for_PyTorch/transformers/docker/transformers-doc-builder/Dockerfile |https://github.com/huggingface/transformers|Dockerfile文件中transformers的git链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/check_repo.py| Bert_Chinese_ID3433_for_PyTorch/transformers/docker/transformers-doc-builder/Dockerfile |https://github.com/huggingface/doc-builder |Dockerfile文件中transformers的git链接| -| 开发引入 | / | Bert_Chinese_ID3433_for_PyTorch/transformers/docker/transformers-doc-builder/Dockerfile |https://data.pyg.org/whl/torch |Dockerfile文件中torch的html链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/tasks/document_question_answering.mdx| Bert_Chinese_ID3433_for_PyTorch/transformers/docker/transformers-doc-builder/Dockerfile |https://github.com/kpu/kenlm/archive/master.zip|kenlm库在开源社区中的zip包下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/utils/import_utils.py| Bert_Chinese_ID3433_for_PyTorch/transformers/docker/transformers-doc-builder/Dockerfile |https://pypi.ngc.nvidia.com|Dockerfile文件中pytorch-quantization的url链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/benchmark/benchmark.py| Bert_Chinese_ID3433_for_PyTorch/transformers/docker/transformers-gpu/Dockerfile |https://github.com/NVIDIA/apex|Dockerfile文件中apex的git链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/update_metadata.py| Bert_Chinese_ID3433_for_PyTorch/transformers/docker/transformers-pytorch-deepspeed-latest-gpu/Dockerfile |https://github.com/huggingface/transformers |Dockerfile文件中apex在开源社区中的的git链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/deepspeed/test_deepspeed.py| Bert_Chinese_ID3433_for_PyTorch/transformers/docker/transformers-pytorch-deepspeed-latest-gpu/Dockerfile |https://github.com/microsoft/DeepSpeed |Dockerfile文件中DeepSpeed在开源社区中的的git链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/update_metadata.py| Bert_Chinese_ID3433_for_PyTorch/transformers/docker/transformers-pytorch-gpu/Dockerfile |https://github.com/huggingface/transformers |Dockerfile文件中transformers在开源社区中的的git链接| -| 开发引入 | / | Bert_Chinese_ID3433_for_PyTorch/transformers/docker/transformers-pytorch-gpu/Dockerfile |https://data.pyg.org/whl/torch |Dockerfile文件中torch的html链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/tasks/document_question_answering.mdx| Bert_Chinese_ID3433_for_PyTorch/transformers/docker/transformers-pytorch-gpu/Dockerfile |https://github.com/kpu/kenlm/archive/master.zip|kenlm库在开源社区中的zip包下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docker/transformers-pytorch-tpu/Dockerfile| Bert_Chinese_ID3433_for_PyTorch/transformers/docker/transformers-pytorch-tpu/Dockerfile |https://repo.anaconda.com/miniconda/Miniconda3-4.7.12-Linux-x86_64.sh|Dockerfile文件中miniconda在开源社区中的的sh链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/sagemaker/scripts/tensorflow/requirements.txt| Bert_Chinese_ID3433_for_PyTorch/transformers/docker/transformers-pytorch-tpu/Dockerfile |https://github.com/huggingface/transformers.git|transformers模型在开源社区中的源码git链接| -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/utils/update_metadata.py| Bert_Chinese_ID3433_for_PyTorch/transformers/docker/transformers-tensorflow-gpu/Dockerfile |https://github.com/huggingface/transformers |Dockerfile文件中transformers在开源社区中的git链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/jax-projects/hybrid_clip/requirements.txt| Bert_Chinese_ID3433_for_PyTorch/transformers/examples/flax/vision/requirements.txt |https://download.pytorch.org/whl/torch_stable.html|requirements文件中torch_stable在开源社区中的html链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/jax-projects/hybrid_clip/requirements.txt| Bert_Chinese_ID3433_for_PyTorch/transformers/examples/flax/vision/requirements.txt |https://download.pytorch.org/whl/torch_stable.html|requirements文件中torch_stable在开源社区中的html链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/token-classification/run.sh| Bert_Chinese_ID3433_for_PyTorch/transformers/examples/legacy/pytorch-lightning/run_ner.sh |https://drive.google.com/uc?export=download&id=1Jjhbal535VVz2ap4v4r_rN1UEHTdLK5P|将更改迁移到NLP数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/token-classification/run.sh| Bert_Chinese_ID3433_for_PyTorch/transformers/examples/legacy/pytorch-lightning/run_ner.sh |https://drive.google.com/uc?export=download&id=1ZfRcQThdtAR5PPRjIDtrVP7BtXSCUBbm|将更改迁移到NLP数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/token-classification/run.sh| Bert_Chinese_ID3433_for_PyTorch/transformers/examples/legacy/pytorch-lightning/run_ner.sh |https://drive.google.com/uc?export=download&id=1u9mb7kNJHWQCWyweMDRMuTFoOHOfeBTH|将更改迁移到NLP数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/token-classification/run_pos.sh| Bert_Chinese_ID3433_for_PyTorch/transformers/examples/legacy/pytorch-lightning/run_pos.sh |https://github.com/UniversalDependencies/UD_English-EWT/raw/master/en_ewt-ud-dev.conllu|下载dev数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/token-classification/run_pos.sh| Bert_Chinese_ID3433_for_PyTorch/transformers/examples/legacy/pytorch-lightning/run_pos.sh |https://github.com/UniversalDependencies/UD_English-EWT/raw/master/en_ewt-ud-test.conllu|下载test数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/token-classification/run_pos.sh| Bert_Chinese_ID3433_for_PyTorch/transformers/examples/legacy/pytorch-lightning/run_pos.sh |https://github.com/UniversalDependencies/UD_English-EWT/raw/master/en_ewt-ud-train.conllu|下载train数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/token-classification/run_chunk.sh| Bert_Chinese_ID3433_for_PyTorch/transformers/examples/legacy/token-classification/run_chunk.sh |https://github.com/davidsbatista/NER-datasets/raw/master/CONLL2003/valid.txt|CONLL2003数据集在开源社区上的valid.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/token-classification/run_chunk.sh| Bert_Chinese_ID3433_for_PyTorch/transformers/examples/legacy/token-classification/run_chunk.sh |https://github.com/davidsbatista/NER-datasets/raw/master/CONLL2003/test.txt|CONLL2003数据集在开源社区上的test.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/token-classification/run_chunk.sh| Bert_Chinese_ID3433_for_PyTorch/transformers/examples/legacy/token-classification/run_chunk.sh 
|https://github.com/davidsbatista/NER-datasets/raw/master/CONLL2003/train.txt|CONLL2003数据集在开源社区上的train.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/token-classification/run_pos.sh| Bert_Chinese_ID3433_for_PyTorch/transformers/examples/legacy/token-classification/run_pos.sh |https://github.com/UniversalDependencies/UD_English-EWT/raw/master/en_ewt-ud-dev.conllu|下载dev数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/token-classification/run_pos.sh| Bert_Chinese_ID3433_for_PyTorch/transformers/examples/legacy/token-classification/run_pos.sh |https://github.com/UniversalDependencies/UD_English-EWT/raw/master/en_ewt-ud-test.conllu|下载test数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/token-classification/run_pos.sh| Bert_Chinese_ID3433_for_PyTorch/transformers/examples/legacy/token-classification/run_pos.sh |https://github.com/UniversalDependencies/UD_English-EWT/raw/master/en_ewt-ud-train.conllu|下载train数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/token-classification/run.sh| Bert_Chinese_ID3433_for_PyTorch/transformers/examples/legacy/token-classification/run.sh |https://drive.google.com/uc?export=download&id=1Jjhbal535VVz2ap4v4r_rN1UEHTdLK5P|将更改迁移到NLP数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/token-classification/run.sh| Bert_Chinese_ID3433_for_PyTorch/transformers/examples/legacy/token-classification/run.sh |https://drive.google.com/uc?export=download&id=1ZfRcQThdtAR5PPRjIDtrVP7BtXSCUBbm|将更改迁移到NLP数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/token-classification/run.sh| Bert_Chinese_ID3433_for_PyTorch/transformers/examples/legacy/token-classification/run.sh |https://drive.google.com/uc?export=download&id=1u9mb7kNJHWQCWyweMDRMuTFoOHOfeBTH|将更改迁移到NLP数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/bertabs/configuration_bertabs.py| Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/bertabs/configuration_bertabs.py |https://huggingface.co/remi/bertabs-finetuned-cnndm-extractive-abstractive-summarization/resolve/main/config.json|bertabs-finetuned-cnndm模型在开源社区上的config.json的下载链接| -| 开发引入 | / | Bert_Chinese_ID3433_for_PyTorch/url.ini |https://github.com/huggingface/transformers/tree/master/examples/research_projects/fsner/setup.py |setuptools的url配置选项| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/trainer/test_trainer.py| Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/fsner/setup.py |https://github.com/huggingface/transformers/issues|setuptools的Bug Tracker在开源社区中的链接| -| 开源代码引入 | https://github.com/huggingface/transformers | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/jax-projects/big_bird/requirements.txt |https://github.com/huggingface/transformers@master|requirements文件中transformers在开源社区中的的git链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/jax-projects/hybrid_clip/requirements.txt| Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/jax-projects/hybrid_clip/requirements.txt |https://download.pytorch.org/whl/torch_stable.html|requirements文件中torch_stable在开源社区中的html链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/jax-projects/hybrid_clip/requirements.txt| 
Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/jax-projects/hybrid_clip/requirements.txt |https://download.pytorch.org/whl/torch_stable.html|requirements文件中torch_stable在开源社区中的html链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/longform-qa/eli5_app.py| Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/longform-qa/eli5_app.py |https://huggingface.co/front/assets/huggingface_logo.svg|获取huggingface_logo的html链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/longform-qa/eli5_app.py| Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/longform-qa/eli5_app.py |https://en.wikipedia.org/wiki |配置wiki的url链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/sagemaker/scripts/tensorflow/requirements.txt| Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/lxmert/requirements.txt |https://github.com/huggingface/transformers.git|requirements文件中transformers在开源社区中的的git链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/utils/hub.py| Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/lxmert/utils.py |https://cdn.huggingface.co|获取huggingface开源社区版本| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/utils/hub.py| Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/lxmert/utils.py |https://s3.amazonaws.com/models.huggingface.co/bert|获取huggingface开源社区版本| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/requirements.txt| Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/movement-pruning/requirements.txt |https://github.com/huggingface/transformers.git@352d5472b0c1dec0f420d606d16747d851b4bda8#egg=transformers|requirements文件中transformers在开源社区中的的git链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/pplm/run_pplm.py| Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py |https://s3.amazonaws.com/models.huggingface.co/bert/pplm/bow/legal.txt|pplm在开源社区上的txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/pplm/run_pplm.py| Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py |https://s3.amazonaws.com/models.huggingface.co/bert/pplm/bow/military.txt|pplm在开源社区上的txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/pplm/run_pplm.py| Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py |https://s3.amazonaws.com/models.huggingface.co/bert/pplm/bow/politics.txt|pplm在开源社区上的txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/pplm/run_pplm.py| Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py |https://s3.amazonaws.com/models.huggingface.co/bert/pplm/bow/religion.txt|pplm在开源社区上的txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/pplm/run_pplm.py| Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py |https://s3.amazonaws.com/models.huggingface.co/bert/pplm/bow/science.txt|pplm在开源社区上的txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/pplm/run_pplm.py| 
Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py |https://s3.amazonaws.com/models.huggingface.co/bert/pplm/bow/space.txt|pplm在开源社区上的txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/pplm/run_pplm.py| Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py |https://s3.amazonaws.com/models.huggingface.co/bert/pplm/bow/technology.txt|pplm在开源社区上的txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/pplm/run_pplm.py| Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py |https://s3.amazonaws.com/models.huggingface.co/bert/pplm/discriminators/clickbait_classifier_head.pt|pplm在开源社区上的pt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/pplm/run_pplm.py| Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py |https://s3.amazonaws.com/models.huggingface.co/bert/pplm/discriminators/SST_classifier_head.pt|pplm在开源社区上的pt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/utils/import_utils.py| Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/quantization-qdqbert/Dockerfile |https://pypi.ngc.nvidia.com|Dockerfile文件中pytorch-quantization的url链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/seq2seq-distillation/finetune_bart_tiny.sh| Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/seq2seq-distillation/finetune_bart_tiny.sh |https://cdn-datasets.huggingface.co/summarization/cnn_tiny.tgz|cnn_tiny数据集在开源社区中的tgz链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/sagemaker/scripts/tensorflow/requirements.txt| Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/visual_bert/requirements.txt |https://github.com/huggingface/transformers.git|requirements文件中transformers在开源社区中的的git链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/utils/hub.py| Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/visual_bert/utils.py |https://cdn.huggingface.co|获取huggingface开源社区版本| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/utils/hub.py| Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/visual_bert/utils.py |https://s3.amazonaws.com/models.huggingface.co/bert|获取huggingface开源社区版本| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/convert-allenai-wmt16.sh| Bert_Chinese_ID3433_for_PyTorch/transformers/scripts/fsmt/convert-allenai-wmt16.sh |https://drive.google.com/uc?id=1x_G2cjvM1nW5hjAB8-vWxRqtQTlmIaQU|将更改迁移到NLP数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/convert-allenai-wmt16.sh| Bert_Chinese_ID3433_for_PyTorch/transformers/scripts/fsmt/convert-allenai-wmt16.sh |https://drive.google.com/uc?id=1oA2aqZlVNj5FarxBlNXEHpBS4lRetTzU|将更改迁移到NLP数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/convert-allenai-wmt16.sh| Bert_Chinese_ID3433_for_PyTorch/transformers/scripts/fsmt/convert-allenai-wmt16.sh |https://drive.google.com/uc?id=1Wup2D318QYBFPW_NKI1mfP_hXOfmUI9r|将更改迁移到NLP数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/convert-allenai-wmt16.sh| Bert_Chinese_ID3433_for_PyTorch/transformers/scripts/fsmt/convert-allenai-wmt16.sh 
|https://drive.google.com/uc?id=1mNufoynJ9-Zy1kJh2TA_lHm2squji0i9|将更改迁移到NLP数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/convert-allenai-wmt16.sh| Bert_Chinese_ID3433_for_PyTorch/transformers/scripts/fsmt/convert-allenai-wmt16.sh |https://drive.google.com/uc?id=1iO7um-HWoNoRKDtw27YUSgyeubn9uXqj|将更改迁移到NLP数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/convert-allenai-wmt19.sh| Bert_Chinese_ID3433_for_PyTorch/transformers/scripts/fsmt/convert-allenai-wmt19.sh |https://drive.google.com/uc?id=1j6z9fYdlUyOYsh7KJoumRlr1yHczxR5T|将更改迁移到NLP数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/convert-allenai-wmt19.sh| Bert_Chinese_ID3433_for_PyTorch/transformers/scripts/fsmt/convert-allenai-wmt19.sh |https://drive.google.com/uc?id=1yT7ZjqfvUYOBXvMjeY8uGRHQFWoSo8Q5|将更改迁移到NLP数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/convert-allenai-wmt19.sh| Bert_Chinese_ID3433_for_PyTorch/transformers/scripts/fsmt/convert-allenai-wmt19.sh |https://drive.google.com/uc?id=15gAzHeRUCs-QV8vHeTReMPEh1j8excNE|将更改迁移到NLP数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/convert-facebook-wmt19.sh| Bert_Chinese_ID3433_for_PyTorch/transformers/scripts/fsmt/convert-facebook-wmt19.sh |https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-de.joined-dict.ensemble.tar.gz|wmt19数据集在开源社区上的tar.gz下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/convert-facebook-wmt19.sh| Bert_Chinese_ID3433_for_PyTorch/transformers/scripts/fsmt/convert-facebook-wmt19.sh |https://dl.fbaipublicfiles.com/fairseq/models/wmt19.de-en.joined-dict.ensemble.tar.gz|wmt19数据集在开源社区上的tar.gz下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/convert-facebook-wmt19.sh| Bert_Chinese_ID3433_for_PyTorch/transformers/scripts/fsmt/convert-facebook-wmt19.sh |https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-ru.ensemble.tar.gz|wmt19数据集在开源社区上的tar.gz下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/convert-facebook-wmt19.sh| Bert_Chinese_ID3433_for_PyTorch/transformers/scripts/fsmt/convert-facebook-wmt19.sh |https://dl.fbaipublicfiles.com/fairseq/models/wmt19.ru-en.ensemble.tar.gz|wmt19数据集在开源社区上的tar.gz下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/update_metadata.py| Bert_Chinese_ID3433_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt16.py |https://github.com/huggingface/transformers|wmt16数据集的transformers在开源社区上的git下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/update_metadata.py| Bert_Chinese_ID3433_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt19.py |https://github.com/huggingface/transformers|wmt19数据集的transformers在开源社区上的git下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/gen-card-facebook-wmt19.py| Bert_Chinese_ID3433_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py |http://matrix.statmt.org/matrix/output/1907?run_id=6937 |wmt19数据集的语言转换配置在开源社区上的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/gen-card-facebook-wmt19.py| Bert_Chinese_ID3433_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py |http://matrix.statmt.org/matrix/output/1914?run_id=6724 |wmt19数据集的语言转换配置在开源社区上的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/gen-card-facebook-wmt19.py| 
Bert_Chinese_ID3433_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py |http://matrix.statmt.org/matrix/output/1909?run_id=6862 |wmt19数据集的语言转换配置在开源社区上的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/gen-card-facebook-wmt19.py| Bert_Chinese_ID3433_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py |http://matrix.statmt.org/matrix/output/1902?run_id=6750 |wmt19数据集的语言转换配置在开源社区上的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/update_metadata.py| Bert_Chinese_ID3433_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py |https://github.com/huggingface/transformers|wmt19数据集的transformers在开源社区上的git下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/tatoeba/upload_models.sh| Bert_Chinese_ID3433_for_PyTorch/transformers/scripts/tatoeba/upload_models.sh |https://huggingface.co/Helsinki-NLP/$model_name|Helsinki-NLP在开源社区上的git下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/update_metadata.py | Bert_Chinese_ID3433_for_PyTorch/transformers/setup.py |https://github.com/huggingface/transformers|setuptools的url配置选项| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/commands/add_new_model_like.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/commands/add_new_model_like.py |https://huggingface.co/{new_model_patterns.checkpoint}/resolve/main/config.json|add_new_model_like.py在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/utils/hub.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/file_utils.py |https://s3.amazonaws.com/models.huggingface.co/bert|获取huggingface开源社区版本| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/utils/hub.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/file_utils.py |https://cdn.huggingface.co|获取huggingface开源社区版本| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/modeling_tf_utils.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/modeling_tf_utils.py |https://www.tensorflow.org/tfx/serving/serving_basic|获取tensorflow开源社区版本| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/speech_to_text/modeling_tf_speech_to_text.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/modeling_tf_utils.py |https://github.com/tensorflow/models/blob/a009f4fb9d2fc4949e32192a944688925ef78659/official/transformer/v2/embedding_layer.py#L24|获取tensorflow开源社区版本| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/configuration_albert.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/albert/configuration_albert.py |https://huggingface.co/albert-base-v1/resolve/main/config.json|"albert-base-v1"模型在开源社区上的config.json"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/configuration_albert.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/albert/configuration_albert.py |https://huggingface.co/albert-large-v1/resolve/main/config.json|"albert-large-v1"模型在开源社区上的config.json"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/configuration_albert.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/albert/configuration_albert.py 
|https://huggingface.co/albert-xlarge-v1/resolve/main/config.json|"albert-xlarge-v1"模型在开源社区上的config.json"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/configuration_albert.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/albert/configuration_albert.py |https://huggingface.co/albert-xxlarge-v1/resolve/main/config.json|"albert-xxlarge-v1"模型在开源社区上的config.json"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/configuration_albert.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/albert/configuration_albert.py |https://huggingface.co/albert-base-v2/resolve/main/config.json|"albert-base-v2"模型在开源社区上的config.json"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/configuration_albert.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/albert/configuration_albert.py |https://huggingface.co/albert-large-v2/resolve/main/config.json|"albert-large-v2"模型在开源社区上的config.json"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/configuration_albert.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/albert/configuration_albert.py |https://huggingface.co/albert-xlarge-v2/resolve/main/config.json|"albert-xlarge-v2"模型在开源社区上的config.json"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/configuration_albert.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/albert/configuration_albert.py |https://huggingface.co/albert-xxlarge-v2/resolve/main/config.json|"albert-xxlarge-v2"模型在开源社区上的config.json"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/test_tokenization_common.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert.py |https://huggingface.co/albert-base-v1/resolve/main/spiece.model|"albert-base-v1"模型在开源社区上的spiece.model"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert.py |https://huggingface.co/albert-large-v1/resolve/main/spiece.model|"albert-large-v1"模型在开源社区上的spiece.model"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert.py |https://huggingface.co/albert-xlarge-v1/resolve/main/spiece.model|"albert-xlarge-v1"模型在开源社区上的spiece.model"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert.py |https://huggingface.co/albert-xxlarge-v1/resolve/main/spiece.model|"albert-xxlarge-v1"模型在开源社区上的spiece.model"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert.py |https://huggingface.co/albert-base-v2/resolve/main/spiece.model|"albert-base-v2"模型在开源社区上的spiece.model"的下载链接| -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert.py |https://huggingface.co/albert-large-v2/resolve/main/spiece.model|"albert-large-v2"模型在开源社区上的spiece.model"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert.py |https://huggingface.co/albert-xlarge-v2/resolve/main/spiece.model|"albert-xlarge-v2"模型在开源社区上的spiece.model"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert.py |https://huggingface.co/albert-xxlarge-v2/resolve/main/spiece.model|"albert-xxlarge-v2"模型在开源社区上的spiece.model"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/test_tokenization_common.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py |https://huggingface.co/albert-base-v1/resolve/main/spiece.model|"albert-base-v1"模型在开源社区上的spiece.model"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py |https://huggingface.co/albert-large-v1/resolve/main/spiece.model|"albert-large-v1"模型在开源社区上的spiece.model"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py |https://huggingface.co/albert-xlarge-v1/resolve/main/spiece.model|"albert-xlarge-v1"模型在开源社区上的spiece.model"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py |https://huggingface.co/albert-xxlarge-v1/resolve/main/spiece.model|"albert-xxlarge-v1"模型在开源社区上的spiece.model"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py |https://huggingface.co/albert-base-v2/resolve/main/spiece.model|"albert-base-v2"模型在开源社区上的spiece.model"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py |https://huggingface.co/albert-large-v2/resolve/main/spiece.model|"albert-large-v2"模型在开源社区上的spiece.model"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py |https://huggingface.co/albert-xlarge-v2/resolve/main/spiece.model|"albert-xlarge-v2"模型在开源社区上的spiece.model"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py| 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py |https://huggingface.co/albert-xxlarge-v2/resolve/main/spiece.model|"albert-xxlarge-v2"模型在开源社区上的spiece.model"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py |https://huggingface.co/albert-base-v1/resolve/main/tokenizer.json|"albert-base-v1"模型在开源社区上的tokenizer.json"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py |https://huggingface.co/albert-large-v1/resolve/main/tokenizer.json|"albert-large-v1"模型在开源社区上的tokenizer.json"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py |https://huggingface.co/albert-xlarge-v1/resolve/main/tokenizer.json|"albert-xlarge-v1"模型在开源社区上的tokenizer.json"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py |https://huggingface.co/albert-xxlarge-v1/resolve/main/tokenizer.json|"albert-xxlarge-v1"模型在开源社区上的tokenizer.json"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py |https://huggingface.co/albert-base-v2/resolve/main/tokenizer.json|"albert-base-v2"模型在开源社区上的tokenizer.json"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py |https://huggingface.co/albert-large-v2/resolve/main/tokenizer.json|"albert-large-v2"模型在开源社区上的tokenizer.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py |https://huggingface.co/albert-xlarge-v2/resolve/main/tokenizer.json|"albert-xlarge-v2"模型在开源社区上的tokenizer.json"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py |https://huggingface.co/albert-xxlarge-v2/resolve/main/tokenizer.json|"albert-xxlarge-v2"模型在开源社区上的tokenizer.json"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/configuration_bart.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bart/configuration_bart.py |https://huggingface.co/facebook/bart-large/resolve/main/config.json|"facebook/bart-large"模型在开源社区上的config.json"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart.py 
|https://huggingface.co/facebook/bart-base/resolve/main/vocab.json|"facebook/bart-base"模型在开源社区上的vocab.json"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart.py |https://huggingface.co/facebook/bart-large/resolve/main/vocab.json|"facebook/bart-large"模型在开源社区上的vocab.json"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart.py |https://huggingface.co/facebook/bart-large-mnli/resolve/main/vocab.json|"facebook/bart-large-mnli"模型在开源社区上的vocab.json"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart.py |https://huggingface.co/facebook/bart-large-cnn/resolve/main/vocab.json|"facebook/bart-large-cnn"模型在开源社区上的vocab.json"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart.py |https://huggingface.co/facebook/bart-large-xsum/resolve/main/vocab.json|"facebook/bart-large-xsum"模型在开源社区上的vocab.json"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart.py |https://huggingface.co/yjernite/bart_eli5/resolve/main/vocab.json|"yjernite/bart_eli5"模型在开源社区上的vocab.json"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart.py |https://huggingface.co/facebook/bart-base/resolve/main/merges.txt|"facebook/bart-base"模型在开源社区上的merges.txt"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart.py |https://huggingface.co/facebook/bart-large/resolve/main/merges.txt|"facebook/bart-large"模型在开源社区上的merges.txt"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart.py |https://huggingface.co/facebook/bart-large-mnli/resolve/main/merges.txt|"facebook/bart-large-mnli"模型在开源社区上的merges.txt"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart.py |https://huggingface.co/facebook/bart-large-cnn/resolve/main/merges.txt|"facebook/bart-large-cnn"模型在开源社区上的merges.txt"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart.py |https://huggingface.co/facebook/bart-large-xsum/resolve/main/merges.txt|"facebook/bart-large-xsum"模型在开源社区上的merges.txt"的下载链接| -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart.py |https://huggingface.co/yjernite/bart_eli5/resolve/main/merges.txt|"yjernite/bart_eli5"模型在开源社区上的merges.txt"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py |https://huggingface.co/facebook/bart-base/resolve/main/vocab.json|"facebook/bart-base"模型在开源社区上的vocab.json"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py |https://huggingface.co/facebook/bart-large/resolve/main/vocab.json|"facebook/bart-large"模型在开源社区上的vocab.json"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py |https://huggingface.co/facebook/bart-large-mnli/resolve/main/vocab.json|"facebook/bart-large-mnli"模型在开源社区上的vocab.json"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py |https://huggingface.co/facebook/bart-large-cnn/resolve/main/vocab.json|"facebook/bart-large-cnn"模型在开源社区上的vocab.json"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py |https://huggingface.co/facebook/bart-large-xsum/resolve/main/vocab.json|"facebook/bart-large-xsum"模型在开源社区上的vocab.json"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py |https://huggingface.co/yjernite/bart_eli5/resolve/main/vocab.json|"yjernite/bart_eli5"模型在开源社区上的vocab.json"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py |https://huggingface.co/facebook/bart-base/resolve/main/merges.txt|"facebook/bart-base"模型在开源社区上的merges.txt"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py |https://huggingface.co/facebook/bart-large/resolve/main/merges.txt|"facebook/bart-large"模型在开源社区上的merges.txt"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py |https://huggingface.co/facebook/bart-large-mnli/resolve/main/merges.txt|"facebook/bart-large-mnli"模型在开源社区上的merges.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py| 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py |https://huggingface.co/facebook/bart-large-cnn/resolve/main/merges.txt|"facebook/bart-large-cnn"模型在开源社区上的merges.txt"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py |https://huggingface.co/facebook/bart-large-xsum/resolve/main/merges.txt|"facebook/bart-large-xsum"模型在开源社区上的merges.txt"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py |https://huggingface.co/yjernite/bart_eli5/resolve/main/merges.txt|"yjernite/bart_eli5"模型在开源社区上的merges.txt"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py |https://huggingface.co/facebook/bart-base/resolve/main/tokenizer.json|"facebook/bart-base"模型在开源社区上的tokenizer.json"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py |https://huggingface.co/facebook/bart-large/resolve/main/tokenizer.json|"facebook/bart-large"模型在开源社区上的tokenizer.json"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py |https://huggingface.co/facebook/bart-large-mnli/resolve/main/tokenizer.json|"facebook/bart-large-mnli"模型在开源社区上的tokenizer.json"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py |https://huggingface.co/facebook/bart-large-cnn/resolve/main/tokenizer.json|"facebook/bart-large-cnn"模型在开源社区上的tokenizer.json"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py |https://huggingface.co/facebook/bart-large-xsum/resolve/main/tokenizer.json|"facebook/bart-large-xsum"模型在开源社区上的tokenizer.json"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py |https://huggingface.co/yjernite/bart_eli5/resolve/main/tokenizer.json|"yjernite/bart_eli5"模型在开源社区上的tokenizer.json"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/barthez/tokenization_barthez_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/barthez/tokenization_barthez_fast.py |https://huggingface.co/moussaKam/mbarthez/resolve/main/sentencepiece.bpe.model|"moussaKam/mbarthez"模型在开源社区上的sentencepiece.bpe.model的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/barthez/tokenization_barthez_fast.py| 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/barthez/tokenization_barthez_fast.py |https://huggingface.co/moussaKam/barthez/resolve/main/sentencepiece.bpe.model|"moussaKam/barthez"模型在开源社区上的sentencepiece.bpe.model"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/barthez/tokenization_barthez_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/barthez/tokenization_barthez_fast.py |https://huggingface.co/moussaKam/barthez-orangesum-title/resolve/main/sentencepiece.bpe.model|"moussaKam/barthez-orangesum-title"模型在开源社区上的sentencepiece.bpe.model"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/barthez/tokenization_barthez_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/barthez/tokenization_barthez_fast.py |https://huggingface.co/moussaKam/mbarthez/resolve/main/tokenizer.json|"moussaKam/mbarthez"模型在开源社区上的tokenizer.json"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/barthez/tokenization_barthez_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/barthez/tokenization_barthez_fast.py |https://huggingface.co/moussaKam/barthez/resolve/main/tokenizer.json|"moussaKam/barthez"模型在开源社区上的tokenizer.json"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/barthez/tokenization_barthez_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/barthez/tokenization_barthez_fast.py |https://huggingface.co/moussaKam/barthez-orangesum-title/resolve/main/tokenizer.json|"moussaKam/barthez-orangesum-title"模型在开源社区上的tokenizer.json"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/barthez/tokenization_barthez_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/barthez/tokenization_barthez.py |https://huggingface.co/moussaKam/mbarthez/resolve/main/sentencepiece.bpe.model|"moussaKam/mbarthez"模型在开源社区上的sentencepiece.bpe.model"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/barthez/tokenization_barthez_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/barthez/tokenization_barthez.py |https://huggingface.co/moussaKam/barthez/resolve/main/sentencepiece.bpe.model|"moussaKam/barthez"模型在开源社区上的sentencepiece.bpe.model"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/barthez/tokenization_barthez_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/barthez/tokenization_barthez.py |https://huggingface.co/moussaKam/barthez-orangesum-title/resolve/main/sentencepiece.bpe.model|"moussaKam/barthez-orangesum-title"模型在开源社区上的sentencepiece.bpe.model"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bartpho/tokenization_bartpho.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bartpho/tokenization_bartpho.py |https://huggingface.co/vinai/bartpho-syllable/resolve/main/sentencepiece.bpe.model|"vinai/bartpho-syllable"模型在开源社区上的sentencepiece.bpe.model"的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bartpho/tokenization_bartpho.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bartpho/tokenization_bartpho.py 
|https://huggingface.co/vinai/bartpho-syllable/resolve/main/dict.txt|"vinai/bartpho-syllable"模型在开源社区上的dict.txt"的下载链接| -| 开发引入 | / | Bert_Chinese_ID3433_for_PyTorch/url.ini |https://huggingface.co/microsoft/beit-base-patch16-224-in22k/resolve/main/config.json |beit模型的config.json配置| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/beit/convert_beit_unilm_to_pytorch.py |http://images.cocodataset.org/val2017/000000039769.jpg|beit模型的数据集url配置| -| 开发引入 | / | Bert_Chinese_ID3433_for_PyTorch/url.ini |https://unilm.blob.core.windows.net/beit/beit_base_patch16_224_pt22k_ft22kto1k.pth|beit模型的pth配置| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/qdqbert/configuration_qdqbert.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py |https://huggingface.co/bert-base-uncased/resolve/main/config.json|bert模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py |https://huggingface.co/bert-large-uncased/resolve/main/config.json|"bert-large-uncased"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py |https://huggingface.co/bert-base-cased/resolve/main/config.json|"bert-base-cased"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py |https://huggingface.co/bert-large-cased/resolve/main/config.json|"bert-large-cased"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py |https://huggingface.co/bert-base-multilingual-uncased/resolve/main/config.json|"bert-base-multilingual-uncased"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py |https://huggingface.co/bert-base-multilingual-cased/resolve/main/config.json|"bert-base-multilingual-cased"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py |https://huggingface.co/bert-base-chinese/resolve/main/config.json|"bert-base-chinese"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py |https://huggingface.co/bert-base-german-cased/resolve/main/config.json|"bert-base-german-cased"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py| 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py |https://huggingface.co/bert-large-uncased-whole-word-masking/resolve/main/config.json|"bert-large-uncased-whole-word-masking"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py |https://huggingface.co/bert-large-cased-whole-word-masking/resolve/main/config.json|"bert-large-cased-whole-word-masking"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py |https://huggingface.co/bert-large-uncased-whole-word-masking-finetuned-squad/resolve/main/config.json|"bert-large-uncased-whole-word-masking-finetuned-squad"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py |https://huggingface.co/bert-large-cased-whole-word-masking-finetuned-squad/resolve/main/config.json|"bert-large-cased-whole-word-masking-finetuned-squad"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py |https://huggingface.co/bert-base-cased-finetuned-mrpc/resolve/main/config.json|"bert-base-cased-finetuned-mrpc"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py |https://huggingface.co/bert-base-german-dbmdz-cased/resolve/main/config.json|"bert-base-german-dbmdz-cased"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py |https://huggingface.co/bert-base-german-dbmdz-uncased/resolve/main/config.json|"bert-base-german-dbmdz-uncased"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py |https://huggingface.co/cl-tohoku/bert-base-japanese/resolve/main/config.json|"cl-tohoku/bert-base-japanese"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py |https://huggingface.co/cl-tohoku/bert-base-japanese-whole-word-masking/resolve/main/config.json|"cl-tohoku/bert-base-japanese-whole-word-masking"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py 
|https://huggingface.co/cl-tohoku/bert-base-japanese-char/resolve/main/config.json|"cl-tohoku/bert-base-japanese-char"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py |https://huggingface.co/cl-tohoku/bert-base-japanese-char-whole-word-masking/resolve/main/config.json|"cl-tohoku/bert-base-japanese-char-whole-word-masking"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py |https://huggingface.co/TurkuNLP/bert-base-finnish-cased-v1/resolve/main/config.json|"TurkuNLP/bert-base-finnish-cased-v1"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py |https://huggingface.co/TurkuNLP/bert-base-finnish-uncased-v1/resolve/main/config.json|"TurkuNLP/bert-base-finnish-uncased-v1"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py |https://huggingface.co/wietsedv/bert-base-dutch-cased/resolve/main/config.json|"wietsedv/bert-base-dutch-cased"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py |https://huggingface.co/bert-base-uncased/resolve/main/vocab.txt|"bert-base-uncased"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py |https://huggingface.co/bert-large-uncased/resolve/main/vocab.txt|"bert-large-uncased"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py |https://huggingface.co/bert-base-cased/resolve/main/vocab.txt|"bert-base-cased"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py |https://huggingface.co/bert-large-cased/resolve/main/vocab.txt|"bert-large-cased"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py |https://huggingface.co/bert-base-multilingual-uncased/resolve/main/vocab.txt|"bert-base-multilingual-uncased"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py 
|https://huggingface.co/bert-base-multilingual-cased/resolve/main/vocab.txt|"bert-base-multilingual-cased"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py |https://huggingface.co/bert-base-chinese/resolve/main/vocab.txt|"bert-base-chinese"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py |https://huggingface.co/bert-base-german-cased/resolve/main/vocab.txt|"bert-base-german-cased"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py |https://huggingface.co/bert-large-uncased-whole-word-masking/resolve/main/vocab.txt|"bert-large-uncased-whole-word-masking"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py |https://huggingface.co/bert-large-cased-whole-word-masking/resolve/main/vocab.txt|"bert-large-cased-whole-word-masking"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py |https://huggingface.co/bert-large-uncased-whole-word-masking-finetuned-squad/resolve/main/vocab.txt|"bert-large-uncased-whole-word-masking-finetuned-squad"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py |https://huggingface.co/bert-large-cased-whole-word-masking-finetuned-squad/resolve/main/vocab.txt|"bert-large-cased-whole-word-masking-finetuned-squad"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py |https://huggingface.co/bert-base-cased-finetuned-mrpc/resolve/main/vocab.txt|"bert-base-cased-finetuned-mrpc"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py |https://huggingface.co/bert-base-german-dbmdz-cased/resolve/main/vocab.txt|"bert-base-german-dbmdz-cased"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py |https://huggingface.co/bert-base-german-dbmdz-uncased/resolve/main/vocab.txt|"bert-base-german-dbmdz-uncased"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py| 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py |https://huggingface.co/TurkuNLP/bert-base-finnish-cased-v1/resolve/main/vocab.txt|"TurkuNLP/bert-base-finnish-cased-v1"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py |https://huggingface.co/TurkuNLP/bert-base-finnish-uncased-v1/resolve/main/vocab.txt|"TurkuNLP/bert-base-finnish-uncased-v1"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py |https://huggingface.co/wietsedv/bert-base-dutch-cased/resolve/main/vocab.txt|"wietsedv/bert-base-dutch-cased"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py |https://huggingface.co/bert-base-uncased/resolve/main/tokenizer.json|"bert-base-uncased"模型在开源社区上的tokenizer.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py |https://huggingface.co/bert-large-uncased/resolve/main/tokenizer.json|"bert-large-uncased"模型在开源社区上的tokenizer.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py |https://huggingface.co/bert-base-cased/resolve/main/tokenizer.json|"bert-base-cased"模型在开源社区上的tokenizer.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py |https://huggingface.co/bert-large-cased/resolve/main/tokenizer.json|"bert-large-cased"模型在开源社区上的tokenizer.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py |https://huggingface.co/bert-base-multilingual-uncased/resolve/main/tokenizer.json|"bert-base-multilingual-uncased"模型在开源社区上的tokenizer.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py |https://huggingface.co/bert-base-multilingual-cased/resolve/main/tokenizer.json|"bert-base-multilingual-cased"模型在开源社区上的tokenizer.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py |https://huggingface.co/bert-base-chinese/resolve/main/tokenizer.json|"bert-base-chinese"模型在开源社区上的tokenizer.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py| 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py |https://huggingface.co/bert-base-german-cased/resolve/main/tokenizer.json|"bert-base-german-cased"模型在开源社区上的tokenizer.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py |https://huggingface.co/bert-large-uncased-whole-word-masking/resolve/main/tokenizer.json|"bert-large-uncased-whole-word-masking"模型在开源社区上的tokenizer.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py |https://huggingface.co/bert-large-cased-whole-word-masking/resolve/main/tokenizer.json|"bert-large-cased-whole-word-masking"模型在开源社区上的tokenizer.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py |https://huggingface.co/bert-large-uncased-whole-word-masking-finetuned-squad/resolve/main/tokenizer.json|"bert-large-uncased-whole-word-masking-finetuned-squad"模型在开源社区上的tokenizer.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py |https://huggingface.co/bert-large-cased-whole-word-masking-finetuned-squad/resolve/main/tokenizer.json|"bert-large-cased-whole-word-masking-finetuned-squad"模型在开源社区上的tokenizer.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py |https://huggingface.co/bert-base-cased-finetuned-mrpc/resolve/main/tokenizer.json|"bert-base-cased-finetuned-mrpc"模型在开源社区上的tokenizer.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py |https://huggingface.co/bert-base-german-dbmdz-cased/resolve/main/tokenizer.json|"bert-base-german-dbmdz-cased"模型在开源社区上的tokenizer.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py |https://huggingface.co/bert-base-german-dbmdz-uncased/resolve/main/tokenizer.json|"bert-base-german-dbmdz-uncased"模型在开源社区上的tokenizer.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py |https://huggingface.co/TurkuNLP/bert-base-finnish-cased-v1/resolve/main/tokenizer.json|"TurkuNLP/bert-base-finnish-cased-v1"模型在开源社区上的tokenizer.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py 
|https://huggingface.co/TurkuNLP/bert-base-finnish-uncased-v1/resolve/main/tokenizer.json|"TurkuNLP/bert-base-finnish-uncased-v1"模型在开源社区上的tokenizer.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py |https://huggingface.co/wietsedv/bert-base-dutch-cased/resolve/main/tokenizer.json|"wietsedv/bert-base-dutch-cased"模型在开源社区上的tokenizer.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py |https://huggingface.co/bert-base-uncased/resolve/main/vocab.txt|"bert-base-uncased"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py |https://huggingface.co/bert-large-uncased/resolve/main/vocab.txt|"bert-large-uncased"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py |https://huggingface.co/bert-base-cased/resolve/main/vocab.txt|"bert-base-cased"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py |https://huggingface.co/bert-large-cased/resolve/main/vocab.txt|"bert-large-cased"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py |https://huggingface.co/bert-base-multilingual-uncased/resolve/main/vocab.txt|"bert-base-multilingual-uncased"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py |https://huggingface.co/bert-base-multilingual-cased/resolve/main/vocab.txt|"bert-base-multilingual-cased"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py |https://huggingface.co/bert-base-chinese/resolve/main/vocab.txt|"bert-base-chinese"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py |https://huggingface.co/bert-base-german-cased/resolve/main/vocab.txt|"bert-base-german-cased"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py |https://huggingface.co/bert-large-uncased-whole-word-masking/resolve/main/vocab.txt|"bert-large-uncased-whole-word-masking"模型在开源社区上的vocab.txt的下载链接| 
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py |https://huggingface.co/bert-large-cased-whole-word-masking/resolve/main/vocab.txt|"bert-large-cased-whole-word-masking"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py |https://huggingface.co/bert-large-uncased-whole-word-masking-finetuned-squad/resolve/main/vocab.txt|"bert-large-uncased-whole-word-masking-finetuned-squad"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py |https://huggingface.co/bert-large-cased-whole-word-masking-finetuned-squad/resolve/main/vocab.txt|"bert-large-cased-whole-word-masking-finetuned-squad"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py |https://huggingface.co/bert-base-cased-finetuned-mrpc/resolve/main/vocab.txt|"bert-base-cased-finetuned-mrpc"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py |https://huggingface.co/bert-base-german-dbmdz-cased/resolve/main/vocab.txt|"bert-base-german-dbmdz-cased"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py |https://huggingface.co/bert-base-german-dbmdz-uncased/resolve/main/vocab.txt|"bert-base-german-dbmdz-uncased"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py |https://huggingface.co/TurkuNLP/bert-base-finnish-cased-v1/resolve/main/vocab.txt|"TurkuNLP/bert-base-finnish-cased-v1"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py |https://huggingface.co/TurkuNLP/bert-base-finnish-uncased-v1/resolve/main/vocab.txt|"TurkuNLP/bert-base-finnish-uncased-v1"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py |https://huggingface.co/wietsedv/bert-base-dutch-cased/resolve/main/vocab.txt|"wietsedv/bert-base-dutch-cased"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert_generation/tokenization_bert_generation.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert_generation/tokenization_bert_generation.py 
|https://huggingface.co/google/bert_for_seq_generation_L-24_bbc_encoder/resolve/main/spiece.model|"bert_for_seq_generation"模型在开源社区上的spiece.model的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert_japanese/tokenization_bert_japanese.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert_japanese/tokenization_bert_japanese.py |https://huggingface.co/cl-tohoku/bert-base-japanese/resolve/main/vocab.txt|"cl-tohoku/bert-base-japanese"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert_japanese/tokenization_bert_japanese.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert_japanese/tokenization_bert_japanese.py |https://huggingface.co/cl-tohoku/bert-base-japanese-whole-word-masking/resolve/main/vocab.txt|"cl-tohoku/bert-base-japanese-whole-word-masking"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert_japanese/tokenization_bert_japanese.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert_japanese/tokenization_bert_japanese.py |https://huggingface.co/cl-tohoku/bert-base-japanese-char/resolve/main/vocab.txt|"cl-tohoku/bert-base-japanese-char"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert_japanese/tokenization_bert_japanese.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert_japanese/tokenization_bert_japanese.py |https://huggingface.co/cl-tohoku/bert-base-japanese-char-whole-word-masking/resolve/main/vocab.txt|"cl-tohoku/bert-base-japanese-char-whole-word-masking"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bertweet/tokenization_bertweet.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bertweet/tokenization_bertweet.py |https://huggingface.co/vinai/bertweet-base/resolve/main/vocab.txt|"vinai/bertweet-base"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bertweet/tokenization_bertweet.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bertweet/tokenization_bertweet.py |https://huggingface.co/vinai/bertweet-base/resolve/main/bpe.codes|"vinai/bertweet-base"模型在开源社区上的bpe.codes的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/big_bird/configuration_big_bird.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/big_bird/configuration_big_bird.py |https://huggingface.co/google/bigbird-roberta-base/resolve/main/config.json|"google/bigbird-roberta-base"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/big_bird/configuration_big_bird.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/big_bird/configuration_big_bird.py |https://huggingface.co/google/bigbird-roberta-large/resolve/main/config.json|"google/bigbird-roberta-large"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/big_bird/configuration_big_bird.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/big_bird/configuration_big_bird.py |https://huggingface.co/google/bigbird-base-trivia-itc/resolve/main/config.json|"google/bigbird-base-trivia-itc"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/big_bird/tokenization_big_bird_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/big_bird/tokenization_big_bird_fast.py |https://huggingface.co/google/bigbird-roberta-base/resolve/main/spiece.model|"google/bigbird-roberta-base"模型在开源社区上的spiece.model的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/big_bird/tokenization_big_bird_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/big_bird/tokenization_big_bird_fast.py |https://huggingface.co/google/bigbird-roberta-large/resolve/main/spiece.model|"google/bigbird-roberta-large"模型在开源社区上的spiece.model的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/big_bird/tokenization_big_bird_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/big_bird/tokenization_big_bird_fast.py |https://huggingface.co/google/bigbird-base-trivia-itc/resolve/main/spiece.model|"google/bigbird-base-trivia-itc"模型在开源社区上的spiece.model的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/big_bird/tokenization_big_bird_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/big_bird/tokenization_big_bird_fast.py |https://huggingface.co/google/bigbird-roberta-base/resolve/main/tokenizer.json|"google/bigbird-roberta-base"模型在开源社区上的tokenizer.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/big_bird/tokenization_big_bird_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/big_bird/tokenization_big_bird_fast.py |https://huggingface.co/google/bigbird-roberta-large/resolve/main/tokenizer.json|"google/bigbird-roberta-large"模型在开源社区上的tokenizer.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/big_bird/tokenization_big_bird_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/big_bird/tokenization_big_bird_fast.py |https://huggingface.co/google/bigbird-base-trivia-itc/resolve/main/tokenizer.json|"google/bigbird-base-trivia-itc"模型在开源社区上的tokenizer.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/big_bird/tokenization_big_bird_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/big_bird/tokenization_big_bird.py |https://huggingface.co/google/bigbird-roberta-base/resolve/main/spiece.model|"google/bigbird-roberta-base"模型在开源社区上的spiece.model的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/big_bird/tokenization_big_bird_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/big_bird/tokenization_big_bird.py |https://huggingface.co/google/bigbird-roberta-large/resolve/main/spiece.model|"google/bigbird-roberta-large"模型在开源社区上的spiece.model的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/big_bird/tokenization_big_bird_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/big_bird/tokenization_big_bird.py |https://huggingface.co/google/bigbird-base-trivia-itc/resolve/main/spiece.model|"google/bigbird-base-trivia-itc"模型在开源社区上的spiece.model的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bigbird_pegasus/configuration_bigbird_pegasus.py| 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bigbird_pegasus/configuration_bigbird_pegasus.py |https://huggingface.co/google/bigbird-pegasus-large-arxiv/resolve/main/config.json|"google/bigbird-pegasus-large-arxiv"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bigbird_pegasus/configuration_bigbird_pegasus.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bigbird_pegasus/configuration_bigbird_pegasus.py |https://huggingface.co/google/bigbird-pegasus-large-pubmed/resolve/main/config.json|"google/bigbird-pegasus-large-pubmed"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bigbird_pegasus/configuration_bigbird_pegasus.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bigbird_pegasus/configuration_bigbird_pegasus.py |https://huggingface.co/google/bigbird-pegasus-large-bigpatent/resolve/main/config.json|"google/bigbird-pegasus-large-bigpatent"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot/configuration_blenderbot.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/blenderbot/configuration_blenderbot.py |https://huggingface.co/facebook/blenderbot-3B/resolve/main/config.json|"facebook/blenderbot-3B"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot/tokenization_blenderbot_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/blenderbot/tokenization_blenderbot_fast.py |https://huggingface.co/facebook/blenderbot-3B/resolve/main/vocab.json|"facebook/blenderbot-3B"模型在开源社区上的vocab.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot/tokenization_blenderbot_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/blenderbot/tokenization_blenderbot_fast.py |https://huggingface.co/facebook/blenderbot-3B/resolve/main/merges.txt|"facebook/blenderbot-3B"模型在开源社区上的merges.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot/tokenization_blenderbot_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/blenderbot/tokenization_blenderbot_fast.py |https://huggingface.co/facebook/blenderbot-3B/resolve/main/tokenizer_config.json|"facebook/blenderbot-3B"模型在开源社区上的tokenizer_config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot/tokenization_blenderbot_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/blenderbot/tokenization_blenderbot.py |https://huggingface.co/facebook/blenderbot-3B/resolve/main/vocab.json|"facebook/blenderbot-3B"模型在开源社区上的vocab.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot/tokenization_blenderbot_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/blenderbot/tokenization_blenderbot.py |https://huggingface.co/facebook/blenderbot-3B/resolve/main/merges.txt|"facebook/blenderbot-3B"模型在开源社区上的merges.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot/tokenization_blenderbot_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/blenderbot/tokenization_blenderbot.py 
|https://huggingface.co/facebook/blenderbot-3B/resolve/main/tokenizer_config.json|"facebook/blenderbot-3B"模型在开源社区上的tokenizer_config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot_small/configuration_blenderbot_small.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/blenderbot_small/configuration_blenderbot_small.py |https://huggingface.co/facebook/blenderbot_small-90M/resolve/main/config.json|"facebook/blenderbot_small-90M"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot_small/tokenization_blenderbot_small_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/blenderbot_small/tokenization_blenderbot_small_fast.py |https://huggingface.co/facebook/blenderbot_small-90M/resolve/main/vocab.json|"facebook/blenderbot_small-90M"模型在开源社区上的vocab.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot_small/tokenization_blenderbot_small_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/blenderbot_small/tokenization_blenderbot_small_fast.py |https://huggingface.co/facebook/blenderbot_small-90M/resolve/main/merges.txt|"facebook/blenderbot_small-90M"模型在开源社区上的merges.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot_small/tokenization_blenderbot_small_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/blenderbot_small/tokenization_blenderbot_small_fast.py |https://huggingface.co/facebook/blenderbot_small-90M/resolve/main/tokenizer_config.json|"facebook/blenderbot_small-90M"模型在开源社区上的tokenizer_config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot_small/tokenization_blenderbot_small_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/blenderbot_small/tokenization_blenderbot_small.py |https://huggingface.co/facebook/blenderbot_small-90M/resolve/main/vocab.json|"facebook/blenderbot_small-90M"模型在开源社区上的vocab.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot_small/tokenization_blenderbot_small_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/blenderbot_small/tokenization_blenderbot_small.py |https://huggingface.co/facebook/blenderbot_small-90M/resolve/main/merges.txt|"facebook/blenderbot_small-90M"模型在开源社区上的merges.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot_small/tokenization_blenderbot_small_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/blenderbot_small/tokenization_blenderbot_small.py |https://huggingface.co/facebook/blenderbot_small-90M/resolve/main/tokenizer_config.json|"facebook/blenderbot_small-90M"模型在开源社区上的tokenizer_config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/camembert/configuration_camembert.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/camembert/configuration_camembert.py |https://huggingface.co/camembert-base/resolve/main/config.json|"camembert-base"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/camembert/configuration_camembert.py| 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/camembert/configuration_camembert.py |https://huggingface.co/Musixmatch/umberto-commoncrawl-cased-v1/resolve/main/config.json|"umberto-commoncrawl-cased-v1"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/camembert/configuration_camembert.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/camembert/configuration_camembert.py |https://huggingface.co/Musixmatch/umberto-wikipedia-uncased-v1/resolve/main/config.json|"umberto-wikipedia-uncased-v1"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/camembert/tokenization_camembert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/camembert/tokenization_camembert_fast.py |https://huggingface.co/camembert-base/resolve/main/sentencepiece.bpe.model|"camembert-base"模型在开源社区上的sentencepiece.bpe.model的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/camembert/tokenization_camembert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/camembert/tokenization_camembert_fast.py |https://huggingface.co/camembert-base/resolve/main/tokenizer.json|"camembert-base"模型在开源社区上的tokenizer.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/camembert/tokenization_camembert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/camembert/tokenization_camembert.py |https://huggingface.co/camembert-base/resolve/main/sentencepiece.bpe.model|"camembert-base"模型在开源社区上的sentencepiece.bpe.model的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/canine/configuration_canine.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/canine/configuration_canine.py |https://huggingface.co/google/canine-s/resolve/main/config.json|"google/canine-s"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/clip/configuration_clip.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/clip/configuration_clip.py |https://huggingface.co/openai/clip-vit-base-patch32/resolve/main/config.json|"openai/clip-vit-base-patch32"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/clip/tokenization_clip_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/clip/tokenization_clip_fast.py |https://huggingface.co/openai/clip-vit-base-patch32/resolve/main/vocab.json|"openai/clip-vit-base-patch32"模型在开源社区上的vocab.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/clip/tokenization_clip_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/clip/tokenization_clip_fast.py |https://huggingface.co/openai/clip-vit-base-patch32/resolve/main/merges.txt|"openai/clip-vit-base-patch32"模型在开源社区上的merges.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/clip/tokenization_clip_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/clip/tokenization_clip_fast.py |https://huggingface.co/openai/clip-vit-base-patch32/resolve/main/tokenizer.json|"openai/clip-vit-base-patch32"模型在开源社区上的tokenizer.json的下载链接| -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/clip/tokenization_clip_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/clip/tokenization_clip.py |https://huggingface.co/openai/clip-vit-base-patch32/resolve/main/vocab.json|"openai/clip-vit-base-patch32"模型在开源社区上的vocab.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/clip/tokenization_clip_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/clip/tokenization_clip.py |https://huggingface.co/openai/clip-vit-base-patch32/resolve/main/merges.txt|"openai/clip-vit-base-patch32"模型在开源社区上的merges.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convbert/configuration_convbert.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convbert/configuration_convbert.py |https://huggingface.co/YituTech/conv-bert-base/resolve/main/config.json|"YituTech/conv-bert-base"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convbert/configuration_convbert.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convbert/configuration_convbert.py |https://huggingface.co/YituTech/conv-bert-medium-small/resolve/main/config.json|"YituTech/conv-bert-medium-small"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convbert/configuration_convbert.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convbert/configuration_convbert.py |https://huggingface.co/YituTech/conv-bert-small/resolve/main/config.json|"YituTech/conv-bert-small"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convbert/tokenization_convbert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convbert/tokenization_convbert_fast.py |https://huggingface.co/YituTech/conv-bert-base/resolve/main/vocab.txt|"YituTech/conv-bert-base"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convbert/tokenization_convbert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convbert/tokenization_convbert_fast.py |https://huggingface.co/YituTech/conv-bert-medium-small/resolve/main/vocab.txt|"YituTech/conv-bert-medium-small"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convbert/tokenization_convbert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convbert/tokenization_convbert_fast.py |https://huggingface.co/YituTech/conv-bert-small/resolve/main/vocab.txt|"YituTech/conv-bert-small"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convbert/tokenization_convbert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convbert/tokenization_convbert.py |https://huggingface.co/YituTech/conv-bert-base/resolve/main/vocab.txt|"YituTech/conv-bert-base"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convbert/tokenization_convbert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convbert/tokenization_convbert.py 
|https://huggingface.co/YituTech/conv-bert-medium-small/resolve/main/vocab.txt|"YituTech/conv-bert-medium-small"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convbert/tokenization_convbert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convbert/tokenization_convbert.py |https://huggingface.co/YituTech/conv-bert-small/resolve/main/vocab.txt|"YituTech/conv-bert-small"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/configuration_convnext.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convnext/configuration_convnext.py |https://huggingface.co/facebook/convnext-tiny-224/resolve/main/config.json|"facebook/convnext-tiny-224"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py |http://images.cocodataset.org/val2017/000000039769.jpg|convert模型在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py |https://dl.fbaipublicfiles.com/convnext/convnext_tiny_1k_224_ema.pth|convert模型在开源社区上的预训练权重配置| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py |https://dl.fbaipublicfiles.com/convnext/convnext_small_1k_224_ema.pth|convert模型在开源社区上的预训练权重配置| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py |https://dl.fbaipublicfiles.com/convnext/convnext_base_1k_224_ema.pth|convert模型在开源社区上的预训练权重配置| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py |https://dl.fbaipublicfiles.com/convnext/convnext_base_1k_384.pth|convert模型在开源社区上的预训练权重配置| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py |https://dl.fbaipublicfiles.com/convnext/convnext_large_1k_224_ema.pth|convert模型在开源社区上的预训练权重配置| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py |https://dl.fbaipublicfiles.com/convnext/convnext_large_1k_384.pth|convert模型在开源社区上的预训练权重配置| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py |https://dl.fbaipublicfiles.com/convnext/convnext_base_22k_224.pth|convert模型在开源社区上的预训练权重配置| -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py |https://dl.fbaipublicfiles.com/convnext/convnext_large_22k_224.pth|convert模型在开源社区上的预训练权重配置| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py |https://dl.fbaipublicfiles.com/convnext/convnext_xlarge_22k_224.pth|convert模型在开源社区上的预训练权重配置| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py |https://dl.fbaipublicfiles.com/convnext/convnext_base_22k_1k_224.pth|convert模型在开源社区上的预训练权重配置| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py |https://dl.fbaipublicfiles.com/convnext/convnext_base_22k_1k_384.pth|convert模型在开源社区上的预训练权重配置| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py |https://dl.fbaipublicfiles.com/convnext/convnext_large_22k_1k_224.pth|convert模型在开源社区上的预训练权重配置| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py |https://dl.fbaipublicfiles.com/convnext/convnext_large_22k_1k_384.pth|convert模型在开源社区上的预训练权重配置| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py |https://dl.fbaipublicfiles.com/convnext/convnext_xlarge_22k_1k_224_ema.pth|convert模型在开源社区上的预训练权重配置| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py |https://dl.fbaipublicfiles.com/convnext/convnext_xlarge_22k_1k_384_ema.pth|convert模型在开源社区上的预训练权重配置| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py |https://dl.fbaipublicfiles.com/convnext/convnext_tiny_1k_224_ema.pth|convert模型在开源社区上的默认预训练权重配置| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/cpm/tokenization_cpm_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/cpm/tokenization_cpm_fast.py |https://huggingface.co/TsinghuaAI/CPM-Generate/resolve/main/spiece.model|"TsinghuaAI/CPM-Generate"模型在开源社区上的spiece.model的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/cpm/tokenization_cpm_fast.py| 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/cpm/tokenization_cpm_fast.py |https://huggingface.co/TsinghuaAI/CPM-Generate/resolve/main/tokenizer.json|"TsinghuaAI/CPM-Generate"模型在开源社区上的tokenizer.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/cpm/tokenization_cpm_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/cpm/tokenization_cpm.py |https://huggingface.co/TsinghuaAI/CPM-Generate/resolve/main/spiece.model|"TsinghuaAI/CPM-Generate"模型在开源社区上的spiece.model的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/ctrl/configuration_ctrl.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/ctrl/configuration_ctrl.py |https://huggingface.co/ctrl/resolve/main/config.json|"ctrl"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/ctrl/tokenization_ctrl.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/ctrl/tokenization_ctrl.py |https://raw.githubusercontent.com/salesforce/ctrl/master/ctrl-vocab.json|"ctrl"模型在开源社区上的ctrl-vocab.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/ctrl/tokenization_ctrl.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/ctrl/tokenization_ctrl.py |https://raw.githubusercontent.com/salesforce/ctrl/master/ctrl-merges.txt|"ctrl"模型在开源社区上的ctrl-merges.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/data2vec/configuration_data2vec_audio.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/data2vec/configuration_data2vec_audio.py |https://huggingface.co/facebook/data2vec-audio-base-960h/resolve/main/config.json|"facebook/data2vec-base-960h"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/data2vec/configuration_data2vec_text.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/data2vec/configuration_data2vec_text.py |https://huggingface.co/data2vec/resolve/main/config.json|"facebook/data2vec-text-base"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/configuration_deberta.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta/configuration_deberta.py |https://huggingface.co/microsoft/deberta-base/resolve/main/config.json|"microsoft/deberta-base"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/configuration_deberta.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta/configuration_deberta.py |https://huggingface.co/microsoft/deberta-large/resolve/main/config.json|"microsoft/deberta-large"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/configuration_deberta.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta/configuration_deberta.py |https://huggingface.co/microsoft/deberta-xlarge/resolve/main/config.json|"microsoft/deberta-xlarge"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/configuration_deberta.py| 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta/configuration_deberta.py |https://huggingface.co/microsoft/deberta-base-mnli/resolve/main/config.json|"microsoft/deberta-base-mnli"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/configuration_deberta.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta/configuration_deberta.py |https://huggingface.co/microsoft/deberta-large-mnli/resolve/main/config.json|"microsoft/deberta-large-mnli"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/configuration_deberta.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta/configuration_deberta.py |https://huggingface.co/microsoft/deberta-xlarge-mnli/resolve/main/config.json|"microsoft/deberta-xlarge-mnli"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta_fast.py |https://huggingface.co/microsoft/deberta-base/resolve/main/vocab.json|"microsoft/deberta-base"模型在开源社区上的vocab.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta_fast.py |https://huggingface.co/microsoft/deberta-large/resolve/main/vocab.json|"microsoft/deberta-large"模型在开源社区上的vocab.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta_fast.py |https://huggingface.co/microsoft/deberta-xlarge/resolve/main/vocab.json|"microsoft/deberta-xlarge"模型在开源社区上的vocab.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta_fast.py |https://huggingface.co/microsoft/deberta-base-mnli/resolve/main/vocab.json|"microsoft/deberta-base-mnli"模型在开源社区上的vocab.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta_fast.py |https://huggingface.co/microsoft/deberta-large-mnli/resolve/main/vocab.json|"microsoft/deberta-large-mnli"模型在开源社区上的vocab.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta_fast.py |https://huggingface.co/microsoft/deberta-xlarge-mnli/resolve/main/vocab.json|"microsoft/deberta-xlarge-mnli"模型在开源社区上的vocab.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta_fast.py |https://huggingface.co/microsoft/deberta-base/resolve/main/merges.txt|"microsoft/deberta-base"模型在开源社区上的merges.txt的下载链接| -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta_fast.py |https://huggingface.co/microsoft/deberta-large/resolve/main/merges.txt|"microsoft/deberta-large"模型在开源社区上的merges.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta_fast.py |https://huggingface.co/microsoft/deberta-xlarge/resolve/main/merges.txt|"microsoft/deberta-xlarge"模型在开源社区上的merges.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta_fast.py |https://huggingface.co/microsoft/deberta-base-mnli/resolve/main/merges.txt|"microsoft/deberta-base-mnli"模型在开源社区上的merges.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta_fast.py |https://huggingface.co/microsoft/deberta-large-mnli/resolve/main/merges.txt|"microsoft/deberta-large-mnli"模型在开源社区上的merges.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta_fast.py |https://huggingface.co/microsoft/deberta-xlarge-mnli/resolve/main/merges.txt|"microsoft/deberta-xlarge-mnli"模型在开源社区上的merges.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta.py |https://huggingface.co/microsoft/deberta-base/resolve/main/vocab.json|"microsoft/deberta-base"模型在开源社区上的vocab.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta.py |https://huggingface.co/microsoft/deberta-large/resolve/main/vocab.json|"microsoft/deberta-large"模型在开源社区上的vocab.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta.py |https://huggingface.co/microsoft/deberta-xlarge/resolve/main/vocab.json|"microsoft/deberta-xlarge"模型在开源社区上的vocab.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta.py |https://huggingface.co/microsoft/deberta-base-mnli/resolve/main/vocab.json|"microsoft/deberta-base-mnli"模型在开源社区上的vocab.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta.py 
|https://huggingface.co/microsoft/deberta-large-mnli/resolve/main/vocab.json|"microsoft/deberta-large-mnli"模型在开源社区上的vocab.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta.py |https://huggingface.co/microsoft/deberta-xlarge-mnli/resolve/main/vocab.json|"microsoft/deberta-xlarge-mnli"模型在开源社区上的vocab.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta.py |https://huggingface.co/microsoft/deberta-base/resolve/main/merges.txt|"microsoft/deberta-base"模型在开源社区上的merges.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta.py |https://huggingface.co/microsoft/deberta-large/resolve/main/merges.txt|"microsoft/deberta-large"模型在开源社区上的merges.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta.py |https://huggingface.co/microsoft/deberta-xlarge/resolve/main/merges.txt|"microsoft/deberta-xlarge"模型在开源社区上的merges.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta.py |https://huggingface.co/microsoft/deberta-base-mnli/resolve/main/merges.txt|"microsoft/deberta-base-mnli"模型在开源社区上的merges.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta.py |https://huggingface.co/microsoft/deberta-large-mnli/resolve/main/merges.txt|"microsoft/deberta-large-mnli"模型在开源社区上的merges.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta.py |https://huggingface.co/microsoft/deberta-xlarge-mnli/resolve/main/merges.txt|"microsoft/deberta-xlarge-mnli"模型在开源社区上的merges.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta_v2/configuration_deberta_v2.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta_v2/configuration_deberta_v2.py |https://huggingface.co/microsoft/deberta-v2-xlarge/resolve/main/config.json|"microsoft/deberta-v2-xlarge"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta_v2/configuration_deberta_v2.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta_v2/configuration_deberta_v2.py |https://huggingface.co/microsoft/deberta-v2-xxlarge/resolve/main/config.json|"microsoft/deberta-v2-xxlarge"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta_v2/configuration_deberta_v2.py| 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta_v2/configuration_deberta_v2.py |https://huggingface.co/microsoft/deberta-v2-xlarge-mnli/resolve/main/config.json|"microsoft/deberta-v2-xlarge-mnli"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta_v2/configuration_deberta_v2.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta_v2/configuration_deberta_v2.py |https://huggingface.co/microsoft/deberta-v2-xxlarge-mnli/resolve/main/config.json|"microsoft/deberta-v2-xxlarge-mnli"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta_v2/tokenization_deberta_v2_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta_v2/tokenization_deberta_v2.py |https://huggingface.co/microsoft/deberta-v2-xlarge/resolve/main/spm.model|"microsoft/deberta-v2-xlarge"模型在开源社区上的spm.model的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta_v2/tokenization_deberta_v2_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta_v2/tokenization_deberta_v2.py |https://huggingface.co/microsoft/deberta-v2-xxlarge/resolve/main/spm.model|"microsoft/deberta-v2-xxlarge"模型在开源社区上的spm.model的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta_v2/tokenization_deberta_v2_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta_v2/tokenization_deberta_v2.py |https://huggingface.co/microsoft/deberta-v2-xlarge-mnli/resolve/main/spm.model|"microsoft/deberta-v2-xlarge-mnli"模型在开源社区上的spm.model的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta_v2/tokenization_deberta_v2_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta_v2/tokenization_deberta_v2.py |https://huggingface.co/microsoft/deberta-v2-xxlarge-mnli/resolve/main/spm.model|"microsoft/deberta-v2-xxlarge-mnli"模型在开源社区上的spm.model的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deit/configuration_deit.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deit/configuration_deit.py |https://huggingface.co/facebook/deit-base-patch16-224/resolve/main/config.json|"facebook/deit-base-distilled-patch16-224"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/detr/configuration_detr.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/detr/configuration_detr.py |https://huggingface.co/facebook/detr-resnet-50/resolve/main/config.json|"facebook/detr-resnet-50"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/configuration_distilbert.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/distilbert/configuration_distilbert.py |https://huggingface.co/distilbert-base-uncased/resolve/main/config.json|"distilbert-base-uncased"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/configuration_distilbert.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/distilbert/configuration_distilbert.py 
|https://huggingface.co/distilbert-base-uncased-distilled-squad/resolve/main/config.json|"distilbert-base-uncased-distilled-squad"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/configuration_distilbert.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/distilbert/configuration_distilbert.py |https://huggingface.co/distilbert-base-cased/resolve/main/config.json|"distilbert-base-cased"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/configuration_distilbert.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/distilbert/configuration_distilbert.py |https://huggingface.co/distilbert-base-cased-distilled-squad/resolve/main/config.json|"distilbert-base-cased-distilled-squad"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/configuration_distilbert.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/distilbert/configuration_distilbert.py |https://huggingface.co/distilbert-base-german-cased/resolve/main/config.json|"distilbert-base-german-cased"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/configuration_distilbert.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/distilbert/configuration_distilbert.py |https://huggingface.co/distilbert-base-multilingual-cased/resolve/main/config.json|"distilbert-base-multilingual-cased"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/configuration_distilbert.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/distilbert/configuration_distilbert.py |https://huggingface.co/distilbert-base-uncased-finetuned-sst-2-english/resolve/main/config.json|"distilbert-base-uncased-finetuned-sst-2-english"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert_fast.py |https://huggingface.co/distilbert-base-uncased/resolve/main/vocab.txt|"distilbert-base-uncased"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert_fast.py |https://huggingface.co/distilbert-base-uncased-distilled-squad/resolve/main/vocab.txt|"distilbert-base-uncased-distilled-squad"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert_fast.py |https://huggingface.co/distilbert-base-cased/resolve/main/vocab.txt|"distilbert-base-cased"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert_fast.py 
|https://huggingface.co/distilbert-base-cased-distilled-squad/resolve/main/vocab.txt|"distilbert-base-cased-distilled-squad"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert_fast.py |https://huggingface.co/distilbert-base-german-cased/resolve/main/vocab.txt|"distilbert-base-german-cased"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert_fast.py |https://huggingface.co/distilbert-base-multilingual-cased/resolve/main/vocab.txt|"distilbert-base-multilingual-cased"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert_fast.py |https://huggingface.co/distilbert-base-uncased/resolve/main/tokenizer.json|"distilbert-base-uncased"模型在开源社区上的tokenizer.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert_fast.py |https://huggingface.co/distilbert-base-uncased-distilled-squad/resolve/main/tokenizer.json|"distilbert-base-uncased-distilled-squad"模型在开源社区上的tokenizer.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert_fast.py |https://huggingface.co/distilbert-base-cased/resolve/main/tokenizer.json|"distilbert-base-cased"模型在开源社区上的tokenizer.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert_fast.py |https://huggingface.co/distilbert-base-cased-distilled-squad/resolve/main/tokenizer.json|"distilbert-base-cased-distilled-squad"模型在开源社区上的tokenizer.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert_fast.py |https://huggingface.co/distilbert-base-german-cased/resolve/main/tokenizer.json|"distilbert-base-german-cased"模型在开源社区上的tokenizer.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert_fast.py |https://huggingface.co/distilbert-base-multilingual-cased/resolve/main/tokenizer.json|"distilbert-base-multilingual-cased"模型在开源社区上的tokenizer.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert.py 
|https://huggingface.co/distilbert-base-uncased/resolve/main/vocab.txt|"distilbert-base-uncased"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert.py |https://huggingface.co/distilbert-base-uncased-distilled-squad/resolve/main/vocab.txt|"distilbert-base-uncased-distilled-squad"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert.py |https://huggingface.co/distilbert-base-cased/resolve/main/vocab.txt|"distilbert-base-cased"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert.py |https://huggingface.co/distilbert-base-cased-distilled-squad/resolve/main/vocab.txt|"distilbert-base-cased-distilled-squad"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert.py |https://huggingface.co/distilbert-base-german-cased/resolve/main/vocab.txt|"distilbert-base-german-cased"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert.py |https://huggingface.co/distilbert-base-multilingual-cased/resolve/main/vocab.txt|"distilbert-base-multilingual-cased"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/configuration_dpr.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/dpr/configuration_dpr.py |https://huggingface.co/facebook/dpr-ctx_encoder-single-nq-base/resolve/main/config.json|"facebook/dpr-ctx_encoder-single-nq-base"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/configuration_dpr.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/dpr/configuration_dpr.py |https://huggingface.co/facebook/dpr-question_encoder-single-nq-base/resolve/main/config.json|"facebook/dpr-question_encoder-single-nq-base"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/configuration_dpr.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/dpr/configuration_dpr.py |https://huggingface.co/facebook/dpr-reader-single-nq-base/resolve/main/config.json|"facebook/dpr-reader-single-nq-base"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/configuration_dpr.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/dpr/configuration_dpr.py |https://huggingface.co/facebook/dpr-ctx_encoder-multiset-base/resolve/main/config.json|"facebook/dpr-ctx_encoder-multiset-base"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/configuration_dpr.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/dpr/configuration_dpr.py |https://huggingface.co/facebook/dpr-question_encoder-multiset-base/resolve/main/config.json|"facebook/dpr-question_encoder-multiset-base"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/configuration_dpr.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/dpr/configuration_dpr.py |https://huggingface.co/facebook/dpr-reader-multiset-base/resolve/main/config.json|"facebook/dpr-reader-multiset-base"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr_fast.py |https://huggingface.co/facebook/dpr-ctx_encoder-single-nq-base/resolve/main/vocab.txt|"facebook/dpr-ctx_encoder-single-nq-base"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr_fast.py |https://huggingface.co/facebook/dpr-ctx_encoder-multiset-base/resolve/main/vocab.txt|"facebook/dpr-ctx_encoder-multiset-base"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr_fast.py |https://huggingface.co/facebook/dpr-ctx_encoder-single-nq-base/resolve/main/tokenizer.json|"facebook/dpr-ctx_encoder-single-nq-base"模型在开源社区上的tokenizer.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr_fast.py |https://huggingface.co/facebook/dpr-ctx_encoder-multiset-base/resolve/main/tokenizer.json|"facebook/dpr-ctx_encoder-multiset-base"模型在开源社区上的tokenizer.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr_fast.py |https://huggingface.co/facebook/dpr-question_encoder-single-nq-base/resolve/main/vocab.txt|"facebook/dpr-question_encoder-single-nq-base"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr_fast.py |https://huggingface.co/facebook/dpr-question_encoder-multiset-base/resolve/main/vocab.txt|"facebook/dpr-question_encoder-multiset-base"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr_fast.py |https://huggingface.co/facebook/dpr-question_encoder-single-nq-base/resolve/main/tokenizer.json|"facebook/dpr-question_encoder-single-nq-base"模型在开源社区上的tokenizer.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr_fast.py| 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr_fast.py |https://huggingface.co/facebook/dpr-question_encoder-multiset-base/resolve/main/tokenizer.json|"facebook/dpr-question_encoder-multiset-base"模型在开源社区上的tokenizer.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr_fast.py |https://huggingface.co/facebook/dpr-reader-single-nq-base/resolve/main/vocab.txt|"facebook/dpr-reader-single-nq-base"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr_fast.py |https://huggingface.co/facebook/dpr-reader-multiset-base/resolve/main/vocab.txt|"facebook/dpr-reader-multiset-base"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr_fast.py |https://huggingface.co/facebook/dpr-reader-single-nq-base/resolve/main/tokenizer.json|"facebook/dpr-reader-single-nq-base"模型在开源社区上的tokenizer.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr_fast.py |https://huggingface.co/facebook/dpr-reader-multiset-base/resolve/main/tokenizer.json|"facebook/dpr-reader-multiset-base"模型在开源社区上的tokenizer.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr.py |https://huggingface.co/facebook/dpr-ctx_encoder-single-nq-base/resolve/main/vocab.txt|"facebook/dpr-ctx_encoder-single-nq-base"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr.py |https://huggingface.co/facebook/dpr-ctx_encoder-multiset-base/resolve/main/vocab.txt|"facebook/dpr-ctx_encoder-multiset-base"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr.py |https://huggingface.co/facebook/dpr-ctx_encoder-single-nq-base/resolve/main/tokenizer.json|"facebook/dpr-ctx_encoder-single-nq-base"模型在开源社区上的tokenizer.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr.py |https://huggingface.co/facebook/dpr-ctx_encoder-multiset-base/resolve/main/tokenizer.json|"facebook/dpr-ctx_encoder-multiset-base"模型在开源社区上的tokenizer.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr.py 
|https://huggingface.co/facebook/dpr-question_encoder-single-nq-base/resolve/main/vocab.txt|"facebook/dpr-question_encoder-single-nq-base"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr.py |https://huggingface.co/facebook/dpr-question_encoder-multiset-base/resolve/main/vocab.txt|"facebook/dpr-question_encoder-multiset-base"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr.py |https://huggingface.co/facebook/dpr-question_encoder-single-nq-base/resolve/main/tokenizer.json|"facebook/dpr-question_encoder-single-nq-base"模型在开源社区上的tokenizer.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr.py |https://huggingface.co/facebook/dpr-question_encoder-multiset-base/resolve/main/tokenizer.json|"facebook/dpr-question_encoder-multiset-base"模型在开源社区上的tokenizer.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr.py |https://huggingface.co/facebook/dpr-reader-single-nq-base/resolve/main/vocab.txt|"facebook/dpr-reader-single-nq-base"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr.py |https://huggingface.co/facebook/dpr-reader-multiset-base/resolve/main/vocab.txt|"facebook/dpr-reader-multiset-base"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr.py |https://huggingface.co/facebook/dpr-reader-single-nq-base/resolve/main/tokenizer.json|"facebook/dpr-reader-single-nq-base"模型在开源社区上的tokenizer.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr.py |https://huggingface.co/facebook/dpr-reader-multiset-base/resolve/main/tokenizer.json|"facebook/dpr-reader-multiset-base"模型在开源社区上的tokenizer.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/configuration_electra.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/electra/configuration_electra.py |https://huggingface.co/google/electra-small-generator/resolve/main/config.json|"google/electra-small-generator"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/configuration_electra.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/electra/configuration_electra.py |https://huggingface.co/google/electra-base-generator/resolve/main/config.json|"google/electra-base-generator"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/configuration_electra.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/electra/configuration_electra.py |https://huggingface.co/google/electra-large-generator/resolve/main/config.json|"google/electra-large-generator"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/configuration_electra.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/electra/configuration_electra.py |https://huggingface.co/google/electra-small-discriminator/resolve/main/config.json|"google/electra-small-discriminator"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/configuration_electra.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/electra/configuration_electra.py |https://huggingface.co/google/electra-base-discriminator/resolve/main/config.json|"google/electra-base-discriminator"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/configuration_electra.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/electra/configuration_electra.py |https://huggingface.co/google/electra-large-discriminator/resolve/main/config.json|"google/electra-large-discriminator"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra_fast.py |https://huggingface.co/google/electra-small-generator/resolve/main/vocab.txt|"google/electra-small-generator"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra_fast.py |https://huggingface.co/google/electra-base-generator/resolve/main/vocab.txt|"google/electra-base-generator"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra_fast.py |https://huggingface.co/google/electra-large-generator/resolve/main/vocab.txt|"google/electra-large-generator"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra_fast.py |https://huggingface.co/google/electra-small-discriminator/resolve/main/vocab.txt|"google/electra-small-discriminator"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra_fast.py |https://huggingface.co/google/electra-base-discriminator/resolve/main/vocab.txt|"google/electra-base-discriminator"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra_fast.py| 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra_fast.py |https://huggingface.co/google/electra-large-discriminator/resolve/main/vocab.txt|"google/electra-large-discriminator"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra_fast.py |https://huggingface.co/google/electra-small-generator/resolve/main/tokenizer.json|"google/electra-small-generator"模型在开源社区上的tokenizer.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra_fast.py |https://huggingface.co/google/electra-base-generator/resolve/main/tokenizer.json|"google/electra-base-generator"模型在开源社区上的tokenizer.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra_fast.py |https://huggingface.co/google/electra-large-generator/resolve/main/tokenizer.json|"google/electra-large-generator"模型在开源社区上的tokenizer.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra_fast.py |https://huggingface.co/google/electra-small-discriminator/resolve/main/tokenizer.json|"google/electra-small-discriminator"模型在开源社区上的tokenizer.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra_fast.py |https://huggingface.co/google/electra-base-discriminator/resolve/main/tokenizer.json|"google/electra-base-discriminator"模型在开源社区上的tokenizer.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra_fast.py |https://huggingface.co/google/electra-large-discriminator/resolve/main/tokenizer.json|"google/electra-large-discriminator"模型在开源社区上的tokenizer.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra.py |https://huggingface.co/google/electra-small-generator/resolve/main/vocab.txt|"google/electra-small-generator"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra.py |https://huggingface.co/google/electra-base-generator/resolve/main/vocab.txt|"google/electra-base-generator"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra.py 
|https://huggingface.co/google/electra-large-generator/resolve/main/vocab.txt|"google/electra-large-generator"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra.py |https://huggingface.co/google/electra-small-discriminator/resolve/main/vocab.txt|"google/electra-small-discriminator"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra.py |https://huggingface.co/google/electra-base-discriminator/resolve/main/vocab.txt|"google/electra-base-discriminator"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra_fast.py| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra.py |https://huggingface.co/google/electra-large-discriminator/resolve/main/vocab.txt|"google/electra-large-discriminator"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/configuration_{{cookiecutter.lowercase_modelname}}.py| Bert_Chinese_ID3433_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/configuration_{{cookiecutter.lowercase_modelname}}.py |https://huggingface.co/{{cookiecutter.checkpoint_identifier}}/resolve/main/config.json|"{{cookiecutter.checkpoint_identifier}}"模型在开源社区上的config.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_{{cookiecutter.lowercase_modelname}}.py| Bert_Chinese_ID3433_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_{{cookiecutter.lowercase_modelname}}.py |https://huggingface.co/{{cookiecutter.checkpoint_identifier}}/resolve/main/vocab.txt|"{{cookiecutter.checkpoint_identifier}}"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_{{cookiecutter.lowercase_modelname}}.py| Bert_Chinese_ID3433_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_{{cookiecutter.lowercase_modelname}}.py |https://huggingface.co/{{cookiecutter.checkpoint_identifier}}/resolve/main/vocab.json|"{{cookiecutter.checkpoint_identifier}}"模型在开源社区上的vocab.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_{{cookiecutter.lowercase_modelname}}.py| Bert_Chinese_ID3433_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_{{cookiecutter.lowercase_modelname}}.py |https://huggingface.co/{{cookiecutter.checkpoint_identifier}}/resolve/main/merges.txt|"{{cookiecutter.checkpoint_identifier}}"模型在开源社区上的merges.txt的下载链接| -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_{{cookiecutter.lowercase_modelname}}.py| Bert_Chinese_ID3433_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_{{cookiecutter.lowercase_modelname}}.py |https://huggingface.co/{{cookiecutter.checkpoint_identifier}}/resolve/main/vocab.txt|"{{cookiecutter.checkpoint_identifier}}"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_{{cookiecutter.lowercase_modelname}}.py| Bert_Chinese_ID3433_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_fast_{{cookiecutter.lowercase_modelname}}.py |https://huggingface.co/{{cookiecutter.checkpoint_identifier}}/resolve/main/vocab.txt|"{{cookiecutter.checkpoint_identifier}}"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_{{cookiecutter.lowercase_modelname}}.py| Bert_Chinese_ID3433_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_fast_{{cookiecutter.lowercase_modelname}}.py |https://huggingface.co/{{cookiecutter.checkpoint_identifier}}/resolve/main/vocab.json|"{{cookiecutter.checkpoint_identifier}}"模型在开源社区上的vocab.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_{{cookiecutter.lowercase_modelname}}.py| Bert_Chinese_ID3433_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_fast_{{cookiecutter.lowercase_modelname}}.py |https://huggingface.co/{{cookiecutter.checkpoint_identifier}}/resolve/main/merges.txt|"{{cookiecutter.checkpoint_identifier}}"模型在开源社区上的merges.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_fast_{{cookiecutter.lowercase_modelname}}.py| Bert_Chinese_ID3433_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_fast_{{cookiecutter.lowercase_modelname}}.py |https://huggingface.co/{{cookiecutter.checkpoint_identifier}}/resolve/main/tokenizer.json|"{{cookiecutter.checkpoint_identifier}}"模型在开源社区上的tokenizer.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_{{cookiecutter.lowercase_modelname}}.py| Bert_Chinese_ID3433_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_fast_{{cookiecutter.lowercase_modelname}}.py |https://huggingface.co/{{cookiecutter.checkpoint_identifier}}/resolve/main/vocab.txt|"{{cookiecutter.checkpoint_identifier}}"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_fast_{{cookiecutter.lowercase_modelname}}.py| 
Bert_Chinese_ID3433_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_fast_{{cookiecutter.lowercase_modelname}}.py |https://huggingface.co/{{cookiecutter.checkpoint_identifier}}/resolve/main/tokenizer.json|"{{cookiecutter.checkpoint_identifier}}"模型在开源社区上的tokenizer.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py| Bert_Chinese_ID3433_for_PyTorch/transformers/tests/clip/test_modeling_clip.py |http://images.cocodataset.org/val2017/000000039769.jpg|clip模型测试函数在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py| Bert_Chinese_ID3433_for_PyTorch/transformers/tests/clip/test_modeling_tf_clip.py |http://images.cocodataset.org/val2017/000000039769.jpg|clip模型测试函数在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py| Bert_Chinese_ID3433_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_classification.py |http://images.cocodataset.org/val2017/000000039769.jpg|图片分类pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py| Bert_Chinese_ID3433_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_classification.py |http://images.cocodataset.org/val2017/000000039769.jpg|图片分类pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py| Bert_Chinese_ID3433_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_classification.py |http://images.cocodataset.org/val2017/000000039769.jpg|图片分类pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py| Bert_Chinese_ID3433_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_classification.py |http://images.cocodataset.org/val2017/000000039769.jpg|图片分类pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py| Bert_Chinese_ID3433_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_classification.py |http://images.cocodataset.org/val2017/000000039769.jpg|图片分类pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py| Bert_Chinese_ID3433_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_classification.py |http://images.cocodataset.org/val2017/000000039769.jpg|图片分类pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py| Bert_Chinese_ID3433_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_classification.py |http://images.cocodataset.org/val2017/000000039769.jpg|图片分类pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py| Bert_Chinese_ID3433_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_classification.py |http://images.cocodataset.org/val2017/000000039769.jpg|图片分类pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py| Bert_Chinese_ID3433_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_classification.py |http://images.cocodataset.org/val2017/000000039769.jpg|图片分类pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py| Bert_Chinese_ID3433_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_classification.py |http://images.cocodataset.org/val2017/000000039769.jpg|图片分类pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py| Bert_Chinese_ID3433_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_classification.py |http://images.cocodataset.org/val2017/000000039769.jpg|图片分类pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py| Bert_Chinese_ID3433_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_segmentation.py |http://images.cocodataset.org/val2017/000000039769.jpg |图片分割pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py| Bert_Chinese_ID3433_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_segmentation.py |http://images.cocodataset.org/val2017/000000039769.jpg|图片分割pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py| Bert_Chinese_ID3433_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_segmentation.py |http://images.cocodataset.org/val2017/000000039769.jpg|图片分割pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py| Bert_Chinese_ID3433_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_segmentation.py |http://images.cocodataset.org/val2017/000000039769.jpg|图片分割pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py| Bert_Chinese_ID3433_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_segmentation.py |http://images.cocodataset.org/val2017/000000039769.jpg|图片分割pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py| Bert_Chinese_ID3433_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_segmentation.py |http://images.cocodataset.org/val2017/000000039769.jpg|图片分割pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py| Bert_Chinese_ID3433_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_segmentation.py |http://images.cocodataset.org/val2017/000000039769.jpg|图片分割pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py| Bert_Chinese_ID3433_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_segmentation.py |http://images.cocodataset.org/val2017/000000039769.jpg |图片分割pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py| Bert_Chinese_ID3433_for_PyTorch/transformers/tests/pipelines/test_pipelines_object_detection.py |http://images.cocodataset.org/val2017/000000039769.jpg|目标检测pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py| Bert_Chinese_ID3433_for_PyTorch/transformers/tests/pipelines/test_pipelines_object_detection.py |http://images.cocodataset.org/val2017/000000039769.jpg |目标检测pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py| Bert_Chinese_ID3433_for_PyTorch/transformers/tests/pipelines/test_pipelines_object_detection.py |http://images.cocodataset.org/val2017/000000039769.jpg|目标检测pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py| Bert_Chinese_ID3433_for_PyTorch/transformers/tests/pipelines/test_pipelines_object_detection.py |http://images.cocodataset.org/val2017/000000039769.jpg|目标检测pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py| Bert_Chinese_ID3433_for_PyTorch/transformers/tests/pipelines/test_pipelines_object_detection.py |http://images.cocodataset.org/val2017/000000039769.jpg|目标检测pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py| Bert_Chinese_ID3433_for_PyTorch/transformers/tests/pipelines/test_pipelines_object_detection.py |http://images.cocodataset.org/val2017/000000039769.jpg|目标检测pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py| Bert_Chinese_ID3433_for_PyTorch/transformers/tests/pipelines/test_pipelines_object_detection.py |http://images.cocodataset.org/val2017/000000039769.jpg|目标检测pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py| Bert_Chinese_ID3433_for_PyTorch/transformers/tests/pipelines/test_pipelines_object_detection.py |http://images.cocodataset.org/val2017/000000039769.jpg|目标检测pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py| Bert_Chinese_ID3433_for_PyTorch/transformers/tests/pipelines/test_pipelines_object_detection.py |http://images.cocodataset.org/val2017/000000039769.jpg|目标检测pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py| Bert_Chinese_ID3433_for_PyTorch/transformers/tests/pipelines/test_pipelines_object_detection.py |http://images.cocodataset.org/val2017/000000039769.jpg|目标检测pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py| Bert_Chinese_ID3433_for_PyTorch/transformers/tests/pipelines/test_pipelines_object_detection.py |http://images.cocodataset.org/val2017/000000039769.jpg |目标检测pipeline在开源社区上的验证集输入引用链接| -| 开发引入 | / | Bert_Chinese_ID3433_for_PyTorch/url.ini |https://moon-staging.huggingface.co |测试huggingface的api可用| -| 开发引入 | / | 
Bert_Chinese_ID3433_for_PyTorch/url.ini |https://bogus |bogus音视频开源下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/ja/index.mdx| Bert_Chinese_ID3433_for_PyTorch/transformers/utils/check_copies.py |https://huggingface.co/docs/transformers/master/|convert_readme_to_index函数功能model_list开源链接引用| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/release.py| Bert_Chinese_ID3433_for_PyTorch/transformers/utils/check_copies.py |https://huggingface.co/docs/transformers/|convert_readme_to_index函数功能model_list开源链接引用| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/download_glue_data.py| Bert_Chinese_ID3433_for_PyTorch/transformers/utils/download_glue_data.py |https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FCoLA.zip?alt=media&token=46d5e637-3411-4188-bc44-5809b5bfb5f4|"CoLA"任务数据集在开源社区上的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/download_glue_data.py| Bert_Chinese_ID3433_for_PyTorch/transformers/utils/download_glue_data.py |https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FSST-2.zip?alt=media&token=aabc5f6b-e466-44a2-b9b4-cf6337f84ac8|"SST"任务数据集在开源社区上的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/download_glue_data.py| Bert_Chinese_ID3433_for_PyTorch/transformers/utils/download_glue_data.py |https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2Fmrpc_dev_ids.tsv?alt=media&token=ec5c0836-31d5-48f4-b431-7480817f1adc|"MRPC"任务数据集在开源社区上的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/download_glue_data.py| Bert_Chinese_ID3433_for_PyTorch/transformers/utils/download_glue_data.py |https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FQQP.zip?alt=media&token=700c6acf-160d-4d89-81d1-de4191d02cb5|"QQP"任务数据集在开源社区上的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/download_glue_data.py| Bert_Chinese_ID3433_for_PyTorch/transformers/utils/download_glue_data.py |https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FSTS-B.zip?alt=media&token=bddb94a7-8706-4e0d-a694-1109e12273b5|"STS"任务数据集在开源社区上的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/download_glue_data.py| Bert_Chinese_ID3433_for_PyTorch/transformers/utils/download_glue_data.py |https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FMNLI.zip?alt=media&token=50329ea1-e339-40e2-809c-10c40afff3ce|"MNLI"任务数据集在开源社区上的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/download_glue_data.py| Bert_Chinese_ID3433_for_PyTorch/transformers/utils/download_glue_data.py |https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FSNLI.zip?alt=media&token=4afcfbb2-ff0c-4b2d-a09a-dbf07926f4df|"SNLI"任务数据集在开源社区上的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/download_glue_data.py| Bert_Chinese_ID3433_for_PyTorch/transformers/utils/download_glue_data.py |https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FQNLIv2.zip?alt=media&token=6fdcf570-0fc5-4631-8456-9505272d1601|"QNLI"任务数据集在开源社区上的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/download_glue_data.py| Bert_Chinese_ID3433_for_PyTorch/transformers/utils/download_glue_data.py 
|https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FRTE.zip?alt=media&token=5efa7e85-a0bb-4f19-8ea2-9e1840f077fb|"RTE"任务数据集在开源社区上的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/download_glue_data.py| Bert_Chinese_ID3433_for_PyTorch/transformers/utils/download_glue_data.py |https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FWNLI.zip?alt=media&token=068ad0a0-ded7-4bd7-99a5-5e00222e0faf|"WNLI"任务数据集在开源社区上的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/download_glue_data.py| Bert_Chinese_ID3433_for_PyTorch/transformers/utils/download_glue_data.py |https://storage.googleapis.com/mtl-sentence-representations.appspot.com |"diagnostic"任务数据集在开源社区上的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/download_glue_data.py| Bert_Chinese_ID3433_for_PyTorch/transformers/utils/download_glue_data.py |https://dl.fbaipublicfiles.com/senteval/senteval_data/msr_paraphrase_train.txt |MRPC任务训练集分类文件列表在开源社区上的txt下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/download_glue_data.py| Bert_Chinese_ID3433_for_PyTorch/transformers/utils/download_glue_data.py |https://dl.fbaipublicfiles.com/senteval/senteval_data/msr_paraphrase_test.txt |MRPC任务测试集分类文件列表在开源社区上的txt下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/notification_service_doc_tests.py| Bert_Chinese_ID3433_for_PyTorch/transformers/utils/notification_service_deprecated.py |https://github.com/huggingface/transformers/actions/runs |测试huggingface相应run_id可用| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/notification_service_doc_tests.py| Bert_Chinese_ID3433_for_PyTorch/transformers/utils/notification_service_doc_tests.py |https://github.com/huggingface/transformers/actions/runs |测试huggingface相应run_id可用| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/notification_service_doc_tests.py| Bert_Chinese_ID3433_for_PyTorch/transformers/utils/notification_service_doc_tests.py |https://github.com/huggingface/transformers/actions/runs |测试huggingface相应run_id可用| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/notification_service_doc_tests.py| Bert_Chinese_ID3433_for_PyTorch/transformers/utils/notification_service_doc_tests.py |https://github.com/huggingface/transformers/actions/runs |测试huggingface相应run_id可用| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/notification_service_doc_tests.py| Bert_Chinese_ID3433_for_PyTorch/transformers/utils/notification_service_doc_tests.py |https://api.github.com/repos/huggingface/transformers/actions/runs |测试huggingface相应run_id可用| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/notification_service_doc_tests.py| Bert_Chinese_ID3433_for_PyTorch/transformers/utils/notification_service.py |https://github.com/huggingface/transformers/actions/runs |测试huggingface相应run_id可用| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/notification_service_doc_tests.py| Bert_Chinese_ID3433_for_PyTorch/transformers/utils/notification_service.py |https://github.com/huggingface/transformers/actions/runs |测试huggingface相应run_id可用| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/notification_service_doc_tests.py| Bert_Chinese_ID3433_for_PyTorch/transformers/utils/notification_service.py |https://github.com/huggingface/transformers/actions/runs |测试huggingface相应run_id可用| -| 
开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/notification_service_doc_tests.py| Bert_Chinese_ID3433_for_PyTorch/transformers/utils/notification_service.py |https://api.github.com/repos/huggingface/transformers/actions/runs |测试huggingface相应run_id可用| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/ja/index.mdx| Bert_Chinese_ID3433_for_PyTorch/transformers/utils/release.py |https://huggingface.co/docs/transformers/master/model_doc|README中内容不符合特定条件时替换model list的链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/release.py| Bert_Chinese_ID3433_for_PyTorch/transformers/utils/release.py |https://huggingface.co/docs/transformers/model_doc|README中内容不符合特定条件时替换model list的链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/update_metadata.py| Bert_Chinese_ID3433_for_PyTorch/transformers/utils/update_metadata.py |https://github.com/huggingface/transformers/commit |commit消息提示metadata升级来源的开源链接引用| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/.circleci/config.yml | Bert_Chinese_ID3433_for_PyTorch/WikiExtractor.py | https:// | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/utils/check_table.py | Bert_Chinese_ID3433_for_PyTorch/transformers/utils/update_metadata.py | https://stackoverflow.com/questions/29916065/how-to-do-camelcase-split-in-python | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/distillation/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/utils/update_metadata.py | https://github.com/huggingface/transformers/commit/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/utils/notification_service.py | Bert_Chinese_ID3433_for_PyTorch/transformers/utils/notification_service_doc_tests.py | https://github.com/huggingface/transformers/actions/runs/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/utils/get_github_job_time.py | Bert_Chinese_ID3433_for_PyTorch/transformers/utils/notification_service_doc_tests.py | https://api.github.com/repos/huggingface/transformers/actions/runs/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/utils/notification_service.py | Bert_Chinese_ID3433_for_PyTorch/transformers/utils/notification_service_deprecated.py | https://github.com/huggingface/transformers/actions/runs/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/utils/notification_service.py | Bert_Chinese_ID3433_for_PyTorch/transformers/utils/notification_service.py | https://github.com/huggingface/transformers/actions/runs/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/utils/get_github_job_time.py | Bert_Chinese_ID3433_for_PyTorch/transformers/utils/notification_service.py | https://api.github.com/repos/huggingface/transformers/actions/runs/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/utils/download_glue_data.py | Bert_Chinese_ID3433_for_PyTorch/transformers/utils/download_glue_data.py | https://gist.github.com/W4ngatang/60c2bdb54d156a41194446737ce03e2e | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/utils/download_glue_data.py | Bert_Chinese_ID3433_for_PyTorch/transformers/utils/download_glue_data.py | https://download.microsoft.com/download/D/4/6/D46FF87A-F6B9-4252-AA8B-3604ED519838/MSRParaphraseCorpus.msi | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/utils/download_glue_data.py | Bert_Chinese_ID3433_for_PyTorch/transformers/utils/download_glue_data.py | https://storage.googleapis.com/mtl-sentence-representations.appspot.com/tsvsWithoutLabels%2FAX.tsv?GoogleAccessId=firebase-adminsdk-0khhl@mtl-sentence-representations.iam.gserviceaccount.com&Expires=2 | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/utils/download_glue_data.py | Bert_Chinese_ID3433_for_PyTorch/transformers/utils/download_glue_data.py | firebase-adminsdk-0khhl@mtl-sentence-representations.iam.gserviceaccount.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/utils/check_table.py | Bert_Chinese_ID3433_for_PyTorch/transformers/utils/check_table.py | https://stackoverflow.com/questions/29916065/how-to-do-camelcase-split-in-python | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/utils/check_repo.py | Bert_Chinese_ID3433_for_PyTorch/transformers/utils/check_repo.py | https://github.com/huggingface/doc-builder | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/roberta/test_modeling_roberta.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/xlm_roberta_xl/test_modeling_xlm_roberta_xl.py | https://github.com/huggingface/transformers/issues/1761 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/roberta/test_tokenization_roberta.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/xlm/test_tokenization_xlm.py | https://github.com/rsennrich/subword-nmt | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/unispeech_sat/test_modeling_unispeech_sat.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/wavlm/test_modeling_wavlm.py | https://github.com/pytorch/fairseq/issues/3227 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/wav2vec2_with_lm/test_processor_wav2vec2_with_lm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/wav2vec2_with_lm/test_processor_wav2vec2_with_lm.py | https://huggingface.co/hf-internal-testing/processor_with_lm/tree/main | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/wav2vec2_with_lm/test_processor_wav2vec2_with_lm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/wav2vec2_with_lm/test_processor_wav2vec2_with_lm.py | https://huggingface.co/datasets/common_voice/viewer/en/train | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/wav2vec2_with_lm/test_processor_wav2vec2_with_lm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/wav2vec2/test_tokenization_wav2vec2.py | https://huggingface.co/datasets/common_voice/viewer/en/train | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/unispeech_sat/test_modeling_unispeech_sat.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/wav2vec2/test_modeling_wav2vec2.py | https://github.com/pytorch/fairseq/issues/3227 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/unispeech_sat/test_modeling_unispeech_sat.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/wav2vec2/test_modeling_tf_wav2vec2.py | https://github.com/pytorch/fairseq/issues/3227 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/vision_text_dual_encoder/test_modeling_vision_text_dual_encoder.py | 
Bert_Chinese_ID3433_for_PyTorch/transformers/tests/vision_text_dual_encoder/test_modeling_vision_text_dual_encoder.py | https://github.com/rwightman/pytorch-image-models/blob/b9bd960a032c75ca6b808ddeed76bee5f3ed4972/timm/models/layers/helpers.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/vision_text_dual_encoder/test_modeling_vision_text_dual_encoder.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/vision_text_dual_encoder/test_modeling_flax_vision_text_dual_encoder.py | https://github.com/rwightman/pytorch-image-models/blob/b9bd960a032c75ca6b808ddeed76bee5f3ed4972/timm/models/layers/helpers.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/encoder_decoder/modeling_encoder_decoder.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/vision_encoder_decoder/test_modeling_tf_vision_encoder_decoder.py | https://github.com/huggingface/transformers/pull/13222/commits/dbb3c9de76eee235791d2064094654637c99f36d#r697304245 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/vision_encoder_decoder/test_modeling_tf_vision_encoder_decoder.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/vision_encoder_decoder/test_modeling_tf_vision_encoder_decoder.py | https://github.com/huggingface/transformers/pull/14016 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/repo_utils/test_check_copies.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/utils/test_utils_check_copies.py | https://huggingface.co/transformers/model_doc/albert.ht | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/utils/test_model_card.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/utils/test_model_card.py | https://arxiv.org/pdf/1810.03993.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/utils/test_add_new_model_like.py | https://huggingface.co/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/unispeech_sat/test_modeling_unispeech_sat.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/unispeech_sat/test_modeling_unispeech_sat.py | https://github.com/pytorch/fairseq/issues/3227 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/unispeech_sat/test_modeling_unispeech_sat.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/unispeech/test_modeling_unispeech.py | https://github.com/pytorch/fairseq/issues/3227 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/trainer/test_trainer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/trainer/test_trainer.py | https://github.com/huggingface/transformers/issues/12970 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/tokenization/test_tokenization_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/tokenization/test_tokenization_fast.py | https://github.com/huggingface/transformers/pull/12550 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/tokenization/test_tokenization_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/tokenization/test_tokenization_fast.py | https://github.com/huggingface/tokenizers/issues/537 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/test_modeling_common.py | 
Bert_Chinese_ID3433_for_PyTorch/transformers/tests/test_modeling_tf_common.py | https://github.com/huggingface/transformers/issues/14859 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/test_modeling_common.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/test_modeling_common.py | https://stackoverflow.com/questions/9541025/how-to-copy-a-python-class | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/test_modeling_common.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/test_modeling_common.py | https://github.com/huggingface/transformers/issues/14859 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/levit/test_modeling_levit.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/test_modeling_common.py | https://github.com/huggingface/transformers/issues/11780 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/mobilebert/test_tokenization_mobilebert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/tapas/test_tokenization_tapas.py | https://github.com/huggingface/tokenizers/issues/340 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/tapas/test_modeling_tf_tapas.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/tapas/test_modeling_tf_tapas.py | https://github.com/google-research/tapas/blob/master/tapas/models/segmented_tensor_test.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/tapas/test_modeling_tf_tapas.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/tapas/test_modeling_tapas.py | https://github.com/google-research/tapas/blob/master/tapas/models/segmented_tensor_test.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/pegasus/test_modeling_flax_pegasus.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/t5/test_modeling_flax_t5.py | https://jax.readthedocs.io/en/latest/gpu_memory_allocation.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/unispeech_sat/test_modeling_unispeech_sat.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/sew_d/test_modeling_sew_d.py | https://github.com/pytorch/fairseq/issues/3227 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/unispeech_sat/test_modeling_unispeech_sat.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/sew/test_modeling_sew.py | https://github.com/pytorch/fairseq/issues/3227 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/sagemaker/scripts/pytorch/run_glue_model_parallelism.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/text-classification/run_flax_glue.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/sagemaker/scripts/pytorch/run_glue_model_parallelism.py | https://huggingface.co/docs/datasets/package_reference/main_classes.html#datasets.Dataset.unique | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/roberta/test_tokenization_roberta.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/roberta/test_tokenization_roberta.py | https://github.com/rsennrich/subword-nmt | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/tests/models/roberta/test_modeling_roberta.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/roberta/test_modeling_roberta.py | https://github.com/huggingface/transformers/issues/1761 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/reformer/test_tokenization_reformer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/reformer/test_tokenization_reformer.py | https://github.com/huggingface/transformers/pull/11737#issuecomment-850769064 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/perceiver/test_modeling_perceiver.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/reformer/test_modeling_reformer.py | https://github.com/pytorch/pytorch/issues/36035 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/mobilebert/test_tokenization_mobilebert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/realm/test_tokenization_realm.py | https://github.com/huggingface/tokenizers/issues/340 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/plbart/test_tokenization_plbart.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/plbart/test_tokenization_plbart.py | https://gist.github.com/sshleifer/cba08bc2109361a74ac3760a7e30e4f4 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/pipelines/test_pipelines_zero_shot.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/pipelines/test_pipelines_zero_shot.py | https://github.com/huggingface/transformers/issues/13846 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/pipelines/test_pipelines_zero_shot.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/pipelines/test_pipelines_zero_shot.py | https://github.com/huggingface/transformers/issues/13381#issuecomment-912343499 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/pipelines/test_pipelines_token_classification.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/pipelines/test_pipelines_token_classification.py | https://github.com/huggingface/transformers/pull/4987 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/roberta/test_tokenization_roberta.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/phobert/test_tokenization_phobert.py | https://github.com/rsennrich/subword-nmt | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/levit/test_modeling_levit.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/perceiver/test_modeling_perceiver.py | https://github.com/huggingface/transformers/issues/11780 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/perceiver/test_modeling_perceiver.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/perceiver/test_modeling_perceiver.py | https://github.com/pytorch/pytorch/issues/36035 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/pegasus/test_tokenization_pegasus.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/pegasus/test_tokenization_pegasus.py | https://github.com/google-research/bigbird/raw/master/bigbird/vocab/pegasus.model | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/pegasus/test_modeling_flax_pegasus.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/pegasus/test_modeling_flax_pegasus.py 
| https://jax.readthedocs.io/en/latest/gpu_memory_allocation.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/roberta/test_tokenization_roberta.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/openai/test_tokenization_openai.py | https://github.com/rsennrich/subword-nmt | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/plbart/test_tokenization_plbart.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/mbart50/test_tokenization_mbart50.py | https://gist.github.com/sshleifer/cba08bc2109361a74ac3760a7e30e4f4 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/plbart/test_tokenization_plbart.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/mbart/test_tokenization_mbart.py | https://gist.github.com/sshleifer/cba08bc2109361a74ac3760a7e30e4f4 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/pegasus/test_modeling_flax_pegasus.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/mbart/test_modeling_flax_mbart.py | https://jax.readthedocs.io/en/latest/gpu_memory_allocation.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/pegasus/test_modeling_flax_pegasus.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/marian/test_modeling_flax_marian.py | https://jax.readthedocs.io/en/latest/gpu_memory_allocation.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/plbart/test_tokenization_plbart.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/m2m_100/test_tokenization_m2m_100.py | https://gist.github.com/sshleifer/cba08bc2109361a74ac3760a7e30e4f4 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/led/test_modeling_led.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/led/test_modeling_led.py | https://github.com/allenai/longformer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/led/test_modeling_led.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/led/test_modeling_led.py | https://github.com/huggingface/transformers/pull/9278#issue-544709661 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/layoutxlm/test_processor_layoutxlm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/layoutxlm/test_processor_layoutxlm.py | https://www.industrydocuments.ucsf.edu/docs/snbx0223 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/mobilebert/test_tokenization_mobilebert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/layoutlmv2/test_tokenization_layoutlmv2.py | https://github.com/huggingface/tokenizers/issues/340 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/test_modeling_common.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/layoutlmv2/test_modeling_layoutlmv2.py | https://stackoverflow.com/questions/9541025/how-to-copy-a-python-class | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/roberta/test_modeling_roberta.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/ibert/test_modeling_ibert.py | https://github.com/huggingface/transformers/issues/1761 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/unispeech_sat/test_modeling_unispeech_sat.py | 
Bert_Chinese_ID3433_for_PyTorch/transformers/tests/hubert/test_modeling_tf_hubert.py | https://github.com/pytorch/fairseq/issues/3227 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/unispeech_sat/test_modeling_unispeech_sat.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/hubert/test_modeling_hubert.py | https://github.com/pytorch/fairseq/issues/3227 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/hubert/test_modeling_hubert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/hubert/test_modeling_hubert.py | https://github.com/pytorch/fairseq/pull/3572 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/roberta/test_tokenization_roberta.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/gpt2/test_tokenization_gpt2.py | https://github.com/rsennrich/subword-nmt | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/roberta/test_tokenization_roberta.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/fsmt/test_tokenization_fsmt.py | https://github.com/rsennrich/subword-nmt | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/fixtures/tests_samples/wiki_text/wiki_00 | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/fixtures/tests_samples/wiki_text/wiki_00 | https://en.wikipedia.org/wiki?curid=12 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/fixtures/tests_samples/wiki_text/wiki_00 | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/fixtures/tests_samples/wiki_text/wiki_00 | https://en.wikipedia.org/wiki?curid=25 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/encoder_decoder/modeling_encoder_decoder.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/encoder_decoder/test_modeling_tf_encoder_decoder.py | https://github.com/huggingface/transformers/pull/13222/commits/dbb3c9de76eee235791d2064094654637c99f36d#r697304245 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/levit/test_modeling_levit.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/deit/test_modeling_deit.py | https://github.com/huggingface/transformers/issues/11780 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/deepspeed/test_deepspeed.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/deepspeed/test_deepspeed.py | https://github.com/microsoft/DeepSpeed/issues/1612 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/roberta/test_tokenization_roberta.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/deberta/test_tokenization_deberta.py | https://github.com/rsennrich/subword-nmt | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/roberta/test_modeling_roberta.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/data2vec/test_modeling_data2vec_text.py | https://github.com/huggingface/transformers/issues/1761 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/unispeech_sat/test_modeling_unispeech_sat.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/data2vec/test_modeling_data2vec_audio.py | https://github.com/pytorch/fairseq/issues/3227 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/roberta/test_tokenization_roberta.py | 
Bert_Chinese_ID3433_for_PyTorch/transformers/tests/ctrl/test_tokenization_ctrl.py | https://github.com/rsennrich/subword-nmt | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/pegasus/test_modeling_flax_pegasus.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/blenderbot_small/test_modeling_flax_blenderbot_small.py | https://jax.readthedocs.io/en/latest/gpu_memory_allocation.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/pegasus/test_modeling_flax_pegasus.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/blenderbot/test_modeling_flax_blenderbot.py | https://jax.readthedocs.io/en/latest/gpu_memory_allocation.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/big_bird/test_tokenization_big_bird.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/big_bird/test_tokenization_big_bird.py | https://github.com/google-research/bigbird/blob/master/bigbird/vocab/gpt2.model?raw=true | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/roberta/test_tokenization_roberta.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/bertweet/test_tokenization_bertweet.py | https://github.com/rsennrich/subword-nmt | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/mobilebert/test_tokenization_mobilebert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/bert/test_tokenization_bert.py | https://github.com/huggingface/tokenizers/issues/340 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/barthez/test_tokenization_barthez.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/barthez/test_tokenization_barthez.py | https://github.com/huggingface/transformers/issues/11457 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/pegasus/test_modeling_flax_pegasus.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/bart/test_modeling_flax_bart.py | https://jax.readthedocs.io/en/latest/gpu_memory_allocation.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/auto/test_tokenization_auto.py | Bert_Chinese_ID3433_for_PyTorch/transformers/tests/auto/test_tokenization_auto.py | https://github.com/huggingface/transformers/pull/13251 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_fast_{{cookiecutter.lowercase_modelname}}.py | https://huggingface.co/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_{{cookiecutter.lowercase_modelname}}.py | https://huggingface.co/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | https://huggingface.co/models?filter= | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | https://github.com/tensorflow/mesh/blob/8d2465e9bc93129b913b5ccc6a59aa97abd96ec6/mesh_tensorflow/transformer/transformer_layers.py#L270 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 
模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | https://arxiv.org/abs/1910.13461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | https://github.com/google/flax/blob/491ce18759622506588784b4fca0e4bf05f8c8cd/flax/linen/attention.py#L252 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_{{cookiecutter.lowercase_modelname}}.py | https://huggingface.co/models?filter= | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_{{cookiecutter.lowercase_modelname}}.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_{{cookiecutter.lowercase_modelname}}.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_{{cookiecutter.lowercase_modelname}}.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_{{cookiecutter.lowercase_modelname}}.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_{{cookiecutter.lowercase_modelname}}.py | https://arxiv.org/abs/1910.13461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_{{cookiecutter.lowercase_modelname}}.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/configuration_{{cookiecutter.lowercase_modelname}}.py | https://huggingface.co/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/configuration_{{cookiecutter.lowercase_modelname}}.py | https://huggingface.co/models?filter= | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/configuration_{{cookiecutter.lowercase_modelname}}.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/templates/adding_a_new_example_script/{{cookiecutter.directory_name}}/run_{{cookiecutter.example_shortcut}}.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/templates/adding_a_new_example_script/{{cookiecutter.directory_name}}/run_{{cookiecutter.example_shortcut}}.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/utils/fx.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/utils/fx.py | https://github.com/pytorch/pytorch/pull/55888 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/training_args_tf.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/training_args_tf.py | https://docs.python.org/3/library/argparse#module-argparse | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/pt/multilingual.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/training_args_tf.py | https://github.com/huggingface/transformers/tree/master/examples | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/training_args_tf.py | https://www.tensorflow.org/tensorboard | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/lightning_base.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/training_args_tf.py | https://nvidia.github.io/apex/amp | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/training_args_tf.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/training_args.py | https://docs.python.org/3/library/argparse#module-argparse | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/pt/multilingual.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/training_args.py | https://github.com/huggingface/transformers/tree/master/examples | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/training_args.py | https://www.tensorflow.org/tensorboard | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/lightning_base.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/training_args.py | https://nvidia.github.io/apex/amp | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/main_classes/callback.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/training_args.py | https://www.wandb.com/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/main_classes/callback.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/training_args.py | https://www.mlflow.org/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/perf_train_gpu_many.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/training_args.py | https://github.com/facebookresearch/fairscale | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/main_classes/deepspeed.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/training_args.py | https://github.com/microsoft/deepspeed | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/lightning_base.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/training_args.py | https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/training_args.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/training_args.py | https://github.com/huggingface/transformers/issues/10628 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/trainer_tf.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/trainer_tf.py | https://docs.wandb.com/huggingface | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/trainer_tf.py | https://www.comet.ml/docs/python-sdk/huggingface/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/trainer_tf.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/trainer_tf.py | https://www.comet.ml/docs/python-sdk/advanced/#comet-configuration-variables | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/trainer_pt_utils.py | 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/trainer_pt_utils.py | https://github.com/numpy/numpy/blob/a47ecdea856986cd60eabbd53265c2ca5916ad5d/doc/source/user/basics.types.rst | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/trainer_pt_utils.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/trainer_pt_utils.py | https://github.com/pytorch/pytorch/issues/16266 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/trainer.py | https://www.github.com/nvidia/apex | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/trainer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/trainer.py | https://github.com/huggingface/transformers/pull/4659#issuecomment-643356021 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/trainer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/trainer.py | https://optuna.readthedocs.io/en/stable/reference/generated/optuna.study.create_study.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/trainer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/trainer.py | https://docs.ray.io/en/latest/tune/api_docs/execution.html#tune-run | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/trainer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/trainer.py | https://app.sigopt.com/docs/endpoints/experiments/create | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/tokenization_utils_base.py | https://huggingface.co/models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/tokenization_utils_base.py | https://huggingface.co/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/tokenization_utils.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/tokenization_utils.py | https://en.wikipedia.org/wiki/Trie | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/layoutxlm/tokenization_layoutxlm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/tokenization_utils.py | https://github.com/huggingface/transformers/pull/2674 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/big_bird/tokenization_big_bird.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/tokenization_utils.py | https://github.com/huggingface/transformers/issues/1133 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/testing_utils.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/testing_utils.py | https://github.com/fastai/fastai/blob/master/tests/utils/text.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/testing_utils.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/testing_utils.py | https://stackoverflow.com/a/64789046/9201239 | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/src/transformers/testing_utils.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/testing_utils.py | https://stackoverflow.com/a/34333710/9201239 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/testing_utils.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/testing_utils.py | https://github.com/pytest-dev/pytest/blob/897f151e/src/_pytest/runner.py#L66 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/testing_utils.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/testing_utils.py | https://github.com/pytest-dev/pytest/blob/897f151e/src/_pytest/terminal.py#L814 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/testing_utils.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/testing_utils.py | https://stackoverflow.com/a/59041913/9201239 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/testing_utils.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/testing_utils.py | https://docs.python.org/3/library/asyncio-subprocess.html#asyncio.asyncio.subprocess.Process.wait | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/pipelines/zero_shot_image_classification.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/pipelines/zero_shot_image_classification.py | https://huggingface.co/models?filter=zero-shot-image-classification | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/pipelines/zero_shot_classification.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/pipelines/zero_shot_classification.py | https://huggingface.co/models?search=nli | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/pipelines/token_classification.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/pipelines/token_classification.py | https://huggingface.co/models?filter=token-classification | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/pipelines/text2text_generation.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/pipelines/text2text_generation.py | https://huggingface.co/models?filter=text2text-generation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/pipelines/text2text_generation.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/pipelines/text2text_generation.py | https://huggingface.co/models?filter=summarization | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/pipelines/text2text_generation.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/pipelines/text2text_generation.py | https://huggingface.co/models?filter=translation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_clm_flax.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/pipelines/text_generation.py | https://huggingface.co/models?filter=text-generation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/text-generation/run_generation.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/pipelines/text_generation.py | 
https://github.com/rusiaaman/XLNet-gen#methodology | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/text-generation/run_generation.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/pipelines/text_generation.py | https://medium.com/@amanrusia/xlnet-speaks-comparison-to-gpt-2-ea1a4e9ba39e | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/pipelines/text_generation.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/pipelines/text_generation.py | https://github.com/huggingface/transformers/issues/14033#issuecomment-948385227 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/pipelines/text_classification.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/pipelines/text_classification.py | https://huggingface.co/models?filter=text-classification | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/pipelines/table_question_answering.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/pipelines/table_question_answering.py | https://huggingface.co/models?filter=table-question-answering | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/pipelines/question_answering.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/pipelines/question_answering.py | https://huggingface.co/models?filter=question-answering | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/pipelines/question_answering.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/pipelines/question_answering.py | https://github.com/facebookresearch/DrQA | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/pipelines/object_detection.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/pipelines/object_detection.py | https://huggingface.co/models?filter=object-detection | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/pipelines/image_segmentation.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/pipelines/image_segmentation.py | https://huggingface.co/models?filter=image-segmentation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/tensorflow/image-classification/run_image_classification.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/pipelines/image_classification.py | https://huggingface.co/models?filter=image-classification | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_mlm_flax.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/pipelines/fill_mask.py | https://huggingface.co/models?filter=fill-mask | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/pipelines/fill_mask.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/pipelines/fill_mask.py | https://github.com/huggingface/transformers/pull/10222 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/pipelines/feature_extraction.py | https://huggingface.co/models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/pipelines/conversational.py | 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/pipelines/conversational.py | https://huggingface.co/models?filter=conversational | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/pipelines/base.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/tensorflow/image-classification/run_image_classification.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/pipelines/base.py | https://pytorch.org/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/pipelines/base.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/pipelines/base.py | https://huggingface.co/transformers/main_classes/pipelines.html#pipeline-batching | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/pipelines/audio_classification.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/pipelines/audio_classification.py | https://huggingface.co/models?filter=audio-classification | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/pipelines/__init__.py | https://huggingface.co/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/optimization_tf.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/optimization_tf.py | https://arxiv.org/abs/1711.05101 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/optimization_tf.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/optimization_tf.py | https://arxiv.org/abs/1904.09237 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/optimization_tf.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/optimization_tf.py | https://github.com/OpenNMT/OpenNMT-tf/blob/master/opennmt/optimizers/utils.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/optimization.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/optimization.py | https://github.com/google-research/bert/blob/f39e881b169b9d53bea03d2d341b31707a6c052b/optimization.py#L37 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/optimization_tf.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/optimization.py | https://arxiv.org/abs/1711.05101 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/optimization.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/optimization.py | https://github.com/pytorch/fairseq/blob/master/fairseq/optim/adafactor.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/optimization.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/optimization.py | https://arxiv.org/abs/1804.04235 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/optimization.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/optimization.py | https://discuss.huggingface.co/t/t5-finetuning-tips/684/3 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/optimization.py | 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/optimization.py | https://github.com/huggingface/transformers/blob/8395f14de6068012787d83989c3627c3df6a252b/src/transformers/optimization.py#L505 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/yoso/configuration_yoso.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/yoso/modeling_yoso.py | https://huggingface.co/models?filter=yoso | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/yoso/modeling_yoso.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/yoso/modeling_yoso.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/kernels/yoso/fast_lsh_cumulation_cuda.cu | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/yoso/fast_lsh_cumulation_cuda.cu | https://github.com/mlpen/YOSO/blob/main/encoders/backbones/efficient_attentions/yoso/yoso_v1/cuda/fast_lsh_cumulation_cuda.cu | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/kernels/yoso/fast_lsh_cumulation.cu | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/yoso/fast_lsh_cumulation.cu | https://github.com/mlpen/YOSO/blob/main/encoders/backbones/efficient_attentions/yoso/yoso_v1/cuda/fast_lsh_cumulation.cu | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/yoso.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/yoso/convert_yoso_pytorch_to_pytorch.py | https://github.com/mlpen/YOSO | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/yoso/configuration_yoso.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/yoso/configuration_yoso.py | https://huggingface.co/uw-madison/yoso-4096/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/yoso/configuration_yoso.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/yoso/configuration_yoso.py | https://huggingface.co/models?filter=yoso | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/yoso/configuration_yoso.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/yoso/configuration_yoso.py | https://huggingface.co/uw-madison/yoso-4096 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlnet/tokenization_xlnet.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlnet/tokenization_xlnet_fast.py | https://huggingface.co/xlnet-base-cased/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlnet/tokenization_xlnet.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlnet/tokenization_xlnet_fast.py | https://huggingface.co/xlnet-large-cased/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlnet/tokenization_xlnet_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlnet/tokenization_xlnet_fast.py | https://huggingface.co/xlnet-base-cased/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlnet/tokenization_xlnet_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlnet/tokenization_xlnet_fast.py | https://huggingface.co/xlnet-large-cased/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/pegasus/tokenization_pegasus_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlnet/tokenization_xlnet_fast.py | https://huggingface.co/docs/tokenizers/python/latest/components.html?highlight=unigram#models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlnet/tokenization_xlnet_fast.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlnet/tokenization_xlnet.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlnet/tokenization_xlnet.py | https://huggingface.co/xlnet-base-cased/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlnet/tokenization_xlnet.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlnet/tokenization_xlnet.py | https://huggingface.co/xlnet-large-cased/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlnet/tokenization_xlnet.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlnet/tokenization_xlnet.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/xlnet.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlnet/modeling_xlnet.py | https://huggingface.co/models?filter=xlnet | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlnet/modeling_xlnet.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlnet/modeling_xlnet.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlnet/modeling_xlnet.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/xlnet.md | 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlnet/modeling_tf_xlnet.py | https://huggingface.co/models?filter=xlnet | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlnet/modeling_tf_xlnet.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlnet/configuration_xlnet.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlnet/configuration_xlnet.py | https://huggingface.co/xlnet-base-cased/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlnet/configuration_xlnet.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlnet/configuration_xlnet.py | https://huggingface.co/xlnet-large-cased/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlnet/tokenization_xlnet.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlnet/configuration_xlnet.py | https://huggingface.co/xlnet-large-cased | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlnet/configuration_xlnet.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlnet/configuration_xlnet.py | https://huggingface.co/transformers/quickstart.html#using-the-past | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlnet/configuration_xlnet.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlnet/configuration_xlnet.py | https://github.com/zihangdai/xlnet/issues/41#issuecomment-505102587 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm_roberta_xl/modeling_xlm_roberta_xl.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm_roberta_xl/modeling_xlm_roberta_xl.py | https://huggingface.co/models?filter=xlm-roberta-xl | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm_roberta_xl/modeling_xlm_roberta_xl.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm_roberta_xl/modeling_xlm_roberta_xl.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm_roberta_xl/modeling_xlm_roberta_xl.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm_roberta_xl/configuration_xlm_roberta_xl.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm_roberta_xl/configuration_xlm_roberta_xl.py | https://huggingface.co/facebook/xlm-roberta-xl/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm_roberta_xl/configuration_xlm_roberta_xl.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm_roberta_xl/configuration_xlm_roberta_xl.py | https://huggingface.co/facebook/xlm-roberta-xxl/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm_roberta_xl/modeling_xlm_roberta_xl.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm_roberta_xl/configuration_xlm_roberta_xl.py | https://huggingface.co/models?filter=xlm-roberta-xl | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/jax-projects/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm_roberta_xl/configuration_xlm_roberta_xl.py | https://huggingface.co/bert-base-uncased | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/question-answering/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm_roberta_xl/configuration_xlm_roberta_xl.py | https://arxiv.org/abs/1803.02155 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/question-answering/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm_roberta_xl/configuration_xlm_roberta_xl.py | https://arxiv.org/abs/2009.13658 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | https://huggingface.co/xlm-roberta-base/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | https://huggingface.co/xlm-roberta-large/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | https://huggingface.co/xlm-roberta-large-finetuned-conll02-dutch/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | https://huggingface.co/xlm-roberta-large-finetuned-conll02-spanish/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | https://huggingface.co/xlm-roberta-large-finetuned-conll03-english/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | 
https://huggingface.co/xlm-roberta-large-finetuned-conll03-german/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | https://huggingface.co/xlm-roberta-base/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | https://huggingface.co/xlm-roberta-large/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | https://huggingface.co/xlm-roberta-large-finetuned-conll02-dutch/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | https://huggingface.co/xlm-roberta-large-finetuned-conll02-spanish/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | https://huggingface.co/xlm-roberta-large-finetuned-conll03-english/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | https://huggingface.co/xlm-roberta-large-finetuned-conll03-german/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/layoutxlm/tokenization_layoutxlm_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | https://huggingface.co/docs/tokenizers/python/latest/components.html?highlight=BPE#models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | https://huggingface.co/xlm-roberta-base/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | https://huggingface.co/xlm-roberta-large/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | 
https://huggingface.co/xlm-roberta-large-finetuned-conll02-dutch/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | https://huggingface.co/xlm-roberta-large-finetuned-conll02-spanish/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | https://huggingface.co/xlm-roberta-large-finetuned-conll03-english/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | https://huggingface.co/xlm-roberta-large-finetuned-conll03-german/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/xlm-roberta.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm_roberta/modeling_xlm_roberta.py | https://huggingface.co/models?filter=xlm-roberta | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm_roberta/modeling_xlm_roberta.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/xlm-roberta.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm_roberta/modeling_tf_xlm_roberta.py | https://huggingface.co/models?filter=xlm-roberta | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm_roberta/modeling_tf_xlm_roberta.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/xlm-roberta.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm_roberta/modeling_flax_xlm_roberta.py | https://huggingface.co/models?filter=xlm-roberta | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm_roberta/modeling_flax_xlm_roberta.py | 
https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm_roberta/modeling_flax_xlm_roberta.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm_roberta/modeling_flax_xlm_roberta.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm_roberta/modeling_flax_xlm_roberta.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm_roberta/modeling_flax_xlm_roberta.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm_roberta/configuration_xlm_roberta.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm_roberta/configuration_xlm_roberta.py | https://huggingface.co/xlm-roberta-base/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm_roberta/configuration_xlm_roberta.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm_roberta/configuration_xlm_roberta.py | https://huggingface.co/xlm-roberta-large/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm_roberta/configuration_xlm_roberta.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm_roberta/configuration_xlm_roberta.py | https://huggingface.co/xlm-roberta-large-finetuned-conll02-dutch/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm_roberta/configuration_xlm_roberta.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm_roberta/configuration_xlm_roberta.py | https://huggingface.co/xlm-roberta-large-finetuned-conll02-spanish/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm_roberta/configuration_xlm_roberta.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm_roberta/configuration_xlm_roberta.py | https://huggingface.co/xlm-roberta-large-finetuned-conll03-english/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm_roberta/configuration_xlm_roberta.py | 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm_roberta/configuration_xlm_roberta.py | https://huggingface.co/xlm-roberta-large-finetuned-conll03-german/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm_prophetnet/tokenization_xlm_prophetnet.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm_prophetnet/tokenization_xlm_prophetnet.py | https://huggingface.co/microsoft/xprophetnet-large-wiki100-cased/resolve/main/prophetnet.tokenizer | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm_prophetnet/tokenization_xlm_prophetnet.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm_prophetnet/tokenization_xlm_prophetnet.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/xlm-prophetnet.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm_prophetnet/modeling_xlm_prophetnet.py | https://huggingface.co/models?filter=xprophetnet | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm_prophetnet/configuration_xlm_prophetnet.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm_prophetnet/configuration_xlm_prophetnet.py | https://huggingface.co/microsoft/xprophetnet-large-wiki100-cased/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-en-2048/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-ende-1024/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-enfr-1024/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-enro-1024/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-tlm-xnli15-1024/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | 
https://huggingface.co/xlm-mlm-xnli15-1024/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-clm-enfr-1024/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-clm-ende-1024/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-17-1280/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-100-1280/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-en-2048/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-ende-1024/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-enfr-1024/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-enro-1024/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-tlm-xnli15-1024/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-xnli15-1024/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-clm-enfr-1024/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-clm-ende-1024/resolve/main/merges.txt | 
模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-17-1280/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-100-1280/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://github.com/facebookresearch/XLM/blob/master/tools/lowercase_and_remove_accent.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://github.com/moses-smt/mosesdecoder/blob/master/scripts/tokenizer/replace-unicode-punctuation.perl | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://github.com/moses-smt/mosesdecoder/blob/master/scripts/tokenizer/remove-non-printing-char.perl | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://github.com/rsennrich/wmt16-scripts/blob/master/preprocess/normalise-romanian.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://github.com/rsennrich/wmt16-scripts/blob/master/preprocess/remove-diacritics.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://github.com/neubig/kyt | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/seq2seq/romanian_postprocessing.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | git@github.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://github.com/alvations/sacremoses | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/awesome-transformers.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://github.com/PyThaiNLP/pythainlp | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://github.com/chezou/Mykytea-python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://github.com/neubig/kytea | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://github.com/fxsjy/jieba | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://nlp.stanford.edu/software/stanford-segmenter-2018-10-16.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://github.com/facebookresearch/XLM/tree/master/tools | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/xlm.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm/modeling_xlm.py | https://huggingface.co/models?filter=xlm | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm/modeling_xlm.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/xlm.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm/modeling_tf_xlm.py | https://huggingface.co/models?filter=xlm | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm/modeling_tf_xlm.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/configuration_xlm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm/configuration_xlm.py | https://huggingface.co/xlm-mlm-en-2048/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/configuration_xlm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm/configuration_xlm.py | https://huggingface.co/xlm-mlm-ende-1024/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/configuration_xlm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm/configuration_xlm.py | https://huggingface.co/xlm-mlm-enfr-1024/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/configuration_xlm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm/configuration_xlm.py | https://huggingface.co/xlm-mlm-enro-1024/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/configuration_xlm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm/configuration_xlm.py | https://huggingface.co/xlm-mlm-tlm-xnli15-1024/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/configuration_xlm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm/configuration_xlm.py | https://huggingface.co/xlm-mlm-xnli15-1024/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/configuration_xlm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm/configuration_xlm.py | https://huggingface.co/xlm-clm-enfr-1024/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/configuration_xlm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm/configuration_xlm.py | https://huggingface.co/xlm-clm-ende-1024/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/configuration_xlm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm/configuration_xlm.py | https://huggingface.co/xlm-mlm-17-1280/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/configuration_xlm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm/configuration_xlm.py | https://huggingface.co/xlm-mlm-100-1280/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/configuration_xlm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm/configuration_xlm.py | https://huggingface.co/xlm-mlm-en-2048 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/configuration_xlm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm/configuration_xlm.py | http://huggingface.co/transformers/multilingual.html#xlm-language-embeddings | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xglm/tokenization_xglm_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xglm/tokenization_xglm_fast.py | https://huggingface.co/facebook/xglm-564M/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xglm/tokenization_xglm_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xglm/tokenization_xglm_fast.py | https://huggingface.co/facebook/xglm-564M/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/layoutxlm/tokenization_layoutxlm_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xglm/tokenization_xglm_fast.py | https://huggingface.co/docs/tokenizers/python/latest/components.html?highlight=BPE#models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xglm/tokenization_xglm_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xglm/tokenization_xglm.py | https://huggingface.co/facebook/xglm-564M/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xglm/tokenization_xglm.py | https://github.com/google/sentencepiece | 源码实现 | -| 
开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xglm/tokenization_xglm.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xglm/configuration_xglm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xglm/modeling_xglm.py | https://huggingface.co/models?filter=xglm | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xglm/modeling_xglm.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xglm/modeling_xglm.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xglm/modeling_flax_xglm.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xglm/modeling_flax_xglm.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xglm/modeling_flax_xglm.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xglm/modeling_flax_xglm.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xglm/modeling_flax_xglm.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xglm/modeling_flax_xglm.py | https://github.com/google/flax/blob/491ce18759622506588784b4fca0e4bf05f8c8cd/flax/linen/attention.py#L252 | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xglm/modeling_flax_xglm.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xglm/configuration_xglm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xglm/configuration_xglm.py | https://huggingface.co/facebook/xglm-564M/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xglm/configuration_xglm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xglm/configuration_xglm.py | https://huggingface.co/models?filter=xglm | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xglm/tokenization_xglm_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xglm/configuration_xglm.py | https://huggingface.co/facebook/xglm-564M | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xglm/configuration_xglm.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/wavlm/modeling_wavlm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wavlm/modeling_wavlm.py | https://huggingface.co/models?filter=wavlm | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wavlm/modeling_wavlm.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/wavlm/modeling_wavlm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wavlm/modeling_wavlm.py | https://github.com/pytorch/pytorch/issues/32590 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wavlm/modeling_wavlm.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wavlm/modeling_wavlm.py | https://arxiv.org/pdf/1611.01144.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wavlm/modeling_wavlm.py | https://pytorch.org/docs/stable/generated/torch.nn.Conv1d.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wavlm/modeling_wavlm.py | https://arxiv.org/abs/2101.07597 | 参考论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wavlm/modeling_wavlm.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/trocr.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wavlm/convert_wavlm_original_pytorch_checkpoint_to_pytorch.py | https://github.com/microsoft/unilm | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/wavlm/convert_wavlm_original_pytorch_checkpoint_to_pytorch.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wavlm/convert_wavlm_original_pytorch_checkpoint_to_pytorch.py | https://github.com/microsoft/unilm/commit/b94ec76c36f02fb2b0bf0dcb0b8554a2185173cd | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/wavlm/modeling_wavlm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wavlm/configuration_wavlm.py | https://huggingface.co/models?filter=wavlm | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wavlm/configuration_wavlm.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/wav2vec2_with_lm/test_processor_wav2vec2_with_lm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wav2vec2_with_lm/processing_wav2vec2_with_lm.py | https://huggingface.co/datasets/common_voice/viewer/en/train | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/wav2vec2_phoneme/tokenization_wav2vec2_phoneme.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wav2vec2_phoneme/tokenization_wav2vec2_phoneme.py | https://huggingface.co/facebook/wav2vec2-lv-60-espeak-cv-ft/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/wav2vec2_phoneme/tokenization_wav2vec2_phoneme.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wav2vec2_phoneme/tokenization_wav2vec2_phoneme.py | https://huggingface.co/facebook/wav2vec2-lv-60-espeak-cv-ft/resolve/main/tokenizer_config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/wav2vec2_phoneme/tokenization_wav2vec2_phoneme.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wav2vec2_phoneme/tokenization_wav2vec2_phoneme.py | https://github.com/bootphon/phonemizer#readme | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/wav2vec2/tokenization_wav2vec2.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wav2vec2/tokenization_wav2vec2.py | https://huggingface.co/facebook/wav2vec2-base-960h/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/wav2vec2/tokenization_wav2vec2.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wav2vec2/tokenization_wav2vec2.py | https://huggingface.co/facebook/wav2vec2-base-960h/resolve/main/tokenizer_config.json | 模型参数相关配置 | -| 
开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/wav2vec2_with_lm/test_processor_wav2vec2_with_lm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wav2vec2/tokenization_wav2vec2.py | https://huggingface.co/datasets/common_voice/viewer/en/train | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/wav2vec2/feature_extraction_wav2vec2.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wav2vec2/tokenization_wav2vec2.py | https://huggingface.co/models?search=lv60 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/jax-projects/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wav2vec2/tokenization_wav2vec2.py | https://huggingface.co/facebook/wav2vec2-base-960h | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/wav2vec2/feature_extraction_wav2vec2.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wav2vec2/tokenization_wav2vec2.py | https://huggingface.co/facebook/wav2vec2-large-960h-lv60-self | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/wav2vec2/tokenization_wav2vec2.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wav2vec2/tokenization_wav2vec2.py | https://huggingface.co/facebook/wav2vec2-base-960h/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/wav2vec2_conformer/modeling_wav2vec2_conformer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_wav2vec2.py | https://huggingface.co/models?filter=wav2vec2 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/speech-recognition/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_wav2vec2.py | https://arxiv.org/pdf/2006.11477.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_wav2vec2.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_wav2vec2.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_wav2vec2.py | https://arxiv.org/pdf/1611.01144.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_wav2vec2.py | https://pytorch.org/docs/stable/generated/torch.nn.Conv1d.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/jax-projects/wav2vec2/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_wav2vec2.py | 
https://arxiv.org/abs/2006.11477 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_wav2vec2.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/jax-projects/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_wav2vec2.py | https://huggingface.co/facebook/wav2vec2-base-960h | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/wav2vec2_conformer/modeling_wav2vec2_conformer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_tf_wav2vec2.py | https://huggingface.co/models?filter=wav2vec2 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/hubert/modeling_tf_hubert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_tf_wav2vec2.py | https://github.com/tensorflow/tensorflow/issues/9260 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/hubert/modeling_tf_hubert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_tf_wav2vec2.py | https://github.com/pytorch/fairseq/blob/e0788f7007a8473a76db573985031f3c94201e79/fairseq/data/data_utils.py#L376 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/hubert/modeling_tf_hubert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_tf_wav2vec2.py | https://www.tensorflow.org/addons/api_docs/python/tfa/layers/GroupNormalization | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/hubert/modeling_tf_hubert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_tf_wav2vec2.py | https://www.tensorflow.org/probability/api_docs/python/tfp/layers/weight_norm/WeightNorm | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_tf_wav2vec2.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_tf_wav2vec2.py | https://pytorch.org/docs/stable/generated/torch.nn.Conv1d.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_tf_wav2vec2.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_tf_wav2vec2.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/speech-recognition/README.md | 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_flax_wav2vec2.py | https://arxiv.org/pdf/2006.11477.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_flax_wav2vec2.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/jax-projects/wav2vec2/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_flax_wav2vec2.py | https://arxiv.org/abs/2006.11477 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_flax_wav2vec2.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_flax_wav2vec2.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_flax_wav2vec2.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_flax_wav2vec2.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_flax_wav2vec2.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/jax-projects/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_flax_wav2vec2.py | https://huggingface.co/facebook/wav2vec2-base-960h | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_flax_wav2vec2.py | https://arxiv.org/pdf/1611.01144.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_flax_wav2vec2.py | https://pytorch.org/docs/stable/generated/torch.nn.Conv1d.html | 模型相关说明 
| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/wav2vec2/feature_extraction_wav2vec2.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wav2vec2/feature_extraction_wav2vec2.py | https://huggingface.co/models?search=lv60 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/jax-projects/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wav2vec2/feature_extraction_wav2vec2.py | https://huggingface.co/facebook/wav2vec2-base-960h | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/wav2vec2/feature_extraction_wav2vec2.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wav2vec2/feature_extraction_wav2vec2.py | https://huggingface.co/facebook/wav2vec2-large-960h-lv60-self | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/wav2vec2/configuration_wav2vec2.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wav2vec2/configuration_wav2vec2.py | https://huggingface.co/facebook/wav2vec2-base-960h/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/wav2vec2_conformer/modeling_wav2vec2_conformer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wav2vec2/configuration_wav2vec2.py | https://huggingface.co/models?filter=wav2vec2 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/jax-projects/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wav2vec2/configuration_wav2vec2.py | https://huggingface.co/facebook/wav2vec2-base-960h | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wav2vec2/configuration_wav2vec2.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vit_mae/modeling_vit_mae.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vit_mae/modeling_vit_mae.py | https://huggingface.co/models?filter=vit_mae | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vit/modeling_tf_vit.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vit_mae/modeling_vit_mae.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/vision_transformer.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vit_mae/modeling_vit_mae.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vit_mae/modeling_vit_mae.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vit_mae/modeling_vit_mae.py | 
http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/image-pretraining/run_mae.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vit_mae/convert_vit_mae_to_pytorch.py | https://github.com/facebookresearch/mae | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vit_mae/convert_vit_mae_to_pytorch.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vit_mae/convert_vit_mae_to_pytorch.py | https://user-images.githubusercontent.com/11435359/147738734-196fd92f-9260-48d5-ba7e-bf103d29364d.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vit_mae/convert_vit_mae_to_pytorch.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vit_mae/convert_vit_mae_to_pytorch.py | https://dl.fbaipublicfiles.com/mae/visualize/mae_visualize_vit_base.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vit_mae/configuration_vit_mae.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vit_mae/configuration_vit_mae.py | https://huggingface.co/facebook/vit-mae-base/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vit_mae/configuration_vit_mae.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vit_mae/configuration_vit_mae.py | https://huggingface.co/models?filter=vit-mae | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vit_mae/configuration_vit_mae.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vit_mae/configuration_vit_mae.py | https://huggingface.co/facebook/vit-mae-base | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/vision/run_image_classification.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vit/modeling_vit.py | https://huggingface.co/models?filter=vit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/vision_text_dual_encoder/test_modeling_vision_text_dual_encoder.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vit/modeling_vit.py | https://github.com/rwightman/pytorch-image-models/blob/b9bd960a032c75ca6b808ddeed76bee5f3ed4972/timm/models/layers/helpers.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vit/modeling_tf_vit.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vit/modeling_vit.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/vision_transformer.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vit_hybrid/modeling_vit_hybrid.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vit/modeling_vit.py | https://github.com/facebookresearch/dino/blob/de9ee3df6cf39fac952ab558447af1fa1365362a/vision_transformer.py#L174 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vit_hybrid/modeling_vit_hybrid.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vit/modeling_vit.py | https://github.com/facebookresearch/dino/issues/8 | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vit/modeling_vit.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vit/modeling_vit.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/image-pretraining/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vit/modeling_vit.py | https://arxiv.org/abs/2111.09886 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vit/modeling_vit.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/vision_text_dual_encoder/test_modeling_vision_text_dual_encoder.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vit/modeling_tf_vit.py | https://github.com/rwightman/pytorch-image-models/blob/b9bd960a032c75ca6b808ddeed76bee5f3ed4972/timm/models/layers/helpers.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vit/modeling_tf_vit.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vit/modeling_tf_vit.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/vision_transformer.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vit_hybrid/modeling_vit_hybrid.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vit/modeling_tf_vit.py | https://github.com/facebookresearch/dino/blob/de9ee3df6cf39fac952ab558447af1fa1365362a/vision_transformer.py#L174 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vit/modeling_tf_vit.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vit/modeling_tf_vit.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vit/modeling_flax_vit.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vit/modeling_flax_vit.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vit/modeling_flax_vit.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vit/modeling_flax_vit.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vit/modeling_flax_vit.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vit/modeling_flax_vit.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vit/convert_vit_timm_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vit/convert_dino_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vit/configuration_vit.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vit/configuration_vit.py | https://huggingface.co/vit-base-patch16-224/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/vision/run_image_classification.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vit/configuration_vit.py | https://huggingface.co/models?filter=vit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/tensorflow/image-classification/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vit/configuration_vit.py | https://huggingface.co/google/vit-base-patch16-224 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/visual_bert/configuration_visual_bert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/visual_bert/modeling_visual_bert.py | https://huggingface.co/models?filter=visual_bert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/visual_bert/modeling_visual_bert.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/visual_bert/modeling_visual_bert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/visual_bert/modeling_visual_bert.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/visual_bert/configuration_visual_bert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/visual_bert/configuration_visual_bert.py | https://huggingface.co/uclanlp/visualbert-vqa/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/visual_bert/configuration_visual_bert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/visual_bert/configuration_visual_bert.py | https://huggingface.co/uclanlp/visualbert-vqa-pre/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/visual_bert/configuration_visual_bert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/visual_bert/configuration_visual_bert.py | https://huggingface.co/uclanlp/visualbert-vqa-coco-pre/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/visual_bert/configuration_visual_bert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/visual_bert/configuration_visual_bert.py | https://huggingface.co/uclanlp/visualbert-vcr/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/visual_bert/configuration_visual_bert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/visual_bert/configuration_visual_bert.py | https://huggingface.co/uclanlp/visualbert-vcr-pre/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/visual_bert/configuration_visual_bert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/visual_bert/configuration_visual_bert.py | https://huggingface.co/uclanlp/visualbert-vcr-coco-pre/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/visual_bert/configuration_visual_bert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/visual_bert/configuration_visual_bert.py | https://huggingface.co/uclanlp/visualbert-nlvr2/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/visual_bert/configuration_visual_bert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/visual_bert/configuration_visual_bert.py | https://huggingface.co/uclanlp/visualbert-nlvr2-pre/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/visual_bert/configuration_visual_bert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/visual_bert/configuration_visual_bert.py | 
https://huggingface.co/uclanlp/visualbert-nlvr2-coco-pre/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/visual_bert/configuration_visual_bert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/visual_bert/configuration_visual_bert.py | https://huggingface.co/models?filter=visual_bert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/visual_bert/configuration_visual_bert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/visual_bert/configuration_visual_bert.py | https://huggingface.co/uclanlp/visualbert-vqa-coco-pre | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/vision-text-dual-encoder.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_vision_text_dual_encoder.py | https://arxiv.org/abs/2111.07991 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_vision_text_dual_encoder.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_vision_text_dual_encoder.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vision_text_dual_encoder/modeling_flax_vision_text_dual_encoder.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_vision_text_dual_encoder.py | https://farm3.staticflickr.com/2674/5850229113_4fe05d5265_z.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/vision-text-dual-encoder.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_flax_vision_text_dual_encoder.py | https://arxiv.org/abs/2111.07991 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_flax_vision_text_dual_encoder.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_flax_vision_text_dual_encoder.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_flax_vision_text_dual_encoder.py | 
https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_flax_vision_text_dual_encoder.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_flax_vision_text_dual_encoder.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_flax_vision_text_dual_encoder.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vision_text_dual_encoder/modeling_flax_vision_text_dual_encoder.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_flax_vision_text_dual_encoder.py | https://farm3.staticflickr.com/2674/5850229113_4fe05d5265_z.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_vision_encoder_decoder.py | https://arxiv.org/abs/1907.12461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_vision_encoder_decoder.py | https://arxiv.org/abs/2109.10282 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_vision_encoder_decoder.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/trocr.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_vision_encoder_decoder.py | https://fki.tic.heia-fr.ch/static/img/a01-122-02.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_tf_vision_encoder_decoder.py | https://arxiv.org/abs/1907.12461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_tf_vision_encoder_decoder.py | https://arxiv.org/abs/2109.10282 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_tf_vision_encoder_decoder.py | 
https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_tf_vision_encoder_decoder.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_flax_vision_encoder_decoder.py | https://arxiv.org/abs/1907.12461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_flax_vision_encoder_decoder.py | https://arxiv.org/abs/2109.10282 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_flax_vision_encoder_decoder.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_flax_vision_encoder_decoder.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/trocr/convert_trocr_unilm_to_pytorch.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/convert_trocr_unilm_to_pytorch.py | https://fki.tic.heia-fr.ch/static/img/a01-122-02-00.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/trocr/convert_trocr_unilm_to_pytorch.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/convert_trocr_unilm_to_pytorch.py | https://fki.tic.heia-fr.ch/static/img/a01-122-02-12.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/trocr/convert_trocr_unilm_to_pytorch.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/convert_trocr_unilm_to_pytorch.py | https://fki.tic.heia-fr.ch/static/img/a01-122-02-10.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/trocr.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/convert_trocr_unilm_to_pytorch.py | https://fki.tic.heia-fr.ch/static/img/a01-122-02.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/trocr/convert_trocr_unilm_to_pytorch.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/convert_trocr_unilm_to_pytorch.py | https://fki.tic.heia-fr.ch/static/img/a01-122.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/trocr/convert_trocr_unilm_to_pytorch.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/convert_trocr_unilm_to_pytorch.py | 
https://www.researchgate.net/profile/Dinh-Sang/publication/338099565/figure/fig8/AS:840413229350922@1577381536857/An-receipt-example-in-the-SROIE-2019-dataset_Q640.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/trocr/convert_trocr_unilm_to_pytorch.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/convert_trocr_unilm_to_pytorch.py | https://layoutlm.blob.core.windows.net/trocr/model_zoo/fairseq/trocr-base-handwritten.pt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vilt/modeling_vilt.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vilt/modeling_vilt.py | https://huggingface.co/models?filter=vilt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vit/modeling_tf_vit.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vilt/modeling_vilt.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/vision_transformer.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vilt/modeling_vilt.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vilt/modeling_vilt.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vilt/modeling_vilt.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vilt/modeling_vilt.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vilt/modeling_vilt.py | https://github.com/jnhwkim/ban-vqa/blob/master/train.py#L19 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vilt/modeling_vilt.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vilt/modeling_vilt.py | https://lil.nlp.cornell.edu/nlvr/exs/ex0_0.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vilt/modeling_vilt.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vilt/modeling_vilt.py | https://lil.nlp.cornell.edu/nlvr/exs/ex0_1.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vilt/modeling_vilt.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vilt/convert_vilt_original_to_pytorch.py | https://lil.nlp.cornell.edu/nlvr/exs/ex0_0.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vilt/convert_vilt_original_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vilt/convert_vilt_original_to_pytorch.py | 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vilt/convert_vilt_original_to_pytorch.py | https://github.com/dandelin/ViLT/releases/download/200k/vilt_200k_mlm_itm.ckpt | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vilt/configuration_vilt.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vilt/configuration_vilt.py | https://huggingface.co/dandelin/vilt-b32-mlm/blob/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vilt/configuration_vilt.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vilt/configuration_vilt.py | https://huggingface.co/dandelin/vilt-b32-mlm | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/deprecated/van/modeling_van.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/van/modeling_van.py | https://huggingface.co/models?filter=van | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/donut/modeling_donut_swin.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/van/modeling_van.py | https://github.com/tensorflow/tpu/issues/494#issuecomment-532968956 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/deprecated/van/modeling_van.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/van/modeling_van.py | https://arxiv.org/abs/2106.13797 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/van/modeling_van.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/van.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/van/convert_van_to_pytorch.py | https://github.com/Visual-Attention-Network/VAN-Classification | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/deprecated/van/convert_van_to_pytorch.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/van/convert_van_to_pytorch.py | https://huggingface.co/Visual-Attention-Network/VAN-Tiny-original/resolve/main/van_tiny_754.pth.tar | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/deprecated/van/convert_van_to_pytorch.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/van/convert_van_to_pytorch.py | https://huggingface.co/Visual-Attention-Network/VAN-Small-original/resolve/main/van_small_811.pth.tar | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/deprecated/van/convert_van_to_pytorch.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/van/convert_van_to_pytorch.py | https://huggingface.co/Visual-Attention-Network/VAN-Base-original/resolve/main/van_base_828.pth.tar | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/deprecated/van/convert_van_to_pytorch.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/van/convert_van_to_pytorch.py | 
https://huggingface.co/Visual-Attention-Network/VAN-Large-original/resolve/main/van_large_839.pth.tar | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/deprecated/van/configuration_van.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/van/configuration_van.py | https://huggingface.co/Visual-Attention-Network/van-base/blob/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | https://huggingface.co/models?filter=unispeech_sat | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/speech-recognition/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | https://arxiv.org/pdf/2006.11477.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | https://arxiv.org/pdf/1611.01144.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | https://pytorch.org/docs/stable/generated/torch.nn.Conv1d.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/jax-projects/wav2vec2/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | https://arxiv.org/abs/2006.11477 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | https://huggingface.co/models?filter=unispeech_sat | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/unispeech/modeling_unispeech.py | https://huggingface.co/models?filter=unispeech | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/speech-recognition/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/unispeech/modeling_unispeech.py | https://arxiv.org/pdf/2006.11477.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/unispeech/modeling_unispeech.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/unispeech/modeling_unispeech.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/unispeech/modeling_unispeech.py | https://arxiv.org/pdf/1611.01144.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/unispeech/modeling_unispeech.py | https://pytorch.org/docs/stable/generated/torch.nn.Conv1d.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/unispeech/modeling_unispeech.py | https://arxiv.org/abs/2101.07597 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/unispeech/modeling_unispeech.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/unispeech/configuration_unispeech.py | https://huggingface.co/models?filter=unispeech | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/unispeech/configuration_unispeech.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/trocr.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/trocr/modeling_trocr.py | https://huggingface.co/models?filter=trocr | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/trocr/modeling_trocr.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/trocr/modeling_trocr.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/trocr.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/trocr/configuration_trocr.py | https://huggingface.co/models?filter=trocr | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/trocr/configuration_trocr.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/trocr/configuration_trocr.py | https://huggingface.co/microsoft/trocr-base | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/trocr/configuration_trocr.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_transfo_xl.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/transfo_xl/tokenization_transfo_xl.py | https://github.com/kimiyoung/transformer-xl | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/transfo_xl/tokenization_transfo_xl.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/transfo_xl/tokenization_transfo_xl.py | https://huggingface.co/transfo-xl-wt103/resolve/main/vocab.pkl | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/transfo_xl/tokenization_transfo_xl.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/transfo_xl/tokenization_transfo_xl.py | https://huggingface.co/transfo-xl-wt103/resolve/main/corpus.bin | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_transfo_xl.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/transfo_xl/modeling_transfo_xl_utilities.py | https://github.com/kimiyoung/transformer-xl | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/transfo_xl/modeling_transfo_xl_utilities.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/transfo_xl/modeling_transfo_xl_utilities.py | https://github.com/pytorch/pytorch/blob/dbe6a7a9ff1a364a8706bf5df58a1ca96d2fd9da/torch/nn/modules/adaptive.py#L138 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/transfo_xl/modeling_transfo_xl_utilities.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/transfo_xl/modeling_transfo_xl_utilities.py | https://github.com/pytorch/pytorch/blob/master/torch/nn/modules/adaptive.p | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_transfo_xl.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/transfo_xl/modeling_transfo_xl.py | https://github.com/kimiyoung/transformer-xl | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/transfo_xl/modeling_transfo_xl.py | 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/transfo_xl/modeling_transfo_xl.py | https://github.com/kimiyoung/transformer-xl/blob/master/pytorch/mem_transformer.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/transfo-xl.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/transfo_xl/modeling_transfo_xl.py | https://huggingface.co/models?filter=transfo-xl | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/transfo_xl/modeling_transfo_xl.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/transfo_xl/modeling_transfo_xl.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/transfo_xl/modeling_tf_transfo_xl.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/transfo_xl/modeling_transfo_xl.py | https://github.com/huggingface/transformers/issues/3310 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/transfo-xl.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/transfo_xl/modeling_tf_transfo_xl.py | https://huggingface.co/models?filter=transfo-xl | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/transfo_xl/modeling_tf_transfo_xl.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/transfo_xl/modeling_tf_transfo_xl.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/transfo_xl/modeling_tf_transfo_xl.py | https://github.com/huggingface/transformers/issues/3310 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/transfo_xl/convert_transfo_xl_original_tf_checkpoint_to_pytorch.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/transfo_xl/convert_transfo_xl_original_tf_checkpoint_to_pytorch.py | https://stackoverflow.com/questions/2121874/python-pickling-after-changing-a-modules-directory/2121918#2121918 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/transfo_xl/configuration_transfo_xl.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/transfo_xl/configuration_transfo_xl.py | https://huggingface.co/transfo-xl-wt103/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/transfo_xl/tokenization_transfo_xl.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/transfo_xl/configuration_transfo_xl.py | https://huggingface.co/transfo-xl-wt103 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/tokenization_tapas.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-large-finetuned-sqa/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/tokenization_tapas.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-large-finetuned-wtq/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/tokenization_tapas.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-large-finetuned-wikisql-supervised/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/tokenization_tapas.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-large-finetuned-tabfact/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/tokenization_tapas.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-base-finetuned-sqa/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/tokenization_tapas.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-base-finetuned-wtq/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/tokenization_tapas.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-base-finetuned-wikisql-supervised/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/tokenization_tapas.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-base-finetuned-tabfact/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/tokenization_tapas.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-medium-finetuned-sqa/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/tokenization_tapas.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-medium-finetuned-wtq/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/tokenization_tapas.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-medium-finetuned-wikisql-supervised/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/tokenization_tapas.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-medium-finetuned-tabfact/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/tokenization_tapas.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-small-finetuned-sqa/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/tokenization_tapas.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-small-finetuned-wtq/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/tokenization_tapas.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-small-finetuned-wikisql-supervised/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/tokenization_tapas.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-small-finetuned-tabfact/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/tokenization_tapas.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-tiny-finetuned-sqa/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/tokenization_tapas.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-tiny-finetuned-wtq/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/tokenization_tapas.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-tiny-finetuned-wikisql-supervised/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/tokenization_tapas.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-tiny-finetuned-tabfact/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/tokenization_tapas.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-mini-finetuned-sqa/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/tokenization_tapas.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-mini-finetuned-wtq/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/tokenization_tapas.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-mini-finetuned-wikisql-supervised/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/tokenization_tapas.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-mini-finetuned-tabfact/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mobilebert/tokenization_mobilebert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://github.com/huggingface/transformers/issues/328 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/tokenization_tapas.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://github.com/google-research/tapas/blob/4908213eb4df7aa988573350278b44c4dbe3f71b/tapas/experiments/prediction_utils.py#L288 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_chinese_ref.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://en.wikipedia.org/wiki/CJK_Unified_Ideographs_ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/tokenization_tapas.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://github.com/google-research/tapas/blob/master/tapas/utils/constants.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/tokenization_tapas.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://github.com/google-research/tapas/blob/master/tapas/utils/number_utils.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/tokenization_tapas.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://github.com/google-research/tapas/blob/master/tapas/utils/text_utils.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/tokenization_tapas.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://github.com/Microsoft/DynSP/blob/master/util.py#L414 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/tokenization_tapas.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://github.com/Microsoft/DynSP/blob/master/util.py#L293 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/tokenization_tapas.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://github.com/google-research/tapas/blob/master/tapas/utils/number_annotation_utils.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/tapas.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/tapas/modeling_tf_tapas.py | https://github.com/tensorflow/probability | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/modeling_tapas.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/tapas/modeling_tf_tapas.py | 
https://huggingface.co/models?filter=tapas | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/tapas/modeling_tf_tapas.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/modeling_tapas.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/tapas/modeling_tapas.py | https://huggingface.co/models?filter=tapas | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/tapas/modeling_tapas.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/tapas/modeling_tapas.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/tapas/modeling_tapas.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/tapas/modeling_tapas.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/configuration_tapas.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/tapas/configuration_tapas.py | https://github.com/google-research/tapas/blob/master/tapas/run_task_main.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/configuration_tapas.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/tapas/configuration_tapas.py | https://github.com/google-research/tapas/blob/master/tapas/utils/hparam_utils.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/configuration_tapas.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/tapas/configuration_tapas.py | https://huggingface.co/google/tapas-base-finetuned-sqa/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/configuration_tapas.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/tapas/configuration_tapas.py | https://huggingface.co/google/tapas-base-finetuned-wtq/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/configuration_tapas.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/tapas/configuration_tapas.py | https://huggingface.co/google/tapas-base-finetuned-wikisql-supervised/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/configuration_tapas.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/tapas/configuration_tapas.py | 
https://huggingface.co/google/tapas-base-finetuned-tabfact/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/configuration_tapas.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/tapas/configuration_tapas.py | https://github.com/google-research/tapas/tree/master | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/t5/tokenization_t5.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5_fast.py | https://huggingface.co/t5-small/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/t5/tokenization_t5.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5_fast.py | https://huggingface.co/t5-base/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/t5/tokenization_t5.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5_fast.py | https://huggingface.co/t5-large/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/t5/tokenization_t5.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5_fast.py | https://huggingface.co/t5-3b/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/t5/tokenization_t5.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5_fast.py | https://huggingface.co/t5-11b/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/t5/tokenization_t5_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5_fast.py | https://huggingface.co/t5-small/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/t5/tokenization_t5_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5_fast.py | https://huggingface.co/t5-base/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/t5/tokenization_t5_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5_fast.py | https://huggingface.co/t5-large/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/t5/tokenization_t5_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5_fast.py | https://huggingface.co/t5-3b/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/t5/tokenization_t5_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5_fast.py | https://huggingface.co/t5-11b/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/pegasus/tokenization_pegasus_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5_fast.py | 
https://huggingface.co/docs/tokenizers/python/latest/components.html?highlight=unigram#models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5_fast.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/byt5/tokenization_byt5.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5_fast.py | https://github.com/google-research/text-to-text-transfer-transformer/blob/9fd7b14a769417be33bc6c850f9598764913c833/t5/data/preprocessors.py#L2117 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/t5/tokenization_t5.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5.py | https://huggingface.co/t5-small/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/t5/tokenization_t5.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5.py | https://huggingface.co/t5-base/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/t5/tokenization_t5.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5.py | https://huggingface.co/t5-large/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/t5/tokenization_t5.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5.py | https://huggingface.co/t5-3b/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/t5/tokenization_t5.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5.py | https://huggingface.co/t5-11b/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/byt5/tokenization_byt5.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5.py | https://github.com/google-research/text-to-text-transfer-transformer/blob/9fd7b14a769417be33bc6c850f9598764913c833/t5/data/preprocessors.py#L2117 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_t5_mlm_flax.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/modeling_tf_t5.py | https://huggingface.co/models?filter=t5 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/switch_transformers/modeling_switch_transformers.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/modeling_tf_t5.py | 
https://github.com/tensorflow/mesh/blob/0cb87fe07da627bf0b7e60475d59f95ed6b5be3d/mesh_tensorflow/transformer/transformer_layers.py#L593 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/modeling_tf_t5.py | https://github.com/tensorflow/mesh/blob/8d2465e9bc93129b913b5ccc6a59aa97abd96ec6/mesh_tensorflow/transformer/transformer_layers.py#L270 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/modeling_tf_t5.py | https://arxiv.org/abs/1910.10683 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/modeling_tf_t5.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_t5_mlm_flax.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | https://huggingface.co/models?filter=t5 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/open_model_proposals/ADD_BIG_BIRD.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | https://medium.com/huggingface/from-tensorflow-to-pytorch-265f40ef2a28 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/switch_transformers/modeling_switch_transformers.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | https://arxiv.org/abs/1910.07467 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/switch_transformers/modeling_switch_transformers.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | https://github.com/tensorflow/mesh/blob/0cb87fe07da627bf0b7e60475d59f95ed6b5be3d/mesh_tensorflow/transformer/transformer_layers.py#L593 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/switch_transformers/modeling_switch_transformers.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | https://github.com/tensorflow/mesh/blob/fa19d69eafc9a482aff0b59ddd96b025c0cb207d/mesh_tensorflow/layers.py#L1624 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/switch_transformers/modeling_switch_transformers.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | https://github.com/tensorflow/mesh/blob/master/mesh_tensorflow/transformer/transformer_layers.py#L56 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/switch_transformers/modeling_switch_transformers.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | 
https://github.com/tensorflow/mesh/blob/fa19d69eafc9a482aff0b59ddd96b025c0cb207d/mesh_tensorflow/layers.py#L89 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/switch_transformers/modeling_switch_transformers.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | https://github.com/tensorflow/mesh/blob/fa19d69eafc9a482aff0b59ddd96b025c0cb207d/mesh_tensorflow/transformer/attention.py#L136 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | https://arxiv.org/abs/1910.10683 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/switch_transformers/modeling_switch_transformers.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | https://github.com/tensorflow/mesh/blob/fa19d69eafc9a482aff0b59ddd96b025c0cb207d/mesh_tensorflow/transformer/transformer.py#L586 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mt5/modeling_mt5.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | https://github.com/tensorflow/mesh/blob/fa19d69eafc9a482aff0b59ddd96b025c0cb207d/mesh_tensorflow/layers.py#L666 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/switch_transformers/modeling_switch_transformers.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/modeling_flax_t5.py | https://github.com/tensorflow/mesh/blob/0cb87fe07da627bf0b7e60475d59f95ed6b5be3d/mesh_tensorflow/transformer/transformer_layers.py#L593 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/modeling_flax_t5.py | https://github.com/google/flax/blob/491ce18759622506588784b4fca0e4bf05f8c8cd/flax/linen/attention.py#L252 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/modeling_flax_t5.py | https://arxiv.org/abs/1910.13461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/modeling_flax_t5.py | https://arxiv.org/abs/1910.10683 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/modeling_flax_t5.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/modeling_flax_t5.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/modeling_flax_t5.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/modeling_flax_t5.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/modeling_flax_t5.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/switch_transformers/modeling_switch_transformers.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/modeling_flax_t5.py | https://github.com/tensorflow/mesh/blob/fa19d69eafc9a482aff0b59ddd96b025c0cb207d/mesh_tensorflow/transformer/transformer.py#L586 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/t5/configuration_t5.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/configuration_t5.py | https://huggingface.co/t5-small/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/t5/configuration_t5.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/configuration_t5.py | https://huggingface.co/t5-base/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/t5/configuration_t5.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/configuration_t5.py | https://huggingface.co/t5-large/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/t5/configuration_t5.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/configuration_t5.py | https://huggingface.co/t5-3b/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/t5/configuration_t5.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/configuration_t5.py | https://huggingface.co/t5-11b/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/jax-projects/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/configuration_t5.py | https://huggingface.co/t5-small | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/src/transformers/models/swin/modeling_swin.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/swin/modeling_swin.py | https://huggingface.co/models?filter=swin | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/swin/modeling_swin.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/swin/modeling_swin.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/image-pretraining/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/swin/modeling_swin.py | https://arxiv.org/abs/2111.09886 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/swin/modeling_swin.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/swin/convert_swin_timm_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/swin/configuration_swin.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/swin/configuration_swin.py | https://huggingface.co/microsoft/swin-tiny-patch4-window7-224/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/swin/modeling_swin.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/swin/configuration_swin.py | https://huggingface.co/models?filter=swin | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/swin/configuration_swin.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/swin/configuration_swin.py | https://huggingface.co/microsoft/swin-tiny-patch4-window7-224 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/squeezebert/tokenization_squeezebert_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/squeezebert/tokenization_squeezebert_fast.py | https://huggingface.co/squeezebert/squeezebert-uncased/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/squeezebert/tokenization_squeezebert_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/squeezebert/tokenization_squeezebert_fast.py | https://huggingface.co/squeezebert/squeezebert-mnli/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/squeezebert/tokenization_squeezebert_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/squeezebert/tokenization_squeezebert_fast.py | 
https://huggingface.co/squeezebert/squeezebert-mnli-headless/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/squeezebert/tokenization_squeezebert_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/squeezebert/tokenization_squeezebert_fast.py | https://huggingface.co/squeezebert/squeezebert-uncased/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/squeezebert/tokenization_squeezebert_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/squeezebert/tokenization_squeezebert_fast.py | https://huggingface.co/squeezebert/squeezebert-mnli/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/squeezebert/tokenization_squeezebert_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/squeezebert/tokenization_squeezebert_fast.py | https://huggingface.co/squeezebert/squeezebert-mnli-headless/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/squeezebert/tokenization_squeezebert_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/squeezebert/tokenization_squeezebert.py | https://huggingface.co/squeezebert/squeezebert-uncased/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/squeezebert/tokenization_squeezebert_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/squeezebert/tokenization_squeezebert.py | https://huggingface.co/squeezebert/squeezebert-mnli/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/squeezebert/tokenization_squeezebert_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/squeezebert/tokenization_squeezebert.py | https://huggingface.co/squeezebert/squeezebert-mnli-headless/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/squeezebert/modeling_squeezebert.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/squeezebert/modeling_squeezebert.py | https://arxiv.org/abs/2006.11316 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/squeezebert/modeling_squeezebert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/squeezebert/configuration_squeezebert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/squeezebert/configuration_squeezebert.py | https://huggingface.co/squeezebert/squeezebert-uncased/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/squeezebert/configuration_squeezebert.py | 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/squeezebert/configuration_squeezebert.py | https://huggingface.co/squeezebert/squeezebert-mnli/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/squeezebert/configuration_squeezebert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/squeezebert/configuration_squeezebert.py | https://huggingface.co/squeezebert/squeezebert-mnli-headless/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/splinter/tokenization_splinter_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/splinter/tokenization_splinter_fast.py | https://huggingface.co/tau/splinter-base/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/splinter/tokenization_splinter_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/splinter/tokenization_splinter_fast.py | https://huggingface.co/tau/splinter-base-qass/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/splinter/tokenization_splinter_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/splinter/tokenization_splinter_fast.py | https://huggingface.co/tau/splinter-large/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/splinter/tokenization_splinter_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/splinter/tokenization_splinter_fast.py | https://huggingface.co/tau/splinter-large-qass/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mobilebert/tokenization_mobilebert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/splinter/tokenization_splinter_fast.py | https://github.com/huggingface/transformers/issues/328 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/splinter/tokenization_splinter_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/splinter/tokenization_splinter.py | https://huggingface.co/tau/splinter-base/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/splinter/tokenization_splinter_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/splinter/tokenization_splinter.py | https://huggingface.co/tau/splinter-base-qass/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/splinter/tokenization_splinter_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/splinter/tokenization_splinter.py | https://huggingface.co/tau/splinter-large/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/splinter/tokenization_splinter_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/splinter/tokenization_splinter.py | https://huggingface.co/tau/splinter-large-qass/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mobilebert/tokenization_mobilebert.py | 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/splinter/tokenization_splinter.py | https://github.com/huggingface/transformers/issues/328 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_chinese_ref.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/splinter/tokenization_splinter.py | https://en.wikipedia.org/wiki/CJK_Unified_Ideographs_ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/splinter/configuration_splinter.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/splinter/modeling_splinter.py | https://huggingface.co/models?filter=splinter | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/splinter/modeling_splinter.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/splinter/modeling_splinter.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/splinter/modeling_splinter.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/splinter/configuration_splinter.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/splinter/configuration_splinter.py | https://huggingface.co/tau/splinter-base/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/splinter/configuration_splinter.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/splinter/configuration_splinter.py | https://huggingface.co/tau/splinter-base-qass/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/splinter/configuration_splinter.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/splinter/configuration_splinter.py | https://huggingface.co/tau/splinter-large/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/splinter/configuration_splinter.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/splinter/configuration_splinter.py | https://huggingface.co/tau/splinter-large-qass/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/splinter/configuration_splinter.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/splinter/configuration_splinter.py | https://huggingface.co/models?filter=splinter | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/splinter/tokenization_splinter_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/splinter/configuration_splinter.py | https://huggingface.co/tau/splinter-base | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/src/transformers/models/speech_to_text_2/tokenization_speech_to_text_2.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/speech_to_text_2/tokenization_speech_to_text_2.py | https://huggingface.co/facebook/s2t-wav2vec2-large-en-de/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/speech_to_text_2/tokenization_speech_to_text_2.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/speech_to_text_2/tokenization_speech_to_text_2.py | https://huggingface.co/facebook/s2t-wav2vec2-large-en-de/resolve/main/tokenizer_config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/speech_to_text_2/tokenization_speech_to_text_2.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/speech_to_text_2/tokenization_speech_to_text_2.py | https://huggingface.co/facebook/s2t-wav2vec2-large-en-de/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/speech_to_text_2.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/speech_to_text_2/modeling_speech_to_text_2.py | https://huggingface.co/models?filter=speech2text2 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/speech_to_text_2/modeling_speech_to_text_2.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/speech_to_text_2/modeling_speech_to_text_2.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/speech_to_text_2/configuration_speech_to_text_2.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/speech_to_text_2/configuration_speech_to_text_2.py | https://huggingface.co/facebook/s2t-wav2vec2-large-en-de/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/speech_to_text_2.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/speech_to_text_2/configuration_speech_to_text_2.py | https://huggingface.co/models?filter=speech2text2 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/speech_to_text/configuration_speech_to_text.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/speech_to_text_2/configuration_speech_to_text_2.py | https://huggingface.co/facebook/s2t-small-librispeech-asr | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/speech_to_text_2/configuration_speech_to_text_2.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/speech_to_text/tokenization_speech_to_text.py 
| Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/speech_to_text/tokenization_speech_to_text.py | https://huggingface.co/facebook/s2t-small-librispeech-asr/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/speech_to_text/tokenization_speech_to_text.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/speech_to_text/tokenization_speech_to_text.py | https://huggingface.co/facebook/s2t-small-librispeech-asr/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/speech_to_text/tokenization_speech_to_text.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/speech_to_text/tokenization_speech_to_text.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/speech_to_text.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/speech_to_text/modeling_tf_speech_to_text.py | https://huggingface.co/models?filter=speech_to_text | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/speech_to_text/modeling_tf_speech_to_text.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/speech_to_text/modeling_tf_speech_to_text.py | https://arxiv.org/abs/1911.08460 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/pegasus/modeling_tf_pegasus.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/speech_to_text/modeling_tf_speech_to_text.py | https://github.com/tensorflow/models/blob/a009f4fb9d2fc4949e32192a944688925ef78659/official/transformer/v2/embedding_layer.py#L24 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/speech_to_text/modeling_tf_speech_to_text.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/speech_to_text/modeling_tf_speech_to_text.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/speech_to_text.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/speech_to_text/modeling_speech_to_text.py | https://huggingface.co/models?filter=speech_to_text | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/speech_to_text/modeling_tf_speech_to_text.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/speech_to_text/modeling_speech_to_text.py | https://arxiv.org/abs/1911.08460 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/speech_to_text/modeling_speech_to_text.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/speech_to_text/modeling_speech_to_text.py | https://arxiv.org/abs/1910.13461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/speech_to_text/modeling_speech_to_text.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/speech_to_text/configuration_speech_to_text.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/speech_to_text/configuration_speech_to_text.py | https://huggingface.co/facebook/s2t-small-librispeech-asr/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/speech_to_text.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/speech_to_text/configuration_speech_to_text.py | https://huggingface.co/models?filter=speech_to_text | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/speech_to_text/configuration_speech_to_text.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/speech_to_text/configuration_speech_to_text.py | https://huggingface.co/facebook/s2t-small-librispeech-asr | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/speech_to_text/configuration_speech_to_text.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/speech_encoder_decoder/modeling_speech_encoder_decoder.py | https://arxiv.org/abs/1907.12461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/speech_encoder_decoder/modeling_speech_encoder_decoder.py | https://arxiv.org/abs/2104.06678 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/speech_encoder_decoder/modeling_speech_encoder_decoder.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/speech_encoder_decoder/modeling_flax_speech_encoder_decoder.py | https://arxiv.org/abs/1907.12461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/speech_encoder_decoder/modeling_flax_speech_encoder_decoder.py | https://arxiv.org/abs/2104.06678 | 参考论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/speech_encoder_decoder/modeling_flax_speech_encoder_decoder.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/speech_encoder_decoder/modeling_flax_speech_encoder_decoder.py | https://pytorch.org/docs/stable/generated/torch.nn.Conv1d.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/sew_d/modeling_sew_d.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/sew_d/modeling_sew_d.py | https://huggingface.co/models?filter=sew-d | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/sew_d/modeling_sew_d.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/sew_d/modeling_sew_d.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/sew_d/modeling_sew_d.py | https://pytorch.org/docs/stable/generated/torch.nn.Conv1d.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/sew_d/modeling_sew_d.py | https://arxiv.org/abs/2109.06870 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/sew_d/modeling_sew_d.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/sew_d/configuration_sew_d.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/sew_d/configuration_sew_d.py | https://huggingface.co/asapp/sew-d-tiny-100k/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/sew_d/modeling_sew_d.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/sew_d/configuration_sew_d.py | https://huggingface.co/models?filter=sew-d | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/sew_d/configuration_sew_d.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/sew_d/configuration_sew_d.py | https://huggingface.co/asapp/sew-d-tiny-100k | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/sew_d/configuration_sew_d.py | https://arxiv.org/abs/1904.08779 | 
参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/sew_d/modeling_sew_d.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/sew/modeling_sew.py | https://huggingface.co/models?filter=sew | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/sew/modeling_sew.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/sew/modeling_sew.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/sew/modeling_sew.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/sew/modeling_sew.py | https://pytorch.org/docs/stable/generated/torch.nn.Conv1d.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/sew/modeling_sew.py | https://arxiv.org/abs/2109.06870 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/sew/modeling_sew.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/sew/configuration_sew.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/sew/configuration_sew.py | https://huggingface.co/asapp/sew-tiny-100k/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/sew_d/modeling_sew_d.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/sew/configuration_sew.py | https://huggingface.co/models?filter=sew | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/utils/create_dummy_models.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/sew/configuration_sew.py | https://huggingface.co/asapp/sew-tiny-100k | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/sew/configuration_sew.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/segformer/modeling_segformer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/segformer/modeling_segformer.py | https://huggingface.co/models?filter=segformer | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/vision_text_dual_encoder/test_modeling_vision_text_dual_encoder.py | 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/segformer/modeling_segformer.py | https://github.com/rwightman/pytorch-image-models/blob/b9bd960a032c75ca6b808ddeed76bee5f3ed4972/timm/models/layers/helpers.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/donut/modeling_donut_swin.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/segformer/modeling_segformer.py | https://github.com/tensorflow/tpu/issues/494#issuecomment-532968956 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/segformer/modeling_segformer.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/segformer/modeling_segformer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/segformer/modeling_segformer.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/segformer/convert_segformer_original_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/segformer/configuration_segformer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/segformer/configuration_segformer.py | https://huggingface.co/nvidia/segformer-b0-finetuned-ade-512-512/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/segformer/modeling_segformer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/segformer/configuration_segformer.py | https://huggingface.co/models?filter=segformer | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/segformer/configuration_segformer.py | https://huggingface.co/nvidia/segformer-b0-finetuned-ade-512-512 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roformer/tokenization_roformer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roformer/tokenization_utils.py | https://pypi.org/project/rjieba/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roformer/tokenization_roformer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roformer/tokenization_roformer_fast.py | https://huggingface.co/junnyu/roformer_chinese_small/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roformer/tokenization_roformer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roformer/tokenization_roformer_fast.py | 
https://huggingface.co/junnyu/roformer_chinese_base/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roformer/tokenization_roformer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roformer/tokenization_roformer_fast.py | https://huggingface.co/junnyu/roformer_chinese_char_small/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roformer/tokenization_roformer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roformer/tokenization_roformer_fast.py | https://huggingface.co/junnyu/roformer_chinese_char_base/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roformer/tokenization_roformer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roformer/tokenization_roformer_fast.py | https://huggingface.co/junnyu/roformer_small_discriminator/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roformer/tokenization_roformer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roformer/tokenization_roformer_fast.py | https://huggingface.co/junnyu/roformer_small_generator/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roformer/tokenization_roformer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roformer/tokenization_roformer.py | https://huggingface.co/junnyu/roformer_chinese_small/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roformer/tokenization_roformer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roformer/tokenization_roformer.py | https://huggingface.co/junnyu/roformer_chinese_base/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roformer/tokenization_roformer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roformer/tokenization_roformer.py | https://huggingface.co/junnyu/roformer_chinese_char_small/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roformer/tokenization_roformer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roformer/tokenization_roformer.py | https://huggingface.co/junnyu/roformer_chinese_char_base/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roformer/tokenization_roformer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roformer/tokenization_roformer.py | https://huggingface.co/junnyu/roformer_small_discriminator/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roformer/tokenization_roformer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roformer/tokenization_roformer.py | https://huggingface.co/junnyu/roformer_small_generator/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roformer/tokenization_roformer.py | 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roformer/tokenization_roformer.py | https://pypi.org/project/rjieba/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mobilebert/tokenization_mobilebert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roformer/tokenization_roformer.py | https://github.com/huggingface/transformers/issues/328 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roformer/modeling_roformer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roformer/modeling_tf_roformer.py | https://huggingface.co/models?filter=roformer | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/pegasus/modeling_tf_pegasus.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roformer/modeling_tf_roformer.py | https://github.com/tensorflow/models/blob/a009f4fb9d2fc4949e32192a944688925ef78659/official/transformer/v2/embedding_layer.py#L24 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roformer/modeling_roformer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roformer/modeling_tf_roformer.py | https://kexue.fm/archives/8265 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roformer/modeling_tf_roformer.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roformer/modeling_roformer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roformer/modeling_roformer.py | https://huggingface.co/models?filter=roformer | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roformer/modeling_roformer.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roformer/modeling_roformer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roformer/modeling_roformer.py | https://kexue.fm/archives/8265 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roformer/modeling_roformer.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roformer/modeling_roformer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roformer/modeling_roformer.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roformer/modeling_roformer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roformer/modeling_flax_roformer.py | 
https://huggingface.co/models?filter=roformer | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roformer/modeling_flax_roformer.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roformer/modeling_flax_roformer.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roformer/modeling_flax_roformer.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roformer/modeling_flax_roformer.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roformer/modeling_flax_roformer.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roformer/configuration_roformer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roformer/configuration_roformer.py | https://huggingface.co/junnyu/roformer_chinese_small/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roformer/configuration_roformer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roformer/configuration_roformer.py | https://huggingface.co/junnyu/roformer_chinese_base/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roformer/configuration_roformer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roformer/configuration_roformer.py | https://huggingface.co/junnyu/roformer_chinese_char_small/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roformer/configuration_roformer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roformer/configuration_roformer.py | https://huggingface.co/junnyu/roformer_chinese_char_base/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roformer/configuration_roformer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roformer/configuration_roformer.py | 
https://huggingface.co/junnyu/roformer_small_discriminator/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roformer/configuration_roformer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roformer/configuration_roformer.py | https://huggingface.co/junnyu/roformer_small_generator/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roformer/modeling_roformer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roformer/configuration_roformer.py | https://huggingface.co/models?filter=roformer | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roformer/configuration_roformer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roformer/configuration_roformer.py | https://huggingface.co/junnyu/roformer_chinese_base | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roberta/tokenization_roberta_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/roberta-base/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roberta/tokenization_roberta_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/roberta-large/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roberta/tokenization_roberta_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/roberta-large-mnli/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roberta/tokenization_roberta_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/distilroberta-base/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roberta/tokenization_roberta_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/roberta-base-openai-detector/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roberta/tokenization_roberta_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/roberta-large-openai-detector/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roberta/tokenization_roberta_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/roberta-base/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roberta/tokenization_roberta_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/roberta-large/resolve/main/merges.txt | 模型相关说明 | -| 
开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roberta/tokenization_roberta_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/roberta-large-mnli/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roberta/tokenization_roberta_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/distilroberta-base/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roberta/tokenization_roberta_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/roberta-base-openai-detector/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roberta/tokenization_roberta_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/roberta-large-openai-detector/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roberta/tokenization_roberta_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/roberta-base/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roberta/tokenization_roberta_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/roberta-large/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roberta/tokenization_roberta_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/roberta-large-mnli/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roberta/tokenization_roberta_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/distilroberta-base/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roberta/tokenization_roberta_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/roberta-base-openai-detector/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roberta/tokenization_roberta_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/roberta-large-openai-detector/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roberta/tokenization_roberta_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://docs.python.org/3/library/stdtypes.html#bytes.decode | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/src/transformers/models/roberta/tokenization_roberta_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta.py | https://huggingface.co/roberta-base/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roberta/tokenization_roberta_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta.py | https://huggingface.co/roberta-large/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roberta/tokenization_roberta_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta.py | https://huggingface.co/roberta-large-mnli/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roberta/tokenization_roberta_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta.py | https://huggingface.co/distilroberta-base/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roberta/tokenization_roberta_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta.py | https://huggingface.co/roberta-base-openai-detector/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roberta/tokenization_roberta_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta.py | https://huggingface.co/roberta-large-openai-detector/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roberta/tokenization_roberta_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta.py | https://huggingface.co/roberta-base/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roberta/tokenization_roberta_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta.py | https://huggingface.co/roberta-large/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roberta/tokenization_roberta_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta.py | https://huggingface.co/roberta-large-mnli/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roberta/tokenization_roberta_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta.py | https://huggingface.co/distilroberta-base/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roberta/tokenization_roberta_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta.py | https://huggingface.co/roberta-base-openai-detector/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/src/transformers/models/roberta/tokenization_roberta_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta.py | https://huggingface.co/roberta-large-openai-detector/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roberta/tokenization_roberta_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta.py | https://docs.python.org/3/library/stdtypes.html#bytes.decode | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/roberta.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roberta/modeling_tf_roberta.py | https://huggingface.co/models?filter=roberta | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roberta/modeling_tf_roberta.py | https://github.com/tensorflow/mesh/blob/8d2465e9bc93129b913b5ccc6a59aa97abd96ec6/mesh_tensorflow/transformer/transformer_layers.py#L270 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roberta/modeling_tf_roberta.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/roberta.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roberta/modeling_roberta.py | https://huggingface.co/models?filter=roberta | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roberta/modeling_roberta.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roberta/modeling_roberta.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roberta/modeling_roberta.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roberta/modeling_flax_roberta.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roberta/modeling_flax_roberta.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roberta/modeling_flax_roberta.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roberta/modeling_flax_roberta.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roberta/modeling_flax_roberta.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roberta/configuration_roberta.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roberta/configuration_roberta.py | https://huggingface.co/roberta-base/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roberta/configuration_roberta.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roberta/configuration_roberta.py | https://huggingface.co/roberta-large/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roberta/configuration_roberta.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roberta/configuration_roberta.py | https://huggingface.co/roberta-large-mnli/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roberta/configuration_roberta.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roberta/configuration_roberta.py | https://huggingface.co/distilroberta-base/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roberta/configuration_roberta.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roberta/configuration_roberta.py | https://huggingface.co/roberta-base-openai-detector/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roberta/configuration_roberta.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roberta/configuration_roberta.py | https://huggingface.co/roberta-large-openai-detector/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/deprecated/retribert/tokenization_retribert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/retribert/tokenization_retribert_fast.py | https://huggingface.co/yjernite/retribert-base-uncased/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/deprecated/retribert/tokenization_retribert_fast.py | 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/retribert/tokenization_retribert_fast.py | https://huggingface.co/yjernite/retribert-base-uncased/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/deprecated/retribert/tokenization_retribert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/retribert/tokenization_retribert.py | https://huggingface.co/yjernite/retribert-base-uncased/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/deprecated/retribert/modeling_retribert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/retribert/modeling_retribert.py | https://huggingface.co/models?filter=retribert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/retribert/modeling_retribert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/distilbert/configuration_distilbert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/retribert/configuration_retribert.py | https://huggingface.co/distilbert-base-uncased/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/resnet/modeling_resnet.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/resnet/modeling_resnet.py | https://huggingface.co/models?filter=resnet | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/resnet/modeling_resnet.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/resnet/configuration_resnet.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/resnet/configuration_resnet.py | https://huggingface.co/microsoft/resnet-50/blob/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/resnet/configuration_resnet.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/resnet/configuration_resnet.py | https://huggingface.co/microsoft/resnet-50 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/rembert/tokenization_rembert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/rembert/tokenization_rembert_fast.py | https://huggingface.co/google/rembert/resolve/main/sentencepiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/rembert/tokenization_rembert_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/rembert/tokenization_rembert_fast.py | https://huggingface.co/google/rembert/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/pegasus/tokenization_pegasus_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/rembert/tokenization_rembert_fast.py | 
https://huggingface.co/docs/tokenizers/python/latest/components.html?highlight=unigram#models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/rembert/tokenization_rembert_fast.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/rembert/tokenization_rembert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/rembert/tokenization_rembert.py | https://huggingface.co/google/rembert/resolve/main/sentencepiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/rembert/tokenization_rembert.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/rembert/configuration_rembert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/rembert/modeling_tf_rembert.py | https://huggingface.co/models?filter=rembert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/rembert/modeling_tf_rembert.py | https://github.com/tensorflow/mesh/blob/8d2465e9bc93129b913b5ccc6a59aa97abd96ec6/mesh_tensorflow/transformer/transformer_layers.py#L270 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/rembert/modeling_tf_rembert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/rembert/configuration_rembert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/rembert/modeling_rembert.py | https://huggingface.co/models?filter=rembert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/rembert/modeling_rembert.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/rembert/modeling_rembert.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/rembert/modeling_rembert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/rembert/modeling_rembert.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/rembert/configuration_rembert.py | 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/rembert/configuration_rembert.py | https://huggingface.co/google/rembert/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/rembert/configuration_rembert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/rembert/configuration_rembert.py | https://huggingface.co/models?filter=rembert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/reformer/tokenization_reformer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/reformer/tokenization_reformer_fast.py | https://huggingface.co/google/reformer-crime-and-punishment/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/reformer/tokenization_reformer_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/reformer/tokenization_reformer_fast.py | https://huggingface.co/google/reformer-crime-and-punishment/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/pegasus/tokenization_pegasus_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/reformer/tokenization_reformer_fast.py | https://huggingface.co/docs/tokenizers/python/latest/components.html?highlight=unigram#models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/reformer/tokenization_reformer_fast.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/reformer/tokenization_reformer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/reformer/tokenization_reformer.py | https://huggingface.co/google/reformer-crime-and-punishment/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/reformer/tokenization_reformer.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/reformer/tokenization_reformer.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/reformer.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/reformer/modeling_reformer.py | https://huggingface.co/models?filter=reformer | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/reformer/modeling_reformer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/reformer/modeling_reformer.py | https://arxiv.org/pdf/1509.02897.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/reformer/modeling_reformer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/reformer/modeling_reformer.py | https://arxiv.org/pdf/2001.04451.pdf | 参考论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/src/transformers/models/reformer/modeling_reformer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/reformer/modeling_reformer.py | https://towardsdatascience.com/illustrating-the-reformer-393575ac6ba0 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/reformer/modeling_reformer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/reformer/modeling_reformer.py | https://github.com/lucidrains/reformer-pytorch/blob/master/reformer_pytorch/reversible.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/reformer/modeling_reformer.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/reformer/modeling_reformer.py | https://arxiv.org/abs/2001.04451 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/reformer/modeling_reformer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/reformer/configuration_reformer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/reformer/configuration_reformer.py | https://huggingface.co/google/reformer-crime-and-punishment/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/reformer/configuration_reformer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/reformer/configuration_reformer.py | https://huggingface.co/google/reformer-enwik8/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/realm/tokenization_realm_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://huggingface.co/google/realm-cc-news-pretrained-embedder/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/realm/tokenization_realm_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://huggingface.co/google/realm-cc-news-pretrained-encoder/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/realm/tokenization_realm_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://huggingface.co/google/realm-cc-news-pretrained-scorer/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/realm/tokenization_realm_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://huggingface.co/google/realm-cc-news-pretrained-openqa/aresolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/realm/tokenization_realm_fast.py | 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://huggingface.co/google/realm-orqa-nq-openqa/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/realm/tokenization_realm_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://huggingface.co/google/realm-orqa-nq-reader/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/realm/tokenization_realm_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://huggingface.co/google/realm-orqa-wq-openqa/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/realm/tokenization_realm_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://huggingface.co/google/realm-orqa-wq-reader/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/realm/tokenization_realm_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://huggingface.co/google/realm-cc-news-pretrained-embedder/resolve/main/tokenizer.jsont | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/realm/tokenization_realm_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://huggingface.co/google/realm-cc-news-pretrained-encoder/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/realm/tokenization_realm_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://huggingface.co/google/realm-cc-news-pretrained-scorer/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/realm/tokenization_realm_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://huggingface.co/google/realm-cc-news-pretrained-openqa/aresolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/realm/tokenization_realm_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://huggingface.co/google/realm-orqa-nq-openqa/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/realm/tokenization_realm_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://huggingface.co/google/realm-orqa-nq-reader/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/realm/tokenization_realm_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://huggingface.co/google/realm-orqa-wq-openqa/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/src/transformers/models/realm/tokenization_realm_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://huggingface.co/google/realm-orqa-wq-reader/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mobilebert/tokenization_mobilebert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://github.com/huggingface/transformers/issues/328 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/realm/tokenization_realm_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm.py | https://huggingface.co/google/realm-cc-news-pretrained-embedder/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/realm/tokenization_realm_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm.py | https://huggingface.co/google/realm-cc-news-pretrained-encoder/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/realm/tokenization_realm_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm.py | https://huggingface.co/google/realm-cc-news-pretrained-scorer/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/realm/tokenization_realm_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm.py | https://huggingface.co/google/realm-cc-news-pretrained-openqa/aresolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/realm/tokenization_realm_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm.py | https://huggingface.co/google/realm-orqa-nq-openqa/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/realm/tokenization_realm_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm.py | https://huggingface.co/google/realm-orqa-nq-reader/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/realm/tokenization_realm_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm.py | https://huggingface.co/google/realm-orqa-wq-openqa/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/realm/tokenization_realm_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm.py | https://huggingface.co/google/realm-orqa-wq-reader/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mobilebert/tokenization_mobilebert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm.py | https://github.com/huggingface/transformers/issues/328 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_chinese_ref.py | 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm.py | https://en.wikipedia.org/wiki/CJK_Unified_Ideographs_ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/realm/configuration_realm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/realm/modeling_realm.py | https://huggingface.co/models?filter=realm | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/realm/modeling_realm.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/realm/modeling_realm.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/realm/modeling_realm.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/realm/configuration_realm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/realm/configuration_realm.py | https://huggingface.co/google/realm-cc-news-pretrained-embedder/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/realm/configuration_realm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/realm/configuration_realm.py | https://huggingface.co/google/realm-cc-news-pretrained-encoder/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/realm/configuration_realm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/realm/configuration_realm.py | https://huggingface.co/google/realm-cc-news-pretrained-scorer/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/realm/configuration_realm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/realm/configuration_realm.py | https://huggingface.co/google/realm-cc-news-pretrained-openqa/aresolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/realm/configuration_realm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/realm/configuration_realm.py | https://huggingface.co/google/realm-orqa-nq-openqa/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/realm/configuration_realm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/realm/configuration_realm.py | https://huggingface.co/google/realm-orqa-nq-reader/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/realm/configuration_realm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/realm/configuration_realm.py | https://huggingface.co/google/realm-orqa-wq-openqa/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/src/transformers/models/realm/configuration_realm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/realm/configuration_realm.py | https://huggingface.co/google/realm-orqa-wq-reader/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/realm/configuration_realm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/realm/configuration_realm.py | https://huggingface.co/models?filter=realm | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/realm/tokenization_realm_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/realm/configuration_realm.py | https://huggingface.co/google/realm-cc-news-pretrained-embedder | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/rag/retrieval_rag.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/rag/retrieval_rag.py | https://storage.googleapis.com/huggingface-nlp/datasets/wiki_dpr/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/rag/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/rag/retrieval_rag.py | https://github.com/facebookresearch/DPR | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/rag-end2end-retriever/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/rag/modeling_tf_rag.py | https://arxiv.org/abs/2005.11401 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/rag/modeling_tf_rag.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/rag/modeling_tf_rag.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/rag/modeling_tf_rag.py | https://arxiv.org/pdf/2005.11401.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/rag/modeling_tf_rag.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/rag/modeling_tf_rag.py | https://stackoverflow.com/questions/52129909/tensorflow-equivalent-of-torch-gather | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/rag-end2end-retriever/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/rag/modeling_rag.py | https://arxiv.org/abs/2005.11401 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/rag/modeling_rag.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ja/generation_strategies.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/rag/modeling_rag.py | https://arxiv.org/pdf/1610.02424.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/whisper/modeling_whisper.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/rag/modeling_rag.py | 
https://arxiv.org/abs/2010.00904 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/quantization-qdqbert/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/qdqbert/modeling_qdqbert.py | https://github.com/NVIDIA/TensorRT/tree/master/tools/pytorch-quantization | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/bert.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/qdqbert/modeling_qdqbert.py | https://huggingface.co/models?filter=bert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/qdqbert/modeling_qdqbert.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/qdqbert/modeling_qdqbert.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/qdqbert/modeling_qdqbert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/qdqbert/modeling_qdqbert.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/bert/configuration_bert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/qdqbert/configuration_qdqbert.py | https://huggingface.co/bert-base-uncased/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/bert.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/qdqbert/configuration_qdqbert.py | https://huggingface.co/models?filter=bert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/jax-projects/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/qdqbert/configuration_qdqbert.py | https://huggingface.co/bert-base-uncased | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/prophetnet/tokenization_prophetnet.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/prophetnet/tokenization_prophetnet.py | https://huggingface.co/microsoft/prophetnet-large-uncased/resolve/main/prophetnet.tokenizer | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mobilebert/tokenization_mobilebert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/prophetnet/tokenization_prophetnet.py | https://github.com/huggingface/transformers/issues/328 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/prophetnet.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/prophetnet/modeling_prophetnet.py | https://huggingface.co/models?filter=prophetnet | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/prophetnet.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/prophetnet/modeling_prophetnet.py | https://github.com/microsoft/ProphetNet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/prophetnet/modeling_prophetnet.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/prophetnet/configuration_prophetnet.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/prophetnet/configuration_prophetnet.py | https://huggingface.co/microsoft/prophetnet-large-uncased/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/prophetnet/configuration_prophetnet.py | https://arxiv.org/abs/1910.10683 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/poolformer/modeling_poolformer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/poolformer/modeling_poolformer.py | https://huggingface.co/models?filter=poolformer | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/donut/modeling_donut_swin.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/poolformer/modeling_poolformer.py | https://github.com/tensorflow/tpu/issues/494#issuecomment-532968956 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/poolformer/modeling_poolformer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/poolformer.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/poolformer/convert_poolformer_original_to_pytorch.py | https://github.com/sail-sg/poolformer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/poolformer/convert_poolformer_original_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/poolformer/configuration_poolformer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/poolformer/configuration_poolformer.py | https://huggingface.co/sail/poolformer_s12/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/poolformer/modeling_poolformer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/poolformer/configuration_poolformer.py | https://huggingface.co/models?filter=poolformer | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/poolformer/configuration_poolformer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/poolformer/configuration_poolformer.py | 
https://huggingface.co/sail/poolformer_s12 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/plbart/tokenization_plbart.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | https://huggingface.co/uclanlp/plbart-base/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/plbart/tokenization_plbart.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | https://huggingface.co/uclanlp/plbart-c-cpp-defect-detection/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/plbart/tokenization_plbart.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | https://huggingface.co/uclanlp/plbart-cs-java/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/plbart/tokenization_plbart.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | https://huggingface.co/uclanlp/plbart-en_XX-java/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/plbart/tokenization_plbart.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | https://huggingface.co/uclanlp/plbart-go-en_XX/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/plbart/tokenization_plbart.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | https://huggingface.co/uclanlp/plbart-java-clone-detection/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/plbart/tokenization_plbart.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | https://huggingface.co/uclanlp/plbart-java-cs/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/plbart/tokenization_plbart.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | https://huggingface.co/uclanlp/plbart-java-en_XX/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/plbart/tokenization_plbart.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | https://huggingface.co/uclanlp/plbart-javascript-en_XX/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/plbart/tokenization_plbart.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | https://huggingface.co/uclanlp/plbart-php-en_XX/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/plbart/tokenization_plbart.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | 
https://huggingface.co/uclanlp/plbart-python-en_XX/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/plbart/tokenization_plbart.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | https://huggingface.co/uclanlp/plbart-refine-java-medium/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/plbart/tokenization_plbart.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | https://huggingface.co/uclanlp/plbart-refine-java-small/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/plbart/tokenization_plbart.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | https://huggingface.co/uclanlp/plbart-ruby-en_XX/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/plbart/configuration_plbart.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/plbart/modeling_plbart.py | https://huggingface.co/models?filter=plbart | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/plbart/modeling_plbart.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/plbart/modeling_plbart.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/plbart/configuration_plbart.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/plbart/configuration_plbart.py | https://huggingface.co/uclanlp/plbart-base/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/plbart/configuration_plbart.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/plbart/configuration_plbart.py | https://huggingface.co/models?filter=plbart | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/plbart/configuration_plbart.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/plbart/configuration_plbart.py | https://huggingface.co/uclanlp/plbart-base | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/plbart/configuration_plbart.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/phobert/tokenization_phobert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/phobert/tokenization_phobert.py | https://huggingface.co/vinai/phobert-base/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/phobert/tokenization_phobert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/phobert/tokenization_phobert.py | https://huggingface.co/vinai/phobert-large/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/phobert/tokenization_phobert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/phobert/tokenization_phobert.py | https://huggingface.co/vinai/phobert-base/resolve/main/bpe.codes | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/phobert/tokenization_phobert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/phobert/tokenization_phobert.py | https://huggingface.co/vinai/phobert-large/resolve/main/bpe.codes | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/perceiver/modeling_perceiver.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/perceiver/modeling_perceiver.py | https://huggingface.co/models?filter=perceiver | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/perceiver/modeling_perceiver.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/perceiver/modeling_perceiver.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/perceiver/modeling_perceiver.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/perceiver/modeling_perceiver.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/perceiver/modeling_perceiver.py | https://discuss.pytorch.org/t/is-there-any-layer-like-tensorflows-space-to-depth-function/3487/15 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/perceiver/modeling_perceiver.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/perceiver/modeling_perceiver.py | https://gist.github.com/sumanmichael/4de9dee93f972d47c80c4ade8e149ea6 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/perceiver/convert_perceiver_haiku_to_pytorch.py | 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/perceiver/convert_perceiver_haiku_to_pytorch.py | https://storage.googleapis.com/perceiver_io/dalmation.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/perceiver/configuration_perceiver.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/perceiver/configuration_perceiver.py | https://huggingface.co/deepmind/language-perceiver/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/perceiver/modeling_perceiver.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/perceiver/configuration_perceiver.py | https://huggingface.co/models?filter=perceiver | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/perceiver/configuration_perceiver.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/perceiver/configuration_perceiver.py | https://huggingface.co/deepmind/language-perceiver | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/pegasus/tokenization_pegasus.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/pegasus/tokenization_pegasus_fast.py | https://huggingface.co/google/pegasus-xsum/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/pegasus/tokenization_pegasus_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/pegasus/tokenization_pegasus_fast.py | https://huggingface.co/google/pegasus-xsum/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/pegasus/tokenization_pegasus_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/pegasus/tokenization_pegasus_fast.py | https://huggingface.co/docs/tokenizers/python/latest/components.html?highlight=unigram#models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/pegasus/tokenization_pegasus_fast.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/pegasus.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/pegasus/tokenization_pegasus_fast.py | https://arxiv.org/pdf/1912.08777.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/pegasus/tokenization_pegasus.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/pegasus/tokenization_pegasus_fast.py | https://github.com/google-research/pegasus/blob/939830367bcf411193d2b5eca2f2f90f3f9260ca/pegasus/ops/pretrain_parsing_ops.cc#L66 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/pegasus/tokenization_pegasus.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/pegasus/tokenization_pegasus.py | https://huggingface.co/google/pegasus-xsum/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/pegasus/tokenization_pegasus.py | https://github.com/google/sentencepiece | 源码实现 | 
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/pegasus.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/pegasus/tokenization_pegasus.py | https://arxiv.org/pdf/1912.08777.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/pegasus/tokenization_pegasus.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/pegasus/tokenization_pegasus.py | https://github.com/google-research/pegasus/blob/939830367bcf411193d2b5eca2f2f90f3f9260ca/pegasus/ops/pretrain_parsing_ops.cc#L66 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/pegasus/tokenization_pegasus.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/pegasus/modeling_tf_pegasus.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_tf_pegasus.py | https://github.com/tensorflow/models/blob/a009f4fb9d2fc4949e32192a944688925ef78659/official/transformer/v2/embedding_layer.py#L24 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_tf_pegasus.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_tf_pegasus.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/pegasus.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_pegasus.py | https://huggingface.co/models?filter=pegasus | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_pegasus.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_pegasus.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_flax_pegasus.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_flax_pegasus.py | 
https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_flax_pegasus.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_flax_pegasus.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_flax_pegasus.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_flax_pegasus.py | https://arxiv.org/abs/1910.13461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_flax_pegasus.py | https://github.com/google/flax/blob/491ce18759622506588784b4fca0e4bf05f8c8cd/flax/linen/attention.py#L252 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_flax_pegasus.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/pegasus/configuration_pegasus.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/pegasus/configuration_pegasus.py | https://huggingface.co/google/pegasus-large/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/pegasus.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/pegasus/configuration_pegasus.py | https://huggingface.co/models?filter=pegasus | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/pegasus/configuration_pegasus.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/pegasus/configuration_pegasus.py | https://huggingface.co/google/pegasus-large | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/pegasus/configuration_pegasus.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/src/transformers/models/openai/tokenization_openai_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/openai/tokenization_openai_fast.py | https://huggingface.co/openai-gpt/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/openai/tokenization_openai_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/openai/tokenization_openai_fast.py | https://huggingface.co/openai-gpt/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/openai/tokenization_openai_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/openai/tokenization_openai_fast.py | https://huggingface.co/openai-gpt/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/openai/tokenization_openai_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/openai/tokenization_openai.py | https://huggingface.co/openai-gpt/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/openai/tokenization_openai_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/openai/tokenization_openai.py | https://huggingface.co/openai-gpt/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/openai-gpt.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/openai/modeling_tf_openai.py | https://huggingface.co/models?filter=openai-gpt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/openai/modeling_tf_openai.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/openai-gpt.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/openai/modeling_openai.py | https://huggingface.co/models?filter=openai-gpt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/openai/modeling_openai.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/openai/modeling_openai.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/openai/configuration_openai.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/openai/configuration_openai.py | https://huggingface.co/openai-gpt/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/openai/configuration_openai.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/openai/configuration_openai.py | https://huggingface.co/openai-gpt | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/src/transformers/models/nystromformer/configuration_nystromformer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/nystromformer/modeling_nystromformer.py | https://huggingface.co/models?filter=nystromformer | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/nystromformer/modeling_nystromformer.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/nystromformer/modeling_nystromformer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/nystromformer/configuration_nystromformer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/nystromformer/configuration_nystromformer.py | https://huggingface.co/uw-madison/nystromformer-512/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/nystromformer/configuration_nystromformer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/nystromformer/configuration_nystromformer.py | https://huggingface.co/models?filter=nystromformer | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/nystromformer/configuration_nystromformer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/nystromformer/configuration_nystromformer.py | https://huggingface.co/uw-madison/nystromformer-512 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/mt5.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mt5/configuration_mt5.py | https://huggingface.co/google/mt5-small | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mpnet/tokenization_mpnet_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mpnet/tokenization_mpnet_fast.py | https://huggingface.co/microsoft/mpnet-base/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mpnet/tokenization_mpnet_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mpnet/tokenization_mpnet_fast.py | https://huggingface.co/microsoft/mpnet-base/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mobilebert/tokenization_mobilebert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mpnet/tokenization_mpnet_fast.py | https://github.com/huggingface/transformers/issues/328 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mpnet/tokenization_mpnet_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mpnet/tokenization_mpnet.py | https://huggingface.co/microsoft/mpnet-base/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mobilebert/tokenization_mobilebert.py | 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mpnet/tokenization_mpnet.py | https://github.com/huggingface/transformers/issues/328 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_chinese_ref.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mpnet/tokenization_mpnet.py | https://en.wikipedia.org/wiki/CJK_Unified_Ideographs_ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mpnet/modeling_tf_mpnet.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mpnet/modeling_mpnet.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mpnet/modeling_mpnet.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mpnet/configuration_mpnet.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mpnet/configuration_mpnet.py | https://huggingface.co/microsoft/mpnet-base/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mobilebert/tokenization_mobilebert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mobilebert/tokenization_mobilebert_fast.py | https://huggingface.co/google/mobilebert-uncased/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mobilebert/tokenization_mobilebert_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mobilebert/tokenization_mobilebert_fast.py | https://huggingface.co/google/mobilebert-uncased/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mobilebert/tokenization_mobilebert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mobilebert/tokenization_mobilebert.py | https://huggingface.co/google/mobilebert-uncased/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mobilebert/modeling_tf_mobilebert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mobilebert/modeling_tf_mobilebert.py | https://huggingface.co/models?filter=mobilebert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mobilebert/modeling_tf_mobilebert.py | https://arxiv.org/abs/2004.02984 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mobilebert/modeling_tf_mobilebert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mobilebert/modeling_mobilebert.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mobilebert/modeling_mobilebert.py | https://arxiv.org/abs/2004.02984 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mobilebert/modeling_mobilebert.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mobilebert/modeling_mobilebert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mobilebert/modeling_mobilebert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mobilebert/modeling_mobilebert.py | https://arxiv.org/pdf/2004.02984.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mobilebert/configuration_mobilebert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mobilebert/configuration_mobilebert.py | https://huggingface.co/google/mobilebert-uncased/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/deprecated/mmbt/modeling_mmbt.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mmbt/modeling_mmbt.py | https://github.com/facebookresearch/mmbt | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mmbt/modeling_mmbt.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mluke/tokenization_mluke.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mluke/tokenization_mluke.py | https://huggingface.co/studio-ousia/mluke-base/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mluke/tokenization_mluke.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mluke/tokenization_mluke.py | https://huggingface.co/studio-ousia/mluke-base/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mluke/tokenization_mluke.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mluke/tokenization_mluke.py | https://huggingface.co/studio-ousia/mluke-base/resolve/main/entity_vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mluke/tokenization_mluke.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mluke/tokenization_mluke.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mluke/tokenization_mluke.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mluke/tokenization_mluke.py | https://github.com/huggingface/transformers/pull/2778 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/layoutxlm/tokenization_layoutxlm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mluke/tokenization_mluke.py | https://github.com/huggingface/transformers/pull/2674 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/codeparrot/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/megatron_gpt2/convert_megatron_gpt2_checkpoint.py | https://github.com/NVIDIA/Megatron-LM | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ja/perf_train_gpu_one.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/megatron_gpt2/convert_megatron_gpt2_checkpoint.py | https://github.com/microsoft/Megatron-DeepSpeed/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/megatron_bert/convert_megatron_bert_checkpoint.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/megatron_gpt2/convert_megatron_gpt2_checkpoint.py | https://github.com/NVIDIA/Megatron-LM/blob/v2.4/megatron/checkpointing.py#L209 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/megatron_gpt2/convert_megatron_gpt2_checkpoint.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/megatron_gpt2/convert_megatron_gpt2_checkpoint.py | https://github.com/huggingface/transformers/issues/13906 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/megatron_bert/modeling_megatron_bert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/megatron_bert/modeling_megatron_bert.py | https://huggingface.co/models?filter=megatron_bert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/megatron_bert/modeling_megatron_bert.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/megatron_bert/modeling_megatron_bert.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/megatron_bert/modeling_megatron_bert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/megatron_bert/modeling_megatron_bert.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/examples/research_projects/codeparrot/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/megatron_bert/convert_megatron_bert_checkpoint.py | https://github.com/NVIDIA/Megatron-LM | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ja/perf_train_gpu_one.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/megatron_bert/convert_megatron_bert_checkpoint.py | https://github.com/microsoft/Megatron-DeepSpeed/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/megatron_bert/convert_megatron_bert_checkpoint.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/megatron_bert/convert_megatron_bert_checkpoint.py | https://github.com/NVIDIA/Megatron-LM/blob/v2.4/megatron/checkpointing.py#L209 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/bert.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/megatron_bert/configuration_megatron_bert.py | https://huggingface.co/models?filter=bert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/megatron_bert/configuration_megatron_bert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/megatron_bert/configuration_megatron_bert.py | https://huggingface.co/nvidia/megatron-bert-uncased-345m | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/question-answering/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/megatron_bert/configuration_megatron_bert.py | https://arxiv.org/abs/1803.02155 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/question-answering/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/megatron_bert/configuration_megatron_bert.py | https://arxiv.org/abs/2009.13658 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mbart50/tokenization_mbart50.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mbart50/tokenization_mbart50_fast.py | https://huggingface.co/facebook/mbart-large-50-one-to-many-mmt/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mbart50/tokenization_mbart50_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mbart50/tokenization_mbart50_fast.py | https://huggingface.co/facebook/mbart-large-50-one-to-many-mmt/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/layoutxlm/tokenization_layoutxlm_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mbart50/tokenization_mbart50_fast.py | https://huggingface.co/docs/tokenizers/python/latest/components.html?highlight=BPE#models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mbart50/tokenization_mbart50.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mbart50/tokenization_mbart50.py | https://huggingface.co/facebook/mbart-large-50-one-to-many-mmt/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mbart50/tokenization_mbart50.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mbart50/tokenization_mbart50.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mbart/tokenization_mbart.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mbart/tokenization_mbart_fast.py | https://huggingface.co/facebook/mbart-large-en-ro/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mbart/tokenization_mbart.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mbart/tokenization_mbart_fast.py | https://huggingface.co/facebook/mbart-large-cc25/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mbart/tokenization_mbart_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mbart/tokenization_mbart_fast.py | https://huggingface.co/facebook/mbart-large-en-ro/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mbart/tokenization_mbart_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mbart/tokenization_mbart_fast.py | https://huggingface.co/facebook/mbart-large-cc25/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/layoutxlm/tokenization_layoutxlm_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mbart/tokenization_mbart_fast.py | https://huggingface.co/docs/tokenizers/python/latest/components.html?highlight=BPE#models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mbart/tokenization_mbart.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mbart/tokenization_mbart.py | https://huggingface.co/facebook/mbart-large-en-ro/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mbart/tokenization_mbart.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mbart/tokenization_mbart.py | https://huggingface.co/facebook/mbart-large-cc25/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mbart/tokenization_mbart.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mbart/modeling_tf_mbart.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mbart/modeling_tf_mbart.py | https://arxiv.org/abs/1909.11556 
| 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/mbart.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mbart/modeling_mbart.py | https://huggingface.co/models?filter=mbart | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mbart/modeling_mbart.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mbart/modeling_mbart.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mbart/modeling_flax_mbart.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mbart/modeling_flax_mbart.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mbart/modeling_flax_mbart.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mbart/modeling_flax_mbart.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mbart/modeling_flax_mbart.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mbart/modeling_flax_mbart.py | https://arxiv.org/abs/1910.13461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mbart/modeling_flax_mbart.py | https://github.com/google/flax/blob/491ce18759622506588784b4fca0e4bf05f8c8cd/flax/linen/attention.py#L252 | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mbart/modeling_flax_mbart.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mbart/configuration_mbart.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mbart/configuration_mbart.py | https://huggingface.co/facebook/mbart-large-cc25/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/mbart.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mbart/configuration_mbart.py | https://huggingface.co/models?filter=mbart | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/tensorflow/translation/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mbart/configuration_mbart.py | https://huggingface.co/facebook/mbart-large-cc25 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mbart/configuration_mbart.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/maskformer/modeling_maskformer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/maskformer/modeling_maskformer.py | https://huggingface.co/models?filter=maskformer | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/yolos/modeling_yolos.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/maskformer/modeling_maskformer.py | https://arxiv.org/abs/1708.02002 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/maskformer/modeling_maskformer.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/maskformer/modeling_maskformer.py | https://arxiv.org/abs/2107.06278 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/maskformer/modeling_maskformer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/maskformer/modeling_maskformer.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/maskformer/modeling_maskformer.py | 
http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/maskformer/convert_maskformer_original_pytorch_checkpoint_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/maskformer.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/maskformer/convert_maskformer_original_pytorch_checkpoint_to_pytorch.py | https://github.com/facebookresearch/MaskFormer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/maskformer/configuration_maskformer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/maskformer/configuration_maskformer.py | https://huggingface.co/facebook/maskformer-swin-base-ade/blob/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/maskformer/modeling_maskformer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/maskformer/configuration_maskformer.py | https://huggingface.co/models?filter=maskformer | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/maskformer/configuration_maskformer.py | https://huggingface.co/datasets/scene_parse_150 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/maskformer/configuration_maskformer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/maskformer/configuration_maskformer.py | https://huggingface.co/microsoft/swin-base-patch4-window12-384-in22k | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/maskformer/configuration_maskformer.py | https://huggingface.co/facebook/detr-resnet-50 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/marian/tokenization_marian.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/marian/tokenization_marian.py | https://huggingface.co/Helsinki-NLP/opus-mt-en-de/resolve/main/source.spm | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/marian/tokenization_marian.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/marian/tokenization_marian.py | https://huggingface.co/Helsinki-NLP/opus-mt-en-de/resolve/main/target.spm | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/marian/tokenization_marian.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/marian/tokenization_marian.py | https://huggingface.co/Helsinki-NLP/opus-mt-en-de/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/marian/tokenization_marian.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/marian/tokenization_marian.py | https://huggingface.co/Helsinki-NLP/opus-mt-en-de/resolve/main/tokenizer_config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/marian/tokenization_marian.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/marian/tokenization_marian.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/pegasus/modeling_tf_pegasus.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/marian/modeling_tf_marian.py | https://github.com/tensorflow/models/blob/a009f4fb9d2fc4949e32192a944688925ef78659/official/transformer/v2/embedding_layer.py#L24 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/marian/modeling_tf_marian.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/marian/modeling_marian.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/marian/modeling_tf_marian.py | https://huggingface.co/models?search=Helsinki-NLP | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/marian/modeling_tf_marian.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/marian.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/marian/modeling_marian.py | https://huggingface.co/models?filter=marian | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/marian/modeling_marian.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/marian/modeling_marian.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/marian/modeling_marian.py | https://huggingface.co/models?search=Helsinki-NLP | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/marian/modeling_marian.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/marian/modeling_flax_marian.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/marian/modeling_flax_marian.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/marian/modeling_flax_marian.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/marian/modeling_flax_marian.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/marian/modeling_flax_marian.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/marian/modeling_flax_marian.py | https://arxiv.org/abs/1910.13461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/marian/modeling_flax_marian.py | https://github.com/google/flax/blob/491ce18759622506588784b4fca0e4bf05f8c8cd/flax/linen/attention.py#L252 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/marian/modeling_flax_marian.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/marian/convert_marian_to_pytorch.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/marian/convert_marian_to_pytorch.py | https://en.wikipedia.org/wiki/Insular_Celtic_languages | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/marian/convert_marian_to_pytorch.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/marian/convert_marian_to_pytorch.py | https://github.com/Helsinki-NLP/OPUS-MT-train/blob/master/models/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/marian.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/marian/convert_marian_to_pytorch.py | https://github.com/Helsinki-NLP/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/seq2seq/romanian_postprocessing.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/marian/convert_marian_to_pytorch.py | git@github.com | 开发者邮箱配置 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/src/transformers/models/marian/convert_marian_tatoeba_to_pytorch.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/marian/convert_marian_tatoeba_to_pytorch.py | https://datahub.io/core/language-codes/r/language-codes-3b2.csv | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/tatoeba/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/marian/convert_marian_tatoeba_to_pytorch.py | https://cdn-datasets.huggingface.co/language_codes/iso-639-3.csv | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/marian/convert_marian_tatoeba_to_pytorch.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/marian/convert_marian_tatoeba_to_pytorch.py | https://object.pouta.csc.fi/Tatoeba-MT-models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/seq2seq/romanian_postprocessing.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/marian/convert_marian_tatoeba_to_pytorch.py | git@github.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/marian/configuration_marian.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/marian/configuration_marian.py | https://huggingface.co/Helsinki-NLP/opus-mt-en-de/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/marian.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/marian/configuration_marian.py | https://huggingface.co/models?filter=marian | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/marian/tokenization_marian.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/marian/configuration_marian.py | https://huggingface.co/Helsinki-NLP/opus-mt-en-de | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/marian/configuration_marian.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/m2m_100/tokenization_m2m_100.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/m2m_100/tokenization_m2m_100.py | https://huggingface.co/facebook/m2m100_418M/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/m2m_100/tokenization_m2m_100.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/m2m_100/tokenization_m2m_100.py | https://huggingface.co/facebook/m2m100_1.2B/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/m2m_100/tokenization_m2m_100.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/m2m_100/tokenization_m2m_100.py | https://huggingface.co/facebook/m2m100_418M/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/m2m_100/tokenization_m2m_100.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/m2m_100/tokenization_m2m_100.py 
| https://huggingface.co/facebook/m2m100_1.2B/resolve/main/sentencepiece.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/m2m_100/tokenization_m2m_100.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/m2m_100/tokenization_m2m_100.py | https://huggingface.co/facebook/m2m100_418M/resolve/main/tokenizer_config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/m2m_100/tokenization_m2m_100.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/m2m_100/tokenization_m2m_100.py | https://huggingface.co/facebook/m2m100_1.2B/resolve/main/tokenizer_config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/m2m_100/tokenization_m2m_100.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/m2m_100/tokenization_m2m_100.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/m2m_100/configuration_m2m_100.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/m2m_100/modeling_m2m_100.py | https://huggingface.co/models?filter=m2m_100 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/m2m_100/modeling_m2m_100.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/m2m_100/modeling_m2m_100.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/m2m_100/configuration_m2m_100.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/m2m_100/configuration_m2m_100.py | https://huggingface.co/facebook/m2m100_418M/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/m2m_100/configuration_m2m_100.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/m2m_100/configuration_m2m_100.py | https://huggingface.co/models?filter=m2m_100 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/m2m_100/tokenization_m2m_100.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/m2m_100/configuration_m2m_100.py | https://huggingface.co/facebook/m2m100_418M | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/m2m_100/configuration_m2m_100.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/src/transformers/models/lxmert/tokenization_lxmert_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/lxmert/tokenization_lxmert_fast.py | https://huggingface.co/unc-nlp/lxmert-base-uncased/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/lxmert/tokenization_lxmert_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/lxmert/tokenization_lxmert_fast.py | https://huggingface.co/unc-nlp/lxmert-base-uncased/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/lxmert/tokenization_lxmert_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/lxmert/tokenization_lxmert.py | https://huggingface.co/unc-nlp/lxmert-base-uncased/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/lxmert/modeling_tf_lxmert.py | https://arxiv.org/abs/1908.07490 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/lxmert/modeling_tf_lxmert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/lxmert/modeling_lxmert.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/lxmert/modeling_lxmert.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/lxmert/modeling_lxmert.py | https://arxiv.org/abs/1908.07490 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/lxmert/modeling_lxmert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/luke/tokenization_luke.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/luke/tokenization_luke.py | https://huggingface.co/studio-ousia/luke-base/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/luke/tokenization_luke.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/luke/tokenization_luke.py | https://huggingface.co/studio-ousia/luke-large/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/luke/tokenization_luke.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/luke/tokenization_luke.py | https://huggingface.co/studio-ousia/luke-base/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/luke/tokenization_luke.py | 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/luke/tokenization_luke.py | https://huggingface.co/studio-ousia/luke-large/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/luke/tokenization_luke.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/luke/tokenization_luke.py | https://huggingface.co/studio-ousia/luke-base/resolve/main/entity_vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/luke/tokenization_luke.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/luke/tokenization_luke.py | https://huggingface.co/studio-ousia/luke-large/resolve/main/entity_vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mluke/tokenization_mluke.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/luke/tokenization_luke.py | https://github.com/huggingface/transformers/pull/2778 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/layoutxlm/tokenization_layoutxlm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/luke/tokenization_luke.py | https://github.com/huggingface/transformers/pull/2674 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/luke/modeling_luke.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/luke/modeling_luke.py | https://huggingface.co/models?filter=luke | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/luke/modeling_luke.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/luke/configuration_luke.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/luke/configuration_luke.py | https://huggingface.co/studio-ousia/luke-base/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/luke/configuration_luke.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/luke/configuration_luke.py | https://huggingface.co/studio-ousia/luke-large/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/luke/configuration_luke.py | https://arxiv.org/abs/2010.01057 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/longformer/tokenization_longformer_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer_fast.py | https://huggingface.co/allenai/longformer-base-4096/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/longformer/tokenization_longformer_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer_fast.py | https://huggingface.co/allenai/longformer-large-4096/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/src/transformers/models/longformer/tokenization_longformer_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer_fast.py | https://huggingface.co/allenai/longformer-large-4096-finetuned-triviaqa/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/longformer/tokenization_longformer_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer_fast.py | https://huggingface.co/allenai/longformer-base-4096-extra.pos.embd.only/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/longformer/tokenization_longformer_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer_fast.py | https://huggingface.co/allenai/longformer-large-4096-extra.pos.embd.only/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/longformer/tokenization_longformer_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer_fast.py | https://huggingface.co/allenai/longformer-base-4096/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/longformer/tokenization_longformer_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer_fast.py | https://huggingface.co/allenai/longformer-large-4096/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/longformer/tokenization_longformer_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer_fast.py | https://huggingface.co/allenai/longformer-large-4096-finetuned-triviaqa/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/longformer/tokenization_longformer_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer_fast.py | https://huggingface.co/allenai/longformer-base-4096-extra.pos.embd.only/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/longformer/tokenization_longformer_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer_fast.py | https://huggingface.co/allenai/longformer-large-4096-extra.pos.embd.only/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/longformer/tokenization_longformer_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer_fast.py | https://huggingface.co/allenai/longformer-base-4096/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/longformer/tokenization_longformer_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer_fast.py | https://huggingface.co/allenai/longformer-large-4096/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/src/transformers/models/longformer/tokenization_longformer_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer_fast.py | https://huggingface.co/allenai/longformer-large-4096-finetuned-triviaqa/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/longformer/tokenization_longformer_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer_fast.py | https://huggingface.co/allenai/longformer-base-4096-extra.pos.embd.only/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/longformer/tokenization_longformer_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer_fast.py | https://huggingface.co/allenai/longformer-large-4096-extra.pos.embd.only/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/longformer/tokenization_longformer_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer.py | https://huggingface.co/allenai/longformer-base-4096/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/longformer/tokenization_longformer_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer.py | https://huggingface.co/allenai/longformer-large-4096/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/longformer/tokenization_longformer_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer.py | https://huggingface.co/allenai/longformer-large-4096-finetuned-triviaqa/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/longformer/tokenization_longformer_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer.py | https://huggingface.co/allenai/longformer-base-4096-extra.pos.embd.only/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/longformer/tokenization_longformer_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer.py | https://huggingface.co/allenai/longformer-large-4096-extra.pos.embd.only/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/longformer/tokenization_longformer_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer.py | https://huggingface.co/allenai/longformer-base-4096/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/longformer/tokenization_longformer_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer.py | https://huggingface.co/allenai/longformer-large-4096/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/src/transformers/models/longformer/tokenization_longformer_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer.py | https://huggingface.co/allenai/longformer-large-4096-finetuned-triviaqa/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/longformer/tokenization_longformer_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer.py | https://huggingface.co/allenai/longformer-base-4096-extra.pos.embd.only/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/longformer/tokenization_longformer_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer.py | https://huggingface.co/allenai/longformer-large-4096-extra.pos.embd.only/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/longformer.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/longformer/modeling_tf_longformer.py | https://huggingface.co/models?filter=longformer | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/longformer/modeling_tf_longformer.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/longformer/modeling_tf_longformer.py | https://arxiv.org/abs/2004.05150 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/longformer.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/longformer/modeling_longformer.py | https://huggingface.co/models?filter=longformer | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/longformer/modeling_longformer.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/longformer/modeling_longformer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/longformer/modeling_longformer.py | https://arxiv.org/abs/2004.05150 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/longformer/configuration_longformer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/longformer/configuration_longformer.py | https://huggingface.co/allenai/longformer-base-4096/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/longformer/configuration_longformer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/longformer/configuration_longformer.py | 
https://huggingface.co/allenai/longformer-large-4096/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/longformer/configuration_longformer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/longformer/configuration_longformer.py | https://huggingface.co/allenai/longformer-large-4096-finetuned-triviaqa/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/longformer/configuration_longformer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/longformer/configuration_longformer.py | https://huggingface.co/allenai/longformer-base-4096-extra.pos.embd.only/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/longformer/configuration_longformer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/longformer/configuration_longformer.py | https://huggingface.co/allenai/longformer-large-4096-extra.pos.embd.only/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/longformer/configuration_longformer.py | https://huggingface.co/roberta-base | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/led/tokenization_led_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/led/tokenization_led_fast.py | https://huggingface.co/allenai/led-base-16384/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/led/tokenization_led_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/led/tokenization_led_fast.py | https://huggingface.co/allenai/led-base-16384/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/led/tokenization_led_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/led/tokenization_led_fast.py | https://huggingface.co/allenai/led-base-16384/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/led/tokenization_led_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/led/tokenization_led.py | https://huggingface.co/allenai/led-base-16384/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/led/tokenization_led_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/led/tokenization_led.py | https://huggingface.co/allenai/led-base-16384/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/led/tokenization_led_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/led/tokenization_led.py | https://huggingface.co/allenai/led-base-16384/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/led/modeling_tf_led.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/led/modeling_tf_led.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/led/configuration_led.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/led/modeling_led.py | https://huggingface.co/models?filter=led | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/led/modeling_led.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/led/modeling_led.py | https://arxiv.org/abs/1910.13461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/led/modeling_led.py | https://arxiv.org/abs/2004.05150 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/led/modeling_led.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/led/modeling_led.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/led/modeling_led.py | https://github.com/huggingface/transformers/blob/ac3cb660cad283163f7c73cad511124e845ca388/src/transformers/models/bart/modeling_bart.py#L1153 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/led/configuration_led.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/led/configuration_led.py | https://huggingface.co/allenai/led-base-16384/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/led/configuration_led.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/led/configuration_led.py | https://huggingface.co/models?filter=led | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/led/configuration_led.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/led/configuration_led.py | https://huggingface.co/allenai/led-base-16384 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/led/configuration_led.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/layoutxlm/tokenization_layoutxlm_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/layoutxlm/tokenization_layoutxlm_fast.py | 
https://huggingface.co/docs/tokenizers/python/latest/components.html?highlight=BPE#models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/layoutxlm/tokenization_layoutxlm.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/layoutxlm/tokenization_layoutxlm.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/layoutxlm/tokenization_layoutxlm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/layoutxlm/tokenization_layoutxlm.py | https://github.com/huggingface/transformers/pull/2674 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/layoutlmv2/tokenization_layoutlmv2_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/layoutlmv2/tokenization_layoutlmv2_fast.py | https://huggingface.co/microsoft/layoutlmv2-base-uncased/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/layoutlmv2/tokenization_layoutlmv2_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/layoutlmv2/tokenization_layoutlmv2_fast.py | https://huggingface.co/microsoft/layoutlmv2-base-uncased/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mobilebert/tokenization_mobilebert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/layoutlmv2/tokenization_layoutlmv2_fast.py | https://github.com/huggingface/transformers/issues/328 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/layoutlmv2/tokenization_layoutlmv2_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/layoutlmv2/tokenization_layoutlmv2.py | https://huggingface.co/microsoft/layoutlmv2-base-uncased/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/layoutlmv2/tokenization_layoutlmv2.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/layoutlmv2/tokenization_layoutlmv2.py | https://huggingface.co/microsoft/layoutlmv2-large-uncased/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/layoutxlm/tokenization_layoutxlm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/layoutlmv2/tokenization_layoutlmv2.py | https://github.com/huggingface/transformers/pull/2674 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mobilebert/tokenization_mobilebert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/layoutlmv2/tokenization_layoutlmv2.py | https://github.com/huggingface/transformers/issues/328 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_chinese_ref.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/layoutlmv2/tokenization_layoutlmv2.py | https://en.wikipedia.org/wiki/CJK_Unified_Ideographs_ | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/src/transformers/models/layoutlmv2/configuration_layoutlmv2.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/layoutlmv2/modeling_layoutlmv2.py | https://huggingface.co/models?filter=layoutlmv2 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/switch_transformers/modeling_switch_transformers.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/layoutlmv2/modeling_layoutlmv2.py | https://github.com/tensorflow/mesh/blob/0cb87fe07da627bf0b7e60475d59f95ed6b5be3d/mesh_tensorflow/transformer/transformer_layers.py#L593 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/layoutlmv2/modeling_layoutlmv2.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/layoutlmv2/modeling_layoutlmv2.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/layoutlm.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/layoutlmv2/modeling_layoutlmv2.py | https://www.cs.cmu.edu/~aharley/rvl-cdip/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/layoutlmv3/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/layoutlmv2/modeling_layoutlmv2.py | https://guillaumejaume.github.io/FUNS | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/layoutlmv3/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/layoutlmv2/modeling_layoutlmv2.py | https://github.com/clovaai/co | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/tasks/document_question_answering.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/layoutlmv2/modeling_layoutlmv2.py | https://rrc.cvc.uab.es/?ch=17 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/layoutlmv2/configuration_layoutlmv2.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/layoutlmv2/configuration_layoutlmv2.py | https://huggingface.co/microsoft/layoutlmv2-base-uncased/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/layoutlmv2/configuration_layoutlmv2.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/layoutlmv2/configuration_layoutlmv2.py | https://huggingface.co/microsoft/layoutlmv2-large-uncased/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/layoutlmv2/configuration_layoutlmv2.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/layoutlmv2/configuration_layoutlmv2.py | https://huggingface.co/models?filter=layoutlmv2 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/tasks/document_question_answering.md | 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/layoutlmv2/configuration_layoutlmv2.py | https://huggingface.co/microsoft/layoutlmv2-base-uncased | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/layoutlmv2/configuration_layoutlmv2.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/layoutlmv2/configuration_layoutlmv2.py | https://github.com/microsoft/unilm/blob/master/layoutlmft/layoutlmft/models/layoutlmv2/detectron2_config.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/layoutlm/tokenization_layoutlm_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/layoutlm/tokenization_layoutlm_fast.py | https://huggingface.co/microsoft/layoutlm-base-uncased/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/layoutlm/tokenization_layoutlm_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/layoutlm/tokenization_layoutlm_fast.py | https://huggingface.co/microsoft/layoutlm-large-uncased/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/layoutlm/tokenization_layoutlm_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/layoutlm/tokenization_layoutlm_fast.py | https://huggingface.co/microsoft/layoutlm-base-uncased/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/layoutlm/tokenization_layoutlm_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/layoutlm/tokenization_layoutlm_fast.py | https://huggingface.co/microsoft/layoutlm-large-uncased/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/layoutlm/tokenization_layoutlm_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/layoutlm/tokenization_layoutlm.py | https://huggingface.co/microsoft/layoutlm-base-uncased/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/layoutlm/tokenization_layoutlm_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/layoutlm/tokenization_layoutlm.py | https://huggingface.co/microsoft/layoutlm-large-uncased/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/layoutlm/modeling_tf_layoutlm.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/layoutlm/modeling_layoutlm.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/layoutlm/modeling_layoutlm.py | https://arxiv.org/abs/1912.13318 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/layoutlm/modeling_layoutlm.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/layoutlm.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/layoutlm/modeling_layoutlm.py | https://www.cs.cmu.edu/~aharley/rvl-cdip/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/layoutlmv3/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/layoutlm/modeling_layoutlm.py | https://guillaumejaume.github.io/FUNSD/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/layoutlm.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/layoutlm/modeling_layoutlm.py | https://rrc.cvc.uab.es/?ch=13 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/layoutlm/configuration_layoutlm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/layoutlm/configuration_layoutlm.py | https://huggingface.co/microsoft/layoutlm-base-uncased/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/layoutlm/configuration_layoutlm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/layoutlm/configuration_layoutlm.py | https://huggingface.co/microsoft/layoutlm-large-uncased/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/layoutlm/tokenization_layoutlm_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/layoutlm/configuration_layoutlm.py | https://huggingface.co/microsoft/layoutlm-base-uncased | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/imagegpt/modeling_imagegpt.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/imagegpt/modeling_imagegpt.py | https://huggingface.co/models?filter=imagegpt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/imagegpt/modeling_imagegpt.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/imagegpt/modeling_imagegpt.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/gpt2.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/imagegpt/modeling_imagegpt.py | https://openai.com/blog/better-language-models/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/imagegpt/modeling_imagegpt.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/imagegpt/modeling_imagegpt.py | https://github.com/NVIDIA/Megatron-LM/blob/main/megatron/model/gpt_model.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/imagegpt/modeling_imagegpt.py | 
https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/imagegpt/modeling_imagegpt.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/ibert/modeling_ibert.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/ibert/modeling_ibert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/ibert/modeling_ibert.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/ibert/configuration_ibert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/ibert/configuration_ibert.py | https://huggingface.co/kssteven/ibert-roberta-base/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/ibert/configuration_ibert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/ibert/configuration_ibert.py | https://huggingface.co/kssteven/ibert-roberta-large/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/ibert/configuration_ibert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/ibert/configuration_ibert.py | https://huggingface.co/kssteven/ibert-roberta-large-mnli/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/question-answering/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/ibert/configuration_ibert.py | https://arxiv.org/abs/1803.02155 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/question-answering/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/ibert/configuration_ibert.py | https://arxiv.org/abs/2009.13658 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/hubert/modeling_hubert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/hubert/modeling_tf_hubert.py | https://huggingface.co/models?filter=hubert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/hubert/modeling_tf_hubert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/hubert/modeling_tf_hubert.py | https://github.com/tensorflow/tensorflow/issues/9260 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/hubert/modeling_tf_hubert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/hubert/modeling_tf_hubert.py | 
https://github.com/pytorch/fairseq/blob/e0788f7007a8473a76db573985031f3c94201e79/fairseq/data/data_utils.py#L376 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/hubert/modeling_tf_hubert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/hubert/modeling_tf_hubert.py | https://www.tensorflow.org/addons/api_docs/python/tfa/layers/GroupNormalization | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/hubert/modeling_tf_hubert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/hubert/modeling_tf_hubert.py | https://www.tensorflow.org/probability/api_docs/python/tfp/layers/weight_norm/WeightNorm | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/hubert/modeling_tf_hubert.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/hubert/modeling_tf_hubert.py | https://pytorch.org/docs/stable/generated/torch.nn.Conv1d.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/hubert/modeling_tf_hubert.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/hubert/modeling_tf_hubert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/hubert/modeling_hubert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/hubert/modeling_hubert.py | https://huggingface.co/models?filter=hubert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/hubert/modeling_hubert.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/hubert/modeling_hubert.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/hubert/modeling_hubert.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/hubert/modeling_hubert.py | https://pytorch.org/docs/stable/generated/torch.nn.Conv1d.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/hubert/modeling_hubert.py | https://arxiv.org/abs/2106.07447 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/hubert/modeling_hubert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/audio-classification/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/hubert/modeling_hubert.py | https://huggingface.co/facebook/hubert-base-ls960 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/hubert/configuration_hubert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/hubert/configuration_hubert.py | https://huggingface.co/facebook/hubert-base-ls960/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/hubert/modeling_hubert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/hubert/configuration_hubert.py | https://huggingface.co/models?filter=hubert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/audio-classification/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/hubert/configuration_hubert.py | https://huggingface.co/facebook/hubert-base-ls960 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/hubert/configuration_hubert.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/herbert/tokenization_herbert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/herbert/tokenization_herbert_fast.py | https://huggingface.co/allegro/herbert-base-cased/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/herbert/tokenization_herbert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/herbert/tokenization_herbert_fast.py | https://huggingface.co/allegro/herbert-base-cased/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/herbert/tokenization_herbert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/herbert/tokenization_herbert.py | https://huggingface.co/allegro/herbert-base-cased/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/herbert/tokenization_herbert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/herbert/tokenization_herbert.py | https://huggingface.co/allegro/herbert-base-cased/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/gptj/modeling_tf_gptj.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gptj/modeling_gptj.py | https://huggingface.co/models?filter=gptj | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gptj/modeling_gptj.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gptj/modeling_gptj.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/codegen/modeling_codegen.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gptj/modeling_gptj.py | https://github.com/EleutherAI/gpt-neo/blob/89ce74164da2fb16179106f54e2269b5da8db333/models/gpt2/gpt2.py#L179 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gptj/modeling_flax_gptj.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gptj/modeling_flax_gptj.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gptj/modeling_flax_gptj.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gptj/modeling_flax_gptj.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gptj/modeling_flax_gptj.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gptj/modeling_flax_gptj.py | https://github.com/google/flax/blob/491ce18759622506588784b4fca0e4bf05f8c8cd/flax/linen/attention.py#L252 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/gptj/configuration_gptj.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gptj/configuration_gptj.py | https://huggingface.co/EleutherAI/gpt-j-6B/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/src/transformers/models/gptj/configuration_gptj.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gptj/configuration_gptj.py | https://huggingface.co/models?filter=gpt_j | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/es/tasks/language_modeling.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gptj/configuration_gptj.py | https://huggingface.co/EleutherAI/gpt-j-6B | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/gpt2/tokenization_gpt2.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://huggingface.co/gpt2/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/gpt2/tokenization_gpt2.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://huggingface.co/gpt2-medium/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/gpt2/tokenization_gpt2.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://huggingface.co/gpt2-large/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/gpt2/tokenization_gpt2.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://huggingface.co/gpt2-xl/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/gpt2/tokenization_gpt2.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://huggingface.co/distilgpt2/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/gpt2/tokenization_gpt2.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://huggingface.co/gpt2/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/gpt2/tokenization_gpt2.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://huggingface.co/gpt2-medium/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/gpt2/tokenization_gpt2.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://huggingface.co/gpt2-large/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/gpt2/tokenization_gpt2.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://huggingface.co/gpt2-xl/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/gpt2/tokenization_gpt2.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://huggingface.co/distilgpt2/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/gpt2/tokenization_gpt2_fast.py | 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://huggingface.co/gpt2/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/gpt2/tokenization_gpt2_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://huggingface.co/gpt2-medium/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/gpt2/tokenization_gpt2_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://huggingface.co/gpt2-large/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/gpt2/tokenization_gpt2_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://huggingface.co/gpt2-xl/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/gpt2/tokenization_gpt2_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://huggingface.co/distilgpt2/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roberta/tokenization_roberta_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://docs.python.org/3/library/stdtypes.html#bytes.decode | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/gpt2/tokenization_gpt2.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2.py | https://huggingface.co/gpt2/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/gpt2/tokenization_gpt2.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2.py | https://huggingface.co/gpt2-medium/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/gpt2/tokenization_gpt2.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2.py | https://huggingface.co/gpt2-large/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/gpt2/tokenization_gpt2.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2.py | https://huggingface.co/gpt2-xl/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/gpt2/tokenization_gpt2.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2.py | https://huggingface.co/distilgpt2/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/gpt2/tokenization_gpt2.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2.py | https://huggingface.co/gpt2/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/gpt2/tokenization_gpt2.py | 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2.py | https://huggingface.co/gpt2-medium/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/gpt2/tokenization_gpt2.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2.py | https://huggingface.co/gpt2-large/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/gpt2/tokenization_gpt2.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2.py | https://huggingface.co/gpt2-xl/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/gpt2/tokenization_gpt2.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2.py | https://huggingface.co/distilgpt2/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roberta/tokenization_roberta_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2.py | https://docs.python.org/3/library/stdtypes.html#bytes.decode | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/gpt2.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_tf_gpt2.py | https://huggingface.co/models?filter=gpt2 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_tf_gpt2.py | https://github.com/tensorflow/mesh/blob/8d2465e9bc93129b913b5ccc6a59aa97abd96ec6/mesh_tensorflow/transformer/transformer_layers.py#L270 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_tf_gpt2.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/gpt2.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_gpt2.py | https://huggingface.co/models?filter=gpt2 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_gpt2.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_gpt2.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/gpt2.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_gpt2.py | https://openai.com/blog/better-language-models/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/imagegpt/modeling_imagegpt.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_gpt2.py | 
https://github.com/NVIDIA/Megatron-LM/blob/main/megatron/model/gpt_model.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_gpt2.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_flax_gpt2.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_flax_gpt2.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_flax_gpt2.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_flax_gpt2.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_flax_gpt2.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_flax_gpt2.py | https://github.com/google/flax/blob/491ce18759622506588784b4fca0e4bf05f8c8cd/flax/linen/attention.py#L252 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/gpt2/configuration_gpt2.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt2/configuration_gpt2.py | https://huggingface.co/gpt2/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/gpt2/configuration_gpt2.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt2/configuration_gpt2.py | https://huggingface.co/gpt2-medium/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/gpt2/configuration_gpt2.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt2/configuration_gpt2.py | 
https://huggingface.co/gpt2-large/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/gpt2/configuration_gpt2.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt2/configuration_gpt2.py | https://huggingface.co/gpt2-xl/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/gpt2/configuration_gpt2.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt2/configuration_gpt2.py | https://huggingface.co/distilgpt2/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt2/configuration_gpt2.py | https://huggingface.co/gpt2 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/gpt_neox/configuration_gpt_neox.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt_neo/modeling_gpt_neo.py | https://huggingface.co/models?filter=gpt_neo | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt_neo/modeling_gpt_neo.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt_neo/modeling_gpt_neo.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt_neo/modeling_gpt_neo.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/codegen/modeling_codegen.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt_neo/modeling_gpt_neo.py | https://github.com/EleutherAI/gpt-neo/blob/89ce74164da2fb16179106f54e2269b5da8db333/models/gpt2/gpt2.py#L179 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt_neo/modeling_flax_gpt_neo.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt_neo/modeling_flax_gpt_neo.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt_neo/modeling_flax_gpt_neo.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt_neo/modeling_flax_gpt_neo.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt_neo/modeling_flax_gpt_neo.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt_neo/modeling_flax_gpt_neo.py | https://github.com/google/flax/blob/491ce18759622506588784b4fca0e4bf05f8c8cd/flax/linen/attention.py#L252 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/gpt_neo/configuration_gpt_neo.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt_neo/configuration_gpt_neo.py | https://huggingface.co/EleutherAI/gpt-neo-1.3B/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/gpt_neox/configuration_gpt_neox.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt_neo/configuration_gpt_neo.py | https://huggingface.co/models?filter=gpt_neo | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/gpt_neo/configuration_gpt_neo.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt_neo/configuration_gpt_neo.py | https://huggingface.co/EleutherAI/gpt-neo-1.3B | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/funnel/tokenization_funnel_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/small/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/funnel/tokenization_funnel_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/small-base/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/funnel/tokenization_funnel_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/medium/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/funnel/tokenization_funnel_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/medium-base/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/funnel/tokenization_funnel_fast.py | 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/intermediate/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/funnel/tokenization_funnel_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/intermediate-base/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/funnel/tokenization_funnel_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/large/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/funnel/tokenization_funnel_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/large-base/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/funnel/tokenization_funnel_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/xlarge/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/funnel/tokenization_funnel_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/xlarge-base/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/funnel/tokenization_funnel_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/small/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/funnel/tokenization_funnel_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/small-base/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/funnel/tokenization_funnel_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/medium/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/funnel/tokenization_funnel_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/medium-base/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/funnel/tokenization_funnel_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/intermediate/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/src/transformers/models/funnel/tokenization_funnel_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/intermediate-base/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/funnel/tokenization_funnel_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/large/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/funnel/tokenization_funnel_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/large-base/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/funnel/tokenization_funnel_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/xlarge/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/funnel/tokenization_funnel_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/xlarge-base/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/funnel/tokenization_funnel_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel.py | https://huggingface.co/funnel-transformer/small/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/funnel/tokenization_funnel_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel.py | https://huggingface.co/funnel-transformer/small-base/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/funnel/tokenization_funnel_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel.py | https://huggingface.co/funnel-transformer/medium/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/funnel/tokenization_funnel_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel.py | https://huggingface.co/funnel-transformer/medium-base/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/funnel/tokenization_funnel_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel.py | https://huggingface.co/funnel-transformer/intermediate/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/funnel/tokenization_funnel_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel.py | https://huggingface.co/funnel-transformer/intermediate-base/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/src/transformers/models/funnel/tokenization_funnel_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel.py | https://huggingface.co/funnel-transformer/large/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/funnel/tokenization_funnel_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel.py | https://huggingface.co/funnel-transformer/large-base/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/funnel/tokenization_funnel_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel.py | https://huggingface.co/funnel-transformer/xlarge/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/funnel/tokenization_funnel_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel.py | https://huggingface.co/funnel-transformer/xlarge-base/resolve/main/vocab.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/funnel/modeling_tf_funnel.py | https://arxiv.org/abs/2006.03236 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/funnel/modeling_tf_funnel.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/funnel/modeling_funnel.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/funnel/modeling_funnel.py | https://arxiv.org/abs/2006.03236 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/funnel/modeling_funnel.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/funnel/configuration_funnel.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/funnel/configuration_funnel.py | https://huggingface.co/funnel-transformer/small/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/funnel/configuration_funnel.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/funnel/configuration_funnel.py | https://huggingface.co/funnel-transformer/small-base/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/funnel/configuration_funnel.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/funnel/configuration_funnel.py | https://huggingface.co/funnel-transformer/medium/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/src/transformers/models/funnel/configuration_funnel.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/funnel/configuration_funnel.py | https://huggingface.co/funnel-transformer/medium-base/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/funnel/configuration_funnel.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/funnel/configuration_funnel.py | https://huggingface.co/funnel-transformer/intermediate/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/funnel/configuration_funnel.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/funnel/configuration_funnel.py | https://huggingface.co/funnel-transformer/intermediate-base/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/funnel/configuration_funnel.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/funnel/configuration_funnel.py | https://huggingface.co/funnel-transformer/large/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/funnel/configuration_funnel.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/funnel/configuration_funnel.py | https://huggingface.co/funnel-transformer/large-base/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/funnel/configuration_funnel.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/funnel/configuration_funnel.py | https://huggingface.co/funnel-transformer/xlarge/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/funnel/configuration_funnel.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/funnel/configuration_funnel.py | https://huggingface.co/funnel-transformer/xlarge-base/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/funnel/tokenization_funnel_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/funnel/configuration_funnel.py | https://huggingface.co/funnel-transformer/small | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/fsmt/tokenization_fsmt.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/fsmt/tokenization_fsmt.py | https://huggingface.co/stas/tiny-wmt19-en-de/resolve/main/vocab-src.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/fsmt/tokenization_fsmt.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/fsmt/tokenization_fsmt.py | https://huggingface.co/stas/tiny-wmt19-en-de/resolve/main/vocab-tgt.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/fsmt/tokenization_fsmt.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/fsmt/tokenization_fsmt.py | https://huggingface.co/stas/tiny-wmt19-en-de/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/fsmt/tokenization_fsmt.py | https://github.com/moses-smt/mosesdecoder/blob/master/scripts/tokenizer/replace-unicode-punctuation.perl | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/fsmt/tokenization_fsmt.py | https://github.com/moses-smt/mosesdecoder/blob/master/scripts/tokenizer/remove-non-printing-char.perl | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/fsmt/tokenization_fsmt.py | https://github.com/alvations/sacremoses | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/fsmt.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/fsmt/modeling_fsmt.py | https://github.com/pytorch/fairseq/tree/master/examples/wmt19 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/fsmt.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/fsmt/modeling_fsmt.py | https://arxiv.org/abs/1907.06616 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/fsmt/modeling_fsmt.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/fsmt/modeling_fsmt.py | https://huggingface.co/models?filter=fsmt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/eval-facebook-wmt19.sh | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/fsmt/modeling_fsmt.py | http://matrix.statmt.org/matrix/output/1914?score_id=37605 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-facebook-wmt19.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/fsmt/modeling_fsmt.py | http://matrix.statmt.org/matrix/output/1907?run_id=6937 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-facebook-wmt19.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/fsmt/modeling_fsmt.py | http://matrix.statmt.org/matrix/output/1902?run_id=6750 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-facebook-wmt19.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/fsmt/modeling_fsmt.py | http://matrix.statmt.org/matrix/output/1909?run_id=6862 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/fsmt/modeling_fsmt.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/fsmt/modeling_fsmt.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/fnet/tokenization_fnet.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/fnet/tokenization_fnet_fast.py | 
https://huggingface.co/google/fnet-base/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/fnet/tokenization_fnet.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/fnet/tokenization_fnet_fast.py | https://huggingface.co/google/fnet-large/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/fnet/tokenization_fnet_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/fnet/tokenization_fnet_fast.py | https://huggingface.co/google/fnet-base/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/fnet/tokenization_fnet_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/fnet/tokenization_fnet_fast.py | https://huggingface.co/google/fnet-large/resolve/main/tokenizer.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/pegasus/tokenization_pegasus_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/fnet/tokenization_fnet_fast.py | https://huggingface.co/docs/tokenizers/python/latest/components.html?highlight=unigram#models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/fnet/tokenization_fnet_fast.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/fnet/tokenization_fnet.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/fnet/tokenization_fnet.py | https://huggingface.co/google/fnet-base/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/fnet/tokenization_fnet.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/fnet/tokenization_fnet.py | https://huggingface.co/google/fnet-large/resolve/main/spiece.model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/fnet/tokenization_fnet.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/fnet/tokenization_fnet.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/fnet/modeling_fnet.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/fnet/modeling_fnet.py | https://huggingface.co/models?filter=fnet | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/fnet/modeling_fnet.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/fnet/modeling_fnet.py | https://github.com/google-research/google-research/blob/master/f_net/fourier.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/fnet/modeling_fnet.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/fnet/modeling_fnet.py | https://pytorch.org/docs/master/generated/torch.vmap.html | 
模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/fnet/modeling_fnet.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/fnet/modeling_fnet.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/fnet/modeling_fnet.py | https://arxiv.org/abs/2105.03824 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/fnet/configuration_fnet.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/fnet/configuration_fnet.py | https://huggingface.co/google/fnet-base/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/fnet/configuration_fnet.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/fnet/configuration_fnet.py | https://huggingface.co/google/fnet-large/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/fnet/modeling_fnet.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/fnet/configuration_fnet.py | https://huggingface.co/models?filter=fnet | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/fnet/tokenization_fnet.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/fnet/configuration_fnet.py | https://huggingface.co/google/fnet-base | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/flaubert/tokenization_flaubert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/flaubert/tokenization_flaubert.py | https://huggingface.co/flaubert/flaubert_small_cased/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/flaubert/tokenization_flaubert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/flaubert/tokenization_flaubert.py | https://huggingface.co/flaubert/flaubert_base_uncased/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/flaubert/tokenization_flaubert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/flaubert/tokenization_flaubert.py | https://huggingface.co/flaubert/flaubert_base_cased/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/flaubert/tokenization_flaubert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/flaubert/tokenization_flaubert.py | https://huggingface.co/flaubert/flaubert_large_cased/resolve/main/vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/flaubert/tokenization_flaubert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/flaubert/tokenization_flaubert.py | 
https://huggingface.co/flaubert/flaubert_small_cased/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/flaubert/tokenization_flaubert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/flaubert/tokenization_flaubert.py | https://huggingface.co/flaubert/flaubert_base_uncased/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/flaubert/tokenization_flaubert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/flaubert/tokenization_flaubert.py | https://huggingface.co/flaubert/flaubert_base_cased/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/flaubert/tokenization_flaubert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/flaubert/tokenization_flaubert.py | https://huggingface.co/flaubert/flaubert_large_cased/resolve/main/merges.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/flaubert/tokenization_flaubert.py | https://github.com/alvations/sacremoses | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/flaubert.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/flaubert/modeling_tf_flaubert.py | https://huggingface.co/models?filter=flaubert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/flaubert/modeling_tf_flaubert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/flaubert.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/flaubert/modeling_flaubert.py | https://huggingface.co/models?filter=flaubert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/flaubert/modeling_flaubert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/flaubert/configuration_flaubert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/flaubert/configuration_flaubert.py | https://huggingface.co/flaubert/flaubert_small_cased/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/flaubert/configuration_flaubert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/flaubert/configuration_flaubert.py | https://huggingface.co/flaubert/flaubert_base_uncased/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/flaubert/configuration_flaubert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/flaubert/configuration_flaubert.py | https://huggingface.co/flaubert/flaubert_base_cased/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/flaubert/configuration_flaubert.py | 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/flaubert/configuration_flaubert.py | https://huggingface.co/flaubert/flaubert_large_cased/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/configuration_xlm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/flaubert/configuration_flaubert.py | http://huggingface.co/transformers/multilingual.html#xlm-language-embeddings | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/encoder_decoder/modeling_tf_encoder_decoder.py | https://arxiv.org/abs/1907.12461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/encoder_decoder/modeling_tf_encoder_decoder.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/encoder_decoder/modeling_flax_encoder_decoder.py | https://arxiv.org/abs/1907.12461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/encoder_decoder/modeling_flax_encoder_decoder.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/encoder_decoder/modeling_encoder_decoder.py | https://arxiv.org/abs/1907.12461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/encoder_decoder/modeling_encoder_decoder.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/electra.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/electra/modeling_tf_electra.py | https://huggingface.co/models?filter=electra | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/electra/modeling_tf_electra.py | https://github.com/tensorflow/mesh/blob/8d2465e9bc93129b913b5ccc6a59aa97abd96ec6/mesh_tensorflow/transformer/transformer_layers.py#L270 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/electra/modeling_tf_electra.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/electra/modeling_flax_electra.py | 
https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/electra/modeling_flax_electra.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/electra/modeling_flax_electra.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/electra/modeling_flax_electra.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/electra/modeling_flax_electra.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/electra.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/electra/modeling_electra.py | https://huggingface.co/models?filter=electra | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/electra/modeling_electra.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/electra/modeling_electra.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/electra/modeling_electra.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/electra/configuration_electra.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/electra/configuration_electra.py | https://huggingface.co/google/electra-small-discriminator | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/question-answering/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/electra/configuration_electra.py | https://arxiv.org/abs/1803.02155 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/question-answering/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/electra/configuration_electra.py | https://arxiv.org/abs/2009.13658 | 
参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/dpr/modeling_tf_dpr.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/dpr/modeling_dpr.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/dpr/modeling_dpr.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/dpr/convert_dpr_original_checkpoint_to_pytorch.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/dpr/convert_dpr_original_checkpoint_to_pytorch.py | https://github.com/huggingface/transformers/commit/614fef1691edb806de976756d4948ecbcd0c0ca3 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/rag/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/dpr/convert_dpr_original_checkpoint_to_pytorch.py | https://github.com/facebookresearch/DPR | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/question-answering/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/dpr/configuration_dpr.py | https://arxiv.org/abs/1803.02155 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/question-answering/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/dpr/configuration_dpr.py | https://arxiv.org/abs/2009.13658 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/dit/convert_dit_unilm_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/dit/convert_dit_unilm_to_pytorch.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/dit/convert_dit_unilm_to_pytorch.py | https://layoutlm.blob.core.windows.net/dit/dit-pts/dit-base-224-p16-500k-62d53a.pth | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/distilbert.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/distilbert/modeling_tf_distilbert.py | https://huggingface.co/models?filter=distilbert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/distilbert/modeling_tf_distilbert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/distilbert/modeling_flax_distilbert.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型相关说明 
| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/distilbert/modeling_flax_distilbert.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/distilbert/modeling_flax_distilbert.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/distilbert/modeling_flax_distilbert.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/distilbert/modeling_flax_distilbert.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/distillation/distiller.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/distilbert/modeling_distilbert.py | https://github.com/facebookresearch/XLM | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/distilbert/modeling_distilbert.py | https://github.com/google-research/bert | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/distilbert.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/distilbert/modeling_distilbert.py | https://huggingface.co/models?filter=distilbert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/distilbert/modeling_distilbert.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/distilbert/modeling_distilbert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/distilbert/configuration_distilbert.py | https://huggingface.co/distilbert-base-uncased | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/detr/modeling_detr.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | https://huggingface.co/models?filter=detr | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/src/transformers/models/detr/modeling_detr.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | https://github.com/facebookresearch/detr/blob/master/backbone.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/yolos/modeling_yolos.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | https://github.com/facebookresearch/detr/blob/master/models/detr.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/detr/modeling_detr.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | https://github.com/facebookresearch/detr/blob/master/models/segmentation.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/yolos/modeling_yolos.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | https://arxiv.org/abs/1708.02002 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/yolos/modeling_yolos.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | https://github.com/facebookresearch/detr/issues/108#issuecomment-650269223 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/detr/modeling_detr.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | https://github.com/facebookresearch/detr/blob/master/models/matcher.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/detr/modeling_detr.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | https://github.com/facebookresearch/detr/blob/master/util/box_ops.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/detr.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | https://giou.stanford.edu/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/detr/modeling_detr.py | 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | https://github.com/facebookresearch/detr/blob/master/util/misc.py#L306 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/detr/modeling_detr.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/detr/feature_extraction_detr.py | https://github.com/facebookresearch/detr/blob/master/util/box_ops.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/yolos/image_processing_yolos.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/detr/feature_extraction_detr.py | https://github.com/pytorch/pytorch/issues/50276 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/image_transforms.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/detr/feature_extraction_detr.py | https://github.com/cocodataset/panopticapi/blob/master/panopticapi/utils.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/detr/image_processing_detr.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/detr/feature_extraction_detr.py | https://github.com/facebookresearch/detr/blob/master/datasets/coco.py#L33 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/detr/image_processing_detr.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/detr/feature_extraction_detr.py | https://github.com/facebookresearch/detr/blob/master/datasets/coco.py#L50 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/detr/image_processing_detr.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/detr/feature_extraction_detr.py | https://github.com/facebookresearch/detr/blob/master/models/detr.py#L258 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/detr/image_processing_detr.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/detr/feature_extraction_detr.py | https://github.com/facebookresearch/detr/blob/master/models/segmentation.py#L218 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/detr/image_processing_detr.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/detr/feature_extraction_detr.py | https://github.com/facebookresearch/detr/blob/master/models/segmentation.py#L241 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/detr/convert_detr_original_pytorch_checkpoint_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/detr/modeling_detr.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/detr/configuration_detr.py | https://huggingface.co/models?filter=detr | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/detr/configuration_detr.py | https://huggingface.co/facebook/detr-resnet-50 | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/detr/configuration_detr.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/detr/configuration_detr.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/detr/configuration_detr.py | https://rwightman.github.io/pytorch-image-models/#load-a-pretrained-model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/deit/modeling_deit.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deit/modeling_deit.py | https://huggingface.co/models?filter=deit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vit/modeling_tf_vit.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deit/modeling_deit.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/vision_transformer.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deit/modeling_deit.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deit/modeling_deit.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/image-pretraining/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deit/modeling_deit.py | https://arxiv.org/abs/2111.09886 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deit/modeling_deit.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deit/convert_deit_timm_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/deit/convert_deit_timm_to_pytorch.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deit/convert_deit_timm_to_pytorch.py | https://github.com/facebookresearch/deit/blob/ab5715372db8c6cad5740714b2216d55aeae052e/datasets.py#L103 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/deit/modeling_deit.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deit/configuration_deit.py | https://huggingface.co/models?filter=deit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/deit/configuration_deit.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deit/configuration_deit.py | https://huggingface.co/facebook/deit-base-distilled-patch16-224 | 模型相关说明 | 
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta_v2/tokenization_deberta_v2.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta_v2/tokenization_deberta_v2.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/deberta_v2/modeling_tf_deberta_v2.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta_v2/modeling_tf_deberta_v2.py | https://huggingface.co/models?filter=deberta-v2 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta_v2/modeling_tf_deberta_v2.py | https://arxiv.org/abs/2006.03654 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta_v2/modeling_tf_deberta_v2.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta_v2/modeling_deberta_v2.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta_v2/modeling_deberta_v2.py | https://arxiv.org/abs/2006.03654 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta_v2/modeling_deberta_v2.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/deberta/tokenization_deberta_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta_v2/configuration_deberta_v2.py | https://huggingface.co/microsoft/deberta-base | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/deberta/modeling_tf_deberta.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta/modeling_tf_deberta.py | https://huggingface.co/models?filter=DeBERTa | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta/modeling_tf_deberta.py | https://arxiv.org/abs/2006.03654 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta/modeling_tf_deberta.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta/modeling_deberta.py | 
https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta/modeling_deberta.py | https://arxiv.org/abs/2006.03654 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta/modeling_deberta.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/deberta/tokenization_deberta_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta/configuration_deberta.py | https://huggingface.co/microsoft/deberta-base | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/data2vec/modeling_data2vec_text.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_text.py | https://huggingface.co/models?filter=data2vec-text | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_text.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/data2vec.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_text.py | https://arxiv.org/pdf/2202.03555 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_text.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_text.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/data2vec/modeling_data2vec_audio.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_audio.py | https://huggingface.co/models?filter=data2vec-audio | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_audio.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_audio.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_audio.py | 
https://pytorch.org/docs/stable/generated/torch.nn.Conv1d.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/data2vec.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_audio.py | https://arxiv.org/pdf/2202.03555 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_audio.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/data2vec/modeling_data2vec_audio.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_audio.py | https://huggingface.co/facebook/data2vec-audio-base-960h | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/data2vec/convert_data2vec_text_original_pytorch_checkpoint_to_pytorch.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/data2vec/convert_data2vec_text_original_pytorch_checkpoint_to_pytorch.py | https://dl.fbaipublicfiles.com/fairseq/models/roberta.large.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/data2vec/convert_data2vec_text_original_pytorch_checkpoint_to_pytorch.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/data2vec/convert_data2vec_text_original_pytorch_checkpoint_to_pytorch.py | https://github.com/pytorch/fairseq/blob/main/examples/data2vec/models/data2vec_text.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/data2vec/convert_data2vec_audio_original_pytorch_checkpoint_to_pytorch.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/data2vec/convert_data2vec_audio_original_pytorch_checkpoint_to_pytorch.py | https://github.com/pytorch/fairseq/blob/main/examples/data2vec/models/data2vec_audio.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/data2vec/configuration_data2vec_text.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/data2vec/configuration_data2vec_text.py | https://huggingface.co/facebook/data2vec-text-base | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/question-answering/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/data2vec/configuration_data2vec_text.py | https://arxiv.org/abs/1803.02155 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/question-answering/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/data2vec/configuration_data2vec_text.py | https://arxiv.org/abs/2009.13658 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/data2vec/modeling_data2vec_audio.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/data2vec/configuration_data2vec_audio.py | https://huggingface.co/models?filter=data2vec-audio | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/data2vec/modeling_data2vec_audio.py | 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/data2vec/configuration_data2vec_audio.py | https://huggingface.co/facebook/data2vec-audio-base-960h | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/data2vec/configuration_data2vec_audio.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/ctrl.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/ctrl/modeling_tf_ctrl.py | https://huggingface.co/models?filter=ctrl | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/ctrl/modeling_tf_ctrl.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/ctrl.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/ctrl/modeling_ctrl.py | https://huggingface.co/models?filter=ctrl | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/ctrl/modeling_ctrl.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/ctrl/modeling_ctrl.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/cpm/tokenization_cpm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/cpm/tokenization_cpm_fast.py | https://pypi.org/project/jieba/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/cpm/tokenization_cpm_fast.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/cpm/tokenization_cpm.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/cpm/tokenization_cpm.py | https://pypi.org/project/jieba/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/cpm/tokenization_cpm.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convnext/modeling_tf_convnext.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convnext/modeling_tf_convnext.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/convnext/configuration_convnext.py | 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convnext/modeling_convnext.py | https://huggingface.co/models?filter=convnext | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/donut/modeling_donut_swin.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convnext/modeling_convnext.py | https://github.com/tensorflow/tpu/issues/494#issuecomment-532968956 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convnext/modeling_convnext.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convnext/modeling_convnext.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/convnextv2.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://github.com/facebookresearch/ConvNeXt | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/convnext/convert_convnext_to_pytorch.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://github.com/google-research/big_transfer/issues/18 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/convnext/configuration_convnext.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convnext/configuration_convnext.py | https://huggingface.co/models?filter=convnext | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/convnext/configuration_convnext.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convnext/configuration_convnext.py | https://huggingface.co/facebook/convnext-tiny-224 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/convbert.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convbert/modeling_tf_convbert.py | https://huggingface.co/models?filter=convbert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convbert/modeling_tf_convbert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/convbert.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convbert/modeling_convbert.py | https://huggingface.co/models?filter=convbert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convbert/modeling_convbert.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convbert/modeling_convbert.py | 
https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convbert/modeling_convbert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/convbert.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convbert/configuration_convbert.py | https://huggingface.co/models?filter=convbert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/convbert/tokenization_convbert_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convbert/configuration_convbert.py | https://huggingface.co/YituTech/conv-bert-base | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/clip/tokenization_clip_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/clip/tokenization_clip_fast.py | https://github.com/huggingface/tokenizers/issues/872 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roberta/tokenization_roberta_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/clip/tokenization_clip.py | https://docs.python.org/3/library/stdtypes.html#bytes.decode | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/tensorflow/contrastive-image-text/run_clip.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/clip/modeling_tf_clip.py | https://huggingface.co/models?filter=clip | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/clip/modeling_tf_clip.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/clip/modeling_tf_clip.py | https://sachinruk.github.io/blog/pytorch/pytorch%20lightning/loss%20function/gpu/2021/03/07/CLIP.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/clip/modeling_clip.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/clip/modeling_tf_clip.py | https://github.com/openai/CLIP/blob/cfcffb90e69f37bf2ff1e988237a0fbe41f33c04/clip/model.py#L324 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/clip/modeling_tf_clip.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/clip/modeling_tf_clip.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/clip/modeling_flax_clip.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/clip/modeling_flax_clip.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/clip/modeling_flax_clip.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/clip/modeling_flax_clip.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/clip/modeling_flax_clip.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/clip/modeling_flax_clip.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/tensorflow/contrastive-image-text/run_clip.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/clip/modeling_clip.py | https://huggingface.co/models?filter=clip | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/clip/modeling_tf_clip.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/clip/modeling_clip.py | https://sachinruk.github.io/blog/pytorch/pytorch%20lightning/loss%20function/gpu/2021/03/07/CLIP.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/clip/modeling_clip.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/clip/modeling_clip.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/clip/modeling_clip.py | https://github.com/openai/CLIP/blob/cfcffb90e69f37bf2ff1e988237a0fbe41f33c04/clip/model.py#L324 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/clip/modeling_clip.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/tensorflow/contrastive-image-text/run_clip.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/clip/configuration_clip.py | https://huggingface.co/models?filter=clip | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/clip/configuration_clip.py | 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/clip/configuration_clip.py | https://huggingface.co/openai/clip-vit-base-patch32 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/canine/tokenization_canine.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/canine/tokenization_canine.py | https://github.com/google-research/language/blob/master/language/canine/special_codepoints.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/canine/modeling_canine.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/canine/modeling_canine.py | https://huggingface.co/models?filter=canine | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/canine/modeling_canine.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/canine/modeling_canine.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/canine/modeling_canine.py | https://github.com/google-research/big_transfer/blob/49afe42338b62af9fbe18f0258197a33ee578a6b/bit_tf2/models.py#L36-L38 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/canine/modeling_canine.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/canine/modeling_canine.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/canine/modeling_canine.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/canine/configuration_canine.py | https://huggingface.co/models?filter=canine | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/canine.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/canine/configuration_canine.py | https://huggingface.co/google/canine-s | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/layoutxlm/tokenization_layoutxlm_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/camembert/tokenization_camembert_fast.py | https://huggingface.co/docs/tokenizers/python/latest/components.html?highlight=BPE#models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/camembert/tokenization_camembert_fast.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/camembert/tokenization_camembert.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/camembert/tokenization_camembert.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/camembert/modeling_camembert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/camembert/modeling_tf_camembert.py | https://huggingface.co/models?filter=camembert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/camembert/modeling_tf_camembert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/camembert/modeling_camembert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/camembert/modeling_camembert.py | https://huggingface.co/models?filter=camembert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/camembert/modeling_camembert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/byt5/tokenization_byt5.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/byt5/tokenization_byt5.py | https://github.com/google-research/text-to-text-transfer-transformer/blob/9fd7b14a769417be33bc6c850f9598764913c833/t5/data/preprocessors.py#L2117 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/deprecated/bort/convert_bort_original_gluonnlp_checkpoint_to_pytorch.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bort/convert_bort_original_gluonnlp_checkpoint_to_pytorch.py | https://github.com/alexa/bort/blob/master/bort/bort.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_tf_blenderbot_small.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_tf_blenderbot_small.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_flax_blenderbot_small.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_flax_blenderbot_small.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_flax_blenderbot_small.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_flax_blenderbot_small.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_flax_blenderbot_small.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_flax_blenderbot_small.py | https://arxiv.org/abs/1910.13461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_flax_blenderbot_small.py | https://github.com/google/flax/blob/491ce18759622506588784b4fca0e4bf05f8c8cd/flax/linen/attention.py#L252 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_flax_blenderbot_small.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/blenderbot_small/configuration_blenderbot_small.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_blenderbot_small.py | https://huggingface.co/models?filter=blenderbot_small | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_blenderbot_small.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_blenderbot_small.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/blenderbot_small/configuration_blenderbot_small.py | 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/blenderbot_small/configuration_blenderbot_small.py | https://huggingface.co/models?filter=blenderbot_small | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/blenderbot_small/configuration_blenderbot_small.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/blenderbot_small/configuration_blenderbot_small.py | https://huggingface.co/facebook/blenderbot_small-90M | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/blenderbot_small/configuration_blenderbot_small.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_tf_blenderbot.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_tf_blenderbot.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_flax_blenderbot.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_flax_blenderbot.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_flax_blenderbot.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_flax_blenderbot.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_flax_blenderbot.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_flax_blenderbot.py | https://arxiv.org/abs/1910.13461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_flax_blenderbot.py | https://github.com/google/flax/blob/491ce18759622506588784b4fca0e4bf05f8c8cd/flax/linen/attention.py#L252 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_flax_blenderbot.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/blenderbot_small/configuration_blenderbot_small.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_blenderbot.py | https://huggingface.co/models?filter=blenderbot | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_blenderbot.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_blenderbot.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/blenderbot_small/configuration_blenderbot_small.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/blenderbot/configuration_blenderbot.py | https://huggingface.co/models?filter=blenderbot | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/blenderbot/configuration_blenderbot.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/blenderbot/configuration_blenderbot.py | https://huggingface.co/facebook/blenderbot-3B | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/blenderbot/configuration_blenderbot.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/bigbird_pegasus/modeling_bigbird_pegasus.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bigbird_pegasus/modeling_bigbird_pegasus.py | https://huggingface.co/models?filter=bigbird_pegasus | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bigbird_pegasus/modeling_bigbird_pegasus.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bigbird_pegasus/modeling_bigbird_pegasus.py | https://arxiv.org/abs/1910.13461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bigbird_pegasus/modeling_bigbird_pegasus.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/bigbird_pegasus/modeling_bigbird_pegasus.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bigbird_pegasus/configuration_bigbird_pegasus.py | https://huggingface.co/models?filter=bigbird_pegasus | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/bigbird_pegasus/configuration_bigbird_pegasus.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bigbird_pegasus/configuration_bigbird_pegasus.py | https://huggingface.co/google/bigbird-pegasus-large-arxiv | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bigbird_pegasus/configuration_bigbird_pegasus.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/pegasus/tokenization_pegasus_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/big_bird/tokenization_big_bird_fast.py | https://huggingface.co/docs/tokenizers/python/latest/components.html?highlight=unigram#models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/big_bird/tokenization_big_bird_fast.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/big_bird/tokenization_big_bird.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/big_bird/tokenization_big_bird.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/big_bird/modeling_flax_big_bird.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/big_bird/modeling_flax_big_bird.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/big_bird/modeling_flax_big_bird.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/big_bird/modeling_flax_big_bird.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/big_bird/modeling_flax_big_bird.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/big_bird/modeling_big_bird.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/big_bird/modeling_big_bird.py | https://huggingface.co/models?filter=big_bird | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/big_bird/modeling_big_bird.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/big_bird/modeling_big_bird.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/big_bird/modeling_big_bird.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/big_bird/modeling_big_bird.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/big_bird/modeling_big_bird.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/big_bird/configuration_big_bird.py | https://huggingface.co/models?filter=big_bird | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/big_bird/tokenization_big_bird.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/big_bird/configuration_big_bird.py | https://huggingface.co/google/bigbird-roberta-base | 模型相关说明 | -| 
开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/bertweet/tokenization_bertweet.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bertweet/tokenization_bertweet.py | cgpotts@stanford.edu | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/bertweet/tokenization_bertweet.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bertweet/tokenization_bertweet.py | ewan@inf.ed.ac.uk | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/bertweet/tokenization_bertweet.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bertweet/tokenization_bertweet.py | http://nltk.org/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/bertweet/tokenization_bertweet.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bertweet/tokenization_bertweet.py | https://github.com/nltk/nltk/issues/2409 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/bertweet/tokenization_bertweet.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bertweet/tokenization_bertweet.py | http://en.wikipedia.org/wiki/List_of_emoticons | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/bertweet/tokenization_bertweet.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bertweet/tokenization_bertweet.py | https://gist.github.com/winzig/8894715 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/bertweet/tokenization_bertweet.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bertweet/tokenization_bertweet.py | foo.na@example.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/bertweet/tokenization_bertweet.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bertweet/tokenization_bertweet.py | https://github.com/scrapy/w3lib/blob/master/w3lib/html.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/bertweet/tokenization_bertweet.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bertweet/tokenization_bertweet.py | https://en.wikipedia.org/wiki/ISO/IEC_8859-1#Similar_character_sets | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/bert_japanese/tokenization_bert_japanese.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert_japanese/tokenization_bert_japanese.py | https://pypi.org/project/fugashi/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/bert_japanese/tokenization_bert_japanese.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert_japanese/tokenization_bert_japanese.py | https://github.com/polm/ipadic-py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/bert_japanese/tokenization_bert_japanese.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert_japanese/tokenization_bert_japanese.py | https://github.com/polm/unidic-lite | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/bert_japanese/tokenization_bert_japanese.py | 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert_japanese/tokenization_bert_japanese.py | https://github.com/polm/unidic-py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert_generation/tokenization_bert_generation.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert_generation/tokenization_bert_generation.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert_generation/modeling_bert_generation.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert_generation/modeling_bert_generation.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert_generation/modeling_bert_generation.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert_generation/modeling_bert_generation.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert_generation/modeling_bert_generation.py | https://arxiv.org/abs/1907.12461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/question-answering/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert_generation/configuration_bert_generation.py | https://arxiv.org/abs/1803.02155 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/question-answering/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert_generation/configuration_bert_generation.py | https://arxiv.org/abs/2009.13658 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mobilebert/tokenization_mobilebert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://github.com/huggingface/transformers/issues/328 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mobilebert/tokenization_mobilebert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | https://github.com/huggingface/transformers/issues/328 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_chinese_ref.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | 
https://en.wikipedia.org/wiki/CJK_Unified_Ideographs_ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/bert.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/modeling_tf_bert.py | https://huggingface.co/models?filter=bert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/modeling_tf_bert.py | https://github.com/tensorflow/mesh/blob/8d2465e9bc93129b913b5ccc6a59aa97abd96ec6/mesh_tensorflow/transformer/transformer_layers.py#L270 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/modeling_tf_bert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/modeling_flax_bert.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/modeling_flax_bert.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/modeling_flax_bert.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/modeling_flax_bert.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/modeling_flax_bert.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/bert.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/modeling_bert_benchmark.py | https://huggingface.co/models?filter=bert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/modeling_bert_benchmark.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/modeling_bert_benchmark.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/modeling_bert_benchmark.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/modeling_bert_benchmark.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/bert.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/modeling_bert.py | https://huggingface.co/models?filter=bert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/modeling_bert.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/modeling_bert.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/modeling_bert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/modeling_bert.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/bert.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/models?filter=bert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/jax-projects/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/bert-base-uncased | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/question-answering/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://arxiv.org/abs/1803.02155 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/question-answering/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://arxiv.org/abs/2009.13658 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/beit/modeling_flax_beit.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/beit/modeling_flax_beit.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/beit/modeling_flax_beit.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/beit/modeling_flax_beit.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/beit/modeling_flax_beit.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/beit/modeling_flax_beit.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/beit/modeling_beit.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/beit/modeling_beit.py | https://huggingface.co/models?filter=beit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/vision_text_dual_encoder/test_modeling_vision_text_dual_encoder.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/beit/modeling_beit.py | https://github.com/rwightman/pytorch-image-models/blob/b9bd960a032c75ca6b808ddeed76bee5f3ed4972/timm/models/layers/helpers.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/donut/modeling_donut_swin.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/beit/modeling_beit.py | https://github.com/tensorflow/tpu/issues/494#issuecomment-532968956 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vit/modeling_tf_vit.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/beit/modeling_beit.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/vision_transformer.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/beit/modeling_beit.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/beit/modeling_beit.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/beit/modeling_beit.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/upernet.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/beit/modeling_beit.py | https://github.com/open-mmlab/mmsegmentation | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/beit/modeling_beit.py | https://arxiv.org/abs/1807.10221 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/upernet/modeling_upernet.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/beit/modeling_beit.py | https://arxiv.org/abs/1411.4038 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/convnext/convert_convnext_to_pytorch.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/beit/convert_beit_unilm_to_pytorch.py | https://github.com/google-research/big_transfer/issues/18 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/beit/modeling_beit.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/beit/configuration_beit.py | https://huggingface.co/models?filter=beit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bartpho/tokenization_bartpho.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bartpho/tokenization_bartpho.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/barthez/tokenization_barthez_fast.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/barthez/tokenization_barthez.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/barthez/tokenization_barthez.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_bart_dlm_flax.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | https://huggingface.co/models?filter=bart | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roberta/tokenization_roberta_fast.py | 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | https://docs.python.org/3/library/stdtypes.html#bytes.decode | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_bart_dlm_flax.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart.py | https://huggingface.co/models?filter=bart | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roberta/tokenization_roberta_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart.py | https://docs.python.org/3/library/stdtypes.html#bytes.decode | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bart/modeling_tf_bart.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bart/modeling_tf_bart.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bart/modeling_flax_bart.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bart/modeling_flax_bart.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bart/modeling_flax_bart.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bart/modeling_flax_bart.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bart/modeling_flax_bart.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bart/modeling_flax_bart.py | https://arxiv.org/abs/1910.13461 | 参考论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bart/modeling_flax_bart.py | https://github.com/google/flax/blob/491ce18759622506588784b4fca0e4bf05f8c8cd/flax/linen/attention.py#L252 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bart/modeling_flax_bart.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_bart_dlm_flax.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bart/modeling_bart.py | https://huggingface.co/models?filter=bart | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bart/modeling_bart.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bart/modeling_bart.py | https://arxiv.org/abs/1910.13461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bart/modeling_bart.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_bart_dlm_flax.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bart/configuration_bart.py | https://huggingface.co/models?filter=bart | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/longform-qa/eli5_app.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bart/configuration_bart.py | https://huggingface.co/facebook/bart-large | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bart/configuration_bart.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/pegasus/tokenization_pegasus_fast.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py | https://huggingface.co/docs/tokenizers/python/latest/components.html?highlight=unigram#models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/albert.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/albert/modeling_tf_albert.py | https://huggingface.co/models?filter=albert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/albert/modeling_tf_albert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/albert/modeling_tf_albert.py | https://github.com/google-research/albert/blob/master/modeling.py#L971-L993 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/albert/modeling_tf_albert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/albert/modeling_flax_albert.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/albert/modeling_flax_albert.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/albert/modeling_flax_albert.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/albert/modeling_flax_albert.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/albert/modeling_flax_albert.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/albert.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/albert/modeling_albert.py | https://huggingface.co/models?filter=albert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/albert/modeling_albert.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/albert/modeling_albert.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/albert/modeling_albert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/albert/configuration_albert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/albert/configuration_albert.py | https://huggingface.co/albert-xxlarge-v2 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/question-answering/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/albert/configuration_albert.py | https://arxiv.org/abs/1803.02155 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/question-answering/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/albert/configuration_albert.py | https://arxiv.org/abs/2009.13658 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/modeling_utils.py | https://github.com/tensorflow/mesh/blob/8d2465e9bc93129b913b5ccc6a59aa97abd96ec6/mesh_tensorflow | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/modeling_utils.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/modeling_utils.py | https://arxiv.org/pdf/2001.08361.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/modeling_utils.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/modeling_utils.py | https://github.com/huggingface/transformers/pull/11471 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/modeling_utils.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/modeling_utils.py | https://huggingface.co/transformers/installation.html#offline-mode | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/modeling_utils.py | https://huggingface.co/models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/modeling_utils.py | https://huggingface.co/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/utils/hub.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/modeling_utils.py | https://huggingface.co/docs/transformers/installation#offline-mode | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/tensorflow/image-classification/run_image_classification.py | 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/modeling_utils.py | https://pytorch.or | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/modeling_utils.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/modeling_utils.py | https://github.com/zihangdai/xlnet/blob/master/modeling.py#L253-L276 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/modeling_tf_utils.py | https://huggingface.co/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/modeling_tf_utils.py | https://huggingface.co/models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/utils/hub.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/modeling_tf_utils.py | https://huggingface.co/docs/transformers/installation#offline-mode | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/modeling_tf_utils.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/modeling_tf_utils.py | https://github.com/tensorflow/tensorflow/blob/00fad90125b18b80fe054de1055770cfb8fe4ba3/tensorflow/python/keras/engine/network.py#L1339-L1357 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/modeling_utils.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/modeling_tf_utils.py | https://github.com/zihangdai/xlnet/blob/master/modeling.py#L253-L276 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/tensorflow/image-classification/run_image_classification.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/modeling_tf_pytorch_utils.py | https://pytorch.or | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/modeling_tf_pytorch_utils.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/modeling_tf_pytorch_utils.py | https://github.com/tensorflow/tensorflow/blob/ee16fcac960ae660e0e4496658a366e2f745e1f0/tensorflow/python/keras/engine/network.py#L1352-L1357 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/modeling_flax_utils.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/modeling_flax_utils.py | https://github.com/deepmind/jmp/blob/3a8318abc3292be38582794dbf7b094e6583b192/jmp/_src/policy.py#L27 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/modeling_flax_utils.py | https://huggingface.co/models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/modeling_flax_utils.py | https://huggingface.co/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/utils/hub.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/modeling_flax_utils.py | https://huggingface.co/docs/transformers/installation#offline-mode | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/modeling_flax_utils.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/modeling_flax_utils.py | 
https://github.com/google/flax/issues/1261 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/tensorflow/image-classification/run_image_classification.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/modeling_flax_pytorch_utils.py | https://pytorch.or | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/modelcard.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/modelcard.py | https://arxiv.org/abs/1810.03993 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/modelcard.py | https://huggingface.co/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/integrations/integration_utils.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/integrations.py | https://github.com/huggingface/transformers/issues/11565 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/integrations/integration_utils.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/integrations.py | https://app.sigopt.com/experiment/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/integrations.py | https://www.tensorflow.org/tensorboard | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/main_classes/callback.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/integrations.py | https://www.wandb.com/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/integrations.py | https://docs.wandb.ai/integrations/huggingface | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/main_classes/callback.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/integrations.py | https://www.comet.ml/site/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/trainer_tf.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/integrations.py | https://www.comet.ml/docs/python-sdk/advanced/#comet-configuration-variables | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/main_classes/callback.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/integrations.py | https://pypi.org/project/azureml-sdk/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/main_classes/callback.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/integrations.py | https://www.mlflow.org/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/main_classes/callback.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/integrations.py | https://neptune.ai | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/image_utils.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/image_utils.py | https://pytorch.org/vision/stable/transforms.html#torchvision.transforms.Resize | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/hf_argparser.py | 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/hf_argparser.py | https://stackoverflow.com/questions/15008758/parsing-boolean-values-with-argparse | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/generation/utils.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/generation_utils.py | https://github.com/huggingface/transformers/pull/5420/files | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/llm_tutorial.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/generation_utils.py | https://huggingface.co/blog/how-to-generate | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/generation/tf_logits_process.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/generation_utils.py | https://arxiv.org/pdf/1909.05858.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/generation/configuration_utils.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/generation_utils.py | https://github.com/huggingface/transformers/issues/14081 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ja/generation_strategies.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/generation_utils.py | https://arxiv.org/pdf/1610.02424.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/whisper/modeling_whisper.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/generation_utils.py | https://arxiv.org/abs/2010.00904 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/generation/tf_utils.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/generation_utils.py | http://arxiv.org/abs/1904.09751 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/generation/tf_utils.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/generation_utils.py | https://gist.github.com/thomwolf/1a5a29f6962089e871b94cbd09daf317 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/generation/beam_search.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/generation_tf_utils.py | https://github.com/facebookresearch/XLM/blob/9e6f6814d17be4fe5b15f2e6c43eb2b2d76daeb4/src/model/transformer.py#L529 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/llm_tutorial.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/generation_tf_utils.py | https://huggingface.co/blog/how-to-generate | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/generation/tf_logits_process.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/generation_tf_utils.py | https://arxiv.org/pdf/1909.05858.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/generation_tf_utils.py | https://arxiv.org/abs/1909.05858 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/generation/tf_logits_process.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/generation_tf_utils.py | https://github.com/pytorch/fairseq/blob/a07cb6f40480928c9e0548b737aadd36ee66ac76/fairseq/sequence_generator.py#L345 
| 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/generation/utils.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/generation_tf_utils.py | https://github.com/huggingface/transformers/pull/5420/files | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/generation/tf_utils.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/generation_tf_utils.py | http://arxiv.org/abs/1904.09751 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/generation/tf_utils.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/generation_tf_utils.py | https://gist.github.com/thomwolf/1a5a29f6962089e871b94cbd09daf317 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/hubert/modeling_tf_hubert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/generation_tf_utils.py | https://github.com/tensorflow/tensorflow/issues/9260 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/generation/tf_logits_process.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/generation_tf_logits_process.py | https://arxiv.org/pdf/1909.05858.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/generation/tf_logits_process.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/generation_tf_logits_process.py | https://github.com/pytorch/fairseq/blob/a07cb6f40480928c9e0548b737aadd36ee66ac76/fairseq/sequence_generator.py#L345 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/generation/tf_logits_process.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/generation_logits_process.py | https://arxiv.org/pdf/1909.05858.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/generation/tf_logits_process.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/generation_logits_process.py | https://github.com/pytorch/fairseq/blob/a07cb6f40480928c9e0548b737aadd36ee66ac76/fairseq/sequence_generator.py#L345 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/generation/logits_process.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/generation_logits_process.py | https://github.com/facebookresearch/ParlAI/blob/master/parlai/core/torch_generator_agent.py#L1350 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/whisper/modeling_whisper.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/generation_logits_process.py | https://arxiv.org/abs/2010.00904 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ja/generation_strategies.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/generation_logits_process.py | https://arxiv.org/pdf/1610.02424.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/llm_tutorial.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/generation_flax_utils.py | https://huggingface.co/blog/how-to-generate | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/generation/utils.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/generation_flax_utils.py | 
https://github.com/huggingface/transformers/pull/5420/files | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/generation/beam_search.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/generation_beam_search.py | https://github.com/facebookresearch/XLM/blob/9e6f6814d17be4fe5b15f2e6c43eb2b2d76daeb4/src/model/transformer.py#L529 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/generation/beam_search.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/generation_beam_search.py | https://github.com/ashwinkalyan/dbs/blob/master/dbs/beam_utils.lua | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ja/generation_strategies.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/generation_beam_search.py | https://arxiv.org/pdf/1610.02424.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/setup.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/file_utils.py | https://github.com/allenai/allennlp | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/utils/import_utils.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/file_utils.py | https://github.com/pytorch/pytorch/blob/2289a12f21c54da93bf5d696e3f9aea83dd9c10d/torch/testing/_internal/common_cuda.py#L51 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/utils/import_utils.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/file_utils.py | https://github.com/tqdm/tqdm/blob/master/tqdm/autonotebook.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/utils/import_utils.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/file_utils.py | https://github.com/google/sentencepiece#installation | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/utils/import_utils.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/file_utils.py | https://github.com/protocolbuffers/protobuf/tree/master/python#installation | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/utils/import_utils.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/file_utils.py | https://github.com/facebookresearch/faiss/blob/master/INSTALL.md | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/file_utils.py | https://pytorch.org/get-started/locally/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/file_utils.py | https://www.tensorflow.org/install | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/utils/import_utils.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/file_utils.py | https://github.com/facebookresearch/detectron2/blob/master/INSTALL.md | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/run_flax_speech_recognition_seq2seq.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/file_utils.py | https://github.com/google/flax | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/utils/import_utils.py | 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/file_utils.py | https://github.com/rspeer/python-ftfy/tree/master#installing | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/quantization-qdqbert/Dockerfile | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/file_utils.py | https://pypi.ngc.nvidia.com | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/tapas.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/file_utils.py | https://github.com/tensorflow/probability | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/utils/import_utils.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/file_utils.py | https://pandas.pydata.org/pandas-docs/stable/getting_started/install.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/file_utils.py | https://huggingface.co/models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/file_utils.py | https://huggingface.co/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/file_utils.py | https://hf.co | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/utils/import_utils.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/file_utils.py | https://github.com/optuna/optuna/blob/master/optuna/integration/__init__.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/utils/doc.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/file_utils.py | http://stackoverflow.com/a/6528148/190597 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/feature_extraction_utils.py | https://huggingface.co/models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/feature_extraction_utils.py | https://huggingface.co/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/utils/hub.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/feature_extraction_utils.py | https://huggingface.co/docs/transformers/installation#offline-mode | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/integrations/deepspeed.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/deepspeed.py | https://github.com/microsoft/DeepSpeed/issues/1394#issuecomment-937405374 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/deepspeed/test_deepspeed.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/deepspeed.py | https://github.com/microsoft/DeepSpeed/issues/1612 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/data/processors/xnli.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/data/processors/xnli.py | 
https://github.com/google-research/bert/blob/f39e881b169b9d53bea03d2d341b31707a6c052b/run_classifier.py#L207 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/tapex/run_tabfact_with_tapex.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/data/processors/glue.py | https://github.com/huggingface/transformers/blob/master/examples/pytorch/text-classification/run_glue.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/tapex/run_tabfact_with_tapex.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/data/metrics/__init__.py | https://github.com/huggingface/transformers/blob/master/examples/pytorch/text-classification/run_glue.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/tapex/run_tabfact_with_tapex.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/data/datasets/glue.py | https://github.com/huggingface/transformers/blob/master/examples/pytorch/text-classification/run_glue.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/convert_slow_tokenizer.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/convert_graph_to_onnx.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/convert_graph_to_onnx.py | https://github.com/microsoft/onnxruntime/tree/master/onnxruntime/python/tools/transformers | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/configuration_utils.py | https://huggingface.co/models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/configuration_utils.py | https://huggingface.co/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/utils/hub.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/configuration_utils.py | https://huggingface.co/docs/transformers/installation#offline-mode | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/jax-projects/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/commands/user.py | https://git-lfs.github.com/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/commands/lfs.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/commands/lfs.py | https://github.com/git-lfs/git-lfs/blob/master/docs/custom-transfers.md | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/commands/convert.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/commands/add_new_model_like.py | https://huggingface.co/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/commands/add_new_model_like.py | 
https://huggingface.co/models?filter= | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/setup.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/benchmark/benchmark_utils.py | https://github.com/allenai/allennlp | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/benchmark/benchmark_utils.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/benchmark/benchmark_utils.py | https://github.com/pythonprofilers/memory_profiler/blob/895c4ac7a08020d66ae001e24067da6dcea42451/memory_profiler.py#L239 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/benchmark/benchmark_utils.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/benchmark/benchmark_utils.py | https://psutil.readthedocs.io/en/latest/#psutil.Process.memory_info | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/benchmark/benchmark_utils.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/benchmark/benchmark_utils.py | https://github.com/tensorflow/tensorflow/issues/20218#issuecomment-416771802 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/benchmark/benchmark_utils.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/benchmark/benchmark_utils.py | https://github.com/pytorch/xla/issues/2180 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/benchmark/benchmark.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/benchmark/benchmark_tf.py | https://docs.python.org/2/library/timeit.html#timeit.Timer.repeat | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/lightning_base.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/benchmark/benchmark_args.py | https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/benchmark/benchmark.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/benchmark/benchmark.py | https://github.com/NVIDIA/apex/issues/439 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/benchmark/benchmark.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/benchmark/benchmark.py | https://docs.python.org/2/library/timeit.html#timeit.Timer.repeat | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/activations_tf.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/activations_tf.py | https://arxiv.org/abs/1606.08415 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/activations_tf.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/activations_tf.py | https://arxiv.org/abs/1606.0841 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/activations_tf.py | https://arxiv.org/abs/2004.09602 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/activations_tf.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/activations_tf.py | https://arxiv.org/abs/1612.08083 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/activations_tf.py | 
Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/activations.py | https://arxiv.org/abs/1606.08415 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/activations.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/activations.py | https://github.com/hendrycks/GELUs | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/activations.py | https://arxiv.org/abs/2004.09602 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/activations.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/activations.py | https://arxiv.org/abs/1702.03118 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/activations.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/activations.py | https://arxiv.org/abs/1710.05941v1 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/activations.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/activations.py | https://arxiv.org/abs/1908.08681 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/activations.py | Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/activations.py | https://github.com/digantamisra98/Mish | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/setup.py | Bert_Chinese_ID3433_for_PyTorch/transformers/setup.py | https://test.pypi.org/legacy/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/setup.py | Bert_Chinese_ID3433_for_PyTorch/transformers/setup.py | https://testpypi.python.org/pypi | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/setup.py | Bert_Chinese_ID3433_for_PyTorch/transformers/setup.py | https://github.com/pypa/pip/issues/5466 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/marian.md | Bert_Chinese_ID3433_for_PyTorch/transformers/scripts/tatoeba/upload_models.sh | https://huggingface.co/Helsinki-NLP/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/setup.py | Bert_Chinese_ID3433_for_PyTorch/transformers/scripts/stale.py | https://github.com/allenai/allennlp | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/pegasus/build_test_sample_spm_no_bos.py | Bert_Chinese_ID3433_for_PyTorch/transformers/scripts/pegasus/build_test_sample_spm_no_bos.py | https://raw.githubusercontent.com/google/sentencepiece/master/data/botchan.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-facebook-wmt19.py | Bert_Chinese_ID3433_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py | https://github.com/pytorch/fairseq/blob/master/examples/wmt19/README.md | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/fsmt.md | Bert_Chinese_ID3433_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py | https://arxiv.org/abs/1907.06616 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-facebook-wmt19.py | Bert_Chinese_ID3433_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py | https://huggingface.co/facebook/wmt19-en-ru | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-facebook-wmt19.py | Bert_Chinese_ID3433_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py | https://huggingface.co/facebook/wmt19-ru-en | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-facebook-wmt19.py | Bert_Chinese_ID3433_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py | https://huggingface.co/facebook/wmt19-en-de | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-facebook-wmt19.py | Bert_Chinese_ID3433_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py | https://huggingface.co/facebook/wmt19-de-en | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-facebook-wmt19.py | Bert_Chinese_ID3433_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py | https://discuss.huggingface.co/t/issues-with-translating-inputs-containing-repeated-phrases/981 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-allenai-wmt19.py | Bert_Chinese_ID3433_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py | http://www.statmt.org/wmt19/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-allenai-wmt19.py | Bert_Chinese_ID3433_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py | http://matrix.statmt.org/test_sets/newstest2019.tgz?1556572561 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-allenai-wmt19.py | Bert_Chinese_ID3433_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt19.py | https://github.com/jungokasai/deep-shallow/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-allenai-wmt19.py | Bert_Chinese_ID3433_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt19.py | https://arxiv.org/abs/2006.10369 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-allenai-wmt19.py | Bert_Chinese_ID3433_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt19.py | https://huggingface.co/allenai/wmt19-de-en-6-6-big | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-allenai-wmt19.py | Bert_Chinese_ID3433_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt19.py | https://huggingface.co/allenai/wmt19-de-en-6-6-base | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-allenai-wmt19.py | Bert_Chinese_ID3433_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt19.py | http://www.statmt.org/wmt19/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-allenai-wmt19.py | Bert_Chinese_ID3433_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt19.py | http://matrix.statmt.org/test_sets/newstest2019.tgz?1556572561 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-allenai-wmt19.py | Bert_Chinese_ID3433_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt16.py | https://github.com/jungokasai/deep-shallow/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-allenai-wmt19.py | Bert_Chinese_ID3433_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt16.py | https://arxiv.org/abs/2006.10369 | 参考论文地址 | -| 
开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-allenai-wmt16.py | Bert_Chinese_ID3433_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt16.py | https://huggingface.co/allenai/wmt16-en-de-dist-12-1 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-allenai-wmt16.py | Bert_Chinese_ID3433_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt16.py | https://huggingface.co/allenai/wmt16-en-de-dist-6-1 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-allenai-wmt16.py | Bert_Chinese_ID3433_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt16.py | https://huggingface.co/allenai/wmt16-en-de-12-1 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-allenai-wmt16.py | Bert_Chinese_ID3433_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt16.py | http://www.statmt.org/wmt16/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-allenai-wmt16.py | Bert_Chinese_ID3433_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt16.py | http://matrix.statmt.org/test_sets/newstest2016.tgz?1504722372 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-facebook-wmt19.py | Bert_Chinese_ID3433_for_PyTorch/transformers/scripts/fsmt/eval-facebook-wmt19.sh | http://matrix.statmt.org/matrix/output/1907?run_id=6937 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/eval-facebook-wmt19.sh | Bert_Chinese_ID3433_for_PyTorch/transformers/scripts/fsmt/eval-facebook-wmt19.sh | http://matrix.statmt.org/matrix/output/1914?score_id=37605 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-facebook-wmt19.py | Bert_Chinese_ID3433_for_PyTorch/transformers/scripts/fsmt/eval-facebook-wmt19.sh | http://matrix.statmt.org/matrix/output/1909?run_id=6862 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-facebook-wmt19.py | Bert_Chinese_ID3433_for_PyTorch/transformers/scripts/fsmt/eval-facebook-wmt19.sh | http://matrix.statmt.org/matrix/output/1902?run_id=6750 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/tensorflow/translation/run_translation.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/tensorflow/translation/run_translation.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/tensorflow/token-classification/run_ner.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/tensorflow/token-classification/run_ner.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | 
Bert_Chinese_ID3433_for_PyTorch/transformers/examples/tensorflow/text-classification/run_text_classification.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/text-classification/run_flax_glue.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/tensorflow/text-classification/run_text_classification.py | https://huggingface.co/docs/datasets/package_reference/main_classes.html#datasets.Dataset.unique | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/tensorflow/text-classification/run_glue.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/tensorflow/summarization/run_summarization.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/tensorflow/summarization/run_summarization.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/tensorflow/question-answering/run_qa.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/tensorflow/question-answering/run_qa.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/question-answering/run_qa.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/tensorflow/question-answering/run_qa.py | https://huggingface.co/transformers/index.html#supported-frameworks | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/tensorflow/multiple-choice/run_swag.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/tensorflow/multiple-choice/run_swag.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_mlm_flax.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/tensorflow/language-modeling/run_mlm.py | https://huggingface.co/models?filter=fill-mask | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/tensorflow/language-modeling/run_mlm.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/tensorflow/language-modeling/run_mlm.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_clm_flax.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/tensorflow/language-modeling/run_clm.py | https://huggingface.co/models?filter=text-generation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/tensorflow/language-modeling/run_clm.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/tensorflow/language-modeling/run_clm.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/lxmert/modeling_frcnn.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/visual_bert/modeling_frcnn.py | https://github.com/pytorch/pytorch/issues/22812 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/lxmert/modeling_frcnn.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/visual_bert/modeling_frcnn.py | https://github.com/airsplay/py-bottom-up-attention/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/lightning_base.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/seq2seq-distillation/lightning_base.py | https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/seq2seq-distillation/finetune_pegasus_xsum.sh | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/seq2seq-distillation/finetune_pegasus_xsum.sh | https://arxiv.org/abs/1912.08777 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/seq2seq-distillation/_test_bash_script.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/seq2seq-distillation/_test_bash_script.py | https://cdn-datasets.huggingface.co/translation/wmt_en_ro-tr40k-va0.5k-te0.5k.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/rag-end2end-retriever/use_own_knowledge_dataset.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/rag-end2end-retriever/use_own_knowledge_dataset.py | https://huggingface.co/docs/datasets/loading_datasets.html?highlight=csv#csv-files | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/lightning_base.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/rag-end2end-retriever/lightning_base.py | https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/rag-end2end-retriever/finetune_rag.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/rag-end2end-retriever/finetune_rag.py | https://github.com/PyTorchLightning/pytorch-lightning/issues/2424 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/rag-end2end-retriever/finetune_rag.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/rag-end2end-retriever/finetune_rag.py | 
https://docs.ray.io/en/master/cluster/index.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/rag-end2end-retriever/finetune_rag.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/rag-end2end-retriever/distributed_ray_retriever.py | https://docs.ray.io/en/master/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/rag-end2end-retriever/distributed_ray_retriever.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/rag-end2end-retriever/distributed_ray_retriever.py | https://docs.ray.io/en/master/walkthrough.html#remote | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/rag-end2end-retriever/use_own_knowledge_dataset.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/rag/use_own_knowledge_dataset.py | https://huggingface.co/docs/datasets/loading_datasets.html?highlight=csv#csv-files | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/rag/test_distributed_retriever.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/rag/test_distributed_retriever.py | https://stackoverflow.com/questions/54338013/parallel-import-a-python-file-from-sibling-folder | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/rag/test_distributed_retriever.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/rag/test_distributed_retriever.py | port=12345 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/lightning_base.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/rag/lightning_base.py | https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/rag-end2end-retriever/finetune_rag.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/rag/finetune_rag.py | https://github.com/PyTorchLightning/pytorch-lightning/issues/2424 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/rag-end2end-retriever/finetune_rag.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/rag/finetune_rag.py | https://docs.ray.io/en/master/cluster/index.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/rag-end2end-retriever/finetune_rag.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/rag/distributed_ray_retriever.py | https://docs.ray.io/en/master/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/rag-end2end-retriever/distributed_ray_retriever.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/rag/distributed_ray_retriever.py | https://docs.ray.io/en/master/walkthrough.html#remote | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/quantization-qdqbert/run_quant_qa.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | 
Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/quantization-qdqbert/run_quant_qa.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/question-answering/run_qa.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/quantization-qdqbert/run_quant_qa.py | https://huggingface.co/transformers/index.html#supported-frameworks | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/quantization-qdqbert/evaluate-hf-trt-qa.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/quantization-qdqbert/evaluate-hf-trt-qa.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_mlm_flax.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/performer/run_mlm_performer.py | https://huggingface.co/models?filter=fill-mask | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/performer/run_mlm_performer.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/performer/run_mlm_performer.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/performer/modeling_flax_performer_utils.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/performer/modeling_flax_performer_utils.py | https://github.com/google-research/google-research/blob/master/performer/fast_self_attention/fast_self_attention.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/performer/modeling_flax_performer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/performer/modeling_flax_performer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/performer/modeling_flax_performer.py | https://arxiv.org/abs/1607.06450 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/performer/modeling_flax_performer.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/onnx/summarization/bart_onnx/generation_onnx.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/onnx/summarization/bart_onnx/generation_onnx.py | https://msdata.visualstudio.com/Vienna/_workitems/edit/1486599 | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/movement-pruning/masked_run_squad.py | https://www.github.com/nvidia/apex | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/lightning_base.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/movement-pruning/masked_run_squad.py | https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/movement-pruning/masked_run_squad.py | https://code.visualstudio.com/docs/python/debugging#_attach-to-a-local-script | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/movement-pruning/masked_run_glue.py | https://www.github.com/nvidia/apex | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/lightning_base.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/movement-pruning/masked_run_glue.py | https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modules/binarizer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/movement-pruning/emmental/modules/binarizer.py | https://github.com/arunmallya/piggyback | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modules/binarizer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/movement-pruning/emmental/modules/binarizer.py | https://github.com/allenai/hidden-networks | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modules/binarizer.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/movement-pruning/emmental/modules/binarizer.py | https://github.com/NervanaSystems/distiller/blob/2291fdcc2ea642a98d4e20629acb5a9e2e04b4e6/distiller/pruning/automated_gradual_pruner.py#L24 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/mm-imdb/run_mmimdb.py | https://www.github.com/nvidia/apex | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/lightning_base.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/mm-imdb/run_mmimdb.py | 
https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/mm-imdb/run_mmimdb.py | https://code.visualstudio.com/docs/python/debugging#_attach-to-a-local-script | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_mlm_flax.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/mlm_wwm/run_mlm_wwm.py | https://huggingface.co/models?filter=fill-mask | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/mlm_wwm/run_mlm_wwm.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/mlm_wwm/run_mlm_wwm.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_chinese_ref.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/mlm_wwm/run_chinese_ref.py | https://en.wikipedia.org/wiki/CJK_Unified_Ideographs_ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_chinese_ref.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/mlm_wwm/run_chinese_ref.py | https://github.com/ymcui/Chinese-BERT-wwm | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_chinese_ref.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/mlm_wwm/run_chinese_ref.py | https://github.com/HIT-SCIR/ltp | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/lxmert/modeling_frcnn.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/lxmert/modeling_frcnn.py | https://github.com/pytorch/pytorch/issues/22812 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/lxmert/modeling_frcnn.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/lxmert/modeling_frcnn.py | https://github.com/airsplay/py-bottom-up-attention/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/luke/run_luke_ner_no_trainer.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/luke/run_luke_ner_no_trainer.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/longform-qa/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/longform-qa/eli5_app.py | https://yjernite.github.io/lfqa.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/longform-qa/eli5_app.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/longform-qa/eli5_app.py | https://research.google/pubs/pub49029/ | 
模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/longform-qa/eli5_app.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/longform-qa/eli5_app.py | https://arxiv.org/abs/1907.09190 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/longform-qa/eli5_app.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/longform-qa/eli5_app.py | https://huggingface.co/facebook/bart-large | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_chinese_ref.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/longform-qa/eli5_app.py | https://en.wikipedia.org/wiki/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/jax-projects/model_parallel/run_clm_mp.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/jax-projects/model_parallel/run_clm_mp.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/jax-projects/model_parallel/partitions.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/jax-projects/model_parallel/partitions.py | https://github.com/google-research/google-research/blob/master/flax_models/t5x/partitions.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/vision/run_image_classification.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/jax-projects/hybrid_clip/run_hybrid_clip.py | https://huggingface.co/models?filter=v | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_mlm_flax.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/jax-projects/hybrid_clip/run_hybrid_clip.py | https://huggingface.co/models?filter=fill-mask | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_mlm_flax.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/jax-projects/dataset-streaming/run_mlm_flax_stream.py | https://huggingface.co/models?filter=fill-mask | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/jax-projects/dataset-streaming/run_mlm_flax_stream.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/fsner/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/fsner/src/fsner/model.py | https://arxiv.org/abs/2008.10570 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/fsner/setup.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/fsner/setup.py | msi.sayef@gmail.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/distillation/distiller.py | 
Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/distillation/utils.py | https://github.com/facebookresearch/XLM | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/lightning_base.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/distillation/train.py | https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/distillation/run_squad_w_distillation.py | https://www.github.com/nvidia/apex | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/lightning_base.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/distillation/run_squad_w_distillation.py | https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/distillation/run_squad_w_distillation.py | https://code.visualstudio.com/docs/python/debugging#_attach-to-a-local-script | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/distillation/distiller.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/distillation/lm_seqs_dataset.py | https://github.com/facebookresearch/XLM | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/distillation/grouped_batch_sampler.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/distillation/grouped_batch_sampler.py | https://github.com/pytorch/vision/blob/master/references/detection/group_by_aspect_ratio.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/distillation/distiller.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/distillation/distiller.py | https://github.com/facebookresearch/XLM | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/distillation/distiller.py | https://www.github.com/nvidia/apex | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/distillation/distiller.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/distillation/distiller.py | https://github.com/peterliht/knowledge-distillation-pytorch/blob/master/model/net.py#L100 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/distillation/distiller.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/distillation/distiller.py | https://github.com/peterliht/knowledge-distillation-pytorch/issues/2 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/deebert/test_glue_deebert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/deebert/test_glue_deebert.py | https://github.com/huggingface/transformers/issues/10560 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/deebert/run_glue_deebert.py | 
https://www.github.com/nvidia/apex | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/lightning_base.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/deebert/run_glue_deebert.py | https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/deebert/run_glue_deebert.py | https://code.visualstudio.com/docs/python/debugging#_attach-to-a-local-script | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/codeparrot/scripts/human_eval.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/codeparrot/scripts/human_eval.py | https://stackoverflow.com/questions/60804599/python-multiprocessing-keeps-spawning-the-whole-script | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bertology/run_prune_gpt.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/bertology/run_prune_gpt.py | https://github.com/huggingface/transformers/blob/783d7d2629e97c5f0c5f9ef01b8c66410275c204/examples/research_projects/bertology/run_bertology.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bertology/run_prune_gpt.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/bertology/run_prune_gpt.py | http://arxiv.org/abs/1905.10650 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/bertology/run_prune_gpt.py | https://code.visualstudio.com/docs/python/debugging#_attach-to-a-local-script | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bertology/run_prune_gpt.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/bertology/run_bertology.py | http://arxiv.org/abs/1905.10650 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bertology/run_bertology.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/bertology/run_bertology.py | https://github.com/pmichel31415/are-16-heads-really-better-than-1 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/bertology/run_bertology.py | https://code.visualstudio.com/docs/python/debugging#_attach-to-a-local-script | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/bert-loses-patience/run_glue_with_pabee.py | https://www.github.com/nvidia/apex | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/lightning_base.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/bert-loses-patience/run_glue_with_pabee.py | https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/bert-loses-patience/run_glue_with_pabee.py | 
https://code.visualstudio.com/docs/python/debugging#_attach-to-a-local-script | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bertabs/utils_summarization.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/bertabs/utils_summarization.py | https://cs.nyu.edu/~kcho/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bertabs/utils_summarization.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/bertabs/utils_summarization.py | https://github.com/abisee/cnn-dailymail/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bertabs/utils_summarization.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/bertabs/utils_summarization.py | https://github.com/nlpyang/PreSumm | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/seq2seq/xla_spawn.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/pytorch/xla_spawn.py | https://github.com/pytorch/pytorch/blob/master/torch/distributed/launch.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/pytorch/translation/run_translation_no_trainer.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/pytorch/translation/run_translation_no_trainer.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/pytorch/translation/run_translation.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/pytorch/translation/run_translation.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/pytorch/token-classification/run_ner_no_trainer.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/pytorch/token-classification/run_ner_no_trainer.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/pytorch/token-classification/run_ner.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | 
Bert_Chinese_ID3433_for_PyTorch/transformers/examples/pytorch/token-classification/run_ner.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/question-answering/run_qa.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/pytorch/token-classification/run_ner.py | https://huggingface.co/transformers/index.html#supported-frameworks | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/text-generation/run_generation.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/pytorch/text-generation/run_generation.py | https://github.com/rusiaaman/XLNet-gen#methodology | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/text-generation/run_generation.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/pytorch/text-generation/run_generation.py | https://medium.com/@amanrusia/xlnet-speaks-comparison-to-gpt-2-ea1a4e9ba39e | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/pytorch/text-classification/run_xnli.py | https://code.visualstudio.com/docs/python/debugging#_attach-to-a-local-script | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/pytorch/text-classification/run_glue_no_trainer.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/text-classification/run_flax_glue.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/pytorch/text-classification/run_glue_no_trainer.py | https://huggingface.co/docs/datasets/package_reference/main_classes.html#datasets.Dataset.unique | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/pytorch/text-classification/run_glue.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/text-classification/run_flax_glue.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/pytorch/text-classification/run_glue.py | https://huggingface.co/docs/datasets/package_reference/main_classes.html#datasets.Dataset.unique | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/pytorch/summarization/run_summarization_no_trainer.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/pytorch/summarization/run_summarization_no_trainer.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/pytorch/summarization/run_summarization.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | 
Bert_Chinese_ID3433_for_PyTorch/transformers/examples/pytorch/summarization/run_summarization.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/pytorch/question-answering/run_seq2seq_qa.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/pytorch/question-answering/run_seq2seq_qa.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/pytorch/question-answering/run_qa_no_trainer.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/pytorch/question-answering/run_qa_no_trainer.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/pytorch/question-answering/run_qa_beam_search_no_trainer.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/pytorch/question-answering/run_qa_beam_search_no_trainer.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/pytorch/question-answering/run_qa_beam_search.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/pytorch/question-answering/run_qa_beam_search.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/pytorch/question-answering/run_qa.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/pytorch/question-answering/run_qa.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/question-answering/run_qa.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/pytorch/question-answering/run_qa.py | https://huggingface.co/transformers/index.html#supported-frameworks | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/pytorch/multiple-choice/run_swag_no_trainer.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | 
Bert_Chinese_ID3433_for_PyTorch/transformers/examples/pytorch/multiple-choice/run_swag_no_trainer.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/pytorch/multiple-choice/run_swag.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/pytorch/multiple-choice/run_swag.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/pytorch/language-modeling/run_plm.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/pytorch/language-modeling/run_plm.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_mlm_flax.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/pytorch/language-modeling/run_mlm_no_trainer.py | https://huggingface.co/models?filter=fill-mask | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/pytorch/language-modeling/run_mlm_no_trainer.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/pytorch/language-modeling/run_mlm_no_trainer.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_mlm_flax.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/pytorch/language-modeling/run_mlm.py | https://huggingface.co/models?filter=fill-mask | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/pytorch/language-modeling/run_mlm.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/pytorch/language-modeling/run_mlm.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_clm_flax.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/pytorch/language-modeling/run_clm_no_trainer.py | https://huggingface.co/models?filter=text-generation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/pytorch/language-modeling/run_clm_no_trainer.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | 
Bert_Chinese_ID3433_for_PyTorch/transformers/examples/pytorch/language-modeling/run_clm_no_trainer.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_clm_flax.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/pytorch/language-modeling/run_clm.py | https://huggingface.co/models?filter=text-generation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/pytorch/language-modeling/run_clm.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/pytorch/language-modeling/run_clm.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/image-pretraining/run_mim.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/pytorch/image-pretraining/run_mim.py | https://github.com/microsoft/SimMIM/blob/main/data/data_simmim.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/image-pretraining/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/pytorch/image-pretraining/run_mae.py | https://arxiv.org/abs/2111.06377 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/image-pretraining/run_mae.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/pytorch/image-pretraining/run_mae.py | https://github.com/facebookresearch/mae/blob/main/main_pretrain.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/vision/run_image_classification.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/pytorch/contrastive-image-text/run_clip.py | https://huggingface.co/models?filter=v | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_mlm_flax.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/pytorch/contrastive-image-text/run_clip.py | https://huggingface.co/models?filter=fill-mask | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/pytorch/contrastive-image-text/run_clip.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/pytorch/contrastive-image-text/run_clip.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/run_ner.sh | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/legacy/token-classification/run.sh | https://drive.google.com/drive/folders/1kC0I2UGl2ltrluI9NqDjaQJGw5iliw_J | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/seq2seq/xla_spawn.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/legacy/seq2seq/xla_spawn.py | https://github.com/pytorch/pytorch/blob/master/torch/distributed/launch.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_transfo_xl.py | 
Bert_Chinese_ID3433_for_PyTorch/transformers/examples/legacy/run_transfo_xl.py | https://github.com/kimiyoung/transformer-xl | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_transfo_xl.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/legacy/run_transfo_xl.py | https://github.com/kimiyoung/transformer-xl/blob/master/pytorch/eval.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/legacy/run_transfo_xl.py | https://code.visualstudio.com/docs/python/debugging#_attach-to-a-local-script | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/legacy/run_swag.py | https://github.com/google-research/bert/issues/38 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/legacy/run_swag.py | https://www.github.com/nvidia/apex | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/lightning_base.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/legacy/run_swag.py | https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/legacy/run_swag.py | https://code.visualstudio.com/docs/python/debugging#_attach-to-a-local-script | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_openai_gpt.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/legacy/run_openai_gpt.py | https://github.com/huggingface/pytorch-openai-transformer-lm/blob/master/train.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_openai_gpt.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/legacy/run_openai_gpt.py | https://github.com/openai/finetune-transformer-lm/blob/master/train.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/legacy/run_openai_gpt.py | https://code.visualstudio.com/docs/python/debugging#_attach-to-a-local-script | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_chinese_ref.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/legacy/run_chinese_ref.py | https://en.wikipedia.org/wiki/CJK_Unified_Ideographs_ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_chinese_ref.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/legacy/run_chinese_ref.py | https://github.com/ymcui/Chinese-BERT-wwm | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_chinese_ref.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/legacy/run_chinese_ref.py | https://github.com/HIT-SCIR/ltp | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_camembert.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/legacy/run_camembert.py | https://github.com/pytorch/fairseq/blob/master/fairseq/models/roberta/hub_interface.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | 
Bert_Chinese_ID3433_for_PyTorch/transformers/examples/legacy/question-answering/run_squad.py | https://www.github.com/nvidia/apex | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/lightning_base.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/legacy/question-answering/run_squad.py | https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/legacy/question-answering/run_squad.py | https://code.visualstudio.com/docs/python/debugging#_attach-to-a-local-script | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/run_ner.sh | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/legacy/pytorch-lightning/run_ner.sh | https://drive.google.com/drive/folders/1kC0I2UGl2ltrluI9NqDjaQJGw5iliw_J | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/run_ner.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/legacy/pytorch-lightning/run_ner.py | https://github.com/PyTorchLightning/pytorch-lightning/blob/master/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/run_ner.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/legacy/pytorch-lightning/run_ner.py | https://github.com/huggingface/transformers/issues/3159 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/run_ner.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/legacy/pytorch-lightning/run_ner.py | https://github.com/PyTorchLightning/pytorch-lightning/blob/master | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/lightning_base.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/legacy/pytorch-lightning/lightning_base.py | https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/vision/run_image_classification.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/flax/vision/run_image_classification.py | https://huggingface.co/models?filter=vit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/flax/token-classification/run_flax_ner.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/flax/token-classification/run_flax_ner.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/flax/text-classification/run_flax_glue.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/text-classification/run_flax_glue.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/flax/text-classification/run_flax_glue.py | https://huggingface.co/docs/datasets/package_reference/main_classes.html#datasets.Dataset.unique | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/flax/summarization/run_summarization_flax.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/flax/summarization/run_summarization_flax.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/run_flax_speech_recognition_seq2seq.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/flax/summarization/run_summarization_flax.py | https://github.com/google/flax/blob/87a211135c6a377c8f29048a1cac3840e38b9da4/examples/wmt/train.py#L104 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/flax/question-answering/run_qa.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/flax/question-answering/run_qa.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/question-answering/run_qa.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/flax/question-answering/run_qa.py | https://huggingface.co/transformers/index.html#supported-frameworks | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/t5_tokenizer_model.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/flax/language-modeling/t5_tokenizer_model.py | https://github.com/yandex-research/DeDLOC/blob/main/sahajbert/tokenizer/tokenizer_model.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_t5_mlm_flax.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/flax/language-modeling/run_t5_mlm_flax.py | https://huggingface.co/models?filter=t5 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_t5_mlm_flax.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/flax/language-modeling/run_t5_mlm_flax.py | https://github.com/google-research/text-to-text-transfer-transformer/blob/84f8bcc14b5f2c03de51bd3587609ba8f6bbd1cd/t5/data/preprocessors.py#L2466 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_t5_mlm_flax.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/flax/language-modeling/run_t5_mlm_flax.py | https://arxiv.org/pdf/1910.10683.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_t5_mlm_flax.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/flax/language-modeling/run_t5_mlm_flax.py | https://github.com/google-research/text-to-text-transfer-transformer/blob/master/t5/data/preprocessors.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_t5_mlm_flax.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/flax/language-modeling/run_t5_mlm_flax.py | 
https://github.com/google-research/text-to-text-transfer-transformer/blob/84f8bcc14b5f2c03de51bd3587609ba8f6bbd1cd/t5/data/preprocessors.py#L2682 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/flax/language-modeling/run_t5_mlm_flax.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/flax/language-modeling/run_t5_mlm_flax.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_mlm_flax.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/flax/language-modeling/run_t5_mlm_flax.py | https://github.com/deepmind/optax/blob/ed02befef9bf81cbbf236be3d2b0e032e9ed4a40/optax/_src/alias.py#L74 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_mlm_flax.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/flax/language-modeling/run_mlm_flax.py | https://huggingface.co/models?filter=fill-mask | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/flax/language-modeling/run_mlm_flax.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/flax/language-modeling/run_mlm_flax.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_mlm_flax.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/flax/language-modeling/run_mlm_flax.py | https://github.com/deepmind/optax/blob/ed02befef9bf81cbbf236be3d2b0e032e9ed4a40/optax/_src/alias.py#L74 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_clm_flax.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/flax/language-modeling/run_clm_flax.py | https://huggingface.co/models?filter=text-generation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/flax/language-modeling/run_clm_flax.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/flax/language-modeling/run_clm_flax.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_mlm_flax.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/flax/language-modeling/run_clm_flax.py | https://github.com/deepmind/optax/blob/ed02befef9bf81cbbf236be3d2b0e032e9ed4a40/optax/_src/alias.py#L74 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/flax/image-captioning/run_image_captioning_flax.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/flax/image-captioning/run_image_captioning_flax.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/run_flax_speech_recognition_seq2seq.py | Bert_Chinese_ID3433_for_PyTorch/transformers/examples/flax/image-captioning/run_image_captioning_flax.py | https://github.com/google/flax/blob/87a211135c6a377c8f29048a1cac3840e38b9da4/examples/wmt/train.py#L104 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docker/transformers-pytorch-tpu/Dockerfile | Bert_Chinese_ID3433_for_PyTorch/transformers/docker/transformers-pytorch-tpu/Dockerfile | https://github.com/conda/conda/issues/8385 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/.circleci/create_circleci_config.py | Bert_Chinese_ID3433_for_PyTorch/transformers/docker/transformers-pytorch-gpu/Dockerfile | https://github.com/facebookresearch/detectron2.g | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/.circleci/create_circleci_config.py | Bert_Chinese_ID3433_for_PyTorch/transformers/docker/transformers-doc-builder/Dockerfile | https://github.com/facebookresearch/detectron2.g | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/.circleci/create_circleci_config.py | Bert_Chinese_ID3433_for_PyTorch/transformers/docker/transformers-all-latest-gpu/Dockerfile | https://github.com/facebookresearch/detectron2.g | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_mlm_flax.py | Bert_Chinese_ID3433_for_PyTorch/run_mlm.py | https://huggingface.co/models?filter=fill-mask | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | Bert_Chinese_ID3433_for_PyTorch/run_mlm.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | Bert_Chinese_ID3433_for_PyTorch/run_mlm.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开发引入 | / |Bert_Chinese_ID3433_for_PyTorch/transformers/examples/flax/vision/requirements.txt | https://download.pytorch.org/whl/torch_stable.html | 相关依赖 | -| 开发引入 | / |Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/jax-projects/big_bird/requirements.txt | https://github.com/huggingface/transformers@master | 相关依赖 | -| 开发引入 | / |Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/jax-projects/hybrid_clip/requirements.txt | https://download.pytorch.org/whl/torch_stable.html | 相关依赖 | -| 开发引入 | / |Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/lxmert/requirements.txt | https://github.com/huggingface/transformers.git | 相关依赖 | -| 开发引入 | / |Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/movement-pruning/requirements.txt | https://github.com/huggingface/transformers.git@352d5472b0c1dec0f420d606d16747d851b4bda8#egg=transformers | 相关依赖 | -| 开发引入 | / |Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/visual_bert/requirements.txt | https://github.com/huggingface/transformers.git | 相关依赖 | -| 开发引入 | / |Bert_Chinese_ID3433_for_PyTorch/transformers/tests/sagemaker/scripts/pytorch/requirements.txt | 
https://github.com/huggingface/transformers.git@master | 相关依赖 | -| 开发引入 | / |Bert_Chinese_ID3433_for_PyTorch/transformers/tests/sagemaker/scripts/tensorflow/requirements.txt | https://github.com/huggingface/transformers.git@master | 相关依赖 | +| 文件位置 | 公网地址 | 公网地址用途 | +|------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|----------------------| +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/.circleci/config.yml | ci@dummy.com | user.email配置邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/.circleci/config.yml | https://pytorch-geometric.com/whl/torch-1.11.0+cpu.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/CITATION.cff | https://www.aclweb.org/anthology/2020.emnlp-demos.6 | 相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/docker/transformers-all-latest-gpu/Dockerfile | https://data.pyg.org/whl/torch-$(python3 -c "from torch import version; print(version.__version__.split(''+'')[0])")+cu102.html | 三方库下载 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/docker/transformers-doc-builder/Dockerfile | https://data.pyg.org/whl/torch-$(python -c "from torch import version; print(version.__version__.split(''+'')[0])")+cpu.html | 三方库下载 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/docker/transformers-doc-builder/Dockerfile | https://pypi.ngc.nvidia.com | 三方库下载 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/docker/transformers-pytorch-gpu/Dockerfile | https://data.pyg.org/whl/torch-$(python3 -c "from torch import version; print(version.__version__.split(''+'')[0])")+cu102.html | 三方库下载 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/docker/transformers-pytorch-tpu/Dockerfile | https://repo.anaconda.com/miniconda/Miniconda3-4.7.12-Linux-x86_64.sh | miniconda链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/fsner/setup.py | msi.sayef@gmail.com | 作者邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/lxmert/utils.py | https://s3.amazonaws.com/models.huggingface.co/bert | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/performer/modeling_flax_performer.py | 
https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/performer/modeling_flax_performer.py | https://arxiv.org/abs/1607.06450 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py | https://s3.amazonaws.com/models.huggingface.co/bert/pplm/discriminators/clickbait_classifier_head.pt | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py | https://s3.amazonaws.com/models.huggingface.co/bert/pplm/discriminators/SST_classifier_head.pt | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py | https://s3.amazonaws.com/models.huggingface.co/bert/pplm/bow/technology.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py | https://s3.amazonaws.com/models.huggingface.co/bert/pplm/bow/space.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py | https://s3.amazonaws.com/models.huggingface.co/bert/pplm/bow/science.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py | https://s3.amazonaws.com/models.huggingface.co/bert/pplm/bow/religion.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py | https://s3.amazonaws.com/models.huggingface.co/bert/pplm/bow/politics.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py | https://s3.amazonaws.com/models.huggingface.co/bert/pplm/bow/military.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py | https://s3.amazonaws.com/models.huggingface.co/bert/pplm/bow/legal.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/quantization-qdqbert/Dockerfile | https://pypi.ngc.nvidia.com | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/rag/finetune_rag.py | https://docs.ray.io/en/master/cluster/index.html | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/rag-end2end-retriever/finetune_rag.py | https://docs.ray.io/en/master/cluster/index.html | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/examples/research_projects/visual_bert/utils.py | https://s3.amazonaws.com/models.huggingface.co/bert | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/scripts/fsmt/convert-facebook-wmt19.sh | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.ru-en.ensemble.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/scripts/fsmt/convert-facebook-wmt19.sh | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.ru-en.ensemble.tar.gz | 模型相关说明 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/scripts/fsmt/convert-facebook-wmt19.sh | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-de.joined-dict.ensemble.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/scripts/fsmt/convert-facebook-wmt19.sh | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-de.joined-dict.ensemble.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py | http://matrix.statmt.org/matrix/output/1902?run_id=6750 | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py | http://matrix.statmt.org/matrix/output/1909?run_id=6862 | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py | http://matrix.statmt.org/matrix/output/1914?run_id=6724 | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py | http://matrix.statmt.org/matrix/output/1907?run_id=6937 | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/setup.py | thomas@huggingface.co | 作者邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/commands/convert.py | https://www.tensorflow.org/install/ | 三方库下载 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/file_utils.py | https://s3.amazonaws.com/models.huggingface.co/bert | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/file_utils.py | https://pytorch.org/get-started/locally/ | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/file_utils.py | https://pandas.pydata.org/pandas-docs/stable/getting_started/install.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/file_utils.py | https://www.tensorflow.org/install | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/file_utils.py | https://pypi.ngc.nvidia.com | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/integrations.py | https://app.sigopt.com/experiment/{experiment.id} | experiment地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/modeling_flax_pytorch_utils.py | https://pytorch.org/ | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/modeling_flax_pytorch_utils.py | https://pytorch.org/ | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/modeling_flax_pytorch_utils.py | https://flax.readthedocs.io/en/latest/installation.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/modeling_flax_pytorch_utils.py | https://flax.readthedocs.io/en/latest/installation.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/modeling_tf_pytorch_utils.py | https://www.tensorflow.org/install/ | 三方库链接 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/modeling_tf_pytorch_utils.py | https://pytorch.org/ | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/modeling_utils.py | https://www.tensorflow.org/install/ | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/modeling_utils.py | https://pytorch.org/ | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/modeling_utils.py | https://pytorch.org/ | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/modeling_utils.py | https://flax.readthedocs.io/en/latest/installation.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/albert/modeling_albert.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/albert/modeling_albert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/albert/modeling_flax_albert.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/albert/modeling_tf_albert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bart/modeling_bart.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bart/modeling_bart.py | https://arxiv.org/abs/1910.13461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bart/modeling_flax_bart.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bart/modeling_tf_bart.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/beit/convert_beit_unilm_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 图片地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/beit/convert_beit_unilm_to_pytorch.py | https://unilm.blob.core.windows.net/beit/beit_base_patch16_224_pt22k_ft22kto1k.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/beit/modeling_beit.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/beit/modeling_flax_beit.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/modeling_bert.py | https://www.tensorflow.org/install/ | 三方库install | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/modeling_bert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/modeling_bert_benchmark.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/modeling_bert_benchmark.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/modeling_bert_for_a1.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/modeling_bert_for_a1.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/modeling_flax_bert.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert/modeling_tf_bert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert_generation/modeling_bert_generation.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bert_generation/modeling_bert_generation.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/big_bird/modeling_big_bird.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/big_bird/modeling_big_bird.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/big_bird/modeling_flax_big_bird.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/bigbird_pegasus/modeling_bigbird_pegasus.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_blenderbot.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_flax_blenderbot.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_tf_blenderbot.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_blenderbot_small.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 
源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_flax_blenderbot_small.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_tf_blenderbot_small.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/camembert/modeling_camembert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/camembert/modeling_tf_camembert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/canine/modeling_canine.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/canine/modeling_canine.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/clip/modeling_clip.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/clip/modeling_flax_clip.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/clip/modeling_tf_clip.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convbert/modeling_convbert.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convbert/modeling_convbert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convbert/modeling_tf_convbert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 图片地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_xlarge_22k_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_xlarge_22k_1k_384_ema.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_xlarge_22k_1k_224_ema.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_small_1k_224_ema.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_large_22k_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_large_22k_1k_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_large_22k_1k_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_large_1k_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_large_1k_224_ema.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_base_22k_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_base_22k_1k_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_base_22k_1k_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_base_1k_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_base_1k_224_ema.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_tiny_1k_224_ema.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_tiny_1k_224_ema.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convnext/modeling_convnext.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/convnext/modeling_tf_convnext.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/ctrl/modeling_ctrl.py | 
https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/ctrl/modeling_tf_ctrl.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/ctrl/tokenization_ctrl.py | https://raw.githubusercontent.com/salesforce/ctrl/master/ctrl-vocab.json | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/ctrl/tokenization_ctrl.py | https://raw.githubusercontent.com/salesforce/ctrl/master/ctrl-merges.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_audio.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_audio.py | https://arxiv.org/pdf/2202.03555 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_text.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_text.py | https://arxiv.org/pdf/2202.03555 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta/modeling_deberta.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta/modeling_deberta.py | https://arxiv.org/abs/2006.03654 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta/modeling_tf_deberta.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta/modeling_tf_deberta.py | https://arxiv.org/abs/2006.03654 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta_v2/modeling_deberta_v2.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta_v2/modeling_deberta_v2.py | https://arxiv.org/abs/2006.03654 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta_v2/modeling_tf_deberta_v2.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deberta_v2/modeling_tf_deberta_v2.py | https://arxiv.org/abs/2006.03654 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deit/convert_deit_timm_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 图片地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/deit/modeling_deit.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/detr/convert_detr_original_pytorch_checkpoint_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 图片地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/distilbert/modeling_distilbert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/distilbert/modeling_flax_distilbert.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/distilbert/modeling_tf_distilbert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/dit/convert_dit_unilm_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 图片地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/dit/convert_dit_unilm_to_pytorch.py | https://layoutlm.blob.core.windows.net/dit/dit-pts/dit-base-224-p16-500k-62d53a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/dpr/modeling_dpr.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/dpr/modeling_tf_dpr.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/electra/modeling_electra.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/electra/modeling_electra.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/electra/modeling_flax_electra.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/electra/modeling_tf_electra.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/encoder_decoder/modeling_encoder_decoder.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/encoder_decoder/modeling_encoder_decoder.py | https://arxiv.org/abs/1907.12461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/encoder_decoder/modeling_flax_encoder_decoder.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/encoder_decoder/modeling_flax_encoder_decoder.py | https://arxiv.org/abs/1907.12461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/encoder_decoder/modeling_tf_encoder_decoder.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/encoder_decoder/modeling_tf_encoder_decoder.py | https://arxiv.org/abs/1907.12461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/flaubert/modeling_flaubert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/flaubert/modeling_tf_flaubert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/fnet/modeling_fnet.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/fsmt/modeling_fsmt.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/funnel/modeling_funnel.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/funnel/modeling_funnel.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/funnel/modeling_funnel.py | https://arxiv.org/abs/2006.03236 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/funnel/modeling_tf_funnel.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/funnel/modeling_tf_funnel.py | https://arxiv.org/abs/2006.03236 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt_neo/modeling_flax_gpt_neo.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt_neo/modeling_gpt_neo.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt_neo/modeling_gpt_neo.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_flax_gpt2.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_gpt2.py | https://www.tensorflow.org/install/ | 三方库install | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_gpt2.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_tf_gpt2.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gptj/modeling_flax_gptj.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/gptj/modeling_gptj.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/hubert/modeling_hubert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/hubert/modeling_hubert.py | https://arxiv.org/abs/2106.07447 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/hubert/modeling_tf_hubert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/ibert/modeling_ibert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/imagegpt/modeling_imagegpt.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/imagegpt/modeling_imagegpt.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/layoutlm/modeling_layoutlm.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/layoutlm/modeling_layoutlm.py | https://arxiv.org/abs/1912.13318 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/layoutlm/modeling_tf_layoutlm.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/layoutlmv2/modeling_layoutlmv2.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/led/modeling_led.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/led/modeling_led.py | https://arxiv.org/abs/1910.13461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/led/modeling_tf_led.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/longformer/modeling_longformer.py | 
https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/longformer/modeling_longformer.py | https://arxiv.org/abs/2004.05150 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/longformer/modeling_tf_longformer.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/longformer/modeling_tf_longformer.py | https://arxiv.org/abs/2004.05150 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/longformer/modeling_tf_longformer.py | https://arxiv.org/abs/2004.05150 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/luke/modeling_luke.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/lxmert/modeling_lxmert.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/lxmert/modeling_lxmert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/lxmert/modeling_lxmert.py | https://arxiv.org/abs/1908.07490 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/lxmert/modeling_tf_lxmert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/lxmert/modeling_tf_lxmert.py | https://arxiv.org/abs/1908.07490 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/m2m_100/modeling_m2m_100.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/marian/convert_marian_tatoeba_to_pytorch.py | https://datahub.io/core/language-codes/r/language-codes-3b2.csv | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/marian/modeling_flax_marian.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/marian/modeling_marian.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/marian/modeling_tf_marian.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/maskformer/convert_maskformer_original_pytorch_checkpoint_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 图片地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/maskformer/modeling_maskformer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mbart/modeling_flax_mbart.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mbart/modeling_mbart.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mbart/modeling_tf_mbart.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/megatron_bert/modeling_megatron_bert.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/megatron_bert/modeling_megatron_bert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mmbt/modeling_mmbt.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mobilebert/modeling_mobilebert.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mobilebert/modeling_mobilebert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mobilebert/modeling_tf_mobilebert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mpnet/modeling_mpnet.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/mpnet/modeling_tf_mpnet.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/nystromformer/modeling_nystromformer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/openai/modeling_openai.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/openai/modeling_tf_openai.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_flax_pegasus.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_pegasus.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_tf_pegasus.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/perceiver/convert_perceiver_haiku_to_pytorch.py | https://storage.googleapis.com/perceiver_io/dalmation.jpg | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/perceiver/modeling_perceiver.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/perceiver/modeling_perceiver.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/plbart/modeling_plbart.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/poolformer/convert_poolformer_original_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 图片地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/poolformer/modeling_poolformer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/prophetnet/modeling_prophetnet.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/qdqbert/modeling_qdqbert.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/qdqbert/modeling_qdqbert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/rag/modeling_rag.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/rag/modeling_tf_rag.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/rag/retrieval_rag.py | https://storage.googleapis.com/huggingface-nlp/datasets/wiki_dpr/ | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/realm/modeling_realm.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/realm/modeling_realm.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/reformer/modeling_reformer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/reformer/modeling_reformer.py | https://arxiv.org/abs/2001.04451 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/rembert/modeling_rembert.py | https://www.tensorflow.org/install/ | 三方库install | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/rembert/modeling_rembert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/rembert/modeling_tf_rembert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/resnet/modeling_resnet.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/retribert/modeling_retribert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roberta/modeling_flax_roberta.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roberta/modeling_roberta.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roberta/modeling_tf_roberta.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roformer/modeling_flax_roformer.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roformer/modeling_roformer.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roformer/modeling_roformer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/roformer/modeling_tf_roformer.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/segformer/convert_segformer_original_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 图片地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/segformer/modeling_segformer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/sew/modeling_sew.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/sew/modeling_sew.py | https://arxiv.org/abs/2109.06870 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/sew_d/modeling_sew_d.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/sew_d/modeling_sew_d.py | https://arxiv.org/abs/2109.06870 | 论文地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/speech_encoder_decoder/modeling_flax_speech_encoder_decoder.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/speech_encoder_decoder/modeling_flax_speech_encoder_decoder.py | https://arxiv.org/abs/1907.12461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/speech_encoder_decoder/modeling_flax_speech_encoder_decoder.py | https://arxiv.org/abs/2104.06678 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/speech_encoder_decoder/modeling_speech_encoder_decoder.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/speech_encoder_decoder/modeling_speech_encoder_decoder.py | https://arxiv.org/abs/1907.12461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/speech_encoder_decoder/modeling_speech_encoder_decoder.py | https://arxiv.org/abs/2104.06678 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/speech_to_text/modeling_speech_to_text.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/speech_to_text/modeling_speech_to_text.py | https://arxiv.org/abs/1910.13461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/speech_to_text/modeling_tf_speech_to_text.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/speech_to_text_2/modeling_speech_to_text_2.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/splinter/modeling_splinter.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/squeezebert/modeling_squeezebert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/squeezebert/modeling_squeezebert.py | https://arxiv.org/abs/2006.11316 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/swin/convert_swin_timm_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 图片地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/swin/modeling_swin.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/modeling_flax_t5.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/modeling_flax_t5.py | https://arxiv.org/abs/1910.10683 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | https://arxiv.org/abs/1910.10683 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/modeling_tf_t5.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/t5/modeling_tf_t5.py | https://arxiv.org/abs/1910.10683 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/tapas/modeling_tapas.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/tapas/modeling_tapas.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/tapas/modeling_tf_tapas.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/transfo_xl/modeling_tf_transfo_xl.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/transfo_xl/modeling_transfo_xl.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/transfo_xl/modeling_transfo_xl.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/trocr/modeling_trocr.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/unispeech/modeling_unispeech.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/unispeech/modeling_unispeech.py | https://arxiv.org/abs/2101.07597 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | https://arxiv.org/abs/2006.11477 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/van/modeling_van.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vilt/convert_vilt_original_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 图片地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vilt/convert_vilt_original_to_pytorch.py | https://lil.nlp.cornell.edu/nlvr/exs/ex0_0.jpg | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vilt/convert_vilt_original_to_pytorch.py | https://lil.nlp.cornell.edu/nlvr/exs/ex0_0.jpg | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vilt/modeling_vilt.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/convert_trocr_unilm_to_pytorch.py | https://layoutlm.blob.core.windows.net/trocr/model_zoo/fairseq/trocr-base-handwritten.pt | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/convert_trocr_unilm_to_pytorch.py | https://www.researchgate.net/profile/Dinh-Sang/publication/338099565/figure/fig8/AS:840413229350922@1577381536857/An-receipt-example-in-the-SROIE-2019-dataset_Q640.jpg | 图片地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_flax_vision_encoder_decoder.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_flax_vision_encoder_decoder.py | https://arxiv.org/abs/1907.12461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_flax_vision_encoder_decoder.py | https://arxiv.org/abs/2109.10282 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_tf_vision_encoder_decoder.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_tf_vision_encoder_decoder.py | https://arxiv.org/abs/1907.12461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_tf_vision_encoder_decoder.py | https://arxiv.org/abs/2109.10282 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_vision_encoder_decoder.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_vision_encoder_decoder.py | https://arxiv.org/abs/1907.12461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_vision_encoder_decoder.py | https://arxiv.org/abs/2109.10282 | 论文地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_flax_vision_text_dual_encoder.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_flax_vision_text_dual_encoder.py | https://arxiv.org/abs/2111.07991 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_flax_vision_text_dual_encoder.py | https://farm3.staticflickr.com/2674/5850229113_4fe05d5265_z.jpg | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_vision_text_dual_encoder.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_vision_text_dual_encoder.py | https://arxiv.org/abs/2111.07991 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/visual_bert/modeling_visual_bert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vit/convert_dino_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 图片地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vit/convert_vit_timm_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 图片地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vit/modeling_flax_vit.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vit/modeling_tf_vit.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vit/modeling_vit.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vit_mae/convert_vit_mae_to_pytorch.py | https://dl.fbaipublicfiles.com/mae/visualize/mae_visualize_vit_base.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vit_mae/convert_vit_mae_to_pytorch.py | https://user-images.githubusercontent.com/11435359/147738734-196fd92f-9260-48d5-ba7e-bf103d29364d.jpg | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/vit_mae/modeling_vit_mae.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_flax_wav2vec2.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_flax_wav2vec2.py | https://arxiv.org/abs/2006.11477 | 论文地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_tf_wav2vec2.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_wav2vec2.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_wav2vec2.py | https://arxiv.org/abs/2006.11477 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wavlm/modeling_wavlm.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/wavlm/modeling_wavlm.py | https://arxiv.org/abs/2101.07597 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xglm/modeling_flax_xglm.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xglm/modeling_xglm.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm/modeling_tf_xlm.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm/modeling_xlm.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm_roberta/modeling_flax_xlm_roberta.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm_roberta/modeling_tf_xlm_roberta.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm_roberta/modeling_xlm_roberta.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlm_roberta_xl/modeling_xlm_roberta_xl.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlnet/modeling_tf_xlnet.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlnet/modeling_xlnet.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/xlnet/modeling_xlnet.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/models/yoso/modeling_yoso.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/pipelines/base.py | https://www.tensorflow.org/install/ | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/pipelines/base.py | https://pytorch.org/ | 三方库说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/src/transformers/trainer_tf.py | https://docs.wandb.com/huggingface | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_{{cookiecutter.lowercase_modelname}}.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_{{cookiecutter.lowercase_modelname}}.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_{{cookiecutter.lowercase_modelname}}.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_{{cookiecutter.lowercase_modelname}}.py | https://arxiv.org/abs/1910.13461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | https://arxiv.org/abs/1910.13461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | https://arxiv.org/abs/1910.13461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/utils/download_glue_data.py | firebase-adminsdk-0khhl@mtl-sentence-representations.iam.gserviceaccount.com | glue数据集diagnostic链接 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/utils/download_glue_data.py | https://dl.fbaipublicfiles.com/senteval/senteval_data/msr_paraphrase_train.txt | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/utils/download_glue_data.py | https://dl.fbaipublicfiles.com/senteval/senteval_data/msr_paraphrase_test.txt | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/utils/download_glue_data.py | https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FWNLI.zip?alt=media&token=068ad0a0-ded7-4bd7-99a5-5e00222e0faf | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/utils/download_glue_data.py | https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FSTS-B.zip?alt=media&token=bddb94a7-8706-4e0d-a694-1109e12273b5 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/utils/download_glue_data.py | https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FSST-2.zip?alt=media&token=aabc5f6b-e466-44a2-b9b4-cf6337f84ac8 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/utils/download_glue_data.py | https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FSNLI.zip?alt=media&token=4afcfbb2-ff0c-4b2d-a09a-dbf07926f4df | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/utils/download_glue_data.py | https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FRTE.zip?alt=media&token=5efa7e85-a0bb-4f19-8ea2-9e1840f077fb | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/utils/download_glue_data.py | https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FQQP.zip?alt=media&token=700c6acf-160d-4d89-81d1-de4191d02cb5 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/utils/download_glue_data.py | https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FQNLIv2.zip?alt=media&token=6fdcf570-0fc5-4631-8456-9505272d1601 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/utils/download_glue_data.py | https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FMNLI.zip?alt=media&token=50329ea1-e339-40e2-809c-10c40afff3ce | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/utils/download_glue_data.py | https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FCoLA.zip?alt=media&token=46d5e637-3411-4188-bc44-5809b5bfb5f4 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/utils/download_glue_data.py | https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2Fmrpc_dev_ids.tsv?alt=media&token=ec5c0836-31d5-48f4-b431-7480817f1adc | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/transformers/utils/download_glue_data.py | 
https://storage.googleapis.com/mtl-sentence-representations.appspot.com/tsvsWithoutLabels%2FAX.tsv?GoogleAccessId=firebase-adminsdk-0khhl@mtl-sentence-representations.iam.gserviceaccount.com&Expires=2498860800&Signature=DuQ2CSPt2Yfre0C%2BiISrVYrIFaZH1Lc7hBVZDD4ZyR7fZYOMNOUGpi8QxBmTNOrNPjR3z1cggo7WXFfrgECP6FBJSsURv8Ybrue8Ypt%2FTPxbuJ0Xc2FhDi%2BarnecCBFO77RSbfuz%2Bs95hRrYhTnByqu3U%2FYZPaj3tZt5QdfpH2IUROY8LiBXoXS46LE%2FgOQc%2FKN%2BA9SoscRDYsnxHfG0IjXGwHN%2Bf88q6hOmAxeNPx6moDulUF6XMUAaXCSFU%2BnRO2RDL9CapWxj%2BDl7syNyHhB7987hZ80B%2FwFkQ3MEs8auvt5XW1%2Bd4aCU7ytgM69r8JDCwibfhZxpaa4gd50QXQ%3D%3D | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/url.ini | https://unilm.blob.core.windows.net/beit/beit_base_patch16_224_pt22k_ft22kto1k.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Bert_Chinese_ID3433_for_PyTorch/url.ini | https://scikit-learn.org/stable/modules/generated/sklearn.metrics.accuracy_score.html | 相关说明 | \ No newline at end of file diff --git a/PyTorch/built-in/nlp/CPM_Finetune_for_PyTorch/public_address_statement.md b/PyTorch/built-in/nlp/CPM_Finetune_for_PyTorch/public_address_statement.md index 08da3bf4ebcba9acb85ee5329814dcee36a96c2c..60ad5d9d349b6066283e4d560b1ef4bd23e13429 100644 --- a/PyTorch/built-in/nlp/CPM_Finetune_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/nlp/CPM_Finetune_for_PyTorch/public_address_statement.md @@ -1,35 +1,17 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ---- | ------------ | ------ | ------------------------------------ | -------- | -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/BERT/tokenization.py |CPM_Finetune_for_PyTorch/data_utils/wordpiece.py |https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt|'bert-base-uncased'模型在开源社区上的bert-base-uncased-vocab.txt的下载链接| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/BERT/tokenization.py |CPM_Finetune_for_PyTorch/data_utils/wordpiece.py |https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-vocab.txt|'bert-large-uncased'模型在开源社区上的bert-large-uncased-vocab.txt的下载链接| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/BERT/tokenization.py |CPM_Finetune_for_PyTorch/data_utils/wordpiece.py |https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased-vocab.txt|'bert-base-cased'模型在开源社区上的bert-base-cased-vocab.txt的下载链接| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/BERT/tokenization.py |CPM_Finetune_for_PyTorch/data_utils/wordpiece.py |https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-vocab.txt|'bert-large-cased'模型在开源社区上的bert-large-cased-vocab.txt的下载链接| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/BERT/tokenization.py |CPM_Finetune_for_PyTorch/data_utils/wordpiece.py |https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-uncased-vocab.txt|'bert-base-multilingual-uncased'模型在开源社区上的bert-base-multilingual-uncased-vocab.txt的下载链接| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/BERT/tokenization.py |CPM_Finetune_for_PyTorch/data_utils/wordpiece.py |https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-cased-vocab.txt|'bert-base-multilingual-cased'模型在开源社区上的bert-base-multilingual-cased-vocab.txt的下载链接| -| 开源代码引入 | 
https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/BERT/tokenization.py |CPM_Finetune_for_PyTorch/data_utils/wordpiece.py |https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-chinese-vocab.txt|'bert-base-chinese'模型在开源社区上的bert-base-chinese-vocab.txt的下载链接| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/BERT/modeling.py |CPM_Finetune_for_PyTorch/model/modeling.py |https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased.tar.gz|'bert-base-uncased'模型在开源社区上的bert-base-uncased.tar.gz的下载链接| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/BERT/modeling.py |CPM_Finetune_for_PyTorch/model/modeling.py |https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased.tar.gz|'bert-large-uncased'模型在开源社区上的bert-large-uncased.tar.gz的下载链接| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/BERT/modeling.py |CPM_Finetune_for_PyTorch/model/modeling.py |https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased.tar.gz|'bert-base-cased'模型在开源社区上的bert-base-cased.tar.gz的下载链接| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/BERT/modeling.py |CPM_Finetune_for_PyTorch/model/modeling.py |https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased.tar.gz|'bert-large-cased'模型在开源社区上的bert-large-cased.tar.gz的下载链接| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/BERT/modeling.py |CPM_Finetune_for_PyTorch/model/modeling.py |https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-uncased.tar.gz|'bert-base-multilingual-uncased'模型在开源社区上的bert-base-multilingual-uncased.tar.gz的下载链接| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/BERT/modeling.py |CPM_Finetune_for_PyTorch/model/modeling.py |https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-cased.tar.gz|'bert-base-multilingual-cased'模型在开源社区上的bert-base-multilingual-cased.tar.gz的下载链接| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/BERT/modeling.py |CPM_Finetune_for_PyTorch/model/modeling.py |https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-chinese.tar.gz|'bert-base-chinese'模型在开源社区上的bert-base-chinese.tar.g的下载链接| -| 开源代码引入 | https://github.com/TsinghuaAI/CPM-2-Finetune/blob/main/fp16/fp16.py|CPM_Finetune_for_PyTorch/mpu/random.py | https://github.com/pytorch/pytor | 源码实现 | -| 开源代码引入 | https://github.com/TsinghuaAI/CPM-2-Finetune/blob/main/fp16/fp16.py|CPM_Finetune_for_PyTorch/mpu/layers.py | https://github.com/pytorch/pytor | 源码实现 | -| 开源代码引入 | https://github.com/TsinghuaAI/CPM-2-Finetune/blob/main/fp16/fp16.py|CPM_Finetune_for_PyTorch/mpu/grads.py | https://github.com/pytorch/pytor | 源码实现 | -| 开发引入 | / | CPM_Finetune_for_PyTorch/model/modeling.py | https://www.tensorflow.org/instal | 模型相关说明 | -| 开发引入 | / | CPM_Finetune_for_PyTorch/model/modeling.py | https://www.github.com/nvidia/ap | 模型相关说明 | -| 开发引入 | / | CPM_Finetune_for_PyTorch/model/modeling.py | https://github.com/pytorch/pytorch/pull/56 | 源码实现 | -| 开源代码引入 | https://github.com/TsinghuaAI/CPM-2-Finetune/blob/main/fp16/fp16util.py|CPM_Finetune_for_PyTorch/fp16/fp16util.py | http://on-demand.gputechconf.com/gtc/2018/video/S8101 | 模型相关说明 | -| 开源代码引入 | https://github.com/TsinghuaAI/CPM-2-Finetune/blob/main/fp16/fp16util.py|CPM_Finetune_for_PyTorch/fp16/fp16util.py | 
http://pytorch.org/docs/master/_modules/torch/_utils.ht | 模型相关说明 | -| 开源代码引入 | https://github.com/TsinghuaAI/CPM-2-Finetune/blob/main/fp16/fp16.py|CPM_Finetune_for_PyTorch/fp16/fp16.py | https://github.com/pytorch/pytorch/issues/77 | 模型相关说明 | -| 开源代码引入 | https://github.com/TsinghuaAI/CPM-2-Finetune/blob/main/fp16/fp16.py|CPM_Finetune_for_PyTorch/fp16/fp16.py | http://pytorch.org/docs/master/optim.html#optimizer-step-closu | 模型相关说明 | -| 开发引入 | / | CPM_Finetune_for_PyTorch/data_utils/wordpiece.py | https://github.com/huggingface/pytorch-pretrained-BERT/blob/master/pytorch_pretrained_bert/tokenization. | 源码实现 | -| 开发引入 | / | CPM_Finetune_for_PyTorch/data_utils/wordpiece.py | https://en.wikipedia.org/wiki/CJK_Unified_Ideograph | 模型相关说明 | -| 开发引入 | / | CPM_Finetune_for_PyTorch/data_utils/file_utils.py | https://github.com/huggingface/pytorch-pretrained-BE | 源码实现 | -| 开发引入 | / | CPM_Finetune_for_PyTorch/data_utils/file_utils.py | https://github.com/allenai/allenn | 源码实现 | -| 开发引入 | / | CPM_Finetune_for_PyTorch/data_utils/datasets.py | https://github.com/google-research/bert/blob/master/create_pretraining_data.py#L248-L2 | 源码实现 | -| 开发引入 | / | CPM_Finetune_for_PyTorch/data_utils/datasets.py | https://github.com/google-research/bert/blob/master/create_pretraining_data.py#L3 | 源码实现 | -| 开发引入 | / | CPM_Finetune_for_PyTorch/data_utils/datasets.py | https://arxiv.org/pdf/1810.04805.p | 模型相关说明 | -| 开发引入 | / | CPM_Finetune_for_PyTorch/data_utils/datasets.py | https://github.com/google-research/bert/blob/master/create_pretraining_data.py#L3 | 源码实现 | -| 开发引入 | / | CPM_Finetune_for_PyTorch/data/dataset_utils.py | https://github.com/google-research/albert/blob/master/create_pretraining_data. | 源码实现 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|----------------------------------------------------------------------------------------|----------------------------------------------------------------------------------------------|-------------| +| ModelZoo-PyTorch/PyTorch/built-in/nlp/CPM_Finetune_for_PyTorch/data_utils/wordpiece.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-vocab.txt | 相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/CPM_Finetune_for_PyTorch/data_utils/wordpiece.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-vocab.txt | 相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/CPM_Finetune_for_PyTorch/data_utils/wordpiece.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt | 相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/CPM_Finetune_for_PyTorch/data_utils/wordpiece.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-uncased-vocab.txt | 相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/CPM_Finetune_for_PyTorch/data_utils/wordpiece.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-cased-vocab.txt | 相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/CPM_Finetune_for_PyTorch/data_utils/wordpiece.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-chinese-vocab.txt | 相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/CPM_Finetune_for_PyTorch/data_utils/wordpiece.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased-vocab.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/CPM_Finetune_for_PyTorch/model/modeling.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/CPM_Finetune_for_PyTorch/model/modeling.py | 
https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased.tar.gz | 相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/CPM_Finetune_for_PyTorch/model/modeling.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased.tar.gz | 相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/CPM_Finetune_for_PyTorch/model/modeling.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased.tar.gz | 相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/CPM_Finetune_for_PyTorch/model/modeling.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-uncased.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/CPM_Finetune_for_PyTorch/model/modeling.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-cased.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/CPM_Finetune_for_PyTorch/model/modeling.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-chinese.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/CPM_Finetune_for_PyTorch/model/modeling.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased.tar.gz | 权重地址 | \ No newline at end of file diff --git a/PyTorch/built-in/nlp/Data2vec_for_PyTorch/public_address_statement.md b/PyTorch/built-in/nlp/Data2vec_for_PyTorch/public_address_statement.md index d6d4146d995cac91819bcf81a4bdf6011accebf6..c960c75127537bafc54e54a9b7f73bbe4b4fe8ce 100644 --- a/PyTorch/built-in/nlp/Data2vec_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/nlp/Data2vec_for_PyTorch/public_address_statement.md @@ -1,239 +1,104 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ---- | ------------ | ------ | ------------------------------------ | -------- | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/.gitmodules|Data2vec_for_PyTorch/.gitmodules | https://github.com/ngoyal2707/Megatron-LM | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/setup.py|Data2vec_for_PyTorch/setup.py | https://stackoverflow.com/a/54128391 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/setup.py|Data2vec_for_PyTorch/setup.py | https://download.pytorch.org/whl/cpu/torch-1.7.0%2Bcpu-cp36-cp36m-linux_x86_64.whl | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/setup.py|Data2vec_for_PyTorch/setup.py | https://bit.ly/2NLVsgE | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/.github/ISSUE_TEMPLATE.md|Data2vec_for_PyTorch/setup.py | https://github.com/pytorch/fairseq | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/data/audio/feature_transforms/specaugment.py|Data2vec_for_PyTorch/fairseq/data/audio/feature_transforms/specaugment.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/textless_nlp/dgslm/hubert_fisher/README.md|Data2vec_for_PyTorch/fairseq/models/speech_dlm/modules/speech_dlm_decoder.py | https://arxiv.org/pdf/2203.16502.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/textless_nlp/dgslm/hubert_fisher/README.md|Data2vec_for_PyTorch/fairseq/models/speech_dlm/modules/speech_dlm_decoder_layer.py | https://arxiv.org/pdf/2203.16502.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/textless_nlp/dgslm/hubert_fisher/README.md|Data2vec_for_PyTorch/fairseq/models/speech_dlm/sequence_generator/multichannel_sequence_generator.py | https://arxiv.org/pdf/2203.16502.pdf | 
参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_dlm/sequence_generator/multichannel_search.py|Data2vec_for_PyTorch/fairseq/models/speech_dlm/sequence_generator/multichannel_search.py | https://arxiv.org/abs/1904.09751 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/modules/convolution.py|Data2vec_for_PyTorch/fairseq/models/speech_to_text/modules/convolution.py | https://arxiv.org/abs/1911.08460 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/modules/emformer.py|Data2vec_for_PyTorch/fairseq/models/speech_to_text/modules/emformer.py | https://arxiv.org/abs/1803.02155 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/modules/convolution.py|Data2vec_for_PyTorch/fairseq/models/speech_to_text/modules/convolution.py | https://github.com/espnet/espnet | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/modules/augmented_memory_attention.py|Data2vec_for_PyTorch/fairseq/models/speech_to_text/modules/augmented_memory_attention.py | https://arxiv.org/abs/2005.08042 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/modules/augmented_memory_attention.py|Data2vec_for_PyTorch/fairseq/models/speech_to_text/modules/augmented_memory_attention.py | https://arxiv.org/abs/2005.09137 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_dlm/sequence_generator/multichannel_sequence_generator.py|Data2vec_for_PyTorch/fairseq/models/speech_dlm/sequence_generator/multichannel_sequence_generator.py | https://discuss.pytorch.org/t/how-to-mask-and-assign-a-value-to-tensor/18437 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/modules/emformer.py|Data2vec_for_PyTorch/fairseq/models/speech_to_text/modules/emformer.py | https://arxiv.org/abs/2005.09684 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/docs/conf.py|Data2vec_for_PyTorch/docs/conf.py | https://github.com/pytorch/fairseq/tree/main/docs/ | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/docs/conf.py|Data2vec_for_PyTorch/docs/conf.py | http://docs.scipy.org/doc/numpy/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/docs/conf.py|Data2vec_for_PyTorch/docs/conf.py | https://docs.python.org/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/docs/conf.py|Data2vec_for_PyTorch/docs/conf.py | https://pytorch.org/docs/master/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/file_utils.py|Data2vec_for_PyTorch/fairseq/file_utils.py | https://github.com/allenai/allennlp | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/bart/README.md|Data2vec_for_PyTorch/fairseq/file_utils.py | https://github.com/huggingface | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/checkpoint_utils.py|Data2vec_for_PyTorch/fairseq/checkpoint_utils.py | https://pypi.org/project/huggingface-hub/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/layerdrop/README.md|Data2vec_for_PyTorch/fairseq/checkpoint_utils.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/main/examples/constrained_decoding/README.md|Data2vec_for_PyTorch/fairseq/search.py | https://www.aclweb.org/anthology/N18-1119/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/constrained_decoding/README.md|Data2vec_for_PyTorch/fairseq/search.py | https://www.aclweb.org/anthology/N19-1090/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_dlm/sequence_generator/multichannel_search.py|Data2vec_for_PyTorch/fairseq/search.py | https://arxiv.org/abs/1904.09751 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/search.py|Data2vec_for_PyTorch/fairseq/search.py | https://arxiv.org/abs/1611.08562 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model.py|Data2vec_for_PyTorch/fairseq/trainer.py | https://openreview.net/forum?id=_CMSV7FTzGI | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model.py|Data2vec_for_PyTorch/fairseq/trainer.py | https://openreview.net/forum?id=_CMSV7FTzGI | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/wav2vec/unsupervised/w2vu_generate.py|Data2vec_for_PyTorch/fairseq_cli/hydra_train.py | https://github.com/facebookresearch/hydra/issues/1126 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/wav2vec/unsupervised/w2vu_generate.py|Data2vec_for_PyTorch/fairseq_cli/hydra_validate.py | https://github.com/facebookresearch/hydra/issues/1126 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/wav2vec/unsupervised/w2vu_generate.py|Data2vec_for_PyTorch/fairseq_cli/train.py | https://github.com/facebookresearch/hydra/issues/1126 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/scripts/build_sym_alignment.py|Data2vec_for_PyTorch/scripts/build_sym_alignment.py | http://github.com/clab/fast_align | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/scripts/build_sym_alignment.py|Data2vec_for_PyTorch/scripts/build_sym_alignment.py | http://github.com/moses-smt/mosesdecoder | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/scripts/build_sym_alignment.py|Data2vec_for_PyTorch/scripts/build_sym_alignment.py | http://www.statmt.org/moses/?n=Development.GetStarted | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/data/indexed_dataset.py|Data2vec_for_PyTorch/tests/test_token_block_dataset.py | https://github.com/numpy/numpy/issues/5745 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/bart/README.summarization.md|Data2vec_for_PyTorch/examples/roberta/multiprocessing_bpe_encoder.py | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/encoder.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/bart/README.summarization.md|Data2vec_for_PyTorch/examples/roberta/multiprocessing_bpe_encoder.py | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/vocab.bpe | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/roberta/preprocess_GLUE_tasks.sh|Data2vec_for_PyTorch/examples/roberta/preprocess_GLUE_tasks.sh | https://gist.github.com/W4ngatang/60c2bdb54d156a41194446737ce03e2e | 模型相关说明 | -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/main/examples/bart/README.summarization.md|Data2vec_for_PyTorch/examples/roberta/preprocess_GLUE_tasks.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/encoder.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/bart/README.summarization.md|Data2vec_for_PyTorch/examples/roberta/preprocess_RACE.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/encoder.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/bart/README.summarization.md|Data2vec_for_PyTorch/examples/roberta/preprocess_GLUE_tasks.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/vocab.bpe | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/bart/README.summarization.md|Data2vec_for_PyTorch/examples/roberta/preprocess_RACE.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/vocab.bpe | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/bart/README.summarization.md|Data2vec_for_PyTorch/examples/roberta/preprocess_GLUE_tasks.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/dict.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/bart/README.summarization.md|Data2vec_for_PyTorch/examples/roberta/preprocess_RACE.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/dict.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/criterions/adaptive_loss.py|Data2vec_for_PyTorch/fairseq/criterions/adaptive_loss.py | http://arxiv.org/abs/1609.04309 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/textless_nlp/dgslm/hubert_fisher/README.md|Data2vec_for_PyTorch/fairseq/criterions/speech_dlm_criterion.py | https://arxiv.org/pdf/2203.16502.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/criterions/tacotron2_loss.py|Data2vec_for_PyTorch/fairseq/criterions/tacotron2_loss.py | https://arxiv.org/abs/1710.08969 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/data/indexed_dataset.py|Data2vec_for_PyTorch/fairseq/data/indexed_dataset.py | https://github.com/numpy/numpy/issues/5745 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/wav2vec/README.md|Data2vec_for_PyTorch/fairseq/data/data_utils.py | https://arxiv.org/abs/1910.05453 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/wav2vec/README.md|Data2vec_for_PyTorch/fairseq/data/mask_tokens_dataset.py | https://arxiv.org/abs/1910.05453 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/textless_nlp/dgslm/hubert_fisher/README.md|Data2vec_for_PyTorch/fairseq/data/speech_dlm_dataset.py | https://arxiv.org/pdf/2203.16502.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/data/span_mask_tokens_dataset.py|Data2vec_for_PyTorch/fairseq/data/span_mask_tokens_dataset.py | https://github.com/google-research/text-to-text-transfer-transformer/blob/84f8bcc14b5f2c03de51bd3587609ba8f6bbd1cd/t5/data/preprocessors.py#L2682 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/dataclass/constants.py|Data2vec_for_PyTorch/fairseq/dataclass/constants.py | https://github.com/facebookresearch/hydra/issues/1156 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/dataclass/configs.py|Data2vec_for_PyTorch/fairseq/dataclass/configs.py | 
https://fairscale.readthedocs.io/en/latest/api/experimental/nn/slowmo_ddp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/dataclass/utils.py|Data2vec_for_PyTorch/fairseq/dataclass/utils.py | https://github.com/omry/omegaconf/pull/911 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/dataclass/configs.py|Data2vec_for_PyTorch/fairseq/dataclass/configs.py | https://github.com/facebookresearch/hydra/issues/1117 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/fairseq_incremental_decoder.py|Data2vec_for_PyTorch/fairseq/models/fairseq_incremental_decoder.py | http://www.telesens.co/2019/04/21/understanding-incremental-decoding-in-fairseq/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/conv_seq2seq/README.md|Data2vec_for_PyTorch/fairseq/models/fconv.py | https://arxiv.org/abs/1705.03122 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/docs/getting_started.rst|Data2vec_for_PyTorch/fairseq/models/fconv.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt14.v2.en-fr.fconv-py.tar.bz2 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/conv_seq2seq/README.md|Data2vec_for_PyTorch/fairseq/models/fconv.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt14.en-de.fconv-py.tar.bz2 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/conv_seq2seq/README.md|Data2vec_for_PyTorch/fairseq/models/fconv.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt17.v2.en-de.fconv-py.tar.bz2 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/fconv_self_att.py|Data2vec_for_PyTorch/fairseq/models/fconv_self_att.py | https://dl.fbaipublicfiles.com/fairseq/models/stories_checkpoint.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/fconv_self_att.py|Data2vec_for_PyTorch/fairseq/models/fconv_self_att.py | https://dl.fbaipublicfiles.com/fairseq/models/stories_checkpoint.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/lightconv.py|Data2vec_for_PyTorch/fairseq/models/lightconv.py | https://openreview.net/pdf?id=SkVhlh09tX | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/stories/README.md|Data2vec_for_PyTorch/fairseq/models/fconv_self_att.py | https://dl.fbaipublicfiles.com/fairseq/data/stories_test.tar.bz2 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/pay_less_attention_paper/README.md|Data2vec_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/iwslt14.de-en.lightconv.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/pay_less_attention_paper/README.md|Data2vec_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/iwslt14.de-en.dynamicconv.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/pay_less_attention_paper/README.md|Data2vec_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.lightconv.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/pay_less_attention_paper/README.md|Data2vec_for_PyTorch/fairseq/models/lightconv.py | 
https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.dynamicconv.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/pay_less_attention_paper/README.md|Data2vec_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.lightconv-glu.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/pay_less_attention_paper/README.md|Data2vec_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.dynamicconv-glu.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/pay_less_attention_paper/README.md|Data2vec_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.lightconv-glu.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/pay_less_attention_paper/README.md|Data2vec_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.dynamicconv-glu.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/pay_less_attention_paper/README.md|Data2vec_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt14.en-fr.joined-dict.lightconv-glu.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/pay_less_attention_paper/README.md|Data2vec_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt14.en-fr.joined-dict.dynamicconv-glu.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/pay_less_attention_paper/README.md|Data2vec_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt17.zh-en.lightconv-glu.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/pay_less_attention_paper/README.md|Data2vec_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt17.zh-en.dynamicconv-glu.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/language_model/README.adaptive_inputs.md|Data2vec_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/adaptive_lm_gbw_huge.tar.bz2 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/language_model/README.adaptive_inputs.md|Data2vec_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/adaptive_lm_wiki103.v2.tar.bz2 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer_lm.py|Data2vec_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.en.tar.bz2 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer_lm.py|Data2vec_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.de.tar.bz2 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer_lm.py|Data2vec_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.ru.tar.bz2 | 模型相关说明 | -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer_lm.py|Data2vec_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt20.en.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer_lm.py|Data2vec_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt20.ta.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer_lm.py|Data2vec_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt20.iu.news.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer_lm.py|Data2vec_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt20.iu.nh.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/criterions/adaptive_loss.py|Data2vec_for_PyTorch/fairseq/modules/adaptive_softmax.py | http://arxiv.org/abs/1609.04309 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/modules/character_token_embedder.py|Data2vec_for_PyTorch/fairseq/modules/character_token_embedder.py | https://arxiv.org/abs/1505.00387 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/s2t_conformer.py|Data2vec_for_PyTorch/fairseq/modules/conformer_layer.py | https://arxiv.org/abs/2005.08100 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/nonautoregressive_translation/README.md|Data2vec_for_PyTorch/fairseq/modules/dynamic_crf_layer.py | https://arxiv.org/abs/1910.11555 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/modules/dynamic_crf_layer.py|Data2vec_for_PyTorch/fairseq/modules/dynamic_crf_layer.py | https://github.com/kmkurn/pytorch-crf/blob/master/torchcrf/__init__.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/truncated_bptt/README.md|Data2vec_for_PyTorch/fairseq/modules/espnet_multihead_attention.py | https://arxiv.org/abs/1901.02860 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/truncated_bptt/README.md|Data2vec_for_PyTorch/fairseq/modules/espnet_multihead_attention.py | https://arxiv.org/abs/1901.02860 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/modules/gelu.py|Data2vec_for_PyTorch/fairseq/modules/gelu.py | https://github.com/hendrycks/GELUs | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/truncated_bptt/README.md|Data2vec_for_PyTorch/fairseq/modules/espnet_multihead_attention.py | https://arxiv.org/abs/1901.02860 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/layerdrop/README.md|Data2vec_for_PyTorch/fairseq/modules/layer_drop.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/modules/location_attention.py|Data2vec_for_PyTorch/fairseq/modules/location_attention.py | https://arxiv.org/pdf/1506.07503.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/modules/lstm_cell_with_zoneout.py|Data2vec_for_PyTorch/fairseq/modules/lstm_cell_with_zoneout.py | https://arxiv.org/abs/1606.01305 | 参考论文地址 | -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/main/examples/truncated_bptt/README.md|Data2vec_for_PyTorch/fairseq/modules/positional_encoding.py | https://arxiv.org/abs/1901.02860 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/modules/rotary_positional_embedding.py|Data2vec_for_PyTorch/fairseq/modules/rotary_positional_embedding.py | https://blog.eleuther.ai/rotary-embeddings/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/modules/rotary_positional_embedding.py|Data2vec_for_PyTorch/fairseq/modules/rotary_positional_embedding.py | https://arxiv.org/pdf/2104.09864.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/modules/sparse_multihead_attention.py|Data2vec_for_PyTorch/fairseq/modules/sparse_multihead_attention.py | https://arxiv.org/pdf/1904.10509.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/modules/vggblock.py|Data2vec_for_PyTorch/fairseq/modules/vggblock.py | https://arxiv.org/pdf/1409.1556.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/optim/adafactor.py|Data2vec_for_PyTorch/fairseq/optim/adafactor.py | https://arxiv.org/abs/1804.04235 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/optim/adam.py|Data2vec_for_PyTorch/fairseq/optim/adamax.py | https://arxiv.org/abs/1412.6980 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/optim/adam.py|Data2vec_for_PyTorch/fairseq/optim/adam.py | https://arxiv.org/abs/1711.05101 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/optim/adam.py|Data2vec_for_PyTorch/fairseq/optim/adam.py | https://arxiv.org/abs/1412.6980 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/optim/adam.py|Data2vec_for_PyTorch/fairseq/optim/adam.py | https://openreview.net/forum?id=ryQu7f-RZ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/optim/bmuf.py|Data2vec_for_PyTorch/fairseq/optim/bmuf.py | https://ieeexplore.ieee.org/document/7472805 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/optim/adam.py|Data2vec_for_PyTorch/fairseq/optim/fused_adam.py | https://arxiv.org/abs/1412.6980 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/optim/adam.py|Data2vec_for_PyTorch/fairseq/optim/fused_adam.py | https://openreview.net/forum?id=ryQu7f-RZ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/scaling_nmt/README.md|Data2vec_for_PyTorch/fairseq/scoring/tokenizer.py | https://github.com/mjpost/sacrebleu | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/cross_lingual_language_model/README.md|Data2vec_for_PyTorch/fairseq/tasks/cross_lingual_lm.py | https://arxiv.org/pdf/1901.07291.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/tasks/fairseq_task.py|Data2vec_for_PyTorch/fairseq/tasks/fairseq_task.py | https://arxiv.org/abs/2010.00904 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/tasks/fairseq_task.py|Data2vec_for_PyTorch/fairseq/tasks/fairseq_task.py | https://github.com/facebookresearch/GENRE | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/textless_nlp/dgslm/hubert_fisher/README.md|Data2vec_for_PyTorch/fairseq/tasks/speech_dlm_task.py | 
https://arxiv.org/pdf/2203.16502.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/nonautoregressive_translation/README.md|Data2vec_for_PyTorch/fairseq/tasks/translation_lev.py | https://arxiv.org/abs/1905.11006 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/tasks/translation_lev.py|Data2vec_for_PyTorch/fairseq/tasks/translation_lev.py | https://www.aclweb.org/anthology/2020.acl-main.325/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/speech_synthesis/utils.py|Data2vec_for_PyTorch/fairseq/tasks/text_to_speech.py | https://arxiv.org/pdf/2011.03568.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/hydra_plugins/dependency_submitit_launcher/setup.py|Data2vec_for_PyTorch/hydra_plugins/dependency_submitit_launcher/setup.py | abaevski@fb.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/docs/getting_started.rst|Data2vec_for_PyTorch/tests/speech/test_convtransformer_simul_trans.py | https://dl.fbaipublicfiles.com/fairseq/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/speech_text_joint_to_text/docs/pre-training.md|Data2vec_for_PyTorch/tests/speech/test_dual_input_wav_transformer.py | https://dl.fbaipublicfiles.com/joint_speech_text_4_s2t/acl2022/librispeech/finetuned | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/docs/getting_started.rst|Data2vec_for_PyTorch/tests/speech/test_s2s_transformer.py | https://dl.fbaipublicfiles.com/fairseq/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/docs/getting_started.rst|Data2vec_for_PyTorch/tests/speech/test_wav2vec2.py | https://dl.fbaipublicfiles.com/fairseq | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/docs/getting_started.rst|Data2vec_for_PyTorch/tests/speech/__init__.py | https://dl.fbaipublicfiles.com/fairseq | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/speech_text_joint_to_text/docs/ende-mustc.md|Data2vec_for_PyTorch/tests/speech/__init__.py | https://dl.fbaipublicfiles.com/joint_speech_text_4_s2t/must_c/en_de | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/tests/speech_recognition/asr_test_base.py|Data2vec_for_PyTorch/tests/speech_recognition/asr_test_base.py | https://fburl.com/batch_first_example | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/tests/speech_recognition/asr_test_base.py|Data2vec_for_PyTorch/tests/speech_recognition/asr_test_base.py | https://fburl.com/batch_first_example | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/data2vec/models/data2vec_image_classification.py|Data2vec_for_PyTorch/examples/data2vec/models/data2vec_image_classification.py | https://github.com/microsoft/unilm/tree/master/beit | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/data2vec/models/audio_classification.py|Data2vec_for_PyTorch/examples/data2vec/models/audio_classification.py | https://github.com/mil-tokyo/bc_learning_sound/blob/master/utils.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/data2vec/models/data2vec_image_classification.py|Data2vec_for_PyTorch/examples/data2vec/models/data2vec_text_classification.py | https://github.com/microsoft/unilm/tree/master/beit | 源码实现 | -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/main/examples/data2vec/models/data2vec_image_classification.py|Data2vec_for_PyTorch/examples/data2vec/models/data2vec_vision.py | https://github.com/microsoft/unilm/tree/master/beit | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/data2vec/models/data2vec_image_classification.py|Data2vec_for_PyTorch/examples/data2vec/models/mae.py | https://github.com/microsoft/unilm/tree/master/beit | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/data2vec/models/data2vec_image_classification.py|Data2vec_for_PyTorch/examples/data2vec/models/mae_image_classification.py | https://github.com/microsoft/unilm/tree/master/beit | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/data2vec/models/mae_image_classification.py|Data2vec_for_PyTorch/examples/data2vec/models/mae_image_classification.py | https://github.com/microsoft/unilm/blob/master/beit/optim_factory.py#L33 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/roberta/commonsense_qa/download_cqa_data.sh|Data2vec_for_PyTorch/examples/roberta/commonsense_qa/download_cqa_data.sh | https://s3.amazonaws.com/commensenseqa/train_rand_split.jsonl | 模型参数相关配置 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/roberta/commonsense_qa/download_cqa_data.sh|Data2vec_for_PyTorch/examples/roberta/commonsense_qa/download_cqa_data.sh | https://s3.amazonaws.com/commensenseqa/dev_rand_split.jsonl | 模型参数相关配置 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/roberta/commonsense_qa/download_cqa_data.sh|Data2vec_for_PyTorch/examples/roberta/commonsense_qa/download_cqa_data.sh | https://s3.amazonaws.com/commensenseqa/test_rand_split_no_answers.jsonl | 模型参数相关配置 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/bart/README.summarization.md|Data2vec_for_PyTorch/examples/roberta/commonsense_qa/download_cqa_data.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/dict.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/clib/libbase/balanced_assignment.cpp|Data2vec_for_PyTorch/fairseq/clib/libbase/balanced_assignment.cpp | https://dspace.mit.edu/bitstream/handle/1721.1/3265/P-2108-26912652.pdf | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/clib/libbase/balanced_assignment.cpp|Data2vec_for_PyTorch/fairseq/clib/libbase/balanced_assignment.cpp | https://github.com/bkj/auction-lap | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/clib/libnat_cuda/binding.cpp|Data2vec_for_PyTorch/fairseq/clib/libnat_cuda/binding.cpp | https://github.com/1ytic/pytorch-edit-distance | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/data/audio/speech_to_text_dataset.py|Data2vec_for_PyTorch/fairseq/data/audio/speech_to_text_dataset.py | https://arxiv.org/abs/1907.05019 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/bart/README.summarization.md|Data2vec_for_PyTorch/fairseq/data/encoders/gpt2_bpe.py | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/encoder.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/bart/README.summarization.md|Data2vec_for_PyTorch/fairseq/data/encoders/gpt2_bpe.py | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/vocab.bpe | 模型相关说明 | -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/main/fairseq/data/encoders/gpt2_bpe_utils.py|Data2vec_for_PyTorch/fairseq/data/encoders/gpt2_bpe_utils.py | https://github.com/openai/gpt-2/blob/master/src/encoder.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/bart/hub_interface.py|Data2vec_for_PyTorch/fairseq/models/bart/hub_interface.py | https://github.com/pytorch/fairseq/tree/main/examples/bart | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/bart/model.py|Data2vec_for_PyTorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.base.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/bart/model.py|Data2vec_for_PyTorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/bart/model.py|Data2vec_for_PyTorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.mnli.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/bart/model.py|Data2vec_for_PyTorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.cnn.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/bart/model.py|Data2vec_for_PyTorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.xsum.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/ema/ema.py|Data2vec_for_PyTorch/fairseq/models/ema/ema.py | https://github.com/zhawe01/fairseq-gec.git | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/ema/ema.py|Data2vec_for_PyTorch/fairseq/models/ema/ema.py | https://www.tensorflow.org/api_docs/python/tf/train/ExponentialMovingAverage | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/layerdrop/README.md|Data2vec_for_PyTorch/fairseq/models/roberta/hub_interface.py | https://github.com/pytorch/fairseq/tree/main/examples/roberta | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model.py|Data2vec_for_PyTorch/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.base.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model.py|Data2vec_for_PyTorch/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model.py|Data2vec_for_PyTorch/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.mnli.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model.py|Data2vec_for_PyTorch/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.wsc.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/hub_interface.py|Data2vec_for_PyTorch/fairseq/models/roberta/hub_interface.py | https://github.com/pytorch/fairseq/issues/1306 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_camembert.py|Data2vec_for_PyTorch/fairseq/models/roberta/model_camembert.py | 
http://dl.fbaipublicfiles.com/fairseq/models/camembert-base.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_camembert.py|Data2vec_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_camembert.py|Data2vec_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_camembert.py|Data2vec_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-large.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_camembert.py|Data2vec_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-ccnet.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_camembert.py|Data2vec_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-ccnet-4gb.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_camembert.py|Data2vec_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-wikipedia-4gb.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/gottbert/README.md|Data2vec_for_PyTorch/fairseq/models/roberta/model_gottbert.py | https://dl.gottbert.de/fairseq/models/gottbert-base.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_camembert.py|Data2vec_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-oscar-4gb.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model.py|Data2vec_for_PyTorch/fairseq/models/roberta/model.py | https://openreview.net/forum?id=_CMSV7FTzGI | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_xlmr.py|Data2vec_for_PyTorch/fairseq/models/roberta/model_xlmr.py | http://dl.fbaipublicfiles.com/fairseq/models/xlmr.base.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_xlmr.py|Data2vec_for_PyTorch/fairseq/models/roberta/model_xlmr.py | http://dl.fbaipublicfiles.com/fairseq/models/xlmr.large.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_xlmr.py|Data2vec_for_PyTorch/fairseq/models/roberta/model_xlmr.py | http://dl.fbaipublicfiles.com/fairseq/models/xlmr/xlmr.xl.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_xlmr.py|Data2vec_for_PyTorch/fairseq/models/roberta/model_xlmr.py | http://dl.fbaipublicfiles.com/fairseq/models/xlmr/xlmr.xxl.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/textless_nlp/dgslm/hubert_fisher/README.md|Data2vec_for_PyTorch/fairseq/models/speech_dlm/speech_dlm.py | https://arxiv.org/pdf/2203.16502.pdf | 参考论文地址 | -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/convtransformer.py|Data2vec_for_PyTorch/fairseq/models/speech_to_text/convtransformer.py | https://arxiv.org/abs/2004.10234 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/berard.py|Data2vec_for_PyTorch/fairseq/models/speech_to_text/berard.py | https://arxiv.org/abs/1802.04200 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/berard.py|Data2vec_for_PyTorch/fairseq/models/speech_to_text/berard.py | https://github.com/eske/seq2seq | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/berard.py|Data2vec_for_PyTorch/fairseq/models/speech_to_text/berard.py | https://github.com/eske/seq2seq/blob/master/config/LibriSpeech/AST.yaml | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/berard.py|Data2vec_for_PyTorch/fairseq/models/speech_to_text/berard.py | https://github.com/eske/seq2seq/blob/master/translate/models.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/berard.py|Data2vec_for_PyTorch/fairseq/models/speech_to_text/berard.py | https://arxiv.org/abs/1409.0473 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/speech_to_speech/docs/direct_s2st_discrete_units.md|Data2vec_for_PyTorch/fairseq/models/speech_to_speech/s2s_transformer.py | https://arxiv.org/abs/2107.05604 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/s2t_conformer.py|Data2vec_for_PyTorch/fairseq/models/speech_to_text/s2t_conformer.py | https://arxiv.org/abs/2005.08100 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/berard.py|Data2vec_for_PyTorch/fairseq/models/speech_to_text/berard.py | https://arxiv.org/abs/1409.0473 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/pointer_generator/pointer_generator_src/transformer_pg.py|Data2vec_for_PyTorch/fairseq/models/speech_to_text/s2t_transformer.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/s2t_transformer.py|Data2vec_for_PyTorch/fairseq/models/speech_to_text/s2t_transformer.py | http://dl.fbaipublicfiles.com/fairseq/s2t | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/berard.py|Data2vec_for_PyTorch/fairseq/models/speech_to_text/berard.py | https://arxiv.org/abs/1802.04200 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/speech_to_text/README.md|Data2vec_for_PyTorch/fairseq/models/speech_to_text/berard.py | https://arxiv.org/abs/1909.06515 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/berard.py|Data2vec_for_PyTorch/fairseq/models/speech_to_text/berard.py | https://arxiv.org/pdf/2002.01320.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/speech_to_text/README.md|Data2vec_for_PyTorch/fairseq/models/speech_to_text/berard.py | https://arxiv.org/abs/2006.12124 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/s2t_transformer.py|Data2vec_for_PyTorch/fairseq/models/speech_to_text/xm_transformer.py | 
http://dl.fbaipublicfiles.com/fairseq/s2t | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/s2t_transformer.py|Data2vec_for_PyTorch/fairseq/models/speech_to_text/xm_transformer_unity.py | http://dl.fbaipublicfiles.com/fairseq/s2t | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/speech_synthesis/docs/ljspeech_example.md|Data2vec_for_PyTorch/fairseq/models/text_to_speech/fastspeech2.py | https://arxiv.org/abs/2006.04558 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/s2t_transformer.py|Data2vec_for_PyTorch/fairseq/models/text_to_speech/fastspeech2.py | http://dl.fbaipublicfiles.com/fairseq/s2 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/pointer_generator/pointer_generator_src/transformer_pg.py|Data2vec_for_PyTorch/fairseq/models/transformer/transformer_base.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/text_to_speech/tacotron2.py|Data2vec_for_PyTorch/fairseq/models/text_to_speech/tacotron2.py | https://arxiv.org/pdf/1712.05884.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/text_to_speech/tts_transformer.py|Data2vec_for_PyTorch/fairseq/models/text_to_speech/tts_transformer.py | https://arxiv.org/pdf/1809.08895.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/s2t_transformer.py|Data2vec_for_PyTorch/fairseq/models/text_to_speech/tts_transformer.py | http://dl.fbaipublicfiles.com/fairseq/s2 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/text_to_speech/vocoder.py|Data2vec_for_PyTorch/fairseq/models/text_to_speech/vocoder.py | http://dl.fbaipublicfiles.com/fairseq/vocoder | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/scaling_nmt/README.md|Data2vec_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt14.en-fr.joined-dict.transformer.tar.bz2 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/scaling_nmt/README.md|Data2vec_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt16.en-de.joined-dict.transformer.tar.bz2 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/README.md|Data2vec_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt18.en-de.ensemble.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/README.md|Data2vec_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-de.joined-dict.ensemble.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/README.md|Data2vec_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-ru.ensemble.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/README.md|Data2vec_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.de-en.joined-dict.ensemble.tar.gz | 模型相关说明 | -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/main/examples/translation/README.md|Data2vec_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.ru-en.ensemble.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer/transformer_legacy.py|Data2vec_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-de.joined-dict.single_model.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer/transformer_legacy.py|Data2vec_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-ru.single_model.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer/transformer_legacy.py|Data2vec_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.de-en.joined-dict.single_model.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer/transformer_legacy.py|Data2vec_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.ru-en.single_model.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/wmt20/README.md|Data2vec_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.en-ta.single.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/wmt20/README.md|Data2vec_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.en-iu.news.single.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/wmt20/README.md|Data2vec_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.en-iu.nh.single.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/wmt20/README.md|Data2vec_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.ta-en.single.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/wmt20/README.md|Data2vec_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.iu-en.news.single.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/wmt20/README.md|Data2vec_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.iu-en.nh.single.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/flores101/README.md|Data2vec_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/flores101/pretrained_models/flores101_mm100_615M.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/flores101/README.md|Data2vec_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/flores101/pretrained_models/flores101_mm100_175M.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/wav2vec/utils.py|Data2vec_for_PyTorch/fairseq/models/wav2vec/utils.py | 
https://github.com/lucidrains/local-attention/blob/master/local_attention/local_attention.py#L41 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/xmod/README.md|Data2vec_for_PyTorch/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.81.1M.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/xmod/README.md|Data2vec_for_PyTorch/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.large.prenorm.81.500k.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/xmod/README.md|Data2vec_for_PyTorch/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.13.125k.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/xmod/README.md|Data2vec_for_PyTorch/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.30.125k.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/xmod/README.md|Data2vec_for_PyTorch/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.30.195k.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/xmod/README.md|Data2vec_for_PyTorch/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.60.125k.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/xmod/README.md|Data2vec_for_PyTorch/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.60.265k.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/xmod/README.md|Data2vec_for_PyTorch/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.75.125k.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/xmod/README.md|Data2vec_for_PyTorch/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.75.269k.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/megatron_11b/README.md|Data2vec_for_PyTorch/fairseq/model_parallel/modules/multihead_attention.py | https://arxiv.org/pdf/1909.08053.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/megatron_11b/README.md|Data2vec_for_PyTorch/fairseq/model_parallel/modules/transformer_layer.py | https://arxiv.org/pdf/1909.08053.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/megatron_11b/README.md|Data2vec_for_PyTorch/fairseq/model_parallel/modules/transformer_layer.py | https://arxiv.org/pdf/1909.08053.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/optim/lr_scheduler/cosine_lr_scheduler.py|Data2vec_for_PyTorch/fairseq/optim/lr_scheduler/cosine_lr_scheduler.py | https://arxiv.org/pdf/1608.03983.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/optim/lr_scheduler/triangular_lr_scheduler.py|Data2vec_for_PyTorch/fairseq/optim/lr_scheduler/triangular_lr_scheduler.py | https://arxiv.org/pdf/1506.01186.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/optim/lr_scheduler/tri_stage_lr_scheduler.py|Data2vec_for_PyTorch/fairseq/optim/lr_scheduler/tri_stage_lr_scheduler.py | https://arxiv.org/pdf/1904.08779.pdf | 
参考论文地址 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|------------------------------------------------------------------------------------------------------------------|----------------------------------------------------------------------------------------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/.circleci/config.yml | https://download.pytorch.org/whl/torch_stable.html | 三方库下载 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/.circleci/config.yml | https://download.pytorch.org/whl/torch_stable.html | 三方库下载 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/.circleci/config.yml | https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh | miniconda链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/examples/roberta/commonsense_qa/download_cqa_data.sh | https://s3.amazonaws.com/commensenseqa/dev_rand_split.jsonl | 参数相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/examples/roberta/commonsense_qa/download_cqa_data.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/dict.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/examples/roberta/commonsense_qa/download_cqa_data.sh | https://s3.amazonaws.com/commensenseqa/train_rand_split.jsonl | 模型参数相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/examples/roberta/commonsense_qa/download_cqa_data.sh | https://s3.amazonaws.com/commensenseqa/test_rand_split_no_answers.jsonl | 模型参数相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/examples/roberta/preprocess_GLUE_tasks.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/vocab.bpe | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/examples/roberta/preprocess_GLUE_tasks.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/encoder.json | 模型参数配置 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/examples/roberta/preprocess_GLUE_tasks.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/dict.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/examples/roberta/preprocess_RACE.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/vocab.bpe | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/examples/roberta/preprocess_RACE.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/encoder.json | 模型参数配置 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/examples/roberta/preprocess_RACE.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/dict.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/data/encoders/gpt2_bpe.py | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/vocab.bpe | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/data/encoders/gpt2_bpe.py | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/encoder.json | 模型参数配置 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/dataclass/configs.py | https://fairscale.readthedocs.io/en/latest/api/experimental/nn/slowmo_ddp.html | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.xsum.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.mnli.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/bart/model.py | 
http://dl.fbaipublicfiles.com/fairseq/models/bart.large.cnn.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.base.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/fconv.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt17.v2.en-de.fconv-py.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/fconv.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt17.v2.en-de.fconv-py.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/fconv.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt14.en-de.fconv-py.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/fconv_self_att.py | https://dl.fbaipublicfiles.com/fairseq/models/stories_checkpoint.tar.gz | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/fconv_self_att.py | https://dl.fbaipublicfiles.com/fairseq/models/stories_checkpoint.tar.gz | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/fconv_self_att.py | https://dl.fbaipublicfiles.com/fairseq/data/stories_test.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.lightconv.tar.gz | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.dynamicconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.dynamicconv.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/iwslt14.de-en.dynamicconv.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt17.zh-en.dynamicconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.dynamicconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt14.en-fr.joined-dict.dynamicconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/iwslt14.de-en.lightconv.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt17.zh-en.lightconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.lightconv-glu.tar.gz | 模型相关说明 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.lightconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt14.en-fr.joined-dict.lightconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.base.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.wsc.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.mnli.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-large.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-wikipedia-4gb.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-oscar-4gb.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-ccnet-4gb.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-ccnet.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/roberta/model_gottbert.py | https://dl.gottbert.de/fairseq/models/gottbert-base.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/roberta/model_xlmr.py | http://dl.fbaipublicfiles.com/fairseq/models/xlmr/xlmr.xxl.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/roberta/model_xlmr.py | http://dl.fbaipublicfiles.com/fairseq/models/xlmr/xlmr.xl.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/roberta/model_xlmr.py | http://dl.fbaipublicfiles.com/fairseq/models/xlmr.large.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/roberta/model_xlmr.py | http://dl.fbaipublicfiles.com/fairseq/models/xlmr.base.tar.gz | 模型相关说明 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/speech_to_text/s2t_transformer.py | http://dl.fbaipublicfiles.com/fairseq/s2t | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/speech_to_text/xm_transformer.py | http://dl.fbaipublicfiles.com/fairseq/s2t | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/speech_to_text/xm_transformer_unity.py | http://dl.fbaipublicfiles.com/fairseq/s2t | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/text_to_speech/fastspeech2.py | http://dl.fbaipublicfiles.com/fairseq/s2 | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/text_to_speech/tts_transformer.py | http://dl.fbaipublicfiles.com/fairseq/s2 | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/text_to_speech/vocoder.py | http://dl.fbaipublicfiles.com/fairseq/vocoder | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.ta-en.single.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.iu-en.nh.single.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.iu-en.news.single.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.en-ta.single.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.en-iu.nh.single.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.en-iu.news.single.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.ru-en.ensemble.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.ru-en.single_model.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-ru.ensemble.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-ru.single_model.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-ru.single_model.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-de.joined-dict.ensemble.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | 
https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-de.joined-dict.single_model.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.de-en.joined-dict.ensemble.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.de-en.joined-dict.single_model.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt18.en-de.ensemble.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt16.en-de.joined-dict.transformer.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt14.en-fr.joined-dict.transformer.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/flores101/pretrained_models/flores101_mm100_615M.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/flores101/pretrained_models/flores101_mm100_615M.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/adaptive_lm_wiki103.v2.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/adaptive_lm_gbw_huge.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt20.ta.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt20.iu.nh.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt20.iu.news.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt20.en.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.ru.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.ru.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.ru.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.large.prenorm.81.500k.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.75.269k.tar.gz | 模型相关说明 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.75.125k.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.60.265k.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.60.125k.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.30.195k.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.30.125k.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.13.125k.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.81.1M.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/hydra_plugins/dependency_submitit_launcher/setup.py | abaevski@fb.com | 作者邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Data2vec_for_PyTorch/setup.py | https://download.pytorch.org/whl/cpu/torch-1.7.0%2Bcpu-cp36-cp36m-linux_x86_64.whl | 模型相关说明 | \ No newline at end of file diff --git a/PyTorch/built-in/nlp/Deltalm_for_PyTorch/public_address_statement.md b/PyTorch/built-in/nlp/Deltalm_for_PyTorch/public_address_statement.md index 0bfc21d07f8f7ef41bb66a50cf1a65e1aa5c8113..9b4e7d9f207bbab65374a28c613a34f300161eb7 100644 --- a/PyTorch/built-in/nlp/Deltalm_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/nlp/Deltalm_for_PyTorch/public_address_statement.md @@ -1,475 +1,246 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ---- | ------------ | ------ | ------------------------------------ | -------- | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/docs/getting_started.rst| Deltalm_for_PyTorch/fairseq/examples/criss/unsupervised_mt/eval.sh | https://github.com/moses-smt/mosesdecoder | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/latent_depth/latent_depth_src/multilingual_translation_latent_depth.py| Deltalm_for_PyTorch/fairseq/examples/latent_depth/latent_depth_src/multilingual_translation_latent_depth.py | https://arxiv.org/pdf/2009.13102.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/m2m_100/tokenizers/README.md| Deltalm_for_PyTorch/fairseq/examples/m2m_100/tokenizers/tokenizer_ar.sh | http://alt.qcri.org/tools/arabic-normalizer/ | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/m2m_100/tokenizers/README.md| Deltalm_for_PyTorch/fairseq/examples/m2m_100/tokenizers/tokenizer_ar.sh | http://alt.qcri.org/tools/arabic-normalizer/ | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/m2m_100/README.md| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/binarize.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/binarize.py| 
Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/binarize.py | https://dl.fbaipublicfiles.com/fairseq/models/mbart50/sentence.bpe.model | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/binarize.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/binarize.py | https://dl.fbaipublicfiles.com/fairseq/models/mbart50/dict_250k.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_iitb.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_iitb.sh | http://www.cfilt.iitb.ac.in/~moses/iitb_en_hi_parallel/iitb_corpus_download/parallel.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_flores_data.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_flores_data.sh | https://object.pouta.csc.fi/OPUS-GNOME/v1/moses/en-si.txt.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_iitb.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_iitb.sh | http://www.cfilt.iitb.ac.in/~moses/iitb_en_hi_parallel/iitb_corpus_download/dev_test.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_flores_data.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_flores_data.sh | https://object.pouta.csc.fi/OPUS-Ubuntu/v14.10/moses/en-si.txt.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_flores_data.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_flores_data.sh | https://object.pouta.csc.fi/OPUS-KDE4/v2/moses/en-si.txt.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_flores_data.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_flores_data.sh | https://object.pouta.csc.fi/OPUS-OpenSubtitles/v2018/moses/en-si.txt.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_flores_data.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_flores_data.sh | https://object.pouta.csc.fi/OPUS-GNOME/v1/moses/en-ne.txt.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_flores_data.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_flores_data.sh | https://object.pouta.csc.fi/OPUS-Ubuntu/v14.10/moses/en-ne.txt.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_flores_data.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_flores_data.sh | https://object.pouta.csc.fi/OPUS-KDE4/v2/moses/en-ne.txt.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_af_xh.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_af_xh.sh | https://object.pouta.csc.fi/OPUS-Tatoeba/v20190709/tmx/en-xh.tmx.gz | 模型相关说明 | -| 开源代码引入 | 
https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_af_xh.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_af_xh.sh | https://object.pouta.csc.fi/OPUS-wikimedia/v20190628/tmx/en-xh.tmx.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/backtranslation/prepare-wmt18en2de.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_iwslt_and_extract.sh | https://github.com/moses-smt/mosesdecoder.git | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_af_xh.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_af_xh.sh | https://object.pouta.csc.fi/OPUS-memat/v1/tmx/en-xh.tmx.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_af_xh.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_af_xh.sh | https://object.pouta.csc.fi/OPUS-bible-uedin/v1/tmx/en-xh.tmx.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_af_xh.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_af_xh.sh | https://object.pouta.csc.fi/OPUS-GNOME/v1/tmx/en-xh.tmx.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_af_xh.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_af_xh.sh | https://object.pouta.csc.fi/OPUS-XhosaNavy/v1/tmx/en-xh.tmx.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_af_xh.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_af_xh.sh | https://object.pouta.csc.fi/OPUS-KDE4/v2/tmx/en-xh.tmx.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_af_xh.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_af_xh.sh | https://object.pouta.csc.fi/OPUS-Ubuntu/v14.10/tmx/en-xh.tmx.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_lotus.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_lotus.sh | http://lotus.kuee.kyoto-u.ac.jp/WAT/indic-multilingual/indic_languages_corpus.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/byte_level_bpe/get_data.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_iwslt_and_extract.sh | https://wit3.fbk.eu/archive/ | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_af_xh.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_af_xh.sh | https://object.pouta.csc.fi/OPUS-Tatoeba/v20190709/tmx/af-en.tmx.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_af_xh.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_af_xh.sh | https://object.pouta.csc.fi/OPUS-bible-uedin/v1/tmx/af-en.tmx.gz | 模型相关说明 | -| 开源代码引入 | 
https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_af_xh.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_af_xh.sh | https://object.pouta.csc.fi/OPUS-GNOME/v1/tmx/af-en.tmx.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_flores_data.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_flores_data.sh | http://www.casmacat.eu/corpus/global-voices/globalvoices.ne-en.xliff.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_af_xh.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_af_xh.sh | https://object.pouta.csc.fi/OPUS-QED/v2.0a/tmx/af-en.tmx.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_af_xh.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_af_xh.sh | https://object.pouta.csc.fi/OPUS-KDE4/v2/tmx/af-en.tmx.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_af_xh.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_af_xh.sh | https://object.pouta.csc.fi/OPUS-OpenSubtitles/v2018/tmx/af-en.tmx.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_af_xh.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_af_xh.sh | https://object.pouta.csc.fi/OPUS-SPC/v1/tmx/af-en.tmx.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_af_xh.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_af_xh.sh | https://object.pouta.csc.fi/OPUS-Ubuntu/v14.10/tmx/af-en.tmx.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_flores_data.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_flores_data.sh | https://github.com/christos-c/bible-corpus-tools.git | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_flores_data.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_flores_data.sh | https://github.com/christos-c/bible-corpus/archive/v1.2.1.tar.gz | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_flores_data.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_flores_data.sh | http://www.cle.org.pk/Downloads/ling_resources/parallelcorpus/NepaliTaggedCorpus.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_flores_data.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_flores_data.sh | https://dl.fbaipublicfiles.com/fairseq/data/nepali-penn-treebank.en.patch | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_flores_data.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_flores_data.sh | 
https://dl.fbaipublicfiles.com/fairseq/data/nepali-penn-treebank.ne.patch | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/m2m_100/install_dependecies.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wat19_my.sh | http://lotus.kuee.kyoto-u.ac.jp/WAT/my-en-data/ | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/m2m_100/install_dependecies.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wat19_my.sh | http://lotus.kuee.kyoto-u.ac.jp/WAT/my-en-data/ | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/backtranslation/prepare-wmt18en2de.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_flores_data.sh | https://github.com/moses-smt/mosesdecoder.git | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_flores_data.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_flores_data.sh | http://www.seas.upenn.edu/~nlp/resources/TACL-data-release/dictionaries.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/criss/download_and_preprocess_flores_test.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_flores_data.sh | https://github.com/facebookresearch/flores/raw/master/data/wikipedia_en_ne_si_test_sets.tgz | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | https://dl.fbaipublicfiles.com/fasttext/supervised-models/lid.176.bin | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/backtranslation/prepare-wmt18en2de.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | https://github.com/moses-smt/mosesdecoder.git | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/m2m_100/README.md| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/preprocess_ML50_v1.sh | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt20.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | https://wit3.fbk.eu/archive/2017-01-trnted//texts/en/ja/en-ja.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt20.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://www.kecl.ntt.co.jp/icl/lirg/jparacrawl/release/2.0/bitext/en-ja.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt20.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://data.statmt.org/news-commentary/v15/training/news-commentary-v15.en-ja.tsv.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt20.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://data.statmt.org/wikititles/v2/wikititles-v2.ja-en.tsv.gz | 模型相关说明 
| -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt20.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://data.statmt.org/wmt20/translation-task/WikiMatrix/WikiMatrix.v1.en-ja.langid.tsv.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt20.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | https://nlp.stanford.edu/projects/jesc/data/split.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt20.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://www.phontron.com/kftt/download/kftt-data-1.0.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt20.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://data.statmt.org/wikititles/v2/wikititles-v2.ta-en.tsv.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt20.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://data.statmt.org/wmt20/translation-task/WikiMatrix/WikiMatrix.v1.en-ta.langid.tsv.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt20.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://data.statmt.org/pmindia/v1/parallel/pmindia.v1.ta-en.tsv | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt20.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | https://object.pouta.csc.fi/OPUS-Tanzil/v1/moses/en-ta.txt.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt20.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://preon.iiit.ac.in/~jerin/resources/datasets/pib-v0.tar | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt20.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://preon.iiit.ac.in/~jerin/resources/datasets/mkb-v0.tar | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt20.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://ufal.mff.cuni.cz/~ramasamy/parallel/data/v2/en-ta-parallel-v2.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/m2m_100/README.md| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_ted_and_extract.py | http://phontron.com/data/ted_talks.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt20.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | https://raw.githubusercontent.com/nlpc-uom/English-Tamil-Parallel-Corpus/master/En-Ta%20Corpus/En-Ta%20English.txt 
| 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt20.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | https://github.com/nlpc-uom/English-Tamil-Parallel-Corpus/raw/master/En-Ta%20Corpus/En-Ta%20Tamil.txt | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt20.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | https://nrc-digital-repository.canada.ca/eng/view/dataset/?id=c7e34fa7-7629-43c2-bd6d-19b32bf64f60 | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt20.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://data.statmt.org/wikititles/v2/wikititles-v2.iu-en.tsv.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt20.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://data.statmt.org/wmt20/translation-task/ps-km/wmt20-sent.en-km.xz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt20.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://data.statmt.org/wmt20/translation-task/ps-km/km-parallel.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt20.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://data.statmt.org/wmt20/translation-task/ps-km/wmt20-sent.en-ps.xz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt20.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://data.statmt.org/wikititles/v2/wikititles-v2.ps-en.tsv.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt20.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://data.statmt.org/wmt20/translation-task/ps-km/ps-parallel.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://www.statmt.org/wmt13/training-parallel-commoncrawl.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt20.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://www.statmt.org/europarl/v10/training/europarl-v10.cs-en.tsv.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt20.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | https://s3.amazonaws.com/web-language-models/paracrawl/release5.1/en-cs.txt.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/speech_to_text/s2t_transformer.py| 
Deltalm_for_PyTorch/fairseq/examples/pointer_generator/pointer_generator_src/transformer_pg.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt20.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://data.statmt.org/news-commentary/v15/training/news-commentary-v15.cs-en.tsv.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/pointer_generator/pointer_generator_src/transformer_pg.py| Deltalm_for_PyTorch/fairseq/examples/pointer_generator/pointer_generator_src/transformer_pg.py | https://arxiv.org/abs/1704.04368 | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt20.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://data.statmt.org/wikititles/v2/wikititles-v2.cs-en.tsv.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt20.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://data.statmt.org/wmt20/translation-task/rapid/RAPID_2019.cs-en.xlf.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt20.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://data.statmt.org/wmt20/translation-task/WikiMatrix/WikiMatrix.v1.cs-en.langid.tsv.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | https://dl.fbaipublicfiles.com/fasttext/supervised-models/lid.176.bin | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt20.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://www.statmt.org/europarl/v10/training/europarl-v10.de-en.tsv.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt20.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | https://s3.amazonaws.com/web-language-models/paracrawl/release5.1/en-de.txt.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt20.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://data.statmt.org/news-commentary/v15/training/news-commentary-v15.de-en.tsv.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt20.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://data.statmt.org/wikititles/v2/wikititles-v2.de-en.tsv.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt20.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://data.statmt.org/wmt20/translation-task/rapid/RAPID_2019.de-en.xlf.gz | 模型相关说明 | -| 开源代码引入 | 
https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt20.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://data.statmt.org/wmt20/translation-task/WikiMatrix/WikiMatrix.v1.de-en.langid.tsv.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt20.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | https://github.com/amake/TMX2Corpus | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt20.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://www.statmt.org/europarl/v10/training/europarl-v10.pl-en.tsv.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt20.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | https://s3.amazonaws.com/web-language-models/paracrawl/release5.1/en-pl.txt.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt20.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://data.statmt.org/wikititles/v2/wikititles-v2.pl-en.tsv.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt20.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | https://tilde-model.s3-eu-west-1.amazonaws.com/rapid2019.en-pl.tmx.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt20.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://data.statmt.org/wmt20/translation-task/WikiMatrix/WikiMatrix.v1.en-pl.langid.tsv.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://ufallab.ms.mff.cuni.cz/~bojar/czeng16-data/data-plaintext-format | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | https://s3.amazonaws.com/web-language-models/paracrawl/release1/paracrawl-release1.en-ru.zipporah0-dedup-clean.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt20.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://data.statmt.org/news-commentary/v15/training/news-commentary-v15.en-ru.tsv.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://ufal.mff.cuni.cz/czeng/czeng16 | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt20.sh| 
Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://data.statmt.org/wikititles/v2/wikititles-v2.ru-en.tsv.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://ufal.mff.cuni.cz/czeng/download.php?f=convert_czeng16_to_17.pl.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | https://stuncorpusprod.blob.core.windows.net/corpusfiles/UNv1.0.en-ru.tar.gz.00 | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | https://stuncorpusprod.blob.core.windows.net/corpusfiles/UNv1.0.en-ru.tar.gz.01 | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | https://stuncorpusprod.blob.core.windows.net/corpusfiles/UNv1.0.en-ru.tar.gz.02 | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt20.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://data.statmt.org/wmt20/translation-task/WikiMatrix/WikiMatrix.v1.en-ru.langid.tsv.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://ufal.mff.cuni.cz/czeng/download.php?f=convert_czeng16_to_17.pl.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt20.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://data.statmt.org/news-commentary/v15/training/news-commentary-v15.en-zh.tsv.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt20.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://data.statmt.org/wikititles/v2/wikititles-v2.zh-en.tsv.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://www.statmt.org/wmt11/normalize-punctuation.perl | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | https://stuncorpusprod.blob.core.windows.net/corpusfiles/UNv1.0.en-zh.tar.gz.00 | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | 
https://stuncorpusprod.blob.core.windows.net/corpusfiles/UNv1.0.en-zh.tar.gz.01 | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://www.statmt.org/wmt13/training-parallel-europarl-v7.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://www.statmt.org/wmt13/training-parallel-commoncrawl.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt20.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://data.statmt.org/wmt20/translation-task/WikiMatrix/WikiMatrix.v1.en-zh.langid.tsv.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://www.statmt.org/wmt13/training-parallel-un.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://www.statmt.org/wmt13/training-parallel-nc-v8.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://www.statmt.org/wmt13/dev.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://www.statmt.org/wmt13/test.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt20.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://data.statmt.org/wmt20/translation-task/dev.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://www.statmt.org/wmt13/training-parallel-europarl-v7.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://www.statmt.org/wmt13/training-parallel-commoncrawl.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://www.statmt.org/wmt13/training-parallel-un.tgz | 模型相关说明 | -| 开源代码引入 | 
https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://www.statmt.org/wmt14/training-parallel-nc-v9.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://www.statmt.org/wmt10/training-giga-fren.tar | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://www.statmt.org/wmt14/dev.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://www.statmt.org/wmt14/test-full.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | https://github.com/amake/tmx2corpus.git | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://data.statmt.org/wmt16/translation-task/training-parallel-ep-v8.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://opus.nlpl.eu/download.php?f=SETIMES/v2/tmx/en-ro.tmx.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://data.statmt.org/wmt16/translation-task/dev-romanian-updated.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://data.statmt.org/wmt16/translation-task/test.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/roberta/commonsense_qa/download_cqa_data.sh| Deltalm_for_PyTorch/fairseq/examples/roberta/commonsense_qa/download_cqa_data.sh | https://s3.amazonaws.com/commensenseqa/train_rand_split.jsonl | 模型参数相关配置 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/roberta/commonsense_qa/download_cqa_data.sh| Deltalm_for_PyTorch/fairseq/examples/roberta/commonsense_qa/download_cqa_data.sh | https://s3.amazonaws.com/commensenseqa/dev_rand_split.jsonl | 模型参数相关配置 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | 
http://nlp.nju.edu.cn/cwmt-wmt | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/roberta/commonsense_qa/download_cqa_data.sh| Deltalm_for_PyTorch/fairseq/examples/roberta/commonsense_qa/download_cqa_data.sh | https://s3.amazonaws.com/commensenseqa/test_rand_split_no_answers.jsonl | 模型参数相关配置 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/bart/README.summarization.md| Deltalm_for_PyTorch/fairseq/examples/roberta/commonsense_qa/download_cqa_data.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/dict.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://nlp.nju.edu.cn/cwmt-wmt/CASIA2015.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://nlp.nju.edu.cn/cwmt-wmt/CASICT2011.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://nlp.nju.edu.cn/cwmt-wmt/CASICT2015.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://nlp.nju.edu.cn/cwmt-wmt/Datum2015.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://nlp.nju.edu.cn/cwmt-wmt/Datum2017.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://nlp.nju.edu.cn/cwmt-wmt/NEU2017.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://data.statmt.org/wmt17/translation-task/training-parallel-ep-v8.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://data.statmt.org/wmt17/translation-task/training-parallel-nc-v12.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://www.statmt.org/wmt15/wiki-titles.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| 
Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://opus.nlpl.eu/download.php?f=SETIMES/v2/tmx/en-tr.tmx.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://data.statmt.org/wmt17/translation-task/rapid2016.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://data.statmt.org/wmt17/translation-task/leta.v1.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://data.statmt.org/wmt17/translation-task/dcep.lv-en.v1.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://data.statmt.org/wmt17/translation-task/books.lv-en.v1.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | https://stuncorpusprod.blob.core.windows.net/corpusfiles/UNv1.0.en-zh.tar.gz.00 | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | https://stuncorpusprod.blob.core.windows.net/corpusfiles/UNv1.0.en-zh.tar.gz.01 | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://nlp.nju.edu.cn/cwmt-wmt/CASIA2015.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://nlp.nju.edu.cn/cwmt-wmt/CASICT2011.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://nlp.nju.edu.cn/cwmt-wmt/CASICT2015.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://nlp.nju.edu.cn/cwmt-wmt/Datum2015.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://nlp.nju.edu.cn/cwmt-wmt/Datum2017.zip | 模型相关说明 | -| 
开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://nlp.nju.edu.cn/cwmt-wmt/NEU2017.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/backtranslation/prepare-wmt18en2de.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://data.statmt.org/wmt17/translation-task/dev.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://data.statmt.org/wmt17/translation-task/test-update-1.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://data.statmt.org/wmt17/translation-task/test.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://ufal.mff.cuni.cz/czeng/czeng16 | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://ufallab.ms.mff.cuni.cz/~bojar/czeng16-data/data-plaintext-format | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://www.statmt.org/wmt13/training-parallel-europarl-v7.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://data.statmt.org/wmt18/translation-task/training-parallel-ep-v8.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | https://s3.amazonaws.com/web-language-models/paracrawl/release1/paracrawl-release1.en-cs.zipporah0-dedup-clean.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | https://s3.amazonaws.com/web-language-models/paracrawl/release1/paracrawl-release1.en-et.zipporah0-dedup-clean.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://www.statmt.org/wmt13/training-parallel-commoncrawl.tgz | 模型相关说明 | -| 开源代码引入 | 
https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/backtranslation/prepare-wmt18en2de.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://data.statmt.org/wmt18/translation-task/training-parallel-nc-v13.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/backtranslation/prepare-wmt18en2de.sh| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://data.statmt.org/wmt18/translation-task/rapid2016.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://ufallab.ms.mff.cuni.cz/~bojar/czeng16-data/data-plaintext-format | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://data.statmt.org/wmt18/translation-task/dev.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://data.statmt.org/wmt18/translation-task/test.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | https://translate.yandex.ru/corpus?lang=en | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | https://translate.yandex.ru/corpus?lang=en | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://www.statmt.org/europarl/v9/training/europarl-v9.lt-en.tsv.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | https://s3.amazonaws.com/web-language-models/paracrawl/release3/en-lt.bicleaner07.tmx.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | https://s3.amazonaws.com/web-language-models/paracrawl/release1/paracrawl-release1.en-ru.zipporah0-dedup-clean.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://www.statmt.org/wmt13/training-parallel-commoncrawl.tgz | 模型相关说明 | -| 开源代码引入 | 
https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://data.statmt.org/news-commentary/v14/training/news-commentary-v14-wmt19.en-kk.tsv.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://data.statmt.org/news-commentary/v14/training/news-commentary-v14.en-ru.tsv.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://data.statmt.org/wikititles/v1/wikititles-v1.kk-en.tsv.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://data.statmt.org/wikititles/v1/wikititles-v1.ru-en.tsv.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://data.statmt.org/wikititles/v1/wikititles-v1.kk-en.tsv.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://data.statmt.org/wikititles/v1/wikititles-v1.lt-en.tsv.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://data.statmt.org/wikititles/v1/wikititles-v1.gu-en.tsv.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | https://stuncorpusprod.blob.core.windows.net/corpusfiles/UNv1.0.en-ru.tar.gz.00 | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | https://stuncorpusprod.blob.core.windows.net/corpusfiles/UNv1.0.en-ru.tar.gz.01 | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | https://stuncorpusprod.blob.core.windows.net/corpusfiles/UNv1.0.en-ru.tar.gz.02 | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | https://tilde-model.s3-eu-west-1.amazonaws.com/rapid2016.en-lt.tmx.zip | 模型相关说明 | -| 开源代码引入 
| https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | https://translate.yandex.ru/corpus?lang=en | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://data.statmt.org/wmt19/translation-task/dev.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://data.statmt.org/wmt19/translation-task/test.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/edgelm/examples/simultaneous_translation/models/convtransformer_simul_trans.py| Deltalm_for_PyTorch/fairseq/examples/simultaneous_translation/models/convtransformer_simul_trans.py | https://www.aclweb.org/anthology/2020.aacl-main.58.pdf | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/edgelm/examples/simultaneous_translation/tests/test_text_models.py| Deltalm_for_PyTorch/fairseq/examples/simultaneous_translation/tests/test_text_models.py | https://arxiv.org/pdf/1704.00784.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/simultaneous_translation/utils/functions.py| Deltalm_for_PyTorch/fairseq/examples/simultaneous_translation/utils/functions.py | https://arxiv.org/pdf/1712.05382.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/edgelm/examples/simultaneous_translation/tests/test_text_models.py| Deltalm_for_PyTorch/fairseq/examples/simultaneous_translation/tests/test_text_models.py | https://arxiv.org/pdf/1906.05218.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/edgelm/examples/simultaneous_translation/tests/test_text_models.py| Deltalm_for_PyTorch/fairseq/examples/simultaneous_translation/tests/test_text_models.py | https://arxiv.org/abs/1712.05382 | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/simultaneous_translation/utils/latency.py| Deltalm_for_PyTorch/fairseq/examples/simultaneous_translation/utils/latency.py | https://arxiv.org/abs/1606.02012 | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/simultaneous_translation/docs/baseline.md| Deltalm_for_PyTorch/fairseq/examples/simultaneous_translation/modules/monotonic_multihead_attention.py | https://www.aclweb.org/anthology/P19-1289/ | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/edgelm/examples/simultaneous_translation/tests/test_text_models.py| Deltalm_for_PyTorch/fairseq/examples/simultaneous_translation/utils/monotonic_attention.py | https://arxiv.org/pdf/1704.00784.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/simultaneous_translation/utils/latency.py| Deltalm_for_PyTorch/fairseq/examples/simultaneous_translation/utils/latency.py | https://arxiv.org/abs/1810.08398 | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/simultaneous_translation/utils/latency.py| Deltalm_for_PyTorch/fairseq/examples/simultaneous_translation/utils/latency.py | 
https://arxiv.org/abs/1906.05218 | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/edgelm/examples/simultaneous_translation/tests/test_text_models.py| Deltalm_for_PyTorch/fairseq/examples/simultaneous_translation/utils/monotonic_attention.py | https://arxiv.org/abs/1712.05382 | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/simultaneous_translation/utils/latency.py| Deltalm_for_PyTorch/fairseq/examples/simultaneous_translation/utils/monotonic_attention.py | https://arxiv.org/abs/1906.05218 | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/simultaneous_translation/utils/latency.py| Deltalm_for_PyTorch/fairseq/examples/simultaneous_translation/utils/monotonic_attention.py | https://arxiv.org/abs/1906.05218 | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/edgelm/examples/speech_recognition/kaldi/kaldi_decoder.py| Deltalm_for_PyTorch/fairseq/examples/speech_recognition/kaldi/kaldi_decoder.py | https://github.com/pykaldi/pykaldi | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/speech_recognition/models/vggtransformer.py| Deltalm_for_PyTorch/fairseq/examples/speech_recognition/models/vggtransformer.py | https://arxiv.org/abs/1904.11660 | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/edgelm/examples/speech_text_joint_to_text/scripts/g2p_encode.py| Deltalm_for_PyTorch/fairseq/examples/speech_text_joint_to_text/scripts/g2p_encode.py | http://www.speech.cs.cmu.edu/cgi-bin/cmudict | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/translation_moe/README.md| Deltalm_for_PyTorch/fairseq/examples/translation_moe/translation_moe_src/logsumexp_moe.py | https://arxiv.org/abs/1902.07816 | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/translation_moe/README.md| Deltalm_for_PyTorch/fairseq/examples/translation_moe/translation_moe_src/translation_moe.py | https://arxiv.org/abs/1902.07816 | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/edgelm/fairseq/clib/libbase/balanced_assignment.cpp| Deltalm_for_PyTorch/fairseq/fairseq/clib/libbase/balanced_assignment.cpp | https://dspace.mit.edu/bitstream/handle/1721.1/3265/P-2108-26912652.pdf | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/edgelm/fairseq/clib/libbase/balanced_assignment.cpp| Deltalm_for_PyTorch/fairseq/fairseq/clib/libbase/balanced_assignment.cpp | https://github.com/bkj/auction-lap | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/clib/libnat_cuda/binding.cpp| Deltalm_for_PyTorch/fairseq/fairseq/clib/libnat_cuda/binding.cpp | https://github.com/1ytic/pytorch-edit-distance | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq_cli/hydra_train.py| Deltalm_for_PyTorch/fairseq/examples/wav2vec/unsupervised/w2vu_generate.py | https://github.com/facebookresearch/hydra/issues/1126 | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/data/encoders/gpt2_bpe.py| Deltalm_for_PyTorch/fairseq/fairseq/data/encoders/gpt2_bpe.py | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/encoder.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/data/encoders/gpt2_bpe.py| Deltalm_for_PyTorch/fairseq/fairseq/data/encoders/gpt2_bpe.py | 
https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/vocab.bpe | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/data/encoders/gpt2_bpe_utils.py| Deltalm_for_PyTorch/fairseq/fairseq/data/encoders/gpt2_bpe_utils.py | https://github.com/openai/gpt-2/blob/master/src/encoder.py | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/data/audio/speech_to_text_dataset.py| Deltalm_for_PyTorch/fairseq/fairseq/data/audio/speech_to_text_dataset.py | https://arxiv.org/abs/1907.05019 | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/bart/hub_interface.py| Deltalm_for_PyTorch/fairseq/fairseq/models/bart/hub_interface.py | https://github.com/pytorch/fairseq/tree/master/examples/bart | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/bart/model.py| Deltalm_for_PyTorch/fairseq/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.base.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/bart/model.py| Deltalm_for_PyTorch/fairseq/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/bart/model.py| Deltalm_for_PyTorch/fairseq/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.mnli.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/bart/model.py| Deltalm_for_PyTorch/fairseq/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.cnn.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/bart/model.py| Deltalm_for_PyTorch/fairseq/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.xsum.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/edgelm/fairseq/models/ema/ema.py| Deltalm_for_PyTorch/fairseq/fairseq/models/ema/ema.py | https://github.com/zhawe01/fairseq-gec.git | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/edgelm/fairseq/models/ema/ema.py| Deltalm_for_PyTorch/fairseq/fairseq/models/ema/ema.py | https://www.tensorflow.org/api_docs/python/tf/train/ExponentialMovingAverage | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/roberta/hub_interface.py| Deltalm_for_PyTorch/fairseq/fairseq/models/roberta/hub_interface.py | https://github.com/pytorch/fairseq/tree/master/examples/roberta | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/roberta/model.py| Deltalm_for_PyTorch/fairseq/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.base.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/roberta/model.py| Deltalm_for_PyTorch/fairseq/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/roberta/model.py| Deltalm_for_PyTorch/fairseq/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.mnli.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/roberta/model.py| 
Deltalm_for_PyTorch/fairseq/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.wsc.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/roberta/hub_interface.py| Deltalm_for_PyTorch/fairseq/fairseq/models/roberta/hub_interface.py | https://github.com/pytorch/fairseq/issues/1306 | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/roberta/model_camembert.py| Deltalm_for_PyTorch/fairseq/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/roberta/model_camembert.py| Deltalm_for_PyTorch/fairseq/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/roberta/model_camembert.py| Deltalm_for_PyTorch/fairseq/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/roberta/model_camembert.py| Deltalm_for_PyTorch/fairseq/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-large.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/roberta/model_camembert.py| Deltalm_for_PyTorch/fairseq/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-ccnet.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/roberta/model_gottbert.py| Deltalm_for_PyTorch/fairseq/fairseq/models/roberta/model_gottbert.py | https://dl.gottbert.de/fairseq/models/gottbert-base.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/roberta/model_camembert.py| Deltalm_for_PyTorch/fairseq/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-ccnet-4gb.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/roberta/model_camembert.py| Deltalm_for_PyTorch/fairseq/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-wikipedia-4gb.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/roberta/model_camembert.py| Deltalm_for_PyTorch/fairseq/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-oscar-4gb.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/roberta/model_xlmr.py| Deltalm_for_PyTorch/fairseq/fairseq/models/roberta/model_xlmr.py | http://dl.fbaipublicfiles.com/fairseq/models/xlmr.base.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/roberta/model_xlmr.py| Deltalm_for_PyTorch/fairseq/fairseq/models/roberta/model_xlmr.py | http://dl.fbaipublicfiles.com/fairseq/models/xlmr.large.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/edgelm/fairseq/models/roberta/model_xlmr.py| Deltalm_for_PyTorch/fairseq/fairseq/models/roberta/model_xlmr.py | http://dl.fbaipublicfiles.com/fairseq/models/xlmr/xlmr.xl.tar.gz | 模型相关说明 | -| 
开源代码引入 | https://github.com/microsoft/unilm/blob/master/edgelm/fairseq/models/roberta/model_xlmr.py| Deltalm_for_PyTorch/fairseq/fairseq/models/roberta/model_xlmr.py | http://dl.fbaipublicfiles.com/fairseq/models/xlmr/xlmr.xxl.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/speech_to_text/berard.py| Deltalm_for_PyTorch/fairseq/fairseq/models/speech_to_text/berard.py | https://arxiv.org/abs/1802.04200 | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/speech_to_text/berard.py| Deltalm_for_PyTorch/fairseq/fairseq/models/speech_to_text/berard.py | https://github.com/eske/seq2seq | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/speech_to_text/berard.py| Deltalm_for_PyTorch/fairseq/fairseq/models/speech_to_text/berard.py | https://github.com/eske/seq2seq/blob/master/config/LibriSpeech/AST.yaml | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/speech_to_text/berard.py| Deltalm_for_PyTorch/fairseq/fairseq/models/speech_to_text/berard.py | https://github.com/eske/seq2seq/blob/master/translate/models.py | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/speech_to_text/berard.py| Deltalm_for_PyTorch/fairseq/fairseq/models/speech_to_text/berard.py | https://arxiv.org/abs/1409.0473 | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/speech_to_text/convtransformer.py| Deltalm_for_PyTorch/fairseq/fairseq/models/speech_to_text/convtransformer.py | https://arxiv.org/abs/2004.10234 | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/speech_to_text/s2t_transformer.py| Deltalm_for_PyTorch/fairseq/fairseq/models/speech_to_text/s2t_transformer.py | https://arxiv.org/abs/1911.08460 | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/speech_to_text/s2t_transformer.py| Deltalm_for_PyTorch/fairseq/fairseq/models/speech_to_text/s2t_transformer.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/speech_to_text/berard.py| Deltalm_for_PyTorch/fairseq/fairseq/models/speech_to_text/berard.py | https://arxiv.org/abs/1409.0473 | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/speech_to_text/berard.py| Deltalm_for_PyTorch/fairseq/fairseq/models/speech_to_text/berard.py | https://arxiv.org/abs/1802.04200 | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/speech_to_text/berard.py| Deltalm_for_PyTorch/fairseq/fairseq/models/speech_to_text/berard.py | https://arxiv.org/abs/1909.06515 | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/speech_to_text/berard.py| Deltalm_for_PyTorch/fairseq/fairseq/models/speech_to_text/berard.py | https://arxiv.org/pdf/2002.01320.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/speech_to_text/berard.py| Deltalm_for_PyTorch/fairseq/fairseq/models/speech_to_text/berard.py | https://arxiv.org/abs/2006.12124 | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/speech_to_text/s2t_transformer.py| Deltalm_for_PyTorch/fairseq/fairseq/models/transformer/transformer_base.py | https://arxiv.org/abs/1706.03762 | 
参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/transformer.py| Deltalm_for_PyTorch/fairseq/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt14.en-fr.joined-dict.transformer.tar.bz2 | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/transformer.py| Deltalm_for_PyTorch/fairseq/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt16.en-de.joined-dict.transformer.tar.bz2 | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/transformer.py| Deltalm_for_PyTorch/fairseq/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt18.en-de.ensemble.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/transformer.py| Deltalm_for_PyTorch/fairseq/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-de.joined-dict.ensemble.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/transformer.py| Deltalm_for_PyTorch/fairseq/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-ru.ensemble.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/transformer.py| Deltalm_for_PyTorch/fairseq/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.de-en.joined-dict.ensemble.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/transformer.py| Deltalm_for_PyTorch/fairseq/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.ru-en.ensemble.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/transformer.py| Deltalm_for_PyTorch/fairseq/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-de.joined-dict.single_model.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/transformer.py| Deltalm_for_PyTorch/fairseq/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-ru.single_model.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/transformer.py| Deltalm_for_PyTorch/fairseq/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.de-en.joined-dict.single_model.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/transformer.py| Deltalm_for_PyTorch/fairseq/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.ru-en.single_model.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/transformer.py| Deltalm_for_PyTorch/fairseq/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.en-ta.single.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/transformer.py| Deltalm_for_PyTorch/fairseq/fairseq/models/transformer/transformer_legacy.py | 
https://dl.fbaipublicfiles.com/fairseq/models/wmt20.en-iu.news.single.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/transformer.py| Deltalm_for_PyTorch/fairseq/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.en-iu.nh.single.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/transformer.py| Deltalm_for_PyTorch/fairseq/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.ta-en.single.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/transformer.py| Deltalm_for_PyTorch/fairseq/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.iu-en.news.single.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/transformer.py| Deltalm_for_PyTorch/fairseq/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.iu-en.nh.single.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/edgelm/examples/flores101/README.md| Deltalm_for_PyTorch/fairseq/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/flores101/pretrained_models/flores101_mm100_615M.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/edgelm/examples/flores101/README.md| Deltalm_for_PyTorch/fairseq/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/flores101/pretrained_models/flores101_mm100_175M.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/model_parallel/modules/multihead_attention.py| Deltalm_for_PyTorch/fairseq/fairseq/model_parallel/modules/multihead_attention.py | https://arxiv.org/pdf/1909.08053.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/model_parallel/modules/multihead_attention.py| Deltalm_for_PyTorch/fairseq/fairseq/model_parallel/modules/transformer_layer.py | https://arxiv.org/pdf/1909.08053.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/model_parallel/modules/multihead_attention.py| Deltalm_for_PyTorch/fairseq/fairseq/model_parallel/modules/transformer_layer.py | https://arxiv.org/pdf/1909.08053.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/optim/lr_scheduler/cosine_lr_scheduler.py| Deltalm_for_PyTorch/fairseq/fairseq/optim/lr_scheduler/cosine_lr_scheduler.py | https://arxiv.org/pdf/1608.03983.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/optim/lr_scheduler/triangular_lr_scheduler.py| Deltalm_for_PyTorch/fairseq/fairseq/optim/lr_scheduler/triangular_lr_scheduler.py | https://arxiv.org/pdf/1506.01186.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/optim/lr_scheduler/tri_stage_lr_scheduler.py| Deltalm_for_PyTorch/fairseq/fairseq/optim/lr_scheduler/tri_stage_lr_scheduler.py | https://arxiv.org/pdf/1904.08779.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/latent_depth/latent_depth_src/models/latent_multilingual_transformer.py| Deltalm_for_PyTorch/fairseq/examples/latent_depth/latent_depth_src/models/latent_multilingual_transformer.py | https://arxiv.org/abs/2009.13102 | 
参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/latent_depth/latent_depth_src/models/latent_multilingual_transformer.py| Deltalm_for_PyTorch/fairseq/examples/latent_depth/latent_depth_src/models/latent_transformer.py | https://arxiv.org/abs/2009.13102 | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/latent_depth/latent_depth_src/models/latent_multilingual_transformer.py| Deltalm_for_PyTorch/fairseq/examples/latent_depth/latent_depth_src/modules/latent_layers.py | https://arxiv.org/abs/2009.13102 | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/latent_depth/latent_depth_src/models/latent_multilingual_transformer.py| Deltalm_for_PyTorch/fairseq/examples/latent_depth/latent_depth_src/models/latent_transformer.py | https://arxiv.org/abs/2009.13102 | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/speech_recognition/README.md| Deltalm_for_PyTorch/fairseq/examples/speech_recognition/new/decoders/flashlight_decoder.py | https://github.com/facebookresearch/flashlight/tree/master/bindings/python | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/edgelm/examples/wav2vec/unsupervised/README.md| Deltalm_for_PyTorch/fairseq/examples/wav2vec/unsupervised/scripts/normalize_and_filter_text.py | https://fasttext.cc/docs/en/language-identification.html | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/edgelm/examples/wav2vec/unsupervised/README.md| Deltalm_for_PyTorch/fairseq/examples/wav2vec/unsupervised/scripts/vads.py | https://github.com/zhenghuatan/rVADfast | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/data/audio/feature_transforms/specaugment.py| Deltalm_for_PyTorch/fairseq/fairseq/data/audio/feature_transforms/specaugment.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/edgelm/fairseq/models/speech_to_text/modules/emformer.py| Deltalm_for_PyTorch/fairseq/fairseq/models/speech_to_text/modules/emformer.py | https://arxiv.org/abs/1803.02155 | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/edgelm/fairseq/models/speech_to_text/modules/augmented_memory_attention.py| Deltalm_for_PyTorch/fairseq/fairseq/models/speech_to_text/modules/augmented_memory_attention.py | https://arxiv.org/abs/2005.08042 | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/edgelm/fairseq/models/speech_to_text/modules/augmented_memory_attention.py| Deltalm_for_PyTorch/fairseq/fairseq/models/speech_to_text/modules/augmented_memory_attention.py | https://arxiv.org/abs/2005.09137 | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/edgelm/fairseq/models/speech_to_text/modules/emformer.py| Deltalm_for_PyTorch/fairseq/fairseq/models/speech_to_text/modules/emformer.py | https://arxiv.org/abs/2005.09684 | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/edgelm/examples/textless_nlp/gslm/unit2speech/tacotron2/cleaners.py| Deltalm_for_PyTorch/fairseq/examples/textless_nlp/gslm/unit2speech/tacotron2/cleaners.py | https://github.com/keithito/tacotron | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/edgelm/examples/textless_nlp/gslm/unit2speech/tacotron2/cleaners.py| Deltalm_for_PyTorch/fairseq/examples/textless_nlp/gslm/unit2speech/tacotron2/cleaners.py | https://pypi.python.org/pypi/Unidecode | 模型相关说明 | -| 开源代码引入 | 
https://github.com/microsoft/unilm/blob/master/edgelm/examples/textless_nlp/gslm/unit2speech/tacotron2/cleaners.py| Deltalm_for_PyTorch/fairseq/examples/textless_nlp/gslm/unit2speech/tacotron2/cmudict.py | https://github.com/keithito/tacotron | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/edgelm/examples/speech_text_joint_to_text/scripts/g2p_encode.py| Deltalm_for_PyTorch/fairseq/examples/textless_nlp/gslm/unit2speech/tacotron2/cmudict.py | http://www.speech.cs.cmu.edu/cgi-bin/cmudict | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/edgelm/examples/textless_nlp/gslm/unit2speech/tacotron2/cleaners.py| Deltalm_for_PyTorch/fairseq/examples/textless_nlp/gslm/unit2speech/tacotron2/numbers.py | https://github.com/keithito/tacotron | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/edgelm/examples/textless_nlp/gslm/unit2speech/tacotron2/stft.py| Deltalm_for_PyTorch/fairseq/examples/textless_nlp/gslm/unit2speech/tacotron2/stft.py | https://github.com/pseeth/pytorch-stft | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/edgelm/examples/textless_nlp/gslm/unit2speech/tacotron2/cleaners.py| Deltalm_for_PyTorch/fairseq/examples/textless_nlp/gslm/unit2speech/tacotron2/text.py | https://github.com/keithito/tacotron | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/edgelm/examples/textless_nlp/gslm/unit2speech/tacotron2/cleaners.py| Deltalm_for_PyTorch/fairseq/examples/textless_nlp/gslm/unit2speech/tacotron2/symbols.py | https://github.com/keithito/tacotron | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/edgelm/examples/wav2vec/unsupervised/kaldi_self_train/st/cmd.sh| Deltalm_for_PyTorch/fairseq/examples/wav2vec/unsupervised/kaldi_self_train/st/cmd.sh | http://kaldi-asr.org/doc/queue.html | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/deltalm/examples/prepare_iwslt14.sh| Deltalm_for_PyTorch/examples/prepare_iwslt14.sh | https://github.com/pytorch/fairseq/blob/master/examples/translation/prepare-iwslt14.sh | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/.gitmodules| Deltalm_for_PyTorch/fairseq/.gitmodules | https://github.com/ngoyal2707/Megatron-LM | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/setup.py| Deltalm_for_PyTorch/fairseq/setup.py | https://stackoverflow.com/a/54128391 | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/setup.py| Deltalm_for_PyTorch/fairseq/setup.py | https://download.pytorch.org/whl/cpu/torch-1.7.0%2Bcpu-cp36-cp36m-linux_x86_64.whl | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/setup.py| Deltalm_for_PyTorch/fairseq/setup.py | https://bit.ly/2NLVsgE | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/.gitmodules| Deltalm_for_PyTorch/fairseq/setup.py | https://github.com/pytorch/fairseq | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/docs/conf.py| Deltalm_for_PyTorch/fairseq/docs/conf.py | https://github.com/pytorch/fairseq/tree/master/docs/ | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/docs/conf.py| Deltalm_for_PyTorch/fairseq/docs/conf.py | http://alabaster.readthedocs.io/en/latest/installation.html#sidebars | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/docs/conf.py| Deltalm_for_PyTorch/fairseq/docs/conf.py | http://docs.scipy.org/doc/numpy/ | 模型相关说明 | -| 
开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/docs/conf.py| Deltalm_for_PyTorch/fairseq/docs/conf.py | https://docs.python.org/ | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/docs/conf.py| Deltalm_for_PyTorch/fairseq/docs/conf.py | https://pytorch.org/docs/master/ | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/file_utils.py| Deltalm_for_PyTorch/fairseq/fairseq/file_utils.py | https://github.com/allenai/allennlp | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/adalm/README.md| Deltalm_for_PyTorch/fairseq/fairseq/file_utils.py | https://github.com/huggingface | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/checkpoint_utils.py| Deltalm_for_PyTorch/fairseq/fairseq/checkpoint_utils.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/search.py| Deltalm_for_PyTorch/fairseq/fairseq/search.py | https://www.aclweb.org/anthology/N18-1119/ | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/search.py| Deltalm_for_PyTorch/fairseq/fairseq/search.py | https://www.aclweb.org/anthology/N19-1090/ | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/docs/make.bat| Deltalm_for_PyTorch/fairseq/docs/make.bat | http://sphinx-doc.org/ | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/search.py| Deltalm_for_PyTorch/fairseq/fairseq/search.py | https://arxiv.org/abs/1904.09751 | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/search.py| Deltalm_for_PyTorch/fairseq/fairseq/search.py | https://arxiv.org/abs/1611.08562 | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq_cli/hydra_train.py| Deltalm_for_PyTorch/fairseq/fairseq_cli/hydra_train.py | https://github.com/facebookresearch/hydra/issues/1126 | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/scripts/build_sym_alignment.py| Deltalm_for_PyTorch/fairseq/scripts/build_sym_alignment.py | http://github.com/clab/fast_align | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/scripts/build_sym_alignment.py| Deltalm_for_PyTorch/fairseq/scripts/build_sym_alignment.py | http://github.com/moses-smt/mosesdecoder | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/scripts/build_sym_alignment.py| Deltalm_for_PyTorch/fairseq/scripts/build_sym_alignment.py | http://www.statmt.org/moses/?n=Development.GetStarted | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq_cli/hydra_train.py| Deltalm_for_PyTorch/fairseq/fairseq_cli/train.py | https://github.com/facebookresearch/hydra/issues/1126 | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/data/indexed_dataset.py| Deltalm_for_PyTorch/fairseq/tests/test_token_block_dataset.py | https://github.com/numpy/numpy/issues/5745 | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/adaptive_span/adaptive_span_model_wrapper.py| Deltalm_for_PyTorch/fairseq/examples/adaptive_span/adaptive_span_model_wrapper.py | https://github.com/facebookresearch/adaptive-span/blob/master/experiments/enwik8_small.sh | 源码实现 | -| 开源代码引入 | 
https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/backtranslation/prepare-de-monolingual.sh| Deltalm_for_PyTorch/fairseq/examples/backtranslation/prepare-de-monolingual.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2007.de.shuffled.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/backtranslation/prepare-de-monolingual.sh| Deltalm_for_PyTorch/fairseq/examples/backtranslation/prepare-de-monolingual.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2008.de.shuffled.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/backtranslation/prepare-de-monolingual.sh| Deltalm_for_PyTorch/fairseq/examples/backtranslation/prepare-de-monolingual.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2009.de.shuffled.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/backtranslation/prepare-de-monolingual.sh| Deltalm_for_PyTorch/fairseq/examples/backtranslation/prepare-de-monolingual.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2010.de.shuffled.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/backtranslation/prepare-de-monolingual.sh| Deltalm_for_PyTorch/fairseq/examples/backtranslation/prepare-de-monolingual.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2011.de.shuffled.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/backtranslation/prepare-wmt18en2de.sh| Deltalm_for_PyTorch/fairseq/examples/backtranslation/prepare-wmt18en2de.sh | https://github.com/facebookresearch/MIXER/blob/master/prepareData.sh | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/backtranslation/prepare-wmt18en2de.sh| Deltalm_for_PyTorch/fairseq/examples/backtranslation/prepare-wmt18en2de.sh | https://github.com/moses-smt/mosesdecoder.git | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/backtranslation/prepare-wmt18en2de.sh| Deltalm_for_PyTorch/fairseq/examples/backtranslation/prepare-wmt18en2de.sh | https://github.com/rsennrich/subword-nmt.git | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/backtranslation/prepare-wmt18en2de.sh| Deltalm_for_PyTorch/fairseq/examples/backtranslation/prepare-wmt18en2de.sh | http://statmt.org/wmt13/training-parallel-europarl-v7.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/backtranslation/prepare-wmt18en2de.sh| Deltalm_for_PyTorch/fairseq/examples/backtranslation/prepare-wmt18en2de.sh | http://statmt.org/wmt13/training-parallel-commoncrawl.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/backtranslation/prepare-wmt18en2de.sh| Deltalm_for_PyTorch/fairseq/examples/backtranslation/prepare-wmt18en2de.sh | http://data.statmt.org/wmt18/translation-task/training-parallel-nc-v13.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/backtranslation/prepare-wmt18en2de.sh| Deltalm_for_PyTorch/fairseq/examples/backtranslation/prepare-wmt18en2de.sh | http://data.statmt.org/wmt18/translation-task/rapid2016.tgz | 模型相关说明 | -| 开源代码引入 | 
https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/backtranslation/prepare-wmt18en2de.sh| Deltalm_for_PyTorch/fairseq/examples/backtranslation/prepare-wmt18en2de.sh | http://data.statmt.org/wmt17/translation-task/dev.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/backtranslation/prepare-wmt18en2de.sh| Deltalm_for_PyTorch/fairseq/examples/backtranslation/prepare-wmt18en2de.sh | http://statmt.org/wmt14/test-full.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/backtranslation/prepare-wmt18en2de.sh| Deltalm_for_PyTorch/fairseq/examples/backtranslation/tokenized_bleu.sh | https://github.com/rsennrich/subword-nmt.git | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/backtranslation/prepare-de-monolingual.sh| Deltalm_for_PyTorch/fairseq/examples/backtranslation/prepare-de-monolingual.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2012.de.shuffled.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/backtranslation/prepare-de-monolingual.sh| Deltalm_for_PyTorch/fairseq/examples/backtranslation/prepare-de-monolingual.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2013.de.shuffled.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/backtranslation/prepare-de-monolingual.sh| Deltalm_for_PyTorch/fairseq/examples/backtranslation/prepare-de-monolingual.sh | http://www.statmt.org/wmt15/training-monolingual-news-crawl-v2/news.2014.de.shuffled.v2.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/backtranslation/prepare-de-monolingual.sh| Deltalm_for_PyTorch/fairseq/examples/backtranslation/prepare-de-monolingual.sh | http://data.statmt.org/wmt16/translation-task/news.2015.de.shuffled.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/backtranslation/prepare-de-monolingual.sh| Deltalm_for_PyTorch/fairseq/examples/backtranslation/prepare-de-monolingual.sh | http://data.statmt.org/wmt17/translation-task/news.2016.de.shuffled.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/backtranslation/prepare-de-monolingual.sh| Deltalm_for_PyTorch/fairseq/examples/backtranslation/prepare-de-monolingual.sh | http://data.statmt.org/wmt18/translation-task/news.2017.de.shuffled.deduped.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/backtranslation/prepare-wmt18en2de.sh| Deltalm_for_PyTorch/fairseq/examples/backtranslation/sacrebleu.sh | https://github.com/rsennrich/subword-nmt.git | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/byte_level_bpe/get_data.sh| Deltalm_for_PyTorch/fairseq/examples/byte_level_bpe/get_data.sh | https://wit3.fbk.eu/archive/2017-01-trnted/texts/fr/en/fr-en.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/criss/download_and_preprocess_flores_test.sh| Deltalm_for_PyTorch/fairseq/examples/criss/download_and_preprocess_flores_test.sh | https://github.com/facebookresearch/flores | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/criss/download_and_preprocess_flores_test.sh| 
Deltalm_for_PyTorch/fairseq/examples/criss/download_and_preprocess_flores_test.sh | https://github.com/facebookresearch/flores/raw/master/data/wikipedia_en_ne_si_test_sets.tgz | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/criss/download_and_preprocess_flores_test.sh| Deltalm_for_PyTorch/fairseq/examples/criss/download_and_preprocess_tatoeba.sh | https://github.com/facebookresearch/flores | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/criss/download_and_preprocess_tatoeba.sh| Deltalm_for_PyTorch/fairseq/examples/criss/download_and_preprocess_tatoeba.sh | https://github.com/facebookresearch/LASER | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/backtranslation/prepare-wmt18en2de.sh| Deltalm_for_PyTorch/fairseq/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | https://github.com/moses-smt/mosesdecoder.git | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/backtranslation/prepare-wmt18en2de.sh| Deltalm_for_PyTorch/fairseq/examples/language_model/prepare-wikitext-103.sh | https://github.com/facebookresearch/MIXER/blob/master/prepareData.sh | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/backtranslation/prepare-wmt18en2de.sh| Deltalm_for_PyTorch/fairseq/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | http://statmt.org/wmt13/training-parallel-europarl-v7.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/language_model/prepare-wikitext-103.sh| Deltalm_for_PyTorch/fairseq/examples/language_model/prepare-wikitext-103.sh | https://s3.amazonaws.com/research.metamind.io/wikitext/wikitext-103-v1.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/backtranslation/prepare-wmt18en2de.sh| Deltalm_for_PyTorch/fairseq/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | http://statmt.org/wmt13/training-parallel-commoncrawl.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/backtranslation/prepare-wmt18en2de.sh| Deltalm_for_PyTorch/fairseq/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | http://data.statmt.org/wmt18/translation-task/training-parallel-nc-v13.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/backtranslation/prepare-wmt18en2de.sh| Deltalm_for_PyTorch/fairseq/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | http://data.statmt.org/wmt18/translation-task/rapid2016.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/backtranslation/prepare-wmt18en2de.sh| Deltalm_for_PyTorch/fairseq/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | http://data.statmt.org/wmt17/translation-task/dev.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/backtranslation/prepare-wmt18en2de.sh| Deltalm_for_PyTorch/fairseq/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | http://statmt.org/wmt14/test-full.tgz | 模型相关说明 | -| 开源代码引入 | 
https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/fast_noisy_channel/README.md| Deltalm_for_PyTorch/fairseq/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | https://github.com/glample/fastBPE.git | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/backtranslation/prepare-wmt18en2de.sh| Deltalm_for_PyTorch/fairseq/examples/m2m_100/install_dependecies.sh | https://github.com/moses-smt/mosesdecoder.git | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/m2m_100/install_dependecies.sh| Deltalm_for_PyTorch/fairseq/examples/m2m_100/install_dependecies.sh | https://github.com/rsennrich/wmt16-scripts.git | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/m2m_100/install_dependecies.sh| Deltalm_for_PyTorch/fairseq/examples/m2m_100/install_dependecies.sh | https://github.com/neubig/kytea.git | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/m2m_100/install_dependecies.sh| Deltalm_for_PyTorch/fairseq/examples/m2m_100/install_dependecies.sh | https://bitbucket.org/eunjeon/mecab-ko/downloads/mecab-0.996-ko-0.9.2.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/m2m_100/install_dependecies.sh| Deltalm_for_PyTorch/fairseq/examples/m2m_100/install_dependecies.sh | https://bitbucket.org/eunjeon/mecab-ko-dic/downloads/mecab-ko-dic-2.1.1-20180720.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/m2m_100/install_dependecies.sh| Deltalm_for_PyTorch/fairseq/examples/m2m_100/install_dependecies.sh | https://github.com/anoopkunchukuttan/indic_nlp_resources.git | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/m2m_100/install_dependecies.sh| Deltalm_for_PyTorch/fairseq/examples/m2m_100/install_dependecies.sh | http://lotus.kuee.kyoto-u.ac.jp/WAT/my-en-data/wat2020.my-en.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/data/encoders/gpt2_bpe.py| Deltalm_for_PyTorch/fairseq/examples/roberta/multiprocessing_bpe_encoder.py | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/encoder.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/data/encoders/gpt2_bpe.py| Deltalm_for_PyTorch/fairseq/examples/roberta/multiprocessing_bpe_encoder.py | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/vocab.bpe | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/roberta/preprocess_GLUE_tasks.sh| Deltalm_for_PyTorch/fairseq/examples/roberta/preprocess_GLUE_tasks.sh | https://gist.github.com/W4ngatang/60c2bdb54d156a41194446737ce03e2e | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/data/encoders/gpt2_bpe.py| Deltalm_for_PyTorch/fairseq/examples/roberta/preprocess_GLUE_tasks.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/encoder.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/data/encoders/gpt2_bpe.py| Deltalm_for_PyTorch/fairseq/examples/roberta/preprocess_GLUE_tasks.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/vocab.bpe | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/bart/README.summarization.md| 
Deltalm_for_PyTorch/fairseq/examples/roberta/preprocess_GLUE_tasks.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/dict.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/data/encoders/gpt2_bpe.py| Deltalm_for_PyTorch/fairseq/examples/roberta/preprocess_RACE.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/encoder.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/data/encoders/gpt2_bpe.py| Deltalm_for_PyTorch/fairseq/examples/roberta/preprocess_RACE.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/vocab.bpe | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/bart/README.summarization.md| Deltalm_for_PyTorch/fairseq/examples/roberta/preprocess_RACE.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/dict.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/speech_recognition/README.md| Deltalm_for_PyTorch/fairseq/examples/speech_recognition/w2l_decoder.py | https://github.com/facebookresearch/flashlight/tree/master/bindings/python | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/speech_to_text/prep_covost_data.py| Deltalm_for_PyTorch/fairseq/examples/speech_to_text/prep_covost_data.py | https://github.com/facebookresearch/covost | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/speech_to_text/prep_covost_data.py| Deltalm_for_PyTorch/fairseq/examples/speech_to_text/prep_covost_data.py | https://dl.fbaipublicfiles.com/covost/ | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/backtranslation/prepare-wmt18en2de.sh| Deltalm_for_PyTorch/fairseq/examples/translation/prepare-iwslt14.sh | https://github.com/facebookresearch/MIXER/blob/master/prepareData.sh | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/backtranslation/prepare-wmt18en2de.sh| Deltalm_for_PyTorch/fairseq/examples/translation/prepare-iwslt14.sh | https://github.com/moses-smt/mosesdecoder.git | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/backtranslation/prepare-wmt18en2de.sh| Deltalm_for_PyTorch/fairseq/examples/translation/prepare-iwslt14.sh | https://github.com/rsennrich/subword-nmt.git | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/translation/prepare-iwslt14.sh| Deltalm_for_PyTorch/fairseq/examples/translation/prepare-iwslt14.sh | http://dl.fbaipublicfiles.com/fairseq/data/iwslt14/de-en.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/translation/prepare-iwslt17-multilingual.sh| Deltalm_for_PyTorch/fairseq/examples/translation/prepare-iwslt17-multilingual.sh | https://wit3.fbk.eu/archive/2017-01-trnted/texts/de/en/de-en.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/byte_level_bpe/get_data.sh| Deltalm_for_PyTorch/fairseq/examples/translation/prepare-iwslt17-multilingual.sh | https://wit3.fbk.eu/archive/2017-01-trnted/texts/fr/en/fr-en.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/backtranslation/prepare-wmt18en2de.sh| Deltalm_for_PyTorch/fairseq/examples/translation/prepare-wmt14en2de.sh | https://github.com/facebookresearch/MIXER/blob/master/prepareData.sh | 源码实现 | -| 
开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/backtranslation/prepare-wmt18en2de.sh| Deltalm_for_PyTorch/fairseq/examples/translation/prepare-wmt14en2de.sh | https://github.com/moses-smt/mosesdecoder.git | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/backtranslation/prepare-wmt18en2de.sh| Deltalm_for_PyTorch/fairseq/examples/translation/prepare-wmt14en2de.sh | https://github.com/rsennrich/subword-nmt.git | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/backtranslation/prepare-wmt18en2de.sh| Deltalm_for_PyTorch/fairseq/examples/translation/prepare-wmt14en2fr.sh | https://github.com/facebookresearch/MIXER/blob/master/prepareData.sh | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/backtranslation/prepare-wmt18en2de.sh| Deltalm_for_PyTorch/fairseq/examples/translation/prepare-wmt14en2fr.sh | https://github.com/moses-smt/mosesdecoder.git | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/backtranslation/prepare-wmt18en2de.sh| Deltalm_for_PyTorch/fairseq/examples/translation/prepare-wmt14en2de.sh | http://statmt.org/wmt13/training-parallel-europarl-v7.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/backtranslation/prepare-wmt18en2de.sh| Deltalm_for_PyTorch/fairseq/examples/translation/prepare-wmt14en2fr.sh | https://github.com/rsennrich/subword-nmt.git | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/backtranslation/prepare-wmt18en2de.sh| Deltalm_for_PyTorch/fairseq/examples/translation/prepare-wmt14en2de.sh | http://statmt.org/wmt13/training-parallel-commoncrawl.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py| Deltalm_for_PyTorch/fairseq/examples/translation/prepare-wmt14en2de.sh | http://data.statmt.org/wmt17/translation-task/training-parallel-nc-v12.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/backtranslation/prepare-wmt18en2de.sh| Deltalm_for_PyTorch/fairseq/examples/translation/prepare-wmt14en2de.sh | http://data.statmt.org/wmt17/translation-task/dev.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/backtranslation/prepare-wmt18en2de.sh| Deltalm_for_PyTorch/fairseq/examples/translation/prepare-wmt14en2de.sh | http://statmt.org/wmt14/test-full.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/backtranslation/prepare-wmt18en2de.sh| Deltalm_for_PyTorch/fairseq/examples/translation/prepare-wmt14en2fr.sh | http://statmt.org/wmt13/training-parallel-europarl-v7.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/backtranslation/prepare-wmt18en2de.sh| Deltalm_for_PyTorch/fairseq/examples/translation/prepare-wmt14en2fr.sh | http://statmt.org/wmt13/training-parallel-commoncrawl.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/translation/prepare-wmt14en2fr.sh| Deltalm_for_PyTorch/fairseq/examples/translation/prepare-wmt14en2fr.sh | http://statmt.org/wmt13/training-parallel-un.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/fconv.py| 
Deltalm_for_PyTorch/fairseq/examples/translation/prepare-wmt14en2de.sh | https://arxiv.org/abs/1705.03122 | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/translation/prepare-wmt14en2de.sh| Deltalm_for_PyTorch/fairseq/examples/translation/prepare-wmt14en2de.sh | http://statmt.org/wmt14/training-parallel-nc-v9.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/translation/prepare-wmt14en2de.sh| Deltalm_for_PyTorch/fairseq/examples/translation/prepare-wmt14en2fr.sh | http://statmt.org/wmt14/training-parallel-nc-v9.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/translation/prepare-wmt14en2fr.sh| Deltalm_for_PyTorch/fairseq/examples/translation/prepare-wmt14en2fr.sh | http://statmt.org/wmt10/training-giga-fren.tar | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/backtranslation/prepare-wmt18en2de.sh| Deltalm_for_PyTorch/fairseq/examples/translation/prepare-wmt14en2fr.sh | http://statmt.org/wmt14/test-full.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/examples/translation_moe/README.md| Deltalm_for_PyTorch/fairseq/examples/translation_moe/score.py | https://arxiv.org/abs/1902.07816 | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/criterions/adaptive_loss.py| Deltalm_for_PyTorch/fairseq/fairseq/criterions/adaptive_loss.py | http://arxiv.org/abs/1609.04309 | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/data/indexed_dataset.py| Deltalm_for_PyTorch/fairseq/fairseq/data/indexed_dataset.py | https://github.com/numpy/numpy/issues/5745 | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/data/mask_tokens_dataset.py| Deltalm_for_PyTorch/fairseq/fairseq/data/mask_tokens_dataset.py | https://arxiv.org/abs/1910.05453 | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/dataclass/constants.py| Deltalm_for_PyTorch/fairseq/fairseq/dataclass/constants.py | https://github.com/facebookresearch/hydra/issues/1156 | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/dataclass/configs.py| Deltalm_for_PyTorch/fairseq/fairseq/dataclass/configs.py | https://github.com/facebookresearch/hydra/issues/1117 | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/fairseq_incremental_decoder.py| Deltalm_for_PyTorch/fairseq/fairseq/models/fairseq_incremental_decoder.py | http://www.telesens.co/2019/04/21/understanding-incremental-decoding-in-fairseq/ | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/fconv.py| Deltalm_for_PyTorch/fairseq/fairseq/models/fconv.py | https://arxiv.org/abs/1705.03122 | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/fconv.py| Deltalm_for_PyTorch/fairseq/fairseq/models/fconv.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt14.v2.en-fr.fconv-py.tar.bz2 | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/fconv.py| Deltalm_for_PyTorch/fairseq/fairseq/models/fconv.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt14.en-de.fconv-py.tar.bz2 | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/fconv.py| 
Deltalm_for_PyTorch/fairseq/fairseq/models/fconv.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt17.v2.en-de.fconv-py.tar.bz2 | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/fconv_self_att.py| Deltalm_for_PyTorch/fairseq/fairseq/models/fconv_self_att.py | https://dl.fbaipublicfiles.com/fairseq/models/stories_checkpoint.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/fconv_self_att.py| Deltalm_for_PyTorch/fairseq/fairseq/models/fconv_self_att.py | https://dl.fbaipublicfiles.com/fairseq/models/stories_checkpoint.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/fconv_self_att.py| Deltalm_for_PyTorch/fairseq/fairseq/models/fconv_self_att.py | https://dl.fbaipublicfiles.com/fairseq/data/stories_test.tar.bz2 | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/lightconv.py| Deltalm_for_PyTorch/fairseq/fairseq/models/lightconv.py | https://openreview.net/pdf?id=SkVhlh09tX | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/lightconv.py| Deltalm_for_PyTorch/fairseq/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/iwslt14.de-en.lightconv.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/lightconv.py| Deltalm_for_PyTorch/fairseq/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/iwslt14.de-en.dynamicconv.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/lightconv.py| Deltalm_for_PyTorch/fairseq/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.lightconv.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/lightconv.py| Deltalm_for_PyTorch/fairseq/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.dynamicconv.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/lightconv.py| Deltalm_for_PyTorch/fairseq/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.lightconv-glu.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/lightconv.py| Deltalm_for_PyTorch/fairseq/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.dynamicconv-glu.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/lightconv.py| Deltalm_for_PyTorch/fairseq/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.lightconv-glu.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/lightconv.py| Deltalm_for_PyTorch/fairseq/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.dynamicconv-glu.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/lightconv.py| Deltalm_for_PyTorch/fairseq/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt14.en-fr.joined-dict.lightconv-glu.tar.gz | 模型相关说明 | -| 开源代码引入 | 
https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/lightconv.py| Deltalm_for_PyTorch/fairseq/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt14.en-fr.joined-dict.dynamicconv-glu.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/lightconv.py| Deltalm_for_PyTorch/fairseq/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt17.zh-en.lightconv-glu.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/lightconv.py| Deltalm_for_PyTorch/fairseq/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt17.zh-en.dynamicconv-glu.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/transformer_lm.py| Deltalm_for_PyTorch/fairseq/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/adaptive_lm_gbw_huge.tar.bz2 | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/transformer_lm.py| Deltalm_for_PyTorch/fairseq/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/adaptive_lm_wiki103.v2.tar.bz2 | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/transformer_lm.py| Deltalm_for_PyTorch/fairseq/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.en.tar.bz2 | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/criterions/adaptive_loss.py| Deltalm_for_PyTorch/fairseq/fairseq/modules/adaptive_softmax.py | http://arxiv.org/abs/1609.04309 | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/transformer_lm.py| Deltalm_for_PyTorch/fairseq/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.de.tar.bz2 | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/transformer_lm.py| Deltalm_for_PyTorch/fairseq/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.ru.tar.bz2 | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/transformer_lm.py| Deltalm_for_PyTorch/fairseq/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt20.en.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/transformer_lm.py| Deltalm_for_PyTorch/fairseq/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt20.ta.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/transformer_lm.py| Deltalm_for_PyTorch/fairseq/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt20.iu.news.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/models/transformer_lm.py| Deltalm_for_PyTorch/fairseq/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt20.iu.nh.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/modules/character_token_embedder.py| Deltalm_for_PyTorch/fairseq/fairseq/modules/character_token_embedder.py | https://arxiv.org/abs/1505.00387 | 参考论文地址 | -| 开源代码引入 | 
https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/modules/dynamic_crf_layer.py| Deltalm_for_PyTorch/fairseq/fairseq/modules/dynamic_crf_layer.py | https://arxiv.org/abs/1910.11555 | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/modules/dynamic_crf_layer.py| Deltalm_for_PyTorch/fairseq/fairseq/modules/dynamic_crf_layer.py | https://github.com/kmkurn/pytorch-crf/blob/master/torchcrf/__init__.py | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/modules/gelu.py| Deltalm_for_PyTorch/fairseq/fairseq/modules/gelu.py | https://github.com/hendrycks/GELUs | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/checkpoint_utils.py| Deltalm_for_PyTorch/fairseq/fairseq/modules/layer_drop.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/modules/sparse_multihead_attention.py| Deltalm_for_PyTorch/fairseq/fairseq/modules/sparse_multihead_attention.py | https://arxiv.org/pdf/1904.10509.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/modules/vggblock.py| Deltalm_for_PyTorch/fairseq/fairseq/modules/vggblock.py | https://arxiv.org/pdf/1409.1556.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/optim/adafactor.py| Deltalm_for_PyTorch/fairseq/fairseq/optim/adafactor.py | https://arxiv.org/abs/1804.04235 | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/optim/adam.py| Deltalm_for_PyTorch/fairseq/fairseq/optim/adam.py | https://arxiv.org/abs/1711.05101 | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/optim/adam.py| Deltalm_for_PyTorch/fairseq/fairseq/optim/adam.py | https://arxiv.org/abs/1412.6980 | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/optim/adam.py| Deltalm_for_PyTorch/fairseq/fairseq/optim/adam.py | https://openreview.net/forum?id=ryQu7f-RZ | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/optim/adam.py| Deltalm_for_PyTorch/fairseq/fairseq/optim/adamax.py | https://arxiv.org/abs/1412.6980 | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/optim/bmuf.py| Deltalm_for_PyTorch/fairseq/fairseq/optim/bmuf.py | https://ieeexplore.ieee.org/document/7472805 | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/optim/adam.py| Deltalm_for_PyTorch/fairseq/fairseq/optim/fused_adam.py | https://arxiv.org/abs/1412.6980 | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/optim/adam.py| Deltalm_for_PyTorch/fairseq/fairseq/optim/fused_adam.py | https://openreview.net/forum?id=ryQu7f-RZ | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/scoring/tokenizer.py| Deltalm_for_PyTorch/fairseq/fairseq/scoring/tokenizer.py | https://github.com/mjpost/sacrebleu | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/fairseq/tasks/cross_lingual_lm.py| Deltalm_for_PyTorch/fairseq/fairseq/tasks/cross_lingual_lm.py | https://arxiv.org/pdf/1901.07291.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/edgelm/fairseq/tasks/fairseq_task.py| Deltalm_for_PyTorch/fairseq/fairseq/tasks/fairseq_task.py | https://arxiv.org/abs/2010.00904 | 参考论文地址 | -| 开源代码引入 | 
https://github.com/microsoft/unilm/blob/master/edgelm/fairseq/tasks/fairseq_task.py| Deltalm_for_PyTorch/fairseq/fairseq/tasks/fairseq_task.py | https://github.com/facebookresearch/GENRE | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/block_plugins/tasks/translation_lev_modified.py| Deltalm_for_PyTorch/fairseq/fairseq/tasks/translation_lev.py | https://arxiv.org/abs/1905.11006 | 参考论文地址 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/GAD/block_plugins/tasks/translation_lev_modified.py| Deltalm_for_PyTorch/fairseq/fairseq/tasks/translation_lev.py | https://www.aclweb.org/anthology/2020.acl-main.325/ | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/tests/speech_recognition/asr_test_base.py| Deltalm_for_PyTorch/fairseq/tests/speech_recognition/asr_test_base.py | https://fburl.com/batch_first_example | 模型相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm/blob/master/decoding/IAD/fairseq/tests/speech_recognition/asr_test_base.py| Deltalm_for_PyTorch/fairseq/tests/speech_recognition/asr_test_base.py | https://fburl.com/batch_first_example | 模型相关说明 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|-------------------------------------------------------------------------------------------------------------------------------------------------------------|--------------------------------------------------------------------------------------------------------------------|-----------| +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/.github/workflows/build.yml | https://download.pytorch.org/whl/torch_stable.html | 三方库下载 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/backtranslation/prepare-de-monolingual.sh | http://www.statmt.org/wmt15/training-monolingual-news-crawl-v2/news.2014.de.shuffled.v2.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/backtranslation/prepare-de-monolingual.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2013.de.shuffled.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/backtranslation/prepare-de-monolingual.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2012.de.shuffled.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/backtranslation/prepare-de-monolingual.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2011.de.shuffled.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/backtranslation/prepare-de-monolingual.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2010.de.shuffled.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/backtranslation/prepare-de-monolingual.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2009.de.shuffled.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/backtranslation/prepare-de-monolingual.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2008.de.shuffled.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/backtranslation/prepare-de-monolingual.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2007.de.shuffled.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/backtranslation/prepare-de-monolingual.sh | 
http://data.statmt.org/wmt18/translation-task/news.2017.de.shuffled.deduped.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/backtranslation/prepare-de-monolingual.sh | http://data.statmt.org/wmt17/translation-task/news.2016.de.shuffled.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/backtranslation/prepare-de-monolingual.sh | http://data.statmt.org/wmt16/translation-task/news.2015.de.shuffled.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/backtranslation/prepare-wmt18en2de.sh | http://statmt.org/wmt14/test-full.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/backtranslation/prepare-wmt18en2de.sh | http://statmt.org/wmt13/training-parallel-europarl-v7.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/backtranslation/prepare-wmt18en2de.sh | http://statmt.org/wmt13/training-parallel-commoncrawl.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/backtranslation/prepare-wmt18en2de.sh | http://data.statmt.org/wmt18/translation-task/training-parallel-nc-v13.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/backtranslation/prepare-wmt18en2de.sh | http://data.statmt.org/wmt18/translation-task/rapid2016.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/backtranslation/prepare-wmt18en2de.sh | http://data.statmt.org/wmt17/translation-task/dev.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | http://statmt.org/wmt14/test-full.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | http://statmt.org/wmt13/training-parallel-europarl-v7.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | http://statmt.org/wmt13/training-parallel-commoncrawl.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | http://data.statmt.org/wmt18/translation-task/training-parallel-nc-v13.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | http://data.statmt.org/wmt18/translation-task/rapid2016.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | http://data.statmt.org/wmt17/translation-task/dev.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/language_model/prepare-wikitext-103.sh | https://s3.amazonaws.com/research.metamind.io/wikitext/wikitext-103-v1.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/m2m_100/install_dependecies.sh | http://lotus.kuee.kyoto-u.ac.jp/WAT/my-en-data/wat2020.my-en.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/m2m_100/tokenizers/tokenizer_ar.sh | http://alt.qcri.org/tools/arabic-normalizer/ | 工具下载链接 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/binarize.py | https://dl.fbaipublicfiles.com/fairseq/models/mbart50/sentence.bpe.model -O {SPM_MODEL} | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/binarize.py | https://dl.fbaipublicfiles.com/fairseq/models/mbart50/dict_250k.txt -O {SPM_VOCAB} | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_flores_data.sh | http://www.seas.upenn.edu/~nlp/resources/TACL-data-release/dictionaries.tar.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_flores_data.sh | https://dl.fbaipublicfiles.com/fairseq/data/nepali-penn-treebank.ne.patch | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_flores_data.sh | https://dl.fbaipublicfiles.com/fairseq/data/nepali-penn-treebank.en.patch | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_iitb.sh | http://www.cfilt.iitb.ac.in/~moses/iitb_en_hi_parallel/iitb_corpus_download/parallel.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_iitb.sh | http://www.cfilt.iitb.ac.in/~moses/iitb_en_hi_parallel/iitb_corpus_download/dev_test.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_lotus.sh | http://lotus.kuee.kyoto-u.ac.jp/WAT/indic-multilingual/indic_languages_corpus.tar.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_ted_and_extract.py | http://phontron.com/data/ted_talks.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wat19_my.sh | http://lotus.kuee.kyoto-u.ac.jp/WAT/my-en-data/$WAT_MY_EN | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | https://stuncorpusprod.blob.core.windows.net/corpusfiles/UNv1.0.en-ru.tar.gz.00 | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://data.statmt.org/wmt16/translation-task/dev-romanian-updated.tgz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://data.statmt.org/wmt16/translation-task/test.tgz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://data.statmt.org/wmt16/translation-task/training-parallel-ep-v8.tgz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://data.statmt.org/wmt17/translation-task/dev.tgz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://data.statmt.org/wmt17/translation-task/rapid2016.tgz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | 
http://data.statmt.org/wmt17/translation-task/test.tgz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://data.statmt.org/wmt17/translation-task/test-update-1.tgz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://data.statmt.org/wmt17/translation-task/training-parallel-ep-v8.tgz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://data.statmt.org/wmt18/translation-task/dev.tgz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://data.statmt.org/wmt18/translation-task/rapid2016.tgz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://data.statmt.org/wmt18/translation-task/test.tgz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://data.statmt.org/wmt19/translation-task/dev.tgz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://data.statmt.org/wmt19/translation-task/test.tgz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://nlp.nju.edu.cn/cwmt-wmt/CASIA2015.zip | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://nlp.nju.edu.cn/cwmt-wmt/CASICT2011.zip | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://nlp.nju.edu.cn/cwmt-wmt/CASICT2015.zip | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://nlp.nju.edu.cn/cwmt-wmt/Datum2015.zip | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://nlp.nju.edu.cn/cwmt-wmt/Datum2017.zip | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://nlp.nju.edu.cn/cwmt-wmt/NEU2017.zip | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://www.statmt.org/wmt10/training-giga-fren.tar | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://www.statmt.org/wmt13/dev.tgz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://www.statmt.org/wmt13/test.tgz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://www.statmt.org/wmt14/dev.tgz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://www.statmt.org/wmt14/test-full.tgz | 相关说明 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | https://translate.yandex.ru/corpus?lang=en | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://data.statmt.org/news-commentary/v14/training/news-commentary-v14.en-ru.tsv.gz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://data.statmt.org/news-commentary/v14/training/news-commentary-v14-wmt19.en-kk.tsv.gz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://data.statmt.org/wikititles/v1/wikititles-v1.gu-en.tsv.gz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://data.statmt.org/wikititles/v1/wikititles-v1.kk-en.tsv.gz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://data.statmt.org/wikititles/v1/wikititles-v1.lt-en.tsv.gz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://data.statmt.org/wikititles/v1/wikititles-v1.ru-en.tsv.gz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://data.statmt.org/wmt17/translation-task/books.lv-en.v1.tgz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://data.statmt.org/wmt17/translation-task/dcep.lv-en.v1.tgz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://data.statmt.org/wmt17/translation-task/leta.v1.tgz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://data.statmt.org/wmt17/translation-task/training-parallel-nc-v12.tgz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://data.statmt.org/wmt18/translation-task/training-parallel-ep-v8.tgz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://data.statmt.org/wmt18/translation-task/training-parallel-nc-v13.tgz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://www.statmt.org/europarl/v9/training/europarl-v9.lt-en.tsv.gz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://www.statmt.org/wmt13/training-parallel-commoncrawl.tgz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://www.statmt.org/wmt13/training-parallel-commoncrawl.tgz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | 
http://www.statmt.org/wmt13/training-parallel-commoncrawl.tgz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://www.statmt.org/wmt13/training-parallel-commoncrawl.tgz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://www.statmt.org/wmt13/training-parallel-europarl-v7.tgz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://www.statmt.org/wmt13/training-parallel-europarl-v7.tgz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://www.statmt.org/wmt13/training-parallel-europarl-v7.tgz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://www.statmt.org/wmt13/training-parallel-nc-v8.tgz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://www.statmt.org/wmt13/training-parallel-un.tgz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://www.statmt.org/wmt13/training-parallel-un.tgz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://www.statmt.org/wmt14/training-parallel-nc-v9.tgz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://www.statmt.org/wmt15/wiki-titles.tgz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | https://s3.amazonaws.com/web-language-models/paracrawl/release1/paracrawl-release1.en-cs.zipporah0-dedup-clean.tgz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | https://s3.amazonaws.com/web-language-models/paracrawl/release1/paracrawl-release1.en-et.zipporah0-dedup-clean.tgz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | https://s3.amazonaws.com/web-language-models/paracrawl/release1/paracrawl-release1.en-ru.zipporah0-dedup-clean.tgz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | https://s3.amazonaws.com/web-language-models/paracrawl/release3/en-lt.bicleaner07.tmx.gz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | https://stuncorpusprod.blob.core.windows.net/corpusfiles/UNv1.0.en-ru.tar.gz.01 | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | https://stuncorpusprod.blob.core.windows.net/corpusfiles/UNv1.0.en-ru.tar.gz.02 | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | https://stuncorpusprod.blob.core.windows.net/corpusfiles/UNv1.0.en-zh.tar.gz.01 | 
相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | https://tilde-model.s3-eu-west-1.amazonaws.com/rapid2016.en-lt.tmx.zip | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | https://dl.fbaipublicfiles.com/fasttext/supervised-models/lid.176.bin -O {LID_MODEL} | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | https://translate.yandex.ru/corpus?lang=en | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://nlp.nju.edu.cn/cwmt-wmt/NEU2017.zip | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://nlp.nju.edu.cn/cwmt-wmt/Datum2017.zip | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://nlp.nju.edu.cn/cwmt-wmt/Datum2015.zip | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://nlp.nju.edu.cn/cwmt-wmt/CASICT2015.zip | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://nlp.nju.edu.cn/cwmt-wmt/CASICT2011.zip | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://nlp.nju.edu.cn/cwmt-wmt/CASIA2015.zip | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | https://translate.yandex.ru/corpus?lang=en | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt19_and_before.py | http://nlp.nju.edu.cn/cwmt-wmt | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | https://stuncorpusprod.blob.core.windows.net/corpusfiles/UNv1.0.en-zh.tar.gz.00 | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | https://stuncorpusprod.blob.core.windows.net/corpusfiles/UNv1.0.en-zh.tar.gz.01 | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | https://stuncorpusprod.blob.core.windows.net/corpusfiles/UNv1.0.en-ru.tar.gz.00 | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | https://stuncorpusprod.blob.core.windows.net/corpusfiles/UNv1.0.en-ru.tar.gz.01 | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | https://stuncorpusprod.blob.core.windows.net/corpusfiles/UNv1.0.en-ru.tar.gz.02 | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://data.statmt.org/wikititles/v2/wikititles-v2.zh-en.tsv.gz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | 
http://data.statmt.org/wikititles/v2/wikititles-v2.ta-en.tsv.gz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://data.statmt.org/wikititles/v2/wikititles-v2.ru-en.tsv.gz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://data.statmt.org/wikititles/v2/wikititles-v2.ps-en.tsv.gz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://data.statmt.org/wikititles/v2/wikititles-v2.ja-en.tsv.gz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://data.statmt.org/wikititles/v2/wikititles-v2.iu-en.tsv.gz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://data.statmt.org/wikititles/v2/wikititles-v2.de-en.tsv.gz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://data.statmt.org/wmt20/translation-task/WikiMatrix/WikiMatrix.v1.en-zh.langid.tsv.gz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://data.statmt.org/wmt20/translation-task/WikiMatrix/WikiMatrix.v1.en-ta.langid.tsv.gz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://data.statmt.org/wmt20/translation-task/WikiMatrix/WikiMatrix.v1.en-ru.langid.tsv.gz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://data.statmt.org/wmt20/translation-task/WikiMatrix/WikiMatrix.v1.en-ja.langid.tsv.gz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://data.statmt.org/wmt20/translation-task/WikiMatrix/WikiMatrix.v1.de-en.langid.tsv.gz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | https://tilde-model.s3-eu-west-1.amazonaws.com/rapid2019.en-pl.tmx.zip | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | https://nlp.stanford.edu/projects/jesc/data/split.tar.gz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://data.statmt.org/pmindia/v1/parallel/pmindia.v1.ta-en.tsv | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://preon.iiit.ac.in/~jerin/resources/datasets/pib-v0.tar | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | https://s3.amazonaws.com/web-language-models/paracrawl/release5.1/en-de.txt.gz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | https://s3.amazonaws.com/web-language-models/paracrawl/release1/paracrawl-release1.en-ru.zipporah0-dedup-clean.tgz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | 
http://www.kecl.ntt.co.jp/icl/lirg/jparacrawl/release/2.0/bitext/en-ja.tar.gz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://data.statmt.org/wmt20/translation-task/ps-km/wmt20-sent.en-ps.xz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://data.statmt.org/wmt20/translation-task/ps-km/wmt20-sent.en-km.xz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | https://nrc-digital-repository.canada.ca/eng/view/dataset/?id=c7e34fa7-7629-43c2-bd6d-19b32bf64f60 | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://data.statmt.org/news-commentary/v15/training/news-commentary-v15.en-zh.tsv.gz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://data.statmt.org/news-commentary/v15/training/news-commentary-v15.en-ru.tsv.gz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://data.statmt.org/news-commentary/v15/training/news-commentary-v15.en-ja.tsv.gz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://data.statmt.org/news-commentary/v15/training/news-commentary-v15.de-en.tsv.gz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://preon.iiit.ac.in/~jerin/resources/datasets/mkb-v0.tar | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://www.phontron.com/kftt/download/kftt-data-1.0.tar.gz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://www.statmt.org/europarl/v10/training/europarl-v10.de-en.tsv.gz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://data.statmt.org/wmt20/translation-task/rapid/RAPID_2019.de-en.xlf.gz | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | https://dl.fbaipublicfiles.com/fasttext/supervised-models/lid.176.bin | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | https://raw.githubusercontent.com/nlpc-uom/English-Tamil-Parallel-Corpus/master/En-Ta%20Corpus/En-Ta%20English.txt | 模型相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://data.statmt.org/wmt20/translation-task/dev.tgz | 模型相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://www.statmt.org/wmt13/training-parallel-commoncrawl.tgz | 模型相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://data.statmt.org/wmt20/translation-task/ps-km/ps-parallel.tgz | 模型相关配置 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/multilingual/data_scripts/download_wmt20.sh | http://data.statmt.org/wmt20/translation-task/ps-km/km-parallel.tgz | 模型相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/roberta/commonsense_qa/download_cqa_data.sh | https://s3.amazonaws.com/commensenseqa/dev_rand_split.jsonl | 参数相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/roberta/commonsense_qa/download_cqa_data.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/dict.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/roberta/commonsense_qa/download_cqa_data.sh | https://s4.amazonaws.com/commensenseqa/train_rand_split.jsonl | 模型参数相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/roberta/commonsense_qa/download_cqa_data.sh | https://s3.amazonaws.com/commensenseqa/test_rand_split_no_answers.jsonl | 模型参数相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/roberta/preprocess_GLUE_tasks.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/vocab.bpe | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/roberta/preprocess_GLUE_tasks.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/encoder.json | 模型参数配置 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/roberta/preprocess_GLUE_tasks.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/dict.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/roberta/preprocess_RACE.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/vocab.bpe | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/roberta/preprocess_RACE.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/encoder.json | 模型参数配置 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/roberta/preprocess_RACE.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/dict.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/speech_recognition/datasets/prepare-librispeech.sh | www.openslr.org/resources/12 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/speech_to_text/prep_covost_data.py | https://dl.fbaipublicfiles.com/covost/ | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/translation/prepare-iwslt14.sh | http://dl.fbaipublicfiles.com/fairseq/data/iwslt14/de-en.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/translation/prepare-wmt14en2de.sh | http://statmt.org/wmt14/test-full.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/translation/prepare-wmt14en2de.sh | http://statmt.org/wmt13/training-parallel-europarl-v7.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/translation/prepare-wmt14en2de.sh | http://statmt.org/wmt13/training-parallel-commoncrawl.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/translation/prepare-wmt14en2de.sh | http://data.statmt.org/wmt17/translation-task/training-parallel-nc-v12.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/translation/prepare-wmt14en2de.sh | http://data.statmt.org/wmt17/translation-task/dev.tgz | 模型相关说明 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/translation/prepare-wmt14en2de.sh | http://statmt.org/wmt14/training-parallel-nc-v9.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/translation/prepare-wmt14en2fr.sh | http://statmt.org/wmt14/test-full.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/translation/prepare-wmt14en2fr.sh | http://statmt.org/wmt14/training-parallel-nc-v9.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/translation/prepare-wmt14en2fr.sh | http://statmt.org/wmt13/training-parallel-un.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/translation/prepare-wmt14en2fr.sh | http://statmt.org/wmt13/training-parallel-europarl-v7.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/translation/prepare-wmt14en2fr.sh | http://statmt.org/wmt13/training-parallel-commoncrawl.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/examples/translation/prepare-wmt14en2fr.sh | http://statmt.org/wmt10/training-giga-fren.tar | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/data/encoders/gpt2_bpe.py | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/vocab.bpe | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/data/encoders/gpt2_bpe.py | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/encoder.json | 模型参数配置 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.xsum.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.mnli.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.cnn.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.base.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/fconv.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt17.v2.en-de.fconv-py.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/fconv.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt17.v2.en-de.fconv-py.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/fconv.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt14.en-de.fconv-py.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/fconv_self_att.py | https://dl.fbaipublicfiles.com/fairseq/models/stories_checkpoint.tar.gz | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/fconv_self_att.py | https://dl.fbaipublicfiles.com/fairseq/models/stories_checkpoint.tar.gz | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/fconv_self_att.py | https://dl.fbaipublicfiles.com/fairseq/data/stories_test.tar.bz2 | 模型相关说明 
| +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.lightconv.tar.gz | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.dynamicconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.dynamicconv.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/iwslt14.de-en.dynamicconv.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt17.zh-en.dynamicconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt20.en-de.joined-dict.dynamicconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt14.en-fr.joined-dict.dynamicconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/iwslt14.de-en.lightconv.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt17.zh-en.lightconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.lightconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.lightconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt14.en-fr.joined-dict.lightconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.base.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.wsc.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.mnli.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-large.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/roberta/model_camembert.py | 
http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-wikipedia-4gb.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-oscar-4gb.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-ccnet-4gb.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-ccnet.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/roberta/model_gottbert.py | https://dl.gottbert.de/fairseq/models/gottbert-base.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/roberta/model_xlmr.py | http://dl.fbaipublicfiles.com/fairseq/models/xlmr/xlmr.xxl.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/roberta/model_xlmr.py | http://dl.fbaipublicfiles.com/fairseq/models/xlmr/xlmr.xl.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/roberta/model_xlmr.py | http://dl.fbaipublicfiles.com/fairseq/models/xlmr.large.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/roberta/model_xlmr.py | http://dl.fbaipublicfiles.com/fairseq/models/xlmr.base.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.ta-en.single.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.iu-en.nh.single.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.iu-en.news.single.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.en-ta.single.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.en-iu.nh.single.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.en-iu.news.single.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/transformer/transformer_legacy.py | 
https://dl.fbaipublicfiles.com/fairseq/models/wmt19.ru-en.ensemble.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.ru-en.single_model.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-ru.ensemble.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-ru.single_model.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-ru.single_model.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-de.joined-dict.ensemble.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-de.joined-dict.single_model.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.de-en.joined-dict.ensemble.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.de-en.joined-dict.single_model.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt18.en-de.ensemble.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt16.en-de.joined-dict.transformer.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt14.en-fr.joined-dict.transformer.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/flores101/pretrained_models/flores101_mm100_615M.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/flores101/pretrained_models/flores101_mm100_615M.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/adaptive_lm_wiki103.v2.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/adaptive_lm_gbw_huge.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt20.ta.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/transformer_lm.py | 
https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt20.iu.nh.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt20.iu.news.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt20.en.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.ru.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.ru.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.ru.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Deltalm_for_PyTorch/fairseq/setup.py | https://download.pytorch.org/whl/cpu/torch-1.7.0%2Bcpu-cp36-cp36m-linux_x86_64.whl | 模型相关说明 | \ No newline at end of file diff --git a/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/public_address_statement.md b/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/public_address_statement.md index b650f327ed4ec7881f4151ee04c8d7086d8ce4bc..64c1487c9c9bbb87d8139b0c613c6851a6e5617e 100644 --- a/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/public_address_statement.md @@ -1,481 +1,91 @@ - -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ---- | ------------ | ------ | ------------------------------------ | -------- | -|开源代码引入|https://github.com/facebookresearch/fairseq|setup.py|https://github.com/pytorch/fairseq|源仓地址| -|开源代码引入|https://github.com/facebookresearch/fairseq|setup.py|https://download.pytorch.org/whl/cpu/torch-1.7.0%2Bcpu-cp36-cp36m-linux_x86_64.whl|依赖包地址| -|开源代码引入|https://github.com/facebookresearch/fairseq|fairseq/dataclass/configs.py|https://fairscale.readthedocs.io/en/latest/api/experimental/nn/slowmo_ddp.html|参数说明| -|开源代码引入|https://github.com/facebookresearch/fairseq|fairseq/models/roberta/model.py|http://dl.fbaipublicfiles.com/fairseq/models/roberta.base.tar.gz|模型链接| -|开源代码引入|https://github.com/facebookresearch/fairseq|fairseq/models/roberta/model.py|http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.tar.gz|模型链接| -|开源代码引入|https://github.com/facebookresearch/fairseq|fairseq/models/roberta/model.py|http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.mnli.tar.gz|模型链接| -|开源代码引入|https://github.com/facebookresearch/fairseq|fairseq/models/roberta/model.py|http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.wsc.tar.gz|模型链接| -|开源代码引入|https://github.com/facebookresearch/fairseq|fairseq/models/speech_to_text/s2t_transformer.py|http://dl.fbaipublicfiles.com/fairseq/s2t|模型链接| -|开源代码引入|https://github.com/facebookresearch/fairseq|fairseq/models/transformer_lm.py|https://dl.fbaipublicfiles.com/fairseq/models/lm/adaptive_lm_gbw_huge.tar.bz2|模型链接| -|开源代码引入|https://github.com/facebookresearch/fairseq|fairseq/models/transformer_lm.py|https://dl.fbaipublicfiles.com/fairseq/models/lm/adaptive_lm_wiki103.v2.tar.bz2|模型链接| -|开源代码引入|https://github.com/facebookresearch/fairseq|fairseq/models/transformer_lm.py|https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.en.tar.bz2|模型链接| 
-|开源代码引入|https://github.com/facebookresearch/fairseq|fairseq/models/transformer_lm.py|https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.de.tar.bz2|模型链接| -|开源代码引入|https://github.com/facebookresearch/fairseq|fairseq/models/transformer_lm.py|https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.ru.tar.bz2|模型链接| -|开源代码引入|https://github.com/facebookresearch/fairseq|fairseq/models/transformer_lm.py|https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt20.en.tar.gz|模型链接| -|开源代码引入|https://github.com/facebookresearch/fairseq|fairseq/models/transformer_lm.py|https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt20.ta.tar.gz|模型链接| -|开源代码引入|https://github.com/facebookresearch/fairseq|fairseq/models/transformer_lm.py|https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt20.iu.news.tar.gz|模型链接| -|开源代码引入|https://github.com/facebookresearch/fairseq|fairseq/models/transformer_lm.py|https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt20.iu.nh.tar.gz|模型链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/.gitmodules|Fairseq_Transformer_wmt18_for_PyTorch/.gitmodules |https://github.com/ngoyal2707/Megatron-LM|Megatron-LM在开源社区中的url链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/CONTRIBUTING.md|Fairseq_Transformer_wmt18_for_PyTorch/.pre-commit-config.yaml |https://github.com/pre-commit/pre-commit-hooks|.pre-commit-config.yaml中的开源url链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/.pre-commit-config.yaml|Fairseq_Transformer_wmt18_for_PyTorch/.pre-commit-config.yaml |https://github.com/ambv/black|.pre-commit-config.yaml中的开源repo链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/.pre-commit-config.yaml|Fairseq_Transformer_wmt18_for_PyTorch/.pre-commit-config.yaml |https://gitlab.com/pycqa/flake8|.pre-commit-config.yaml中的开源repo链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/.pre-commit-config.yaml|Fairseq_Transformer_wmt18_for_PyTorch/.pre-commit-config.yaml |https://github.com/pycqa/isort|.pre-commit-config.yaml中的开源repo链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/setup.py|Fairseq_Transformer_wmt18_for_PyTorch/setup.py |https://download.pytorch.org/whl/cpu/torch-1.7.0%2Bcpu-cp36-cp36m-linux_x86_64.whl|setuptools的torch-cpu开源whl下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/hub_interface.py|Fairseq_Transformer_wmt18_for_PyTorch/setup.py |https://github.com/pytorch/fairseq|setuptools的fairseq开源下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/docs/conf.py|Fairseq_Transformer_wmt18_for_PyTorch/docs/conf.py |https://github.com/pytorch/fairseq/tree/main/docs/|conf.py中的fairseq开源引用链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/docs/conf.py|Fairseq_Transformer_wmt18_for_PyTorch/docs/conf.py |http://docs.scipy.org/doc/numpy|conf.py中的numpy开源引用链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/docs/conf.py|Fairseq_Transformer_wmt18_for_PyTorch/docs/conf.py |https://docs.python.org|conf.py中的python开源引用链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/docs/conf.py|Fairseq_Transformer_wmt18_for_PyTorch/docs/conf.py |https://pytorch.org/docs/master|conf.py中的torch开源引用链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-de-monolingual.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh 
|http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2007.de.shuffled.gz|wmt14数据集在开源社区中的shuffled.gz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-de-monolingual.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh |http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2008.de.shuffled.gz|wmt14数据集在开源社区中的shuffled.gz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-de-monolingual.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh |http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2009.de.shuffled.gz|wmt14数据集在开源社区中的shuffled.gz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-de-monolingual.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh |http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2010.de.shuffled.gz|wmt14数据集在开源社区中的shuffled.gz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-de-monolingual.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh |http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2011.de.shuffled.gz|wmt14数据集在开源社区中的shuffled.gz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-de-monolingual.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh |http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2012.de.shuffled.gz|wmt14数据集在开源社区中的shuffled.gz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-de-monolingual.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh |http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2013.de.shuffled.gz|wmt14数据集在开源社区中的shuffled.gz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-de-monolingual.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh |http://www.statmt.org/wmt15/training-monolingual-news-crawl-v2/news.2014.de.shuffled.v2.gz|wmt15数据集在开源社区中的shuffled.v2.gz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-de-monolingual.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh |http://data.statmt.org/wmt16/translation-task/news.2015.de.shuffled.gz|wmt16数据集在开源社区中的shuffled.gz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-de-monolingual.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh |http://data.statmt.org/wmt17/translation-task/news.2016.de.shuffled.gz|wmt17数据集在开源社区中的shuffled.gz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-de-monolingual.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh |http://data.statmt.org/wmt18/translation-task/news.2017.de.shuffled.deduped.gz|wmt18数据集在开源社区中的shuffled.gz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/backtranslation/prepare-wmt18en2de.sh 
|https://github.com/moses-smt/mosesdecoder.git|mosesdecoder在开源社区中的git链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/backtranslation/prepare-wmt18en2de.sh |https://github.com/rsennrich/subword-nmt.git|subword-nmt在开源社区中的git链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/backtranslation/prepare-wmt18en2de.sh |http://statmt.org/wmt13/training-parallel-europarl-v7.tgz|wmt13数据集training-parallel-europarl-v7在开源社区的tgz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/backtranslation/prepare-wmt18en2de.sh |http://statmt.org/wmt13/training-parallel-commoncrawl.tgz|wmt13数据集training-parallel-commoncrawl在开源社区的tgz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/backtranslation/prepare-wmt18en2de.sh |http://data.statmt.org/wmt18/translation-task/training-parallel-nc-v13.tgz|wmt18数据集training-parallel-nc-v13在开源社区的tgz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/backtranslation/prepare-wmt18en2de.sh |http://data.statmt.org/wmt18/translation-task/rapid2016.tgz|wmt18数据集rapid2016在开源社区的tgz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2de.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/backtranslation/prepare-wmt18en2de.sh |http://data.statmt.org/wmt17/translation-task/dev.tgz|wmt13数据集dev在开源社区的tgz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/backtranslation/prepare-wmt18en2de.sh |http://statmt.org/wmt14/test-full.tgz|wmt13数据集test-full在开源社区的tgz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/backtranslation/sacrebleu.sh |https://github.com/rsennrich/subword-nmt.git|subword-nmt在开源社区上的git链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/backtranslation/tokenized_bleu.sh |https://github.com/rsennrich/subword-nmt.git|subword-nmt在开源社区上的git链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-iwslt17-multilingual.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/byte_level_bpe/get_data.sh |https://wit3.fbk.eu/archive/2017-01-trnted/texts/fr/en/fr-en.tgz|wit3.fbk.eu的fr-en选项在开源社区上的tgz链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/unsupervised_quality_estimation/README.md|Fairseq_Transformer_wmt18_for_PyTorch/examples/criss/download_and_preprocess_flores_test.sh |https://github.com/facebookresearch/flores|facebookresearch_flores在开源社区上的git链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_flores_data.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/criss/download_and_preprocess_flores_test.sh 
|https://github.com/facebookresearch/flores/raw/master/data/wikipedia_en_ne_si_test_sets.tgz|flores_wikipedia_en_ne_si_test_sets在开源社区的tgz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/unsupervised_quality_estimation/README.md|Fairseq_Transformer_wmt18_for_PyTorch/examples/criss/download_and_preprocess_tatoeba.sh |https://github.com/facebookresearch/flores|flores在开源社区上的git链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/m2m_100/README.md|Fairseq_Transformer_wmt18_for_PyTorch/examples/criss/download_and_preprocess_tatoeba.sh |https://github.com/facebookresearch/LASER|LASER在开源社区上的git链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/unsupervised_quality_estimation/README.md|Fairseq_Transformer_wmt18_for_PyTorch/examples/criss/unsupervised_mt/eval.sh |https://github.com/moses-smt/mosesdecoder|mosesdecoder在开源社区上的git链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/textless_nlp/pgslm/README.md|Fairseq_Transformer_wmt18_for_PyTorch/examples/hubert/tests/test_feature_and_unit.sh |https://dl.fbaipublicfiles.com/hubert/hubert_base_ls960.pt|hubert权重hubert_base_ls960在开源社区的pt下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/hubert/tests/test_feature_and_unit.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/hubert/tests/test_feature_and_unit.sh |https://dl.fbaipublicfiles.com/hubert/hubert_large_ll60k.pt|hubert权重hubert_large_ll60k在开源社区的pt下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/hubert/tests/test_feature_and_unit.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/hubert/tests/test_feature_and_unit.sh |https://dl.fbaipublicfiles.com/hubert/hubert_xtralarge_ll60k.pt|hubert权重hubert_xtralarge_ll60k在开源社区的pt下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/hubert/tests/test_feature_and_unit.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/hubert/tests/test_feature_and_unit.sh |https://dl.fbaipublicfiles.com/hubert/hubert_base_ls960_L9_km500.bin|hubert权重hubert_base_ls960_L9_km500在开源社区的bin下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/hubert/tests/test_finetuned_asr.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/hubert/tests/test_finetuned_asr.sh |https://dl.fbaipublicfiles.com/hubert/hubert_large_ll60k_finetune_ls960.pt|hubert权重hubert_large_ll60k_finetune_ls960在开源社区的pt下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/hubert/tests/test_finetuned_asr.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/hubert/tests/test_finetuned_asr.sh |https://dl.fbaipublicfiles.com/hubert/hubert_xtralarge_ll60k_finetune_ls960.pt|hubert权重hubert_xtralarge_ll60k_finetune_ls960在开源社区的pt下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh |https://github.com/moses-smt/mosesdecoder.git|mosesdecoder在开源社区上的git链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh |http://statmt.org/wmt13/training-parallel-europarl-v7.tgz|wmt13数据集training-parallel-europarl-v7在开源社区的tgz下载链接| -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh |http://statmt.org/wmt13/training-parallel-commoncrawl.tgz|wmt13数据集training-parallel-commoncrawl在开源社区的tgz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh |http://data.statmt.org/wmt18/translation-task/training-parallel-nc-v13.tgz|wmt18数据集training-parallel-nc-v13在开源社区的tgz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh |http://data.statmt.org/wmt18/translation-task/rapid2016.tgz|wmt18数据集rapid2016在开源社区的tgz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2de.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh |http://data.statmt.org/wmt17/translation-task/dev.tgz|wmt13数据集dev在开源社区的tgz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh |http://statmt.org/wmt14/test-full.tgz|wmt13数据集test-full在开源社区的tgz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh |https://github.com/glample/fastBPE.git|fastBPE在开源社区上的git下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/language_model/prepare-wikitext-103.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/language_model/prepare-wikitext-103.sh |https://s3.amazonaws.com/research.metamind.io/wikitext/wikitext-103-v1.zip|wikitext-103-v1在开源社区的zip下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/m2m_100/install_dependecies.sh |https://github.com/moses-smt/mosesdecoder.git|mosesdecoder在开源社区上的git链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/m2m_100/install_dependecies.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/m2m_100/install_dependecies.sh |https://github.com/rsennrich/wmt16-scripts.git|wmt16-scripts在开源社区上的git链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/m2m_100/install_dependecies.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/m2m_100/install_dependecies.sh |https://github.com/neubig/kytea.git|kytea在开源社区上的git链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/m2m_100/install_dependecies.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/m2m_100/install_dependecies.sh |https://bitbucket.org/eunjeon/mecab-ko/downloads/mecab-0.996-ko-0.9.2.tar.gz|mecab-0.996-ko-0.9.2数据集在开源社区的tar.gz下载链接| -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/main/examples/m2m_100/install_dependecies.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/m2m_100/install_dependecies.sh |https://bitbucket.org/eunjeon/mecab-ko-dic/downloads/mecab-ko-dic-2.1.1-20180720.tar.gz|mecab-ko-dic-2.1.1-20180720数据集在开源社区的tar.gz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/m2m_100/install_dependecies.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/m2m_100/install_dependecies.sh |https://github.com/anoopkunchukuttan/indic_nlp_resources.git|indic_nlp_resources在开源社区上的git链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/m2m_100/README.md|Fairseq_Transformer_wmt18_for_PyTorch/examples/m2m_100/install_dependecies.sh |http://lotus.kuee.kyoto-u.ac.jp/WAT/my-en-data/wat2020.my-en.zip|wat2020.my-en数据集在开源社区的zip下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/m2m_100/tokenizers/tokenizer_ar.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/m2m_100/tokenizers/tokenizer_ar.sh |http://alt.qcri.org/tools/arabic-normalizer/|Arabic tools安装提示引用的开源社区url链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/MMPT/setup.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/MMPT/setup.py |https://github.com/pytorch/fairseq/examples/MMPT|setuptools的fairseq开源下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_af_xh.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_af_xh.sh |https://object.pouta.csc.fi/OPUS-Tatoeba/v20190709/tmx/en-xh.tmx.gz|Tatoeba_en-xh.tmx在开源社区上的gz链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_af_xh.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_af_xh.sh |https://object.pouta.csc.fi/OPUS-wikimedia/v20190628/tmx/en-xh.tmx.gz|wikimedia_en-xh.tmx在开源社区上的gz链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_af_xh.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_af_xh.sh |https://object.pouta.csc.fi/OPUS-memat/v1/tmx/en-xh.tmx.gz|memat_en-xh.tmx在开源社区上的gz链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_af_xh.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_af_xh.sh |https://object.pouta.csc.fi/OPUS-bible-uedin/v1/tmx/en-xh.tmx.gz|uedin_en-xh.tmx在开源社区上的gz链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_af_xh.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_af_xh.sh |https://object.pouta.csc.fi/OPUS-GNOME/v1/tmx/en-xh.tmx.gz|GNOME_en-xh.tmx在开源社区上的gz链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_af_xh.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_af_xh.sh |https://object.pouta.csc.fi/OPUS-XhosaNavy/v1/tmx/en-xh.tmx.gz|XhosaNavy_en-xh.tmx在开源社区上的gz链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_af_xh.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_af_xh.sh |https://object.pouta.csc.fi/OPUS-KDE4/v2/tmx/en-xh.tmx.gz|KDE4_en-xh.tmx在开源社区上的gz链接配置| -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_af_xh.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_af_xh.sh |https://object.pouta.csc.fi/OPUS-Ubuntu/v14.10/tmx/en-xh.tmx.gz|Ubuntu_en-xh.tmx在开源社区上的gz链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_af_xh.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_af_xh.sh |https://object.pouta.csc.fi/OPUS-Tatoeba/v20190709/tmx/af-en.tmx.gz|Tatoeba_af-en.tmx在开源社区上的gz链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_af_xh.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_af_xh.sh |https://object.pouta.csc.fi/OPUS-bible-uedin/v1/tmx/af-en.tmx.gz|uedin_af-en.tmx在开源社区上的gz链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_af_xh.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_af_xh.sh |https://object.pouta.csc.fi/OPUS-GNOME/v1/tmx/af-en.tmx.gz|GNOME_af-en.tmx在开源社区上的gz链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_af_xh.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_af_xh.sh |https://object.pouta.csc.fi/OPUS-QED/v2.0a/tmx/af-en.tmx.gz|QED_af-en.tmx在开源社区上的gz链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_af_xh.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_af_xh.sh |https://object.pouta.csc.fi/OPUS-KDE4/v2/tmx/af-en.tmx.gz|KDE4_af-en.tmx在开源社区上的gz链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_af_xh.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_af_xh.sh |https://object.pouta.csc.fi/OPUS-OpenSubtitles/v2018/tmx/af-en.tmx.gz|OpenSubtitles_af-en.tmx在开源社区上的gz链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_af_xh.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_af_xh.sh |https://object.pouta.csc.fi/OPUS-SPC/v1/tmx/af-en.tmx.gz|SPC_af-en.tmx在开源社区上的gz链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_af_xh.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_af_xh.sh |https://object.pouta.csc.fi/OPUS-Ubuntu/v14.10/tmx/af-en.tmx.gz|Ubuntu_af-en.tmx在开源社区上的gz链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_flores_data.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_flores_data.sh |https://object.pouta.csc.fi/OPUS-GNOME/v1/moses/en-si.txt.zip|OPUS-GNOME_en-si.txt在开源社区上的zip链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_flores_data.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_flores_data.sh |https://object.pouta.csc.fi/OPUS-Ubuntu/v14.10/moses/en-si.txt.zip|OPUS-Ubuntu_en-si.txt在开源社区上的zip链接配置| -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_flores_data.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_flores_data.sh |https://object.pouta.csc.fi/OPUS-KDE4/v2/moses/en-si.txt.zip|OPUS-KDE4_en-si.txt在开源社区上的zip链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_flores_data.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_flores_data.sh |https://object.pouta.csc.fi/OPUS-OpenSubtitles/v2018/moses/en-si.txt.zip|OPUS-OpenSubtitles_en-si.txt在开源社区上的zip链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_flores_data.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_flores_data.sh |https://object.pouta.csc.fi/OPUS-GNOME/v1/moses/en-ne.txt.zip|OPUS-GNOME_en-ne.txt在开源社区上的zip链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_flores_data.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_flores_data.sh |https://object.pouta.csc.fi/OPUS-Ubuntu/v14.10/moses/en-ne.txt.zip|OPUS-Ubuntu_en-ne.txt在开源社区上的zip链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_flores_data.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_flores_data.sh |https://object.pouta.csc.fi/OPUS-KDE4/v2/moses/en-ne.txt.zip|OPUS-KDE4_en-ne.txt在开源社区上的zip链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_flores_data.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_flores_data.sh |http://www.casmacat.eu/corpus/global-voices/globalvoices.ne-en.xliff.gz|globalvoices.ne-en.xliff在开源社区上的gz链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_flores_data.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_flores_data.sh |https://github.com/christos-c/bible-corpus-tools.git|bible-corpus-tools在开源社区上的git链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_flores_data.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_flores_data.sh |https://github.com/christos-c/bible-corpus/archive/v1.2.1.tar.gz|bible-corpus在开源社区上的tar.gz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_flores_data.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_flores_data.sh |http://www.cle.org.pk/Downloads/ling_resources/parallelcorpus/NepaliTaggedCorpus.zip|ling_resources的NepaliTaggedCorpus在开源社区上的zip链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_flores_data.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_flores_data.sh |https://dl.fbaipublicfiles.com/fairseq/data/nepali-penn-treebank.en.patch|nepali-penn-treebank.en在开源社区上的patch链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_flores_data.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_flores_data.sh 
|https://dl.fbaipublicfiles.com/fairseq/data/nepali-penn-treebank.ne.patch|nepali-penn-treebank.ne在开源社区上的patch链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_flores_data.sh |https://github.com/moses-smt/mosesdecoder.git|mosesdecoder在开源社区上的git链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_flores_data.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_flores_data.sh |http://www.seas.upenn.edu/~nlp/resources/TACL-data-release/dictionaries.tar.gz|TACL-data-release的dictionaries在开源社区上的tar.gz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_flores_data.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_flores_data.sh |https://github.com/facebookresearch/flores/raw/master/data/wikipedia_en_ne_si_test_sets.tgz|wikipedia_en_ne_si_test_sets在开源社区上的tgz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_iitb.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_iitb.sh |http://www.cfilt.iitb.ac.in/~moses/iitb_en_hi_parallel/iitb_corpus_download/parallel.tgz|iitb_corpus在开源社区上的tgz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_iitb.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_iitb.sh |http://www.cfilt.iitb.ac.in/~moses/iitb_en_hi_parallel/iitb_corpus_download/dev_test.tgz|iitb_corpus在开源社区上的tgz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_iwslt_and_extract.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_iwslt_and_extract.sh |https://wit3.fbk.eu/archive|iwslt在开源社区上的tgz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_lotus.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_lotus.sh |http://lotus.kuee.kyoto-u.ac.jp/WAT/indic-multilingual/indic_languages_corpus.tar.gz|indic_languages_corpus在开源社区上的tar.gz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_ted_and_extract.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_ted_and_extract.py |http://phontron.com/data/ted_talks.tar.gz|ted_talks在开源社区上的tar.gz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wat19_my.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wat19_my.sh |http://lotus.kuee.kyoto-u.ac.jp/WAT/my-en-data|my-en-data在开源社区上的下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt20.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |https://dl.fbaipublicfiles.com/fasttext/supervised-models/lid.176.bin|lid.176在开源社区上的bin下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py 
|http://ufallab.ms.mff.cuni.cz/~bojar/czeng16-data/data-plaintext-format|data-plaintext-format在开源社区上的tar下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |http://ufal.mff.cuni.cz/czeng/download.php?f=convert_czeng16_to_17.pl.zip|convert_czeng16_to_17.pl在开源社区上的zip下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |http://ufal.mff.cuni.cz/czeng/download.php?f=convert_czeng16_to_17.pl.zip|convert_czeng16_to_17.pl在开源社区上的zip下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |http://www.statmt.org/wmt13/training-parallel-europarl-v7.tgz|wmt13_training-parallel-europarl-v7在开源社区上的tgz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt20.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |http://www.statmt.org/wmt13/training-parallel-commoncrawl.tgz|wmt13_training-parallel-commoncrawl在开源社区上的tgz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |http://www.statmt.org/wmt13/training-parallel-un.tgz|wmt13_training-parallel-un在开源社区上的tgz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |http://www.statmt.org/wmt13/training-parallel-nc-v8.tgz|wmt13_training-parallel-nc-v8在开源社区上的tgz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |http://www.statmt.org/wmt13/dev.tgz |wmt13_dev在开源社区上的tgz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |http://www.statmt.org/wmt13/test.tgz |wmt13_test在开源社区上的tgz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |http://www.statmt.org/wmt13/training-parallel-europarl-v7.tgz|wmt13_training-parallel-europarl-v7在开源社区上的tgz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt20.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |http://www.statmt.org/wmt13/training-parallel-commoncrawl.tgz|wmt13_training-parallel-commoncrawl在开源社区上的tgz下载链接配置| -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |http://www.statmt.org/wmt13/training-parallel-un.tgz|wmt13_training-parallel-un在开源社区上的tgz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |http://www.statmt.org/wmt14/training-parallel-nc-v9.tgz|wmt14_training-parallel-nc-v9在开源社区上的tgz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |http://www.statmt.org/wmt10/training-giga-fren.tar |wmt10_training-giga-fren在开源社区上的tgz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |http://www.statmt.org/wmt14/dev.tgz |wmt14_training-parallel-europarl-v7在开源社区上的tgz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |http://www.statmt.org/wmt14/test-full.tgz |wmt14_training-parallel-europarl-v7在开源社区上的tgz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |http://data.statmt.org/wmt16/translation-task/training-parallel-ep-v8.tgz |wmt16_training-parallel-europarl-v7在开源社区上的tgz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |http://opus.nlpl.eu/download.php?f=SETIMES/v2/tmx/en-ro.tmx.gz |tmx_en-ro在开源社区上的tgz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |http://data.statmt.org/wmt16/translation-task/dev-romanian-updated.tgz |wmt16_dev在开源社区上的tgz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |http://data.statmt.org/wmt16/translation-task/test.tgz |wmt16_test在开源社区上的tgz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |http://nlp.nju.edu.cn/cwmt-wmt/CASIA2015.zip |CASIA2015在开源社区上的zip下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |http://nlp.nju.edu.cn/cwmt-wmt/CASICT2011.zip 
|CASICT2011在开源社区上的zip下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |http://nlp.nju.edu.cn/cwmt-wmt/CASICT2015.zip |CASICT2015在开源社区上的zip下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |http://nlp.nju.edu.cn/cwmt-wmt/Datum2015.zip |Datum2015在开源社区上的zip下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |http://nlp.nju.edu.cn/cwmt-wmt/Datum2017.zip |Datum2017在开源社区上的zip下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |http://nlp.nju.edu.cn/cwmt-wmt/NEU2017.zip |NEU2017在开源社区上的zip下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |http://data.statmt.org/wmt17/translation-task/training-parallel-ep-v8.tgz |wmt17_training-parallel-ep-v8在开源社区上的tgz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2de.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |http://data.statmt.org/wmt17/translation-task/training-parallel-nc-v12.tgz|training-parallel-nc-v12在开源社区上的tgz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |http://www.statmt.org/wmt15/wiki-titles.tgz|wmt15_wiki-titles在开源社区上的tgz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |http://opus.nlpl.eu/download.php?f=SETIMES/v2/tmx/en-tr.tmx.gz |tmx_en-tr在开源社区上的tgz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |http://data.statmt.org/wmt17/translation-task/rapid2016.tgz |wmt17_rapid2016在开源社区上的tgz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |http://data.statmt.org/wmt17/translation-task/leta.v1.tgz|wmt17_translation-task_leta.v1在开源社区上的tgz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py 
|http://data.statmt.org/wmt17/translation-task/dcep.lv-en.v1.tgz|wmt17_translation-dcep.lv-en.v1在开源社区上的tgz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |http://data.statmt.org/wmt17/translation-task/books.lv-en.v1.tgz|wmt17_translation-books.lv-en.v1在开源社区上的tgz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt20.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |https://stuncorpusprod.blob.core.windows.net/corpusfiles/UNv1.0.en-zh.tar.gz.00|corpusfiles_UNv1.0.en-zh在开源社区上的tar.gz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |https://stuncorpusprod.blob.core.windows.net/corpusfiles/UNv1.0.en-zh.tar.gz.01 |corpusfiles_UNv1.0.en-zh在开源社区上的tgz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |http://nlp.nju.edu.cn/cwmt-wmt/CASIA2015.zip |CASIA2015在开源社区上的zip下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |http://nlp.nju.edu.cn/cwmt-wmt/CASICT2011.zip |CASICT2011在开源社区上的zip下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |http://nlp.nju.edu.cn/cwmt-wmt/CASICT2015.zip |CASICT2015在开源社区上的zip下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |http://nlp.nju.edu.cn/cwmt-wmt/Datum2015.zip |Datum2015在开源社区上的zip下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |http://nlp.nju.edu.cn/cwmt-wmt/Datum2017.zip |Datum2017在开源社区上的zip下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |http://nlp.nju.edu.cn/cwmt-wmt/NEU2017.zip |NEU2017在开源社区上的zip下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |http://data.statmt.org/wmt17/translation-task/dev.tgz |wmt17_dev在开源社区上的tgz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py 
|http://data.statmt.org/wmt17/translation-task/test-update-1.tgz |wmt17_test_zh_en在开源社区上的tgz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |http://data.statmt.org/wmt17/translation-task/test.tgz |wmt17_test_others在开源社区上的tgz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |http://www.statmt.org/wmt13/training-parallel-europarl-v7.tgz|wmt13_training-parallel-europarl-v7在开源社区上的tgz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |http://data.statmt.org/wmt18/translation-task/training-parallel-ep-v8.tgz|wmt18_translation-task_training-parallel-ep-v8在开源社区上的tgz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |https://s3.amazonaws.com/web-language-models/paracrawl/release1/paracrawl-release1.en-cs.zipporah0-dedup-clean.tgz|paracrawl-release1.en-cs.zipporah0-dedup-clean在开源社区上的tgz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |https://s3.amazonaws.com/web-language-models/paracrawl/release1/paracrawl-release1.en-et.zipporah0-dedup-clean.tgz|paracrawl-release1.en-cs.zipporah0-dedup-clean在开源社区上的tgz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt20.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |http://www.statmt.org/wmt13/training-parallel-commoncrawl.tgz|wmt13_training-parallel-commoncrawl在开源社区上的tgz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |http://data.statmt.org/wmt18/translation-task/training-parallel-nc-v13.tgz|wmt18_training-parallel-nc-v13在开源社区上的tgz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |http://data.statmt.org/wmt18/translation-task/rapid2016.tgz |wmt18_rapid2016在开源社区上的tgz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |http://data.statmt.org/wmt18/translation-task/dev.tgz |wmt18_dev在开源社区上的tgz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py 
|http://data.statmt.org/wmt18/translation-task/test.tgz |wmt18_test在开源社区上的tgz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |https://translate.yandex.ru/corpus?lang=en|wmt18_1mcorpus在开源社区上的zip下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |http://www.statmt.org/europarl/v9/training/europarl-v9.lt-en.tsv.gz|europarl-v9.lt-en在开源社区上的tsv.gz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |https://s3.amazonaws.com/web-language-models/paracrawl/release3/en-lt.bicleaner07.tmx.gz|en-lt.bicleaner07在开源社区上的tmx.gz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt20.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |https://s3.amazonaws.com/web-language-models/paracrawl/release1/paracrawl-release1.en-ru.zipporah0-dedup-clean.tgz|paracrawl-release1.en-ru.zipporah0-dedup-clean在开源社区上的tgz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt20.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |http://www.statmt.org/wmt13/training-parallel-commoncrawl.tgz|wmt13_training-parallel-commoncrawl在开源社区上的tgz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |http://data.statmt.org/news-commentary/v14/training/news-commentary-v14-wmt19.en-kk.tsv.gz|news-commentary-v14-wmt19.en-kk在开源社区上的tsv.gz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |http://data.statmt.org/news-commentary/v14/training/news-commentary-v14.en-ru.tsv.gz|news-commentary-v14-wmt19.en-kk在开源社区上的tsv.gz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |http://data.statmt.org/wikititles/v1/wikititles-v1.kk-en.tsv.gz|wikititles-v1.kk-en在开源社区上的tsv.gz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |http://data.statmt.org/wikititles/v1/wikititles-v1.ru-en.tsv.gz|wikititles-v1.ru-en在开源社区上的tsv.gz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py 
|http://data.statmt.org/wikititles/v1/wikititles-v1.kk-en.tsv.gz|wikititles-v1.kk-en在开源社区上的tsv.gz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |http://data.statmt.org/wikititles/v1/wikititles-v1.lt-en.tsv.gz|wikititles-v1.lt-en在开源社区上的tsv.gz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |http://data.statmt.org/wikititles/v1/wikititles-v1.gu-en.tsv.gz|wikititles-v1.gu-en在开源社区上的tsv.gz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt20.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |https://stuncorpusprod.blob.core.windows.net/corpusfiles/UNv1.0.en-ru.tar.gz.00|UNv1.0.en-ru在开源社区上的tar.gz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt20.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |https://stuncorpusprod.blob.core.windows.net/corpusfiles/UNv1.0.en-ru.tar.gz.01|UNv1.0.en-ru在开源社区上的tar.gz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt20.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |https://stuncorpusprod.blob.core.windows.net/corpusfiles/UNv1.0.en-ru.tar.gz.02|UNv1.0.en-ru在开源社区上的tar.gz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |https://tilde-model.s3-eu-west-1.amazonaws.com/rapid2016.en-lt.tmx.zip|rapid2016.en-lt在开源社区上的tmx.zip下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |https://translate.yandex.ru/corpus?lang=en |wmt19_1mcorpus在开源社区上的zip下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |http://data.statmt.org/wmt19/translation-task/dev.tgz |wmt19_dev在开源社区上的tgz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt19_and_before.py |http://data.statmt.org/wmt19/translation-task/test.tgz |wmt19_test在开源社区上的tgz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt20.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt20.sh |https://dl.fbaipublicfiles.com/fasttext/supervised-models/lid.176.bin|lid.176在开源社区上的bin下载链接配置| -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt20.sh |https://github.com/moses-smt/mosesdecoder.git|mosesdecoder在开源社区上的git下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt20.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt20.sh |https://wit3.fbk.eu/archive/2017-01-trnted//texts/en/ja/en-ja.tgz|wit3.fbk.eu_en-ja在开源社区上的tgz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt20.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt20.sh |http://www.kecl.ntt.co.jp/icl/lirg/jparacrawl/release/2.0/bitext/en-ja.tar.gz |bitext_en-ja在开源社区上的tar.gz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt20.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt20.sh |http://data.statmt.org/news-commentary/v15/training/news-commentary-v15.en-ja.tsv.gz |news-commentary-v15.en-ja在开源社区上的tsv.gz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt20.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt20.sh |http://data.statmt.org/wikititles/v2/wikititles-v2.ja-en.tsv.gz |wikititles-v2.ja-en在开源社区上的tsv.gz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt20.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt20.sh |http://data.statmt.org/wmt20/translation-task/WikiMatrix/WikiMatrix.v1.en-ja.langid.tsv.gz |WikiMatrix.v1.en-ja.langid在开源社区上的tsv.gz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt20.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt20.sh |https://nlp.stanford.edu/projects/jesc/data/split.tar.gz |jesc_data_split在开源社区上的tar.gz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt20.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt20.sh |http://www.phontron.com/kftt/download/kftt-data-1.0.tar.gz |kftt-data-1.0在开源社区上的tar.gz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt20.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt20.sh |http://data.statmt.org/wikititles/v2/wikititles-v2.ta-en.tsv.gz |wikititles-v2.ta-en在开源社区上的tsv.gz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt20.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt20.sh |http://data.statmt.org/wmt20/translation-task/WikiMatrix/WikiMatrix.v1.en-ta.langid.tsv.gz |WikiMatrix.v1.en-ta.langid在开源社区上的tsv.gz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt20.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt20.sh |http://data.statmt.org/pmindia/v1/parallel/pmindia.v1.ta-en.tsv |pmindia.v1.ta-en在开源社区上的tsv.gz下载链接配置| -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt20.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt20.sh |https://object.pouta.csc.fi/OPUS-Tanzil/v1/moses/en-ta.txt.zip |Tanzil.en-ta在开源社区上的zip下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt20.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt20.sh |http://preon.iiit.ac.in/~jerin/resources/datasets/pib-v0.tar |pib-v0在开源社区上的tar下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt20.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt20.sh |http://preon.iiit.ac.in/~jerin/resources/datasets/mkb-v0.tar |mkb-v0在开源社区上的tar下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt20.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt20.sh |http://ufal.mff.cuni.cz/~ramasamy/parallel/data/v2/en-ta-parallel-v2.tar.gz |en-ta-parallel-v2在开源社区上的tar.gz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt20.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt20.sh |https://raw.githubusercontent.com/nlpc-uom/English-Tamil-Parallel-Corpus/master/En-Ta%20Corpus/En-Ta%20English.txt|English-Tamil-Parallel-Corpus在开源社区上的txt下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt20.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt20.sh |https://github.com/nlpc-uom/English-Tamil-Parallel-Corpus/raw/master/En-Ta%20Corpus/En-Ta%20Tamil.txt|English-Tamil-Parallel-Corpus在开源社区上的txt下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt20.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt20.sh |https://nrc-digital-repository.canada.ca/eng/view/dataset/?id=c7e34fa7-7629-43c2-bd6d-19b32bf64f60|Nunavut-Hansard-Inuktitut-English-Parallel-Corpus-3.0.1在开源社区上的tgz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt20.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt20.sh |http://data.statmt.org/wikititles/v2/wikititles-v2.iu-en.tsv.gz |wikititles-v2.iu-en在开源社区上的tsv.gz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt20.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt20.sh |http://data.statmt.org/wmt20/translation-task/ps-km/wmt20-sent.en-km.xz |wmt20-sent.en-km在开源社区上的xz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt20.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt20.sh |http://data.statmt.org/wmt20/translation-task/ps-km/km-parallel.tgz|wmt20_km-parallel在开源社区上的tgz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt20.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt20.sh 
|http://data.statmt.org/wmt20/translation-task/ps-km/wmt20-sent.en-ps.xz |wmt20-sent.en-ps在开源社区上的xz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt20.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt20.sh |http://data.statmt.org/wikititles/v2/wikititles-v2.ps-en.tsv.gz |wikititles-v2.ps-en在开源社区上的tsv.gz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt20.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt20.sh |http://data.statmt.org/wmt20/translation-task/ps-km/ps-parallel.tgz|wmt20_ps-parallel在开源社区上的tgz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt20.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt20.sh |http://www.statmt.org/wmt13/training-parallel-commoncrawl.tgz|wmt13_training-parallel-commoncrawl在开源社区上的tgz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt20.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt20.sh |http://www.statmt.org/europarl/v10/training/europarl-v10.de-en.tsv.gz |europarl-v10.de-en在开源社区上的tsv.gz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt20.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt20.sh |https://s3.amazonaws.com/web-language-models/paracrawl/release5.1/en-de.txt.gz |paracrawl_en-de在开源社区上的txt.gz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt20.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt20.sh |http://data.statmt.org/news-commentary/v15/training/news-commentary-v15.de-en.tsv.gz |news-commentary-v15.de-en在开源社区上的tsv.gz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt20.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt20.sh |http://data.statmt.org/wikititles/v2/wikititles-v2.de-en.tsv.gz |wikititles-v2.de-en在开源社区上的tsv.gz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt20.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt20.sh |http://data.statmt.org/wmt20/translation-task/rapid/RAPID_2019.de-en.xlf.gz |RAPID_2019.de-en在开源社区上的xlf.gz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt20.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt20.sh |http://data.statmt.org/wmt20/translation-task/WikiMatrix/WikiMatrix.v1.de-en.langid.tsv.gz |WikiMatrix.v1.de-en.langid在开源社区上的tsv.gz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt20.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt20.sh |https://github.com/amake/TMX2Corpus|tmx2corpus在开源社区上的git下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt20.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt20.sh 
|https://tilde-model.s3-eu-west-1.amazonaws.com/rapid2019.en-pl.tmx.zip|rapid2019.en-pl在开源社区上的tmx.zip下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt20.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt20.sh |https://s3.amazonaws.com/web-language-models/paracrawl/release1/paracrawl-release1.en-ru.zipporah0-dedup-clean.tgz|paracrawl-release1.en-ru.zipporah0-dedup-clean在开源社区上的tgz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt20.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt20.sh |http://data.statmt.org/news-commentary/v15/training/news-commentary-v15.en-ru.tsv.gz|news-commentary-v15.en-ru在开源社区上的tsv.gz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt20.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt20.sh |http://data.statmt.org/wikititles/v2/wikititles-v2.ru-en.tsv.gz|wikititles-v2.ru-en在开源社区上的tsv.gz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt20.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt20.sh |https://stuncorpusprod.blob.core.windows.net/corpusfiles/UNv1.0.en-ru.tar.gz.00|UNv1.0.en-ru在开源社区上的tar.gz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt20.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt20.sh |http://data.statmt.org/wmt20/translation-task/WikiMatrix/WikiMatrix.v1.en-ru.langid.tsv.gz|WikiMatrix.v1.en-ru.langid在开源社区上的tsv.gz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt20.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt20.sh |http://data.statmt.org/news-commentary/v15/training/news-commentary-v15.en-zh.tsv.gz|news-commentary-v15.en-zh在开源社区上的tsv.gz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt20.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt20.sh |http://data.statmt.org/wikititles/v2/wikititles-v2.zh-en.tsv.gz|wikititles-v2.zh-en在开源社区上的tsv.gz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt20.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt20.sh |https://stuncorpusprod.blob.core.windows.net/corpusfiles/UNv1.0.en-zh.tar.gz.00|UNv1.0.en-zh在开源社区上的tar.gz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt20.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt20.sh |http://data.statmt.org/wmt20/translation-task/WikiMatrix/WikiMatrix.v1.en-zh.langid.tsv.gz|WikiMatrix.v1.en-zh.langid在开源社区上的tsv.gz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt20.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/multilingual/data_scripts/download_wmt20.sh |http://data.statmt.org/wmt20/translation-task/dev.tgz|wmt20数据集开源社区tgz下载链接配置| -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/main/fairseq/data/encoders/gpt2_bpe.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/roberta/preprocess_GLUE_tasks.sh |https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/encoder.json|gpt2_bpe默认encoder.json开源社区链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/data/encoders/gpt2_bpe.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/roberta/preprocess_GLUE_tasks.sh |https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/vocab.bpe|gpt2_bpe默认vocab.bpe开源社区链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/womens_bios/README.md|Fairseq_Transformer_wmt18_for_PyTorch/examples/roberta/preprocess_GLUE_tasks.sh |https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/dict.txt|gpt2_bpe默认dict.txt开源社区链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/data/encoders/gpt2_bpe.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/roberta/preprocess_RACE.sh |https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/encoder.json|gpt2_bpe默认encoder.json开源社区链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/data/encoders/gpt2_bpe.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/roberta/preprocess_RACE.sh |https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/vocab.bpe|gpt2_bpe默认vocab.bpe开源社区链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/womens_bios/README.md|Fairseq_Transformer_wmt18_for_PyTorch/examples/roberta/preprocess_RACE.sh |https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/dict.txt|gpt2_bpe默认dict.txt开源社区链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/roberta/commonsense_qa/download_cqa_data.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/roberta/commonsense_qa/download_cqa_data.sh |https://s3.amazonaws.com/commensenseqa/train_rand_split.jsonl|commensenseqa_train_rand_split.jsonl开源社区链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/roberta/commonsense_qa/download_cqa_data.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/roberta/commonsense_qa/download_cqa_data.sh |https://s3.amazonaws.com/commensenseqa/dev_rand_split.jsonl|commensenseqa_dev_rand_split.jsonl开源社区链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/roberta/commonsense_qa/download_cqa_data.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/roberta/commonsense_qa/download_cqa_data.sh |https://s3.amazonaws.com/commensenseqa/test_rand_split_no_answers.jsonl|commensenseqa_test_rand_split_no_answers.jsonl开源社区链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/womens_bios/README.md|Fairseq_Transformer_wmt18_for_PyTorch/examples/roberta/commonsense_qa/download_cqa_data.sh |https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/dict.txt|gpt2_bpe默认dict.txt开源社区链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/speech_synthesis/preprocessing/denoiser/pretrained.py|Fairseq_Transformer_wmt18_for_PyTorch/examples/speech_synthesis/preprocessing/denoiser/pretrained.py |https://dl.fbaipublicfiles.com/adiyoss/denoiser/|pretrained.py默认denoiser开源社区链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/wav2vec/README.md|Fairseq_Transformer_wmt18_for_PyTorch/examples/speech_to_speech/asr_bleu/asr_model_cfgs.json |https://dl.fbaipublicfiles.com/fairseq/wav2vec/wav2vec_vox_960h_pl.pt|wav2vec_vox_960h_pl.pt权重开源社区链接配置| -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/main/examples/wav2vec/README.md|Fairseq_Transformer_wmt18_for_PyTorch/examples/speech_to_speech/asr_bleu/asr_model_cfgs.json |https://dl.fbaipublicfiles.com/fairseq/wav2vec/dict.ltr.txt|wav2vec的dict.ltr文件开源社区txt链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/speech_to_speech/asr_bleu/asr_model_cfgs.json|Fairseq_Transformer_wmt18_for_PyTorch/examples/speech_to_speech/asr_bleu/asr_model_cfgs.json |https://dl.fbaipublicfiles.com/ust_asr/hok/checkpoint_best.pt|asr_bleu的checkpoint_best.pt权重开源社区链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/speech_to_speech/asr_bleu/asr_model_cfgs.json|Fairseq_Transformer_wmt18_for_PyTorch/examples/speech_to_speech/asr_bleu/asr_model_cfgs.json |https://dl.fbaipublicfiles.com/ust_asr/hok/dict.ltr.txt|asr_bleu的dict.ltr文件开源社区txt链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/translation/prepare-iwslt14.sh |https://github.com/moses-smt/mosesdecoder.git|mosesdecoder在开源社区中的git链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/translation/prepare-iwslt14.sh |https://github.com/rsennrich/subword-nmt.git|subword-nmt在开源社区中的git链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-iwslt14.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/translation/prepare-iwslt14.sh |http://dl.fbaipublicfiles.com/fairseq/data/iwslt14/de-en.tgz|iwslt14_de-en数据集在开源社区上的tgz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-iwslt17-multilingual.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/translation/prepare-iwslt17-multilingual.sh |https://wit3.fbk.eu/archive/2017-01-trnted/texts/de/en/de-en.tgz|wit3.fbk.eu_de-en在开源社区上的tgz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-iwslt17-multilingual.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/translation/prepare-iwslt17-multilingual.sh |https://wit3.fbk.eu/archive/2017-01-trnted/texts/fr/en/fr-en.tgz|wit3.fbk.eu_fr-en在开源社区上的tgz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/translation/prepare-wmt14en2de.sh |https://github.com/moses-smt/mosesdecoder.git|mosesdecoder在开源社区中的git链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/translation/prepare-wmt14en2de.sh |https://github.com/rsennrich/subword-nmt.git|subword-nmt在开源社区中的git链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/translation/prepare-wmt14en2de.sh |http://statmt.org/wmt13/training-parallel-europarl-v7.tgz|wmt13数据集training-parallel-europarl-v7在开源社区的tgz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/translation/prepare-wmt14en2de.sh |http://statmt.org/wmt13/training-parallel-commoncrawl.tgz|wmt13数据集training-parallel-commoncrawl在开源社区的tgz下载链接| -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2de.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/translation/prepare-wmt14en2de.sh |http://data.statmt.org/wmt17/translation-task/training-parallel-nc-v12.tgz|wmt17数据集training-parallel-nc-v12在开源社区的tgz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2de.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/translation/prepare-wmt14en2de.sh |http://data.statmt.org/wmt17/translation-task/dev.tgz|wmt17数据集dev在开源社区的tgz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/translation/prepare-wmt14en2de.sh |http://statmt.org/wmt14/test-full.tgz|wmt14数据集test-full在开源社区的tgz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/translation/prepare-wmt14en2de.sh |http://statmt.org/wmt14/training-parallel-nc-v9.tgz|wmt14数据集training-parallel-nc-v9在开源社区的tgz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh |https://github.com/moses-smt/mosesdecoder.git|mosesdecoder在开源社区中的git链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh |https://github.com/rsennrich/subword-nmt.git|subword-nmt在开源社区中的git链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh |http://statmt.org/wmt13/training-parallel-europarl-v7.tgz|wmt13数据集training-parallel-europarl-v7在开源社区的tgz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh |http://statmt.org/wmt13/training-parallel-commoncrawl.tgz|wmt13数据集training-parallel-commoncrawl在开源社区的tgz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh |http://statmt.org/wmt13/training-parallel-un.tgz|wmt13数据集training-parallel-un在开源社区的tgz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh |http://statmt.org/wmt14/training-parallel-nc-v9.tgz|wmt14数据集training-parallel-nc-v9在开源社区的tgz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh |http://statmt.org/wmt10/training-giga-fren.tar|wmt10数据集training-parallel-giga-fren在开源社区的tar下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh |http://statmt.org/wmt14/test-full.tgz|wmt14数据集test-full在开源社区的tgz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/wmt21/eval.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/wmt21/eval.sh 
|https://dl.fbaipublicfiles.com/fairseq/models|fairseq模型在开源社区的url下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/wmt21/eval.sh|Fairseq_Transformer_wmt18_for_PyTorch/examples/wmt21/eval.sh |https://github.com/wmt-conference/wmt21-news-systems|wmt21-news-systems在开源社区的git下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/data/encoders/gpt2_bpe.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/data/encoders/gpt2_bpe.py |https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/encoder.json|gpt2_bpe默认encoder.json开源社区链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/data/encoders/gpt2_bpe.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/data/encoders/gpt2_bpe.py |https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/vocab.bpe|gpt2_bpe默认vocab.bpe开源社区链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/dataclass/configs.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/dataclass/configs.py |https://fairscale.readthedocs.io/en/latest/api/experimental/nn/slowmo_ddp.html|slowmo_ddp在开源社区上的html链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/fconv_self_att.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/fconv_self_att.py |https://dl.fbaipublicfiles.com/fairseq/models/stories_checkpoint.tar.gz|fconv_self_att权重stories_checkpoint开源社区tar.gz链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/fconv_self_att.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/fconv_self_att.py |https://dl.fbaipublicfiles.com/fairseq/models/stories_checkpoint.tar.gz|fconv_self_att权重stories_checkpoint开源社区tar.gz链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/fconv_self_att.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/fconv_self_att.py |https://dl.fbaipublicfiles.com/fairseq/data/stories_test.tar.bz2|fconv_self_att数据集stories_test开源社区tar.bz2链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/fconv.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/fconv.py |https://dl.fbaipublicfiles.com/fairseq/models/wmt14.v2.en-fr.fconv-py.tar.bz2|wmt14.v2.en-fr.fconv-py开源社区tar.bz2链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/fconv.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/fconv.py |https://dl.fbaipublicfiles.com/fairseq/models/wmt14.en-de.fconv-py.tar.bz2|wmt14.en-de.fconv-py.tar开源社区tar.bz2链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/fconv.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/fconv.py |https://dl.fbaipublicfiles.com/fairseq/models/wmt17.v2.en-de.fconv-py.tar.bz2|wmt17.v2.en-de.fconv-py开源社区tar.bz2链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/lightconv.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/lightconv.py |https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/iwslt14.de-en.lightconv.tar.gz|'lightconv.no_glu.iwslt14.de-en'在开源社区上的iwslt14.de-en.lightconv.tar.gz'的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/lightconv.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/lightconv.py |https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/iwslt14.de-en.dynamicconv.tar.gz|'dynamicconv.no_glu.iwslt14.de-en'在开源社区上的iwslt14.de-en.dynamicconv.tar.gz'的下载链接| -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/lightconv.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/lightconv.py |https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.lightconv.tar.gz|'lightconv.no_glu.wmt16.en-de'在开源社区上的wmt16.en-de.joined-dict.lightconv.tar.gz'的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/lightconv.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/lightconv.py |https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.dynamicconv.tar.gz|'dynamicconv.no_glu.wmt16.en-de'在开源社区上的wmt16.en-de.joined-dict.dynamicconv.tar.gz'的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/lightconv.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/lightconv.py |https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.lightconv-glu.tar.gz|'lightconv.glu.wmt16.en-de'在开源社区上的wmt16.en-de.joined-dict.lightconv-glu.tar.gz'的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/lightconv.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/lightconv.py |https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.dynamicconv-glu.tar.gz|'dynamicconv.glu.wmt16.en-de'在开源社区上的wmt16.en-de.joined-dict.dynamicconv-glu.tar.gz'的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/lightconv.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/lightconv.py |https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.lightconv-glu.tar.gz|'lightconv.glu.wmt17.en-de'在开源社区上的wmt16.en-de.joined-dict.lightconv-glu.tar.gz'的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/lightconv.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/lightconv.py |https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.dynamicconv-glu.tar.gz|'dynamicconv.glu.wmt17.en-de'在开源社区上的wmt16.en-de.joined-dict.dynamicconv-glu.tar.gz'的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/lightconv.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/lightconv.py |https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt14.en-fr.joined-dict.lightconv-glu.tar.gz|'lightconv.glu.wmt14.en-fr'在开源社区上的wmt14.en-fr.joined-dict.lightconv-glu.tar.gz'的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/lightconv.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/lightconv.py |https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt14.en-fr.joined-dict.dynamicconv-glu.tar.gz|'dynamicconv.glu.wmt14.en-fr'在开源社区上的wmt14.en-fr.joined-dict.dynamicconv-glu.tar.gz'的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/lightconv.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/lightconv.py |https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt17.zh-en.lightconv-glu.tar.gz|'lightconv.glu.wmt17.zh-en'在开源社区上的wmt17.zh-en.lightconv-glu.tar.gz'的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/lightconv.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/lightconv.py |https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt17.zh-en.dynamicconv-glu.tar.gz|'dynamicconv.glu.wmt17.zh-en'在开源社区上的wmt17.zh-en.dynamicconv-glu.tar.gz'的下载链接| -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer_lm.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/transformer_lm.py |https://dl.fbaipublicfiles.com/fairseq/models/lm/adaptive_lm_gbw_huge.tar.bz2|"transformer_lm.gbw.adaptive_huge"在开源社区上的adaptive_lm_gbw_huge.tar.bz2的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer_lm.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/transformer_lm.py |https://dl.fbaipublicfiles.com/fairseq/models/lm/adaptive_lm_wiki103.v2.tar.bz2|"transformer_lm.wiki103.adaptive"在开源社区上的adaptive_lm_wiki103.v2.tar.bz2的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer_lm.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/transformer_lm.py |https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.en.tar.bz2|wmt19.en在开源社区上的tar.bz2下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer_lm.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/transformer_lm.py |https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.de.tar.bz2|wmt19.de在开源社区上的tar.bz2下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer_lm.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/transformer_lm.py |https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.ru.tar.bz2|wmt19.ru在开源社区上的tar.bz2下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer_lm.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/transformer_lm.py |https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt20.en.tar.gz|wmt20.en在开源社区上的tar.gz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer_lm.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/transformer_lm.py |https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt20.ta.tar.gz|wmt20.ta在开源社区上的tar.gz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer_lm.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/transformer_lm.py |https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt20.iu.news.tar.gz|wmt20.iu.news在开源社区上的tar.gz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer_lm.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/transformer_lm.py |https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt20.iu.nh.tar.gz|wmt20.iu.nh在开源社区上的tar.gz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/xm_transformer_unity.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/speech_to_text/s2t_transformer.py |http://dl.fbaipublicfiles.com/fairseq/s2t|fairseq_s2t在开源社区上的url链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/xm_transformer_unity.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/speech_to_text/xm_transformer_unity.py |http://dl.fbaipublicfiles.com/fairseq/s2t|fairseq_s2t在开源社区上的url链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/xm_transformer_unity.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/speech_to_text/xm_transformer.py |http://dl.fbaipublicfiles.com/fairseq/s2t|fairseq_s2t在开源社区上的url链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/text_to_speech/tts_transformer.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/text_to_speech/fastspeech2.py 
|http://dl.fbaipublicfiles.com/fairseq/s2|fairseq_s2在开源社区上的url链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/text_to_speech/tts_transformer.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/text_to_speech/tts_transformer.py |http://dl.fbaipublicfiles.com/fairseq/s2|fairseq_s2在开源社区上的url链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/text_to_speech/vocoder.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/text_to_speech/vocoder.py |http://dl.fbaipublicfiles.com/fairseq/vocoder|fairseq_vocoder在开源社区上的url链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer/transformer_legacy.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/transformer/transformer_legacy.py |https://dl.fbaipublicfiles.com/fairseq/models/wmt14.en-fr.joined-dict.transformer.tar.bz2|'transformer.wmt14.en-fr'在开源社区上的wmt14.en-fr.joined-dict.transformer.tar.bz2'的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer/transformer_legacy.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/transformer/transformer_legacy.py |https://dl.fbaipublicfiles.com/fairseq/models/wmt16.en-de.joined-dict.transformer.tar.bz2|'transformer.wmt16.en-de'在开源社区上的wmt16.en-de.joined-dict.transformer.tar.bz2的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer/transformer_legacy.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/transformer/transformer_legacy.py |https://dl.fbaipublicfiles.com/fairseq/models/wmt18.en-de.ensemble.tar.gz|'transformer.wmt18.en-de'在开源社区上的wmt18.en-de.ensemble.tar.gz'的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer/transformer_legacy.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/transformer/transformer_legacy.py |https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-de.joined-dict.ensemble.tar.gz|'transformer.wmt19.en-de'在开源社区上的wmt19.en-de.joined-dict.ensemble.tar.gz'的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer/transformer_legacy.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/transformer/transformer_legacy.py |https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-ru.ensemble.tar.gz|'transformer.wmt19.en-ru'在开源社区上的wmt19.en-ru.ensemble.tar.gz'的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer/transformer_legacy.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/transformer/transformer_legacy.py |https://dl.fbaipublicfiles.com/fairseq/models/wmt19.de-en.joined-dict.ensemble.tar.gz|'transformer.wmt19.de-en'在开源社区上的wmt19.de-en.joined-dict.ensemble.tar.gz'的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer/transformer_legacy.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/transformer/transformer_legacy.py |https://dl.fbaipublicfiles.com/fairseq/models/wmt19.ru-en.ensemble.tar.gz|'transformer.wmt19.ru-en'在开源社区上的wmt19.ru-en.ensemble.tar.gz'的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer/transformer_legacy.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/transformer/transformer_legacy.py |https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-de.joined-dict.single_model.tar.gz|'transformer.wmt19.en-de.single_model'在开源社区上的wmt19.en-de.joined-dict.single_model.tar.gz'的下载链接| -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer/transformer_legacy.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/transformer/transformer_legacy.py |https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-ru.single_model.tar.gz|'transformer.wmt19.en-ru.single_model'在开源社区上的wmt19.en-ru.single_model.tar.gz'的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer/transformer_legacy.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/transformer/transformer_legacy.py |https://dl.fbaipublicfiles.com/fairseq/models/wmt19.de-en.joined-dict.single_model.tar.gz|'transformer.wmt19.de-en.single_model'在开源社区上的wmt19.de-en.joined-dict.single_model.tar.gz'的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer/transformer_legacy.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/transformer/transformer_legacy.py |https://dl.fbaipublicfiles.com/fairseq/models/wmt19.ru-en.single_model.tar.gz|'transformer.wmt19.ru-en.single_model'在开源社区上的wmt19.ru-en.single_model.tar.gz'的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer/transformer_legacy.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/transformer/transformer_legacy.py |https://dl.fbaipublicfiles.com/fairseq/models/wmt20.en-ta.single.tar.gz|'transformer.wmt20.en-ta'在开源社区上的wmt20.en-ta.single.tar.gz'的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer/transformer_legacy.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/transformer/transformer_legacy.py |https://dl.fbaipublicfiles.com/fairseq/models/wmt20.en-iu.news.single.tar.gz|'transformer.wmt20.en-iu.news'在开源社区上的wmt20.en-iu.news.single.tar.gz'的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer/transformer_legacy.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/transformer/transformer_legacy.py |https://dl.fbaipublicfiles.com/fairseq/models/wmt20.en-iu.nh.single.tar.gz|'transformer.wmt20.en-iu.nh'在开源社区上的wmt20.en-iu.nh.single.tar.gz'的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer/transformer_legacy.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/transformer/transformer_legacy.py |https://dl.fbaipublicfiles.com/fairseq/models/wmt20.ta-en.single.tar.gz|'transformer.wmt20.ta-en'在开源社区上的wmt20.ta-en.single.tar.gz'的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer/transformer_legacy.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/transformer/transformer_legacy.py |https://dl.fbaipublicfiles.com/fairseq/models/wmt20.iu-en.news.single.tar.gz|'transformer.wmt20.iu-en.news'在开源社区上的wmt20.iu-en.news.single.tar.gz'的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer/transformer_legacy.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/transformer/transformer_legacy.py |https://dl.fbaipublicfiles.com/fairseq/models/wmt20.iu-en.nh.single.tar.gz|'transformer.wmt20.iu-en.nh'在开源社区上的wmt20.iu-en.nh.single.tar.gz'的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer/transformer_legacy.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/transformer/transformer_legacy.py 
|https://dl.fbaipublicfiles.com/flores101/pretrained_models/flores101_mm100_615M.tar.gz|'transformer.flores101.mm100.615M'在开源社区上的flores101_mm100_615M.tar.gz'的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer/transformer_legacy.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/transformer/transformer_legacy.py |https://dl.fbaipublicfiles.com/flores101/pretrained_models/flores101_mm100_175M.tar.gz|'transformer.flores101.mm100.175M'在开源社区上的flores101_mm100_175M.tar.gz'的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/xmod/model.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/xmod/model.py |https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.81.1M.tar.gz|"xmod.base"在开源社区上的xmod.base.81.1M.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/xmod/model.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/xmod/model.py |https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.large.prenorm.81.500k.tar.gz|"xmod.large.prenorm"在开源社区上的xmod.large.prenorm.81.500k.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/xmod/model.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/xmod/model.py |https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.13.125k.tar.gz|"xmod.base.13.125k"在开源社区上的xmod.base.13.125k.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/xmod/model.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/xmod/model.py |https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.30.125k.tar.gz|"xmod.base.30.125k"在开源社区上的xmod.base.30.125k.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/xmod/model.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/xmod/model.py |https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.30.195k.tar.gz|"xmod.base.30.195k"在开源社区上的xmod.base.30.195k.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/xmod/model.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/xmod/model.py |https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.60.125k.tar.gz|"xmod.base.60.125k"在开源社区上的xmod.base.60.125k.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/xmod/model.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/xmod/model.py |https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.60.265k.tar.gz|"xmod.base.60.265k"在开源社区上的xmod.base.60.265k.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/xmod/model.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/xmod/model.py |https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.75.125k.tar.gz|"xmod.base.75.125k"在开源社区上的xmod.base.75.125k.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/xmod/model.py|Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/xmod/model.py |https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.75.269k.tar.gz|"xmod.base.75.269k"在开源社区上的xmod.base.75.269k.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/tests/speech/__init__.py|Fairseq_Transformer_wmt18_for_PyTorch/tests/speech/__init__.py |https://dl.fbaipublicfiles.com/fairseq|fairseq在开源社区上的url链接配置| -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/main/tests/speech/__init__.py|Fairseq_Transformer_wmt18_for_PyTorch/tests/speech/__init__.py |https://dl.fbaipublicfiles.com/joint_speech_text_4_s2t/must_c/en_de|joint_speech_text_4_s2t_en_de在开源社区上的url链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/tests/speech/test_s2s_transformer.py|Fairseq_Transformer_wmt18_for_PyTorch/tests/speech/test_convtransformer_simul_trans.py |https://dl.fbaipublicfiles.com/fairseq/|fairseq在开源社区上的url链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/tests/speech/test_dual_input_wav_transformer.py|Fairseq_Transformer_wmt18_for_PyTorch/tests/speech/test_dual_input_wav_transformer.py |https://dl.fbaipublicfiles.com/joint_speech_text_4_s2t/acl2022/librispeech/finetuned|joint_speech_text_4_s2t_librispeech在开源社区上的url链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/tests/speech/test_s2s_transformer.py|Fairseq_Transformer_wmt18_for_PyTorch/tests/speech/test_s2s_transformer.py |https://dl.fbaipublicfiles.com/fairseq/|fairseq在开源社区上的url链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/tests/speech/__init__.py|Fairseq_Transformer_wmt18_for_PyTorch/tests/speech/test_wav2vec2.py |https://dl.fbaipublicfiles.com/fairseq|fairseq在开源社区上的url链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/setup.py | Fairseq_Transformer_wmt18_for_PyTorch/setup.py | https://stackoverflow.com/a/54128391 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/setup.py | Fairseq_Transformer_wmt18_for_PyTorch/setup.py | https://bit.ly/2NLVsgE | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/hydra_plugins/dependency_submitit_launcher/setup.py | Fairseq_Transformer_wmt18_for_PyTorch/hydra_plugins/dependency_submitit_launcher/setup.py | abaevski@fb.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/examples/wav2vec/unsupervised/w2vu_generate.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq_cli/train.py | https://github.com/facebookresearch/hydra/issues/1126 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/examples/wav2vec/unsupervised/w2vu_generate.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq_cli/hydra_validate.py | https://github.com/facebookresearch/hydra/issues/1126 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/examples/wav2vec/unsupervised/w2vu_generate.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq_cli/hydra_train.py | https://github.com/facebookresearch/hydra/issues/1126 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/trainer.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/trainer.py | https://openreview.net/forum?id=_CMSV7FTzGI | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/examples/nonautoregressive_translation/README.md | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/tasks/translation_lev.py | https://arxiv.org/abs/1905.11006 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/tasks/translation_lev.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/tasks/translation_lev.py | https://www.aclweb.org/anthology/2020.acl-main.325/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/examples/speech_synthesis/utils.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/tasks/text_to_speech.py | https://arxiv.org/pdf/2011.03568.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/examples/textless_nlp/dgslm/hubert_fisher/README.md | 
Fairseq_Transformer_wmt18_for_PyTorch/fairseq/tasks/speech_dlm_task.py | https://arxiv.org/pdf/2203.16502.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/tasks/fairseq_task.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/tasks/fairseq_task.py | https://arxiv.org/abs/2010.00904 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/tasks/fairseq_task.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/tasks/fairseq_task.py | https://github.com/facebookresearch/GENRE | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/examples/cross_lingual_language_model/README.md | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/tasks/cross_lingual_lm.py | https://arxiv.org/pdf/1901.07291.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/examples/constrained_decoding/README.md | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/search.py | https://www.aclweb.org/anthology/N18-1119/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/examples/constrained_decoding/README.md | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/search.py | https://www.aclweb.org/anthology/N19-1090/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/models/speech_dlm/sequence_generator/multichannel_search.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/search.py | https://arxiv.org/abs/1904.09751 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/search.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/search.py | https://arxiv.org/abs/1611.08562 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/examples/scaling_nmt/README.md | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/scoring/tokenizer.py | https://github.com/mjpost/sacrebleu | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/optim/lr_scheduler/triangular_lr_scheduler.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/optim/lr_scheduler/triangular_lr_scheduler.py | https://arxiv.org/pdf/1506.01186.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/optim/lr_scheduler/tri_stage_lr_scheduler.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/optim/lr_scheduler/tri_stage_lr_scheduler.py | https://arxiv.org/pdf/1904.08779.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/optim/lr_scheduler/cosine_lr_scheduler.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/optim/lr_scheduler/cosine_lr_scheduler.py | https://arxiv.org/pdf/1608.03983.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/optim/adam.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/optim/fused_adam.py | https://arxiv.org/abs/1412.6980 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/optim/adam.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/optim/fused_adam.py | https://openreview.net/forum?id=ryQu7f-RZ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/optim/bmuf.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/optim/bmuf.py | https://ieeexplore.ieee.org/document/7472805 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/optim/adam.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/optim/adamax.py | https://arxiv.org/abs/1412.6980 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/optim/adam.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/optim/adam.py | https://arxiv.org/abs/1711.05101 | 参考论文地址 | -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/fairseq/optim/adam.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/optim/adam.py | https://arxiv.org/abs/1412.6980 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/optim/adam.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/optim/adam.py | https://openreview.net/forum?id=ryQu7f-RZ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/optim/adafactor.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/optim/adafactor.py | https://arxiv.org/abs/1804.04235 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/modules/vggblock.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/modules/vggblock.py | https://arxiv.org/pdf/1409.1556.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/modules/sparse_multihead_attention.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/modules/sparse_multihead_attention.py | https://arxiv.org/pdf/1904.10509.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/modules/rotary_positional_embedding.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/modules/rotary_positional_embedding.py | https://blog.eleuther.ai/rotary-embeddings/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/modules/rotary_positional_embedding.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/modules/rotary_positional_embedding.py | https://arxiv.org/pdf/2104.09864.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/examples/truncated_bptt/README.md | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/modules/positional_encoding.py | https://arxiv.org/abs/1901.02860 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/modules/lstm_cell_with_zoneout.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/modules/lstm_cell_with_zoneout.py | https://arxiv.org/abs/1606.01305 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/modules/location_attention.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/modules/location_attention.py | https://arxiv.org/pdf/1506.07503.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/examples/layerdrop/README.md | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/modules/layer_drop.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/modules/gelu.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/modules/gelu.py | https://github.com/hendrycks/GELUs | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/examples/truncated_bptt/README.md | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/modules/espnet_multihead_attention.py | https://arxiv.org/abs/1901.02860 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/examples/nonautoregressive_translation/README.md | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/modules/dynamic_crf_layer.py | https://arxiv.org/abs/1910.11555 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/modules/dynamic_crf_layer.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/modules/dynamic_crf_layer.py | https://github.com/kmkurn/pytorch-crf/blob/master/torchcrf/__init__.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/models/speech_to_text/s2t_conformer.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/modules/conformer_layer.py | https://arxiv.org/abs/2005.08100 | 参考论文地址 | -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/fairseq/modules/character_token_embedder.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/modules/character_token_embedder.py | https://arxiv.org/abs/1505.00387 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/modules/adaptive_softmax.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/modules/adaptive_softmax.py | http://arxiv.org/abs/1609.04309 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/models/wav2vec/utils.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/wav2vec/utils.py | https://github.com/lucidrains/local-attention/blob/master/local_attention/local_attention.py#L41 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/examples/pointer_generator/pointer_generator_src/transformer_pg.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/transformer/transformer_base.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/models/text_to_speech/tts_transformer.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/text_to_speech/tts_transformer.py | https://arxiv.org/pdf/1809.08895.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/models/text_to_speech/tacotron2.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/text_to_speech/tacotron2.py | https://arxiv.org/pdf/1712.05884.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/examples/speech_synthesis/docs/ljspeech_example.md | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/text_to_speech/fastspeech2.py | https://arxiv.org/abs/2006.04558 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/examples/pointer_generator/pointer_generator_src/transformer_pg.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/speech_to_text/s2t_transformer.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/models/speech_to_text/s2t_conformer.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/speech_to_text/s2t_conformer.py | https://arxiv.org/abs/2005.08100 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/models/speech_to_text/modules/emformer.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/speech_to_text/modules/emformer.py | https://arxiv.org/abs/1803.02155 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/models/speech_to_text/modules/emformer.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/speech_to_text/modules/emformer.py | https://arxiv.org/abs/2005.09684 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/models/speech_to_text/modules/convolution.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/speech_to_text/modules/convolution.py | https://arxiv.org/abs/1911.08460 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/models/speech_to_text/modules/convolution.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/speech_to_text/modules/convolution.py | https://github.com/espnet/espnet | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/models/speech_to_text/modules/augmented_memory_attention.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/speech_to_text/modules/augmented_memory_attention.py | https://arxiv.org/abs/2005.08042 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/models/speech_to_text/modules/augmented_memory_attention.py | 
Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/speech_to_text/modules/augmented_memory_attention.py | https://arxiv.org/abs/2005.09137 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/models/speech_to_text/convtransformer.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/speech_to_text/convtransformer.py | https://arxiv.org/abs/2004.10234 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/models/speech_to_text/berard.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/speech_to_text/berard.py | https://arxiv.org/abs/1802.04200 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/models/speech_to_text/berard.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/speech_to_text/berard.py | https://github.com/eske/seq2seq | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/models/speech_to_text/berard.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/speech_to_text/berard.py | https://github.com/eske/seq2seq/blob/master/config/LibriSpeech/AST.yaml | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/models/speech_to_text/berard.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/speech_to_text/berard.py | https://github.com/eske/seq2seq/blob/master/translate/models.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/models/speech_to_text/berard.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/speech_to_text/berard.py | https://arxiv.org/abs/1409.0473 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/examples/speech_to_text/README.md | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/speech_to_text/berard.py | https://arxiv.org/abs/1909.06515 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/models/speech_to_text/berard.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/speech_to_text/berard.py | https://arxiv.org/pdf/2002.01320.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/examples/speech_to_text/README.md | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/speech_to_text/berard.py | https://arxiv.org/abs/2006.12124 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/examples/speech_to_speech/docs/direct_s2st_discrete_units.md | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/speech_to_speech/s2s_transformer.py | https://arxiv.org/abs/2107.05604 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/examples/textless_nlp/dgslm/hubert_fisher/README.md | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/speech_dlm/speech_dlm.py | https://arxiv.org/pdf/2203.16502.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/examples/textless_nlp/dgslm/hubert_fisher/README.md | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/speech_dlm/sequence_generator/multichannel_sequence_generator.py | https://arxiv.org/pdf/2203.16502.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/models/speech_dlm/sequence_generator/multichannel_sequence_generator.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/speech_dlm/sequence_generator/multichannel_sequence_generator.py | https://discuss.pytorch.org/t/how-to-mask-and-assign-a-value-to-tensor/18437 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/models/speech_dlm/sequence_generator/multichannel_search.py | 
Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/speech_dlm/sequence_generator/multichannel_search.py | https://arxiv.org/abs/1904.09751 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/examples/textless_nlp/dgslm/hubert_fisher/README.md | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/speech_dlm/modules/speech_dlm_decoder_layer.py | https://arxiv.org/pdf/2203.16502.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/examples/textless_nlp/dgslm/hubert_fisher/README.md | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/speech_dlm/modules/speech_dlm_decoder.py | https://arxiv.org/pdf/2203.16502.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/models/roberta/model_xlmr.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/roberta/model_xlmr.py | http://dl.fbaipublicfiles.com/fairseq/models/xlmr.base.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/models/roberta/model_xlmr.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/roberta/model_xlmr.py | http://dl.fbaipublicfiles.com/fairseq/models/xlmr.large.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/models/roberta/model_xlmr.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/roberta/model_xlmr.py | http://dl.fbaipublicfiles.com/fairseq/models/xlmr/xlmr.xl.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/models/roberta/model_xlmr.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/roberta/model_xlmr.py | http://dl.fbaipublicfiles.com/fairseq/models/xlmr/xlmr.xxl.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/examples/gottbert/README.md | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/roberta/model_gottbert.py | https://dl.gottbert.de/fairseq/models/gottbert-base.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/models/roberta/model_camembert.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/models/roberta/model_camembert.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-large.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/models/roberta/model_camembert.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-ccnet.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/models/roberta/model_camembert.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-ccnet-4gb.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/models/roberta/model_camembert.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-wikipedia-4gb.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/models/roberta/model_camembert.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-oscar-4gb.tar.gz | 模型相关说明 | -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/fairseq/models/roberta/model.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.base.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/models/roberta/model.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/models/roberta/model.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.mnli.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/models/roberta/model.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.wsc.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/trainer.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/roberta/model.py | https://openreview.net/forum?id=_CMSV7FTzGI | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/examples/roberta/commonsense_qa/README.md | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/roberta/hub_interface.py | https://github.com/pytorch/fairseq/tree/main/examples/roberta | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/models/roberta/hub_interface.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/roberta/hub_interface.py | https://github.com/pytorch/fairseq/issues/1306 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/models/lightconv.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/lightconv.py | https://openreview.net/pdf?id=SkVhlh09tX | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/examples/conv_seq2seq/README.md | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/fconv.py | https://arxiv.org/abs/1705.03122 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/models/fairseq_incremental_decoder.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/fairseq_incremental_decoder.py | http://www.telesens.co/2019/04/21/understanding-incremental-decoding-in-fairseq/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/models/ema/ema.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/ema/ema.py | https://github.com/zhawe01/fairseq-gec.git | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/models/ema/ema.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/ema/ema.py | https://www.tensorflow.org/api_docs/python/tf/train/ExponentialMovingAverage | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/models/bart/model.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.base.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/models/bart/model.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/models/bart/model.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.mnli.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/models/bart/model.py | 
Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.cnn.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/models/bart/model.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.xsum.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/models/bart/hub_interface.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/bart/hub_interface.py | https://github.com/pytorch/fairseq/tree/main/examples/bart | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/examples/megatron_11b/README.md | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/model_parallel/modules/transformer_layer.py | https://arxiv.org/pdf/1909.08053.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/examples/megatron_11b/README.md | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/model_parallel/modules/multihead_attention.py | https://arxiv.org/pdf/1909.08053.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/file_utils.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/file_utils.py | https://github.com/allenai/allennlp | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/examples/roberta/README.md | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/file_utils.py | https://github.com/huggingface | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/dataclass/utils.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/dataclass/utils.py | https://github.com/omry/omegaconf/pull/911 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/dataclass/constants.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/dataclass/constants.py | https://github.com/facebookresearch/hydra/issues/1156 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/dataclass/configs.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/dataclass/configs.py | https://github.com/facebookresearch/hydra/issues/1117 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/examples/textless_nlp/dgslm/hubert_fisher/README.md | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/data/speech_dlm_dataset.py | https://arxiv.org/pdf/2203.16502.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/data/span_mask_tokens_dataset.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/data/span_mask_tokens_dataset.py | https://github.com/google-research/text-to-text-transfer-transformer/blob/84f8bcc14b5f2c03de51bd3587609ba8f6bbd1cd/t5/data/preprocessors.py#L2682 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/examples/wav2vec/README.md | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/data/mask_tokens_dataset.py | https://arxiv.org/abs/1910.05453 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/tests/test_token_block_dataset.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/data/indexed_dataset.py | https://github.com/numpy/numpy/issues/5745 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/data/encoders/gpt2_bpe_utils.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/data/encoders/gpt2_bpe_utils.py | https://github.com/openai/gpt-2/blob/master/src/encoder.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/examples/wav2vec/README.md | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/data/data_utils.py | https://arxiv.org/abs/1910.05453 | 参考论文地址 | -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/fairseq/data/audio/speech_to_text_dataset.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/data/audio/speech_to_text_dataset.py | https://arxiv.org/abs/1907.05019 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/data/audio/feature_transforms/specaugment.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/data/audio/feature_transforms/specaugment.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/criterions/tacotron2_loss.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/criterions/tacotron2_loss.py | https://arxiv.org/abs/1710.08969 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/examples/textless_nlp/dgslm/hubert_fisher/README.md | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/criterions/speech_dlm_criterion.py | https://arxiv.org/pdf/2203.16502.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/modules/adaptive_softmax.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/criterions/adaptive_loss.py | http://arxiv.org/abs/1609.04309 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/clib/libnat_cuda/binding.cpp | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/clib/libnat_cuda/binding.cpp | https://github.com/1ytic/pytorch-edit-distance | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/clib/libbase/balanced_assignment.cpp | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/clib/libbase/balanced_assignment.cpp | https://dspace.mit.edu/bitstream/handle/1721.1/3265/P-2108-26912652.pdf | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/clib/libbase/balanced_assignment.cpp | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/clib/libbase/balanced_assignment.cpp | https://github.com/bkj/auction-lap | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/fairseq/checkpoint_utils.py | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/checkpoint_utils.py | https://pypi.org/project/huggingface-hub/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/examples/layerdrop/README.md | Fairseq_Transformer_wmt18_for_PyTorch/fairseq/checkpoint_utils.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/examples/backtranslation/prepare-wmt18en2de.sh | Fairseq_Transformer_wmt18_for_PyTorch/examples/translation/prepare-wmt18en2de.sh | https://github.com/facebookresearch/MIXER/blob/master/prepareData.sh | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/examples/conv_seq2seq/README.md | Fairseq_Transformer_wmt18_for_PyTorch/examples/translation/prepare-wmt18en2de.sh | https://arxiv.org/abs/1705.03122 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/docs/make.bat | Fairseq_Transformer_wmt18_for_PyTorch/docs/make.bat | http://sphinx-doc.org/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/docs/conf.py | Fairseq_Transformer_wmt18_for_PyTorch/docs/conf.py | http://docs.scipy.org/doc/numpy/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/examples/speech_to_speech/benchmarking/README.md | Fairseq_Transformer_wmt18_for_PyTorch/docs/conf.py | https://docs.python.org/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/docs/conf.py | Fairseq_Transformer_wmt18_for_PyTorch/docs/conf.py | https://pytorch.org/docs/master/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/examples/quant_noise/README.md | 
CPM_Finetune_for_PyTorch/mpu/random.py | https://github.com/pytorch/pytorch | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/examples/quant_noise/README.md | CPM_Finetune_for_PyTorch/mpu/layers.py | https://github.com/pytorch/pytorch | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/examples/quant_noise/README.md | CPM_Finetune_for_PyTorch/mpu/grads.py | https://github.com/pytorch/pytorch | 源码实现 | +| 文件位置 | 公网地址 | 公网地址用途 | +|-----------------------------------------------------------------------------------------------------------------------------------|----------------------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/data/encoders/gpt2_bpe.py | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/vocab.bpe | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/data/encoders/gpt2_bpe.py | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/encoder.json | 模型参数配置 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/dataclass/configs.py | https://fairscale.readthedocs.io/en/latest/api/experimental/nn/slowmo_ddp.html | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.xsum.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.mnli.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.cnn.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.base.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/fconv.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt17.v2.en-de.fconv-py.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/fconv.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt17.v2.en-de.fconv-py.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/fconv.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt14.en-de.fconv-py.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/fconv_self_att.py | https://dl.fbaipublicfiles.com/fairseq/models/stories_checkpoint.tar.gz | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/fconv_self_att.py | https://dl.fbaipublicfiles.com/fairseq/models/stories_checkpoint.tar.gz | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/fconv_self_att.py | https://dl.fbaipublicfiles.com/fairseq/data/stories_test.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/lightconv.py | 
https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.lightconv.tar.gz | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.dynamicconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.dynamicconv.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/iwslt14.de-en.dynamicconv.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt17.zh-en.dynamicconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt18.en-de.joined-dict.dynamicconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt14.en-fr.joined-dict.dynamicconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/iwslt14.de-en.lightconv.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt17.zh-en.lightconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.lightconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.lightconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt14.en-fr.joined-dict.lightconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.base.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.wsc.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.mnli.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-large.tar.gz | 模型相关说明 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-wikipedia-4gb.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-oscar-4gb.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-ccnet-4gb.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-ccnet.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/roberta/model_gottbert.py | https://dl.gottbert.de/fairseq/models/gottbert-base.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/roberta/model_xlmr.py | http://dl.fbaipublicfiles.com/fairseq/models/xlmr/xlmr.xxl.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/roberta/model_xlmr.py | http://dl.fbaipublicfiles.com/fairseq/models/xlmr/xlmr.xl.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/roberta/model_xlmr.py | http://dl.fbaipublicfiles.com/fairseq/models/xlmr.large.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/roberta/model_xlmr.py | http://dl.fbaipublicfiles.com/fairseq/models/xlmr.base.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/speech_to_text/s2t_transformer.py | http://dl.fbaipublicfiles.com/fairseq/s2t | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/speech_to_text/xm_transformer.py | http://dl.fbaipublicfiles.com/fairseq/s2t | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/speech_to_text/xm_transformer_unity.py | http://dl.fbaipublicfiles.com/fairseq/s2t | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/text_to_speech/fastspeech2.py | http://dl.fbaipublicfiles.com/fairseq/s2 | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/text_to_speech/tts_transformer.py | http://dl.fbaipublicfiles.com/fairseq/s2 | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/text_to_speech/vocoder.py | http://dl.fbaipublicfiles.com/fairseq/vocoder | 相关说明 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.ta-en.single.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.iu-en.nh.single.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.iu-en.news.single.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.en-ta.single.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.en-iu.nh.single.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.en-iu.news.single.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.ru-en.ensemble.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.ru-en.single_model.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-ru.ensemble.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-ru.single_model.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-ru.single_model.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-de.joined-dict.ensemble.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-de.joined-dict.single_model.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.de-en.joined-dict.ensemble.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.de-en.joined-dict.single_model.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt18.en-de.ensemble.tar.gz | 模型相关说明 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt16.en-de.joined-dict.transformer.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt14.en-fr.joined-dict.transformer.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/flores101/pretrained_models/flores101_mm100_615M.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/flores101/pretrained_models/flores101_mm100_615M.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/adaptive_lm_wiki103.v2.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/adaptive_lm_gbw_huge.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt20.ta.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt20.iu.nh.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt20.iu.news.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt20.en.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.ru.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.ru.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.ru.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.large.prenorm.81.500k.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.75.269k.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.75.125k.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.60.265k.tar.gz | 模型相关说明 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.60.125k.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.30.195k.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.30.125k.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.13.125k.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.81.1M.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/hydra_plugins/dependency_submitit_launcher/setup.py | abaevski@fb.com | 作者邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Fairseq_Transformer_wmt18_for_PyTorch/setup.py | https://download.pytorch.org/whl/cpu/torch-1.7.0%2Bcpu-cp36-cp36m-linux_x86_64.whl | 模型相关说明 | \ No newline at end of file diff --git a/PyTorch/built-in/nlp/GPT-2_for_PyTorch/public_address_statement.md b/PyTorch/built-in/nlp/GPT-2_for_PyTorch/public_address_statement.md index 12c4607aa5dd4806cac1809ba802e5c5a31b7870..7dcd71ca452d2f4973afa34c4acf21cbe08451a6 100644 --- a/PyTorch/built-in/nlp/GPT-2_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/nlp/GPT-2_for_PyTorch/public_address_statement.md @@ -1,5 +1,3 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ---- | ------------ | ------ | ------------------------------------ | -------- | -| 开发引入| / |GPT-2_for_PyTorch/url.ini | https://dumps.wikimedia.org/enwiki/latest/enwiki-latest-pages-articles.xml.bz2 |enwiki-latest-pages-articles.xml在开源社区上的bz2下载链接| -| 开源代码引入 |https://github.com/microsoft/Megatron-DeepSpeed/dataset/download_vocab.sh |GPT-2_for_PyTorch/test/dataset_preprocess_gpt.sh | https://s3.amazonaws.com/models.huggingface.co/bert/gpt2-vocab.json |bert_gpt2-vocab在开源社区上的json下载链接| -| 开源代码引入 | https://github.com/microsoft/Megatron-DeepSpeed/dataset/download_vocab.sh |GPT-2_for_PyTorch/test/dataset_preprocess_gpt.sh | https://s3.amazonaws.com/models.huggingface.co/bert/gpt2-merges.txt|bert_gpt2-merges在开源社区上的txt下载链接| \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|-----------------------------------------------------------------|--------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/built-in/nlp/GPT-2_for_PyTorch/url.ini | https://dumps.wikimedia.org/enwiki/latest/enwiki-latest-pages-articles.xml.bz2 | 相关说明 | \ No newline at end of file diff --git a/PyTorch/built-in/nlp/GRU_for_PyTorch/public_address_statement.md b/PyTorch/built-in/nlp/GRU_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..8889009c0d477a120431d1b0b6c3b17982b64abe --- /dev/null +++ b/PyTorch/built-in/nlp/GRU_for_PyTorch/public_address_statement.md @@ -0,0 +1,5 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|---------------------------------------------------------------------|--------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/built-in/nlp/GRU_for_PyTorch/bleu_score.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/GRU_for_PyTorch/gru_1p.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/GRU_for_PyTorch/gru_8p.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | \ No newline at end of file diff --git a/PyTorch/built-in/nlp/LSTM_ID0468_for_PyTorch/public_address_statement.md b/PyTorch/built-in/nlp/LSTM_ID0468_for_PyTorch/public_address_statement.md index 961f005975b5b585b63a4054ba9c099c39a127f0..851ef66a6cdb8709c89fcea53b05cbde964f7f0a 100644 --- a/PyTorch/built-in/nlp/LSTM_ID0468_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/nlp/LSTM_ID0468_for_PyTorch/public_address_statement.md @@ -1,4 +1,3 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ---- | ------------ | ------ | ------------------------------------ | -------- | -| 开发引入| / |LSTM_ID0468_for_PyTorch/url.ini | https://github.com/kaldi-asr/kaldi.git |安装kaldi所需的开源社区git下载链接| -| 开发引入| / |LSTM_ID0468_for_PyTorch/url.ini | 90.88.145.42 |主机地址配置| \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|-----------------------------------------------------------------------------------------|--------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/built-in/nlp/LSTM_ID0468_for_PyTorch/NPU/8p/steps/train_ctc.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | \ No newline at end of file diff --git a/PyTorch/built-in/nlp/Longformer_for_PyTorch/public_address_statement.md b/PyTorch/built-in/nlp/Longformer_for_PyTorch/public_address_statement.md index 5c2921cc77cdd47ffcb5f16b7f365e492854ef70..5e29dc07b47e747260a8d896ba65a49c379c60fd 100644 --- a/PyTorch/built-in/nlp/Longformer_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/nlp/Longformer_for_PyTorch/public_address_statement.md @@ -1,52 +1,5 @@ - -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱 | 用途说明 | -|:------:|:-------------------------:|:---------------------------------------------------------------------------------------------:|:--------------------:|:-----------------:| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.25.1/examples/pytorch/language-modeling/run_mlm.py | ./run_mlm.py | http://www.apache.org/licenses/LICENSE-2.0 | apache.org/licenses引用的公网来源说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.25.1/examples/pytorch/language-modeling/run_mlm.py | ./run_mlm.py | https://huggingface.co/models?filter=fill-mask | checkpoints列表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.25.1/examples/pytorch/language-modeling/run_mlm.py | ./run_mlm.py | https://huggingface.co/datasets/ | 公共数据集地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.25.1/examples/pytorch/language-modeling/run_mlm.py | ./run_mlm.py | https://huggingface.co/docs/datasets/loading_datasets.html. 
| 加载数据集的指导 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.25.1/examples/pytorch/language-modeling/run_mlm.py | ./run_mlm.py | https://huggingface.co/docs/datasets/package_reference/main_classes.html#datasets.Dataset.map | multiprocessing的map方法指导 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/trainer_pt_utils.py | ./transformers_modify/trainer_pt_utils.py | http://www.apache.org/licenses/LICENSE-2.0 | apache.org/licenses引用的公网来源说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/trainer_pt_utils.py | ./transformers_modify/trainer_pt_utils.py | https://github.com/numpy/numpy/blob/a47ecdea856986cd60eabbd53265c2ca5916ad5d/doc/source/user/basics.types.rst | numpy1.21.4不支持bf16的说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/trainer_pt_utils.py | ./transformers_modify/trainer_pt_utils.py | https://github.com/pytorch/pytorch/issues/16266 | pytorch存在的issue | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/trainer.py | ./transformers_modify/trainer.py | http://www.apache.org/licenses/LICENSE-2.0 | apache.org/licenses引用的公网来源说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/trainer.py | ./transformers_modify/trainer.py | https://huggingface.co/docs/transformers/model_doc/auto | 适合训练的模型列表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/trainer.py | ./transformers_modify/trainer.py | https://github.com/huggingface/peft | 使用peft库适配器的说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/trainer.py | ./transformers_modify/trainer.py | https://www.github.com/nvidia/apex | 安装APEX的教程 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/trainer.py | ./transformers_modify/trainer.py | https://github.com/pytorch/torchdistx | 安装torchdistx的链接 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/trainer.py | ./transformers_modify/trainer.py | https://github.com/intel/intel-extension-for-pytorch | 安装IPEX的教程 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/trainer.py | ./transformers_modify/trainer.py | https://github.com/huggingface/transformers/pull/4659#issuecomment-643356021 | find_unused_parameters的说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/trainer.py | ./transformers_modify/trainer.py | https://github.com/pytorch/pytorch/issues/82963 | FSDP 错误的解决方法 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/trainer.py | ./transformers_modify/trainer.py | https://optuna.readthedocs.io/en/stable/reference/generated/optuna.study.create_study.html | optuna.create_study的说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/trainer.py | ./transformers_modify/trainer.py | https://docs.ray.io/en/latest/tune/api_docs/execution.html#tune-run | tune.run的说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/trainer.py | ./transformers_modify/trainer.py | https://app.sigopt.com/docs/endpoints/experiments/create | sigopt的说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/training_args.py | ./transformers_modify/training_args.py | http://www.apache.org/licenses/LICENSE-2.0 | apache.org/licenses引用的公网来源说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/training_args.py | ./transformers_modify/training_args.py | https://github.com/pytorch/xla/pull/3609 | torchrun支撑文档 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/training_args.py | ./transformers_modify/training_args.py | https://github.com/huggingface/optimum-neuron | 使用TrainiumTrainer的说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/training_args.py | ./transformers_modify/training_args.py | https://docs.python.org/3/library/argparse#module-argparse | 使用argparse的说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/training_args.py | ./transformers_modify/training_args.py | https://github.com/huggingface/transformers/tree/main/examples | 脚本参数说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/training_args.py | ./transformers_modify/training_args.py | https://www.tensorflow.org/tensorboard | TensorBoard使用说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/training_args.py | ./transformers_modify/training_args.py | https://huggingface.co/docs/safetensors | safetensor使用说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/training_args.py | ./transformers_modify/training_args.py | https://github.com/intel/intel-extension-for-pytorch | IPEX安装说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/training_args.py | ./transformers_modify/training_args.py | https://nvidia.github.io/apex/amp | Apex使用说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/training_args.py | ./transformers_modify/training_args.py | https://huggingface.co/docs/transformers/performance#tf32 | TF32模式使用说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/training_args.py | ./transformers_modify/training_args.py | https://www.wandb.com/ | wandb官网 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/training_args.py | ./transformers_modify/training_args.py | https://www.mlflow.org/ | mlflow官网 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/training_args.py | ./transformers_modify/training_args.py | https://github.com/facebookresearch/fairscale | FairScale使用说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/training_args.py | ./transformers_modify/training_args.py | https://github.com/pytorch/xla/blob/master/torch_xla/distributed/fsdp/xla_fully_sharded_data_parallel.py | xla选项说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/training_args.py | ./transformers_modify/training_args.py | https://github.com/microsoft/deepspeed | deepspeed使用说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/training_args.py | ./transformers_modify/training_args.py | https://github.com/huggingface/transformers/tree/main/examples | transformers训练示例脚本 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/training_args.py | ./transformers_modify/training_args.py | https://docs.ray.io/en/latest/tune/api_docs/analysis.html#ray.tune.ExperimentAnalysis.get_best_trial | Ray说明文档 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/training_args.py | 
./transformers_modify/training_args.py | https://pytorch.org/docs/stable/distributed.html#torch.distributed.init_process_group | torch.distributed.init_process_group说明文档 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/training_args.py | ./transformers_modify/training_args.py | https://pytorch.org/get-started/pytorch-2.0/ | torch.compile说明文档 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/training_args.py | ./transformers_modify/training_args.py | https://pytorch.org/docs/2.0/generated/torch.compile.html?highlight=torch+compile#torch.compile | torch.compile最好的默认配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/training_args.py | ./transformers_modify/training_args.py | https://nvidia.github.io/apex/amp.html | AMP optimization level说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/training_args.py | ./transformers_modify/training_args.py | https://github.com/huggingface/transformers/issues/10628 | 扩展output_dir的说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/training_args.py | ./transformers_modify/training_args.py | https://github.com/huggingface/safetensors! | Safetensors使用说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/training_args.py | ./transformers_modify/training_args.py | https://github.com/pytorch/pytorch/issues/82707 | 基于transformer的models的评价指标 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/models/longformer/modeling_longformer.py | ./transformers_modify/modeling_longformer.py | http://www.apache.org/licenses/LICENSE-2.0 | apache.org/licenses引用的公网来源说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/models/longformer/modeling_longformer.py | ./transformers_modify/modeling_longformer.py | https://huggingface.co/models?filter=longformer | longformer模型列表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/models/longformer/modeling_longformer.py | ./transformers_modify/modeling_longformer.py | https://pytorch.org/docs/stable/onnx.html#writes-sets | TORCH.ONNX文档 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/models/longformer/modeling_longformer.py | ./transformers_modify/modeling_longformer.py | https://github.com/pytorch/pytorch/pull/5617 | pytorch官方仓库issue,Add truncated normal | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/models/longformer/modeling_longformer.py | ./transformers_modify/modeling_longformer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | torch.nn文档 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/models/longformer/modeling_longformer.py | ./transformers_modify/modeling_longformer.py | https://arxiv.org/abs/2004.05150 | longformer论文 | +| 文件位置 | 公网地址 | 公网地址用途 | +|---------------------------------------------------------------------------------------------------------|-----------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Longformer_for_PyTorch/transformers_modify/modeling_longformer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Longformer_for_PyTorch/transformers_modify/modeling_longformer.py | https://arxiv.org/abs/2004.05150 | 论文地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/Longformer_for_PyTorch/transformers_modify/training_args.py | https://docs.ray.io/en/latest/tune/api_docs/analysis.html | 模型相关说明 | \ No newline at end of file diff --git a/PyTorch/built-in/nlp/M2M-100_for_PyTorch/public_address_statement.md b/PyTorch/built-in/nlp/M2M-100_for_PyTorch/public_address_statement.md index 84f90620e7968b5fd741156b293e80d925e67fbf..6a7aaa70624d3cf4bbccf7c3027a160721d2b54f 100644 --- a/PyTorch/built-in/nlp/M2M-100_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/nlp/M2M-100_for_PyTorch/public_address_statement.md @@ -1,351 +1,110 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ---- | ------------ | ------ | ------------------------------------ | -------- | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/.gitmodules|M2M-100_for_PyTorch/.gitmodules |https://github.com/ngoyal2707/Megatron-LM|Megatron-LM在开源社区中的url链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/.github/workflows/build.yml|M2M-100_for_PyTorch/.gitmodules/workflows/build.yml |https://download.pytorch.org/whl/torch_stable.html|requirements文件中torch_stable在开源社区中的html链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/setup.py|M2M-100_for_PyTorch/setup.py |https://download.pytorch.org/whl/cpu/torch-1.7.0%2Bcpu-cp36-cp36m-linux_x86_64.whl|setuptools的torch-cpu开源whl下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/hub_interface.py|M2M-100_for_PyTorch/setup.py |https://github.com/pytorch/fairseq|setuptools的fairseq开源下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/docs/conf.py|M2M-100_for_PyTorch/docs/conf.py |https://github.com/pytorch/fairseq/tree/master/docs|conf文件的fairseq开源引用链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/docs/conf.py|M2M-100_for_PyTorch/docs/conf.py |http://docs.scipy.org/doc/numpy|conf文件中的numpy开源链接引用| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/docs/conf.py|M2M-100_for_PyTorch/docs/conf.py |https://docs.python.org|conf文件中的python开源链接引用| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/docs/conf.py|M2M-100_for_PyTorch/docs/conf.py |https://pytorch.org/docs/master|conf文件中的torch开源链接引用| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-de-monolingual.sh|M2M-100_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh |http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2007.de.shuffled.gz|wmt14_news.2007.de.shuffled在开源社区中的.gz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-de-monolingual.sh|M2M-100_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh |http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2008.de.shuffled.gz|wmt14_news.2008.de.shuffled在开源社区中的.gz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-de-monolingual.sh|M2M-100_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh |http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2009.de.shuffled.gz|wmt14_news.2009.de.shuffled在开源社区中的.gz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-de-monolingual.sh|M2M-100_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh |http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2010.de.shuffled.gz|wmt14_news.2010.de.shuffled在开源社区中的.gz下载链接| -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-de-monolingual.sh|M2M-100_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh |http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2011.de.shuffled.gz|wmt14_news.2011.de.shuffled在开源社区中的.gz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-de-monolingual.sh|M2M-100_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh |http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2012.de.shuffled.gz|wmt14_news.2012.de.shuffled在开源社区中的.gz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-de-monolingual.sh|M2M-100_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh |http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2013.de.shuffled.gz|wmt15_news.2013.de.shuffled在开源社区中的.gz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-de-monolingual.sh|M2M-100_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh |http://www.statmt.org/wmt15/training-monolingual-news-crawl-v2/news.2014.de.shuffled.v2.gz|wmt14_news.2007.de.shuffled在开源社区中的.gz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-de-monolingual.sh|M2M-100_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh |http://data.statmt.org/wmt16/translation-task/news.2015.de.shuffled.gz|wmt16_news.2007.de.shuffled在开源社区中的.gz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-de-monolingual.sh|M2M-100_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh |http://data.statmt.org/wmt17/translation-task/news.2016.de.shuffled.gz|wmt17_news.2007.de.shuffled在开源社区中的.gz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-de-monolingual.sh|M2M-100_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh |http://data.statmt.org/wmt18/translation-task/news.2017.de.shuffled.deduped.gz|wmt18_news.2007.de.shuffled在开源社区中的.gz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh|M2M-100_for_PyTorch/examples/backtranslation/prepare-wmt18en2de.sh |https://github.com/moses-smt/mosesdecoder.git|mosesdecoder在开源社区中的git链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh|M2M-100_for_PyTorch/examples/backtranslation/prepare-wmt18en2de.sh |https://github.com/rsennrich/subword-nmt.git|subword-nmt在开源社区中的git链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh|M2M-100_for_PyTorch/examples/backtranslation/prepare-wmt18en2de.sh |http://statmt.org/wmt13/training-parallel-europarl-v7.tgz|wmt13_training-parallel-europarl-v7在开源社区的tgz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh|M2M-100_for_PyTorch/examples/backtranslation/prepare-wmt18en2de.sh |http://statmt.org/wmt13/training-parallel-commoncrawl.tgz|wmt13_training-parallel-commoncrawl在开源社区的tgz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|M2M-100_for_PyTorch/examples/backtranslation/prepare-wmt18en2de.sh 
|http://data.statmt.org/wmt18/translation-task/training-parallel-nc-v13.tgz|wmt18_training-parallel-nc-v13在开源社区的tgz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|M2M-100_for_PyTorch/examples/backtranslation/prepare-wmt18en2de.sh |http://data.statmt.org/wmt18/translation-task/rapid2016.tgz|wmt18_rapid2016在开源社区的tgz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2de.sh|M2M-100_for_PyTorch/examples/backtranslation/prepare-wmt18en2de.sh |http://data.statmt.org/wmt17/translation-task/dev.tgz|wmt17_translation-task_dev在开源社区的tgz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh|M2M-100_for_PyTorch/examples/backtranslation/prepare-wmt18en2de.sh |http://statmt.org/wmt14/test-full.tgz|wmt14_training-parallel-europarl-v7在开源社区的tgz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh|M2M-100_for_PyTorch/examples/backtranslation/sacrebleu.sh |https://github.com/rsennrich/subword-nmt.git|subword-nmt在开源社区中的git链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh|M2M-100_for_PyTorch/examples/backtranslation/tokenized_bleu.sh |https://github.com/rsennrich/subword-nmt.git|subword-nmt在开源社区中的git链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-iwslt17-multilingual.sh|M2M-100_for_PyTorch/examples/byte_level_bpe/get_data.sh |https://wit3.fbk.eu/archive/2017-01-trnted/texts/fr/en/fr-en.tgz|wit3.fbk.eu_fr-en在开源社区中的git链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/unsupervised_quality_estimation/README.md|M2M-100_for_PyTorch/examples/criss/download_and_preprocess_flores_test.sh |https://github.com/facebookresearch/flores|facebookresearch_flores在开源社区中的git链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_flores_data.sh|M2M-100_for_PyTorch/examples/criss/download_and_preprocess_flores_test.sh |https://github.com/facebookresearch/flores/raw/master/data/wikipedia_en_ne_si_test_sets.tgz|wikipedia_en_ne_si_test_sets在开源社区中的tgz链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/m2m_100/README.md|M2M-100_for_PyTorch/examples/criss/download_and_preprocess_tatoeba.sh |https://github.com/facebookresearch/LASER|facebookresearch_LASER在开源社区中的git链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/unsupervised_quality_estimation/README.md|M2M-100_for_PyTorch/examples/criss/unsupervised_mt/eval.sh |https://github.com/moses-smt/mosesdecoder|mosesdecoder在开源社区中的git链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh|M2M-100_for_PyTorch/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh |https://github.com/moses-smt/mosesdecoder.git|mosesdecoder在开源社区中的git链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh|M2M-100_for_PyTorch/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh |http://statmt.org/wmt13/training-parallel-europarl-v7.tgz|wmt13_training-parallel-europarl-v7在开源社区的tgz下载链接| -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh|M2M-100_for_PyTorch/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh |http://statmt.org/wmt13/training-parallel-commoncrawl.tgz|wmt13_training-parallel-commoncrawl在开源社区的tgz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|M2M-100_for_PyTorch/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh |http://data.statmt.org/wmt18/translation-task/training-parallel-nc-v13.tgz|wmt18_training-parallel-nc-v13在开源社区的tgz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py|M2M-100_for_PyTorch/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh |http://data.statmt.org/wmt18/translation-task/rapid2016.tgz|wmt18_rapid2016在开源社区的tgz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2de.sh|M2M-100_for_PyTorch/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh |http://data.statmt.org/wmt17/translation-task/dev.tgz|wmt17_translation-task_dev在开源社区的tgz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh|M2M-100_for_PyTorch/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh |http://statmt.org/wmt14/test-full.tgz|wmt14_test-full在开源社区的tgz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh|M2M-100_for_PyTorch/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh |https://github.com/glample/fastBPE.git|fastBPE在开源社区中的git链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/language_model/prepare-wikitext-103.sh|M2M-100_for_PyTorch/examples/language_model/prepare-wikitext-103.sh |https://s3.amazonaws.com/research.metamind.io/wikitext/wikitext-103-v1.zip|wikitext-103-v1在开源社区中的zip链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh|M2M-100_for_PyTorch/examples/m2m_100/install_dependecies.sh |https://github.com/moses-smt/mosesdecoder.git|mosesdecoder在开源社区中的git链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/m2m_100/install_dependecies.sh|M2M-100_for_PyTorch/examples/m2m_100/install_dependecies.sh |https://github.com/rsennrich/wmt16-scripts.git|wmt16-scripts在开源社区中的git链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/m2m_100/install_dependecies.sh|M2M-100_for_PyTorch/examples/m2m_100/install_dependecies.sh |https://github.com/neubig/kytea.git|neubig_kytea在开源社区中的git链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/m2m_100/install_dependecies.sh|M2M-100_for_PyTorch/examples/m2m_100/install_dependecies.sh |https://bitbucket.org/eunjeon/mecab-ko/downloads/mecab-0.996-ko-0.9.2.tar.gz|mecab-0.996-ko-0.9.2在开源社区中的tar.gz链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/m2m_100/install_dependecies.sh|M2M-100_for_PyTorch/examples/m2m_100/install_dependecies.sh |https://bitbucket.org/eunjeon/mecab-ko-dic/downloads/mecab-ko-dic-2.1.1-20180720.tar.gz|mecab-ko-dic-2.1.1-20180720在开源社区中的tar.gz链接| -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/main/examples/m2m_100/install_dependecies.sh|M2M-100_for_PyTorch/examples/m2m_100/install_dependecies.sh |https://github.com/anoopkunchukuttan/indic_nlp_resources.git|indic_nlp_resources在开源社区中的git链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/m2m_100/README.md|M2M-100_for_PyTorch/examples/m2m_100/install_dependecies.sh |http://lotus.kuee.kyoto-u.ac.jp/WAT/my-en-data/wat2020.my-en.zip|wat2020.my-en在开源社区中的zip链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/m2m_100/tokenizers/tokenizer_ar.sh|M2M-100_for_PyTorch/examples/m2m_100/tokenizers/tokenizer_ar.sh |http://alt.qcri.org/tools/arabic-normalizer/|安装提示-Arabic tools在开源社区中的url链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/data/encoders/gpt2_bpe.py|M2M-100_for_PyTorch/examples/roberta/preprocess_GLUE_tasks.sh |https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/encoder.json|gpt2_bpe默认encoder.json开源社区链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/data/encoders/gpt2_bpe.py|M2M-100_for_PyTorch/examples/roberta/preprocess_GLUE_tasks.sh |https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/vocab.bpe|gpt2_bpe默认vocab.bpe开源社区链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/womens_bios/README.md|M2M-100_for_PyTorch/examples/roberta/preprocess_GLUE_tasks.sh |https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/dict.txt|gpt2_bpe默认dic.txt开源社区链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/data/encoders/gpt2_bpe.py|M2M-100_for_PyTorch/examples/roberta/preprocess_RACE.sh |https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/encoder.json|gpt2_bpe默认encoder.json开源社区链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/data/encoders/gpt2_bpe.py|M2M-100_for_PyTorch/examples/roberta/preprocess_RACE.sh |https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/vocab.bpe|gpt2_bpe默认vocab.bpe开源社区链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/womens_bios/README.md|M2M-100_for_PyTorch/examples/roberta/preprocess_RACE.sh |https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/dict.txt|gpt2_bpe默认dic.txt开源社区链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/roberta/commonsense_qa/download_cqa_data.sh|M2M-100_for_PyTorch/examples/roberta/commonsense_qa/download_cqa_data.sh |https://s3.amazonaws.com/commensenseqa/train_rand_split.jsonl|commensenseqa_train_rand_split默认jsonl开源社区链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/roberta/commonsense_qa/download_cqa_data.sh|M2M-100_for_PyTorch/examples/roberta/commonsense_qa/download_cqa_data.sh |https://s3.amazonaws.com/commensenseqa/dev_rand_split.jsonl|commensenseqa_dev_rand_split默认jsonl开源社区链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/roberta/commonsense_qa/download_cqa_data.sh|M2M-100_for_PyTorch/examples/roberta/commonsense_qa/download_cqa_data.sh |https://s3.amazonaws.com/commensenseqa/test_rand_split_no_answers.jsonl|commensenseqa_test_rand_split_no_answers默认jsonl开源社区链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/womens_bios/README.md|M2M-100_for_PyTorch/examples/roberta/commonsense_qa/download_cqa_data.sh |https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/dict.txt|gpt2_bpe默认dic.txt开源社区链接配置| -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/main/examples/speech_to_text/prep_covost_data.py|M2M-100_for_PyTorch/examples/speech_to_text/prep_covost_data.py |https://dl.fbaipublicfiles.com/covost/|covost默认开源社区url链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh|M2M-100_for_PyTorch/examples/translation/prepare-iwslt14.sh |https://github.com/moses-smt/mosesdecoder.git|mosesdecoder在开源社区中的git链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh|M2M-100_for_PyTorch/examples/translation/prepare-iwslt14.sh |https://github.com/rsennrich/subword-nmt.git|subword-nmt在开源社区中的git链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/pay_less_attention_paper/README.md|M2M-100_for_PyTorch/examples/translation/prepare-iwslt14.sh |https://wit3.fbk.eu/archive/2014-01/texts/de/en/de-en.tgz|iwslt14数据集开源社区tgz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-iwslt17-multilingual.sh|M2M-100_for_PyTorch/examples/translation/prepare-iwslt17-multilingual.sh |https://wit3.fbk.eu/archive/2017-01-trnted/texts/de/en/de-en.tgz|wit3.fbk.eu_de-en数据集开源社区tgz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-iwslt17-multilingual.sh|M2M-100_for_PyTorch/examples/translation/prepare-iwslt17-multilingual.sh |https://wit3.fbk.eu/archive/2017-01-trnted/texts/fr/en/fr-en.tgz|wit3.fbk.eu_fr-en数据集开源社区tgz下载链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh|M2M-100_for_PyTorch/examples/translation/prepare-wmt14en2de.sh |https://github.com/moses-smt/mosesdecoder.git|mosesdecoder在开源社区中的git链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh|M2M-100_for_PyTorch/examples/translation/prepare-wmt14en2de.sh |https://github.com/rsennrich/subword-nmt.git|subword-nmt在开源社区中的git链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh|M2M-100_for_PyTorch/examples/translation/prepare-wmt14en2de.sh |http://statmt.org/wmt13/training-parallel-europarl-v7.tgz|wmt13数据集training-parallel-europarl-v7在开源社区的tgz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh|M2M-100_for_PyTorch/examples/translation/prepare-wmt14en2de.sh |http://statmt.org/wmt13/training-parallel-commoncrawl.tgz|wmt13数据集training-parallel-commoncrawl-v7在开源社区的tgz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2de.sh|M2M-100_for_PyTorch/examples/translation/prepare-wmt14en2de.sh |http://data.statmt.org/wmt17/translation-task/training-parallel-nc-v12.tgz|wmt17数据集training-parallel-nc-v12在开源社区的tgz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2de.sh|M2M-100_for_PyTorch/examples/translation/prepare-wmt14en2de.sh |http://data.statmt.org/wmt17/translation-task/dev.tgz|wmt17数据集translation-task_dev在开源社区的tgz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh|M2M-100_for_PyTorch/examples/translation/prepare-wmt14en2de.sh |http://statmt.org/wmt14/test-full.tgz|wmt14数据集test-full在开源社区的tgz下载链接| -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh|M2M-100_for_PyTorch/examples/translation/prepare-wmt14en2de.sh |http://statmt.org/wmt14/training-parallel-nc-v9.tgz|wmt14数据集training-parallel-nc-v9在开源社区的tgz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh|M2M-100_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh |https://github.com/moses-smt/mosesdecoder.git|mosesdecoder在开源社区中的git链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh|M2M-100_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh |https://github.com/rsennrich/subword-nmt.git|subword-nmt在开源社区中的git链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh|M2M-100_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh |http://statmt.org/wmt13/training-parallel-europarl-v7.tgz|wmt13数据集training-parallel-europarl-v7在开源社区的tgz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh|M2M-100_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh |http://statmt.org/wmt13/training-parallel-commoncrawl.tgz|wmt13数据集training-parallel-commoncrawl-v7在开源社区的tgz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh|M2M-100_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh |http://statmt.org/wmt13/training-parallel-un.tgz|wmt13数据集training-parallel-un在开源社区的tgz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh|M2M-100_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh |http://statmt.org/wmt14/training-parallel-nc-v9.tgz|wmt14数据集training-parallel-nc-v9在开源社区的tgz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh|M2M-100_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh |http://statmt.org/wmt10/training-giga-fren.tar|wmt10数据集training-giga-fren在开源社区的tgz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh|M2M-100_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh |http://statmt.org/wmt14/test-full.tgz|wmt14数据集test-full在开源社区的tgz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/data/encoders/gpt2_bpe.py|M2M-100_for_PyTorch/fairseq/data/encoders/gpt2_bpe.py |https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/encoder.json|gpt2_bpe默认encoder.json开源社区链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/data/encoders/gpt2_bpe.py|M2M-100_for_PyTorch/fairseq/data/encoders/gpt2_bpe.py |https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/vocab.bpe|gpt2_bpe默认vocab.bpe开源社区链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/fconv_self_att.py|M2M-100_for_PyTorch/fairseq/models/fconv_self_att.py |https://dl.fbaipublicfiles.com/fairseq/models/stories_checkpoint.tar.gz|fconv_self_att权重stories_checkpoint开源社区tar.gz链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/fconv_self_att.py|M2M-100_for_PyTorch/fairseq/models/fconv_self_att.py |https://dl.fbaipublicfiles.com/fairseq/models/stories_checkpoint.tar.gz|fconv_self_att权重stories_checkpoint开源社区tar.gz链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/fconv_self_att.py|M2M-100_for_PyTorch/fairseq/models/fconv_self_att.py 
|https://dl.fbaipublicfiles.com/fairseq/data/stories_test.tar.bz2|fconv_self_att数据集stories_test开源社区tar.bz2链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/fconv.py|M2M-100_for_PyTorch/fairseq/models/fconv.py |https://dl.fbaipublicfiles.com/fairseq/models/wmt14.v2.en-fr.fconv-py.tar.bz2|wmt14.v2.en-fr.fconv-py开源社区tar.bz2链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/fconv.py|M2M-100_for_PyTorch/fairseq/models/fconv.py |https://dl.fbaipublicfiles.com/fairseq/models/wmt14.en-de.fconv-py.tar.bz2|wmt14.en-de.fconv-py开源社区tar.bz2链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/fconv.py|M2M-100_for_PyTorch/fairseq/models/fconv.py |https://dl.fbaipublicfiles.com/fairseq/models/wmt17.v2.en-de.fconv-py.tar.bz2|wmt17.v2.en-de.fconv-py开源社区tar.bz2链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/lightconv.py|M2M-100_for_PyTorch/fairseq/models/lightconv.py |https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/iwslt14.de-en.lightconv.tar.gz|'lightconv.no_glu.iwslt14.de-en'在开源社区上的iwslt14.de-en.lightconv.tar.gz'的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/lightconv.py|M2M-100_for_PyTorch/fairseq/models/lightconv.py |https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/iwslt14.de-en.dynamicconv.tar.gz|'dynamicconv.no_glu.iwslt14.de-en'在开源社区上的iwslt14.de-en.dynamicconv.tar.gz'的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/lightconv.py|M2M-100_for_PyTorch/fairseq/models/lightconv.py |https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.lightconv.tar.gz|'lightconv.no_glu.wmt16.en-de'在开源社区上的wmt16.en-de.joined-dict.lightconv.tar.gz'的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/lightconv.py|M2M-100_for_PyTorch/fairseq/models/lightconv.py |https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.dynamicconv.tar.gz|'dynamicconv.no_glu.wmt16.en-de'在开源社区上的wmt16.en-de.joined-dict.dynamicconv.tar.gz'的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/lightconv.py|M2M-100_for_PyTorch/fairseq/models/lightconv.py |https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.lightconv-glu.tar.gz|'lightconv.glu.wmt16.en-de'在开源社区上的wmt16.en-de.joined-dict.lightconv-glu.tar.gz'的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/lightconv.py|M2M-100_for_PyTorch/fairseq/models/lightconv.py |https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.dynamicconv-glu.tar.gz|'dynamicconv.glu.wmt16.en-de'在开源社区上的wmt16.en-de.joined-dict.dynamicconv-glu.tar.gz'的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/lightconv.py|M2M-100_for_PyTorch/fairseq/models/lightconv.py |https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.lightconv-glu.tar.gz|'lightconv.glu.wmt17.en-de'在开源社区上的wmt16.en-de.joined-dict.lightconv-glu.tar.gz'的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/lightconv.py|M2M-100_for_PyTorch/fairseq/models/lightconv.py |https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.dynamicconv-glu.tar.gz|'dynamicconv.glu.wmt17.en-de'在开源社区上的wmt16.en-de.joined-dict.dynamicconv-glu.tar.gz'的下载链接| -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/lightconv.py|M2M-100_for_PyTorch/fairseq/models/lightconv.py |https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt14.en-fr.joined-dict.lightconv-glu.tar.gz|'lightconv.glu.wmt14.en-fr'在开源社区上的wmt14.en-fr.joined-dict.lightconv-glu.tar.gz'的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/lightconv.py|M2M-100_for_PyTorch/fairseq/models/lightconv.py |https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt14.en-fr.joined-dict.dynamicconv-glu.tar.gz|'dynamicconv.glu.wmt14.en-fr'在开源社区上的wmt14.en-fr.joined-dict.dynamicconv-glu.tar.gz'的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/lightconv.py|M2M-100_for_PyTorch/fairseq/models/lightconv.py |https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt17.zh-en.lightconv-glu.tar.gz|'lightconv.glu.wmt17.zh-en'在开源社区上的wmt17.zh-en.lightconv-glu.tar.gz'的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/lightconv.py|M2M-100_for_PyTorch/fairseq/models/lightconv.py |https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt17.zh-en.dynamicconv-glu.tar.gz|'dynamicconv.glu.wmt17.zh-en'在开源社区上的wmt17.zh-en.dynamicconv-glu.tar.gz'的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer_lm.py|M2M-100_for_PyTorch/fairseq/models/transformer_lm.py |https://dl.fbaipublicfiles.com/fairseq/models/lm/adaptive_lm_gbw_huge.tar.bz2|"transformer_lm.gbw.adaptive_huge"在开源社区上的adaptive_lm_gbw_huge.tar.bz2的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer_lm.py|M2M-100_for_PyTorch/fairseq/models/transformer_lm.py |https://dl.fbaipublicfiles.com/fairseq/models/lm/adaptive_lm_wiki103.v2.tar.bz2|"transformer_lm.wiki103.adaptive"在开源社区上的adaptive_lm_wiki103.v2.tar.bz2的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer_lm.py|M2M-100_for_PyTorch/fairseq/models/transformer_lm.py |https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.en.tar.bz2|wmt19.en开源社区tar.bz2链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer_lm.py|M2M-100_for_PyTorch/fairseq/models/transformer_lm.py |https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.de.tar.bz2|wmt19.de开源社区tar.bz2链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer_lm.py|M2M-100_for_PyTorch/fairseq/models/transformer_lm.py |https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.ru.tar.bz2|wmt19.ru开源社区tar.bz2链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer/transformer_legacy.py|M2M-100_for_PyTorch/fairseq/models/transformer.py |https://dl.fbaipublicfiles.com/fairseq/models/wmt14.en-fr.joined-dict.transformer.tar.bz2|'transformer.wmt14.en-fr'在开源社区上的wmt14.en-fr.joined-dict.transformer.tar.bz2'的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer/transformer_legacy.py|M2M-100_for_PyTorch/fairseq/models/transformer.py |https://dl.fbaipublicfiles.com/fairseq/models/wmt16.en-de.joined-dict.transformer.tar.bz2|'transformer.wmt16.en-de'在开源社区上的wmt16.en-de.joined-dict.transformer.tar.bz2的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer/transformer_legacy.py|M2M-100_for_PyTorch/fairseq/models/transformer.py 
|https://dl.fbaipublicfiles.com/fairseq/models/wmt18.en-de.ensemble.tar.gz|'transformer.wmt18.en-de'在开源社区上的wmt18.en-de.ensemble.tar.gz'的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer/transformer_legacy.py|M2M-100_for_PyTorch/fairseq/models/transformer.py |https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-de.joined-dict.ensemble.tar.gz|'transformer.wmt19.en-de'在开源社区上的wmt19.en-de.joined-dict.ensemble.tar.gz'的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer/transformer_legacy.py|M2M-100_for_PyTorch/fairseq/models/transformer.py |https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-ru.ensemble.tar.gz|'transformer.wmt19.en-ru'在开源社区上的wmt19.en-ru.ensemble.tar.gz'的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer/transformer_legacy.py|M2M-100_for_PyTorch/fairseq/models/transformer.py |https://dl.fbaipublicfiles.com/fairseq/models/wmt19.de-en.joined-dict.ensemble.tar.gz|'transformer.wmt19.de-en'在开源社区上的wmt19.de-en.joined-dict.ensemble.tar.gz'的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer/transformer_legacy.py|M2M-100_for_PyTorch/fairseq/models/transformer.py |https://dl.fbaipublicfiles.com/fairseq/models/wmt19.ru-en.ensemble.tar.gz|'transformer.wmt19.ru-en'在开源社区上的wmt19.ru-en.ensemble.tar.gz'的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer/transformer_legacy.py|M2M-100_for_PyTorch/fairseq/models/transformer.py |https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-de.joined-dict.single_model.tar.gz|'transformer.wmt19.en-de.single_model'在开源社区上的wmt19.en-de.joined-dict.single_model.tar.gz'的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer/transformer_legacy.py|M2M-100_for_PyTorch/fairseq/models/transformer.py |https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-ru.single_model.tar.gz|'transformer.wmt19.en-ru.single_model'在开源社区上的wmt19.en-ru.single_model.tar.gz'的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer/transformer_legacy.py|M2M-100_for_PyTorch/fairseq/models/transformer.py |https://dl.fbaipublicfiles.com/fairseq/models/wmt19.de-en.joined-dict.single_model.tar.gz|'transformer.wmt19.de-en.single_model'在开源社区上的wmt19.de-en.joined-dict.single_model.tar.gz'的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer/transformer_legacy.py|M2M-100_for_PyTorch/fairseq/models/transformer.py |https://dl.fbaipublicfiles.com/fairseq/models/wmt19.ru-en.single_model.tar.gz|'transformer.wmt19.ru-en.single_model'在开源社区上的wmt19.ru-en.single_model.tar.gz'的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/bart/model.py|M2M-100_for_PyTorch/fairseq/models/bart/model.py |http://dl.fbaipublicfiles.com/fairseq/models/bart.base.tar.gz|"bart.base"在开源社区上的bart.base.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/bart/model.py|M2M-100_for_PyTorch/fairseq/models/bart/model.py |http://dl.fbaipublicfiles.com/fairseq/models/bart.large.tar.gz|"bart.large"在开源社区上的bart.large.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/bart/model.py|M2M-100_for_PyTorch/fairseq/models/bart/model.py |http://dl.fbaipublicfiles.com/fairseq/models/bart.large.mnli.tar.gz|"bart.large.mnli"在开源社区上的bart.large.mnli.tar.gz的下载链接| -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/bart/model.py|M2M-100_for_PyTorch/fairseq/models/bart/model.py |http://dl.fbaipublicfiles.com/fairseq/models/bart.large.cnn.tar.gz|"bart.large.cnn"在开源社区上的bart.large.cnn.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/bart/model.py|M2M-100_for_PyTorch/fairseq/models/bart/model.py |http://dl.fbaipublicfiles.com/fairseq/models/bart.large.xsum.tar.gz|"bart.large.xsum"在开源社区上的bart.large.xsum.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_camembert.py|M2M-100_for_PyTorch/fairseq/models/roberta/model_camembert.py |http://dl.fbaipublicfiles.com/fairseq/models/camembert-base.tar.gz|"camembert"在开源社区上的camembert-base.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_camembert.py|M2M-100_for_PyTorch/fairseq/models/roberta/model_camembert.py |http://dl.fbaipublicfiles.com/fairseq/models/camembert-base.tar.gz|"camembert.v0"在开源社区上的camembert-base.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_camembert.py|M2M-100_for_PyTorch/fairseq/models/roberta/model_camembert.py |http://dl.fbaipublicfiles.com/fairseq/models/camembert-base.tar.gz|"camembert-base"在开源社区上的camembert-base.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_camembert.py|M2M-100_for_PyTorch/fairseq/models/roberta/model_camembert.py |http://dl.fbaipublicfiles.com/fairseq/models/camembert-large.tar.gz|"camembert-large"在开源社区上的camembert-large.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_camembert.py|M2M-100_for_PyTorch/fairseq/models/roberta/model_camembert.py |http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-ccnet.tar.gz|"camembert-base-ccnet"在开源社区上的camembert-base-ccnet.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_camembert.py|M2M-100_for_PyTorch/fairseq/models/roberta/model_camembert.py |http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-ccnet-4gb.tar.gz|"camembert-base-ccnet-4gb"在开源社区上的camembert-base-ccnet-4gb.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_camembert.py|M2M-100_for_PyTorch/fairseq/models/roberta/model_camembert.py |http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-wikipedia-4gb.tar.gz|"camembert-base-wikipedia-4gb"在开源社区上的camembert-base-wikipedia-4gb.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_camembert.py|M2M-100_for_PyTorch/fairseq/models/roberta/model_camembert.py |http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-oscar-4gb.tar.gz|"camembert-base-oscar-4gb"在开源社区上的camembert-base-oscar-4gb.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_xlmr.py|M2M-100_for_PyTorch/fairseq/models/roberta/model_xlmr.py |http://dl.fbaipublicfiles.com/fairseq/models/xlmr.base.tar.gz|"xlmr.base"在开源社区上的xlmr.base.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_xlmr.py|M2M-100_for_PyTorch/fairseq/models/roberta/model_xlmr.py |http://dl.fbaipublicfiles.com/fairseq/models/xlmr.large.tar.gz|"xlmr.large"在开源社区上的xlmr.large.tar.gz的下载链接| -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model.py|M2M-100_for_PyTorch/fairseq/models/roberta/model.py |http://dl.fbaipublicfiles.com/fairseq/models/roberta.base.tar.gz|"roberta.base"在开源社区上的roberta.base.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model.py|M2M-100_for_PyTorch/fairseq/models/roberta/model.py |http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.tar.gz|"roberta.large"在开源社区上的roberta.large.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model.py|M2M-100_for_PyTorch/fairseq/models/roberta/model.py |http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.mnli.tar.gz|"roberta.large.mnli"在开源社区上的roberta.large.mnli.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model.py|M2M-100_for_PyTorch/fairseq/models/roberta/model.py |http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.wsc.tar.gz|"roberta.large.wsc"在开源社区上的roberta.large.wsc.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/constrained_decoding/README.md | M2M-100_for_PyTorch/setup.py | https://github.com/pytorch/fairseq | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/scripts/build_sym_alignment.py | M2M-100_for_PyTorch/scripts/build_sym_alignment.py | http://github.com/clab/fast_align | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/scripts/build_sym_alignment.py | M2M-100_for_PyTorch/scripts/build_sym_alignment.py | http://github.com/moses-smt/mosesdecoder | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/scripts/build_sym_alignment.py | M2M-100_for_PyTorch/scripts/build_sym_alignment.py | http://www.statmt.org/moses/?n=Development.GetStarted | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/nonautoregressive_translation/README.md | M2M-100_for_PyTorch/fairseq/tasks/translation_lev.py | https://arxiv.org/abs/1905.11006 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/tasks/translation_lev.py | M2M-100_for_PyTorch/fairseq/tasks/translation_lev.py | https://www.aclweb.org/anthology/2020.acl-main.325/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/cross_lingual_language_model/README.md | M2M-100_for_PyTorch/fairseq/tasks/cross_lingual_lm.py | https://arxiv.org/pdf/1901.07291.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/constrained_decoding/README.md | M2M-100_for_PyTorch/fairseq/search.py | https://www.aclweb.org/anthology/N18-1119/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/constrained_decoding/README.md | M2M-100_for_PyTorch/fairseq/search.py | https://www.aclweb.org/anthology/N19-1090/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_dlm/sequence_generator/multichannel_search.py | M2M-100_for_PyTorch/fairseq/search.py | https://arxiv.org/abs/1904.09751 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/search.py | M2M-100_for_PyTorch/fairseq/search.py | https://arxiv.org/abs/1611.08562 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/scaling_nmt/README.md | M2M-100_for_PyTorch/fairseq/scoring/tokenizer.py | https://github.com/mjpost/sacrebleu | 源码实现 | -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/main/fairseq/optim/lr_scheduler/triangular_lr_scheduler.py | M2M-100_for_PyTorch/fairseq/optim/lr_scheduler/triangular_lr_scheduler.py | https://arxiv.org/pdf/1506.01186.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/optim/lr_scheduler/tri_stage_lr_scheduler.py | M2M-100_for_PyTorch/fairseq/optim/lr_scheduler/tri_stage_lr_scheduler.py | https://arxiv.org/pdf/1904.08779.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/optim/lr_scheduler/cosine_lr_scheduler.py | M2M-100_for_PyTorch/fairseq/optim/lr_scheduler/cosine_lr_scheduler.py | https://arxiv.org/pdf/1608.03983.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/optim/adam.py | M2M-100_for_PyTorch/fairseq/optim/fused_adam.py | https://arxiv.org/abs/1412.6980 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/optim/adam.py | M2M-100_for_PyTorch/fairseq/optim/fused_adam.py | https://openreview.net/forum?id=ryQu7f-RZ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/optim/bmuf.py | M2M-100_for_PyTorch/fairseq/optim/bmuf.py | https://ieeexplore.ieee.org/document/7472805 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/optim/adam.py | M2M-100_for_PyTorch/fairseq/optim/adamax.py | https://arxiv.org/abs/1412.6980 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/optim/adam.py | M2M-100_for_PyTorch/fairseq/optim/adam.py | https://arxiv.org/abs/1711.05101 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/optim/adam.py | M2M-100_for_PyTorch/fairseq/optim/adam.py | https://arxiv.org/abs/1412.6980 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/optim/adam.py | M2M-100_for_PyTorch/fairseq/optim/adam.py | https://openreview.net/forum?id=ryQu7f-RZ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/optim/adafactor.py | M2M-100_for_PyTorch/fairseq/optim/adafactor.py | https://arxiv.org/abs/1804.04235 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/modules/vggblock.py | M2M-100_for_PyTorch/fairseq/modules/vggblock.py | https://arxiv.org/pdf/1409.1556.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/modules/sparse_multihead_attention.py | M2M-100_for_PyTorch/fairseq/modules/sparse_multihead_attention.py | https://arxiv.org/pdf/1904.10509.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/layerdrop/README.md | M2M-100_for_PyTorch/fairseq/modules/layer_drop.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/modules/gelu.py | M2M-100_for_PyTorch/fairseq/modules/gelu.py | https://github.com/hendrycks/GELUs | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/nonautoregressive_translation/README.md | M2M-100_for_PyTorch/fairseq/modules/dynamic_crf_layer.py | https://arxiv.org/abs/1910.11555 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/modules/dynamic_crf_layer.py | M2M-100_for_PyTorch/fairseq/modules/dynamic_crf_layer.py | https://github.com/kmkurn/pytorch-crf/blob/master/torchcrf/__init__.py | 源码实现 | -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/main/fairseq/modules/character_token_embedder.py | M2M-100_for_PyTorch/fairseq/modules/character_token_embedder.py | https://arxiv.org/abs/1505.00387 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/modules/adaptive_softmax.py | M2M-100_for_PyTorch/fairseq/modules/adaptive_softmax.py | http://arxiv.org/abs/1609.04309 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/language_model/README.adaptive_inputs.md | M2M-100_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/adaptive_lm_gbw_huge.tar.bz2 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/language_model/README.adaptive_inputs.md | M2M-100_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/adaptive_lm_wiki103.v2.tar.bz2 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer_lm.py | M2M-100_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.en.tar.bz2 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer_lm.py | M2M-100_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.de.tar.bz2 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer_lm.py | M2M-100_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.ru.tar.bz2 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/pointer_generator/pointer_generator_src/transformer_pg.py | M2M-100_for_PyTorch/fairseq/models/transformer.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/README.md | M2M-100_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt14.en-fr.joined-dict.transformer.tar.bz2 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/README.md | M2M-100_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt16.en-de.joined-dict.transformer.tar.bz2 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/README.md | M2M-100_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt18.en-de.ensemble.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/wmt19/README.md | M2M-100_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-de.joined-dict.ensemble.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/wmt19/README.md | M2M-100_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-ru.ensemble.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/wmt19/README.md | M2M-100_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.de-en.joined-dict.ensemble.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/wmt19/README.md | M2M-100_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.ru-en.ensemble.tar.gz | 
模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer/transformer_legacy.py | M2M-100_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-de.joined-dict.single_model.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer/transformer_legacy.py | M2M-100_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-ru.single_model.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer/transformer_legacy.py | M2M-100_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.de-en.joined-dict.single_model.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer/transformer_legacy.py | M2M-100_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.ru-en.single_model.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/modules/convolution.py | M2M-100_for_PyTorch/fairseq/models/speech_to_text/s2t_transformer.py | https://arxiv.org/abs/1911.08460 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/pointer_generator/pointer_generator_src/transformer_pg.py | M2M-100_for_PyTorch/fairseq/models/speech_to_text/s2t_transformer.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/berard.py | M2M-100_for_PyTorch/fairseq/models/speech_to_text/berard.py | https://arxiv.org/abs/1802.04200 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/berard.py | M2M-100_for_PyTorch/fairseq/models/speech_to_text/berard.py | https://github.com/eske/seq2seq | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/berard.py | M2M-100_for_PyTorch/fairseq/models/speech_to_text/berard.py | https://github.com/eske/seq2seq/blob/master/config/LibriSpeech/AST.yaml | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/berard.py | M2M-100_for_PyTorch/fairseq/models/speech_to_text/berard.py | https://github.com/eske/seq2seq/blob/master/translate/models.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/berard.py | M2M-100_for_PyTorch/fairseq/models/speech_to_text/berard.py | https://arxiv.org/abs/1409.0473 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/speech_to_text/README.md | M2M-100_for_PyTorch/fairseq/models/speech_to_text/berard.py | https://arxiv.org/abs/1909.06515 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/berard.py | M2M-100_for_PyTorch/fairseq/models/speech_to_text/berard.py | https://arxiv.org/pdf/2002.01320.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/speech_to_text/README.md | M2M-100_for_PyTorch/fairseq/models/speech_to_text/berard.py | https://arxiv.org/abs/2006.12124 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_xlmr.py | M2M-100_for_PyTorch/fairseq/models/roberta/model_xlmr.py | http://dl.fbaipublicfiles.com/fairseq/models/xlmr.base.tar.gz | 模型相关说明 | 
-| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_xlmr.py | M2M-100_for_PyTorch/fairseq/models/roberta/model_xlmr.py | http://dl.fbaipublicfiles.com/fairseq/models/xlmr.large.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_camembert.py | M2M-100_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_camembert.py | M2M-100_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-large.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_camembert.py | M2M-100_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-ccnet.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_camembert.py | M2M-100_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-ccnet-4gb.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_camembert.py | M2M-100_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-wikipedia-4gb.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_camembert.py | M2M-100_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-oscar-4gb.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model.py | M2M-100_for_PyTorch/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.base.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model.py | M2M-100_for_PyTorch/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model.py | M2M-100_for_PyTorch/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.mnli.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model.py | M2M-100_for_PyTorch/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.wsc.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/hub_interface.py | M2M-100_for_PyTorch/fairseq/models/roberta/hub_interface.py | https://github.com/pytorch/fairseq/issues/1306 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/lightconv.py | M2M-100_for_PyTorch/fairseq/models/lightconv.py | https://openreview.net/pdf?id=SkVhlh09tX | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/pay_less_attention_paper/README.md | M2M-100_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/iwslt14.de-en.lightconv.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/pay_less_attention_paper/README.md | 
M2M-100_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/iwslt14.de-en.dynamicconv.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/pay_less_attention_paper/README.md | M2M-100_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.lightconv.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/pay_less_attention_paper/README.md | M2M-100_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.dynamicconv.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/pay_less_attention_paper/README.md | M2M-100_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.lightconv-glu.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/pay_less_attention_paper/README.md | M2M-100_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.dynamicconv-glu.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/pay_less_attention_paper/README.md | M2M-100_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt14.en-fr.joined-dict.lightconv-glu.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/pay_less_attention_paper/README.md | M2M-100_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt14.en-fr.joined-dict.dynamicconv-glu.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/pay_less_attention_paper/README.md | M2M-100_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt17.zh-en.lightconv-glu.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/pay_less_attention_paper/README.md | M2M-100_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt17.zh-en.dynamicconv-glu.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/fconv_self_att.py | M2M-100_for_PyTorch/fairseq/models/fconv_self_att.py | https://dl.fbaipublicfiles.com/fairseq/models/stories_checkpoint.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/stories/README.md | M2M-100_for_PyTorch/fairseq/models/fconv_self_att.py | https://dl.fbaipublicfiles.com/fairseq/data/stories_test.tar.bz2 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/conv_seq2seq/README.md | M2M-100_for_PyTorch/fairseq/models/fconv.py | https://arxiv.org/abs/1705.03122 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/conv_seq2seq/README.md | M2M-100_for_PyTorch/fairseq/models/fconv.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt14.v2.en-fr.fconv-py.tar.bz2 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/conv_seq2seq/README.md | M2M-100_for_PyTorch/fairseq/models/fconv.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt14.en-de.fconv-py.tar.bz2 | 模型相关说明 | -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/main/examples/conv_seq2seq/README.md | M2M-100_for_PyTorch/fairseq/models/fconv.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt17.v2.en-de.fconv-py.tar.bz2 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/fairseq_incremental_decoder.py | M2M-100_for_PyTorch/fairseq/models/fairseq_incremental_decoder.py | http://www.telesens.co/2019/04/21/understanding-incremental-decoding-in-fairseq/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/bart/model.py | M2M-100_for_PyTorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.base.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/bart/model.py | M2M-100_for_PyTorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/bart/model.py | M2M-100_for_PyTorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.mnli.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/bart/model.py | M2M-100_for_PyTorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.cnn.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/bart/model.py | M2M-100_for_PyTorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.xsum.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/megatron_11b/README.md | M2M-100_for_PyTorch/fairseq/model_parallel/modules/transformer_layer.py | https://arxiv.org/pdf/1909.08053.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/megatron_11b/README.md | M2M-100_for_PyTorch/fairseq/model_parallel/modules/multihead_attention.py | https://arxiv.org/pdf/1909.08053.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/file_utils.py | M2M-100_for_PyTorch/fairseq/file_utils.py | https://github.com/allenai/allennlp | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/roberta/README.md | M2M-100_for_PyTorch/fairseq/file_utils.py | https://github.com/huggingface | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/data/encoders/gpt2_bpe_utils.py | M2M-100_for_PyTorch/fairseq/data/encoders/gpt2_bpe_utils.py | https://github.com/openai/gpt-2/blob/master/src/encoder.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/roberta/preprocess_GLUE_tasks.sh | M2M-100_for_PyTorch/fairseq/data/encoders/gpt2_bpe.py | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/encoder.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/roberta/preprocess_GLUE_tasks.sh | M2M-100_for_PyTorch/fairseq/data/encoders/gpt2_bpe.py | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/vocab.bpe | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/data/audio/speech_to_text_dataset.py | M2M-100_for_PyTorch/fairseq/data/audio/speech_to_text_dataset.py | https://arxiv.org/abs/1907.05019 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/data/audio/feature_transforms/specaugment.py | M2M-100_for_PyTorch/fairseq/data/audio/feature_transforms/specaugment.py 
| https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/modules/adaptive_softmax.py | M2M-100_for_PyTorch/fairseq/criterions/adaptive_loss.py | http://arxiv.org/abs/1609.04309 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/clib/libnat_cuda/binding.cpp | M2M-100_for_PyTorch/fairseq/clib/libnat_cuda/binding.cpp | https://github.com/1ytic/pytorch-edit-distance | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/layerdrop/README.md | M2M-100_for_PyTorch/fairseq/checkpoint_utils.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation_moe/README.md | M2M-100_for_PyTorch/examples/translation_moe/translation_moe_src/translation_moe.py | https://arxiv.org/abs/1902.07816 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation_moe/README.md | M2M-100_for_PyTorch/examples/translation_moe/translation_moe_src/logsumexp_moe.py | https://arxiv.org/abs/1902.07816 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation_moe/README.md | M2M-100_for_PyTorch/examples/translation_moe/score.py | https://arxiv.org/abs/1902.07816 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-wmt18en2de.sh | M2M-100_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh | https://github.com/facebookresearch/MIXER/blob/master/prepareData.sh | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/fast_noisy_channel/README.md | M2M-100_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh | https://github.com/moses-smt/mosesdecoder.git | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-wmt18en2de.sh | M2M-100_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh | https://github.com/rsennrich/subword-nmt.git | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | M2M-100_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh | http://statmt.org/wmt13/training-parallel-europarl-v7.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | M2M-100_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh | http://statmt.org/wmt13/training-parallel-commoncrawl.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh | M2M-100_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh | http://statmt.org/wmt13/training-parallel-un.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2de.sh | M2M-100_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh | http://statmt.org/wmt14/training-parallel-nc-v9.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2fr.sh | M2M-100_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh | http://statmt.org/wmt10/training-giga-fren.tar | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | 
M2M-100_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh | http://statmt.org/wmt14/test-full.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-wmt18en2de.sh | M2M-100_for_PyTorch/examples/translation/prepare-wmt14en2de.sh | https://github.com/facebookresearch/MIXER/blob/master/prepareData.sh | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/fast_noisy_channel/README.md | M2M-100_for_PyTorch/examples/translation/prepare-wmt14en2de.sh | https://github.com/moses-smt/mosesdecoder.git | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-wmt18en2de.sh | M2M-100_for_PyTorch/examples/translation/prepare-wmt14en2de.sh | https://github.com/rsennrich/subword-nmt.git | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | M2M-100_for_PyTorch/examples/translation/prepare-wmt14en2de.sh | http://statmt.org/wmt13/training-parallel-europarl-v7.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | M2M-100_for_PyTorch/examples/translation/prepare-wmt14en2de.sh | http://statmt.org/wmt13/training-parallel-commoncrawl.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_wmt19_and_before.py | M2M-100_for_PyTorch/examples/translation/prepare-wmt14en2de.sh | http://data.statmt.org/wmt17/translation-task/training-parallel-nc-v12.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | M2M-100_for_PyTorch/examples/translation/prepare-wmt14en2de.sh | http://data.statmt.org/wmt17/translation-task/dev.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | M2M-100_for_PyTorch/examples/translation/prepare-wmt14en2de.sh | http://statmt.org/wmt14/test-full.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/conv_seq2seq/README.md | M2M-100_for_PyTorch/examples/translation/prepare-wmt14en2de.sh | https://arxiv.org/abs/1705.03122 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-wmt14en2de.sh | M2M-100_for_PyTorch/examples/translation/prepare-wmt14en2de.sh | http://statmt.org/wmt14/training-parallel-nc-v9.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/prepare-iwslt17-multilingual.sh | M2M-100_for_PyTorch/examples/translation/prepare-iwslt17-multilingual.sh | https://wit3.fbk.eu/archive/2017-01-trnted/texts/de/en/de-en.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/byte_level_bpe/get_data.sh | M2M-100_for_PyTorch/examples/translation/prepare-iwslt17-multilingual.sh | https://wit3.fbk.eu/archive/2017-01-trnted/texts/fr/en/fr-en.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-wmt18en2de.sh | M2M-100_for_PyTorch/examples/translation/prepare-iwslt14.sh | https://github.com/facebookresearch/MIXER/blob/master/prepareData.sh | 源码实现 | -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/main/examples/fast_noisy_channel/README.md | M2M-100_for_PyTorch/examples/translation/prepare-iwslt14.sh | https://github.com/moses-smt/mosesdecoder.git | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-wmt18en2de.sh | M2M-100_for_PyTorch/examples/translation/prepare-iwslt14.sh | https://github.com/rsennrich/subword-nmt.git | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/pay_less_attention_paper/README.md | M2M-100_for_PyTorch/examples/translation/prepare-iwslt14.sh | https://wit3.fbk.eu/archive/2014-01/texts/de/en/de-en.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/wav2vec/xlsr/README.md | M2M-100_for_PyTorch/examples/speech_to_text/prep_covost_data.py | https://github.com/facebookresearch/covost | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/speech_to_text/prep_covost_data.py | M2M-100_for_PyTorch/examples/speech_to_text/prep_covost_data.py | https://dl.fbaipublicfiles.com/covost/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/speech_recognition/models/vggtransformer.py | M2M-100_for_PyTorch/examples/speech_recognition/models/vggtransformer.py | https://arxiv.org/abs/1904.11660 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/simultaneous_translation/utils/monotonic_attention.py | M2M-100_for_PyTorch/examples/simultaneous_translation/utils/latency.py | https://arxiv.org/abs/1906.05218 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/simultaneous_translation/utils/functions.py | M2M-100_for_PyTorch/examples/simultaneous_translation/utils/functions.py | https://arxiv.org/pdf/1712.05382.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/roberta/preprocess_GLUE_tasks.sh | M2M-100_for_PyTorch/examples/roberta/preprocess_RACE.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/encoder.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/roberta/preprocess_GLUE_tasks.sh | M2M-100_for_PyTorch/examples/roberta/preprocess_RACE.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/vocab.bpe | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/roberta/preprocess_GLUE_tasks.sh | M2M-100_for_PyTorch/examples/roberta/preprocess_RACE.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/dict.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/roberta/preprocess_GLUE_tasks.sh | M2M-100_for_PyTorch/examples/roberta/preprocess_GLUE_tasks.sh | https://gist.github.com/W4ngatang/60c2bdb54d156a41194446737ce03e2e | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/roberta/preprocess_GLUE_tasks.sh | M2M-100_for_PyTorch/examples/roberta/preprocess_GLUE_tasks.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/encoder.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/roberta/preprocess_GLUE_tasks.sh | M2M-100_for_PyTorch/examples/roberta/preprocess_GLUE_tasks.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/vocab.bpe | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/roberta/preprocess_GLUE_tasks.sh | M2M-100_for_PyTorch/examples/roberta/preprocess_GLUE_tasks.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/dict.txt | 
模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/roberta/preprocess_GLUE_tasks.sh | M2M-100_for_PyTorch/examples/roberta/multiprocessing_bpe_encoder.py | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/encoder.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/roberta/preprocess_GLUE_tasks.sh | M2M-100_for_PyTorch/examples/roberta/multiprocessing_bpe_encoder.py | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/vocab.bpe | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/roberta/commonsense_qa/download_cqa_data.sh | M2M-100_for_PyTorch/examples/roberta/commonsense_qa/download_cqa_data.sh | https://s3.amazonaws.com/commensenseqa/train_rand_split.jsonl | 模型参数相关配置 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/roberta/commonsense_qa/download_cqa_data.sh | M2M-100_for_PyTorch/examples/roberta/commonsense_qa/download_cqa_data.sh | https://s3.amazonaws.com/commensenseqa/dev_rand_split.jsonl | 模型参数相关配置 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/roberta/commonsense_qa/download_cqa_data.sh | M2M-100_for_PyTorch/examples/roberta/commonsense_qa/download_cqa_data.sh | https://s3.amazonaws.com/commensenseqa/test_rand_split_no_answers.jsonl | 模型参数相关配置 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/roberta/preprocess_GLUE_tasks.sh | M2M-100_for_PyTorch/examples/roberta/commonsense_qa/download_cqa_data.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/dict.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/pointer_generator/pointer_generator_src/transformer_pg.py | M2M-100_for_PyTorch/examples/pointer_generator/pointer_generator_src/transformer_pg.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/pointer_generator/README.md | M2M-100_for_PyTorch/examples/pointer_generator/pointer_generator_src/transformer_pg.py | https://arxiv.org/abs/1704.04368 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/m2m_100/tokenizers/tokenizer_ar.sh | M2M-100_for_PyTorch/examples/m2m_100/tokenizers/tokenizer_ar.sh | http://alt.qcri.org/tools/arabic-normalizer/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/fast_noisy_channel/README.md | M2M-100_for_PyTorch/examples/m2m_100/install_dependecies.sh | https://github.com/moses-smt/mosesdecoder.git | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/m2m_100/install_dependecies.sh | M2M-100_for_PyTorch/examples/m2m_100/install_dependecies.sh | https://github.com/rsennrich/wmt16-scripts.git | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/m2m_100/install_dependecies.sh | M2M-100_for_PyTorch/examples/m2m_100/install_dependecies.sh | https://github.com/neubig/kytea.git | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/m2m_100/install_dependecies.sh | M2M-100_for_PyTorch/examples/m2m_100/install_dependecies.sh | https://bitbucket.org/eunjeon/mecab-ko/downloads/mecab-0.996-ko-0.9.2.tar.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/m2m_100/install_dependecies.sh | M2M-100_for_PyTorch/examples/m2m_100/install_dependecies.sh | https://bitbucket.org/eunjeon/mecab-ko-dic/downloads/mecab-ko-dic-2.1.1-20180720.tar.gz | 模型相关说明 | -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/main/examples/m2m_100/install_dependecies.sh | M2M-100_for_PyTorch/examples/m2m_100/install_dependecies.sh | https://github.com/anoopkunchukuttan/indic_nlp_resources.git | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/m2m_100/install_dependecies.sh | M2M-100_for_PyTorch/examples/m2m_100/install_dependecies.sh | http://lotus.kuee.kyoto-u.ac.jp/WAT/my-en-data/wat2020.my-en.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/latent_depth/latent_depth_src/multilingual_translation_latent_depth.py | M2M-100_for_PyTorch/examples/latent_depth/latent_depth_src/multilingual_translation_latent_depth.py | https://arxiv.org/pdf/2009.13102.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/latent_depth/README.md | M2M-100_for_PyTorch/examples/latent_depth/latent_depth_src/modules/latent_layers.py | https://arxiv.org/abs/2009.13102 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/latent_depth/README.md | M2M-100_for_PyTorch/examples/latent_depth/latent_depth_src/models/latent_transformer.py | https://arxiv.org/abs/2009.13102 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/latent_depth/README.md | M2M-100_for_PyTorch/examples/latent_depth/latent_depth_src/models/latent_multilingual_transformer.py | https://arxiv.org/abs/2009.13102 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-wmt18en2de.sh | M2M-100_for_PyTorch/examples/language_model/prepare-wikitext-103.sh | https://github.com/facebookresearch/MIXER/blob/master/prepareData.sh | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/language_model/prepare-wikitext-103.sh | M2M-100_for_PyTorch/examples/language_model/prepare-wikitext-103.sh | https://s3.amazonaws.com/research.metamind.io/wikitext/wikitext-103-v1.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/fast_noisy_channel/README.md | M2M-100_for_PyTorch/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | https://github.com/moses-smt/mosesdecoder.git | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | M2M-100_for_PyTorch/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | http://statmt.org/wmt13/training-parallel-europarl-v7.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | M2M-100_for_PyTorch/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | http://statmt.org/wmt13/training-parallel-commoncrawl.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | M2M-100_for_PyTorch/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | http://data.statmt.org/wmt18/translation-task/training-parallel-nc-v13.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | 
M2M-100_for_PyTorch/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | http://data.statmt.org/wmt18/translation-task/rapid2016.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | M2M-100_for_PyTorch/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | http://data.statmt.org/wmt17/translation-task/dev.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | M2M-100_for_PyTorch/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | http://statmt.org/wmt14/test-full.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/fast_noisy_channel/README.md | M2M-100_for_PyTorch/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | https://github.com/glample/fastBPE.git | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/unsupervised_quality_estimation/README.md | M2M-100_for_PyTorch/examples/criss/unsupervised_mt/eval.sh | https://github.com/moses-smt/mosesdecoder | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/laser/README.md | M2M-100_for_PyTorch/examples/criss/download_and_preprocess_tatoeba.sh | https://github.com/facebookresearch/LASER | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/unsupervised_quality_estimation/README.md | M2M-100_for_PyTorch/examples/criss/download_and_preprocess_flores_test.sh | https://github.com/facebookresearch/flores | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/multilingual/data_scripts/download_flores_data.sh | M2M-100_for_PyTorch/examples/criss/download_and_preprocess_flores_test.sh | https://github.com/facebookresearch/flores/raw/master/data/wikipedia_en_ne_si_test_sets.tgz | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/byte_level_bpe/get_data.sh | M2M-100_for_PyTorch/examples/byte_level_bpe/get_data.sh | https://wit3.fbk.eu/archive/2017-01-trnted/texts/fr/en/fr-en.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-wmt18en2de.sh | M2M-100_for_PyTorch/examples/backtranslation/tokenized_bleu.sh | https://github.com/rsennrich/subword-nmt.git | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-wmt18en2de.sh | M2M-100_for_PyTorch/examples/backtranslation/sacrebleu.sh | https://github.com/rsennrich/subword-nmt.git | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-wmt18en2de.sh | M2M-100_for_PyTorch/examples/backtranslation/prepare-wmt18en2de.sh | https://github.com/facebookresearch/MIXER/blob/master/prepareData.sh | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/fast_noisy_channel/README.md | M2M-100_for_PyTorch/examples/backtranslation/prepare-wmt18en2de.sh | https://github.com/moses-smt/mosesdecoder.git | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-wmt18en2de.sh | M2M-100_for_PyTorch/examples/backtranslation/prepare-wmt18en2de.sh | https://github.com/rsennrich/subword-nmt.git | 源码实现 | -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/main/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | M2M-100_for_PyTorch/examples/backtranslation/prepare-wmt18en2de.sh | http://statmt.org/wmt13/training-parallel-europarl-v7.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | M2M-100_for_PyTorch/examples/backtranslation/prepare-wmt18en2de.sh | http://statmt.org/wmt13/training-parallel-commoncrawl.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | M2M-100_for_PyTorch/examples/backtranslation/prepare-wmt18en2de.sh | http://data.statmt.org/wmt18/translation-task/training-parallel-nc-v13.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | M2M-100_for_PyTorch/examples/backtranslation/prepare-wmt18en2de.sh | http://data.statmt.org/wmt18/translation-task/rapid2016.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | M2M-100_for_PyTorch/examples/backtranslation/prepare-wmt18en2de.sh | http://data.statmt.org/wmt17/translation-task/dev.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | M2M-100_for_PyTorch/examples/backtranslation/prepare-wmt18en2de.sh | http://statmt.org/wmt14/test-full.tgz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-de-monolingual.sh | M2M-100_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2007.de.shuffled.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-de-monolingual.sh | M2M-100_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2008.de.shuffled.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-de-monolingual.sh | M2M-100_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2009.de.shuffled.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-de-monolingual.sh | M2M-100_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2010.de.shuffled.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-de-monolingual.sh | M2M-100_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2011.de.shuffled.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-de-monolingual.sh | M2M-100_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2012.de.shuffled.gz | 模型相关说明 | -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-de-monolingual.sh | M2M-100_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2013.de.shuffled.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-de-monolingual.sh | M2M-100_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh | http://www.statmt.org/wmt15/training-monolingual-news-crawl-v2/news.2014.de.shuffled.v2.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-de-monolingual.sh | M2M-100_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh | http://data.statmt.org/wmt16/translation-task/news.2015.de.shuffled.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-de-monolingual.sh | M2M-100_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh | http://data.statmt.org/wmt17/translation-task/news.2016.de.shuffled.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/prepare-de-monolingual.sh | M2M-100_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh | http://data.statmt.org/wmt18/translation-task/news.2017.de.shuffled.deduped.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/docs/make.bat | M2M-100_for_PyTorch/docs/make.bat | http://sphinx-doc.org/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/docs/conf.py | M2M-100_for_PyTorch/docs/conf.py | http://docs.scipy.org/doc/numpy/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/speech_to_speech/benchmarking/README.md | M2M-100_for_PyTorch/docs/conf.py | https://docs.python.org/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/docs/conf.py | M2M-100_for_PyTorch/docs/conf.py | https://pytorch.org/docs/master/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/.gitmodules | M2M-100_for_PyTorch/.gitmodules | https://github.com/ngoyal2707/Megatron-LM | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/tests/speech_recognition/asr_test_base.py | M2M-100_for_PyTorch/tests/speech_recognition/asr_test_base.py | https://fburl.com/batch_first_example | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/setup.py | M2M-100_for_PyTorch/setup.py | https://stackoverflow.com/a/54128391 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/setup.py | M2M-100_for_PyTorch/setup.py | https://download.pytorch.org/whl/cpu/torch-1.7.0%2Bcpu-cp36-cp36m-linux_x86_64.whl | 模型相关说明 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|-----------------------------------------------------------------------------------------------------------------------------------------------------|----------------------------------------------------------------------------------------------------------|-----------| +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/.github/workflows/build.yml | https://download.pytorch.org/whl/torch_stable.html | 三方库下载 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh | http://www.statmt.org/wmt15/training-monolingual-news-crawl-v2/news.2014.de.shuffled.v2.gz | 模型相关说明 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2013.de.shuffled.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2012.de.shuffled.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2011.de.shuffled.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2010.de.shuffled.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2009.de.shuffled.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2008.de.shuffled.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2007.de.shuffled.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh | http://data.statmt.org/wmt18/translation-task/news.2017.de.shuffled.deduped.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh | http://data.statmt.org/wmt17/translation-task/news.2016.de.shuffled.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/examples/backtranslation/prepare-de-monolingual.sh | http://data.statmt.org/wmt16/translation-task/news.2015.de.shuffled.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/examples/backtranslation/prepare-wmt18en2de.sh | http://statmt.org/wmt14/test-full.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/examples/backtranslation/prepare-wmt18en2de.sh | http://statmt.org/wmt13/training-parallel-europarl-v7.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/examples/backtranslation/prepare-wmt18en2de.sh | http://statmt.org/wmt13/training-parallel-commoncrawl.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/examples/backtranslation/prepare-wmt18en2de.sh | http://data.statmt.org/wmt18/translation-task/training-parallel-nc-v13.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/examples/backtranslation/prepare-wmt18en2de.sh | http://data.statmt.org/wmt18/translation-task/rapid2016.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/examples/backtranslation/prepare-wmt18en2de.sh | http://data.statmt.org/wmt17/translation-task/dev.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | http://statmt.org/wmt14/test-full.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | http://statmt.org/wmt13/training-parallel-europarl-v7.tgz | 模型相关说明 
| +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | http://statmt.org/wmt13/training-parallel-commoncrawl.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | http://data.statmt.org/wmt18/translation-task/training-parallel-nc-v13.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | http://data.statmt.org/wmt18/translation-task/rapid2016.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | http://data.statmt.org/wmt17/translation-task/dev.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/examples/language_model/prepare-wikitext-103.sh | https://s3.amazonaws.com/research.metamind.io/wikitext/wikitext-103-v1.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/examples/m2m_100/install_dependecies.sh | http://lotus.kuee.kyoto-u.ac.jp/WAT/my-en-data/wat2020.my-en.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/examples/m2m_100/tokenizers/tokenizer_ar.sh | http://alt.qcri.org/tools/arabic-normalizer/ | 工具下载链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/examples/roberta/commonsense_qa/download_cqa_data.sh | https://s3.amazonaws.com/commensenseqa/dev_rand_split.jsonl | 参数相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/examples/roberta/commonsense_qa/download_cqa_data.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/dict.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/examples/roberta/commonsense_qa/download_cqa_data.sh | https://s3.amazonaws.com/commensenseqa/train_rand_split.jsonl | 模型参数相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/examples/roberta/commonsense_qa/download_cqa_data.sh | https://s3.amazonaws.com/commensenseqa/test_rand_split_no_answers.jsonl | 模型参数相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/examples/roberta/preprocess_GLUE_tasks.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/vocab.bpe | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/examples/roberta/preprocess_GLUE_tasks.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/encoder.json | 模型参数配置 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/examples/roberta/preprocess_GLUE_tasks.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/dict.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/examples/roberta/preprocess_RACE.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/vocab.bpe | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/examples/roberta/preprocess_RACE.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/encoder.json | 模型参数配置 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/examples/roberta/preprocess_RACE.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/dict.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/examples/speech_recognition/datasets/prepare-librispeech.sh | www.openslr.org/resources/12 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/examples/speech_to_text/prep_covost_data.py | https://dl.fbaipublicfiles.com/covost/ | 数据集链接 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/examples/translation/prepare-wmt14en2de.sh | http://statmt.org/wmt14/test-full.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/examples/translation/prepare-wmt14en2de.sh | http://statmt.org/wmt13/training-parallel-europarl-v7.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/examples/translation/prepare-wmt14en2de.sh | http://statmt.org/wmt13/training-parallel-commoncrawl.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/examples/translation/prepare-wmt14en2de.sh | http://data.statmt.org/wmt17/translation-task/training-parallel-nc-v12.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/examples/translation/prepare-wmt14en2de.sh | http://data.statmt.org/wmt17/translation-task/dev.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/examples/translation/prepare-wmt14en2de.sh | http://statmt.org/wmt14/training-parallel-nc-v9.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh | http://statmt.org/wmt14/test-full.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh | http://statmt.org/wmt14/training-parallel-nc-v9.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh | http://statmt.org/wmt13/training-parallel-un.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh | http://statmt.org/wmt13/training-parallel-europarl-v7.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh | http://statmt.org/wmt13/training-parallel-commoncrawl.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh | http://statmt.org/wmt10/training-giga-fren.tar | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/fairseq/data/encoders/gpt2_bpe.py | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/vocab.bpe | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/fairseq/data/encoders/gpt2_bpe.py | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/encoder.json | 模型参数配置 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.xsum.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.mnli.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.cnn.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.base.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/fairseq/models/fconv.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt17.v2.en-de.fconv-py.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/fairseq/models/fconv.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt17.v2.en-de.fconv-py.tar.bz2 
| 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/fairseq/models/fconv.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt14.en-de.fconv-py.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/fairseq/models/fconv_self_att.py | https://dl.fbaipublicfiles.com/fairseq/models/stories_checkpoint.tar.gz | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/fairseq/models/fconv_self_att.py | https://dl.fbaipublicfiles.com/fairseq/models/stories_checkpoint.tar.gz | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/fairseq/models/fconv_self_att.py | https://dl.fbaipublicfiles.com/fairseq/data/stories_test.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.lightconv.tar.gz | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.dynamicconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.dynamicconv.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/iwslt14.de-en.dynamicconv.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt17.zh-en.dynamicconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.dynamicconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt14.en-fr.joined-dict.dynamicconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/iwslt14.de-en.lightconv.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt17.zh-en.lightconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.lightconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.lightconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt14.en-fr.joined-dict.lightconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.base.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.wsc.tar.gz | 模型相关说明 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.mnli.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-large.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-wikipedia-4gb.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-oscar-4gb.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-ccnet-4gb.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-ccnet.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/fairseq/models/roberta/model_xlmr.py | http://dl.fbaipublicfiles.com/fairseq/models/xlmr.large.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/fairseq/models/roberta/model_xlmr.py | http://dl.fbaipublicfiles.com/fairseq/models/xlmr.base.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.ru-en.ensemble.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.ru-en.single_model.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-ru.ensemble.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-ru.single_model.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-ru.single_model.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-de.joined-dict.ensemble.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-de.joined-dict.single_model.tar.gz | 模型相关说明 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.de-en.joined-dict.ensemble.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.de-en.joined-dict.single_model.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt18.en-de.ensemble.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt16.en-de.joined-dict.transformer.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt14.en-fr.joined-dict.transformer.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/adaptive_lm_wiki103.v2.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/adaptive_lm_gbw_huge.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.ru.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.ru.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.ru.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/M2M-100_for_PyTorch/setup.py | https://download.pytorch.org/whl/cpu/torch-1.7.0%2Bcpu-cp36-cp36m-linux_x86_64.whl | 模型相关说明 | \ No newline at end of file diff --git a/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/public_address_statement.md b/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/public_address_statement.md index 609b323b75ddfe7a3173e144a55d8482924d66db..415941f723d9606251991f3c366ee6835587440a 100644 --- a/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/public_address_statement.md @@ -1,2876 +1,357 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ---- | ------------ | ------ | ------------------------------------ | -------- | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/configuration_albert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/albert/configuration_albert.py | https://huggingface.co/albert-base-v1/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/configuration_albert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/albert/configuration_albert.py | https://huggingface.co/albert-large-v1/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/configuration_albert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/albert/configuration_albert.py | https://huggingface.co/albert-xlarge-v1/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/configuration_albert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/albert/configuration_albert.py | https://huggingface.co/albert-xxlarge-v1/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/configuration_albert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/albert/configuration_albert.py | https://huggingface.co/albert-base-v2/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/configuration_albert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/albert/configuration_albert.py | https://huggingface.co/albert-large-v2/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/configuration_albert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/albert/configuration_albert.py | https://huggingface.co/albert-xlarge-v2/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/configuration_albert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/albert/configuration_albert.py | https://huggingface.co/albert-xxlarge-v2/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert.py | https://huggingface.co/albert-base-v1/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert.py | https://huggingface.co/albert-large-v1/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert.py | https://huggingface.co/albert-xlarge-v1/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert.py | https://huggingface.co/albert-xxlarge-v1/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert.py | https://huggingface.co/albert-base-v2/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert.py | https://huggingface.co/albert-large-v2/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert.py | https://huggingface.co/albert-xlarge-v2/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert.py | https://huggingface.co/albert-xxlarge-v2/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py | https://huggingface.co/albert-base-v1/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py | https://huggingface.co/albert-large-v1/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py | https://huggingface.co/albert-xlarge-v1/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py | https://huggingface.co/albert-xxlarge-v1/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py | https://huggingface.co/albert-base-v2/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py | https://huggingface.co/albert-large-v2/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py | https://huggingface.co/albert-xlarge-v2/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py | https://huggingface.co/albert-xxlarge-v2/resolve/main/spiece.model | 下载预训练文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py | https://huggingface.co/albert-base-v1/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py | https://huggingface.co/albert-large-v1/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py | 
https://huggingface.co/albert-xlarge-v1/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py | https://huggingface.co/albert-xxlarge-v1/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py | https://huggingface.co/albert-base-v2/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py | https://huggingface.co/albert-large-v2/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py | https://huggingface.co/albert-xlarge-v2/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/albert/tokenization_albert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py | https://huggingface.co/albert-xxlarge-v2/resolve/main/tokenizer.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/configuration_bart.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bart/configuration_bart.py | https://huggingface.co/facebook/bart-large/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart.py | https://huggingface.co/facebook/bart-base/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart.py | https://huggingface.co/facebook/bart-large/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart.py | https://huggingface.co/facebook/bart-large-mnli/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart.py | https://huggingface.co/facebook/bart-large-cnn/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart.py | https://huggingface.co/facebook/bart-large-xsum/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart.py | 
https://huggingface.co/yjernite/bart_eli5/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart.py | https://huggingface.co/facebook/bart-base/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart.py | https://huggingface.co/facebook/bart-large/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart.py | https://huggingface.co/facebook/bart-large-mnli/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart.py | https://huggingface.co/facebook/bart-large-cnn/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart.py | https://huggingface.co/facebook/bart-large-xsum/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart.py | https://huggingface.co/yjernite/bart_eli5/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | https://huggingface.co/facebook/bart-base/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | https://huggingface.co/facebook/bart-large/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | https://huggingface.co/facebook/bart-large-mnli/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | https://huggingface.co/facebook/bart-large-cnn/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | https://huggingface.co/facebook/bart-large-xsum/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | https://huggingface.co/yjernite/bart_eli5/resolve/main/vocab.json | 
下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | https://huggingface.co/facebook/bart-base/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | https://huggingface.co/facebook/bart-large/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | https://huggingface.co/facebook/bart-large-mnli/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | https://huggingface.co/facebook/bart-large-cnn/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | https://huggingface.co/facebook/bart-large-xsum/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | https://huggingface.co/yjernite/bart_eli5/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | https://huggingface.co/facebook/bart-base/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | https://huggingface.co/facebook/bart-large/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | https://huggingface.co/facebook/bart-large-mnli/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | https://huggingface.co/facebook/bart-large-cnn/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | https://huggingface.co/facebook/bart-large-xsum/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bart/tokenization_bart_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | 
https://huggingface.co/yjernite/bart_eli5/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/barthez/tokenization_barthez.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/barthez/tokenization_barthez.py | https://huggingface.co/moussaKam/mbarthez/resolve/main/sentencepiece.bpe.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/barthez/tokenization_barthez.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/barthez/tokenization_barthez.py | https://huggingface.co/moussaKam/barthez/resolve/main/sentencepiece.bpe.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/barthez/tokenization_barthez.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/barthez/tokenization_barthez.py | https://huggingface.co/moussaKam/barthez-orangesum-title/resolve/main/sentencepiece.bpe.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/barthez/tokenization_barthez_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/barthez/tokenization_barthez_fast.py | https://huggingface.co/moussaKam/mbarthez/resolve/main/sentencepiece.bpe.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/barthez/tokenization_barthez_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/barthez/tokenization_barthez_fast.py | https://huggingface.co/moussaKam/barthez/resolve/main/sentencepiece.bpe.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/barthez/tokenization_barthez_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/barthez/tokenization_barthez_fast.py | https://huggingface.co/moussaKam/barthez-orangesum-title/resolve/main/sentencepiece.bpe.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/barthez/tokenization_barthez_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/barthez/tokenization_barthez_fast.py | https://huggingface.co/moussaKam/mbarthez/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/barthez/tokenization_barthez_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/barthez/tokenization_barthez_fast.py | https://huggingface.co/moussaKam/barthez/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/barthez/tokenization_barthez_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/barthez/tokenization_barthez_fast.py | https://huggingface.co/moussaKam/barthez-orangesum-title/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bartpho/tokenization_bartpho.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bartpho/tokenization_bartpho.py | https://huggingface.co/vinai/bartpho-syllable/resolve/main/sentencepiece.bpe.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bartpho/tokenization_bartpho.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bartpho/tokenization_bartpho.py | https://huggingface.co/vinai/bartpho-syllable/resolve/main/dict.txt | 
下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/beit/convert_beit_unilm_to_pytorch.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/beit/convert_beit_unilm_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 下载数据集图片 | -| 开发引入 | / | url.ini | https://unilm.blob.core.windows.net/beit/beit_base_patch16_224_pt22k_ft22kto1k.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://huggingface.co/microsoft/beit-base-patch16-224-in22k/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/bert-base-uncased/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/bert-large-uncased/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/bert-base-cased/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/bert-large-cased/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/bert-base-multilingual-uncased/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/bert-base-multilingual-cased/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/bert-base-chinese/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/bert-base-german-cased/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/bert-large-uncased-whole-word-masking/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/bert-large-cased-whole-word-masking/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/bert-large-uncased-whole-word-masking-finetuned-squad/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/bert-large-cased-whole-word-masking-finetuned-squad/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/bert-base-cased-finetuned-mrpc/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/bert-base-german-dbmdz-cased/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/bert-base-german-dbmdz-uncased/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/cl-tohoku/bert-base-japanese/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/cl-tohoku/bert-base-japanese-whole-word-masking/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/cl-tohoku/bert-base-japanese-char/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/cl-tohoku/bert-base-japanese-char-whole-word-masking/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/TurkuNLP/bert-base-finnish-cased-v1/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/TurkuNLP/bert-base-finnish-uncased-v1/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/configuration_bert.py | 
MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/wietsedv/bert-base-dutch-cased/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | https://huggingface.co/bert-base-uncased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | https://huggingface.co/bert-large-uncased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | https://huggingface.co/bert-base-cased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | https://huggingface.co/bert-large-cased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | https://huggingface.co/bert-base-multilingual-uncased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | https://huggingface.co/bert-base-multilingual-cased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | https://huggingface.co/bert-base-chinese/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | https://huggingface.co/bert-base-german-cased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | https://huggingface.co/bert-large-uncased-whole-word-masking/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | https://huggingface.co/bert-large-cased-whole-word-masking/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | https://huggingface.co/bert-large-uncased-whole-word-masking-finetuned-squad/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | 
MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | https://huggingface.co/bert-large-cased-whole-word-masking-finetuned-squad/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | https://huggingface.co/bert-base-cased-finetuned-mrpc/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | https://huggingface.co/bert-base-german-dbmdz-cased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | https://huggingface.co/bert-base-german-dbmdz-uncased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | https://huggingface.co/TurkuNLP/bert-base-finnish-cased-v1/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | https://huggingface.co/TurkuNLP/bert-base-finnish-uncased-v1/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | https://huggingface.co/wietsedv/bert-base-dutch-cased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-base-uncased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-large-uncased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-base-cased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-large-cased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-base-multilingual-uncased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-base-multilingual-cased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-base-chinese/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-base-german-cased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-large-uncased-whole-word-masking/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-large-cased-whole-word-masking/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-large-uncased-whole-word-masking-finetuned-squad/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-large-cased-whole-word-masking-finetuned-squad/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-base-cased-finetuned-mrpc/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-base-german-dbmdz-cased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-base-german-dbmdz-uncased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/TurkuNLP/bert-base-finnish-cased-v1/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | 
MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/TurkuNLP/bert-base-finnish-uncased-v1/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/wietsedv/bert-base-dutch-cased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-base-uncased/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-large-uncased/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-base-cased/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-large-cased/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-base-multilingual-uncased/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-base-multilingual-cased/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-base-chinese/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-base-german-cased/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-large-uncased-whole-word-masking/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-large-cased-whole-word-masking/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-large-uncased-whole-word-masking-finetuned-squad/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-large-cased-whole-word-masking-finetuned-squad/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-base-cased-finetuned-mrpc/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-base-german-dbmdz-cased/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/bert-base-german-dbmdz-uncased/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/TurkuNLP/bert-base-finnish-cased-v1/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/TurkuNLP/bert-base-finnish-uncased-v1/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert/tokenization_bert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://huggingface.co/wietsedv/bert-base-dutch-cased/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert_generation/tokenization_bert_generation.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert_generation/tokenization_bert_generation.py | https://huggingface.co/google/bert_for_seq_generation_L-24_bbc_encoder/resolve/main/spiece.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert_japanese/tokenization_bert_japanese.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert_japanese/tokenization_bert_japanese.py | https://huggingface.co/cl-tohoku/bert-base-japanese/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert_japanese/tokenization_bert_japanese.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert_japanese/tokenization_bert_japanese.py | https://huggingface.co/cl-tohoku/bert-base-japanese-whole-word-masking/resolve/main/vocab.txt | 下载词表文件 | -| 
开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert_japanese/tokenization_bert_japanese.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert_japanese/tokenization_bert_japanese.py | https://huggingface.co/cl-tohoku/bert-base-japanese-char/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bert_japanese/tokenization_bert_japanese.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert_japanese/tokenization_bert_japanese.py | https://huggingface.co/cl-tohoku/bert-base-japanese-char-whole-word-masking/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bertweet/tokenization_bertweet.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bertweet/tokenization_bertweet.py | https://huggingface.co/vinai/bertweet-base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bertweet/tokenization_bertweet.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bertweet/tokenization_bertweet.py | https://huggingface.co/vinai/bertweet-base/resolve/main/bpe.codes | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/big_bird/configuration_big_bird.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/big_bird/configuration_big_bird.py | https://huggingface.co/google/bigbird-roberta-base/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/big_bird/configuration_big_bird.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/big_bird/configuration_big_bird.py | https://huggingface.co/google/bigbird-roberta-large/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/big_bird/configuration_big_bird.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/big_bird/configuration_big_bird.py | https://huggingface.co/google/bigbird-base-trivia-itc/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/big_bird/tokenization_big_bird.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/big_bird/tokenization_big_bird.py | https://huggingface.co/google/bigbird-roberta-base/resolve/main/spiece.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/big_bird/tokenization_big_bird.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/big_bird/tokenization_big_bird.py | https://huggingface.co/google/bigbird-roberta-large/resolve/main/spiece.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/big_bird/tokenization_big_bird.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/big_bird/tokenization_big_bird.py | https://huggingface.co/google/bigbird-base-trivia-itc/resolve/main/spiece.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/big_bird/tokenization_big_bird_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/big_bird/tokenization_big_bird_fast.py | https://huggingface.co/google/bigbird-roberta-base/resolve/main/spiece.model | 下载预训练配置文件 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/big_bird/tokenization_big_bird_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/big_bird/tokenization_big_bird_fast.py | https://huggingface.co/google/bigbird-roberta-large/resolve/main/spiece.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/big_bird/tokenization_big_bird_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/big_bird/tokenization_big_bird_fast.py | https://huggingface.co/google/bigbird-base-trivia-itc/resolve/main/spiece.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/big_bird/tokenization_big_bird_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/big_bird/tokenization_big_bird_fast.py | https://huggingface.co/google/bigbird-roberta-base/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/big_bird/tokenization_big_bird_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/big_bird/tokenization_big_bird_fast.py | https://huggingface.co/google/bigbird-roberta-large/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/big_bird/tokenization_big_bird_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/big_bird/tokenization_big_bird_fast.py | https://huggingface.co/google/bigbird-base-trivia-itc/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bigbird_pegasus/configuration_bigbird_pegasus.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bigbird_pegasus/configuration_bigbird_pegasus.py | https://huggingface.co/google/bigbird-pegasus-large-arxiv/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bigbird_pegasus/configuration_bigbird_pegasus.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bigbird_pegasus/configuration_bigbird_pegasus.py | https://huggingface.co/google/bigbird-pegasus-large-pubmed/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/bigbird_pegasus/configuration_bigbird_pegasus.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bigbird_pegasus/configuration_bigbird_pegasus.py | https://huggingface.co/google/bigbird-pegasus-large-bigpatent/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot/configuration_blenderbot.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/blenderbot/configuration_blenderbot.py | https://huggingface.co/facebook/blenderbot-3B/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot/tokenization_blenderbot.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/blenderbot/tokenization_blenderbot.py | https://huggingface.co/facebook/blenderbot-3B/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot/tokenization_blenderbot.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/blenderbot/tokenization_blenderbot.py | 
https://huggingface.co/facebook/blenderbot-3B/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot/tokenization_blenderbot.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/blenderbot/tokenization_blenderbot.py | https://huggingface.co/facebook/blenderbot-3B/resolve/main/tokenizer_config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot/tokenization_blenderbot_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/blenderbot/tokenization_blenderbot_fast.py | https://huggingface.co/facebook/blenderbot-3B/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot/tokenization_blenderbot_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/blenderbot/tokenization_blenderbot_fast.py | https://huggingface.co/facebook/blenderbot-3B/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot/tokenization_blenderbot_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/blenderbot/tokenization_blenderbot_fast.py | https://huggingface.co/facebook/blenderbot-3B/resolve/main/tokenizer_config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot_small/configuration_blenderbot_small.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/blenderbot_small/configuration_blenderbot_small.py | https://huggingface.co/facebook/blenderbot_small-90M/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot_small/tokenization_blenderbot_small.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/blenderbot_small/tokenization_blenderbot_small.py | https://huggingface.co/facebook/blenderbot_small-90M/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot_small/tokenization_blenderbot_small.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/blenderbot_small/tokenization_blenderbot_small.py | https://huggingface.co/facebook/blenderbot_small-90M/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot_small/tokenization_blenderbot_small.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/blenderbot_small/tokenization_blenderbot_small.py | https://huggingface.co/facebook/blenderbot_small-90M/resolve/main/tokenizer_config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot_small/tokenization_blenderbot_small_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/blenderbot_small/tokenization_blenderbot_small_fast.py | https://huggingface.co/facebook/blenderbot_small-90M/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot_small/tokenization_blenderbot_small_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/blenderbot_small/tokenization_blenderbot_small_fast.py | https://huggingface.co/facebook/blenderbot_small-90M/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/blenderbot_small/tokenization_blenderbot_small_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/blenderbot_small/tokenization_blenderbot_small_fast.py | https://huggingface.co/facebook/blenderbot_small-90M/resolve/main/tokenizer_config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/camembert/configuration_camembert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/camembert/configuration_camembert.py | https://huggingface.co/camembert-base/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/camembert/configuration_camembert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/camembert/configuration_camembert.py | https://huggingface.co/Musixmatch/umberto-commoncrawl-cased-v1/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/camembert/configuration_camembert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/camembert/configuration_camembert.py | https://huggingface.co/Musixmatch/umberto-wikipedia-uncased-v1/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/camembert/tokenization_camembert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/camembert/tokenization_camembert.py | https://huggingface.co/camembert-base/resolve/main/sentencepiece.bpe.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/camembert/tokenization_camembert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/camembert/tokenization_camembert_fast.py | https://huggingface.co/camembert-base/resolve/main/sentencepiece.bpe.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/camembert/tokenization_camembert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/camembert/tokenization_camembert_fast.py | https://huggingface.co/camembert-base/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/canine/configuration_canine.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/canine/configuration_canine.py | https://huggingface.co/google/canine-s/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/clip/configuration_clip.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/clip/configuration_clip.py | https://huggingface.co/openai/clip-vit-base-patch32/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/clip/tokenization_clip.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/clip/tokenization_clip.py | https://huggingface.co/openai/clip-vit-base-patch32/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/clip/tokenization_clip.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/clip/tokenization_clip.py | https://huggingface.co/openai/clip-vit-base-patch32/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/clip/tokenization_clip_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/clip/tokenization_clip_fast.py | https://huggingface.co/openai/clip-vit-base-patch32/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/clip/tokenization_clip_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/clip/tokenization_clip_fast.py | https://huggingface.co/openai/clip-vit-base-patch32/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/clip/tokenization_clip_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/clip/tokenization_clip_fast.py | https://huggingface.co/openai/clip-vit-base-patch32/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convbert/configuration_convbert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convbert/configuration_convbert.py | https://huggingface.co/YituTech/conv-bert-base/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convbert/configuration_convbert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convbert/configuration_convbert.py | https://huggingface.co/YituTech/conv-bert-medium-small/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convbert/configuration_convbert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convbert/configuration_convbert.py | https://huggingface.co/YituTech/conv-bert-small/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convbert/tokenization_convbert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convbert/tokenization_convbert.py | https://huggingface.co/YituTech/conv-bert-base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convbert/tokenization_convbert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convbert/tokenization_convbert.py | https://huggingface.co/YituTech/conv-bert-medium-small/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convbert/tokenization_convbert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convbert/tokenization_convbert.py | https://huggingface.co/YituTech/conv-bert-small/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convbert/tokenization_convbert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convbert/tokenization_convbert_fast.py | https://huggingface.co/YituTech/conv-bert-base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convbert/tokenization_convbert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convbert/tokenization_convbert_fast.py | https://huggingface.co/YituTech/conv-bert-medium-small/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convbert/tokenization_convbert_fast.py | 
MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convbert/tokenization_convbert_fast.py | https://huggingface.co/YituTech/conv-bert-small/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/configuration_convnext.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convnext/configuration_convnext.py | https://huggingface.co/facebook/convnext-tiny-224/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 下载数据集图片 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_tiny_1k_224_ema.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_small_1k_224_ema.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_base_1k_224_ema.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_base_1k_384.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_large_1k_224_ema.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_large_1k_384.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_base_22k_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_large_22k_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | 
https://dl.fbaipublicfiles.com/convnext/convnext_xlarge_22k_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_base_22k_1k_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_base_22k_1k_384.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_large_22k_1k_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_large_22k_1k_384.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_xlarge_22k_1k_224_ema.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_xlarge_22k_1k_384_ema.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/convnext/convert_convnext_to_pytorch.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_tiny_1k_224_ema.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/cpm/tokenization_cpm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/cpm/tokenization_cpm.py | https://huggingface.co/TsinghuaAI/CPM-Generate/resolve/main/spiece.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/cpm/tokenization_cpm_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/cpm/tokenization_cpm_fast.py | https://huggingface.co/TsinghuaAI/CPM-Generate/resolve/main/spiece.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/cpm/tokenization_cpm_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/cpm/tokenization_cpm_fast.py | https://huggingface.co/TsinghuaAI/CPM-Generate/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/ctrl/configuration_ctrl.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/ctrl/configuration_ctrl.py | https://huggingface.co/ctrl/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/ctrl/tokenization_ctrl.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/ctrl/tokenization_ctrl.py | https://raw.githubusercontent.com/salesforce/ctrl/master/ctrl-vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/ctrl/tokenization_ctrl.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/ctrl/tokenization_ctrl.py | https://raw.githubusercontent.com/salesforce/ctrl/master/ctrl-merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/data2vec/configuration_data2vec_audio.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/data2vec/configuration_data2vec_audio.py | https://huggingface.co/facebook/data2vec-audio-base-960h/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/data2vec/configuration_data2vec_text.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/data2vec/configuration_data2vec_text.py | https://huggingface.co/data2vec/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/configuration_deberta.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta/configuration_deberta.py | https://huggingface.co/microsoft/deberta-base/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/configuration_deberta.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta/configuration_deberta.py | https://huggingface.co/microsoft/deberta-large/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/configuration_deberta.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta/configuration_deberta.py | https://huggingface.co/microsoft/deberta-xlarge/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/configuration_deberta.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta/configuration_deberta.py | https://huggingface.co/microsoft/deberta-base-mnli/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/configuration_deberta.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta/configuration_deberta.py | https://huggingface.co/microsoft/deberta-large-mnli/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/configuration_deberta.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta/configuration_deberta.py | https://huggingface.co/microsoft/deberta-xlarge-mnli/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta.py | https://huggingface.co/microsoft/deberta-base/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta.py | 
MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta.py | https://huggingface.co/microsoft/deberta-large/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta.py | https://huggingface.co/microsoft/deberta-xlarge/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta.py | https://huggingface.co/microsoft/deberta-base-mnli/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta.py | https://huggingface.co/microsoft/deberta-large-mnli/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta.py | https://huggingface.co/microsoft/deberta-xlarge-mnli/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta.py | https://huggingface.co/microsoft/deberta-base/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta.py | https://huggingface.co/microsoft/deberta-large/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta.py | https://huggingface.co/microsoft/deberta-xlarge/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta.py | https://huggingface.co/microsoft/deberta-base-mnli/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta.py | https://huggingface.co/microsoft/deberta-large-mnli/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta.py | https://huggingface.co/microsoft/deberta-xlarge-mnli/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta_fast.py | https://huggingface.co/microsoft/deberta-base/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta_fast.py | https://huggingface.co/microsoft/deberta-large/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta_fast.py | https://huggingface.co/microsoft/deberta-xlarge/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta_fast.py | https://huggingface.co/microsoft/deberta-base-mnli/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta_fast.py | https://huggingface.co/microsoft/deberta-large-mnli/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta_fast.py | https://huggingface.co/microsoft/deberta-xlarge-mnli/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta_fast.py | https://huggingface.co/microsoft/deberta-base/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta_fast.py | https://huggingface.co/microsoft/deberta-large/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta_fast.py | https://huggingface.co/microsoft/deberta-xlarge/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta_fast.py | https://huggingface.co/microsoft/deberta-base-mnli/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta_fast.py | https://huggingface.co/microsoft/deberta-large-mnli/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta/tokenization_deberta_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta/tokenization_deberta_fast.py | https://huggingface.co/microsoft/deberta-xlarge-mnli/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta_v2/configuration_deberta_v2.py | 
MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta_v2/configuration_deberta_v2.py | https://huggingface.co/microsoft/deberta-v2-xlarge/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta_v2/configuration_deberta_v2.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta_v2/configuration_deberta_v2.py | https://huggingface.co/microsoft/deberta-v2-xxlarge/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta_v2/configuration_deberta_v2.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta_v2/configuration_deberta_v2.py | https://huggingface.co/microsoft/deberta-v2-xlarge-mnli/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta_v2/configuration_deberta_v2.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta_v2/configuration_deberta_v2.py | https://huggingface.co/microsoft/deberta-v2-xxlarge-mnli/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta_v2/tokenization_deberta_v2.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta_v2/tokenization_deberta_v2.py | https://huggingface.co/microsoft/deberta-v2-xlarge/resolve/main/spm.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta_v2/tokenization_deberta_v2.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta_v2/tokenization_deberta_v2.py | https://huggingface.co/microsoft/deberta-v2-xxlarge/resolve/main/spm.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta_v2/tokenization_deberta_v2.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta_v2/tokenization_deberta_v2.py | https://huggingface.co/microsoft/deberta-v2-xlarge-mnli/resolve/main/spm.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deberta_v2/tokenization_deberta_v2.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta_v2/tokenization_deberta_v2.py | https://huggingface.co/microsoft/deberta-v2-xxlarge-mnli/resolve/main/spm.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deit/configuration_deit.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deit/configuration_deit.py | https://huggingface.co/facebook/deit-base-patch16-224/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/deit/convert_deit_timm_to_pytorch.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deit/convert_deit_timm_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 下载数据集图片 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/detr/configuration_detr.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/detr/configuration_detr.py | https://huggingface.co/facebook/detr-resnet-50/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/detr/convert_detr_original_pytorch_checkpoint_to_pytorch.py | 
MT5_ID4146_for_PyTorch/transformers/src/transformers/models/detr/convert_detr_original_pytorch_checkpoint_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 下载数据集图片 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/configuration_distilbert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/distilbert/configuration_distilbert.py | https://huggingface.co/distilbert-base-uncased/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/configuration_distilbert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/distilbert/configuration_distilbert.py | https://huggingface.co/distilbert-base-uncased-distilled-squad/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/configuration_distilbert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/distilbert/configuration_distilbert.py | https://huggingface.co/distilbert-base-cased/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/configuration_distilbert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/distilbert/configuration_distilbert.py | https://huggingface.co/distilbert-base-cased-distilled-squad/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/configuration_distilbert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/distilbert/configuration_distilbert.py | https://huggingface.co/distilbert-base-german-cased/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/configuration_distilbert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/distilbert/configuration_distilbert.py | https://huggingface.co/distilbert-base-multilingual-cased/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/configuration_distilbert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/distilbert/configuration_distilbert.py | https://huggingface.co/distilbert-base-uncased-finetuned-sst-2-english/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert.py | https://huggingface.co/distilbert-base-uncased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert.py | https://huggingface.co/distilbert-base-uncased-distilled-squad/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert.py | https://huggingface.co/distilbert-base-cased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert.py | https://huggingface.co/distilbert-base-cased-distilled-squad/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert.py | https://huggingface.co/distilbert-base-german-cased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert.py | https://huggingface.co/distilbert-base-multilingual-cased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert_fast.py | https://huggingface.co/distilbert-base-uncased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert_fast.py | https://huggingface.co/distilbert-base-uncased-distilled-squad/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert_fast.py | https://huggingface.co/distilbert-base-cased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert_fast.py | https://huggingface.co/distilbert-base-cased-distilled-squad/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert_fast.py | https://huggingface.co/distilbert-base-german-cased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert_fast.py | https://huggingface.co/distilbert-base-multilingual-cased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert_fast.py | https://huggingface.co/distilbert-base-uncased/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert_fast.py | 
https://huggingface.co/distilbert-base-uncased-distilled-squad/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert_fast.py | https://huggingface.co/distilbert-base-cased/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert_fast.py | https://huggingface.co/distilbert-base-cased-distilled-squad/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert_fast.py | https://huggingface.co/distilbert-base-german-cased/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/distilbert/tokenization_distilbert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/distilbert/tokenization_distilbert_fast.py | https://huggingface.co/distilbert-base-multilingual-cased/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dit/convert_dit_unilm_to_pytorch.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/dit/convert_dit_unilm_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 下载数据集图片 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dit/convert_dit_unilm_to_pytorch.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/dit/convert_dit_unilm_to_pytorch.py | https://layoutlm.blob.core.windows.net/dit/dit-pts/dit-base-224-p16-500k-62d53a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/configuration_dpr.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/dpr/configuration_dpr.py | https://huggingface.co/facebook/dpr-ctx_encoder-single-nq-base/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/configuration_dpr.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/dpr/configuration_dpr.py | https://huggingface.co/facebook/dpr-question_encoder-single-nq-base/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/configuration_dpr.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/dpr/configuration_dpr.py | https://huggingface.co/facebook/dpr-reader-single-nq-base/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/configuration_dpr.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/dpr/configuration_dpr.py | https://huggingface.co/facebook/dpr-ctx_encoder-multiset-base/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/configuration_dpr.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/dpr/configuration_dpr.py | 
https://huggingface.co/facebook/dpr-question_encoder-multiset-base/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/configuration_dpr.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/dpr/configuration_dpr.py | https://huggingface.co/facebook/dpr-reader-multiset-base/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr.py | https://huggingface.co/facebook/dpr-ctx_encoder-single-nq-base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr.py | https://huggingface.co/facebook/dpr-ctx_encoder-multiset-base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr.py | https://huggingface.co/facebook/dpr-ctx_encoder-single-nq-base/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr.py | https://huggingface.co/facebook/dpr-ctx_encoder-multiset-base/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr.py | https://huggingface.co/facebook/dpr-question_encoder-single-nq-base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr.py | https://huggingface.co/facebook/dpr-question_encoder-multiset-base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr.py | https://huggingface.co/facebook/dpr-question_encoder-single-nq-base/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr.py | https://huggingface.co/facebook/dpr-question_encoder-multiset-base/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr.py | https://huggingface.co/facebook/dpr-reader-single-nq-base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr.py | https://huggingface.co/facebook/dpr-reader-multiset-base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr.py | 
MT5_ID4146_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr.py | https://huggingface.co/facebook/dpr-reader-single-nq-base/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr.py | https://huggingface.co/facebook/dpr-reader-multiset-base/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr_fast.py | https://huggingface.co/facebook/dpr-ctx_encoder-single-nq-base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr_fast.py | https://huggingface.co/facebook/dpr-ctx_encoder-multiset-base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr_fast.py | https://huggingface.co/facebook/dpr-ctx_encoder-single-nq-base/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr_fast.py | https://huggingface.co/facebook/dpr-ctx_encoder-multiset-base/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr_fast.py | https://huggingface.co/facebook/dpr-question_encoder-single-nq-base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr_fast.py | https://huggingface.co/facebook/dpr-question_encoder-multiset-base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr_fast.py | https://huggingface.co/facebook/dpr-question_encoder-single-nq-base/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr_fast.py | https://huggingface.co/facebook/dpr-question_encoder-multiset-base/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr_fast.py | https://huggingface.co/facebook/dpr-reader-single-nq-base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr_fast.py | 
https://huggingface.co/facebook/dpr-reader-multiset-base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr_fast.py | https://huggingface.co/facebook/dpr-reader-single-nq-base/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/dpr/tokenization_dpr_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/dpr/tokenization_dpr_fast.py | https://huggingface.co/facebook/dpr-reader-multiset-base/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/configuration_electra.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/electra/configuration_electra.py | https://huggingface.co/google/electra-small-generator/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/configuration_electra.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/electra/configuration_electra.py | https://huggingface.co/google/electra-base-generator/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/configuration_electra.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/electra/configuration_electra.py | https://huggingface.co/google/electra-large-generator/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/configuration_electra.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/electra/configuration_electra.py | https://huggingface.co/google/electra-small-discriminator/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/configuration_electra.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/electra/configuration_electra.py | https://huggingface.co/google/electra-base-discriminator/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/configuration_electra.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/electra/configuration_electra.py | https://huggingface.co/google/electra-large-discriminator/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra.py | https://huggingface.co/google/electra-small-generator/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra.py | https://huggingface.co/google/electra-base-generator/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra.py | https://huggingface.co/google/electra-large-generator/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra.py | https://huggingface.co/google/electra-small-discriminator/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra.py | https://huggingface.co/google/electra-base-discriminator/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra.py | https://huggingface.co/google/electra-large-discriminator/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra_fast.py | https://huggingface.co/google/electra-small-generator/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra_fast.py | https://huggingface.co/google/electra-base-generator/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra_fast.py | https://huggingface.co/google/electra-large-generator/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra_fast.py | https://huggingface.co/google/electra-small-discriminator/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra_fast.py | https://huggingface.co/google/electra-base-discriminator/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra_fast.py | https://huggingface.co/google/electra-large-discriminator/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra_fast.py | https://huggingface.co/google/electra-small-generator/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra_fast.py | https://huggingface.co/google/electra-base-generator/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra_fast.py | https://huggingface.co/google/electra-large-generator/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra_fast.py | https://huggingface.co/google/electra-small-discriminator/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra_fast.py | https://huggingface.co/google/electra-base-discriminator/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/electra/tokenization_electra_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/electra/tokenization_electra_fast.py | https://huggingface.co/google/electra-large-discriminator/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/flaubert/configuration_flaubert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/flaubert/configuration_flaubert.py | https://huggingface.co/flaubert/flaubert_small_cased/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/flaubert/configuration_flaubert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/flaubert/configuration_flaubert.py | https://huggingface.co/flaubert/flaubert_base_uncased/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/flaubert/configuration_flaubert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/flaubert/configuration_flaubert.py | https://huggingface.co/flaubert/flaubert_base_cased/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/flaubert/configuration_flaubert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/flaubert/configuration_flaubert.py | https://huggingface.co/flaubert/flaubert_large_cased/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/flaubert/tokenization_flaubert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/flaubert/tokenization_flaubert.py | https://huggingface.co/flaubert/flaubert_small_cased/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/flaubert/tokenization_flaubert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/flaubert/tokenization_flaubert.py | https://huggingface.co/flaubert/flaubert_base_uncased/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/flaubert/tokenization_flaubert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/flaubert/tokenization_flaubert.py | https://huggingface.co/flaubert/flaubert_base_cased/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/flaubert/tokenization_flaubert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/flaubert/tokenization_flaubert.py | https://huggingface.co/flaubert/flaubert_large_cased/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/flaubert/tokenization_flaubert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/flaubert/tokenization_flaubert.py | https://huggingface.co/flaubert/flaubert_small_cased/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/flaubert/tokenization_flaubert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/flaubert/tokenization_flaubert.py | https://huggingface.co/flaubert/flaubert_base_uncased/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/flaubert/tokenization_flaubert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/flaubert/tokenization_flaubert.py | https://huggingface.co/flaubert/flaubert_base_cased/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/flaubert/tokenization_flaubert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/flaubert/tokenization_flaubert.py | https://huggingface.co/flaubert/flaubert_large_cased/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/fnet/configuration_fnet.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/fnet/configuration_fnet.py | https://huggingface.co/google/fnet-base/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/fnet/configuration_fnet.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/fnet/configuration_fnet.py | https://huggingface.co/google/fnet-large/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/fnet/tokenization_fnet.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/fnet/tokenization_fnet.py | https://huggingface.co/google/fnet-base/resolve/main/spiece.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/fnet/tokenization_fnet.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/fnet/tokenization_fnet.py | https://huggingface.co/google/fnet-large/resolve/main/spiece.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/fnet/tokenization_fnet_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/fnet/tokenization_fnet_fast.py | https://huggingface.co/google/fnet-base/resolve/main/spiece.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/fnet/tokenization_fnet_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/fnet/tokenization_fnet_fast.py | https://huggingface.co/google/fnet-large/resolve/main/spiece.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/fnet/tokenization_fnet_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/fnet/tokenization_fnet_fast.py | 
https://huggingface.co/google/fnet-base/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/fnet/tokenization_fnet_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/fnet/tokenization_fnet_fast.py | https://huggingface.co/google/fnet-large/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/fsmt/tokenization_fsmt.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/fsmt/tokenization_fsmt.py | https://huggingface.co/stas/tiny-wmt19-en-de/resolve/main/vocab-src.json | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/fsmt/tokenization_fsmt.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/fsmt/tokenization_fsmt.py | https://huggingface.co/stas/tiny-wmt19-en-de/resolve/main/vocab-tgt.json | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/fsmt/tokenization_fsmt.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/fsmt/tokenization_fsmt.py | https://huggingface.co/stas/tiny-wmt19-en-de/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/configuration_funnel.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/funnel/configuration_funnel.py | https://huggingface.co/funnel-transformer/small/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/configuration_funnel.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/funnel/configuration_funnel.py | https://huggingface.co/funnel-transformer/small-base/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/configuration_funnel.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/funnel/configuration_funnel.py | https://huggingface.co/funnel-transformer/medium/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/configuration_funnel.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/funnel/configuration_funnel.py | https://huggingface.co/funnel-transformer/medium-base/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/configuration_funnel.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/funnel/configuration_funnel.py | https://huggingface.co/funnel-transformer/intermediate/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/configuration_funnel.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/funnel/configuration_funnel.py | https://huggingface.co/funnel-transformer/intermediate-base/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/configuration_funnel.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/funnel/configuration_funnel.py | https://huggingface.co/funnel-transformer/large/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/configuration_funnel.py | 
MT5_ID4146_for_PyTorch/transformers/src/transformers/models/funnel/configuration_funnel.py | https://huggingface.co/funnel-transformer/large-base/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/configuration_funnel.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/funnel/configuration_funnel.py | https://huggingface.co/funnel-transformer/xlarge/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/configuration_funnel.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/funnel/configuration_funnel.py | https://huggingface.co/funnel-transformer/xlarge-base/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel.py | https://huggingface.co/funnel-transformer/small/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel.py | https://huggingface.co/funnel-transformer/small-base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel.py | https://huggingface.co/funnel-transformer/medium/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel.py | https://huggingface.co/funnel-transformer/medium-base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel.py | https://huggingface.co/funnel-transformer/intermediate/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel.py | https://huggingface.co/funnel-transformer/intermediate-base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel.py | https://huggingface.co/funnel-transformer/large/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel.py | https://huggingface.co/funnel-transformer/large-base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel.py | https://huggingface.co/funnel-transformer/xlarge/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel.py | https://huggingface.co/funnel-transformer/xlarge-base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/small/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/small-base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/medium/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/medium-base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/intermediate/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/intermediate-base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/large/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/large-base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/xlarge/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/xlarge-base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | 
MT5_ID4146_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/small/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/small-base/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/medium/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/medium-base/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/intermediate/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/intermediate-base/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/large/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/large-base/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/xlarge/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/funnel/tokenization_funnel_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/funnel/tokenization_funnel_fast.py | https://huggingface.co/funnel-transformer/xlarge-base/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/configuration_gpt2.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt2/configuration_gpt2.py | https://huggingface.co/gpt2/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/configuration_gpt2.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt2/configuration_gpt2.py | 
https://huggingface.co/gpt2-medium/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/configuration_gpt2.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt2/configuration_gpt2.py | https://huggingface.co/gpt2-large/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/configuration_gpt2.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt2/configuration_gpt2.py | https://huggingface.co/gpt2-xl/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/configuration_gpt2.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt2/configuration_gpt2.py | https://huggingface.co/distilgpt2/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2.py | https://huggingface.co/gpt2/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2.py | https://huggingface.co/gpt2-medium/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2.py | https://huggingface.co/gpt2-large/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2.py | https://huggingface.co/gpt2-xl/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2.py | https://huggingface.co/distilgpt2/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2.py | https://huggingface.co/gpt2/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2.py | https://huggingface.co/gpt2-medium/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2.py | https://huggingface.co/gpt2-large/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2.py | https://huggingface.co/gpt2-xl/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2.py | 
MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2.py | https://huggingface.co/distilgpt2/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://huggingface.co/gpt2/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://huggingface.co/gpt2-medium/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://huggingface.co/gpt2-large/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://huggingface.co/gpt2-xl/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://huggingface.co/distilgpt2/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://huggingface.co/gpt2/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://huggingface.co/gpt2-medium/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://huggingface.co/gpt2-large/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://huggingface.co/gpt2-xl/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://huggingface.co/distilgpt2/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://huggingface.co/gpt2/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://huggingface.co/gpt2-medium/resolve/main/tokenizer.json | 
下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://huggingface.co/gpt2-large/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://huggingface.co/gpt2-xl/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt2/tokenization_gpt2_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://huggingface.co/distilgpt2/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gptj/configuration_gptj.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gptj/configuration_gptj.py | https://huggingface.co/EleutherAI/gpt-j-6B/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/gpt_neo/configuration_gpt_neo.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt_neo/configuration_gpt_neo.py | https://huggingface.co/EleutherAI/gpt-neo-1.3B/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/herbert/tokenization_herbert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/herbert/tokenization_herbert.py | https://huggingface.co/allegro/herbert-base-cased/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/herbert/tokenization_herbert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/herbert/tokenization_herbert.py | https://huggingface.co/allegro/herbert-base-cased/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/herbert/tokenization_herbert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/herbert/tokenization_herbert_fast.py | https://huggingface.co/allegro/herbert-base-cased/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/herbert/tokenization_herbert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/herbert/tokenization_herbert_fast.py | https://huggingface.co/allegro/herbert-base-cased/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/hubert/configuration_hubert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/hubert/configuration_hubert.py | https://huggingface.co/facebook/hubert-base-ls960/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/ibert/configuration_ibert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/ibert/configuration_ibert.py | https://huggingface.co/kssteven/ibert-roberta-base/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/ibert/configuration_ibert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/ibert/configuration_ibert.py | 
https://huggingface.co/kssteven/ibert-roberta-large/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/ibert/configuration_ibert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/ibert/configuration_ibert.py | https://huggingface.co/kssteven/ibert-roberta-large-mnli/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/layoutlm/configuration_layoutlm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/layoutlm/configuration_layoutlm.py | https://huggingface.co/microsoft/layoutlm-base-uncased/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/layoutlm/configuration_layoutlm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/layoutlm/configuration_layoutlm.py | https://huggingface.co/microsoft/layoutlm-large-uncased/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/layoutlm/tokenization_layoutlm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/layoutlm/tokenization_layoutlm.py | https://huggingface.co/microsoft/layoutlm-base-uncased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/layoutlm/tokenization_layoutlm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/layoutlm/tokenization_layoutlm.py | https://huggingface.co/microsoft/layoutlm-large-uncased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/layoutlm/tokenization_layoutlm_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/layoutlm/tokenization_layoutlm_fast.py | https://huggingface.co/microsoft/layoutlm-base-uncased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/layoutlm/tokenization_layoutlm_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/layoutlm/tokenization_layoutlm_fast.py | https://huggingface.co/microsoft/layoutlm-large-uncased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/layoutlm/tokenization_layoutlm_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/layoutlm/tokenization_layoutlm_fast.py | https://huggingface.co/microsoft/layoutlm-base-uncased/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/layoutlm/tokenization_layoutlm_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/layoutlm/tokenization_layoutlm_fast.py | https://huggingface.co/microsoft/layoutlm-large-uncased/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/layoutlmv2/configuration_layoutlmv2.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/layoutlmv2/configuration_layoutlmv2.py | https://huggingface.co/microsoft/layoutlmv2-base-uncased/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/layoutlmv2/configuration_layoutlmv2.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/layoutlmv2/configuration_layoutlmv2.py | 
https://huggingface.co/microsoft/layoutlmv2-large-uncased/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/layoutlmv2/tokenization_layoutlmv2.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/layoutlmv2/tokenization_layoutlmv2.py | https://huggingface.co/microsoft/layoutlmv2-base-uncased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/layoutlmv2/tokenization_layoutlmv2.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/layoutlmv2/tokenization_layoutlmv2.py | https://huggingface.co/microsoft/layoutlmv2-large-uncased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/layoutlmv2/tokenization_layoutlmv2_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/layoutlmv2/tokenization_layoutlmv2_fast.py | https://huggingface.co/microsoft/layoutlmv2-base-uncased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/layoutlmv2/tokenization_layoutlmv2_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/layoutlmv2/tokenization_layoutlmv2_fast.py | https://huggingface.co/microsoft/layoutlmv2-base-uncased/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/led/configuration_led.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/led/configuration_led.py | https://huggingface.co/allenai/led-base-16384/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/led/tokenization_led.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/led/tokenization_led.py | https://huggingface.co/allenai/led-base-16384/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/led/tokenization_led.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/led/tokenization_led.py | https://huggingface.co/allenai/led-base-16384/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/led/tokenization_led.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/led/tokenization_led.py | https://huggingface.co/allenai/led-base-16384/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/led/tokenization_led_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/led/tokenization_led_fast.py | https://huggingface.co/allenai/led-base-16384/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/led/tokenization_led_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/led/tokenization_led_fast.py | https://huggingface.co/allenai/led-base-16384/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/led/tokenization_led_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/led/tokenization_led_fast.py | https://huggingface.co/allenai/led-base-16384/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/configuration_longformer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/longformer/configuration_longformer.py | https://huggingface.co/allenai/longformer-base-4096/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/configuration_longformer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/longformer/configuration_longformer.py | https://huggingface.co/allenai/longformer-large-4096/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/configuration_longformer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/longformer/configuration_longformer.py | https://huggingface.co/allenai/longformer-large-4096-finetuned-triviaqa/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/configuration_longformer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/longformer/configuration_longformer.py | https://huggingface.co/allenai/longformer-base-4096-extra.pos.embd.only/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/configuration_longformer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/longformer/configuration_longformer.py | https://huggingface.co/allenai/longformer-large-4096-extra.pos.embd.only/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer.py | https://huggingface.co/allenai/longformer-base-4096/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer.py | https://huggingface.co/allenai/longformer-large-4096/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer.py | https://huggingface.co/allenai/longformer-large-4096-finetuned-triviaqa/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer.py | https://huggingface.co/allenai/longformer-base-4096-extra.pos.embd.only/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer.py | https://huggingface.co/allenai/longformer-large-4096-extra.pos.embd.only/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer.py | 
https://huggingface.co/allenai/longformer-base-4096/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer.py | https://huggingface.co/allenai/longformer-large-4096/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer.py | https://huggingface.co/allenai/longformer-large-4096-finetuned-triviaqa/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer.py | https://huggingface.co/allenai/longformer-base-4096-extra.pos.embd.only/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer.py | https://huggingface.co/allenai/longformer-large-4096-extra.pos.embd.only/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer_fast.py | https://huggingface.co/allenai/longformer-base-4096/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer_fast.py | https://huggingface.co/allenai/longformer-large-4096/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer_fast.py | https://huggingface.co/allenai/longformer-large-4096-finetuned-triviaqa/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer_fast.py | https://huggingface.co/allenai/longformer-base-4096-extra.pos.embd.only/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer_fast.py | https://huggingface.co/allenai/longformer-large-4096-extra.pos.embd.only/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer_fast.py | https://huggingface.co/allenai/longformer-base-4096/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer_fast.py | https://huggingface.co/allenai/longformer-large-4096/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer_fast.py | https://huggingface.co/allenai/longformer-large-4096-finetuned-triviaqa/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer_fast.py | https://huggingface.co/allenai/longformer-base-4096-extra.pos.embd.only/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer_fast.py | https://huggingface.co/allenai/longformer-large-4096-extra.pos.embd.only/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer_fast.py | https://huggingface.co/allenai/longformer-base-4096/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer_fast.py | https://huggingface.co/allenai/longformer-large-4096/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer_fast.py | https://huggingface.co/allenai/longformer-large-4096-finetuned-triviaqa/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer_fast.py | https://huggingface.co/allenai/longformer-base-4096-extra.pos.embd.only/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/longformer/tokenization_longformer_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/longformer/tokenization_longformer_fast.py | https://huggingface.co/allenai/longformer-large-4096-extra.pos.embd.only/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/luke/configuration_luke.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/luke/configuration_luke.py | https://huggingface.co/studio-ousia/luke-base/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/luke/configuration_luke.py | 
MT5_ID4146_for_PyTorch/transformers/src/transformers/models/luke/configuration_luke.py | https://huggingface.co/studio-ousia/luke-large/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/luke/tokenization_luke.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/luke/tokenization_luke.py | https://huggingface.co/studio-ousia/luke-base/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/luke/tokenization_luke.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/luke/tokenization_luke.py | https://huggingface.co/studio-ousia/luke-large/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/luke/tokenization_luke.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/luke/tokenization_luke.py | https://huggingface.co/studio-ousia/luke-base/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/luke/tokenization_luke.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/luke/tokenization_luke.py | https://huggingface.co/studio-ousia/luke-large/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/luke/tokenization_luke.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/luke/tokenization_luke.py | https://huggingface.co/studio-ousia/luke-base/resolve/main/entity_vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/luke/tokenization_luke.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/luke/tokenization_luke.py | https://huggingface.co/studio-ousia/luke-large/resolve/main/entity_vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/lxmert/tokenization_lxmert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/lxmert/tokenization_lxmert.py | https://huggingface.co/unc-nlp/lxmert-base-uncased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/lxmert/tokenization_lxmert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/lxmert/tokenization_lxmert_fast.py | https://huggingface.co/unc-nlp/lxmert-base-uncased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/lxmert/tokenization_lxmert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/lxmert/tokenization_lxmert_fast.py | https://huggingface.co/unc-nlp/lxmert-base-uncased/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/m2m_100/configuration_m2m_100.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/m2m_100/configuration_m2m_100.py | https://huggingface.co/facebook/m2m100_418M/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/m2m_100/tokenization_m2m_100.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/m2m_100/tokenization_m2m_100.py | https://huggingface.co/facebook/m2m100_418M/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/m2m_100/tokenization_m2m_100.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/m2m_100/tokenization_m2m_100.py | https://huggingface.co/facebook/m2m100_1.2B/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/m2m_100/tokenization_m2m_100.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/m2m_100/tokenization_m2m_100.py | https://huggingface.co/facebook/m2m100_418M/resolve/main/sentencepiece.bpe.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/m2m_100/tokenization_m2m_100.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/m2m_100/tokenization_m2m_100.py | https://huggingface.co/facebook/m2m100_1.2B/resolve/main/sentencepiece.bpe.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/m2m_100/tokenization_m2m_100.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/m2m_100/tokenization_m2m_100.py | https://huggingface.co/facebook/m2m100_418M/resolve/main/tokenizer_config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/m2m_100/tokenization_m2m_100.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/m2m_100/tokenization_m2m_100.py | https://huggingface.co/facebook/m2m100_1.2B/resolve/main/tokenizer_config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/marian/configuration_marian.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/marian/configuration_marian.py | https://huggingface.co/Helsinki-NLP/opus-mt-en-de/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/marian/convert_marian_tatoeba_to_pytorch.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/marian/convert_marian_tatoeba_to_pytorch.py | https://datahub.io/core/language-codes/r/language-codes-3b2.csv | 下载词汇编码表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/marian/convert_marian_tatoeba_to_pytorch.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/marian/convert_marian_tatoeba_to_pytorch.py | https://cdn-datasets.huggingface.co/language_codes/iso-639-3.csv | 下载词汇编码表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/marian/convert_marian_tatoeba_to_pytorch.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/marian/convert_marian_tatoeba_to_pytorch.py | https://object.pouta.csc.fi/Tatoeba-MT-models | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/marian/convert_marian_to_pytorch.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/marian/convert_marian_to_pytorch.py | https://github.com/Helsinki-NLP/OPUS-MT-train/blob/master/models/ | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/marian/convert_marian_to_pytorch.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/marian/convert_marian_to_pytorch.py | https://github.com/Helsinki-NLP/{repo_root}/tree/master/models/{opus_name}/README.md | 下载说明文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/marian/tokenization_marian.py | 
MT5_ID4146_for_PyTorch/transformers/src/transformers/models/marian/tokenization_marian.py | https://huggingface.co/Helsinki-NLP/opus-mt-en-de/resolve/main/source.spm | 下载spm文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/marian/tokenization_marian.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/marian/tokenization_marian.py | https://huggingface.co/Helsinki-NLP/opus-mt-en-de/resolve/main/target.spm | 下载spm文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/marian/tokenization_marian.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/marian/tokenization_marian.py | https://huggingface.co/Helsinki-NLP/opus-mt-en-de/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/marian/tokenization_marian.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/marian/tokenization_marian.py | https://huggingface.co/Helsinki-NLP/opus-mt-en-de/resolve/main/tokenizer_config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/maskformer/configuration_maskformer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/maskformer/configuration_maskformer.py | https://huggingface.co/facebook/maskformer-swin-base-ade/blob/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/maskformer/convert_maskformer_original_pytorch_checkpoint_to_pytorch.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/maskformer/convert_maskformer_original_pytorch_checkpoint_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 下载数据集图片 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mbart/configuration_mbart.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mbart/configuration_mbart.py | https://huggingface.co/facebook/mbart-large-cc25/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mbart/tokenization_mbart.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mbart/tokenization_mbart.py | https://huggingface.co/facebook/mbart-large-en-ro/resolve/main/sentencepiece.bpe.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mbart/tokenization_mbart.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mbart/tokenization_mbart.py | https://huggingface.co/facebook/mbart-large-cc25/resolve/main/sentencepiece.bpe.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mbart/tokenization_mbart_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mbart/tokenization_mbart_fast.py | https://huggingface.co/facebook/mbart-large-en-ro/resolve/main/sentencepiece.bpe.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mbart/tokenization_mbart_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mbart/tokenization_mbart_fast.py | https://huggingface.co/facebook/mbart-large-cc25/resolve/main/sentencepiece.bpe.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mbart/tokenization_mbart_fast.py | 
MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mbart/tokenization_mbart_fast.py | https://huggingface.co/facebook/mbart-large-en-ro/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mbart/tokenization_mbart_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mbart/tokenization_mbart_fast.py | https://huggingface.co/facebook/mbart-large-cc25/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mbart50/tokenization_mbart50.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mbart50/tokenization_mbart50.py | https://huggingface.co/facebook/mbart-large-50-one-to-many-mmt/resolve/main/sentencepiece.bpe.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mbart50/tokenization_mbart50_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mbart50/tokenization_mbart50_fast.py | https://huggingface.co/facebook/mbart-large-50-one-to-many-mmt/resolve/main/sentencepiece.bpe.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mbart50/tokenization_mbart50_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mbart50/tokenization_mbart50_fast.py | https://huggingface.co/facebook/mbart-large-50-one-to-many-mmt/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mluke/tokenization_mluke.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mluke/tokenization_mluke.py | https://huggingface.co/studio-ousia/mluke-base/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mluke/tokenization_mluke.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mluke/tokenization_mluke.py | https://huggingface.co/studio-ousia/mluke-base/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mluke/tokenization_mluke.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mluke/tokenization_mluke.py | https://huggingface.co/studio-ousia/mluke-base/resolve/main/entity_vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mobilebert/configuration_mobilebert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mobilebert/configuration_mobilebert.py | https://huggingface.co/google/mobilebert-uncased/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mobilebert/tokenization_mobilebert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mobilebert/tokenization_mobilebert.py | https://huggingface.co/google/mobilebert-uncased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mobilebert/tokenization_mobilebert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mobilebert/tokenization_mobilebert_fast.py | https://huggingface.co/google/mobilebert-uncased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mobilebert/tokenization_mobilebert_fast.py | 
MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mobilebert/tokenization_mobilebert_fast.py | https://huggingface.co/google/mobilebert-uncased/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mpnet/configuration_mpnet.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mpnet/configuration_mpnet.py | https://huggingface.co/microsoft/mpnet-base/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mpnet/tokenization_mpnet.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mpnet/tokenization_mpnet.py | https://huggingface.co/microsoft/mpnet-base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mpnet/tokenization_mpnet_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mpnet/tokenization_mpnet_fast.py | https://huggingface.co/microsoft/mpnet-base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/mpnet/tokenization_mpnet_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mpnet/tokenization_mpnet_fast.py | https://huggingface.co/microsoft/mpnet-base/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/nystromformer/configuration_nystromformer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/nystromformer/configuration_nystromformer.py | https://huggingface.co/uw-madison/nystromformer-512/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/openai/configuration_openai.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/openai/configuration_openai.py | https://huggingface.co/openai-gpt/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/openai/tokenization_openai.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/openai/tokenization_openai.py | https://huggingface.co/openai-gpt/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/openai/tokenization_openai.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/openai/tokenization_openai.py | https://huggingface.co/openai-gpt/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/openai/tokenization_openai_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/openai/tokenization_openai_fast.py | https://huggingface.co/openai-gpt/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/openai/tokenization_openai_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/openai/tokenization_openai_fast.py | https://huggingface.co/openai-gpt/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/openai/tokenization_openai_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/openai/tokenization_openai_fast.py | https://huggingface.co/openai-gpt/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/pegasus/configuration_pegasus.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/pegasus/configuration_pegasus.py | https://huggingface.co/google/pegasus-large/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/pegasus/tokenization_pegasus.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/pegasus/tokenization_pegasus.py | https://huggingface.co/google/pegasus-xsum/resolve/main/spiece.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/pegasus/tokenization_pegasus_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/pegasus/tokenization_pegasus_fast.py | https://huggingface.co/google/pegasus-xsum/resolve/main/spiece.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/pegasus/tokenization_pegasus_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/pegasus/tokenization_pegasus_fast.py | https://huggingface.co/google/pegasus-xsum/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/perceiver/configuration_perceiver.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/perceiver/configuration_perceiver.py | https://huggingface.co/deepmind/language-perceiver/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/perceiver/convert_perceiver_haiku_to_pytorch.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/perceiver/convert_perceiver_haiku_to_pytorch.py | https://storage.googleapis.com/perceiver_io/dalmation.jpg | 下载数据集图片 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/phobert/tokenization_phobert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/phobert/tokenization_phobert.py | https://huggingface.co/vinai/phobert-base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/phobert/tokenization_phobert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/phobert/tokenization_phobert.py | https://huggingface.co/vinai/phobert-large/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/phobert/tokenization_phobert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/phobert/tokenization_phobert.py | https://huggingface.co/vinai/phobert-base/resolve/main/bpe.codes | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/phobert/tokenization_phobert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/phobert/tokenization_phobert.py | https://huggingface.co/vinai/phobert-large/resolve/main/bpe.codes | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/plbart/configuration_plbart.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/plbart/configuration_plbart.py | https://huggingface.co/uclanlp/plbart-base/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/plbart/tokenization_plbart.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | 
https://huggingface.co/uclanlp/plbart-base/resolve/main/sentencepiece.bpe.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/plbart/tokenization_plbart.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | https://huggingface.co/uclanlp/plbart-c-cpp-defect-detection/resolve/main/sentencepiece.bpe.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/plbart/tokenization_plbart.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | https://huggingface.co/uclanlp/plbart-cs-java/resolve/main/sentencepiece.bpe.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/plbart/tokenization_plbart.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | https://huggingface.co/uclanlp/plbart-en_XX-java/resolve/main/sentencepiece.bpe.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/plbart/tokenization_plbart.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | https://huggingface.co/uclanlp/plbart-go-en_XX/resolve/main/sentencepiece.bpe.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/plbart/tokenization_plbart.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | https://huggingface.co/uclanlp/plbart-java-clone-detection/resolve/main/sentencepiece.bpe.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/plbart/tokenization_plbart.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | https://huggingface.co/uclanlp/plbart-java-cs/resolve/main/sentencepiece.bpe.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/plbart/tokenization_plbart.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | https://huggingface.co/uclanlp/plbart-java-en_XX/resolve/main/sentencepiece.bpe.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/plbart/tokenization_plbart.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | https://huggingface.co/uclanlp/plbart-javascript-en_XX/resolve/main/sentencepiece.bpe.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/plbart/tokenization_plbart.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | https://huggingface.co/uclanlp/plbart-php-en_XX/resolve/main/sentencepiece.bpe.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/plbart/tokenization_plbart.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | https://huggingface.co/uclanlp/plbart-python-en_XX/resolve/main/sentencepiece.bpe.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/plbart/tokenization_plbart.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | 
https://huggingface.co/uclanlp/plbart-refine-java-medium/resolve/main/sentencepiece.bpe.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/plbart/tokenization_plbart.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | https://huggingface.co/uclanlp/plbart-refine-java-small/resolve/main/sentencepiece.bpe.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/plbart/tokenization_plbart.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | https://huggingface.co/uclanlp/plbart-ruby-en_XX/resolve/main/sentencepiece.bpe.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/poolformer/configuration_poolformer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/poolformer/configuration_poolformer.py | https://huggingface.co/sail/poolformer_s12/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/poolformer/convert_poolformer_original_to_pytorch.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/poolformer/convert_poolformer_original_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 下载数据集图片 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/prophetnet/configuration_prophetnet.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/prophetnet/configuration_prophetnet.py | https://huggingface.co/microsoft/prophetnet-large-uncased/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/prophetnet/tokenization_prophetnet.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/prophetnet/tokenization_prophetnet.py | https://huggingface.co/microsoft/prophetnet-large-uncased/resolve/main/prophetnet.tokenizer | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/qdqbert/configuration_qdqbert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/qdqbert/configuration_qdqbert.py | https://huggingface.co/bert-base-uncased/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/rag/retrieval_rag.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/rag/retrieval_rag.py | https://storage.googleapis.com/huggingface-nlp/datasets/wiki_dpr/ | 下载数据集 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/configuration_realm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/realm/configuration_realm.py | https://huggingface.co/google/realm-cc-news-pretrained-embedder/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/configuration_realm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/realm/configuration_realm.py | https://huggingface.co/google/realm-cc-news-pretrained-encoder/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/configuration_realm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/realm/configuration_realm.py | https://huggingface.co/google/realm-cc-news-pretrained-scorer/resolve/main/config.json | 
下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/configuration_realm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/realm/configuration_realm.py | https://huggingface.co/google/realm-cc-news-pretrained-openqa/aresolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/configuration_realm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/realm/configuration_realm.py | https://huggingface.co/google/realm-orqa-nq-openqa/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/configuration_realm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/realm/configuration_realm.py | https://huggingface.co/google/realm-orqa-nq-reader/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/configuration_realm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/realm/configuration_realm.py | https://huggingface.co/google/realm-orqa-wq-openqa/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/configuration_realm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/realm/configuration_realm.py | https://huggingface.co/google/realm-orqa-wq-reader/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm.py | https://huggingface.co/google/realm-cc-news-pretrained-embedder/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm.py | https://huggingface.co/google/realm-cc-news-pretrained-encoder/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm.py | https://huggingface.co/google/realm-cc-news-pretrained-scorer/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm.py | https://huggingface.co/google/realm-cc-news-pretrained-openqa/aresolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm.py | https://huggingface.co/google/realm-orqa-nq-openqa/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm.py | https://huggingface.co/google/realm-orqa-nq-reader/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm.py | 
MT5_ID4146_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm.py | https://huggingface.co/google/realm-orqa-wq-openqa/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm.py | https://huggingface.co/google/realm-orqa-wq-reader/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://huggingface.co/google/realm-cc-news-pretrained-embedder/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://huggingface.co/google/realm-cc-news-pretrained-encoder/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://huggingface.co/google/realm-cc-news-pretrained-scorer/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://huggingface.co/google/realm-cc-news-pretrained-openqa/aresolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://huggingface.co/google/realm-orqa-nq-openqa/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://huggingface.co/google/realm-orqa-nq-reader/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://huggingface.co/google/realm-orqa-wq-openqa/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://huggingface.co/google/realm-orqa-wq-reader/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://huggingface.co/google/realm-cc-news-pretrained-embedder/resolve/main/tokenizer.jsont | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | 
https://huggingface.co/google/realm-cc-news-pretrained-encoder/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://huggingface.co/google/realm-cc-news-pretrained-scorer/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://huggingface.co/google/realm-cc-news-pretrained-openqa/aresolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://huggingface.co/google/realm-orqa-nq-openqa/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://huggingface.co/google/realm-orqa-nq-reader/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://huggingface.co/google/realm-orqa-wq-openqa/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/realm/tokenization_realm_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://huggingface.co/google/realm-orqa-wq-reader/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/reformer/configuration_reformer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/reformer/configuration_reformer.py | https://huggingface.co/google/reformer-crime-and-punishment/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/reformer/configuration_reformer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/reformer/configuration_reformer.py | https://huggingface.co/google/reformer-enwik8/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/reformer/tokenization_reformer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/reformer/tokenization_reformer.py | https://huggingface.co/google/reformer-crime-and-punishment/resolve/main/spiece.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/reformer/tokenization_reformer_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/reformer/tokenization_reformer_fast.py | https://huggingface.co/google/reformer-crime-and-punishment/resolve/main/spiece.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/reformer/tokenization_reformer_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/reformer/tokenization_reformer_fast.py | 
https://huggingface.co/google/reformer-crime-and-punishment/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/rembert/configuration_rembert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/rembert/configuration_rembert.py | https://huggingface.co/google/rembert/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/rembert/tokenization_rembert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/rembert/tokenization_rembert.py | https://huggingface.co/google/rembert/resolve/main/sentencepiece.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/rembert/tokenization_rembert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/rembert/tokenization_rembert_fast.py | https://huggingface.co/google/rembert/resolve/main/sentencepiece.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/rembert/tokenization_rembert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/rembert/tokenization_rembert_fast.py | https://huggingface.co/google/rembert/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/resnet/configuration_resnet.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/resnet/configuration_resnet.py | https://huggingface.co/microsoft/resnet-50/blob/main/config.json | 下载预训练配置文件 | -| 开发引入 | / | url.ini | https://huggingface.co/distilbert-base-uncased/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/retribert/tokenization_retribert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/retribert/tokenization_retribert.py | https://huggingface.co/yjernite/retribert-base-uncased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/retribert/tokenization_retribert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/retribert/tokenization_retribert_fast.py | https://huggingface.co/yjernite/retribert-base-uncased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/retribert/tokenization_retribert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/retribert/tokenization_retribert_fast.py | https://huggingface.co/yjernite/retribert-base-uncased/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/configuration_roberta.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roberta/configuration_roberta.py | https://huggingface.co/roberta-base/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/configuration_roberta.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roberta/configuration_roberta.py | https://huggingface.co/roberta-large/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/configuration_roberta.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roberta/configuration_roberta.py | 
https://huggingface.co/roberta-large-mnli/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/configuration_roberta.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roberta/configuration_roberta.py | https://huggingface.co/distilroberta-base/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/configuration_roberta.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roberta/configuration_roberta.py | https://huggingface.co/roberta-base-openai-detector/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/configuration_roberta.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roberta/configuration_roberta.py | https://huggingface.co/roberta-large-openai-detector/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta.py | https://huggingface.co/roberta-base/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta.py | https://huggingface.co/roberta-large/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta.py | https://huggingface.co/roberta-large-mnli/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta.py | https://huggingface.co/distilroberta-base/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta.py | https://huggingface.co/roberta-base-openai-detector/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta.py | https://huggingface.co/roberta-large-openai-detector/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta.py | https://huggingface.co/roberta-base/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta.py | https://huggingface.co/roberta-large/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta.py | 
MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta.py | https://huggingface.co/roberta-large-mnli/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta.py | https://huggingface.co/distilroberta-base/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta.py | https://huggingface.co/roberta-base-openai-detector/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta.py | https://huggingface.co/roberta-large-openai-detector/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/roberta-base/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/roberta-large/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/roberta-large-mnli/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/distilroberta-base/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/roberta-base-openai-detector/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/roberta-large-openai-detector/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/roberta-base/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/roberta-large/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/roberta-large-mnli/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/distilroberta-base/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/roberta-base-openai-detector/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/roberta-large-openai-detector/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/roberta-base/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/roberta-large/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/roberta-large-mnli/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/distilroberta-base/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/roberta-base-openai-detector/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roberta/tokenization_roberta_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://huggingface.co/roberta-large-openai-detector/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/configuration_roformer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roformer/configuration_roformer.py | https://huggingface.co/junnyu/roformer_chinese_small/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/configuration_roformer.py | 
MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roformer/configuration_roformer.py | https://huggingface.co/junnyu/roformer_chinese_base/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/configuration_roformer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roformer/configuration_roformer.py | https://huggingface.co/junnyu/roformer_chinese_char_small/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/configuration_roformer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roformer/configuration_roformer.py | https://huggingface.co/junnyu/roformer_chinese_char_base/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/configuration_roformer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roformer/configuration_roformer.py | https://huggingface.co/junnyu/roformer_small_discriminator/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/configuration_roformer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roformer/configuration_roformer.py | https://huggingface.co/junnyu/roformer_small_generator/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/tokenization_roformer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roformer/tokenization_roformer.py | https://huggingface.co/junnyu/roformer_chinese_small/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/tokenization_roformer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roformer/tokenization_roformer.py | https://huggingface.co/junnyu/roformer_chinese_base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/tokenization_roformer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roformer/tokenization_roformer.py | https://huggingface.co/junnyu/roformer_chinese_char_small/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/tokenization_roformer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roformer/tokenization_roformer.py | https://huggingface.co/junnyu/roformer_chinese_char_base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/tokenization_roformer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roformer/tokenization_roformer.py | https://huggingface.co/junnyu/roformer_small_discriminator/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/tokenization_roformer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roformer/tokenization_roformer.py | https://huggingface.co/junnyu/roformer_small_generator/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/tokenization_roformer_fast.py | 
MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roformer/tokenization_roformer_fast.py | https://huggingface.co/junnyu/roformer_chinese_small/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/tokenization_roformer_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roformer/tokenization_roformer_fast.py | https://huggingface.co/junnyu/roformer_chinese_base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/tokenization_roformer_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roformer/tokenization_roformer_fast.py | https://huggingface.co/junnyu/roformer_chinese_char_small/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/tokenization_roformer_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roformer/tokenization_roformer_fast.py | https://huggingface.co/junnyu/roformer_chinese_char_base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/tokenization_roformer_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roformer/tokenization_roformer_fast.py | https://huggingface.co/junnyu/roformer_small_discriminator/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/roformer/tokenization_roformer_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roformer/tokenization_roformer_fast.py | https://huggingface.co/junnyu/roformer_small_generator/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/segformer/configuration_segformer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/segformer/configuration_segformer.py | https://huggingface.co/nvidia/segformer-b0-finetuned-ade-512-512/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/segformer/convert_segformer_original_to_pytorch.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/segformer/convert_segformer_original_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 下载数据集图片 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/sew/configuration_sew.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/sew/configuration_sew.py | https://huggingface.co/asapp/sew-tiny-100k/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/sew_d/configuration_sew_d.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/sew_d/configuration_sew_d.py | https://huggingface.co/asapp/sew-d-tiny-100k/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/speech_to_text/configuration_speech_to_text.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/speech_to_text/configuration_speech_to_text.py | https://huggingface.co/facebook/s2t-small-librispeech-asr/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/speech_to_text/tokenization_speech_to_text.py | 
MT5_ID4146_for_PyTorch/transformers/src/transformers/models/speech_to_text/tokenization_speech_to_text.py | https://huggingface.co/facebook/s2t-small-librispeech-asr/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/speech_to_text/tokenization_speech_to_text.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/speech_to_text/tokenization_speech_to_text.py | https://huggingface.co/facebook/s2t-small-librispeech-asr/resolve/main/sentencepiece.bpe.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/speech_to_text_2/configuration_speech_to_text_2.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/speech_to_text_2/configuration_speech_to_text_2.py | https://huggingface.co/facebook/s2t-wav2vec2-large-en-de/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/speech_to_text_2/tokenization_speech_to_text_2.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/speech_to_text_2/tokenization_speech_to_text_2.py | https://huggingface.co/facebook/s2t-wav2vec2-large-en-de/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/speech_to_text_2/tokenization_speech_to_text_2.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/speech_to_text_2/tokenization_speech_to_text_2.py | https://huggingface.co/facebook/s2t-wav2vec2-large-en-de/resolve/main/tokenizer_config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/speech_to_text_2/tokenization_speech_to_text_2.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/speech_to_text_2/tokenization_speech_to_text_2.py | https://huggingface.co/facebook/s2t-wav2vec2-large-en-de/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/splinter/configuration_splinter.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/splinter/configuration_splinter.py | https://huggingface.co/tau/splinter-base/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/splinter/configuration_splinter.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/splinter/configuration_splinter.py | https://huggingface.co/tau/splinter-base-qass/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/splinter/configuration_splinter.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/splinter/configuration_splinter.py | https://huggingface.co/tau/splinter-large/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/splinter/configuration_splinter.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/splinter/configuration_splinter.py | https://huggingface.co/tau/splinter-large-qass/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/splinter/tokenization_splinter.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/splinter/tokenization_splinter.py | https://huggingface.co/tau/splinter-base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/splinter/tokenization_splinter.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/splinter/tokenization_splinter.py | https://huggingface.co/tau/splinter-base-qass/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/splinter/tokenization_splinter.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/splinter/tokenization_splinter.py | https://huggingface.co/tau/splinter-large/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/splinter/tokenization_splinter.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/splinter/tokenization_splinter.py | https://huggingface.co/tau/splinter-large-qass/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/splinter/tokenization_splinter_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/splinter/tokenization_splinter_fast.py | https://huggingface.co/tau/splinter-base/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/splinter/tokenization_splinter_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/splinter/tokenization_splinter_fast.py | https://huggingface.co/tau/splinter-base-qass/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/splinter/tokenization_splinter_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/splinter/tokenization_splinter_fast.py | https://huggingface.co/tau/splinter-large/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/splinter/tokenization_splinter_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/splinter/tokenization_splinter_fast.py | https://huggingface.co/tau/splinter-large-qass/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/squeezebert/configuration_squeezebert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/squeezebert/configuration_squeezebert.py | https://huggingface.co/squeezebert/squeezebert-uncased/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/squeezebert/configuration_squeezebert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/squeezebert/configuration_squeezebert.py | https://huggingface.co/squeezebert/squeezebert-mnli/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/squeezebert/configuration_squeezebert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/squeezebert/configuration_squeezebert.py | https://huggingface.co/squeezebert/squeezebert-mnli-headless/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/squeezebert/tokenization_squeezebert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/squeezebert/tokenization_squeezebert.py | https://huggingface.co/squeezebert/squeezebert-uncased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/squeezebert/tokenization_squeezebert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/squeezebert/tokenization_squeezebert.py | https://huggingface.co/squeezebert/squeezebert-mnli/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/squeezebert/tokenization_squeezebert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/squeezebert/tokenization_squeezebert.py | https://huggingface.co/squeezebert/squeezebert-mnli-headless/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/squeezebert/tokenization_squeezebert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/squeezebert/tokenization_squeezebert_fast.py | https://huggingface.co/squeezebert/squeezebert-uncased/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/squeezebert/tokenization_squeezebert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/squeezebert/tokenization_squeezebert_fast.py | https://huggingface.co/squeezebert/squeezebert-mnli/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/squeezebert/tokenization_squeezebert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/squeezebert/tokenization_squeezebert_fast.py | https://huggingface.co/squeezebert/squeezebert-mnli-headless/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/squeezebert/tokenization_squeezebert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/squeezebert/tokenization_squeezebert_fast.py | https://huggingface.co/squeezebert/squeezebert-uncased/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/squeezebert/tokenization_squeezebert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/squeezebert/tokenization_squeezebert_fast.py | https://huggingface.co/squeezebert/squeezebert-mnli/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/squeezebert/tokenization_squeezebert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/squeezebert/tokenization_squeezebert_fast.py | https://huggingface.co/squeezebert/squeezebert-mnli-headless/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/swin/configuration_swin.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/swin/configuration_swin.py | https://huggingface.co/microsoft/swin-tiny-patch4-window7-224/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/swin/convert_swin_timm_to_pytorch.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/swin/convert_swin_timm_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 下载数据集图片 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/configuration_t5.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/configuration_t5.py | https://huggingface.co/t5-small/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/configuration_t5.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/configuration_t5.py | https://huggingface.co/t5-base/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/configuration_t5.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/configuration_t5.py | https://huggingface.co/t5-large/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/configuration_t5.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/configuration_t5.py | https://huggingface.co/t5-3b/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/configuration_t5.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/configuration_t5.py | https://huggingface.co/t5-11b/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/tokenization_t5.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5.py | https://huggingface.co/t5-small/resolve/main/spiece.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/tokenization_t5.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5.py | https://huggingface.co/t5-base/resolve/main/spiece.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/tokenization_t5.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5.py | https://huggingface.co/t5-large/resolve/main/spiece.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/tokenization_t5.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5.py | https://huggingface.co/t5-3b/resolve/main/spiece.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/tokenization_t5.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5.py | https://huggingface.co/t5-11b/resolve/main/spiece.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/tokenization_t5_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5_fast.py | https://huggingface.co/t5-small/resolve/main/spiece.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/tokenization_t5_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5_fast.py | https://huggingface.co/t5-base/resolve/main/spiece.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/tokenization_t5_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5_fast.py | https://huggingface.co/t5-large/resolve/main/spiece.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/tokenization_t5_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5_fast.py | https://huggingface.co/t5-3b/resolve/main/spiece.model | 下载预训练配置文件 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/tokenization_t5_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5_fast.py | https://huggingface.co/t5-11b/resolve/main/spiece.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/tokenization_t5_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5_fast.py | https://huggingface.co/t5-small/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/tokenization_t5_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5_fast.py | https://huggingface.co/t5-base/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/tokenization_t5_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5_fast.py | https://huggingface.co/t5-large/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/tokenization_t5_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5_fast.py | https://huggingface.co/t5-3b/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/t5/tokenization_t5_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5_fast.py | https://huggingface.co/t5-11b/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/configuration_tapas.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/tapas/configuration_tapas.py | https://huggingface.co/google/tapas-base-finetuned-sqa/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/configuration_tapas.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/tapas/configuration_tapas.py | https://huggingface.co/google/tapas-base-finetuned-wtq/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/configuration_tapas.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/tapas/configuration_tapas.py | https://huggingface.co/google/tapas-base-finetuned-wikisql-supervised/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/configuration_tapas.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/tapas/configuration_tapas.py | https://huggingface.co/google/tapas-base-finetuned-tabfact/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-large-finetuned-sqa/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-large-finetuned-wtq/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-large-finetuned-wikisql-supervised/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-large-finetuned-tabfact/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-base-finetuned-sqa/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-base-finetuned-wtq/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-base-finetuned-wikisql-supervised/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-base-finetuned-tabfact/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-medium-finetuned-sqa/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-medium-finetuned-wtq/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-medium-finetuned-wikisql-supervised/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-medium-finetuned-tabfact/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-small-finetuned-sqa/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | 
https://huggingface.co/google/tapas-small-finetuned-wtq/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-small-finetuned-wikisql-supervised/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-small-finetuned-tabfact/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-tiny-finetuned-sqa/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-tiny-finetuned-wtq/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-tiny-finetuned-wikisql-supervised/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-tiny-finetuned-tabfact/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-mini-finetuned-sqa/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-mini-finetuned-wtq/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-mini-finetuned-wikisql-supervised/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/tapas/tokenization_tapas.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://huggingface.co/google/tapas-mini-finetuned-tabfact/resolve/main/vocab.txt | 下载词表文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/transfo_xl/configuration_transfo_xl.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/transfo_xl/configuration_transfo_xl.py | https://huggingface.co/transfo-xl-wt103/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/transfo_xl/tokenization_transfo_xl.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/transfo_xl/tokenization_transfo_xl.py | https://huggingface.co/transfo-xl-wt103/resolve/main/vocab.pkl | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/transfo_xl/tokenization_transfo_xl.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/transfo_xl/tokenization_transfo_xl.py | https://huggingface.co/transfo-xl-wt103/resolve/main/corpus.bin | 下载权重文件 | -| 开发引入 | / | url.ini | https://huggingface.co/microsoft/trocr-base/resolve/main/config.json | 下载预训练配置文件 | -| 开发引入 | / | url.ini | https://huggingface.co/facebook/unispeech-base-960h/resolve/main/config.json | 下载预训练配置文件 | -| 开发引入 | / | url.ini | https://huggingface.co/facebook/unispeech_sat-base-960h/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/van/configuration_van.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/van/configuration_van.py | https://huggingface.co/Visual-Attention-Network/van-base/blob/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/van/convert_van_to_pytorch.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/van/convert_van_to_pytorch.py | https://huggingface.co/Visual-Attention-Network/VAN-Tiny-original/resolve/main/van_tiny_754.pth.tar | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/van/convert_van_to_pytorch.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/van/convert_van_to_pytorch.py | https://huggingface.co/Visual-Attention-Network/VAN-Small-original/resolve/main/van_small_811.pth.tar | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/van/convert_van_to_pytorch.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/van/convert_van_to_pytorch.py | https://huggingface.co/Visual-Attention-Network/VAN-Base-original/resolve/main/van_base_828.pth.tar | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/van/convert_van_to_pytorch.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/van/convert_van_to_pytorch.py | https://huggingface.co/Visual-Attention-Network/VAN-Large-original/resolve/main/van_large_839.pth.tar | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/vilt/configuration_vilt.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vilt/configuration_vilt.py | https://huggingface.co/dandelin/vilt-b32-mlm/blob/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/vilt/convert_vilt_original_to_pytorch.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vilt/convert_vilt_original_to_pytorch.py | https://lil.nlp.cornell.edu/nlvr/exs/ex0_0.jpg | 下载数据集图片 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/vilt/convert_vilt_original_to_pytorch.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vilt/convert_vilt_original_to_pytorch.py | https://lil.nlp.cornell.edu/nlvr/exs/ex0_0.jpg | 下载数据集图片 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/vilt/convert_vilt_original_to_pytorch.py | 
MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vilt/convert_vilt_original_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 下载数据集图片 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/vilt/convert_vilt_original_to_pytorch.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vilt/convert_vilt_original_to_pytorch.py | https://github.com/dandelin/ViLT/releases/download/200k/vilt_200k_mlm_itm.ckpt | 下载权重文件 | -| 开发引入 | / | url.ini | https://fki.tic.heia-fr.ch/static/img/a01-122-02-00.jpg | 下载数据集图片 | -| 开发引入 | / | url.ini | https://www.researchgate.net/profile/Dinh-Sang/publication/338099565/figure/fig8/AS:840413229350922@1577381536857/An-receipt-example-in-the-SROIE-2019-dataset_Q640.jpg | 下载数据集图片 | -| 开发引入 | / | url.ini | https://layoutlm.blob.core.windows.net/trocr/model_zoo/fairseq/trocr-base-handwritten.pt | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/visual_bert/configuration_visual_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/visual_bert/configuration_visual_bert.py | https://huggingface.co/uclanlp/visualbert-vqa/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/visual_bert/configuration_visual_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/visual_bert/configuration_visual_bert.py | https://huggingface.co/uclanlp/visualbert-vqa-pre/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/visual_bert/configuration_visual_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/visual_bert/configuration_visual_bert.py | https://huggingface.co/uclanlp/visualbert-vqa-coco-pre/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/visual_bert/configuration_visual_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/visual_bert/configuration_visual_bert.py | https://huggingface.co/uclanlp/visualbert-vcr/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/visual_bert/configuration_visual_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/visual_bert/configuration_visual_bert.py | https://huggingface.co/uclanlp/visualbert-vcr-pre/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/visual_bert/configuration_visual_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/visual_bert/configuration_visual_bert.py | https://huggingface.co/uclanlp/visualbert-vcr-coco-pre/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/visual_bert/configuration_visual_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/visual_bert/configuration_visual_bert.py | https://huggingface.co/uclanlp/visualbert-nlvr2/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/visual_bert/configuration_visual_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/visual_bert/configuration_visual_bert.py | https://huggingface.co/uclanlp/visualbert-nlvr2-pre/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/visual_bert/configuration_visual_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/visual_bert/configuration_visual_bert.py | https://huggingface.co/uclanlp/visualbert-nlvr2-coco-pre/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/vit/configuration_vit.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vit/configuration_vit.py | https://huggingface.co/vit-base-patch16-224/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/vit/convert_dino_to_pytorch.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vit/convert_dino_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 下载数据集图片 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/vit/convert_vit_timm_to_pytorch.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vit/convert_vit_timm_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 下载数据集图片 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/vit_mae/configuration_vit_mae.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vit_mae/configuration_vit_mae.py | https://huggingface.co/facebook/vit-mae-base/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/vit_mae/convert_vit_mae_to_pytorch.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vit_mae/convert_vit_mae_to_pytorch.py | https://user-images.githubusercontent.com/11435359/147738734-196fd92f-9260-48d5-ba7e-bf103d29364d.jpg | 下载数据集图片 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/vit_mae/convert_vit_mae_to_pytorch.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vit_mae/convert_vit_mae_to_pytorch.py | https://dl.fbaipublicfiles.com/mae/visualize/mae_visualize_vit_base.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/wav2vec2/configuration_wav2vec2.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wav2vec2/configuration_wav2vec2.py | https://huggingface.co/facebook/wav2vec2-base-960h/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/wav2vec2/tokenization_wav2vec2.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wav2vec2/tokenization_wav2vec2.py | https://huggingface.co/facebook/wav2vec2-base-960h/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/wav2vec2/tokenization_wav2vec2.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wav2vec2/tokenization_wav2vec2.py | https://huggingface.co/facebook/wav2vec2-base-960h/resolve/main/tokenizer_config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/wav2vec2/tokenization_wav2vec2.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wav2vec2/tokenization_wav2vec2.py | https://huggingface.co/facebook/wav2vec2-base-960h/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/wav2vec2/tokenization_wav2vec2.py | 
MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wav2vec2/tokenization_wav2vec2.py | https://huggingface.co/facebook/wav2vec2-base-960h/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/wav2vec2_phoneme/tokenization_wav2vec2_phoneme.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wav2vec2_phoneme/tokenization_wav2vec2_phoneme.py | https://huggingface.co/facebook/wav2vec2-lv-60-espeak-cv-ft/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/wav2vec2_phoneme/tokenization_wav2vec2_phoneme.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wav2vec2_phoneme/tokenization_wav2vec2_phoneme.py | https://huggingface.co/facebook/wav2vec2-lv-60-espeak-cv-ft/resolve/main/tokenizer_config.json | 下载预训练配置文件 | -| 开发引入 | / | url.ini | https://huggingface.co/facebook/wavlm-base-960h/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xglm/configuration_xglm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xglm/configuration_xglm.py | https://huggingface.co/facebook/xglm-564M/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xglm/tokenization_xglm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xglm/tokenization_xglm.py | https://huggingface.co/facebook/xglm-564M/resolve/main/sentencepiece.bpe.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xglm/tokenization_xglm_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xglm/tokenization_xglm_fast.py | https://huggingface.co/facebook/xglm-564M/resolve/main/sentencepiece.bpe.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xglm/tokenization_xglm_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xglm/tokenization_xglm_fast.py | https://huggingface.co/facebook/xglm-564M/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/configuration_xlm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm/configuration_xlm.py | https://huggingface.co/xlm-mlm-en-2048/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/configuration_xlm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm/configuration_xlm.py | https://huggingface.co/xlm-mlm-ende-1024/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/configuration_xlm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm/configuration_xlm.py | https://huggingface.co/xlm-mlm-enfr-1024/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/configuration_xlm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm/configuration_xlm.py | https://huggingface.co/xlm-mlm-enro-1024/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/configuration_xlm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm/configuration_xlm.py | 
https://huggingface.co/xlm-mlm-tlm-xnli15-1024/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/configuration_xlm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm/configuration_xlm.py | https://huggingface.co/xlm-mlm-xnli15-1024/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/configuration_xlm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm/configuration_xlm.py | https://huggingface.co/xlm-clm-enfr-1024/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/configuration_xlm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm/configuration_xlm.py | https://huggingface.co/xlm-clm-ende-1024/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/configuration_xlm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm/configuration_xlm.py | https://huggingface.co/xlm-mlm-17-1280/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/configuration_xlm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm/configuration_xlm.py | https://huggingface.co/xlm-mlm-100-1280/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-en-2048/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-ende-1024/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-enfr-1024/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-enro-1024/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-tlm-xnli15-1024/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-xnli15-1024/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-clm-enfr-1024/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-clm-ende-1024/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-17-1280/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-100-1280/resolve/main/vocab.json | 下载词汇文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-en-2048/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-ende-1024/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-enfr-1024/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-enro-1024/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-tlm-xnli15-1024/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-xnli15-1024/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-clm-enfr-1024/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-clm-ende-1024/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://huggingface.co/xlm-mlm-17-1280/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm/tokenization_xlm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | 
https://huggingface.co/xlm-mlm-100-1280/resolve/main/merges.txt | 下载合并表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_prophetnet/configuration_xlm_prophetnet.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm_prophetnet/configuration_xlm_prophetnet.py | https://huggingface.co/microsoft/xprophetnet-large-wiki100-cased/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_prophetnet/tokenization_xlm_prophetnet.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm_prophetnet/tokenization_xlm_prophetnet.py | https://huggingface.co/microsoft/xprophetnet-large-wiki100-cased/resolve/main/prophetnet.tokenizer | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/configuration_xlm_roberta.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm_roberta/configuration_xlm_roberta.py | https://huggingface.co/xlm-roberta-base/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/configuration_xlm_roberta.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm_roberta/configuration_xlm_roberta.py | https://huggingface.co/xlm-roberta-large/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/configuration_xlm_roberta.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm_roberta/configuration_xlm_roberta.py | https://huggingface.co/xlm-roberta-large-finetuned-conll02-dutch/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/configuration_xlm_roberta.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm_roberta/configuration_xlm_roberta.py | https://huggingface.co/xlm-roberta-large-finetuned-conll02-spanish/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/configuration_xlm_roberta.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm_roberta/configuration_xlm_roberta.py | https://huggingface.co/xlm-roberta-large-finetuned-conll03-english/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/configuration_xlm_roberta.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm_roberta/configuration_xlm_roberta.py | https://huggingface.co/xlm-roberta-large-finetuned-conll03-german/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | https://huggingface.co/xlm-roberta-base/resolve/main/sentencepiece.bpe.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | https://huggingface.co/xlm-roberta-large/resolve/main/sentencepiece.bpe.model | 下载预训练配置文件 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | https://huggingface.co/xlm-roberta-large-finetuned-conll02-dutch/resolve/main/sentencepiece.bpe.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | https://huggingface.co/xlm-roberta-large-finetuned-conll02-spanish/resolve/main/sentencepiece.bpe.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | https://huggingface.co/xlm-roberta-large-finetuned-conll03-english/resolve/main/sentencepiece.bpe.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | https://huggingface.co/xlm-roberta-large-finetuned-conll03-german/resolve/main/sentencepiece.bpe.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | https://huggingface.co/xlm-roberta-base/resolve/main/sentencepiece.bpe.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | https://huggingface.co/xlm-roberta-large/resolve/main/sentencepiece.bpe.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | https://huggingface.co/xlm-roberta-large-finetuned-conll02-dutch/resolve/main/sentencepiece.bpe.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | https://huggingface.co/xlm-roberta-large-finetuned-conll02-spanish/resolve/main/sentencepiece.bpe.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | https://huggingface.co/xlm-roberta-large-finetuned-conll03-english/resolve/main/sentencepiece.bpe.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | https://huggingface.co/xlm-roberta-large-finetuned-conll03-german/resolve/main/sentencepiece.bpe.model | 下载预训练配置文件 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | https://huggingface.co/xlm-roberta-base/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | https://huggingface.co/xlm-roberta-large/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | https://huggingface.co/xlm-roberta-large-finetuned-conll02-dutch/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | https://huggingface.co/xlm-roberta-large-finetuned-conll02-spanish/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | https://huggingface.co/xlm-roberta-large-finetuned-conll03-english/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | https://huggingface.co/xlm-roberta-large-finetuned-conll03-german/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta_xl/configuration_xlm_roberta_xl.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm_roberta_xl/configuration_xlm_roberta_xl.py | https://huggingface.co/facebook/xlm-roberta-xl/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlm_roberta_xl/configuration_xlm_roberta_xl.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm_roberta_xl/configuration_xlm_roberta_xl.py | https://huggingface.co/facebook/xlm-roberta-xxl/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlnet/configuration_xlnet.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlnet/configuration_xlnet.py | https://huggingface.co/xlnet-base-cased/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlnet/configuration_xlnet.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlnet/configuration_xlnet.py | https://huggingface.co/xlnet-large-cased/resolve/main/config.json | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlnet/tokenization_xlnet.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlnet/tokenization_xlnet.py | 
https://huggingface.co/xlnet-base-cased/resolve/main/spiece.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlnet/tokenization_xlnet.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlnet/tokenization_xlnet.py | https://huggingface.co/xlnet-large-cased/resolve/main/spiece.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlnet/tokenization_xlnet_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlnet/tokenization_xlnet_fast.py | https://huggingface.co/xlnet-base-cased/resolve/main/spiece.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlnet/tokenization_xlnet_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlnet/tokenization_xlnet_fast.py | https://huggingface.co/xlnet-large-cased/resolve/main/spiece.model | 下载预训练配置文件 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlnet/tokenization_xlnet_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlnet/tokenization_xlnet_fast.py | https://huggingface.co/xlnet-base-cased/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/xlnet/tokenization_xlnet_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlnet/tokenization_xlnet_fast.py | https://huggingface.co/xlnet-large-cased/resolve/main/tokenizer.json | 下载配置信息 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/yoso/configuration_yoso.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/yoso/configuration_yoso.py | https://huggingface.co/uw-madison/yoso-4096/resolve/main/config.json | 下载预训练配置文件 | -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/.circleci/config.yml |https://github.com/GoogleCloudPlatform/ml-testing-accelerators.git|ml-testing-accelerators在开源社区上的git链接引用| -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/.circleci/config.yml |https://pytorch-geometric.com/whl/torch-1.11.0+cpu.html|config.yml文件中torch包的html下载链接| -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/.circleci/config.yml |https://github.com/kpu/kenlm/archive/master.zip|config.yml文件中kenlm库的zip下载链接| -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/.circleci/config.yml |https://pytorch-geometric.com/whl/torch-1.11.0+cpu.html|config.yml文件中torch包的html下载链接| -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/.circleci/config.yml |https://github.com/kpu/kenlm/archive/master.zip|config.yml文件中kenlm库的zip下载链接| -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/.circleci/config.yml |https://pytorch-geometric.com/whl/torch-1.11.0+cpu.html|config.yml文件中torch包的html下载链接| -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/.circleci/config.yml |https://github.com/kpu/kenlm/archive/master.zip|config.yml文件中kenlm库的zip下载链接| -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/.circleci/config.yml |https://pytorch-geometric.com/whl/torch-1.11.0+cpu.html|config.yml文件中torch包的html下载链接| -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/.circleci/config.yml |https://github.com/kpu/kenlm/archive/master.zip|config.yml文件中kenlm库的zip下载链接| -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/.circleci/config.yml |https://pytorch-geometric.com/whl/torch-1.11.0+cpu.html|config.yml文件中torch包的html下载链接| -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/.circleci/config.yml 
|https://github.com/kpu/kenlm/archive/master.zip|config.yml文件中kenlm库的zip下载链接| -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/.circleci/config.yml |https://pytorch-geometric.com/whl/torch-1.11.0+cpu.html|config.yml文件中torch包的html下载链接| -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/.circleci/config.yml |https://github.com/kpu/kenlm/archive/master.zip|config.yml文件中kenlm库的zip下载链接| -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/.circleci/config.yml |https://github.com/kpu/kenlm/archive/master.zip|config.yml文件中kenlm库的zip下载链接| -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/.circleci/config.yml |https://github.com/kpu/kenlm/archive/master.zip|config.yml文件中kenlm库的zip下载链接| -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/.circleci/config.yml |https://github.com/kpu/kenlm/archive/master.zip|config.yml文件中kenlm库的zip下载链接| -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/.circleci/config.yml |https://github.com/kpu/kenlm/archive/master.zip|config.yml文件中kenlm库的zip下载链接| -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/.circleci/config.yml |https://pytorch-geometric.com/whl/torch-1.11.0+cpu.html|config.yml文件中torch包的html下载链接| -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/.circleci/config.yml |https://github.com/kpu/kenlm/archive/master.zip|config.yml文件中kenlm库的zip下载链接| -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/.circleci/config.yml |https://pytorch-geometric.com/whl/torch-1.11.0+cpu.html|config.yml文件中torch包的html下载链接| -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/.circleci/config.yml |https://github.com/kpu/kenlm/archive/master.zip|config.yml文件中kenlm库的zip下载链接| -| 开源代码引入| https://github.com/huggingface/transformers/blob/main/.circleci/create_circleci_config.py | MT5_ID4146_for_PyTorch/transformers/.circleci/config.yml | ci@dummy.com | config.yml中对usr.email的配置选项| -| 开源代码引入| https://github.com/huggingface/transformers/blob/main/.circleci/create_circleci_config.py | MT5_ID4146_for_PyTorch/transformers/.circleci/config.yml | ci@dummy.com | config.yml中对usr.email的配置选项| -| 开发引入 | / | MT5_ID4146_for_PyTorch/url.ini |thomas@huggingface.co |setuptools的author_email配置选项| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/fsner/setup.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/fsner/setup.py | https://github.com/huggingface/transformers/tree/master/examples/research_projects/fsner |setuptools的url配置选项| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/tasks/document_question_answering.mdx|MT5_ID4146_for_PyTorch/transformers/.circleci/config.yml |https://github.com/facebookresearch/detectron2.git|detectron2模型在开源社区中的源码链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/update_metadata.py|MT5_ID4146_for_PyTorch/transformers/CITATION.cff |https://github.com/huggingface/transformers|CITATION文件中url的配置选项| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/README_zh-hant.md|MT5_ID4146_for_PyTorch/transformers/CITATION.cff |https://www.aclweb.org/anthology/2020.emnlp-demos.6|CITATION文件中url的配置选项| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/update_metadata.py|MT5_ID4146_for_PyTorch/transformers/setup.py | https://github.com/huggingface/transformers | setuptools中url的配置选项 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/update_metadata.py|MT5_ID4146_for_PyTorch/transformers/docker/transformers-all-latest-gpu/Dockerfile |https://github.com/huggingface/transformers|Dockerfile文件中transformers的git链接| -| 开发引入 | / 
|MT5_ID4146_for_PyTorch/transformers/docker/transformers-all-latest-gpu/Dockerfile |https://data.pyg.org/whl/torch |Dockerfile文件中torch包的html链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/tasks/document_question_answering.mdx|MT5_ID4146_for_PyTorch/transformers/docker/transformers-all-latest-gpu/Dockerfile |https://github.com/facebookresearch/detectron2.git|detectron2模型在开源社区中的源码链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/update_metadata.py|MT5_ID4146_for_PyTorch/transformers/docker/transformers-doc-builder/Dockerfile |https://github.com/huggingface/transformers|Dockerfile文件中transformers的git链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/check_repo.py|MT5_ID4146_for_PyTorch/transformers/docker/transformers-doc-builder/Dockerfile |https://github.com/huggingface/doc-builder|Dockerfile文件中doc-builder的git链接| -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/docker/transformers-doc-builder/Dockerfile |https://data.pyg.org/whl/torch|Dockerfile文件中torch的html链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/tasks/document_question_answering.mdx|MT5_ID4146_for_PyTorch/transformers/docker/transformers-doc-builder/Dockerfile |https://github.com/facebookresearch/detectron2.git|detectron2模型在开源社区中的源码链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/utils/import_utils.py|MT5_ID4146_for_PyTorch/transformers/docker/transformers-doc-builder/Dockerfile |https://pypi.ngc.nvidia.com|Dockerfile文件中pytorch-quantization的url链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/benchmark/benchmark.py|MT5_ID4146_for_PyTorch/transformers/docker/transformers-gpu/Dockerfile |https://github.com/NVIDIA/apex|Dockerfile文件中apex的git链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/update_metadata.py|MT5_ID4146_for_PyTorch/transformers/docker/transformers-pytorch-deepspeed-latest-gpu/Dockerfile |https://github.com/huggingface/transformers|Dockerfile文件中transformers的git链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/deepspeed/test_deepspeed.py|MT5_ID4146_for_PyTorch/transformers/docker/transformers-pytorch-deepspeed-latest-gpu/Dockerfile |https://github.com/microsoft/DeepSpeed|Dockerfile文件中DeepSpeed的git链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/update_metadata.py|MT5_ID4146_for_PyTorch/transformers/docker/transformers-pytorch-gpu/Dockerfile |https://github.com/huggingface/transformers |Dockerfile文件中transformers的git链接| -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/docker/transformers-pytorch-gpu/Dockerfile |https://data.pyg.org/whl/torch |Dockerfile文件中torch包的html链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/en/tasks/document_question_answering.mdx|MT5_ID4146_for_PyTorch/transformers/docker/transformers-pytorch-gpu/Dockerfile |https://github.com/facebookresearch/detectron2.git|detectron2模型在开源社区中的源码链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docker/transformers-pytorch-tpu/Dockerfile|MT5_ID4146_for_PyTorch/transformers/docker/transformers-pytorch-tpu/Dockerfile |https://repo.anaconda.com/miniconda/Miniconda3-4.7.12-Linux-x86_64.sh|Dockerfile文件中miniconda在开源社区中的sh链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/sagemaker/scripts/tensorflow/requirements.txt|MT5_ID4146_for_PyTorch/transformers/docker/transformers-pytorch-tpu/Dockerfile 
|https://github.com/huggingface/transformers.git|Dockerfile文件中transformers的git链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/update_metadata.py|MT5_ID4146_for_PyTorch/transformers/docker/transformers-tensorflow-gpu/Dockerfile |https://github.com/huggingface/transformers|Dockerfile文件中transformers的git链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/jax-projects/hybrid_clip/requirements.txt|MT5_ID4146_for_PyTorch/transformers/examples/flax/vision/requirements.txt |https://download.pytorch.org/whl/torch_stable.html|requirements文件中torch_stable在开源社区中的html链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/jax-projects/hybrid_clip/requirements.txt|MT5_ID4146_for_PyTorch/transformers/examples/flax/vision/requirements.txt |https://download.pytorch.org/whl/torch_stable.html|requirements文件中torch_stable在开源社区中的html链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/vit_msn/modeling_vit_msn.py|MT5_ID4146_for_PyTorch/transformers/examples/flax/vision/run_image_classification.py |https://huggingface.co/models?filter=vit|flax模型开源社区上的url链接配置| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/token-classification/run.sh|MT5_ID4146_for_PyTorch/transformers/examples/legacy/pytorch-lightning/run_ner.sh |https://drive.google.com/uc?export=download&id=1Jjhbal535VVz2ap4v4r_rN1UEHTdLK5P|将更改迁移到NLP数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/token-classification/run.sh|MT5_ID4146_for_PyTorch/transformers/examples/legacy/pytorch-lightning/run_ner.sh |https://drive.google.com/uc?export=download&id=1ZfRcQThdtAR5PPRjIDtrVP7BtXSCUBbm|将更改迁移到NLP数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/token-classification/run.sh|MT5_ID4146_for_PyTorch/transformers/examples/legacy/pytorch-lightning/run_ner.sh |https://drive.google.com/uc?export=download&id=1u9mb7kNJHWQCWyweMDRMuTFoOHOfeBTH|将更改迁移到NLP数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/token-classification/run_pos.sh|MT5_ID4146_for_PyTorch/transformers/examples/legacy/pytorch-lightning/run_pos.sh |https://github.com/UniversalDependencies/UD_English-EWT/raw/master/en_ewt-ud-dev.conllu|下载dev数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/token-classification/run_pos.sh|MT5_ID4146_for_PyTorch/transformers/examples/legacy/pytorch-lightning/run_pos.sh |https://github.com/UniversalDependencies/UD_English-EWT/raw/master/en_ewt-ud-test.conllu|下载test数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/token-classification/run_pos.sh|MT5_ID4146_for_PyTorch/transformers/examples/legacy/pytorch-lightning/run_pos.sh |https://github.com/UniversalDependencies/UD_English-EWT/raw/master/en_ewt-ud-train.conllu|下载train数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/token-classification/run_chunk.sh|MT5_ID4146_for_PyTorch/transformers/examples/legacy/token-classification/run_chunk.sh |https://github.com/davidsbatista/NER-datasets/raw/master/CONLL2003/valid.txt|CONLL2003数据集在开源社区上的valid.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/token-classification/run_chunk.sh|MT5_ID4146_for_PyTorch/transformers/examples/legacy/token-classification/run_chunk.sh 
|https://github.com/davidsbatista/NER-datasets/raw/master/CONLL2003/test.txt|CONLL2003数据集在开源社区上的test.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/token-classification/run_chunk.sh|MT5_ID4146_for_PyTorch/transformers/examples/legacy/token-classification/run_chunk.sh |https://github.com/davidsbatista/NER-datasets/raw/master/CONLL2003/train.txt|CONLL2003数据集在开源社区上的train.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/token-classification/run_pos.sh|MT5_ID4146_for_PyTorch/transformers/examples/legacy/token-classification/run_pos.sh |https://github.com/UniversalDependencies/UD_English-EWT/raw/master/en_ewt-ud-dev.conllu|下载dev数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/token-classification/run_pos.sh|MT5_ID4146_for_PyTorch/transformers/examples/legacy/token-classification/run_pos.sh |https://github.com/UniversalDependencies/UD_English-EWT/raw/master/en_ewt-ud-test.conllu|下载test数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/token-classification/run_pos.sh|MT5_ID4146_for_PyTorch/transformers/examples/legacy/token-classification/run_pos.sh |https://github.com/UniversalDependencies/UD_English-EWT/raw/master/en_ewt-ud-train.conllu|下载train数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/token-classification/run.sh|MT5_ID4146_for_PyTorch/transformers/examples/legacy/token-classification/run.sh |https://drive.google.com/uc?export=download&id=1Jjhbal535VVz2ap4v4r_rN1UEHTdLK5P|将更改迁移到NLP数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/token-classification/run.sh|MT5_ID4146_for_PyTorch/transformers/examples/legacy/token-classification/run.sh |https://drive.google.com/uc?export=download&id=1ZfRcQThdtAR5PPRjIDtrVP7BtXSCUBbm|将更改迁移到NLP数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/legacy/token-classification/run.sh|MT5_ID4146_for_PyTorch/transformers/examples/legacy/token-classification/run.sh |https://drive.google.com/uc?export=download&id=1u9mb7kNJHWQCWyweMDRMuTFoOHOfeBTH|将更改迁移到NLP数据集| -| 开发引入 | / |MT5_ID4146_for_PyTorch/url.ini |https://github.com/huggingface/transformers/tree/master/examples/research_projects/fsner| setuptools的url配置选项| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/trainer/test_trainer.py|MT5_ID4146_for_PyTorch/transformers/examples/research_projects/fsner/setup.py |https://github.com/huggingface/transformers/issues|setuptools的Bug Tracker在开源社区中的链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/longform-qa/eli5_app.py|MT5_ID4146_for_PyTorch/transformers/examples/research_projects/longform-qa/eli5_app.py |https://huggingface.co/front/assets/huggingface_logo.svg|获取huggingface开源社区header_html| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/longform-qa/eli5_app.py|MT5_ID4146_for_PyTorch/transformers/examples/research_projects/longform-qa/eli5_app.py |https://en.wikipedia.org/wiki |获取wiki_url开源社区链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/sagemaker/scripts/tensorflow/requirements.txt|MT5_ID4146_for_PyTorch/transformers/examples/research_projects/lxmert/requirements.txt |https://github.com/huggingface/transformers.git|requirements文件中transformers在开源社区中的的git链接| -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/src/transformers/utils/hub.py|MT5_ID4146_for_PyTorch/transformers/examples/research_projects/lxmert/utils.py |https://cdn.huggingface.co|获取huggingface开源社区版本| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/utils/hub.py|MT5_ID4146_for_PyTorch/transformers/examples/research_projects/lxmert/utils.py |https://s3.amazonaws.com/models.huggingface.co/bert|获取huggingface开源社区版本| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/movement-pruning/requirements.txt|MT5_ID4146_for_PyTorch/transformers/examples/research_projects/movement-pruning/requirements.txt |https://github.com/huggingface/transformers.git@352d5472b0c1dec0f420d606d16747d851b4bda8#egg=transformers|requirements文件中transformers在开源社区中的的git链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/performer/modeling_flax_performer_utils.py|MT5_ID4146_for_PyTorch/transformers/examples/research_projects/performer/modeling_flax_performer_utils.py |https://github.com/google-research/google-research/blob/master/performer/fast_self_attention/fast_self_attention.py|modeling_flax_performer_utils文件在开源社区上的url配置| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/pplm/run_pplm.py|MT5_ID4146_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py |https://s3.amazonaws.com/models.huggingface.co/bert/pplm/bow/legal.txt|pplm在开源社区上的txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/pplm/run_pplm.py|MT5_ID4146_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py |https://s3.amazonaws.com/models.huggingface.co/bert/pplm/bow/military.txt|pplm在开源社区上的txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/pplm/run_pplm.py|MT5_ID4146_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py |https://s3.amazonaws.com/models.huggingface.co/bert/pplm/bow/politics.txt|pplm在开源社区上的txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/pplm/run_pplm.py|MT5_ID4146_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py |https://s3.amazonaws.com/models.huggingface.co/bert/pplm/bow/religion.txt|pplm在开源社区上的txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/pplm/run_pplm.py|MT5_ID4146_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py |https://s3.amazonaws.com/models.huggingface.co/bert/pplm/bow/science.txt|pplm在开源社区上的txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/pplm/run_pplm.py|MT5_ID4146_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py |https://s3.amazonaws.com/models.huggingface.co/bert/pplm/bow/space.txt|pplm在开源社区上的txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/pplm/run_pplm.py|MT5_ID4146_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py |https://s3.amazonaws.com/models.huggingface.co/bert/pplm/bow/technology.txt|pplm在开源社区上的txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/pplm/run_pplm.py|MT5_ID4146_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py |https://s3.amazonaws.com/models.huggingface.co/bert/pplm/discriminators/clickbait_classifier_head.pt|pplm在开源社区上的pt的下载链接| -| 
开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/pplm/run_pplm.py|MT5_ID4146_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py |https://s3.amazonaws.com/models.huggingface.co/bert/pplm/discriminators/SST_classifier_head.pt|pplm在开源社区上的pt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/utils/import_utils.py|MT5_ID4146_for_PyTorch/transformers/examples/research_projects/quantization-qdqbert/Dockerfile |https://pypi.ngc.nvidia.com |Dockerfile文件中pytorch-quantization的url链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/seq2seq-distillation/_test_bash_script.py|MT5_ID4146_for_PyTorch/transformers/examples/research_projects/seq2seq-distillation/_test_bash_script.py |https://cdn-datasets.huggingface.co/translation/wmt_en_ro-tr40k-va0.5k-te0.5k.tar.gz|wmt数据集在开源社区中的tar.gz链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/examples/research_projects/seq2seq-distillation/finetune_bart_tiny.sh|MT5_ID4146_for_PyTorch/transformers/examples/research_projects/seq2seq-distillation/finetune_bart_tiny.sh |https://cdn-datasets.huggingface.co/summarization/cnn_tiny.tgz|cnn_tiny数据集在开源社区中的tgz链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/sagemaker/scripts/tensorflow/requirements.txt|MT5_ID4146_for_PyTorch/transformers/examples/research_projects/visual_bert/requirements.txt |https://github.com/huggingface/transformers.git |requirements文件中transformers在开源社区中的的git链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/utils/hub.py|MT5_ID4146_for_PyTorch/transformers/examples/research_projects/visual_bert/utils.py |https://cdn.huggingface.co |获取huggingface开源社区版本| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/utils/hub.py|MT5_ID4146_for_PyTorch/transformers/examples/research_projects/visual_bert/utils.py |https://s3.amazonaws.com/models.huggingface.co/bert|获取huggingface开源社区版本| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/convert-allenai-wmt16.sh|MT5_ID4146_for_PyTorch/transformers/scripts/fsmt/convert-allenai-wmt16.sh |https://drive.google.com/uc?id=1x_G2cjvM1nW5hjAB8-vWxRqtQTlmIaQU|将更改迁移到NLP数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/convert-allenai-wmt16.sh|MT5_ID4146_for_PyTorch/transformers/scripts/fsmt/convert-allenai-wmt16.sh |https://drive.google.com/uc?id=1oA2aqZlVNj5FarxBlNXEHpBS4lRetTzU|将更改迁移到NLP数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/convert-allenai-wmt16.sh|MT5_ID4146_for_PyTorch/transformers/scripts/fsmt/convert-allenai-wmt16.sh |https://drive.google.com/uc?id=1Wup2D318QYBFPW_NKI1mfP_hXOfmUI9r|将更改迁移到NLP数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/convert-allenai-wmt16.sh|MT5_ID4146_for_PyTorch/transformers/scripts/fsmt/convert-allenai-wmt16.sh |https://drive.google.com/uc?id=1mNufoynJ9-Zy1kJh2TA_lHm2squji0i9|将更改迁移到NLP数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/convert-allenai-wmt16.sh|MT5_ID4146_for_PyTorch/transformers/scripts/fsmt/convert-allenai-wmt16.sh |https://drive.google.com/uc?id=1iO7um-HWoNoRKDtw27YUSgyeubn9uXqj|将更改迁移到NLP数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/convert-allenai-wmt19.sh|MT5_ID4146_for_PyTorch/transformers/scripts/fsmt/convert-allenai-wmt19.sh 
|https://drive.google.com/uc?id=1j6z9fYdlUyOYsh7KJoumRlr1yHczxR5T|将更改迁移到NLP数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/convert-allenai-wmt19.sh|MT5_ID4146_for_PyTorch/transformers/scripts/fsmt/convert-allenai-wmt19.sh |https://drive.google.com/uc?id=1yT7ZjqfvUYOBXvMjeY8uGRHQFWoSo8Q5|将更改迁移到NLP数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/convert-allenai-wmt19.sh|MT5_ID4146_for_PyTorch/transformers/scripts/fsmt/convert-allenai-wmt19.sh |https://drive.google.com/uc?id=15gAzHeRUCs-QV8vHeTReMPEh1j8excNE|将更改迁移到NLP数据集| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/convert-facebook-wmt19.sh|MT5_ID4146_for_PyTorch/transformers/scripts/fsmt/convert-facebook-wmt19.sh |https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-de.joined-dict.ensemble.tar.gz|wmt19数据集在开源社区上的tar.gz下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/convert-facebook-wmt19.sh|MT5_ID4146_for_PyTorch/transformers/scripts/fsmt/convert-facebook-wmt19.sh |https://dl.fbaipublicfiles.com/fairseq/models/wmt19.de-en.joined-dict.ensemble.tar.gz|wmt19数据集在开源社区上的tar.gz下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/convert-facebook-wmt19.sh|MT5_ID4146_for_PyTorch/transformers/scripts/fsmt/convert-facebook-wmt19.sh |https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-ru.ensemble.tar.gz|wmt19数据集在开源社区上的tar.gz下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/convert-facebook-wmt19.sh|MT5_ID4146_for_PyTorch/transformers/scripts/fsmt/convert-facebook-wmt19.sh |https://dl.fbaipublicfiles.com/fairseq/models/wmt19.ru-en.ensemble.tar.gz|wmt19数据集在开源社区上的tar.gz下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/gen-card-facebook-wmt19.py|MT5_ID4146_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt19.py |http://matrix.statmt.org/matrix/output/1907?run_id=6937 |wmt19数据集的语言转换配置在开源社区上的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/gen-card-facebook-wmt19.py|MT5_ID4146_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt19.py |http://matrix.statmt.org/matrix/output/1914?run_id=6724 |wmt19数据集的语言转换配置在开源社区上的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/gen-card-facebook-wmt19.py|MT5_ID4146_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt19.py |http://matrix.statmt.org/matrix/output/1909?run_id=6862 |wmt19数据集的语言转换配置在开源社区上的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/fsmt/gen-card-facebook-wmt19.py|MT5_ID4146_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt19.py |http://matrix.statmt.org/matrix/output/1902?run_id=6750 |wmt19数据集的语言转换配置在开源社区上的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/update_metadata.py|MT5_ID4146_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt19.py |https://github.com/huggingface/transformers|wmt16数据集的transformers在开源社区上的git下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/update_metadata.py|MT5_ID4146_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py |https://github.com/huggingface/transformers|wmt19数据集的transformers在开源社区上的git下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/scripts/tatoeba/upload_models.sh|MT5_ID4146_for_PyTorch/transformers/scripts/tatoeba/upload_models.sh 
|https://huggingface.co/Helsinki-NLP/$model_name|Helsinki-NLP在开源社区上的git下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/utils/hub.py|MT5_ID4146_for_PyTorch/transformers/src/transformers/file_utils.py |https://s3.amazonaws.com/models.huggingface.co/bert|获取huggingface开源社区版本| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/utils/hub.py|MT5_ID4146_for_PyTorch/transformers/src/transformers/file_utils.py |https://cdn.huggingface.co|获取huggingface开源社区版本| -| 开发引入 | / |MT5_ID4146_for_PyTorch/url.ini |https://moon-staging.huggingface.co|获取moon-staging.huggingface开源社区版本| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/utils/import_utils.py|MT5_ID4146_for_PyTorch/transformers/src/transformers/file_utils.py |https://pypi.ngc.nvidia.com|安装python第三方库nvidia开源社区url下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/modeling_tf_utils.py|MT5_ID4146_for_PyTorch/transformers/src/transformers/modeling_tf_utils.py |https://huggingface.co/{repo_path_or_name|model获取开源社区repo链接引用| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/commands/add_new_model_like.py|MT5_ID4146_for_PyTorch/transformers/src/transformers/commands/add_new_model_like.py |https://huggingface.co/{new_model_patterns.checkpoint}/resolve/main/config.json|使用json添加新model开源社区链接引用| -| 开发引入 | / |MT5_ID4146_for_PyTorch/url.ini |https://huggingface.co/api/models |测试huggingface的api可用| -| 开发引入 | / |MT5_ID4146_for_PyTorch/url.ini |https://moon-staging.huggingface.co|测试huggingface的api可用| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_{{cookiecutter.lowercase_modelname}}.py|MT5_ID4146_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_{{cookiecutter.lowercase_modelname}}.py |https://huggingface.co/{{cookiecutter.checkpoint_identifier}}/resolve/main/vocab.txt|"{{cookiecutter.checkpoint_identifier}}"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_{{cookiecutter.lowercase_modelname}}.py|MT5_ID4146_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_{{cookiecutter.lowercase_modelname}}.py |https://huggingface.co/{{cookiecutter.checkpoint_identifier}}/resolve/main/vocab.json|"{{cookiecutter.checkpoint_identifier}}"模型在开源社区上的vocab.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_{{cookiecutter.lowercase_modelname}}.py|MT5_ID4146_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_{{cookiecutter.lowercase_modelname}}.py |https://huggingface.co/{{cookiecutter.checkpoint_identifier}}/resolve/main/merges.txt|"{{cookiecutter.checkpoint_identifier}}"模型在开源社区上的merges.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_{{cookiecutter.lowercase_modelname}}.py|MT5_ID4146_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_{{cookiecutter.lowercase_modelname}}.py 
|https://huggingface.co/{{cookiecutter.checkpoint_identifier}}/resolve/main/vocab.txt|"{{cookiecutter.checkpoint_identifier}}"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_{{cookiecutter.lowercase_modelname}}.py|MT5_ID4146_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_fast_{{cookiecutter.lowercase_modelname}}.py |https://huggingface.co/{{cookiecutter.checkpoint_identifier}}/resolve/main/vocab.txt|"{{cookiecutter.checkpoint_identifier}}"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_{{cookiecutter.lowercase_modelname}}.py|MT5_ID4146_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_fast_{{cookiecutter.lowercase_modelname}}.py |https://huggingface.co/{{cookiecutter.checkpoint_identifier}}/resolve/main/vocab.json|"{{cookiecutter.checkpoint_identifier}}"模型在开源社区上的vocab.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_{{cookiecutter.lowercase_modelname}}.py|MT5_ID4146_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_fast_{{cookiecutter.lowercase_modelname}}.py |https://huggingface.co/{{cookiecutter.checkpoint_identifier}}/resolve/main/merges.txt|"{{cookiecutter.checkpoint_identifier}}"模型在开源社区上的merges.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_fast_{{cookiecutter.lowercase_modelname}}.py|MT5_ID4146_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_fast_{{cookiecutter.lowercase_modelname}}.py |https://huggingface.co/{{cookiecutter.checkpoint_identifier}}/resolve/main/tokenizer.json|"{{cookiecutter.checkpoint_identifier}}"模型在开源社区上的tokenizer.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_{{cookiecutter.lowercase_modelname}}.py|MT5_ID4146_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_fast_{{cookiecutter.lowercase_modelname}}.py |https://huggingface.co/{{cookiecutter.checkpoint_identifier}}/resolve/main/vocab.txt|"{{cookiecutter.checkpoint_identifier}}"模型在开源社区上的vocab.txt的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_fast_{{cookiecutter.lowercase_modelname}}.py|MT5_ID4146_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_fast_{{cookiecutter.lowercase_modelname}}.py |https://huggingface.co/{{cookiecutter.checkpoint_identifier}}/resolve/main/tokenizer.json|"{{cookiecutter.checkpoint_identifier}}"模型在开源社区上的tokenizer.json的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|MT5_ID4146_for_PyTorch/transformers/tests/clip/test_modeling_clip.py 
|http://images.cocodataset.org/val2017/000000039769.jpg|clip模型测试函数在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|MT5_ID4146_for_PyTorch/transformers/tests/clip/test_modeling_tf_clip.py |http://images.cocodataset.org/val2017/000000039769.jpg|clip模型测试函数在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|MT5_ID4146_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_classification.py |http://images.cocodataset.org/val2017/000000039769.jpg|图片分类pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|MT5_ID4146_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_classification.py |http://images.cocodataset.org/val2017/000000039769.jpg|图片分类pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|MT5_ID4146_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_classification.py |http://images.cocodataset.org/val2017/000000039769.jpg|图片分类pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|MT5_ID4146_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_classification.py |http://images.cocodataset.org/val2017/000000039769.jpg|图片分类pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|MT5_ID4146_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_classification.py |http://images.cocodataset.org/val2017/000000039769.jpg|图片分类pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|MT5_ID4146_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_classification.py |http://images.cocodataset.org/val2017/000000039769.jpg|图片分类pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|MT5_ID4146_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_classification.py |http://images.cocodataset.org/val2017/000000039769.jpg|图片分类pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|MT5_ID4146_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_classification.py |http://images.cocodataset.org/val2017/000000039769.jpg|图片分类pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|MT5_ID4146_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_classification.py |http://images.cocodataset.org/val2017/000000039769.jpg|图片分类pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|MT5_ID4146_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_classification.py |http://images.cocodataset.org/val2017/000000039769.jpg|图片分类pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|MT5_ID4146_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_classification.py |http://images.cocodataset.org/val2017/000000039769.jpg|图片分类pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|MT5_ID4146_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_segmentation.py |http://images.cocodataset.org/val2017/000000039769.jpg",|图片分割pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|MT5_ID4146_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_segmentation.py |http://images.cocodataset.org/val2017/000000039769.jpg|图片分割pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|MT5_ID4146_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_segmentation.py |http://images.cocodataset.org/val2017/000000039769.jpg|图片分割pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|MT5_ID4146_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_segmentation.py |http://images.cocodataset.org/val2017/000000039769.jpg|图片分割pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|MT5_ID4146_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_segmentation.py |http://images.cocodataset.org/val2017/000000039769.jpg|图片分割pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|MT5_ID4146_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_segmentation.py |http://images.cocodataset.org/val2017/000000039769.jpg|图片分割pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|MT5_ID4146_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_segmentation.py |http://images.cocodataset.org/val2017/000000039769.jpg|图片分割pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|MT5_ID4146_for_PyTorch/transformers/tests/pipelines/test_pipelines_image_segmentation.py |http://images.cocodataset.org/val2017/000000039769.jpg",|图片分割pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|MT5_ID4146_for_PyTorch/transformers/tests/pipelines/test_pipelines_object_detection.py |http://images.cocodataset.org/val2017/000000039769.jpg|目标检测pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|MT5_ID4146_for_PyTorch/transformers/tests/pipelines/test_pipelines_object_detection.py |http://images.cocodataset.org/val2017/000000039769.jpg",|目标检测pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|MT5_ID4146_for_PyTorch/transformers/tests/pipelines/test_pipelines_object_detection.py 
|http://images.cocodataset.org/val2017/000000039769.jpg|目标检测pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|MT5_ID4146_for_PyTorch/transformers/tests/pipelines/test_pipelines_object_detection.py |http://images.cocodataset.org/val2017/000000039769.jpg|目标检测pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|MT5_ID4146_for_PyTorch/transformers/tests/pipelines/test_pipelines_object_detection.py |http://images.cocodataset.org/val2017/000000039769.jpg|目标检测pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|MT5_ID4146_for_PyTorch/transformers/tests/pipelines/test_pipelines_object_detection.py |http://images.cocodataset.org/val2017/000000039769.jpg|目标检测pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|MT5_ID4146_for_PyTorch/transformers/tests/pipelines/test_pipelines_object_detection.py |http://images.cocodataset.org/val2017/000000039769.jpg|目标检测pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|MT5_ID4146_for_PyTorch/transformers/tests/pipelines/test_pipelines_object_detection.py |http://images.cocodataset.org/val2017/000000039769.jpg|目标检测pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|MT5_ID4146_for_PyTorch/transformers/tests/pipelines/test_pipelines_object_detection.py |http://images.cocodataset.org/val2017/000000039769.jpg|目标检测pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|MT5_ID4146_for_PyTorch/transformers/tests/pipelines/test_pipelines_object_detection.py |http://images.cocodataset.org/val2017/000000039769.jpg|目标检测pipeline在开源社区上的验证集输入引用链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/tests/pipelines/test_pipelines_zero_shot_object_detection.py|MT5_ID4146_for_PyTorch/transformers/tests/pipelines/test_pipelines_object_detection.py |http://images.cocodataset.org/val2017/000000039769.jpg",|目标检测pipeline在开源社区上的验证集输入引用链接| -| 开发引入 | / |MT5_ID4146_for_PyTorch/url.ini |https://bogus|bogus音视频开源下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/ja/index.mdx|MT5_ID4146_for_PyTorch/transformers/utils/check_copies.py |https://huggingface.co/docs/transformers/master/|model_list检查开源社区url链接引用| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/release.py|MT5_ID4146_for_PyTorch/transformers/utils/check_copies.py |https://huggingface.co/docs/transformers/|model_list检查开源社区url链接引用| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/download_glue_data.py|MT5_ID4146_for_PyTorch/transformers/utils/download_glue_data.py |https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FCoLA.zip?alt=media&token=46d5e637-3411-4188-bc44-5809b5bfb5f4|"CoLA"任务数据集在开源社区上的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/download_glue_data.py|MT5_ID4146_for_PyTorch/transformers/utils/download_glue_data.py 
|https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FSST-2.zip?alt=media&token=aabc5f6b-e466-44a2-b9b4-cf6337f84ac8|"SST"任务数据集在开源社区上的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/download_glue_data.py|MT5_ID4146_for_PyTorch/transformers/utils/download_glue_data.py |https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2Fmrpc_dev_ids.tsv?alt=media&token=ec5c0836-31d5-48f4-b431-7480817f1adc|"MRPC"任务数据集在开源社区上的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/download_glue_data.py|MT5_ID4146_for_PyTorch/transformers/utils/download_glue_data.py |https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FQQP.zip?alt=media&token=700c6acf-160d-4d89-81d1-de4191d02cb5|"QQP"任务数据集在开源社区上的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/download_glue_data.py|MT5_ID4146_for_PyTorch/transformers/utils/download_glue_data.py |https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FSTS-B.zip?alt=media&token=bddb94a7-8706-4e0d-a694-1109e12273b5|"STS"任务数据集在开源社区上的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/download_glue_data.py|MT5_ID4146_for_PyTorch/transformers/utils/download_glue_data.py |https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FMNLI.zip?alt=media&token=50329ea1-e339-40e2-809c-10c40afff3ce|"MNLI"任务数据集在开源社区上的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/download_glue_data.py|MT5_ID4146_for_PyTorch/transformers/utils/download_glue_data.py |https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FSNLI.zip?alt=media&token=4afcfbb2-ff0c-4b2d-a09a-dbf07926f4df|"SNLI"任务数据集在开源社区上的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/download_glue_data.py|MT5_ID4146_for_PyTorch/transformers/utils/download_glue_data.py |https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FQNLIv2.zip?alt=media&token=6fdcf570-0fc5-4631-8456-9505272d1601|"QNLI"任务数据集在开源社区上的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/download_glue_data.py|MT5_ID4146_for_PyTorch/transformers/utils/download_glue_data.py |https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FRTE.zip?alt=media&token=5efa7e85-a0bb-4f19-8ea2-9e1840f077fb|"RTE"任务数据集在开源社区上的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/download_glue_data.py|MT5_ID4146_for_PyTorch/transformers/utils/download_glue_data.py |https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FWNLI.zip?alt=media&token=068ad0a0-ded7-4bd7-99a5-5e00222e0faf|"WNLI"任务数据集在开源社区上的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/download_glue_data.py|MT5_ID4146_for_PyTorch/transformers/utils/download_glue_data.py |https://storage.googleapis.com/mtl-sentence-representations.appspot.com |"diagnostic"任务数据集在开源社区上的下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/download_glue_data.py|MT5_ID4146_for_PyTorch/transformers/utils/download_glue_data.py |https://dl.fbaipublicfiles.com/senteval/senteval_data/msr_paraphrase_train.txt|MRPC任务训练集分类文件列表在开源社区上的txt下载链接| -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/main/utils/download_glue_data.py|MT5_ID4146_for_PyTorch/transformers/utils/download_glue_data.py |https://dl.fbaipublicfiles.com/senteval/senteval_data/msr_paraphrase_test.txt|MRPC任务测试集分类文件列表在开源社区上的txt下载链接| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/notification_service_doc_tests.py|MT5_ID4146_for_PyTorch/transformers/utils/notification_service_deprecated.py |https://github.com/huggingface/transformers/actions/runs |测试huggingface相应run_id可用| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/notification_service_doc_tests.py|MT5_ID4146_for_PyTorch/transformers/utils/notification_service_doc_tests.py |https://github.com/huggingface/transformers/actions/runs |测试huggingface相应run_id可用| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/notification_service_doc_tests.py|MT5_ID4146_for_PyTorch/transformers/utils/notification_service_doc_tests.py |https://github.com/huggingface/transformers/actions/runs |测试huggingface相应run_id可用| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/notification_service_doc_tests.py|MT5_ID4146_for_PyTorch/transformers/utils/notification_service_doc_tests.py |https://github.com/huggingface/transformers/actions/runs |测试huggingface相应run_id可用| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/notification_service_doc_tests.py|MT5_ID4146_for_PyTorch/transformers/utils/notification_service_doc_tests.py |https://api.github.com/repos/huggingface/transformers/actions/runs |测试huggingface相应run_id可用| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/notification_service_doc_tests.py|MT5_ID4146_for_PyTorch/transformers/utils/notification_service.py |https://github.com/huggingface/transformers/actions/runs |测试huggingface相应run_id可用| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/notification_service_doc_tests.py|MT5_ID4146_for_PyTorch/transformers/utils/notification_service.py |https://github.com/huggingface/transformers/actions/runs |测试huggingface相应run_id可用| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/notification_service_doc_tests.py|MT5_ID4146_for_PyTorch/transformers/utils/notification_service.py |https://github.com/huggingface/transformers/actions/runs |测试huggingface相应run_id可用| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/notification_service_doc_tests.py|MT5_ID4146_for_PyTorch/transformers/utils/notification_service.py |https://api.github.com/repos/huggingface/transformers/actions/runs |测试huggingface相应run_id可用| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/docs/source/ja/index.mdx|MT5_ID4146_for_PyTorch/transformers/utils/release.py |https://huggingface.co/docs/transformers/master/model_doc|transformers_model_doc开源社区url连接引用| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/utils/release.py|MT5_ID4146_for_PyTorch/transformers/utils/release.py |https://huggingface.co/docs/transformers/model_doc|transformers_model_doc开源社区url连接引用| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/utils/check_table.py | MT5_ID4146_for_PyTorch/transformers/utils/update_metadata.py | https://stackoverflow.com/questions/29916065/how-to-do-camelcase-split-in-python | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/distillation/README.md | MT5_ID4146_for_PyTorch/transformers/utils/update_metadata.py | 
https://github.com/huggingface/transformers/commit/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/utils/notification_service.py | MT5_ID4146_for_PyTorch/transformers/utils/notification_service_doc_tests.py | https://github.com/huggingface/transformers/actions/runs/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/utils/get_github_job_time.py | MT5_ID4146_for_PyTorch/transformers/utils/notification_service_doc_tests.py | https://api.github.com/repos/huggingface/transformers/actions/runs/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/utils/notification_service.py | MT5_ID4146_for_PyTorch/transformers/utils/notification_service_deprecated.py | https://github.com/huggingface/transformers/actions/runs/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/utils/notification_service.py | MT5_ID4146_for_PyTorch/transformers/utils/notification_service.py | https://github.com/huggingface/transformers/actions/runs/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/utils/get_github_job_time.py | MT5_ID4146_for_PyTorch/transformers/utils/notification_service.py | https://api.github.com/repos/huggingface/transformers/actions/runs/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/utils/download_glue_data.py | MT5_ID4146_for_PyTorch/transformers/utils/download_glue_data.py | https://gist.github.com/W4ngatang/60c2bdb54d156a41194446737ce03e2e | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/utils/download_glue_data.py | MT5_ID4146_for_PyTorch/transformers/utils/download_glue_data.py | https://download.microsoft.com/download/D/4/6/D46FF87A-F6B9-4252-AA8B-3604ED519838/MSRParaphraseCorpus.msi | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/utils/download_glue_data.py | MT5_ID4146_for_PyTorch/transformers/utils/download_glue_data.py | https://storage.googleapis.com/mtl-sentence-representations.appspot.com/tsvsWithoutLabels%2FAX.tsv?GoogleAccessId=firebase-adminsdk-0khhl@mtl-sentence-representations.iam.gserviceaccount.com&Expires=2 | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/utils/download_glue_data.py | MT5_ID4146_for_PyTorch/transformers/utils/download_glue_data.py | firebase-adminsdk-0khhl@mtl-sentence-representations.iam.gserviceaccount.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/utils/check_table.py | MT5_ID4146_for_PyTorch/transformers/utils/check_table.py | https://stackoverflow.com/questions/29916065/how-to-do-camelcase-split-in-python | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/utils/check_repo.py | MT5_ID4146_for_PyTorch/transformers/utils/check_repo.py | https://github.com/huggingface/doc-builder | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/roberta/test_modeling_roberta.py | MT5_ID4146_for_PyTorch/transformers/tests/xlm_roberta_xl/test_modeling_xlm_roberta_xl.py | https://github.com/huggingface/transformers/issues/1761 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/roberta/test_tokenization_roberta.py | MT5_ID4146_for_PyTorch/transformers/tests/xlm/test_tokenization_xlm.py | https://github.com/rsennrich/subword-nmt | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/unispeech_sat/test_modeling_unispeech_sat.py | 
MT5_ID4146_for_PyTorch/transformers/tests/wavlm/test_modeling_wavlm.py | https://github.com/pytorch/fairseq/issues/3227 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/wav2vec2_with_lm/test_processor_wav2vec2_with_lm.py | MT5_ID4146_for_PyTorch/transformers/tests/wav2vec2_with_lm/test_processor_wav2vec2_with_lm.py | https://huggingface.co/hf-internal-testing/processor_with_lm/tree/main | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/wav2vec2_with_lm/test_processor_wav2vec2_with_lm.py | MT5_ID4146_for_PyTorch/transformers/tests/wav2vec2_with_lm/test_processor_wav2vec2_with_lm.py | https://huggingface.co/datasets/common_voice/viewer/en/train | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/wav2vec2_with_lm/test_processor_wav2vec2_with_lm.py | MT5_ID4146_for_PyTorch/transformers/tests/wav2vec2/test_tokenization_wav2vec2.py | https://huggingface.co/datasets/common_voice/viewer/en/train | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/unispeech_sat/test_modeling_unispeech_sat.py | MT5_ID4146_for_PyTorch/transformers/tests/wav2vec2/test_modeling_wav2vec2.py | https://github.com/pytorch/fairseq/issues/3227 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/unispeech_sat/test_modeling_unispeech_sat.py | MT5_ID4146_for_PyTorch/transformers/tests/wav2vec2/test_modeling_tf_wav2vec2.py | https://github.com/pytorch/fairseq/issues/3227 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/vision_text_dual_encoder/test_modeling_vision_text_dual_encoder.py | MT5_ID4146_for_PyTorch/transformers/tests/vision_text_dual_encoder/test_modeling_vision_text_dual_encoder.py | https://github.com/rwightman/pytorch-image-models/blob/b9bd960a032c75ca6b808ddeed76bee5f3ed4972/timm/models/layers/helpers.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/vision_text_dual_encoder/test_modeling_vision_text_dual_encoder.py | MT5_ID4146_for_PyTorch/transformers/tests/vision_text_dual_encoder/test_modeling_flax_vision_text_dual_encoder.py | https://github.com/rwightman/pytorch-image-models/blob/b9bd960a032c75ca6b808ddeed76bee5f3ed4972/timm/models/layers/helpers.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/encoder_decoder/modeling_encoder_decoder.py | MT5_ID4146_for_PyTorch/transformers/tests/vision_encoder_decoder/test_modeling_tf_vision_encoder_decoder.py | https://github.com/huggingface/transformers/pull/13222/commits/dbb3c9de76eee235791d2064094654637c99f36d#r697304245 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/vision_encoder_decoder/test_modeling_tf_vision_encoder_decoder.py | MT5_ID4146_for_PyTorch/transformers/tests/vision_encoder_decoder/test_modeling_tf_vision_encoder_decoder.py | https://github.com/huggingface/transformers/pull/14016 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/repo_utils/test_check_copies.py | MT5_ID4146_for_PyTorch/transformers/tests/utils/test_utils_check_copies.py | https://huggingface.co/transformers/model_doc/albert.ht | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/utils/test_model_card.py | MT5_ID4146_for_PyTorch/transformers/tests/utils/test_model_card.py | https://arxiv.org/pdf/1810.03993.pdf | 参考论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | MT5_ID4146_for_PyTorch/transformers/tests/utils/test_add_new_model_like.py | https://huggingface.co/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/unispeech_sat/test_modeling_unispeech_sat.py | MT5_ID4146_for_PyTorch/transformers/tests/unispeech_sat/test_modeling_unispeech_sat.py | https://github.com/pytorch/fairseq/issues/3227 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/unispeech_sat/test_modeling_unispeech_sat.py | MT5_ID4146_for_PyTorch/transformers/tests/unispeech/test_modeling_unispeech.py | https://github.com/pytorch/fairseq/issues/3227 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/trainer/test_trainer.py | MT5_ID4146_for_PyTorch/transformers/tests/trainer/test_trainer.py | https://github.com/huggingface/transformers/issues/12970 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/tokenization/test_tokenization_fast.py | MT5_ID4146_for_PyTorch/transformers/tests/tokenization/test_tokenization_fast.py | https://github.com/huggingface/transformers/pull/12550 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/tokenization/test_tokenization_fast.py | MT5_ID4146_for_PyTorch/transformers/tests/tokenization/test_tokenization_fast.py | https://github.com/huggingface/tokenizers/issues/537 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/test_modeling_common.py | MT5_ID4146_for_PyTorch/transformers/tests/test_modeling_tf_common.py | https://github.com/huggingface/transformers/issues/14859 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/test_modeling_common.py | MT5_ID4146_for_PyTorch/transformers/tests/test_modeling_common.py | https://stackoverflow.com/questions/9541025/how-to-copy-a-python-class | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/test_modeling_common.py | MT5_ID4146_for_PyTorch/transformers/tests/test_modeling_common.py | https://github.com/huggingface/transformers/issues/14859 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/levit/test_modeling_levit.py | MT5_ID4146_for_PyTorch/transformers/tests/test_modeling_common.py | https://github.com/huggingface/transformers/issues/11780 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/mobilebert/test_tokenization_mobilebert.py | MT5_ID4146_for_PyTorch/transformers/tests/tapas/test_tokenization_tapas.py | https://github.com/huggingface/tokenizers/issues/340 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/tapas/test_modeling_tf_tapas.py | MT5_ID4146_for_PyTorch/transformers/tests/tapas/test_modeling_tf_tapas.py | https://github.com/google-research/tapas/blob/master/tapas/models/segmented_tensor_test.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/tapas/test_modeling_tf_tapas.py | MT5_ID4146_for_PyTorch/transformers/tests/tapas/test_modeling_tapas.py | https://github.com/google-research/tapas/blob/master/tapas/models/segmented_tensor_test.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/pegasus/test_modeling_flax_pegasus.py | MT5_ID4146_for_PyTorch/transformers/tests/t5/test_modeling_flax_t5.py | 
https://jax.readthedocs.io/en/latest/gpu_memory_allocation.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/unispeech_sat/test_modeling_unispeech_sat.py | MT5_ID4146_for_PyTorch/transformers/tests/sew_d/test_modeling_sew_d.py | https://github.com/pytorch/fairseq/issues/3227 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/unispeech_sat/test_modeling_unispeech_sat.py | MT5_ID4146_for_PyTorch/transformers/tests/sew/test_modeling_sew.py | https://github.com/pytorch/fairseq/issues/3227 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | MT5_ID4146_for_PyTorch/transformers/tests/sagemaker/scripts/pytorch/run_glue_model_parallelism.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/text-classification/run_flax_glue.py | MT5_ID4146_for_PyTorch/transformers/tests/sagemaker/scripts/pytorch/run_glue_model_parallelism.py | https://huggingface.co/docs/datasets/package_reference/main_classes.html#datasets.Dataset.unique | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/roberta/test_tokenization_roberta.py | MT5_ID4146_for_PyTorch/transformers/tests/roberta/test_tokenization_roberta.py | https://github.com/rsennrich/subword-nmt | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/roberta/test_modeling_roberta.py | MT5_ID4146_for_PyTorch/transformers/tests/roberta/test_modeling_roberta.py | https://github.com/huggingface/transformers/issues/1761 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/reformer/test_tokenization_reformer.py | MT5_ID4146_for_PyTorch/transformers/tests/reformer/test_tokenization_reformer.py | https://github.com/huggingface/transformers/pull/11737#issuecomment-850769064 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/perceiver/test_modeling_perceiver.py | MT5_ID4146_for_PyTorch/transformers/tests/reformer/test_modeling_reformer.py | https://github.com/pytorch/pytorch/issues/36035 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/mobilebert/test_tokenization_mobilebert.py | MT5_ID4146_for_PyTorch/transformers/tests/realm/test_tokenization_realm.py | https://github.com/huggingface/tokenizers/issues/340 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/plbart/test_tokenization_plbart.py | MT5_ID4146_for_PyTorch/transformers/tests/plbart/test_tokenization_plbart.py | https://gist.github.com/sshleifer/cba08bc2109361a74ac3760a7e30e4f4 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/pipelines/test_pipelines_zero_shot.py | MT5_ID4146_for_PyTorch/transformers/tests/pipelines/test_pipelines_zero_shot.py | https://github.com/huggingface/transformers/issues/13846 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/pipelines/test_pipelines_zero_shot.py | MT5_ID4146_for_PyTorch/transformers/tests/pipelines/test_pipelines_zero_shot.py | https://github.com/huggingface/transformers/issues/13381#issuecomment-912343499 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/pipelines/test_pipelines_token_classification.py | 
MT5_ID4146_for_PyTorch/transformers/tests/pipelines/test_pipelines_token_classification.py | https://github.com/huggingface/transformers/pull/4987 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/roberta/test_tokenization_roberta.py | MT5_ID4146_for_PyTorch/transformers/tests/phobert/test_tokenization_phobert.py | https://github.com/rsennrich/subword-nmt | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/levit/test_modeling_levit.py | MT5_ID4146_for_PyTorch/transformers/tests/perceiver/test_modeling_perceiver.py | https://github.com/huggingface/transformers/issues/11780 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/perceiver/test_modeling_perceiver.py | MT5_ID4146_for_PyTorch/transformers/tests/perceiver/test_modeling_perceiver.py | https://github.com/pytorch/pytorch/issues/36035 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/pegasus/test_tokenization_pegasus.py | MT5_ID4146_for_PyTorch/transformers/tests/pegasus/test_tokenization_pegasus.py | https://github.com/google-research/bigbird/raw/master/bigbird/vocab/pegasus.model | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/pegasus/test_modeling_flax_pegasus.py | MT5_ID4146_for_PyTorch/transformers/tests/pegasus/test_modeling_flax_pegasus.py | https://jax.readthedocs.io/en/latest/gpu_memory_allocation.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/roberta/test_tokenization_roberta.py | MT5_ID4146_for_PyTorch/transformers/tests/openai/test_tokenization_openai.py | https://github.com/rsennrich/subword-nmt | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/plbart/test_tokenization_plbart.py | MT5_ID4146_for_PyTorch/transformers/tests/mbart50/test_tokenization_mbart50.py | https://gist.github.com/sshleifer/cba08bc2109361a74ac3760a7e30e4f4 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/plbart/test_tokenization_plbart.py | MT5_ID4146_for_PyTorch/transformers/tests/mbart/test_tokenization_mbart.py | https://gist.github.com/sshleifer/cba08bc2109361a74ac3760a7e30e4f4 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/pegasus/test_modeling_flax_pegasus.py | MT5_ID4146_for_PyTorch/transformers/tests/mbart/test_modeling_flax_mbart.py | https://jax.readthedocs.io/en/latest/gpu_memory_allocation.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/pegasus/test_modeling_flax_pegasus.py | MT5_ID4146_for_PyTorch/transformers/tests/marian/test_modeling_flax_marian.py | https://jax.readthedocs.io/en/latest/gpu_memory_allocation.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/plbart/test_tokenization_plbart.py | MT5_ID4146_for_PyTorch/transformers/tests/m2m_100/test_tokenization_m2m_100.py | https://gist.github.com/sshleifer/cba08bc2109361a74ac3760a7e30e4f4 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/led/test_modeling_led.py | MT5_ID4146_for_PyTorch/transformers/tests/led/test_modeling_led.py | https://github.com/allenai/longformer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/led/test_modeling_led.py | MT5_ID4146_for_PyTorch/transformers/tests/led/test_modeling_led.py | 
https://github.com/huggingface/transformers/pull/9278#issue-544709661 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/layoutxlm/test_processor_layoutxlm.py | MT5_ID4146_for_PyTorch/transformers/tests/layoutxlm/test_processor_layoutxlm.py | https://www.industrydocuments.ucsf.edu/docs/snbx0223 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/mobilebert/test_tokenization_mobilebert.py | MT5_ID4146_for_PyTorch/transformers/tests/layoutlmv2/test_tokenization_layoutlmv2.py | https://github.com/huggingface/tokenizers/issues/340 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/test_modeling_common.py | MT5_ID4146_for_PyTorch/transformers/tests/layoutlmv2/test_modeling_layoutlmv2.py | https://stackoverflow.com/questions/9541025/how-to-copy-a-python-class | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/roberta/test_modeling_roberta.py | MT5_ID4146_for_PyTorch/transformers/tests/ibert/test_modeling_ibert.py | https://github.com/huggingface/transformers/issues/1761 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/unispeech_sat/test_modeling_unispeech_sat.py | MT5_ID4146_for_PyTorch/transformers/tests/hubert/test_modeling_tf_hubert.py | https://github.com/pytorch/fairseq/issues/3227 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/unispeech_sat/test_modeling_unispeech_sat.py | MT5_ID4146_for_PyTorch/transformers/tests/hubert/test_modeling_hubert.py | https://github.com/pytorch/fairseq/issues/3227 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/hubert/test_modeling_hubert.py | MT5_ID4146_for_PyTorch/transformers/tests/hubert/test_modeling_hubert.py | https://github.com/pytorch/fairseq/pull/3572 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/roberta/test_tokenization_roberta.py | MT5_ID4146_for_PyTorch/transformers/tests/gpt2/test_tokenization_gpt2.py | https://github.com/rsennrich/subword-nmt | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/roberta/test_tokenization_roberta.py | MT5_ID4146_for_PyTorch/transformers/tests/fsmt/test_tokenization_fsmt.py | https://github.com/rsennrich/subword-nmt | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/fixtures/tests_samples/wiki_text/wiki_00 | MT5_ID4146_for_PyTorch/transformers/tests/fixtures/tests_samples/wiki_text/wiki_00 | https://en.wikipedia.org/wiki?curid=12 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/fixtures/tests_samples/wiki_text/wiki_00 | MT5_ID4146_for_PyTorch/transformers/tests/fixtures/tests_samples/wiki_text/wiki_00 | https://en.wikipedia.org/wiki?curid=25 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/encoder_decoder/modeling_encoder_decoder.py | MT5_ID4146_for_PyTorch/transformers/tests/encoder_decoder/test_modeling_tf_encoder_decoder.py | https://github.com/huggingface/transformers/pull/13222/commits/dbb3c9de76eee235791d2064094654637c99f36d#r697304245 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/levit/test_modeling_levit.py | MT5_ID4146_for_PyTorch/transformers/tests/deit/test_modeling_deit.py | https://github.com/huggingface/transformers/issues/11780 | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/tests/deepspeed/test_deepspeed.py | MT5_ID4146_for_PyTorch/transformers/tests/deepspeed/test_deepspeed.py | https://github.com/microsoft/DeepSpeed/issues/1612 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/roberta/test_tokenization_roberta.py | MT5_ID4146_for_PyTorch/transformers/tests/deberta/test_tokenization_deberta.py | https://github.com/rsennrich/subword-nmt | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/roberta/test_modeling_roberta.py | MT5_ID4146_for_PyTorch/transformers/tests/data2vec/test_modeling_data2vec_text.py | https://github.com/huggingface/transformers/issues/1761 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/unispeech_sat/test_modeling_unispeech_sat.py | MT5_ID4146_for_PyTorch/transformers/tests/data2vec/test_modeling_data2vec_audio.py | https://github.com/pytorch/fairseq/issues/3227 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/roberta/test_tokenization_roberta.py | MT5_ID4146_for_PyTorch/transformers/tests/ctrl/test_tokenization_ctrl.py | https://github.com/rsennrich/subword-nmt | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/pegasus/test_modeling_flax_pegasus.py | MT5_ID4146_for_PyTorch/transformers/tests/blenderbot_small/test_modeling_flax_blenderbot_small.py | https://jax.readthedocs.io/en/latest/gpu_memory_allocation.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/pegasus/test_modeling_flax_pegasus.py | MT5_ID4146_for_PyTorch/transformers/tests/blenderbot/test_modeling_flax_blenderbot.py | https://jax.readthedocs.io/en/latest/gpu_memory_allocation.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/big_bird/test_tokenization_big_bird.py | MT5_ID4146_for_PyTorch/transformers/tests/big_bird/test_tokenization_big_bird.py | https://github.com/google-research/bigbird/blob/master/bigbird/vocab/gpt2.model?raw=true | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/roberta/test_tokenization_roberta.py | MT5_ID4146_for_PyTorch/transformers/tests/bertweet/test_tokenization_bertweet.py | https://github.com/rsennrich/subword-nmt | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/mobilebert/test_tokenization_mobilebert.py | MT5_ID4146_for_PyTorch/transformers/tests/bert/test_tokenization_bert.py | https://github.com/huggingface/tokenizers/issues/340 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/barthez/test_tokenization_barthez.py | MT5_ID4146_for_PyTorch/transformers/tests/barthez/test_tokenization_barthez.py | https://github.com/huggingface/transformers/issues/11457 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/pegasus/test_modeling_flax_pegasus.py | MT5_ID4146_for_PyTorch/transformers/tests/bart/test_modeling_flax_bart.py | https://jax.readthedocs.io/en/latest/gpu_memory_allocation.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/auto/test_tokenization_auto.py | MT5_ID4146_for_PyTorch/transformers/tests/auto/test_tokenization_auto.py | https://github.com/huggingface/transformers/pull/13251 | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | MT5_ID4146_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_fast_{{cookiecutter.lowercase_modelname}}.py | https://huggingface.co/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | MT5_ID4146_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/tokenization_{{cookiecutter.lowercase_modelname}}.py | https://huggingface.co/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/README.md | MT5_ID4146_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | https://huggingface.co/models?filter= | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | https://github.com/tensorflow/mesh/blob/8d2465e9bc93129b913b5ccc6a59aa97abd96ec6/mesh_tensorflow/transformer/transformer_layers.py#L270 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/README.md | MT5_ID4146_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | https://arxiv.org/abs/1910.13461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | https://github.com/google/flax/blob/491ce18759622506588784b4fca0e4bf05f8c8cd/flax/linen/attention.py#L252 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/README.md | MT5_ID4146_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_{{cookiecutter.lowercase_modelname}}.py | https://huggingface.co/models?filter= | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_{{cookiecutter.lowercase_modelname}}.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_{{cookiecutter.lowercase_modelname}}.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_{{cookiecutter.lowercase_modelname}}.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | MT5_ID4146_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_{{cookiecutter.lowercase_modelname}}.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/README.md | MT5_ID4146_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_{{cookiecutter.lowercase_modelname}}.py | https://arxiv.org/abs/1910.13461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_{{cookiecutter.lowercase_modelname}}.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | MT5_ID4146_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/configuration_{{cookiecutter.lowercase_modelname}}.py | https://huggingface.co/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/README.md | MT5_ID4146_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/configuration_{{cookiecutter.lowercase_modelname}}.py | https://huggingface.co/models?filter= | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/configuration_{{cookiecutter.lowercase_modelname}}.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | MT5_ID4146_for_PyTorch/transformers/templates/adding_a_new_example_script/{{cookiecutter.directory_name}}/run_{{cookiecutter.example_shortcut}}.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | MT5_ID4146_for_PyTorch/transformers/templates/adding_a_new_example_script/{{cookiecutter.directory_name}}/run_{{cookiecutter.example_shortcut}}.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/utils/fx.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/utils/fx.py | https://github.com/pytorch/pytorch/pull/55888 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/training_args_tf.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/training_args_tf.py | 
https://docs.python.org/3/library/argparse#module-argparse | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/pt/multilingual.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/training_args_tf.py | https://github.com/huggingface/transformers/tree/master/examples | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/training_args_tf.py | https://www.tensorflow.org/tensorboard | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/lightning_base.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/training_args_tf.py | https://nvidia.github.io/apex/amp | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/training_args_tf.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/training_args.py | https://docs.python.org/3/library/argparse#module-argparse | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/pt/multilingual.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/training_args.py | https://github.com/huggingface/transformers/tree/master/examples | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/training_args.py | https://www.tensorflow.org/tensorboard | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/lightning_base.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/training_args.py | https://nvidia.github.io/apex/amp | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/main_classes/callback.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/training_args.py | https://www.wandb.com/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/main_classes/callback.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/training_args.py | https://www.mlflow.org/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/perf_train_gpu_many.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/training_args.py | https://github.com/facebookresearch/fairscale | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/main_classes/deepspeed.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/training_args.py | https://github.com/microsoft/deepspeed | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/lightning_base.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/training_args.py | https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/training_args.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/training_args.py | https://github.com/huggingface/transformers/issues/10628 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/trainer_tf.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/trainer_tf.py | https://docs.wandb.com/huggingface | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/trainer_tf.py | 
https://www.comet.ml/docs/python-sdk/huggingface/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/trainer_tf.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/trainer_tf.py | https://www.comet.ml/docs/python-sdk/advanced/#comet-configuration-variables | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/trainer_pt_utils.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/trainer_pt_utils.py | https://github.com/numpy/numpy/blob/a47ecdea856986cd60eabbd53265c2ca5916ad5d/doc/source/user/basics.types.rst | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/trainer_pt_utils.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/trainer_pt_utils.py | https://github.com/pytorch/pytorch/issues/16266 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/trainer.py | https://www.github.com/nvidia/apex | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/trainer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/trainer.py | https://github.com/huggingface/transformers/pull/4659#issuecomment-643356021 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/trainer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/trainer.py | https://optuna.readthedocs.io/en/stable/reference/generated/optuna.study.create_study.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/trainer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/trainer.py | https://docs.ray.io/en/latest/tune/api_docs/execution.html#tune-run | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/trainer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/trainer.py | https://app.sigopt.com/docs/endpoints/experiments/create | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/tokenization_utils_base.py | https://huggingface.co/models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/tokenization_utils_base.py | https://huggingface.co/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/tokenization_utils.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/tokenization_utils.py | https://en.wikipedia.org/wiki/Trie | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/layoutxlm/tokenization_layoutxlm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/tokenization_utils.py | https://github.com/huggingface/transformers/pull/2674 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/big_bird/tokenization_big_bird.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/tokenization_utils.py | https://github.com/huggingface/transformers/issues/1133 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/testing_utils.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/testing_utils.py | https://github.com/fastai/fastai/blob/master/tests/utils/text.py | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/src/transformers/testing_utils.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/testing_utils.py | https://stackoverflow.com/a/64789046/9201239 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/testing_utils.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/testing_utils.py | https://stackoverflow.com/a/34333710/9201239 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/testing_utils.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/testing_utils.py | https://github.com/pytest-dev/pytest/blob/897f151e/src/_pytest/runner.py#L66 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/testing_utils.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/testing_utils.py | https://github.com/pytest-dev/pytest/blob/897f151e/src/_pytest/terminal.py#L814 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/testing_utils.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/testing_utils.py | https://stackoverflow.com/a/59041913/9201239 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/testing_utils.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/testing_utils.py | https://docs.python.org/3/library/asyncio-subprocess.html#asyncio.asyncio.subprocess.Process.wait | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/pipelines/zero_shot_image_classification.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/pipelines/zero_shot_image_classification.py | https://huggingface.co/models?filter=zero-shot-image-classification | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/pipelines/zero_shot_classification.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/pipelines/zero_shot_classification.py | https://huggingface.co/models?search=nli | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/pipelines/token_classification.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/pipelines/token_classification.py | https://huggingface.co/models?filter=token-classification | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/pipelines/text2text_generation.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/pipelines/text2text_generation.py | https://huggingface.co/models?filter=text2text-generation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/pipelines/text2text_generation.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/pipelines/text2text_generation.py | https://huggingface.co/models?filter=summarization | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/pipelines/text2text_generation.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/pipelines/text2text_generation.py | https://huggingface.co/models?filter=translation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_clm_flax.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/pipelines/text_generation.py | https://huggingface.co/models?filter=text-generation | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/examples/pytorch/text-generation/run_generation.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/pipelines/text_generation.py | https://github.com/rusiaaman/XLNet-gen#methodology | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/text-generation/run_generation.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/pipelines/text_generation.py | https://medium.com/@amanrusia/xlnet-speaks-comparison-to-gpt-2-ea1a4e9ba39e | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/pipelines/text_generation.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/pipelines/text_generation.py | https://github.com/huggingface/transformers/issues/14033#issuecomment-948385227 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/pipelines/text_classification.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/pipelines/text_classification.py | https://huggingface.co/models?filter=text-classification | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/pipelines/table_question_answering.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/pipelines/table_question_answering.py | https://huggingface.co/models?filter=table-question-answering | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/pipelines/question_answering.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/pipelines/question_answering.py | https://huggingface.co/models?filter=question-answering | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/pipelines/question_answering.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/pipelines/question_answering.py | https://github.com/facebookresearch/DrQA | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/pipelines/object_detection.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/pipelines/object_detection.py | https://huggingface.co/models?filter=object-detection | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/pipelines/image_segmentation.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/pipelines/image_segmentation.py | https://huggingface.co/models?filter=image-segmentation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/tensorflow/image-classification/run_image_classification.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/pipelines/image_classification.py | https://huggingface.co/models?filter=image-classification | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_mlm_flax.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/pipelines/fill_mask.py | https://huggingface.co/models?filter=fill-mask | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/pipelines/fill_mask.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/pipelines/fill_mask.py | https://github.com/huggingface/transformers/pull/10222 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/pipelines/feature_extraction.py | https://huggingface.co/models | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/src/transformers/pipelines/conversational.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/pipelines/conversational.py | https://huggingface.co/models?filter=conversational | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/pipelines/base.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/tensorflow/image-classification/run_image_classification.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/pipelines/base.py | https://pytorch.org/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/pipelines/base.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/pipelines/base.py | https://huggingface.co/transformers/main_classes/pipelines.html#pipeline-batching | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/pipelines/audio_classification.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/pipelines/audio_classification.py | https://huggingface.co/models?filter=audio-classification | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/pipelines/__init__.py | https://huggingface.co/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/optimization_tf.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/optimization_tf.py | https://arxiv.org/abs/1711.05101 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/optimization_tf.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/optimization_tf.py | https://arxiv.org/abs/1904.09237 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/optimization_tf.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/optimization_tf.py | https://github.com/OpenNMT/OpenNMT-tf/blob/master/opennmt/optimizers/utils.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/optimization.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/optimization.py | https://github.com/google-research/bert/blob/f39e881b169b9d53bea03d2d341b31707a6c052b/optimization.py#L37 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/optimization_tf.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/optimization.py | https://arxiv.org/abs/1711.05101 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/optimization.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/optimization.py | https://github.com/pytorch/fairseq/blob/master/fairseq/optim/adafactor.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/optimization.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/optimization.py | https://arxiv.org/abs/1804.04235 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/optimization.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/optimization.py | https://discuss.huggingface.co/t/t5-finetuning-tips/684/3 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/optimization.py | 
MT5_ID4146_for_PyTorch/transformers/src/transformers/optimization.py | https://github.com/huggingface/transformers/blob/8395f14de6068012787d83989c3627c3df6a252b/src/transformers/optimization.py#L505 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/yoso/configuration_yoso.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/yoso/modeling_yoso.py | https://huggingface.co/models?filter=yoso | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/yoso/modeling_yoso.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/yoso/modeling_yoso.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/kernels/yoso/fast_lsh_cumulation_cuda.cu | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/yoso/fast_lsh_cumulation_cuda.cu | https://github.com/mlpen/YOSO/blob/main/encoders/backbones/efficient_attentions/yoso/yoso_v1/cuda/fast_lsh_cumulation_cuda.cu | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/kernels/yoso/fast_lsh_cumulation.cu | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/yoso/fast_lsh_cumulation.cu | https://github.com/mlpen/YOSO/blob/main/encoders/backbones/efficient_attentions/yoso/yoso_v1/cuda/fast_lsh_cumulation.cu | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/yoso.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/yoso/convert_yoso_pytorch_to_pytorch.py | https://github.com/mlpen/YOSO | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/yoso/configuration_yoso.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/yoso/configuration_yoso.py | https://huggingface.co/models?filter=yoso | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/yoso/configuration_yoso.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/yoso/configuration_yoso.py | https://huggingface.co/uw-madison/yoso-4096 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/pegasus/tokenization_pegasus_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlnet/tokenization_xlnet_fast.py | https://huggingface.co/docs/tokenizers/python/latest/components.html?highlight=unigram#models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlnet/tokenization_xlnet_fast.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlnet/tokenization_xlnet.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | 
MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlnet/tokenization_xlnet.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/xlnet.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlnet/modeling_xlnet.py | https://huggingface.co/models?filter=xlnet | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlnet/modeling_xlnet.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlnet/modeling_xlnet.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlnet/modeling_xlnet.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/xlnet.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlnet/modeling_tf_xlnet.py | https://huggingface.co/models?filter=xlnet | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlnet/modeling_tf_xlnet.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlnet/tokenization_xlnet.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlnet/configuration_xlnet.py | https://huggingface.co/xlnet-large-cased | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlnet/configuration_xlnet.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlnet/configuration_xlnet.py | https://huggingface.co/transformers/quickstart.html#using-the-past | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlnet/configuration_xlnet.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlnet/configuration_xlnet.py | https://github.com/zihangdai/xlnet/issues/41#issuecomment-505102587 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm_roberta_xl/modeling_xlm_roberta_xl.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm_roberta_xl/modeling_xlm_roberta_xl.py | https://huggingface.co/models?filter=xlm-roberta-xl | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm_roberta_xl/modeling_xlm_roberta_xl.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm_roberta_xl/modeling_xlm_roberta_xl.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm_roberta_xl/modeling_xlm_roberta_xl.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm_roberta_xl/modeling_xlm_roberta_xl.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm_roberta_xl/configuration_xlm_roberta_xl.py | https://huggingface.co/models?filter=xlm-roberta-xl | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/jax-projects/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm_roberta_xl/configuration_xlm_roberta_xl.py | https://huggingface.co/bert-base-uncased | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/question-answering/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm_roberta_xl/configuration_xlm_roberta_xl.py | https://arxiv.org/abs/1803.02155 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/question-answering/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm_roberta_xl/configuration_xlm_roberta_xl.py | https://arxiv.org/abs/2009.13658 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/layoutxlm/tokenization_layoutxlm_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta_fast.py | https://huggingface.co/docs/tokenizers/python/latest/components.html?highlight=BPE#models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm_roberta/tokenization_xlm_roberta.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/xlm-roberta.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm_roberta/modeling_xlm_roberta.py | https://huggingface.co/models?filter=xlm-roberta | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm_roberta/modeling_xlm_roberta.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/xlm-roberta.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm_roberta/modeling_tf_xlm_roberta.py | https://huggingface.co/models?filter=xlm-roberta | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm_roberta/modeling_tf_xlm_roberta.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/xlm-roberta.md | 
MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm_roberta/modeling_flax_xlm_roberta.py | https://huggingface.co/models?filter=xlm-roberta | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm_roberta/modeling_flax_xlm_roberta.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm_roberta/modeling_flax_xlm_roberta.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm_roberta/modeling_flax_xlm_roberta.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm_roberta/modeling_flax_xlm_roberta.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm_roberta/modeling_flax_xlm_roberta.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm_prophetnet/tokenization_xlm_prophetnet.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm_prophetnet/tokenization_xlm_prophetnet.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/xlm-prophetnet.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm_prophetnet/modeling_xlm_prophetnet.py | https://huggingface.co/models?filter=xprophetnet | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://github.com/facebookresearch/XLM/blob/master/tools/lowercase_and_remove_accent.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://github.com/moses-smt/mosesdecoder/blob/master/scripts/tokenizer/replace-unicode-punctuation.perl | 源码实现 | -| 
开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://github.com/moses-smt/mosesdecoder/blob/master/scripts/tokenizer/remove-non-printing-char.perl | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://github.com/rsennrich/wmt16-scripts/blob/master/preprocess/normalise-romanian.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://github.com/rsennrich/wmt16-scripts/blob/master/preprocess/remove-diacritics.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://github.com/neubig/kyt | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/seq2seq/romanian_postprocessing.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | git@github.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://github.com/alvations/sacremoses | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/awesome-transformers.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://github.com/PyThaiNLP/pythainlp | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://github.com/chezou/Mykytea-python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://github.com/neubig/kytea | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://github.com/fxsjy/jieba | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://nlp.stanford.edu/software/stanford-segmenter-2018-10-16.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm/tokenization_xlm.py | https://github.com/facebookresearch/XLM/tree/master/tools | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/xlm.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm/modeling_xlm.py | https://huggingface.co/models?filter=xlm | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | 
MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm/modeling_xlm.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/xlm.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm/modeling_tf_xlm.py | https://huggingface.co/models?filter=xlm | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm/modeling_tf_xlm.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/configuration_xlm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm/configuration_xlm.py | https://huggingface.co/xlm-mlm-en-2048 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/configuration_xlm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm/configuration_xlm.py | http://huggingface.co/transformers/multilingual.html#xlm-language-embeddings | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/layoutxlm/tokenization_layoutxlm_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xglm/tokenization_xglm_fast.py | https://huggingface.co/docs/tokenizers/python/latest/components.html?highlight=BPE#models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xglm/tokenization_xglm.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xglm/tokenization_xglm.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xglm/configuration_xglm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xglm/modeling_xglm.py | https://huggingface.co/models?filter=xglm | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xglm/modeling_xglm.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xglm/modeling_xglm.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xglm/modeling_flax_xglm.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | 
MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xglm/modeling_flax_xglm.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xglm/modeling_flax_xglm.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xglm/modeling_flax_xglm.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xglm/modeling_flax_xglm.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xglm/modeling_flax_xglm.py | https://github.com/google/flax/blob/491ce18759622506588784b4fca0e4bf05f8c8cd/flax/linen/attention.py#L252 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xglm/modeling_flax_xglm.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xglm/configuration_xglm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xglm/configuration_xglm.py | https://huggingface.co/models?filter=xglm | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xglm/tokenization_xglm_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xglm/configuration_xglm.py | https://huggingface.co/facebook/xglm-564M | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xglm/configuration_xglm.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/wavlm/modeling_wavlm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wavlm/modeling_wavlm.py | https://huggingface.co/models?filter=wavlm | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wavlm/modeling_wavlm.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/wavlm/modeling_wavlm.py | 
MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wavlm/modeling_wavlm.py | https://github.com/pytorch/pytorch/issues/32590 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wavlm/modeling_wavlm.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wavlm/modeling_wavlm.py | https://arxiv.org/pdf/1611.01144.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wavlm/modeling_wavlm.py | https://pytorch.org/docs/stable/generated/torch.nn.Conv1d.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wavlm/modeling_wavlm.py | https://arxiv.org/abs/2101.07597 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wavlm/modeling_wavlm.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/trocr.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wavlm/convert_wavlm_original_pytorch_checkpoint_to_pytorch.py | https://github.com/microsoft/unilm | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/wavlm/convert_wavlm_original_pytorch_checkpoint_to_pytorch.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wavlm/convert_wavlm_original_pytorch_checkpoint_to_pytorch.py | https://github.com/microsoft/unilm/commit/b94ec76c36f02fb2b0bf0dcb0b8554a2185173cd | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/wavlm/modeling_wavlm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wavlm/configuration_wavlm.py | https://huggingface.co/models?filter=wavlm | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wavlm/configuration_wavlm.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/wav2vec2_with_lm/test_processor_wav2vec2_with_lm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wav2vec2_with_lm/processing_wav2vec2_with_lm.py | https://huggingface.co/datasets/common_voice/viewer/en/train | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/wav2vec2_phoneme/tokenization_wav2vec2_phoneme.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wav2vec2_phoneme/tokenization_wav2vec2_phoneme.py | https://github.com/bootphon/phonemizer#readme | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/wav2vec2_with_lm/test_processor_wav2vec2_with_lm.py | 
MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wav2vec2/tokenization_wav2vec2.py | https://huggingface.co/datasets/common_voice/viewer/en/train | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/wav2vec2/feature_extraction_wav2vec2.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wav2vec2/tokenization_wav2vec2.py | https://huggingface.co/models?search=lv60 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/jax-projects/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wav2vec2/tokenization_wav2vec2.py | https://huggingface.co/facebook/wav2vec2-base-960h | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/wav2vec2/feature_extraction_wav2vec2.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wav2vec2/tokenization_wav2vec2.py | https://huggingface.co/facebook/wav2vec2-large-960h-lv60-self | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/wav2vec2_conformer/modeling_wav2vec2_conformer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_wav2vec2.py | https://huggingface.co/models?filter=wav2vec2 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/speech-recognition/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_wav2vec2.py | https://arxiv.org/pdf/2006.11477.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_wav2vec2.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_wav2vec2.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_wav2vec2.py | https://arxiv.org/pdf/1611.01144.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_wav2vec2.py | https://pytorch.org/docs/stable/generated/torch.nn.Conv1d.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/jax-projects/wav2vec2/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_wav2vec2.py | https://arxiv.org/abs/2006.11477 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_wav2vec2.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/jax-projects/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_wav2vec2.py | 
https://huggingface.co/facebook/wav2vec2-base-960h | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/wav2vec2_conformer/modeling_wav2vec2_conformer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_tf_wav2vec2.py | https://huggingface.co/models?filter=wav2vec2 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/hubert/modeling_tf_hubert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_tf_wav2vec2.py | https://github.com/tensorflow/tensorflow/issues/9260 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/hubert/modeling_tf_hubert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_tf_wav2vec2.py | https://github.com/pytorch/fairseq/blob/e0788f7007a8473a76db573985031f3c94201e79/fairseq/data/data_utils.py#L376 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/hubert/modeling_tf_hubert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_tf_wav2vec2.py | https://www.tensorflow.org/addons/api_docs/python/tfa/layers/GroupNormalization | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/hubert/modeling_tf_hubert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_tf_wav2vec2.py | https://www.tensorflow.org/probability/api_docs/python/tfp/layers/weight_norm/WeightNorm | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_tf_wav2vec2.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_tf_wav2vec2.py | https://pytorch.org/docs/stable/generated/torch.nn.Conv1d.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_tf_wav2vec2.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_tf_wav2vec2.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/speech-recognition/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_flax_wav2vec2.py | https://arxiv.org/pdf/2006.11477.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_flax_wav2vec2.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/jax-projects/wav2vec2/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_flax_wav2vec2.py | 
https://arxiv.org/abs/2006.11477 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_flax_wav2vec2.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_flax_wav2vec2.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_flax_wav2vec2.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_flax_wav2vec2.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_flax_wav2vec2.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/jax-projects/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_flax_wav2vec2.py | https://huggingface.co/facebook/wav2vec2-base-960h | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_flax_wav2vec2.py | https://arxiv.org/pdf/1611.01144.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_flax_wav2vec2.py | https://pytorch.org/docs/stable/generated/torch.nn.Conv1d.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/wav2vec2/feature_extraction_wav2vec2.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wav2vec2/feature_extraction_wav2vec2.py | https://huggingface.co/models?search=lv60 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/jax-projects/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wav2vec2/feature_extraction_wav2vec2.py | https://huggingface.co/facebook/wav2vec2-base-960h | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/wav2vec2/feature_extraction_wav2vec2.py | 
MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wav2vec2/feature_extraction_wav2vec2.py | https://huggingface.co/facebook/wav2vec2-large-960h-lv60-self | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/wav2vec2_conformer/modeling_wav2vec2_conformer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wav2vec2/configuration_wav2vec2.py | https://huggingface.co/models?filter=wav2vec2 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/jax-projects/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wav2vec2/configuration_wav2vec2.py | https://huggingface.co/facebook/wav2vec2-base-960h | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wav2vec2/configuration_wav2vec2.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vit_mae/modeling_vit_mae.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vit_mae/modeling_vit_mae.py | https://huggingface.co/models?filter=vit_mae | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vit/modeling_tf_vit.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vit_mae/modeling_vit_mae.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/vision_transformer.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vit_mae/modeling_vit_mae.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vit_mae/modeling_vit_mae.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vit_mae/modeling_vit_mae.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/image-pretraining/run_mae.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vit_mae/convert_vit_mae_to_pytorch.py | https://github.com/facebookresearch/mae | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vit_mae/configuration_vit_mae.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vit_mae/configuration_vit_mae.py | https://huggingface.co/models?filter=vit-mae | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vit_mae/configuration_vit_mae.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vit_mae/configuration_vit_mae.py | https://huggingface.co/facebook/vit-mae-base | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/vision/run_image_classification.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vit/modeling_vit.py | https://huggingface.co/models?filter=vit | 
模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/vision_text_dual_encoder/test_modeling_vision_text_dual_encoder.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vit/modeling_vit.py | https://github.com/rwightman/pytorch-image-models/blob/b9bd960a032c75ca6b808ddeed76bee5f3ed4972/timm/models/layers/helpers.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vit/modeling_tf_vit.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vit/modeling_vit.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/vision_transformer.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vit_hybrid/modeling_vit_hybrid.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vit/modeling_vit.py | https://github.com/facebookresearch/dino/blob/de9ee3df6cf39fac952ab558447af1fa1365362a/vision_transformer.py#L174 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vit_hybrid/modeling_vit_hybrid.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vit/modeling_vit.py | https://github.com/facebookresearch/dino/issues/8 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vit/modeling_vit.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vit/modeling_vit.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/image-pretraining/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vit/modeling_vit.py | https://arxiv.org/abs/2111.09886 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vit/modeling_vit.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/vision_text_dual_encoder/test_modeling_vision_text_dual_encoder.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vit/modeling_tf_vit.py | https://github.com/rwightman/pytorch-image-models/blob/b9bd960a032c75ca6b808ddeed76bee5f3ed4972/timm/models/layers/helpers.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vit/modeling_tf_vit.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vit/modeling_tf_vit.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/vision_transformer.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vit_hybrid/modeling_vit_hybrid.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vit/modeling_tf_vit.py | https://github.com/facebookresearch/dino/blob/de9ee3df6cf39fac952ab558447af1fa1365362a/vision_transformer.py#L174 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vit/modeling_tf_vit.py | 
https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vit/modeling_tf_vit.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vit/modeling_flax_vit.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vit/modeling_flax_vit.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vit/modeling_flax_vit.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vit/modeling_flax_vit.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vit/modeling_flax_vit.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vit/modeling_flax_vit.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/vision/run_image_classification.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vit/configuration_vit.py | https://huggingface.co/models?filter=vit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/tensorflow/image-classification/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vit/configuration_vit.py | https://huggingface.co/google/vit-base-patch16-224 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/visual_bert/configuration_visual_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/visual_bert/modeling_visual_bert.py | https://huggingface.co/models?filter=visual_bert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/visual_bert/modeling_visual_bert.py | 
https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/visual_bert/modeling_visual_bert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/visual_bert/modeling_visual_bert.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/visual_bert/configuration_visual_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/visual_bert/configuration_visual_bert.py | https://huggingface.co/models?filter=visual_bert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/visual_bert/configuration_visual_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/visual_bert/configuration_visual_bert.py | https://huggingface.co/uclanlp/visualbert-vqa-coco-pre | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/vision-text-dual-encoder.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_vision_text_dual_encoder.py | https://arxiv.org/abs/2111.07991 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_vision_text_dual_encoder.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_vision_text_dual_encoder.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vision_text_dual_encoder/modeling_flax_vision_text_dual_encoder.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_vision_text_dual_encoder.py | https://farm3.staticflickr.com/2674/5850229113_4fe05d5265_z.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/vision-text-dual-encoder.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_flax_vision_text_dual_encoder.py | https://arxiv.org/abs/2111.07991 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_flax_vision_text_dual_encoder.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | 
MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_flax_vision_text_dual_encoder.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_flax_vision_text_dual_encoder.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_flax_vision_text_dual_encoder.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_flax_vision_text_dual_encoder.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_flax_vision_text_dual_encoder.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vision_text_dual_encoder/modeling_flax_vision_text_dual_encoder.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_flax_vision_text_dual_encoder.py | https://farm3.staticflickr.com/2674/5850229113_4fe05d5265_z.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_vision_encoder_decoder.py | https://arxiv.org/abs/1907.12461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_vision_encoder_decoder.py | https://arxiv.org/abs/2109.10282 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_vision_encoder_decoder.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/trocr.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_vision_encoder_decoder.py | https://fki.tic.heia-fr.ch/static/img/a01-122-02.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_tf_vision_encoder_decoder.py | https://arxiv.org/abs/1907.12461 | 参考论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_tf_vision_encoder_decoder.py | https://arxiv.org/abs/2109.10282 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_tf_vision_encoder_decoder.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_tf_vision_encoder_decoder.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_flax_vision_encoder_decoder.py | https://arxiv.org/abs/1907.12461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_flax_vision_encoder_decoder.py | https://arxiv.org/abs/2109.10282 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_flax_vision_encoder_decoder.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_flax_vision_encoder_decoder.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/trocr/convert_trocr_unilm_to_pytorch.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/convert_trocr_unilm_to_pytorch.py | https://fki.tic.heia-fr.ch/static/img/a01-122-02-12.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/trocr/convert_trocr_unilm_to_pytorch.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/convert_trocr_unilm_to_pytorch.py | https://fki.tic.heia-fr.ch/static/img/a01-122-02-10.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/trocr.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/convert_trocr_unilm_to_pytorch.py | https://fki.tic.heia-fr.ch/static/img/a01-122-02.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/trocr/convert_trocr_unilm_to_pytorch.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/convert_trocr_unilm_to_pytorch.py | https://fki.tic.heia-fr.ch/static/img/a01-122.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vilt/modeling_vilt.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vilt/modeling_vilt.py | https://huggingface.co/models?filter=vilt | 模型相关说明 
| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vit/modeling_tf_vit.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vilt/modeling_vilt.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/vision_transformer.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vilt/modeling_vilt.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vilt/modeling_vilt.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vilt/modeling_vilt.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vilt/modeling_vilt.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vilt/modeling_vilt.py | https://github.com/jnhwkim/ban-vqa/blob/master/train.py#L19 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vilt/modeling_vilt.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vilt/modeling_vilt.py | https://lil.nlp.cornell.edu/nlvr/exs/ex0_0.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vilt/modeling_vilt.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vilt/modeling_vilt.py | https://lil.nlp.cornell.edu/nlvr/exs/ex0_1.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vilt/configuration_vilt.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vilt/configuration_vilt.py | https://huggingface.co/dandelin/vilt-b32-mlm | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/deprecated/van/modeling_van.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/van/modeling_van.py | https://huggingface.co/models?filter=van | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/donut/modeling_donut_swin.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/van/modeling_van.py | https://github.com/tensorflow/tpu/issues/494#issuecomment-532968956 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/deprecated/van/modeling_van.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/van/modeling_van.py | https://arxiv.org/abs/2106.13797 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/van/modeling_van.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/van.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/van/convert_van_to_pytorch.py | 
https://github.com/Visual-Attention-Network/VAN-Classification | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | https://huggingface.co/models?filter=unispeech_sat | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/speech-recognition/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | https://arxiv.org/pdf/2006.11477.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | https://arxiv.org/pdf/1611.01144.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | https://pytorch.org/docs/stable/generated/torch.nn.Conv1d.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/jax-projects/wav2vec2/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | https://arxiv.org/abs/2006.11477 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | https://huggingface.co/models?filter=unispeech_sat | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/unispeech/modeling_unispeech.py | https://huggingface.co/models?filter=unispeech | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/speech-recognition/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/unispeech/modeling_unispeech.py 
| https://arxiv.org/pdf/2006.11477.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/unispeech/modeling_unispeech.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/unispeech/modeling_unispeech.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/unispeech/modeling_unispeech.py | https://arxiv.org/pdf/1611.01144.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/unispeech/modeling_unispeech.py | https://pytorch.org/docs/stable/generated/torch.nn.Conv1d.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/unispeech/modeling_unispeech.py | https://arxiv.org/abs/2101.07597 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/unispeech/modeling_unispeech.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/unispeech/configuration_unispeech.py | https://huggingface.co/models?filter=unispeech | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/unispeech/configuration_unispeech.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/trocr.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/trocr/modeling_trocr.py | https://huggingface.co/models?filter=trocr | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/trocr/modeling_trocr.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/trocr/modeling_trocr.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/trocr.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/trocr/configuration_trocr.py | https://huggingface.co/models?filter=trocr | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/src/transformers/models/trocr/configuration_trocr.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/trocr/configuration_trocr.py | https://huggingface.co/microsoft/trocr-base | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/trocr/configuration_trocr.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_transfo_xl.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/transfo_xl/tokenization_transfo_xl.py | https://github.com/kimiyoung/transformer-xl | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_transfo_xl.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/transfo_xl/modeling_transfo_xl_utilities.py | https://github.com/kimiyoung/transformer-xl | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/transfo_xl/modeling_transfo_xl_utilities.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/transfo_xl/modeling_transfo_xl_utilities.py | https://github.com/pytorch/pytorch/blob/dbe6a7a9ff1a364a8706bf5df58a1ca96d2fd9da/torch/nn/modules/adaptive.py#L138 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/transfo_xl/modeling_transfo_xl_utilities.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/transfo_xl/modeling_transfo_xl_utilities.py | https://github.com/pytorch/pytorch/blob/master/torch/nn/modules/adaptive.p | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_transfo_xl.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/transfo_xl/modeling_transfo_xl.py | https://github.com/kimiyoung/transformer-xl | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/transfo_xl/modeling_transfo_xl.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/transfo_xl/modeling_transfo_xl.py | https://github.com/kimiyoung/transformer-xl/blob/master/pytorch/mem_transformer.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/transfo-xl.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/transfo_xl/modeling_transfo_xl.py | https://huggingface.co/models?filter=transfo-xl | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/transfo_xl/modeling_transfo_xl.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/transfo_xl/modeling_transfo_xl.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/transfo_xl/modeling_tf_transfo_xl.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/transfo_xl/modeling_transfo_xl.py | https://github.com/huggingface/transformers/issues/3310 | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/transfo-xl.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/transfo_xl/modeling_tf_transfo_xl.py | https://huggingface.co/models?filter=transfo-xl | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/transfo_xl/modeling_tf_transfo_xl.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/transfo_xl/modeling_tf_transfo_xl.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/transfo_xl/modeling_tf_transfo_xl.py | https://github.com/huggingface/transformers/issues/3310 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/transfo_xl/convert_transfo_xl_original_tf_checkpoint_to_pytorch.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/transfo_xl/convert_transfo_xl_original_tf_checkpoint_to_pytorch.py | https://stackoverflow.com/questions/2121874/python-pickling-after-changing-a-modules-directory/2121918#2121918 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/transfo_xl/tokenization_transfo_xl.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/transfo_xl/configuration_transfo_xl.py | https://huggingface.co/transfo-xl-wt103 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mobilebert/tokenization_mobilebert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://github.com/huggingface/transformers/issues/328 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/tokenization_tapas.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://github.com/google-research/tapas/blob/4908213eb4df7aa988573350278b44c4dbe3f71b/tapas/experiments/prediction_utils.py#L288 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_chinese_ref.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://en.wikipedia.org/wiki/CJK_Unified_Ideographs_ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/tokenization_tapas.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://github.com/google-research/tapas/blob/master/tapas/utils/constants.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/tokenization_tapas.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://github.com/google-research/tapas/blob/master/tapas/utils/number_utils.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/tokenization_tapas.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://github.com/google-research/tapas/blob/master/tapas/utils/text_utils.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/tokenization_tapas.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | 
https://github.com/Microsoft/DynSP/blob/master/util.py#L414 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/tokenization_tapas.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://github.com/Microsoft/DynSP/blob/master/util.py#L293 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/tokenization_tapas.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/tapas/tokenization_tapas.py | https://github.com/google-research/tapas/blob/master/tapas/utils/number_annotation_utils.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/tapas.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/tapas/modeling_tf_tapas.py | https://github.com/tensorflow/probability | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/modeling_tapas.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/tapas/modeling_tf_tapas.py | https://huggingface.co/models?filter=tapas | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/tapas/modeling_tf_tapas.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/modeling_tapas.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/tapas/modeling_tapas.py | https://huggingface.co/models?filter=tapas | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/tapas/modeling_tapas.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/tapas/modeling_tapas.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/tapas/modeling_tapas.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/tapas/modeling_tapas.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/configuration_tapas.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/tapas/configuration_tapas.py | https://github.com/google-research/tapas/blob/master/tapas/run_task_main.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/configuration_tapas.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/tapas/configuration_tapas.py | https://github.com/google-research/tapas/blob/master/tapas/utils/hparam_utils.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/tapas/configuration_tapas.py | 
MT5_ID4146_for_PyTorch/transformers/src/transformers/models/tapas/configuration_tapas.py | https://github.com/google-research/tapas/tree/master | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/pegasus/tokenization_pegasus_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5_fast.py | https://huggingface.co/docs/tokenizers/python/latest/components.html?highlight=unigram#models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5_fast.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/byt5/tokenization_byt5.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5_fast.py | https://github.com/google-research/text-to-text-transfer-transformer/blob/9fd7b14a769417be33bc6c850f9598764913c833/t5/data/preprocessors.py#L2117 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/byt5/tokenization_byt5.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5.py | https://github.com/google-research/text-to-text-transfer-transformer/blob/9fd7b14a769417be33bc6c850f9598764913c833/t5/data/preprocessors.py#L2117 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/tokenization_t5.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_t5_mlm_flax.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/modeling_tf_t5.py | https://huggingface.co/models?filter=t5 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/switch_transformers/modeling_switch_transformers.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/modeling_tf_t5.py | https://github.com/tensorflow/mesh/blob/0cb87fe07da627bf0b7e60475d59f95ed6b5be3d/mesh_tensorflow/transformer/transformer_layers.py#L593 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/modeling_tf_t5.py | https://github.com/tensorflow/mesh/blob/8d2465e9bc93129b913b5ccc6a59aa97abd96ec6/mesh_tensorflow/transformer/transformer_layers.py#L270 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/modeling_tf_t5.py | https://arxiv.org/abs/1910.10683 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/modeling_tf_t5.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_t5_mlm_flax.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | https://huggingface.co/models?filter=t5 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/open_model_proposals/ADD_BIG_BIRD.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | https://medium.com/huggingface/from-tensorflow-to-pytorch-265f40ef2a28 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/switch_transformers/modeling_switch_transformers.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | https://arxiv.org/abs/1910.07467 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/switch_transformers/modeling_switch_transformers.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | https://github.com/tensorflow/mesh/blob/0cb87fe07da627bf0b7e60475d59f95ed6b5be3d/mesh_tensorflow/transformer/transformer_layers.py#L593 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/switch_transformers/modeling_switch_transformers.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | https://github.com/tensorflow/mesh/blob/fa19d69eafc9a482aff0b59ddd96b025c0cb207d/mesh_tensorflow/layers.py#L1624 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/switch_transformers/modeling_switch_transformers.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | https://github.com/tensorflow/mesh/blob/master/mesh_tensorflow/transformer/transformer_layers.py#L56 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/switch_transformers/modeling_switch_transformers.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | https://github.com/tensorflow/mesh/blob/fa19d69eafc9a482aff0b59ddd96b025c0cb207d/mesh_tensorflow/layers.py#L89 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/switch_transformers/modeling_switch_transformers.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | https://github.com/tensorflow/mesh/blob/fa19d69eafc9a482aff0b59ddd96b025c0cb207d/mesh_tensorflow/transformer/attention.py#L136 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | https://arxiv.org/abs/1910.10683 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/switch_transformers/modeling_switch_transformers.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | 
https://github.com/tensorflow/mesh/blob/fa19d69eafc9a482aff0b59ddd96b025c0cb207d/mesh_tensorflow/transformer/transformer.py#L586 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mt5/modeling_mt5.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | https://github.com/tensorflow/mesh/blob/fa19d69eafc9a482aff0b59ddd96b025c0cb207d/mesh_tensorflow/layers.py#L666 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/switch_transformers/modeling_switch_transformers.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/modeling_flax_t5.py | https://github.com/tensorflow/mesh/blob/0cb87fe07da627bf0b7e60475d59f95ed6b5be3d/mesh_tensorflow/transformer/transformer_layers.py#L593 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/modeling_flax_t5.py | https://github.com/google/flax/blob/491ce18759622506588784b4fca0e4bf05f8c8cd/flax/linen/attention.py#L252 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/modeling_flax_t5.py | https://arxiv.org/abs/1910.13461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/modeling_flax_t5.py | https://arxiv.org/abs/1910.10683 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/modeling_flax_t5.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/modeling_flax_t5.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/modeling_flax_t5.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/modeling_flax_t5.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/modeling_flax_t5.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | 
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/switch_transformers/modeling_switch_transformers.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/modeling_flax_t5.py | https://github.com/tensorflow/mesh/blob/fa19d69eafc9a482aff0b59ddd96b025c0cb207d/mesh_tensorflow/transformer/transformer.py#L586 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/jax-projects/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/configuration_t5.py | https://huggingface.co/t5-small | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/swin/modeling_swin.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/swin/modeling_swin.py | https://huggingface.co/models?filter=swin | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/swin/modeling_swin.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/swin/modeling_swin.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/image-pretraining/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/swin/modeling_swin.py | https://arxiv.org/abs/2111.09886 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/swin/modeling_swin.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/swin/modeling_swin.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/swin/configuration_swin.py | https://huggingface.co/models?filter=swin | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/swin/configuration_swin.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/swin/configuration_swin.py | https://huggingface.co/microsoft/swin-tiny-patch4-window7-224 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/squeezebert/modeling_squeezebert.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/squeezebert/modeling_squeezebert.py | https://arxiv.org/abs/2006.11316 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/squeezebert/modeling_squeezebert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mobilebert/tokenization_mobilebert.py | 
MT5_ID4146_for_PyTorch/transformers/src/transformers/models/splinter/tokenization_splinter_fast.py | https://github.com/huggingface/transformers/issues/328 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mobilebert/tokenization_mobilebert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/splinter/tokenization_splinter.py | https://github.com/huggingface/transformers/issues/328 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_chinese_ref.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/splinter/tokenization_splinter.py | https://en.wikipedia.org/wiki/CJK_Unified_Ideographs_ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/splinter/configuration_splinter.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/splinter/modeling_splinter.py | https://huggingface.co/models?filter=splinter | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/splinter/modeling_splinter.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/splinter/modeling_splinter.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/splinter/modeling_splinter.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/splinter/configuration_splinter.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/splinter/configuration_splinter.py | https://huggingface.co/models?filter=splinter | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/splinter/tokenization_splinter_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/splinter/configuration_splinter.py | https://huggingface.co/tau/splinter-base | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/speech_to_text_2.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/speech_to_text_2/modeling_speech_to_text_2.py | https://huggingface.co/models?filter=speech2text2 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/speech_to_text_2/modeling_speech_to_text_2.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/speech_to_text_2/modeling_speech_to_text_2.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/speech_to_text_2.md | 
MT5_ID4146_for_PyTorch/transformers/src/transformers/models/speech_to_text_2/configuration_speech_to_text_2.py | https://huggingface.co/models?filter=speech2text2 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/speech_to_text/configuration_speech_to_text.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/speech_to_text_2/configuration_speech_to_text_2.py | https://huggingface.co/facebook/s2t-small-librispeech-asr | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/speech_to_text_2/configuration_speech_to_text_2.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/speech_to_text/tokenization_speech_to_text.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/speech_to_text/tokenization_speech_to_text.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/speech_to_text.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/speech_to_text/modeling_tf_speech_to_text.py | https://huggingface.co/models?filter=speech_to_text | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/speech_to_text/modeling_tf_speech_to_text.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/speech_to_text/modeling_tf_speech_to_text.py | https://arxiv.org/abs/1911.08460 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/pegasus/modeling_tf_pegasus.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/speech_to_text/modeling_tf_speech_to_text.py | https://github.com/tensorflow/models/blob/a009f4fb9d2fc4949e32192a944688925ef78659/official/transformer/v2/embedding_layer.py#L24 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/speech_to_text/modeling_tf_speech_to_text.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/speech_to_text/modeling_tf_speech_to_text.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/speech_to_text.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/speech_to_text/modeling_speech_to_text.py | https://huggingface.co/models?filter=speech_to_text | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/speech_to_text/modeling_tf_speech_to_text.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/speech_to_text/modeling_speech_to_text.py | https://arxiv.org/abs/1911.08460 | 参考论文地址 | -| 开源代码引入 
| https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/speech_to_text/modeling_speech_to_text.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/speech_to_text/modeling_speech_to_text.py | https://arxiv.org/abs/1910.13461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/speech_to_text/modeling_speech_to_text.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/speech_to_text.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/speech_to_text/configuration_speech_to_text.py | https://huggingface.co/models?filter=speech_to_text | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/speech_to_text/configuration_speech_to_text.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/speech_to_text/configuration_speech_to_text.py | https://huggingface.co/facebook/s2t-small-librispeech-asr | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/speech_to_text/configuration_speech_to_text.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/speech_encoder_decoder/modeling_speech_encoder_decoder.py | https://arxiv.org/abs/1907.12461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/speech_encoder_decoder/modeling_speech_encoder_decoder.py | https://arxiv.org/abs/2104.06678 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/speech_encoder_decoder/modeling_speech_encoder_decoder.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/speech_encoder_decoder/modeling_flax_speech_encoder_decoder.py | https://arxiv.org/abs/1907.12461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/speech_encoder_decoder/modeling_flax_speech_encoder_decoder.py | https://arxiv.org/abs/2104.06678 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/speech_encoder_decoder/modeling_flax_speech_encoder_decoder.py | 
https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/speech_encoder_decoder/modeling_flax_speech_encoder_decoder.py | https://pytorch.org/docs/stable/generated/torch.nn.Conv1d.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/sew_d/modeling_sew_d.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/sew_d/modeling_sew_d.py | https://huggingface.co/models?filter=sew-d | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/sew_d/modeling_sew_d.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/sew_d/modeling_sew_d.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/sew_d/modeling_sew_d.py | https://pytorch.org/docs/stable/generated/torch.nn.Conv1d.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/sew_d/modeling_sew_d.py | https://arxiv.org/abs/2109.06870 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/sew_d/modeling_sew_d.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/sew_d/modeling_sew_d.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/sew_d/configuration_sew_d.py | https://huggingface.co/models?filter=sew-d | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/sew_d/configuration_sew_d.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/sew_d/configuration_sew_d.py | https://huggingface.co/asapp/sew-d-tiny-100k | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/sew_d/configuration_sew_d.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/sew_d/modeling_sew_d.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/sew/modeling_sew.py | https://huggingface.co/models?filter=sew | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/sew/modeling_sew.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/sew/modeling_sew.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/sew/modeling_sew.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/sew/modeling_sew.py | https://pytorch.org/docs/stable/generated/torch.nn.Conv1d.html | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/sew/modeling_sew.py | https://arxiv.org/abs/2109.06870 | 参考论文地址 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/sew/modeling_sew.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/sew_d/modeling_sew_d.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/sew/configuration_sew.py | https://huggingface.co/models?filter=sew | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/utils/create_dummy_models.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/sew/configuration_sew.py | https://huggingface.co/asapp/sew-tiny-100k | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/sew/configuration_sew.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/segformer/modeling_segformer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/segformer/modeling_segformer.py | https://huggingface.co/models?filter=segformer | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/vision_text_dual_encoder/test_modeling_vision_text_dual_encoder.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/segformer/modeling_segformer.py | https://github.com/rwightman/pytorch-image-models/blob/b9bd960a032c75ca6b808ddeed76bee5f3ed4972/timm/models/layers/helpers.py | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/donut/modeling_donut_swin.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/segformer/modeling_segformer.py | https://github.com/tensorflow/tpu/issues/494#issuecomment-532968956 | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/segformer/modeling_segformer.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/segformer/modeling_segformer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/segformer/modeling_segformer.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/segformer/modeling_segformer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/segformer/configuration_segformer.py | https://huggingface.co/models?filter=segformer | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/segformer/configuration_segformer.py | https://huggingface.co/nvidia/segformer-b0-finetuned-ade-512-512 | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roformer/tokenization_roformer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roformer/tokenization_utils.py | https://pypi.org/project/rjieba/ | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roformer/tokenization_roformer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roformer/tokenization_roformer.py | https://pypi.org/project/rjieba/ | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mobilebert/tokenization_mobilebert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roformer/tokenization_roformer.py | https://github.com/huggingface/transformers/issues/328 | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roformer/modeling_roformer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roformer/modeling_tf_roformer.py | https://huggingface.co/models?filter=roformer | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/pegasus/modeling_tf_pegasus.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roformer/modeling_tf_roformer.py | https://github.com/tensorflow/models/blob/a009f4fb9d2fc4949e32192a944688925ef78659/official/transformer/v2/embedding_layer.py#L24 | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roformer/modeling_roformer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roformer/modeling_tf_roformer.py | https://kexue.fm/archives/8265 | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roformer/modeling_tf_roformer.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roformer/modeling_roformer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roformer/modeling_roformer.py | https://huggingface.co/models?filter=roformer | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roformer/modeling_roformer.py | https://www.tensorflow.org/install/ | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roformer/modeling_roformer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roformer/modeling_roformer.py | https://kexue.fm/archives/8265 | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roformer/modeling_roformer.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roformer/modeling_roformer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roformer/modeling_roformer.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roformer/modeling_roformer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roformer/modeling_flax_roformer.py | https://huggingface.co/models?filter=roformer | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roformer/modeling_flax_roformer.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roformer/modeling_flax_roformer.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roformer/modeling_flax_roformer.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roformer/modeling_flax_roformer.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roformer/modeling_flax_roformer.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roformer/modeling_roformer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roformer/configuration_roformer.py | 
https://huggingface.co/models?filter=roformer | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roformer/configuration_roformer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roformer/configuration_roformer.py | https://huggingface.co/junnyu/roformer_chinese_base | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roberta/tokenization_roberta_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta_fast.py | https://docs.python.org/3/library/stdtypes.html#bytes.decode | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roberta/tokenization_roberta_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roberta/tokenization_roberta.py | https://docs.python.org/3/library/stdtypes.html#bytes.decode | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/roberta.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roberta/modeling_tf_roberta.py | https://huggingface.co/models?filter=roberta | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roberta/modeling_tf_roberta.py | https://github.com/tensorflow/mesh/blob/8d2465e9bc93129b913b5ccc6a59aa97abd96ec6/mesh_tensorflow/transformer/transformer_layers.py#L270 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roberta/modeling_tf_roberta.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/roberta.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roberta/modeling_roberta.py | https://huggingface.co/models?filter=roberta | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roberta/modeling_roberta.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roberta/modeling_roberta.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roberta/modeling_roberta.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roberta/modeling_flax_roberta.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roberta/modeling_flax_roberta.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roberta/modeling_flax_roberta.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roberta/modeling_flax_roberta.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roberta/modeling_flax_roberta.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/deprecated/retribert/modeling_retribert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/retribert/modeling_retribert.py | https://huggingface.co/models?filter=retribert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/retribert/modeling_retribert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/resnet/modeling_resnet.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/resnet/modeling_resnet.py | https://huggingface.co/models?filter=resnet | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/resnet/modeling_resnet.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/resnet/configuration_resnet.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/resnet/configuration_resnet.py | https://huggingface.co/microsoft/resnet-50 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/pegasus/tokenization_pegasus_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/rembert/tokenization_rembert_fast.py | https://huggingface.co/docs/tokenizers/python/latest/components.html?highlight=unigram#models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/rembert/tokenization_rembert_fast.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/rembert/tokenization_rembert.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/rembert/configuration_rembert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/rembert/modeling_tf_rembert.py | https://huggingface.co/models?filter=rembert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/rembert/modeling_tf_rembert.py | https://github.com/tensorflow/mesh/blob/8d2465e9bc93129b913b5ccc6a59aa97abd96ec6/mesh_tensorflow/transformer/transformer_layers.py#L270 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/rembert/modeling_tf_rembert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/rembert/configuration_rembert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/rembert/modeling_rembert.py | https://huggingface.co/models?filter=rembert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/rembert/modeling_rembert.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/rembert/modeling_rembert.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/rembert/modeling_rembert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/rembert/modeling_rembert.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/rembert/configuration_rembert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/rembert/configuration_rembert.py | https://huggingface.co/models?filter=rembert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/pegasus/tokenization_pegasus_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/reformer/tokenization_reformer_fast.py | https://huggingface.co/docs/tokenizers/python/latest/components.html?highlight=unigram#models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/reformer/tokenization_reformer_fast.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | 
MT5_ID4146_for_PyTorch/transformers/src/transformers/models/reformer/tokenization_reformer.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/reformer/tokenization_reformer.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/reformer.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/reformer/modeling_reformer.py | https://huggingface.co/models?filter=reformer | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/reformer/modeling_reformer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/reformer/modeling_reformer.py | https://arxiv.org/pdf/1509.02897.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/reformer/modeling_reformer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/reformer/modeling_reformer.py | https://arxiv.org/pdf/2001.04451.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/reformer/modeling_reformer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/reformer/modeling_reformer.py | https://towardsdatascience.com/illustrating-the-reformer-393575ac6ba0 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/reformer/modeling_reformer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/reformer/modeling_reformer.py | https://github.com/lucidrains/reformer-pytorch/blob/master/reformer_pytorch/reversible.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/reformer/modeling_reformer.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/reformer/modeling_reformer.py | https://arxiv.org/abs/2001.04451 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/reformer/modeling_reformer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mobilebert/tokenization_mobilebert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm_fast.py | https://github.com/huggingface/transformers/issues/328 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mobilebert/tokenization_mobilebert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm.py | https://github.com/huggingface/transformers/issues/328 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_chinese_ref.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/realm/tokenization_realm.py | https://en.wikipedia.org/wiki/CJK_Unified_Ideographs_ | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/src/transformers/models/realm/configuration_realm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/realm/modeling_realm.py | https://huggingface.co/models?filter=realm | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/realm/modeling_realm.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/realm/modeling_realm.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/realm/modeling_realm.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/realm/configuration_realm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/realm/configuration_realm.py | https://huggingface.co/models?filter=realm | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/realm/tokenization_realm_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/realm/configuration_realm.py | https://huggingface.co/google/realm-cc-news-pretrained-embedder | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/rag/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/rag/retrieval_rag.py | https://github.com/facebookresearch/DPR | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/rag-end2end-retriever/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/rag/modeling_tf_rag.py | https://arxiv.org/abs/2005.11401 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/rag/modeling_tf_rag.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/rag/modeling_tf_rag.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/rag/modeling_tf_rag.py | https://arxiv.org/pdf/2005.11401.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/rag/modeling_tf_rag.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/rag/modeling_tf_rag.py | https://stackoverflow.com/questions/52129909/tensorflow-equivalent-of-torch-gather | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/rag-end2end-retriever/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/rag/modeling_rag.py | https://arxiv.org/abs/2005.11401 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/rag/modeling_rag.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/docs/source/ja/generation_strategies.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/rag/modeling_rag.py | https://arxiv.org/pdf/1610.02424.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/whisper/modeling_whisper.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/rag/modeling_rag.py | https://arxiv.org/abs/2010.00904 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/quantization-qdqbert/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/qdqbert/modeling_qdqbert.py | https://github.com/NVIDIA/TensorRT/tree/master/tools/pytorch-quantization | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/bert.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/qdqbert/modeling_qdqbert.py | https://huggingface.co/models?filter=bert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/qdqbert/modeling_qdqbert.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/qdqbert/modeling_qdqbert.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/qdqbert/modeling_qdqbert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/qdqbert/modeling_qdqbert.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/bert.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/qdqbert/configuration_qdqbert.py | https://huggingface.co/models?filter=bert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/jax-projects/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/qdqbert/configuration_qdqbert.py | https://huggingface.co/bert-base-uncased | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mobilebert/tokenization_mobilebert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/prophetnet/tokenization_prophetnet.py | https://github.com/huggingface/transformers/issues/328 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/prophetnet.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/prophetnet/modeling_prophetnet.py | https://huggingface.co/models?filter=prophetnet | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/prophetnet.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/prophetnet/modeling_prophetnet.py | https://github.com/microsoft/ProphetNet | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/prophetnet/modeling_prophetnet.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/prophetnet/configuration_prophetnet.py | https://arxiv.org/abs/1910.10683 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/poolformer/modeling_poolformer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/poolformer/modeling_poolformer.py | https://huggingface.co/models?filter=poolformer | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/donut/modeling_donut_swin.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/poolformer/modeling_poolformer.py | https://github.com/tensorflow/tpu/issues/494#issuecomment-532968956 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/poolformer/modeling_poolformer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/poolformer.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/poolformer/convert_poolformer_original_to_pytorch.py | https://github.com/sail-sg/poolformer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/poolformer/modeling_poolformer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/poolformer/configuration_poolformer.py | https://huggingface.co/models?filter=poolformer | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/poolformer/configuration_poolformer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/poolformer/configuration_poolformer.py | https://huggingface.co/sail/poolformer_s12 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/plbart/tokenization_plbart.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/plbart/configuration_plbart.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/plbart/modeling_plbart.py | https://huggingface.co/models?filter=plbart | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/plbart/modeling_plbart.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/plbart/modeling_plbart.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/plbart/configuration_plbart.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/plbart/configuration_plbart.py | https://huggingface.co/models?filter=plbart | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/plbart/configuration_plbart.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/plbart/configuration_plbart.py | https://huggingface.co/uclanlp/plbart-base | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/plbart/configuration_plbart.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/perceiver/modeling_perceiver.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/perceiver/modeling_perceiver.py | https://huggingface.co/models?filter=perceiver | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/perceiver/modeling_perceiver.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/perceiver/modeling_perceiver.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/perceiver/modeling_perceiver.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/perceiver/modeling_perceiver.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/perceiver/modeling_perceiver.py | https://discuss.pytorch.org/t/is-there-any-layer-like-tensorflows-space-to-depth-function/3487/15 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/perceiver/modeling_perceiver.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/perceiver/modeling_perceiver.py | https://gist.github.com/sumanmichael/4de9dee93f972d47c80c4ade8e149ea6 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/perceiver/modeling_perceiver.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/perceiver/configuration_perceiver.py | https://huggingface.co/models?filter=perceiver | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/perceiver/configuration_perceiver.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/perceiver/configuration_perceiver.py | 
https://huggingface.co/deepmind/language-perceiver | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/pegasus/tokenization_pegasus_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/pegasus/tokenization_pegasus_fast.py | https://huggingface.co/docs/tokenizers/python/latest/components.html?highlight=unigram#models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/pegasus/tokenization_pegasus_fast.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/pegasus.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/pegasus/tokenization_pegasus_fast.py | https://arxiv.org/pdf/1912.08777.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/pegasus/tokenization_pegasus.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/pegasus/tokenization_pegasus_fast.py | https://github.com/google-research/pegasus/blob/939830367bcf411193d2b5eca2f2f90f3f9260ca/pegasus/ops/pretrain_parsing_ops.cc#L66 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/pegasus/tokenization_pegasus.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/pegasus.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/pegasus/tokenization_pegasus.py | https://arxiv.org/pdf/1912.08777.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/pegasus/tokenization_pegasus.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/pegasus/tokenization_pegasus.py | https://github.com/google-research/pegasus/blob/939830367bcf411193d2b5eca2f2f90f3f9260ca/pegasus/ops/pretrain_parsing_ops.cc#L66 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/pegasus/tokenization_pegasus.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/pegasus/modeling_tf_pegasus.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_tf_pegasus.py | https://github.com/tensorflow/models/blob/a009f4fb9d2fc4949e32192a944688925ef78659/official/transformer/v2/embedding_layer.py#L24 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_tf_pegasus.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_tf_pegasus.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/pegasus.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_pegasus.py | 
https://huggingface.co/models?filter=pegasus | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_pegasus.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_pegasus.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_flax_pegasus.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_flax_pegasus.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_flax_pegasus.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_flax_pegasus.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_flax_pegasus.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_flax_pegasus.py | https://arxiv.org/abs/1910.13461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_flax_pegasus.py | https://github.com/google/flax/blob/491ce18759622506588784b4fca0e4bf05f8c8cd/flax/linen/attention.py#L252 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_flax_pegasus.py | 
https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/pegasus.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/pegasus/configuration_pegasus.py | https://huggingface.co/models?filter=pegasus | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/pegasus/configuration_pegasus.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/pegasus/configuration_pegasus.py | https://huggingface.co/google/pegasus-large | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/pegasus/configuration_pegasus.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/openai-gpt.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/openai/modeling_tf_openai.py | https://huggingface.co/models?filter=openai-gpt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/openai/modeling_tf_openai.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/openai-gpt.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/openai/modeling_openai.py | https://huggingface.co/models?filter=openai-gpt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/openai/modeling_openai.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/openai/modeling_openai.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/openai/configuration_openai.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/openai/configuration_openai.py | https://huggingface.co/openai-gpt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/nystromformer/configuration_nystromformer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/nystromformer/modeling_nystromformer.py | https://huggingface.co/models?filter=nystromformer | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/nystromformer/modeling_nystromformer.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/nystromformer/modeling_nystromformer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/src/transformers/models/nystromformer/configuration_nystromformer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/nystromformer/configuration_nystromformer.py | https://huggingface.co/models?filter=nystromformer | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/nystromformer/configuration_nystromformer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/nystromformer/configuration_nystromformer.py | https://huggingface.co/uw-madison/nystromformer-512 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/mt5.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mt5/configuration_mt5.py | https://huggingface.co/google/mt5-small | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mobilebert/tokenization_mobilebert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mpnet/tokenization_mpnet_fast.py | https://github.com/huggingface/transformers/issues/328 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mobilebert/tokenization_mobilebert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mpnet/tokenization_mpnet.py | https://github.com/huggingface/transformers/issues/328 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_chinese_ref.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mpnet/tokenization_mpnet.py | https://en.wikipedia.org/wiki/CJK_Unified_Ideographs_ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mpnet/modeling_tf_mpnet.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mpnet/modeling_mpnet.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mpnet/modeling_mpnet.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mobilebert/modeling_tf_mobilebert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mobilebert/modeling_tf_mobilebert.py | https://huggingface.co/models?filter=mobilebert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mobilebert/modeling_tf_mobilebert.py | https://arxiv.org/abs/2004.02984 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mobilebert/modeling_tf_mobilebert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mobilebert/modeling_mobilebert.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mobilebert/modeling_mobilebert.py | https://arxiv.org/abs/2004.02984 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mobilebert/modeling_mobilebert.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mobilebert/modeling_mobilebert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mobilebert/modeling_mobilebert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mobilebert/modeling_mobilebert.py | https://arxiv.org/pdf/2004.02984.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/deprecated/mmbt/modeling_mmbt.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mmbt/modeling_mmbt.py | https://github.com/facebookresearch/mmbt | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mmbt/modeling_mmbt.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mluke/tokenization_mluke.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mluke/tokenization_mluke.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mluke/tokenization_mluke.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mluke/tokenization_mluke.py | https://github.com/huggingface/transformers/pull/2778 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/layoutxlm/tokenization_layoutxlm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mluke/tokenization_mluke.py | https://github.com/huggingface/transformers/pull/2674 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/codeparrot/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/megatron_gpt2/convert_megatron_gpt2_checkpoint.py | https://github.com/NVIDIA/Megatron-LM | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ja/perf_train_gpu_one.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/megatron_gpt2/convert_megatron_gpt2_checkpoint.py | https://github.com/microsoft/Megatron-DeepSpeed/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/megatron_bert/convert_megatron_bert_checkpoint.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/megatron_gpt2/convert_megatron_gpt2_checkpoint.py | 
https://github.com/NVIDIA/Megatron-LM/blob/v2.4/megatron/checkpointing.py#L209 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/megatron_gpt2/convert_megatron_gpt2_checkpoint.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/megatron_gpt2/convert_megatron_gpt2_checkpoint.py | https://github.com/huggingface/transformers/issues/13906 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/megatron_bert/modeling_megatron_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/megatron_bert/modeling_megatron_bert.py | https://huggingface.co/models?filter=megatron_bert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/megatron_bert/modeling_megatron_bert.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/megatron_bert/modeling_megatron_bert.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/megatron_bert/modeling_megatron_bert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/megatron_bert/modeling_megatron_bert.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/codeparrot/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/megatron_bert/convert_megatron_bert_checkpoint.py | https://github.com/NVIDIA/Megatron-LM | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ja/perf_train_gpu_one.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/megatron_bert/convert_megatron_bert_checkpoint.py | https://github.com/microsoft/Megatron-DeepSpeed/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/megatron_bert/convert_megatron_bert_checkpoint.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/megatron_bert/convert_megatron_bert_checkpoint.py | https://github.com/NVIDIA/Megatron-LM/blob/v2.4/megatron/checkpointing.py#L209 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/bert.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/megatron_bert/configuration_megatron_bert.py | https://huggingface.co/models?filter=bert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/megatron_bert/configuration_megatron_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/megatron_bert/configuration_megatron_bert.py | https://huggingface.co/nvidia/megatron-bert-uncased-345m | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/question-answering/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/megatron_bert/configuration_megatron_bert.py | 
https://arxiv.org/abs/1803.02155 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/question-answering/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/megatron_bert/configuration_megatron_bert.py | https://arxiv.org/abs/2009.13658 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/layoutxlm/tokenization_layoutxlm_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mbart50/tokenization_mbart50_fast.py | https://huggingface.co/docs/tokenizers/python/latest/components.html?highlight=BPE#models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mbart50/tokenization_mbart50.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mbart50/tokenization_mbart50.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/layoutxlm/tokenization_layoutxlm_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mbart/tokenization_mbart_fast.py | https://huggingface.co/docs/tokenizers/python/latest/components.html?highlight=BPE#models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mbart/tokenization_mbart.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mbart/modeling_tf_mbart.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mbart/modeling_tf_mbart.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/mbart.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mbart/modeling_mbart.py | https://huggingface.co/models?filter=mbart | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mbart/modeling_mbart.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mbart/modeling_mbart.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mbart/modeling_flax_mbart.py | 
https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mbart/modeling_flax_mbart.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mbart/modeling_flax_mbart.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mbart/modeling_flax_mbart.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mbart/modeling_flax_mbart.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mbart/modeling_flax_mbart.py | https://arxiv.org/abs/1910.13461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mbart/modeling_flax_mbart.py | https://github.com/google/flax/blob/491ce18759622506588784b4fca0e4bf05f8c8cd/flax/linen/attention.py#L252 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mbart/modeling_flax_mbart.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/mbart.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mbart/configuration_mbart.py | https://huggingface.co/models?filter=mbart | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/tensorflow/translation/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mbart/configuration_mbart.py | https://huggingface.co/facebook/mbart-large-cc25 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mbart/configuration_mbart.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/maskformer/modeling_maskformer.py | 
MT5_ID4146_for_PyTorch/transformers/src/transformers/models/maskformer/modeling_maskformer.py | https://huggingface.co/models?filter=maskformer | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/yolos/modeling_yolos.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/maskformer/modeling_maskformer.py | https://arxiv.org/abs/1708.02002 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/maskformer/modeling_maskformer.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/maskformer/modeling_maskformer.py | https://arxiv.org/abs/2107.06278 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/maskformer/modeling_maskformer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/maskformer/modeling_maskformer.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/maskformer/modeling_maskformer.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/maskformer.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/maskformer/convert_maskformer_original_pytorch_checkpoint_to_pytorch.py | https://github.com/facebookresearch/MaskFormer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/maskformer/modeling_maskformer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/maskformer/configuration_maskformer.py | https://huggingface.co/models?filter=maskformer | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/maskformer/configuration_maskformer.py | https://huggingface.co/datasets/scene_parse_150 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/maskformer/configuration_maskformer.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/maskformer/configuration_maskformer.py | https://huggingface.co/microsoft/swin-base-patch4-window12-384-in22k | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/maskformer/configuration_maskformer.py | https://huggingface.co/facebook/detr-resnet-50 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/marian/tokenization_marian.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/marian/tokenization_marian.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/pegasus/modeling_tf_pegasus.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/marian/modeling_tf_marian.py | https://github.com/tensorflow/models/blob/a009f4fb9d2fc4949e32192a944688925ef78659/official/transformer/v2/embedding_layer.py#L24 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/marian/modeling_tf_marian.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/marian/modeling_marian.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/marian/modeling_tf_marian.py | https://huggingface.co/models?search=Helsinki-NLP | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/marian/modeling_tf_marian.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/marian.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/marian/modeling_marian.py | https://huggingface.co/models?filter=marian | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/marian/modeling_marian.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/marian/modeling_marian.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/marian/modeling_marian.py | https://huggingface.co/models?search=Helsinki-NLP | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/marian/modeling_marian.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/marian/modeling_flax_marian.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/marian/modeling_flax_marian.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/marian/modeling_flax_marian.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/marian/modeling_flax_marian.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/marian/modeling_flax_marian.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/marian/modeling_flax_marian.py | https://arxiv.org/abs/1910.13461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/marian/modeling_flax_marian.py | https://github.com/google/flax/blob/491ce18759622506588784b4fca0e4bf05f8c8cd/flax/linen/attention.py#L252 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/marian/modeling_flax_marian.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/marian/convert_marian_to_pytorch.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/marian/convert_marian_to_pytorch.py | https://en.wikipedia.org/wiki/Insular_Celtic_languages | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/marian.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/marian/convert_marian_to_pytorch.py | https://github.com/Helsinki-NLP/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/seq2seq/romanian_postprocessing.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/marian/convert_marian_to_pytorch.py | git@github.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/seq2seq/romanian_postprocessing.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/marian/convert_marian_tatoeba_to_pytorch.py | git@github.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/marian.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/marian/configuration_marian.py | https://huggingface.co/models?filter=marian | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/marian/tokenization_marian.py | 
MT5_ID4146_for_PyTorch/transformers/src/transformers/models/marian/configuration_marian.py | https://huggingface.co/Helsinki-NLP/opus-mt-en-de | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/marian/configuration_marian.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/m2m_100/tokenization_m2m_100.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/m2m_100/tokenization_m2m_100.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/m2m_100/configuration_m2m_100.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/m2m_100/modeling_m2m_100.py | https://huggingface.co/models?filter=m2m_100 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/m2m_100/modeling_m2m_100.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/m2m_100/modeling_m2m_100.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/m2m_100/configuration_m2m_100.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/m2m_100/configuration_m2m_100.py | https://huggingface.co/models?filter=m2m_100 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/m2m_100/tokenization_m2m_100.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/m2m_100/configuration_m2m_100.py | https://huggingface.co/facebook/m2m100_418M | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/m2m_100/configuration_m2m_100.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/lxmert/modeling_tf_lxmert.py | https://arxiv.org/abs/1908.07490 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/lxmert/modeling_tf_lxmert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/lxmert/modeling_lxmert.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/lxmert/modeling_lxmert.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/lxmert/modeling_lxmert.py | https://arxiv.org/abs/1908.07490 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/lxmert/modeling_lxmert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mluke/tokenization_mluke.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/luke/tokenization_luke.py | https://github.com/huggingface/transformers/pull/2778 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/layoutxlm/tokenization_layoutxlm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/luke/tokenization_luke.py | https://github.com/huggingface/transformers/pull/2674 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/luke/modeling_luke.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/luke/modeling_luke.py | https://huggingface.co/models?filter=luke | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/luke/modeling_luke.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/luke/configuration_luke.py | https://arxiv.org/abs/2010.01057 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/longformer.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/longformer/modeling_tf_longformer.py | https://huggingface.co/models?filter=longformer | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/longformer/modeling_tf_longformer.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/longformer/modeling_tf_longformer.py | https://arxiv.org/abs/2004.05150 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/longformer.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/longformer/modeling_longformer.py | https://huggingface.co/models?filter=longformer | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/longformer/modeling_longformer.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/longformer/modeling_longformer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/longformer/modeling_longformer.py | https://arxiv.org/abs/2004.05150 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/longformer/configuration_longformer.py | https://huggingface.co/roberta-base | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/led/modeling_tf_led.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/led/modeling_tf_led.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/led/configuration_led.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/led/modeling_led.py | https://huggingface.co/models?filter=led | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/led/modeling_led.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/led/modeling_led.py | https://arxiv.org/abs/1910.13461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/led/modeling_led.py | https://arxiv.org/abs/2004.05150 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/led/modeling_led.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/led/modeling_led.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/led/modeling_led.py | https://github.com/huggingface/transformers/blob/ac3cb660cad283163f7c73cad511124e845ca388/src/transformers/models/bart/modeling_bart.py#L1153 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/led/configuration_led.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/led/configuration_led.py | https://huggingface.co/models?filter=led | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/led/configuration_led.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/led/configuration_led.py | https://huggingface.co/allenai/led-base-16384 | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/led/configuration_led.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/layoutxlm/tokenization_layoutxlm_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/layoutxlm/tokenization_layoutxlm_fast.py | https://huggingface.co/docs/tokenizers/python/latest/components.html?highlight=BPE#models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/layoutxlm/tokenization_layoutxlm.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/layoutxlm/tokenization_layoutxlm.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/layoutxlm/tokenization_layoutxlm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/layoutxlm/tokenization_layoutxlm.py | https://github.com/huggingface/transformers/pull/2674 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mobilebert/tokenization_mobilebert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/layoutlmv2/tokenization_layoutlmv2_fast.py | https://github.com/huggingface/transformers/issues/328 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/layoutxlm/tokenization_layoutxlm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/layoutlmv2/tokenization_layoutlmv2.py | https://github.com/huggingface/transformers/pull/2674 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mobilebert/tokenization_mobilebert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/layoutlmv2/tokenization_layoutlmv2.py | https://github.com/huggingface/transformers/issues/328 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_chinese_ref.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/layoutlmv2/tokenization_layoutlmv2.py | https://en.wikipedia.org/wiki/CJK_Unified_Ideographs_ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/layoutlmv2/configuration_layoutlmv2.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/layoutlmv2/modeling_layoutlmv2.py | https://huggingface.co/models?filter=layoutlmv2 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/switch_transformers/modeling_switch_transformers.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/layoutlmv2/modeling_layoutlmv2.py | https://github.com/tensorflow/mesh/blob/0cb87fe07da627bf0b7e60475d59f95ed6b5be3d/mesh_tensorflow/transformer/transformer_layers.py#L593 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | 
MT5_ID4146_for_PyTorch/transformers/src/transformers/models/layoutlmv2/modeling_layoutlmv2.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/layoutlmv2/modeling_layoutlmv2.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/layoutlm.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/layoutlmv2/modeling_layoutlmv2.py | https://www.cs.cmu.edu/~aharley/rvl-cdip/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/layoutlmv3/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/layoutlmv2/modeling_layoutlmv2.py | https://guillaumejaume.github.io/FUNS | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/layoutlmv3/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/layoutlmv2/modeling_layoutlmv2.py | https://github.com/clovaai/co | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/tasks/document_question_answering.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/layoutlmv2/modeling_layoutlmv2.py | https://rrc.cvc.uab.es/?ch=17 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/layoutlmv2/configuration_layoutlmv2.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/layoutlmv2/configuration_layoutlmv2.py | https://huggingface.co/models?filter=layoutlmv2 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/tasks/document_question_answering.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/layoutlmv2/configuration_layoutlmv2.py | https://huggingface.co/microsoft/layoutlmv2-base-uncased | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/layoutlmv2/configuration_layoutlmv2.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/layoutlmv2/configuration_layoutlmv2.py | https://github.com/microsoft/unilm/blob/master/layoutlmft/layoutlmft/models/layoutlmv2/detectron2_config.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/layoutlm/modeling_tf_layoutlm.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/layoutlm/modeling_layoutlm.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/layoutlm/modeling_layoutlm.py | https://arxiv.org/abs/1912.13318 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/layoutlm/modeling_layoutlm.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/layoutlm.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/layoutlm/modeling_layoutlm.py | https://www.cs.cmu.edu/~aharley/rvl-cdip/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/layoutlmv3/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/layoutlm/modeling_layoutlm.py | https://guillaumejaume.github.io/FUNSD/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/layoutlm.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/layoutlm/modeling_layoutlm.py | https://rrc.cvc.uab.es/?ch=13 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/layoutlm/tokenization_layoutlm_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/layoutlm/configuration_layoutlm.py | https://huggingface.co/microsoft/layoutlm-base-uncased | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/imagegpt/modeling_imagegpt.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/imagegpt/modeling_imagegpt.py | https://huggingface.co/models?filter=imagegpt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/imagegpt/modeling_imagegpt.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/imagegpt/modeling_imagegpt.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/gpt2.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/imagegpt/modeling_imagegpt.py | https://openai.com/blog/better-language-models/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/imagegpt/modeling_imagegpt.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/imagegpt/modeling_imagegpt.py | https://github.com/NVIDIA/Megatron-LM/blob/main/megatron/model/gpt_model.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/imagegpt/modeling_imagegpt.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/imagegpt/modeling_imagegpt.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/ibert/modeling_ibert.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/ibert/modeling_ibert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | 
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/ibert/modeling_ibert.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/question-answering/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/ibert/configuration_ibert.py | https://arxiv.org/abs/1803.02155 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/question-answering/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/ibert/configuration_ibert.py | https://arxiv.org/abs/2009.13658 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/hubert/modeling_hubert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/hubert/modeling_tf_hubert.py | https://huggingface.co/models?filter=hubert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/hubert/modeling_tf_hubert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/hubert/modeling_tf_hubert.py | https://github.com/tensorflow/tensorflow/issues/9260 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/hubert/modeling_tf_hubert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/hubert/modeling_tf_hubert.py | https://github.com/pytorch/fairseq/blob/e0788f7007a8473a76db573985031f3c94201e79/fairseq/data/data_utils.py#L376 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/hubert/modeling_tf_hubert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/hubert/modeling_tf_hubert.py | https://www.tensorflow.org/addons/api_docs/python/tfa/layers/GroupNormalization | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/hubert/modeling_tf_hubert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/hubert/modeling_tf_hubert.py | https://www.tensorflow.org/probability/api_docs/python/tfp/layers/weight_norm/WeightNorm | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/hubert/modeling_tf_hubert.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/hubert/modeling_tf_hubert.py | https://pytorch.org/docs/stable/generated/torch.nn.Conv1d.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/hubert/modeling_tf_hubert.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/hubert/modeling_tf_hubert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/hubert/modeling_hubert.py | 
MT5_ID4146_for_PyTorch/transformers/src/transformers/models/hubert/modeling_hubert.py | https://huggingface.co/models?filter=hubert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/hubert/modeling_hubert.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/hubert/modeling_hubert.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/hubert/modeling_hubert.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/hubert/modeling_hubert.py | https://pytorch.org/docs/stable/generated/torch.nn.Conv1d.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/hubert/modeling_hubert.py | https://arxiv.org/abs/2106.07447 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/hubert/modeling_hubert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/audio-classification/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/hubert/modeling_hubert.py | https://huggingface.co/facebook/hubert-base-ls960 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/hubert/modeling_hubert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/hubert/configuration_hubert.py | https://huggingface.co/models?filter=hubert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/audio-classification/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/hubert/configuration_hubert.py | https://huggingface.co/facebook/hubert-base-ls960 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/hubert/configuration_hubert.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/gptj/modeling_tf_gptj.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gptj/modeling_gptj.py | https://huggingface.co/models?filter=gptj | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gptj/modeling_gptj.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gptj/modeling_gptj.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/codegen/modeling_codegen.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gptj/modeling_gptj.py | https://github.com/EleutherAI/gpt-neo/blob/89ce74164da2fb16179106f54e2269b5da8db333/models/gpt2/gpt2.py#L179 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gptj/modeling_flax_gptj.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gptj/modeling_flax_gptj.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gptj/modeling_flax_gptj.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gptj/modeling_flax_gptj.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gptj/modeling_flax_gptj.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gptj/modeling_flax_gptj.py | https://github.com/google/flax/blob/491ce18759622506588784b4fca0e4bf05f8c8cd/flax/linen/attention.py#L252 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/gptj/configuration_gptj.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gptj/configuration_gptj.py | https://huggingface.co/models?filter=gpt_j | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/es/tasks/language_modeling.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gptj/configuration_gptj.py | https://huggingface.co/EleutherAI/gpt-j-6B | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roberta/tokenization_roberta_fast.py | 
MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2_fast.py | https://docs.python.org/3/library/stdtypes.html#bytes.decode | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roberta/tokenization_roberta_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt2/tokenization_gpt2.py | https://docs.python.org/3/library/stdtypes.html#bytes.decode | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/gpt2.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_tf_gpt2.py | https://huggingface.co/models?filter=gpt2 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_tf_gpt2.py | https://github.com/tensorflow/mesh/blob/8d2465e9bc93129b913b5ccc6a59aa97abd96ec6/mesh_tensorflow/transformer/transformer_layers.py#L270 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_tf_gpt2.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/gpt2.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_gpt2.py | https://huggingface.co/models?filter=gpt2 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_gpt2.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_gpt2.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/gpt2.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_gpt2.py | https://openai.com/blog/better-language-models/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/imagegpt/modeling_imagegpt.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_gpt2.py | https://github.com/NVIDIA/Megatron-LM/blob/main/megatron/model/gpt_model.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_gpt2.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_flax_gpt2.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | 
MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_flax_gpt2.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_flax_gpt2.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_flax_gpt2.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_flax_gpt2.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_flax_gpt2.py | https://github.com/google/flax/blob/491ce18759622506588784b4fca0e4bf05f8c8cd/flax/linen/attention.py#L252 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt2/configuration_gpt2.py | https://huggingface.co/gpt2 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/gpt_neox/configuration_gpt_neox.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt_neo/modeling_gpt_neo.py | https://huggingface.co/models?filter=gpt_neo | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt_neo/modeling_gpt_neo.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt_neo/modeling_gpt_neo.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt_neo/modeling_gpt_neo.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/codegen/modeling_codegen.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt_neo/modeling_gpt_neo.py | https://github.com/EleutherAI/gpt-neo/blob/89ce74164da2fb16179106f54e2269b5da8db333/models/gpt2/gpt2.py#L179 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | 
MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt_neo/modeling_flax_gpt_neo.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt_neo/modeling_flax_gpt_neo.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt_neo/modeling_flax_gpt_neo.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt_neo/modeling_flax_gpt_neo.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt_neo/modeling_flax_gpt_neo.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt_neo/modeling_flax_gpt_neo.py | https://github.com/google/flax/blob/491ce18759622506588784b4fca0e4bf05f8c8cd/flax/linen/attention.py#L252 | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/gpt_neox/configuration_gpt_neox.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt_neo/configuration_gpt_neo.py | https://huggingface.co/models?filter=gpt_neo | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/gpt_neo/configuration_gpt_neo.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt_neo/configuration_gpt_neo.py | https://huggingface.co/EleutherAI/gpt-neo-1.3B | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/funnel/modeling_tf_funnel.py | https://arxiv.org/abs/2006.03236 | 参考论文地址 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/funnel/modeling_tf_funnel.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/funnel/modeling_funnel.py | https://www.tensorflow.org/install/ | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/funnel/modeling_funnel.py | https://arxiv.org/abs/2006.03236 | 参考论文地址 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/funnel/modeling_funnel.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/funnel/tokenization_funnel_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/funnel/configuration_funnel.py | https://huggingface.co/funnel-transformer/small | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/fsmt/tokenization_fsmt.py | https://github.com/moses-smt/mosesdecoder/blob/master/scripts/tokenizer/replace-unicode-punctuation.perl | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/fsmt/tokenization_fsmt.py | https://github.com/moses-smt/mosesdecoder/blob/master/scripts/tokenizer/remove-non-printing-char.perl | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/fsmt/tokenization_fsmt.py | https://github.com/alvations/sacremoses | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/fsmt.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/fsmt/modeling_fsmt.py | https://github.com/pytorch/fairseq/tree/master/examples/wmt19 | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/fsmt.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/fsmt/modeling_fsmt.py | https://arxiv.org/abs/1907.06616 | 参考论文地址 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/fsmt/modeling_fsmt.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/fsmt/modeling_fsmt.py | https://huggingface.co/models?filter=fsmt | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/eval-facebook-wmt19.sh | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/fsmt/modeling_fsmt.py | http://matrix.statmt.org/matrix/output/1914?score_id=37605 | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-facebook-wmt19.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/fsmt/modeling_fsmt.py | http://matrix.statmt.org/matrix/output/1907?run_id=6937 | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-facebook-wmt19.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/fsmt/modeling_fsmt.py | http://matrix.statmt.org/matrix/output/1902?run_id=6750 | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-facebook-wmt19.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/fsmt/modeling_fsmt.py | http://matrix.statmt.org/matrix/output/1909?run_id=6862 | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/fsmt/modeling_fsmt.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/fsmt/modeling_fsmt.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/pegasus/tokenization_pegasus_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/fnet/tokenization_fnet_fast.py | https://huggingface.co/docs/tokenizers/python/latest/components.html?highlight=unigram#models | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/fnet/tokenization_fnet_fast.py | https://github.com/google/sentencepiece | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/fnet/tokenization_fnet.py | https://github.com/google/sentencepiece | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/fnet/tokenization_fnet.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/fnet/modeling_fnet.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/fnet/modeling_fnet.py | https://huggingface.co/models?filter=fnet | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/fnet/modeling_fnet.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/fnet/modeling_fnet.py | https://github.com/google-research/google-research/blob/master/f_net/fourier.py | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/fnet/modeling_fnet.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/fnet/modeling_fnet.py | https://pytorch.org/docs/master/generated/torch.vmap.html | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/fnet/modeling_fnet.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/fnet/modeling_fnet.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/fnet/modeling_fnet.py | https://arxiv.org/abs/2105.03824 | 参考论文地址 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/fnet/modeling_fnet.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/fnet/configuration_fnet.py | https://huggingface.co/models?filter=fnet | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/fnet/tokenization_fnet.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/fnet/configuration_fnet.py | https://huggingface.co/google/fnet-base | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/tokenization_xlm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/flaubert/tokenization_flaubert.py | https://github.com/alvations/sacremoses | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/flaubert.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/flaubert/modeling_tf_flaubert.py | https://huggingface.co/models?filter=flaubert | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/flaubert/modeling_tf_flaubert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/flaubert.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/flaubert/modeling_flaubert.py | https://huggingface.co/models?filter=flaubert | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/flaubert/modeling_flaubert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/xlm/configuration_xlm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/flaubert/configuration_flaubert.py | http://huggingface.co/transformers/multilingual.html#xlm-language-embeddings | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/encoder_decoder/modeling_tf_encoder_decoder.py | https://arxiv.org/abs/1907.12461 | 参考论文地址 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/encoder_decoder/modeling_tf_encoder_decoder.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/encoder_decoder/modeling_flax_encoder_decoder.py | https://arxiv.org/abs/1907.12461 | 参考论文地址 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/encoder_decoder/modeling_flax_encoder_decoder.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/encoder_decoder/modeling_encoder_decoder.py | https://arxiv.org/abs/1907.12461 | 参考论文地址 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/encoder_decoder/modeling_encoder_decoder.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/electra.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/electra/modeling_tf_electra.py | https://huggingface.co/models?filter=electra | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/electra/modeling_tf_electra.py | https://github.com/tensorflow/mesh/blob/8d2465e9bc93129b913b5ccc6a59aa97abd96ec6/mesh_tensorflow/transformer/transformer_layers.py#L270 | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/electra/modeling_tf_electra.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/electra/modeling_flax_electra.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/electra/modeling_flax_electra.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/electra/modeling_flax_electra.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/electra/modeling_flax_electra.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/electra/modeling_flax_electra.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/electra.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/electra/modeling_electra.py | https://huggingface.co/models?filter=electra | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/electra/modeling_electra.py | https://www.tensorflow.org/install/ | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/electra/modeling_electra.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/electra/modeling_electra.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/electra/configuration_electra.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/electra/configuration_electra.py | https://huggingface.co/google/electra-small-discriminator | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/question-answering/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/electra/configuration_electra.py | https://arxiv.org/abs/1803.02155 | 参考论文地址 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/question-answering/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/electra/configuration_electra.py | https://arxiv.org/abs/2009.13658 | 参考论文地址 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/dpr/modeling_tf_dpr.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/dpr/modeling_dpr.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/dpr/modeling_dpr.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/dpr/convert_dpr_original_checkpoint_to_pytorch.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/dpr/convert_dpr_original_checkpoint_to_pytorch.py | https://github.com/huggingface/transformers/commit/614fef1691edb806de976756d4948ecbcd0c0ca3 | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/rag/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/dpr/convert_dpr_original_checkpoint_to_pytorch.py | https://github.com/facebookresearch/DPR | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/question-answering/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/dpr/configuration_dpr.py | https://arxiv.org/abs/1803.02155 | 参考论文地址 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/question-answering/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/dpr/configuration_dpr.py | https://arxiv.org/abs/2009.13658 | 参考论文地址 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/distilbert.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/distilbert/modeling_tf_distilbert.py | https://huggingface.co/models?filter=distilbert | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/distilbert/modeling_tf_distilbert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/distilbert/modeling_flax_distilbert.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/distilbert/modeling_flax_distilbert.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/distilbert/modeling_flax_distilbert.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/distilbert/modeling_flax_distilbert.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/distilbert/modeling_flax_distilbert.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/distillation/distiller.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/distilbert/modeling_distilbert.py | https://github.com/facebookresearch/XLM | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/distilbert/modeling_distilbert.py | https://github.com/google-research/bert | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/distilbert.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/distilbert/modeling_distilbert.py | https://huggingface.co/models?filter=distilbert | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/distilbert/modeling_distilbert.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/distilbert/modeling_distilbert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/distilbert/configuration_distilbert.py | https://huggingface.co/distilbert-base-uncased | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/detr/modeling_detr.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | https://huggingface.co/models?filter=detr | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/detr/modeling_detr.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | https://github.com/facebookresearch/detr/blob/master/backbone.py | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/yolos/modeling_yolos.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | https://github.com/facebookresearch/detr/blob/master/models/detr.py | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/detr/modeling_detr.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | https://github.com/facebookresearch/detr/blob/master/models/segmentation.py | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/yolos/modeling_yolos.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | https://arxiv.org/abs/1708.02002 | 参考论文地址 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/yolos/modeling_yolos.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | https://github.com/facebookresearch/detr/issues/108#issuecomment-650269223 | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/detr/modeling_detr.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | https://github.com/facebookresearch/detr/blob/master/models/matcher.py | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/detr/modeling_detr.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | https://github.com/facebookresearch/detr/blob/master/util/box_ops.py | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/detr.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | https://giou.stanford.edu/ | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/detr/modeling_detr.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | https://github.com/facebookresearch/detr/blob/master/util/misc.py#L306 | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/detr/modeling_detr.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/detr/feature_extraction_detr.py | https://github.com/facebookresearch/detr/blob/master/util/box_ops.py | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/yolos/image_processing_yolos.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/detr/feature_extraction_detr.py | https://github.com/pytorch/pytorch/issues/50276 | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/image_transforms.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/detr/feature_extraction_detr.py | https://github.com/cocodataset/panopticapi/blob/master/panopticapi/utils.py | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/detr/image_processing_detr.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/detr/feature_extraction_detr.py | https://github.com/facebookresearch/detr/blob/master/datasets/coco.py#L33 | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/detr/image_processing_detr.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/detr/feature_extraction_detr.py | https://github.com/facebookresearch/detr/blob/master/datasets/coco.py#L50 | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/detr/image_processing_detr.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/detr/feature_extraction_detr.py | https://github.com/facebookresearch/detr/blob/master/models/detr.py#L258 | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/detr/image_processing_detr.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/detr/feature_extraction_detr.py | https://github.com/facebookresearch/detr/blob/master/models/segmentation.py#L218 | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/detr/image_processing_detr.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/detr/feature_extraction_detr.py | https://github.com/facebookresearch/detr/blob/master/models/segmentation.py#L241 | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/detr/modeling_detr.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/detr/configuration_detr.py | https://huggingface.co/models?filter=detr | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/detr/configuration_detr.py | https://huggingface.co/facebook/detr-resnet-50 | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/detr/configuration_detr.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/detr/configuration_detr.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/detr/configuration_detr.py | https://rwightman.github.io/pytorch-image-models/#load-a-pretrained-model | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/deit/modeling_deit.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deit/modeling_deit.py | https://huggingface.co/models?filter=deit | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vit/modeling_tf_vit.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deit/modeling_deit.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/vision_transformer.py | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deit/modeling_deit.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deit/modeling_deit.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/image-pretraining/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deit/modeling_deit.py | https://arxiv.org/abs/2111.09886 | 参考论文地址 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deit/modeling_deit.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/deit/convert_deit_timm_to_pytorch.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deit/convert_deit_timm_to_pytorch.py | https://github.com/facebookresearch/deit/blob/ab5715372db8c6cad5740714b2216d55aeae052e/datasets.py#L103 | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/deit/modeling_deit.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deit/configuration_deit.py | https://huggingface.co/models?filter=deit | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/deit/configuration_deit.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deit/configuration_deit.py | https://huggingface.co/facebook/deit-base-distilled-patch16-224 | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta_v2/tokenization_deberta_v2.py | https://github.com/google/sentencepiece | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta_v2/tokenization_deberta_v2.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/deberta_v2/modeling_tf_deberta_v2.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta_v2/modeling_tf_deberta_v2.py | https://huggingface.co/models?filter=deberta-v2 | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta_v2/modeling_tf_deberta_v2.py | https://arxiv.org/abs/2006.03654 | 参考论文地址 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta_v2/modeling_tf_deberta_v2.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta_v2/modeling_deberta_v2.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta_v2/modeling_deberta_v2.py | https://arxiv.org/abs/2006.03654 | 参考论文地址 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta_v2/modeling_deberta_v2.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/deberta/tokenization_deberta_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta_v2/configuration_deberta_v2.py | https://huggingface.co/microsoft/deberta-base | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/deberta/modeling_tf_deberta.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta/modeling_tf_deberta.py | https://huggingface.co/models?filter=DeBERTa | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta/modeling_tf_deberta.py | https://arxiv.org/abs/2006.03654 | 参考论文地址 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta/modeling_tf_deberta.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta/modeling_deberta.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta/modeling_deberta.py | https://arxiv.org/abs/2006.03654 | 参考论文地址 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta/modeling_deberta.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/deberta/tokenization_deberta_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta/configuration_deberta.py | https://huggingface.co/microsoft/deberta-base | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/data2vec/modeling_data2vec_text.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_text.py | https://huggingface.co/models?filter=data2vec-text | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_text.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/data2vec.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_text.py | https://arxiv.org/pdf/2202.03555 | 参考论文地址 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_text.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_text.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/data2vec/modeling_data2vec_audio.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_audio.py | https://huggingface.co/models?filter=data2vec-audio | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_audio.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_audio.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_audio.py | https://pytorch.org/docs/stable/generated/torch.nn.Conv1d.html | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/data2vec.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_audio.py | https://arxiv.org/pdf/2202.03555 | 参考论文地址 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_audio.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/data2vec/modeling_data2vec_audio.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_audio.py | https://huggingface.co/facebook/data2vec-audio-base-960h | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/data2vec/convert_data2vec_text_original_pytorch_checkpoint_to_pytorch.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/data2vec/convert_data2vec_text_original_pytorch_checkpoint_to_pytorch.py | https://dl.fbaipublicfiles.com/fairseq/models/roberta.large.tar.gz | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/data2vec/convert_data2vec_text_original_pytorch_checkpoint_to_pytorch.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/data2vec/convert_data2vec_text_original_pytorch_checkpoint_to_pytorch.py | https://github.com/pytorch/fairseq/blob/main/examples/data2vec/models/data2vec_text.py | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/data2vec/convert_data2vec_audio_original_pytorch_checkpoint_to_pytorch.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/data2vec/convert_data2vec_audio_original_pytorch_checkpoint_to_pytorch.py | https://github.com/pytorch/fairseq/blob/main/examples/data2vec/models/data2vec_audio.py | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/data2vec/configuration_data2vec_text.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/data2vec/configuration_data2vec_text.py | https://huggingface.co/facebook/data2vec-text-base | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/question-answering/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/data2vec/configuration_data2vec_text.py | https://arxiv.org/abs/1803.02155 | 参考论文地址 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/question-answering/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/data2vec/configuration_data2vec_text.py | https://arxiv.org/abs/2009.13658 | 参考论文地址 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/data2vec/modeling_data2vec_audio.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/data2vec/configuration_data2vec_audio.py | https://huggingface.co/models?filter=data2vec-audio | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/data2vec/modeling_data2vec_audio.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/data2vec/configuration_data2vec_audio.py | https://huggingface.co/facebook/data2vec-audio-base-960h | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/data2vec/configuration_data2vec_audio.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/ctrl.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/ctrl/modeling_tf_ctrl.py | https://huggingface.co/models?filter=ctrl | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/ctrl/modeling_tf_ctrl.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/ctrl.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/ctrl/modeling_ctrl.py | https://huggingface.co/models?filter=ctrl | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/ctrl/modeling_ctrl.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/ctrl/modeling_ctrl.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/cpm/tokenization_cpm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/cpm/tokenization_cpm_fast.py | https://pypi.org/project/jieba/ | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/cpm/tokenization_cpm_fast.py | https://github.com/google/sentencepiece | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/cpm/tokenization_cpm.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/cpm/tokenization_cpm.py | https://pypi.org/project/jieba/ | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/cpm/tokenization_cpm.py | https://github.com/google/sentencepiece | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convnext/modeling_tf_convnext.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convnext/modeling_tf_convnext.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/convnext/configuration_convnext.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convnext/modeling_convnext.py | https://huggingface.co/models?filter=convnext | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/donut/modeling_donut_swin.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convnext/modeling_convnext.py | https://github.com/tensorflow/tpu/issues/494#issuecomment-532968956 | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convnext/modeling_convnext.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convnext/modeling_convnext.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/convnextv2.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://github.com/facebookresearch/ConvNeXt | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/convnext/convert_convnext_to_pytorch.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://github.com/google-research/big_transfer/issues/18 | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/convnext/configuration_convnext.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convnext/configuration_convnext.py | https://huggingface.co/models?filter=convnext | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/convnext/configuration_convnext.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convnext/configuration_convnext.py | https://huggingface.co/facebook/convnext-tiny-224 | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/convbert.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convbert/modeling_tf_convbert.py | https://huggingface.co/models?filter=convbert | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convbert/modeling_tf_convbert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/convbert.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convbert/modeling_convbert.py | https://huggingface.co/models?filter=convbert | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convbert/modeling_convbert.py | https://www.tensorflow.org/install/ | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convbert/modeling_convbert.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convbert/modeling_convbert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/convbert.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convbert/configuration_convbert.py | https://huggingface.co/models?filter=convbert | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/convbert/tokenization_convbert_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convbert/configuration_convbert.py | https://huggingface.co/YituTech/conv-bert-base | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/clip/tokenization_clip_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/clip/tokenization_clip_fast.py | https://github.com/huggingface/tokenizers/issues/872 | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roberta/tokenization_roberta_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/clip/tokenization_clip.py | https://docs.python.org/3/library/stdtypes.html#bytes.decode | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/tensorflow/contrastive-image-text/run_clip.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/clip/modeling_tf_clip.py | https://huggingface.co/models?filter=clip | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/clip/modeling_tf_clip.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/clip/modeling_tf_clip.py | https://sachinruk.github.io/blog/pytorch/pytorch%20lightning/loss%20function/gpu/2021/03/07/CLIP.html | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/clip/modeling_clip.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/clip/modeling_tf_clip.py | https://github.com/openai/CLIP/blob/cfcffb90e69f37bf2ff1e988237a0fbe41f33c04/clip/model.py#L324 | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/clip/modeling_tf_clip.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/clip/modeling_tf_clip.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/clip/modeling_flax_clip.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/clip/modeling_flax_clip.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/clip/modeling_flax_clip.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/clip/modeling_flax_clip.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/clip/modeling_flax_clip.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/clip/modeling_flax_clip.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/tensorflow/contrastive-image-text/run_clip.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/clip/modeling_clip.py | https://huggingface.co/models?filter=clip | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/clip/modeling_tf_clip.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/clip/modeling_clip.py | https://sachinruk.github.io/blog/pytorch/pytorch%20lightning/loss%20function/gpu/2021/03/07/CLIP.html | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/clip/modeling_clip.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/clip/modeling_clip.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/clip/modeling_clip.py | https://github.com/openai/CLIP/blob/cfcffb90e69f37bf2ff1e988237a0fbe41f33c04/clip/model.py#L324 | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/clip/modeling_clip.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/tensorflow/contrastive-image-text/run_clip.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/clip/configuration_clip.py | https://huggingface.co/models?filter=clip | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/clip/configuration_clip.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/clip/configuration_clip.py | https://huggingface.co/openai/clip-vit-base-patch32 | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/canine/tokenization_canine.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/canine/tokenization_canine.py | https://github.com/google-research/language/blob/master/language/canine/special_codepoints.py | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/canine/modeling_canine.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/canine/modeling_canine.py | https://huggingface.co/models?filter=canine | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/canine/modeling_canine.py | https://www.tensorflow.org/install/ | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/canine/modeling_canine.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/canine/modeling_canine.py | https://github.com/google-research/big_transfer/blob/49afe42338b62af9fbe18f0258197a33ee578a6b/bit_tf2/models.py#L36-L38 | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/canine/modeling_canine.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/canine/modeling_canine.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/canine/modeling_canine.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/canine/configuration_canine.py | https://huggingface.co/models?filter=canine | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/canine.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/canine/configuration_canine.py | https://huggingface.co/google/canine-s | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/layoutxlm/tokenization_layoutxlm_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/camembert/tokenization_camembert_fast.py | https://huggingface.co/docs/tokenizers/python/latest/components.html?highlight=BPE#models | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/camembert/tokenization_camembert_fast.py | https://github.com/google/sentencepiece | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/camembert/tokenization_camembert.py | https://github.com/google/sentencepiece | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/camembert/tokenization_camembert.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/camembert/modeling_camembert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/camembert/modeling_tf_camembert.py | https://huggingface.co/models?filter=camembert | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/camembert/modeling_tf_camembert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/camembert/modeling_camembert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/camembert/modeling_camembert.py | https://huggingface.co/models?filter=camembert | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/camembert/modeling_camembert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/byt5/tokenization_byt5.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/byt5/tokenization_byt5.py | https://github.com/google-research/text-to-text-transfer-transformer/blob/9fd7b14a769417be33bc6c850f9598764913c833/t5/data/preprocessors.py#L2117 | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/deprecated/bort/convert_bort_original_gluonnlp_checkpoint_to_pytorch.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bort/convert_bort_original_gluonnlp_checkpoint_to_pytorch.py | https://github.com/alexa/bort/blob/master/bort/bort.py | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_tf_blenderbot_small.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_tf_blenderbot_small.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_flax_blenderbot_small.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_flax_blenderbot_small.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_flax_blenderbot_small.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_flax_blenderbot_small.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_flax_blenderbot_small.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_flax_blenderbot_small.py | https://arxiv.org/abs/1910.13461 | 参考论文地址 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_flax_blenderbot_small.py | https://github.com/google/flax/blob/491ce18759622506588784b4fca0e4bf05f8c8cd/flax/linen/attention.py#L252 | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_flax_blenderbot_small.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/blenderbot_small/configuration_blenderbot_small.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_blenderbot_small.py | https://huggingface.co/models?filter=blenderbot_small | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_blenderbot_small.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_blenderbot_small.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/blenderbot_small/configuration_blenderbot_small.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/blenderbot_small/configuration_blenderbot_small.py | https://huggingface.co/models?filter=blenderbot_small | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/blenderbot_small/configuration_blenderbot_small.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/blenderbot_small/configuration_blenderbot_small.py | https://huggingface.co/facebook/blenderbot_small-90M | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/blenderbot_small/configuration_blenderbot_small.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_tf_blenderbot.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_tf_blenderbot.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_flax_blenderbot.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_flax_blenderbot.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_flax_blenderbot.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_flax_blenderbot.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_flax_blenderbot.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_flax_blenderbot.py | https://arxiv.org/abs/1910.13461 | 参考论文地址 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_flax_blenderbot.py | https://github.com/google/flax/blob/491ce18759622506588784b4fca0e4bf05f8c8cd/flax/linen/attention.py#L252 | 源码实现 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_flax_blenderbot.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/blenderbot_small/configuration_blenderbot_small.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_blenderbot.py | https://huggingface.co/models?filter=blenderbot | 模型相关说明 |
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_blenderbot.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 |
-| 开源代码引入 |
https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_blenderbot.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/blenderbot_small/configuration_blenderbot_small.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/blenderbot/configuration_blenderbot.py | https://huggingface.co/models?filter=blenderbot | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/blenderbot/configuration_blenderbot.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/blenderbot/configuration_blenderbot.py | https://huggingface.co/facebook/blenderbot-3B | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/blenderbot/configuration_blenderbot.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/bigbird_pegasus/modeling_bigbird_pegasus.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bigbird_pegasus/modeling_bigbird_pegasus.py | https://huggingface.co/models?filter=bigbird_pegasus | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bigbird_pegasus/modeling_bigbird_pegasus.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bigbird_pegasus/modeling_bigbird_pegasus.py | https://arxiv.org/abs/1910.13461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bigbird_pegasus/modeling_bigbird_pegasus.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/bigbird_pegasus/modeling_bigbird_pegasus.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bigbird_pegasus/configuration_bigbird_pegasus.py | https://huggingface.co/models?filter=bigbird_pegasus | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/bigbird_pegasus/configuration_bigbird_pegasus.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bigbird_pegasus/configuration_bigbird_pegasus.py | https://huggingface.co/google/bigbird-pegasus-large-arxiv | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bigbird_pegasus/configuration_bigbird_pegasus.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/src/transformers/models/pegasus/tokenization_pegasus_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/big_bird/tokenization_big_bird_fast.py | https://huggingface.co/docs/tokenizers/python/latest/components.html?highlight=unigram#models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/big_bird/tokenization_big_bird_fast.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/big_bird/tokenization_big_bird.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/big_bird/tokenization_big_bird.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/big_bird/modeling_flax_big_bird.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/big_bird/modeling_flax_big_bird.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/big_bird/modeling_flax_big_bird.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/big_bird/modeling_flax_big_bird.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/big_bird/modeling_flax_big_bird.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/big_bird/modeling_big_bird.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/big_bird/modeling_big_bird.py | https://huggingface.co/models?filter=big_bird | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/big_bird/modeling_big_bird.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/big_bird/modeling_big_bird.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/big_bird/modeling_big_bird.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/big_bird/modeling_big_bird.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/big_bird/modeling_big_bird.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/big_bird/configuration_big_bird.py | https://huggingface.co/models?filter=big_bird | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/big_bird/tokenization_big_bird.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/big_bird/configuration_big_bird.py | https://huggingface.co/google/bigbird-roberta-base | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/bertweet/tokenization_bertweet.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bertweet/tokenization_bertweet.py | cgpotts@stanford.edu | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/bertweet/tokenization_bertweet.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bertweet/tokenization_bertweet.py | ewan@inf.ed.ac.uk | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/bertweet/tokenization_bertweet.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bertweet/tokenization_bertweet.py | http://nltk.org/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/bertweet/tokenization_bertweet.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bertweet/tokenization_bertweet.py | https://github.com/nltk/nltk/issues/2409 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/bertweet/tokenization_bertweet.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bertweet/tokenization_bertweet.py | http://en.wikipedia.org/wiki/List_of_emoticons | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/bertweet/tokenization_bertweet.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bertweet/tokenization_bertweet.py | https://gist.github.com/winzig/8894715 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/bertweet/tokenization_bertweet.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bertweet/tokenization_bertweet.py | foo.na@example.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/bertweet/tokenization_bertweet.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bertweet/tokenization_bertweet.py | 
https://github.com/scrapy/w3lib/blob/master/w3lib/html.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/bertweet/tokenization_bertweet.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bertweet/tokenization_bertweet.py | https://en.wikipedia.org/wiki/ISO/IEC_8859-1#Similar_character_sets | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/bert_japanese/tokenization_bert_japanese.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert_japanese/tokenization_bert_japanese.py | https://pypi.org/project/fugashi/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/bert_japanese/tokenization_bert_japanese.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert_japanese/tokenization_bert_japanese.py | https://github.com/polm/ipadic-py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/bert_japanese/tokenization_bert_japanese.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert_japanese/tokenization_bert_japanese.py | https://github.com/polm/unidic-lite | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/bert_japanese/tokenization_bert_japanese.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert_japanese/tokenization_bert_japanese.py | https://github.com/polm/unidic-py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert_generation/tokenization_bert_generation.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert_generation/tokenization_bert_generation.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert_generation/modeling_bert_generation.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert_generation/modeling_bert_generation.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert_generation/modeling_bert_generation.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert_generation/modeling_bert_generation.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert_generation/modeling_bert_generation.py | https://arxiv.org/abs/1907.12461 | 参考论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/examples/legacy/question-answering/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert_generation/configuration_bert_generation.py | https://arxiv.org/abs/1803.02155 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/question-answering/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert_generation/configuration_bert_generation.py | https://arxiv.org/abs/2009.13658 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mobilebert/tokenization_mobilebert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert_fast.py | https://github.com/huggingface/transformers/issues/328 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/mobilebert/tokenization_mobilebert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | https://github.com/huggingface/transformers/issues/328 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_chinese_ref.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/tokenization_bert.py | https://en.wikipedia.org/wiki/CJK_Unified_Ideographs_ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/bert.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/modeling_tf_bert.py | https://huggingface.co/models?filter=bert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/modeling_tf_bert.py | https://github.com/tensorflow/mesh/blob/8d2465e9bc93129b913b5ccc6a59aa97abd96ec6/mesh_tensorflow/transformer/transformer_layers.py#L270 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/modeling_tf_bert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/modeling_flax_bert.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/modeling_flax_bert.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/modeling_flax_bert.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/modeling_flax_bert.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/modeling_flax_bert.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/bert.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/modeling_bert.py | https://huggingface.co/models?filter=bert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/modeling_bert.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/modeling_bert.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/modeling_bert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/modeling_bert.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/bert.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/models?filter=bert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/jax-projects/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://huggingface.co/bert-base-uncased | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/question-answering/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://arxiv.org/abs/1803.02155 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/question-answering/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/configuration_bert.py | https://arxiv.org/abs/2009.13658 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/beit/modeling_flax_beit.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/beit/modeling_flax_beit.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/beit/modeling_flax_beit.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/beit/modeling_flax_beit.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/beit/modeling_flax_beit.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/beit/modeling_flax_beit.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/beit/modeling_beit.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/beit/modeling_beit.py | https://huggingface.co/models?filter=beit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/models/vision_text_dual_encoder/test_modeling_vision_text_dual_encoder.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/beit/modeling_beit.py | https://github.com/rwightman/pytorch-image-models/blob/b9bd960a032c75ca6b808ddeed76bee5f3ed4972/timm/models/layers/helpers.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/donut/modeling_donut_swin.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/beit/modeling_beit.py | https://github.com/tensorflow/tpu/issues/494#issuecomment-532968956 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/vit/modeling_tf_vit.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/beit/modeling_beit.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/vision_transformer.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/beit/modeling_beit.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/beit/modeling_beit.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 
开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/semantic-segmentation/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/beit/modeling_beit.py | http://images.cocodataset.org/val2017/000000039769.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/upernet.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/beit/modeling_beit.py | https://github.com/open-mmlab/mmsegmentation | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/beit/modeling_beit.py | https://arxiv.org/abs/1807.10221 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/upernet/modeling_upernet.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/beit/modeling_beit.py | https://arxiv.org/abs/1411.4038 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/convnext/convert_convnext_to_pytorch.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/beit/convert_beit_unilm_to_pytorch.py | https://github.com/google-research/big_transfer/issues/18 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/beit/modeling_beit.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/beit/configuration_beit.py | https://huggingface.co/models?filter=beit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bartpho/tokenization_bartpho.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bartpho/tokenization_bartpho.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/barthez/tokenization_barthez_fast.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/barthez/tokenization_barthez.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/barthez/tokenization_barthez.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_bart_dlm_flax.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | https://huggingface.co/models?filter=bart | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roberta/tokenization_roberta_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart_fast.py | https://docs.python.org/3/library/stdtypes.html#bytes.decode | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_bart_dlm_flax.py 
| MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart.py | https://huggingface.co/models?filter=bart | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/roberta/tokenization_roberta_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bart/tokenization_bart.py | https://docs.python.org/3/library/stdtypes.html#bytes.decode | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bart/modeling_tf_bart.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bart/modeling_tf_bart.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bart/modeling_flax_bart.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bart/modeling_flax_bart.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bart/modeling_flax_bart.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bart/modeling_flax_bart.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bart/modeling_flax_bart.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bart/modeling_flax_bart.py | https://arxiv.org/abs/1910.13461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bart/modeling_flax_bart.py | https://github.com/google/flax/blob/491ce18759622506588784b4fca0e4bf05f8c8cd/flax/linen/attention.py#L252 | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bart/modeling_flax_bart.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_bart_dlm_flax.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bart/modeling_bart.py | https://huggingface.co/models?filter=bart | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bart/modeling_bart.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bart/modeling_bart.py | https://arxiv.org/abs/1910.13461 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bart/modeling_bart.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_bart_dlm_flax.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bart/configuration_bart.py | https://huggingface.co/models?filter=bart | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/longform-qa/eli5_app.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bart/configuration_bart.py | https://huggingface.co/facebook/bart-large | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bart/configuration_bart.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/pegasus/tokenization_pegasus_fast.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py | https://huggingface.co/docs/tokenizers/python/latest/components.html?highlight=unigram#models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert_fast.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/code_llama/tokenization_code_llama.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/albert/tokenization_albert.py | https://github.com/google/sentencepiece/tree/master/python | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/albert.md | 
MT5_ID4146_for_PyTorch/transformers/src/transformers/models/albert/modeling_tf_albert.py | https://huggingface.co/models?filter=albert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/albert/modeling_tf_albert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/albert/modeling_tf_albert.py | https://github.com/google-research/albert/blob/master/modeling.py#L971-L993 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/albert/modeling_tf_albert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/albert/modeling_flax_albert.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/albert/modeling_flax_albert.py | https://jax.readthedocs.io/en/latest/jax.html#just-in-time-compilation-jit | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/albert/modeling_flax_albert.py | https://jax.readthedocs.io/en/latest/jax.html#automatic-differentiation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/albert/modeling_flax_albert.py | https://jax.readthedocs.io/en/latest/jax.html#vectorization-vmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/albert/modeling_flax_albert.py | https://jax.readthedocs.io/en/latest/jax.html#parallelization-pmap | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/albert.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/albert/modeling_albert.py | https://huggingface.co/models?filter=albert | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/albert/modeling_albert.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/albert/modeling_albert.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/albert/modeling_albert.py | 
https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/albert/configuration_albert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/albert/configuration_albert.py | https://huggingface.co/albert-xxlarge-v2 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/question-answering/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/albert/configuration_albert.py | https://arxiv.org/abs/1803.02155 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/question-answering/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/models/albert/configuration_albert.py | https://arxiv.org/abs/2009.13658 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/modeling_utils.py | https://github.com/tensorflow/mesh/blob/8d2465e9bc93129b913b5ccc6a59aa97abd96ec6/mesh_tensorflow | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/modeling_utils.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/modeling_utils.py | https://arxiv.org/pdf/2001.08361.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/modeling_utils.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/modeling_utils.py | https://github.com/huggingface/transformers/pull/11471 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/modeling_utils.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/modeling_utils.py | https://huggingface.co/transformers/installation.html#offline-mode | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/modeling_utils.py | https://huggingface.co/models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/modeling_utils.py | https://huggingface.co/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/utils/hub.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/modeling_utils.py | https://huggingface.co/docs/transformers/installation#offline-mode | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/tensorflow/image-classification/run_image_classification.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/modeling_utils.py | https://pytorch.or | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/modeling_utils.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/modeling_utils.py | https://github.com/zihangdai/xlnet/blob/master/modeling.py#L253-L276 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/modeling_tf_utils.py | https://huggingface.co/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/modeling_tf_utils.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/modeling_tf_utils.py | 
https://www.tensorflow.org/tfx/serving/serving_basic | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/modeling_tf_utils.py | https://huggingface.co/models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/utils/hub.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/modeling_tf_utils.py | https://huggingface.co/docs/transformers/installation#offline-mode | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/modeling_tf_utils.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/modeling_tf_utils.py | https://github.com/tensorflow/tensorflow/blob/00fad90125b18b80fe054de1055770cfb8fe4ba3/tensorflow/python/keras/engine/network.py#L1339-L1357 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/pegasus/modeling_tf_pegasus.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/modeling_tf_utils.py | https://github.com/tensorflow/models/blob/a009f4fb9d2fc4949e32192a944688925ef78659/official/transformer/v2/embedding_layer.py#L24 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/modeling_utils.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/modeling_tf_utils.py | https://github.com/zihangdai/xlnet/blob/master/modeling.py#L253-L276 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/tensorflow/image-classification/run_image_classification.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/modeling_tf_pytorch_utils.py | https://pytorch.or | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/modeling_tf_pytorch_utils.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/modeling_tf_pytorch_utils.py | https://github.com/tensorflow/tensorflow/blob/ee16fcac960ae660e0e4496658a366e2f745e1f0/tensorflow/python/keras/engine/network.py#L1352-L1357 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/modeling_flax_utils.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/modeling_flax_utils.py | https://github.com/deepmind/jmp/blob/3a8318abc3292be38582794dbf7b094e6583b192/jmp/_src/policy.py#L27 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/modeling_flax_utils.py | https://huggingface.co/models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/modeling_flax_utils.py | https://huggingface.co/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/utils/hub.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/modeling_flax_utils.py | https://huggingface.co/docs/transformers/installation#offline-mode | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/modeling_flax_utils.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/modeling_flax_utils.py | https://github.com/google/flax/issues/1261 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/tensorflow/image-classification/run_image_classification.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/modeling_flax_pytorch_utils.py | https://pytorch.or | 模型相关说明 | 
-| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/modelcard.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/modelcard.py | https://arxiv.org/abs/1810.03993 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/modelcard.py | https://huggingface.co/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/integrations/integration_utils.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/integrations.py | https://github.com/huggingface/transformers/issues/11565 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/integrations/integration_utils.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/integrations.py | https://app.sigopt.com/experiment/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/integrations.py | https://www.tensorflow.org/tensorboard | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/main_classes/callback.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/integrations.py | https://www.wandb.com/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/integrations.py | https://docs.wandb.ai/integrations/huggingface | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/main_classes/callback.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/integrations.py | https://www.comet.ml/site/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/trainer_tf.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/integrations.py | https://www.comet.ml/docs/python-sdk/advanced/#comet-configuration-variables | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/main_classes/callback.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/integrations.py | https://pypi.org/project/azureml-sdk/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/main_classes/callback.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/integrations.py | https://www.mlflow.org/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/main_classes/callback.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/integrations.py | https://neptune.ai | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/.circleci/create_circleci_config.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/image_utils.py | http: | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/image_utils.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/image_utils.py | https://pytorch.org/vision/stable/transforms.html#torchvision.transforms.Resize | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/hf_argparser.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/hf_argparser.py | https://stackoverflow.com/questions/15008758/parsing-boolean-values-with-argparse | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/generation/utils.py | 
MT5_ID4146_for_PyTorch/transformers/src/transformers/generation_utils.py | https://github.com/huggingface/transformers/pull/5420/files | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/llm_tutorial.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/generation_utils.py | https://huggingface.co/blog/how-to-generate | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/generation/tf_logits_process.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/generation_utils.py | https://arxiv.org/pdf/1909.05858.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/generation/configuration_utils.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/generation_utils.py | https://github.com/huggingface/transformers/issues/14081 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ja/generation_strategies.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/generation_utils.py | https://arxiv.org/pdf/1610.02424.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/whisper/modeling_whisper.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/generation_utils.py | https://arxiv.org/abs/2010.00904 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/generation/tf_utils.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/generation_utils.py | http://arxiv.org/abs/1904.09751 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/generation/tf_utils.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/generation_utils.py | https://gist.github.com/thomwolf/1a5a29f6962089e871b94cbd09daf317 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/generation/beam_search.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/generation_tf_utils.py | https://github.com/facebookresearch/XLM/blob/9e6f6814d17be4fe5b15f2e6c43eb2b2d76daeb4/src/model/transformer.py#L529 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/llm_tutorial.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/generation_tf_utils.py | https://huggingface.co/blog/how-to-generate | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/generation/tf_logits_process.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/generation_tf_utils.py | https://arxiv.org/pdf/1909.05858.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/generation_tf_utils.py | https://arxiv.org/abs/1909.05858 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/generation/tf_logits_process.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/generation_tf_utils.py | https://github.com/pytorch/fairseq/blob/a07cb6f40480928c9e0548b737aadd36ee66ac76/fairseq/sequence_generator.py#L345 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/generation/utils.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/generation_tf_utils.py | https://github.com/huggingface/transformers/pull/5420/files | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/generation/tf_utils.py | 
MT5_ID4146_for_PyTorch/transformers/src/transformers/generation_tf_utils.py | http://arxiv.org/abs/1904.09751 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/generation/tf_utils.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/generation_tf_utils.py | https://gist.github.com/thomwolf/1a5a29f6962089e871b94cbd09daf317 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/hubert/modeling_tf_hubert.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/generation_tf_utils.py | https://github.com/tensorflow/tensorflow/issues/9260 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/generation/tf_logits_process.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/generation_tf_logits_process.py | https://arxiv.org/pdf/1909.05858.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/generation/tf_logits_process.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/generation_tf_logits_process.py | https://github.com/pytorch/fairseq/blob/a07cb6f40480928c9e0548b737aadd36ee66ac76/fairseq/sequence_generator.py#L345 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/generation/tf_logits_process.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/generation_logits_process.py | https://arxiv.org/pdf/1909.05858.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/generation/tf_logits_process.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/generation_logits_process.py | https://github.com/pytorch/fairseq/blob/a07cb6f40480928c9e0548b737aadd36ee66ac76/fairseq/sequence_generator.py#L345 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/generation/logits_process.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/generation_logits_process.py | https://github.com/facebookresearch/ParlAI/blob/master/parlai/core/torch_generator_agent.py#L1350 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/models/whisper/modeling_whisper.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/generation_logits_process.py | https://arxiv.org/abs/2010.00904 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ja/generation_strategies.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/generation_logits_process.py | https://arxiv.org/pdf/1610.02424.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/llm_tutorial.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/generation_flax_utils.py | https://huggingface.co/blog/how-to-generate | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/generation/utils.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/generation_flax_utils.py | https://github.com/huggingface/transformers/pull/5420/files | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/generation/beam_search.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/generation_beam_search.py | https://github.com/facebookresearch/XLM/blob/9e6f6814d17be4fe5b15f2e6c43eb2b2d76daeb4/src/model/transformer.py#L529 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/generation/beam_search.py | 
MT5_ID4146_for_PyTorch/transformers/src/transformers/generation_beam_search.py | https://github.com/ashwinkalyan/dbs/blob/master/dbs/beam_utils.lua | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ja/generation_strategies.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/generation_beam_search.py | https://arxiv.org/pdf/1610.02424.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/setup.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/file_utils.py | https://github.com/allenai/allennlp | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/file_utils.py | https://huggingface.co | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/utils/import_utils.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/file_utils.py | https://github.com/pytorch/pytorch/blob/2289a12f21c54da93bf5d696e3f9aea83dd9c10d/torch/testing/_internal/common_cuda.py#L51 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/utils/import_utils.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/file_utils.py | https://github.com/tqdm/tqdm/blob/master/tqdm/autonotebook.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/utils/import_utils.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/file_utils.py | https://github.com/google/sentencepiece#installation | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/utils/import_utils.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/file_utils.py | https://github.com/protocolbuffers/protobuf/tree/master/python#installation | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/utils/import_utils.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/file_utils.py | https://github.com/facebookresearch/faiss/blob/master/INSTALL.md | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/file_utils.py | https://pytorch.org/get-started/locally/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/file_utils.py | https://www.tensorflow.org/install | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/utils/import_utils.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/file_utils.py | https://github.com/facebookresearch/detectron2/blob/master/INSTALL.md | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/run_flax_speech_recognition_seq2seq.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/file_utils.py | https://github.com/google/flax | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/utils/import_utils.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/file_utils.py | https://github.com/rspeer/python-ftfy/tree/master#installing | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/tapas.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/file_utils.py | https://github.com/tensorflow/probability | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/src/transformers/utils/import_utils.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/file_utils.py | https://pandas.pydata.org/pandas-docs/stable/getting_started/install.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/file_utils.py | https://huggingface.co/models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/file_utils.py | https://huggingface.co/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/file_utils.py | https://hf.co | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/utils/import_utils.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/file_utils.py | https://github.com/optuna/optuna/blob/master/optuna/integration/__init__.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/utils/doc.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/file_utils.py | http://stackoverflow.com/a/6528148/190597 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/feature_extraction_utils.py | https://huggingface.co/models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/feature_extraction_utils.py | https://huggingface.co/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/utils/hub.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/feature_extraction_utils.py | https://huggingface.co/docs/transformers/installation#offline-mode | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/integrations/deepspeed.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/deepspeed.py | https://github.com/microsoft/DeepSpeed/issues/1394#issuecomment-937405374 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/tests/deepspeed/test_deepspeed.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/deepspeed.py | https://github.com/microsoft/DeepSpeed/issues/1612 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/data/processors/xnli.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/data/processors/xnli.py | https://github.com/google-research/bert/blob/f39e881b169b9d53bea03d2d341b31707a6c052b/run_classifier.py#L207 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/tapex/run_tabfact_with_tapex.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/data/processors/glue.py | https://github.com/huggingface/transformers/blob/master/examples/pytorch/text-classification/run_glue.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/tapex/run_tabfact_with_tapex.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/data/metrics/__init__.py | https://github.com/huggingface/transformers/blob/master/examples/pytorch/text-classification/run_glue.py | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/examples/research_projects/tapex/run_tabfact_with_tapex.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/data/datasets/glue.py | https://github.com/huggingface/transformers/blob/master/examples/pytorch/text-classification/run_glue.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/ko/model_doc/llama.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/convert_slow_tokenizer.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/convert_graph_to_onnx.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/convert_graph_to_onnx.py | https://github.com/microsoft/onnxruntime/tree/master/onnxruntime/python/tools/transformers | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/configuration_utils.py | https://huggingface.co/models | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/configuration_utils.py | https://huggingface.co/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/utils/hub.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/configuration_utils.py | https://huggingface.co/docs/transformers/installation#offline-mode | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/jax-projects/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/commands/user.py | https://git-lfs.github.com/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/commands/lfs.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/commands/lfs.py | https://github.com/git-lfs/git-lfs/blob/master/docs/custom-transfers.md | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/commands/convert.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/commands/add_new_model_like.py | https://huggingface.co/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/README.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/commands/add_new_model_like.py | https://huggingface.co/models?filter= | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/setup.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/benchmark/benchmark_utils.py | https://github.com/allenai/allennlp | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/benchmark/benchmark_utils.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/benchmark/benchmark_utils.py | https://github.com/pythonprofilers/memory_profiler/blob/895c4ac7a08020d66ae001e24067da6dcea42451/memory_profiler.py#L239 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/benchmark/benchmark_utils.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/benchmark/benchmark_utils.py | https://psutil.readthedocs.io/en/latest/#psutil.Process.memory_info | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/src/transformers/benchmark/benchmark_utils.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/benchmark/benchmark_utils.py | https://github.com/tensorflow/tensorflow/issues/20218#issuecomment-416771802 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/benchmark/benchmark_utils.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/benchmark/benchmark_utils.py | https://github.com/pytorch/xla/issues/2180 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/benchmark/benchmark.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/benchmark/benchmark_tf.py | https://docs.python.org/2/library/timeit.html#timeit.Timer.repeat | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/lightning_base.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/benchmark/benchmark_args.py | https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/benchmark/benchmark.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/benchmark/benchmark.py | https://github.com/NVIDIA/apex/issues/439 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/benchmark/benchmark.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/benchmark/benchmark.py | https://docs.python.org/2/library/timeit.html#timeit.Timer.repeat | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/activations_tf.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/activations_tf.py | https://arxiv.org/abs/1606.08415 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/activations_tf.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/activations_tf.py | https://arxiv.org/abs/1606.0841 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/activations_tf.py | https://arxiv.org/abs/2004.09602 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/activations_tf.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/activations_tf.py | https://arxiv.org/abs/1612.08083 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/activations_tf.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/activations.py | https://arxiv.org/abs/1606.08415 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/activations.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/activations.py | https://github.com/hendrycks/GELUs | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/README_ru.md | MT5_ID4146_for_PyTorch/transformers/src/transformers/activations.py | https://arxiv.org/abs/2004.09602 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/activations.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/activations.py | https://arxiv.org/abs/1702.03118 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/activations.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/activations.py | https://arxiv.org/abs/1710.05941v1 | 参考论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/src/transformers/activations.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/activations.py | https://arxiv.org/abs/1908.08681 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/src/transformers/activations.py | MT5_ID4146_for_PyTorch/transformers/src/transformers/activations.py | https://github.com/digantamisra98/Mish | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/setup.py | MT5_ID4146_for_PyTorch/transformers/setup.py | https://test.pypi.org/legacy/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/setup.py | MT5_ID4146_for_PyTorch/transformers/setup.py | https://testpypi.python.org/pypi | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/setup.py | MT5_ID4146_for_PyTorch/transformers/setup.py | https://github.com/pypa/pip/issues/5466 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/marian.md | MT5_ID4146_for_PyTorch/transformers/scripts/tatoeba/upload_models.sh | https://huggingface.co/Helsinki-NLP/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/setup.py | MT5_ID4146_for_PyTorch/transformers/scripts/stale.py | https://github.com/allenai/allennlp | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/pegasus/build_test_sample_spm_no_bos.py | MT5_ID4146_for_PyTorch/transformers/scripts/pegasus/build_test_sample_spm_no_bos.py | https://raw.githubusercontent.com/google/sentencepiece/master/data/botchan.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-facebook-wmt19.py | MT5_ID4146_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py | http://matrix.statmt.org/matrix/output/1907?run_id=6937 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-facebook-wmt19.py | MT5_ID4146_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py | http://matrix.statmt.org/matrix/output/1914?run_id=6724 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-facebook-wmt19.py | MT5_ID4146_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py | http://matrix.statmt.org/matrix/output/1909?run_id=6862 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-facebook-wmt19.py | MT5_ID4146_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py | http://matrix.statmt.org/matrix/output/1902?run_id=6750 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-facebook-wmt19.py | MT5_ID4146_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py | https://github.com/pytorch/fairseq/blob/master/examples/wmt19/README.md | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docs/source/en/model_doc/fsmt.md | MT5_ID4146_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py | https://arxiv.org/abs/1907.06616 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-facebook-wmt19.py | MT5_ID4146_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py | https://huggingface.co/facebook/wmt19-en-ru | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-facebook-wmt19.py | 
MT5_ID4146_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py | https://huggingface.co/facebook/wmt19-ru-en | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-facebook-wmt19.py | MT5_ID4146_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py | https://huggingface.co/facebook/wmt19-en-de | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-facebook-wmt19.py | MT5_ID4146_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py | https://huggingface.co/facebook/wmt19-de-en | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-facebook-wmt19.py | MT5_ID4146_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py | https://discuss.huggingface.co/t/issues-with-translating-inputs-containing-repeated-phrases/981 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-allenai-wmt19.py | MT5_ID4146_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py | http://www.statmt.org/wmt19/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-allenai-wmt19.py | MT5_ID4146_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py | http://matrix.statmt.org/test_sets/newstest2019.tgz?1556572561 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-allenai-wmt19.py | MT5_ID4146_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt19.py | https://github.com/jungokasai/deep-shallow/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-allenai-wmt19.py | MT5_ID4146_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt19.py | https://arxiv.org/abs/2006.10369 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-allenai-wmt19.py | MT5_ID4146_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt19.py | https://huggingface.co/allenai/wmt19-de-en-6-6-big | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-allenai-wmt19.py | MT5_ID4146_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt19.py | https://huggingface.co/allenai/wmt19-de-en-6-6-base | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-allenai-wmt19.py | MT5_ID4146_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt19.py | http://www.statmt.org/wmt19/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-allenai-wmt19.py | MT5_ID4146_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt19.py | http://matrix.statmt.org/test_sets/newstest2019.tgz?1556572561 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-allenai-wmt19.py | MT5_ID4146_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt16.py | https://github.com/jungokasai/deep-shallow/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-allenai-wmt19.py | MT5_ID4146_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt16.py | https://arxiv.org/abs/2006.10369 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-allenai-wmt16.py | MT5_ID4146_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt16.py | 
https://huggingface.co/allenai/wmt16-en-de-dist-12-1 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-allenai-wmt16.py | MT5_ID4146_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt16.py | https://huggingface.co/allenai/wmt16-en-de-dist-6-1 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-allenai-wmt16.py | MT5_ID4146_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt16.py | https://huggingface.co/allenai/wmt16-en-de-12-1 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/.circleci/TROUBLESHOOT.md | MT5_ID4146_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt16.py | https://github.com/huggingface/transformers | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-allenai-wmt16.py | MT5_ID4146_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt16.py | http://www.statmt.org/wmt16/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-allenai-wmt16.py | MT5_ID4146_for_PyTorch/transformers/scripts/fsmt/gen-card-allenai-wmt16.py | http://matrix.statmt.org/test_sets/newstest2016.tgz?1504722372 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-facebook-wmt19.py | MT5_ID4146_for_PyTorch/transformers/scripts/fsmt/eval-facebook-wmt19.sh | http://matrix.statmt.org/matrix/output/1907?run_id=6937 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/eval-facebook-wmt19.sh | MT5_ID4146_for_PyTorch/transformers/scripts/fsmt/eval-facebook-wmt19.sh | http://matrix.statmt.org/matrix/output/1914?score_id=37605 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-facebook-wmt19.py | MT5_ID4146_for_PyTorch/transformers/scripts/fsmt/eval-facebook-wmt19.sh | http://matrix.statmt.org/matrix/output/1909?run_id=6862 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/scripts/fsmt/gen-card-facebook-wmt19.py | MT5_ID4146_for_PyTorch/transformers/scripts/fsmt/eval-facebook-wmt19.sh | http://matrix.statmt.org/matrix/output/1902?run_id=6750 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | MT5_ID4146_for_PyTorch/transformers/examples/tensorflow/translation/run_translation.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | MT5_ID4146_for_PyTorch/transformers/examples/tensorflow/translation/run_translation.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | MT5_ID4146_for_PyTorch/transformers/examples/tensorflow/token-classification/run_ner.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | MT5_ID4146_for_PyTorch/transformers/examples/tensorflow/token-classification/run_ner.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | MT5_ID4146_for_PyTorch/transformers/examples/tensorflow/text-classification/run_text_classification.py | 
https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/text-classification/run_flax_glue.py | MT5_ID4146_for_PyTorch/transformers/examples/tensorflow/text-classification/run_text_classification.py | https://huggingface.co/docs/datasets/package_reference/main_classes.html#datasets.Dataset.unique | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | MT5_ID4146_for_PyTorch/transformers/examples/tensorflow/text-classification/run_glue.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | MT5_ID4146_for_PyTorch/transformers/examples/tensorflow/summarization/run_summarization.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | MT5_ID4146_for_PyTorch/transformers/examples/tensorflow/summarization/run_summarization.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | MT5_ID4146_for_PyTorch/transformers/examples/tensorflow/question-answering/run_qa.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | MT5_ID4146_for_PyTorch/transformers/examples/tensorflow/question-answering/run_qa.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/question-answering/run_qa.py | MT5_ID4146_for_PyTorch/transformers/examples/tensorflow/question-answering/run_qa.py | https://huggingface.co/transformers/index.html#supported-frameworks | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | MT5_ID4146_for_PyTorch/transformers/examples/tensorflow/multiple-choice/run_swag.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | MT5_ID4146_for_PyTorch/transformers/examples/tensorflow/multiple-choice/run_swag.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_mlm_flax.py | MT5_ID4146_for_PyTorch/transformers/examples/tensorflow/language-modeling/run_mlm.py | https://huggingface.co/models?filter=fill-mask | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | MT5_ID4146_for_PyTorch/transformers/examples/tensorflow/language-modeling/run_mlm.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | MT5_ID4146_for_PyTorch/transformers/examples/tensorflow/language-modeling/run_mlm.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_clm_flax.py | MT5_ID4146_for_PyTorch/transformers/examples/tensorflow/language-modeling/run_clm.py | https://huggingface.co/models?filter=text-generation | 模型相关说明 | -| 开源代码引入 
| https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | MT5_ID4146_for_PyTorch/transformers/examples/tensorflow/language-modeling/run_clm.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | MT5_ID4146_for_PyTorch/transformers/examples/tensorflow/language-modeling/run_clm.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/lxmert/modeling_frcnn.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/visual_bert/modeling_frcnn.py | https://github.com/pytorch/pytorch/issues/22812 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/lxmert/modeling_frcnn.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/visual_bert/modeling_frcnn.py | https://github.com/airsplay/py-bottom-up-attention/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/lightning_base.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/seq2seq-distillation/lightning_base.py | https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/seq2seq-distillation/finetune_pegasus_xsum.sh | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/seq2seq-distillation/finetune_pegasus_xsum.sh | https://arxiv.org/abs/1912.08777 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/rag-end2end-retriever/use_own_knowledge_dataset.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/rag-end2end-retriever/use_own_knowledge_dataset.py | https://huggingface.co/docs/datasets/loading_datasets.html?highlight=csv#csv-files | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/lightning_base.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/rag-end2end-retriever/lightning_base.py | https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/rag-end2end-retriever/finetune_rag.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/rag-end2end-retriever/finetune_rag.py | https://github.com/PyTorchLightning/pytorch-lightning/issues/2424 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/rag-end2end-retriever/finetune_rag.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/rag-end2end-retriever/finetune_rag.py | https://docs.ray.io/en/master/cluster/index.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/rag-end2end-retriever/finetune_rag.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/rag-end2end-retriever/distributed_ray_retriever.py | https://docs.ray.io/en/master/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/rag-end2end-retriever/distributed_ray_retriever.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/rag-end2end-retriever/distributed_ray_retriever.py | https://docs.ray.io/en/master/walkthrough.html#remote | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/examples/research_projects/rag-end2end-retriever/use_own_knowledge_dataset.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/rag/use_own_knowledge_dataset.py | https://huggingface.co/docs/datasets/loading_datasets.html?highlight=csv#csv-files | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/rag/test_distributed_retriever.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/rag/test_distributed_retriever.py | https://stackoverflow.com/questions/54338013/parallel-import-a-python-file-from-sibling-folder | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/lightning_base.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/rag/lightning_base.py | https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/rag-end2end-retriever/finetune_rag.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/rag/finetune_rag.py | https://github.com/PyTorchLightning/pytorch-lightning/issues/2424 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/rag-end2end-retriever/finetune_rag.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/rag/finetune_rag.py | https://docs.ray.io/en/master/cluster/index.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/rag-end2end-retriever/finetune_rag.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/rag/distributed_ray_retriever.py | https://docs.ray.io/en/master/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/rag-end2end-retriever/distributed_ray_retriever.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/rag/distributed_ray_retriever.py | https://docs.ray.io/en/master/walkthrough.html#remote | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/quantization-qdqbert/run_quant_qa.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/quantization-qdqbert/run_quant_qa.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/question-answering/run_qa.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/quantization-qdqbert/run_quant_qa.py | https://huggingface.co/transformers/index.html#supported-frameworks | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/quantization-qdqbert/evaluate-hf-trt-qa.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/quantization-qdqbert/evaluate-hf-trt-qa.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_mlm_flax.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/performer/run_mlm_performer.py | https://huggingface.co/models?filter=fill-mask | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/performer/run_mlm_performer.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/performer/run_mlm_performer.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/performer/modeling_flax_performer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/performer/modeling_flax_performer.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/performer/modeling_flax_performer.py | https://arxiv.org/abs/1607.06450 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/performer/modeling_flax_performer.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/onnx/summarization/bart_onnx/generation_onnx.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/onnx/summarization/bart_onnx/generation_onnx.py | https://msdata.visualstudio.com/Vienna/_workitems/edit/1486599 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/movement-pruning/masked_run_squad.py | https://www.github.com/nvidia/apex | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/lightning_base.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/movement-pruning/masked_run_squad.py | https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/movement-pruning/masked_run_squad.py | https://code.visualstudio.com/docs/python/debugging#_attach-to-a-local-script | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/movement-pruning/masked_run_glue.py | https://www.github.com/nvidia/apex | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/lightning_base.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/movement-pruning/masked_run_glue.py | https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modules/binarizer.py | 
MT5_ID4146_for_PyTorch/transformers/examples/research_projects/movement-pruning/emmental/modules/binarizer.py | https://github.com/arunmallya/piggyback | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modules/binarizer.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/movement-pruning/emmental/modules/binarizer.py | https://github.com/allenai/hidden-networks | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modules/binarizer.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/movement-pruning/emmental/modules/binarizer.py | https://github.com/NervanaSystems/distiller/blob/2291fdcc2ea642a98d4e20629acb5a9e2e04b4e6/distiller/pruning/automated_gradual_pruner.py#L24 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/mm-imdb/run_mmimdb.py | https://www.github.com/nvidia/apex | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/lightning_base.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/mm-imdb/run_mmimdb.py | https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/mm-imdb/run_mmimdb.py | https://code.visualstudio.com/docs/python/debugging#_attach-to-a-local-script | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_mlm_flax.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/mlm_wwm/run_mlm_wwm.py | https://huggingface.co/models?filter=fill-mask | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/mlm_wwm/run_mlm_wwm.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/mlm_wwm/run_mlm_wwm.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_chinese_ref.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/mlm_wwm/run_chinese_ref.py | https://en.wikipedia.org/wiki/CJK_Unified_Ideographs_ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_chinese_ref.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/mlm_wwm/run_chinese_ref.py | 
https://github.com/ymcui/Chinese-BERT-wwm | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_chinese_ref.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/mlm_wwm/run_chinese_ref.py | https://github.com/HIT-SCIR/ltp | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/lxmert/modeling_frcnn.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/lxmert/modeling_frcnn.py | https://github.com/pytorch/pytorch/issues/22812 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/lxmert/modeling_frcnn.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/lxmert/modeling_frcnn.py | https://github.com/airsplay/py-bottom-up-attention/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/luke/run_luke_ner_no_trainer.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/luke/run_luke_ner_no_trainer.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/longform-qa/README.md | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/longform-qa/eli5_app.py | https://yjernite.github.io/lfqa.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/longform-qa/eli5_app.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/longform-qa/eli5_app.py | https://research.google/pubs/pub49029/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/longform-qa/eli5_app.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/longform-qa/eli5_app.py | https://arxiv.org/abs/1907.09190 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/longform-qa/eli5_app.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/longform-qa/eli5_app.py | https://huggingface.co/facebook/bart-large | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_chinese_ref.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/longform-qa/eli5_app.py | https://en.wikipedia.org/wiki/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/jax-projects/model_parallel/run_clm_mp.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/jax-projects/model_parallel/run_clm_mp.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/jax-projects/model_parallel/partitions.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/jax-projects/model_parallel/partitions.py | https://github.com/google-research/google-research/blob/master/flax_models/t5x/partitions.py | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/examples/flax/vision/run_image_classification.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/jax-projects/hybrid_clip/run_hybrid_clip.py | https://huggingface.co/models?filter=v | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_mlm_flax.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/jax-projects/hybrid_clip/run_hybrid_clip.py | https://huggingface.co/models?filter=fill-mask | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_mlm_flax.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/jax-projects/dataset-streaming/run_mlm_flax_stream.py | https://huggingface.co/models?filter=fill-mask | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/jax-projects/dataset-streaming/run_mlm_flax_stream.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/fsner/README.md | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/fsner/src/fsner/model.py | https://arxiv.org/abs/2008.10570 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/fsner/setup.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/fsner/setup.py | msi.sayef@gmail.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/distillation/distiller.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/distillation/utils.py | https://github.com/facebookresearch/XLM | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/lightning_base.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/distillation/train.py | https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/distillation/run_squad_w_distillation.py | https://www.github.com/nvidia/apex | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/lightning_base.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/distillation/run_squad_w_distillation.py | https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/distillation/run_squad_w_distillation.py | https://code.visualstudio.com/docs/python/debugging#_attach-to-a-local-script | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/distillation/distiller.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/distillation/lm_seqs_dataset.py | https://github.com/facebookresearch/XLM | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/distillation/grouped_batch_sampler.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/distillation/grouped_batch_sampler.py | https://github.com/pytorch/vision/blob/master/references/detection/group_by_aspect_ratio.py | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/examples/research_projects/distillation/distiller.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/distillation/distiller.py | https://github.com/facebookresearch/XLM | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/distillation/distiller.py | https://www.github.com/nvidia/apex | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/distillation/distiller.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/distillation/distiller.py | https://github.com/peterliht/knowledge-distillation-pytorch/blob/master/model/net.py#L100 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/distillation/distiller.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/distillation/distiller.py | https://github.com/peterliht/knowledge-distillation-pytorch/issues/2 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/deebert/test_glue_deebert.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/deebert/test_glue_deebert.py | https://github.com/huggingface/transformers/issues/10560 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/deebert/run_glue_deebert.py | https://www.github.com/nvidia/apex | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/lightning_base.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/deebert/run_glue_deebert.py | https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/deebert/run_glue_deebert.py | https://code.visualstudio.com/docs/python/debugging#_attach-to-a-local-script | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/codeparrot/scripts/human_eval.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/codeparrot/scripts/human_eval.py | https://stackoverflow.com/questions/60804599/python-multiprocessing-keeps-spawning-the-whole-script | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bertology/run_prune_gpt.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/bertology/run_prune_gpt.py | https://github.com/huggingface/transformers/blob/783d7d2629e97c5f0c5f9ef01b8c66410275c204/examples/research_projects/bertology/run_bertology.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bertology/run_prune_gpt.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/bertology/run_prune_gpt.py | http://arxiv.org/abs/1905.10650 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/bertology/run_prune_gpt.py | https://code.visualstudio.com/docs/python/debugging#_attach-to-a-local-script | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bertology/run_prune_gpt.py | 
MT5_ID4146_for_PyTorch/transformers/examples/research_projects/bertology/run_bertology.py | http://arxiv.org/abs/1905.10650 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bertology/run_bertology.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/bertology/run_bertology.py | https://github.com/pmichel31415/are-16-heads-really-better-than-1 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/bertology/run_bertology.py | https://code.visualstudio.com/docs/python/debugging#_attach-to-a-local-script | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/bert-loses-patience/run_glue_with_pabee.py | https://www.github.com/nvidia/apex | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/lightning_base.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/bert-loses-patience/run_glue_with_pabee.py | https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/bert-loses-patience/run_glue_with_pabee.py | https://code.visualstudio.com/docs/python/debugging#_attach-to-a-local-script | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/bert-loses-patience/pabee/modeling_pabee_bert.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bertabs/utils_summarization.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/bertabs/utils_summarization.py | https://cs.nyu.edu/~kcho/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bertabs/utils_summarization.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/bertabs/utils_summarization.py | https://github.com/abisee/cnn-dailymail/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bertabs/utils_summarization.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/bertabs/utils_summarization.py | https://github.com/nlpyang/PreSumm | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/research_projects/bertabs/configuration_bertabs.py | MT5_ID4146_for_PyTorch/transformers/examples/research_projects/bertabs/configuration_bertabs.py | https://huggingface.co/remi/bertabs-finetuned-cnndm-extractive-abstractive-summarization/resolve/main/config.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/seq2seq/xla_spawn.py | MT5_ID4146_for_PyTorch/transformers/examples/pytorch/xla_spawn.py | https://github.com/pytorch/pytorch/blob/master/torch/distributed/launch.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | MT5_ID4146_for_PyTorch/transformers/examples/pytorch/translation/run_translation_no_trainer.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | MT5_ID4146_for_PyTorch/transformers/examples/pytorch/translation/run_translation_no_trainer.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | MT5_ID4146_for_PyTorch/transformers/examples/pytorch/translation/run_translation.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | MT5_ID4146_for_PyTorch/transformers/examples/pytorch/translation/run_translation.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | MT5_ID4146_for_PyTorch/transformers/examples/pytorch/token-classification/run_ner_no_trainer.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | MT5_ID4146_for_PyTorch/transformers/examples/pytorch/token-classification/run_ner_no_trainer.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | MT5_ID4146_for_PyTorch/transformers/examples/pytorch/token-classification/run_ner.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | MT5_ID4146_for_PyTorch/transformers/examples/pytorch/token-classification/run_ner.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/question-answering/run_qa.py | MT5_ID4146_for_PyTorch/transformers/examples/pytorch/token-classification/run_ner.py | https://huggingface.co/transformers/index.html#supported-frameworks | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/text-generation/run_generation.py | MT5_ID4146_for_PyTorch/transformers/examples/pytorch/text-generation/run_generation.py | https://github.com/rusiaaman/XLNet-gen#methodology | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/text-generation/run_generation.py | MT5_ID4146_for_PyTorch/transformers/examples/pytorch/text-generation/run_generation.py | https://medium.com/@amanrusia/xlnet-speaks-comparison-to-gpt-2-ea1a4e9ba39e | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | MT5_ID4146_for_PyTorch/transformers/examples/pytorch/text-classification/run_xnli.py | https://code.visualstudio.com/docs/python/debugging#_attach-to-a-local-script | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | MT5_ID4146_for_PyTorch/transformers/examples/pytorch/text-classification/run_glue_no_trainer.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/text-classification/run_flax_glue.py | MT5_ID4146_for_PyTorch/transformers/examples/pytorch/text-classification/run_glue_no_trainer.py | https://huggingface.co/docs/datasets/package_reference/main_classes.html#datasets.Dataset.unique | 模型相关说明 | -| 
开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | MT5_ID4146_for_PyTorch/transformers/examples/pytorch/text-classification/run_glue.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/text-classification/run_flax_glue.py | MT5_ID4146_for_PyTorch/transformers/examples/pytorch/text-classification/run_glue.py | https://huggingface.co/docs/datasets/package_reference/main_classes.html#datasets.Dataset.unique | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | MT5_ID4146_for_PyTorch/transformers/examples/pytorch/summarization/run_summarization_no_trainer.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | MT5_ID4146_for_PyTorch/transformers/examples/pytorch/summarization/run_summarization_no_trainer.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | MT5_ID4146_for_PyTorch/transformers/examples/pytorch/summarization/run_summarization.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | MT5_ID4146_for_PyTorch/transformers/examples/pytorch/summarization/run_summarization.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | MT5_ID4146_for_PyTorch/transformers/examples/pytorch/question-answering/run_seq2seq_qa.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | MT5_ID4146_for_PyTorch/transformers/examples/pytorch/question-answering/run_seq2seq_qa.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | MT5_ID4146_for_PyTorch/transformers/examples/pytorch/question-answering/run_qa_no_trainer.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | MT5_ID4146_for_PyTorch/transformers/examples/pytorch/question-answering/run_qa_no_trainer.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | MT5_ID4146_for_PyTorch/transformers/examples/pytorch/question-answering/run_qa_beam_search_no_trainer.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | MT5_ID4146_for_PyTorch/transformers/examples/pytorch/question-answering/run_qa_beam_search_no_trainer.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | MT5_ID4146_for_PyTorch/transformers/examples/pytorch/question-answering/run_qa_beam_search.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | MT5_ID4146_for_PyTorch/transformers/examples/pytorch/question-answering/run_qa_beam_search.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | MT5_ID4146_for_PyTorch/transformers/examples/pytorch/question-answering/run_qa.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | MT5_ID4146_for_PyTorch/transformers/examples/pytorch/question-answering/run_qa.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/question-answering/run_qa.py | MT5_ID4146_for_PyTorch/transformers/examples/pytorch/question-answering/run_qa.py | https://huggingface.co/transformers/index.html#supported-frameworks | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | MT5_ID4146_for_PyTorch/transformers/examples/pytorch/multiple-choice/run_swag_no_trainer.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | MT5_ID4146_for_PyTorch/transformers/examples/pytorch/multiple-choice/run_swag_no_trainer.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | MT5_ID4146_for_PyTorch/transformers/examples/pytorch/multiple-choice/run_swag.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | MT5_ID4146_for_PyTorch/transformers/examples/pytorch/multiple-choice/run_swag.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | MT5_ID4146_for_PyTorch/transformers/examples/pytorch/language-modeling/run_plm.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | MT5_ID4146_for_PyTorch/transformers/examples/pytorch/language-modeling/run_plm.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_mlm_flax.py | MT5_ID4146_for_PyTorch/transformers/examples/pytorch/language-modeling/run_mlm_no_trainer.py | https://huggingface.co/models?filter=fill-mask | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | MT5_ID4146_for_PyTorch/transformers/examples/pytorch/language-modeling/run_mlm_no_trainer.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | MT5_ID4146_for_PyTorch/transformers/examples/pytorch/language-modeling/run_mlm_no_trainer.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_mlm_flax.py | 
MT5_ID4146_for_PyTorch/transformers/examples/pytorch/language-modeling/run_mlm.py | https://huggingface.co/models?filter=fill-mask | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | MT5_ID4146_for_PyTorch/transformers/examples/pytorch/language-modeling/run_mlm.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | MT5_ID4146_for_PyTorch/transformers/examples/pytorch/language-modeling/run_mlm.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_clm_flax.py | MT5_ID4146_for_PyTorch/transformers/examples/pytorch/language-modeling/run_clm_no_trainer.py | https://huggingface.co/models?filter=text-generation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | MT5_ID4146_for_PyTorch/transformers/examples/pytorch/language-modeling/run_clm_no_trainer.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | MT5_ID4146_for_PyTorch/transformers/examples/pytorch/language-modeling/run_clm_no_trainer.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_clm_flax.py | MT5_ID4146_for_PyTorch/transformers/examples/pytorch/language-modeling/run_clm.py | https://huggingface.co/models?filter=text-generation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | MT5_ID4146_for_PyTorch/transformers/examples/pytorch/language-modeling/run_clm.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | MT5_ID4146_for_PyTorch/transformers/examples/pytorch/language-modeling/run_clm.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/image-pretraining/run_mim.py | MT5_ID4146_for_PyTorch/transformers/examples/pytorch/image-pretraining/run_mim.py | https://github.com/microsoft/SimMIM/blob/main/data/data_simmim.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/image-pretraining/README.md | MT5_ID4146_for_PyTorch/transformers/examples/pytorch/image-pretraining/run_mae.py | https://arxiv.org/abs/2111.06377 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/pytorch/image-pretraining/run_mae.py | MT5_ID4146_for_PyTorch/transformers/examples/pytorch/image-pretraining/run_mae.py | https://github.com/facebookresearch/mae/blob/main/main_pretrain.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/vision/run_image_classification.py | MT5_ID4146_for_PyTorch/transformers/examples/pytorch/contrastive-image-text/run_clip.py | https://huggingface.co/models?filter=v | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_mlm_flax.py | MT5_ID4146_for_PyTorch/transformers/examples/pytorch/contrastive-image-text/run_clip.py | https://huggingface.co/models?filter=fill-mask | 模型相关说明 | -| 开源代码引入 
| https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | MT5_ID4146_for_PyTorch/transformers/examples/pytorch/contrastive-image-text/run_clip.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | MT5_ID4146_for_PyTorch/transformers/examples/pytorch/contrastive-image-text/run_clip.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/run_ner.sh | MT5_ID4146_for_PyTorch/transformers/examples/legacy/token-classification/run.sh | https://drive.google.com/drive/folders/1kC0I2UGl2ltrluI9NqDjaQJGw5iliw_J | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/seq2seq/xla_spawn.py | MT5_ID4146_for_PyTorch/transformers/examples/legacy/seq2seq/xla_spawn.py | https://github.com/pytorch/pytorch/blob/master/torch/distributed/launch.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_transfo_xl.py | MT5_ID4146_for_PyTorch/transformers/examples/legacy/run_transfo_xl.py | https://github.com/kimiyoung/transformer-xl | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_transfo_xl.py | MT5_ID4146_for_PyTorch/transformers/examples/legacy/run_transfo_xl.py | https://github.com/kimiyoung/transformer-xl/blob/master/pytorch/eval.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | MT5_ID4146_for_PyTorch/transformers/examples/legacy/run_transfo_xl.py | https://code.visualstudio.com/docs/python/debugging#_attach-to-a-local-script | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | MT5_ID4146_for_PyTorch/transformers/examples/legacy/run_swag.py | https://github.com/google-research/bert/issues/38 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | MT5_ID4146_for_PyTorch/transformers/examples/legacy/run_swag.py | https://www.github.com/nvidia/apex | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/lightning_base.py | MT5_ID4146_for_PyTorch/transformers/examples/legacy/run_swag.py | https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | MT5_ID4146_for_PyTorch/transformers/examples/legacy/run_swag.py | https://code.visualstudio.com/docs/python/debugging#_attach-to-a-local-script | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_openai_gpt.py | MT5_ID4146_for_PyTorch/transformers/examples/legacy/run_openai_gpt.py | https://github.com/huggingface/pytorch-openai-transformer-lm/blob/master/train.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_openai_gpt.py | MT5_ID4146_for_PyTorch/transformers/examples/legacy/run_openai_gpt.py | https://github.com/openai/finetune-transformer-lm/blob/master/train.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | MT5_ID4146_for_PyTorch/transformers/examples/legacy/run_openai_gpt.py | https://code.visualstudio.com/docs/python/debugging#_attach-to-a-local-script | 模型相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/examples/legacy/run_chinese_ref.py | MT5_ID4146_for_PyTorch/transformers/examples/legacy/run_chinese_ref.py | https://en.wikipedia.org/wiki/CJK_Unified_Ideographs_ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_chinese_ref.py | MT5_ID4146_for_PyTorch/transformers/examples/legacy/run_chinese_ref.py | https://github.com/ymcui/Chinese-BERT-wwm | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_chinese_ref.py | MT5_ID4146_for_PyTorch/transformers/examples/legacy/run_chinese_ref.py | https://github.com/HIT-SCIR/ltp | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_camembert.py | MT5_ID4146_for_PyTorch/transformers/examples/legacy/run_camembert.py | https://github.com/pytorch/fairseq/blob/master/fairseq/models/roberta/hub_interface.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | MT5_ID4146_for_PyTorch/transformers/examples/legacy/question-answering/run_squad.py | https://www.github.com/nvidia/apex | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/lightning_base.py | MT5_ID4146_for_PyTorch/transformers/examples/legacy/question-answering/run_squad.py | https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/run_swag.py | MT5_ID4146_for_PyTorch/transformers/examples/legacy/question-answering/run_squad.py | https://code.visualstudio.com/docs/python/debugging#_attach-to-a-local-script | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/run_ner.sh | MT5_ID4146_for_PyTorch/transformers/examples/legacy/pytorch-lightning/run_ner.sh | https://drive.google.com/drive/folders/1kC0I2UGl2ltrluI9NqDjaQJGw5iliw_J | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/run_ner.py | MT5_ID4146_for_PyTorch/transformers/examples/legacy/pytorch-lightning/run_ner.py | https://github.com/PyTorchLightning/pytorch-lightning/blob/master/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/run_ner.py | MT5_ID4146_for_PyTorch/transformers/examples/legacy/pytorch-lightning/run_ner.py | https://github.com/huggingface/transformers/issues/3159 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/run_ner.py | MT5_ID4146_for_PyTorch/transformers/examples/legacy/pytorch-lightning/run_ner.py | https://github.com/PyTorchLightning/pytorch-lightning/blob/master | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/legacy/pytorch-lightning/lightning_base.py | MT5_ID4146_for_PyTorch/transformers/examples/legacy/pytorch-lightning/lightning_base.py | https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | MT5_ID4146_for_PyTorch/transformers/examples/flax/token-classification/run_flax_ner.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | MT5_ID4146_for_PyTorch/transformers/examples/flax/token-classification/run_flax_ner.py | 
https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | MT5_ID4146_for_PyTorch/transformers/examples/flax/text-classification/run_flax_glue.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/text-classification/run_flax_glue.py | MT5_ID4146_for_PyTorch/transformers/examples/flax/text-classification/run_flax_glue.py | https://huggingface.co/docs/datasets/package_reference/main_classes.html#datasets.Dataset.unique | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | MT5_ID4146_for_PyTorch/transformers/examples/flax/summarization/run_summarization_flax.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | MT5_ID4146_for_PyTorch/transformers/examples/flax/summarization/run_summarization_flax.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/run_flax_speech_recognition_seq2seq.py | MT5_ID4146_for_PyTorch/transformers/examples/flax/summarization/run_summarization_flax.py | https://github.com/google/flax/blob/87a211135c6a377c8f29048a1cac3840e38b9da4/examples/wmt/train.py#L104 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | MT5_ID4146_for_PyTorch/transformers/examples/flax/question-answering/run_qa.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | MT5_ID4146_for_PyTorch/transformers/examples/flax/question-answering/run_qa.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/question-answering/run_qa.py | MT5_ID4146_for_PyTorch/transformers/examples/flax/question-answering/run_qa.py | https://huggingface.co/transformers/index.html#supported-frameworks | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/t5_tokenizer_model.py | MT5_ID4146_for_PyTorch/transformers/examples/flax/language-modeling/t5_tokenizer_model.py | https://github.com/yandex-research/DeDLOC/blob/main/sahajbert/tokenizer/tokenizer_model.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_t5_mlm_flax.py | MT5_ID4146_for_PyTorch/transformers/examples/flax/language-modeling/run_t5_mlm_flax.py | https://huggingface.co/models?filter=t5 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_t5_mlm_flax.py | MT5_ID4146_for_PyTorch/transformers/examples/flax/language-modeling/run_t5_mlm_flax.py | https://github.com/google-research/text-to-text-transfer-transformer/blob/84f8bcc14b5f2c03de51bd3587609ba8f6bbd1cd/t5/data/preprocessors.py#L2466 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_t5_mlm_flax.py | MT5_ID4146_for_PyTorch/transformers/examples/flax/language-modeling/run_t5_mlm_flax.py | https://arxiv.org/pdf/1910.10683.pdf | 参考论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_t5_mlm_flax.py | MT5_ID4146_for_PyTorch/transformers/examples/flax/language-modeling/run_t5_mlm_flax.py | https://github.com/google-research/text-to-text-transfer-transformer/blob/master/t5/data/preprocessors.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_t5_mlm_flax.py | MT5_ID4146_for_PyTorch/transformers/examples/flax/language-modeling/run_t5_mlm_flax.py | https://github.com/google-research/text-to-text-transfer-transformer/blob/84f8bcc14b5f2c03de51bd3587609ba8f6bbd1cd/t5/data/preprocessors.py#L2682 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | MT5_ID4146_for_PyTorch/transformers/examples/flax/language-modeling/run_t5_mlm_flax.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | MT5_ID4146_for_PyTorch/transformers/examples/flax/language-modeling/run_t5_mlm_flax.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_mlm_flax.py | MT5_ID4146_for_PyTorch/transformers/examples/flax/language-modeling/run_t5_mlm_flax.py | https://github.com/deepmind/optax/blob/ed02befef9bf81cbbf236be3d2b0e032e9ed4a40/optax/_src/alias.py#L74 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_mlm_flax.py | MT5_ID4146_for_PyTorch/transformers/examples/flax/language-modeling/run_mlm_flax.py | https://huggingface.co/models?filter=fill-mask | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | MT5_ID4146_for_PyTorch/transformers/examples/flax/language-modeling/run_mlm_flax.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | MT5_ID4146_for_PyTorch/transformers/examples/flax/language-modeling/run_mlm_flax.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_mlm_flax.py | MT5_ID4146_for_PyTorch/transformers/examples/flax/language-modeling/run_mlm_flax.py | https://github.com/deepmind/optax/blob/ed02befef9bf81cbbf236be3d2b0e032e9ed4a40/optax/_src/alias.py#L74 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_clm_flax.py | MT5_ID4146_for_PyTorch/transformers/examples/flax/language-modeling/run_clm_flax.py | https://huggingface.co/models?filter=text-generation | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | MT5_ID4146_for_PyTorch/transformers/examples/flax/language-modeling/run_clm_flax.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | MT5_ID4146_for_PyTorch/transformers/examples/flax/language-modeling/run_clm_flax.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/language-modeling/run_mlm_flax.py | 
MT5_ID4146_for_PyTorch/transformers/examples/flax/language-modeling/run_clm_flax.py | https://github.com/deepmind/optax/blob/ed02befef9bf81cbbf236be3d2b0e032e9ed4a40/optax/_src/alias.py#L74 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | MT5_ID4146_for_PyTorch/transformers/examples/flax/image-captioning/run_image_captioning_flax.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | MT5_ID4146_for_PyTorch/transformers/examples/flax/image-captioning/run_image_captioning_flax.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/run_flax_speech_recognition_seq2seq.py | MT5_ID4146_for_PyTorch/transformers/examples/flax/image-captioning/run_image_captioning_flax.py | https://github.com/google/flax/blob/87a211135c6a377c8f29048a1cac3840e38b9da4/examples/wmt/train.py#L104 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/docker/transformers-pytorch-tpu/Dockerfile | MT5_ID4146_for_PyTorch/transformers/docker/transformers-pytorch-tpu/Dockerfile | https://github.com/conda/conda/issues/8385 | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/.circleci/create_circleci_config.py | MT5_ID4146_for_PyTorch/transformers/docker/transformers-pytorch-gpu/Dockerfile | https://github.com/facebookresearch/detectron2.g | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/.circleci/create_circleci_config.py | MT5_ID4146_for_PyTorch/transformers/docker/transformers-doc-builder/Dockerfile | https://github.com/facebookresearch/detectron2.g | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/.circleci/create_circleci_config.py | MT5_ID4146_for_PyTorch/transformers/docker/transformers-all-latest-gpu/Dockerfile | https://github.com/facebookresearch/detectron2.g | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/speech-recognition/README.md | MT5_ID4146_for_PyTorch/run_translation.py | https://huggingface.co/datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/master/examples/flax/image-captioning/README.md | MT5_ID4146_for_PyTorch/run_translation.py | https://huggingface.co/docs/datasets/loading_datasets.html | 模型相关说明 | -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/tests/vit_mae/test_modeling_vit_mae.py | https://discuss.pytorch.org/t/random-seed-that-spans-across-devices/19735 | 模型相关说明 | -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/tests/utils/test_utils_check_copies.py | https://huggingface.co/transformers/master/model_doc/albert.ht | 模型相关说明 | -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/src/transformers/utils/fx.py | https://github.com/pytorch/pytorch/pull/59569 | 源码实现 | -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/src/transformers/utils/__init__.py | https://huggingface.co/transformers/installation.html#installing-from-source | 模型相关说明 | -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/src/transformers/utils/__init__.py | https://huggingface.co/transformers/examples.html | 模型相关说明 | -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/src/transformers/trainer_tf.py | https://github.com/huggingface/transformers/tree/master/examples/tensorflow | 源码实现 | -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/src/transformers/testing_utils.py | 
https://moon-staging.huggingface.co | 模型相关说明 | -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/src/transformers/pipelines/__init__.py | https://github.com/kpu/kenlm/archive/master.zip | 源码实现 | -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wavlm/configuration_wavlm.py | https://huggingface.co/facebook/wavlm-base-960h | 模型相关说明 | -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vilt/feature_extraction_vilt.py | https://github.com/dandelin/ViLT/blob/3db8b5035464afee84d951bf6322e1b27f1d072d/vilt/transforms/utils.py#L5 | 源码实现 | -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/src/transformers/models/van/modeling_van.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/layers/drop.py | 源码实现 | -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/src/transformers/models/van/configuration_van.py | https://huggingface.co/van-base | 模型相关说明 | -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | https://huggingface.co/facebook/unispeech_sat-base-960h | 模型相关说明 | -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/src/transformers/models/unispeech_sat/configuration_unispeech_sat.py | https://huggingface.co/facebook/unispeech_sat-base-960h | 模型相关说明 | -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/src/transformers/models/unispeech/configuration_unispeech.py | https://huggingface.co/facebook/unispeech-base-960h | 模型相关说明 | -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/src/transformers/models/tapas/modeling_tapas.py | https://github.com/rusty1s/pytorch_scatter | 源码实现 | -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/src/transformers/models/segformer/modeling_segformer.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/layers/drop.py | 源码实现 | -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/src/transformers/models/rag/modeling_tf_rag.py | https://github.com/huggingface/transformers/blob/master/src/transformers/models/dpr/modeling_tf_dpr.py#L91 | 源码实现 | -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/src/transformers/models/rag/modeling_tf_rag.py | https://github.com/huggingface/transformers/blob/master/src/transformers/modeling_tf_bart.py | 源码实现 | -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mpnet/configuration_mpnet.py | https://huggingface.co/mpnet-base | 模型相关说明 | -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/src/transformers/models/imagegpt/configuration_imagegpt.py | https://huggingface.co/imagegpt | 模型相关说明 | -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/src/transformers/models/flaubert/tokenization_flaubert.py | https://github.com/benjaminp/six | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/ctrl/configuration_ctrl.py |MT5_ID4146_for_PyTorch/transformers/src/transformers/models/ctrl/configuration_ctrl.py | https://huggingface.co/ctrl | 模型相关说明 | -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convnext/modeling_convnext.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/layers/drop.py | 源码实现 | -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/convert_bert_original_tf2_checkpoint_to_pytorch.py | https://github.com/tensorflow/models/tree/master/official/nlp/bert | 源码实现 | -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/src/transformers/models/beit/modeling_beit.py | 
https://github.com/rwightman/pytorch-image-models/blob/a2727c1bf78ba0d7b5727f5f95e37fb7f8866b1f/timm/models/layers/drop.py | 源码实现 | -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/src/transformers/models/beit/configuration_beit.py | https://huggingface.co/microsoft/beit-base-patch16-224-in22k | 模型相关说明 | -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/src/transformers/generation_flax_utils.py | https://github.com/google/flax/blob/master/examples/wmt/train.py#L254 | 源码实现 | -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/src/transformers/file_utils.py | https://github.com/rusty1s/pytorch_scatter | 源码实现 | -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/src/transformers/file_utils.py | https://github.com/tensorflow/tensorflow/blob/00fad90125b18b80fe054de1055770cfb8fe4ba3/tensorflow/python/keras/engine/network.py#L1380 | 源码实现 | -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/src/transformers/file_utils.py | https://huggingface.co/sgugger/my-finetuned-bert | 模型相关说明 | -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/src/transformers/data/datasets/language_modeling.py | https://github.com/huggingface/transformers/blob/master/examples/pytorch/language-modeling/run_mlm.py | 源码实现 | -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/src/transformers/data/datasets/language_modeling.py | https://github.com/huggingface/transformers/blob/master/examples/pytorch/language-modeling/run_mlm_wwm.py | 源码实现 | -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/setup.py | https://github.com/allenai/allennlp/blob/master/setup.py | 源码实现 | -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/scripts/stale.py | https://github.com/huggingface/transformers/blob/master/CONTRIBUTING.md | 源码实现 | -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/examples/tensorflow/language-modeling/run_mlm.py | https://huggingface.co/docs/datasets/package_reference/main_classes.html#datasets.Dataset.map | 模型相关说明 | -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/examples/tensorflow/language-modeling/run_clm.py | https://huggingface.co/docs/datasets/package_reference/main_classes.html#datasets.Dataset.map | 模型相关说明 | -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/examples/research_projects/jax-projects/model_parallel/run_clm_mp.py | https://huggingface.co/docs/datasets/package_reference/main_classes.html#datasets.Dataset.map | 模型相关说明 | -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/examples/pytorch/language-modeling/run_plm.py | https://huggingface.co/docs/datasets/package_reference/main_classes.html#datasets.Dataset.map | 模型相关说明 | -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/examples/pytorch/language-modeling/run_mlm_no_trainer.py | https://huggingface.co/docs/datasets/package_reference/main_classes.html#datasets.Dataset.map | 模型相关说明 | -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/examples/pytorch/language-modeling/run_mlm.py | https://huggingface.co/docs/datasets/package_reference/main_classes.html#datasets.Dataset.map | 模型相关说明 | -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/examples/pytorch/language-modeling/run_clm_no_trainer.py | https://huggingface.co/docs/datasets/package_reference/main_classes.html#datasets.Dataset.map | 模型相关说明 | -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/examples/pytorch/language-modeling/run_clm.py | https://huggingface.co/docs/datasets/package_reference/main_classes.html#datasets.Dataset.map | 模型相关说明 | -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/examples/flax/language-modeling/run_t5_mlm_flax.py | https://huggingface.co/docs/datasets/package_reference/main_classes.html#datasets.Dataset.map | 
模型相关说明 | -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/examples/flax/language-modeling/run_mlm_flax.py | https://huggingface.co/docs/datasets/package_reference/main_classes.html#datasets.Dataset.map | 模型相关说明 | -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/examples/flax/language-modeling/run_clm_flax.py | https://huggingface.co/docs/datasets/package_reference/main_classes.html#datasets.Dataset.map | 模型相关说明 | -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/docker/transformers-pytorch-gpu/Dockerfile | https://data.pyg.org/whl/torch- | 模型相关说明 | -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/docker/transformers-doc-builder/Dockerfile | https://data.pyg.org/whl/torch- | 模型相关说明 | -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/docker/transformers-all-latest-gpu/Dockerfile | https://data.pyg.org/whl/torch- | 模型相关说明 | -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/examples/flax/vision/requirements.txt | https://download.pytorch.org/whl/torch_stable.html | 相关依赖 | -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/examples/research_projects/jax-projects/big_bird/requirements.txt | https://github.com/huggingface/transformers@master | 相关依赖 | -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/examples/research_projects/jax-projects/hybrid_clip/requirements.txt | https://download.pytorch.org/whl/torch_stable.html | 相关依赖 | -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/examples/research_projects/lxmert/requirements.txt | https://github.com/huggingface/transformers.git | 相关依赖 | -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/examples/research_projects/movement-pruning/requirements.txt | https://github.com/huggingface/transformers.git@352d5472b0c1dec0f420d606d16747d851b4bda8#egg=transformers | 相关依赖 | -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/examples/research_projects/visual_bert/requirements.txt | https://github.com/huggingface/transformers.git | 相关依赖 | -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/tests/sagemaker/scripts/pytorch/requirements.txt | https://github.com/huggingface/transformers.git@master | 相关依赖 | -| 开发引入 | / |MT5_ID4146_for_PyTorch/transformers/tests/sagemaker/scripts/tensorflow/requirements.txt | https://github.com/huggingface/transformers.git@master | 相关依赖 | +| 文件位置 | 公网地址 | 公网地址用途 | +|---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|----------------------| +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/.circleci/config.yml | ci@dummy.com | user.email配置邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/.circleci/config.yml | https://pytorch-geometric.com/whl/torch-1.11.0+cpu.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/CITATION.cff | https://www.aclweb.org/anthology/2020.emnlp-demos.6 | 相关配置 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/docker/transformers-all-latest-gpu/Dockerfile | https://data.pyg.org/whl/torch-$(python3 -c "from torch import version; print(version.__version__.split(''+'')[0])")+cu102.html | 三方库下载 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/docker/transformers-doc-builder/Dockerfile | https://data.pyg.org/whl/torch-$(python -c "from torch import version; print(version.__version__.split(''+'')[0])")+cpu.html | 三方库下载 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/docker/transformers-doc-builder/Dockerfile | https://pypi.ngc.nvidia.com | 三方库下载 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/docker/transformers-pytorch-gpu/Dockerfile | https://data.pyg.org/whl/torch-$(python3 -c "from torch import version; print(version.__version__.split(''+'')[0])")+cu102.html | 三方库下载 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/docker/transformers-pytorch-tpu/Dockerfile | https://repo.anaconda.com/miniconda/Miniconda3-4.7.12-Linux-x86_64.sh | miniconda链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/examples/research_projects/fsner/setup.py | msi.sayef@gmail.com | 作者邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/examples/research_projects/lxmert/utils.py | https://s3.amazonaws.com/models.huggingface.co/bert | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/examples/research_projects/movement-pruning/emmental/modeling_bert_masked.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/examples/research_projects/performer/modeling_flax_performer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/examples/research_projects/performer/modeling_flax_performer.py | https://arxiv.org/abs/1607.06450 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py | https://s3.amazonaws.com/models.huggingface.co/bert/pplm/discriminators/clickbait_classifier_head.pt | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py | https://s3.amazonaws.com/models.huggingface.co/bert/pplm/discriminators/SST_classifier_head.pt | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py | https://s3.amazonaws.com/models.huggingface.co/bert/pplm/bow/technology.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py | https://s3.amazonaws.com/models.huggingface.co/bert/pplm/bow/space.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py | https://s3.amazonaws.com/models.huggingface.co/bert/pplm/bow/science.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py | https://s3.amazonaws.com/models.huggingface.co/bert/pplm/bow/religion.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py | 
https://s3.amazonaws.com/models.huggingface.co/bert/pplm/bow/politics.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py | https://s3.amazonaws.com/models.huggingface.co/bert/pplm/bow/military.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/examples/research_projects/pplm/run_pplm.py | https://s3.amazonaws.com/models.huggingface.co/bert/pplm/bow/legal.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/examples/research_projects/quantization-qdqbert/Dockerfile | https://pypi.ngc.nvidia.com | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/examples/research_projects/rag/finetune_rag.py | https://docs.ray.io/en/master/cluster/index.html | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/examples/research_projects/rag-end2end-retriever/finetune_rag.py | https://docs.ray.io/en/master/cluster/index.html | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/examples/research_projects/visual_bert/utils.py | https://s3.amazonaws.com/models.huggingface.co/bert | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/scripts/fsmt/convert-facebook-wmt19.sh | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.ru-en.ensemble.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/scripts/fsmt/convert-facebook-wmt19.sh | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.ru-en.ensemble.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/scripts/fsmt/convert-facebook-wmt19.sh | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-de.joined-dict.ensemble.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/scripts/fsmt/convert-facebook-wmt19.sh | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-de.joined-dict.ensemble.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py | http://matrix.statmt.org/matrix/output/1902?run_id=6750 | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py | http://matrix.statmt.org/matrix/output/1909?run_id=6862 | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py | http://matrix.statmt.org/matrix/output/1914?run_id=6724 | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/scripts/fsmt/gen-card-facebook-wmt19.py | http://matrix.statmt.org/matrix/output/1907?run_id=6937 | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/commands/convert.py | https://www.tensorflow.org/install/ | 三方库下载 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/file_utils.py | https://s3.amazonaws.com/models.huggingface.co/bert | 模型地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/file_utils.py | https://pytorch.org/get-started/locally/ | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/file_utils.py | https://pandas.pydata.org/pandas-docs/stable/getting_started/install.html | 三方库链接 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/file_utils.py | https://www.tensorflow.org/install | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/file_utils.py | https://pypi.ngc.nvidia.com | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/integrations.py | https://app.sigopt.com/experiment/{experiment.id} | experiment地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/modeling_flax_pytorch_utils.py | https://pytorch.org/ | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/modeling_flax_pytorch_utils.py | https://pytorch.org/ | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/modeling_flax_pytorch_utils.py | https://flax.readthedocs.io/en/latest/installation.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/modeling_flax_pytorch_utils.py | https://flax.readthedocs.io/en/latest/installation.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/modeling_tf_pytorch_utils.py | https://www.tensorflow.org/install/ | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/modeling_tf_pytorch_utils.py | https://pytorch.org/ | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/modeling_utils.py | https://www.tensorflow.org/install/ | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/modeling_utils.py | https://pytorch.org/ | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/modeling_utils.py | https://pytorch.org/ | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/modeling_utils.py | https://flax.readthedocs.io/en/latest/installation.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/albert/modeling_albert.py | https://www.tensorflow.org/install/ | 三方库安装 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/albert/modeling_albert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/albert/modeling_flax_albert.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/albert/modeling_tf_albert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bart/modeling_bart.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bart/modeling_bart.py | https://arxiv.org/abs/1910.13461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bart/modeling_flax_bart.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bart/modeling_tf_bart.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/beit/convert_beit_unilm_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 图片地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/beit/modeling_beit.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/beit/modeling_flax_beit.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/modeling_bert.py | https://www.tensorflow.org/install/ | 三方库安装 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/modeling_bert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/modeling_flax_bert.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert/modeling_tf_bert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert_generation/modeling_bert_generation.py | https://www.tensorflow.org/install/ | 三方库安装 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bert_generation/modeling_bert_generation.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/big_bird/modeling_big_bird.py | https://www.tensorflow.org/install/ | 三方库安装 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/big_bird/modeling_big_bird.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/big_bird/modeling_flax_big_bird.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/bigbird_pegasus/modeling_bigbird_pegasus.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_blenderbot.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_flax_blenderbot.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/blenderbot/modeling_tf_blenderbot.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_blenderbot_small.py | 
https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_flax_blenderbot_small.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/blenderbot_small/modeling_tf_blenderbot_small.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/camembert/modeling_camembert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/camembert/modeling_tf_camembert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/canine/modeling_canine.py | https://www.tensorflow.org/install/ | 三方库安装 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/canine/modeling_canine.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/clip/modeling_clip.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/clip/modeling_flax_clip.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/clip/modeling_tf_clip.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convbert/modeling_convbert.py | https://www.tensorflow.org/install/ | 三方库安装 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convbert/modeling_convbert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convbert/modeling_tf_convbert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 图片地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_xlarge_22k_226.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_xlarge_22k_1k_384_ema.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_xlarge_22k_1k_224_ema.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | 
https://dl.fbaipublicfiles.com/convnext/convnext_small_1k_224_ema.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_large_22k_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_large_22k_1k_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_large_22k_1k_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_large_1k_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_large_1k_224_ema.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_base_22k_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_base_22k_1k_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_base_22k_1k_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_base_1k_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_base_1k_224_ema.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_tiny_1k_224_ema.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convnext/convert_convnext_to_pytorch.py | https://dl.fbaipublicfiles.com/convnext/convnext_tiny_1k_224_ema.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convnext/modeling_convnext.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/convnext/modeling_tf_convnext.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/ctrl/modeling_ctrl.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/ctrl/modeling_tf_ctrl.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/ctrl/tokenization_ctrl.py | https://raw.githubusercontent.com/salesforce/ctrl/master/ctrl-vocab.json | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/ctrl/tokenization_ctrl.py | https://raw.githubusercontent.com/salesforce/ctrl/master/ctrl-merges.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_audio.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_audio.py | https://arxiv.org/pdf/2202.03555 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_text.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/data2vec/modeling_data2vec_text.py | https://arxiv.org/pdf/2202.03555 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta/modeling_deberta.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta/modeling_deberta.py | https://arxiv.org/abs/2006.03654 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta/modeling_tf_deberta.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta/modeling_tf_deberta.py | https://arxiv.org/abs/2006.03654 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta_v2/modeling_deberta_v2.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta_v2/modeling_deberta_v2.py | https://arxiv.org/abs/2006.03654 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta_v2/modeling_tf_deberta_v2.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deberta_v2/modeling_tf_deberta_v2.py | https://arxiv.org/abs/2006.03654 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deit/convert_deit_timm_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 图片地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/deit/modeling_deit.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/detr/convert_detr_original_pytorch_checkpoint_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 图片地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/detr/modeling_detr.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/distilbert/modeling_distilbert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/distilbert/modeling_flax_distilbert.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/distilbert/modeling_tf_distilbert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/dit/convert_dit_unilm_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 图片地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/dit/convert_dit_unilm_to_pytorch.py | https://layoutlm.blob.core.windows.net/dit/dit-pts/dit-base-224-p16-500k-62d53a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/dpr/modeling_dpr.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/dpr/modeling_tf_dpr.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/electra/modeling_electra.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/electra/modeling_electra.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/electra/modeling_flax_electra.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/electra/modeling_tf_electra.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/encoder_decoder/modeling_encoder_decoder.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/encoder_decoder/modeling_encoder_decoder.py | https://arxiv.org/abs/1907.12461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/encoder_decoder/modeling_flax_encoder_decoder.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/encoder_decoder/modeling_flax_encoder_decoder.py | https://arxiv.org/abs/1907.12461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/encoder_decoder/modeling_tf_encoder_decoder.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/encoder_decoder/modeling_tf_encoder_decoder.py | https://arxiv.org/abs/1907.12461 | 论文地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/flaubert/modeling_flaubert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/flaubert/modeling_tf_flaubert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/fnet/modeling_fnet.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/fsmt/modeling_fsmt.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/funnel/modeling_funnel.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/funnel/modeling_funnel.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/funnel/modeling_funnel.py | https://arxiv.org/abs/2006.03236 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/funnel/modeling_tf_funnel.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/funnel/modeling_tf_funnel.py | https://arxiv.org/abs/2006.03236 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt_neo/modeling_flax_gpt_neo.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt_neo/modeling_gpt_neo.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt_neo/modeling_gpt_neo.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_flax_gpt2.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_gpt2.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_gpt2.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gpt2/modeling_tf_gpt2.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gptj/modeling_flax_gptj.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/gptj/modeling_gptj.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/hubert/modeling_hubert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/hubert/modeling_hubert.py | https://arxiv.org/abs/2106.07449 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/hubert/modeling_tf_hubert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/ibert/modeling_ibert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/imagegpt/modeling_imagegpt.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/imagegpt/modeling_imagegpt.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/layoutlm/modeling_layoutlm.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/layoutlm/modeling_layoutlm.py | https://arxiv.org/abs/1912.13318 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/layoutlm/modeling_tf_layoutlm.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/layoutlmv2/modeling_layoutlmv2.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/led/modeling_led.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/led/modeling_led.py | https://arxiv.org/abs/1910.13461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/led/modeling_tf_led.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/longformer/modeling_longformer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/longformer/modeling_longformer.py | https://arxiv.org/abs/2004.05150 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/longformer/modeling_tf_longformer.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/longformer/modeling_tf_longformer.py | https://arxiv.org/abs/2004.05150 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/longformer/modeling_tf_longformer.py | https://arxiv.org/abs/2004.05150 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/luke/modeling_luke.py | 
https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/lxmert/modeling_lxmert.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/lxmert/modeling_lxmert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/lxmert/modeling_lxmert.py | https://arxiv.org/abs/1908.07490 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/lxmert/modeling_tf_lxmert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/lxmert/modeling_tf_lxmert.py | https://arxiv.org/abs/1908.07490 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/m2m_100/modeling_m2m_100.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/marian/convert_marian_tatoeba_to_pytorch.py | https://datahub.io/core/language-codes/r/language-codes-3b2.csv | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/marian/modeling_flax_marian.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/marian/modeling_marian.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/marian/modeling_tf_marian.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/maskformer/convert_maskformer_original_pytorch_checkpoint_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 图片地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/maskformer/modeling_maskformer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mbart/modeling_flax_mbart.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mbart/modeling_mbart.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mbart/modeling_tf_mbart.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/megatron_bert/modeling_megatron_bert.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/megatron_bert/modeling_megatron_bert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mmbt/modeling_mmbt.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mobilebert/modeling_mobilebert.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mobilebert/modeling_mobilebert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mobilebert/modeling_tf_mobilebert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mpnet/modeling_mpnet.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/mpnet/modeling_tf_mpnet.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/nystromformer/modeling_nystromformer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/openai/modeling_openai.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/openai/modeling_tf_openai.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_flax_pegasus.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_pegasus.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/pegasus/modeling_tf_pegasus.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/perceiver/convert_perceiver_haiku_to_pytorch.py | https://storage.googleapis.com/perceiver_io/dalmation.jpg | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/perceiver/modeling_perceiver.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/perceiver/modeling_perceiver.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/plbart/modeling_plbart.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/poolformer/convert_poolformer_original_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 图片地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/poolformer/modeling_poolformer.py 
| https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/prophetnet/modeling_prophetnet.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/qdqbert/modeling_qdqbert.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/qdqbert/modeling_qdqbert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/rag/modeling_rag.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/rag/modeling_tf_rag.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/rag/retrieval_rag.py | https://storage.googleapis.com/huggingface-nlp/datasets/wiki_dpr/ | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/realm/modeling_realm.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/realm/modeling_realm.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/reformer/modeling_reformer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/reformer/modeling_reformer.py | https://arxiv.org/abs/2001.04451 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/rembert/modeling_rembert.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/rembert/modeling_rembert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/rembert/modeling_tf_rembert.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/resnet/modeling_resnet.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/retribert/modeling_retribert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roberta/modeling_flax_roberta.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roberta/modeling_roberta.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roberta/modeling_tf_roberta.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roformer/modeling_flax_roformer.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roformer/modeling_roformer.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roformer/modeling_roformer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/roformer/modeling_tf_roformer.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/segformer/convert_segformer_original_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 图片地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/segformer/modeling_segformer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/sew/modeling_sew.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/sew/modeling_sew.py | https://arxiv.org/abs/2109.06870 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/sew_d/modeling_sew_d.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/sew_d/modeling_sew_d.py | https://arxiv.org/abs/2109.06870 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/speech_encoder_decoder/modeling_flax_speech_encoder_decoder.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/speech_encoder_decoder/modeling_flax_speech_encoder_decoder.py | https://arxiv.org/abs/1907.12461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/speech_encoder_decoder/modeling_flax_speech_encoder_decoder.py | https://arxiv.org/abs/2104.06678 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/speech_encoder_decoder/modeling_speech_encoder_decoder.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/speech_encoder_decoder/modeling_speech_encoder_decoder.py | https://arxiv.org/abs/1907.12461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/speech_encoder_decoder/modeling_speech_encoder_decoder.py | https://arxiv.org/abs/2104.06678 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/speech_to_text/modeling_speech_to_text.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/speech_to_text/modeling_speech_to_text.py | https://arxiv.org/abs/1910.13461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/speech_to_text/modeling_tf_speech_to_text.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/speech_to_text_2/modeling_speech_to_text_2.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/splinter/modeling_splinter.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/squeezebert/modeling_squeezebert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/squeezebert/modeling_squeezebert.py | https://arxiv.org/abs/2006.11316 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/swin/convert_swin_timm_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 图片地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/swin/modeling_swin.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/modeling_flax_t5.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/modeling_flax_t5.py | https://arxiv.org/abs/1910.10683 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/modeling_t5.py | https://arxiv.org/abs/1910.10683 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/modeling_tf_t5.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/t5/modeling_tf_t5.py | https://arxiv.org/abs/1910.10683 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/tapas/modeling_tapas.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/tapas/modeling_tapas.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/tapas/modeling_tf_tapas.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/transfo_xl/modeling_tf_transfo_xl.py | 
https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/transfo_xl/modeling_transfo_xl.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/transfo_xl/modeling_transfo_xl.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/trocr/modeling_trocr.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/unispeech/modeling_unispeech.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/unispeech/modeling_unispeech.py | https://arxiv.org/abs/2101.07597 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/unispeech_sat/modeling_unispeech_sat.py | https://arxiv.org/abs/2006.11477 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/van/modeling_van.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vilt/convert_vilt_original_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 图片地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vilt/convert_vilt_original_to_pytorch.py | https://lil.nlp.cornell.edu/nlvr/exs/ex0_0.jpg | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vilt/convert_vilt_original_to_pytorch.py | https://lil.nlp.cornell.edu/nlvr/exs/ex0_0.jpg | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vilt/modeling_vilt.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_flax_vision_encoder_decoder.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_flax_vision_encoder_decoder.py | https://arxiv.org/abs/1907.12461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_flax_vision_encoder_decoder.py | https://arxiv.org/abs/2109.10282 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_tf_vision_encoder_decoder.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_tf_vision_encoder_decoder.py | https://arxiv.org/abs/1907.12461 | 论文地址 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_tf_vision_encoder_decoder.py | https://arxiv.org/abs/2109.10282 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_vision_encoder_decoder.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_vision_encoder_decoder.py | https://arxiv.org/abs/1907.12461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vision_encoder_decoder/modeling_vision_encoder_decoder.py | https://arxiv.org/abs/2109.10282 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_flax_vision_text_dual_encoder.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_flax_vision_text_dual_encoder.py | https://arxiv.org/abs/2111.07991 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_flax_vision_text_dual_encoder.py | https://farm3.staticflickr.com/2674/5850229113_4fe05d5265_z.jpg | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_vision_text_dual_encoder.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vision_text_dual_encoder/modeling_vision_text_dual_encoder.py | https://arxiv.org/abs/2111.07991 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/visual_bert/modeling_visual_bert.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vit/convert_dino_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 图片地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vit/convert_vit_timm_to_pytorch.py | http://images.cocodataset.org/val2017/000000039769.jpg | 图片地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vit/modeling_flax_vit.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vit/modeling_tf_vit.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vit/modeling_vit.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vit_mae/convert_vit_mae_to_pytorch.py | https://dl.fbaipublicfiles.com/mae/visualize/mae_visualize_vit_base.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vit_mae/convert_vit_mae_to_pytorch.py | 
https://user-images.githubusercontent.com/11435359/147738734-196fd92f-9260-48d5-ba7e-bf103d29364d.jpg | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/vit_mae/modeling_vit_mae.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_flax_wav2vec2.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_flax_wav2vec2.py | https://arxiv.org/abs/2006.11477 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_tf_wav2vec2.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_wav2vec2.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wav2vec2/modeling_wav2vec2.py | https://arxiv.org/abs/2006.11477 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wavlm/modeling_wavlm.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/wavlm/modeling_wavlm.py | https://arxiv.org/abs/2101.07597 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xglm/modeling_flax_xglm.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xglm/modeling_xglm.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm/modeling_tf_xlm.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm/modeling_xlm.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm_roberta/modeling_flax_xlm_roberta.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm_roberta/modeling_tf_xlm_roberta.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm_roberta/modeling_xlm_roberta.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlm_roberta_xl/modeling_xlm_roberta_xl.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlnet/modeling_tf_xlnet.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlnet/modeling_xlnet.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/xlnet/modeling_xlnet.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/models/yoso/modeling_yoso.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/pipelines/base.py | https://www.tensorflow.org/install/ | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/pipelines/base.py | https://pytorch.org/ | 三方库说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/src/transformers/trainer_tf.py | https://docs.wandb.com/huggingface | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_{{cookiecutter.lowercase_modelname}}.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_{{cookiecutter.lowercase_modelname}}.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_{{cookiecutter.lowercase_modelname}}.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_{{cookiecutter.lowercase_modelname}}.py | https://arxiv.org/abs/1910.13461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | https://flax.readthedocs.io/en/latest/flax.linen.html#module | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | https://flax.readthedocs.io/en/latest/_autosummary/flax.nn.module.html | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | https://arxiv.org/abs/1910.13461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_flax_{{cookiecutter.lowercase_modelname}}.py | https://arxiv.org/abs/1910.13461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/templates/adding_a_new_model/cookiecutter-template-{{cookiecutter.modelname}}/modeling_tf_{{cookiecutter.lowercase_modelname}}.py | https://www.tensorflow.org/api_docs/python/tf/keras/Model | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/utils/download_glue_data.py | firebase-adminsdk-0khhl@mtl-sentence-representations.iam.gserviceaccount.com | glue数据集diagnostic链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/utils/download_glue_data.py | https://dl.fbaipublicfiles.com/senteval/senteval_data/msr_paraphrase_train.txt | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/utils/download_glue_data.py | https://dl.fbaipublicfiles.com/senteval/senteval_data/msr_paraphrase_test.txt | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/utils/download_glue_data.py | https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FWNLI.zip?alt=media&token=068ad0a0-ded7-4bd7-99a5-5e00222e0faf | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/utils/download_glue_data.py | https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FSTS-B.zip?alt=media&token=bddb94a7-8706-4e0d-a694-1109e12273b5 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/utils/download_glue_data.py | https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FSST-2.zip?alt=media&token=aabc5f6b-e466-44a2-b9b4-cf6337f84ac8 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/utils/download_glue_data.py | https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FSNLI.zip?alt=media&token=4afcfbb2-ff0c-4b2d-a09a-dbf07926f4df | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/utils/download_glue_data.py | https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FRTE.zip?alt=media&token=5efa7e85-a0bb-4f19-8ea2-9e1840f077fb | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/utils/download_glue_data.py | https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FQQP.zip?alt=media&token=700c6acf-160d-4d89-81d1-de4191d02cb5 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/utils/download_glue_data.py | https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FQNLIv2.zip?alt=media&token=6fdcf570-0fc5-4631-8456-9505272d1601 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/utils/download_glue_data.py | https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FMNLI.zip?alt=media&token=50329ea1-e339-40e2-809c-10c40afff3ce | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/utils/download_glue_data.py | https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2FCoLA.zip?alt=media&token=46d5e637-3411-4188-bc44-5809b5bfb5f4 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/utils/download_glue_data.py | 
https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2Fmrpc_dev_ids.tsv?alt=media&token=ec5c0836-31d5-48f4-b431-7480817f1adc | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/transformers/utils/download_glue_data.py | https://storage.googleapis.com/mtl-sentence-representations.appspot.com/tsvsWithoutLabels%2FAX.tsv?GoogleAccessId=firebase-adminsdk-0khhl@mtl-sentence-representations.iam.gserviceaccount.com&Expires=2498860800&Signature=DuQ2CSPt2Yfre0C%2BiISrVYrIFaZH1Lc7hBVZDD4ZyR7fZYOMNOUGpi8QxBmTNOrNPjR3z1cggo7WXFfrgECP6FBJSsURv8Ybrue8Ypt%2FTPxbuJ0Xc2FhDi%2BarnecCBFO77RSbfuz%2Bs95hRrYhTnByqu3U%2FYZPaj3tZt5QdfpH2IUROY8LiBXoXS46LE%2FgOQc%2FKN%2BA9SoscRDYsnxHfG0IjXGwHN%2Bf88q6hOmAxeNPx6moDulUF6XMUAaXCSFU%2BnRO2RDL9CapWxj%2BDl7syNyHhB7987hZ80B%2FwFkQ3MEs8auvt5XW1%2Bd4aCU7ytgM69r8JDCwibfhZxpaa4gd50QXQ%3D%3D | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/url.ini | thomas@huggingface.co | 邮箱地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/url.ini | https://unilm.blob.core.windows.net/beit/beit_base_patch16_224_pt22k_ft22kto1k.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/url.ini | https://www.researchgate.net/profile/Dinh-Sang/publication/338099565/figure/fig8/AS:840413229350922@1577381536857/An-receipt-example-in-the-SROIE-2019-dataset_Q640.jpg | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/MT5_ID4146_for_PyTorch/url.ini | https://layoutlm.blob.core.windows.net/trocr/model_zoo/fairseq/trocr-base-handwritten.pt | 模型相关说明 | \ No newline at end of file diff --git a/PyTorch/built-in/nlp/ReFormer_for_PyTorch/public_address_statement.md b/PyTorch/built-in/nlp/ReFormer_for_PyTorch/public_address_statement.md index 073530f8085b34425f756f071dc809ffc469f76f..5dd447720ba15fc02c981e6203ae4e7b0c54d5c2 100644 --- a/PyTorch/built-in/nlp/ReFormer_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/nlp/ReFormer_for_PyTorch/public_address_statement.md @@ -1,56 +1,6 @@ - -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱 | 用途说明 | -|:------:|:-------------------------:|:---------------------------------------------------------------------------------------------:|:--------------------:|:-----------------:| -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.25.1/examples/pytorch/language-modeling/run_mlm.py | ./run_mlm.py | http://www.apache.org/licenses/LICENSE-2.0 | apache.org/licenses引用的公网来源说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.25.1/examples/pytorch/language-modeling/run_mlm.py | ./run_mlm.py | https://huggingface.co/models?filter=fill-mask | checkpoints列表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.25.1/examples/pytorch/language-modeling/run_mlm.py | ./run_mlm.py | https://huggingface.co/datasets/ | 公共数据集地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.25.1/examples/pytorch/language-modeling/run_mlm.py | ./run_mlm.py | https://huggingface.co/docs/datasets/loading_datasets.html. 
| 加载数据集的指导 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.25.1/examples/pytorch/language-modeling/run_mlm.py | ./run_mlm.py | https://huggingface.co/docs/datasets/package_reference/main_classes.html#datasets.Dataset.map | multiprocessing的map方法指导 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/trainer_pt_utils.py | ./transformers_modify/trainer_pt_utils.py | http://www.apache.org/licenses/LICENSE-2.0 | apache.org/licenses引用的公网来源说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/trainer_pt_utils.py | ./transformers_modify/trainer_pt_utils.py | https://github.com/numpy/numpy/blob/a47ecdea856986cd60eabbd53265c2ca5916ad5d/doc/source/user/basics.types.rst | numpy1.21.4不支持bf16的说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/trainer_pt_utils.py | ./transformers_modify/trainer_pt_utils.py | https://github.com/pytorch/pytorch/issues/16266 | pytorch存在的issue | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/trainer.py | ./transformers_modify/trainer.py | http://www.apache.org/licenses/LICENSE-2.0 | apache.org/licenses引用的公网来源说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/trainer.py | ./transformers_modify/trainer.py | https://huggingface.co/docs/transformers/model_doc/auto | 适合训练的模型列表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/trainer.py | ./transformers_modify/trainer.py | https://github.com/huggingface/peft | 使用peft库适配器的说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/trainer.py | ./transformers_modify/trainer.py | https://www.github.com/nvidia/apex | 安装APEX的教程 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/trainer.py | ./transformers_modify/trainer.py | https://github.com/pytorch/torchdistx | 安装torchdistx的链接 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/trainer.py | ./transformers_modify/trainer.py | https://github.com/intel/intel-extension-for-pytorch | 安装IPEX的教程 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/trainer.py | ./transformers_modify/trainer.py | https://github.com/huggingface/transformers/pull/4659#issuecomment-643356021 | find_unused_parameters的说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/trainer.py | ./transformers_modify/trainer.py | https://github.com/pytorch/pytorch/issues/82963 | FSDP 错误的解决方法 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/trainer.py | ./transformers_modify/trainer.py | https://optuna.readthedocs.io/en/stable/reference/generated/optuna.study.create_study.html | optuna.create_study的说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/trainer.py | ./transformers_modify/trainer.py | https://docs.ray.io/en/latest/tune/api_docs/execution.html#tune-run | tune.run的说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/trainer.py | ./transformers_modify/trainer.py | https://app.sigopt.com/docs/endpoints/experiments/create | sigopt的说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/training_args.py | ./transformers_modify/training_args.py | http://www.apache.org/licenses/LICENSE-2.0 | apache.org/licenses引用的公网来源说明 | -| 开源代码引入 | 
https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/training_args.py | ./transformers_modify/training_args.py | https://github.com/pytorch/xla/pull/3609 | torchrun支撑文档 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/training_args.py | ./transformers_modify/training_args.py | https://github.com/huggingface/optimum-neuron | 使用TrainiumTrainer的说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/training_args.py | ./transformers_modify/training_args.py | https://docs.python.org/3/library/argparse#module-argparse | 使用argparse的说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/training_args.py | ./transformers_modify/training_args.py | https://github.com/huggingface/transformers/tree/main/examples | 脚本参数说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/training_args.py | ./transformers_modify/training_args.py | https://www.tensorflow.org/tensorboard | TensorBoard使用说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/training_args.py | ./transformers_modify/training_args.py | https://huggingface.co/docs/safetensors | safetensor使用说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/training_args.py | ./transformers_modify/training_args.py | https://github.com/intel/intel-extension-for-pytorch | IPEX安装说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/training_args.py | ./transformers_modify/training_args.py | https://nvidia.github.io/apex/amp | Apex使用说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/training_args.py | ./transformers_modify/training_args.py | https://huggingface.co/docs/transformers/performance#tf32 | TF32模式使用说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/training_args.py | ./transformers_modify/training_args.py | https://www.wandb.com/ | wandb官网 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/training_args.py | ./transformers_modify/training_args.py | https://www.mlflow.org/ | mlflow官网 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/training_args.py | ./transformers_modify/training_args.py | https://github.com/facebookresearch/fairscale | FairScale使用说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/training_args.py | ./transformers_modify/training_args.py | https://github.com/pytorch/xla/blob/master/torch_xla/distributed/fsdp/xla_fully_sharded_data_parallel.py | xla选项说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/training_args.py | ./transformers_modify/training_args.py | https://github.com/microsoft/deepspeed | deepspeed使用说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/training_args.py | ./transformers_modify/training_args.py | https://github.com/huggingface/transformers/tree/main/examples | transformers训练示例脚本 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/training_args.py | ./transformers_modify/training_args.py | https://docs.ray.io/en/latest/tune/api_docs/analysis.html#ray.tune.ExperimentAnalysis.get_best_trial | Ray说明文档 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/training_args.py | 
./transformers_modify/training_args.py | https://pytorch.org/docs/stable/distributed.html#torch.distributed.init_process_group | torch.distributed.init_process_group说明文档 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/training_args.py | ./transformers_modify/training_args.py | https://pytorch.org/get-started/pytorch-2.0/ | torch.compile说明文档 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/training_args.py | ./transformers_modify/training_args.py | https://pytorch.org/docs/2.0/generated/torch.compile.html?highlight=torch+compile#torch.compile | torch.compile最好的默认配置 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/training_args.py | ./transformers_modify/training_args.py | https://nvidia.github.io/apex/amp.html | AMP optimization level说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/training_args.py | ./transformers_modify/training_args.py | https://github.com/huggingface/transformers/issues/10628 | 扩展output_dir的说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/training_args.py | ./transformers_modify/training_args.py | https://github.com/huggingface/safetensors! | Safetensors使用说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/training_args.py | ./transformers_modify/training_args.py | https://github.com/pytorch/pytorch/issues/82707 | 基于transformer的models的评价指标 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/models/reformer/modeling_reformer.py | ./transformers_modify/modeling_reformer.py | http://www.apache.org/licenses/LICENSE-2.0 | apache.org/licenses引用的公网来源说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/models/reformer/modeling_reformer.py | ./transformers_modify/modeling_reformer.py | https://huggingface.co/models?filter=reformer | reformer模型列表 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/models/reformer/modeling_reformer.py | ./transformers_modify/modeling_reformer.py | https://arxiv.org/pdf/1509.02897.pdf | Locality-Sensitive Hashing论文 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/models/reformer/modeling_reformer.py | ./transformers_modify/modeling_reformer.py | https://arxiv.org/pdf/2001.04451.pdf | reformer模型论文 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/models/reformer/modeling_reformer.py | ./transformers_modify/modeling_reformer.py | https://towardsdatascience.com/illustrating-the-reformer-393575ac6ba0 | RevNet模型应用 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/models/reformer/modeling_reformer.py | ./transformers_modify/modeling_reformer.py | https://github.com/lucidrains/reformer-pytorch/blob/master/reformer_pytorch/reversible.py | reformer代码灵感来源 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/models/reformer/modeling_reformer.py | ./transformers_modify/modeling_reformer.py | https://github.com/pytorch/pytorch/pull/5617 | truncated_normal使用说明 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.28.1/src/transformers/models/reformer/modeling_reformer.py | ./transformers_modify/modeling_reformer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | torch.nn文档 | -| 开源代码引入 | 
https://github.com/huggingface/datasets/blob/main/metrics/accuracy/accuracy.py | accuracy.py | https://scikit-learn.org/stable/modules/generated/sklearn.metrics.accuracy_score.html | accuracy计算方式参考 - | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|-----------------------------------------------------------------------------------------------------|---------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/built-in/nlp/ReFormer_for_PyTorch/accuracy.py | https://scikit-learn.org/stable/modules/generated/sklearn.metrics.accuracy_score.html | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/ReFormer_for_PyTorch/transformers_modify/modeling_reformer.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 源码实现 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/ReFormer_for_PyTorch/transformers_modify/modeling_reformer.py | https://arxiv.org/abs/2001.04451 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/ReFormer_for_PyTorch/transformers_modify/training_args.py | https://docs.ray.io/en/latest/tune/api_docs/analysis.html | 模型相关说明 | \ No newline at end of file diff --git a/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/public_address_statement.md b/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/public_address_statement.md index c64f780749144c148d471143bbb5e3b2e0dc55a9..4e9391a9cd85b23575a52045f5d04fdfe71fe7d0 100644 --- a/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/public_address_statement.md +++ b/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/public_address_statement.md @@ -1,218 +1,94 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ---- | ------------ | ------ | ------------------------------------ | -------- | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/setup.py|Scaling-nmt_for_Pytorch/setup.py | https://stackoverflow.com/a/54128391 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/setup.py|Scaling-nmt_for_Pytorch/setup.py | https://download.pytorch.org/whl/cpu/torch-1.7.0%2Bcpu-cp36-cp36m-linux_x86_64.whl | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/setup.py|Scaling-nmt_for_Pytorch/setup.py | https://bit.ly/2NLVsgE | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/.github/ISSUE_TEMPLATE.md|Scaling-nmt_for_Pytorch/setup.py | https://github.com/pytorch/fairseq | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/data/audio/feature_transforms/specaugment.py|Scaling-nmt_for_Pytorch/fairseq/data/audio/feature_transforms/specaugment.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/textless_nlp/dgslm/hubert_fisher/README.md|Scaling-nmt_for_Pytorch/fairseq/models/speech_dlm/modules/speech_dlm_decoder.py | https://arxiv.org/pdf/2203.16502.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/textless_nlp/dgslm/hubert_fisher/README.md|Scaling-nmt_for_Pytorch/fairseq/models/speech_dlm/modules/speech_dlm_decoder_layer.py | https://arxiv.org/pdf/2203.16502.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/speech_dlm/sequence_generator/multichannel_search.py|Scaling-nmt_for_Pytorch/fairseq/models/speech_dlm/sequence_generator/multichannel_search.py | https://arxiv.org/abs/1904.09751 | 参考论文地址 | -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/bolb/main/examples/textless_nlp/dgslm/hubert_fisher/README.md|Scaling-nmt_for_Pytorch/fairseq/models/speech_dlm/sequence_generator/multichannel_sequence_generator.py | https://arxiv.org/pdf/2203.16502.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/speech_to_text/modules/convolution.py|Scaling-nmt_for_Pytorch/fairseq/models/speech_to_text/modules/convolution.py | https://arxiv.org/abs/1911.08460 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/speech_to_text/modules/emformer.py|Scaling-nmt_for_Pytorch/fairseq/models/speech_to_text/modules/emformer.py | https://arxiv.org/abs/1803.02155 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/speech_to_text/modules/convolution.py|Scaling-nmt_for_Pytorch/fairseq/models/speech_to_text/modules/convolution.py | https://github.com/espnet/espnet | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/speech_to_text/modules/augmented_memory_attention.py|Scaling-nmt_for_Pytorch/fairseq/models/speech_to_text/modules/augmented_memory_attention.py | https://arxiv.org/abs/2005.08042 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/speech_to_text/modules/augmented_memory_attention.py|Scaling-nmt_for_Pytorch/fairseq/models/speech_to_text/modules/augmented_memory_attention.py | https://arxiv.org/abs/2005.09137 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/speech_to_text/modules/emformer.py|Scaling-nmt_for_Pytorch/fairseq/models/speech_to_text/modules/emformer.py | https://arxiv.org/abs/2005.09684 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/speech_dlm/sequence_generator/multichannel_sequence_generator.py|Scaling-nmt_for_Pytorch/fairseq/models/speech_dlm/sequence_generator/multichannel_sequence_generator.py | https://discuss.pytorch.org/t/how-to-mask-and-assign-a-value-to-tensor/18437 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/docs/conf.py|Scaling-nmt_for_Pytorch/docs/conf.py | https://github.com/pytorch/fairseq/tree/main/docs/ | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/docs/conf.py|Scaling-nmt_for_Pytorch/docs/conf.py | http://docs.scipy.org/doc/numpy/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/docs/conf.py|Scaling-nmt_for_Pytorch/docs/conf.py | https://docs.python.org/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/docs/conf.py|Scaling-nmt_for_Pytorch/docs/conf.py | https://pytorch.org/docs/master/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/file_utils.py|Scaling-nmt_for_Pytorch/fairseq/file_utils.py | https://github.com/allenai/allennlp | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/bart/README.md|Scaling-nmt_for_Pytorch/fairseq/file_utils.py | https://github.com/huggingface | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/checkpoint_utils.py|Scaling-nmt_for_Pytorch/fairseq/checkpoint_utils.py | https://pypi.org/project/huggingface-hub/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/layerdrop/README.md|Scaling-nmt_for_Pytorch/fairseq/checkpoint_utils.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/bolb/main/examples/constrained_decoding/README.md|Scaling-nmt_for_Pytorch/fairseq/search.py | https://www.aclweb.org/anthology/N18-1119/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/constrained_decoding/README.md|Scaling-nmt_for_Pytorch/fairseq/search.py | https://www.aclweb.org/anthology/N19-1090/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/speech_dlm/sequence_generator/multichannel_search.py|Scaling-nmt_for_Pytorch/fairseq/search.py | https://arxiv.org/abs/1904.09751 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/search.py|Scaling-nmt_for_Pytorch/fairseq/search.py | https://arxiv.org/abs/1611.08562 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/roberta/model.py|Scaling-nmt_for_Pytorch/fairseq/trainer.py | https://openreview.net/forum?id=_CMSV7FTzGI | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/roberta/model.py|Scaling-nmt_for_Pytorch/fairseq/trainer.py | https://openreview.net/forum?id=_CMSV7FTzGI | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/wav2vec/unsupervised/w2vu_generate.py|Scaling-nmt_for_Pytorch/fairseq_cli/hydra_train.py | https://github.com/facebookresearch/hydra/issues/1126 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/wav2vec/unsupervised/w2vu_generate.py|Scaling-nmt_for_Pytorch/fairseq_cli/hydra_validate.py | https://github.com/facebookresearch/hydra/issues/1126 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/wav2vec/unsupervised/w2vu_generate.py|Scaling-nmt_for_Pytorch/fairseq_cli/train.py | https://github.com/facebookresearch/hydra/issues/1126 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/scripts/build_sym_alignment.py|Scaling-nmt_for_Pytorch/scripts/build_sym_alignment.py | http://github.com/clab/fast_align | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/scripts/build_sym_alignment.py|Scaling-nmt_for_Pytorch/scripts/build_sym_alignment.py | http://github.com/moses-smt/mosesdecoder | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/scripts/build_sym_alignment.py|Scaling-nmt_for_Pytorch/scripts/build_sym_alignment.py | http://www.statmt.org/moses/?n=Development.GetStarted | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/data/indexed_dataset.py|Scaling-nmt_for_Pytorch/tests/test_token_block_dataset.py | https://github.com/numpy/numpy/issues/5745 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/criterions/adaptive_loss.py|Scaling-nmt_for_Pytorch/fairseq/criterions/adaptive_loss.py | http://arxiv.org/abs/1609.04309 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/textless_nlp/dgslm/hubert_fisher/README.md|Scaling-nmt_for_Pytorch/fairseq/criterions/speech_dlm_criterion.py | https://arxiv.org/pdf/2203.16502.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/criterions/tacotron2_loss.py|Scaling-nmt_for_Pytorch/fairseq/criterions/tacotron2_loss.py | https://arxiv.org/abs/1710.08969 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/wav2vec/README.md|Scaling-nmt_for_Pytorch/fairseq/data/data_utils.py | https://arxiv.org/abs/1910.05453 | 参考论文地址 | -| 
开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/data/indexed_dataset.py|Scaling-nmt_for_Pytorch/fairseq/data/indexed_dataset.py | https://github.com/numpy/numpy/issues/5745 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/wav2vec/README.md|Scaling-nmt_for_Pytorch/fairseq/data/mask_tokens_dataset.py | https://arxiv.org/abs/1910.05453 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/textless_nlp/dgslm/hubert_fisher/README.md|Scaling-nmt_for_Pytorch/fairseq/data/speech_dlm_dataset.py | https://arxiv.org/pdf/2203.16502.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/data/span_mask_tokens_dataset.py|Scaling-nmt_for_Pytorch/fairseq/data/span_mask_tokens_dataset.py | https://github.com/google-research/text-to-text-transfer-transformer/blob/84f8bcc14b5f2c03de51bd3587609ba8f6bbd1cd/t5/data/preprocessors.py#L2682 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/dataclass/constants.py|Scaling-nmt_for_Pytorch/fairseq/dataclass/constants.py | https://github.com/facebookresearch/hydra/issues/1156 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/dataclass/configs.py|Scaling-nmt_for_Pytorch/fairseq/dataclass/configs.py | https://fairscale.readthedocs.io/en/latest/api/experimental/nn/slowmo_ddp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/dataclass/utils.py|Scaling-nmt_for_Pytorch/fairseq/dataclass/utils.py | https://github.com/omry/omegaconf/pull/911 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/dataclass/configs.py|Scaling-nmt_for_Pytorch/fairseq/dataclass/configs.py | https://github.com/facebookresearch/hydra/issues/1117 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/fairseq_incremental_decoder.py|Scaling-nmt_for_Pytorch/fairseq/models/fairseq_incremental_decoder.py | http://www.telesens.co/2019/04/21/understanding-incremental-decoding-in-fairseq/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/conv_seq2seq/README.md|Scaling-nmt_for_Pytorch/fairseq/models/fconv.py | https://arxiv.org/abs/1705.03122 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/docs/getting_started.rst|Scaling-nmt_for_Pytorch/fairseq/models/fconv.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt14.v2.en-fr.fconv-py.tar.bz2 | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/conv_seq2seq/README.md|Scaling-nmt_for_Pytorch/fairseq/models/fconv.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt14.en-de.fconv-py.tar.bz2 | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/conv_seq2seq/README.md|Scaling-nmt_for_Pytorch/fairseq/models/fconv.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt17.v2.en-de.fconv-py.tar.bz2 | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/fconv_self_att.py|Scaling-nmt_for_Pytorch/fairseq/models/fconv_self_att.py | https://dl.fbaipublicfiles.com/fairseq/models/stories_checkpoint.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/fconv_self_att.py|Scaling-nmt_for_Pytorch/fairseq/models/fconv_self_att.py | https://dl.fbaipublicfiles.com/fairseq/models/stories_checkpoint.tar.gz | 预训练模型 | -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/bolb/main/examples/stories/README.md|Scaling-nmt_for_Pytorch/fairseq/models/fconv_self_att.py | https://dl.fbaipublicfiles.com/fairseq/data/stories_test.tar.bz2 | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/lightconv.py|Scaling-nmt_for_Pytorch/fairseq/models/lightconv.py | https://openreview.net/pdf?id=SkVhlh09tX | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/pay_less_attention_paper/README.md|Scaling-nmt_for_Pytorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/iwslt14.de-en.lightconv.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/pay_less_attention_paper/README.md|Scaling-nmt_for_Pytorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/iwslt14.de-en.dynamicconv.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/pay_less_attention_paper/README.md|Scaling-nmt_for_Pytorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.lightconv.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/pay_less_attention_paper/README.md|Scaling-nmt_for_Pytorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.dynamicconv.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/pay_less_attention_paper/README.md|Scaling-nmt_for_Pytorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.lightconv-glu.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/pay_less_attention_paper/README.md|Scaling-nmt_for_Pytorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.dynamicconv-glu.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/pay_less_attention_paper/README.md|Scaling-nmt_for_Pytorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.lightconv-glu.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/pay_less_attention_paper/README.md|Scaling-nmt_for_Pytorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.dynamicconv-glu.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/pay_less_attention_paper/README.md|Scaling-nmt_for_Pytorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt14.en-fr.joined-dict.lightconv-glu.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/pay_less_attention_paper/README.md|Scaling-nmt_for_Pytorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt14.en-fr.joined-dict.dynamicconv-glu.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/pay_less_attention_paper/README.md|Scaling-nmt_for_Pytorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt17.zh-en.lightconv-glu.tar.gz | 预训练模型 | -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/bolb/main/examples/pay_less_attention_paper/README.md|Scaling-nmt_for_Pytorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt17.zh-en.dynamicconv-glu.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/language_model/README.adaptive_inputs.md|Scaling-nmt_for_Pytorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/adaptive_lm_gbw_huge.tar.bz2 | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/language_model/README.adaptive_inputs.md|Scaling-nmt_for_Pytorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/adaptive_lm_wiki103.v2.tar.bz2 | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/transformer_lm.py|Scaling-nmt_for_Pytorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.en.tar.bz2 | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/transformer_lm.py|Scaling-nmt_for_Pytorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.de.tar.bz2 | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/transformer_lm.py|Scaling-nmt_for_Pytorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.ru.tar.bz2 | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/transformer_lm.py|Scaling-nmt_for_Pytorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt20.en.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/transformer_lm.py|Scaling-nmt_for_Pytorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt20.ta.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/transformer_lm.py|Scaling-nmt_for_Pytorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt20.iu.news.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/transformer_lm.py|Scaling-nmt_for_Pytorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt20.iu.nh.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/criterions/adaptive_loss.py|Scaling-nmt_for_Pytorch/fairseq/modules/adaptive_softmax.py | http://arxiv.org/abs/1609.04309 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/modules/character_token_embedder.py|Scaling-nmt_for_Pytorch/fairseq/modules/character_token_embedder.py | https://arxiv.org/abs/1505.00387 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/speech_to_text/s2t_conformer.py|Scaling-nmt_for_Pytorch/fairseq/modules/conformer_layer.py | https://arxiv.org/abs/2005.08100 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/nonautoregressive_translation/README.md|Scaling-nmt_for_Pytorch/fairseq/modules/dynamic_crf_layer.py | https://arxiv.org/abs/1910.11555 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/modules/dynamic_crf_layer.py|Scaling-nmt_for_Pytorch/fairseq/modules/dynamic_crf_layer.py | https://github.com/kmkurn/pytorch-crf/blob/master/torchcrf/__init__.py | 源码实现 | -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/bolb/main/examples/truncated_bptt/README.md|Scaling-nmt_for_Pytorch/fairseq/modules/espnet_multihead_attention.py | https://arxiv.org/abs/1901.02860 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/truncated_bptt/README.md|Scaling-nmt_for_Pytorch/fairseq/modules/espnet_multihead_attention.py | https://arxiv.org/abs/1901.02860 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/modules/gelu.py|Scaling-nmt_for_Pytorch/fairseq/modules/gelu.py | https://github.com/hendrycks/GELUs | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/truncated_bptt/README.md|Scaling-nmt_for_Pytorch/fairseq/modules/espnet_multihead_attention.py | https://arxiv.org/abs/1901.02860 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/layerdrop/README.md|Scaling-nmt_for_Pytorch/fairseq/modules/layer_drop.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/modules/location_attention.py|Scaling-nmt_for_Pytorch/fairseq/modules/location_attention.py | https://arxiv.org/pdf/1506.07503.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/modules/lstm_cell_with_zoneout.py|Scaling-nmt_for_Pytorch/fairseq/modules/lstm_cell_with_zoneout.py | https://arxiv.org/abs/1606.01305 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/modules/rotary_positional_embedding.py|Scaling-nmt_for_Pytorch/fairseq/modules/rotary_positional_embedding.py | https://blog.eleuther.ai/rotary-embeddings/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/truncated_bptt/README.md|Scaling-nmt_for_Pytorch/fairseq/modules/positional_encoding.py | https://arxiv.org/abs/1901.02860 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/modules/rotary_positional_embedding.py|Scaling-nmt_for_Pytorch/fairseq/modules/rotary_positional_embedding.py | https://arxiv.org/pdf/2104.09864.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/modules/sparse_multihead_attention.py|Scaling-nmt_for_Pytorch/fairseq/modules/sparse_multihead_attention.py | https://arxiv.org/pdf/1904.10509.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/modules/vggblock.py|Scaling-nmt_for_Pytorch/fairseq/modules/vggblock.py | https://arxiv.org/pdf/1409.1556.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/optim/adafactor.py|Scaling-nmt_for_Pytorch/fairseq/optim/adafactor.py | https://arxiv.org/abs/1804.04235 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/optim/adam.py|Scaling-nmt_for_Pytorch/fairseq/optim/adamax.py | https://arxiv.org/abs/1412.6980 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/optim/adam.py|Scaling-nmt_for_Pytorch/fairseq/optim/adam.py | https://arxiv.org/abs/1711.05101 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/optim/adam.py|Scaling-nmt_for_Pytorch/fairseq/optim/adam.py | https://arxiv.org/abs/1412.6980 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/optim/adam.py|Scaling-nmt_for_Pytorch/fairseq/optim/adam.py | https://openreview.net/forum?id=ryQu7f-RZ | 模型相关说明 | -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/bolb/main/fairseq/optim/bmuf.py|Scaling-nmt_for_Pytorch/fairseq/optim/bmuf.py | https://ieeexplore.ieee.org/document/7472805 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/optim/adam.py|Scaling-nmt_for_Pytorch/fairseq/optim/fused_adam.py | https://arxiv.org/abs/1412.6980 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/optim/adam.py|Scaling-nmt_for_Pytorch/fairseq/optim/fused_adam.py | https://openreview.net/forum?id=ryQu7f-RZ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/scaling_nmt/README.md|Scaling-nmt_for_Pytorch/fairseq/scoring/tokenizer.py | https://github.com/mjpost/sacrebleu | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/cross_lingual_language_model/README.md|Scaling-nmt_for_Pytorch/fairseq/tasks/cross_lingual_lm.py | https://arxiv.org/pdf/1901.07291.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/tasks/fairseq_task.py|Scaling-nmt_for_Pytorch/fairseq/tasks/fairseq_task.py | https://arxiv.org/abs/2010.00904 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/tasks/fairseq_task.py|Scaling-nmt_for_Pytorch/fairseq/tasks/fairseq_task.py | https://github.com/facebookresearch/GENRE | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/textless_nlp/dgslm/hubert_fisher/README.md|Scaling-nmt_for_Pytorch/fairseq/tasks/speech_dlm_task.py | https://arxiv.org/pdf/2203.16502.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/nonautoregressive_translation/README.md|Scaling-nmt_for_Pytorch/fairseq/tasks/translation_lev.py | https://arxiv.org/abs/1905.11006 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/speech_synthesis/utils.py|Scaling-nmt_for_Pytorch/fairseq/tasks/text_to_speech.py | https://arxiv.org/pdf/2011.03568.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/tasks/translation_lev.py|Scaling-nmt_for_Pytorch/fairseq/tasks/translation_lev.py | https://www.aclweb.org/anthology/2020.acl-main.325/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/hydra_plugins/dependency_submitit_launcher/setup.py|Scaling-nmt_for_Pytorch/hydra_plugins/dependency_submitit_launcher/setup.py | abaevski@fb.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/docs/getting_started.rst|Scaling-nmt_for_Pytorch/tests/speech/test_convtransformer_simul_trans.py | https://dl.fbaipublicfiles.com/fairseq/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/speech_text_joint_to_text/docs/pre-training.md|Scaling-nmt_for_Pytorch/tests/speech/test_dual_input_wav_transformer.py | https://dl.fbaipublicfiles.com/joint_speech_text_4_s2t/acl2022/librispeech/finetuned | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/docs/getting_started.rst|Scaling-nmt_for_Pytorch/tests/speech/test_s2s_transformer.py | https://dl.fbaipublicfiles.com/fairseq/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/docs/getting_started.rst|Scaling-nmt_for_Pytorch/tests/speech/test_wav2vec2.py | https://dl.fbaipublicfiles.com/fairseq | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/docs/getting_started.rst|Scaling-nmt_for_Pytorch/tests/speech/__init__.py | 
https://dl.fbaipublicfiles.com/fairseq | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/speech_text_joint_to_text/docs/ende-mustc.md|Scaling-nmt_for_Pytorch/tests/speech/__init__.py | https://dl.fbaipublicfiles.com/joint_speech_text_4_s2t/must_c/en_de | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/tests/speech_recognition/asr_test_base.py|Scaling-nmt_for_Pytorch/tests/speech_recognition/asr_test_base.py | https://fburl.com/batch_first_example | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/tests/speech_recognition/asr_test_base.py|Scaling-nmt_for_Pytorch/tests/speech_recognition/asr_test_base.py | https://fburl.com/batch_first_example | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/clib/libbase/balanced_assignment.cpp|Scaling-nmt_for_Pytorch/fairseq/clib/libbase/balanced_assignment.cpp | https://dspace.mit.edu/bitstream/handle/1721.1/3265/P-2108-26912652.pdf | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/clib/libbase/balanced_assignment.cpp|Scaling-nmt_for_Pytorch/fairseq/clib/libbase/balanced_assignment.cpp | https://github.com/bkj/auction-lap | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/clib/libnat_cuda/binding.cpp|Scaling-nmt_for_Pytorch/fairseq/clib/libnat_cuda/binding.cpp | https://github.com/1ytic/pytorch-edit-distance | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/bart/README.summarization.md|Scaling-nmt_for_Pytorch/fairseq/data/encoders/gpt2_bpe.py | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/encoder.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/bart/README.summarization.md|Scaling-nmt_for_Pytorch/fairseq/data/encoders/gpt2_bpe.py | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/vocab.bpe | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/data/encoders/gpt2_bpe_utils.py|Scaling-nmt_for_Pytorch/fairseq/data/encoders/gpt2_bpe_utils.py | https://github.com/openai/gpt-2/blob/master/src/encoder.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/data/audio/speech_to_text_dataset.py|Scaling-nmt_for_Pytorch/fairseq/data/audio/speech_to_text_dataset.py | https://arxiv.org/abs/1907.05019 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/bart/hub_interface.py|Scaling-nmt_for_Pytorch/fairseq/models/bart/hub_interface.py | https://github.com/pytorch/fairseq/tree/main/examples/bart | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/bart/model.py|Scaling-nmt_for_Pytorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.base.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/bart/model.py|Scaling-nmt_for_Pytorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/bart/model.py|Scaling-nmt_for_Pytorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.mnli.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/bart/model.py|Scaling-nmt_for_Pytorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.cnn.tar.gz | 预训练模型 | -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/bart/model.py|Scaling-nmt_for_Pytorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.xsum.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/ema/ema.py|Scaling-nmt_for_Pytorch/fairseq/models/ema/ema.py | https://github.com/zhawe01/fairseq-gec.git | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/ema/ema.py|Scaling-nmt_for_Pytorch/fairseq/models/ema/ema.py | https://www.tensorflow.org/api_docs/python/tf/train/ExponentialMovingAverage | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/layerdrop/README.md|Scaling-nmt_for_Pytorch/fairseq/models/roberta/hub_interface.py | https://github.com/pytorch/fairseq/tree/main/examples/roberta | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/roberta/model.py|Scaling-nmt_for_Pytorch/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.base.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/roberta/model.py|Scaling-nmt_for_Pytorch/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/roberta/model.py|Scaling-nmt_for_Pytorch/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.mnli.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/roberta/model.py|Scaling-nmt_for_Pytorch/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.wsc.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/roberta/hub_interface.py|Scaling-nmt_for_Pytorch/fairseq/models/roberta/hub_interface.py | https://github.com/pytorch/fairseq/issues/1306 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/roberta/model.py|Scaling-nmt_for_Pytorch/fairseq/models/roberta/model.py | https://openreview.net/forum?id=_CMSV7FTzGI | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/roberta/model_camembert.py|Scaling-nmt_for_Pytorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/roberta/model_camembert.py|Scaling-nmt_for_Pytorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/gottbert/README.md|Scaling-nmt_for_Pytorch/fairseq/models/roberta/model_gottbert.py | https://dl.gottbert.de/fairseq/models/gottbert-base.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/roberta/model_camembert.py|Scaling-nmt_for_Pytorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/roberta/model_camembert.py|Scaling-nmt_for_Pytorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-large.tar.gz | 预训练模型 | -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/roberta/model_camembert.py|Scaling-nmt_for_Pytorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-ccnet.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/roberta/model_xlmr.py|Scaling-nmt_for_Pytorch/fairseq/models/roberta/model_xlmr.py | http://dl.fbaipublicfiles.com/fairseq/models/xlmr.base.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/roberta/model_camembert.py|Scaling-nmt_for_Pytorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-ccnet-4gb.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/roberta/model_xlmr.py|Scaling-nmt_for_Pytorch/fairseq/models/roberta/model_xlmr.py | http://dl.fbaipublicfiles.com/fairseq/models/xlmr.large.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/roberta/model_camembert.py|Scaling-nmt_for_Pytorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-wikipedia-4gb.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/roberta/model_xlmr.py|Scaling-nmt_for_Pytorch/fairseq/models/roberta/model_xlmr.py | http://dl.fbaipublicfiles.com/fairseq/models/xlmr/xlmr.xl.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/roberta/model_camembert.py|Scaling-nmt_for_Pytorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-oscar-4gb.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/roberta/model_xlmr.py|Scaling-nmt_for_Pytorch/fairseq/models/roberta/model_xlmr.py | http://dl.fbaipublicfiles.com/fairseq/models/xlmr/xlmr.xxl.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/textless_nlp/dgslm/hubert_fisher/README.md|Scaling-nmt_for_Pytorch/fairseq/models/speech_dlm/speech_dlm.py | https://arxiv.org/pdf/2203.16502.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/speech_to_text/convtransformer.py|Scaling-nmt_for_Pytorch/fairseq/models/speech_to_text/convtransformer.py | https://arxiv.org/abs/2004.10234 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/speech_to_text/berard.py|Scaling-nmt_for_Pytorch/fairseq/models/speech_to_text/berard.py | https://arxiv.org/abs/1802.04200 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/speech_to_text/berard.py|Scaling-nmt_for_Pytorch/fairseq/models/speech_to_text/berard.py | https://github.com/eske/seq2seq | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/speech_to_text/berard.py|Scaling-nmt_for_Pytorch/fairseq/models/speech_to_text/berard.py | https://github.com/eske/seq2seq/blob/master/config/LibriSpeech/AST.yaml | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/speech_to_text/berard.py|Scaling-nmt_for_Pytorch/fairseq/models/speech_to_text/berard.py | https://github.com/eske/seq2seq/blob/master/translate/models.py | 源码实现 | -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/speech_to_text/berard.py|Scaling-nmt_for_Pytorch/fairseq/models/speech_to_text/berard.py | https://arxiv.org/abs/1409.0473 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/speech_to_speech/docs/direct_s2st_discrete_units.md|Scaling-nmt_for_Pytorch/fairseq/models/speech_to_speech/s2s_transformer.py | https://arxiv.org/abs/2107.05604 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/speech_to_text/berard.py|Scaling-nmt_for_Pytorch/fairseq/models/speech_to_text/berard.py | https://arxiv.org/abs/1409.0473 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/speech_to_text/s2t_conformer.py|Scaling-nmt_for_Pytorch/fairseq/models/speech_to_text/s2t_conformer.py | https://arxiv.org/abs/2005.08100 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/pointer_generator/pointer_generator_src/transformer_pg.py|Scaling-nmt_for_Pytorch/fairseq/models/speech_to_text/s2t_transformer.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/speech_to_text/s2t_transformer.py|Scaling-nmt_for_Pytorch/fairseq/models/speech_to_text/s2t_transformer.py | http://dl.fbaipublicfiles.com/fairseq/s2t | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/speech_to_text/berard.py|Scaling-nmt_for_Pytorch/fairseq/models/speech_to_text/berard.py | https://arxiv.org/abs/1802.04200 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/speech_to_text/README.md|Scaling-nmt_for_Pytorch/fairseq/models/speech_to_text/berard.py | https://arxiv.org/abs/1909.06515 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/speech_to_text/berard.py|Scaling-nmt_for_Pytorch/fairseq/models/speech_to_text/berard.py | https://arxiv.org/pdf/2002.01320.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/speech_to_text/README.md|Scaling-nmt_for_Pytorch/fairseq/models/speech_to_text/berard.py | https://arxiv.org/abs/2006.12124 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/speech_to_text/s2t_transformer.py|Scaling-nmt_for_Pytorch/fairseq/models/speech_to_text/xm_transformer_unity.py | http://dl.fbaipublicfiles.com/fairseq/s2t | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/speech_to_text/s2t_transformer.py|Scaling-nmt_for_Pytorch/fairseq/models/speech_to_text/xm_transformer.py | http://dl.fbaipublicfiles.com/fairseq/s2t | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/speech_synthesis/docs/ljspeech_example.md|Scaling-nmt_for_Pytorch/fairseq/models/text_to_speech/fastspeech2.py | https://arxiv.org/abs/2006.04558 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/speech_to_text/s2t_transformer.py|Scaling-nmt_for_Pytorch/fairseq/models/text_to_speech/fastspeech2.py | http://dl.fbaipublicfiles.com/fairseq/s2 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/pointer_generator/pointer_generator_src/transformer_pg.py|Scaling-nmt_for_Pytorch/fairseq/models/transformer/transformer_base.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/text_to_speech/tacotron2.py|Scaling-nmt_for_Pytorch/fairseq/models/text_to_speech/tacotron2.py | https://arxiv.org/pdf/1712.05884.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/text_to_speech/vocoder.py|Scaling-nmt_for_Pytorch/fairseq/models/text_to_speech/vocoder.py | http://dl.fbaipublicfiles.com/fairseq/vocoder | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/text_to_speech/tts_transformer.py|Scaling-nmt_for_Pytorch/fairseq/models/text_to_speech/tts_transformer.py | https://arxiv.org/pdf/1809.08895.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/speech_to_text/s2t_transformer.py|Scaling-nmt_for_Pytorch/fairseq/models/text_to_speech/tts_transformer.py | http://dl.fbaipublicfiles.com/fairseq/s2 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/scaling_nmt/README.md|Scaling-nmt_for_Pytorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt14.en-fr.joined-dict.transformer.tar.bz2 | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/scaling_nmt/README.md|Scaling-nmt_for_Pytorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt16.en-de.joined-dict.transformer.tar.bz2 | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/backtranslation/README.md|Scaling-nmt_for_Pytorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt18.en-de.ensemble.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/translation/README.md|Scaling-nmt_for_Pytorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-de.joined-dict.ensemble.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/translation/README.md|Scaling-nmt_for_Pytorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-ru.ensemble.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/translation/README.md|Scaling-nmt_for_Pytorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.de-en.joined-dict.ensemble.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/translation/README.md|Scaling-nmt_for_Pytorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.ru-en.ensemble.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/transformer/transformer_legacy.py|Scaling-nmt_for_Pytorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-de.joined-dict.single_model.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/transformer/transformer_legacy.py|Scaling-nmt_for_Pytorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-ru.single_model.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/transformer/transformer_legacy.py|Scaling-nmt_for_Pytorch/fairseq/models/transformer/transformer_legacy.py | 
https://dl.fbaipublicfiles.com/fairseq/models/wmt19.de-en.joined-dict.single_model.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/transformer/transformer_legacy.py|Scaling-nmt_for_Pytorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.ru-en.single_model.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/wmt20/README.md|Scaling-nmt_for_Pytorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.en-ta.single.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/wmt20/README.md|Scaling-nmt_for_Pytorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.en-iu.news.single.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/wmt20/README.md|Scaling-nmt_for_Pytorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.en-iu.nh.single.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/wmt20/README.md|Scaling-nmt_for_Pytorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.ta-en.single.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/wmt20/README.md|Scaling-nmt_for_Pytorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.iu-en.news.single.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/wmt20/README.md|Scaling-nmt_for_Pytorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.iu-en.nh.single.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/models/wav2vec/utils.py|Scaling-nmt_for_Pytorch/fairseq/models/wav2vec/utils.py | https://github.com/lucidrains/local-attention/blob/master/local_attention/local_attention.py#L41 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/flores101/README.md|Scaling-nmt_for_Pytorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/flores101/pretrained_models/flores101_mm100_615M.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/flores101/README.md|Scaling-nmt_for_Pytorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/flores101/pretrained_models/flores101_mm100_175M.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/xmod/README.md|Scaling-nmt_for_Pytorch/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.81.1M.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/xmod/README.md|Scaling-nmt_for_Pytorch/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.large.prenorm.81.500k.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/xmod/README.md|Scaling-nmt_for_Pytorch/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.13.125k.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/xmod/README.md|Scaling-nmt_for_Pytorch/fairseq/models/xmod/model.py | 
https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.30.125k.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/xmod/README.md|Scaling-nmt_for_Pytorch/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.30.195k.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/xmod/README.md|Scaling-nmt_for_Pytorch/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.60.125k.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/xmod/README.md|Scaling-nmt_for_Pytorch/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.60.265k.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/xmod/README.md|Scaling-nmt_for_Pytorch/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.75.125k.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/xmod/README.md|Scaling-nmt_for_Pytorch/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.75.269k.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/megatron_11b/README.md|Scaling-nmt_for_Pytorch/fairseq/model_parallel/modules/multihead_attention.py | https://arxiv.org/pdf/1909.08053.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/megatron_11b/README.md|Scaling-nmt_for_Pytorch/fairseq/model_parallel/modules/transformer_layer.py | https://arxiv.org/pdf/1909.08053.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/examples/megatron_11b/README.md|Scaling-nmt_for_Pytorch/fairseq/model_parallel/modules/transformer_layer.py | https://arxiv.org/pdf/1909.08053.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/optim/lr_scheduler/cosine_lr_scheduler.py|Scaling-nmt_for_Pytorch/fairseq/optim/lr_scheduler/cosine_lr_scheduler.py | https://arxiv.org/pdf/1608.03983.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/optim/lr_scheduler/triangular_lr_scheduler.py|Scaling-nmt_for_Pytorch/fairseq/optim/lr_scheduler/triangular_lr_scheduler.py | https://arxiv.org/pdf/1506.01186.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/bolb/main/fairseq/optim/lr_scheduler/tri_stage_lr_scheduler.py|Scaling-nmt_for_Pytorch/fairseq/optim/lr_scheduler/tri_stage_lr_scheduler.py | https://arxiv.org/pdf/1904.08779.pdf | 参考论文地址 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|---------------------------------------------------------------------------------------------------------------------|----------------------------------------------------------------------------------------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/.circleci/config.yml | https://download.pytorch.org/whl/torch_stable.html | 三方库下载 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/.circleci/config.yml | https://download.pytorch.org/whl/torch_stable.html | 三方库下载 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/.circleci/config.yml | https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh | miniconda链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/data/encoders/gpt2_bpe.py | 
https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/vocab.bpe | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/data/encoders/gpt2_bpe.py | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/encoder.json | 模型参数配置 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/dataclass/configs.py | https://fairscale.readthedocs.io/en/latest/api/experimental/nn/slowmo_ddp.html | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.xsum.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.mnli.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.cnn.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.base.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/fconv.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt17.v2.en-de.fconv-py.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/fconv.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt17.v2.en-de.fconv-py.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/fconv.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt14.en-de.fconv-py.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/fconv_self_att.py | https://dl.fbaipublicfiles.com/fairseq/models/stories_checkpoint.tar.gz | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/fconv_self_att.py | https://dl.fbaipublicfiles.com/fairseq/models/stories_checkpoint.tar.gz | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/fconv_self_att.py | https://dl.fbaipublicfiles.com/fairseq/data/stories_test.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.lightconv.tar.gz | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.dynamicconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.dynamicconv.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/iwslt14.de-en.dynamicconv.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt17.zh-en.dynamicconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/lightconv.py | 
https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt17.en-de.joined-dict.dynamicconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt14.en-fr.joined-dict.dynamicconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/iwslt14.de-en.lightconv.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt17.zh-en.lightconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.lightconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.lightconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt14.en-fr.joined-dict.lightconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.base.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.wsc.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.mnli.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-large.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-wikipedia-4gb.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-oscar-4gb.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-ccnet-4gb.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-ccnet.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/roberta/model_camembert.py | 
http://dl.fbaipublicfiles.com/fairseq/models/camembert-base.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/roberta/model_gottbert.py | https://dl.gottbert.de/fairseq/models/gottbert-base.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/roberta/model_xlmr.py | http://dl.fbaipublicfiles.com/fairseq/models/xlmr/xlmr.xxl.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/roberta/model_xlmr.py | http://dl.fbaipublicfiles.com/fairseq/models/xlmr/xlmr.xl.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/roberta/model_xlmr.py | http://dl.fbaipublicfiles.com/fairseq/models/xlmr.large.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/roberta/model_xlmr.py | http://dl.fbaipublicfiles.com/fairseq/models/xlmr.base.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/speech_to_text/s2t_transformer.py | http://dl.fbaipublicfiles.com/fairseq/s2t | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/speech_to_text/xm_transformer.py | http://dl.fbaipublicfiles.com/fairseq/s2t | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/speech_to_text/xm_transformer_unity.py | http://dl.fbaipublicfiles.com/fairseq/s2t | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/text_to_speech/fastspeech2.py | http://dl.fbaipublicfiles.com/fairseq/s2 | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/text_to_speech/tts_transformer.py | http://dl.fbaipublicfiles.com/fairseq/s2 | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/text_to_speech/vocoder.py | http://dl.fbaipublicfiles.com/fairseq/vocoder | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.ta-en.single.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.iu-en.nh.single.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.iu-en.news.single.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.en-ta.single.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.en-iu.nh.single.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.en-iu.news.single.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.ru-en.ensemble.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/transformer/transformer_legacy.py | 
https://dl.fbaipublicfiles.com/fairseq/models/wmt19.ru-en.single_model.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-ru.ensemble.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-ru.single_model.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-ru.single_model.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-de.joined-dict.ensemble.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-de.joined-dict.single_model.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.de-en.joined-dict.ensemble.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.de-en.joined-dict.single_model.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt18.en-de.ensemble.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt16.en-de.joined-dict.transformer.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt14.en-fr.joined-dict.transformer.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/flores101/pretrained_models/flores101_mm100_615M.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/flores101/pretrained_models/flores101_mm100_615M.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/adaptive_lm_wiki103.v2.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/adaptive_lm_gbw_huge.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt20.ta.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt20.iu.nh.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt20.iu.news.tar.gz | 模型相关说明 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt20.en.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.ru.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.ru.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.ru.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.large.prenorm.81.500k.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.75.269k.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.75.125k.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.60.265k.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.60.125k.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.30.195k.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.30.125k.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.13.125k.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.81.1M.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/hydra_plugins/dependency_submitit_launcher/setup.py | abaevski@fb.com | 作者邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Scaling-nmt_for_Pytorch/setup.py | https://download.pytorch.org/whl/cpu/torch-1.7.0%2Bcpu-cp36-cp36m-linux_x86_64.whl | 模型相关说明 | \ No newline at end of file diff --git a/PyTorch/built-in/nlp/Transformer_ID0105_for_PyTorch/public_address_statement.md b/PyTorch/built-in/nlp/Transformer_ID0105_for_PyTorch/public_address_statement.md index d3e1a77aadfd4d1b9f51cbe715d8600c798c8bb2..cc4b315d05c64364db7ad818e24b03fc22b1f9a0 100644 --- a/PyTorch/built-in/nlp/Transformer_ID0105_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/nlp/Transformer_ID0105_for_PyTorch/public_address_statement.md @@ -1,41 +1,14 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 |用途说明| -| ---- | ------------ | ------ | ------------------------------------ | -------- | -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/Translation/Transformer/requirements.txt |Transformer_ID0105_for_PyTorch/requirements.txt |https://github.com/NVIDIA/dllogger.git|requirements.txt中dllogger在开源社区中的git链接配置| -| 
开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/Translation/Transformer/examples/translation/prepare-iwslt14.sh |Transformer_ID0105_for_PyTorch/examples/translation/prepare-iwslt14.sh|https://github.com/moses-smt/mosesdecoder.git|mosesdecoder在开源社区的git链接配置| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/Translation/Transformer/examples/translation/prepare-iwslt14.sh |Transformer_ID0105_for_PyTorch/examples/translation/prepare-iwslt14.sh |https://github.com/rsennrich/subword-nmt.git|subword-nmt在开源社区的git链接配置| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/Translation/Transformer/examples/translation/prepare-iwslt14.sh |Transformer_ID0105_for_PyTorch/examples/translation/prepare-iwslt14.sh |https://wit3.fbk.eu/archive/2014-01/texts/de/en/de-en.tgz|iwslt14数据集开源社区tgz下载链接配置| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/Translation/Transformer/examples/translation/prepare-wmt14en2de.sh |Transformer_ID0105_for_PyTorch/examples/translation/prepare-wmt14en2de.sh |https://github.com/moses-smt/mosesdecoder.git|mosesdecoder在开源社区的git链接配置| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/Translation/Transformer/examples/translation/prepare-wmt14en2de.sh |Transformer_ID0105_for_PyTorch/examples/translation/prepare-wmt14en2de.sh |https://github.com/rsennrich/subword-nmt.git|subword-nmt在开源社区的git链接配置| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/Translation/Transformer/examples/translation/prepare-wmt14en2de.sh |Transformer_ID0105_for_PyTorch/examples/translation/prepare-wmt14en2de.sh |http://statmt.org/wmt13/training-parallel-europarl-v7.tgz|wmt13数据集training-parallel-europarl-v7在开源社区的tgz下载链接| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/Translation/Transformer/examples/translation/prepare-wmt14en2de.sh |Transformer_ID0105_for_PyTorch/examples/translation/prepare-wmt14en2de.sh |http://statmt.org/wmt13/training-parallel-commoncrawl.tgz|wmt13数据集training-parallel-commoncrawl-v7在开源社区的tgz下载链接| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/Translation/Transformer/examples/translation/prepare-wmt14en2de.sh |Transformer_ID0105_for_PyTorch/examples/translation/prepare-wmt14en2de.sh |http://data.statmt.org/wmt17/translation-task/training-parallel-nc-v12.tgz|wmt17数据集training-parallel-nc-v12在开源社区的tgz下载链接| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/Translation/Transformer/examples/translation/prepare-wmt14en2de.sh |Transformer_ID0105_for_PyTorch/examples/translation/prepare-wmt14en2de.sh |http://data.statmt.org/wmt17/translation-task/dev.tgz|wmt17数据集dev在开源社区的tgz下载链接| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/Translation/Transformer/examples/translation/prepare-wmt14en2de.sh |Transformer_ID0105_for_PyTorch/examples/translation/prepare-wmt14en2de.sh |http://statmt.org/wmt14/test-full.tgz|wmt14数据集test-full在开源社区的tgz下载链接| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/Translation/Transformer/examples/translation/prepare-wmt14en2de.sh |Transformer_ID0105_for_PyTorch/examples/translation/prepare-wmt14en2de.sh |http://statmt.org/wmt14/training-parallel-nc-v9.tgz|wmt14数据集training-parallel-nc-v9在开源社区的tgz下载链接| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/Translation/Transformer/examples/translation/prepare-wmt14en2fr.sh 
|Transformer_ID0105_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh |https://github.com/moses-smt/mosesdecoder.git|mosesdecoder在开源社区的git链接配置| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/Translation/Transformer/examples/translation/prepare-wmt14en2fr.sh |Transformer_ID0105_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh |https://github.com/rsennrich/subword-nmt.git|subword-nmt在开源社区的git链接配置| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/Translation/Transformer/examples/translation/prepare-wmt14en2fr.sh |Transformer_ID0105_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh |http://statmt.org/wmt13/training-parallel-europarl-v7.tgz|wmt13数据集training-parallel-europarl-v7在开源社区的tgz下载链接| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/Translation/Transformer/examples/translation/prepare-wmt14en2fr.sh |Transformer_ID0105_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh |http://statmt.org/wmt13/training-parallel-commoncrawl.tgz|wmt13数据集training-parallel-commoncrawlv7在开源社区的tgz下载链接| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/Translation/Transformer/examples/translation/prepare-wmt14en2fr.sh |Transformer_ID0105_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh |http://statmt.org/wmt13/training-parallel-un.tgz|wmt13数据集training-parallel-un在开源社区的tgz下载链接| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/Translation/Transformer/examples/translation/prepare-wmt14en2fr.sh |Transformer_ID0105_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh |http://statmt.org/wmt14/training-parallel-nc-v9.tgz|wmt14数据集training-parallel-nc-v9在开源社区的tgz下载链接| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/Translation/Transformer/examples/translation/prepare-wmt14en2fr.sh |Transformer_ID0105_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh |http://statmt.org/wmt10/training-giga-fren.tar|wmt10数据集training-giga-fren在开源社区的tgz下载链接| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/Translation/Transformer/examples/translation/prepare-wmt14en2fr.sh |Transformer_ID0105_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh |http://statmt.org/wmt14/test-full.tgz|wmt14数据集test-full在开源社区的tgz下载链接| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/Translation/Transformer/examples/translation/prepare-wmt14en2fr.sh |Transformer_ID0105_for_PyTorch/optim/cpex/amp/handle.py |https://nvidia.github.io/apex/amp.html#transition-guide-for-old-api-users|Amp API版本过旧时异常提示| -| 开发引入 | / | Transformer_ID0105_for_PyTorch/optim/cpex/amp/wrap.py | https://github.com/pytorch/pytorch/pull/157 | 源码实现 | -| 开发引入 | / | Transformer_ID0105_for_PyTorch/optim/cpex/amp/handle.py | https://nvidia.github.io/apex/advanced.ht | 模型相关说明 | -| 开发引入 | / | Transformer_ID0105_for_PyTorch/optim/cpex/amp/frontend.py | https://github.com/NVIDIA/apex/tree/master/examples/imagenet#distributed-traini | 源码实现 | -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/Detection/Efficientdet/train.py|Transformer_ID0105_for_PyTorch/optim/cpex/amp/frontend.py | https://github.com/NVIDIA/apex/tree/master/examples/imagen | 源码实现 | -| 开发引入 | / | Transformer_ID0105_for_PyTorch/optim/cpex/amp/frontend.py | https://nvidia.github.io/apex/advanced.ht | 模型相关说明 | -| 开发引入 | / | Transformer_ID0105_for_PyTorch/optim/cpex/amp/frontend.py | 
https://nvidia.github.io/apex/advanced.html#multiple-models-optimizers-loss | 模型相关说明 | -| 开发引入 | / | Transformer_ID0105_for_PyTorch/optim/cpex/amp/frontend.py | https://github.com/NVIDIA/apex/issu | 源码实现 | -| 开发引入 | / | Transformer_ID0105_for_PyTorch/optim/cpex/amp/_amp_state.py | http://effbot.org/pyfaq/how-do-i-share-global-variables-across-modules.h | 模型相关说明 | -| 开发引入 | / | Transformer_ID0105_for_PyTorch/optim/cpex/__init__.py | https://pytorch.org/cppdocs/notes/faq.html#undefined-symbol-errors-from-pytorch-at | 模型相关说明 | -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/BART/utils/optimization.py|Transformer_ID0105_for_PyTorch/optim/adam.py | https://arxiv.org/abs/1711.051 | 模型相关说明 | -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechRecognition/Jasper/common/optimizers.py|Transformer_ID0105_for_PyTorch/optim/adam.py | https://arxiv.org/abs/1412.69 | 模型相关说明 | -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechRecognition/Jasper/common/optimizers.py|Transformer_ID0105_for_PyTorch/optim/adam.py | https://openreview.net/forum?id=ryQu7f- | 模型相关说明 | -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/Translation/Transformer/examples/translation/prepare-iwslt14.sh|Transformer_ID0105_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh | https://github.com/facebookresearch/MIXER/blob/master/prepareData. | 源码实现 | -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/Translation/Transformer/examples/translation/prepare-iwslt14.sh|Transformer_ID0105_for_PyTorch/examples/translation/prepare-wmt14en2de.sh | https://github.com/facebookresearch/MIXER/blob/master/prepareData. | 源码实现 | -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/Translation/Transformer/examples/translation/prepare-wmt14en2de.sh|Transformer_ID0105_for_PyTorch/examples/translation/prepare-wmt14en2de.sh | https://arxiv.org/abs/1705.031 | 模型相关说明 | -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/Translation/Transformer/examples/translation/prepare-wmt14en2de.sh|Transformer_ID0105_for_PyTorch/examples/translation/prepare-wmt14en2de.sh | https://arxiv.org/abs/1806.001 | 模型相关说明 | -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/Translation/Transformer/examples/translation/prepare-iwslt14.sh|Transformer_ID0105_for_PyTorch/examples/translation/prepare-iwslt14.sh | https://github.com/facebookresearch/MIXER/blob/master/prepareData. 
| 源码实现 | -| 开发引入 | / |Transformer_ID0105_for_PyTorch/requirements.txt | https://github.com/NVIDIA/dllogger.git | 相关依赖 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|-----------------------------------------------------------------------------------------------------------------|----------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Transformer_ID0105_for_PyTorch/examples/translation/prepare-wmt14en2de.sh | http://statmt.org/wmt14/test-full.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Transformer_ID0105_for_PyTorch/examples/translation/prepare-wmt14en2de.sh | http://statmt.org/wmt13/training-parallel-europarl-v7.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Transformer_ID0105_for_PyTorch/examples/translation/prepare-wmt14en2de.sh | http://statmt.org/wmt13/training-parallel-commoncrawl.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Transformer_ID0105_for_PyTorch/examples/translation/prepare-wmt14en2de.sh | http://data.statmt.org/wmt17/translation-task/training-parallel-nc-v12.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Transformer_ID0105_for_PyTorch/examples/translation/prepare-wmt14en2de.sh | http://data.statmt.org/wmt17/translation-task/dev.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Transformer_ID0105_for_PyTorch/examples/translation/prepare-wmt14en2de.sh | http://statmt.org/wmt14/training-parallel-nc-v9.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Transformer_ID0105_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh | http://statmt.org/wmt14/test-full.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Transformer_ID0105_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh | http://statmt.org/wmt14/training-parallel-nc-v9.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Transformer_ID0105_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh | http://statmt.org/wmt13/training-parallel-un.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Transformer_ID0105_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh | http://statmt.org/wmt13/training-parallel-europarl-v7.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Transformer_ID0105_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh | http://statmt.org/wmt13/training-parallel-commoncrawl.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/Transformer_ID0105_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh | http://statmt.org/wmt10/training-giga-fren.tar | 模型相关说明 | \ No newline at end of file diff --git a/PyTorch/built-in/nlp/XLM_ID0740_for_PyTorch/public_address_statement.md b/PyTorch/built-in/nlp/XLM_ID0740_for_PyTorch/public_address_statement.md index 140792938c3f872ca5fc640dd69771d48c8e463f..034f4978801e6536df14593267a50afc93c3aa8e 100644 --- a/PyTorch/built-in/nlp/XLM_ID0740_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/nlp/XLM_ID0740_for_PyTorch/public_address_statement.md @@ -1,96 +1,22 @@ - -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 |用途说明| -| ---- | ------------ | ------ | ------------------------------------ | -------- | -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-glue.sh|XLM_ID0740_for_PyTorch/get-data-glue.sh |https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2F|mtl-sentence-representations在开源社区中的url链接| -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-nmt.sh|XLM_ID0740_for_PyTorch/get-data-nmt.sh 
|http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2007.de.shuffled.gz|wmt14数据集在开源社区中的shuffled.gz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-nmt.sh|XLM_ID0740_for_PyTorch/get-data-nmt.sh |http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2008.de.shuffled.gz|wmt14数据集在开源社区中的shuffled.gz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-nmt.sh|XLM_ID0740_for_PyTorch/get-data-nmt.sh |http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2007.en.shuffled.gz|wmt14数据集在开源社区中的shuffled.gz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-nmt.sh|XLM_ID0740_for_PyTorch/get-data-nmt.sh |http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2008.en.shuffled.gz|wmt14数据集在开源社区中的shuffled.gz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-nmt.sh|XLM_ID0740_for_PyTorch/get-data-nmt.sh |http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2007.fr.shuffled.gz|wmt14数据集在开源社区中的shuffled.gz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-nmt.sh|XLM_ID0740_for_PyTorch/get-data-nmt.sh |http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2008.fr.shuffled.gz|wmt14数据集在开源社区中的shuffled.gz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-nmt.sh|XLM_ID0740_for_PyTorch/get-data-nmt.sh |http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2009.fr.shuffled.gz|wmt14数据集在开源社区中的shuffled.gz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-nmt.sh|XLM_ID0740_for_PyTorch/get-data-nmt.sh |http://data.statmt.org/wmt16/translation-task/news.2015.ro.shuffled.gz|wmt16数据集在开源社区中的shuffled.gz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-nmt.sh|XLM_ID0740_for_PyTorch/get-data-nmt.sh |http://data.statmt.org/wmt18/translation-task/dev.tgz|wmt18数据集在开源社区中的dev.gz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-para.sh|XLM_ID0740_for_PyTorch/get-data-para.sh |http://opus.nlpl.eu/download.php?f=MultiUN%2Far-en.txt.zip|语言转换包在开源社区中的zip下载链接| -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-para.sh|XLM_ID0740_for_PyTorch/get-data-para.sh |http://opus.nlpl.eu/download.php?f=EUbookshop%2Fbg-en.txt.zip|语言转换包在开源社区中的zip下载链接| -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-para.sh|XLM_ID0740_for_PyTorch/get-data-para.sh |http://opus.nlpl.eu/download.php?f=Europarl%2Fbg-en.txt.zip|语言转换包在开源社区中的zip下载链接| -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-para.sh|XLM_ID0740_for_PyTorch/get-data-para.sh |http://opus.nlpl.eu/download.php?f=EUbookshop%2Fde-en.txt.zip|语言转换包在开源社区中的zip下载链接| -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-para.sh|XLM_ID0740_for_PyTorch/get-data-para.sh |http://opus.nlpl.eu/download.php?f=EUbookshop%2Fel-en.txt.zip|语言转换包在开源社区中的zip下载链接| -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-para.sh|XLM_ID0740_for_PyTorch/get-data-para.sh |https://object.pouta.csc.fi/OPUS-MultiUN/v1/moses/en-es.txt.zip|语言转换包在开源社区中的zip下载链接| -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-para.sh|XLM_ID0740_for_PyTorch/get-data-para.sh |https://object.pouta.csc.fi/OPUS-MultiUN/v1/moses/en-fr.txt.zip|语言转换包在开源社区中的zip下载链接| -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-para.sh|XLM_ID0740_for_PyTorch/get-data-para.sh 
|http://www.cfilt.iitb.ac.in/iitb_parallel/iitb_corpus_download/parallel.tgz|语言转换包在开源社区中的tgz下载链接| -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-para.sh|XLM_ID0740_for_PyTorch/get-data-para.sh |http://opus.nlpl.eu/download.php?f=MultiUN%2Fen-ru.txt.zip|语言转换包在开源社区中的zip下载链接| -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-para.sh|XLM_ID0740_for_PyTorch/get-data-para.sh |http://opus.nlpl.eu/download.php?f=Tanzil%2Fen-sw.txt.zip|语言转换包在开源社区中的zip下载链接| -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-para.sh|XLM_ID0740_for_PyTorch/get-data-para.sh |http://opus.nlpl.eu/download.php?f=GlobalVoices%2Fen-sw.txt.zip|语言转换包在开源社区中的zip下载链接| -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-para.sh|XLM_ID0740_for_PyTorch/get-data-para.sh |http://opus.nlpl.eu/download.php?f=OpenSubtitles2018%2Fen-th.txt.zip|语言转换包在开源社区中的zip下载链接| -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-para.sh|XLM_ID0740_for_PyTorch/get-data-para.sh |http://opus.nlpl.eu/download.php?f=SETIMES2%2Fen-tr.txt.zip|语言转换包在开源社区中的zip下载链接| -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-para.sh|XLM_ID0740_for_PyTorch/get-data-para.sh |http://opus.nlpl.eu/download.php?f=Wikipedia%2Fen-tr.txt.zip|语言转换包在开源社区中的zip下载链接| -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-para.sh|XLM_ID0740_for_PyTorch/get-data-para.sh |https://object.pouta.csc.fi/OPUS-TED2013/v1.1/moses/en-tr.txt.zip|语言转换包在开源社区中的zip下载链接| -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-para.sh|XLM_ID0740_for_PyTorch/get-data-para.sh |http://opus.nlpl.eu/download.php?f=Tanzil%2Fen-ur.txt.zip|语言转换包在开源社区中的zip下载链接| -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-para.sh|XLM_ID0740_for_PyTorch/get-data-para.sh |http://opus.nlpl.eu/download.php?f=OpenSubtitles2018%2Fen-vi.txt.zip|语言转换包在开源社区中的zip下载链接| -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-para.sh|XLM_ID0740_for_PyTorch/get-data-para.sh |http://opus.nlpl.eu/download.php?f=MultiUN%2Fen-zh.txt.zip|语言转换包在开源社区中的zip下载链接| -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-wiki.sh|XLM_ID0740_for_PyTorch/get-data-wiki.sh |https://dumps.wikimedia.org/${lg}wiki/latest/$WIKI_DUMP_NAME|wiki开源社区引用链接| -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-xnli.sh|XLM_ID0740_for_PyTorch/get-data-xnli.sh |https://dl.fbaipublicfiles.com/XNLI/XNLI-MT-1.0.zip|XNLI-MT-1.0在开源社区中的zip下载链接| -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-xnli.sh|XLM_ID0740_for_PyTorch/get-data-xnli.sh |https://dl.fbaipublicfiles.com/XNLI/XNLI-1.0.zip|XNLI-1.0在开源社区中的zip下载链接| -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/install-tools.sh|XLM_ID0740_for_PyTorch/install-tools.sh |https://github.com/moses-smt/mosesdecoder.git|mosesdecoder在开源社区中的git链接| -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/install-tools.sh|XLM_ID0740_for_PyTorch/install-tools.sh |https://github.com/glample/fastBPE|fastBPE在开源社区中的git链接| -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/install-tools.sh|XLM_ID0740_for_PyTorch/install-tools.sh |https://github.com/rsennrich/wmt16-scripts.git|wmt16-scripts在开源社区中的git链接| -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/install-tools.sh|XLM_ID0740_for_PyTorch/install-tools.sh |https://github.com/attardi/wikiextractor.git|wikiextractor在开源社区中的git链接| -| 开源代码引入 | 
https://github.com/facebookresearch/XLM/blob/main/README.md|XLM_ID0740_for_PyTorch/install-tools.sh |https://nlp.stanford.edu/software/stanford-segmenter-2018-10-16.zip|stanford-segmenter在开源社区中的git链接| -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/prepare-glue.sh|XLM_ID0740_for_PyTorch/prepare-glue.sh |https://dl.fbaipublicfiles.com/XLM/codes_en|codes_en在开源社区中的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/prepare-glue.sh|XLM_ID0740_for_PyTorch/prepare-glue.sh |https://dl.fbaipublicfiles.com/XLM/vocab_en|vocab_en在开源社区中的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/prepare-xnli.sh|XLM_ID0740_for_PyTorch/prepare-xnli.sh |https://dl.fbaipublicfiles.com/XLM/codes_xnli_15|codes_xnli_15在开源社区中的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/prepare-xnli.sh|XLM_ID0740_for_PyTorch/prepare-xnli.sh |https://dl.fbaipublicfiles.com/XLM/vocab_xnli_15|vocab_xnli_15在开源社区中的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/setup.py | XLM_ID0740_for_PyTorch/setup.py |glample@fb.com, aconneau@fb.com|setuptools的author_email配置选项| -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/xlm/trainer.py | XLM_ID0740_for_PyTorch/xlm/trainer.py | https://github.com/NVIDIA/apex/issues/250 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/xlm/slurm.py | XLM_ID0740_for_PyTorch/xlm/slurm.py | http://pytorch.apachecn.org/en/0.3.0/distributed.html#environment-variable-initialization | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/xlm/optim.py | XLM_ID0740_for_PyTorch/xlm/optim.py | https://github.com/pytorch/pytorch/blob/master/torch/optim/adam.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/xlm/optim.py | XLM_ID0740_for_PyTorch/xlm/optim.py | https://arxiv.org/pdf/1608.03983.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/xlm/model/transformer.py | XLM_ID0740_for_PyTorch/xlm/model/transformer.py | https://arxiv.org/abs/1606.08415 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/xlm/model/transformer.py | XLM_ID0740_for_PyTorch/xlm/model/transformer.py | https://github.com/huggingface/pytorch-openai-transformer-lm/blob/master/model_pytorch.py#L14 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/xlm/model/transformer.py | XLM_ID0740_for_PyTorch/xlm/model/transformer.py | https://github.com/huggingface/pytorch-pretrained-BERT/blob/master/modeling.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/xlm/model/pretrain.py | XLM_ID0740_for_PyTorch/xlm/model/pretrain.py | https://github.com/facebookresearch/fastText | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/xlm/model/memory/utils.py | XLM_ID0740_for_PyTorch/xlm/model/memory/utils.py | https://github.com/facebookresearch/faiss/blob/master/gpu/test/test_pytorch_faiss.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/tools/lowercase_and_remove_accent.py | XLM_ID0740_for_PyTorch/tools/lowercase_and_remove_accent.py | https://github.com/benjaminp/six | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/setup.py | XLM_ID0740_for_PyTorch/setup.py | glample@fb.c | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-wiki.sh | XLM_ID0740_for_PyTorch/get-data-wiki.sh | https://dumps.wikimedia.org/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-para.sh | 
XLM_ID0740_for_PyTorch/get-data-para.sh | http://opus.nlpl.eu/download.php?f=OpenSubtitles2018%2Far-en.txt.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-para.sh | XLM_ID0740_for_PyTorch/get-data-para.sh | http://opus.nlpl.eu/download.php?f=OpenSubtitles2018%2Fbg-en.txt.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-para.sh | XLM_ID0740_for_PyTorch/get-data-para.sh | http://opus.nlpl.eu/download.php?f=OpenSubtitles2018%2Fde-en.txt.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-para.sh | XLM_ID0740_for_PyTorch/get-data-para.sh | http://opus.nlpl.eu/download.php?f=OpenSubtitles2018%2Fel-en.txt.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-para.sh | XLM_ID0740_for_PyTorch/get-data-para.sh | http://opus.nlpl.eu/download.php?f=OpenSubtitles2018%2Fen-es.txt.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-para.sh | XLM_ID0740_for_PyTorch/get-data-para.sh | http://opus.nlpl.eu/download.php?f=EUbookshop%2Fen-es.txt.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-para.sh | XLM_ID0740_for_PyTorch/get-data-para.sh | http://opus.nlpl.eu/download.php?f=OpenSubtitles2018%2Fen-fr.txt.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-para.sh | XLM_ID0740_for_PyTorch/get-data-para.sh | http://opus.nlpl.eu/download.php?f=EUbookshop%2Fen-fr.txt.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-para.sh | XLM_ID0740_for_PyTorch/get-data-para.sh | http://opus.nlpl.eu/download.php?f=OpenSubtitles2018%2Fen-ru.txt.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-para.sh | XLM_ID0740_for_PyTorch/get-data-para.sh | http://opus.nlpl.eu/download.php?f=OpenSubtitles2018%2Fen-tr.txt.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-para.sh | XLM_ID0740_for_PyTorch/get-data-para.sh | http://opus.nlpl.eu/download.php?f=OpenSubtitles2018%2Fen-ur.txt.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-para.sh | XLM_ID0740_for_PyTorch/get-data-para.sh | http://opus.nlpl.eu/download.php?f=OpenSubtitles2016%2Fen-zh.txt.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-nmt.sh | XLM_ID0740_for_PyTorch/get-data-nmt.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2009.de.shuffled.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-nmt.sh | XLM_ID0740_for_PyTorch/get-data-nmt.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2010.de.shuffled.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-nmt.sh | XLM_ID0740_for_PyTorch/get-data-nmt.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2011.de.shuffled.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-nmt.sh | XLM_ID0740_for_PyTorch/get-data-nmt.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2012.de.shuffled.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-nmt.sh | XLM_ID0740_for_PyTorch/get-data-nmt.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2013.de.shuffled.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-nmt.sh | XLM_ID0740_for_PyTorch/get-data-nmt.sh | 
http://www.statmt.org/wmt15/training-monolingual-news-crawl-v2/news.2014.de.shuffled.v2.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-nmt.sh | XLM_ID0740_for_PyTorch/get-data-nmt.sh | http://data.statmt.org/wmt16/translation-task/news.2015.de.shuffled.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-nmt.sh | XLM_ID0740_for_PyTorch/get-data-nmt.sh | http://data.statmt.org/wmt17/translation-task/news.2016.de.shuffled.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-nmt.sh | XLM_ID0740_for_PyTorch/get-data-nmt.sh | http://data.statmt.org/wmt18/translation-task/news.2017.de.shuffled.deduped.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-nmt.sh | XLM_ID0740_for_PyTorch/get-data-nmt.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2009.en.shuffled.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-nmt.sh | XLM_ID0740_for_PyTorch/get-data-nmt.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2010.en.shuffled.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-nmt.sh | XLM_ID0740_for_PyTorch/get-data-nmt.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2011.en.shuffled.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-nmt.sh | XLM_ID0740_for_PyTorch/get-data-nmt.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2012.en.shuffled.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-nmt.sh | XLM_ID0740_for_PyTorch/get-data-nmt.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2013.en.shuffled.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-nmt.sh | XLM_ID0740_for_PyTorch/get-data-nmt.sh | http://www.statmt.org/wmt15/training-monolingual-news-crawl-v2/news.2014.en.shuffled.v2.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-nmt.sh | XLM_ID0740_for_PyTorch/get-data-nmt.sh | http://data.statmt.org/wmt16/translation-task/news.2015.en.shuffled.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-nmt.sh | XLM_ID0740_for_PyTorch/get-data-nmt.sh | http://data.statmt.org/wmt17/translation-task/news.2016.en.shuffled.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-nmt.sh | XLM_ID0740_for_PyTorch/get-data-nmt.sh | http://data.statmt.org/wmt18/translation-task/news.2017.en.shuffled.deduped.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-nmt.sh | XLM_ID0740_for_PyTorch/get-data-nmt.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2010.fr.shuffled.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-nmt.sh | XLM_ID0740_for_PyTorch/get-data-nmt.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2011.fr.shuffled.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-nmt.sh | XLM_ID0740_for_PyTorch/get-data-nmt.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2012.fr.shuffled.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-nmt.sh | XLM_ID0740_for_PyTorch/get-data-nmt.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2013.fr.shuffled.gz | 模型相关说明 | -| 开源代码引入 | 
https://github.com/facebookresearch/XLM/blob/main/get-data-nmt.sh | XLM_ID0740_for_PyTorch/get-data-nmt.sh | http://www.statmt.org/wmt15/training-monolingual-news-crawl-v2/news.2014.fr.shuffled.v2.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-nmt.sh | XLM_ID0740_for_PyTorch/get-data-nmt.sh | http://data.statmt.org/wmt17/translation-task/news.2015.fr.shuffled.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-nmt.sh | XLM_ID0740_for_PyTorch/get-data-nmt.sh | http://data.statmt.org/wmt17/translation-task/news.2016.fr.shuffled.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/get-data-nmt.sh | XLM_ID0740_for_PyTorch/get-data-nmt.sh | http://data.statmt.org/wmt17/translation-task/news.2017.fr.shuffled.gz | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/.gitignore | XLM_ID0740_for_PyTorch/.gitignore | https://www.gitignore.io/api/python | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/XLM/blob/main/.gitignore | XLM_ID0740_for_PyTorch/.gitignore | https://www.gitignore.io/?templates=python | 模型相关说明 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|-------------------------------------------------------------------------------|------------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/built-in/nlp/XLM_ID0740_for_PyTorch/get-data-glue.sh | https://firebasestorage.googleapis.com/v0/b/mtl-sentence-representations.appspot.com/o/data%2F | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/XLM_ID0740_for_PyTorch/get-data-nmt.sh | http://data.statmt.org/wmt18/translation-task/dev.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/XLM_ID0740_for_PyTorch/get-data-nmt.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2009.fr.shuffled.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/XLM_ID0740_for_PyTorch/get-data-nmt.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2008.fr.shuffled.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/XLM_ID0740_for_PyTorch/get-data-nmt.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2008.en.shuffled.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/XLM_ID0740_for_PyTorch/get-data-nmt.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2008.de.shuffled.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/XLM_ID0740_for_PyTorch/get-data-nmt.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2007.fr.shuffled.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/XLM_ID0740_for_PyTorch/get-data-nmt.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2007.en.shuffled.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/XLM_ID0740_for_PyTorch/get-data-nmt.sh | http://www.statmt.org/wmt14/training-monolingual-news-crawl/news.2007.de.shuffled.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/XLM_ID0740_for_PyTorch/get-data-nmt.sh | http://data.statmt.org/wmt16/translation-task/news.2015.ro.shuffled.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/XLM_ID0740_for_PyTorch/get-data-para.sh | http://www.cfilt.iitb.ac.in/iitb_parallel/iitb_corpus_download/parallel.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/XLM_ID0740_for_PyTorch/get-data-wiki.sh | https://dumps.wikimedia.org/${lg}wiki/latest/$WIKI_DUMP_NAME | 数据集说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/XLM_ID0740_for_PyTorch/get-data-xnli.sh | 
https://dl.fbaipublicfiles.com/XNLI/XNLI-MT-1.0.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/XLM_ID0740_for_PyTorch/get-data-xnli.sh | https://dl.fbaipublicfiles.com/XNLI/XNLI-1.0.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/XLM_ID0740_for_PyTorch/install-tools.sh | https://nlp.stanford.edu/software/stanford-segmenter-2018-10-16.zip | 模型相关配置 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/XLM_ID0740_for_PyTorch/prepare-glue.sh | https://dl.fbaipublicfiles.com/XLM/vocab_en | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/XLM_ID0740_for_PyTorch/prepare-glue.sh | https://dl.fbaipublicfiles.com/XLM/codes_en | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/XLM_ID0740_for_PyTorch/prepare-xnli.sh | https://dl.fbaipublicfiles.com/XLM/vocab_xnli_15 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/XLM_ID0740_for_PyTorch/prepare-xnli.sh | https://dl.fbaipublicfiles.com/XLM/codes_xnli_15 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/XLM_ID0740_for_PyTorch/setup.py | glample@fb.com","aconneau@fb.com | 作者邮箱 | \ No newline at end of file diff --git a/PyTorch/built-in/nlp/mBART_ID2372_for_PyTorch/public_address_statement.md b/PyTorch/built-in/nlp/mBART_ID2372_for_PyTorch/public_address_statement.md index b31d39594270ee0e536e5b56e6ae0fe825184525..7aecdc223e957f9e31174af5606daf7876304a2b 100644 --- a/PyTorch/built-in/nlp/mBART_ID2372_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/nlp/mBART_ID2372_for_PyTorch/public_address_statement.md @@ -1,126 +1,59 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ---- | ------------ | ------ | ------------------------------------ | -------- | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/.gitmodules|mBART_ID2372_for_PyTorch/.gitmodules |https://github.com/ngoyal2707/Megatron-LM|Megatron-LM在开源社区中的url链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/setup.py|mBART_ID2372_for_PyTorch/setup.py |https://download.pytorch.org/whl/cpu/torch-1.7.0%2Bcpu-cp36-cp36m-linux_x86_64.whl|setuptools的torch-cpu开源whl下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/hub_interface.py|mBART_ID2372_for_PyTorch/setup.py |https://github.com/pytorch/fairseq|setuptools的fairseq开源下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/docs/conf.py |mBART_ID2372_for_PyTorch/docs/conf.py |https://github.com/pytorch/fairseq/tree/master/docs/|conf文件中的fairseq开源引用链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/docs/conf.py|mBART_ID2372_for_PyTorch/docs/conf.py |http://docs.scipy.org/doc/numpy|conf文件中的numpy开源链接引用| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/docs/conf.py|mBART_ID2372_for_PyTorch/docs/conf.py |https://docs.python.org|conf文件中的python开源链接引用| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/docs/conf.py|mBART_ID2372_for_PyTorch/docs/conf.py |https://pytorch.org/docs/master|conf文件中的torch开源链接引用| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/data/encoders/gpt2_bpe.py|mBART_ID2372_for_PyTorch/fairseq/data/encoders/gpt2_bpe.py |https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/encoder.json|gpt2_bpe文件默认encoder.json开源社区链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/data/encoders/gpt2_bpe.py|mBART_ID2372_for_PyTorch/fairseq/data/encoders/gpt2_bpe.py |https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/vocab.bpe|gpt2_bpe文件默认vocab.bpe开源社区链接配置| -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/fconv_self_att.py|mBART_ID2372_for_PyTorch/fairseq/models/fconv_self_att.py |https://dl.fbaipublicfiles.com/fairseq/models/stories_checkpoint.tar.gz|fconv_self_att权重stories_checkpoint开源社区tar.gz链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/fconv_self_att.py|mBART_ID2372_for_PyTorch/fairseq/models/fconv_self_att.py |https://dl.fbaipublicfiles.com/fairseq/models/stories_checkpoint.tar.gz|fconv_self_att权重stories_checkpoint开源社区tar.gz链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/fconv_self_att.py|mBART_ID2372_for_PyTorch/fairseq/models/fconv_self_att.py |https://dl.fbaipublicfiles.com/fairseq/data/stories_test.tar.bz2|fconv_self_att数据集stories_test开源社区tar.bz2链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/fconv.py|mBART_ID2372_for_PyTorch/fairseq/models/fconv.py |https://dl.fbaipublicfiles.com/fairseq/models/wmt14.v2.en-fr.fconv-py.tar.bz2|wmt14.v2.en-fr.fconv-py开源社区tar.bz2链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/fconv.py|mBART_ID2372_for_PyTorch/fairseq/models/fconv.py |https://dl.fbaipublicfiles.com/fairseq/models/wmt14.en-de.fconv-py.tar.bz2|wmt14.en-de.fconv-py开源社区tar.bz2链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/fconv.py|mBART_ID2372_for_PyTorch/fairseq/models/fconv.py |https://dl.fbaipublicfiles.com/fairseq/models/wmt17.v2.en-de.fconv-py.tar.bz2|wmt17.v2.en-de.fconv-py开源社区tar.bz2链接配置| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/lightconv.py|mBART_ID2372_for_PyTorch/fairseq/models/lightconv.py |https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/iwslt14.de-en.lightconv.tar.gz|'lightconv.no_glu.iwslt14.de-en'在开源社区上的iwslt14.de-en.lightconv.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/lightconv.py|mBART_ID2372_for_PyTorch/fairseq/models/lightconv.py |https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/iwslt14.de-en.dynamicconv.tar.gz|'dynamicconv.no_glu.iwslt14.de-en'在开源社区上的iwslt14.de-en.dynamicconv.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/lightconv.py|mBART_ID2372_for_PyTorch/fairseq/models/lightconv.py |https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.lightconv.tar.gz|'lightconv.no_glu.wmt16.en-de'在开源社区上的wmt16.en-de.joined-dict.lightconv.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/lightconv.py|mBART_ID2372_for_PyTorch/fairseq/models/lightconv.py |https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.dynamicconv.tar.gz|'dynamicconv.no_glu.wmt16.en-de'在开源社区上的wmt16.en-de.joined-dict.dynamicconv.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/lightconv.py|mBART_ID2372_for_PyTorch/fairseq/models/lightconv.py |https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.lightconv-glu.tar.gz|'lightconv.glu.wmt16.en-de'在开源社区上的wmt16.en-de.joined-dict.lightconv-glu.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/lightconv.py|mBART_ID2372_for_PyTorch/fairseq/models/lightconv.py 
|https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.dynamicconv-glu.tar.gz|'dynamicconv.glu.wmt16.en-de'在开源社区上的wmt16.en-de.joined-dict.dynamicconv-glu.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/lightconv.py|mBART_ID2372_for_PyTorch/fairseq/models/lightconv.py |https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.lightconv-glu.tar.gz|'lightconv.glu.wmt17.en-de'在开源社区上的wmt16.en-de.joined-dict.lightconv-glu.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/lightconv.py|mBART_ID2372_for_PyTorch/fairseq/models/lightconv.py |https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.dynamicconv-glu.tar.gz|'dynamicconv.glu.wmt17.en-de'在开源社区上的wmt16.en-de.joined-dict.dynamicconv-glu.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/lightconv.py|mBART_ID2372_for_PyTorch/fairseq/models/lightconv.py |https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt14.en-fr.joined-dict.lightconv-glu.tar.gz|'lightconv.glu.wmt14.en-fr'在开源社区上的wmt14.en-fr.joined-dict.lightconv-glu.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/lightconv.py|mBART_ID2372_for_PyTorch/fairseq/models/lightconv.py |https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt14.en-fr.joined-dict.dynamicconv-glu.tar.gz|'dynamicconv.glu.wmt14.en-fr'在开源社区上的wmt14.en-fr.joined-dict.dynamicconv-glu.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/lightconv.py|mBART_ID2372_for_PyTorch/fairseq/models/lightconv.py |https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt17.zh-en.lightconv-glu.tar.gz|'lightconv.glu.wmt17.zh-en'在开源社区上的wmt17.zh-en.lightconv-glu.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/lightconv.py|mBART_ID2372_for_PyTorch/fairseq/models/lightconv.py |https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt17.zh-en.dynamicconv-glu.tar.gz|'dynamicconv.glu.wmt17.zh-en'在开源社区上的wmt17.zh-en.dynamicconv-glu.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer_lm.py|mBART_ID2372_for_PyTorch/fairseq/models/transformer_lm.py |https://dl.fbaipublicfiles.com/fairseq/models/lm/adaptive_lm_gbw_huge.tar.bz2|"transformer_lm.gbw.adaptive_huge"在开源社区上的adaptive_lm_gbw_huge.tar.bz2的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer_lm.py|mBART_ID2372_for_PyTorch/fairseq/models/transformer_lm.py |https://dl.fbaipublicfiles.com/fairseq/models/lm/adaptive_lm_wiki103.v2.tar.bz2|"transformer_lm.wiki103.adaptive"在开源社区上的adaptive_lm_wiki103.v2.tar.bz2的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer_lm.py|mBART_ID2372_for_PyTorch/fairseq/models/transformer_lm.py |https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.en.tar.bz2|wmt19.en在开源社区中的tar.bz2链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer_lm.py|mBART_ID2372_for_PyTorch/fairseq/models/transformer_lm.py |https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.de.tar.bz2|wmt19.de在开源社区中的tar.bz2链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer_lm.py|mBART_ID2372_for_PyTorch/fairseq/models/transformer_lm.py 
|https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.ru.tar.bz2|wmt19.ru在开源社区中的tar.bz2链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer/transformer_legacy.py|mBART_ID2372_for_PyTorch/fairseq/models/transformer.py |https://dl.fbaipublicfiles.com/fairseq/models/wmt14.en-fr.joined-dict.transformer.tar.bz2|'transformer.wmt14.en-fr'在开源社区上的wmt14.en-fr.joined-dict.transformer.tar.bz2'的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer/transformer_legacy.py|mBART_ID2372_for_PyTorch/fairseq/models/transformer.py |https://dl.fbaipublicfiles.com/fairseq/models/wmt16.en-de.joined-dict.transformer.tar.bz2|'transformer.wmt16.en-de'在开源社区上的wmt16.en-de.joined-dict.transformer.tar.bz2的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer/transformer_legacy.py|mBART_ID2372_for_PyTorch/fairseq/models/transformer.py |https://dl.fbaipublicfiles.com/fairseq/models/wmt18.en-de.ensemble.tar.gz|'transformer.wmt18.en-de'在开源社区上的wmt18.en-de.ensemble.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer/transformer_legacy.py|mBART_ID2372_for_PyTorch/fairseq/models/transformer.py |https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-de.joined-dict.ensemble.tar.gz|'transformer.wmt19.en-de'在开源社区上的wmt19.en-de.joined-dict.ensemble.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer/transformer_legacy.py|mBART_ID2372_for_PyTorch/fairseq/models/transformer.py |https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-ru.ensemble.tar.gz|'transformer.wmt19.en-ru'在开源社区上的wmt19.en-ru.ensemble.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer/transformer_legacy.py|mBART_ID2372_for_PyTorch/fairseq/models/transformer.py |https://dl.fbaipublicfiles.com/fairseq/models/wmt19.de-en.joined-dict.ensemble.tar.gz|'transformer.wmt19.de-en'在开源社区上的wmt19.de-en.joined-dict.ensemble.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer/transformer_legacy.py|mBART_ID2372_for_PyTorch/fairseq/models/transformer.py |https://dl.fbaipublicfiles.com/fairseq/models/wmt19.ru-en.ensemble.tar.gz|'transformer.wmt19.ru-en'在开源社区上的wmt19.ru-en.ensemble.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer/transformer_legacy.py|mBART_ID2372_for_PyTorch/fairseq/models/transformer.py |https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-de.joined-dict.single_model.tar.gz|'transformer.wmt19.en-de.single_model'在开源社区上的wmt19.en-de.joined-dict.single_model.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer/transformer_legacy.py|mBART_ID2372_for_PyTorch/fairseq/models/transformer.py |https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-ru.single_model.tar.gz|'transformer.wmt19.en-ru.single_model'在开源社区上的wmt19.en-ru.single_model.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer/transformer_legacy.py|mBART_ID2372_for_PyTorch/fairseq/models/transformer.py |https://dl.fbaipublicfiles.com/fairseq/models/wmt19.de-en.joined-dict.single_model.tar.gz|'transformer.wmt19.de-en.single_model'在开源社区上的wmt19.de-en.joined-dict.single_model.tar.gz的下载链接| -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer/transformer_legacy.py|mBART_ID2372_for_PyTorch/fairseq/models/transformer.py |https://dl.fbaipublicfiles.com/fairseq/models/wmt19.ru-en.single_model.tar.gz|'transformer.wmt19.ru-en.single_model'在开源社区上的wmt19.ru-en.single_model.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/bart/model.py|mBART_ID2372_for_PyTorch/fairseq/models/bart/model.py |http://dl.fbaipublicfiles.com/fairseq/models/bart.base.tar.gz|"bart.base"在开源社区上的bart.base.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/bart/model.py|mBART_ID2372_for_PyTorch/fairseq/models/bart/model.py |http://dl.fbaipublicfiles.com/fairseq/models/bart.large.tar.gz|"bart.large"在开源社区上的bart.large.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/bart/model.py|mBART_ID2372_for_PyTorch/fairseq/models/bart/model.py |http://dl.fbaipublicfiles.com/fairseq/models/bart.large.mnli.tar.gz|"bart.large.mnli"在开源社区上的bart.large.mnli.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/bart/model.py|mBART_ID2372_for_PyTorch/fairseq/models/bart/model.py |http://dl.fbaipublicfiles.com/fairseq/models/bart.large.cnn.tar.gz|"bart.large.cnn"在开源社区上的bart.large.cnn.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/bart/model.py|mBART_ID2372_for_PyTorch/fairseq/models/bart/model.py |http://dl.fbaipublicfiles.com/fairseq/models/bart.large.xsum.tar.gz|"bart.large.xsum"在开源社区上的bart.large.xsum.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_camembert.py|mBART_ID2372_for_PyTorch/fairseq/models/roberta/model_camembert.py |http://dl.fbaipublicfiles.com/fairseq/models/camembert-base.tar.gz|"camembert"在开源社区上的camembert-base.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_camembert.py|mBART_ID2372_for_PyTorch/fairseq/models/roberta/model_camembert.py |http://dl.fbaipublicfiles.com/fairseq/models/camembert-base.tar.gz|"camembert.v0"在开源社区上的camembert-base.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_camembert.py|mBART_ID2372_for_PyTorch/fairseq/models/roberta/model_camembert.py |http://dl.fbaipublicfiles.com/fairseq/models/camembert-base.tar.gz|"camembert-base"在开源社区上的camembert-base.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_camembert.py|mBART_ID2372_for_PyTorch/fairseq/models/roberta/model_camembert.py |http://dl.fbaipublicfiles.com/fairseq/models/camembert-large.tar.gz|"camembert-large"在开源社区上的camembert-large.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_camembert.py|mBART_ID2372_for_PyTorch/fairseq/models/roberta/model_camembert.py |http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-ccnet.tar.gz|"camembert-base-ccnet"在开源社区上的camembert-base-ccnet.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_camembert.py|mBART_ID2372_for_PyTorch/fairseq/models/roberta/model_camembert.py |http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-ccnet-4gb.tar.gz|"camembert-base-ccnet-4gb"在开源社区上的camembert-base-ccnet-4gb.tar.gz的下载链接| -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_camembert.py|mBART_ID2372_for_PyTorch/fairseq/models/roberta/model_camembert.py |http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-wikipedia-4gb.tar.gz|"camembert-base-wikipedia-4gb"在开源社区上的camembert-base-wikipedia-4gb.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_camembert.py|mBART_ID2372_for_PyTorch/fairseq/models/roberta/model_camembert.py |http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-oscar-4gb.tar.gz|"camembert-base-oscar-4gb"在开源社区上的camembert-base-oscar-4gb.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_xlmr.py|mBART_ID2372_for_PyTorch/fairseq/models/roberta/model_xlmr.py |http://dl.fbaipublicfiles.com/fairseq/models/xlmr.base.tar.gz|"xlmr.base"在开源社区上的xlmr.base.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_xlmr.py|mBART_ID2372_for_PyTorch/fairseq/models/roberta/model_xlmr.py |http://dl.fbaipublicfiles.com/fairseq/models/xlmr.large.tar.gz|"xlmr.large"在开源社区上的xlmr.large.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model.py|mBART_ID2372_for_PyTorch/fairseq/models/roberta/model.py |http://dl.fbaipublicfiles.com/fairseq/models/roberta.base.tar.gz|"roberta.base"在开源社区上的roberta.base.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model.py|mBART_ID2372_for_PyTorch/fairseq/models/roberta/model.py |http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.tar.gz|"roberta.large"在开源社区上的roberta.large.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model.py|mBART_ID2372_for_PyTorch/fairseq/models/roberta/model.py |http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.mnli.tar.gz|"roberta.large.mnli"在开源社区上的roberta.large.mnli.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model.py|mBART_ID2372_for_PyTorch/fairseq/models/roberta/model.py |http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.wsc.tar.gz|"roberta.large.wsc"在开源社区上的roberta.large.wsc.tar.gz的下载链接| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/tests/speech_recognition/asr_test_base.py | mBART_ID2372_for_PyTorch/tests/speech_recognition/asr_test_base.py | https://fburl.com/batch_first_example | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/setup.py | mBART_ID2372_for_PyTorch/setup.py | https://stackoverflow.com/a/54128391 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/setup.py | mBART_ID2372_for_PyTorch/setup.py | https://bit.ly/2NLVsgE | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/scripts/build_sym_alignment.py | mBART_ID2372_for_PyTorch/scripts/build_sym_alignment.py | http://github.com/clab/fast_align | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/scripts/build_sym_alignment.py | mBART_ID2372_for_PyTorch/scripts/build_sym_alignment.py | http://github.com/moses-smt/mosesdecoder | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/scripts/build_sym_alignment.py | mBART_ID2372_for_PyTorch/scripts/build_sym_alignment.py | http://www.statmt.org/moses/?n=Development.GetStarted | 模型相关说明 | -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/main/examples/nonautoregressive_translation/README.md | mBART_ID2372_for_PyTorch/fairseq/tasks/translation_lev.py | https://arxiv.org/abs/1905.11006 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/tasks/translation_lev.py | mBART_ID2372_for_PyTorch/fairseq/tasks/translation_lev.py | https://www.aclweb.org/anthology/2020.acl-main.325/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/cross_lingual_language_model/README.md | mBART_ID2372_for_PyTorch/fairseq/tasks/cross_lingual_lm.py | https://arxiv.org/pdf/1901.07291.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/constrained_decoding/README.md | mBART_ID2372_for_PyTorch/fairseq/search.py | https://www.aclweb.org/anthology/N18-1119/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/constrained_decoding/README.md | mBART_ID2372_for_PyTorch/fairseq/search.py | https://www.aclweb.org/anthology/N19-1090/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_dlm/sequence_generator/multichannel_search.py | mBART_ID2372_for_PyTorch/fairseq/search.py | https://arxiv.org/abs/1904.09751 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/search.py | mBART_ID2372_for_PyTorch/fairseq/search.py | https://arxiv.org/abs/1611.08562 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/scaling_nmt/README.md | mBART_ID2372_for_PyTorch/fairseq/scoring/tokenizer.py | https://github.com/mjpost/sacrebleu | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/optim/lr_scheduler/triangular_lr_scheduler.py | mBART_ID2372_for_PyTorch/fairseq/optim/lr_scheduler/triangular_lr_scheduler.py | https://arxiv.org/pdf/1506.01186.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/optim/lr_scheduler/tri_stage_lr_scheduler.py | mBART_ID2372_for_PyTorch/fairseq/optim/lr_scheduler/tri_stage_lr_scheduler.py | https://arxiv.org/pdf/1904.08779.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/optim/lr_scheduler/cosine_lr_scheduler.py | mBART_ID2372_for_PyTorch/fairseq/optim/lr_scheduler/cosine_lr_scheduler.py | https://arxiv.org/pdf/1608.03983.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/optim/adam.py | mBART_ID2372_for_PyTorch/fairseq/optim/fused_adam.py | https://arxiv.org/abs/1412.6980 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/optim/adam.py | mBART_ID2372_for_PyTorch/fairseq/optim/fused_adam.py | https://openreview.net/forum?id=ryQu7f-RZ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/optim/bmuf.py | mBART_ID2372_for_PyTorch/fairseq/optim/bmuf.py | https://ieeexplore.ieee.org/document/7472805 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/optim/adam.py | mBART_ID2372_for_PyTorch/fairseq/optim/adamax.py | https://arxiv.org/abs/1412.6980 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/optim/adam.py | mBART_ID2372_for_PyTorch/fairseq/optim/adam.py | https://arxiv.org/abs/1711.05101 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/optim/adam.py | mBART_ID2372_for_PyTorch/fairseq/optim/adam.py | https://arxiv.org/abs/1412.6980 | 参考论文地址 | -| 开源代码引入 
| https://github.com/facebookresearch/fairseq/blob/main/fairseq/optim/adam.py | mBART_ID2372_for_PyTorch/fairseq/optim/adam.py | https://openreview.net/forum?id=ryQu7f-RZ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/optim/adafactor.py | mBART_ID2372_for_PyTorch/fairseq/optim/adafactor.py | https://arxiv.org/abs/1804.04235 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/modules/vggblock.py | mBART_ID2372_for_PyTorch/fairseq/modules/vggblock.py | https://arxiv.org/pdf/1409.1556.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/modules/sparse_multihead_attention.py | mBART_ID2372_for_PyTorch/fairseq/modules/sparse_multihead_attention.py | https://arxiv.org/pdf/1904.10509.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/layerdrop/README.md | mBART_ID2372_for_PyTorch/fairseq/modules/layer_drop.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/modules/gelu.py | mBART_ID2372_for_PyTorch/fairseq/modules/gelu.py | https://github.com/hendrycks/GELUs | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/nonautoregressive_translation/README.md | mBART_ID2372_for_PyTorch/fairseq/modules/dynamic_crf_layer.py | https://arxiv.org/abs/1910.11555 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/modules/dynamic_crf_layer.py | mBART_ID2372_for_PyTorch/fairseq/modules/dynamic_crf_layer.py | https://github.com/kmkurn/pytorch-crf/blob/master/torchcrf/__init__.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/modules/character_token_embedder.py | mBART_ID2372_for_PyTorch/fairseq/modules/character_token_embedder.py | https://arxiv.org/abs/1505.00387 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/modules/adaptive_softmax.py | mBART_ID2372_for_PyTorch/fairseq/modules/adaptive_softmax.py | http://arxiv.org/abs/1609.04309 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/pointer_generator/pointer_generator_src/transformer_pg.py | mBART_ID2372_for_PyTorch/fairseq/models/transformer.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/modules/convolution.py | mBART_ID2372_for_PyTorch/fairseq/models/speech_to_text/s2t_transformer.py | https://arxiv.org/abs/1911.08460 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/pointer_generator/pointer_generator_src/transformer_pg.py | mBART_ID2372_for_PyTorch/fairseq/models/speech_to_text/s2t_transformer.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/berard.py | mBART_ID2372_for_PyTorch/fairseq/models/speech_to_text/berard.py | https://arxiv.org/abs/1802.04200 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/berard.py | mBART_ID2372_for_PyTorch/fairseq/models/speech_to_text/berard.py | https://github.com/eske/seq2seq | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/berard.py | mBART_ID2372_for_PyTorch/fairseq/models/speech_to_text/berard.py | https://github.com/eske/seq2seq/blob/master/config/LibriSpeech/AST.yaml | 源码实现 | -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/berard.py | mBART_ID2372_for_PyTorch/fairseq/models/speech_to_text/berard.py | https://github.com/eske/seq2seq/blob/master/translate/models.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/berard.py | mBART_ID2372_for_PyTorch/fairseq/models/speech_to_text/berard.py | https://arxiv.org/abs/1409.0473 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/speech_to_text/README.md | mBART_ID2372_for_PyTorch/fairseq/models/speech_to_text/berard.py | https://arxiv.org/abs/1909.06515 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/berard.py | mBART_ID2372_for_PyTorch/fairseq/models/speech_to_text/berard.py | https://arxiv.org/pdf/2002.01320.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/speech_to_text/README.md | mBART_ID2372_for_PyTorch/fairseq/models/speech_to_text/berard.py | https://arxiv.org/abs/2006.12124 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/hub_interface.py | mBART_ID2372_for_PyTorch/fairseq/models/roberta/hub_interface.py | https://github.com/pytorch/fairseq/issues/1306 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/lightconv.py | mBART_ID2372_for_PyTorch/fairseq/models/lightconv.py | https://openreview.net/pdf?id=SkVhlh09tX | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/conv_seq2seq/README.md | mBART_ID2372_for_PyTorch/fairseq/models/fconv.py | https://arxiv.org/abs/1705.03122 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/fairseq_incremental_decoder.py | mBART_ID2372_for_PyTorch/fairseq/models/fairseq_incremental_decoder.py | http://www.telesens.co/2019/04/21/understanding-incremental-decoding-in-fairseq/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/megatron_11b/README.md | mBART_ID2372_for_PyTorch/fairseq/model_parallel/modules/transformer_layer.py | https://arxiv.org/pdf/1909.08053.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/megatron_11b/README.md | mBART_ID2372_for_PyTorch/fairseq/model_parallel/modules/multihead_attention.py | https://arxiv.org/pdf/1909.08053.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/file_utils.py | mBART_ID2372_for_PyTorch/fairseq/file_utils.py | https://github.com/allenai/allennlp | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/roberta/README.md | mBART_ID2372_for_PyTorch/fairseq/file_utils.py | https://github.com/huggingface | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/data/encoders/gpt2_bpe_utils.py | mBART_ID2372_for_PyTorch/fairseq/data/encoders/gpt2_bpe_utils.py | https://github.com/openai/gpt-2/blob/master/src/encoder.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/data/audio/speech_to_text_dataset.py | mBART_ID2372_for_PyTorch/fairseq/data/audio/speech_to_text_dataset.py | https://arxiv.org/abs/1907.05019 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/data/audio/feature_transforms/specaugment.py | mBART_ID2372_for_PyTorch/fairseq/data/audio/feature_transforms/specaugment.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | 
-| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/modules/adaptive_softmax.py | mBART_ID2372_for_PyTorch/fairseq/criterions/adaptive_loss.py | http://arxiv.org/abs/1609.04309 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/clib/libnat_cuda/binding.cpp | mBART_ID2372_for_PyTorch/fairseq/clib/libnat_cuda/binding.cpp | https://github.com/1ytic/pytorch-edit-distance | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/layerdrop/README.md | mBART_ID2372_for_PyTorch/fairseq/checkpoint_utils.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/docs/make.bat | mBART_ID2372_for_PyTorch/docs/make.bat | http://sphinx-doc.org/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/docs/conf.py | mBART_ID2372_for_PyTorch/docs/conf.py | http://docs.scipy.org/doc/numpy/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/speech_to_speech/benchmarking/README.md | mBART_ID2372_for_PyTorch/docs/conf.py | https://docs.python.org/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/docs/conf.py | mBART_ID2372_for_PyTorch/docs/conf.py | https://pytorch.org/docs/master/ | 模型相关说明 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|----------------------------------------------------------------------------------------------------------|----------------------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/built-in/nlp/mBART_ID2372_for_PyTorch/fairseq/data/encoders/gpt2_bpe.py | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/vocab.bpe | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/mBART_ID2372_for_PyTorch/fairseq/data/encoders/gpt2_bpe.py | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/encoder.json | 模型参数配置 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/mBART_ID2372_for_PyTorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.xsum.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/mBART_ID2372_for_PyTorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.mnli.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/mBART_ID2372_for_PyTorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.cnn.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/mBART_ID2372_for_PyTorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/mBART_ID2372_for_PyTorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.base.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/mBART_ID2372_for_PyTorch/fairseq/models/fconv.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt17.v2.en-de.fconv-py.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/mBART_ID2372_for_PyTorch/fairseq/models/fconv.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt17.v2.en-de.fconv-py.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/mBART_ID2372_for_PyTorch/fairseq/models/fconv.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt14.en-de.fconv-py.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/mBART_ID2372_for_PyTorch/fairseq/models/fconv_self_att.py | https://dl.fbaipublicfiles.com/fairseq/models/stories_checkpoint.tar.gz | 模型说明 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/mBART_ID2372_for_PyTorch/fairseq/models/fconv_self_att.py | https://dl.fbaipublicfiles.com/fairseq/models/stories_checkpoint.tar.gz | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/mBART_ID2372_for_PyTorch/fairseq/models/fconv_self_att.py | https://dl.fbaipublicfiles.com/fairseq/data/stories_test.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/mBART_ID2372_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.lightconv.tar.gz | 模型说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/mBART_ID2372_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.dynamicconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/mBART_ID2372_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.dynamicconv.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/mBART_ID2372_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/iwslt14.de-en.dynamicconv.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/mBART_ID2372_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt17.zh-en.dynamicconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/mBART_ID2372_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt19.en-de.joined-dict.dynamicconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/mBART_ID2372_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt14.en-fr.joined-dict.dynamicconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/mBART_ID2372_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/iwslt14.de-en.lightconv.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/mBART_ID2372_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt17.zh-en.lightconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/mBART_ID2372_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.lightconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/mBART_ID2372_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.lightconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/mBART_ID2372_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt14.en-fr.joined-dict.lightconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/mBART_ID2372_for_PyTorch/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.base.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/mBART_ID2372_for_PyTorch/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.wsc.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/mBART_ID2372_for_PyTorch/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.mnli.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/mBART_ID2372_for_PyTorch/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.tar.gz | 模型相关说明 | +| 
ModelZoo-PyTorch/PyTorch/built-in/nlp/mBART_ID2372_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-large.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/mBART_ID2372_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-wikipedia-4gb.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/mBART_ID2372_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-oscar-4gb.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/mBART_ID2372_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-ccnet-4gb.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/mBART_ID2372_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-ccnet.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/mBART_ID2372_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/mBART_ID2372_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/mBART_ID2372_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/mBART_ID2372_for_PyTorch/fairseq/models/roberta/model_xlmr.py | http://dl.fbaipublicfiles.com/fairseq/models/xlmr.large.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/mBART_ID2372_for_PyTorch/fairseq/models/roberta/model_xlmr.py | http://dl.fbaipublicfiles.com/fairseq/models/xlmr.base.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/mBART_ID2372_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.ru-en.ensemble.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/mBART_ID2372_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.ru-en.single_model.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/mBART_ID2372_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-ru.ensemble.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/mBART_ID2372_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-ru.single_model.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/mBART_ID2372_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-ru.single_model.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/mBART_ID2372_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-de.joined-dict.ensemble.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/mBART_ID2372_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-de.joined-dict.single_model.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/mBART_ID2372_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.de-en.joined-dict.ensemble.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/mBART_ID2372_for_PyTorch/fairseq/models/transformer.py | 
https://dl.fbaipublicfiles.com/fairseq/models/wmt19.de-en.joined-dict.single_model.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/mBART_ID2372_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt18.en-de.ensemble.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/mBART_ID2372_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt16.en-de.joined-dict.transformer.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/mBART_ID2372_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt14.en-fr.joined-dict.transformer.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/mBART_ID2372_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/adaptive_lm_wiki103.v2.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/mBART_ID2372_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/adaptive_lm_gbw_huge.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/mBART_ID2372_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.ru.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/mBART_ID2372_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.ru.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/mBART_ID2372_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.ru.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/nlp/mBART_ID2372_for_PyTorch/setup.py | https://download.pytorch.org/whl/cpu/torch-1.7.0%2Bcpu-cp36-cp36m-linux_x86_64.whl | 模型相关说明 | \ No newline at end of file diff --git a/PyTorch/built-in/others/DCN_for_PyTorch/public_address_statement.md b/PyTorch/built-in/others/DCN_for_PyTorch/public_address_statement.md index caabcb47dae291902ed29cd36dea3bb472130fe4..c8292568ba2610a42de72783bc5adeb9681671eb 100644 --- a/PyTorch/built-in/others/DCN_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/others/DCN_for_PyTorch/public_address_statement.md @@ -1,70 +1,3 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ------------ | ------------------------------------------------------------------------------- | ------------------------ | ------------------------------------------------- | -------------------------------- | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/deepctr_torch/utils.py | DCN_for_PyTorch/deepctr_torch/utils.py | https://pypi.python.org/pypi/deepctr-torch/json | 使用json获取pypi上的包的版本 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/setup.py | DCN_for_PyTorch/setup.py | https://github.com/shenweichen/deepctr-torch | setuptools的url配置选项 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/setup.py | DCN_for_PyTorch/setup.py | https://github.com/shenweichen/deepctr-torch/tags | setuptools的download_url配置选项 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/setup.py | DCN_for_PyTorch/setup.py | weichenswc@163.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/README.md | DCN_for_PyTorch/setup.py | https://github.com/shenweichen/deepctr-torch | 源码实现 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/setup.py | DCN_for_PyTorch/setup.py | https://github.com/shenweichen/deepctr-torch/tags | 源码实现 | -| 开源代码引入 | 
https://github.com/shenweichen/DeepCTR-Torch/blob/master/docs/source/conf.py | DCN_for_PyTorch/docs/source/conf.py | http://www.sphinx-doc.org/en/master/config | 模型相关说明 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/docs/make.bat | DCN_for_PyTorch/docs/make.bat | http://sphinx-doc.org/ | 模型相关说明 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/setup.py | DCN_for_PyTorch/deepctr_torch/utils.py | weichenswc@163.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/deepctr_torch/utils.py | DCN_for_PyTorch/deepctr_torch/utils.py | https://github.com/shenweichen/DeepCTR-Torch/releases/tag/v | 源码实现 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/deepctr_torch/utils.py | DCN_for_PyTorch/deepctr_torch/utils.py | https://pypi.org/project/deepctr-torch/#history | 模型相关说明 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/deepctr_torch/models/deepfm.py | DCN_for_PyTorch/deepctr_torch/models/xdeepfm.py | https://arxiv.org/abs/1703.04247 | 参考论文地址 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/setup.py | DCN_for_PyTorch/deepctr_torch/models/wdl.py | weichenswc@163.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/README.md | DCN_for_PyTorch/deepctr_torch/models/wdl.py | https://arxiv.org/pdf/1606.07792.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/setup.py | DCN_for_PyTorch/deepctr_torch/models/pnn.py | weichenswc@163.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/README.md | DCN_for_PyTorch/deepctr_torch/models/pnn.py | https://arxiv.org/pdf/1611.00144.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/README.md | DCN_for_PyTorch/deepctr_torch/models/onn.py | https://arxiv.org/pdf/1904.12579 | 参考论文地址 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/setup.py | DCN_for_PyTorch/deepctr_torch/models/nfm.py | weichenswc@163.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/deepctr_torch/models/nfm.py | DCN_for_PyTorch/deepctr_torch/models/nfm.py | https://arxiv.org/abs/1708.05027 | 参考论文地址 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/setup.py | DCN_for_PyTorch/deepctr_torch/models/mlr.py | weichenswc@163.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/README.md | DCN_for_PyTorch/deepctr_torch/models/mlr.py | https://arxiv.org/abs/1704.05194 | 参考论文地址 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/deepctr_torch/models/dcnmix.py | DCN_for_PyTorch/deepctr_torch/models/ifm.py | zanshuxun@aliyun.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/README.md | DCN_for_PyTorch/deepctr_torch/models/ifm.py | https://www.ijcai.org/Proceedings/2019/0203.pdf | 模型相关说明 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/README.md | DCN_for_PyTorch/deepctr_torch/models/din.py | https://arxiv.org/pdf/1706.06978.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/deepctr_torch/models/dcnmix.py | DCN_for_PyTorch/deepctr_torch/models/difm.py | zanshuxun@aliyun.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/README.md | DCN_for_PyTorch/deepctr_torch/models/difm.py | https://www.ijcai.org/Proceedings/2020/0434.pdf | 模型相关说明 | -| 开源代码引入 | 
https://github.com/shenweichen/DeepCTR-Torch/blob/master/deepctr_torch/models/dien.py | DCN_for_PyTorch/deepctr_torch/models/dien.py | wangze0801@126.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/README.md | DCN_for_PyTorch/deepctr_torch/models/dien.py | https://arxiv.org/pdf/1809.03672.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/setup.py | DCN_for_PyTorch/deepctr_torch/models/deepfm.py | weichenswc@163.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/deepctr_torch/models/deepfm.py | DCN_for_PyTorch/deepctr_torch/models/deepfm.py | https://arxiv.org/abs/1703.04247 | 参考论文地址 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/deepctr_torch/models/dcnmix.py | DCN_for_PyTorch/deepctr_torch/models/dcnmix.py | bgasdo36977@gmail.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/deepctr_torch/models/dcnmix.py | DCN_for_PyTorch/deepctr_torch/models/dcnmix.py | zanshuxun@aliyun.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/README.md | DCN_for_PyTorch/deepctr_torch/models/dcnmix.py | https://arxiv.org/abs/1708.05123 | 参考论文地址 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/README.md | DCN_for_PyTorch/deepctr_torch/models/dcnmix.py | https://arxiv.org/abs/2008.13535 | 参考论文地址 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/deepctr_torch/models/dcnmix.py | DCN_for_PyTorch/deepctr_torch/models/dcn.py | bgasdo36977@gmail.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/deepctr_torch/models/dcnmix.py | DCN_for_PyTorch/deepctr_torch/models/dcn.py | zanshuxun@aliyun.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/README.md | DCN_for_PyTorch/deepctr_torch/models/dcn.py | https://arxiv.org/abs/1708.05123 | 参考论文地址 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/README.md | DCN_for_PyTorch/deepctr_torch/models/dcn.py | https://arxiv.org/abs/2008.13535 | 参考论文地址 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/deepctr_torch/models/ccpm.py | DCN_for_PyTorch/deepctr_torch/models/ccpm.py | kk163mail@126.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/README.md | DCN_for_PyTorch/deepctr_torch/models/ccpm.py | http://ir.ia.ac.cn/bitstream/173211/12337/1/A%20Convolutional%20Click%20Prediction%20Model.pdf | 模型相关说明 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/setup.py | DCN_for_PyTorch/deepctr_torch/models/basemodel.py | weichenswc@163.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/deepctr_torch/models/basemodel.py | DCN_for_PyTorch/deepctr_torch/models/basemodel.py | https://tensorflow.google.cn/api_docs/python/tf/keras/callbacks | 模型相关说明 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/deepctr_torch/models/basemodel.py | DCN_for_PyTorch/deepctr_torch/models/basemodel.py | https://pytorch.org/docs/stable/optim.html | 模型相关说明 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/deepctr_torch/models/basemodel.py | DCN_for_PyTorch/deepctr_torch/models/basemodel.py | https://pytorch.org/docs/stable/nn.functional.html#loss-functions | 模型相关说明 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/setup.py | DCN_for_PyTorch/deepctr_torch/models/autoint.py | weichenswc@163.com | 开发者邮箱配置 | -| 开源代码引入 | 
https://github.com/shenweichen/DeepCTR-Torch/blob/master/README.md | DCN_for_PyTorch/deepctr_torch/models/autoint.py | https://arxiv.org/abs/1810.11921 | 参考论文地址 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/setup.py | DCN_for_PyTorch/deepctr_torch/models/afm.py | weichenswc@163.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/deepctr_torch/models/afm.py | DCN_for_PyTorch/deepctr_torch/models/afm.py | https://arxiv.org/abs/1708.04617 | 参考论文地址 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/setup.py | DCN_for_PyTorch/deepctr_torch/layers/utils.py | weichenswc@163.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/README.md | DCN_for_PyTorch/deepctr_torch/layers/sequence.py | https://arxiv.org/pdf/1706.06978.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/deepctr_torch/layers/interaction.py | DCN_for_PyTorch/deepctr_torch/layers/interaction.py | https://www.csie.ntu.edu.tw/~b97053/paper/Rendle2010FM.pdf | 模型相关说明 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/deepctr_torch/layers/interaction.py | DCN_for_PyTorch/deepctr_torch/layers/interaction.py | http://arxiv.org/abs/1708.05027 | 参考论文地址 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/README.md | DCN_for_PyTorch/deepctr_torch/layers/interaction.py | https://arxiv.org/pdf/1905.09433.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/README.md | DCN_for_PyTorch/deepctr_torch/layers/interaction.py | https://arxiv.org/pdf/1905.09433.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/README.md | DCN_for_PyTorch/deepctr_torch/layers/interaction.py | https://arxiv.org/pdf/1803.05170.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/deepctr_torch/layers/interaction.py | DCN_for_PyTorch/deepctr_torch/layers/interaction.py | https://arxiv.org/pdf/1708.04617.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/README.md | DCN_for_PyTorch/deepctr_torch/layers/interaction.py | https://arxiv.org/abs/1810.11921 | 参考论文地址 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/README.md | DCN_for_PyTorch/deepctr_torch/layers/interaction.py | https://arxiv.org/abs/1708.05123 | 参考论文地址 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/README.md | DCN_for_PyTorch/deepctr_torch/layers/interaction.py | https://arxiv.org/abs/2008.13535 | 参考论文地址 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/README.md | DCN_for_PyTorch/deepctr_torch/layers/interaction.py | https://arxiv.org/abs/2008.13535 | 参考论文地址 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/README.md | DCN_for_PyTorch/deepctr_torch/layers/interaction.py | https://arxiv.org/pdf/1611.00144.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/deepctr_torch/layers/interaction.py | DCN_for_PyTorch/deepctr_torch/layers/interaction.py | https://github.com/Atomu2014/product-nets | 源码实现 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/README.md | DCN_for_PyTorch/deepctr_torch/layers/interaction.py | https://arxiv.org/pdf/1611.00144.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/README.md | DCN_for_PyTorch/deepctr_torch/layers/interaction.py | 
http://ir.ia.ac.cn/bitstream/173211/12337/1/A%20Convolutional%20Click%20Prediction%20Model.pdf | 模型相关说明 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/README.md | DCN_for_PyTorch/deepctr_torch/layers/core.py | https://arxiv.org/pdf/1706.06978.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/README.md | DCN_for_PyTorch/deepctr_torch/layers/activation.py | https://arxiv.org/pdf/1706.06978.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/deepctr_torch/layers/activation.py | DCN_for_PyTorch/deepctr_torch/layers/activation.py | https://github.com/zhougr1993/DeepInterestNetwo | 源码实现 | -| 开源代码引入 | https://github.com/shenweichen/DeepCTR-Torch/blob/master/setup.py | DCN_for_PyTorch/deepctr_torch/inputs.py | weichenswc@163.com | 开发者邮箱配置 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|-------------------------------------------------------------------|--------------------|---------| +| ModelZoo-PyTorch/PyTorch/built-in/others/DCN_for_PyTorch/setup.py | weichenswc@163.com | 作者邮箱 | \ No newline at end of file diff --git a/PyTorch/built-in/others/DLRM_for_PyTorch/public_address_statement.md b/PyTorch/built-in/others/DLRM_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..ac4923331412925d449e44f332b9eef888f50a26 --- /dev/null +++ b/PyTorch/built-in/others/DLRM_for_PyTorch/public_address_statement.md @@ -0,0 +1,5 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|---------------------------------------------------------------------------------------------------------------------------|------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/built-in/others/DLRM_for_PyTorch/data_utils.py | https://labs.criteo.com/2014/02/kaggle-display-advertising-challenge-dataset | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/others/DLRM_for_PyTorch/data_utils.py | https://labs.criteo.com/2013/12/download-terabyte-click-logs | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/others/DLRM_for_PyTorch/torchrec_dlrm/scripts/download_Criteo_1TB_Click_Logs_dataset.sh | http://www.vision.caltech.edu/Image_Datasets/Caltech101/Annotations.tar | 数据集链接 | \ No newline at end of file diff --git a/PyTorch/built-in/others/DeCLIP_for_PyTorch/public_address_statement.md b/PyTorch/built-in/others/DeCLIP_for_PyTorch/public_address_statement.md index b228990f4d8f659d5234e746489b949f794565c1..28974954bdbc57501e67c33227ee63f47326c78a 100644 --- a/PyTorch/built-in/others/DeCLIP_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/others/DeCLIP_for_PyTorch/public_address_statement.md @@ -1,5 +1,6 @@ - -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ------------ | ------------------------------------------------------ | ----------------- | -------------------------------------------------------------- | -------------------------------- | -| 开源代码引入 | https://github.com/Sense-GVT/DeCLIP/blob/main/setup.py | ./DeCLIP/setup.py | yuankun@sensetime.com | setuptools的author_email配置选项 | -| 开源代码引入 | https://github.com/Sense-GVT/DeCLIP/blob/main/setup.py | ./DeCLIP/setup.py | http://gitlab.bj.sensetime.com/spring-ce/element/prototype.git | setuptools的url配置选项 | +| 文件位置 | 公网地址 | 公网地址用途 | +|---------------------------------------------------------------------------------------------------|-----------------------------------------------------------------|---------| +| 
ModelZoo-PyTorch/PyTorch/built-in/others/DeCLIP_for_PyTorch/DeCLIP/prototype/tools/inference.py | https://arxiv.org/abs/1610.02391 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/others/DeCLIP_for_PyTorch/DeCLIP/prototype/utils/nnie_helper.py | http://spring.sensetime.com/docs/nart/tutorial/switch/nnie.html | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/others/DeCLIP_for_PyTorch/DeCLIP/setup.py | yuankun@sensetime.com | 作者邮箱 | +| ModelZoo-PyTorch/PyTorch/built-in/others/DeCLIP_for_PyTorch/DeCLIP/setup.py | http://gitlab.bj.sensetime.com/spring-ce/element/prototype.git | 配置相关说明 | \ No newline at end of file diff --git a/PyTorch/built-in/others/GLIP_for_PyTorch/public_address_statement.md b/PyTorch/built-in/others/GLIP_for_PyTorch/public_address_statement.md index 5af0c76e1e2f86060d64689466409188fd9ad5e8..43f1ae82b15d1b304c113392b1ab36fcbe125965 100644 --- a/PyTorch/built-in/others/GLIP_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/others/GLIP_for_PyTorch/public_address_statement.md @@ -1,74 +1,19 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ------------ |-------------------------------------------------------------------------------|--------------------------------------|-------------------------------------------------------------------------------------------------------------------------------|--------------------------| -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/odinw/download.py | ./odinw/download.py | https://huggingface.co/GLIPModel/GLIP/tree/main/odinw_35 | ODINW_35开源数据集在开源社区上的下载链接 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/data/build.py | ./maskrcnn_benchmark/data/build.py | https://github.com/facebookresearch/Detectron/blob/master/configs/getting_started/tutorial_1gpu_e2e_faster_rcnn_R-50-FPN.yaml | 警告信息查询 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/tools/cityscapes/instances2dict_with_polygons.py | ./tools/cityscapes/instances2dict_with_polygons.py | https://github.com/facebookresearch/Detectron/issues/111 | 源码相关说明 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/tools/cityscapes/convert_cityscapes_to_coco.py | ./tools/cityscapes/convert_cityscapes_to_coco.py | https://github.com/facebookresearch/Detectron/tree/master/tools | 源码相关说明 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/utils/stats.py | ./maskrcnn_benchmark/utils/stats.py | https://opensource.org/licenses/MIT | MIT许可公网来源说明 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/utils/model_zoo.py | ./maskrcnn_benchmark/utils/model_zoo.py | https://github.com/pytorch/pytorch/blob/master/torch/utils/model_zoo.py | 公网来源说明 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/utils/model_zoo.py | ./maskrcnn_benchmark/utils/model_zoo.py | https://s3.amazonaws.com/pytorch/models/resnet18-5c106cde.pth | 模型权重公网来源说明 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/utils/model_serialization.py | ./maskrcnn_benchmark/utils/model_serialization.py | https://github.com/google-research/vision_transformer/blob/00883dd691c63a6830751563748663526e811cee/vit_jax/checkpoint.py#L224 | 模型权重公网来源说明 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/utils/imports.py | ./maskrcnn_benchmark/utils/imports.py | https://stackoverflow.com/questions/67631/how-to-import-a-module-given-the-full-path?utm_medium=organic&utm_source=google_rich_qa&utm_campaign=google_rich_qa | 源码说明 | -| 开源代码引入 | 
https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/structures/boxlist_ops.py | ./maskrcnn_benchmark/structures/boxlist_ops.py | https://github.com/kuangliu/torchcv/blob/master/torchcv/utils/box.py | 源码说明 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/structures/boxlist_ops.py | ./maskrcnn_benchmark/structures/boxlist_ops.py | https://github.com/chainer/chainercv/blob/master/chainercv/utils/bbox/bbox_iou.py | 源码说明 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/modeling/rpn/loss.py | ./maskrcnn_benchmark/modeling/rpn/loss.py | https://github.com/yqyao/FCOS_PLUS/blob/0d20ba34ccc316650d8c30febb2eb40cb6eaae37/maskrcnn_benchmark/modeling/rpn/fcos/loss.py#L42 | 源码说明 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/modeling/rpn/dyhead.py | ./maskrcnn_benchmark/modeling/rpn/dyhead.py | https://github.com/ucbdrive/few-shot-object-detection/blob/master/fsdet/modeling/roi_heads/fast_rcnn.py#L448-L464 | 源码说明 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/modeling/backbone/swint_v1.py | ./maskrcnn_benchmark/modeling/backbone/swint_v1.py | https://github.com/SwinTransformer/Swin-Transformer-Object-Detection/blob/master/mmdet/models/backbones/swin_transformer.py | 源码说明 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/modeling/backbone/swint_v1.py | ./maskrcnn_benchmark/modeling/backbone/swint_v1.py | https://arxiv.org/pdf/2103.14030 | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/modeling/backbone/swint_v2_v1.py | ./maskrcnn_benchmark/modeling/backbone/swint_v2_v1.py | https://github.com/SwinTransformer/Swin-Transformer-Object-Detection/blob/master/mmdet/models/backbones/swin_transformer.py | 源码说明 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/modeling/backbone/swint_v2_v1.py | ./maskrcnn_benchmark/modeling/backbone/swint_v2_v1.py | https://arxiv.org/pdf/2103.14030 | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/modeling/backbone/swint_v2.py | ./maskrcnn_benchmark/modeling/backbone/swint_v2.py | https://github.com/SwinTransformer/Swin-Transformer-Object-Detection/blob/master/mmdet/models/backbones/swin_transformer.py | 公网来源说明 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/modeling/backbone/swint_v2.py | ./maskrcnn_benchmark/modeling/backbone/swint_v2.py | https://arxiv.org/pdf/2103.14030 | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/modeling/backbone/swint.py | ./maskrcnn_benchmark/modeling/backbone/swint.py | https://github.com/SwinTransformer/Swin-Transformer-Object-Detection/blob/master/mmdet/models/backbones/swin_transformer.py | 模型源码说明 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/modeling/backbone/swint.py | ./maskrcnn_benchmark/modeling/backbone/swint.py | https://arxiv.org/pdf/2103.14030 | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/modeling/backbone/efficientnet.py | ./maskrcnn_benchmark/modeling/backbone/efficientnet.py | https://arxiv.org/abs/1905.11946 | 参考论文公网来源说明 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/modeling/backbone/efficientnet.py | ./maskrcnn_benchmark/modeling/backbone/efficientnet.py | https://arxiv.org/abs/1911.09665 | 参考论文公网来源说明 | -| 开源代码引入 | 
https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/modeling/backbone/efficientnet.py | ./maskrcnn_benchmark/modeling/backbone/efficientnet.py | https://arxiv.org/abs/1905.11946 | 参考论文公网来源说明 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/modeling/backbone/efficientdet.py | ./maskrcnn_benchmark/modeling/backbone/efficientdet.py | https://stackoverflow.com/a/18348004 | 源码说明 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/modeling/backbone/efficientdet.py | ./maskrcnn_benchmark/modeling/backbone/efficientdet.py | https://publicmodels.blob.core.windows.net/container/aa/efficientnet-b0-355c32eb.pth | 开源权重在开源社区上的下载链接 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/modeling/backbone/efficientdet.py | ./maskrcnn_benchmark/modeling/backbone/efficientdet.py | https://publicmodels.blob.core.windows.net/container/aa/efficientnet-b1-f1951068.pth | 开源权重在开源社区上的下载链接 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/modeling/backbone/efficientdet.py | ./maskrcnn_benchmark/modeling/backbone/efficientdet.py | https://publicmodels.blob.core.windows.net/container/aa/efficientnet-b2-8bb594d6.pth | 开源权重在开源社区上的下载链接 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/modeling/backbone/efficientdet.py | ./maskrcnn_benchmark/modeling/backbone/efficientdet.py | https://publicmodels.blob.core.windows.net/container/aa/efficientnet-b3-5fb5a3c3.pth | 开源权重在开源社区上的下载链接 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/modeling/backbone/efficientdet.py | ./maskrcnn_benchmark/modeling/backbone/efficientdet.py | https://publicmodels.blob.core.windows.net/container/aa/efficientnet-b4-6ed6700e.pth | 开源权重在开源社区上的下载链接 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/modeling/backbone/efficientdet.py | ./maskrcnn_benchmark/modeling/backbone/efficientdet.py | https://publicmodels.blob.core.windows.net/container/aa/efficientnet-b5-b6417697.pth | 开源权重在开源社区上的下载链接 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/modeling/backbone/efficientdet.py | ./maskrcnn_benchmark/modeling/backbone/efficientdet.py | https://publicmodels.blob.core.windows.net/container/aa/efficientnet-b6-c76e70fd.pth | 开源权重在开源社区上的下载链接 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/modeling/backbone/efficientdet.py | ./maskrcnn_benchmark/modeling/backbone/efficientdet.py | https://publicmodels.blob.core.windows.net/container/aa/efficientnet-b7-dcc49843.pth | 开源权重在开源社区上的下载链接 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/modeling/backbone/efficientdet.py | ./maskrcnn_benchmark/modeling/backbone/efficientdet.py | https://publicmodels.blob.core.windows.net/container/advprop/efficientnet-b0-b64d5a18.pth | 开源权重在开源社区上的下载链接 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/modeling/backbone/efficientdet.py | ./maskrcnn_benchmark/modeling/backbone/efficientdet.py | https://publicmodels.blob.core.windows.net/container/advprop/efficientnet-b1-0f3ce85a.pth | 开源权重在开源社区上的下载链接 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/modeling/backbone/efficientdet.py | ./maskrcnn_benchmark/modeling/backbone/efficientdet.py | https://publicmodels.blob.core.windows.net/container/advprop/efficientnet-b2-6e9d97e5.pth | 开源权重在开源社区上的下载链接 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/modeling/backbone/efficientdet.py | 
./maskrcnn_benchmark/modeling/backbone/efficientdet.py | https://publicmodels.blob.core.windows.net/container/advprop/efficientnet-b3-cdd7c0f4.pth | 开源权重在开源社区上的下载链接 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/modeling/backbone/efficientdet.py | ./maskrcnn_benchmark/modeling/backbone/efficientdet.py | https://publicmodels.blob.core.windows.net/container/advprop/efficientnet-b4-44fb3a87.pth | 开源权重在开源社区上的下载链接 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/modeling/backbone/efficientdet.py | ./maskrcnn_benchmark/modeling/backbone/efficientdet.py | https://publicmodels.blob.core.windows.net/container/advprop/efficientnet-b5-86493f6b.pth | 开源权重在开源社区上的下载链接 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/modeling/backbone/efficientdet.py | ./maskrcnn_benchmark/modeling/backbone/efficientdet.py | https://publicmodels.blob.core.windows.net/container/advprop/efficientnet-b6-ac80338e.pth | 开源权重在开源社区上的下载链接 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/modeling/backbone/efficientdet.py | ./maskrcnn_benchmark/modeling/backbone/efficientdet.py | https://publicmodels.blob.core.windows.net/container/advprop/efficientnet-b7-4652b6dd.pth | 开源权重在开源社区上的下载链接 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/modeling/backbone/efficientdet.py | ./maskrcnn_benchmark/modeling/backbone/efficientdet.py | https://publicmodels.blob.core.windows.net/container/advprop/efficientnet-b8-22a8fe65.pth | 开源权重在开源社区上的下载链接 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/modeling/layers/sigmoid_focal_loss.py | ./maskrcnn_benchmark/modeling/layers/sigmoid_focal_loss.py | https://github.com/facebookresearch/fvcore/blob/master/fvcore/nn/focal_loss.py | 源码包地址公网来源说明 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/modeling/layers/sigmoid_focal_loss.py | ./maskrcnn_benchmark/modeling/layers/sigmoid_focal_loss.py | https://arxiv.org/abs/1708.02002 | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/modeling/layers/set_loss.py | ./maskrcnn_benchmark/modeling/layers/set_loss.py | https://giou.stanford.edu/ | 源码包地址公网来源说明 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/modeling/layers/set_loss.py | ./maskrcnn_benchmark/modeling/layers/set_loss.py | https://arxiv.org/abs/1708.02002 | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/modeling/layers/misc.py | ./maskrcnn_benchmark/modeling/layers/misc.py | https://github.com/pytorch/pytorch/issues/12013 | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/modeling/layers/dropblock.py | ./maskrcnn_benchmark/modeling/layers/dropblock.py | https://arxiv.org/abs/1810.12890 | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/modeling/layers/deform_pool.py | ./maskrcnn_benchmark/modeling/layers/deform_pool.py | https://github.com/tensorflow/models/blob/master/research/slim/nets/mobilenet/mobilenet.py | 源码包地址公网来源说明 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/data/transforms/transforms.py | ./maskrcnn_benchmark/data/transforms/transforms.py | https://medium.com/uruvideo/dataset-augmentation-with-random-homographies-a8f4b44830d4 | 源码包地址公网来源说明 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/datasets/voc.py | ./maskrcnn_benchmark/datasets/voc.py | 
https://github.com/rbgirshick/py-faster-rcnn/blob/master/lib/datasets/pascal_voc.py#L208-L211 | 源码包地址公网来源说明 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/data/datasets/modulated_coco.py | ./maskrcnn_benchmark/datasets/modulated_coco.py | http://mscoco.org/dataset/#detections-challenge2016 | 源码包地址公网来源说明 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/data/datasets/modulated_coco.py | ./maskrcnn_benchmark/datasets/modulated_coco.py | https://github.com/python-pillow/Pillow/issues/835 | 源码包地址公网来源说明 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/data/datasets/imagenet.py | ./maskrcnn_benchmark/datasets/imagenet.py | https://github.com/python-pillow/Pillow/issues/835 | 第三方库地址公网来源说明 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/data/datasets/evaluation/voc/voc_eval.py | ./maskrcnn_benchmark/data/datasets/evaluation/voc/voc_eval.py | https://github.com/chainer/chainercv/blob/master/chainercv/evaluations/eval_detection_voc.py | voc_eval公网来源说明 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/data/datasets/evaluation/vg/vg_eval.py | ./maskrcnn_benchmark/data/datasets/evaluation/voc/voc_eval.py | https://github.com/chainer/chainercv/blob/master/chainercv/evaluations/eval_detection_voc.py | vg_eval公网来源说明 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/data/datasets/evaluation/lvis/lvis_eval.py | ./maskrcnn_benchmark/data/datasets/evaluation/lvis/lvis_eval.py | https://github.com/achalddave/large-vocab-devil/blob/9aaddc15b00e6e0d370b16743233e40d973cd53f/scripts/evaluate_ap_fixed.py | lvis_eval公网来源说明 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/data/datasets/evaluation/flickr/flickr_eval.py | ./maskrcnn_benchmark/data/datasets/evaluation/flickr/flickr_eval.py | https://github.com/BryanPlummer/flickr30k_entities/blob/68b3d6f12d1d710f96233f6bd2b6de799d6f4e5b/flickr30k_entities_utils.py | flickr_eval公网来源说明 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/data/datasets/evaluation/flickr/flickr_eval.py | ./maskrcnn_benchmark/data/datasets/evaluation/flickr/flickr_eval.py | https://github.com/kuangliu/torchcv/blob/master/torchcv/utils/box.py | flickr_eval公网来源说明 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/data/datasets/coco_dt.py | ./maskrcnn_benchmark/data/datasets/coco_dt.py | https://github.com/pytorch/vision/blob/13b35ff/references/detection/coco_utils.py | 数据集处理公网来源说明 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/data/datasets/coco.py | ./maskrcnn_benchmark/data/datasets/coco.py | https://github.com/python-pillow/Pillow/issues/835 | 数据集处理公网来源说明 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/data/datasets/coco.py | ./maskrcnn_benchmark/data/datasets/coco.py | http://mscoco.org/dataset/#detections-challenge2016 | 数据集处理公网来源说明 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/csrc/cuda/SigmoidFocalLoss_cuda.cu | ./maskrcnn_benchmark/csrc/cuda/SigmoidFocalLoss_cuda.cu | https://github.com/pytorch/pytorch/blob/master/modules/detectron/sigmoid_focal_loss_op.cu | SigmoidFocalLoss_cuda地址公网来源说明 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/csrc/cuda/SigmoidFocalLoss_cuda.cu | ./maskrcnn_benchmark/csrc/cuda/SigmoidFocalLoss_cuda.cu | cyfu@cs.unc.edu | 邮箱配置公网来源说明 | -| 开源代码引入 | 
https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/csrc/cuda/deform_pool_kernel_cuda.cu | ./maskrcnn_benchmark/csrc/cuda/deform_pool_kernel_cuda.cu | https://github.com/chengdazhi/Deformable-Convolution-V2-PyTorch/blob/mmdetection/mmdet/ops/dcn/src/cuda/deform_psroi_pooling_cuda.cu | deform_pool_kernel_cuda地址公网来源说明 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/csrc/cuda/deform_pool_cuda.cu | ./maskrcnn_benchmark/csrc/cuda/deform_pool_cuda.cu | https://github.com/chengdazhi/Deformable-Convolution-V2-PyTorch/blob/mmdetection/mmdet/ops/dcn/src/modulated_dcn_cuda.c | deform_pool_cuda地址公网来源说明 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/csrc/cuda/deform_pool_cuda.cu | ./maskrcnn_benchmark/csrc/cuda/deform_pool_cuda.cu | https://github.com/torch/cunn/blob/master/lib/THCUNN/generic/SpatialConvolutionMM.cu | deform_pool_cuda地址公网来源说明 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/csrc/cuda/deform_conv_kernel_cuda.cu | ./maskrcnn_benchmark/csrc/cuda/deform_conv_kernel_cuda.cu | https://arxiv.org/abs/1703.06211 | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/csrc/cuda/deform_conv_kernel_cuda.cu | ./maskrcnn_benchmark/csrc/cuda/deform_conv_kernel_cuda.cu | https://github.com/chengdazhi/Deformable-Convolution-V2-PyTorch/blob/mmdetection/mmdet/ops/dcn/src/deform_conv_cuda_kernel.cu | deform_conv_kernel_cuda地址公网来源说明 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/csrc/cuda/deform_conv_cuda.cu | ./maskrcnn_benchmark/csrc/cuda/deform_conv_cuda.cu | https://github.com/chengdazhi/Deformable-Convolution-V2-PyTorch/blob/mmdetection/mmdet/ops/dcn/src/deform_conv_cuda.c | deform_conv_cuda地址公网来源说明 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/config/paths_catalog.py | ./maskrcnn_benchmark/config/paths_catalog.py | https://dl.fbaipublicfiles.com/detectron | S3_C2_DETECTRON_URL在开源网站上的地址 | -| 开源代码引入 | https://github.com/microsoft/GLIP/blob/main/maskrcnn_benchmark/config/defaults.py | ./maskrcnn_benchmark/config/defaults.py | https://github.com/ucbdrive/few-shot-object-detection/blob/master/fsdet/modeling/roi_heads/fast_rcnn.py#L448-L464 | fast_rcnn.py地址公网来源说明 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|----------------------------------------------------------------------------------------------------------------|-------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/built-in/others/GLIP_for_PyTorch/maskrcnn_benchmark/modeling/backbone/efficientdet.py | https://publicmodels.blob.core.windows.net/container/advprop/efficientnet-b8-22a8fe65.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/others/GLIP_for_PyTorch/maskrcnn_benchmark/modeling/backbone/efficientdet.py | https://publicmodels.blob.core.windows.net/container/advprop/efficientnet-b7-4652b6dd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/others/GLIP_for_PyTorch/maskrcnn_benchmark/modeling/backbone/efficientdet.py | https://publicmodels.blob.core.windows.net/container/aa/efficientnet-b7-dcc49843.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/others/GLIP_for_PyTorch/maskrcnn_benchmark/modeling/backbone/efficientdet.py | https://publicmodels.blob.core.windows.net/container/advprop/efficientnet-b6-ac80338e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/others/GLIP_for_PyTorch/maskrcnn_benchmark/modeling/backbone/efficientdet.py | 
https://publicmodels.blob.core.windows.net/container/aa/efficientnet-b6-c76e70fd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/others/GLIP_for_PyTorch/maskrcnn_benchmark/modeling/backbone/efficientdet.py | https://publicmodels.blob.core.windows.net/container/advprop/efficientnet-b5-86493f6b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/others/GLIP_for_PyTorch/maskrcnn_benchmark/modeling/backbone/efficientdet.py | https://publicmodels.blob.core.windows.net/container/aa/efficientnet-b5-b6417697.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/others/GLIP_for_PyTorch/maskrcnn_benchmark/modeling/backbone/efficientdet.py | https://publicmodels.blob.core.windows.net/container/advprop/efficientnet-b4-44fb3a87.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/others/GLIP_for_PyTorch/maskrcnn_benchmark/modeling/backbone/efficientdet.py | https://publicmodels.blob.core.windows.net/container/aa/efficientnet-b4-6ed6700e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/others/GLIP_for_PyTorch/maskrcnn_benchmark/modeling/backbone/efficientdet.py | https://publicmodels.blob.core.windows.net/container/advprop/efficientnet-b3-cdd7c0f4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/others/GLIP_for_PyTorch/maskrcnn_benchmark/modeling/backbone/efficientdet.py | https://publicmodels.blob.core.windows.net/container/aa/efficientnet-b3-5fb5a3c3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/others/GLIP_for_PyTorch/maskrcnn_benchmark/modeling/backbone/efficientdet.py | https://publicmodels.blob.core.windows.net/container/advprop/efficientnet-b2-6e9d97e5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/others/GLIP_for_PyTorch/maskrcnn_benchmark/modeling/backbone/efficientdet.py | https://publicmodels.blob.core.windows.net/container/aa/efficientnet-b2-8bb594d6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/others/GLIP_for_PyTorch/maskrcnn_benchmark/modeling/backbone/efficientdet.py | https://publicmodels.blob.core.windows.net/container/advprop/efficientnet-b1-0f3ce85a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/others/GLIP_for_PyTorch/maskrcnn_benchmark/modeling/backbone/efficientdet.py | https://publicmodels.blob.core.windows.net/container/aa/efficientnet-b1-f1951068.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/others/GLIP_for_PyTorch/maskrcnn_benchmark/modeling/backbone/efficientdet.py | https://publicmodels.blob.core.windows.net/container/advprop/efficientnet-b0-b64d5a18.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/built-in/others/GLIP_for_PyTorch/maskrcnn_benchmark/modeling/backbone/efficientdet.py | https://publicmodels.blob.core.windows.net/container/aa/efficientnet-b0-355c32eb.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/built-in/others/OpenFold_for_PyTorch/public_address_statement.md b/PyTorch/built-in/others/OpenFold_for_PyTorch/public_address_statement.md index a78a02e41f95e05c5276abbf277e6d648d0411ef..7b644e44b9a7a99eca7013a4d8fec12a4761e71d 100644 --- a/PyTorch/built-in/others/OpenFold_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/others/OpenFold_for_PyTorch/public_address_statement.md @@ -1,56 +1,46 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ---- |----------------------------------------------------------------------------------------------------|----------------------------------------------------------------------|--------------------------------------------------------------------------------------------------------------------------------------------------|-------------------------------| -| 开源代码引入 | 
https://github.com/aqlaboratory/openfold/blob/main/environment.yml | OpenFold_for_PyTorch/environment.yml | git+https://github.com/NVIDIA/dllogger.git | dllogger在开源社区上的git下载链接 | -| 开源代码引入 | https://github.com/aqlaboratory/openfold/blob/main/environment.yml | OpenFold_for_PyTorch/environment.yml | git+https://github.com/Dao-AILab/flash-attention.git@5b838a8 | flash-attention在开源社区上的git下载链接 | -| 开源代码引入 | https://github.com/aqlaboratory/openfold/blob/main/setup.py | OpenFold_for_PyTorch/setup.py | https://github.com/aqlaboratory/openfold | openfold在开源社区上的git下载链接 | -| 开源代码引入 | https://github.com/aqlaboratory/openfold/blob/main/CITATION.cff | OpenFold_for_PyTorch/CITATION.cff | https://orcid.org/0000-0001-8283-5324 | orcid在开源社区上的地址 | -| 开源代码引入 | https://github.com/aqlaboratory/openfold/blob/main/CITATION.cff | OpenFold_for_PyTorch/CITATION.cff | https://orcid.org/0000-0002-6524-874X | orcid在开源社区上的地址 | -| 开源代码引入 | https://github.com/aqlaboratory/openfold/blob/main/CITATION.cff | OpenFold_for_PyTorch/CITATION.cff | https://orcid.org/0000-0002-6079-7627 | orcid在开源社区上的地址 | -| 开源代码引入 | https://github.com/aqlaboratory/openfold/blob/main/CITATION.cff | OpenFold_for_PyTorch/CITATION.cff | https://orcid.org/0000-0002-9777-6192 | orcid在开源社区上的地址 | -| 开源代码引入 | https://github.com/aqlaboratory/openfold/blob/main/CITATION.cff | OpenFold_for_PyTorch/CITATION.cff | https://orcid.org/0000-0002-9949-069X | orcid在开源社区上的地址 | -| 开源代码引入 | https://github.com/aqlaboratory/openfold/blob/main/CITATION.cff | OpenFold_for_PyTorch/CITATION.cff | https://orcid.org/0000-0003-4631-0947 | orcid在开源社区上的地址 | -| 开源代码引入 | https://github.com/aqlaboratory/openfold/blob/main/CITATION.cff | OpenFold_for_PyTorch/CITATION.cff | https://orcid.org/0000-0002-3093-3587 | orcid在开源社区上的地址 | -| 开源代码引入 | https://github.com/aqlaboratory/openfold/blob/main/CITATION.cff | OpenFold_for_PyTorch/CITATION.cff | https://orcid.org/0000-0002-9714-2827 | orcid在开源社区上的地址 | -| 开源代码引入 | https://github.com/aqlaboratory/openfold/blob/main/CITATION.cff | OpenFold_for_PyTorch/CITATION.cff | https://orcid.org/0000-0002-3351-9584 | orcid在开源社区上的地址 | -| 开源代码引入 | https://github.com/aqlaboratory/openfold/blob/main/CITATION.cff | OpenFold_for_PyTorch/CITATION.cff | https://orcid.org/0000-0003-4942-9652 | orcid在开源社区上的地址 | -| 开源代码引入 | https://github.com/aqlaboratory/openfold/blob/main/CITATION.cff | OpenFold_for_PyTorch/CITATION.cff | https://orcid.org/0000-0003-4942-9652 | orcid在开源社区上的地址 | -| 开源代码引入 | https://github.com/aqlaboratory/openfold/blob/main/CITATION.cff | OpenFold_for_PyTorch/CITATION.cff | https://orcid.org/0000-0003-0759-2080 | orcid在开源社区上的地址 | -| 开源代码引入 | https://github.com/aqlaboratory/openfold/blob/main/CITATION.cff | OpenFold_for_PyTorch/CITATION.cff | https://orcid.org/0000-0003-2661-4388 | orcid在开源社区上的地址 | -| 开源代码引入 | https://github.com/aqlaboratory/openfold/blob/main/CITATION.cff | OpenFold_for_PyTorch/CITATION.cff | https://orcid.org/0000-0001-8228-1042 | orcid在开源社区上的地址 | -| 开源代码引入 | https://github.com/aqlaboratory/openfold/blob/main/CITATION.cff | OpenFold_for_PyTorch/CITATION.cff | https://orcid.org/0000-0003-1617-1720 | orcid在开源社区上的地址 | -| 开源代码引入 | https://github.com/aqlaboratory/openfold/blob/main/CITATION.cff | OpenFold_for_PyTorch/CITATION.cff | https://orcid.org/0000-0002-2820-0050 | orcid在开源社区上的地址 | -| 开源代码引入 | https://github.com/aqlaboratory/openfold/blob/main/CITATION.cff | OpenFold_for_PyTorch/CITATION.cff | https://orcid.org/0000-0002-3657-8053 | orcid在开源社区上的地址 | -| 开源代码引入 | https://github.com/aqlaboratory/openfold/blob/main/CITATION.cff | 
OpenFold_for_PyTorch/CITATION.cff | https://orcid.org/0000-0002-1909-0961 | orcid在开源社区上的地址 | -| 开源代码引入 | https://github.com/aqlaboratory/openfold/blob/main/CITATION.cff | OpenFold_for_PyTorch/CITATION.cff | https://orcid.org/0000-0003-3698-3574 | orcid在开源社区上的地址 | -| 开源代码引入 | https://github.com/aqlaboratory/openfold/blob/main/CITATION.cff | OpenFold_for_PyTorch/CITATION.cff | https://orcid.org/0000-0003-3698-3574 | orcid在开源社区上的地址 | -| 开源代码引入 | https://github.com/aqlaboratory/openfold/blob/main/CITATION.cff | OpenFold_for_PyTorch/CITATION.cff | https://orcid.org/0000-0002-3364-1838 | orcid在开源社区上的地址 | -| 开源代码引入 | https://github.com/aqlaboratory/openfold/blob/main/CITATION.cff | OpenFold_for_PyTorch/CITATION.cff | https://orcid.org/0000-0001-5921-0035 | orcid在开源社区上的地址 | -| 开源代码引入 | https://github.com/aqlaboratory/openfold/blob/main/CITATION.cff | OpenFold_for_PyTorch/CITATION.cff | https://orcid.org/0000-0003-4354-7906 | orcid在开源社区上的地址 | -| 开源代码引入 | https://github.com/aqlaboratory/openfold/blob/main/CITATION.cff | OpenFold_for_PyTorch/CITATION.cff | https://orcid.org/0000-0001-6817-1322 | orcid在开源社区上的地址 | -| 开源代码引入 | https://github.com/aqlaboratory/openfold/blob/main/CITATION.cff | OpenFold_for_PyTorch/CITATION.cff | https://doi.org/10.1101/2022.11.20.517210 | doi在开源社区上的地址 | -| 开源代码引入 | https://github.com/aqlaboratory/openfold/blob/main/Dockerfile | OpenFold_for_PyTorch/Dockerfile | https://github.com/aqlaboratory/openfold | openfold在开源社区上的地址 | -| 开源代码引入 | https://github.com/aqlaboratory/openfold/blob/main/Dockerfile | OpenFold_for_PyTorch/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64/7fa2af80.pub | openfold使用的镜像在开源社区上的地址 | -| 开源代码引入 | https://github.com/aqlaboratory/openfold/blob/main/Dockerfile | OpenFold_for_PyTorch/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64/3bf863cc.pub | openfold使用的镜像在开源社区上的地址 | -| 开源代码引入 | https://github.com/aqlaboratory/openfold/blob/main/Dockerfile | OpenFold_for_PyTorch/Dockerfile | https://github.com/conda-forge/miniforge/releases/download/23.3.1-1/Miniforge3-Linux-x86_64.sh | conda-forge在开源社区上的地址 | -| 开源代码引入 | https://github.com/aqlaboratory/openfold/blob/main/Dockerfile | OpenFold_for_PyTorch/Dockerfile | https://git.scicore.unibas.ch/schwede/openstructure/-/raw/7102c63615b64735c4941278d92b554ec94415f8/modules/mol/alg/src/stereo_chemical_props.txt | openfold使用的镜像在开源社区上的地址 | -| 开源代码引入 | https://github.com/aqlaboratory/openfold/blob/main/README.md | OpenFold_for_PyTorch/README.md | https://github.com/deepmind/alphafold | AlphaFold 2在开源社区上的地址 | -| 开源代码引入 | https://github.com/aqlaboratory/openfold/blob/main/README.md | OpenFold_for_PyTorch/README.md | https://openfold.readthedocs.io/en/latest/ | openfold文档在开源社区上的地址 | -| 开源代码引入 | https://github.com/aqlaboratory/openfold/blob/main/README.md | OpenFold_for_PyTorch/README.md | https://github.com/aqlaboratory/openfold/blob/main/docs/source/original_readme.md | openfold文档在开源社区上的地址 | -| 开源代码引入 | https://github.com/aqlaboratory/openfold/blob/main/README.md | OpenFold_for_PyTorch/README.md | https://www.biorxiv.org/content/10.1101/2022.11.20.517210 | openfold论文地址 | -| 开源代码引入 | https://github.com/aqlaboratory/openfold/blob/main/README.md | OpenFold_for_PyTorch/README.md | https://www.biorxiv.org/content/early/2022/11/22/2022.11.20.517210.full.pdf | openfold论文地址 | -| 开源代码引入 | https://github.com/aqlaboratory/openfold/blob/main/README.md | OpenFold_for_PyTorch/README.md | https://www.nature.com/articles/s41586-021-03819-2 | openfold论文地址 | -| 
开源代码引入 | https://github.com/aqlaboratory/openfold/blob/main/scripts/download_alphafold_params.sh | OpenFold_for_PyTorch/scripts/download_alphafold_params.sh | https://storage.googleapis.com/alphafold/alphafold_params_2022-12-06.tar | alphafold权重地址 | -| 开源代码引入 | https://github.com/aqlaboratory/openfold/blob/main/scripts/download_bfd.sh | OpenFold_for_PyTorch/scripts/download_bfd.sh | https://bfd.mmseqs.com/bfd_metaclust_clu_complete_id30_c90_final_seq.sorted_opt.tar.gz | bfd数据地址 | -| 开源代码引入 | https://github.com/aqlaboratory/openfold/blob/main/scripts/download_bfd.sh | OpenFold_for_PyTorch/scripts/download_bfd.sh | https://storage.googleapis.com/alphafold-databases/casp14_versions/bfd_metaclust_clu_complete_id30_c90_final_seq.sorted_opt.tar.gz | alphafold数据地址 | -| 开源代码引入 | https://github.com/aqlaboratory/openfold/blob/main/scripts/download_cameo.py | OpenFold_for_PyTorch/scripts/download_cameo.py | https://www.cameo3d.org/ | cameo3d地址 | -| 开源代码引入 | https://github.com/aqlaboratory/openfold/blob/main/scripts/download_cameo.py | OpenFold_for_PyTorch/scripts/download_cameo.py | https://files.rcsb.org/view/{pdb_id.upper()}.cif | rcsb地址 | -| 开源代码引入 | https://github.com/aqlaboratory/openfold/blob/main/scripts/download_mgnify.sh | OpenFold_for_PyTorch/scripts/download_mgnify.sh | https://storage.googleapis.com/alphafold-databases/v2.3/mgy_clusters_2022_05.fa.gz | mgy地址 | -| 开源代码引入 | https://github.com/aqlaboratory/openfold/blob/main/scripts/download_openfold_params_gdrive.sh | OpenFold_for_PyTorch/scripts/download_openfold_params_gdrive.sh | https://docs.google.com/uc?export=download&id=${FILE_ID} | 缓存文件下载地址 | -| 开源代码引入 | https://github.com/aqlaboratory/openfold/blob/main/scripts/download_openfold_params_gdrive.sh | OpenFold_for_PyTorch/scripts/download_openfold_params_gdrive.sh | https://docs.google.com/uc?export=download&confirm=${CONFIRM}&id=${FILE_ID} | 缓存文件下载地址 | -| 开源代码引入 | https://github.com/aqlaboratory/openfold/blob/main/scripts/download_openfold_params_huggingface.sh | OpenFold_for_PyTorch/scripts/download_openfold_params_huggingface.sh | https://huggingface.co/nz/OpenFold | openfold在hf的地址 | -| 开源代码引入 | https://github.com/aqlaboratory/openfold/blob/main/scripts/fasta_to_clusterfile.py | OpenFold_for_PyTorch/scripts/fasta_to_clusterfile.py | https://github.com/soedinglab/MMseqs2/issues/452 | soedinglab issue的地址 | -| 开源代码引入 | https://github.com/aqlaboratory/openfold/blob/main/scripts/install_hh_suite.sh | OpenFold_for_PyTorch/scripts/install_hh_suite.sh | https://github.com/soedinglab/hh-suite.git /tmp/hh-suite | soedinglab仓库地址 | -| 开源代码引入 | https://github.com/aqlaboratory/openfold/blob/main/openfold/data/data_modules.py | OpenFold_for_PyTorch/openfold/data/data_modules.py | https://github.com/deepmind/alphafold/blob/6c4d833fbd1c6b8e7c9a21dae5d4ada2ce777e10/run_alphafold.py#L462C1-L477 | alphafold代码地址 | -| 开源代码引入 | https://github.com/aqlaboratory/openfold/blob/main/openfold/data/data_modules.py | OpenFold_for_PyTorch/openfold/data/data_modules.py | https://www.biorxiv.org/content/10.1101/2021.10.04.463034v2.full.pdf | alphafold论文地址 | -| 开源代码引入 | https://github.com/aqlaboratory/openfold/blob/main/openfold/data/mmcif_parsing.py | OpenFold_for_PyTorch/openfold/data/mmcif_parsing.py | http://mmcif.wwpdb.org/docs/tutorials/mechanics/pdbx-mmcif-syntax.html | PDBx/mmCIFS地址 | -| 开源代码引入 | https://github.com/aqlaboratory/openfold/blob/main/openfold/data/parsers.py | OpenFold_for_PyTorch/openfold/data/parsers.py | http://eddylab.org/software/hmmer/Userguide.pdf | HMMER文档地址 | \ No newline at end of file 
+| 文件位置 | 公网地址 | 公网地址用途 | +|----------------------------------------------------------------------------------------------------|------------------------------------------------------------------------------------------------------------------------------------|---------------| +| ModelZoo-PyTorch/PyTorch/built-in/others/OpenFold_for_PyTorch/CITATION.cff | https://doi.org/10.1101/2022.11.20.517210 | doi在开源社区上的地址 | +| ModelZoo-PyTorch/PyTorch/built-in/others/OpenFold_for_PyTorch/CITATION.cff | https://orcid.org/0000-0003-4942-9652 | 作者ORCID地址 | +| ModelZoo-PyTorch/PyTorch/built-in/others/OpenFold_for_PyTorch/CITATION.cff | https://orcid.org/0000-0003-4942-9652 | 作者ORCID地址 | +| ModelZoo-PyTorch/PyTorch/built-in/others/OpenFold_for_PyTorch/CITATION.cff | https://orcid.org/0000-0003-4631-0947 | 作者ORCID地址 | +| ModelZoo-PyTorch/PyTorch/built-in/others/OpenFold_for_PyTorch/CITATION.cff | https://orcid.org/0000-0003-4354-7906 | 作者ORCID地址 | +| ModelZoo-PyTorch/PyTorch/built-in/others/OpenFold_for_PyTorch/CITATION.cff | https://orcid.org/0000-0003-3698-3574 | 作者ORCID地址 | +| ModelZoo-PyTorch/PyTorch/built-in/others/OpenFold_for_PyTorch/CITATION.cff | https://orcid.org/0000-0003-3698-3574 | 作者ORCID地址 | +| ModelZoo-PyTorch/PyTorch/built-in/others/OpenFold_for_PyTorch/CITATION.cff | https://orcid.org/0000-0003-2661-4388 | 作者ORCID地址 | +| ModelZoo-PyTorch/PyTorch/built-in/others/OpenFold_for_PyTorch/CITATION.cff | https://orcid.org/0000-0003-1617-1720 | 作者ORCID地址 | +| ModelZoo-PyTorch/PyTorch/built-in/others/OpenFold_for_PyTorch/CITATION.cff | https://orcid.org/0000-0003-0759-2080 | 作者ORCID地址 | +| ModelZoo-PyTorch/PyTorch/built-in/others/OpenFold_for_PyTorch/CITATION.cff | https://orcid.org/0000-0002-9949-069X | 作者ORCID地址 | +| ModelZoo-PyTorch/PyTorch/built-in/others/OpenFold_for_PyTorch/CITATION.cff | https://orcid.org/0000-0002-9777-6192 | 作者ORCID地址 | +| ModelZoo-PyTorch/PyTorch/built-in/others/OpenFold_for_PyTorch/CITATION.cff | https://orcid.org/0000-0002-9714-2827 | 作者ORCID地址 | +| ModelZoo-PyTorch/PyTorch/built-in/others/OpenFold_for_PyTorch/CITATION.cff | https://orcid.org/0000-0002-6524-874X | 作者ORCID地址 | +| ModelZoo-PyTorch/PyTorch/built-in/others/OpenFold_for_PyTorch/CITATION.cff | https://orcid.org/0000-0002-6079-7627 | 作者ORCID地址 | +| ModelZoo-PyTorch/PyTorch/built-in/others/OpenFold_for_PyTorch/CITATION.cff | https://orcid.org/0000-0002-3657-8053 | 作者ORCID地址 | +| ModelZoo-PyTorch/PyTorch/built-in/others/OpenFold_for_PyTorch/CITATION.cff | https://orcid.org/0000-0002-3364-1838 | 作者ORCID地址 | +| ModelZoo-PyTorch/PyTorch/built-in/others/OpenFold_for_PyTorch/CITATION.cff | https://orcid.org/0000-0002-3351-9584 | 作者ORCID地址 | +| ModelZoo-PyTorch/PyTorch/built-in/others/OpenFold_for_PyTorch/CITATION.cff | https://orcid.org/0000-0002-3093-3587 | 作者ORCID地址 | +| ModelZoo-PyTorch/PyTorch/built-in/others/OpenFold_for_PyTorch/CITATION.cff | https://orcid.org/0000-0002-2820-0050 | 作者ORCID地址 | +| ModelZoo-PyTorch/PyTorch/built-in/others/OpenFold_for_PyTorch/CITATION.cff | https://orcid.org/0000-0002-1909-0961 | 作者ORCID地址 | +| ModelZoo-PyTorch/PyTorch/built-in/others/OpenFold_for_PyTorch/CITATION.cff | https://orcid.org/0000-0001-8283-5324 | 作者ORCID地址 | +| ModelZoo-PyTorch/PyTorch/built-in/others/OpenFold_for_PyTorch/CITATION.cff | https://orcid.org/0000-0001-8228-1042 | 作者ORCID地址 | +| ModelZoo-PyTorch/PyTorch/built-in/others/OpenFold_for_PyTorch/CITATION.cff | https://orcid.org/0000-0001-6817-1322 | 作者ORCID地址 | +| ModelZoo-PyTorch/PyTorch/built-in/others/OpenFold_for_PyTorch/CITATION.cff | https://orcid.org/0000-0001-5921-0035 | 作者ORCID地址 | +| ModelZoo-PyTorch/PyTorch/built-in/others/OpenFold_for_PyTorch/Dockerfile | 
https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64/7fa2af80.pub | 公钥链接 | +| ModelZoo-PyTorch/PyTorch/built-in/others/OpenFold_for_PyTorch/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64/3bf863cc.pub | 公钥链接 | +| ModelZoo-PyTorch/PyTorch/built-in/others/OpenFold_for_PyTorch/openfold/data/mmcif_parsing.py | http://mmcif.wwpdb.org/docs/tutorials/mechanics/pdbx-mmcif-syntax.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/built-in/others/OpenFold_for_PyTorch/scripts/download_alphafold_params.sh | https://storage.googleapis.com/alphafold/alphafold_params_2022-12-06.tar | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/others/OpenFold_for_PyTorch/scripts/download_bfd.sh | https://storage.googleapis.com/alphafold-databases/casp14_versions/bfd_metaclust_clu_complete_id30_c90_final_seq.sorted_opt.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/others/OpenFold_for_PyTorch/scripts/download_cameo.py | https://files.rcsb.org/view/{pdb_id.upper()}.cif | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/others/OpenFold_for_PyTorch/scripts/download_cameo.py | https://www.cameo3d.org/ | cameo3d地址 | +| ModelZoo-PyTorch/PyTorch/built-in/others/OpenFold_for_PyTorch/scripts/download_colabfold_envdb.sh | http://wwwuser.gwdg.de/~compbiol/colabfold/colabfold_envdb_202108.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/others/OpenFold_for_PyTorch/scripts/download_mgnify.sh | https://storage.googleapis.com/alphafold-databases/v2.3/mgy_clusters_2022_05.fa.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/others/OpenFold_for_PyTorch/scripts/download_pdb_mmcif.sh | https://storage.googleapis.com/criteo-cail-datasets/day_ | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/built-in/others/OpenFold_for_PyTorch/scripts/download_pdb_seqres.sh | ftp://ftp.wwpdb.org/pub/pdb/derived_data/pdb_seqres.txt | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/others/OpenFold_for_PyTorch/scripts/download_pdb70.sh | http://wwwuser.gwdg.de/~compbiol/data/hhsuite/databases/hhsuite_dbs/old-releases/pdb70_from_mmcif_200401.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/others/OpenFold_for_PyTorch/scripts/download_small_bfd.sh | https://storage.googleapis.com/alphafold-databases/reduced_dbs/bfd-first_non_consensus_sequences.fasta.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/others/OpenFold_for_PyTorch/scripts/download_uniclust30.sh | https://storage.googleapis.com/alphafold-databases/casp14_versions/uniclust30_2018_08_hhsuite.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/others/OpenFold_for_PyTorch/scripts/download_uniprot.sh | ftp://ftp.ebi.ac.uk/pub/databases/uniprot/current_release/knowledgebase/complete/uniprot_trembl.fasta.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/others/OpenFold_for_PyTorch/scripts/download_uniprot.sh | ftp://ftp.ebi.ac.uk/pub/databases/uniprot/current_release/knowledgebase/complete/uniprot_sprot.fasta.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/others/OpenFold_for_PyTorch/scripts/download_uniref30.sh | https://storage.googleapis.com/alphafold-databases/v2.3/UniRef30_2021_03.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/others/OpenFold_for_PyTorch/scripts/download_uniref90.sh | ftp://ftp.uniprot.org/pub/databases/uniprot/uniref/uniref90/uniref90.fasta.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/built-in/others/OpenFold_for_PyTorch/setup.py | jennifer.wei@omsf.io | 作者邮箱 | \ No newline at end of file diff --git a/PyTorch/built-in/rl/MAPPO_for_PyTorch/public_address_statement.md 
b/PyTorch/built-in/rl/MAPPO_for_PyTorch/public_address_statement.md index 925bb54208ed57aab9e818705d8f0233560c3ff7..e1620441d6a8a560a2454a095c1f2a96c8d0db73 100644 --- a/PyTorch/built-in/rl/MAPPO_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/rl/MAPPO_for_PyTorch/public_address_statement.md @@ -1,34 +1,3 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ------------ |--------------------------------------------------------------------------------------------|-----------------------------------------|----------------------------------------------------------------------|-----------------------| -| 开源代码引入 | https://github.com/marlbenchmark/on-policy/blob/main/onpolicy/envs/starcraft2/smac_maps.py | ./onpolicy/envs/starcraft2/smac_maps.py | https://github.com/oxwhirl/smac#smac-maps | smac开源仿真环境在开源社区上的下载链接 | -| 开源代码引入 | https://github.com/marlbenchmark/on-policy/blob/main/setup.py | ./onpolicy/envs/starcraft2/smac_maps.py | https://packaging.python.org/guides/single-sourcing-package-version/ | packaging.python.org源码公网来源说明 | -| 开源代码引入 | https://github.com/marlbenchmark/on-policy/blob/main/setup.py | ./onpolicy/envs/starcraft2/smac_maps.py | zoeyuchao@gmail.com | 作者邮箱配置 | -| 开源代码引入 | https://github.com/marlbenchmark/on-policy/blob/main/utils/multi_discrete.py | ./onpolicy/utils/multi_discrete.py | https://github.com/openai/gym/blob/1fb81d4e3fb780ccf77fec731287ba07da35eb84/gym/spaces/multi_discrete.py | github.com/openai源码公网来源说明 | -| 开源代码引入 | https://github.com/marlbenchmark/on-policy/blob/main/envs/mpe/rendering.py | ./onpolicy/envs/mpe/rendering.py | https://github.com/openai/gym-http-api/issues/2 | github.com/openai/gym-http-api/issues引用的公网来源说明 | -| 开源代码引入 | https://github.com/marlbenchmark/on-policy/blob/main/envs/mpe/multi_discrete.py | ./onpolicy/envs/mpe/multi_discrete.py | https://github.com/openai/gym/blob/1fb81d4e3fb780ccf77fec731287ba07da35eb84/gym/spaces/multi_discrete.py | github.com/openai/gym源码公网来源说明 | -| 开源代码引入 | https://github.com/marlbenchmark/on-policy/blob/main/envs/hanabi/pyhanabi.py | ./onpolicy/envs/hanabi/pyhanabi.py | https://www.apache.org/licenses/LICENSE-2.0 | apache.org/licenses引用的公网来源说明 | -| 开源代码引入 | https://github.com/marlbenchmark/on-policy/blob/main/envs/hanabi/pyhanabi.h | ./onpolicy/envs/hanabi/pyhanabi.h | https://www.apache.org/licenses/LICENSE-2.0 | apache.org/licenses引用的公网来源说明 | -| 开源代码引入 | https://github.com/marlbenchmark/on-policy/blob/main/envs/hanabi/pyhanabi.cc | ./onpolicy/envs/hanabi/pyhanabi.cc | https://www.apache.org/licenses/LICENSE-2.0 | apache.org/licenses引用的公网来源说明 | -| 开源代码引入 | https://github.com/marlbenchmark/on-policy/blob/main/envs/hanabi/hanabi_lib/util.h | ./onpolicy/envs/hanabi/hanabi_lib/util.h | https://www.apache.org/licenses/LICENSE-2.0 | apache.org/licenses引用的公网来源说明 | -| 开源代码引入 | https://github.com/marlbenchmark/on-policy/blob/main/envs/hanabi/hanabi_lib/util.cc | ./onpolicy/envs/hanabi/hanabi_lib/util.cc | https://www.apache.org/licenses/LICENSE-2.0 | apache.org/licenses引用的公网来源说明 | -| 开源代码引入 | https://github.com/marlbenchmark/on-policy/blob/main/envs/hanabi/hanabi_lib/observation_encoder.h | ./onpolicy/envs/hanabi/hanabi_lib/observation_encoder.h | https://www.apache.org/licenses/LICENSE-2.0 | apache.org/licenses引用的公网来源说明 | -| 开源代码引入 | https://github.com/marlbenchmark/on-policy/blob/main/envs/hanabi/hanabi_lib/hanabi_state.h | ./onpolicy/envs/hanabi/hanabi_lib/hanabi_state.h | https://www.apache.org/licenses/LICENSE-2.0 | apache.org/licenses引用的公网来源说明 | -| 开源代码引入 | 
https://github.com/marlbenchmark/on-policy/blob/main/envs/hanabi/hanabi_lib/hanabi_state.cc | ./onpolicy/envs/hanabi/hanabi_lib/hanabi_state.cc | https://www.apache.org/licenses/LICENSE-2.0 | apache.org/licenses引用的公网来源说明 | -| 开源代码引入 | https://github.com/marlbenchmark/on-policy/blob/main/envs/hanabi/hanabi_lib/hanabi_state.cc | ./onpolicy/envs/hanabi/hanabi_lib/hanabi_state.cc | https://www.apache.org/licenses/LICENSE-2.0 | apache.org/licenses引用的公网来源说明 | -| 开源代码引入 | https://github.com/marlbenchmark/on-policy/blob/main/envs/hanabi/hanabi_lib/hanabi_observation.h | ./onpolicy/envs/hanabi/hanabi_lib/hanabi_observation.h | https://www.apache.org/licenses/LICENSE-2.0 | apache.org/licenses引用的公网来源说明 | -| 开源代码引入 | https://github.com/marlbenchmark/on-policy/blob/main/envs/hanabi/hanabi_lib/hanabi_observation.cc | ./onpolicy/envs/hanabi/hanabi_lib/hanabi_observation.cc | https://www.apache.org/licenses/LICENSE-2.0 | apache.org/licenses引用的公网来源说明 | -| 开源代码引入 | https://github.com/marlbenchmark/on-policy/blob/main/envs/hanabi/hanabi_lib/hanabi_move.h | ./onpolicy/envs/hanabi/hanabi_lib/hanabi_move.h | https://www.apache.org/licenses/LICENSE-2.0 | apache.org/licenses引用的公网来源说明 | -| 开源代码引入 | https://github.com/marlbenchmark/on-policy/blob/main/envs/hanabi/hanabi_lib/hanabi_move.cc | ./onpolicy/envs/hanabi/hanabi_lib/hanabi_move.cc | https://www.apache.org/licenses/LICENSE-2.0 | apache.org/licenses引用的公网来源说明 | -| 开源代码引入 | https://github.com/marlbenchmark/on-policy/blob/main/envs/hanabi/hanabi_lib/hanabi_history_item.h | ./onpolicy/envs/hanabi/hanabi_lib/hanabi_history_item.h | https://www.apache.org/licenses/LICENSE-2.0 | apache.org/licenses引用的公网来源说明 | -| 开源代码引入 | https://github.com/marlbenchmark/on-policy/blob/main/envs/hanabi/hanabi_lib/hanabi_history_item.cc | ./onpolicy/envs/hanabi/hanabi_lib/hanabi_history_item.cc | https://www.apache.org/licenses/LICENSE-2.0 | apache.org/licenses引用的公网来源说明 | -| 开源代码引入 | https://github.com/marlbenchmark/on-policy/blob/main/envs/hanabi/hanabi_lib/hanabi_hand.h | ./onpolicy/envs/hanabi/hanabi_lib/hanabi_hand.h | https://www.apache.org/licenses/LICENSE-2.0 | apache.org/licenses引用的公网来源说明 | -| 开源代码引入 | https://github.com/marlbenchmark/on-policy/blob/main/envs/hanabi/hanabi_lib/hanabi_hand.cc | ./onpolicy/envs/hanabi/hanabi_lib/hanabi_hand.cc | https://www.apache.org/licenses/LICENSE-2.0 | apache.org/licenses引用的公网来源说明 | -| 开源代码引入 | https://github.com/marlbenchmark/on-policy/blob/main/envs/hanabi/hanabi_lib/hanabi_game.h | ./onpolicy/envs/hanabi/hanabi_lib/hanabi_game.h | https://www.apache.org/licenses/LICENSE-2.0 | apache.org/licenses引用的公网来源说明 | -| 开源代码引入 | https://github.com/marlbenchmark/on-policy/blob/main/envs/hanabi/hanabi_lib/hanabi_game.cc | ./onpolicy/envs/hanabi/hanabi_lib/hanabi_game.cc | https://www.apache.org/licenses/LICENSE-2.0 | apache.org/licenses引用的公网来源说明 | -| 开源代码引入 | https://github.com/marlbenchmark/on-policy/blob/main/envs/hanabi/hanabi_lib/hanabi_card.h | ./onpolicy/envs/hanabi/hanabi_lib/hanabi_card.h | https://www.apache.org/licenses/LICENSE-2.0 | apache.org/licenses引用的公网来源说明 | -| 开源代码引入 | https://github.com/marlbenchmark/on-policy/blob/main/envs/hanabi/hanabi_lib/hanabi_card.cc | ./onpolicy/envs/hanabi/hanabi_lib/hanabi_card.cc | https://www.apache.org/licenses/LICENSE-2.0 | apache.org/licenses引用的公网来源说明 | -| 开源代码引入 | https://github.com/marlbenchmark/on-policy/blob/main/envs/hanabi/hanabi_lib/canonical_encoders.h | ./onpolicy/envs/hanabi/hanabi_lib/canonical_encoders.h | https://www.apache.org/licenses/LICENSE-2.0 | apache.org/licenses引用的公网来源说明 | -| 
开源代码引入 | https://github.com/marlbenchmark/on-policy/blob/main/envs/hanabi/hanabi_lib/canonical_encoders.cc | ./onpolicy/envs/hanabi/hanabi_lib/canonical_encoders.cc | https://www.apache.org/licenses/LICENSE-2.0 | apache.org/licenses引用的公网来源说明 | -| 开源代码引入 | https://github.com/marlbenchmark/on-policy/blob/main/envs/hanabi/hanabi_Env.py | ./onpolicy/envs/hanabi/hanabi_Env.py | https://www.apache.org/licenses/LICENSE-2.0 | apache.org/licenses引用的公网来源说明 | -| 开源代码引入 | https://github.com/marlbenchmark/on-policy/blob/main/envs/hanabi/clean_all.sh | ./onpolicy/envs/hanabi/clean_all.sh | https://www.apache.org/licenses/LICENSE-2.0 | apache.org/licenses引用的公网来源说明 | -| 开源代码引入 | https://github.com/marlbenchmark/on-policy/blob/main/envs/hanabi/__init__.py | ./onpolicy/envs/hanabi/__init__.py | https://www.apache.org/licenses/LICENSE-2.0 | apache.org/licenses引用的公网来源说明 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|-----------------------------------------------------------------|---------------------|---------| +| ModelZoo-PyTorch/PyTorch/built-in/rl/MAPPO_for_PyTorch/setup.py | zoeyuchao@gmail.com | 作者邮箱 | \ No newline at end of file diff --git a/PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/public_address_statement.md b/PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/public_address_statement.md index 8c541c924ca0a625e05c23adc1f730515ef62fb0..72f1fd14a40e467d8e70dda7fb266822675e6595 100644 --- a/PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/public_address_statement.md @@ -1,55 +1,10 @@ - -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱 | 用途说明 | -|:------:|:-------------------------:|:---------------------------------------------------------------------------------------------:|:--------------------:|:-----------------:| -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/v0.5.7/README_ja.md | PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/README_ja.md | janhu9527@gmail.com | 参与OpenRLHF贡献联系邮箱 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/v0.5.7/README.md | PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/README.md | janhu9527@gmail.com | 参与OpenRLHF贡献联系邮箱 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/v0.5.7/README_zh.md | PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/README_zh.md | janhu9527@gmail.com | 参与OpenRLHF贡献联系邮箱 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/v0.5.7/openrlhf/cli/train_dpo.py | PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/openrlhf/cli/train_dpo.py | https://arxiv.org/pdf/2310.12036v2.pdf | IPO的论文链接 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/v0.5.7/openrlhf/cli/train_dpo.py | PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/openrlhf/cli/train_dpo.py | https://arxiv.org/pdf/2305.18290.pdf | DPO的论文链接 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/v0.5.7/openrlhf/cli/train_ppo_ray.py | PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/openrlhf/cli/train_ppo_ray.py | http://joschu.net/blog/kl-approx.html | 近似KL散度的链接 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/v0.5.7/openrlhf/models/utils.py | PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/openrlhf/models/utils.py | http://joschu.net/blog/kl-approx.html | 近似KL散度的链接 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/v0.5.7/openrlhf/models/model.py | PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/openrlhf/models/model.py | https://github.com/huggingface/transformers/blob/405b56269812056d9593869e22b7b264d806cb1e/src/transformers/models/llama/modeling_llama.py | transformers三方仓源码链接 
| -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/v0.5.7/openrlhf/models/actor.py | PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/openrlhf/models/actor.py | https://huggingface.co/docs/transformers/deepspeed#non-trainer-deepspeed-integration | transformers三方仓源码DeepSpeed解释链接 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/v0.5.7/openrlhf/models/actor.py | PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/openrlhf/models/actor.py | https://github.com/huggingface/peft/issues/137 | peft三方仓issue链接 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/v0.5.7/openrlhf/models/model.py | PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/openrlhf/models/model.py | https://huggingface.co/docs/transformers/main_classes/deepspeed#nontrainer-deepspeed-integration | transformers三方仓源码DeepSpeed解释链接 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/v0.5.7/openrlhf/models/actor.py | PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/openrlhf/models/actor.py | https://github.com/huggingface/transformers/issues/26877 | transformers三方仓issue链接 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/v0.5.7/openrlhf/models/loss.py | PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/openrlhf/models/loss.py | https://arxiv.org/abs/2204.05862 | Pairwise Loss for Reward Model论文链接 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/v0.5.7/openrlhf/models/model.py | PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/openrlhf/models/model.py | https://github.com/huggingface/transformers/issues/26877 | transformers三方仓issue链接 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/v0.5.7/openrlhf/models/loss.py | PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/openrlhf/models/loss.py | https://arxiv.org/pdf/2310.12036v2.pdf | IPO的论文链接 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/v0.5.7/openrlhf/models/loss.py | PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/openrlhf/models/loss.py | https://ericmitchell.ai/cdpo.pdf | cDPO的论文链接 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/v0.5.7/openrlhf/models/loss.py | PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/openrlhf/models/loss.py | https://arxiv.org/pdf/2305.18290.pdf | DPO的论文链接 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/v0.5.7/openrlhf/models/loss.py | PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/openrlhf/models/loss.py | https://github.com/ContextualAI/HALOs/blob/ca9b7e3eeea220c0944ad8095d641da33f907a7e/trainers.py | HALOs三方仓源码链接 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/v0.5.7/openrlhf/models/model.py | PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/openrlhf/models/model.py | https://github.com/OpenRLHF/OpenRLHF/issues/217 | OpenRLHF三方仓issue链接 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/v0.5.7/openrlhf/models/actor.py | PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/openrlhf/models/actor.py | https://github.com/OpenRLHF/OpenRLHF/issues/217 | OpenRLHF三方仓issue链接 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/v0.5.7/openrlhf/models/actor.py | PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/openrlhf/models/actor.py | https://github.com/OpenRLHF/OpenRLHF/pull/634 | OpenRLHF三方仓pr链接 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/v0.5.7/openrlhf/models/loss.py | PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/openrlhf/models/loss.py | https://github.com/microsoft/LMOps/blob/main/minillm/finetune.py | LMOps三方仓源码链接 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/v0.5.7/openrlhf/trainer/ppo_utils/kl_controller.py | 
PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/openrlhf/trainer/ppo_utils/kl_controller.py | https://github.com/microsoft/LMOps/blob/main/minillm/finetune.py | Adaptive KL controller的论文链接 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/v0.5.7/openrlhf/trainer/ppo_utils/experience_maker.py | PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/openrlhf/trainer/ppo_utils/experience_maker.py | https://github.com/microsoft/LMOps/blob/main/minillm/finetune.py | PPO的论文链接 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/v0.5.7/openrlhf/trainer/ray/utils.py | PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/openrlhf/trainer/ray/utils.py | https://github.com/ray-project/ray/blob/161849364a784442cc659fb9780f1a6adee85fce/python/ray/_private/accelerators/nvidia_gpu.py | ray三方仓源码链接 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/v0.5.7/openrlhf/trainer/ray/utils.py | PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/openrlhf/trainer/ray/utils.py | https://github.com/ray-project/ray/blob/161849364a784442cc659fb9780f1a6adee85fce/python/ray/_private/accelerators/amd_gpu.py | ray三方仓源码链接 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/v0.5.7/openrlhf/trainer/ray/utils.py | PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/openrlhf/trainer/ray/utils.py | https://github.com/ray-project/ray/blob/161849364a784442cc659fb9780f1a6adee85fce/python/ray/_private/accelerators/npu.py | ray三方仓源码链接 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/v0.5.7/openrlhf/trainer/ray/utils.py | PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/openrlhf/trainer/ray/utils.py | https://github.com/ray-project/ray/blob/161849364a784442cc659fb9780f1a6adee85fce/python/ray/_private/accelerators/hpu.py | ray三方仓源码链接 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/v0.5.7/openrlhf/trainer/ray/utils.py | PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/openrlhf/trainer/ray/utils.py | https://github.com/ray-project/ray/blob/161849364a784442cc659fb9780f1a6adee85fce/python/ray/_private/accelerators/neuron.py | ray三方仓源码链接 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/v0.5.7/openrlhf/trainer/ray/utils.py | PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/openrlhf/trainer/ray/utils.py | https://github.com/ray-project/ray/blob/161849364a784442cc659fb9780f1a6adee85fce/python/ray/_private/accelerators/tpu.py | ray三方仓源码链接 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/v0.5.7/openrlhf/trainer/ray/utils.py | PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/openrlhf/trainer/ray/utils.py | https://github.com/ray-project/ray/blob/161849364a784442cc659fb9780f1a6adee85fce/python/ray/_private/accelerators/intel_gpu.py | ray三方仓源码链接 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/v0.5.7/openrlhf/trainer/ray/vllm_engine.py | PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/openrlhf/trainer/ray/vllm_engine.py | https://github.com/vllm-project/vllm/commit/479d69fad0538f04cb22bf13e76ff91cfeb8a4e5 | vllm三方仓commit id链接 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/v0.5.7/openrlhf/trainer/ray/vllm_engine.py | PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/openrlhf/trainer/ray/vllm_engine.py | https://github.com/vllm-project/vllm/commit/676a99982fe9aabe72fd52a91e08988a653a7359 | vllm三方仓commit id链接 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/v0.5.7/openrlhf/trainer/ray/vllm_engine.py | PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/openrlhf/trainer/ray/vllm_engine.py | https://github.com/vllm-project/vllm/pull/10555 | vllm三方仓pr链接 | -| 开源代码引入 | 
https://github.com/OpenRLHF/OpenRLHF/blob/v0.5.7/openrlhf/trainer/ray/vllm_engine.py | PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/openrlhf/trainer/ray/vllm_engine.py | https://github.com/vllm-project/vllm/commit/7206ce4ce112ed117796a59045c968a6d353f691 | vllm三方仓commit id链接 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/v0.5.7/openrlhf/trainer/ray/vllm_engine.py | PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/openrlhf/trainer/ray/vllm_engine.py | https://github.com/vllm-project/vllm/commit/eb6d3c264d0cd8e44dec16bca7947fbe96415ce9#diff-e1ad69e38e033accddfa5480ec808c4740eb39244d1ef51cc3407e20dde8cfd4 | vllm三方仓commit id链接 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/v0.5.7/openrlhf/trainer/ray/launcher.py | PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/openrlhf/trainer/ray/launcher.py | https://github.com/vllm-project/vllm/commit/eb6d3c264d0cd8e44dec16bca7947fbe96415ce9#diff-e1ad69e38e033accddfa5480ec808c4740eb39244d1ef51cc3407e20dde8cfd4 | custom resources解释链接 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/v0.5.7/openrlhf/trainer/ray/ppo_actor.py | PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/openrlhf/trainer/ray/ppo_actor.py | https://github.com/vllm-project/vllm/blob/c6b0a7d3ba03ca414be1174e9bd86a97191b7090/vllm/worker/worker_base.py | vllm三方仓源码链接 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/v0.5.7/openrlhf/utils/logging_utils.py | PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/openrlhf/utils/logging_utils.py | https://github.com/skypilot-org/skypilot/blob/86dc0f6283a335e4aa37b3c10716f90999f48ab6/sky/sky_logging.py | skypilot三方仓源码链接 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/v0.5.7/openrlhf/utils/logging_utils.py | PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/openrlhf/utils/utils.py | https://github.com/facebookresearch/llama-recipes/pull/196 | llama-cookbook三方仓pr链接 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/v0.5.7/openrlhf/utils/processor.py | PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/openrlhf/utils/processor.py | https://arxiv.org/abs/2308.12050 | CA论文链接 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/v0.5.7/openrlhf/utils/distributed_sampler.py | PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/openrlhf/utils/distributed_sampler.py | https://github.com/pytorch/pytorch/blob/5298acb5c76855bc5a99ae10016efc86b27949bd/torch/utils/data/distributed.py | pytorch三方仓源码链接 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/v0.5.7/openrlhf/utils/distributed_sampler.py | PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/openrlhf/utils/distributed_sampler.py | https://github.com/pytorch/pytorch/blob/main/torch/distributed/distributed_c10d.py | pytorch三方仓源码链接 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/v0.5.7/openrlhf/utils/processor.py | PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/openrlhf/utils/processor.py | https://arxiv.org/abs/2307.09288 | pytorch三方仓源码链接 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/v0.5.7/openrlhf/utils/distributed_util.py | PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/openrlhf/utils/distributed_util.py | https://github.com/pytorch/pytorch/commit/a0c7029a75628cd5fa8df83c0de0ea98ee7fd844 | pytorch三方仓commit id链接 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/v0.5.7/openrlhf/utils/processor.py | PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/openrlhf/utils/processor.py | https://github.com/RLHFlow/Online-RLHF/blob/main/run_loop.sh | OpenRLHF三方仓源码链接 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/v0.5.7/openrlhf/utils/deepspeed/deepspeed.py 
| PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/openrlhf/utils/deepspeed/deepspeed.py | https://github.com/RLHFlow/Online-RLHF/blob/main/run_loop.sh | DeepSpeed三方仓issue链接 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.51.0/src/transformers/models/qwen2_vl/modeling_qwen2_vl.py | PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/transformers_need/modeling_qwen2_vl.py | http://www.apache.org/licenses/LICENSE-2.0 | Apache-2.0 License链接 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.51.0/src/transformers/models/qwen2_vl/modeling_qwen2_vl.py | PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/transformers_need/modeling_qwen2_vl.py | https://qwenlm.github.io/blog/qwen2-vl/ | qwen2-vl关于旋转位置编码解释链接 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.51.0/src/transformers/models/qwen2_vl/modeling_qwen2_vl.py | PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/transformers_need/modeling_qwen2_vl.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | torch.nn解释链接 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.51.0/src/transformers/models/qwen2_vl/modeling_qwen2_vl.py | PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/transformers_need/modeling_qwen2_vl.py | https://github.com/huggingface/transformers/pull/34852 | transformers三方仓pr链接 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/v4.51.0/src/transformers/models/qwen2_vl/modeling_qwen2_vl.py | PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/transformers_need/modeling_qwen2_vl.py | https://github.com/pytorch/pytorch/issues/110213 | pytorch三方仓issue链接 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|------------------------------------------------------------------------------------------------------------|------------------------------------------------------------------------------------------------|-------------| +| ModelZoo-PyTorch/PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/.github/workflows/python-package.yml | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2204/x86_64/cuda-ubuntu2204.pin | ubuntu镜像地址 | +| ModelZoo-PyTorch/PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/.github/workflows/python-package.yml | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2204/x86_64/3bf863cc.pub | ubuntu镜像地址 | +| ModelZoo-PyTorch/PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/.github/workflows/python-package.yml | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2204/x86_64/ | ubuntu镜像地址 | +| ModelZoo-PyTorch/PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/examples/scripts/nvidia_docker_install.sh | https://get.docker.com | 下载镜像 | +| ModelZoo-PyTorch/PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/openrlhf/cli/train_ppo.py | http://joschu.net/blog/kl-approx.html | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/openrlhf/cli/train_ppo_ray.py | http://joschu.net/blog/kl-approx.html | 相关说明 | +| ModelZoo-PyTorch/PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/transformers_need/modeling_qwen2_vl.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/rl/OpenRLHF_v0.5.7_for_PyTorch/transformers_need/modeling_qwen2_vl.py | https://arxiv.org/abs/1910.13461 | 论文地址 | \ No newline at end of file diff --git a/PyTorch/built-in/rl/OpenRLHF_v0.6.2_for_PyTorch/public_address_statement.md b/PyTorch/built-in/rl/OpenRLHF_v0.6.2_for_PyTorch/public_address_statement.md index ac61e5101d3d8e4e18877d843e822eba5507b23e..e811fcf98106d74f581217d81fa81adf845d82b6 100644 
--- a/PyTorch/built-in/rl/OpenRLHF_v0.6.2_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/rl/OpenRLHF_v0.6.2_for_PyTorch/public_address_statement.md @@ -1,54 +1,10 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ---- |----------------------------------------------------------------------------------------------------|----------------------------------------------------------------------|--------------------------------------------------------------------------------------------------------------------------------------------------|-------------------------------| -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/main/openrlhf/cli/train_dpo.py | OpenRLHF_v0.6.2_for_PyTorch/openrlhf/cli/train_dpo.py | https://arxiv.org/pdf/2310.12036v2.pdf | IPO算法的论文地址 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/main/openrlhf/cli/train_dpo.py | OpenRLHF_v0.6.2_for_PyTorch/openrlhf/cli/train_dpo.py | https://arxiv.org/pdf/2305.18290.pdf | cDPO算法的论文地址 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/main/openrlhf/models/utils.py | OpenRLHF_v0.6.2_for_PyTorch/openrlhf/models/utils.py | http://joschu.net/blog/kl-approx.html | 说明计算KL散度的博客地址 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/main/openrlhf/models/utils.py | OpenRLHF_v0.6.2_for_PyTorch/openrlhf/models/utils.py | http://joschu.net/blog/kl-approx.html | 说明计算K2估计量的博客地址 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/main/openrlhf/models/utils.py | OpenRLHF_v0.6.2_for_PyTorch/openrlhf/models/utils.py | https://arxiv.org/pdf/2310.10505 | ReMax论文地址 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/main/openrlhf/models/utils.py | OpenRLHF_v0.6.2_for_PyTorch/openrlhf/models/utils.py | http://joschu.net/blog/kl-approx.html | 说明计算K3估计量的博客地址 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/main/openrlhf/models/utils.py | OpenRLHF_v0.6.2_for_PyTorch/openrlhf/models/utils.py | https://github.com/OpenRLHF/OpenRLHF/pull/718#issuecomment-2641081881 | OpenRLHF仓库Pull requests地址 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/main/openrlhf/models/actor.py | OpenRLHF_v0.6.2_for_PyTorch/openrlhf/models/actor.py | https://huggingface.co/docs/transformers/deepspeed#non-trainer-deepspeed-integration | DeepSpeed文档地址 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/main/openrlhf/models/actor.py | OpenRLHF_v0.6.2_for_PyTorch/openrlhf/models/actor.py | https://github.com/huggingface/peft/issues/137 | peft仓库issue地址 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/main/openrlhf/models/actor.py | OpenRLHF_v0.6.2_for_PyTorch/openrlhf/models/actor.py | https://github.com/huggingface/transformers/issues/26877 | transformers仓库issue地址 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/main/openrlhf/models/actor.py | OpenRLHF_v0.6.2_for_PyTorch/openrlhf/models/actor.py | https://github.com/OpenRLHF/OpenRLHF/issues/217 | OpenRLHF仓库issue地址 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/main/openrlhf/models/actor.py | OpenRLHF_v0.6.2_for_PyTorch/openrlhf/models/actor.py | https://github.com/OpenRLHF/OpenRLHF/pull/634 | OpenRLHF仓库Pull requests地址 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/main/openrlhf/models/model.py | OpenRLHF_v0.6.2_for_PyTorch/openrlhf/models/model.py | https://huggingface.co/docs/transformers/main_classes/deepspeed#nontrainer-deepspeed-integration | DeepSpeed文档地址 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/main/openrlhf/models/model.py | OpenRLHF_v0.6.2_for_PyTorch/openrlhf/models/model.py | 
https://github.com/huggingface/transformers/blob/405b56269812056d9593869e22b7b264d806cb1e/src/transformers/models/llama/modeling_llama.py#L1254 | LlamaForSequenceClassification类定义的代码地址 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/main/openrlhf/models/model.py | OpenRLHF_v0.6.2_for_PyTorch/openrlhf/models/model.py | https://github.com/huggingface/transformers/issues/26877 | transformers仓库issue地址 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/main/openrlhf/models/model.py | OpenRLHF_v0.6.2_for_PyTorch/openrlhf/models/model.py | https://github.com/OpenRLHF/OpenRLHF/issues/217 | OpenRLHF仓库issue地址 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/main/openrlhf/models/model.py | OpenRLHF_v0.6.2_for_PyTorch/openrlhf/models/model.py | https://github.com/OpenRLHF/OpenRLHF/issues/217 | OpenRLHF仓库issue地址 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/main/openrlhf/models/loss.py | OpenRLHF_v0.6.2_for_PyTorch/openrlhf/models/loss.py | https://arxiv.org/abs/2204.05862 | RLHF论文地址 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/main/openrlhf/models/loss.py | OpenRLHF_v0.6.2_for_PyTorch/openrlhf/models/loss.py | https://arxiv.org/pdf/2310.12036v2.pdf | loss相关论文地址 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/main/openrlhf/models/loss.py | OpenRLHF_v0.6.2_for_PyTorch/openrlhf/models/loss.py | https://ericmitchell.ai/cdpo.pdf | 说明DPO、IPO关系的文档地址 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/main/openrlhf/models/loss.py | OpenRLHF_v0.6.2_for_PyTorch/openrlhf/models/loss.py | https://arxiv.org/pdf/2305.18290.pdf | DPO论文地址 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/main/openrlhf/models/loss.py | OpenRLHF_v0.6.2_for_PyTorch/openrlhf/models/loss.py | https://github.com/ContextualAI/HALOs/blob/ca9b7e3eeea220c0944ad8095d641da33f907a7e/trainers.py#L742 | SimpleKTOTrainer类定义的代码地址 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/main/openrlhf/models/loss.py | OpenRLHF_v0.6.2_for_PyTorch/openrlhf/models/loss.py | https://github.com/ContextualAI/HALOs/blob/ca9b7e3eeea220c0944ad8095d641da33f907a7e/trainers.py#L770 | KTOTrainer类定义的代码地址 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/main/openrlhf/models/loss.py | OpenRLHF_v0.6.2_for_PyTorch/openrlhf/models/loss.py | https://github.com/microsoft/LMOps/blob/main/minillm/finetune.py#L166 | distil_loss计算的代码地址 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/main/openrlhf/trainer/ppo_utils/kl_controller.py | OpenRLHF_v0.6.2_for_PyTorch/openrlhf/trainer/ppo_utils/kl_controller.py | https://arxiv.org/pdf/1909.08593.pdf | 描述自适应KL控制器的论文地址 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/main/openrlhf/trainer/ppo_utils/experience_maker.py | OpenRLHF_v0.6.2_for_PyTorch/openrlhf/trainer/ppo_utils/experience_maker.py | https://arxiv.org/abs/1707.06347 | PPO论文地址 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/main/openrlhf/trainer/ray/utils.py | OpenRLHF_v0.6.2_for_PyTorch/openrlhf/trainer/ray/utils.py | https://github.com/ray-project/ray/blob/161849364a784442cc659fb9780f1a6adee85fce/python/ray/_private/accelerators/nvidia_gpu.py#L95-L96 | ray仓库中与NVIDIA GPU加速器管理相关的代码 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/main/openrlhf/trainer/ray/utils.py | OpenRLHF_v0.6.2_for_PyTorch/openrlhf/trainer/ray/utils.py | https://github.com/ray-project/ray/blob/161849364a784442cc659fb9780f1a6adee85fce/python/ray/_private/accelerators/amd_gpu.py#L102-L103 | ray仓库中与AMD GPU加速器管理相关的代码 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/main/openrlhf/trainer/ray/utils.py | 
OpenRLHF_v0.6.2_for_PyTorch/openrlhf/trainer/ray/utils.py | https://github.com/ray-project/ray/blob/161849364a784442cc659fb9780f1a6adee85fce/python/ray/_private/accelerators/npu.py#L94-L95 | ray仓库中与NPU加速器管理相关的代码 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/main/openrlhf/trainer/ray/utils.py | OpenRLHF_v0.6.2_for_PyTorch/openrlhf/trainer/ray/utils.py | https://github.com/ray-project/ray/blob/161849364a784442cc659fb9780f1a6adee85fce/python/ray/_private/accelerators/hpu.py#L116-L117 | ray仓库中与HPU加速器管理相关的代码 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/main/openrlhf/trainer/ray/utils.py | OpenRLHF_v0.6.2_for_PyTorch/openrlhf/trainer/ray/utils.py | https://github.com/ray-project/ray/blob/161849364a784442cc659fb9780f1a6adee85fce/python/ray/_private/accelerators/neuron.py#L108-L109 | ray仓库中与Neuron加速器管理相关的代码 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/main/openrlhf/trainer/ray/utils.py | OpenRLHF_v0.6.2_for_PyTorch/openrlhf/trainer/ray/utils.py | https://github.com/ray-project/ray/blob/161849364a784442cc659fb9780f1a6adee85fce/python/ray/_private/accelerators/tpu.py#L171-L172 | ray仓库中与TPU加速器管理相关的代码 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/main/openrlhf/trainer/ray/launcher.py | OpenRLHF_v0.6.2_for_PyTorch/openrlhf/trainer/ray/launcher.py | https://docs.ray.io/en/latest/ray-core/scheduling/resources.html | ray官方文档地址 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/main/openrlhf/trainer/ray/ppo_actor.py | OpenRLHF_v0.6.2_for_PyTorch/openrlhf/trainer/ray/ppo_actor.py | https://github.com/vllm-project/vllm/blob/c6b0a7d3ba03ca414be1174e9bd86a97191b7090/vllm/worker/worker_base.py#L445 | vllm仓的worker_base.py代码地址 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/main/openrlhf/utils/logging_utils.py | OpenRLHF_v0.6.2_for_PyTorch/openrlhf/utils/logging_utils.py | https://github.com/skypilot-org/skypilot/blob/86dc0f6283a335e4aa37b3c10716f90999f48ab6/sky/sky_logging.py | skypilot仓库的sky_logging.py代码地址 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/main/openrlhf/utils/utils.py | OpenRLHF_v0.6.2_for_PyTorch/openrlhf/utils/utils.py | https://github.com/facebookresearch/llama-recipes/pull/196 | llama-cookbook仓库Pull requests地址 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/main/openrlhf/utils/processor.py | OpenRLHF_v0.6.2_for_PyTorch/openrlhf/utils/processor.py | https://arxiv.org/abs/2308.12050 | 离线对齐框架相关的论文地址 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/main/openrlhf/utils/processor.py | OpenRLHF_v0.6.2_for_PyTorch/openrlhf/utils/processor.py | https://arxiv.org/abs/2307.09288 | Llama2论文地址 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/main/openrlhf/utils/processor.py | OpenRLHF_v0.6.2_for_PyTorch/openrlhf/utils/processor.py | https://github.com/RLHFlow/Online-RLHF/blob/main/run_loop.sh | Online-RLHF仓库的run_loop.sh代码地址 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/main/openrlhf/utils/distributed_sampler.py | OpenRLHF_v0.6.2_for_PyTorch/openrlhf/utils/distributed_sampler.py | https://github.com/pytorch/pytorch/blob/5298acb5c76855bc5a99ae10016efc86b27949bd/torch/utils/data/distributed.py | pytorch仓库的distributed.py代码地址 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/main/openrlhf/utils/distributed_util.py | OpenRLHF_v0.6.2_for_PyTorch/openrlhf/utils/distributed_util.py | https://github.com/pytorch/pytorch/blob/main/torch/distributed/distributed_c10d.py | pytorch仓库的distributed_c10d.py代码地址 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/main/openrlhf/utils/distributed_util.py | 
OpenRLHF_v0.6.2_for_PyTorch/openrlhf/utils/distributed_util.py | https://github.com/pytorch/pytorch/commit/a0c7029a75628cd5fa8df83c0de0ea98ee7fd844 | pytorch仓库的Commit a0c7029地址 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/main/openrlhf/utils/deepspeed/deepspeed.py | OpenRLHF_v0.6.2_for_PyTorch/openrlhf/utils/deepspeed/deepspeed.py | https://github.com/Dao-AILab/flash-attention/commit/732654583c2e640adc012ecb60e460bf19dcd9e3 | flash-attention仓库的Commit 7326545地址 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/main/openrlhf/utils/deepspeed/deepspeed.py | OpenRLHF_v0.6.2_for_PyTorch/openrlhf/utils/deepspeed/deepspeed.py | https://github.com/microsoft/DeepSpeed/issues/4295 | DeepSpeed仓库的issue地址 | -| 开源代码引入 | https://github.com/OpenRLHF/OpenRLHF/blob/main/openrlhf/utils/deepspeed/deepspeed_utils.py | OpenRLHF_v0.6.2_for_PyTorch/openrlhf/utils/deepspeed/deepspeed_utils.py | https://github.com/deepspeedai/DeepSpeed/pull/7050 | DeepSpeed仓库的Pull requests地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/llama/modeling_llama.py | PyTorch/built-in/rl/OpenRLHF_v0.6.2_for_PyTorch/transformers_need/modeling_llama.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | torch.nn.Module文档地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/llama/modeling_llama.py | PyTorch/built-in/rl/OpenRLHF_v0.6.2_for_PyTorch/transformers_need/modeling_llama.py | https://arxiv.org/abs/1910.13461 | BART论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/llama/modeling_llama.py | PyTorch/built-in/rl/OpenRLHF_v0.6.2_for_PyTorch/transformers_need/modeling_llama.py | https://github.com/pytorch/pytorch/issues/110213 | pytorch仓库的issue地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/qwen2/modeling_qwen2.py | PyTorch/built-in/rl/OpenRLHF_v0.6.2_for_PyTorch/transformers_need/modeling_qwen2.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | torch.nn.Module文档地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/qwen2/modeling_qwen2.py | PyTorch/built-in/rl/OpenRLHF_v0.6.2_for_PyTorch/transformers_need/modeling_qwen2.py | https://arxiv.org/abs/1910.13461 | BART论文地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/qwen2/modeling_qwen2.py | PyTorch/built-in/rl/OpenRLHF_v0.6.2_for_PyTorch/transformers_need/modeling_qwen2.py | https://huggingface.co/docs/transformers/en/kv_cache | KV cache文档地址 | -| 开源代码引入 | https://github.com/huggingface/transformers/blob/main/src/transformers/models/qwen2/modeling_qwen2.py | PyTorch/built-in/rl/OpenRLHF_v0.6.2_for_PyTorch/transformers_need/modeling_qwen2.py | https://github.com/pytorch/pytorch/issues/110213 | pytorch仓库的issue地址 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|------------------------------------------------------------------------------------------------------------|------------------------------------------------------------------------------------------------|-------------| +| ModelZoo-PyTorch/PyTorch/built-in/rl/OpenRLHF_v0.6.2_for_PyTorch/.github/workflows/python-package.yml | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2204/x86_64/cuda-ubuntu2204.pin | ubuntu镜像地址 | +| ModelZoo-PyTorch/PyTorch/built-in/rl/OpenRLHF_v0.6.2_for_PyTorch/.github/workflows/python-package.yml | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2204/x86_64/3bf863cc.pub | 
ubuntu镜像地址 | +| ModelZoo-PyTorch/PyTorch/built-in/rl/OpenRLHF_v0.6.2_for_PyTorch/.github/workflows/python-package.yml | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu2204/x86_64/ | ubuntu镜像地址 | +| ModelZoo-PyTorch/PyTorch/built-in/rl/OpenRLHF_v0.6.2_for_PyTorch/examples/scripts/nvidia_docker_install.sh | https://get.docker.com | 下载镜像 | +| ModelZoo-PyTorch/PyTorch/built-in/rl/OpenRLHF_v0.6.2_for_PyTorch/transformers_need/modeling_llama.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/rl/OpenRLHF_v0.6.2_for_PyTorch/transformers_need/modeling_llama.py | https://arxiv.org/abs/1910.13461 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/built-in/rl/OpenRLHF_v0.6.2_for_PyTorch/transformers_need/modeling_qwen2.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | +| ModelZoo-PyTorch/PyTorch/built-in/rl/OpenRLHF_v0.6.2_for_PyTorch/transformers_need/modeling_qwen2.py | https://arxiv.org/abs/1910.13461 | 论文地址 | \ No newline at end of file diff --git a/PyTorch/built-in/rl/VeRL_for_PyTorch/public_address_statement.md b/PyTorch/built-in/rl/VeRL_for_PyTorch/public_address_statement.md index 6f013861197e4078afa9fc97565e1f108ee9dfe8..d69409282de31c1165b34b3db77fb86fd7d73934 100644 --- a/PyTorch/built-in/rl/VeRL_for_PyTorch/public_address_statement.md +++ b/PyTorch/built-in/rl/VeRL_for_PyTorch/public_address_statement.md @@ -1,160 +1,8 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ------------ |--------------------------------------------------------------------------------------------|-----------------------------------------|----------------------------------------------------------------------|--| -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/.github/workflows/dataset.yml | ./.github/workflows/dataset.yml | https://github.com/eric-haibin-lin/verl-data | 数据集链接 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/docker/Dockerfile.megatron | ./docker/Dockerfile.megatron | git+https://github.com/NVIDIA/TransformerEngine.git@stable | Dockerfile地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/docker/Dockerfile.megatron | ./docker/Dockerfile.megatron | https://github.com/NVIDIA/Megatron-LM.git | Megatron仓地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/docker/Dockerfile.megatron | ./docker/Dockerfile.megatron | https://pypi.tuna.tsinghua.edu.cn/simple | 清华源地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/docker/Dockerfile.ngc.vllm | ./docker/Dockerfile.ngc.vllm | https://download.pytorch.org/whl/cu124 | PyTorch源地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/docker/Dockerfile.ngc.vllm | ./docker/Dockerfile.ngc.vllm | git+https://github.com/NVIDIA/apex | apex地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/docker/Dockerfile.ngc.vllm | ./docker/Dockerfile.ngc.vllm | git+https://github.com/eric-haibin-lin/TransformerEngine.git@v1.7.0 | TransformerEngine地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/docker/Dockerfile.ngc.vllm0.8 | ./docker/Dockerfile.ngc.vllm0.8 | https://docs.nvidia.com/deeplearning/frameworks/pytorch-release-notes/rel-24-08.html | PyTorch release note地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/docker/Dockerfile.ngc.vllm0.8 | ./docker/Dockerfile.ngc.vllm0.8 | https://mirrors.tuna.tsinghua.edu.cn/ubuntu/ | ubuntu镜像地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/docker/Dockerfile.ngc.vllm0.8 | ./docker/Dockerfile.ngc.vllm0.8 | 
https://mirrors.tuna.tsinghua.edu.cn/pypi/web/simple | 清华源地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/docker/Dockerfile.ngc.vllm0.8 | ./docker/Dockerfile.ngc.vllm0.8 | https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.6cxx11abiFALSE-cp310-cp310-linux_x86_64.whl | flash_attn地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/docker/Dockerfile.rocm | ./docker/Dockerfile.rocm | https://github.com/vllm-project/vllm.git | vllm仓地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/docker/Dockerfile.sglang | ./docker/Dockerfile.sglang | https://docs.nvidia.com/deeplearning/frameworks/pytorch-release-notes/rel-24-08.html | PyTorch release note地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/docker/Dockerfile.sglang | ./docker/Dockerfile.sglang | https://mirrors.ustc.edu.cn/ubuntu/ | ubuntu镜像地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/docker/Dockerfile.sglang | ./docker/Dockerfile.sglang | https://mirrors.aliyun.com/pypi/simple/ | 阿里源地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/docker/Dockerfile.sglang | ./docker/Dockerfile.sglang | https://flashinfer.ai/whl/cu124/torch2.5/flashinfer-python | flashinfer地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/docker/Dockerfile.sglang | ./docker/Dockerfile.sglang | https://ghfast.top/https://github.com/Dao-AILab/flash-attention/releases/download/v2.7.4.post1/flash_attn-2.7.4.post1+cu12torch2.5cxx11abiFALSE-cp310-cp310-linux_x86_64.whl | flash_attn地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/docker/Dockerfile.vemlp.vllm.te | ./docker/Dockerfile.vemlp.vllm.te | https://pypi.tuna.tsinghua.edu.cn/simple | 清华源地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/docker/Dockerfile.vemlp.vllm.te | ./docker/Dockerfile.vemlp.vllm.te | git+https://github.com/NVIDIA/apex | apex地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/docker/Dockerfile.vemlp.vllm.te | ./docker/Dockerfile.vemlp.vllm.te | git+https://github.com/eric-haibin-lin/TransformerEngine.git@v1.7.0 | TransformerEngine地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/docker/Dockerfile.vemlp.vllm.te | ./docker/Dockerfile.vemlp.vllm.te | git+https://github.com/NVIDIA/TransformerEngine.git@v1.7 | TransformerEngine地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/examples/ppo_trainer/verl_getting_started.ipynb | ./examples/ppo_trainer/verl_getting_started.ipynb | https://github.com/volcengine/verl | verl仓库地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/examples/ppo_trainer/verl_getting_started.ipynb | ./examples/ppo_trainer/verl_getting_started.ipynb | https://lightning.ai/hlin-verl/studios/verl-getting-started | verl lighting地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/examples/ppo_trainer/verl_getting_started.ipynb | ./examples/ppo_trainer/verl_getting_started.ipynb | https://localhost:8080/ | 启动地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/examples/ppo_trainer/verl_getting_started.ipynb | ./examples/ppo_trainer/verl_getting_started.ipynb | https://huggingface.co/datasets/openai/gsm8k | gsm8k数据集地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/examples/ppo_trainer/verl_getting_started.ipynb | ./examples/ppo_trainer/verl_getting_started.ipynb | https://verl.readthedocs.io/en/latest/examples/ppo_code_architecture.html | ppo简介地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/examples/ppo_trainer/verl_getting_started.ipynb | 
./examples/ppo_trainer/verl_getting_started.ipynb | https://verl.readthedocs.io/en/latest/examples/config.html | 配置地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/examples/ppo_trainer/verl_getting_started.ipynb | ./examples/ppo_trainer/verl_getting_started.ipynb | https://pytorch.org/docs/stable/distributed.checkpoint.html | checkpoint接口地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/examples/ray/tutorial.ipynb | ./examples/ray/tutorial.ipynb | http://www.w3.org/2000/svg\ | w3地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/patches/megatron_v4.patch | ./patches/megatron_v4.patch | https://github.com/NVIDIA/apex | apex地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/.readthedocs.yaml | ./.readthedocs.yaml | https://docs.readthedocs.io/en/stable/config-file/v2.html | config文件地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/.github/workflows/e2e_gsm8k_megatron.yml | ./.github/workflows/e2e_gsm8k_megatron.yml | https://github.com/NVIDIA/Megatron-LM/tree/core_r0.11.0 | Megatron Core地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/.github/workflows/scorecard.yml | ./.github/workflows/scorecard.yml | https://github.com/ossf/scorecard/blob/main/docs/checks.md#branch-protection | scorecard地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/.github/workflows/scorecard.yml | ./.github/workflows/scorecard.yml | https://github.com/ossf/scorecard/blob/main/docs/checks.md#maintained | scorecard地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/.github/workflows/scorecard.yml | ./.github/workflows/scorecard.yml | https://github.com/ossf/scorecard-action?tab=readme-ov-file | scorecard地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/.github/workflows/scorecard.yml | ./.github/workflows/scorecard.yml | https://github.com/ossf/scorecard-action#publishing-results | scorecard 结果地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/docs/conf.py | ./docs/conf.py | https://www.sphinx-doc.org/en/master/usage/configuration.html | sphinx地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/recipe/dapo/src/config/dapo_trainer.yaml | ./recipe/dapo/src/config/dapo_trainer.yaml | https://arxiv.org/pdf/1912.09729 | DAPO论文地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/recipe/dapo/src/config/dapo_trainer.yaml | ./recipe/dapo/src/config/dapo_trainer.yaml | https://arxiv.org/abs/2410.21236 | DAPO论文地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/recipe/prime/prime_dp_rm.py | ./recipe/prime/prime_dp_rm.py | https://arxiv.org/abs/1707.06347 | PPO论文地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/recipe/r1/data_process.py | ./recipe/r1/data_process.py | https://github.com/LiveCodeBench/LiveCodeBench/blob/998c52d394b836f15fff3b9a29866191108ff81b/lcb_runner/prompts/code_generation.py#L140 | LiveCodeBench地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/recipe/r1/tasks/gpqa.py | ./recipe/r1/tasks/gpqa.py | https://github.com/openai/simple-evals/blob/90e3e821cabba2aeb6be651dcb662b253df04225/common.py#L25 | simple-evals地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/tests/e2e/envs/digit_completion/tokenizer.py | ./tests/e2e/envs/digit_completion/tokenizer.py | https://github.com/dariush-bahrami/character-tokenizer/blob/master/charactertokenizer/core.py | character-tokenizer地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/protocol.py | ./verl/blob/main/verl/protocol.py | https://pytorch.org/tensordict/ | tensordict地址 | -| 
开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/protocol.py | ./verl/blob/main/verl/protocol.py | https://pytorch.org/tensordict/tutorials/data_fashion | data_fashion地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/models/llama/megatron/layers/parallel_linear.py | ./verl/models/llama/megatron/layers/parallel_linear.py | https://github.com/vllm-project/vllm/blob/main/vllm/model_executor/layers/linear.py | vllm中linear文件地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/models/transformers/qwen2_vl.py | ./verl/models/transformers/qwen2_vl.py | https://github.com/huggingface/transformers/blob/v4.49.0/src/transformers/models/qwen2_5_vl/modeling_qwen2_5_vl.py#L1546 | transformers pr地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/third_party/sglang/parallel_state.py | ./verl/third_party/sglang/parallel_state.py | https://github.com/NVIDIA/Megatron-LM/blob/main/megatron/core/parallel_state.py | parallel_state文件地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/third_party/sglang/parallel_state.py | ./verl/third_party/sglang/parallel_state.py | https://discuss.pytorch.org/t/cuda-allocation-lifetime-for-inputs-to-distributed-all-reduce/191573 | distributed-all-reduce地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/third_party/vllm/__init__.py | ./verl/third_party/vllm/__init__.py | https://github.com/vllm-project/vllm/pull/12071 | vllm pr地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/third_party/vllm/vllm_v_0_3_1/arg_utils.py | ./verl/third_party/vllm/vllm_v_0_3_1/arg_utils.py | https://github.com/vllm-project/vllm/blob/main/vllm/engine/arg_utils.py | vllm arg_utils文件地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/third_party/vllm/vllm_v_0_3_1/config.py | ./verl/third_party/vllm/vllm_v_0_3_1/config.py | https://github.com/vllm-project/vllm/blob/main/vllm/config.py | vllm config文件地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/third_party/vllm/vllm_v_0_3_1/config.py | ./verl/third_party/vllm/vllm_v_0_3_1/config.py | https://github.com/vllm-project/vllm/issues/2147 | vllm pr地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/third_party/vllm/vllm_v_0_3_1/llm.py | ./verl/third_party/vllm/vllm_v_0_3_1/llm.py | https://github.com/vllm-project/vllm/blob/main/vllm/entrypoints/llm.py | vllm llm文件地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/third_party/vllm/vllm_v_0_3_1/llm_engine_sp.py | ./verl/third_party/vllm/vllm_v_0_3_1/llm_engine_sp.py | https://github.com/vllm-project/vllm/blob/main/vllm/engine/llm_engine.py | vllm llm_engine文件地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/third_party/vllm/vllm_v_0_3_1/model_loader.py | ./verl/third_party/vllm/vllm_v_0_3_1/model_loader.py | https://github.com/vllm-project/vllm/tree/main/vllm/model_executor/model_loader | vllm model_loader文件地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/third_party/vllm/vllm_v_0_3_1/model_runner.py | ./verl/third_party/vllm/vllm_v_0_3_1/model_runner.py | https://github.com/vllm-project/vllm/blob/main/vllm/worker/model_runner.py | vllm model_runner文件地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/third_party/vllm/vllm_v_0_3_1/parallel_state.py | ./verl/third_party/vllm/vllm_v_0_3_1/parallel_state.py | https://github.com/NVIDIA/Megatron-LM/blob/main/megatron/core/parallel_state.py | vllm parallel_state文件地址 | -| 开源代码引入 | 
https://github.com/volcengine/verl/blob/main/verl/third_party/vllm/vllm_v_0_3_1/tokenizer.py | ./verl/third_party/vllm/vllm_v_0_3_1/tokenizer.py | https://github.com/vllm-project/vllm/blob/main/vllm/transformers_utils/tokenizer_group/tokenizer_group.py | tokenizer文件地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/third_party/vllm/vllm_v_0_3_1/weight_loaders.py | ./verl/third_party/vllm/vllm_v_0_3_1/weight_loaders.py | https://github.com/vllm-project/vllm/tree/main/vllm/model_executor/models | models文件夹地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/third_party/vllm/vllm_v_0_3_1/worker.py | ./verl/third_party/vllm/vllm_v_0_3_1/worker.py | https://github.com/vllm-project/vllm/blob/main/vllm/worker/worker.py | worker文件地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/third_party/vllm/vllm_v_0_4_2/arg_utils.py | ./verl/third_party/vllm/vllm_v_0_4_2/arg_utils.py | https://github.com/vllm-project/vllm/blob/main/vllm/engine/arg_utils.py | vllm arg_utils文件地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/third_party/vllm/vllm_v_0_4_2/config.py | ./verl/third_party/vllm/vllm_v_0_4_2/config.py | https://github.com/vllm-project/vllm/blob/main/vllm/config.py | vllm config文件地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/third_party/vllm/vllm_v_0_4_2/dtensor_weight_loaders.py | ./verl/third_party/vllm/vllm_v_0_4_2/dtensor_weight_loaders.py | https://github.com/vllm-project/vllm/tree/main/vllm/model_executor/models | vllm models文件夹地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/third_party/vllm/vllm_v_0_4_2/hf_weight_loader.py | ./verl/third_party/vllm/vllm_v_0_4_2/hf_weight_loader.py | https://github.com/vllm-project/vllm/tree/main/vllm/model_executor/models | vllm models文件夹地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/third_party/vllm/vllm_v_0_4_2/llm.py | ./verl/third_party/vllm/vllm_v_0_4_2/llm.py | https://github.com/vllm-project/vllm/blob/main/vllm/entrypoints/llm.py | vllm llm文件地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/third_party/vllm/vllm_v_0_4_2/llm_engine_sp.py | ./verl/third_party/vllm/vllm_v_0_4_2/llm_engine_sp.py | https://github.com/vllm-project/vllm/blob/main/vllm/engine/llm_engine.py | vllm llm_engine文件地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/third_party/vllm/vllm_v_0_4_2/megatron_weight_loaders.py | ./verl/third_party/vllm/vllm_v_0_4_2/megatron_weight_loaders.py | https://github.com/vllm-project/vllm/tree/main/vllm/model_executor/models | vllm models文件夹地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/third_party/vllm/vllm_v_0_4_2/model_loader.py | ./verl/third_party/vllm/vllm_v_0_4_2/model_loader.py | https://github.com/vllm-project/vllm/tree/main/vllm/model_executor/model_loader | vllm model_loader文件地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/third_party/vllm/vllm_v_0_4_2/model_runner.py | ./verl/third_party/vllm/vllm_v_0_4_2/model_runner.py | https://github.com/vllm-project/vllm/blob/main/vllm/worker/model_runner.py | vllm model_runner文件地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/third_party/vllm/vllm_v_0_4_2/parallel_state.py | ./verl/third_party/vllm/vllm_v_0_4_2/parallel_state.py | https://github.com/NVIDIA/Megatron-LM/blob/main/megatron/core/parallel_state.py | vllm parallel_state文件地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/third_party/vllm/vllm_v_0_4_2/spmd_gpu_executor.py | 
./verl/third_party/vllm/vllm_v_0_4_2/spmd_gpu_executor.py | https://github.com/vllm-project/vllm/blob/main/vllm/executor/gpu_executor.py | vllm gpu_executor文件地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/third_party/vllm/vllm_v_0_4_2/tokenizer.py | ./verl/third_party/vllm/vllm_v_0_4_2/tokenizer.py | https://github.com/vllm-project/vllm/blob/main/vllm/transformers_utils/tokenizer_group/tokenizer_group.py | tokenizer文件地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/third_party/vllm/vllm_v_0_4_2/weight_loaders.py | ./verl/third_party/vllm/vllm_v_0_4_2/weight_loaders.py | https://github.com/vllm-project/vllm/tree/main/vllm/model_executor/models | models文件夹地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/third_party/vllm/vllm_v_0_4_2/worker.py | ./verl/third_party/vllm/vllm_v_0_4_2/worker.py | https://github.com/vllm-project/vllm/blob/main/vllm/worker/worker.py | worker文件地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/third_party/vllm/vllm_v_0_5_4/arg_utils.py | ./verl/third_party/vllm/vllm_v_0_5_4/arg_utils.py | https://github.com/vllm-project/vllm/blob/main/vllm/engine/arg_utils.py | vllm arg_utils文件地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/third_party/vllm/vllm_v_0_5_4/config.py | ./verl/third_party/vllm/vllm_v_0_5_4/config.py | https://github.com/vllm-project/vllm/blob/main/vllm/config.py | vllm config文件地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/third_party/vllm/vllm_v_0_5_4/dtensor_weight_loaders.py | ./verl/third_party/vllm/vllm_v_0_5_4/dtensor_weight_loaders.py | https://github.com/vllm-project/vllm/tree/main/vllm/model_executor/models | vllm models文件夹地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/third_party/vllm/vllm_v_0_5_4/hf_weight_loader.py | ./verl/third_party/vllm/vllm_v_0_5_4/hf_weight_loader.py | https://github.com/vllm-project/vllm/tree/main/vllm/model_executor/models | vllm models文件夹地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/third_party/vllm/vllm_v_0_5_4/llm.py | ./verl/third_party/vllm/vllm_v_0_5_4/llm.py | https://github.com/vllm-project/vllm/blob/main/vllm/entrypoints/llm.py | vllm llm文件地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/third_party/vllm/vllm_v_0_5_4/llm_engine_sp.py | ./verl/third_party/vllm/vllm_v_0_5_4/llm_engine_sp.py | https://github.com/vllm-project/vllm/blob/main/vllm/engine/llm_engine.py | vllm llm_engine文件地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/third_party/vllm/vllm_v_0_5_4/megatron_weight_loaders.py | ./verl/third_party/vllm/vllm_v_0_5_4/megatron_weight_loaders.py | https://github.com/vllm-project/vllm/tree/main/vllm/model_executor/models | vllm models文件夹地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/third_party/vllm/vllm_v_0_5_4/model_loader.py | ./verl/third_party/vllm/vllm_v_0_5_4/model_loader.py | https://github.com/vllm-project/vllm/tree/main/vllm/model_executor/model_loader | vllm model_loader文件地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/third_party/vllm/vllm_v_0_5_4/model_runner.py | ./verl/third_party/vllm/vllm_v_0_5_4/model_runner.py | https://github.com/vllm-project/vllm/blob/main/vllm/worker/model_runner.py | vllm model_runner文件地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/third_party/vllm/vllm_v_0_5_4/parallel_state.py | ./verl/third_party/vllm/vllm_v_0_5_4/parallel_state.py | https://github.com/NVIDIA/Megatron-LM/blob/main/megatron/core/parallel_state.py | vllm 
parallel_state文件地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/third_party/vllm/vllm_v_0_5_4/spmd_gpu_executor.py | ./verl/third_party/vllm/vllm_v_0_5_4/spmd_gpu_executor.py | https://github.com/vllm-project/vllm/blob/main/vllm/executor/gpu_executor.py | vllm gpu_executor文件地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/third_party/vllm/vllm_v_0_5_4/tokenizer.py | ./verl/third_party/vllm/vllm_v_0_5_4/tokenizer.py | https://github.com/vllm-project/vllm/blob/main/vllm/transformers_utils/tokenizer_group/tokenizer_group.py | tokenizer文件地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/third_party/vllm/vllm_v_0_5_4/weight_loaders.py | ./verl/third_party/vllm/vllm_v_0_5_4/weight_loaders.py | https://github.com/vllm-project/vllm/tree/main/vllm/model_executor/models | models文件夹地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/third_party/vllm/vllm_v_0_5_4/worker.py | ./verl/third_party/vllm/vllm_v_0_5_4/worker.py | https://github.com/vllm-project/vllm/blob/main/vllm/worker/worker.py | worker文件地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/third_party/vllm/vllm_v_0_6_3/arg_utils.py | ./verl/third_party/vllm/vllm_v_0_6_3/arg_utils.py | https://github.com/vllm-project/vllm/blob/main/vllm/engine/arg_utils.py | vllm arg_utils文件地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/third_party/vllm/vllm_v_0_6_3/config.py | ./verl/third_party/vllm/vllm_v_0_6_3/config.py | https://github.com/vllm-project/vllm/blob/main/vllm/config.py | vllm config文件地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/third_party/vllm/vllm_v_0_6_3/dtensor_weight_loaders.py | ./verl/third_party/vllm/vllm_v_0_6_3/dtensor_weight_loaders.py | https://github.com/vllm-project/vllm/tree/main/vllm/model_executor/models | vllm models文件夹地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/third_party/vllm/vllm_v_0_6_3/hf_weight_loader.py | ./verl/third_party/vllm/vllm_v_0_6_3/hf_weight_loader.py | https://github.com/vllm-project/vllm/tree/main/vllm/model_executor/models | vllm models文件夹地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/third_party/vllm/vllm_v_0_6_3/llm.py | ./verl/third_party/vllm/vllm_v_0_6_3/llm.py | https://github.com/vllm-project/vllm/blob/main/vllm/entrypoints/llm.py | vllm llm文件地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/third_party/vllm/vllm_v_0_6_3/llm_engine_sp.py | ./verl/third_party/vllm/vllm_v_0_6_3/llm_engine_sp.py | https://github.com/vllm-project/vllm/blob/main/vllm/engine/llm_engine.py | vllm llm_engine文件地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/third_party/vllm/vllm_v_0_6_3/megatron_weight_loaders.py | ./verl/third_party/vllm/vllm_v_0_6_3/megatron_weight_loaders.py | https://github.com/vllm-project/vllm/tree/main/vllm/model_executor/models | vllm models文件夹地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/third_party/vllm/vllm_v_0_6_3/model_loader.py | ./verl/third_party/vllm/vllm_v_0_6_3/model_loader.py | https://github.com/vllm-project/vllm/tree/main/vllm/model_executor/model_loader | vllm model_loader文件地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/third_party/vllm/vllm_v_0_6_3/model_runner.py | ./verl/third_party/vllm/vllm_v_0_6_3/model_runner.py | https://github.com/vllm-project/vllm/blob/main/vllm/worker/model_runner.py | vllm model_runner文件地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/third_party/vllm/vllm_v_0_6_3/parallel_state.py | 
./verl/third_party/vllm/vllm_v_0_6_3/parallel_state.py | https://github.com/NVIDIA/Megatron-LM/blob/main/megatron/core/parallel_state.py | vllm parallel_state文件地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/third_party/vllm/vllm_v_0_6_3/spmd_gpu_executor.py | ./verl/third_party/vllm/vllm_v_0_6_3/spmd_gpu_executor.py | https://github.com/vllm-project/vllm/blob/main/vllm/executor/gpu_executor.py | vllm gpu_executor文件地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/third_party/vllm/vllm_v_0_6_3/tokenizer.py | ./verl/third_party/vllm/vllm_v_0_6_3/tokenizer.py | https://github.com/vllm-project/vllm/blob/main/vllm/transformers_utils/tokenizer_group/tokenizer_group.py | tokenizer文件地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/third_party/vllm/vllm_v_0_6_3/worker.py | ./verl/third_party/vllm/vllm_v_0_6_3/worker.py | https://github.com/vllm-project/vllm/blob/main/vllm/worker/worker.py | worker文件地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/trainer/config/generation.yaml | ./verl/trainer/config/generation.yaml | https://arxiv.org/abs/2410.21236 | use_fire_sampling论文地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/trainer/config/ppo_megatron_trainer.yaml | ./verl/trainer/config/ppo_megatron_trainer.yaml | https://arxiv.org/pdf/1912.09729 | DAPO论文地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/trainer/config/ppo_trainer.yaml | ./verl/trainer/config/ppo_trainer.yaml | https://arxiv.org/pdf/1912.09729 | DAPO论文地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/trainer/config/ppo_trainer.yaml | ./verl/trainer/config/ppo_trainer.yaml | https://arxiv.org/abs/2410.21236 | use_fire_sampling论文地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/trainer/ppo/core_algos.py | ./verl/trainer/ppo/core_algos.py | https://arxiv.org/pdf/1909.08593.pdf | kl controller地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/trainer/ppo/core_algos.py | ./verl/trainer/ppo/core_algos.py | https://github.com/huggingface/trl/blob/main/trl/trainer/ppo_trainer.py | ppo_trainer地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/trainer/ppo/core_algos.py | ./verl/trainer/ppo/core_algos.py | https://arxiv.org/abs/1506.02438 | 计算Advantage论文地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/trainer/ppo/core_algos.py | ./verl/trainer/ppo/core_algos.py | https://arxiv.org/abs/2501.03262 | 计算Advantage论文地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/trainer/ppo/core_algos.py | ./verl/trainer/ppo/core_algos.py | https://arxiv.org/abs/2402.14740 | 计算Advantage论文地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/trainer/ppo/core_algos.py | ./verl/trainer/ppo/core_algos.py | https://arxiv.org/abs/2310.10505 | 计算Advantage论文地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/trainer/ppo/core_algos.py | ./verl/trainer/ppo/core_algos.py | https://github.com/huggingface/trl/blob/main/trl/trainer/ppo_trainer.py#L1122 | ppo_trainer文件地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/trainer/ppo/core_algos.py | ./verl/trainer/ppo/core_algos.py | https://arxiv.org/abs/1707.06347 | ppo clip文件地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/trainer/ppo/core_algos.py | ./verl/trainer/ppo/core_algos.py | https://arxiv.org/pdf/1912.09729 | ppo论文地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/trainer/ppo/core_algos.py | ./verl/trainer/ppo/core_algos.py | 
https://github.com/huggingface/trl/blob/main/trl/trainer/ppo_trainer.py#L1151 | ppo_trainer文件地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/trainer/ppo/core_algos.py | ./verl/trainer/ppo/core_algos.py | https://github.com/huggingface/trl/blob/main/trl/trainer/ppo_trainer.py#L1104 | ppo_trainer文件地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/trainer/ppo/core_algos.py | ./verl/trainer/ppo/core_algos.py | http://joschu.net/blog/kl-approx.html | kl论文地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/trainer/ppo/ray_trainer.py | ./verl/trainer/ppo/ray_trainer.py | https://github.com/huggingface/trl/blob/951ca1841f29114b969b57b26c7d3e80a39f75a0/trl/trainer/ppo_trainer.py#L837 | ppo_trainer文件地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/trainer/ppo/ray_trainer.py | ./verl/trainer/ppo/ray_trainer.py | https://github.com/volcengine/verl/blob/master/examples/ray/tutorial.ipynb | ray example地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/trainer/ppo/ray_trainer.py | ./verl/trainer/ppo/ray_trainer.py | https://github.com/ray-project/ray/pull/45699 | ray pr地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/utils/fsdp_utils.py | ./verl/utils/fsdp_utils.py | https://github.com/huggingface/transformers/src/transformers/trainer.py | trainer文件地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/utils/seqlen_balancing.py | ./verl/utils/seqlen_balancing.py | https://en.wikipedia.org/wiki/Largest_differencing_method | karmarkar_karp计算方式 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/utils/tokenizer.py | ./verl/utils/tokenizer.py | https://huggingface.co/google/gemma-2-2b-it/commit/17a01657f5c87135bcdd0ec7abb4b2dece04408a | gemma-2地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/utils/tokenizer.py | ./verl/utils/tokenizer.py | https://github.com/huggingface/transformers/blob/v4.49.0/src/transformers/models/auto/processing_auto.py#L344 | processing_auto文件地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/utils/torch_functional.py | ./verl/utils/torch_functional.py | https://github.com/pytorch/pytorch/issues/563#issuecomment-330103591 | pytorch issue文件地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/utils/torch_functional.py | ./verl/utils/torch_functional.py | https://github.com/pytorch/pytorch/issues/2793#issuecomment-428784713 | pytorch issue文件地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/utils/tracking.py | ./verl/utils/tracking.py | https://github.com/wandb/wandb/issues/2981#issuecomment-1997445737 | wandb issue文件地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/utils/tracking.py | ./verl/utils/tracking.py | https://mlflow.org/docs/latest/api_reference/python_api/mlflow.html?highlight=log_artifact#mlflow.log_artifact | mlflow地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/utils/ulysses.py | ./verl/utils/ulysses.py | https://arxiv.org/abs/2309.14509 | deepspeed ulysses论文地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/utils/ulysses.py | ./verl/utils/ulysses.py | https://github.com/microsoft/DeepSpeed/blob/master/deepspeed/sequence/layer.py | layer文件地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/utils/reward_score/__init__.py | ./verl/utils/reward_score/__init__.py | https://github.com/huggingface/Math-Verify | Math-Verify地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/utils/reward_score/math.py | 
./verl/utils/reward_score/math.py | https://github.com/EleutherAI/lm-evaluation-harness/blob/main/lm_eval/tasks/hendrycks_math/utils.py | utils地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/utils/reward_score/math.py | ./verl/utils/reward_score/math.py | https://github.com/EleutherAI/lm-evaluation-harness/blob/master/lm_eval/tasks/hendrycks_math.py | hendrycks_math地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/utils/reward_score/math_dapo.py | ./verl/utils/reward_score/math_dapo.py | https://github.com/EleutherAI/lm-evaluation-harness/blob/main/lm_eval/tasks/hendrycks_math/utils.py | utils地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/utils/reward_score/prime_code/testing_util.py | ./verl/utils/reward_score/prime_code/testing_util.py | https://stackoverflow.com/a/16571630/6416660 | stackoverflow地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/utils/reward_score/prime_code/utils.py | ./verl/utils/reward_score/prime_code/utils.py | https://huggingface.co/spaces/codeparrot/apps_metric/blob/main/utils.py | stackoverflow地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/utils/reward_score/prime_math/__init__.py | ./verl/utils/reward_score/prime_math/__init__.py | https://github.com/openai/prm800k/blob/main/prm800k/grading/grader.py | grader文件地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/utils/reward_score/prime_math/grader.py | ./verl/utils/reward_score/prime_math/grader.py | https://github.com/microsoft/ToRA/blob/main/src/eval/grader.py | grader文件地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/utils/reward_score/prime_math/grader.py | ./verl/utils/reward_score/prime_math/grader.py | https://github.com/microsoft/ProphetNet/tree/master/CRITIC | ProphetNet地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/utils/reward_score/prime_math/grader.py | ./verl/utils/reward_score/prime_math/grader.py | https://github.com/openai/prm800k | prm800k地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/utils/reward_score/prime_math/math_normalize.py | ./verl/utils/reward_score/prime_math/math_normalize.py | https://github.com/openai/prm800k/blob/main/prm800k/grading/math_normalize.py | math_normalize文件地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/workers/fsdp_workers.py | ./verl/workers/fsdp_workers.py | https://github.com/sgl-project/sglang/blob/00f42707eaddfc2c0528e5b1e0094025c640b7a0/python/sglang/srt/layers/quantization/fp8_utils.py#L76 | fp8_utils文件地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/workers/fsdp_workers.py | ./verl/workers/fsdp_workers.py | https://pytorch.org/docs/stable/notes/fsdp.html#fsdp-notes | fsdp.html文件地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/workers/megatron_workers.py | ./verl/workers/megatron_workers.py | https://github.com/pytorch/pytorch/issues/89492 | pytorch issue地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/workers/megatron_workers.py | ./verl/workers/megatron_workers.py | https://github.com/ray-project/ray/pull/44385 | ray pr地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/workers/actor/base.py | ./verl/workers/actor/base.py | https://omegaconf.readthedocs.io/ | omegaconf地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/workers/actor/dp_actor.py | ./verl/workers/actor/dp_actor.py | https://arxiv.org/abs/1707.06347 | PPO论文地址 | -| 开源代码引入 | 
https://github.com/volcengine/verl/blob/main/verl/workers/actor/megatron_actor.py | ./verl/workers/actor/megatron_actor.py | https://arxiv.org/abs/1707.06347 | PPO论文地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/workers/actor/megatron_actor.py | ./verl/workers/actor/megatron_actor.py | https://arxiv.org/pdf/2104.04473.pdf | 论文地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/workers/actor/megatron_actor.py | ./verl/workers/actor/megatron_actor.py | https://github.com/Dao-AILab/flash-attention/blob/main/flash_attn/bert_padding.py | bert_padding地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/workers/critic/dp_critic.py | ./verl/workers/critic/dp_critic.py | https://arxiv.org/abs/1707.06347 | PPO论文地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/workers/reward_manager/prime.py | ./verl/workers/reward_manager/prime.py | https://github.com/PRIME-RL/PRIME | PRIME地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/workers/rollout/hf_rollout.py | ./verl/workers/rollout/hf_rollout.py | https://github.com/pytorch/pytorch/issues/100069 | pytorch issue地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/workers/rollout/vllm_rollout/vllm_rollout_spmd.py | ./verl/workers/rollout/vllm_rollout/vllm_rollout_spmd.py | https://github.com/volcengine/verl/pull/772 | verl仓地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/workers/sharding_manager/__init__.py | ./verl/workers/sharding_manager/__init__.py | https://github.com/sgl-project/sglang/blob/00f42707eaddfc2c0528e5b1e0094025c640b7a0/python/sglang/srt/layers/quantization/fp8_utils.py#L76 | fp8_utils文件地址 | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/workers/sharding_manager/fsdp_vllm.py | ./verl/workers/sharding_manager/fsdp_vllm.py | https://pytorch.org/docs/stable/notes/cuda.html#memory-management | pytorch内存管理readme | -| 开源代码引入 | https://github.com/volcengine/verl/blob/main/verl/workers/sharding_manager/fsdp_vllm.py | ./verl/workers/sharding_manager/fsdp_vllm.py | https://github.com/vllm-project/vllm/blob/v0.7.3/vllm/device_allocator/cumem.py#L103 | cumem文件地址 | +| 文件位置 | 公网地址 | 公网地址用途 | +|-------------------------------------------------------------------------------------|---------------------------------------------------------|-------------| +| ModelZoo-PyTorch/PyTorch/built-in/rl/VeRL_for_PyTorch/docker/Dockerfile.ngc.vllm | https://download.pytorch.org/whl/cu124 | 三方库连接 | +| ModelZoo-PyTorch/PyTorch/built-in/rl/VeRL_for_PyTorch/docker/Dockerfile.ngc.vllm0.8 | https://mirrors.tuna.tsinghua.edu.cn/pypi/web/simple | 清华源地址 | +| ModelZoo-PyTorch/PyTorch/built-in/rl/VeRL_for_PyTorch/docker/Dockerfile.ngc.vllm0.8 | https://mirrors.tuna.tsinghua.edu.cn/ubuntu/ | ubuntu镜像地址 | +| ModelZoo-PyTorch/PyTorch/built-in/rl/VeRL_for_PyTorch/docker/Dockerfile.sglang | https://mirrors.aliyun.com/pypi/simple/ | 阿里源地址 | +| ModelZoo-PyTorch/PyTorch/built-in/rl/VeRL_for_PyTorch/docker/Dockerfile.sglang | https://mirrors.ustc.edu.cn/ubuntu/ | ubuntu镜像地址 | +| ModelZoo-PyTorch/PyTorch/built-in/rl/VeRL_for_PyTorch/setup.py | zhangchi.usc1992@bytedance.com, gmsheng@connect.hku.hk | 作者邮箱 | \ No newline at end of file diff --git a/PyTorch/contrib/audio/FastPitch/public_address_statement.md b/PyTorch/contrib/audio/FastPitch/public_address_statement.md index 7e585fba7102efbb2aea2a9fe7285dd7d66f7461..cb2c1f6eed955fdc1ab62592b11ee6b4612127f6 100644 --- a/PyTorch/contrib/audio/FastPitch/public_address_statement.md +++ 
b/PyTorch/contrib/audio/FastPitch/public_address_statement.md @@ -1,21 +1,7 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------------------------|----------------------------------------------|------------------------|--------| -| 开发引入 | / | url.ini | https://github.com/Alexir/CMUdict/raw/master/cmudict-0.7b | 下载三方库 | -| 开发引入 | / | url.ini | http://data.keithito.com/data/speech/ | 下载数据集 | -| 开发引入 | / | url.ini | https://api.ngc.nvidia.com/v2/models/nvidia/fastpitch_pyt_amp_ckpt_v1_1/versions/21.05.0/zip | 下载权重文件 | -| 开发引入 | / | url.ini | https://api.ngc.nvidia.com/v2/models/nvidia/waveglow_ckpt_amp_256/versions/20.01.0/zip | 下载权重文件 | -| 开发引入 | / | url.ini | https://api.ngc.nvidia.com/v2/models/nvidia/fastpitch_pyt_amp_ckpt_v1/versions/20.02.0/zip | 下载权重文件 | -| 开发引入 | / | url.ini | https://data.keithito.com/data/speech/LJSpeech-1.1.tar.bz2 | 下载数据集 | -| 开源代码引入 | https://https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechSynthesis/FastPitch/triton/convert_model.py | FastPitch/triton/convert_model.py | https://www.tensorflow.org/guide/gpu#limiting_gpu_memory_growth | tensorflow.org源码公网来源说明 | -| 开源代码引入 | https://https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechSynthesis/FastPitch/triton/config_model_on_triton.py | FastPitch/triton/config_model_on_triton.py | https://github.com/triton-inference-server/server/blob/master/docs/model_configuration.md#dynamic-batcher | model_configuration.md在github网页上的公网来源说明 | -| 开源代码引入 | https://https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechSynthesis/FastPitch/triton/config_model_on_triton.py | FastPitch/triton/config_model_on_triton.py | https://github.com/triton-inference-server/server/blob/master/docs/model_configuration.md#auto-generated-model-configuration | model_configuration.md在github网页上的公网来源说明 | -| 开源代码引入 | https://https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechSynthesis/FastPitch/common/text/text_processing.py | FastPitch/common/text/text_processing.py | https://github.com/keithito/tacotron | tacotron在github网页上的公网来源说明 | -| 开源代码引入 | https://https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechSynthesis/FastPitch/common/text/symbols.py | FastPitch/common/text/symbols.py | https://github.com/keithito/tacotron | tacotron在github网页上的公网来源说明 | -| 开源代码引入 | https://https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechSynthesis/FastPitch/common/text/numerical.py | FastPitch/common/text/numerical.py | https://github.com/keithito/tacotron | tacotron在github网页上的公网来源说明 | -| 开源代码引入 | https://https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechSynthesis/FastPitch/common/text/numerical.py | FastPitch/common/text/numerical.py | https://stackoverflow.com/questions/19308177/converting-roman-numerals-to-integers-in-python | converting-roman-numerals-to-integers-in-python在stackoverflow.com网页上的公网来源说明 | -| 开源代码引入 | https://https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechSynthesis/FastPitch/common/text/cmudict.py | FastPitch/common/text/cmudict.py | https://github.com/keithito/tacotron | tacotron在github网页上的公网来源说明 | -| 开源代码引入 | https://https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechSynthesis/FastPitch/common/text/cmudict.py | FastPitch/common/text/cmudict.py | http://www.speech.cs.cmu.edu/cgi-bin/cmudict | tacotron在github网页上的公网来源说明 | -| 开源代码引入 | 
https://https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechSynthesis/FastPitch/common/text/cleaners.py | FastPitch/common/text/cleaners.py | https://github.com/keithito/tacotron | tacotron在github网页上的公网来源说明 | -| 开源代码引入 | https://https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechSynthesis/FastPitch/common/text/cleaners.py | FastPitch/common/text/cleaners.py | https://pypi.python.org/pypi/Unidecode | Unidecode在pypi.python.org的公网来源说明 | -| 开源代码引入 | https://https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechSynthesis/FastPitch/common/stft.py | FastPitch/common/stft.py | https://github.com/pseeth/pytorch-stft | pytorch-stft在github.com上的公网来源说明 | -| 开发引入 | / | FastPitch/triton/requirements.txt | https://github.com/triton-inference-server/model_navigator.git@v0.2.1#egg=model_navigator | 相关依赖 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|----------------------------------------------------------|----------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/audio/FastPitch/url.ini | https://api.ngc.nvidia.com/v2/models/nvidia/waveglow_ckpt_amp_256/versions/20.01.0/zip | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/FastPitch/url.ini | http://data.keithito.com/data/speech/ | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/FastPitch/url.ini | https://data.keithito.com/data/speech/LJSpeech-1.1.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/FastPitch/url.ini | https://api.ngc.nvidia.com/v2/models/nvidia/fastpitch_pyt_amp_ckpt_v1_1/versions/21.05.0/zip | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/FastPitch/url.ini | https://api.ngc.nvidia.com/v2/models/nvidia/fastpitch_pyt_amp_ckpt_v1/versions/20.02.0/zip | 权重地址 | \ No newline at end of file diff --git a/PyTorch/contrib/audio/Jasper/public_address_statement.md b/PyTorch/contrib/audio/Jasper/public_address_statement.md index dd0074bb9a002eb8788b4a1c1bced70ee22cb8f7..470d65db3109b6eeb96c138260ae449d0f1d2e06 100644 --- a/PyTorch/contrib/audio/Jasper/public_address_statement.md +++ b/PyTorch/contrib/audio/Jasper/public_address_statement.md @@ -1,30 +1,3 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------------------------|----------------------------------------------|------------------------|-------| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechRecognition/Jasper/Dockerfile | Jasper/Dockerfile | https://developer.download.nvidia.com/compute/redist | 下载三方库 | -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechRecognition/Jasper/triton/scripts/run_perf_client.sh | Jasper/triton/scripts/run_perf_client.sh | http://${SERVER_HOSTNAME}:8000/api/status/${MODEL_NAME} | 检查服务器 | -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechRecognition/Jasper/triton/triton_librispeech.csv | Jasper/triton/triton_librispeech.csv | http://www.openslr.org/resources/12/test-clean.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechRecognition/Jasper/utils/inference_librispeech.csv | Jasper/utils/inference_librispeech.csv | http://www.openslr.org/resources/12/dev-clean.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechRecognition/Jasper/utils/inference_librispeech.csv | Jasper/utils/inference_librispeech.csv | 
http://www.openslr.org/resources/12/dev-other.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechRecognition/Jasper/utils/inference_librispeech.csv | Jasper/utils/inference_librispeech.csv | http://www.openslr.org/resources/12/test-clean.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechRecognition/Jasper/utils/inference_librispeech.csv | Jasper/utils/inference_librispeech.csv | http://www.openslr.org/resources/12/test-other.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechRecognition/Jasper/utils/librispeech.csv | Jasper/utils/librispeech.csv | http://www.openslr.org/resources/12/dev-clean.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechRecognition/Jasper/utils/librispeech.csv | Jasper/utils/librispeech.csv | http://www.openslr.org/resources/12/dev-other.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechRecognition/Jasper/utils/librispeech.csv | Jasper/utils/librispeech.csv | http://www.openslr.org/resources/12/test-clean.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechRecognition/Jasper/utils/librispeech.csv | Jasper/utils/librispeech.csv | http://www.openslr.org/resources/12/test-other.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechRecognition/Jasper/utils/librispeech.csv | Jasper/utils/librispeech.csv | http://www.openslr.org/resources/12/train-clean-100.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechRecognition/Jasper/utils/librispeech.csv | Jasper/utils/librispeech.csv | http://www.openslr.org/resources/12/train-clean-360.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechRecognition/Jasper/utils/librispeech.csv | Jasper/utils/librispeech.csv | http://www.openslr.org/resources/12/train-other-500.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechRecognition/Jasper/npu_fused_adamw.py | Jasper/npu_fused_adamw.py | https://arxiv.org/abs/1412.6980 | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechRecognition/Jasper/npu_fused_adamw.py | Jasper/npu_fused_adamw.py | https://arxiv.org/abs/1711.05101 | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechRecognition/Jasper/npu_fused_adamw.py | Jasper/npu_fused_adamw.py | https://openreview.net/forum?id=ryQu7f-RZ | openreview.net公网来源说明 | -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechRecognition/Jasper/jasper/model.py | Jasper/jasper/model.py | https://arxiv.org/pdf/1904.03288.pdf | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechRecognition/Jasper/common/text/symbols.py | Jasper/common/text/symbols.py | https://github.com/keithito/tacotron | tacotron在github上的公网来源说明 | -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechRecognition/Jasper/common/text/numbers.py | Jasper/common/text/numbers.py | https://github.com/keithito/tacotron | tacotron在github上的公网来源说明 | -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechRecognition/Jasper/common/text/cleaners.py | 
Jasper/common/text/cleaners.py | https://github.com/keithito/tacotron | tacotron在github上的公网来源说明 | -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechRecognition/Jasper/common/text/cleaners.py | Jasper/common/text/cleaners.py | https://github.com/keithito/tacotron | tacotron在github上的公网来源说明 | -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechRecognition/Jasper/common/text/__init__.py | Jasper/common/text/__init__.py | https://github.com/keithito/tacotron | tacotron在github上的公网来源说明 | -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechRecognition/Jasper/common/optimizers.py | Jasper/common/optimizers.py | https://arxiv.org/abs/1412.6980 | 参考论文地址公网地址来源说明 | -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechRecognition/Jasper/common/optimizers.py | Jasper/common/optimizers.py | https://openreview.net/forum?id=ryQu7f-RZ | openreview.net公网来源说明 | -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechRecognition/Jasper/common/optimizers.py | Jasper/common/features.py | https://arxiv.org/abs/1904.08779 | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechRecognition/Jasper/common/optimizers.py | Jasper/common/features.py | https://arxiv.org/abs/1904.08779 | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechRecognition/Jasper/common/optimizers.py | Jasper/common/features.py | https://arxiv.org/pdf/1708.04552.pdf | 参考论文地址公网来源说明 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|----------------------------------------------------------|---------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/audio/Jasper/Dockerfile | https://developer.download.nvidia.com/compute/redist nvidia-dali-cuda110==1.2.0 | 三方库连接 | \ No newline at end of file diff --git a/PyTorch/contrib/audio/Tacotron2_for_PyTorch/public_address_statement.md b/PyTorch/contrib/audio/Tacotron2_for_PyTorch/public_address_statement.md index 93f93fa90da6e957042f288df83618ca9f4d61c0..7989412b0d8d33d3b56676ad44f03bbb7caa52e8 100644 --- a/PyTorch/contrib/audio/Tacotron2_for_PyTorch/public_address_statement.md +++ b/PyTorch/contrib/audio/Tacotron2_for_PyTorch/public_address_statement.md @@ -1,12 +1,3 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------------------------|----------------------------------------------|------------------------|-------| -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechSynthesis/Tacotron2/scripts/prepare_dataset.sh | Tacotron2_for_PyTorch/scripts/prepare_dataset.sh | http://data.keithito.com/data/speech/ | 下载数据集 | -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechSynthesis/Tacotron2/tacotron2_common/stft.py | Tacotron2_for_PyTorch/tacotron2_common/stft.py | https://github.com/pseeth/pytorch-stft | STFT的Module类说明 | -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechSynthesis/Tacotron2/tacotron2/text/symbols.py | Tacotron2_for_PyTorch/tacotron2/text/symbols.py | https://github.com/keithito/tacotron | tacotron在github上的公网来源说明 | -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechSynthesis/Tacotron2/tacotron2/text/numbers.py | Tacotron2_for_PyTorch/tacotron2/text/numbers.py | 
https://github.com/keithito/tacotron | tacotron在github上的公网来源说明 | -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechSynthesis/Tacotron2/tacotron2/text/cmudict.py | Tacotron2_for_PyTorch/tacotron2/text/cmudict.py | https://github.com/keithito/tacotron | tacotron在github上的公网来源说明 | -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechSynthesis/Tacotron2/tacotron2/text/cmudict.py | Tacotron2_for_PyTorch/tacotron2/text/cmudict.py | http://www.speech.cs.cmu.edu/cgi-bin/cmudict | cmudict源码在www.speech.cs.cmu.edu上的公网来源说明 | -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechSynthesis/Tacotron2/tacotron2/text/cleaners.py | Tacotron2_for_PyTorch/tacotron2/text/cleaners.py | https://github.com/keithito/tacotron | tacotron在github上的公网来源说明 | -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechSynthesis/Tacotron2/tacotron2/text/cleaners.py | Tacotron2_for_PyTorch/tacotron2/text/cleaners.py | https://pypi.python.org/pypi/Unidecode | Unidecode在pypi.python.org上的公网来源说明 | -| 开源代码引入 | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechSynthesis/Tacotron2/tacotron2/text/__init__.py | Tacotron2_for_PyTorch/tacotron2/text/__init__.py | https://github.com/keithito/tacotron | tacotron在github上的公网来源说明 | -| 开发引入 | / | Tacotron2_for_PyTorch/requirements.txt | https://github.com/NVIDIA/dllogger | 相关依赖 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|-----------------------------------------------------------------------------------------|--------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/audio/Tacotron2_for_PyTorch/scripts/prepare_dataset.sh | http://data.keithito.com/data/speech/$BZ2ARCHIVE | 数据集地址 | \ No newline at end of file diff --git a/PyTorch/contrib/audio/deepspeech/public_address_statement.md b/PyTorch/contrib/audio/deepspeech/public_address_statement.md index f431fd506ed6cc838dcbbc52ae62c9bbecf39f58..ef327001ac190baf9525946b371718e459c4d6f2 100644 --- a/PyTorch/contrib/audio/deepspeech/public_address_statement.md +++ b/PyTorch/contrib/audio/deepspeech/public_address_statement.md @@ -1,28 +1,12 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------------------------|----------------------------------------------|------------------------|-------| -| 开源代码引入 | https://github.com/SeanNaren/deepspeech.pytorch/blob/master/data/an4.py | deepspeech/data/an4.py | https://github.com/SeanNaren/deepspeech.pytorch/releases/download/V3.0/an4.tar.gz | 下载数据集 | -| 开发引入 | / | url.ini | https://common-voice-data-download.s3.amazonaws.com/cv_corpus_v1.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/SeanNaren/deepspeech.pytorch/blob/master/data/librispeech.py | deepspeech/data/librispeech.py | http://www.openslr.org/resources/12/train-clean-100.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/SeanNaren/deepspeech.pytorch/blob/master/data/librispeech.py | deepspeech/data/librispeech.py | http://www.openslr.org/resources/12/train-clean-360.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/SeanNaren/deepspeech.pytorch/blob/master/data/librispeech.py | deepspeech/data/librispeech.py | http://www.openslr.org/resources/12/train-other-500.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/SeanNaren/deepspeech.pytorch/blob/master/data/librispeech.py | deepspeech/data/librispeech.py | http://www.openslr.org/resources/12/dev-clean.tar.gz | 下载数据集 | -| 开源代码引入 | 
https://github.com/SeanNaren/deepspeech.pytorch/blob/master/data/librispeech.py | deepspeech/data/librispeech.py | http://www.openslr.org/resources/12/dev-other.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/SeanNaren/deepspeech.pytorch/blob/master/data/librispeech.py | deepspeech/data/librispeech.py | http://www.openslr.org/resources/12/test-clean.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/SeanNaren/deepspeech.pytorch/blob/master/data/librispeech.py | deepspeech/data/librispeech.py | http://www.openslr.org/resources/12/test-other.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/SeanNaren/deepspeech.pytorch/blob/master/data/ted.py | deepspeech/data/ted.py | http://www.openslr.org/resources/19/TEDLIUM_release2.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/SeanNaren/deepspeech.pytorch/blob/master/data/voxforge.py | deepspeech/data/voxforge.py | http://www.repository.voxforge1.org/downloads/SpeechCorpus/Trunk/Audio/Main/16kHz_16bit/ | 下载数据集 | -| 开发引入 | / | url.ini | https://github.com/SeanNaren/warp-ctc.git | 下载三方库 | -| 开源代码引入 | https://github.com/SeanNaren/deepspeech.pytorch/blob/master/Dockerfile | deepspeech/Dockerfile | https://github.com/parlance/ctcdecode.git | 下载三方库 | -| 开发引入 | / | url.ini | https://github.com/NVIDIA/apex.git | 下载三方库 | -| 开发引入 | / | url.ini | https://github.com/SeanNaren/deepspeech.pytorch/releases/latest/download/an4_pretrained_v2.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://github.com/SeanNaren/deepspeech.pytorch/releases/latest/download/librispeech_pretrained_v2.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://github.com/SeanNaren/deepspeech.pytorch/releases/latest/download/ted_pretrained_v2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/SeanNaren/deepspeech.pytorch/blob/master/tests/pretrained_smoke_test.py | deepspeech/tests/pretrained_smoke_test.py | http://www.openslr.org/resources/11/3-gram.pruned.3e-7.arpa.gz | 下载数据集 | -| 开源代码引入 | https://github.com/SeanNaren/deepspeech.pytorch/blob/master/deepspeech_pytorch/loader/sparse_image_warp.py | deepspeech/deepspeech_pytorch/loader/spec_augment.py | https://arxiv.org/pdf/1904.08779.pdf | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/SeanNaren/deepspeech.pytorch/blob/master/deepspeech_pytorch/loader/sparse_image_warp.py | deepspeech/deepspeech_pytorch/loader/sparse_image_warp.py | https://en.wikipedia.org/wiki/Polyharmonic_spline | Polyharmonic_spline在en.wikipedia.org网页的公网来源说明 | -| 开源代码引入 | https://github.com/SeanNaren/deepspeech.pytorch/blob/master/deepspeech_pytorch/loader/sparse_image_warp.py | deepspeech/deepspeech_pytorch/loader/sparse_image_warp.py | https://en.wikipedia.org/wiki/Polyharmonic_spline | Polyharmonic_spline在en.wikipedia.org网页的公网来源说明 | -| 开源代码引入 | https://github.com/SeanNaren/deepspeech.pytorch/blob/master/deepspeech_pytorch/loader/data_loader.py | deepspeech/deepspeech_pytorch/loader/data_loader.py | https://github.com/willfrey/audio/blob/master/torchaudio/transforms.py | transforms.py在github.com/willfrey上的公网来源说明 | -| 开源代码引入 | https://github.com/SeanNaren/deepspeech.pytorch/blob/master/deepspeech_pytorch/configs/train_config.py | deepspeech/deepspeech_pytorch/configs/train_config.py | https://github.com/willfrey/audio/blob/master/torchaudio/transforms.py | transforms.py在github.com/willfrey上的公网来源说明 | -| 开源代码引入 | https://github.com/SeanNaren/deepspeech.pytorch/blob/master/deepspeech_pytorch/bidirectional_lstm.py | deepspeech/deepspeech_pytorch/bidirectional_lstm.py | https://ieeexplore.ieee.org/document/650093 | ieeexplore.ieee.org公网来源说明 | -| 开源代码引入 | 
https://github.com/SeanNaren/deepspeech.pytorch/blob/master/deepspeech_pytorch/data/an4.py | deepspeech/deepspeech_pytorch/data/an4.py | http://www.speech.cs.cmu.edu/databases/an4/an4_raw.bigendian.tar.gz | 下载数据集 | -| 开源代码引入 | https://github.com/SeanNaren/deepspeech.pytorch/blob/master/deepspeech_pytorch/benchmark.py | deepspeech/deepspeech_pytorch/benchmark.py | https://nvidia.github.io/apex/amp.html for more information | nvidia.github.io信息引用的公网来源说明 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|-----------------------------------------------------------------------|------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/audio/deepspeech/data/librispeech.py | http://www.openslr.org/resources/12/dev-clean.tar.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/deepspeech/data/librispeech.py | http://www.openslr.org/resources/12/train-clean-100.tar.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/deepspeech/data/librispeech.py | http://www.openslr.org/resources/12/test-other.tar.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/deepspeech/data/librispeech.py | http://www.openslr.org/resources/12/test-clean.tar.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/deepspeech/data/librispeech.py | http://www.openslr.org/resources/12/dev-other.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/deepspeech/data/librispeech.py | http://www.openslr.org/resources/12/train-other-500.tar.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/deepspeech/data/librispeech.py | http://www.openslr.org/resources/12/train-clean-360.tar.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/deepspeech/data/ted.py | http://www.openslr.org/resources/19/TEDLIUM_release2.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/deepspeech/data/voxforge.py | http://www.repository.voxforge1.org/downloads/SpeechCorpus/Trunk/Audio/Main/16kHz_16bit/ | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/deepspeech/url.ini | https://common-voice-data-download.s3.amazonaws.com/cv_corpus_v1.tar.gz | 数据集链接 | \ No newline at end of file diff --git a/PyTorch/contrib/audio/tdnn/public_address_statement.md b/PyTorch/contrib/audio/tdnn/public_address_statement.md index c2d4d03e1654c464864542bc41bcb223a530c4c1..e32748f196cdc671a7b3d7c27c859fa987dd5f90 100644 --- a/PyTorch/contrib/audio/tdnn/public_address_statement.md +++ b/PyTorch/contrib/audio/tdnn/public_address_statement.md @@ -1,153 +1,6 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------------------------|----------------------------------------------|------------------------|----| -| 开源代码引入 | https://github.com/speechbrain/speechbrain/setup.py|tdnn/setup.py | https://github.com/pypa/pip/issues/7953 | 模型相关说明 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/setup.py|tdnn/setup.py | speechbrain@gmail.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/docs/coverage.md|tdnn/setup.py | https://speechbrain.github.io/ | 模型相关说明 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/lobes/models/transformer/Transformer.py|tdnn/speechbrain/lobes/models/transformer/Transformer.py | https://arxiv.org/pdf/1706.03762.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/lobes/models/transformer/Transformer.py|tdnn/speechbrain/lobes/models/transformer/TransformerASR.py | https://arxiv.org/pdf/1706.03762.pdf | 参考论文地址 | -| 开源代码引入 | 
https://github.com/speechbrain/speechbrain/speechbrain/lobes/models/transformer/Transformer.py|tdnn/speechbrain/lobes/models/transformer/TransformerLM.py | https://arxiv.org/pdf/1706.03762.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/lobes/models/transformer/Transformer.py|tdnn/speechbrain/lobes/models/transformer/TransformerST.py | https://arxiv.org/pdf/1706.03762.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/alignment/ctc_segmentation.py|tdnn/speechbrain/alignment/ctc_segmentation.py | https://github.com/lumaku/ctc-segmentation | 源码实现 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/alignment/ctc_segmentation.py|tdnn/speechbrain/alignment/ctc_segmentation.py | https://arxiv.org/abs/2007.09127 | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/alignment/ctc_segmentation.py|tdnn/speechbrain/alignment/ctc_segmentation.py | https://github.com/lumaku/ctc-segmentation | 源码实现 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/dataio/dataio.py|tdnn/speechbrain/dataio/dataio.py | https://github.com/speechbrain/speechbrain/issues/1925 | 模型相关说明 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/dataio/dataio.py|tdnn/speechbrain/dataio/dataio.py | https://github.com/pytorch/audio/issues/2524 | 模型相关说明 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/dataio/dataio.py|tdnn/speechbrain/dataio/dataio.py | https://discuss.pytorch.org/t/how-to-generate-variable-length-mask/23397/3 | 模型相关说明 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/dataio/dataio.py|tdnn/speechbrain/dataio/dataio.py | https://github.com/vesis84/kaldi-io-for-python | 源码实现 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/decoders/ctc.py|tdnn/speechbrain/decoders/ctc.py | https://www.merl.com/publications/docs/TR2017-190.pdf | 模型相关说明 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/decoders/ctc.py|tdnn/speechbrain/decoders/ctc.py | https://github.com/espnet/espnet/blob/master/espnet/nets/ctc_prefix_score.py | 源码实现 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/dataio/sampler.py|tdnn/speechbrain/dataio/sampler.py | https://www.tensorflow.org/api_docs/python/tf/data/experimental/bucket_by_sequence_length | 模型相关说明 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/recipes/CommonVoice/ASR/transformer/hparams/train_ar_hf_whisper.yaml|tdnn/speechbrain/decoders/seq2seq.py | https://cdn.openai.com/papers/whisper.pdf | 模型相关说明 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/decoders/transducer.py|tdnn/speechbrain/decoders/transducer.py | https://arxiv.org/pdf/1911.01629.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/decoders/transducer.py|tdnn/speechbrain/decoders/transducer.py | https://arxiv.org/pdf/1911.01629.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/decoders/transducer.py|tdnn/speechbrain/decoders/transducer.py | https://github.com/kaldi-asr/kaldi/blob/master/src/decoder/simple-decoder.cc | 源码实现 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/recipes/CommonVoice/ASR/transducer/hparams/train_fr.yaml|tdnn/speechbrain/decoders/seq2seq.py | https://arxiv.org/abs/1904.02619 | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/recipes/CommonVoice/ASR/transducer/hparams/train_fr.yaml|tdnn/speechbrain/decoders/seq2seq.py | https://arxiv.org/abs/1904.02619 | 参考论文地址 | -| 开源代码引入 
| https://github.com/speechbrain/speechbrain/speechbrain/dataio/sampler.py|tdnn/speechbrain/dataio/sampler.py | https://github.com/catalyst-team/catalyst/blob/51428d7756e62b9b8ee5379f38e9fd576eeb36e5/catalyst/data/sampler.py#L522 | 源码实现 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/lm/counting.py|tdnn/speechbrain/lm/counting.py | https://github.com/nltk/nltk | 源码实现 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/recipes/LibriParty/generate_dataset/download_required_data.py|tdnn/speechbrain/lobes/augment.py | http://www.openslr.org/resources/28/rirs_noises.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/lobes/augment.py|tdnn/speechbrain/lobes/augment.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/lobes/models/ContextNet.py|tdnn/speechbrain/nnet/activations.py | https://arxiv.org/pdf/2005.03191.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/nnet/attention.py|tdnn/speechbrain/nnet/attention.py | https://arxiv.org/pdf/1409.0473.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/lobes/features.py|tdnn/speechbrain/lobes/features.py | https://arxiv.org/abs/2101.08596 | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/nnet/attention.py|tdnn/speechbrain/nnet/attention.py | https://arxiv.org/pdf/1506.07503.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/nnet/CNN.py|tdnn/speechbrain/nnet/CNN.py | https://arxiv.org/abs/1808.00158 | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/recipes/CommonVoice/ASR/transformer/hparams/train_ar_hf_whisper.yaml|tdnn/speechbrain/decoders/seq2seq.py | https://cdn.openai.com/papers/whisper.pdf | 模型相关说明 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/nnet/attention.py|tdnn/speechbrain/nnet/attention.py | https://arxiv.org/pdf/1901.02860.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/nnet/attention.py|tdnn/speechbrain/nnet/attention.py | https://pytorch.org/docs/stable/nn.html | 模型相关说明 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/nnet/attention.py|tdnn/speechbrain/nnet/attention.py | https://github.com/pytorch/pytorch/blob/5288d05cfdda85c46c4df84617fa7f37c21b10b3/torch/nn/functional.py#L4946 | 源码实现 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/nnet/normalization.py|tdnn/speechbrain/nnet/normalization.py | https://arxiv.org/abs/1607.05666 | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/lobes/features.py|tdnn/speechbrain/nnet/normalization.py | https://arxiv.org/abs/2101.08596 | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/lobes/features.py|tdnn/speechbrain/nnet/CNN.py | https://arxiv.org/abs/2101.08596 | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/nnet/losses.py|tdnn/speechbrain/nnet/losses.py | https://arxiv.org/abs/1701.06548 | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/lobes/features.py|tdnn/speechbrain/nnet/pooling.py | https://arxiv.org/abs/2101.08596 | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/nnet/CNN.py|tdnn/speechbrain/nnet/CNN.py | https://github.com/google-research/leaf-audio | 源码实现 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/nnet/CNN.py|tdnn/speechbrain/nnet/pooling.py | 
https://github.com/google-research/leaf-audio | 源码实现 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/nnet/losses.py|tdnn/speechbrain/nnet/losses.py | https://arxiv.org/abs/1906.07317 | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/nnet/losses.py|tdnn/speechbrain/nnet/losses.py | https://arxiv.org/abs/1906.07317 | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/recipes/TIMIT/ASR/seq2seq_knowledge_distillation/README.md|tdnn/speechbrain/nnet/losses.py | https://arxiv.org/abs/2005.09310 | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/recipes/TIMIT/ASR/seq2seq_knowledge_distillation/README.md|tdnn/speechbrain/nnet/losses.py | https://arxiv.org/abs/2005.09310 | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/recipes/CommonVoice/self-supervised-learning/wav2vec2/train_hf_wav2vec2.py|tdnn/speechbrain/nnet/losses.py | https://arxiv.org/abs/2006.11477 | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/nnet/schedulers.py|tdnn/speechbrain/nnet/schedulers.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/nnet/schedulers.py|tdnn/speechbrain/nnet/schedulers.py | https://openreview.net/pdf?id=BJYwwY9ll | 模型相关说明 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/nnet/schedulers.py|tdnn/speechbrain/nnet/schedulers.py | https://arxiv.org/abs/1506.01186 | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/nnet/complex_networks/c_RNN.py|tdnn/speechbrain/nnet/RNN.py | https://arxiv.org/abs/1803.10225 | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/nnet/RNN.py|tdnn/speechbrain/nnet/RNN.py | https://github.com/Adel-Moumen/fast_ligru | 源码实现 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/processing/diarization.py|tdnn/speechbrain/processing/diarization.py | https://doi.org/10.1007/s11222-007-9033-z | 模型相关说明 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/processing/diarization.py|tdnn/speechbrain/processing/diarization.py | https://github.com/scikit-learn/scikit-learn/blob/0fb307bf3/sklearn/cluster/_spectral.py | 源码实现 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/processing/diarization.py|tdnn/speechbrain/processing/diarization.py | https://github.com/tango4j/Auto-Tuning-Spectral-Clustering/blob/master/spectral_opt.py | 源码实现 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/nnet/schedulers.py|tdnn/speechbrain/nnet/schedulers.py | https://arxiv.org/pdf/1910.10683.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/dataio/dataio.py|tdnn/speechbrain/processing/features.py | https://github.com/pytorch/audio | 源码实现 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/nnet/RNN.py|tdnn/speechbrain/nnet/RNN.py | https://arxiv.org/abs/2302.10144 | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/nnet/RNN.py|tdnn/speechbrain/nnet/RNN.py | https://github.com/Adel-Moumen/fast_ligru | 源码实现 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/processing/diarization.py|tdnn/speechbrain/processing/diarization.py | https://github.com/scikit-learn/scikit-learn/blob/0fb307bf3/sklearn/cluster/_spectral.py | 源码实现 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/processing/diarization.py|tdnn/speechbrain/processing/diarization.py | https://doi.org/10.1007/s11222-007-9033-z | 模型相关说明 | 
-| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/nnet/RNN.py|tdnn/speechbrain/nnet/RNN.py | https://arxiv.org/pdf/1611.01576.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/nnet/RNN.py|tdnn/speechbrain/nnet/RNN.py | https://github.com/salesforce/pytorch-qrnn | 源码实现 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/processing/signal_processing.py|tdnn/speechbrain/processing/signal_processing.py | https://tomroelandts.com/articles/ | 模型相关说明 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/processing/multi_mic.py|tdnn/speechbrain/processing/multi_mic.py | https://www.researchgate.net/publication/221491705_Speaker_localization_based_on_oriented_global_coherence_field | 模型相关说明 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/processing/signal_processing.py|tdnn/speechbrain/processing/signal_processing.py | https://github.com/kaituoxu/Conv-TasNet/blob/master/src/utils.py | 源码实现 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/processing/signal_processing.py|tdnn/speechbrain/processing/signal_processing.py | https://github.com/tensorflow/tensorflow/blob/r1.12/tensorflow/contrib/signal/python/ops/reconstruction_ops.py | 源码实现 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/lobes/features.py|tdnn/speechbrain/processing/signal_processing.py | https://arxiv.org/abs/2101.08596 | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/lobes/features.py|tdnn/speechbrain/processing/signal_processing.py | https://arxiv.org/abs/2101.08596 | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/recipes/AISHELL-1/Tokenizer/README.md|tdnn/speechbrain/tokenizers/SentencePiece.py | https://github.com/google/sentencepiece | 源码实现 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/tokenizers/SentencePiece.py|tdnn/speechbrain/tokenizers/SentencePiece.py | https://www.aclweb.org/anthology/P16-1162/ | 模型相关说明 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/tokenizers/SentencePiece.py|tdnn/speechbrain/tokenizers/SentencePiece.py | https://arxiv.org/abs/1804.10959 | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/utils/bleu.py|tdnn/speechbrain/utils/bleu.py | https://www.aclweb.org/anthology/P02-1040.pdf | 模型相关说明 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/utils/bleu.py|tdnn/speechbrain/utils/bleu.py | https://pypi.org/project/sacrebleu/ | 模型相关说明 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/processing/speech_augmentation.py|tdnn/speechbrain/processing/speech_augmentation.py | https://pytorch.org/audio/stable/tutorials/audio_resampling_tutorial.html | 模型相关说明 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/processing/speech_augmentation.py|tdnn/speechbrain/processing/speech_augmentation.py | https://ccrma.stanford.edu/~jos/resample/ | 模型相关说明 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/processing/speech_augmentation.py|tdnn/speechbrain/processing/speech_augmentation.py | https://github.com/kaldi-asr/kaldi/blob/master/src/feat/resample.h#L56 | 源码实现 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/processing/speech_augmentation.py|tdnn/speechbrain/processing/speech_augmentation.py | http://en.wikipedia.org/wiki/Interval_ | 模型相关说明 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/utils/data_utils.py|tdnn/speechbrain/utils/data_utils.py | 
https://stackoverflow.com/a/3233356 | 模型相关说明 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/utils/data_utils.py|tdnn/speechbrain/utils/data_utils.py | https://github.com/pytorch/pytorch/blob/c0deb231db76dbea8a9d326401417f7d1ce96ed5/torch/utils/data/_utils/collate.py#L42 | 源码实现 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/utils/DER.py|tdnn/speechbrain/utils/DER.py | https://github.com/nryant/dscore | 源码实现 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/utils/edit_distance.py|tdnn/speechbrain/utils/edit_distance.py | https://en.wikipedia.org/wiki/Levenshtein_distance | 模型相关说明 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/utils/hpopt.py|tdnn/speechbrain/utils/hpopt.py | https://orion.readthedocs.io/en/stable/ | 模型相关说明 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/README.md|tdnn/speechbrain/utils/hpopt.py | https://github.com/Epistimio/orion | 源码实现 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/utils/logger.py|tdnn/speechbrain/utils/logger.py | https://fangpenlin.com/posts/2012/08/26/good-logging-practice-in-python/ | 模型相关说明 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/utils/superpowers.py|tdnn/speechbrain/utils/superpowers.py | https://docs.python.org/3/library/importlib.html#importing-a-source-file-directly | 模型相关说明 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/utils/text_to_sequence.py|tdnn/speechbrain/utils/text_to_sequence.py | https://github.com/keithito/tacotron | 源码实现 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/templates/enhancement/mini_librispeech_prepare.py|tdnn/templates/speaker_id/mini_librispeech_prepare.py | http://www.openslr.org/resources/31/train-clean-5.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/recipes/ESC50/classification/README.md|tdnn/speechbrain/lobes/models/Cnn14.py | https://arxiv.org/abs/1912.10211 | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/lobes/models/ContextNet.py|tdnn/speechbrain/lobes/models/ContextNet.py | https://arxiv.org/pdf/2005.03191.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/lobes/models/ContextNet.py|tdnn/speechbrain/lobes/models/ContextNet.py | https://arxiv.org/pdf/2005.03191.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/recipes/ESC50/classification/README.md|tdnn/speechbrain/lobes/models/Cnn14.py | https://arxiv.org/abs/1912.10211 | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/lobes/models/dual_path.py|tdnn/speechbrain/lobes/models/dual_path.py | https://fast-transformers.github.io/ | 模型相关说明 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/lobes/models/dual_path.py|tdnn/speechbrain/lobes/models/dual_path.py | https://fast-transformers.github.io/ | 模型相关说明 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/lobes/models/EnhanceResnet.py|tdnn/speechbrain/lobes/models/EnhanceResnet.py | https://arxiv.org/pdf/2112.06068.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/lobes/models/ECAPA_TDNN.py|tdnn/speechbrain/lobes/models/ECAPA_TDNN.py | https://github.com/pytorch/pytorch/issues/4320 | 模型相关说明 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/lobes/models/ESPnetVGG.py|tdnn/speechbrain/lobes/models/ESPnetVGG.py | https://github.com/espnet/espnet/blob/master/espnet/nets/pytorch_backend/rnn/encoders.py | 源码实现 | -| 开源代码引入 | 
https://github.com/speechbrain/speechbrain/speechbrain/lobes/models/EnhanceResnet.py|tdnn/speechbrain/lobes/models/EnhanceResnet.py | https://arxiv.org/abs/1709.01507 | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/recipes/CommonLanguage/lang_id/README.md|tdnn/speechbrain/lobes/models/ECAPA_TDNN.py | https://arxiv.org/abs/2005.07143 | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/recipes/CommonVoice/self-supervised-learning/wav2vec2/train_hf_wav2vec2.py|tdnn/speechbrain/lobes/models/fairseq_wav2vec.py | https://arxiv.org/abs/2006.11477 | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/lobes/models/fairseq_wav2vec.py|tdnn/speechbrain/lobes/models/fairseq_wav2vec.py | https://arxiv.org/abs/1904.05862 | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/lobes/models/fairseq_wav2vec.py|tdnn/speechbrain/lobes/models/fairseq_wav2vec.py | https://fairseq.readthedocs.io/en/latest/ | 模型相关说明 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/recipes/LibriTTS/README.md|tdnn/speechbrain/lobes/models/HifiGAN.py | https://arxiv.org/pdf/2010.05646.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/recipes/CommonVoice/self-supervised-learning/wav2vec2/train_hf_wav2vec2.py|tdnn/speechbrain/lobes/models/fairseq_wav2vec.py | https://arxiv.org/abs/2006.11477 | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/lobes/models/fairseq_wav2vec.py|tdnn/speechbrain/lobes/models/fairseq_wav2vec.py | https://fairseq.readthedocs.io/en/latest/ | 模型相关说明 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/lobes/models/fairseq_wav2vec.py|tdnn/speechbrain/lobes/models/fairseq_wav2vec.py | https://dl.fbaipublicfiles.com/fairseq/wav2vec/wav2vec_small.pt | 模型相关说明 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/recipes/CommonVoice/self-supervised-learning/wav2vec2/train_hf_wav2vec2.py|tdnn/speechbrain/lobes/models/huggingface_wav2vec.py | https://arxiv.org/abs/2006.11477 | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/lobes/models/fairseq_wav2vec.py|tdnn/speechbrain/lobes/models/huggingface_wav2vec.py | https://arxiv.org/abs/1904.05862 | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/lobes/models/huggingface_wav2vec.py|tdnn/speechbrain/lobes/models/huggingface_wav2vec.py | https://arxiv.org/abs/2110.13900 | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/lobes/models/huggingface_gpt.py|tdnn/speechbrain/lobes/models/huggingface_wav2vec.py | https://huggingface.co/transformers/installation.html | 模型相关说明 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/recipes/CommonVoice/self-supervised-learning/wav2vec2/train_hf_wav2vec2.py|tdnn/speechbrain/lobes/models/huggingface_wav2vec.py | https://arxiv.org/abs/2006.11477 | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/lobes/models/huggingface_wav2vec.py|tdnn/speechbrain/lobes/models/huggingface_wav2vec.py | https://arxiv.org/abs/2106.07447 | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/lobes/models/huggingface_gpt.py|tdnn/speechbrain/lobes/models/huggingface_wav2vec.py | https://huggingface.co/transformers/installation.html | 模型相关说明 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/lobes/models/huggingface_gpt.py|tdnn/speechbrain/lobes/models/huggingface_whisper.py | https://huggingface.co/transformers/installation.html | 模型相关说明 | -| 开源代码引入 | 
https://github.com/speechbrain/speechbrain/recipes/CommonVoice/ASR/transformer/hparams/train_ar_hf_whisper.yaml|tdnn/speechbrain/lobes/models/huggingface_whisper.py | https://cdn.openai.com/papers/whisper.pdf | 模型相关说明 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/lobes/models/huggingface_gpt.py|tdnn/speechbrain/lobes/models/huggingface_whisper.py | https://huggingface.co/transformers/installation.html | 模型相关说明 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/lobes/models/huggingface_whisper.py|tdnn/speechbrain/lobes/models/huggingface_whisper.py | https://github.com/openai/whisper | 源码实现 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/recipes/Aishell1Mix/separation/hparams/sepformer-aishell1mix2-wham.yaml|tdnn/speechbrain/lobes/models/dual_path.py | https://arxiv.org/abs/2010.13154 | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/recipes/CommonVoice/self-supervised-learning/wav2vec2/train_hf_wav2vec2.py|tdnn/speechbrain/lobes/models/huggingface_wav2vec.py | https://arxiv.org/abs/2006.11477 | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/lobes/models/huggingface_gpt.py|tdnn/speechbrain/lobes/models/huggingface_wav2vec.py | https://huggingface.co/transformers/installation.html | 模型相关说明 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/lobes/models/huggingface_wav2vec.py|tdnn/speechbrain/lobes/models/huggingface_wav2vec.py | https://huggingface.co/transformers/model_doc/wav2vec2.html#wav2vec2forpretraining | 模型相关说明 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/lobes/models/HifiGAN.py|tdnn/speechbrain/lobes/models/HifiGAN.py | https://arxiv.org/pdf/1910.11480.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/lobes/models/HifiGAN.py|tdnn/speechbrain/lobes/models/HifiGAN.py | https://arxiv.org/pdf/1910.11480.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/lobes/models/huggingface_whisper.py|tdnn/speechbrain/lobes/models/huggingface_whisper.py | https://github.com/openai/whisper/blob/eff383b27b783e280c089475852ba83f20f64998/whisper/audio.py#L92 | 源码实现 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/recipes/ESC50/interpret/README.md|tdnn/speechbrain/lobes/models/L2I.py | https://arxiv.org/abs/2202.11479v2 | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/recipes/LibriTTS/README.md|tdnn/speechbrain/lobes/models/HifiGAN.py | https://arxiv.org/pdf/2010.05646.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/lobes/models/huggingface_whisper.py|tdnn/speechbrain/lobes/models/huggingface_whisper.py | https://github.com/openai/whisper/blob/eff383b27b783e280c089475852ba83f20f64998/whisper/audio.py#L52 | 源码实现 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/recipes/Voicebank/enhance/SEGAN/hparams/train.yaml|tdnn/speechbrain/lobes/models/segan_model.py | https://arxiv.org/pdf/1703.09452.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/lobes/models/PIQ.py|tdnn/speechbrain/lobes/models/PIQ.py | https://arxiv.org/abs/1711.00937 | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/lobes/models/MSTacotron2.py|tdnn/speechbrain/lobes/models/Tacotron2.py | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechSynthesis/Tacotron2/tacotron2/model.py | 源码实现 | -| 开源代码引入 | 
https://github.com/speechbrain/speechbrain/speechbrain/lobes/models/PIQ.py|tdnn/speechbrain/lobes/models/PIQ.py | https://arxiv.org/abs/1711.00937 | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/recipes/CommonVoice/self-supervised-learning/wav2vec2/train_hf_wav2vec2.py|tdnn/speechbrain/lobes/models/wav2vec.py | https://arxiv.org/abs/2006.11477 | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/lobes/models/wav2vec.py|tdnn/speechbrain/lobes/models/wav2vec.py | https://arxiv.org/abs/2109.06870 | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/nnet/complex_networks/c_ops.py|tdnn/speechbrain/nnet/complex_networks/c_ops.py | https://discuss.pytorch.org/t/sum-mul-over-multiple-axes/1882/8 | 模型相关说明 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/nnet/loss/guidedattn_loss.py|tdnn/speechbrain/nnet/loss/guidedattn_loss.py | https://arxiv.org/abs/1710.08969 | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/nnet/loss/guidedattn_loss.py|tdnn/speechbrain/nnet/loss/guidedattn_loss.py | https://github.com/r9y9/deepvoice3_pytorch | 源码实现 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/nnet/complex_networks/c_normalization.py|tdnn/speechbrain/nnet/complex_networks/c_normalization.py | https://en.wikipedia.org/wiki/Square_root_of_a_2_by_2_matrix | 模型相关说明 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/nnet/complex_networks/c_normalization.py|tdnn/speechbrain/nnet/complex_networks/c_normalization.py | http://mathworld.wolfram.com/MatrixInverse.html | 模型相关说明 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/nnet/loss/transducer_loss.py|tdnn/speechbrain/nnet/loss/transducer_loss.py | https://arxiv.org/pdf/1211.3711.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/nnet/complex_networks/c_RNN.py|tdnn/speechbrain/nnet/complex_networks/c_RNN.py | https://arxiv.org/abs/1803.10225 | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/nnet/loss/transducer_loss.py|tdnn/speechbrain/nnet/loss/transducer_loss.py | https://arxiv.org/pdf/1211.3711.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/nnet/loss/transducer_loss.py|tdnn/speechbrain/nnet/loss/transducer_loss.py | https://arxiv.org/pdf/1211.3711.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/nnet/loss/transducer_loss.py|tdnn/speechbrain/nnet/loss/transducer_loss.py | https://arxiv.org/pdf/1211.3711.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/nnet/loss/transducer_loss.py|tdnn/speechbrain/nnet/loss/transducer_loss.py | https://arxiv.org/pdf/1211.3711.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/nnet/quaternion_networks/q_ops.py|tdnn/speechbrain/nnet/quaternion_networks/q_ops.py | https://en.wikipedia.org/wiki/Quaternions_and_spatial_rotation | 模型相关说明 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/nnet/quaternion_networks/q_ops.py|tdnn/speechbrain/nnet/quaternion_networks/q_ops.py | https://en.wikipedia.org/wiki/Quaternions_and_spatial_rotation | 模型相关说明 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/nnet/quaternion_networks/q_ops.py|tdnn/speechbrain/nnet/quaternion_networks/q_ops.py | https://en.wikipedia.org/wiki/Quaternions_and_spatial_rotation | 模型相关说明 | -| 开源代码引入 | 
https://github.com/speechbrain/speechbrain/speechbrain/nnet/quaternion_networks/q_ops.py|tdnn/speechbrain/nnet/quaternion_networks/q_ops.py | https://en.wikipedia.org/wiki/Quaternions_and_spatial_rotation | 模型相关说明 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/nnet/quaternion_networks/q_ops.py|tdnn/speechbrain/nnet/quaternion_networks/q_ops.py | https://en.wikipedia.org/wiki/Quaternions_and_spatial_rotation | 模型相关说明 | -| 开源代码引入 | https://github.com/speechbrain/speechbrain/speechbrain/nnet/complex_networks/c_RNN.py|tdnn/speechbrain/nnet/quaternion_networks/q_RNN.py | https://arxiv.org/abs/1803.10225 | 参考论文地址 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|----------------------------------------------------------------------------------------------|----------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/audio/tdnn/setup.py | speechbrain@gmail.com | 作者邮箱 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/tdnn/speechbrain/lobes/augment.py | http://www.openslr.org/resources/28/rirs_noises.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/tdnn/speechbrain/lobes/models/fairseq_wav2vec.py | https://fairseq.readthedocs.io/en/latest/ | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/tdnn/templates/speaker_id/mini_librispeech_prepare.py | http://www.openslr.org/resources/31/train-clean-5.tar.gz | 数据集地址 | \ No newline at end of file diff --git a/PyTorch/contrib/audio/wav2vec2.0/public_address_statement.md b/PyTorch/contrib/audio/wav2vec2.0/public_address_statement.md index 4b2aa3be35fd9195816cb2a706fee4f81d37c2e3..18fbbf9bc31274db377880f31e7621a520d8f622 100644 --- a/PyTorch/contrib/audio/wav2vec2.0/public_address_statement.md +++ b/PyTorch/contrib/audio/wav2vec2.0/public_address_statement.md @@ -1,212 +1,89 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------------------------|----------------------------------------------|------------------------|----| -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/.circleci/config.yml | wav2vec2.0/.circleci/config.yml | https://test.pypi.org/simple/ | 安装三方库 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/.circleci/config.yml | wav2vec2.0/.circleci/config.yml | https://github.com/NVIDIA/apex | 安装三方库 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/.circleci/config.yml | wav2vec2.0/.circleci/config.yml | https://github.com/facebookresearch/xformers.git | 安装三方库 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/.circleci/config.yml | wav2vec2.0/.circleci/config.yml | https://github.com/NVIDIA/Megatron-LM.git | 安装三方库 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/.circleci/config.yml | wav2vec2.0/.circleci/config.yml | https://download.pytorch.org/whl/torch_stable.html | 安装三方库 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/.circleci/config.yml | wav2vec2.0/.circleci/config.yml | https://download.pytorch.org/whl/torch_stable.html | 安装三方库 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/.circleci/config.yml | wav2vec2.0/.circleci/config.yml | https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh | 安装conda | -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/.github/workflows/build.yml | wav2vec2.0/.github/workflows/build.yml | https://download.pytorch.org/whl/torch_stable.html | 安装三方库 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/.github/workflows/build.yml | wav2vec2.0/.github/workflows/build.yml | https://github.com/facebookresearch/fairscale.git | 安装三方库 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/.github/workflows/build.yml | wav2vec2.0/.github/workflows/build.yml | https://github.com/facebookresearch/xformers.git | 安装三方库 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/.gitmodules | wav2vec2.0/.gitmodules | https://github.com/ngoyal2707/Megatron-LM | 安装三方库 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/.pre-commit-config.yaml | wav2vec2.0/.pre-commit-config.yaml | https://github.com/pre-commit/pre-commit-hooks | 安装三方库 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/.pre-commit-config.yaml | wav2vec2.0/.pre-commit-config.yaml | https://github.com/ambv/black | 安装三方库 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/.pre-commit-config.yaml | wav2vec2.0/.pre-commit-config.yaml | https://gitlab.com/pycqa/flake8 | 安装三方库 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/.pre-commit-config.yaml | wav2vec2.0/.pre-commit-config.yaml | https://github.com/pycqa/isort | 安装三方库 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/docs/conf.py | wav2vec2.0/docs/conf.py | https://github.com/pytorch/fairseq/tree/main/docs/ | 开源代码地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/docs/conf.py | wav2vec2.0/docs/conf.py | http://docs.scipy.org/doc/numpy/ | 安装三方库 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/docs/conf.py | wav2vec2.0/docs/conf.py | https://docs.python.org/ | 安装三方库 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/docs/conf.py | wav2vec2.0/docs/conf.py | https://pytorch.org/docs/master/ | 安装三方库 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/data/encoders/gpt2_bpe.py | wav2vec2.0/fairseq/data/encoders/gpt2_bpe.py | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/encoder.json | 下载配置文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/data/encoders/gpt2_bpe.py | wav2vec2.0/fairseq/data/encoders/gpt2_bpe.py | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/vocab.bpe | 下载bpe文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/fconv.py | wav2vec2.0/fairseq/models/fconv.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt14.v2.en-fr.fconv-py.tar.bz2 | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/fconv.py | wav2vec2.0/fairseq/models/fconv.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt14.en-de.fconv-py.tar.bz2 | 下载权重文件 | -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/fconv.py | wav2vec2.0/fairseq/models/fconv.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt17.v2.en-de.fconv-py.tar.bz2 | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/fconv_self_att.py | wav2vec2.0/fairseq/models/fconv_self_att.py | https://dl.fbaipublicfiles.com/fairseq/models/stories_checkpoint.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/fconv_self_att.py | wav2vec2.0/fairseq/models/fconv_self_att.py | https://dl.fbaipublicfiles.com/fairseq/models/stories_checkpoint.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/fconv_self_att.py | wav2vec2.0/fairseq/models/fconv_self_att.py | https://dl.fbaipublicfiles.com/fairseq/data/stories_test.tar.bz2 | 下载数据集 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/lightconv.py | wav2vec2.0/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/iwslt14.de-en.lightconv.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/lightconv.py | wav2vec2.0/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/iwslt14.de-en.dynamicconv.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/lightconv.py | wav2vec2.0/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.lightconv.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/lightconv.py | wav2vec2.0/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.dynamicconv.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/lightconv.py | wav2vec2.0/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.lightconv-glu.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/lightconv.py | wav2vec2.0/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.dynamicconv-glu.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/lightconv.py | wav2vec2.0/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.lightconv-glu.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/lightconv.py | wav2vec2.0/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.dynamicconv-glu.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/lightconv.py | wav2vec2.0/fairseq/models/lightconv.py | 
https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt14.en-fr.joined-dict.lightconv-glu.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/lightconv.py | wav2vec2.0/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt14.en-fr.joined-dict.dynamicconv-glu.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/lightconv.py | wav2vec2.0/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt17.zh-en.lightconv-glu.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/lightconv.py | wav2vec2.0/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt17.zh-en.dynamicconv-glu.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/roberta/model.py | wav2vec2.0/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.base.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/roberta/model.py | wav2vec2.0/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/roberta/model.py | wav2vec2.0/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.mnli.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/roberta/model.py | wav2vec2.0/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.wsc.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/roberta/model_camembert.py | wav2vec2.0/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/roberta/model_camembert.py | wav2vec2.0/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/roberta/model_camembert.py | wav2vec2.0/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/roberta/model_camembert.py | wav2vec2.0/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-large.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/roberta/model_camembert.py | wav2vec2.0/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-ccnet.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/roberta/model_camembert.py | 
wav2vec2.0/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-ccnet-4gb.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/roberta/model_camembert.py | wav2vec2.0/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-wikipedia-4gb.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/roberta/model_camembert.py | wav2vec2.0/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-oscar-4gb.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/roberta/model_gottbert.py | wav2vec2.0/fairseq/models/roberta/model_gottbert.py | https://dl.gottbert.de/fairseq/models/gottbert-base.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/roberta/model_xlmr.py | wav2vec2.0/fairseq/models/roberta/model_xlmr.py | http://dl.fbaipublicfiles.com/fairseq/models/xlmr.base.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/roberta/model_xlmr.py | wav2vec2.0/fairseq/models/roberta/model_xlmr.py | http://dl.fbaipublicfiles.com/fairseq/models/xlmr.large.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/roberta/model_xlmr.py | wav2vec2.0/fairseq/models/roberta/model_xlmr.py | http://dl.fbaipublicfiles.com/fairseq/models/xlmr/xlmr.xl.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/roberta/model_xlmr.py | wav2vec2.0/fairseq/models/roberta/model_xlmr.py | http://dl.fbaipublicfiles.com/fairseq/models/xlmr/xlmr.xxl.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/text_to_speech/tts_transformer.py | wav2vec2.0/fairseq/models/text_to_speech/tts_transformer.py | http://dl.fbaipublicfiles.com/fairseq/s2 | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/transformer/transformer_legacy.py | wav2vec2.0/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt14.en-fr.joined-dict.transformer.tar.bz2 | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/transformer/transformer_legacy.py | wav2vec2.0/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt16.en-de.joined-dict.transformer.tar.bz2 | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/transformer/transformer_legacy.py | wav2vec2.0/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt18.en-de.ensemble.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/transformer/transformer_legacy.py | wav2vec2.0/fairseq/models/transformer/transformer_legacy.py | 
https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-de.joined-dict.ensemble.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/transformer/transformer_legacy.py | wav2vec2.0/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-ru.ensemble.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/transformer/transformer_legacy.py | wav2vec2.0/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.de-en.joined-dict.ensemble.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/transformer/transformer_legacy.py | wav2vec2.0/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.ru-en.ensemble.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/transformer/transformer_legacy.py | wav2vec2.0/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-de.joined-dict.single_model.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/transformer/transformer_legacy.py | wav2vec2.0/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-ru.single_model.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/transformer/transformer_legacy.py | wav2vec2.0/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.de-en.joined-dict.single_model.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/transformer/transformer_legacy.py | wav2vec2.0/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.ru-en.single_model.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/transformer/transformer_legacy.py | wav2vec2.0/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.en-ta.single.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/transformer/transformer_legacy.py | wav2vec2.0/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.en-iu.news.single.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/transformer/transformer_legacy.py | wav2vec2.0/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.en-iu.nh.single.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/transformer/transformer_legacy.py | wav2vec2.0/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.ta-en.single.tar.gz | 下载权重文件 | -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/transformer/transformer_legacy.py | wav2vec2.0/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.iu-en.news.single.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/transformer/transformer_legacy.py | wav2vec2.0/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.iu-en.nh.single.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/transformer/transformer_legacy.py | wav2vec2.0/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/flores101/pretrained_models/flores101_mm100_615M.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/transformer/transformer_legacy.py | wav2vec2.0/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/flores101/pretrained_models/flores101_mm100_175M.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/transformer_lm.py | wav2vec2.0/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/adaptive_lm_gbw_huge.tar.bz2 | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/transformer_lm.py | wav2vec2.0/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/adaptive_lm_wiki103.v2.tar.bz2 | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/transformer_lm.py | wav2vec2.0/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.en.tar.bz2 | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/transformer_lm.py | wav2vec2.0/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.de.tar.bz2 | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/transformer_lm.py | wav2vec2.0/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.ru.tar.bz2 | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/transformer_lm.py | wav2vec2.0/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt20.en.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/transformer_lm.py | wav2vec2.0/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt20.ta.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/transformer_lm.py | wav2vec2.0/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt20.iu.news.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/transformer_lm.py | wav2vec2.0/fairseq/models/transformer_lm.py | 
https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt20.iu.nh.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/xmod/model.py | wav2vec2.0/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.81.1M.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/xmod/model.py | wav2vec2.0/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.large.prenorm.81.500k.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/xmod/model.py | wav2vec2.0/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.13.125k.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/xmod/model.py | wav2vec2.0/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.30.125k.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/xmod/model.py | wav2vec2.0/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.30.195k.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/xmod/model.py | wav2vec2.0/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.60.125k.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/xmod/model.py | wav2vec2.0/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.60.265k.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/xmod/model.py | wav2vec2.0/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.75.125k.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/xmod/model.py | wav2vec2.0/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.75.269k.tar.gz | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/setup.py | wav2vec2.0/setup.py | https://download.pytorch.org/whl/cpu/torch-1.7.0%2Bcpu-cp36-cp36m-linux_x86_64.whl | 下载三方库 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/setup.py | wav2vec2.0/setup.py | https://github.com/pytorch/fairseq | 开源地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/tests/speech/__init__.py | wav2vec2.0/tests/speech/__init__.py | https://dl.fbaipublicfiles.com/fairseq | 开源地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/tests/speech/__init__.py | wav2vec2.0/tests/speech/__init__.py | https://dl.fbaipublicfiles.com/joint_speech_text_4_s2t/must_c/en_de | 下载配置文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/tests/speech/test_convtransformer_simul_trans.py | wav2vec2.0/tests/speech/test_convtransformer_simul_trans.py | 
https://dl.fbaipublicfiles.com/fairseq/ | 开源地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/tests/speech/test_dual_input_wav_transformer.py | wav2vec2.0/tests/speech/test_dual_input_wav_transformer.py | https://dl.fbaipublicfiles.com/joint_speech_text_4_s2t/acl2022/librispeech/finetuned | 下载配置文件 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/tests/speech/test_s2s_transformer.py | wav2vec2.0/tests/speech/test_s2s_transformer.py | https://dl.fbaipublicfiles.com/fairseq/ | fairseq开源地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/tests/speech/test_wav2vec2.py | wav2vec2.0/tests/speech/test_wav2vec2.py | https://dl.fbaipublicfiles.com/fairseq | fairseq开源地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/scripts/build_sym_alignment.py | wav2vec2.0/scripts/build_sym_alignment.py | http://github.com/clab/fast_align | fast_align在github上的公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/scripts/build_sym_alignment.py | wav2vec2.0/scripts/build_sym_alignment.py | http://github.com/moses-smt/mosesdecoder | mosesdecoder在github上的公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/scripts/build_sym_alignment.py | wav2vec2.0/scripts/build_sym_alignment.py | http://www.statmt.org/moses/?n=Development.GetStarted | statmt.org公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq_cli/train.py | wav2vec2.0/fairseq_cli/train.py | https://github.com/facebookresearch/hydra/issues/1126 | facebookresearch/hydra/issues公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq_cli/hydra_train.py | wav2vec2.0/fairseq_cli/hydra_train.py | https://github.com/facebookresearch/hydra/issues/1126 | facebookresearch/hydra/issues公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/trainer.py | wav2vec2.0/fairseq/trainer.py | https://openreview.net/forum?id=_CMSV7FTzGI | openreview.net公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/tasks/translation_lev.py | wav2vec2.0/fairseq/tasks/translation_lev.py | https://arxiv.org/abs/1905.11006 | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/tasks/translation_lev.py | wav2vec2.0/fairseq/tasks/translation_lev.py | https://www.aclweb.org/anthology/2020.acl-main.325/ | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/tasks/text_to_speech.py | wav2vec2.0/fairseq/tasks/text_to_speech.py | https://arxiv.org/pdf/2011.03568.pdf | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/tasks/fairseq_task.py | wav2vec2.0/fairseq/tasks/fairseq_task.py | https://arxiv.org/abs/2010.00904 | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/tasks/fairseq_task.py | wav2vec2.0/fairseq/tasks/fairseq_task.py | https://github.com/facebookresearch/GENRE | 公网来源说明 | -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/tasks/cross_lingual_lm.py | wav2vec2.0/fairseq/tasks/cross_lingual_lm.py | https://arxiv.org/pdf/1901.07291.pdf | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/search.py | wav2vec2.0/fairseq/search.py | https://www.aclweb.org/anthology/N18-1119/ | 公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/search.py | wav2vec2.0/fairseq/search.py | https://www.aclweb.org/anthology/N19-1090/ | 公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/search.py | wav2vec2.0/fairseq/search.py | https://arxiv.org/abs/1904.09751 | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/search.py | wav2vec2.0/fairseq/search.py | https://arxiv.org/abs/1611.08562 | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/scoring/tokenizer.py | wav2vec2.0/fairseq/scoring/tokenizer.py | https://github.com/mjpost/sacrebleu | sacrebleu在github上的公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/optim/lr_scheduler/triangular_lr_scheduler.py | wav2vec2.0/fairseq/optim/lr_scheduler/triangular_lr_scheduler.py | https://arxiv.org/pdf/1506.01186.pdf | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/optim/lr_scheduler/tri_stage_lr_scheduler.py | wav2vec2.0/fairseq/optim/lr_scheduler/tri_stage_lr_scheduler.py | https://arxiv.org/pdf/1904.08779.pdf | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/optim/lr_scheduler/cosine_lr_scheduler.py | wav2vec2.0/fairseq/optim/lr_scheduler/cosine_lr_scheduler.py | https://arxiv.org/pdf/1608.03983.pdf | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/optim/fused_adam.py | wav2vec2.0/fairseq/optim/fused_adam.py | https://arxiv.org/abs/1412.6980 | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/optim/fused_adam.py | wav2vec2.0/fairseq/optim/fused_adam.py | https://openreview.net/forum?id=ryQu7f-RZ | 公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/optim/bmuf.py | wav2vec2.0/fairseq/optim/bmuf.py | https://ieeexplore.ieee.org/document/7472805 | ieeexplore.ieee.org源码公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/optim/adamax.py | wav2vec2.0/fairseq/optim/adamax.py | https://arxiv.org/abs/1412.6980 | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/optim/adam.py | wav2vec2.0/fairseq/optim/adam.py | https://arxiv.org/abs/1711.05101 | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/optim/adam.py | wav2vec2.0/fairseq/optim/adam.py | https://arxiv.org/abs/1412.6980 | 参考论文地址公网来源说明 | -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/optim/adam.py | wav2vec2.0/fairseq/optim/adam.py | https://openreview.net/forum?id=ryQu7f-RZ | 公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/optim/adafactor.py | wav2vec2.0/fairseq/optim/adafactor.py | https://arxiv.org/abs/1804.04235 | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/modules/vggblock.py | wav2vec2.0/fairseq/modules/vggblock.py | https://arxiv.org/pdf/1409.1556.pdf | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/modules/sparse_multihead_attention.py | wav2vec2.0/fairseq/modules/sparse_multihead_attention.py | https://arxiv.org/pdf/1904.10509.pdf | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/modules/rotary_positional_embedding.py | wav2vec2.0/fairseq/modules/rotary_positional_embedding.py | https://blog.eleuther.ai/rotary-embeddings/ | blog.eleuther.ai源码公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/modules/rotary_positional_embedding.py | wav2vec2.0/fairseq/modules/rotary_positional_embedding.py | https://arxiv.org/pdf/2104.09864.pdf | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/modules/positional_encoding.py | wav2vec2.0/fairseq/modules/positional_encoding.py | https://arxiv.org/abs/1901.02860 | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/modules/lstm_cell_with_zoneout.py | wav2vec2.0/fairseq/modules/lstm_cell_with_zoneout.py | https://arxiv.org/abs/1606.01305 | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/modules/location_attention.py | wav2vec2.0/fairseq/modules/location_attention.py | https://arxiv.org/pdf/1506.07503.pdf | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/modules/layer_drop.py | wav2vec2.0/fairseq/modules/layer_drop.py | https://arxiv.org/abs/1909.11556 | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/modules/gelu.py | wav2vec2.0/fairseq/modules/gelu.py | https://github.com/hendrycks/GELUs | GELU源码在github上的公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/modules/espnet_multihead_attention.py | wav2vec2.0/fairseq/modules/espnet_multihead_attention.py | https://arxiv.org/abs/1901.02860 | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/modules/dynamic_crf_layer.py | wav2vec2.0/fairseq/modules/dynamic_crf_layer.py | https://arxiv.org/abs/1910.11555 | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/modules/dynamic_crf_layer.py | wav2vec2.0/fairseq/modules/dynamic_crf_layer.py | https://github.com/kmkurn/pytorch-crf/blob/master/torchcrf/__init__.py | torchcrf源码在github上的公网来源说明 | -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/modules/conformer_layer.py | wav2vec2.0/fairseq/modules/conformer_layer.py | https://arxiv.org/abs/2005.08100 | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/modules/character_token_embedder.py | wav2vec2.0/fairseq/modules/character_token_embedder.py | https://arxiv.org/abs/1505.00387 | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/modules/adaptive_softmax.py | wav2vec2.0/fairseq/modules/adaptive_softmax.py | http://arxiv.org/abs/1609.04309 | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/wav2vec/utils.py | wav2vec2.0/fairseq/models/wav2vec/utils.py | https://github.com/lucidrains/local-attention/blob/master/local_attention/local_attention.py#L41 | github.com/lucidrains源码公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/transformer/transformer_base.py | wav2vec2.0/fairseq/models/transformer/transformer_base.py | https://arxiv.org/abs/1706.03762 | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/text_to_speech/tts_transformer.py | wav2vec2.0/fairseq/models/text_to_speech/tts_transformer.py | https://arxiv.org/pdf/1809.08895.pdf | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/text_to_speech/tacotron2.py | wav2vec2.0/fairseq/models/text_to_speech/tacotron2.py | https://arxiv.org/pdf/1712.05884.pdf | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/text_to_speech/fastspeech2.py | wav2vec2.0/fairseq/models/text_to_speech/fastspeech2.py | https://arxiv.org/abs/2006.04558 | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/speech_to_text/xm_transformer.py | wav2vec2.0/fairseq/models/speech_to_text/xm_transformer.py | http://dl.fbaipublicfiles.com/fairseq/s2t | s2t在dl.fbaipublicfiles.com上的公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/speech_to_text/s2t_transformer.py | wav2vec2.0/fairseq/models/speech_to_text/s2t_transformer.py | https://arxiv.org/abs/1911.08460 | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/speech_to_text/s2t_transformer.py | wav2vec2.0/fairseq/models/speech_to_text/s2t_transformer.py | https://arxiv.org/abs/1706.03762 | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/speech_to_text/s2t_transformer.py | wav2vec2.0/fairseq/models/speech_to_text/s2t_transformer.py | http://dl.fbaipublicfiles.com/fairseq/s2t | 公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/speech_to_text/s2t_conformer.py | wav2vec2.0/fairseq/models/speech_to_text/s2t_conformer.py | https://arxiv.org/abs/2005.08100 | 参考论文地址公网来源说明 | -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/speech_to_text/modules/emformer.py | wav2vec2.0/fairseq/models/speech_to_text/modules/emformer.py | https://arxiv.org/abs/1803.02155 | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/speech_to_text/modules/emformer.py | wav2vec2.0/fairseq/models/speech_to_text/modules/emformer.py | https://arxiv.org/abs/2005.09684 | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/speech_to_text/modules/augmented_memory_attention.py | wav2vec2.0/fairseq/models/speech_to_text/modules/augmented_memory_attention.py | https://arxiv.org/abs/2005.08042 | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/speech_to_text/modules/augmented_memory_attention.py | wav2vec2.0/fairseq/models/speech_to_text/modules/augmented_memory_attention.py | https://arxiv.org/abs/2005.09137 | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/speech_to_text/convtransformer.py | wav2vec2.0/fairseq/models/speech_to_text/convtransformer.py | https://arxiv.org/abs/2004.10234 | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/speech_to_text/berard.py | wav2vec2.0/fairseq/models/speech_to_text/berard.py | https://arxiv.org/abs/1802.04200 | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/speech_to_text/berard.py | wav2vec2.0/fairseq/models/speech_to_text/berard.py | https://github.com/eske/seq2seq | seq2seq在GitHub上的源码公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/speech_to_text/berard.py | wav2vec2.0/fairseq/models/speech_to_text/berard.py | https://github.com/eske/seq2seq/blob/master/config/LibriSpeech/AST.yaml | seq2seq在GitHub上的源码公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/speech_to_text/berard.py | wav2vec2.0/fairseq/models/speech_to_text/berard.py | https://github.com/eske/seq2seq/blob/master/translate/models.py | seq2seq在GitHub上的源码公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/speech_to_text/berard.py | wav2vec2.0/fairseq/models/speech_to_text/berard.py | https://arxiv.org/abs/1409.0473 | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/speech_to_text/berard.py | wav2vec2.0/fairseq/models/speech_to_text/berard.py | https://arxiv.org/abs/1802.04200 | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/speech_to_text/berard.py | wav2vec2.0/fairseq/models/speech_to_text/berard.py | https://arxiv.org/abs/1909.06515 | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/speech_to_text/berard.py | wav2vec2.0/fairseq/models/speech_to_text/berard.py | https://arxiv.org/pdf/2002.01320.pdf | 参考论文地址公网来源说明 | -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/speech_to_text/berard.py | wav2vec2.0/fairseq/models/speech_to_text/berard.py | https://arxiv.org/abs/2006.12124 | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/speech_to_speech/s2s_transformer.py | wav2vec2.0/fairseq/models/speech_to_speech/s2s_transformer.py | https://arxiv.org/abs/2107.05604 | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/roberta/model.py | wav2vec2.0/fairseq/models/roberta/model.py | https://openreview.net/forum?id=_CMSV7FTzGI | openreview.net源码公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/roberta/hub_interface.py | wav2vec2.0/fairseq/models/roberta/hub_interface.py | https://github.com/pytorch/fairseq/tree/main/examples/roberta | roberta在github上的公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/roberta/hub_interface.py | wav2vec2.0/fairseq/models/roberta/hub_interface.py | https://github.com/pytorch/fairseq/issues/1306 | 公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/lightconv.py | wav2vec2.0/fairseq/models/lightconv.py | https://openreview.net/pdf?id=SkVhlh09tX | openreview.net源码公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/fconv.py | wav2vec2.0/fairseq/models/fconv.py | https://arxiv.org/abs/1705.03122 | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/fairseq_incremental_decoder.py | wav2vec2.0/fairseq/models/fairseq_incremental_decoder.py | http://www.telesens.co/2019/04/21/understanding-incremental-decoding-in-fairseq | telesens.co源码公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/ema/ema.py | wav2vec2.0/fairseq/models/ema/ema.py | https://github.com/zhawe01/fairseq-gec.git | fairseq-gec.git在github上的公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/ema/ema.py | wav2vec2.0/fairseq/models/ema/ema.py | https://www.tensorflow.org/api_docs/python/tf/train/ExponentialMovingAverage | tensorflow.org源码公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/bart/model.py | wav2vec2.0/fairseq/models/bart/model.py | / | bart模型权重在开源网站的地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/bart/model.py | wav2vec2.0/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.tar.gz | bart模型权重在开源网站的地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/bart/model.py | wav2vec2.0/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.mnli.tar.gz | bart模型权重在开源网站的地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/bart/model.py | wav2vec2.0/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.cnn.tar.gz | 
bart模型权重在开源网站的地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/bart/model.py | wav2vec2.0/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.xsum.tar.gz | bart模型权重在开源网站的地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/models/bart/hub_interface.py | wav2vec2.0/fairseq/models/bart/hub_interface.py | https://github.com/pytorch/fairseq/tree/main/examples/bart | fairseq/tree/main/examples/bart源码在github上的公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/model_parallel/modules/transformer_layer.py | wav2vec2.0/fairseq/model_parallel/modules/transformer_layer.py | https://arxiv.org/pdf/1909.08053.pdf | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/model_parallel/modules/multihead_attention.py | wav2vec2.0/fairseq/model_parallel/modules/multihead_attention.py | https://arxiv.org/pdf/1909.08053.pdf | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/file_utils.py | wav2vec2.0/fairseq/file_utils.py | https://github.com/allenai/allennlp | allennlp在github上的公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/file_utils.py | wav2vec2.0/fairseq/file_utils.py | https://github.com/huggingface | huggingface源码公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/dataclass/constants.py | wav2vec2.0/fairseq/dataclass/constants.py | https://github.com/facebookresearch/hydra/issues/1156 | facebookresearch/hydra/issues公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/dataclass/configs.py | wav2vec2.0/fairseq/dataclass/configs.py | https://fairscale.readthedocs.io/en/latest/api/experimental/nn/slowmo_ddp.html | fairscale.readthedocs.io源码公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/dataclass/configs.py | wav2vec2.0/fairseq/dataclass/configs.py | https://github.com/facebookresearch/hydra/issues/1117 | facebookresearch/hydra/issues公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/data/mask_tokens_dataset.py | wav2vec2.0/fairseq/data/mask_tokens_dataset.py | https://arxiv.org/abs/1910.05453 | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/data/indexed_dataset.py | wav2vec2.0/fairseq/data/indexed_dataset.py | https://github.com/numpy/numpy/issues/5745 | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/data/encoders/gpt2_bpe_utils.py | wav2vec2.0/fairseq/data/encoders/gpt2_bpe_utils.py | https://github.com/openai/gpt-2/blob/master/src/encoder.py | openai/gpt-2/encoder.py源码公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/data/audio/speech_to_text_dataset.py | wav2vec2.0/fairseq/data/audio/speech_to_text_dataset.py | https://arxiv.org/abs/1907.05019 | 参考论文地址公网来源说明 | -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/data/audio/feature_transforms/specaugment.py | wav2vec2.0/fairseq/data/audio/feature_transforms/specaugment.py | https://arxiv.org/abs/1904.08779 | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/criterions/tacotron2_loss.py | wav2vec2.0/fairseq/criterions/tacotron2_loss.py | https://arxiv.org/abs/1710.08969 | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/criterions/adaptive_loss.py | wav2vec2.0/fairseq/criterions/adaptive_loss.py | http://arxiv.org/abs/1609.04309 | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/clib/libnat_cuda/binding.cpp | wav2vec2.0/fairseq/clib/libnat_cuda/binding.cpp | https://github.com/1ytic/pytorch-edit-distance | pytorch-edit-distance在github上的公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/clib/libbase/balanced_assignment.cpp | wav2vec2.0/fairseq/clib/libbase/balanced_assignment.cpp | https://dspace.mit.edu/bitstream/handle/1721.1/3265/P-2108-26912652.pdf | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/clib/libbase/balanced_assignment.cpp | wav2vec2.0/fairseq/clib/libbase/balanced_assignment.cpp | https://github.com/bkj/auction-lap | auction-lap在github上的公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/checkpoint_utils.py | wav2vec2.0/fairseq/checkpoint_utils.py | https://pypi.org/project/huggingface-hub | huggingface-hub在pypi.org上的公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/fairseq/checkpoint_utils.py | wav2vec2.0/fairseq/checkpoint_utils.py | https://arxiv.org/abs/1909.11556 | 参考论文地址公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/examples/wav2vec/unsupervised/w2vu_generate.py | wav2vec2.0/examples/wav2vec/unsupervised/w2vu_generate.py | https://github.com/facebookresearch/hydra/issues/1126 | facebookresearch/hydra/issues公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/examples/wav2vec/unsupervised/scripts/vads.py | wav2vec2.0/examples/wav2vec/unsupervised/scripts/vads.py | https://github.com/zhenghuatan/rVADfast | rVADfast在github上的公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/examples/wav2vec/unsupervised/scripts/normalize_and_filter_text.py | wav2vec2.0/examples/wav2vec/unsupervised/scripts/normalize_and_filter_text.py | https://fasttext.cc/docs/en/language-identification.html | fasttext.cc引用的公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/examples/wav2vec/unsupervised/kaldi_self_train/st/steps_gan/train_lda_mllt.sh | wav2vec2.0/examples/wav2vec/unsupervised/kaldi_self_train/st/steps_gan/train_lda_mllt.sh | http://kaldi-asr.org/doc/transform.html | kaldi-asr.org引用的公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/examples/wav2vec/unsupervised/kaldi_self_train/st/cmd.sh | wav2vec2.0/examples/wav2vec/unsupervised/kaldi_self_train/st/cmd.sh | 
http://kaldi-asr.org/doc/queue.html | kaldi-asr.org引用的公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/docs/make.bat | wav2vec2.0/docs/make.bat | http://sphinx-doc.org | sphinx-doc.org引用的公网来源说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/a0ceabc287e26f64517fadb13a54c83b71e8e469/docs/conf.py | wav2vec2.0/docs/conf.py | http://alabaster.readthedocs.io/en/latest/installation.html#sidebars | alabaster.readthedocs.io源码公网来源说明 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|----------------------------------------------------------------------------------------------------|----------------------------------------------------------------------------------------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/.circleci/config.yml | https://download.pytorch.org/whl/torch_stable.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/.circleci/config.yml | https://download.pytorch.org/whl/torch_stable.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/.circleci/config.yml | https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh | miniconda链接 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/.github/workflows/build.yml | https://download.pytorch.org/whl/torch_stable.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/data/encoders/gpt2_bpe.py | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/encoder.json | 模型参数相关配置 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/data/encoders/gpt2_bpe.py | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/vocab.bpe | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/dataclass/configs.py | https://fairscale.readthedocs.io/en/latest/api/experimental/nn/slowmo_ddp.html | 相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.xsum.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.mnli.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.cnn.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.base.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/fconv.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt17.v2.en-de.fconv-py.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/fconv.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt14.v2.en-fr.fconv-py.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/fconv.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt14.en-de.fconv-py.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/fconv_self_att.py | https://dl.fbaipublicfiles.com/fairseq/models/stories_checkpoint.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/fconv_self_att.py | https://dl.fbaipublicfiles.com/fairseq/models/stories_checkpoint.tar.gz | 模型相关说明 | +| 
ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/fconv_self_att.py | https://dl.fbaipublicfiles.com/fairseq/data/stories_test.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.lightconv.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/iwslt14.de-en.lightconv.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt17.zh-en.lightconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.lightconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.lightconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt14.en-fr.joined-dict.lightconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.dynamicconv.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/iwslt14.de-en.dynamicconv.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt17.zh-en.dynamicconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.dynamicconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.dynamicconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt14.en-fr.joined-dict.dynamicconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.wsc.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.mnli.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.base.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-large.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-wikipedia-4gb.tar.gz | 模型相关说明 | +| 
ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-oscar-4gb.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-ccnet-4gb.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-ccnet.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/roberta/model_gottbert.py | https://dl.gottbert.de/fairseq/models/gottbert-base.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/roberta/model_xlmr.py | http://dl.fbaipublicfiles.com/fairseq/models/xlmr/xlmr.xxl.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/roberta/model_xlmr.py | http://dl.fbaipublicfiles.com/fairseq/models/xlmr/xlmr.xl.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/roberta/model_xlmr.py | http://dl.fbaipublicfiles.com/fairseq/models/xlmr.large.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/roberta/model_xlmr.py | http://dl.fbaipublicfiles.com/fairseq/models/xlmr.base.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/speech_to_text/s2t_transformer.py | http://dl.fbaipublicfiles.com/fairseq/s2t | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/speech_to_text/xm_transformer.py | http://dl.fbaipublicfiles.com/fairseq/s2t | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/text_to_speech/fastspeech2.py | http://dl.fbaipublicfiles.com/fairseq/s2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/text_to_speech/tts_transformer.py | http://dl.fbaipublicfiles.com/fairseq/s2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.ru-en.ensemble.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-de.joined-dict.ensemble.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.ta-en.single.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.iu-en.nh.single.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.iu-en.news.single.tar.gz | 模型相关说明 | +| 
ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.en-ta.single.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.en-iu.nh.single.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.en-iu.news.single.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.ru-en.single_model.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-ru.single_model.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-de.joined-dict.single_model.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.de-en.joined-dict.single_model.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt18.en-de.ensemble.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/flores101/pretrained_models/flores101_mm100_615M.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/flores101/pretrained_models/flores101_mm100_175M.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt16.en-de.joined-dict.transformer.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt14.en-fr.joined-dict.transformer.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt20.ta.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt20.iu.nh.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt20.iu.news.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt20.en.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/adaptive_lm_wiki103.v2.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/adaptive_lm_gbw_huge.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.ru.tar.bz2 | 模型相关说明 | +| 
ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.en.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.de.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.large.prenorm.81.500k.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.75.269k.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.75.125k.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.60.265k.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.60.125k.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.30.195k.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.30.125k.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.13.125k.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/fairseq/models/xmod/model.py | https://dl.fbaipublicfiles.com/fairseq/models/xmod/xmod.base.81.1M.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/audio/wav2vec2.0/setup.py | https://download.pytorch.org/whl/cpu/torch-1.7.0%2Bcpu-cp36-cp36m-linux_x86_64.whl | 三方库链接 | \ No newline at end of file diff --git a/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/public_address_statement.md b/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/public_address_statement.md index 9b66a131d0a743dfd15a45583f38f0f8691f98e3..d38c487260fc3f29e8d1d43ae1a1e73c141a06dc 100644 --- a/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/public_address_statement.md +++ b/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/public_address_statement.md @@ -1,99 +1,134 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|-------------------------------------------------------|--------------------------------------------------------|----------------------------------------------------------------------------------------------------------|---------| -|开源代码引入|https://github.com/HuangJunJie2017/BEVDet.git|tools/train.py|https://mmdetection3d.readthedocs.io/en/latest/tutorials/customize_runtime.html|源码实现| -|开源代码引入|https://github.com/HuangJunJie2017/BEVDet.git|tools/train.py|https://arxiv.org/abs/1706.02677|参考论文地址| -|开源代码引入|https://github.com/HuangJunJie2017/BEVDet.git|tools/data_converter/waymo_converter.py|https://github.com/caizhongang/waymo_kitti_converter|源码实现| -|开源代码引入|https://github.com/HuangJunJie2017/BEVDet.git|tools/data_converter/scannet_data_utils.py|https://github.com/charlesq34/pointnet2/blob/master/scannet/scannet_dataset.py|源码实现| 
-|开源代码引入|https://github.com/HuangJunJie2021/BEVDet.git|tools/data_converter/s3dis_data_utils.py|https://github.com/charlesq34/pointnet2/blob/master/scannet/scannet_dataset.py|源码实现| -|开源代码引入|https://github.com/HuangJunJie2022/BEVDet.git|tools/data_converter/s3dis_data_utils.py|https://arxiv.org/abs/2006.12356|参考论文地址| -|开源代码引入|https://github.com/HuangJunJie2023/BEVDet.git|tools/data_converter/lyft_data_fixer.py|https://www.kaggle.com/c/3d-object-detection-for-autonomous-vehicles/discussion/110000|源码实现| -|开源代码引入|https://github.com/HuangJunJie2024/BEVDet.git|tools/data_converter/indoor_converter.py|https://github.com/charlesq34/pointnet2/blob/master/scannet/scannet_dataset.py|源码实现| -|开源代码引入|https://github.com/HuangJunJie2025/BEVDet.git|tools/convert_bevdet_to_TRT.py|https://github.com/HuangJunJie2017/mmdeploy.git|源码实现| -|开源代码引入|https://github.com/HuangJunJie2026/BEVDet.git|mmdet3d/utils/setup_env.py|https://github.com/pytorch/pytorch/blob/master/torch/distributed/run.py|源码实现| -|开源代码引入|https://github.com/HuangJunJie2027/BEVDet.git|mmdet3d/utils/misc.py|https://github.com/microsoft/SoftTeacher/blob/main/ssod/utils/patch.py|源码实现| -|开源代码引入|https://github.com/HuangJunJie2028/BEVDet.git|mmdet3d/ops/pointnet_modules/paconv_sa_module.py|https://arxiv.org/abs/2103.14635|参考论文地址| -|开源代码引入|https://github.com/HuangJunJie2029/BEVDet.git|mmdet3d/ops/pointnet_modules/paconv_sa_module.py|https://arxiv.org/abs/2103.14635|参考论文地址| -|开源代码引入|https://github.com/HuangJunJie2030/BEVDet.git|mmdet3d/ops/pointnet_modules/paconv_sa_module.py|https://arxiv.org/abs/2103.14635|参考论文地址| -|开源代码引入|https://github.com/HuangJunJie2031/BEVDet.git|mmdet3d/ops/pointnet_modules/paconv_sa_module.py|https://arxiv.org/abs/2103.14635|参考论文地址| -|开源代码引入|https://github.com/HuangJunJie2032/BEVDet.git|mmdet3d/ops/paconv/paconv.py|https://arxiv.org/pdf/2103.14635.pdf|参考论文地址| -|开源代码引入|https://github.com/HuangJunJie2033/BEVDet.git|mmdet3d/ops/paconv/paconv.py|https://github.com/CVMI-Lab/PAConv/blob/main/scene_seg/model/pointnet2/paconv.py|源码实现| -|开源代码引入|https://github.com/HuangJunJie2034/BEVDet.git|mmdet3d/ops/norm.py|https://github.com/facebookresearch/detectron2/|源码实现| -|开源代码引入|https://github.com/HuangJunJie2035/BEVDet.git|mmdet3d/ops/norm.py|https://github.com/facebookresearch/detectron2/|源码实现| -|开源代码引入|https://github.com/HuangJunJie2036/BEVDet.git|mmdet3d/ops/bev_pool_v2/bev_pool.py|https://arxiv.org/abs/2211.17111|参考论文地址| -|开源代码引入|https://github.com/HuangJunJie2037/BEVDet.git|mmdet3d/models/necks/view_transformer.py|https://arxiv.org/abs/2211.17111|参考论文地址| -|开源代码引入|https://github.com/HuangJunJie2038/BEVDet.git|mmdet3d/models/necks/view_transformer.py|https://arxiv.org/abs/2008.05711|参考论文地址| -|开源代码引入|https://github.com/HuangJunJie2039/BEVDet.git|mmdet3d/models/necks/pointnet2_fp_neck.py|https://arxiv.org/abs/2211.17111|参考论文地址| -|开源代码引入|https://github.com/HuangJunJie2040/BEVDet.git|mmdet3d/models/necks/fpn.py|https://arxiv.org/abs/1612.03144|参考论文地址| -|开源代码引入|https://github.com/HuangJunJie2041/BEVDet.git|mmdet3d/models/losses/uncertain_smooth_l1_loss.py|https://arxiv.org/abs/1705.07115|参考论文地址| -|开源代码引入|https://github.com/HuangJunJie2042/BEVDet.git|mmdet3d/models/losses/uncertain_smooth_l1_loss.py|https://arxiv.org/abs/2107.14160|参考论文地址| -|开源代码引入|https://github.com/HuangJunJie2043/BEVDet.git|mmdet3d/models/losses/paconv_regularization_loss.py|https://github.com/CVMI-Lab/PAConv/blob/main/scene_seg/tool/train.py|源码实现| -|开源代码引入|https://github.com/HuangJunJie2044/BEVDet.git|mmdet3d/models/detectors/voxelnet.py|https://arxiv.org/abs/1711.06396|参考论文地址| 
-|开源代码引入|https://github.com/HuangJunJie2045/BEVDet.git|mmdet3d/models/detectors/votenet.py|https://arxiv.org/pdf/1904.09664.pdf|参考论文地址| -|开源代码引入|https://github.com/HuangJunJie2046/BEVDet.git|mmdet3d/models/detectors/ssd3dnet.py|https://arxiv.org/abs/2002.10187.pdf|参考论文地址| -|开源代码引入|https://github.com/HuangJunJie2047/BEVDet.git|mmdet3d/models/detectors/sassd.py|https://github.com/skyhehe123/SA-SSD|源码实现| -|开源代码引入|https://github.com/HuangJunJie2048/BEVDet.git|mmdet3d/models/detectors/parta2.py|https://arxiv.org/abs/1907.03670|参考论文地址| -|开源代码引入|https://github.com/HuangJunJie2049/BEVDet.git|mmdet3d/models/detectors/imvoxelnet.py|https://arxiv.org/abs/2106.01178|参考论文地址| -|开源代码引入|https://github.com/HuangJunJie2050/BEVDet.git|mmdet3d/models/backbones/swin.py|https://github.com/microsoft/Swin-Transformer|源码实现| -|开源代码引入|https://github.com/HuangJunJie2051/BEVDet.git|mmdet3d/models/backbones/swin.py|https://arxiv.org/abs/2103.14030|参考论文地址| -|开源代码引入|https://github.com/HuangJunJie2052/BEVDet.git|mmdet3d/models/backbones/nostem_regnet.py|https://arxiv.org/abs/2003.13678|参考论文地址| -|开源代码引入|https://github.com/HuangJunJie2053/BEVDet.git|mmdet3d/models/backbones/mink_resnet.py|https://arxiv.org/abs/1904.08755|参考论文地址| -|开源代码引入|https://github.com/HuangJunJie2054/BEVDet.git|mmdet3d/models/backbones/mink_resnet.py|https://github.com/NVIDIA/MinkowskiEngine/blob/master/examples/resnet.py|源码实现| -|开源代码引入|https://github.com/HuangJunJie2055/BEVDet.git|mmdet3d/models/backbones/dla.py|https://arxiv.org/abs/1707.06484|参考论文地址| -|开源代码引入|https://github.com/HuangJunJie2056/BEVDet.git|mmdet3d/core/hook/ema.py|https://github.com/Megvii-Base|源码实现| -|开源代码引入|https://github.com/HuangJunJie2057/BEVDet.git|mmdet3d/core/hook/ema.py|https://www.tensorflow.org/api_docs/python/tf/train/|源码实现| -|开源代码引入|https://github.com/HuangJunJie2058/BEVDet.git|mmdet3d/core/hook/ema.py|https://github.com/rwightman/|源码实现| -|开源代码引入|https://github.com/HuangJunJie2059/BEVDet.git|mmdet3d/apis/train.py|https://github.com/open-mmlab/mmcv/pull/1193|源码实现| -|开源代码引入|https://github.com/HuangJunJie2060/BEVDet.git|mmdet3d/apis/train.py|https://github.com/open-mmlab/mmcv/pull/1193|源码实现| -|开源代码引入|https://github.com/HuangJunJie2061/BEVDet.git|mmdet3d/apis/train.py|https://github.com/open-mmlab/mmdetection/issues/6339|源码实现| -|开源代码引入|https://github.com/HuangJunJie2062/BEVDet.git|docs/zh_cn/stat.py|https://github.com/open-mmlab/mmdetection3d/blob/master/|源码实现| -|开源代码引入|https://github.com/HuangJunJie2063/BEVDet.git|docs/zh_cn/conf.py|https://github.com/open-mmlab/mmdetection|源码实现| -|开源代码引入|https://github.com/HuangJunJie2064/BEVDet.git|docs/zh_cn/conf.py|https://github.com/open-mmlab/mmcv|源码实现| -|开源代码引入|https://github.com/HuangJunJie2065/BEVDet.git|docs/zh_cn/conf.py|https://github.com/open-mmlab/mmdetection3d|源码实现| -|开源代码引入|https://github.com/HuangJunJie2066/BEVDet.git|docs/zh_cn/conf.py|https://mmocr.readthedocs.io/en/latest/|源码实现| -|开源代码引入|https://github.com/HuangJunJie2067/BEVDet.git|docs/zh_cn/conf.py|https://www.sphinx-doc.org/en/master/usage/configuration.html|源码实现| -|开源代码引入|https://github.com/HuangJunJie2068/BEVDet.git|docs/en/stat.py|https://github.com/open-mmlab/mmdetection3d/blob/master/|源码实现| -|开源代码引入|https://github.com/HuangJunJie2069/BEVDet.git|docs/en/conf.py|https://github.com/open-mmlab/mmdetection|源码实现| -|开源代码引入|https://github.com/HuangJunJie2070/BEVDet.git|docs/en/conf.py|https://github.com/open-mmlab/mmcv|源码实现| -|开源代码引入|https://github.com/HuangJunJie2071/BEVDet.git|docs/en/conf.py|https://github.com/open-mmlab/mmdetection3d|源码实现| 
-|开源代码引入|https://github.com/HuangJunJie2072/BEVDet.git|docs/en/conf.py|https://mmocr.readthedocs.io/en/latest/|源码实现| -|开源代码引入|https://github.com/HuangJunJie2073/BEVDet.git|docs/en/conf.py|https://www.sphinx-doc.org/en/master/usage/configuration.html|源码实现| -|开源代码引入|https://github.com/HuangJunJie2074/BEVDet.git|data/scannet/scannet_utils.py|https://github.com/ScanNet/ScanNet/blob/master/BenchmarkScripts|源码实现| -|开源代码引入|https://github.com/HuangJunJie2075/BEVDet.git|data/scannet/scannet_utils.py|https://github.com/facebookresearch/votenet/blob/master/scannet/scannet_utils.py|源码实现| -|开源代码引入|https://github.com/HuangJunJie2076/BEVDet.git|data/scannet/load_scannet_data.py|https://github.com/facebookresearch/votenet/blob/master/scannet/load_scannet_data.py|源码实现| -|开源代码引入|https://github.com/HuangJunJie2077/BEVDet.git|data/scannet/extract_posed_images.py|https://github.com/ScanNet/ScanNet/blob/master/SensReader/python/SensorData.py|源码实现| -|开源代码引入|https://github.com/HuangJunJie2078/BEVDet.git|data/scannet/batch_load_scannet_data.py|https://github.com/facebookresearch/votenet/blob/master/scannet/batch_load_scannet_data.py|源码实现| -|开源代码引入|https://github.com/HuangJunJie2079/BEVDet.git|data/s3dis/collect_indoor3d_data.py|https://github.com/AnTao97/dgcnn.pytorch/blob/843abe82dd731eb51a4b3f70632c2ed3c60560e9/prepare_data/collect_indoor3d_data.py|源码实现| -|开源代码引入|https://github.com/HuangJunJie2080/BEVDet.git|configs/pointpillars/hv_pointpillars_secfpn_6x8_160e_kitti-3d-3class.py|https://mmcv.readthedocs.io/en/latest/api.html#mmcv.fileio.FileClient|源码实现| -|开源代码引入|https://github.com/HuangJunJie2081/BEVDet.git|configs/nuimages/mask_rcnn_r50_fpn_coco-2x_1x_nuim.py|https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r50_fpn_2x_coco/mask_rcnn_r50_fpn_2x_coco_bbox_mAP-0.392__segm_mAP-0.354_20200505_003907-3e542a40.pth|权重文件| -|开源代码引入|https://github.com/HuangJunJie2082/BEVDet.git|configs/nuimages/mask_rcnn_r50_caffe_fpn_coco-3x_20e_nuim.py|http://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r50_caffe_fpn_mstrain-poly_3x_coco/mask_rcnn_r50_caffe_fpn_mstrain-poly_3x_coco_bbox_mAP-0.408__segm_mAP-0.37_20200504_163245-42aa3d00.pth|权重文件| -|开源代码引入|https://github.com/HuangJunJie2083/BEVDet.git|configs/nuimages/mask_rcnn_r50_caffe_fpn_coco-3x_1x_nuim.py|https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r50_caffe_fpn_mstrain-poly_3x_coco/mask_rcnn_r50_caffe_fpn_mstrain-poly_3x_coco_bbox_mAP-0.408__segm_mAP-0.37_20200504_163245-42aa3d00.pth|权重文件| -|开源代码引入|https://github.com/HuangJunJie2084/BEVDet.git|configs/nuimages/htc_x101_64x4d_fpn_dconv_c3-c5_coco-20e_16x1_20e_nuim.py|http://download.openmmlab.com/mmdetection/v2.0/htc/htc_x101_64x4d_fpn_dconv_c3-c5_mstrain_400_1400_16x1_20e_coco/htc_x101_64x4d_fpn_dconv_c3-c5_mstrain_400_1400_16x1_20e_coco_20200312-946fd751.pth|权重文件| -|开源代码引入|https://github.com/HuangJunJie2085/BEVDet.git|configs/nuimages/htc_r50_fpn_coco-20e_1x_nuim.py|http://download.openmmlab.com/mmdetection/v2.0/htc/htc_r50_fpn_20e_coco/htc_r50_fpn_20e_coco_20200319-fe28c577.pth|权重文件| -|开源代码引入|https://github.com/HuangJunJie2086/BEVDet.git|configs/nuimages/cascade_mask_rcnn_r50_fpn_coco-20e_20e_nuim.py|http://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_r50_fpn_20e_coco/cascade_mask_rcnn_r50_fpn_20e_coco_bbox_mAP-0.419__segm_mAP-0.365_20200504_174711-4af8e66e.pth|权重文件| 
-|开源代码引入|https://github.com/HuangJunJie2087/BEVDet.git|configs/nuimages/cascade_mask_rcnn_r50_fpn_coco-20e_1x_nuim.py|http://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_r50_fpn_20e_coco/cascade_mask_rcnn_r50_fpn_20e_coco_bbox_mAP-0.419__segm_mAP-0.365_20200504_174711-4af8e66e.pth|权重文件| -|开源代码引入|https://github.com/HuangJunJie2088/BEVDet.git|configs/mvxnet/dv_mvx-fpn_second_secfpn_adamw_2x8_80e_kitti-3d-3class.py|https://download.openmmlab.com/mmdetection3d/pretrain_models/mvx_faster_rcnn_detectron2-caffe_20e_coco-pretrain_gt-sample_kitti-3-class_moderate-79.3_20200207-a4a6a3c7.pth|权重文件| -|开源代码引入|https://github.com/HuangJunJie2089/BEVDet.git|configs/imvotenet/imvotenet_stage2_16x8_sunrgbd-3d-10class.py|https://download.openmmlab.com/mmdetection3d/v0.1.0_models/imvotenet/imvotenet_faster_rcnn_r50_fpn_2x4_sunrgbd-3d-10class/imvotenet_faster_rcnn_r50_fpn_2x4_sunrgbd-3d-10class_20210323_173222-cad62aeb.pth|权重文件| -|开源代码引入|https://github.com/HuangJunJie2090/BEVDet.git|configs/imvotenet/imvotenet_faster_rcnn_r50_fpn_2x4_sunrgbd-3d-10class.py|http://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r50_caffe_fpn_mstrain-poly_3x_coco/mask_rcnn_r50_caffe_fpn_mstrain-poly_3x_coco_bbox_mAP-0.408__segm_mAP-0.37_20200504_163245-42aa3d00.pth|权重文件| -|开源代码引入|https://github.com/HuangJunJie2091/BEVDet.git|configs/free_anchor/hv_pointpillars_regnet-3.2gf_fpn_sbn-all_free-anchor_strong-aug_4x8_3x_nus-3d.py|https://mmcv.readthedocs.io/en/latest/api.html#mmcv.fileio.FileClient|源码实现| -|开源代码引入|https://github.com/HuangJunJie2092/BEVDet.git|configs/free_anchor/hv_pointpillars_regnet-1.6gf_fpn_sbn-all_free-anchor_strong-aug_4x8_3x_nus-3d.py|https://mmcv.readthedocs.io/en/latest/api.html#mmcv.fileio.FileClient|源码实现| -|开源代码引入|https://github.com/HuangJunJie2093/BEVDet.git|configs/bevdet/bevdet-stbase-4d-stereo-512x1408-cbgs.py|https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_base_patch4_window12_384_22k.pth|权重文件| -|开源代码引入|https://github.com/HuangJunJie2094/BEVDet.git|configs/_base_/schedules/cyclic_40e.py|https://github.com/open-mmlab/mmcv/blob/f48241a65aebfe07db122e9db320c31b685dc674/mmcv/runner/hooks/momentum_updater.py|源码实现| -|开源代码引入|https://github.com/HuangJunJie2095/BEVDet.git|configs/_base_/schedules/cyclic_40e.py|https://github.com/open-mmlab/mmcv/blob/f48241a65aebfe07db122e9db320c31b685dc674/mmcv/runner/hooks/lr_updater.py|源码实现| -|开源代码引入|https://github.com/HuangJunJie2096/BEVDet.git|configs/_base_/schedules/cyclic_40e.py|https://github.com/traveller59/second.pytorch/blob/3aba19c9688274f75ebb5e576f65cfe54773c021/torchplus/train/learning_schedules_fastai.py|源码实现| -|开源代码引入|https://github.com/HuangJunJie2097/BEVDet.git|configs/_base_/models/smoke.py|http://dl.yf.io/dla/models/imagenet/dla34-ba72cf86.pth|权重文件| -|开源代码引入|https://github.com/HuangJunJie2098/BEVDet.git|configs/_base_/default_runtime.py|https://mmcv.readthedocs.io/en/latest/api.html#mmcv.runner.LoggerHook|源码实现| -|开源代码引入|https://github.com/HuangJunJie2099/BEVDet.git|configs/_base_/datasets/waymoD5-3d-car.py|https://mmcv.readthedocs.io/en/latest/api.html#mmcv.fileio.FileClient|源码实现| -|开源代码引入|https://github.com/HuangJunJie2100/BEVDet.git|configs/_base_/datasets/waymoD5-3d-3class.py|https://mmcv.readthedocs.io/en/latest/api.html#mmcv.fileio.FileClient|源码实现| -|开源代码引入|https://github.com/HuangJunJie2101/BEVDet.git|configs/_base_/datasets/sunrgbd-3d-10class.py|https://mmcv.readthedocs.io/en/latest/api.html#mmcv.fileio.FileClient|源码实现| 
-|开源代码引入|https://github.com/HuangJunJie2102/BEVDet.git|configs/_base_/datasets/scannet_seg-3d-20class.py|https://mmcv.readthedocs.io/en/latest/api.html#mmcv.fileio.FileClient|源码实现| -|开源代码引入|https://github.com/HuangJunJie2103/BEVDet.git|configs/_base_/datasets/scannet-3d-18class.py|https://mmcv.readthedocs.io/en/latest/api.html#mmcv.fileio.FileClient|源码实现| -|开源代码引入|https://github.com/HuangJunJie2104/BEVDet.git|configs/_base_/datasets/s3dis_seg-3d-13class.py|https://mmcv.readthedocs.io/en/latest/api.html#mmcv.fileio.FileClient|源码实现| -|开源代码引入|https://github.com/HuangJunJie2105/BEVDet.git|configs/_base_/datasets/range100_lyft-3d.py|https://mmcv.readthedocs.io/en/latest/api.html#mmcv.fileio.FileClient|源码实现| -|开源代码引入|https://github.com/HuangJunJie2106/BEVDet.git|configs/_base_/datasets/nus-3d.py|https://mmcv.readthedocs.io/en/latest/api.html#mmcv.fileio.FileClient|源码实现| -|开源代码引入|https://github.com/HuangJunJie2107/BEVDet.git|configs/_base_/datasets/lyft-3d.py|https://mmcv.readthedocs.io/en/latest/api.html#mmcv.fileio.FileClient|源码实现| -|开源代码引入|https://github.com/HuangJunJie2108/BEVDet.git|configs/_base_/datasets/kitti-3d-car.py|https://mmcv.readthedocs.io/en/latest/api.html#mmcv.fileio.FileClient|源码实现| -|开源代码引入|https://github.com/HuangJunJie2109/BEVDet.git|configs/_base_/datasets/kitti-3d-3class.py|https://mmcv.readthedocs.io/en/latest/api.html#mmcv.fileio.FileClient|源码实现| -|开源代码引入|https://github.com/HuangJunJie2110/BEVDet.git|configs/3dssd/3dssd_4x4_kitti-3d-car.py|https://mmcv.readthedocs.io/en/latest/api.html#mmcv.fileio.FileClient|源码实现| -|开源代码引入|https://github.com/HuangJunJie2111/BEVDet.git|setup.py|https://github.com/open-mmlab/mmdetection3d|源码实现| -|开源代码引入|https://github.com/HuangJunJie2112/BEVDet.git|setup.py|http://setuptools.readthedocs.io/en/latest/setuptools.html|源码实现| -|开源代码引入|https://github.com/open-mmlab/mmcv.git|mmcv_replace/parallel/distributed.py|https://github.com/open-mmlab/mmsegmentation/issues/1742|源码实现| +| 文件位置 | 公网地址 | 公网地址用途 | +|---------------------------------------------------------------------------------------------------------------------------------------------------|-----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|-----------| +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/_base_/models/smoke.py | http://dl.yf.io/dla/models/imagenet/dla34-ba72cf86.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/3dssd/metafile.yml | https://arxiv.org/abs/2002.10187 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/3dssd/metafile.yml | https://download.openmmlab.com/mmdetection3d/v1.0.0_models/3dssd/3dssd_4x4_kitti-3d-car/3dssd_4x4_kitti-3d-car_20210818_203828-b89c8fc4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/centerpoint/metafile.yml | https://arxiv.org/abs/2006.11275 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/centerpoint/metafile.yml | https://download.openmmlab.com/mmdetection3d/v1.0.0_models/centerpoint/centerpoint_01voxel_second_secfpn_circlenms_4x8_cyclic_20e_nus/centerpoint_01voxel_second_secfpn_circlenms_4x8_cyclic_20e_nus_20210815_085857-9ba7f3a5.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/centerpoint/metafile.yml | https://download.openmmlab.com/mmdetection3d/v0.1.0_models/centerpoint/centerpoint_01voxel_second_secfpn_dcn_circlenms_4x8_cyclic_20e_nus/centerpoint_01voxel_second_secfpn_dcn_circlenms_4x8_cyclic_20e_nus_20201004_075317-26d8176c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/centerpoint/metafile.yml | https://download.openmmlab.com/mmdetection3d/v0.1.0_models/centerpoint/centerpoint_0075voxel_second_secfpn_circlenms_4x8_cyclic_20e_nus/centerpoint_0075voxel_second_secfpn_circlenms_4x8_cyclic_20e_nus_20200925_230905-358fbe3b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/centerpoint/metafile.yml | https://download.openmmlab.com/mmdetection3d/v0.1.0_models/centerpoint/centerpoint_0075voxel_second_secfpn_dcn_circlenms_4x8_cyclic_20e_nus/centerpoint_0075voxel_second_secfpn_dcn_circlenms_4x8_cyclic_20e_nus_20200930_201619-67c8496f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/centerpoint/metafile.yml | https://download.openmmlab.com/mmdetection3d/v1.0.0_models/centerpoint/centerpoint_02pillar_second_secfpn_circlenms_4x8_cyclic_20e_nus/centerpoint_02pillar_second_secfpn_circlenms_4x8_cyclic_20e_nus_20210816_064624-0f3299c0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/centerpoint/metafile.yml | https://download.openmmlab.com/mmdetection3d/v0.1.0_models/centerpoint/centerpoint_02pillar_second_secfpn_dcn_4x8_cyclic_20e_nus/centerpoint_02pillar_second_secfpn_dcn_4x8_cyclic_20e_nus_20200930_103722-3bb135f2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/dgcnn/metafile.yml | https://arxiv.org/abs/1801.07829 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/dgcnn/metafile.yml | https://download.openmmlab.com/mmdetection3d/v0.17.0_models/dgcnn/dgcnn_32x4_cosine_100e_s3dis_seg-3d-13class/area5/dgcnn_32x4_cosine_100e_s3dis_seg-3d-13class_20210730_235824-f277e0c5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/dynamic_voxelization/metafile.yml | https://arxiv.org/abs/1910.06528 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/dynamic_voxelization/metafile.yml | https://download.openmmlab.com/mmdetection3d/v1.0.0_models/dynamic_voxelization/dv_second_secfpn_2x8_cosine_80e_kitti-3d-3class/dv_second_secfpn_2x8_cosine_80e_kitti-3d-3class_20210831_054106-e742d163.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/dynamic_voxelization/metafile.yml | https://download.openmmlab.com/mmdetection3d/v0.1.0_models/dynamic_voxelization/dv_pointpillars_secfpn_6x8_160e_kitti-3d-car/dv_pointpillars_secfpn_6x8_160e_kitti-3d-car_20200620_230844-ee7b75c9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/fcaf3d/metafile.yml | https://arxiv.org/abs/2112.00322 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/fcos3d/metafile.yml | https://arxiv.org/abs/2104.10956 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/fcos3d/metafile.yml | 
https://download.openmmlab.com/mmdetection3d/v0.1.0_models/fcos3d/fcos3d_r101_caffe_fpn_gn-head_dcn_2x8_1x_nus-mono3d/fcos3d_r101_caffe_fpn_gn-head_dcn_2x8_1x_nus-mono3d_20210425_181341-8d5a21fe.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/fcos3d/metafile.yml | https://download.openmmlab.com/mmdetection3d/v0.1.0_models/fcos3d/fcos3d_r101_caffe_fpn_gn-head_dcn_2x8_1x_nus-mono3d_finetune/fcos3d_r101_caffe_fpn_gn-head_dcn_2x8_1x_nus-mono3d_finetune_20210427_091419-35aaaad0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/free_anchor/metafile.yml | https://arxiv.org/abs/1909.02466 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/free_anchor/metafile.yml | https://download.openmmlab.com/mmdetection3d/v1.0.0_models/free_anchor/hv_pointpillars_fpn_sbn-all_free-anchor_4x8_2x_nus-3d/hv_pointpillars_fpn_sbn-all_free-anchor_4x8_2x_nus-3d_20210816_163441-ae0897e7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/free_anchor/metafile.yml | https://download.openmmlab.com/mmdetection3d/v1.0.0_models/free_anchor/hv_pointpillars_regnet-400mf_fpn_sbn-all_free-anchor_4x8_2x_nus-3d/hv_pointpillars_regnet-400mf_fpn_sbn-all_free-anchor_4x8_2x_nus-3d_20210827_213939-a2dd3fff.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/free_anchor/metafile.yml | https://download.openmmlab.com/mmdetection3d/v1.0.0_models/free_anchor/hv_pointpillars_regnet-1.6gf_fpn_sbn-all_free-anchor_4x8_2x_nus-3d/hv_pointpillars_regnet-1.6gf_fpn_sbn-all_free-anchor_4x8_2x_nus-3d_20210828_025608-bfbd506e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/free_anchor/metafile.yml | https://download.openmmlab.com/mmdetection3d/v1.0.0_models/free_anchor/hv_pointpillars_regnet-1.6gf_fpn_sbn-all_free-anchor_strong-aug_4x8_3x_nus-3d/hv_pointpillars_regnet-1.6gf_fpn_sbn-all_free-anchor_strong-aug_4x8_3x_nus-3d_20210827_184909-14d2dbd1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/free_anchor/metafile.yml | https://download.openmmlab.com/mmdetection3d/v1.0.0_models/free_anchor/hv_pointpillars_regnet-3.2gf_fpn_sbn-all_free-anchor_4x8_2x_nus-3d/hv_pointpillars_regnet-3.2gf_fpn_sbn-all_free-anchor_4x8_2x_nus-3d_20210827_181237-e385c35a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/free_anchor/metafile.yml | https://download.openmmlab.com/mmdetection3d/v1.0.0_models/free_anchor/hv_pointpillars_regnet-3.2gf_fpn_sbn-all_free-anchor_strong-aug_4x8_3x_nus-3d/hv_pointpillars_regnet-3.2gf_fpn_sbn-all_free-anchor_strong-aug_4x8_3x_nus-3d_20210828_030816-06708918.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/groupfree3d/metafile.yml | https://arxiv.org/abs/2104.00678 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/groupfree3d/metafile.yml | https://download.openmmlab.com/mmdetection3d/v0.1.0_models/groupfree3d/groupfree3d_8x4_scannet-3d-18class-L6-O256/groupfree3d_8x4_scannet-3d-18class-L6-O256_20210702_145347-3499eb55.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/groupfree3d/metafile.yml | 
https://download.openmmlab.com/mmdetection3d/v0.1.0_models/groupfree3d/groupfree3d_8x4_scannet-3d-18class-L12-O256/groupfree3d_8x4_scannet-3d-18class-L12-O256_20210702_150907-1c5551ad.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/groupfree3d/metafile.yml | https://download.openmmlab.com/mmdetection3d/v0.1.0_models/groupfree3d/groupfree3d_8x4_scannet-3d-18class-w2x-L12-O256/groupfree3d_8x4_scannet-3d-18class-w2x-L12-O256_20210702_200301-944f0ac0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/groupfree3d/metafile.yml | https://download.openmmlab.com/mmdetection3d/v0.1.0_models/groupfree3d/groupfree3d_8x4_scannet-3d-18class-w2x-L12-O512/groupfree3d_8x4_scannet-3d-18class-w2x-L12-O512_20210702_220204-187b71c7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/h3dnet/metafile.yml | https://arxiv.org/abs/2006.05682 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/h3dnet/metafile.yml | https://download.openmmlab.com/mmdetection3d/v1.0.0_models/h3dnet/h3dnet_3x8_scannet-3d-18class/h3dnet_3x8_scannet-3d-18class_20210824_003149-414bd304.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/imvotenet/imvotenet_faster_rcnn_r50_fpn_2x4_sunrgbd-3d-10class.py | http://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r50_caffe_fpn_mstrain-poly_3x_coco/mask_rcnn_r50_caffe_fpn_mstrain-poly_3x_coco_bbox_mAP-0.408__segm_mAP-0.37_20200504_163245-42aa3d00.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/imvotenet/imvotenet_stage2_16x8_sunrgbd-3d-10class.py | https://download.openmmlab.com/mmdetection3d/v0.1.0_models/imvotenet/imvotenet_faster_rcnn_r50_fpn_2x4_sunrgbd-3d-10class/imvotenet_faster_rcnn_r50_fpn_2x4_sunrgbd-3d-10class_20210323_173222-cad62aeb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/imvotenet/metafile.yml | https://arxiv.org/abs/2001.10692 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/imvotenet/metafile.yml | https://download.openmmlab.com/mmdetection3d/v1.0.0_models/imvotenet/imvotenet_faster_rcnn_r50_fpn_2x4_sunrgbd-3d-10class/imvotenet_faster_rcnn_r50_fpn_2x4_sunrgbd-3d-10class_20210819_225618-62eba6ce.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/imvotenet/metafile.yml | https://download.openmmlab.com/mmdetection3d/v1.0.0_models/imvotenet/imvotenet_stage2_16x8_sunrgbd-3d-10class/imvotenet_stage2_16x8_sunrgbd-3d-10class_20210819_192851-1bcd1b97.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/imvotenet/metafile.yml | https://download.openmmlab.com/mmdetection3d/v1.0.0_models/imvotenet/imvotenet_stage2_16x8_sunrgbd-3d-10class/imvotenet_stage2_16x8_sunrgbd-3d-10class_20210819_192851-1bcd1b97.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/imvoxelnet/metafile.yml | https://arxiv.org/abs/2106.01178 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/imvoxelnet/metafile.yml | https://download.openmmlab.com/mmdetection3d/v1.0.0_models/imvoxelnet/imvoxelnet_4x8_kitti-3d-car/imvoxelnet_4x8_kitti-3d-car_20210830_003014-3d0ffdf4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/monoflex/metafile.yml 
| https://arxiv.org/abs/2104.02323 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/monoflex/metafile.yml | https://download.openmmlab.com/mmdetection3d/v0.1.0_models/monoflex/monoflex_dla34_pytorch_dlaneck_gn-all_2x4_6x_kitti-mono3d_20211228_027553-d46d9bb0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/monoflex/metafile.yml | https://download.openmmlab.com/mmdetection3d/v0.1.0_models/dynamic_voxelization/dv_second_secfpn_6x8_80e_kitti-3d-car/dv_second_secfpn_6x8_80e_kitti-3d-car_20200620_235228-ac2c1c0c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/mvxnet/dv_mvx-fpn_second_secfpn_adamw_2x8_80e_kitti-3d-3class.py | https://download.openmmlab.com/mmdetection3d/pretrain_models/mvx_faster_rcnn_detectron2-caffe_20e_coco-pretrain_gt-sample_kitti-3-class_moderate-79.3_20200207-a4a6a3c7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/mvxnet/metafile.yml | https://arxiv.org/abs/1904.01649 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/mvxnet/metafile.yml | https://download.openmmlab.com/mmdetection3d/v1.0.0_models/mvxnet/dv_mvx-fpn_second_secfpn_adamw_2x8_80e_kitti-3d-3class/dv_mvx-fpn_second_secfpn_adamw_2x8_80e_kitti-3d-3class_20210831_060805-83442923.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/nuimages/cascade_mask_rcnn_r50_fpn_coco-20e_1x_nuim.py | http://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_r50_fpn_20e_coco/cascade_mask_rcnn_r50_fpn_20e_coco_bbox_mAP-0.419__segm_mAP-0.365_20200504_174711-4af8e66e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/nuimages/cascade_mask_rcnn_r50_fpn_coco-20e_20e_nuim.py | http://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_r50_fpn_20e_coco/cascade_mask_rcnn_r50_fpn_20e_coco_bbox_mAP-0.419__segm_mAP-0.365_20200504_174711-4af8e66e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/nuimages/htc_r50_fpn_coco-20e_1x_nuim.py | http://download.openmmlab.com/mmdetection/v2.0/htc/htc_r50_fpn_20e_coco/htc_r50_fpn_20e_coco_20200319-fe28c577.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/nuimages/htc_x101_64x4d_fpn_dconv_c3-c5_coco-20e_16x1_20e_nuim.py | http://download.openmmlab.com/mmdetection/v2.0/htc/htc_x101_64x4d_fpn_dconv_c3-c5_mstrain_400_1400_16x1_20e_coco/htc_x101_64x4d_fpn_dconv_c3-c5_mstrain_400_1400_16x1_20e_coco_20200312-946fd751.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/nuimages/mask_rcnn_r50_caffe_fpn_coco-3x_1x_nuim.py | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r50_caffe_fpn_mstrain-poly_3x_coco/mask_rcnn_r50_caffe_fpn_mstrain-poly_3x_coco_bbox_mAP-0.408__segm_mAP-0.37_20200504_163245-42aa3d00.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/nuimages/mask_rcnn_r50_caffe_fpn_coco-3x_20e_nuim.py | http://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r50_caffe_fpn_mstrain-poly_3x_coco/mask_rcnn_r50_caffe_fpn_mstrain-poly_3x_coco_bbox_mAP-0.408__segm_mAP-0.37_20200504_163245-42aa3d00.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/nuimages/mask_rcnn_r50_fpn_coco-2x_1x_nuim.py | 
https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r50_fpn_2x_coco/mask_rcnn_r50_fpn_2x_coco_bbox_mAP-0.392__segm_mAP-0.354_20200505_003907-3e542a40.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/nuimages/metafile.yml | https://arxiv.org/abs/1703.06870v3 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/nuimages/metafile.yml | http://dx.doi.org/10.1109/tpami.2019.2956516 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/nuimages/metafile.yml | https://arxiv.org/abs/1901.07518 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/nuimages/metafile.yml | https://download.openmmlab.com/mmdetection3d/v0.1.0_models/nuimages_semseg/mask_rcnn_r50_fpn_1x_nuim/mask_rcnn_r50_fpn_1x_nuim_20201008_195238-e99f5182.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/nuimages/metafile.yml | https://download.openmmlab.com/mmdetection3d/v0.1.0_models/nuimages_semseg/mask_rcnn_r50_fpn_coco-2x_1x_nuim/mask_rcnn_r50_fpn_coco-2x_1x_nuim_20201008_195238-b1742a60.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/nuimages/metafile.yml | https://download.openmmlab.com/mmdetection3d/v1.0.0_models/nuimages_semseg/mask_rcnn_r50_caffe_fpn_1x_nuim/mask_rcnn_r50_caffe_fpn_1x_nuim_20220718_195238-b1742a60.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/nuimages/metafile.yml | https://download.openmmlab.com/mmdetection3d/v0.1.0_models/nuimages_semseg/mask_rcnn_r50_caffe_fpn_coco-3x_1x_nuim/mask_rcnn_r50_caffe_fpn_coco-3x_1x_nuim_20201008_195305-661a992e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/nuimages/metafile.yml | https://download.openmmlab.com/mmdetection3d/v0.1.0_models/nuimages_semseg/mask_rcnn_r50_caffe_fpn_coco-3x_20e_nuim/mask_rcnn_r50_caffe_fpn_coco-3x_20e_nuim_20201009_125002-5529442c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/nuimages/metafile.yml | https://download.openmmlab.com/mmdetection3d/v0.1.0_models/nuimages_semseg/mask_rcnn_r101_fpn_1x_nuim/mask_rcnn_r101_fpn_1x_nuim_20201024_134803-65c7623a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/nuimages/metafile.yml | https://download.openmmlab.com/mmdetection3d/v0.1.0_models/nuimages_semseg/mask_rcnn_x101_32x4d_fpn_1x_nuim/mask_rcnn_x101_32x4d_fpn_1x_nuim_20201024_135741-b699ab37.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/nuimages/metafile.yml | https://download.openmmlab.com/mmdetection3d/v0.1.0_models/nuimages_semseg/cascade_mask_rcnn_r50_fpn_1x_nuim/cascade_mask_rcnn_r50_fpn_1x_nuim_20201008_195342-1147c036.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/nuimages/metafile.yml | https://download.openmmlab.com/mmdetection3d/v0.1.0_models/nuimages_semseg/cascade_mask_rcnn_r50_fpn_coco-20e_1x_nuim/cascade_mask_rcnn_r50_fpn_coco-20e_1x_nuim_20201009_124158-ad0540e3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/nuimages/metafile.yml | https://download.openmmlab.com/mmdetection3d/v0.1.0_models/nuimages_semseg/cascade_mask_rcnn_r50_fpn_coco-20e_20e_nuim/cascade_mask_rcnn_r50_fpn_coco-20e_20e_nuim_20201009_124951-40963960.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/nuimages/metafile.yml | https://download.openmmlab.com/mmdetection3d/v0.1.0_models/nuimages_semseg/cascade_mask_rcnn_r101_fpn_1x_nuim/cascade_mask_rcnn_r101_fpn_1x_nuim_20201024_134804-45215b1e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/nuimages/metafile.yml | https://download.openmmlab.com/mmdetection3d/v0.1.0_models/nuimages_semseg/cascade_mask_rcnn_x101_32x4d_fpn_1x_nuim/cascade_mask_rcnn_x101_32x4d_fpn_1x_nuim_20201024_135753-e0e49778.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/nuimages/metafile.yml | https://download.openmmlab.com/mmdetection3d/v0.1.0_models/nuimages_semseg/htc_r50_fpn_coco-20e_1x_nuim/htc_r50_fpn_coco-20e_1x_nuim_20201010_070203-0b53a65e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/nuimages/metafile.yml | https://download.openmmlab.com/mmdetection3d/v0.1.0_models/nuimages_semseg/htc_r50_fpn_coco-20e_20e_nuim/htc_r50_fpn_coco-20e_20e_nuim_20201008_211415-d6c60a2c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/nuimages/metafile.yml | https://download.openmmlab.com/mmdetection3d/v0.1.0_models/nuimages_semseg/htc_x101_64x4d_fpn_dconv_c3-c5_coco-20e_16x1_20e_nuim/htc_x101_64x4d_fpn_dconv_c3-c5_coco-20e_16x1_20e_nuim_20201008_211222-0b16ac4b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/paconv/metafile.yml | https://arxiv.org/abs/2103.14635 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/paconv/metafile.yml | https://download.openmmlab.com/mmdetection3d/v0.1.0_models/paconv/paconv_ssg_8x8_cosine_150e_s3dis_seg-3d-13class/paconv_ssg_8x8_cosine_150e_s3dis_seg-3d-13class_20210729_200615-2147b2d1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/parta2/metafile.yml | https://arxiv.org/abs/1907.03670 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/parta2/metafile.yml | https://download.openmmlab.com/mmdetection3d/v1.0.0_models/parta2/hv_PartA2_secfpn_2x8_cyclic_80e_kitti-3d-3class/hv_PartA2_secfpn_2x8_cyclic_80e_kitti-3d-3class_20210831_022017-454a5344.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/parta2/metafile.yml | https://download.openmmlab.com/mmdetection3d/v1.0.0_models/parta2/hv_PartA2_secfpn_2x8_cyclic_80e_kitti-3d-car/hv_PartA2_secfpn_2x8_cyclic_80e_kitti-3d-car_20210831_022017-cb7ff621.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/pgd/metafile.yml | https://arxiv.org/abs/2107.14160 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/pgd/metafile.yml | https://download.openmmlab.com/mmdetection3d/v1.0.0_models/pgd/pgd_r101_caffe_fpn_gn-head_3x4_4x_kitti-mono3d/pgd_r101_caffe_fpn_gn-head_3x4_4x_kitti-mono3d_20211022_102608-8a97533b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/pgd/metafile.yml | https://download.openmmlab.com/mmdetection3d/v1.0.0_models/pgd/pgd_r101_caffe_fpn_gn-head_2x16_1x_nus-mono3d/pgd_r101_caffe_fpn_gn-head_2x16_1x_nus-mono3d_20211116_195350-f4b5eec2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/pgd/metafile.yml | 
https://download.openmmlab.com/mmdetection3d/v1.0.0_models/pgd/pgd_r101_caffe_fpn_gn-head_2x16_1x_nus-mono3d_finetune/pgd_r101_caffe_fpn_gn-head_2x16_1x_nus-mono3d_finetune_20211118_093245-fd419681.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/pgd/metafile.yml | https://download.openmmlab.com/mmdetection3d/v1.0.0_models/pgd/pgd_r101_caffe_fpn_gn-head_2x16_2x_nus-mono3d/pgd_r101_caffe_fpn_gn-head_2x16_2x_nus-mono3d_20211112_125314-cb677266.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/pgd/metafile.yml | https://download.openmmlab.com/mmdetection3d/v1.0.0_models/pgd/pgd_r101_caffe_fpn_gn-head_2x16_2x_nus-mono3d_finetune/pgd_r101_caffe_fpn_gn-head_2x16_2x_nus-mono3d_finetune_20211114_162135-5ec7c1cd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/point_rcnn/metafile.yml | https://arxiv.org/abs/1812.04244 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/point_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection3d/v1.0.0_models/point_rcnn/point_rcnn_2x8_kitti-3d-3classes_20211208_151344.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/pointnet2/metafile.yml | https://arxiv.org/abs/1706.02413 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/pointnet2/metafile.yml | https://download.openmmlab.com/mmdetection3d/v0.1.0_models/pointnet2/pointnet2_ssg_xyz-only_16x2_cosine_200e_scannet_seg-3d-20class/pointnet2_ssg_xyz-only_16x2_cosine_200e_scannet_seg-3d-20class_20210514_143628-4e341a48.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/pointnet2/metafile.yml | https://download.openmmlab.com/mmdetection3d/v0.1.0_models/pointnet2/pointnet2_ssg_16x2_cosine_200e_scannet_seg-3d-20class/pointnet2_ssg_16x2_cosine_200e_scannet_seg-3d-20class_20210514_143644-ee73704a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/pointnet2/metafile.yml | https://download.openmmlab.com/mmdetection3d/v0.1.0_models/pointnet2/pointnet2_msg_xyz-only_16x2_cosine_250e_scannet_seg-3d-20class/pointnet2_msg_xyz-only_16x2_cosine_250e_scannet_seg-3d-20class_20210514_143838-b4a3cf89.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/pointnet2/metafile.yml | https://download.openmmlab.com/mmdetection3d/v0.1.0_models/pointnet2/pointnet2_msg_16x2_cosine_250e_scannet_seg-3d-20class/pointnet2_msg_16x2_cosine_250e_scannet_seg-3d-20class_20210514_144009-24477ab1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/pointnet2/metafile.yml | https://download.openmmlab.com/mmdetection3d/v0.1.0_models/pointnet2/pointnet2_ssg_16x2_cosine_50e_s3dis_seg-3d-13class/pointnet2_ssg_16x2_cosine_50e_s3dis_seg-3d-13class_20210514_144205-995d0119.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/pointnet2/metafile.yml | https://download.openmmlab.com/mmdetection3d/v0.1.0_models/pointnet2/pointnet2_msg_16x2_cosine_80e_s3dis_seg-3d-13class/pointnet2_msg_16x2_cosine_80e_s3dis_seg-3d-13class_20210514_144307-b2059817.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/pointpillars/metafile.yml | https://arxiv.org/abs/1812.05784 | 论文地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/pointpillars/metafile.yml | https://download.openmmlab.com/mmdetection3d/v1.0.0_models/pointpillars/hv_pointpillars_secfpn_6x8_160e_kitti-3d-car/hv_pointpillars_secfpn_6x8_160e_kitti-3d-car_20220331_134606-d42d15ed.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/pointpillars/metafile.yml | https://download.openmmlab.com/mmdetection3d/v1.0.0_models/pointpillars/hv_pointpillars_secfpn_6x8_160e_kitti-3d-3class/hv_pointpillars_secfpn_6x8_160e_kitti-3d-3class_20220301_150306-37dc2420.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/pointpillars/metafile.yml | https://download.openmmlab.com/mmdetection3d/v1.0.0_models/pointpillars/hv_pointpillars_secfpn_sbn-all_4x8_2x_nus-3d/hv_pointpillars_secfpn_sbn-all_4x8_2x_nus-3d_20210826_225857-f19d00a3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/pointpillars/metafile.yml | https://download.openmmlab.com/mmdetection3d/v1.0.0_models/pointpillars/hv_pointpillars_fpn_sbn-all_4x8_2x_nus-3d/hv_pointpillars_fpn_sbn-all_4x8_2x_nus-3d_20210826_104936-fca299c1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/pointpillars/metafile.yml | https://download.openmmlab.com/mmdetection3d/v1.0.0_models/pointpillars/hv_pointpillars_secfpn_sbn-all_2x8_2x_lyft-3d/hv_pointpillars_secfpn_sbn-all_2x8_2x_lyft-3d_20210829_100455-82b81c39.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/pointpillars/metafile.yml | https://download.openmmlab.com/mmdetection3d/v1.0.0_models/pointpillars/hv_pointpillars_fpn_sbn-all_2x8_2x_lyft-3d/hv_pointpillars_fpn_sbn-all_2x8_2x_lyft-3d_20210822_095429-0b3d6196.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/pointpillars/metafile.yml | https://download.openmmlab.com/mmdetection3d/v0.1.0_models/pointpillars/hv_pointpillars_secfpn_sbn_2x16_2x_waymoD5-3d-car/hv_pointpillars_secfpn_sbn_2x16_2x_waymoD5-3d-car_20200901_204315-302fc3e7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/pointpillars/metafile.yml | https://download.openmmlab.com/mmdetection3d/v0.1.0_models/pointpillars/hv_pointpillars_secfpn_sbn_2x16_2x_waymoD5-3d-3class/hv_pointpillars_secfpn_sbn_2x16_2x_waymoD5-3d-3class_20200831_204144-d1a706b1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/pointpillars/metafile.yml | https://download.openmmlab.com/mmdetection3d/v0.1.0_models/pointpillars/hv_pointpillars_secfpn_sbn_2x16_2x_waymoD5-3d-car/hv_pointpillars_secfpn_sbn_2x16_2x_waymoD5-3d-car_20200901_204315-302fc3e7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/pointpillars/metafile.yml | https://download.openmmlab.com/mmdetection3d/v0.1.0_models/pointpillars/hv_pointpillars_secfpn_sbn_2x16_2x_waymoD5-3d-3class/hv_pointpillars_secfpn_sbn_2x16_2x_waymoD5-3d-3class_20200831_204144-d1a706b1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/pointpillars/metafile.yml | https://download.openmmlab.com/mmdetection3d/v0.1.0_models/fp16/hv_pointpillars_secfpn_sbn-all_fp16_2x8_2x_nus-3d/hv_pointpillars_secfpn_sbn-all_fp16_2x8_2x_nus-3d_20201020_222626-c3f0483e.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/pointpillars/metafile.yml | https://download.openmmlab.com/mmdetection3d/v0.1.0_models/fp16/hv_pointpillars_fpn_sbn-all_fp16_2x8_2x_nus-3d/hv_pointpillars_fpn_sbn-all_fp16_2x8_2x_nus-3d_20201021_120719-269f9dd6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection3d/v0.1.0_models/regnet/hv_pointpillars_regnet-400mf_secfpn_sbn-all_4x8_2x_nus-3d/hv_pointpillars_regnet-400mf_secfpn_sbn-all_4x8_2x_nus-3d_20200620_230334-53044f32.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection3d/v0.1.0_models/regnet/hv_pointpillars_regnet-400mf_fpn_sbn-all_4x8_2x_nus-3d/hv_pointpillars_regnet-400mf_fpn_sbn-all_4x8_2x_nus-3d_20200620_230239-c694dce7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection3d/v0.1.0_models/regnet/hv_pointpillars_regnet-1.6gf_fpn_sbn-all_4x8_2x_nus-3d/hv_pointpillars_regnet-1.6gf_fpn_sbn-all_4x8_2x_nus-3d_20200629_050311-dcd4e090.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection3d/v0.1.0_models/regnet/hv_pointpillars_regnet-400mf_secfpn_sbn-all_2x8_2x_lyft-3d/hv_pointpillars_regnet-400mf_secfpn_sbn-all_2x8_2x_lyft-3d_20210524_092151-42513826.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection3d/v0.1.0_models/regnet/hv_pointpillars_regnet-400mf_fpn_sbn-all_2x8_2x_lyft-3d/hv_pointpillars_regnet-400mf_fpn_sbn-all_2x8_2x_lyft-3d_20210521_115618-823dcf18.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/second/metafile.yml | https://www.mdpi.com/1424-8220/18/10/3337 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/second/metafile.yml | https://download.openmmlab.com/mmdetection3d/v0.1.0_models/second/hv_second_secfpn_6x8_80e_kitti-3d-car/hv_second_secfpn_6x8_80e_kitti-3d-car_20200620_230238-393f000c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/second/metafile.yml | https://download.openmmlab.com/mmdetection3d/v1.0.0_models/second/hv_second_secfpn_6x8_80e_kitti-3d-3class/hv_second_secfpn_6x8_80e_kitti-3d-3class_20210831_022017-ae782e87.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/second/metafile.yml | https://download.openmmlab.com/mmdetection3d/v0.1.0_models/fp16/hv_second_secfpn_fp16_6x8_80e_kitti-3d-car/hv_second_secfpn_fp16_6x8_80e_kitti-3d-car_20200924_211301-1f5ad833.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/second/metafile.yml | https://download.openmmlab.com/mmdetection3d/v0.1.0_models/fp16/hv_second_secfpn_fp16_6x8_80e_kitti-3d-3class/hv_second_secfpn_fp16_6x8_80e_kitti-3d-3class_20200925_110059-05f67bdf.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/smoke/metafile.yml | https://arxiv.org/abs/2002.10111 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/smoke/metafile.yml | 
https://download.openmmlab.com/mmdetection3d/v0.1.0_models/smoke/smoke_dla34_pytorch_dlaneck_gn-all_8x4_6x_kitti-mono3d_20210929_015553-d46d9bb0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/ssn/metafile.yml | https://arxiv.org/abs/2004.02774 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/ssn/metafile.yml | https://download.openmmlab.com/mmdetection3d/v1.0.0_models/ssn/hv_ssn_secfpn_sbn-all_2x16_2x_nus-3d/hv_ssn_secfpn_sbn-all_2x16_2x_nus-3d_20210830_101351-51915986.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/ssn/metafile.yml | https://download.openmmlab.com/mmdetection3d/v1.0.0_models/ssn/hv_ssn_regnet-400mf_secfpn_sbn-all_2x16_2x_nus-3d/hv_ssn_regnet-400mf_secfpn_sbn-all_2x16_2x_nus-3d_20210829_210615-361e5e04.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/ssn/metafile.yml | https://download.openmmlab.com/mmdetection3d/v1.0.0_models/ssn/hv_ssn_secfpn_sbn-all_2x16_2x_lyft-3d/hv_ssn_secfpn_sbn-all_2x16_2x_lyft-3d_20210822_134731-46841b41.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/ssn/metafile.yml | https://download.openmmlab.com/mmdetection3d/v1.0.0_models/ssn/hv_ssn_regnet-400mf_secfpn_sbn-all_1x16_2x_lyft-3d/hv_ssn_regnet-400mf_secfpn_sbn-all_1x16_2x_lyft-3d_20210829_122825-d93475a1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/votenet/metafile.yml | https://arxiv.org/abs/1904.09664 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/votenet/metafile.yml | https://download.openmmlab.com/mmdetection3d/v1.0.0_models/votenet/votenet_16x8_sunrgbd-3d-10class/votenet_16x8_sunrgbd-3d-10class_20210820_162823-bf11f014.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/votenet/metafile.yml | https://download.openmmlab.com/mmdetection3d/v1.0.0_models/votenet/votenet_8x8_scannet-3d-18class/votenet_8x8_scannet-3d-18class_20210823_234503-cf8134fa.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/configs/votenet/metafile.yml | https://download.openmmlab.com/mmdetection3d/v1.0.0_models/votenet/votenet_8x8_scannet-3d-18class/votenet_8x8_scannet-3d-18class_20210823_234503-cf8134fa.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/docker/Dockerfile | https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh | conda地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/docker/Dockerfile | https://mirrors.tuna.tsinghua.edu.cn/anaconda/cloud/pytorch/linux-64/ | 镜像地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/docker/Dockerfile | https://download.openmmlab.com/mmcv/dist/cu${CUDA//./}/torch${TORCH_VERSION}/index.html | mmcv下载地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/docker/Dockerfile | https://pypi.tuna.tsinghua.edu.cn/simple | pip源地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/docker/serve/Dockerfile | https://download.openmmlab.com/mmcv/dist/cu${CUDA//./}/torch${PYTORCH}/index.html | mmcv下载地址 | +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BEVDet_for_PyTorch/setup.py | zwwdev@gmail.com | 作者邮箱 | \ No newline at end of file diff --git a/PyTorch/contrib/autonoumous_driving/BiSeNet_v2/public_address_statement.md 
b/PyTorch/contrib/autonoumous_driving/BiSeNet_v2/public_address_statement.md index de377f294979515e321b0e6fc588c1a32509fece..859006a2b4e4d2a334456c0f4d5f63058a339729 100644 --- a/PyTorch/contrib/autonoumous_driving/BiSeNet_v2/public_address_statement.md +++ b/PyTorch/contrib/autonoumous_driving/BiSeNet_v2/public_address_statement.md @@ -1,3 +1,3 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ---- | ------------ | ------ | ------------------------------------ | -------- | -| 开源代码引入 | https://github.com/CoinCheung/BiSeNet/blob/master/lib/models/bisenetv2.py | BiSeNet_v2/lib/models/bisenetv2.py | https://github.com/CoinCheung/BiSeNet/releases/download/0.0.0/backbone_v2.pth | 模型bakcbone地址 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|--------------------------------------------------------------------------------------|-----------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/autonoumous_driving/BiSeNet_v2/lib/models/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/classification/AlexNet_ID2663_for_PyTorch/public_address_statement.md b/PyTorch/contrib/cv/classification/AlexNet_ID2663_for_PyTorch/public_address_statement.md index 607ab3220b73ca229e65d83610f833709fde93bf..f238e84c1d5e7979842086a91b995b061eaccdef 100644 --- a/PyTorch/contrib/cv/classification/AlexNet_ID2663_for_PyTorch/public_address_statement.md +++ b/PyTorch/contrib/cv/classification/AlexNet_ID2663_for_PyTorch/public_address_statement.md @@ -1,5 +1,5 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|------|--------|---------|------------------------|--------| -| 开发引入 | / | url.ini | https://download.pytorch.org/models/alexnet-owt-4df8aa71.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 下载测试图片 | -| 开发引入 | / | AlexNet_ID2663_for_PyTorch/alexnet.py | https://arxiv.org/abs/1404.5997 | 论文地址 | +| 文件位置 | 公网地址 | 公网地址用途 | +|---------------------------------------------------------------------------------------|-----------------------------------------------------------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/AlexNet_ID2663_for_PyTorch/main.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/AlexNet_ID2663_for_PyTorch/url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/AlexNet_ID2663_for_PyTorch/url.ini | https://download.pytorch.org/models/alexnet-owt-4df8aa71.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/classification/AlignedReID/public_address_statement.md b/PyTorch/contrib/cv/classification/AlignedReID/public_address_statement.md index 5e5c2fe2d31dbb971a704c9c67fb6a0a13cbb9fd..94bae1fa17928fa94d80cd2737116bde8ec67337 100644 --- a/PyTorch/contrib/cv/classification/AlignedReID/public_address_statement.md +++ b/PyTorch/contrib/cv/classification/AlignedReID/public_address_statement.md @@ -1,17 +1,7 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------|---------|------------------------|--------| -| 开源代码引入 | https://github.com/huanghoujing/AlignedReID-Re-Production-Pytorch/blob/2e2d45450d69a3a81e15d18fe85c2eebbde742e4/aligned_reid/model/resnet.py | AlignedReID/aligned_reid/model/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 下载权重文件 | 
-| 开源代码引入 | https://github.com/huanghoujing/AlignedReID-Re-Production-Pytorch/blob/2e2d45450d69a3a81e15d18fe85c2eebbde742e4/aligned_reid/model/resnet.py | AlignedReID/aligned_reid/model/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huanghoujing/AlignedReID-Re-Production-Pytorch/blob/2e2d45450d69a3a81e15d18fe85c2eebbde742e4/aligned_reid/model/resnet.py | AlignedReID/aligned_reid/model/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huanghoujing/AlignedReID-Re-Production-Pytorch/blob/2e2d45450d69a3a81e15d18fe85c2eebbde742e4/aligned_reid/model/resnet.py | AlignedReID/aligned_reid/model/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huanghoujing/AlignedReID-Re-Production-Pytorch/blob/2e2d45450d69a3a81e15d18fe85c2eebbde742e4/aligned_reid/model/resnet.py | AlignedReID/aligned_reid/model/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 下载权重文件 | -| 开发引入 | / | AlignedReID/aligned_reid/utils/utils.py | https://stackoverflow.com/a/41733927 | 相关说明 | -| 开源代码引入 | https://github.com/huanghoujing/AlignedReID-Re-Production-Pytorch/blob/2e2d45450d69a3a81e15d18fe85c2eebbde742e4/README.md | AlignedReID/aligned_reid/utils/metric.py | https://github.com/Cysu/open-reid | 源码实现 | -| 开源代码引入 | https://github.com/huanghoujing/AlignedReID-Re-Production-Pytorch/blob/2e2d45450d69a3a81e15d18fe85c2eebbde742e4/aligned_reid/utils/utils.py | AlignedReID/aligned_reid/utils/utils.py | https://github.com/amdegroot/ssd.pytorch | 源码实现 | -| 开源代码引入 | https://github.com/huanghoujing/AlignedReID-Re-Production-Pytorch/blob/2e2d45450d69a3a81e15d18fe85c2eebbde742e4/aligned_reid/utils/metric.py | AlignedReID/aligned_reid/utils/metric.py | https://github.com/zhunzhong07/person-re-ranking/ | 源码实现 | -| 开源代码引入 | https://github.com/huanghoujing/AlignedReID-Re-Production-Pytorch/blob/2e2d45450d69a3a81e15d18fe85c2eebbde742e4/README.md | AlignedReID/aligned_reid/utils/metric.py | http://www.liangzheng.org/Project/project_reid.html | 相关说明 | -| 开源代码引入 | https://github.com/huanghoujing/AlignedReID-Re-Production-Pytorch/blob/2e2d45450d69a3a81e15d18fe85c2eebbde742e4/README.md | AlignedReID/aligned_reid/utils/re_ranking.py | https://github.com/zhunzhong07/person-re-ranking | 源码实现 | -| 开源代码引入 | https://github.com/huanghoujing/AlignedReID-Re-Production-Pytorch/blob/2e2d45450d69a3a81e15d18fe85c2eebbde742e4/aligned_reid/utils/re_ranking.py | AlignedReID/aligned_reid/utils/re_ranking.py | http://openaccess.thecvf.com/content_cvpr_2017/papers/Zhong_Re-Ranking_Person_Re-Identification_CVPR_2017_paper.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huanghoujing/AlignedReID-Re-Production-Pytorch/blob/2e2d45450d69a3a81e15d18fe85c2eebbde742e4/aligned_reid/utils/utils.py | AlignedReID/aligned_reid/utils/utilsn1.py | https://github.com/amdegroot/ssd.pytorch | 源码实现 | -| 开源代码引入 | https://github.com/huanghoujing/AlignedReID-Re-Production-Pytorch/blob/2e2d45450d69a3a81e15d18fe85c2eebbde742e4/README.md | AlignedReID/aligned_reid/model/TripletLoss.py | https://github.com/Cysu/open-reid | 源码实现 | -| 开发引入 | / | AlignedReID/aligned_reid/utils/utilsn1.py | https://stackoverflow.com/a/41733927 | 相关说明 | +| 文件位置 | 公网地址 | 公网地址用途 | +|---------------------------------------------------------------------------------------------|------------------------------------------------------------|---------| +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/classification/AlignedReID/aligned_reid/model/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/AlignedReID/aligned_reid/model/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/AlignedReID/aligned_reid/model/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/AlignedReID/aligned_reid/model/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/AlignedReID/aligned_reid/model/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/classification/CSWin-Transformer/public_address_statement.md b/PyTorch/contrib/cv/classification/CSWin-Transformer/public_address_statement.md index 7315cbb4430cc61a3a827104b40d6107856b4a69..acbbcf771b468fa834cb2d2322efd5a5db044c65 100644 --- a/PyTorch/contrib/cv/classification/CSWin-Transformer/public_address_statement.md +++ b/PyTorch/contrib/cv/classification/CSWin-Transformer/public_address_statement.md @@ -1,9 +1,4 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|----------------------------------------------------------------------------------------------------------------------------------|----------------------------------------------------|-----------------------|-------| -| 开源代码引入 | https://github.com/microsoft/CSWin-Transformer/blob/f111ae2f771df32006e7afd7916835dd67d4cb9d/install_req.sh | CSWin-Transformer/install_req.sh | https://download.pytorch.org/whl/torch_stable.html | 下载三方库 | -| 开源代码引入 | https://github.com/microsoft/CSWin-Transformer/blob/f111ae2f771df32006e7afd7916835dd67d4cb9d/segmentation/install_req.sh | CSWin-Transformer/segmentation/install_req.sh | https://download.pytorch.org/whl/torch_stable.html | 下载三方库 | -| 开发引入 | / | CSWin-Transformer/timm_difference/data/loader.py | https://github.com/NVIDIA/apex/issues/304#issuecomment-493562789 | 相关说明 | -| 开发引入 | / | CSWin-Transformer/timm_difference/data/random_erasing.py | https://arxiv.org/pdf/1708.04896.pdf | 论文地址 | -| 开发引入 | / | CSWin-Transformer/timm_difference/data/loader.py | https://github.com/NVIDIA/apex/blob/master/examples/imagenet/main_amp.py | 源码实现 | -| 开发引入 | / | CSWin-Transformer/timm_difference/utils/model_ema.py | https://www.tensorflow.org/api_docs/python/tf/train/ExponentialMovingAverage | 相关说明 | -| 开发引入 | / | CSWin-Transformer/timm_difference/data/random_erasing.py | https://github.com/pytorch/pytorch/issues/19508 | 相关说明 | +| 文件位置 | 公网地址 | 公网地址用途 | +|--------------------------------------------------------------------------------------------------|----------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/CSWin-Transformer/install_req.sh | https://download.pytorch.org/whl/torch_stable.html | 三方库地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/CSWin-Transformer/segmentation/install_req.sh | https://download.pytorch.org/whl/torch_stable.html | 三方库地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/classification/Centroids-reid/public_address_statement.md b/PyTorch/contrib/cv/classification/Centroids-reid/public_address_statement.md index 683e6e39cdecb0bee140f82ef79a8840812987e8..58e1166ad7df6dbb8e36e96d9122d0ca2348d00e 100644 --- 
a/PyTorch/contrib/cv/classification/Centroids-reid/public_address_statement.md +++ b/PyTorch/contrib/cv/classification/Centroids-reid/public_address_statement.md @@ -1,78 +1,8 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|----------------------------------------------------------------------------------------------------------------------------------|----------------------------------------------------|------------------------|--------| -| 开源代码引入 | https://github.com/mikwieczorek/centroids-reid/blob/a1825b7a92b2a8d5e223708c7c43ab58a46efbcf/modelling/backbones/resnet_ibn_a.py | Centroids-reid/modelling/backbones/resnet_ibn_a.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/mikwieczorek/centroids-reid/blob/a1825b7a92b2a8d5e223708c7c43ab58a46efbcf/modelling/backbones/resnet_ibn_a.py | Centroids-reid/modelling/backbones/resnet_ibn_a.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/mikwieczorek/centroids-reid/blob/a1825b7a92b2a8d5e223708c7c43ab58a46efbcf/modelling/backbones/resnet_ibn_a.py | Centroids-reid/modelling/backbones/resnet_ibn_a.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://pepy.tech/badge/pytorch-lightning | 开源地址 | -| 开发引入 | / | url.ini | waf2107@columbia.edu | 作者邮箱 | -| 开发引入 | / | url.ini | https://github.com/PyTorchLightning/pytorch-lightning | 开源地址 | -| 开发引入 | / | Centroids-reid/pytorch_lightning/loggers/test_tube.py | https://williamfalcon.github.io/test-tube | 相关说明 | -| 开源代码引入 | https://github.com/mikwieczorek/centroids-reid/blob/a1825b7a92b2a8d5e223708c7c43ab58a46efbcf/utils/reid_metric.py | Centroids-reid/losses/triplet_loss.py | sherlockliao01@gmail.com | 邮箱地址 | -| 开发引入 | / | Centroids-reid/pytorch_lightning/trainer/connectors/precision_connector.py | https://github.com/NVIDIA/apex#linux | 源码实现 | -| 开发引入 | / | Centroids-reid/pytorch_lightning/loggers/test_tube.py | https://www.tensorflow.org/tensorboard | 相关说明 | -| 开源代码引入 | https://github.com/mikwieczorek/centroids-reid/blob/a1825b7a92b2a8d5e223708c7c43ab58a46efbcf/datasets/dukemtmcreid.py | Centroids-reid/datasets/dukemtmcreid.py | https://github.com/layumi/DukeMTMC-reID_evaluation | 源码实现 | -| 开源代码引入 | https://github.com/mikwieczorek/centroids-reid/blob/a1825b7a92b2a8d5e223708c7c43ab58a46efbcf/utils/reid_metric.py | Centroids-reid/utils/eval_reid.py | sherlockliao01@gmail.com | 邮箱地址 | -| 开发引入 | / | Centroids-reid/pytorch_lightning/metrics/regression/ssim.py | https://en.wikipedia.org/wiki/Structural_similarity | 相关说明 | -| 开发引入 | / | Centroids-reid/pytorch_lightning/setup_tools.py | https://img.shields.io/conda/v/conda-forge/pytorch-lightning?label=conda | 相关说明 | -| 开发引入 | / | Centroids-reid/pytorch_lightning/loggers/neptune.py | https://docs.neptune.ai/integrations/pytorch_lightning.html | 相关说明 | -| 开发引入 | / | Centroids-reid/pytorch_lightning/accelerators/dp_accelerator.py | https://github.com/NVIDIA/apex/issues/227 | 相关说明 | -| 开源代码引入 | https://github.com/mikwieczorek/centroids-reid/blob/a1825b7a92b2a8d5e223708c7c43ab58a46efbcf/utils/reid_metric.py | Centroids-reid/modelling/baseline.py | sherlockliao01@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/mikwieczorek/centroids-reid/blob/a1825b7a92b2a8d5e223708c7c43ab58a46efbcf/inference/inference_utils.py | Centroids-reid/inference/inference_utils.py | https://github.com/python-pillow/Pillow/issues/835 | 相关说明 | -| 开发引入 | / | Centroids-reid/pytorch_lightning/loggers/neptune.py | 
https://neptune.ai | 相关说明 | -| 开发引入 | / | Centroids-reid/pytorch_lightning/loggers/test_tube.py | https://docs.python.org/3/library/pickle.html#handling-stateful-objects | 相关说明 | -| 开发引入 | / | Centroids-reid/pytorch_lightning/metrics/functional/precision_recall_curve.py | https://github.com/scikit-learn/scikit-learn/blob/master/sklearn/metrics/_ranking.py | 源码实现 | -| 开发引入 | / | Centroids-reid/pytorch_lightning/loggers/neptune.py | https://ui.neptune.ai/o/shared/org/ | 相关说明 | -| 开发引入 | / | Centroids-reid/pytorch_lightning/setup_tools.py | https://anaconda.org/conda-forge/pytorch-lightning | 相关说明 | -| 开发引入 | / | Centroids-reid/pytorch_lightning/accelerators/tpu_accelerator.py | https://github.com/pytorch/xla/blob/master/TROUBLESHOOTING.md | 源码实现 | -| 开源代码引入 | https://github.com/mikwieczorek/centroids-reid/blob/a1825b7a92b2a8d5e223708c7c43ab58a46efbcf/inference/inference_utils.py | Centroids-reid/datasets/bases.py | https://github.com/python-pillow/Pillow/issues/835 | 相关说明 | -| 开发引入 | / | Centroids-reid/pytorch_lightning/core/lightning.py | https://github.com/pytorch/pytorch/blob/3e6bb5233f9ca2c5aa55d9cda22a7ee85439aa6e/ | 源码实现 | -| 开发引入 | / | Centroids-reid/utils/eval_reid.py | https://en.wikipedia.org/wiki/Evaluation_measures_ | 相关说明 | -| 开源代码引入 | https://github.com/mikwieczorek/centroids-reid/blob/a1825b7a92b2a8d5e223708c7c43ab58a46efbcf/datasets/market1501.py | Centroids-reid/datasets/market1501.py | http://www.liangzheng.org/Project/project_reid.html | 相关说明 | -| 开发引入 | / | Centroids-reid/pytorch_lightning/setup_tools.py | https://github.com/Borda/pytorch-lightning/releases/download/1.1.0a6/codecov_badge.png | 源码实现 | -| 开发引入 | / | Centroids-reid/pytorch_lightning/metrics/classification/precision_recall.py | https://en.wikipedia.org/wiki/Precision_and_recall | 相关说明 | -| 开发引入 | / | Centroids-reid/pytorch_lightning/utilities/parsing.py | https://docs.python.org/3/library/inspect.html#inspect.Signature.parameters | 相关说明 | -| 开源代码引入 | https://github.com/mikwieczorek/centroids-reid/blob/a1825b7a92b2a8d5e223708c7c43ab58a46efbcf/utils/reid_metric.py | Centroids-reid/datasets/__init__.py | sherlockliao01@gmail.com | 邮箱地址 | -| 开发引入 | / | Centroids-reid/pytorch_lightning/metrics/regression/psnr.py | https://en.wikipedia.org/wiki/Mean_squared_error | 相关说明 | -| 开源代码引入 | https://github.com/mikwieczorek/centroids-reid/blob/a1825b7a92b2a8d5e223708c7c43ab58a46efbcf/losses/triplet_loss.py | Centroids-reid/losses/triplet_loss.py | https://github.com/Cysu/open-reid | 源码实现 | -| 开发引入 | / | Centroids-reid/pytorch_lightning/metrics/classification/f_beta.py | https://en.wikipedia.org/wiki/F-score | 相关说明 | -| 开发引入 | / | Centroids-reid/pytorch_lightning/metrics/regression/psnr.py | https://en.wikipedia.org/wiki/Peak_signal-to-noise_ratio | 相关说明 | -| 开源代码引入 | https://github.com/mikwieczorek/centroids-reid/blob/a1825b7a92b2a8d5e223708c7c43ab58a46efbcf/datasets/transforms/random_erasing.py | Centroids-reid/datasets/transforms/random_erasing.py | liaoxingyu2@jd.com | 邮箱地址 | -| 开发引入 | / | Centroids-reid/pytorch_lightning/utilities/memory.py | https://github.com/BlackHC/toma/blob/master/toma/torch_cuda_memory.py | 源码实现 | -| 开发引入 | / | Centroids-reid/pytorch_lightning/metrics/functional/nlp.py | https://pytorch.org/text/_modules/torchtext/data/metrics.html#bleu_score | 相关说明 | -| 开发引入 | / | Centroids-reid/pytorch_lightning/metrics/regression/mean_squared_log_error.py | https://scikit-learn.org/stable/modules/model_evaluation.html#mean-squared-log-error | 相关说明 | -| 开发引入 | / | 
Centroids-reid/pytorch_lightning/metrics/classification/accuracy.py | https://en.wikipedia.org/wiki/Accuracy_and_precision | 相关说明 | -| 开发引入 | / | Centroids-reid/pytorch_lightning/metrics/regression/explained_variance.py | https://en.wikipedia.org/wiki/Explained_variation | 相关说明 | -| 开发引入 | / | Centroids-reid/pytorch_lightning/plugins/ddp_sequential_plugin.py | https://arxiv.org/abs/1811.06965 | 论文地址 | -| 开发引入 | / | Centroids-reid/pytorch_lightning/utilities/cloud_io.py | https://github.com/pytorch/pytorch/issues/42239 | 相关说明 | -| 开发引入 | / | Centroids-reid/pytorch_lightning/accelerators/accelerator_connector.py | https://github.com/PyTorchLightning/pytorch-lightning/pull/1572/files#r420279383 | 源码实现 | -| 开发引入 | / | Centroids-reid/pytorch_lightning/setup_tools.py | https://github.com/PyTorchLightning/pytorch-lightning/raw/master/docs/source/_images/lightning_module/pt_to_pl.png | 源码实现 | -| 开源代码引入 | https://github.com/mikwieczorek/centroids-reid/blob/a1825b7a92b2a8d5e223708c7c43ab58a46efbcf/utils/reid_metric.py | Centroids-reid/modelling/backbones/resnet.py | sherlockliao01@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/mikwieczorek/centroids-reid/blob/a1825b7a92b2a8d5e223708c7c43ab58a46efbcf/datasets/transforms/random_erasing.py | Centroids-reid/datasets/transforms/random_erasing.py | https://arxiv.org/pdf/1708.04896.pdf | 论文地址 | -| 开发引入 | / | Centroids-reid/pytorch_lightning/setup_tools.py | https://img.shields.io/pypi/pyversions/pytorch-lightning | 相关说明 | -| 开发引入 | / | Centroids-reid/pytorch_lightning/loggers/tensorboard.py | https://www.tensorflow.org/tensorboard | 相关说明 | -| 开发引入 | / | Centroids-reid/pytorch_lightning/loggers/comet.py | https://www.comet.ml/docs/python-sdk/releases/#release-300 | 相关说明 | -| 开发引入 | / | Centroids-reid/pytorch_lightning/utilities/memory.py | https://github.com/pytorch/pytorch/issues/4107 | 相关说明 | -| 开发引入 | / | Centroids-reid/pytorch_lightning/accelerators/tpu_accelerator.py | https://github.com/pytorch/xla/blob/master/API_GUIDE.md#saving-and-loading-xla-tensors | 源码实现 | -| 开源代码引入 | https://github.com/mikwieczorek/centroids-reid/blob/a1825b7a92b2a8d5e223708c7c43ab58a46efbcf/datasets/transforms/random_erasing.py | Centroids-reid/datasets/dukemtmcreid.py | liaoxingyu2@jd.com | 邮箱地址 | -| 开发引入 | / | Centroids-reid/pytorch_lightning/metrics/classification/confusion_matrix.py | https://scikit-learn.org/stable/modules/model_evaluation.html#confusion-matrix | 相关说明 | -| 开发引入 | / | Centroids-reid/pytorch_lightning/__init__.py | https://pytorch-lightning.readthedocs.io/en/stable | 相关说明 | -| 开发引入 | / | Centroids-reid/pytorch_lightning/trainer/training_loop.py | https://discuss.pytorch.org/t/out-of-memory-when-i-use-torch-cuda-empty-cache/57898 | 相关说明 | -| 开发引入 | / | Centroids-reid/pytorch_lightning/loggers/wandb.py | https://app.wandb.ai/cayush/pytorchlightning/reports/ | 相关说明 | -| 开发引入 | / | Centroids-reid/pytorch_lightning/__init__.py | https://pytorch-lightning.readthedocs.io/en/latest | 相关说明 | -| 开发引入 | / | Centroids-reid/pytorch_lightning/loggers/mlflow.py | https://mlflow.org | 相关说明 | -| 开源代码引入 | https://github.com/mikwieczorek/centroids-reid/blob/a1825b7a92b2a8d5e223708c7c43ab58a46efbcf/utils/reid_metric.py | Centroids-reid/datasets/bases.py | sherlockliao01@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/mikwieczorek/centroids-reid/blob/a1825b7a92b2a8d5e223708c7c43ab58a46efbcf/utils/reid_metric.py | Centroids-reid/utils/reid_metric.py | sherlockliao01@gmail.com | 邮箱地址 | -| 开发引入 | / | Centroids-reid/pytorch_lightning/metrics/regression/mean_absolute_error.py | 
https://en.wikipedia.org/wiki/Mean_absolute_error | 相关说明 | -| 开发引入 | / | Centroids-reid/pytorch_lightning/core/lightning.py | https://arxiv.org/abs/1704.00028 | 论文地址 | -| 开发引入 | / | Centroids-reid/pytorch_lightning/__init__.py | https://github.com/pypa/twine/issues/522 | 相关说明 | -| 开发引入 | / | Centroids-reid/pytorch_lightning/utilities/memory.py | https://github.com/BlackHC/toma/blob/master/toma/cpu_memory.py | 源码实现 | -| 开发引入 | / | Centroids-reid/pytorch_lightning/profiler/__init__.py | https://docs.python.org/3/library/profile.html#module-cProfile | 相关说明 | -| 开发引入 | / | Centroids-reid/pytorch_lightning/loggers/wandb.py | https://www.wandb.com/ | 相关说明 | -| 开发引入 | / | Centroids-reid/pytorch_lightning/loggers/neptune.py | https://docs.neptune.ai/python-api/tutorials/get-started.html#copy-api-token | 相关说明 | -| 开源代码引入 | https://github.com/mikwieczorek/centroids-reid/blob/a1825b7a92b2a8d5e223708c7c43ab58a46efbcf/datasets/transforms/random_erasing.py | Centroids-reid/datasets/transforms/build.py | liaoxingyu2@jd.com | 邮箱地址 | -| 开源代码引入 | https://github.com/mikwieczorek/centroids-reid/blob/a1825b7a92b2a8d5e223708c7c43ab58a46efbcf/utils/reid_metric.py | Centroids-reid/datasets/transforms/__init__.py | sherlockliao01@gmail.com | 邮箱地址 | -| 开发引入 | / | Centroids-reid/pytorch_lightning/accelerators/tpu_accelerator.py | https://discuss.pytorch.org/t/segfault-with-multiprocessing-queue/81292/2 | 相关说明 | -| 开源代码引入 | https://github.com/mikwieczorek/centroids-reid/blob/a1825b7a92b2a8d5e223708c7c43ab58a46efbcf/utils/reid_metric.py | Centroids-reid/datasets/market1501.py | sherlockliao01@gmail.com | 邮箱地址 | -| 开发引入 | / | Centroids-reid/pytorch_lightning/metrics/regression/mean_squared_error.py | https://en.wikipedia.org/wiki/Mean_squared_error | 相关说明 | -| 开发引入 | / | Centroids-reid/pytorch_lightning/loggers/comet.py | https://www.comet.ml | 相关说明 | -| 开发引入 | / | Centroids-reid/pytorch_lightning/trainer/connectors/data_connector.py | https://stackoverflow.com/a/1630350","https://stackoverflow.com/a/1630350 | 相关说明 | +| 文件位置 | 公网地址 | 公网地址用途 | +|-------------------------------------------------------------------------------------------------------|------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Centroids-reid/datasets/market1501.py | http://www.liangzheng.org/Project/project_reid.html | 相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Centroids-reid/modelling/backbones/resnet_ibn_a.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Centroids-reid/modelling/backbones/resnet_ibn_a.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Centroids-reid/modelling/backbones/resnet_ibn_a.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Centroids-reid/url.ini | waf2107@columbia.edu | 作者邮箱 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Centroids-reid/url.ini | https://pepy.tech/badge/pytorch-lightning | 开源地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/classification/Conformer_Ti/public_address_statement.md b/PyTorch/contrib/cv/classification/Conformer_Ti/public_address_statement.md index d068fd2be21fbc9f0db37621acd4a5a0302ca69d..bba674433858501031d520c0c475031abb12f850 100644 --- a/PyTorch/contrib/cv/classification/Conformer_Ti/public_address_statement.md +++ 
b/PyTorch/contrib/cv/classification/Conformer_Ti/public_address_statement.md @@ -1,26 +1,5 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|----------------------------------------------------------------------------------------------------------------------------------|----------------------------------------------------|-----------------------|--------| -| 开源代码引入 | https://github.com/pengzhiliang/Conformer/blob/815aaad3ef5dbdfcf1e11368891416c2d7478cb1/models.py | Conformer_Ti/models.py | https://dl.fbaipublicfiles.com/deit/deit_tiny_patch16_224-a1311bcf.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/pengzhiliang/Conformer/blob/815aaad3ef5dbdfcf1e11368891416c2d7478cb1/models.py | Conformer_Ti/models.py | https://dl.fbaipublicfiles.com/deit/deit_small_patch16_224-cd65a155.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/pengzhiliang/Conformer/blob/815aaad3ef5dbdfcf1e11368891416c2d7478cb1/models.py | Conformer_Ti/models.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_224-b5f2ef4d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/pengzhiliang/Conformer/blob/815aaad3ef5dbdfcf1e11368891416c2d7478cb1/vision_transformer.py | Conformer_Ti/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/vit_small_p16_224-15ec54c9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/pengzhiliang/Conformer/blob/815aaad3ef5dbdfcf1e11368891416c2d7478cb1/vision_transformer.py | Conformer_Ti/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_p16_224-80ecf9dd.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/pengzhiliang/Conformer/blob/815aaad3ef5dbdfcf1e11368891416c2d7478cb1/vision_transformer.py | Conformer_Ti/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_p16_384-83fb41ba.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/pengzhiliang/Conformer/blob/815aaad3ef5dbdfcf1e11368891416c2d7478cb1/vision_transformer.py | Conformer_Ti/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_p32_384-830016f5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/pengzhiliang/Conformer/blob/815aaad3ef5dbdfcf1e11368891416c2d7478cb1/vision_transformer.py | Conformer_Ti/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_large_p16_224-4ee7a4dc.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/pengzhiliang/Conformer/blob/815aaad3ef5dbdfcf1e11368891416c2d7478cb1/vision_transformer.py | Conformer_Ti/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_large_p16_384-b3be5167.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/pengzhiliang/Conformer/blob/815aaad3ef5dbdfcf1e11368891416c2d7478cb1/vision_transformer.py | Conformer_Ti/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_large_p32_384-9b920ba8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/pengzhiliang/Conformer/blob/815aaad3ef5dbdfcf1e11368891416c2d7478cb1/vision_transformer.py | Conformer_Ti/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_patch16_224_in21k-e5005f0a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/pengzhiliang/Conformer/blob/815aaad3ef5dbdfcf1e11368891416c2d7478cb1/vision_transformer.py | Conformer_Ti/vision_transformer.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_patch32_224_in21k-8db57226.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/pengzhiliang/Conformer/blob/815aaad3ef5dbdfcf1e11368891416c2d7478cb1/vision_transformer.py | Conformer_Ti/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_large_patch16_224_in21k-606da67d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/pengzhiliang/Conformer/blob/815aaad3ef5dbdfcf1e11368891416c2d7478cb1/vision_transformer.py | Conformer_Ti/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_large_patch32_224_in21k-9046d2e7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/pengzhiliang/Conformer/blob/815aaad3ef5dbdfcf1e11368891416c2d7478cb1/vision_transformer.py | Conformer_Ti/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_resnet50_224_in21k-6f7c7740.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/pengzhiliang/Conformer/blob/815aaad3ef5dbdfcf1e11368891416c2d7478cb1/vision_transformer.py | Conformer_Ti/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_resnet50_384-9fd3c705.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/pengzhiliang/Conformer/blob/815aaad3ef5dbdfcf1e11368891416c2d7478cb1/vision_transformer.py | Conformer_Ti/vision_transformer.py | https://arxiv.org/abs/2012.12877 | 论文地址 | -| 开发引入 | / | Conformer_Ti/timm_need/data/mixup.py | https://github.com/clovaai/CutMix-PyTorch | 源码实现 | -| 开源代码引入 | https://github.com/pengzhiliang/Conformer/blob/815aaad3ef5dbdfcf1e11368891416c2d7478cb1/vision_transformer.py | Conformer_Ti/vision_transformer.py | https://github.com/google-research/vision_transformer/blob/00883dd691c63a6830751563748663526e811cee/vit_jax/checkpoint.py#L224 | 源码实现 | -| 开发引入 | / | Conformer_Ti/timm_need/data/mixup.py | https://arxiv.org/abs/1905.04899 | 论文地址 | -| 开源代码引入 | https://github.com/pengzhiliang/Conformer/blob/815aaad3ef5dbdfcf1e11368891416c2d7478cb1/vision_transformer.py | Conformer_Ti/vision_transformer.py | https://github.com/facebookresearch/deit | 源码实现 | -| 开源代码引入 | https://github.com/pengzhiliang/Conformer/blob/815aaad3ef5dbdfcf1e11368891416c2d7478cb1/vision_transformer.py | Conformer_Ti/vision_transformer.py | https://github.com/google-research/vision_transformer | 源码实现 | -| 开源代码引入 | https://github.com/pengzhiliang/Conformer/blob/815aaad3ef5dbdfcf1e11368891416c2d7478cb1/vision_transformer.py | Conformer_Ti/vision_transformer.py | https://arxiv.org/abs/2010.11929 | 论文地址 | -| 开发引入 | / | Conformer_Ti/timm_need/data/mixup.py | https://arxiv.org/abs/1710.09412 | 论文地址 | +| 文件位置 | 公网地址 | 公网地址用途 | +|---------------------------------------------------------------------------|-------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Conformer_Ti/models.py | https://dl.fbaipublicfiles.com/deit/deit_tiny_patch16_224-a1311bcf.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Conformer_Ti/models.py | https://dl.fbaipublicfiles.com/deit/deit_small_patch16_224-cd65a155.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Conformer_Ti/models.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_224-b5f2ef4d.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/classification/ConvNeXt/public_address_statement.md 
b/PyTorch/contrib/cv/classification/ConvNeXt/public_address_statement.md index fc15135ee1f1e153280147d2596b6652f565cdf4..1fc1dc5c60f046b78343c7d348d0d9d8729d159f 100644 --- a/PyTorch/contrib/cv/classification/ConvNeXt/public_address_statement.md +++ b/PyTorch/contrib/cv/classification/ConvNeXt/public_address_statement.md @@ -1,23 +1,14 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|----------------------------------------------------------------------------------------------------------------------------------|----------------------------------------------------|-----------------------|--------| -| 开源代码引入 | https://github.com/facebookresearch/ConvNeXt/blob/main/models/convnext.py | ConvNeXt/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnext_tiny_1k_224_ema.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/ConvNeXt/blob/main/models/convnext.py | ConvNeXt/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnext_small_1k_224_ema.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/ConvNeXt/blob/main/models/convnext.py | ConvNeXt/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnext_base_1k_224_ema.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/ConvNeXt/blob/main/models/convnext.py | ConvNeXt/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnext_large_1k_224_ema.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/ConvNeXt/blob/main/models/convnext.py | ConvNeXt/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnext_tiny_22k_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/ConvNeXt/blob/main/models/convnext.py | ConvNeXt/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnext_small_22k_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/ConvNeXt/blob/main/models/convnext.py | ConvNeXt/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnext_base_22k_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/ConvNeXt/blob/main/models/convnext.py | ConvNeXt/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnext_large_22k_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/ConvNeXt/blob/main/models/convnext.py | ConvNeXt/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnext_xlarge_22k_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/ConvNeXt/blob/main/models/convnext_isotropic.py | ConvNeXt/models/convnext_isotropic.py | https://dl.fbaipublicfiles.com/convnext/convnext_iso_small_1k_224_ema.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/ConvNeXt/blob/main/models/convnext_isotropic.py | ConvNeXt/models/convnext_isotropic.py | https://dl.fbaipublicfiles.com/convnext/convnext_iso_base_1k_224_ema.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/ConvNeXt/blob/main/models/convnext_isotropic.py | ConvNeXt/models/convnext_isotropic.py | https://dl.fbaipublicfiles.com/convnext/convnext_iso_large_1k_224_ema.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/ConvNeXt/blob/main/semantic_segmentation/backbone/convnext.py | ConvNeXt/models/convnext.py | https://arxiv.org/pdf/2201.03545.pdf | 论文地址 | -| 开发引入 | / | ConvNeXt/timm_need/mixup.py | https://arxiv.org/abs/1905.04899 | 论文地址 | -| 开发引入 | / | ConvNeXt/timm_need/mixup.py | https://github.com/clovaai/CutMix-PyTorch | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/ConvNeXt/blob/main/semantic_segmentation/backbone/convnext.py | 
ConvNeXt/models/convnext_isotropic.py | https://arxiv.org/pdf/2201.03545.pdf | 论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/ConvNeXt/blob/main/optim_factory.py | ConvNeXt/optim_factory.py | https://github.com/microsoft/unilm/blob/master/beit/optim_factory.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/ConvNeXt/blob/main/semantic_segmentation/backbone/convnext.py | ConvNeXt/object_detection/mmdet/models/backbones/convnext.py | https://arxiv.org/pdf/2201.03545.pdf | 论文地址 | -| 开发引入 | / | ConvNeXt/timm_need/model_ema.py | https://www.tensorflow.org/api_docs/python/tf/train/ExponentialMovingAverage | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/ConvNeXt/blob/main/semantic_segmentation/backbone/convnext.py | ConvNeXt/semantic_segmentation/backbone/convnext.py | https://arxiv.org/pdf/2201.03545.pdf | 论文地址 | -| 开发引入 | / | ConvNeXt/timm_need/mixup.py | https://arxiv.org/abs/1710.09412 | 论文地址 | +| 文件位置 | 公网地址 | 公网地址用途 | +|------------------------------------------------------------------------------------------|---------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ConvNeXt/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnext_xlarge_22k_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ConvNeXt/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnext_tiny_22k_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ConvNeXt/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnext_tiny_1k_224_ema.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ConvNeXt/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnext_small_22k_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ConvNeXt/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnext_small_1k_224_ema.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ConvNeXt/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnext_large_22k_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ConvNeXt/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnext_large_1k_224_ema.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ConvNeXt/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnext_base_22k_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ConvNeXt/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnext_base_1k_224_ema.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ConvNeXt/models/convnext_isotropic.py | https://dl.fbaipublicfiles.com/convnext/convnext_iso_small_1k_224_ema.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ConvNeXt/models/convnext_isotropic.py | https://dl.fbaipublicfiles.com/convnext/convnext_iso_large_1k_224_ema.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ConvNeXt/models/convnext_isotropic.py | https://dl.fbaipublicfiles.com/convnext/convnext_iso_base_1k_224_ema.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/classification/DPN-131_for_PyTorch/public_address_statement.md b/PyTorch/contrib/cv/classification/DPN-131_for_PyTorch/public_address_statement.md index f19b145aaa6a92012508fa0afdfae1e3057401da..98573dac4ec7b395763fe73132c44538e76b3173 100644 --- a/PyTorch/contrib/cv/classification/DPN-131_for_PyTorch/public_address_statement.md +++ 
b/PyTorch/contrib/cv/classification/DPN-131_for_PyTorch/public_address_statement.md @@ -1,13 +1,4 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------------------------|----------------------------------------------|------------------------|--------| -| 开发引入 | / | url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 下载测试图片 | -| 开发引入 | / | url.ini | http://data.lip6.fr/cadene/pretrainedmodels/dpn68-4af7d88d2.pth | 下载权重文件 | -| 开发引入 | / | url.ini | http://data.lip6.fr/cadene/pretrainedmodels/dpn68b_extra-363ab9c19.pth | 下载权重文件 | -| 开发引入 | / | url.ini | http://data.lip6.fr/cadene/pretrainedmodels/dpn92_extra-fda993c95.pth | 下载权重文件 | -| 开发引入 | / | url.ini | http://data.lip6.fr/cadene/pretrainedmodels/dpn98-722954780.pth | 下载权重文件 | -| 开发引入 | / | url.ini | http://data.lip6.fr/cadene/pretrainedmodels/dpn131-7af84be88.pth | 下载权重文件 | -| 开发引入 | / | url.ini | http://data.lip6.fr/cadene/pretrainedmodels/dpn107_extra-b7f9f4cc9.pth | 下载权重文件 | -| 开发引入 | / | DPN-131_for_PyTorch/dpn.py | https://github.com/rwightman/pytorch-dpn-pretrained | 源码实现 | -| 开发引入 | / | DPN-131_for_PyTorch/dpn.py | http://data.lip6.fr/cadene/pretrainedmodels/dpn68-66bebafa7.pth | 预训练模型 | -| 开发引入 | / | DPN-131_for_PyTorch/dpn.py | https://github.com/oyam/pytorch-DPNs | 源码实现 | -| 开发引入 | / | DPN-131_for_PyTorch/dpn.py | https://github.com/cypw/DPNs | 源码实现 | +| 文件位置 | 公网地址 | 公网地址用途 | +|--------------------------------------------------------------------------------|-----------------------------------------------------------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/DPN-131_for_PyTorch/main.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/DPN-131_for_PyTorch/url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 数据集地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/classification/Deit_Small/public_address_statement.md b/PyTorch/contrib/cv/classification/Deit_Small/public_address_statement.md index 39438b9b3ccabb48f0b17d549c0c428357bb5774..99dfbe3e099d68d54e94bcfff52c5defd9ed8967 100644 --- a/PyTorch/contrib/cv/classification/Deit_Small/public_address_statement.md +++ b/PyTorch/contrib/cv/classification/Deit_Small/public_address_statement.md @@ -1,47 +1,21 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|----------------------------------------------------------------------------------------------------------------------------------|----------------------------------------------------|-----------------------|-------| -| 开发引入 | / | url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 下载测试图片 | -| 开源代码引入 | https://github.com/facebookresearch/deit/blob/ae4dba9b453b9e18faa781edbc13039aaeca9b68/models.py | Deit_Small/models.py | https://dl.fbaipublicfiles.com/deit/deit_tiny_patch16_224-a1311bcf.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/deit/blob/ae4dba9b453b9e18faa781edbc13039aaeca9b68/models.py | Deit_Small/models.py | https://dl.fbaipublicfiles.com/deit/deit_small_patch16_224-cd65a155.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/deit/blob/ae4dba9b453b9e18faa781edbc13039aaeca9b68/models.py | Deit_Small/models.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_224-b5f2ef4d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/deit/blob/ae4dba9b453b9e18faa781edbc13039aaeca9b68/models.py | 
Deit_Small/models.py | https://dl.fbaipublicfiles.com/deit/deit_tiny_distilled_patch16_224-b40b3cf7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/deit/blob/ae4dba9b453b9e18faa781edbc13039aaeca9b68/models.py | Deit_Small/models.py | https://dl.fbaipublicfiles.com/deit/deit_small_distilled_patch16_224-649709d9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/deit/blob/ae4dba9b453b9e18faa781edbc13039aaeca9b68/models.py | Deit_Small/models.py | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_224-df68dfff.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/deit/blob/ae4dba9b453b9e18faa781edbc13039aaeca9b68/models.py | Deit_Small/models.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_384-8de9b5d1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/deit/blob/ae4dba9b453b9e18faa781edbc13039aaeca9b68/models.py | Deit_Small/models.py | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_384-d0272ac0.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/vit_small_p16_224-15ec54c9.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_p16_224-80ecf9dd.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_p16_384-83fb41ba.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_p32_384-830016f5.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_large_p16_224-4ee7a4dc.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_large_p16_384-b3be5167.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_large_p32_384-9b920ba8.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_patch16_224_in21k-e5005f0a.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_patch32_224_in21k-8db57226.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_large_patch16_224_in21k-606da67d.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_large_patch32_224_in21k-9046d2e7.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://dl.fbaipublicfiles.com/deit/deit_tiny_patch16_224-a1311bcf.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://dl.fbaipublicfiles.com/deit/deit_small_patch16_224-cd65a155.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_224-b5f2ef4d.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_384-8de9b5d1.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://dl.fbaipublicfiles.com/deit/deit_tiny_distilled_patch16_224-b40b3cf7.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://dl.fbaipublicfiles.com/deit/deit_small_distilled_patch16_224-649709d9.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_224-df68dfff.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_384-d0272ac0.pth | 
下载权重文件 | -| 开发引入 | / | url.ini | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/vit_base_patch16_224_in21k_miil.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/vit_base_patch16_224_1k_miil_84_4.pth | 下载权重文件 | -| 开发引入 | / | Deit_Small/vision_transformer.py | https://github.com/google-research/vision_transformer/blob/00883dd691c63a6830751563748663526e811cee/vit_jax/checkpoint.py#L224 | 源码实现 | -| 开源代码引入 | https://github.com/Facebookresearch/deit.git/models_v2.py | Deit_Small/models.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/vision_transformer.py | 源码实现 | -| 开发引入 | / | Deit_Small/vision_transformer.py | https://github.com/lucidrains/vit-pytorch | 源码实现 | -| 开发引入 | / | Deit_Small/vision_transformer.py | https://github.com/karpathy/minGPT | 源码实现 | -| 开发引入 | / | Deit_Small/vision_transformer.py | https://github.com/Alibaba-MIIL/ImageNet21K | 源码实现 | -| 开发引入 | / | Deit_Small/npu_fused_adamw.py | https://openreview.net/forum?id=ryQu7f-RZ | 相关说明 | -| 开发引入 | / | Deit_Small/vision_transformer.py | https://github.com/google-research/vision_transformer | 源码实现 | -| 开发引入 | / | Deit_Small/mixup.py | https://github.com/clovaai/CutMix-PyTorch | 源码实现 | -| 开发引入 | / | Deit_Small/vision_transformer.py | https://arxiv.org/abs/2010.11929 | 论文地址 | -| 开发引入 | / | Deit_Small/npu_fused_adamw.py | https://arxiv.org/abs/1412.6980 | 论文地址 | -| 开发引入 | / | Deit_Small/mixup.py | https://arxiv.org/abs/1905.04899 | 论文地址 | -| 开发引入 | / | Deit_Small/npu_fused_adamw.py | https://arxiv.org/abs/1711.05101 | 论文地址 | -| 开源代码引入 | https://github.com/Facebookresearch/deit.git/losses.py | Deit_Small/losses.py | https://github.com/peterliht/knowledge-distillation-pytorch/blob/master/model/net.py#L100 | 源码实现 | -| 开发引入 | / | Deit_Small/vision_transformer.py | https://arxiv.org/abs/2012.12877 | 论文地址 | -| 开发引入 | / | Deit_Small/mixup.py | https://arxiv.org/abs/1710.09412 | 论文地址 | +| 文件位置 | 公网地址 | 公网地址用途 | +|-------------------------------------------------------------------------|---------------------------------------------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Deit_Small/models.py | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_384-d0272ac0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Deit_Small/models.py | https://dl.fbaipublicfiles.com/deit/deit_tiny_patch16_224-a1311bcf.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Deit_Small/models.py | https://dl.fbaipublicfiles.com/deit/deit_tiny_distilled_patch16_224-b40b3cf7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Deit_Small/models.py | https://dl.fbaipublicfiles.com/deit/deit_small_patch16_224-cd65a155.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Deit_Small/models.py | https://dl.fbaipublicfiles.com/deit/deit_small_distilled_patch16_224-649709d9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Deit_Small/models.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_384-8de9b5d1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Deit_Small/models.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_224-b5f2ef4d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Deit_Small/models.py | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_224-df68dfff.pth | 权重地址 | 
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Deit_Small/url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Deit_Small/url.ini | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_384-d0272ac0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Deit_Small/url.ini | https://dl.fbaipublicfiles.com/deit/deit_tiny_patch16_224-a1311bcf.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Deit_Small/url.ini | https://dl.fbaipublicfiles.com/deit/deit_tiny_distilled_patch16_224-b40b3cf7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Deit_Small/url.ini | https://dl.fbaipublicfiles.com/deit/deit_small_patch16_224-cd65a155.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Deit_Small/url.ini | https://dl.fbaipublicfiles.com/deit/deit_small_distilled_patch16_224-649709d9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Deit_Small/url.ini | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_384-8de9b5d1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Deit_Small/url.ini | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_224-b5f2ef4d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Deit_Small/url.ini | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_224-df68dfff.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Deit_Small/url.ini | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/vit_base_patch16_224_in21k_miil.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Deit_Small/url.ini | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/vit_base_patch16_224_1k_miil_84_4.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/classification/DnCNN/public_address_statement.md b/PyTorch/contrib/cv/classification/DnCNN/public_address_statement.md index f1d14f328b8fe9d1136082b76e12adc39828fb65..f50ffcbf3588fa342efdde4ada1bfa71a20bb43e 100644 --- a/PyTorch/contrib/cv/classification/DnCNN/public_address_statement.md +++ b/PyTorch/contrib/cv/classification/DnCNN/public_address_statement.md @@ -1,3 +1,3 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------------------------|----------------------------------------------|------------------------|--------| -| 开发引入 | / | url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 下载测试图片 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|------------------------------------------------------------------|-----------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/DnCNN/url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 数据集地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/classification/EfficientNet-B0_for_PyTorch/public_address_statement.md b/PyTorch/contrib/cv/classification/EfficientNet-B0_for_PyTorch/public_address_statement.md index c04e2cffb17bab81167355e7b85e14ce34e7c1c6..4684e25faf4f46ac2f896b94ef026a3e1b1c877b 100644 --- a/PyTorch/contrib/cv/classification/EfficientNet-B0_for_PyTorch/public_address_statement.md +++ b/PyTorch/contrib/cv/classification/EfficientNet-B0_for_PyTorch/public_address_statement.md @@ -1,24 +1,3 @@ -| 类型 | 开源代码地址 | 文件名 | 
公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|-------------------------------------------------------|--------------------------------------------------------|----------------------------------------------------------------------------------------------------------|---------| -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/efficientnet-b0-355c32eb.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/efficientnet-b1-f1951068.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/efficientnet-b2-8bb594d6.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/efficientnet-b3-5fb5a3c3.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/efficientnet-b4-6ed6700e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/efficientnet-b5-b6417697.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/efficientnet-b6-c76e70fd.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/efficientnet-b7-dcc49843.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b0-b64d5a18.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b1-0f3ce85a.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b2-6e9d97e5.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b3-cdd7c0f4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b4-44fb3a87.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet_for_PyTorch/efficientnet_pytorch/utils.py | 
https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b5-86493f6b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b6-ac80338e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b7-4652b6dd.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet_for_PyTorch/efficientnet_pytorch/utils.py | https://github.com/lukemelas/EfficientNet-PyTorch/releases/download/1.0/adv-efficientnet-b8-22a8fe65.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet_for_PyTorch/setup.py | https://github.com/lukemelas/EfficientNet-PyTorch | 下载源码 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet_for_PyTorch/setup.py | lmelaskyriazi@college.harvard.edu | 邮箱 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet_for_PyTorch/setup.py | https://pypi.python.org/pypi?%3Aaction=list_classifiers | 相关依赖 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet_for_PyTorch/efficientnet_pytorch/model.py | https://arxiv.org/abs/1905.11946 | 论文地址 | -| 开源代码引入 | https://github.com/lukemelas/EfficientNet-PyTorch.git | EfficientNet_for_PyTorch/hubconf.py | https://arxiv.org/abs/1905.11946 | 论文地址 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|-----------------------------------------------------------------------------------------|-----------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNet-B0_for_PyTorch/setup.py | lmelaskyriazi@college.harvard.edu | 作者邮箱 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/classification/EfficientNet-B1/public_address_statement.md b/PyTorch/contrib/cv/classification/EfficientNet-B1/public_address_statement.md index 4c0abeb7bcbeaa2017d9f5cb4480046fd316e113..0d7a40899fdec40ee1860840af6bc9c8edc4953c 100644 --- a/PyTorch/contrib/cv/classification/EfficientNet-B1/public_address_statement.md +++ b/PyTorch/contrib/cv/classification/EfficientNet-B1/public_address_statement.md @@ -1,16 +1,6 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------------------------|----------------------------------------------|------------------------|--------| -| 开发引入 | / | url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 下载测试图片 | -| 开源代码引入 | https://github.com/facebookresearch/pycls/blob/8c79a8e2adfffa7cae3a88aace28ef45e52aa7e5/pycls/core/io.py | EfficientNet-B1/pycls/core/io.py | https://dl.fbaipublicfiles.com/pycls | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/pycls/blob/8c79a8e2adfffa7cae3a88aace28ef45e52aa7e5/pycls/models/model_zoo.py | EfficientNet-B1/pycls/models/model_zoo.py | https://dl.fbaipublicfiles.com/pycls | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/pycls/blob/8c79a8e2adfffa7cae3a88aace28ef45e52aa7e5/pycls/models/model_zoo.py | EfficientNet-B1/pycls/models/model_zoo.py | https://raw.githubusercontent.com/facebookresearch/pycls/master/configs | 下载配置文件 | -| 开源代码引入 | 
https://github.com/facebookresearch/pycls/blob/8c79a8e2adfffa7cae3a88aace28ef45e52aa7e5/pycls/datasets/augment.py | EfficientNet-B1/pycls/datasets/augment.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/data/auto_augment.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/pycls/blob/8c79a8e2adfffa7cae3a88aace28ef45e52aa7e5/pycls/core/io.py | EfficientNet-B1/pycls/core/io.py | https://stackoverflow.com/questions/2028517/python-urllib2-progress-hook | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/pycls/blob/8c79a8e2adfffa7cae3a88aace28ef45e52aa7e5/pycls/core/io.py | EfficientNet-B1/pycls/core/io.py | https://stackoverflow.com/questions/3173320/text-progress-bar-in-the-console/27871113 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/pycls/blob/8c79a8e2adfffa7cae3a88aace28ef45e52aa7e5/pycls/datasets/augment.py | EfficientNet-B1/pycls/datasets/augment.py | https://arxiv.org/abs/1805.09501 | 论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/pycls/blob/8c79a8e2adfffa7cae3a88aace28ef45e52aa7e5/pycls/datasets/augment.py | EfficientNet-B1/pycls/datasets/augment.py | https://arxiv.org/abs/1909.13719 | 论文地址 | -| 开发引入 | / | EfficientNet-B1/pycls/core/sgd.py | http://www.cs.toronto.edu/%7Ehinton/absps/momentum.pdf | 论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/pycls/blob/8c79a8e2adfffa7cae3a88aace28ef45e52aa7e5/pycls/core/net.py | EfficientNet-B1/pycls/core/net.py | https://arxiv.org/abs/1710.09412 | 论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/pycls/blob/8c79a8e2adfffa7cae3a88aace28ef45e52aa7e5/pycls/core/distributed.py | EfficientNet-B1/pycls/core/distributed.py | https://github.com/pytorch/pytorch/issues/37377 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/pycls/blob/8c79a8e2adfffa7cae3a88aace28ef45e52aa7e5/pycls/datasets/augment.py | EfficientNet-B1/pycls/datasets/augment.py | http://github.com/tensorflow/tpu/blob/master/models/official/efficientnet/autoaugment.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/pycls/blob/8c79a8e2adfffa7cae3a88aace28ef45e52aa7e5/pycls/datasets/imagenet.py | EfficientNet-B1/pycls/datasets/imagenet.py | https://github.com/facebookarchive/fb.resnet.torch/blob/master/datasets/imagenet.lua | 源码实现 | +| 文件位置 | 公网地址 | 公网地址用途 | +|----------------------------------------------------------------------------------------------|-----------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNet-B1/pycls/core/io.py | https://dl.fbaipublicfiles.com/pycls | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNet-B1/pycls/models/model_zoo.py | https://dl.fbaipublicfiles.com/pycls | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNet-B1/pycls/models/model_zoo.py | https://raw.githubusercontent.com/facebookresearch/pycls/master/configs | 下载配置文件 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNet-B1/url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 数据集地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/classification/EfficientNet-B3/public_address_statement.md b/PyTorch/contrib/cv/classification/EfficientNet-B3/public_address_statement.md index b85486262927ef5ba91c2caa9af4f1435af75700..80b86b2015bb7a58e11d6847f6bb7fb29cbccdee 100644 --- a/PyTorch/contrib/cv/classification/EfficientNet-B3/public_address_statement.md +++ b/PyTorch/contrib/cv/classification/EfficientNet-B3/public_address_statement.md @@ 
-1,17 +1,6 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|-------------------------------------------------------------------------------------------------------------------|-------------------------------------------|------------------------|--------| -| 开发引入 | / | url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 下载测试图片 | -| 开源代码引入 | https://github.com/facebookresearch/pycls/blob/main/pycls/core/io.py | EfficientNet-B3/pycls/core/io.py | https://dl.fbaipublicfiles.com/pycls | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/pycls/blob/main/pycls/models/model_zoo.py | EfficientNet-B3/pycls/models/model_zoo.py | https://dl.fbaipublicfiles.com/pycls | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/pycls/blob/main/pycls/models/model_zoo.py | EfficientNet-B3/pycls/models/model_zoo.py | https://raw.githubusercontent.com/facebookresearch/pycls/master/configs | 下载配置文件 | -| 开源代码引入 | https://github.com/facebookresearch/pycls/blob/main/setup.py | EfficientNet-B3/setup.py | https://github.com/facebookresearch/pycls | 开源地址 | -| 开发引入 | / | EfficientNet-B3/pycls/core/sgd.py | http://www.cs.toronto.edu/%7Ehinton/absps/momentum.pdf | 论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/pycls/blob/main/pycls/core/io.py | EfficientNet-B3/pycls/core/io.py | https://stackoverflow.com/questions/3173320/text-progress-bar-in-the-console/27871113 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/pycls/blob/main/pycls/core/distributed.py | EfficientNet-B3/pycls/core/distributed.py | https://github.com/pytorch/pytorch/issues/37377 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/pycls/blob/main/pycls/datasets/augment.py | EfficientNet-B3/pycls/datasets/augment.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/data/auto_augment.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/pycls/blob/main/pycls/datasets/augment.py | EfficientNet-B3/pycls/datasets/augment.py | http://github.com/tensorflow/tpu/blob/master/models/official/efficientnet/autoaugment.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/pycls/blob/main/pycls/datasets/augment.py | EfficientNet-B3/pycls/datasets/augment.py | https://arxiv.org/abs/1909.13719 | 论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/pycls/blob/main/pycls/core/net.py | EfficientNet-B3/pycls/core/net.py | https://arxiv.org/abs/1710.09412 | 论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/pycls/blob/main/pycls/datasets/augment.py | EfficientNet-B3/pycls/datasets/augment.py | https://arxiv.org/abs/1805.09501 | 论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/pycls/blob/main/pycls/core/io.py | EfficientNet-B3/pycls/core/io.py | https://stackoverflow.com/questions/2028517/python-urllib2-progress-hook | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/pycls/blob/main/pycls/datasets/imagenet.py | EfficientNet-B3/pycls/datasets/imagenet.py | https://github.com/facebookarchive/fb.resnet.torch/blob/master/datasets/imagenet.lua | 源码实现 | +| 文件位置 | 公网地址 | 公网地址用途 | +|----------------------------------------------------------------------------------------------|-----------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNet-B3/pycls/core/io.py | https://dl.fbaipublicfiles.com/pycls | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNet-B3/pycls/models/model_zoo.py | https://dl.fbaipublicfiles.com/pycls | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNet-B3/pycls/models/model_zoo.py | https://raw.githubusercontent.com/facebookresearch/pycls/master/configs | 下载配置文件 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNet-B3/url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 数据集地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/classification/EfficientNet-B5_ID1621_for_PyTorch/public_address_statement.md b/PyTorch/contrib/cv/classification/EfficientNet-B5_ID1621_for_PyTorch/public_address_statement.md index 8d0b7a336972efd67f492e5b421e8a40c1769535..5d00de6811745eff02e8c08ca1e31d049e61532f 100644 --- a/PyTorch/contrib/cv/classification/EfficientNet-B5_ID1621_for_PyTorch/public_address_statement.md +++ b/PyTorch/contrib/cv/classification/EfficientNet-B5_ID1621_for_PyTorch/public_address_statement.md @@ -1,17 +1,6 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|-------------------------------------------------------------------------------------------------------------------|-------------------------------------------|------------------------|--------| -| 开发引入 | / | url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 下载测试图片 | -| 开源代码引入 | https://github.com/facebookresearch/pycls/blob/0ddcc2b25607c7144fd6c169d725033b81477223/pycls/core/io.py | EfficientNet-B5_ID1621_for_PyTorch/pycls/core/io.py | https://dl.fbaipublicfiles.com/pycls | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/pycls/blob/0ddcc2b25607c7144fd6c169d725033b81477223/pycls/models/model_zoo.py | EfficientNet-B5_ID1621_for_PyTorch/pycls/models/model_zoo.py | https://dl.fbaipublicfiles.com/pycls | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/pycls/blob/0ddcc2b25607c7144fd6c169d725033b81477223/pycls/models/model_zoo.py | EfficientNet-B5_ID1621_for_PyTorch/pycls/models/model_zoo.py | https://raw.githubusercontent.com/facebookresearch/pycls/master/configs | 下载配置文件 | -| 开源代码引入 | https://github.com/facebookresearch/pycls/blob/0ddcc2b25607c7144fd6c169d725033b81477223/setup.py | EfficientNet-B5_ID1621_for_PyTorch/setup.py | https://github.com/facebookresearch/pycls | 开源地址 | -| 开源代码引入 | https://github.com/facebookresearch/pycls.git/pycls/datasets/augment.py | EfficientNet-B5_ID1621_for_PyTorch/pycls/datasets/augment.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/data/auto_augment.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/pycls.git/pycls/datasets/augment.py | EfficientNet-B5_ID1621_for_PyTorch/pycls/datasets/augment.py | http://github.com/tensorflow/tpu/blob/master/models/official/efficientnet/autoaugment.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/pycls.git/pycls/core/net.py | EfficientNet-B5_ID1621_for_PyTorch/pycls/core/net.py | https://arxiv.org/abs/1710.09412 | 论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/pycls.git/pycls/core/io.py | EfficientNet-B5_ID1621_for_PyTorch/pycls/core/io.py | https://stackoverflow.com/questions/2028517/python-urllib2-progress-hook | 相关说明 | -| 开发引入 | / | EfficientNet-B5_ID1621_for_PyTorch/pycls/core/sgd.py | http://www.cs.toronto.edu/%7Ehinton/absps/momentum.pdf | 论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/pycls.git/pycls/datasets/augment.py | EfficientNet-B5_ID1621_for_PyTorch/pycls/datasets/augment.py | https://arxiv.org/abs/1909.13719 | 论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/pycls.git/pycls/core/io.py | EfficientNet-B5_ID1621_for_PyTorch/pycls/core/io.py | 
https://stackoverflow.com/questions/3173320/text-progress-bar-in-the-console/27871113 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/pycls.git/pycls/datasets/augment.py | EfficientNet-B5_ID1621_for_PyTorch/pycls/datasets/augment.py | https://arxiv.org/abs/1805.09501 | 论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/pycls.git/pycls/core/distributed.py | EfficientNet-B5_ID1621_for_PyTorch/pycls/core/distributed.py | https://github.com/pytorch/pytorch/issues/37377 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/pycls.git/pycls/datasets/imagenet.py | EfficientNet-B5_ID1621_for_PyTorch/pycls/datasets/imagenet.py | https://github.com/facebookarchive/fb.resnet.torch/blob/master/datasets/imagenet.lua | 源码实现 | +| 文件位置 | 公网地址 | 公网地址用途 | +|-----------------------------------------------------------------------------------------------------------------|-----------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNet-B5_ID1621_for_PyTorch/pycls/core/io.py | https://dl.fbaipublicfiles.com/pycls | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNet-B5_ID1621_for_PyTorch/pycls/models/model_zoo.py | https://dl.fbaipublicfiles.com/pycls | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNet-B5_ID1621_for_PyTorch/pycls/models/model_zoo.py | https://raw.githubusercontent.com/facebookresearch/pycls/master/configs | 下载配置文件 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNet-B5_ID1621_for_PyTorch/url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 数据集地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/public_address_statement.md b/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/public_address_statement.md index 70776fb147505afa86203e6d1af1e7e7c7db8cfb..5ab2c28f6f94d64cfcd8a7ede6315c5e8b67659d 100644 --- a/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/public_address_statement.md +++ b/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/public_address_statement.md @@ -1,102 +1,276 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|-------------------------------------------------------|--------------------------------------------------------|----------------------------------------------------------------------------------------------------------|---------| -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/utils/model_ema.py | https://www.tensorflow.org/api_docs/python/tf/train/ExponentialMovingAverage | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/utils/model_ema.py | https://www.tensorflow.org/api_docs/python/tf/train/ExponentialMovingAverage | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/utils/model.py | https://gist.github.com/amaarora/6e56942fcb46e67ba203f3009b30d950 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/utils/model.py | https://arxiv.org/abs/2101.08692 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/utils/model.py | https://gist.github.com/amaarora/6e56942fcb46e67ba203f3009b30d950 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/utils/model.py | https://docs.fast.ai/callback.hook.html | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | 
timm/utils/misc.py | http://www.codinghorror.com/blog/archives/001018.html | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/utils/agc.py | https://gist.github.com/lucidrains/0d6560077edac419ab5d3aa29e674d5c | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/utils/agc.py | https://github.com/deepmind/deepmind-research/tree/master/nfnets | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/utils/agc.py | https://arxiv.org/abs/2102.06171 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/optim/sgdp.py | https://github.com/clovaai/AdamP | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/optim/sgdp.py | https://arxiv.org/abs/2006.08217 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/optim/sgdp.py | https://github.com/clovaai/AdamP/blob/master/adamp/sgdp.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/optim/rmsprop_tf.py | https://arxiv.org/abs/1711.05101 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/optim/rmsprop_tf.py | ttps://arxiv.org/pdf/1308.0850v5.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/optim/rmsprop_tf.py | http://www.cs.toronto.edu/~tijmen/csc321/slides/lecture_slides_lec6.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/optim/rmsprop_tf.py | https://github.com/pytorch/pytorch/blob/master/LICENSE | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/optim/rmsprop_tf.py | https://github.com/pytorch/pytorch/blob/063946d2b3f3f1e953a2a3b54e0b34f1393de295/torch/optim/rmsprop.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/optim/radam.py | https://arxiv.org/abs/1908.03265 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/optim/radam.py | https://github.com/LiyuanLucasLiu/RAdam | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/optim/optim_factory.py | https://github.com/microsoft/unilm/blob/master/beit/optim_factory.py#L58 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/optim/nvnovograd.py | https://arxiv.org/abs/1905.11286 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/optim/nvnovograd.py | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechRecognition/Jasper | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/optim/nadamw.py | https://openreview.net/forum?id=ryQu7f-RZ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/optim/nadamw.py | https://arxiv.org/abs/1711.05101 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/optim/nadamw.py | https://arxiv.org/abs/1910.05446 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/optim/nadamw.py | https://github.com/mlcommons/algorithmic-efficiency/tree/main/baselines/nadamw | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/optim/nadam.py | https://github.com/pytorch/pytorch/pull/1408 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/optim/nadam.py | http://www.cs.toronto.edu/~fritz/absps/momentum.pdf | 参考论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models.git | timm/optim/nadam.py | http://cs229.stanford.edu/proj2015/054_report.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/optim/madgrad.py | https://arxiv.org/abs/2101.11075 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/optim/madgrad.py | https://github.com/facebookresearch/madgrad | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/optim/madgrad.py | https://arxiv.org/abs/2101.11075 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/optim/lookahead.py | https://arxiv.org/abs/1907.08610 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/optim/lookahead.py | https://github.com/alphadl/lookahead.pytorch | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/optim/lion.py | http://www.apache.org/licenses/LICENSE-2.0 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/optim/lion.py | https://github.com/google/automl/tree/master/lion | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/optim/lion.py | https://arxiv.org/abs/2302.06675 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/optim/lars.py | https://github.com/pytorch/pytorch/blob/1.7/torch/optim/sgd.py#L100 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/optim/lars.py | https://github.com/NVIDIA/apex/blob/master/apex/parallel/LARC.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/optim/lars.py | https://arxiv.org/pdf/1708.03888.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/optim/lars.py | https://github.com/NVIDIA/apex/blob/master/apex/parallel/LARC.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/optim/lars.py | https://github.com/pytorch/pytorch/blob/1.7/torch/optim/sgd.py#L100 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/optim/lamb.py | https://github.com/pytorch/pytorch/issues/9190 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/optim/lamb.py | https://openreview.net/forum?id=ryQu7f-RZ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/optim/lamb.py | https://arxiv.org/abs/1904.00962 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/optim/lamb.py | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/Transformer-XL/pytorch/lamb.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/optim/lamb.py | http://www.apache.org/licenses/LICENSE-2.0 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/optim/lamb.py | https://github.com/cybertronai/pytorch-lamb | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/optim/lamb.py | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/Transformer-XL/pytorch/lamb.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/optim/lamb.py | https://github.com/HabanaAI/Model-References/blob/2b435114fe8e31f159b1d3063b8280ae37af7423/PyTorch/nlp/bert/pretraining/lamb.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/optim/adan.py | 
https://arxiv.org/abs/2208.06677 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/optim/adan.py | https://github.com/sail-sg/Adan | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/optim/adan.py | https://arxiv.org/abs/2208.06677 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/optim/adamw.py | https://openreview.net/forum?id=ryQu7f-RZ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/optim/adamw.py | https://arxiv.org/abs/1711.05101 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/optim/adamw.py | https://arxiv.org/abs/1412.6980 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/optim/adamp.py | https://github.com/clovaai/AdamP | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/optim/adamp.py | https://arxiv.org/abs/2006.08217 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/optim/adamp.py | https://github.com/clovaai/AdamP/blob/master/adamp/adamp.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/optim/adahessian.py | https://github.com/davda54/ada-hessian/blob/master/ada_hessian.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/optim/adafactor.py | https://arxiv.org/abs/1804.04235 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/optim/adafactor.py | https://github.com/pytorch/fairseq/blob/master/fairseq/optim/adafactor.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/optim/adabelief.py | https://gist.github.com/juntang-zhuang/517ce3c27022b908bb93f78e4f786dc3 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/optim/adabelief.py | https://gist.github.com/juntang-zhuang/0a501dd51c02278d952cf159bc233037 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/optim/adabelief.py | https://github.com/juntang-zhuang/Adabelief-Optimizer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/loss/jsd.py | https://arxiv.org/abs/1912.02781 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/loss/jsd.py | https://github.com/google-research/augmix/blob/master/imagenet.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/layers/norm_act.py | https://github.com/rwightman/pytorch-image-models/issues/1254 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/data/tf_preprocessing.py | http://www.apache.org/licenses/LICENSE-2.0 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/data/real_labels.py | ttps://github.com/google-research/reassessed-imagenet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/data/real_labels.py | https://arxiv.org/abs/2006.07159 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/data/readers/reader_wds.py | https://github.com/webdataset/webdataset | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/data/readers/reader_tfds.py | https://pytorch.org/docs/stable/data.html#multi-process-data-loading | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/data/readers/reader_tfds.py | 
https://github.com/pytorch/pytorch/issues/33413 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/data/readers/reader_tfds.py | https://www.tensorflow.org/datasets/catalog/overview#image_classification | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/data/readers/reader_tfds.py | https://github.com/tensorflow/datasets | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/data/random_erasing.py | https://arxiv.org/pdf/1708.04896.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/data/random_erasing.py | https://github.com/pytorch/pytorch/issues/19508 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/data/random_erasing.py | https://github.com/zhunzhong07/Random-Erasing | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/data/mixup.py | https://github.com/clovaai/CutMix-PyTorch | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/data/mixup.py | https://arxiv.org/abs/1905.04899 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/data/mixup.py | https://arxiv.org/abs/1710.09412 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/data/loader.py | https://github.com/NVIDIA/apex/commit/d5e2bb4bdeedd27b1dfaf5bb2b24d6c000dee9be#diff-cf86c282ff7fba81fad27a559379d5bf | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/data/distributed_sampler.py | https://github.com/facebookresearch/deit/blob/0c4b8f60/samplers.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/data/auto_augment.py | https://arxiv.org/abs/1912.02781 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/data/auto_augment.py | https://github.com/google-research/augmix/blob/master/imagenet.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/data/auto_augment.py | https://arxiv.org/abs/1805.09501 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/data/auto_augment.py | https://arxiv.org/abs/1805.09501 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/data/auto_augment.py | https://arxiv.org/abs/2204.07118 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/data/auto_augment.py | https://arxiv.org/abs/1912.02781 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/data/auto_augment.py | https://arxiv.org/abs/1909.13719 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/data/auto_augment.py | https://arxiv.org/abs/1906.11172 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/data/auto_augment.py | https://arxiv.org/abs/1805.09501 | 参考论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/data/auto_augment.py | https://github.com/facebookresearch/deit/blob/main/README_revenge.md | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/data/auto_augment.py | https://github.com/google-research/augmix | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | timm/data/auto_augment.py | https://github.com/tensorflow/tpu/blob/master/models/official/efficientnet/autoaugment.py | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models.git | train.py | https://github.com/rwightman | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | train.py | https://github.com/NVIDIA/apex/tree/master/examples/imagenet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | train.py | https://github.com/pytorch/examples/tree/master/imagenet | 源码实现 | +| 文件位置 | 公网地址 | 公网地址用途 | +|---------------------------------------------------------------------------------------------------------------------------|--------------------------------------------------------------------------------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/beit.py | https://openreview.net/forum?id=p-BhZSz59o4 | 相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XXS36_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XXS36_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XXS24_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XXS24_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XS24_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/S36_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/S24_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/S24_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/M48_448.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/M36_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnext_xlarge_22k_1k_384_ema.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnext_xlarge_22k_1k_224_ema.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnext_tiny_22k_1k_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnext_tiny_22k_1k_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnext_small_22k_1k_384.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnext_small_22k_1k_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnext_large_22k_1k_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnext_large_22k_1k_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnext_base_22k_1k_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnext_base_22k_1k_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnext_xlarge_22k_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnext_tiny_22k_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnext_tiny_1k_224_ema.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnext_small_22k_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnext_small_1k_224_ema.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnext_large_22k_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnext_large_1k_224_ema.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnext_base_22k_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnext_base_1k_224_ema.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnextv2/pt_only/convnextv2_pico_1k_224_fcmae.pt | 模型权重 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnextv2/pt_only/convnextv2_nano_1k_224_fcmae.pt | 模型权重 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnextv2/pt_only/convnextv2_femto_1k_224_fcmae.pt | 模型权重 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnextv2/pt_only/convnextv2_atto_1k_224_fcmae.pt | 模型权重 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnextv2/im22k/convnextv2_nano_22k_384_ema.pt | 模型权重 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnextv2/im22k/convnextv2_nano_22k_224_ema.pt | 模型权重 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnextv2/im1k/convnextv2_pico_1k_224_ema.pt | 模型权重 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnextv2/im1k/convnextv2_nano_1k_224_ema.pt | 模型权重 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnextv2/im1k/convnextv2_femto_1k_224_ema.pt | 模型权重 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnextv2/im1k/convnextv2_atto_1k_224_ema.pt | 模型权重 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnextv2/pt_only/convnextv2_tiny_1k_224_fcmae.pt | 模型权重 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnextv2/pt_only/convnextv2_large_1k_224_fcmae.pt | 模型权重 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnextv2/pt_only/convnextv2_huge_1k_224_fcmae.pt | 模型权重 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnextv2/pt_only/convnextv2_base_1k_224_fcmae.pt | 模型权重 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnextv2/im22k/convnextv2_tiny_22k_384_ema.pt | 模型权重 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnextv2/im22k/convnextv2_tiny_22k_224_ema.pt | 模型权重 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnextv2/im22k/convnextv2_large_22k_384_ema.pt | 模型权重 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnextv2/im22k/convnextv2_large_22k_224_ema.pt | 模型权重 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnextv2/im22k/convnextv2_huge_22k_512_ema.pt | 模型权重 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnextv2/im22k/convnextv2_huge_22k_384_ema.pt | 模型权重 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnextv2/im22k/convnextv2_base_22k_384_ema.pt | 模型权重 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnextv2/im22k/convnextv2_base_22k_224_ema.pt | 模型权重 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnextv2/im1k/convnextv2_tiny_1k_224_ema.pt | 模型权重 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnextv2/im1k/convnextv2_large_1k_224_ema.pt | 模型权重 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnextv2/im1k/convnextv2_huge_1k_224_ema.pt | 模型权重 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/convnext.py | https://dl.fbaipublicfiles.com/convnext/convnextv2/im1k/convnextv2_base_1k_224_ema.pt | 模型权重 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/deit.py | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_384-d0272ac0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/deit.py | https://dl.fbaipublicfiles.com/deit/deit_tiny_patch16_224-a1311bcf.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/deit.py | https://dl.fbaipublicfiles.com/deit/deit_tiny_distilled_patch16_224-b40b3cf7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/deit.py | https://dl.fbaipublicfiles.com/deit/deit_small_patch16_224-cd65a155.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/deit.py | https://dl.fbaipublicfiles.com/deit/deit_small_distilled_patch16_224-649709d9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/deit.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_384-8de9b5d1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/deit.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_224-b5f2ef4d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/deit.py | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_224-df68dfff.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/deit.py | https://dl.fbaipublicfiles.com/deit/deit_3_small_384_21k.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/deit.py | https://dl.fbaipublicfiles.com/deit/deit_3_small_384_1k.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/deit.py | https://dl.fbaipublicfiles.com/deit/deit_3_small_224_21k.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/deit.py | https://dl.fbaipublicfiles.com/deit/deit_3_small_224_1k.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/deit.py | https://dl.fbaipublicfiles.com/deit/deit_3_medium_224_21k.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/deit.py | 
https://dl.fbaipublicfiles.com/deit/deit_3_medium_224_1k.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/deit.py | https://dl.fbaipublicfiles.com/deit/deit_3_large_384_21k.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/deit.py | https://dl.fbaipublicfiles.com/deit/deit_3_large_384_1k.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/deit.py | https://dl.fbaipublicfiles.com/deit/deit_3_large_224_21k.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/deit.py | https://dl.fbaipublicfiles.com/deit/deit_3_large_224_1k.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/deit.py | https://dl.fbaipublicfiles.com/deit/deit_3_huge_224_21k_v1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/deit.py | https://dl.fbaipublicfiles.com/deit/deit_3_huge_224_1k.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/deit.py | https://dl.fbaipublicfiles.com/deit/deit_3_base_384_21k.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/deit.py | https://dl.fbaipublicfiles.com/deit/deit_3_base_384_1k.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/deit.py | https://dl.fbaipublicfiles.com/deit/deit_3_base_224_21k.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/deit.py | https://dl.fbaipublicfiles.com/deit/deit_3_base_224_1k.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/inception_v3.py | https://download.pytorch.org/models/inception_v3_google-1a9a5a14.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlpB_24_no_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlpB_24_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlpB_24_22k.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_36_no_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_36_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_24_no_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_24_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_24_dino.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/mlp_mixer.py | 
https://dl.fbaipublicfiles.com/deit/resmlp_12_no_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_12_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_12_dino.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/mvitv2.py | https://dl.fbaipublicfiles.com/mvit/mvitv2_models/MViTv2_H_in21k.pyth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/mvitv2.py | https://dl.fbaipublicfiles.com/mvit/mvitv2_models/MViTv2_S_in1k.pyth | 模型权重 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/mvitv2.py | https://dl.fbaipublicfiles.com/mvit/mvitv2_models/MViTv2_L_in1k.pyth | 模型权重 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/mvitv2.py | https://dl.fbaipublicfiles.com/mvit/mvitv2_models/MViTv2_B_in1k.pyth | 模型权重 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/mvitv2.py | https://dl.fbaipublicfiles.com/mvit/mvitv2_models/MViTv2_T_in1k.pyth | 模型权重 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/mvitv2.py | https://dl.fbaipublicfiles.com/mvit/mvitv2_models/MViTv2_L_in21k.pyth | 模型权重 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/mvitv2.py | https://dl.fbaipublicfiles.com/mvit/mvitv2_models/MViTv2_B_in21k.pyth | 模型权重 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/regnet.py | https://dl.fbaipublicfiles.com/deit/regnety_160-a5fe301d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/regnet.py | https://download.pytorch.org/models/regnet_y_8gf-dc2b1b54.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/regnet.py | https://download.pytorch.org/models/regnet_y_800mf-58fc7688.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/regnet.py | https://download.pytorch.org/models/regnet_y_400mf-e6988f5f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/regnet.py | https://download.pytorch.org/models/regnet_y_32gf-8db6d4b5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/regnet.py | https://download.pytorch.org/models/regnet_y_32gf_swag-04fdfa75.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/regnet.py | https://download.pytorch.org/models/regnet_y_32gf_lc_swag-e1583746.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/regnet.py | https://download.pytorch.org/models/regnet_y_3_2gf-9180c971.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/regnet.py | https://download.pytorch.org/models/regnet_y_16gf-3e4a00f9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/regnet.py | https://download.pytorch.org/models/regnet_y_16gf_swag-43afe44d.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/regnet.py | https://download.pytorch.org/models/regnet_y_16gf_lc_swag-f3ec0043.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/regnet.py | https://download.pytorch.org/models/regnet_y_128gf_swag-c8ce3e52.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/regnet.py | https://download.pytorch.org/models/regnet_y_128gf_lc_swag-cbe8ce12.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/regnet.py | https://download.pytorch.org/models/regnet_y_1_6gf-0d7bc02a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/regnet.py | https://download.pytorch.org/models/regnet_x_8gf-2b70d774.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/regnet.py | https://download.pytorch.org/models/regnet_x_800mf-94a99ebd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/regnet.py | https://download.pytorch.org/models/regnet_x_400mf-62229a5f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/regnet.py | https://download.pytorch.org/models/regnet_x_32gf-6eb8fdc6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/regnet.py | https://download.pytorch.org/models/regnet_x_3_2gf-7071aa85.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/regnet.py | https://download.pytorch.org/models/regnet_x_16gf-ba3796d7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/regnet.py | https://download.pytorch.org/models/regnet_x_1_6gf-a12f2b72.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/regnet.py | https://dl.fbaipublicfiles.com/vissl/model_zoo/swav_ig1b_regnet128Gf_cnstant_bs32_node16_sinkhorn10_proto16k_syncBN64_warmup8k/model_final_checkpoint_phase0.torch | 模型权重 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/regnet.py | https://dl.fbaipublicfiles.com/vissl/model_zoo/seer_regnet64/seer_regnet64gf_model_final_checkpoint_phase0.torch | 模型权重 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/regnet.py | https://dl.fbaipublicfiles.com/vissl/model_zoo/seer_regnet32d/seer_regnet32gf_model_iteration244000.torch | 模型权重 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/regnet.py | https://dl.fbaipublicfiles.com/vissl/model_zoo/seer_finetuned/seer_regnet64_finetuned_in1k_model_final_checkpoint_phase78.torch | 模型权重 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/regnet.py | https://dl.fbaipublicfiles.com/vissl/model_zoo/seer_finetuned/seer_regnet32_finetuned_in1k_model_final_checkpoint_phase78.torch | 模型权重 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/regnet.py | https://dl.fbaipublicfiles.com/vissl/model_zoo/seer_finetuned/seer_regnet256_finetuned_in1k_model_final_checkpoint_phase38.torch | 模型权重 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/regnet.py | https://dl.fbaipublicfiles.com/vissl/model_zoo/seer_finetuned/seer_regnet128_finetuned_in1k_model_final_checkpoint_phase78.torch | 模型权重 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext50_32x4-72679e44.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x8-b4712904.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x4-3f87e46b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x16-f3559a9c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet50-16a12f1b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet18-118f1556.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext50_32x4-ddb3e555.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x8-2cfe2f8b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x4-dc43570a.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x16-15fffa57.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet50-08389792.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet18-d92f0530.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/wide_resnet50_2-9ba9bcbe.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/wide_resnet50_2-95faca4d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/wide_resnet101_2-d733dc28.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnext50_32x4d-1a0047aa.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnext101_64x4d-173b62eb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnext101_32x8d-110c445d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet50-11ad3fa6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet152-f82ba261.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet101-cd907fc2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x8-c38310e5.pth' | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x48-3e41cc8a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x32-e4b90b00.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x16-c6f796b0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/mae/pretrain/mae_pretrain_vit_large.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer.py | 
https://dl.fbaipublicfiles.com/mae/pretrain/mae_pretrain_vit_huge.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/mae/pretrain/mae_pretrain_vit_base.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/ijepa/IN22K-vit.h.14-900e.pth.tar | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/ijepa/IN22K-vit.g.16-600e.pth.tar | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/ijepa/IN1K-vit.h.16-448px-300e.pth.tar | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/ijepa/IN1K-vit.h.14-300e.pth.tar | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/dinov2/dinov2_vits14/dinov2_vits14_reg4_pretrain.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/dinov2/dinov2_vits14/dinov2_vits14_pretrain.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/dinov2/dinov2_vitl14/dinov2_vitl14_reg4_pretrain.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/dinov2/dinov2_vitl14/dinov2_vitl14_pretrain.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/dinov2/dinov2_vitg14/dinov2_vitg14_reg4_pretrain.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/dinov2/dinov2_vitg14/dinov2_vitg14_pretrain.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/dinov2/dinov2_vitb14/dinov2_vitb14_reg4_pretrain.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/dinov2/dinov2_vitb14/dinov2_vitb14_pretrain.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/dino/dino_vitbase8_pretrain/dino_vitbase8_pretrain.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/dino/dino_vitbase16_pretrain/dino_vitbase16_pretrain.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/dino/dino_deitsmall8_pretrain/dino_deitsmall8_pretrain.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer.py | 
https://dl.fbaipublicfiles.com/dino/dino_deitsmall16_pretrain/dino_deitsmall16_pretrain.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/sam/ViT-B_32.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/sam/ViT-B_16.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/Ti_16-i21k-300ep-lr_0.001-aug_none-wd_0.03-do_0.0-sd_0.0--imagenet2012-steps_20k-lr_0.03-res_384.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/Ti_16-i21k-300ep-lr_0.001-aug_none-wd_0.03-do_0.0-sd_0.0--imagenet2012-steps_20k-lr_0.03-res_224.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/Ti_16-i21k-300ep-lr_0.001-aug_none-wd_0.03-do_0.0-sd_0.0.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/S_32-i21k-300ep-lr_0.001-aug_light1-wd_0.03-do_0.0-sd_0.0--imagenet2012-steps_20k-lr_0.03-res_384.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/S_32-i21k-300ep-lr_0.001-aug_light1-wd_0.03-do_0.0-sd_0.0--imagenet2012-steps_20k-lr_0.03-res_224.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/S_32-i21k-300ep-lr_0.001-aug_light1-wd_0.03-do_0.0-sd_0.0.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/S_16-i21k-300ep-lr_0.001-aug_light1-wd_0.03-do_0.0-sd_0.0--imagenet2012-steps_20k-lr_0.03-res_384.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/S_16-i21k-300ep-lr_0.001-aug_light1-wd_0.03-do_0.0-sd_0.0--imagenet2012-steps_20k-lr_0.03-res_224.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/S_16-i21k-300ep-lr_0.001-aug_light1-wd_0.03-do_0.0-sd_0.0.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/S_16-i1k-300ep-lr_0.001-aug_medium2-wd_0.1-do_0.0-sd_0.0--imagenet2012-steps_20k-lr_0.01-res_384.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/S_16-i1k-300ep-lr_0.001-aug_medium2-wd_0.1-do_0.0-sd_0.0--imagenet2012-steps_20k-lr_0.01-res_224.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer.py | 
https://storage.googleapis.com/vit_models/augreg/L_16-i21k-300ep-lr_0.001-aug_medium1-wd_0.1-do_0.1-sd_0.1--imagenet2012-steps_20k-lr_0.01-res_384.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/L_16-i21k-300ep-lr_0.001-aug_medium1-wd_0.1-do_0.1-sd_0.1--imagenet2012-steps_20k-lr_0.01-res_224.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/L_16-i21k-300ep-lr_0.001-aug_medium1-wd_0.1-do_0.1-sd_0.1.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/B_8-i21k-300ep-lr_0.001-aug_medium1-wd_0.1-do_0.0-sd_0.0--imagenet2012-steps_20k-lr_0.01-res_224.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/B_8-i21k-300ep-lr_0.001-aug_medium1-wd_0.1-do_0.0-sd_0.0.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/B_32-i21k-300ep-lr_0.001-aug_medium1-wd_0.03-do_0.0-sd_0.0--imagenet2012-steps_20k-lr_0.03-res_224.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/B_32-i21k-300ep-lr_0.001-aug_medium1-wd_0.03-do_0.0-sd_0.0.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/B_32-i21k-300ep-lr_0.001-aug_light1-wd_0.1-do_0.0-sd_0.0--imagenet2012-steps_20k-lr_0.03-res_384.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/B_32-i1k-300ep-lr_0.001-aug_medium2-wd_0.1-do_0.1-sd_0.1--imagenet2012-steps_20k-lr_0.01-res_384.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/B_32-i1k-300ep-lr_0.001-aug_medium2-wd_0.1-do_0.1-sd_0.1--imagenet2012-steps_20k-lr_0.01-res_224.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/B_16-i21k-300ep-lr_0.001-aug_medium1-wd_0.1-do_0.0-sd_0.0--imagenet2012-steps_20k-lr_0.01-res_384.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/B_16-i21k-300ep-lr_0.001-aug_medium1-wd_0.1-do_0.0-sd_0.0--imagenet2012-steps_20k-lr_0.01-res_224.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/B_16-i21k-300ep-lr_0.001-aug_medium1-wd_0.1-do_0.0-sd_0.0.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer.py | 
https://storage.googleapis.com/vit_models/augreg/B_16-i1k-300ep-lr_0.001-aug_strong2-wd_0.1-do_0.1-sd_0.1--imagenet2012-steps_20k-lr_0.01-res_384.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/B_16-i1k-300ep-lr_0.001-aug_strong2-wd_0.1-do_0.1-sd_0.1--imagenet2012-steps_20k-lr_0.01-res_224.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/big_vision/flexivit/vit_b30_i21k_300ep.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/big_vision/flexivit/vit_b16_i21k_300ep.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/big_vision/flexivit/flexivit_s_i1k_600ep.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/big_vision/flexivit/flexivit_s_i1k_300ep.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/big_vision/flexivit/flexivit_s_i1k.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/big_vision/flexivit/flexivit_l_i1k_600ep.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/big_vision/flexivit/flexivit_l_i1k_300ep.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/big_vision/flexivit/flexivit_l_i1k.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/big_vision/flexivit/flexivit_b_i21k_300ep.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/big_vision/flexivit/flexivit_b_i21k_1000ep.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/big_vision/flexivit/flexivit_b_i1k_600ep.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/big_vision/flexivit/flexivit_b_i1k_300ep.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/big_vision/flexivit/flexivit_b_i1k.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/R50_L_32-i21k-300ep-lr_0.001-aug_medium2-wd_0.1-do_0.0-sd_0.0--imagenet2012-steps_20k-lr_0.01-res_384.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer_hybrid.py | 
https://storage.googleapis.com/vit_models/augreg/R50_L_32-i21k-300ep-lr_0.001-aug_medium2-wd_0.1-do_0.0-sd_0.0.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/R50_L_32-i21k-300ep-lr_0.001-aug_medium1-wd_0.1-do_0.1-sd_0.1--imagenet2012-steps_20k-lr_0.01-res_224.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/R26_S_32-i21k-300ep-lr_0.001-aug_medium2-wd_0.03-do_0.0-sd_0.0--imagenet2012-steps_20k-lr_0.03-res_384.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/R26_S_32-i21k-300ep-lr_0.001-aug_medium2-wd_0.03-do_0.0-sd_0.0.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/R26_S_32-i21k-300ep-lr_0.001-aug_light0-wd_0.03-do_0.1-sd_0.1--imagenet2012-steps_20k-lr_0.03-res_224.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/R_Ti_16-i21k-300ep-lr_0.001-aug_none-wd_0.03-do_0.0-sd_0.0--imagenet2012-steps_20k-lr_0.03-res_384.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/R_Ti_16-i21k-300ep-lr_0.001-aug_none-wd_0.03-do_0.0-sd_0.0--imagenet2012-steps_20k-lr_0.03-res_224.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/R_Ti_16-i21k-300ep-lr_0.001-aug_none-wd_0.03-do_0.0-sd_0.0.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer_sam.py | https://dl.fbaipublicfiles.com/segment_anything/sam_vit_l_0b3195.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer_sam.py | https://dl.fbaipublicfiles.com/segment_anything/sam_vit_h_4b8939.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/vision_transformer_sam.py | https://dl.fbaipublicfiles.com/segment_anything/sam_vit_b_01ec64.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p8_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p8_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p8_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p16_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/xcit.py | 
https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p16_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p16_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p8_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p8_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p8_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p16_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p16_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p16_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p8_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p8_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p8_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p16_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p16_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p16_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p8_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p8_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p8_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p16_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p16_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p16_224.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p8_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p8_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p8_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p16_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p16_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p16_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p8_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p8_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p8_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p16_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p16_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p16_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p8_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p8_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p8_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p16_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p16_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/EfficientNetV2-B0_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p16_224.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/classification/Evo-Levit_256_384/public_address_statement.md b/PyTorch/contrib/cv/classification/Evo-Levit_256_384/public_address_statement.md index be4f589718824e1e0746f5dd538601c0b9944893..6d4ad017949ae4ff5cf706d3c80071ae6b99e3b9 100644 --- 
a/PyTorch/contrib/cv/classification/Evo-Levit_256_384/public_address_statement.md +++ b/PyTorch/contrib/cv/classification/Evo-Levit_256_384/public_address_statement.md @@ -1,14 +1,12 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------------------------|----------------------------------------------|------------------------|--------| -| 开源代码引入 | https://github.com/YifanXu74/Evo-ViT/blob/4c5d9b30b0a3c9b1e7b8687a9490555bd9d714ca/levit/evo_levit.py | Evo-Levit_256_384/levit/evo_levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-128S-96703c44.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/YifanXu74/Evo-ViT/blob/4c5d9b30b0a3c9b1e7b8687a9490555bd9d714ca/levit/evo_levit.py | Evo-Levit_256_384/levit/evo_levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-128-b88c2750.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/YifanXu74/Evo-ViT/blob/4c5d9b30b0a3c9b1e7b8687a9490555bd9d714ca/levit/evo_levit.py | Evo-Levit_256_384/levit/evo_levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-192-92712e41.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/YifanXu74/Evo-ViT/blob/4c5d9b30b0a3c9b1e7b8687a9490555bd9d714ca/levit/evo_levit.py | Evo-Levit_256_384/levit/evo_levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-256-13b5763e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/YifanXu74/Evo-ViT/blob/4c5d9b30b0a3c9b1e7b8687a9490555bd9d714ca/levit/evo_levit.py | Evo-Levit_256_384/levit/evo_levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-384-9bdaf2e2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/YifanXu74/Evo-ViT/blob/4c5d9b30b0a3c9b1e7b8687a9490555bd9d714ca/levit/evo_levit_384.py | Evo-Levit_256_384/levit/evo_levit_384.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-128S-96703c44.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/YifanXu74/Evo-ViT/blob/4c5d9b30b0a3c9b1e7b8687a9490555bd9d714ca/levit/evo_levit_384.py | Evo-Levit_256_384/levit/evo_levit_384.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-128-b88c2750.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/YifanXu74/Evo-ViT/blob/4c5d9b30b0a3c9b1e7b8687a9490555bd9d714ca/levit/evo_levit_384.py | Evo-Levit_256_384/levit/evo_levit_384.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-192-92712e41.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/YifanXu74/Evo-ViT/blob/4c5d9b30b0a3c9b1e7b8687a9490555bd9d714ca/levit/evo_levit_384.py | Evo-Levit_256_384/levit/evo_levit_384.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-256-13b5763e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/YifanXu74/Evo-ViT/blob/4c5d9b30b0a3c9b1e7b8687a9490555bd9d714ca/levit/evo_levit_384.py | Evo-Levit_256_384/levit/evo_levit_384.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-384-9bdaf2e2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/YifanXu74/Evo-ViT/levit/losses_levit.py | Evo-Levit_256_384/levit/losses_levit.py | https://github.com/peterliht/knowledge-distillation-pytorch/blob/master/model/net.py#L100 | 源码实现 | -| 开发引入 | / | Evo-Levit_256_384/main_levit.py | https://github.com/NVIDIA/apex/tree/master/examples/imagenet | 源码实现 | +| 文件位置 | 公网地址 | 公网地址用途 | +|---------------------------------------------------------------------------------------------|--------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Evo-Levit_256_384/levit/evo_levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-384-9bdaf2e2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Evo-Levit_256_384/levit/evo_levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-256-13b5763e.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Evo-Levit_256_384/levit/evo_levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-192-92712e41.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Evo-Levit_256_384/levit/evo_levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-128S-96703c44.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Evo-Levit_256_384/levit/evo_levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-128-b88c2750.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Evo-Levit_256_384/levit/evo_levit_384.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-384-9bdaf2e2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Evo-Levit_256_384/levit/evo_levit_384.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-256-13b5763e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Evo-Levit_256_384/levit/evo_levit_384.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-192-92712e41.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Evo-Levit_256_384/levit/evo_levit_384.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-128S-96703c44.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Evo-Levit_256_384/levit/evo_levit_384.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-128-b88c2750.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/classification/FixRes/public_address_statement.md b/PyTorch/contrib/cv/classification/FixRes/public_address_statement.md index 34e4dc52fb07337f585ae42876774a5aebffad01..c5dafd6c657bb33e60a13f575c1cde8c74822e8e 100644 --- a/PyTorch/contrib/cv/classification/FixRes/public_address_statement.md +++ b/PyTorch/contrib/cv/classification/FixRes/public_address_statement.md @@ -1,45 +1,27 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------------------------|----------------------------------------------|------------------------|----| -| 开源代码引入 | https://github.com/facebookresearch/FixRes/blob/c9be6acc7a6b32f896e62c28a97c20c2348327d3/hubconf.py | FixRes/hubconf.py | https://dl.fbaipublicfiles.com/FixRes_data/FixRes_Pretrained_Models/ResNet50_v2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/FixRes/blob/c9be6acc7a6b32f896e62c28a97c20c2348327d3/hubconf.py | FixRes/hubconf.py | https://dl.fbaipublicfiles.com/FixRes_data/FixRes_Pretrained_Models/ResNet50_CutMix_v2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/FixRes/blob/c9be6acc7a6b32f896e62c28a97c20c2348327d3/hubconf.py | FixRes/hubconf.py | https://dl.fbaipublicfiles.com/FixRes_data/FixRes_Pretrained_Models/ResNext101_32x48d_v2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/FixRes/blob/c9be6acc7a6b32f896e62c28a97c20c2348327d3/imnet_evaluate/pnasnet.py | FixRes/imnet_evaluate/pnasnet.py | http://data.lip6.fr/cadene/pretrainedmodels/pnasnet5large-bf079911.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/FixRes/blob/c9be6acc7a6b32f896e62c28a97c20c2348327d3/imnet_evaluate/Res.py | FixRes/imnet_evaluate/Res.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/FixRes/blob/c9be6acc7a6b32f896e62c28a97c20c2348327d3/imnet_evaluate/Res.py | FixRes/imnet_evaluate/Res.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/FixRes/blob/c9be6acc7a6b32f896e62c28a97c20c2348327d3/imnet_evaluate/Res.py | FixRes/imnet_evaluate/Res.py | 
https://download.pytorch.org/models/resnet50-19c8e357.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/FixRes/blob/c9be6acc7a6b32f896e62c28a97c20c2348327d3/imnet_evaluate/Res.py | FixRes/imnet_evaluate/Res.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/FixRes/blob/c9be6acc7a6b32f896e62c28a97c20c2348327d3/imnet_evaluate/Res.py | FixRes/imnet_evaluate/Res.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/FixRes/blob/c9be6acc7a6b32f896e62c28a97c20c2348327d3/imnet_evaluate/Res.py | FixRes/imnet_evaluate/Res.py | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/FixRes/blob/c9be6acc7a6b32f896e62c28a97c20c2348327d3/imnet_evaluate/Res.py | FixRes/imnet_evaluate/Res.py | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/FixRes/blob/c9be6acc7a6b32f896e62c28a97c20c2348327d3/imnet_evaluate/resnext_wsl.py | FixRes/imnet_evaluate/resnext_wsl.py | https://download.pytorch.org/models/ig_resnext101_32x8-c38310e5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/FixRes/blob/c9be6acc7a6b32f896e62c28a97c20c2348327d3/imnet_evaluate/resnext_wsl.py | FixRes/imnet_evaluate/resnext_wsl.py | https://download.pytorch.org/models/ig_resnext101_32x16-c6f796b0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/FixRes/blob/c9be6acc7a6b32f896e62c28a97c20c2348327d3/imnet_evaluate/resnext_wsl.py | FixRes/imnet_evaluate/resnext_wsl.py | https://download.pytorch.org/models/ig_resnext101_32x32-e4b90b00.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/FixRes/blob/c9be6acc7a6b32f896e62c28a97c20c2348327d3/imnet_evaluate/resnext_wsl.py | FixRes/imnet_evaluate/resnext_wsl.py | https://download.pytorch.org/models/ig_resnext101_32x48-3e41cc8a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/FixRes/blob/c9be6acc7a6b32f896e62c28a97c20c2348327d3/imnet_finetune/pnasnet.py | FixRes/imnet_finetune/pnasnet.py | http://data.lip6.fr/cadene/pretrainedmodels/pnasnet5large-bf079911.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/FixRes/blob/c9be6acc7a6b32f896e62c28a97c20c2348327d3/imnet_finetune/Res.py | FixRes/imnet_finetune/Res.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/FixRes/blob/c9be6acc7a6b32f896e62c28a97c20c2348327d3/imnet_finetune/Res.py | FixRes/imnet_finetune/Res.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/FixRes/blob/c9be6acc7a6b32f896e62c28a97c20c2348327d3/imnet_finetune/Res.py | FixRes/imnet_finetune/Res.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/FixRes/blob/c9be6acc7a6b32f896e62c28a97c20c2348327d3/imnet_finetune/Res.py | FixRes/imnet_finetune/Res.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/FixRes/blob/c9be6acc7a6b32f896e62c28a97c20c2348327d3/imnet_finetune/Res.py | FixRes/imnet_finetune/Res.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/FixRes/blob/c9be6acc7a6b32f896e62c28a97c20c2348327d3/imnet_finetune/Res.py | FixRes/imnet_finetune/Res.py | 
https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/FixRes/blob/c9be6acc7a6b32f896e62c28a97c20c2348327d3/imnet_finetune/Res.py | FixRes/imnet_finetune/Res.py | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/FixRes/blob/c9be6acc7a6b32f896e62c28a97c20c2348327d3/imnet_finetune/resnext_wsl.py | FixRes/imnet_finetune/resnext_wsl.py | https://download.pytorch.org/models/ig_resnext101_32x8-c38310e5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/FixRes/blob/c9be6acc7a6b32f896e62c28a97c20c2348327d3/imnet_finetune/resnext_wsl.py | FixRes/imnet_finetune/resnext_wsl.py | https://download.pytorch.org/models/ig_resnext101_32x16-c6f796b0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/FixRes/blob/c9be6acc7a6b32f896e62c28a97c20c2348327d3/imnet_finetune/resnext_wsl.py | FixRes/imnet_finetune/resnext_wsl.py | https://download.pytorch.org/models/ig_resnext101_32x32-e4b90b00.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/FixRes/blob/c9be6acc7a6b32f896e62c28a97c20c2348327d3/imnet_finetune/resnext_wsl.py | FixRes/imnet_finetune/resnext_wsl.py | https://download.pytorch.org/models/ig_resnext101_32x48-3e41cc8a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/FixRes/imnet_finetune/resnext_wsl.py | FixRes/imnet_finetune/resnext_wsl.py | https://github.com/facebookresearch/WSL-Images/blob/master/hubconf.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/FixRes/imnet_evaluate/train.py | FixRes/imnet_evaluate/train.py | http://code.activestate.com/recipes/393090/ | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/FixRes/imnet_extract/Res.py | FixRes/imnet_evaluate/Res.py | https://github.com/pytorch/vision/blob/master/torchvision/models/resnet.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/FixRes/setup.py | FixRes/hubconf.py | https://arxiv.org/abs/1906.06423 | 论文地址 | -| 开发引入 | / | FixRes/imnet_finetune/pnasnet.py | https://arxiv.org/abs/1712.00559 | 论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/FixRes/imnet_finetune/resnext_wsl.py | FixRes/imnet_evaluate/resnext_wsl.py | https://github.com/facebookresearch/WSL-Images/blob/master/hubconf.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/FixRes/imnet_finetune/resnext_wsl.py | FixRes/imnet_evaluate/resnext_wsl.py | https://arxiv.org/abs/1805.00932 | 论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/FixRes/hubconf.py | FixRes/hubconf.py | https://pytorch.org/docs/stable/model_zoo.html | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/FixRes/imnet_finetune/resnext_wsl.py | FixRes/imnet_finetune/resnext_wsl.py | https://arxiv.org/abs/1805.00932 | 论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/FixRes/imnet_finetune/Res.py | FixRes/imnet_evaluate/Res.py | https://arxiv.org/abs/1706.02677 | 论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/FixRes/imnet_finetune/pnasnet.py | FixRes/imnet_finetune/pnasnet.py | https://github.com/Cadene/pretrained-models.pytorch/blob/master/pretrainedmodels/models/pnasnet.py | 源码实现 | -| 开发引入 | / | FixRes/imnet_evaluate/pnasnet.py | https://arxiv.org/abs/1712.00559 | 论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/FixRes/imnet_finetune/Res.py | FixRes/imnet_finetune/Res.py | https://arxiv.org/abs/1706.02677 | 论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/FixRes/imnet_finetune/pnasnet.py | FixRes/imnet_evaluate/pnasnet.py | 
https://github.com/Cadene/pretrained-models.pytorch/blob/master/pretrainedmodels/models/pnasnet.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/FixRes/README_FixEfficientNet.md | FixRes/imnet_finetune/train.py | https://github.com/rwightman/pytorch-image-models | 源码实现 | -| 开发引入 | / | FixRes/imnet_finetune/train.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/efficientnet.py | 源码实现 | +| 文件位置 | 公网地址 | 公网地址用途 | +|-----------------------------------------------------------------------------------------|----------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/FixRes/hubconf.py | https://dl.fbaipublicfiles.com/FixRes_data/FixRes_Pretrained_Models/ResNext101_32x48d_v2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/FixRes/hubconf.py | https://dl.fbaipublicfiles.com/FixRes_data/FixRes_Pretrained_Models/ResNet50_CutMix_v2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/FixRes/hubconf.py | https://dl.fbaipublicfiles.com/FixRes_data/FixRes_Pretrained_Models/ResNet50_v2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/FixRes/imnet_evaluate/Res.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/FixRes/imnet_evaluate/Res.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/FixRes/imnet_evaluate/Res.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/FixRes/imnet_evaluate/Res.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/FixRes/imnet_evaluate/Res.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/FixRes/imnet_evaluate/Res.py | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/FixRes/imnet_evaluate/Res.py | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/FixRes/imnet_evaluate/resnext_wsl.py | https://download.pytorch.org/models/ig_resnext101_32x8-c38310e5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/FixRes/imnet_evaluate/resnext_wsl.py | https://download.pytorch.org/models/ig_resnext101_32x48-3e41cc8a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/FixRes/imnet_evaluate/resnext_wsl.py | https://download.pytorch.org/models/ig_resnext101_32x32-e4b90b00.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/FixRes/imnet_evaluate/resnext_wsl.py | https://download.pytorch.org/models/ig_resnext101_32x16-c6f796b0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/FixRes/imnet_finetune/Res.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/FixRes/imnet_finetune/Res.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/FixRes/imnet_finetune/Res.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/FixRes/imnet_finetune/Res.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/classification/FixRes/imnet_finetune/Res.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/FixRes/imnet_finetune/Res.py | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/FixRes/imnet_finetune/Res.py | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/FixRes/imnet_finetune/resnext_wsl.py | https://download.pytorch.org/models/ig_resnext101_32x8-c38310e5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/FixRes/imnet_finetune/resnext_wsl.py | https://download.pytorch.org/models/ig_resnext101_32x48-3e41cc8a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/FixRes/imnet_finetune/resnext_wsl.py | https://download.pytorch.org/models/ig_resnext101_32x32-e4b90b00.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/FixRes/imnet_finetune/resnext_wsl.py | https://download.pytorch.org/models/ig_resnext101_32x16-c6f796b0.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/classification/GENet_for_Pytorch/public_address_statement.md b/PyTorch/contrib/cv/classification/GENet_for_Pytorch/public_address_statement.md index ababd17f9065d263489ce3b8c504a175aab4aedd..53c6d48be5222cc1aef10a081a7e3dc36e32454a 100644 --- a/PyTorch/contrib/cv/classification/GENet_for_Pytorch/public_address_statement.md +++ b/PyTorch/contrib/cv/classification/GENet_for_Pytorch/public_address_statement.md @@ -1,3 +1,4 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|-------------------------------------------------------------------------------------------------------------------|-------------------------------------------|------------------------|--------| -| 开发引入 | / | url.ini | https://i.loli.net/2021/09/01/aOH5YWoq1A4829D.png | 下载测试图片 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|-------------------------------------------------------------------------------|---------------------------------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GENet_for_Pytorch/train.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GENet_for_Pytorch/url.ini | https://i.loli.net/2021/09/01/aOH5YWoq1A4829D.png | 数据集链接 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/classification/GhostNet/public_address_statement.md b/PyTorch/contrib/cv/classification/GhostNet/public_address_statement.md index 22e901b0dfcb21fdc8f53a0986199917af76dd14..fbd32493dcb52a89c9ffcc2dbf9a54607fe91b0b 100644 --- a/PyTorch/contrib/cv/classification/GhostNet/public_address_statement.md +++ b/PyTorch/contrib/cv/classification/GhostNet/public_address_statement.md @@ -1,572 +1,84 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|----------------------------------------------------------------------------------------------------------------------------------|----------------------------------------------------|-----------------------|--------| -| 开发引入 | / | GhostNet/url.ini | https://download.pytorch.org/models/resnet18-5c106cde.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/layers/drop.py | https://github.com/tensorflow/tpu/blob/master/models/official/resnet/resnet_model.py#L74 | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gluon_xception-7015a15c.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1b-3b017079.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest50_fast_1s4x24d-d4a4f76f.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_D_Green_50ms_77.4_23e3cdde.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/optim/lookahead.py | https://arxiv.org/abs/1907.08610 | 论文地址 | -| 开发引入 | / | GhostNet/url.ini | http://dl.yf.io/dla/models/imagenet/dla102x-ad62be81.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | http://data.lip6.fr/cadene/pretrainedmodels/senet154-c7b49a05.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://dl.fbaipublicfiles.com/deit/deit_tiny_distilled_patch16_224-b40b3cf7.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/layers/cbam.py | https://arxiv.org/abs/1807.06521 | 论文地址 | -| 开发引入 | / | GhostNet/url.ini | https://storage.googleapis.com/bit_models/BiT-M-R101x1-ILSVRC2012.npz | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/vgg.py | https://github.com/pytorch/vision | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_resnet50_384-9fd3c705.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/resnest.py | https://github.com/zhanghang1989/ResNeSt/blob/master/ablation.md | 相关说明 | -| 开发引入 | / | GhostNet/url.ini | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet50-16a12f1b.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/nfnet.py | https://arxiv.org/abs/2101.08692 | 论文地址 | -| 开发引入 | / | GhostNet/timm/models/hardcorenas.py | https://arxiv.org/abs/2102.11646 | 论文地址 | -| 开发引入 | / | GhostNet/timm/models/resnet.py | https://github.com/facebookresearch/semi-supervised-ImageNet1K-models | 源码实现 | -| 开发引入 | / | GhostNet/setup.py | hello@rwightman.com | 邮箱地址 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_004-7d0e9424.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_004-0db870e6.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/efficientnet.py | https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet/lite | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1d-0f9c8644.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huawei-noah/CV-Backbones/tree/master/wavemlp_pytorch/THIRD PARTY OPEN SOURCE SOFTWARE NOTICE.txt | GhostNet/timm/models/vision_transformer.py | https://github.com/facebookresearch/deit | 源码实现 | -| 开发引入 | / | GhostNet/timm/models/senet.py | https://github.com/creafz | 源码实现 | -| 开发引入 | / | GhostNet/timm/models/efficientnet.py | https://arxiv.org/abs/1801.04381 | 论文地址 | -| 开发引入 | / | GhostNet/timm/models/resnet.py | https://arxiv.org/pdf/2002.08258.pdf | 论文地址 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/skresnet34_ra-bdc0ccde.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huawei-noah/CV-Backbones/tree/master/wavemlp_pytorch/train.py | GhostNet/train_ghostnet_8p.py | https://github.com/pytorch/examples/tree/master/imagenet | 源码实现 | 
-| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_a2-c1ee6d2b.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/layers/activations_jit.py | https://github.com/digantamisra98/H-Mish/blob/0da20d4bc58e696b6803f2523c58d3c8a82782d0/README.md | 相关说明 | -| 开发引入 | / | GhostNet/timm/models/rexnet.py | https://arxiv.org/abs/2007.00992 | 论文地址 | -| 开发引入 | / | GhostNet/timm/models/xception.py | https://arxiv.org/pdf/1610.02357.pdf | 论文地址 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite4-741542c3.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-dpn-pretrained/releases/download/v0.1/dpn131-71dfe43e0.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b5_ra-9a3e5369.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/byobnet.py | https://arxiv.org/abs/2006.14090 | 论文地址 | -| 开发引入 | / | GhostNet/url.ini| https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gluon_inception_v3-9f746940.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_l2_ns-df73bb44.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/data/random_erasing.py | https://github.com/pytorch/pytorch/issues/19508 | 相关说明 | -| 开发引入 | / | GhostNet/url.ini | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_224-df68dfff.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mixnet_l-6c92e0c8.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-ger-weights/gernet_m-0873c53a.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/DeGirum/pruned-models/releases/download/efficientnet_v1.0/efficientnet_es_pruned75.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_320-8ea38b93.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b3_ap-aad25bdd.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet101d_ra2-2803ffab.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/layers/eca.py | https://arxiv.org/pdf/1910.03151.pdf | 论文地址 | -| 开发引入 | / | GhostNet/timm/models/dla.py | https://github.com/gasvn/Res2Net/ | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w32-90d8c5fb.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/layers/std_conv.py | https://arxiv.org/abs/1903.10520v2 | 论文地址 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1c-a3bb0b98.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f6-e0f12116.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b3-199bc50d.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/tresnet.py | https://github.com/mrT23/TResNet | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/inceptionv4-8e4777a0.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext50_32x4d_racm-a304a460.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/vision_transformer.py | https://github.com/lucidrains/vit-pytorch | 源码实现 | -| 开发引入 | / | GhostNet/timm/models/efficientnet.py | https://arxiv.org/abs/1812.03443 | 论文地址 | -| 开发引入 | / | GhostNet/timm/optim/adamp.py | https://github.com/clovaai/AdamP/blob/master/adamp/adamp.py | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext26tn_32x4d-569cb627.pth | 源码实现 | -| 开发引入 | / | GhostNet/timm/models/efficientnet.py | https://arxiv.org/abs/1907.09595 | 论文地址 | -| 开发引入 | / | GhostNet/url.ini | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x8-b4712904.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w18-8cb57bb9.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net50_26w_6s-19041792.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/optim/rmsprop_tf.py | https://arxiv.org/abs/1711.05101 | 论文地址 | -| 开发引入 | / | GhostNet/url.ini | http://download.tensorflow.org/models/inception_resnet_v2_2016_08_30.tar.gz | 预训练模型 | -| 开发引入 | / | GhostNet/ghostnet/ghostnet_pytorch/ghostnet.py | https://github.com/d-li14/mobilenetv3.pytorch","https://github.com/rwightman/pytorch-image-models | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_seresnext101_32x4d-cf52900d.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huawei-noah/CV-Backbones/tree/master/wavemlp_pytorch/train.py | GhostNet/modelarts/train_start.py | https://github.com/pytorch/examples/tree/master/imagenet | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gluon_resnest26-50eb607c.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnetblur50-84f4748f.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/layers/activations_me.py | https://github.com/digantamisra98/H-Mish/blob/0da20d4bc58e696b6803f2523c58d3c8a82782d0/README.md | 相关说明 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b4_ap-dedb23e6.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/data/mixup.py | https://github.com/clovaai/CutMix-PyTorch | 源码实现 | -| 开发引入 | / | GhostNet/url.ini| https://github.com/rwightman/pytorch-dpn-pretrained/releases/download/v0.1/dpn107_extra-1ac7121e2.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b0_ra-3dd342df.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet50d_ra2-464e36ba.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b0-80ac3f1b.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini| https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/densenet121_ra-50efcf5c.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_large_075-150ee8b0.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huawei-noah/CV-Backbones/tree/master/wavemlp_pytorch/models/wavemlp.py | GhostNet/timm/models/vision_transformer.py | https://github.com/google-research/vision_transformer | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_008-dc900dbe.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_040-f0d569f9.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/layers/drop.py | https://arxiv.org/abs/1603.09382 | 论文地址 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b3_ns-9d44bf68.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/resnet.py | https://arxiv.org/abs/1905.00546 | 论文地址 | -| 开发引入 | / | GhostNet/timm/scheduler/scheduler.py | https://github.com/pytorch/fairseq/tree/master/fairseq/optim/lr_scheduler | 源码实现 | -| 开发引入 | / | GhostNet/timm/models/inception_v3.py | https://gluon-cv.mxnet.io/model_zoo/classification.html | 相关说明 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1b-c1edb0dd.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/inception_v3.py | http://download.tensorflow.org/models/inception_v3_2016_08_28.tar.gz | 数据集地址 | -| 开发引入 | / | GhostNet/timm/models/inception_v3.py | http://download.tensorflow.org/models/adv_inception_v3_2017_08_18.tar.gz | 数据集地址 | -| 开发引入 | / | GhostNet/timm/utils/agc.py | https://gist.github.com/lucidrains/0d6560077edac419ab5d3aa29e674d5c | 相关说明 | -| 开发引入 | / | GhostNet/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net50_26w_8s-2c7c9f12.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_320-ba464b29.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f5-ecb20ab1.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb2_pruned_203f55bc.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1d-818a1b1b.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/layers/activations.py | https://arxiv.org/abs/1710.05941 | 论文地址 | -| 开发引入 | / | GhostNet/timm/models/layers/weight_init.py | https://people.sc.fsu.edu/~jburkardt/presentations/truncated_normal.pdf | 论文地址 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-selecsls/selecsls60-bbf87526.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-dpn-pretrained/releases/download/v0.1/dpn98-5b90dec4d.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini| https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv3_100-35495452.pth | 预训练模型 
| -| 开源代码引入 | https://github.com/huawei-noah/CV-Backbones/tree/master/wavemlp_pytorch/models/wavemlp.py | GhostNet/timm/models/vision_transformer.py | https://github.com/karpathy/minGPT | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | http://dl.yf.io/dla/models/imagenet/dla60x_c-b870c45c.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/resnet.py | https://arxiv.org/abs/1805.00932 | 论文地址 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest50_fast_4s2x40d-41d14ed0.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mnasnet_a1-d9418771.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/optim/sgdp.py | https://github.com/clovaai/AdamP/blob/master/adamp/sgdp.py | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_seresnext101_64x4d-f9926f93.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv2_140_ra-21a4e913.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/optim/novograd.py | https://github.com/convergence-lab/novograd | 源码实现 | -| 开发引入 | / | GhostNet/timm/optim/nadam.py | http://cs229.stanford.edu/proj2015/054_report.pdf | 论文地址 | -| 开发引入 | / | GhostNet/url.ini | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet18-118f1556.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rexnet/rexnetv1_200-8c0b7f2d.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net50_26w_4s-06e79181.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f3-d74ab3aa.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_006-c67e57ec.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_064-0a48325c.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/senet.py | https://github.com/hujie-frank/SENet | 源码实现 | -| 开源代码引入 | https://github.com/huawei-noah/CV-Backbones/tree/master/wavemlp_pytorch/train.py | GhostNet/train_ghostnet_1p.py | https://github.com/NVIDIA/apex/tree/master/examples/imagenet | 源码实现 | -| 开发引入 | / | GhostNet/timm/models/vovnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ese_vovnet19b_dw-a8741004.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b3g4-73c370bf.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mixnet_s-a907afbc.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-S-R152x4.npz | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b1_aa-ea7a6ee0.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-selecsls/selecsls60b-94e619b5.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | http://dl.yf.io/dla/models/imagenet/dla102-d94d9790.pth | 预训练模型 | -| 开发引入 | / | 
GhostNet/timm/optim/adafactor.py | https://arxiv.org/abs/1804.04235 | 论文地址 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_xception_41-e6439c97.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/cspdarknet53_ra_256-d05c7c21.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv2_110d_ra-77090ade.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/se_resnet50-ce0d4300.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/efficientnet.py | https://arxiv.org/abs/1904.02877 | 论文地址 | -| 开发引入 | / | GhostNet/timm/models/inception_v4.py | https://github.com/Cadene/tensorflow-model-zoo.torch | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite1-bde8b488.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huawei-noah/CV-Backbones/tree/master/versatile_filters/vcnn.py | GhostNet/timm/models/vgg.py | https://download.pytorch.org/models/vgg16_bn-6c64b313.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ens_adv_inception_resnet_v2-2592a550.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/layers/cond_conv2d.py | https://github.com/tensorflow/tpu/blob/master/models/official/efficientnet/condconv/condconv_layers.py | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_seresnext50_32x4d-90cf2d6e.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/optim/adamp.py | https://arxiv.org/abs/2006.08217 | 论文地址 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rexnet/rexnetv1_100-1b4dddf4.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/data/auto_augment.py | https://arxiv.org/abs/1805.09501 | 论文地址 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite0-0aa007d2.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/features.py | https://github.com/pytorch/vision/blob/d88d8961ae51507d0cb680329d985b1488b1b76b/torchvision/models/_utils.py | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f2-89875923.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite3-b733e338.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite2-dcccb7df.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/inception_resnet_v2.py | https://github.com/tensorflow/models/tree/master/research/adv_imagenet_models | 源码实现 | -| 开发引入 | / | GhostNet/timm/data/auto_augment.py | https://github.com/tensorflow/tpu/blob/master/models/official/efficientnet/autoaugment.py | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w40-7cd397a4.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/layers/std_conv.py | https://arxiv.org/abs/2101.08692 | 论文地址 | -| 开发引入 | / | GhostNet/timm/models/efficientnet_builder.py | 
https://github.com/facebookresearch/maskrcnn-benchmark/blob/master/maskrcnn_benchmark/modeling/backbone/fbnet_builder.py | 源码实现 | -| 开发引入 | / | GhostNet/timm/models/layers/activations.py | https://github.com/digantamisra98/H-Mish/blob/0da20d4bc58e696b6803f2523c58d3c8a82782d0/README.md | 相关说明 | -| 开发引入 | / | GhostNet/timm/models/mobilenetv3.py | https://arxiv.org/abs/1905.02244 | 论文地址 | -| 开发引入 | / | GhostNet/url.ini | https://storage.googleapis.com/bit_models/BiT-M-R50x1-ILSVRC2012.npz | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mixnet_xl_ra-aac3c00c.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/optim/rmsprop_tf.py | https://github.com/pytorch/pytorch/blob/063946d2b3f3f1e953a2a3b54e0b34f1393de295/torch/optim/rmsprop.py | 源码实现 | -| 开发引入 | / | GhostNet/timm/models/resnet.py | https://github.com/facebookresearch/WSL-Images | 源码实现 | -| 开发引入 | / | GhostNet/timm/models/layers/inplace_abn.py | https://github.com/mapillary/inplace_abn.git@v1.0.12 | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet18d_ra2-48a79e06.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/efficientnet_builder.py | https://github.com/tensorflow/tpu/blob/master/models/official/mnasnet/mnasnet_models.py | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecaresnet26t_ra2-46609757.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_large_100-427764d5.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_384-d0272ac0.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/cspnet.py | https://arxiv.org/abs/1911.11929 | 论文地址 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/skresnext50_ra-f40e40bf.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huawei-noah/CV-Backbones/tree/master/versatile_filters/vcnn.py | GhostNet/timm/models/vgg.py | https://download.pytorch.org/models/vgg19-dcbb9e9d.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://download.pytorch.org/models/ig_resnext101_32x48-3e41cc8a.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_cc_b0_4e-4362b6b2.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_E_Green_55ms_77.9_90f20e8a.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x16-15fffa57.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/optim/adamw.py | https://arxiv.org/abs/1412.6980 | 论文地址 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/inception_resnet_v2-940b1cd6.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/optim/adafactor.py | https://github.com/pytorch/fairseq/blob/master/fairseq/optim/adafactor.py | 源码实现 | -| 开发引入 | / | GhostNet/timm/models/efficientnet.py | https://github.com/tensorflow/tpu/tree/master/models/official/mnasnet | 源码实现 | -| 开源代码引入 | https://github.com/huawei-noah/CV-Backbones/tree/master/wavemlp_pytorch/train.py | GhostNet/train_ghostnet_1p.py | 
https://github.com/pytorch/examples/tree/master/imagenet | 源码实现 | -| 开发引入 | / | GhostNet/timm/optim/adamp.py | https://github.com/clovaai/AdamP | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b8_ra-572d5dd9.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_p32_384-830016f5.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_002-e68ca334.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huawei-noah/CV-Backbones/tree/master/versatile_filters/vcnn.py | GhostNet/timm/models/vgg.py | https://download.pytorch.org/models/vgg19_bn-c79401a0.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnext101_64x4d-f9a8e184.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/optim/rmsprop_tf.py | https://arxiv.org/pdf/1308.0850v5.pdf | 论文地址 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnet34-a4004e63.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://download.pytorch.org/models/inception_v3_google-1a9a5a14.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/efficientnet.py | https://arxiv.org/pdf/1807.11626.pdf | 论文地址 | -| 开发引入 | / | GhostNet/timm/models/inception_resnet_v2.py | https://arxiv.org/abs/1705.07204 | 论文地址 | -| 开发引入 | / | GhostNet/url.ini | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNetLight_4f34b35b.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_A_Green_38ms_75.9_23474aeb.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/data/parsers/parser_tfds.py | https://github.com/tensorflow/datasets | 源码实现 | -| 开发引入 | / | GhostNet/timm/models/layers/eca.py | https://github.com/pytorch/pytorch/pull/17240 | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mixnet_s-89d3354b.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet26d-69e92c46.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/gluon_resnet.py | https://github.com/dmlc/gluon-cv/blob/master/gluoncv/model_zoo/resnet.py | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mnasnet_b1-74cb7081.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://download.pytorch.org/models/densenet201-c1103571.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/optim/rmsprop_tf.py | https://github.com/pytorch/pytorch/blob/master/LICENSE | license地址 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_p16_384-83fb41ba.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huawei-noah/CV-Backbones/tree/master/wavemlp_pytorch/train.py | GhostNet/modelarts/train_start.py | https://github.com/rwightman | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_120-65d5521e.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/fbnetc_100-c345b898.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet34d_ra2-f8dcfcaf.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/layers/drop.py | https://github.com/clovaai/assembled-cnn/blob/master/nets/blocks.py | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/vit_small_p16_224-15ec54c9.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/optim/sgdp.py | https://arxiv.org/abs/2006.08217 | 论文地址 | -| 开发引入 | / | GhostNet/timm/models/layers/split_attn.py | https://arxiv.org/abs/2004.08955 | 论文地址 | -| 开发引入 | / | GhostNet/timm/models/vovnet.py | https://github.com/youngwanLEE/vovnet-detectron2 | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2next_dla60_4s-d327927b.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_032-ed0c7f7e.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/densenet.py | https://github.com/pytorch/vision | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1s-dcc41b81.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/optim/adahessian.py | https://github.com/davda54/ada-hessian/blob/master/ada_hessian.py | 源码实现 | -| 开发引入 | / | GhostNet/timm/models/dpn.py | https://github.com/cypw/DPNs | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/densenetblur121d_ra-100dcfbc.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1c-1f26822a.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-dpn-pretrained/releases/download/v0.1/dpn92_extra-b040e4a9b.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/vovnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ese_vovnet39b-f912fe73.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnext101_32x4d-b253c8c4.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-S-R152x2.npz | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/xception.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/xception-43020ad28.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huawei-noah/CV-Backbones/tree/master/versatile_filters/vcnn.py | GhostNet/timm/models/vgg.py | https://download.pytorch.org/models/vgg13-c768596a.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://storage.googleapis.com/bit_models/BiT-M-R152x4.npz | 预训练模型 | -| 开发引入 | / | GhostNet/timm/scheduler/cosine_lr.py | https://github.com/allenai/allennlp/blob/master/allennlp/training/learning_rate_schedulers/cosine.py | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_120-721ba79a.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/DeGirum/pruned-models/releases/download/efficientnet_v1.0/efficientnet_el_pruned70.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet34-43635321.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b1-77ca2989.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w30-8d7f8dab.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/layers/cond_conv2d.py | https://github.com/pytorch/pytorch/issues/17983 | 相关说明 | -| 开发引入 | / | GhostNet/url.ini | https://download.pytorch.org/models/ig_resnext101_32x32-e4b90b00.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/layers/activations_jit.py | https://arxiv.org/abs/1908.08681 | 论文地址 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv3_large_100_ra-f55367f5.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/optim/npu_fused_sgd.py | http://www.cs.toronto.edu/%7Ehinton/absps/momentum.pdf | 论文地址 | -| 开发引入 | / | GhostNet/timm/models/efficientnet.py | https://github.com/tensorflow/tpu/blob/master/models/official/efficientnet/efficientnet_model.py | 源码实现 | -| 开发引入 | / | GhostNet/timm/models/inception_v3.py | https://github.com/pytorch/vision/blob/master/LICENSE | license地址 | -| 开发引入 | / | GhostNet/timm/models/tresnet.py | https://arxiv.org/pdf/2003.13630.pdf | 论文地址 | -| 开发引入 | / | GhostNet/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-S-R101x3.npz | 预训练模型 | -| 开发引入 | / | GhostNet/timm/data/auto_augment.py | https://arxiv.org/abs/1909.13719 | 论文地址 | -| 开源代码引入 | https://github.com/huawei-noah/CV-Backbones/tree/master/versatile_filters/vcnn.py | GhostNet/timm/models/vgg.py | https://download.pytorch.org/models/vgg11_bn-6002323d.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnext50_32x4d_ra-d733960d.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b7_ap-ddb28fec.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_large_p32_384-9b920ba8.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huawei-noah/CV-Backbones/tree/master/wavemlp_pytorch/train.py | GhostNet/train_ghostnet_8p.py | https://github.com/rwightman | 源码实现 | -| 开发引入 | / | GhostNet/timm/models/efficientnet_builder.py | https://github.com/tensorflow/tpu/blob/master/models/official/efficientnet/efficientnet_model.py | 源码实现 | -| 开发引入 | / | GhostNet/timm/models/efficientnet.py | https://arxiv.org/abs/1911.09665 | 论文地址 | -| 开发引入 | / | GhostNet/timm/models/efficientnet.py | https://arxiv.org/abs/1807.11626 | 论文地址 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b7_ns-1dbc32de.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/vovnet.py | https://arxiv.org/abs/1911.06667 | 论文地址 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-ger-weights/gernet_l-f31e2e8d.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://storage.googleapis.com/bit_models/BiT-M-R50x3-ILSVRC2012.npz | 预训练模型 | -| 开发引入 | / | GhostNet/timm/scheduler/cosine_lr.py | https://arxiv.org/abs/1608.03983 | 论文地址 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b1_ap-44ef0a3d.pth | 预训练模型 | -| 开发引入 | / | 
GhostNet/timm/models/hardcorenas.py | https://github.com/Alibaba-MIIL/HardCoReNAS | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b2g4-165a85f2.pth | 源码实现 | -| 开发引入 | / | GhostNet/timm/models/xception_aligned.py | https://github.com/tensorflow/models/blob/master/research/deeplab/g3doc/model_zoo.md | 相关说明 | -| 开发引入 | / | GhostNet/url.ini | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_F_Green_60ms_78.1_2855edf1.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest101-22405ba7.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/pnasnet5large-bf079911.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-selecsls/selecsls42b-8af30141.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_xception_65-c9ae96e8.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_large_p16_224-4ee7a4dc.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b1-533bc792.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b2-25b7494e.pth | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_cc_b0_8e-66184a25.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://storage.googleapis.com/bit_models/BiT-M-R101x1.npz | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/cspresnet50_ra-d3e8d487.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_016-65ca972a.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | http://data.lip6.fr/cadene/pretrainedmodels/nasnetalarge-a1897284.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv2_120d_ra-5987e2ed.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/res2net.py | https://github.com/gasvn/Res2Net/ | 源码实现 | -| 开发引入 | / | GhostNet/timm/models/byobnet.py | https://github.com/DingXiaoH/RepVGG | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b8_ap-00e169fa.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/layers/activations_me.py | https://arxiv.org/abs/1908.08681 | 论文地址 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_el-5143854e.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnet50_ra_224-8efdb4bb.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | http://dl.yf.io/dla/models/imagenet/dla60-24839fc4.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext50_32x4-72679e44.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b1g4-abde5d92.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | http://dl.yf.io/dla/models/imagenet/dla102x2-262837b6.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/densenet.py | https://arxiv.org/pdf/1707.06990.pdf | 论文地址 | -| 开发引入 | / | GhostNet/timm/models/layers/drop.py | https://arxiv.org/abs/1810.12890 | 论文地址 | -| 开发引入 | / | GhostNet/timm/data/real_labels.py | https://arxiv.org/abs/2006.07159 | 论文地址 | -| 开发引入 | / | GhostNet/timm/models/byobnet.py | https://arxiv.org/abs/2101.03697 | 论文地址 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/wide_resnet50_racm-8234f177.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://storage.googleapis.com/bit_models/BiT-M-R152x2.npz | 预训练模型 | -| 开发引入 | / | GhostNet/timm/optim/rmsprop_tf.py | http://www.cs.toronto.edu/~tijmen/csc321/slides/lecture_slides_lec6.pdf | 论文地址 | -| 开发引入 | / | GhostNet/timm/models/dla.py | https://arxiv.org/abs/1707.06484 | 论文地址 | -| 开发引入 | / | GhostNet/infer/data/imagenet1000_clsidx_to_labels.names | https://gist.github.com/yrevar/942d3a0ac09ec9e5eb3a | 相关说明 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext26d_32x4d-80fa48a3.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/layers/eca.py | https://github.com/BangguWu/ECANet | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f0-604f9c3a.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/dla.py | https://arxiv.org/abs/1904.01169 | 论文地址 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnet18-4bb0ce65.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net_dla60_4s-d88db7f9.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/sknet.py | https://arxiv.org/abs/1903.06586 | 论文地址 | -| 开发引入 | / | GhostNet/timm/models/senet.py | https://github.com/Cadene/pretrained-models.pytorch/blob/master/pretrainedmodels/models/senet.py | 源码实现 | -| 开发引入 | / | GhostNet/timm/utils/misc.py | http://www.codinghorror.com/blog/archives/001018.html | 相关说明 | -| 开发引入 | / | GhostNet/timm/data/mixup.py | https://arxiv.org/abs/1905.04899 | 论文地址 | -| 开发引入 | / | GhostNet/timm/optim/nvnovograd.py | https://arxiv.org/abs/1905.11286 | 论文地址 | -| 开发引入 | / | GhostNet/url.ini | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb3_pruned_5abcc29f.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x8-2cfe2f8b.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_es_ra-f111e99c.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/layers/cond_conv2d.py | https://arxiv.org/abs/1904.04971 | 论文地址 | -| 开源代码引入 | https://github.com/huawei-noah/CV-Backbones/tree/master/wavemlp_pytorch/train.py | GhostNet/train_ghostnet_8p.py | https://github.com/NVIDIA/apex/tree/master/examples/imagenet | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b2_aa-60c94f97.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | 
https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnext50_32x4d-e6a097c1.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/selecsls.py | https://arxiv.org/abs/1907.00837 | 论文地址 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-tresnet/tresnet_l_448-940d0cd1.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/utils/agc.py | https://github.com/deepmind/deepmind-research/tree/master/nfnets | 源码实现 | -| 开发引入 | / | GhostNet/timm/data/auto_augment.py | https://arxiv.org/abs/1912.02781 | 论文地址 | -| 开源代码引入 | https://github.com/huawei-noah/CV-Backbones/tree/master/wavemlp_pytorch/train.py | GhostNet/modelarts/train_start.py | https://github.com/NVIDIA/apex/tree/master/examples/imagenet | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b2_ns-00306e48.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b6_aa-80ba17e4.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_160-c98c4112.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_006-85ec1baa.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45610/outputs/ECAResNet101D_P_75a3370e.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huawei-noah/CV-Backbones/tree/master/cmt_pytorch/cmt.py | GhostNet/timm/models/vision_transformer.py | https://github.com/google-research/vision_transformer/blob/00883dd691c63a6830751563748663526e811cee/vit_jax/checkpoint.py#L224 | 源码实现 | -| 开发引入 | / | GhostNet/timm/models/resnest.py | https://github.com/zhanghang1989/ResNeSt | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet101D_281c5844.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_cc_b1_8e-f7c79ae1.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/optim/novograd.py | https://arxiv.org/abs/1905.11286 | 论文地址 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1c-48092f55.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/gluon_xception.py | https://gluon-cv.mxnet.io/_modules/gluoncv/model_zoo/xception.html | 相关说明 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b5_ns-6f26d0cf.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://storage.googleapis.com/bit_models/BiT-M-R50x3.npz | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://download.pytorch.org/models/densenet121-a639ec97.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnet152d_ra2-04464dd2.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/resnetv2.py | https://github.com/google-research/big_transfer | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1d-bd354e12.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/loss/jsd.py | https://github.com/google-research/augmix/blob/master/imagenet.py | 源码实现 | -| 开发引入 | / | 
GhostNet/url.ini | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1s-1762acc0.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/res2net.py | https://github.com/gasvn/Res2Net/blob/master/res2net.py | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b1_ns-99dd0c41.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/selecsls.py | https://creativecommons.org/licenses/by/4.0/legalcode | 相关说明 | -| 开发引入 | / | GhostNet/url.ini | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_224-b5f2ef4d.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv2_100_ra-b33bc2c4.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rexnet/rexnetv1_150-bd1a6aa8.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/spnasnet_100-048bc3f4.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-S-R50x1.npz | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_em_ra2-66250f76.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/data/mixup.py | https://arxiv.org/abs/1710.09412 | 论文地址 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b5_ap-9e82fae8.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnet_w18_small_v2-4c50a8cb.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/resnet.py | https://arxiv.org/pdf/1812.01187 | 论文地址 | -| 开发引入 | / | GhostNet/timm/models/efficientnet.py | https://arxiv.org/pdf/2002.08258.pdf | 论文地址 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/nf_regnet_b1_256_ra2-ad85cfef.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/data/random_erasing.py | https://arxiv.org/pdf/1708.04896.pdf | 论文地址 | -| 开发引入 | / | GhostNet/timm/models/efficientnet.py | https://arxiv.org/abs/1911.04252 | 论文地址 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_l2_ns_475-bebbd00a.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x4-dc43570a.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gluon_resnest14-9c8fe254.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/dpn68b_ra-a31ca160.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/nasnet.py | https://github.com/Cadene/pretrained-models.pytorch | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet34_v1b-c6d82d59.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/efficientnet.py | https://arxiv.org/abs/1905.11946 | 论文地址 | -| 开发引入 | / | GhostNet/timm/models/sknet.py | https://arxiv.org/abs/2001.06268 | 论文地址 | -| 开发引入 | / | GhostNet/timm/models/gluon_resnet.py | https://github.com/pytorch/vision | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b6_ap-4ffb161f.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | http://data.lip6.fr/cadene/pretrainedmodels/se_resnext101_32x4d-3b2fe3d8.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huawei-noah/CV-Backbones/tree/master/wavemlp_pytorch/train.py | GhostNet/validate_ghostnet.py | https://github.com/rwightman | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | https://storage.googleapis.com/bit_models/BiT-M-R152x4-ILSVRC2012.npz | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/efficientnet_builder.py | https://github.com/tensorflow/tpu/blob/master/models/official/mnasnet/mnasnet_model.py | 源码实现 | -| 开发引入 | / | GhostNet/timm/models/efficientnet.py | https://github.com/facebookresearch/maskrcnn-benchmark/blob/master/maskrcnn_benchmark/modeling/backbone/fbnet_modeldef.py | 源码实现 | -| 开源代码引入 | https://github.com/huawei-noah/CV-Backbones/tree/master/versatile_filters/vcnn.py | GhostNet/timm/models/vgg.py | https://download.pytorch.org/models/vgg16-397923af.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b0_aa-827b6e33.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_em-e78cfe58.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/nfnet.py | https://arxiv.org/abs/2102.06171 | 论文地址 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_es-ca1afbfe.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-tresnet/tresnet_m_448-bc359d10.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/pnasnet.py | https://github.com/Cadene/pretrained-models.pytorch/blob/master/pretrainedmodels/models/pnasnet.py | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b4_ns-d6313a46.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/layers/eca.py | https://github.com/VRandme | 源码实现 | -| 开发引入 | / | GhostNet/timm/models/regnet.py | https://arxiv.org/abs/2003.13678 | 论文地址 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_002-e7e85e5c.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://download.pytorch.org/models/ig_resnext101_32x8-c38310e5.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/nfnet.py | https://github.com/deepmind/deepmind-research/tree/master/nfnets | 源码实现 | -| 开发引入 | / | GhostNet/timm/models/resnetv2.py | https://arxiv.org/abs/1912.11370 | 论文地址 | -| 开发引入 | / | GhostNet/url.ini | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45899/outputs/ECAResNet50D_P_9c67f710.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet18_v1b-0757602b.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/hrnet.py | sunk@mail.ustc.edu.cn | 邮箱地址 | -| 开源代码引入 | https://github.com/huawei-noah/CV-Backbones/tree/master/wavemlp_pytorch/models/wavemlp.py | GhostNet/timm/models/resnetv2.py | https://arxiv.org/abs/2010.11929 | 论文地址 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_patch16_224_in21k-e5005f0a.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest200-75117900.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/utils/agc.py | https://arxiv.org/abs/2102.06171 | 论文地址 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_lite0_ra-37913777.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/data/auto_augment.py | https://github.com/google-research/augmix | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_small_minimal_100-922a7843.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecaresnet269d_320_ra2-7baa55cb.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/pnasnet.py | https://github.com/creafz | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_016-54367f74.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_384-8de9b5d1.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | http://dl.yf.io/dla/models/imagenet/dla60x-d15cacda.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/efficientnet.py | https://github.com/tensorflow/models/blob/master/research/slim/nets/mobilenet/mobilenet_v2.py | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b0_ap-f262efe1.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/data/auto_augment.py | https://github.com/google-research/augmix/blob/master/imagenet.py | 源码实现 | -| 开发引入 | / | GhostNet/timm/scheduler/scheduler.py | https://github.com/allenai/allennlp/tree/master/allennlp/training/learning_rate_schedulers | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext50_32x4-ddb3e555.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_senet154-70a1a3c0.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/nf_resnet50_ra2-9f236009.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huawei-noah/CV-Backbones/tree/master/wavemlp_pytorch/models/wavemlp.py | GhostNet/timm/models/vision_transformer.py | https://arxiv.org/abs/2010.11929 | 论文地址 | -| 开发引入 | / | GhostNet/timm/data/auto_augment.py | https://arxiv.org/abs/1906.11172 | 论文地址 | -| 开发引入 | / | GhostNet/url.ini | https://download.pytorch.org/models/resnet50-19c8e357.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huawei-noah/CV-Backbones/tree/master/wavemlp_pytorch/THIRD PARTY OPEN SOURCE SOFTWARE NOTICE.txt | GhostNet/setup.py | https://github.com/rwightman/pytorch-image-models | 源码实现 | -| 开发引入 | / | GhostNet/timm/models/vgg.py | https://arxiv.org/pdf/1409.1556.pdf | 论文地址 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w44-c9ac8c18.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f1-fc540f82.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://download.pytorch.org/models/densenet169-b2777c0a.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_064-29278baa.pth | 预训练模型 | -| 开源代码引入 | 
https://github.com/huawei-noah/CV-Backbones/tree/master/wavemlp_pytorch/data/myloader.py | GhostNet/timm/data/loader.py | https://github.com/NVIDIA/apex/commit/d5e2bb4bdeedd27b1dfaf5bb2b24d6c000dee9be#diff-cf86c282ff7fba81fad27a559379d5bf | 源码实现 | -| 开发引入 | / | GhostNet/timm/models/resnet.py | https://pytorch.org/hub/facebookresearch_WSL-Images_resnext/ | 相关说明 | -| 开发引入 | / | GhostNet/timm/models/rexnet.py | https://github.com/clovaai/rexnet | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/cspresnext50_ra_224-648b4713.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_large_minimal_100-8596ae28.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/selecsls.py | https://github.com/mehtadushy/SelecSLS-Pytorch | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_B_Green_40ms_76.5_1f882d1e.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest269-0cc87c48.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/data/real_labels.py | https://github.com/google-research/reassessed-imagenet | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnet_w18_small_v1-f460c6bc.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb1_pruned_9ebb3fe6.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_resnet50_224_in21k-6f7c7740.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_inception_v3-e0069de4.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/layers/drop.py | https://arxiv.org/pdf/1810.12890.pdf | 论文地址 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_040-73c2a654.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_large_p16_384-b3be5167.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/layers/selective_kernel.py | https://arxiv.org/abs/1903.06586 | 论文地址 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b2_ra-bcdf34b7.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_p16_224-80ecf9dd.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://storage.googleapis.com/bit_models/BiT-M-R101x3-ILSVRC2012.npz | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/res2net.py | https://arxiv.org/abs/1904.01169 | 论文地址 | -| 开发引入 | / | GhostNet/url.ini | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet18-d92f0530.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b6_ns-51548356.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huawei-noah/CV-Backbones/tree/master/wavemlp_pytorch/models/wavemlp.py | GhostNet/timm/models/resnetv2.py | https://github.com/google-research/vision_transformer | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet26-9aa10e23.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_xception_71-8eec7df1.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/scheduler/tanh_lr.py | https://arxiv.org/abs/1806.01593 | 论文地址 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mixnet_l-5a9a2ed8.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/se_resnet101-7e38fcc6.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-tresnet/tresnet_l_81_5-235b486c.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w64-b47cc881.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mixnet_m-4647fc68.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2next50_4s-6ef7e7bf.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/optim/nvnovograd.py | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechRecognition/Jasper | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | http://dl.yf.io/dla/models/imagenet/dla34-ba72cf86.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet50D_833caf58.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/layers/mixed_conv2d.py | https://github.com/tensorflow/tpu/blob/master/models/official/mnasnet/mixnet/custom_layers.py | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net50_48w_2s-afed724a.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://storage.googleapis.com/bit_models/BiT-M-R50x1.npz | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mixnet_m-0f4d8805.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | http://dl.yf.io/dla/models/imagenet/dla46x_c-d761bae7.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet200d_ra2-bdba9bf9.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/densenet.py | https://arxiv.org/pdf/1608.06993.pdf | 论文地址 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b7_ra-6c08e654.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b4_aa-818f208c.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/layers/split_attn.py | https://github.com/zhanghang1989/ResNeSt | 源码实现 | -| 开发引入 | / | GhostNet/timm/utils/model_ema.py | https://www.tensorflow.org/api_docs/python/tf/train/ExponentialMovingAverage | 相关说明 | -| 开发引入 | / | GhostNet/timm/models/efficientnet.py | https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet/condconv | 源码实现 | -| 开发引入 | / | GhostNet/timm/models/efficientnet.py | https://arxiv.org/abs/1904.04971 | 论文地址 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-tresnet/tresnet_xl_448-8c1815de.pth | 预训练模型 | -| 开源代码引入 | 
https://github.com/huawei-noah/CV-Backbones/tree/master/wavemlp_pytorch/train.py | GhostNet/train_ghostnet_1p.py | https://github.com/rwightman | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest50-528c19ca.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/gluon_xception.py | https://github.com/jfzhang95/pytorch-deeplab-xception | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_small_075-da427f52.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b0_ns-c0e6a31c.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/efficientnet.py | https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet/edgetpu | 源码实现 | -| 开发引入 | / | GhostNet/timm/models/layers/activations_jit.py | https://arxiv.org/abs/1710.05941 | 论文地址 | -| 开发引入 | / | GhostNet/timm/models/inception_resnet_v2.py | https://github.com/Cadene/tensorflow-model-zoo.torch | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x16-f3559a9c.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/layers/se.py | https://arxiv.org/abs/1911.06667 | 论文地址 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-tresnet/tresnet_m_80_8-dbc13962.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/data/parsers/parser_tfds.py | https://pytorch.org/docs/stable/data.html#multi-process-data-loading | 相关说明 | -| 开发引入 | / | GhostNet/url.ini | http://data.lip6.fr/cadene/pretrainedmodels/se_resnext50_32x4d-a260b3a4.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://download.pytorch.org/models/ig_resnext101_32x16-c6f796b0.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | http://download.tensorflow.org/models/ens_adv_inception_resnet_v2_2017_08_18.tar.gz | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1s-60fe0cc1.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w48-abd2e6ab.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huawei-noah/CV-Backbones/tree/master/versatile_filters/vcnn.py | GhostNet/timm/models/vgg.py | https://download.pytorch.org/models/vgg11-bbd30ac9.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_C_Green_44ms_77.1_d4148c9e.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/resnest.py | https://arxiv.org/abs/2004.08955 | 论文地址 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b2_ap-2f8e7636.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_small_100-37f49e2b.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnext50d_32x4d-103e99f8.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet152d_ra2-5cac0439.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/dpn.py | https://github.com/oyam/pytorch-DPNs | 源码实现 | -| 开发引入 | / | GhostNet/timm/data/parsers/parser_tfds.py | 
https://github.com/pytorch/pytorch/issues/33413 | 相关说明 | -| 开发引入 | / | GhostNet/url.ini | https://dl.fbaipublicfiles.com/deit/deit_small_distilled_patch16_224-649709d9.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecaresnet50t_ra2-f7ac63c4.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/DeGirum/pruned-models/releases/download/efficientnet_v1.0/efficientnet_el.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://dl.fbaipublicfiles.com/deit/deit_tiny_patch16_224-a1311bcf.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecanfnet_l0_ra2-e3e9ac50.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/resnet.py | https://github.com/pytorch/vision | 源码实现 | -| 开发引入 | / | GhostNet/timm/optim/adamw.py | https://arxiv.org/abs/1711.05101 | 论文地址 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet50_ram-a26f946b.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/optim/adamw.py | https://openreview.net/forum?id=ryQu7f-RZ | 相关说明 | -| 开源代码引入 | https://github.com/huawei-noah/CV-Backbones/tree/master/wavemlp_pytorch/THIRD PARTY OPEN SOURCE SOFTWARE NOTICE.txt | GhostNet/timm/models/selecsls.py | https://github.com/rwightman/pytorch-image-models | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-tresnet/tresnet_xl_82_0-a2d51b00.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x4-3f87e46b.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/resnetv2.py | https://github.com/KaimingHe/resnet-1k-layers/blob/master/resnet-pre-act.lua | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_large_patch32_224_in21k-9046d2e7.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/cspnet.py | https://github.com/WongKinYiu/CrossStagePartialNetworks | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-dpn-pretrained/releases/download/v0.1/dpn68-66bebafa7.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/inception_resnet_v2.py | https://arxiv.org/abs/1602.07261 | 论文地址 | -| 开发引入 | / | GhostNet/url.ini | http://dl.yf.io/dla/models/imagenet/dla46_c-2bfd52c3.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/pnasnet.py | https://arxiv.org/abs/1712.00559 | 论文地址 | -| 开发引入 | / | GhostNet/timm/optim/nadam.py | https://github.com/pytorch/pytorch/pull/1408 | 源码实现 | -| 开发引入 | / | GhostNet/timm/models/byobnet.py | https://github.com/idstcv/GPU-Efficient-Networks | 源码实现 | -| 开发引入 | / | GhostNet/timm/models/vovnet.py | https://arxiv.org/abs/1904.09730 | 论文地址 | -| 开发引入 | / | GhostNet/timm/optim/radam.py | https://github.com/LiyuanLucasLiu/RAdam | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b3_ra2-cf984f9c.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f4-0ac5b10b.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/hrnet.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开发引入 | / | GhostNet/timm/models/resnet.py | https://github.com/facebookresearch/semi-supervised-ImageNet1K-models/ | 源码实现 | -| 开发引入 | / | GhostNet/timm/models/layers/mixed_conv2d.py | https://arxiv.org/abs/1907.09595 | 论文地址 | -| 
开发引入 | / | GhostNet/timm/data/random_erasing.py | https://github.com/zhunzhong07/Random-Erasing | 源码实现 | -| 开发引入 | / | GhostNet/timm/models/layers/activations_me.py | https://twitter.com/jeremyphoward/status/1188251041835315200 | 相关说明 | -| 开发引入 | / | GhostNet/url.ini | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/efficientnet.py | https://github.com/DeGirum/pruned-models/releases/tag/efficientnet_v1.0 | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | https://dl.fbaipublicfiles.com/deit/deit_small_patch16_224-cd65a155.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/optim/radam.py | https://arxiv.org/abs/1908.03265 | 论文地址 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/skresnet18_ra-4eec2804.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b3_aa-84b4657e.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/optim/nadam.py | http://www.cs.toronto.edu/~fritz/absps/momentum.pdf | 论文地址 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/adv_inception_v3-9e27bd63.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet50-08389792.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/data/parsers/parser_tfds.py | https://www.tensorflow.org/datasets/catalog/overview#image_classification | 相关说明 | -| 开发引入 | / | GhostNet/timm/models/efficientnet.py | https://github.com/tensorflow/tpu/tree/master/models/official/mnasnet/mixnet | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | http://dl.yf.io/dla/models/imagenet/dla169-0914e092.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/layers/activations.py | https://arxiv.org/abs/1908.08681 | 论文地址 | -| 开发引入 | / | GhostNet/url.ini | https://storage.googleapis.com/bit_models/BiT-M-R152x2-ILSVRC2012.npz | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/layers/drop.py | https://github.com/tensorflow/tpu/issues/494#issuecomment-532968956 | 相关说明 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-ger-weights/gernet_s-756b4751.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/optim/sgdp.py | https://github.com/clovaai/AdamP | 源码实现 | -| 开发引入 | / | GhostNet/timm/models/layers/eca.py | https://arxiv.org/abs/1910.03151 | 论文地址 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net101_26w_4s-02a759a1.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rexnet/rexnetv1_130-590d768e.pth | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/regnety_032_ra-7f2439f9.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/loss/jsd.py | https://arxiv.org/abs/1912.02781 | 论文地址 | -| 开发引入 | / | GhostNet/timm/models/sknet.py | https://github.com/clovaai/assembled-cnn | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_160-d64013cd.pth | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/facebookresearch/pycls/blob/master/pycls/models/regnet.py | 源码实现 | -| 开发引入 | / | GhostNet/timm/models/senet.py | https://github.com/pytorch/vision/blob/master/torchvision/models/resnet.py | 源码实现 | -| 开发引入 | / | GhostNet/timm/models/vision_transformer.py | 
https://arxiv.org/abs/2012.12877 | 论文地址 | -| 开发引入 | / | GhostNet/timm/models/hrnet.py | https://github.com/HRNet/HRNet-Image-Classification | 源码实现 | -| 开发引入 | / | GhostNet/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-S-R50x3.npz | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_large_patch16_224_in21k-606da67d.pth | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | https://download.pytorch.org/models/densenet161-8d451a50.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net50_14w_8s-6527dddc.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_patch32_224_in21k-8db57226.pth | 预训练模型 | -| 开源代码引入 | https://github.com/huawei-noah/CV-Backbones/tree/master/versatile_filters/vcnn.py | GhostNet/timm/models/vgg.py | https://download.pytorch.org/models/vgg13_bn-abd245e5.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext26_32x4d-65ebdb501.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1b-0ebe02e2.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/vovnet.py | https://github.com/stigma0617/VoVNet.pytorch/blob/master/models_vovnet/vovnet.py | 源码实现 | -| 开发引入 | / | GhostNet/timm/optim/lookahead.py | https://github.com/alphadl/lookahead.pytorch | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_080-e7f3eb93.pth | 源码实现 | -| 开源代码引入 | https://github.com/huawei-noah/CV-Backbones/tree/master/ghostnet_tensorflow/readme.md | GhostNet/ghostnet/ghostnet_pytorch/ghostnet.py | https://arxiv.org/abs/1911.11907 | 论文地址 | -| 开源代码引入 | https://github.com/huawei-noah/CV-Backbones/tree/master/ghostnet_pytorch/ghostnet.py | GhostNet/ghostnet/ghostnet_pytorch/ghostnet.py | https://github.com/tensorflow/models/blob/master/research/slim/nets/mobilenet/mobilenet.py | 源码实现 | -| 开发引入 | / | GhostNet/timm/models/dla.py | https://github.com/gasvn/Res2Net/blob/master/dla.py | 源码实现 | -| 开源代码引入 | https://github.com/huawei-noah/CV-Backbones/tree/master/wavemlp_pytorch/train.py | GhostNet/ghostnet/ghostnet_pytorch/ghostnet.py | foss@huawei.com | 邮箱地址 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_008-d8b470eb.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/se_resnet152-d17c99b7.pth | 预训练模型 | -| 开发引入 | / | GhostNet/url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_080-7c7fcab1.pth | 预训练模型 | -| 开发引入 | / | GhostNet/timm/models/xception.py | https://github.com/tstandley/Xception-PyTorch | 源码实现 | -| 开发引入 | / | GhostNet/url.ini | https://storage.googleapis.com/bit_models/BiT-M-R101x3.npz | 预训练模型 | - +| 文件位置 | 公网地址 | 公网地址用途 | +|--------------------------------------------------------------------------------|--------------------------------------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/setup.py | hello@rwightman.com | 作者邮箱 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/timm/models/vgg.py | 
https://download.pytorch.org/models/vgg11-bbd30ac9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/timm/models/vgg.py | https://download.pytorch.org/models/vgg13-c768596a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/timm/models/vgg.py | https://download.pytorch.org/models/vgg16-397923af.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/timm/models/vgg.py | https://download.pytorch.org/models/vgg19-dcbb9e9d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/timm/models/vgg.py | https://download.pytorch.org/models/vgg11_bn-6002323d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/timm/models/vgg.py | https://download.pytorch.org/models/vgg13_bn-abd245e5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/timm/models/vgg.py | https://download.pytorch.org/models/vgg16_bn-6c64b313.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/timm/models/vgg.py | https://download.pytorch.org/models/vgg19_bn-c79401a0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_224-df68dfff.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_384-d0272ac0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | https://storage.googleapis.com/bit_models/BiT-M-R50x3.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | https://download.pytorch.org/models/densenet201-c1103571.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext50_32x4-72679e44.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x8-b4712904.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x4-3f87e46b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | 
https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x16-f3559a9c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet50-16a12f1b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet18-118f1556.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext50_32x4-ddb3e555.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x8-2cfe2f8b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x4-dc43570a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x16-15fffa57.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet50-08389792.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet18-d92f0530.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_F_Green_60ms_78.1_2855edf1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_E_Green_55ms_77.9_90f20e8a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_D_Green_50ms_77.4_23e3cdde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_C_Green_44ms_77.1_d4148c9e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_B_Green_40ms_76.5_1f882d1e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_A_Green_38ms_75.9_23474aeb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb3_pruned_5abcc29f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb2_pruned_203f55bc.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb1_pruned_9ebb3fe6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | 
https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNetLight_4f34b35b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45899/outputs/ECAResNet50D_P_9c67f710.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet50D_833caf58.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45610/outputs/ECAResNet101D_P_75a3370e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet101D_281c5844.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | http://dl.yf.io/dla/models/imagenet/dla60x-d15cacda.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | http://dl.yf.io/dla/models/imagenet/dla60x_c-b870c45c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | http://dl.yf.io/dla/models/imagenet/dla60-24839fc4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | http://dl.yf.io/dla/models/imagenet/dla46x_c-d761bae7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | http://dl.yf.io/dla/models/imagenet/dla46_c-2bfd52c3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | http://dl.yf.io/dla/models/imagenet/dla34-ba72cf86.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | http://dl.yf.io/dla/models/imagenet/dla169-0914e092.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | http://dl.yf.io/dla/models/imagenet/dla102x2-262837b6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | http://dl.yf.io/dla/models/imagenet/dla102x-ad62be81.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | http://dl.yf.io/dla/models/imagenet/dla102-d94d9790.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | https://download.pytorch.org/models/densenet169-b2777c0a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | https://download.pytorch.org/models/densenet161-8d451a50.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | https://download.pytorch.org/models/densenet121-a639ec97.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | https://dl.fbaipublicfiles.com/deit/deit_tiny_patch16_224-a1311bcf.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | https://dl.fbaipublicfiles.com/deit/deit_tiny_distilled_patch16_224-b40b3cf7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | https://dl.fbaipublicfiles.com/deit/deit_small_patch16_224-cd65a155.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | https://dl.fbaipublicfiles.com/deit/deit_small_distilled_patch16_224-649709d9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_384-8de9b5d1.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_224-b5f2ef4d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | https://storage.googleapis.com/bit_models/BiT-M-R50x3-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | https://storage.googleapis.com/bit_models/BiT-M-R50x1.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | https://storage.googleapis.com/bit_models/BiT-M-R50x1-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | https://storage.googleapis.com/bit_models/BiT-M-R152x4.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | https://storage.googleapis.com/bit_models/BiT-M-R152x4-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | https://storage.googleapis.com/bit_models/BiT-M-R152x2.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | https://storage.googleapis.com/bit_models/BiT-M-R152x2-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | https://storage.googleapis.com/bit_models/BiT-M-R101x3.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | https://storage.googleapis.com/bit_models/BiT-M-R101x3-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | https://storage.googleapis.com/bit_models/BiT-M-R101x1.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | https://storage.googleapis.com/bit_models/BiT-M-R101x1-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | https://download.pytorch.org/models/inception_v3_google-1a9a5a14.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | https://download.pytorch.org/models/ig_resnext101_32x8-c38310e5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | https://download.pytorch.org/models/ig_resnext101_32x48-3e41cc8a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | https://download.pytorch.org/models/ig_resnext101_32x32-e4b90b00.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GhostNet/url.ini | https://download.pytorch.org/models/ig_resnext101_32x16-c6f796b0.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/classification/GoogleNet_ID1623_for_PyTorch/public_address_statement.md b/PyTorch/contrib/cv/classification/GoogleNet_ID1623_for_PyTorch/public_address_statement.md index 3aaa17bc99a9a291e197a083fed35b333a70d729..8a602efaec423d4e54e7235983ec14e938703df0 100644 --- a/PyTorch/contrib/cv/classification/GoogleNet_ID1623_for_PyTorch/public_address_statement.md +++ b/PyTorch/contrib/cv/classification/GoogleNet_ID1623_for_PyTorch/public_address_statement.md @@ -1,6 +1,6 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------------------------|----------------------------------------------|------------------------|--------| -| 开发引入 | / | url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 下载测试图片 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/googlenet-1378be20.pth | 下载权重文件 | -| 开发引入 | / | GoogleNet_ID1623_for_PyTorch/googlenet.py | http://arxiv.org/abs/1409.4842 
| 论文地址 | -| 开发引入 | / | GoogleNet_ID1623_for_PyTorch/googlenet.py | https://github.com/pytorch/vision/issues/906 | 相关说明 | +| 文件位置 | 公网地址 | 公网地址用途 | +|--------------------------------------------------------------------------------------------|-----------------------------------------------------------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GoogleNet_ID1623_for_PyTorch/main.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GoogleNet_ID1623_for_PyTorch/main-8p.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GoogleNet_ID1623_for_PyTorch/url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/GoogleNet_ID1623_for_PyTorch/url.ini | https://download.pytorch.org/models/googlenet-1378be20.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/classification/HRNet_ID1780_for_PyTorch/public_address_statement.md b/PyTorch/contrib/cv/classification/HRNet_ID1780_for_PyTorch/public_address_statement.md index e702872ea2c1ed32ef72a92ad2d29acffdca35dc..90273d9338c3392ca25b5a93427f8632d1f261d6 100644 --- a/PyTorch/contrib/cv/classification/HRNet_ID1780_for_PyTorch/public_address_statement.md +++ b/PyTorch/contrib/cv/classification/HRNet_ID1780_for_PyTorch/public_address_statement.md @@ -1,21 +1,3 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------------------------|----------------------------------------------|------------------------|--------| -| 开发引入 | / | url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 下载测试图片 | -| 开源代码引入 | https://github.com/HRNet/HRNet-Image-Classification/tools/valid.py | HRNet_ID1780_for_PyTorch/lib/config/models.py | sunk@mail.ustc.edu.cn | 邮箱地址 | -| 开源代码引入 | https://github.com/HRNet/HRNet-Image-Classification/tools/valid.py | HRNet_ID1780_for_PyTorch/lib/config/hrnet.py | sunk@mail.ustc.edu.cn | 邮箱地址 | -| 开源代码引入 | https://github.com/HRNet/HRNet-Image-Classification/tools/_init_paths.py | HRNet_ID1780_for_PyTorch/modelarts/_init_paths.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/HRNet/HRNet-Image-Classification/tools/_init_paths.py | HRNet_ID1780_for_PyTorch/lib/config/__init__.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/HRNet/HRNet-Image-Classification/tools/valid.py | HRNet_ID1780_for_PyTorch/lib/utils/modelsummary.py | sunk@mail.ustc.edu.cn | 邮箱地址 | -| 开源代码引入 | https://github.com/HRNet/HRNet-Image-Classification/tools/_init_paths.py | HRNet_ID1780_for_PyTorch/lib/utils/modelsummary.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/HRNet/HRNet-Image-Classification/tools/_init_paths.py | HRNet_ID1780_for_PyTorch/lib/config/default.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/HRNet/HRNet-Image-Classification/tools/_init_paths.py | HRNet_ID1780_for_PyTorch/tools/_init_paths.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/HRNet/HRNet-Image-Classification/tools/_init_paths.py | HRNet_ID1780_for_PyTorch/lib/core/evaluate.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/HRNet/HRNet-Image-Classification/tools/valid.py | HRNet_ID1780_for_PyTorch/lib/config/default.py | sunk@mail.ustc.edu.cn | 邮箱地址 | -| 开源代码引入 | https://github.com/HRNet/HRNet-Image-Classification/tools/_init_paths.py | 
HRNet_ID1780_for_PyTorch/lib/utils/utils.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/HRNet/HRNet-Image-Classification/tools/valid.py | HRNet_ID1780_for_PyTorch/lib/utils/utils.py | sunk@mail.ustc.edu.cn | 邮箱地址 | -| 开源代码引入 | https://github.com/HRNet/HRNet-Image-Classification/tools/valid.py | HRNet_ID1780_for_PyTorch/lib/models/cls_hrnet.py | sunk@mail.ustc.edu.cn | 邮箱地址 | -| 开源代码引入 | https://github.com/HRNet/HRNet-Image-Classification/tools/_init_paths.py | HRNet_ID1780_for_PyTorch/lib/core/function.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/HRNet/HRNet-Image-Classification/tools/_init_paths.py | HRNet_ID1780_for_PyTorch/lib/config/hrnet.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/HRNet/HRNet-Image-Classification/tools/valid.py | HRNet_ID1780_for_PyTorch/lib/models/__init__.py | sunk@mail.ustc.edu.cn | 邮箱地址 | -| 开源代码引入 | https://github.com/HRNet/HRNet-Image-Classification/tools/_init_paths.py | HRNet_ID1780_for_PyTorch/lib/models/cls_hrnet.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/HRNet/HRNet-Image-Classification/tools/_init_paths.py | HRNet_ID1780_for_PyTorch/lib/config/models.py | Bin.Xiao@microsoft.com | 邮箱地址 | +| 文件位置 | 公网地址 | 公网地址用途 | +|-------------------------------------------------------------------------------------|-----------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/HRNet_ID1780_for_PyTorch/url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 数据集地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/classification/InceptionResNetV2_ID1779_for_PyTorch/public_address_statement.md b/PyTorch/contrib/cv/classification/InceptionResNetV2_ID1779_for_PyTorch/public_address_statement.md index 2dd984515f98668abee97864a493fd40a8760181..c9810dc63a70f102f1ae7804aaf2128b9cbfbb97 100644 --- a/PyTorch/contrib/cv/classification/InceptionResNetV2_ID1779_for_PyTorch/public_address_statement.md +++ b/PyTorch/contrib/cv/classification/InceptionResNetV2_ID1779_for_PyTorch/public_address_statement.md @@ -1,6 +1,5 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------------------------|----------------------------------------------|------------------------|--------| -| 开发引入 | / | url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 下载测试图片 | -| 开源代码引入 | https://github.com/Cadene/pretrained-models.pytorch/blob/8aae3d8f1135b6b13fed79c1d431e3449fdbf6e0/pretrainedmodels/models/inceptionresnetv2.py | InceptionResNetV2_ID1779_for_PyTorch/inceptionresnetv2.py | http://data.lip6.fr/cadene/pretrainedmodels/inceptionresnetv2-520b38e4.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://data.lip6.fr/cadene/pretrainedmodels/inceptionresnetv2-520b38e4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/Cadene/pretrained-models.pytorch/blob/8aae3d8f1135b6b13fed79c1d431e3449fdbf6e0/pretrainedmodels/models/inceptionresnetv2.py | InceptionResNetV2_ID1779_for_PyTorch/inceptionresnetv2.py | https://arxiv.org/abs/1602.07261 | 论文地址 | +| 文件位置 | 公网地址 | 公网地址用途 | +|-----------------------------------------------------------------------------------------------------------------------|-----------------------------------------------------------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/InceptionResNetV2_ID1779_for_PyTorch/train_8p.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | 
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/InceptionResNetV2_ID1779_for_PyTorch/train_inceptionresnetv2_8p.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/InceptionResNetV2_ID1779_for_PyTorch/url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 数据集地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/classification/InceptionV3_ID1596_for_PyTorch/public_address_statement.md b/PyTorch/contrib/cv/classification/InceptionV3_ID1596_for_PyTorch/public_address_statement.md index 7cacf44dfed613aa6d6908cae5cf1cfcebc2ab58..e08beace9056c8b3c116e7b84c9a08522f24f537 100644 --- a/PyTorch/contrib/cv/classification/InceptionV3_ID1596_for_PyTorch/public_address_statement.md +++ b/PyTorch/contrib/cv/classification/InceptionV3_ID1596_for_PyTorch/public_address_statement.md @@ -1,5 +1,5 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------------------------|----------------------------------------------|------------------------|--------| -| 开发引入 | / | url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 下载测试图片 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/inception_v3_google-1a9a5a14.pth | 下载权重文件 | -| 开发引入 | / | InceptionV3_ID1596_for_PyTorch/inception.py | http://arxiv.org/abs/1512.00567 | 论文地址 | +| 文件位置 | 公网地址 | 公网地址用途 | +|------------------------------------------------------------------------------------------------------------|-----------------------------------------------------------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/InceptionV3_ID1596_for_PyTorch/modelarts/train_start.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/InceptionV3_ID1596_for_PyTorch/url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/InceptionV3_ID1596_for_PyTorch/url.ini | https://download.pytorch.org/models/inception_v3_google-1a9a5a14.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/classification/InceptionV4_ID1778_for_PyTorch/public_address_statement.md b/PyTorch/contrib/cv/classification/InceptionV4_ID1778_for_PyTorch/public_address_statement.md index de907e07202340a7585e65174094a6a5be38c1ed..c7ca1eb241b421d66c41c590f243112e457f92ee 100644 --- a/PyTorch/contrib/cv/classification/InceptionV4_ID1778_for_PyTorch/public_address_statement.md +++ b/PyTorch/contrib/cv/classification/InceptionV4_ID1778_for_PyTorch/public_address_statement.md @@ -1,7 +1,6 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------------------------|-----------------------------------------------|------------------------|--------| -| 开发引入 | / | url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 下载测试图片 | -| 开源代码引入 | https://github.com/Cadene/pretrained-models.pytorch/blob/8aae3d8f1135b6b13fed79c1d431e3449fdbf6e0/pretrainedmodels/models/inceptionv4.py | InceptionV4_ID1778_for_PyTorch/inceptionv4.py | http://data.lip6.fr/cadene/pretrainedmodels/inceptionv4-8e4777a0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/Cadene/pretrained-models.pytorch/blob/8aae3d8f1135b6b13fed79c1d431e3449fdbf6e0/pretrainedmodels/models/inceptionv4.py | InceptionV4_ID1778_for_PyTorch/inceptionv4_v2.py | 
http://data.lip6.fr/cadene/pretrainedmodels/inceptionv4-8e4777a0.pth | 下载权重文件 | -| 开发引入 | / | url.ini | http://data.lip6.fr/cadene/pretrainedmodels/inceptionv4-8e4777a0.pth | 下载权重文件 | -| 开发引入 | / | InceptionV4_ID1778_for_PyTorch/infer/data/imagenet1000_clsidx_to_labels.names | https://gist.github.com/yrevar/942d3a0ac09ec9e5eb3a | 源码实现 | +| 文件位置 | 公网地址 | 公网地址用途 | +|------------------------------------------------------------------------------------------------------------|-----------------------------------------------------------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/InceptionV4_ID1778_for_PyTorch/modelarts/train_start.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/InceptionV4_ID1778_for_PyTorch/train_8p.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/InceptionV4_ID1778_for_PyTorch/train_inceptionv4_8p.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/InceptionV4_ID1778_for_PyTorch/url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 数据集地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/classification/LVVIT/public_address_statement.md b/PyTorch/contrib/cv/classification/LVVIT/public_address_statement.md index 7f2c9977b38663cb7aafb9b4afd2acb814486a2e..b5fc5b51e0b43705e36fca9198da0a74d90f6420 100644 --- a/PyTorch/contrib/cv/classification/LVVIT/public_address_statement.md +++ b/PyTorch/contrib/cv/classification/LVVIT/public_address_statement.md @@ -1,20 +1,3 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------------------------|----------------------------------------------|------------------------|----| -| 开源代码引入 | https://github.com/zihangJiang/TokenLabeling/blob/main/setup.py | LVVIT/setup.py | jzh0103@gmail.com | 作者邮箱 | -| 开源代码引入 | https://github.com/zihangJiang/TokenLabeling/blob/main/setup.py | LVVIT/setup.py | https://github.com/zihangJiang/TokenLabeling | 开源地址 | -| 开源代码引入 | https://github.com/zihangJiang/TokenLabeling/blob/main/visualize/baselines/ViT/LVViT_LRP.py | LVVIT/visualize/baselines/ViT/LVViT_LRP.py | https://github.com/zihangJiang/TokenLabeling/releases/download/1.0/lvvit_m-56M-224-84.0.pth.tar | 下载权重文件 | -| 开源代码引入 | https://github.com/zihangJiang/TokenLabeling/blob/main/visualize/baselines/ViT/LVViT_LRP.py | LVVIT/visualize/baselines/ViT/LVViT_LRP.py | https://github.com/zihangJiang/TokenLabeling/releases/download/1.0/lvvit_s-26M-224-83.3.pth.tar | 下载权重文件 | -| 开源代码引入 | https://github.com/zihangJiang/TokenLabeling/blob/main/visualize/baselines/ViT/LVViT_LRP.py | LVVIT/visualize/baselines/ViT/LVViT_LRP.py | https://github.com/zihangJiang/TokenLabeling/releases/download/1.0/lvvit_s-26M-384-84.4.pth.tar | 下载权重文件 | -| 开发引入 | / | LVVIT/tlt/data/loader.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/data/loader.py | 源码实现 | -| 开源代码引入 | https://github.com/zihangJiang/TokenLabeling/blob/main/tlt/data/dataset.py | LVVIT/tlt/data/dataset.py | http://www.codinghorror.com/blog/archives/001018.html | 相关说明 | -| 开源代码引入 | https://github.com/zihangJiang/TokenLabeling/blob/main/validate.py | LVVIT/generate_label.py | https://github.com/rwightman/pytorch-image-models | 源码实现 | -| 开源代码引入 | https://github.com/zihangJiang/TokenLabeling/blob/main/validate.py | LVVIT/validate.py | https://github.com/rwightman/pytorch-image-models | 源码实现 | -| 开发引入 | / | 
LVVIT/seg/mmseg/models/backbones/vit.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/vision_transformer.py#L353 | 源码实现 | -| 开源代码引入 | https://github.com/zihangJiang/TokenLabeling/blob/main/tlt/data/mixup.py | LVVIT/tlt/data/mixup.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/data/mixup.py | 源码实现 | -| 开源代码引入 | https://github.com/zihangJiang/TokenLabeling/blob/main/seg/mmseg/models/backbones/vit.py | LVVIT/seg/mmseg/models/backbones/vit.py | https://github.com/open-mmlab/mmsegmentation/blob/master/mmseg/models/backbones/vit.py | 源码实现 | -| 开源代码引入 | https://github.com/zihangJiang/TokenLabeling/blob/main/seg/mmseg/models/backbones/vit.py | LVVIT/seg/mmseg/models/backbones/vit.py | https://arxiv.org/abs/2010.11929 | 论文地址 | -| 开源代码引入 | https://github.com/zihangJiang/TokenLabeling/blob/main/tlt/models/layers.py | LVVIT/tlt/models/layers.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/vision_transformer.py | 源码实现 | -| 开源代码引入 | https://github.com/zihangJiang/TokenLabeling/blob/main/tlt/data/mixup.py | LVVIT/tlt/data/mixup.py | https://github.com/naver-ai/relabel_imagenet/blob/main/utils/relabel_functions.py | 源码实现 | -| 开源代码引入 | https://github.com/zihangJiang/TokenLabeling/blob/main/tlt/utils/utils.py | LVVIT/tlt/utils/utils.py | https://github.com/google-research/vision_transformer/blob/00883dd691c63a6830751563748663526e811cee/vit_jax/checkpoint.py#L224 | 源码实现 | -| 开源代码引入 | https://github.com/zihangJiang/TokenLabeling/blob/main/visualize/baselines/ViT/LVViT_LRP.py | LVVIT/visualize/baselines/ViT/LVViT_LRP.py | https://github.com/hila-chefer/Transformer-Explainability/blob/main/baselines/ViT/ViT_LRP.py | 源码实现 | -| 开发引入 | / | LVVIT/tlt/data/random_augment_label.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/data/auto_augment.py | 源码实现 | +| 文件位置 | 公网地址 | 公网地址用途 | +|-------------------------------------------------------------------|-------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/LVVIT/setup.py | jzh0103@gmail.com | 作者邮箱 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/classification/MGN/README_raw.md b/PyTorch/contrib/cv/classification/MGN/README_raw.md index ac024bb46ce04e4a8745bf81cc14cda65f2b755b..663dd8abe0da47b3d6567472cf7e9c3b8e93f899 100644 --- a/PyTorch/contrib/cv/classification/MGN/README_raw.md +++ b/PyTorch/contrib/cv/classification/MGN/README_raw.md @@ -50,7 +50,7 @@ NOTICE:You need to change num_classes in network depend on how many people in yo ## Weights Pretrained weight download from [google drive](https://drive.google.com/open?id=16V7ZsflBbINHPjh_UVYGBVO6NuSxEMTi) -or [baidu drive](https://pan.baidu.com/s/12AkumLX10hLx9vh_SQwdyw) password:mrl5 + ## Train You can specify more parameters in opt.py diff --git a/PyTorch/contrib/cv/classification/MGN/public_address_statement.md b/PyTorch/contrib/cv/classification/MGN/public_address_statement.md index 8774e480d8af18057cb1f3c9e6b28edc25a51cda..be740f371120f356e1e4357677b0c8be94eb5f93 100644 --- a/PyTorch/contrib/cv/classification/MGN/public_address_statement.md +++ b/PyTorch/contrib/cv/classification/MGN/public_address_statement.md @@ -1,6 +1,3 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|----------------------------------------------------------------------------------------------------------------------------------|----------------------------------------------------|-----------------------|--------| -| 开源代码引入 | 
https://github.com/GNAYUOHZ/ReID-MGN.git/utils/RandomErasing.py | MGN/utils/RandomErasing.py | https://arxiv.org/pdf/1708.04896.pdf | 论文地址 | -| 开源代码引入 | https://github.com/GNAYUOHZ/ReID-MGN.git/utils/metrics.py | MGN/utils/metrics.py | http://openaccess.thecvf.com/content_cvpr_2017/papers/Zhong_Re-Ranking_Person_Re-Identification_CVPR_2017_paper.pdf | 论文地址 | -| 开源代码引入 | https://github.com/GNAYUOHZ/ReID-MGN.git/utils/TripletLoss.py | MGN/utils/TripletLoss.py | https://github.com/Cysu/open-reid/blob/master/reid/loss/triplet.py | 源码实现 | -| 开源代码引入 | https://github.com/GNAYUOHZ/ReID-MGN.git/utils/metrics.py | MGN/utils/metrics.py | https://github.com/zhunzhong07/person-re-ranking | 源码实现 | +| 文件位置 | 公网地址 | 公网地址用途 | +|-------------------------------------------------------------------------|---------------------------------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MGN/utils/metrics.py | http://openaccess.thecvf.com/content_cvpr_2017/papers/Zhong_Re-Ranking_Person_Re-Identification_CVPR_2017_paper.pdf | 论文地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/classification/MnasNet/public_address_statement.md b/PyTorch/contrib/cv/classification/MnasNet/public_address_statement.md index 1e5f9aa1febc0c0e72500575f4536a418b4f088a..fa21e1fb2f5273bfb6a12c14e31cd6ba3daca047 100644 --- a/PyTorch/contrib/cv/classification/MnasNet/public_address_statement.md +++ b/PyTorch/contrib/cv/classification/MnasNet/public_address_statement.md @@ -1,8 +1,7 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------------------------|---------------------------------------------|------------------------|--------| -| 开发引入 | / | url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 下载测试图片 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/91e03b91fd9bab19b4c295692455a1883831a932/torchvision/models/mnasnet.py | MnasNet/mnasnet.py | https://download.pytorch.org/models/mnasnet0.5_top1_67.823-3ffadce67e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/pytorch/vision/blob/91e03b91fd9bab19b4c295692455a1883831a932/torchvision/models/mnasnet.py | MnasNet/mnasnet.py | https://download.pytorch.org/models/mnasnet1.0_top1_73.512-f206786ef8.pth | 下载权重文件 | -| 开发引入 | / | MnasNet/train.py | https://www.github.com/nvidia/apex | 相关说明 | -| 开发引入 | / | MnasNet/mnasnet.py | https://arxiv.org/pdf/1807.11626.pdf | 论文地址 | -| 开发引入 | / | MnasNet/modelArts/train-modelarts.py | https://www.github.com/nvidia/apex | 相关说明 | +| 文件位置 | 公网地址 | 公网地址用途 | +|-----------------------------------------------------------------------------------------|-----------------------------------------------------------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MnasNet/mnasnet.py | https://download.pytorch.org/models/mnasnet1.0_top1_73.512-f206786ef8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MnasNet/mnasnet.py | https://download.pytorch.org/models/mnasnet0.5_top1_67.823-3ffadce67e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MnasNet/modelArts/train-modelarts.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MnasNet/train.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MnasNet/url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 数据集地址 | \ No 
newline at end of file diff --git a/PyTorch/contrib/cv/classification/MobileNet/public_address_statement.md b/PyTorch/contrib/cv/classification/MobileNet/public_address_statement.md index b54ee02a8406ca860b49bfaa2f96bd0130db2713..953223893d33792fb45f5d1adba6bd1b4de5492c 100644 --- a/PyTorch/contrib/cv/classification/MobileNet/public_address_statement.md +++ b/PyTorch/contrib/cv/classification/MobileNet/public_address_statement.md @@ -1,3 +1,3 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------------------------|---------------------------------------------|------------------------|--------| -| 开发引入 | / | url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 下载测试图片 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|----------------------------------------------------------------------|-----------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNet/url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 数据集地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/public_address_statement.md b/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..1555ca6a4a26c0fce0a32737906eb9e9b7b3a5b3 --- /dev/null +++ b/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/public_address_statement.md @@ -0,0 +1,84 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|------------------------------------------------------------------------------------------------------------------------|--------------------------------------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/setup.py | hello@rwightman.com | 作者邮箱 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/densenet.py | https://download.pytorch.org/models/densenet201-c1103571.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/densenet.py | https://download.pytorch.org/models/densenet169-b2777c0a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/densenet.py | https://download.pytorch.org/models/densenet161-8d451a50.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/densenet.py | https://download.pytorch.org/models/densenet121-a639ec97.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60x_c-b870c45c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60x-d15cacda.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60-24839fc4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla46x_c-d761bae7.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla46_c-2bfd52c3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla34-ba72cf86.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla169-0914e092.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102x2-262837b6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102x-ad62be81.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102-d94d9790.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb3_pruned_5abcc29f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb2_pruned_203f55bc.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb1_pruned_9ebb3fe6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_A_Green_38ms_75.9_23474aeb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_B_Green_40ms_76.5_1f882d1e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_C_Green_44ms_77.1_d4148c9e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_D_Green_50ms_77.4_23e3cdde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_E_Green_55ms_77.9_90f20e8a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_F_Green_60ms_78.1_2855edf1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/inception_v3.py | https://download.pytorch.org/models/inception_v3_google-1a9a5a14.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | 
https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x8-c38310e5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x48-3e41cc8a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x32-e4b90b00.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x16-c6f796b0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext50_32x4-72679e44.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x8-b4712904.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x4-3f87e46b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x16-f3559a9c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet50-16a12f1b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet18-118f1556.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext50_32x4-ddb3e555.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x8-2cfe2f8b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x4-dc43570a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x16-15fffa57.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet50-08389792.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet18-d92f0535.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45899/outputs/ECAResNet50D_P_9c67f710.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45610/outputs/ECAResNet101D_P_75a3370e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNetLight_4f34b35b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet50D_833caf58.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet101D_281c5844.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x7.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x3-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x1-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x1.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x4-ILSVRC2012.npz | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x4.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x2-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x2.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x3-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x3.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x1-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x1.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg11-bbd30ac9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg13-c768596a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg16-397923af.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg19-dcbb9e9d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg11_bn-6002323d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg13_bn-abd245e5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg16_bn-6c64b313.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg19_bn-c79401a0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_384-d0272ac0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_tiny_patch16_224-a1311bcf.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_small_patch16_224-cd65a155.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_224-b5f2ef4d.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_384-8de9b5d1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_tiny_distilled_patch16_224-b40b3cf7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_small_distilled_patch16_224-649709d9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/MobileNetV3_large_100_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_224-df68dfff.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/classification/Moco-v2/public_address_statement.md b/PyTorch/contrib/cv/classification/Moco-v2/public_address_statement.md index cef025b80773059ec7ec0ede777f9ce885b126fd..dcc79a6ee0b302cf19fffeeaf7763d403787f23d 100644 --- a/PyTorch/contrib/cv/classification/Moco-v2/public_address_statement.md +++ b/PyTorch/contrib/cv/classification/Moco-v2/public_address_statement.md @@ -1,21 +1,13 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------------------------|---------------------------------------------|------------------------|--------| -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnet18-5c106cde.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnet50-19c8e357.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/wide_resnet50_2-95faca4d.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/moco/main_moco.py | Moco-v2/main_moco.py | https://arxiv.org/abs/1805.01978 | 论文地址 | -| 开发引入 | / | Moco-v2/resnet.py | https://arxiv.org/pdf/1512.03385.pdf | 论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/moco/moco/loader.py | Moco-v2/main_moco.py | https://arxiv.org/abs/2002.05709 | 论文地址 | -| 开发引入 | / | Moco-v2/resnet.py | https://arxiv.org/pdf/1605.07146.pdf | 论文地址 | -| 开发引入 | / | Moco-v2/resnet.py | https://arxiv.org/abs/1706.02677 | 论文地址 | -| 开发引入 | / | Moco-v2/resnet.py | https://ngc.nvidia.com/catalog/model-scripts/nvidia:resnet_50_v1_5_for_pytorch | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/moco/README.md | Moco-v2/moco/builder.py | https://arxiv.org/abs/1911.05722 | 论文地址 | -| 开发引入 | / | Moco-v2/resnet.py | https://arxiv.org/pdf/1611.05431.pdf | 论文地址 | -| 开发引入 | / | Moco-v2/resnet.py | https://arxiv.org/abs/1512.03385 | 论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/moco/moco/loader.py | Moco-v2/moco/loader.py | https://arxiv.org/abs/2002.05709 | 论文地址 | +| 文件位置 | 公网地址 | 公网地址用途 | 
+|---------------------------------------------------------------------------|-------------------------------------------------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Moco-v2/main_lincls.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Moco-v2/main_moco.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Moco-v2/url.ini | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Moco-v2/url.ini | https://download.pytorch.org/models/wide_resnet50_2-95faca4d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Moco-v2/url.ini | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Moco-v2/url.ini | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Moco-v2/url.ini | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Moco-v2/url.ini | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Moco-v2/url.ini | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Moco-v2/url.ini | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Moco-v2/url.ini | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/classification/NASNet-A-Mobile/public_address_statement.md b/PyTorch/contrib/cv/classification/NASNet-A-Mobile/public_address_statement.md index 6e27e60a1b555ebb423c3923a94cdc5615789350..8524ea00f007bb8021d32a7310880280c390ee88 100644 --- a/PyTorch/contrib/cv/classification/NASNet-A-Mobile/public_address_statement.md +++ b/PyTorch/contrib/cv/classification/NASNet-A-Mobile/public_address_statement.md @@ -1,8 +1,5 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------------------------|---------------------------------------------|------------------------|--------| -| 开发引入 | / | url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 下载测试图片 | -| 开源代码引入 | https://github.com/Cadene/pretrained-models.pytorch/blob/master/pretrainedmodels/models/nasnet_mobile.py | NASNet-A-Mobile/models/nasnet_mobile.py | http://data.lip6.fr/cadene/pretrainedmodels/nasnetamobile-7e03cead.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/Cadene/pretrained-models.pytorch/pretrainedmodels/models/nasnet_mobile.py | NASNet-A-Mobile/models/nasnet_mobile.py | https://arxiv.org/abs/1707.07012 | 论文地址 | -| 开发引入 | / | NASNet-A-Mobile/models/nasnet_mobile.py | https://github.com/DagnyT | 源码实现 | -| 开源代码引入 | https://github.com/Cadene/pretrained-models.pytorch/pretrainedmodels/utils.py | NASNet-A-Mobile/models/utils.py | https://github.com/tensorflow/models/blob/master/research/inception/inception/image_processing.py#L294 | 源码实现 | -| 开发引入 | / | NASNet-A-Mobile/models/nasnet_mobile.py | https://github.com/tensorflow/tpu/issues/494#issuecomment-532968956 | 相关说明 | +| 文件位置 | 公网地址 | 公网地址用途 | 
+|-----------------------------------------------------------------------------------|-----------------------------------------------------------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/NASNet-A-Mobile/main_npu_1p.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/NASNet-A-Mobile/main_npu_8p.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/NASNet-A-Mobile/url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 数据集地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/classification/OSNet/public_address_statement.md b/PyTorch/contrib/cv/classification/OSNet/public_address_statement.md index 6101b538d600ab8eeebbe6ac495ea5bf4be17187..e300dd571d07b7628afa32a176e48d364841a647 100644 --- a/PyTorch/contrib/cv/classification/OSNet/public_address_statement.md +++ b/PyTorch/contrib/cv/classification/OSNet/public_address_statement.md @@ -1,111 +1,39 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------------------------|----------------------------------------------|------------------------|------| -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/blob/master/setup.py | OSNet/setup.py | https://github.com/KaiyangZhou/deep-person-reid | 开源地址 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/blob/master/torchreid/__init__.py | OSNet/torchreid/__init__.py | https://kaiyangzhou.github.io/ | 开源地址 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/blob/master/torchreid/__init__.py | OSNet/torchreid/__init__.py | https://github.com/KaiyangZhou/deep-person-reid | 开源地址 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/blob/master/torchreid/models/densenet.py | OSNet/torchreid/models/densenet.py | https://download.pytorch.org/models/densenet121-a639ec97.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/blob/master/torchreid/models/densenet.py | OSNet/torchreid/models/densenet.py | https://download.pytorch.org/models/densenet169-b2777c0a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/blob/master/torchreid/models/densenet.py | OSNet/torchreid/models/densenet.py | https://download.pytorch.org/models/densenet201-c1103571.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/blob/master/torchreid/models/densenet.py | OSNet/torchreid/models/densenet.py | https://download.pytorch.org/models/densenet161-8d451a50.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/blob/master/torchreid/models/inceptionresnetv2.py | OSNet/torchreid/models/inceptionresnetv2.py | http://data.lip6.fr/cadene/pretrainedmodels/inceptionresnetv2-520b38e4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/blob/master/torchreid/models/inceptionv4.py | OSNet/torchreid/models/inceptionv4.py | http://data.lip6.fr/cadene/pretrainedmodels/inceptionv4-8e4777a0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/blob/master/torchreid/models/mlfn.py | OSNet/torchreid/models/mlfn.py | https://mega.nz/#!YHxAhaxC!yu9E6zWl0x5zscSouTdbZu8gdFFytDdl-RAdD2DEfpk | 下载权重文件 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/blob/master/torchreid/models/mobilenetv2.py | OSNet/torchreid/models/mobilenetv2.py | https://mega.nz/#!NKp2wAIA!1NH1pbNzY_M2hVk_hdsxNM1NUOWvvGPHhaNr-fASF6c | 下载权重文件 | -| 开源代码引入 | 
https://github.com/KaiyangZhou/deep-person-reid/blob/master/torchreid/models/mobilenetv2.py | OSNet/torchreid/models/mobilenetv2.py | https://mega.nz/#!RGhgEIwS!xN2s2ZdyqI6vQ3EwgmRXLEW3khr9tpXg96G9SUJugGk | 下载权重文件 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/blob/master/torchreid/models/nasnet.py | OSNet/torchreid/models/nasnet.py | http://data.lip6.fr/cadene/pretrainedmodels/nasnetamobile-7e03cead.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/blob/master/torchreid/models/osnet.py | OSNet/torchreid/models/osnet.py | https://drive.google.com/uc?id=1LaG1EJpHrxdAxKnSCJ_i0u-nbxSAeiFY | 下载权重文件 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/blob/master/torchreid/models/osnet.py | OSNet/torchreid/models/osnet.py | https://drive.google.com/uc?id=1uwA9fElHOk3ZogwbeY5GkLI6QPTX70Hq | 下载权重文件 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/blob/master/torchreid/models/osnet.py | OSNet/torchreid/models/osnet.py | https://drive.google.com/uc?id=16DGLbZukvVYgINws8u8deSaOqjybZ83i | 下载权重文件 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/blob/master/torchreid/models/osnet.py | OSNet/torchreid/models/osnet.py | https://drive.google.com/uc?id=1rb8UN5ZzPKRc_xvtHlyDh-cSz88YX9hs | 下载权重文件 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/blob/master/torchreid/models/osnet.py | OSNet/torchreid/models/osnet.py | https://drive.google.com/uc?id=1sr90V6irlYYDd4_4ISU2iruoRG8J__6l | 下载权重文件 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/blob/master/torchreid/models/osnet_ain.py | OSNet/torchreid/models/osnet_ain.py | https://drive.google.com/uc?id=1-CaioD9NaqbHK_kzSMW8VE4_3KcsRjEo | 下载权重文件 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/blob/master/torchreid/models/pcb.py | OSNet/torchreid/models/pcb.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/blob/master/torchreid/models/pcb.py | OSNet/torchreid/models/pcb.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/blob/master/torchreid/models/pcb.py | OSNet/torchreid/models/pcb.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/blob/master/torchreid/models/pcb.py | OSNet/torchreid/models/pcb.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/blob/master/torchreid/models/pcb.py | OSNet/torchreid/models/pcb.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/blob/master/torchreid/models/resnet.py | OSNet/torchreid/models/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/blob/master/torchreid/models/resnet.py | OSNet/torchreid/models/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/blob/master/torchreid/models/resnet.py | OSNet/torchreid/models/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/blob/master/torchreid/models/resnet.py | OSNet/torchreid/models/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/KaiyangZhou/deep-person-reid/blob/master/torchreid/models/resnet.py | OSNet/torchreid/models/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/blob/master/torchreid/models/resnet.py | OSNet/torchreid/models/resnet.py | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/blob/master/torchreid/models/resnet.py | OSNet/torchreid/models/resnet.py | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/blob/master/torchreid/models/resnetmid.py | OSNet/torchreid/models/resnetmid.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/blob/master/torchreid/models/resnetmid.py | OSNet/torchreid/models/resnetmid.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/blob/master/torchreid/models/resnetmid.py | OSNet/torchreid/models/resnetmid.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/blob/master/torchreid/models/resnetmid.py | OSNet/torchreid/models/resnetmid.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/blob/master/torchreid/models/resnetmid.py | OSNet/torchreid/models/resnetmid.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/blob/master/torchreid/models/resnet_ibn_a.py | OSNet/torchreid/models/resnet_ibn_a.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/blob/master/torchreid/models/resnet_ibn_a.py | OSNet/torchreid/models/resnet_ibn_a.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/blob/master/torchreid/models/resnet_ibn_a.py | OSNet/torchreid/models/resnet_ibn_a.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/blob/master/torchreid/models/resnet_ibn_b.py | OSNet/torchreid/models/resnet_ibn_b.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/blob/master/torchreid/models/resnet_ibn_b.py | OSNet/torchreid/models/resnet_ibn_b.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/blob/master/torchreid/models/resnet_ibn_b.py | OSNet/torchreid/models/resnet_ibn_b.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/blob/master/torchreid/models/senet.py | OSNet/torchreid/models/senet.py | http://data.lip6.fr/cadene/pretrainedmodels/senet154-c7b49a05.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/blob/master/torchreid/models/senet.py | OSNet/torchreid/models/senet.py | http://data.lip6.fr/cadene/pretrainedmodels/se_resnet50-ce0d4300.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/blob/master/torchreid/models/senet.py | OSNet/torchreid/models/senet.py | 
http://data.lip6.fr/cadene/pretrainedmodels/se_resnet101-7e38fcc6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/blob/master/torchreid/models/senet.py | OSNet/torchreid/models/senet.py | http://data.lip6.fr/cadene/pretrainedmodels/se_resnet152-d17c99b7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/blob/master/torchreid/models/senet.py | OSNet/torchreid/models/senet.py | http://data.lip6.fr/cadene/pretrainedmodels/se_resnext50_32x4d-a260b3a4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/blob/master/torchreid/models/senet.py | OSNet/torchreid/models/senet.py | http://data.lip6.fr/cadene/pretrainedmodels/se_resnext101_32x4d-3b2fe3d8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/blob/master/torchreid/models/shufflenet.py | OSNet/torchreid/models/shufflenet.py | https://mega.nz/#!RDpUlQCY!tr_5xBEkelzDjveIYBBcGcovNCOrgfiJO9kiidz9fZM | 下载权重文件 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/blob/master/torchreid/models/shufflenetv2.py | OSNet/torchreid/models/shufflenetv2.py | https://download.pytorch.org/models/shufflenetv2_x0.5-f707e7126e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/blob/master/torchreid/models/shufflenetv2.py | OSNet/torchreid/models/shufflenetv2.py | https://download.pytorch.org/models/shufflenetv2_x1-5666bf0f80.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/blob/master/torchreid/models/squeezenet.py | OSNet/torchreid/models/squeezenet.py | https://download.pytorch.org/models/squeezenet1_0-a815701f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/blob/master/torchreid/models/squeezenet.py | OSNet/torchreid/models/squeezenet.py | https://download.pytorch.org/models/squeezenet1_1-f364aa15.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/blob/master/torchreid/models/xception.py | OSNet/torchreid/models/xception.py | http://data.lip6.fr/cadene/pretrainedmodels/xception-43020ad28.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/torchreid/data/datasets/image/university1652.py | OSNet/torchreid/data/datasets/image/university1652.py | https://studentutsedu-my.sharepoint.com/:u:/g/personal/12639605_student_uts_edu_au/Ecrz6xK-PcdCjFdpNb0T0s8B_9J5ynaUy3q63_XumjJyrA?e=z4hpcz | 相关说明 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/torchreid/losses/hard_mine_triplet_loss.py | OSNet/torchreid/losses/hard_mine_triplet_loss.py | https://github.com/Cysu/open-reid/blob/master/reid/loss/triplet.py | 源码实现 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/torchreid/utils/rerank.py | OSNet/torchreid/utils/rerank.py | http://openaccess.thecvf.com/content_cvpr_2017/papers/Zhong_Re-Ranking_Person_Re-Identification_CVPR_2017_paper.pdf | 论文地址 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/torchreid/models/resnet_ibn_b.py | OSNet/torchreid/models/resnet_ibn_a.py | https://github.com/XingangPan/IBN-Net | 源码实现 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/torchreid/utils/loggers.py | OSNet/torchreid/utils/loggers.py | https://github.com/Cysu/open-reid/blob/master/reid/utils/logging.py | 源码实现 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/torchreid/data/datasets/image/viper.py | OSNet/torchreid/data/datasets/image/viper.py | https://vision.soe.ucsc.edu/node/178 | 相关说明 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/torchreid/models/nasnet.py | 
OSNet/torchreid/models/nasnet.py | http://data.lip6.fr/cadene/pretrainedmodels/nasnetalarge-a1897284.pth | 预训练模型 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/torchreid/models/resnet.py | OSNet/torchreid/models/resnet.py | https://arxiv.org/abs/1706.02677 | 论文地址 | -| 开发引入 | / | OSNet/torchreid/data/datasets/image/cuhk03.py | http://www.ee.cuhk.edu.hk/~xgwang/CUHK_identification.html# | 相关说明 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/torchreid/data/datasets/image/grid.py | OSNet/torchreid/data/datasets/image/grid.py | http://personal.ie.cuhk.edu.hk/~ccloy/downloads_qmul_underground_reid.html | 下载链接 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/torchreid/models/squeezenet.py | OSNet/torchreid/models/resnet.py | https://github.com/pytorch/vision | 源码实现 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/torchreid/data/datasets/image/university1652.py | OSNet/torchreid/data/datasets/image/university1652.py | https://drive.google.com/file/d/1iVnP4gjw-iHXa0KerZQ1IfIO0i1jADsR/view?usp=sharing | 下载链接 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/torchreid/data/datasets/image/grid.py | OSNet/torchreid/data/datasets/image/grid.py | http://personal.ie.cuhk.edu.hk/~ccloy/files/datasets/underground_reid.zip | 下载链接 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/torchreid/optim/radam.py | OSNet/torchreid/optim/radam.py | https://arxiv.org/abs/1908.03265 | 论文地址 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/torchreid/data/datasets/image/ilids.py | OSNet/torchreid/data/datasets/image/ilids.py | http://www.eecs.qmul.ac.uk/~jason/data/i-LIDS_Pedestrian.tgz | 下载链接 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/torchreid/models/senet.py | OSNet/torchreid/models/nasnet.py | https://github.com/Cadene/pretrained-models.pytorch | 源码实现 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/torchreid/metrics/rank_cylib/rank_cy.pyx | OSNet/torchreid/metrics/rank_cylib/rank_cy.pyx | https://cython.readthedocs.io/en/latest/src/userguide/numpy_tutorial.html | 相关说明 | -| 开发引入 | / | OSNet/torchreid/models/nasnet.py | https://github.com/DagnyT | 源码实现 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/torchreid/models/nasnet.py | OSNet/torchreid/models/nasnet.py | https://github.com/veronikayurchuk/pretrained-models.pytorch/releases/download/v1.0/nasnetmobile-7e03cead.pth.tar | 下载链接 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/torchreid/data/datasets/image/dukemtmcreid.py | OSNet/torchreid/data/datasets/image/dukemtmcreid.py | http://vision.cs.duke.edu/DukeMTMC/data/misc/DukeMTMC-reID.zip | 下载链接 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/torchreid/models/senet.py | OSNet/torchreid/models/inceptionresnetv2.py | https://github.com/Cadene/pretrained-models.pytorch | 源码实现 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/torchreid/data/transforms.py | OSNet/torchreid/data/transforms.py | https://github.com/zhunzhong07/Random-Erasing | 源码实现 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/torchreid/data/datasets/image/cuhksysu.py | OSNet/torchreid/data/datasets/image/cuhksysu.py | http://www.ee.cuhk.edu.hk/~xgwang/PS/dataset.html | 相关说明 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/torchreid/data/datasets/video/mars.py | OSNet/torchreid/data/datasets/video/mars.py | http://www.liangzheng.com.cn/Project/project_mars.html | 相关说明 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/torchreid/utils/avgmeter.py | 
OSNet/torchreid/utils/avgmeter.py | https://github.com/KaiyangZhou/Dassl.pytorch | 源码实现 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/torchreid/models/senet.py | OSNet/torchreid/models/inceptionv4.py | https://github.com/Cadene/pretrained-models.pytorch | 源码实现 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/torchreid/models/resnet_ibn_b.py | OSNet/torchreid/models/resnet_ibn_b.py | https://github.com/XingangPan/IBN-Net | 源码实现 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/torchreid/data/datasets/video/ilidsvid.py | OSNet/torchreid/data/datasets/video/ilidsvid.py | http://www.eecs.qmul.ac.uk/~xiatian/iLIDS-VID/iLIDS-VID.tar | 下载链接 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/torchreid/data/datasets/image/cuhk03.py | OSNet/torchreid/data/datasets/image/cuhk02.py | http://www.ee.cuhk.edu.hk/~xgwang/CUHK_identification.html | 下载链接 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/torchreid/data/datasets/image/market1501.py | OSNet/torchreid/data/datasets/image/market1501.py | http://www.liangzheng.org/Project/project_reid.html | 相关说明 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/torchreid/engine/engine.py | OSNet/torchreid/engine/engine.py | https://arxiv.org/abs/1611.05244 | 论文地址 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/torchreid/metrics/rank_cylib/rank_cy.pyx | OSNet/torchreid/metrics/rank_cylib/rank_cy.pyx | https://github.com/cython/cython/wiki/enhancements-compilerdirectives | 相关说明 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/torchreid/data/datasets/video/prid2011.py | OSNet/torchreid/data/datasets/video/prid2011.py | https://www.tugraz.at/institute/icg/research/team-bischof/lrs/downloads/PRID11/ | 下载链接 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/torchreid/data/datasets/video/dukemtmcvidreid.py | OSNet/torchreid/data/datasets/video/dukemtmcvidreid.py | http://vision.cs.duke.edu/DukeMTMC/data/misc/DukeMTMC-VideoReID.zip | 下载链接 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/torchreid/data/datasets/video/prid2011.py | OSNet/torchreid/data/datasets/image/prid.py | https://www.tugraz.at/institute/icg/research/team-bischof/lrs/downloads/PRID11/ | 下载链接 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/torchreid/data/datasets/video/dukemtmcvidreid.py | OSNet/torchreid/data/datasets/video/dukemtmcvidreid.py | https://github.com/Yu-Wu/DukeMTMC-VideoReID | 源码实现 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/torchreid/data/datasets/image/university1652.py | OSNet/torchreid/data/datasets/image/university1652.py | https://github.com/layumi/University1652-Baseline | 源码实现 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/torchreid/metrics/rank_cylib/rank_cy.pyx | OSNet/torchreid/metrics/rank_cylib/rank_cy.pyx | https://github.com/luzai | 源码实现 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/torchreid/models/squeezenet.py | OSNet/torchreid/models/shufflenetv2.py | https://github.com/pytorch/vision | 源码实现 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/torchreid/models/squeezenet.py | OSNet/torchreid/models/densenet.py | https://github.com/pytorch/vision | 源码实现 | -| 开发引入 | / | OSNet/torchreid/metrics/rank.py | https://en.wikipedia.org/wiki/Evaluation_measures_ | 相关说明 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/torchreid/optim/radam.py | OSNet/torchreid/optim/radam.py | https://github.com/LiyuanLucasLiu/RAdam | 源码实现 | -| 开源代码引入 | 
https://github.com/KaiyangZhou/deep-person-reid/torchreid/models/nasnet.py | OSNet/torchreid/models/nasnet.py | https://arxiv.org/abs/1707.07012 | 论文地址 | -| 开发引入 | / | OSNet/torchreid/data/datasets/image/university1652.py | https://pan.baidu.com/s/1H_wBnWwikKbaBY1pMPjoqQ | 下载链接 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/torchreid/data/datasets/image/university1652.py | OSNet/torchreid/data/datasets/image/university1652.py | https://drive.google.com/uc?id=1iVnP4gjw-iHXa0KerZQ1IfIO0i1jADsR | 下载链接 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/torchreid/data/datasets/image/cuhk03.py | OSNet/torchreid/data/datasets/image/cuhk01.py | http://www.ee.cuhk.edu.hk/~xgwang/CUHK_identification.html | 相关说明 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/torchreid/data/datasets/image/dukemtmcreid.py | OSNet/torchreid/data/datasets/image/dukemtmcreid.py | https://github.com/layumi/DukeMTMC-reID_evaluation | 源码实现 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/torchreid/data/datasets/image/viper.py | OSNet/torchreid/data/datasets/image/viper.py | http://users.soe.ucsc.edu/~manduchi/VIPeR.v1.0.zip | 下载链接 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/torchreid/data/datasets/video/ilidsvid.py | OSNet/torchreid/data/datasets/video/ilidsvid.py | http://www.eecs.qmul.ac.uk/~xiatian/downloads_qmul_iLIDS-VID_ReID_dataset.html | 下载链接 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/torchreid/models/squeezenet.py | OSNet/torchreid/models/squeezenet.py | https://github.com/pytorch/vision | 源码实现 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/torchreid/data/datasets/image/msmt17.py | OSNet/torchreid/data/datasets/image/msmt17.py | http://www.pkuvmc.com/publications/msmt17.html | 相关说明 | -| 开发引入 | / | OSNet/torchreid/metrics/rank_cylib/rank_cy.pyx | https://en.wikipedia.org/wiki/Evaluation_measures_ | 相关说明 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/torchreid/utils/rerank.py | OSNet/torchreid/utils/rerank.py | https://github.com/zhunzhong07/person-re-ranking | 源码实现 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/torchreid/models/senet.py | OSNet/torchreid/models/senet.py | https://github.com/Cadene/pretrained-models.pytorch | 源码实现 | -| 开源代码引入 | https://github.com/KaiyangZhou/deep-person-reid/torchreid/data/datasets/image/sensereid.py | OSNet/torchreid/data/datasets/image/sensereid.py | https://drive.google.com/file/d/0B56OfSrVI8hubVJLTzkwV2VaOWM/view | 下载链接 | +| 文件位置 | 公网地址 | 公网地址用途 | +|-----------------------------------------------------------------------------------------------------------|---------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/OSNet/torchreid/data/datasets/image/dukemtmcreid.py | http://vision.cs.duke.edu/DukeMTMC/data/misc/DukeMTMC-reID.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/OSNet/torchreid/data/datasets/image/ilids.py | http://www.eecs.qmul.ac.uk/~jason/data/i-LIDS_Pedestrian.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/OSNet/torchreid/data/datasets/image/viper.py | http://users.soe.ucsc.edu/~manduchi/VIPeR.v1.0.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/OSNet/torchreid/data/datasets/video/dukemtmcvidreid.py | http://vision.cs.duke.edu/DukeMTMC/data/misc/DukeMTMC-VideoReID.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/OSNet/torchreid/data/datasets/video/ilidsvid.py | 
http://www.eecs.qmul.ac.uk/~xiatian/iLIDS-VID/iLIDS-VID.tar | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/OSNet/torchreid/metrics/rank_cylib/rank_cy.pyx | https://cython.readthedocs.io/en/latest/src/userguide/numpy_tutorial.html | 相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/OSNet/torchreid/models/densenet.py | https://download.pytorch.org/models/densenet201-c1103571.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/OSNet/torchreid/models/densenet.py | https://download.pytorch.org/models/densenet169-b2777c0a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/OSNet/torchreid/models/densenet.py | https://download.pytorch.org/models/densenet161-8d451a50.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/OSNet/torchreid/models/densenet.py | https://download.pytorch.org/models/densenet121-a639ec97.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/OSNet/torchreid/models/pcb.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/OSNet/torchreid/models/pcb.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/OSNet/torchreid/models/pcb.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/OSNet/torchreid/models/pcb.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/OSNet/torchreid/models/pcb.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/OSNet/torchreid/models/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/OSNet/torchreid/models/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/OSNet/torchreid/models/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/OSNet/torchreid/models/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/OSNet/torchreid/models/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/OSNet/torchreid/models/resnet.py | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/OSNet/torchreid/models/resnet.py | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/OSNet/torchreid/models/resnet_ibn_a.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/OSNet/torchreid/models/resnet_ibn_a.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/OSNet/torchreid/models/resnet_ibn_a.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/OSNet/torchreid/models/resnet_ibn_b.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/OSNet/torchreid/models/resnet_ibn_b.py | 
https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/OSNet/torchreid/models/resnet_ibn_b.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/OSNet/torchreid/models/resnetmid.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/OSNet/torchreid/models/resnetmid.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/OSNet/torchreid/models/resnetmid.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/OSNet/torchreid/models/resnetmid.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/OSNet/torchreid/models/resnetmid.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/OSNet/torchreid/models/shufflenetv2.py | https://download.pytorch.org/models/shufflenetv2_x1-5666bf0f80.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/OSNet/torchreid/models/shufflenetv2.py | https://download.pytorch.org/models/shufflenetv2_x0.5-f707e7126e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/OSNet/torchreid/models/squeezenet.py | https://download.pytorch.org/models/squeezenet1_1-f364aa15.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/OSNet/torchreid/models/squeezenet.py | https://download.pytorch.org/models/squeezenet1_0-a815701f.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/classification/PCB/README_raw.md b/PyTorch/contrib/cv/classification/PCB/README_raw.md index fb1ecd0cbc8ea48dd783d712cc9b6ad8d5548b0b..af5226f8c19c851df7aaa9d10e71d2adef2f2411 100644 --- a/PyTorch/contrib/cv/classification/PCB/README_raw.md +++ b/PyTorch/contrib/cv/classification/PCB/README_raw.md @@ -12,9 +12,8 @@ Code for the paper [Beyond Part Models: Person Retrieval with Refined Part Pooli 1. Install [Pytorch](https://pytorch.org/) 2. Download dataset - a. Market-1501 [BaiduYun](https://pan.baidu.com/s/1ntIi2Op?errno=0&errmsg=Auth%20Login%20Sucess&&bduss=&ssnerror=0&traceid=) - b. DukeMTMC-reID[BaiduYun](https://pan.baidu.com/share/init?surl=jS0XM7Var5nQGcbf9xUztw) (password:bhbh) - c. Move them to ```~/datasets/Market-1501/(DukeMTMC-reID)``` + a. Prepare `Market-1501` and `DukeMTMC-reID` + b. 
Move them to ```~/datasets/Market-1501/(DukeMTMC-reID)``` ## train PCB diff --git a/PyTorch/contrib/cv/classification/PCB/public_address_statement.md b/PyTorch/contrib/cv/classification/PCB/public_address_statement.md index 32c490a018ddf32f2fe20bb4c811410488402fa9..851f34c6f8d6796f296773de4e12e0d06b51353a 100644 --- a/PyTorch/contrib/cv/classification/PCB/public_address_statement.md +++ b/PyTorch/contrib/cv/classification/PCB/public_address_statement.md @@ -1,4 +1,3 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------------------------|----------------------------------------------|------------------------|------| -| 开源代码引入 | https://github.com/syfafterzy/PCB_RPP_for_reID/blob/master/reid/utils/progress/setup.py | PCB/reid/utils/progress/setup.py | verigak@gmail.com | 作者邮箱 | -| 开源代码引入 | https://github.com/syfafterzy/PCB_RPP_for_reID/blob/master/reid/utils/progress/setup.py | PCB/reid/utils/progress/setup.py | http://github.com/verigak/progress/ | 开源地址 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|-------------------------------------------------------------------------------------|-------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/PCB/reid/utils/progress/setup.py | verigak@gmail.com | 作者邮箱 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/classification/PnasNet5Large/public_address_statement.md b/PyTorch/contrib/cv/classification/PnasNet5Large/public_address_statement.md index 3c71808d7fab4551e3d15a5d2a624e299a46c843..0c78e2338a280a2acaa32c9a243358c8a7658884 100644 --- a/PyTorch/contrib/cv/classification/PnasNet5Large/public_address_statement.md +++ b/PyTorch/contrib/cv/classification/PnasNet5Large/public_address_statement.md @@ -1,12 +1,4 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------------------------|----------------------------------------------|------------------------|----| -| 开发引入 | / | url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 下载测试图片 | -| 开源代码引入 | https://github.com/Cadene/pretrained-models.pytorch/blob/master/pretrainedmodels/models/pnasnet.py | PnasNet5Large/pnasnet.py | http://data.lip6.fr/cadene/pretrainedmodels/pnasnet5large-bf079911.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/verigak/progress/blob/master/setup.py | PnasNet5Large/utils/progress/setup.py | verigak@gmail.com | 作者邮箱 | -| 开源代码引入 | https://github.com/verigak/progress/blob/master/setup.py | PnasNet5Large/utils/progress/setup.py | http://github.com/verigak/progress/ | 开源地址 | -| 开发引入 | / | PnasNet5Large/imagenet_fast.py | https://github.com/NVIDIA/apex/tree/f5cd5ae937f168c763985f627bbf850648ea5f3f/examples/imagenet | 源码实现 | -| 开发引入 | / | PnasNet5Large/crossentropy.py | https://arxiv.org/pdf/1512.00567.pdf | 论文地址 | -| 开发引入 | / | PnasNet5Large/pnasnet.py | https://arxiv.org/abs/1712.00559 | 论文地址 | -| 开发引入 | / | PnasNet5Large/utils/misc.py | https://github.com/pytorch/examples/blob/master/imagenet/main.py#L247-L262 | 源码实现 | -| 开发引入 | / | PnasNet5Large/pnasnet.py | https://github.com/tensorflow/tpu/issues/494#issuecomment-532968956 | 相关说明 | -| 开发引入 | / | PnasNet5Large/imagenet_fast.py | https://www.github.com/nvidia/apex | 相关说明 | +| 文件位置 | 公网地址 | 公网地址用途 | +|------------------------------------------------------------------------------------------|-----------------------------------------------------------------------------|---------| +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/classification/PnasNet5Large/url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/PnasNet5Large/utils/progress/setup.py | verigak@gmail.com | 作者邮箱 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/classification/RegNetX_ID4127_for_PyTorch/public_address_statement.md b/PyTorch/contrib/cv/classification/RegNetX_ID4127_for_PyTorch/public_address_statement.md index 9e718da97aba5e4391cee44de84df1b793bce24b..2e90bdafcb8240a8c198fa7d7b329bddc7ee1aad 100644 --- a/PyTorch/contrib/cv/classification/RegNetX_ID4127_for_PyTorch/public_address_statement.md +++ b/PyTorch/contrib/cv/classification/RegNetX_ID4127_for_PyTorch/public_address_statement.md @@ -1,9 +1,4 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------------------------|----------------------------------------------|------------------------|----| -| 开发引入 | / | url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 下载测试图片 | -| 开源代码引入 | https://github.com/verigak/progress/blob/master/setup.py | RegNetX_ID4127_for_PyTorch/utils/progress/setup.py | verigak@gmail.com | 作者邮箱 | -| 开源代码引入 | https://github.com/verigak/progress/blob/master/setup.py | RegNetX_ID4127_for_PyTorch/utils/progress/setup.py | http://github.com/verigak/progress/ | 开源地址 | -| 开发引入 | / | RegNetX_ID4127_for_PyTorch/imagenet_fast.py | https://www.github.com/nvidia/apex | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/pycls/pycls/models/blocks.py | RegNetX_ID4127_for_PyTorch/utils/blocks.py | https://github.com/pytorch/pytorch/blob/master/torch/nn/functional.py | 源码实现 | -| 开发引入 | / | RegNetX_ID4127_for_PyTorch/imagenet_fast.py | https://github.com/NVIDIA/apex/tree/f5cd5ae937f168c763985f627bbf850648ea5f3f/examples/imagenet | 源码实现 | -| 开发引入 | / | RegNetX_ID4127_for_PyTorch/utils/misc.py | https://github.com/pytorch/examples/blob/master/imagenet/main.py#L247-L262 | 源码实现 | +| 文件位置 | 公网地址 | 公网地址用途 | +|-------------------------------------------------------------------------------------------------------|-----------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/RegNetX_ID4127_for_PyTorch/url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/RegNetX_ID4127_for_PyTorch/utils/progress/setup.py | verigak@gmail.com | 作者邮箱 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/classification/RegNetY_ID4128_for_PyTorch/public_address_statement.md b/PyTorch/contrib/cv/classification/RegNetY_ID4128_for_PyTorch/public_address_statement.md index 746bbe83da97713cc411ced3ffaee8632227c078..bfd4fb811ce2b38f31b1186c10af75a86c979f74 100644 --- a/PyTorch/contrib/cv/classification/RegNetY_ID4128_for_PyTorch/public_address_statement.md +++ b/PyTorch/contrib/cv/classification/RegNetY_ID4128_for_PyTorch/public_address_statement.md @@ -1,9 +1,4 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------------------------|----------------------------------------------|------------------------|----| -| 开发引入 | / | url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 下载测试图片 | -| 开源代码引入 | https://github.com/verigak/progress/blob/master/setup.py | RegNetY_ID4128_for_PyTorch/utils/progress/setup.py | 
verigak@gmail.com | 作者邮箱 | -| 开源代码引入 | https://github.com/verigak/progress/blob/master/setup.py | RegNetY_ID4128_for_PyTorch/utils/progress/setup.py | http://github.com/verigak/progress/ | 开源地址 | -| 开发引入 | / | RegNetY_ID4128_for_PyTorch/imagenet_fast.py | https://www.github.com/nvidia/apex | 相关说明 | -| 开发引入 | / | RegNetY_ID4128_for_PyTorch/utils/misc.py | https://github.com/pytorch/examples/blob/master/imagenet/main.py#L247-L262 | 源码实现 | -| 开发引入 | / | RegNetY_ID4128_for_PyTorch/imagenet_fast.py | https://github.com/NVIDIA/apex/tree/f5cd5ae937f168c763985f627bbf850648ea5f3f/examples/imagenet | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/pycls/pycls/models/blocks.py | RegNetY_ID4128_for_PyTorch/utils/model/blocks.py | https://github.com/pytorch/pytorch/blob/master/torch/nn/functional.py | 源码实现 | +| 文件位置 | 公网地址 | 公网地址用途 | +|-------------------------------------------------------------------------------------------------------|-----------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/RegNetY_ID4128_for_PyTorch/url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/RegNetY_ID4128_for_PyTorch/utils/progress/setup.py | verigak@gmail.com | 作者邮箱 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/classification/ReidStrongBaseline/public_address_statement.md b/PyTorch/contrib/cv/classification/ReidStrongBaseline/public_address_statement.md index b10edfec59668c1f564f473783da8c04f1adff55..c6e76c87c9cbcebcca056273700af906133d0b5c 100644 --- a/PyTorch/contrib/cv/classification/ReidStrongBaseline/public_address_statement.md +++ b/PyTorch/contrib/cv/classification/ReidStrongBaseline/public_address_statement.md @@ -1,61 +1,8 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------------------------|----------------------------------------------|------------------------|----| -| 开源代码引入 | https://github.com/JDAI-CV/fast-reid/blob/master/fastreid/data/datasets/dukemtmcreid.py | ReidStrongBaseline/data/datasets/dukemtmcreid.py | http://vision.cs.duke.edu/DukeMTMC/data/misc/DukeMTMC-reID.zip | 下载数据集 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnet50-19c8e357.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 下载权重文件 | -| 开发引入 | / | url.ini | http://data.lip6.fr/cadene/pretrainedmodels/senet154-c7b49a05.pth | 下载权重文件 | -| 开发引入 | / | url.ini | http://data.lip6.fr/cadene/pretrainedmodels/se_resnet50-ce0d4300.pth | 下载权重文件 | -| 开发引入 | / | url.ini | http://data.lip6.fr/cadene/pretrainedmodels/se_resnet101-7e38fcc6.pth | 下载权重文件 | -| 开发引入 | / | url.ini | http://data.lip6.fr/cadene/pretrainedmodels/se_resnet152-d17c99b7.pth | 下载权重文件 | -| 开发引入 | / | url.ini | http://data.lip6.fr/cadene/pretrainedmodels/se_resnext50_32x4d-a260b3a4.pth | 下载权重文件 | -| 开发引入 | / | url.ini | http://data.lip6.fr/cadene/pretrainedmodels/se_resnext101_32x4d-3b2fe3d8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/JDAI-CV/fast-reid/blob/master/tools/train_net.py | ReidStrongBaseline/data/transforms/__init__.py | sherlockliao01@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/JDAI-CV/fast-reid/blob/master/fastreid/data/datasets/market1501.py | ReidStrongBaseline/data/datasets/market1501.py | 
http://www.liangzheng.org/Project/project_reid.html | 相关说明 | -| 开源代码引入 | https://github.com/JDAI-CV/fast-reid/blob/master/fastreid/data/datasets/dukemtmcreid.py | ReidStrongBaseline/data/datasets/dukemtmcreid.py | https://github.com/layumi/DukeMTMC-reID_evaluation | 源码实现 | -| 开源代码引入 | https://github.com/JDAI-CV/fast-reid/blob/master/projects/FastAttr/fastattr/datasets/dukemtmcattr.py | ReidStrongBaseline/data/datasets/cuhk03.py | liaoxingyu2@jd.com | 邮箱地址 | -| 开源代码引入 | https://github.com/JDAI-CV/fast-reid/blob/master/tools/train_net.py | ReidStrongBaseline/finetune.py | sherlockliao01@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/JDAI-CV/fast-reid/blob/master/tools/train_net.py | ReidStrongBaseline/solver/__init__.py | sherlockliao01@gmail.com | 邮箱地址 | -| 开发引入 | / | ReidStrongBaseline/utils/re_ranking.py | http://openaccess.thecvf.com/content_cvpr_2017/papers/Zhong_Re-Ranking_Person_Re-Identification_CVPR_2017_paper.pdf | 论文地址 | -| 开源代码引入 | https://github.com/JDAI-CV/fast-reid/blob/master/tools/train_net.py | ReidStrongBaseline/utils/__init__.py | sherlockliao01@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/JDAI-CV/fast-reid/blob/master/tools/train_net.py | ReidStrongBaseline/layers/triplet_loss.py | sherlockliao01@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/JDAI-CV/fast-reid/blob/master/tools/train_net.py | ReidStrongBaseline/data/datasets/eval_reid.py | sherlockliao01@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/JDAI-CV/fast-reid/blob/master/tools/train_net.py | ReidStrongBaseline/engine/trainer.py | sherlockliao01@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/JDAI-CV/fast-reid/blob/master/tools/train_net.py | ReidStrongBaseline/data/datasets/__init__.py | sherlockliao01@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/JDAI-CV/fast-reid/blob/master/tools/train_net.py | ReidStrongBaseline/modeling/baseline.py | sherlockliao01@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/JDAI-CV/fast-reid/blob/master/tools/train_net.py | ReidStrongBaseline/modeling/backbones/resnet.py | sherlockliao01@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/JDAI-CV/fast-reid/blob/master/tools/train_net.py | ReidStrongBaseline/data/build.py | sherlockliao01@gmail.com | 邮箱地址 | -| 开发引入 | / | ReidStrongBaseline/data/datasets/eval_reid.py | https://en.wikipedia.org/wiki/Evaluation_measures_ | 相关说明 | -| 开源代码引入 | https://github.com/JDAI-CV/fast-reid/blob/master/tools/train_net.py | ReidStrongBaseline/utils/reid_metric.py | sherlockliao01@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/JDAI-CV/fast-reid/blob/master/projects/FastAttr/fastattr/datasets/dukemtmcattr.py | ReidStrongBaseline/data/transforms/build.py | liaoxingyu2@jd.com | 邮箱地址 | -| 开源代码引入 | https://github.com/JDAI-CV/fast-reid/blob/master/fastreid/data/datasets/veri.py | ReidStrongBaseline/data/datasets/veri.py | https://vehiclereid.github.io/VeRi/ | 相关说明 | -| 开源代码引入 | https://github.com/JDAI-CV/fast-reid/blob/master/tools/train_net.py | ReidStrongBaseline/data/datasets/bases.py | sherlockliao01@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/JDAI-CV/fast-reid/blob/master/tools/train_net.py | ReidStrongBaseline/solver/build.py | sherlockliao01@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/JDAI-CV/fast-reid/blob/master/tools/train_net.py | ReidStrongBaseline/tools/train_8P.py | sherlockliao01@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/JDAI-CV/fast-reid/blob/master/fastreid/data/datasets/msmt17.py | ReidStrongBaseline/data/datasets/msmt17.py | http://www.pkuvmc.com/publications/msmt17.html | 相关说明 | -| 开源代码引入 | 
https://github.com/JDAI-CV/fast-reid/blob/master/fastreid/evaluation/rerank.py | ReidStrongBaseline/utils/re_ranking.py | https://github.com/zhunzhong07/person-re-ranking | 源码实现 | -| 开源代码引入 | https://github.com/JDAI-CV/fast-reid/blob/master/tools/train_net.py | ReidStrongBaseline/utils/logger.py | sherlockliao01@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/JDAI-CV/fast-reid/blob/master/projects/FastAttr/fastattr/datasets/dukemtmcattr.py | ReidStrongBaseline/data/transforms/transforms.py | liaoxingyu2@jd.com | 邮箱地址 | -| 开源代码引入 | https://github.com/JDAI-CV/fast-reid/blob/master/tools/train_net.py | ReidStrongBaseline/data/datasets/dataset_loader.py | sherlockliao01@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/JDAI-CV/fast-reid/blob/master/projects/FastAttr/fastattr/datasets/dukemtmcattr.py | ReidStrongBaseline/data/samplers/triplet_sampler.py | liaoxingyu2@jd.com | 邮箱地址 | -| 开源代码引入 | https://github.com/JDAI-CV/fast-reid/blob/master/tools/train_net.py | ReidStrongBaseline/solver/lr_scheduler.py | sherlockliao01@gmail.com | 邮箱地址 | -| 开发引入 | / | ReidStrongBaseline/data/datasets/cuhk03.py | http://www.ee.cuhk.edu.hk/~xgwang/CUHK_identification.html# | 相关说明 | -| 开源代码引入 | https://github.com/JDAI-CV/fast-reid/blob/master/tools/train_net.py | ReidStrongBaseline/tools/train_1P.py | sherlockliao01@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/JDAI-CV/fast-reid/blob/master/projects/FastAttr/fastattr/datasets/dukemtmcattr.py | ReidStrongBaseline/data/datasets/dukemtmcreid.py | liaoxingyu2@jd.com | 邮箱地址 | -| 开发引入 | / | ReidStrongBaseline/modeling/backbones/senet.py | https://github.com/pytorch/vision/blob/master/torchvision/models/resnet.py | 源码实现 | -| 开源代码引入 | https://github.com/JDAI-CV/fast-reid/blob/master/tools/train_net.py | ReidStrongBaseline/layers/__init__.py | sherlockliao01@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/JDAI-CV/fast-reid/blob/master/tools/train_net.py | ReidStrongBaseline/utils/iotools.py | sherlockliao01@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/JDAI-CV/fast-reid/blob/master/tools/train_net.py | ReidStrongBaseline/data/__init__.py | sherlockliao01@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/JDAI-CV/fast-reid/blob/master/tools/train_net.py | ReidStrongBaseline/tools/test.py | sherlockliao01@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/JDAI-CV/fast-reid/blob/master/fastreid/modeling/losses/triplet_loss.py | ReidStrongBaseline/layers/triplet_loss.py | https://github.com/Cysu/open-reid | 源码实现 | -| 开源代码引入 | https://github.com/JDAI-CV/fast-reid/blob/master/tools/train_net.py | ReidStrongBaseline/modeling/backbones/__init__.py | sherlockliao01@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/JDAI-CV/fast-reid/blob/master/tools/train_net.py | ReidStrongBaseline/modeling/__init__.py | sherlockliao01@gmail.com | 邮箱地址 | -| 开发引入 | / | ReidStrongBaseline/data/transforms/transforms.py | https://arxiv.org/pdf/1708.04896.pdf | 论文地址 | -| 开源代码引入 | https://github.com/JDAI-CV/fast-reid/blob/master/tools/train_net.py | ReidStrongBaseline/data/datasets/market1501.py | sherlockliao01@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/JDAI-CV/fast-reid/blob/master/tools/train_net.py | ReidStrongBaseline/config/__init__.py | sherlockliao01@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/JDAI-CV/fast-reid/blob/master/tools/train_net.py | ReidStrongBaseline/engine/inference.py | sherlockliao01@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/JDAI-CV/fast-reid/blob/master/tools/train_net.py | ReidStrongBaseline/data/samplers/__init__.py | sherlockliao01@gmail.com | 
邮箱地址 | -| 开源代码引入 | https://github.com/JDAI-CV/fast-reid/blob/master/tools/train_net.py | ReidStrongBaseline/tests/__init__.py | sherlockliao01@gmail.com | 邮箱地址 | -| 开发引入 | / | ReidStrongBaseline/data/samplers/triplet_sampler.py | https://github.com/Cysu/open-reid/blob/master/reid/utils/data/sampler.py | 源码实现 | -| 开源代码引入 | https://github.com/JDAI-CV/fast-reid/blob/master/tools/train_net.py | ReidStrongBaseline/data/collate_batch.py | sherlockliao01@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/JDAI-CV/fast-reid/blob/master/tools/train_net.py | ReidStrongBaseline/tools/__init__.py | sherlockliao01@gmail.com | 邮箱地址 | +| 文件位置 | 公网地址 | 公网地址用途 | +|-----------------------------------------------------------------------------------------------------|----------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ReidStrongBaseline/data/datasets/dukemtmcreid.py | http://vision.cs.duke.edu/DukeMTMC/data/misc/DukeMTMC-reID.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ReidStrongBaseline/data/datasets/market1501.py | http://www.liangzheng.org/Project/project_reid.html | 相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ReidStrongBaseline/data/datasets/msmt17.py | http://www.pkuvmc.com/publications/msmt17.html | 相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ReidStrongBaseline/url.ini | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ReidStrongBaseline/url.ini | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ReidStrongBaseline/url.ini | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/classification/RepVGG/public_address_statement.md b/PyTorch/contrib/cv/classification/RepVGG/public_address_statement.md index 9ccd7ba18c22b867638fae4b980297957d899837..283ef21c46f0fb6422f871751730f629c089f20e 100644 --- a/PyTorch/contrib/cv/classification/RepVGG/public_address_statement.md +++ b/PyTorch/contrib/cv/classification/RepVGG/public_address_statement.md @@ -1,6 +1,4 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------------------------|----------------------------------------------|------------------------|----| -| 开发引入 | / | url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 下载测试图片 | -| 开源代码引入 | https://github.com/DingXiaoH/RepVGG/se_block.py | RepVGG/se_block.py | https://openaccess.thecvf.com/content_cvpr_2018/html/Hu_Squeeze-and-Excitation_Networks_CVPR_2018_paper.html | 相关说明 | -| 开源代码引入 | https://github.com/DingXiaoH/RepVGG/quantization/repvgg_quantized.py | RepVGG/quantization/repvgg_quantized.py | https://pytorch.org/tutorials/advanced/static_quantization_tutorial.html | 相关说明 | -| 开源代码引入 | https://github.com/DingXiaoH/RepVGG/example_pspnet.py | RepVGG/example_pspnet.py | https://github.com/hszhao/semseg | 源码实现 | +| 文件位置 | 公网地址 | 公网地址用途 | +|--------------------------------------------------------------------|-----------------------------------------------------------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/RepVGG/train.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/RepVGG/url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 数据集地址 | \ No 
newline at end of file diff --git a/PyTorch/contrib/cv/classification/Res2Net101_v1b/public_address_statement.md b/PyTorch/contrib/cv/classification/Res2Net101_v1b/public_address_statement.md index 1801517774aed644e0e5a2b7f3bfec994a937a78..1a10371fadd418480d1aaf16cd7596cc1f5f4bd4 100644 --- a/PyTorch/contrib/cv/classification/Res2Net101_v1b/public_address_statement.md +++ b/PyTorch/contrib/cv/classification/Res2Net101_v1b/public_address_statement.md @@ -1,5 +1,6 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------------------------|----------------------------------------------|------------------------|----| -| 开发引入 | / | url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 下载测试图片 | -| 开发引入 | / | url.ini | https://shanghuagao.oss-cn-beijing.aliyuncs.com/res2net/res2net50_v1b_26w_4s-3cf99910.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://shanghuagao.oss-cn-beijing.aliyuncs.com/res2net/res2net101_v1b_26w_4s-0812c246.pth | 下载权重文件 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|---------------------------------------------------------------------------|--------------------------------------------------------------------------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Res2Net101_v1b/main.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Res2Net101_v1b/url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Res2Net101_v1b/url.ini | https://shanghuagao.oss-cn-beijing.aliyuncs.com/res2net/res2net50_v1b_26w_4s-3cf99910.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Res2Net101_v1b/url.ini | https://shanghuagao.oss-cn-beijing.aliyuncs.com/res2net/res2net101_v1b_26w_4s-0812c246.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/classification/ResNeSt50_for_PyTorch/public_address_statement.md b/PyTorch/contrib/cv/classification/ResNeSt50_for_PyTorch/public_address_statement.md index 55ba810c9d7f6abf59fedc1b4ab12381c3f6e9cb..9c68202c75803c521f4c26568df96187a540f5b6 100644 --- a/PyTorch/contrib/cv/classification/ResNeSt50_for_PyTorch/public_address_statement.md +++ b/PyTorch/contrib/cv/classification/ResNeSt50_for_PyTorch/public_address_statement.md @@ -1,17 +1,5 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------------------------|----------------------------------------------|------------------------|--------| -| 开发引入 | / | url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 下载测试图片 | -| 开源代码引入 | https://github.com/zhanghang1989/ResNeSt/blob/master/resnest/torch/models/resnest.py | ResNeSt50_for_PyTorch/module/models/resnest.py | https://github.com/zhanghang1989/ResNeSt/releases/download/weights_step1/{}-{}.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://s3.us-west-1.wasabisys.com/resnest/module/{}-{}.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/zhanghang1989/ResNeSt/blob/master/scripts/dataset/prepare_imagenet.py | ResNeSt50_for_PyTorch/module/prepare_imagenet.py | https://raw.githubusercontent.com/soumith/imagenetloader.torch/master/valprep.sh | 下载数据集 | -| 开源代码引入 | https://github.com/zhanghang1989/ResNeSt/tests/test_torch.py | ResNeSt50_for_PyTorch/module/datasets/imagenet.py | zhanghang0704@gmail.com | 邮箱地址 | -| 开源代码引入 | 
https://github.com/zhanghang1989/ResNeSt/tests/test_torch.py | ResNeSt50_for_PyTorch/train_npu.py | zhanghang0704@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/zhanghang1989/ResNeSt/resnest/torch/transforms/autoaug.py | ResNeSt50_for_PyTorch/module/transforms/autoaug.py | https://github.com/kakaobrain/fast-autoaugment | 源码实现 | -| 开源代码引入 | https://github.com/zhanghang1989/ResNeSt/resnest/torch/transforms/autoaug.py | ResNeSt50_for_PyTorch/module/transforms/autoaug.py | https://github.com/rpmcruz/autoaugment | 源码实现 | -| 开源代码引入 | https://github.com/zhanghang1989/ResNeSt/tests/test_torch.py | ResNeSt50_for_PyTorch/module/models/resnet.py | zhanghang0704@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/zhanghang1989/ResNeSt/tests/test_torch.py | ResNeSt50_for_PyTorch/module/transforms/build.py | zhanghang0704@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/zhanghang1989/ResNeSt/tests/test_torch.py | ResNeSt50_for_PyTorch/module/models/resnest.py | zhanghang0704@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/zhanghang1989/ResNeSt/tests/test_torch.py | ResNeSt50_for_PyTorch/module/transforms/autoaug.py | zhanghang0704@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/zhanghang1989/ResNeSt/tests/test_torch.py | ResNeSt50_for_PyTorch/module/transforms/transforms.py | zhanghang0704@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/zhanghang1989/ResNeSt/resnest/torch/utils.py | ResNeSt50_for_PyTorch/module/utils.py | zhang.hang@rutgers.edu | 邮箱地址 | -| 开源代码引入 | https://github.com/zhanghang1989/ResNeSt/resnest/torch/transforms/transforms.py | ResNeSt50_for_PyTorch/module/transforms/transforms.py | https://github.com/kakaobrain/fast-autoaugment/blob/master/FastAutoAugment/data.py | 源码实现 | +| 文件位置 | 公网地址 | 公网地址用途 | +|-----------------------------------------------------------------------------------------------------|----------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ResNeSt50_for_PyTorch/module/prepare_imagenet.py | https://raw.githubusercontent.com/soumith/imagenetloader.torch/master/valprep.sh | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ResNeSt50_for_PyTorch/url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ResNeSt50_for_PyTorch/url.ini | https://s3.us-west-1.wasabisys.com/resnest/module | 权重地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/classification/ResNeXt-50-32x4d_ID1624_for_PyTorch/public_address_statement.md b/PyTorch/contrib/cv/classification/ResNeXt-50-32x4d_ID1624_for_PyTorch/public_address_statement.md index b4f9175fe916f70878220a6fd0867728a79df9a2..5729d87301c90d9a913a25e2c51d381f066e1b28 100644 --- a/PyTorch/contrib/cv/classification/ResNeXt-50-32x4d_ID1624_for_PyTorch/public_address_statement.md +++ b/PyTorch/contrib/cv/classification/ResNeXt-50-32x4d_ID1624_for_PyTorch/public_address_statement.md @@ -1,18 +1,14 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------------------------|----------------------------------------------|------------------------|----| -| 开发引入 | / | url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 下载测试图片 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnet18-5c106cde.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 下载权重文件 | -| 开发引入 | / | url.ini | 
https://download.pytorch.org/models/resnet50-19c8e357.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/wide_resnet50_2-95faca4d.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 下载权重文件 | -| 开发引入 | / | ResNeXt-50-32x4d_ID1624_for_PyTorch/models/resnet_0_6_0.py | https://ngc.nvidia.com/catalog/model-scripts/nvidia:resnet_50_v1_5_for_pytorch | 相关说明 | -| 开发引入 | / | ResNeXt-50-32x4d_ID1624_for_PyTorch/models/resnet_0_6_0.py | https://arxiv.org/pdf/1605.07146.pdf | 论文地址 | -| 开发引入 | / | ResNeXt-50-32x4d_ID1624_for_PyTorch/models/resnet_0_6_0.py | https://arxiv.org/abs/1706.02677 | 论文地址 | -| 开发引入 | / | ResNeXt-50-32x4d_ID1624_for_PyTorch/models/resnet_0_6_0.py | https://arxiv.org/abs/1512.03385 | 论文地址 | -| 开发引入 | / | ResNeXt-50-32x4d_ID1624_for_PyTorch/models/resnet_0_6_0.py | https://arxiv.org/pdf/1512.03385.pdf | 论文地址 | -| 开发引入 | / | ResNeXt-50-32x4d_ID1624_for_PyTorch/models/resnet_0_6_0.py | https://arxiv.org/pdf/1611.05431.pdf | 论文地址 | +| 文件位置 | 公网地址 | 公网地址用途 | +|------------------------------------------------------------------------------------------------|-----------------------------------------------------------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ResNeXt-50-32x4d_ID1624_for_PyTorch/main.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ResNeXt-50-32x4d_ID1624_for_PyTorch/url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ResNeXt-50-32x4d_ID1624_for_PyTorch/url.ini | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ResNeXt-50-32x4d_ID1624_for_PyTorch/url.ini | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ResNeXt-50-32x4d_ID1624_for_PyTorch/url.ini | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ResNeXt-50-32x4d_ID1624_for_PyTorch/url.ini | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ResNeXt-50-32x4d_ID1624_for_PyTorch/url.ini | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ResNeXt-50-32x4d_ID1624_for_PyTorch/url.ini | https://download.pytorch.org/models/wide_resnet50_2-95faca4d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ResNeXt-50-32x4d_ID1624_for_PyTorch/url.ini | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ResNeXt-50-32x4d_ID1624_for_PyTorch/url.ini | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ResNeXt-50-32x4d_ID1624_for_PyTorch/url.ini | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/classification/ResNeXt101_32x8d_for_PyTorch/public_address_statement.md b/PyTorch/contrib/cv/classification/ResNeXt101_32x8d_for_PyTorch/public_address_statement.md index bd47fbf9545b2934e34b157187f69333a7b958b2..59ff680e883e6fe7c2aaff8e39d4661a2961b1a9 100644 --- a/PyTorch/contrib/cv/classification/ResNeXt101_32x8d_for_PyTorch/public_address_statement.md +++ b/PyTorch/contrib/cv/classification/ResNeXt101_32x8d_for_PyTorch/public_address_statement.md @@ -1,18 +1,15 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------------------------|----------------------------------------------|------------------------|----| -| 开发引入 | / | url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 下载测试图片 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnet18-5c106cde.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnet50-19c8e357.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/wide_resnet50_2-95faca4d.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 下载权重文件 | -| 开发引入 | / | ResNeXt101_32x8d_for_PyTorch/models/resnet_0_6_0.py | https://arxiv.org/pdf/1611.05431.pdf | 论文地址 | -| 开发引入 | / | ResNeXt101_32x8d_for_PyTorch/models/resnet_0_6_0.py | https://arxiv.org/abs/1706.02677 | 论文地址 | -| 开发引入 | / | ResNeXt101_32x8d_for_PyTorch/models/resnet_0_6_0.py | https://arxiv.org/pdf/1605.07146.pdf | 论文地址 | -| 开发引入 | / | ResNeXt101_32x8d_for_PyTorch/models/resnet_0_6_0.py | https://arxiv.org/abs/1512.03385 | 论文地址 | -| 开发引入 | / | ResNeXt101_32x8d_for_PyTorch/models/resnet_0_6_0.py | https://ngc.nvidia.com/catalog/model-scripts/nvidia:resnet_50_v1_5_for_pytorch | 相关说明 | -| 开发引入 | / | ResNeXt101_32x8d_for_PyTorch/models/resnet_0_6_0.py | https://arxiv.org/pdf/1512.03385.pdf | 论文地址 | +| 文件位置 | 公网地址 | 公网地址用途 | +|----------------------------------------------------------------------------------------------------------|-----------------------------------------------------------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ResNeXt101_32x8d_for_PyTorch/main.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ResNeXt101_32x8d_for_PyTorch/modelarts/train_start.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ResNeXt101_32x8d_for_PyTorch/url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ResNeXt101_32x8d_for_PyTorch/url.ini | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ResNeXt101_32x8d_for_PyTorch/url.ini 
| https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ResNeXt101_32x8d_for_PyTorch/url.ini | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ResNeXt101_32x8d_for_PyTorch/url.ini | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ResNeXt101_32x8d_for_PyTorch/url.ini | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ResNeXt101_32x8d_for_PyTorch/url.ini | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ResNeXt101_32x8d_for_PyTorch/url.ini | https://download.pytorch.org/models/wide_resnet50_2-95faca4d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ResNeXt101_32x8d_for_PyTorch/url.ini | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ResNeXt101_32x8d_for_PyTorch/url.ini | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ResNeXt101_32x8d_for_PyTorch/url.ini | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/classification/ResNet101_ID1595_for_PyTorch/public_address_statement.md b/PyTorch/contrib/cv/classification/ResNet101_ID1595_for_PyTorch/public_address_statement.md index 944e18ff8604029193452c357f28559849e6b826..94e11807ea543d90bfcbefb256abdf75fb123b84 100644 --- a/PyTorch/contrib/cv/classification/ResNet101_ID1595_for_PyTorch/public_address_statement.md +++ b/PyTorch/contrib/cv/classification/ResNet101_ID1595_for_PyTorch/public_address_statement.md @@ -1,12 +1,9 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------------------------|----------------------------------------------|------------------------|----| -| 开发引入 | / | url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 下载测试图片 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnet18-5c106cde.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnet50-19c8e357.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 下载权重文件 | -| 开发引入 | / | ResNet101_ID1595_for_PyTorch/resnet.py | https://arxiv.org/pdf/1605.07146.pdf | 论文地址 | -| 开发引入 | / | ResNet101_ID1595_for_PyTorch/resnet.py | https://arxiv.org/pdf/1611.05431.pdf | 论文地址 | -| 开发引入 | / | ResNet101_ID1595_for_PyTorch/resnet.py | https://arxiv.org/abs/1706.02677 | 论文地址 | -| 开发引入 | / | ResNet101_ID1595_for_PyTorch/resnet.py | https://arxiv.org/pdf/1512.03385.pdf | 论文地址 | +| 文件位置 | 公网地址 | 公网地址用途 | +|-----------------------------------------------------------------------------------------|-----------------------------------------------------------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ResNet101_ID1595_for_PyTorch/main.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ResNet101_ID1595_for_PyTorch/url.ini | 
https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ResNet101_ID1595_for_PyTorch/url.ini | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ResNet101_ID1595_for_PyTorch/url.ini | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ResNet101_ID1595_for_PyTorch/url.ini | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ResNet101_ID1595_for_PyTorch/url.ini | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ResNet101_ID1595_for_PyTorch/url.ini | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/classification/ResNet152/public_address_statement.md b/PyTorch/contrib/cv/classification/ResNet152/public_address_statement.md index 59995acca5646d7d4ae736852e8ee5221c2b48c4..66ca980d09004f237e60514eb8c1a4fb229bbd67 100644 --- a/PyTorch/contrib/cv/classification/ResNet152/public_address_statement.md +++ b/PyTorch/contrib/cv/classification/ResNet152/public_address_statement.md @@ -1,18 +1,14 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------------------------|----------------------------------------------|------------------------|----| -| 开发引入 | / | url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 下载测试图片 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnet18-5c106cde.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnet50-19c8e357.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/wide_resnet50_2-95faca4d.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 下载权重文件 | -| 开发引入 | / | ResNet152/models/resnet_0_6_0.py | https://arxiv.org/abs/1706.02677 | 论文地址 | -| 开发引入 | / | ResNet152/models/resnet_0_6_0.py | https://arxiv.org/pdf/1512.03385.pdf | 论文地址 | -| 开发引入 | / | ResNet152/models/resnet_0_6_0.py | https://arxiv.org/pdf/1611.05431.pdf | 论文地址 | -| 开源代码引入 | https://github.com/pytorch/examples.git/fast_neural_style/neural_style/transformer_net.py | ResNet152/models/resnet_0_6_0.py | https://arxiv.org/abs/1512.03385 | 论文地址 | -| 开发引入 | / | ResNet152/models/resnet_0_6_0.py | https://ngc.nvidia.com/catalog/model-scripts/nvidia:resnet_50_v1_5_for_pytorch | 相关说明 | -| 开发引入 | / | ResNet152/models/resnet_0_6_0.py | https://arxiv.org/pdf/1605.07146.pdf | 论文地址 | +| 文件位置 | 公网地址 | 公网地址用途 | +|---------------------------------------------------------------------------------------|-----------------------------------------------------------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ResNet152/main.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ResNet152/ModelArts/train_start.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ResNet152/url.ini | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ResNet152/url.ini | https://download.pytorch.org/models/wide_resnet50_2-95faca4d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ResNet152/url.ini | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ResNet152/url.ini | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ResNet152/url.ini | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ResNet152/url.ini | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ResNet152/url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ResNet152/url.ini | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ResNet152/url.ini | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ResNet152/url.ini | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/classification/ResNet18_ID1593_for_PyTorch/public_address_statement.md b/PyTorch/contrib/cv/classification/ResNet18_ID1593_for_PyTorch/public_address_statement.md index 9a10981ca3f106a33da1442aa1bfb8bfabca6952..8f35132024aa6477ac0d3ff6fa2b872a7d6b8374 100644 --- a/PyTorch/contrib/cv/classification/ResNet18_ID1593_for_PyTorch/public_address_statement.md +++ b/PyTorch/contrib/cv/classification/ResNet18_ID1593_for_PyTorch/public_address_statement.md @@ -1,3 +1,4 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------------------------|----------------------------------------------|------------------------|----| -| 开发引入 | / | url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 下载测试图片 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|----------------------------------------------------------------------------------------|-----------------------------------------------------------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ResNet18_ID1593_for_PyTorch/main.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ResNet18_ID1593_for_PyTorch/url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 数据集地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/classification/ResNet34_ID1594_for_PyTorch/public_address_statement.md b/PyTorch/contrib/cv/classification/ResNet34_ID1594_for_PyTorch/public_address_statement.md index ecae460c79d8a5479efc2cac672d947e78d45ee6..911b6c35a823c730f50397ffad1b83e9573c329f 100644 --- a/PyTorch/contrib/cv/classification/ResNet34_ID1594_for_PyTorch/public_address_statement.md +++ b/PyTorch/contrib/cv/classification/ResNet34_ID1594_for_PyTorch/public_address_statement.md @@ -1,18 +1,13 @@ -| 类型 | 开源代码地址 | 文件名 | 
公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------------------------|----------------------------------------------|------------------------|----| -| 开发引入 | / | url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 下载测试图片 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnet18-5c106cde.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnet50-19c8e357.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/wide_resnet50_2-95faca4d.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 下载权重文件 | -| 开发引入 | / | ResNet34_ID1594_for_PyTorch/models/resnet.py | https://arxiv.org/pdf/1605.07146.pdf | 论文地址 | -| 开源代码引入 | https://github.com/pytorch/examples.git/fast_neural_style/neural_style/transformer_net.py | ResNet34_ID1594_for_PyTorch/models/resnet.py | https://arxiv.org/abs/1512.03385 | 论文地址 | -| 开发引入 | / | ResNet34_ID1594_for_PyTorch/models/resnet.py | https://ngc.nvidia.com/catalog/model-scripts/nvidia:resnet_50_v1_5_for_pytorch | 相关说明 | -| 开发引入 | / | ResNet34_ID1594_for_PyTorch/models/resnet.py | https://arxiv.org/abs/1706.02677 | 论文地址 | -| 开发引入 | / | ResNet34_ID1594_for_PyTorch/models/resnet.py | https://arxiv.org/pdf/1512.03385.pdf | 论文地址 | -| 开发引入 | / | ResNet34_ID1594_for_PyTorch/models/resnet.py | https://arxiv.org/pdf/1611.05431.pdf | 论文地址 | +| 文件位置 | 公网地址 | 公网地址用途 | +|---------------------------------------------------------------------------------------------------------|-----------------------------------------------------------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ResNet34_ID1594_for_PyTorch/modelarts/train_start.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ResNet34_ID1594_for_PyTorch/url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ResNet34_ID1594_for_PyTorch/url.ini | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ResNet34_ID1594_for_PyTorch/url.ini | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ResNet34_ID1594_for_PyTorch/url.ini | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ResNet34_ID1594_for_PyTorch/url.ini | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ResNet34_ID1594_for_PyTorch/url.ini | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ResNet34_ID1594_for_PyTorch/url.ini | https://download.pytorch.org/models/wide_resnet50_2-95faca4d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ResNet34_ID1594_for_PyTorch/url.ini | 
https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ResNet34_ID1594_for_PyTorch/url.ini | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ResNet34_ID1594_for_PyTorch/url.ini | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/classification/SE-ResNet-50/public_address_statement.md b/PyTorch/contrib/cv/classification/SE-ResNet-50/public_address_statement.md index 7b44aeeac8e73e04de184762748674b2e17a86fd..48ecb9f003362bcc0fab28a1644ae331d601b383 100644 --- a/PyTorch/contrib/cv/classification/SE-ResNet-50/public_address_statement.md +++ b/PyTorch/contrib/cv/classification/SE-ResNet-50/public_address_statement.md @@ -1,4 +1,5 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------------------------|----------------------------------------------|------------------------|--------| -| 开发引入 | / | url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 下载测试图片 | -| 开发引入 | / | url.ini | https://github.com/moskomule/senet.pytorch/releases/download/archive/seresnet50-60a8950a85b2b.pkl | 下载权重文件 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|------------------------------------------------------------------------------------------|-----------------------------------------------------------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SE-ResNet-50/main.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SE-ResNet-50/modelarts/train_start.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SE-ResNet-50/url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 数据集地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/classification/SE-ResNext-101-32x4d/public_address_statement.md b/PyTorch/contrib/cv/classification/SE-ResNext-101-32x4d/public_address_statement.md index db7b1df4203667ae945a62ac0b64a08f4a7882dc..c7b2d094d9304be9ef2051c36917b6c253894602 100644 --- a/PyTorch/contrib/cv/classification/SE-ResNext-101-32x4d/public_address_statement.md +++ b/PyTorch/contrib/cv/classification/SE-ResNext-101-32x4d/public_address_statement.md @@ -1,5 +1,3 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------------------------|----------------------------------------------|------------------------|----| -| 开发引入 | / | url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 下载测试图片 | -| 开发引入 | / | SE-ResNext-101-32x4d/models.py | https://arxiv.org/abs/1709.01507 | 论文地址 | -| 开发引入 | / | SE-ResNext-101-32x4d/models.py | https://github.com/open-mmlab/mmclassification/blob/master/configs/_base_/models/seresnext101_32x4d.py | 源码实现 | +| 文件位置 | 公网地址 | 公网地址用途 | +|---------------------------------------------------------------------------------|-----------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SE-ResNext-101-32x4d/url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 数据集地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/classification/SENet154/public_address_statement.md 
b/PyTorch/contrib/cv/classification/SENet154/public_address_statement.md index fceedb9d59b0f3175da2c0e840b54ba471650067..23f0fa53a9a664bdf8634c4c7cf8df5925a07878 100644 --- a/PyTorch/contrib/cv/classification/SENet154/public_address_statement.md +++ b/PyTorch/contrib/cv/classification/SENet154/public_address_statement.md @@ -1,12 +1,3 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------------------------|----------------------------------------------|------------------------|----| -| 开发引入 | / | url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 下载测试图片 | -| 开发引入 | / | url.ini | http://data.lip6.fr/cadene/pretrainedmodels/senet154-c7b49a05.pth | 下载权重文件 | -| 开发引入 | / | url.ini | http://data.lip6.fr/cadene/pretrainedmodels/se_resnet50-ce0d4300.pth | 下载权重文件 | -| 开发引入 | / | url.ini | http://data.lip6.fr/cadene/pretrainedmodels/se_resnet101-7e38fcc6.pth | 下载权重文件 | -| 开发引入 | / | url.ini | http://data.lip6.fr/cadene/pretrainedmodels/se_resnet152-d17c99b7.pth | 下载权重文件 | -| 开发引入 | / | url.ini | http://data.lip6.fr/cadene/pretrainedmodels/se_resnext50_32x4d-a260b3a4.pth | 下载权重文件 | -| 开发引入 | / | url.ini | http://data.lip6.fr/cadene/pretrainedmodels/se_resnext101_32x4d-3b2fe3d8.pth | 下载权重文件 | -| 开发引入 | / | SENet154/lsr.py | https://arxiv.org/pdf/1512.00567.pdf | 论文地址 | -| 开源代码引入 | https://github.com/Cadene/pretrained-models.pytorch/pretrainedmodels/utils.py | SENet154/data.py | https://github.com/tensorflow/models/blob/master/research/inception/inception/image_processing.py#L294 | 源码实现 | -| 开源代码引入 | https://github.com/Cadene/pretrained-models.pytorch/pretrainedmodels/models/senet.py | SENet154/senet.py | https://github.com/pytorch/vision/blob/master/torchvision/models/resnet.py | 源码实现 | +| 文件位置 | 公网地址 | 公网地址用途 | +|---------------------------------------------------------------------|-----------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SENet154/url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 数据集地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/public_address_statement.md b/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/public_address_statement.md index bcffceb751ed467df68450e676b2c28c270df498..00d4d2e8f79df7e1ebf85ca76299c10aced1cea3 100644 --- a/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/public_address_statement.md +++ b/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/public_address_statement.md @@ -1,594 +1,86 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ---- | ------------ | ------ | ------------------------------------ | -------- | -| 开发引入 | / | url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 下载测试图片 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/setup.py | SPNASNet_100_for_PyTorch/setup.py | https://github.com/rwightman/pytorch-image-models | 开源地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/setup.py | SPNASNet_100_for_PyTorch/setup.py | hello@rwightman.com | 作者邮箱 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/byobnet.py | SPNASNet_100_for_PyTorch/timm/models/byobnet.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-ger-weights/gernet_s-756b4751.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/byobnet.py | SPNASNet_100_for_PyTorch/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-ger-weights/gernet_m-0873c53a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/byobnet.py | SPNASNet_100_for_PyTorch/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-ger-weights/gernet_l-f31e2e8d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/byobnet.py | SPNASNet_100_for_PyTorch/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_a2-c1ee6d2b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/byobnet.py | SPNASNet_100_for_PyTorch/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b0-80ac3f1b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/byobnet.py | SPNASNet_100_for_PyTorch/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b1-77ca2989.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/byobnet.py | SPNASNet_100_for_PyTorch/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b1g4-abde5d92.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/byobnet.py | SPNASNet_100_for_PyTorch/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b2-25b7494e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/byobnet.py | SPNASNet_100_for_PyTorch/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b2g4-165a85f2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/byobnet.py | SPNASNet_100_for_PyTorch/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b3-199bc50d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/byobnet.py | SPNASNet_100_for_PyTorch/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b3g4-73c370bf.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/cspnet.py | SPNASNet_100_for_PyTorch/timm/models/cspnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/cspresnet50_ra-d3e8d487.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/cspnet.py | SPNASNet_100_for_PyTorch/timm/models/cspnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/cspresnext50_ra_224-648b4713.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/cspnet.py | SPNASNet_100_for_PyTorch/timm/models/cspnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/cspdarknet53_ra_256-d05c7c21.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/densenet.py | SPNASNet_100_for_PyTorch/timm/models/densenet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/densenet121_ra-50efcf5c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/densenet.py | SPNASNet_100_for_PyTorch/timm/models/densenet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/densenetblur121d_ra-100dcfbc.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/densenet.py | SPNASNet_100_for_PyTorch/timm/models/densenet.py | https://download.pytorch.org/models/densenet169-b2777c0a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/densenet.py | SPNASNet_100_for_PyTorch/timm/models/densenet.py | https://download.pytorch.org/models/densenet201-c1103571.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/densenet.py | SPNASNet_100_for_PyTorch/timm/models/densenet.py | https://download.pytorch.org/models/densenet161-8d451a50.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/densenet.py | SPNASNet_100_for_PyTorch/timm/models/densenet.py | https://download.pytorch.org/models/densenet121-a639ec97.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/dla.py | SPNASNet_100_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla34-ba72cf86.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/dla.py | SPNASNet_100_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla46_c-2bfd52c3.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/dla.py | SPNASNet_100_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla46x_c-d761bae7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/dla.py | SPNASNet_100_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60x_c-b870c45c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/dla.py | SPNASNet_100_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60-24839fc4.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/dla.py | SPNASNet_100_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60x-d15cacda.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/dla.py | SPNASNet_100_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102-d94d9790.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/dla.py | SPNASNet_100_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102x-ad62be81.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/dla.py | SPNASNet_100_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102x2-262837b6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/dla.py | SPNASNet_100_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla169-0914e092.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/dla.py | SPNASNet_100_for_PyTorch/timm/models/dla.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net_dla60_4s-d88db7f9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/dla.py | SPNASNet_100_for_PyTorch/timm/models/dla.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2next_dla60_4s-d327927b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/dpn.py | SPNASNet_100_for_PyTorch/timm/models/dpn.py | https://github.com/rwightman/pytorch-dpn-pretrained/releases/download/v0.1/dpn68-66bebafa7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/dpn.py | SPNASNet_100_for_PyTorch/timm/models/dpn.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/dpn68b_ra-a31ca160.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/dpn.py | SPNASNet_100_for_PyTorch/timm/models/dpn.py | https://github.com/rwightman/pytorch-dpn-pretrained/releases/download/v0.1/dpn92_extra-b040e4a9b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/dpn.py | SPNASNet_100_for_PyTorch/timm/models/dpn.py | https://github.com/rwightman/pytorch-dpn-pretrained/releases/download/v0.1/dpn98-5b90dec4d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/dpn.py | SPNASNet_100_for_PyTorch/timm/models/dpn.py | https://github.com/rwightman/pytorch-dpn-pretrained/releases/download/v0.1/dpn131-71dfe43e0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/dpn.py | SPNASNet_100_for_PyTorch/timm/models/dpn.py | https://github.com/rwightman/pytorch-dpn-pretrained/releases/download/v0.1/dpn107_extra-1ac7121e2.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mnasnet_b1-74cb7081.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mnasnet_a1-d9418771.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv2_100_ra-b33bc2c4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv2_110d_ra-77090ade.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv2_120d_ra-5987e2ed.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv2_140_ra-21a4e913.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/fbnetc_100-c345b898.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/spnasnet_100-048bc3f4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b0_ra-3dd342df.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b1-533bc792.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b2_ra-bcdf34b7.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b3_ra2-cf984f9c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_es_ra-f111e99c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_em_ra2-66250f76.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/DeGirum/pruned-models/releases/download/efficientnet_v1.0/efficientnet_el.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/DeGirum/pruned-models/releases/download/efficientnet_v1.0/efficientnet_es_pruned75.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/DeGirum/pruned-models/releases/download/efficientnet_v1.0/efficientnet_el_pruned70.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_lite0_ra-37913777.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb1_pruned_9ebb3fe6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb2_pruned_203f55bc.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb3_pruned_5abcc29f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b0_aa-827b6e33.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b1_aa-ea7a6ee0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b2_aa-60c94f97.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b3_aa-84b4657e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b4_aa-818f208c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b5_ra-9a3e5369.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b6_aa-80ba17e4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b7_ra-6c08e654.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b8_ra-572d5dd9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b0_ap-f262efe1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b1_ap-44ef0a3d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b2_ap-2f8e7636.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b3_ap-aad25bdd.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b4_ap-dedb23e6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b5_ap-9e82fae8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b6_ap-4ffb161f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b7_ap-ddb28fec.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b8_ap-00e169fa.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b0_ns-c0e6a31c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b1_ns-99dd0c41.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b2_ns-00306e48.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b3_ns-9d44bf68.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b4_ns-d6313a46.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b5_ns-6f26d0cf.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b6_ns-51548356.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b7_ns-1dbc32de.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_l2_ns_475-bebbd00a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_l2_ns-df73bb44.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_es-ca1afbfe.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_em-e78cfe58.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_el-5143854e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_cc_b0_4e-4362b6b2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_cc_b0_8e-66184a25.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_cc_b1_8e-f7c79ae1.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite0-0aa007d2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite1-bde8b488.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite2-dcccb7df.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite3-b733e338.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite4-741542c3.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mixnet_s-a907afbc.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mixnet_m-4647fc68.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mixnet_l-5a9a2ed8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mixnet_xl_ra-aac3c00c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mixnet_s-89d3354b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mixnet_m-0f4d8805.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mixnet_l-6c92e0c8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/gluon_resnet.py | SPNASNet_100_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet18_v1b-0757602b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/gluon_resnet.py | SPNASNet_100_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet34_v1b-c6d82d59.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/gluon_resnet.py | SPNASNet_100_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1b-0ebe02e2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/gluon_resnet.py | SPNASNet_100_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1b-3b017079.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/gluon_resnet.py | SPNASNet_100_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1b-c1edb0dd.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/gluon_resnet.py | SPNASNet_100_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1c-48092f55.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/gluon_resnet.py | SPNASNet_100_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1c-1f26822a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/gluon_resnet.py | SPNASNet_100_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1c-a3bb0b98.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/gluon_resnet.py | SPNASNet_100_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1d-818a1b1b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/gluon_resnet.py | SPNASNet_100_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1d-0f9c8644.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/gluon_resnet.py | SPNASNet_100_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1d-bd354e12.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/gluon_resnet.py | SPNASNet_100_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1s-1762acc0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/gluon_resnet.py | SPNASNet_100_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1s-60fe0cc1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/gluon_resnet.py | SPNASNet_100_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1s-dcc41b81.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/gluon_resnet.py | SPNASNet_100_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnext50_32x4d-e6a097c1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/gluon_resnet.py | SPNASNet_100_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnext101_32x4d-b253c8c4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/gluon_resnet.py | SPNASNet_100_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnext101_64x4d-f9a8e184.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/gluon_resnet.py | SPNASNet_100_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_seresnext50_32x4d-90cf2d6e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/gluon_resnet.py | SPNASNet_100_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_seresnext101_32x4d-cf52900d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/gluon_resnet.py | SPNASNet_100_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_seresnext101_64x4d-f9926f93.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/gluon_resnet.py | SPNASNet_100_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_senet154-70a1a3c0.pth | 下载权重文件 | -| 
开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/gluon_xception.py | SPNASNet_100_for_PyTorch/timm/models/gluon_xception.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gluon_xception-7015a15c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/hardcorenas.py | SPNASNet_100_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_A_Green_38ms_75.9_23474aeb.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/hardcorenas.py | SPNASNet_100_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_B_Green_40ms_76.5_1f882d1e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/hardcorenas.py | SPNASNet_100_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_C_Green_44ms_77.1_d4148c9e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/hardcorenas.py | SPNASNet_100_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_D_Green_50ms_77.4_23e3cdde.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/hardcorenas.py | SPNASNet_100_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_E_Green_55ms_77.9_90f20e8a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/hardcorenas.py | SPNASNet_100_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_F_Green_60ms_78.1_2855edf1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/hrnet.py | SPNASNet_100_for_PyTorch/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnet_w18_small_v1-f460c6bc.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/hrnet.py | SPNASNet_100_for_PyTorch/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnet_w18_small_v2-4c50a8cb.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/hrnet.py | SPNASNet_100_for_PyTorch/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w18-8cb57bb9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/hrnet.py | SPNASNet_100_for_PyTorch/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w30-8d7f8dab.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/hrnet.py | SPNASNet_100_for_PyTorch/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w32-90d8c5fb.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/hrnet.py | SPNASNet_100_for_PyTorch/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w40-7cd397a4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/hrnet.py | SPNASNet_100_for_PyTorch/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w44-c9ac8c18.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/hrnet.py | SPNASNet_100_for_PyTorch/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w48-abd2e6ab.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/hrnet.py | SPNASNet_100_for_PyTorch/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w64-b47cc881.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/inception_resnet_v2.py | SPNASNet_100_for_PyTorch/timm/models/inception_resnet_v2.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/inception_resnet_v2-940b1cd6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/inception_resnet_v2.py | SPNASNet_100_for_PyTorch/timm/models/inception_resnet_v2.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ens_adv_inception_resnet_v2-2592a550.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/inception_v3.py | SPNASNet_100_for_PyTorch/timm/models/inception_v3.py | https://download.pytorch.org/models/inception_v3_google-1a9a5a14.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/inception_v3.py | SPNASNet_100_for_PyTorch/timm/models/inception_v3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_inception_v3-e0069de4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/inception_v3.py | SPNASNet_100_for_PyTorch/timm/models/inception_v3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/adv_inception_v3-9e27bd63.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/inception_v3.py | SPNASNet_100_for_PyTorch/timm/models/inception_v3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gluon_inception_v3-9f746940.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/inception_v4.py | 
SPNASNet_100_for_PyTorch/timm/models/inception_v4.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/inceptionv4-8e4777a0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/mobilenetv3.py | SPNASNet_100_for_PyTorch/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv3_large_100_ra-f55367f5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/mobilenetv3.py | SPNASNet_100_for_PyTorch/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv3_100-35495452.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/mobilenetv3.py | SPNASNet_100_for_PyTorch/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_large_075-150ee8b0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/mobilenetv3.py | SPNASNet_100_for_PyTorch/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_large_100-427764d5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/mobilenetv3.py | SPNASNet_100_for_PyTorch/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_large_minimal_100-8596ae28.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/mobilenetv3.py | SPNASNet_100_for_PyTorch/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_small_075-da427f52.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/mobilenetv3.py | SPNASNet_100_for_PyTorch/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_small_100-37f49e2b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/mobilenetv3.py | SPNASNet_100_for_PyTorch/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_small_minimal_100-922a7843.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/nasnet.py | SPNASNet_100_for_PyTorch/timm/models/nasnet.py | http://data.lip6.fr/cadene/pretrainedmodels/nasnetalarge-a1897284.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/nfnet.py | SPNASNet_100_for_PyTorch/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f0-604f9c3a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/nfnet.py | SPNASNet_100_for_PyTorch/timm/models/nfnet.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f1-fc540f82.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/nfnet.py | SPNASNet_100_for_PyTorch/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f2-89875923.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/nfnet.py | SPNASNet_100_for_PyTorch/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f3-d74ab3aa.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/nfnet.py | SPNASNet_100_for_PyTorch/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f4-0ac5b10b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/nfnet.py | SPNASNet_100_for_PyTorch/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f5-ecb20ab1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/nfnet.py | SPNASNet_100_for_PyTorch/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f6-e0f12116.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/nfnet.py | SPNASNet_100_for_PyTorch/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/nfnet_l0_ra2-45c6688d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/nfnet.py | SPNASNet_100_for_PyTorch/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecanfnet_l0_ra2-e3e9ac50.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/nfnet.py | SPNASNet_100_for_PyTorch/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/nf_regnet_b1_256_ra2-ad85cfef.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/nfnet.py | SPNASNet_100_for_PyTorch/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/nf_resnet50_ra2-9f236009.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/pit.py | SPNASNet_100_for_PyTorch/timm/models/pit.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-pit-weights/pit_ti_730.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/pit.py | SPNASNet_100_for_PyTorch/timm/models/pit.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-pit-weights/pit_xs_781.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/pit.py | 
SPNASNet_100_for_PyTorch/timm/models/pit.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-pit-weights/pit_s_809.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/pit.py | SPNASNet_100_for_PyTorch/timm/models/pit.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-pit-weights/pit_b_820.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/pit.py | SPNASNet_100_for_PyTorch/timm/models/pit.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-pit-weights/pit_ti_distill_746.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/pit.py | SPNASNet_100_for_PyTorch/timm/models/pit.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-pit-weights/pit_xs_distill_791.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/pit.py | SPNASNet_100_for_PyTorch/timm/models/pit.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-pit-weights/pit_s_distill_819.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/pit.py | SPNASNet_100_for_PyTorch/timm/models/pit.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-pit-weights/pit_b_distill_840.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/pnasnet.py | SPNASNet_100_for_PyTorch/timm/models/pnasnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/pnasnet5large-bf079911.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/regnet.py | SPNASNet_100_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_002-e7e85e5c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/regnet.py | SPNASNet_100_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_004-7d0e9424.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/regnet.py | SPNASNet_100_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_006-85ec1baa.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/regnet.py | SPNASNet_100_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_008-d8b470eb.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/regnet.py | SPNASNet_100_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_016-65ca972a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/regnet.py 
| SPNASNet_100_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_032-ed0c7f7e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/regnet.py | SPNASNet_100_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_040-73c2a654.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/regnet.py | SPNASNet_100_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_064-29278baa.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/regnet.py | SPNASNet_100_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_080-7c7fcab1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/regnet.py | SPNASNet_100_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_120-65d5521e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/regnet.py | SPNASNet_100_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_160-c98c4112.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/regnet.py | SPNASNet_100_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_320-8ea38b93.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/regnet.py | SPNASNet_100_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_002-e68ca334.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/regnet.py | SPNASNet_100_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_004-0db870e6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/regnet.py | SPNASNet_100_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_006-c67e57ec.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/regnet.py | SPNASNet_100_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_008-dc900dbe.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/regnet.py | SPNASNet_100_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_016-54367f74.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/regnet.py | SPNASNet_100_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/regnety_032_ra-7f2439f9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/regnet.py | SPNASNet_100_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_040-f0d569f9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/regnet.py | SPNASNet_100_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_064-0a48325c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/regnet.py | SPNASNet_100_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_080-e7f3eb93.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/regnet.py | SPNASNet_100_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_120-721ba79a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/regnet.py | SPNASNet_100_for_PyTorch/timm/models/regnet.py | https://dl.fbaipublicfiles.com/deit/regnety_160-a5fe301d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/regnet.py | SPNASNet_100_for_PyTorch/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_320-ba464b29.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/res2net.py | SPNASNet_100_for_PyTorch/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net50_26w_4s-06e79181.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/res2net.py | SPNASNet_100_for_PyTorch/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net50_48w_2s-afed724a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/res2net.py | SPNASNet_100_for_PyTorch/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net50_14w_8s-6527dddc.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/res2net.py | SPNASNet_100_for_PyTorch/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net50_26w_6s-19041792.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/res2net.py | SPNASNet_100_for_PyTorch/timm/models/res2net.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net50_26w_8s-2c7c9f12.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/res2net.py | SPNASNet_100_for_PyTorch/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net101_26w_4s-02a759a1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/res2net.py | SPNASNet_100_for_PyTorch/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2next50_4s-6ef7e7bf.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnest.py | SPNASNet_100_for_PyTorch/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gluon_resnest14-9c8fe254.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnest.py | SPNASNet_100_for_PyTorch/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gluon_resnest26-50eb607c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnest.py | SPNASNet_100_for_PyTorch/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest50-528c19ca.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnest.py | SPNASNet_100_for_PyTorch/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest101-22405ba7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnest.py | SPNASNet_100_for_PyTorch/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest200-75117900.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnest.py | SPNASNet_100_for_PyTorch/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest269-0cc87c48.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnest.py | SPNASNet_100_for_PyTorch/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest50_fast_4s2x40d-41d14ed0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnest.py | SPNASNet_100_for_PyTorch/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest50_fast_1s4x24d-d4a4f76f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnet.py | SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnet.py | SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet18d_ra2-48a79e06.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnet.py | SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet34-43635321.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnet.py | SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet34d_ra2-f8dcfcaf.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnet.py | SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet26-9aa10e23.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnet.py | SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet26d-69e92c46.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnet.py | SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet50_ram-a26f946b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnet.py | SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet50d_ra2-464e36ba.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnet.py | SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet101d_ra2-2803ffab.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnet.py | SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet152d_ra2-5cac0439.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnet.py | SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet200d_ra2-bdba9bf9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnet.py | SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnet.py | SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnet.py | SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnet.py | SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnet.py | SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/wide_resnet50_racm-8234f177.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnet.py | SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnet.py | SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnext50_32x4d_ra-d733960d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnet.py | SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnext50d_32x4d-103e99f8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnet.py | SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnet.py | SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnet.py | SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x8-c38310e5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnet.py | SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x16-c6f796b0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnet.py | SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x32-e4b90b00.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnet.py | SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x48-3e41cc8a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnet.py | SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet18-d92f0530.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnet.py | SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet50-08389792.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnet.py | SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext50_32x4-ddb3e555.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnet.py | SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x4-dc43570a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnet.py | SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x8-2cfe2f8b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnet.py | SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x16-15fffa57.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnet.py | SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet18-118f1556.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnet.py | SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet50-16a12f1b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnet.py | SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext50_32x4-72679e44.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnet.py | SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x4-3f87e46b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnet.py | SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x8-b4712904.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnet.py | SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x16-f3559a9c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnet.py | SPNASNet_100_for_PyTorch/timm/models/resnet.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnet50_ra_224-8efdb4bb.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnet.py | SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnet152d_ra2-04464dd2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnet.py | SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext26d_32x4d-80fa48a3.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnet.py | SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext26tn_32x4d-569cb627.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnet.py | SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext50_32x4d_racm-a304a460.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnet.py | SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecaresnet26t_ra2-46609757.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnet.py | SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNetLight_4f34b35b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnet.py | SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet50D_833caf58.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnet.py | SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45899/outputs/ECAResNet50D_P_9c67f710.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnet.py | SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecaresnet50t_ra2-f7ac63c4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnet.py | SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet101D_281c5844.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnet.py | SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45610/outputs/ECAResNet101D_P_75a3370e.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnet.py | SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecaresnet269d_320_ra2-7baa55cb.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnet.py | SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnetblur50-84f4748f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnetv2.py | SPNASNet_100_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x1-ILSVRC2012.npz | 下载数据集 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnetv2.py | SPNASNet_100_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x3-ILSVRC2012.npz | 下载数据集 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnetv2.py | SPNASNet_100_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x1-ILSVRC2012.npz | 下载数据集 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnetv2.py | SPNASNet_100_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x3-ILSVRC2012.npz | 下载数据集 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnetv2.py | SPNASNet_100_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x2-ILSVRC2012.npz | 下载数据集 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnetv2.py | SPNASNet_100_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x4-ILSVRC2012.npz | 下载数据集 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnetv2.py | SPNASNet_100_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x1.npz | 下载数据集 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnetv2.py | SPNASNet_100_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x3.npz | 下载数据集 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnetv2.py | SPNASNet_100_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x1.npz | 下载数据集 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnetv2.py | SPNASNet_100_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x3.npz | 下载数据集 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnetv2.py | SPNASNet_100_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x2.npz | 下载数据集 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnetv2.py | SPNASNet_100_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x4.npz | 下载数据集 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/rexnet.py | SPNASNet_100_for_PyTorch/timm/models/rexnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rexnet/rexnetv1_100-1b4dddf4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/rexnet.py | SPNASNet_100_for_PyTorch/timm/models/rexnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rexnet/rexnetv1_130-590d768e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/rexnet.py | SPNASNet_100_for_PyTorch/timm/models/rexnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rexnet/rexnetv1_150-bd1a6aa8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/rexnet.py | SPNASNet_100_for_PyTorch/timm/models/rexnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rexnet/rexnetv1_200-8c0b7f2d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/selecsls.py | SPNASNet_100_for_PyTorch/timm/models/selecsls.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-selecsls/selecsls42b-8af30141.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/selecsls.py | SPNASNet_100_for_PyTorch/timm/models/selecsls.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-selecsls/selecsls60-bbf87526.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/selecsls.py | SPNASNet_100_for_PyTorch/timm/models/selecsls.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-selecsls/selecsls60b-94e619b5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/senet.py | SPNASNet_100_for_PyTorch/timm/models/senet.py | http://data.lip6.fr/cadene/pretrainedmodels/senet154-c7b49a05.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/senet.py | SPNASNet_100_for_PyTorch/timm/models/senet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnet18-4bb0ce65.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/senet.py | SPNASNet_100_for_PyTorch/timm/models/senet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnet34-a4004e63.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/senet.py | SPNASNet_100_for_PyTorch/timm/models/senet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/se_resnet50-ce0d4300.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/senet.py | SPNASNet_100_for_PyTorch/timm/models/senet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/se_resnet101-7e38fcc6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/senet.py | SPNASNet_100_for_PyTorch/timm/models/senet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/se_resnet152-d17c99b7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/senet.py | SPNASNet_100_for_PyTorch/timm/models/senet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext26_32x4d-65ebdb501.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/senet.py | SPNASNet_100_for_PyTorch/timm/models/senet.py | http://data.lip6.fr/cadene/pretrainedmodels/se_resnext50_32x4d-a260b3a4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/senet.py | SPNASNet_100_for_PyTorch/timm/models/senet.py | http://data.lip6.fr/cadene/pretrainedmodels/se_resnext101_32x4d-3b2fe3d8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/sknet.py | SPNASNet_100_for_PyTorch/timm/models/sknet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/skresnet18_ra-4eec2804.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/sknet.py | SPNASNet_100_for_PyTorch/timm/models/sknet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/skresnet34_ra-bdc0ccde.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/sknet.py | SPNASNet_100_for_PyTorch/timm/models/sknet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/skresnext50_ra-f40e40bf.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/tnt.py | SPNASNet_100_for_PyTorch/timm/models/tnt.py | https://github.com/contrastive/pytorch-image-models/releases/download/TNT/tnt_s_patch16_224.pth.tar | 下载权重文件 | -| 开发引入 | / | url.ini | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-tresnet/tresnet_m_80_8-dbc13962.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/tresnet.py | SPNASNet_100_for_PyTorch/timm/models/tresnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-tresnet/tresnet_l_81_5-235b486c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/tresnet.py | SPNASNet_100_for_PyTorch/timm/models/tresnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-tresnet/tresnet_xl_82_0-a2d51b00.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/tresnet.py | SPNASNet_100_for_PyTorch/timm/models/tresnet.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-tresnet/tresnet_m_448-bc359d10.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/tresnet.py | SPNASNet_100_for_PyTorch/timm/models/tresnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-tresnet/tresnet_l_448-940d0cd1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/tresnet.py | SPNASNet_100_for_PyTorch/timm/models/tresnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-tresnet/tresnet_xl_448-8c1815de.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/vgg.py | SPNASNet_100_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg11-bbd30ac9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/vgg.py | SPNASNet_100_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg13-c768596a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/vgg.py | SPNASNet_100_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg16-397923af.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/vgg.py | SPNASNet_100_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg19-dcbb9e9d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/vgg.py | SPNASNet_100_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg11_bn-6002323d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/vgg.py | SPNASNet_100_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg13_bn-abd245e5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/vgg.py | SPNASNet_100_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg16_bn-6c64b313.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/vgg.py | SPNASNet_100_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg19_bn-c79401a0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/vision_transformer.py | SPNASNet_100_for_PyTorch/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/vit_small_p16_224-15ec54c9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/vision_transformer.py | SPNASNet_100_for_PyTorch/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_p16_224-80ecf9dd.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/vision_transformer.py | SPNASNet_100_for_PyTorch/timm/models/vision_transformer.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_p16_384-83fb41ba.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/vision_transformer.py | SPNASNet_100_for_PyTorch/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_p32_384-830016f5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/vision_transformer.py | SPNASNet_100_for_PyTorch/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_large_p16_224-4ee7a4dc.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/vision_transformer.py | SPNASNet_100_for_PyTorch/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_large_p16_384-b3be5167.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/vision_transformer.py | SPNASNet_100_for_PyTorch/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_large_p32_384-9b920ba8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/vision_transformer.py | SPNASNet_100_for_PyTorch/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_patch16_224_in21k-e5005f0a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/vision_transformer.py | SPNASNet_100_for_PyTorch/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_patch32_224_in21k-8db57226.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/vision_transformer.py | SPNASNet_100_for_PyTorch/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_large_patch16_224_in21k-606da67d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/vision_transformer.py | SPNASNet_100_for_PyTorch/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_large_patch32_224_in21k-9046d2e7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/vision_transformer.py | SPNASNet_100_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_tiny_patch16_224-a1311bcf.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/vision_transformer.py | SPNASNet_100_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_small_patch16_224-cd65a155.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/vision_transformer.py | 
SPNASNet_100_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_224-b5f2ef4d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/vision_transformer.py | SPNASNet_100_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_384-8de9b5d1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/vision_transformer.py | SPNASNet_100_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_tiny_distilled_patch16_224-b40b3cf7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/vision_transformer.py | SPNASNet_100_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_small_distilled_patch16_224-649709d9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/vision_transformer.py | SPNASNet_100_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_224-df68dfff.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/vision_transformer.py | SPNASNet_100_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_384-d0272ac0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/vision_transformer_hybrid.py | SPNASNet_100_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_resnet50_224_in21k-6f7c7740.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/vision_transformer_hybrid.py | SPNASNet_100_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_resnet50_384-9fd3c705.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/vovnet.py | SPNASNet_100_for_PyTorch/timm/models/vovnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ese_vovnet19b_dw-a8741004.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/vovnet.py | SPNASNet_100_for_PyTorch/timm/models/vovnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ese_vovnet39b-f912fe73.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/xception.py | SPNASNet_100_for_PyTorch/timm/models/xception.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/xception-43020ad28.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/xception_aligned.py | SPNASNet_100_for_PyTorch/timm/models/xception_aligned.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_xception_41-e6439c97.pth | 
下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/xception_aligned.py | SPNASNet_100_for_PyTorch/timm/models/xception_aligned.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_xception_65-c9ae96e8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/xception_aligned.py | SPNASNet_100_for_PyTorch/timm/models/xception_aligned.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_xception_71-8eec7df1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/scheduler/scheduler.py | SPNASNet_100_for_PyTorch/timm/scheduler/scheduler.py | https://github.com/pytorch/fairseq/tree/master/fairseq/optim/lr_scheduler | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnetv2.py | SPNASNet_100_for_PyTorch/timm/models/resnetv2.py | https://github.com/KaimingHe/resnet-1k-layers/blob/master/resnet-pre-act.lua | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/optim/nadam.py | SPNASNet_100_for_PyTorch/timm/optim/nadam.py | http://cs229.stanford.edu/proj2015/054_report.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/sknet.py | SPNASNet_100_for_PyTorch/timm/models/sknet.py | https://arxiv.org/abs/1903.06586 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/mobilenetv3.py | SPNASNet_100_for_PyTorch/timm/models/mobilenetv3.py | https://arxiv.org/abs/1905.02244 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet_builder.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet_builder.py | https://github.com/facebookresearch/maskrcnn-benchmark/blob/master/maskrcnn_benchmark/modeling/backbone/fbnet_builder.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/tensorflow/models/blob/master/research/slim/nets/mobilenet/mobilenet_v2.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/data/auto_augment.py | SPNASNet_100_for_PyTorch/timm/data/auto_augment.py | https://arxiv.org/abs/1909.13719 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/scheduler/tanh_lr.py | SPNASNet_100_for_PyTorch/timm/scheduler/tanh_lr.py | https://arxiv.org/abs/1806.01593 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/res2net.py | SPNASNet_100_for_PyTorch/timm/models/res2net.py | https://arxiv.org/abs/1904.01169 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/optim/sgdp.py | SPNASNet_100_for_PyTorch/timm/optim/adamp.py | https://arxiv.org/abs/2006.08217 | 论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/data/real_labels.py | SPNASNet_100_for_PyTorch/timm/data/real_labels.py | https://github.com/google-research/reassessed-imagenet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/hrnet.py | SPNASNet_100_for_PyTorch/timm/models/hrnet.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/utils/misc.py | SPNASNet_100_for_PyTorch/timm/utils/misc.py | http://www.codinghorror.com/blog/archives/001018.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/layers/activations_jit.py | SPNASNet_100_for_PyTorch/timm/models/layers/activations_jit.py | https://arxiv.org/abs/1710.05941 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/byobnet.py | SPNASNet_100_for_PyTorch/timm/models/byobnet.py | https://github.com/DingXiaoH/RepVGG | 源码实现 | -| 开发引入 | / | SPNASNet_100_for_PyTorch/timm/models/densenet.py | https://github.com/pytorch/vision | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnet.py | SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://arxiv.org/pdf/2002.08258.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/layers/drop.py | SPNASNet_100_for_PyTorch/timm/models/layers/drop.py | https://github.com/tensorflow/tpu/blob/master/models/official/resnet/resnet_model.py#L74 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/layers/drop.py | SPNASNet_100_for_PyTorch/timm/models/layers/drop.py | https://github.com/clovaai/assembled-cnn/blob/master/nets/blocks.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/layers/eca.py | SPNASNet_100_for_PyTorch/timm/models/layers/eca.py | https://arxiv.org/abs/1910.03151 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/layers/drop.py | SPNASNet_100_for_PyTorch/timm/models/layers/drop.py | https://arxiv.org/abs/1810.12890 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/optim/adabelief.py | SPNASNet_100_for_PyTorch/timm/optim/adabelief.py | https://github.com/juntang-zhuang/Adabelief-Optimizer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnetv2.py | SPNASNet_100_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-S-R50x1.npz | 下载链接 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/tensorflow/tpu/tree/master/models/official/mnasnet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/train.py | SPNASNet_100_for_PyTorch/.train.py.swo | https://github.com/pytorch/examples/tree/master/imagenet | 源码实现 | 
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/utils/agc.py | SPNASNet_100_for_PyTorch/timm/models/nfnet.py | https://github.com/deepmind/deepmind-research/tree/master/nfnets | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnet.py | SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://github.com/facebookresearch/semi-supervised-ImageNet1K-models/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/hardcorenas.py | SPNASNet_100_for_PyTorch/timm/models/hardcorenas.py | https://arxiv.org/abs/2102.11646 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/layers/weight_init.py | SPNASNet_100_for_PyTorch/timm/models/layers/weight_init.py | https://people.sc.fsu.edu/~jburkardt/presentations/truncated_normal.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/densenet.py | SPNASNet_100_for_PyTorch/timm/models/densenet.py | https://arxiv.org/pdf/1608.06993.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/senet.py | SPNASNet_100_for_PyTorch/timm/models/senet.py | https://github.com/pytorch/vision/blob/master/torchvision/models/resnet.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet_blocks.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/abs/1807.11626 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/vision_transformer_hybrid.py | SPNASNet_100_for_PyTorch/timm/models/vision_transformer.py | https://arxiv.org/abs/2010.11929 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/optim/sgdp.py | SPNASNet_100_for_PyTorch/timm/optim/sgdp.py | https://arxiv.org/abs/2006.08217 | 论文地址 | -| 开发引入 | / | SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://github.com/pytorch/vision | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/cspnet.py | SPNASNet_100_for_PyTorch/timm/models/cspnet.py | https://github.com/WongKinYiu/CrossStagePartialNetworks | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/loss/jsd.py | SPNASNet_100_for_PyTorch/timm/data/auto_augment.py | https://arxiv.org/abs/1912.02781 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/optim/sgdp.py | SPNASNet_100_for_PyTorch/timm/optim/adamp.py | https://github.com/clovaai/AdamP | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/utils/model.py | SPNASNet_100_for_PyTorch/timm/utils/model.py | https://arxiv.org/abs/2101.08692 | 论文地址 | -| 开发引入 | / | SPNASNet_100_for_PyTorch/timm/models/layers/drop.py | https://github.com/tensorflow/tpu/issues/494#issuecomment-532968956 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnetv2.py | 
SPNASNet_100_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-S-R101x3.npz | 下载链接 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/tresnet.py | SPNASNet_100_for_PyTorch/timm/models/tresnet.py | https://github.com/mrT23/TResNet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/utils/model.py | SPNASNet_100_for_PyTorch/timm/utils/model.py | https://gist.github.com/amaarora/6e56942fcb46e67ba203f3009b30d950 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/optim/rmsprop_tf.py | SPNASNet_100_for_PyTorch/timm/optim/rmsprop_tf.py | https://github.com/pytorch/pytorch/blob/master/LICENSE | license地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/scheduler/cosine_lr.py | SPNASNet_100_for_PyTorch/timm/scheduler/cosine_lr.py | https://arxiv.org/abs/1608.03983 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/layers/eca.py | SPNASNet_100_for_PyTorch/timm/models/layers/eca.py | https://github.com/BangguWu/ECANet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/inception_v3.py | SPNASNet_100_for_PyTorch/timm/models/inception_v3.py | https://github.com/pytorch/vision/blob/master/LICENSE | license地址 | -| 开发引入 | / | SPNASNet_100_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/pytorch/vision | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/README.md | SPNASNet_100_for_PyTorch/timm/models/pit.py | https://github.com/naver-ai/pit | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnet.py | SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://pytorch.org/hub/facebookresearch_WSL-Images_resnext/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet/edgetpu | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/layers/split_attn.py | SPNASNet_100_for_PyTorch/timm/models/layers/split_attn.py | https://github.com/zhanghang1989/ResNeSt | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/vgg.py | SPNASNet_100_for_PyTorch/timm/models/vgg.py | https://arxiv.org/pdf/1409.1556.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/vision_transformer_hybrid.py | SPNASNet_100_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://arxiv.org/abs/2010.11929 | 论文地址 | -| 开发引入 | / | SPNASNet_100_for_PyTorch/timm/models/inception_resnet_v2.py | https://github.com/Cadene/tensorflow-model-zoo.torch | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/data/auto_augment.py | SPNASNet_100_for_PyTorch/timm/data/auto_augment.py | 
https://github.com/google-research/augmix | 源码实现 | -| 开发引入 | / | SPNASNet_100_for_PyTorch/timm/models/sknet.py | https://github.com/clovaai/assembled-cnn | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/optim/nadam.py | SPNASNet_100_for_PyTorch/timm/optim/nadam.py | https://github.com/pytorch/pytorch/pull/1408 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/scheduler/scheduler.py | SPNASNet_100_for_PyTorch/timm/scheduler/scheduler.py | https://github.com/allenai/allennlp/tree/master/allennlp/training/learning_rate_schedulers | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/utils/model.py | SPNASNet_100_for_PyTorch/timm/utils/model.py | https://docs.fast.ai/callback.hook.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/utils/agc.py | SPNASNet_100_for_PyTorch/timm/utils/agc.py | https://github.com/deepmind/deepmind-research/tree/master/nfnets | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/layers/inplace_abn.py | SPNASNet_100_for_PyTorch/timm/models/layers/inplace_abn.py | https://github.com/mapillary/inplace_abn.git@v1.0.12 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/layers/std_conv.py | SPNASNet_100_for_PyTorch/timm/models/layers/std_conv.py | https://arxiv.org/abs/1903.10520v2 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/rexnet.py | SPNASNet_100_for_PyTorch/timm/models/rexnet.py | https://arxiv.org/abs/2007.00992 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/regnet.py | SPNASNet_100_for_PyTorch/timm/models/regnet.py | https://arxiv.org/abs/2003.13678 | 论文地址 | -| 开发引入 | / | SPNASNet_100_for_PyTorch/timm/models/vision_transformer.py | https://github.com/lucidrains/vit-pytorch | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/hrnet.py | SPNASNet_100_for_PyTorch/timm/models/hrnet.py | sunk@mail.ustc.edu.cn | 邮箱地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/DeGirum/pruned-models/releases/tag/efficientnet_v1.0 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/selecsls.py | SPNASNet_100_for_PyTorch/timm/models/selecsls.py | https://creativecommons.org/licenses/by/4.0/legalcode | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/data/auto_augment.py | SPNASNet_100_for_PyTorch/timm/data/auto_augment.py | https://arxiv.org/abs/1805.09501 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/data/random_erasing.py | SPNASNet_100_for_PyTorch/timm/data/random_erasing.py | https://arxiv.org/pdf/1708.04896.pdf | 论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/selecsls.py | SPNASNet_100_for_PyTorch/timm/models/selecsls.py | https://arxiv.org/abs/1907.00837 | 论文地址 | -| 开发引入 | / | SPNASNet_100_for_PyTorch/timm/models/efficientnet_blocks.py | https://arxiv.org/abs/1801.04381v4 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet_builder.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet_builder.py | https://github.com/tensorflow/tpu/blob/master/models/official/mnasnet/mnasnet_models.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/data/mixup.py | SPNASNet_100_for_PyTorch/timm/data/mixup.py | https://arxiv.org/abs/1710.09412 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/data/auto_augment.py | SPNASNet_100_for_PyTorch/timm/data/auto_augment.py | https://github.com/tensorflow/tpu/blob/master/models/official/efficientnet/autoaugment.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/vovnet.py | SPNASNet_100_for_PyTorch/timm/models/vovnet.py | https://github.com/stigma0617/VoVNet.pytorch/blob/master/models_vovnet/vovnet.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/densenet.py | SPNASNet_100_for_PyTorch/timm/models/densenet.py | https://arxiv.org/pdf/1707.06990.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/layers/cbam.py | SPNASNet_100_for_PyTorch/timm/models/layers/cbam.py | https://arxiv.org/abs/1807.06521 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet_blocks.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet_blocks.py | https://arxiv.org/abs/1807.11626 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/optim/adamw.py | SPNASNet_100_for_PyTorch/timm/optim/adamw.py | https://openreview.net/forum?id=ryQu7f-RZ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/rexnet.py | SPNASNet_100_for_PyTorch/timm/models/rexnet.py | https://github.com/clovaai/rexnet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/gluon_xception.py | SPNASNet_100_for_PyTorch/timm/models/gluon_xception.py | https://github.com/jfzhang95/pytorch-deeplab-xception | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/layers/eca.py | SPNASNet_100_for_PyTorch/timm/models/layers/eca.py | https://github.com/VRandme | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/layers/activations_me.py | SPNASNet_100_for_PyTorch/timm/models/layers/activations_me.py | https://arxiv.org/abs/1908.08681 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | 
https://arxiv.org/pdf/1807.11626.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet_blocks.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet_blocks.py | https://ai.googleblog.com/2019/08/efficientnet-edgetpu-creating.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnet.py | SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://github.com/facebookresearch/WSL-Images | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/loss/jsd.py | SPNASNet_100_for_PyTorch/timm/loss/jsd.py | https://github.com/google-research/augmix/blob/master/imagenet.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet_blocks.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet_blocks.py | https://arxiv.org/abs/1905.11946 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/layers/cond_conv2d.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/abs/1904.04971 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet/lite | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnet.py | SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://github.com/facebookresearch/semi-supervised-ImageNet1K-models | 源码实现 | -| 开发引入 | / | SPNASNet_100_for_PyTorch/timm/data/random_erasing.py | https://github.com/zhunzhong07/Random-Erasing | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/utils/agc.py | SPNASNet_100_for_PyTorch/timm/utils/agc.py | https://arxiv.org/abs/2102.06171 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/cspnet.py | SPNASNet_100_for_PyTorch/timm/models/cspnet.py | https://arxiv.org/abs/1911.11929 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/optim/radam.py | SPNASNet_100_for_PyTorch/timm/optim/radam.py | https://github.com/LiyuanLucasLiu/RAdam | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/vision_transformer.py | SPNASNet_100_for_PyTorch/timm/models/vision_transformer.py | https://github.com/karpathy/minGPT | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/utils/model.py | SPNASNet_100_for_PyTorch/timm/models/layers/std_conv.py | https://arxiv.org/abs/2101.08692 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/train.py | SPNASNet_100_for_PyTorch/train_1p.py | https://github.com/pytorch/examples/tree/master/imagenet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/optim/lookahead.py | SPNASNet_100_for_PyTorch/timm/optim/lookahead.py | 
https://arxiv.org/abs/1907.08610 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnest.py | SPNASNet_100_for_PyTorch/timm/models/resnest.py | https://arxiv.org/abs/2004.08955 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/optim/rmsprop_tf.py | SPNASNet_100_for_PyTorch/timm/optim/rmsprop_tf.py | https://arxiv.org/abs/1711.05101 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/hrnet.py | SPNASNet_100_for_PyTorch/timm/models/hrnet.py | https://github.com/HRNet/HRNet-Image-Classification | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/gluon_resnet.py | SPNASNet_100_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/dmlc/gluon-cv/blob/master/gluoncv/model_zoo/resnet.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/README.md | SPNASNet_100_for_PyTorch/timm/models/pnasnet.py | https://arxiv.org/abs/1712.00559 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/layers/activations_me.py | SPNASNet_100_for_PyTorch/timm/models/layers/activations_jit.py | https://github.com/digantamisra98/H-Mish/blob/0da20d4bc58e696b6803f2523c58d3c8a82782d0/README.md | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/optim/nvnovograd.py | SPNASNet_100_for_PyTorch/timm/optim/novograd.py | https://arxiv.org/abs/1905.11286 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/optim/novograd.py | SPNASNet_100_for_PyTorch/timm/optim/novograd.py | https://github.com/convergence-lab/novograd | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/tresnet.py | SPNASNet_100_for_PyTorch/timm/models/tresnet.py | https://arxiv.org/pdf/2003.13630.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/xception.py | SPNASNet_100_for_PyTorch/timm/models/xception.py | https://github.com/tstandley/Xception-PyTorch | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/optim/adafactor.py | SPNASNet_100_for_PyTorch/timm/optim/adafactor.py | https://arxiv.org/abs/1804.04235 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/optim/radam.py | SPNASNet_100_for_PyTorch/timm/optim/radam.py | https://arxiv.org/abs/1908.03265 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/pit.py | SPNASNet_100_for_PyTorch/timm/models/pit.py | https://arxiv.org/abs/2103.16302 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/layers/cond_conv2d.py | SPNASNet_100_for_PyTorch/timm/models/layers/cond_conv2d.py | https://github.com/pytorch/pytorch/issues/17983 | 相关说明 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/vovnet.py | SPNASNet_100_for_PyTorch/timm/models/vovnet.py | https://arxiv.org/abs/1911.06667 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet_blocks.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet_blocks.py | https://arxiv.org/abs/2004.14525 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/pdf/2002.08258.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/byobnet.py | SPNASNet_100_for_PyTorch/timm/models/byobnet.py | https://arxiv.org/abs/2101.03697 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/layers/activations_me.py | SPNASNet_100_for_PyTorch/timm/models/layers/activations.py | https://arxiv.org/abs/1908.08681 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/vovnet.py | SPNASNet_100_for_PyTorch/timm/models/layers/se.py | https://arxiv.org/abs/1911.06667 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/utils/model.py | SPNASNet_100_for_PyTorch/timm/models/nfnet.py | https://arxiv.org/abs/2101.08692 | 论文地址 | -| 开发引入 | / | SPNASNet_100_for_PyTorch/timm/models/inception_resnet_v2.py | https://arxiv.org/abs/1705.07204 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/optim/adafactor.py | SPNASNet_100_for_PyTorch/timm/optim/adafactor.py | https://github.com/pytorch/fairseq/blob/master/fairseq/optim/adafactor.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet_builder.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet_builder.py | https://github.com/tensorflow/tpu/blob/master/models/official/efficientnet/efficientnet_model.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/README.md | SPNASNet_100_for_PyTorch/timm/models/dpn.py | https://github.com/cypw/DPNs | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/data/real_labels.py | SPNASNet_100_for_PyTorch/timm/data/real_labels.py | https://arxiv.org/abs/2006.07159 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/res2net.py | SPNASNet_100_for_PyTorch/timm/models/res2net.py | https://github.com/gasvn/Res2Net/blob/master/res2net.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/data/parsers/parser_tfds.py | SPNASNet_100_for_PyTorch/timm/data/parsers/parser_tfds.py | https://github.com/tensorflow/datasets | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/utils/model_ema.py | SPNASNet_100_for_PyTorch/timm/utils/model_ema.py | https://www.tensorflow.org/api_docs/python/tf/train/ExponentialMovingAverage | 相关说明 | -| 
开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/vision_transformer_hybrid.py | SPNASNet_100_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://github.com/google-research/vision_transformer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/layers/eca.py | SPNASNet_100_for_PyTorch/timm/models/layers/eca.py | https://arxiv.org/pdf/1910.03151.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/layers/drop.py | SPNASNet_100_for_PyTorch/timm/models/layers/drop.py | https://arxiv.org/pdf/1810.12890.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/data/random_erasing.py | SPNASNet_100_for_PyTorch/timm/data/random_erasing.py | https://github.com/pytorch/pytorch/issues/19508 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/layers/activations_jit.py | SPNASNet_100_for_PyTorch/timm/models/layers/activations.py | https://arxiv.org/abs/1710.05941 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/sknet.py | SPNASNet_100_for_PyTorch/timm/models/sknet.py | https://arxiv.org/abs/2001.06268 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/optim/adahessian.py | SPNASNet_100_for_PyTorch/timm/optim/adahessian.py | https://github.com/davda54/ada-hessian/blob/master/ada_hessian.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/vision_transformer_hybrid.py | SPNASNet_100_for_PyTorch/timm/models/vision_transformer.py | https://github.com/google-research/vision_transformer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/senet.py | SPNASNet_100_for_PyTorch/timm/models/senet.py | https://github.com/creafz | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/inception_v3.py | SPNASNet_100_for_PyTorch/timm/models/inception_v3.py | http://download.tensorflow.org/models/adv_inception_v3_2017_08_18.tar.gz | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet_blocks.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet_blocks.py | https://arxiv.org/abs/2102.05610 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/train.py | SPNASNet_100_for_PyTorch/train_8p.py | https://github.com/pytorch/examples/tree/master/imagenet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/abs/1911.09665 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/senet.py | SPNASNet_100_for_PyTorch/timm/models/pnasnet.py | https://github.com/creafz | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/dpn.py | SPNASNet_100_for_PyTorch/timm/models/dpn.py | https://github.com/oyam/pytorch-DPNs | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/res2net.py | SPNASNet_100_for_PyTorch/timm/models/dla.py | https://github.com/gasvn/Res2Net/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/data/parsers/parser_tfds.py | SPNASNet_100_for_PyTorch/timm/data/parsers/parser_tfds.py | https://github.com/pytorch/pytorch/issues/33413 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/vision_transformer.py | SPNASNet_100_for_PyTorch/timm/models/vision_transformer.py | https://github.com/google-research/vision_transformer/blob/00883dd691c63a6830751563748663526e811cee/vit_jax/checkpoint.py#L224 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/res2net.py | SPNASNet_100_for_PyTorch/timm/models/res2net.py | https://github.com/gasvn/Res2Net/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnetv2.py | SPNASNet_100_for_PyTorch/timm/models/resnetv2.py | https://arxiv.org/abs/1912.11370 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/optim/rmsprop_tf.py | SPNASNet_100_for_PyTorch/timm/optim/rmsprop_tf.py | https://arxiv.org/pdf/1308.0850v5.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/train.py | SPNASNet_100_for_PyTorch/modelarts/train_start.py | https://github.com/pytorch/examples/tree/master/imagenet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/layers/mixed_conv2d.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/abs/1907.09595 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/abs/1801.04381 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/scheduler/cosine_lr.py | SPNASNet_100_for_PyTorch/timm/scheduler/cosine_lr.py | https://github.com/allenai/allennlp/blob/master/allennlp/training/learning_rate_schedulers/cosine.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/optim/sgdp.py | SPNASNet_100_for_PyTorch/timm/optim/sgdp.py | https://github.com/clovaai/AdamP/blob/master/adamp/sgdp.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/inception_resnet_v2.py | SPNASNet_100_for_PyTorch/timm/models/inception_resnet_v2.py | https://github.com/tensorflow/models/tree/master/research/adv_imagenet_models | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/train.py | SPNASNet_100_for_PyTorch/train_1p.py | https://github.com/NVIDIA/apex/tree/master/examples/imagenet | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/hardcorenas.py | SPNASNet_100_for_PyTorch/timm/models/hardcorenas.py | https://github.com/Alibaba-MIIL/HardCoReNAS | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/tnt.py | SPNASNet_100_for_PyTorch/timm/models/tnt.py | https://gitee.com/mindspore/mindspore/tree/master/model_zoo/research/cv/TNT | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/vision_transformer_hybrid.py | SPNASNet_100_for_PyTorch/timm/models/resnetv2.py | https://github.com/google-research/vision_transformer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet/condconv | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/vision_transformer.py | SPNASNet_100_for_PyTorch/timm/models/vision_transformer.py | https://arxiv.org/abs/2012.12877 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/layers/mixed_conv2d.py | SPNASNet_100_for_PyTorch/timm/models/layers/mixed_conv2d.py | https://github.com/tensorflow/tpu/blob/master/models/official/mnasnet/mixnet/custom_layers.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/vision_transformer.py | SPNASNet_100_for_PyTorch/timm/models/vision_transformer.py | https://github.com/facebookresearch/deit | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnet.py | SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://arxiv.org/abs/1905.00546 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/dla.py | SPNASNet_100_for_PyTorch/timm/models/dla.py | https://arxiv.org/abs/1707.06484 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/vision_transformer_hybrid.py | SPNASNet_100_for_PyTorch/timm/models/resnetv2.py | https://arxiv.org/abs/2010.11929 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/layers/split_attn.py | SPNASNet_100_for_PyTorch/sotabench.py | https://github.com/zhanghang1989/ResNeSt | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/vovnet.py | SPNASNet_100_for_PyTorch/timm/models/vovnet.py | https://github.com/youngwanLEE/vovnet-detectron2 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/README.md | SPNASNet_100_for_PyTorch/timm/models/resnetv2.py | https://github.com/google-research/big_transfer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/inception_resnet_v2.py | SPNASNet_100_for_PyTorch/timm/models/inception_resnet_v2.py | 
http://download.tensorflow.org/models/ens_adv_inception_resnet_v2_2017_08_18.tar.gz | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/train.py | SPNASNet_100_for_PyTorch/modelarts/train_start.py | https://github.com/NVIDIA/apex/tree/master/examples/imagenet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/senet.py | SPNASNet_100_for_PyTorch/timm/models/senet.py | https://github.com/Cadene/pretrained-models.pytorch/blob/master/pretrainedmodels/models/senet.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/docs/models.md | SPNASNet_100_for_PyTorch/timm/models/byobnet.py | https://github.com/idstcv/GPU-Efficient-Networks | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/optim/nadam.py | SPNASNet_100_for_PyTorch/timm/optim/nadam.py | http://www.cs.toronto.edu/~fritz/absps/momentum.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/layers/drop.py | SPNASNet_100_for_PyTorch/timm/models/layers/drop.py | https://arxiv.org/abs/1603.09382 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/facebookresearch/maskrcnn-benchmark/blob/master/maskrcnn_benchmark/modeling/backbone/fbnet_modeldef.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/data/loader.py | SPNASNet_100_for_PyTorch/timm/data/loader.py | https://github.com/NVIDIA/apex/commit/d5e2bb4bdeedd27b1dfaf5bb2b24d6c000dee9be#diff-cf86c282ff7fba81fad27a559379d5bf | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/regnet.py | SPNASNet_100_for_PyTorch/timm/models/regnet.py | https://github.com/facebookresearch/pycls/blob/master/pycls/models/regnet.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/layers/mixed_conv2d.py | SPNASNet_100_for_PyTorch/timm/models/layers/mixed_conv2d.py | https://arxiv.org/abs/1907.09595 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/vovnet.py | SPNASNet_100_for_PyTorch/timm/models/vovnet.py | https://arxiv.org/abs/1904.09730 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/layers/activations_me.py | SPNASNet_100_for_PyTorch/timm/models/layers/activations_me.py | https://twitter.com/jeremyphoward/status/1188251041835315200 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/optim/lookahead.py | SPNASNet_100_for_PyTorch/timm/optim/lookahead.py | https://github.com/alphadl/lookahead.pytorch | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/utils/agc.py | SPNASNet_100_for_PyTorch/timm/models/nfnet.py | https://arxiv.org/abs/2102.06171 | 论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet_builder.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet_builder.py | https://github.com/tensorflow/tpu/blob/master/models/official/mnasnet/mnasnet_model.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/byobnet.py | SPNASNet_100_for_PyTorch/timm/models/byobnet.py | https://arxiv.org/abs/2006.14090 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/features.py | SPNASNet_100_for_PyTorch/timm/models/features.py | https://github.com/pytorch/vision/blob/d88d8961ae51507d0cb680329d985b1488b1b76b/torchvision/models/_utils.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/nasnet.py | SPNASNet_100_for_PyTorch/timm/models/nasnet.py | https://github.com/Cadene/pretrained-models.pytorch | 源码实现 | -| 开发引入 | / | SPNASNet_100_for_PyTorch/timm/models/inception_v4.py | https://github.com/Cadene/tensorflow-model-zoo.torch | 源码实现 | -| 开发引入 | / | SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://arxiv.org/pdf/1812.01187 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/selecsls.py | SPNASNet_100_for_PyTorch/timm/models/selecsls.py | https://github.com/mehtadushy/SelecSLS-Pytorch | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/pnasnet.py | SPNASNet_100_for_PyTorch/timm/models/pnasnet.py | https://github.com/Cadene/pretrained-models.pytorch/blob/master/pretrainedmodels/models/pnasnet.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnetv2.py | SPNASNet_100_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-S-R50x3.npz | 下载链接 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/layers/eca.py | SPNASNet_100_for_PyTorch/timm/models/layers/eca.py | https://github.com/pytorch/pytorch/pull/17240 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnetv2.py | SPNASNet_100_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-S-R152x2.npz | 下载链接 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/train.py | SPNASNet_100_for_PyTorch/train_8p.py | https://github.com/NVIDIA/apex/tree/master/examples/imagenet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/tnt.py | SPNASNet_100_for_PyTorch/timm/models/tnt.py | https://arxiv.org/abs/2103.00112 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet_blocks.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/abs/2104.00298 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/layers/activations_me.py | SPNASNet_100_for_PyTorch/timm/models/layers/activations.py | 
https://github.com/digantamisra98/H-Mish/blob/0da20d4bc58e696b6803f2523c58d3c8a82782d0/README.md | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/layers/cond_conv2d.py | SPNASNet_100_for_PyTorch/timm/models/layers/cond_conv2d.py | https://arxiv.org/abs/1904.04971 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/xception.py | SPNASNet_100_for_PyTorch/timm/models/xception.py | https://arxiv.org/pdf/1610.02357.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/sknet.py | SPNASNet_100_for_PyTorch/timm/models/layers/selective_kernel.py | https://arxiv.org/abs/1903.06586 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/abs/1812.03443 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/optim/sgdp.py | SPNASNet_100_for_PyTorch/timm/optim/sgdp.py | https://github.com/clovaai/AdamP | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/optim/nvnovograd.py | SPNASNet_100_for_PyTorch/timm/optim/nvnovograd.py | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechRecognition/Jasper | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/abs/1904.02877 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/data/auto_augment.py | SPNASNet_100_for_PyTorch/timm/data/auto_augment.py | https://arxiv.org/abs/1906.11172 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnest.py | SPNASNet_100_for_PyTorch/timm/models/layers/split_attn.py | https://arxiv.org/abs/2004.08955 | 论文地址 | -| 开发引入 | / | SPNASNet_100_for_PyTorch/timm/models/vgg.py | https://github.com/pytorch/vision | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/mobilenetv3.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet_blocks.py | https://arxiv.org/abs/1905.02244 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/layers/activations_me.py | SPNASNet_100_for_PyTorch/timm/models/layers/activations_jit.py | https://arxiv.org/abs/1908.08681 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/optim/adamw.py | SPNASNet_100_for_PyTorch/timm/optim/adamw.py | https://arxiv.org/abs/1412.6980 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/optim/nvnovograd.py | SPNASNet_100_for_PyTorch/timm/optim/nvnovograd.py | https://arxiv.org/abs/1905.11286 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/data/mixup.py | SPNASNet_100_for_PyTorch/timm/data/mixup.py | https://github.com/clovaai/CutMix-PyTorch 
| 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/dla.py | SPNASNet_100_for_PyTorch/timm/models/dla.py | https://github.com/gasvn/Res2Net/blob/master/dla.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/gluon_xception.py | SPNASNet_100_for_PyTorch/timm/models/gluon_xception.py | https://gluon-cv.mxnet.io/_modules/gluoncv/model_zoo/xception.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/optim/rmsprop_tf.py | SPNASNet_100_for_PyTorch/timm/optim/adamw.py | https://arxiv.org/abs/1711.05101 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/loss/jsd.py | SPNASNet_100_for_PyTorch/timm/data/auto_augment.py | https://github.com/google-research/augmix/blob/master/imagenet.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/optim/rmsprop_tf.py | SPNASNet_100_for_PyTorch/timm/optim/rmsprop_tf.py | http://www.cs.toronto.edu/~tijmen/csc321/slides/lecture_slides_lec6.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/res2net.py | SPNASNet_100_for_PyTorch/timm/models/dla.py | https://arxiv.org/abs/1904.01169 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnetv2.py | SPNASNet_100_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-S-R152x4.npz | 下载链接 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/abs/1911.04252 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/data/mixup.py | SPNASNet_100_for_PyTorch/timm/data/mixup.py | https://arxiv.org/abs/1905.04899 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnet.py | SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://arxiv.org/abs/1805.00932 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/utils/agc.py | SPNASNet_100_for_PyTorch/timm/utils/agc.py | https://gist.github.com/lucidrains/0d6560077edac419ab5d3aa29e674d5c | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/layers/cond_conv2d.py | SPNASNet_100_for_PyTorch/timm/models/layers/cond_conv2d.py | https://github.com/tensorflow/tpu/blob/master/models/official/efficientnet/condconv/condconv_layers.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/layers/activations_me.py | SPNASNet_100_for_PyTorch/timm/models/layers/activations_me.py | https://github.com/digantamisra98/H-Mish/blob/0da20d4bc58e696b6803f2523c58d3c8a82782d0/README.md | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/data/parsers/parser_tfds.py | SPNASNet_100_for_PyTorch/timm/data/parsers/parser_tfds.py | 
https://pytorch.org/docs/stable/data.html#multi-process-data-loading | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/README.md | SPNASNet_100_for_PyTorch/timm/models/inception_resnet_v2.py | https://arxiv.org/abs/1602.07261 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/selecsls.py | SPNASNet_100_for_PyTorch/sotabench.py | https://github.com/mehtadushy/SelecSLS-Pytorch | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/optim/rmsprop_tf.py | SPNASNet_100_for_PyTorch/timm/optim/rmsprop_tf.py | https://github.com/pytorch/pytorch/blob/063946d2b3f3f1e953a2a3b54e0b34f1393de295/torch/optim/rmsprop.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/layers/split_attn.py | SPNASNet_100_for_PyTorch/timm/models/resnest.py | https://github.com/zhanghang1989/ResNeSt | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/train.py | SPNASNet_100_for_PyTorch/.train.py.swo | https://github.com/NVIDIA/apex/tree/master/examples/imagenet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/inception_resnet_v2.py | SPNASNet_100_for_PyTorch/timm/models/inception_resnet_v2.py | http://download.tensorflow.org/models/inception_resnet_v2_2016_08_30.tar.gz | 下载链接 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/resnest.py | SPNASNet_100_for_PyTorch/timm/models/resnest.py | https://github.com/zhanghang1989/ResNeSt/blob/master/ablation.md | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet_builder.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/tensorflow/tpu/blob/master/models/official/efficientnet/efficientnet_model.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/loss/jsd.py | SPNASNet_100_for_PyTorch/timm/loss/jsd.py | https://arxiv.org/abs/1912.02781 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet_blocks.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://arxiv.org/abs/1905.11946 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/data/parsers/parser_tfds.py | SPNASNet_100_for_PyTorch/timm/data/parsers/parser_tfds.py | https://www.tensorflow.org/datasets/catalog/overview#image_classification | 数据集地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/inception_v3.py | SPNASNet_100_for_PyTorch/timm/models/inception_v3.py | http://download.tensorflow.org/models/inception_v3_2016_08_28.tar.gz | 下载链接 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet_blocks.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet_blocks.py | https://arxiv.org/abs/2104.00298 | 论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/efficientnet.py | SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://github.com/tensorflow/tpu/tree/master/models/official/mnasnet/mixnet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/optim/adamp.py | SPNASNet_100_for_PyTorch/timm/optim/adamp.py | https://github.com/clovaai/AdamP/blob/master/adamp/adamp.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/senet.py | SPNASNet_100_for_PyTorch/timm/models/senet.py | https://github.com/hujie-frank/SENet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/xception_aligned.py | SPNASNet_100_for_PyTorch/timm/models/xception_aligned.py | https://github.com/tensorflow/models/blob/master/research/deeplab/g3doc/model_zoo.md | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/18bf520ad12297dac4f9992ce497030259ca1aa2/timm/models/inception_v3.py | SPNASNet_100_for_PyTorch/timm/models/inception_v3.py | https://gluon-cv.mxnet.io/model_zoo/classification.html | 相关说明 | +| 文件位置 | 公网地址 | 公网地址用途 | +|---------------------------------------------------------------------------------------------------------------|--------------------------------------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/setup.py | hello@rwightman.com | 作者邮箱 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/densenet.py | https://download.pytorch.org/models/densenet201-c1103571.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/densenet.py | https://download.pytorch.org/models/densenet169-b2777c0a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/densenet.py | https://download.pytorch.org/models/densenet161-8d451a50.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/densenet.py | https://download.pytorch.org/models/densenet121-a639ec97.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60x_c-b870c45c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60x-d15cacda.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60-24839fc4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla46x_c-d761bae7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla46_c-2bfd52c3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla34-ba72cf86.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla169-0914e092.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102x2-262837b6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102x-ad62be81.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102-d94d9790.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb3_pruned_5abcc29f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb2_pruned_203f55bc.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb1_pruned_9ebb3fe6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_A_Green_38ms_75.9_23474aeb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_B_Green_40ms_76.5_1f882d1e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_C_Green_44ms_77.1_d4148c9e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_D_Green_50ms_77.4_23e3cdde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_E_Green_55ms_77.9_90f20e8a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_F_Green_60ms_78.1_2855edf1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/inception_v3.py | https://download.pytorch.org/models/inception_v3_google-1a9a5a14.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/regnet.py | https://dl.fbaipublicfiles.com/deit/regnety_160-a5fe301d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x8-c38310e5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x48-3e41cc8a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x32-e4b90b00.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x16-c6f796b0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext50_32x4-72679e44.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x8-b4712904.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x4-3f87e46b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x16-f3559a9c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet50-16a12f1b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet18-118f1556.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext50_32x4-ddb3e555.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x8-2cfe2f8b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x4-dc43570a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/resnet.py | 
https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x16-15fffa57.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet50-08389792.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet18-d92f0533.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45899/outputs/ECAResNet50D_P_9c67f710.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45610/outputs/ECAResNet101D_P_75a3370e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNetLight_4f34b35b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet50D_833caf58.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet101D_281c5844.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x5.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x3-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x1-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x1.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x4-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x4.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x2-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x2.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x3-ILSVRC2012.npz | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x3.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x1-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x1.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg11-bbd30ac9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg13-c768596a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg16-397923af.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg19-dcbb9e9d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg11_bn-6002323d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg13_bn-abd245e5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg16_bn-6c64b313.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg19_bn-c79401a0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_384-d0272ac0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_tiny_patch16_224-a1311bcf.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_tiny_distilled_patch16_224-b40b3cf7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_small_patch16_224-cd65a155.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_small_distilled_patch16_224-649709d9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_384-8de9b5d1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_224-b5f2ef4d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_224-df68dfff.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SPNASNet_100_for_PyTorch/url.ini | 
https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 数据集地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/classification/Se-ResNext-50-32x4d/public_address_statement.md b/PyTorch/contrib/cv/classification/Se-ResNext-50-32x4d/public_address_statement.md index 7b44aeeac8e73e04de184762748674b2e17a86fd..51fade4825ca719e952093688cbea075bd3dae9e 100644 --- a/PyTorch/contrib/cv/classification/Se-ResNext-50-32x4d/public_address_statement.md +++ b/PyTorch/contrib/cv/classification/Se-ResNext-50-32x4d/public_address_statement.md @@ -1,4 +1,3 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------------------------|----------------------------------------------|------------------------|--------| -| 开发引入 | / | url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 下载测试图片 | -| 开发引入 | / | url.ini | https://github.com/moskomule/senet.pytorch/releases/download/archive/seresnet50-60a8950a85b2b.pkl | 下载权重文件 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|--------------------------------------------------------------------------------|-----------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Se-ResNext-50-32x4d/url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 数据集地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/classification/ShuffleNetV2Plus_ID1626_for_PyTorch/public_address_statement.md b/PyTorch/contrib/cv/classification/ShuffleNetV2Plus_ID1626_for_PyTorch/public_address_statement.md index 5b5ccaab1172f25753ca64fab7802994c79db9d2..e35b46bcfa293d273a8b1bc6ddacfaa7b05f1f85 100644 --- a/PyTorch/contrib/cv/classification/ShuffleNetV2Plus_ID1626_for_PyTorch/public_address_statement.md +++ b/PyTorch/contrib/cv/classification/ShuffleNetV2Plus_ID1626_for_PyTorch/public_address_statement.md @@ -1,5 +1,6 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------------------------|----------------------------------------------|------------------------|------| -| 开发引入 | / | url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 下载测试图片 | -| 开发引入 | / | url.ini | http://mirrors.aliyun.com/pypi/simple | 镜像源 | -| 开发引入 | / | ShuffleNetV2Plus_ID1626_for_PyTorch/channel_shuffle.py | https://github.com/pytorch/vision/blob/master/torchvision/models/shufflenetv2.py#L21 | 源码实现 | +| 文件位置 | 公网地址 | 公网地址用途 | +|-----------------------------------------------------------------------------------------------------------------|-----------------------------------------------------------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ShuffleNetV2Plus_ID1626_for_PyTorch/modelarts/train_start.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ShuffleNetV2Plus_ID1626_for_PyTorch/train.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ShuffleNetV2Plus_ID1626_for_PyTorch/url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/ShuffleNetV2Plus_ID1626_for_PyTorch/url.ini | http://mirrors.aliyun.com/pypi/simple | 相关配置 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/classification/SkresNet50/public_address_statement.md 
b/PyTorch/contrib/cv/classification/SkresNet50/public_address_statement.md index aaca8a0a75e5c9d38deeb800b9df27d0c645361c..840d5193a6a64a2a96109332fd72853f51da8fa5 100644 --- a/PyTorch/contrib/cv/classification/SkresNet50/public_address_statement.md +++ b/PyTorch/contrib/cv/classification/SkresNet50/public_address_statement.md @@ -1,11 +1,4 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------------------------|----------------------------------------------|------------------------|------| -| 开发引入 | / | url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 下载测试图片 | -| 开源代码引入 | https://github.com/verigak/progress/blob/master/setup.py | SkresNet50/utils/progress/setup.py | verigak@gmail.com | 作者邮箱 | -| 开源代码引入 | https://github.com/verigak/progress/blob/master/setup.py | SkresNet50/utils/progress/setup.py | http://github.com/verigak/progress/ | 开源地址 | -| 开发引入 | / | SkresNet50/network.py | https://arxiv.org/abs/1706.02677 | 论文地址 | -| 开发引入 | / | SkresNet50/utils/misc.py | https://github.com/pytorch/examples/blob/master/imagenet/main.py#L247-L262 | 源码实现 | -| 开发引入 | / | SkresNet50/imagenet_fast.py | https://github.com/NVIDIA/apex/tree/f5cd5ae937f168c763985f627bbf850648ea5f3f/examples/imagenet | 源码实现 | -| 开发引入 | / | SkresNet50/modelarts/train_start.py | https://www.github.com/nvidia/apex | 源码实现 | -| 开发引入 | / | SkresNet50/imagenet_fast.py | https://www.github.com/nvidia/apex | 源码实现 | -| 开发引入 | / | SkresNet50/modelarts/train_start.py | https://github.com/NVIDIA/apex/tree/ | 源码实现 | +| 文件位置 | 公网地址 | 公网地址用途 | +|---------------------------------------------------------------------------------------|-----------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SkresNet50/url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SkresNet50/utils/progress/setup.py | verigak@gmail.com | 作者邮箱 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/classification/SqueezeNet1_1/public_address_statement.md b/PyTorch/contrib/cv/classification/SqueezeNet1_1/public_address_statement.md index 12bccc6cde474f4b9ebe55348b17aea098861cc1..41a83742e9df34abefa378699f6bd0e6bb314ca2 100644 --- a/PyTorch/contrib/cv/classification/SqueezeNet1_1/public_address_statement.md +++ b/PyTorch/contrib/cv/classification/SqueezeNet1_1/public_address_statement.md @@ -1,7 +1,7 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------------------------|----------------------------------------------|------------------------|--------| -| 开发引入 | / | url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 下载测试图片 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/squeezenet1_0-b66bff10.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/squeezenet1_1-b8a52dc0.pth | 下载权重文件 | -| 开发引入 | / | SqueezeNet1_1/squeezenet.py | https://arxiv.org/abs/1602.07360 | 论文地址 | -| 开发引入 | / | SqueezeNet1_1/squeezenet.py | https://github.com/DeepScale/SqueezeNet/tree/master/SqueezeNet_v1.1 | 源码实现 | +| 文件位置 | 公网地址 | 公网地址用途 | +|-----------------------------------------------------------------------------|-----------------------------------------------------------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SqueezeNet1_1/main.py | 
tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SqueezeNet1_1/main_8p.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SqueezeNet1_1/url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SqueezeNet1_1/url.ini | https://download.pytorch.org/models/squeezenet1_1-b8a52dc0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/SqueezeNet1_1/url.ini | https://download.pytorch.org/models/squeezenet1_0-b66bff10.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/classification/TNT/README_raw.md b/PyTorch/contrib/cv/classification/TNT/README_raw.md index a6a44196adcfa8ee41b875a9fa6750d6a59b393b..24659f2a24ce64728a8ad98f5b01480930bdb242 100644 --- a/PyTorch/contrib/cv/classification/TNT/README_raw.md +++ b/PyTorch/contrib/cv/classification/TNT/README_raw.md @@ -16,10 +16,10 @@ python -m torch.distributed.launch --nproc_per_node=8 train.py /path/to/imagenet - Pretrained models -|Model|Params (M)|FLOPs (B)|Top-1|Top-5|URL| -|-|-|-|-|-|-| -|TNT-S|23.8|5.2|81.5|95.7|[[BaiduDisk]](https://pan.baidu.com/s/1AwJDWEPl-hqLHfUvqmlqxQ), Password: 7ndi| -|TNT-B|65.6|14.1|82.9|96.3|[[BaiduDisk]](https://pan.baidu.com/s/1_TemN7kvWuYeZohisObQ1w), Password: 2gb7| +|Model|Params (M)|FLOPs (B)|Top-1|Top-5| +|-|-|-|-|-| +|TNT-S|23.8|5.2|81.5|95.7| +|TNT-B|65.6|14.1|82.9|96.3| - Evaluate example: ``` diff --git a/PyTorch/contrib/cv/classification/TResNet/public_address_statement.md b/PyTorch/contrib/cv/classification/TResNet/public_address_statement.md index 344ebe97979e2ed9c17c28b54faccd860a483beb..e31dfff40cd10f04a4da64eafdd7a653a8a7fc48 100644 --- a/PyTorch/contrib/cv/classification/TResNet/public_address_statement.md +++ b/PyTorch/contrib/cv/classification/TResNet/public_address_statement.md @@ -1,548 +1,78 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|-------------------------------------------------------------------------------------------------------------------------------|------------------------------------------------------------------------------------------------------------------------------------|-------------------------------------------------------------------------------------------------------------------------------------------------------------------------|---------| -| 开发引入 | / | url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 下载测试图片 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/byobnet.py | TResNet/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-ger-weights/gernet_s-756b4751.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/byobnet.py | TResNet/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-ger-weights/gernet_m-0873c53a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/byobnet.py | TResNet/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-ger-weights/gernet_l-f31e2e8d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/byobnet.py | TResNet/timm/models/byobnet.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_a2-c1ee6d2b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/byobnet.py | TResNet/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b0-80ac3f1b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/byobnet.py | TResNet/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b1-77ca2989.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/byobnet.py | TResNet/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b1g4-abde5d92.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/byobnet.py | TResNet/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b2-25b7494e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/byobnet.py | TResNet/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b2g4-165a85f2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/byobnet.py | TResNet/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b3-199bc50d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/byobnet.py | TResNet/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b3g4-73c370bf.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/cspnet.py | TResNet/timm/models/cspnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/cspresnet50_ra-d3e8d487.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/cspnet.py | TResNet/timm/models/cspnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/cspresnext50_ra_224-648b4713.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/cspnet.py | TResNet/timm/models/cspnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/cspdarknet53_ra_256-d05c7c21.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/densenet.py | TResNet/timm/models/densenet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/densenet121_ra-50efcf5c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/densenet.py | TResNet/timm/models/densenet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/densenetblur121d_ra-100dcfbc.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/densenet.py | TResNet/timm/models/densenet.py | https://download.pytorch.org/models/densenet169-b2777c0a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/densenet.py | TResNet/timm/models/densenet.py | 
https://download.pytorch.org/models/densenet201-c1103571.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/densenet.py | TResNet/timm/models/densenet.py | https://download.pytorch.org/models/densenet161-8d451a50.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/densenet.py | TResNet/timm/models/densenet.py | https://download.pytorch.org/models/densenet121-a639ec97.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/dla.py | TResNet/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla34-ba72cf86.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/dla.py | TResNet/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla46_c-2bfd52c3.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/dla.py | TResNet/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla46x_c-d761bae7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/dla.py | TResNet/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60x_c-b870c45c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/dla.py | TResNet/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60-24839fc4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/dla.py | TResNet/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60x-d15cacda.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/dla.py | TResNet/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102-d94d9790.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/dla.py | TResNet/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102x-ad62be81.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/dla.py | TResNet/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102x2-262837b6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/dla.py | TResNet/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla169-0914e092.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/dla.py | TResNet/timm/models/dla.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net_dla60_4s-d88db7f9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/dla.py | TResNet/timm/models/dla.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2next_dla60_4s-d327927b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/dpn.py | TResNet/timm/models/dpn.py | https://github.com/rwightman/pytorch-dpn-pretrained/releases/download/v0.1/dpn68-66bebafa7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/dpn.py | TResNet/timm/models/dpn.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/dpn68b_ra-a31ca160.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/dpn.py | TResNet/timm/models/dpn.py | 
https://github.com/rwightman/pytorch-dpn-pretrained/releases/download/v0.1/dpn92_extra-b040e4a9b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/dpn.py | TResNet/timm/models/dpn.py | https://github.com/rwightman/pytorch-dpn-pretrained/releases/download/v0.1/dpn98-5b90dec4d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/dpn.py | TResNet/timm/models/dpn.py | https://github.com/rwightman/pytorch-dpn-pretrained/releases/download/v0.1/dpn131-71dfe43e0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/dpn.py | TResNet/timm/models/dpn.py | https://github.com/rwightman/pytorch-dpn-pretrained/releases/download/v0.1/dpn107_extra-1ac7121e2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mnasnet_b1-74cb7081.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mnasnet_a1-d9418771.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv2_100_ra-b33bc2c4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv2_110d_ra-77090ade.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv2_120d_ra-5987e2ed.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv2_140_ra-21a4e913.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/fbnetc_100-c345b898.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/spnasnet_100-048bc3f4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b0_ra-3dd342df.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b1-533bc792.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b2_ra-bcdf34b7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b2_ra-bcdf34b7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b3_ra2-cf984f9c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b3_ra2-cf984f9c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_es_ra-f111e99c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_em_ra2-66250f76.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_lite0_ra-37913777.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb1_pruned_9ebb3fe6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb2_pruned_203f55bc.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb3_pruned_5abcc29f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b0_aa-827b6e33.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b1_aa-ea7a6ee0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b2_aa-60c94f97.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b3_aa-84b4657e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b4_aa-818f208c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b5_ra-9a3e5369.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b6_aa-80ba17e4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b7_ra-6c08e654.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b8_ra-572d5dd9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b0_ap-f262efe1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b1_ap-44ef0a3d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b2_ap-2f8e7636.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b3_ap-aad25bdd.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b4_ap-dedb23e6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b5_ap-9e82fae8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b6_ap-4ffb161f.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b7_ap-ddb28fec.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b8_ap-00e169fa.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b0_ns-c0e6a31c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b1_ns-99dd0c41.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b2_ns-00306e48.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b3_ns-9d44bf68.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b4_ns-d6313a46.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b5_ns-6f26d0cf.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b6_ns-51548356.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b7_ns-1dbc32de.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_l2_ns_475-bebbd00a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_l2_ns-df73bb44.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_es-ca1afbfe.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_em-e78cfe58.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_el-5143854e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_cc_b0_4e-4362b6b2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_cc_b0_8e-66184a25.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_cc_b1_8e-f7c79ae1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite0-0aa007d2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite1-bde8b488.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite2-dcccb7df.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite3-b733e338.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite4-741542c3.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mixnet_s-a907afbc.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mixnet_m-4647fc68.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mixnet_l-5a9a2ed8.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mixnet_xl_ra-aac3c00c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mixnet_s-89d3354b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mixnet_m-0f4d8805.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mixnet_l-6c92e0c8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/gluon_resnet.py | TResNet/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet18_v1b-0757602b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/gluon_resnet.py | TResNet/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet34_v1b-c6d82d59.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/gluon_resnet.py | TResNet/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1b-0ebe02e2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/gluon_resnet.py | TResNet/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1b-3b017079.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/gluon_resnet.py | TResNet/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1b-c1edb0dd.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/gluon_resnet.py | TResNet/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1c-48092f55.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/gluon_resnet.py | TResNet/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1c-1f26822a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/gluon_resnet.py | TResNet/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1c-a3bb0b98.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/gluon_resnet.py | TResNet/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1d-818a1b1b.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/gluon_resnet.py | TResNet/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1d-0f9c8644.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/gluon_resnet.py | TResNet/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1d-bd354e12.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/gluon_resnet.py | TResNet/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1s-1762acc0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/gluon_resnet.py | TResNet/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1s-60fe0cc1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/gluon_resnet.py | TResNet/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1s-dcc41b81.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/gluon_resnet.py | TResNet/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnext50_32x4d-e6a097c1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/gluon_resnet.py | TResNet/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnext101_32x4d-b253c8c4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/gluon_resnet.py | TResNet/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnext101_64x4d-f9a8e184.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/gluon_resnet.py | TResNet/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_seresnext50_32x4d-90cf2d6e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/gluon_resnet.py | TResNet/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_seresnext101_32x4d-cf52900d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/gluon_resnet.py | TResNet/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_seresnext101_64x4d-f9926f93.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/gluon_resnet.py | TResNet/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_senet154-70a1a3c0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/gluon_xception.py | TResNet/timm/models/gluon_xception.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gluon_xception-7015a15c.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/hrnet.py | TResNet/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnet_w18_small_v1-f460c6bc.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/hrnet.py | TResNet/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnet_w18_small_v2-4c50a8cb.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/hrnet.py | TResNet/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w18-8cb57bb9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/hrnet.py | TResNet/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w30-8d7f8dab.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/hrnet.py | TResNet/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w32-90d8c5fb.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/hrnet.py | TResNet/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w40-7cd397a4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/hrnet.py | TResNet/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w44-c9ac8c18.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/hrnet.py | TResNet/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w48-abd2e6ab.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/hrnet.py | TResNet/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w64-b47cc881.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/inception_resnet_v2.py | TResNet/timm/models/inception_resnet_v2.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/inception_resnet_v2-940b1cd6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/inception_resnet_v2.py | TResNet/timm/models/inception_resnet_v2.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ens_adv_inception_resnet_v2-2592a550.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/inception_v3.py | TResNet/timm/models/inception_v3.py | https://download.pytorch.org/models/inception_v3_google-1a9a5a14.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/inception_v3.py | TResNet/timm/models/inception_v3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_inception_v3-e0069de4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/inception_v3.py | TResNet/timm/models/inception_v3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/adv_inception_v3-9e27bd63.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/inception_v3.py | TResNet/timm/models/inception_v3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gluon_inception_v3-9f746940.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/inception_v4.py | TResNet/timm/models/inception_v4.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/inceptionv4-8e4777a0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/mobilenetv3.py | TResNet/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv3_large_100_ra-f55367f5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/mobilenetv3.py | TResNet/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv3_100-35495452.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/mobilenetv3.py | TResNet/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_large_075-150ee8b0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/mobilenetv3.py | TResNet/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_large_100-427764d5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/mobilenetv3.py | TResNet/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_large_minimal_100-8596ae28.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/mobilenetv3.py | TResNet/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_small_075-da427f52.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/mobilenetv3.py | TResNet/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_small_100-37f49e2b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/mobilenetv3.py | TResNet/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_small_minimal_100-922a7843.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/nasnet.py | TResNet/timm/models/nasnet.py | http://data.lip6.fr/cadene/pretrainedmodels/nasnetalarge-a1897284.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/nfnet.py | TResNet/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f0-604f9c3a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/nfnet.py | TResNet/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f1-fc540f82.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/nfnet.py | TResNet/timm/models/nfnet.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f2-89875923.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/nfnet.py | TResNet/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f3-d74ab3aa.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/nfnet.py | TResNet/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f4-0ac5b10b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/nfnet.py | TResNet/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f5-ecb20ab1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/nfnet.py | TResNet/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f6-e0f12116.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/nfnet.py | TResNet/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/nfnet_l0c-ad1045c2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/nfnet.py | TResNet/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/nf_regnet_b1_256_ra2-ad85cfef.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/nfnet.py | TResNet/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/nf_resnet50_ra2-9f236009.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/pnasnet.py | TResNet/timm/models/pnasnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/pnasnet5large-bf079911.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/regnet.py | TResNet/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_002-e7e85e5c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/regnet.py | TResNet/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_004-7d0e9424.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/regnet.py | TResNet/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_006-85ec1baa.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/regnet.py | TResNet/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_008-d8b470eb.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/regnet.py | TResNet/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_016-65ca972a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/regnet.py | TResNet/timm/models/regnet.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_032-ed0c7f7e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/regnet.py | TResNet/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_040-73c2a654.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/regnet.py | TResNet/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_064-29278baa.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/regnet.py | TResNet/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_080-7c7fcab1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/regnet.py | TResNet/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_120-65d5521e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/regnet.py | TResNet/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_160-c98c4112.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/regnet.py | TResNet/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_320-8ea38b93.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/regnet.py | TResNet/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_002-e68ca334.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/regnet.py | TResNet/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_004-0db870e6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/regnet.py | TResNet/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_006-c67e57ec.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/regnet.py | TResNet/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_008-dc900dbe.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/regnet.py | TResNet/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_016-54367f74.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/regnet.py | TResNet/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/regnety_032_ra-7f2439f9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/regnet.py | TResNet/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_040-f0d569f9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/regnet.py | TResNet/timm/models/regnet.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_064-0a48325c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/regnet.py | TResNet/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_080-e7f3eb93.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/regnet.py | TResNet/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_120-721ba79a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/regnet.py | TResNet/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_160-d64013cd.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/regnet.py | TResNet/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_320-ba464b29.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/res2net.py | TResNet/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net50_26w_4s-06e79181.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/res2net.py | TResNet/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net50_48w_2s-afed724a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/res2net.py | TResNet/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net50_14w_8s-6527dddc.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/res2net.py | TResNet/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net50_26w_6s-19041792.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/res2net.py | TResNet/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net50_26w_8s-2c7c9f12.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/res2net.py | TResNet/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net101_26w_4s-02a759a1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/res2net.py | TResNet/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2next50_4s-6ef7e7bf.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnest.py | TResNet/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gluon_resnest14-9c8fe254.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnest.py | TResNet/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gluon_resnest26-50eb607c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnest.py | TResNet/timm/models/resnest.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest50-528c19ca.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnest.py | TResNet/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest101-22405ba7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnest.py | TResNet/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest200-75117900.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnest.py | TResNet/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest269-0cc87c48.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnest.py | TResNet/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest50_fast_4s2x40d-41d14ed0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnest.py | TResNet/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest50_fast_1s4x24d-d4a4f76f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnet.py | TResNet/timm/models/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnet.py | TResNet/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet18d_ra2-48a79e06.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnet.py | TResNet/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet34-43635321.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnet.py | TResNet/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet34d_ra2-f8dcfcaf.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnet.py | TResNet/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet26-9aa10e23.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnet.py | TResNet/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet26d-69e92c46.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnet.py | TResNet/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet50_ram-a26f946b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnet.py | TResNet/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet50d_ra2-464e36ba.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnet.py | TResNet/timm/models/resnet.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet101d_ra2-2803ffab.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnet.py | TResNet/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet152d_ra2-5cac0439.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnet.py | TResNet/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet200d_ra2-bdba9bf9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnet.py | TResNet/timm/models/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnet.py | TResNet/timm/models/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnet.py | TResNet/timm/models/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnet.py | TResNet/timm/models/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnet.py | TResNet/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/wide_resnet50_racm-8234f177.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnet.py | TResNet/timm/models/resnet.py | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnet.py | TResNet/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnext50_32x4d_ra-d733960d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnet.py | TResNet/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnext50d_32x4d-103e99f8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnet.py | TResNet/timm/models/resnet.py | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnet.py | TResNet/timm/models/resnet.py | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnet.py | TResNet/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x8-c38310e5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnet.py | TResNet/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x16-c6f796b0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnet.py | TResNet/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x32-e4b90b00.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnet.py 
| TResNet/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x48-3e41cc8a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnet.py | TResNet/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet18-d92f0530.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnet.py | TResNet/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet50-08389792.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnet.py | TResNet/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext50_32x4-ddb3e555.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnet.py | TResNet/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x4-dc43570a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnet.py | TResNet/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x8-2cfe2f8b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnet.py | TResNet/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x16-15fffa57.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnet.py | TResNet/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet18-118f1556.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnet.py | TResNet/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet50-16a12f1b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnet.py | TResNet/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext50_32x4-72679e44.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnet.py | TResNet/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x4-3f87e46b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnet.py | TResNet/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x8-b4712904.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnet.py | TResNet/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x16-f3559a9c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnet.py | TResNet/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnet50_ra_224-8efdb4bb.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnet.py | TResNet/timm/models/resnet.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnet152d_ra2-04464dd2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnet.py | TResNet/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext26d_32x4d-80fa48a3.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnet.py | TResNet/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext26tn_32x4d-569cb627.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnet.py | TResNet/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext50_32x4d_racm-a304a460.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnet.py | TResNet/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecaresnet26t_ra2-46609757.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnet.py | TResNet/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNetLight_4f34b35b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnet.py | TResNet/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet50D_833caf58.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnet.py | TResNet/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45899/outputs/ECAResNet50D_P_9c67f710.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnet.py | TResNet/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecaresnet50t_ra2-f7ac63c4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnet.py | TResNet/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet101D_281c5844.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnet.py | TResNet/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45610/outputs/ECAResNet101D_P_75a3370e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnet.py | TResNet/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecaresnet269d_320_ra2-7baa55cb.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnet.py | TResNet/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnetblur50-84f4748f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnetv2.py | TResNet/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x1-ILSVRC2012.npz | 下载数据集 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnetv2.py | 
TResNet/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x3-ILSVRC2012.npz | 下载数据集 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnetv2.py | TResNet/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x1-ILSVRC2012.npz | 下载数据集 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnetv2.py | TResNet/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x3-ILSVRC2012.npz | 下载数据集 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnetv2.py | TResNet/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x2-ILSVRC2012.npz | 下载数据集 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnetv2.py | TResNet/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x4-ILSVRC2012.npz | 下载数据集 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnetv2.py | TResNet/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x1.npz | 下载数据集 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnetv2.py | TResNet/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x3.npz | 下载数据集 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnetv2.py | TResNet/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x1.npz | 下载数据集 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnetv2.py | TResNet/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x3.npz | 下载数据集 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnetv2.py | TResNet/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x2.npz | 下载数据集 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnetv2.py | TResNet/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x4.npz | 下载数据集 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/rexnet.py | TResNet/timm/models/rexnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rexnet/rexnetv1_100-1b4dddf4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/rexnet.py | TResNet/timm/models/rexnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rexnet/rexnetv1_130-590d768e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/rexnet.py | TResNet/timm/models/rexnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rexnet/rexnetv1_150-bd1a6aa8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/rexnet.py | TResNet/timm/models/rexnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rexnet/rexnetv1_200-8c0b7f2d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/selecsls.py | TResNet/timm/models/selecsls.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-selecsls/selecsls42b-8af30141.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/selecsls.py 
| TResNet/timm/models/selecsls.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-selecsls/selecsls60-bbf87526.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/selecsls.py | TResNet/timm/models/selecsls.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-selecsls/selecsls60b-94e619b5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/senet.py | TResNet/timm/models/senet.py | http://data.lip6.fr/cadene/pretrainedmodels/senet154-c7b49a05.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/senet.py | TResNet/timm/models/senet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnet18-4bb0ce65.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/senet.py | TResNet/timm/models/senet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnet34-a4004e63.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/senet.py | TResNet/timm/models/senet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/se_resnet50-ce0d4300.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/senet.py | TResNet/timm/models/senet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/se_resnet101-7e38fcc6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/senet.py | TResNet/timm/models/senet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/se_resnet152-d17c99b7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/senet.py | TResNet/timm/models/senet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext26_32x4d-65ebdb501.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/senet.py | TResNet/timm/models/senet.py | http://data.lip6.fr/cadene/pretrainedmodels/se_resnext50_32x4d-a260b3a4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/senet.py | TResNet/timm/models/senet.py | http://data.lip6.fr/cadene/pretrainedmodels/se_resnext101_32x4d-3b2fe3d8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/sknet.py | TResNet/timm/models/sknet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/skresnet18_ra-4eec2804.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/sknet.py | TResNet/timm/models/sknet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/skresnet34_ra-bdc0ccde.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/sknet.py | TResNet/timm/models/sknet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/skresnext50_ra-f40e40bf.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/tresnet.py | TResNet/timm/models/tresnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-tresnet/tresnet_m_80_8-dbc13962.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/tresnet.py | TResNet/timm/models/tresnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-tresnet/tresnet_l_81_5-235b486c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/tresnet.py | TResNet/timm/models/tresnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-tresnet/tresnet_xl_82_0-a2d51b00.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/tresnet.py | TResNet/timm/models/tresnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-tresnet/tresnet_m_448-bc359d10.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/tresnet.py | TResNet/timm/models/tresnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-tresnet/tresnet_l_448-940d0cd1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/tresnet.py | TResNet/timm/models/tresnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-tresnet/tresnet_xl_448-8c1815de.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/vgg.py | TResNet/timm/models/vgg.py | https://download.pytorch.org/models/vgg11-bbd30ac9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/vgg.py | TResNet/timm/models/vgg.py | https://download.pytorch.org/models/vgg13-c768596a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/vgg.py | TResNet/timm/models/vgg.py | https://download.pytorch.org/models/vgg16-397923af.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/vgg.py | TResNet/timm/models/vgg.py | https://download.pytorch.org/models/vgg19-dcbb9e9d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/vgg.py | TResNet/timm/models/vgg.py | https://download.pytorch.org/models/vgg11_bn-6002323d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/vgg.py | TResNet/timm/models/vgg.py | https://download.pytorch.org/models/vgg13_bn-abd245e5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/vgg.py | TResNet/timm/models/vgg.py | https://download.pytorch.org/models/vgg16_bn-6c64b313.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/vgg.py | TResNet/timm/models/vgg.py | https://download.pytorch.org/models/vgg19_bn-c79401a0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/vision_transformer.py | TResNet/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/vit_small_p16_224-15ec54c9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/vision_transformer.py | TResNet/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_p16_224-80ecf9dd.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/vision_transformer.py | TResNet/timm/models/vision_transformer.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_p16_384-83fb41ba.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/vision_transformer.py | TResNet/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_p32_384-830016f5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/vision_transformer.py | TResNet/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_large_p16_224-4ee7a4dc.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/vision_transformer.py | TResNet/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_large_p16_384-b3be5167.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/vision_transformer.py | TResNet/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_large_p32_384-9b920ba8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/vision_transformer.py | TResNet/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_patch16_224_in21k-e5005f0a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/vision_transformer.py | TResNet/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_patch32_224_in21k-8db57226.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/vision_transformer.py | TResNet/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_large_patch16_224_in21k-606da67d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/vision_transformer.py | TResNet/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_large_patch32_224_in21k-9046d2e7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/vision_transformer.py | TResNet/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_resnet50_224_in21k-6f7c7740.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/vision_transformer.py | TResNet/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_resnet50_384-9fd3c705.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/vision_transformer.py | TResNet/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_tiny_patch16_224-a1311bcf.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/vision_transformer.py | TResNet/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_small_patch16_224-cd65a155.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/vision_transformer.py | TResNet/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_224-b5f2ef4d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/vision_transformer.py | TResNet/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_384-8de9b5d1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/vision_transformer.py | TResNet/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_tiny_distilled_patch16_224-b40b3cf7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/vision_transformer.py | TResNet/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_small_distilled_patch16_224-649709d9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/vision_transformer.py | TResNet/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_224-df68dfff.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/vision_transformer.py | TResNet/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_384-d0272ac0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/vovnet.py | TResNet/timm/models/vovnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ese_vovnet19b_dw-a8741004.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/vovnet.py | TResNet/timm/models/vovnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ese_vovnet39b-f912fe73.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/xception.py | TResNet/timm/models/xception.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/xception-43020ad28.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/xception_aligned.py | TResNet/timm/models/xception_aligned.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_xception_41-e6439c97.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/xception_aligned.py | TResNet/timm/models/xception_aligned.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_xception_65-c9ae96e8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/xception_aligned.py | TResNet/timm/models/xception_aligned.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_xception_71-8eec7df1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnetv2.py | TResNet/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-S-R50x1.npz | 下载链接 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://arxiv.org/abs/1911.09665 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/optim/rmsprop_tf.py | TResNet/timm/optim/rmsprop_tf.py | 
https://arxiv.org/abs/1711.05101 | 论文地址 | -| 开发引入 | / | TResNet/timm/models/inception_resnet_v2.py | https://arxiv.org/abs/1705.07204 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/facebookresearch/maskrcnn-benchmark/blob/master/maskrcnn_benchmark/modeling/backbone/fbnet_modeldef.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/utils/agc.py | TResNet/timm/utils/agc.py | https://gist.github.com/lucidrains/0d6560077edac419ab5d3aa29e674d5c | 相关说明 | -| 开发引入 | / | TResNet/timm/data/random_erasing.py | https://github.com/zhunzhong07/Random-Erasing | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/loss/jsd.py | TResNet/timm/loss/jsd.py | https://arxiv.org/abs/1912.02781 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://arxiv.org/abs/1812.03443 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/data/auto_augment.py | TResNet/timm/data/auto_augment.py | https://arxiv.org/abs/1805.09501 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/optim/adamw.py | TResNet/timm/optim/adamw.py | https://arxiv.org/abs/1412.6980 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/utils/agc.py | TResNet/timm/utils/agc.py | https://arxiv.org/abs/2102.06171 | 论文地址 | -| 开发引入 | / | TResNet/timm/models/sknet.py | https://github.com/clovaai/assembled-cnn | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/loss/jsd.py | TResNet/timm/data/auto_augment.py | https://arxiv.org/abs/1912.02781 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnetv2.py | TResNet/timm/models/resnetv2.py | https://github.com/KaimingHe/resnet-1k-layers/blob/master/resnet-pre-act.lua | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/data/parsers/parser_tfds.py | TResNet/timm/data/parsers/parser_tfds.py | https://www.tensorflow.org/datasets/catalog/overview#image_classification | 数据集地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet_builder.py | TResNet/timm/models/efficientnet_builder.py | https://github.com/facebookresearch/maskrcnn-benchmark/blob/master/maskrcnn_benchmark/modeling/backbone/fbnet_builder.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/data/mixup.py | TResNet/timm/data/mixup.py | https://arxiv.org/abs/1710.09412 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet/condconv | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/data/parsers/parser_tfds.py | TResNet/timm/data/parsers/parser_tfds.py | https://github.com/tensorflow/datasets | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/sknet.py | TResNet/timm/models/layers/selective_kernel.py | https://arxiv.org/abs/1903.06586 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/optim/rmsprop_tf.py | TResNet/timm/optim/rmsprop_tf.py | 
https://github.com/pytorch/pytorch/blob/063946d2b3f3f1e953a2a3b54e0b34f1393de295/torch/optim/rmsprop.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/optim/rmsprop_tf.py | TResNet/timm/optim/rmsprop_tf.py | https://github.com/pytorch/pytorch/blob/master/LICENSE | license地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/sknet.py | TResNet/timm/models/sknet.py | https://arxiv.org/abs/1903.06586 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet_builder.py | TResNet/timm/models/efficientnet_builder.py | https://github.com/tensorflow/tpu/blob/master/models/official/mnasnet/mnasnet_model.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/data/mixup.py | TResNet/timm/data/mixup.py | https://github.com/clovaai/CutMix-PyTorch | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/optim/sgdp.py | TResNet/timm/optim/sgdp.py | https://arxiv.org/abs/2006.08217 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnest.py | TResNet/timm/models/layers/split_attn.py | https://arxiv.org/abs/2004.08955 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/res2net.py | TResNet/timm/models/res2net.py | https://github.com/gasvn/Res2Net/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/vision_transformer.py | TResNet/timm/models/vision_transformer.py | https://arxiv.org/abs/2010.11929 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/byobnet.py | TResNet/timm/models/byobnet.py | https://github.com/DingXiaoH/RepVGG | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/optim/novograd.py | TResNet/timm/optim/novograd.py | https://github.com/convergence-lab/novograd | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/layers/activations_me.py | TResNet/timm/models/layers/activations.py | https://arxiv.org/abs/1908.08681 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/byobnet.py | TResNet/timm/models/byobnet.py | https://arxiv.org/abs/2101.03697 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/optim/radam.py | TResNet/timm/optim/radam.py | https://github.com/LiyuanLucasLiu/RAdam | 源码实现 | -| 开发引入 | / | TResNet/timm/models/gluon_resnet.py | https://github.com/pytorch/vision | 相关依赖 | -| 开发引入 | / | TResNet/timm/models/layers/drop.py | https://github.com/tensorflow/tpu/issues/494#issuecomment-532968956 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/scheduler/cosine_lr.py | TResNet/timm/scheduler/cosine_lr.py | https://github.com/allenai/allennlp/blob/master/allennlp/training/learning_rate_schedulers/cosine.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnetv2.py | TResNet/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-S-R152x4.npz | 下载链接 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/res2net.py | TResNet/timm/models/dla.py | https://github.com/gasvn/Res2Net/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/vovnet.py | 
TResNet/timm/models/vovnet.py | https://github.com/youngwanLEE/vovnet-detectron2 | 源码实现 | -| 开发引入 | / | TResNet/timm/models/inception_resnet_v2.py | https://github.com/Cadene/tensorflow-model-zoo.torch | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/senet.py | TResNet/timm/models/senet.py | https://github.com/Cadene/pretrained-models.pytorch/blob/master/pretrainedmodels/models/senet.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/regnet.py | TResNet/timm/models/regnet.py | https://arxiv.org/abs/2003.13678 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/xception_aligned.py | TResNet/timm/models/xception_aligned.py | https://github.com/tensorflow/models/blob/master/research/deeplab/g3doc/model_zoo.md | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/res2net.py | TResNet/timm/models/res2net.py | https://arxiv.org/abs/1904.01169 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/optim/sgdp.py | TResNet/timm/optim/adamp.py | https://arxiv.org/abs/2006.08217 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/layers/activations_jit.py | TResNet/timm/models/layers/activations_jit.py | https://arxiv.org/abs/1710.05941 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/optim/adamw.py | TResNet/timm/optim/adamw.py | https://openreview.net/forum?id=ryQu7f-RZ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/rexnet.py | TResNet/timm/models/rexnet.py | https://github.com/clovaai/rexnet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/train.py | TResNet/train_finetune_1p.py | https://github.com/pytorch/examples/tree/master/imagenet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/optim/adafactor.py | TResNet/timm/optim/adafactor.py | https://github.com/pytorch/fairseq/blob/master/fairseq/optim/adafactor.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/vovnet.py | TResNet/timm/models/vovnet.py | https://arxiv.org/abs/1911.06667 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/nfnet.py | TResNet/timm/models/layers/std_conv.py | https://arxiv.org/abs/2101.08692 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/layers/drop.py | TResNet/timm/models/layers/drop.py | https://arxiv.org/abs/1810.12890 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/regnet.py | TResNet/timm/models/regnet.py | https://github.com/facebookresearch/pycls/blob/master/pycls/models/regnet.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/cspnet.py | TResNet/timm/models/cspnet.py | https://github.com/WongKinYiu/CrossStagePartialNetworks | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/inception_v3.py | TResNet/timm/models/inception_v3.py | https://github.com/pytorch/vision/blob/master/LICENSE | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/data/random_erasing.py | TResNet/timm/data/random_erasing.py | https://arxiv.org/pdf/1708.04896.pdf | 论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnet.py | TResNet/timm/models/resnet.py | https://pytorch.org/hub/facebookresearch_WSL-Images_resnext/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnet.py | TResNet/timm/models/resnet.py | https://arxiv.org/abs/1805.00932 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/data/real_labels.py | TResNet/timm/data/real_labels.py | https://github.com/google-research/reassessed-imagenet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/vovnet.py | TResNet/timm/models/vovnet.py | https://github.com/stigma0617/VoVNet.pytorch/blob/master/models_vovnet/vovnet.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/utils/agc.py | TResNet/timm/utils/agc.py | https://github.com/deepmind/deepmind-research/tree/master/nfnets | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/layers/activations_me.py | TResNet/timm/models/layers/activations_me.py | https://twitter.com/jeremyphoward/status/1188251041835315200 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/gluon_xception.py | TResNet/timm/models/gluon_xception.py | https://gluon-cv.mxnet.io/_modules/gluoncv/model_zoo/xception.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet/edgetpu | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/optim/sgdp.py | TResNet/timm/optim/sgdp.py | https://github.com/clovaai/AdamP/blob/master/adamp/sgdp.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/train.py | TResNet/train.py | https://github.com/pytorch/examples/tree/master/imagenet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/vision_transformer.py | TResNet/timm/models/vision_transformer.py | https://github.com/karpathy/minGPT | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://arxiv.org/abs/1807.11626 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/cspnet.py | TResNet/timm/models/cspnet.py | https://arxiv.org/abs/1911.11929 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/loss/jsd.py | TResNet/timm/data/auto_augment.py | https://github.com/google-research/augmix/blob/master/imagenet.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/layers/cbam.py | TResNet/timm/models/layers/cbam.py | https://arxiv.org/abs/1807.06521 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/xception.py | TResNet/timm/models/xception.py | https://github.com/tstandley/Xception-PyTorch | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/optim/nvnovograd.py | TResNet/timm/optim/nvnovograd.py | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechRecognition/Jasper | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnest.py | 
TResNet/timm/models/resnest.py | https://github.com/zhanghang1989/ResNeSt/blob/master/ablation.md | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/tensorflow/models/blob/master/research/slim/nets/mobilenet/mobilenet_v2.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://arxiv.org/abs/1905.11946 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/inception_resnet_v2.py | TResNet/timm/models/inception_resnet_v2.py | http://download.tensorflow.org/models/inception_resnet_v2_2016_08_30.tar.gz | 下载链接 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/vision_transformer.py | TResNet/timm/models/vision_transformer.py | https://github.com/google-research/vision_transformer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet_builder.py | TResNet/timm/models/efficientnet.py | https://github.com/tensorflow/tpu/blob/master/models/official/efficientnet/efficientnet_model.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/dla.py | TResNet/timm/models/dla.py | https://arxiv.org/abs/1707.06484 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/tresnet.py | TResNet/timm/models/tresnet.py | https://arxiv.org/pdf/2003.13630.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/layers/weight_init.py | TResNet/timm/models/layers/weight_init.py | https://people.sc.fsu.edu/~jburkardt/presentations/truncated_normal.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/scheduler/cosine_lr.py | TResNet/timm/scheduler/cosine_lr.py | https://arxiv.org/abs/1608.03983 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/tresnet.py | TResNet/timm/models/tresnet.py | https://github.com/mrT23/TResNet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/docs/models.md | TResNet/timm/models/byobnet.py | https://github.com/idstcv/GPU-Efficient-Networks | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/utils/agc.py | TResNet/timm/models/nfnet.py | https://arxiv.org/abs/2102.06171 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/README.md | TResNet/timm/models/dpn.py | https://github.com/cypw/DPNs | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/layers/mixed_conv2d.py | TResNet/timm/models/efficientnet.py | https://arxiv.org/abs/1907.09595 | 论文地址 | -| 开发引入 | / | TResNet/timm/models/densenet.py | https://github.com/pytorch/vision | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/res2net.py | TResNet/timm/models/dla.py | https://arxiv.org/abs/1904.01169 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnet.py | TResNet/timm/models/resnet.py | https://github.com/facebookresearch/WSL-Images | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/dla.py | TResNet/timm/models/dla.py | https://github.com/gasvn/Res2Net/blob/master/dla.py | 源码实现 | -| 
开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/data/auto_augment.py | TResNet/timm/data/auto_augment.py | https://github.com/tensorflow/tpu/blob/master/models/official/efficientnet/autoaugment.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/loss/jsd.py | TResNet/timm/loss/jsd.py | https://github.com/google-research/augmix/blob/master/imagenet.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/train.py | TResNet/train_finetune_1p.py | https://github.com/NVIDIA/apex/tree/master/examples/imagenet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/optim/nadam.py | TResNet/timm/optim/nadam.py | https://github.com/pytorch/pytorch/pull/1408 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/data/auto_augment.py | TResNet/timm/data/auto_augment.py | https://arxiv.org/abs/1906.11172 | 论文地址 | -| 开发引入 | / | TResNet/timm/models/resnet.py | https://arxiv.org/pdf/1812.01187 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/utils/agc.py | TResNet/timm/models/nfnet.py | https://github.com/deepmind/deepmind-research/tree/master/nfnets | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/layers/cond_conv2d.py | TResNet/timm/models/efficientnet.py | https://arxiv.org/abs/1904.04971 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/optim/nadam.py | TResNet/timm/optim/nadam.py | http://cs229.stanford.edu/proj2015/054_report.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/vision_transformer.py | TResNet/timm/models/resnetv2.py | https://arxiv.org/abs/2010.11929 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/layers/eca.py | TResNet/timm/models/layers/eca.py | https://github.com/pytorch/pytorch/pull/17240 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://arxiv.org/abs/1904.02877 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/mobilenetv3.py | TResNet/timm/models/mobilenetv3.py | https://arxiv.org/abs/1905.02244 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/features.py | TResNet/timm/models/features.py | https://github.com/pytorch/vision/blob/d88d8961ae51507d0cb680329d985b1488b1b76b/torchvision/models/_utils.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnetv2.py | TResNet/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-S-R152x2.npz | 下载链接 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/senet.py | TResNet/timm/models/pnasnet.py | https://github.com/creafz | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/layers/activations_me.py | TResNet/timm/models/layers/activations_jit.py | https://github.com/digantamisra98/H-Mish/blob/0da20d4bc58e696b6803f2523c58d3c8a82782d0/README.md | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/layers/eca.py | TResNet/timm/models/layers/eca.py | https://github.com/VRandme | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/layers/activations_me.py | TResNet/timm/models/layers/activations_jit.py | https://arxiv.org/abs/1908.08681 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnetv2.py | TResNet/timm/models/resnetv2.py | https://arxiv.org/abs/1912.11370 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/vision_transformer.py | TResNet/timm/models/vision_transformer.py | https://github.com/google-research/vision_transformer/blob/00883dd691c63a6830751563748663526e811cee/vit_jax/checkpoint.py#L224 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/nfnet.py | TResNet/timm/models/nfnet.py | https://arxiv.org/abs/2101.08692 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/README.md | TResNet/timm/models/pnasnet.py | https://arxiv.org/abs/1712.00559 | 论文地址 | -| 开发引入 | / | TResNet/timm/models/inception_v4.py | https://github.com/Cadene/tensorflow-model-zoo.torch | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/layers/drop.py | TResNet/timm/models/layers/drop.py | https://arxiv.org/pdf/1810.12890.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet_builder.py | TResNet/timm/models/efficientnet_builder.py | https://github.com/tensorflow/tpu/blob/master/models/official/mnasnet/mnasnet_models.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/pnasnet.py | TResNet/timm/models/pnasnet.py | https://github.com/Cadene/pretrained-models.pytorch/blob/master/pretrainedmodels/models/pnasnet.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/inception_v3.py | TResNet/timm/models/inception_v3.py | http://download.tensorflow.org/models/adv_inception_v3_2017_08_18.tar.gz | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/layers/drop.py | TResNet/timm/models/layers/drop.py | https://github.com/tensorflow/tpu/blob/master/models/official/resnet/resnet_model.py#L74 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/scheduler/tanh_lr.py | TResNet/timm/scheduler/tanh_lr.py | https://arxiv.org/abs/1806.01593 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/optim/lookahead.py | TResNet/timm/optim/lookahead.py | https://arxiv.org/abs/1907.08610 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/inception_resnet_v2.py | TResNet/timm/models/inception_resnet_v2.py | https://github.com/tensorflow/models/tree/master/research/adv_imagenet_models | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/optim/adamp.py | TResNet/timm/optim/adamp.py | https://github.com/clovaai/AdamP/blob/master/adamp/adamp.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/train.py | TResNet/perf.py | https://github.com/pytorch/examples/tree/master/imagenet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/gluon_resnet.py | TResNet/timm/models/gluon_resnet.py | https://github.com/dmlc/gluon-cv/blob/master/gluoncv/model_zoo/resnet.py | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/res2net.py | TResNet/timm/models/res2net.py | https://github.com/gasvn/Res2Net/blob/master/res2net.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/layers/split_attn.py | TResNet/timm/models/resnest.py | https://github.com/zhanghang1989/ResNeSt | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/data/parsers/parser_tfds.py | TResNet/timm/data/parsers/parser_tfds.py | https://github.com/pytorch/pytorch/issues/33413 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/densenet.py | TResNet/timm/models/densenet.py | https://arxiv.org/pdf/1707.06990.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnetv2.py | TResNet/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-S-R50x3.npz | 下载链接 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/layers/activations_jit.py | TResNet/timm/models/layers/activations.py | https://arxiv.org/abs/1710.05941 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/gluon_xception.py | TResNet/timm/models/gluon_xception.py | https://github.com/jfzhang95/pytorch-deeplab-xception | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/optim/radam.py | TResNet/timm/optim/radam.py | https://arxiv.org/abs/1908.03265 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/data/loader.py | TResNet/timm/data/loader.py | https://github.com/NVIDIA/apex/commit/d5e2bb4bdeedd27b1dfaf5bb2b24d6c000dee9be#diff-cf86c282ff7fba81fad27a559379d5bf | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/vision_transformer.py | TResNet/timm/models/vision_transformer.py | https://github.com/facebookresearch/deit | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/densenet.py | TResNet/timm/models/densenet.py | https://arxiv.org/pdf/1608.06993.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/optim/rmsprop_tf.py | TResNet/timm/optim/rmsprop_tf.py | https://arxiv.org/pdf/1308.0850v5.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/layers/split_attn.py | TResNet/timm/models/layers/split_attn.py | https://github.com/zhanghang1989/ResNeSt | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/optim/lookahead.py | TResNet/timm/optim/lookahead.py | https://github.com/alphadl/lookahead.pytorch | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/inception_v3.py | TResNet/timm/models/inception_v3.py | https://gluon-cv.mxnet.io/model_zoo/classification.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/vgg.py | TResNet/timm/models/vgg.py | https://arxiv.org/pdf/1409.1556.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet_builder.py | TResNet/timm/models/efficientnet_builder.py | https://github.com/tensorflow/tpu/blob/master/models/official/efficientnet/efficientnet_model.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/layers/mixed_conv2d.py | 
TResNet/timm/models/layers/mixed_conv2d.py | https://github.com/tensorflow/tpu/blob/master/models/official/mnasnet/mixnet/custom_layers.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/optim/adahessian.py | TResNet/timm/optim/adahessian.py | https://github.com/davda54/ada-hessian/blob/master/ada_hessian.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/optim/sgdp.py | TResNet/timm/optim/sgdp.py | https://github.com/clovaai/AdamP | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnet.py | TResNet/timm/models/resnet.py | https://arxiv.org/abs/1905.00546 | 论文地址 | -| 开发引入 | / | TResNet/timm/models/resnet.py | https://github.com/pytorch/vision | 源码实现 | -| 开发引入 | / | TResNet/timm/models/vgg.py | https://github.com/pytorch/vision | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/layers/std_conv.py | TResNet/timm/models/layers/std_conv.py | https://arxiv.org/abs/1903.10520v2 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://arxiv.org/pdf/1807.11626.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnetv2.py | TResNet/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-S-R101x3.npz | 下载链接 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/inception_v3.py | TResNet/timm/models/inception_v3.py | http://download.tensorflow.org/models/inception_v3_2016_08_28.tar.gz | 下载链接 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/train.py | TResNet/perf.py | https://github.com/NVIDIA/apex/tree/master/examples/imagenet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnet.py | TResNet/timm/models/resnet.py | https://github.com/facebookresearch/semi-supervised-ImageNet1K-models | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/data/auto_augment.py | TResNet/timm/data/auto_augment.py | https://github.com/google-research/augmix | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/vision_transformer.py | TResNet/timm/models/vision_transformer.py | https://arxiv.org/abs/2012.12877 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/inception_resnet_v2.py | TResNet/timm/models/inception_resnet_v2.py | http://download.tensorflow.org/models/ens_adv_inception_resnet_v2_2017_08_18.tar.gz | 下载链接 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/vovnet.py | TResNet/timm/models/vovnet.py | https://arxiv.org/abs/1904.09730 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/optim/nadam.py | TResNet/timm/optim/nadam.py | http://www.cs.toronto.edu/~fritz/absps/momentum.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnet.py | TResNet/timm/models/efficientnet.py | https://arxiv.org/pdf/2002.08258.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/layers/cond_conv2d.py | TResNet/timm/models/layers/cond_conv2d.py | https://arxiv.org/abs/1904.04971 | 论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/layers/eca.py | TResNet/timm/models/layers/eca.py | https://github.com/BangguWu/ECANet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/selecsls.py | TResNet/timm/models/selecsls.py | https://arxiv.org/abs/1907.00837 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/layers/drop.py | TResNet/timm/models/layers/drop.py | https://arxiv.org/abs/1603.09382 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnet.py | TResNet/timm/models/resnet.py | https://github.com/facebookresearch/semi-supervised-ImageNet1K-models/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://arxiv.org/abs/1911.04252 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/sknet.py | TResNet/timm/models/sknet.py | https://arxiv.org/abs/2001.06268 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/optim/sgdp.py | TResNet/timm/optim/adamp.py | https://github.com/clovaai/AdamP | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/layers/activations_me.py | TResNet/timm/models/layers/activations.py | https://github.com/digantamisra98/H-Mish/blob/0da20d4bc58e696b6803f2523c58d3c8a82782d0/README.md | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/dpn.py | TResNet/timm/models/dpn.py | https://github.com/oyam/pytorch-DPNs | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/utils/model_ema.py | TResNet/timm/utils/model_ema.py | https://www.tensorflow.org/api_docs/python/tf/train/ExponentialMovingAverage | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnest.py | TResNet/timm/models/resnest.py | https://arxiv.org/abs/2004.08955 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://arxiv.org/abs/1801.04381 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/README.md | TResNet/timm/models/inception_resnet_v2.py | https://arxiv.org/abs/1602.07261 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/hrnet.py | TResNet/timm/models/hrnet.py | https://github.com/HRNet/HRNet-Image-Classification | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/nasnet.py | TResNet/timm/models/nasnet.py | https://github.com/Cadene/pretrained-models.pytorch | 预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/vision_transformer.py | TResNet/timm/models/resnetv2.py | https://github.com/google-research/vision_transformer | 源码实现 | -| 开发引入 | / | TResNet/timm/models/vision_transformer.py | https://github.com/lucidrains/vit-pytorch | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/data/auto_augment.py | TResNet/timm/data/auto_augment.py | https://arxiv.org/abs/1909.13719 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/data/real_labels.py | TResNet/timm/data/real_labels.py | https://arxiv.org/abs/2006.07159 | 论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/senet.py | TResNet/timm/models/senet.py | https://github.com/hujie-frank/SENet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/layers/inplace_abn.py | TResNet/timm/models/layers/inplace_abn.py | https://github.com/mapillary/inplace_abn.git@v1.0.12 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/layers/activations_me.py | TResNet/timm/models/layers/activations_me.py | https://arxiv.org/abs/1908.08681 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/rexnet.py | TResNet/timm/models/rexnet.py | https://arxiv.org/abs/2007.00992 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/layers/eca.py | TResNet/timm/models/layers/eca.py | https://arxiv.org/abs/1910.03151 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/optim/nvnovograd.py | TResNet/timm/optim/nvnovograd.py | https://arxiv.org/abs/1905.11286 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/layers/activations_me.py | TResNet/timm/models/layers/activations_me.py | https://github.com/digantamisra98/H-Mish/blob/0da20d4bc58e696b6803f2523c58d3c8a82782d0/README.md | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/byobnet.py | TResNet/timm/models/byobnet.py | https://arxiv.org/abs/2006.14090 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/layers/cond_conv2d.py | TResNet/timm/models/layers/cond_conv2d.py | https://github.com/tensorflow/tpu/blob/master/models/official/efficientnet/condconv/condconv_layers.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/resnet.py | TResNet/timm/models/resnet.py | https://arxiv.org/pdf/2002.08258.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/optim/adafactor.py | TResNet/timm/optim/adafactor.py | https://arxiv.org/abs/1804.04235 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/tensorflow/tpu/tree/master/models/official/mnasnet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/optim/rmsprop_tf.py | TResNet/timm/optim/adamw.py | https://arxiv.org/abs/1711.05101 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/vovnet.py | TResNet/timm/models/layers/se.py | https://arxiv.org/abs/1911.06667 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/layers/mixed_conv2d.py | TResNet/timm/models/layers/mixed_conv2d.py | https://arxiv.org/abs/1907.09595 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/tensorflow/tpu/tree/master/models/official/mnasnet/mixnet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/hrnet.py | TResNet/timm/models/hrnet.py | sunk@mail.ustc.edu.cn | 邮箱地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/data/random_erasing.py | TResNet/timm/data/random_erasing.py | https://github.com/pytorch/pytorch/issues/19508 | 相关说明 
| -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/senet.py | TResNet/timm/models/senet.py | https://github.com/pytorch/vision/blob/master/torchvision/models/resnet.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/data/parsers/parser_tfds.py | TResNet/timm/data/parsers/parser_tfds.py | https://pytorch.org/docs/stable/data.html#multi-process-data-loading | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/README.md | TResNet/timm/models/resnetv2.py | https://github.com/google-research/big_transfer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/layers/drop.py | TResNet/timm/models/layers/drop.py | https://github.com/clovaai/assembled-cnn/blob/master/nets/blocks.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/efficientnet.py | TResNet/timm/models/efficientnet.py | https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet/lite | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/hrnet.py | TResNet/timm/models/hrnet.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/utils/misc.py | TResNet/timm/utils/misc.py | http://www.codinghorror.com/blog/archives/001018.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/data/mixup.py | TResNet/timm/data/mixup.py | https://arxiv.org/abs/1905.04899 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/layers/eca.py | TResNet/timm/models/layers/eca.py | https://arxiv.org/pdf/1910.03151.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/optim/rmsprop_tf.py | TResNet/timm/optim/rmsprop_tf.py | http://www.cs.toronto.edu/~tijmen/csc321/slides/lecture_slides_lec6.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/layers/cond_conv2d.py | TResNet/timm/models/layers/cond_conv2d.py | https://github.com/pytorch/pytorch/issues/17983 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/train.py | TResNet/train.py | https://github.com/NVIDIA/apex/tree/master/examples/imagenet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/optim/nvnovograd.py | TResNet/timm/optim/novograd.py | https://arxiv.org/abs/1905.11286 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/scheduler/scheduler.py | TResNet/timm/scheduler/scheduler.py | https://github.com/pytorch/fairseq/tree/master/fairseq/optim/lr_scheduler | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/xception.py | TResNet/timm/models/xception.py | https://arxiv.org/pdf/1610.02357.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/selecsls.py | TResNet/timm/models/selecsls.py | https://github.com/mehtadushy/SelecSLS-Pytorch | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/models/senet.py | TResNet/timm/models/senet.py | https://github.com/creafz | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/v0.4.5/timm/scheduler/scheduler.py | TResNet/timm/scheduler/scheduler.py | 
https://github.com/allenai/allennlp/tree/master/allennlp/training/learning_rate_schedulers | 源码实现 | +| 文件位置 | 公网地址 | 公网地址用途 | +|----------------------------------------------------------------------------------------------|--------------------------------------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/densenet.py | https://download.pytorch.org/models/densenet201-c1103571.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/densenet.py | https://download.pytorch.org/models/densenet169-b2777c0a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/densenet.py | https://download.pytorch.org/models/densenet161-8d451a50.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/densenet.py | https://download.pytorch.org/models/densenet121-a639ec97.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60x_c-b870c45c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60x-d15cacda.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60-24839fc4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla46x_c-d761bae7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla46_c-2bfd52c3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla34-ba72cf86.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla169-0914e092.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102x2-262837b6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102x-ad62be81.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102-d94d9790.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb3_pruned_5abcc29f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb2_pruned_203f55bc.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb1_pruned_9ebb3fe6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/inception_v3.py | https://download.pytorch.org/models/inception_v3_google-1a9a5a14.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/resnet.py | 
https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/resnet.py | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/resnet.py | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x8-c38310e5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x48-3e41cc8a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x32-e4b90b00.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x16-c6f796b0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext50_32x4-72679e44.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x8-b4712904.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x4-3f87e46b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x16-f3559a9c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet50-16a12f1b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet18-118f1556.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext50_32x4-ddb3e555.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x8-2cfe2f8b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x4-dc43570a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x16-15fffa57.pth | 
权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet50-08389792.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet18-d92f0534.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45899/outputs/ECAResNet50D_P_9c67f710.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45610/outputs/ECAResNet101D_P_75a3370e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNetLight_4f34b35b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet50D_833caf58.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet101D_281c5844.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/resnet.py | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x6.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x3-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x1-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x1.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x4-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x4.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x2-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x2.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x3-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x3.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x1-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/resnetv2.py | 
https://storage.googleapis.com/bit_models/BiT-M-R101x1.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/vgg.py | https://download.pytorch.org/models/vgg11-bbd30ac9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/vgg.py | https://download.pytorch.org/models/vgg13-c768596a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/vgg.py | https://download.pytorch.org/models/vgg16-397923af.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/vgg.py | https://download.pytorch.org/models/vgg19-dcbb9e9d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/vgg.py | https://download.pytorch.org/models/vgg11_bn-6002323d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/vgg.py | https://download.pytorch.org/models/vgg13_bn-abd245e5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/vgg.py | https://download.pytorch.org/models/vgg16_bn-6c64b313.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/vgg.py | https://download.pytorch.org/models/vgg19_bn-c79401a0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_384-d0272ac0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_tiny_patch16_224-a1311bcf.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_small_patch16_224-cd65a155.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_224-b5f2ef4d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_384-8de9b5d1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_tiny_distilled_patch16_224-b40b3cf7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_small_distilled_patch16_224-649709d9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_224-df68dfff.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/TResNet/url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 数据集地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/classification/VOLO/README_raw.md b/PyTorch/contrib/cv/classification/VOLO/README_raw.md index de72694cbe9df8edae3439f74dc6a6d3ae12e6c2..ba492302016bd20af8a6ca8f5f26e5b47dcf1b05 100644 --- a/PyTorch/contrib/cv/classification/VOLO/README_raw.md +++ b/PyTorch/contrib/cv/classification/VOLO/README_raw.md @@ -86,8 +86,6 @@ Directory structure in this repo: | volo_d5 ↑448| 296M | 448 | 87.0 | [here](https://github.com/sail-sg/volo/releases/download/volo_1/d5_448_87.0.pth.tar) | | volo_d5 ↑512| 296M | 512 | 87.1 | [here](https://github.com/sail-sg/volo/releases/download/volo_1/d5_512_87.07.pth.tar) | -All the 
pretrained models can also be downloaded by [BaiDu Yun](https://pan.baidu.com/s/1l7NfploIiZX9WbTPdwT3rQ) (password: ttbp). - ### Usage Instructions on how to use our pre-trained VOLO models: ```python3 diff --git a/PyTorch/contrib/cv/classification/VOLO/public_address_statement.md b/PyTorch/contrib/cv/classification/VOLO/public_address_statement.md index 54c962cd83caf6fb05c6616656e8c8bae91ea13b..b51293b3003eef6e42f5e0f26dde6c257544aa54 100644 --- a/PyTorch/contrib/cv/classification/VOLO/public_address_statement.md +++ b/PyTorch/contrib/cv/classification/VOLO/public_address_statement.md @@ -1,209 +1,31 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|---|----------------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------------------------------|------| -| 开发引入 | / | VOLO/timm/models/layers/squeeze_excite.py | https://arxiv.org/abs/1911.06667 | 论文地址 | -| 开发引入 | / | VOLO/timm/data/auto_augment.py | https://github.com/tensorflow/tpu/blob/master/models/official/efficientnet/autoaugment.py | 源码实现 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs270_ema-b40e674c.pth | 预训练模型 | -| 开发引入 | / | VOLO/timm/data/parsers/parser_tfds.py | https://github.com/tensorflow/datasets | 数据集地址 | -| 开发引入 | / | VOLO/tlt/data/mixup.py | https://github.com/naver-ai/relabel_imagenet/blob/main/utils/relabel_functions.py | 源码实现 | -| 开发引入 | / | VOLO/timm/models/layers/cbam.py | https://arxiv.org/abs/1807.06521 | 论文地址 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet50D_833caf58.pth | 预训练模型 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext50_32x4d_racm-a304a460.pth | 预训练模型 | -| 开发引入 | / | VOLO/timm/models/layers/squeeze_excite.py | https://arxiv.org/abs/1709.01507 | 论文地址 | -| 开发引入 | / | VOLO/timm/optim/novograd.py | https://github.com/convergence-lab/novograd | 源码实现 | -| 开发引入 | / | VOLO/timm/models/layers/global_context.py | https://arxiv.org/abs/1904.11492 | 论文地址 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet200d_ra2-bdba9bf9.pth | 预训练模型 | -| 开发引入 | / | VOLO/timm/models/layers/mixed_conv2d.py | https://github.com/tensorflow/tpu/blob/master/models/official/mnasnet/mixnet/custom_layers.py | 源码实现 | -| 开发引入 | / | VOLO/timm/loss/jsd.py | https://arxiv.org/abs/1912.02781 | 论文地址 | -| 开发引入 | / | VOLO/timm/optim/lookahead.py | https://arxiv.org/abs/1907.08610 | 论文地址 | -| 开发引入 | / | VOLO/timm/models/layers/eca.py | https://github.com/pytorch/pytorch/pull/17240 | 源码实现 | -| 开发引入 | / | VOLO/timm/models/layers/non_local_attn.py | https://github.com/BA-Transform/BAT-Image-Classification | 源码实现 | -| 开发引入 | / | VOLO/timm/optim/radam.py | https://github.com/LiyuanLucasLiu/RAdam | 源码实现 | -| 开发引入 | / | VOLO/timm/data/auto_augment.py | https://github.com/google-research/augmix/blob/master/imagenet.py | 源码实现 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 预训练模型 | -| 开源代码引入 | https://github.com/sail-sg/volo/models/volo.py | VOLO/models/volo.py | https://github.com/zihangJiang/TokenLabeling | 源码实现 | -| 开发引入 | / | VOLO/timm/models/layers/eca.py | https://arxiv.org/pdf/1910.03151.pdf | 论文地址 | -| 开发引入 | / | 
VOLO/timm/optim/adafactor.py | https://arxiv.org/abs/1804.04235 | 论文地址 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecaresnet26t_ra2-46609757.pth | 预训练模型 | -| 开发引入 | / | VOLO/timm/models/layers/std_conv.py | https://github.com/deepmind/deepmind-research/tree/master/nfnets | 源码实现 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x16-15fffa57.pth | 预训练模型 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x8-2cfe2f8b.pth | 预训练模型 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x16-c6f796b0.pth | 预训练模型 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs200_ema-623d2f59.pth | 源码实现 | -| 开发引入 | / | VOLO/timm/models/layers/drop.py | https://github.com/clovaai/assembled-cnn/blob/master/nets/blocks.py | 源码实现 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs50_ema-6b53758b.pth | 源码实现 | -| 开发引入 | / | VOLO/timm/optim/rmsprop_tf.py | http://www.cs.toronto.edu/~tijmen/csc321/slides/lecture_slides_lec6.pdf | 论文地址 | -| 开发引入 | / | VOLO/timm/models/layers/mixed_conv2d.py | https://arxiv.org/abs/1907.09595 | 论文地址 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet50-16a12f1b.pth | 预训练模型 | -| 开发引入 | / | VOLO/timm/models/layers/activations_jit.py | https://arxiv.org/abs/1908.08681 | 论文地址 | -| 开发引入 | / | VOLO/timm/optim/novograd.py | https://arxiv.org/abs/1905.11286 | 论文地址 | -| 开发引入 | / | VOLO/timm/optim/adahessian.py | https://github.com/davda54/ada-hessian/blob/master/ada_hessian.py | 源码实现 | -| 开发引入 | / | VOLO/timm/models/layers/non_local_attn.py | https://openaccess.thecvf.com/content_CVPR_2020/html/Chi_Non-Local_Neural_Networks_With_Grouped_Bilinear_Attentional_Transforms_CVPR_2020_paper.html | 相关说明 | -| 开发引入 | / | VOLO/timm/utils/misc.py | http://www.codinghorror.com/blog/archives/001018.html | 相关说明 | -| 开发引入 | / | VOLO/timm/models/layers/activations.py | https://github.com/digantamisra98/H-Mish/blob/0da20d4bc58e696b6803f2523c58d3c8a82782d0/README.md | 相关说明 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet18-118f1556.pth | 预训练模型 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext26tn_32x4d-569cb627.pth | 预训练模型 | -| 开发引入 | / | VOLO/timm/models/layers/eca.py | https://arxiv.org/abs/1910.03151 | 论文地址 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs152_i256_ema-a9aff7f9.pth | 预训练模型 | -| 开发引入 | / | VOLO/timm/models/layers/std_conv.py | https://arxiv.org/abs/2101.08692 | 论文地址 | -| 开发引入 | / | VOLO/timm/data/auto_augment.py | https://github.com/google-research/augmix | 源码实现 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://arxiv.org/abs/1905.00546 | 论文地址 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 预训练模型 | -| 开发引入 | / | VOLO/timm/models/features.py | 
https://github.com/pytorch/vision/blob/d88d8961ae51507d0cb680329d985b1488b1b76b/torchvision/models/_utils.py | 源码实现 | -| 开发引入 | / | VOLO/timm/models/layers/cond_conv2d.py | https://github.com/pytorch/pytorch/issues/17983 | 相关说明 | -| 开发引入 | / | VOLO/timm/models/layers/gather_excite.py | https://github.com/hujie-frank/GENet | 源码实现 | -| 开发引入 | / | VOLO/timm/optim/adamp.py | https://arxiv.org/abs/2006.08217 | 论文地址 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://github.com/facebookresearch/semi-supervised-ImageNet1K-models/ | 源码实现 | -| 开发引入 | / | VOLO/timm/models/layers/swin_attn.py | https://arxiv.org/pdf/2103.14030.pdf | 论文地址 | -| 开发引入 | / | VOLO/timm/models/layers/activations_me.py | https://arxiv.org/abs/1908.08681 | 论文地址 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet50d_ra2-464e36ba.pth | 预训练模型 | -| 开发引入 | / | VOLO/timm/models/layers/gather_excite.py | https://arxiv.org/abs/1810.12348 | 论文地址 | -| 开发引入 | / | VOLO/timm/data/random_erasing.py | https://github.com/pytorch/pytorch/issues/19508 | 相关说明 | -| 开发引入 | / | VOLO/timm/models/layers/activations_jit.py | https://arxiv.org/abs/1710.05941 | 论文地址 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 预训练模型 | -| 开发引入 | / | VOLO/timm/models/layers/drop.py | https://arxiv.org/abs/1603.09382 | 论文地址 | -| 开发引入 | / | VOLO/timm/models/layers/cond_conv2d.py | https://github.com/tensorflow/tpu/blob/master/models/official/efficientnet/condconv/condconv_layers.py | 源码实现 | -| 开发引入 | / | VOLO/timm/models/layers/bottleneck_attn.py | https://gist.github.com/aravindsrinivas/56359b79f0ce4449bcb04ab4b56a57a2 | 相关说明 | -| 开发引入 | / | VOLO/tlt/data/random_augment_label.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/data/auto_augment.py | 源码实现 | -| 开发引入 | / | VOLO/timm/optim/lookahead.py | https://github.com/alphadl/lookahead.pytorch | 源码实现 | -| 开发引入 | / | VOLO/timm/models/layers/halo_attn.py | https://gist.github.com/aravindsrinivas/56359b79f0ce4449bcb04ab4b56a57a2 | 源码实现 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 预训练模型 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x4-3f87e46b.pth | 预训练模型 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://arxiv.org/abs/1805.00932 | 论文地址 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://pytorch.org/hub/facebookresearch_WSL-Images_resnext/ | 相关说明 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 预训练模型 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet34-43635321.pth | 预训练模型 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x8-b4712904.pth | 预训练模型 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet18d_ra2-48a79e06.pth | 预训练模型 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet34d_ra2-f8dcfcaf.pth | 预训练模型 | -| 开发引入 | / | VOLO/timm/data/mixup.py | https://arxiv.org/abs/1710.09412 | 论文地址 | -| 开发引入 | / | VOLO/timm/optim/nadam.py | https://github.com/pytorch/pytorch/pull/1408 | 源码实现 | -| 开发引入 | / | 
VOLO/timm/models/layers/drop.py | https://arxiv.org/abs/1810.12890 | 论文地址 | -| 开发引入 | / | VOLO/timm/utils/agc.py | https://github.com/deepmind/deepmind-research/tree/master/nfnets | 源码实现 | -| 开发引入 | / | VOLO/timm/utils/agc.py | https://gist.github.com/lucidrains/0d6560077edac419ab5d3aa29e674d5c | 相关说明 | -| 开发引入 | / | VOLO/timm/models/layers/lambda_layer.py | https://arxiv.org/abs/2102.08602 | 论文地址 | -| 开发引入 | / | VOLO/timm/optim/rmsprop_tf.py | https://github.com/pytorch/pytorch/blob/063946d2b3f3f1e953a2a3b54e0b34f1393de295/torch/optim/rmsprop.py | 源码实现 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecaresnet269d_320_ra2-7baa55cb.pth | 源码实现 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet101D_281c5844.pth | 预训练模型 | -| 开发引入 | / | VOLO/timm/data/auto_augment.py | https://arxiv.org/abs/1909.13719 | 论文地址 | -| 开发引入 | / | VOLO/timm/optim/nadam.py | http://cs229.stanford.edu/proj2015/054_report.pdf | 论文地址 | -| 开发引入 | / | VOLO/timm/models/layers/inplace_abn.py | https://github.com/mapillary/inplace_abn.git@v1.0.12 | 源码实现 | -| 开发引入 | / | VOLO/timm/models/layers/mlp.py | https://arxiv.org/abs/1612.08083","https://arxiv.org/abs/2002.05202 | 论文地址 | -| 开发引入 | / | VOLO/timm/scheduler/cosine_lr.py | https://github.com/allenai/allennlp/blob/master/allennlp/training/learning_rate_schedulers/cosine.py | 源码实现 | -| 开发引入 | / | VOLO/timm/models/layers/activations.py | https://arxiv.org/abs/1908.08681 | 论文地址 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNetLight_4f34b35b.pth | 预训练模型 | -| 开发引入 | / | VOLO/tlt/data/loader.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/data/loader.py | 源码实现 | -| 开发引入 | / | VOLO/timm/models/layers/activations_jit.py | https://github.com/digantamisra98/H-Mish/blob/0da20d4bc58e696b6803f2523c58d3c8a82782d0/README.md | 相关说明 | -| 开发引入 | / | VOLO/timm/data/mixup.py | https://github.com/clovaai/CutMix-PyTorch | 源码实现 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 预训练模型 | -| 开发引入 | / | VOLO/timm/models/layers/selective_kernel.py | https://arxiv.org/abs/1903.06586 | 论文地址 | -| 开发引入 | / | VOLO/timm/utils/model.py | https://arxiv.org/abs/2101.08692 | 论文地址 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x8-c38310e5.pth | 预训练模型 | -| 开发引入 | / | VOLO/timm/models/layers/split_attn.py | https://github.com/zhanghang1989/ResNeSt | 源码实现 | -| 开发引入 | / | VOLO/timm/models/layers/weight_init.py | https://people.sc.fsu.edu/~jburkardt/presentations/truncated_normal.pdf | 论文地址 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://github.com/pytorch/vision | 源码实现 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext50_32x4-72679e44.pth | 预训练模型 | -| 开发引入 | / | VOLO/timm/optim/sgdp.py | https://github.com/clovaai/AdamP/blob/master/adamp/sgdp.py | 源码实现 | -| 开发引入 | / | VOLO/timm/data/parsers/parser_tfds.py | https://www.tensorflow.org/datasets/catalog/overview#image_classification | 相关说明 | -| 开源代码引入 | https://github.com/sail-sg/volo/validate.py | VOLO/validate.py | https://github.com/rwightman/pytorch-image-models | 源码实现 | -| 开发引入 | / | VOLO/timm/models/layers/swin_attn.py | https://github.com/microsoft/Swin-Transformer | 源码实现 | -| 开发引入 
| / | VOLO/timm/optim/adabelief.py | https://gist.github.com/juntang-zhuang/0a501dd51c02278d952cf159bc233037 | 相关说明 | -| 开源代码引入 | https://github.com/sail-sg/volo/utils/utils.py | VOLO/utils/utils.py | https://github.com/google-research/vision_transformer/blob/00883dd691c63a6830751563748663526e811cee/vit_jax/checkpoint.py#L224 | 源码实现 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x48-3e41cc8a.pth | 预训练模型 | -| 开发引入 | / | VOLO/timm/utils/model.py | https://gist.github.com/amaarora/6e56942fcb46e67ba203f3009b30d950 | 相关说明 | -| 开发引入 | / | VOLO/timm/optim/adamw.py | https://openreview.net/forum?id=ryQu7f-RZ | 相关说明 | -| 开发引入 | / | VOLO/timm/models/layers/involution.py | https://github.com/d-li14/involution/blob/main/cls/mmcls/models/utils/involution_naive.py | 源码实现 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnext50_32x4d_ra-d733960d.pth | 预训练模型 | -| 开发引入 | / | VOLO/timm/models/layers/lambda_layer.py | https://github.com/lucidrains/lambda-networks | 源码实现 | -| 开发引入 | / | VOLO/timm/scheduler/scheduler.py | https://github.com/allenai/allennlp/tree/master/allennlp/training/learning_rate_schedulers | 源码实现 | -| 开发引入 | / | VOLO/timm/data/parsers/parser_tfds.py | https://github.com/pytorch/pytorch/issues/33413 | 相关说明 | -| 开发引入 | / | VOLO/timm/models/layers/involution.py | https://arxiv.org/abs/2103.06255 | 论文地址 | -| 开发引入 | / | VOLO/timm/models/layers/bottleneck_attn.py | https://arxiv.org/abs/2101.11605 | 论文地址 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnet50_ra_224-8efdb4bb.pth | 预训练模型 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/wide_resnet50_racm-8234f177.pth | 预训练模型 | -| 开源代码引入 | https://github.com/sail-sg/volo/main.py | VOLO/main.py | https://github.com/pytorch/examples/tree/master/imagenet | 源码实现 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext26d_32x4d-80fa48a3.pth | 预训练模型 | -| 开发引入 | / | VOLO/timm/optim/rmsprop_tf.py | https://github.com/pytorch/pytorch/blob/master/LICENSE | license地址 | -| 开源代码引入 | https://github.com/sail-sg/volo/main.py | VOLO/main.py | https://github.com/rwightman/pytorch-image-models/ | 源码实现 | -| 开发引入 | / | VOLO/timm/models/layers/eca.py | https://github.com/BangguWu/ECANet | 源码实现 | -| 开发引入 | / | VOLO/tlt/data/dataset.py | http://www.codinghorror.com/blog/archives/001018.html | 相关说明 | -| 开发引入 | / | VOLO/timm/optim/rmsprop_tf.py | https://arxiv.org/abs/1711.05101 | 论文地址 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnetblur50-84f4748f.pth | 预训练模型 | -| 开发引入 | / | VOLO/timm/models/layers/drop.py | https://github.com/tensorflow/tpu/blob/master/models/official/resnet/resnet_model.py#L74 | 源码实现 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://github.com/facebookresearch/semi-supervised-ImageNet1K-models | 源码实现 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://arxiv.org/abs/2103.07579 | 论文地址 | -| 开发引入 | / | VOLO/timm/data/auto_augment.py | https://arxiv.org/abs/1906.11172 | 论文地址 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs101_i192_ema-1509bbf6.pth | 预训练模型 | -| 开发引入 | / | VOLO/timm/models/layers/cond_conv2d.py | 
https://arxiv.org/abs/1904.04971 | 论文地址 | -| 开发引入 | / | VOLO/timm/optim/nadam.py | http://www.cs.toronto.edu/~fritz/absps/momentum.pdf | 论文地址 | -| 开发引入 | / | VOLO/timm/data/real_labels.py | https://arxiv.org/abs/2006.07159 | 论文地址 | -| 开发引入 | / | VOLO/timm/utils/model_ema.py | https://www.tensorflow.org/api_docs/python/tf/train/ExponentialMovingAverage | 相关说明 | -| 开发引入 | / | VOLO/timm/optim/adabelief.py | https://gist.github.com/juntang-zhuang/517ce3c27022b908bb93f78e4f786dc3 | 相关说明 | -| 开发引入 | / | VOLO/timm/optim/sgdp.py | https://github.com/clovaai/AdamP | 源码实现 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet50-08389792.pth | 预训练模型 | -| 开发引入 | / | VOLO/timm/scheduler/scheduler.py | https://github.com/pytorch/fairseq/tree/master/fairseq/optim/lr_scheduler | 源码实现 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs350_i256_ema-5a1aa8f1.pth | 预训练模型 | -| 开发引入 | / | VOLO/timm/optim/adabelief.py | https://github.com/juntang-zhuang/Adabelief-Optimizer | 源码实现 | -| 开发引入 | / | VOLO/timm/models/layers/halo_attn.py | https://arxiv.org/abs/1904.09925 | 论文地址 | -| 开发引入 | / | VOLO/timm/models/layers/activations_me.py | https://twitter.com/jeremyphoward/status/1188251041835315200 | 相关说明 | -| 开发引入 | / | VOLO/timm/scheduler/cosine_lr.py | https://arxiv.org/abs/1608.03983 | 论文地址 | -| 开发引入 | / | VOLO/timm/models/layers/drop.py | https://github.com/tensorflow/tpu/issues/494#issuecomment-532968956 | 相关说明 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45899/outputs/ECAResNet50D_P_9c67f710.pth | 预训练模型 | -| 开发引入 | / | VOLO/timm/utils/model.py | https://docs.fast.ai/callback.hook.html | 相关说明 | -| 开发引入 | / | VOLO/demo.py | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 图片地址 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet101d_ra2-2803ffab.pth | 预训练模型 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet152d_ra2-5cac0439.pth | 预训练模型 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://github.com/facebookresearch/WSL-Images | 源码实现 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs420_ema-972dee69.pth | 预训练模型 | -| 开发引入 | / | VOLO/timm/data/mixup.py | https://arxiv.org/abs/1905.04899 | 论文地址 | -| 开发引入 | / | VOLO/timm/utils/agc.py | https://arxiv.org/abs/2102.06171 | 论文地址 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://arxiv.org/pdf/1812.01187 | 论文地址 | -| 开发引入 | / | VOLO/timm/models/layers/activations.py | https://arxiv.org/abs/1710.05941 | 论文地址 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet18-d92f0530.pth | 预训练模型 | -| 开发引入 | / | VOLO/timm/models/layers/drop.py | https://arxiv.org/pdf/1810.12890.pdf | 论文地址 | -| 开发引入 | / | VOLO/timm/optim/sgdp.py | https://arxiv.org/abs/2006.08217 | 论文地址 | -| 开发引入 | / | VOLO/timm/loss/jsd.py | https://github.com/google-research/augmix/blob/master/imagenet.py | 源码实现 | -| 开发引入 | / | VOLO/timm/data/random_erasing.py | https://github.com/zhunzhong07/Random-Erasing | 源码实现 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 预训练模型 | -| 开发引入 | 
/ | VOLO/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x4-dc43570a.pth | 预训练模型 | -| 开发引入 | / | VOLO/timm/optim/nvnovograd.py | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechRecognition/Jasper | 源码实现 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 预训练模型 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecaresnet50t_ra2-f7ac63c4.pth | 预训练模型 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnet152d_ra2-04464dd2.pth | 预训练模型 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet26d-69e92c46.pth | 预训练模型 | -| 开发引入 | / | VOLO/timm/optim/radam.py | https://arxiv.org/abs/1908.03265 | 论文地址 | -| 开发引入 | / | VOLO/timm/optim/rmsprop_tf.py | https://arxiv.org/pdf/1308.0850v5.pdf | 论文地址 | -| 开发引入 | / | VOLO/timm/models/layers/patch_embed.py | https://github.com/google-research/vision_transformer | 源码实现 | -| 开发引入 | / | VOLO/timm/models/layers/activations_me.py | https://github.com/digantamisra98/H-Mish/blob/0da20d4bc58e696b6803f2523c58d3c8a82782d0/README.md | 相关说明 | -| 开发引入 | / | VOLO/timm/data/random_erasing.py | https://arxiv.org/pdf/1708.04896.pdf | 论文地址 | -| 开发引入 | / | VOLO/timm/models/layers/bottleneck_attn.py | https://arxiv.org/abs/1904.09925 | 论文地址 | -| 开发引入 | / | VOLO/timm/optim/adafactor.py | https://github.com/pytorch/fairseq/blob/master/fairseq/optim/adafactor.py | 源码实现 | -| 开发引入 | / | VOLO/timm/optim/adamp.py | https://github.com/clovaai/AdamP/blob/master/adamp/adamp.py | 源码实现 | -| 开发引入 | / | VOLO/tlt/data/mixup.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/data/mixup.py | 源码实现 | -| 开发引入 | / | VOLO/timm/data/parsers/parser_tfds.py | https://pytorch.org/docs/stable/data.html#multi-process-data-loading | 相关说明 | -| 开发引入 | / | VOLO/timm/models/layers/halo_attn.py | https://arxiv.org/abs/2103.12731 | 论文地址 | -| 开发引入 | / | VOLO/timm/data/auto_augment.py | https://arxiv.org/abs/1912.02781 | 论文地址 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext50_32x4-ddb3e555.pth | 预训练模型 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://arxiv.org/pdf/2002.08258.pdf | 论文地址 | -| 开发引入 | / | VOLO/timm/models/layers/split_attn.py | https://arxiv.org/abs/2004.08955 | 论文地址 | -| 开发引入 | / | VOLO/timm/optim/adamw.py | https://arxiv.org/abs/1412.6980 | 论文地址 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://github.com/tensorflow/tpu/tree/bee9c4f6/models/official/resnet/resnet_rs | 源码实现 | -| 开发引入 | / | VOLO/timm/models/layers/global_context.py | https://github.com/xvjiarui/GCNet | 源码实现 | -| 开发引入 | / | VOLO/timm/data/loader.py | https://github.com/NVIDIA/apex/commit/d5e2bb4bdeedd27b1dfaf5bb2b24d6c000dee9be#diff-cf86c282ff7fba81fad27a559379d5bf | 源码实现 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45610/outputs/ECAResNet101D_P_75a3370e.pth | 预训练模型 | -| 开发引入 | / | VOLO/timm/models/layers/non_local_attn.py | https://github.com/facebookresearch/video-nonlocal-net | 源码实现 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnext50d_32x4d-103e99f8.pth | 预训练模型 | -| 开发引入 | / | 
VOLO/timm/models/layers/eca.py | https://github.com/VRandme | 源码实现 | -| 开发引入 | / | VOLO/timm/optim/adamp.py | https://github.com/clovaai/AdamP | 源码实现 | -| 开发引入 | / | VOLO/timm/scheduler/tanh_lr.py | https://arxiv.org/abs/1806.01593 | 论文地址 | -| 开源代码引入 | https://github.com/sail-sg/volo/utils/utils.py | VOLO/tlt/utils/utils.py | https://github.com/google-research/vision_transformer/blob/00883dd691c63a6830751563748663526e811cee/vit_jax/checkpoint.py#L224 | 源码实现 | -| 开发引入 | / | VOLO/timm/models/layers/std_conv.py | https://github.com/joe-siyuan-qiao/WeightStandardization | 源码实现 | -| 开发引入 | / | VOLO/timm/data/auto_augment.py | https://arxiv.org/abs/1805.09501 | 论文地址 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet50_ram-a26f946b.pth | 预训练模型 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x16-f3559a9c.pth | 预训练模型 | -| 开发引入 | / | VOLO/tlt/models/layers.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/vision_transformer.py | 源码实现 | -| 开发引入 | / | VOLO/timm/optim/nvnovograd.py | https://arxiv.org/abs/1905.11286 | 论文地址 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x32-e4b90b00.pth | 预训练模型 | -| 开发引入 | / | VOLO/timm/optim/adamw.py | https://arxiv.org/abs/1711.05101 | 论文地址 | -| 开发引入 | / | VOLO/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet26-9aa10e23.pth | 预训练模型 | -| 开发引入 | / | VOLO/timm/data/real_labels.py | https://github.com/google-research/reassessed-imagenet | 源码实现 | -| 开发引入 | / | VOLO/timm/models/layers/std_conv.py | https://arxiv.org/abs/1903.10520v2 | 论文地址 | +| 文件位置 | 公网地址 | 公网地址用途 | +|-------------------------------------------------------------------------------|--------------------------------------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/VOLO/timm/models/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/VOLO/timm/models/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/VOLO/timm/models/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/VOLO/timm/models/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/VOLO/timm/models/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/VOLO/timm/models/resnet.py | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/VOLO/timm/models/resnet.py | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/VOLO/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x8-c38310e5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/VOLO/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x48-3e41cc8a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/VOLO/timm/models/resnet.py | 
https://download.pytorch.org/models/ig_resnext101_32x32-e4b90b00.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/VOLO/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x16-c6f796b0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/VOLO/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext50_32x4-72679e44.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/VOLO/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x8-b4712904.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/VOLO/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x4-3f87e46b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/VOLO/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x16-f3559a9c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/VOLO/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet50-16a12f1b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/VOLO/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet18-118f1556.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/VOLO/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext50_32x4-ddb3e555.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/VOLO/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x8-2cfe2f8b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/VOLO/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x4-dc43570a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/VOLO/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x16-15fffa57.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/VOLO/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet50-08389792.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/VOLO/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet18-d92f0531.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/VOLO/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45899/outputs/ECAResNet50D_P_9c67f710.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/VOLO/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45610/outputs/ECAResNet101D_P_75a3370e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/VOLO/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNetLight_4f34b35b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/VOLO/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet50D_833caf58.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/classification/VOLO/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet101D_281c5844.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/VOLO/timm/models/resnet.py | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/classification/Vgg16_ID1630_for_PyTorch/public_address_statement.md b/PyTorch/contrib/cv/classification/Vgg16_ID1630_for_PyTorch/public_address_statement.md index 3e298fd1a4f085c80b5833c4cfe4d9638cbf5266..eba2ca8bce5b7d642544d038190522181de5a188 100644 --- a/PyTorch/contrib/cv/classification/Vgg16_ID1630_for_PyTorch/public_address_statement.md +++ b/PyTorch/contrib/cv/classification/Vgg16_ID1630_for_PyTorch/public_address_statement.md @@ -1,11 +1,11 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|----------------------------------------------------------------------------------------------------------------------------------|----------------------------------------------------|-----------------------|--------| -| 开发引入 | / | Vgg16_ID1630_for_PyTorch/vgg.py | https://download.pytorch.org/models/vgg16-397923af.pth | 预训练模型 | -| 开发引入 | / | Vgg16_ID1630_for_PyTorch/vgg.py | https://download.pytorch.org/models/vgg19-dcbb9e9d.pth | 预训练模型 | -| 开发引入 | / | Vgg16_ID1630_for_PyTorch/vgg.py | https://download.pytorch.org/models/vgg11_bn-6002323d.pth | 预训练模型 | -| 开发引入 | / | Vgg16_ID1630_for_PyTorch/vgg.py | https://download.pytorch.org/models/vgg13-c768596a.pth | 预训练模型 | -| 开发引入 | / | Vgg16_ID1630_for_PyTorch/vgg.py | https://download.pytorch.org/models/vgg13_bn-abd245e5.pth | 预训练模型 | -| 开发引入 | / | Vgg16_ID1630_for_PyTorch/vgg.py | https://download.pytorch.org/models/vgg19_bn-c79401a0.pth | 预训练模型 | -| 开发引入 | / | Vgg16_ID1630_for_PyTorch/vgg.py | https://arxiv.org/pdf/1409.1556.pdf | 论文地址 | -| 开发引入 | / | Vgg16_ID1630_for_PyTorch/vgg.py | https://download.pytorch.org/models/vgg11-bbd30ac9.pth | 预训练模型 | -| 开发引入 | / | Vgg16_ID1630_for_PyTorch/vgg.py | https://download.pytorch.org/models/vgg16_bn-6c64b313.pth | 预训练模型 | +| 文件位置 | 公网地址 | 公网地址用途 | +|-------------------------------------------------------------------------------------|-----------------------------------------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vgg16_ID1630_for_PyTorch/main.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vgg16_ID1630_for_PyTorch/url.ini | https://download.pytorch.org/models/vgg19-dcbb9e9d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vgg16_ID1630_for_PyTorch/url.ini | https://download.pytorch.org/models/vgg19_bn-c79401a0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vgg16_ID1630_for_PyTorch/url.ini | https://download.pytorch.org/models/vgg16-397923af.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vgg16_ID1630_for_PyTorch/url.ini | https://download.pytorch.org/models/vgg16_bn-6c64b313.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vgg16_ID1630_for_PyTorch/url.ini | https://download.pytorch.org/models/vgg13-c768596a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vgg16_ID1630_for_PyTorch/url.ini | https://download.pytorch.org/models/vgg13_bn-abd245e5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vgg16_ID1630_for_PyTorch/url.ini | 
https://download.pytorch.org/models/vgg11-bbd30ac9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vgg16_ID1630_for_PyTorch/url.ini | https://download.pytorch.org/models/vgg11_bn-6002323d.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/classification/Vgg19_ID1631_for_PyTorch/public_address_statement.md b/PyTorch/contrib/cv/classification/Vgg19_ID1631_for_PyTorch/public_address_statement.md index d48b7845f4e4b4b82b520b324a33cbba2a476e47..fcbf7f617a194561275652710c59c96bb216f3fc 100644 --- a/PyTorch/contrib/cv/classification/Vgg19_ID1631_for_PyTorch/public_address_statement.md +++ b/PyTorch/contrib/cv/classification/Vgg19_ID1631_for_PyTorch/public_address_statement.md @@ -1,11 +1,12 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|----------------------------------------------------------------------------------------------------------------------------------|----------------------------------------------------|-----------------------|--------| -| 开发引入 | / | Vgg19_ID1631_for_PyTorch/vgg.py | https://download.pytorch.org/models/vgg11-bbd30ac9.pth | 预训练模型 | -| 开发引入 | / | Vgg19_ID1631_for_PyTorch/vgg.py | https://download.pytorch.org/models/vgg19_bn-c79401a0.pth | 预训练模型 | -| 开发引入 | / | Vgg19_ID1631_for_PyTorch/vgg.py | https://download.pytorch.org/models/vgg16_bn-6c64b313.pth | 预训练模型 | -| 开发引入 | / | Vgg19_ID1631_for_PyTorch/vgg.py | https://download.pytorch.org/models/vgg19-dcbb9e9d.pth | 预训练模型 | -| 开发引入 | / | Vgg19_ID1631_for_PyTorch/vgg.py | https://download.pytorch.org/models/vgg13_bn-abd245e5.pth | 预训练模型 | -| 开发引入 | / | Vgg19_ID1631_for_PyTorch/vgg.py | https://download.pytorch.org/models/vgg11_bn-6002323d.pth | 预训练模型 | -| 开发引入 | / | Vgg19_ID1631_for_PyTorch/vgg.py | https://download.pytorch.org/models/vgg16-397923af.pth | 预训练模型 | -| 开发引入 | / | Vgg19_ID1631_for_PyTorch/vgg.py | https://arxiv.org/pdf/1409.1556.pdf | 论文地址 | -| 开发引入 | / | Vgg19_ID1631_for_PyTorch/vgg.py | https://download.pytorch.org/models/vgg13-c768596a.pth | 预训练模型 | +| 文件位置 | 公网地址 | 公网地址用途 | +|------------------------------------------------------------------------------------------------|-----------------------------------------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vgg19_ID1631_for_PyTorch/main.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vgg19_ID1631_for_PyTorch/modelarts/start.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vgg19_ID1631_for_PyTorch/url.ini | https://download.pytorch.org/models/vgg19-dcbb9e9d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vgg19_ID1631_for_PyTorch/url.ini | https://download.pytorch.org/models/vgg19_bn-c79401a0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vgg19_ID1631_for_PyTorch/url.ini | https://download.pytorch.org/models/vgg16-397923af.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vgg19_ID1631_for_PyTorch/url.ini | https://download.pytorch.org/models/vgg16_bn-6c64b313.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vgg19_ID1631_for_PyTorch/url.ini | https://download.pytorch.org/models/vgg13-c768596a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vgg19_ID1631_for_PyTorch/url.ini | https://download.pytorch.org/models/vgg13_bn-abd245e5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vgg19_ID1631_for_PyTorch/url.ini | 
https://download.pytorch.org/models/vgg11-bbd30ac9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vgg19_ID1631_for_PyTorch/url.ini | https://download.pytorch.org/models/vgg11_bn-6002323d.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/classification/Vit_small_patch16_224/public_address_statement.md b/PyTorch/contrib/cv/classification/Vit_small_patch16_224/public_address_statement.md index bc1fc62f46e560473a8d77c8e6b53c5ab6f10d2c..4e3d35b4d1fa7c2aa0a2c29a07b31091f1a0c0a1 100644 --- a/PyTorch/contrib/cv/classification/Vit_small_patch16_224/public_address_statement.md +++ b/PyTorch/contrib/cv/classification/Vit_small_patch16_224/public_address_statement.md @@ -1,685 +1,214 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|---|----------------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------------------------------|------| -| 开发引入 | / | url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 测试图片 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/byobnet.py | Vit_small_patch16_224/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-ger-weights/gernet_s-756b4751.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/byobnet.py | Vit_small_patch16_224/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-ger-weights/gernet_m-0873c53a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/byobnet.py | Vit_small_patch16_224/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-ger-weights/gernet_l-f31e2e8d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/byobnet.py | Vit_small_patch16_224/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_a2-c1ee6d2b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/byobnet.py | Vit_small_patch16_224/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b0-80ac3f1b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/byobnet.py | Vit_small_patch16_224/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b1-77ca2989.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/byobnet.py | Vit_small_patch16_224/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b1g4-abde5d92.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/byobnet.py | Vit_small_patch16_224/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b2-25b7494e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/byobnet.py | Vit_small_patch16_224/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b2g4-165a85f2.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/byobnet.py | Vit_small_patch16_224/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b3-199bc50d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/byobnet.py | Vit_small_patch16_224/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b3g4-73c370bf.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/byobnet.py | Vit_small_patch16_224/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet51q_ra2-d47dcc76.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/cait.py | Vit_small_patch16_224/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XXS24_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/cait.py | Vit_small_patch16_224/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XXS24_384.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/cait.py | Vit_small_patch16_224/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XXS36_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/cait.py | Vit_small_patch16_224/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XXS36_384.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/cait.py | Vit_small_patch16_224/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XS24_384.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/cait.py | Vit_small_patch16_224/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/S24_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/cait.py | Vit_small_patch16_224/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/S24_384.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/cait.py | Vit_small_patch16_224/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/S36_384.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/cait.py | Vit_small_patch16_224/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/M36_384.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/cait.py | Vit_small_patch16_224/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/M48_448.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/cait.py | Vit_small_patch16_224/timm/models/cait.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-coat-weights/coat_tiny-473c2a20.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/cait.py | Vit_small_patch16_224/timm/models/cait.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-coat-weights/coat_mini-2c6baf49.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/cait.py | Vit_small_patch16_224/timm/models/cait.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-coat-weights/coat_lite_tiny-461b07a7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/cait.py | Vit_small_patch16_224/timm/models/cait.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-coat-weights/coat_lite_mini-d7842000.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/cait.py | Vit_small_patch16_224/timm/models/cait.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-coat-weights/coat_lite_small-fea1d5a1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/convit.py | Vit_small_patch16_224/timm/models/convit.py | https://dl.fbaipublicfiles.com/convit/convit_tiny.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/convit.py | Vit_small_patch16_224/timm/models/convit.py | https://dl.fbaipublicfiles.com/convit/convit_small.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/convit.py | Vit_small_patch16_224/timm/models/convit.py | https://dl.fbaipublicfiles.com/convit/convit_base.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/cspnet.py | Vit_small_patch16_224/timm/models/cspnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/cspresnet50_ra-d3e8d487.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/cspnet.py | Vit_small_patch16_224/timm/models/cspnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/cspresnext50_ra_224-648b4713.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/cspnet.py | Vit_small_patch16_224/timm/models/cspnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/cspdarknet53_ra_256-d05c7c21.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/densenet.py | Vit_small_patch16_224/timm/models/densenet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/densenet121_ra-50efcf5c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/densenet.py | Vit_small_patch16_224/timm/models/densenet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/densenetblur121d_ra-100dcfbc.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/densenet.py | Vit_small_patch16_224/timm/models/densenet.py | https://download.pytorch.org/models/densenet169-b2777c0a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/densenet.py | Vit_small_patch16_224/timm/models/densenet.py | https://download.pytorch.org/models/densenet201-c1103571.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/densenet.py | Vit_small_patch16_224/timm/models/densenet.py | https://download.pytorch.org/models/densenet161-8d451a50.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/densenet.py | Vit_small_patch16_224/timm/models/densenet.py | https://download.pytorch.org/models/densenet121-a639ec97.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/dla.py | Vit_small_patch16_224/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla34-ba72cf86.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/dla.py | Vit_small_patch16_224/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla46_c-2bfd52c3.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/dla.py | Vit_small_patch16_224/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla46x_c-d761bae7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/dla.py | Vit_small_patch16_224/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60x_c-b870c45c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/dla.py | Vit_small_patch16_224/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60-24839fc4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/dla.py | Vit_small_patch16_224/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60x-d15cacda.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/dla.py | Vit_small_patch16_224/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102-d94d9790.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/dla.py | Vit_small_patch16_224/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102x-ad62be81.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/dla.py | Vit_small_patch16_224/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102x2-262837b6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/dla.py | Vit_small_patch16_224/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla169-0914e092.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/dla.py | Vit_small_patch16_224/timm/models/dla.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net_dla60_4s-d88db7f9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/dla.py | Vit_small_patch16_224/timm/models/dla.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2next_dla60_4s-d327927b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/dpn.py | Vit_small_patch16_224/timm/models/dpn.py | https://github.com/rwightman/pytorch-dpn-pretrained/releases/download/v0.1/dpn68-66bebafa7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/dpn.py | Vit_small_patch16_224/timm/models/dpn.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/dpn68b_ra-a31ca160.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/dpn.py | Vit_small_patch16_224/timm/models/dpn.py | https://github.com/rwightman/pytorch-dpn-pretrained/releases/download/v0.1/dpn92_extra-b040e4a9b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/dpn.py | Vit_small_patch16_224/timm/models/dpn.py | https://github.com/rwightman/pytorch-dpn-pretrained/releases/download/v0.1/dpn98-5b90dec4d.pth | 下载权重文件 
| -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/dpn.py | Vit_small_patch16_224/timm/models/dpn.py | https://github.com/rwightman/pytorch-dpn-pretrained/releases/download/v0.1/dpn131-71dfe43e0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/dpn.py | Vit_small_patch16_224/timm/models/dpn.py | https://github.com/rwightman/pytorch-dpn-pretrained/releases/download/v0.1/dpn107_extra-1ac7121e2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mnasnet_b1-74cb7081.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mnasnet_a1-d9418771.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv2_100_ra-b33bc2c4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv2_110d_ra-77090ade.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv2_120d_ra-5987e2ed.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv2_140_ra-21a4e913.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/fbnetc_100-c345b898.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/spnasnet_100-048bc3f4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b0_ra-3dd342df.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b1-533bc792.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b2_ra-bcdf34b7.pth | 下载权重文件 | -| 
开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b3_ra2-cf984f9c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b4_ra2_320-7eb33cd5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_es_ra-f111e99c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_em_ra2-66250f76.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/DeGirum/pruned-models/releases/download/efficientnet_v1.0/efficientnet_el.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/DeGirum/pruned-models/releases/download/efficientnet_v1.0/efficientnet_es_pruned75.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/DeGirum/pruned-models/releases/download/efficientnet_v1.0/efficientnet_el_pruned70.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_lite0_ra-37913777.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb1_pruned_9ebb3fe6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb2_pruned_203f55bc.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb3_pruned_5abcc29f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnetv2_t_agc-3620981a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_v2s_ra2_288-a6477665.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnetv2_rw_m_agc-3d90cb1e.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b0_aa-827b6e33.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b1_aa-ea7a6ee0.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b2_aa-60c94f97.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b3_aa-84b4657e.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b4_aa-818f208c.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b5_ra-9a3e5369.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b6_aa-80ba17e4.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b7_ra-6c08e654.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b8_ra-572d5dd9.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b0_ap-f262efe1.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b1_ap-44ef0a3d.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b2_ap-2f8e7636.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b3_ap-aad25bdd.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b4_ap-dedb23e6.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b5_ap-9e82fae8.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b6_ap-4ffb161f.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b7_ap-ddb28fec.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b8_ap-00e169fa.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b0_ns-c0e6a31c.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b1_ns-99dd0c41.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b2_ns-00306e48.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b3_ns-9d44bf68.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b4_ns-d6313a46.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b5_ns-6f26d0cf.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b6_ns-51548356.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b7_ns-1dbc32de.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_l2_ns_475-bebbd00a.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_l2_ns-df73bb44.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_es-ca1afbfe.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_em-e78cfe58.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_el-5143854e.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_cc_b0_4e-4362b6b2.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_cc_b0_8e-66184a25.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_cc_b1_8e-f7c79ae1.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite0-0aa007d2.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite1-bde8b488.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite2-dcccb7df.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite3-b733e338.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite4-741542c3.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_s-eb54923e.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_m-cc09e0cd.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_l-d664b728.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_s_21ft1k-d7dafa41.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_m_21ft1k-bf41664a.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_l_21ft1k-60127a9d.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_s_21k-6337ad01.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_m_21k-361418a2.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_l_21k-91a19ec9.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_b0-c7cc451f.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_b1-be6e41b0.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_b2-847de54e.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_b3-57773f13.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mixnet_s-a907afbc.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mixnet_m-4647fc68.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mixnet_l-5a9a2ed8.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mixnet_xl_ra-aac3c00c.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mixnet_s-89d3354b.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mixnet_m-0f4d8805.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mixnet_l-6c92e0c8.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/ghostnet.py | Vit_small_patch16_224/timm/models/ghostnet.py | https://github.com/huawei-noah/CV-backbones/releases/download/ghostnet_pth/ghostnet_1x.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/gluon_resnet.py | Vit_small_patch16_224/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet18_v1b-0757602b.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/gluon_resnet.py | Vit_small_patch16_224/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet34_v1b-c6d82d59.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/gluon_resnet.py | Vit_small_patch16_224/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1b-0ebe02e2.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/gluon_resnet.py | Vit_small_patch16_224/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1b-3b017079.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/gluon_resnet.py | Vit_small_patch16_224/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1b-c1edb0dd.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/gluon_resnet.py | Vit_small_patch16_224/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1c-48092f55.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/gluon_resnet.py | Vit_small_patch16_224/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1c-1f26822a.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/gluon_resnet.py | Vit_small_patch16_224/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1c-a3bb0b98.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/gluon_resnet.py | Vit_small_patch16_224/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1d-818a1b1b.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/gluon_resnet.py | Vit_small_patch16_224/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1d-0f9c8644.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/gluon_resnet.py | Vit_small_patch16_224/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1d-bd354e12.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/gluon_resnet.py | Vit_small_patch16_224/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1s-1762acc0.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/gluon_resnet.py | Vit_small_patch16_224/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1s-60fe0cc1.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/gluon_resnet.py | Vit_small_patch16_224/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1s-dcc41b81.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/gluon_resnet.py | Vit_small_patch16_224/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnext50_32x4d-e6a097c1.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/gluon_resnet.py | Vit_small_patch16_224/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnext101_32x4d-b253c8c4.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/gluon_resnet.py | Vit_small_patch16_224/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnext101_64x4d-f9a8e184.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/gluon_resnet.py | Vit_small_patch16_224/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_seresnext50_32x4d-90cf2d6e.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/gluon_resnet.py | Vit_small_patch16_224/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_seresnext101_32x4d-cf52900d.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/gluon_resnet.py | Vit_small_patch16_224/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_seresnext101_64x4d-f9926f93.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/gluon_resnet.py | Vit_small_patch16_224/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_senet154-70a1a3c0.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/gluon_xception.py | Vit_small_patch16_224/timm/models/gluon_xception.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gluon_xception-7015a15c.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/hardcorenas.py | Vit_small_patch16_224/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_A_Green_38ms_75.9_23474aeb.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/hardcorenas.py | Vit_small_patch16_224/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_B_Green_40ms_76.5_1f882d1e.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/hardcorenas.py | Vit_small_patch16_224/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_C_Green_44ms_77.1_d4148c9e.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/hardcorenas.py | Vit_small_patch16_224/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_D_Green_50ms_77.4_23e3cdde.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/hardcorenas.py | Vit_small_patch16_224/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_E_Green_55ms_77.9_90f20e8a.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/hardcorenas.py | Vit_small_patch16_224/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_F_Green_60ms_78.1_2855edf1.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/hrnet.py | Vit_small_patch16_224/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnet_w18_small_v1-f460c6bc.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/hrnet.py | Vit_small_patch16_224/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnet_w18_small_v2-4c50a8cb.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/hrnet.py | Vit_small_patch16_224/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w18-8cb57bb9.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/hrnet.py | Vit_small_patch16_224/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w30-8d7f8dab.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/hrnet.py | Vit_small_patch16_224/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w32-90d8c5fb.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/hrnet.py | Vit_small_patch16_224/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w40-7cd397a4.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/hrnet.py | Vit_small_patch16_224/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w44-c9ac8c18.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/hrnet.py | Vit_small_patch16_224/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w48-abd2e6ab.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/hrnet.py | Vit_small_patch16_224/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w64-b47cc881.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/inception_resnet_v2.py | Vit_small_patch16_224/timm/models/inception_resnet_v2.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/inception_resnet_v2-940b1cd6.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/inception_resnet_v2.py | Vit_small_patch16_224/timm/models/inception_resnet_v2.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ens_adv_inception_resnet_v2-2592a550.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/inception_v3.py | Vit_small_patch16_224/timm/models/inception_v3.py | https://download.pytorch.org/models/inception_v3_google-1a9a5a14.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/inception_v3.py | Vit_small_patch16_224/timm/models/inception_v3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_inception_v3-e0069de4.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/inception_v3.py | Vit_small_patch16_224/timm/models/inception_v3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/adv_inception_v3-9e27bd63.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/inception_v3.py | Vit_small_patch16_224/timm/models/inception_v3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gluon_inception_v3-9f746940.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/inception_v4.py | Vit_small_patch16_224/timm/models/inception_v4.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/inceptionv4-8e4777a0.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/levit.py | Vit_small_patch16_224/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-128S-96703c44.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/levit.py | Vit_small_patch16_224/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-128-b88c2750.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/levit.py | Vit_small_patch16_224/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-192-92712e41.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/levit.py | Vit_small_patch16_224/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-256-13b5763e.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/levit.py | Vit_small_patch16_224/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-384-9bdaf2e2.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/mlp_mixer.py | Vit_small_patch16_224/timm/models/mlp_mixer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_mixer_b16_224-76587d61.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/mlp_mixer.py | Vit_small_patch16_224/timm/models/mlp_mixer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_mixer_b16_224_in21k-617b3de2.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/mlp_mixer.py | Vit_small_patch16_224/timm/models/mlp_mixer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_mixer_l16_224-92f9adc4.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/mlp_mixer.py | Vit_small_patch16_224/timm/models/mlp_mixer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_mixer_l16_224_in21k-846aa33c.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/mlp_mixer.py | Vit_small_patch16_224/timm/models/mlp_mixer.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/mixer_b16_224_miil_in21k.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/mlp_mixer.py | Vit_small_patch16_224/timm/models/mlp_mixer.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/mixer_b16_224_miil.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/mlp_mixer.py | Vit_small_patch16_224/timm/models/mlp_mixer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gmixer_24_224_raa-7daf7ae6.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/mlp_mixer.py | Vit_small_patch16_224/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_12_no_dist.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/mlp_mixer.py | Vit_small_patch16_224/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_24_no_dist.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/mlp_mixer.py | Vit_small_patch16_224/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_36_no_dist.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/mlp_mixer.py | Vit_small_patch16_224/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlpB_24_no_dist.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/mlp_mixer.py | Vit_small_patch16_224/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_12_dist.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/mlp_mixer.py | Vit_small_patch16_224/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_24_dist.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/mlp_mixer.py | Vit_small_patch16_224/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_36_dist.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/mlp_mixer.py | Vit_small_patch16_224/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlpB_24_dist.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/mlp_mixer.py | Vit_small_patch16_224/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlpB_24_22k.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/mlp_mixer.py | Vit_small_patch16_224/timm/models/mlp_mixer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gmlp_s16_224_raa-10536d42.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/mobilenetv3.py | Vit_small_patch16_224/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv3_large_100_ra-f55367f5.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/mobilenetv3.py | Vit_small_patch16_224/timm/models/mobilenetv3.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/mobilenetv3_large_100_1k_miil_78_0.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/mobilenetv3.py | Vit_small_patch16_224/timm/models/mobilenetv3.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/mobilenetv3_large_100_in21k_miil.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/mobilenetv3.py | Vit_small_patch16_224/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv3_100-35495452.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/mobilenetv3.py | Vit_small_patch16_224/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_large_075-150ee8b0.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/mobilenetv3.py | Vit_small_patch16_224/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_large_100-427764d5.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/mobilenetv3.py | Vit_small_patch16_224/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_large_minimal_100-8596ae28.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/mobilenetv3.py | Vit_small_patch16_224/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_small_075-da427f52.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/mobilenetv3.py | Vit_small_patch16_224/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_small_100-37f49e2b.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/mobilenetv3.py | Vit_small_patch16_224/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_small_minimal_100-922a7843.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/nasnet.py | Vit_small_patch16_224/timm/models/nasnet.py | http://data.lip6.fr/cadene/pretrainedmodels/nasnetalarge-a1897284.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/nest.py | Vit_small_patch16_224/timm/models/nest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vt3p-weights/jx_nest_base-8bc41011.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/nest.py | Vit_small_patch16_224/timm/models/nest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vt3p-weights/jx_nest_small-422eaded.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/nest.py | Vit_small_patch16_224/timm/models/nest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vt3p-weights/jx_nest_tiny-e3428fb9.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/nfnet.py | Vit_small_patch16_224/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f0-604f9c3a.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/nfnet.py | Vit_small_patch16_224/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f1-fc540f82.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/nfnet.py | Vit_small_patch16_224/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f2-89875923.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/nfnet.py | Vit_small_patch16_224/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f3-d74ab3aa.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/nfnet.py | Vit_small_patch16_224/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f4-0ac5b10b.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/nfnet.py | Vit_small_patch16_224/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f5-ecb20ab1.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/nfnet.py | Vit_small_patch16_224/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f6-e0f12116.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/nfnet.py | Vit_small_patch16_224/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/nfnet_l0_ra2-45c6688d.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/nfnet.py | Vit_small_patch16_224/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecanfnet_l0_ra2-e3e9ac50.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/nfnet.py | Vit_small_patch16_224/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecanfnet_l1_ra2-7dce93cd.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/nfnet.py | Vit_small_patch16_224/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecanfnet_l2_ra3-da781a61.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/nfnet.py | Vit_small_patch16_224/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/nf_regnet_b1_256_ra2-ad85cfef.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/nfnet.py | Vit_small_patch16_224/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/nf_resnet50_ra2-9f236009.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/pit.py | Vit_small_patch16_224/timm/models/pit.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-pit-weights/pit_ti_730.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/pit.py | Vit_small_patch16_224/timm/models/pit.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-pit-weights/pit_xs_781.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/pit.py | Vit_small_patch16_224/timm/models/pit.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-pit-weights/pit_s_809.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/pit.py | Vit_small_patch16_224/timm/models/pit.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-pit-weights/pit_b_820.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/pit.py | Vit_small_patch16_224/timm/models/pit.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-pit-weights/pit_ti_distill_746.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/pit.py | Vit_small_patch16_224/timm/models/pit.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-pit-weights/pit_xs_distill_791.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/pit.py | Vit_small_patch16_224/timm/models/pit.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-pit-weights/pit_s_distill_819.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/pit.py | Vit_small_patch16_224/timm/models/pit.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-pit-weights/pit_b_distill_840.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/pnasnet.py | Vit_small_patch16_224/timm/models/pnasnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/pnasnet5large-bf079911.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/regnet.py | Vit_small_patch16_224/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_002-e7e85e5c.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/regnet.py | Vit_small_patch16_224/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_004-7d0e9424.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/regnet.py | Vit_small_patch16_224/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_006-85ec1baa.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/regnet.py | Vit_small_patch16_224/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_008-d8b470eb.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/regnet.py | Vit_small_patch16_224/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_016-65ca972a.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/regnet.py | Vit_small_patch16_224/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_032-ed0c7f7e.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/regnet.py | Vit_small_patch16_224/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_040-73c2a654.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/regnet.py | Vit_small_patch16_224/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_064-29278baa.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/regnet.py | Vit_small_patch16_224/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_080-7c7fcab1.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/regnet.py | Vit_small_patch16_224/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_120-65d5521e.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/regnet.py | Vit_small_patch16_224/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_160-c98c4112.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/regnet.py | Vit_small_patch16_224/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_320-8ea38b93.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/regnet.py | Vit_small_patch16_224/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_002-e68ca334.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/regnet.py | Vit_small_patch16_224/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_004-0db870e6.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/regnet.py | Vit_small_patch16_224/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_006-c67e57ec.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/regnet.py | Vit_small_patch16_224/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_008-dc900dbe.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/regnet.py | Vit_small_patch16_224/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_016-54367f74.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/regnet.py | Vit_small_patch16_224/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/regnety_032_ra-7f2439f9.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/regnet.py | Vit_small_patch16_224/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_040-f0d569f9.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/regnet.py | Vit_small_patch16_224/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_064-0a48325c.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/regnet.py | Vit_small_patch16_224/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_080-e7f3eb93.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/regnet.py | Vit_small_patch16_224/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_120-721ba79a.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/regnet.py | Vit_small_patch16_224/timm/models/regnet.py | https://dl.fbaipublicfiles.com/deit/regnety_160-a5fe301d.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/regnet.py | Vit_small_patch16_224/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_320-ba464b29.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/res2net.py | Vit_small_patch16_224/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net50_26w_4s-06e79181.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/res2net.py | Vit_small_patch16_224/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net50_48w_2s-afed724a.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/res2net.py | Vit_small_patch16_224/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net50_14w_8s-6527dddc.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/res2net.py | Vit_small_patch16_224/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net50_26w_6s-19041792.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/res2net.py | Vit_small_patch16_224/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net50_26w_8s-2c7c9f12.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/res2net.py | Vit_small_patch16_224/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net101_26w_4s-02a759a1.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/res2net.py | Vit_small_patch16_224/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2next50_4s-6ef7e7bf.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnest.py | Vit_small_patch16_224/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gluon_resnest14-9c8fe254.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnest.py | Vit_small_patch16_224/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gluon_resnest26-50eb607c.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnest.py | Vit_small_patch16_224/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest50-528c19ca.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnest.py | Vit_small_patch16_224/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest101-22405ba7.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnest.py | Vit_small_patch16_224/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest200-75117900.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnest.py | Vit_small_patch16_224/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest269-0cc87c48.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnest.py | Vit_small_patch16_224/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest50_fast_4s2x40d-41d14ed0.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnest.py | Vit_small_patch16_224/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest50_fast_1s4x24d-d4a4f76f.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet18d_ra2-48a79e06.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet34-43635321.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet34d_ra2-f8dcfcaf.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet26-9aa10e23.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet26d-69e92c46.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet50_ram-a26f946b.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet50d_ra2-464e36ba.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet101d_ra2-2803ffab.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet152d_ra2-5cac0439.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet200d_ra2-bdba9bf9.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/wide_resnet50_racm-8234f177.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnext50_32x4d_ra-d733960d.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnext50d_32x4d-103e99f8.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x8-c38310e5.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x16-c6f796b0.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x32-e4b90b00.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x48-3e41cc8a.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet18-d92f0530.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet50-08389792.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext50_32x4-ddb3e555.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x4-dc43570a.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x8-2cfe2f8b.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x16-15fffa57.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet18-118f1556.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet50-16a12f1b.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext50_32x4-72679e44.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x4-3f87e46b.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x8-b4712904.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x16-f3559a9c.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnet50_ra_224-8efdb4bb.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnet152d_ra2-04464dd2.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext26d_32x4d-80fa48a3.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext26tn_32x4d-569cb627.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext50_32x4d_racm-a304a460.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecaresnet26t_ra2-46609757.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNetLight_4f34b35b.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet50D_833caf58.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45899/outputs/ECAResNet50D_P_9c67f710.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecaresnet50t_ra2-f7ac63c4.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet101D_281c5844.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45610/outputs/ECAResNet101D_P_75a3370e.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecaresnet269d_320_ra2-7baa55cb.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnetblur50-84f4748f.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs50_ema-6b53758b.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs101_i192_ema-1509bbf6.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs152_i256_ema-a9aff7f9.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs200_ema-623d2f59.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs270_ema-b40e674c.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs350_i256_ema-5a1aa8f1.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs420_ema-972dee69.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnetv2.py | Vit_small_patch16_224/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x1-ILSVRC2012.npz | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnetv2.py | Vit_small_patch16_224/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x3-ILSVRC2012.npz | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnetv2.py | Vit_small_patch16_224/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x1-ILSVRC2012.npz | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnetv2.py | Vit_small_patch16_224/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x3-ILSVRC2012.npz | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnetv2.py | Vit_small_patch16_224/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x2-ILSVRC2012.npz | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnetv2.py | Vit_small_patch16_224/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x4-ILSVRC2012.npz | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnetv2.py | Vit_small_patch16_224/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x1.npz | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnetv2.py | Vit_small_patch16_224/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x3.npz | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnetv2.py | Vit_small_patch16_224/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x1.npz | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnetv2.py | Vit_small_patch16_224/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x3.npz | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnetv2.py | Vit_small_patch16_224/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x2.npz | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnetv2.py | Vit_small_patch16_224/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x4.npz | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnetv2.py | Vit_small_patch16_224/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/distill/R50x1_224.npz | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnetv2.py | Vit_small_patch16_224/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/distill/R152x2_T_224.npz | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnetv2.py | Vit_small_patch16_224/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/distill/R152x2_T_384.npz | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/rexnet.py | Vit_small_patch16_224/timm/models/rexnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rexnet/rexnetv1_100-1b4dddf4.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/rexnet.py | Vit_small_patch16_224/timm/models/rexnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rexnet/rexnetv1_130-590d768e.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/rexnet.py | Vit_small_patch16_224/timm/models/rexnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rexnet/rexnetv1_150-bd1a6aa8.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/rexnet.py | Vit_small_patch16_224/timm/models/rexnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rexnet/rexnetv1_200-8c0b7f2d.pth | 下载权重文件 |
-| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/selecsls.py | Vit_small_patch16_224/timm/models/selecsls.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-selecsls/selecsls42b-8af30141.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/selecsls.py | Vit_small_patch16_224/timm/models/selecsls.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-selecsls/selecsls60-bbf87526.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/selecsls.py | Vit_small_patch16_224/timm/models/selecsls.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-selecsls/selecsls60b-94e619b5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/senet.py | Vit_small_patch16_224/timm/models/senet.py | http://data.lip6.fr/cadene/pretrainedmodels/senet154-c7b49a05.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/senet.py | Vit_small_patch16_224/timm/models/senet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnet18-4bb0ce65.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/senet.py | Vit_small_patch16_224/timm/models/senet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnet34-a4004e63.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/senet.py | Vit_small_patch16_224/timm/models/senet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/se_resnet50-ce0d4300.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/senet.py | Vit_small_patch16_224/timm/models/senet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/se_resnet101-7e38fcc6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/senet.py | Vit_small_patch16_224/timm/models/senet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/se_resnet152-d17c99b7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/senet.py | Vit_small_patch16_224/timm/models/senet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext26_32x4d-65ebdb501.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/senet.py | Vit_small_patch16_224/timm/models/senet.py | http://data.lip6.fr/cadene/pretrainedmodels/se_resnext50_32x4d-a260b3a4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/senet.py | Vit_small_patch16_224/timm/models/senet.py | http://data.lip6.fr/cadene/pretrainedmodels/se_resnext101_32x4d-3b2fe3d8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/sknet.py | Vit_small_patch16_224/timm/models/sknet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/skresnet18_ra-4eec2804.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/sknet.py | Vit_small_patch16_224/timm/models/sknet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/skresnet34_ra-bdc0ccde.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/sknet.py | Vit_small_patch16_224/timm/models/sknet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/skresnext50_ra-f40e40bf.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/swin_transformer.py | Vit_small_patch16_224/timm/models/swin_transformer.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_base_patch4_window12_384_22kto1k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/swin_transformer.py | Vit_small_patch16_224/timm/models/swin_transformer.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_base_patch4_window7_224_22kto1k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/swin_transformer.py | Vit_small_patch16_224/timm/models/swin_transformer.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_large_patch4_window12_384_22kto1k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/swin_transformer.py | Vit_small_patch16_224/timm/models/swin_transformer.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_large_patch4_window7_224_22kto1k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/swin_transformer.py | Vit_small_patch16_224/timm/models/swin_transformer.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_small_patch4_window7_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/swin_transformer.py | Vit_small_patch16_224/timm/models/swin_transformer.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_tiny_patch4_window7_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/swin_transformer.py | Vit_small_patch16_224/timm/models/swin_transformer.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_base_patch4_window12_384_22k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/swin_transformer.py | Vit_small_patch16_224/timm/models/swin_transformer.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_base_patch4_window7_224_22k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/swin_transformer.py | Vit_small_patch16_224/timm/models/swin_transformer.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_large_patch4_window12_384_22k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/swin_transformer.py | Vit_small_patch16_224/timm/models/swin_transformer.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_large_patch4_window7_224_22k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/tnt.py | Vit_small_patch16_224/timm/models/tnt.py | https://github.com/contrastive/pytorch-image-models/releases/download/TNT/tnt_s_patch16_224.pth.tar | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/tresnet.py | Vit_small_patch16_224/timm/models/tresnet.py | 
https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/tresnet_m_1k_miil_83_1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/tresnet.py | Vit_small_patch16_224/timm/models/tresnet.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/tresnet_m_miil_in21k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/tresnet.py | Vit_small_patch16_224/timm/models/tresnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-tresnet/tresnet_l_81_5-235b486c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/tresnet.py | Vit_small_patch16_224/timm/models/tresnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-tresnet/tresnet_xl_82_0-a2d51b00.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/tresnet.py | Vit_small_patch16_224/timm/models/tresnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-tresnet/tresnet_m_448-bc359d10.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/tresnet.py | Vit_small_patch16_224/timm/models/tresnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-tresnet/tresnet_l_448-940d0cd1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/tresnet.py | Vit_small_patch16_224/timm/models/tresnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-tresnet/tresnet_xl_448-8c1815de.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/twins.py | Vit_small_patch16_224/timm/models/twins.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vt3p-weights/twins_pcpvt_small-e70e7e7a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/twins.py | Vit_small_patch16_224/timm/models/twins.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vt3p-weights/twins_pcpvt_base-e5ecb09b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/twins.py | Vit_small_patch16_224/timm/models/twins.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vt3p-weights/twins_pcpvt_large-d273f802.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/twins.py | Vit_small_patch16_224/timm/models/twins.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vt3p-weights/twins_svt_small-42e5f78c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/twins.py | Vit_small_patch16_224/timm/models/twins.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vt3p-weights/twins_svt_base-c2265010.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/twins.py | Vit_small_patch16_224/timm/models/twins.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vt3p-weights/twins_svt_large-90f6aaa9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vgg.py | Vit_small_patch16_224/timm/models/vgg.py | https://download.pytorch.org/models/vgg11-bbd30ac9.pth | 下载权重文件 | -| 开源代码引入 
| https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vgg.py | Vit_small_patch16_224/timm/models/vgg.py | https://download.pytorch.org/models/vgg13-c768596a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vgg.py | Vit_small_patch16_224/timm/models/vgg.py | https://download.pytorch.org/models/vgg16-397923af.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vgg.py | Vit_small_patch16_224/timm/models/vgg.py | https://download.pytorch.org/models/vgg19-dcbb9e9d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vgg.py | Vit_small_patch16_224/timm/models/vgg.py | https://download.pytorch.org/models/vgg11_bn-6002323d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vgg.py | Vit_small_patch16_224/timm/models/vgg.py | https://download.pytorch.org/models/vgg13_bn-abd245e5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vgg.py | Vit_small_patch16_224/timm/models/vgg.py | https://download.pytorch.org/models/vgg16_bn-6c64b313.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vgg.py | Vit_small_patch16_224/timm/models/vgg.py | https://download.pytorch.org/models/vgg19_bn-c79401a0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/visformer.py | Vit_small_patch16_224/timm/models/visformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vt3p-weights/visformer_small-839e1f5b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vision_transformer.py | Vit_small_patch16_224/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vision_transformer.py | Vit_small_patch16_224/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vision_transformer.py | Vit_small_patch16_224/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vision_transformer.py | Vit_small_patch16_224/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vision_transformer.py | Vit_small_patch16_224/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vision_transformer.py | Vit_small_patch16_224/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vision_transformer.py | Vit_small_patch16_224/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vision_transformer.py | Vit_small_patch16_224/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vision_transformer.py | Vit_small_patch16_224/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vision_transformer.py | Vit_small_patch16_224/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vision_transformer.py | Vit_small_patch16_224/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_large_p32_384-9b920ba8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vision_transformer.py | Vit_small_patch16_224/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vision_transformer.py | Vit_small_patch16_224/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vision_transformer.py | Vit_small_patch16_224/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/Ti_16-i21k-300ep-lr_0.001-aug_none-wd_0.03-do_0.0-sd_0.0.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vision_transformer.py | Vit_small_patch16_224/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/S_32-i21k-300ep-lr_0.001-aug_light1-wd_0.03-do_0.0-sd_0.0.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vision_transformer.py | Vit_small_patch16_224/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/S_16-i21k-300ep-lr_0.001-aug_light1-wd_0.03-do_0.0-sd_0.0.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vision_transformer.py | Vit_small_patch16_224/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/B_32-i21k-300ep-lr_0.001-aug_medium1-wd_0.03-do_0.0-sd_0.0.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vision_transformer.py | Vit_small_patch16_224/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/B_16-i21k-300ep-lr_0.001-aug_medium1-wd_0.1-do_0.0-sd_0.0.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vision_transformer.py | Vit_small_patch16_224/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_large_patch32_224_in21k-9046d2e7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vision_transformer.py | Vit_small_patch16_224/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/L_16-i21k-300ep-lr_0.001-aug_medium1-wd_0.1-do_0.1-sd_0.1.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vision_transformer.py | Vit_small_patch16_224/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/imagenet21k/ViT-H_14.npz | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vision_transformer.py | Vit_small_patch16_224/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/sam/ViT-B_32.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vision_transformer.py | Vit_small_patch16_224/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/sam/ViT-B_16.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vision_transformer.py | Vit_small_patch16_224/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_tiny_patch16_224-a1311bcf.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vision_transformer.py | Vit_small_patch16_224/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_small_patch16_224-cd65a155.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vision_transformer.py | Vit_small_patch16_224/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_224-b5f2ef4d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vision_transformer.py | Vit_small_patch16_224/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_384-8de9b5d1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vision_transformer.py | Vit_small_patch16_224/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_tiny_distilled_patch16_224-b40b3cf7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vision_transformer.py | Vit_small_patch16_224/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_small_distilled_patch16_224-649709d9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vision_transformer.py | Vit_small_patch16_224/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_224-df68dfff.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vision_transformer.py | Vit_small_patch16_224/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_384-d0272ac0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vision_transformer.py | Vit_small_patch16_224/timm/models/vision_transformer.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/vit_base_patch16_224_in21k_miil.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vision_transformer.py | Vit_small_patch16_224/timm/models/vision_transformer.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vision_transformer_hybrid.py | Vit_small_patch16_224/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/ | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vision_transformer_hybrid.py | Vit_small_patch16_224/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/ 
| 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vision_transformer_hybrid.py | Vit_small_patch16_224/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/ | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vision_transformer_hybrid.py | Vit_small_patch16_224/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/ | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vision_transformer_hybrid.py | Vit_small_patch16_224/timm/models/vision_transformer_hybrid.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_resnet50_384-9fd3c705.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vision_transformer_hybrid.py | Vit_small_patch16_224/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/ | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vision_transformer_hybrid.py | Vit_small_patch16_224/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/ | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vision_transformer_hybrid.py | Vit_small_patch16_224/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/R_Ti_16-i21k-300ep-lr_0.001-aug_none-wd_0.03-do_0.0-sd_0.0.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vision_transformer_hybrid.py | Vit_small_patch16_224/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/R26_S_32-i21k-300ep-lr_0.001-aug_medium2-wd_0.03-do_0.0-sd_0.0.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vision_transformer_hybrid.py | Vit_small_patch16_224/timm/models/vision_transformer_hybrid.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_resnet50_224_in21k-6f7c7740.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vision_transformer_hybrid.py | Vit_small_patch16_224/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/R50_L_32-i21k-300ep-lr_0.001-aug_medium2-wd_0.1-do_0.0-sd_0.0.npz | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vovnet.py | Vit_small_patch16_224/timm/models/vovnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ese_vovnet19b_dw-a8741004.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vovnet.py | Vit_small_patch16_224/timm/models/vovnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ese_vovnet39b-f912fe73.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/xception.py | Vit_small_patch16_224/timm/models/xception.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/xception-43020ad28.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/xception_aligned.py | Vit_small_patch16_224/timm/models/xception_aligned.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_xception_41-e6439c97.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/xception_aligned.py | Vit_small_patch16_224/timm/models/xception_aligned.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_xception_65-c9ae96e8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/xception_aligned.py | Vit_small_patch16_224/timm/models/xception_aligned.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_xception_71-8eec7df1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/xcit.py | Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p16_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/xcit.py | Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p16_224_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/xcit.py | Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p16_384_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/xcit.py | Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p16_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/xcit.py | Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p16_224_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/xcit.py | Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p16_384_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/xcit.py | Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p16_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/xcit.py | Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p16_224_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/xcit.py | Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p16_384_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/xcit.py | Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p16_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/xcit.py | Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p16_224_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/xcit.py | Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p16_384_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/xcit.py | Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p16_224.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/xcit.py | Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p16_224_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/xcit.py | Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p16_384_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/xcit.py | Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p16_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/xcit.py | Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p16_224_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/xcit.py | Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p16_384_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/xcit.py | Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p16_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/xcit.py | Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p16_224_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/xcit.py | Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p16_384_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/xcit.py | Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p8_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/xcit.py | Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p8_224_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/xcit.py | Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p8_384_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/xcit.py | Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p8_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/xcit.py | Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p8_224_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/xcit.py | Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p8_384_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/xcit.py | Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p8_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/xcit.py | Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p8_224_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/xcit.py | 
Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p8_384_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/xcit.py | Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p8_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/xcit.py | Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p8_224_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/xcit.py | Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p8_384_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/xcit.py | Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p8_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/xcit.py | Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p8_224_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/xcit.py | Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p8_384_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/xcit.py | Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p8_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/xcit.py | Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p8_224_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/xcit.py | Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p8_384_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/xcit.py | Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p8_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/xcit.py | Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p8_224_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/xcit.py | Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p8_384_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/optim/adamw.py | Vit_small_patch16_224/timm/optim/npu_fused_adamp.py | https://arxiv.org/abs/1412.6980 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/coat.py | Vit_small_patch16_224/timm/models/coat.py | https://discuss.pytorch.org/t/how-to-keep-the-shape-of-input-and-output-same-when-dilation-conv/14338 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/optim/rmsprop_tf.py | Vit_small_patch16_224/timm/optim/rmsprop_tf.py | https://arxiv.org/pdf/1308.0850v5.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/optim/nadam.py | Vit_small_patch16_224/timm/optim/nadam.py | https://github.com/pytorch/pytorch/pull/1408 | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/main/timm/optim/nadamw.py | Vit_small_patch16_224/timm/optim/adamw.py | https://openreview.net/forum?id=ryQu7f-RZ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet/edgetpu | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet/condconv | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/data/auto_augment.py | Vit_small_patch16_224/timm/data/auto_augment.py | https://arxiv.org/abs/1906.11172 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vision_transformer_hybrid.py | Vit_small_patch16_224/timm/models/vision_transformer_hybrid.py | https://github.com/google-research/vision_transformer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/hardcorenas.py | Vit_small_patch16_224/timm/models/hardcorenas.py | https://github.com/Alibaba-MIIL/HardCoReNAS | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/layers/halo_attn.py | Vit_small_patch16_224/timm/models/layers/bottleneck_attn.py | https://gist.github.com/aravindsrinivas/56359b79f0ce4449bcb04ab4b56a57a2 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/xcit.py | Vit_small_patch16_224/timm/models/xcit.py | https://github.com/rwightman/pytorch-image-models/pull/747#issuecomment-877795721 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://arxiv.org/pdf/1807.11626.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/layers/activations_me.py | Vit_small_patch16_224/timm/models/layers/activations.py | https://github.com/digantamisra98/H-Mish/blob/0da20d4bc58e696b6803f2523c58d3c8a82782d0/README.md | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vision_transformer.py | Vit_small_patch16_224/timm/models/vision_transformer.py | https://github.com/Alibaba-MIIL/ImageNet21K | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://arxiv.org/pdf/2002.08258.pdf | 论文地址 | -| 开发引入 | / | Vit_small_patch16_224/timm/models/resnet.py | https://arxiv.org/pdf/1812.01187 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://arxiv.org/abs/1907.09595 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/layers/activations_me.py | Vit_small_patch16_224/timm/models/layers/activations_jit.py | https://arxiv.org/abs/1908.08681 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/DeGirum/pruned-models/releases/tag/efficientnet_v1.0 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/loss/jsd.py | 
Vit_small_patch16_224/timm/loss/jsd.py | https://github.com/google-research/augmix/blob/master/imagenet.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/utils/model.py | Vit_small_patch16_224/timm/utils/model.py | https://gist.github.com/amaarora/6e56942fcb46e67ba203f3009b30d950 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/data/auto_augment.py | Vit_small_patch16_224/timm/data/auto_augment.py | https://arxiv.org/abs/1805.09501 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vision_transformer.py | Vit_small_patch16_224/timm/models/mobilenetv3.py | https://github.com/Alibaba-MIIL/ImageNet21K | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vision_transformer_hybrid.py | Vit_small_patch16_224/timm/models/vision_transformer.py | https://github.com/google-research/vision_transformer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/optim/nadam.py | Vit_small_patch16_224/timm/optim/nadam.py | http://cs229.stanford.edu/proj2015/054_report.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vovnet.py | Vit_small_patch16_224/timm/models/vovnet.py | https://github.com/youngwanLEE/vovnet-detectron2 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/tensorflow/models/blob/master/research/slim/nets/mobilenet/mobilenet_v2.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/optim/adamw.py | Vit_small_patch16_224/timm/optim/npu_fused_adamw.py | https://arxiv.org/abs/1412.6980 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/_efficientnet_builder.py | Vit_small_patch16_224/timm/models/efficientnet_builder.py | https://github.com/tensorflow/tpu/blob/master/models/official/mnasnet/mnasnet_models.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/google/automl/tree/master/efficientnetv2 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/README.md | Vit_small_patch16_224/timm/models/resnet.py | https://arxiv.org/abs/1905.00546 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/res2net.py | Vit_small_patch16_224/timm/models/res2net.py | https://github.com/gasvn/Res2Net/blob/master/res2net.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/optim/rmsprop_tf.py | Vit_small_patch16_224/timm/optim/rmsprop_tf.py | http://www.cs.toronto.edu/~tijmen/csc321/slides/lecture_slides_lec6.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://arxiv.org/abs/2103.07579 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/layers/non_local_attn.py | Vit_small_patch16_224/timm/models/layers/non_local_attn.py | https://github.com/BA-Transform/BAT-Image-Classification | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/layers/activations_me.py | Vit_small_patch16_224/timm/models/layers/activations_me.py | 
https://github.com/digantamisra98/H-Mish/blob/0da20d4bc58e696b6803f2523c58d3c8a82782d0/README.md | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/data/auto_augment.py | Vit_small_patch16_224/timm/data/auto_augment.py | https://github.com/google-research/augmix | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vovnet.py | Vit_small_patch16_224/timm/models/vovnet.py | https://github.com/stigma0617/VoVNet.pytorch/blob/master/models_vovnet/vovnet.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/layers/lambda_layer.py | Vit_small_patch16_224/timm/models/layers/lambda_layer.py | https://arxiv.org/abs/2102.08602 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/layers/activations_me.py | Vit_small_patch16_224/timm/models/layers/activations_me.py | https://arxiv.org/abs/1908.08681 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/optim/nadamw.py | Vit_small_patch16_224/timm/optim/npu_fused_adamp.py | https://openreview.net/forum?id=ryQu7f-RZ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnetv2.py | Vit_small_patch16_224/timm/models/resnetv2.py | https://arxiv.org/abs/2106.05237 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/optim/rmsprop_tf.py | Vit_small_patch16_224/timm/optim/adamw.py | https://arxiv.org/abs/1711.05101 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/optim/rmsprop_tf.py | Vit_small_patch16_224/timm/optim/npu_fused_adamw.py | https://arxiv.org/abs/1711.05101 | 论文地址 | -| 开发引入 | / | Vit_small_patch16_224/timm/models/efficientnet.py | 3.5.7.9 | 相关说明 | -| 开发引入 | / | Vit_small_patch16_224/timm/models/layers/mlp.py | https://arxiv.org/abs/1612.08083","https://arxiv.org/abs/2002.05202 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/_efficientnet_blocks.py | Vit_small_patch16_224/timm/models/mobilenetv3.py | https://arxiv.org/abs/1905.02244 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/layers/halo_attn.py | Vit_small_patch16_224/timm/models/layers/halo_attn.py | https://arxiv.org/abs/2103.12731 | 论文地址 | -| 开发引入 | / | Vit_small_patch16_224/timm/models/inception_resnet_v2.py | https://arxiv.org/abs/1705.07204 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/data/auto_augment.py | Vit_small_patch16_224/timm/data/auto_augment.py | https://arxiv.org/abs/1909.13719 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/densenet.py | Vit_small_patch16_224/timm/models/densenet.py | https://arxiv.org/pdf/1608.06993.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/utils/agc.py | Vit_small_patch16_224/timm/models/nfnet.py | https://github.com/deepmind/deepmind-research/tree/master/nfnets | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/densenet.py | Vit_small_patch16_224/timm/models/densenet.py | https://arxiv.org/pdf/1707.06990.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/_efficientnet_blocks.py | Vit_small_patch16_224/timm/models/efficientnet_blocks.py | https://arxiv.org/abs/1807.11626 | 论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/main/timm/optim/adamw.py | Vit_small_patch16_224/timm/optim/adamw.py | https://arxiv.org/abs/1412.6980 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/scheduler/scheduler.py | Vit_small_patch16_224/timm/scheduler/scheduler.py | https://github.com/pytorch/fairseq/tree/master/fairseq/optim/lr_scheduler | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/pit.py | Vit_small_patch16_224/timm/models/pit.py | https://arxiv.org/abs/2103.16302 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vgg.py | Vit_small_patch16_224/timm/models/vgg.py | https://arxiv.org/pdf/1409.1556.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/optim/rmsprop_tf.py | Vit_small_patch16_224/timm/optim/npu_fused_adamp.py | https://arxiv.org/abs/1711.05101 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/swin_transformer.py | Vit_small_patch16_224/timm/models/swin_transformer.py | https://arxiv.org/pdf/2103.14030 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/utils/agc.py | Vit_small_patch16_224/timm/models/nfnet.py | https://arxiv.org/abs/2102.06171 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/_efficientnet_builder.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/tensorflow/tpu/blob/master/models/official/efficientnet/efficientnet_model.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/xcit.py | Vit_small_patch16_224/timm/models/xcit.py | https://github.com/facebookresearch/xcit/blob/master/xcit.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/facebookresearch/maskrcnn-benchmark/blob/master/maskrcnn_benchmark/modeling/backbone/fbnet_modeldef.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/tensorflow/tpu/tree/master/models/official/mnasnet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/scheduler/tanh_lr.py | Vit_small_patch16_224/timm/scheduler/tanh_lr.py | https://arxiv.org/abs/1806.01593 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/layers/bottleneck_attn.py | Vit_small_patch16_224/timm/models/layers/bottleneck_attn.py | https://arxiv.org/abs/2101.11605 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/data/random_erasing.py | Vit_small_patch16_224/timm/data/random_erasing.py | https://github.com/pytorch/pytorch/issues/19508 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/optim/adafactor.py | Vit_small_patch16_224/timm/optim/adafactor.py | https://arxiv.org/abs/1804.04235 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/cspnet.py | Vit_small_patch16_224/timm/models/cspnet.py | https://github.com/WongKinYiu/CrossStagePartialNetworks | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/cait.py | Vit_small_patch16_224/timm/models/cait.py | https://arxiv.org/pdf/2003.02436v1.pdf 
| 论文地址 | -| 开发引入 | / | Vit_small_patch16_224/timm/models/inception_resnet_v2.py | https://github.com/tensorflow/models/tree/master/research/adv_imagenet_models | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/optim/rmsprop_tf.py | Vit_small_patch16_224/timm/optim/rmsprop_tf.py | https://arxiv.org/abs/1711.05101 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/loss/jsd.py | Vit_small_patch16_224/timm/data/auto_augment.py | https://arxiv.org/abs/1912.02781 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/_efficientnet_builder.py | Vit_small_patch16_224/timm/models/efficientnet_builder.py | https://github.com/tensorflow/tpu/blob/master/models/official/efficientnet/efficientnet_model.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/_efficientnet_blocks.py | Vit_small_patch16_224/timm/models/efficientnet_blocks.py | https://arxiv.org/abs/2104.00298 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnest.py | Vit_small_patch16_224/timm/models/resnest.py | https://github.com/zhanghang1989/ResNeSt/blob/master/ablation.md | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/layers/halo_attn.py | Vit_small_patch16_224/timm/models/layers/halo_attn.py | https://arxiv.org/abs/1904.09925 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/inception_v3.py | Vit_small_patch16_224/timm/models/inception_v3.py | http://download.tensorflow.org/models/adv_inception_v3_2017_08_18.tar.gz | 下载链接 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/nest.py | Vit_small_patch16_224/timm/models/nest.py | https://arxiv.org/abs/2105.12723 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/utils/model.py | Vit_small_patch16_224/timm/models/nfnet.py | https://arxiv.org/abs/2101.08692 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/_efficientnet_builder.py | Vit_small_patch16_224/timm/models/efficientnet_builder.py | https://github.com/facebookresearch/maskrcnn-benchmark/blob/master/maskrcnn_benchmark/modeling/backbone/fbnet_builder.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/layers/activations_me.py | Vit_small_patch16_224/timm/models/layers/activations.py | https://arxiv.org/abs/1908.08681 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/_efficientnet_blocks.py | Vit_small_patch16_224/timm/models/efficientnet_blocks.py | https://arxiv.org/abs/1905.11946 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/layers/activations_jit.py | Vit_small_patch16_224/timm/models/layers/activations.py | https://arxiv.org/abs/1710.05941 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/layers/eca.py | Vit_small_patch16_224/timm/models/layers/eca.py | https://arxiv.org/pdf/1910.03151.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnetv2.py | Vit_small_patch16_224/timm/models/resnetv2.py | https://github.com/KaimingHe/resnet-1k-layers/blob/master/resnet-pre-act.lua | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | 
Vit_small_patch16_224/timm/models/efficientnet.py | https://arxiv.org/abs/1904.02877 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/layers/inplace_abn.py | Vit_small_patch16_224/timm/models/layers/inplace_abn.py | https://github.com/mapillary/inplace_abn.git@v1.0.12 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/layers/halo_attn.py | Vit_small_patch16_224/timm/models/layers/halo_attn.py | https://gist.github.com/aravindsrinivas/56359b79f0ce4449bcb04ab4b56a57a2 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vovnet.py | Vit_small_patch16_224/timm/models/layers/squeeze_excite.py | https://arxiv.org/abs/1911.06667 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/xcit.py | Vit_small_patch16_224/timm/models/xcit.py | https://github.com/rwightman/pytorch-image-models/tree/master/timm | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/mlp_mixer.py | Vit_small_patch16_224/timm/models/mlp_mixer.py | https://arxiv.org/abs/2105.03404 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/loss/jsd.py | Vit_small_patch16_224/timm/data/auto_augment.py | https://github.com/google-research/augmix/blob/master/imagenet.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/layers/eca.py | Vit_small_patch16_224/timm/models/layers/eca.py | https://github.com/pytorch/pytorch/pull/17240 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/hardcorenas.py | Vit_small_patch16_224/timm/models/hardcorenas.py | https://arxiv.org/abs/2102.11646 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://arxiv.org/pdf/2002.08258.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/layers/std_conv.py | Vit_small_patch16_224/timm/models/layers/std_conv.py | https://arxiv.org/abs/1903.10520v2 | 论文地址 | -| 开发引入 | / | Vit_small_patch16_224/timm/models/efficientnet_blocks.py | https://arxiv.org/abs/1801.04381v4 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/layers/cond_conv2d.py | Vit_small_patch16_224/timm/models/layers/cond_conv2d.py | https://github.com/tensorflow/tpu/blob/master/models/official/efficientnet/condconv/condconv_layers.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/_efficientnet_blocks.py | Vit_small_patch16_224/timm/models/efficientnet_blocks.py | https://arxiv.org/abs/2004.14525 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/layers/activations_me.py | Vit_small_patch16_224/timm/models/layers/activations_me.py | https://twitter.com/jeremyphoward/status/1188251041835315200 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/data/auto_augment.py | Vit_small_patch16_224/timm/data/auto_augment.py | https://github.com/tensorflow/tpu/blob/master/models/official/efficientnet/autoaugment.py | 源码实现 | -| 开发引入 | / | Vit_small_patch16_224/timm/models/resnet.py | https://pytorch.org/hub/facebookresearch_WSL-Images_resnext/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/mobilenetv3.py | Vit_small_patch16_224/timm/models/mobilenetv3.py | 
https://arxiv.org/abs/2006.02049 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/optim/adabelief.py | Vit_small_patch16_224/timm/optim/adabelief.py | https://github.com/juntang-zhuang/Adabelief-Optimizer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/_efficientnet_blocks.py | Vit_small_patch16_224/timm/models/efficientnet_blocks.py | https://arxiv.org/abs/1905.02244 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/xcit.py | Vit_small_patch16_224/timm/models/xcit.py | https://arxiv.org/abs/2103.17239 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/scheduler/scheduler.py | Vit_small_patch16_224/timm/scheduler/scheduler.py | https://github.com/allenai/allennlp/tree/master/allennlp/training/learning_rate_schedulers | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/_efficientnet_builder.py | Vit_small_patch16_224/timm/models/efficientnet_builder.py | https://github.com/tensorflow/tpu/blob/master/models/official/mnasnet/mnasnet_model.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/data/random_erasing.py | Vit_small_patch16_224/timm/data/random_erasing.py | https://arxiv.org/pdf/1708.04896.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/cspnet.py | Vit_small_patch16_224/timm/models/cspnet.py | https://arxiv.org/abs/1911.11929 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/scheduler/cosine_lr.py | Vit_small_patch16_224/timm/scheduler/cosine_lr.py | https://arxiv.org/abs/1608.03983 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/regnet.py | Vit_small_patch16_224/timm/models/regnet.py | https://github.com/facebookresearch/pycls/blob/master/pycls/models/regnet.py | 源码实现 | -| 开发引入 | / | Vit_small_patch16_224/timm/models/layers/drop.py | https://github.com/tensorflow/tpu/issues/494#issuecomment-532968956 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/scheduler/cosine_lr.py | Vit_small_patch16_224/timm/scheduler/cosine_lr.py | https://github.com/allenai/allennlp/blob/master/allennlp/training/learning_rate_schedulers/cosine.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/nest.py | Vit_small_patch16_224/timm/models/nest.py | https://github.com/google-research/nested-transformer/issues/2 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/_efficientnet_blocks.py | Vit_small_patch16_224/timm/models/efficientnet_blocks.py | https://arxiv.org/abs/2102.05610 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/inception_v3.py | Vit_small_patch16_224/timm/models/inception_v3.py | https://gluon-cv.mxnet.io/model_zoo/classification.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/tensorflow/tpu/tree/master/models/official/mnasnet/mixnet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/optim/nadam.py | Vit_small_patch16_224/timm/optim/nadam.py | http://www.cs.toronto.edu/~fritz/absps/momentum.pdf | 论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/dla.py | Vit_small_patch16_224/timm/models/dla.py | https://github.com/gasvn/Res2Net/blob/master/dla.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/xcit.py | Vit_small_patch16_224/timm/models/xcit.py | https://github.com/facebookresearch/deit/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/inception_v3.py | Vit_small_patch16_224/timm/models/inception_v3.py | http://download.tensorflow.org/models/inception_v3_2016_08_28.tar.gz | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vision_transformer_sam.py | Vit_small_patch16_224/timm/models/vision_transformer_hybrid.py | https://arxiv.org/abs/2010.11929 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/tnt.py | Vit_small_patch16_224/timm/models/tnt.py | https://arxiv.org/abs/2103.00112 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vision_transformer.py | Vit_small_patch16_224/timm/models/vision_transformer.py | https://arxiv.org/abs/2106.01548 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://github.com/tensorflow/tpu/tree/bee9c4f6/models/official/resnet/resnet_rs | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vision_transformer.py | Vit_small_patch16_224/timm/models/mlp_mixer.py | https://github.com/Alibaba-MIIL/ImageNet21K | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/utils/model.py | Vit_small_patch16_224/timm/utils/model.py | https://docs.fast.ai/callback.hook.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/utils/model_ema.py | Vit_small_patch16_224/timm/utils/model_ema.py | https://www.tensorflow.org/api_docs/python/tf/train/ExponentialMovingAverage | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/levit.py | Vit_small_patch16_224/timm/models/cait.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/vision_transformer.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/layers/activations_jit.py | Vit_small_patch16_224/timm/models/layers/activations_jit.py | https://arxiv.org/abs/1710.05941 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/mlp_mixer.py | Vit_small_patch16_224/timm/models/mlp_mixer.py | https://arxiv.org/abs/2105.08050 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/optim/nadamw.py | Vit_small_patch16_224/timm/optim/npu_fused_adamw.py | https://openreview.net/forum?id=ryQu7f-RZ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/mlp_mixer.py | Vit_small_patch16_224/timm/models/mlp_mixer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resmlp_24_224_raa-a8256759.pth | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/data/readers/reader_tfds.py | Vit_small_patch16_224/timm/data/parsers/parser_tfds.py | https://pytorch.org/docs/stable/data.html#multi-process-data-loading | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/utils/model.py | 
Vit_small_patch16_224/timm/models/layers/std_conv.py | https://arxiv.org/abs/2101.08692 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/layers/mixed_conv2d.py | Vit_small_patch16_224/timm/models/layers/mixed_conv2d.py | https://github.com/tensorflow/tpu/blob/master/models/official/mnasnet/mixnet/custom_layers.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/data/readers/reader_tfds.py | Vit_small_patch16_224/timm/data/parsers/parser_tfds.py | https://github.com/pytorch/pytorch/issues/33413 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/metaformer.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://arxiv.org/abs/1801.04381 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/byobnet.py | Vit_small_patch16_224/timm/models/byobnet.py | https://github.com/DingXiaoH/RepVGG | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/layers/weight_init.py | Vit_small_patch16_224/timm/models/layers/weight_init.py | https://people.sc.fsu.edu/~jburkardt/presentations/truncated_normal.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/mlp_mixer.py | Vit_small_patch16_224/timm/models/vision_transformer.py | https://github.com/facebookresearch/deit | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vision_transformer_sam.py | Vit_small_patch16_224/timm/models/vision_transformer.py | https://arxiv.org/abs/2010.11929 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/_efficientnet_blocks.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://arxiv.org/abs/1905.11946 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/twins.py | Vit_small_patch16_224/timm/models/twins.py | https://github.com/whai362/PVT.git | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/utils/misc.py | Vit_small_patch16_224/timm/utils/misc.py | http://www.codinghorror.com/blog/archives/001018.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/optim/adabelief.py | Vit_small_patch16_224/timm/optim/adabelief.py | https://gist.github.com/juntang-zhuang/517ce3c27022b908bb93f78e4f786dc3 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://github.com/facebookresearch/WSL-Images | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/README.md | Vit_small_patch16_224/timm/models/resnet.py | https://arxiv.org/abs/1805.00932 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnest.py | Vit_small_patch16_224/timm/models/resnest.py | https://arxiv.org/abs/2004.08955 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/vision_transformer.py | Vit_small_patch16_224/timm/models/vision_transformer.py | https://github.com/google-research/vision_transformer/blob/00883dd691c63a6830751563748663526e811cee/vit_jax/checkpoint.py#L224 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/layers/halo_attn.py | Vit_small_patch16_224/timm/models/layers/bottleneck_attn.py | https://arxiv.org/abs/1904.09925 | 论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/deit.py | Vit_small_patch16_224/timm/models/vision_transformer.py | https://arxiv.org/abs/2012.12877 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/_efficientnet_blocks.py | Vit_small_patch16_224/timm/models/efficientnet_blocks.py | https://ai.googleblog.com/2019/08/efficientnet-edgetpu-creating.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/optim/adabelief.py | Vit_small_patch16_224/timm/optim/adabelief.py | https://gist.github.com/juntang-zhuang/0a501dd51c02278d952cf159bc233037 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/mlp_mixer.py | Vit_small_patch16_224/timm/models/mlp_mixer.py | https://arxiv.org/abs/2105.01601 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/resnet.py | Vit_small_patch16_224/timm/models/resnet.py | https://github.com/facebookresearch/semi-supervised-ImageNet1K-models | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/layers/non_local_attn.py | Vit_small_patch16_224/timm/models/layers/non_local_attn.py | https://github.com/facebookresearch/video-nonlocal-net | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/sknet.py | Vit_small_patch16_224/timm/models/layers/selective_kernel.py | https://arxiv.org/abs/1903.06586 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/_efficientnet_blocks.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://arxiv.org/abs/2104.00298 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/README.md | Vit_small_patch16_224/timm/models/pnasnet.py | https://arxiv.org/abs/1712.00559 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet/lite | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/inception_resnet_v2.py | Vit_small_patch16_224/timm/models/inception_resnet_v2.py | http://download.tensorflow.org/models/inception_resnet_v2_2016_08_30.tar.gz | 下载链接 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/twins.py | Vit_small_patch16_224/timm/models/twins.py | https://arxiv.org/abs/2102.10882 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/regnet.py | Vit_small_patch16_224/timm/models/regnet.py | https://arxiv.org/abs/2003.13678 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/byobnet.py | Vit_small_patch16_224/timm/models/byobnet.py | https://arxiv.org/abs/2101.03697 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/layers/cond_conv2d.py | Vit_small_patch16_224/timm/models/layers/cond_conv2d.py | https://github.com/pytorch/pytorch/issues/17983 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/train.py | Vit_small_patch16_224/train.py | https://github.com/NVIDIA/apex/tree/master/examples/imagenet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/layers/drop.py | Vit_small_patch16_224/timm/models/layers/drop.py | https://arxiv.org/pdf/1810.12890.pdf | 论文地址 | -| 
开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/loss/jsd.py | Vit_small_patch16_224/timm/loss/jsd.py | https://arxiv.org/abs/1912.02781 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/byobnet.py | Vit_small_patch16_224/timm/models/byobnet.py | https://arxiv.org/abs/2006.14090 | 论文地址 | -| 开发引入 | / | Vit_small_patch16_224/timm/models/resnet.py | https://github.com/facebookresearch/semi-supervised-ImageNet1K-models/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/utils/model.py | Vit_small_patch16_224/timm/utils/model.py | https://arxiv.org/abs/2101.08692 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/layers/activations_me.py | Vit_small_patch16_224/timm/models/layers/activations_jit.py | https://github.com/digantamisra98/H-Mish/blob/0da20d4bc58e696b6803f2523c58d3c8a82782d0/README.md | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/inception_resnet_v2.py | Vit_small_patch16_224/timm/models/inception_resnet_v2.py | http://download.tensorflow.org/models/ens_adv_inception_resnet_v2_2017_08_18.tar.gz | 下载链接 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/efficientnet.py | Vit_small_patch16_224/timm/models/efficientnet.py | https://arxiv.org/abs/1812.03443 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/timm/models/xception.py | Vit_small_patch16_224/timm/models/xception.py | https://arxiv.org/pdf/1610.02357.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/train.py | Vit_small_patch16_224/train.py | https://github.com/pytorch/examples/tree/master/imagenet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/main/README.md | Vit_small_patch16_224/timm/models/inception_resnet_v2.py | https://arxiv.org/abs/1602.07261 | 论文地址 | +| 文件位置 | 公网地址 | 公网地址用途 | +|-------------------------------------------------------------------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/vit_base_patch32_224/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_384-d0272ac0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/vit_base_patch32_224/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/vit_base_patch32_224/vision_transformer.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/vit_base_patch16_224_in21k_miil.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/vit_base_patch32_224/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_tiny_patch16_224-a1311bcf.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/vit_base_patch32_224/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_tiny_distilled_patch16_224-b40b3cf7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/vit_base_patch32_224/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_small_patch16_224-cd65a155.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/vit_base_patch32_224/vision_transformer.py | 
https://dl.fbaipublicfiles.com/deit/deit_small_distilled_patch16_224-649709d9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/vit_base_patch32_224/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_384-8de9b5d1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/vit_base_patch32_224/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_224-b5f2ef4d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/vit_base_patch32_224/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_224-df68dfff.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/vit_base_patch32_224/vision_transformer.py | https://storage.googleapis.com/vit_models/imagenet21k/ViT-H_14.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/vit_base_patch32_224/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/Ti_16-i21k-300ep-lr_0.001-aug_none-wd_0.03-do_0.0-sd_0.0.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/vit_base_patch32_224/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/S_32-i21k-300ep-lr_0.001-aug_light1-wd_0.03-do_0.0-sd_0.0.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/vit_base_patch32_224/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/S_16-i21k-300ep-lr_0.001-aug_light1-wd_0.03-do_0.0-sd_0.0.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/vit_base_patch32_224/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/L_16-i21k-300ep-lr_0.001-aug_medium1-wd_0.1-do_0.1-sd_0.1.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/vit_base_patch32_224/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/B_32-i21k-300ep-lr_0.001-aug_medium1-wd_0.03-do_0.0-sd_0.0.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/vit_base_patch32_224/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/B_16-i21k-300ep-lr_0.001-aug_medium1-wd_0.1-do_0.0-sd_0.0.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/vit_base_patch32_224/vision_transformer.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm | 模型权重 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/vit_base_patch32_224/vit_train.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/M36_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/M48_448.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/S24_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/S24_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/S36_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XS24_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XXS24_224.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XXS24_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XXS36_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XXS36_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/convit.py | https://dl.fbaipublicfiles.com/convit/convit_base.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/convit.py | https://dl.fbaipublicfiles.com/convit/convit_small.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/convit.py | https://dl.fbaipublicfiles.com/convit/convit_tiny.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/densenet.py | https://download.pytorch.org/models/densenet121-a639ec97.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/densenet.py | https://download.pytorch.org/models/densenet201-c1103571.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/densenet.py | https://download.pytorch.org/models/densenet169-b2777c0a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/densenet.py | https://download.pytorch.org/models/densenet161-8d451a50.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60x_c-b870c45c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60x-d15cacda.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60-24839fc4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla46x_c-d761bae7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla46_c-2bfd52c3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla34-ba72cf86.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla169-0914e092.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102x2-262837b6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102x-ad62be81.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102-d94d9790.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb1_pruned_9ebb3fe6.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb2_pruned_203f55bc.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb3_pruned_5abcc29f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_F_Green_60ms_78.1_2855edf1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_E_Green_55ms_77.9_90f20e8a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_D_Green_50ms_77.4_23e3cdde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_C_Green_44ms_77.1_d4148c9e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_B_Green_40ms_76.5_1f882d1e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_A_Green_38ms_75.9_23474aeb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/inception_v3.py | https://download.pytorch.org/models/inception_v3_google-1a9a5a14.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-384-9bdaf2e2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-128-b88c2750.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-128S-96703c44.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-192-92712e41.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-256-13b5763e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_12_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_12_no_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_24_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_24_no_dist.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_36_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_36_no_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlpB_24_22k.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlpB_24_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlpB_24_no_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/mlp_mixer.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/mixer_b16_224_miil.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/mlp_mixer.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/mixer_b16_224_miil_in21k.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/mobilenetv3.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/mobilenetv3_large_100_1k_miil_78_0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/mobilenetv3.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/mobilenetv3_large_100_in21k_miil.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/regnet.py | https://dl.fbaipublicfiles.com/deit/regnety_160-a5fe301d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/resnet.py | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/resnet.py | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/resnet.py | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x8-b4712904.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet18-d92f0530.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet50-08389792.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x16-15fffa57.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x4-dc43570a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x8-2cfe2f8b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext50_32x4-ddb3e555.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet18-118f1556.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet50-16a12f1b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x16-f3559a9c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x4-3f87e46b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext50_32x4-72679e44.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet101D_281c5844.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet50D_833caf58.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNetLight_4f34b35b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45610/outputs/ECAResNet101D_P_75a3370e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45899/outputs/ECAResNet50D_P_9c67f710.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x8-c38310e5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x48-3e41cc8a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x32-e4b90b00.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x16-c6f796b0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x1.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x1-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x3.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x3-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x2.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x2-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x4.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x4-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x1.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x1-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x3.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x3-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/distill/R152x2_T_224.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/distill/R152x2_T_384.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/distill/R50x1_224.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/tresnet.py | 
https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/tresnet_m_1k_miil_83_1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/tresnet.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/tresnet_m_miil_in21k.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/vgg.py | https://download.pytorch.org/models/vgg19-dcbb9e9d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/vgg.py | https://download.pytorch.org/models/vgg19_bn-c79401a0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/vgg.py | https://download.pytorch.org/models/vgg16-397923af.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/vgg.py | https://download.pytorch.org/models/vgg16_bn-6c64b313.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/vgg.py | https://download.pytorch.org/models/vgg13-c768596a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/vgg.py | https://download.pytorch.org/models/vgg13_bn-abd245e5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/vgg.py | https://download.pytorch.org/models/vgg11-bbd30ac9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/vgg.py | https://download.pytorch.org/models/vgg11_bn-6002323d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/B_32-i21k-300ep-lr_0.001-aug_medium1-wd_0.03-do_0.0-sd_0.0.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_224-df68dfff.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_384-d0272ac0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_224-b5f2ef4d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_384-8de9b5d1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_small_distilled_patch16_224-649709d9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_small_patch16_224-cd65a155.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_tiny_distilled_patch16_224-b40b3cf7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_tiny_patch16_224-a1311bcf.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/vision_transformer.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/vision_transformer.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/vit_base_patch16_224_in21k_miil.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/B_16-i21k-300ep-lr_0.001-aug_medium1-wd_0.1-do_0.0-sd_0.0.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/L_16-i21k-300ep-lr_0.001-aug_medium1-wd_0.1-do_0.1-sd_0.1.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/S_16-i21k-300ep-lr_0.001-aug_light1-wd_0.03-do_0.0-sd_0.0.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/S_32-i21k-300ep-lr_0.001-aug_light1-wd_0.03-do_0.0-sd_0.0.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/vision_transformer.py | 
https://storage.googleapis.com/vit_models/augreg/Ti_16-i21k-300ep-lr_0.001-aug_none-wd_0.03-do_0.0-sd_0.0.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/imagenet21k/ViT-H_14.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/sam/ViT-B_32.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/sam/ViT-B_16.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/R_Ti_16-i21k-300ep-lr_0.001-aug_none-wd_0.03-do_0.0-sd_0.0.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/R26_S_32-i21k-300ep-lr_0.001-aug_medium2-wd_0.03-do_0.0-sd_0.0.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/R50_L_32-i21k-300ep-lr_0.001-aug_medium2-wd_0.1-do_0.0-sd_0.0.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p16_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p16_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p8_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p16_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p8_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p16_384_dist.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p8_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p16_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p8_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p16_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p8_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p16_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p8_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p16_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p8_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p8_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p8_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p16_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p16_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p8_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p8_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p16_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p16_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p8_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p8_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p16_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/xcit.py | 
https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p16_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p8_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p8_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p16_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p16_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p8_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p8_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p16_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p16_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p8_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p8_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p16_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p16_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p8_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p8_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p16_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Vit_small_patch16_224/url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 测试图片 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/classification/WideResNet101_2_for_Pytorch/public_address_statement.md b/PyTorch/contrib/cv/classification/WideResNet101_2_for_Pytorch/public_address_statement.md index b7f7c8c29b88731daf57a790c349abc723e0d944..8878a1b59f39b5cc00a9c24623541f1362003f6a 100644 --- a/PyTorch/contrib/cv/classification/WideResNet101_2_for_Pytorch/public_address_statement.md +++ b/PyTorch/contrib/cv/classification/WideResNet101_2_for_Pytorch/public_address_statement.md @@ -1,19 +1,15 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | 
-|--------|---|----------------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------------------------------|--------| -| 开发引入 | / | url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 测试图片 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnet18-5c106cde.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnet50-19c8e357.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/wide_resnet50_2-95faca4d.pth | 下载权重文件 | -| 开发引入 | / | url.ini | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 下载权重文件 | -| 开发引入 | / | WideResNet101_2_for_Pytorch/models/resnet_0_6_0.py | https://arxiv.org/abs/1512.03385 | 论文地址 | -| 开发引入 | / | WideResNet101_2_for_Pytorch/Dockerfile | https://github.com/NVIDIA/apex | 源码实现 | -| 开发引入 | / | WideResNet101_2_for_Pytorch/models/resnet_0_6_0.py | https://ngc.nvidia.com/catalog/model-scripts/nvidia:resnet_50_v1_5_for_pytorch | 相关说明 | -| 开发引入 | / | WideResNet101_2_for_Pytorch/models/resnet_0_6_0.py | https://arxiv.org/abs/1706.02677 | 论文地址 | -| 开发引入 | / | WideResNet101_2_for_Pytorch/models/resnet_0_6_0.py | https://arxiv.org/pdf/1605.07146.pdf | 论文地址 | -| 开发引入 | / | WideResNet101_2_for_Pytorch/models/resnet_0_6_0.py | https://arxiv.org/pdf/1512.03385.pdf | 论文地址 | -| 开发引入 | / | WideResNet101_2_for_Pytorch/models/resnet_0_6_0.py | https://arxiv.org/pdf/1611.05431.pdf | 论文地址 | +| 文件位置 | 公网地址 | 公网地址用途 | +|-----------------------------------------------------------------------------------------------|-----------------------------------------------------------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/WideResNet101_2_for_Pytorch/main_npu_1p.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/WideResNet101_2_for_Pytorch/main_npu_8p.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/WideResNet101_2_for_Pytorch/train_start.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/WideResNet101_2_for_Pytorch/url.ini | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/WideResNet101_2_for_Pytorch/url.ini | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/WideResNet101_2_for_Pytorch/url.ini | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/WideResNet101_2_for_Pytorch/url.ini | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/WideResNet101_2_for_Pytorch/url.ini | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/WideResNet101_2_for_Pytorch/url.ini | 
https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/WideResNet101_2_for_Pytorch/url.ini | https://download.pytorch.org/models/wide_resnet50_2-95faca4d.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/WideResNet101_2_for_Pytorch/url.ini | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/WideResNet101_2_for_Pytorch/url.ini | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/WideResNet101_2_for_Pytorch/url.ini | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/WideResNet101_2_for_Pytorch/url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 测试图片 |
\ No newline at end of file
diff --git a/PyTorch/contrib/cv/classification/WideResNet50_2_ID1627_for_PyTorch/public_address_statement.md b/PyTorch/contrib/cv/classification/WideResNet50_2_ID1627_for_PyTorch/public_address_statement.md
index bd92db65d8d3944a8f936df6675b55d8a2aab89d..3dda23d3e5d221626303d00c0b6812801ceaff1c 100644
--- a/PyTorch/contrib/cv/classification/WideResNet50_2_ID1627_for_PyTorch/public_address_statement.md
+++ b/PyTorch/contrib/cv/classification/WideResNet50_2_ID1627_for_PyTorch/public_address_statement.md
@@ -1,18 +1,14 @@
-| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 |
-|--------|---|----------------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------------------------------|--------|
-| 开发引入 | / | url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 测试图片 |
-| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnet18-5c106cde.pth | 下载权重文件 |
-| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 下载权重文件 |
-| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnet50-19c8e357.pth | 下载权重文件 |
-| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 下载权重文件 |
-| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 下载权重文件 |
-| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 下载权重文件 |
-| 开发引入 | / | url.ini | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 下载权重文件 |
-| 开发引入 | / | url.ini | https://download.pytorch.org/models/wide_resnet50_2-95faca4d.pth | 下载权重文件 |
-| 开发引入 | / | url.ini | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 下载权重文件 |
-| 开发引入 | / | WideResNet50_2_ID1627_for_PyTorch/models/resnet_0_6_0.py | https://arxiv.org/pdf/1605.07146.pdf | 论文地址 |
-| 开发引入 | / | WideResNet50_2_ID1627_for_PyTorch/models/resnet_0_6_0.py | https://arxiv.org/abs/1706.02677 | 论文地址 |
-| 开发引入 | / | WideResNet50_2_ID1627_for_PyTorch/models/resnet_0_6_0.py | https://arxiv.org/pdf/1611.05431.pdf | 论文地址 |
-| 开发引入 | / | WideResNet50_2_ID1627_for_PyTorch/models/resnet_0_6_0.py | https://arxiv.org/pdf/1512.03385.pdf | 论文地址 |
-| 开发引入 | / | WideResNet50_2_ID1627_for_PyTorch/models/resnet_0_6_0.py | https://arxiv.org/abs/1512.03385 | 论文地址 |
-| 开发引入 | / | WideResNet50_2_ID1627_for_PyTorch/models/resnet_0_6_0.py | https://ngc.nvidia.com/catalog/model-scripts/nvidia:resnet_50_v1_5_for_pytorch | 相关说明 |
+| 文件位置 | 公网地址 | 公网地址用途 |
+|-------------------------------------------------------------------------------------------------------------------|-----------------------------------------------------------------------------|--------------|
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/WideResNet50_2_ID1627_for_PyTorch/main.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/WideResNet50_2_ID1627_for_PyTorch/modelarts/train_modelarts.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/WideResNet50_2_ID1627_for_PyTorch/url.ini | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/WideResNet50_2_ID1627_for_PyTorch/url.ini | https://download.pytorch.org/models/wide_resnet50_2-95faca4d.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/WideResNet50_2_ID1627_for_PyTorch/url.ini | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/WideResNet50_2_ID1627_for_PyTorch/url.ini | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/WideResNet50_2_ID1627_for_PyTorch/url.ini | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/WideResNet50_2_ID1627_for_PyTorch/url.ini | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/WideResNet50_2_ID1627_for_PyTorch/url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 测试图片 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/WideResNet50_2_ID1627_for_PyTorch/url.ini | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/WideResNet50_2_ID1627_for_PyTorch/url.ini | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/WideResNet50_2_ID1627_for_PyTorch/url.ini | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 权重地址 |
\ No newline at end of file
diff --git a/PyTorch/contrib/cv/classification/Xception_ID1777_for_PyTorch/public_address_statement.md b/PyTorch/contrib/cv/classification/Xception_ID1777_for_PyTorch/public_address_statement.md
index b9fb6839a85a56a1dd5c305f342f10c9b135e83b..c23d7402f81535ff6ec6fe3304e3a5c4591207ea 100644
--- a/PyTorch/contrib/cv/classification/Xception_ID1777_for_PyTorch/public_address_statement.md
+++ b/PyTorch/contrib/cv/classification/Xception_ID1777_for_PyTorch/public_address_statement.md
@@ -1,5 +1,5 @@
-| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 |
-|--------|---|----------------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------------------------------|--------|
-| 开发引入 | / | url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 测试图片 |
-| 开发引入 | / | url.ini | https://www.dropbox.com/s/1hplpzet9d7dv29/xception-c0a72b38.pth.tar?dl=1 | 下载权重文件 |
-| 开源代码引入 | https://github.com/kwotsin/TensorFlow-Xception/xception.py | Xception_ID1777_for_PyTorch/xception.py | https://arxiv.org/pdf/1610.02357.pdf | 论文地址 |
+| 文件位置 | 公网地址 | 公网地址用途 |
+|------------------------------------------------------------------------------------------------------------------|-----------------------------------------------------------------------------|--------------|
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Xception_ID1777_for_PyTorch/modelarts/train_start_xception.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Xception_ID1777_for_PyTorch/url.ini | https://www.dropbox.com/s/1hplpzet9d7dv29/xception-c0a72b38.pth.tar?dl=1 | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/Xception_ID1777_for_PyTorch/url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 测试图片 |
\ No newline at end of file
diff --git a/PyTorch/contrib/cv/classification/convmixer/public_address_statement.md b/PyTorch/contrib/cv/classification/convmixer/public_address_statement.md
index aa614dc176028707215026196287e3e6aff52fb7..3d2205f80323d4a4a3b111bb26d8df0d05512a52 100644
--- a/PyTorch/contrib/cv/classification/convmixer/public_address_statement.md
+++ b/PyTorch/contrib/cv/classification/convmixer/public_address_statement.md
@@ -1,842 +1,163 @@
-| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 |
-| ---- | ------------ | ------ | ------------------------------------ |------|
-| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/mkdocs.yml | convmixer/mkdocs.yml | https://github.com/rwightman/pytorch-image-models | 开源地址 |
-| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/mkdocs.yml | convmixer/mkdocs.yml | https://cdnjs.cloudflare.com/ajax/libs/mathjax/2.7.0/MathJax.js?config=TeX-MML-AM_CHTML | 下载配置信息 |
-| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/mkdocs.yml | convmixer/mkdocs.yml | https://cdnjs.cloudflare.com/ajax/libs/tablesort/5.2.1/tablesort.min.js | 下载配置信息 |
-| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/model-index.yml | convmixer/model-index.yml | https://rwightman.github.io/pytorch-image-models/ | 开源地址 |
-| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/model-index.yml | convmixer/model-index.yml | https://github.com/rwightman/pytorch-image-models | 开源文档 |
-| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/setup.py | convmixer/setup.py | https://github.com/rwightman/pytorch-image-models | 开源地址 |
-| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/setup.py | convmixer/setup.py | hello@rwightman.com | 作者邮箱 |
-| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/beit.py | convmixer/timm/models/beit.py | https://unilm.blob.core.windows.net/beit/beit_base_patch16_224_pt22k_ft22kto1k.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/beit.py | convmixer/timm/models/beit.py | https://unilm.blob.core.windows.net/beit/beit_base_patch16_384_pt22k_ft22kto1k.pth | 下载权重文件 |
-| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/beit.py | convmixer/timm/models/beit.py | 
https://unilm.blob.core.windows.net/beit/beit_base_patch16_224_pt22k_ft22k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/beit.py | convmixer/timm/models/beit.py | https://unilm.blob.core.windows.net/beit/beit_large_patch16_224_pt22k_ft22kto1k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/beit.py | convmixer/timm/models/beit.py | https://unilm.blob.core.windows.net/beit/beit_large_patch16_384_pt22k_ft22kto1k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/beit.py | convmixer/timm/models/beit.py | https://unilm.blob.core.windows.net/beit/beit_large_patch16_512_pt22k_ft22kto1k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/beit.py | convmixer/timm/models/beit.py | https://unilm.blob.core.windows.net/beit/beit_large_patch16_224_pt22k_ft22k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/byoanet.py | convmixer/timm/models/byoanet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-attn-weights/botnet26t_c1_256-167a0e9f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/byoanet.py | convmixer/timm/models/byoanet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-attn-weights/halonet26t_256-9b4bf0b3.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/byoanet.py | convmixer/timm/models/byoanet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-attn-weights/sehalonet33ts_256-87e053f9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/byoanet.py | convmixer/timm/models/byoanet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-attn-weights/halonet50ts_256_ra3-f07eab9f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/byoanet.py | convmixer/timm/models/byoanet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-attn-weights/eca_halonext26ts_256-1e55880b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/byoanet.py | convmixer/timm/models/byoanet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-attn-weights/lambda_resnet26t_a2h_256-25ded63d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/byoanet.py | convmixer/timm/models/byoanet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-attn-weights/lambda_resnet26rpt_a2h_256-482adad8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/byobnet.py | convmixer/timm/models/byobnet.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-ger-weights/gernet_s-756b4751.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/byobnet.py | convmixer/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-ger-weights/gernet_m-0873c53a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/byobnet.py | convmixer/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-ger-weights/gernet_l-f31e2e8d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/byobnet.py | convmixer/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_a2-c1ee6d2b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/byobnet.py | convmixer/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b0-80ac3f1b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/byobnet.py | convmixer/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b1-77ca2989.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/byobnet.py | convmixer/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b1g4-abde5d92.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/byobnet.py | convmixer/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b2-25b7494e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/byobnet.py | convmixer/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b2g4-165a85f2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/byobnet.py | convmixer/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b3-199bc50d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/byobnet.py | convmixer/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b3g4-73c370bf.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/byobnet.py | convmixer/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet51q_ra2-d47dcc76.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/byobnet.py | 
convmixer/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet61q_ra2-6afc536c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/byobnet.py | convmixer/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-attn-weights/resnext26ts_256_ra2-8bbd9106.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/byobnet.py | convmixer/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-attn-weights/gcresnext26ts_256-e414378b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/byobnet.py | convmixer/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-attn-weights/seresnext26ts_256-6f0d74a3.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/byobnet.py | convmixer/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-attn-weights/eca_resnext26ts_256-5a1d030f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/byobnet.py | convmixer/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-attn-weights/bat_resnext26ts_256-fa6fd595.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/byobnet.py | convmixer/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-attn-weights/resnet32ts_256-aacf5250.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/byobnet.py | convmixer/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-attn-weights/resnet33ts_256-e91b09a4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/byobnet.py | convmixer/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-attn-weights/gcresnet33ts_256-0e0cd345.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/byobnet.py | convmixer/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-attn-weights/seresnet33ts_256-f8ad44d9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/byobnet.py | convmixer/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-attn-weights/eca_resnet33ts_256-8f98face.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/byobnet.py | convmixer/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-attn-weights/gcresnet50t_256-96374d1c.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/byobnet.py | convmixer/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-attn-weights/gcresnext50ts_256-3e0f515e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/cait.py | convmixer/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XXS24_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/cait.py | convmixer/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XXS24_384.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/cait.py | convmixer/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XXS36_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/cait.py | convmixer/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XXS36_384.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/cait.py | convmixer/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XS24_384.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/cait.py | convmixer/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/S24_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/cait.py | convmixer/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/S24_384.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/cait.py | convmixer/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/S36_384.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/cait.py | convmixer/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/M36_384.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/cait.py | convmixer/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/M48_448.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/coat.py | convmixer/timm/models/coat.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-coat-weights/coat_tiny-473c2a20.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/coat.py | convmixer/timm/models/coat.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-coat-weights/coat_mini-2c6baf49.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/coat.py | convmixer/timm/models/coat.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-coat-weights/coat_lite_tiny-461b07a7.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/coat.py | convmixer/timm/models/coat.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-coat-weights/coat_lite_mini-d7842000.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/coat.py | convmixer/timm/models/coat.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-coat-weights/coat_lite_small-fea1d5a1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/convit.py | convmixer/timm/models/convit.py | https://dl.fbaipublicfiles.com/convit/convit_tiny.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/convit.py | convmixer/timm/models/convit.py | https://dl.fbaipublicfiles.com/convit/convit_small.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/convit.py | convmixer/timm/models/convit.py | https://dl.fbaipublicfiles.com/convit/convit_base.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/crossvit.py | convmixer/timm/models/crossvit.py | https://github.com/IBM/CrossViT/releases/download/weights-0.1/crossvit_15_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/crossvit.py | convmixer/timm/models/crossvit.py | https://github.com/IBM/CrossViT/releases/download/weights-0.1/crossvit_15_dagger_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/crossvit.py | convmixer/timm/models/crossvit.py | https://github.com/IBM/CrossViT/releases/download/weights-0.1/crossvit_15_dagger_384.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/crossvit.py | convmixer/timm/models/crossvit.py | https://github.com/IBM/CrossViT/releases/download/weights-0.1/crossvit_18_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/crossvit.py | convmixer/timm/models/crossvit.py | https://github.com/IBM/CrossViT/releases/download/weights-0.1/crossvit_18_dagger_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/crossvit.py | convmixer/timm/models/crossvit.py | https://github.com/IBM/CrossViT/releases/download/weights-0.1/crossvit_18_dagger_384.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/crossvit.py | convmixer/timm/models/crossvit.py | https://github.com/IBM/CrossViT/releases/download/weights-0.1/crossvit_9_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/crossvit.py | convmixer/timm/models/crossvit.py | https://github.com/IBM/CrossViT/releases/download/weights-0.1/crossvit_9_dagger_224.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/crossvit.py | convmixer/timm/models/crossvit.py | https://github.com/IBM/CrossViT/releases/download/weights-0.1/crossvit_base_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/crossvit.py | convmixer/timm/models/crossvit.py | https://github.com/IBM/CrossViT/releases/download/weights-0.1/crossvit_small_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/crossvit.py | convmixer/timm/models/crossvit.py | https://github.com/IBM/CrossViT/releases/download/weights-0.1/crossvit_tiny_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/cspnet.py | convmixer/timm/models/cspnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/cspresnet50_ra-d3e8d487.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/cspnet.py | convmixer/timm/models/cspnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/cspresnext50_ra_224-648b4713.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/cspnet.py | convmixer/timm/models/cspnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/cspdarknet53_ra_256-d05c7c21.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/densenet.py | convmixer/timm/models/densenet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/densenet121_ra-50efcf5c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/densenet.py | convmixer/timm/models/densenet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/densenetblur121d_ra-100dcfbc.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/densenet.py | convmixer/timm/models/densenet.py | https://download.pytorch.org/models/densenet169-b2777c0a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/densenet.py | convmixer/timm/models/densenet.py | https://download.pytorch.org/models/densenet201-c1103571.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/densenet.py | convmixer/timm/models/densenet.py | https://download.pytorch.org/models/densenet161-8d451a50.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/densenet.py | convmixer/timm/models/densenet.py | https://download.pytorch.org/models/densenet121-a639ec97.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/dla.py | convmixer/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla34-ba72cf86.pth | 下载权重文件 | -| 
开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/dla.py | convmixer/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla46_c-2bfd52c3.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/dla.py | convmixer/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla46x_c-d761bae7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/dla.py | convmixer/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60x_c-b870c45c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/dla.py | convmixer/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60-24839fc4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/dla.py | convmixer/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60x-d15cacda.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/dla.py | convmixer/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102-d94d9790.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/dla.py | convmixer/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102x-ad62be81.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/dla.py | convmixer/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102x2-262837b6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/dla.py | convmixer/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla169-0914e092.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/dla.py | convmixer/timm/models/dla.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net_dla60_4s-d88db7f9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/dla.py | convmixer/timm/models/dla.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2next_dla60_4s-d327927b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/dpn.py | convmixer/timm/models/dpn.py | https://github.com/rwightman/pytorch-dpn-pretrained/releases/download/v0.1/dpn68-66bebafa7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/dpn.py | convmixer/timm/models/dpn.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/dpn68b_ra-a31ca160.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/dpn.py | convmixer/timm/models/dpn.py | https://github.com/rwightman/pytorch-dpn-pretrained/releases/download/v0.1/dpn92_extra-b040e4a9b.pth | 下载权重文件 | -| 
开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/dpn.py | convmixer/timm/models/dpn.py | https://github.com/rwightman/pytorch-dpn-pretrained/releases/download/v0.1/dpn98-5b90dec4d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/dpn.py | convmixer/timm/models/dpn.py | https://github.com/rwightman/pytorch-dpn-pretrained/releases/download/v0.1/dpn131-71dfe43e0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/dpn.py | convmixer/timm/models/dpn.py | https://github.com/rwightman/pytorch-dpn-pretrained/releases/download/v0.1/dpn107_extra-1ac7121e2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mnasnet_b1-74cb7081.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mnasnet_a1-d9418771.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv2_100_ra-b33bc2c4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv2_110d_ra-77090ade.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv2_120d_ra-5987e2ed.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv2_140_ra-21a4e913.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/fbnetc_100-c345b898.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/spnasnet_100-048bc3f4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b0_ra-3dd342df.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b1-533bc792.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b2_ra-bcdf34b7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b3_ra2-cf984f9c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b4_ra2_320-7eb33cd5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_es_ra-f111e99c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_em_ra2-66250f76.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/DeGirum/pruned-models/releases/download/efficientnet_v1.0/efficientnet_el.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/DeGirum/pruned-models/releases/download/efficientnet_v1.0/efficientnet_es_pruned75.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/DeGirum/pruned-models/releases/download/efficientnet_v1.0/efficientnet_el_pruned70.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_lite0_ra-37913777.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb1_pruned_9ebb3fe6.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb2_pruned_203f55bc.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb3_pruned_5abcc29f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnetv2_t_agc-3620981a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gc_efficientnetv2_rw_t_agc-927a0bde.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_v2s_ra2_288-a6477665.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnetv2_rw_m_agc-3d90cb1e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b0_aa-827b6e33.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b1_aa-ea7a6ee0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b2_aa-60c94f97.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b3_aa-84b4657e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b4_aa-818f208c.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b5_ra-9a3e5369.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b6_aa-80ba17e4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b7_ra-6c08e654.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b8_ra-572d5dd9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b0_ap-f262efe1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b1_ap-44ef0a3d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b2_ap-2f8e7636.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b3_ap-aad25bdd.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b4_ap-dedb23e6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b5_ap-9e82fae8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b6_ap-4ffb161f.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b7_ap-ddb28fec.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b8_ap-00e169fa.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b0_ns-c0e6a31c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b1_ns-99dd0c41.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b2_ns-00306e48.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b3_ns-9d44bf68.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b4_ns-d6313a46.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b5_ns-6f26d0cf.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b6_ns-51548356.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b7_ns-1dbc32de.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_l2_ns_475-bebbd00a.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_l2_ns-df73bb44.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_es-ca1afbfe.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_em-e78cfe58.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_el-5143854e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_cc_b0_4e-4362b6b2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_cc_b0_8e-66184a25.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_cc_b1_8e-f7c79ae1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite0-0aa007d2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite1-bde8b488.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite2-dcccb7df.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite3-b733e338.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite4-741542c3.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_s-eb54923e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_m-cc09e0cd.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_l-d664b728.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_s_21ft1k-d7dafa41.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_m_21ft1k-bf41664a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_l_21ft1k-60127a9d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_xl_in21ft1k-06c35c48.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_s_21k-6337ad01.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_m_21k-361418a2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_l_21k-91a19ec9.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_xl_in21k-fd7e8abf.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_b0-c7cc451f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_b1-be6e41b0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_b2-847de54e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_b3-57773f13.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mixnet_s-a907afbc.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mixnet_m-4647fc68.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mixnet_l-5a9a2ed8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mixnet_xl_ra-aac3c00c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mixnet_s-89d3354b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mixnet_m-0f4d8805.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | 
convmixer/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mixnet_l-6c92e0c8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/ghostnet.py | convmixer/timm/models/ghostnet.py | https://github.com/huawei-noah/CV-backbones/releases/download/ghostnet_pth/ghostnet_1x.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/gluon_resnet.py | convmixer/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet18_v1b-0757602b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/gluon_resnet.py | convmixer/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet34_v1b-c6d82d59.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/gluon_resnet.py | convmixer/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1b-0ebe02e2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/gluon_resnet.py | convmixer/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1b-3b017079.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/gluon_resnet.py | convmixer/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1b-c1edb0dd.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/gluon_resnet.py | convmixer/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1c-48092f55.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/gluon_resnet.py | convmixer/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1c-1f26822a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/gluon_resnet.py | convmixer/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1c-a3bb0b98.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/gluon_resnet.py | convmixer/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1d-818a1b1b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/gluon_resnet.py | convmixer/timm/models/gluon_resnet.py | 
https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1d-0f9c8644.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/gluon_resnet.py | convmixer/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1d-bd354e12.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/gluon_resnet.py | convmixer/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1s-1762acc0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/gluon_resnet.py | convmixer/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1s-60fe0cc1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/gluon_resnet.py | convmixer/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1s-dcc41b81.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/gluon_resnet.py | convmixer/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnext50_32x4d-e6a097c1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/gluon_resnet.py | convmixer/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnext101_32x4d-b253c8c4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/gluon_resnet.py | convmixer/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnext101_64x4d-f9a8e184.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/gluon_resnet.py | convmixer/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_seresnext50_32x4d-90cf2d6e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/gluon_resnet.py | convmixer/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_seresnext101_32x4d-cf52900d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/gluon_resnet.py | convmixer/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_seresnext101_64x4d-f9926f93.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/gluon_resnet.py | convmixer/timm/models/gluon_resnet.py | 
https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_senet154-70a1a3c0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/gluon_xception.py | convmixer/timm/models/gluon_xception.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gluon_xception-7015a15c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/hardcorenas.py | convmixer/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_A_Green_38ms_75.9_23474aeb.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/hardcorenas.py | convmixer/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_B_Green_40ms_76.5_1f882d1e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/hardcorenas.py | convmixer/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_C_Green_44ms_77.1_d4148c9e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/hardcorenas.py | convmixer/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_D_Green_50ms_77.4_23e3cdde.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/hardcorenas.py | convmixer/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_E_Green_55ms_77.9_90f20e8a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/hardcorenas.py | convmixer/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_F_Green_60ms_78.1_2855edf1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/hrnet.py | convmixer/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnet_w18_small_v1-f460c6bc.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/hrnet.py | convmixer/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnet_w18_small_v2-4c50a8cb.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/hrnet.py | convmixer/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w18-8cb57bb9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/hrnet.py | convmixer/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w30-8d7f8dab.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/hrnet.py | convmixer/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w32-90d8c5fb.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/hrnet.py | convmixer/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w40-7cd397a4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/hrnet.py | convmixer/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w44-c9ac8c18.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/hrnet.py | convmixer/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w48-abd2e6ab.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/hrnet.py | convmixer/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w64-b47cc881.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/inception_resnet_v2.py | convmixer/timm/models/inception_resnet_v2.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/inception_resnet_v2-940b1cd6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/inception_resnet_v2.py | convmixer/timm/models/inception_resnet_v2.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ens_adv_inception_resnet_v2-2592a550.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/inception_v3.py | convmixer/timm/models/inception_v3.py | https://download.pytorch.org/models/inception_v3_google-1a9a5a14.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/inception_v3.py | convmixer/timm/models/inception_v3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_inception_v3-e0069de4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/inception_v3.py | convmixer/timm/models/inception_v3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/adv_inception_v3-9e27bd63.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/inception_v3.py | convmixer/timm/models/inception_v3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gluon_inception_v3-9f746940.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/inception_v4.py | convmixer/timm/models/inception_v4.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/inceptionv4-8e4777a0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/levit.py | convmixer/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-128S-96703c44.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/levit.py | convmixer/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-128-b88c2750.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/levit.py | convmixer/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-192-92712e41.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/levit.py | convmixer/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-256-13b5763e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/levit.py | convmixer/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-384-9bdaf2e2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/mlp_mixer.py | convmixer/timm/models/mlp_mixer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_mixer_b16_224-76587d61.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/mlp_mixer.py | convmixer/timm/models/mlp_mixer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_mixer_b16_224_in21k-617b3de2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/mlp_mixer.py | convmixer/timm/models/mlp_mixer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_mixer_l16_224-92f9adc4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/mlp_mixer.py | convmixer/timm/models/mlp_mixer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_mixer_l16_224_in21k-846aa33c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/mlp_mixer.py | convmixer/timm/models/mlp_mixer.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/mixer_b16_224_miil_in21k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/mlp_mixer.py | convmixer/timm/models/mlp_mixer.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/mixer_b16_224_miil.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/mlp_mixer.py | convmixer/timm/models/mlp_mixer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gmixer_24_224_raa-7daf7ae6.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/mlp_mixer.py | convmixer/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_12_no_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/mlp_mixer.py | convmixer/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_24_no_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/mlp_mixer.py | convmixer/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_36_no_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/mlp_mixer.py | convmixer/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlpB_24_no_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/mlp_mixer.py | convmixer/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_12_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/mlp_mixer.py | convmixer/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_24_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/mlp_mixer.py | convmixer/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_36_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/mlp_mixer.py | convmixer/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlpB_24_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/mlp_mixer.py | convmixer/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlpB_24_22k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/mlp_mixer.py | convmixer/timm/models/mlp_mixer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gmlp_s16_224_raa-10536d42.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/mobilenetv3.py | convmixer/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv3_large_100_ra-f55367f5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/mobilenetv3.py | convmixer/timm/models/mobilenetv3.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/mobilenetv3_large_100_1k_miil_78_0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/mobilenetv3.py | convmixer/timm/models/mobilenetv3.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/mobilenetv3_large_100_in21k_miil.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/mobilenetv3.py | convmixer/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv3_100-35495452.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/mobilenetv3.py | convmixer/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_large_075-150ee8b0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/mobilenetv3.py | convmixer/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_large_100-427764d5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/mobilenetv3.py | convmixer/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_large_minimal_100-8596ae28.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/mobilenetv3.py | convmixer/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_small_075-da427f52.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/mobilenetv3.py | convmixer/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_small_100-37f49e2b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/mobilenetv3.py | convmixer/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_small_minimal_100-922a7843.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/nasnet.py | convmixer/timm/models/nasnet.py | http://data.lip6.fr/cadene/pretrainedmodels/nasnetalarge-a1897284.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/nest.py | convmixer/timm/models/nest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vt3p-weights/jx_nest_base-8bc41011.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/nest.py | convmixer/timm/models/nest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vt3p-weights/jx_nest_small-422eaded.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/nest.py | convmixer/timm/models/nest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vt3p-weights/jx_nest_tiny-e3428fb9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/nfnet.py | convmixer/timm/models/nfnet.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f0-604f9c3a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/nfnet.py | convmixer/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f1-fc540f82.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/nfnet.py | convmixer/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f2-89875923.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/nfnet.py | convmixer/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f3-d74ab3aa.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/nfnet.py | convmixer/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f4-0ac5b10b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/nfnet.py | convmixer/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f5-ecb20ab1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/nfnet.py | convmixer/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f6-e0f12116.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/nfnet.py | convmixer/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/nfnet_l0_ra2-45c6688d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/nfnet.py | convmixer/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecanfnet_l0_ra2-e3e9ac50.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/nfnet.py | convmixer/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecanfnet_l1_ra2-7dce93cd.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/nfnet.py | convmixer/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecanfnet_l2_ra3-da781a61.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/nfnet.py | convmixer/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/nf_regnet_b1_256_ra2-ad85cfef.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/nfnet.py | convmixer/timm/models/nfnet.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/nf_resnet50_ra2-9f236009.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/pit.py | convmixer/timm/models/pit.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-pit-weights/pit_ti_730.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/pit.py | convmixer/timm/models/pit.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-pit-weights/pit_xs_781.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/pit.py | convmixer/timm/models/pit.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-pit-weights/pit_s_809.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/pit.py | convmixer/timm/models/pit.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-pit-weights/pit_b_820.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/pit.py | convmixer/timm/models/pit.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-pit-weights/pit_ti_distill_746.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/pit.py | convmixer/timm/models/pit.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-pit-weights/pit_xs_distill_791.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/pit.py | convmixer/timm/models/pit.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-pit-weights/pit_s_distill_819.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/pit.py | convmixer/timm/models/pit.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-pit-weights/pit_b_distill_840.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/pnasnet.py | convmixer/timm/models/pnasnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/pnasnet5large-bf079911.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/regnet.py | convmixer/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_002-e7e85e5c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/regnet.py | convmixer/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_004-7d0e9424.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/regnet.py | convmixer/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_006-85ec1baa.pth | 下载权重文件 
| -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/regnet.py | convmixer/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_008-d8b470eb.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/regnet.py | convmixer/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_016-65ca972a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/regnet.py | convmixer/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_032-ed0c7f7e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/regnet.py | convmixer/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_040-73c2a654.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/regnet.py | convmixer/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_064-29278baa.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/regnet.py | convmixer/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_080-7c7fcab1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/regnet.py | convmixer/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_120-65d5521e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/regnet.py | convmixer/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_160-c98c4112.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/regnet.py | convmixer/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_320-8ea38b93.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/regnet.py | convmixer/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_002-e68ca334.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/regnet.py | convmixer/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_004-0db870e6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/regnet.py | convmixer/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_006-c67e57ec.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/regnet.py | convmixer/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_008-dc900dbe.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/regnet.py | convmixer/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_016-54367f74.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/regnet.py | convmixer/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/regnety_032_ra-7f2439f9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/regnet.py | convmixer/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_040-f0d569f9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/regnet.py | convmixer/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_064-0a48325c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/regnet.py | convmixer/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_080-e7f3eb93.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/regnet.py | convmixer/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_120-721ba79a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/regnet.py | convmixer/timm/models/regnet.py | https://dl.fbaipublicfiles.com/deit/regnety_160-a5fe301d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/regnet.py | convmixer/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_320-ba464b29.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/res2net.py | convmixer/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net50_26w_4s-06e79181.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/res2net.py | convmixer/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net50_48w_2s-afed724a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/res2net.py | convmixer/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net50_14w_8s-6527dddc.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/res2net.py | convmixer/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net50_26w_6s-19041792.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/res2net.py | convmixer/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net50_26w_8s-2c7c9f12.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/res2net.py | convmixer/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net101_26w_4s-02a759a1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/res2net.py | convmixer/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2next50_4s-6ef7e7bf.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnest.py | convmixer/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gluon_resnest14-9c8fe254.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnest.py | convmixer/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gluon_resnest26-50eb607c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnest.py | convmixer/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest50-528c19ca.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnest.py | convmixer/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest101-22405ba7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnest.py | convmixer/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest200-75117900.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnest.py | convmixer/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest269-0cc87c48.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnest.py | convmixer/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest50_fast_4s2x40d-41d14ed0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnest.py | convmixer/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest50_fast_1s4x24d-d4a4f76f.pth | 
下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet18d_ra2-48a79e06.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet34-43635321.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet34d_ra2-f8dcfcaf.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet26-9aa10e23.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet26d-69e92c46.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-attn-weights/resnet26t_256_ra2-6f6fa748.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rsb-weights/resnet50_a1_0-14fe96d1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet50d_ra2-464e36ba.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet101d_ra2-2803ffab.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet152d_ra2-5cac0439.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet200d_ra2-bdba9bf9.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/wide_resnet50_racm-8234f177.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnext50_32x4d_ra-d733960d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnext50d_32x4d-103e99f8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x8-c38310e5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x16-c6f796b0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x32-e4b90b00.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | 
https://download.pytorch.org/models/ig_resnext101_32x48-3e41cc8a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet18-d92f0530.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet50-08389792.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext50_32x4-ddb3e555.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x4-dc43570a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x8-2cfe2f8b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x16-15fffa57.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet18-118f1556.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet50-16a12f1b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext50_32x4-72679e44.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x4-3f87e46b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x8-b4712904.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | 
https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x16-f3559a9c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnet50_ra_224-8efdb4bb.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnet152d_ra2-04464dd2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext26d_32x4d-80fa48a3.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext26tn_32x4d-569cb627.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext50_32x4d_racm-a304a460.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecaresnet26t_ra2-46609757.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNetLight_4f34b35b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet50D_833caf58.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45899/outputs/ECAResNet50D_P_9c67f710.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecaresnet50t_ra2-f7ac63c4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet101D_281c5844.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45610/outputs/ECAResNet101D_P_75a3370e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecaresnet269d_320_ra2-7baa55cb.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnetblur50-84f4748f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs50_ema-6b53758b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs101_i192_ema-1509bbf6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs152_i256_ema-a9aff7f9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs200_ema-623d2f59.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs270_ema-b40e674c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs350_i256_ema-5a1aa8f1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs420_ema-972dee69.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnetv2.py | convmixer/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x1-ILSVRC2012.npz | 下载数据集 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnetv2.py | convmixer/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x3-ILSVRC2012.npz | 下载数据集 | -| 开源代码引入 | 
https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnetv2.py | convmixer/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x1-ILSVRC2012.npz | 下载数据集 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnetv2.py | convmixer/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x3-ILSVRC2012.npz | 下载数据集 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnetv2.py | convmixer/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x2-ILSVRC2012.npz | 下载数据集 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnetv2.py | convmixer/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x4-ILSVRC2012.npz | 下载数据集 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnetv2.py | convmixer/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x1.npz | 下载数据集 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnetv2.py | convmixer/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x3.npz | 下载数据集 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnetv2.py | convmixer/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x1.npz | 下载数据集 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnetv2.py | convmixer/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x3.npz | 下载数据集 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnetv2.py | convmixer/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x2.npz | 下载数据集 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnetv2.py | convmixer/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x4.npz | 下载数据集 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnetv2.py | convmixer/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/distill/R50x1_224.npz | 下载数据集 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnetv2.py | convmixer/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/distill/R152x2_T_224.npz | 下载数据集 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnetv2.py | convmixer/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/distill/R152x2_T_384.npz | 下载数据集 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnetv2.py | convmixer/timm/models/resnetv2.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rsb-weights/resnetv2_50_a1_h-000cdf49.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/rexnet.py | convmixer/timm/models/rexnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rexnet/rexnetv1_100-1b4dddf4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/rexnet.py | convmixer/timm/models/rexnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rexnet/rexnetv1_130-590d768e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/rexnet.py | convmixer/timm/models/rexnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rexnet/rexnetv1_150-bd1a6aa8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/rexnet.py | convmixer/timm/models/rexnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rexnet/rexnetv1_200-8c0b7f2d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/selecsls.py | convmixer/timm/models/selecsls.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-selecsls/selecsls42b-8af30141.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/selecsls.py | convmixer/timm/models/selecsls.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-selecsls/selecsls60-bbf87526.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/selecsls.py | convmixer/timm/models/selecsls.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-selecsls/selecsls60b-94e619b5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/senet.py | convmixer/timm/models/senet.py | http://data.lip6.fr/cadene/pretrainedmodels/senet154-c7b49a05.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/senet.py | convmixer/timm/models/senet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnet18-4bb0ce65.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/senet.py | convmixer/timm/models/senet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnet34-a4004e63.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/senet.py | convmixer/timm/models/senet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/se_resnet50-ce0d4300.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/senet.py | convmixer/timm/models/senet.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/se_resnet101-7e38fcc6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/senet.py | convmixer/timm/models/senet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/se_resnet152-d17c99b7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/senet.py | convmixer/timm/models/senet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext26_32x4d-65ebdb501.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/senet.py | convmixer/timm/models/senet.py | http://data.lip6.fr/cadene/pretrainedmodels/se_resnext50_32x4d-a260b3a4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/senet.py | convmixer/timm/models/senet.py | http://data.lip6.fr/cadene/pretrainedmodels/se_resnext101_32x4d-3b2fe3d8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/sknet.py | convmixer/timm/models/sknet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/skresnet18_ra-4eec2804.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/sknet.py | convmixer/timm/models/sknet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/skresnet34_ra-bdc0ccde.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/sknet.py | convmixer/timm/models/sknet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/skresnext50_ra-f40e40bf.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/swin_transformer.py | convmixer/timm/models/swin_transformer.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_base_patch4_window12_384_22kto1k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/swin_transformer.py | convmixer/timm/models/swin_transformer.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_base_patch4_window7_224_22kto1k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/swin_transformer.py | convmixer/timm/models/swin_transformer.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_large_patch4_window12_384_22kto1k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/swin_transformer.py | convmixer/timm/models/swin_transformer.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_large_patch4_window7_224_22kto1k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/swin_transformer.py | 
convmixer/timm/models/swin_transformer.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_small_patch4_window7_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/swin_transformer.py | convmixer/timm/models/swin_transformer.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_tiny_patch4_window7_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/swin_transformer.py | convmixer/timm/models/swin_transformer.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_base_patch4_window12_384_22k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/swin_transformer.py | convmixer/timm/models/swin_transformer.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_base_patch4_window7_224_22k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/swin_transformer.py | convmixer/timm/models/swin_transformer.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_large_patch4_window12_384_22k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/swin_transformer.py | convmixer/timm/models/swin_transformer.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_large_patch4_window7_224_22k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/tnt.py | convmixer/timm/models/tnt.py | https://github.com/contrastive/pytorch-image-models/releases/download/TNT/tnt_s_patch16_224.pth.tar | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/tresnet.py | convmixer/timm/models/tresnet.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/tresnet_m_1k_miil_83_1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/tresnet.py | convmixer/timm/models/tresnet.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/tresnet_m_miil_in21k.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/tresnet.py | convmixer/timm/models/tresnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-tresnet/tresnet_l_81_5-235b486c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/tresnet.py | convmixer/timm/models/tresnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-tresnet/tresnet_xl_82_0-a2d51b00.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/tresnet.py | convmixer/timm/models/tresnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-tresnet/tresnet_m_448-bc359d10.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/tresnet.py | convmixer/timm/models/tresnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-tresnet/tresnet_l_448-940d0cd1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/tresnet.py | convmixer/timm/models/tresnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-tresnet/tresnet_xl_448-8c1815de.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/twins.py | convmixer/timm/models/twins.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vt3p-weights/twins_pcpvt_small-e70e7e7a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/twins.py | convmixer/timm/models/twins.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vt3p-weights/twins_pcpvt_base-e5ecb09b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/twins.py | convmixer/timm/models/twins.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vt3p-weights/twins_pcpvt_large-d273f802.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/twins.py | convmixer/timm/models/twins.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vt3p-weights/twins_svt_small-42e5f78c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/twins.py | convmixer/timm/models/twins.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vt3p-weights/twins_svt_base-c2265010.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/twins.py | convmixer/timm/models/twins.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vt3p-weights/twins_svt_large-90f6aaa9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/vgg.py | convmixer/timm/models/vgg.py | https://download.pytorch.org/models/vgg11-bbd30ac9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/vgg.py | convmixer/timm/models/vgg.py | https://download.pytorch.org/models/vgg13-c768596a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/vgg.py | convmixer/timm/models/vgg.py | https://download.pytorch.org/models/vgg16-397923af.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/vgg.py | convmixer/timm/models/vgg.py | https://download.pytorch.org/models/vgg19-dcbb9e9d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/vgg.py | convmixer/timm/models/vgg.py | https://download.pytorch.org/models/vgg11_bn-6002323d.pth | 下载权重文件 
| -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/vgg.py | convmixer/timm/models/vgg.py | https://download.pytorch.org/models/vgg13_bn-abd245e5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/vgg.py | convmixer/timm/models/vgg.py | https://download.pytorch.org/models/vgg16_bn-6c64b313.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/vgg.py | convmixer/timm/models/vgg.py | https://download.pytorch.org/models/vgg19_bn-c79401a0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/visformer.py | convmixer/timm/models/visformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vt3p-weights/visformer_small-839e1f5b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/vovnet.py | convmixer/timm/models/vovnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ese_vovnet19b_dw-a8741004.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/vovnet.py | convmixer/timm/models/vovnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ese_vovnet39b-f912fe73.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/xception_aligned.py | convmixer/timm/models/xception_aligned.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_xception_41-e6439c97.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/xception_aligned.py | convmixer/timm/models/xception_aligned.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_xception_65-c9ae96e8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/xception_aligned.py | convmixer/timm/models/xception_aligned.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_xception_71-8eec7df1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/xcit.py | convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p16_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/xcit.py | convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p16_224_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/xcit.py | convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p16_384_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/xcit.py | convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p16_224.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/xcit.py | convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p16_224_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/xcit.py | convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p16_384_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/xcit.py | convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p16_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/xcit.py | convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p16_224_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/xcit.py | convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p16_384_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/xcit.py | convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p16_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/xcit.py | convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p16_224_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/xcit.py | convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p16_384_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/xcit.py | convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p16_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/xcit.py | convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p16_224_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/xcit.py | convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p16_384_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/xcit.py | convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p16_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/xcit.py | convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p16_224_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/xcit.py | convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p16_384_dist.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/xcit.py | convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p16_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/xcit.py | convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p16_224_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/xcit.py | convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p16_384_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/xcit.py | convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p8_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/xcit.py | convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p8_224_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/xcit.py | convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p8_384_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/xcit.py | convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p8_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/xcit.py | convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p8_224_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/xcit.py | convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p8_384_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/xcit.py | convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p8_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/xcit.py | convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p8_224_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/xcit.py | convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p8_384_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/xcit.py | convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p8_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/xcit.py | convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p8_224_dist.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/xcit.py | convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p8_384_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/xcit.py | convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p8_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/xcit.py | convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p8_224_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/xcit.py | convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p8_384_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/xcit.py | convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p8_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/xcit.py | convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p8_224_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/xcit.py | convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p8_384_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/xcit.py | convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p8_224.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/xcit.py | convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p8_224_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/xcit.py | convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p8_384_dist.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/nest.py | convmixer/timm/models/nest.py | https://github.com/google-research/nested-transformer/issues/2 | 相关说明 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/google/automl/tree/master/efficientnetv2 | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/layers/halo_attn.py | convmixer/timm/models/layers/halo_attn.py | https://gist.github.com/aravindsrinivas/56359b79f0ce4449bcb04ab4b56a57a2 | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/layers/attention_pool2d.py | convmixer/timm/models/layers/attention_pool2d.py | https://github.com/openai/CLIP/blob/3b473b0e682c091a9e53623eebc1ca1657385717/clip/model.py | 源码实现 | -| 开源代码引入 | 
https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/layers/non_local_attn.py | convmixer/timm/models/layers/non_local_attn.py | https://github.com/facebookresearch/video-nonlocal-net | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/tensorflow/models/blob/master/research/slim/nets/mobilenet/mobilenet_v2.py | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/coat.py | convmixer/timm/models/coat.py | https://arxiv.org/abs/2104.06399 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/scheduler/scheduler.py | convmixer/timm/scheduler/scheduler.py | https://github.com/allenai/allennlp/tree/master/allennlp/training/learning_rate_schedulers | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/coat.py | convmixer/timm/models/coat.py | https://discuss.pytorch.org/t/how-to-keep-the-shape-of-input-and-output-same-when-dilation-conv/14338 | 相关说明 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/loss/jsd.py | convmixer/timm/loss/jsd.py | https://github.com/google-research/augmix/blob/master/imagenet.py | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnetv2.py | convmixer/timm/models/resnetv2.py | https://github.com/KaimingHe/resnet-1k-layers/blob/master/resnet-pre-act.lua | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/layers/lambda_layer.py | convmixer/timm/models/layers/lambda_layer.py | https://arxiv.org/abs/2102.08602 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet_builder.py | convmixer/timm/models/efficientnet_builder.py | https://github.com/tensorflow/tpu/blob/master/models/official/efficientnet/efficientnet_model.py | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/layers/weight_init.py | convmixer/timm/models/layers/weight_init.py | https://people.sc.fsu.edu/~jburkardt/presentations/truncated_normal.pdf | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://arxiv.org/abs/1805.00932 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/README.md | convmixer/timm/models/inception_resnet_v2.py | https://arxiv.org/abs/1602.07261 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/vovnet.py | convmixer/timm/models/vovnet.py | https://arxiv.org/abs/1911.06667 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/nest.py | convmixer/timm/models/nest.py | https://arxiv.org/abs/2105.12723 | 论文地址 | -| 开源代码引入 | 
https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/data/distributed_sampler.py | convmixer/timm/data/distributed_sampler.py | https://github.com/facebookresearch/deit/blob/0c4b8f60/samplers.py | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/README.md | convmixer/timm/models/pit.py | https://github.com/naver-ai/pit | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/twins.py | convmixer/timm/models/twins.py | https://arxiv.org/pdf/2104.13840.pdf | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/convert/convert_nest_flax.py | convmixer/timm/models/nest.py | https://github.com/google-research/nested-transformer | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/utils/model_ema.py | convmixer/timm/utils/model_ema.py | https://www.tensorflow.org/api_docs/python/tf/train/ExponentialMovingAverage | 相关说明 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/README.md | convmixer/timm/models/swin_transformer.py | https://github.com/microsoft/Swin-Transformer | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/vision_transformer.py | convmixer/timm/models/vision_transformer.py | https://arxiv.org/abs/2106.01548 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/layers/mixed_conv2d.py | convmixer/timm/models/layers/mixed_conv2d.py | https://github.com/tensorflow/tpu/blob/master/models/official/mnasnet/mixnet/custom_layers.py | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/layers/bottleneck_attn.py | convmixer/timm/models/layers/bottleneck_attn.py | https://arxiv.org/abs/2101.11605 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/optim/adahessian.py | convmixer/timm/optim/adahessian.py | https://github.com/davda54/ada-hessian/blob/master/ada_hessian.py | 源码实现 | -| 开发引入 | / | convmixer/timm/models/inception_resnet_v2.py | https://arxiv.org/abs/1705.07204 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/utils/agc.py | convmixer/timm/utils/agc.py | https://gist.github.com/lucidrains/0d6560077edac419ab5d3aa29e674d5c | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/layers/halo_attn.py | convmixer/timm/models/layers/bottleneck_attn.py | https://gist.github.com/aravindsrinivas/56359b79f0ce4449bcb04ab4b56a57a2 | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/inception_v3.py | convmixer/timm/models/inception_v3.py | https://github.com/pytorch/vision/blob/master/LICENSE | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet_builder.py | convmixer/timm/models/efficientnet_builder.py | 
https://github.com/tensorflow/tpu/blob/master/models/official/mnasnet/mnasnet_models.py | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/data/real_labels.py | convmixer/timm/data/real_labels.py | https://arxiv.org/abs/2006.07159 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/gluon_resnet.py | convmixer/timm/models/gluon_resnet.py | https://github.com/dmlc/gluon-cv/blob/master/gluoncv/model_zoo/resnet.py | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/regnet.py | convmixer/timm/models/regnet.py | https://github.com/facebookresearch/pycls/blob/master/pycls/models/regnet.py | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/ghostnet.py | convmixer/timm/models/ghostnet.py | https://github.com/huawei-noah/CV-backbones/tree/master/ghostnet_pytorch | 源码实现 | -| 开发引入 | / | convmixer/timm/models/levit.py | https://github.com/facebookresearch/LeViT | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/optim/rmsprop_tf.py | convmixer/timm/optim/adamw.py | https://arxiv.org/abs/1711.05101 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/utils/model.py | convmixer/timm/utils/model.py | https://arxiv.org/abs/2101.08692 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/layers/drop.py | convmixer/timm/models/layers/drop.py | https://arxiv.org/abs/1810.12890 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/layers/attention_pool2d.py | convmixer/timm/models/layers/attention_pool2d.py | https://github.com/lucidrains/vit-pytorch/blob/6f3a5fcf0bca1c5ec33a35ef48d97213709df4ba/vit_pytorch/rvt.py | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/optim/adamp.py | convmixer/timm/optim/adamp.py | https://github.com/clovaai/AdamP/blob/master/adamp/adamp.py | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://arxiv.org/pdf/1807.11626.pdf | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/layers/activations_me.py | convmixer/timm/models/layers/activations_me.py | https://arxiv.org/abs/1908.08681 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/tresnet.py | convmixer/timm/models/tresnet.py | https://github.com/mrT23/TResNet | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/utils/model.py | convmixer/timm/utils/model.py | https://docs.fast.ai/callback.hook.html | 相关说明 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/levit.py | convmixer/timm/models/cait.py | 
https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/vision_transformer.py | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/layers/activations_jit.py | convmixer/timm/models/layers/activations.py | https://arxiv.org/abs/1710.05941 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/data/auto_augment.py | convmixer/timm/data/auto_augment.py | https://arxiv.org/abs/1906.11172 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/hrnet.py | convmixer/timm/models/hrnet.py | https://github.com/HRNet/HRNet-Image-Classification | 源码实现 | -| 开发引入 | / | convmixer/timm/models/densenet.py | https://github.com/pytorch/vision | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/optim/sgdp.py | convmixer/timm/optim/sgdp.py | https://github.com/clovaai/AdamP/blob/master/adamp/sgdp.py | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://pytorch.org/hub/facebookresearch_WSL-Images_resnext/ | 相关说明 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/vision_transformer_hybrid.py | convmixer/timm/models/vision_transformer.py | https://arxiv.org/abs/2010.11929 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://github.com/facebookresearch/WSL-Images | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/optim/rmsprop_tf.py | convmixer/timm/optim/rmsprop_tf.py | https://arxiv.org/abs/1711.05101 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet_blocks.py | convmixer/timm/models/efficientnet_blocks.py | https://arxiv.org/abs/2004.14525 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/crossvit.py | convmixer/timm/models/crossvit.py | https://arxiv.org/abs/2103.14899 | 论文地址 | -| 开发引入 | / | convmixer/timm/models/resnet.py | https://arxiv.org/pdf/1812.01187 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://arxiv.org/pdf/2002.08258.pdf | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/data/auto_augment.py | convmixer/timm/data/auto_augment.py | https://arxiv.org/abs/1805.09501 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/loss/jsd.py | convmixer/timm/data/auto_augment.py | https://arxiv.org/abs/1912.02781 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/xcit.py | convmixer/timm/models/xcit.py | https://github.com/facebookresearch/deit/ | 源码实现 | -| 开源代码引入 | 
https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://github.com/tensorflow/tpu/tree/bee9c4f6/models/official/resnet/resnet_rs | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/tnt.py | convmixer/timm/models/tnt.py | https://arxiv.org/abs/2103.00112 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/sknet.py | convmixer/timm/models/sknet.py | https://arxiv.org/abs/1903.06586 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnest.py | convmixer/timm/models/resnest.py | https://arxiv.org/abs/2004.08955 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/mlp_mixer.py | convmixer/timm/models/mlp_mixer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resmlp_24_224_raa-a8256759.pth | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/vision_transformer.py | convmixer/timm/models/mobilenetv3.py | https://github.com/Alibaba-MIIL/ImageNet21K | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnetv2.py | convmixer/timm/models/resnetv2.py | https://arxiv.org/abs/1912.11370 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/facebookresearch/maskrcnn-benchmark/blob/master/maskrcnn_benchmark/modeling/backbone/fbnet_modeldef.py | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/optim/nvnovograd.py | convmixer/timm/optim/nvnovograd.py | https://arxiv.org/abs/1905.11286 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/utils/agc.py | convmixer/timm/models/nfnet.py | https://arxiv.org/abs/2102.06171 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/layers/eca.py | convmixer/timm/models/layers/eca.py | https://github.com/VRandme | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/vision_transformer_hybrid.py | convmixer/timm/models/vision_transformer_hybrid.py | https://arxiv.org/abs/2010.11929 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/optim/lamb.py | convmixer/timm/optim/adamw.py | https://openreview.net/forum?id=ryQu7f-RZ | 相关说明 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/layers/drop.py | convmixer/timm/models/layers/drop.py | https://github.com/clovaai/assembled-cnn/blob/master/nets/blocks.py | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/densenet.py | convmixer/timm/models/densenet.py | 
https://arxiv.org/pdf/1707.06990.pdf | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/vision_transformer.py | convmixer/timm/models/vision_transformer.py | https://github.com/google-research/vision_transformer/blob/00883dd691c63a6830751563748663526e811cee/vit_jax/checkpoint.py#L224 | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/ghostnet.py | convmixer/timm/models/ghostnet.py | https://arxiv.org/abs/1911.11907 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/tnt.py | convmixer/timm/models/tnt.py | https://gitee.com/mindspore/mindspore/tree/master/model_zoo/research/cv/TNT | 相关说明 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/xception_aligned.py | convmixer/timm/models/xception_aligned.py | https://github.com/tensorflow/models/blob/master/research/deeplab/g3doc/model_zoo.md | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/docs/models.md | convmixer/timm/models/byobnet.py | https://github.com/idstcv/GPU-Efficient-Networks | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/optim/sgdp.py | convmixer/timm/optim/sgdp.py | https://arxiv.org/abs/2006.08217 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/layers/eca.py | convmixer/timm/models/layers/eca.py | https://arxiv.org/pdf/1910.03151.pdf | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/layers/activations_me.py | convmixer/timm/models/layers/activations_jit.py | https://github.com/digantamisra98/H-Mish/blob/0da20d4bc58e696b6803f2523c58d3c8a82782d0/README.md | 源码实现 | -| 开发引入 | / | convmixer/timm/models/layers/drop.py | https://github.com/tensorflow/tpu/issues/494#issuecomment-532968956 | 相关说明 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/optim/madgrad.py | convmixer/timm/optim/madgrad.py | https://arxiv.org/abs/2101.11075 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/layers/attention_pool2d.py | convmixer/timm/models/layers/attention_pool2d.py | https://blog.eleuther.ai/rotary-embeddings/ | 相关说明 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/layers/drop.py | convmixer/timm/models/layers/drop.py | https://arxiv.org/pdf/1810.12890.pdf | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/layers/cond_conv2d.py | convmixer/timm/models/layers/cond_conv2d.py | https://github.com/tensorflow/tpu/blob/master/models/official/efficientnet/condconv/condconv_layers.py | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/rexnet.py | convmixer/timm/models/rexnet.py | https://github.com/clovaai/rexnet | 源码实现 | -| 开发引入 | / | convmixer/timm/models/efficientnet.py | 3.5.7.9 
| 相关说明 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/scheduler/cosine_lr.py | convmixer/timm/scheduler/cosine_lr.py | https://arxiv.org/abs/1608.03983 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/layers/cbam.py | convmixer/timm/models/layers/cbam.py | https://arxiv.org/abs/1807.06521 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/convert/convert_nest_flax.py | convmixer/convert/convert_nest_flax.py | https://console.cloud.google.com/storage/browser/gresearch/nest-checkpoints | 相关说明 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/optim/lamb.py | convmixer/timm/optim/lamb.py | https://openreview.net/forum?id=ryQu7f-RZ | 相关说明 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/mobilenetv3.py | convmixer/timm/models/efficientnet_blocks.py | https://arxiv.org/abs/1905.02244 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/byobnet.py | convmixer/timm/models/byobnet.py | https://arxiv.org/abs/2006.14090 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/convert/convert_nest_flax.py | convmixer/convert/convert_nest_flax.py | https://github.com/google-research/nested-transformer/blob/main/models/nest_net.py | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/mobilenetv3.py | convmixer/timm/models/mobilenetv3.py | https://arxiv.org/abs/1905.02244 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/dla.py | convmixer/timm/models/dla.py | https://arxiv.org/abs/1707.06484 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/vision_transformer.py | convmixer/timm/models/vision_transformer.py | https://github.com/karpathy/minGPT | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://arxiv.org/abs/1904.02877 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/vgg.py | convmixer/timm/models/vgg.py | https://arxiv.org/pdf/1409.1556.pdf | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/regnet.py | convmixer/timm/models/regnet.py | https://arxiv.org/abs/2003.13678 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/optim/nvnovograd.py | convmixer/timm/optim/nvnovograd.py | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechRecognition/Jasper | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/layers/mixed_conv2d.py | convmixer/timm/models/efficientnet.py | https://arxiv.org/abs/1907.09595 | 论文地址 | 
-| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/README.md | convmixer/timm/models/pnasnet.py | https://arxiv.org/abs/1712.00559 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/mobilenetv3.py | convmixer/timm/models/mobilenetv3.py | https://arxiv.org/abs/2006.02049 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://arxiv.org/abs/2103.07579 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet_blocks.py | convmixer/timm/models/efficientnet.py | https://arxiv.org/abs/1905.11946 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/convit.py | convmixer/timm/models/convit.py | https://arxiv.org/abs/2103.10697 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/vovnet.py | convmixer/timm/models/vovnet.py | https://arxiv.org/abs/1904.09730 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/layers/split_attn.py | convmixer/timm/models/layers/split_attn.py | https://github.com/zhanghang1989/ResNeSt | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/res2net.py | convmixer/timm/models/dla.py | https://github.com/gasvn/Res2Net/ | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/mlp_mixer.py | convmixer/timm/models/mlp_mixer.py | https://github.com/google-research/vision_transformer/blob/linen/vit_jax/models_mixer.py | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet_blocks.py | convmixer/timm/models/efficientnet_blocks.py | https://arxiv.org/abs/2104.00298 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/vision_transformer.py | convmixer/timm/models/vision_transformer.py | https://github.com/Alibaba-MIIL/ImageNet21K | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/visformer.py | convmixer/timm/models/visformer.py | https://arxiv.org/abs/2104.12533 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet_blocks.py | convmixer/timm/models/efficientnet_blocks.py | https://arxiv.org/abs/1905.11946 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/optim/adabelief.py | convmixer/timm/optim/adabelief.py | https://gist.github.com/juntang-zhuang/0a501dd51c02278d952cf159bc233037 | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/xcit.py | convmixer/timm/models/beit.py | https://github.com/rwightman/pytorch-image-models/tree/master/timm | 源码实现 | -| 开源代码引入 | 
https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/scheduler/tanh_lr.py | convmixer/timm/scheduler/tanh_lr.py | https://arxiv.org/abs/1806.01593 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/layers/global_context.py | convmixer/timm/models/layers/global_context.py | https://arxiv.org/abs/1904.11492 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/beit.py | convmixer/timm/models/beit.py | https://arxiv.org/abs/2106.08254 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/levit.py | convmixer/timm/models/convit.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/vision_transformer.py | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/crossvit.py | convmixer/timm/models/crossvit.py | https://github.com/IBM/CrossViT/blob/main/models/crossvit.py | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/layers/cond_conv2d.py | convmixer/timm/models/layers/cond_conv2d.py | https://arxiv.org/abs/1904.04971 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/vision_transformer_hybrid.py | convmixer/timm/models/vision_transformer.py | https://github.com/google-research/vision_transformer | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/scheduler/poly_lr.py | convmixer/timm/scheduler/cosine_lr.py | https://arxiv.org/abs/2004.05909 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/layers/std_conv.py | convmixer/timm/models/layers/std_conv.py | https://arxiv.org/abs/1903.10520v2 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnest.py | convmixer/timm/models/layers/split_attn.py | https://arxiv.org/abs/2004.08955 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/layers/eca.py | convmixer/timm/models/layers/eca.py | https://github.com/BangguWu/ECANet | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/visformer.py | convmixer/timm/models/visformer.py | https://github.com/danczs/Visformer | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/levit.py | convmixer/timm/models/crossvit.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/vision_transformer.py | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/data/real_labels.py | convmixer/timm/data/real_labels.py | https://github.com/google-research/reassessed-imagenet | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/hardcorenas.py | 
convmixer/timm/models/hardcorenas.py | https://github.com/Alibaba-MIIL/HardCoReNAS | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/layers/squeeze_excite.py | convmixer/timm/models/layers/squeeze_excite.py | https://arxiv.org/abs/1709.01507 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/pnasnet.py | convmixer/timm/models/pnasnet.py | https://github.com/Cadene/pretrained-models.pytorch/blob/master/pretrainedmodels/models/pnasnet.py | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/optim/radam.py | convmixer/timm/optim/radam.py | https://github.com/LiyuanLucasLiu/RAdam | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/mlp_mixer.py | convmixer/timm/models/mlp_mixer.py | https://arxiv.org/abs/2105.08050 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/inception_resnet_v2.py | convmixer/timm/models/inception_resnet_v2.py | https://github.com/tensorflow/models/tree/master/research/adv_imagenet_models | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/cspnet.py | convmixer/timm/models/cspnet.py | https://github.com/WongKinYiu/CrossStagePartialNetworks | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://github.com/facebookresearch/semi-supervised-ImageNet1K-models | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/train.py | convmixer/train_npu.py | https://github.com/NVIDIA/apex/tree/master/examples/imagenet | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/utils/agc.py | convmixer/timm/models/nfnet.py | https://github.com/deepmind/deepmind-research/tree/master/nfnets | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/vision_transformer.py | convmixer/timm/models/mlp_mixer.py | https://github.com/Alibaba-MIIL/ImageNet21K | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/levit.py | convmixer/timm/models/levit.py | https://arxiv.org/abs/2104.01136 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/layers/mixed_conv2d.py | convmixer/timm/models/layers/mixed_conv2d.py | https://arxiv.org/abs/1907.09595 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/tensorflow/tpu/tree/master/models/official/mnasnet | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/xception.py | convmixer/timm/models/xception.py | https://arxiv.org/pdf/1610.02357.pdf | 论文地址 | -| 开源代码引入 | 
https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/DeGirum/pruned-models/releases/tag/efficientnet_v1.0 | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/scheduler/scheduler.py | convmixer/timm/scheduler/scheduler.py | https://github.com/pytorch/fairseq/tree/master/fairseq/optim/lr_scheduler | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/optim/nadam.py | convmixer/timm/optim/nadam.py | http://www.cs.toronto.edu/~fritz/absps/momentum.pdf | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/optim/rmsprop_tf.py | convmixer/timm/optim/rmsprop_tf.py | https://github.com/pytorch/pytorch/blob/063946d2b3f3f1e953a2a3b54e0b34f1393de295/torch/optim/rmsprop.py | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/twins.py | convmixer/timm/models/twins.py | https://arxiv.org/abs/2102.10882 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://github.com/facebookresearch/semi-supervised-ImageNet1K-models/ | 源码实现 | -| 开发引入 | / | convmixer/timm/models/efficientnet_blocks.py | https://arxiv.org/abs/1801.04381v4 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/twins.py | convmixer/timm/models/twins.py | https://github.com/whai362/PVT.git | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet_blocks.py | convmixer/timm/models/efficientnet_blocks.py | https://ai.googleblog.com/2019/08/efficientnet-edgetpu-creating.html | 相关说明 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/vision_transformer_hybrid.py | convmixer/timm/models/resnetv2.py | https://github.com/google-research/vision_transformer | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/optim/madgrad.py | convmixer/timm/optim/madgrad.py | https://github.com/facebookresearch/madgrad | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/optim/lamb.py | convmixer/timm/optim/lamb.py | https://arxiv.org/abs/1904.00962 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/layers/global_context.py | convmixer/timm/models/layers/global_context.py | https://github.com/xvjiarui/GCNet | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/utils/model.py | convmixer/timm/models/layers/std_conv.py | https://arxiv.org/abs/2101.08692 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/utils/model.py | convmixer/timm/utils/model.py | https://gist.github.com/amaarora/6e56942fcb46e67ba203f3009b30d950 | 源码实现 | -| 开源代码引入 | 
https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet_builder.py | convmixer/timm/models/efficientnet_builder.py | https://github.com/tensorflow/tpu/blob/master/models/official/mnasnet/mnasnet_model.py | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/optim/adabelief.py | convmixer/timm/optim/adabelief.py | https://github.com/juntang-zhuang/Adabelief-Optimizer | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/optim/radam.py | convmixer/timm/optim/radam.py | https://arxiv.org/abs/1908.03265 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/layers/lambda_layer.py | convmixer/timm/models/layers/lambda_layer.py | https://github.com/lucidrains/lambda-networks | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/hrnet.py | convmixer/timm/models/hrnet.py | sunk@mail.ustc.edu.cn | 邮箱地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/data/auto_augment.py | convmixer/timm/data/auto_augment.py | https://arxiv.org/abs/1909.13719 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/layers/cond_conv2d.py | convmixer/timm/models/layers/cond_conv2d.py | https://github.com/pytorch/pytorch/issues/17983 | 相关说明 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/pit.py | convmixer/timm/models/pit.py | https://arxiv.org/abs/2103.16302 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/densenet.py | convmixer/timm/models/densenet.py | https://arxiv.org/pdf/1608.06993.pdf | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/beit.py | convmixer/timm/models/beit.py | https://github.com/microsoft/unilm/tree/master/beit | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/data/loader.py | convmixer/timm/data/loader.py | https://github.com/NVIDIA/apex/commit/d5e2bb4bdeedd27b1dfaf5bb2b24d6c000dee9be#diff-cf86c282ff7fba81fad27a559379d5bf | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnest.py | convmixer/timm/models/resnest.py | https://github.com/zhanghang1989/ResNeSt/blob/master/ablation.md | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/layers/eca.py | convmixer/timm/models/layers/eca.py | https://arxiv.org/abs/1910.03151 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/xcit.py | convmixer/timm/models/cait.py | https://arxiv.org/abs/2103.17239 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/layers/gather_excite.py | convmixer/timm/models/layers/gather_excite.py | 
https://arxiv.org/abs/1810.12348 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet/edgetpu | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/optim/lamb.py | convmixer/timm/optim/lamb.py | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/LanguageModeling/Transformer-XL/pytorch/lamb.py | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/vovnet.py | convmixer/timm/models/vovnet.py | https://github.com/youngwanLEE/vovnet-detectron2 | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/optim/nadam.py | convmixer/timm/optim/nadam.py | http://cs229.stanford.edu/proj2015/054_report.pdf | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet_blocks.py | convmixer/timm/models/efficientnet.py | https://arxiv.org/abs/2104.00298 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/res2net.py | convmixer/timm/models/dla.py | https://arxiv.org/abs/1904.01169 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/optim/adabelief.py | convmixer/timm/optim/adabelief.py | https://gist.github.com/juntang-zhuang/517ce3c27022b908bb93f78e4f786dc3 | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/sknet.py | convmixer/timm/models/sknet.py | https://arxiv.org/abs/2001.06268 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/optim/lamb.py | convmixer/timm/optim/lamb.py | https://github.com/cybertronai/pytorch-lamb | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/gluon_xception.py | convmixer/timm/models/gluon_xception.py | https://gluon-cv.mxnet.io/_modules/gluoncv/model_zoo/xception.html | 相关说明 | -| 开发引入 | / | convmixer/timm/models/layers/mlp.py | https://arxiv.org/abs/1612.08083","https://arxiv.org/abs/2002.05202 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/layers/halo_attn.py | convmixer/timm/models/layers/halo_attn.py | https://arxiv.org/abs/2103.12731 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/inception_v3.py | convmixer/timm/models/inception_v3.py | https://gluon-cv.mxnet.io/model_zoo/classification.html | 相关说明 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/layers/non_local_attn.py | convmixer/timm/models/layers/non_local_attn.py | https://openaccess.thecvf.com/content_CVPR_2020/html/Chi_Non-Local_Neural_Networks_With_Grouped_Bilinear_Attentional_Transforms_CVPR_2020_paper.html | 相关说明 | -| 开源代码引入 | 
https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/layers/std_conv.py | convmixer/timm/models/layers/std_conv.py | https://github.com/joe-siyuan-qiao/WeightStandardization | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/cspnet.py | convmixer/timm/models/cspnet.py | https://arxiv.org/abs/1911.11929 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/optim/lookahead.py | convmixer/timm/optim/lookahead.py | https://github.com/alphadl/lookahead.pytorch | 源码实现 | -| 开发引入 | / | convmixer/timm/models/efficientnet.py | 3.5.7.9","3.5.7.9 | 相关说明 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/optim/adafactor.py | convmixer/timm/optim/adafactor.py | https://github.com/pytorch/fairseq/blob/master/fairseq/optim/adafactor.py | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/selecsls.py | convmixer/timm/models/selecsls.py | https://creativecommons.org/licenses/by/4.0/legalcode | 相关说明 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/vision_transformer_hybrid.py | convmixer/timm/models/resnetv2.py | https://arxiv.org/abs/2010.11929 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/vision_transformer_hybrid.py | convmixer/timm/models/layers/patch_embed.py | https://github.com/google-research/vision_transformer | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/layers/halo_attn.py | convmixer/timm/models/layers/bottleneck_attn.py | https://arxiv.org/abs/1904.09925 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/inception_resnet_v2.py | convmixer/timm/models/inception_resnet_v2.py | http://download.tensorflow.org/models/inception_resnet_v2_2016_08_30.tar.gz | 相关说明 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/levit.py | convmixer/timm/models/levit.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/vision_transformer.py | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnetv2.py | convmixer/timm/models/resnetv2.py | https://arxiv.org/abs/2106.05237 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/layers/activations_jit.py | convmixer/timm/models/layers/activations_jit.py | https://arxiv.org/abs/1710.05941 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/README.md | convmixer/timm/models/dpn.py | https://github.com/cypw/DPNs | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet_blocks.py | convmixer/timm/models/efficientnet_blocks.py | https://arxiv.org/abs/2102.05610 | 论文地址 | -| 开源代码引入 | 
https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/train.py | convmixer/train_npu.py | https://github.com/pytorch/examples/tree/master/imagenet | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/optim/adamw.py | convmixer/timm/optim/adamw.py | https://arxiv.org/abs/1412.6980 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/rexnet.py | convmixer/timm/models/rexnet.py | https://arxiv.org/abs/2007.00992 | 论文地址 | -| 开发引入 | / | convmixer/timm/models/inception_v4.py | https://github.com/Cadene/tensorflow-model-zoo.torch | 源码实现 | -| 开发引入 | / | convmixer/timm/models/twins.py | https://github.com/Meituan-AutoML/Twins | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/vision_transformer_hybrid.py | convmixer/timm/models/vision_transformer_hybrid.py | https://github.com/google-research/vision_transformer | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/coat.py | convmixer/timm/models/coat.py | https://github.com/mlpc-ucsd/CoaT | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/hrnet.py | convmixer/timm/models/hrnet.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/scheduler/cosine_lr.py | convmixer/timm/scheduler/cosine_lr.py | https://github.com/allenai/allennlp/blob/master/allennlp/training/learning_rate_schedulers/cosine.py | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/inception_resnet_v2.py | convmixer/timm/models/inception_resnet_v2.py | http://download.tensorflow.org/models/ens_adv_inception_resnet_v2_2017_08_18.tar.gz | 相关说明 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/vision_transformer.py | convmixer/timm/models/vision_transformer.py | https://arxiv.org/abs/2106.10270 | 论文地址 | -| 开发引入 | / | convmixer/timm/models/vgg.py | https://github.com/pytorch/vision | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/layers/non_local_attn.py | convmixer/timm/models/layers/non_local_attn.py | https://github.com/BA-Transform/BAT-Image-Classification | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/utils/agc.py | convmixer/timm/utils/agc.py | https://github.com/deepmind/deepmind-research/tree/master/nfnets | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/dla.py | convmixer/timm/models/dla.py | https://github.com/gasvn/Res2Net/blob/master/dla.py | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/xcit.py | convmixer/timm/models/xcit.py | https://github.com/rwightman/pytorch-image-models/tree/master/timm | 源码实现 | -| 开源代码引入 | 
https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/data/parsers/parser_tfds.py | convmixer/timm/data/parsers/parser_tfds.py | https://www.tensorflow.org/datasets/catalog/overview#image_classification | 相关说明 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/utils/agc.py | convmixer/timm/models/layers/std_conv.py | https://github.com/deepmind/deepmind-research/tree/master/nfnets | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/xcit.py | convmixer/timm/models/xcit.py | https://github.com/facebookresearch/xcit/blob/master/xcit.py | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/dpn.py | convmixer/timm/models/dpn.py | https://github.com/oyam/pytorch-DPNs | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/optim/lookahead.py | convmixer/timm/optim/lookahead.py | https://arxiv.org/abs/1907.08610 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/layers/eca.py | convmixer/timm/models/layers/eca.py | https://github.com/pytorch/pytorch/pull/17240 | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/features.py | convmixer/timm/models/features.py | https://github.com/pytorch/vision/blob/d88d8961ae51507d0cb680329d985b1488b1b76b/torchvision/models/_utils.py | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://arxiv.org/abs/1801.04381 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/layers/activations_me.py | convmixer/timm/models/layers/activations.py | https://arxiv.org/abs/1908.08681 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/utils/misc.py | convmixer/timm/utils/misc.py | http://www.codinghorror.com/blog/archives/001018.html | 相关说明 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/tresnet.py | convmixer/timm/models/tresnet.py | https://arxiv.org/pdf/2003.13630.pdf | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet/condconv | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/vision_transformer.py | convmixer/timm/models/cait.py | https://github.com/facebookresearch/deit | 源码实现 | -| 开发引入 | / | convmixer/timm/models/convit.py | https://github.com/facebookresearch/convit | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/optim/lars.py | convmixer/timm/optim/lars.py | https://github.com/pytorch/pytorch/blob/1.7/torch/optim/sgd.py#L100 | 源码实现 | -| 开源代码引入 | 
https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/senet.py | convmixer/timm/models/pnasnet.py | https://github.com/creafz | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/senet.py | convmixer/timm/models/senet.py | https://github.com/pytorch/vision/blob/master/torchvision/models/resnet.py | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/optim/rmsprop_tf.py | convmixer/timm/optim/rmsprop_tf.py | https://github.com/pytorch/pytorch/blob/master/LICENSE | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/vovnet.py | convmixer/timm/models/layers/squeeze_excite.py | https://arxiv.org/abs/1911.06667 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/optim/adafactor.py | convmixer/timm/optim/adafactor.py | https://arxiv.org/abs/1804.04235 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/data/random_erasing.py | convmixer/timm/data/random_erasing.py | https://github.com/pytorch/pytorch/issues/19508 | 相关说明 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet_builder.py | convmixer/timm/models/efficientnet.py | https://github.com/tensorflow/tpu/blob/master/models/official/efficientnet/efficientnet_model.py | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/data/parsers/parser_tfds.py | convmixer/timm/data/parsers/parser_tfds.py | https://pytorch.org/docs/stable/data.html#multi-process-data-loading | 相关说明 | -| 开发引入 | / | convmixer/timm/models/inception_resnet_v2.py | https://github.com/Cadene/tensorflow-model-zoo.torch | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/optim/lars.py | convmixer/timm/optim/lars.py | https://github.com/NVIDIA/apex/blob/master/apex/parallel/LARC.py | 源码实现 | -| 开发引入 | / | convmixer/timm/data/random_erasing.py | https://github.com/zhunzhong07/Random-Erasing | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/data/parsers/parser_tfds.py | convmixer/timm/data/parsers/parser_tfds.py | https://github.com/tensorflow/datasets | 数据集地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet_builder.py | convmixer/timm/models/efficientnet_builder.py | https://github.com/facebookresearch/maskrcnn-benchmark/blob/master/maskrcnn_benchmark/modeling/backbone/fbnet_builder.py | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/mlp_mixer.py | convmixer/timm/models/mlp_mixer.py | https://arxiv.org/abs/2105.01601 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/inception_v3.py | convmixer/timm/models/inception_v3.py | http://download.tensorflow.org/models/adv_inception_v3_2017_08_18.tar.gz | 相关说明 | -| 开源代码引入 | 
https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet_blocks.py | convmixer/timm/models/efficientnet_blocks.py | https://arxiv.org/abs/1807.11626 | 论文地址 | -| 开发引入 | / | convmixer/timm/models/vision_transformer.py | https://github.com/lucidrains/vit-pytorch | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/res2net.py | convmixer/timm/models/res2net.py | https://github.com/gasvn/Res2Net/blob/master/res2net.py | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/optim/rmsprop_tf.py | convmixer/timm/optim/rmsprop_tf.py | https://arxiv.org/pdf/1308.0850v5.pdf | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/byobnet.py | convmixer/timm/models/byobnet.py | https://arxiv.org/abs/2101.03697 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/data/mixup.py | convmixer/timm/data/mixup.py | https://arxiv.org/abs/1710.09412 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/utils/model.py | convmixer/timm/models/nfnet.py | https://arxiv.org/abs/2101.08692 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/layers/drop.py | convmixer/timm/models/layers/drop.py | https://github.com/tensorflow/tpu/blob/master/models/official/resnet/resnet_model.py#L74 | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/beit.py | convmixer/timm/models/beit.py | https://github.com/facebookresearch/dino | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/selecsls.py | convmixer/timm/models/selecsls.py | https://github.com/mehtadushy/SelecSLS-Pytorch | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/hardcorenas.py | convmixer/timm/models/hardcorenas.py | https://arxiv.org/abs/2102.11646 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/byobnet.py | convmixer/timm/models/byobnet.py | https://github.com/DingXiaoH/RepVGG | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/layers/halo_attn.py | convmixer/timm/models/layers/halo_attn.py | https://arxiv.org/abs/1904.09925 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://arxiv.org/abs/1812.03443 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/swin_transformer.py | convmixer/timm/models/swin_transformer.py | https://arxiv.org/pdf/2103.14030 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/optim/sgdp.py | convmixer/timm/optim/adamp.py | https://arxiv.org/abs/2006.08217 | 
论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/loss/jsd.py | convmixer/timm/loss/jsd.py | https://arxiv.org/abs/1912.02781 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/efficientnet.py | https://arxiv.org/pdf/2002.08258.pdf | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/optim/sgdp.py | convmixer/timm/optim/adamp.py | https://github.com/clovaai/AdamP | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/xcit.py | convmixer/timm/models/xcit.py | https://arxiv.org/abs/2106.09681 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/xcit.py | convmixer/timm/models/xcit.py | https://arxiv.org/abs/2103.17239 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/selecsls.py | convmixer/timm/models/selecsls.py | https://arxiv.org/abs/1907.00837 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/vision_transformer_hybrid.py | convmixer/timm/models/vision_transformer_hybrid.py | https://arxiv.org/abs/2106.TODO | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/data/auto_augment.py | convmixer/timm/data/auto_augment.py | https://github.com/tensorflow/tpu/blob/master/models/official/efficientnet/autoaugment.py | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/tensorflow/tpu/tree/master/models/official/mnasnet/mixnet | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/optim/lamb.py | convmixer/timm/optim/lamb.py | https://github.com/pytorch/pytorch/issues/9190 | 相关说明 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/nasnet.py | convmixer/timm/models/nasnet.py | https://github.com/Cadene/pretrained-models.pytorch | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/layers/split_attn.py | convmixer/timm/models/resnest.py | https://github.com/zhanghang1989/ResNeSt | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/mlp_mixer.py | convmixer/timm/models/mlp_mixer.py | https://arxiv.org/abs/2105.03404 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/optim/nadam.py | convmixer/timm/optim/nadam.py | https://github.com/pytorch/pytorch/pull/1408 | 源码实现 | -| 开发引入 | / | convmixer/timm/models/sknet.py | https://github.com/clovaai/assembled-cnn | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/res2net.py | convmixer/timm/models/res2net.py | 
https://arxiv.org/abs/1904.01169 | 论文地址 | -| 开发引入 | / | convmixer/timm/models/gluon_resnet.py | https://github.com/pytorch/vision | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet/lite | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/data/mixup.py | convmixer/timm/data/mixup.py | https://github.com/clovaai/CutMix-PyTorch | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/utils/agc.py | convmixer/timm/utils/agc.py | https://arxiv.org/abs/2102.06171 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/layers/inplace_abn.py | convmixer/timm/models/layers/inplace_abn.py | inplace_abn.git@v1.0.12 | 邮箱地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/loss/jsd.py | convmixer/timm/data/auto_augment.py | https://github.com/google-research/augmix/blob/master/imagenet.py | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/layers/activations_me.py | convmixer/timm/models/layers/activations_jit.py | https://arxiv.org/abs/1908.08681 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/layers/cond_conv2d.py | convmixer/timm/models/efficientnet.py | https://arxiv.org/abs/1904.04971 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/layers/activations_me.py | convmixer/timm/models/layers/activations_me.py | https://github.com/digantamisra98/H-Mish/blob/0da20d4bc58e696b6803f2523c58d3c8a82782d0/README.md | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/vision_transformer.py | convmixer/timm/models/vision_transformer.py | https://github.com/facebookresearch/deit | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/cait.py | convmixer/timm/models/cait.py | https://arxiv.org/pdf/2003.02436v1.pdf | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/scheduler/poly_lr.py | convmixer/timm/scheduler/poly_lr.py | https://arxiv.org/abs/2004.05909 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://arxiv.org/abs/1911.09665 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/README.md | convmixer/timm/models/resnetv2.py | https://github.com/google-research/big_transfer | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/vovnet.py | convmixer/timm/models/vovnet.py | https://github.com/stigma0617/VoVNet.pytorch/blob/master/models_vovnet/vovnet.py | 源码实现 | -| 开源代码引入 | 
https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/layers/activations_me.py | convmixer/timm/models/layers/activations.py | https://github.com/digantamisra98/H-Mish/blob/0da20d4bc58e696b6803f2523c58d3c8a82782d0/README.md | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/gluon_xception.py | convmixer/timm/models/gluon_xception.py | https://github.com/jfzhang95/pytorch-deeplab-xception | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/xcit.py | convmixer/timm/models/xcit.py | https://github.com/rwightman/pytorch-image-models/pull/747#issuecomment-877795721 | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/xception.py | convmixer/timm/models/xception.py | https://github.com/tstandley/Xception-PyTorch | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/optim/lars.py | convmixer/timm/optim/lars.py | https://arxiv.org/pdf/1708.03888.pdf | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/senet.py | convmixer/timm/models/senet.py | https://github.com/creafz | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/senet.py | convmixer/timm/models/senet.py | https://github.com/hujie-frank/SENet | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet_blocks.py | convmixer/timm/models/efficientnet.py | https://arxiv.org/abs/1807.11626 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/layers/drop.py | convmixer/timm/models/layers/drop.py | https://arxiv.org/abs/1603.09382 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/data/mixup.py | convmixer/timm/data/mixup.py | https://arxiv.org/abs/1905.04899 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/optim/lamb.py | convmixer/timm/optim/lamb.py | https://github.com/HabanaAI/Model-References/blob/2b435114fe8e31f159b1d3063b8280ae37af7423/PyTorch/nlp/bert/pretraining/lamb.py | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/efficientnet.py | convmixer/timm/models/efficientnet.py | https://arxiv.org/abs/1911.04252 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/layers/gather_excite.py | convmixer/timm/models/layers/gather_excite.py | https://github.com/hujie-frank/GENet | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/data/random_erasing.py | convmixer/timm/data/random_erasing.py | https://arxiv.org/pdf/1708.04896.pdf | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/inception_v3.py | 
convmixer/timm/models/inception_v3.py | http://download.tensorflow.org/models/inception_v3_2016_08_28.tar.gz | 相关说明 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/senet.py | convmixer/timm/models/senet.py | https://github.com/Cadene/pretrained-models.pytorch/blob/master/pretrainedmodels/models/senet.py | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/resnet.py | convmixer/timm/models/resnet.py | https://arxiv.org/abs/1905.00546 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/vision_transformer.py | convmixer/timm/models/vision_transformer.py | https://arxiv.org/abs/2012.12877 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/optim/rmsprop_tf.py | convmixer/timm/optim/rmsprop_tf.py | http://www.cs.toronto.edu/~tijmen/csc321/slides/lecture_slides_lec6.pdf | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/layers/activations_me.py | convmixer/timm/models/layers/activations_me.py | https://twitter.com/jeremyphoward/status/1188251041835315200 | 相关说明 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/xception.py | convmixer/timm/models/xception.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/xception-43020ad28.pth | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/sknet.py | convmixer/timm/models/layers/selective_kernel.py | https://arxiv.org/abs/1903.06586 | 论文地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/data/parsers/parser_tfds.py | convmixer/timm/data/parsers/parser_tfds.py | https://github.com/pytorch/pytorch/issues/33413 | 相关说明 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/layers/inplace_abn.py | convmixer/timm/models/layers/inplace_abn.py | https://github.com/mapillary/inplace_abn.git@v1.0.12 | 邮箱地址 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/optim/sgdp.py | convmixer/timm/optim/sgdp.py | https://github.com/clovaai/AdamP | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/data/auto_augment.py | convmixer/timm/data/auto_augment.py | https://github.com/google-research/augmix | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/res2net.py | convmixer/timm/models/res2net.py | https://github.com/gasvn/Res2Net/ | 源码实现 | -| 开源代码引入 | https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/vision_transformer.py | convmixer/timm/models/mlp_mixer.py | https://github.com/facebookresearch/deit | 源码实现 | -| 开发引入 | / | convmixer/timm/models/resnet.py | https://github.com/pytorch/vision | 源码实现 | -| 开源代码引入 | 
https://github.com/locuslab/convmixer/blob/47048118e95721a00385bfe3122519f4b583b26e/pytorch-image-models/timm/models/xcit.py | convmixer/timm/models/beit.py | https://github.com/facebookresearch/deit/ | 源码实现 |
+| 文件位置 | 公网地址 | 公网地址用途 |
+|------------------------------------------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------------------|---------|
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/mkdocs.yml | https://cdnjs.cloudflare.com/ajax/libs/mathjax/2.7.0/MathJax.js?config=TeX-MML-AM_CHTML | MathJax库地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/setup.py | hello@rwightman.com | 作者邮箱 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/beit.py | https://unilm.blob.core.windows.net/beit/beit_large_patch16_512_pt22k_ft22kto1k.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/beit.py | https://unilm.blob.core.windows.net/beit/beit_large_patch16_384_pt22k_ft22kto1k.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/beit.py | https://unilm.blob.core.windows.net/beit/beit_large_patch16_224_pt22k_ft22kto1k.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/beit.py | https://unilm.blob.core.windows.net/beit/beit_large_patch16_224_pt22k_ft22k.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/beit.py | https://unilm.blob.core.windows.net/beit/beit_base_patch16_384_pt22k_ft22kto1k.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/beit.py | https://unilm.blob.core.windows.net/beit/beit_base_patch16_224_pt22k_ft22kto1k.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/beit.py | https://unilm.blob.core.windows.net/beit/beit_base_patch16_224_pt22k_ft22k.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XXS36_384.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XXS36_224.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XXS24_384.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XXS24_224.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XS24_384.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/S36_384.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/S24_384.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/S24_224.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/M48_448.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/M36_384.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/convit.py | https://dl.fbaipublicfiles.com/convit/convit_tiny.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/convit.py | https://dl.fbaipublicfiles.com/convit/convit_small.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/convit.py | https://dl.fbaipublicfiles.com/convit/convit_base.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/densenet.py | https://download.pytorch.org/models/densenet201-c1103571.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/densenet.py | https://download.pytorch.org/models/densenet169-b2777c0a.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/densenet.py | https://download.pytorch.org/models/densenet161-8d451a50.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/densenet.py | https://download.pytorch.org/models/densenet121-a639ec97.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60x_c-b870c45c.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60x-d15cacda.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60-24839fc4.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla46x_c-d761bae7.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla46_c-2bfd52c3.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla34-ba72cf86.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla169-0914e092.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102x2-262837b6.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102x-ad62be81.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102-d94d9790.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb3_pruned_5abcc29f.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb2_pruned_203f55bc.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb1_pruned_9ebb3fe6.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_A_Green_38ms_75.9_23474aeb.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_B_Green_40ms_76.5_1f882d1e.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_C_Green_44ms_77.1_d4148c9e.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_D_Green_50ms_77.4_23e3cdde.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_E_Green_55ms_77.9_90f20e8a.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_F_Green_60ms_78.1_2855edf1.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/inception_v3.py | https://download.pytorch.org/models/inception_v3_google-1a9a5a14.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-384-9bdaf2e2.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-256-13b5763e.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-192-92712e41.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-128S-96703c44.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-128-b88c2750.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/mlp_mixer.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/mixer_b16_224_miil_in21k.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/mlp_mixer.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/mixer_b16_224_miil.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlpB_24_no_dist.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlpB_24_dist.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlpB_24_22k.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_36_no_dist.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_36_dist.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_24_no_dist.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_24_dist.pth | 权重地址 |
+| 
ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_12_no_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_12_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/mobilenetv3.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/mobilenetv3_large_100_in21k_miil.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/mobilenetv3.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/mobilenetv3_large_100_1k_miil_78_0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/regnet.py | https://dl.fbaipublicfiles.com/deit/regnety_160-a5fe301d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/resnet.py | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/resnet.py | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x8-c38310e5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x48-3e41cc8a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x32-e4b90b00.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x16-c6f796b0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext50_32x4-72679e44.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x8-b4712904.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x4-3f87e46b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x16-f3559a9c.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet50-16a12f1b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet18-118f1556.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext50_32x4-ddb3e555.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x8-2cfe2f8b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x4-dc43570a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x16-15fffa57.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet50-08389792.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet18-d92f0532.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45899/outputs/ECAResNet50D_P_9c67f710.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45610/outputs/ECAResNet101D_P_75a3370e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNetLight_4f34b35b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet50D_833caf58.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet101D_281c5844.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/resnet.py | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x4.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/distill/R152x2_T_384.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/distill/R152x2_T_224.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x3-ILSVRC2012.npz | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x1-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x1.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x4-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x4.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x2-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x2.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x3-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x3.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x1-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x1.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/distill/R50x1_224.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/tresnet.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/tresnet_m_miil_in21k.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/tresnet.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/tresnet_m_1k_miil_83_1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/vgg.py | https://download.pytorch.org/models/vgg11-bbd30ac9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/vgg.py | https://download.pytorch.org/models/vgg13-c768596a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/vgg.py | https://download.pytorch.org/models/vgg16-397923af.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/vgg.py | https://download.pytorch.org/models/vgg19-dcbb9e9d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/vgg.py | https://download.pytorch.org/models/vgg11_bn-6002323d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/vgg.py | https://download.pytorch.org/models/vgg13_bn-abd245e5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/vgg.py | https://download.pytorch.org/models/vgg16_bn-6c64b313.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/vgg.py | https://download.pytorch.org/models/vgg19_bn-c79401a0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/xcit.py | 
https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p8_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p8_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p16_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p16_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p8_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p8_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p16_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p16_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p8_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p8_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p16_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p16_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p8_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p8_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p16_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p16_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p8_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p8_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p16_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p16_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p8_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p8_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p16_224_dist.pth 
| 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p16_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p8_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p8_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p16_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p16_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p8_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p16_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p8_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p16_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p8_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p16_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p8_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p16_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p8_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p16_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p8_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p16_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p8_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/convmixer/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p16_384_dist.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/classification/csp_resnext50-mish/public_address_statement.md b/PyTorch/contrib/cv/classification/csp_resnext50-mish/public_address_statement.md index 5fc78da514012e8e59c5d82756169865845f3406..8ca58e38e098725489c76e30b92b6d884f083c37 100644 --- a/PyTorch/contrib/cv/classification/csp_resnext50-mish/public_address_statement.md +++ b/PyTorch/contrib/cv/classification/csp_resnext50-mish/public_address_statement.md @@ -1,564 +1,85 @@ 
-| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------------------------|----------------------------------------------|------------------------|----| -| 开发引入 | / | url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 下载测试图片 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/setup.py | csp_resnext50-mish/setup.py | https://github.com/rwightman/pytorch-image-models | 开源地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/setup.py | csp_resnext50-mish/setup.py | hello@rwightman.com | 作者邮箱 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/byobnet.py | csp_resnext50-mish/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-ger-weights/gernet_s-756b4751.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/byobnet.py | csp_resnext50-mish/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-ger-weights/gernet_m-0873c53a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/byobnet.py | csp_resnext50-mish/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-ger-weights/gernet_l-f31e2e8d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/byobnet.py | csp_resnext50-mish/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_a2-c1ee6d2b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/byobnet.py | csp_resnext50-mish/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b0-80ac3f1b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/byobnet.py | csp_resnext50-mish/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b1-77ca2989.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/byobnet.py | csp_resnext50-mish/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b1g4-abde5d92.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/byobnet.py | csp_resnext50-mish/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b2-25b7494e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/byobnet.py | csp_resnext50-mish/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b2g4-165a85f2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/byobnet.py | 
csp_resnext50-mish/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b3-199bc50d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/byobnet.py | csp_resnext50-mish/timm/models/byobnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-repvgg-weights/repvgg_b3g4-73c370bf.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/cspnet.py | csp_resnext50-mish/timm/models/cspnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/cspresnet50_ra-d3e8d487.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/cspnet.py | csp_resnext50-mish/timm/models/cspnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/cspresnext50_ra_224-648b4713.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/cspnet.py | csp_resnext50-mish/timm/models/cspnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/cspdarknet53_ra_256-d05c7c21.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/densenet.py | csp_resnext50-mish/timm/models/densenet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/densenet121_ra-50efcf5c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/densenet.py | csp_resnext50-mish/timm/models/densenet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/densenetblur121d_ra-100dcfbc.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/densenet.py | csp_resnext50-mish/timm/models/densenet.py | https://download.pytorch.org/models/densenet169-b2777c0a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/densenet.py | csp_resnext50-mish/timm/models/densenet.py | https://download.pytorch.org/models/densenet201-c1103571.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/densenet.py | csp_resnext50-mish/timm/models/densenet.py | https://download.pytorch.org/models/densenet161-8d451a50.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/densenet.py | csp_resnext50-mish/timm/models/densenet.py | https://download.pytorch.org/models/densenet121-a639ec97.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/dla.py | csp_resnext50-mish/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla34-ba72cf86.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/dla.py | csp_resnext50-mish/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla46_c-2bfd52c3.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/dla.py | csp_resnext50-mish/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla46x_c-d761bae7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/dla.py | csp_resnext50-mish/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60x_c-b870c45c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/dla.py | csp_resnext50-mish/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60-24839fc4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/dla.py | csp_resnext50-mish/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60x-d15cacda.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/dla.py | csp_resnext50-mish/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102-d94d9790.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/dla.py | csp_resnext50-mish/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102x-ad62be81.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/dla.py | csp_resnext50-mish/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102x2-262837b6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/dla.py | csp_resnext50-mish/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla169-0914e092.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/dla.py | csp_resnext50-mish/timm/models/dla.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net_dla60_4s-d88db7f9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/dla.py | csp_resnext50-mish/timm/models/dla.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2next_dla60_4s-d327927b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/dpn.py | csp_resnext50-mish/timm/models/dpn.py | https://github.com/rwightman/pytorch-dpn-pretrained/releases/download/v0.1/dpn68-66bebafa7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/dpn.py | csp_resnext50-mish/timm/models/dpn.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/dpn68b_ra-a31ca160.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/dpn.py | csp_resnext50-mish/timm/models/dpn.py | https://github.com/rwightman/pytorch-dpn-pretrained/releases/download/v0.1/dpn92_extra-b040e4a9b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/dpn.py | csp_resnext50-mish/timm/models/dpn.py | 
https://github.com/rwightman/pytorch-dpn-pretrained/releases/download/v0.1/dpn98-5b90dec4d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/dpn.py | csp_resnext50-mish/timm/models/dpn.py | https://github.com/rwightman/pytorch-dpn-pretrained/releases/download/v0.1/dpn131-71dfe43e0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/dpn.py | csp_resnext50-mish/timm/models/dpn.py | https://github.com/rwightman/pytorch-dpn-pretrained/releases/download/v0.1/dpn107_extra-1ac7121e2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mnasnet_b1-74cb7081.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mnasnet_a1-d9418771.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv2_100_ra-b33bc2c4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv2_110d_ra-77090ade.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv2_120d_ra-5987e2ed.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv2_140_ra-21a4e913.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/fbnetc_100-c345b898.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/spnasnet_100-048bc3f4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b0_ra-3dd342df.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b1-533bc792.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b2_ra-bcdf34b7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b2_ra-bcdf34b7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b3_ra2-cf984f9c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b3_ra2-cf984f9c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_es_ra-f111e99c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_em_ra2-66250f76.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/DeGirum/pruned-models/releases/download/efficientnet_v1.0/efficientnet_el.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/DeGirum/pruned-models/releases/download/efficientnet_v1.0/efficientnet_es_pruned75.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/DeGirum/pruned-models/releases/download/efficientnet_v1.0/efficientnet_el_pruned70.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_lite0_ra-37913777.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | 
csp_resnext50-mish/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb1_pruned_9ebb3fe6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb2_pruned_203f55bc.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb3_pruned_5abcc29f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b0_aa-827b6e33.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b1_aa-ea7a6ee0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b2_aa-60c94f97.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b3_aa-84b4657e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b4_aa-818f208c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b5_ra-9a3e5369.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b6_aa-80ba17e4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b7_ra-6c08e654.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b8_ra-572d5dd9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b0_ap-f262efe1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b1_ap-44ef0a3d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b2_ap-2f8e7636.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b3_ap-aad25bdd.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b4_ap-dedb23e6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b5_ap-9e82fae8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b6_ap-4ffb161f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b7_ap-ddb28fec.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b8_ap-00e169fa.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b0_ns-c0e6a31c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b1_ns-99dd0c41.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b2_ns-00306e48.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b3_ns-9d44bf68.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b4_ns-d6313a46.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b5_ns-6f26d0cf.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b6_ns-51548356.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b7_ns-1dbc32de.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_l2_ns_475-bebbd00a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_l2_ns-df73bb44.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_es-ca1afbfe.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_em-e78cfe58.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_el-5143854e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_cc_b0_4e-4362b6b2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_cc_b0_8e-66184a25.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_cc_b1_8e-f7c79ae1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite0-0aa007d2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite1-bde8b488.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite2-dcccb7df.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite3-b733e338.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite4-741542c3.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mixnet_s-a907afbc.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mixnet_m-4647fc68.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mixnet_l-5a9a2ed8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mixnet_xl_ra-aac3c00c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mixnet_s-89d3354b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mixnet_m-0f4d8805.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mixnet_l-6c92e0c8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/gluon_resnet.py | csp_resnext50-mish/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet18_v1b-0757602b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/gluon_resnet.py | csp_resnext50-mish/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet34_v1b-c6d82d59.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/gluon_resnet.py | csp_resnext50-mish/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1b-0ebe02e2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/gluon_resnet.py | csp_resnext50-mish/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1b-3b017079.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/gluon_resnet.py | csp_resnext50-mish/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1b-c1edb0dd.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/gluon_resnet.py | csp_resnext50-mish/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1c-48092f55.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/gluon_resnet.py | csp_resnext50-mish/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1c-1f26822a.pth | 下载权重文件 | -| 
开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/gluon_resnet.py | csp_resnext50-mish/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1c-a3bb0b98.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/gluon_resnet.py | csp_resnext50-mish/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1d-818a1b1b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/gluon_resnet.py | csp_resnext50-mish/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1d-0f9c8644.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/gluon_resnet.py | csp_resnext50-mish/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1d-bd354e12.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/gluon_resnet.py | csp_resnext50-mish/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1s-1762acc0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/gluon_resnet.py | csp_resnext50-mish/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1s-60fe0cc1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/gluon_resnet.py | csp_resnext50-mish/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1s-dcc41b81.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/gluon_resnet.py | csp_resnext50-mish/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnext50_32x4d-e6a097c1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/gluon_resnet.py | csp_resnext50-mish/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnext101_32x4d-b253c8c4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/gluon_resnet.py | csp_resnext50-mish/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnext101_64x4d-f9a8e184.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/gluon_resnet.py | csp_resnext50-mish/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_seresnext50_32x4d-90cf2d6e.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/gluon_resnet.py | csp_resnext50-mish/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_seresnext101_32x4d-cf52900d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/gluon_resnet.py | csp_resnext50-mish/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_seresnext101_64x4d-f9926f93.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/gluon_resnet.py | csp_resnext50-mish/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_senet154-70a1a3c0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/gluon_xception.py | csp_resnext50-mish/timm/models/gluon_xception.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gluon_xception-7015a15c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/hardcorenas.py | csp_resnext50-mish/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_A_Green_38ms_75.9_23474aeb.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/hardcorenas.py | csp_resnext50-mish/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_B_Green_40ms_76.5_1f882d1e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/hardcorenas.py | csp_resnext50-mish/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_C_Green_44ms_77.1_d4148c9e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/hardcorenas.py | csp_resnext50-mish/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_D_Green_50ms_77.4_23e3cdde.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/hardcorenas.py | csp_resnext50-mish/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_E_Green_55ms_77.9_90f20e8a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/hardcorenas.py | csp_resnext50-mish/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_F_Green_60ms_78.1_2855edf1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/hrnet.py | csp_resnext50-mish/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnet_w18_small_v1-f460c6bc.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/hrnet.py | csp_resnext50-mish/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnet_w18_small_v2-4c50a8cb.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/hrnet.py | csp_resnext50-mish/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w18-8cb57bb9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/hrnet.py | csp_resnext50-mish/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w30-8d7f8dab.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/hrnet.py | csp_resnext50-mish/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w32-90d8c5fb.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/hrnet.py | csp_resnext50-mish/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w40-7cd397a4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/hrnet.py | csp_resnext50-mish/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w44-c9ac8c18.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/hrnet.py | csp_resnext50-mish/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w48-abd2e6ab.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/hrnet.py | csp_resnext50-mish/timm/models/hrnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-hrnet/hrnetv2_w64-b47cc881.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/inception_resnet_v2.py | csp_resnext50-mish/timm/models/inception_resnet_v2.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/inception_resnet_v2-940b1cd6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/inception_resnet_v2.py | csp_resnext50-mish/timm/models/inception_resnet_v2.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ens_adv_inception_resnet_v2-2592a550.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/inception_v3.py | csp_resnext50-mish/timm/models/inception_v3.py | https://download.pytorch.org/models/inception_v3_google-1a9a5a14.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/inception_v3.py | csp_resnext50-mish/timm/models/inception_v3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_inception_v3-e0069de4.pth | 
下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/inception_v3.py | csp_resnext50-mish/timm/models/inception_v3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/adv_inception_v3-9e27bd63.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/inception_v3.py | csp_resnext50-mish/timm/models/inception_v3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gluon_inception_v3-9f746940.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/inception_v4.py | csp_resnext50-mish/timm/models/inception_v4.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/inceptionv4-8e4777a0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/mobilenetv3.py | csp_resnext50-mish/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv3_large_100_ra-f55367f5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/mobilenetv3.py | csp_resnext50-mish/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv3_100-35495452.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/mobilenetv3.py | csp_resnext50-mish/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_large_075-150ee8b0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/mobilenetv3.py | csp_resnext50-mish/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_large_100-427764d5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/mobilenetv3.py | csp_resnext50-mish/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_large_minimal_100-8596ae28.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/mobilenetv3.py | csp_resnext50-mish/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_small_075-da427f52.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/mobilenetv3.py | csp_resnext50-mish/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_small_100-37f49e2b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/mobilenetv3.py | csp_resnext50-mish/timm/models/mobilenetv3.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mobilenetv3_small_minimal_100-922a7843.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/nasnet.py | csp_resnext50-mish/timm/models/nasnet.py | http://data.lip6.fr/cadene/pretrainedmodels/nasnetalarge-a1897284.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/nfnet.py | csp_resnext50-mish/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f0-604f9c3a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/nfnet.py | csp_resnext50-mish/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f1-fc540f82.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/nfnet.py | csp_resnext50-mish/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f2-89875923.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/nfnet.py | csp_resnext50-mish/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f3-d74ab3aa.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/nfnet.py | csp_resnext50-mish/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f4-0ac5b10b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/nfnet.py | csp_resnext50-mish/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f5-ecb20ab1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/nfnet.py | csp_resnext50-mish/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-dnf-weights/dm_nfnet_f6-e0f12116.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/nfnet.py | csp_resnext50-mish/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecanfnet_l0_ra2-e3e9ac50.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/nfnet.py | csp_resnext50-mish/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/nf_regnet_b1_256_ra2-ad85cfef.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/nfnet.py | csp_resnext50-mish/timm/models/nfnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/nf_resnet50_ra2-9f236009.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/pnasnet.py | csp_resnext50-mish/timm/models/pnasnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/pnasnet5large-bf079911.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/regnet.py | csp_resnext50-mish/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_002-e7e85e5c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/regnet.py | csp_resnext50-mish/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_004-7d0e9424.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/regnet.py | csp_resnext50-mish/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_006-85ec1baa.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/regnet.py | csp_resnext50-mish/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_008-d8b470eb.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/regnet.py | csp_resnext50-mish/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_016-65ca972a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/regnet.py | csp_resnext50-mish/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_032-ed0c7f7e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/regnet.py | csp_resnext50-mish/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_040-73c2a654.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/regnet.py | csp_resnext50-mish/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_064-29278baa.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/regnet.py | csp_resnext50-mish/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_080-7c7fcab1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/regnet.py | csp_resnext50-mish/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_120-65d5521e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/regnet.py | csp_resnext50-mish/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_160-c98c4112.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/regnet.py | csp_resnext50-mish/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnetx_320-8ea38b93.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/regnet.py | csp_resnext50-mish/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_002-e68ca334.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/regnet.py | csp_resnext50-mish/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_004-0db870e6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/regnet.py | csp_resnext50-mish/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_006-c67e57ec.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/regnet.py | csp_resnext50-mish/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_008-dc900dbe.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/regnet.py | csp_resnext50-mish/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_016-54367f74.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/regnet.py | csp_resnext50-mish/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/regnety_032_ra-7f2439f9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/regnet.py | csp_resnext50-mish/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_040-f0d569f9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/regnet.py | csp_resnext50-mish/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_064-0a48325c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/regnet.py | csp_resnext50-mish/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_080-e7f3eb93.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/regnet.py | csp_resnext50-mish/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_120-721ba79a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/regnet.py | csp_resnext50-mish/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_160-d64013cd.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/regnet.py | csp_resnext50-mish/timm/models/regnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-regnet/regnety_320-ba464b29.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/res2net.py | csp_resnext50-mish/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net50_26w_4s-06e79181.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/res2net.py | csp_resnext50-mish/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net50_48w_2s-afed724a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/res2net.py | csp_resnext50-mish/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net50_14w_8s-6527dddc.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/res2net.py | csp_resnext50-mish/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net50_26w_6s-19041792.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/res2net.py | csp_resnext50-mish/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net50_26w_8s-2c7c9f12.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/res2net.py | csp_resnext50-mish/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2net101_26w_4s-02a759a1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/res2net.py | csp_resnext50-mish/timm/models/res2net.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-res2net/res2next50_4s-6ef7e7bf.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnest.py | csp_resnext50-mish/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gluon_resnest14-9c8fe254.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnest.py | csp_resnext50-mish/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gluon_resnest26-50eb607c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnest.py | csp_resnext50-mish/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest50-528c19ca.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnest.py | csp_resnext50-mish/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest101-22405ba7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnest.py | csp_resnext50-mish/timm/models/resnest.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest200-75117900.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnest.py | csp_resnext50-mish/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest269-0cc87c48.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnest.py | csp_resnext50-mish/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest50_fast_4s2x40d-41d14ed0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnest.py | csp_resnext50-mish/timm/models/resnest.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-resnest/resnest50_fast_1s4x24d-d4a4f76f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnet.py | csp_resnext50-mish/timm/models/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnet.py | csp_resnext50-mish/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet18d_ra2-48a79e06.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnet.py | csp_resnext50-mish/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet34-43635321.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnet.py | csp_resnext50-mish/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet34d_ra2-f8dcfcaf.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnet.py | csp_resnext50-mish/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet26-9aa10e23.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnet.py | csp_resnext50-mish/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet26d-69e92c46.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnet.py | csp_resnext50-mish/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet50_ram-a26f946b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnet.py | csp_resnext50-mish/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet50d_ra2-464e36ba.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnet.py | csp_resnext50-mish/timm/models/resnet.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet101d_ra2-2803ffab.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnet.py | csp_resnext50-mish/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet152d_ra2-5cac0439.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnet.py | csp_resnext50-mish/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet200d_ra2-bdba9bf9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnet.py | csp_resnext50-mish/timm/models/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnet.py | csp_resnext50-mish/timm/models/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnet.py | csp_resnext50-mish/timm/models/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnet.py | csp_resnext50-mish/timm/models/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnet.py | csp_resnext50-mish/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/wide_resnet50_racm-8234f177.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnet.py | csp_resnext50-mish/timm/models/resnet.py | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnet.py | csp_resnext50-mish/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnext50_32x4d_ra-d733960d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnet.py | csp_resnext50-mish/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnext50d_32x4d-103e99f8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnet.py | csp_resnext50-mish/timm/models/resnet.py | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnet.py | csp_resnext50-mish/timm/models/resnet.py | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnet.py | csp_resnext50-mish/timm/models/resnet.py | 
https://download.pytorch.org/models/ig_resnext101_32x8-c38310e5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnet.py | csp_resnext50-mish/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x16-c6f796b0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnet.py | csp_resnext50-mish/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x32-e4b90b00.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnet.py | csp_resnext50-mish/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x48-3e41cc8a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnet.py | csp_resnext50-mish/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet18-d92f0530.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnet.py | csp_resnext50-mish/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet50-08389792.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnet.py | csp_resnext50-mish/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext50_32x4-ddb3e555.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnet.py | csp_resnext50-mish/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x4-dc43570a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnet.py | csp_resnext50-mish/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x8-2cfe2f8b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnet.py | csp_resnext50-mish/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x16-15fffa57.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnet.py | csp_resnext50-mish/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet18-118f1556.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnet.py | csp_resnext50-mish/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet50-16a12f1b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnet.py | csp_resnext50-mish/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext50_32x4-72679e44.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnet.py | csp_resnext50-mish/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x4-3f87e46b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnet.py | csp_resnext50-mish/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x8-b4712904.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnet.py | csp_resnext50-mish/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x16-f3559a9c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnet.py | csp_resnext50-mish/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnet50_ra_224-8efdb4bb.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnet.py | csp_resnext50-mish/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnet152d_ra2-04464dd2.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnet.py | csp_resnext50-mish/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext26d_32x4d-80fa48a3.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnet.py | csp_resnext50-mish/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext26tn_32x4d-569cb627.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnet.py | csp_resnext50-mish/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext50_32x4d_racm-a304a460.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnet.py | csp_resnext50-mish/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecaresnet26t_ra2-46609757.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnet.py | csp_resnext50-mish/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNetLight_4f34b35b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnet.py | csp_resnext50-mish/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet50D_833caf58.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnet.py | csp_resnext50-mish/timm/models/resnet.py | 
https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45899/outputs/ECAResNet50D_P_9c67f710.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnet.py | csp_resnext50-mish/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecaresnet50t_ra2-f7ac63c4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnet.py | csp_resnext50-mish/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet101D_281c5844.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnet.py | csp_resnext50-mish/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45610/outputs/ECAResNet101D_P_75a3370e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnet.py | csp_resnext50-mish/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecaresnet269d_320_ra2-7baa55cb.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnet.py | csp_resnext50-mish/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnetblur50-84f4748f.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnetv2.py | csp_resnext50-mish/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x1-ILSVRC2012.npz | 下载数据集 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnetv2.py | csp_resnext50-mish/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x3-ILSVRC2012.npz | 下载数据集 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnetv2.py | csp_resnext50-mish/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x1-ILSVRC2012.npz | 下载数据集 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnetv2.py | csp_resnext50-mish/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x3-ILSVRC2012.npz | 下载数据集 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnetv2.py | csp_resnext50-mish/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x2-ILSVRC2012.npz | 下载数据集 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnetv2.py | csp_resnext50-mish/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x4-ILSVRC2012.npz | 下载数据集 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnetv2.py | csp_resnext50-mish/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x1.npz | 下载数据集 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnetv2.py | csp_resnext50-mish/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x3.npz | 下载数据集 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnetv2.py | csp_resnext50-mish/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x1.npz | 下载数据集 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnetv2.py | csp_resnext50-mish/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x3.npz | 下载数据集 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnetv2.py | csp_resnext50-mish/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x2.npz | 下载数据集 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnetv2.py | csp_resnext50-mish/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x4.npz | 下载数据集 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/rexnet.py | csp_resnext50-mish/timm/models/rexnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rexnet/rexnetv1_100-1b4dddf4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/rexnet.py | csp_resnext50-mish/timm/models/rexnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rexnet/rexnetv1_130-590d768e.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/rexnet.py | csp_resnext50-mish/timm/models/rexnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rexnet/rexnetv1_150-bd1a6aa8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/rexnet.py | csp_resnext50-mish/timm/models/rexnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rexnet/rexnetv1_200-8c0b7f2d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/selecsls.py | csp_resnext50-mish/timm/models/selecsls.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-selecsls/selecsls42b-8af30141.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/selecsls.py | csp_resnext50-mish/timm/models/selecsls.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-selecsls/selecsls60-bbf87526.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/selecsls.py | csp_resnext50-mish/timm/models/selecsls.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-selecsls/selecsls60b-94e619b5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/senet.py | csp_resnext50-mish/timm/models/senet.py | http://data.lip6.fr/cadene/pretrainedmodels/senet154-c7b49a05.pth | 
下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/senet.py | csp_resnext50-mish/timm/models/senet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnet18-4bb0ce65.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/senet.py | csp_resnext50-mish/timm/models/senet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnet34-a4004e63.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/senet.py | csp_resnext50-mish/timm/models/senet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/se_resnet50-ce0d4300.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/senet.py | csp_resnext50-mish/timm/models/senet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/se_resnet101-7e38fcc6.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/senet.py | csp_resnext50-mish/timm/models/senet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/se_resnet152-d17c99b7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/senet.py | csp_resnext50-mish/timm/models/senet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext26_32x4d-65ebdb501.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/senet.py | csp_resnext50-mish/timm/models/senet.py | http://data.lip6.fr/cadene/pretrainedmodels/se_resnext50_32x4d-a260b3a4.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/senet.py | csp_resnext50-mish/timm/models/senet.py | http://data.lip6.fr/cadene/pretrainedmodels/se_resnext101_32x4d-3b2fe3d8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/sknet.py | csp_resnext50-mish/timm/models/sknet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/skresnet18_ra-4eec2804.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/sknet.py | csp_resnext50-mish/timm/models/sknet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/skresnet34_ra-bdc0ccde.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/sknet.py | csp_resnext50-mish/timm/models/sknet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/skresnext50_ra-f40e40bf.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/tresnet.py | csp_resnext50-mish/timm/models/tresnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-tresnet/tresnet_m_80_8-dbc13962.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/tresnet.py | csp_resnext50-mish/timm/models/tresnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-tresnet/tresnet_l_81_5-235b486c.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/tresnet.py | csp_resnext50-mish/timm/models/tresnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-tresnet/tresnet_xl_82_0-a2d51b00.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/tresnet.py | csp_resnext50-mish/timm/models/tresnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-tresnet/tresnet_m_448-bc359d10.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/tresnet.py | csp_resnext50-mish/timm/models/tresnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-tresnet/tresnet_l_448-940d0cd1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/tresnet.py | csp_resnext50-mish/timm/models/tresnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-tresnet/tresnet_xl_448-8c1815de.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/vgg.py | csp_resnext50-mish/timm/models/vgg.py | https://download.pytorch.org/models/vgg11-bbd30ac9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/vgg.py | csp_resnext50-mish/timm/models/vgg.py | https://download.pytorch.org/models/vgg13-c768596a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/vgg.py | csp_resnext50-mish/timm/models/vgg.py | https://download.pytorch.org/models/vgg16-397923af.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/vgg.py | csp_resnext50-mish/timm/models/vgg.py | https://download.pytorch.org/models/vgg19-dcbb9e9d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/vgg.py | csp_resnext50-mish/timm/models/vgg.py | https://download.pytorch.org/models/vgg11_bn-6002323d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/vgg.py | csp_resnext50-mish/timm/models/vgg.py | https://download.pytorch.org/models/vgg13_bn-abd245e5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/vgg.py | csp_resnext50-mish/timm/models/vgg.py | https://download.pytorch.org/models/vgg16_bn-6c64b313.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/vgg.py | csp_resnext50-mish/timm/models/vgg.py | https://download.pytorch.org/models/vgg19_bn-c79401a0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/vision_transformer.py | 
csp_resnext50-mish/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/vit_small_p16_224-15ec54c9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/vision_transformer.py | csp_resnext50-mish/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_p16_224-80ecf9dd.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/vision_transformer.py | csp_resnext50-mish/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_p16_384-83fb41ba.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/vision_transformer.py | csp_resnext50-mish/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_p32_384-830016f5.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/vision_transformer.py | csp_resnext50-mish/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_large_p16_224-4ee7a4dc.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/vision_transformer.py | csp_resnext50-mish/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_large_p16_384-b3be5167.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/vision_transformer.py | csp_resnext50-mish/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_large_p32_384-9b920ba8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/vision_transformer.py | csp_resnext50-mish/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_patch16_224_in21k-e5005f0a.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/vision_transformer.py | csp_resnext50-mish/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_patch32_224_in21k-8db57226.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/vision_transformer.py | csp_resnext50-mish/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_large_patch16_224_in21k-606da67d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/vision_transformer.py | csp_resnext50-mish/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_large_patch32_224_in21k-9046d2e7.pth | 下载权重文件 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/vision_transformer.py | csp_resnext50-mish/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_resnet50_224_in21k-6f7c7740.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/vision_transformer.py | csp_resnext50-mish/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_resnet50_384-9fd3c705.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/vision_transformer.py | csp_resnext50-mish/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_tiny_patch16_224-a1311bcf.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/vision_transformer.py | csp_resnext50-mish/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_small_patch16_224-cd65a155.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/vision_transformer.py | csp_resnext50-mish/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_224-b5f2ef4d.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/vision_transformer.py | csp_resnext50-mish/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_384-8de9b5d1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/vision_transformer.py | csp_resnext50-mish/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_tiny_distilled_patch16_224-b40b3cf7.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/vision_transformer.py | csp_resnext50-mish/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_small_distilled_patch16_224-649709d9.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/vision_transformer.py | csp_resnext50-mish/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_224-df68dfff.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/vision_transformer.py | csp_resnext50-mish/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_384-d0272ac0.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/vovnet.py | csp_resnext50-mish/timm/models/vovnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ese_vovnet19b_dw-a8741004.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/vovnet.py | csp_resnext50-mish/timm/models/vovnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ese_vovnet39b-f912fe73.pth | 下载权重文件 | 
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/xception.py | csp_resnext50-mish/timm/models/xception.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-cadene/xception-43020ad28.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/xception_aligned.py | csp_resnext50-mish/timm/models/xception_aligned.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_xception_41-e6439c97.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/xception_aligned.py | csp_resnext50-mish/timm/models/xception_aligned.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_xception_65-c9ae96e8.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/xception_aligned.py | csp_resnext50-mish/timm/models/xception_aligned.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_xception_71-8eec7df1.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/utils/agc.py | csp_resnext50-mish/timm/utils/agc.py | https://arxiv.org/abs/2102.06171 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/tensorflow/tpu/tree/master/models/official/mnasnet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/rexnet.py | csp_resnext50-mish/timm/models/rexnet.py | https://arxiv.org/abs/2007.00992 | 论文地址 | -| 开发引入 | / | csp_resnext50-mish/timm/models/layers/drop.py | https://github.com/tensorflow/tpu/issues/494#issuecomment-532968956 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/inception_v3.py | csp_resnext50-mish/timm/models/inception_v3.py | https://github.com/pytorch/vision/blob/master/LICENSE | license地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/layers/cond_conv2d.py | https://arxiv.org/abs/1904.04971 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/byobnet.py | csp_resnext50-mish/timm/models/byobnet.py | https://arxiv.org/abs/2101.03697 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/byobnet.py | csp_resnext50-mish/timm/models/byobnet.py | https://github.com/DingXiaoH/RepVGG | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/vovnet.py | csp_resnext50-mish/timm/models/vovnet.py | https://github.com/youngwanLEE/vovnet-detectron2 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnest.py | csp_resnext50-mish/timm/models/layers/split_attn.py | https://arxiv.org/abs/2004.08955 | 论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/layers/drop.py | csp_resnext50-mish/timm/models/layers/drop.py | https://arxiv.org/abs/1603.09382 | 论文地址 | -| 开发引入 | / | csp_resnext50-mish/timm/models/resnet.py | https://github.com/facebookresearch/semi-supervised-ImageNet1K-models/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/optim/sgdp.py | csp_resnext50-mish/timm/optim/sgdp.py | https://arxiv.org/abs/2006.08217 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/vision_transformer_sam.py | csp_resnext50-mish/timm/models/resnetv2.py | https://arxiv.org/abs/2010.11929 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnest.py | csp_resnext50-mish/timm/models/resnest.py | https://github.com/zhanghang1989/ResNeSt/blob/master/ablation.md | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/layers/drop.py | csp_resnext50-mish/timm/models/layers/drop.py | https://arxiv.org/abs/1810.12890 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/_efficientnet_builder.py | csp_resnext50-mish/timm/models/efficientnet_builder.py | https://github.com/tensorflow/tpu/blob/master/models/official/mnasnet/mnasnet_model.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/data/readers/reader_tfds.py | csp_resnext50-mish/timm/data/parsers/parser_tfds.py | https://github.com/tensorflow/datasets | 数据集地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/inception_resnet_v2.py | csp_resnext50-mish/timm/models/inception_resnet_v2.py | http://download.tensorflow.org/models/inception_resnet_v2_2016_08_30.tar.gz | 下载链接 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/selecsls.py | csp_resnext50-mish/timm/models/selecsls.py | https://github.com/mehtadushy/SelecSLS-Pytorch | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/inception_v3.py | csp_resnext50-mish/timm/models/inception_v3.py | http://download.tensorflow.org/models/inception_v3_2016_08_28.tar.gz | 下载链接 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/optim/rmsprop_tf.py | csp_resnext50-mish/timm/optim/adamw.py | https://arxiv.org/abs/1711.05101 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/optim/adafactor.py | csp_resnext50-mish/timm/optim/adafactor.py | https://arxiv.org/abs/1804.04235 | 论文地址 | -| 开发引入 | / | csp_resnext50-mish/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-S-R152x4.npz | 下载链接 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/vovnet.py | csp_resnext50-mish/timm/models/vovnet.py | https://github.com/stigma0617/VoVNet.pytorch/blob/master/models_vovnet/vovnet.py | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/optim/radam.py | csp_resnext50-mish/timm/optim/radam.py | https://arxiv.org/abs/1908.03265 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/optim/sgdp.py | csp_resnext50-mish/timm/optim/adamp.py | https://arxiv.org/abs/2006.08217 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://arxiv.org/abs/1904.04971 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/nasnet.py | csp_resnext50-mish/timm/models/nasnet.py | https://github.com/Cadene/pretrained-models.pytorch | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/optim/lookahead.py | csp_resnext50-mish/timm/optim/lookahead.py | https://github.com/alphadl/lookahead.pytorch | 源码实现 | -| 开发引入 | / | csp_resnext50-mish/timm/models/sknet.py | https://github.com/clovaai/assembled-cnn | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/optim/lookahead.py | csp_resnext50-mish/timm/optim/lookahead.py | https://arxiv.org/abs/1907.08610 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/inception_resnet_v2.py | csp_resnext50-mish/timm/models/inception_resnet_v2.py | http://download.tensorflow.org/models/ens_adv_inception_resnet_v2_2017_08_18.tar.gz | 下载链接 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/senet.py | csp_resnext50-mish/timm/models/pnasnet.py | https://github.com/creafz | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/data/readers/reader_tfds.py | csp_resnext50-mish/timm/data/parsers/parser_tfds.py | https://github.com/pytorch/pytorch/issues/33413 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/layers/activations_me.py | csp_resnext50-mish/timm/models/layers/activations_me.py | https://github.com/digantamisra98/H-Mish/blob/0da20d4bc58e696b6803f2523c58d3c8a82782d0/README.md | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/layers/cond_conv2d.py | csp_resnext50-mish/timm/models/layers/cond_conv2d.py | https://github.com/pytorch/pytorch/issues/17983 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/layers/inplace_abn.py | csp_resnext50-mish/timm/models/layers/inplace_abn.py | https://github.com/mapillary/inplace_abn.git@v1.0.12 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/optim/nadam.py | csp_resnext50-mish/timm/optim/nadam.py | http://www.cs.toronto.edu/~fritz/absps/momentum.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/res2net.py | csp_resnext50-mish/timm/models/dla.py | https://github.com/gasvn/Res2Net/ | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/layers/activations_me.py | csp_resnext50-mish/timm/models/layers/activations_me.py | https://twitter.com/jeremyphoward/status/1188251041835315200 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/vovnet.py | csp_resnext50-mish/timm/models/layers/se.py | https://arxiv.org/abs/1911.06667 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/selecsls.py | csp_resnext50-mish/sotabench.py | https://github.com/mehtadushy/SelecSLS-Pytorch | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/optim/adafactor.py | csp_resnext50-mish/timm/optim/adafactor.py | https://github.com/pytorch/fairseq/blob/master/fairseq/optim/adafactor.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/optim/adamw.py | csp_resnext50-mish/timm/optim/adamw.py | https://arxiv.org/abs/1412.6980 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/optim/nadam.py | csp_resnext50-mish/timm/optim/nadam.py | https://github.com/pytorch/pytorch/pull/1408 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnet.py | csp_resnext50-mish/timm/models/vgg.py | https://github.com/pytorch/vision | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/res2net.py | csp_resnext50-mish/timm/models/res2net.py | https://github.com/gasvn/Res2Net/blob/master/res2net.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/inception_v3.py | csp_resnext50-mish/timm/models/inception_v3.py | https://gluon-cv.mxnet.io/model_zoo/classification.html | 相关说明 | -| 开发引入 | / | csp_resnext50-mish/timm/models/resnet.py | https://arxiv.org/pdf/1812.01187 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/byobnet.py | csp_resnext50-mish/timm/models/byobnet.py | https://arxiv.org/abs/2006.14090 | 论文地址 | -| 开发引入 | / | csp_resnext50-mish/timm/data/random_erasing.py | https://github.com/zhunzhong07/Random-Erasing | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/vision_transformer.py | csp_resnext50-mish/timm/models/vision_transformer.py | https://github.com/lucidrains/vit-pytorch | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/layers/split_attn.py | csp_resnext50-mish/timm/models/resnest.py | https://github.com/zhanghang1989/ResNeSt | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/_efficientnet_builder.py | csp_resnext50-mish/timm/models/efficientnet_builder.py | https://github.com/tensorflow/tpu/blob/master/models/official/mnasnet/mnasnet_models.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/_efficientnet_builder.py | csp_resnext50-mish/timm/models/efficientnet.py | 
https://github.com/tensorflow/tpu/blob/master/models/official/efficientnet/efficientnet_model.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/layers/activations_me.py | csp_resnext50-mish/timm/models/layers/activations_me.py | https://arxiv.org/abs/1908.08681 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/optim/adamp.py | csp_resnext50-mish/timm/optim/adamp.py | https://github.com/clovaai/AdamP/blob/master/adamp/adamp.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://arxiv.org/abs/1904.02877 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/cspnet.py | csp_resnext50-mish/timm/models/cspnet.py | https://arxiv.org/abs/1911.11929 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://arxiv.org/abs/1812.03443 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/densenet.py | csp_resnext50-mish/timm/models/densenet.py | https://arxiv.org/pdf/1608.06993.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/senet.py | csp_resnext50-mish/timm/models/senet.py | https://github.com/pytorch/vision/blob/master/torchvision/models/resnet.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/data/mixup.py | csp_resnext50-mish/timm/data/mixup.py | https://arxiv.org/abs/1710.09412 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/layers/activations_me.py | csp_resnext50-mish/timm/models/layers/activations.py | https://arxiv.org/abs/1908.08681 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/utils/model_ema.py | csp_resnext50-mish/timm/utils/model_ema.py | https://www.tensorflow.org/api_docs/python/tf/train/ExponentialMovingAverage | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/hardcorenas.py | csp_resnext50-mish/timm/models/hardcorenas.py | https://github.com/Alibaba-MIIL/HardCoReNAS | 源码实现 | -| 开发引入 | / | csp_resnext50-mish/timm/models/gluon_resnet.py | https://github.com/dmlc/gluon-cv/blob/master/gluoncv/model_zoo/resnet.py | 源码实现 | -| 开发引入 | / | csp_resnext50-mish/timm/models/inception_v4.py | https://github.com/Cadene/tensorflow-model-zoo.torch | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/layers/activations_me.py | csp_resnext50-mish/timm/models/layers/activations_jit.py | https://github.com/digantamisra98/H-Mish/blob/0da20d4bc58e696b6803f2523c58d3c8a82782d0/README.md | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnet.py | csp_resnext50-mish/timm/models/resnet.py | https://github.com/facebookresearch/semi-supervised-ImageNet1K-models | 源码实现 | -| 
开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/_features.py | csp_resnext50-mish/timm/models/features.py | https://github.com/pytorch/vision/blob/d88d8961ae51507d0cb680329d985b1488b1b76b/torchvision/models/_utils.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnet.py | csp_resnext50-mish/timm/models/resnet.py | https://github.com/facebookresearch/WSL-Images | 源码实现 | -| 开发引入 | / | csp_resnext50-mish/timm/models/gluon_xception.py | https://gluon-cv.mxnet.io/_modules/gluoncv/model_zoo/xception.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet/edgetpu | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://arxiv.org/pdf/2002.08258.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/selecsls.py | csp_resnext50-mish/timm/models/selecsls.py | https://creativecommons.org/licenses/by/4.0/legalcode | license地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/optim/nvnovograd.py | csp_resnext50-mish/timm/optim/novograd.py | https://arxiv.org/abs/1905.11286 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/README.md | csp_resnext50-mish/timm/models/resnet.py | https://arxiv.org/abs/1905.00546 | 论文地址 | -| 开发引入 | / | csp_resnext50-mish/timm/models/inception_resnet_v2.py | https://github.com/Cadene/tensorflow-model-zoo.torch | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/res2net.py | csp_resnext50-mish/timm/models/res2net.py | https://github.com/gasvn/Res2Net/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/_efficientnet_blocks.py | csp_resnext50-mish/timm/models/efficientnet.py | https://arxiv.org/abs/1807.11626 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/layers/eca.py | csp_resnext50-mish/timm/models/layers/eca.py | https://arxiv.org/abs/1910.03151 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/sknet.py | csp_resnext50-mish/timm/models/sknet.py | https://arxiv.org/abs/1903.06586 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://arxiv.org/abs/1911.09665 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/vovnet.py | csp_resnext50-mish/timm/models/vovnet.py | https://arxiv.org/abs/1911.06667 | 论文地址 | -| 开发引入 | / | csp_resnext50-mish/timm/models/gluon_xception.py | https://github.com/jfzhang95/pytorch-deeplab-xception | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/layers/eca.py | csp_resnext50-mish/timm/models/layers/eca.py | https://github.com/VRandme | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnetv2.py | csp_resnext50-mish/timm/models/resnetv2.py | https://github.com/KaimingHe/resnet-1k-layers/blob/master/resnet-pre-act.lua | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/hrnet.py | csp_resnext50-mish/timm/models/hrnet.py | https://github.com/HRNet/HRNet-Image-Classification | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/optim/sgdp.py | csp_resnext50-mish/timm/optim/adamp.py | https://github.com/clovaai/AdamP | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/utils/misc.py | csp_resnext50-mish/timm/utils/misc.py | http://www.codinghorror.com/blog/archives/001018.html | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/pnasnet.py | csp_resnext50-mish/timm/models/pnasnet.py | https://github.com/Cadene/pretrained-models.pytorch/blob/master/pretrainedmodels/models/pnasnet.py | 源码实现 | -| 开发引入 | / | csp_resnext50-mish/timm/models/inception_resnet_v2.py | https://arxiv.org/abs/1705.07204 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/optim/nadamw.py | csp_resnext50-mish/timm/optim/adamw.py | https://openreview.net/forum?id=ryQu7f-RZ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/senet.py | csp_resnext50-mish/timm/models/senet.py | https://github.com/Cadene/pretrained-models.pytorch/blob/master/pretrainedmodels/models/senet.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/DeGirum/pruned-models/releases/tag/efficientnet_v1.0 | 源码实现 | -| 开发引入 | / | csp_resnext50-mish/timm/optim/novograd.py | https://github.com/convergence-lab/novograd | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/utils/agc.py | csp_resnext50-mish/timm/models/nfnet.py | https://github.com/deepmind/deepmind-research/tree/master/nfnets | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://arxiv.org/abs/1907.09595 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/hardcorenas.py | csp_resnext50-mish/timm/models/hardcorenas.py | https://arxiv.org/abs/2102.11646 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/loss/jsd.py | csp_resnext50-mish/timm/loss/jsd.py | https://github.com/google-research/augmix/blob/master/imagenet.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | 
csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/facebookresearch/maskrcnn-benchmark/blob/master/maskrcnn_benchmark/modeling/backbone/fbnet_modeldef.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/README.md | csp_resnext50-mish/timm/models/pnasnet.py | https://arxiv.org/abs/1712.00559 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/hrnet.py | csp_resnext50-mish/timm/models/hrnet.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/layers/activations_me.py | csp_resnext50-mish/timm/models/layers/activations_jit.py | https://arxiv.org/abs/1908.08681 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/optim/nvnovograd.py | csp_resnext50-mish/timm/optim/nvnovograd.py | https://github.com/NVIDIA/DeepLearningExamples/blob/master/PyTorch/SpeechRecognition/Jasper | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://arxiv.org/abs/1911.04252 | 论文地址 | -| 开发引入 | / | csp_resnext50-mish/timm/optim/npu_fused_sgd.py | http://www.cs.toronto.edu/%7Ehinton/absps/momentum.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/selecsls.py | csp_resnext50-mish/timm/models/selecsls.py | https://arxiv.org/abs/1907.00837 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/data/auto_augment.py | csp_resnext50-mish/timm/data/auto_augment.py | https://arxiv.org/abs/1805.09501 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/train.py | csp_resnext50-mish/train-1p.py | https://github.com/NVIDIA/apex/tree/master/examples/imagenet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/data/readers/reader_tfds.py | csp_resnext50-mish/timm/data/parsers/parser_tfds.py | https://www.tensorflow.org/datasets/catalog/overview#image_classification | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/optim/sgdp.py | csp_resnext50-mish/timm/optim/sgdp.py | https://github.com/clovaai/AdamP/blob/master/adamp/sgdp.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/_efficientnet_builder.py | csp_resnext50-mish/timm/models/efficientnet_builder.py | https://github.com/tensorflow/tpu/blob/master/models/official/efficientnet/efficientnet_model.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/vovnet.py | csp_resnext50-mish/timm/models/vovnet.py | https://arxiv.org/abs/1904.09730 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/senet.py | csp_resnext50-mish/timm/models/senet.py | https://github.com/creafz | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py 
| csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet/lite | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/deit.py | csp_resnext50-mish/timm/models/vision_transformer.py | https://arxiv.org/abs/2012.12877 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/cspnet.py | csp_resnext50-mish/timm/models/cspnet.py | https://github.com/WongKinYiu/CrossStagePartialNetworks | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/layers/cbam.py | csp_resnext50-mish/timm/models/layers/cbam.py | https://arxiv.org/abs/1807.06521 | 论文地址 | -| 开发引入 | / | csp_resnext50-mish/timm/models/inception_resnet_v2.py | https://github.com/tensorflow/models/tree/master/research/adv_imagenet_models | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/data/real_labels.py | csp_resnext50-mish/timm/data/real_labels.py | https://github.com/google-research/reassessed-imagenet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/optim/rmsprop_tf.py | csp_resnext50-mish/timm/optim/rmsprop_tf.py | https://arxiv.org/abs/1711.05101 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/optim/sgdp.py | csp_resnext50-mish/timm/optim/sgdp.py | https://github.com/clovaai/AdamP | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/tresnet.py | csp_resnext50-mish/timm/models/tresnet.py | https://arxiv.org/pdf/2003.13630.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/xception.py | csp_resnext50-mish/timm/models/xception.py | https://github.com/tstandley/Xception-PyTorch | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/layers/drop.py | csp_resnext50-mish/timm/models/layers/drop.py | https://github.com/clovaai/assembled-cnn/blob/master/nets/blocks.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/train.py | csp_resnext50-mish/train-1p.py | https://github.com/pytorch/examples/tree/master/imagenet | 源码实现 | -| 开发引入 | / | csp_resnext50-mish/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-S-R50x3.npz | 下载链接 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/tensorflow/tpu/tree/master/models/official/mnasnet/mixnet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/hfdocs/source/models.mdx | csp_resnext50-mish/timm/models/byobnet.py | https://github.com/idstcv/GPU-Efficient-Networks | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/loss/jsd.py | csp_resnext50-mish/timm/data/auto_augment.py | https://github.com/google-research/augmix/blob/master/imagenet.py | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/optim/rmsprop_tf.py | csp_resnext50-mish/timm/optim/rmsprop_tf.py | https://github.com/pytorch/pytorch/blob/master/LICENSE | license地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/senet.py | csp_resnext50-mish/timm/models/senet.py | https://github.com/hujie-frank/SENet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnet.py | csp_resnext50-mish/timm/models/resnet.py | https://arxiv.org/pdf/2002.08258.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/res2net.py | csp_resnext50-mish/timm/models/res2net.py | https://arxiv.org/abs/1904.01169 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/layers/activations_me.py | csp_resnext50-mish/timm/models/layers/activations.py | https://github.com/digantamisra98/H-Mish/blob/0da20d4bc58e696b6803f2523c58d3c8a82782d0/README.md | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/densenet.py | csp_resnext50-mish/timm/models/densenet.py | https://arxiv.org/pdf/1707.06990.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/dla.py | csp_resnext50-mish/timm/models/dla.py | https://github.com/gasvn/Res2Net/blob/master/dla.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/optim/adahessian.py | csp_resnext50-mish/timm/optim/adahessian.py | https://github.com/davda54/ada-hessian/blob/master/ada_hessian.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/xception.py | csp_resnext50-mish/timm/models/xception.py | https://arxiv.org/pdf/1610.02357.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/layers/drop.py | csp_resnext50-mish/timm/models/layers/drop.py | https://github.com/tensorflow/tpu/blob/master/models/official/resnet/resnet_model.py#L74 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/layers/drop.py | csp_resnext50-mish/timm/models/layers/drop.py | https://arxiv.org/pdf/1810.12890.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/inception_v3.py | csp_resnext50-mish/timm/models/inception_v3.py | http://download.tensorflow.org/models/adv_inception_v3_2017_08_18.tar.gz | 下载链接 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/scheduler/scheduler.py | csp_resnext50-mish/timm/scheduler/scheduler.py | https://github.com/allenai/allennlp/tree/master/allennlp/training/learning_rate_schedulers | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnet.py | csp_resnext50-mish/timm/models/densenet.py | https://github.com/pytorch/vision | 源码实现 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/optim/rmsprop_tf.py | csp_resnext50-mish/timm/optim/rmsprop_tf.py | https://github.com/pytorch/pytorch/blob/063946d2b3f3f1e953a2a3b54e0b34f1393de295/torch/optim/rmsprop.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/optim/radam.py | csp_resnext50-mish/timm/optim/radam.py | https://github.com/LiyuanLucasLiu/RAdam | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/loss/jsd.py | csp_resnext50-mish/timm/data/auto_augment.py | https://arxiv.org/abs/1912.02781 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/layers/split_attn.py | csp_resnext50-mish/timm/models/layers/split_attn.py | https://github.com/zhanghang1989/ResNeSt | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/data/random_erasing.py | csp_resnext50-mish/timm/data/random_erasing.py | https://github.com/pytorch/pytorch/issues/19508 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/data/mixup.py | csp_resnext50-mish/timm/data/mixup.py | https://github.com/clovaai/CutMix-PyTorch | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/vision_transformer_hybrid.py | csp_resnext50-mish/timm/models/resnetv2.py | https://github.com/google-research/vision_transformer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/scheduler/cosine_lr.py | csp_resnext50-mish/timm/scheduler/cosine_lr.py | https://github.com/allenai/allennlp/blob/master/allennlp/training/learning_rate_schedulers/cosine.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/tensorflow/tpu/tree/master/models/official/efficientnet/condconv | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/sknet.py | csp_resnext50-mish/timm/models/layers/selective_kernel.py | https://arxiv.org/abs/1903.06586 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/layers/mixed_conv2d.py | csp_resnext50-mish/timm/models/layers/mixed_conv2d.py | https://github.com/tensorflow/tpu/blob/master/models/official/mnasnet/mixnet/custom_layers.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/vision_transformer.py | csp_resnext50-mish/timm/models/vision_transformer.py | https://github.com/google-research/vision_transformer/blob/00883dd691c63a6830751563748663526e811cee/vit_jax/checkpoint.py#L224 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/dla.py | csp_resnext50-mish/timm/models/dla.py | https://arxiv.org/abs/1707.06484 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/rexnet.py | csp_resnext50-mish/timm/models/rexnet.py | 
https://github.com/clovaai/rexnet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/utils/model.py | csp_resnext50-mish/timm/models/layers/std_conv.py | https://arxiv.org/abs/2101.08692 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/layers/split_attn.py | csp_resnext50-mish/sotabench.py | https://github.com/zhanghang1989/ResNeSt | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/utils/agc.py | csp_resnext50-mish/timm/models/nfnet.py | https://arxiv.org/abs/2102.06171 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/layers/cond_conv2d.py | csp_resnext50-mish/timm/models/layers/cond_conv2d.py | https://github.com/tensorflow/tpu/blob/master/models/official/efficientnet/condconv/condconv_layers.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/layers/std_conv.py | csp_resnext50-mish/timm/models/layers/std_conv.py | https://arxiv.org/abs/1903.10520v2 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/sknet.py | csp_resnext50-mish/timm/models/sknet.py | https://arxiv.org/abs/2001.06268 | 论文地址 | -| 开发引入 | / | csp_resnext50-mish/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-S-R101x3.npz | 下载链接 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/README.md | csp_resnext50-mish/timm/models/dpn.py | https://github.com/cypw/DPNs | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/vision_transformer.py | csp_resnext50-mish/timm/models/vision_transformer.py | https://github.com/karpathy/minGPT | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/layers/mixed_conv2d.py | https://arxiv.org/abs/1907.09595 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnetv2.py | csp_resnext50-mish/timm/models/resnetv2.py | https://arxiv.org/abs/1912.11370 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/dpn.py | csp_resnext50-mish/timm/models/dpn.py | https://github.com/oyam/pytorch-DPNs | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/utils/agc.py | csp_resnext50-mish/timm/utils/agc.py | https://github.com/deepmind/deepmind-research/tree/master/nfnets | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/optim/rmsprop_tf.py | csp_resnext50-mish/timm/optim/rmsprop_tf.py | https://arxiv.org/pdf/1308.0850v5.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/layers/activations_jit.py | csp_resnext50-mish/timm/models/layers/activations.py | https://arxiv.org/abs/1710.05941 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/regnet.py 
| csp_resnext50-mish/timm/models/regnet.py | https://arxiv.org/abs/2003.13678 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/hfdocs/source/models.mdx | csp_resnext50-mish/timm/models/resnetv2.py | https://github.com/google-research/big_transfer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/res2net.py | csp_resnext50-mish/timm/models/dla.py | https://arxiv.org/abs/1904.01169 | 论文地址 | -| 开发引入 | / | csp_resnext50-mish/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-S-R152x2.npz | 下载链接 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/utils/model.py | csp_resnext50-mish/timm/models/nfnet.py | https://arxiv.org/abs/2101.08692 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/regnet.py | csp_resnext50-mish/timm/models/regnet.py | https://github.com/facebookresearch/pycls/blob/master/pycls/models/regnet.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/tresnet.py | csp_resnext50-mish/timm/models/tresnet.py | https://github.com/mrT23/TResNet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/data/random_erasing.py | csp_resnext50-mish/timm/data/random_erasing.py | https://arxiv.org/pdf/1708.04896.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/optim/nadam.py | csp_resnext50-mish/timm/optim/nadam.py | http://cs229.stanford.edu/proj2015/054_report.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/xception_aligned.py | csp_resnext50-mish/timm/models/xception_aligned.py | https://github.com/tensorflow/models/blob/master/research/deeplab/g3doc/model_zoo.md | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/vgg.py | csp_resnext50-mish/timm/models/vgg.py | https://arxiv.org/pdf/1409.1556.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/train.py | csp_resnext50-mish/train-8p.py | https://github.com/pytorch/examples/tree/master/imagenet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/scheduler/cosine_lr.py | csp_resnext50-mish/timm/scheduler/cosine_lr.py | https://arxiv.org/abs/1608.03983 | 论文地址 | -| 开发引入 | / | csp_resnext50-mish/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-S-R50x1.npz | 下载链接 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/scheduler/scheduler.py | csp_resnext50-mish/timm/scheduler/scheduler.py | https://github.com/pytorch/fairseq/tree/master/fairseq/optim/lr_scheduler | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/layers/activations_jit.py | csp_resnext50-mish/timm/models/layers/activations_jit.py | https://arxiv.org/abs/1710.05941 | 论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/data/readers/reader_tfds.py | csp_resnext50-mish/timm/data/parsers/parser_tfds.py | https://pytorch.org/docs/stable/data.html#multi-process-data-loading | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/README.md | csp_resnext50-mish/timm/models/inception_resnet_v2.py | https://arxiv.org/abs/1602.07261 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/optim/rmsprop_tf.py | csp_resnext50-mish/timm/optim/rmsprop_tf.py | http://www.cs.toronto.edu/~tijmen/csc321/slides/lecture_slides_lec6.pdf | 论文地址 | -| 开发引入 | / | csp_resnext50-mish/timm/models/resnet.py | https://pytorch.org/hub/facebookresearch_WSL-Images_resnext/ | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnet.py | csp_resnext50-mish/timm/models/resnet.py | https://github.com/pytorch/vision | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/README.md | csp_resnext50-mish/timm/models/resnet.py | https://arxiv.org/abs/1805.00932 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/_efficientnet_blocks.py | csp_resnext50-mish/timm/models/mobilenetv3.py | https://arxiv.org/abs/1905.02244 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/layers/weight_init.py | csp_resnext50-mish/timm/models/layers/weight_init.py | https://people.sc.fsu.edu/~jburkardt/presentations/truncated_normal.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/layers/eca.py | csp_resnext50-mish/timm/models/layers/eca.py | https://arxiv.org/pdf/1910.03151.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/layers/eca.py | csp_resnext50-mish/timm/models/layers/eca.py | https://github.com/pytorch/pytorch/pull/17240 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/data/mixup.py | csp_resnext50-mish/timm/data/mixup.py | https://arxiv.org/abs/1905.04899 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/data/auto_augment.py | csp_resnext50-mish/timm/data/auto_augment.py | https://github.com/tensorflow/tpu/blob/master/models/official/efficientnet/autoaugment.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/hrnet.py | csp_resnext50-mish/timm/models/hrnet.py | sunk@mail.ustc.edu.cn | 邮箱地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/train.py | csp_resnext50-mish/train-8p.py | https://github.com/NVIDIA/apex/tree/master/examples/imagenet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/optim/nvnovograd.py | csp_resnext50-mish/timm/optim/nvnovograd.py | https://arxiv.org/abs/1905.11286 | 论文地址 | -| 开源代码引入 | 
https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/_efficientnet_blocks.py | csp_resnext50-mish/timm/models/efficientnet.py | https://arxiv.org/abs/1905.11946 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://arxiv.org/pdf/1807.11626.pdf | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/metaformer.py | csp_resnext50-mish/timm/models/efficientnet.py | https://arxiv.org/abs/1801.04381 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/data/auto_augment.py | csp_resnext50-mish/timm/data/auto_augment.py | https://github.com/google-research/augmix | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/data/auto_augment.py | csp_resnext50-mish/timm/data/auto_augment.py | https://arxiv.org/abs/1909.13719 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/vision_transformer_hybrid.py | csp_resnext50-mish/timm/models/vision_transformer.py | https://github.com/google-research/vision_transformer | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/vision_transformer_sam.py | csp_resnext50-mish/timm/models/vision_transformer.py | https://arxiv.org/abs/2010.11929 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnet.py | csp_resnext50-mish/timm/models/gluon_resnet.py | https://github.com/pytorch/vision | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/layers/eca.py | csp_resnext50-mish/timm/models/layers/eca.py | https://github.com/BangguWu/ECANet | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/data/real_labels.py | csp_resnext50-mish/timm/data/real_labels.py | https://arxiv.org/abs/2006.07159 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/utils/agc.py | csp_resnext50-mish/timm/utils/agc.py | https://gist.github.com/lucidrains/0d6560077edac419ab5d3aa29e674d5c | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/mlp_mixer.py | csp_resnext50-mish/timm/models/vision_transformer.py | https://github.com/facebookresearch/deit | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/data/loader.py | csp_resnext50-mish/timm/data/loader.py | https://github.com/NVIDIA/apex/commit/d5e2bb4bdeedd27b1dfaf5bb2b24d6c000dee9be#diff-cf86c282ff7fba81fad27a559379d5bf | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/efficientnet.py | csp_resnext50-mish/timm/models/efficientnet.py | https://github.com/tensorflow/models/blob/master/research/slim/nets/mobilenet/mobilenet_v2.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/loss/jsd.py | 
csp_resnext50-mish/timm/loss/jsd.py | https://arxiv.org/abs/1912.02781 | 论文地址 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/data/auto_augment.py | csp_resnext50-mish/timm/data/auto_augment.py | https://arxiv.org/abs/1906.11172 | 论文地址 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/resnest.py | csp_resnext50-mish/timm/models/resnest.py | https://arxiv.org/abs/2004.08955 | 论文地址 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/scheduler/tanh_lr.py | csp_resnext50-mish/timm/scheduler/tanh_lr.py | https://arxiv.org/abs/1806.01593 | 论文地址 |
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models/blob/d584e7f617a4d0f1a0b4838227bd1f8852dfa236/timm/models/_efficientnet_builder.py | csp_resnext50-mish/timm/models/efficientnet_builder.py | https://github.com/facebookresearch/maskrcnn-benchmark/blob/master/maskrcnn_benchmark/modeling/backbone/fbnet_builder.py | 源码实现 |
+| 文件位置 | 公网地址 | 公网地址用途 |
+|---------------------------------------------------------------------------------------------------------|--------------------------------------------------------------------------------------------------------------------------|---------|
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/setup.py | hello@rwightman.com | 作者邮箱 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/densenet.py | https://download.pytorch.org/models/densenet121-a639ec97.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/densenet.py | https://download.pytorch.org/models/densenet201-c1103571.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/densenet.py | https://download.pytorch.org/models/densenet169-b2777c0a.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/densenet.py | https://download.pytorch.org/models/densenet161-8d451a50.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60x_c-b870c45c.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60x-d15cacda.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60-24839fc4.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla46x_c-d761bae7.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla46_c-2bfd52c3.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla34-ba72cf86.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla169-0914e092.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102x2-262837b6.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102x-ad62be81.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102-d94d9790.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb1_pruned_9ebb3fe6.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb2_pruned_203f55bc.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb3_pruned_5abcc29f.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_F_Green_60ms_78.1_2855edf1.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_E_Green_55ms_77.9_90f20e8a.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_D_Green_50ms_77.4_23e3cdde.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_C_Green_44ms_77.1_d4148c9e.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_B_Green_40ms_76.5_1f882d1e.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_A_Green_38ms_75.9_23474aeb.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/inception_v3.py | https://download.pytorch.org/models/inception_v3_google-1a9a5a14.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/resnet.py | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/resnet.py | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/resnet.py | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x8-b4712904.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet18-d92f0530.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet50-08389792.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x16-15fffa57.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x4-dc43570a.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x8-2cfe2f8b.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext50_32x4-ddb3e555.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet18-118f1556.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet50-16a12f1b.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x16-f3559a9c.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x4-3f87e46b.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext50_32x4-72679e44.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet101D_281c5844.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet50D_833caf58.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNetLight_4f34b35b.pth | 权重地址 |
+| 
ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45610/outputs/ECAResNet101D_P_75a3370e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45899/outputs/ECAResNet50D_P_9c67f710.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x8-c38310e5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x48-3e41cc8a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x32-e4b90b00.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x16-c6f796b0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x1.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x1-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x3.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x3-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x2.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x2-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x4.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x4-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x1.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x1-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x3.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x3-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/vgg.py | https://download.pytorch.org/models/vgg19-dcbb9e9d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/vgg.py | https://download.pytorch.org/models/vgg19_bn-c79401a0.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/vgg.py | https://download.pytorch.org/models/vgg16-397923af.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/vgg.py | https://download.pytorch.org/models/vgg16_bn-6c64b313.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/vgg.py | https://download.pytorch.org/models/vgg13-c768596a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/vgg.py | https://download.pytorch.org/models/vgg13_bn-abd245e5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/vgg.py | https://download.pytorch.org/models/vgg11-bbd30ac9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/vgg.py | https://download.pytorch.org/models/vgg11_bn-6002323d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_224-df68dfff.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_384-d0272ac0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_224-b5f2ef4d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_384-8de9b5d1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_small_distilled_patch16_224-649709d9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_small_patch16_224-cd65a155.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_tiny_distilled_patch16_224-b40b3cf7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_tiny_patch16_224-a1311bcf.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/csp_resnext50-mish/url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 数据集地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/classification/pointnetCNN/public_address_statement.md b/PyTorch/contrib/cv/classification/pointnetCNN/public_address_statement.md index 180d5db77da256aeedf20d496742f513a6263b31..a649f124c12903ba5cdcbbef40eabe2ca36c4cc5 100644 --- a/PyTorch/contrib/cv/classification/pointnetCNN/public_address_statement.md +++ b/PyTorch/contrib/cv/classification/pointnetCNN/public_address_statement.md @@ -1,11 +1,6 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------------------------|----------------------------------------------|------------------------|----| -| 开发引入 | / | url.ini | https://shapenet.cs.stanford.edu/ericyi/shapenetcore_partanno_v0.zip | 下载数据集 | -| 开发引入 | / | url.ini | https://shapenet.cs.stanford.edu/media/shapenet_part_seg_hdf5_data.zip | 下载数据集 
| -| 开发引入 | / | url.ini | https://shapenet.cs.stanford.edu/media/modelnet40_ply_hdf5_2048.zip | 下载数据集 | -| 开发引入 | / | url.ini | https://shapenet.cs.stanford.edu/media/indoor3d_sem_seg_hdf5_data.zip | 下载数据集 | -| 开发引入 | / | pointnetCNN/utils/tf_util.py | http://stackoverflow.com/questions/33949786/how-could-i-use-batch-normalization-in-tensorflow | 相关说明 | -| 开发引入 | / | pointnetCNN/utils/eulerangles.py | http://www.graphicsgems.org/ | 相关说明 | -| 开发引入 | / | pointnetCNN/utils/eulerangles.py | http://mathworld.wolfram.com/EulerParameters.html | 相关说明 | -| 开发引入 | / | pointnetCNN/utils/plyfile.py | http://www.gnu.org/licenses/ | license地址 | -| 开发引入 | / | pointnetCNN/utils/eulerangles.py | http://en.wikipedia.org/wiki/Quaternions#Hamilton_product | 相关说明 | +| 文件位置 | 公网地址 | 公网地址用途 | +|------------------------------------------------------------------------|------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/pointnetCNN/url.ini | https://shapenet.cs.stanford.edu/ericyi/shapenetcore_partanno_v0.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/pointnetCNN/url.ini | https://shapenet.cs.stanford.edu/media/shapenet_part_seg_hdf5_data.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/pointnetCNN/url.ini | https://shapenet.cs.stanford.edu/media/modelnet40_ply_hdf5_2048.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/classification/pointnetCNN/url.ini | https://shapenet.cs.stanford.edu/media/indoor3d_sem_seg_hdf5_data.zip | 数据集链接 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/detection/AdvancedEAST/README.raw.md b/PyTorch/contrib/cv/detection/AdvancedEAST/README.raw.md index 7e0639054a616a8543dd4557f0af691f8931ea4e..38e1a0cfbd18fdd9bdea93319412c2699f4e5dbb 100644 --- a/PyTorch/contrib/cv/detection/AdvancedEAST/README.raw.md +++ b/PyTorch/contrib/cv/detection/AdvancedEAST/README.raw.md @@ -44,7 +44,7 @@ If this project is helpful to you, welcome to star. 
# Training
* tianchi ICPR dataset download
-链接: https://pan.baidu.com/s/1NSyc-cHKV3IwDo6qojIrKA 密码: ye9y
+用户自行准备数据集
* prepare training data: make data root dir(train_1000), copy images to root dir, and copy txts to root dir,
diff --git a/PyTorch/contrib/cv/detection/Cascade_RCNN/public_address_statement.md b/PyTorch/contrib/cv/detection/Cascade_RCNN/public_address_statement.md
index 6e1e4b2dcaa1b4456dfdc18667aeb81f5f109f42..3d4bef63c421a7507e0f4bc6cff27551ad03a7ec 100644
--- a/PyTorch/contrib/cv/detection/Cascade_RCNN/public_address_statement.md
+++ b/PyTorch/contrib/cv/detection/Cascade_RCNN/public_address_statement.md
@@ -1,108 +1,13 @@
-| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 |
-|--------|---------------------------------------------------------------------------------------------------------------------------------|-------------------------------------------------|------------------------------------------------------------------------------------------------------------------|--------|
-| 开发引入 | / | url.ini | https://dl.fbaipublicfiles.com/detectron2 | 下载数据集 |
-| 开源代码引入 | 
https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/docker/Dockerfile-circleci | Cascade_RCNN/docker/Dockerfile-circleci | https://bootstrap.pypa.io/get-pip.py | 下载第三方包 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/docker/Dockerfile-circleci | Cascade_RCNN/docker/Dockerfile-circleci | https://download.pytorch.org/whl/cu101/torch_stable.html | 下载第三方包 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/docs/conf.py | Cascade_RCNN/docs/conf.py | https://github.com/facebookresearch/detectron2/blob/master/ | 源码链接 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/docs/conf.py | Cascade_RCNN/docs/conf.py | https://docs.python.org/3.6 | 源码链接 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/docs/conf.py | Cascade_RCNN/docs/conf.py | https://docs.scipy.org/doc/numpy/ | 源码链接 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/docs/conf.py | Cascade_RCNN/docs/conf.py | https://pytorch.org/docs/master/ | 源码链接 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/docs/conf.py | Cascade_RCNN/docs/conf.py | https://arxiv.org/abs/ | 论文链接 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/setup.py | Cascade_RCNN/setup.py | https://github.com/facebookresearch/detectron2 | 第三包链接 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/setup.py | Cascade_RCNN/setup.py | https://github.com/psf/black@673327449f86fce558adde153bb6cbe54bfebad2 | 第三包链接 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/tests/data/test_coco_evaluation.py | Cascade_RCNN/tests/data/test_coco_evaluation.py | http://images.cocodataset.org/val2017/000000000285.jpg | 测试数据 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/tests/data/test_coco_evaluation.py | Cascade_RCNN/tests/data/test_coco_evaluation.py | http://farm8.staticflickr.com/7434/9138147604_c6225224b8_z.jpg | 测试数据 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/tests/data/test_coco_evaluation.py | Cascade_RCNN/tests/data/test_coco_evaluation.py | http://images.cocodataset.org/val2017/000000000139.jpg | 测试数据 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/tests/tests/test_model_zoo.py | Cascade_RCNN/tests/test_model_zoo.py | https://dl.fbaipublicfiles.com/detectron2/Misc/scratch_mask_rcnn_R_50_FPN_3x_gn/138602908/model_final_01ca85.pkl | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/evaluation/sem_seg_evaluation.py | Cascade_RCNN/detectron2/evaluation/sem_seg_evaluation.py | http://cocodataset.org/#format-results | 数据集地址 | -| 开发引入 | / | Cascade_RCNN/detectron2/engine/train_loop.py | https://arxiv.org/abs/2006.15704 | 论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/data/datasets/register_coco.py | Cascade_RCNN/detectron2/data/datasets/coco.py | http://cocodataset.org/#format-data | 数据集地址 | -| 开源代码引入 | 
https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/data/transforms/augmentation_impl.py | Cascade_RCNN/detectron2/data/transforms/augmentation_impl.py | https://pillow.readthedocs.io/en/3.0.x/reference/ImageEnhance.html | 相关依赖 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/projects/DensePose/densepose/data/datasets/coco.py | Cascade_RCNN/detectron2/data/datasets/lvis.py | http://farm6.staticflickr.com/5454/9413846304_881d5e5c3b_z.jpg | 图片地址 | -| 开发引入 | / | Cascade_RCNN/detectron2/evaluation/coco_evaluation.py | https://github.com/facebookresearch/Detectron/blob/a6a835f5b8208c45d0dce217ce9bbda915f44df7/detectron/datasets/json_dataset_evaluator.py#L255 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/modeling/poolers.py | Cascade_RCNN/detectron2/modeling/poolers.py | https://github.com/pytorch/pytorch/issues/41412 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/data/transforms/transform.py | Cascade_RCNN/detectron2/data/transforms/transform.py | https://pillow.readthedocs.io/en/latest/PIL.html#PIL.ImageTransform.ExtentTransform | 相关依赖 | -| 开发引入 | / | Cascade_RCNN/detectron2/layers/aspp.py | https://github.com/tensorflow/models/blob/21b73d22f3ed05b650e85ac50849408dd36de32e/research/deeplab/model.py#L532 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/data/transforms/transform.py | Cascade_RCNN/detectron2/data/transforms/transform.py | https://github.com/opencv/opencv/issues/11784 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/config/defaults.py | Cascade_RCNN/detectron2/config/defaults.py | https://pillow.readthedocs.io/en/stable/handbook/concepts.html#concept-modes | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/engine/launch.py | Cascade_RCNN/detectron2/engine/launch.py | https://github.com/facebookresearch/maskrcnn-benchmark/issues/172 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/modeling/proposal_generator/rpn.py | Cascade_RCNN/detectron2/export/torchscript.py | https://github.com/pytorch/pytorch/issues/41449 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/data/datasets/coco.py | Cascade_RCNN/detectron2/data/datasets/coco.py | https://github.com/facebookresearch/detectron2/pull/175#issuecomment-551202163 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/evaluation/pascal_voc_evaluation.py | Cascade_RCNN/detectron2/evaluation/pascal_voc_evaluation.py | https://github.com/rbgirshick/py-faster-rcnn/blob/master/lib/datasets/voc_eval.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/layers/batch_norm.py | Cascade_RCNN/detectron2/layers/batch_norm.py | https://github.com/pytorch/pytorch/pull/36382 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/datasets/prepare_cocofied_lvis.py | Cascade_RCNN/datasets/prepare_cocofied_lvis.py | 
https://github.com/lvis-dataset/lvis-api/blob/master/data/coco_to_synset.json | 相关配置 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/layers/csrc/vision.cpp | Cascade_RCNN/detectron2/layers/csrc/vision.cpp | https://github.com/pytorch/pytorch/blob/master/aten/src/ATen/cuda/detail/CUDAHooks.cpp#L231 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/layers/csrc/vision.cpp | Cascade_RCNN/detectron2/layers/csrc/vision.cpp | https://github.com/pytorch/pytorch/blob/master/aten/src/ATen/Version.cpp | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/structures/image_list.py | Cascade_RCNN/detectron2/structures/image_list.py | https://github.com/pytorch/pytorch/issues/31734 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/tests/modeling/test_matcher.py | Cascade_RCNN/tests/modeling/test_matcher.py | https://github.com/pytorch/pytorch/pull/38378 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/engine/defaults.py | Cascade_RCNN/detectron2/engine/defaults.py | https://github.com/sphinx-doc/sphinx/issues/4258 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/evaluation/sem_seg_evaluation.py | Cascade_RCNN/detectron2/evaluation/sem_seg_evaluation.py | http://cocodataset.org/#stuff-eval | 数据集地址 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/layers/csrc/deformable/deform_conv_cuda_kernel.cu | Cascade_RCNN/detectron2/layers/csrc/deformable/deform_conv_cuda_kernel.cu | https://github.com/chengdazhi/Deformable-Convolution-V2-PyTorch/blob/mmdetection/mmdet/ops/dcn/src/deform_conv_cuda_kernel.cu | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/layers/csrc/deformable/deform_conv_cuda_kernel.cu | Cascade_RCNN/detectron2/layers/csrc/deformable/deform_conv_cuda_kernel.cu | https://arxiv.org/abs/1703.06211 | 论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/structures/masks.py | Cascade_RCNN/detectron2/structures/masks.py | https://stackoverflow.com/questions/24467972/calculate-area-of-polygon-given-x-y-coordinates | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/projects/DensePose/densepose/data/datasets/coco.py | Cascade_RCNN/detectron2/data/datasets/coco.py | http://farm6.staticflickr.com/5454/9413846304_881d5e5c3b_z.jpg | 图片地址 | -| 开发引入 | / | Cascade_RCNN/detectron2/evaluation/coco_evaluation.py | http://cocodataset.org/#detection-eval | 数据集地址 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/layers/csrc/deformable/deform_conv_cuda.cu | Cascade_RCNN/detectron2/layers/csrc/deformable/deform_conv_cuda.cu | https://github.com/chengdazhi/Deformable-Convolution-V2-PyTorch/blob/mmdetection/mmdet/ops/dcn/src/deform_conv_cuda.c | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/docs/conf.py | Cascade_RCNN/docs/conf.py | http://www.sphinx-doc.org/en/master/config | 相关配置 | -| 开源代码引入 | 
https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/layers/batch_norm.py | Cascade_RCNN/detectron2/layers/batch_norm.py | https://github.com/pytorch/pytorch/blob/master/torch/nn/modules/batchnorm.py | 源码实现 | -| 开发引入 | / | Cascade_RCNN/detectron2/evaluation/cityscapes_evaluation.py | https://github.com/mcordts/cityscapesScripts/blob/master/cityscapesscripts/evaluation/evalPixelLevelSemanticLabeling.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/data/datasets/coco.py | Cascade_RCNN/detectron2/data/datasets/coco.py | https://detectron2.readthedocs.io/tutorials/datasets.html#register-a-dataset | 数据集地址 | -| 开发引入 | / | Cascade_RCNN/detectron2/evaluation/lvis_evaluation.py | https://github.com/facebookresearch/Detectron/blob/a6a835f5b8208c45d0dce217ce9bbda915f44df7/detectron/datasets/json_dataset_evaluator.py#L255 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/tests/structures/test_boxes.py | Cascade_RCNN/tests/structures/test_boxes.py | https://github.com/pytorch/pytorch/pull/39336 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/utils/logger.py | Cascade_RCNN/detectron2/utils/logger.py | https://github.com/abseil/abseil-py/blob/master/absl/logging/__init__.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/modeling/proposal_generator/rrpn.py | Cascade_RCNN/detectron2/modeling/proposal_generator/rrpn.py | https://github.com/pytorch/pytorch/issues/22812 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/data/transforms/transform.py | Cascade_RCNN/detectron2/data/transforms/transform.py | https://pillow.readthedocs.io/en/stable/ | 相关依赖 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/tests/layers/test_roi_align.py | Cascade_RCNN/tests/layers/test_roi_align.py | https://github.com/tensorflow/tensorflow/issues/26278 | 相关说明 | -| 开发引入 | / | Cascade_RCNN/detectron2/data/datasets/cityscapes.py | https://github.com/mcordts/cityscapesScripts/blob/master/cityscapesscripts/preparation/json2instanceImg.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/data/detection_utils.py | Cascade_RCNN/detectron2/data/detection_utils.py | https://en.wikipedia.org/wiki/YUV#SDTV_with_BT.601 | 相关说明 | -| 开发引入 | / | Cascade_RCNN/docker/Dockerfile | http://images.cocodataset.org/val2017/000000439715.jpg | 图片地址 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/dev/packaging/build_wheel.sh | Cascade_RCNN/dev/packaging/build_wheel.sh | https://github.com/NVIDIA/nvidia-docker/issues/854 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/structures/image_list.py | Cascade_RCNN/detectron2/structures/image_list.py | https://github.com/pytorch/pytorch/issues/39308 | 相关说明 | -| 开发引入 | / | Cascade_RCNN/detectron2/evaluation/cityscapes_evaluation.py | https://github.com/mcordts/cityscapesScripts/blob/master/cityscapesscripts/evaluation/evalInstanceLevelSemanticLabeling.py | 源码实现 | -| 开源代码引入 | 
https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/docs/conf.py | Cascade_RCNN/docs/conf.py | https://github.com/readthedocs/recommonmark/blob/ddd56e7717e9745f11300059e4268e204138a6b1/recommonmark/parser.py#L152-L155 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/utils/visualizer.py | Cascade_RCNN/detectron2/utils/visualizer.py | https://github.com/matplotlib/matplotlib/issues/15363 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/utils/env.py | Cascade_RCNN/detectron2/utils/env.py | https://stackoverflow.com/questions/67631/how-to-import-a-module-given-the-full-path | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/docs/tutorials/datasets.md | Cascade_RCNN/detectron2/config/defaults.py | http://cocodataset.org/#keypoints-eval | 数据集地址 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/data/detection_utils.py | Cascade_RCNN/detectron2/data/detection_utils.py | https://github.com/python-pillow/Pillow/issues/3973 | 相关说明 | -| 开发引入 | / | Cascade_RCNN/tools/convert-torchvision-to-d2.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/data/detection_utils.py | Cascade_RCNN/detectron2/data/detection_utils.py | https://www.exiv2.org/tags.html | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/layers/csrc/deformable/deform_conv_cuda.cu | Cascade_RCNN/detectron2/layers/csrc/deformable/deform_conv_cuda.cu | https://github.com/open-mmlab/mmdetection/blob/master/mmdet/ops/dcn/src/deform_conv_cuda.cpp | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/model_zoo/__init__.py | Cascade_RCNN/detectron2/model_zoo/__init__.py | https://github.com/facebookresearch/detectron2/blob/master/MODEL_ZOO.md | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/docs/tutorials/datasets.md | Cascade_RCNN/detectron2/evaluation/coco_evaluation.py | http://cocodataset.org/#keypoints-eval | 数据集地址 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/layers/wrappers.py | Cascade_RCNN/detectron2/layers/wrappers.py | https://github.com/pytorch/pytorch/issues/12013 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/data/detection_utils.py | Cascade_RCNN/detectron2/data/detection_utils.py | https://github.com/wkentaro/labelme/blob/v4.5.4/labelme/utils/image.py#L59 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/data/datasets/builtin_meta.py | Cascade_RCNN/detectron2/data/datasets/builtin_meta.py | https://github.com/cocodataset/panopticapi/blob/master/panoptic_coco_categories.json | 数据集地址 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/export/shared.py | Cascade_RCNN/detectron2/export/shared.py | https://www.geeksforgeeks.org/find-paths-given-source-destination/ | 相关说明 | -| 开源代码引入 | 
https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/utils/serialize.py | Cascade_RCNN/detectron2/utils/serialize.py | https://github.com/joblib/joblib/blob/master/joblib/externals/loky/cloudpickle_wrapper.py | 源码实现 | -| 开发引入 | / | Cascade_RCNN/tests/data/test_coco_evaluation.py | http://images.cocodataset.org/val2017/000000000285.jpg","http://farm8.staticflickr.com/7434/9138147604_c6225224b8_z.jpg","http://images.cocodataset.org/val2017/000000000139.jpg","http://farm9.staticflickr.com/8035/8024364858_9c41dc1666_z.jpg | 图片地址 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/layers/wrappers.py | Cascade_RCNN/detectron2/layers/wrappers.py | https://github.com/pytorch/pytorch/issues/34202 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/structures/boxes.py | Cascade_RCNN/detectron2/structures/boxes.py | https://github.com/pytorch/pytorch/issues/18627 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/layers/wrappers.py | Cascade_RCNN/detectron2/layers/wrappers.py | https://github.com/pytorch/pytorch/issues/38718 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/data/datasets/register_coco.py | Cascade_RCNN/detectron2/data/datasets/register_coco.py | http://cocodataset.org/#format-data | 数据集地址 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/modeling/proposal_generator/rpn.py | Cascade_RCNN/detectron2/modeling/proposal_generator/rpn.py | https://github.com/pytorch/pytorch/pull/41371 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/config/defaults.py | Cascade_RCNN/detectron2/config/defaults.py | https://arxiv.org/abs/1811.11168 | 论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/data/datasets/lvis.py | Cascade_RCNN/detectron2/data/datasets/lvis.py | http://images.cocodataset.org/train2017/000000155379.jpg | 图片地址 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/layers/csrc/deformable/deform_conv_cuda_kernel.cu | Cascade_RCNN/detectron2/layers/csrc/deformable/deform_conv_cuda_kernel.cu | https://github.com/open-mmlab/mmdetection/blob/master/mmdet/ops/dcn/src/deform_conv_cuda_kernel.cu | 源码实现 | -| 开发引入 | / | Cascade_RCNN/detectron2/evaluation/coco_evaluation.py | https://github.com/facebookresearch/Detectron/blob/a6a835f5b8208c45d0dce217ce9bbda915f44df7/detectron/datasets/json_dataset_evaluator.py#L222-L252 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/tests/modeling/test_matcher.py | Cascade_RCNN/detectron2/export/torchscript.py | https://github.com/pytorch/pytorch/issues/38964 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/modeling/proposal_generator/rpn.py | Cascade_RCNN/detectron2/modeling/proposal_generator/rpn.py | https://github.com/pytorch/pytorch/issues/41449 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/structures/boxes.py | 
Cascade_RCNN/detectron2/structures/boxes.py | https://github.com/kuangliu/torchcv/blob/master/torchcv/utils/box.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/docs/notes/compatibility.md | Cascade_RCNN/detectron2/modeling/anchor_generator.py | https://github.com/facebookresearch/Detectron/issues/227 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/data/detection_utils.py | Cascade_RCNN/detectron2/data/detection_utils.py | https://github.com/python-pillow/Pillow/blob/7.1.2/src/PIL/ImageOps.py#L527 | 源码实现 | -| 开发引入 | / | Cascade_RCNN/detectron2/engine/defaults.py | https://pytorch.org/docs/stable/distributed.html | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/engine/launch.py | Cascade_RCNN/detectron2/engine/launch.py | https://github.com/pytorch/pytorch/pull/14391 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/tests/modeling/test_matcher.py | Cascade_RCNN/tests/modeling/test_matcher.py | https://github.com/pytorch/pytorch/issues/38964 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/export/torchscript.py | Cascade_RCNN/detectron2/export/torchscript.py | https://docs.python.org/3/library/importlib.html#importing-a-source-file-directly | 相关说明 | -| 开发引入 | / | Cascade_RCNN/detectron2/data/datasets/cityscapes.py | https://github.com/mcordts/cityscapesScripts/blob/master/cityscapesscripts/evaluation/instances2dict.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/engine/train_loop.py | Cascade_RCNN/detectron2/engine/train_loop.py | http://engineering.hearsaysocial.com/2013/06/16/circular-references-in-python/ | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/be792b959bca9af0aacfa04799537856c7a92802/detectron2/modeling/proposal_generator/rrpn.py | Cascade_RCNN/detectron2/modeling/proposal_generator/rrpn.py | https://github.com/facebookresearch/Detectron/issues/459 | 相关说明 | +| 文件位置 | 公网地址 | 公网地址用途 | +|-----------------------------------------------------------------------------------------------------|---------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/Cascade_RCNN/datasets/prepare_panoptic_fpn.py | https://dl.fbaipublicfiles.com/detectron2 | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/Cascade_RCNN/detectron2/evaluation/coco_evaluation.py | http://cocodataset.org/#keypoints-eval | 数据集详情 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/Cascade_RCNN/detectron2/model_zoo/model_zoo.py | https://dl.fbaipublicfiles.com/detectron2/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/Cascade_RCNN/dev/packaging/build_wheel.sh | https://download.pytorch.org/whl/"$CU_VERSION"/torch_stable.html | 三方库地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/Cascade_RCNN/dev/packaging/gen_install_table.py | https://dl.fbaipublicfiles.com/detectron2/wheels/{cuda}/torch{torch}/index.html | 三方库地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/Cascade_RCNN/docker/Dockerfile | https://bootstrap.pypa.io/get-pip.py | 三方库地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/Cascade_RCNN/docker/Dockerfile | 
https://download.pytorch.org/whl/cu101/torch_stable.html | 三方库地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/Cascade_RCNN/docker/Dockerfile-circleci | https://bootstrap.pypa.io/get-pip.py | 三方库地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/Cascade_RCNN/docker/Dockerfile-circleci | https://download.pytorch.org/whl/cu101/torch_stable.html | 三方库地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/Cascade_RCNN/tools/convert-torchvision-to-d2.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/Cascade_RCNN/url.ini | https://dl.fbaipublicfiles.com/detectron2 | 数据集地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/detection/CascadedMaskRCNN/public_address_statement.md b/PyTorch/contrib/cv/detection/CascadedMaskRCNN/public_address_statement.md index 084d7ea0f66e29b8af72fbf7b6f5ae52d724f5f9..571521d5522d89059f51f48c64c0242687f01b52 100644 --- a/PyTorch/contrib/cv/detection/CascadedMaskRCNN/public_address_statement.md +++ b/PyTorch/contrib/cv/detection/CascadedMaskRCNN/public_address_statement.md @@ -1,108 +1,13 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|-----------------------------------------------------------------------------------------------|-----------------------------------------------------|---------------------------------------------------------------------------------|--------| -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/datasets/prepare_for_tests.sh | CascadedMaskRCNN/datasets/prepare_for_tests.sh | https://dl.fbaipublicfiles.com/detectron2 | 下载数据集 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/datasets/prepare_panoptic_fpn.py | CascadedMaskRCNN/datasets/prepare_panoptic_fpn.py | "https://dl.fbaipublicfiles.com/detectron2/ | 下载数据集 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/detectron2/checkpoint/catalog.py | CascadedMaskRCNN/detectron2/checkpoint/catalog.py | https://dl.fbaipublicfiles.com/detectron | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/detectron2/checkpoint/catalog.py | CascadedMaskRCNN/detectron2/checkpoint/catalog.py | https://dl.fbaipublicfiles.com/detectron2/ | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/detectron2/model_zoo/model_zoo.py | CascadedMaskRCNN/detectron2/model_zoo/model_zoo.py | https://dl.fbaipublicfiles.com/detectron2/ | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/dev/packaging/build_wheel.sh | CascadedMaskRCNN/dev/packaging/build_wheel.sh | https://download.pytorch.org/whl/"$CU_VERSION"/torch_stable.html | 下载第三方包 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/dev/packaging/gen_install_table.py | CascadedMaskRCNN/dev/packaging/gen_install_table.py | https://dl.fbaipublicfiles.com/detectron2/wheels/{cuda}/torch{torch}/index.html | 下载第三方包 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/docker/Dockerfile | CascadedMaskRCNN/docker/Dockerfile | https://bootstrap.pypa.io/get-pip.py | 下载第三方包 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/docker/Dockerfile | CascadedMaskRCNN/docker/Dockerfile | https://download.pytorch.org/whl/cu101/torch_stable.html | 下载第三方包 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/docker/Dockerfile | CascadedMaskRCNN/docker/Dockerfile | https://github.com/facebookresearch/detectron2 | 下载第三方包 | -| 开源代码引入 | 
https://github.com/facebookresearch/detectron2/blob/v0.2.1/docker/Dockerfile | CascadedMaskRCNN/docker/Dockerfile | https://github.com/facebookresearch/fvcore | 下载第三方包 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/docker/Dockerfile-circleci | CascadedMaskRCNN/docker/Dockerfile-circleci | https://bootstrap.pypa.io/get-pip.py | 下载第三方包 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/docker/Dockerfile-circleci | CascadedMaskRCNN/docker/Dockerfile-circleci | https://download.pytorch.org/whl/cu101/torch_stable.html | 下载第三方包 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/docs/conf.py | CascadedMaskRCNN/docs/conf.py | https://github.com/facebookresearch/detectron2/blob/master/ | 源码链接 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/docs/conf.py | CascadedMaskRCNN/docs/conf.py | https://docs.python.org/3.6 | 源码链接 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/docs/conf.py | CascadedMaskRCNN/docs/conf.py | https://docs.scipy.org/doc/numpy/ | 源码链接 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/docs/conf.py | CascadedMaskRCNN/docs/conf.py | https://pytorch.org/docs/master/ | 源码链接 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/docs/conf.py | CascadedMaskRCNN/docs/conf.py | https://arxiv.org/abs/ | 论文链接 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/setup.py | CascadedMaskRCNN/setup.py | https://github.com/facebookresearch/detectron2 | 第三包链接 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/setup.py | CascadedMaskRCNN/setup.py | https://github.com/psf/black@673327449f86fce558adde153bb6cbe54bfebad2 | 第三包链接 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/tests/data/test_coco_evaluation.py | CascadedMaskRCNN/tests/data/test_coco_evaluation.py | http://images.cocodataset.org/val2017/000000000285.jpg | 测试数据 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/tests/data/test_coco_evaluation.py | CascadedMaskRCNN/tests/data/test_coco_evaluation.py | http://farm8.staticflickr.com/7434/9138147604_c6225224b8_z.jpg | 测试数据 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/tests/data/test_coco_evaluation.py | CascadedMaskRCNN/tests/data/test_coco_evaluation.py | http://images.cocodataset.org/val2017/000000000139.jpg | 测试数据 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/tests/tests/test_model_zoo.py | CascadedMaskRCNN/tests/test_model_zoo.py | https://dl.fbaipublicfiles.com/detectron2/Misc/scratch_mask_rcnn_R_50_FPN_3x_gn/138602908/model_final_01ca85.pkl | 下载权重文件 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/tests/layers/test_roi_align.py | CascadedMaskRCNN/tests/layers/test_roi_align.py | https://github.com/tensorflow/tensorflow/issues/26278 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/detectron2/config/defaults.py | CascadedMaskRCNN/detectron2/config/defaults.py | https://arxiv.org/abs/1811.11168 | 论文地址 | -| 开发引入 | / | CascadedMaskRCNN/detectron2/data/datasets/cityscapes.py | https://github.com/mcordts/cityscapesScripts/blob/master/cityscapesscripts/evaluation/instances2dict.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/detectron2/modeling/proposal_generator/rpn.py | CascadedMaskRCNN/detectron2/export/torchscript.py | https://github.com/pytorch/pytorch/issues/41449 | 相关说明 | -| 开源代码引入 | 
https://github.com/facebookresearch/detectron2/blob/v0.2.1/detectron2/layers/batch_norm.py | CascadedMaskRCNN/detectron2/layers/batch_norm.py | https://github.com/pytorch/pytorch/pull/36382 | 源码实现 | -| 开发引入 | / | CascadedMaskRCNN/detectron2/evaluation/coco_evaluation.py | https://github.com/facebookresearch/Detectron/blob/a6a835f5b8208c45d0dce217ce9bbda915f44df7/detectron/datasets/json_dataset_evaluator.py#L255 | 源码实现 | -| 开发引入 | / | CascadedMaskRCNN/detectron2/evaluation/cityscapes_evaluation.py | https://github.com/mcordts/cityscapesScripts/blob/master/cityscapesscripts/evaluation/evalPixelLevelSemanticLabeling.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/detectron2/export/shared.py | CascadedMaskRCNN/detectron2/export/shared.py | https://www.geeksforgeeks.org/find-paths-given-source-destination/ | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/datasets/prepare_cocofied_lvis.py | CascadedMaskRCNN/datasets/prepare_cocofied_lvis.py | https://github.com/lvis-dataset/lvis-api/blob/master/data/coco_to_synset.json | 相关配置 | -| 开发引入 | / | CascadedMaskRCNN/detectron2/layers/aspp.py | https://github.com/tensorflow/models/blob/21b73d22f3ed05b650e85ac50849408dd36de32e/research/deeplab/model.py#L532 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/docs/tutorials/datasets.md | CascadedMaskRCNN/detectron2/evaluation/coco_evaluation.py | http://cocodataset.org/#keypoints-eval | 数据集地址 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/detectron2/utils/logger.py | CascadedMaskRCNN/detectron2/utils/logger.py | https://github.com/abseil/abseil-py/blob/master/absl/logging/__init__.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/detectron2/model_zoo/__init__.py | CascadedMaskRCNN/detectron2/model_zoo/__init__.py | https://github.com/facebookresearch/detectron2/blob/master/MODEL_ZOO.md | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/detectron2/export/torchscript.py | CascadedMaskRCNN/detectron2/export/torchscript.py | https://docs.python.org/3/library/importlib.html#importing-a-source-file-directly | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/detectron2/data/datasets/register_coco.py | CascadedMaskRCNN/detectron2/data/datasets/coco.py | http://cocodataset.org/#format-data | 数据集地址 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/detectron2/data/datasets/builtin_meta.py | CascadedMaskRCNN/detectron2/data/datasets/builtin_meta.py | https://github.com/cocodataset/panopticapi/blob/master/panoptic_coco_categories.json | 相关配置 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/detectron2/data/detection_utils.py | CascadedMaskRCNN/detectron2/data/detection_utils.py | https://www.exiv2.org/tags.html | 相关说明 | -| 开发引入 | / | CascadedMaskRCNN/detectron2/engine/train_loop.py | https://arxiv.org/abs/2006.15704 | 论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/detectron2/structures/image_list.py | CascadedMaskRCNN/detectron2/structures/image_list.py | https://github.com/pytorch/pytorch/issues/31734 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/detectron2/layers/csrc/vision.cpp | CascadedMaskRCNN/detectron2/layers/csrc/vision.cpp | https://github.com/pytorch/pytorch/blob/master/aten/src/ATen/Version.cpp | 源码实现 | -| 开源代码引入 | 
https://github.com/facebookresearch/detectron2/blob/v0.2.1/detectron2/data/detection_utils.py | CascadedMaskRCNN/detectron2/data/detection_utils.py | https://github.com/wkentaro/labelme/blob/v4.5.4/labelme/utils/image.py#L59 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/detectron2/data/transforms/augmentation_impl.py | CascadedMaskRCNN/detectron2/data/transforms/augmentation_impl.py | https://pillow.readthedocs.io/en/3.0.x/reference/ImageEnhance.html | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/detectron2/modeling/proposal_generator/rpn.py | CascadedMaskRCNN/detectron2/modeling/proposal_generator/rpn.py | https://github.com/pytorch/pytorch/pull/41371 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/detectron2/modeling/proposal_generator/rrpn.py | CascadedMaskRCNN/detectron2/modeling/proposal_generator/rrpn.py | https://github.com/facebookresearch/Detectron/issues/459 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/detectron2/modeling/proposal_generator/rpn.py | CascadedMaskRCNN/detectron2/modeling/proposal_generator/rpn.py | https://github.com/pytorch/pytorch/issues/41449 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/tests/structures/test_boxes.py | CascadedMaskRCNN/tests/structures/test_boxes.py | https://github.com/pytorch/pytorch/pull/39336 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/detectron2/layers/csrc/deformable/deform_conv_cuda.cu | CascadedMaskRCNN/detectron2/layers/csrc/deformable/deform_conv_cuda.cu | https://github.com/open-mmlab/mmdetection/blob/master/mmdet/ops/dcn/src/deform_conv_cuda.cpp | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/docs/notes/compatibility.md | CascadedMaskRCNN/detectron2/modeling/anchor_generator.py | https://github.com/facebookresearch/Detectron/issues/227 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/detectron2/evaluation/pascal_voc_evaluation.py | CascadedMaskRCNN/detectron2/evaluation/pascal_voc_evaluation.py | https://github.com/rbgirshick/py-faster-rcnn/blob/master/lib/datasets/voc_eval.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/detectron2/structures/boxes.py | CascadedMaskRCNN/detectron2/structures/boxes.py | https://github.com/kuangliu/torchcv/blob/master/torchcv/utils/box.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/docs/tutorials/datasets.md | CascadedMaskRCNN/detectron2/config/defaults.py | http://cocodataset.org/#keypoints-eval | 数据集地址 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/tests/modeling/test_matcher.py | CascadedMaskRCNN/detectron2/export/torchscript.py | https://github.com/pytorch/pytorch/issues/38964 | 相关说明 | -| 开发引入 | / | CascadedMaskRCNN/detectron2/evaluation/coco_evaluation.py | http://cocodataset.org/#detection-eval | 数据集地址 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/detectron2/modeling/proposal_generator/rrpn.py | CascadedMaskRCNN/detectron2/modeling/proposal_generator/rrpn.py | https://github.com/pytorch/pytorch/issues/22812 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/detectron2/engine/launch.py | CascadedMaskRCNN/detectron2/engine/launch.py | https://github.com/facebookresearch/maskrcnn-benchmark/issues/172 | 相关说明 | -| 开发引入 | / | CascadedMaskRCNN/detectron2/evaluation/cityscapes_evaluation.py 
| https://github.com/mcordts/cityscapesScripts/blob/master/cityscapesscripts/evaluation/evalInstanceLevelSemanticLabeling.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/detectron2/data/transforms/transform.py | CascadedMaskRCNN/detectron2/data/transforms/transform.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/docs/conf.py | CascadedMaskRCNN/docs/conf.py | http://www.sphinx-doc.org/en/master/config | 相关配置 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/detectron2/engine/defaults.py | CascadedMaskRCNN/detectron2/engine/defaults.py | https://github.com/sphinx-doc/sphinx/issues/4258 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/detectron2/data/transforms/transform.py | CascadedMaskRCNN/detectron2/data/transforms/transform.py | https://github.com/opencv/opencv/issues/11784 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/projects/DensePose/densepose/data/datasets/coco.py | CascadedMaskRCNN/detectron2/data/datasets/lvis.py | http://farm6.staticflickr.com/5454/9413846304_881d5e5c3b_z.jpg | 图片地址 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/detectron2/layers/wrappers.py | CascadedMaskRCNN/detectron2/layers/wrappers.py | https://github.com/pytorch/pytorch/issues/38718 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/detectron2/data/datasets/coco.py | CascadedMaskRCNN/detectron2/data/datasets/coco.py | https://github.com/facebookresearch/detectron2/pull/175#issuecomment-551202163 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/detectron2/data/detection_utils.py | CascadedMaskRCNN/detectron2/data/detection_utils.py | https://github.com/python-pillow/Pillow/blob/7.1.2/src/PIL/ImageOps.py#L527 | 源码实现 | -| 开发引入 | / | CascadedMaskRCNN/detectron2/evaluation/lvis_evaluation.py | https://github.com/facebookresearch/Detectron/blob/a6a835f5b8208c45d0dce217ce9bbda915f44df7/detectron/datasets/json_dataset_evaluator.py#L255 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/detectron2/modeling/poolers.py | CascadedMaskRCNN/detectron2/modeling/poolers.py | https://github.com/pytorch/pytorch/issues/41412 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/dev/packaging/build_wheel.sh | CascadedMaskRCNN/dev/packaging/build_wheel.sh | https://github.com/NVIDIA/nvidia-docker/issues/854 | 相关说明 | -| 开发引入 | / | CascadedMaskRCNN/detectron2/engine/defaults.py | https://pytorch.org/docs/stable/distributed.html | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/projects/DensePose/densepose/data/datasets/coco.py | CascadedMaskRCNN/detectron2/data/datasets/coco.py | http://farm6.staticflickr.com/5454/9413846304_881d5e5c3b_z.jpg | 图片地址 | -| 开发引入 | / | CascadedMaskRCNN/tests/data/test_coco_evaluation.py | http://images.cocodataset.org/val2017/000000000285.jpg","http://farm8.staticflickr.com/7434/9138147604_c6225224b8_z.jpg","http://images.cocodataset.org/val2017/000000000139.jpg","http://farm9.staticflickr.com/8035/8024364858_9c41dc1666_z.jpg | 图片地址 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/detectron2/utils/env.py | CascadedMaskRCNN/detectron2/utils/env.py | https://stackoverflow.com/questions/67631/how-to-import-a-module-given-the-full-path | 相关说明 | -| 开发引入 | / | CascadedMaskRCNN/docker/Dockerfile | 
http://images.cocodataset.org/val2017/000000439715.jpg | 图片地址 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/detectron2/layers/wrappers.py | CascadedMaskRCNN/detectron2/layers/wrappers.py | https://github.com/pytorch/pytorch/issues/34202 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/detectron2/layers/csrc/deformable/deform_conv_cuda_kernel.cu | CascadedMaskRCNN/detectron2/layers/csrc/deformable/deform_conv_cuda_kernel.cu | https://github.com/chengdazhi/Deformable-Convolution-V2-PyTorch/blob/mmdetection/mmdet/ops/dcn/src/deform_conv_cuda_kernel.cu | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/detectron2/data/datasets/register_coco.py | CascadedMaskRCNN/detectron2/data/datasets/register_coco.py | http://cocodataset.org/#format-data | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/detectron2/utils/serialize.py | CascadedMaskRCNN/detectron2/utils/serialize.py | https://github.com/joblib/joblib/blob/master/joblib/externals/loky/cloudpickle_wrapper.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/detectron2/structures/boxes.py | CascadedMaskRCNN/detectron2/structures/boxes.py | https://github.com/pytorch/pytorch/issues/18627 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/docs/conf.py | CascadedMaskRCNN/docs/conf.py | https://github.com/readthedocs/recommonmark/blob/ddd56e7717e9745f11300059e4268e204138a6b1/recommonmark/parser.py#L152-L155 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/tests/modeling/test_matcher.py | CascadedMaskRCNN/tests/modeling/test_matcher.py | https://github.com/pytorch/pytorch/issues/38964 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/detectron2/evaluation/sem_seg_evaluation.py | CascadedMaskRCNN/detectron2/evaluation/sem_seg_evaluation.py | http://cocodataset.org/#format-results | 数据集地址 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/detectron2/layers/csrc/vision.cpp | CascadedMaskRCNN/detectron2/layers/csrc/vision.cpp | https://github.com/pytorch/pytorch/blob/master/aten/src/ATen/cuda/detail/CUDAHooks.cpp#L231 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/detectron2/layers/wrappers.py | CascadedMaskRCNN/detectron2/layers/wrappers.py | https://github.com/pytorch/pytorch/issues/12013 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/detectron2/layers/csrc/deformable/deform_conv_cuda.cu | CascadedMaskRCNN/detectron2/layers/csrc/deformable/deform_conv_cuda.cu | https://github.com/chengdazhi/Deformable-Convolution-V2-PyTorch/blob/mmdetection/mmdet/ops/dcn/src/deform_conv_cuda.c | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/detectron2/data/datasets/coco.py | CascadedMaskRCNN/detectron2/data/datasets/coco.py | https://detectron2.readthedocs.io/tutorials/datasets.html#register-a-dataset | 数据集地址 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/detectron2/layers/csrc/deformable/deform_conv_cuda_kernel.cu | CascadedMaskRCNN/detectron2/layers/csrc/deformable/deform_conv_cuda_kernel.cu | https://github.com/open-mmlab/mmdetection/blob/master/mmdet/ops/dcn/src/deform_conv_cuda_kernel.cu | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/detectron2/data/datasets/lvis.py | CascadedMaskRCNN/detectron2/data/datasets/lvis.py | 
http://images.cocodataset.org/train2017/000000155379.jpg | 图片地址 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/detectron2/layers/csrc/deformable/deform_conv_cuda_kernel.cu | CascadedMaskRCNN/detectron2/layers/csrc/deformable/deform_conv_cuda_kernel.cu | https://arxiv.org/abs/1703.06211 | 论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/detectron2/config/defaults.py | CascadedMaskRCNN/detectron2/config/defaults.py | https://pillow.readthedocs.io/en/stable/handbook/concepts.html#concept-modes | 相关依赖 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/tests/modeling/test_matcher.py | CascadedMaskRCNN/tests/modeling/test_matcher.py | https://github.com/pytorch/pytorch/pull/38378 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/detectron2/engine/train_loop.py | CascadedMaskRCNN/detectron2/engine/train_loop.py | http://engineering.hearsaysocial.com/2013/06/16/circular-references-in-python/ | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/detectron2/layers/batch_norm.py | CascadedMaskRCNN/detectron2/layers/batch_norm.py | https://github.com/pytorch/pytorch/blob/master/torch/nn/modules/batchnorm.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/detectron2/structures/image_list.py | CascadedMaskRCNN/detectron2/structures/image_list.py | https://github.com/pytorch/pytorch/issues/39308 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/detectron2/structures/masks.py | CascadedMaskRCNN/detectron2/structures/masks.py | https://stackoverflow.com/questions/24467972/calculate-area-of-polygon-given-x-y-coordinates | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/detectron2/data/detection_utils.py | CascadedMaskRCNN/detectron2/data/detection_utils.py | https://github.com/python-pillow/Pillow/issues/3973 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/detectron2/evaluation/sem_seg_evaluation.py | CascadedMaskRCNN/detectron2/evaluation/sem_seg_evaluation.py | http://cocodataset.org/#stuff-eval | 数据集地址 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/detectron2/data/detection_utils.py | CascadedMaskRCNN/detectron2/data/detection_utils.py | https://en.wikipedia.org/wiki/YUV#SDTV_with_BT.601 | 相关说明 | -| 开发引入 | / | CascadedMaskRCNN/tools/convert-torchvision-to-d2.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 预训练模型 | -| 开发引入 | / | CascadedMaskRCNN/detectron2/evaluation/coco_evaluation.py | https://github.com/facebookresearch/Detectron/blob/a6a835f5b8208c45d0dce217ce9bbda915f44df7/detectron/datasets/json_dataset_evaluator.py#L222-L252 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/detectron2/engine/launch.py | CascadedMaskRCNN/detectron2/engine/launch.py | https://github.com/pytorch/pytorch/pull/14391 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/detectron2/utils/visualizer.py | CascadedMaskRCNN/detectron2/utils/visualizer.py | https://github.com/matplotlib/matplotlib/issues/15363 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/v0.2.1/detectron2/data/transforms/transform.py | CascadedMaskRCNN/detectron2/data/transforms/transform.py | https://pillow.readthedocs.io/en/latest/PIL.html#PIL.ImageTransform.ExtentTransform | 相关说明 | -| 开发引入 | / | CascadedMaskRCNN/detectron2/data/datasets/cityscapes.py | 
https://github.com/mcordts/cityscapesScripts/blob/master/cityscapesscripts/preparation/json2instanceImg.py | 源码实现 | +| 文件位置 | 公网地址 | 公网地址用途 | +|---------------------------------------------------------------------------------------------------------|---------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CascadedMaskRCNN/datasets/prepare_for_tests.sh | https://dl.fbaipublicfiles.com/detectron2 | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CascadedMaskRCNN/datasets/prepare_panoptic_fpn.py | https://dl.fbaipublicfiles.com/detectron2 | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CascadedMaskRCNN/detectron2/evaluation/coco_evaluation.py | http://cocodataset.org/#keypoints-eval | 数据集详情 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CascadedMaskRCNN/detectron2/model_zoo/model_zoo.py | https://dl.fbaipublicfiles.com/detectron2/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CascadedMaskRCNN/dev/packaging/build_wheel.sh | https://download.pytorch.org/whl/"$CU_VERSION"/torch_stable.html | 三方库地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CascadedMaskRCNN/dev/packaging/gen_install_table.py | https://dl.fbaipublicfiles.com/detectron2/wheels/{cuda}/torch{torch}/index.html | 三方库地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CascadedMaskRCNN/docker/Dockerfile | https://bootstrap.pypa.io/get-pip.py | 三方库地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CascadedMaskRCNN/docker/Dockerfile | https://download.pytorch.org/whl/cu101/torch_stable.html | 三方库地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CascadedMaskRCNN/docker/Dockerfile-circleci | https://bootstrap.pypa.io/get-pip.py | 三方库地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CascadedMaskRCNN/docker/Dockerfile-circleci | https://download.pytorch.org/whl/cu101/torch_stable.html | 三方库地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CascadedMaskRCNN/tools/convert-torchvision-to-d2.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/detection/CenterFace/README_old.md b/PyTorch/contrib/cv/detection/CenterFace/README_old.md index c86efb8dc922e5558c2fd5b6694cfdafeef65fdd..83f7933e28e94393575a81ff721fd1ee72ec25eb 100755 --- a/PyTorch/contrib/cv/detection/CenterFace/README_old.md +++ b/PyTorch/contrib/cv/detection/CenterFace/README_old.md @@ -15,9 +15,7 @@ conda env create -f enviroment.yaml ``` ## 数据集准备 -1. download the pretrained model from [Baidu](https://pan.baidu.com/s/1sU3pRBTFebbsMDac-1HsQA) password: etdi -2. download the validation set of [WIDER_FACE](https://pan.baidu.com/s/1b5Uku0Bb13Zk9mf7mkZ3FA) password: y4wg -3. 
the annotation file and train data can download for [Baidu](https://pan.baidu.com/s/1j_2wggZ3bvCuOAfZvjWqTg) password: f9hh +用户自行准备数据集 1)本机解压WIDER_FACE_DATA_ALL.zip文件里面有annotations.zip、labels、WIDER_train.zip、WIDER_val.zip、groud_truth文件。 2)annotations.zip、labels、WIDER_train.zip、WIDER_val.zip复制到服务器的$project/data/wider_face目录下。groud_truth复制到$project下。 diff --git a/PyTorch/contrib/cv/detection/CenterFace/public_address_statement.md b/PyTorch/contrib/cv/detection/CenterFace/public_address_statement.md index 03d4f705ca9d0ff0edaa90400ef8ac55026d0da7..6e0e00f92559a0fb434ffb23c683d466d2493c73 100644 --- a/PyTorch/contrib/cv/detection/CenterFace/public_address_statement.md +++ b/PyTorch/contrib/cv/detection/CenterFace/public_address_statement.md @@ -1,69 +1,41 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------------------------------|-----------------------------------------------------|---------------------------------------------------------------------------------|--| -| 开源代码引入 | https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/src/tools/kitti_eval/evaluate_object_3d.cpp | CenterFace/src/tools/kitti_eval/evaluate_object_3d.cpp | http://www.cvlibs.net/datasets/kitti/user_submit_check_login.php?benchmark=object&user=%s&result=%s | 结果存储路径 | -| 开源代码引入 | https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/src/tools/kitti_eval/mail.h | CenterFace/src/tools/kitti_eval/mail.h | noreply@cvlibs.net | 邮箱 | -| 开源代码引入 | https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/src/lib/models/networks/resnet_dcn.py | CenterFace/src/lib/models/Backbone/msra_resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 预训练模型 | -| 开源代码引入 | https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/src/lib/utils/image.py | CenterFace/src/lib/models/networks/resnet_dcn.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/src/lib/models/Backbone/efficientdet/utils.py | CenterFace/src/lib/models/Backbone/efficientdet/utils.py | http://storage.googleapis.com/public-models/efficientnet/efficientnet-b7-dcc49843.pth | 预训练模型 | -| 开源代码引入 | https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/src/lib/utils/Randaugmentations.py | CenterFace/src/lib/utils/Randaugmentations.py | https://github.com/google-research/uda/blob/master/image/randaugment/policies.py#L57 | 源码实现 | -| 开源代码引入 | https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/src/lib/models/Backbone/efficientdet/utils.py | CenterFace/src/lib/models/Backbone/efficientdet/utils.py | http://storage.googleapis.com/public-models/efficientnet/efficientnet-b2-8bb594d6.pth | 预训练模型 | -| 开源代码引入 | https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/src/lib/models/networks/DCNv2/src/dcn_v2_cuda_double.c | CenterFace/src/lib/models/networks/DCNv2/src/dcn_v2_cuda_double.c | http://docs.nvidia.com/cuda/cublas/#cublas-lt-t-gt-gemm | 相关说明 | -| 开源代码引入 | https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/src/lib/models/networks/large_hourglass.py | CenterFace/src/lib/models/losses.py | https://github.com/princeton-vl/CornerNet | 源码实现 | -| 开源代码引入 | https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/src/lib/models/networks/DCNv2/src/dcn_v2_cuda_double.c | CenterFace/src/lib/models/networks/DCNv2/src/dcn_v2_cuda.c | http://docs.nvidia.com/cuda/cublas/#cublas-lt-t-gt-gemm | 相关说明 | -| 开源代码引入 | 
https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/src/lib/utils/Randaugmentations.py | CenterFace/src/lib/utils/Randaugmentations.py | https://github.com/quark0/darts/blob/master/cnn/utils.py | 源码实现 | -| 开源代码引入 | https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/src/lib/utils/image.py | CenterFace/src/lib/utils/image.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/src/lib/models/Backbone/mobilenet_v2.py | CenterFace/src/lib/models/Backbone/centerface_mobilenet_v2_fpn.py | https://github.com/tensorflow/models/blob/master/research/slim/nets/mobilenet/mobilenet.py | 源码实现 | -| 开源代码引入 | https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/evaluate/setup.py | CenterFace/evaluate/setup.py | tianhengcheng@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/src/tools/kitti_eval/evaluate_object_3d.cpp | CenterFace/src/tools/kitti_eval/evaluate_object_3d.cpp | https://github.com/prclibo/kitti_eval | 源码实现 | -| 开源代码引入 | https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/src/lib/utils/Randaugmentations.py | CenterFace/src/lib/utils/Randaugmentations.py | https://github.com/tensorflow/tpu/blob/8462d083dd89489a79e3200bcc8d4063bf362186/models/official/efficientnet/autoaugment.py#L505 | 源码实现 | -| 开源代码引入 | https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/src/lib/models/networks/resnet_dcn.py | CenterFace/src/lib/models/Backbone/resnet_dcn.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 预训练模型 | -| 开源代码引入 | https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/src/lib/models/Backbone/efficientdet/utils.py | CenterFace/src/lib/models/Backbone/efficientdet/utils.py | http://storage.googleapis.com/public-models/efficientnet/efficientnet-b5-b6417697.pth | 预训练模型 | -| 开源代码引入 | https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/src/lib/models/Backbone/mobilenet_v2.py | CenterFace/src/lib/models/Backbone/centerface_mobilenet_v2_fpn.py | https://download.pytorch.org/models/mobilenet_v2-b0353104.pth | 预训练模型 | -| 开源代码引入 | https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/src/lib/models/networks/resnet_dcn.py | CenterFace/src/lib/models/Backbone/resnet_dcn.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 预训练模型 | -| 开源代码引入 | https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/src/lib/models/networks/resnet_dcn.py | CenterFace/src/lib/models/networks/msra_resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 预训练模型 | -| 开源代码引入 | https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/src/lib/models/networks/large_hourglass.py | CenterFace/src/lib/models/networks/large_hourglass.py | https://github.com/princeton-vl/CornerNet | 源码实现 | -| 开源代码引入 | https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/src/lib/models/Backbone/efficientdet/utils.py | CenterFace/src/lib/models/Backbone/efficientdet/utils.py | http://storage.googleapis.com/public-models/efficientnet/efficientnet-b4-6ed6700e.pth | 预训练模型 | -| 开源代码引入 | https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/src/lib/models/networks/resnet_dcn.py | CenterFace/src/lib/models/networks/msra_resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 预训练模型 | -| 开源代码引入 | https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/src/lib/models/networks/DCNv2/src/cuda/dcn_v2_im2col_cuda_double.h | CenterFace/src/lib/models/networks/DCNv2/src/cuda/dcn_v2_im2col_cuda_double.h | 
https://arxiv.org/abs/1811.11168 | 论文地址 | -| 开源代码引入 | https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/src/lib/models/networks/resnet_dcn.py | CenterFace/src/lib/models/Backbone/resnet_dcn.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 预训练模型 | -| 开源代码引入 | https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/src/lib/models/Backbone/efficientdet/utils.py | CenterFace/src/lib/models/Backbone/efficientdet/utils.py | http://storage.googleapis.com/public-models/efficientnet/efficientnet-b6-c76e70fd.pth | 预训练模型 | -| 开发引入 | / | CenterFace/src/lib/models/networks/dlav0.py | http://dl.yf.io/dla/models | 预训练模型 | -| 开源代码引入 | https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/src/lib/models/networks/resnet_dcn.py | CenterFace/src/lib/models/Backbone/resnet_dcn.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 预训练模型 | -| 开源代码引入 | https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/src/lib/models/Backbone/mobilenet_v2.py | CenterFace/src/lib/models/Backbone/centerface_mobilenet_v2.py | https://github.com/tensorflow/models/blob/master/research/slim/nets/mobilenet/mobilenet.py | 源码实现 | -| 开发引入 | / | CenterFace/src/lib/models/networks/pose_dla_dcn.py | http://dl.yf.io/dla/models | 预训练模型 | -| 开源代码引入 | https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/src/lib/models/networks/resnet_dcn.py | CenterFace/src/lib/models/Backbone/msra_resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 预训练模型 | -| 开源代码引入 | https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/src/lib/models/networks/resnet_dcn.py | CenterFace/src/lib/models/networks/msra_resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 预训练模型 | -| 开源代码引入 | https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/src/lib/models/networks/resnet_dcn.py | CenterFace/src/lib/models/networks/resnet_dcn.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 预训练模型 | -| 开发引入 | / | CenterFace/src/lib/models/Backbone/dlav0.py | http://dl.yf.io/dla/models | 预训练模型 | -| 开源代码引入 | https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/src/lib/utils/Randaugmentations.py | CenterFace/src/lib/utils/Randaugmentations.py | https://github.com/rpmcruz/autoaugment/blob/master/transformations.py | 源码实现 | -| 开源代码引入 | https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/src/lib/utils/image.py | CenterFace/src/lib/models/Backbone/msra_resnet.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/src/lib/models/networks/resnet_dcn.py | CenterFace/src/lib/models/Backbone/msra_resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 预训练模型 | -| 开源代码引入 | https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/src/lib/models/networks/resnet_dcn.py | CenterFace/src/lib/models/Backbone/msra_resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 预训练模型 | -| 开源代码引入 | https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/src/lib/models/networks/resnet_dcn.py | CenterFace/src/lib/models/networks/msra_resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 预训练模型 | -| 开源代码引入 | https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/src/lib/models/Backbone/efficientdet/utils.py | CenterFace/src/lib/models/Backbone/efficientdet/utils.py | http://storage.googleapis.com/public-models/efficientnet/efficientnet-b1-f1951068.pth | 预训练模型 | -| 开源代码引入 | 
https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/src/lib/models/Backbone/mobilenet_v2.py | CenterFace/src/lib/models/Backbone/mobilenetv2.py | https://github.com/tensorflow/models/blob/master/research/slim/nets/mobilenet/mobilenet.py | 源码实现 | -| 开源代码引入 | https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/src/lib/utils/image.py | CenterFace/src/lib/models/networks/msra_resnet.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/src/lib/models/Backbone/mobilenet_v2.py | CenterFace/src/lib/models/Backbone/mobilenetv2.py | https://download.pytorch.org/models/mobilenet_v2-b0353104.pth | 预训练模型 | -| 开源代码引入 | https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/src/lib/models/Backbone/mobilenet_v2.py | CenterFace/src/lib/models/Backbone/mobilenet_v2.py | https://download.pytorch.org/models/mobilenet_v2-b0353104.pth | 预训练模型 | -| 开源代码引入 | https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/src/lib/models/Backbone/pose_higher_hrnet.py | CenterFace/src/lib/models/Backbone/pose_higher_hrnet.py | leoxiaobin@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/src/lib/models/Backbone/pose_higher_hrnet.py | CenterFace/src/lib/models/Backbone/pose_higher_hrnet.py | bcheng9@illinois.edu | 邮箱地址 | -| 开源代码引入 | https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/src/lib/models/Backbone/efficientdet/retinahead.py | CenterFace/src/lib/models/Backbone/efficientdet/retinahead.py | https://arxiv.org/pdf/1708.02002.pdf | 论文地址 | -| 开源代码引入 | https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/src/lib/models/Backbone/mobilenet_v2.py | CenterFace/src/lib/models/Backbone/mobilenet_v2.py | https://github.com/tensorflow/models/blob/master/research/slim/nets/mobilenet/mobilenet.py | 源码实现 | -| 开源代码引入 | https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/src/lib/models/networks/DCNv2/src/dcn_v2_cuda_double.c | CenterFace/src/lib/models/networks/DCNv2/src/dcn_v2_cuda.c | https://github.com/torch/cunn/blob/master/lib/THCUNN/generic/SpatialConvolutionMM.cu | 源码实现 | -| 开源代码引入 | https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/src/lib/models/networks/DCNv2/src/cuda/dcn_v2_im2col_cuda_double.h | CenterFace/src/lib/models/networks/DCNv2/src/cuda/dcn_v2_im2col_cuda.h | https://arxiv.org/abs/1811.11168 | 论文地址 | -| 开源代码引入 | https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/src/lib/models/Backbone/efficientdet/utils.py | CenterFace/src/lib/models/Backbone/efficientdet/utils.py | http://storage.googleapis.com/public-models/efficientnet/efficientnet-b0-355c32eb.pth | 预训练模型 | -| 开发引入 | / | CenterFace/src/lib/models/Backbone/pose_dla_dcn.py | http://dl.yf.io/dla/models | 预训练模型 | -| 开源代码引入 | https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/src/lib/models/Backbone/efficientdet/utils.py | CenterFace/src/lib/models/Backbone/efficientdet/utils.py | http://storage.googleapis.com/public-models/efficientnet/efficientnet-b3-5fb5a3c3.pth | 预训练模型 | -| 开源代码引入 | https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/evaluate/setup.py | CenterFace/evaluate/evaluation.py | tianhengcheng@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/src/lib/models/networks/DCNv2/src/dcn_v2_cuda_double.c | CenterFace/src/lib/models/networks/DCNv2/src/dcn_v2_cuda_double.c | https://github.com/torch/cunn/blob/master/lib/THCUNN/generic/SpatialConvolutionMM.cu | 源码实现 | -| 开源代码引入 | 
https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/src/lib/models/networks/resnet_dcn.py | CenterFace/src/lib/models/Backbone/resnet_dcn.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 预训练模型 | -| 开源代码引入 | https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/src/lib/models/networks/resnet_dcn.py | CenterFace/src/lib/models/networks/resnet_dcn.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 预训练模型 | -| 开源代码引入 | https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/src/tools/voc_eval_lib/setup.py | CenterFace/src/tools/voc_eval_lib/setup.py | http://code.activestate.com/recipes/52224-find-a-file-given-a-search-path/ | 源码实现 | -| 开源代码引入 | https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/src/lib/models/networks/resnet_dcn.py | CenterFace/src/lib/models/networks/resnet_dcn.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 预训练模型 | -| 开源代码引入 | https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/src/lib/models/networks/resnet_dcn.py | CenterFace/src/lib/models/Backbone/msra_resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 预训练模型 | -| 开源代码引入 | https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/src/lib/models/Backbone/mobilenet_v2.py | CenterFace/src/lib/models/Backbone/centerface_mobilenet_v2.py | https://download.pytorch.org/models/mobilenet_v2-b0353104.pth | 预训练模型 | -| 开源代码引入 | https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/src/lib/models/networks/resnet_dcn.py | CenterFace/src/lib/models/networks/resnet_dcn.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 预训练模型 | -| 开源代码引入 | https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/src/lib/logger.py | CenterFace/src/lib/logger.py | https://gist.github.com/gyglim/1f8dfb1b5c82627ae3efcfbbadb9f514 | 源码实现 | -| 开源代码引入 | https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/src/lib/models/networks/resnet_dcn.py | CenterFace/src/lib/models/networks/msra_resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 预训练模型 | -| 开源代码引入 | https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/src/lib/models/networks/resnet_dcn.py | CenterFace/src/lib/models/networks/resnet_dcn.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 预训练模型 | -| 开源代码引入 | https://github.com/chenjun2hao/CenterFace.pytorch/blob/master/src/lib/models/networks/large_hourglass.py | CenterFace/src/lib/models/Backbone/large_hourglass.py | https://github.com/princeton-vl/CornerNet | 源码实现 | +| 文件位置 | 公网地址 | 公网地址用途 | +|-----------------------------------------------------------------------------------------------------------------|-----------------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CenterFace/src/lib/models/Backbone/centerface_mobilenet_v2.py | https://download.pytorch.org/models/mobilenet_v2-b0353104.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CenterFace/src/lib/models/Backbone/centerface_mobilenet_v2_fpn.py | https://download.pytorch.org/models/mobilenet_v2-b0353104.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CenterFace/src/lib/models/Backbone/dlav0.py | http://dl.yf.io/dla/models | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CenterFace/src/lib/models/Backbone/efficientdet/utils.py | http://storage.googleapis.com/public-models/efficientnet/efficientnet-b7-dcc49843.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CenterFace/src/lib/models/Backbone/efficientdet/utils.py | http://storage.googleapis.com/public-models/efficientnet/efficientnet-b6-c76e70fd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CenterFace/src/lib/models/Backbone/efficientdet/utils.py | http://storage.googleapis.com/public-models/efficientnet/efficientnet-b5-b6417697.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CenterFace/src/lib/models/Backbone/efficientdet/utils.py | http://storage.googleapis.com/public-models/efficientnet/efficientnet-b4-6ed6700e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CenterFace/src/lib/models/Backbone/efficientdet/utils.py | http://storage.googleapis.com/public-models/efficientnet/efficientnet-b3-5fb5a3c3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CenterFace/src/lib/models/Backbone/efficientdet/utils.py | http://storage.googleapis.com/public-models/efficientnet/efficientnet-b2-8bb594d6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CenterFace/src/lib/models/Backbone/efficientdet/utils.py | http://storage.googleapis.com/public-models/efficientnet/efficientnet-b1-f1951068.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CenterFace/src/lib/models/Backbone/efficientdet/utils.py | http://storage.googleapis.com/public-models/efficientnet/efficientnet-b0-355c32eb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CenterFace/src/lib/models/Backbone/mobilenet_v2.py | https://download.pytorch.org/models/mobilenet_v2-b0353104.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CenterFace/src/lib/models/Backbone/mobilenetv2.py | https://download.pytorch.org/models/mobilenet_v2-b0353104.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CenterFace/src/lib/models/Backbone/msra_resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CenterFace/src/lib/models/Backbone/msra_resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CenterFace/src/lib/models/Backbone/msra_resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CenterFace/src/lib/models/Backbone/msra_resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CenterFace/src/lib/models/Backbone/msra_resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CenterFace/src/lib/models/Backbone/pose_dla_dcn.py | http://dl.yf.io/dla/models | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CenterFace/src/lib/models/Backbone/resnet_dcn.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CenterFace/src/lib/models/Backbone/resnet_dcn.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CenterFace/src/lib/models/Backbone/resnet_dcn.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CenterFace/src/lib/models/Backbone/resnet_dcn.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CenterFace/src/lib/models/Backbone/resnet_dcn.py | 
https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CenterFace/src/lib/models/networks/dlav0.py | http://dl.yf.io/dla/models | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CenterFace/src/lib/models/networks/msra_resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CenterFace/src/lib/models/networks/msra_resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CenterFace/src/lib/models/networks/msra_resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CenterFace/src/lib/models/networks/msra_resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CenterFace/src/lib/models/networks/msra_resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CenterFace/src/lib/models/networks/pose_dla_dcn.py | http://dl.yf.io/dla/models | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CenterFace/src/lib/models/networks/resnet_dcn.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CenterFace/src/lib/models/networks/resnet_dcn.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CenterFace/src/lib/models/networks/resnet_dcn.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CenterFace/src/lib/models/networks/resnet_dcn.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CenterFace/src/lib/models/networks/resnet_dcn.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CenterFace/src/tools/kitti_eval/evaluate_object_3d.cpp | http://www.cvlibs.net/datasets/kitti/user_submit_check_login.php?benchmark=object&user=%s&result=%s | 邮件地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CenterFace/src/tools/kitti_eval/mail.h | noreply@cvlibs.net | 邮箱地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CenterFace/src/tools/kitti_eval/mail.h | noreply@cvlibs.net | 邮箱地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/detection/CenterNet/public_address_statement.md b/PyTorch/contrib/cv/detection/CenterNet/public_address_statement.md index 2bbc72c3a625d54c239a9e547d90a824f4cda5fa..98337643e92f4ec59f27c3057d086806f69b6260 100644 --- a/PyTorch/contrib/cv/detection/CenterNet/public_address_statement.md +++ b/PyTorch/contrib/cv/detection/CenterNet/public_address_statement.md @@ -1,35 +1,23 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|-------------------------------------------------------------------------------------------------|-------------------------------------------------------|-----------------------------------------------------------------------------------------------------|---------| -| 开源代码引入 | https://github.com/xingyizhou/CenterNet/blob/master/src/tools/get_pascal_voc.sh | CenterNet/src/tools/get_pascal_voc.sh | http://host.robots.ox.ac.uk/pascal/VOC/voc2007/VOCtrainval_06-Nov-2007.tar | 下载数据集 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet/blob/master/src/tools/get_pascal_voc.sh | 
CenterNet/src/tools/get_pascal_voc.sh | http://host.robots.ox.ac.uk/pascal/VOC/voc2007/VOCtest_06-Nov-2007.tar | 下载数据集 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet/blob/master/src/tools/get_pascal_voc.sh | CenterNet/src/tools/get_pascal_voc.sh | http://host.robots.ox.ac.uk/pascal/VOC/voc2007/VOCdevkit_08-Jun-2007.tar | 下载数据集 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet/blob/master/src/tools/get_pascal_voc.sh | CenterNet/src/tools/get_pascal_voc.sh | http://host.robots.ox.ac.uk/pascal/VOC/voc2012/VOCtrainval_11-May-2012.tar | 下载数据集 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet/blob/master/src/tools/get_pascal_voc.sh | CenterNet/src/tools/get_pascal_voc.sh | http://host.robots.ox.ac.uk/pascal/VOC/voc2012/VOCdevkit_18-May-2011.tar | 下载数据集 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet/blob/master/src/tools/get_pascal_voc.sh | CenterNet/src/tools/get_pascal_voc.sh | https://s3.amazonaws.com/images.cocodataset.org/external/external_PASCAL_VOC.zip | 下载数据集 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet/blob/master/src/tools/kitti_eval/evaluate_object_3d.cpp | CenterNet/src/tools/kitti_eval/evaluate_object_3d.cpp | http://www.cvlibs.net/datasets/kitti/user_submit_check_login.php?benchmark=object&user=%s&result=%s | 结果保存的地址 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet/blob/master/src/tools/kitti_eval/mail.h | CenterNet/src/tools/kitti_eval/mail.h | noreply@cvlibs.net | 邮箱地址 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet/blob/master/src/lib/models/networks/DCNv2/src/cuda/dcn_v2_im2col_cuda_double.h | CenterNet/src/lib/models/networks/dcn/modules/deform_conv.py | https://arxiv.org/abs/1811.11168 | 论文地址 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet/blob/master/src/lib/logger.py | CenterNet/src/lib/logger.py | https://gist.github.com/gyglim/1f8dfb1b5c82627ae3efcfbbadb9f514 | 源码实现 | -| 开发引入 | / | CenterNet/src/lib/models/networks/DCNv2/setup.py | https://github.com/charlesshang/DCNv2 | 源码实现 | -| 开发引入 | / | CenterNet/src/lib/models/networks/pose_dla_dcn.py | http://dl.yf.io/dla/models | 预训练模型 | -| 开发引入 | / | CenterNet/src/lib/models/networks/dcn/modules/deform_conv.py | https://github.com/open-mmlab/mmcv/blob/master/mmcv/ops/modulated_deform_conv.py | 源码实现 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet/blob/master/src/lib/models/networks/DCNv2/src/dcn_v2_cuda_double.c | CenterNet/src/lib/models/networks/DCNv2/src/cpu/dcn_v2_cpu.cpp | http://docs.nvidia.com/cuda/cublas/#cublas-lt-t-gt-gemm | 相关说明 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet/blob/master/src/lib/models/networks/resnet_dcn.py | CenterNet/src/lib/models/networks/resnet_dcn.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 预训练模型 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet/blob/master/src/lib/models/networks/DCNv2/src/cuda/dcn_v2_im2col_cuda_double.h | CenterNet/src/lib/models/networks/DCNv2/src/cuda/dcn_v2_im2col_cuda.h | https://arxiv.org/abs/1811.11168 | 论文地址 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet/blob/master/src/tools/kitti_eval/evaluate_object_3d.cpp | CenterNet/src/tools/kitti_eval/evaluate_object_3d.cpp | https://github.com/prclibo/kitti_eval | 源码实现 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet/blob/master/src/lib/models/networks/resnet_dcn.py | CenterNet/src/lib/models/networks/msra_resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 预训练模型 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet/blob/master/src/lib/models/networks/resnet_dcn.py | 
CenterNet/src/lib/models/networks/msra_resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 预训练模型 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet/blob/master/src/lib/models/networks/resnet_dcn.py | CenterNet/src/lib/models/networks/resnet_dcn.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 预训练模型 | -| 开发引入 | / | CenterNet/src/lib/models/networks/dlav0.py | http://dl.yf.io/dla/models | 预训练模型 | -| 开发引入 | / | CenterNet/src/lib/models/networks/DCNv2/src/cuda/dcn_v2_cuda.cu | https://github.com/pytorch/pytorch/blob/master/aten/src/THC/generic/THCTensorMathBlas.cu | 源码实现 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet/blob/master/src/lib/models/networks/DCNv2/src/dcn_v2_cuda_double.c | CenterNet/src/lib/models/networks/DCNv2/src/cpu/dcn_v2_cpu.cpp | https://github.com/torch/cunn/blob/master/lib/THCUNN/generic/SpatialConvolutionMM.cu | 源码实现 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet/blob/master/src/lib/models/networks/resnet_dcn.py | CenterNet/src/lib/models/networks/resnet_dcn.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 预训练模型 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet/blob/master/src/lib/models/networks/resnet_dcn.py | CenterNet/src/lib/models/networks/msra_resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 预训练模型 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet/blob/master/src/tools/voc_eval_lib/setup.py | CenterNet/src/tools/voc_eval_lib/setup.py | http://code.activestate.com/recipes/52224-find-a-file-given-a-search-path/ | 相关说明 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet/blob/master/src/lib/models/networks/DCNv2/src/cuda/dcn_v2_im2col_cuda_double.h | CenterNet/src/lib/models/networks/DCNv2/src/cpu/dcn_v2_im2col_cpu.h | https://arxiv.org/abs/1811.11168 | 论文地址 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet/blob/master/src/lib/models/networks/resnet_dcn.py | CenterNet/src/lib/models/networks/resnet_dcn.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 预训练模型 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet/blob/master/src/lib/utils/image.py | CenterNet/src/lib/utils/image.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet/blob/master/src/lib/models/networks/resnet_dcn.py | CenterNet/src/lib/models/networks/msra_resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 预训练模型 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet/blob/master/src/lib/models/networks/resnet_dcn.py | CenterNet/src/lib/models/networks/msra_resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 预训练模型 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet/blob/master/src/lib/models/networks/DCNv2/src/dcn_v2_cuda_double.c | CenterNet/src/lib/models/networks/DCNv2/src/cuda/dcn_v2_cuda.cu | https://github.com/torch/cunn/blob/master/lib/THCUNN/generic/SpatialConvolutionMM.cu | 源码实现 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet/blob/master/src/lib/models/networks/resnet_dcn.py | CenterNet/src/lib/models/networks/resnet_dcn.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 预训练模型 | +| 文件位置 | 公网地址 | 公网地址用途 | +|-----------------------------------------------------------------------------------------------------|-----------------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CenterNet/src/lib/models/networks/dlav0.py | http://dl.yf.io/dla/models | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CenterNet/src/lib/models/networks/msra_resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CenterNet/src/lib/models/networks/msra_resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CenterNet/src/lib/models/networks/msra_resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CenterNet/src/lib/models/networks/msra_resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CenterNet/src/lib/models/networks/msra_resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CenterNet/src/lib/models/networks/pose_dla_dcn.py | http://dl.yf.io/dla/models | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CenterNet/src/lib/models/networks/resnet_dcn.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CenterNet/src/lib/models/networks/resnet_dcn.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CenterNet/src/lib/models/networks/resnet_dcn.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CenterNet/src/lib/models/networks/resnet_dcn.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CenterNet/src/lib/models/networks/resnet_dcn.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CenterNet/src/tools/get_pascal_voc.sh | http://host.robots.ox.ac.uk/pascal/VOC/voc2007/VOCtrainval_06-Nov-2007.tar | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CenterNet/src/tools/get_pascal_voc.sh | http://host.robots.ox.ac.uk/pascal/VOC/voc2007/VOCtest_06-Nov-2007.tar | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CenterNet/src/tools/get_pascal_voc.sh | http://host.robots.ox.ac.uk/pascal/VOC/voc2012/VOCtrainval_11-May-2012.tar | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CenterNet/src/tools/get_pascal_voc.sh | https://s3.amazonaws.com/images.cocodataset.org/external/external_PASCAL_VOC.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CenterNet/src/tools/get_pascal_voc.sh | http://host.robots.ox.ac.uk/pascal/VOC/voc2012/VOCdevkit_18-May-2011.tar | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CenterNet/src/tools/get_pascal_voc.sh | http://host.robots.ox.ac.uk/pascal/VOC/voc2007/VOCdevkit_08-Jun-2007.tar | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CenterNet/src/tools/kitti_eval/evaluate_object_3d.cpp | http://www.cvlibs.net/datasets/kitti/user_submit_check_login.php?benchmark=object&user=%s&result=%s | 邮件地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CenterNet/src/tools/kitti_eval/mail.h | noreply@cvlibs.net | 邮箱地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/CenterNet/src/tools/kitti_eval/mail.h | noreply@cvlibs.net | 邮箱地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/detection/DETR/public_address_statement.md b/PyTorch/contrib/cv/detection/DETR/public_address_statement.md index 69a3c4855af7cb9b0e416d667956787df5508441..76d041e03877c34c5083d9b0d367f903a6a33009 
100644 --- a/PyTorch/contrib/cv/detection/DETR/public_address_statement.md +++ b/PyTorch/contrib/cv/detection/DETR/public_address_statement.md @@ -1,17 +1,9 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|----------------------------------------------------------------------------------------------------------------------------------|----------------------------------------------------|-----------------------|--------| -| 开源代码引入 | https://github.com/facebookresearch/detr/hubconf.py | DETR/hubconf.py | https://dl.fbaipublicfiles.com/detr/detr-r101-panoptic-40021d53.pth | 预训练模型 | -| 开发引入 | / | DETR/hubconf.py | https://dl.fbaipublicfiles.com/detr/detr-r50-e632da11.pth | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/detr/models/segmentation.py | DETR/models/segmentation.py | https://arxiv.org/abs/1708.02002 | 论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/detr/datasets/coco_eval.py | DETR/datasets/coco_eval.py | https://github.com/pytorch/vision/blob/edfd5a7/references/detection/coco_eval.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detr/datasets/coco.py | DETR/datasets/coco.py | https://github.com/pytorch/vision/blob/13b35ff/references/detection/coco_utils.py | 源码实现 | -| 开发引入 | / | DETR/hubconf.py | https://dl.fbaipublicfiles.com/detr/detr-r50-dc5-f0fb7ef5.pth | 预训练模型 | -| 开发引入 | / | DETR/hubconf.py | https://dl.fbaipublicfiles.com/detr/detr-r101-dc5-a2e86def.pth | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/detr/models/detr.py | DETR/models/detr.py | https://github.com/facebookresearch/detr/issues/108#issuecomment-650269223 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detr/hubconf.py | DETR/hubconf.py | https://dl.fbaipublicfiles.com/detr/detr-r50-dc5-panoptic-da08f1b1.pth | 预训练模型 | -| 开发引入 | / | DETR/hubconf.py | https://dl.fbaipublicfiles.com/detr/detr-r101-2c7b67e5.pth | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/detr/util/box_ops.py | DETR/util/box_ops.py | https://giou.stanford.edu/ | 相关链接 | -| 开源代码引入 | https://github.com/facebookresearch/detr/hubconf.py | DETR/hubconf.py | https://dl.fbaipublicfiles.com/detr/detr-r50-panoptic-00ce5173.pth | 预训练模型 | -| 开发引入 | / | detection/DETR/1.5_requirements.txt | https://github.com/cocodataset/cocoapi.git#subdirectory=PythonAPI&egg=pycocotools | 相关依赖 | -| 开发引入 | / | detection/DETR/1.8_requirements.txt | https://github.com/cocodataset/cocoapi.git#subdirectory=PythonAPI&egg=pycocotools | 相关依赖 | - +| 文件位置 | 公网地址 | 公网地址用途 | +|---------------------------------------------------------------|------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/DETR/hubconf.py | https://dl.fbaipublicfiles.com/detr/detr-r50-panoptic-00ce5173.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/DETR/hubconf.py | https://dl.fbaipublicfiles.com/detr/detr-r50-e632da11.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/DETR/hubconf.py | https://dl.fbaipublicfiles.com/detr/detr-r50-dc5-panoptic-da08f1b1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/DETR/hubconf.py | https://dl.fbaipublicfiles.com/detr/detr-r50-dc5-f0fb7ef5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/DETR/hubconf.py | https://dl.fbaipublicfiles.com/detr/detr-r101-panoptic-40021d53.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/DETR/hubconf.py | https://dl.fbaipublicfiles.com/detr/detr-r101-dc5-a2e86def.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/DETR/hubconf.py 
| https://dl.fbaipublicfiles.com/detr/detr-r101-2c7b67e5.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/detection/FAN/public_address_statement.md b/PyTorch/contrib/cv/detection/FAN/public_address_statement.md index b706c7a357a5a75ad47be261a507a13dfd5c08ce..c0fef8b69e6330bb215a3d666dfaf429734e5d97 100644 --- a/PyTorch/contrib/cv/detection/FAN/public_address_statement.md +++ b/PyTorch/contrib/cv/detection/FAN/public_address_statement.md @@ -1,26 +1,16 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|----------------------------------------------------------------------------------------------------------------------------------|----------------------------------------------------|-----------------------|--------| -| 开源代码引入 | https://github.com/1adrianb/face-alignment/face_alignment/detection/blazeface/blazeface_detector.py | FAN/face_alignment/detection/blazeface/blazeface_detector.py | https://github.com/hollance/BlazeFace-PyTorch/blob/master/blazeface.pth?raw=true | 源码实现 | -| 开源代码引入 | https://github.com/1adrianb/face-alignment/setup.py | FAN/setup.py | https://github.com/pytorch/vision/blob/master/setup.py | 源码实现 | -| 开源代码引入 | https://github.com/1adrianb/face-alignment/setup.py | FAN/face_alignment/__init__.py | adrian@adrianbulat.com | 邮箱地址 | -| 开源代码引入 | https://github.com/1adrianb/face-alignment/setup.py | FAN/setup.py | https://github.com/1adrianb/face-alignment | 源码实现 | -| 开源代码引入 | https://github.com/1adrianb/face-alignment/setup.py | FAN/setup.py | adrian@adrianbulat.com | 邮箱地址 | -| 开源代码引入 | https://github.com/1adrianb/face-alignment/Dockerfile | FAN/Dockerfile | https://github.com/pytorch/pytorch/blob/master/Dockerfile | 源码实现 | -| 开源代码引入 | https://github.com/1adrianb/face-alignment/face_alignment/api.py | FAN/face_alignment/api.py | https://www.adrianbulat.com/downloads/python-fan/3DFAN4_1.5-176570af4d.zip | 下载链接 | -| 开源代码引入 | https://github.com/1adrianb/face-alignment/face_alignment/detection/dlib/dlib_detector.py | FAN/face_alignment/detection/dlib/dlib_detector.py | https://www.adrianbulat.com/downloads/dlib/mmod_human_face_detector.dat | 下载链接 | -| 开源代码引入 | https://github.com/1adrianb/face-alignment/face_alignment/detection/blazeface/blazeface_detector.py | FAN/face_alignment/detection/blazeface/blazeface_detector.py | https://github.com/hollance/BlazeFace-PyTorch/blob/master/anchors.npy?raw=true | 源码实现 | -| 开源代码引入 | https://github.com/1adrianb/face-alignment/face_alignment/api.py | FAN/face_alignment/api.py | https://www.adrianbulat.com/downloads/python-fan/depth-6c4283c0e0.zip | 下载链接 | -| 开源代码引入 | https://github.com/1adrianb/face-alignment/face_alignment/api.py | FAN/face_alignment/api.py | https://www.adrianbulat.com/downloads/python-fan/depth_1.6-2aa3f18772.zip | 下载链接 | -| 开源代码引入 | https://github.com/1adrianb/face-alignment/setup.py | FAN/Dockerfile | https://github.com/1adrianb/face-alignment | 源码实现 | -| 开源代码引入 | https://github.com/1adrianb/face-alignment/face_alignment/detection/sfd/sfd_detector.py | FAN/face_alignment/detection/sfd/sfd_detector.py | https://www.adrianbulat.com/downloads/python-fan/s3fd-619a316812.pth | 预训练模型 | -| 开源代码引入 | https://github.com/1adrianb/face-alignment/face_alignment/api.py | FAN/face_alignment/api.py | https://www.adrianbulat.com/downloads/python-fan/2DFAN4_1.5-a60332318a.zip | 下载链接 | -| 开源代码引入 | https://github.com/1adrianb/face-alignment/face_alignment/api.py | FAN/face_alignment/api.py | https://www.adrianbulat.com/downloads/python-fan/2DFAN4-cd938726ad.zip | 下载链接 | -| 开发引入 | / | 
FAN/face_alignment/detection/blazeface/net_blazeface.py | https://github.com/tkat0/PyTorch_BlazeFace/ | 源码实现 | -| 开源代码引入 | https://github.com/1adrianb/face-alignment/face_alignment/api.py | FAN/face_alignment/api.py | https://www.adrianbulat.com/downloads/python-fan/3DFAN4-4a694010b9.zip | 下载链接 | -| 开源代码引入 | https://github.com/1adrianb/face-alignment/face_alignment/api.py | FAN/face_alignment/api.py | https://www.adrianbulat.com/downloads/python-fan/2DFAN4_1.6-c827573f02.zip | 下载链接 | -| 开源代码引入 | https://github.com/1adrianb/face-alignment/Dockerfile | FAN/Dockerfile | https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh | 下载链接 | -| 开源代码引入 | https://github.com/1adrianb/face-alignment/face_alignment/detection/blazeface/net_blazeface.py | FAN/face_alignment/detection/blazeface/net_blazeface.py | https://github.com/google/mediapipe/ | 源码实现 | -| 开源代码引入 | https://github.com/1adrianb/face-alignment/face_alignment/api.py | FAN/face_alignment/api.py | https://www.adrianbulat.com/downloads/python-fan/3DFAN4_1.6-ec5cf40a1d.zip | 下载链接 | -| 开源代码引入 | https://github.com/1adrianb/face-alignment/face_alignment/detection/blazeface/net_blazeface.py | FAN/face_alignment/detection/blazeface/net_blazeface.py | https://github.com/amdegroot/ssd.pytorch/blob/master/layers/box_utils.py | 源码实现 | -| 开源代码引入 | https://github.com/1adrianb/face-alignment/face_alignment/api.py | FAN/face_alignment/api.py | https://www.adrianbulat.com/downloads/python-fan/depth_1.5-bc10f98e39.zip | 下载链接 | -| 开发引入 | / | FAN/face_alignment/FAN.py | https://github.com/1adrianb/face-alignment/blob/master/face_alignment/models.py | 源码实现 | +| 文件位置 | 公网地址 | 公网地址用途 | +|--------------------------------------------------------------------------------------------------|----------------------------------------------------------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FAN/Dockerfile | https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh | miniconda链接 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FAN/face_alignment/__init__.py | adrian@adrianbulat.com | 邮箱地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FAN/face_alignment/api.py | https://www.adrianbulat.com/downloads/python-fan/depth-6c4283c0e0.zip | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FAN/face_alignment/api.py | https://www.adrianbulat.com/downloads/python-fan/3DFAN4-4a694010b9.zip | 下载链接 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FAN/face_alignment/api.py | https://www.adrianbulat.com/downloads/python-fan/2DFAN4-cd938726ad.zip | 下载链接 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FAN/face_alignment/api.py | https://www.adrianbulat.com/downloads/python-fan/depth_1.6-2aa3f18772.zip | 模型权重 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FAN/face_alignment/api.py | https://www.adrianbulat.com/downloads/python-fan/depth_1.5-bc10f98e39.zip | 模型权重 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FAN/face_alignment/api.py | https://www.adrianbulat.com/downloads/python-fan/3DFAN4_1.6-ec5cf40a1d.zip | 模型权重 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FAN/face_alignment/api.py | https://www.adrianbulat.com/downloads/python-fan/3DFAN4_1.5-176570af4d.zip | 模型权重 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FAN/face_alignment/api.py | https://www.adrianbulat.com/downloads/python-fan/2DFAN4_1.6-c827573f02.zip | 模型权重 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FAN/face_alignment/api.py | https://www.adrianbulat.com/downloads/python-fan/2DFAN4_1.5-a60332318a.zip | 模型权重 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FAN/face_alignment/detection/dlib/dlib_detector.py | https://www.adrianbulat.com/downloads/dlib/mmod_human_face_detector.dat | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FAN/face_alignment/detection/sfd/sfd_detector.py | https://www.adrianbulat.com/downloads/python-fan/s3fd-619a316812.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FAN/setup.py | adrian@adrianbulat.com | 作者邮箱 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/detection/FCOS/public_address_statement.md b/PyTorch/contrib/cv/detection/FCOS/public_address_statement.md index 94eb66eccbc8ffbc1a592821738e62b27b08606c..74edc2537b8a7accc1bb995e818f18d2f2550490 100644 --- a/PyTorch/contrib/cv/detection/FCOS/public_address_statement.md +++ b/PyTorch/contrib/cv/detection/FCOS/public_address_statement.md @@ -1,152 +1,8 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ---- | ------------ | ------ | ------------------------------------ | -------- | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/structures/bbox/demodata.py| FCOS/mmdet/core/bbox/demodata.py | https://gitlab.kitware.com/computer-vision/kwarray/blob/master/kwarray/util_random.py#L270 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/structures/bbox/demodata.py| FCOS/mmdet/core/bbox/demodata.py | https://gitlab.kitware.com/computer-vision/kwimage/blob/master/kwimage/structs/boxes.py#L1390 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/configs/yolact/README.md| FCOS/mmdet/core/post_processing/bbox_nms.py | https://arxiv.org/abs/1904.02689 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/datasets/pipelines/auto_augment.py| FCOS/mmdet/datasets/pipelines/auto_augment.py | https://arxiv.org/pdf/1906.11172 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/configs/instaboost/README.md| FCOS/mmdet/datasets/pipelines/instaboost.py | https://arxiv.org/abs/1908.07801 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/configs/instaboost/README.md| FCOS/mmdet/datasets/pipelines/instaboost.py | https://github.com/GothicAi/Instaboost | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/structures/mask/structures.py| FCOS/mmdet/core/mask/structures.py | https://github.com/facebookresearch/detectron2/blob/ffff8acc35ea88ad1cb1806ab0f00b4c1c5dbfd9/detectron2/structures/masks.py#L387 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/structures/mask/structures.py| FCOS/mmdet/core/mask/structures.py | https://stackoverflow.com/questions/24467972/calculate-area-of-polygon-given-x-y-coordinates | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/configs/detectors/README.md| FCOS/mmdet/models/backbones/detectors_resnet.py | https://arxiv.org/pdf/2006.02334.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/models/backbones/hourglass.py| FCOS/mmdet/models/backbones/hourglass.py | https://arxiv.org/abs/1603.06937 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/docs/changelog.md| FCOS/mmdet/models/backbones/regnet.py | https://arxiv.org/abs/2003.13678 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/models/backbones/hrnet.py| FCOS/mmdet/models/backbones/hrnet.py | https://arxiv.org/abs/1904.04514 | 参考论文地址 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmdetection/blob/main/docs/robustness_benchmarking.md| FCOS/mmdet/datasets/pipelines/transforms.py | https://github.com/bethgelab/imagecorruptions | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/datasets/pipelines/transforms.py| FCOS/mmdet/datasets/pipelines/transforms.py | https://albumentations.readthedocs.io | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/models/backbones/resnet.py| FCOS/mmdet/models/backbones/resnet.py | https://arxiv.org/pdf/1812.01187.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/docs/changelog.md| FCOS/mmdet/models/dense_heads/atss_head.py | https://arxiv.org/abs/1912.02424 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/datasets/pipelines/transforms.py| FCOS/mmdet/datasets/pipelines/transforms.py | https://arxiv.org/abs/1708.04552 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/docs/changelog.md| FCOS/mmdet/models/dense_heads/centripetal_head.py | https://arxiv.org/abs/2003.09119 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/models/dense_heads/corner_head.py| FCOS/mmdet/models/dense_heads/corner_head.py | https://github.com/princeton-vl/CornerNet/blob/master/models/py_utils/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/docs/changelog.md| FCOS/mmdet/models/dense_heads/corner_head.py | https://arxiv.org/abs/1808.01244 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/models/dense_heads/fcos_head.py| FCOS/mmdet/models/dense_heads/fcos_head.py | https://arxiv.org/abs/1904.01355 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/models/dense_heads/fcos_head.py| FCOS/mmdet/models/dense_heads/fcos_head.py | https://github.com/tianzhi0549/FCOS | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/models/dense_heads/fcos_head.py| FCOS/mmdet/models/dense_heads/fcos_head.py | https://github.com/tianzhi0549/FCOS/issues/89#issuecomment-516877042 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/models/dense_heads/fcos_head.py| FCOS/mmdet/models/dense_heads/fcos_head.py.myy | https://arxiv.org/abs/1904.01355 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/models/dense_heads/fcos_head.py| FCOS/mmdet/models/dense_heads/fcos_head.py.myy | https://github.com/tianzhi0549/FCOS | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/models/dense_heads/fcos_head.py| FCOS/mmdet/models/dense_heads/fcos_head.py.myy | https://github.com/tianzhi0549/FCOS/issues/89#issuecomment-516877042 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/configs/foveabox/README.md| FCOS/mmdet/models/dense_heads/fovea_head.py | https://arxiv.org/abs/1904.03797 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/docs/changelog.md| FCOS/mmdet/models/dense_heads/free_anchor_retina_head.py | https://arxiv.org/abs/1909.02466 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/docs/changelog.md| FCOS/mmdet/models/dense_heads/fsaf_head.py | https://arxiv.org/abs/1903.00621 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/models/dense_heads/fcos_head.py| FCOS/mmdet/models/dense_heads/fcos_head.py.myy | https://arxiv.org/abs/1904.01355 | 参考论文地址 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmdetection/blob/main/mmdet/models/dense_heads/fcos_head.py| FCOS/mmdet/models/dense_heads/fcos_head.py.myy | https://github.com/tianzhi0549/FCOS | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/models/dense_heads/fcos_head.py| FCOS/mmdet/models/dense_heads/fcos_head.py.myy | https://github.com/tianzhi0549/FCOS/issues/89#issuecomment-516877042 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/configs/gfl/README.md| FCOS/mmdet/models/dense_heads/gfl_head.py | https://arxiv.org/abs/2006.04388 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/configs/guided_anchoring/README.md| FCOS/mmdet/models/dense_heads/guided_anchor_head.py | https://arxiv.org/abs/1901.03278 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/docs/changelog.md| FCOS/mmdet/models/dense_heads/nasfcos_head.py | https://arxiv.org/abs/1906.04423 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/models/dense_heads/paa_head.py| FCOS/mmdet/models/dense_heads/paa_head.py | https://github.com/kkhoot/PAA/blob/master/paa_core | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/docs/changelog.md| FCOS/mmdet/models/dense_heads/paa_head.py | https://arxiv.org/abs/2007.08103 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/models/dense_heads/paa_head.py| FCOS/mmdet/models/dense_heads/paa_head.py | https://github.com/kkhoot/PAA/issues/8 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/models/dense_heads/paa_head.py| FCOS/mmdet/models/dense_heads/paa_head.py | https://github.com/kkhoot/PAA/issues/9 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/models/dense_heads/retina_head.py| FCOS/mmdet/models/dense_heads/retina_head.py | https://arxiv.org/pdf/1708.02002.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/configs/sabl/README.md| FCOS/mmdet/models/dense_heads/sabl_retina_head.py | https://arxiv.org/abs/1912.04260 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/models/dense_heads/ssd_head.py| FCOS/mmdet/models/dense_heads/ssd_head.py | https://arxiv.org/abs/1512.02325 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/configs/vfnet/README.md| FCOS/mmdet/models/dense_heads/vfnet_head.py | https://arxiv.org/abs/2008.13367 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/configs/yolact/README.md| FCOS/mmdet/models/dense_heads/yolact_head.py | https://arxiv.org/abs/1904.02689 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/docs/changelog.md| FCOS/mmdet/models/dense_heads/yolo_head.py | https://arxiv.org/abs/1804.02767 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/docs/changelog.md| FCOS/mmdet/models/detectors/atss.py | https://arxiv.org/abs/1912.02424 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/configs/yolact/README.md| FCOS/mmdet/models/dense_heads/yolact_head.py | https://arxiv.org/abs/1904.02689 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/models/detectors/cascade_rcnn.py| FCOS/mmdet/models/detectors/cascade_rcnn.py | https://arxiv.org/abs/1906.09756 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/configs/yolact/README.md| FCOS/mmdet/models/dense_heads/yolact_head.py | https://arxiv.org/abs/1904.02689 | 
参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/docs/changelog.md| FCOS/mmdet/models/detectors/cornernet.py | https://arxiv.org/abs/1808.01244 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/models/detectors/faster_rcnn.py| FCOS/mmdet/models/detectors/faster_rcnn.py | https://arxiv.org/abs/1506.01497 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/models/detectors/fast_rcnn.py| FCOS/mmdet/models/detectors/fast_rcnn.py | https://arxiv.org/abs/1504.08083 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/models/dense_heads/fcos_head.py| FCOS/mmdet/models/detectors/fcos.py | https://arxiv.org/abs/1904.01355 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/configs/foveabox/README.md| FCOS/mmdet/models/detectors/fovea.py | https://arxiv.org/abs/1904.03797 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/docs/changelog.md| FCOS/mmdet/models/detectors/fsaf.py | https://arxiv.org/abs/1903.00621 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/models/detectors/grid_rcnn.py| FCOS/mmdet/models/detectors/grid_rcnn.py | https://arxiv.org/abs/1811.12030 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/models/detectors/grid_rcnn.py| FCOS/mmdet/models/detectors/grid_rcnn.py | https://arxiv.org/abs/1906.05688 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/configs/htc/README.md| FCOS/mmdet/models/detectors/htc.py | https://arxiv.org/abs/1901.07518 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/configs/cityscapes/README.md| FCOS/mmdet/models/detectors/mask_rcnn.py | https://arxiv.org/abs/1703.06870 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/models/detectors/mask_scoring_rcnn.py| FCOS/mmdet/models/detectors/mask_scoring_rcnn.py | https://arxiv.org/abs/1903.00241 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/docs/changelog.md| FCOS/mmdet/models/detectors/nasfcos.py | https://arxiv.org/abs/1906.0442 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/models/detectors/paa.py| FCOS/mmdet/models/detectors/paa.py | https://arxiv.org/pdf/2007.08103.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/docs/changelog.md| FCOS/mmdet/models/detectors/point_rend.py | https://arxiv.org/abs/1912.08193 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/models/detectors/reppoints_detector.py| FCOS/mmdet/models/detectors/reppoints_detector.py | https://arxiv.org/pdf/1904.11490 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/models/detectors/retinanet.py| FCOS/mmdet/models/detectors/retinanet.py | https://arxiv.org/abs/1708.02002 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/configs/vfnet/README.md| FCOS/mmdet/models/detectors/vfnet.py | https://arxiv.org/abs/2008.13367 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/configs/yolact/README.md| FCOS/mmdet/models/detectors/yolact.py | https://arxiv.org/abs/1904.02689 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/configs/libra_rcnn/README.md| FCOS/mmdet/models/losses/balanced_l1_loss.py | https://arxiv.org/pdf/1904.02701.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/configs/libra_rcnn/README.md| 
FCOS/mmdet/models/losses/balanced_l1_loss.py | https://arxiv.org/pdf/1904.02701.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/models/losses/ae_loss.py| FCOS/mmdet/models/losses/ae_loss.py | https://arxiv.org/abs/1611.05424 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/docs/changelog.md| FCOS/mmdet/models/losses/ae_loss.py | https://arxiv.org/abs/1808.01244 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/models/losses/ae_loss.py| FCOS/mmdet/models/losses/ae_loss.py | https://github.com/princeton-vl/CornerNet/blob/master/models/py_utils/kp_utils.py#L180 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/models/detectors/retinanet.py| FCOS/mmdet/models/losses/gaussian_focal_loss.py | https://arxiv.org/abs/1708.02002 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/models/detectors/retinanet.py| FCOS/mmdet/models/losses/focal_loss.py | https://arxiv.org/abs/1708.02002 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/docs/changelog.md| FCOS/mmdet/models/losses/gaussian_focal_loss.py | https://arxiv.org/abs/1808.01244 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/models/losses/gaussian_focal_loss.py| FCOS/mmdet/models/losses/gaussian_focal_loss.py | https://github.com/princeton-vl/CornerNet/blob/master/models/py_utils/kp_utils.py#L152 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/configs/gfl/README.md| FCOS/mmdet/models/losses/gfocal_loss.py | https://arxiv.org/abs/2006.04388 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/models/detectors/retinanet.py| FCOS/mmdet/models/losses/focal_loss.py | https://arxiv.org/abs/1708.02002 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/configs/gfl/README.md| FCOS/mmdet/models/losses/gfocal_loss.py | https://arxiv.org/abs/2006.04388 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/configs/gfl/README.md| FCOS/mmdet/models/losses/gfocal_loss.py | https://arxiv.org/abs/2006.04388 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/models/losses/ghm_loss.py| FCOS/mmdet/models/losses/ghm_loss.py | https://arxiv.org/abs/1811.05181 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/models/detectors/retinanet.py| FCOS/mmdet/models/losses/focal_loss.py | https://arxiv.org/abs/1708.02002 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/models/losses/iou_loss.py| FCOS/mmdet/models/losses/iou_loss.py | https://arxiv.org/abs/1711.00164 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/configs/gfl/README.md| FCOS/mmdet/models/losses/gfocal_loss.py | https://arxiv.org/abs/2006.04388 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/models/losses/iou_loss.py| FCOS/mmdet/models/losses/iou_loss.py | https://arxiv.org/abs/1902.09630 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/models/losses/ghm_loss.py| FCOS/mmdet/models/losses/ghm_loss.py | https://arxiv.org/abs/1811.05181 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/docs/changelog.md| FCOS/mmdet/models/losses/iou_loss.py | https://arxiv.org/abs/1911.08287 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/models/losses/iou_loss.py| 
FCOS/mmdet/models/losses/iou_loss.py | https://github.com/Zzh-tju/DIoU | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/docs/changelog.md| FCOS/mmdet/models/losses/iou_loss.py | https://arxiv.org/abs/2005.03572 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/models/losses/iou_loss.py| FCOS/mmdet/models/losses/iou_loss.py | https://github.com/Zzh-tju/CIoU | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/configs/vfnet/README.md| FCOS/mmdet/models/losses/varifocal_loss.py | https://arxiv.org/abs/2008.13367 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/configs/vfnet/README.md| FCOS/mmdet/models/losses/varifocal_loss.py | https://arxiv.org/abs/2008.13367 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/models/necks/fpn.py| FCOS/mmdet/models/necks/fpn.py | https://arxiv.org/abs/1612.03144 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/models/necks/bfp.py| FCOS/mmdet/models/necks/bfp.py | https://arxiv.org/abs/1904.02701 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/configs/carafe/README.md| FCOS/mmdet/models/necks/fpn_carafe.py | https://arxiv.org/abs/1905.02188 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/models/backbones/hrnet.py| FCOS/mmdet/models/necks/hrfpn.py | https://arxiv.org/abs/1904.04514 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/docs/changelog.md| FCOS/mmdet/models/necks/nasfcos_fpn.py | https://arxiv.org/abs/1906.04423 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/models/necks/nas_fpn.py| FCOS/mmdet/models/necks/nas_fpn.py | https://arxiv.org/abs/1904.07392 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/docs/changelog.md| FCOS/mmdet/models/necks/pafpn.py | https://arxiv.org/abs/1803.01534 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/configs/detectors/README.md| FCOS/mmdet/models/necks/rfp.py | https://arxiv.org/pdf/2006.02334.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/configs/detectors/README.md| FCOS/mmdet/models/necks/rfp.py | https://arxiv.org/pdf/2006.02334.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/models/roi_heads/cascade_roi_head.py| FCOS/mmdet/models/roi_heads/cascade_roi_head.py | https://arxiv.org/abs/1712.00726 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/docs/tutorials/customize_models.md| FCOS/mmdet/models/roi_heads/double_roi_head.py | https://arxiv.org/abs/1904.06493 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/docs/changelog.md| FCOS/mmdet/models/roi_heads/dynamic_roi_head.py | https://arxiv.org/abs/2004.06002 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/models/detectors/grid_rcnn.py| FCOS/mmdet/models/roi_heads/grid_roi_head.py | https://arxiv.org/abs/1811.12030 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/configs/htc/README.md| FCOS/mmdet/models/roi_heads/htc_roi_head.py | https://arxiv.org/abs/1901.07518 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/models/detectors/mask_scoring_rcnn.py| FCOS/mmdet/models/roi_heads/mask_scoring_roi_head.py | https://arxiv.org/abs/1903.00241 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/docs/changelog.md| 
FCOS/mmdet/models/roi_heads/pisa_roi_head.py | https://arxiv.org/abs/1904.04821 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/models/roi_heads/mask_heads/mask_point_head.py| FCOS/mmdet/models/roi_heads/point_rend_roi_head.py | https://github.com/facebookresearch/detectron2/tree/master/projects/PointRend | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/docs/changelog.md| FCOS/mmdet/models/roi_heads/point_rend_roi_head.py | https://arxiv.org/abs/1912.08193 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/models/utils/gaussian_target.py| FCOS/mmdet/models/utils/gaussian_target.py | https://github.com/princeton-vl/CornerNet-Lite/blob/master/core/sample/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/structures/bbox/assigners/atss_assigner.py| FCOS/mmdet/core/bbox/assigners/atss_assigner.py | https://github.com/sfzhang15/ATSS/blob/master/atss_core/modeling/rpn/atss/loss.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/configs/sabl/README.md| FCOS/mmdet/core/bbox/coder/bucketing_bbox_coder.py | https://arxiv.org/abs/1912.04260 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/structures/bbox/coder/delta_xywh_bbox_coder.py| FCOS/mmdet/core/bbox/coder/delta_xywh_bbox_coder.py | https://arxiv.org/abs/1311.2524 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/structures/bbox/coder/delta_xywh_bbox_coder.py| FCOS/mmdet/core/bbox/coder/legacy_delta_xywh_bbox_coder.py | https://arxiv.org/abs/1311.2524 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/docs/changelog.md| FCOS/mmdet/core/bbox/coder/tblr_bbox_coder.py | https://arxiv.org/abs/1903.00621 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/structures/bbox/coder/delta_xywh_bbox_coder.py| FCOS/mmdet/core/bbox/coder/delta_xywh_bbox_coder.py | https://arxiv.org/abs/1311.2524 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/structures/bbox/coder/delta_xywh_bbox_coder.py| FCOS/mmdet/core/bbox/coder/legacy_delta_xywh_bbox_coder.py | https://arxiv.org/abs/1311.2524 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/structures/bbox/coder/yolo_bbox_coder.py| FCOS/mmdet/core/bbox/coder/yolo_bbox_coder.py | https://arxiv.org/abs/1506.02640 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/configs/libra_rcnn/README.md| FCOS/mmdet/core/bbox/samplers/iou_balanced_neg_sampler.py | https://arxiv.org/pdf/1904.02701.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/structures/bbox/samplers/ohem_sampler.py| FCOS/mmdet/core/bbox/samplers/ohem_sampler.py | https://arxiv.org/abs/1604.03540 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/docs/changelog.md| FCOS/mmdet/core/bbox/samplers/score_hlr_sampler.py | https://arxiv.org/abs/1904.04821 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/configs/sabl/README.md| FCOS/mmdet/models/roi_heads/bbox_heads/sabl_head.py | https://arxiv.org/abs/1912.04260 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/configs/tridentnet/README.md| FCOS/mmdet/models/roi_heads/mask_heads/fcn_mask_head.py | https://github.com/facebookresearch/detectron2/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/models/detectors/grid_rcnn.py| 
FCOS/mmdet/models/roi_heads/mask_heads/grid_head.py | https://arxiv.org/abs/1906.05688 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/models/roi_heads/mask_heads/mask_point_head.py| FCOS/mmdet/models/roi_heads/mask_heads/mask_point_head.py | https://github.com/facebookresearch/detectron2/tree/master/projects/PointRend/point_head/point_head.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/models/detectors/grid_rcnn.py| FCOS/mmdet/models/roi_heads/mask_heads/grid_head.py | https://arxiv.org/abs/1906.05688 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/docs/changelog.md| FCOS/mmdet/models/roi_heads/roi_extractors/generic_roi_extractor.py | https://arxiv.org/abs/2004.13665 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/models/necks/fpn.py| FCOS/mmdet/models/roi_heads/roi_extractors/single_level_roi_extractor.py | https://arxiv.org/abs/1612.03144 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/.github/workflows/build_pat.yml| FCOS/Dockerfile | https://github.com/open-mmlab/mmcv.git | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/setup.py| FCOS/setup.py | http://setuptools.readthedocs.io/en/latest/setuptools.html#declaring-platform-specific-dependencies | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/setup.py| FCOS/setup.py | openmmlab@gmail.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/configs/albu_example/README.md| FCOS/setup.py | https://github.com/open-mmlab/mmdetection | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/docker/Dockerfile| FCOS/docker/Dockerfile | https://openmmlab.oss-accelerate.aliyuncs.com/mmcv/dist/index.html | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/demo/MMDet_Tutorial.ipynb| FCOS/docker/Dockerfile | https://github.com/open-mmlab/mmdetection.git | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/docs/conf.py| FCOS/docs/conf.py | https://www.sphinx-doc.org/en/master/usage/configuration.html | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/docs/make.bat| FCOS/docs/make.bat | http://sphinx-doc.org/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/configs/deepfashion/README.md| FCOS/docs/stat.py | https://github.com/open-mmlab/mmdetection/blob/master/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/configs/legacy_1.x/README.md| FCOS/tests/async_benchmark.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/mmdetection | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/datasets/builder.py| FCOS/mmdet/datasets/builder.py | https://github.com/pytorch/pytorch/issues/973 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/datasets/cityscapes.py| FCOS/mmdet/datasets/cityscapes.py | https://github.com/facebookresearch/detectron2/blob/master/detectron2/data/datasets/cityscapes.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/datasets/cityscapes.py| FCOS/mmdet/datasets/cityscapes.py | https://github.com/mcordts/cityscapesScripts/blob/master/cityscapesscripts/evaluation/evalInstanceLevelSemanticLabeling.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/datasets/dataset_wrappers.py| FCOS/mmdet/datasets/dataset_wrappers.py | 
https://github.com/facebookresearch/detectron2/blob/41d475b75a230221e21d9cac5d69655e3415e3a4/detectron2/data/samplers/distributed_sampler.py#L57 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/docs/changelog.md| FCOS/mmdet/datasets/dataset_wrappers.py | https://arxiv.org/abs/1908.03195 | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/configs/wider_face/README.md| FCOS/mmdet/datasets/wider_face.py | https://github.com/sovrasov/wider-face-pascal-voc-annotations | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/configs/tridentnet/README.md| FCOS/mmdet/datasets/coco.py | https://github.com/facebookresearch/detectron2/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/utils/util_mixins.py| FCOS/mmdet/utils/util_mixins.py | https://github.com/Erotemic/ubelt | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/configs/tridentnet/README.md| FCOS/mmdet/datasets/lvis.py | https://github.com/facebookresearch/detectron2/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/datasets/lvis.py| FCOS/mmdet/datasets/lvis.py | http://images.cocodataset.org/train2017/000000391895.jpg | 相关数据集图片地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/main/mmdet/datasets/lvis.py| FCOS/mmdet/datasets/lvis.py | http://images.cocodataset.org/ | 模型相关说明 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|--------------------------------------------------------------------------------------------------------------------------|---------------------------------------------------------------------------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FCOS/configs/cityscapes/faster_rcnn_r50_fpn_1x_cityscapes.py | https://open-mmlab.s3.ap-northeast-2.amazonaws.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_fpn_1x_coco/faster_rcnn_r50_fpn_1x_coco_20200130-047c8118.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FCOS/configs/cityscapes/mask_rcnn_r50_fpn_1x_cityscapes.py | https://open-mmlab.s3.ap-northeast-2.amazonaws.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r50_fpn_1x_coco/mask_rcnn_r50_fpn_1x_coco_20200205-d4b0c5d6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FCOS/configs/faster_rcnn/faster_rcnn_r50_fpn_1x_coco-person-bicycle-car.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/mmdetection/models/faster_rcnn_r50_fpn_1x_20181010-3d1b3351.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FCOS/docker/Dockerfile | https://openmmlab.oss-accelerate.aliyuncs.com/mmcv/dist/index.html | mmcv地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FCOS/mmdet/datasets/lvis.py | http://images.cocodataset.org/ | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FCOS/setup.py | openmmlab@gmail.com | 作者邮箱 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/public_address_statement.md b/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/public_address_statement.md index a8fa8c56f4fba8fa4cc756a7005fa7e6613c332f..91b73b7d78a7e6239c014f1e17dfad22b22707cf 100644 --- a/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/public_address_statement.md +++ b/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/public_address_statement.md @@ -1,308 +1,85 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|-------------------------------------------|------------------------------------------------------| 
------------------------------------ |---------| -| 开发引入 | / | FSAF_for_Pytorch/Dockerfile | https://github.com/open-mmlab/mmcv.git | 下载依赖 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg11_imagenet-01ecd97e.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg13_imagenet-9ad3945d.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg16_imagenet-91b6d117.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg19_imagenet-fee352a8.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg11_bn_imagenet-6fbbbf3f.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg13_bn_imagenet-4b5f9390.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg16_bn_imagenet-3ac6d8fd.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg19_bn_imagenet-7c058385.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet18_batch256_imagenet_20200708-34ab8f90.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet34_batch256_imagenet_20200708-32ffb4f7.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet50_batch256_imagenet_20200708-cfb998bf.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet101_batch256_imagenet_20200708-753f3608.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet152_batch256_imagenet_20200708-ec25b1f9.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnetv1d50_batch256_imagenet_20200708-1ad0ce94.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnetv1d101_batch256_imagenet_20200708-9cb302ef.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/mmcls.json | 
https://download.openmmlab.com/mmclassification/v0/resnet/resnetv1d152_batch256_imagenet_20200708-e79cb6a2.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnext/resnext50_32x4d_batch256_imagenet_20200708-c07adbb7.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnext/resnext101_32x4d_batch256_imagenet_20200708-87f2d1c9.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnext/resnext101_32x8d_batch256_imagenet_20200708-1ec34aa7.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnext/resnext152_32x4d_batch256_imagenet_20200708-aab5034c.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/se-resnet/se-resnet50_batch256_imagenet_20200804-ae206104.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/se-resnet/se-resnet101_batch256_imagenet_20200804-ba5b51d4.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnest/resnest50_imagenet_converted-1ebf0afe.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnest/resnest101_imagenet_converted-032caa52.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnest/resnest200_imagenet_converted-581a60f2.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnest/resnest269_imagenet_converted-59930960.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/shufflenet_v1/shufflenet_v1_batch1024_imagenet_20200804-5d6cec73.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/shufflenet_v2/shufflenet_v2_batch1024_imagenet_20200812-5bf4721e.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/mobilenet_v2/mobilenet_v2_batch256_imagenet_20200708-3b2dc3af.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/vgg16_caffe-292e1171.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_caffe-788b5fa3.pth | 下载预训练权重 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_msra-5891d200.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_caffe-3ad79236.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_msra-6cc46731.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_32x8d-1516f1aa.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext50-32x4d-0ab1a123.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_32x4d-a5af3160.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_64x4d-ee2c6f71.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_gn_thangvubk-ad1730dd.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_gn-9186a21c.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_gn-cac0ab98.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_gn_ws-15beedd8.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_gn_ws-3e3c308c.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext50_32x4d_gn_ws-0d87ac85.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_32x4d_gn_ws-34ac1a9e.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext50_32x4d_gn-c7e8b754.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_32x4d_gn-ac3bb84e.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w18_small-b5a04e21.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | 
FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w18-00eb2006.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w32-dc9eeb4f.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w40-ed0b031c.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w48-d2186c55.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/bn_inception_caffe-ed2e8665.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/i3d_r50_f32s2_k400-2c57e077.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/nl3d_r50_f32s2_k400-fa7e7caa.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/res2net101_v1d_26w_4s_mmdetv2-f0a600f9.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_400mf-a5b10d96.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_800mf-1f4be4c7.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_1.6gf-5791c176.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_3.2gf-c2599b0f.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_4.0gf-a88f671e.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_6.4gf-006af45d.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_8.0gf-3c68abe7.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_12gf-4c2a3350.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet18_v1c-b5776b93.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | 
https://download.openmmlab.com/pretrain/third_party/resnet50_v1c-2cccc1ad.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_v1c-e67eebb6.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/mmediting/third_party/vgg_state_dict.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/mmediting/third_party/model_best_resnet34_En_nomixup.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/mmediting/third_party/mobilenet_v2.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/mobilenet_v3_large-bc2c3fd3.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/mobilenet_v3_small-47085aa1.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnest50_d2-7497a55b.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnest101_d2-f3b931b2.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnest200_d2-ca88e41f.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/darknet53-a628ea1b.pth | 下载预训练权重 | -| 开发引入 | / | FSAF_for_Pytorch/url.ini | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_fpn_1x_coco/faster_rcnn_r50_fpn_1x_coco_20200130-047c8118.pth | 下载预训练权重 | -| 开发引入 | / | FSAF_for_Pytorch/url.ini | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r50_fpn_1x_coco/mask_rcnn_r50_fpn_1x_coco_20200205-d4b0c5d6.pth | 下载预训练权重 | -| 开发引入 | / | FSAF_for_Pytorch/url.ini | http://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_caffe_fpn_mstrain_3x_coco/faster_rcnn_r50_caffe_fpn_mstrain_3x_coco_bbox_mAP-0.398_20200504_163323-30042637.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmdetection/docker/Dockerfile | https://openmmlab.oss-accelerate.aliyuncs.com/mmcv/dist/index.html | 下载依赖 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmdetection/docker/Dockerfile | https://github.com/open-mmlab/mmdetection.git | 下载依赖 | -| 开发引入 | / | FSAF_for_Pytorch/url.ini | https://github.com/open-mmlab/mmdetection/blob/master/ | 下载依赖 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | FSAF_for_Pytorch/mmdetection/setup.py | openmmlab@gmail.com | 邮箱 | -| 开发引入 | / | FSAF_for_Pytorch/url.ini | https://github.com/open-mmlab/mmdetection | 下载依赖 | -| 开发引入 | / | FSAF_for_Pytorch/url.ini | 
http://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r50_fpn_1x_coco/mask_rcnn_r50_fpn_1x_coco_20200205-d4b0c5d6.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/roi_extractors/generic_roi_extractor.py | FSAF_for_Pytorch/mmdetection/mmdet/models/roi_heads/roi_extractors/generic_roi_extractor.py | https://arxiv.org/abs/2004.13665 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/detr.py | FSAF_for_Pytorch/mmdetection/mmdet/models/utils/transformer.py | https://arxiv.org/pdf/2005.12872 | 论文地址 | -| 开发引入 | / | FSAF_for_Pytorch/mmcv_need/onnx/onnx_utils/symbolic_helper.py | https://github.com/pytorch/pytorch/blob/75ee5756715e7161314ce037474843b68f69fc04/torch/onnx/symbolic_helper.py#L375 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/layers/bbox_nms.py | FSAF_for_Pytorch/mmdetection/mmdet/models/dense_heads/yolact_head.py | https://arxiv.org/abs/1904.02689 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/scnet_roi_head.py | FSAF_for_Pytorch/mmdetection/mmdet/models/roi_heads/mask_heads/scnet_semantic_head.py | https://arxiv.org/abs/2012.10150 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/scnet_roi_head.py | FSAF_for_Pytorch/mmdetection/mmdet/models/roi_heads/mask_heads/feature_relay_head.py | https://arxiv.org/abs/2012.10150 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/necks/nasfcos_fpn.py | FSAF_for_Pytorch/mmdetection/mmdet/models/necks/nasfcos_fpn.py | https://arxiv.org/abs/1906.04423 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/cascade_rcnn.py | FSAF_for_Pytorch/mmdetection/mmdet/models/detectors/cascade_rcnn.py | https://arxiv.org/abs/1906.09756 | 论文地址 | -| 开发引入 | / | FSAF_for_Pytorch/mmcv_need/ops/csrc/parrots/box_iou_rotated.cpp | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated.h | 源码实现 | -| 开发引入 | / | FSAF_for_Pytorch/mmcv_need/ops/csrc/parrots/psamask_cuda.cu | https://github.com/hszhao/semseg/blob/master/lib/psa/src | 源码实现 | -| 开发引入 | / | FSAF_for_Pytorch/mmcv_need/ops/csrc/pytorch/info.cpp | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/vision.cpp | 源码实现 | -| 开发引入 | / | FSAF_for_Pytorch/mmcv_need/runner/fp16_utils.py | https://github.com/NVIDIA/apex/blob/master/apex/fp16_utils/loss_scaler.py | 源码实现 | -| 开发引入 | / | FSAF_for_Pytorch/mmcv_need/ops/csrc/box_iou_rotated_utils.hpp | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_utils.h | 源码实现 | -| 开发引入 | / | FSAF_for_Pytorch/mmcv_need/ops/csrc/parrots/nms_rotated_cuda.cu | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/nms_rotated/nms_rotated_cuda.cu | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/scnet_roi_head.py | FSAF_for_Pytorch/mmdetection/mmdet/models/detectors/scnet.py | https://arxiv.org/abs/2012.10150 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/task_modules/samplers/ohem_sampler.py | FSAF_for_Pytorch/mmdetection/mmdet/core/bbox/samplers/ohem_sampler.py | https://arxiv.org/abs/1604.03540 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/nasfcos.py | FSAF_for_Pytorch/mmdetection/mmdet/models/detectors/nasfcos.py | https://arxiv.org/abs/1906.0442 | 论文地址 | -| 
开发引入 | / | FSAF_for_Pytorch/mmdetection/mmdet/models/dense_heads/paa_head.py | https://github.com/kkhoot/PAA/issues/8 | 相关说明 | -| 开发引入 | / | FSAF_for_Pytorch/mmcv_need/cnn/bricks/depthwise_separable_conv_module.py | https://arxiv.org/pdf/1704.04861.pdf | 论文地址 | -| 开发引入 | / | FSAF_for_Pytorch/mmcv_need/ops/csrc/modulated_deform_conv_cuda_kernel.cuh | https://github.com/chengdazhi/Deformable-Convolution-V2-PyTorch/blob/mmdetection/mmdet/ops/dcn/src/deform_conv_cuda_kernel.cu | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/fovea.py | FSAF_for_Pytorch/mmdetection/mmdet/models/dense_heads/fovea_head.py | https://arxiv.org/abs/1904.03797 | 论文地址 | -| 开发引入 | / | FSAF_for_Pytorch/mmcv_need/ops/csrc/carafe_cuda_kernel.cuh | https://devblogs.nvidia.com/efficient-matrix-transpose-cuda-cc/ | 相关说明 | -| 开发引入 | / | FSAF_for_Pytorch/mmcv_need/image/io.py | https://github.com/lilohuang/PyTurboJPEG | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/task_modules/samplers/sampling_result.py | FSAF_for_Pytorch/mmdetection/mmdet/core/bbox/demodata.py | https://gitlab.kitware.com/computer-vision/kwimage/blob/master/kwimage/structs/boxes.py#L1390 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/configs/sabl/README.md | FSAF_for_Pytorch/mmdetection/mmdet/core/bbox/coder/bucketing_bbox_coder.py | https://arxiv.org/abs/1912.04260 | 论文地址 | -| 开发引入 | / | FSAF_for_Pytorch/mmdetection/mmdet/models/losses/ae_loss.py | https://github.com/princeton-vl/CornerNet/blob/master/models/py_utils/kp_utils.py#L180 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/necks/hrfpn.py | FSAF_for_Pytorch/mmdetection/mmdet/models/backbones/hrnet.py | https://arxiv.org/abs/1904.04514 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/losses/varifocal_loss.py | FSAF_for_Pytorch/mmdetection/mmdet/models/detectors/vfnet.py | https://arxiv.org/abs/2008.13367 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/scnet_roi_head.py | FSAF_for_Pytorch/mmdetection/mmdet/models/roi_heads/mask_heads/global_context_head.py | https://arxiv.org/abs/2012.10150 | 论文地址 | -| 开发引入 | / | FSAF_for_Pytorch/mmcv_need/cnn/bricks/plugin.py | https://inflection.readthedocs.io/en/latest/#inflection.underscore | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/sparse_roi_head.py | FSAF_for_Pytorch/mmdetection/mmdet/models/dense_heads/embedding_rpn_head.py | https://arxiv.org/abs/2011.12450 | 论文地址 | -| 开发引入 | / | FSAF_for_Pytorch/mmdetection/mmdet/datasets/cityscapes.py | https://github.com/mcordts/cityscapesScripts/blob/master/cityscapesscripts/evaluation/evalInstanceLevelSemanticLabeling.py | 源码实现 | -| 开发引入 | / | FSAF_for_Pytorch/mmcv_need/ops/point_sample.py | https://github.com/facebookresearch/detectron2/tree/master/projects/PointRend | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/losses/varifocal_loss.py | FSAF_for_Pytorch/mmdetection/mmdet/models/dense_heads/vfnet_head.py | https://arxiv.org/abs/2008.13367 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/projects/Detic_new/detic/iou_loss.py | FSAF_for_Pytorch/mmdetection/mmdet/models/losses/iou_loss.py | https://arxiv.org/abs/1902.09630 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/point_rend_roi_head.py | FSAF_for_Pytorch/mmdetection/mmdet/models/detectors/point_rend.py | https://arxiv.org/abs/1912.08193 | 论文地址 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmdetection/mmdet/models/backbones/resnet.py | FSAF_for_Pytorch/mmdetection/mmdet/models/backbones/resnet.py | https://arxiv.org/pdf/1812.01187.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/projects/RF100-Benchmark/coco_metric.py | FSAF_for_Pytorch/mmdetection/mmdet/datasets/coco.py | https://github.com/facebookresearch/detectron2/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/necks/rfp.py | FSAF_for_Pytorch/mmdetection/mmdet/models/backbones/detectors_resnet.py | https://arxiv.org/pdf/2006.02334.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/datasets/transforms/transforms.py | FSAF_for_Pytorch/mmdetection/mmdet/datasets/pipelines/transforms.py | https://albumentations.readthedocs.io | 相关说明 | -| 开发引入 | / | FSAF_for_Pytorch/mmdetection/mmdet/datasets/cityscapes.py | https://github.com/facebookresearch/detectron2/blob/master/detectron2/data/datasets/cityscapes.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/scnet_roi_head.py | FSAF_for_Pytorch/mmdetection/mmdet/models/utils/res_layer.py | https://arxiv.org/abs/2012.10150 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/task_modules/assigners/atss_assigner.py | FSAF_for_Pytorch/mmdetection/mmdet/core/bbox/assigners/atss_assigner.py | https://github.com/sfzhang15/ATSS/blob/master/atss_core/modeling/rpn/atss/loss.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/docs/zh_cn/notes/compatibility.md | FSAF_for_Pytorch/mmcv_need/ops/csrc/parrots/roi_align_cpu.cpp | https://github.com/facebookresearch/detectron2/tree/master/detectron2/layers/csrc/ROIAlign | 源码实现 | -| 开发引入 | / | FSAF_for_Pytorch/mmdetection/mmdet/models/losses/ae_loss.py | https://arxiv.org/abs/1611.05424 | 论文地址 | -| 开发引入 | / | FSAF_for_Pytorch/mmcv_need/ops/csrc/pytorch/nms_rotated.cpp | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/nms_rotated/nms_rotated.h | 源码实现 | -| 开发引入 | / | FSAF_for_Pytorch/mmcv_need/ops/csrc/box_iou_rotated_cuda.cuh | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_cuda.cu | 源码实现 | -| 开发引入 | / | FSAF_for_Pytorch/mmcv_need/cnn/utils/weight_init.py | https://www.cv-foundation.org/openaccess/content_iccv_2015/ | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/point_rend_roi_head.py | FSAF_for_Pytorch/mmdetection/mmdet/models/roi_heads/point_rend_roi_head.py | https://arxiv.org/abs/1912.08193 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/losses/gfocal_loss.py | FSAF_for_Pytorch/mmdetection/mmdet/models/losses/gfocal_loss.py | https://arxiv.org/abs/2006.04388 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/docs/zh_cn/user_guides/useful_tools.md | FSAF_for_Pytorch/mmcv_need/cnn/utils/flops_counter.py | https://github.com/sovrasov/flops-counter.pytorch | 源码实现 | -| 开发引入 | / | FSAF_for_Pytorch/mmdetection/mmdet/models/roi_heads/mask_heads/mask_point_head.py | https://github.com/facebookresearch/detectron2/tree/master/projects/PointRend/point_head/point_head.py | 源码实现 | -| 开发引入 | / | FSAF_for_Pytorch/mmcv_need/ops/csrc/parrots/box_iou_rotated_cuda.cu | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_cuda.cu | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/backbones/regnet.py | 
FSAF_for_Pytorch/mmdetection/mmdet/models/backbones/regnet.py | https://arxiv.org/abs/2003.13678 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/detr.py | FSAF_for_Pytorch/mmdetection/mmdet/models/utils/positional_encoding.py | https://arxiv.org/pdf/2005.12872 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/htc_roi_head.py | FSAF_for_Pytorch/mmdetection/mmdet/models/detectors/htc.py | https://arxiv.org/abs/1901.07518 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/configs/gcnet/README.md | FSAF_for_Pytorch/mmcv_need/cnn/bricks/context_block.py | https://arxiv.org/abs/1904.11492 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/sparse_roi_head.py | FSAF_for_Pytorch/mmdetection/mmdet/models/roi_heads/bbox_heads/dii_head.py | https://arxiv.org/abs/2011.12450 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/docs/zh_cn/conf.py | FSAF_for_Pytorch/mmdetection/docs/conf.py | https://www.sphinx-doc.org/en/master/usage/configuration.html | 相关说明 | -| 开发引入 | / | FSAF_for_Pytorch/mmcv_need/cnn/utils/weight_init.py | http://proceedings.mlr.press/v9/glorot10a/glorot10a.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/mask_rcnn.py | FSAF_for_Pytorch/mmdetection/mmdet/models/detectors/mask_rcnn.py | https://arxiv.org/abs/1703.06870 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/dense_heads/paa_head.py | FSAF_for_Pytorch/mmdetection/mmdet/models/dense_heads/paa_head.py | https://github.com/kkhoot/PAA/blob/master/paa_core | 源码实现 | -| 开发引入 | / | FSAF_for_Pytorch/mmcv_need/runner/hooks/optimizer.py | https://arxiv.org/abs/1710.03740 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/fcos.py | FSAF_for_Pytorch/mmdetection/mmdet/models/detectors/fcos.py | https://arxiv.org/abs/1904.01355 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/task_modules/samplers/score_hlr_sampler.py | FSAF_for_Pytorch/mmdetection/mmdet/core/bbox/samplers/score_hlr_sampler.py | https://arxiv.org/abs/1904.04821 | 论文地址 | -| 开发引入 | / | FSAF_for_Pytorch/mmcv_need/ops/csrc/pytorch/box_iou_rotated.cpp | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated.h | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/dense_heads/retina_head.py | FSAF_for_Pytorch/mmdetection/mmdet/models/dense_heads/retina_head.py | https://arxiv.org/pdf/1708.02002.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/losses/balanced_l1_loss.py | FSAF_for_Pytorch/mmdetection/mmdet/core/bbox/samplers/iou_balanced_neg_sampler.py | https://arxiv.org/pdf/1904.02701.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/necks/rfp.py | FSAF_for_Pytorch/mmcv_need/ops/saconv.py | https://arxiv.org/pdf/2006.02334.pdf | 论文地址 | -| 开发引入 | / | FSAF_for_Pytorch/mmcv_need/ops/csrc/pytorch/psamask_cuda.cu | https://github.com/hszhao/semseg/blob/master/lib/psa/src | 源码实现 | -| 开发引入 | / | FSAF_for_Pytorch/mmcv_need/ops/csrc/pytorch/info.cpp | https://github.com/pytorch/pytorch/blob/master/aten/src/ATen/Version.cpp | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/scnet_roi_head.py | FSAF_for_Pytorch/mmdetection/mmdet/models/roi_heads/bbox_heads/scnet_bbox_head.py | https://arxiv.org/abs/2012.10150 | 论文地址 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmdetection/projects/EfficientDet/efficientdet/tensorflow/yxyx_bbox_coder.py | FSAF_for_Pytorch/mmdetection/mmdet/core/bbox/coder/delta_xywh_bbox_coder.py | https://arxiv.org/abs/1311.2524 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/yolo.py | FSAF_for_Pytorch/mmdetection/mmdet/models/dense_heads/yolo_head.py | https://arxiv.org/abs/1804.02767 | 论文地址 | -| 开发引入 | / | FSAF_for_Pytorch/mmdetection/mmdet/datasets/builder.py | https://github.com/pytorch/pytorch/issues/973 | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/structures/mask/structures.py | FSAF_for_Pytorch/mmdetection/mmdet/core/mask/structures.py | https://github.com/facebookresearch/detectron2/blob/ffff8acc35ea88ad1cb1806ab0f00b4c1c5dbfd9/detectron2/structures/masks.py#L387 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/utils/util_mixins.py | FSAF_for_Pytorch/mmdetection/mmdet/utils/util_mixins.py | https://github.com/Erotemic/ubelt | 源码实现 | -| 开发引入 | / | FSAF_for_Pytorch/mmcv_need/ops/csrc/nms_rotated_cuda.cuh | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/nms_rotated/nms_rotated_cuda.cu | 源码实现 | -| 开发引入 | / | FSAF_for_Pytorch/mmcv_need/cnn/bricks/generalized_attention.py | https://arxiv.org/abs/1711.07971 | 论文地址 | -| 开发引入 | / | FSAF_for_Pytorch/mmcv_need/ops/csrc/parrots/corner_pool.cpp | https://github.com/princeton-vl/CornerNet-Lite/tree/master/core/models/py_utils/_cpools/src | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/dense_heads/cascade_rpn_head.py | FSAF_for_Pytorch/mmdetection/mmdet/models/dense_heads/cascade_rpn_head.py | https://arxiv.org/abs/1909.06720 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/projects/RF100-Benchmark/coco_metric.py | FSAF_for_Pytorch/mmdetection/mmdet/models/roi_heads/mask_heads/fcn_mask_head.py | https://github.com/facebookresearch/detectron2/ | 源码实现 | -| 开发引入 | / | FSAF_for_Pytorch/mmcv_need/ops/csrc/pytorch/psamask.cpp | https://github.com/hszhao/semseg/blob/master/lib/psa/src | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/utils/gaussian_target.py | FSAF_for_Pytorch/mmdetection/mmdet/models/utils/gaussian_target.py | https://github.com/princeton-vl/CornerNet-Lite/blob/master/core/sample/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/configs/guided_anchoring/README.md | FSAF_for_Pytorch/mmdetection/mmdet/models/dense_heads/guided_anchor_head.py | https://arxiv.org/abs/1901.03278 | 论文地址 | -| 开发引入 | / | FSAF_for_Pytorch/mmcv_need/runner/hooks/logger/mlflow.py | https://www.mlflow.org/docs/latest/index.html | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/structures/mask/structures.py | FSAF_for_Pytorch/mmdetection/mmdet/core/mask/structures.py | https://docs.scipy.org/doc/scipy/reference/generated/scipy.stats.truncnorm.html | 相关说明 | -| 开发引入 | / | FSAF_for_Pytorch/mmcv_need/ops/psa_mask.py | https://github.com/hszhao/semseg/blob/master/lib/psa | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/configs/dcn/README.md | FSAF_for_Pytorch/mmcv_need/ops/csrc/modulated_deform_conv_cuda_kernel.cuh | https://arxiv.org/abs/1703.06211 | 论文地址 | -| 开发引入 | / | FSAF_for_Pytorch/mmcv_need/ops/csrc/deform_conv_cuda_kernel.cuh | https://github.com/chengdazhi/Deformable-Convolution-V2-PyTorch/blob/mmdetection/mmdet/ops/dcn/src/deform_conv_cuda_kernel.cu | 源码实现 | -| 开发引入 | / | FSAF_for_Pytorch/mmcv_need/onnx/simplify/core.py | 
https://github.com/onnx/onnx/blob/e5e9a539f550f07ec156812484e8d4f33fb91f88/onnx/onnx.proto#L461 | 源码实现 | -| 开发引入 | / | FSAF_for_Pytorch/mmcv_need/onnx/simplify/core.py | https://github.com/onnx/onnx/issues/2613 | 相关说明 | -| 开发引入 | / | FSAF_for_Pytorch/mmcv_need/ops/csrc/parrots/nms_rotated.cpp | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/nms_rotated/nms_rotated.h | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/mask_heads/grid_head.py | FSAF_for_Pytorch/mmdetection/mmdet/models/roi_heads/mask_heads/grid_head.py | https://arxiv.org/abs/1906.05688 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/trident_faster_rcnn.py | FSAF_for_Pytorch/mmdetection/mmdet/models/detectors/trident_faster_rcnn.py | https://arxiv.org/abs/1901.01892 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/necks/pafpn.py | FSAF_for_Pytorch/mmdetection/mmdet/models/necks/pafpn.py | https://arxiv.org/abs/1803.01534 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/configs/libra_rcnn/README.md | FSAF_for_Pytorch/mmdetection/mmdet/models/necks/bfp.py | https://arxiv.org/abs/1904.02701 | 论文地址 | -| 开发引入 | / | FSAF_for_Pytorch/mmcv_need/ops/tin_shift.py | https://github.com/mit-han-lab/temporal-shift-module | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/necks/nasfcos_fpn.py | FSAF_for_Pytorch/mmdetection/mmdet/models/dense_heads/nasfcos_head.py | https://arxiv.org/abs/1906.04423 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/mask_heads/grid_head.py | FSAF_for_Pytorch/mmdetection/mmdet/models/roi_heads/grid_roi_head.py | https://arxiv.org/abs/1811.12030 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/necks/fpg.py | FSAF_for_Pytorch/mmdetection/mmdet/models/necks/fpg.py | https://arxiv.org/abs/2004.03580 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/projects/RF100-Benchmark/coco_metric.py | FSAF_for_Pytorch/mmdetection/mmdet/datasets/lvis.py | https://github.com/facebookresearch/detectron2/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/docs/zh_cn/notes/compatibility.md | FSAF_for_Pytorch/mmcv_need/ops/csrc/pytorch/roi_align_cpu.cpp | https://github.com/facebookresearch/detectron2/tree/master/detectron2/layers/csrc/ROIAlign | 源码实现 | -| 开发引入 | / | FSAF_for_Pytorch/mmdetection/mmdet/models/roi_heads/point_rend_roi_head.py | https://github.com/facebookresearch/detectron2/tree/master/projects/PointRend | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/dense_heads/corner_head.py | FSAF_for_Pytorch/mmdetection/mmdet/models/dense_heads/corner_head.py | https://github.com/princeton-vl/CornerNet/blob/master/models/py_utils/ | 源码实现 | -| 开发引入 | / | FSAF_for_Pytorch/mmcv_need/runner/hooks/lr_updater.py | https://arxiv.org/pdf/1506.01186.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/backbones/hourglass.py | FSAF_for_Pytorch/mmdetection/mmdet/models/backbones/hourglass.py | https://arxiv.org/abs/1603.06937 | 论文地址 | -| 开发引入 | / | FSAF_for_Pytorch/mmdetection/mmdet/datasets/dataset_wrappers.py | https://github.com/facebookresearch/detectron2/blob/41d475b75a230221e21d9cac5d69655e3415e3a4/detectron2/data/samplers/distributed_sampler.py#L57 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/losses/varifocal_loss.py | FSAF_for_Pytorch/mmdetection/mmdet/models/losses/varifocal_loss.py | 
https://arxiv.org/abs/2008.13367 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/task_modules/coders/tblr_bbox_coder.py | FSAF_for_Pytorch/mmdetection/mmdet/core/bbox/coder/tblr_bbox_coder.py | https://arxiv.org/abs/1903.00621 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/docs/en/notes/changelog_v2.x.md | FSAF_for_Pytorch/mmcv_need/ops/carafe.py | https://arxiv.org/abs/1905.02188 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/losses/gfocal_loss.py | FSAF_for_Pytorch/mmdetection/mmdet/models/dense_heads/gfl_head.py | https://arxiv.org/abs/2006.04388 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/necks/hrfpn.py | FSAF_for_Pytorch/mmdetection/mmdet/models/necks/hrfpn.py | https://arxiv.org/abs/1904.04514 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/detr.py | FSAF_for_Pytorch/mmdetection/mmdet/models/detectors/detr.py | https://arxiv.org/pdf/2005.12872 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/losses/balanced_l1_loss.py | FSAF_for_Pytorch/mmdetection/mmdet/models/losses/balanced_l1_loss.py | https://arxiv.org/pdf/1904.02701.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/double_roi_head.py | FSAF_for_Pytorch/mmdetection/mmdet/models/roi_heads/double_roi_head.py | https://arxiv.org/abs/1904.06493 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/layers/bbox_nms.py | FSAF_for_Pytorch/mmdetection/mmdet/core/post_processing/bbox_nms.py | https://arxiv.org/abs/1904.02689 | 论文地址 | -| 开发引入 | / | FSAF_for_Pytorch/mmcv_need/ops/tin_shift.py | https://arxiv.org/abs/2001.06499 | 论文地址 | -| 开发引入 | / | FSAF_for_Pytorch/mmcv_need/cnn/bricks/non_local.py | https://arxiv.org/abs/1711.07971 | 论文地址 | -| 开发引入 | / | FSAF_for_Pytorch/mmcv_need/cnn/bricks/conv_ws.py | https://arxiv.org/pdf/1903.10520.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/projects/Detic_new/detic/heatmap_focal_loss.py | FSAF_for_Pytorch/mmdetection/mmdet/models/losses/ae_loss.py | https://arxiv.org/abs/1808.01244 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/reppoints_detector.py | FSAF_for_Pytorch/mmdetection/mmdet/models/detectors/reppoints_detector.py | https://arxiv.org/pdf/1904.11490 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/projects/Detic_new/detic/heatmap_focal_loss.py | FSAF_for_Pytorch/mmcv_need/ops/corner_pool.py | https://arxiv.org/abs/1808.01244 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/losses/focal_loss.py | FSAF_for_Pytorch/mmdetection/mmdet/models/losses/gaussian_focal_loss.py | https://arxiv.org/abs/1708.02002 | 论文地址 | -| 开发引入 | / | FSAF_for_Pytorch/mmcv_need/ops/tin_shift.py | shaoh19@mails.tsinghua.edu.cn","sjqian@cse.cuhk.edu.hk","yuliu@ee.cuhk.edu.hk | 邮箱地址 | -| 开发引入 | / | FSAF_for_Pytorch/mmdetection/mmdet/core/visualization/image.py | https://github.com/opencv/opencv-python/issues/46 | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/sparse_roi_head.py | FSAF_for_Pytorch/mmdetection/mmdet/models/detectors/sparse_rcnn.py | https://arxiv.org/abs/2011.12450 | 论文地址 | -| 开发引入 | / | FSAF_for_Pytorch/mmcv_need/onnx/simplify/core.py | https://github.com/onnx/onnx/issues/2417 | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/losses/iou_loss.py | FSAF_for_Pytorch/mmdetection/mmdet/models/losses/iou_loss.py | 
https://arxiv.org/abs/1711.00164 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/datasets/transforms/transforms.py | FSAF_for_Pytorch/mmdetection/mmdet/datasets/pipelines/transforms.py | https://arxiv.org/abs/1708.04552 | 论文地址 | -| 开发引入 | / | FSAF_for_Pytorch/mmcv_need/ops/csrc/parrots/box_iou_rotated_cpu.cpp | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_cpu.cpp | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/necks/rfp.py | FSAF_for_Pytorch/mmcv_need/cnn/bricks/conv_ws.py | https://arxiv.org/pdf/2006.02334.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/datasets/wider_face.py | FSAF_for_Pytorch/mmdetection/mmdet/datasets/wider_face.py | https://github.com/sovrasov/wider-face-pascal-voc-annotations | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/losses/focal_loss.py | FSAF_for_Pytorch/mmdetection/mmdet/models/losses/focal_loss.py | https://arxiv.org/abs/1708.02002 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/task_modules/samplers/score_hlr_sampler.py | FSAF_for_Pytorch/mmdetection/mmdet/models/roi_heads/pisa_roi_head.py | https://arxiv.org/abs/1904.04821 | 论文地址 | -| 开发引入 | / | FSAF_for_Pytorch/mmcv_need/onnx/onnx_utils/symbolic_helper.py | https://github.com/pytorch/pytorch | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/dense_heads/paa_head.py | FSAF_for_Pytorch/mmdetection/mmdet/models/dense_heads/paa_head.py | https://arxiv.org/abs/2007.08103 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/docs/en/notes/changelog_v2.x.md | FSAF_for_Pytorch/mmdetection/mmdet/datasets/dataset_wrappers.py | https://arxiv.org/abs/1908.03195 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/configs/sabl/README.md | FSAF_for_Pytorch/mmdetection/mmdet/models/roi_heads/bbox_heads/sabl_head.py | https://arxiv.org/abs/1912.04260 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/docs/en/notes/changelog_v2.x.md | FSAF_for_Pytorch/mmdetection/mmdet/models/necks/fpn_carafe.py | https://arxiv.org/abs/1905.02188 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/dense_heads/free_anchor_retina_head.py | FSAF_for_Pytorch/mmdetection/mmdet/models/dense_heads/free_anchor_retina_head.py | https://arxiv.org/abs/1909.02466 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/docs/zh_cn/make.bat | FSAF_for_Pytorch/mmdetection/docs/make.bat | http://sphinx-doc.org/ | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/projects/Detic_new/detic/heatmap_focal_loss.py | FSAF_for_Pytorch/mmdetection/mmdet/models/losses/gaussian_focal_loss.py | https://arxiv.org/abs/1808.01244 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/datasets/transforms/augment_wrappers.py | FSAF_for_Pytorch/mmdetection/mmdet/datasets/pipelines/auto_augment.py | https://arxiv.org/pdf/1906.11172 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/datasets/transforms/instaboost.py | FSAF_for_Pytorch/mmdetection/mmdet/datasets/pipelines/instaboost.py | https://arxiv.org/abs/1908.07801 | 论文地址 | -| 开发引入 | / | FSAF_for_Pytorch/mmcv_need/ops/csrc/pytorch/box_iou_rotated_cuda.cu | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_cuda.cu | 源码实现 | -| 开发引入 | / | FSAF_for_Pytorch/mmcv_need/ops/corner_pool.py | https://github.com/princeton-vl/CornerNet-Lite | 
源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/losses/iou_loss.py | FSAF_for_Pytorch/mmdetection/mmdet/models/losses/iou_loss.py | https://github.com/Zzh-tju/DIoU | 源码实现 | -| 开发引入 | / | FSAF_for_Pytorch/mmcv_need/onnx/simplify/core.py | https://github.com/daquexian/onnx-simplifier | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/losses/iou_loss.py | FSAF_for_Pytorch/mmdetection/mmdet/models/losses/iou_loss.py | https://arxiv.org/abs/2005.03572 | 论文地址 | -| 开发引入 | / | FSAF_for_Pytorch/mmcv_need/runner/hooks/momentum_updater.py | https://arxiv.org/pdf/1708.07120.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/htc_roi_head.py | FSAF_for_Pytorch/mmdetection/mmdet/models/roi_heads/htc_roi_head.py | https://arxiv.org/abs/1901.07518 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/fcos.py | FSAF_for_Pytorch/mmdetection/mmdet/models/dense_heads/fcos_head.py | https://arxiv.org/abs/1904.01355 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/projects/RF100-Benchmark/coco_metric.py | FSAF_for_Pytorch/mmcv_need/ops/roi_align.py | https://github.com/facebookresearch/detectron2/ | 源码实现 | -| 开发引入 | / | FSAF_for_Pytorch/mmcv_need/ops/csrc/pytorch/corner_pool.cpp | https://github.com/princeton-vl/CornerNet-Lite/tree/master/core/models/py_utils/_cpools/src | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/structures/mask/structures.py | FSAF_for_Pytorch/mmdetection/mmdet/core/mask/structures.py | https://stackoverflow.com/questions/24467972/calculate-area-of-polygon-given-x-y-coordinates | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/configs/sabl/README.md | FSAF_for_Pytorch/mmdetection/mmdet/models/dense_heads/sabl_retina_head.py | https://arxiv.org/abs/1912.04260 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/cascade_roi_head.py | FSAF_for_Pytorch/mmdetection/mmdet/models/roi_heads/cascade_roi_head.py | https://arxiv.org/abs/1712.00726 | 论文地址 | -| 开发引入 | / | FSAF_for_Pytorch/mmcv_need/ops/csrc/pytorch/box_iou_rotated_cpu.cpp | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_cpu.cpp | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/mask_heads/grid_head.py | FSAF_for_Pytorch/mmdetection/mmdet/models/detectors/grid_rcnn.py | https://arxiv.org/abs/1906.05688 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/detr.py | FSAF_for_Pytorch/mmdetection/mmdet/models/dense_heads/transformer_head.py | https://arxiv.org/pdf/2005.12872 | 论文地址 | -| 开发引入 | / | FSAF_for_Pytorch/mmcv_need/video/io.py | https://","http:// | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/setup.py | FSAF_for_Pytorch/mmdetection/setup.py | http://setuptools.readthedocs.io/en/latest/setuptools.html#declaring-platform-specific-dependencies | 相关说明 | -| 开发引入 | / | FSAF_for_Pytorch/mmcv_need/ops/nms.py | https://github.com/pytorch/vision/blob | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/task_modules/coders/tblr_bbox_coder.py | FSAF_for_Pytorch/mmdetection/mmdet/models/detectors/fsaf.py | https://arxiv.org/abs/1903.00621 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/roi_extractors/single_level_roi_extractor.py | FSAF_for_Pytorch/mmdetection/mmdet/models/necks/fpn.py | https://arxiv.org/abs/1612.03144 | 论文地址 | -| 
开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/datasets/transforms/transforms.py | FSAF_for_Pytorch/mmdetection/mmdet/datasets/pipelines/transforms.py | https://github.com/bethgelab/imagecorruptions | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/task_modules/coders/tblr_bbox_coder.py | FSAF_for_Pytorch/mmdetection/mmdet/models/dense_heads/fsaf_head.py | https://arxiv.org/abs/1903.00621 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/paa.py | FSAF_for_Pytorch/mmdetection/mmdet/models/detectors/paa.py | https://arxiv.org/pdf/2007.08103.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/roi_extractors/single_level_roi_extractor.py | FSAF_for_Pytorch/mmdetection/mmdet/models/roi_heads/roi_extractors/single_level_roi_extractor.py | https://arxiv.org/abs/1612.03144 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/scnet_roi_head.py | FSAF_for_Pytorch/mmdetection/mmdet/models/roi_heads/mask_heads/scnet_mask_head.py | https://arxiv.org/abs/2012.10150 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/mask_scoring_roi_head.py | FSAF_for_Pytorch/mmdetection/mmdet/models/roi_heads/mask_scoring_roi_head.py | https://arxiv.org/abs/1903.00241 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/necks/rfp.py | FSAF_for_Pytorch/mmdetection/mmdet/models/necks/rfp.py | https://arxiv.org/pdf/2006.02334.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/necks/nas_fpn.py | FSAF_for_Pytorch/mmdetection/mmdet/models/necks/nas_fpn.py | https://arxiv.org/abs/1904.07392 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/projects/Detic_new/detic/heatmap_focal_loss.py | FSAF_for_Pytorch/mmdetection/mmdet/models/detectors/cornernet.py | https://arxiv.org/abs/1808.01244 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/configs/instaboost/README.md | FSAF_for_Pytorch/mmdetection/mmdet/datasets/pipelines/instaboost.py | https://github.com/GothicAi/Instaboost | 源码实现 | -| 开发引入 | / | FSAF_for_Pytorch/mmcv_need/image/colorspace.py | https://en.wikipedia.org/wiki/YCbCr#ITU-R_BT.601_conversion | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/losses/iou_loss.py | FSAF_for_Pytorch/mmdetection/mmdet/models/losses/iou_loss.py | https://arxiv.org/abs/1911.08287 | 论文地址 | -| 开发引入 | / | FSAF_for_Pytorch/mmcv_need/ops/csrc/pytorch/info.cpp | https://github.com/pytorch/pytorch/blob/master/aten/src/ATen/cuda/detail/CUDAHooks.cpp#L231 | 源码实现 | -| 开发引入 | / | FSAF_for_Pytorch/mmcv_need/cnn/utils/weight_init.py | http://download.openmmlab.com/mmdetection/v2.0/retinanet/ | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/mask_heads/grid_head.py | FSAF_for_Pytorch/mmdetection/mmdet/models/detectors/grid_rcnn.py | https://arxiv.org/abs/1811.12030 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/losses/focal_loss.py | FSAF_for_Pytorch/mmdetection/mmdet/models/detectors/retinanet.py | https://arxiv.org/abs/1708.02002 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/dense_heads/paa_head.py | FSAF_for_Pytorch/mmdetection/mmdet/models/dense_heads/paa_head.py | https://github.com/kkhoot/PAA/issues/9 | 相关说明 | -| 开发引入 | / | FSAF_for_Pytorch/mmcv_need/image/colorspace.py | https://en.wikipedia.org/wiki/YCbCr#JPEG_conversion | 相关说明 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/fast_rcnn.py | FSAF_for_Pytorch/mmdetection/mmdet/models/detectors/fast_rcnn.py | https://arxiv.org/abs/1504.08083 | 论文地址 | -| 开发引入 | / | FSAF_for_Pytorch/mmdetection/mmdet/core/visualization/image.py | https://github.com/matplotlib/matplotlib/issues/15363 | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/projects/SparseInst/sparseinst/sparseinst.py | FSAF_for_Pytorch/mmdetection/mmdet/models/dense_heads/atss_head.py | https://arxiv.org/abs/1912.02424 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/dense_heads/nasfcos_head.py | FSAF_for_Pytorch/mmdetection/mmdet/models/dense_heads/fcos_head.py | https://github.com/tianzhi0549/FCOS/issues/89#issuecomment-516877042 | 相关说明 | -| 开发引入 | / | FSAF_for_Pytorch/mmcv_need/ops/csrc/pytorch/nms_rotated_cpu.cpp | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/nms_rotated/nms_rotated_cpu.cpp | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/dense_heads/ssd_head.py | FSAF_for_Pytorch/mmdetection/mmdet/models/dense_heads/ssd_head.py | https://arxiv.org/abs/1512.02325 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/projects/SparseInst/sparseinst/sparseinst.py | FSAF_for_Pytorch/mmdetection/mmdet/models/detectors/atss.py | https://arxiv.org/abs/1912.02424 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/dynamic_roi_head.py | FSAF_for_Pytorch/mmdetection/mmdet/models/roi_heads/dynamic_roi_head.py | https://arxiv.org/abs/2004.06002 | 论文地址 | -| 开发引入 | / | FSAF_for_Pytorch/mmcv_need/ops/tin_shift.py | https://github.com/deepcs233/TIN/blob/master/cuda_shift/rtc_wrap.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/layers/transformer/utils.py | FSAF_for_Pytorch/mmdetection/mmdet/models/utils/transformer.py | https://github.com/PeizeSun/ | 源码实现 | -| 开发引入 | / | FSAF_for_Pytorch/mmdetection/mmdet/models/losses/gaussian_focal_loss.py | https://github.com/princeton-vl/CornerNet/blob/master/models/py_utils/kp_utils.py#L152 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/structures/mask/structures.py | FSAF_for_Pytorch/mmdetection/mmdet/core/mask/structures.py | https://stackoverflow.com/questions/1709283/how-can-i-sort-a-coordinate-list-for-a-rectangle-counterclockwise | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/projects/EfficientDet/efficientdet/tensorflow/yxyx_bbox_coder.py | FSAF_for_Pytorch/mmdetection/mmdet/core/bbox/coder/legacy_delta_xywh_bbox_coder.py | https://arxiv.org/abs/1311.2524 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/losses/iou_loss.py | FSAF_for_Pytorch/mmdetection/mmdet/models/losses/iou_loss.py | https://github.com/Zzh-tju/CIoU | 源码实现 | -| 开发引入 | / | FSAF_for_Pytorch/mmdetection/mmdet/models/dense_heads/fcos_head.py | https://github.com/tianzhi0549/FCOS | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/configs/dcn/README.md | FSAF_for_Pytorch/mmcv_need/ops/csrc/deform_conv_cuda_kernel.cuh | https://arxiv.org/abs/1703.06211 | 论文地址 | -| 开发引入 | / | FSAF_for_Pytorch/mmcv_need/cnn/bricks/non_local.py | https://github.com/AlexHex7/Non-local_pytorch | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/dense_heads/centripetal_head.py | FSAF_for_Pytorch/mmdetection/mmdet/models/dense_heads/centripetal_head.py | https://arxiv.org/abs/2003.09119 | 论文地址 | -| 开发引入 | / | 
FSAF_for_Pytorch/mmdetection/mmdet/core/bbox/coder/yolo_bbox_coder.py | https://arxiv.org/abs/1506.02640 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/datasets/lvis.py | FSAF_for_Pytorch/mmdetection/mmdet/datasets/lvis.py | http://images.cocodataset.org/train2017/000000391895.jpg | 图片地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/scnet_roi_head.py | FSAF_for_Pytorch/mmdetection/mmdet/models/roi_heads/scnet_roi_head.py | https://arxiv.org/abs/2012.10150 | 论文地址 | -| 开发引入 | / | FSAF_for_Pytorch/mmcv_need/onnx/symbolic.py | https://github.com/pytorch/pytorch | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/layers/bbox_nms.py | FSAF_for_Pytorch/mmdetection/mmdet/models/detectors/yolact.py | https://arxiv.org/abs/1904.02689 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/projects/Detic_new/detic/heatmap_focal_loss.py | FSAF_for_Pytorch/mmdetection/mmdet/models/dense_heads/corner_head.py | https://arxiv.org/abs/1808.01244 | 论文地址 | -| 开发引入 | / | FSAF_for_Pytorch/mmdetection/mmdet/core/mask/structures.py | https://gitlab.kitware.com/computer-vision/kwimage/-/blob/928cae35ca8/kwimage/structs/polygon.py#L379 | 源码实现 | -| 开发引入 | / | FSAF_for_Pytorch/mmcv_need/ops/csrc/pytorch/nms_rotated_cuda.cu | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/nms_rotated/nms_rotated_cuda.cu | 源码实现 | -| 开发引入 | / | FSAF_for_Pytorch/mmcv_need/cnn/bricks/wrappers.py | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/wrappers.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/fovea.py | FSAF_for_Pytorch/mmdetection/mmdet/models/detectors/fovea.py | https://arxiv.org/abs/1904.03797 | 论文地址 | -| 开发引入 | / | FSAF_for_Pytorch/mmcv_need/ops/nms.py | https://github.com/pytorch/vision/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/datasets/lvis.py | FSAF_for_Pytorch/mmdetection/mmdet/datasets/lvis.py | http://images.cocodataset.org/ | 相关说明 | -| 开发引入 | / | FSAF_for_Pytorch/mmdetection/mmdet/utils/util_random.py | https://gitlab.kitware.com/computer-vision/kwarray/blob/master/kwarray/util_random.py#L270 | 源码实现 | -| 开发引入 | / | FSAF_for_Pytorch/mmcv_need/ops/csrc/pytorch/cc_attention_cuda.cu | https://github.com/LikeLy-Journey/SegmenTron/blob/master/segmentron/modules/csrc/criss_cross_attention/ca_cuda.cu | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/losses/ghm_loss.py | FSAF_for_Pytorch/mmdetection/mmdet/models/losses/ghm_loss.py | https://arxiv.org/abs/1811.05181 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/faster_rcnn.py | FSAF_for_Pytorch/mmdetection/mmdet/models/detectors/faster_rcnn.py | https://arxiv.org/abs/1506.01497 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/mask_scoring_roi_head.py | FSAF_for_Pytorch/mmdetection/mmdet/models/detectors/mask_scoring_rcnn.py | https://arxiv.org/abs/1903.00241 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/sparse_roi_head.py | FSAF_for_Pytorch/mmdetection/mmdet/models/roi_heads/sparse_roi_head.py | https://arxiv.org/abs/2011.12450 | 论文地址 | +| 文件位置 | 公网地址 | 公网地址用途 | 
+|----------------------------------------------------------------------------------------------------|--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg11_imagenet-01ecd97e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg13_imagenet-9ad3945d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg16_imagenet-91b6d117.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg19_imagenet-fee352a8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg11_bn_imagenet-6fbbbf3f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg13_bn_imagenet-4b5f9390.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg16_bn_imagenet-3ac6d8fd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg19_bn_imagenet-7c058385.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet18_batch256_imagenet_20200708-34ab8f90.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet34_batch256_imagenet_20200708-32ffb4f7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet50_batch256_imagenet_20200708-cfb998bf.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet101_batch256_imagenet_20200708-753f3608.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet152_batch256_imagenet_20200708-ec25b1f9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnetv1d50_batch256_imagenet_20200708-1ad0ce94.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnetv1d101_batch256_imagenet_20200708-9cb302ef.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/mmcls.json | 
https://download.openmmlab.com/mmclassification/v0/resnet/resnetv1d152_batch256_imagenet_20200708-e79cb6a2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnext/resnext50_32x4d_batch256_imagenet_20200708-c07adbb7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnext/resnext101_32x4d_batch256_imagenet_20200708-87f2d1c9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnext/resnext101_32x8d_batch256_imagenet_20200708-1ec34aa7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnext/resnext152_32x4d_batch256_imagenet_20200708-aab5034c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/se-resnet/se-resnet50_batch256_imagenet_20200804-ae206104.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/se-resnet/se-resnet101_batch256_imagenet_20200804-ba5b51d4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnest/resnest50_imagenet_converted-1ebf0afe.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnest/resnest101_imagenet_converted-032caa52.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnest/resnest200_imagenet_converted-581a60f2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnest/resnest269_imagenet_converted-59930960.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/shufflenet_v1/shufflenet_v1_batch1024_imagenet_20200804-5d6cec73.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/shufflenet_v2/shufflenet_v2_batch1024_imagenet_20200812-5bf4721e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/mobilenet_v2/mobilenet_v2_batch256_imagenet_20200708-3b2dc3af.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/vgg16_caffe-292e1171.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_caffe-788b5fa3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_msra-5891d200.pth | 权重地址 | 
+| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_caffe-3ad79236.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_msra-6cc46731.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_32x8d-1516f1aa.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext50-32x4d-0ab1a123.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_32x4d-a5af3160.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_64x4d-ee2c6f71.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_gn_thangvubk-ad1730dd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_gn-9186a21c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_gn-cac0ab98.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_gn_ws-15beedd8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_gn_ws-3e3c308c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext50_32x4d_gn_ws-0d87ac85.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_32x4d_gn_ws-34ac1a9e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext50_32x4d_gn-c7e8b754.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_32x4d_gn-ac3bb84e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w18_small-b5a04e21.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w18-00eb2006.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w32-dc9eeb4f.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w40-ed0b031c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w48-d2186c55.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/bn_inception_caffe-ed2e8665.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/i3d_r50_f32s2_k400-2c57e077.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/nl3d_r50_f32s2_k400-fa7e7caa.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/res2net101_v1d_26w_4s_mmdetv2-f0a600f9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_400mf-a5b10d96.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_800mf-1f4be4c7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_1.6gf-5791c176.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_3.2gf-c2599b0f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_4.0gf-a88f671e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_6.4gf-006af45d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_8.0gf-3c68abe7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_12gf-4c2a3350.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet18_v1c-b5776b93.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_v1c-2cccc1ad.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_v1c-e67eebb6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/mmediting/third_party/vgg_state_dict.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/mmediting/third_party/model_best_resnet34_En_nomixup.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | mmedit/mobilenet_v2": "https://download.openmmlab.com/mmediting/third_party/mobilenet_v2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/mobilenet_v3_large-bc2c3fd3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/mobilenet_v3_small-47085aa1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnest50_d2-7497a55b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnest101_d2-f3b931b2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnest200_d2-ca88e41f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmcv_need/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/darknet53-a628ea1b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmdetection/docker/Dockerfile | https://openmmlab.oss-accelerate.aliyuncs.com/mmcv/dist/index.html | mmcv地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmdetection/mmdet/datasets/lvis.py | http://images.cocodataset.org/ | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/mmdetection/setup.py | openmmlab@gmail.com | 作者邮箱 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/url.ini | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_fpn_1x_coco/faster_rcnn_r50_fpn_1x_coco_20200130-047c8118.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/url.ini | http://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_caffe_fpn_mstrain_3x_coco/faster_rcnn_r50_caffe_fpn_mstrain_3x_coco_bbox_mAP-0.398_20200504_163323-30042637.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/url.ini | http://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r50_fpn_1x_coco/mask_rcnn_r50_fpn_1x_coco_20200205-d4b0c5d6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FSAF_for_Pytorch/url.ini | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r50_fpn_1x_coco/mask_rcnn_r50_fpn_1x_coco_20200205-d4b0c5d6.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/detection/FaceBoxes/public_address_statement.md b/PyTorch/contrib/cv/detection/FaceBoxes/public_address_statement.md index 1f023cfdb62d836981d0fcbefb908bcc7445c708..8fdf9ed5fe201dbb1a64e85a3d89d343e52be20b 100644 --- a/PyTorch/contrib/cv/detection/FaceBoxes/public_address_statement.md +++ b/PyTorch/contrib/cv/detection/FaceBoxes/public_address_statement.md @@ -1,8 +1,3 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | 
-|--------|----------------------------------------------------------------------------------------------------------------------------------|----------------------------------------------------|-----------------------|--------| -| 开源代码引入 | https://github.com/zisianw/FaceBoxes.PyTorch/utils/box_utils.py | FaceBoxes/utils/box_utils.py | https://github.com/Hakuyume/chainer-ssd | 源码实现 | -| 开源代码引入 | https://github.com/zisianw/FaceBoxes.PyTorch/utils/box_utils.py | FaceBoxes/utils/box_utils.py | https://github.com/fmassa/object-detection.torch | 源码实现 | -| 开源代码引入 | https://github.com/zisianw/FaceBoxes.PyTorch/utils/build.py | FaceBoxes/utils/build.py | http://code.activestate.com/recipes/52224-find-a-file-given-a-search-path/ | 相关说明 | -| 开发引入 | / | FaceBoxes/layers/modules/multibox_loss.py | https://arxiv.org/pdf/1512.02325.pdf | 论文地址 | -| 开源代码引入 | https://github.com/zisianw/FaceBoxes.PyTorch/train.py | FaceBoxes/train.py | https://github.com/pytorch/examples/blob/master/imagenet/main.py | 源码实现 | -| 开源代码引入 | https://github.com/Levi0223/FDDB_Evaluation/evaluate.py | FaceBoxes/evaluate.py | https://github.com/Levi0223/FDDB_Evaluation | 源码实现 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|------------------------------------------------------------------|--------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FaceBoxes/train.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/detection/FairMOT/public_address_statement.md b/PyTorch/contrib/cv/detection/FairMOT/public_address_statement.md index a623144a45395db66d0fe7ad0f2f0aa752951e44..575fc8c8750587111dac407ac6801dc4ba10aa4a 100644 --- a/PyTorch/contrib/cv/detection/FairMOT/public_address_statement.md +++ b/PyTorch/contrib/cv/detection/FairMOT/public_address_statement.md @@ -1,33 +1,15 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ---- | ------------ | ------ | ------------------------------------ | -------- | -| 开源代码引入 | https://github.com/ifzhang/FairMOT/blob/master/src/lib/models/losses.py|FairMOT/src/lib/models/losses.py | https://github.com/princeton-vl/CornerNet | 源码实现 | -| 开源代码引入 | https://github.com/ifzhang/FairMOT/blob/master/src/lib/models/common.py|FairMOT/src/lib/models/common.py | https://arxiv.org/abs/2010.11929 | 模型相关说明 | -| 开源代码引入 | https://github.com/ifzhang/FairMOT/blob/master/src/lib/models/common.py|FairMOT/src/lib/models/common.py | https://arxiv.org/abs/2010.11929 | 模型相关说明 | -| 开源代码引入 | https://github.com/ifzhang/FairMOT/blob/master/src/lib/models/common.py|FairMOT/src/lib/models/common.py | https://github.com/WongKinYiu/CrossStagePartialNetworks | 源码实现 | -| 开源代码引入 | https://github.com/ifzhang/FairMOT/blob/master/src/lib/models/losses.py|FairMOT/src/lib/models/losses.py | https://github.com/Cysu/open-reid/blob/master/reid/loss/triplet.py | 源码实现 | -| 开源代码引入 | https://github.com/ifzhang/FairMOT/blob/master/src/lib/models/yolo.py|FairMOT/src/lib/models/yolo.py | https://github.com/ultralytics/yolov5/pull/2953 | 源码实现 | -| 开源代码引入 | https://github.com/ifzhang/FairMOT/blob/master/src/lib/tracking_utils/evaluation.py|FairMOT/src/lib/tracking_utils/evaluation.py | https://github.com/longcw/py-motmetrics | 源码实现 | -| 开源代码引入 | https://github.com/ifzhang/FairMOT/blob/master/src/lib/tracking_utils/utils.py|FairMOT/src/lib/tracking_utils/utils.py | https://github.com/rafaelpadilla/Object-Detection-Metrics | 源码实现 | -| 开源代码引入 | 
https://github.com/ifzhang/FairMOT/blob/master/src/lib/tracking_utils/utils.py|FairMOT/src/lib/tracking_utils/utils.py | https://github.com/rbgirshick/py-faster-rcnn | 源码实现 | -| 开源代码引入 | https://github.com/ifzhang/FairMOT/blob/master/src/lib/models/networks/resnet_dcn.py|FairMOT/src/lib/utils/image.py | Bin.Xiao@microsoft.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/ifzhang/FairMOT/blob/master/src/lib/tracking_utils/utils.py|FairMOT/src/lib/tracking_utils/utils.py | https://storage.googleapis.com/ultralytics/yolov3/results_v1.txt | 模型参数相关配置 | -| 开源代码引入 | https://github.com/ifzhang/FairMOT/blob/master/src/lib/tracking_utils/utils.py|FairMOT/src/lib/utils/utils.py | https://github.com/rafaelpadilla/Object-Detection-Metrics | 源码实现 | -| 开源代码引入 | https://github.com/ifzhang/FairMOT/blob/master/src/lib/tracking_utils/utils.py|FairMOT/src/lib/utils/utils.py | https://github.com/rbgirshick/py-faster-rcnn | 源码实现 | -| 开源代码引入 | https://github.com/ifzhang/FairMOT/blob/master/src/lib/datasets/dataset/jde.py|FairMOT/src/lib/datasets/dataset/jde.py | https://medium.com/uruvideo/dataset-augmentation-with-random-homographies-a8f4b44830d4 | 模型相关说明 | -| 开源代码引入 | https://github.com/ifzhang/FairMOT/blob/master/src/lib/models/networks/dlav0.py|FairMOT/src/lib/models/networks/dlav0.py | http://dl.yf.io/dla/models | 模型相关说明 | -| 开源代码引入 | https://github.com/ifzhang/FairMOT/blob/master/src/lib/models/networks/dlav0.py|FairMOT/src/lib/models/networks/pose_dla_conv.py | http://dl.yf.io/dla/models | 模型相关说明 | -| 开源代码引入 | https://github.com/ifzhang/FairMOT/blob/master/src/lib/datasets/dataset/jde_yolov5.py|FairMOT/src/lib/datasets/dataset/jde_yolov5.py | https://github.com/ultralytics/yolov3/issues/232 | 模型相关说明 | -| 开源代码引入 | https://github.com/ifzhang/FairMOT/blob/master/src/lib/datasets/dataset/jde.py|FairMOT/src/lib/datasets/dataset/jde_yolov5.py | https://medium.com/uruvideo/dataset-augmentation-with-random-homographies-a8f4b44830d4 | 模型相关说明 | -| 开源代码引入 | https://github.com/ifzhang/FairMOT/blob/master/src/lib/models/networks/dlav0.py|FairMOT/src/lib/models/networks/pose_dla_dcn.py | http://dl.yf.io/dla/models | 模型相关说明 | -| 开源代码引入 | https://github.com/ifzhang/FairMOT/blob/master/src/lib/models/networks/resnet_dcn.py|FairMOT/src/lib/models/networks/resnet_dcn.py | Bin.Xiao@microsoft.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/ifzhang/FairMOT/blob/master/src/lib/models/networks/resnet_dcn.py|FairMOT/src/lib/models/networks/resnet_dcn.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 预训练模型 | -| 开源代码引入 | https://github.com/ifzhang/FairMOT/blob/master/src/lib/models/networks/resnet_dcn.py|FairMOT/src/lib/models/networks/resnet_dcn.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 预训练模型 | -| 开源代码引入 | https://github.com/ifzhang/FairMOT/blob/master/src/lib/models/networks/resnet_dcn.py|FairMOT/src/lib/models/networks/resnet_dcn.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 预训练模型 | -| 开源代码引入 | https://github.com/ifzhang/FairMOT/blob/master/src/lib/models/networks/resnet_dcn.py|FairMOT/src/lib/models/networks/resnet_dcn.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 预训练模型 | -| 开源代码引入 | https://github.com/ifzhang/FairMOT/blob/master/src/lib/models/networks/resnet_dcn.py|FairMOT/src/lib/models/networks/resnet_dcn.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 预训练模型 | -| 开源代码引入 | https://github.com/ifzhang/FairMOT/blob/master/src/lib/models/networks/resnet_dcn.py|FairMOT/src/lib/models/networks/resnet_fpn_dcn.py | Bin.Xiao@microsoft.com | 
开发者邮箱配置 | -| 开源代码引入 | https://github.com/ifzhang/FairMOT/blob/master/src/lib/models/networks/resnet_dcn.py|FairMOT/src/lib/models/networks/resnet_fpn_dcn.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 预训练模型 | -| 开源代码引入 | https://github.com/ifzhang/FairMOT/blob/master/src/lib/models/networks/resnet_dcn.py|FairMOT/src/lib/models/networks/resnet_fpn_dcn.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 预训练模型 | -| 开源代码引入 | https://github.com/ifzhang/FairMOT/blob/master/src/lib/models/networks/resnet_dcn.py|FairMOT/src/lib/models/networks/resnet_fpn_dcn.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 预训练模型 | -| 开源代码引入 | https://github.com/ifzhang/FairMOT/blob/master/src/lib/models/networks/resnet_dcn.py|FairMOT/src/lib/models/networks/resnet_fpn_dcn.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 预训练模型 | -| 开源代码引入 | https://github.com/ifzhang/FairMOT/blob/master/src/lib/models/networks/resnet_dcn.py|FairMOT/src/lib/models/networks/resnet_fpn_dcn.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 预训练模型 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|-------------------------------------------------------------------------------------------------|------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FairMOT/src/lib/models/networks/dlav0.py | http://dl.yf.io/dla/models | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FairMOT/src/lib/models/networks/pose_dla_conv.py | http://dl.yf.io/dla/models | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FairMOT/src/lib/models/networks/pose_dla_dcn.py | http://dl.yf.io/dla/models | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FairMOT/src/lib/models/networks/resnet_dcn.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FairMOT/src/lib/models/networks/resnet_dcn.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FairMOT/src/lib/models/networks/resnet_dcn.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FairMOT/src/lib/models/networks/resnet_dcn.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FairMOT/src/lib/models/networks/resnet_dcn.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FairMOT/src/lib/models/networks/resnet_fpn_dcn.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FairMOT/src/lib/models/networks/resnet_fpn_dcn.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FairMOT/src/lib/models/networks/resnet_fpn_dcn.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FairMOT/src/lib/models/networks/resnet_fpn_dcn.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/FairMOT/src/lib/models/networks/resnet_fpn_dcn.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/detection/GCNet/public_address_statement.md b/PyTorch/contrib/cv/detection/GCNet/public_address_statement.md 
index 1a611f77f2c3501fb48c702dfa2a5eaaae34f6e7..758941dc18eff6c336590878ad66b5b58dd0b3a1 100644 --- a/PyTorch/contrib/cv/detection/GCNet/public_address_statement.md +++ b/PyTorch/contrib/cv/detection/GCNet/public_address_statement.md @@ -1,155 +1,4 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------|------------------------------------------------------| ------------------------------------ |---------| -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | GCNet/docker/Dockerfile | https://openmmlab.oss-accelerate.aliyuncs.com/mmcv/dist/index.html | 下载依赖 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | GCNet/docker/Dockerfile | https://github.com/open-mmlab/mmdetection.git | 下载依赖 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/losses/focal_loss.py | GCNet/dependency/mmdet/models/losses/focal_loss.py | https://arxiv.org/abs/1708.02002 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/necks/nasfcos_fpn.py | GCNet/dependency/mmdet/models/necks/nasfcos_fpn.py | https://arxiv.org/abs/1906.04423 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/structures/mask/structures.py | GCNet/dependency/mmdet/core/mask/structures.py | https://stackoverflow.com/questions/1709283/how-can-i-sort-a-coordinate-list-for-a-rectangle-counterclockwise | 相关说明 | -| 开发引入 | / | GCNet/test/.vs/test/v16/.suo | 16.0.0.0 | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/configs/sabl/README.md | GCNet/dependency/mmdet/models/roi_heads/bbox_heads/sabl_head.py | https://arxiv.org/abs/1912.04260 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/necks/fpg.py | GCNet/dependency/mmdet/models/necks/fpg.py | https://arxiv.org/abs/2004.03580 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/fcos.py | GCNet/dependency/mmdet/models/dense_heads/fcos_head.py | https://arxiv.org/abs/1904.01355 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/projects/RF100-Benchmark/coco_metric.py | GCNet/dependency/mmdet/datasets/lvis.py | https://github.com/facebookresearch/detectron2/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/losses/varifocal_loss.py | GCNet/dependency/mmdet/models/dense_heads/vfnet_head.py | https://arxiv.org/abs/2008.13367 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/dense_heads/nasfcos_head.py | GCNet/dependency/mmdet/models/dense_heads/fcos_head.py | https://github.com/tianzhi0549/FCOS/issues/89#issuecomment-516877042 | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/projects/SparseInst/sparseinst/sparseinst.py | GCNet/dependency/mmdet/models/dense_heads/atss_head.py | https://arxiv.org/abs/1912.02424 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/losses/iou_loss.py | GCNet/dependency/mmdet/models/losses/iou_loss.py | https://arxiv.org/abs/1911.08287 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/mask_heads/grid_head.py | GCNet/dependency/mmdet/models/detectors/grid_rcnn.py | https://arxiv.org/abs/1811.12030 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/roi_extractors/single_level_roi_extractor.py | GCNet/dependency/mmdet/models/roi_heads/roi_extractors/single_level_roi_extractor.py | https://arxiv.org/abs/1612.03144 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/losses/balanced_l1_loss.py | 
GCNet/dependency/mmdet/models/losses/balanced_l1_loss.py | https://arxiv.org/pdf/1904.02701.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/dense_heads/paa_head.py | GCNet/dependency/mmdet/models/dense_heads/paa_head.py | https://github.com/kkhoot/PAA/issues/9 | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/dense_heads/retina_head.py | GCNet/dependency/mmdet/models/dense_heads/retina_head.py | https://arxiv.org/pdf/1708.02002.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/projects/SparseInst/sparseinst/sparseinst.py | GCNet/dependency/mmdet/models/detectors/atss.py | https://arxiv.org/abs/1912.02424 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/necks/rfp.py | GCNet/dependency/mmdet/models/backbones/detectors_resnet.py | https://arxiv.org/pdf/2006.02334.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/trident_faster_rcnn.py | GCNet/dependency/mmdet/models/detectors/trident_faster_rcnn.py | https://arxiv.org/abs/1901.01892 | 论文地址 | -| 开发引入 | / | GCNet/dependency/mmdet/models/roi_heads/mask_heads/mask_point_head.py | https://github.com/facebookresearch/detectron2/tree/master/projects/PointRend/point_head/point_head.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/scnet_roi_head.py | GCNet/dependency/mmdet/models/roi_heads/scnet_roi_head.py | https://arxiv.org/abs/2012.10150 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/datasets/lvis.py | GCNet/dependency/mmdet/datasets/lvis.py | http://images.cocodataset.org/ | 数据集地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/necks/nas_fpn.py | GCNet/dependency/mmdet/models/necks/nas_fpn.py | https://arxiv.org/abs/1904.07392 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/sparse_roi_head.py | GCNet/dependency/mmdet/models/roi_heads/sparse_roi_head.py | https://arxiv.org/abs/2011.12450 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/backbones/hourglass.py | GCNet/dependency/mmdet/models/backbones/hourglass.py | https://arxiv.org/abs/1603.06937 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/htc_roi_head.py | GCNet/dependency/mmdet/models/roi_heads/htc_roi_head.py | https://arxiv.org/abs/1901.07518 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/projects/Detic_new/detic/heatmap_focal_loss.py | GCNet/dependency/mmdet/models/losses/ae_loss.py | https://arxiv.org/abs/1808.01244 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/mask_scoring_roi_head.py | GCNet/dependency/mmdet/models/roi_heads/mask_scoring_roi_head.py | https://arxiv.org/abs/1903.00241 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/projects/RF100-Benchmark/coco_metric.py | GCNet/dependency/mmdet/datasets/coco.py | https://github.com/facebookresearch/detectron2/ | 源码实现 | -| 开发引入 | / | GCNet/dependency/mmdet/core/visualization/image.py | https://github.com/matplotlib/matplotlib/issues/15363 | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/scnet_roi_head.py | GCNet/dependency/mmdet/models/roi_heads/mask_heads/scnet_semantic_head.py | https://arxiv.org/abs/2012.10150 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/projects/RF100-Benchmark/coco_metric.py | GCNet/dependency/mmdet/models/roi_heads/mask_heads/fcn_mask_head.py | 
https://github.com/facebookresearch/detectron2/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/datasets/transforms/augment_wrappers.py | GCNet/dependency/mmdet/datasets/pipelines/auto_augment.py | https://arxiv.org/pdf/1906.11172 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/layers/transformer/utils.py | GCNet/dependency/mmdet/models/utils/transformer.py | https://github.com/PeizeSun/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/faster_rcnn.py | GCNet/dependency/mmdet/models/detectors/faster_rcnn.py | https://arxiv.org/abs/1506.01497 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/datasets/transforms/transforms.py | GCNet/dependency/mmdet/datasets/pipelines/transforms.py | https://arxiv.org/abs/1708.04552 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/projects/EfficientDet/efficientdet/tensorflow/yxyx_bbox_coder.py | GCNet/dependency/mmdet/core/bbox/coder/legacy_delta_xywh_bbox_coder.py | https://arxiv.org/abs/1311.2524 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/configs/gcnet/README.md | GCNet/dependency/mmcv/context_block.py | https://arxiv.org/abs/1904.11492 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/sparse_roi_head.py | GCNet/dependency/mmdet/models/dense_heads/embedding_rpn_head.py | https://arxiv.org/abs/2011.12450 | 论文地址 | -| 开发引入 | / | GCNet/dependency/mmdet/datasets/cityscapes.py | https://github.com/mcordts/cityscapesScripts/blob/master/cityscapesscripts/evaluation/evalInstanceLevelSemanticLabeling.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/datasets/transforms/transforms.py | GCNet/dependency/mmdet/datasets/pipelines/transforms.py | https://github.com/bethgelab/imagecorruptions | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/necks/nasfcos_fpn.py | GCNet/dependency/mmdet/models/dense_heads/nasfcos_head.py | https://arxiv.org/abs/1906.04423 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/losses/iou_loss.py | GCNet/dependency/mmdet/models/losses/iou_loss.py | https://arxiv.org/abs/1711.00164 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/yolo.py | GCNet/dependency/mmdet/models/dense_heads/yolo_head.py | https://arxiv.org/abs/1804.02767 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/structures/mask/structures.py | GCNet/dependency/mmdet/core/mask/structures.py | https://github.com/facebookresearch/detectron2/blob/ffff8acc35ea88ad1cb1806ab0f00b4c1c5dbfd9/detectron2/structures/masks.py#L387 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/task_modules/samplers/score_hlr_sampler.py | GCNet/dependency/mmdet/models/roi_heads/pisa_roi_head.py | https://arxiv.org/abs/1904.04821 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/layers/bbox_nms.py | GCNet/dependency/mmdet/core/post_processing/bbox_nms.py | https://arxiv.org/abs/1904.02689 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/losses/iou_loss.py | GCNet/dependency/mmdet/models/losses/iou_loss.py | https://github.com/Zzh-tju/DIoU | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/layers/bbox_nms.py | GCNet/dependency/mmdet/models/detectors/yolact.py | https://arxiv.org/abs/1904.02689 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/dynamic_roi_head.py | 
GCNet/dependency/mmdet/models/roi_heads/dynamic_roi_head.py | https://arxiv.org/abs/2004.06002 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/necks/pafpn.py | GCNet/dependency/mmdet/models/necks/pafpn.py | https://arxiv.org/abs/1803.01534 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/detr.py | GCNet/dependency/mmdet/models/dense_heads/transformer_head.py | https://arxiv.org/pdf/2005.12872 | 论文地址 | -| 开发引入 | / | GCNet/dependency/mmdet/datasets/builder.py | https://github.com/pytorch/pytorch/issues/973 | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/configs/sabl/README.md | GCNet/dependency/mmdet/core/bbox/coder/bucketing_bbox_coder.py | https://arxiv.org/abs/1912.04260 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/htc_roi_head.py | GCNet/dependency/mmdet/models/detectors/htc.py | https://arxiv.org/abs/1901.07518 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/scnet_roi_head.py | GCNet/dependency/mmdet/models/roi_heads/mask_heads/scnet_mask_head.py | https://arxiv.org/abs/2012.10150 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/cascade_roi_head.py | GCNet/dependency/mmdet/models/roi_heads/cascade_roi_head.py | https://arxiv.org/abs/1712.00726 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/scnet_roi_head.py | GCNet/dependency/mmdet/models/detectors/scnet.py | https://arxiv.org/abs/2012.10150 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/task_modules/samplers/sampling_result.py | GCNet/dependency/mmdet/core/bbox/demodata.py | https://gitlab.kitware.com/computer-vision/kwimage/blob/master/kwimage/structs/boxes.py#L1390 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/scnet_roi_head.py | GCNet/dependency/mmdet/models/utils/res_layer.py | https://arxiv.org/abs/2012.10150 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/losses/focal_loss.py | GCNet/dependency/mmdet/models/losses/gaussian_focal_loss.py | https://arxiv.org/abs/1708.02002 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/losses/ghm_loss.py | GCNet/dependency/mmdet/models/losses/ghm_loss.py | https://arxiv.org/abs/1811.05181 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/mask_heads/grid_head.py | GCNet/dependency/mmdet/models/roi_heads/grid_roi_head.py | https://arxiv.org/abs/1811.12030 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/dense_heads/ssd_head.py | GCNet/dependency/mmdet/models/dense_heads/ssd_head.py | https://arxiv.org/abs/1512.02325 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/dense_heads/cascade_rpn_head.py | GCNet/dependency/mmdet/models/dense_heads/cascade_rpn_head.py | https://arxiv.org/abs/1909.06720 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/roi_extractors/generic_roi_extractor.py | GCNet/dependency/mmdet/models/roi_heads/roi_extractors/generic_roi_extractor.py | https://arxiv.org/abs/2004.13665 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/scnet_roi_head.py | GCNet/dependency/mmdet/models/roi_heads/mask_heads/global_context_head.py | https://arxiv.org/abs/2012.10150 | 论文地址 | -| 开发引入 | / | GCNet/dependency/mmdet/core/visualization/image.py | 
https://github.com/opencv/opencv-python/issues/46 | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/dense_heads/paa_head.py | GCNet/dependency/mmdet/models/dense_heads/paa_head.py | https://arxiv.org/abs/2007.08103 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/layers/bbox_nms.py | GCNet/dependency/mmdet/models/dense_heads/yolact_head.py | https://arxiv.org/abs/1904.02689 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/detr.py | GCNet/dependency/mmdet/models/detectors/detr.py | https://arxiv.org/pdf/2005.12872 | 论文地址 | -| 开发引入 | / | GCNet/dependency/mmdet/datasets/cityscapes.py | https://github.com/facebookresearch/detectron2/blob/master/detectron2/data/datasets/cityscapes.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/structures/mask/structures.py | GCNet/dependency/mmdet/core/mask/structures.py | https://docs.scipy.org/doc/scipy/reference/generated/scipy.stats.truncnorm.html | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/projects/Detic_new/detic/heatmap_focal_loss.py | GCNet/dependency/mmdet/models/losses/gaussian_focal_loss.py | https://arxiv.org/abs/1808.01244 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/losses/varifocal_loss.py | GCNet/dependency/mmdet/models/detectors/vfnet.py | https://arxiv.org/abs/2008.13367 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/mask_heads/grid_head.py | GCNet/dependency/mmdet/models/detectors/grid_rcnn.py | https://arxiv.org/abs/1906.05688 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/point_rend_roi_head.py | GCNet/dependency/mmdet/models/roi_heads/point_rend_roi_head.py | https://arxiv.org/abs/1912.08193 | 论文地址 | -| 开发引入 | / | GCNet/dependency/mmdet/core/bbox/coder/yolo_bbox_coder.py | https://arxiv.org/abs/1506.02640 | 论文地址 | -| 开发引入 | / | GCNet/dependency/mmdet/core/post_processing/bbox_nms.py | https://github.com/open-mmlab/mmdetection/blob/master/mmdet/core/post_processing/bbox_nms.py#L7 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/task_modules/samplers/ohem_sampler.py | GCNet/dependency/mmdet/core/bbox/samplers/ohem_sampler.py | https://arxiv.org/abs/1604.03540 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/sparse_roi_head.py | GCNet/dependency/mmdet/models/detectors/sparse_rcnn.py | https://arxiv.org/abs/2011.12450 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/mask_rcnn.py | GCNet/dependency/mmdet/models/detectors/mask_rcnn.py | https://arxiv.org/abs/1703.06870 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/fcos.py | GCNet/dependency/mmdet/models/detectors/fcos.py | https://arxiv.org/abs/1904.01355 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/scnet_roi_head.py | GCNet/dependency/mmdet/models/roi_heads/mask_heads/feature_relay_head.py | https://arxiv.org/abs/2012.10150 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/dense_heads/paa_head.py | GCNet/dependency/mmdet/models/dense_heads/paa_head.py | https://github.com/kkhoot/PAA/blob/master/paa_core | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/losses/iou_loss.py | GCNet/dependency/mmdet/models/losses/iou_loss.py | https://github.com/Zzh-tju/CIoU | 源码实现 | -| 开发引入 | / | 
GCNet/dependency/mmdet/models/dense_heads/fcos_head.py | https://github.com/tianzhi0549/FCOS | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/necks/hrfpn.py | GCNet/dependency/mmdet/models/necks/hrfpn.py | https://arxiv.org/abs/1904.04514 | 论文地址 | -| 开发引入 | / | GCNet/dependency/mmdet/models/losses/ae_loss.py | https://arxiv.org/abs/1611.05424 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/losses/iou_loss.py | GCNet/dependency/mmdet/models/losses/iou_loss.py | https://arxiv.org/abs/2005.03572 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/dense_heads/corner_head.py | GCNet/dependency/mmdet/models/dense_heads/corner_head.py | https://github.com/princeton-vl/CornerNet/blob/master/models/py_utils/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/losses/gfocal_loss.py | GCNet/dependency/mmdet/models/dense_heads/gfl_head.py | https://arxiv.org/abs/2006.04388 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/task_modules/coders/tblr_bbox_coder.py | GCNet/dependency/mmdet/models/detectors/fsaf.py | https://arxiv.org/abs/1903.00621 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/backbones/resnet.py | GCNet/dependency/mmdet/models/backbones/resnet.py | https://arxiv.org/pdf/1812.01187.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/projects/Detic_new/detic/iou_loss.py | GCNet/dependency/mmdet/models/losses/iou_loss.py | https://arxiv.org/abs/1902.09630 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/projects/Detic_new/detic/heatmap_focal_loss.py | GCNet/dependency/mmdet/models/dense_heads/corner_head.py | https://arxiv.org/abs/1808.01244 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/scnet_roi_head.py | GCNet/dependency/mmdet/models/roi_heads/bbox_heads/scnet_bbox_head.py | https://arxiv.org/abs/2012.10150 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/datasets/lvis.py | GCNet/dependency/mmdet/datasets/lvis.py | http://images.cocodataset.org/train2017/000000391895.jpg | 图片地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/dense_heads/centripetal_head.py | GCNet/dependency/mmdet/models/dense_heads/centripetal_head.py | https://arxiv.org/abs/2003.09119 | 论文地址 | -| 开发引入 | / | GCNet/dependency/mmdet/datasets/dataset_wrappers.py | https://github.com/facebookresearch/detectron2/blob/41d475b75a230221e21d9cac5d69655e3415e3a4/detectron2/data/samplers/distributed_sampler.py#L57 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/projects/Detic_new/detic/heatmap_focal_loss.py | GCNet/dependency/mmdet/models/detectors/cornernet.py | https://arxiv.org/abs/1808.01244 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/fovea.py | GCNet/dependency/mmdet/models/detectors/fovea.py | https://arxiv.org/abs/1904.03797 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/task_modules/coders/tblr_bbox_coder.py | GCNet/dependency/mmdet/models/dense_heads/fsaf_head.py | https://arxiv.org/abs/1903.00621 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/task_modules/coders/tblr_bbox_coder.py | GCNet/dependency/mmdet/core/bbox/coder/tblr_bbox_coder.py | https://arxiv.org/abs/1903.00621 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/dense_heads/free_anchor_retina_head.py | 
GCNet/dependency/mmdet/models/dense_heads/free_anchor_retina_head.py | https://arxiv.org/abs/1909.02466 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/datasets/transforms/transforms.py | GCNet/dependency/mmdet/datasets/pipelines/transforms.py | https://albumentations.readthedocs.io | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/backbones/regnet.py | GCNet/dependency/mmdet/models/backbones/regnet.py | https://arxiv.org/abs/2003.13678 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/configs/sabl/README.md | GCNet/dependency/mmdet/models/dense_heads/sabl_retina_head.py | https://arxiv.org/abs/1912.04260 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/losses/focal_loss.py | GCNet/dependency/mmdet/models/detectors/retinanet.py | https://arxiv.org/abs/1708.02002 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/necks/rfp.py | GCNet/dependency/mmdet/models/necks/rfp.py | https://arxiv.org/pdf/2006.02334.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/task_modules/samplers/score_hlr_sampler.py | GCNet/dependency/mmdet/core/bbox/samplers/score_hlr_sampler.py | https://arxiv.org/abs/1904.04821 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/necks/hrfpn.py | GCNet/dependency/mmdet/models/backbones/hrnet.py | https://arxiv.org/abs/1904.04514 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/detr.py | GCNet/dependency/mmdet/models/utils/transformer.py | https://arxiv.org/pdf/2005.12872 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/structures/mask/structures.py | GCNet/dependency/mmdet/core/mask/structures.py | https://stackoverflow.com/questions/24467972/calculate-area-of-polygon-given-x-y-coordinates | 相关说明 | -| 开发引入 | / | GCNet/dependency/mmdet/core/mask/structures.py | https://gitlab.kitware.com/computer-vision/kwimage/-/blob/928cae35ca8/kwimage/structs/polygon.py#L379 | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/reppoints_detector.py | GCNet/dependency/mmdet/models/detectors/reppoints_detector.py | https://arxiv.org/pdf/1904.11490 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/docs/en/notes/changelog_v2.x.md | GCNet/dependency/mmdet/models/necks/fpn_carafe.py | https://arxiv.org/abs/1905.02188 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/datasets/wider_face.py | GCNet/dependency/mmdet/datasets/wider_face.py | https://github.com/sovrasov/wider-face-pascal-voc-annotations | 源码实现 | -| 开发引入 | / | GCNet/dependency/mmdet/models/losses/ae_loss.py | https://github.com/princeton-vl/CornerNet/blob/master/models/py_utils/kp_utils.py#L180 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/losses/balanced_l1_loss.py | GCNet/dependency/mmdet/core/bbox/samplers/iou_balanced_neg_sampler.py | https://arxiv.org/pdf/1904.02701.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/projects/RF100-Benchmark/coco_metric.py | GCNet/dependency/mmcv/roi_align.py | https://github.com/facebookresearch/detectron2/ | 源码实现 | -| 开发引入 | / | GCNet/dependency/mmdet/models/losses/gaussian_focal_loss.py | https://github.com/princeton-vl/CornerNet/blob/master/models/py_utils/kp_utils.py#L152 | 源码实现 | -| 开发引入 | / | GCNet/dependency/mmdet/models/dense_heads/paa_head.py | https://github.com/kkhoot/PAA/issues/8 | 相关说明 | -| 开发引入 | / | GCNet/dependency/mmcv/optimizer.py | 
https://pytorch.org/docs/stable/amp.html#torch.cuda.amp.GradScaler | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/double_roi_head.py | GCNet/dependency/mmdet/models/roi_heads/double_roi_head.py | https://arxiv.org/abs/1904.06493 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/mask_scoring_roi_head.py | GCNet/dependency/mmdet/models/detectors/mask_scoring_rcnn.py | https://arxiv.org/abs/1903.00241 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/task_modules/assigners/atss_assigner.py | GCNet/dependency/mmdet/core/bbox/assigners/atss_assigner.py | https://github.com/sfzhang15/ATSS/blob/master/atss_core/modeling/rpn/atss/loss.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/fast_rcnn.py | GCNet/dependency/mmdet/models/detectors/fast_rcnn.py | https://arxiv.org/abs/1504.08083 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/mask_heads/grid_head.py | GCNet/dependency/mmdet/models/roi_heads/mask_heads/grid_head.py | https://arxiv.org/abs/1906.05688 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/projects/EfficientDet/efficientdet/tensorflow/yxyx_bbox_coder.py | GCNet/dependency/mmdet/core/bbox/coder/delta_xywh_bbox_coder.py | https://arxiv.org/abs/1311.2524 | 论文地址 | -| 开发引入 | / | GCNet/dependency/mmdet/utils/util_random.py | https://gitlab.kitware.com/computer-vision/kwarray/blob/master/kwarray/util_random.py#L270 | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/utils/gaussian_target.py | GCNet/dependency/mmdet/models/utils/gaussian_target.py | https://github.com/princeton-vl/CornerNet-Lite/blob/master/core/sample/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/sparse_roi_head.py | GCNet/dependency/mmdet/models/roi_heads/bbox_heads/dii_head.py | https://arxiv.org/abs/2011.12450 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/cascade_rcnn.py | GCNet/dependency/mmdet/models/detectors/cascade_rcnn.py | https://arxiv.org/abs/1906.09756 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/roi_extractors/single_level_roi_extractor.py | GCNet/dependency/mmdet/models/necks/fpn.py | https://arxiv.org/abs/1612.03144 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/detr.py | GCNet/dependency/mmdet/models/utils/positional_encoding.py | https://arxiv.org/pdf/2005.12872 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/fovea.py | GCNet/dependency/mmdet/models/dense_heads/fovea_head.py | https://arxiv.org/abs/1904.03797 | 论文地址 | -| 开发引入 | / | GCNet/dependency/mmcv/optimizer.py | https://arxiv.org/abs/1710.03740 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/paa.py | GCNet/dependency/mmdet/models/detectors/paa.py | https://arxiv.org/pdf/2007.08103.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/nasfcos.py | GCNet/dependency/mmdet/models/detectors/nasfcos.py | https://arxiv.org/abs/1906.0442 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/losses/varifocal_loss.py | GCNet/dependency/mmdet/models/losses/varifocal_loss.py | https://arxiv.org/abs/2008.13367 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/datasets/transforms/instaboost.py | 
GCNet/dependency/mmdet/datasets/pipelines/instaboost.py | https://arxiv.org/abs/1908.07801 | 论文地址 | -| 开发引入 | / | GCNet/dependency/mmdet/models/roi_heads/point_rend_roi_head.py | https://github.com/facebookresearch/detectron2/tree/master/projects/PointRend | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/losses/gfocal_loss.py | GCNet/dependency/mmdet/models/losses/gfocal_loss.py | https://arxiv.org/abs/2006.04388 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/configs/libra_rcnn/README.md | GCNet/dependency/mmdet/models/necks/bfp.py | https://arxiv.org/abs/1904.02701 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/configs/guided_anchoring/README.md | GCNet/dependency/mmdet/models/dense_heads/guided_anchor_head.py | https://arxiv.org/abs/1901.03278 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/configs/instaboost/README.md | GCNet/dependency/mmdet/datasets/pipelines/instaboost.py | https://github.com/GothicAi/Instaboost | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/point_rend_roi_head.py | GCNet/dependency/mmdet/models/detectors/point_rend.py | https://arxiv.org/abs/1912.08193 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/docs/en/notes/changelog_v2.x.md | GCNet/dependency/mmdet/datasets/dataset_wrappers.py | https://arxiv.org/abs/1908.03195 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/utils/util_mixins.py | GCNet/dependency/mmdet/utils/util_mixins.py | https://github.com/Erotemic/ubelt | 源码实现 | +| 文件位置 | 公网地址 | 公网地址用途 | +|---------------------------------------------------------------------------------------|--------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/GCNet/dependency/mmdet/datasets/lvis.py | http://images.cocodataset.org/ | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/GCNet/docker/Dockerfile | https://openmmlab.oss-accelerate.aliyuncs.com/mmcv/dist/index.html | mmcv地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/detection/GFocalV2/public_address_statement.md b/PyTorch/contrib/cv/detection/GFocalV2/public_address_statement.md index 9839a6c8025ba3e1805d4f3add6900accc96bed0..822f5b72e8ba581dbfb7aef9ccda1e410d4990dd 100644 --- a/PyTorch/contrib/cv/detection/GFocalV2/public_address_statement.md +++ b/PyTorch/contrib/cv/detection/GFocalV2/public_address_statement.md @@ -1,135 +1,8 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------|-------------------------| ------------------------------------ |---------| -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | GCNet/docker/Dockerfile | https://openmmlab.oss-accelerate.aliyuncs.com/mmcv/dist/index.html | 下载依赖 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | GCNet/docker/Dockerfile | https://github.com/open-mmlab/mmdetection.git | 下载依赖 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | GFocalV2/mmdet/datasets/lvis.py | http://images.cocodataset.org/ | 下载依赖 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | GFocalV2/setup.py | openmmlab@gmail.com | 邮箱 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | GFocalV2/setup.py | https://github.com/open-mmlab/mmdetection | 下载依赖 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection | GFocalV2/tests/async_benchmark.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/mmdetection/models/mask_rcnn_r50_fpn_1x_20181010-069fa190.pth | 下载预训练权重 | -| 开发引入 | 
/ | GFocalV2/mmdet/models/roi_heads/point_rend_roi_head.py | https://github.com/facebookresearch/detectron2/tree/master/projects/PointRend | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/necks/pafpn.py | GFocalV2/mmdet/models/necks/pafpn.py | https://arxiv.org/abs/1803.01534 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/docs/en/notes/changelog_v2.x.md | GFocalV2/mmdet/models/necks/fpn_carafe.py | https://arxiv.org/abs/1905.02188 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/projects/Detic_new/detic/heatmap_focal_loss.py | GFocalV2/mmdet/models/detectors/cornernet.py | https://arxiv.org/abs/1808.01244 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/task_modules/samplers/sampling_result.py | GFocalV2/mmdet/core/bbox/demodata.py | https://gitlab.kitware.com/computer-vision/kwimage/blob/master/kwimage/structs/boxes.py#L1390 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/dense_heads/paa_head.py | GFocalV2/mmdet/models/dense_heads/paa_head.py | https://github.com/kkhoot/PAA/blob/master/paa_core | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/layers/bbox_nms.py | GFocalV2/mmdet/models/detectors/yolact.py | https://arxiv.org/abs/1904.02689 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/docs/zh_cn/conf.py | GFocalV2/docs/conf.py | https://www.sphinx-doc.org/en/master/usage/configuration.html | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/datasets/transforms/transforms.py | GFocalV2/mmdet/datasets/pipelines/transforms.py | https://arxiv.org/abs/1708.04552 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/dense_heads/paa_head.py | GFocalV2/mmdet/models/dense_heads/paa_head.py | https://arxiv.org/abs/2007.08103 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/task_modules/samplers/score_hlr_sampler.py | GFocalV2/mmdet/core/bbox/samplers/score_hlr_sampler.py | https://arxiv.org/abs/1904.04821 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/configs/libra_rcnn/README.md | GFocalV2/mmdet/models/necks/bfp.py | https://arxiv.org/abs/1904.02701 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/datasets/wider_face.py | GFocalV2/mmdet/datasets/wider_face.py | https://github.com/sovrasov/wider-face-pascal-voc-annotations | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/structures/mask/structures.py | GFocalV2/mmdet/core/mask/structures.py | https://stackoverflow.com/questions/24467972/calculate-area-of-polygon-given-x-y-coordinates | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/configs/sabl/README.md | GFocalV2/mmdet/core/bbox/coder/bucketing_bbox_coder.py | https://arxiv.org/abs/1912.04260 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/fast_rcnn.py | GFocalV2/mmdet/models/detectors/fast_rcnn.py | https://arxiv.org/abs/1504.08083 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/dense_heads/nasfcos_head.py | GFocalV2/mmdet/models/dense_heads/fcos_head.py | https://github.com/tianzhi0549/FCOS/issues/89#issuecomment-516877042 | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/dense_heads/centripetal_head.py | GFocalV2/mmdet/models/dense_heads/centripetal_head.py | https://arxiv.org/abs/2003.09119 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/configs/sabl/README.md | 
GFocalV2/mmdet/models/roi_heads/bbox_heads/sabl_head.py | https://arxiv.org/abs/1912.04260 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/necks/rfp.py | GFocalV2/mmdet/models/necks/rfp.py | https://arxiv.org/pdf/2006.02334.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/necks/hrfpn.py | GFocalV2/mmdet/models/backbones/hrnet.py | https://arxiv.org/abs/1904.04514 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/cascade_roi_head.py | GFocalV2/mmdet/models/roi_heads/cascade_roi_head.py | https://arxiv.org/abs/1712.00726 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/layers/bbox_nms.py | GFocalV2/mmdet/core/post_processing/bbox_nms.py | https://arxiv.org/abs/1904.02689 | 论文地址 | -| 开发引入 | / | GFocalV2/mmdet/models/losses/ae_loss.py | https://github.com/princeton-vl/CornerNet/blob/master/models/py_utils/kp_utils.py#L180 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/dense_heads/ssd_head.py | GFocalV2/mmdet/models/dense_heads/ssd_head.py | https://arxiv.org/abs/1512.02325 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/projects/EfficientDet/efficientdet/tensorflow/yxyx_bbox_coder.py | GFocalV2/mmdet/core/bbox/coder/legacy_delta_xywh_bbox_coder.py | https://arxiv.org/abs/1311.2524 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/losses/ghm_loss.py | GFocalV2/mmdet/models/losses/ghm_loss.py | https://arxiv.org/abs/1811.05181 | 论文地址 | -| 开发引入 | / | GFocalV2/mmdet/datasets/dataset_wrappers.py | https://github.com/facebookresearch/detectron2/blob/41d475b75a230221e21d9cac5d69655e3415e3a4/detectron2/data/samplers/distributed_sampler.py#L57 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/losses/focal_loss.py | GFocalV2/mmdet/models/losses/focal_loss.py | https://arxiv.org/abs/1708.02002 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/losses/focal_loss.py | GFocalV2/mmdet/models/losses/gaussian_focal_loss.py | https://arxiv.org/abs/1708.02002 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/necks/nasfcos_fpn.py | GFocalV2/mmdet/models/necks/nasfcos_fpn.py | https://arxiv.org/abs/1906.04423 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/datasets/lvis.py | GFocalV2/mmdet/datasets/lvis.py | http://images.cocodataset.org/train2017/000000391895.jpg | 图片地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/reppoints_detector.py | GFocalV2/mmdet/models/detectors/reppoints_detector.py | https://arxiv.org/pdf/1904.11490 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/dense_heads/paa_head.py | GFocalV2/mmdet/models/dense_heads/paa_head.py | https://github.com/kkhoot/PAA/issues/9 | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/necks/rfp.py | GFocalV2/mmdet/models/backbones/detectors_resnet.py | https://arxiv.org/pdf/2006.02334.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/datasets/transforms/transforms.py | GFocalV2/mmdet/datasets/pipelines/transforms.py | https://albumentations.readthedocs.io | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/projects/RF100-Benchmark/coco_metric.py | GFocalV2/mmdet/models/roi_heads/mask_heads/fcn_mask_head.py | https://github.com/facebookresearch/detectron2/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/losses/iou_loss.py | 
GFocalV2/mmdet/models/losses/iou_loss.py | https://github.com/Zzh-tju/CIoU | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/backbones/hourglass.py | GFocalV2/mmdet/models/backbones/hourglass.py | https://arxiv.org/abs/1603.06937 | 论文地址 | -| 开发引入 | / | GFocalV2/mmdet/core/bbox/demodata.py | https://gitlab.kitware.com/computer-vision/kwarray/blob/master/kwarray/util_random.py#L270 | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/task_modules/samplers/score_hlr_sampler.py | GFocalV2/mmdet/models/roi_heads/pisa_roi_head.py | https://arxiv.org/abs/1904.04821 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/htc_roi_head.py | GFocalV2/mmdet/models/roi_heads/htc_roi_head.py | https://arxiv.org/abs/1901.07518 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/double_roi_head.py | GFocalV2/mmdet/models/roi_heads/double_roi_head.py | https://arxiv.org/abs/1904.06493 | 论文地址 | -| 开发引入 | / | GFocalV2/configs/cityscapes/faster_rcnn_r50_fpn_1x_cityscapes.py | https://open-mmlab.s3.ap-northeast-2.amazonaws.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_fpn_1x_coco/faster_rcnn_r50_fpn_1x_coco_20200130-047c8118.pth | 预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/backbones/resnet.py | GFocalV2/mmdet/models/backbones/resnet.py | https://arxiv.org/pdf/1812.01187.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/task_modules/samplers/ohem_sampler.py | GFocalV2/mmdet/core/bbox/samplers/ohem_sampler.py | https://arxiv.org/abs/1604.03540 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/roi_extractors/single_level_roi_extractor.py | GFocalV2/mmdet/models/necks/fpn.py | https://arxiv.org/abs/1612.03144 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/configs/guided_anchoring/README.md | GFocalV2/mmdet/models/dense_heads/guided_anchor_head.py | https://arxiv.org/abs/1901.03278 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/projects/SparseInst/sparseinst/sparseinst.py | GFocalV2/mmdet/models/dense_heads/atss_head.py | https://arxiv.org/abs/1912.02424 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/backbones/regnet.py | GFocalV2/mmdet/models/backbones/regnet.py | https://arxiv.org/abs/2003.13678 | 论文地址 | -| 开发引入 | / | GFocalV2/configs/cityscapes/mask_rcnn_r50_fpn_1x_cityscapes.py | https://open-mmlab.s3.ap-northeast-2.amazonaws.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r50_fpn_1x_coco/mask_rcnn_r50_fpn_1x_coco_20200205-d4b0c5d6.pth | 预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/losses/focal_loss.py | GFocalV2/mmdet/models/detectors/retinanet.py | https://arxiv.org/abs/1708.02002 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/paa.py | GFocalV2/mmdet/models/detectors/paa.py | https://arxiv.org/pdf/2007.08103.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/fovea.py | GFocalV2/mmdet/models/detectors/fovea.py | https://arxiv.org/abs/1904.03797 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/losses/gfocal_loss.py | GFocalV2/mmdet/models/dense_heads/gfl_head.py | https://arxiv.org/abs/2006.04388 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/losses/balanced_l1_loss.py | GFocalV2/mmdet/models/losses/balanced_l1_loss.py | https://arxiv.org/pdf/1904.02701.pdf | 论文地址 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmdetection/mmdet/models/losses/varifocal_loss.py | GFocalV2/mmdet/models/detectors/vfnet.py | https://arxiv.org/abs/2008.13367 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/necks/nasfcos_fpn.py | GFocalV2/mmdet/models/dense_heads/nasfcos_head.py | https://arxiv.org/abs/1906.04423 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/datasets/transforms/transforms.py | GFocalV2/mmdet/datasets/pipelines/transforms.py | https://github.com/bethgelab/imagecorruptions | 源码实现 | -| 开发引入 | / | GFocalV2/mmdet/core/bbox/coder/yolo_bbox_coder.py | https://arxiv.org/abs/1506.02640 | 论文地址 | -| 开发引入 | / | GFocalV2/configs/faster_rcnn/faster_rcnn_r50_fpn_1x_coco-person-bicycle-car.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/mmdetection/models/faster_rcnn_r50_fpn_1x_20181010-3d1b3351.pth | 预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/htc_roi_head.py | GFocalV2/mmdet/models/detectors/htc.py | https://arxiv.org/abs/1901.07518 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/task_modules/coders/tblr_bbox_coder.py | GFocalV2/mmdet/models/dense_heads/fsaf_head.py | https://arxiv.org/abs/1903.00621 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/projects/EfficientDet/efficientdet/tensorflow/yxyx_bbox_coder.py | GFocalV2/mmdet/core/bbox/coder/delta_xywh_bbox_coder.py | https://arxiv.org/abs/1311.2524 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/mask_heads/grid_head.py | GFocalV2/mmdet/models/roi_heads/grid_roi_head.py | https://arxiv.org/abs/1811.12030 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/necks/hrfpn.py | GFocalV2/mmdet/models/necks/hrfpn.py | https://arxiv.org/abs/1904.04514 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/losses/varifocal_loss.py | GFocalV2/mmdet/models/dense_heads/vfnet_head.py | https://arxiv.org/abs/2008.13367 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/task_modules/assigners/atss_assigner.py | GFocalV2/mmdet/core/bbox/assigners/atss_assigner.py | https://github.com/sfzhang15/ATSS/blob/master/atss_core/modeling/rpn/atss/loss.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/point_rend_roi_head.py | GFocalV2/mmdet/models/roi_heads/point_rend_roi_head.py | https://arxiv.org/abs/1912.08193 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/necks/nas_fpn.py | GFocalV2/mmdet/models/necks/nas_fpn.py | https://arxiv.org/abs/1904.07392 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/losses/iou_loss.py | GFocalV2/mmdet/models/losses/iou_loss.py | https://github.com/Zzh-tju/DIoU | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/layers/bbox_nms.py | GFocalV2/mmdet/models/dense_heads/yolact_head.py | https://arxiv.org/abs/1904.02689 | 论文地址 | -| 开发引入 | / | GFocalV2/mmdet/datasets/cityscapes.py | https://github.com/mcordts/cityscapesScripts/blob/master/cityscapesscripts/evaluation/evalInstanceLevelSemanticLabeling.py | 源码实现 | -| 开发引入 | / | GFocalV2/mmdet/models/roi_heads/mask_heads/mask_point_head.py | https://github.com/facebookresearch/detectron2/tree/master/projects/PointRend/point_head/point_head.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/projects/Detic_new/detic/heatmap_focal_loss.py | GFocalV2/mmdet/models/losses/ae_loss.py | https://arxiv.org/abs/1808.01244 
| 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/projects/RF100-Benchmark/coco_metric.py | GFocalV2/mmdet/datasets/coco.py | https://github.com/facebookresearch/detectron2/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/dense_heads/corner_head.py | GFocalV2/mmdet/models/dense_heads/corner_head.py | https://github.com/princeton-vl/CornerNet/blob/master/models/py_utils/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/dense_heads/free_anchor_retina_head.py | GFocalV2/mmdet/models/dense_heads/free_anchor_retina_head.py | https://arxiv.org/abs/1909.02466 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/projects/Detic_new/detic/heatmap_focal_loss.py | GFocalV2/mmdet/models/dense_heads/corner_head.py | https://arxiv.org/abs/1808.01244 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/datasets/transforms/instaboost.py | GFocalV2/mmdet/datasets/pipelines/instaboost.py | https://arxiv.org/abs/1908.07801 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/mask_heads/grid_head.py | GFocalV2/mmdet/models/roi_heads/mask_heads/grid_head.py | https://arxiv.org/abs/1906.05688 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/losses/varifocal_loss.py | GFocalV2/mmdet/models/losses/varifocal_loss.py | https://arxiv.org/abs/2008.13367 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/mask_rcnn.py | GFocalV2/mmdet/models/detectors/mask_rcnn.py | https://arxiv.org/abs/1703.06870 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/mask_scoring_roi_head.py | GFocalV2/mmdet/models/roi_heads/mask_scoring_roi_head.py | https://arxiv.org/abs/1903.00241 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/projects/SparseInst/sparseinst/sparseinst.py | GFocalV2/mmdet/models/detectors/atss.py | https://arxiv.org/abs/1912.02424 | 论文地址 | -| 开发引入 | / | GFocalV2/mmdet/models/losses/gaussian_focal_loss.py | https://github.com/princeton-vl/CornerNet/blob/master/models/py_utils/kp_utils.py#L152 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/projects/RF100-Benchmark/coco_metric.py | GFocalV2/mmdet/datasets/lvis.py | https://github.com/facebookresearch/detectron2/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/dynamic_roi_head.py | GFocalV2/mmdet/models/roi_heads/dynamic_roi_head.py | https://arxiv.org/abs/2004.06002 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/nasfcos.py | GFocalV2/mmdet/models/detectors/nasfcos.py | https://arxiv.org/abs/1906.0442 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/fcos.py | GFocalV2/mmdet/models/detectors/fcos.py | https://arxiv.org/abs/1904.01355 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/datasets/transforms/augment_wrappers.py | GFocalV2/mmdet/datasets/pipelines/auto_augment.py | https://arxiv.org/pdf/1906.11172 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/projects/Detic_new/detic/heatmap_focal_loss.py | GFocalV2/mmdet/models/losses/gaussian_focal_loss.py | https://arxiv.org/abs/1808.01244 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/mask_scoring_roi_head.py | GFocalV2/mmdet/models/detectors/mask_scoring_rcnn.py | https://arxiv.org/abs/1903.00241 | 论文地址 | -| 开发引入 | / | GFocalV2/mmdet/models/dense_heads/fcos_head.py | 
https://github.com/tianzhi0549/FCOS | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/setup.py | GFocalV2/setup.py | http://setuptools.readthedocs.io/en/latest/setuptools.html#declaring-platform-specific-dependencies | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/faster_rcnn.py | GFocalV2/mmdet/models/detectors/faster_rcnn.py | https://arxiv.org/abs/1506.01497 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/cascade_rcnn.py | GFocalV2/mmdet/models/detectors/cascade_rcnn.py | https://arxiv.org/abs/1906.09756 | 论文地址 | -| 开发引入 | / | GFocalV2/mmdet/models/dense_heads/gfocal_head.py | https://github.com/open-mmlab/mmdetection/blob/master/mmdet/core/post_processing/bbox_nms.py#L7 | 源码实现 | -| 开发引入 | / | GFocalV2/mmdet/datasets/builder.py | https://github.com/pytorch/pytorch/issues/973 | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/task_modules/coders/tblr_bbox_coder.py | GFocalV2/mmdet/models/detectors/fsaf.py | https://arxiv.org/abs/1903.00621 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/task_modules/coders/tblr_bbox_coder.py | GFocalV2/mmdet/core/bbox/coder/tblr_bbox_coder.py | https://arxiv.org/abs/1903.00621 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/point_rend_roi_head.py | GFocalV2/mmdet/models/detectors/point_rend.py | https://arxiv.org/abs/1912.08193 | 论文地址 | -| 开发引入 | / | GFocalV2/mmdet/datasets/cityscapes.py | https://github.com/facebookresearch/detectron2/blob/master/detectron2/data/datasets/cityscapes.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/projects/Detic_new/detic/iou_loss.py | GFocalV2/mmdet/models/losses/iou_loss.py | https://arxiv.org/abs/1902.09630 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/structures/mask/structures.py | GFocalV2/mmdet/core/mask/structures.py | https://github.com/facebookresearch/detectron2/blob/ffff8acc35ea88ad1cb1806ab0f00b4c1c5dbfd9/detectron2/structures/masks.py#L387 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/utils/util_mixins.py | GFocalV2/mmdet/utils/util_mixins.py | https://github.com/Erotemic/ubelt | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/losses/iou_loss.py | GFocalV2/mmdet/models/losses/iou_loss.py | https://arxiv.org/abs/1711.00164 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/roi_extractors/generic_roi_extractor.py | GFocalV2/mmdet/models/roi_heads/roi_extractors/generic_roi_extractor.py | https://arxiv.org/abs/2004.13665 | 论文地址 | -| 开发引入 | / | GFocalV2/mmcv_need/optimizer.py | https://arxiv.org/abs/1710.03740 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/configs/sabl/README.md | GFocalV2/mmdet/models/dense_heads/sabl_retina_head.py | https://arxiv.org/abs/1912.04260 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/docs/zh_cn/make.bat | GFocalV2/docs/make.bat | http://sphinx-doc.org/ | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/losses/gfocal_loss.py | GFocalV2/mmdet/models/losses/gfocal_loss.py | https://arxiv.org/abs/2006.04388 | 论文地址 | -| 开发引入 | / | GFocalV2/mmdet/models/losses/ae_loss.py | https://arxiv.org/abs/1611.05424 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/losses/balanced_l1_loss.py | GFocalV2/mmdet/core/bbox/samplers/iou_balanced_neg_sampler.py | https://arxiv.org/pdf/1904.02701.pdf | 论文地址 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmdetection/mmdet/models/losses/iou_loss.py | GFocalV2/mmdet/models/losses/iou_loss.py | https://arxiv.org/abs/2005.03572 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/yolo.py | GFocalV2/mmdet/models/dense_heads/yolo_head.py | https://arxiv.org/abs/1804.02767 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/docs/en/notes/changelog_v2.x.md | GFocalV2/mmdet/datasets/dataset_wrappers.py | https://arxiv.org/abs/1908.03195 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/dense_heads/retina_head.py | GFocalV2/mmdet/models/dense_heads/retina_head.py | https://arxiv.org/pdf/1708.02002.pdf | 论文地址 | -| 开发引入 | / | GFocalV2/mmdet/models/dense_heads/paa_head.py | https://github.com/kkhoot/PAA/issues/8 | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/mask_heads/grid_head.py | GFocalV2/mmdet/models/detectors/grid_rcnn.py | https://arxiv.org/abs/1906.05688 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/roi_extractors/single_level_roi_extractor.py | GFocalV2/mmdet/models/roi_heads/roi_extractors/single_level_roi_extractor.py | https://arxiv.org/abs/1612.03144 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/mask_heads/grid_head.py | GFocalV2/mmdet/models/detectors/grid_rcnn.py | https://arxiv.org/abs/1811.12030 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/configs/instaboost/README.md | GFocalV2/mmdet/datasets/pipelines/instaboost.py | https://github.com/GothicAi/Instaboost | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/losses/iou_loss.py | GFocalV2/mmdet/models/losses/iou_loss.py | https://arxiv.org/abs/1911.08287 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/utils/gaussian_target.py | GFocalV2/mmdet/models/utils/gaussian_target.py | https://github.com/princeton-vl/CornerNet-Lite/blob/master/core/sample/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/fovea.py | GFocalV2/mmdet/models/dense_heads/fovea_head.py | https://arxiv.org/abs/1904.03797 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/fcos.py | GFocalV2/mmdet/models/dense_heads/fcos_head.py | https://arxiv.org/abs/1904.01355 | 论文地址 | +| 文件位置 | 公网地址 | 公网地址用途 | +|------------------------------------------------------------------------------------------------------------------------------|---------------------------------------------------------------------------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/GFocalV2/configs/cityscapes/faster_rcnn_r50_fpn_1x_cityscapes.py | https://open-mmlab.s3.ap-northeast-2.amazonaws.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_fpn_1x_coco/faster_rcnn_r50_fpn_1x_coco_20200130-047c8118.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/GFocalV2/configs/cityscapes/mask_rcnn_r50_fpn_1x_cityscapes.py | https://open-mmlab.s3.ap-northeast-2.amazonaws.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r50_fpn_1x_coco/mask_rcnn_r50_fpn_1x_coco_20200205-d4b0c5d6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/GFocalV2/configs/faster_rcnn/faster_rcnn_r50_fpn_1x_coco-person-bicycle-car.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/mmdetection/models/faster_rcnn_r50_fpn_1x_20181010-3d1b3351.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/detection/GFocalV2/docker/Dockerfile | https://openmmlab.oss-accelerate.aliyuncs.com/mmcv/dist/index.html | mmcv地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/GFocalV2/mmdet/datasets/lvis.py | http://images.cocodataset.org/ | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/GFocalV2/setup.py | openmmlab@gmail.com | 作者邮箱 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/detection/M2Det/public_address_statement.md b/PyTorch/contrib/cv/detection/M2Det/public_address_statement.md index 766484af38bd4e78cf2ac674c7781e220bce3e1f..00cbbbc8d614193914a9eaa18801c49e320f38d5 100644 --- a/PyTorch/contrib/cv/detection/M2Det/public_address_statement.md +++ b/PyTorch/contrib/cv/detection/M2Det/public_address_statement.md @@ -1,26 +1,7 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------|------------------------| ------------------------------------ |---------| -| 开源代码引入 | https://github.com/qijiezhao/M2Det | M2Det/layers/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/qijiezhao/M2Det | M2Det/layers/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/qijiezhao/M2Det | M2Det/layers/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/qijiezhao/M2Det | M2Det/layers/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/qijiezhao/M2Det | M2Det/layers/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/qijiezhao/M2Det | M2Det/layers/senet.py | http://data.lip6.fr/cadene/pretrainedmodels/senet154-c7b49a05.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/qijiezhao/M2Det | M2Det/layers/senet.py | http://data.lip6.fr/cadene/pretrainedmodels/se_resnet50-ce0d4300.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/qijiezhao/M2Det | M2Det/layers/senet.py | http://data.lip6.fr/cadene/pretrainedmodels/se_resnet101-7e38fcc6.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/qijiezhao/M2Det | M2Det/layers/senet.py | http://data.lip6.fr/cadene/pretrainedmodels/se_resnet152-d17c99b7.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/qijiezhao/M2Det | M2Det/layers/senet.py | http://data.lip6.fr/cadene/pretrainedmodels/se_resnext50_32x4d-a260b3a4.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/qijiezhao/M2Det | M2Det/layers/senet.py | http://data.lip6.fr/cadene/pretrainedmodels/se_resnext101_32x4d-3b2fe3d8.pth | 下载预训练权重 | -| 开发引入 | / | M2Det/layers/modules/multibox_loss.py | https://arxiv.org/pdf/1512.02325.pdf | 论文地址 | -| 开源代码引入 | https://github.com/qijiezhao/M2Det/logger.py | M2Det/logger.py | https://gist.github.com/gyglim/1f8dfb1b5c82627ae3efcfbbadb9f514 | 源码实现 | -| 开源代码引入 | https://github.com/qijiezhao/M2Det/data/voc0712.py | M2Det/data/coco.py | https://github.com/fmassa/vision/blob/voc_dataset/torchvision/datasets/voc.py | 源码实现 | -| 开源代码引入 | https://github.com/qijiezhao/M2Det/data/voc0712.py | M2Det/data/voc0712.py | https://github.com/pytorch/vision/issues/9 | 相关说明 | -| 开源代码引入 | https://github.com/qijiezhao/M2Det/utils/box_utils.py | M2Det/utils/box_utils.py | https://github.com/fmassa/object-detection.torch | 源码实现 | -| 开发引入 | / | M2Det/finetune.py | https://www.github.com/nvidia/apex | 相关依赖 | -| 开源代码引入 | https://github.com/qijiezhao/M2Det/README.md | M2Det/m2det.py | zhaoqijie@pku.edu.cn | 邮箱地址 | -| 开源代码引入 | 
https://github.com/qijiezhao/M2Det/utils/build.py | M2Det/utils/build.py | http://code.activestate.com/recipes/52224-find-a-file-given-a-search-path/ | 源码实现 | -| 开源代码引入 | https://github.com/qijiezhao/M2Det/utils/box_utils.py | M2Det/utils/box_utils.py | https://github.com/Hakuyume/chainer-ssd | 源码实现 | -| 开源代码引入 | https://github.com/qijiezhao/M2Det/data/voc0712.py | M2Det/data/voc0712.py | https://github.com/fmassa/vision/blob/voc_dataset/torchvision/datasets/voc.py | 源码实现 | -| 开源代码引入 | https://github.com/qijiezhao/M2Det/layers/senet.py | M2Det/layers/senet.py | https://github.com/pytorch/vision/blob/master/torchvision/models/resnet.py | 源码实现 | -| 开发引入 | / | M2Det/train_8p.py | https://www.github.com/nvidia/apex | 相关依赖 | -| 开源代码引入 | https://github.com/qijiezhao/M2Det/data/data_augment.py | M2Det/data/data_augment.py | http://arxiv.org/abs/1512.02325 | 论文地址 | +| 文件位置 | 公网地址 | 公网地址用途 | +|----------------------------------------------------------------------|------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/M2Det/layers/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/M2Det/layers/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/M2Det/layers/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/M2Det/layers/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/M2Det/layers/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/detection/NasFPN/public_address_statement.md b/PyTorch/contrib/cv/detection/NasFPN/public_address_statement.md index de960a222650aabed492ad950541f95742c318a6..d14fed3d0746d816020145e65f35d699f304f7a6 100644 --- a/PyTorch/contrib/cv/detection/NasFPN/public_address_statement.md +++ b/PyTorch/contrib/cv/detection/NasFPN/public_address_statement.md @@ -1,157 +1,9 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------|------------------------| ------------------------------------ |--| -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/docker/Dockerfile | NasFPN/docker/Dockerfile | https://openmmlab.oss-accelerate.aliyuncs.com/mmcv/dist/index.html | 下载依赖 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/docker/Dockerfile | NasFPN/docker/Dockerfile | https://github.com/open-mmlab/mmdetection.git | 下载依赖 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/yolo.py | NasFPN/mmdet/models/dense_heads/yolo_head.py | https://arxiv.org/abs/1804.02767 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/necks/hrfpn.py | NasFPN/mmdet/models/necks/hrfpn.py | https://arxiv.org/abs/1904.04514 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/detr.py | NasFPN/mmdet/models/dense_heads/transformer_head.py | https://arxiv.org/pdf/2005.12872 | 论文地址 | -| 开发引入 | / | NasFPN/mmdet/models/losses/gaussian_focal_loss.py | https://github.com/princeton-vl/CornerNet/blob/master/models/py_utils/kp_utils.py#L152 | 源码实现 | -| 开发引入 | / | NasFPN/configs/faster_rcnn/faster_rcnn_r50_caffe_fpn_mstrain_1x_coco-person.py | 
http://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_caffe_fpn_mstrain_3x_coco/faster_rcnn_r50_caffe_fpn_mstrain_3x_coco_bbox_mAP-0.398_20200504_163323-30042637.pth | 预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/mask_scoring_roi_head.py | NasFPN/mmdet/models/detectors/mask_scoring_rcnn.py | https://arxiv.org/abs/1903.00241 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/projects/RF100-Benchmark/coco_metric.py | NasFPN/mmdet/datasets/lvis.py | https://github.com/facebookresearch/detectron2/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/projects/Detic_new/detic/heatmap_focal_loss.py | NasFPN/mmdet/models/losses/gaussian_focal_loss.py | https://arxiv.org/abs/1808.01244 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/configs/mask_rcnn/metafile.yml | NasFPN/configs/cityscapes/mask_rcnn_r50_fpn_1x_cityscapes.py | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r50_fpn_1x_coco/mask_rcnn_r50_fpn_1x_coco_20200205-d4b0c5d6.pth | 预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/configs/guided_anchoring/README.md | NasFPN/mmdet/models/dense_heads/guided_anchor_head.py | https://arxiv.org/abs/1901.03278 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/htc_roi_head.py | NasFPN/mmdet/models/roi_heads/htc_roi_head.py | https://arxiv.org/abs/1901.07518 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/task_modules/samplers/sampling_result.py | NasFPN/mmdet/core/bbox/demodata.py | https://gitlab.kitware.com/computer-vision/kwimage/blob/master/kwimage/structs/boxes.py#L1390 | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/configs/sabl/README.md | NasFPN/mmdet/models/dense_heads/sabl_retina_head.py | https://arxiv.org/abs/1912.04260 | 论文地址 | -| 开发引入 | / | NasFPN/configs/faster_rcnn/faster_rcnn_r50_caffe_fpn_mstrain_1x_coco-person-bicycle-car.py | http://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_caffe_fpn_mstrain_3x_coco/faster_rcnn_r50_caffe_fpn_mstrain_3x_coco_bbox_mAP-0.398_20200504_163323-30042637.pth | 预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/detr.py | NasFPN/mmdet/models/utils/transformer.py | https://arxiv.org/pdf/2005.12872 | 论文地址 | -| 开发引入 | / | NasFPN/docs/stat.py | https://github.com/open-mmlab/mmdetection/blob/master/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/scnet_roi_head.py | NasFPN/mmdet/models/roi_heads/mask_heads/global_context_head.py | https://arxiv.org/abs/2012.10150 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/dense_heads/centripetal_head.py | NasFPN/mmdet/models/dense_heads/centripetal_head.py | https://arxiv.org/abs/2003.09119 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/reppoints_detector.py | NasFPN/mmdet/models/detectors/reppoints_detector.py | https://arxiv.org/pdf/1904.11490 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/losses/ghm_loss.py | NasFPN/mmdet/models/losses/ghm_loss.py | https://arxiv.org/abs/1811.05181 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/task_modules/samplers/score_hlr_sampler.py | NasFPN/mmdet/models/roi_heads/pisa_roi_head.py | https://arxiv.org/abs/1904.04821 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/losses/focal_loss.py | NasFPN/mmdet/models/detectors/retinanet.py | 
https://arxiv.org/abs/1708.02002 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/point_rend_roi_head.py | NasFPN/mmdet/models/roi_heads/point_rend_roi_head.py | https://arxiv.org/abs/1912.08193 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/configs/sabl/README.md | NasFPN/mmdet/models/roi_heads/bbox_heads/sabl_head.py | https://arxiv.org/abs/1912.04260 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/necks/nas_fpn.py | NasFPN/mmdet/models/necks/nas_fpn.py | https://arxiv.org/abs/1904.07392 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/configs/libra_rcnn/README.md | NasFPN/mmdet/models/necks/bfp.py | https://arxiv.org/abs/1904.02701 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/losses/iou_loss.py | NasFPN/mmdet/models/losses/iou_loss.py | https://arxiv.org/abs/1911.08287 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/losses/gfocal_loss.py | NasFPN/mmdet/models/dense_heads/gfl_head.py | https://arxiv.org/abs/2006.04388 | 论文地址 | -| 开发引入 | / | NasFPN/mmdet/models/losses/ae_loss.py | https://arxiv.org/abs/1611.05424 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/datasets/wider_face.py | NasFPN/mmdet/datasets/wider_face.py | https://github.com/sovrasov/wider-face-pascal-voc-annotations | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/losses/gfocal_loss.py | NasFPN/mmdet/models/losses/gfocal_loss.py | https://arxiv.org/abs/2006.04388 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/point_rend_roi_head.py | NasFPN/mmdet/models/detectors/point_rend.py | https://arxiv.org/abs/1912.08193 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/task_modules/coders/tblr_bbox_coder.py | NasFPN/mmdet/core/bbox/coder/tblr_bbox_coder.py | https://arxiv.org/abs/1903.00621 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/dense_heads/free_anchor_retina_head.py | NasFPN/mmdet/models/dense_heads/free_anchor_retina_head.py | https://arxiv.org/abs/1909.02466 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/datasets/transforms/transforms.py | NasFPN/mmdet/datasets/pipelines/transforms.py | https://arxiv.org/abs/1708.04552 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/layers/bbox_nms.py | NasFPN/mmdet/models/dense_heads/yolact_head.py | https://arxiv.org/abs/1904.02689 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/fcos.py | NasFPN/mmdet/models/detectors/fcos.py | https://arxiv.org/abs/1904.01355 | 论文地址 | -| 开发引入 | / | NasFPN/mmdet/datasets/cityscapes.py | https://github.com/facebookresearch/detectron2/blob/master/detectron2/data/datasets/cityscapes.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/dynamic_roi_head.py | NasFPN/mmdet/models/roi_heads/dynamic_roi_head.py | https://arxiv.org/abs/2004.06002 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/projects/Detic_new/detic/heatmap_focal_loss.py | NasFPN/mmdet/models/dense_heads/corner_head.py | https://arxiv.org/abs/1808.01244 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/task_modules/samplers/ohem_sampler.py | NasFPN/mmdet/core/bbox/samplers/ohem_sampler.py | https://arxiv.org/abs/1604.03540 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/scnet_roi_head.py | 
NasFPN/mmdet/models/detectors/scnet.py | https://arxiv.org/abs/2012.10150 | 论文地址 | -| 开发引入 | / | NasFPN/mmdet/core/bbox/coder/yolo_bbox_coder.py | https://arxiv.org/abs/1506.02640 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/task_modules/assigners/atss_assigner.py | NasFPN/mmdet/core/bbox/assigners/atss_assigner.py | https://github.com/sfzhang15/ATSS/blob/master/atss_core/modeling/rpn/atss/loss.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/projects/SparseInst/sparseinst/sparseinst.py | NasFPN/mmdet/models/detectors/atss.py | https://arxiv.org/abs/1912.02424 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/necks/rfp.py | NasFPN/mmdet/models/necks/rfp.py | https://arxiv.org/pdf/2006.02334.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/htc_roi_head.py | NasFPN/mmdet/models/detectors/htc.py | https://arxiv.org/abs/1901.07518 | 论文地址 | -| 开发引入 | / | NasFPN/mmdet/datasets/builder.py | https://github.com/pytorch/pytorch/issues/973 | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/roi_extractors/single_level_roi_extractor.py | NasFPN/mmdet/models/roi_heads/roi_extractors/single_level_roi_extractor.py | https://arxiv.org/abs/1612.03144 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/dense_heads/cascade_rpn_head.py | NasFPN/mmdet/models/dense_heads/cascade_rpn_head.py | https://arxiv.org/abs/1909.06720 | 论文地址 | -| 开发引入 | / | NasFPN/docker/Dockerfile | https://openmmlab.oss-accelerate.aliyuncs.com/mmcv/dist/index.html | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/sparse_roi_head.py | NasFPN/mmdet/models/detectors/sparse_rcnn.py | https://arxiv.org/abs/2011.12450 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/losses/iou_loss.py | NasFPN/mmdet/models/losses/iou_loss.py | https://arxiv.org/abs/2005.03572 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/docs/zh_cn/conf.py | NasFPN/docs/conf.py | https://www.sphinx-doc.org/en/master/usage/configuration.html | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/scnet_roi_head.py | NasFPN/mmdet/models/utils/res_layer.py | https://arxiv.org/abs/2012.10150 | 论文地址 | -| 开发引入 | / | NasFPN/mmdet/models/dense_heads/fcos_head.py | https://github.com/tianzhi0549/FCOS | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/fovea.py | NasFPN/mmdet/models/dense_heads/fovea_head.py | https://arxiv.org/abs/1904.03797 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/scnet_roi_head.py | NasFPN/mmdet/models/roi_heads/mask_heads/scnet_mask_head.py | https://arxiv.org/abs/2012.10150 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/mask_heads/grid_head.py | NasFPN/mmdet/models/detectors/grid_rcnn.py | https://arxiv.org/abs/1811.12030 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/sparse_roi_head.py | NasFPN/mmdet/models/roi_heads/bbox_heads/dii_head.py | https://arxiv.org/abs/2011.12450 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/losses/varifocal_loss.py | NasFPN/mmdet/models/dense_heads/vfnet_head.py | https://arxiv.org/abs/2008.13367 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/roi_extractors/single_level_roi_extractor.py | NasFPN/mmdet/models/necks/fpn.py | 
https://arxiv.org/abs/1612.03144 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/layers/bbox_nms.py | NasFPN/mmdet/models/detectors/yolact.py | https://arxiv.org/abs/1904.02689 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/layers/bbox_nms.py | NasFPN/mmdet/core/post_processing/bbox_nms.py | https://arxiv.org/abs/1904.02689 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/projects/RF100-Benchmark/coco_metric.py | NasFPN/mmdet/models/roi_heads/mask_heads/fcn_mask_head.py | https://github.com/facebookresearch/detectron2/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/datasets/transforms/augment_wrappers.py | NasFPN/mmdet/datasets/pipelines/auto_augment.py | https://arxiv.org/pdf/1906.11172 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/necks/hrfpn.py | NasFPN/mmdet/models/backbones/hrnet.py | https://arxiv.org/abs/1904.04514 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/datasets/transforms/transforms.py | NasFPN/mmdet/datasets/pipelines/transforms.py | https://albumentations.readthedocs.io | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/faster_rcnn.py | NasFPN/mmdet/models/detectors/faster_rcnn.py | https://arxiv.org/abs/1506.01497 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/necks/pafpn.py | NasFPN/mmdet/models/necks/pafpn.py | https://arxiv.org/abs/1803.01534 | 论文地址 | -| 开发引入 | / | NasFPN/mmdet/models/roi_heads/point_rend_roi_head.py | https://github.com/facebookresearch/detectron2/tree/master/projects/PointRend | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/detr.py | NasFPN/mmdet/models/utils/positional_encoding.py | https://arxiv.org/pdf/2005.12872 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/cascade_roi_head.py | NasFPN/mmdet/models/roi_heads/cascade_roi_head.py | https://arxiv.org/abs/1712.00726 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/configs/sabl/README.md | NasFPN/mmdet/core/bbox/coder/bucketing_bbox_coder.py | https://arxiv.org/abs/1912.04260 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/dense_heads/paa_head.py | NasFPN/mmdet/models/dense_heads/paa_head.py | https://github.com/kkhoot/PAA/blob/master/paa_core | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/scnet_roi_head.py | NasFPN/mmdet/models/roi_heads/mask_heads/scnet_semantic_head.py | https://arxiv.org/abs/2012.10150 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/double_roi_head.py | NasFPN/mmdet/models/roi_heads/double_roi_head.py | https://arxiv.org/abs/1904.06493 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/scnet_roi_head.py | NasFPN/mmdet/models/roi_heads/mask_heads/feature_relay_head.py | https://arxiv.org/abs/2012.10150 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/projects/Detic_new/detic/iou_loss.py | NasFPN/mmdet/models/losses/iou_loss.py | https://arxiv.org/abs/1902.09630 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/trident_faster_rcnn.py | NasFPN/mmdet/models/detectors/trident_faster_rcnn.py | https://arxiv.org/abs/1901.01892 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/docs/zh_cn/get_started.md | NasFPN/docker/Dockerfile | https://github.com/open-mmlab/mmdetection.git | 源码实现 | -| 
开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/dense_heads/corner_head.py | NasFPN/mmdet/models/dense_heads/corner_head.py | https://github.com/princeton-vl/CornerNet/blob/master/models/py_utils/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/mask_heads/grid_head.py | NasFPN/mmdet/models/roi_heads/grid_roi_head.py | https://arxiv.org/abs/1811.12030 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/setup.py | NasFPN/setup.py | http://setuptools.readthedocs.io/en/latest/setuptools.html#declaring-platform-specific-dependencies | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/task_modules/coders/tblr_bbox_coder.py | NasFPN/mmdet/models/detectors/fsaf.py | https://arxiv.org/abs/1903.00621 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/mask_heads/grid_head.py | NasFPN/mmdet/models/detectors/grid_rcnn.py | https://arxiv.org/abs/1906.05688 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/datasets/lvis.py | NasFPN/mmdet/datasets/lvis.py | http://images.cocodataset.org/ | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/losses/iou_loss.py | NasFPN/mmdet/models/losses/iou_loss.py | https://arxiv.org/abs/1711.00164 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/structures/mask/structures.py | NasFPN/mmdet/core/mask/structures.py | https://stackoverflow.com/questions/24467972/calculate-area-of-polygon-given-x-y-coordinates | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/fast_rcnn.py | NasFPN/mmdet/models/detectors/fast_rcnn.py | https://arxiv.org/abs/1504.08083 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/datasets/transforms/instaboost.py | NasFPN/mmdet/datasets/pipelines/instaboost.py | https://arxiv.org/abs/1908.07801 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/backbones/hourglass.py | NasFPN/mmdet/models/backbones/hourglass.py | https://arxiv.org/abs/1603.06937 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/datasets/transforms/transforms.py | NasFPN/mmdet/datasets/pipelines/transforms.py | https://github.com/bethgelab/imagecorruptions | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/structures/mask/structures.py | NasFPN/mmdet/core/mask/structures.py | https://github.com/facebookresearch/detectron2/blob/ffff8acc35ea88ad1cb1806ab0f00b4c1c5dbfd9/detectron2/structures/masks.py#L387 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/fovea.py | NasFPN/mmdet/models/detectors/fovea.py | https://arxiv.org/abs/1904.03797 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/configs/instaboost/README.md | NasFPN/mmdet/datasets/pipelines/instaboost.py | https://github.com/GothicAi/Instaboost | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/nasfcos.py | NasFPN/mmdet/models/detectors/nasfcos.py | https://arxiv.org/abs/1906.0442 | 论文地址 | -| 开发引入 | / | NasFPN/mmdet/models/losses/ae_loss.py | https://github.com/princeton-vl/CornerNet/blob/master/models/py_utils/kp_utils.py#L180 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/cascade_rcnn.py | NasFPN/mmdet/models/detectors/cascade_rcnn.py | https://arxiv.org/abs/1906.09756 | 论文地址 | -| 开发引入 | / | NasFPN/mmcv_need/optimizer.py | https://arxiv.org/abs/1710.03740 | 论文地址 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmdetection/mmdet/models/losses/focal_loss.py | NasFPN/mmdet/models/losses/focal_loss.py | https://arxiv.org/abs/1708.02002 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/mask_scoring_roi_head.py | NasFPN/mmdet/models/roi_heads/mask_scoring_roi_head.py | https://arxiv.org/abs/1903.00241 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/backbones/resnet.py | NasFPN/mmdet/models/backbones/resnet.py | https://arxiv.org/pdf/1812.01187.pdf | 论文地址 | -| 开发引入 | / | NasFPN/mmdet/datasets/cityscapes.py | https://github.com/mcordts/cityscapesScripts/blob/master/cityscapesscripts/evaluation/evalInstanceLevelSemanticLabeling.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/scnet_roi_head.py | NasFPN/mmdet/models/roi_heads/scnet_roi_head.py | https://arxiv.org/abs/2012.10150 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/task_modules/coders/tblr_bbox_coder.py | NasFPN/mmdet/models/dense_heads/fsaf_head.py | https://arxiv.org/abs/1903.00621 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/setup.py | NasFPN/setup.py | openmmlab@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/task_modules/samplers/score_hlr_sampler.py | NasFPN/mmdet/core/bbox/samplers/score_hlr_sampler.py | https://arxiv.org/abs/1904.04821 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/projects/RF100-Benchmark/coco_metric.py | NasFPN/mmdet/datasets/coco.py | https://github.com/facebookresearch/detectron2/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/datasets/lvis.py | NasFPN/mmdet/datasets/lvis.py | http://images.cocodataset.org/train2017/000000391895.jpg | 图片地址 | -| 开发引入 | / | NasFPN/mmdet/datasets/dataset_wrappers.py | https://github.com/facebookresearch/detectron2/blob/41d475b75a230221e21d9cac5d69655e3415e3a4/detectron2/data/samplers/distributed_sampler.py#L57 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/utils/gaussian_target.py | NasFPN/mmdet/models/utils/gaussian_target.py | https://github.com/princeton-vl/CornerNet-Lite/blob/master/core/sample/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/losses/iou_loss.py | NasFPN/mmdet/models/losses/iou_loss.py | https://github.com/Zzh-tju/DIoU | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/roi_extractors/generic_roi_extractor.py | NasFPN/mmdet/models/roi_heads/roi_extractors/generic_roi_extractor.py | https://arxiv.org/abs/2004.13665 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/losses/balanced_l1_loss.py | NasFPN/mmdet/core/bbox/samplers/iou_balanced_neg_sampler.py | https://arxiv.org/pdf/1904.02701.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/docs/zh_cn/make.bat | NasFPN/docs/make.bat | http://sphinx-doc.org/ | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/necks/nasfcos_fpn.py | NasFPN/mmdet/models/necks/nasfcos_fpn.py | https://arxiv.org/abs/1906.04423 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/projects/EfficientDet/efficientdet/tensorflow/yxyx_bbox_coder.py | NasFPN/mmdet/core/bbox/coder/legacy_delta_xywh_bbox_coder.py | https://arxiv.org/abs/1311.2524 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/projects/Detic_new/detic/heatmap_focal_loss.py | NasFPN/mmdet/models/losses/ae_loss.py | https://arxiv.org/abs/1808.01244 | 论文地址 | 
-| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/dense_heads/retina_head.py | NasFPN/mmdet/models/dense_heads/retina_head.py | https://arxiv.org/pdf/1708.02002.pdf | 论文地址 | -| 开发引入 | / | NasFPN/tests/async_benchmark.py | http://download.openmmlab.com/mmdetection/v2.0 | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/backbones/regnet.py | NasFPN/mmdet/models/backbones/regnet.py | https://arxiv.org/abs/2003.13678 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/docs/en/notes/changelog_v2.x.md | NasFPN/mmdet/models/necks/fpn_carafe.py | https://arxiv.org/abs/1905.02188 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/losses/focal_loss.py | NasFPN/mmdet/models/losses/gaussian_focal_loss.py | https://arxiv.org/abs/1708.02002 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/paa.py | NasFPN/mmdet/models/detectors/paa.py | https://arxiv.org/pdf/2007.08103.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/mask_rcnn.py | NasFPN/mmdet/models/detectors/mask_rcnn.py | https://arxiv.org/abs/1703.06870 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/dense_heads/ssd_head.py | NasFPN/mmdet/models/dense_heads/ssd_head.py | https://arxiv.org/abs/1512.02325 | 论文地址 | -| 开发引入 | / | NasFPN/mmdet/models/dense_heads/paa_head.py | https://github.com/kkhoot/PAA/issues/8 | 相关说明 | -| 开发引入 | / | NasFPN/mmdet/core/bbox/demodata.py | https://gitlab.kitware.com/computer-vision/kwarray/blob/master/kwarray/util_random.py#L270 | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/mask_heads/grid_head.py | NasFPN/mmdet/models/roi_heads/mask_heads/grid_head.py | https://arxiv.org/abs/1906.05688 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/scnet_roi_head.py | NasFPN/mmdet/models/roi_heads/bbox_heads/scnet_bbox_head.py | https://arxiv.org/abs/2012.10150 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/fcos.py | NasFPN/mmdet/models/dense_heads/fcos_head.py | https://arxiv.org/abs/1904.01355 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/necks/nasfcos_fpn.py | NasFPN/mmdet/models/dense_heads/nasfcos_head.py | https://arxiv.org/abs/1906.04423 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/detectors/detr.py | NasFPN/mmdet/models/detectors/detr.py | https://arxiv.org/pdf/2005.12872 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/losses/balanced_l1_loss.py | NasFPN/mmdet/models/losses/balanced_l1_loss.py | https://arxiv.org/pdf/1904.02701.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/necks/rfp.py | NasFPN/mmdet/models/backbones/detectors_resnet.py | https://arxiv.org/pdf/2006.02334.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/losses/varifocal_loss.py | NasFPN/mmdet/models/losses/varifocal_loss.py | https://arxiv.org/abs/2008.13367 | 论文地址 | -| 开发引入 | / | NasFPN/mmdet/models/roi_heads/mask_heads/mask_point_head.py | https://github.com/facebookresearch/detectron2/tree/master/projects/PointRend/point_head/point_head.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/docs/en/notes/changelog_v2.x.md | NasFPN/mmdet/datasets/dataset_wrappers.py | https://arxiv.org/abs/1908.03195 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/setup.py | NasFPN/setup.py | 
https://github.com/open-mmlab/mmdetection | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/utils/util_mixins.py | NasFPN/mmdet/utils/util_mixins.py | https://github.com/Erotemic/ubelt | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/projects/Detic_new/detic/heatmap_focal_loss.py | NasFPN/mmdet/models/detectors/cornernet.py | https://arxiv.org/abs/1808.01244 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/dense_heads/nasfcos_head.py | NasFPN/mmdet/models/dense_heads/fcos_head.py | https://github.com/tianzhi0549/FCOS/issues/89#issuecomment-516877042 | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/projects/EfficientDet/efficientdet/tensorflow/yxyx_bbox_coder.py | NasFPN/mmdet/core/bbox/coder/delta_xywh_bbox_coder.py | https://arxiv.org/abs/1311.2524 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/configs/faster_rcnn/metafile.yml | NasFPN/configs/cityscapes/faster_rcnn_r50_fpn_1x_cityscapes.py | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_fpn_1x_coco/faster_rcnn_r50_fpn_1x_coco_20200130-047c8118.pth | 预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/dense_heads/paa_head.py | NasFPN/mmdet/models/dense_heads/paa_head.py | https://arxiv.org/abs/2007.08103 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/layers/transformer/utils.py | NasFPN/mmdet/models/utils/transformer.py | https://github.com/PeizeSun/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/losses/iou_loss.py | NasFPN/mmdet/models/losses/iou_loss.py | https://github.com/Zzh-tju/CIoU | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/projects/SparseInst/sparseinst/sparseinst.py | NasFPN/mmdet/models/dense_heads/atss_head.py | https://arxiv.org/abs/1912.02424 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/sparse_roi_head.py | NasFPN/mmdet/models/dense_heads/embedding_rpn_head.py | https://arxiv.org/abs/2011.12450 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/dense_heads/paa_head.py | NasFPN/mmdet/models/dense_heads/paa_head.py | https://github.com/kkhoot/PAA/issues/9 | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/losses/varifocal_loss.py | NasFPN/mmdet/models/detectors/vfnet.py | https://arxiv.org/abs/2008.13367 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/mmdet/models/roi_heads/sparse_roi_head.py | NasFPN/mmdet/models/roi_heads/sparse_roi_head.py | https://arxiv.org/abs/2011.12450 | 论文地址 | +| 文件位置 | 公网地址 | 公网地址用途 | +|------------------------------------------------------------------------------------------------------------------------------------------|--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/NasFPN/configs/cityscapes/faster_rcnn_r50_fpn_1x_cityscapes.py | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_fpn_1x_coco/faster_rcnn_r50_fpn_1x_coco_20200130-047c8118.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/NasFPN/configs/cityscapes/mask_rcnn_r50_fpn_1x_cityscapes.py | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r50_fpn_1x_coco/mask_rcnn_r50_fpn_1x_coco_20200205-d4b0c5d6.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/detection/NasFPN/configs/faster_rcnn/faster_rcnn_r50_caffe_fpn_mstrain_1x_coco-person.py | http://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_caffe_fpn_mstrain_3x_coco/faster_rcnn_r50_caffe_fpn_mstrain_3x_coco_bbox_mAP-0.398_20200504_163323-30042637.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/NasFPN/configs/faster_rcnn/faster_rcnn_r50_caffe_fpn_mstrain_1x_coco-person-bicycle-car.py | http://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_caffe_fpn_mstrain_3x_coco/faster_rcnn_r50_caffe_fpn_mstrain_3x_coco_bbox_mAP-0.398_20200504_163323-30042637.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/NasFPN/docker/Dockerfile | https://openmmlab.oss-accelerate.aliyuncs.com/mmcv/dist/index.html | mmcv地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/NasFPN/mmdet/datasets/lvis.py | http://images.cocodataset.org/ | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/NasFPN/setup.py | openmmlab@gmail.com | 作者邮箱 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/detection/RCF/public_address_statement.md b/PyTorch/contrib/cv/detection/RCF/public_address_statement.md index 0029258e21d12528219138b255d14d4c876bd1fc..ed76059fcb6cb7a45af99775e3f078f9de950724 100644 --- a/PyTorch/contrib/cv/detection/RCF/public_address_statement.md +++ b/PyTorch/contrib/cv/detection/RCF/public_address_statement.md @@ -1,35 +1,7 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------|------------------------| ------------------------------------ |--| -| 开源代码引入 | https://github.com/mayorx/rcf-edge-detection/models.py | RCF/models.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/mayorx/rcf-edge-detection/models.py | RCF/models.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/mayorx/rcf-edge-detection/models.py | RCF/models.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/mayorx/rcf-edge-detection/models.py | RCF/models.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/mayorx/rcf-edge-detection/models.py | RCF/models.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/mayorx/rcf-edge-detection/models.py | RCF/models.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 预训练模型 | -| 开发引入 | / | RCF/implOrg/edges_eval_dir.py | https://github.com/pdollar/edges/blob/master/edgesEvalDir.m | 源码实现 | -| 开发引入 | / | RCF/cxx/src/Exception.hh | http://www.gnu.org/copyleft/gpl.html | 相关说明 | -| 开发引入 | / | RCF/cxx/src/Random.hh | dmartin@eecs.berkeley.edu | 邮箱地址 | -| 开源代码引入 | https://github.com/mayorx/rcf-edge-detection/models.py | RCF/models.py | https://arxiv.org/abs/1706.02677 | 论文地址 | -| 开发引入 | / | RCF/cxx/src/String.cc | http://www.gnu.org/copyleft/gpl.html | 相关说明 | -| 开源代码引入 | https://github.com/mayorx/rcf-edge-detection/models.py | RCF/models.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 预训练模型 | -| 开发引入 | / | RCF/cxx/src/Exception.hh | dmartin@eecs.berkeley.edu | 邮箱地址 | -| 开发引入 | / | RCF/cxx/src/Random.hh | http://www.gnu.org/copyleft/gpl.html | 相关说明 | -| 开源代码引入 | https://github.com/mayorx/rcf-edge-detection/models.py | RCF/models.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 预训练模型 | -| 开发引入 | / | RCF/implOrg/edges_eval_dir.py | 
https://github.com/pdollar/edges/blob/master/edgesEvalImg.m | 源码实现 | -| 开源代码引入 | https://github.com/mayorx/rcf-edge-detection/models.py | RCF/models.py | https://github.com/pytorch/vision/tree/master/torchvision | 源码实现 | -| 开发引入 | / | RCF/implOrg/bwmorph_thin.py | https://gist.github.com/joefutrelle/562f25bbcf20691217b8 | 相关说明 | -| 开发引入 | / | RCF/cxx/src/Random.cc | dmartin@eecs.berkeley.edu | 邮箱地址 | -| 开发引入 | / | RCF/implOrg/correspond_pixels.py | https://github.com/davidstutz/extended-berkeley-segmentation-benchmark | 源码实现 | -| 开发引入 | / | RCF/cxx/src/String.hh | dmartin@eecs.berkeley.edu | 邮箱地址 | -| 开发引入 | / | RCF/implOrg/toolbox.py | https://github.com/pdollar/toolbox/blob/master/channels/gradient2.m | 源码实现 | -| 开发引入 | / | RCF/cxx/src/String.cc | dmartin@eecs.berkeley.edu | 邮箱地址 | -| 开发引入 | / | RCF/implOrg/correspond_pixels.py | https://github.com/davidstutz/extended-berkeley-segmentation-benchmark/blob/master/source/match.cc | 源码实现 | -| 开发引入 | / | RCF/implOrg/edges_eval_plot.py | https://github.com/pdollar/edges/blob/master/edgesEvalPlot.m | 源码实现 | -| 开源代码引入 | https://github.com/mayorx/rcf-edge-detection/models.py | RCF/models.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 预训练模型 | -| 开发引入 | / | RCF/implOrg/toolbox.py | https://github.com/pdollar/toolbox/blob/master/channels/convTri.m | 源码实现 | -| 开发引入 | / | RCF/cxx/src/Exception.cc | http://www.gnu.org/copyleft/gpl.html | 相关说明 | -| 开源代码引入 | https://github.com/mayorx/rcf-edge-detection/models.py | RCF/models.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 预训练模型 | -| 开发引入 | / | RCF/cxx/src/Exception.cc | dmartin@eecs.berkeley.edu | 邮箱地址 | -| 开发引入 | / | RCF/cxx/src/String.hh | http://www.gnu.org/copyleft/gpl.html | 相关说明 | -| 开发引入 | / | RCF/cxx/src/Random.cc | http://www.gnu.org/copyleft/gpl.html | 相关说明 | -| 开源代码引入 | https://github.com/mayorx/rcf-edge-detection/README.md | RCF/data_loader.py | https://github.com/meteorshowers/RCF-pytorch | 源码实现 | +| 文件位置 | 公网地址 | 公网地址用途 | +|-------------------------------------------------------------|------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/RCF/models.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/RCF/models.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/RCF/models.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/RCF/models.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/RCF/models.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/detection/RefineDet/public_address_statement.md b/PyTorch/contrib/cv/detection/RefineDet/public_address_statement.md index 5eed193e56a469e9215d6b7044355ec94f847a1c..c03654a0c32264978f7f7dcafd6677f283d02be4 100644 --- a/PyTorch/contrib/cv/detection/RefineDet/public_address_statement.md +++ b/PyTorch/contrib/cv/detection/RefineDet/public_address_statement.md @@ -1,23 +1,10 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------|------------------------| ------------------------------------ |--| -| 开源代码引入 | https://github.com/pytorch/vision | RefineDet/data/scripts/COCO2014.sh | http://images.cocodataset.org/zips/train2014.zip | 下载数据集 | -| 
开源代码引入 | https://github.com/pytorch/vision | RefineDet/data/scripts/COCO2014.sh | http://images.cocodataset.org/zips/val2014.zip | 下载数据集 | -| 开源代码引入 | https://github.com/pytorch/vision | RefineDet/data/scripts/COCO2014.sh | http://images.cocodataset.org/annotations/annotations_trainval2014.zip | 下载依赖 | -| 开源代码引入 | https://github.com/pytorch/vision | RefineDet/data/scripts/COCO2014.sh | https://s3.amazonaws.com/amdegroot-datasets/instances_trainval35k.json.zip | 下载依赖 | -| 开源代码引入 | https://github.com/pytorch/vision | RefineDet/data/scripts/VOC2007.sh | http://host.robots.ox.ac.uk/pascal/VOC/voc2007/VOCtrainval_06-Nov-2007.tar | 下载数据集 | -| 开源代码引入 | https://github.com/pytorch/vision | RefineDet/data/scripts/VOC2007.sh | http://host.robots.ox.ac.uk/pascal/VOC/voc2007/VOCtest_06-Nov-2007.tar | 下载数据集 | -| 开源代码引入 | https://github.com/pytorch/vision | RefineDet/data/scripts/VOC2012.sh | http://host.robots.ox.ac.uk/pascal/VOC/voc2012/VOCtrainval_11-May-2012.tar | 下载数据集 | -| 开发引入 | / | RefineDet/url.ini | https://download.pytorch.org/models/vgg16_bn-6c64b313.pth | 下载预训练权重 | -| 开发引入 | / | RefineDet/models/refinedet.py | https://github.com/pytorch/vision/blob/master/torchvision/models/vgg.py | 源码实现 | -| 开发引入 | / | RefineDet/data/coco.py | http://mscoco.org/dataset/#detections-challenge2016 | 数据集地址 | -| 开发引入 | / | RefineDet/data/voc0712.py | https://github.com/fmassa/vision/blob/voc_dataset/torchvision/datasets/voc.py | 源码实现 | -| 开发引入 | / | RefineDet/layers/box_utils.py | https://github.com/Hakuyume/chainer-ssd | 源码实现 | -| 开发引入 | / | RefineDet/models/refinedet.py | https://arxiv.org/pdf/1512.02325.pdf | 论文地址 | -| 开发引入 | / | RefineDet/layers/modules/refinedet_multibox_loss.py | https://arxiv.org/pdf/1512.02325.pdf | 论文地址 | -| 开发引入 | / | RefineDet/layers/functions/detection_refinedet.py | https://github.com/open-mmlab/mmdetection/blob/master/mmdet/core/post_processing/bbox_nms.py#L7 | 源码实现 | -| 开发引入 | / | RefineDet/eval_refinedet.py | https://github.com/longcw/faster_rcnn_pytorch | 源码实现 | -| 开发引入 | / | RefineDet/layers/modules/multibox_loss.py | https://arxiv.org/pdf/1512.02325.pdf | 论文地址 | -| 开发引入 | / | RefineDet/layers/box_utils.py | https://github.com/fmassa/object-detection.torch | 源码实现 | -| 开发引入 | / | RefineDet/train_1p.py | https://github.com/pytorch/examples/blob/master/imagenet/main.py | 源码实现 | -| 开发引入 | / | RefineDet/eval_refinedet.py | https://github.com/rbgirshick/py-faster-rcnn | 源码实现 | -| 开发引入 | / | RefineDet/train_8p.py | https://github.com/pytorch/examples/blob/master/imagenet/main.py | 源码实现 | +| 文件位置 | 公网地址 | 公网地址用途 | +|----------------------------------------------------------------------------------|----------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/RefineDet/data/scripts/COCO2014.sh | http://images.cocodataset.org/zips/train2014.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/RefineDet/data/scripts/COCO2014.sh | http://images.cocodataset.org/zips/val2014.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/RefineDet/data/scripts/COCO2014.sh | http://images.cocodataset.org/annotations/annotations_trainval2014.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/RefineDet/data/scripts/COCO2014.sh | https://s3.amazonaws.com/amdegroot-datasets/instances_trainval35k.json.zip | 下载依赖 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/RefineDet/data/scripts/VOC2007.sh | http://host.robots.ox.ac.uk/pascal/VOC/voc2007/VOCtrainval_06-Nov-2007.tar | 数据集地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/detection/RefineDet/data/scripts/VOC2007.sh | http://host.robots.ox.ac.uk/pascal/VOC/voc2007/VOCtest_06-Nov-2007.tar | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/RefineDet/data/scripts/VOC2012.sh | http://host.robots.ox.ac.uk/pascal/VOC/voc2012/VOCtrainval_11-May-2012.tar | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/RefineDet/url.ini | https://download.pytorch.org/models/vgg16_bn-6c64b313.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/detection/RetinaNet/public_address_statement.md b/PyTorch/contrib/cv/detection/RetinaNet/public_address_statement.md index b0503bd0a762e1de197415a6666e493065cb24ae..66ecac71155e34088cc7fa47030a429848907c71 100644 --- a/PyTorch/contrib/cv/detection/RetinaNet/public_address_statement.md +++ b/PyTorch/contrib/cv/detection/RetinaNet/public_address_statement.md @@ -1,111 +1,14 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------|------------------------| ------------------------------------ |---------| -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/modeling/meta_arch/retinanet.py | RetinaNet/datasets/prepare_for_tests.sh | https://dl.fbaipublicfiles.com/detectron2 | 下载依赖 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/modeling/meta_arch/retinanet.py | RetinaNet/datasets/prepare_panoptic_fpn.py | https://dl.fbaipublicfiles.com/detectron2/ | 下载依赖 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/modeling/meta_arch/retinanet.py | RetinaNet/detectron2/checkpoint/catalog.py | https://dl.fbaipublicfiles.com/detectron | 下载依赖 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/modeling/meta_arch/retinanet.py | RetinaNet/detectron2/checkpoint/catalog.py | https://dl.fbaipublicfiles.com/detectron2/ | 下载依赖 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/modeling/meta_arch/retinanet.py | RetinaNet/detectron2/model_zoo/model_zoo.py | https://dl.fbaipublicfiles.com/detectron2/ | 下载依赖 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/modeling/meta_arch/retinanet.py | RetinaNet/dev/packaging/build_wheel.sh | https://download.pytorch.org/whl/ | 下载依赖 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/modeling/meta_arch/retinanet.py | RetinaNet/dev/packaging/gen_install_table.py | https://dl.fbaipublicfiles.com/detectron2/wheels/ | 下载依赖 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/modeling/meta_arch/retinanet.py | RetinaNet/docker/Dockerfile | https://bootstrap.pypa.io/get-pip.py | 下载依赖 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/modeling/meta_arch/retinanet.py | RetinaNet/docker/Dockerfile | https://download.pytorch.org/whl/cu101/torch_stable.html | 下载依赖 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/modeling/meta_arch/retinanet.py | RetinaNet/docker/Dockerfile | https://github.com/facebookresearch/fvcore | 下载依赖 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/modeling/meta_arch/retinanet.py | RetinaNet/docker/Dockerfile | https://github.com/facebookresearch/detectron2 | 下载依赖 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/modeling/meta_arch/retinanet.py | RetinaNet/docker/Dockerfile-circleci | https://bootstrap.pypa.io/get-pip.py | 下载依赖 
| -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/modeling/meta_arch/retinanet.py | RetinaNet/docker/Dockerfile-circleci | https://download.pytorch.org/whl/cu101/torch_stable.html | 下载依赖 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/modeling/meta_arch/retinanet.py | RetinaNet/docs/conf.py | https://github.com/facebookresearch/detectron2/blob/master/ | 下载依赖 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/modeling/meta_arch/retinanet.py | RetinaNet/docs/conf.py | https://docs.python.org/3.6 | 下载依赖 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/modeling/meta_arch/retinanet.py | RetinaNet/docs/conf.py | https://docs.scipy.org/doc/numpy/ | 下载依赖 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/modeling/meta_arch/retinanet.py | RetinaNet/docs/conf.py | https://pytorch.org/docs/master/ | 下载依赖 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/modeling/meta_arch/retinanet.py | RetinaNet/docs/conf.py | https://arxiv.org/abs/ | 下载依赖 | -| 开发引入 | / | RefineDet/url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 下载图片 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/modeling/meta_arch/retinanet.py | RetinaNet/setup.py | https://github.com/facebookresearch/detectron2 | 下载依赖 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/modeling/meta_arch/retinanet.py | RetinaNet/setup.py | https://github.com/psf/black@673327449f86fce558adde153bb6cbe54bfebad2 | 下载依赖 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/modeling/meta_arch/retinanet.py | RetinaNet/tests/data/test_coco_evaluation.py | http://images.cocodataset.org/val2017/000000000285.jpg| 下载图片 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/modeling/meta_arch/retinanet.py | RetinaNet/tests/test_model_zoo.py |https://dl.fbaipublicfiles.com/detectron2/Misc/scratch_mask_rcnn_R_50_FPN_3x_gn/138602908/model_final_01ca85.pkl | 下载预训练权重 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/docs/notes/compatibility.md | RetinaNet/detectron2/modeling/anchor_generator.py | https://github.com/facebookresearch/Detectron/issues/227 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/docs/conf.py | RetinaNet/docs/conf.py | https://github.com/readthedocs/recommonmark/blob/ddd56e7717e9745f11300059e4268e204138a6b1/recommonmark/parser.py#L152-L155 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/tests/layers/test_roi_align.py | RetinaNet/tests/layers/test_roi_align.py | https://github.com/tensorflow/tensorflow/issues/26278 | 相关说明 | -| 开发引入 | / | RetinaNet/detectron2/structures/image_list.py | https://github.com/pytorch/pytorch/issues/31734 | 相关说明 | -| 开发引入 | / | RetinaNet/detectron2/engine/defaults.py | https://pytorch.org/docs/stable/distributed.html | 相关说明 | -| 开发引入 | / | RetinaNet/detectron2/engine/train_loop.py | https://arxiv.org/abs/2006.15704 | 论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/datasets/README.md | RetinaNet/detectron2/model_zoo/__init__.py | https://github.com/facebookresearch/detectron2/blob/master/MODEL_ZOO.md | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/data/datasets/coco.py | RetinaNet/detectron2/data/datasets/coco.py | 
http://cocodataset.org/#format-data | 数据集地址 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/layers/csrc/deformable/deform_conv_cuda.cu | RetinaNet/detectron2/layers/csrc/deformable/deform_conv_cuda.cu | https://github.com/open-mmlab/mmdetection/blob/master/mmdet/ops/dcn/src/deform_conv_cuda.cpp | 源码实现 | -| 开发引入 | / | RetinaNet/detectron2/evaluation/coco_evaluation.py | https://github.com/facebookresearch/Detectron/blob/a6a835f5b8208c45d0dce217ce9bbda915f44df7/detectron/datasets/json_dataset_evaluator.py#L222-L252 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/export/shared.py | RetinaNet/detectron2/export/shared.py | https://www.geeksforgeeks.org/find-paths-given-source-destination/ | 相关依赖 | -| 开发引入 | / | RetinaNet/detectron2/evaluation/coco_evaluation.py | https://github.com/facebookresearch/Detectron/blob/a6a835f5b8208c45d0dce217ce9bbda915f44df7/detectron/datasets/json_dataset_evaluator.py#L255 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/evaluation/pascal_voc_evaluation.py | RetinaNet/detectron2/evaluation/pascal_voc_evaluation.py | https://github.com/rbgirshick/py-faster-rcnn/blob/master/lib/datasets/voc_eval.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/layers/csrc/deformable/deform_conv_cuda_kernel.cu | RetinaNet/detectron2/layers/csrc/deformable/deform_conv_cuda_kernel.cu | https://github.com/chengdazhi/Deformable-Convolution-V2-PyTorch/blob/mmdetection/mmdet/ops/dcn/src/deform_conv_cuda_kernel.cu | 源码实现 | -| 开发引入 | / | RetinaNet/detectron2/evaluation/cityscapes_evaluation.py | https://github.com/mcordts/cityscapesScripts/blob/master/cityscapesscripts/evaluation/evalPixelLevelSemanticLabeling.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/utils/env.py | RetinaNet/detectron2/utils/env.py | https://stackoverflow.com/questions/67631/how-to-import-a-module-given-the-full-path | 相关依赖 | -| 开发引入 | / | RetinaNet/detectron2/structures/image_list.py | https://github.com/pytorch/pytorch/issues/39308 | 相关说明 | -| 开发引入 | / | RetinaNet/detectron2/data/datasets/cityscapes.py | https://github.com/mcordts/cityscapesScripts/blob/master/cityscapesscripts/preparation/json2instanceImg.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/layers/csrc/vision.cpp | RetinaNet/detectron2/layers/csrc/vision.cpp | https://github.com/pytorch/pytorch/blob/master/aten/src/ATen/cuda/detail/CUDAHooks.cpp#L231 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/engine/train_loop.py | RetinaNet/detectron2/engine/train_loop.py | http://engineering.hearsaysocial.com/2013/06/16/circular-references-in-python/ | 相关依赖 | -| 开发引入 | / | RetinaNet/detectron2/layers/aspp.py | https://github.com/tensorflow/models/blob/21b73d22f3ed05b650e85ac50849408dd36de32e/research/deeplab/model.py#L532 | 源码实现 | -| 开发引入 | / | RetinaNet/detectron2/data/datasets/cityscapes.py | https://github.com/mcordts/cityscapesScripts/blob/master/cityscapesscripts/evaluation/instances2dict.py | 源码实现 | -| 开发引入 | / | RetinaNet/tests/structures/test_boxes.py | https://github.com/pytorch/pytorch/pull/39336 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/evaluation/sem_seg_evaluation.py | RetinaNet/detectron2/evaluation/sem_seg_evaluation.py | http://cocodataset.org/#format-results | 数据集地址 | -| 开发引入 | / | 
RetinaNet/detectron2/modeling/proposal_generator/rpn.py | https://github.com/pytorch/pytorch/pull/41371 | 源码实现 | -| 开发引入 | / | RetinaNet/docker/Dockerfile | http://images.cocodataset.org/val2017/000000439715.jpg | 图片地址 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/data/transforms/transform.py | RetinaNet/detectron2/data/transforms/transform.py | https://github.com/opencv/opencv/issues/11784 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/data/transforms/transform.py | RetinaNet/detectron2/data/transforms/transform.py | https://pillow.readthedocs.io/en/latest/PIL.html#PIL.ImageTransform.ExtentTransform | 相关说明 | -| 开发引入 | / | RetinaNet/tests/modeling/test_matcher.py | https://github.com/pytorch/pytorch/pull/38378 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/engine/defaults.py | RetinaNet/detectron2/engine/defaults.py | https://github.com/sphinx-doc/sphinx/issues/4258 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/layers/batch_norm.py | RetinaNet/detectron2/layers/batch_norm.py | https://github.com/pytorch/pytorch/blob/master/torch/nn/modules/batchnorm.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/data/detection_utils.py | RetinaNet/detectron2/data/detection_utils.py | https://www.exiv2.org/tags.html | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/utils/serialize.py | RetinaNet/detectron2/utils/serialize.py | https://github.com/joblib/joblib/blob/master/joblib/externals/loky/cloudpickle_wrapper.py | 源码实现 | -| 开发引入 | / | RetinaNet/detectron2/export/torchscript.py | https://github.com/pytorch/pytorch/issues/41449 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/structures/masks.py | RetinaNet/detectron2/structures/masks.py | https://stackoverflow.com/questions/24467972/calculate-area-of-polygon-given-x-y-coordinates | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/data/transforms/transform.py | RetinaNet/detectron2/data/transforms/transform.py | https://pillow.readthedocs.io/en/stable/ | 相关依赖 | -| 开发引入 | / | RetinaNet/detectron2/export/torchscript.py | https://docs.python.org/3/library/importlib.html#importing-a-source-file-directly | 相关依赖 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/engine/launch.py | RetinaNet/detectron2/engine/launch.py | https://github.com/pytorch/pytorch/pull/14391 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/tests/modeling/test_matcher.py | RetinaNet/tests/modeling/test_matcher.py | https://github.com/pytorch/pytorch/issues/38964 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/docs/tutorials/datasets.md | RetinaNet/detectron2/config/defaults.py | http://cocodataset.org/#keypoints-eval | 数据集地址 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/layers/csrc/deformable/deform_conv_cuda.cu | RetinaNet/detectron2/layers/csrc/deformable/deform_conv_cuda.cu | https://github.com/chengdazhi/Deformable-Convolution-V2-PyTorch/blob/mmdetection/mmdet/ops/dcn/src/deform_conv_cuda.c | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/projects/DensePose/densepose/data/datasets/lvis.py | RetinaNet/detectron2/data/datasets/lvis.py | http://images.cocodataset.org/train2017/000000155379.jpg | 图片地址 | -| 开源代码引入 | 
https://github.com/facebookresearch/detectron2/blob/main/detectron2/layers/csrc/deformable/deform_conv_cuda_kernel.cu | RetinaNet/detectron2/layers/csrc/deformable/deform_conv_cuda_kernel.cu | https://arxiv.org/abs/1703.06211 | 论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/projects/DensePose/densepose/data/datasets/lvis.py | RetinaNet/detectron2/data/datasets/lvis.py | http://farm6.staticflickr.com/5454/9413846304_881d5e5c3b_z.jpg | 图片地址 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/projects/DensePose/densepose/data/datasets/lvis.py | RetinaNet/detectron2/data/datasets/coco.py | http://farm6.staticflickr.com/5454/9413846304_881d5e5c3b_z.jpg | 图片地址 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/evaluation/sem_seg_evaluation.py | RetinaNet/detectron2/evaluation/sem_seg_evaluation.py | http://cocodataset.org/#stuff-eval | 数据集地址 | -| 开发引入 | / | RetinaNet/detectron2/evaluation/coco_evaluation.py | http://cocodataset.org/#detection-eval | 数据集地址 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/data/datasets/coco.py | RetinaNet/detectron2/data/datasets/register_coco.py | http://cocodataset.org/#format-data | 数据集地址 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/config/defaults.py | RetinaNet/detectron2/config/defaults.py | https://arxiv.org/abs/1811.11168 | 论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/data/datasets/coco.py | RetinaNet/detectron2/data/datasets/coco.py | https://detectron2.readthedocs.io/tutorials/datasets.html#register-a-dataset | 数据集地址 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/data/datasets/builtin_meta.py | RetinaNet/detectron2/data/datasets/builtin_meta.py | https://github.com/cocodataset/panopticapi/blob/master/panoptic_coco_categories.json | 相关配置 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/structures/boxes.py | RetinaNet/detectron2/structures/boxes.py | https://github.com/kuangliu/torchcv/blob/master/torchcv/utils/box.py | 源码实现 | -| 开发引入 | / | RetinaNet/detectron2/modeling/proposal_generator/rrpn.py | https://github.com/pytorch/pytorch/issues/22812 | 相关说明 | -| 开发引入 | / | RetinaNet/detectron2/layers/nms.py | https://github.com/open-mmlab/mmdetection/blob/master/mmdet/core/post_processing/bbox_nms.py#L7 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/docs/conf.py | RetinaNet/docs/conf.py | http://www.sphinx-doc.org/en/master/config | 相关配置 | -| 开发引入 | / | RetinaNet/detectron2/evaluation/lvis_evaluation.py | https://github.com/facebookresearch/Detectron/blob/a6a835f5b8208c45d0dce217ce9bbda915f44df7/detectron/datasets/json_dataset_evaluator.py#L255 | 源码实现 | -| 开发引入 | / | RetinaNet/detectron2/evaluation/cityscapes_evaluation.py | https://github.com/mcordts/cityscapesScripts/blob/master/cityscapesscripts/evaluation/evalInstanceLevelSemanticLabeling.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/config/defaults.py | RetinaNet/detectron2/config/defaults.py | https://pillow.readthedocs.io/en/stable/handbook/concepts.html#concept-modes | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/engine/launch.py | RetinaNet/detectron2/engine/launch.py | https://github.com/facebookresearch/maskrcnn-benchmark/issues/172 | 相关说明 | -| 开发引入 | / | RetinaNet/detectron2/layers/wrappers.py | 
https://github.com/pytorch/pytorch/issues/34202 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/dev/packaging/build_wheel.sh | RetinaNet/dev/packaging/build_wheel.sh | https://github.com/NVIDIA/nvidia-docker/issues/854 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/modeling/poolers.py | RetinaNet/detectron2/modeling/poolers.py | https://github.com/pytorch/pytorch/issues/41412 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/layers/csrc/deformable/deform_conv_cuda_kernel.cu | RetinaNet/detectron2/layers/csrc/deformable/deform_conv_cuda_kernel.cu | https://github.com/open-mmlab/mmdetection/blob/master/mmdet/ops/dcn/src/deform_conv_cuda_kernel.cu | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/structures/boxes.py | RetinaNet/detectron2/structures/boxes.py | https://github.com/pytorch/pytorch/issues/18627 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/tests/modeling/test_matcher.py | RetinaNet/detectron2/export/torchscript.py | https://github.com/pytorch/pytorch/issues/38964 | 相关说明 | -| 开发引入 | / | RetinaNet/tests/data/test_coco_evaluation.py | http://images.cocodataset.org/val2017/000000000285.jpg","http://farm8.staticflickr.com/7434/9138147604_c6225224b8_z.jpg","http://images.cocodataset.org/val2017/000000000139.jpg","http://farm9.staticflickr.com/8035/8024364858_9c41dc1666_z.jpg | 图片地址 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/data/detection_utils.py | RetinaNet/detectron2/data/detection_utils.py | https://en.wikipedia.org/wiki/YUV#SDTV_with_BT.601 | 相关说明 | -| 开发引入 | / | RetinaNet/detectron2/modeling/meta_arch/retinanet.py.bakkk | https://github.com/open-mmlab/mmdetection/blob/master/mmdet/core/post_processing/bbox_nms.py#L7 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/data/detection_utils.py | RetinaNet/detectron2/data/detection_utils.py | https://github.com/wkentaro/labelme/blob/v4.5.4/labelme/utils/image.py#L59 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/modeling/proposal_generator/rrpn.py | RetinaNet/detectron2/modeling/proposal_generator/rrpn.py | https://github.com/facebookresearch/Detectron/issues/459 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/utils/logger.py | RetinaNet/detectron2/utils/logger.py | https://github.com/abseil/abseil-py/blob/master/absl/logging/__init__.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/data/transforms/augmentation_impl.py | RetinaNet/detectron2/data/transforms/augmentation_impl.py | https://pillow.readthedocs.io/en/3.0.x/reference/ImageEnhance.html | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/layers/batch_norm.py | RetinaNet/detectron2/layers/batch_norm.py | https://github.com/pytorch/pytorch/pull/36382 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/docs/tutorials/datasets.md | RetinaNet/detectron2/evaluation/coco_evaluation.py | http://cocodataset.org/#keypoints-eval | 数据集地址 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/datasets/prepare_cocofied_lvis.py | RetinaNet/datasets/prepare_cocofied_lvis.py | https://github.com/lvis-dataset/lvis-api/blob/master/data/coco_to_synset.json | 相关配置 | -| 开源代码引入 | 
https://github.com/facebookresearch/detectron2/blob/main/detectron2/utils/visualizer.py | RetinaNet/detectron2/utils/visualizer.py | https://github.com/matplotlib/matplotlib/issues/15363 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/data/detection_utils.py | RetinaNet/detectron2/data/detection_utils.py | https://github.com/python-pillow/Pillow/issues/3973 | 相关说明 | -| 开发引入 | / | RetinaNet/tools/convert-torchvision-to-d2.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/data/detection_utils.py | RetinaNet/detectron2/data/detection_utils.py | https://github.com/python-pillow/Pillow/blob/7.1.2/src/PIL/ImageOps.py#L527 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/layers/csrc/vision.cpp | RetinaNet/detectron2/layers/csrc/vision.cpp | https://github.com/pytorch/pytorch/blob/master/aten/src/ATen/Version.cpp | 源码实现 | -| 开发引入 | / | RetinaNet/detectron2/modeling/meta_arch/retinanet.py.bk | https://github.com/open-mmlab/mmdetection/blob/master/mmdet/core/post_processing/bbox_nms.py#L7 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/layers/wrappers.py | RetinaNet/detectron2/layers/wrappers.py | https://github.com/pytorch/pytorch/issues/12013 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/data/datasets/coco.py | RetinaNet/detectron2/data/datasets/coco.py | https://github.com/facebookresearch/detectron2/pull/175#issuecomment-551202163 | 源码实现 | -| 开发引入 | / | RetinaNet/detectron2/modeling/meta_arch/retinanet.py | https://github.com/open-mmlab/mmdetection/blob/master/mmdet/core/post_processing/bbox_nms.py#L7 | 源码实现 | -| 开发引入 | / | RetinaNet/detectron2/modeling/proposal_generator/rpn.py | https://github.com/pytorch/pytorch/issues/41449 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2/blob/main/detectron2/layers/wrappers.py | RetinaNet/detectron2/layers/wrappers.py | https://github.com/pytorch/pytorch/issues/38718 | 相关说明 | +| 文件位置 | 公网地址 | 公网地址用途 | +|--------------------------------------------------------------------------------------------------|---------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/RetinaNet/datasets/prepare_for_tests.sh | https://dl.fbaipublicfiles.com/detectron2 | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/RetinaNet/datasets/prepare_panoptic_fpn.py | https://dl.fbaipublicfiles.com/detectron2 | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/RetinaNet/detectron2/evaluation/coco_evaluation.py | http://cocodataset.org/#keypoints-eval | 数据集详情 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/RetinaNet/detectron2/model_zoo/model_zoo.py | https://dl.fbaipublicfiles.com/detectron2/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/RetinaNet/dev/packaging/build_wheel.sh | https://download.pytorch.org/whl/"$CU_VERSION"/torch_stable.html | 三方库地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/RetinaNet/dev/packaging/gen_install_table.py | https://dl.fbaipublicfiles.com/detectron2/wheels/{cuda}/torch{torch}/index.html | 三方库地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/RetinaNet/docker/Dockerfile | https://bootstrap.pypa.io/get-pip.py | 三方库地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/RetinaNet/docker/Dockerfile | https://download.pytorch.org/whl/cu101/torch_stable.html | 三方库地址 
| +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/RetinaNet/docker/Dockerfile-circleci | https://bootstrap.pypa.io/get-pip.py | 三方库地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/RetinaNet/docker/Dockerfile-circleci | https://download.pytorch.org/whl/cu101/torch_stable.html | 三方库地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/RetinaNet/tools/convert-torchvision-to-d2.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/RetinaNet/url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 权重地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/detection/SOLOv1/public_address_statement.md b/PyTorch/contrib/cv/detection/SOLOv1/public_address_statement.md index 5d3c3d132a1914288ec5bda741f60a7c0354ffc0..c25480a8de79e9c8f7b8c954689ab120f3d4137e 100644 --- a/PyTorch/contrib/cv/detection/SOLOv1/public_address_statement.md +++ b/PyTorch/contrib/cv/detection/SOLOv1/public_address_statement.md @@ -1,91 +1,27 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------|---------------------------------------| ------------------------------------ |---------| -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv1/.travis.yml | http://developer.download.nvidia.com/compute/cuda/repos/ | 下载依赖 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv1/.travis.yml | https://github.com/cocodataset/cocoapi.git#subdirectory=PythonAPI | 下载依赖 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv1/.travis.yml | https://developer.download.nvidia.com/compute/cuda/repos/ | 下载依赖 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv1/docker/Dockerfile | https://github.com:WXinlong/SOLO.git | 下载依赖 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv1/mmcv/.pre-commit-config.yaml | https://github.com/asottile/seed-isort-config | 下载依赖 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv1/mmcv/.pre-commit-config.yaml | https://github.com/pre-commit/mirrors-yapf | 下载依赖 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv1/mmcv/.pre-commit-config.yaml | https://github.com/pre-commit/pre-commit-hooks | 下载依赖 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv1/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/vgg16_caffe-292e1171.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv1/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/resnet50_caffe-788b5fa3.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv1/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/resnet101_caffe-3ad79236.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv1/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/resnext50-32x4d-0ab1a123.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv1/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/resnext101_32x4d-a5af3160.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv1/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/resnext101_64x4d-ee2c6f71.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv1/mmcv/mmcv/runner/checkpoint.py | 
https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/resnet50_gn_thangvubk-ad1730dd.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv1/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/resnet50_gn-9186a21c.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv1/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/resnet101_gn-cac0ab98.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv1/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/resnet50_gn_ws-15beedd8.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv1/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/resnet101_gn_ws-3e3c308c.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv1/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/resnext50_32x4d_gn_ws-0d87ac85.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv1/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/resnext101_32x4d_gn_ws-34ac1a9e.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv1/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/resnext50_32x4d_gn-c7e8b754.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv1/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/resnext101_32x4d_gn-ac3bb84e.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv1/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/hrnetv2_w18-00eb2006.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv1/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/hrnetv2_w32-dc9eeb4f.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv1/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/hrnetv2_w40-ed0b031c.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv1/mmcv/mmcv/runner/checkpoint.py | https://open-mmlab.s3.ap-northeast-2.amazonaws.com/pretrain/third_party/bn_inception_caffe-ed2e8665.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv1/mmcv/mmcv/runner/checkpoint.py | https://open-mmlab.s3.ap-northeast-2.amazonaws.com/pretrain/third_party/i3d_r50_f32s2_k400-2c57e077.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv1/mmcv/mmcv/runner/checkpoint.py | https://open-mmlab.s3.ap-northeast-2.amazonaws.com/pretrain/third_party/nl3d_r50_f32s2_k400-fa7e7caa.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv1/mmcv/setup.py | https://github.com/open-mmlab/mmcv | 下载依赖 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv1/mmcv/setup.py | chenkaidev@gmail.com | 邮箱 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv1/setup.py | chenkaidev@gmail.com | 邮箱 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv1/setup.py | https://github.com/open-mmlab/mmdetection | 下载依赖 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv1/tests/async_benchmark.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/mmdetection/models/mask_rcnn_r50_fpn_1x_20181010-069fa190.pth | 下载预训练权重 
| -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/datasets/pipelines/transforms.py | SOLOv1/mmdet/datasets/pipelines/transforms.py | https://albumentations.readthedocs.io | 相关说明 | -| 开发引入 | / | SOLOv1/mmdet/models/plugins/generalized_attention.py | https://arxiv.org/abs/1711.07971 | 论文地址 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/models/losses/ghm_loss.py | SOLOv1/mmdet/models/losses/ghm_loss.py | https://arxiv.org/abs/1811.05181 | 论文地址 | -| 开发引入 | / | SOLOv1/mmdet/models/losses/focal_loss.py | https://arxiv.org/abs/1708.02002 | 论文地址 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/ops/dcn/src/deform_pool_cuda.cpp | SOLOv1/mmdet/ops/dcn/src/deform_pool_cuda.cpp | https://github.com/torch/cunn/blob/master/lib/THCUNN/generic/SpatialConvolutionMM.cu | 源码实现 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/configs/libra_rcnn/README.md | SOLOv1/mmdet/models/necks/bfp.py | https://arxiv.org/pdf/1904.02701.pdf | 论文地址 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/configs/libra_rcnn/README.md | SOLOv1/mmdet/models/losses/balanced_l1_loss.py | https://arxiv.org/pdf/1904.02701.pdf | 论文地址 | -| 开发引入 | / | SOLOv1/mmdet/core/fp16/hooks.py | https://arxiv.org/abs/1710.03740 | 论文地址 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/core/bbox/assigners/atss_assigner.py | SOLOv1/mmdet/core/bbox/assigners/atss_assigner.py | https://github.com/sfzhang15/ATSS/blob/master/atss_core/modeling/rpn/atss/loss.py | 源码实现 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/core/bbox/demodata.py | SOLOv1/mmdet/core/bbox/demodata.py | https://gitlab.kitware.com/computer-vision/kwimage/blob/master/kwimage/structs/boxes.py#L1390 | 源码实现 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/models/mask_heads/grid_head.py | SOLOv1/mmdet/models/detectors/grid_rcnn.py | https://arxiv.org/abs/1906.05688 | 论文地址 | -| 开发引入 | / | SOLOv1/mmcv/docs/conf.py | http://www.sphinx-doc.org/en/master/config | 相关说明 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/ops/utils/src/compiling_info.cpp | SOLOv1/mmdet/ops/utils/src/compiling_info.cpp | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/vision.cpp | 源码实现 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/ops/sigmoid_focal_loss/src/sigmoid_focal_loss_cuda.cu | SOLOv1/mmdet/ops/sigmoid_focal_loss/src/sigmoid_focal_loss_cuda.cu | https://github.com/facebookresearch/maskrcnn-benchmark/blob/master/maskrcnn_benchmark/csrc/cuda/SigmoidFocalLoss_cuda.cu | 源码实现 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/models/detectors/mask_scoring_rcnn.py | SOLOv1/mmdet/models/detectors/mask_scoring_rcnn.py | https://arxiv.org/abs/1903.00241 | 论文地址 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/ops/utils/src/compiling_info.cpp | SOLOv1/mmdet/ops/utils/src/compiling_info.cpp | https://github.com/pytorch/pytorch/blob/master/aten/src/ATen/cuda/detail/CUDAHooks.cpp#L231 | 源码实现 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/ops/dcn/src/deform_conv_cuda.cpp | SOLOv1/mmdet/ops/dcn/src/deform_conv_cuda.cpp | https://github.com/chengdazhi/Deformable-Convolution-V2-PyTorch/blob/mmdetection/mmdet/ops/dcn/src/deform_conv_cuda.c | 源码实现 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/datasets/pipelines/instaboost.py | SOLOv1/mmdet/datasets/pipelines/instaboost.py | https://github.com/GothicAi/Instaboost | 源码实现 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/core/bbox/demodata.py | SOLOv1/mmdet/core/bbox/demodata.py | https://gitlab.kitware.com/computer-vision/kwarray/blob/master/kwarray/util_random.py#L270 | 源码实现 | -| 开源代码引入 | 
https://github.com/WXinlong/SOLO/mmdet/models/anchor_heads/retina_head.py | SOLOv1/mmdet/models/anchor_heads/retina_head.py | https://arxiv.org/pdf/1708.02002.pdf | 论文地址 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/models/detectors/grid_rcnn.py | SOLOv1/mmdet/models/detectors/grid_rcnn.py | https://arxiv.org/abs/1811.12030 | 论文地址 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/models/anchor_heads/fcos_head.py | SOLOv1/mmdet/models/anchor_heads/fcos_head.py | https://arxiv.org/abs/1904.01355 | 论文地址 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/ops/sigmoid_focal_loss/src/sigmoid_focal_loss_cuda.cu | SOLOv1/mmdet/ops/sigmoid_focal_loss/src/sigmoid_focal_loss_cuda.cu | cyfu@cs.unc.edu | 邮箱地址 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/models/necks/hrfpn.py | SOLOv1/mmdet/models/necks/hrfpn.py | https://arxiv.org/abs/1904.04514 | 论文地址 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/configs/guided_anchoring/README.md | SOLOv1/mmdet/models/anchor_heads/guided_anchor_head.py | https://arxiv.org/abs/1901.03278 | 论文地址 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/utils/util_mixins.py | SOLOv1/mmdet/utils/util_mixins.py | https://github.com/Erotemic/ubelt | 源码实现 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/ops/utils/src/compiling_info.cpp | SOLOv1/mmdet/ops/utils/src/compiling_info.cpp | https://github.com/pytorch/pytorch/blob/master/aten/src/ATen/Version.cpp | 源码实现 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/ops/dcn/src/deform_pool_cuda_kernel.cu | SOLOv1/mmdet/ops/dcn/src/deform_pool_cuda_kernel.cu | https://github.com/chengdazhi/Deformable-Convolution-V2-PyTorch/blob/mmdetection/mmdet/ops/dcn/src/cuda/deform_psroi_pooling_cuda.cu | 源码实现 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/ops/sigmoid_focal_loss/src/sigmoid_focal_loss_cuda.cu | SOLOv1/mmdet/ops/sigmoid_focal_loss/src/sigmoid_focal_loss_cuda.cu | https://github.com/pytorch/pytorch/blob/master/modules/detectron/sigmoid_focal_loss_op.cu | 源码实现 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/datasets/wider_face.py | SOLOv1/mmdet/datasets/wider_face.py | https://github.com/sovrasov/wider-face-pascal-voc-annotations | 源码实现 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/models/mask_heads/grid_head.py | SOLOv1/mmdet/models/mask_heads/grid_head.py | https://arxiv.org/abs/1906.05688 | 论文地址 | -| 开发引入 | / | SOLOv1/mmdet/models/plugins/non_local.py | https://arxiv.org/abs/1711.07971 | 论文地址 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/ops/dcn/src/deform_conv_cuda_kernel.cu | SOLOv1/mmdet/ops/dcn/src/deform_conv_cuda_kernel.cu | https://github.com/chengdazhi/Deformable-Convolution-V2-PyTorch/blob/mmdetection/mmdet/ops/dcn/src/deform_conv_cuda_kernel.cu | 源码实现 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/core/bbox/samplers/ohem_sampler.py | SOLOv1/mmdet/core/bbox/samplers/ohem_sampler.py | https://arxiv.org/pdf/1604.03540.pdf | 论文地址 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/configs/libra_rcnn/README.md | SOLOv1/mmdet/core/bbox/samplers/iou_balanced_neg_sampler.py | https://arxiv.org/pdf/1904.02701.pdf | 论文地址 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/utils/flops_counter.py | SOLOv1/mmdet/utils/flops_counter.py | https://github.com/sovrasov/flops-counter.pytorch | 源码实现 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/datasets/loader/build_loader.py | SOLOv1/mmdet/datasets/loader/build_loader.py | https://github.com/pytorch/pytorch/issues/973 | 源码实现 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/core/bbox/transforms.py | 
SOLOv1/mmdet/core/bbox/transforms.py | https://arxiv.org/abs/1311.2524 | 论文地址 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/models/losses/iou_loss.py | SOLOv1/mmdet/models/losses/iou_loss.py | https://arxiv.org/abs/1711.00164 | 论文地址 | -| 开发引入 | / | SOLOv1/mmcv/examples/resnet_cifar.py | https://github.com/kuangliu/pytorch-cifar/blob/master/models/resnet.py | 源码实现 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/ops/dcn/src/deform_conv_cuda_kernel.cu | SOLOv1/mmdet/ops/dcn/src/deform_conv_cuda_kernel.cu | https://arxiv.org/abs/1703.06211 | 论文地址 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/docs/make.bat | SOLOv1/docs/make.bat | http://sphinx-doc.org/ | 相关说明 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/ops/sigmoid_focal_loss/src/sigmoid_focal_loss.cpp | SOLOv1/mmdet/ops/sigmoid_focal_loss/src/sigmoid_focal_loss.cpp | https://github.com/facebookresearch/maskrcnn-benchmark/blob/master/maskrcnn_benchmark/csrc/SigmoidFocalLoss.h | 源码实现 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/setup.py | SOLOv1/setup.py | http://setuptools.readthedocs.io/en/latest/setuptools.html#declaring-platform-specific-dependencies | 相关依赖 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/ops/dcn/src/deform_pool_cuda.cpp | SOLOv1/mmdet/ops/dcn/src/deform_pool_cuda.cpp | https://github.com/chengdazhi/Deformable-Convolution-V2-PyTorch/blob/mmdetection/mmdet/ops/dcn/src/modulated_dcn_cuda.c | 源码实现 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/models/losses/iou_loss.py | SOLOv1/mmdet/models/losses/iou_loss.py | https://arxiv.org/abs/1902.09630 | 论文地址 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/models/detectors/reppoints_detector.py | SOLOv1/mmdet/models/detectors/reppoints_detector.py | https://arxiv.org/pdf/1904.11490 | 论文地址 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/docs/make.bat | SOLOv1/mmcv/docs/make.bat | http://sphinx-doc.org/ | 相关说明 | -| 开发引入 | / | SOLOv1/mmdet/core/evaluation/coco_utils.py | https://github.com/facebookresearch/detectron2/blob/03064eb5bafe4a3e5750cc7a16672daf5afe8435/detectron2/evaluation/coco_evaluation.py#L259-L283 | 源码实现 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/tools/train.py | SOLOv1/tools/train.py | https://arxiv.org/abs/1706.02677 | 论文地址 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/models/losses/iou_loss.py | SOLOv1/mmdet/models/losses/iou_loss.py | https://github.com/sfzhang15/ATSS/blob/master/atss_core/modeling/rpn/atss/loss.py#L36 | 源码实现 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/models/necks/hrfpn.py | SOLOv1/mmdet/models/backbones/hrnet.py | https://arxiv.org/abs/1904.04514 | 论文地址 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/models/necks/nas_fpn.py | SOLOv1/mmdet/models/necks/nas_fpn.py | https://arxiv.org/abs/1904.07392 | 论文地址 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/models/anchor_heads/atss_head.py | SOLOv1/mmdet/models/anchor_heads/atss_head.py | https://arxiv.org/abs/1912.02424 | 论文地址 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/docs/conf.py | SOLOv1/docs/conf.py | https://www.sphinx-doc.org/en/master/usage/configuration.html | 相关说明 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/models/anchor_heads/fovea_head.py | SOLOv1/mmdet/models/anchor_heads/fovea_head.py | https://arxiv.org/abs/1904.03797 | 论文地址 | +| 文件位置 | 公网地址 | 公网地址用途 | +|-------------------------------------------------------------------------------------|-------------------------------------------------------------------------------------------------------------|-------------| +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SOLOv1/.travis.yml | http://developer.download.nvidia.com/compute/cuda/repos/${UBUNTU_VERSION}/x86_64/${INSTALLER} | 三方库地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SOLOv1/.travis.yml | https://developer.download.nvidia.com/compute/cuda/repos/${UBUNTU_VERSION}/x86_64/7fa2af80.pub | 三方库地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SOLOv1/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/vgg16_caffe-292e1171.pth | mmlab模型url | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SOLOv1/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/resnet50_caffe-788b5fa3.pth | mmlab模型url | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SOLOv1/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/resnet101_caffe-3ad79236.pth | mmlab模型url | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SOLOv1/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/resnext50-32x4d-0ab1a123.pth | mmlab模型url | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SOLOv1/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/resnext101_32x4d-a5af3160.pth | mmlab模型url | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SOLOv1/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/resnext101_64x4d-ee2c6f71.pth | mmlab模型url | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SOLOv1/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/resnet50_gn_thangvubk-ad1730dd.pth | mmlab模型url | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SOLOv1/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/resnet50_gn-9186a21c.pth | mmlab模型url | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SOLOv1/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/resnet101_gn-cac0ab98.pth | mmlab模型url | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SOLOv1/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/resnet50_gn_ws-15beedd8.pth | mmlab模型url | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SOLOv1/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/resnet101_gn_ws-3e3c308c.pth | mmlab模型url | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SOLOv1/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/resnext50_32x4d_gn_ws-0d87ac85.pth | mmlab模型url | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SOLOv1/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/resnext101_32x4d_gn_ws-34ac1a9e.pth | mmlab模型url | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SOLOv1/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/resnext50_32x4d_gn-c7e8b754.pth | mmlab模型url | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SOLOv1/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/resnext101_32x4d_gn-ac3bb84e.pth | mmlab模型url | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SOLOv1/mmcv/mmcv/runner/checkpoint.py | 
https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/hrnetv2_w18-00eb2006.pth | mmlab模型url | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SOLOv1/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/hrnetv2_w32-dc9eeb4f.pth | mmlab模型url | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SOLOv1/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/hrnetv2_w40-ed0b031c.pth | mmlab模型url | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SOLOv1/mmcv/mmcv/runner/checkpoint.py | https://open-mmlab.s3.ap-northeast-2.amazonaws.com/pretrain/third_party/bn_inception_caffe-ed2e8665.pth | mmlab模型url | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SOLOv1/mmcv/mmcv/runner/checkpoint.py | https://open-mmlab.s3.ap-northeast-2.amazonaws.com/pretrain/third_party/i3d_r50_f32s2_k400-2c57e077.pth | mmlab模型url | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SOLOv1/mmcv/mmcv/runner/checkpoint.py | https://open-mmlab.s3.ap-northeast-2.amazonaws.com/pretrain/third_party/nl3d_r50_f32s2_k400-fa7e7caa.pth | mmlab模型url | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SOLOv1/mmcv/setup.py | chenkaidev@gmail.com | 作者邮箱 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SOLOv1/setup.py | chenkaidev@gmail.com | 作者邮箱 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/detection/SOLOv2/public_address_statement.md b/PyTorch/contrib/cv/detection/SOLOv2/public_address_statement.md index b3642962e1dad695e4005afbc669bc3334b9caac..7913ccd83af643e862840cda5683a8ce7a740fbc 100644 --- a/PyTorch/contrib/cv/detection/SOLOv2/public_address_statement.md +++ b/PyTorch/contrib/cv/detection/SOLOv2/public_address_statement.md @@ -1,91 +1,27 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------|---------------------------------------| ------------------------------------ |---------| -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv2/.travis.yml | http://developer.download.nvidia.com/compute/cuda/repos/ | 下载依赖 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv2/.travis.yml | https://github.com/cocodataset/cocoapi.git#subdirectory=PythonAPI | 下载依赖 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv2/.travis.yml | https://developer.download.nvidia.com/compute/cuda/repos/ | 下载依赖 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv2/docker/Dockerfile | https://github.com:WXinlong/SOLO.git | 下载依赖 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv2/mmcv/.pre-commit-config.yaml | https://github.com/asottile/seed-isort-config | 下载依赖 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv2/mmcv/.pre-commit-config.yaml | https://github.com/pre-commit/mirrors-yapf | 下载依赖 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv2/mmcv/.pre-commit-config.yaml | https://github.com/pre-commit/pre-commit-hooks | 下载依赖 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv2/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/vgg16_caffe-292e1171.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv2/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/resnet50_caffe-788b5fa3.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv2/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/resnet101_caffe-3ad79236.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/WXinlong/SOLO 
| SOLOv2/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/resnext50-32x4d-0ab1a123.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv2/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/resnext101_32x4d-a5af3160.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv2/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/resnext101_64x4d-ee2c6f71.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv2/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/resnet50_gn_thangvubk-ad1730dd.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv2/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/resnet50_gn-9186a21c.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv2/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/resnet101_gn-cac0ab98.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv2/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/resnet50_gn_ws-15beedd8.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv2/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/resnet101_gn_ws-3e3c308c.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv2/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/resnext50_32x4d_gn_ws-0d87ac85.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv2/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/resnext101_32x4d_gn_ws-34ac1a9e.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv2/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/resnext50_32x4d_gn-c7e8b754.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv2/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/resnext101_32x4d_gn-ac3bb84e.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv2/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/hrnetv2_w18-00eb2006.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv2/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/hrnetv2_w32-dc9eeb4f.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv2/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/hrnetv2_w40-ed0b031c.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv2/mmcv/mmcv/runner/checkpoint.py | https://open-mmlab.s3.ap-northeast-2.amazonaws.com/pretrain/third_party/bn_inception_caffe-ed2e8665.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv2/mmcv/mmcv/runner/checkpoint.py | https://open-mmlab.s3.ap-northeast-2.amazonaws.com/pretrain/third_party/i3d_r50_f32s2_k400-2c57e077.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv2/mmcv/mmcv/runner/checkpoint.py | 
https://open-mmlab.s3.ap-northeast-2.amazonaws.com/pretrain/third_party/nl3d_r50_f32s2_k400-fa7e7caa.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv2/mmcv/setup.py | https://github.com/open-mmlab/mmcv | 下载依赖 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv2/mmcv/setup.py | chenkaidev@gmail.com | 邮箱 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv2/setup.py | chenkaidev@gmail.com | 邮箱 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv2/setup.py | https://github.com/open-mmlab/mmdetection | 下载依赖 | -| 开源代码引入 | https://github.com/WXinlong/SOLO | SOLOv2/tests/async_benchmark.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/mmdetection/models/mask_rcnn_r50_fpn_1x_20181010-069fa190.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/models/anchor_heads/retina_head.py | SOLOv2/mmdet/models/anchor_heads/retina_head.py | https://arxiv.org/pdf/1708.02002.pdf | 论文地址 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/ops/sigmoid_focal_loss/src/sigmoid_focal_loss_cuda.cu | SOLOv2/mmdet/ops/sigmoid_focal_loss/src/sigmoid_focal_loss_cuda.cu | https://github.com/facebookresearch/maskrcnn-benchmark/blob/master/maskrcnn_benchmark/csrc/cuda/SigmoidFocalLoss_cuda.cu | 源码实现 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/models/detectors/grid_rcnn.py | SOLOv2/mmdet/models/detectors/grid_rcnn.py | https://arxiv.org/abs/1811.12030 | 论文地址 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/utils/flops_counter.py | SOLOv2/mmdet/utils/flops_counter.py | https://github.com/sovrasov/flops-counter.pytorch | 源码实现 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/models/mask_heads/grid_head.py | SOLOv2/mmdet/models/mask_heads/grid_head.py | https://arxiv.org/abs/1906.05688 | 论文地址 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/configs/libra_rcnn/README.md | SOLOv2/mmdet/core/bbox/samplers/iou_balanced_neg_sampler.py | https://arxiv.org/pdf/1904.02701.pdf | 论文地址 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/ops/dcn/src/deform_pool_cuda_kernel.cu | SOLOv2/mmdet/ops/dcn/src/deform_pool_cuda_kernel.cu | https://github.com/chengdazhi/Deformable-Convolution-V2-PyTorch/blob/mmdetection/mmdet/ops/dcn/src/cuda/deform_psroi_pooling_cuda.cu | 源码实现 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/ops/sigmoid_focal_loss/src/sigmoid_focal_loss_cuda.cu | SOLOv2/mmdet/ops/sigmoid_focal_loss/src/sigmoid_focal_loss_cuda.cu | https://github.com/pytorch/pytorch/blob/master/modules/detectron/sigmoid_focal_loss_op.cu | 源码实现 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/utils/util_mixins.py | SOLOv2/mmdet/utils/util_mixins.py | https://github.com/Erotemic/ubelt | 源码实现 | -| 开发引入 | / | SOLOv2/mmdet/core/fp16/hooks.py | https://arxiv.org/abs/1710.03740 | 论文地址 | -| 开发引入 | / | SOLOv2/mmcv/docs/conf.py | http://www.sphinx-doc.org/en/master/config | 相关说明 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/ops/utils/src/compiling_info.cpp | SOLOv2/mmdet/ops/utils/src/compiling_info.cpp | https://github.com/pytorch/pytorch/blob/master/aten/src/ATen/Version.cpp | 源码实现 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/docs/make.bat | SOLOv2/docs/make.bat | http://sphinx-doc.org/ | 相关说明 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/ops/utils/src/compiling_info.cpp | SOLOv2/mmdet/ops/utils/src/compiling_info.cpp | https://github.com/pytorch/pytorch/blob/master/aten/src/ATen/cuda/detail/CUDAHooks.cpp#L231 | 源码实现 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/models/necks/nas_fpn.py | SOLOv2/mmdet/models/necks/nas_fpn.py | https://arxiv.org/abs/1904.07392 | 论文地址 
| -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/models/anchor_heads/fcos_head.py | SOLOv2/mmdet/models/anchor_heads/fcos_head.py | https://arxiv.org/abs/1904.01355 | 论文地址 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/models/detectors/mask_scoring_rcnn.py | SOLOv2/mmdet/models/detectors/mask_scoring_rcnn.py | https://arxiv.org/abs/1903.00241 | 论文地址 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/tools/train.py | SOLOv2/tools/train.py | https://arxiv.org/abs/1706.02677 | 论文地址 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/core/bbox/samplers/ohem_sampler.py | SOLOv2/mmdet/core/bbox/samplers/ohem_sampler.py | https://arxiv.org/pdf/1604.03540.pdf | 论文地址 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/core/bbox/demodata.py | SOLOv2/mmdet/core/bbox/demodata.py | https://gitlab.kitware.com/computer-vision/kwimage/blob/master/kwimage/structs/boxes.py#L1390 | 源码实现 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/models/detectors/reppoints_detector.py | SOLOv2/mmdet/models/detectors/reppoints_detector.py | https://arxiv.org/pdf/1904.11490 | 论文地址 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/core/bbox/assigners/atss_assigner.py | SOLOv2/mmdet/core/bbox/assigners/atss_assigner.py | https://github.com/sfzhang15/ATSS/blob/master/atss_core/modeling/rpn/atss/loss.py | 源码实现 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/datasets/loader/build_loader.py | SOLOv2/mmdet/datasets/loader/build_loader.py | https://github.com/pytorch/pytorch/issues/973 | 源码实现 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/ops/utils/src/compiling_info.cpp | SOLOv2/mmdet/ops/utils/src/compiling_info.cpp | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/vision.cpp | 源码实现 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/ops/sigmoid_focal_loss/src/sigmoid_focal_loss_cuda.cu | SOLOv2/mmdet/ops/sigmoid_focal_loss/src/sigmoid_focal_loss_cuda.cu | cyfu@cs.unc.edu | 邮箱地址 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/datasets/wider_face.py | SOLOv2/mmdet/datasets/wider_face.py | https://github.com/sovrasov/wider-face-pascal-voc-annotations | 源码实现 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/core/bbox/demodata.py | SOLOv2/mmdet/core/bbox/demodata.py | https://gitlab.kitware.com/computer-vision/kwarray/blob/master/kwarray/util_random.py#L270 | 源码实现 | -| 开发引入 | / | SOLOv2/mmdet/models/plugins/non_local.py | https://arxiv.org/abs/1711.07971 | 论文地址 | -| 开发引入 | / | SOLOv2/mmdet/models/plugins/generalized_attention.py | https://arxiv.org/abs/1711.07971 | 论文地址 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/models/losses/iou_loss.py | SOLOv2/mmdet/models/losses/iou_loss.py | https://arxiv.org/abs/1902.09630 | 论文地址 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/models/losses/ghm_loss.py | SOLOv2/mmdet/models/losses/ghm_loss.py | https://arxiv.org/abs/1811.05181 | 论文地址 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/models/losses/iou_loss.py | SOLOv2/mmdet/models/losses/iou_loss.py | https://github.com/sfzhang15/ATSS/blob/master/atss_core/modeling/rpn/atss/loss.py#L36 | 源码实现 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/configs/libra_rcnn/README.md | SOLOv2/mmdet/models/necks/bfp.py | https://arxiv.org/pdf/1904.02701.pdf | 论文地址 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/models/mask_heads/grid_head.py | SOLOv2/mmdet/models/detectors/grid_rcnn.py | https://arxiv.org/abs/1906.05688 | 论文地址 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/ops/dcn/src/deform_conv_cuda_kernel.cu | SOLOv2/mmdet/ops/dcn/src/deform_conv_cuda_kernel.cu 
| https://github.com/chengdazhi/Deformable-Convolution-V2-PyTorch/blob/mmdetection/mmdet/ops/dcn/src/deform_conv_cuda_kernel.cu | 源码实现 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/models/anchor_heads/atss_head.py | SOLOv2/mmdet/models/anchor_heads/atss_head.py | https://arxiv.org/abs/1912.02424 | 论文地址 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/ops/dcn/src/deform_pool_cuda.cpp | SOLOv2/mmdet/ops/dcn/src/deform_pool_cuda.cpp | https://github.com/chengdazhi/Deformable-Convolution-V2-PyTorch/blob/mmdetection/mmdet/ops/dcn/src/modulated_dcn_cuda.c | 源码实现 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/datasets/pipelines/transforms.py | SOLOv2/mmdet/datasets/pipelines/transforms.py | https://albumentations.readthedocs.io | 相关说明 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/ops/dcn/src/deform_conv_cuda.cpp | SOLOv2/mmdet/ops/dcn/src/deform_conv_cuda.cpp | https://github.com/chengdazhi/Deformable-Convolution-V2-PyTorch/blob/mmdetection/mmdet/ops/dcn/src/deform_conv_cuda.c | 源码实现 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/models/necks/hrfpn.py | SOLOv2/mmdet/models/backbones/hrnet.py | https://arxiv.org/abs/1904.04514 | 论文地址 | -| 开发引入 | / | SOLOv2/mmdet/core/evaluation/coco_utils.py | https://github.com/facebookresearch/detectron2/blob/03064eb5bafe4a3e5750cc7a16672daf5afe8435/detectron2/evaluation/coco_evaluation.py#L259-L283 | 源码实现 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/datasets/pipelines/instaboost.py | SOLOv2/mmdet/datasets/pipelines/instaboost.py | https://github.com/GothicAi/Instaboost | 源码实现 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/ops/dcn/src/deform_pool_cuda.cpp | SOLOv2/mmdet/ops/dcn/src/deform_pool_cuda.cpp | https://github.com/torch/cunn/blob/master/lib/THCUNN/generic/SpatialConvolutionMM.cu | 源码实现 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/setup.py | SOLOv2/setup.py | http://setuptools.readthedocs.io/en/latest/setuptools.html#declaring-platform-specific-dependencies | 相关依赖 | -| 开发引入 | / | SOLOv2/mmdet/models/losses/focal_loss.py | https://arxiv.org/abs/1708.02002 | 论文地址 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/configs/libra_rcnn/README.md | SOLOv2/mmdet/models/losses/balanced_l1_loss.py | https://arxiv.org/pdf/1904.02701.pdf | 论文地址 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/models/anchor_heads/fovea_head.py | SOLOv2/mmdet/models/anchor_heads/fovea_head.py | https://arxiv.org/abs/1904.03797 | 论文地址 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/docs/conf.py | SOLOv2/docs/conf.py | https://www.sphinx-doc.org/en/master/usage/configuration.html | 相关说明 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/ops/sigmoid_focal_loss/src/sigmoid_focal_loss.cpp | SOLOv2/mmdet/ops/sigmoid_focal_loss/src/sigmoid_focal_loss.cpp | https://github.com/facebookresearch/maskrcnn-benchmark/blob/master/maskrcnn_benchmark/csrc/SigmoidFocalLoss.h | 源码实现 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/docs/make.bat | SOLOv2/mmcv/docs/make.bat | http://sphinx-doc.org/ | 相关说明 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/models/losses/iou_loss.py | SOLOv2/mmdet/models/losses/iou_loss.py | https://arxiv.org/abs/1711.00164 | 论文地址 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/core/bbox/transforms.py | SOLOv2/mmdet/core/bbox/transforms.py | https://arxiv.org/abs/1311.2524 | 论文地址 | -| 开发引入 | / | SOLOv2/mmcv/examples/resnet_cifar.py | https://github.com/kuangliu/pytorch-cifar/blob/master/models/resnet.py | 源码实现 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/ops/dcn/src/deform_conv_cuda_kernel.cu | 
SOLOv2/mmdet/ops/dcn/src/deform_conv_cuda_kernel.cu | https://arxiv.org/abs/1703.06211 | 论文地址 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/configs/guided_anchoring/README.md | SOLOv2/mmdet/models/anchor_heads/guided_anchor_head.py | https://arxiv.org/abs/1901.03278 | 论文地址 | -| 开源代码引入 | https://github.com/WXinlong/SOLO/mmdet/models/necks/hrfpn.py | SOLOv2/mmdet/models/necks/hrfpn.py | https://arxiv.org/abs/1904.04514 | 论文地址 | +| 文件位置 | 公网地址 | 公网地址用途 | +|-------------------------------------------------------------------------------------|-------------------------------------------------------------------------------------------------------------|-------------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SOLOv2/.travis.yml | http://developer.download.nvidia.com/compute/cuda/repos/${UBUNTU_VERSION}/x86_64/${INSTALLER} | 三方库地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SOLOv2/.travis.yml | https://developer.download.nvidia.com/compute/cuda/repos/${UBUNTU_VERSION}/x86_64/7fa2af80.pub | 三方库地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SOLOv2/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/vgg16_caffe-292e1171.pth | mmlab模型url | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SOLOv2/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/resnet50_caffe-788b5fa3.pth | mmlab模型url | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SOLOv2/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/resnet101_caffe-3ad79236.pth | mmlab模型url | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SOLOv2/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/resnext50-32x4d-0ab1a123.pth | mmlab模型url | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SOLOv2/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/resnext101_32x4d-a5af3160.pth | mmlab模型url | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SOLOv2/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/resnext101_64x4d-ee2c6f71.pth | mmlab模型url | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SOLOv2/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/resnet50_gn_thangvubk-ad1730dd.pth | mmlab模型url | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SOLOv2/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/resnet50_gn-9186a21c.pth | mmlab模型url | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SOLOv2/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/resnet101_gn-cac0ab98.pth | mmlab模型url | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SOLOv2/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/resnet50_gn_ws-15beedd8.pth | mmlab模型url | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SOLOv2/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/resnet101_gn_ws-3e3c308c.pth | mmlab模型url | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SOLOv2/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/resnext50_32x4d_gn_ws-0d87ac85.pth | mmlab模型url | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SOLOv2/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/resnext101_32x4d_gn_ws-34ac1a9e.pth | mmlab模型url | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SOLOv2/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/resnext50_32x4d_gn-c7e8b754.pth | mmlab模型url | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SOLOv2/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/resnext101_32x4d_gn-ac3bb84e.pth | mmlab模型url | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SOLOv2/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/hrnetv2_w18-00eb2006.pth | mmlab模型url | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SOLOv2/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/hrnetv2_w32-dc9eeb4f.pth | mmlab模型url | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SOLOv2/mmcv/mmcv/runner/checkpoint.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/pretrain/third_party/hrnetv2_w40-ed0b031c.pth | mmlab模型url | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SOLOv2/mmcv/mmcv/runner/checkpoint.py | https://open-mmlab.s3.ap-northeast-2.amazonaws.com/pretrain/third_party/bn_inception_caffe-ed2e8665.pth | mmlab模型url | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SOLOv2/mmcv/mmcv/runner/checkpoint.py | https://open-mmlab.s3.ap-northeast-2.amazonaws.com/pretrain/third_party/i3d_r50_f32s2_k400-2c57e077.pth | mmlab模型url | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SOLOv2/mmcv/mmcv/runner/checkpoint.py | https://open-mmlab.s3.ap-northeast-2.amazonaws.com/pretrain/third_party/nl3d_r50_f32s2_k400-fa7e7caa.pth | mmlab模型url | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SOLOv2/mmcv/setup.py | chenkaidev@gmail.com | 作者邮箱 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SOLOv2/setup.py | chenkaidev@gmail.com | 作者邮箱 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/detection/SSD-MobileNetV1/public_address_statement.md b/PyTorch/contrib/cv/detection/SSD-MobileNetV1/public_address_statement.md index cdcf0c62ae10c120cdae5010bbc253fab47f3dc3..24e2768eb75e652aa6e4a9e20ed00034ac2d79f2 100644 --- a/PyTorch/contrib/cv/detection/SSD-MobileNetV1/public_address_statement.md +++ b/PyTorch/contrib/cv/detection/SSD-MobileNetV1/public_address_statement.md @@ -1,3 +1,3 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|------|--------|-------------------------| ------------------------------------ |------| -| 开发引入 | / | SSD-MobileNetV1/url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 下载图片 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|-----------------------------------------------------------------------|-----------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SSD-MobileNetV1/url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 权重地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/detection/SSD-MobilenetV2/public_address_statement.md b/PyTorch/contrib/cv/detection/SSD-MobilenetV2/public_address_statement.md index 5179389c0b4e37b0640cc15f1bea88682bb0adef..b7aa7cf4e6246c81b8e5ce89636c5bbeba0bd194 100644 --- a/PyTorch/contrib/cv/detection/SSD-MobilenetV2/public_address_statement.md +++ 
b/PyTorch/contrib/cv/detection/SSD-MobilenetV2/public_address_statement.md @@ -1,18 +1,7 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------|-----------------------| ------------------------------------ |---------| -| 开源代码引入 | https://github.com/qfgaohao/pytorch-ssd.git | SSD-MobilenetV2/open_images_downloader.py | https://storage.googleapis.com/openimages/2018_04/class-descriptions-boxable.csv | 下载依赖 | -| 开源代码引入 | https://github.com/qfgaohao/pytorch-ssd.git | SSD-MobilenetV2/open_images_downloader.py | https://storage.googleapis.com/openimages/2018_04/ | 下载依赖 | -| 开源代码引入 | https://github.com/qfgaohao/pytorch-ssd.git | SSD-MobilenetV2/vision/nn/alexnet.py | https://download.pytorch.org/models/alexnet-owt-4df8aa71.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/qfgaohao/pytorch-ssd.git | SSD-MobilenetV2/vision/nn/squeezenet.py | https://download.pytorch.org/models/squeezenet1_0-a815701f.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/qfgaohao/pytorch-ssd.git | SSD-MobilenetV2/vision/nn/squeezenet.py | https://download.pytorch.org/models/squeezenet1_1-f364aa15.pth | 下载预训练权重 | -| 开发引入 | / | SSD-MobilenetV2/vision/nn/squeezenet.py | https://arxiv.org/abs/1602.07360 | 论文地址 | -| 开源代码引入 | https://github.com/qfgaohao/pytorch-ssd.git/vision/utils/box_utils_numpy.py | SSD-MobilenetV2/vision/utils/box_utils.py | https://arxiv.org/abs/1704.04503 | 论文地址 | -| 开源代码引入 | https://github.com/qfgaohao/pytorch-ssd.git/vision/nn/mobilenet_v2.py | SSD-MobilenetV2/vision/nn/mobilenet_v2.py | https://github.com/tonylins/pytorch-mobilenet-v2/blob/master/MobileNetV2.py | 源码实现 | -| 开源代码引入 | https://github.com/qfgaohao/pytorch-ssd.git/vision/nn/alexnet.py | SSD-MobilenetV2/vision/nn/alexnet.py | https://github.com/pytorch/vision/blob/master/torchvision/models/alexnet.py | 源码实现 | -| 开源代码引入 | https://github.com/qfgaohao/pytorch-ssd.git/vision/nn/mobilenet.py | SSD-MobilenetV2/vision/nn/mobilenet.py | https://github.com/marvis/pytorch-mobilenet | 源码实现 | -| 开源代码引入 | https://github.com/qfgaohao/pytorch-ssd.git/vision/prunning/prunner.py | SSD-MobilenetV2/vision/prunning/prunner.py | https://arxiv.org/pdf/1611.06440.pdf | 论文地址 | -| 开源代码引入 | https://github.com/qfgaohao/pytorch-ssd.git/vision/nn/squeezenet.py | SSD-MobilenetV2/vision/nn/squeezenet.py | https://github.com/DeepScale/SqueezeNet/tree/master/SqueezeNet_v1.1 | 源码实现 | -| 开源代码引入 | https://github.com/qfgaohao/pytorch-ssd.git/vision/utils/box_utils_numpy.py | SSD-MobilenetV2/vision/utils/box_utils.py | https://github.com/facebookresearch/Detectron/blob/master/detectron/utils/cython_nms.pyx | 源码实现 | -| 开源代码引入 | https://github.com/qfgaohao/pytorch-ssd.git/vision/nn/vgg.py | SSD-MobilenetV2/vision/nn/vgg.py | https://github.com/amdegroot/ssd.pytorch/blob/master/ssd.py | 源码实现 | -| 开源代码引入 | https://github.com/qfgaohao/pytorch-ssd.git/vision/transforms/transforms.py | SSD-MobilenetV2/vision/transforms/transforms.py | https://github.com/amdegroot/ssd.pytorch | 源码实现 | -| 开发引入 | / | SSD-MobilenetV2/vision/nn/alexnet.py | https://arxiv.org/abs/1404.5997 | 论文地址 | +| 文件位置 | 公网地址 | 公网地址用途 | +|-----------------------------------------------------------------------------------------|------------------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SSD-MobilenetV2/open_images_downloader.py | https://storage.googleapis.com/openimages/2018_04/class-descriptions-boxable.csv | 模型权重 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SSD-MobilenetV2/open_images_downloader.py | https://storage.googleapis.com/openimages/2018_04/{dataset_type}/{dataset_type}-annotations-bbox.csv | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SSD-MobilenetV2/vision/nn/alexnet.py | https://download.pytorch.org/models/alexnet-owt-4df8aa71.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SSD-MobilenetV2/vision/nn/squeezenet.py | https://download.pytorch.org/models/squeezenet1_1-f364aa15.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SSD-MobilenetV2/vision/nn/squeezenet.py | https://download.pytorch.org/models/squeezenet1_0-a815701f.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/detection/SSD-Resnet/public_address_statement.md b/PyTorch/contrib/cv/detection/SSD-Resnet/public_address_statement.md index cb9b0984d978acd104ee1d94329921383eb0ea84..7c2d31e4b92330f4f84a6c7212985be89844d177 100644 --- a/PyTorch/contrib/cv/detection/SSD-Resnet/public_address_statement.md +++ b/PyTorch/contrib/cv/detection/SSD-Resnet/public_address_statement.md @@ -1,34 +1,11 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------|--------------------------------------| ------------------------------------ |---------| -| 开源代码引入 | https://github.com/mlcommons/training_results_v0.7/tree/master/NVIDIA/benchmarks/ssd/implementations/pytorch | SSD-Resnet/bz2.py | nadeem.vawda@gmail.com | 邮箱 | -| 开源代码引入 | https://github.com/mlcommons/training_results_v0.7/tree/master/NVIDIA/benchmarks/ssd/implementations/pytorch | SSD-Resnet/Dockerfile | https://github.com/mlperf/logging/archive/9ea0afa.zip | 下载依赖 | -| 开源代码引入 | https://github.com/mlcommons/training_results_v0.7/tree/master/NVIDIA/benchmarks/ssd/implementations/pytorch | SSD-Resnet/download_dataset.sh | http://images.cocodataset.org/zips/train2017.zip | 下载数据集 | -| 开源代码引入 | https://github.com/mlcommons/training_results_v0.7/tree/master/NVIDIA/benchmarks/ssd/implementations/pytorch | SSD-Resnet/download_dataset.sh | http://images.cocodataset.org/zips/val2017.zip | 下载数据集 | -| 开源代码引入 | https://github.com/mlcommons/training_results_v0.7/tree/master/NVIDIA/benchmarks/ssd/implementations/pytorch | SSD-Resnet/download_dataset.sh | http://images.cocodataset.org/annotations/annotations_trainval2017.zip | 下载数据集 | -| 开源代码引入 | https://github.com/mlcommons/training_results_v0.7/tree/master/NVIDIA/benchmarks/ssd/implementations/pytorch | SSD-Resnet/mlperf_logging/result_summarizer/result_summarizer.py | https://github.com/mlperf/training_results_v | 下载依赖 | -| 开源代码引入 | https://github.com/mlcommons/training_results_v0.7/tree/master/NVIDIA/benchmarks/ssd/implementations/pytorch | SSD-Resnet/mlperf_logging/system_desc_checker/system_desc_checker.py | https://github.com/mlperf/training_results_v | 下载依赖 | -| 开源代码引入 | https://github.com/mlcommons/training_results_v0.7/tree/master/NVIDIA/benchmarks/ssd/implementations/pytorch | SSD-Resnet/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/mlcommons/training_results_v0.7/tree/master/NVIDIA/benchmarks/ssd/implementations/pytorch | SSD-Resnet/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/mlcommons/training_results_v0.7/tree/master/NVIDIA/benchmarks/ssd/implementations/pytorch | SSD-Resnet/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 下载预训练权重 | -| 开源代码引入 | 
https://github.com/mlcommons/training_results_v0.7/tree/master/NVIDIA/benchmarks/ssd/implementations/pytorch | SSD-Resnet/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/mlcommons/training_results_v0.7/tree/master/NVIDIA/benchmarks/ssd/implementations/pytorch | SSD-Resnet/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/mlcommons/training_results_v0.7/tree/master/NVIDIA/benchmarks/ssd/implementations/pytorch/test.py | SSD-Resnet/box_coder.py | https://github.com/amdegroot/ssd.pytorch/blob/master/data/config.py | 源码实现 | -| 开发引入 | / | SSD-Resnet/box_coder.py | https://github.com/open-mmlab/mmdetection/blob/master/mmdet/core/post_processing/bbox_nms.py#L7 | 源码实现 | -| 开源代码引入 | https://github.com/mlcommons/training_results_v0.7/tree/master/NVIDIA/benchmarks/ssd/implementations/pytorch/utils.py | SSD-Resnet/utils.py | https://github.com/kuangliu/pytorch-ssd | 源码实现 | -| 开源代码引入 | https://github.com/mlcommons/training_results_v0.7/tree/master/NVIDIA/benchmarks/ssd/implementations/pytorch/utils.py | SSD-Resnet/utils.py | https://discuss.pytorch.org/t/how-to-preprocess-input-for-pre-trained-networks/683 | 预训练模型 | -| 开源代码引入 | https://github.com/mlcommons/training_results_v0.7/tree/master/NVIDIA/benchmarks/ssd/implementations/pytorch/parse_config.py | SSD-Resnet/parse_config.py | https://stackoverflow.com/a/31347222/2209313 | 相关说明 | -| 开源代码引入 | https://github.com/mlcommons/training_results_v0.7/tree/master/NVIDIA/benchmarks/ssd/implementations/pytorch/fused_color_jitter.py | SSD-Resnet/fused_color_jitter.py | https://pytorch.org/docs/stable/torchvision/transforms.html#torchvision.transforms.ColorJitter | 相关依赖 | -| 开源代码引入 | https://github.com/mlcommons/training_results_v0.7/tree/master/NVIDIA/benchmarks/ssd/implementations/pytorch/train.py | SSD-Resnet/demo.py | https://github.com/nvidia/apex | 相关依赖 | -| 开源代码引入 | https://github.com/mlcommons/training_results_v0.7/tree/master/NVIDIA/benchmarks/ssd/implementations/pytorch/opt_loss.py | SSD-Resnet/base_model.py | http://jany.st/post/2017-11-05-single-shot-detector-ssd-from-scratch-in-tensorflow.html | 相关依赖 | -| 开源代码引入 | https://github.com/mlcommons/training_results_v0.7/tree/master/NVIDIA/benchmarks/ssd/implementations/pytorch/train.py | SSD-Resnet/eval.py | https://github.com/nvidia/apex | 相关依赖 | -| 开发引入 | / | SSD-Resnet/nms.py | https://github.com/open-mmlab/mmdetection/blob/master/mmdet/core/post_processing/bbox_nms.py#L7 | 源码实现 | -| 开发引入 | / | SSD-Resnet/Dockerfile | https://download.pytorch.org/whl/torch_stable.html | 相关依赖 | -| 开源代码引入 | https://github.com/mlcommons/training_results_v0.7/tree/master/NVIDIA/benchmarks/ssd/implementations/pytorch/utils.py | SSD-Resnet/box_coder.py | https://github.com/kuangliu/pytorch-ssd | 源码实现 | -| 开源代码引入 | https://github.com/mlcommons/training_results_v0.7/tree/master/NVIDIA/benchmarks/ssd/implementations/pytorch/train.py | SSD-Resnet/train.py | https://github.com/nvidia/apex | 相关依赖 | -| 开源代码引入 | https://github.com/mlcommons/training_results_v0.7/tree/master/NVIDIA/benchmarks/ssd/implementations/pytorch/box_coder.py | SSD-Resnet/box_coder.py | https://github.com/weiliu89/caffe | 源码实现 | -| 开源代码引入 | https://github.com/mlcommons/training_results_v0.7/tree/master/NVIDIA/benchmarks/ssd/implementations/pytorch/box_coder.py | SSD-Resnet/box_coder.py | https://github.com/amdegroot/ssd.pytorch | 源码实现 | -| 开源代码引入 | 
https://github.com/mlcommons/training_results_v0.7/tree/master/NVIDIA/benchmarks/ssd/implementations/pytorch/train.py | SSD-Resnet/eval8p.py | https://github.com/nvidia/apex | 源码实现 | -| 开源代码引入 | https://github.com/mlcommons/training_results_v0.7/tree/master/NVIDIA/benchmarks/ssd/implementations/pytorch/csrc/nhwc/ParamsHash.h | SSD-Resnet/csrc/nhwc/ParamsHash.h | https://en.wikipedia.org/wiki/Fowler%E2%80%93Noll%E2%80%93Vo_hash_function | 相关说明 | -| 开源代码引入 | https://github.com/mlcommons/training_results_v0.7/tree/master/NVIDIA/benchmarks/ssd/implementations/pytorch/utils.py | SSD-Resnet/utils.py | https://github.com/chauhan-utk/ssd.DomainAdaptation | 源码实现 | -| 开源代码引入 | https://github.com/mlcommons/training_results_v0.7/tree/master/NVIDIA/benchmarks/ssd/implementations/pytorch/opt_loss.py | SSD-Resnet/opt_loss.py | http://jany.st/post/2017-11-05-single-shot-detector-ssd-from-scratch-in-tensorflow.html | 相关依赖 | -| 开源代码引入 | https://github.com/mlcommons/training_results_v0.7/tree/master/NVIDIA/benchmarks/ssd/implementations/pytorch/csrc/nhwc/conv.cpp | SSD-Resnet/csrc/nhwc/conv.cpp | https://blog.yani.io/filter-group-tutorial/ | 相关说明 | +| 文件位置 | 公网地址 | 公网地址用途 | +|------------------------------------------------------------------------------|------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SSD-Resnet/bz2.py | nadeem.vawda@gmail.com | 作者邮箱 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SSD-Resnet/download_dataset.sh | http://images.cocodataset.org/zips/train2017.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SSD-Resnet/download_dataset.sh | http://images.cocodataset.org/annotations/annotations_trainval2017.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SSD-Resnet/download_dataset.sh | http://images.cocodataset.org/zips/val2017.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SSD-Resnet/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SSD-Resnet/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SSD-Resnet/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SSD-Resnet/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SSD-Resnet/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/detection/SSD/public_address_statement.md b/PyTorch/contrib/cv/detection/SSD/public_address_statement.md index 9d1e44760863f5efe7a968a643d18ac653bb8825..715a5495bc1d74d8aa5b28dcbbb6241be6b50f22 100644 --- a/PyTorch/contrib/cv/detection/SSD/public_address_statement.md +++ b/PyTorch/contrib/cv/detection/SSD/public_address_statement.md @@ -1,141 +1,8 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|-------------------------------------------------------------------|-------------------------------| ------------------------------------ |---------| -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/tree/master/configs/ssd | SSD/.pre-commit-config.yaml | https://gitlab.com/pycqa/flake8.git | 下载依赖 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/tree/master/configs/ssd | SSD/.pre-commit-config.yaml | https://github.com/asottile/seed-isort-config | 下载依赖 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmdetection/tree/master/configs/ssd | SSD/.pre-commit-config.yaml | https://github.com/timothycrosley/isort | 下载依赖 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/tree/master/configs/ssd | SSD/.pre-commit-config.yaml | https://github.com/pre-commit/mirrors-yapf | 下载依赖 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/tree/master/configs/ssd | SSD/.pre-commit-config.yaml | https://github.com/pre-commit/pre-commit-hooks | 下载依赖 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/tree/master/configs/ssd | SSD/.pre-commit-config.yaml | https://github.com/myint/docformatter | 下载依赖 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/tree/master/configs/ssd | SSD/docker/Dockerfile | https://openmmlab.oss-accelerate.aliyuncs.com/mmcv/dist/index.html | 下载依赖 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/tree/master/configs/ssd | SSD/docker/Dockerfile | https://github.com/open-mmlab/mmdetection.git | 下载依赖 | -| 开发引入 | / | SSD/Dockerfile | https://github.com/open-mmlab/mmcv.git | 下载依赖 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/tree/master/configs/ssd | SSD/mmdet/datasets/lvis.py | http://images.cocodataset.org/ | 下载依赖 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/tree/master/configs/ssd | SSD/setup.py | openmmlab@gmail.com | 邮箱 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/tree/master/configs/ssd | SSD/setup.py | https://github.com/open-mmlab/mmdetection | 下载依赖 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/tree/master/configs/ssd | SSD/tests/async_benchmark.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/mmdetection/models/mask_rcnn_r50_fpn_1x_20181010-069fa190.pth | 下载预训练权重 | -| 开发引入 | / | SSD/mmdet/models/detectors/point_rend.py | https://arxiv.org/abs/1912.08193 | 论文地址 | -| 开发引入 | / | SSD/mmdet/models/backbones/detectors_resnet.py | https://arxiv.org/pdf/2006.02334.pdf | 论文地址 | -| 开发引入 | / | SSD/mmdet/core/bbox/demodata.py | https://gitlab.kitware.com/computer-vision/kwimage/blob/master/kwimage/structs/boxes.py#L1390 | 源码实现 | -| 开发引入 | / | SSD/mmdet/core/bbox/coder/yolo_bbox_coder.py | https://arxiv.org/abs/1506.02640 | 论文地址 | -| 开发引入 | / | SSD/mmdet/models/necks/bfp.py | https://arxiv.org/abs/1904.02701 | 论文地址 | -| 开发引入 | / | SSD/mmdet/models/losses/iou_loss.py | https://arxiv.org/abs/1711.00164 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/tree/master/configs/ssd/README.md | SSD/mmdet/models/dense_heads/ssd_head.py | https://arxiv.org/abs/1512.02325 | 论文地址 | -| 开发引入 | / | SSD/mmdet/core/bbox/coder/delta_xywh_bbox_coder.py | https://arxiv.org/abs/1311.2524 | 论文地址 | -| 开发引入 | / | SSD/mmdet/models/detectors/faster_rcnn.py | https://arxiv.org/abs/1506.01497 | 论文地址 | -| 开发引入 | / | SSD/configs/cityscapes/faster_rcnn_r50_fpn_1x_cityscapes.py | https://open-mmlab.s3.ap-northeast-2.amazonaws.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_fpn_1x_coco/faster_rcnn_r50_fpn_1x_coco_20200130-047c8118.pth | 预训练模型 | -| 开发引入 | / | SSD/mmdet/models/detectors/fovea.py | https://arxiv.org/abs/1904.03797 | 论文地址 | -| 开发引入 | / | SSD/mmdet/models/detectors/cascade_rcnn.py | https://arxiv.org/abs/1906.09756 | 论文地址 | -| 开发引入 | / | SSD/mmdet/models/backbones/regnet.py | https://arxiv.org/abs/2003.13678 | 论文地址 | -| 开发引入 | / | SSD/mmdet/models/dense_heads/nasfcos_head.py | https://arxiv.org/abs/1906.04423 | 论文地址 | -| 开发引入 | / | SSD/mmdet/models/dense_heads/fovea_head.py | https://arxiv.org/abs/1904.03797 | 论文地址 | -| 开发引入 | / | SSD/mmdet/core/post_processing/bbox_nms.py | 
https://arxiv.org/abs/1904.02689 | 论文地址 | -| 开发引入 | / | SSD/mmdet/datasets/builder.py | https://github.com/pytorch/pytorch/issues/973 | 源码实现 | -| 开发引入 | / | SSD/mmdet/models/detectors/cornernet.py | https://arxiv.org/abs/1808.01244 | 论文地址 | -| 开发引入 | / | SSD/mmdet/models/losses/iou_loss.py | https://arxiv.org/abs/1911.08287 | 论文地址 | -| 开发引入 | / | SSD/mmdet/models/necks/rfp.py | https://arxiv.org/pdf/2006.02334.pdf | 论文地址 | -| 开发引入 | / | SSD/mmdet/models/dense_heads/guided_anchor_head.py | https://arxiv.org/abs/1901.03278 | 论文地址 | -| 开发引入 | / | SSD/mmdet/core/bbox/demodata.py | https://gitlab.kitware.com/computer-vision/kwarray/blob/master/kwarray/util_random.py#L270 | 源码实现 | -| 开发引入 | / | SSD/mmdet/models/roi_heads/mask_scoring_roi_head.py | https://arxiv.org/abs/1903.00241 | 论文地址 | -| 开发引入 | / | SSD/mmdet/core/bbox/samplers/iou_balanced_neg_sampler.py | https://arxiv.org/pdf/1904.02701.pdf | 论文地址 | -| 开发引入 | / | SSD/mmdet/models/losses/varifocal_loss.py | https://arxiv.org/abs/2008.13367 | 论文地址 | -| 开发引入 | / | SSD/mmdet/models/dense_heads/centripetal_head.py | https://arxiv.org/abs/2003.09119 | 论文地址 | -| 开发引入 | / | SSD/mmdet/models/losses/gaussian_focal_loss.py | https://arxiv.org/abs/1808.01244 | 论文地址 | -| 开发引入 | / | SSD/mmdet/models/dense_heads/retina_head.py | https://arxiv.org/pdf/1708.02002.pdf | 论文地址 | -| 开发引入 | / | SSD/configs/faster_rcnn/faster_rcnn_r50_fpn_1x_coco-person-bicycle-car.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/mmdetection/models/faster_rcnn_r50_fpn_1x_20181010-3d1b3351.pth | 预训练模型 | -| 开发引入 | / | SSD/mmdet/models/detectors/yolact.py | https://arxiv.org/abs/1904.02689 | 论文地址 | -| 开发引入 | / | SSD/mmdet/models/dense_heads/vfnet_head.py | https://arxiv.org/abs/2008.13367 | 论文地址 | -| 开发引入 | / | SSD/mmdet/models/dense_heads/corner_head.py | https://arxiv.org/abs/1808.01244 | 论文地址 | -| 开发引入 | / | SSD/mmdet/models/dense_heads/corner_head.py | https://github.com/princeton-vl/CornerNet/blob/master/models/py_utils/ | 源码实现 | -| 开发引入 | / | SSD/mmdet/datasets/pipelines/instaboost.py | https://github.com/GothicAi/Instaboost | 源码实现 | -| 开发引入 | / | SSD/mmdet/models/losses/ae_loss.py | https://github.com/princeton-vl/CornerNet/blob/master/models/py_utils/kp_utils.py#L180 | 源码实现 | -| 开发引入 | / | SSD/mmdet/models/roi_heads/point_rend_roi_head.py | https://github.com/facebookresearch/detectron2/tree/master/projects/PointRend | 源码实现 | -| 开发引入 | / | SSD/mmdet/models/detectors/mask_scoring_rcnn.py | https://arxiv.org/abs/1903.00241 | 论文地址 | -| 开发引入 | / | SSD/mmdet/models/dense_heads/free_anchor_retina_head.py | https://arxiv.org/abs/1909.02466 | 论文地址 | -| 开发引入 | / | SSD/mmdet/models/dense_heads/gfl_head.py | https://arxiv.org/abs/2006.04388 | 论文地址 | -| 开发引入 | / | SSD/mmdet/core/bbox/samplers/ohem_sampler.py | https://arxiv.org/abs/1604.03540 | 论文地址 | -| 开发引入 | / | SSD/mmdet/datasets/cityscapes.py | https://github.com/mcordts/cityscapesScripts/blob/master/cityscapesscripts/evaluation/evalInstanceLevelSemanticLabeling.py | 源码实现 | -| 开发引入 | / | SSD/mmdet/models/roi_heads/dynamic_roi_head.py | https://arxiv.org/abs/2004.06002 | 论文地址 | -| 开发引入 | / | SSD/mmdet/models/roi_heads/grid_roi_head.py | https://arxiv.org/abs/1811.12030 | 论文地址 | -| 开发引入 | / | SSD/mmdet/models/losses/iou_loss.py | https://arxiv.org/abs/1902.09630 | 论文地址 | -| 开发引入 | / | SSD/mmdet/models/detectors/fsaf.py | https://arxiv.org/abs/1903.00621 | 论文地址 | -| 开发引入 | / | SSD/mmdet/models/detectors/fast_rcnn.py | https://arxiv.org/abs/1504.08083 | 论文地址 | -| 开发引入 | / | SSD/mmdet/models/detectors/grid_rcnn.py | 
https://arxiv.org/abs/1906.05688 | 论文地址 | -| 开发引入 | / | SSD/mmdet/models/backbones/resnet.py | https://arxiv.org/pdf/1812.01187.pdf | 论文地址 | -| 开发引入 | / | SSD/mmdet/models/detectors/paa.py | https://arxiv.org/pdf/2007.08103.pdf | 论文地址 | -| 开发引入 | / | SSD/mmdet/core/bbox/coder/bucketing_bbox_coder.py | https://arxiv.org/abs/1912.04260 | 论文地址 | -| 开发引入 | / | SSD/mmdet/datasets/pipelines/transforms.py | https://github.com/bethgelab/imagecorruptions | 源码实现 | -| 开发引入 | / | SSD/mmdet/models/necks/fpn_carafe.py | https://arxiv.org/abs/1905.02188 | 论文地址 | -| 开发引入 | / | SSD/mmdet/models/roi_heads/cascade_roi_head.py | https://arxiv.org/abs/1712.00726 | 论文地址 | -| 开发引入 | / | SSD/mmdet/models/dense_heads/yolact_head.py | https://arxiv.org/abs/1904.02689 | 论文地址 | -| 开发引入 | / | SSD/mmdet/datasets/pipelines/instaboost.py | https://arxiv.org/abs/1908.07801 | 论文地址 | -| 开发引入 | / | SSD/mmdet/models/losses/iou_loss.py | https://arxiv.org/abs/2005.03572 | 论文地址 | -| 开发引入 | / | SSD/mmdet/models/necks/pafpn.py | https://arxiv.org/abs/1803.01534 | 论文地址 | -| 开发引入 | / | SSD/mmdet/models/losses/focal_loss.py | https://arxiv.org/abs/1708.02002 | 论文地址 | -| 开发引入 | / | SSD/mmdet/models/detectors/vfnet.py | https://arxiv.org/abs/2008.13367 | 论文地址 | -| 开发引入 | / | SSD/mmdet/models/detectors/grid_rcnn.py | https://arxiv.org/abs/1811.12030 | 论文地址 | -| 开发引入 | / | SSD/mmdet/models/detectors/reppoints_detector.py | https://arxiv.org/pdf/1904.11490 | 论文地址 | -| 开发引入 | / | SSD/mmdet/models/detectors/atss.py | https://arxiv.org/abs/1912.02424 | 论文地址 | -| 开发引入 | / | SSD/mmdet/utils/util_mixins.py | https://github.com/Erotemic/ubelt | 源码实现 | -| 开发引入 | / | SSD/mmdet/datasets/pipelines/transforms.py | https://albumentations.readthedocs.io | 相关说明 | -| 开发引入 | / | SSD/mmdet/models/losses/gaussian_focal_loss.py | https://github.com/princeton-vl/CornerNet/blob/master/models/py_utils/kp_utils.py#L152 | 源码实现 | -| 开发引入 | / | SSD/mmdet/core/bbox/coder/legacy_delta_xywh_bbox_coder.py | https://arxiv.org/abs/1311.2524 | 论文地址 | -| 开发引入 | / | SSD/mmdet/datasets/lvis.py | https://github.com/facebookresearch/detectron2/ | 源码实现 | -| 开发引入 | / | SSD/mmdet/models/utils/gaussian_target.py | https://github.com/princeton-vl/CornerNet-Lite/blob/master/core/sample/ | 源码实现 | -| 开发引入 | / | SSD/mmdet/models/roi_heads/pisa_roi_head.py | https://arxiv.org/abs/1904.04821 | 论文地址 | -| 开发引入 | / | SSD/mmdet/models/detectors/fcos.py | https://arxiv.org/abs/1904.01355 | 论文地址 | -| 开发引入 | / | SSD/mmdet/core/bbox/coder/tblr_bbox_coder.py | https://arxiv.org/abs/1903.00621 | 论文地址 | -| 开发引入 | / | SSD/docs/make.bat | http://sphinx-doc.org/ | 相关说明 | -| 开发引入 | / | SSD/mmdet/models/roi_heads/mask_heads/mask_point_head.py | https://github.com/facebookresearch/detectron2/tree/master/projects/PointRend/point_head/point_head.py | 源码实现 | -| 开发引入 | / | SSD/mmdet/models/necks/nas_fpn.py | https://arxiv.org/abs/1904.07392 | 论文地址 | -| 开发引入 | / | SSD/mmdet/datasets/lvis.py | http://images.cocodataset.org/train2017/000000391895.jpg | 数据集地址 | -| 开发引入 | / | SSD/mmdet/models/detectors/nasfcos.py | https://arxiv.org/abs/1906.0442 | 论文地址 | -| 开发引入 | / | SSD/test/train_performance_multinodes.sh | port=23333 | 相关说明 | -| 开发引入 | / | SSD/mmdet/models/detectors/retinanet.py | https://arxiv.org/abs/1708.02002 | 论文地址 | -| 开发引入 | / | SSD/mmdet/models/dense_heads/fsaf_head.py | https://arxiv.org/abs/1903.00621 | 论文地址 | -| 开发引入 | / | SSD/mmdet/models/backbones/hourglass.py | https://arxiv.org/abs/1603.06937 | 论文地址 | -| 开发引入 | / | SSD/configs/cityscapes/mask_rcnn_r50_fpn_1x_cityscapes.py | 
https://open-mmlab.s3.ap-northeast-2.amazonaws.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r50_fpn_1x_coco/mask_rcnn_r50_fpn_1x_coco_20200205-d4b0c5d6.pth | 预训练模型 | -| 开发引入 | / | SSD/mmdet/models/losses/ghm_loss.py | https://arxiv.org/abs/1811.05181 | 论文地址 | -| 开发引入 | / | SSD/mmdet/models/detectors/mask_rcnn.py | https://arxiv.org/abs/1703.06870 | 论文地址 | -| 开发引入 | / | SSD/mmdet/datasets/cityscapes.py | https://github.com/facebookresearch/detectron2/blob/master/detectron2/data/datasets/cityscapes.py | 源码实现 | -| 开发引入 | / | SSD/mmdet/models/dense_heads/paa_head.py | https://github.com/kkhoot/PAA/issues/9 | 源码实现 | -| 开发引入 | / | SSD/mmdet/models/roi_heads/roi_extractors/single_level_roi_extractor.py | https://arxiv.org/abs/1612.03144 | 论文地址 | -| 开发引入 | / | SSD/mmdet/models/losses/iou_loss.py | https://github.com/Zzh-tju/CIoU | 源码实现 | -| 开发引入 | / | SSD/mmdet/models/losses/gfocal_loss.py | https://arxiv.org/abs/2006.04388 | 论文地址 | -| 开发引入 | / | SSD/mmdet/models/dense_heads/yolo_head.py | https://arxiv.org/abs/1804.02767 | 论文地址 | -| 开发引入 | / | SSD/mmdet/models/roi_heads/mask_heads/grid_head.py | https://arxiv.org/abs/1906.05688 | 论文地址 | -| 开发引入 | / | SSD/mmdet/datasets/dataset_wrappers.py | https://github.com/facebookresearch/detectron2/blob/41d475b75a230221e21d9cac5d69655e3415e3a4/detectron2/data/samplers/distributed_sampler.py#L57 | 源码实现 | -| 开发引入 | / | SSD/mmdet/models/roi_heads/point_rend_roi_head.py | https://arxiv.org/abs/1912.08193 | 论文地址 | -| 开发引入 | / | SSD/mmdet/models/necks/hrfpn.py | https://arxiv.org/abs/1904.04514 | 论文地址 | -| 开发引入 | / | SSD/mmdet/datasets/pipelines/auto_augment.py | https://arxiv.org/pdf/1906.11172 | 论文地址 | -| 开发引入 | / | SSD/mmdet/datasets/coco.py | https://github.com/facebookresearch/detectron2/ | 源码实现 | -| 开发引入 | / | SSD/mmdet/models/roi_heads/htc_roi_head.py | https://arxiv.org/abs/1901.07518 | 论文地址 | -| 开发引入 | / | SSD/mmdet/models/dense_heads/paa_head.py | https://github.com/kkhoot/PAA/issues/8 | 源码实现 | -| 开发引入 | / | SSD/mmdet/datasets/wider_face.py | https://github.com/sovrasov/wider-face-pascal-voc-annotations | 源码实现 | -| 开发引入 | / | SSD/mmdet/core/bbox/assigners/atss_assigner.py | https://github.com/sfzhang15/ATSS/blob/master/atss_core/modeling/rpn/atss/loss.py | 源码实现 | -| 开发引入 | / | SSD/mmdet/core/mask/structures.py | https://stackoverflow.com/questions/24467972/calculate-area-of-polygon-given-x-y-coordinates | 相关说明 | -| 开发引入 | / | SSD/mmdet/models/losses/iou_loss.py | https://github.com/Zzh-tju/DIoU | 源码实现 | -| 开发引入 | / | SSD/setup.py | http://setuptools.readthedocs.io/en/latest/setuptools.html#declaring-platform-specific-dependencies | 相关依赖 | -| 开发引入 | / | SSD/mmdet/datasets/dataset_wrappers.py | https://arxiv.org/abs/1908.03195 | 论文地址 | -| 开发引入 | / | SSD/mmdet/models/dense_heads/fcos_head.py | https://github.com/tianzhi0549/FCOS/issues/89#issuecomment-516877042 | 源码实现 | -| 开发引入 | / | SSD/mmdet/models/necks/fpn.py | https://arxiv.org/abs/1612.03144 | 论文地址 | -| 开发引入 | / | SSD/mmdet/core/bbox/samplers/score_hlr_sampler.py | https://arxiv.org/abs/1904.04821 | 论文地址 | -| 开发引入 | / | SSD/mmdet/models/dense_heads/fcos_head.py | https://github.com/tianzhi0549/FCOS | 源码实现 | -| 开发引入 | / | SSD/mmcv_need/optimizer.py | https://arxiv.org/abs/1710.03740 | 论文地址 | -| 开发引入 | / | SSD/mmdet/models/roi_heads/bbox_heads/sabl_head.py | https://arxiv.org/abs/1912.04260 | 论文地址 | -| 开发引入 | / | SSD/mmdet/models/dense_heads/paa_head.py | https://github.com/kkhoot/PAA/blob/master/paa_core | 源码实现 | -| 开发引入 | / | SSD/mmdet/models/dense_heads/atss_head.py | https://arxiv.org/abs/1912.02424 | 
论文地址 | -| 开发引入 | / | SSD/mmdet/models/necks/nasfcos_fpn.py | https://arxiv.org/abs/1906.04423 | 论文地址 | -| 开发引入 | / | SSD/mmdet/core/mask/structures.py | https://github.com/facebookresearch/detectron2/blob/ffff8acc35ea88ad1cb1806ab0f00b4c1c5dbfd9/detectron2/structures/masks.py#L387 | 源码实现 | -| 开发引入 | / | SSD/mmdet/models/dense_heads/paa_head.py | https://arxiv.org/abs/2007.08103 | 论文地址 | -| 开发引入 | / | SSD/mmdet/models/backbones/hrnet.py | https://arxiv.org/abs/1904.04514 | 论文地址 | -| 开发引入 | / | SSD/mmdet/models/losses/ae_loss.py | https://arxiv.org/abs/1808.01244 | 论文地址 | -| 开发引入 | / | SSD/mmdet/models/detectors/htc.py | https://arxiv.org/abs/1901.07518 | 论文地址 | -| 开发引入 | / | SSD/mmdet/models/roi_heads/roi_extractors/generic_roi_extractor.py | https://arxiv.org/abs/2004.13665 | 论文地址 | -| 开发引入 | / | SSD/mmdet/models/losses/balanced_l1_loss.py | https://arxiv.org/pdf/1904.02701.pdf | 论文地址 | -| 开发引入 | / | SSD/mmdet/datasets/pipelines/transforms.py | https://arxiv.org/abs/1708.04552 | 论文地址 | -| 开发引入 | / | SSD/mmdet/models/losses/gaussian_focal_loss.py | https://arxiv.org/abs/1708.02002 | 论文地址 | -| 开发引入 | / | SSD/mmdet/models/roi_heads/double_roi_head.py | https://arxiv.org/abs/1904.06493 | 论文地址 | -| 开发引入 | / | SSD/mmdet/models/roi_heads/mask_heads/fcn_mask_head.py | https://github.com/facebookresearch/detectron2/ | 源码实现 | -| 开发引入 | / | SSD/mmdet/models/losses/ae_loss.py | https://arxiv.org/abs/1611.05424 | 论文地址 | -| 开发引入 | / | SSD/mmdet/models/dense_heads/sabl_retina_head.py | https://arxiv.org/abs/1912.04260 | 论文地址 | -| 开发引入 | / | SSD/mmdet/models/dense_heads/fcos_head.py | https://arxiv.org/abs/1904.01355 | 论文地址 | +| 文件位置 | 公网地址 | 公网地址用途 | +|-------------------------------------------------------------------------------------------------------------------------|---------------------------------------------------------------------------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SSD/configs/cityscapes/faster_rcnn_r50_fpn_1x_cityscapes.py | https://open-mmlab.s3.ap-northeast-2.amazonaws.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_fpn_1x_coco/faster_rcnn_r50_fpn_1x_coco_20200130-047c8118.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SSD/configs/cityscapes/mask_rcnn_r50_fpn_1x_cityscapes.py | https://open-mmlab.s3.ap-northeast-2.amazonaws.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r50_fpn_1x_coco/mask_rcnn_r50_fpn_1x_coco_20200205-d4b0c5d6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SSD/configs/faster_rcnn/faster_rcnn_r50_fpn_1x_coco-person-bicycle-car.py | https://s3.ap-northeast-2.amazonaws.com/open-mmlab/mmdetection/models/faster_rcnn_r50_fpn_1x_20181010-3d1b3351.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SSD/docker/Dockerfile | https://openmmlab.oss-accelerate.aliyuncs.com/mmcv/dist/index.html | mmcv地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SSD/mmdet/datasets/lvis.py | http://images.cocodataset.org/ | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SSD/setup.py | openmmlab@gmail.com | 作者邮箱 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/detection/SimCLR_for_Pytorch/public_address_statement.md b/PyTorch/contrib/cv/detection/SimCLR_for_Pytorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..d74bd48aeb6f75ce8a980649ebad051002ba80a0 --- /dev/null +++ b/PyTorch/contrib/cv/detection/SimCLR_for_Pytorch/public_address_statement.md @@ -0,0 +1,3 @@ +| 文件位置 | 公网地址 | 
公网地址用途 | +|-----------------------------------------------------------------------------|---------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/SimCLR_for_Pytorch/main_8p.py | 8.8.8.8 | ip地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/detection/TextSnake/public_address_statement.md b/PyTorch/contrib/cv/detection/TextSnake/public_address_statement.md index f2852898891addd64fd4d69aa43ff9983bc90f1b..f189a6239e837877bea19e0911571858d78e5dc3 100644 --- a/PyTorch/contrib/cv/detection/TextSnake/public_address_statement.md +++ b/PyTorch/contrib/cv/detection/TextSnake/public_address_statement.md @@ -1,15 +1,11 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------|--------------------------------------| ----------------------------------- |---------| -| 开源代码引入 | https://github.com/princewang1994/TextSnake.pytorch | TextSnake/dataset/total_text/download.sh | https://drive.google.com/file/d/1bC68CzsSVTusZVvOkk7imSZSbgD1MqK2/view?usp=sharing | 下载依赖 | -| 开源代码引入 | https://github.com/princewang1994/TextSnake.pytorch | TextSnake/dataset/total_text/download.sh | https://drive.google.com/file/d/19quCaJGePvTc3yPZ7MAGNijjKfy77-ke/view?usp=sharing | 下载依赖 | -| 开源代码引入 | https://github.com/princewang1994/TextSnake.pytorch | TextSnake/dataset/total_text/gdrivedl.sh | https://docs.google.com/uc?export=download&id= | 下载依赖 | -| 开源代码引入 | https://github.com/princewang1994/TextSnake.pytorch | TextSnake/network/vgg.py | https://download.pytorch.org/models/vgg11-bbd30ac9.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/princewang1994/TextSnake.pytorch | TextSnake/network/vgg.py | https://download.pytorch.org/models/vgg13-c768596a.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/princewang1994/TextSnake.pytorch | TextSnake/network/vgg.py | https://download.pytorch.org/models/vgg16-397923af.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/princewang1994/TextSnake.pytorch | TextSnake/network/vgg.py | https://download.pytorch.org/models/vgg19-dcbb9e9d.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/princewang1994/TextSnake.pytorch | TextSnake/network/vgg.py | https://download.pytorch.org/models/vgg11_bn-6002323d.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/princewang1994/TextSnake.pytorch | TextSnake/network/vgg.py | https://download.pytorch.org/models/vgg13_bn-abd245e5.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/princewang1994/TextSnake.pytorch | TextSnake/network/vgg.py | https://download.pytorch.org/models/vgg16_bn-6c64b313.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/princewang1994/TextSnake.pytorch | TextSnake/network/vgg.py | https://download.pytorch.org/models/vgg19_bn-c79401a0.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/princewang1994/TextSnake.pytorch/dataset/total_text/gdrivedl.sh | TextSnake/dataset/total_text/gdrivedl.sh | https://github.com/matthuisman/files.matthuisman.nz/blob/master/gdrivedl | 源码实现 | -| 开源代码引入 | https://github.com/princewang1994/TextSnake.pytorch/dataset/total_text/Evaluation_Protocol/Python_scripts/Deteval.py | TextSnake/dataset/total_text/Evaluation_Protocol/Python_scripts/Deteval.py | https://github.com/cs-chan/Total-Text-Dataset/blob/master/Evaluation_Protocol/Python_scripts/Deteval.py | 源码实现 | +| 文件位置 | 公网地址 | 公网地址用途 | +|------------------------------------------------------------------------|-----------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/TextSnake/network/vgg.py | https://download.pytorch.org/models/vgg19-dcbb9e9d.pth | 
权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/TextSnake/network/vgg.py | https://download.pytorch.org/models/vgg19_bn-c79401a0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/TextSnake/network/vgg.py | https://download.pytorch.org/models/vgg16-397923af.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/TextSnake/network/vgg.py | https://download.pytorch.org/models/vgg16_bn-6c64b313.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/TextSnake/network/vgg.py | https://download.pytorch.org/models/vgg13_bn-abd245e5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/TextSnake/network/vgg.py | https://download.pytorch.org/models/vgg13-c768596a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/TextSnake/network/vgg.py | https://download.pytorch.org/models/vgg11_bn-6002323d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/TextSnake/network/vgg.py | https://download.pytorch.org/models/vgg11-bbd30ac9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/TextSnake/util/option.py | tcp://224.66.41.62:23456 | ip地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/detection/YOLACT_plus/public_address_statement.md b/PyTorch/contrib/cv/detection/YOLACT_plus/public_address_statement.md index 870189e3844f67338d5b705d80ad46bead879b2e..feabe921c3553a2a2a3bcf1a506a970b915c6544 100644 --- a/PyTorch/contrib/cv/detection/YOLACT_plus/public_address_statement.md +++ b/PyTorch/contrib/cv/detection/YOLACT_plus/public_address_statement.md @@ -1,29 +1,6 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------|---------------------------------------| ---------------------------------- |-------| -| 开源代码引入 | https://github.com/dbolya/yolact.git | YOLACT_plus/data/scripts/COCO.sh | http://images.cocodataset.org/zips/train2017.zip | 下载数据集 | -| 开源代码引入 | https://github.com/dbolya/yolact.git | YOLACT_plus/data/scripts/COCO.sh | http://images.cocodataset.org/zips/val2017.zip | 下载数据集 | -| 开源代码引入 | https://github.com/dbolya/yolact.git | YOLACT_plus/data/scripts/COCO.sh | http://images.cocodataset.org/annotations/annotations_trainval2014.zip | 下载数据集 | -| 开源代码引入 | https://github.com/dbolya/yolact.git | YOLACT_plus/data/scripts/COCO.sh | http://images.cocodataset.org/annotations/annotations_trainval2017.zip | 下载数据集 | -| 开源代码引入 | https://github.com/dbolya/yolact.git | YOLACT_plus/data/scripts/COCO_test.sh | http://images.cocodataset.org/zips/test2017.zip | 下载数据集 | -| 开源代码引入 | https://github.com/dbolya/yolact.git | YOLACT_plus/data/scripts/COCO_test.sh | http://images.cocodataset.org/annotations/image_info_test2017.zip | 下载数据集 | -| 开发引入 | / | YOLACT_plus/layers/modules/multibox_loss.py | https://arxiv.org/pdf/1512.02325.pdf | 论文地址 | -| 开源代码引入 | https://github.com/dbolya/yolact.git/layers/box_utils.py | YOLACT_plus/layers/box_utils.py | https://arxiv.org/pdf/1612.08242.pdf | 论文地址 | -| 开源代码引入 | https://github.com/dbolya/yolact.git/yolact.py | YOLACT_plus/yolact.py | https://discuss.pytorch.org/t/how-to-train-with-frozen-batchnorm/12106/8 | 相关说明 | -| 开源代码引入 | https://github.com/dbolya/yolact.git/layers/modules/multibox_loss.py | YOLACT_plus/layers/modules/multibox_loss.py | https://github.com/kuangliu/pytorch-retinanet/blob/master/utils.py | 源码实现 | -| 开源代码引入 | https://github.com/dbolya/yolact.git/yolact.py | YOLACT_plus/yolact.py | https://arxiv.org/pdf/1701.06659.pdf | 论文地址 | -| 开源代码引入 | https://github.com/dbolya/yolact.git/external/DCNv2/src/cuda/dcn_v2_im2col_cuda.h | 
YOLACT_plus/deform_conv.py | https://arxiv.org/abs/1811.11168 | 论文地址 | -| 开源代码引入 | https://github.com/dbolya/yolact.git/layers/modules/multibox_loss.py | YOLACT_plus/layers/modules/multibox_loss.py | https://github.com/clcarwin/focal_loss_pytorch/blob/master/focalloss.py | 源码实现 | -| 开源代码引入 | https://github.com/dbolya/yolact.git/layers/modules/multibox_loss.py | YOLACT_plus/layers/modules/multibox_loss.py | https://arxiv.org/pdf/1708.02002.pdf | 论文地址 | -| 开源代码引入 | https://github.com/dbolya/yolact.git/yolact.py | YOLACT_plus/yolact.py | https://arxiv.org/pdf/1612.03144.pdf | 论文地址 | -| 开源代码引入 | https://github.com/dbolya/yolact.git/yolact.py | YOLACT_plus/yolact.py | https://github.com/pytorch/pytorch/issues/17108 | 源码实现 | -| 开发引入 | / | YOLACT_plus/data/coco.py | http://mscoco.org/dataset/#detections-challenge2016 | 数据集地址 | -| 开源代码引入 | https://github.com/dbolya/yolact.git/backbone.py | YOLACT_plus/backbone.py | https://pjreddie.com/media/files/papers/YOLOv3.pdf | 论文地址 | -| 开源代码引入 | https://github.com/dbolya/yolact.git/backbone.py | YOLACT_plus/backbone.py | https://github.com/pjreddie/darknet/blob/680d3bde1924c8ee2d1c1dea54d3e56a05ca9a26/src/activations.h#L39 | 源码实现 | -| 开源代码引入 | https://github.com/dbolya/yolact.git/utils/augmentations.py | YOLACT_plus/utils/augmentations.py | https://github.com/amdegroot/ssd.pytorch/issues/68 | 源码实现 | -| 开发引入 | / | YOLACT_plus/utils/nvinfo.py | https://pypi.org/project/nvgpu/ | 相关依赖 | -| 开源代码引入 | https://github.com/dbolya/yolact.git/layers/box_utils.py | YOLACT_plus/layers/box_utils.py | https://lmb.informatik.uni-freiburg.de/Publications/2018/UB18/paper-box2pix.pdf | 论文地址 | -| 开发引入 | / | YOLACT_plus/deform_conv.py | https://github.com/open-mmlab/mmcv/blob/master/mmcv/ops/modulated_deform_conv.py | 源码实现 | -| 开源代码引入 | https://github.com/dbolya/yolact.git/layers/modules/multibox_loss.py | YOLACT_plus/layers/modules/multibox_loss.py | https://github.com/pytorch/pytorch/blob/master/modules/detectron/softmax_focal_loss_op.cu | 源码实现 | -| 开源代码引入 | https://github.com/dbolya/yolact.git/data/config.py | YOLACT_plus/data/config.py | https://arxiv.org/abs/1903.00241 | 论文地址 | -| 开源代码引入 | https://github.com/dbolya/yolact.git/layers/modules/multibox_loss.py | YOLACT_plus/data/config.py | https://arxiv.org/pdf/1708.02002.pdf | 论文地址 | -| 开源代码引入 | https://github.com/dbolya/yolact.git/eval.py | YOLACT_plus/eval.py | https://stackoverflow.com/questions/664014/what-integer-hash-function-are-good-that-accepts-an-integer-hash-key | 相关说明 | +| 文件位置 | 公网地址 | 公网地址用途 | +|--------------------------------------------------------------------------------|------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/YOLACT_plus/data/scripts/COCO.sh | http://images.cocodataset.org/zips/train2017.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/YOLACT_plus/data/scripts/COCO.sh | http://images.cocodataset.org/annotations/annotations_trainval2017.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/YOLACT_plus/data/scripts/COCO.sh | http://images.cocodataset.org/zips/val2017.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/YOLACT_plus/data/scripts/COCO.sh | http://images.cocodataset.org/annotations/annotations_trainval2014.zip | 数据集地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/detection/YOLOR/public_address_statement.md b/PyTorch/contrib/cv/detection/YOLOR/public_address_statement.md index 
ed23011ff33cbca6a448e769d57e5f3867b056d0..9bd19f80b379100219f98794af0139d62a70b9ab 100644 --- a/PyTorch/contrib/cv/detection/YOLOR/public_address_statement.md +++ b/PyTorch/contrib/cv/detection/YOLOR/public_address_statement.md @@ -1,71 +1,4 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------|--------------------------------------|----------------------------------------------------------------------|-------| -| 开源代码引入 | https://github.com/WongKinYiu/yolor | YOLOR/models/models.py | https://pjreddie.com/media/files/ | 下载依赖 | -| 开源代码引入 | https://github.com/WongKinYiu/yolor | YOLOR/models/models.py | https://drive.google.com/open?id=1LezFG5g3BCW6iYaV89B2i64cqEUZD7e0 | 下载依赖 | -| 开源代码引入 | https://github.com/WongKinYiu/yolor | YOLOR/scripts/get_coco.sh | https://github.com/ultralytics/yolov5/releases/download/v1.0/ | 下载依赖 | -| 开源代码引入 | https://github.com/WongKinYiu/yolor | YOLOR/scripts/get_coco.sh | http://images.cocodataset.org/zips/ | 下载依赖 | -| 开源代码引入 | https://github.com/WongKinYiu/yolor | YOLOR/scripts/get_pretrain.sh | https://drive.google.com/uc?export=download&id=1Tdn3yqpZ79X7R1Ql0zNlNScB1Dv9Fp76 | 下载依赖 | -| 开源代码引入 | https://github.com/WongKinYiu/yolor | YOLOR/scripts/get_pretrain.sh | https://drive.google.com/uc?export=download&confirm | 下载依赖 | -| 开源代码引入 | https://github.com/WongKinYiu/yolor | YOLOR/scripts/get_pretrain.sh | https://drive.google.com/uc?export=download&id=1UflcHlN5ERPdhahMivQYCbWWw7d2wY7U | 下载依赖 | -| 开源代码引入 | https://github.com/WongKinYiu/yolor | YOLOR/utils/datasets.py | https://github.com/ultralytics/yolov5/wiki/Train-Custom-Data | 下载依赖 | -| 开源代码引入 | https://github.com/WongKinYiu/yolor | YOLOR/utils/google_utils.py | https://github.com/WongKinYiu/yolor/releases/ | 下载依赖 | -| 开源代码引入 | https://github.com/WongKinYiu/yolor | YOLOR/utils/google_utils.py | https://github.com/WongKinYiu/yolor/releases/download/v1.0/ | 下载依赖 | -| 开源代码引入 | https://github.com/WongKinYiu/yolor/models/models.py | YOLOR/models/models.py | https://arxiv.org/abs/1911.09516 | 论文地址 | -| 开源代码引入 | https://github.com/WongKinYiu/yolor/utils/layers.py | YOLOR/utils/layers.py | https://arxiv.org/abs/1907.09595 | 论文地址 | -| 开发引入 | / | YOLOR/utils/google_utils.py | https://drive.google.com/uc?export=download&id=%s | 相关依赖 | -| 开源代码引入 | https://github.com/WongKinYiu/yolor/utils/google_utils.py | YOLOR/utils/google_utils.py | https://cloud.google.com/storage/docs/uploading-objects#storage-upload-object-python | 相关依赖 | -| 开源代码引入 | https://github.com/WongKinYiu/yolor/utils/layers.py | YOLOR/utils/layers.py | https://github.com/tensorflow/models/blob/master/research/slim/nets/mobilenet/mobilenet.py | 源码实现 | -| 开源代码引入 | https://github.com/WongKinYiu/yolor/utils/datasets.py | YOLOR/utils/datasets.py | https://github.com/ultralytics/yolov3/issues/232 | 源码实现 | -| 开源代码引入 | https://github.com/WongKinYiu/yolor/models/models.py | YOLOR/models/models.py | https://github.com/ultralytics/yolov3/issues/931 | 源码实现 | -| 开源代码引入 | https://github.com/WongKinYiu/yolor/tune.py | YOLOR/train_mp.py | https://github.com/ultralytics/yolov5/pull/1120 | 源码实现 | -| 开源代码引入 | https://github.com/WongKinYiu/yolor/models/models.py | YOLOR/models/models.py | https://github.com/ultralytics/yolov3/issues/441 | 源码实现 | -| 开源代码引入 | https://github.com/WongKinYiu/yolor/tune.py | YOLOR/train_mp.py | https://pytorch.org/docs/stable/_modules/torch/optim/lr_scheduler.html#OneCycleLR | 相关说明 | -| 开源代码引入 | https://github.com/WongKinYiu/yolor/utils/general.py | YOLOR/utils/general.py | 
https://arxiv.org/pdf/1902.09630.pdf | 论文地址 | -| 开源代码引入 | https://github.com/WongKinYiu/yolor/utils/general.py | YOLOR/utils/general.py | https://arxiv.org/abs/2101.08158 | 论文地址 | -| 开源代码引入 | https://github.com/WongKinYiu/yolor/test.py | YOLOR/test.py | https://github.com/cocodataset/cocoapi/blob/master/PythonAPI/pycocoEvalDemo.ipynb | 源码实现 | -| 开源代码引入 | https://github.com/WongKinYiu/yolor/tune.py | YOLOR/train.py | https://github.com/ultralytics/yolov5/pull/1120 | 源码实现 | -| 开源代码引入 | https://github.com/WongKinYiu/yolor/tune.py | YOLOR/train.py | https://pytorch.org/docs/stable/_modules/torch/optim/lr_scheduler.html#OneCycleLR | 相关说明 | -| 开源代码引入 | https://github.com/WongKinYiu/yolor/utils/general.py | YOLOR/utils/general.py | https://arxiv.org/abs/1911.08287v1 | 论文地址 | -| 开发引入 | / | YOLOR/utils/general.py | https://github.com/ultralytics/yolov3/issues/1139 | 源码实现 | -| 开发引入 | / | YOLOR/utils/loss.py | https://arxiv.org/pdf/1902.04103.pdf | 论文地址 | -| 开源代码引入 | https://github.com/WongKinYiu/yolor/utils/layers.py | YOLOR/utils/layers.py | https://arxiv.org/pdf/1905.02244.pdf | 论文地址 | -| 开源代码引入 | https://github.com/WongKinYiu/yolor/scripts/get_coco.sh | YOLOR/scripts/get_coco.sh | http://cocodataset.org | 数据集地址 | -| 开源代码引入 | https://github.com/WongKinYiu/yolor/utils/layers.py | YOLOR/utils/activations.py | https://arxiv.org/pdf/1905.02244.pdf | 论文地址 | -| 开源代码引入 | https://github.com/WongKinYiu/yolor/tune.py | YOLOR/train.py | https://arxiv.org/pdf/1812.01187.pdf | 论文地址 | -| 开源代码引入 | https://github.com/WongKinYiu/yolor/utils/metrics.py | YOLOR/utils/metrics.py | https://github.com/rbgirshick/py-faster-rcnn | 源码实现 | -| 开源代码引入 | https://github.com/WongKinYiu/yolor/models/models.py | YOLOR/models/models.py | https://pytorch.org/docs/stable/torchvision/models.html#classification | 相关说明 | -| 开源代码引入 | https://github.com/WongKinYiu/yolor/models/models.py | YOLOR/models/models.py | https://github.com/AlexeyAB/darknet/issues/2914#issuecomment-496675346 | 源码实现 | -| 开发引入 | / | YOLOR/train_mp.py | https://github.com/NVIDIA/apex | 相关依赖 | -| 开发引入 | / | YOLOR/utils/torch_utils.py | https://github.com/NVIDIA/apex | 相关依赖 | -| 开源代码引入 | https://github.com/WongKinYiu/yolor/utils/torch_utils.py | YOLOR/utils/torch_utils.py | https://github.com/rwightman/pytorch-image-models | 源码实现 | -| 开源代码引入 | https://github.com/WongKinYiu/yolor/utils/datasets.py | YOLOR/utils/datasets.py | https://arxiv.org/abs/1708.04552 | 论文地址 | -| 开源代码引入 | https://github.com/WongKinYiu/yolor/tune.py | YOLOR/train_mp.py | https://arxiv.org/pdf/1812.01187.pdf | 论文地址 | -| 开源代码引入 | https://github.com/WongKinYiu/yolor/utils/metrics.py | YOLOR/utils/metrics.py | https://github.com/rafaelpadilla/Object-Detection-Metrics | 源码实现 | -| 开源代码引入 | https://github.com/WongKinYiu/yolor/utils/google_utils.py | YOLOR/utils/google_utils.py | https://cloud.google.com/storage/docs/gsutil/commands/du | 相关说明 | -| 开源代码引入 | https://github.com/WongKinYiu/yolor/utils/general.py | YOLOR/utils/general.py | https://github.com/Zzh-tju/DIoU-SSD-pytorch/blob/master/utils/box/box_utils.py#L47 | 源码实现 | -| 开源代码引入 | https://github.com/WongKinYiu/yolor/models/export.py | YOLOR/models/export.py | https://github.com/lutzroeder/netron | 源码实现 | -| 开发引入 | / | YOLOR/utils/datasets.py | http://wmccpinetop.axiscam.net/mjpg/video.mjpg | 数据集地址 | -| 开源代码引入 | https://github.com/WongKinYiu/yolor/utils/general.py | YOLOR/utils/general.py | https://tech.amikelive.com/node-718/what-object-categories-labels-are-in-coco-dataset/ | 数据集地址 | -| 开源代码引入 | https://github.com/WongKinYiu/yolor/utils/plots.py | 
YOLOR/utils/plots.py | https://github.com/ultralytics/yolov3/issues/168 | 源码实现 | -| 开源代码引入 | https://github.com/WongKinYiu/yolor/utils/datasets.py | YOLOR/utils/datasets.py | https://arxiv.org/pdf/1710.09412.pdf | 论文地址 | -| 开源代码引入 | https://github.com/WongKinYiu/yolor/utils/general.py | YOLOR/utils/general.py | https://github.com/pytorch/vision/blob/master/torchvision/ops/boxes.py | 源码实现 | -| 开源代码引入 | https://github.com/WongKinYiu/yolor/utils/loss.py | YOLOR/utils/loss.py | https://github.com/ultralytics/yolov3/issues/238#issuecomment-598028441 | 源码实现 | -| 开源代码引入 | https://github.com/WongKinYiu/yolor/utils/metrics.py | YOLOR/utils/metrics.py | https://github.com/ultralytics/yolov3/issues/898 | 源码实现 | -| 开发引入 | / | YOLOR/utils/plots.py | https://storage.googleapis.com/%s/results%g.txt | 相关说明 | -| 开源代码引入 | https://github.com/WongKinYiu/yolor/utils/plots.py | YOLOR/utils/plots.py | https://stackoverflow.com/questions/28536191/how-to-filter-smooth-with-scipy-numpy | 相关说明 | -| 开源代码引入 | https://github.com/WongKinYiu/yolor/utils/layers.py | YOLOR/utils/layers.py | https://github.com/digantamisra98/Mish | 源码实现 | -| 开源代码引入 | https://github.com/WongKinYiu/yolor/utils/layers.py | YOLOR/utils/activations.py | https://github.com/digantamisra98/Mish | 源码实现 | -| 开发引入 | / | YOLOR/scripts/get_pretrain.sh | https://drive.google.com/uc?export=download&confirm= | 相关依赖 | -| 开源代码引入 | https://github.com/WongKinYiu/yolor/utils/layers.py | YOLOR/utils/layers.py | https://arxiv.org/abs/1911.09070 | 论文地址 | -| 开发引入 | / | YOLOR/models/models.py | https://arxiv.org/pdf/1708.02002.pdf | 论文地址 | -| 开源代码引入 | https://github.com/WongKinYiu/yolor/utils/google_utils.py | YOLOR/utils/google_utils.py | https://cloud.google.com/storage/docs/reference/libraries | 相关依赖 | -| 开发引入 | / | YOLOR/utils/parse_config.py | https://github.com/ultralytics/yolov3/issues/631 | 源码实现 | -| 开源代码引入 | https://github.com/WongKinYiu/yolor/utils/activations.py | YOLOR/utils/activations.py | https://arxiv.org/abs/2007.11824 | 论文地址 | -| 开源代码引入 | https://github.com/WongKinYiu/yolor/utils/loss.py | YOLOR/utils/loss.py | https://github.com/tensorflow/addons/blob/v0.7.1/tensorflow_addons/losses/focal_loss.py | 源码实现 | -| 开源代码引入 | https://github.com/WongKinYiu/yolor/utils/torch_utils.py | YOLOR/utils/torch_utils.py | https://pytorch.org/docs/stable/notes/randomness.html | 相关说明 | -| 开源代码引入 | https://github.com/WongKinYiu/yolor/utils/torch_utils.py | YOLOR/utils/torch_utils.py | https://www.tensorflow.org/api_docs/python/tf/train/ExponentialMovingAverage | 相关说明 | -| 开源代码引入 | https://github.com/WongKinYiu/yolor/utils/torch_utils.py | YOLOR/utils/torch_utils.py | https://tehnokv.com/posts/fusing-batchnorm-and-conv/ | 相关说明 | -| 开源代码引入 | https://github.com/WongKinYiu/yolor/utils/plots.py | YOLOR/utils/plots.py | https://stackoverflow.com/questions/51350872/python-from-color-name-to-rgb | 相关说明 | -| 开发引入 | / | YOLOR/train.py | https://github.com/NVIDIA/apex | 相关依赖 | -| 开发引入 | / | YOLOR/requirements-GPU.txt | https://github.com/NVIDIA/apex | 相关依赖 | -| 开发引入 | / | YOLOR/requirements-GPU.txt | https://github.com/onnx/onnx#linux-and-macos | 相关依赖 | +| 文件位置 | 公网地址 | 公网地址用途 | +|-------------------------------------------------------------------------|-------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/YOLOR/models/models.py | https://pjreddie.com/media/files/ | 下载依赖 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/YOLOR/scripts/get_coco.sh | http://images.cocodataset.org/zips/ | 数据集地址 | \ No newline at end of file diff --git 
a/PyTorch/contrib/cv/detection/centernet2/public_address_statement.md b/PyTorch/contrib/cv/detection/centernet2/public_address_statement.md index 493707103a6d80068be162479324a9c80764772d..ac10e29e31c63da178944a33f09d9ffe12023325 100644 --- a/PyTorch/contrib/cv/detection/centernet2/public_address_statement.md +++ b/PyTorch/contrib/cv/detection/centernet2/public_address_statement.md @@ -1,157 +1,16 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|---------------------------------------------------------------------------------------------------------------------------|----------------------------------------------|--------------------------------------------|------------| -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/datasets/prepare_for_tests.sh | centernet2/datasets/prepare_for_tests.sh | https://dl.fbaipublicfiles.com/detectron2 | 下载数据集 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/datasets/prepare_panoptic_fpn.py | centernet2/datasets/prepare_panoptic_fpn.py | https://dl.fbaipublicfiles.com/detectron2/ | 下载数据集 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/checkpoint/catalog.py | centernet2/detectron2/checkpoint/catalog.py | https://dl.fbaipublicfiles.com/detectron | 下载权重文件 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/model_zoo/model_zoo.py | centernet2/detectron2/model_zoo/model_zoo.py | https://dl.fbaipublicfiles.com/detectron2/ | 下载权重文件 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/utils/file_io.py | centernet2/detectron2/utils/file_io.py | https://dl.fbaipublicfiles.com/detectron2/ | 源码地址 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/dev/packaging/build_wheel.sh | centernet2/dev/packaging/build_wheel.sh | https://download.pytorch.org/whl/ | 下载第三方包 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/dev/packaging/gen_install_table.py | centernet2/dev/packaging/gen_install_table.py | https://dl.fbaipublicfiles.com/detectron2/wheels | 下载第三方包 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/docker/deploy.Dockerfile | centernet2/docker/deploy.Dockerfile | https://github.com/protocolbuffers/protobuf/releases/download/v3.11.4/protobuf-cpp-3.11.4.tar.gz | 下载第三方包 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/docker/deploy.Dockerfile | centernet2/docker/deploy.Dockerfile | https://github.com/pytorch/vision/ | 下载第三方包 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/docker/Dockerfile | centernet2/docker/Dockerfile | https://bootstrap.pypa.io/get-pip.py | 下载第三方包 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/docker/Dockerfile | centernet2/docker/Dockerfile | https://download.pytorch.org/whl/cu101/torch_stable.html | 下载第三方包 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/docker/Dockerfile | centernet2/docker/Dockerfile | https://github.com/facebookresearch/fvcore | 下载第三方包 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/docker/Dockerfile | 
centernet2/docker/Dockerfile | https://github.com/facebookresearch/detectron2 | 下载第三方包 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/docs/conf.py | centernet2/docs/conf.py | https://github.com/facebookresearch/detectron2/blob/master/ | 源码地址 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/docs/conf.py | centernet2/docs/conf.py | https://docs.python.org/3.6 | 第三方包源码说明文档 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/docs/conf.py | centernet2/docs/conf.py | https://docs.scipy.org/doc/numpy/ | 第三方包源码说明文档 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/docs/conf.py | centernet2/docs/conf.py | https://pytorch.org/docs/master/ | 第三方包源码说明文档 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/docs/conf.py | centernet2/docs/conf.py | https://arxiv.org/abs/ | 论文地址 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/docs/requirements.txt | centernet2/docs/requirements.txt | https://download.pytorch.org/whl/cpu/torch-1.7.0%2Bcpu-cp37-cp37m-linux_x86_64.whl | 下载第三方包 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/docs/requirements.txt | centernet2/docs/requirements.txt | https://download.pytorch.org/whl/cpu/torchvision-0.8.1%2Bcpu-cp37-cp37m-linux_x86_64.whl | 下载第三方包 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/setup.py | centernet2/setup.py | https://github.com/facebookresearch/detectron2 | 安装第三方包 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/setup.py | centernet2/setup.py | https://github.com/cocodataset/panopticapi/archive/master.zip | 安装第三方包 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/tests/data/test_coco_evaluation.py | centernet2/tests/data/test_coco_evaluation.py | http://images.cocodataset.org/val2017/000000000285.jpg | 下载测试数据 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/tests/data/test_coco_evaluation.py | centernet2/tests/data/test_coco_evaluation.py | http://farm8.staticflickr.com/7434/9138147604_c6225224b8_z.jpg | 下载测试数据 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/tests/data/test_coco_evaluation.py | centernet2/tests/data/test_coco_evaluation.py | http://images.cocodataset.org/val2017/000000000139.jpg | 下载测试数据 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/tests/data/test_coco_evaluation.py | centernet2/tests/data/test_coco_evaluation.py | http://farm9.staticflickr.com/8035/8024364858_9c41dc1666_z.jpg | 下载测试数据 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/tests/test_model_zoo.py | centernet2/tests/test_model_zoo.py | https://dl.fbaipublicfiles.com/detectron2/Misc/scratch_mask_rcnn_R_50_FPN_3x_gn/138602908/model_final_01ca85.pkl | 下载权重文件 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/layers/csrc/vision.cpp | centernet2/detectron2/layers/csrc/vision.cpp | https://github.com/pytorch/pytorch/blob/master/aten/src/ATen/cuda/detail/CUDAHooks.cpp#L231 | 源码实现 | -| 开源代码引入 | 
https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/data/datasets/lvis.py | centernet2/detectron2/data/datasets/coco.py | http://farm6.staticflickr.com/5454/9413846304_881d5e5c3b_z.jpg | 图片地址 | -| 开发引入 | / | centernet2/detectron2/evaluation/lvis_evaluation.py | https://github.com/facebookresearch/Detectron/blob/a6a835f5b8208c45d0dce217ce9bbda915f44df7/detectron/datasets/json_dataset_evaluator.py#L255 | 源码实现 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/structures/masks.py | centernet2/detectron2/structures/masks.py | https://stackoverflow.com/questions/24467972/calculate-area-of-polygon-given-x-y-coordinates | 相关说明 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/layers/wrappers.py | centernet2/detectron2/layers/wrappers.py | https://github.com/pytorch/pytorch/issues/12013 | 相关说明 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/layers/wrappers.py | centernet2/detectron2/layers/wrappers.py | https://github.com/pytorch/pytorch/issues/40507 | 相关说明 | -| 开发引入 | / | centernet2/detectron2/evaluation/coco_evaluation.py | http://cocodataset.org/#detection-eval | 数据集地址 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/data/detection_utils.py | centernet2/detectron2/data/detection_utils.py | https://en.wikipedia.org/wiki/YUV#SDTV_with_BT.601 | 相关说明 | -| 开发引入 | / | centernet2/tools/convert-torchvision-to-d2.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 预训练模型 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/data/datasets/lvis.py | centernet2/detectron2/data/datasets/lvis.py | http://images.cocodataset.org/train2017/000000155379.jpg | 图片地址 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/layers/csrc/vision.cpp | centernet2/detectron2/layers/csrc/vision.cpp | https://github.com/pytorch/pytorch/blob/master/aten/src/ATen/Version.cpp | 源码实现 | -| 开发引入 | / | centernet2/detectron2/layers/aspp.py | https://github.com/tensorflow/models/blob/21b73d22f3ed05b650e85ac50849408dd36de32e/research/deeplab/model.py#L532 | 源码实现 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/tests/modeling/test_matcher.py | centernet2/detectron2/export/torchscript_patch.py | https://github.com/pytorch/pytorch/issues/38964 | 相关说明 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/engine/launch.py | centernet2/detectron2/engine/launch.py | https://github.com/pytorch/pytorch/pull/14391 | 源码实现 | -| 开发引入 | / | centernet2/detectron2/projects/point_rend/color_augmentation.py | https://github.com/weiliu89/caffe/blob | 源码实现 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/data/detection_utils.py | centernet2/detectron2/data/detection_utils.py | https://github.com/python-pillow/Pillow/issues/3973 | 相关说明 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/projects/CenterNet2/centernet/modeling/backbone/res2net.py | centernet2/projects/CenterNet2/centernet/modeling/backbone/res2net.py | https://github.com/Res2Net/Res2Net-detectron2/blob/master/detectron2/modeling/backbone/resnet.py | 源码实现 | 
-| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/export/shared.py | centernet2/detectron2/export/shared.py | https://www.geeksforgeeks.org/find-paths-given-source-destination/ | 相关说明 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/utils/env.py | centernet2/detectron2/utils/env.py | https://stackoverflow.com/questions/67631/how-to-import-a-module-given-the-full-path | 相关说明 | -| 开发引入 | / | centernet2/detectron2/projects/point_rend/color_augmentation.py | https://github.com/chainer/chainercv/blob | 源码实现 | -| 开发引入 | / | centernet2/detectron2/projects/deeplab/lr_scheduler.py | https://github.com/tensorflow/models/blob/21b73d22f3ed05b650e85ac50849408dd36de32e/research/deeplab/utils/train_utils.py#L337 | 源码实现 | -| 开发引入 | / | centernet2/detectron2/data/datasets/cityscapes.py | https://github.com/mcordts/cityscapesScripts/blob/master/cityscapesscripts/preparation/json2instanceImg.py | 源码实现 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/setup.py | centernet2/setup.py | https://github.com/skvark/opencv-python | 源码实现 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/.flake8 | centernet2/detectron2/modeling/roi_heads/roi_heads.py | https://github.com/pytorch/pytorch/issues/41448 | 相关说明 | -| 开发引入 | / | centernet2/detectron2/engine/train_loop.py | https://arxiv.org/abs/2006.15704 | 论文地址 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/structures/boxes.py | centernet2/detectron2/structures/boxes.py | https://github.com/pytorch/pytorch/issues/18627 | 相关说明 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/data/datasets/coco.py | centernet2/detectron2/data/datasets/coco.py | https://detectron2.readthedocs.io/tutorials/datasets.html#register-a-dataset | 数据集地址 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/tests/layers/test_roi_align.py | centernet2/tests/layers/test_roi_align.py | https://github.com/tensorflow/tensorflow/issues/26278 | 相关说明 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/modeling/poolers.py | centernet2/detectron2/modeling/poolers.py | https://github.com/pytorch/pytorch/issues/41412 | 相关说明 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/config/defaults.py | centernet2/detectron2/config/defaults.py | https://pillow.readthedocs.io/en/stable/handbook/concepts.html#concept-modes | 相关说明 | -| 开发引入 | / | centernet2/detectron2/layers/wrappers.py | https://github.com/pytorch/pytorch/issues/34202 | 相关说明 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/data/datasets/lvis.py | centernet2/detectron2/data/datasets/lvis.py | http://farm6.staticflickr.com/5454/9413846304_881d5e5c3b_z.jpg | 图片地址 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/tests/test_export_torchscript.py | centernet2/tests/test_export_torchscript.py | https://github.com/pytorch/pytorch/issues/46944 | 相关说明 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/modeling/proposal_generator/rrpn.py | 
centernet2/detectron2/modeling/proposal_generator/proposal_utils.py | https://github.com/facebookresearch/Detectron/issues/459 | 相关说明 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/data/datasets/coco.py | centernet2/detectron2/data/datasets/coco.py | https://github.com/facebookresearch/detectron2/pull/175#issuecomment-551202163 | 源码实现 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/docs/conf.py | centernet2/docs/conf.py | https://github.com/readthedocs/recommonmark/blob/ddd56e7717e9745f11300059e4268e204138a6b1/recommonmark/parser.py#L152-L155 | 源码实现 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/docs/tutorials/datasets.md | centernet2/detectron2/evaluation/coco_evaluation.py | http://cocodataset.org/#keypoints-eval | 数据集地址 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/dev/packaging/pkg_helpers.bash | centernet2/dev/packaging/pkg_helpers.bash | https://github.com/pytorch/pytorch/blob/master/torch/utils/cpp_extension.py#L1363 | 源码实现 | -| 开发引入 | / | centernet2/tests/data/test_coco_evaluation.py | http://images.cocodataset.org/val2017/000000000285.jpg","http://farm8.staticflickr.com/7434/9138147604_c6225224b8_z.jpg","http://images.cocodataset.org/val2017/000000000139.jpg","http://farm9.staticflickr.com/8035/8024364858_9c41dc1666_z.jpg | 图片地址 | -| 开发引入 | / | centernet2/detectron2/evaluation/cityscapes_evaluation.py | https://github.com/mcordts/cityscapesScripts/blob/master/cityscapesscripts/evaluation/evalPixelLevelSemanticLabeling.py | 源码实现 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/projects/CenterNet2/centernet/modeling/backbone/bifpn.py | centernet2/projects/CenterNet2/centernet/modeling/backbone/bifpn.py | https://arxiv.org/abs/1710.05941 | 论文地址 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/modeling/proposal_generator/rrpn.py | centernet2/detectron2/modeling/proposal_generator/rrpn.py | https://github.com/pytorch/pytorch/issues/22812 | 相关说明 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/layers/batch_norm.py | centernet2/detectron2/layers/batch_norm.py | https://github.com/pytorch/pytorch/blob/master/torch/nn/modules/batchnorm.py | 源码实现 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/tests/modeling/test_roi_pooler.py | centernet2/tests/modeling/test_roi_pooler.py | https://github.com/pytorch/pytorch/issues/49852 | 相关说明 | -| 开发引入 | / | centernet2/detectron2/evaluation/coco_evaluation.py | https://github.com/facebookresearch/Detectron/blob/a6a835f5b8208c45d0dce217ce9bbda915f44df7/detectron/datasets/json_dataset_evaluator.py#L222-L252 | 源码实现 | -| 开发引入 | / | centernet2/detectron2/projects/deeplab/loss.py | https://github.com/tensorflow/models/blob/bd488858d610e44df69da6f89277e9de8a03722c/research/deeplab/utils/train_utils.py#L33 | 源码实现 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/engine/train_loop.py | centernet2/detectron2/engine/train_loop.py | http://engineering.hearsaysocial.com/2013/06/16/circular-references-in-python/ | 相关说明 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/engine/launch.py | 
centernet2/detectron2/engine/launch.py | https://github.com/facebookresearch/maskrcnn-benchmark/issues/172 | 相关说明 | -| 开发引入 | / | centernet2/detectron2/modeling/roi_heads/roi_heads.py | https://github.com/pytorch/pytorch/issues/46703 | 相关说明 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/docs/conf.py | centernet2/docs/conf.py | http://www.sphinx-doc.org/en/master/config | 相关说明 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/evaluation/sem_seg_evaluation.py | centernet2/detectron2/evaluation/sem_seg_evaluation.py | http://cocodataset.org/#stuff-eval | 相关说明 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/data/transforms/transform.py | centernet2/detectron2/data/transforms/transform.py | https://pillow.readthedocs.io/en/latest/PIL.html#PIL.ImageTransform.ExtentTransform | 相关说明 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/data/transforms/augmentation_impl.py | centernet2/detectron2/data/transforms/augmentation_impl.py | https://pillow.readthedocs.io/en/3.0.x/reference/ImageEnhance.html | 相关说明 | -| 开发引入 | / | centernet2/detectron2/projects/panoptic_deeplab/target_generator.py | https://github.com/mcordts/cityscapesScripts/blob/master/cityscapesscripts/preparation/createPanopticImgs.py | 源码实现 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/evaluation/sem_seg_evaluation.py | centernet2/detectron2/evaluation/sem_seg_evaluation.py | http://cocodataset.org/#format-results | 相关说明 | -| 开发引入 | / | centernet2/detectron2/projects/panoptic_deeplab/target_generator.py | https://github.com/bowenc0221/panoptic-deeplab/blob/aa934324b55a34ce95fea143aea1cb7a6dbe04bd/segmentation/data/transforms/target_transforms.py#L11 | 源码实现 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/modeling/proposal_generator/rrpn.py | centernet2/detectron2/modeling/proposal_generator/rrpn.py | https://github.com/facebookresearch/Detectron/issues/459 | 相关说明 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/projects/__init__.py | centernet2/detectron2/projects/__init__.py | https://github.com/pypa/setuptools/issues/230 | 相关说明 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/projects/CenterNet2/centernet/modeling/layers/iou_loss.py | centernet2/projects/CenterNet2/centernet/modeling/layers/iou_loss.py | https://arxiv.org/abs/1902.09630 | 论文地址 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/projects/CenterNet2/centernet/modeling/backbone/dlafpn.py | centernet2/projects/CenterNet2/centernet/modeling/backbone/dla.py | http://dl.yf.io/dla/models | 预训练模型 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/data/detection_utils.py | centernet2/detectron2/data/detection_utils.py | https://github.com/python-pillow/Pillow/blob/7.1.2/src/PIL/ImageOps.py#L527 | 源码实现 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/projects/CenterNet2/centernet/modeling/backbone/bifpn_fcos.py | centernet2/projects/CenterNet2/centernet/modeling/backbone/bifpn_fcos.py | 
https://github.com/aim-uofa/AdelaiDet/blob/master/adet/modeling/backbone/bifpn.py | 源码实现 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/projects/CenterNet2/centernet/modeling/backbone/dlafpn.py | centernet2/projects/CenterNet2/centernet/modeling/backbone/dlafpn.py | http://dl.yf.io/dla/models | 预训练模型 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/data/transforms/transform.py | centernet2/detectron2/data/transforms/augmentation.py | https://detectron2.readthedocs.io/tutorials/augmentation.html | 相关说明 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/setup.py | centernet2/setup.py | https://github.com/ppwwyyxx/cocoapi | 源码实现 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/data/datasets/builtin_meta.py | centernet2/detectron2/data/datasets/builtin_meta.py | https://github.com/cocodataset/panopticapi/blob/master/panoptic_coco_categories.json | 源码实现 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/datasets/prepare_cocofied_lvis.py | centernet2/datasets/prepare_cocofied_lvis.py | https://github.com/lvis-dataset/lvis-api/blob/master/data/coco_to_synset.json | 源码实现 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/dev/packaging/build_wheel.sh | centernet2/dev/packaging/build_wheel.sh | https://github.com/NVIDIA/nvidia-docker/issues/854 | 相关说明 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/structures/image_list.py | centernet2/detectron2/structures/image_list.py | https://github.com/pytorch/pytorch/issues/42448 | 相关说明 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/layers/csrc/deformable/deform_conv_cuda.cu | centernet2/detectron2/layers/csrc/deformable/deform_conv_cuda.cu | https://github.com/open-mmlab/mmdetection/blob/master/mmdet/ops/dcn/src/deform_conv_cuda.cpp | 源码实现 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/docs/notes/compatibility.md | centernet2/detectron2/modeling/anchor_generator.py | https://github.com/facebookresearch/Detectron/issues/227 | 相关说明 | -| 开发引入 | / | centernet2/detectron2/evaluation/coco_evaluation.py | https://github.com/facebookresearch/Detectron/blob/a6a835f5b8208c45d0dce217ce9bbda915f44df7/detectron/datasets/json_dataset_evaluator.py#L255 | 源码实现 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/data/detection_utils.py | centernet2/detectron2/data/detection_utils.py | https://www.exiv2.org/tags.html | 相关说明 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/utils/logger.py | centernet2/detectron2/utils/logger.py | https://github.com/abseil/abseil-py/blob/master/absl/logging/__init__.py | 源码实现 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/data/transforms/transform.py | centernet2/detectron2/data/transforms/transform.py | https://github.com/opencv/opencv/issues/11784 | 相关说明 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/tests/data/test_detection_utils.py | centernet2/tests/data/test_detection_utils.py | 
https://github.com/recurser/exif-orientation-examples/raw/master/Landscape_5.jpg | 源码实现 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/evaluation/pascal_voc_evaluation.py | centernet2/detectron2/evaluation/pascal_voc_evaluation.py | https://github.com/rbgirshick/py-faster-rcnn/blob/master/lib/datasets/voc_eval.py | 源码实现 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/data/transforms/transform.py | centernet2/detectron2/data/transforms/transform.py | https://detectron2.readthedocs.io/tutorials/augmentation.html | 相关说明 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/projects/CenterNet2/centernet/modeling/roi_heads/custom_fast_rcnn.py | centernet2/projects/CenterNet2/centernet/modeling/roi_heads/custom_fast_rcnn.py | https://github.com/tztztztztz/eql.detectron2/blob/master/projects/EQL/eql/fast_rcnn.py | 源码实现 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/data/datasets/coco.py | centernet2/detectron2/data/datasets/coco.py | http://cocodataset.org/#format-data | 相关说明 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/tools/deploy/README.md | centernet2/tests/test_export_torchscript.py | https://detectron2.readthedocs.io/tutorials/deployment.html | 相关说明 | -| 开发引入 | / | centernet2/detectron2/evaluation/cityscapes_evaluation.py | https://github.com/mcordts/cityscapesScripts/blob/master/cityscapesscripts/evaluation/evalInstanceLevelSemanticLabeling.py | 源码实现 | -| 开发引入 | / | centernet2/detectron2/modeling/roi_heads/roi_heads.py | https://github.com/pytorch/pytorch/issues/43942 | 相关说明 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/data/detection_utils.py | centernet2/detectron2/data/detection_utils.py | https://github.com/facebookresearch/detectron2/issues/1885 | 相关说明 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/layers/csrc/deformable/deform_conv_cuda.cu | centernet2/detectron2/layers/csrc/deformable/deform_conv_cuda.cu | https://github.com/chengdazhi/Deformable-Convolution-V2-PyTorch/blob/mmdetection/mmdet/ops/dcn/src/deform_conv_cuda.c | 源码实现 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/modeling/proposal_generator/rrpn.py | centernet2/detectron2/modeling/proposal_generator/proposal_utils.py | https://github.com/pytorch/pytorch/issues/22812 | 相关说明 | -| 开发引入 | / | centernet2/detectron2/structures/boxes.py | https://github.com/pytorch/pytorch/issues/47405 | 相关说明 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/layers/wrappers.py | centernet2/detectron2/layers/wrappers.py | https://github.com/pytorch/pytorch/issues/38718 | 相关说明 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/tests/modeling/test_rpn.py | centernet2/tests/modeling/test_rpn.py | https://github.com/pytorch/pytorch/issues/46964 | 相关说明 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/.flake8 | centernet2/.flake8 | https://github.com/pytorch/pytorch/issues/41448 | 相关说明 | -| 开发引入 | / | centernet2/detectron2/engine/defaults.py | https://pytorch.org/docs/stable/distributed.html | 相关说明 | -| 
开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/config/defaults.py | centernet2/detectron2/config/defaults.py | https://arxiv.org/abs/1811.11168 | 论文地址 | -| 开发引入 | / | centernet2/detectron2/structures/keypoints.py | https://github.com/pytorch/pytorch/pull/41371 | 源码实现 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/projects/CenterNet2/centernet/modeling/layers/heatmap_focal_loss.py | centernet2/projects/CenterNet2/centernet/modeling/layers/heatmap_focal_loss.py | https://arxiv.org/abs/1708.02002 | 论文地址 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/layers/roi_align.py | centernet2/detectron2/layers/roi_align_old.py | https://github.com/pytorch/vision/pull/2438 | 源码实现 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/export/torchscript_patch.py | centernet2/detectron2/export/torchscript_patch.py | https://pytorch.org/docs/stable/jit_language_reference.html#optional-type-refinement | 相关说明 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/docs/tutorials/datasets.md | centernet2/detectron2/config/defaults.py | http://cocodataset.org/#keypoints-eval | 数据集地址 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/model_zoo/__init__.py | centernet2/detectron2/model_zoo/__init__.py | https://github.com/facebookresearch/detectron2/blob/master/MODEL_ZOO.md | 源码实现 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/modeling/proposal_generator/proposal_utils.py | centernet2/detectron2/structures/image_list.py | https://github.com/pytorch/pytorch/issues/47379 | 相关说明 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/projects/CenterNet2/centernet/modeling/backbone/bifpn.py | centernet2/projects/CenterNet2/centernet/modeling/backbone/bifpn.py | https://github.com/rwightman/efficientdet-pytorch/blob/master/effdet/efficientdet.py | 源码实现 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/data/transforms/transform.py | centernet2/detectron2/data/transforms/transform.py | https://pillow.readthedocs.io/en/stable/ | 相关说明 | -| 开发引入 | / | centernet2/detectron2/export/torchscript_patch.py | https://docs.python.org/3/library/importlib.html#importing-a-source-file-directly | 相关说明 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/projects/CenterNet2/centernet/modeling/backbone/dlafpn.py | centernet2/projects/CenterNet2/centernet/modeling/backbone/dlafpn.py | https://github.com/ucbdrive/dla/blob/master/dla.py | 源码实现 | -| 开发引入 | / | centernet2/docker/Dockerfile | http://images.cocodataset.org/val2017/000000439715.jpg | 图片地址 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/utils/serialize.py | centernet2/detectron2/utils/serialize.py | https://github.com/joblib/joblib/blob/master/joblib/externals/loky/cloudpickle_wrapper.py | 源码实现 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/engine/defaults.py | centernet2/detectron2/engine/defaults.py | https://detectron2.readthedocs.io/modules/config.html#config-references | 相关说明 | -| 开源代码引入 | 
https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/export/torchscript.py | centernet2/detectron2/export/torchscript.py | https://pytorch.org/docs/stable/jit.html#inspecting-code | 相关说明 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/layers/csrc/deformable/deform_conv_cuda_kernel.cu | centernet2/detectron2/layers/csrc/deformable/deform_conv_cuda_kernel.cu | https://github.com/chengdazhi/Deformable-Convolution-V2-PyTorch/blob/mmdetection/mmdet/ops/dcn/src/deform_conv_cuda_kernel.cu | 源码实现 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/utils/visualizer.py | centernet2/detectron2/utils/visualizer.py | https://stackoverflow.com/questions/8919719/how-to-plot-a-complex-polygon | 相关说明 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/structures/boxes.py | centernet2/detectron2/structures/boxes.py | https://github.com/kuangliu/torchcv/blob/master/torchcv/utils/box.py | 源码实现 | -| 开发引入 | / | centernet2/dev/packaging/gen_install_table.py | https://dl.fbaipublicfiles.com/detectron2/wheels/ | 相关说明 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/data/detection_utils.py | centernet2/detectron2/data/detection_utils.py | https://github.com/wkentaro/labelme/blob/v4.5.4/labelme/utils/image.py#L59 | 源码实现 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/setup.py | centernet2/setup.py | https://github.com/pytorch/pytorch/pull/43931 | 源码实现 | -| 开发引入 | / | centernet2/detectron2/data/datasets/builtin_meta.py | https://github.com/mcordts/cityscapesScripts/blob/master/cityscapesscripts/helpers/labels.py | 源码实现 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/layers/csrc/deformable/deform_conv_cuda_kernel.cu | centernet2/detectron2/layers/csrc/deformable/deform_conv_cuda_kernel.cu | https://github.com/open-mmlab/mmdetection/blob/master/mmdet/ops/dcn/src/deform_conv_cuda_kernel.cu | 源码实现 | -| 开发引入 | / | centernet2/detectron2/data/datasets/cityscapes.py | https://github.com/mcordts/cityscapesScripts/blob/master/cityscapesscripts/evaluation/instances2dict.py | 源码实现 | -| 开发引入 | / | centernet2/detectron2/projects/panoptic_deeplab/post_processing.py | https://github.com/bowenc0221/panoptic-deeplab/blob/master/segmentation/model/post_processing/instance_post_processing.py | 源码实现 | -| 开发引入 | / | centernet2/detectron2/projects/panoptic_deeplab/target_generator.py | https://github.com/facebookresearch/detectron2/blob/master/datasets/prepare_panoptic_fpn.py#L18 | 源码实现 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/utils/env.py | centernet2/detectron2/utils/env.py | https://github.com/skvark/opencv-python/issues/381 | 相关说明 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/engine/defaults.py | centernet2/detectron2/engine/defaults.py | https://github.com/sphinx-doc/sphinx/issues/4258 | 相关说明 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/layers/batch_norm.py | centernet2/detectron2/layers/batch_norm.py | https://github.com/pytorch/pytorch/pull/36382 | 源码实现 | -| 开源代码引入 | 
https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/layers/csrc/deformable/deform_conv_cuda_kernel.cu | centernet2/detectron2/layers/csrc/deformable/deform_conv_cuda_kernel.cu | https://arxiv.org/abs/1703.06211 | 论文地址 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/tests/modeling/test_matcher.py | centernet2/tests/modeling/test_matcher.py | https://github.com/pytorch/pytorch/issues/38964 | 相关说明 | -| 开源代码引入 | https://github.com/xingyizhou/CenterNet2/blob/68c0a468254b013e1d08309cd7a506756120ca62/detectron2/utils/visualizer.py | centernet2/detectron2/utils/visualizer.py | https://github.com/matplotlib/matplotlib/issues/15363 | 相关说明 | -| 开发引入 | / | centernet2/docs/requirements.txt | https://github.com/sphinx-doc/sphinx/commit/7acd3ada3f38076af7b2b5c9f3b60bb9c2587a3d | 相关依赖 | -| 开发引入 | / | centernet2/docs/requirements.txt | https://download.pytorch.org/whl/cpu/torch-1.7.0%2Bcpu-cp37-cp37m-linux_x86_64.whl | 相关依赖 | -| 开发引入 | / | centernet2/docs/requirements.txt | https://download.pytorch.org/whl/cpu/torchvision-0.8.1%2Bcpu-cp37-cp37m-linux_x86_64.whl | 相关依赖 | -| 开发引入 | / | centernet2/docs/requirements.txt | https://pytorch.org/tutorials/advanced/cpp_frontend.html | 相关依赖 | +| 文件位置 | 公网地址 | 公网地址用途 | +|--------------------------------------------------------------------------------------------------------------------|---------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/centernet2/datasets/prepare_for_tests.sh | https://dl.fbaipublicfiles.com/detectron2 | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/centernet2/datasets/prepare_panoptic_fpn.py | https://dl.fbaipublicfiles.com/detectron2 | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/centernet2/detectron2/engine/defaults.py | https://detectron2.readthedocs.io/modules/config.html#config-references | 相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/centernet2/detectron2/evaluation/coco_evaluation.py | http://cocodataset.org/#keypoints-eval | 数据集详情 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/centernet2/detectron2/model_zoo/model_zoo.py | https://dl.fbaipublicfiles.com/detectron2/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/centernet2/detectron2/utils/file_io.py | https://dl.fbaipublicfiles.com/detectron2 | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/centernet2/dev/packaging/build_wheel.sh | https://download.pytorch.org/whl/"$CU_VERSION"/torch_stable.html | 三方库地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/centernet2/dev/packaging/gen_install_table.py | https://dl.fbaipublicfiles.com/detectron2/wheels/{cuda}/torch{torch}/index.html | 三方库地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/centernet2/docker/Dockerfile | https://bootstrap.pypa.io/get-pip.py | 三方库地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/centernet2/docker/Dockerfile | https://download.pytorch.org/whl/cu101/torch_stable.html | 三方库地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/centernet2/projects/CenterNet2/centernet/modeling/backbone/dla.py | http://dl.yf.io/dla/models | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/centernet2/projects/CenterNet2/centernet/modeling/backbone/dlafpn.py | http://dl.yf.io/dla/models | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/detection/centernet2/projects/CenterNet2/centernet/modeling/backbone/dlafpn.py | http://dl.yf.io/dla/models | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/detection/centernet2/tools/convert-torchvision-to-d2.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/others/3D_Nested_Unet/public_address_statement.md b/PyTorch/contrib/cv/others/3D_Nested_Unet/public_address_statement.md index 8ac0e9b3f6df7302aa2c49d5ef428ae6325c4a4a..3a63b01820d5dae4f806069d7a7912fb831aeb50 100644 --- a/PyTorch/contrib/cv/others/3D_Nested_Unet/public_address_statement.md +++ b/PyTorch/contrib/cv/others/3D_Nested_Unet/public_address_statement.md @@ -1,10 +1,4 @@ - | 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|------|--------|-------------------------------------| ------------------------------------ |------| -| 开发引入 | / | 3D_Nested_Unet/requirements.txt | https://github.com/MrGiovanni/UNetPlusPlus.git@e145ba63862982bf1099cf2ec11d5466b434ae0b#egg=nnunet&subdirectory=pytorch | 下载依赖 | - | 开发引入 | / | 3D_Nested_Unet/requirements_gpu.txt | file:///home/conda/feedstock_root/build_artifacts/jieba_1622403007736/work | 下载依赖 | - | 开发引入 | / | 3D_Nested_Unet/requirements_gpu.txt | https://github.com/MrGiovanni/UNetPlusPlus.git@e145ba63862982bf1099cf2ec11d5466b434ae0b#egg=nnunet&subdirectory=pytorch | 下载依赖 | - | 开源代码引入 | https://github.com/MrGiovanni/UNetPlusPlus/tree/master/pytorch/nnunet/network_architecture/generic_XNet.py | 3D_Nested_Unet/new_npu.patch | f.isensee@dkfz.de | 邮箱地址 | -| 开源代码引入 | https://github.com/MrGiovanni/UNetPlusPlus/tree/master/pytorch/setup.py | 3D_Nested_Unet/new_gpu.patch | f.isensee@dkfz-heidelberg.de | 邮箱地址 | -| 开源代码引入 | https://github.com/MrGiovanni/UNetPlusPlus/tree/master/pytorch/setup.py | 3D_Nested_Unet/new_npu.patch | f.isensee@dkfz-heidelberg.de | 邮箱地址 | -| 开发引入 | / | 3D_Nested_Unet/requirements.txt | https://github.com/MrGiovanni/UNetPlusPlus.git@e145ba63862982bf1099cf2ec11d5466b434ae0b#egg=nnunet&subdirectory=pytorch | 相关依赖 | -| 开发引入 | / | 3D_Nested_Unet/requirements_gpu.txt | https://github.com/MrGiovanni/UNetPlusPlus.git@e145ba63862982bf1099cf2ec11d5466b434ae0b#egg=nnunet&subdirectory=pytorch | 相关依赖 | +| 文件位置 | 公网地址 | 公网地址用途 | +|-------------------------------------------------------------------------|------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/others/3D_Nested_Unet/new_gpu.patch | f.isensee@dkfz-heidelberg.de | 作者邮箱 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/others/3D_Nested_Unet/new_npu.patch | f.isensee@dkfz-heidelberg.de | 作者邮箱 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/others/BigGAN/public_address_statement.md b/PyTorch/contrib/cv/others/BigGAN/public_address_statement.md index f6010af9733a949b494a093266950b7a2acf9e8d..d4b5b3e243d9c1e036a3b6d004e3259217d989bf 100644 --- a/PyTorch/contrib/cv/others/BigGAN/public_address_statement.md +++ b/PyTorch/contrib/cv/others/BigGAN/public_address_statement.md @@ -1,23 +1,5 @@ - | 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------|--------------------------| ------------------------------------ |-------| -| 开源代码引入 | https://github.com/ajbrock/BigGAN-PyTorch | BigGAN/datasets.py | http://www.cs.toronto.edu/~kriz/cifar-100-python.tar.gz | 下载数据集 | - | 开源代码引入 | https://github.com/ajbrock/BigGAN-PyTorch | BigGAN/inception_tf13.py | http://download.tensorflow.org/models/image/imagenet/inception-2015-12-05.tgz | 下载数据集 | - | 开源代码引入 | https://github.com/ajbrock/BigGAN-PyTorch/sync_batchnorm/__init__.py | BigGAN/sync_batchnorm/batchnorm.py | maojiayuan@gmail.com | 邮箱地址 | -| 开源代码引入 | 
https://github.com/ajbrock/BigGAN-PyTorch/sync_batchnorm/batchnorm.py | BigGAN/sync_batchnorm/batchnorm.py | http://tetexiao.com/ | 相关说明 | -| 开源代码引入 | https://github.com/ajbrock/BigGAN-PyTorch/sync_batchnorm/__init__.py | BigGAN/sync_batchnorm/__init__.py | https://github.com/vacancy/Synchronized-BatchNorm-PyTorch | 源码实现 | -| 开源代码引入 | https://github.com/ajbrock/BigGAN-PyTorch/inception_utils.py | BigGAN/inception_utils.py | https://github.com/msubhransu/matrix-sqrt | 源码实现 | -| 开源代码引入 | https://github.com/ajbrock/BigGAN-PyTorch/sync_batchnorm/__init__.py | BigGAN/sync_batchnorm/comm.py | https://github.com/vacancy/Synchronized-BatchNorm-PyTorch | 源码实现 | -| 开源代码引入 | https://github.com/ajbrock/BigGAN-PyTorch/utils.py | BigGAN/utils.py | https://discuss.pytorch.org/t/subclassing-torch-tensor/23754/2 | 相关说明 | -| 开源代码引入 | https://github.com/ajbrock/BigGAN-PyTorch/inception_utils.py | BigGAN/inception_utils.py | https://github.com/bioinf-jku/TTUR | 源码实现 | -| 开源代码引入 | https://github.com/ajbrock/BigGAN-PyTorch/sync_batchnorm/__init__.py | BigGAN/sync_batchnorm/unittest.py | maojiayuan@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/ajbrock/BigGAN-PyTorch/sync_batchnorm/__init__.py | BigGAN/sync_batchnorm/__init__.py | maojiayuan@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/ajbrock/BigGAN-PyTorch/sync_batchnorm/batchnorm_reimpl.py | BigGAN/sync_batchnorm/batchnorm_reimpl.py | https://github.com/vacancy/Synchronized-BatchNorm-PyTorch/issues/14 | 源码实现 | -| 开源代码引入 | https://github.com/ajbrock/BigGAN-PyTorch/sync_batchnorm/__init__.py | BigGAN/sync_batchnorm/comm.py | maojiayuan@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/ajbrock/BigGAN-PyTorch/sync_batchnorm/__init__.py | BigGAN/sync_batchnorm/replicate.py | https://github.com/vacancy/Synchronized-BatchNorm-PyTorch | 源码实现 | -| 开源代码引入 | https://github.com/ajbrock/BigGAN-PyTorch/README.md | BigGAN/inception_tf13.py | https://github.com/openai/improved-gan | 源码实现 | -| 开源代码引入 | https://github.com/ajbrock/BigGAN-PyTorch/sync_batchnorm/__init__.py | BigGAN/sync_batchnorm/batchnorm.py | https://github.com/vacancy/Synchronized-BatchNorm-PyTorch | 源码实现 | -| 开源代码引入 | https://github.com/ajbrock/BigGAN-PyTorch/sync_batchnorm/__init__.py | BigGAN/sync_batchnorm/unittest.py | https://github.com/vacancy/Synchronized-BatchNorm-PyTorch | 源码实现 | -| 开源代码引入 | https://github.com/ajbrock/BigGAN-PyTorch/inception_utils.py | BigGAN/inception_utils.py | https://discuss.pytorch.org/t/covariance-and-gradient-support/16217/2 | 相关说明 | -| 开源代码引入 | https://github.com/ajbrock/BigGAN-PyTorch/sync_batchnorm/__init__.py | BigGAN/sync_batchnorm/batchnorm_reimpl.py | https://github.com/vacancy/Synchronized-BatchNorm-PyTorch | 源码实现 | -| 开源代码引入 | https://github.com/ajbrock/BigGAN-PyTorch/sync_batchnorm/__init__.py | BigGAN/sync_batchnorm/replicate.py | maojiayuan@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/ajbrock/BigGAN-PyTorch/datasets.py | BigGAN/datasets.py | https://github.com/python-pillow/Pillow/issues/835 | 相关说明 | +| 文件位置 | 公网地址 | 公网地址用途 | +|---------------------------------------------------------------------|-------------------------------------------------------------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/others/BigGAN/datasets.py | http://www.cs.toronto.edu/~kriz/cifar-100-python.tar.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/others/BigGAN/inception_tf13.py | http://download.tensorflow.org/models/image/imagenet/inception-2015-12-05.tgz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/others/BigGAN/utils.py | 
tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/others/CenterMask2/public_address_statement.md b/PyTorch/contrib/cv/others/CenterMask2/public_address_statement.md index 339227a559783dd8bc7de0e899555af7c06e85ac..911512a17b259c3182fe8602d77b55128488c4fc 100644 --- a/PyTorch/contrib/cv/others/CenterMask2/public_address_statement.md +++ b/PyTorch/contrib/cv/others/CenterMask2/public_address_statement.md @@ -1,96 +1,25 @@ - | 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------|--------------------------| ------------------------------------ |---------| -| 开源代码引入 | https://github.com/youngwanLEE/CenterMask | CenterMask2/models/centermask2/datasets/prepare_for_tests.sh | https://dl.fbaipublicfiles.com/detectron2 | 下载依赖 | - | 开源代码引入 | https://github.com/ajbrock/BigGAN-PyTorch | CenterMask2/models/centermask2/datasets/prepare_panoptic_fpn.py | https://dl.fbaipublicfiles.com/detectron2/ | 下载依赖 | - | 开源代码引入 | https://github.com/ajbrock/BigGAN-PyTorch | CenterMask2/models/detectron2/datasets/prepare_for_tests.sh | https://dl.fbaipublicfiles.com/detectron2 | 下载依赖 | - | 开源代码引入 | https://github.com/ajbrock/BigGAN-PyTorch | CenterMask2/models/detectron2/datasets/prepare_panoptic_fpn.py | https://dl.fbaipublicfiles.com/detectron2/ | 下载依赖 | - | 开源代码引入 | https://github.com/ajbrock/BigGAN-PyTorch | CenterMask2/models/detectron2/setup.py | https://github.com/facebookresearch/detectron2 | 下载依赖 | - | 开源代码引入 | https://github.com/ajbrock/BigGAN-PyTorch | CenterMask2/models/detectron2/setup.py | https://github.com/psf/black@673327449f86fce558adde153bb6cbe54bfebad2 | 下载依赖 | - | 开源代码引入 | https://github.com/ajbrock/BigGAN-PyTorch | CenterMask2/models/detectron2/tests/test_model_zoo.py | https://dl.fbaipublicfiles.com/detectron2/Misc/scratch_mask_rcnn_R_50_FPN_3x_gn/138602908/model_final_01ca85.pkl | 下载预训练权重 | - | 开源代码引入 | https://github.com/youngwanLEE/CenterMask/maskrcnn_benchmark/csrc/cuda/deform_conv_kernel_cuda.cu | CenterMask2/models/detectron2/detectron2/layers/csrc/deformable/deform_conv_cuda_kernel.cu | https://github.com/chengdazhi/Deformable-Convolution-V2-PyTorch/blob/mmdetection/mmdet/ops/dcn/src/deform_conv_cuda_kernel.cu | 源码实现 | -| 开发引入 | / | CenterMask2/models/detectron2/tools/convert-torchvision-to-d2.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 预训练模型 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/data/detection_utils.py | https://github.com/python-pillow/Pillow/issues/3973 | 相关说明 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/engine/defaults.py | https://pytorch.org/docs/stable/distributed.html | 相关说明 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/engine/launch.py | https://github.com/facebookresearch/maskrcnn-benchmark/issues/172 | 相关说明 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/data/datasets/lvis.py | http://images.cocodataset.org/train2017/000000155379.jpg | 图片地址 | -| 开发引入 | / | CenterMask2/models/detectron2/tests/data/test_coco_evaluation.py | http://images.cocodataset.org/val2017/000000000285.jpg","http://farm8.staticflickr.com/7434/9138147604_c6225224b8_z.jpg","http://images.cocodataset.org/val2017/000000000139.jpg","http://farm9.staticflickr.com/8035/8024364858_9c41dc1666_z.jpg | 图片地址 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/data/transforms/transform.py | https://github.com/opencv/opencv/issues/11784 | 相关说明 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/structures/image_list.py | https://github.com/pytorch/pytorch/issues/31734 | 相关说明 
| -| 开源代码引入 | https://github.com/youngwanLEE/CenterMask/maskrcnn_benchmark/modeling/backbone/mobilenet.py | CenterMask2/models/centermask2/centermask/modeling/backbone/mobilenet.py | https://github.com/tonylins/pytorch-mobilenet-v2/ | 源码实现 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/structures/boxes.py | https://github.com/pytorch/pytorch/issues/18627 | 相关说明 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/evaluation/coco_evaluation.py | https://github.com/facebookresearch/Detectron/blob/a6a835f5b8208c45d0dce217ce9bbda915f44df7/detectron/datasets/json_dataset_evaluator.py#L255 | 源码实现 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/export/shared.py | https://www.geeksforgeeks.org/find-paths-given-source-destination/ | 相关依赖 | -| 开发引入 | / | CenterMask2/models/centermask2/centermask/evaluation/coco_evaluation.py | https://github.com/facebookresearch/Detectron/blob/a6a835f5b8208c45d0dce217ce9bbda915f44df7/detectron/datasets/json_dataset_evaluator.py#L255 | 源码实现 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/data/datasets/coco.py | http://cocodataset.org/#format-data | 数据集地址 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/modeling/proposal_generator/rpn.py | https://github.com/pytorch/pytorch/pull/41371 | 源码实现 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/data/datasets/lvis.py | http://farm6.staticflickr.com/5454/9413846304_881d5e5c3b_z.jpg | 图片地址 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/evaluation/cityscapes_evaluation.py | https://github.com/mcordts/cityscapesScripts/blob/master/cityscapesscripts/evaluation/evalInstanceLevelSemanticLabeling.py | 源码实现 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/layers/csrc/vision.cpp | https://github.com/pytorch/pytorch/blob/master/aten/src/ATen/Version.cpp | 源码实现 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/data/datasets/register_coco.py | http://cocodataset.org/#format-data | 数据集地址 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/model_zoo/__init__.py | https://github.com/facebookresearch/detectron2/blob/master/MODEL_ZOO.md | 源码实现 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/evaluation/sem_seg_evaluation.py | http://cocodataset.org/#stuff-eval | 数据集地址 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/config/defaults.py | https://pillow.readthedocs.io/en/stable/handbook/concepts.html#concept-modes | 相关依赖 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/engine/defaults.py | https://github.com/sphinx-doc/sphinx/issues/4258 | 相关说明 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/data/datasets/cityscapes.py | https://github.com/mcordts/cityscapesScripts/blob/master/cityscapesscripts/evaluation/instances2dict.py | 源码实现 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/data/detection_utils.py | https://en.wikipedia.org/wiki/YUV#SDTV_with_BT.601 | 相关说明 | -| 开发引入 | / | CenterMask2/models/detectron2/tests/modeling/test_matcher.py | https://github.com/pytorch/pytorch/issues/38964 | 相关说明 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/layers/batch_norm.py | https://github.com/pytorch/pytorch/blob/master/torch/nn/modules/batchnorm.py | 源码实现 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/data/transforms/transform.py | https://pillow.readthedocs.io/en/latest/PIL.html#PIL.ImageTransform.ExtentTransform | 相关依赖 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/data/transforms/augmentation_impl.py | https://pillow.readthedocs.io/en/3.0.x/reference/ImageEnhance.html | 相关依赖 | -| 开发引入 | / | 
CenterMask2/models/detectron2/detectron2/layers/csrc/deformable/deform_conv_cuda.cu | https://github.com/open-mmlab/mmdetection/blob/master/mmdet/ops/dcn/src/deform_conv_cuda.cpp | 源码实现 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/utils/serialize.py | https://github.com/joblib/joblib/blob/master/joblib/externals/loky/cloudpickle_wrapper.py | 源码实现 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/export/torchscript.py | https://docs.python.org/3/library/importlib.html#importing-a-source-file-directly | 相关依赖 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/data/datasets/coco.py | http://farm6.staticflickr.com/5454/9413846304_881d5e5c3b_z.jpg | 图片地址 | -| 开发引入 | / | CenterMask2/models/detectron2/tests/structures/test_boxes.py | https://github.com/pytorch/pytorch/pull/39336 | 源码实现 | -| 开发引入 | / | CenterMask2/models/centermask2/centermask/evaluation/cityscapes_evaluation.py | https://github.com/mcordts/cityscapesScripts/blob/master/cityscapesscripts/evaluation/evalPixelLevelSemanticLabeling.py | 源码实现 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/structures/image_list.py | https://github.com/pytorch/pytorch/issues/39308 | 相关说明 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/evaluation/cityscapes_evaluation.py | https://github.com/mcordts/cityscapesScripts/blob/master/cityscapesscripts/evaluation/evalPixelLevelSemanticLabeling.py | 源码实现 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/evaluation/lvis_evaluation.py | https://github.com/facebookresearch/Detectron/blob/a6a835f5b8208c45d0dce217ce9bbda915f44df7/detectron/datasets/json_dataset_evaluator.py#L255 | 源码实现 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/evaluation/coco_evaluation.py | http://cocodataset.org/#keypoints-eval | 数据集地址 | -| 开发引入 | / | CenterMask2/models/centermask2/centermask/evaluation/coco_evaluation.py | http://cocodataset.org/#detection-eval | 数据集地址 | -| 开源代码引入 | https://github.com/youngwanLEE/CenterMask/maskrcnn_benchmark/layers/misc.py | CenterMask2/models/detectron2/detectron2/layers/wrappers.py | https://github.com/pytorch/pytorch/issues/12013 | 源码实现 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/layers/batch_norm.py | https://github.com/pytorch/pytorch/pull/36382 | 源码实现 | -| 开发引入 | / | CenterMask2/models/centermask2/centermask/evaluation/coco_evaluation.py | https://github.com/facebookresearch/Detectron/blob/a6a835f5b8208c45d0dce217ce9bbda915f44df7/detectron/datasets/json_dataset_evaluator.py#L222-L252 | 源码实现 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/data/transforms/transform.py | https://pillow.readthedocs.io/en/stable/ | 相关依赖 | -| 开发引入 | / | CenterMask2/models/detectron2/tests/modeling/test_matcher.py | https://github.com/pytorch/pytorch/pull/38378 | 源码实现 | -| 开源代码引入 | https://github.com/youngwanLEE/CenterMask/maskrcnn_benchmark/csrc/cuda/deform_conv_cuda.cu | CenterMask2/models/detectron2/detectron2/layers/csrc/deformable/deform_conv_cuda.cu | https://github.com/chengdazhi/Deformable-Convolution-V2-PyTorch/blob/mmdetection/mmdet/ops/dcn/src/deform_conv_cuda.c | 源码实现 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/layers/csrc/deformable/deform_conv_cuda_kernel.cu | https://github.com/open-mmlab/mmdetection/blob/master/mmdet/ops/dcn/src/deform_conv_cuda_kernel.cu | 源码实现 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/layers/csrc/vision.cpp | https://github.com/pytorch/pytorch/blob/master/aten/src/ATen/cuda/detail/CUDAHooks.cpp#L231 | 源码实现 | -| 开发引入 | / | 
CenterMask2/models/detectron2/detectron2/modeling/proposal_generator/rrpn.py | https://github.com/pytorch/pytorch/issues/22812 | 相关说明 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/utils/visualizer.py | https://github.com/matplotlib/matplotlib/issues/15363 | 相关说明 | -| 开发引入 | / | CenterMask2/models/centermask2/centermask/layers/wrappers.py | https://github.com/pytorch/pytorch/issues/34202 | 相关说明 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/modeling/proposal_generator/rpn.py | https://github.com/pytorch/pytorch/issues/41449 | 相关说明 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/layers/aspp.py | https://github.com/tensorflow/models/blob/21b73d22f3ed05b650e85ac50849408dd36de32e/research/deeplab/model.py#L532 | 源码实现 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/evaluation/coco_evaluation.py | https://github.com/facebookresearch/Detectron/blob/a6a835f5b8208c45d0dce217ce9bbda915f44df7/detectron/datasets/json_dataset_evaluator.py#L222-L252 | 源码实现 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/layers/wrappers.py | https://github.com/pytorch/pytorch/issues/34202 | 相关说明 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/evaluation/coco_evaluation.py | http://cocodataset.org/#detection-eval | 数据集地址 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/data/datasets/coco.py | https://github.com/facebookresearch/detectron2/pull/175#issuecomment-551202163 | 源码实现 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/config/defaults.py | http://cocodataset.org/#keypoints-eval | 数据集地址 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/data/detection_utils.py | https://github.com/python-pillow/Pillow/blob/7.1.2/src/PIL/ImageOps.py#L527 | 相关依赖 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/config/defaults.py | https://arxiv.org/abs/1811.11168 | 论文地址 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/layers/wrappers.py | https://github.com/pytorch/pytorch/issues/38718 | 相关说明 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/data/datasets/cityscapes.py | https://github.com/mcordts/cityscapesScripts/blob/master/cityscapesscripts/preparation/json2instanceImg.py | 源码实现 | -| 开源代码引入 | https://github.com/youngwanLEE/CenterMask/maskrcnn_benchmark/csrc/cuda/deform_conv_kernel_cuda.cu | CenterMask2/models/detectron2/detectron2/layers/csrc/deformable/deform_conv_cuda_kernel.cu | https://arxiv.org/abs/1703.06211 | 论文地址 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/export/torchscript.py | https://github.com/pytorch/pytorch/issues/38964 | 相关说明 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/data/detection_utils.py | https://www.exiv2.org/tags.html | 相关说明 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/modeling/anchor_generator.py | https://github.com/facebookresearch/Detectron/issues/227 | 源码实现 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/utils/logger.py | https://github.com/abseil/abseil-py/blob/master/absl/logging/__init__.py | 源码实现 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/utils/env.py | https://stackoverflow.com/questions/67631/how-to-import-a-module-given-the-full-path | 相关说明 | -| 开发引入 | / | CenterMask2/models/centermask2/centermask/evaluation/cityscapes_evaluation.py | https://github.com/mcordts/cityscapesScripts/blob/master/cityscapesscripts/evaluation/evalInstanceLevelSemanticLabeling.py | 源码实现 | -| 开发引入 | / | CenterMask2/models/centermask2/centermask/evaluation/coco_evaluation.py | http://cocodataset.org/#keypoints-eval | 数据集地址 | -| 开发引入 | / | 
CenterMask2/models/detectron2/detectron2/data/datasets/builtin_meta.py | https://github.com/cocodataset/panopticapi/blob/master/panoptic_coco_categories.json | 数据集地址 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/export/torchscript.py | https://github.com/pytorch/pytorch/issues/41449 | 相关说明 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/evaluation/sem_seg_evaluation.py | http://cocodataset.org/#format-results | 数据集地址 | -| 开发引入 | / | CenterMask2/models/detectron2/datasets/prepare_cocofied_lvis.py | https://github.com/lvis-dataset/lvis-api/blob/master/data/coco_to_synset.json | 数据集地址 | -| 开发引入 | / | CenterMask2/models/centermask2/centermask/utils/measures.py | https://github.com/ShichenLiu/CondenseNet/blob/master/utils.py | 源码实现 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/engine/train_loop.py | https://arxiv.org/abs/2006.15704 | 论文地址 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/data/detection_utils.py | https://github.com/wkentaro/labelme/blob/v4.5.4/labelme/utils/image.py#L59 | 源码实现 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/engine/train_loop.py | http://engineering.hearsaysocial.com/2013/06/16/circular-references-in-python/ | 相关说明 | -| 开源代码引入 | https://github.com/youngwanLEE/CenterMask/maskrcnn_benchmark/structures/boxlist_ops.py | CenterMask2/models/detectron2/detectron2/structures/boxes.py | https://github.com/kuangliu/torchcv/blob/master/torchcv/utils/box.py | 源码实现 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/engine/launch.py | https://github.com/pytorch/pytorch/pull/14391 | 源码实现 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/modeling/proposal_generator/rrpn.py | https://github.com/facebookresearch/Detectron/issues/459 | 相关说明 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/structures/masks.py | https://stackoverflow.com/questions/24467972/calculate-area-of-polygon-given-x-y-coordinates | 相关说明 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/data/datasets/coco.py | https://detectron2.readthedocs.io/tutorials/datasets.html#register-a-dataset | 数据集地址 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/evaluation/pascal_voc_evaluation.py | https://github.com/rbgirshick/py-faster-rcnn/blob/master/lib/datasets/voc_eval.py | 源码实现 | -| 开发引入 | / | CenterMask2/models/detectron2/detectron2/modeling/poolers.py | https://github.com/pytorch/pytorch/issues/41412 | 相关说明 | -| 开发引入 | / | CenterMask2/models/detectron2/tests/layers/test_roi_align.py | https://github.com/tensorflow/tensorflow/issues/26278 | 相关说明 | +| 文件位置 | 公网地址 | 公网地址用途 | +|----------------------------------------------------------------------------------------------------------------------------------------------|------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/others/CenterMask2/models/centermask2/centermask/evaluation/coco_evaluation.py | http://cocodataset.org/#keypoints-eval | 数据集详情 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/others/CenterMask2/models/centermask2/configs/centermask/centermask_lite_Mv2_FPN_ms_4x.yaml | https://www.dropbox.com/s/yduxbc13s3ip6qn/mobilenet_v2_detectron2.pth?dl=1 | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/others/CenterMask2/models/centermask2/configs/centermask/centermask_lite_V_19_eSE_FPN_ms_4x.yaml | https://www.dropbox.com/s/rptgw6stppbiw1u/vovnet19_ese_detectron2.pth?dl=1 | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/others/CenterMask2/models/centermask2/configs/centermask/centermask_lite_V_19_slim_dw_eSE_FPN_ms_4x.yaml | https://www.dropbox.com/s/f3s7ospitqoals1/vovnet19_ese_slim_dw_detectron2.pth?dl=1 | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/others/CenterMask2/models/centermask2/configs/centermask/centermask_lite_V_19_slim_eSE_FPN_ms_4x.yaml | https://www.dropbox.com/s/8h5ybmi4ftbcom0/vovnet19_ese_slim_detectron2.pth?dl=1 | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/others/CenterMask2/models/centermask2/configs/centermask/centermask_lite_V_39_eSE_FPN_ms_4x.yaml | https://www.dropbox.com/s/q98pypf96rhtd8y/vovnet39_ese_detectron2.pth?dl=1 | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/others/CenterMask2/models/centermask2/configs/centermask/centermask_V_39_eSE_dcn_FPN_ms_3x.yaml | https://www.dropbox.com/s/q98pypf96rhtd8y/vovnet39_ese_detectron2.pth?dl=1 | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/others/CenterMask2/models/centermask2/configs/centermask/centermask_V_39_eSE_FPN_ms_3x.yaml | https://www.dropbox.com/s/q98pypf96rhtd8y/vovnet39_ese_detectron2.pth?dl=1 | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/others/CenterMask2/models/centermask2/configs/centermask/centermask_V_57_eSE_dcn_FPN_ms_3x.yaml | https://www.dropbox.com/s/8xl0cb3jj51f45a/vovnet57_ese_detectron2.pth?dl=1 | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/others/CenterMask2/models/centermask2/configs/centermask/centermask_V_57_eSE_FPN_ms_3x.yaml | https://www.dropbox.com/s/8xl0cb3jj51f45a/vovnet57_ese_detectron2.pth?dl=1 | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/others/CenterMask2/models/centermask2/configs/centermask/centermask_V_99_eSE_dcn_FPN_ms_3x.yaml | https://www.dropbox.com/s/1mlv31coewx8trd/vovnet99_ese_detectron2.pth?dl=1 | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/others/CenterMask2/models/centermask2/configs/centermask/centermask_V_99_eSE_FPN_ms_3x.yaml | https://www.dropbox.com/s/1mlv31coewx8trd/vovnet99_ese_detectron2.pth?dl=1 | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/others/CenterMask2/models/centermask2/configs/centermask/panoptic_centermask_V_39_eSE_FPN_ms_3x.yaml | https://www.dropbox.com/s/q98pypf96rhtd8y/vovnet39_ese_detectron2.pth?dl=1 | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/others/CenterMask2/models/centermask2/configs/centermask/panoptic_centermask_V_57_eSE_FPN_ms_3x.yaml | https://www.dropbox.com/s/8xl0cb3jj51f45a/vovnet57_ese_detectron2.pth?dl=1 | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/others/CenterMask2/models/centermask2/configs/centermask/panoptic_centermask_V_99_eSE_FPN_ms_3x.yaml | https://www.dropbox.com/s/1mlv31coewx8trd/vovnet99_ese_detectron2.pth?dl=1 | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/others/CenterMask2/models/centermask2/datasets/prepare_for_tests.sh | https://dl.fbaipublicfiles.com/detectron2 | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/others/CenterMask2/models/centermask2/datasets/prepare_panoptic_fpn.py | https://dl.fbaipublicfiles.com/detectron2 | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/others/CenterMask2/models/detectron2/datasets/prepare_for_tests.sh | https://dl.fbaipublicfiles.com/detectron2 | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/others/CenterMask2/models/detectron2/datasets/prepare_panoptic_fpn.py | https://dl.fbaipublicfiles.com/detectron2 | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/others/CenterMask2/models/detectron2/detectron2/evaluation/coco_evaluation.py | http://cocodataset.org/#keypoints-eval | 数据集详情 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/others/CenterMask2/models/detectron2/detectron2/model_zoo/model_zoo.py | https://dl.fbaipublicfiles.com/detectron2/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/others/CenterMask2/models/detectron2/detectron2/utils/file_io.py | https://dl.fbaipublicfiles.com/detectron2 | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/others/CenterMask2/models/detectron2/tools/convert-torchvision-to-d2.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/others/DCGAN/public_address_statement.md b/PyTorch/contrib/cv/others/DCGAN/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..e29ca0e45ee5db5a9df6d54805a830ac8f15f2ee --- /dev/null +++ b/PyTorch/contrib/cv/others/DCGAN/public_address_statement.md @@ -0,0 +1,3 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|----------------------------------------------------------|--------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/others/DCGAN/main.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/others/Lifespan_ID2972_for_pytorch/public_address_statement.md b/PyTorch/contrib/cv/others/Lifespan_ID2972_for_pytorch/public_address_statement.md index 37e3225aba2f1a217e64cb7babe1e565efb7d33e..f67b6f8c563518f4d0fbe7aff99838529971f57e 100644 --- a/PyTorch/contrib/cv/others/Lifespan_ID2972_for_pytorch/public_address_statement.md +++ b/PyTorch/contrib/cv/others/Lifespan_ID2972_for_pytorch/public_address_statement.md @@ -1,16 +1,10 @@ - | 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|------|--------|--------------------------| ------------------------------------ |---------| -| 开源代码引入 | https://github.com/royorel/Lifespan_Age_Transformation_Synthesis | Lifespan_ID2972_for_pytorch/util/deeplab.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 下载预训练权重 | - | 开源代码引入 | https://github.com/royorel/Lifespan_Age_Transformation_Synthesis | Lifespan_ID2972_for_pytorch/util/deeplab.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 下载预训练权重 | - | 开源代码引入 | https://github.com/royorel/Lifespan_Age_Transformation_Synthesis | Lifespan_ID2972_for_pytorch/util/deeplab.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 下载预训练权重 | - | 开源代码引入 | https://github.com/royorel/Lifespan_Age_Transformation_Synthesis | Lifespan_ID2972_for_pytorch/util/util.py | https://drive.google.com/uc?id=1MsXN54hPi9PWDmn1HKdmKfv-J5hWYFVZ | 下载依赖 | - | 开源代码引入 | https://github.com/royorel/Lifespan_Age_Transformation_Synthesis | Lifespan_ID2972_for_pytorch/util/util.py | https://grail.cs.washington.edu/projects/lifespan_age_transformation_synthesis/pretrained_models/males_model.zip | 下载预训练权重 | - | 开源代码引入 | https://github.com/royorel/Lifespan_Age_Transformation_Synthesis | Lifespan_ID2972_for_pytorch/util/util.py | https://drive.google.com/uc?id=1LNm0zAuiY0CIJnI0lHTq1Ttcu9_M1NAJ | 下载依赖 | - | 开源代码引入 | https://github.com/royorel/Lifespan_Age_Transformation_Synthesis | Lifespan_ID2972_for_pytorch/util/util.py | https://grail.cs.washington.edu/projects/lifespan_age_transformation_synthesis/pretrained_models/females_model.zip | 下载预训练权重 | - | 开源代码引入 | https://github.com/royorel/Lifespan_Age_Transformation_Synthesis | Lifespan_ID2972_for_pytorch/util/util.py | https://drive.google.com/uc?id=1oRGgrI4KNdefbWVpw0rRkEP1gbJIRokM | 下载依赖 | - | 开源代码引入 | https://github.com/royorel/Lifespan_Age_Transformation_Synthesis | Lifespan_ID2972_for_pytorch/util/util.py | 
https://grail.cs.washington.edu/projects/lifespan_age_transformation_synthesis/pretrained_models/R-101-GN-WS.pth.tar | 下载预训练权重 | - | 开源代码引入 | https://github.com/royorel/Lifespan_Age_Transformation_Synthesis | Lifespan_ID2972_for_pytorch/util/util.py | https://drive.google.com/uc?id=1w2XjDywFr2NjuUWaLQDRktH7VwIfuNlY | 下载依赖 | - | 开源代码引入 | https://github.com/royorel/Lifespan_Age_Transformation_Synthesis | Lifespan_ID2972_for_pytorch/util/util.py | https://grail.cs.washington.edu/projects/lifespan_age_transformation_synthesis/pretrained_models/deeplab_model.pth | 下载预训练权重 | - | 开源代码引入 | https://github.com/royorel/Lifespan_Age_Transformation_Synthesis | Lifespan_ID2972_for_pytorch/util/util.py | https://drive.google.com/uc?id=1fhq5lvWy-rjrzuHdMoZfLsULvF0gJGwD | 下载依赖 | - | 开源代码引入 | https://github.com/royorel/Lifespan_Age_Transformation_Synthesis | Lifespan_ID2972_for_pytorch/util/util.py | https://grail.cs.washington.edu/projects/lifespan_age_transformation_synthesis/pretrained_models/shape_predictor_68_face_landmarks.dat | 下载预训练权重 | -| 开发引入 | / | Lifespan_ID2972_for_pytorch/util/deeplab.py | https://github.com/chenxi116/DeepLabv3.pytorch | 源码实现 | +| 文件位置 | 公网地址 | 公网地址用途 | +|----------------------------------------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/others/Lifespan_ID2972_for_pytorch/util/deeplab.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/others/Lifespan_ID2972_for_pytorch/util/deeplab.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/others/Lifespan_ID2972_for_pytorch/util/deeplab.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/others/Lifespan_ID2972_for_pytorch/util/util.py | https://grail.cs.washington.edu/projects/lifespan_age_transformation_synthesis/pretrained_models/R-101-GN-WS.pth.tar | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/others/Lifespan_ID2972_for_pytorch/util/util.py | https://grail.cs.washington.edu/projects/lifespan_age_transformation_synthesis/pretrained_models/deeplab_model.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/others/Lifespan_ID2972_for_pytorch/util/util.py | https://grail.cs.washington.edu/projects/lifespan_age_transformation_synthesis/pretrained_models/males_model.zip | 模型权重 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/others/Lifespan_ID2972_for_pytorch/util/util.py | https://grail.cs.washington.edu/projects/lifespan_age_transformation_synthesis/pretrained_models/females_model.zip | 模型权重 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/others/Lifespan_ID2972_for_pytorch/util/util.py | https://grail.cs.washington.edu/projects/lifespan_age_transformation_synthesis/pretrained_models/shape_predictor_68_face_landmarks.dat | 模型权重 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/others/Pix2Pix/public_address_statement.md b/PyTorch/contrib/cv/others/Pix2Pix/public_address_statement.md index f962d820b9056b07045cba060f464e61efb100d3..7d3f11612fa755324db85ccf25056860b72ffa34 100644 --- a/PyTorch/contrib/cv/others/Pix2Pix/public_address_statement.md +++ b/PyTorch/contrib/cv/others/Pix2Pix/public_address_statement.md @@ -1,39 +1,12 @@ - | 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|------|--------|-------------------------| ------------------------------------ |---------| 
-| 开源代码引入 | https://gitee.com/iiiimp/modelzoo/tree/master/contrib/PyTorch/Research/cv/gan/Pix2Pix | Pix2Pix/datasets/download_cyclegan_dataset.sh | https://people.eecs.berkeley.edu/~taesung_park/CycleGAN/datasets/ | 下载数据集 | - | 开源代码引入 | https://gitee.com/iiiimp/modelzoo/tree/master/contrib/PyTorch/Research/cv/gan/Pix2Pix | Pix2Pix/datasets/download_pix2pix_dataset.sh | http://efrosgans.eecs.berkeley.edu/pix2pix/datasets/ | 下载数据集 | - | 开源代码引入 | https://gitee.com/iiiimp/modelzoo/tree/master/contrib/PyTorch/Research/cv/gan/Pix2Pix | Pix2Pix/scripts/download_cyclegan_model.sh | http://efrosgans.eecs.berkeley.edu/cyclegan/pretrained_models/ | 下载预训练模型 | - | 开源代码引入 | https://gitee.com/iiiimp/modelzoo/tree/master/contrib/PyTorch/Research/cv/gan/Pix2Pix | Pix2Pix/scripts/download_pix2pix_model.sh | http://efrosgans.eecs.berkeley.edu/pix2pix/models-pytorch/ | 下载依赖 | - | 开源代码引入 | https://gitee.com/iiiimp/modelzoo/tree/master/contrib/PyTorch/Research/cv/gan/Pix2Pix | Pix2Pix/scripts/eval_cityscapes/download_fcn8s.sh | http://people.eecs.berkeley.edu/~tinghuiz/projects/pix2pix/fcn-8s-cityscapes/fcn-8s-cityscapes.caffemodel | 下载预训练模型 | - | 开源代码引入 | https://gitee.com/iiiimp/modelzoo/tree/master/contrib/PyTorch/Research/cv/gan/Pix2Pix | Pix2Pix/util/get_data.py | http://efrosgans.eecs.berkeley.edu/pix2pix/datasets/ | 下载数据集 | - | 开源代码引入 | https://gitee.com/iiiimp/modelzoo/tree/master/contrib/PyTorch/Research/cv/gan/Pix2Pix | Pix2Pix/util/get_data.py | https://people.eecs.berkeley.edu/~taesung_park/CycleGAN/datasets | 下载数据集 | - | 开发引入 | / | Pix2Pix/models/pix2pix_model.py | https://pytorch.org/docs/stable/elastic/run.html | 相关说明 | -| 开发引入 | / | Pix2Pix/models/pix2pix_model.py | https://arxiv.org/pdf/1611.07004.pdf | 论文地址 | -| 开发引入 | / | Pix2Pix/train.py | https://github.com/junyanz/pytorch-CycleGAN-and-pix2pix/blob/master/docs/tips.md | 相关说明 | -| 开发引入 | / | Pix2Pix/test.py | https://github.com/junyanz/pytorch-CycleGAN-and-pix2pix/blob/master/docs/qa.md | 相关说明 | -| 开发引入 | / | Pix2Pix/scripts/edges/PostprocessHED.m | https://pdollar.github.io/toolbox/ | 相关说明 | -| 开发引入 | / | Pix2Pix/models/colorization_model.py | https://arxiv.org/pdf/1611.07004.pdf | 论文地址 | -| 开发引入 | / | Pix2Pix/scripts/edges/batch_hed.py | https://github.com/s9xie/hed | 源码实现 | -| 开发引入 | / | Pix2Pix/datasets/prepare_cityscapes_dataset.py | https://cityscapes-dataset.com | 数据集地址 | -| 开发引入 | / | Pix2Pix/scripts/eval_cityscapes/cityscapes.py | https://github.com/shelhamer/clockwork-fcn | 源码实现 | -| 开发引入 | / | Pix2Pix/models/networks.py | https://arxiv.org/pdf/1512.03385.pdf | 论文地址 | -| 开发引入 | / | Pix2Pix/pix2pix_pth2onnx.py | https://pytorch.org/docs/stable/elastic/run.html | 相关说明 | -| 开发引入 | / | Pix2Pix/train.py | https://pytorch.org/docs/stable/elastic/run.html | 相关说明 | -| 开发引入 | / | Pix2Pix/models/networks.py | https://arxiv.org/abs/1704.00028 | 论文地址 | -| 开发引入 | / | Pix2Pix/models/networks.py | https://pytorch.org/docs/stable/optim.html | 相关说明 | -| 开发引入 | / | Pix2Pix/data/image_folder.py | https://github.com/pytorch/vision/blob/master/torchvision/datasets/folder.py | 源码实现 | -| 开发引入 | / | Pix2Pix/datasets/download_cyclegan_dataset.sh | https://cityscapes-dataset.com | 数据集地址 | -| 开发引入 | / | Pix2Pix/models/pix2pix_model.py | https://gitee.com/ascend/modelzoo/tree/master/contrib/PyTorch/Official/cv/image_classification | 源码实现 | -| 开发引入 | / | Pix2Pix/models/cycle_gan_model.py | https://arxiv.org/pdf/1703.10593.pdf | 论文地址 | -| 开发引入 | / | Pix2Pix/scripts/eval_cityscapes/util.py | https://github.com/shelhamer/clockwork-fcn | 源码实现 | -| 开发引入 | / | 
Pix2Pix/models/networks.py | https://arxiv.org/abs/1505.04597 | 论文地址 | -| 开发引入 | / | Pix2Pix/datasets/download_pix2pix_dataset.sh | https://cityscapes-dataset.com | 数据集地址 | -| 开发引入 | / | Pix2Pix/scripts/edges/PostprocessHED.m | https://raw.githubusercontent.com/pdollar/edges/master/private/edgesNmsMex.cpp | 源码实现 | -| 开发引入 | / | Pix2Pix/models/networks.py | https://github.com/jcjohnson/fast-neural-style | 源码实现 | -| 开发引入 | / | Pix2Pix/models/networks.py | https://pytorch.org/docs/stable/elastic/run.html | 相关说明 | -| 开发引入 | / | Pix2Pix/train.py | https://github.com/junyanz/pytorch-CycleGAN-and-pix2pix/blob/master/docs/qa.md | 相关说明 | -| 开发引入 | / | Pix2Pix/test.py | https://github.com/junyanz/pytorch-CycleGAN-and-pix2pix/blob/master/docs/tips.md | 相关说明 | -| 开发引入 | / | Pix2Pix/models/base_model.py | https://pytorch.org/docs/stable/elastic/run.html | 相关说明 | -| 开发引入 | / | Pix2Pix/scripts/edges/batch_hed.py | https://github.com/s9xie/hed/blob/master/examples/hed/HED-tutorial.ipynb | 源码实现 | -| 开发引入 | / | Pix2Pix/test.py | https://pytorch.org/docs/stable/elastic/run.html | 相关说明 | -| 开发引入 | / | Pix2Pix/models/pix2pix_model.py | https://phillipi.github.io/pix2pix/ | 相关说明 | +| 文件位置 | 公网地址 | 公网地址用途 | +|----------------------------------------------------------------------------------------------|-----------------------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/others/Pix2Pix/datasets/download_cyclegan_dataset.sh | https://people.eecs.berkeley.edu/~taesung_park/CycleGAN/datasets/$FILE.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/others/Pix2Pix/datasets/download_cyclegan_dataset.sh | https://cityscapes-dataset.com, and use the script ./datasets/prepare_cityscapes_dataset.py | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/others/Pix2Pix/datasets/download_pix2pix_dataset.sh | http://efrosgans.eecs.berkeley.edu/pix2pix/datasets/$FILE.tar.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/others/Pix2Pix/datasets/download_pix2pix_dataset.sh | https://cityscapes-dataset.com, and use the script ./datasets/prepare_cityscapes_dataset.py | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/others/Pix2Pix/scripts/download_cyclegan_model.sh | http://efrosgans.eecs.berkeley.edu/cyclegan/pretrained_models/$FILE.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/others/Pix2Pix/scripts/download_pix2pix_model.sh | http://efrosgans.eecs.berkeley.edu/pix2pix/models-pytorch/$FILE.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/others/Pix2Pix/scripts/eval_cityscapes/download_fcn8s.sh | http://people.eecs.berkeley.edu/~tinghuiz/projects/pix2pix/fcn-8s-cityscapes/fcn-8s-cityscapes.caffemodel | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/others/Pix2Pix/util/get_data.py | https://people.eecs.berkeley.edu/~taesung_park/CycleGAN/datasets | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/others/Pix2Pix/util/get_data.py | http://efrosgans.eecs.berkeley.edu/pix2pix/datasets/ | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/others/Pix2PixHD/run_engine.py | https://wiki.tiker.net/PyCuda/Installation/Linux | 相关说明 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/others/SRGAN/publisc_address_statement.md b/PyTorch/contrib/cv/others/SRGAN/publisc_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..d8115d9fe8761ef274e2349a3a8b5064d04202b8 --- /dev/null +++ b/PyTorch/contrib/cv/others/SRGAN/publisc_address_statement.md @@ -0,0 +1,3 @@ +| 文件位置 | 公网地址 | 公网地址用途 | 
+|-------------------------------------------------------------|--------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/others/SRGAN/train8p.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/others/stylegan2-ada-pytorch/public_address_statement.md b/PyTorch/contrib/cv/others/stylegan2-ada-pytorch/public_address_statement.md index 6b268a64b46cedea46d5acb66f6ba8a6a359cd4a..2721197abf750817abf3cc94522b4b0cf4bfc83a 100644 --- a/PyTorch/contrib/cv/others/stylegan2-ada-pytorch/public_address_statement.md +++ b/PyTorch/contrib/cv/others/stylegan2-ada-pytorch/public_address_statement.md @@ -1,29 +1,13 @@ - | 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|------|--------|-------------------------| ------------------------------------ |---------| -| 开源代码引入 | https://github.com/NVlabs/stylegan2-ada-pytorch | stylegan2-ada-pytorch/inception/download_inception.txt| https://pan.baidu.com/s/1CBiKXaBzS8A0IGxcBivbTA | 下载预训练模型 | - | 开源代码引入 | https://github.com/NVlabs/stylegan2-ada-pytorch | stylegan2-ada-pytorch/metrics/frechet_inception_distance.py | https://nvlabs-fi-cdn.nvidia.com/stylegan2-ada-pytorch/pretrained/metrics/inception-2015-12-05.pt | 下载预训练模型 | - | 开源代码引入 | https://github.com/NVlabs/stylegan2-ada-pytorch | stylegan2-ada-pytorch/metrics/inception_score.py | https://nvlabs-fi-cdn.nvidia.com/stylegan2-ada-pytorch/pretrained/metrics/inception-2015-12-05.pt | 下载预训练模型 | - | 开源代码引入 | https://github.com/NVlabs/stylegan2-ada-pytorch | stylegan2-ada-pytorch/metrics/kernel_inception_distance.py | https://nvlabs-fi-cdn.nvidia.com/stylegan2-ada-pytorch/pretrained/metrics/inception-2015-12-05.pt | 下载预训练模型 | - | 开源代码引入 | https://github.com/NVlabs/stylegan2-ada-pytorch | stylegan2-ada-pytorch/metrics/perceptual_path_length.py | https://nvlabs-fi-cdn.nvidia.com/stylegan2-ada-pytorch/pretrained/metrics/vgg16.pt | 下载预训练模型 | - | 开源代码引入 | https://github.com/NVlabs/stylegan2-ada-pytorch | stylegan2-ada-pytorch/metrics/precision_recall.py | https://nvlabs-fi-cdn.nvidia.com/stylegan2-ada-pytorch/pretrained/metrics/vgg16.pt | 下载预训练模型 | - | 开源代码引入 | https://github.com/NVlabs/stylegan2-ada-pytorch | stylegan2-ada-pytorch/projector.py | https://nvlabs-fi-cdn.nvidia.com/stylegan2-ada-pytorch/pretrained/metrics/vgg16.pt | 下载预训练模型 | - | 开源代码引入 | https://github.com/NVlabs/stylegan2-ada-pytorch | stylegan2-ada-pytorch/train.py | https://nvlabs-fi-cdn.nvidia.com/stylegan2-ada-pytorch/pretrained/transfer-learning-source-nets/ffhq-res256-mirror-paper256-noaug.pkl | 下载预训练模型 | - | 开源代码引入 | https://github.com/NVlabs/stylegan2-ada-pytorch | stylegan2-ada-pytorch/train.py | https://nvlabs-fi-cdn.nvidia.com/stylegan2-ada-pytorch/pretrained/transfer-learning-source-nets/ffhq-res512-mirror-stylegan2-noaug.pkl | 下载预训练模型 | - | 开源代码引入 | https://github.com/NVlabs/stylegan2-ada-pytorch | stylegan2-ada-pytorch/train.py | https://nvlabs-fi-cdn.nvidia.com/stylegan2-ada-pytorch/pretrained/transfer-learning-source-nets/ffhq-res1024-mirror-stylegan2-noaug.pkl | 下载预训练模型 | - | 开源代码引入 | https://github.com/NVlabs/stylegan2-ada-pytorch | stylegan2-ada-pytorch/train.py | https://nvlabs-fi-cdn.nvidia.com/stylegan2-ada-pytorch/pretrained/transfer-learning-source-nets/celebahq-res256-mirror-paper256-kimg100000-ada-target0.5.pkl | 下载预训练模型 | - | 开源代码引入 | https://github.com/NVlabs/stylegan2-ada-pytorch | stylegan2-ada-pytorch/train.py | 
https://nvlabs-fi-cdn.nvidia.com/stylegan2-ada-pytorch/pretrained/transfer-learning-source-nets/lsundog-res256-paper256-kimg100000-noaug.pkl | 下载预训练模型 | - | 开源代码引入 | https://github.com/NVlabs/stylegan2-ada-pytorch/style_mixing.py | stylegan2-ada-pytorch/generate.py | https://nvlabs-fi-cdn.nvidia.com/stylegan2-ada-pytorch/pretrained/metfaces.pkl | 相关数据 | -| 开源代码引入 | https://github.com/NVlabs/stylegan2-ada-pytorch/metrics/kernel_inception_distance.py | stylegan2-ada-pytorch/metrics/kernel_inception_distance.py | https://github.com/mbinkowski/MMD-GAN/blob/master/gan/compute_scores.py | 源码实现 | -| 开源代码引入 | https://github.com/NVlabs/stylegan2-ada-pytorch/metrics/kernel_inception_distance.py | stylegan2-ada-pytorch/metrics/inception_score.py | http://download.tensorflow.org/models/image/imagenet/inception-2015-12-05.tgz | 数据集地址 | -| 开源代码引入 | https://github.com/NVlabs/stylegan2-ada-pytorch/metrics/kernel_inception_distance.py | stylegan2-ada-pytorch/metrics/frechet_inception_distance.py | http://download.tensorflow.org/models/image/imagenet/inception-2015-12-05.tgz | 数据集地址 | -| 开源代码引入 | https://github.com/NVlabs/stylegan2-ada-pytorch/metrics/frechet_inception_distance.py | stylegan2-ada-pytorch/metrics/frechet_inception_distance.py | https://github.com/bioinf-jku/TTUR/blob/master/fid.py | 源码实现 | -| 开源代码引入 | https://github.com/NVlabs/stylegan2-ada-pytorch/style_mixing.py | stylegan2-ada-pytorch/style_mixing.py | https://nvlabs-fi-cdn.nvidia.com/stylegan2-ada-pytorch/pretrained/metfaces.pkl | 相关数据 | -| 开源代码引入 | https://github.com/NVlabs/stylegan2-ada-pytorch/metrics/perceptual_path_length.py | stylegan2-ada-pytorch/metrics/perceptual_path_length.py | https://github.com/NVlabs/stylegan/blob/master/metrics/perceptual_path_length.py | 源码实现 | -| 开源代码引入 | https://github.com/NVlabs/stylegan2-ada-pytorch/metrics/inception_score.py | stylegan2-ada-pytorch/metrics/inception_score.py | https://github.com/openai/improved-gan/blob/master/inception_score/model.py | 源码实现 | -| 开源代码引入 | https://github.com/NVlabs/stylegan2-ada-pytorch/README.md | stylegan2-ada-pytorch/projector.py | https://nvlabs-fi-cdn.nvidia.com/stylegan2-ada-pytorch/pretrained/ffhq.pkl | 相关数据 | -| 开源代码引入 | https://github.com/NVlabs/stylegan2-ada-pytorch/metrics/kernel_inception_distance.py | stylegan2-ada-pytorch/metrics/kernel_inception_distance.py | http://download.tensorflow.org/models/image/imagenet/inception-2015-12-05.tgz | 数据集地址 | -| 开源代码引入 | https://github.com/NVlabs/stylegan2-ada-pytorch/metrics/precision_recall.py | stylegan2-ada-pytorch/metrics/precision_recall.py | https://github.com/kynkaat/improved-precision-and-recall-metric/blob/master/precision_recall.py | 源码实现 | -| 开源代码引入 | https://github.com/NVlabs/stylegan2-ada-pytorch/README.md | stylegan2-ada-pytorch/generate.py | https://nvlabs-fi-cdn.nvidia.com/stylegan2-ada-pytorch/pretrained/cifar10.pkl | 相关数据 | -| 开源代码引入 | https://github.com/NVlabs/stylegan2-ada-pytorch/README.md | stylegan2-ada-pytorch/legacy.py | https://nvlabs-fi-cdn.nvidia.com/stylegan2/networks/stylegan2-cat-config-f.pkl | 相关数据 | -| 开源代码引入 | https://github.com/NVlabs/stylegan2-ada-pytorch/README.md | stylegan2-ada-pytorch/calc_metrics.py | https://nvlabs-fi-cdn.nvidia.com/stylegan2-ada-pytorch/pretrained/ffhq.pkl | 相关数据 | -| 开发引入 | / | stylegan2-ada-pytorch/inception/download_inception.txt | https://pan.baidu.com/s/1CBiKXaBzS8A0IGxcBivbTA | 预训练权重 | +| 文件位置 | 公网地址 | 公网地址用途 | 
+|--------------------------------------------------------------------------------------------------------|--------------------------------------------------------------------------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/others/stylegan2-ada-pytorch/metrics/frechet_inception_distance.py | https://nvlabs-fi-cdn.nvidia.com/stylegan2-ada-pytorch/pretrained/metrics/inception-2015-12-05.pt | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/others/stylegan2-ada-pytorch/metrics/inception_score.py | https://nvlabs-fi-cdn.nvidia.com/stylegan2-ada-pytorch/pretrained/metrics/inception-2015-12-05.pt | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/others/stylegan2-ada-pytorch/metrics/kernel_inception_distance.py | https://nvlabs-fi-cdn.nvidia.com/stylegan2-ada-pytorch/pretrained/metrics/inception-2015-12-05.pt | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/others/stylegan2-ada-pytorch/metrics/perceptual_path_length.py | https://nvlabs-fi-cdn.nvidia.com/stylegan2-ada-pytorch/pretrained/metrics/vgg16.pt | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/others/stylegan2-ada-pytorch/metrics/precision_recall.py | https://nvlabs-fi-cdn.nvidia.com/stylegan2-ada-pytorch/pretrained/metrics/vgg16.pt | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/others/stylegan2-ada-pytorch/projector.py | https://nvlabs-fi-cdn.nvidia.com/stylegan2-ada-pytorch/pretrained/metrics/vgg16.pt | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/others/stylegan2-ada-pytorch/train.py | https://nvlabs-fi-cdn.nvidia.com/stylegan2-ada-pytorch/pretrained/transfer-learning-source-nets/lsundog-res256-paper256-kimg100000-noaug.pkl | 模型权重 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/others/stylegan2-ada-pytorch/train.py | https://nvlabs-fi-cdn.nvidia.com/stylegan2-ada-pytorch/pretrained/transfer-learning-source-nets/ffhq-res512-mirror-stylegan2-noaug.pkl | 模型权重 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/others/stylegan2-ada-pytorch/train.py | https://nvlabs-fi-cdn.nvidia.com/stylegan2-ada-pytorch/pretrained/transfer-learning-source-nets/ffhq-res256-mirror-paper256-noaug.pkl | 模型权重 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/others/stylegan2-ada-pytorch/train.py | https://nvlabs-fi-cdn.nvidia.com/stylegan2-ada-pytorch/pretrained/transfer-learning-source-nets/ffhq-res1024-mirror-stylegan2-noaug.pkl | 模型权重 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/others/stylegan2-ada-pytorch/train.py | https://nvlabs-fi-cdn.nvidia.com/stylegan2-ada-pytorch/pretrained/transfer-learning-source-nets/celebahq-res256-mirror-paper256-kimg100000-ada-target0.5.pkl | 模型权重 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/pose_estimation/AlphaPose/public_address_statement.md b/PyTorch/contrib/cv/pose_estimation/AlphaPose/public_address_statement.md index 4a7965345ca29b25eced7d8f521448b16651293a..98ee6bf898d29fb53d0f21aab66ef50b01ae073c 100644 --- a/PyTorch/contrib/cv/pose_estimation/AlphaPose/public_address_statement.md +++ b/PyTorch/contrib/cv/pose_estimation/AlphaPose/public_address_statement.md @@ -1,96 +1,18 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------|--------------------------------------------|--------------------------------------------------------------------|---------| -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose.git | AlphaPose/setup.cfg | https://pypi.tuna.tsinghua.edu.cn/simple | 配置下载源 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose.git | AlphaPose/setup.py | 
https://github.com/MVIG-SJTU/AlphaPose | 源码地址 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose.git | AlphaPose/setup.py | https://github.com/philferriere/cocoapi.git#subdirectory=PythonAPI | 下载第三方库 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose.git | AlphaPose/setup.py | https://github.com/yanfengliu/cython_bbox.git | 下载第三方库 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose.git | AlphaPose/trackers/ReidModels/osnet.py | https://drive.google.com/uc?id=1LaG1EJpHrxdAxKnSCJ_i0u-nbxSAeiFY | 下载预训练模型 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose.git | AlphaPose/trackers/ReidModels/osnet.py | https://drive.google.com/uc?id=1uwA9fElHOk3ZogwbeY5GkLI6QPTX70Hq | 下载预训练模型 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose.git | AlphaPose/trackers/ReidModels/osnet.py | https://drive.google.com/uc?id=16DGLbZukvVYgINws8u8deSaOqjybZ83i | 下载预训练模型 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose.git | AlphaPose/trackers/ReidModels/osnet.py | https://drive.google.com/uc?id=1rb8UN5ZzPKRc_xvtHlyDh-cSz88YX9hs | 下载预训练模型 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose.git | AlphaPose/trackers/ReidModels/osnet.py | https://drive.google.com/uc?id=1sr90V6irlYYDd4_4ISU2iruoRG8J__6l | 下载预训练模型 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose.git | AlphaPose/trackers/ReidModels/osnet_ain.py | https://drive.google.com/uc?id=1-CaioD9NaqbHK_kzSMW8VE4_3KcsRjEo | 下载预训练模型 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose.git | AlphaPose/trackers/ReidModels/ResNet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose.git | AlphaPose/trackers/ReidModels/ResNet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose.git | AlphaPose/trackers/ReidModels/ResNet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose.git | AlphaPose/trackers/ReidModels/ResNet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose.git | AlphaPose/trackers/ReidModels/ResNet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose.git | AlphaPose/trackers/ReidModels/resnet_fc.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose.git | AlphaPose/trackers/ReidModels/resnet_fc.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose.git | AlphaPose/trackers/ReidModels/resnet_fc.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose.git | AlphaPose/trackers/ReidModels/resnet_fc.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose.git | AlphaPose/trackers/ReidModels/resnet_fc.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose.git | AlphaPose/trackers/ReidModels/resnet_fc.py | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose.git | AlphaPose/trackers/ReidModels/resnet_fc.py | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose.git | AlphaPose/detector/tracker/models.py 
| https://pjreddie.com/media/files/ | 下载预训练模型 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/trackers/utils/utils.py | AlphaPose/trackers/utils/utils.py | https://github.com/rbgirshick/py-faster-rcnn | 源码实现 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/trackers/utils/basetransforms.py | AlphaPose/alphapose/datasets/mpii.py | jeff.lee.sjtu@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/trackers/ReidModels/ResNet.py | AlphaPose/trackers/ReidModels/ResNet.py | https://download.pytorch.org/models/wide_resnet50_2-95faca4d.pth | 预训练模型 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/detector/efficientdet/effdet/efficientdet.py | AlphaPose/detector/efficientdet/effdet/efficientdet.py | https://github.com/google/automl/tree/master/efficientdet | 源码实现 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/trackers/utils/basetransforms.py | AlphaPose/alphapose/models/fastpose_duc_dense.py | jeff.lee.sjtu@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/trackers/utils/basetransforms.py | AlphaPose/alphapose/models/layers/DUC.py | jeff.lee.sjtu@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/alphapose/models/layers/dcn/src/deform_conv_cuda_kernel.cu | AlphaPose/alphapose/models/layers/dcn/src/deform_conv_cuda_kernel.cu | https://arxiv.org/abs/1703.06211 | 论文地址 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/trackers/utils/utils.py | AlphaPose/trackers/tracking/utils/utils.py | https://storage.googleapis.com/ultralytics/yolov3/results_v1.txt | 相关说明 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/trackers/PoseFlow/utils.py | AlphaPose/trackers/PoseFlow/utils.py | yuliangxiu@sjtu.edu.cn | 邮箱地址 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/detector/efficientdet/effdet/efficientdet.py | AlphaPose/detector/efficientdet/effdet/efficientdet.py | https://arxiv.org/abs/1911.09070 | 论文地址 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/trackers/utils/basetransforms.py | AlphaPose/alphapose/models/layers/PixelUnshuffle.py | jeff.lee.sjtu@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/alphapose/models/hrnet.py | AlphaPose/alphapose/models/hrnet.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/trackers/utils/basetransforms.py | AlphaPose/alphapose/models/layers/SE_module.py | jeff.lee.sjtu@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/trackers/PoseFlow/parallel_process.py | AlphaPose/trackers/PoseFlow/parallel_process.py | http://danshiebler.com/2016-09-14-parallel-progress-bar/ | 相关说明 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/trackers/ReidModels/ResNet.py | AlphaPose/trackers/ReidModels/ResNet.py | thutanghy@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/trackers/utils/basetransforms.py | AlphaPose/alphapose/datasets/coco_det.py | jeff.lee.sjtu@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/trackers/utils/basetransforms.py | AlphaPose/alphapose/utils/metrics.py | jeff.lee.sjtu@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/trackers/utils/basetransforms.py | AlphaPose/alphapose/datasets/concat_dataset.py | jeff.lee.sjtu@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/trackers/utils/basetransforms.py | AlphaPose/alphapose/utils/logger.py | jeff.lee.sjtu@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/detector/tracker_api.py | AlphaPose/detector/tracker_api.py | fhaoshu@gmail.com | 邮箱地址 | -| 开源代码引入 | 
https://github.com/MVIG-SJTU/AlphaPose/trackers/utils/basetransforms.py | AlphaPose/alphapose/opt.py | jeff.lee.sjtu@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/trackers/ReidModels/ResNet.py | AlphaPose/trackers/ReidModels/bn_linear.py | thutanghy@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/trackers/PoseFlow/utils.py | AlphaPose/trackers/PoseFlow/tracker-general.py | yuliangxiu@sjtu.edu.cn | 邮箱地址 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/trackers/utils/bbox.py | AlphaPose/trackers/utils/bbox.py | https://github.com/Microsoft/human-pose-estimation.pytorch | 源码实现 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/trackers/PoseFlow/utils.py | AlphaPose/trackers/PoseFlow/matching.py | yuliangxiu@sjtu.edu.cn | 邮箱地址 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/trackers/utils/basetransforms.py | AlphaPose/alphapose/models/fastpose.py | jeff.lee.sjtu@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/detector/efficientdet/effdet/anchors.py | AlphaPose/detector/efficientdet/effdet/anchors.py | https://github.com/tensorflow/tpu/blob/master/models/official/retinanet/anchors.py | 源码实现 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/trackers/utils/basetransforms.py | AlphaPose/alphapose/models/simplepose.py | jeff.lee.sjtu@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/trackers/utils/utils.py | AlphaPose/trackers/tracking/utils/utils.py | https://github.com/rafaelpadilla/Object-Detection-Metrics | 源码实现 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/trackers/utils/basetransforms.py | AlphaPose/trackers/utils/basetransforms.py | jeff.lee.sjtu@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/trackers/ReidModels/ResNet.py | AlphaPose/trackers/ReidModels/ResNet.py | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 预训练模型 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/trackers/utils/utils.py | AlphaPose/trackers/utils/utils.py | https://github.com/rafaelpadilla/Object-Detection-Metrics | 源码实现 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/trackers/utils/utils.py | AlphaPose/detector/tracker/utils/utils.py | https://storage.googleapis.com/ultralytics/yolov3/results_v1.txt | 相关说明 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/trackers/PoseFlow/utils.py | AlphaPose/trackers/PoseFlow/tracker-baseline.py | yuliangxiu@sjtu.edu.cn | 邮箱地址 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/detector/tracker/utils/datasets.py | AlphaPose/detector/tracker/utils/datasets.py | https://medium.com/uruvideo/dataset-augmentation-with-random-homographies-a8f4b44830d4 | 相关说明 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/trackers/utils/utils.py | AlphaPose/detector/tracker/utils/utils.py | https://github.com/rafaelpadilla/Object-Detection-Metrics | 源码实现 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/detector/efficientdet/effdet/efficientdet.py | AlphaPose/detector/efficientdet/effdet/config/config.py | https://github.com/google/automl/tree/master/efficientdet | 源码实现 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/detector/efficientdet/effdet/anchors.py | AlphaPose/detector/efficientdet/effdet/anchors.py | https://github.com/google/automl/blob/6f6694cec1a48cdb33d5d1551a2d5db8ad227798/efficientdet/anchors.py | 源码实现 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/trackers/PoseFlow/poseflow_infer.py | AlphaPose/detector/apis.py | xuchao.19962007@sjtu.edu.cn | 邮箱地址 | -| 开源代码引入 | 
https://github.com/MVIG-SJTU/AlphaPose/alphapose/models/layers/dcn/src/deform_conv_cuda_kernel.cu | AlphaPose/alphapose/models/layers/dcn/src/deform_conv_cuda_kernel.cu | https://github.com/chengdazhi/Deformable-Convolution-V2-PyTorch/blob/mmdetection/mmdet/ops/dcn/src/deform_conv_cuda_kernel.cu | 源码实现 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/alphapose/models/layers/dcn/src/deform_pool_cuda_kernel.cu | AlphaPose/alphapose/models/layers/dcn/src/deform_pool_cuda_kernel.cu | https://github.com/chengdazhi/Deformable-Convolution-V2-PyTorch/blob/mmdetection/mmdet/ops/dcn/src/cuda/deform_psroi_pooling_cuda.cu | 源码实现 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/trackers/utils/basetransforms.py | AlphaPose/alphapose/models/fastpose_duc.py | jeff.lee.sjtu@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/trackers/utils/utils.py | AlphaPose/detector/tracker/utils/utils.py | https://github.com/rbgirshick/py-faster-rcnn | 源码实现 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/trackers/utils/basetransforms.py | AlphaPose/alphapose/models/layers/SE_Resnet.py | jeff.lee.sjtu@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/detector/efficientdet/effdet/object_detection/__init__.py | AlphaPose/detector/efficientdet/effdet/object_detection/__init__.py | https://github.com/tensorflow/tpu/tree/master/models/official/retinanet | 源码实现 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/detector/tracker/utils/evaluation.py | AlphaPose/detector/tracker/utils/evaluation.py | https://github.com/longcw/py-motmetrics | 源码实现 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/trackers/utils/basetransforms.py | AlphaPose/alphapose/models/layers/dcn/DCN.py | jeff.lee.sjtu@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/trackers/utils/basetransforms.py | AlphaPose/alphapose/datasets/custom.py | jeff.lee.sjtu@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/trackers/utils/basetransforms.py | AlphaPose/alphapose/models/layers/Resnet.py | jeff.lee.sjtu@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/trackers/utils/basetransforms.py | AlphaPose/alphapose/datasets/infer.py | jeff.lee.sjtu@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/trackers/ReidModels/resnet_fc.py | AlphaPose/trackers/ReidModels/resnet_fc.py | https://github.com/pytorch/vision | 源码实现 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/trackers/utils/utils.py | AlphaPose/trackers/utils/utils.py | https://github.com/dbolya/yolact/blob/master/layers/functions/detection.py | 源码实现 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/trackers/ReidModels/resnet_fc.py | AlphaPose/trackers/ReidModels/resnet_fc.py | https://arxiv.org/abs/1706.02677 | 论文地址 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/trackers/utils/basetransforms.py | AlphaPose/alphapose/utils/transforms.py | jeff.lee.sjtu@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/trackers/utils/bbox.py | AlphaPose/alphapose/utils/bbox.py | https://github.com/Microsoft/human-pose-estimation.pytorch | 源码实现 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/trackers/utils/basetransforms.py | AlphaPose/alphapose/datasets/mscoco.py | jeff.lee.sjtu@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/trackers/PoseFlow/poseflow_infer.py | AlphaPose/trackers/PoseFlow/poseflow_infer.py | xuchao.19962007@sjtu.edu.cn | 邮箱地址 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/trackers/utils/transform.py | 
AlphaPose/trackers/utils/transform.py | https://arxiv.org/pdf/1708.04896.pdf | 论文地址 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/trackers/utils/basetransforms.py | AlphaPose/scripts/demo.py | jeff.lee.sjtu@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/setup.py | AlphaPose/setup.py | https://github.com/philferriere/cocoapi#subdirectory=PythonAPI | 源码实现 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/trackers/ReidModels/ResNet.py | AlphaPose/trackers/ReidModels/ResBnLin.py | thutanghy@gmail.com | 邮箱地址 | -| 开发引入 | / | AlphaPose/detector/efficientdet/effdet/object_detection/faster_rcnn_box_coder.py | http://arxiv.org/abs/1506.01497 | 论文地址 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/trackers/utils/basetransforms.py | AlphaPose/alphapose/utils/presets/simple_transform.py | jeff.lee.sjtu@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/trackers/utils/basetransforms.py | AlphaPose/alphapose/utils/env.py | jeff.lee.sjtu@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/trackers/utils/utils.py | AlphaPose/trackers/utils/utils.py | https://storage.googleapis.com/ultralytics/yolov3/results_v1.txt | 相关说明 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/trackers/PoseFlow/poseflow_infer.py | AlphaPose/detector/yolo_api.py | xuchao.19962007@sjtu.edu.cn | 邮箱地址 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/trackers/utils/basetransforms.py | AlphaPose/alphapose/models/layers/ShuffleResnet.py | jeff.lee.sjtu@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/detector/tracker_api.py | AlphaPose/detector/effdet_api.py | fhaoshu@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/detector/efficientdet/effdet/efficientdet.py | AlphaPose/detector/efficientdet/effdet/bench.py | https://github.com/google/automl/tree/master/efficientdet | 源码实现 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/trackers/utils/utils.py | AlphaPose/trackers/tracking/utils/utils.py | https://github.com/rbgirshick/py-faster-rcnn | 源码实现 | -| 开源代码引入 | https://github.com/MVIG-SJTU/AlphaPose/trackers/utils/basetransforms.py | AlphaPose/alphapose/models/criterion.py | jeff.lee.sjtu@gmail.com | 邮箱地址 | +| 文件位置 | 公网地址 | 公网地址用途 | +|------------------------------------------------------------------------------------------------|-------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/AlphaPose/detector/tracker/models.py | https://pjreddie.com/media/files/ | 下载依赖 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/AlphaPose/setup.cfg | https://pypi.tuna.tsinghua.edu.cn/simple | 相关配置 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/AlphaPose/trackers/ReidModels/ResNet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/AlphaPose/trackers/ReidModels/ResNet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/AlphaPose/trackers/ReidModels/ResNet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/AlphaPose/trackers/ReidModels/ResNet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/AlphaPose/trackers/ReidModels/ResNet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/AlphaPose/trackers/ReidModels/resnet_fc.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/AlphaPose/trackers/ReidModels/resnet_fc.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/AlphaPose/trackers/ReidModels/resnet_fc.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/AlphaPose/trackers/ReidModels/resnet_fc.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/AlphaPose/trackers/ReidModels/resnet_fc.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/AlphaPose/trackers/ReidModels/resnet_fc.py | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/AlphaPose/trackers/ReidModels/resnet_fc.py | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/pose_estimation/DeepPose/public_address_statement.md b/PyTorch/contrib/cv/pose_estimation/DeepPose/public_address_statement.md index 6f7b64c652a58db036a134991de3e3f19b36d31b..6d41a1b0b04fad7ec175794d770466b16731d0ca 100644 --- a/PyTorch/contrib/cv/pose_estimation/DeepPose/public_address_statement.md +++ b/PyTorch/contrib/cv/pose_estimation/DeepPose/public_address_statement.md @@ -1,151 +1,78 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|------|--------|-----------------------------------------|-------------------------------------------------------------------------------------------------------------------------|---------| -| 开发引入 | / | DeepPose/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg11_imagenet-01ecd97e.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg13_imagenet-9ad3945d.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg16_imagenet-91b6d117.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg19_imagenet-fee352a8.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg11_bn_imagenet-6fbbbf3f.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg13_bn_imagenet-4b5f9390.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg16_bn_imagenet-3ac6d8fd.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg19_bn_imagenet-7c058385.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/mmcls.json | 
https://download.openmmlab.com/mmclassification/v0/resnet/resnet18_batch256_imagenet_20200708-34ab8f90.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet34_batch256_imagenet_20200708-32ffb4f7.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet50_batch256_imagenet_20200708-cfb998bf.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet101_batch256_imagenet_20200708-753f3608.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet152_batch256_imagenet_20200708-ec25b1f9.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnetv1d50_batch256_imagenet_20200708-1ad0ce94.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnetv1d101_batch256_imagenet_20200708-9cb302ef.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnetv1d152_batch256_imagenet_20200708-e79cb6a2.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnext/resnext50_32x4d_batch256_imagenet_20200708-c07adbb7.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnext/resnext101_32x4d_batch256_imagenet_20200708-87f2d1c9.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnext/resnext101_32x8d_batch256_imagenet_20200708-1ec34aa7.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnext/resnext152_32x4d_batch256_imagenet_20200708-aab5034c.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/se-resnet/se-resnet50_batch256_imagenet_20200804-ae206104.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/se-resnet/se-resnet101_batch256_imagenet_20200804-ba5b51d4.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnest/resnest50_imagenet_converted-1ebf0afe.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnest/resnest101_imagenet_converted-032caa52.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnest/resnest200_imagenet_converted-581a60f2.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnest/resnest269_imagenet_converted-59930960.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/shufflenet_v1/shufflenet_v1_batch1024_imagenet_20200804-5d6cec73.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/shufflenet_v2/shufflenet_v2_batch1024_imagenet_20200812-5bf4721e.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/mmcls.json | 
https://download.openmmlab.com/mmclassification/v0/mobilenet_v2/mobilenet_v2_batch256_imagenet_20200708-3b2dc3af.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/vgg16_caffe-292e1171.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_caffe-788b5fa3.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_msra-5891d200.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_caffe-3ad79236.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_msra-6cc46731.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_32x8d-1516f1aa.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext50-32x4d-0ab1a123.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_32x4d-a5af3160.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_64x4d-ee2c6f71.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_gn_thangvubk-ad1730dd.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_gn-9186a21c.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_gn-cac0ab98.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_gn_ws-15beedd8.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_gn_ws-3e3c308c.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext50_32x4d_gn_ws-0d87ac85.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_32x4d_gn_ws-34ac1a9e.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext50_32x4d_gn-c7e8b754.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_32x4d_gn-ac3bb84e.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w18_small-b5a04e21.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w18-00eb2006.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w32-dc9eeb4f.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w40-ed0b031c.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w48-d2186c55.pth | 下载预训练模型 | 
-| 开发引入 | / | DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/bn_inception_caffe-ed2e8665.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/i3d_r50_f32s2_k400-2c57e077.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/nl3d_r50_f32s2_k400-fa7e7caa.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/res2net101_v1d_26w_4s_mmdetv2-f0a600f9.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_400mf-a5b10d96.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_800mf-1f4be4c7.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_1.6gf-5791c176.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_3.2gf-c2599b0f.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_4.0gf-a88f671e.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_6.4gf-006af45d.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_8.0gf-3c68abe7.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_12gf-4c2a3350.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet18_v1c-b5776b93.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_v1c-2cccc1ad.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_v1c-e67eebb6.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/mmediting/third_party/vgg_state_dict.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/mmediting/third_party/model_best_resnet34_En_nomixup.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/mmediting/third_party/mobilenet_v2.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/mobilenet_v3_large-bc2c3fd3.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/mobilenet_v3_small-47085aa1.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnest50_d2-7497a55b.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnest101_d2-f3b931b2.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnest200_d2-ca88e41f.pth | 下载预训练模型 | -| 开发引入 | / | DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/darknet53-a628ea1b.pth | 下载预训练模型 | 
-| 开发引入 | / | DeepPose/mmpose/core/post_processing/one_euro_filter.py | https://github.com/HoBeom/OneEuroFilter-Numpy | 源码实现 | -| 开发引入 | / | DeepPose/mmcv/ops/tin_shift.py | https://github.com/deepcs233/TIN/blob/master/cuda_shift/rtc_wrap.py | 源码实现 | -| 开发引入 | / | DeepPose/mmcv/cnn/bricks/conv_ws.py | https://arxiv.org/pdf/2006.02334.pdf | 论文地址 | -| 开发引入 | / | DeepPose/mmcv/cnn/bricks/plugin.py | https://inflection.readthedocs.io/en/latest/#inflection.underscore | 相关说明 | -| 开发引入 | / | DeepPose/mmcv/ops/carafe.py | https://arxiv.org/abs/1905.02188 | 论文地址 | -| 开发引入 | / | DeepPose/mmcv/cnn/bricks/wrappers.py | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/wrappers.py | 源码实现 | -| 开发引入 | / | DeepPose/mmcv/ops/nms.py | https://github.com/pytorch/vision/ | 源码实现 | -| 开发引入 | / | DeepPose/mmcv/ops/upfirdn2d.py | https://www.mathworks.com/help/signal/ref/upfirdn.html | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose.git/README.md | DeepPose/mmpose/core/post_processing/nms.py | https://github.com/leoxiaobin/deep-high-resolution-net.pytorch | 源码实现 | -| 开发引入 | / | DeepPose/mmcv/runner/fp16_utils.py | https://github.com/NVIDIA/apex/blob/master/apex/fp16_utils/loss_scaler.py | 源码实现 | -| 开发引入 | / | DeepPose/mmpose/core/post_processing/group.py | https://github.com/princeton-vl/pose-ae-train/ | 源码实现 | -| 开发引入 | / | DeepPose/modelarts/train_start.py | https://arxiv.org/abs/1706.02677 | 论文地址 | -| 开发引入 | / | DeepPose/mmcv/image/colorspace.py | https://en.wikipedia.org/wiki/YCbCr#ITU-R_BT.601_conversion | 相关说明 | -| 开发引入 | / | DeepPose/mmpose/datasets/pipelines/mesh_transform.py | https://smpl.is.tue.mpg.de/ | 相关说明 | -| 开发引入 | / | DeepPose/mmcv/ops/focal_loss.py | https://github.com/rosinality/stylegan2-pytorch/blob/master/op/fused_act.py | 源码实现 | -| 开发引入 | / | DeepPose/mmcv/ops/point_sample.py | https://github.com/facebookresearch/detectron2/tree/master/projects/PointRend | 源码实现 | -| 开发引入 | / | DeepPose/mmpose/deprecated.py | https://github.com/open-mmlab/mmpose/pull/202 | 源码实现 | -| 开发引入 | / | DeepPose/mmcv/runner/hooks/lr_updater.py | https://arxiv.org/pdf/1506.01186.pdf | 论文地址 | -| 开发引入 | / | DeepPose/train.py | https://arxiv.org/abs/1706.02677 | 论文地址 | -| 开发引入 | / | DeepPose/mmcv/utils/registry.py | https://mmcv.readthedocs.io/en/latest/registry.html | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose.git/mmpose/datasets/datasets/body/coco_dataset.py | DeepPose/mmpose/datasets/datasets/top_down/topdown_coco_dataset.py | https://arxiv.org/abs/1405.0312 | 论文地址 | -| 开发引入 | / | DeepPose/mmcv/cnn/bricks/context_block.py | https://arxiv.org/abs/1904.11492 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose.git/mmpose/datasets/builder.py | DeepPose/mmpose/datasets/builder.py | https://github.com/pytorch/pytorch/issues/973 | 源码实现 | -| 开发引入 | / | DeepPose/mmcv/ops/tin_shift.py | shaoh19@mails.tsinghua.edu.cn","sjqian@cse.cuhk.edu.hk","yuliu@ee.cuhk.edu.hk | 邮箱地址 | -| 开发引入 | / | DeepPose/mmcv/ops/fused_bias_leakyrelu.py | https://github.com/rosinality/stylegan2-pytorch/blob/master/op/fused_act.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose.git/docs/zh_cn/notes/changelog.md | DeepPose/mmpose/models/detectors/top_down.py | https://github.com/open-mmlab/mmpose/pull/382 | 源码实现 | -| 开发引入 | / | DeepPose/mmcv/ops/corner_pool.py | https://github.com/princeton-vl/CornerNet-Lite | 源码实现 | -| 开发引入 | / | DeepPose/mmcv/onnx/onnx_utils/symbolic_helper.py | 
https://github.com/pytorch/pytorch/blob/75ee5756715e7161314ce037474843b68f69fc04/torch/onnx/symbolic_helper.py#L375 | 源码实现 | -| 开发引入 | / | DeepPose/mmcv/runner/hooks/momentum_updater.py | https://arxiv.org/pdf/1708.07120.pdf | 论文地址 | -| 开发引入 | / | DeepPose/mmcv/ops/psa_mask.py | https://github.com/hszhao/semseg/blob/master/lib/psa | 源码实现 | -| 开发引入 | / | DeepPose/mmcv/ops/tin_shift.py | https://github.com/mit-han-lab/temporal-shift-module | 源码实现 | -| 开发引入 | / | DeepPose/mmpose/datasets/pipelines/shared_transform.py | https://albumentations.readthedocs.io | 相关说明 | -| 开发引入 | / | DeepPose/mmcv/runner/hooks/lr_updater.py | https://arxiv.org/pdf/1708.07120.pdf | 论文地址 | -| 开发引入 | / | DeepPose/mmcv/cnn/utils/weight_init.py | https://www.cv-foundation.org/openaccess/content_iccv_2015/ | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose.git/mmpose/evaluation/functional/mesh_eval.py | DeepPose/mmpose/models/mesh_heads/discriminator.py | https://github.com/akanazawa/hmr | 源码实现 | -| 开发引入 | / | DeepPose/mmcv/runner/hooks/logger/mlflow.py | https://www.mlflow.org/docs/latest/index.html | 相关说明 | -| 开发引入 | / | DeepPose/mmcv/onnx/onnx_utils/symbolic_helper.py | https://github.com/pytorch/pytorch | 源码实现 | -| 开发引入 | / | DeepPose/mmcv/runner/hooks/optimizer.py | https://arxiv.org/abs/1710.03740 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose.git/mmpose/utils/hooks.py | DeepPose/mmpose/utils/hooks.py | https://stackoverflow.com/questions/31174295/getattr-and-setattr-on-nested-objects | 相关说明 | -| 开发引入 | / | DeepPose/mmcv/cnn/bricks/transformer.py | https://arxiv.org/pdf/2010.04159.pdf | 论文地址 | -| 开发引入 | / | DeepPose/mmcv/image/io.py | https://github.com/lilohuang/PyTurboJPEG | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose.git/README.md | DeepPose/mmpose/core/post_processing/post_transforms.py | https://github.com/leoxiaobin/deep-high-resolution-net.pytorch | 源码实现 | -| 开发引入 | / | DeepPose/mmcv/cnn/utils/weight_init.py | http://download.openmmlab.com/mmdetection/v2.0/retinanet/ | 相关说明 | -| 开发引入 | / | DeepPose/mmcv/ops/deprecated_wrappers.py | https://github.com/rosinality/stylegan2-pytorch/blob/master/op/fused_act.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose.git/mmpose/models/backbones/resnet.py | DeepPose/mmpose/models/backbones/resnet.py | https://arxiv.org/pdf/1812.01187.pdf | 论文地址 | -| 开发引入 | / | DeepPose/mmcv/cnn/utils/flops_counter.py | https://github.com/sovrasov/flops-counter.pytorch | 源码实现 | -| 开发引入 | / | DeepPose/mmcv/ops/roi_align.py | https://github.com/facebookresearch/detectron2/ | 源码实现 | -| 开发引入 | / | DeepPose/mmpose/datasets/datasets/top_down/topdown_coco_dataset.py | https://github.com/cocodataset/cocoapi/blob/master/PythonAPI/ | 源码实现 | -| 开发引入 | / | DeepPose/mmcv/cnn/utils/weight_init.py | http://proceedings.mlr.press/v9/glorot10a/glorot10a.pdf | 论文地址 | -| 开发引入 | / | DeepPose/mmcv/ops/nms.py | https://github.com/pytorch/vision/blob | 源码实现 | -| 开发引入 | / | DeepPose/mmcv/runner/hooks/optimizer.py | https://pytorch.org/docs/stable/amp.html#torch.cuda.amp.GradScaler | 相关说明 | -| 开发引入 | / | DeepPose/mmcv/ops/saconv.py | https://arxiv.org/pdf/2006.02334.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose.git/mmpose/evaluation/functional/mesh_eval.py | DeepPose/mmpose/core/evaluation/mesh_eval.py | https://github.com/akanazawa/hmr | 源码实现 | -| 开发引入 | / | DeepPose/mmcv/cnn/bricks/depthwise_separable_conv_module.py | https://arxiv.org/pdf/1704.04861.pdf | 论文地址 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmpose.git/mmpose/datasets/transforms/common_transforms.py | DeepPose/mmpose/datasets/pipelines/shared_transform.py | https://github.com/albumentations-team/ | 源码实现 | -| 开发引入 | / | DeepPose/mmpose/core/post_processing/one_euro_filter.py | http://gvv.mpi-inf.mpg.de/projects/VNect/ | 相关说明 | -| 开发引入 | / | DeepPose/mmcv/cnn/bricks/non_local.py | https://github.com/AlexHex7/Non-local_pytorch | 源码实现 | -| 开发引入 | / | DeepPose/mmpose/models/backbones/resnet.py | https://arxiv.org/abs/1512.03385 | 论文地址 | -| 开发引入 | / | DeepPose/mmcv/cnn/bricks/non_local.py | https://arxiv.org/abs/1711.07971 | 论文地址 | -| 开发引入 | / | DeepPose/mmcv/onnx/symbolic.py | https://github.com/pytorch/pytorch | 源码实现 | -| 开发引入 | / | DeepPose/mmcv/cnn/bricks/conv_ws.py | https://arxiv.org/pdf/1903.10520.pdf | 论文地址 | -| 开发引入 | / | DeepPose/mmcv/ops/roi_align_rotated.py | https://github.com/facebookresearch/detectron2/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose.git/mmpose/models/losses/heatmap_loss.py | DeepPose/mmcv/ops/corner_pool.py | https://arxiv.org/abs/1808.01244 | 论文地址 | -| 开发引入 | / | DeepPose/mmcv/image/photometric.py | https://dl.acm.org/doi/pdf/10.1145/3065386 | 论文地址 | -| 开发引入 | / | DeepPose/mmcv/ops/tin_shift.py | https://arxiv.org/abs/2001.06499 | 论文地址 | -| 开发引入 | / | DeepPose/mmcv/engine/test.py | https://github.com/open-mmlab/mmcv/issues/985 | 源码实现 | -| 开发引入 | / | DeepPose/mmcv/runner/hooks/profiler.py | https://pytorch.org/docs/1.8.1/profiler.html#torch.profiler.profile | 相关说明 | -| 开发引入 | / | DeepPose/mmcv/ops/fused_bias_leakyrelu.py | http://arxiv.org/abs/1912.04958 | 论文地址 | -| 开发引入 | / | DeepPose/mmcv/ops/upfirdn2d.py | https://github.com/rosinality/stylegan2-pytorch/blob/master/op/upfirdn2d.py | 源码实现 | -| 开发引入 | / | DeepPose/mmcv/image/colorspace.py | https://en.wikipedia.org/wiki/YCbCr#JPEG_conversion | 相关说明 | -| 开发引入 | / | DeepPose/mmcv/ops/deform_conv.py | https://arxiv.org/pdf/1703.06211.pdf | 论文地址 | -| 开发引入 | / | DeepPose/mmcv/cnn/bricks/generalized_attention.py | https://arxiv.org/abs/1711.07971 | 论文地址 | -| 开发引入 | / | DeepPose/mmcv/cnn/bricks/transformer.py | https://arxiv.org/abs/2002.04745 | 论文地址 | +| 文件位置 | 公网地址 | 公网地址用途 | +|---------------------------------------------------------------------------------------------|-------------------------------------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg11_imagenet-01ecd97e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg13_imagenet-9ad3945d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg16_imagenet-91b6d117.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg19_imagenet-fee352a8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg11_bn_imagenet-6fbbbf3f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg13_bn_imagenet-4b5f9390.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg16_bn_imagenet-3ac6d8fd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg19_bn_imagenet-7c058385.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet18_batch256_imagenet_20200708-34ab8f90.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet34_batch256_imagenet_20200708-32ffb4f7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet50_batch256_imagenet_20200708-cfb998bf.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet101_batch256_imagenet_20200708-753f3608.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet152_batch256_imagenet_20200708-ec25b1f9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnetv1d50_batch256_imagenet_20200708-1ad0ce94.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnetv1d101_batch256_imagenet_20200708-9cb302ef.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnetv1d152_batch256_imagenet_20200708-e79cb6a2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnext/resnext50_32x4d_batch256_imagenet_20200708-c07adbb7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnext/resnext101_32x4d_batch256_imagenet_20200708-87f2d1c9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnext/resnext101_32x8d_batch256_imagenet_20200708-1ec34aa7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnext/resnext152_32x4d_batch256_imagenet_20200708-aab5034c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/se-resnet/se-resnet50_batch256_imagenet_20200804-ae206104.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/se-resnet/se-resnet101_batch256_imagenet_20200804-ba5b51d4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnest/resnest50_imagenet_converted-1ebf0afe.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnest/resnest101_imagenet_converted-032caa52.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnest/resnest200_imagenet_converted-581a60f2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnest/resnest269_imagenet_converted-59930960.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/shufflenet_v1/shufflenet_v1_batch1024_imagenet_20200804-5d6cec73.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/shufflenet_v2/shufflenet_v2_batch1024_imagenet_20200812-5bf4721e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/mobilenet_v2/mobilenet_v2_batch256_imagenet_20200708-3b2dc3af.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/vgg16_caffe-292e1171.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_caffe-788b5fa3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_msra-5891d200.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_caffe-3ad79236.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_msra-6cc46731.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_32x8d-1516f1aa.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext50-32x4d-0ab1a123.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_32x4d-a5af3160.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_64x4d-ee2c6f71.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_gn_thangvubk-ad1730dd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_gn-9186a21c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_gn-cac0ab98.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_gn_ws-15beedd8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_gn_ws-3e3c308c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext50_32x4d_gn_ws-0d87ac85.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_32x4d_gn_ws-34ac1a9e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext50_32x4d_gn-c7e8b754.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_32x4d_gn-ac3bb84e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w18_small-b5a04e21.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w18-00eb2006.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w32-dc9eeb4f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w40-ed0b031c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w48-d2186c55.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/bn_inception_caffe-ed2e8665.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/i3d_r50_f32s2_k400-2c57e077.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/nl3d_r50_f32s2_k400-fa7e7caa.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/res2net101_v1d_26w_4s_mmdetv2-f0a600f9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_400mf-a5b10d96.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_800mf-1f4be4c7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_1.6gf-5791c176.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/open_mmlab.json | 
https://download.openmmlab.com/pretrain/third_party/regnetx_3.2gf-c2599b0f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_4.0gf-a88f671e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_6.4gf-006af45d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_8.0gf-3c68abe7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_12gf-4c2a3350.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet18_v1c-b5776b93.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_v1c-2cccc1ad.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_v1c-e67eebb6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/mmediting/third_party/vgg_state_dict.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/mmediting/third_party/model_best_resnet34_En_nomixup.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/open_mmlab.json | mmedit/mobilenet_v2": "https://download.openmmlab.com/mmediting/third_party/mobilenet_v2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/mobilenet_v3_large-bc2c3fd3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/mobilenet_v3_small-47085aa1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnest50_d2-7497a55b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnest101_d2-f3b931b2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnest200_d2-ca88e41f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/DeepPose/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/darknet53-a628ea1b.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/pose_estimation/HigherHRNet/public_address_statement.md b/PyTorch/contrib/cv/pose_estimation/HigherHRNet/public_address_statement.md index d8d1bc9590b4d69b1a07e636c089622ae4cedd16..f71874f1b0acf65764e2b4e0e2f2786ab729dae2 100644 --- a/PyTorch/contrib/cv/pose_estimation/HigherHRNet/public_address_statement.md +++ b/PyTorch/contrib/cv/pose_estimation/HigherHRNet/public_address_statement.md @@ -1,4 +1,57 @@ -| 类型 | 开源代码地址 | 文件名 
| 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|----------------------------------------------------------------------------------------------------------------------------------|----------------------------------------------------|-----------------------|--------| -| 开发引入 | / | HigherHRNet/tools/_init_paths.py | leoxiaobin@gmail.com | 邮箱地址 | -| 开发引入 | / | HigherHRNet/lib/dataset/COCODataset.py | http://mscoco.org/dataset/#detections-challenge2016 | 数据集地址 | +| 文件位置 | 公网地址 | 公网地址用途 | +|--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|---------------------------------------------------------------------------------------------------|---------------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/Hourglass_for_PyTorch/mmpose-master/.github/workflows/build.yml | https://download.pytorch.org/whl/torch_stable.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/Hourglass_for_PyTorch/mmpose-master/.github/workflows/build.yml | https://download.pytorch.org/whl/torch_stable.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/Hourglass_for_PyTorch/mmpose-master/.github/workflows/build.yml | https://download.openmmlab.com/mmcv/dist/index.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/Hourglass_for_PyTorch/mmpose-master/.github/workflows/build.yml | https://developer.download.nvidia.com/compute/cuda/repos/${UBUNTU_VERSION}/x86_64/7fa2af80.pub | 下载秘钥 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/Hourglass_for_PyTorch/mmpose-master/.github/workflows/build.yml | http://developer.download.nvidia.com/compute/cuda/repos/${UBUNTU_VERSION}/x86_64/${INSTALLER} | 下载依赖 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/Hourglass_for_PyTorch/mmpose-master/.github/workflows/build.yml | https://download.openmmlab.com/mmcv/dist/index.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/Hourglass_for_PyTorch/mmpose-master/configs/bottom_up/higherhrnet/coco/higher_hrnet32_coco_512x512.py | https://download.openmmlab.com/mmpose/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/Hourglass_for_PyTorch/mmpose-master/configs/bottom_up/higherhrnet/coco/higher_hrnet32_coco_640x640.py | https://download.openmmlab.com/mmpose/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/Hourglass_for_PyTorch/mmpose-master/configs/bottom_up/higherhrnet/coco/higher_hrnet48_coco_512x512.py | https://download.openmmlab.com/mmpose/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/Hourglass_for_PyTorch/mmpose-master/configs/bottom_up/higherhrnet/crowdpose/higher_hrnet32_crowdpose_512x512.py | https://download.openmmlab.com/mmpose/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/Hourglass_for_PyTorch/mmpose-master/configs/bottom_up/higherhrnet/crowdpose/higher_hrnet32_crowdpose_640x640.py | https://download.openmmlab.com/mmpose/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/Hourglass_for_PyTorch/mmpose-master/configs/bottom_up/higherhrnet/crowdpose/higher_hrnet48_crowdpose_512x512.py | https://download.openmmlab.com/mmpose/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/Hourglass_for_PyTorch/mmpose-master/configs/bottom_up/hrnet/coco/hrnet_w32_coco_512x512.py | https://download.openmmlab.com/mmpose/ | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/Hourglass_for_PyTorch/mmpose-master/configs/bottom_up/hrnet/coco/hrnet_w32_coco_640x640.py | https://download.openmmlab.com/mmpose/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/Hourglass_for_PyTorch/mmpose-master/configs/bottom_up/hrnet/coco/hrnet_w48_coco_512x512.py | https://download.openmmlab.com/mmpose/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/Hourglass_for_PyTorch/mmpose-master/configs/bottom_up/hrnet/coco/hrnet_w48_coco_640x640.py | https://download.openmmlab.com/mmpose/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/Hourglass_for_PyTorch/mmpose-master/configs/top_down/darkpose/coco/hrnet_w32_coco_256x192_dark.py | https://download.openmmlab.com/mmpose/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/Hourglass_for_PyTorch/mmpose-master/configs/top_down/darkpose/coco/hrnet_w32_coco_384x288_dark.py | https://download.openmmlab.com/mmpose/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/Hourglass_for_PyTorch/mmpose-master/configs/top_down/darkpose/coco/hrnet_w48_coco_256x192_dark.py | https://download.openmmlab.com/mmpose/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/Hourglass_for_PyTorch/mmpose-master/configs/top_down/darkpose/coco/hrnet_w48_coco_384x288_dark.py | https://download.openmmlab.com/mmpose/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/Hourglass_for_PyTorch/mmpose-master/configs/top_down/darkpose/coco-wholebody/hrnet_w32_coco_wholebody_256x192_dark.py | https://download.openmmlab.com/mmpose/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/Hourglass_for_PyTorch/mmpose-master/configs/top_down/darkpose/coco-wholebody/hrnet_w32_coco_wholebody_384x288_dark.py | https://download.openmmlab.com/mmpose/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/Hourglass_for_PyTorch/mmpose-master/configs/top_down/darkpose/coco-wholebody/hrnet_w48_coco_wholebody_256x192_dark.py | https://download.openmmlab.com/mmpose/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/Hourglass_for_PyTorch/mmpose-master/configs/top_down/darkpose/coco-wholebody/hrnet_w48_coco_wholebody_384x288_dark.py | https://download.openmmlab.com/mmpose/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/Hourglass_for_PyTorch/mmpose-master/configs/top_down/darkpose/coco-wholebody/hrnet_w48_coco_wholebody_384x288_dark_plus.py | https://download.openmmlab.com/mmpose/top_down/ | 相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/Hourglass_for_PyTorch/mmpose-master/configs/top_down/darkpose/mpii/hrnet_w32_mpii_256x256_dark.py | https://download.openmmlab.com/mmpose/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/Hourglass_for_PyTorch/mmpose-master/configs/top_down/darkpose/mpii/hrnet_w48_mpii_256x256_dark.py | https://download.openmmlab.com/mmpose/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/Hourglass_for_PyTorch/mmpose-master/configs/top_down/hrnet/aic/hrnet_w32_aic_256x192.py | https://download.openmmlab.com/mmpose/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/Hourglass_for_PyTorch/mmpose-master/configs/top_down/hrnet/aic/hrnet_w32_aic_384x288.py | https://download.openmmlab.com/mmpose/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/Hourglass_for_PyTorch/mmpose-master/configs/top_down/hrnet/aic/hrnet_w48_aic_256x192.py | https://download.openmmlab.com/mmpose/ | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/Hourglass_for_PyTorch/mmpose-master/configs/top_down/hrnet/aic/hrnet_w48_aic_384x288.py | https://download.openmmlab.com/mmpose/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/Hourglass_for_PyTorch/mmpose-master/configs/top_down/hrnet/coco/hrnet_w32_coco_256x192.py | https://download.openmmlab.com/mmpose/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/Hourglass_for_PyTorch/mmpose-master/configs/top_down/hrnet/coco/hrnet_w32_coco_384x288.py | https://download.openmmlab.com/mmpose/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/Hourglass_for_PyTorch/mmpose-master/configs/top_down/hrnet/coco/hrnet_w48_coco_256x192.py | https://download.openmmlab.com/mmpose/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/Hourglass_for_PyTorch/mmpose-master/configs/top_down/hrnet/coco/hrnet_w48_coco_384x288.py | https://download.openmmlab.com/mmpose/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/Hourglass_for_PyTorch/mmpose-master/configs/top_down/hrnet/coco-wholebody/hrnet_w32_coco_wholebody_256x192.py | https://download.openmmlab.com/mmpose/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/Hourglass_for_PyTorch/mmpose-master/configs/top_down/hrnet/coco-wholebody/hrnet_w32_coco_wholebody_384x288.py | https://download.openmmlab.com/mmpose/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/Hourglass_for_PyTorch/mmpose-master/configs/top_down/hrnet/coco-wholebody/hrnet_w48_coco_wholebody_256x192.py | https://download.openmmlab.com/mmpose/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/Hourglass_for_PyTorch/mmpose-master/configs/top_down/hrnet/coco-wholebody/hrnet_w48_coco_wholebody_384x288.py | https://download.openmmlab.com/mmpose/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/Hourglass_for_PyTorch/mmpose-master/configs/top_down/hrnet/crowdpose/hrnet_w32_crowdpose_256x192.py | https://download.openmmlab.com/mmpose/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/Hourglass_for_PyTorch/mmpose-master/configs/top_down/hrnet/crowdpose/hrnet_w32_crowdpose_384x288.py | https://download.openmmlab.com/mmpose/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/Hourglass_for_PyTorch/mmpose-master/configs/top_down/hrnet/crowdpose/hrnet_w48_crowdpose_256x192.py | https://download.openmmlab.com/mmpose/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/Hourglass_for_PyTorch/mmpose-master/configs/top_down/hrnet/crowdpose/hrnet_w48_crowdpose_384x288.py | https://download.openmmlab.com/mmpose/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/Hourglass_for_PyTorch/mmpose-master/configs/top_down/hrnet/mpii/hrnet_w32_mpii_256x256.py | https://download.openmmlab.com/mmpose/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/Hourglass_for_PyTorch/mmpose-master/configs/top_down/hrnet/mpii/hrnet_w48_mpii_256x256.py | https://download.openmmlab.com/mmpose/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/Hourglass_for_PyTorch/mmpose-master/configs/top_down/hrnet/posetrack18/hrnet_w32_posetrack18_256x192.py | https://download.openmmlab.com/mmpose/top_down/hrnet/hrnet_w32_coco_256x192-c78dce93_20200708.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/Hourglass_for_PyTorch/mmpose-master/configs/top_down/resnet/posetrack18/res50_posetrack18_256x192.py | 
https://download.openmmlab.com/mmpose/top_down/resnet/res50_coco_256x192-ec54d7f3_20200709.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/Hourglass_for_PyTorch/mmpose-master/configs/top_down/scnet/coco/scnet101_coco_256x192.py | https://download.openmmlab.com/mmpose/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/Hourglass_for_PyTorch/mmpose-master/configs/top_down/scnet/coco/scnet101_coco_384x288.py | https://download.openmmlab.com/mmpose/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/Hourglass_for_PyTorch/mmpose-master/configs/top_down/scnet/coco/scnet50_coco_256x192.py | https://download.openmmlab.com/mmpose/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/Hourglass_for_PyTorch/mmpose-master/configs/top_down/scnet/coco/scnet50_coco_384x288.py | https://download.openmmlab.com/mmpose/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/Hourglass_for_PyTorch/mmpose-master/configs/top_down/scnet/mpii/scnet101_mpii_256x256.py | https://download.openmmlab.com/mmpose/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/Hourglass_for_PyTorch/mmpose-master/configs/top_down/scnet/mpii/scnet50_mpii_256x256.py | https://download.openmmlab.com/mmpose/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/Hourglass_for_PyTorch/mmpose-master/docker/Dockerfile | https://download.openmmlab.com/mmcv/dist/index.html | mmcv下载地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/Hourglass_for_PyTorch/mmpose-master/setup.py | openmmlab@gmail.com | maintainer邮箱 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/pose_estimation/Lightweight_OpenPose/public_address_statement.md b/PyTorch/contrib/cv/pose_estimation/Lightweight_OpenPose/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..fac60912f843bfe8bc5d439b385f9bffdf917b36 --- /dev/null +++ b/PyTorch/contrib/cv/pose_estimation/Lightweight_OpenPose/public_address_statement.md @@ -0,0 +1,3 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|-----------------------------------------------------------------------------------|--------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/Lightweight_OpenPose/train.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/pose_estimation/MIPNet/public_address_statement.md b/PyTorch/contrib/cv/pose_estimation/MIPNet/public_address_statement.md index 9a15f4799f54f9b0340ae92c967cfafa85bb74fe..3b85107bbc2bd31dc6d10f148d0355cd22739143 100644 --- a/PyTorch/contrib/cv/pose_estimation/MIPNet/public_address_statement.md +++ b/PyTorch/contrib/cv/pose_estimation/MIPNet/public_address_statement.md @@ -1,76 +1,3 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|----------------------------------------------|----------------------------|-------------------------------------------------------------------|------| -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git | MIPNet/lib/utils/server.py | https://code.jquery.com/jquery-1.10.2.min.js | 下载依赖 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git | MIPNet/lib/utils/server.py | http://luis-almeida.github.io/unveil/jquery.unveil.min.js | 下载依赖 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git | MIPNet/scripts/install.sh | https://github.com/cocodataset/cocoapi.git#subdirectory=PythonAPI | 下载依赖 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/tools/_init_paths.py | MIPNet/lib/utils/zipreader.py | 
Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/tools/_init_paths.py | MIPNet/lib/models/pose_resnet.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/lib/crowdpose-api/crowdpose-api/PythonAPI/crowdposetools/_mask.pyx | MIPNet/lib/crowdpose-api/crowdpose-api/PythonAPI/crowdposetools/cocoeval.py | http://mscoco.org/ | 相关说明 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/tools/_init_paths.py | MIPNet/tools/_init_paths.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/lib/nms/setup_linux.py | MIPNet/lib/nms/setup_linux.py | http://code.activestate.com/recipes/52224-find-a-file-given-a-search-path/ | 相关说明 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/tools/_init_paths.py | MIPNet/lib/core/evaluate.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/tools/_init_paths.py | MIPNet/lib/dataset/JointsLambdaDataset.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/tools/_init_paths.py | MIPNet/lib/core/validate.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/tools/_init_paths.py | MIPNet/lib/utils/vis.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/tools/_init_paths.py | MIPNet/lib/config/__init__.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/tools/_init_paths.py | MIPNet/lib/core/function_cutmix.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/tools/_init_paths.py | MIPNet/lib/utils/transforms_np.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/tools/_init_paths.py | MIPNet/lib/models/pose_hrnet_multi_task_lambda.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/tools/_init_paths.py | MIPNet/lib/utils/utils.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/tools/_init_paths.py | MIPNet/tools/test.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/tools/_init_paths.py | MIPNet/lib/nms/gpu_nms.pyx | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/tools/_init_paths.py | MIPNet/lib/utils/vis_coco.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/tools/_init_paths.py | MIPNet/lib/models/pose_resnet_se.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/tools/_init_paths.py | MIPNet/lib/models/__init__.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/tools/_init_paths.py | MIPNet/lib/dataset/JointsDataset.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/tools/_init_paths.py | MIPNet/lib/core/validate_general.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/tools/_init_paths.py | MIPNet/lib/core/visualize_lambda.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/lib/nms/nms_kernel.cu | MIPNet/lib/nms/gpu_nms.cu | https://github.com/shaoqingren/faster_rcnn | 源码实现 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/tools/_init_paths.py | MIPNet/lib/dataset/coco.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | 
https://github.com/rawalkhirodkar/MIPNet.git/scripts/install.sh | MIPNet/scripts/install.sh | https://github.com/cocodataset/cocoapi.git#subdirectory=PythonAPI | 源码实现 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/scripts/install.sh | MIPNet/scripts/install.sh | https://github.com/cocodataset/cocoapi/pull/354 | 源码实现 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/tools/_init_paths.py | MIPNet/lib/dataset/__init__.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/tools/_init_paths.py | MIPNet/lib/core/loss.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/tools/_init_paths.py | MIPNet/lib/core/function.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/tools/_init_paths.py | MIPNet/tools/train.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/tools/_init_paths.py | MIPNet/tools/lambda/train_lambda_real.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/tools/_init_paths.py | MIPNet/lib/dataset/ochuman.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/lib/nms/setup_linux.py | MIPNet/lib/nms/setup_linux.py | https://github.com/rbgirshick/py-faster-rcnn | 源码实现 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/tools/_init_paths.py | MIPNet/lib/models/pose_hrnet_se_lambda_visualize.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开发引入 | https://github.com/rawalkhirodkar/MIPNet.git/lib/dataset/crowdpose_lambda.py | MIPNet/lib/dataset/crowdpose_lambda.py | bcheng9@illinois.edu | 邮箱地址 | -| 开发引入 | https://github.com/rawalkhirodkar/MIPNet.git/lib/dataset/crowdpose_lambda.py | MIPNet/lib/dataset/crowdpose_lambda.py | leoxiaobin@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/lib/crowdpose-api/crowdpose-api/PythonAPI/crowdposetools/_mask.pyx | MIPNet/lib/crowdpose-api/crowdpose-api/PythonAPI/crowdposetools/coco.py | http://mscoco.org/ | 相关说明 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/tools/_init_paths.py | MIPNet/lib/core/inference.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/tools/_init_paths.py | MIPNet/lib/config/default.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/tools/_init_paths.py | MIPNet/lib/core/train.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/lib/nms/nms_kernel.cu | MIPNet/lib/nms/nms_kernel.cu | https://github.com/shaoqingren/faster_rcnn | 源码实现 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/tools/_init_paths.py | MIPNet/lib/core/train_general_sequential.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/tools/_init_paths.py | MIPNet/tools/lambda_general/train_lambda_real_012.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/tools/_init_paths.py | MIPNet/lib/nms/cpu_nms.pyx | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/tools/_init_paths.py | MIPNet/lib/models/pose_resnet_se_lambda.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/tools/_init_paths.py | MIPNet/lib/dataset/JointsLambdaGeneralDataset.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | 
https://github.com/rawalkhirodkar/MIPNet.git/lib/crowdpose-api/crowdpose-api/PythonAPI/crowdposetools/_mask.pyx | MIPNet/lib/crowdpose-api/crowdpose-api/PythonAPI/crowdposetools/_mask.pyx | http://mscoco.org/ | 相关说明 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/tools/_init_paths.py | MIPNet/lib/config/models.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/tools/_init_paths.py | MIPNet/tools/lambda/_init_paths.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/tools/_init_paths.py | MIPNet/lib/models/pose_hrnet_se.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开发引入 | / | MIPNet/lib/crowdpose-api/crowdpose-api/common/gason.cpp | https://github.com/vivkin/gason | 源码实现 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/tools/_init_paths.py | MIPNet/lib/models/pose_hrnet.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开发引入 | / | MIPNet/lib/dataset/crowdpose.py | bcheng9@illinois.edu","leoxiaobin@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/lib/core/inference.py | MIPNet/lib/core/inference.py | https://github.com/ilovepose/DarkPose/blob/master/lib/core/inference.py | 源码实现 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/tools/_init_paths.py | MIPNet/lib/dataset/coco_lambda.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/tools/_init_paths.py | MIPNet/lib/core/function_random_segmix.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/tools/_init_paths.py | MIPNet/tools/lambda_general/train_lambda_real_0123.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/tools/_init_paths.py | MIPNet/lib/core/function_segmix.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/tools/_init_paths.py | MIPNet/lib/dataset/coco_lambda_0123.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开发引入 | / | MIPNet/lib/crowdpose-api/crowdpose-api/common/gason.h | https://github.com/vivkin/gason | 源码实现 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/tools/_init_paths.py | MIPNet/lib/core/train_general.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/tools/_init_paths.py | MIPNet/tools/lambda_general/_init_paths.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/lib/crowdpose-api/crowdpose-api/PythonAPI/crowdposetools/_mask.pyx | MIPNet/lib/crowdpose-api/crowdpose-api/PythonAPI/crowdposetools/mask.py | http://mscoco.org/ | 相关说明 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/tools/_init_paths.py | MIPNet/lib/utils/transforms.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/lib/crowdpose-api/crowdpose-api/PythonAPI/crowdposetools/_mask.pyx | MIPNet/lib/crowdpose-api/crowdpose-api/common/maskApi.c | http://mscoco.org/ | 相关说明 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/tools/_init_paths.py | MIPNet/lib/core/function_cutout.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/tools/_init_paths.py | MIPNet/lib/dataset/coco_lambda_012.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/tools/_init_paths.py | MIPNet/tools/train_ddp.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/lib/nms/setup_linux.py | MIPNet/lib/nms/nms.py | 
https://github.com/rbgirshick/py-faster-rcnn | 源码实现 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/tools/_init_paths.py | MIPNet/lib/models/pose_hrnet_se_lambda.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/rawalkhirodkar/MIPNet.git/lib/crowdpose-api/crowdpose-api/PythonAPI/crowdposetools/_mask.pyx | MIPNet/lib/crowdpose-api/crowdpose-api/common/maskApi.h | http://mscoco.org/ | 相关说明 | +| 文件位置 | 公网地址 | 公网地址用途 | +|--------------------------------------------------------------------------------|----------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/MIPNet/lib/utils/server.py | https://code.jquery.com/jquery-1.10.2.min.js | 下载依赖 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/pose_estimation/ST-GCN/public_address_statement.md b/PyTorch/contrib/cv/pose_estimation/ST-GCN/public_address_statement.md index 3c572d184a7792fefdf82efb099c4b8762c3bad2..5fec7b25ae9b4fcf6cdcabc0f92e5644359d9495 100644 --- a/PyTorch/contrib/cv/pose_estimation/ST-GCN/public_address_statement.md +++ b/PyTorch/contrib/cv/pose_estimation/ST-GCN/public_address_statement.md @@ -1,11 +1,4 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|-----------------------------------------------------------------------------------|----------------------------|-----------------------------------------------------------------------|---------| -| 开源代码引入 | https://github.com/open-mmlab/mmskeleton/tree/master/deprecated/origin_stgcn_repo | ST-GCN/tools/get_models.sh | https://s3-us-west-1.amazonaws.com/yysijie-data/public/st-gcn/models/ | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmskeleton/tree/master/deprecated/origin_stgcn_repo | ST-GCN/tools/get_models.sh | http://posefs1.perception.cs.cmu.edu/OpenPose/models/ | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmskeleton/tree/master/mmskeleton/ops/st_gcn/graph.py | ST-GCN/net/utils/graph.py | https://arxiv.org/abs/1801.07455 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmskeleton/tree/master/mmskeleton/ops/st_gcn/graph.py | ST-GCN/net/onnx_net/utils/graph.py | https://github.com/shahroudy/NTURGB-D | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmskeleton/tree/master/mmskeleton/ops/st_gcn/graph.py | ST-GCN/net/onnx_net/utils/graph.py | https://arxiv.org/abs/1801.07455 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmskeleton/tree/master/mmskeleton/ops/st_gcn/graph.py | ST-GCN/net/utils/graph.py | https://github.com/shahroudy/NTURGB-D | 源码实现 | -| 开发引入 | / | ST-GCN/demo.py | https://www.youtube.com/watch?v=--6bJUbfpnQ | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmskeleton/tree/master/mmskeleton/ops/st_gcn/graph.py | ST-GCN/net/onnx_net/utils/graph.py | https://github.com/CMU-Perceptual-Computing-Lab/openpose#output | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmskeleton/tree/master/mmskeleton/ops/st_gcn/graph.py | ST-GCN/net/utils/graph.py | https://github.com/CMU-Perceptual-Computing-Lab/openpose#output | 源码实现 | +| 文件位置 | 公网地址 | 公网地址用途 | +|--------------------------------------------------------------------------------|-----------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/ST-GCN/tools/get_models.sh | https://s3-us-west-1.amazonaws.com/yysijie-data/public/st-gcn/models/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/pose_estimation/ST-GCN/tools/get_models.sh | http://posefs1.perception.cs.cmu.edu/OpenPose/models/ | 权重地址 | \ No newline at end of 
file diff --git a/PyTorch/contrib/cv/semantic_segmentation/3DUNet/public_address_statement.md b/PyTorch/contrib/cv/semantic_segmentation/3DUNet/public_address_statement.md index 22124194bbaaacb5b233cb374df9f95338033848..9c13bef2ec61b0bf8846521ae992bf27ebc8dfcd 100644 --- a/PyTorch/contrib/cv/semantic_segmentation/3DUNet/public_address_statement.md +++ b/PyTorch/contrib/cv/semantic_segmentation/3DUNet/public_address_statement.md @@ -1,46 +1,3 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|------------------------------------------------|-----------------------------------------------|----------------------------------------------------------------------------------------------|-------| -| 开源代码引入 | https://github.com/black0017/MedicalZooPytorch | 3DUNet/docker/Dockerfile | black.adaloglou@gmail.com | 邮箱 | -| 开源代码引入 | https://github.com/black0017/MedicalZooPytorch | 3DUNet/tests/test_medical_3D_augemt.py | https://nipy.org/nibabel/_downloads/c16214e490de2a223655d30f4ba78f15/someones_anatomy.nii.gz | 下载数据集 | -| 开源代码引入 | https://github.com/black0017/MedicalZooPytorch | 3DUNet/tests/test_medical_3D_augemt.py | https://nipy.org/nibabel/_downloads/f76cc5a46e5368e2c779868abc49e497/someones_epi.nii.gz | 下载数据集 | -| 开源代码引入 | https://github.com/black0017/MedicalZooPytorch | 3DUNet/tests/test_medical_3D_preprocessing.py | https://nipy.org/nibabel/_downloads/c16214e490de2a223655d30f4ba78f15/someones_anatomy.nii.gz | 下载数据集 | -| 开源代码引入 | https://github.com/black0017/MedicalZooPytorch | 3DUNet/tests/test_medical_3D_preprocessing.py | https://nipy.org/nibabel/_downloads/f76cc5a46e5368e2c779868abc49e497/someones_epi.nii.gz | 下载数据集 | -| 开源代码引入 | https://github.com/black0017/MedicalZooPytorch/lib/medzoo/__pycache__/Unet3D.cpython-36.pyc | 3DUNet/Unet3D.py | https://arxiv.org/abs/1606.06650 | 论文地址 | -| 开源代码引入 | https://github.com/black0017/MedicalZooPytorch/lib/losses3D/__init__.py | 3DUNet/lib/losses3D/dice.py | https://github.com/wolny/pytorch-3dunet/blob/master/pytorch3dunet/unet3d/losses.py | 源码实现 | -| 开源代码引入 | https://github.com/black0017/MedicalZooPytorch/lib/losses3D/__init__.py | 3DUNet/lib/losses3D/pixel_wise_cross_entropy.py | https://github.com/wolny/pytorch-3dunet/blob/master/pytorch3dunet/unet3d/losses.py | 源码实现 | -| 开源代码引入 | https://github.com/black0017/MedicalZooPytorch/lib/medzoo/HighResNet3D.py | 3DUNet/lib/medzoo/HighResNet3D.py | https://arxiv.org/pdf/1707.01992.pdf | 论文地址 | -| 开源代码引入 | https://github.com/black0017/MedicalZooPytorch/lib/medzoo/__pycache__/DenseVoxelNet.cpython-36.pyc | 3DUNet/lib/medzoo/DenseVoxelNet.py | https://arxiv.org/abs/1708.00573 | 论文地址 | -| 开源代码引入 | https://github.com/black0017/MedicalZooPytorch/lib/losses3D/__init__.py | 3DUNet/lib/losses3D/weight_smooth_l1.py | https://github.com/wolny/pytorch-3dunet/blob/master/pytorch3dunet/unet3d/losses.py | 源码实现 | -| 开源代码引入 | https://github.com/black0017/MedicalZooPytorch/lib/losses3D/__init__.py | 3DUNet/lib/losses3D/__init__.py | https://github.com/wolny/pytorch-3dunet/blob/master/pytorch3dunet/unet3d/losses.py | 源码实现 | -| 开源代码引入 | https://github.com/black0017/MedicalZooPytorch/lib/medloaders/miccai_2019_pathology.py | 3DUNet/lib/medloaders/miccai_2019_pathology.py | https://github.com/black0017/MICCAI-2019-Prostate-Cancer-segmentation-challenge | 源码实现 | -| 开源代码引入 | https://github.com/black0017/MedicalZooPytorch/lib/medzoo/__pycache__/Vnet.cpython-36.pyc | 3DUNet/lib/losses3D/basic.py | https://arxiv.org/abs/1606.04797 | 论文地址 | -| 开源代码引入 | 
https://github.com/black0017/MedicalZooPytorch/lib/medzoo/Densenet3D.py | 3DUNet/lib/medzoo/Densenet3D.py | https://arxiv.org/pdf/1804.02967.pdf | 论文地址 | -| 开源代码引入 | https://github.com/black0017/MedicalZooPytorch/lib/losses3D/VAEloss.py | 3DUNet/lib/losses3D/VAEloss.py | https://arxiv.org/abs/1312.6114 | 论文地址 | -| 开源代码引入 | https://github.com/black0017/MedicalZooPytorch/lib/medzoo/Vnet.py | 3DUNet/lib/medzoo/Vnet.py | https://github.com/Dawn90/V-Net.pytorch | 源码实现 | -| 开源代码引入 | https://github.com/black0017/MedicalZooPytorch/lib/medzoo/__pycache__/Vnet.cpython-36.pyc | 3DUNet/lib/losses3D/dice.py | https://arxiv.org/abs/1606.04797 | 论文地址 | -| 开源代码引入 | https://github.com/black0017/MedicalZooPytorch/lib/losses3D/__pycache__/ContrastiveLoss.cpython-36.pyc | 3DUNet/lib/losses3D/ContrastiveLoss.py | https://arxiv.org/pdf/1708.02551.pdf | 论文地址 | -| 开源代码引入 | https://github.com/black0017/MedicalZooPytorch/examples/test_miccai_2019.py | 3DUNet/examples/test_miccai_2019.py | https://gleason2019.grand-challenge.org/ | 相关说明 | -| 开源代码引入 | https://github.com/black0017/MedicalZooPytorch/lib/losses3D/__init__.py | 3DUNet/lib/losses3D/BaseClass.py | https://github.com/wolny/pytorch-3dunet/blob/master/pytorch3dunet/unet3d/losses.py | 源码实现 | -| 开源代码引入 | https://github.com/black0017/MedicalZooPytorch/lib/losses3D/__init__.py | 3DUNet/lib/losses3D/basic.py | https://github.com/wolny/pytorch-3dunet/blob/master/pytorch3dunet/unet3d/losses.py | 源码实现 | -| 开源代码引入 | https://github.com/black0017/MedicalZooPytorch/lib/medzoo/__pycache__/Unet3D.cpython-36.pyc | 3DUNet/lib/medzoo/Unet3D.py | https://arxiv.org/abs/1606.06650 | 论文地址 | -| 开源代码引入 | https://github.com/black0017/MedicalZooPytorch/lib/medzoo/Unet2D.py | 3DUNet/lib/medzoo/Unet2D.py | https://github.com/milesial/Pytorch-UNet/blob/master/unet/unet_model.py | 源码实现 | -| 开源代码引入 | https://github.com/black0017/MedicalZooPytorch/lib/losses3D/__init__.py | 3DUNet/lib/losses3D/generalized_dice.py | https://github.com/wolny/pytorch-3dunet/blob/master/pytorch3dunet/unet3d/losses.py | 源码实现 | -| 开源代码引入 | https://github.com/black0017/MedicalZooPytorch/lib/medzoo/ResNet3D_VAE.py | 3DUNet/lib/medzoo/ResNet3D_VAE.py | https://arxiv.org/pdf/1810.11654.pdf | 论文地址 | -| 开源代码引入 | https://github.com/black0017/MedicalZooPytorch/lib/losses3D/__pycache__/weight_cross_entropy.cpython-36.pyc | 3DUNet/lib/losses3D/generalized_dice.py | https://arxiv.org/pdf/1707.03237.pdf | 论文地址 | -| 开源代码引入 | https://github.com/black0017/MedicalZooPytorch/lib/losses3D/__pycache__/weight_cross_entropy.cpython-36.pyc | 3DUNet/lib/losses3D/weight_cross_entropy.py | https://arxiv.org/pdf/1707.03237.pdf | 论文地址 | -| 开源代码引入 | https://github.com/black0017/MedicalZooPytorch/lib/losses3D/__init__.py | 3DUNet/lib/losses3D/tags_angular_loss.py | https://github.com/wolny/pytorch-3dunet/blob/master/pytorch3dunet/unet3d/losses.py | 源码实现 | -| 开源代码引入 | https://github.com/black0017/MedicalZooPytorch/lib/losses3D/__init__.py | 3DUNet/lib/losses3D/weight_cross_entropy.py | https://github.com/wolny/pytorch-3dunet/blob/master/pytorch3dunet/unet3d/losses.py | 源码实现 | -| 开源代码引入 | https://github.com/black0017/MedicalZooPytorch/lib/medloaders/__pycache__/ixi_t1_t2.cpython-36.pyc | 3DUNet/lib/medloaders/ixi_t1_t2.py | https://www.slicer.org/wiki/Coordinate_systems | 相关说明 | -| 开源代码引入 | https://github.com/black0017/MedicalZooPytorch/lib/medzoo/__pycache__/SkipDenseNet3D.cpython-36.pyc | 3DUNet/lib/medzoo/SkipDenseNet3D.py | https://github.com/tbuikr/3D-SkipDenseSeg | 源码实现 | -| 开源代码引入 | 
https://github.com/black0017/MedicalZooPytorch/lib/losses3D/__init__.py | 3DUNet/lib/losses3D/BCE_dice.py | https://github.com/wolny/pytorch-3dunet/blob/master/pytorch3dunet/unet3d/losses.py | 源码实现 | -| 开源代码引入 | https://github.com/black0017/MedicalZooPytorch/lib/medzoo/ResNet3DMedNet.py | 3DUNet/lib/medzoo/ResNet3DMedNet.py | https://github.com/kenshohara/3D-ResNets-PyTorch | 源码实现 | -| 开源代码引入 | https://github.com/black0017/MedicalZooPytorch/lib/augment3D/elastic_deform.py | 3DUNet/lib/augment3D/elastic_deform.py | https://github.com/fcalvet/image_tools/blob/master/image_augmentation.py#L62 | 源码实现 | -| 开源代码引入 | https://github.com/black0017/MedicalZooPytorch/lib/augment3D/elastic_deform.py | 3DUNet/lib/augment3D/elastic_deform.py | https://gist.github.com/chsasank/4d8f68caf01f041a6453e67fb30f8f5a | 源码实现 | -| 开源代码引入 | https://github.com/black0017/MedicalZooPytorch/lib/medzoo/__pycache__/Vnet.cpython-36.pyc | 3DUNet/lib/medzoo/Vnet.py | https://arxiv.org/abs/1606.04797 | 论文地址 | -| 开源代码引入 | https://github.com/black0017/MedicalZooPytorch/lib/medzoo/ResNet3DMedNet.py | 3DUNet/lib/medzoo/ResNet3DMedNet.py | https://arxiv.org/abs/1904.00625 | 论文地址 | -| 开源代码引入 | https://github.com/black0017/MedicalZooPytorch/lib/medzoo/__pycache__/SkipDenseNet3D.cpython-36.pyc | 3DUNet/lib/medzoo/SkipDenseNet3D.py | https://arxiv.org/pdf/1709.03199.pdf | 论文地址 | -| 开源代码引入 | https://github.com/black0017/MedicalZooPytorch/lib/losses3D/__init__.py | 3DUNet/lib/losses3D/ContrastiveLoss.py | https://github.com/wolny/pytorch-3dunet/blob/master/pytorch3dunet/unet3d/losses.py | 源码实现 | -| 开源代码引入 | https://github.com/black0017/MedicalZooPytorch/lib/medzoo/__pycache__/SkipDenseNet3D.cpython-36.pyc | 3DUNet/lib/medzoo/SkipDenseNet3D.py | https://arxiv.org/pdf/1608.06993.pdf | 论文地址 | -| 开源代码引入 | https://github.com/black0017/MedicalZooPytorch/lib/visual3D_temp/viz_2d.py | 3DUNet/lib/visual3D_temp/viz_2d.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Fold | 相关说明 | -| 开源代码引入 | https://github.com/black0017/MedicalZooPytorch/lib/medzoo/__pycache__/BaseModelClass.cpython-36.pyc | 3DUNet/lib/medzoo/BaseModelClass.py | https://github.com/kwotsin/mimicry/blob/master/torch_mimicry/nets/basemodel/basemodel.py | 源码实现 | -| 开源代码引入 | https://github.com/black0017/MedicalZooPytorch/lib/medzoo/HyperDensenet.py | 3DUNet/lib/medzoo/HyperDensenet.py | https://github.com/josedolz/HyperDenseNet_pytorch | 源码实现 | +| 文件位置 | 公网地址 | 公网地址用途 | +|------------------------------------------------------------------------------------|---------------------------|---------------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/3DUNet/docker/Dockerfile | black.adaloglou@gmail.com | maintainer邮箱 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/public_address_statement.md b/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/public_address_statement.md index 09f953de1b2de806c5067391d7300a249a340c72..ae617a2e091b2507abe37d23f3c41407ad16bf87 100644 --- a/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/public_address_statement.md +++ b/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/public_address_statement.md @@ -1,200 +1,92 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | 
-|--------|----------------------------------------------|--------------------------------------------------|---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|---------| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation | BiSeNetV1/configs/bisenetv1/bisenetv1.yml | https://arxiv.org/abs/1808.00897 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation | BiSeNetV1/configs/bisenetv1/bisenetv1.yml | https://github.com/open-mmlab/mmsegmentation/blob/v0.18.0/mmseg/models/backbones/bisenetv1.py#L266 | 源码地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation | BiSeNetV1/configs/bisenetv1/bisenetv1.yml | https://github.com/ycszen/TorchSeg/tree/master/model/bisenet | 源码地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation | BiSeNetV1/configs/bisenetv1/bisenetv1.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv1/bisenetv1_r18-d32_4x4_1024x1024_160k_cityscapes/bisenetv1_r18-d32_4x4_1024x1024_160k_cityscapes_20210922_172239-c55e78e2.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation | BiSeNetV1/configs/bisenetv1/bisenetv1.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv1/bisenetv1_r18-d32_in1k-pre_4x4_1024x1024_160k_cityscapes/bisenetv1_r18-d32_in1k-pre_4x4_1024x1024_160k_cityscapes_20210905_220251-8ba80eff.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation | BiSeNetV1/configs/bisenetv1/bisenetv1.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv1/bisenetv1_r18-d32_in1k-pre_4x8_1024x1024_160k_cityscapes/bisenetv1_r18-d32_in1k-pre_4x8_1024x1024_160k_cityscapes_20210905_220322-bb8db75f.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation | BiSeNetV1/configs/bisenetv1/bisenetv1.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv1/bisenetv1_r50-d32_4x4_1024x1024_160k_cityscapes/bisenetv1_r50-d32_4x4_1024x1024_160k_cityscapes_20210923_222639-7b28a2a6.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation | BiSeNetV1/configs/bisenetv1/bisenetv1.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv1/bisenetv1_r50-d32_in1k-pre_4x4_1024x1024_160k_cityscapes/bisenetv1_r50-d32_in1k-pre_4x4_1024x1024_160k_cityscapes_20210917_234628-8b304447.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation | BiSeNetV1/configs/bisenetv1/bisenetv1.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv1/bisenetv1_r18-d32_lr5e-3_4x4_512x512_160k_coco-stuff164k/bisenetv1_r18-d32_lr5e-3_4x4_512x512_160k_coco-stuff164k_20211022_054328-046aa2f2.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation | BiSeNetV1/configs/bisenetv1/bisenetv1.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv1/bisenetv1_r18-d32_in1k-pre_lr5e-3_4x4_512x512_160k_coco-stuff164k/bisenetv1_r18-d32_in1k-pre_lr5e-3_4x4_512x512_160k_coco-stuff164k_20211023_013100-f700dbf7.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation | BiSeNetV1/configs/bisenetv1/bisenetv1.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv1/bisenetv1_r50-d32_lr5e-3_4x4_512x512_160k_coco-stuff164k/bisenetv1_r50-d32_lr5e-3_4x4_512x512_160k_coco-stuff164k_20211101_040616-d2bb0df4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation | BiSeNetV1/configs/bisenetv1/bisenetv1.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv1/bisenetv1_r50-d32_in1k-pre_lr5e-3_4x4_512x512_160k_coco-stuff164k/bisenetv1_r50-d32_in1k-pre_lr5e-3_4x4_512x512_160k_coco-stuff164k_20211101_181932-66747911.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation | BiSeNetV1/configs/bisenetv1/bisenetv1.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv1/bisenetv1_r101-d32_lr5e-3_4x4_512x512_160k_coco-stuff164k/bisenetv1_r101-d32_lr5e-3_4x4_512x512_160k_coco-stuff164k_20211102_164147-c6b32c3b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation | BiSeNetV1/configs/bisenetv1/bisenetv1.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv1/bisenetv1_r101-d32_in1k-pre_lr5e-3_4x4_512x512_160k_coco-stuff164k/bisenetv1_r101-d32_in1k-pre_lr5e-3_4x4_512x512_160k_coco-stuff164k_20211101_225220-28c8f092.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg11_imagenet-01ecd97e.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg13_imagenet-9ad3945d.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg16_imagenet-91b6d117.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg19_imagenet-fee352a8.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg11_bn_imagenet-6fbbbf3f.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg13_bn_imagenet-4b5f9390.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg16_bn_imagenet-3ac6d8fd.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg19_bn_imagenet-7c058385.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet18_batch256_imagenet_20200708-34ab8f90.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet34_batch256_imagenet_20200708-32ffb4f7.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet50_batch256_imagenet_20200708-cfb998bf.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet101_batch256_imagenet_20200708-753f3608.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet152_batch256_imagenet_20200708-ec25b1f9.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnetv1d50_batch256_imagenet_20200708-1ad0ce94.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnetv1d101_batch256_imagenet_20200708-9cb302ef.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/mmcls.json | 
https://download.openmmlab.com/mmclassification/v0/resnet/resnetv1d152_batch256_imagenet_20200708-e79cb6a2.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnext/resnext50_32x4d_batch256_imagenet_20200708-c07adbb7.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnext/resnext101_32x4d_batch256_imagenet_20200708-87f2d1c9.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnext/resnext101_32x8d_batch256_imagenet_20200708-1ec34aa7.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnext/resnext152_32x4d_batch256_imagenet_20200708-aab5034c.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/se-resnet/se-resnet50_batch256_imagenet_20200804-ae206104.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/se-resnet/se-resnet101_batch256_imagenet_20200804-ba5b51d4.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnest/resnest50_imagenet_converted-1ebf0afe.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnest/resnest101_imagenet_converted-032caa52.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnest/resnest200_imagenet_converted-581a60f2.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnest/resnest269_imagenet_converted-59930960.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/shufflenet_v1/shufflenet_v1_batch1024_imagenet_20200804-5d6cec73.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/shufflenet_v2/shufflenet_v2_batch1024_imagenet_20200812-5bf4721e.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/mobilenet_v2/mobilenet_v2_batch256_imagenet_20200708-3b2dc3af.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/vgg16_caffe-292e1171.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_caffe-788b5fa3.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_msra-5891d200.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_caffe-3ad79236.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_msra-6cc46731.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_32x8d-1516f1aa.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | 
https://download.openmmlab.com/pretrain/third_party/resnext50-32x4d-0ab1a123.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_32x4d-a5af3160.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_64x4d-ee2c6f71.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_gn_thangvubk-ad1730dd.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_gn-9186a21c.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_gn-cac0ab98.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_gn_ws-15beedd8.pth | 下载预训练模型 | -| 开发引入 | / | 3DUNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_gn_ws-3e3c308c.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext50_32x4d_gn_ws-0d87ac85.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_32x4d_gn_ws-34ac1a9e.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext50_32x4d_gn-c7e8b754.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_32x4d_gn-ac3bb84e.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w18_small-b5a04e21.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w18-00eb2006.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w32-dc9eeb4f.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w40-ed0b031c.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w48-d2186c55.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/bn_inception_caffe-ed2e8665.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/i3d_r50_f32s2_k400-2c57e077.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/nl3d_r50_f32s2_k400-fa7e7caa.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/res2net101_v1d_26w_4s_mmdetv2-f0a600f9.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_400mf-a5b10d96.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | 
https://download.openmmlab.com/pretrain/third_party/regnetx_800mf-1f4be4c7.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_1.6gf-5791c176.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_3.2gf-c2599b0f.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_4.0gf-a88f671e.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_6.4gf-006af45d.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_8.0gf-3c68abe7.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_12gf-4c2a3350.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet18_v1c-b5776b93.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_v1c-2cccc1ad.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_v1c-e67eebb6.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/mmediting/third_party/vgg_state_dict.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/mmediting/third_party/model_best_resnet34_En_nomixup.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/mmediting/third_party/mobilenet_v2.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/mobilenet_v3_large-bc2c3fd3.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/mobilenet_v3_small-47085aa1.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnest50_d2-7497a55b.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnest101_d2-f3b931b2.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnest200_d2-ca88e41f.pth | 下载预训练模型 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/darknet53-a628ea1b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation | BiSeNetV1/setup.py | openmmlab@gmail.com | 邮箱 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation | BiSeNetV1/setup.py | http://github.com/open-mmlab/mmsegmentation | 源码地址 | -| 开发引入 | / | BiSeNetV1/url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 下载测试图片 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/runner/hooks/logger/mlflow.py | https://www.mlflow.org/docs/latest/index.html | 相关说明 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/ops/csrc/pytorch/box_iou_rotated.cpp | 
https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated.h | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/mmseg/models/decode_heads/gc_head.py | BiSeNetV1/mmseg/models/decode_heads/gc_head.py | https://arxiv.org/abs/1904.11492 | 论文地址 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/ops/csrc/pytorch/info.cpp | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/vision.cpp | 源码实现 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/ops/tin_shift.py | https://arxiv.org/abs/2001.06499 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/mmseg/models/decode_heads/enc_head.py | BiSeNetV1/mmseg/models/decode_heads/enc_head.py | https://arxiv.org/abs/1803.08904 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/mmseg/models/decode_heads/nl_head.py | BiSeNetV1/mmcv_replace/cnn/bricks/generalized_attention.py | https://arxiv.org/abs/1711.07971 | 论文地址 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/ops/csrc/pytorch/psamask.cpp | https://github.com/hszhao/semseg/blob/master/lib/psa/src | 源码实现 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/ops/csrc/parrots/nms_rotated.cpp | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/nms_rotated/nms_rotated.h | 源码实现 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/cnn/bricks/wrappers.py | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/wrappers.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/mmseg/models/decode_heads/nl_head.py | BiSeNetV1/mmcv_replace/cnn/bricks/non_local.py | https://arxiv.org/abs/1711.07971 | 论文地址 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/image/colorspace.py | https://en.wikipedia.org/wiki/YCbCr#JPEG_conversion | 相关说明 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/ops/roi_align.py | https://github.com/facebookresearch/detectron2/ | 源码实现 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/ops/csrc/box_iou_rotated_utils.hpp | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_utils.h | 源码实现 | -| 开发引入 | / | BiSeNetV1/mmseg/core/seg/sampler/ohem_pixel_sampler.py | https://github.com/pytorch/pytorch/issues/22812 | 源码实现 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/ops/saconv.py | https://arxiv.org/pdf/2006.02334.pdf | 论文地址 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/ops/corner_pool.py | https://arxiv.org/abs/1808.01244 | 论文地址 | -| 开发引入 | / | BiSeNetV1/mmseg/models/backbones/unet.py | https://arxiv.org/pdf/1505.04597.pdf | 论文地址 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/ops/csrc/pytorch/nms_rotated_cpu.cpp | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/nms_rotated/nms_rotated_cpu.cpp | 源码实现 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/ops/tin_shift.py | https://github.com/mit-han-lab/temporal-shift-module | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/mmseg/models/decode_heads/fpn_head.py | BiSeNetV1/mmseg/models/decode_heads/fpn_head.py | https://arxiv.org/abs/1901.02446 | 论文地址 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/ops/csrc/pytorch/info.cpp | https://github.com/pytorch/pytorch/blob/master/aten/src/ATen/cuda/detail/CUDAHooks.cpp#L231 | 源码实现 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/ops/tin_shift.py | shaoh19@mails.tsinghua.edu.cn","sjqian@cse.cuhk.edu.hk","yuliu@ee.cuhk.edu.hk | 邮箱地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/mmseg/models/decode_heads/lraspp_head.py | BiSeNetV1/mmseg/models/backbones/mobilenet_v3.py | https://ieeexplore.ieee.org/document/9008835 | 相关说明 | -| 开发引入 | / | 
BiSeNetV1/mmcv_replace/runner/hooks/lr_updater.py | https://arxiv.org/pdf/1506.01186.pdf | 论文地址 | -| 开发引入 | / | BiSeNetV1/mmseg/datasets/builder.py | https://github.com/pytorch/pytorch/issues/973 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/mmseg/models/decode_heads/uper_head.py | BiSeNetV1/mmseg/models/decode_heads/uper_head.py | https://arxiv.org/abs/1807.10221 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/mmseg/models/decode_heads/fcn_head.py | BiSeNetV1/mmseg/models/decode_heads/fcn_head.py | https://arxiv.org/abs/1411.4038 | 论文地址 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/runner/hooks/optimizer.py | https://arxiv.org/abs/1710.03740 | 论文地址 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/ops/csrc/pytorch/box_iou_rotated_cuda.cu | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_cuda.cu | 源码实现 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/ops/csrc/deform_conv_cuda_kernel.cuh | https://github.com/chengdazhi/Deformable-Convolution-V2-PyTorch/blob/mmdetection/mmdet/ops/dcn/src/deform_conv_cuda_kernel.cu | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/setup.py | BiSeNetV1/setup.py | http://setuptools.readthedocs.io/en/latest/setuptools.html#declaring-platform-specific-dependencies | 相关说明 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/ops/csrc/modulated_deform_conv_cuda_kernel.cuh | https://arxiv.org/abs/1703.06211 | 论文地址 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/ops/csrc/pytorch/nms_rotated.cpp | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/nms_rotated/nms_rotated.h | 源码实现 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/ops/csrc/pytorch/nms_rotated_cuda.cu | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/nms_rotated/nms_rotated_cuda.cu | 源码实现 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/onnx/simplify/core.py | https://github.com/onnx/onnx/issues/2613 | 源码实现 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/runner/fp16_utils.py | https://github.com/NVIDIA/apex/blob/master/apex/fp16_utils/loss_scaler.py | 源码实现 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/cnn/bricks/depthwise_separable_conv_module.py | https://arxiv.org/pdf/1704.04861.pdf | 论文地址 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/cnn/bricks/non_local.py | https://github.com/AlexHex7/Non-local_pytorch | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/mmseg/models/decode_heads/nl_head.py | BiSeNetV1/mmseg/models/decode_heads/nl_head.py | https://arxiv.org/abs/1711.07971 | 论文地址 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/ops/csrc/deform_conv_cuda_kernel.cuh | https://arxiv.org/abs/1703.06211 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/mmseg/models/decode_heads/aspp_head.py | BiSeNetV1/mmseg/models/decode_heads/aspp_head.py | https://arxiv.org/abs/1706.05587 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/mmseg/models/decode_heads/ocr_head.py | BiSeNetV1/mmseg/models/decode_heads/ocr_head.py | https://arxiv.org/abs/1909.11065 | 论文地址 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/ops/csrc/pytorch/info.cpp | https://github.com/pytorch/pytorch/blob/master/aten/src/ATen/Version.cpp | 源码实现 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/cnn/utils/weight_init.py | https://www.cv-foundation.org/openaccess/content_iccv_2015/ | 相关说明 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/cnn/bricks/conv_ws.py | https://arxiv.org/pdf/2006.02334.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/mmseg/models/decode_heads/ema_head.py | BiSeNetV1/mmseg/models/decode_heads/ema_head.py | 
https://arxiv.org/abs/1907.13426 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/mmseg/models/decode_heads/lraspp_head.py | BiSeNetV1/mmseg/models/decode_heads/lraspp_head.py | https://ieeexplore.ieee.org/document/9008835 | 相关说明 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/ops/nms.py | https://github.com/pytorch/vision/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/mmseg/models/necks/fpn.py | BiSeNetV1/mmseg/models/necks/fpn.py | https://arxiv.org/abs/1612.03144 | 论文地址 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/runner/hooks/momentum_updater.py | https://arxiv.org/pdf/1708.07120.pdf | 论文地址 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/ops/csrc/box_iou_rotated_cuda.cuh | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_cuda.cu | 源码实现 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/ops/corner_pool.py | https://github.com/princeton-vl/CornerNet-Lite | 源码实现 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/ops/csrc/pytorch/box_iou_rotated_cpu.cpp | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_cpu.cpp | 源码实现 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/cnn/bricks/conv_ws.py | https://arxiv.org/pdf/1903.10520.pdf | 论文地址 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/ops/csrc/nms_rotated_cuda.cuh | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/nms_rotated/nms_rotated_cuda.cu | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/mmseg/models/decode_heads/gc_head.py | BiSeNetV1/mmcv_replace/cnn/bricks/context_block.py | https://arxiv.org/abs/1904.11492 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/mmseg/models/decode_heads/ann_head.py | BiSeNetV1/mmseg/models/decode_heads/ann_head.py | https://arxiv.org/abs/1908.07678 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/mmseg/models/decode_heads/cc_head.py | BiSeNetV1/mmseg/models/decode_heads/cc_head.py | https://arxiv.org/abs/1811.11721 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/mmseg/models/decode_heads/apc_head.py | BiSeNetV1/mmseg/models/decode_heads/apc_head.py | https://openaccess.thecvf.com/content_CVPR_2019/papers/ | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/mmseg/models/backbones/hrnet.py | BiSeNetV1/mmseg/models/backbones/hrnet.py | https://arxiv.org/abs/1904.04514 | 论文地址 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/utils/registry.py | @backbones.regi | 邮箱地址 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/ops/csrc/pytorch/cc_attention_cuda.cu | https://github.com/LikeLy-Journey/SegmenTron/blob/master/segmentron/modules/csrc/criss_cross_attention/ca_cuda.cu | 源码实现 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/ops/point_sample.py | https://github.com/facebookresearch/detectron2/tree/master/projects/PointRend | 源码实现 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/ops/nms.py | https://github.com/pytorch/vision/blob | 源码实现 | -| 开发引入 | / | BiSeNetV1/mmseg/models/decode_heads/point_head.py | https://github.com/facebookresearch/detectron2/tree/master/projects/PointRend/point_head/point_head.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/docs/zh_cn/user_guides/useful_tools.md | BiSeNetV1/mmcv_replace/cnn/utils/flops_counter.py | https://github.com/sovrasov/flops-counter.pytorch | 源码实现 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/ops/psa_mask.py | https://github.com/hszhao/semseg/blob/master/lib/psa | 源码实现 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/onnx/symbolic.py | https://github.com/pytorch/pytorch | 源码实现 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmsegmentation/projects/hssn/decode_head/sep_aspp_contrast_head.py | BiSeNetV1/mmseg/models/decode_heads/sep_aspp_head.py | https://arxiv.org/abs/1802.02611 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/mmseg/models/decode_heads/dnl_head.py | BiSeNetV1/mmseg/models/decode_heads/dnl_head.py | https://arxiv.org/abs/2006.06668 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/mmseg/models/backbones/cgnet.py | BiSeNetV1/mmseg/models/backbones/cgnet.py | https://arxiv.org/abs/1811.08201 | 论文地址 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/ops/csrc/pytorch/psamask_cuda.cu | https://github.com/hszhao/semseg/blob/master/lib/psa/src | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/mmseg/models/decode_heads/psp_head.py | BiSeNetV1/mmseg/models/decode_heads/psp_head.py | https://arxiv.org/abs/1612.01105 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/docs/zh_cn/model_zoo.md | BiSeNetV1/mmseg/models/backbones/resnet.py | https://arxiv.org/pdf/1812.01187.pdf | 论文地址 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/runner/checkpoint.py | @CheckpointLoader.regi | 邮箱地址 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/ops/csrc/parrots/box_iou_rotated_cuda.cu | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_cuda.cu | 源码实现 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/onnx/onnx_utils/symbolic_helper.py | https://github.com/pytorch/pytorch/blob/75ee5756715e7161314ce037474843b68f69fc04/torch/onnx/symbolic_helper.py#L375 | 源码实现 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/ops/tin_shift.py | https://github.com/deepcs233/TIN/blob/master/cuda_shift/rtc_wrap.py | 源码实现 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/ops/csrc/parrots/nms_rotated_cuda.cu | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/nms_rotated/nms_rotated_cuda.cu | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/mmseg/models/decode_heads/dm_head.py | BiSeNetV1/mmseg/models/decode_heads/dm_head.py | https://openaccess.thecvf.com/content_ICCV_2019/papers/ | 相关说明 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/image/colorspace.py | https://en.wikipedia.org/wiki/YCbCr#ITU-R_BT.601_conversion | 相关说明 | -| 开发引入 | / | BiSeNetV1/mmseg/models/utils/make_divisible.py | https://github.com/tensorflow/models/blob/master/research/slim/nets/mobilenet/mobilenet.py | 源码实现 | -| 开发引入 | / | BiSeNetV1/mmseg/models/utils/self_attention_block.py | https://arxiv.org/abs/1706.03762 | 论文地址 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/ops/csrc/pytorch/roi_align_cpu.cpp | https://github.com/facebookresearch/detectron2/tree/master/detectron2/layers/csrc/ROIAlign | 源码实现 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/ops/csrc/parrots/corner_pool.cpp | https://github.com/princeton-vl/CornerNet-Lite/tree/master/core/models/py_utils/_cpools/src | 源码实现 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/ops/csrc/parrots/box_iou_rotated_cpu.cpp | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_cpu.cpp | 源码实现 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/cnn/bricks/plugin.py | https://inflection.readthedocs.io/en/latest/#inflection.underscore | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/mmseg/models/decode_heads/da_head.py | BiSeNetV1/mmseg/models/decode_heads/da_head.py | https://arxiv.org/abs/1809.02983 | 论文地址 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/ops/csrc/pytorch/corner_pool.cpp | https://github.com/princeton-vl/CornerNet-Lite/tree/master/core/models/py_utils/_cpools/src 
| 源码实现 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/ops/csrc/parrots/psamask_cuda.cu | https://github.com/hszhao/semseg/blob/master/lib/psa/src | 源码实现 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/cnn/utils/weight_init.py | http://proceedings.mlr.press/v9/glorot10a/glorot10a.pdf | 相关说明 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/onnx/simplify/core.py | https://github.com/onnx/onnx/issues/2417 | 源码实现 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/ops/csrc/parrots/roi_align_cpu.cpp | https://github.com/facebookresearch/detectron2/tree/master/detectron2/layers/csrc/ROIAlign | 源码实现 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/onnx/simplify/core.py | https://github.com/daquexian/onnx-simplifier | 源码实现 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/ops/csrc/carafe_cuda_kernel.cuh | https://devblogs.nvidia.com/efficient-matrix-transpose-cuda-cc/ | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/mmseg/models/decode_heads/psa_head.py | BiSeNetV1/mmseg/models/decode_heads/psa_head.py | https://hszhao.github.io/papers/eccv18_psanet.pdf | 相关说明 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/image/io.py | https://github.com/lilohuang/PyTurboJPEG | 源码实现 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/ops/carafe.py | https://arxiv.org/abs/1905.02188 | 论文地址 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/utils/registry.py | @x.regi | 邮箱地址 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/ops/csrc/modulated_deform_conv_cuda_kernel.cuh | https://github.com/chengdazhi/Deformable-Convolution-V2-PyTorch/blob/mmdetection/mmdet/ops/dcn/src/deform_conv_cuda_kernel.cu | 源码实现 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/onnx/simplify/core.py | https://github.com/onnx/onnx/blob/e5e9a539f550f07ec156812484e8d4f33fb91f88/onnx/onnx.proto#L461 | 源码实现 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/cnn/utils/weight_init.py | http://download.openmmlab.com/mmdetection/v2.0/retinanet/ | 相关说明 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/ops/csrc/parrots/box_iou_rotated.cpp | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated.h | 源码实现 | -| 开发引入 | / | BiSeNetV1/mmcv_replace/onnx/onnx_utils/symbolic_helper.py | https://github.com/pytorch/pytorch | 源码实现 | +| 文件位置 | 公网地址 | 公网地址用途 | +|------------------------------------------------------------------------------------------------------------|---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/configs/bisenetv1/bisenetv1.yml | https://arxiv.org/abs/1808.00897 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/configs/bisenetv1/bisenetv1.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv1/bisenetv1_r18-d32_4x4_1024x1024_160k_cityscapes/bisenetv1_r18-d32_4x4_1024x1024_160k_cityscapes_20210922_172239-c55e78e2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/configs/bisenetv1/bisenetv1.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv1/bisenetv1_r18-d32_in1k-pre_4x4_1024x1024_160k_cityscapes/bisenetv1_r18-d32_in1k-pre_4x4_1024x1024_160k_cityscapes_20210905_220251-8ba80eff.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/configs/bisenetv1/bisenetv1.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv1/bisenetv1_r18-d32_in1k-pre_4x8_1024x1024_160k_cityscapes/bisenetv1_r18-d32_in1k-pre_4x8_1024x1024_160k_cityscapes_20210905_220322-bb8db75f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/configs/bisenetv1/bisenetv1.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv1/bisenetv1_r50-d32_4x4_1024x1024_160k_cityscapes/bisenetv1_r50-d32_4x4_1024x1024_160k_cityscapes_20210923_222639-7b28a2a6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/configs/bisenetv1/bisenetv1.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv1/bisenetv1_r50-d32_in1k-pre_4x4_1024x1024_160k_cityscapes/bisenetv1_r50-d32_in1k-pre_4x4_1024x1024_160k_cityscapes_20210917_234628-8b304447.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/configs/bisenetv1/bisenetv1.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv1/bisenetv1_r18-d32_lr5e-3_4x4_512x512_160k_coco-stuff164k/bisenetv1_r18-d32_lr5e-3_4x4_512x512_160k_coco-stuff164k_20211022_054328-046aa2f2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/configs/bisenetv1/bisenetv1.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv1/bisenetv1_r18-d32_in1k-pre_lr5e-3_4x4_512x512_160k_coco-stuff164k/bisenetv1_r18-d32_in1k-pre_lr5e-3_4x4_512x512_160k_coco-stuff164k_20211023_013100-f700dbf7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/configs/bisenetv1/bisenetv1.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv1/bisenetv1_r50-d32_lr5e-3_4x4_512x512_160k_coco-stuff164k/bisenetv1_r50-d32_lr5e-3_4x4_512x512_160k_coco-stuff164k_20211101_040616-d2bb0df4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/configs/bisenetv1/bisenetv1.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv1/bisenetv1_r50-d32_in1k-pre_lr5e-3_4x4_512x512_160k_coco-stuff164k/bisenetv1_r50-d32_in1k-pre_lr5e-3_4x4_512x512_160k_coco-stuff164k_20211101_181932-66747911.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/configs/bisenetv1/bisenetv1.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv1/bisenetv1_r101-d32_lr5e-3_4x4_512x512_160k_coco-stuff164k/bisenetv1_r101-d32_lr5e-3_4x4_512x512_160k_coco-stuff164k_20211102_164147-c6b32c3b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/configs/bisenetv1/bisenetv1.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv1/bisenetv1_r101-d32_in1k-pre_lr5e-3_4x4_512x512_160k_coco-stuff164k/bisenetv1_r101-d32_in1k-pre_lr5e-3_4x4_512x512_160k_coco-stuff164k_20211101_225220-28c8f092.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg11_imagenet-01ecd97e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg13_imagenet-9ad3945d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg16_imagenet-91b6d117.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/mmcls.json | 
https://download.openmmlab.com/mmclassification/v0/vgg/vgg19_imagenet-fee352a8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg11_bn_imagenet-6fbbbf3f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg13_bn_imagenet-4b5f9390.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg16_bn_imagenet-3ac6d8fd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg19_bn_imagenet-7c058385.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet18_batch256_imagenet_20200708-34ab8f90.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet34_batch256_imagenet_20200708-32ffb4f7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet50_batch256_imagenet_20200708-cfb998bf.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet101_batch256_imagenet_20200708-753f3608.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet152_batch256_imagenet_20200708-ec25b1f9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnetv1d50_batch256_imagenet_20200708-1ad0ce94.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnetv1d101_batch256_imagenet_20200708-9cb302ef.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnetv1d152_batch256_imagenet_20200708-e79cb6a2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnext/resnext50_32x4d_batch256_imagenet_20200708-c07adbb7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnext/resnext101_32x4d_batch256_imagenet_20200708-87f2d1c9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnext/resnext101_32x8d_batch256_imagenet_20200708-1ec34aa7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/mmcls.json | 
https://download.openmmlab.com/mmclassification/v0/resnext/resnext152_32x4d_batch256_imagenet_20200708-aab5034c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/se-resnet/se-resnet50_batch256_imagenet_20200804-ae206104.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/se-resnet/se-resnet101_batch256_imagenet_20200804-ba5b51d4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnest/resnest50_imagenet_converted-1ebf0afe.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnest/resnest101_imagenet_converted-032caa52.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnest/resnest200_imagenet_converted-581a60f2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnest/resnest269_imagenet_converted-59930960.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/shufflenet_v1/shufflenet_v1_batch1024_imagenet_20200804-5d6cec73.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/shufflenet_v2/shufflenet_v2_batch1024_imagenet_20200812-5bf4721e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/mobilenet_v2/mobilenet_v2_batch256_imagenet_20200708-3b2dc3af.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/vgg16_caffe-292e1171.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_caffe-788b5fa3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_msra-5891d200.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_caffe-3ad79236.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_msra-6cc46731.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_32x8d-1516f1aa.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | 
https://download.openmmlab.com/pretrain/third_party/resnext50-32x4d-0ab1a123.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_32x4d-a5af3160.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_64x4d-ee2c6f71.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_gn_thangvubk-ad1730dd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_gn-9186a21c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_gn-cac0ab98.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_gn_ws-15beedd8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_gn_ws-3e3c308c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext50_32x4d_gn_ws-0d87ac85.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_32x4d_gn_ws-34ac1a9e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext50_32x4d_gn-c7e8b754.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_32x4d_gn-ac3bb84e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w18_small-b5a04e21.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w18-00eb2006.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w32-dc9eeb4f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w40-ed0b031c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w48-d2186c55.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/bn_inception_caffe-ed2e8665.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/i3d_r50_f32s2_k400-2c57e077.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/nl3d_r50_f32s2_k400-fa7e7caa.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/res2net101_v1d_26w_4s_mmdetv2-f0a600f9.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_400mf-a5b10d96.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_800mf-1f4be4c7.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_1.6gf-5791c176.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_3.2gf-c2599b0f.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_4.0gf-a88f671e.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_6.4gf-006af45d.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_8.0gf-3c68abe7.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_12gf-4c2a3350.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet18_v1c-b5776b93.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_v1c-2cccc1ad.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_v1c-e67eebb6.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/mmediting/third_party/vgg_state_dict.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/mmediting/third_party/model_best_resnet34_En_nomixup.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/mmediting/third_party/mobilenet_v2.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | 
https://download.openmmlab.com/pretrain/third_party/mobilenet_v3_large-bc2c3fd3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/mobilenet_v3_small-47085aa1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnest50_d2-7497a55b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnest101_d2-f3b931b2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnest200_d2-ca88e41f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/darknet53-a628ea1b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/setup.py | openmmlab@gmail.com | 作者邮箱 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/BiSeNetV1/url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 数据集地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/public_address_statement.md b/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/public_address_statement.md index 5abc20951e9bce555d90742e382273e3b520e8e4..b8012848d119650b8db7f07794545bfdd0b72616 100644 --- a/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/public_address_statement.md +++ b/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/public_address_statement.md @@ -1,461 +1,391 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------------------------------------------------------------------------------------------------|-----------------------------------------------------------------------|---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|---------| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/_base_/models/setr_mla.py | DeeplabV3_for_Pytorch/configs/_base_/models/setr_mla.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_large_p16_384-b3be5167.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/_base_/models/setr_naive.py | DeeplabV3_for_Pytorch/configs/_base_/models/setr_naive.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_large_p16_384-b3be5167.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/_base_/models/setr_pup.py | DeeplabV3_for_Pytorch/configs/_base_/models/setr_pup.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_large_p16_384-b3be5167.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/_base_/models/upernet_vit-b16_ln_mln.py | DeeplabV3_for_Pytorch/configs/_base_/models/upernet_vit-b16_ln_mln.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_p16_224-80ecf9dd.pth | 下载预训练模型 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmsegmentation.git/configs/ann/metafile.yml | DeeplabV3_for_Pytorch/configs/ann/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r50-d8_512x1024_40k_cityscapes/ann_r50-d8_512x1024_40k_cityscapes_20200605_095211-049fc292.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/ann/metafile.yml | DeeplabV3_for_Pytorch/configs/ann/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r101-d8_512x1024_40k_cityscapes/ann_r101-d8_512x1024_40k_cityscapes_20200605_095243-adf6eece.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/ann/metafile.yml | DeeplabV3_for_Pytorch/configs/ann/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r50-d8_769x769_40k_cityscapes/ann_r50-d8_769x769_40k_cityscapes_20200530_025712-2b46b04d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/ann/metafile.yml | DeeplabV3_for_Pytorch/configs/ann/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r101-d8_769x769_40k_cityscapes/ann_r101-d8_769x769_40k_cityscapes_20200530_025720-059bff28.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/ann/metafile.yml | DeeplabV3_for_Pytorch/configs/ann/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r50-d8_512x1024_80k_cityscapes/ann_r50-d8_512x1024_80k_cityscapes_20200607_101911-5a9ad545.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/ann/metafile.yml | DeeplabV3_for_Pytorch/configs/ann/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r101-d8_512x1024_80k_cityscapes/ann_r101-d8_512x1024_80k_cityscapes_20200607_013728-aceccc6e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/ann/metafile.yml | DeeplabV3_for_Pytorch/configs/ann/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r50-d8_769x769_80k_cityscapes/ann_r50-d8_769x769_80k_cityscapes_20200607_044426-cc7ff323.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/ann/metafile.yml | DeeplabV3_for_Pytorch/configs/ann/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r101-d8_769x769_80k_cityscapes/ann_r101-d8_769x769_80k_cityscapes_20200607_013713-a9d4be8d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/ann/metafile.yml | DeeplabV3_for_Pytorch/configs/ann/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r50-d8_512x512_80k_ade20k/ann_r50-d8_512x512_80k_ade20k_20200615_014818-26f75e11.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/ann/metafile.yml | DeeplabV3_for_Pytorch/configs/ann/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r101-d8_512x512_80k_ade20k/ann_r101-d8_512x512_80k_ade20k_20200615_014818-c0153543.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/ann/metafile.yml | DeeplabV3_for_Pytorch/configs/ann/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r101-d8_512x512_160k_ade20k/ann_r101-d8_512x512_160k_ade20k_20200615_231733-955eb1ec.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/ann/metafile.yml | DeeplabV3_for_Pytorch/configs/ann/metafile.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r50-d8_512x512_20k_voc12aug/ann_r50-d8_512x512_20k_voc12aug_20200617_222246-dfcb1c62.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/ann/metafile.yml | DeeplabV3_for_Pytorch/configs/ann/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r101-d8_512x512_20k_voc12aug/ann_r101-d8_512x512_20k_voc12aug_20200617_222246-2fad0042.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/ann/metafile.yml | DeeplabV3_for_Pytorch/configs/ann/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r50-d8_512x512_40k_voc12aug/ann_r50-d8_512x512_40k_voc12aug_20200613_231314-b5dac322.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/ann/metafile.yml | DeeplabV3_for_Pytorch/configs/ann/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r101-d8_512x512_40k_voc12aug/ann_r101-d8_512x512_40k_voc12aug_20200613_231314-bd205bbe.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/apcnet/metafile.yml | DeeplabV3_for_Pytorch/configs/apcnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/apcnet/apcnet_r50-d8_512x1024_40k_cityscapes/apcnet_r50-d8_512x1024_40k_cityscapes_20201214_115717-5e88fa33.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/apcnet/metafile.yml | DeeplabV3_for_Pytorch/configs/apcnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/apcnet/apcnet_r101-d8_512x1024_40k_cityscapes/apcnet_r101-d8_512x1024_40k_cityscapes_20201214_115716-abc9d111.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/apcnet/metafile.yml | DeeplabV3_for_Pytorch/configs/apcnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/apcnet/apcnet_r50-d8_769x769_40k_cityscapes/apcnet_r50-d8_769x769_40k_cityscapes_20201214_115717-2a2628d7.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/apcnet/metafile.yml | DeeplabV3_for_Pytorch/configs/apcnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/apcnet/apcnet_r101-d8_769x769_40k_cityscapes/apcnet_r101-d8_769x769_40k_cityscapes_20201214_115718-b650de90.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/apcnet/metafile.yml | DeeplabV3_for_Pytorch/configs/apcnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/apcnet/apcnet_r50-d8_512x1024_80k_cityscapes/apcnet_r50-d8_512x1024_80k_cityscapes_20201214_115716-987f51e3.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/apcnet/metafile.yml | DeeplabV3_for_Pytorch/configs/apcnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/apcnet/apcnet_r101-d8_512x1024_80k_cityscapes/apcnet_r101-d8_512x1024_80k_cityscapes_20201214_115705-b1ff208a.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/apcnet/metafile.yml | DeeplabV3_for_Pytorch/configs/apcnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/apcnet/apcnet_r50-d8_769x769_80k_cityscapes/apcnet_r50-d8_769x769_80k_cityscapes_20201214_115718-7ea9fa12.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/apcnet/metafile.yml | DeeplabV3_for_Pytorch/configs/apcnet/metafile.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/apcnet/apcnet_r101-d8_769x769_80k_cityscapes/apcnet_r101-d8_769x769_80k_cityscapes_20201214_115716-a7fbc2ab.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/apcnet/metafile.yml | DeeplabV3_for_Pytorch/configs/apcnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/apcnet/apcnet_r50-d8_512x512_80k_ade20k/apcnet_r50-d8_512x512_80k_ade20k_20201214_115705-a8626293.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/apcnet/metafile.yml | DeeplabV3_for_Pytorch/configs/apcnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/apcnet/apcnet_r101-d8_512x512_80k_ade20k/apcnet_r101-d8_512x512_80k_ade20k_20201214_115704-c656c3fb.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/apcnet/metafile.yml | DeeplabV3_for_Pytorch/configs/apcnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/apcnet/apcnet_r50-d8_512x512_160k_ade20k/apcnet_r50-d8_512x512_160k_ade20k_20201214_115706-25fb92c2.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/apcnet/metafile.yml | DeeplabV3_for_Pytorch/configs/apcnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/apcnet/apcnet_r101-d8_512x512_160k_ade20k/apcnet_r101-d8_512x512_160k_ade20k_20201214_115705-73f9a8d7.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/ccnet/metafile.yml | DeeplabV3_for_Pytorch/configs/ccnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r50-d8_512x1024_40k_cityscapes/ccnet_r50-d8_512x1024_40k_cityscapes_20200616_142517-4123f401.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/ccnet/metafile.yml | DeeplabV3_for_Pytorch/configs/ccnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r101-d8_512x1024_40k_cityscapes/ccnet_r101-d8_512x1024_40k_cityscapes_20200616_142540-a3b84ba6.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/ccnet/metafile.yml | DeeplabV3_for_Pytorch/configs/ccnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r50-d8_769x769_40k_cityscapes/ccnet_r50-d8_769x769_40k_cityscapes_20200616_145125-76d11884.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/ccnet/metafile.yml | DeeplabV3_for_Pytorch/configs/ccnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r101-d8_769x769_40k_cityscapes/ccnet_r101-d8_769x769_40k_cityscapes_20200617_101428-4f57c8d0.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/ccnet/metafile.yml | DeeplabV3_for_Pytorch/configs/ccnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r50-d8_512x1024_80k_cityscapes/ccnet_r50-d8_512x1024_80k_cityscapes_20200617_010421-869a3423.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/ccnet/metafile.yml | DeeplabV3_for_Pytorch/configs/ccnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r101-d8_512x1024_80k_cityscapes/ccnet_r101-d8_512x1024_80k_cityscapes_20200617_203935-ffae8917.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/ccnet/metafile.yml | DeeplabV3_for_Pytorch/configs/ccnet/metafile.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r50-d8_769x769_80k_cityscapes/ccnet_r50-d8_769x769_80k_cityscapes_20200617_010421-73eed8ca.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/ccnet/metafile.yml | DeeplabV3_for_Pytorch/configs/ccnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r101-d8_769x769_80k_cityscapes/ccnet_r101-d8_769x769_80k_cityscapes_20200618_011502-ad3cd481.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/ccnet/metafile.yml | DeeplabV3_for_Pytorch/configs/ccnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r50-d8_512x512_80k_ade20k/ccnet_r50-d8_512x512_80k_ade20k_20200615_014848-aa37f61e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/ccnet/metafile.yml | DeeplabV3_for_Pytorch/configs/ccnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r101-d8_512x512_80k_ade20k/ccnet_r101-d8_512x512_80k_ade20k_20200615_014848-1f4929a3.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/ccnet/metafile.yml | DeeplabV3_for_Pytorch/configs/ccnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r50-d8_512x512_160k_ade20k/ccnet_r50-d8_512x512_160k_ade20k_20200616_084435-7c97193b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/ccnet/metafile.yml | DeeplabV3_for_Pytorch/configs/ccnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r101-d8_512x512_160k_ade20k/ccnet_r101-d8_512x512_160k_ade20k_20200616_000644-e849e007.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/ccnet/metafile.yml | DeeplabV3_for_Pytorch/configs/ccnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r50-d8_512x512_20k_voc12aug/ccnet_r50-d8_512x512_20k_voc12aug_20200617_193212-fad81784.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/ccnet/metafile.yml | DeeplabV3_for_Pytorch/configs/ccnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r101-d8_512x512_20k_voc12aug/ccnet_r101-d8_512x512_20k_voc12aug_20200617_193212-0007b61d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/ccnet/metafile.yml | DeeplabV3_for_Pytorch/configs/ccnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r50-d8_512x512_40k_voc12aug/ccnet_r50-d8_512x512_40k_voc12aug_20200613_232127-c2a15f02.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/ccnet/metafile.yml | DeeplabV3_for_Pytorch/configs/ccnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r101-d8_512x512_40k_voc12aug/ccnet_r101-d8_512x512_40k_voc12aug_20200613_232127-c30da577.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/ccnet/metafile.yml | DeeplabV3_for_Pytorch/configs/cgnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/cgnet/cgnet_680x680_60k_cityscapes/cgnet_680x680_60k_cityscapes_20201101_110253-4c0b2f2d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/ccnet/metafile.yml | DeeplabV3_for_Pytorch/configs/cgnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/cgnet/cgnet_512x1024_60k_cityscapes/cgnet_512x1024_60k_cityscapes_20201101_110254-124ea03b.pth | 下载预训练模型 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmsegmentation.git/configs/danet/metafile.yml | DeeplabV3_for_Pytorch/configs/danet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r50-d8_512x1024_40k_cityscapes/danet_r50-d8_512x1024_40k_cityscapes_20200605_191324-c0dbfa5f.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/danet/metafile.yml | DeeplabV3_for_Pytorch/configs/danet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r101-d8_512x1024_40k_cityscapes/danet_r101-d8_512x1024_40k_cityscapes_20200605_200831-c57a7157.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/danet/metafile.yml | DeeplabV3_for_Pytorch/configs/danet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r50-d8_769x769_40k_cityscapes/danet_r50-d8_769x769_40k_cityscapes_20200530_025703-76681c60.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/danet/metafile.yml | DeeplabV3_for_Pytorch/configs/danet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r101-d8_769x769_40k_cityscapes/danet_r101-d8_769x769_40k_cityscapes_20200530_025717-dcb7fd4e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/danet/metafile.yml | DeeplabV3_for_Pytorch/configs/danet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r50-d8_512x1024_80k_cityscapes/danet_r50-d8_512x1024_80k_cityscapes_20200607_133029-2bfa2293.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/danet/metafile.yml | DeeplabV3_for_Pytorch/configs/danet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r101-d8_512x1024_80k_cityscapes/danet_r101-d8_512x1024_80k_cityscapes_20200607_132918-955e6350.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/danet/metafile.yml | DeeplabV3_for_Pytorch/configs/danet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r50-d8_769x769_80k_cityscapes/danet_r50-d8_769x769_80k_cityscapes_20200607_132954-495689b4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/danet/metafile.yml | DeeplabV3_for_Pytorch/configs/danet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r101-d8_769x769_80k_cityscapes/danet_r101-d8_769x769_80k_cityscapes_20200607_132918-f3a929e7.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/danet/metafile.yml | DeeplabV3_for_Pytorch/configs/danet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r50-d8_512x512_80k_ade20k/danet_r50-d8_512x512_80k_ade20k_20200615_015125-edb18e08.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/danet/metafile.yml | DeeplabV3_for_Pytorch/configs/danet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r101-d8_512x512_80k_ade20k/danet_r101-d8_512x512_80k_ade20k_20200615_015126-d0357c73.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/danet/metafile.yml | DeeplabV3_for_Pytorch/configs/danet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r50-d8_512x512_160k_ade20k/danet_r50-d8_512x512_160k_ade20k_20200616_082340-9cb35dcd.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/danet/metafile.yml | DeeplabV3_for_Pytorch/configs/danet/metafile.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r101-d8_512x512_160k_ade20k/danet_r101-d8_512x512_160k_ade20k_20200616_082348-23bf12f9.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/danet/metafile.yml | DeeplabV3_for_Pytorch/configs/danet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r50-d8_512x512_20k_voc12aug/danet_r50-d8_512x512_20k_voc12aug_20200618_070026-9e9e3ab3.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/danet/metafile.yml | DeeplabV3_for_Pytorch/configs/danet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r101-d8_512x512_20k_voc12aug/danet_r101-d8_512x512_20k_voc12aug_20200618_070026-d48d23b2.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/danet/metafile.yml | DeeplabV3_for_Pytorch/configs/danet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r50-d8_512x512_40k_voc12aug/danet_r50-d8_512x512_40k_voc12aug_20200613_235526-426e3a64.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/danet/metafile.yml | DeeplabV3_for_Pytorch/configs/danet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r101-d8_512x512_40k_voc12aug/danet_r101-d8_512x512_40k_voc12aug_20200613_223031-788e232a.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/deeplabv3/metafile.yml | DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50-d8_512x1024_40k_cityscapes/deeplabv3_r50-d8_512x1024_40k_cityscapes_20200605_022449-acadc2f8.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/deeplabv3/metafile.yml | DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_512x1024_40k_cityscapes/deeplabv3_r101-d8_512x1024_40k_cityscapes_20200605_012241-7fd3f799.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/deeplabv3/metafile.yml | DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50-d8_769x769_40k_cityscapes/deeplabv3_r50-d8_769x769_40k_cityscapes_20200606_113723-7eda553c.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/deeplabv3/metafile.yml | DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_769x769_40k_cityscapes/deeplabv3_r101-d8_769x769_40k_cityscapes_20200606_113809-c64f889f.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/deeplabv3/metafile.yml | DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r18-d8_512x1024_80k_cityscapes/deeplabv3_r18-d8_512x1024_80k_cityscapes_20201225_021506-23dffbe2.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/deeplabv3/metafile.yml | DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50-d8_512x1024_80k_cityscapes/deeplabv3_r50-d8_512x1024_80k_cityscapes_20200606_113404-b92cfdd4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/deeplabv3/metafile.yml | DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_512x1024_80k_cityscapes/deeplabv3_r101-d8_512x1024_80k_cityscapes_20200606_113503-9e428899.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/deeplabv3/metafile.yml | DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r18-d8_769x769_80k_cityscapes/deeplabv3_r18-d8_769x769_80k_cityscapes_20201225_021506-6452126a.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/deeplabv3/metafile.yml | DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50-d8_769x769_80k_cityscapes/deeplabv3_r50-d8_769x769_80k_cityscapes_20200606_221338-788d6228.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/deeplabv3/metafile.yml | DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_769x769_80k_cityscapes/deeplabv3_r101-d8_769x769_80k_cityscapes_20200607_013353-60e95418.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/deeplabv3/metafile.yml | DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d16-mg124_512x1024_40k_cityscapes/deeplabv3_r101-d16-mg124_512x1024_40k_cityscapes_20200908_005644-67b0c992.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/deeplabv3/metafile.yml | DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d16-mg124_512x1024_80k_cityscapes/deeplabv3_r101-d16-mg124_512x1024_80k_cityscapes_20200908_005644-57bb8425.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/deeplabv3/metafile.yml | DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r18b-d8_512x1024_80k_cityscapes/deeplabv3_r18b-d8_512x1024_80k_cityscapes_20201225_094144-46040cef.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/deeplabv3/metafile.yml | DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50b-d8_512x1024_80k_cityscapes/deeplabv3_r50b-d8_512x1024_80k_cityscapes_20201225_155148-ec368954.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/deeplabv3/metafile.yml | DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101b-d8_512x1024_80k_cityscapes/deeplabv3_r101b-d8_512x1024_80k_cityscapes_20201226_171821-8fd49503.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/deeplabv3/metafile.yml | DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r18b-d8_769x769_80k_cityscapes/deeplabv3_r18b-d8_769x769_80k_cityscapes_20201225_094144-fdc985d9.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/deeplabv3/metafile.yml | DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50b-d8_769x769_80k_cityscapes/deeplabv3_r50b-d8_769x769_80k_cityscapes_20201225_155404-87fb0cf4.pth | 下载预训练模型 | -| 
开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/deeplabv3/metafile.yml | DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101b-d8_769x769_80k_cityscapes/deeplabv3_r101b-d8_769x769_80k_cityscapes_20201226_190843-9142ee57.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/deeplabv3/metafile.yml | DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50-d8_512x512_80k_ade20k/deeplabv3_r50-d8_512x512_80k_ade20k_20200614_185028-0bb3f844.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/deeplabv3/metafile.yml | DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_512x512_80k_ade20k/deeplabv3_r101-d8_512x512_80k_ade20k_20200615_021256-d89c7fa4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/deeplabv3/metafile.yml | DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50-d8_512x512_160k_ade20k/deeplabv3_r50-d8_512x512_160k_ade20k_20200615_123227-5d0ee427.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/deeplabv3/metafile.yml | DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_512x512_160k_ade20k/deeplabv3_r101-d8_512x512_160k_ade20k_20200615_105816-b1f72b3b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/deeplabv3/metafile.yml | DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50-d8_512x512_20k_voc12aug/deeplabv3_r50-d8_512x512_20k_voc12aug_20200617_010906-596905ef.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/deeplabv3/metafile.yml | DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_512x512_20k_voc12aug/deeplabv3_r101-d8_512x512_20k_voc12aug_20200617_010932-8d13832f.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/deeplabv3/metafile.yml | DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50-d8_512x512_40k_voc12aug/deeplabv3_r50-d8_512x512_40k_voc12aug_20200613_161546-2ae96e7e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/deeplabv3/metafile.yml | DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_512x512_40k_voc12aug/deeplabv3_r101-d8_512x512_40k_voc12aug_20200613_161432-0017d784.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/deeplabv3/metafile.yml | DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_480x480_40k_pascal_context/deeplabv3_r101-d8_480x480_40k_pascal_context_20200911_204118-1aa27336.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/deeplabv3/metafile.yml | DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_480x480_80k_pascal_context/deeplabv3_r101-d8_480x480_80k_pascal_context_20200911_170155-2a21fff3.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/deeplabv3/metafile.yml | DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_480x480_40k_pascal_context_59/deeplabv3_r101-d8_480x480_40k_pascal_context_59_20210416_110332-cb08ea46.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/deeplabv3/metafile.yml | DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_480x480_80k_pascal_context_59/deeplabv3_r101-d8_480x480_80k_pascal_context_59_20210416_113002-26303993.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/deeplabv3plus/metafile.yml | DeeplabV3_for_Pytorch/configs/deeplabv3plus/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50-d8_512x1024_40k_cityscapes/deeplabv3plus_r50-d8_512x1024_40k_cityscapes_20200605_094610-d222ffcd.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/deeplabv3plus/metafile.yml | DeeplabV3_for_Pytorch/configs/deeplabv3plus/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_512x1024_40k_cityscapes/deeplabv3plus_r101-d8_512x1024_40k_cityscapes_20200605_094614-3769eecf.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/deeplabv3plus/metafile.yml | DeeplabV3_for_Pytorch/configs/deeplabv3plus/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50-d8_769x769_40k_cityscapes/deeplabv3plus_r50-d8_769x769_40k_cityscapes_20200606_114143-1dcb0e3c.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/deeplabv3plus/metafile.yml | DeeplabV3_for_Pytorch/configs/deeplabv3plus/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_769x769_40k_cityscapes/deeplabv3plus_r101-d8_769x769_40k_cityscapes_20200606_114304-ff414b9e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/deeplabv3plus/metafile.yml | DeeplabV3_for_Pytorch/configs/deeplabv3plus/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r18-d8_512x1024_80k_cityscapes/deeplabv3plus_r18-d8_512x1024_80k_cityscapes_20201226_080942-cff257fe.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/deeplabv3plus/metafile.yml | DeeplabV3_for_Pytorch/configs/deeplabv3plus/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50-d8_512x1024_80k_cityscapes/deeplabv3plus_r50-d8_512x1024_80k_cityscapes_20200606_114049-f9fb496d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/deeplabv3plus/metafile.yml | DeeplabV3_for_Pytorch/configs/deeplabv3plus/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_512x1024_80k_cityscapes/deeplabv3plus_r101-d8_512x1024_80k_cityscapes_20200606_114143-068fcfe9.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/deeplabv3plus/metafile.yml | DeeplabV3_for_Pytorch/configs/deeplabv3plus/metafile.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r18-d8_769x769_80k_cityscapes/deeplabv3plus_r18-d8_769x769_80k_cityscapes_20201226_083346-f326e06a.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/deeplabv3plus/metafile.yml | DeeplabV3_for_Pytorch/configs/deeplabv3plus/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50-d8_769x769_80k_cityscapes/deeplabv3plus_r50-d8_769x769_80k_cityscapes_20200606_210233-0e9dfdc4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/deeplabv3plus/metafile.yml | DeeplabV3_for_Pytorch/configs/deeplabv3plus/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_769x769_80k_cityscapes/deeplabv3plus_r101-d8_769x769_80k_cityscapes_20200607_000405-a7573d20.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/deeplabv3plus/metafile.yml | DeeplabV3_for_Pytorch/configs/deeplabv3plus/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d16-mg124_512x1024_40k_cityscapes/deeplabv3plus_r101-d16-mg124_512x1024_40k_cityscapes_20200908_005644-cf9ce186.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/deeplabv3plus/metafile.yml | DeeplabV3_for_Pytorch/configs/deeplabv3plus/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d16-mg124_512x1024_80k_cityscapes/deeplabv3plus_r101-d16-mg124_512x1024_80k_cityscapes_20200908_005644-ee6158e0.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/deeplabv3plus/metafile.yml | DeeplabV3_for_Pytorch/configs/deeplabv3plus/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r18b-d8_512x1024_80k_cityscapes/deeplabv3plus_r18b-d8_512x1024_80k_cityscapes_20201226_090828-e451abd9.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/deeplabv3plus/metafile.yml | DeeplabV3_for_Pytorch/configs/deeplabv3plus/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50b-d8_512x1024_80k_cityscapes/deeplabv3plus_r50b-d8_512x1024_80k_cityscapes_20201225_213645-a97e4e43.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/deeplabv3plus/metafile.yml | DeeplabV3_for_Pytorch/configs/deeplabv3plus/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101b-d8_512x1024_80k_cityscapes/deeplabv3plus_r101b-d8_512x1024_80k_cityscapes_20201226_190843-9c3c93a4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/deeplabv3plus/metafile.yml | DeeplabV3_for_Pytorch/configs/deeplabv3plus/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r18b-d8_769x769_80k_cityscapes/deeplabv3plus_r18b-d8_769x769_80k_cityscapes_20201226_151312-2c868aff.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/deeplabv3plus/metafile.yml | DeeplabV3_for_Pytorch/configs/deeplabv3plus/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50b-d8_769x769_80k_cityscapes/deeplabv3plus_r50b-d8_769x769_80k_cityscapes_20201225_224655-8b596d1c.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/deeplabv3plus/metafile.yml | 
DeeplabV3_for_Pytorch/configs/deeplabv3plus/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101b-d8_769x769_80k_cityscapes/deeplabv3plus_r101b-d8_769x769_80k_cityscapes_20201226_205041-227cdf7c.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/deeplabv3plus/metafile.yml | DeeplabV3_for_Pytorch/configs/deeplabv3plus/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50-d8_512x512_80k_ade20k/deeplabv3plus_r50-d8_512x512_80k_ade20k_20200614_185028-bf1400d8.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/deeplabv3plus/metafile.yml | DeeplabV3_for_Pytorch/configs/deeplabv3plus/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_512x512_80k_ade20k/deeplabv3plus_r101-d8_512x512_80k_ade20k_20200615_014139-d5730af7.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/deeplabv3plus/metafile.yml | DeeplabV3_for_Pytorch/configs/deeplabv3plus/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50-d8_512x512_160k_ade20k/deeplabv3plus_r50-d8_512x512_160k_ade20k_20200615_124504-6135c7e0.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/deeplabv3plus/metafile.yml | DeeplabV3_for_Pytorch/configs/deeplabv3plus/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_512x512_160k_ade20k/deeplabv3plus_r101-d8_512x512_160k_ade20k_20200615_123232-38ed86bb.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/deeplabv3plus/metafile.yml | DeeplabV3_for_Pytorch/configs/deeplabv3plus/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50-d8_512x512_20k_voc12aug/deeplabv3plus_r50-d8_512x512_20k_voc12aug_20200617_102323-aad58ef1.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/deeplabv3plus/metafile.yml | DeeplabV3_for_Pytorch/configs/deeplabv3plus/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_512x512_20k_voc12aug/deeplabv3plus_r101-d8_512x512_20k_voc12aug_20200617_102345-c7ff3d56.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/deeplabv3plus/metafile.yml | DeeplabV3_for_Pytorch/configs/deeplabv3plus/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50-d8_512x512_40k_voc12aug/deeplabv3plus_r50-d8_512x512_40k_voc12aug_20200613_161759-e1b43aa9.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/deeplabv3plus/metafile.yml | DeeplabV3_for_Pytorch/configs/deeplabv3plus/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_512x512_40k_voc12aug/deeplabv3plus_r101-d8_512x512_40k_voc12aug_20200613_205333-faf03387.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/deeplabv3plus/metafile.yml | DeeplabV3_for_Pytorch/configs/deeplabv3plus/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_480x480_40k_pascal_context/deeplabv3plus_r101-d8_480x480_40k_pascal_context_20200911_165459-d3c8a29e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/deeplabv3plus/metafile.yml | DeeplabV3_for_Pytorch/configs/deeplabv3plus/metafile.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_480x480_80k_pascal_context/deeplabv3plus_r101-d8_480x480_80k_pascal_context_20200911_155322-145d3ee8.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/deeplabv3plus/metafile.yml | DeeplabV3_for_Pytorch/configs/deeplabv3plus/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_480x480_40k_pascal_context_59/deeplabv3plus_r101-d8_480x480_40k_pascal_context_59_20210416_111233-ed937f15.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/deeplabv3plus/metafile.yml | DeeplabV3_for_Pytorch/configs/deeplabv3plus/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_480x480_80k_pascal_context_59/deeplabv3plus_r101-d8_480x480_80k_pascal_context_59_20210416_111127-7ca0331d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/dmnet/metafile.yml | DeeplabV3_for_Pytorch/configs/dmnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dmnet/dmnet_r50-d8_512x1024_40k_cityscapes/dmnet_r50-d8_512x1024_40k_cityscapes_20201214_115717-5e88fa33.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/dmnet/metafile.yml | DeeplabV3_for_Pytorch/configs/dmnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dmnet/dmnet_r101-d8_512x1024_40k_cityscapes/dmnet_r101-d8_512x1024_40k_cityscapes_20201214_115716-abc9d111.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/dmnet/metafile.yml | DeeplabV3_for_Pytorch/configs/dmnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dmnet/dmnet_r50-d8_769x769_40k_cityscapes/dmnet_r50-d8_769x769_40k_cityscapes_20201214_115717-2a2628d7.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/dmnet/metafile.yml | DeeplabV3_for_Pytorch/configs/dmnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dmnet/dmnet_r101-d8_769x769_40k_cityscapes/dmnet_r101-d8_769x769_40k_cityscapes_20201214_115718-b650de90.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/dmnet/metafile.yml | DeeplabV3_for_Pytorch/configs/dmnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dmnet/dmnet_r50-d8_512x1024_80k_cityscapes/dmnet_r50-d8_512x1024_80k_cityscapes_20201214_115716-987f51e3.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/dmnet/metafile.yml | DeeplabV3_for_Pytorch/configs/dmnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dmnet/dmnet_r101-d8_512x1024_80k_cityscapes/dmnet_r101-d8_512x1024_80k_cityscapes_20201214_115705-b1ff208a.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/dmnet/metafile.yml | DeeplabV3_for_Pytorch/configs/dmnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dmnet/dmnet_r50-d8_769x769_80k_cityscapes/dmnet_r50-d8_769x769_80k_cityscapes_20201214_115718-7ea9fa12.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/dmnet/metafile.yml | DeeplabV3_for_Pytorch/configs/dmnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dmnet/dmnet_r101-d8_769x769_80k_cityscapes/dmnet_r101-d8_769x769_80k_cityscapes_20201214_115716-a7fbc2ab.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/dmnet/metafile.yml | 
DeeplabV3_for_Pytorch/configs/dmnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dmnet/dmnet_r50-d8_512x512_80k_ade20k/dmnet_r50-d8_512x512_80k_ade20k_20201214_115705-a8626293.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/dmnet/metafile.yml | DeeplabV3_for_Pytorch/configs/dmnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dmnet/dmnet_r101-d8_512x512_80k_ade20k/dmnet_r101-d8_512x512_80k_ade20k_20201214_115704-c656c3fb.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/dmnet/metafile.yml | DeeplabV3_for_Pytorch/configs/dmnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dmnet/dmnet_r50-d8_512x512_160k_ade20k/dmnet_r50-d8_512x512_160k_ade20k_20201214_115706-25fb92c2.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/dmnet/metafile.yml | DeeplabV3_for_Pytorch/configs/dmnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dmnet/dmnet_r101-d8_512x512_160k_ade20k/dmnet_r101-d8_512x512_160k_ade20k_20201214_115705-73f9a8d7.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/dnlnet/metafile.yml | DeeplabV3_for_Pytorch/configs/dnlnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dnlnet/dnl_r50-d8_512x1024_40k_cityscapes/dnl_r50-d8_512x1024_40k_cityscapes_20200904_233629-53d4ea93.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/dnlnet/metafile.yml | DeeplabV3_for_Pytorch/configs/dnlnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dnlnet/dnl_r101-d8_512x1024_40k_cityscapes/dnl_r101-d8_512x1024_40k_cityscapes_20200904_233629-9928ffef.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/dnlnet/metafile.yml | DeeplabV3_for_Pytorch/configs/dnlnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dnlnet/dnl_r50-d8_769x769_40k_cityscapes/dnl_r50-d8_769x769_40k_cityscapes_20200820_232206-0f283785.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/dnlnet/metafile.yml | DeeplabV3_for_Pytorch/configs/dnlnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dnlnet/dnl_r101-d8_769x769_40k_cityscapes/dnl_r101-d8_769x769_40k_cityscapes_20200820_171256-76c596df.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/dnlnet/metafile.yml | DeeplabV3_for_Pytorch/configs/dnlnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dnlnet/dnl_r50-d8_512x1024_80k_cityscapes/dnl_r50-d8_512x1024_80k_cityscapes_20200904_233629-58b2f778.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/dnlnet/metafile.yml | DeeplabV3_for_Pytorch/configs/dnlnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dnlnet/dnl_r101-d8_512x1024_80k_cityscapes/dnl_r101-d8_512x1024_80k_cityscapes_20200904_233629-758e2dd4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/dnlnet/metafile.yml | DeeplabV3_for_Pytorch/configs/dnlnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dnlnet/dnl_r50-d8_769x769_80k_cityscapes/dnl_r50-d8_769x769_80k_cityscapes_20200820_011925-366bc4c7.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/dnlnet/metafile.yml | DeeplabV3_for_Pytorch/configs/dnlnet/metafile.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/dnlnet/dnl_r101-d8_769x769_80k_cityscapes/dnl_r101-d8_769x769_80k_cityscapes_20200821_051111-95ff84ab.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/dnlnet/metafile.yml | DeeplabV3_for_Pytorch/configs/dnlnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dnlnet/dnl_r50-d8_512x512_80k_ade20k/dnl_r50-d8_512x512_80k_ade20k_20200826_183354-1cf6e0c1.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/dnlnet/metafile.yml | DeeplabV3_for_Pytorch/configs/dnlnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dnlnet/dnl_r101-d8_512x512_80k_ade20k/dnl_r101-d8_512x512_80k_ade20k_20200826_183354-d820d6ea.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/dnlnet/metafile.yml | DeeplabV3_for_Pytorch/configs/dnlnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dnlnet/dnl_r50-d8_512x512_160k_ade20k/dnl_r50-d8_512x512_160k_ade20k_20200826_183350-37837798.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/dnlnet/metafile.yml | DeeplabV3_for_Pytorch/configs/dnlnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dnlnet/dnl_r101-d8_512x512_160k_ade20k/dnl_r101-d8_512x512_160k_ade20k_20200826_183350-ed522c61.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/emanet/metafile.yml | DeeplabV3_for_Pytorch/configs/emanet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/emanet/emanet_r50-d8_512x1024_80k_cityscapes/emanet_r50-d8_512x1024_80k_cityscapes_20200901_100301-c43fcef1.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/emanet/metafile.yml | DeeplabV3_for_Pytorch/configs/emanet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/emanet/emanet_r101-d8_512x1024_80k_cityscapes/emanet_r101-d8_512x1024_80k_cityscapes_20200901_100301-2d970745.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/emanet/metafile.yml | DeeplabV3_for_Pytorch/configs/emanet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/emanet/emanet_r50-d8_769x769_80k_cityscapes/emanet_r50-d8_769x769_80k_cityscapes_20200901_100301-16f8de52.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/emanet/metafile.yml | DeeplabV3_for_Pytorch/configs/emanet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/emanet/emanet_r101-d8_769x769_80k_cityscapes/emanet_r101-d8_769x769_80k_cityscapes_20200901_100301-47a324ce.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/encnet/metafile.yml | DeeplabV3_for_Pytorch/configs/encnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/encnet/encnet_r50-d8_512x1024_40k_cityscapes/encnet_r50-d8_512x1024_40k_cityscapes_20200621_220958-68638a47.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/encnet/metafile.yml | DeeplabV3_for_Pytorch/configs/encnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/encnet/encnet_r101-d8_512x1024_40k_cityscapes/encnet_r101-d8_512x1024_40k_cityscapes_20200621_220933-35e0a3e8.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/encnet/metafile.yml | DeeplabV3_for_Pytorch/configs/encnet/metafile.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/encnet/encnet_r50-d8_769x769_40k_cityscapes/encnet_r50-d8_769x769_40k_cityscapes_20200621_220958-3bcd2884.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/encnet/metafile.yml | DeeplabV3_for_Pytorch/configs/encnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/encnet/encnet_r101-d8_769x769_40k_cityscapes/encnet_r101-d8_769x769_40k_cityscapes_20200621_220933-2fafed55.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/encnet/metafile.yml | DeeplabV3_for_Pytorch/configs/encnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/encnet/encnet_r50-d8_512x1024_80k_cityscapes/encnet_r50-d8_512x1024_80k_cityscapes_20200622_003554-fc5c5624.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/encnet/metafile.yml | DeeplabV3_for_Pytorch/configs/encnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/encnet/encnet_r101-d8_512x1024_80k_cityscapes/encnet_r101-d8_512x1024_80k_cityscapes_20200622_003555-1de64bec.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/encnet/metafile.yml | DeeplabV3_for_Pytorch/configs/encnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/encnet/encnet_r50-d8_769x769_80k_cityscapes/encnet_r50-d8_769x769_80k_cityscapes_20200622_003554-55096dcb.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/encnet/metafile.yml | DeeplabV3_for_Pytorch/configs/encnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/encnet/encnet_r101-d8_769x769_80k_cityscapes/encnet_r101-d8_769x769_80k_cityscapes_20200622_003555-470ef79d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/encnet/metafile.yml | DeeplabV3_for_Pytorch/configs/encnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/encnet/encnet_r50-d8_512x512_80k_ade20k/encnet_r50-d8_512x512_80k_ade20k_20200622_042412-44b46b04.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/encnet/metafile.yml | DeeplabV3_for_Pytorch/configs/encnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/encnet/encnet_r101-d8_512x512_80k_ade20k/encnet_r101-d8_512x512_80k_ade20k_20200622_101128-dd35e237.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/encnet/metafile.yml | DeeplabV3_for_Pytorch/configs/encnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/encnet/encnet_r50-d8_512x512_160k_ade20k/encnet_r50-d8_512x512_160k_ade20k_20200622_101059-b2db95e0.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/encnet/metafile.yml | DeeplabV3_for_Pytorch/configs/encnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/encnet/encnet_r101-d8_512x512_160k_ade20k/encnet_r101-d8_512x512_160k_ade20k_20200622_073348-7989641f.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/fastscnn/metafile.yml | DeeplabV3_for_Pytorch/configs/fastscnn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fast_scnn/fast_scnn_4x8_80k_lr0.12_cityscapes-f5096c79.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/fcn/metafile.yml | DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r50-d8_512x1024_40k_cityscapes/fcn_r50-d8_512x1024_40k_cityscapes_20200604_192608-efe53f0d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/fcn/metafile.yml | DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_512x1024_40k_cityscapes/fcn_r101-d8_512x1024_40k_cityscapes_20200604_181852-a883d3a1.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/fcn/metafile.yml | DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r50-d8_769x769_40k_cityscapes/fcn_r50-d8_769x769_40k_cityscapes_20200606_113104-977b5d02.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/fcn/metafile.yml | DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_769x769_40k_cityscapes/fcn_r101-d8_769x769_40k_cityscapes_20200606_113208-7d4ab69c.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/fcn/metafile.yml | DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r18-d8_512x1024_80k_cityscapes/fcn_r18-d8_512x1024_80k_cityscapes_20201225_021327-6c50f8b4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/fcn/metafile.yml | DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r50-d8_512x1024_80k_cityscapes/fcn_r50-d8_512x1024_80k_cityscapes_20200606_113019-03aa804d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/fcn/metafile.yml | DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_512x1024_80k_cityscapes/fcn_r101-d8_512x1024_80k_cityscapes_20200606_113038-3fb937eb.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/fcn/metafile.yml | DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r18-d8_769x769_80k_cityscapes/fcn_r18-d8_769x769_80k_cityscapes_20201225_021451-9739d1b8.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/fcn/metafile.yml | DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r50-d8_769x769_80k_cityscapes/fcn_r50-d8_769x769_80k_cityscapes_20200606_195749-f5caeabc.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/fcn/metafile.yml | DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_769x769_80k_cityscapes/fcn_r101-d8_769x769_80k_cityscapes_20200606_214354-45cbac68.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/fcn/metafile.yml | DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r18b-d8_512x1024_80k_cityscapes/fcn_r18b-d8_512x1024_80k_cityscapes_20201225_230143-92c0f445.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/fcn/metafile.yml | DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r50b-d8_512x1024_80k_cityscapes/fcn_r50b-d8_512x1024_80k_cityscapes_20201225_094221-82957416.pth | 下载预训练模型 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmsegmentation.git/configs/fcn/metafile.yml | DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101b-d8_512x1024_80k_cityscapes/fcn_r101b-d8_512x1024_80k_cityscapes_20201226_160213-4543858f.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/fcn/metafile.yml | DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r18b-d8_769x769_80k_cityscapes/fcn_r18b-d8_769x769_80k_cityscapes_20201226_004430-32d504e5.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/fcn/metafile.yml | DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r50b-d8_769x769_80k_cityscapes/fcn_r50b-d8_769x769_80k_cityscapes_20201225_094223-94552d38.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/fcn/metafile.yml | DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101b-d8_769x769_80k_cityscapes/fcn_r101b-d8_769x769_80k_cityscapes_20201226_170012-82be37e2.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/fcn/metafile.yml | DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_d6_r50-d16_512x1024_40k_cityscapes/fcn_d6_r50-d16_512x1024_40k_cityscapes-98d5d1bc.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/fcn/metafile.yml | DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_d6_r50-d16_512x1024_80k_cityscapes/fcn_d6_r50-d16_512x1024_40k_cityscapes-98d5d1bc.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/fcn/metafile.yml | DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_d6_r50-d16_769x769_40k_cityscapes/fcn_d6_r50-d16_769x769_40k_cityscapes-1aab18ed.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/fcn/metafile.yml | DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_d6_r50-d16_769x769_80k_cityscapes/fcn_d6_r50-d16_769x769_80k_cityscapes-109d88eb.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/fcn/metafile.yml | DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_d6_r101-d16_512x1024_40k_cityscapes/fcn_d6_r101-d16_512x1024_40k_cityscapes-9cf2b450.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/fcn/metafile.yml | DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_d6_r101-d16_512x1024_80k_cityscapes/fcn_d6_r101-d16_512x1024_80k_cityscapes-cb336445.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/fcn/metafile.yml | DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_d6_r101-d16_769x769_40k_cityscapes/fcn_d6_r101-d16_769x769_40k_cityscapes-60b114e9.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/fcn/metafile.yml | DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_d6_r101-d16_769x769_80k_cityscapes/fcn_d6_r101-d16_769x769_80k_cityscapes-e33adc4f.pth 
| 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/fcn/metafile.yml | DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r50-d8_512x512_80k_ade20k/fcn_r50-d8_512x512_80k_ade20k_20200614_144016-f8ac5082.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/fcn/metafile.yml | DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_512x512_80k_ade20k/fcn_r101-d8_512x512_80k_ade20k_20200615_014143-bc1809f7.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/fcn/metafile.yml | DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r50-d8_512x512_160k_ade20k/fcn_r50-d8_512x512_160k_ade20k_20200615_100713-4edbc3b4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/fcn/metafile.yml | DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_512x512_160k_ade20k/fcn_r101-d8_512x512_160k_ade20k_20200615_105816-fd192bd5.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/fcn/metafile.yml | DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r50-d8_512x512_20k_voc12aug/fcn_r50-d8_512x512_20k_voc12aug_20200617_010715-52dc5306.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/fcn/metafile.yml | DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_512x512_20k_voc12aug/fcn_r101-d8_512x512_20k_voc12aug_20200617_010842-0bb4e798.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/fcn/metafile.yml | DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r50-d8_512x512_40k_voc12aug/fcn_r50-d8_512x512_40k_voc12aug_20200613_161222-5e2dbf40.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/fcn/metafile.yml | DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_512x512_40k_voc12aug/fcn_r101-d8_512x512_40k_voc12aug_20200613_161240-4c8bcefd.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/fcn/metafile.yml | DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_480x480_40k_pascal_context/fcn_r101-d8_480x480_40k_pascal_context-20210421_154757-b5e97937.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/fcn/metafile.yml | DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_480x480_80k_pascal_context/fcn_r101-d8_480x480_80k_pascal_context-20210421_163310-4711813f.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/fcn/metafile.yml | DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_480x480_40k_pascal_context_59/fcn_r101-d8_480x480_40k_pascal_context_59_20210415_230724-8cf83682.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/fcn/metafile.yml | DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_480x480_80k_pascal_context_59/fcn_r101-d8_480x480_80k_pascal_context_59_20210416_110804-9a6f2c94.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/fp16/metafile.yml | DeeplabV3_for_Pytorch/configs/fp16/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fp16/fcn_r101-d8_512x1024_80k_fp16_cityscapes/fcn_r101-d8_512x1024_80k_fp16_cityscapes-50245227.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/fp16/metafile.yml | DeeplabV3_for_Pytorch/configs/fp16/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fp16/pspnet_r101-d8_512x1024_80k_fp16_cityscapes/pspnet_r101-d8_512x1024_80k_fp16_cityscapes-ade37931.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/fp16/metafile.yml | DeeplabV3_for_Pytorch/configs/fp16/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fp16/deeplabv3_r101-d8_512x1024_80k_fp16_cityscapes/deeplabv3_r101-d8_512x1024_80k_fp16_cityscapes-bc86dc84.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/fp16/metafile.yml | DeeplabV3_for_Pytorch/configs/fp16/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fp16/deeplabv3plus_r101-d8_512x1024_80k_fp16_cityscapes/deeplabv3plus_r101-d8_512x1024_80k_fp16_cityscapes-cc58bc8d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/gcnet/metafile.yml | DeeplabV3_for_Pytorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r50-d8_512x1024_40k_cityscapes/gcnet_r50-d8_512x1024_40k_cityscapes_20200618_074436-4b0fd17b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/gcnet/metafile.yml | DeeplabV3_for_Pytorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r101-d8_512x1024_40k_cityscapes/gcnet_r101-d8_512x1024_40k_cityscapes_20200618_074436-5e62567f.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/gcnet/metafile.yml | DeeplabV3_for_Pytorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r50-d8_769x769_40k_cityscapes/gcnet_r50-d8_769x769_40k_cityscapes_20200618_182814-a26f4471.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/gcnet/metafile.yml | DeeplabV3_for_Pytorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r101-d8_769x769_40k_cityscapes/gcnet_r101-d8_769x769_40k_cityscapes_20200619_092550-ca4f0a84.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/gcnet/metafile.yml | DeeplabV3_for_Pytorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r50-d8_512x1024_80k_cityscapes/gcnet_r50-d8_512x1024_80k_cityscapes_20200618_074450-ef8f069b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/gcnet/metafile.yml | DeeplabV3_for_Pytorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r101-d8_512x1024_80k_cityscapes/gcnet_r101-d8_512x1024_80k_cityscapes_20200618_074450-778ebf69.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/gcnet/metafile.yml | DeeplabV3_for_Pytorch/configs/gcnet/metafile.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r50-d8_769x769_80k_cityscapes/gcnet_r50-d8_769x769_80k_cityscapes_20200619_092516-4839565b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/gcnet/metafile.yml | DeeplabV3_for_Pytorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r101-d8_769x769_80k_cityscapes/gcnet_r101-d8_769x769_80k_cityscapes_20200619_092628-8e043423.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/gcnet/metafile.yml | DeeplabV3_for_Pytorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r50-d8_512x512_80k_ade20k/gcnet_r50-d8_512x512_80k_ade20k_20200614_185146-91a6da41.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/gcnet/metafile.yml | DeeplabV3_for_Pytorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r101-d8_512x512_80k_ade20k/gcnet_r101-d8_512x512_80k_ade20k_20200615_020811-c3fcb6dd.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/gcnet/metafile.yml | DeeplabV3_for_Pytorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r50-d8_512x512_160k_ade20k/gcnet_r50-d8_512x512_160k_ade20k_20200615_224122-d95f3e1f.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/gcnet/metafile.yml | DeeplabV3_for_Pytorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r101-d8_512x512_160k_ade20k/gcnet_r101-d8_512x512_160k_ade20k_20200615_225406-615528d7.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/gcnet/metafile.yml | DeeplabV3_for_Pytorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r50-d8_512x512_20k_voc12aug/gcnet_r50-d8_512x512_20k_voc12aug_20200617_165701-3cbfdab1.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/gcnet/metafile.yml | DeeplabV3_for_Pytorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r101-d8_512x512_20k_voc12aug/gcnet_r101-d8_512x512_20k_voc12aug_20200617_165713-6c720aa9.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/gcnet/metafile.yml | DeeplabV3_for_Pytorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r50-d8_512x512_40k_voc12aug/gcnet_r50-d8_512x512_40k_voc12aug_20200613_195105-9797336d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/gcnet/metafile.yml | DeeplabV3_for_Pytorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r101-d8_512x512_40k_voc12aug/gcnet_r101-d8_512x512_40k_voc12aug_20200613_185806-1e38208d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/hrnet/metafile.yml | DeeplabV3_for_Pytorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18s_512x1024_40k_cityscapes/fcn_hr18s_512x1024_40k_cityscapes_20200601_014216-93db27d0.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/hrnet/metafile.yml | DeeplabV3_for_Pytorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18_512x1024_40k_cityscapes/fcn_hr18_512x1024_40k_cityscapes_20200601_014216-f196fb4e.pth | 下载预训练模型 | -| 
开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/hrnet/metafile.yml | DeeplabV3_for_Pytorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_512x1024_40k_cityscapes/fcn_hr48_512x1024_40k_cityscapes_20200601_014240-a989b146.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/hrnet/metafile.yml | DeeplabV3_for_Pytorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18s_512x1024_80k_cityscapes/fcn_hr18s_512x1024_80k_cityscapes_20200601_202700-1462b75d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/hrnet/metafile.yml | DeeplabV3_for_Pytorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18_512x1024_80k_cityscapes/fcn_hr18_512x1024_80k_cityscapes_20200601_223255-4e7b345e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/hrnet/metafile.yml | DeeplabV3_for_Pytorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_512x1024_80k_cityscapes/fcn_hr48_512x1024_80k_cityscapes_20200601_202606-58ea95d6.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/hrnet/metafile.yml | DeeplabV3_for_Pytorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18s_512x1024_160k_cityscapes/fcn_hr18s_512x1024_160k_cityscapes_20200602_190901-4a0797ea.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/hrnet/metafile.yml | DeeplabV3_for_Pytorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18_512x1024_160k_cityscapes/fcn_hr18_512x1024_160k_cityscapes_20200602_190822-221e4a4f.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/hrnet/metafile.yml | DeeplabV3_for_Pytorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_512x1024_160k_cityscapes/fcn_hr48_512x1024_160k_cityscapes_20200602_190946-59b7973e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/hrnet/metafile.yml | DeeplabV3_for_Pytorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18s_512x512_80k_ade20k/fcn_hr18s_512x512_80k_ade20k_20200614_144345-77fc814a.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/hrnet/metafile.yml | DeeplabV3_for_Pytorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18_512x512_80k_ade20k/fcn_hr18_512x512_80k_ade20k_20200614_185145-66f20cb7.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/hrnet/metafile.yml | DeeplabV3_for_Pytorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_512x512_80k_ade20k/fcn_hr48_512x512_80k_ade20k_20200614_193946-7ba5258d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/hrnet/metafile.yml | DeeplabV3_for_Pytorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18s_512x512_160k_ade20k/fcn_hr18s_512x512_160k_ade20k_20200614_214413-870f65ac.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/hrnet/metafile.yml | DeeplabV3_for_Pytorch/configs/hrnet/metafile.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18_512x512_160k_ade20k/fcn_hr18_512x512_160k_ade20k_20200614_214426-ca961836.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/hrnet/metafile.yml | DeeplabV3_for_Pytorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_512x512_160k_ade20k/fcn_hr48_512x512_160k_ade20k_20200614_214407-a52fc02c.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/hrnet/metafile.yml | DeeplabV3_for_Pytorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18s_512x512_20k_voc12aug/fcn_hr18s_512x512_20k_voc12aug_20200617_224503-56e36088.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/hrnet/metafile.yml | DeeplabV3_for_Pytorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18_512x512_20k_voc12aug/fcn_hr18_512x512_20k_voc12aug_20200617_224503-488d45f7.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/hrnet/metafile.yml | DeeplabV3_for_Pytorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_512x512_20k_voc12aug/fcn_hr48_512x512_20k_voc12aug_20200617_224419-89de05cd.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/hrnet/metafile.yml | DeeplabV3_for_Pytorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18s_512x512_40k_voc12aug/fcn_hr18s_512x512_40k_voc12aug_20200614_000648-4f8d6e7f.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/hrnet/metafile.yml | DeeplabV3_for_Pytorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18_512x512_40k_voc12aug/fcn_hr18_512x512_40k_voc12aug_20200613_224401-1b4b76cd.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/hrnet/metafile.yml | DeeplabV3_for_Pytorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_512x512_40k_voc12aug/fcn_hr48_512x512_40k_voc12aug_20200613_222111-1b0f18bc.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/hrnet/metafile.yml | DeeplabV3_for_Pytorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_480x480_40k_pascal_context/fcn_hr48_480x480_40k_pascal_context_20200911_164852-667d00b0.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/hrnet/metafile.yml | DeeplabV3_for_Pytorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_480x480_80k_pascal_context/fcn_hr48_480x480_80k_pascal_context_20200911_155322-847a6711.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/hrnet/metafile.yml | DeeplabV3_for_Pytorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_480x480_40k_pascal_context_59/fcn_hr48_480x480_40k_pascal_context_59_20210410_122738-b808b8b2.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/hrnet/metafile.yml | DeeplabV3_for_Pytorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_480x480_80k_pascal_context_59/fcn_hr48_480x480_80k_pascal_context_59_20210411_003240-3ae7081e.pth | 下载预训练模型 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmsegmentation.git/configs/mobilenet_v2/metafile.yml | DeeplabV3_for_Pytorch/configs/mobilenet_v2/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/mobilenet_v2/fcn_m-v2-d8_512x1024_80k_cityscapes/fcn_m-v2-d8_512x1024_80k_cityscapes_20200825_124817-d24c28c1.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/mobilenet_v2/metafile.yml | DeeplabV3_for_Pytorch/configs/mobilenet_v2/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/mobilenet_v2/pspnet_m-v2-d8_512x1024_80k_cityscapes/pspnet_m-v2-d8_512x1024_80k_cityscapes_20200825_124817-19e81d51.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/mobilenet_v2/metafile.yml | DeeplabV3_for_Pytorch/configs/mobilenet_v2/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/mobilenet_v2/deeplabv3_m-v2-d8_512x1024_80k_cityscapes/deeplabv3_m-v2-d8_512x1024_80k_cityscapes_20200825_124836-bef03590.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/mobilenet_v2/metafile.yml | DeeplabV3_for_Pytorch/configs/mobilenet_v2/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/mobilenet_v2/deeplabv3plus_m-v2-d8_512x1024_80k_cityscapes/deeplabv3plus_m-v2-d8_512x1024_80k_cityscapes_20200825_124836-d256dd4b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/mobilenet_v2/metafile.yml | DeeplabV3_for_Pytorch/configs/mobilenet_v2/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/mobilenet_v2/fcn_m-v2-d8_512x512_160k_ade20k/fcn_m-v2-d8_512x512_160k_ade20k_20200825_214953-c40e1095.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/mobilenet_v2/metafile.yml | DeeplabV3_for_Pytorch/configs/mobilenet_v2/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/mobilenet_v2/pspnet_m-v2-d8_512x512_160k_ade20k/pspnet_m-v2-d8_512x512_160k_ade20k_20200825_214953-f5942f7a.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/mobilenet_v2/metafile.yml | DeeplabV3_for_Pytorch/configs/mobilenet_v2/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/mobilenet_v2/deeplabv3_m-v2-d8_512x512_160k_ade20k/deeplabv3_m-v2-d8_512x512_160k_ade20k_20200825_223255-63986343.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/mobilenet_v2/metafile.yml | DeeplabV3_for_Pytorch/configs/mobilenet_v2/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/mobilenet_v2/deeplabv3plus_m-v2-d8_512x512_160k_ade20k/deeplabv3plus_m-v2-d8_512x512_160k_ade20k_20200825_223255-465a01d4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/mobilenet_v3/metafile.yml | DeeplabV3_for_Pytorch/configs/mobilenet_v3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/mobilenet_v3/lraspp_m-v3-d8_512x1024_320k_cityscapes/lraspp_m-v3-d8_512x1024_320k_cityscapes_20201224_220337-cfe8fb07.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/mobilenet_v3/metafile.yml | DeeplabV3_for_Pytorch/configs/mobilenet_v3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/mobilenet_v3/lraspp_m-v3-d8_scratch_512x1024_320k_cityscapes/lraspp_m-v3-d8_scratch_512x1024_320k_cityscapes_20201224_220337-9f29cd72.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/mobilenet_v3/metafile.yml | DeeplabV3_for_Pytorch/configs/mobilenet_v3/metafile.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/mobilenet_v3/lraspp_m-v3s-d8_512x1024_320k_cityscapes/lraspp_m-v3s-d8_512x1024_320k_cityscapes_20201224_223935-61565b34.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/mobilenet_v3/metafile.yml | DeeplabV3_for_Pytorch/configs/mobilenet_v3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/mobilenet_v3/lraspp_m-v3s-d8_scratch_512x1024_320k_cityscapes/lraspp_m-v3s-d8_scratch_512x1024_320k_cityscapes_20201224_223935-03daeabb.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/nonlocal_net/metafile.yml | DeeplabV3_for_Pytorch/configs/nonlocal_net/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r50-d8_512x1024_40k_cityscapes/nonlocal_r50-d8_512x1024_40k_cityscapes_20200605_210748-c75e81e3.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/nonlocal_net/metafile.yml | DeeplabV3_for_Pytorch/configs/nonlocal_net/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r101-d8_512x1024_40k_cityscapes/nonlocal_r101-d8_512x1024_40k_cityscapes_20200605_210748-d63729fa.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/nonlocal_net/metafile.yml | DeeplabV3_for_Pytorch/configs/nonlocal_net/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r50-d8_769x769_40k_cityscapes/nonlocal_r50-d8_769x769_40k_cityscapes_20200530_045243-82ef6749.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/nonlocal_net/metafile.yml | DeeplabV3_for_Pytorch/configs/nonlocal_net/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r101-d8_769x769_40k_cityscapes/nonlocal_r101-d8_769x769_40k_cityscapes_20200530_045348-8fe9a9dc.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/nonlocal_net/metafile.yml | DeeplabV3_for_Pytorch/configs/nonlocal_net/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r50-d8_512x1024_80k_cityscapes/nonlocal_r50-d8_512x1024_80k_cityscapes_20200607_193518-d6839fae.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/nonlocal_net/metafile.yml | DeeplabV3_for_Pytorch/configs/nonlocal_net/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r101-d8_512x1024_80k_cityscapes/nonlocal_r101-d8_512x1024_80k_cityscapes_20200607_183411-32700183.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/nonlocal_net/metafile.yml | DeeplabV3_for_Pytorch/configs/nonlocal_net/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r50-d8_769x769_80k_cityscapes/nonlocal_r50-d8_769x769_80k_cityscapes_20200607_193506-1f9792f6.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/nonlocal_net/metafile.yml | DeeplabV3_for_Pytorch/configs/nonlocal_net/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r101-d8_769x769_80k_cityscapes/nonlocal_r101-d8_769x769_80k_cityscapes_20200607_183428-0e1fa4f9.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/nonlocal_net/metafile.yml | DeeplabV3_for_Pytorch/configs/nonlocal_net/metafile.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r50-d8_512x512_80k_ade20k/nonlocal_r50-d8_512x512_80k_ade20k_20200615_015801-5ae0aa33.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/nonlocal_net/metafile.yml | DeeplabV3_for_Pytorch/configs/nonlocal_net/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r101-d8_512x512_80k_ade20k/nonlocal_r101-d8_512x512_80k_ade20k_20200615_015758-24105919.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/nonlocal_net/metafile.yml | DeeplabV3_for_Pytorch/configs/nonlocal_net/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r50-d8_512x512_160k_ade20k/nonlocal_r50-d8_512x512_160k_ade20k_20200616_005410-baef45e3.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/nonlocal_net/metafile.yml | DeeplabV3_for_Pytorch/configs/nonlocal_net/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r101-d8_512x512_160k_ade20k/nonlocal_r101-d8_512x512_160k_ade20k_20200616_003422-affd0f8d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/nonlocal_net/metafile.yml | DeeplabV3_for_Pytorch/configs/nonlocal_net/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r50-d8_512x512_20k_voc12aug/nonlocal_r50-d8_512x512_20k_voc12aug_20200617_222613-07f2a57c.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/nonlocal_net/metafile.yml | DeeplabV3_for_Pytorch/configs/nonlocal_net/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r101-d8_512x512_20k_voc12aug/nonlocal_r101-d8_512x512_20k_voc12aug_20200617_222615-948c68ab.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/nonlocal_net/metafile.yml | DeeplabV3_for_Pytorch/configs/nonlocal_net/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r50-d8_512x512_40k_voc12aug/nonlocal_r50-d8_512x512_40k_voc12aug_20200614_000028-0139d4a9.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/nonlocal_net/metafile.yml | DeeplabV3_for_Pytorch/configs/nonlocal_net/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r101-d8_512x512_40k_voc12aug/nonlocal_r101-d8_512x512_40k_voc12aug_20200614_000028-7e5ff470.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/ocrnet/metafile.yml | DeeplabV3_for_Pytorch/configs/ocrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18s_512x1024_40k_cityscapes/ocrnet_hr18s_512x1024_40k_cityscapes_20200601_033304-fa2436c2.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/ocrnet/metafile.yml | DeeplabV3_for_Pytorch/configs/ocrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18_512x1024_40k_cityscapes/ocrnet_hr18_512x1024_40k_cityscapes_20200601_033320-401c5bdd.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/ocrnet/metafile.yml | DeeplabV3_for_Pytorch/configs/ocrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr48_512x1024_40k_cityscapes/ocrnet_hr48_512x1024_40k_cityscapes_20200601_033336-55b32491.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/ocrnet/metafile.yml | 
DeeplabV3_for_Pytorch/configs/ocrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18s_512x1024_80k_cityscapes/ocrnet_hr18s_512x1024_80k_cityscapes_20200601_222735-55979e63.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/ocrnet/metafile.yml | DeeplabV3_for_Pytorch/configs/ocrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18_512x1024_80k_cityscapes/ocrnet_hr18_512x1024_80k_cityscapes_20200614_230521-c2e1dd4a.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/ocrnet/metafile.yml | DeeplabV3_for_Pytorch/configs/ocrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr48_512x1024_80k_cityscapes/ocrnet_hr48_512x1024_80k_cityscapes_20200601_222752-9076bcdf.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/ocrnet/metafile.yml | DeeplabV3_for_Pytorch/configs/ocrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18s_512x1024_160k_cityscapes/ocrnet_hr18s_512x1024_160k_cityscapes_20200602_191005-f4a7af28.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/ocrnet/metafile.yml | DeeplabV3_for_Pytorch/configs/ocrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18_512x1024_160k_cityscapes/ocrnet_hr18_512x1024_160k_cityscapes_20200602_191001-b9172d0c.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/ocrnet/metafile.yml | DeeplabV3_for_Pytorch/configs/ocrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr48_512x1024_160k_cityscapes/ocrnet_hr48_512x1024_160k_cityscapes_20200602_191037-dfbf1b0c.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/ocrnet/metafile.yml | DeeplabV3_for_Pytorch/configs/ocrnet/metafile.yml | https://github.com/open-mmlab/mmsegmentation/blob/master/configs/ocrnet/ocrnet_r101-d8_512x1024_40k_b8_cityscapes.py | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/ocrnet/metafile.yml | DeeplabV3_for_Pytorch/configs/ocrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_r101-d8_512x1024_40k_b8_cityscapes/ocrnet_r101-d8_512x1024_40k_b8_cityscapes-02ac0f13.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/ocrnet/metafile.yml | DeeplabV3_for_Pytorch/configs/ocrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_r101-d8_512x1024_40k_b16_cityscapes/ocrnet_r101-d8_512x1024_40k_b16_cityscapes-db500f80.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/ocrnet/metafile.yml | DeeplabV3_for_Pytorch/configs/ocrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18s_512x512_80k_ade20k/ocrnet_hr18s_512x512_80k_ade20k_20200615_055600-e80b62af.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/ocrnet/metafile.yml | DeeplabV3_for_Pytorch/configs/ocrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18_512x512_80k_ade20k/ocrnet_hr18_512x512_80k_ade20k_20200615_053157-d173d83b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/ocrnet/metafile.yml | DeeplabV3_for_Pytorch/configs/ocrnet/metafile.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr48_512x512_80k_ade20k/ocrnet_hr48_512x512_80k_ade20k_20200615_021518-d168c2d1.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/ocrnet/metafile.yml | DeeplabV3_for_Pytorch/configs/ocrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18s_512x512_160k_ade20k/ocrnet_hr18s_512x512_160k_ade20k_20200615_184505-8e913058.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/ocrnet/metafile.yml | DeeplabV3_for_Pytorch/configs/ocrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18_512x512_160k_ade20k/ocrnet_hr18_512x512_160k_ade20k_20200615_200940-d8fcd9d1.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/ocrnet/metafile.yml | DeeplabV3_for_Pytorch/configs/ocrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr48_512x512_160k_ade20k/ocrnet_hr48_512x512_160k_ade20k_20200615_184705-a073726d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/ocrnet/metafile.yml | DeeplabV3_for_Pytorch/configs/ocrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18s_512x512_20k_voc12aug/ocrnet_hr18s_512x512_20k_voc12aug_20200617_233913-02b04fcb.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/ocrnet/metafile.yml | DeeplabV3_for_Pytorch/configs/ocrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18_512x512_20k_voc12aug/ocrnet_hr18_512x512_20k_voc12aug_20200617_233932-8954cbb7.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/ocrnet/metafile.yml | DeeplabV3_for_Pytorch/configs/ocrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr48_512x512_20k_voc12aug/ocrnet_hr48_512x512_20k_voc12aug_20200617_233932-9e82080a.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/ocrnet/metafile.yml | DeeplabV3_for_Pytorch/configs/ocrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18s_512x512_40k_voc12aug/ocrnet_hr18s_512x512_40k_voc12aug_20200614_002025-42b587ac.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/ocrnet/metafile.yml | DeeplabV3_for_Pytorch/configs/ocrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18_512x512_40k_voc12aug/ocrnet_hr18_512x512_40k_voc12aug_20200614_015958-714302be.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/ocrnet/metafile.yml | DeeplabV3_for_Pytorch/configs/ocrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr48_512x512_40k_voc12aug/ocrnet_hr48_512x512_40k_voc12aug_20200614_015958-255bc5ce.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/point_rend/metafile.yml | DeeplabV3_for_Pytorch/configs/point_rend/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/point_rend/pointrend_r50_512x1024_80k_cityscapes/pointrend_r50_512x1024_80k_cityscapes_20200711_015821-bb1ff523.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/point_rend/metafile.yml | DeeplabV3_for_Pytorch/configs/point_rend/metafile.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/point_rend/pointrend_r101_512x1024_80k_cityscapes/pointrend_r101_512x1024_80k_cityscapes_20200711_170850-d0ca84be.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/point_rend/metafile.yml | DeeplabV3_for_Pytorch/configs/point_rend/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/point_rend/pointrend_r50_512x512_160k_ade20k/pointrend_r50_512x512_160k_ade20k_20200807_232644-ac3febf2.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/point_rend/metafile.yml | DeeplabV3_for_Pytorch/configs/point_rend/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/point_rend/pointrend_r101_512x512_160k_ade20k/pointrend_r101_512x512_160k_ade20k_20200808_030852-8834902a.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/psanet/metafile.yml | DeeplabV3_for_Pytorch/configs/psanet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r50-d8_512x1024_40k_cityscapes/psanet_r50-d8_512x1024_40k_cityscapes_20200606_103117-99fac37c.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/psanet/metafile.yml | DeeplabV3_for_Pytorch/configs/psanet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r101-d8_512x1024_40k_cityscapes/psanet_r101-d8_512x1024_40k_cityscapes_20200606_001418-27b9cfa7.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/psanet/metafile.yml | DeeplabV3_for_Pytorch/configs/psanet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r50-d8_769x769_40k_cityscapes/psanet_r50-d8_769x769_40k_cityscapes_20200530_033717-d5365506.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/psanet/metafile.yml | DeeplabV3_for_Pytorch/configs/psanet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r101-d8_769x769_40k_cityscapes/psanet_r101-d8_769x769_40k_cityscapes_20200530_035107-997da1e6.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/psanet/metafile.yml | DeeplabV3_for_Pytorch/configs/psanet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r50-d8_512x1024_80k_cityscapes/psanet_r50-d8_512x1024_80k_cityscapes_20200606_161842-ab60a24f.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/psanet/metafile.yml | DeeplabV3_for_Pytorch/configs/psanet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r101-d8_512x1024_80k_cityscapes/psanet_r101-d8_512x1024_80k_cityscapes_20200606_161823-0f73a169.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/psanet/metafile.yml | DeeplabV3_for_Pytorch/configs/psanet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r50-d8_769x769_80k_cityscapes/psanet_r50-d8_769x769_80k_cityscapes_20200606_225134-fe42f49e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/psanet/metafile.yml | DeeplabV3_for_Pytorch/configs/psanet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r101-d8_769x769_80k_cityscapes/psanet_r101-d8_769x769_80k_cityscapes_20200606_214550-7665827b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/psanet/metafile.yml | DeeplabV3_for_Pytorch/configs/psanet/metafile.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r50-d8_512x512_80k_ade20k/psanet_r50-d8_512x512_80k_ade20k_20200614_144141-835e4b97.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/psanet/metafile.yml | DeeplabV3_for_Pytorch/configs/psanet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r101-d8_512x512_80k_ade20k/psanet_r101-d8_512x512_80k_ade20k_20200614_185117-1fab60d4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/psanet/metafile.yml | DeeplabV3_for_Pytorch/configs/psanet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r50-d8_512x512_160k_ade20k/psanet_r50-d8_512x512_160k_ade20k_20200615_161258-148077dd.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/psanet/metafile.yml | DeeplabV3_for_Pytorch/configs/psanet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r101-d8_512x512_160k_ade20k/psanet_r101-d8_512x512_160k_ade20k_20200615_161537-dbfa564c.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/psanet/metafile.yml | DeeplabV3_for_Pytorch/configs/psanet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r50-d8_512x512_20k_voc12aug/psanet_r50-d8_512x512_20k_voc12aug_20200617_102413-2f1bbaa1.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/psanet/metafile.yml | DeeplabV3_for_Pytorch/configs/psanet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r101-d8_512x512_20k_voc12aug/psanet_r101-d8_512x512_20k_voc12aug_20200617_110624-946fef11.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/psanet/metafile.yml | DeeplabV3_for_Pytorch/configs/psanet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r50-d8_512x512_40k_voc12aug/psanet_r50-d8_512x512_40k_voc12aug_20200613_161946-f596afb5.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/psanet/metafile.yml | DeeplabV3_for_Pytorch/configs/psanet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r101-d8_512x512_40k_voc12aug/psanet_r101-d8_512x512_40k_voc12aug_20200613_161946-1f560f9e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/pspnet/metafile.yml | DeeplabV3_for_Pytorch/configs/pspnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_512x1024_40k_cityscapes/pspnet_r50-d8_512x1024_40k_cityscapes_20200605_003338-2966598c.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/pspnet/metafile.yml | DeeplabV3_for_Pytorch/configs/pspnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_512x1024_40k_cityscapes/pspnet_r101-d8_512x1024_40k_cityscapes_20200604_232751-467e7cf4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/pspnet/metafile.yml | DeeplabV3_for_Pytorch/configs/pspnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_769x769_40k_cityscapes/pspnet_r50-d8_769x769_40k_cityscapes_20200606_112725-86638686.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/pspnet/metafile.yml | DeeplabV3_for_Pytorch/configs/pspnet/metafile.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_769x769_40k_cityscapes/pspnet_r101-d8_769x769_40k_cityscapes_20200606_112753-61c6f5be.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/pspnet/metafile.yml | DeeplabV3_for_Pytorch/configs/pspnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r18-d8_512x1024_80k_cityscapes/pspnet_r18-d8_512x1024_80k_cityscapes_20201225_021458-09ffa746.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/pspnet/metafile.yml | DeeplabV3_for_Pytorch/configs/pspnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_512x1024_80k_cityscapes/pspnet_r50-d8_512x1024_80k_cityscapes_20200606_112131-2376f12b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/pspnet/metafile.yml | DeeplabV3_for_Pytorch/configs/pspnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_512x1024_80k_cityscapes/pspnet_r101-d8_512x1024_80k_cityscapes_20200606_112211-e1e1100f.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/pspnet/metafile.yml | DeeplabV3_for_Pytorch/configs/pspnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r18-d8_769x769_80k_cityscapes/pspnet_r18-d8_769x769_80k_cityscapes_20201225_021458-3deefc62.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/pspnet/metafile.yml | DeeplabV3_for_Pytorch/configs/pspnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_769x769_80k_cityscapes/pspnet_r50-d8_769x769_80k_cityscapes_20200606_210121-5ccf03dd.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/pspnet/metafile.yml | DeeplabV3_for_Pytorch/configs/pspnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_769x769_80k_cityscapes/pspnet_r101-d8_769x769_80k_cityscapes_20200606_225055-dba412fa.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/pspnet/metafile.yml | DeeplabV3_for_Pytorch/configs/pspnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r18b-d8_512x1024_80k_cityscapes/pspnet_r18b-d8_512x1024_80k_cityscapes_20201226_063116-26928a60.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/pspnet/metafile.yml | DeeplabV3_for_Pytorch/configs/pspnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50b-d8_512x1024_80k_cityscapes/pspnet_r50b-d8_512x1024_80k_cityscapes_20201225_094315-6344287a.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/pspnet/metafile.yml | DeeplabV3_for_Pytorch/configs/pspnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101b-d8_512x1024_80k_cityscapes/pspnet_r101b-d8_512x1024_80k_cityscapes_20201226_170012-3a4d38ab.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/pspnet/metafile.yml | DeeplabV3_for_Pytorch/configs/pspnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r18b-d8_769x769_80k_cityscapes/pspnet_r18b-d8_769x769_80k_cityscapes_20201226_080942-bf98d186.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/pspnet/metafile.yml | DeeplabV3_for_Pytorch/configs/pspnet/metafile.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50b-d8_769x769_80k_cityscapes/pspnet_r50b-d8_769x769_80k_cityscapes_20201225_094316-4c643cf6.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/pspnet/metafile.yml | DeeplabV3_for_Pytorch/configs/pspnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101b-d8_769x769_80k_cityscapes/pspnet_r101b-d8_769x769_80k_cityscapes_20201226_171823-f0e7c293.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/pspnet/metafile.yml | DeeplabV3_for_Pytorch/configs/pspnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_512x512_80k_ade20k/pspnet_r50-d8_512x512_80k_ade20k_20200615_014128-15a8b914.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/pspnet/metafile.yml | DeeplabV3_for_Pytorch/configs/pspnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_512x512_80k_ade20k/pspnet_r101-d8_512x512_80k_ade20k_20200614_031423-b6e782f0.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/pspnet/metafile.yml | DeeplabV3_for_Pytorch/configs/pspnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_512x512_160k_ade20k/pspnet_r50-d8_512x512_160k_ade20k_20200615_184358-1890b0bd.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/pspnet/metafile.yml | DeeplabV3_for_Pytorch/configs/pspnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_512x512_160k_ade20k/pspnet_r101-d8_512x512_160k_ade20k_20200615_100650-967c316f.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/pspnet/metafile.yml | DeeplabV3_for_Pytorch/configs/pspnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_512x512_20k_voc12aug/pspnet_r50-d8_512x512_20k_voc12aug_20200617_101958-ed5dfbd9.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/pspnet/metafile.yml | DeeplabV3_for_Pytorch/configs/pspnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_512x512_20k_voc12aug/pspnet_r101-d8_512x512_20k_voc12aug_20200617_102003-4aef3c9a.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/pspnet/metafile.yml | DeeplabV3_for_Pytorch/configs/pspnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_512x512_40k_voc12aug/pspnet_r50-d8_512x512_40k_voc12aug_20200613_161222-ae9c1b8c.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/pspnet/metafile.yml | DeeplabV3_for_Pytorch/configs/pspnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_512x512_40k_voc12aug/pspnet_r101-d8_512x512_40k_voc12aug_20200613_161222-bc933b18.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/pspnet/metafile.yml | DeeplabV3_for_Pytorch/configs/pspnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_480x480_40k_pascal_context/pspnet_r101-d8_480x480_40k_pascal_context_20200911_211210-bf0f5d7c.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/pspnet/metafile.yml | DeeplabV3_for_Pytorch/configs/pspnet/metafile.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_480x480_80k_pascal_context/pspnet_r101-d8_480x480_80k_pascal_context_20200911_190530-c86d6233.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/pspnet/metafile.yml | DeeplabV3_for_Pytorch/configs/pspnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_480x480_40k_pascal_context_59/pspnet_r101-d8_480x480_40k_pascal_context_59_20210416_114524-86d44cd4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/pspnet/metafile.yml | DeeplabV3_for_Pytorch/configs/pspnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_480x480_80k_pascal_context_59/pspnet_r101-d8_480x480_80k_pascal_context_59_20210416_114418-fa6caaa2.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/pspnet/metafile.yml | DeeplabV3_for_Pytorch/configs/pspnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/resnest/fcn_s101-d8_512x1024_80k_cityscapes/fcn_s101-d8_512x1024_80k_cityscapes_20200807_140631-f8d155b3.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/resnest/metafile.yml | DeeplabV3_for_Pytorch/configs/resnest/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/resnest/pspnet_s101-d8_512x1024_80k_cityscapes/pspnet_s101-d8_512x1024_80k_cityscapes_20200807_140631-c75f3b99.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/resnest/metafile.yml | DeeplabV3_for_Pytorch/configs/resnest/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/resnest/deeplabv3_s101-d8_512x1024_80k_cityscapes/deeplabv3_s101-d8_512x1024_80k_cityscapes_20200807_144429-b73c4270.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/resnest/metafile.yml | DeeplabV3_for_Pytorch/configs/resnest/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/resnest/deeplabv3plus_s101-d8_512x1024_80k_cityscapes/deeplabv3plus_s101-d8_512x1024_80k_cityscapes_20200807_144429-1239eb43.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/resnest/metafile.yml | DeeplabV3_for_Pytorch/configs/resnest/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/resnest/fcn_s101-d8_512x512_160k_ade20k/fcn_s101-d8_512x512_160k_ade20k_20200807_145416-d3160329.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/resnest/metafile.yml | DeeplabV3_for_Pytorch/configs/resnest/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/resnest/pspnet_s101-d8_512x512_160k_ade20k/pspnet_s101-d8_512x512_160k_ade20k_20200807_145416-a6daa92a.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/resnest/metafile.yml | DeeplabV3_for_Pytorch/configs/resnest/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/resnest/deeplabv3_s101-d8_512x512_160k_ade20k/deeplabv3_s101-d8_512x512_160k_ade20k_20200807_144503-17ecabe5.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/resnest/metafile.yml | DeeplabV3_for_Pytorch/configs/resnest/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/resnest/deeplabv3plus_s101-d8_512x512_160k_ade20k/deeplabv3plus_s101-d8_512x512_160k_ade20k_20200807_144503-27b26226.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/sem_fpn/metafile.yml | 
DeeplabV3_for_Pytorch/configs/sem_fpn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/sem_fpn/fpn_r50_512x1024_80k_cityscapes/fpn_r50_512x1024_80k_cityscapes_20200717_021437-94018a0d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/sem_fpn/metafile.yml | DeeplabV3_for_Pytorch/configs/sem_fpn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/sem_fpn/fpn_r101_512x1024_80k_cityscapes/fpn_r101_512x1024_80k_cityscapes_20200717_012416-c5800d4c.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/sem_fpn/metafile.yml | DeeplabV3_for_Pytorch/configs/sem_fpn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/sem_fpn/fpn_r50_512x512_160k_ade20k/fpn_r50_512x512_160k_ade20k_20200718_131734-5b5a6ab9.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/sem_fpn/metafile.yml | DeeplabV3_for_Pytorch/configs/sem_fpn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/sem_fpn/fpn_r101_512x512_160k_ade20k/fpn_r101_512x512_160k_ade20k_20200718_131734-306b5004.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/unet/metafile.yml | DeeplabV3_for_Pytorch/configs/unet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/fcn_unet_s5-d16_64x64_40k_drive/fcn_unet_s5-d16_64x64_40k_drive_20201223_191051-26cee593.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/unet/metafile.yml | DeeplabV3_for_Pytorch/configs/unet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/pspnet_unet_s5-d16_64x64_40k_drive/pspnet_unet_s5-d16_64x64_40k_drive_20201227_181818-aac73387.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/unet/metafile.yml | DeeplabV3_for_Pytorch/configs/unet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/deeplabv3_unet_s5-d16_64x64_40k_drive/deeplabv3_unet_s5-d16_64x64_40k_drive_20201226_094047-0671ff20.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/unet/metafile.yml | DeeplabV3_for_Pytorch/configs/unet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/fcn_unet_s5-d16_128x128_40k_stare/fcn_unet_s5-d16_128x128_40k_stare_20201223_191051-6ea7cfda.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/unet/metafile.yml | DeeplabV3_for_Pytorch/configs/unet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/pspnet_unet_s5-d16_128x128_40k_stare/pspnet_unet_s5-d16_128x128_40k_stare_20201227_181818-3c2923c4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/unet/metafile.yml | DeeplabV3_for_Pytorch/configs/unet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/deeplabv3_unet_s5-d16_128x128_40k_stare/deeplabv3_unet_s5-d16_128x128_40k_stare_20201226_094047-93dcb93c.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/unet/metafile.yml | DeeplabV3_for_Pytorch/configs/unet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/fcn_unet_s5-d16_128x128_40k_chase_db1/fcn_unet_s5-d16_128x128_40k_chase_db1_20201223_191051-95852f45.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/unet/metafile.yml | DeeplabV3_for_Pytorch/configs/unet/metafile.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/unet/pspnet_unet_s5-d16_128x128_40k_chase_db1/pspnet_unet_s5-d16_128x128_40k_chase_db1_20201227_181818-68d4e609.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/unet/metafile.yml | DeeplabV3_for_Pytorch/configs/unet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/deeplabv3_unet_s5-d16_128x128_40k_chase_db1/deeplabv3_unet_s5-d16_128x128_40k_chase_db1_20201226_094047-4c5aefa3.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/unet/metafile.yml | DeeplabV3_for_Pytorch/configs/unet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/fcn_unet_s5-d16_256x256_40k_hrf/fcn_unet_s5-d16_256x256_40k_hrf_20201223_173724-df3ec8c4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/unet/metafile.yml | DeeplabV3_for_Pytorch/configs/unet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/pspnet_unet_s5-d16_256x256_40k_hrf/pspnet_unet_s5-d16_256x256_40k_hrf_20201227_181818-fdb7e29b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/unet/metafile.yml | DeeplabV3_for_Pytorch/configs/unet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/deeplabv3_unet_s5-d16_256x256_40k_hrf/deeplabv3_unet_s5-d16_256x256_40k_hrf_20201226_094047-3a1fdf85.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/upernet/metafile.yml | DeeplabV3_for_Pytorch/configs/upernet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r50_512x1024_40k_cityscapes/upernet_r50_512x1024_40k_cityscapes_20200605_094827-aa54cb54.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/upernet/metafile.yml | DeeplabV3_for_Pytorch/configs/upernet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r101_512x1024_40k_cityscapes/upernet_r101_512x1024_40k_cityscapes_20200605_094933-ebce3b10.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/upernet/metafile.yml | DeeplabV3_for_Pytorch/configs/upernet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r50_769x769_40k_cityscapes/upernet_r50_769x769_40k_cityscapes_20200530_033048-92d21539.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/upernet/metafile.yml | DeeplabV3_for_Pytorch/configs/upernet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r101_769x769_40k_cityscapes/upernet_r101_769x769_40k_cityscapes_20200530_040819-83c95d01.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/upernet/metafile.yml | DeeplabV3_for_Pytorch/configs/upernet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r50_512x1024_80k_cityscapes/upernet_r50_512x1024_80k_cityscapes_20200607_052207-848beca8.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/upernet/metafile.yml | DeeplabV3_for_Pytorch/configs/upernet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r101_512x1024_80k_cityscapes/upernet_r101_512x1024_80k_cityscapes_20200607_002403-f05f2345.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/upernet/metafile.yml | DeeplabV3_for_Pytorch/configs/upernet/metafile.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r50_769x769_80k_cityscapes/upernet_r50_769x769_80k_cityscapes_20200607_005107-82ae7d15.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/upernet/metafile.yml | DeeplabV3_for_Pytorch/configs/upernet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r101_769x769_80k_cityscapes/upernet_r101_769x769_80k_cityscapes_20200607_001014-082fc334.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/upernet/metafile.yml | DeeplabV3_for_Pytorch/configs/upernet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r50_512x512_80k_ade20k/upernet_r50_512x512_80k_ade20k_20200614_144127-ecc8377b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/upernet/metafile.yml | DeeplabV3_for_Pytorch/configs/upernet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r101_512x512_80k_ade20k/upernet_r101_512x512_80k_ade20k_20200614_185117-32e4db94.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/upernet/metafile.yml | DeeplabV3_for_Pytorch/configs/upernet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r50_512x512_160k_ade20k/upernet_r50_512x512_160k_ade20k_20200615_184328-8534de8d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/upernet/metafile.yml | DeeplabV3_for_Pytorch/configs/upernet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r101_512x512_160k_ade20k/upernet_r101_512x512_160k_ade20k_20200615_161951-91b32684.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/upernet/metafile.yml | DeeplabV3_for_Pytorch/configs/upernet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r50_512x512_20k_voc12aug/upernet_r50_512x512_20k_voc12aug_20200617_165330-5b5890a7.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/upernet/metafile.yml | DeeplabV3_for_Pytorch/configs/upernet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r101_512x512_20k_voc12aug/upernet_r101_512x512_20k_voc12aug_20200617_165629-f14e7f27.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/upernet/metafile.yml | DeeplabV3_for_Pytorch/configs/upernet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r50_512x512_40k_voc12aug/upernet_r50_512x512_40k_voc12aug_20200613_162257-ca9bcc6b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/upernet/metafile.yml | DeeplabV3_for_Pytorch/configs/upernet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r101_512x512_40k_voc12aug/upernet_r101_512x512_40k_voc12aug_20200613_163549-e26476ac.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/docker/Dockerfile | DeeplabV3_for_Pytorch/docker/Dockerfile | https://download.openmmlab.com/mmcv/dist/index.html | 下载依赖 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/docker/Dockerfil | DeeplabV3_for_Pytorch/docker/Dockerfile | https://github.com/open-mmlab/mmsegmentation.git | 源码地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/docker/serve/Dockerfile | DeeplabV3_for_Pytorch/docker/serve/Dockerfile | https://download.openmmlab.com/mmcv/dist/cu101/torch1.6.0/index.html | 下载依赖 | -| 
开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/Dockerfile | DeeplabV3_for_Pytorch/Dockerfile | https://github.com/open-mmlab/mmcv.git | 源码地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/docs/stat.py | DeeplabV3_for_Pytorch/docs/stat.py | https://github.com/open-mmlab/mmsegmentation/blob/master/ | 源码地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/docs_zh-CN/stat.py | DeeplabV3_for_Pytorch/docs_zh-CN/stat.py | https://github.com/open-mmlab/mmsegmentation/blob/master/ | 源码地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/Dockerfile | DeeplabV3_for_Pytorch/env_set.sh | https://github.com/open-mmlab/mmcv.git | 源码地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/setup.py | DeeplabV3_for_Pytorch/setup.py | openmmlab@gmail.com | 邮箱 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/setup.py | DeeplabV3_for_Pytorch/setup.py | http://github.com/open-mmlab/mmsegmentation | 源码地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/mmseg/models/necks/mla_neck.py | DeeplabV3_for_Pytorch/mmseg/models/necks/mla_neck.py | https://arxiv.org/pdf/2012.15840.pdf | 论文地址 | -| 开发引入 | / | DeeplabV3_for_Pytorch/mmseg/models/losses/dice_loss.py | https://github.com/LikeLy-Journey/SegmenTron/blob/master/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/swin/upernet_swin_base_patch4_window7_512x512_160k_ade20k_pretrain_224x224_1K.py | DeeplabV3_for_Pytorch/configs/swin/upernet_swin_base_patch4_window7_512x512_160k_ade20k_pretrain_224x224_1K.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_base_patch4_window7_224.pth | 预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/mmseg/models/decode_heads/cc_head.py | DeeplabV3_for_Pytorch/mmseg/models/decode_heads/cc_head.py | https://arxiv.org/abs/1811.11721 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/mmseg/models/backbones/swin.py | DeeplabV3_for_Pytorch/mmseg/models/backbones/swin.py | https://github.com/microsoft/Swin-Transformer | 源码实现 | -| 开发引入 | / | DeeplabV3_for_Pytorch/configs/vit/upernet_deit-s16_mln_512x512_160k_ade20k.py | https://dl.fbaipublicfiles.com/deit/deit_small_patch16_224-cd65a155.pth | 预训练模型 | -| 开发引入 | / | DeeplabV3_for_Pytorch/configs/vit/upernet_deit-b16_512x512_80k_ade20k.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_224-b5f2ef4d.pth | 预训练模型 | -| 开发引入 | / | DeeplabV3_for_Pytorch/configs/vit/upernet_deit-b16_ln_mln_512x512_160k_ade20k.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_224-b5f2ef4d.pth | 预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/mmseg/models/decode_heads/psa_head.py | DeeplabV3_for_Pytorch/mmseg/models/decode_heads/psa_head.py | https://hszhao.github.io/papers/eccv18_psanet.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/mmseg/models/decode_heads/point_head.py | DeeplabV3_for_Pytorch/mmseg/models/decode_heads/point_head.py | https://github.com/facebookresearch/detectron2/tree/master/projects/PointRend/point_head/point_head.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/mmseg/models/decode_heads/aspp_head.py | DeeplabV3_for_Pytorch/mmseg/models/decode_heads/aspp_head.py | https://arxiv.org/abs/1706.05587 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/swin/upernet_swin_tiny_patch4_window7_512x512_160k_ade20k_pretrain_224x224_1K.py | 
DeeplabV3_for_Pytorch/configs/swin/upernet_swin_tiny_patch4_window7_512x512_160k_ade20k_pretrain_224x224_1K.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_tiny_patch4_window7_224.pth | 预训练模型 | -| 开发引入 | / | DeeplabV3_for_Pytorch/configs/vit/upernet_deit-s16_512x512_160k_ade20k.py | https://dl.fbaipublicfiles.com/deit/deit_small_patch16_224-cd65a155.pth | 预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/mmseg/models/decode_heads/ema_head.py | DeeplabV3_for_Pytorch/mmseg/models/decode_heads/ema_head.py | https://arxiv.org/abs/1907.13426 | 论文地址 | -| 开发引入 | / | DeeplabV3_for_Pytorch/configs/vit/upernet_deit-s16_ln_mln_512x512_160k_ade20k.py | https://dl.fbaipublicfiles.com/deit/deit_small_patch16_224-cd65a155.pth | 预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/mmseg/models/backbones/vit.py | DeeplabV3_for_Pytorch/mmseg/models/backbones/vit.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/vision_transformer.py#L353 | 源码实现 | -| 开发引入 | / | DeeplabV3_for_Pytorch/mmcv_need/epoch_based_runner.py | https://github.com/open-mmlab/mmcv/pull/1108 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/swin/upernet_swin_base_patch4_window7_512x512_160k_ade20k_pretrain_224x224_22K.py | DeeplabV3_for_Pytorch/configs/swin/upernet_swin_base_patch4_window7_512x512_160k_ade20k_pretrain_224x224_22K.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_base_patch4_window7_224_22k.pth | 预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/mmseg/models/decode_heads/dm_head.py | DeeplabV3_for_Pytorch/mmseg/models/decode_heads/dm_head.py | https://openaccess.thecvf.com/content_ICCV_2019/papers/ | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/mmseg/models/backbones/mobilenet_v3.py | DeeplabV3_for_Pytorch/mmseg/models/backbones/mobilenet_v3.py | https://ieeexplore.ieee.org/document/9008835 | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/mmseg/models/backbones/vit.py | DeeplabV3_for_Pytorch/mmseg/models/backbones/vit.py | https://arxiv.org/abs/2010.11929 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/mmseg/models/decode_heads/ann_head.py | DeeplabV3_for_Pytorch/mmseg/models/decode_heads/ann_head.py | https://arxiv.org/abs/1908.07678 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/mmseg/models/losses/dice_loss.py | DeeplabV3_for_Pytorch/mmseg/models/losses/dice_loss.py | https://arxiv.org/abs/1606.04797 | 论文地址 | -| 开发引入 | / | DeeplabV3_for_Pytorch/mmseg/models/losses/lovasz_loss.py | https://github.com/bermanmaxim/LovaszSoftmax/blob/master/pytorch | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/mmseg/models/decode_heads/ocr_head.py | DeeplabV3_for_Pytorch/mmseg/models/decode_heads/ocr_head.py | https://arxiv.org/abs/1909.11065 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/mmseg/models/decode_heads/enc_head.py | DeeplabV3_for_Pytorch/mmseg/models/decode_heads/enc_head.py | https://arxiv.org/abs/1803.08904 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/mmseg/models/decode_heads/apc_head.py | DeeplabV3_for_Pytorch/mmseg/models/decode_heads/apc_head.py | https://openaccess.thecvf.com/content_CVPR_2019/papers/ | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/mmseg/models/decode_heads/da_head.py | DeeplabV3_for_Pytorch/mmseg/models/decode_heads/da_head.py | 
https://arxiv.org/abs/1809.02983 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/docs_zh-CN/make.bat | DeeplabV3_for_Pytorch/docs_zh-CN/make.bat | http://sphinx-doc.org/ | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/docs/conf.py | DeeplabV3_for_Pytorch/docs/conf.py | https://www.sphinx-doc.org/en/master/usage/configuration.html | 相关说明 | -| 开发引入 | / | DeeplabV3_for_Pytorch/configs/vit/upernet_deit-b16_mln_512x512_160k_ade20k.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_224-b5f2ef4d.pth | 预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/mmseg/models/decode_heads/psp_head.py | DeeplabV3_for_Pytorch/mmseg/models/decode_heads/psp_head.py | https://arxiv.org/abs/1612.01105 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/mmseg/models/decode_heads/fcn_head.py | DeeplabV3_for_Pytorch/mmseg/models/decode_heads/fcn_head.py | https://arxiv.org/abs/1411.4038 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/mmseg/core/seg/sampler/ohem_pixel_sampler.py | DeeplabV3_for_Pytorch/mmseg/core/seg/sampler/ohem_pixel_sampler.py | https://github.com/pytorch/pytorch/issues/22812 | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/mmseg/models/losses/lovasz_loss.py | DeeplabV3_for_Pytorch/mmseg/models/losses/lovasz_loss.py | https://arxiv.org/abs/1705.08790 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/mmseg/models/decode_heads/nl_head.py | DeeplabV3_for_Pytorch/mmseg/models/decode_heads/nl_head.py | https://arxiv.org/abs/1711.07971 | 论文地址 | -| 开发引入 | / | DeeplabV3_for_Pytorch/configs/vit/upernet_deit-s16_512x512_80k_ade20k.py | https://dl.fbaipublicfiles.com/deit/deit_small_patch16_224-cd65a155.pth | 预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/mmseg/models/utils/self_attention_block.py | DeeplabV3_for_Pytorch/mmseg/models/utils/self_attention_block.py | https://arxiv.org/abs/1706.03762 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/docs_zh-CN/conf.py | DeeplabV3_for_Pytorch/docs_zh-CN/conf.py | https://www.sphinx-doc.org/en/master/usage/configuration.html | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/mmseg/models/decode_heads/gc_head.py | DeeplabV3_for_Pytorch/mmseg/models/decode_heads/gc_head.py | https://arxiv.org/abs/1904.11492 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/mmseg/models/decode_heads/setr_up_head.py | DeeplabV3_for_Pytorch/mmseg/models/decode_heads/setr_up_head.py | https://arxiv.org/pdf/2012.15840.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/mmseg/models/decode_heads/setr_mla_head.py | DeeplabV3_for_Pytorch/mmseg/models/decode_heads/setr_mla_head.py | https://arxiv.org/pdf/2012.15840.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/mmseg/models/backbones/swin.py | DeeplabV3_for_Pytorch/mmseg/models/backbones/swin.py | https://arxiv.org/abs/2103.14030 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/mmseg/models/decode_heads/sep_aspp_head.py | DeeplabV3_for_Pytorch/mmseg/models/decode_heads/sep_aspp_head.py | https://arxiv.org/abs/1802.02611 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/mmseg/models/decode_heads/lraspp_head.py | DeeplabV3_for_Pytorch/mmseg/models/decode_heads/lraspp_head.py | https://ieeexplore.ieee.org/document/9008835 | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/setup.py | 
DeeplabV3_for_Pytorch/setup.py | http://setuptools.readthedocs.io/en/latest/setuptools.html#declaring-platform-specific-dependencies | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/mmseg/models/backbones/resnet.py | DeeplabV3_for_Pytorch/mmseg/models/backbones/resnet.py | https://arxiv.org/pdf/1812.01187.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/mmseg/datasets/builder.py | DeeplabV3_for_Pytorch/mmseg/datasets/builder.py | https://github.com/pytorch/pytorch/issues/973 | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/mmseg/models/decode_heads/uper_head.py | DeeplabV3_for_Pytorch/mmseg/models/decode_heads/uper_head.py | https://arxiv.org/abs/1807.10221 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/mmseg/models/utils/make_divisible.py | DeeplabV3_for_Pytorch/mmseg/models/utils/make_divisible.py | https://github.com/tensorflow/models/blob/master/research/slim/nets/mobilenet/mobilenet.py | 论文实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/swin/upernet_swin_base_patch4_window12_512x512_160k_ade20k_pretrain_384x384_1K.py | DeeplabV3_for_Pytorch/configs/swin/upernet_swin_base_patch4_window12_512x512_160k_ade20k_pretrain_384x384_1K.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_base_patch4_window12_384.pth | 预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/mmseg/models/decode_heads/fpn_head.py | DeeplabV3_for_Pytorch/mmseg/models/decode_heads/fpn_head.py | https://arxiv.org/abs/1901.02446 | 论文地址 | -| 开发引入 | / | DeeplabV3_for_Pytorch/configs/vit/upernet_deit-b16_512x512_160k_ade20k.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_224-b5f2ef4d.pth | 预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/mmseg/models/backbones/unet.py | DeeplabV3_for_Pytorch/mmseg/models/backbones/unet.py | https://arxiv.org/pdf/1505.04597.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/mmseg/models/backbones/mit.py | DeeplabV3_for_Pytorch/mmseg/models/backbones/mit.py | https://arxiv.org/pdf/2105.15203.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/swin/upernet_swin_small_patch4_window7_512x512_160k_ade20k_pretrain_224x224_1K.py | DeeplabV3_for_Pytorch/configs/swin/upernet_swin_small_patch4_window7_512x512_160k_ade20k_pretrain_224x224_1K.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_small_patch4_window7_224.pth | 预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/mmseg/models/decode_heads/dnl_head.py | DeeplabV3_for_Pytorch/mmseg/models/decode_heads/dnl_head.py | https://arxiv.org/abs/2006.06668 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/mmseg/models/necks/fpn.py | DeeplabV3_for_Pytorch/mmseg/models/necks/fpn.py | https://arxiv.org/abs/1612.03144 | 论文地址 | -| 开发引入 | / | DeeplabV3_for_Pytorch/tests/test_metrics.py | https://pytorch.org/docs/stable/generated/torch.histc.html | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/configs/swin/upernet_swin_base_patch4_window12_512x512_160k_ade20k_pretrain_384x384_22K.py | DeeplabV3_for_Pytorch/configs/swin/upernet_swin_base_patch4_window12_512x512_160k_ade20k_pretrain_384x384_22K.py | https://github.com/SwinTransformer/storage/releases/download/v1.0.0/swin_base_patch4_window12_384_22k.pth | 预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/mmseg/models/backbones/hrnet.py | 
DeeplabV3_for_Pytorch/mmseg/models/backbones/hrnet.py | https://arxiv.org/abs/1904.04514 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/mmseg/models/backbones/cgnet.py | DeeplabV3_for_Pytorch/mmseg/models/backbones/cgnet.py | https://arxiv.org/abs/1811.08201 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/docs/make.bat | DeeplabV3_for_Pytorch/docs/make.bat | http://sphinx-doc.org/ | 相关说明 | +| 文件位置 | 公网地址 | 公网地址用途 | +|--------------------------------------------------------------------------------------------------------------------------------------------|---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|-----------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/ann/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r50-d8_512x1024_40k_cityscapes/ann_r50-d8_512x1024_40k_cityscapes_20200605_095211-049fc292.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/ann/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r101-d8_512x1024_40k_cityscapes/ann_r101-d8_512x1024_40k_cityscapes_20200605_095243-adf6eece.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/ann/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r50-d8_769x769_40k_cityscapes/ann_r50-d8_769x769_40k_cityscapes_20200530_025712-2b46b04d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/ann/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r101-d8_769x769_40k_cityscapes/ann_r101-d8_769x769_40k_cityscapes_20200530_025720-059bff28.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/ann/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r50-d8_512x1024_80k_cityscapes/ann_r50-d8_512x1024_80k_cityscapes_20200607_101911-5a9ad545.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/ann/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r101-d8_512x1024_80k_cityscapes/ann_r101-d8_512x1024_80k_cityscapes_20200607_013728-aceccc6e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/ann/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r50-d8_769x769_80k_cityscapes/ann_r50-d8_769x769_80k_cityscapes_20200607_044426-cc7ff323.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/ann/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r101-d8_769x769_80k_cityscapes/ann_r101-d8_769x769_80k_cityscapes_20200607_013713-a9d4be8d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/ann/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r50-d8_512x512_80k_ade20k/ann_r50-d8_512x512_80k_ade20k_20200615_014818-26f75e11.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/ann/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r101-d8_512x512_80k_ade20k/ann_r101-d8_512x512_80k_ade20k_20200615_014818-c0153543.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/ann/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r50-d8_512x512_160k_ade20k/ann_r50-d8_512x512_160k_ade20k_20200615_231733-892247bc.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/ann/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r101-d8_512x512_160k_ade20k/ann_r101-d8_512x512_160k_ade20k_20200615_231733-955eb1ec.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/ann/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r50-d8_512x512_20k_voc12aug/ann_r50-d8_512x512_20k_voc12aug_20200617_222246-dfcb1c62.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/ann/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r101-d8_512x512_20k_voc12aug/ann_r101-d8_512x512_20k_voc12aug_20200617_222246-2fad0042.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/ann/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r50-d8_512x512_40k_voc12aug/ann_r50-d8_512x512_40k_voc12aug_20200613_231314-b5dac322.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/ann/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r101-d8_512x512_40k_voc12aug/ann_r101-d8_512x512_40k_voc12aug_20200613_231314-bd205bbe.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/apcnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/apcnet/apcnet_r50-d8_512x1024_40k_cityscapes/apcnet_r50-d8_512x1024_40k_cityscapes_20201214_115717-5e88fa33.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/apcnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/apcnet/apcnet_r101-d8_512x1024_40k_cityscapes/apcnet_r101-d8_512x1024_40k_cityscapes_20201214_115716-abc9d111.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/apcnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/apcnet/apcnet_r50-d8_769x769_40k_cityscapes/apcnet_r50-d8_769x769_40k_cityscapes_20201214_115717-2a2628d7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/apcnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/apcnet/apcnet_r101-d8_769x769_40k_cityscapes/apcnet_r101-d8_769x769_40k_cityscapes_20201214_115718-b650de90.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/apcnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/apcnet/apcnet_r50-d8_512x1024_80k_cityscapes/apcnet_r50-d8_512x1024_80k_cityscapes_20201214_115716-987f51e3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/apcnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/apcnet/apcnet_r101-d8_512x1024_80k_cityscapes/apcnet_r101-d8_512x1024_80k_cityscapes_20201214_115705-b1ff208a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/apcnet/metafile.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/apcnet/apcnet_r50-d8_769x769_80k_cityscapes/apcnet_r50-d8_769x769_80k_cityscapes_20201214_115718-7ea9fa12.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/apcnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/apcnet/apcnet_r101-d8_769x769_80k_cityscapes/apcnet_r101-d8_769x769_80k_cityscapes_20201214_115716-a7fbc2ab.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/apcnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/apcnet/apcnet_r50-d8_512x512_80k_ade20k/apcnet_r50-d8_512x512_80k_ade20k_20201214_115705-a8626293.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/apcnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/apcnet/apcnet_r101-d8_512x512_80k_ade20k/apcnet_r101-d8_512x512_80k_ade20k_20201214_115704-c656c3fb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/apcnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/apcnet/apcnet_r50-d8_512x512_160k_ade20k/apcnet_r50-d8_512x512_160k_ade20k_20201214_115706-25fb92c2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/apcnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/apcnet/apcnet_r101-d8_512x512_160k_ade20k/apcnet_r101-d8_512x512_160k_ade20k_20201214_115705-73f9a8d7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/ccnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r50-d8_512x1024_40k_cityscapes/ccnet_r50-d8_512x1024_40k_cityscapes_20200616_142517-4123f401.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/ccnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r101-d8_512x1024_40k_cityscapes/ccnet_r101-d8_512x1024_40k_cityscapes_20200616_142540-a3b84ba6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/ccnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r50-d8_769x769_40k_cityscapes/ccnet_r50-d8_769x769_40k_cityscapes_20200616_145125-76d11884.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/ccnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r101-d8_769x769_40k_cityscapes/ccnet_r101-d8_769x769_40k_cityscapes_20200617_101428-4f57c8d0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/ccnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r50-d8_512x1024_80k_cityscapes/ccnet_r50-d8_512x1024_80k_cityscapes_20200617_010421-869a3423.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/ccnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r101-d8_512x1024_80k_cityscapes/ccnet_r101-d8_512x1024_80k_cityscapes_20200617_203935-ffae8917.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/ccnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r50-d8_769x769_80k_cityscapes/ccnet_r50-d8_769x769_80k_cityscapes_20200617_010421-73eed8ca.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/ccnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r101-d8_769x769_80k_cityscapes/ccnet_r101-d8_769x769_80k_cityscapes_20200618_011502-ad3cd481.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/ccnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r50-d8_512x512_80k_ade20k/ccnet_r50-d8_512x512_80k_ade20k_20200615_014848-aa37f61e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/ccnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r101-d8_512x512_80k_ade20k/ccnet_r101-d8_512x512_80k_ade20k_20200615_014848-1f4929a3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/ccnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r50-d8_512x512_160k_ade20k/ccnet_r50-d8_512x512_160k_ade20k_20200616_084435-7c97193b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/ccnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r101-d8_512x512_160k_ade20k/ccnet_r101-d8_512x512_160k_ade20k_20200616_000644-e849e007.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/ccnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r50-d8_512x512_20k_voc12aug/ccnet_r50-d8_512x512_20k_voc12aug_20200617_193212-fad81784.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/ccnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r101-d8_512x512_20k_voc12aug/ccnet_r101-d8_512x512_20k_voc12aug_20200617_193212-0007b61d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/ccnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r50-d8_512x512_40k_voc12aug/ccnet_r50-d8_512x512_40k_voc12aug_20200613_232127-c2a15f02.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/ccnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r101-d8_512x512_40k_voc12aug/ccnet_r101-d8_512x512_40k_voc12aug_20200613_232127-c30da577.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/cgnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/cgnet/cgnet_680x680_60k_cityscapes/cgnet_680x680_60k_cityscapes_20201101_110253-4c0b2f2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/cgnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/cgnet/cgnet_512x1024_60k_cityscapes/cgnet_512x1024_60k_cityscapes_20201101_110254-124ea03b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/danet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r50-d8_512x1024_40k_cityscapes/danet_r50-d8_512x1024_40k_cityscapes_20200605_191324-c0dbfa5f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/danet/metafile.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r101-d8_512x1024_40k_cityscapes/danet_r101-d8_512x1024_40k_cityscapes_20200605_200831-c57a7157.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/danet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r50-d8_769x769_40k_cityscapes/danet_r50-d8_769x769_40k_cityscapes_20200530_025703-76681c60.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/danet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r101-d8_769x769_40k_cityscapes/danet_r101-d8_769x769_40k_cityscapes_20200530_025717-dcb7fd4e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/danet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r50-d8_512x1024_80k_cityscapes/danet_r50-d8_512x1024_80k_cityscapes_20200607_133029-2bfa2293.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/danet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r101-d8_512x1024_80k_cityscapes/danet_r101-d8_512x1024_80k_cityscapes_20200607_132918-955e6350.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/danet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r50-d8_769x769_80k_cityscapes/danet_r50-d8_769x769_80k_cityscapes_20200607_132954-495689b4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/danet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r101-d8_769x769_80k_cityscapes/danet_r101-d8_769x769_80k_cityscapes_20200607_132918-f3a929e7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/danet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r50-d8_512x512_80k_ade20k/danet_r50-d8_512x512_80k_ade20k_20200615_015125-edb18e08.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/danet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r101-d8_512x512_80k_ade20k/danet_r101-d8_512x512_80k_ade20k_20200615_015126-d0357c73.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/danet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r50-d8_512x512_160k_ade20k/danet_r50-d8_512x512_160k_ade20k_20200616_082340-9cb35dcd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/danet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r101-d8_512x512_160k_ade20k/danet_r101-d8_512x512_160k_ade20k_20200616_082348-23bf12f9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/danet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r50-d8_512x512_20k_voc12aug/danet_r50-d8_512x512_20k_voc12aug_20200618_070026-9e9e3ab3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/danet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r101-d8_512x512_20k_voc12aug/danet_r101-d8_512x512_20k_voc12aug_20200618_070026-d48d23b2.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/danet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r50-d8_512x512_40k_voc12aug/danet_r50-d8_512x512_40k_voc12aug_20200613_235526-426e3a64.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/danet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r101-d8_512x512_40k_voc12aug/danet_r101-d8_512x512_40k_voc12aug_20200613_223031-788e232a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50-d8_512x1024_40k_cityscapes/deeplabv3_r50-d8_512x1024_40k_cityscapes_20200605_022449-acadc2f8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_512x1024_40k_cityscapes/deeplabv3_r101-d8_512x1024_40k_cityscapes_20200605_012241-7fd3f799.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50-d8_769x769_40k_cityscapes/deeplabv3_r50-d8_769x769_40k_cityscapes_20200606_113723-7eda553c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_769x769_40k_cityscapes/deeplabv3_r101-d8_769x769_40k_cityscapes_20200606_113809-c64f889f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r18-d8_512x1024_80k_cityscapes/deeplabv3_r18-d8_512x1024_80k_cityscapes_20201225_021506-23dffbe2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50-d8_512x1024_80k_cityscapes/deeplabv3_r50-d8_512x1024_80k_cityscapes_20200606_113404-b92cfdd4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_512x1024_80k_cityscapes/deeplabv3_r101-d8_512x1024_80k_cityscapes_20200606_113503-9e428899.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_fp16_512x1024_80k_cityscapes/deeplabv3_r101-d8_fp16_512x1024_80k_cityscapes_20200717_230920-774d9cec.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r18-d8_769x769_80k_cityscapes/deeplabv3_r18-d8_769x769_80k_cityscapes_20201225_021506-6452126a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50-d8_769x769_80k_cityscapes/deeplabv3_r50-d8_769x769_80k_cityscapes_20200606_221338-788d6228.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_769x769_80k_cityscapes/deeplabv3_r101-d8_769x769_80k_cityscapes_20200607_013353-60e95418.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d16-mg124_512x1024_80k_cityscapes/deeplabv3_r101-d16-mg124_512x1024_80k_cityscapes_20200908_005644-57bb8425.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r18b-d8_512x1024_80k_cityscapes/deeplabv3_r18b-d8_512x1024_80k_cityscapes_20201225_094144-46040cef.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50b-d8_512x1024_80k_cityscapes/deeplabv3_r50b-d8_512x1024_80k_cityscapes_20201225_155148-ec368954.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101b-d8_512x1024_80k_cityscapes/deeplabv3_r101b-d8_512x1024_80k_cityscapes_20201226_171821-8fd49503.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r18b-d8_769x769_80k_cityscapes/deeplabv3_r18b-d8_769x769_80k_cityscapes_20201225_094144-fdc985d9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50b-d8_769x769_80k_cityscapes/deeplabv3_r50b-d8_769x769_80k_cityscapes_20201225_155404-87fb0cf4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101b-d8_769x769_80k_cityscapes/deeplabv3_r101b-d8_769x769_80k_cityscapes_20201226_190843-9142ee57.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50-d8_512x512_80k_ade20k/deeplabv3_r50-d8_512x512_80k_ade20k_20200614_185028-0bb3f844.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_512x512_80k_ade20k/deeplabv3_r101-d8_512x512_80k_ade20k_20200615_021256-d89c7fa4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50-d8_512x512_160k_ade20k/deeplabv3_r50-d8_512x512_160k_ade20k_20200615_123227-5d0ee427.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_512x512_160k_ade20k/deeplabv3_r101-d8_512x512_160k_ade20k_20200615_105816-b1f72b3b.pth | 权重地址 | 
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50-d8_512x512_20k_voc12aug/deeplabv3_r50-d8_512x512_20k_voc12aug_20200617_010906-596905ef.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_512x512_20k_voc12aug/deeplabv3_r101-d8_512x512_20k_voc12aug_20200617_010932-8d13832f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50-d8_512x512_40k_voc12aug/deeplabv3_r50-d8_512x512_40k_voc12aug_20200613_161546-2ae96e7e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_512x512_40k_voc12aug/deeplabv3_r101-d8_512x512_40k_voc12aug_20200613_161432-0017d784.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_480x480_40k_pascal_context/deeplabv3_r101-d8_480x480_40k_pascal_context_20200911_204118-1aa27336.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_480x480_80k_pascal_context/deeplabv3_r101-d8_480x480_80k_pascal_context_20200911_170155-2a21fff3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_480x480_40k_pascal_context_59/deeplabv3_r101-d8_480x480_40k_pascal_context_59_20210416_110332-cb08ea46.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_480x480_80k_pascal_context_59/deeplabv3_r101-d8_480x480_80k_pascal_context_59_20210416_113002-26303993.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50-d8_512x512_4x4_20k_coco-stuff10k/deeplabv3_r50-d8_512x512_4x4_20k_coco-stuff10k_20210821_043025-b35f789d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_512x512_4x4_20k_coco-stuff10k/deeplabv3_r101-d8_512x512_4x4_20k_coco-stuff10k_20210821_043025-c49752cb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50-d8_512x512_4x4_40k_coco-stuff10k/deeplabv3_r50-d8_512x512_4x4_40k_coco-stuff10k_20210821_043305-dc76f3ff.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_512x512_4x4_40k_coco-stuff10k/deeplabv3_r101-d8_512x512_4x4_40k_coco-stuff10k_20210821_043305-636cb433.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50-d8_512x512_4x4_80k_coco-stuff164k/deeplabv3_r50-d8_512x512_4x4_80k_coco-stuff164k_20210709_163016-88675c24.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_512x512_4x4_80k_coco-stuff164k/deeplabv3_r101-d8_512x512_4x4_80k_coco-stuff164k_20210709_201252-13600dc2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50-d8_512x512_4x4_160k_coco-stuff164k/deeplabv3_r50-d8_512x512_4x4_160k_coco-stuff164k_20210709_163016-49f2812b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_512x512_4x4_160k_coco-stuff164k/deeplabv3_r101-d8_512x512_4x4_160k_coco-stuff164k_20210709_155402-f035acfd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50-d8_512x512_4x4_320k_coco-stuff164k/deeplabv3_r50-d8_512x512_4x4_320k_coco-stuff164k_20210709_155403-51b21115.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_512x512_4x4_320k_coco-stuff164k/deeplabv3_r101-d8_512x512_4x4_320k_coco-stuff164k_20210709_155402-3cbca14d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3plus/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50-d8_512x1024_40k_cityscapes/deeplabv3plus_r50-d8_512x1024_40k_cityscapes_20200605_094610-d222ffcd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3plus/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_512x1024_40k_cityscapes/deeplabv3plus_r101-d8_512x1024_40k_cityscapes_20200605_094614-3769eecf.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3plus/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50-d8_769x769_40k_cityscapes/deeplabv3plus_r50-d8_769x769_40k_cityscapes_20200606_114143-1dcb0e3c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3plus/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_769x769_40k_cityscapes/deeplabv3plus_r101-d8_769x769_40k_cityscapes_20200606_114304-ff414b9e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3plus/metafile.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r18-d8_512x1024_80k_cityscapes/deeplabv3plus_r18-d8_512x1024_80k_cityscapes_20201226_080942-cff257fe.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3plus/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50-d8_512x1024_80k_cityscapes/deeplabv3plus_r50-d8_512x1024_80k_cityscapes_20200606_114049-f9fb496d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3plus/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_512x1024_80k_cityscapes/deeplabv3plus_r101-d8_512x1024_80k_cityscapes_20200606_114143-068fcfe9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3plus/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r18-d8_769x769_80k_cityscapes/deeplabv3plus_r18-d8_769x769_80k_cityscapes_20201226_083346-f326e06a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3plus/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50-d8_769x769_80k_cityscapes/deeplabv3plus_r50-d8_769x769_80k_cityscapes_20200606_210233-0e9dfdc4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3plus/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_769x769_80k_cityscapes/deeplabv3plus_r101-d8_769x769_80k_cityscapes_20200607_000405-a7573d20.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3plus/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d16-mg124_512x1024_40k_cityscapes/deeplabv3plus_r101-d16-mg124_512x1024_40k_cityscapes_20200908_005644-cf9ce186.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3plus/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d16-mg124_512x1024_80k_cityscapes/deeplabv3plus_r101-d16-mg124_512x1024_80k_cityscapes_20200908_005644-ee6158e0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3plus/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r18b-d8_512x1024_80k_cityscapes/deeplabv3plus_r18b-d8_512x1024_80k_cityscapes_20201226_090828-e451abd9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3plus/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50b-d8_512x1024_80k_cityscapes/deeplabv3plus_r50b-d8_512x1024_80k_cityscapes_20201225_213645-a97e4e43.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3plus/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101b-d8_512x1024_80k_cityscapes/deeplabv3plus_r101b-d8_512x1024_80k_cityscapes_20201226_190843-9c3c93a4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3plus/metafile.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r18b-d8_769x769_80k_cityscapes/deeplabv3plus_r18b-d8_769x769_80k_cityscapes_20201226_151312-2c868aff.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3plus/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50b-d8_769x769_80k_cityscapes/deeplabv3plus_r50b-d8_769x769_80k_cityscapes_20201225_224655-8b596d1c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3plus/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101b-d8_769x769_80k_cityscapes/deeplabv3plus_r101b-d8_769x769_80k_cityscapes_20201226_205041-227cdf7c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3plus/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50-d8_512x512_80k_ade20k/deeplabv3plus_r50-d8_512x512_80k_ade20k_20200614_185028-bf1400d8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3plus/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_512x512_80k_ade20k/deeplabv3plus_r101-d8_512x512_80k_ade20k_20200615_014139-d5730af7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3plus/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50-d8_512x512_160k_ade20k/deeplabv3plus_r50-d8_512x512_160k_ade20k_20200615_124504-6135c7e0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3plus/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_512x512_160k_ade20k/deeplabv3plus_r101-d8_512x512_160k_ade20k_20200615_123232-38ed86bb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3plus/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50-d8_512x512_20k_voc12aug/deeplabv3plus_r50-d8_512x512_20k_voc12aug_20200617_102323-aad58ef1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3plus/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_512x512_20k_voc12aug/deeplabv3plus_r101-d8_512x512_20k_voc12aug_20200617_102345-c7ff3d56.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3plus/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50-d8_512x512_40k_voc12aug/deeplabv3plus_r50-d8_512x512_40k_voc12aug_20200613_161759-e1b43aa9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3plus/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_512x512_40k_voc12aug/deeplabv3plus_r101-d8_512x512_40k_voc12aug_20200613_205333-faf03387.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3plus/metafile.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_480x480_40k_pascal_context/deeplabv3plus_r101-d8_480x480_40k_pascal_context_20200911_165459-d3c8a29e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3plus/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_480x480_80k_pascal_context/deeplabv3plus_r101-d8_480x480_80k_pascal_context_20200911_155322-145d3ee8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3plus/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_480x480_40k_pascal_context_59/deeplabv3plus_r101-d8_480x480_40k_pascal_context_59_20210416_111233-ed937f15.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/deeplabv3plus/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_480x480_80k_pascal_context_59/deeplabv3plus_r101-d8_480x480_80k_pascal_context_59_20210416_111127-7ca0331d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/dmnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dmnet/dmnet_r50-d8_512x1024_40k_cityscapes/dmnet_r50-d8_512x1024_40k_cityscapes_20201214_115717-5e88fa33.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/dmnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dmnet/dmnet_r101-d8_512x1024_40k_cityscapes/dmnet_r101-d8_512x1024_40k_cityscapes_20201214_115716-abc9d111.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/dmnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dmnet/dmnet_r50-d8_769x769_40k_cityscapes/dmnet_r50-d8_769x769_40k_cityscapes_20201214_115717-2a2628d7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/dmnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dmnet/dmnet_r101-d8_769x769_40k_cityscapes/dmnet_r101-d8_769x769_40k_cityscapes_20201214_115718-b650de90.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/dmnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dmnet/dmnet_r50-d8_512x1024_80k_cityscapes/dmnet_r50-d8_512x1024_80k_cityscapes_20201214_115716-987f51e3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/dmnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dmnet/dmnet_r101-d8_512x1024_80k_cityscapes/dmnet_r101-d8_512x1024_80k_cityscapes_20201214_115705-b1ff208a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/dmnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dmnet/dmnet_r50-d8_769x769_80k_cityscapes/dmnet_r50-d8_769x769_80k_cityscapes_20201214_115718-7ea9fa12.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/dmnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dmnet/dmnet_r101-d8_769x769_80k_cityscapes/dmnet_r101-d8_769x769_80k_cityscapes_20201214_115716-a7fbc2ab.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/dmnet/metafile.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/dmnet/dmnet_r50-d8_512x512_80k_ade20k/dmnet_r50-d8_512x512_80k_ade20k_20201214_115705-a8626293.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/dmnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dmnet/dmnet_r101-d8_512x512_80k_ade20k/dmnet_r101-d8_512x512_80k_ade20k_20201214_115704-c656c3fb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/dmnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dmnet/dmnet_r50-d8_512x512_160k_ade20k/dmnet_r50-d8_512x512_160k_ade20k_20201214_115706-25fb92c2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/dmnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dmnet/dmnet_r101-d8_512x512_160k_ade20k/dmnet_r101-d8_512x512_160k_ade20k_20201214_115705-73f9a8d7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/dnlnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dnlnet/dnl_r50-d8_512x1024_40k_cityscapes/dnl_r50-d8_512x1024_40k_cityscapes_20200904_233629-53d4ea93.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/dnlnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dnlnet/dnl_r101-d8_512x1024_40k_cityscapes/dnl_r101-d8_512x1024_40k_cityscapes_20200904_233629-9928ffef.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/dnlnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dnlnet/dnl_r50-d8_769x769_40k_cityscapes/dnl_r50-d8_769x769_40k_cityscapes_20200820_232206-0f283785.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/dnlnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dnlnet/dnl_r101-d8_769x769_40k_cityscapes/dnl_r101-d8_769x769_40k_cityscapes_20200820_171256-76c596df.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/dnlnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dnlnet/dnl_r50-d8_512x1024_80k_cityscapes/dnl_r50-d8_512x1024_80k_cityscapes_20200904_233629-58b2f778.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/dnlnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dnlnet/dnl_r101-d8_512x1024_80k_cityscapes/dnl_r101-d8_512x1024_80k_cityscapes_20200904_233629-758e2dd4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/dnlnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dnlnet/dnl_r50-d8_769x769_80k_cityscapes/dnl_r50-d8_769x769_80k_cityscapes_20200820_011925-366bc4c7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/dnlnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dnlnet/dnl_r101-d8_769x769_80k_cityscapes/dnl_r101-d8_769x769_80k_cityscapes_20200821_051111-95ff84ab.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/dnlnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dnlnet/dnl_r50-d8_512x512_80k_ade20k/dnl_r50-d8_512x512_80k_ade20k_20200826_183354-1cf6e0c1.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/dnlnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dnlnet/dnl_r101-d8_512x512_80k_ade20k/dnl_r101-d8_512x512_80k_ade20k_20200826_183354-d820d6ea.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/dnlnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dnlnet/dnl_r50-d8_512x512_160k_ade20k/dnl_r50-d8_512x512_160k_ade20k_20200826_183350-37837798.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/dnlnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dnlnet/dnl_r101-d8_512x512_160k_ade20k/dnl_r101-d8_512x512_160k_ade20k_20200826_183350-ed522c61.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/emanet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/emanet/emanet_r50-d8_512x1024_80k_cityscapes/emanet_r50-d8_512x1024_80k_cityscapes_20200901_100301-c43fcef1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/emanet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/emanet/emanet_r101-d8_512x1024_80k_cityscapes/emanet_r101-d8_512x1024_80k_cityscapes_20200901_100301-2d970745.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/emanet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/emanet/emanet_r50-d8_769x769_80k_cityscapes/emanet_r50-d8_769x769_80k_cityscapes_20200901_100301-16f8de52.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/emanet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/emanet/emanet_r101-d8_769x769_80k_cityscapes/emanet_r101-d8_769x769_80k_cityscapes_20200901_100301-47a324ce.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/encnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/encnet/encnet_r50-d8_512x1024_40k_cityscapes/encnet_r50-d8_512x1024_40k_cityscapes_20200621_220958-68638a47.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/encnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/encnet/encnet_r101-d8_512x1024_40k_cityscapes/encnet_r101-d8_512x1024_40k_cityscapes_20200621_220933-35e0a3e8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/encnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/encnet/encnet_r50-d8_769x769_40k_cityscapes/encnet_r50-d8_769x769_40k_cityscapes_20200621_220958-3bcd2884.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/encnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/encnet/encnet_r101-d8_769x769_40k_cityscapes/encnet_r101-d8_769x769_40k_cityscapes_20200621_220933-2fafed55.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/encnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/encnet/encnet_r50-d8_512x1024_80k_cityscapes/encnet_r50-d8_512x1024_80k_cityscapes_20200622_003554-fc5c5624.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/encnet/metafile.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/encnet/encnet_r101-d8_512x1024_80k_cityscapes/encnet_r101-d8_512x1024_80k_cityscapes_20200622_003555-1de64bec.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/encnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/encnet/encnet_r50-d8_769x769_80k_cityscapes/encnet_r50-d8_769x769_80k_cityscapes_20200622_003554-55096dcb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/encnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/encnet/encnet_r101-d8_769x769_80k_cityscapes/encnet_r101-d8_769x769_80k_cityscapes_20200622_003555-470ef79d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/encnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/encnet/encnet_r50-d8_512x512_80k_ade20k/encnet_r50-d8_512x512_80k_ade20k_20200622_042412-44b46b04.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/encnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/encnet/encnet_r101-d8_512x512_80k_ade20k/encnet_r101-d8_512x512_80k_ade20k_20200622_101128-dd35e237.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/encnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/encnet/encnet_r50-d8_512x512_160k_ade20k/encnet_r50-d8_512x512_160k_ade20k_20200622_101059-b2db95e0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/encnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/encnet/encnet_r101-d8_512x512_160k_ade20k/encnet_r101-d8_512x512_160k_ade20k_20200622_073348-7989641f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/fastscnn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fast_scnn/fast_scnn_4x8_80k_lr0.12_cityscapes-f5096c79.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r50-d8_512x1024_40k_cityscapes/fcn_r50-d8_512x1024_40k_cityscapes_20200604_192608-efe53f0d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_512x1024_40k_cityscapes/fcn_r101-d8_512x1024_40k_cityscapes_20200604_181852-a883d3a1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r50-d8_769x769_40k_cityscapes/fcn_r50-d8_769x769_40k_cityscapes_20200606_113104-977b5d02.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_769x769_40k_cityscapes/fcn_r101-d8_769x769_40k_cityscapes_20200606_113208-7d4ab69c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r18-d8_512x1024_80k_cityscapes/fcn_r18-d8_512x1024_80k_cityscapes_20201225_021327-6c50f8b4.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r50-d8_512x1024_80k_cityscapes/fcn_r50-d8_512x1024_80k_cityscapes_20200606_113019-03aa804d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_512x1024_80k_cityscapes/fcn_r101-d8_512x1024_80k_cityscapes_20200606_113038-3fb937eb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r18-d8_769x769_80k_cityscapes/fcn_r18-d8_769x769_80k_cityscapes_20201225_021451-9739d1b8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r50-d8_769x769_80k_cityscapes/fcn_r50-d8_769x769_80k_cityscapes_20200606_195749-f5caeabc.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_769x769_80k_cityscapes/fcn_r101-d8_769x769_80k_cityscapes_20200606_214354-45cbac68.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r18b-d8_512x1024_80k_cityscapes/fcn_r18b-d8_512x1024_80k_cityscapes_20201225_230143-92c0f445.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r50b-d8_512x1024_80k_cityscapes/fcn_r50b-d8_512x1024_80k_cityscapes_20201225_094221-82957416.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101b-d8_512x1024_80k_cityscapes/fcn_r101b-d8_512x1024_80k_cityscapes_20201226_160213-4543858f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r18b-d8_769x769_80k_cityscapes/fcn_r18b-d8_769x769_80k_cityscapes_20201226_004430-32d504e5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r50b-d8_769x769_80k_cityscapes/fcn_r50b-d8_769x769_80k_cityscapes_20201225_094223-94552d38.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101b-d8_769x769_80k_cityscapes/fcn_r101b-d8_769x769_80k_cityscapes_20201226_170012-82be37e2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_d6_r50-d16_512x1024_40k_cityscapes/fcn_d6_r50-d16_512x1024_40k_cityscapes-98d5d1bc.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_d6_r50-d16_512x1024_80k_cityscapes/fcn_d6_r50-d16_512x1024_40k_cityscapes-98d5d1bc.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_d6_r50-d16_769x769_40k_cityscapes/fcn_d6_r50-d16_769x769_40k_cityscapes-1aab18ed.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_d6_r50-d16_769x769_80k_cityscapes/fcn_d6_r50-d16_769x769_80k_cityscapes-109d88eb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_d6_r101-d16_512x1024_40k_cityscapes/fcn_d6_r101-d16_512x1024_40k_cityscapes-9cf2b450.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_d6_r101-d16_512x1024_80k_cityscapes/fcn_d6_r101-d16_512x1024_80k_cityscapes-cb336445.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_d6_r101-d16_769x769_40k_cityscapes/fcn_d6_r101-d16_769x769_40k_cityscapes-60b114e9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_d6_r101-d16_769x769_80k_cityscapes/fcn_d6_r101-d16_769x769_80k_cityscapes-e33adc4f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r50-d8_512x512_80k_ade20k/fcn_r50-d8_512x512_80k_ade20k_20200614_144016-f8ac5082.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_512x512_80k_ade20k/fcn_r101-d8_512x512_80k_ade20k_20200615_014143-bc1809f7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r50-d8_512x512_160k_ade20k/fcn_r50-d8_512x512_160k_ade20k_20200615_100713-4edbc3b4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_512x512_160k_ade20k/fcn_r101-d8_512x512_160k_ade20k_20200615_105816-fd192bd5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r50-d8_512x512_20k_voc12aug/fcn_r50-d8_512x512_20k_voc12aug_20200617_010715-52dc5306.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_512x512_20k_voc12aug/fcn_r101-d8_512x512_20k_voc12aug_20200617_010842-0bb4e798.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r50-d8_512x512_40k_voc12aug/fcn_r50-d8_512x512_40k_voc12aug_20200613_161222-5e2dbf40.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_512x512_40k_voc12aug/fcn_r101-d8_512x512_40k_voc12aug_20200613_161240-4c8bcefd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_480x480_40k_pascal_context/fcn_r101-d8_480x480_40k_pascal_context-20210421_154757-b5e97937.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_480x480_80k_pascal_context/fcn_r101-d8_480x480_80k_pascal_context-20210421_163310-4711813f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_480x480_40k_pascal_context_59/fcn_r101-d8_480x480_40k_pascal_context_59_20210415_230724-8cf83682.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/fcn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_480x480_80k_pascal_context_59/fcn_r101-d8_480x480_80k_pascal_context_59_20210416_110804-9a6f2c94.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/fp16/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fp16/fcn_r101-d8_512x1024_80k_fp16_cityscapes/fcn_r101-d8_512x1024_80k_fp16_cityscapes-50245227.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/fp16/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fp16/pspnet_r101-d8_512x1024_80k_fp16_cityscapes/pspnet_r101-d8_512x1024_80k_fp16_cityscapes-ade37931.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/fp16/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fp16/deeplabv3_r101-d8_512x1024_80k_fp16_cityscapes/deeplabv3_r101-d8_512x1024_80k_fp16_cityscapes-bc86dc84.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/fp16/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fp16/deeplabv3plus_r101-d8_512x1024_80k_fp16_cityscapes/deeplabv3plus_r101-d8_512x1024_80k_fp16_cityscapes-cc58bc8d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18s_512x1024_40k_cityscapes/fcn_hr18s_512x1024_40k_cityscapes_20200601_014216-93db27d0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18_512x1024_40k_cityscapes/fcn_hr18_512x1024_40k_cityscapes_20200601_014216-f196fb4e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_512x1024_40k_cityscapes/fcn_hr48_512x1024_40k_cityscapes_20200601_014240-a989b146.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/hrnet/metafile.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18s_512x1024_80k_cityscapes/fcn_hr18s_512x1024_80k_cityscapes_20200601_202700-1462b75d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18_512x1024_80k_cityscapes/fcn_hr18_512x1024_80k_cityscapes_20200601_223255-4e7b345e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_512x1024_80k_cityscapes/fcn_hr48_512x1024_80k_cityscapes_20200601_202606-58ea95d6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18s_512x1024_160k_cityscapes/fcn_hr18s_512x1024_160k_cityscapes_20200602_190901-4a0797ea.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18_512x1024_160k_cityscapes/fcn_hr18_512x1024_160k_cityscapes_20200602_190822-221e4a4f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_512x1024_160k_cityscapes/fcn_hr48_512x1024_160k_cityscapes_20200602_190946-59b7973e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18s_512x512_80k_ade20k/fcn_hr18s_512x512_80k_ade20k_20200614_144345-77fc814a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18_512x512_80k_ade20k/fcn_hr18_512x512_80k_ade20k_20200614_185145-66f20cb7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_512x512_80k_ade20k/fcn_hr48_512x512_80k_ade20k_20200614_193946-7ba5258d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18s_512x512_160k_ade20k/fcn_hr18s_512x512_160k_ade20k_20200614_214413-870f65ac.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18_512x512_160k_ade20k/fcn_hr18_512x512_160k_ade20k_20200614_214426-ca961836.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_512x512_160k_ade20k/fcn_hr48_512x512_160k_ade20k_20200614_214407-a52fc02c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18s_512x512_20k_voc12aug/fcn_hr18s_512x512_20k_voc12aug_20200617_224503-56e36088.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/hrnet/metafile.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18_512x512_20k_voc12aug/fcn_hr18_512x512_20k_voc12aug_20200617_224503-488d45f7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_512x512_20k_voc12aug/fcn_hr48_512x512_20k_voc12aug_20200617_224419-89de05cd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18s_512x512_40k_voc12aug/fcn_hr18s_512x512_40k_voc12aug_20200614_000648-4f8d6e7f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18_512x512_40k_voc12aug/fcn_hr18_512x512_40k_voc12aug_20200613_224401-1b4b76cd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_512x512_40k_voc12aug/fcn_hr48_512x512_40k_voc12aug_20200613_222111-1b0f18bc.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_480x480_40k_pascal_context/fcn_hr48_480x480_40k_pascal_context_20200911_164852-667d00b0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_480x480_80k_pascal_context/fcn_hr48_480x480_80k_pascal_context_20200911_155322-847a6711.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_480x480_40k_pascal_context_59/fcn_hr48_480x480_40k_pascal_context_59_20210410_122738-b808b8b2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_480x480_80k_pascal_context_59/fcn_hr48_480x480_80k_pascal_context_59_20210411_003240-3ae7081e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/mobilenet_v2/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/mobilenet_v2/fcn_m-v2-d8_512x1024_80k_cityscapes/fcn_m-v2-d8_512x1024_80k_cityscapes_20200825_124817-d24c28c1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/mobilenet_v2/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/mobilenet_v2/pspnet_m-v2-d8_512x1024_80k_cityscapes/pspnet_m-v2-d8_512x1024_80k_cityscapes_20200825_124817-19e81d51.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/mobilenet_v2/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/mobilenet_v2/deeplabv3_m-v2-d8_512x1024_80k_cityscapes/deeplabv3_m-v2-d8_512x1024_80k_cityscapes_20200825_124836-bef03590.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/mobilenet_v2/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/mobilenet_v2/deeplabv3plus_m-v2-d8_512x1024_80k_cityscapes/deeplabv3plus_m-v2-d8_512x1024_80k_cityscapes_20200825_124836-d256dd4b.pth 
| 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/mobilenet_v2/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/mobilenet_v2/fcn_m-v2-d8_512x512_160k_ade20k/fcn_m-v2-d8_512x512_160k_ade20k_20200825_214953-c40e1095.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/mobilenet_v2/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/mobilenet_v2/pspnet_m-v2-d8_512x512_160k_ade20k/pspnet_m-v2-d8_512x512_160k_ade20k_20200825_214953-f5942f7a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/mobilenet_v2/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/mobilenet_v2/deeplabv3_m-v2-d8_512x512_160k_ade20k/deeplabv3_m-v2-d8_512x512_160k_ade20k_20200825_223255-63986343.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/mobilenet_v2/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/mobilenet_v2/deeplabv3plus_m-v2-d8_512x512_160k_ade20k/deeplabv3plus_m-v2-d8_512x512_160k_ade20k_20200825_223255-465a01d4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/mobilenet_v3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/mobilenet_v3/lraspp_m-v3-d8_512x1024_320k_cityscapes/lraspp_m-v3-d8_512x1024_320k_cityscapes_20201224_220337-cfe8fb07.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/mobilenet_v3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/mobilenet_v3/lraspp_m-v3-d8_scratch_512x1024_320k_cityscapes/lraspp_m-v3-d8_scratch_512x1024_320k_cityscapes_20201224_220337-9f29cd72.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/mobilenet_v3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/mobilenet_v3/lraspp_m-v3s-d8_512x1024_320k_cityscapes/lraspp_m-v3s-d8_512x1024_320k_cityscapes_20201224_223935-61565b34.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/mobilenet_v3/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/mobilenet_v3/lraspp_m-v3s-d8_scratch_512x1024_320k_cityscapes/lraspp_m-v3s-d8_scratch_512x1024_320k_cityscapes_20201224_223935-03daeabb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/nonlocal_net/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r50-d8_512x1024_40k_cityscapes/nonlocal_r50-d8_512x1024_40k_cityscapes_20200605_210748-c75e81e3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/nonlocal_net/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r101-d8_512x1024_40k_cityscapes/nonlocal_r101-d8_512x1024_40k_cityscapes_20200605_210748-d63729fa.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/nonlocal_net/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r50-d8_769x769_40k_cityscapes/nonlocal_r50-d8_769x769_40k_cityscapes_20200530_045243-82ef6749.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/nonlocal_net/metafile.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r101-d8_769x769_40k_cityscapes/nonlocal_r101-d8_769x769_40k_cityscapes_20200530_045348-8fe9a9dc.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/nonlocal_net/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r50-d8_512x1024_80k_cityscapes/nonlocal_r50-d8_512x1024_80k_cityscapes_20200607_193518-d6839fae.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/nonlocal_net/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r101-d8_512x1024_80k_cityscapes/nonlocal_r101-d8_512x1024_80k_cityscapes_20200607_183411-32700183.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/nonlocal_net/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r50-d8_769x769_80k_cityscapes/nonlocal_r50-d8_769x769_80k_cityscapes_20200607_193506-1f9792f6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/nonlocal_net/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r101-d8_769x769_80k_cityscapes/nonlocal_r101-d8_769x769_80k_cityscapes_20200607_183428-0e1fa4f9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/nonlocal_net/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r50-d8_512x512_80k_ade20k/nonlocal_r50-d8_512x512_80k_ade20k_20200615_015801-5ae0aa33.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/nonlocal_net/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r101-d8_512x512_80k_ade20k/nonlocal_r101-d8_512x512_80k_ade20k_20200615_015758-24105919.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/nonlocal_net/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r50-d8_512x512_160k_ade20k/nonlocal_r50-d8_512x512_160k_ade20k_20200616_005410-baef45e3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/nonlocal_net/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r101-d8_512x512_160k_ade20k/nonlocal_r101-d8_512x512_160k_ade20k_20200616_003422-affd0f8d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/nonlocal_net/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r50-d8_512x512_20k_voc12aug/nonlocal_r50-d8_512x512_20k_voc12aug_20200617_222613-07f2a57c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/nonlocal_net/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r101-d8_512x512_20k_voc12aug/nonlocal_r101-d8_512x512_20k_voc12aug_20200617_222615-948c68ab.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/nonlocal_net/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r50-d8_512x512_40k_voc12aug/nonlocal_r50-d8_512x512_40k_voc12aug_20200614_000028-0139d4a9.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/nonlocal_net/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r101-d8_512x512_40k_voc12aug/nonlocal_r101-d8_512x512_40k_voc12aug_20200614_000028-7e5ff470.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/ocrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18s_512x1024_40k_cityscapes/ocrnet_hr18s_512x1024_40k_cityscapes_20200601_033304-fa2436c2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/ocrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18_512x1024_40k_cityscapes/ocrnet_hr18_512x1024_40k_cityscapes_20200601_033320-401c5bdd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/ocrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr48_512x1024_40k_cityscapes/ocrnet_hr48_512x1024_40k_cityscapes_20200601_033336-55b32491.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/ocrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18s_512x1024_80k_cityscapes/ocrnet_hr18s_512x1024_80k_cityscapes_20200601_222735-55979e63.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/ocrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18_512x1024_80k_cityscapes/ocrnet_hr18_512x1024_80k_cityscapes_20200614_230521-c2e1dd4a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/ocrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr48_512x1024_80k_cityscapes/ocrnet_hr48_512x1024_80k_cityscapes_20200601_222752-9076bcdf.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/ocrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18s_512x1024_160k_cityscapes/ocrnet_hr18s_512x1024_160k_cityscapes_20200602_191005-f4a7af28.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/ocrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18_512x1024_160k_cityscapes/ocrnet_hr18_512x1024_160k_cityscapes_20200602_191001-b9172d0c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/ocrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr48_512x1024_160k_cityscapes/ocrnet_hr48_512x1024_160k_cityscapes_20200602_191037-dfbf1b0c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/ocrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_r101-d8_512x1024_40k_b8_cityscapes/ocrnet_r101-d8_512x1024_40k_b8_cityscapes-02ac0f13.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/ocrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_r101-d8_512x1024_40k_b16_cityscapes/ocrnet_r101-d8_512x1024_40k_b16_cityscapes-db500f80.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/ocrnet/metafile.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18s_512x512_80k_ade20k/ocrnet_hr18s_512x512_80k_ade20k_20200615_055600-e80b62af.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/ocrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18_512x512_80k_ade20k/ocrnet_hr18_512x512_80k_ade20k_20200615_053157-d173d83b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/ocrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr48_512x512_80k_ade20k/ocrnet_hr48_512x512_80k_ade20k_20200615_021518-d168c2d1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/ocrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18s_512x512_160k_ade20k/ocrnet_hr18s_512x512_160k_ade20k_20200615_184505-8e913058.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/ocrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18_512x512_160k_ade20k/ocrnet_hr18_512x512_160k_ade20k_20200615_200940-d8fcd9d1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/ocrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr48_512x512_160k_ade20k/ocrnet_hr48_512x512_160k_ade20k_20200615_184705-a073726d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/ocrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18s_512x512_20k_voc12aug/ocrnet_hr18s_512x512_20k_voc12aug_20200617_233913-02b04fcb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/ocrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18_512x512_20k_voc12aug/ocrnet_hr18_512x512_20k_voc12aug_20200617_233932-8954cbb7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/ocrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr48_512x512_20k_voc12aug/ocrnet_hr48_512x512_20k_voc12aug_20200617_233932-9e82080a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/ocrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18s_512x512_40k_voc12aug/ocrnet_hr18s_512x512_40k_voc12aug_20200614_002025-42b587ac.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/ocrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18_512x512_40k_voc12aug/ocrnet_hr18_512x512_40k_voc12aug_20200614_015958-714302be.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/ocrnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr48_512x512_40k_voc12aug/ocrnet_hr48_512x512_40k_voc12aug_20200614_015958-255bc5ce.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/point_rend/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/point_rend/pointrend_r50_512x1024_80k_cityscapes/pointrend_r50_512x1024_80k_cityscapes_20200711_015821-bb1ff523.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/point_rend/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/point_rend/pointrend_r101_512x1024_80k_cityscapes/pointrend_r101_512x1024_80k_cityscapes_20200711_170850-d0ca84be.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/point_rend/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/point_rend/pointrend_r50_512x512_160k_ade20k/pointrend_r50_512x512_160k_ade20k_20200807_232644-ac3febf2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/point_rend/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/point_rend/pointrend_r101_512x512_160k_ade20k/pointrend_r101_512x512_160k_ade20k_20200808_030852-8834902a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/psanet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r50-d8_512x1024_40k_cityscapes/psanet_r50-d8_512x1024_40k_cityscapes_20200606_103117-99fac37c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/psanet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r101-d8_512x1024_40k_cityscapes/psanet_r101-d8_512x1024_40k_cityscapes_20200606_001418-27b9cfa7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/psanet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r50-d8_769x769_40k_cityscapes/psanet_r50-d8_769x769_40k_cityscapes_20200530_033717-d5365506.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/psanet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r101-d8_769x769_40k_cityscapes/psanet_r101-d8_769x769_40k_cityscapes_20200530_035107-997da1e6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/psanet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r50-d8_512x1024_80k_cityscapes/psanet_r50-d8_512x1024_80k_cityscapes_20200606_161842-ab60a24f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/psanet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r101-d8_512x1024_80k_cityscapes/psanet_r101-d8_512x1024_80k_cityscapes_20200606_161823-0f73a169.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/psanet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r50-d8_769x769_80k_cityscapes/psanet_r50-d8_769x769_80k_cityscapes_20200606_225134-fe42f49e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/psanet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r101-d8_769x769_80k_cityscapes/psanet_r101-d8_769x769_80k_cityscapes_20200606_214550-7665827b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/psanet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r50-d8_512x512_80k_ade20k/psanet_r50-d8_512x512_80k_ade20k_20200614_144141-835e4b97.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/psanet/metafile.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r101-d8_512x512_80k_ade20k/psanet_r101-d8_512x512_80k_ade20k_20200614_185117-1fab60d4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/psanet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r50-d8_512x512_160k_ade20k/psanet_r50-d8_512x512_160k_ade20k_20200615_161258-148077dd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/psanet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r101-d8_512x512_160k_ade20k/psanet_r101-d8_512x512_160k_ade20k_20200615_161537-dbfa564c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/psanet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r50-d8_512x512_20k_voc12aug/psanet_r50-d8_512x512_20k_voc12aug_20200617_102413-2f1bbaa1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/psanet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r101-d8_512x512_20k_voc12aug/psanet_r101-d8_512x512_20k_voc12aug_20200617_110624-946fef11.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/psanet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r50-d8_512x512_40k_voc12aug/psanet_r50-d8_512x512_40k_voc12aug_20200613_161946-f596afb5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/psanet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r101-d8_512x512_40k_voc12aug/psanet_r101-d8_512x512_40k_voc12aug_20200613_161946-1f560f9e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/pspnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_512x1024_40k_cityscapes/pspnet_r50-d8_512x1024_40k_cityscapes_20200605_003338-2966598c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/pspnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_512x1024_40k_cityscapes/pspnet_r101-d8_512x1024_40k_cityscapes_20200604_232751-467e7cf4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/pspnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_769x769_40k_cityscapes/pspnet_r50-d8_769x769_40k_cityscapes_20200606_112725-86638686.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/pspnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_769x769_40k_cityscapes/pspnet_r101-d8_769x769_40k_cityscapes_20200606_112753-61c6f5be.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/pspnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r18-d8_512x1024_80k_cityscapes/pspnet_r18-d8_512x1024_80k_cityscapes_20201225_021458-09ffa746.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/pspnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_512x1024_80k_cityscapes/pspnet_r50-d8_512x1024_80k_cityscapes_20200606_112131-2376f12b.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/pspnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_512x1024_80k_cityscapes/pspnet_r101-d8_512x1024_80k_cityscapes_20200606_112211-e1e1100f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/pspnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r18-d8_769x769_80k_cityscapes/pspnet_r18-d8_769x769_80k_cityscapes_20201225_021458-3deefc62.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/pspnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_769x769_80k_cityscapes/pspnet_r50-d8_769x769_80k_cityscapes_20200606_210121-5ccf03dd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/pspnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_769x769_80k_cityscapes/pspnet_r101-d8_769x769_80k_cityscapes_20200606_225055-dba412fa.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/pspnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r18b-d8_512x1024_80k_cityscapes/pspnet_r18b-d8_512x1024_80k_cityscapes_20201226_063116-26928a60.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/pspnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50b-d8_512x1024_80k_cityscapes/pspnet_r50b-d8_512x1024_80k_cityscapes_20201225_094315-6344287a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/pspnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101b-d8_512x1024_80k_cityscapes/pspnet_r101b-d8_512x1024_80k_cityscapes_20201226_170012-3a4d38ab.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/pspnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r18b-d8_769x769_80k_cityscapes/pspnet_r18b-d8_769x769_80k_cityscapes_20201226_080942-bf98d186.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/pspnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50b-d8_769x769_80k_cityscapes/pspnet_r50b-d8_769x769_80k_cityscapes_20201225_094316-4c643cf6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/pspnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101b-d8_769x769_80k_cityscapes/pspnet_r101b-d8_769x769_80k_cityscapes_20201226_171823-f0e7c293.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/pspnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_512x512_80k_ade20k/pspnet_r50-d8_512x512_80k_ade20k_20200615_014128-15a8b914.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/pspnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_512x512_80k_ade20k/pspnet_r101-d8_512x512_80k_ade20k_20200614_031423-b6e782f0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/pspnet/metafile.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_512x512_160k_ade20k/pspnet_r50-d8_512x512_160k_ade20k_20200615_184358-1890b0bd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/pspnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_512x512_160k_ade20k/pspnet_r101-d8_512x512_160k_ade20k_20200615_100650-967c316f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/pspnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_512x512_20k_voc12aug/pspnet_r50-d8_512x512_20k_voc12aug_20200617_101958-ed5dfbd9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/pspnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_512x512_20k_voc12aug/pspnet_r101-d8_512x512_20k_voc12aug_20200617_102003-4aef3c9a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/pspnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_512x512_40k_voc12aug/pspnet_r50-d8_512x512_40k_voc12aug_20200613_161222-ae9c1b8c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/pspnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_512x512_40k_voc12aug/pspnet_r101-d8_512x512_40k_voc12aug_20200613_161222-bc933b18.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/pspnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_480x480_40k_pascal_context/pspnet_r101-d8_480x480_40k_pascal_context_20200911_211210-bf0f5d7c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/pspnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_480x480_80k_pascal_context/pspnet_r101-d8_480x480_80k_pascal_context_20200911_190530-c86d6233.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/pspnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_480x480_40k_pascal_context_59/pspnet_r101-d8_480x480_40k_pascal_context_59_20210416_114524-86d44cd4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/pspnet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_480x480_80k_pascal_context_59/pspnet_r101-d8_480x480_80k_pascal_context_59_20210416_114418-fa6caaa2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/resnest/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/resnest/fcn_s101-d8_512x512_160k_ade20k/fcn_s101-d8_512x512_160k_ade20k_20200807_145416-d3160329.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/resnest/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/resnest/fcn_s101-d8_512x1024_80k_cityscapes/fcn_s101-d8_512x1024_80k_cityscapes_20200807_140631-f8d155b3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/resnest/metafile.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/resnest/pspnet_s101-d8_512x1024_80k_cityscapes/pspnet_s101-d8_512x1024_80k_cityscapes_20200807_140631-c75f3b99.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/resnest/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/resnest/deeplabv3_s101-d8_512x1024_80k_cityscapes/deeplabv3_s101-d8_512x1024_80k_cityscapes_20200807_144429-b73c4270.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/resnest/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/resnest/deeplabv3plus_s101-d8_512x1024_80k_cityscapes/deeplabv3plus_s101-d8_512x1024_80k_cityscapes_20200807_144429-1239eb43.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/resnest/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/resnest/pspnet_s101-d8_512x512_160k_ade20k/pspnet_s101-d8_512x512_160k_ade20k_20200807_145416-a6daa92a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/resnest/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/resnest/deeplabv3_s101-d8_512x512_160k_ade20k/deeplabv3_s101-d8_512x512_160k_ade20k_20200807_144503-17ecabe5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/resnest/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/resnest/deeplabv3plus_s101-d8_512x512_160k_ade20k/deeplabv3plus_s101-d8_512x512_160k_ade20k_20200807_144503-27b26226.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/sem_fpn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/sem_fpn/fpn_r50_512x1024_80k_cityscapes/fpn_r50_512x1024_80k_cityscapes_20200717_021437-94018a0d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/sem_fpn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/sem_fpn/fpn_r101_512x1024_80k_cityscapes/fpn_r101_512x1024_80k_cityscapes_20200717_012416-c5800d4c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/sem_fpn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/sem_fpn/fpn_r50_512x512_160k_ade20k/fpn_r50_512x512_160k_ade20k_20200718_131734-5b5a6ab9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/sem_fpn/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/sem_fpn/fpn_r101_512x512_160k_ade20k/fpn_r101_512x512_160k_ade20k_20200718_131734-306b5004.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/unet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/fcn_unet_s5-d16_64x64_40k_drive/fcn_unet_s5-d16_64x64_40k_drive_20201223_191051-26cee593.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/unet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/pspnet_unet_s5-d16_64x64_40k_drive/pspnet_unet_s5-d16_64x64_40k_drive_20201227_181818-aac73387.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/unet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/deeplabv3_unet_s5-d16_64x64_40k_drive/deeplabv3_unet_s5-d16_64x64_40k_drive_20201226_094047-0671ff20.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/unet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/fcn_unet_s5-d16_128x128_40k_stare/fcn_unet_s5-d16_128x128_40k_stare_20201223_191051-6ea7cfda.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/unet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/pspnet_unet_s5-d16_128x128_40k_stare/pspnet_unet_s5-d16_128x128_40k_stare_20201227_181818-3c2923c4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/unet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/deeplabv3_unet_s5-d16_128x128_40k_stare/deeplabv3_unet_s5-d16_128x128_40k_stare_20201226_094047-93dcb93c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/unet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/fcn_unet_s5-d16_128x128_40k_chase_db1/fcn_unet_s5-d16_128x128_40k_chase_db1_20201223_191051-95852f45.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/unet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/pspnet_unet_s5-d16_128x128_40k_chase_db1/pspnet_unet_s5-d16_128x128_40k_chase_db1_20201227_181818-68d4e609.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/unet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/deeplabv3_unet_s5-d16_128x128_40k_chase_db1/deeplabv3_unet_s5-d16_128x128_40k_chase_db1_20201226_094047-4c5aefa3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/unet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/fcn_unet_s5-d16_256x256_40k_hrf/fcn_unet_s5-d16_256x256_40k_hrf_20201223_173724-df3ec8c4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/unet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/pspnet_unet_s5-d16_256x256_40k_hrf/pspnet_unet_s5-d16_256x256_40k_hrf_20201227_181818-fdb7e29b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/unet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/deeplabv3_unet_s5-d16_256x256_40k_hrf/deeplabv3_unet_s5-d16_256x256_40k_hrf_20201226_094047-3a1fdf85.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/upernet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r50_512x1024_40k_cityscapes/upernet_r50_512x1024_40k_cityscapes_20200605_094827-aa54cb54.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/upernet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r50_512x1024_40k_cityscapes/upernet_r50_512x1024_40k_cityscapes_20200605_094827-aa54cb54.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/upernet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r101_512x1024_40k_cityscapes/upernet_r101_512x1024_40k_cityscapes_20200605_094933-ebce3b10.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/upernet/metafile.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r50_769x769_40k_cityscapes/upernet_r50_769x769_40k_cityscapes_20200530_033048-92d21539.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/upernet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r101_769x769_40k_cityscapes/upernet_r101_769x769_40k_cityscapes_20200530_040819-83c95d01.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/upernet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r50_512x1024_80k_cityscapes/upernet_r50_512x1024_80k_cityscapes_20200607_052207-848beca8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/upernet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r101_512x1024_80k_cityscapes/upernet_r101_512x1024_80k_cityscapes_20200607_002403-f05f2345.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/upernet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r50_769x769_80k_cityscapes/upernet_r50_769x769_80k_cityscapes_20200607_005107-82ae7d15.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/upernet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r101_769x769_80k_cityscapes/upernet_r101_769x769_80k_cityscapes_20200607_001014-082fc334.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/upernet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r50_512x512_80k_ade20k/upernet_r50_512x512_80k_ade20k_20200614_144127-ecc8377b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/upernet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r101_512x512_80k_ade20k/upernet_r101_512x512_80k_ade20k_20200614_185117-32e4db94.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/upernet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r50_512x512_160k_ade20k/upernet_r50_512x512_160k_ade20k_20200615_184328-8534de8d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/upernet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r101_512x512_160k_ade20k/upernet_r101_512x512_160k_ade20k_20200615_161951-91b32684.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/upernet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r50_512x512_20k_voc12aug/upernet_r50_512x512_20k_voc12aug_20200617_165330-5b5890a7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/upernet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r101_512x512_20k_voc12aug/upernet_r101_512x512_20k_voc12aug_20200617_165629-f14e7f27.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/upernet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r50_512x512_40k_voc12aug/upernet_r50_512x512_40k_voc12aug_20200613_162257-ca9bcc6b.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/upernet/metafile.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r101_512x512_40k_voc12aug/upernet_r101_512x512_40k_voc12aug_20200613_163549-e26476ac.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/vit/upernet_deit-b16_512x512_160k_ade20k.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_224-b5f2ef4d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/vit/upernet_deit-b16_512x512_80k_ade20k.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_224-b5f2ef4d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/vit/upernet_deit-b16_ln_mln_512x512_160k_ade20k.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_224-b5f2ef4d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/vit/upernet_deit-b16_mln_512x512_160k_ade20k.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_224-b5f2ef4d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/vit/upernet_deit-s16_512x512_160k_ade20k.py | https://dl.fbaipublicfiles.com/deit/deit_small_patch16_224-cd65a155.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/vit/upernet_deit-s16_512x512_80k_ade20k.py | https://dl.fbaipublicfiles.com/deit/deit_small_patch16_224-cd65a155.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/vit/upernet_deit-s16_ln_mln_512x512_160k_ade20k.py | https://dl.fbaipublicfiles.com/deit/deit_small_patch16_224-cd65a155.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/configs/vit/upernet_deit-s16_mln_512x512_160k_ade20k.py | https://dl.fbaipublicfiles.com/deit/deit_small_patch16_224-cd65a155.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/docker/Dockerfile | https://download.openmmlab.com/mmcv/dist/index.html | mmcv下载地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/docker/serve/Dockerfile | https://download.openmmlab.com/mmcv/dist/cu101/torch1.6.0/index.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/DeeplabV3_for_Pytorch/setup.py | openmmlab@gmail.com | 作者邮箱 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/semantic_segmentation/ENet/public_address_statement.md b/PyTorch/contrib/cv/semantic_segmentation/ENet/public_address_statement.md index 918f3c249097791324f3d05047d894acb8cb1c1c..376ecca459d8270dc6921b91adff6ee877a20e3d 100644 --- a/PyTorch/contrib/cv/semantic_segmentation/ENet/public_address_statement.md +++ b/PyTorch/contrib/cv/semantic_segmentation/ENet/public_address_statement.md @@ -1,50 +1,32 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|------------------------------------------------------------------------------------------------------|-------------------------------------------|---------------------------------------------------------------------------------------------------|---------| -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/core/data/downloader/ade20k.py | ENet/core/data/downloader/ade20k.py | http://data.csail.mit.edu/places/ADEchallenge/ADEChallengeData2016.zip | 下载数据集 | -| 开源代码引入 | 
https://github.com/Tramac/awesome-semantic-segmentation-pytorch/core/data/downloader/ade20k.py | ENet/core/data/downloader/ade20k.py | http://data.csail.mit.edu/places/ADEchallenge/release_test.zip | 下载数据集 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/core/data/downloader/mscoco.py | ENet/core/data/downloader/mscoco.py | http://images.cocodataset.org/zips/train2017.zip | 下载数据集 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/core/data/downloader/mscoco.py | ENet/core/data/downloader/mscoco.py | http://images.cocodataset.org/annotations/annotations_trainval2017.zip | 下载数据集 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/core/data/downloader/mscoco.py | ENet/core/data/downloader/mscoco.py | http://images.cocodataset.org/zips/val2017.zip | 下载数据集 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/core/data/downloader/pascal_voc.py | ENet/core/data/downloader/pascal_voc.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2007/VOCtrainval_06-Nov-2007.tar | 下载数据集 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/core/data/downloader/pascal_voc.py | ENet/core/data/downloader/pascal_voc.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2007/VOCtest_06-Nov-2007.tar | 下载数据集 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/core/data/downloader/pascal_voc.py | ENet/core/data/downloader/pascal_voc.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2012/VOCtrainval_11-May-2012.tar | 下载数据集 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/core/data/downloader/pascal_voc.py | ENet/core/data/downloader/pascal_voc.py | http://www.eecs.berkeley.edu/Research/Projects/CS/vision/grouping/semantic_contours/benchmark.tgz | 下载数据集 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/core/data/downloader/sbu_shadow.py | ENet/core/data/downloader/sbu_shadow.py | http://www3.cs.stonybrook.edu/~cvl/content/datasets/shadow_db/SBU-shadow.zip | 下载数据集 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/core/models/base_models/densenet.py | ENet/core/models/base_models/densenet.py | https://download.pytorch.org/models/densenet121-a639ec97.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/core/models/base_models/densenet.py | ENet/core/models/base_models/densenet.py | https://download.pytorch.org/models/densenet169-b2777c0a.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/core/models/base_models/densenet.py | ENet/core/models/base_models/densenet.py | https://download.pytorch.org/models/densenet201-c1103571.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/core/models/base_models/densenet.py | ENet/core/models/base_models/densenet.py | https://download.pytorch.org/models/densenet161-8d451a50.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/core/models/base_models/resnet.py | ENet/core/models/base_models/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/core/models/base_models/resnet.py | ENet/core/models/base_models/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/core/models/base_models/resnet.py | 
ENet/core/models/base_models/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/core/models/base_models/resnet.py | ENet/core/models/base_models/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/core/models/base_models/resnet.py | ENet/core/models/base_models/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/core/models/base_models/resnetv1b.py | ENet/core/models/base_models/resnetv1b.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/core/models/base_models/resnetv1b.py | ENet/core/models/base_models/resnetv1b.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/core/models/base_models/resnetv1b.py | ENet/core/models/base_models/resnetv1b.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/core/models/base_models/resnetv1b.py | ENet/core/models/base_models/resnetv1b.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/core/models/base_models/resnetv1b.py | ENet/core/models/base_models/resnetv1b.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/core/models/base_models/resnext.py | ENet/core/models/base_models/resnext.py | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/core/models/base_models/resnext.py | ENet/core/models/base_models/resnext.py | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/core/models/base_models/vgg.py | ENet/core/models/base_models/vgg.py | https://download.pytorch.org/models/vgg11-bbd30ac9.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/core/models/base_models/vgg.py | ENet/core/models/base_models/vgg.py | https://download.pytorch.org/models/vgg13-c768596a.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/core/models/base_models/vgg.py | ENet/core/models/base_models/vgg.py | https://download.pytorch.org/models/vgg16-397923af.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/core/models/base_models/vgg.py | ENet/core/models/base_models/vgg.py | https://download.pytorch.org/models/vgg19-dcbb9e9d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/core/models/base_models/vgg.py | ENet/core/models/base_models/vgg.py | https://download.pytorch.org/models/vgg11_bn-6002323d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/core/models/base_models/vgg.py | ENet/core/models/base_models/vgg.py | https://download.pytorch.org/models/vgg13_bn-abd245e5.pth | 下载预训练模型 | -| 开源代码引入 | 
https://github.com/Tramac/awesome-semantic-segmentation-pytorch/core/models/base_models/vgg.py | ENet/core/models/base_models/vgg.py | https://download.pytorch.org/models/vgg16_bn-6c64b313.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/core/models/base_models/vgg.py | ENet/core/models/base_models/vgg.py | https://download.pytorch.org/models/vgg19_bn-c79401a0.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/core/models/model_store.py | ENet/core/models/model_store.py | https://hangzh.s3.amazonaws.com/ | 仓库地址 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/core/utils/filesystem.py | ENet/core/utils/filesystem.py | https://github.com/zhreshold/cocoapi.git#subdirectory=PythonAPI | 下载依赖 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/core/models/base_models/densenet.py | ENet/core/models/base_models/densenet.py | https://arxiv.org/pdf/1608.06993.pdf | 论文地址 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/core/utils/parallel.py | ENet/core/utils/parallel.py | https://github.com/zhanghang1989/PyTorch-Encoding/blob/master/encoding/parallel.py | 源码实现 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/core/utils/filesystem.py | ENet/core/utils/filesystem.py | http://github.com/user/repo/tarball/master/egginfo=xxx | 源码实现 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/core/utils/logger.py | ENet/core/utils/logger.py | https://github.com/facebookresearch/maskrcnn-benchmark/blob/master/maskrcnn_benchmark/utils/logger.py | 源码实现 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/core/nn/jpu.py | ENet/core/nn/jpu.py | https://github.com/wuhuikai/FastFCN/blob/master/encoding/nn/customize.py | 源码实现 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/core/utils/lr_scheduler.py | ENet/core/utils/lr_scheduler.py | https://github.com/facebookresearch/maskrcnn-benchmark/blob/master/maskrcnn_benchmark/solver/lr_scheduler.py | 源码实现 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/core/utils/loss.py | ENet/core/utils/loss.py | https://github.com/zhanghang1989/PyTorch-Encoding/blob/master/encoding/nn/loss.py | 源码实现 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/core/utils/distributed.py | ENet/core/utils/distributed.py | https://github.com/facebookresearch/maskrcnn-benchmark/blob/master/maskrcnn_benchmark/data/samplers/distributed.py | 源码实现 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/core/utils/distributed.py | ENet/core/utils/distributed.py | https://github.com/facebookresearch/maskrcnn-benchmark/blob/master/maskrcnn_benchmark/utils/comm.py | 源码实现 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/core/data/downloader/mscoco.py | ENet/core/data/downloader/mscoco.py | http://images.cocodataset.org/annotations/stuff_annotations_trainval2017.zip | 数据集地址 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/core/data/downloader/mscoco.py | ENet/core/data/downloader/mscoco.py | http://images.cocodataset.org/zips/test2017.zip | 数据集地址 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch/core/models/base_models/resnet.py | ENet/core/models/base_models/resnet.py | https://arxiv.org/abs/1706.02677 | 论文地址 | +| 文件位置 | 公网地址 | 公网地址用途 | 
+|----------------------------------------------------------------------------------------------------|---------------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/ENet/core/data/downloader/ade20k.py | http://data.csail.mit.edu/places/ADEchallenge/ADEChallengeData2016.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/ENet/core/data/downloader/ade20k.py | http://data.csail.mit.edu/places/ADEchallenge/release_test.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/ENet/core/data/downloader/mscoco.py | http://images.cocodataset.org/zips/train2017.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/ENet/core/data/downloader/mscoco.py | http://images.cocodataset.org/annotations/annotations_trainval2017.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/ENet/core/data/downloader/mscoco.py | http://images.cocodataset.org/zips/val2017.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/ENet/core/data/downloader/pascal_voc.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2007/VOCtrainval_06-Nov-2007.tar | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/ENet/core/data/downloader/pascal_voc.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2007/VOCtest_06-Nov-2007.tar | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/ENet/core/data/downloader/pascal_voc.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2012/VOCtrainval_11-May-2012.tar | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/ENet/core/data/downloader/pascal_voc.py | http://www.eecs.berkeley.edu/Research/Projects/CS/vision/grouping/semantic_contours/benchmark.tgz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/ENet/core/data/downloader/sbu_shadow.py | http://www3.cs.stonybrook.edu/~cvl/content/datasets/shadow_db/SBU-shadow.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/ENet/core/models/base_models/densenet.py | https://download.pytorch.org/models/densenet201-c1103571.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/ENet/core/models/base_models/densenet.py | https://download.pytorch.org/models/densenet169-b2777c0a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/ENet/core/models/base_models/densenet.py | https://download.pytorch.org/models/densenet161-8d451a50.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/ENet/core/models/base_models/densenet.py | https://download.pytorch.org/models/densenet121-a639ec97.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/ENet/core/models/base_models/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/ENet/core/models/base_models/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/ENet/core/models/base_models/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/ENet/core/models/base_models/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/ENet/core/models/base_models/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth 
| 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/ENet/core/models/base_models/resnext.py | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/ENet/core/models/base_models/resnext.py | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/ENet/core/models/base_models/vgg.py | https://download.pytorch.org/models/vgg19-dcbb9e9d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/ENet/core/models/base_models/vgg.py | https://download.pytorch.org/models/vgg19_bn-c79401a0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/ENet/core/models/base_models/vgg.py | https://download.pytorch.org/models/vgg16-397923af.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/ENet/core/models/base_models/vgg.py | https://download.pytorch.org/models/vgg16_bn-6c64b313.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/ENet/core/models/base_models/vgg.py | https://download.pytorch.org/models/vgg13_bn-abd245e5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/ENet/core/models/base_models/vgg.py | https://download.pytorch.org/models/vgg13-c768596a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/ENet/core/models/base_models/vgg.py | https://download.pytorch.org/models/vgg11_bn-6002323d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/ENet/core/models/base_models/vgg.py | https://download.pytorch.org/models/vgg11-bbd30ac9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/ENet/core/models/model_store.py | https://hangzh.s3.amazonaws.com/ | 相关说明 | diff --git a/PyTorch/contrib/cv/semantic_segmentation/FCN-res18_for_Pytorch/public_address_statement.md b/PyTorch/contrib/cv/semantic_segmentation/FCN-res18_for_Pytorch/public_address_statement.md index 2d7cd18744aa141b6559e0e687c42286f816219f..60573250ff9a97166edeed99979d8dae0f9fa918 100644 --- a/PyTorch/contrib/cv/semantic_segmentation/FCN-res18_for_Pytorch/public_address_statement.md +++ b/PyTorch/contrib/cv/semantic_segmentation/FCN-res18_for_Pytorch/public_address_statement.md @@ -1,106 +1,3 @@ -|类型|开源代码地址|文件名|公网IP地址/公网URL地址/域名/邮箱地址|用途说明| -|--------|-------------------------------------------------------|--------------------------------------------------------|----------------------------------------------------------------------------------------------------------|---------| -|开源代码引入|https://github.com/open-mmlab/mmsegmentation|setup.py|https://github.com/open-mmlab/mmsegmentation|源码实现| -|开源代码引入|http://setuptools.readthedocs.io/en/latest/setuptools.html|setup.py|http://setuptools.readthedocs.io/en/latest/setuptools.html|源码实现| -|开源代码引入|https://github.com/openai/CLIP|mmseg/utils/tokenizer.py|https://github.com/openai/CLIP|源码实现| -|开源代码引入|https://arxiv.org/abs/1706.03762|mmseg/models/utils/self_attention_block.py|https://arxiv.org/abs/1706.03762|参考论文地址| -|开源代码引入|https://github.com/facebookresearch/ConvNeXt/blob/d1fa8f6fef0a165b27399986cc2bdacc92777e40/models/convnext.py|mmseg/models/utils/san_layers.py|https://github.com/facebookresearch/ConvNeXt/blob/d1fa8f6fef0a165b27399986cc2bdacc92777e40/models/convnext.py|源码实现| -|开源代码引入|https://github.com/pytorch/pytorch/blob/main/torch/nn/functional.py|mmseg/models/utils/san_layers.py|https://github.com/pytorch/pytorch/blob/main/torch/nn/functional.py|源码实现| 
-|开源代码引入|https://github.com/MendelXu/SAN/blob/main/san/model/attn_helper.py|mmseg/models/utils/san_layers.py|https://github.com/MendelXu/SAN/blob/main/san/model/attn_helper.py|源码实现| -|开源代码引入|https://arxiv.org/abs/2206.02066|mmseg/models/utils/ppm.py|https://arxiv.org/abs/2206.02066|参考论文地址| -|开源代码引入|https://arxiv.org/abs/2101.06085|mmseg/models/utils/ppm.py|https://arxiv.org/abs/2101.06085|参考论文地址| -|开源代码引入|https://github.com/tensorflow/models/blob/master/research/slim/nets/mobilenet/mobilenet.py|mmseg/models/utils/make_divisible.py|https://github.com/tensorflow/models/blob/master/research/slim/nets/mobilenet/mobilenet.py|源码实现| -|开源代码引入|https://pytorch.org/docs/stable/generated/torch.nn.Conv2d.html|mmseg/models/utils/embed.py|https://pytorch.org/docs/stable/generated/torch.nn.Conv2d.html|源码实现| -|开源代码引入|https://arxiv.org/abs/1512.03385|mmseg/models/utils/basic_block.py|https://arxiv.org/abs/1512.03385|参考论文地址| -|开源代码引入|https://arxiv.org/abs/1512.03385|mmseg/models/utils/basic_block.py|https://arxiv.org/abs/1512.03385|参考论文地址| -|开源代码引入|https://arxiv.org/abs/2012.15840|mmseg/models/necks/mla_neck.py|https://arxiv.org/abs/2012.15840|参考论文地址| -|开源代码引入|https://arxiv.org/abs/1903.11816|mmseg/models/necks/jpu.py|https://arxiv.org/abs/1903.11816|参考论文地址| -|开源代码引入|https://arxiv.org/abs/1704.08545|mmseg/models/necks/ic_neck.py|https://arxiv.org/abs/1704.08545|参考论文地址| -|开源代码引入|https://arxiv.org/abs/1612.03144|mmseg/models/necks/fpn.py|https://arxiv.org/abs/1612.03144|参考论文地址| -|开源代码引入|https://arxiv.org/abs/1706.05721|mmseg/models/losses/tversky_loss.py|https://arxiv.org/abs/1706.05721|参考论文地址| -|开源代码引入|https://github.com/JunMa11/SegLoss/blob/master/losses_pytorch/dice_loss.py|mmseg/models/losses/tversky_loss.py|https://github.com/JunMa11/SegLoss/blob/master/losses_pytorch/dice_loss.py|源码实现| -|开源代码引入|https://github.com/XuJiacong/PIDNet/blob/main/utils/criterion.py|mmseg/models/losses/ohem_cross_entropy_loss.py|https://github.com/XuJiacong/PIDNet/blob/main/utils/criterion.py|源码实现| -|开源代码引入|https://arxiv.org/abs/1705.08790|mmseg/models/losses/lovasz_loss.py|https://arxiv.org/abs/1705.08790|参考论文地址| -|开源代码引入|https://github.com/bermanmaxim/LovaszSoftmax/blob/master/pytor|mmseg/models/losses/lovasz_loss.py|https://github.com/bermanmaxim/LovaszSoftmax/blob/master/pytor|源码实现| -|开源代码引入|https://en.wikipedia.org/wiki/Kullback%E2%80%93Leibler_divergence|mmseg/models/losses/kldiv_loss.py|https://en.wikipedia.org/wiki/Kullback%E2%80%93Leibler_divergence|源码实现| -|开源代码引入|http://proceedings.mlr.press/v121/ma20b.html|mmseg/models/losses/huasdorff_distance_loss.py|http://proceedings.mlr.press/v121/ma20b.html|源码实现| -|开源代码引入|https://github.com/JunMa11/SegWithDistMap/blob/|mmseg/models/losses/huasdorff_distance_loss.py|https://github.com/JunMa11/SegWithDistMap/blob/|源码实现| -|开源代码引入|https://arxiv.org/abs/1708.02002|mmseg/models/losses/focal_loss.py|https://arxiv.org/abs/1708.02002|参考论文地址| -|开源代码引入|https://arxiv.org/abs/1708.02002|mmseg/models/losses/focal_loss.py|https://arxiv.org/abs/1708.02002|参考论文地址| -|开源代码引入|https://arxiv.org/abs/1708.02002|mmseg/models/losses/focal_loss.py|https://arxiv.org/abs/1708.02002|参考论文地址| -|开源代码引入|https://github.com/open-mmlab/mmdetection|mmseg/models/losses/focal_loss.py|https://github.com/open-mmlab/mmdetection|源码实现| -|开源代码引入|https://arxiv.org/abs/1606.04797|mmseg/models/losses/dice_loss.py|https://arxiv.org/abs/1606.04797|参考论文地址| 
-|开源代码引入|https://github.com/pytorch/pytorch/blob/56b43f4fec1f76953f15a627694d4bba34588969/torch/nn/functional.py|mmseg/models/losses/cross_entropy_loss.py|https://github.com/pytorch/pytorch/blob/56b43f4fec1f76953f15a627694d4bba34588969/torch/nn/functional.py|源码实现| -|开源代码引入|https://github.com/XuJiacong/PIDNet/blob/main/utils/criterion.py|mmseg/models/losses/boundary_loss.py|https://github.com/XuJiacong/PIDNet/blob/main/utils/criterion.py|源码实现| -|开源代码引入|https://arxiv.org/abs/2303.02153|mmseg/models/decode_heads/vpd_depth_head.py|https://arxiv.org/abs/2303.02153|参考论文地址| -|开源代码引入|https://arxiv.org/abs/1807.10221|mmseg/models/decode_heads/uper_head.py|https://arxiv.org/abs/1807.10221|参考论文地址| -|开源代码引入|https://arxiv.org/abs/2104.13188|mmseg/models/decode_heads/stdc_head.py|https://arxiv.org/abs/2104.13188|参考论文地址| -|开源代码引入|https://arxiv.org/pdf/2012.15840.pdf|mmseg/models/decode_heads/setr_up_head.py|https://arxiv.org/pdf/2012.15840.pdf|参考论文地址| -|开源代码引入|https://arxiv.org/pdf/2012.15840.pdf|mmseg/models/decode_heads/setr_mla_head.py|https://arxiv.org/pdf/2012.15840.pdf|参考论文地址| -|开源代码引入|https://arxiv.org/abs/1902.04502|mmseg/models/decode_heads/sep_fcn_head.py|https://arxiv.org/abs/1902.04502|参考论文地址| -|开源代码引入|https://github.com/MendelXu/SAN/blob/main/san/model/side_adapter/side_adapter.py|mmseg/models/decode_heads/san_head.py|https://github.com/MendelXu/SAN/blob/main/san/model/side_adapter/side_adapter.py|源码实现| -|开源代码引入|https://arxiv.org/abs/2302.12242|mmseg/models/decode_heads/san_head.py|https://arxiv.org/abs/2302.12242|参考论文地址| -|开源代码引入|https://arxiv.org/abs/1612.01105|mmseg/models/decode_heads/psp_head.py|https://arxiv.org/abs/1612.01105|参考论文地址| -|开源代码引入|https://hszhao.github.io/papers/eccv18_psanet.pdf|mmseg/models/decode_heads/psa_head.py|https://hszhao.github.io/papers/eccv18_psanet.pdf|参考论文地址| -|开源代码引入|https://arxiv.org/abs/1909.11065|mmseg/models/decode_heads/ocr_head.py|https://arxiv.org/abs/1909.11065|参考论文地址| -|开源代码引入|https://arxiv.org/abs/1711.07971|mmseg/models/decode_heads/nl_head.py|https://arxiv.org/abs/1711.07971|参考论文地址| -|开源代码引入|https://arxiv.org/pdf/2107.06278|mmseg/models/decode_heads/maskformer_head.py|https://arxiv.org/pdf/2107.06278|参考论文地址| -|开源代码引入|https://arxiv.org/abs/2112.01527|mmseg/models/decode_heads/mask2former_head.py|https://arxiv.org/abs/2112.01527|参考论文地址| -|开源代码引入|https://arxiv.org/abs/1907.12273|mmseg/models/decode_heads/isa_head.py|https://arxiv.org/abs/1907.12273|参考论文地址| -|开源代码引入|https://arxiv.org/abs/1901.02446|mmseg/models/decode_heads/fpn_head.py|https://arxiv.org/abs/1901.02446|参考论文地址| -|开源代码引入|https://arxiv.org/abs/1411.4038|mmseg/models/decode_heads/fcn_head.py|https://arxiv.org/abs/1411.4038|参考论文地址| -|开源代码引入|https://arxiv.org/abs/1803.08904|mmseg/models/decode_heads/enc_head.py|https://arxiv.org/abs/1803.08904|参考论文地址| -|开源代码引入|https://arxiv.org/abs/1907.13426|mmseg/models/decode_heads/ema_head.py|https://arxiv.org/abs/1907.13426|参考论文地址| -|开源代码引入|https://arxiv.org/abs/2103.13413|mmseg/models/decode_heads/dpt_head.py|https://arxiv.org/abs/2103.13413|参考论文地址| -|开源代码引入|https://arxiv.org/abs/2006.06668|mmseg/models/decode_heads/dnl_head.py|https://arxiv.org/abs/2006.06668|参考论文地址| -|开源代码引入|https://arxiv.org/abs/1809.02983|mmseg/models/decode_heads/da_head.py|https://arxiv.org/abs/1809.02983|参考论文地址| -|开源代码引入|https://arxiv.org/abs/1811.11721|mmseg/models/decode_heads/cc_head.py|https://arxiv.org/abs/1811.11721|参考论文地址| -|开源代码引入|https://arxiv.org/abs/1706.05587|mmseg/models/decode_heads/aspp_head.py|https://arxiv.org/abs/1706.05587|参考论文地址| 
-|开源代码引入|https://arxiv.org/abs/2303.02153|mmseg/models/backbones/vpd.py|https://arxiv.org/abs/2303.02153|参考论文地址| -|开源代码引入|https://github.com/wl-zhao/VPD/blob/main/vpd/models.py|mmseg/models/backbones/vpd.py|https://github.com/wl-zhao/VPD/blob/main/vpd/models.py|源码实现| -|开源代码引入|https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/vision_transformer.py|mmseg/models/backbones/vit.py|https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/vision_transformer.py|源码实现| -|开源代码引入|https://arxiv.org/abs/2010.11929|mmseg/models/backbones/vit.py|https://arxiv.org/abs/2010.11929|参考论文地址| -|开源代码引入|https://arxiv.org/abs/1505.04597|mmseg/models/backbones/unet.py|https://arxiv.org/abs/1505.04597|参考论文地址| -|开源代码引入|https://arxiv.org/abs/1512.03385|mmseg/models/backbones/twins.py|https://arxiv.org/abs/1512.03385|参考论文地址| -|开源代码引入|https://arxiv.org/abs/1512.03385|mmseg/models/backbones/twins.py|https://arxiv.org/abs/1512.03385|参考论文地址| -|开源代码引入|https://arxiv.org/abs/2102.10882|mmseg/models/backbones/twins.py|https://arxiv.org/abs/2102.10882|参考论文地址| -|开源代码引入|https://github.com/rwightman/pytorch-image-models|mmseg/models/backbones/timm_backbone.py|https://github.com/rwightman/pytorch-image-models|源码实现| -|开源代码引入|https://github.com/microsoft/Swin-Transformer|mmseg/models/backbones/swin.py|https://github.com/microsoft/Swin-Transformer|源码实现| -|开源代码引入|https://arxiv.org/abs/2103.14030|mmseg/models/backbones/swin.py|https://arxiv.org/abs/2103.14030|参考论文地址| -|开源代码引入|https://arxiv.org/abs/2104.13188|mmseg/models/backbones/stdc.py|https://arxiv.org/abs/2104.13188|参考论文地址| -|开源代码引入|https://github.com/MichaelFan01/STDC-Seg|mmseg/models/backbones/stdc.py|https://github.com/MichaelFan01/STDC-Seg|源码实现| -|开源代码引入|https://arxiv.org/abs/1611.05431|mmseg/models/backbones/resnext.py|https://arxiv.org/abs/1611.05431|参考论文地址| -|开源代码引入|https://arxiv.org/abs/1812.01187|mmseg/models/backbones/resnet.py|https://arxiv.org/abs/1812.01187|参考论文地址| -|开源代码引入|https://arxiv.org/abs/1512.03385|mmseg/models/backbones/resnet.py|https://arxiv.org/abs/1512.03385|参考论文地址| -|开源代码引入|https://arxiv.org/abs/2004.08955|mmseg/models/backbones/resnest.py|https://arxiv.org/abs/2004.08955|参考论文地址| -|开源代码引入|https://github.com/XuJiacong/PIDNet|mmseg/models/backbones/pidnet.py|https://github.com/XuJiacong/PIDNet|源码实现| -|开源代码引入|https://arxiv.org/abs/2206.02066|mmseg/models/backbones/pidnet.py|https://arxiv.org/abs/2206.02066|参考论文地址| -|开源代码引入|https://github.com/visual-attention-network/segnext|mmseg/models/backbones/mscan.py|https://github.com/visual-attention-network/segnext|源码实现| -|开源代码引入|https://arxiv.org/abs/2209.08575|mmseg/models/backbones/mscan.py|https://arxiv.org/abs/2209.08575|参考论文地址| -|开源代码引入|https://github.com/visual-attention-network/segnext|mmseg/models/backbones/mscan.py|https://github.com/visual-attention-network/segnext|源码实现| -|开源代码引入|https://ieeexplore.ieee.org/document/9008835|mmseg/models/backbones/mobilenet_v3.py|https://ieeexplore.ieee.org/document/9008835|参考论文地址| -|开源代码引入|https://arxiv.org/abs/1801.04381|mmseg/models/backbones/mobilenet_v2.py|https://arxiv.org/abs/1801.04381|参考论文地址| -|开源代码引入|https://arxiv.org/abs/2105.15203|mmseg/models/backbones/mit.py|https://arxiv.org/abs/2105.15203|参考论文地址| -|开源代码引入|https://github.com/pytorch/pytorch/issues/37583|mmseg/models/backbones/mit.py|https://github.com/pytorch/pytorch/issues/37583|源码实现| -|开源代码引入|https://github.com/open-mmlab/mmcv/pull/1418|mmseg/models/backbones/mit.py|https://github.com/open-mmlab/mmcv/pull/1418|源码实现| 
-|开源代码引入|https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/vision_transformer.py|mmseg/models/backbones/mae.py|https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/vision_transformer.py|源码实现| -|开源代码引入|https://github.com/microsoft/unilm/blob/master/beit/modeling_pretrain.py|mmseg/models/backbones/mae.py|https://github.com/microsoft/unilm/blob/master/beit/modeling_pretrain.py|源码实现| -|开源代码引入|https://arxiv.org/abs/1704.08545|mmseg/models/backbones/icnet.py|https://arxiv.org/abs/1704.08545|参考论文地址| -|开源代码引入|https://arxiv.org/abs/1904.04514|mmseg/models/backbones/hrnet.py|https://arxiv.org/abs/1904.04514|参考论文地址| -|开源代码引入|https://arxiv.org/abs/1902.04502|mmseg/models/backbones/fast_scnn.py|https://arxiv.org/abs/1902.04502|参考论文地址| -|开源代码引入|https://ieeexplore.ieee.org/document/8063438|mmseg/models/backbones/erfnet.py|https://ieeexplore.ieee.org/document/8063438|参考论文地址| -|开源代码引入|https://github.com/ydhongHIT/DDRNet|mmseg/models/backbones/ddrnet.py|https://github.com/ydhongHIT/DDRNet|源码实现| -|开源代码引入|http://arxiv.org/abs/2101.06085|mmseg/models/backbones/ddrnet.py|http://arxiv.org/abs/2101.06085|参考论文地址| -|开源代码引入|https://arxiv.org/abs/1811.08201|mmseg/models/backbones/cgnet.py|https://arxiv.org/abs/1811.08201|参考论文地址| -|开源代码引入|https://arxiv.org/abs/2004.02147|mmseg/models/backbones/bisenetv2.py|https://arxiv.org/abs/2004.02147|参考论文地址| -|开源代码引入|https://arxiv.org/abs/1808.00897|mmseg/models/backbones/bisenetv1.py|https://arxiv.org/abs/1808.00897|参考论文地址| -|开源代码引入|https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/vision_transformer.py|mmseg/models/backbones/beit.py|https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/vision_transformer.py|源码实现| -|开源代码引入|https://github.com/microsoft/unilm/blob/master/beit/semantic_segmentation/mmcv_custom/checkpoint.py|mmseg/models/backbones/beit.py|https://github.com/microsoft/unilm/blob/master/beit/semantic_segmentation/mmcv_custom/checkpoint.py|源码实现| -|开源代码引入|https://mmengine.readthedocs.io/en/latest/api/fileio.htm|mmseg/engine/hooks/visualization_hook.py|https://mmengine.readthedocs.io/en/latest/api/fileio.htm|源码实现| -|开源代码引入|https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/utils/class_names.py|mmseg/apis/mmseg_inferencer.py|https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/utils/class_names.py|源码实现| -|开源代码引入|https://github.com/open-mmlab/mmsegmentation/blob/main/configs/fcn/metafile.yaml|mmseg/apis/mmseg_inferencer.py|https://github.com/open-mmlab/mmsegmentation/blob/main/configs/fcn/metafile.yaml|源码实现| -|开源代码引入|https://github.com/open-mmlab/mmengine/blob/main/docs/en/tutorials/registry.md|mmseg/utils/set_env.py|https://github.com/open-mmlab/mmengine/blob/main/docs/en/tutorials/registry.md|源码实现| -|开源代码引入|https://github.com/openai/CLIP.git|requirements/docs.txt|https://github.com/openai/CLIP.git|源码实现| -|开源代码引入|https://github.com/CompVis/stable-diffusion|requirements/optional.txt|https://github.com/CompVis/stable-diffusion|源码实现| -|开源代码引入|https://github.com/CompVis/taming-transformers.git|requirements/optional.txt|https://github.com/CompVis/taming-transformers.git|源码实现| -|开源代码引入|https://github.com/open-mmlab/pytorch_sphinx_theme.git|requirements/optional.txt|https://github.com/open-mmlab/pytorch_sphinx_theme.git|源码实现| \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|------------------------------------------------------------------------------------------|---------------------|---------| +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN-res18_for_Pytorch/setup.py | openmmlab@gmail.com | 作者邮箱 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/semantic_segmentation/FCN8s/public_address_statement.md b/PyTorch/contrib/cv/semantic_segmentation/FCN8s/public_address_statement.md index 55e740a939aa880839e3b503df25d8e34a83de77..0d29cdab847b6a6ec4ffcc0512da63bfc250252a 100644 --- a/PyTorch/contrib/cv/semantic_segmentation/FCN8s/public_address_statement.md +++ b/PyTorch/contrib/cv/semantic_segmentation/FCN8s/public_address_statement.md @@ -1,183 +1,80 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|-----------------------------------------------------------|-----------------------------------------|-------------------------------------------------------------------------------------------------------------------------|---------| -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg11_imagenet-01ecd97e.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg13_imagenet-9ad3945d.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg16_imagenet-91b6d117.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg19_imagenet-fee352a8.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg11_bn_imagenet-6fbbbf3f.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg13_bn_imagenet-4b5f9390.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg16_bn_imagenet-3ac6d8fd.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg19_bn_imagenet-7c058385.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet18_batch256_imagenet_20200708-34ab8f90.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet34_batch256_imagenet_20200708-32ffb4f7.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet50_batch256_imagenet_20200708-cfb998bf.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet101_batch256_imagenet_20200708-753f3608.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet152_batch256_imagenet_20200708-ec25b1f9.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnetv1d50_batch256_imagenet_20200708-1ad0ce94.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnetv1d101_batch256_imagenet_20200708-9cb302ef.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnetv1d152_batch256_imagenet_20200708-e79cb6a2.pth | 下载预训练模型 | -| 开发引入 | / | 
FCN8s/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnext/resnext50_32x4d_batch256_imagenet_20200708-c07adbb7.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnext/resnext101_32x4d_batch256_imagenet_20200708-87f2d1c9.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnext/resnext101_32x8d_batch256_imagenet_20200708-1ec34aa7.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnext/resnext152_32x4d_batch256_imagenet_20200708-aab5034c.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/se-resnet/se-resnet50_batch256_imagenet_20200804-ae206104.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/se-resnet/se-resnet101_batch256_imagenet_20200804-ba5b51d4.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnest/resnest50_imagenet_converted-1ebf0afe.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnest/resnest101_imagenet_converted-032caa52.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnest/resnest200_imagenet_converted-581a60f2.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnest/resnest269_imagenet_converted-59930960.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/shufflenet_v1/shufflenet_v1_batch1024_imagenet_20200804-5d6cec73.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/shufflenet_v2/shufflenet_v2_batch1024_imagenet_20200812-5bf4721e.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/mobilenet_v2/mobilenet_v2_batch256_imagenet_20200708-3b2dc3af.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/vgg16_caffe-292e1171.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_caffe-788b5fa3.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_msra-5891d200.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_caffe-3ad79236.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_msra-6cc46731.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_32x8d-1516f1aa.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext50-32x4d-0ab1a123.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_32x4d-a5af3160.pth | 下载预训练模型 | -| 
开发引入 | / | FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_64x4d-ee2c6f71.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_gn_thangvubk-ad1730dd.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_gn-9186a21c.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_gn-cac0ab98.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_gn_ws-15beedd8.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_gn_ws-3e3c308c.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext50_32x4d_gn_ws-0d87ac85.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_32x4d_gn_ws-34ac1a9e.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext50_32x4d_gn-c7e8b754.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_32x4d_gn-ac3bb84e.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w18_small-b5a04e21.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w18-00eb2006.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w32-dc9eeb4f.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w40-ed0b031c.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w48-d2186c55.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/bn_inception_caffe-ed2e8665.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/i3d_r50_f32s2_k400-2c57e077.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/nl3d_r50_f32s2_k400-fa7e7caa.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/res2net101_v1d_26w_4s_mmdetv2-f0a600f9.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_400mf-a5b10d96.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_800mf-1f4be4c7.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_1.6gf-5791c176.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_3.2gf-c2599b0f.pth | 
下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_4.0gf-a88f671e.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_6.4gf-006af45d.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_8.0gf-3c68abe7.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_12gf-4c2a3350.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet18_v1c-b5776b93.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_v1c-2cccc1ad.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_v1c-e67eebb6.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/mmediting/third_party/vgg_state_dict.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/mmediting/third_party/model_best_resnet34_En_nomixup.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/mmediting/third_party/mobilenet_v2.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/mobilenet_v3_large-bc2c3fd3.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/mobilenet_v3_small-47085aa1.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnest50_d2-7497a55b.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnest101_d2-f3b931b2.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnest200_d2-ca88e41f.pth | 下载预训练模型 | -| 开发引入 | / | FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/darknet53-a628ea1b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/setup.py | FCN8s/mmcv_replace/model_zoo/open_mmlab.json | openmmlab@gmail.com | 邮箱 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/setup.py | FCN8s/mmcv_replace/model_zoo/open_mmlab.json | http://github.com/open-mmlab/mmsegmentation | 源码地址 | -| 开发引入 | / | FCN8s/url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 下载测试图片 | -| 开发引入 | / | FCN8s/mmcv_replace/cnn/utils/weight_init.py | http://download.openmmlab.com/mmdetection/v2.0/retinanet/ | 下载依赖 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git | FCN8s/mmcv_replace/cnn/bricks/depthwise_separable_conv_module.py | https://arxiv.org/pdf/1704.04861.pdf | 论文地址 | -| 开发引入 | / | FCN8s/mmcv_replace/ops/csrc/pytorch/box_iou_rotated_cuda.cu | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_cuda.cu | 源码实现 | -| 开发引入 | https://github.com/open-mmlab/mmsegmentation.git | FCN8s/mmcv_replace/ops/csrc/parrots/box_iou_rotated.cpp | / | 源码实现 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmsegmentation.git | FCN8s/mmseg/models/decode_heads/uper_head.py | https://arxiv.org/abs/1807.10221 | 论文地址 | -| 开发引入 | / | FCN8s/mmcv_replace/ops/corner_pool.py | https://github.com/princeton-vl/CornerNet-Lite | 源码实现 | -| 开发引入 | / | FCN8s/mmcv_replace/runner/hooks/logger/mlflow.py | https://www.mlflow.org/docs/latest/index.html | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git | FCN8s/mmcv_replace/cnn/bricks/conv_ws.py | https://arxiv.org/pdf/1903.10520.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git | FCN8s/mmseg/models/decode_heads/fcn_head.py | https://arxiv.org/abs/1411.4038 | 论文地址 | -| 开发引入 | / | FCN8s/mmcv_replace/image/colorspace.py | https://en.wikipedia.org/wiki/YCbCr#JPEG_conversion | 相关说明 | -| 开发引入 | / | FCN8s/mmcv_replace/ops/csrc/parrots/psamask_cuda.cu | https://github.com/hszhao/semseg/blob/master/lib/psa/src | 源码实现 | -| 开发引入 | / | FCN8s/mmcv_replace/image/colorspace.py | https://en.wikipedia.org/wiki/YCbCr#ITU-R_BT.601_conversion | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git | FCN8s/mmcv_replace/runner/hooks/momentum_updater.py | https://arxiv.org/pdf/1708.07120.pdf | 论文地址 | -| 开发引入 | / | FCN8s/mmcv_replace/ops/csrc/parrots/box_iou_rotated_cpu.cpp | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_cpu.cpp | 源码实现 | -| 开发引入 | / | FCN8s/setup.py | http://setuptools.readthedocs.io/en/latest/setuptools.html#declaring-platform-specific-dependencies | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git | FCN8s/mmcv_replace/ops/corner_pool.py | https://arxiv.org/abs/1808.01244 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git | FCN8s/mmcv_replace/ops/tin_shift.py | https://arxiv.org/abs/2001.06499 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git | FCN8s/mmseg/models/decode_heads/dnl_head.py | https://arxiv.org/abs/2006.06668 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git | FCN8s/mmseg/models/decode_heads/ocr_head.py | https://arxiv.org/abs/1909.11065 | 论文地址 | -| 开发引入 | / | FCN8s/mmcv_replace/ops/roi_align.py | https://github.com/facebookresearch/detectron2/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git | FCN8s/mmseg/models/decode_heads/psp_head.py | https://arxiv.org/abs/1612.01105 | 论文地址 | -| 开发引入 | / | FCN8s/mmcv_replace/ops/csrc/pytorch/box_iou_rotated_cpu.cpp | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_cpu.cpp | 源码实现 | -| 开发引入 | / | FCN8s/mmcv_replace/cnn/bricks/wrappers.py | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/wrappers.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git | FCN8s/mmseg/models/decode_heads/ann_head.py | https://arxiv.org/abs/1908.07678 | 论文地址 | -| 开发引入 | / | FCN8s/mmcv_replace/ops/psa_mask.py | https://github.com/hszhao/semseg/blob/master/lib/psa | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git | FCN8s/mmcv_replace/cnn/bricks/context_block.py | https://arxiv.org/abs/1904.11492 | 论文地址 | -| 开发引入 | / | FCN8s/mmcv_replace/ops/csrc/parrots/nms_rotated.cpp | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/nms_rotated/nms_rotated.h | 源码实现 | -| 开发引入 | / | FCN8s/mmcv_replace/ops/csrc/pytorch/nms_rotated_cuda.cu | 
https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/nms_rotated/nms_rotated_cuda.cu | 源码实现 | -| 开发引入 | / | FCN8s/mmcv_replace/cnn/bricks/plugin.py | https://inflection.readthedocs.io/en/latest/#inflection.underscore | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git | FCN8s/mmcv_replace/ops/saconv.py | https://arxiv.org/pdf/2006.02334.pdf | 论文地址 | -| 开发引入 | / | FCN8s/mmcv_replace/ops/csrc/pytorch/cc_attention_cuda.cu | https://github.com/LikeLy-Journey/SegmenTron/blob/master/segmentron/modules/csrc/criss_cross_attention/ca_cuda.cu | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git | FCN8s/mmseg/models/decode_heads/nl_head.py | https://arxiv.org/abs/1711.07971 | 论文地址 | -| 开发引入 | / | FCN8s/mmcv_replace/ops/csrc/parrots/box_iou_rotated_cuda.cu | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_cuda.cu | 源码实现 | -| 开发引入 | / | FCN8s/mmcv_replace/ops/csrc/pytorch/psamask_cuda.cu | https://github.com/hszhao/semseg/blob/master/lib/psa/src | 源码实现 | -| 开发引入 | / | FCN8s/mmcv_replace/ops/csrc/deform_conv_cuda_kernel.cuh | https://github.com/chengdazhi/Deformable-Convolution-V2-PyTorch/blob/mmdetection/mmdet/ops/dcn/src/deform_conv_cuda_kernel.cu | 源码实现 | -| 开发引入 | / | FCN8s/mmcv_replace/ops/nms.py | https://github.com/pytorch/vision/ | 源码实现 | -| 开发引入 | / | FCN8s/mmcv_replace/onnx/simplify/core.py | https://github.com/onnx/onnx/blob/e5e9a539f550f07ec156812484e8d4f33fb91f88/onnx/onnx.proto#L461 | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git | FCN8s/mmcv_replace/ops/csrc/modulated_deform_conv_cuda_kernel.cuh | https://arxiv.org/abs/1703.06211 | 论文地址 | -| 开发引入 | / | FCN8s/mmseg/core/seg/sampler/ohem_pixel_sampler.py | https://github.com/pytorch/pytorch/issues/22812 | 相关说明 | -| 开发引入 | / | FCN8s/mmcv_replace/ops/csrc/pytorch/corner_pool.cpp | https://github.com/princeton-vl/CornerNet-Lite/tree/master/core/models/py_utils/_cpools/src | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git | FCN8s/mmseg/models/decode_heads/fpn_head.py | https://arxiv.org/abs/1901.02446 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git | FCN8s/mmseg/models/decode_heads/aspp_head.py | https://arxiv.org/abs/1706.05587 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git | FCN8s/mmcv_replace/cnn/bricks/generalized_attention.py | https://arxiv.org/abs/1711.07971 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git | FCN8s/mmseg/models/decode_heads/cc_head.py | https://arxiv.org/abs/1811.11721 | 论文地址 | -| 开发引入 | / | FCN8s/mmseg/models/decode_heads/point_head.py | https://github.com/facebookresearch/detectron2/tree/master/projects/PointRend/point_head/point_head.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git | FCN8s/mmseg/models/decode_heads/enc_head.py | https://arxiv.org/abs/1803.08904 | 论文地址 | -| 开发引入 | / | FCN8s/mmseg/datasets/builder.py | https://github.com/pytorch/pytorch/issues/973 | 相关说明 | -| 开发引入 | / | FCN8s/mmcv_replace/ops/csrc/pytorch/nms_rotated.cpp | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/nms_rotated/nms_rotated.h | 源码实现 | -| 开发引入 | / | FCN8s/mmseg/models/backbones/mobilenet_v3.py | https://ieeexplore.ieee.org/document/9008835 | 相关说明 | -| 开发引入 | / | FCN8s/mmcv_replace/runner/fp16_utils.py | https://github.com/NVIDIA/apex/blob/master/apex/fp16_utils/loss_scaler.py | 源码实现 | -| 开发引入 | / | FCN8s/mmcv_replace/ops/nms.py | 
https://github.com/pytorch/vision/blob | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git | FCN8s/mmcv_replace/ops/csrc/deform_conv_cuda_kernel.cuh | https://arxiv.org/abs/1703.06211 | 论文地址 | -| 开发引入 | / | FCN8s/mmcv_replace/ops/csrc/pytorch/info.cpp | https://github.com/pytorch/pytorch/blob/master/aten/src/ATen/Version.cpp | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git | FCN8s/mmseg/models/backbones/resnet.py | https://arxiv.org/pdf/1812.01187.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git | FCN8s/mmcv_replace/runner/hooks/optimizer.py | https://arxiv.org/abs/1710.03740 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git | FCN8s/mmseg/models/backbones/unet.py | https://arxiv.org/pdf/1505.04597.pdf | 论文地址 | -| 开发引入 | / | FCN8s/mmcv_replace/onnx/symbolic.py | https://github.com/pytorch/pytorch | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git | FCN8s/mmseg/models/necks/fpn.py | https://arxiv.org/abs/1612.03144 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git | FCN8s/mmseg/models/backbones/cgnet.py | https://arxiv.org/abs/1811.08201 | 论文地址 | -| 开发引入 | / | FCN8s/mmcv_replace/ops/csrc/parrots/roi_align_cpu.cpp | https://github.com/facebookresearch/detectron2/tree/master/detectron2/layers/csrc/ROIAlign | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git | FCN8s/mmseg/models/decode_heads/sep_aspp_head.py | https://arxiv.org/abs/1802.02611 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git | FCN8s/mmseg/models/backbones/hrnet.py | https://arxiv.org/abs/1904.04514 | 论文地址 | -| 开发引入 | / | FCN8s/mmcv_replace/ops/point_sample.py | https://github.com/facebookresearch/detectron2/tree/master/projects/PointRend | 源码实现 | -| 开发引入 | / | FCN8s/mmcv_replace/cnn/utils/weight_init.py | https://www.cv-foundation.org/openaccess/content_iccv_2015/ | 相关说明 | -| 开发引入 | / | FCN8s/mmcv_replace/onnx/simplify/core.py | https://github.com/onnx/onnx/issues/2417 | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git | FCN8s/mmcv_replace/cnn/bricks/conv_ws.py | https://arxiv.org/pdf/2006.02334.pdf | 论文地址 | -| 开发引入 | / | FCN8s/mmcv_replace/onnx/onnx_utils/symbolic_helper.py | https://github.com/pytorch/pytorch | 源码实现 | -| 开发引入 | / | FCN8s/mmcv_replace/ops/csrc/box_iou_rotated_cuda.cuh | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_cuda.cu | 源码实现 | -| 开发引入 | / | FCN8s/mmcv_replace/ops/csrc/pytorch/roi_align_cpu.cpp | https://github.com/facebookresearch/detectron2/tree/master/detectron2/layers/csrc/ROIAlign | 源码实现 | -| 开发引入 | / | FCN8s/mmcv_replace/ops/csrc/pytorch/psamask.cpp | https://github.com/hszhao/semseg/blob/master/lib/psa/src | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git | FCN8s/mmseg/models/decode_heads/psa_head.py | https://hszhao.github.io/papers/eccv18_psanet.pdf | 论文地址 | -| 开发引入 | / | FCN8s/mmcv_replace/ops/tin_shift.py | https://github.com/deepcs233/TIN/blob/master/cuda_shift/rtc_wrap.py | 源码实现 | -| 开发引入 | / | FCN8s/mmcv_replace/ops/csrc/pytorch/info.cpp | https://github.com/pytorch/pytorch/blob/master/aten/src/ATen/cuda/detail/CUDAHooks.cpp#L231 | 源码实现 | -| 开发引入 | / | FCN8s/mmcv_replace/image/io.py | https://github.com/lilohuang/PyTurboJPEG | 相关说明 | -| 开发引入 | / | FCN8s/mmcv_replace/cnn/bricks/non_local.py | https://github.com/AlexHex7/Non-local_pytorch | 源码实现 | -| 开发引入 | / | FCN8s/mmcv_replace/ops/csrc/pytorch/box_iou_rotated.cpp | 
https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated.h | 源码实现 | -| 开发引入 | / | FCN8s/mmcv_replace/ops/csrc/carafe_cuda_kernel.cuh | https://devblogs.nvidia.com/efficient-matrix-transpose-cuda-cc/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git | FCN8s/mmseg/models/decode_heads/apc_head.py | https://openaccess.thecvf.com/content_CVPR_2019/papers/ | 论文地址 | -| 开发引入 | / | FCN8s/mmcv_replace/onnx/onnx_utils/symbolic_helper.py | https://github.com/pytorch/pytorch/blob/75ee5756715e7161314ce037474843b68f69fc04/torch/onnx/symbolic_helper.py#L375 | 源码实现 | -| 开发引入 | / | FCN8s/mmcv_replace/ops/csrc/parrots/nms_rotated_cuda.cu | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/nms_rotated/nms_rotated_cuda.cu | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git | FCN8s/mmcv_replace/runner/hooks/lr_updater.py | https://arxiv.org/pdf/1506.01186.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git | FCN8s/mmseg/models/decode_heads/gc_head.py | https://arxiv.org/abs/1904.11492 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git | FCN8s/mmcv_replace/ops/carafe.py | https://arxiv.org/abs/1905.02188 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git | FCN8s/mmcv_replace/ops/tin_shift.py | shaoh19@mails.tsinghua.edu.cn","sjqian@cse.cuhk.edu.hk","yuliu@ee.cuhk.edu.hk | 邮箱地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git | FCN8s/mmseg/models/utils/self_attention_block.py | https://arxiv.org/abs/1706.03762 | 论文地址 | -| 开发引入 | / | FCN8s/mmcv_replace/ops/tin_shift.py | https://github.com/mit-han-lab/temporal-shift-module | 相关说明 | -| 开发引入 | / | FCN8s/mmcv_replace/ops/csrc/pytorch/nms_rotated_cpu.cpp | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/nms_rotated/nms_rotated_cpu.cpp | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git | FCN8s/mmseg/models/decode_heads/ema_head.py | https://arxiv.org/abs/1907.13426 | 论文地址 | -| 开发引入 | / | FCN8s/mmcv_replace/ops/csrc/nms_rotated_cuda.cuh | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/nms_rotated/nms_rotated_cuda.cu | 源码实现 | -| 开发引入 | / | FCN8s/mmcv_replace/ops/csrc/parrots/corner_pool.cpp | https://github.com/princeton-vl/CornerNet-Lite/tree/master/core/models/py_utils/_cpools/src | 源码实现 | -| 开发引入 | / | FCN8s/mmcv_replace/ops/csrc/pytorch/info.cpp | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/vision.cpp | 源码实现 | -| 开发引入 | / | FCN8s/mmcv_replace/cnn/utils/flops_counter.py | https://github.com/sovrasov/flops-counter.pytorch | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git | FCN8s/mmseg/models/decode_heads/dm_head.py | https://openaccess.thecvf.com/content_ICCV_2019/papers/ | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git | FCN8s/mmseg/models/decode_heads/da_head.py | https://arxiv.org/abs/1809.02983 | 论文地址 | -| 开发引入 | / | FCN8s/mmcv_replace/onnx/simplify/core.py | https://github.com/onnx/onnx/issues/2613 | 相关说明 | -| 开发引入 | / | FCN8s/mmseg/models/decode_heads/lraspp_head.py | https://ieeexplore.ieee.org/document/9008835 | 相关说明 | -| 开发引入 | / | FCN8s/mmcv_replace/ops/csrc/modulated_deform_conv_cuda_kernel.cuh | https://github.com/chengdazhi/Deformable-Convolution-V2-PyTorch/blob/mmdetection/mmdet/ops/dcn/src/deform_conv_cuda_kernel.cu | 源码实现 | -| 开发引入 | / | FCN8s/mmseg/models/utils/make_divisible.py | 
https://github.com/tensorflow/models/blob/master/research/slim/nets/mobilenet/mobilenet.py | 源码实现 | -| 开发引入 | / | FCN8s/mmcv_replace/ops/csrc/box_iou_rotated_utils.hpp | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_utils.h | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git | FCN8s/mmcv_replace/cnn/utils/weight_init.py | http://proceedings.mlr.press/v9/glorot10a/glorot10a.pdf | 论文地址 | -| 开发引入 | / | FCN8s/mmcv_replace/onnx/simplify/core.py | https://github.com/daquexian/onnx-simplifier | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git | FCN8s/mmcv_replace/cnn/bricks/non_local.py | https://arxiv.org/abs/1711.07971 | 论文地址 | +| 文件位置 | 公网地址 | 公网地址用途 | +|--------------------------------------------------------------------------------------------------------|-------------------------------------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg11_imagenet-01ecd97e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg13_imagenet-9ad3945d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg16_imagenet-91b6d117.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg19_imagenet-fee352a8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg11_bn_imagenet-6fbbbf3f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg13_bn_imagenet-4b5f9390.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg16_bn_imagenet-3ac6d8fd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg19_bn_imagenet-7c058385.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet18_batch256_imagenet_20200708-34ab8f90.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet34_batch256_imagenet_20200708-32ffb4f7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet50_batch256_imagenet_20200708-cfb998bf.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet101_batch256_imagenet_20200708-753f3608.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/mmcls.json | 
https://download.openmmlab.com/mmclassification/v0/resnet/resnet152_batch256_imagenet_20200708-ec25b1f9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnetv1d50_batch256_imagenet_20200708-1ad0ce94.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnetv1d101_batch256_imagenet_20200708-9cb302ef.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnetv1d152_batch256_imagenet_20200708-e79cb6a2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnext/resnext50_32x4d_batch256_imagenet_20200708-c07adbb7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnext/resnext101_32x4d_batch256_imagenet_20200708-87f2d1c9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnext/resnext101_32x8d_batch256_imagenet_20200708-1ec34aa7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnext/resnext152_32x4d_batch256_imagenet_20200708-aab5034c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/se-resnet/se-resnet50_batch256_imagenet_20200804-ae206104.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/se-resnet/se-resnet101_batch256_imagenet_20200804-ba5b51d4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnest/resnest50_imagenet_converted-1ebf0afe.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnest/resnest101_imagenet_converted-032caa52.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnest/resnest200_imagenet_converted-581a60f2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnest/resnest269_imagenet_converted-59930960.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/shufflenet_v1/shufflenet_v1_batch1024_imagenet_20200804-5d6cec73.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/shufflenet_v2/shufflenet_v2_batch1024_imagenet_20200812-5bf4721e.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/mobilenet_v2/mobilenet_v2_batch256_imagenet_20200708-3b2dc3af.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/vgg16_caffe-292e1171.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_caffe-788b5fa3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_msra-5891d200.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_caffe-3ad79236.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_msra-6cc46731.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_32x8d-1516f1aa.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext50-32x4d-0ab1a123.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_32x4d-a5af3160.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_64x4d-ee2c6f71.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_gn_thangvubk-ad1730dd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_gn-9186a21c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_gn-cac0ab98.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_gn_ws-15beedd8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_gn_ws-3e3c308c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext50_32x4d_gn_ws-0d87ac85.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_32x4d_gn_ws-34ac1a9e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/open_mmlab.json | 
https://download.openmmlab.com/pretrain/third_party/resnext50_32x4d_gn-c7e8b754.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_32x4d_gn-ac3bb84e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w18_small-b5a04e21.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w18-00eb2006.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w32-dc9eeb4f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w40-ed0b031c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w48-d2186c55.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/bn_inception_caffe-ed2e8665.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/i3d_r50_f32s2_k400-2c57e077.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/nl3d_r50_f32s2_k400-fa7e7caa.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/res2net101_v1d_26w_4s_mmdetv2-f0a600f9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_400mf-a5b10d96.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_800mf-1f4be4c7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_1.6gf-5791c176.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_3.2gf-c2599b0f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_4.0gf-a88f671e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_6.4gf-006af45d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_8.0gf-3c68abe7.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_12gf-4c2a3350.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet18_v1c-b5776b93.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_v1c-2cccc1ad.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_v1c-e67eebb6.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/mmediting/third_party/vgg_state_dict.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/mmediting/third_party/model_best_resnet34_En_nomixup.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/mmediting/third_party/mobilenet_v2.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/mobilenet_v3_large-bc2c3fd3.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/mobilenet_v3_small-47085aa1.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnest50_d2-7497a55b.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnest101_d2-f3b931b2.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnest200_d2-ca88e41f.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/darknet53-a628ea1b.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/setup.py | openmmlab@gmail.com | 作者邮箱 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FCN8s/url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 数据集地址 |
\ No newline at end of file
diff --git a/PyTorch/contrib/cv/semantic_segmentation/FFHQ_ID2978_for_Pytorch/public_address_statement.md b/PyTorch/contrib/cv/semantic_segmentation/FFHQ_ID2978_for_Pytorch/public_address_statement.md
index 3e8aa489ec98319b2bdcb93e666f372f0a1a462f..7e0fe02ee71f24e85d36c6b807d8f6bbcdc987c3 100644
--- a/PyTorch/contrib/cv/semantic_segmentation/FFHQ_ID2978_for_Pytorch/public_address_statement.md
+++ b/PyTorch/contrib/cv/semantic_segmentation/FFHQ_ID2978_for_Pytorch/public_address_statement.md
@@ -1,7 +1,5 @@
-| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | 
-|--------|----------------------------------------------------|------------------------------------|------------------------------------------------------------|---------| -| 开源代码引入 | https://github.com/chenxi116/DeepLabv3.pytorch.git | FFHQ_ID2978_for_Pytorch/deeplab.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/chenxi116/DeepLabv3.pytorch.git | FFHQ_ID2978_for_Pytorch/deeplab.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/chenxi116/DeepLabv3.pytorch.git | FFHQ_ID2978_for_Pytorch/deeplab.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/chenxi116/DeepLabv3.pytorch.git | FFHQ_ID2978_for_Pytorch/run_deeplab.py | https://drive.google.com/uc?id=1oRGgrI4KNdefbWVpw0rRkEP1gbJIRokM | 文件下载 | -| 开源代码引入 | https://github.com/chenxi116/DeepLabv3.pytorch.git | FFHQ_ID2978_for_Pytorch/run_deeplab.py | https://drive.google.com/uc?id=1w2XjDywFr2NjuUWaLQDRktH7VwIfuNlY | 文件下载 | +| 文件位置 | 公网地址 | 公网地址用途 | +|----------------------------------------------------------------------------------------------|------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FFHQ_ID2978_for_Pytorch/deeplab.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FFHQ_ID2978_for_Pytorch/deeplab.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FFHQ_ID2978_for_Pytorch/deeplab.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/semantic_segmentation/FastSCNN/public_address_statement.md b/PyTorch/contrib/cv/semantic_segmentation/FastSCNN/public_address_statement.md index d3b3f7b9c9c1c56d20236b7a06feedb2bd5b6f26..fa620b00ada76e9ae9c09d24f30abcca2ac3376c 100644 --- a/PyTorch/contrib/cv/semantic_segmentation/FastSCNN/public_address_statement.md +++ b/PyTorch/contrib/cv/semantic_segmentation/FastSCNN/public_address_statement.md @@ -1,43 +1,17 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|-------------------------------------------------------------------------------------------|---------------------------------------------------|-------------------------------------------------------------------------------------------------------|---------| -| 开发引入 | / | FastSCNN/requirements.txt | http://github.com/jinfagang/DCNv2_latest.git@c14535d60cf781927dbce96e4f809630d424c6b5#egg=DCNv2 | 下载依赖 | -| 开发引入 | / | FastSCNN/requirements.txt | https://github.com/svenkreiss/poseval.git@3128c5cbcf90946e5164ff438ad651e113e64613 | 下载依赖 | -| 开发引入 | / | FastSCNN/requirements.txt | https://github.com/rwightman/pytorch-image-models.git@5aca7c01e53e4a2d4c8773d90dd86ed23574d755 | 下载依赖 | -| 开源代码引入 | https://github.com/LikeLy-Journey/SegmenTron.git/segmentron/data/downloader/ade20k.py | FastSCNN/segmentron/data/downloader/ade20k.py | http://data.csail.mit.edu/places/ADEchallenge/ADEChallengeData2016.zip | 下载数据集 | -| 开源代码引入 | https://github.com/LikeLy-Journey/SegmenTron.git/segmentron/data/downloader/ade20k.py | FastSCNN/segmentron/data/downloader/ade20k.py | http://data.csail.mit.edu/places/ADEchallenge/release_test.zip | 下载数据集 | -| 开源代码引入 | https://github.com/LikeLy-Journey/SegmenTron.git/segmentron/data/downloader/mscoco.py | 
FastSCNN/segmentron/data/downloader/mscoco.py | http://images.cocodataset.org/zips/train2017.zip | 下载数据集 | -| 开源代码引入 | https://github.com/LikeLy-Journey/SegmenTron.git/segmentron/data/downloader/mscoco.py | FastSCNN/segmentron/data/downloader/mscoco.py | http://images.cocodataset.org/annotations/annotations_trainval2017.zip | 下载数据集 | -| 开源代码引入 | https://github.com/LikeLy-Journey/SegmenTron.git/segmentron/data/downloader/mscoco.py | FastSCNN/segmentron/data/downloader/mscoco.py | http://images.cocodataset.org/zips/val2017.zip | 下载数据集 | -| 开源代码引入 | https://github.com/LikeLy-Journey/SegmenTron.git/segmentron/data/downloader/pascal_voc.py | FastSCNN/segmentron/data/downloader/pascal_voc.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2007/VOCtrainval_06-Nov-2007.tar | 下载数据集 | -| 开源代码引入 | https://github.com/LikeLy-Journey/SegmenTron.git/segmentron/data/downloader/pascal_voc.py | FastSCNN/segmentron/data/downloader/pascal_voc.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2007/VOCtest_06-Nov-2007.tar | 下载数据集 | -| 开源代码引入 | https://github.com/LikeLy-Journey/SegmenTron.git/segmentron/data/downloader/pascal_voc.py | FastSCNN/segmentron/data/downloader/pascal_voc.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2012/VOCtrainval_11-May-2012.tar | 下载数据集 | -| 开源代码引入 | https://github.com/LikeLy-Journey/SegmenTron.git/segmentron/data/downloader/pascal_voc.py | FastSCNN/segmentron/data/downloader/pascal_voc.py | http://www.eecs.berkeley.edu/Research/Projects/CS/vision/grouping/semantic_contours/benchmark.tgz | 下载数据集 | -| 开源代码引入 | https://github.com/LikeLy-Journey/SegmenTron.git/segmentron/data/downloader/sbu_shadow.py | FastSCNN/segmentron/data/downloader/sbu_shadow.py | http://www3.cs.stonybrook.edu/~cvl/content/datasets/shadow_db/SBU-shadow.zip | 下载数据集 | -| 开源代码引入 | https://github.com/LikeLy-Journey/SegmenTron.git/segmentron/models/backbones/build.py | FastSCNN/segmentron/models/backbones/build.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/LikeLy-Journey/SegmenTron.git/segmentron/models/backbones/build.py | FastSCNN/segmentron/models/backbones/build.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/LikeLy-Journey/SegmenTron.git/segmentron/models/backbones/build.py | FastSCNN/segmentron/models/backbones/build.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/LikeLy-Journey/SegmenTron.git/segmentron/models/backbones/build.py | FastSCNN/segmentron/models/backbones/build.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/LikeLy-Journey/SegmenTron.git/segmentron/models/backbones/build.py | FastSCNN/segmentron/models/backbones/build.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/LikeLy-Journey/SegmenTron.git/segmentron/models/backbones/build.py | FastSCNN/segmentron/models/backbones/build.py | https://github.com/LikeLy-Journey/SegmenTron/releases/download/v0.1.0/resnet50-25c4b509.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/LikeLy-Journey/SegmenTron.git/segmentron/models/backbones/build.py | FastSCNN/segmentron/models/backbones/build.py | https://github.com/LikeLy-Journey/SegmenTron/releases/download/v0.1.0/resnet101-2a57e44d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/LikeLy-Journey/SegmenTron.git/segmentron/models/backbones/build.py | FastSCNN/segmentron/models/backbones/build.py | 
https://github.com/LikeLy-Journey/SegmenTron/releases/download/v0.1.0/resnet152-0d43d698.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/LikeLy-Journey/SegmenTron.git/segmentron/models/backbones/build.py | FastSCNN/segmentron/models/backbones/build.py | https://github.com/LikeLy-Journey/SegmenTron/releases/download/v0.1.0/tf-xception65-270e81cf.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/LikeLy-Journey/SegmenTron.git/segmentron/models/backbones/build.py | FastSCNN/segmentron/models/backbones/build.py | https://github.com/LikeLy-Journey/SegmenTron/releases/download/v0.1.0/hrnet-w18-small-v1-08f8ae64.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/LikeLy-Journey/SegmenTron.git/segmentron/models/backbones/build.py | FastSCNN/segmentron/models/backbones/build.py | https://github.com/LikeLy-Journey/SegmenTron/releases/download/v0.1.0/mobilenetV2-15498621.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/LikeLy-Journey/SegmenTron.git/setup.py | FastSCNN/setup.py | https://github.com/LikeLy-Journey/SegmenTron | 源码地址 | -| 开源代码引入 | https://github.com/LikeLy-Journey/SegmenTron.git/segmentron/models/fpenet.py | FastSCNN/segmentron/models/fpenet.py | https://arxiv.org/abs/1706.02677 | 论文地址 | -| 开源代码引入 | https://github.com/LikeLy-Journey/SegmenTron.git/segmentron/models/unet.py | FastSCNN/segmentron/models/unet.py | https://github.com/xiaopeng-liao/Pytorch-UNet/commit/8ebac70e633bac59fc22bb5195e513d5832fb3bd | 相关说明 | -| 开源代码引入 | https://github.com/LikeLy-Journey/SegmenTron.git/segmentron/modules/sync_bn/syncbn.py | FastSCNN/segmentron/modules/sync_bn/syncbn.py | zhang.hang@rutgers.edu | 邮箱地址 | -| 开源代码引入 | https://github.com/LikeLy-Journey/SegmenTron.git/segmentron/data/downloader/mscoco.py | FastSCNN/segmentron/data/downloader/mscoco.py | http://images.cocodataset.org/zips/test2017.zip | 数据集地址 | -| 开源代码引入 | https://github.com/LikeLy-Journey/SegmenTron.git/segmentron/solver/lovasz_losses.py | FastSCNN/segmentron/solver/lovasz_losses.py | https://github.com/bermanmaxim/LovaszSoftmax/blob/master/pytorch/lovasz_losses.py | 源码实现 | -| 开源代码引入 | https://github.com/LikeLy-Journey/SegmenTron.git/segmentron/modules/batch_norm.py | FastSCNN/segmentron/modules/batch_norm.py | https://github.com/pytorch/pytorch/blob/master/torch/nn/modules/batchnorm.py | 源码实现 | -| 开源代码引入 | https://github.com/LikeLy-Journey/SegmenTron.git/segmentron/modules/sync_bn/syncbn.py | FastSCNN/segmentron/modules/sync_bn/syncbn.py | https://github.com/zhanghang1989/PyTorch-Encoding/blob/master/encoding/nn/syncbn.py | 源码实现 | -| 开源代码引入 | https://github.com/LikeLy-Journey/SegmenTron.git/segmentron/solver/lr_scheduler.py | FastSCNN/segmentron/solver/lr_scheduler.py | https://arxiv.org/abs/1706.02677 | 论文地址 | -| 开源代码引入 | https://github.com/LikeLy-Journey/SegmenTron.git/segmentron/utils/distributed.py | FastSCNN/segmentron/utils/distributed.py | https://github.com/facebookresearch/maskrcnn-benchmark | 源码实现 | -| 开源代码引入 | https://github.com/LikeLy-Journey/SegmenTron.git/segmentron/models/danet.py | FastSCNN/segmentron/models/danet.py | https://arxiv.org/abs/1809.02983.pdf | 论文地址 | -| 开源代码引入 | https://github.com/LikeLy-Journey/SegmenTron.git/segmentron/data/downloader/mscoco.py | FastSCNN/segmentron/data/downloader/mscoco.py | http://images.cocodataset.org/annotations/stuff_annotations_trainval2017.zip | 数据集地址 | -| 开源代码引入 | https://github.com/LikeLy-Journey/SegmenTron.git/segmentron/utils/parallel.py | FastSCNN/segmentron/utils/parallel.py | https://github.com/zhanghang1989/PyTorch-Encoding/blob/master/encoding/parallel.py | 源码实现 | -| 开源代码引入 | 
https://github.com/LikeLy-Journey/SegmenTron.git/segmentron/models/unet.py | FastSCNN/segmentron/models/unet.py | https://github.com/HaiyongJiang/U-Net-Pytorch-Unstructured-Buggy/commit/0e854509c2cea854e247a9c615f175f76fbb2e3a | 源码实现 | -| 开发引入 | / | FastSCNN/requirements.txt | http://github.com/jinfagang/DCNv2_latest.git@c14535d60cf781927dbce96e4f809630d424c6b5#egg=DCNv2 | 相关依赖 | -| 开发引入 | / | FastSCNN/requirements.txt | https://github.com/svenkreiss/poseval.git@3128c5cbcf90946e5164ff438ad651e113e64613 | 相关依赖 | -| 开发引入 | / | FastSCNN/requirements.txt | https://github.com/rwightman/pytorch-image-models.git@5aca7c01e53e4a2d4c8773d90dd86ed23574d755 | 相关依赖 | +| 文件位置 | 公网地址 | 公网地址用途 | +|-------------------------------------------------------------------------------------------------------------|---------------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FastSCNN/segmentron/data/downloader/ade20k.py | http://data.csail.mit.edu/places/ADEchallenge/ADEChallengeData2016.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FastSCNN/segmentron/data/downloader/ade20k.py | http://data.csail.mit.edu/places/ADEchallenge/release_test.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FastSCNN/segmentron/data/downloader/mscoco.py | http://images.cocodataset.org/zips/train2017.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FastSCNN/segmentron/data/downloader/mscoco.py | http://images.cocodataset.org/annotations/annotations_trainval2017.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FastSCNN/segmentron/data/downloader/mscoco.py | http://images.cocodataset.org/zips/val2017.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FastSCNN/segmentron/data/downloader/pascal_voc.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2007/VOCtrainval_06-Nov-2007.tar | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FastSCNN/segmentron/data/downloader/pascal_voc.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2007/VOCtest_06-Nov-2007.tar | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FastSCNN/segmentron/data/downloader/pascal_voc.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2012/VOCtrainval_11-May-2012.tar | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FastSCNN/segmentron/data/downloader/pascal_voc.py | http://www.eecs.berkeley.edu/Research/Projects/CS/vision/grouping/semantic_contours/benchmark.tgz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FastSCNN/segmentron/data/downloader/sbu_shadow.py | http://www3.cs.stonybrook.edu/~cvl/content/datasets/shadow_db/SBU-shadow.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FastSCNN/segmentron/models/backbones/build.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FastSCNN/segmentron/models/backbones/build.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FastSCNN/segmentron/models/backbones/build.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FastSCNN/segmentron/models/backbones/build.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/FastSCNN/segmentron/models/backbones/build.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/semantic_segmentation/HRNet_SEG_for_Pytorch/public_address_statement.md b/PyTorch/contrib/cv/semantic_segmentation/HRNet_SEG_for_Pytorch/public_address_statement.md index 31bcf5316aaf288a91e933f2b5d4ffc2cd3c6b31..c9017df374a5496dede977eb344c799f171b0d9d 100644 --- a/PyTorch/contrib/cv/semantic_segmentation/HRNet_SEG_for_Pytorch/public_address_statement.md +++ b/PyTorch/contrib/cv/semantic_segmentation/HRNet_SEG_for_Pytorch/public_address_statement.md @@ -1,25 +1,8 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|----------------------------------------------------------|-------------------------------------------|------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|---------| -| 开源代码引入 | https://github.com/HRNet/HRNet-Semantic-Segmentation.git | HRNet_SEG_for_Pytorch/lib/models/hrnet.py | https://opr0mq.dm.files.1drv.com/y4mIoWpP2n-LUohHHANpC0jrOixm1FZgO2OsUtP2DwIozH5RsoYVyv_De5wDgR6XuQmirMV3C0AljLeB-zQXevfLlnQpcNeJlT9Q8LwNYDwh3TsECkMTWXCUn3vDGJWpCxQcQWKONr5VQWO1hLEKPeJbbSZ6tgbWwJHgHF7592HY7ilmGe39o5BhHz7P9QqMYLBts6V7QGoaKrr0PL3wvvR4w | 下载预训练模型 | -| 开源代码引入 | https://github.com/HRNet/HRNet-Semantic-Segmentation.git | HRNet_SEG_for_Pytorch/lib/models/hrnet.py | https://opr74a.dm.files.1drv.com/y4mKOuRSNGQQlp6wm_a9bF-UEQwp6a10xFCLhm4bqjDu6aSNW9yhDRM7qyx0vK0WTh42gEaniUVm3h7pg0H-W0yJff5qQtoAX7Zze4vOsqjoIthp-FW3nlfMD0-gcJi8IiVrMWqVOw2N3MbCud6uQQrTaEAvAdNjtjMpym1JghN-F060rSQKmgtq5R-wJe185IyW4-_c5_ItbhYpCyLxdqdEQ | 下载预训练模型 | -| 开源代码引入 | https://github.com/HRNet/HRNet-Semantic-Segmentation.git | HRNet_SEG_for_Pytorch/lib/models/hrnet.py | https://optgaw.dm.files.1drv.com/y4mWNpya38VArcDInoPaL7GfPMgcop92G6YRkabO1QTSWkCbo7djk8BFZ6LK_KHHIYE8wqeSAChU58NVFOZEvqFaoz392OgcyBrq_f8XGkusQep_oQsuQ7DPQCUrdLwyze_NlsyDGWot0L9agkQ-M_SfNr10ETlCF5R7BdKDZdupmcMXZc-IE3Ysw1bVHdOH4l-XEbEKFAi6ivPUbeqlYkRMQ | 下载预训练模型 | -| 开源代码引入 | https://github.com/HRNet/HRNet-Semantic-Segmentation.git | HRNet_SEG_for_Pytorch/lib/models/hrnet.py | https://optgaw.dm.files.1drv.com/y4mWNpya38VArcDInoPaL7GfPMgcop92G6YRkabO1QTSWkCbo7djk8BFZ6LK_KHHIYE8wqeSAChU58NVFOZEvqFaoz392OgcyBrq_f8XGkusQep_oQsuQ7DPQCUrdLwyze_NlsyDGWot0L9agkQ-M_SfNr10ETlCF5R7BdKDZdupmcMXZc-IE3Ysw1bVHdOH4l-XEbEKFAi6ivPUbeqlYkRMQ | 下载预训练模型 | -| 开源代码引入 | https://github.com/HRNet/HRNet-Semantic-Segmentation.git | HRNet_SEG_for_Pytorch/lib/models/hrnet.py | https://optgaw.dm.files.1drv.com/y4mWNpya38VArcDInoPaL7GfPMgcop92G6YRkabO1QTSWkCbo7djk8BFZ6LK_KHHIYE8wqeSAChU58NVFOZEvqFaoz392OgcyBrq_f8XGkusQep_oQsuQ7DPQCUrdLwyze_NlsyDGWot0L9agkQ-M_SfNr10ETlCF5R7BdKDZdupmcMXZc-IE3Ysw1bVHdOH4l-XEbEKFAi6ivPUbeqlYkRMQ | 下载预训练模型 | -| 开源代码引入 | https://github.com/HRNet/HRNet-Semantic-Segmentation.git | HRNet_SEG_for_Pytorch/lib/models/hrnet.py | https://github.com/XingangPan/IBN-Net/releases/download/v1.0/resnet50_ibn_a-d9d0bb7b.pth | 预训练模型 | -| 开源代码引入 | https://github.com/HRNet/HRNet-Semantic-Segmentation.git | HRNet_SEG_for_Pytorch/lib/models/hrnet.py | https://github.com/XingangPan/IBN-Net/releases/download/v1.0/resnet50_ibn_b-9ca61e85.pth | 预训练模型 | -| 开源代码引入 | https://github.com/HRNet/HRNet-Semantic-Segmentation.git | 
HRNet_SEG_for_Pytorch/lib/utils/modelsummary.py | Bin.Xiao@microsoft.com | 邮箱地址 | -| 开源代码引入 | https://github.com/HRNet/HRNet-Semantic-Segmentation.git | HRNet_SEG_for_Pytorch/lib/models/hrnet.py | yhyuan@pku.edu.cn | 邮箱地址 | -| 开源代码引入 | https://github.com/HRNet/HRNet-Semantic-Segmentation.git | HRNet_SEG_for_Pytorch/lib/models/hrnet.py | https://github.com/XingangPan/IBN-Net/releases/download/v1.0/resnet101_ibn_a-59ea0ac6.pth | 预训练模型 | -| 开源代码引入 | https://github.com/HRNet/HRNet-Semantic-Segmentation.git | HRNet_SEG_for_Pytorch/lib/datasets/pascal_ctx.py | sunk@mail.ustc.edu.cn | 邮箱地址 | -| 开源代码引入 | https://github.com/HRNet/HRNet-Semantic-Segmentation.git | HRNet_SEG_for_Pytorch/lib/utils/modelsummary.py | sunk@mail.ustc.edu.cn | 邮箱地址 | -| 开源代码引入 | https://github.com/HRNet/HRNet-Semantic-Segmentation.git | HRNet_SEG_for_Pytorch/lib/datasets/ade20k.py | sunk@mail.ustc.edu.cn | 邮箱地址 | -| 开源代码引入 | https://github.com/HRNet/HRNet-Semantic-Segmentation.git | HRNet_SEG_for_Pytorch/lib/models/seg_hrnet.py | sunk@mail.ustc.edu.cn | 邮箱地址 | -| 开源代码引入 | https://github.com/HRNet/HRNet-Semantic-Segmentation.git | HRNet_SEG_for_Pytorch/lib/models/hrnet.py | https://github.com/XingangPan/IBN-Net/releases/download/v1.0/resnet18_ibn_a-2f571257.pth | 预训练模型 | -| 开源代码引入 | https://github.com/HRNet/HRNet-Semantic-Segmentation.git | HRNet_SEG_for_Pytorch/lib/datasets/lip.py | sunk@mail.ustc.edu.cn | 邮箱地址 | -| 开源代码引入 | https://github.com/HRNet/HRNet-Semantic-Segmentation.git | HRNet_SEG_for_Pytorch/lib/utils/utils.py | https://discuss.pytorch.org/t/dataparallel-imbalanced-memory-usage/22551/21 | 源码实现 | -| 开源代码引入 | https://github.com/HRNet/HRNet-Semantic-Segmentation.git | HRNet_SEG_for_Pytorch/lib/models/hrnet.py | https://github.com/XingangPan/IBN-Net/releases/download/v1.0/resnet34_ibn_a-94bc1577.pth | 预训练模型 | -| 开源代码引入 | https://github.com/HRNet/HRNet-Semantic-Segmentation.git | HRNet_SEG_for_Pytorch/lib/models/hrnet.py | https://github.com/XingangPan/IBN-Net/releases/download/v1.0/resnet18_ibn_b-bc2f3c11.pth | 预训练模型 | -| 开源代码引入 | https://github.com/HRNet/HRNet-Semantic-Segmentation.git | HRNet_SEG_for_Pytorch/lib/datasets/pascal_ctx.py | https://github.com/zhanghang1989/PyTorch-Encoding | 源码实现 | -| 开源代码引入 | https://github.com/HRNet/HRNet-Semantic-Segmentation.git | HRNet_SEG_for_Pytorch/lib/models/hrnet.py | https://github.com/XingangPan/IBN-Net/releases/download/v1.0/resnet34_ibn_b-04134c37.pth | 预训练模型 | -| 开源代码引入 | https://github.com/HRNet/HRNet-Semantic-Segmentation.git | HRNet_SEG_for_Pytorch/lib/models/hrnet.py | https://github.com/XingangPan/IBN-Net/releases/download/v1.0/resnet101_ibn_b-c55f6dba.pth | 预训练模型 | -| 开源代码引入 | https://github.com/HRNet/HRNet-Semantic-Segmentation.git | HRNet_SEG_for_Pytorch/lib/datasets/cocostuff.py | sunk@mail.ustc.edu.cn | 邮箱地址 | +| 文件位置 | 公网地址 | 公网地址用途 | +|-----------------------------------------------------------------------------------------------------|------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/HRNet_SEG_for_Pytorch/lib/models/hrnet.py | https://optgaw.dm.files.1drv.com/y4mWNpya38VArcDInoPaL7GfPMgcop92G6YRkabO1QTSWkCbo7djk8BFZ6LK_KHHIYE8wqeSAChU58NVFOZEvqFaoz392OgcyBrq_f8XGkusQep_oQsuQ7DPQCUrdLwyze_NlsyDGWot0L9agkQ-M_SfNr10ETlCF5R7BdKDZdupmcMXZc-IE3Ysw1bVHdOH4l-XEbEKFAi6ivPUbeqlYkRMQ | 模型权重 
| +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/HRNet_SEG_for_Pytorch/lib/models/hrnet.py | https://optgaw.dm.files.1drv.com/y4mWNpya38VArcDInoPaL7GfPMgcop92G6YRkabO1QTSWkCbo7djk8BFZ6LK_KHHIYE8wqeSAChU58NVFOZEvqFaoz392OgcyBrq_f8XGkusQep_oQsuQ7DPQCUrdLwyze_NlsyDGWot0L9agkQ-M_SfNr10ETlCF5R7BdKDZdupmcMXZc-IE3Ysw1bVHdOH4l-XEbEKFAi6ivPUbeqlYkRMQ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/HRNet_SEG_for_Pytorch/lib/models/hrnet.py | https://optgaw.dm.files.1drv.com/y4mWNpya38VArcDInoPaL7GfPMgcop92G6YRkabO1QTSWkCbo7djk8BFZ6LK_KHHIYE8wqeSAChU58NVFOZEvqFaoz392OgcyBrq_f8XGkusQep_oQsuQ7DPQCUrdLwyze_NlsyDGWot0L9agkQ-M_SfNr10ETlCF5R7BdKDZdupmcMXZc-IE3Ysw1bVHdOH4l-XEbEKFAi6ivPUbeqlYkRMQ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/HRNet_SEG_for_Pytorch/lib/models/hrnet.py | https://opr74a.dm.files.1drv.com/y4mKOuRSNGQQlp6wm_a9bF-UEQwp6a10xFCLhm4bqjDu6aSNW9yhDRM7qyx0vK0WTh42gEaniUVm3h7pg0H-W0yJff5qQtoAX7Zze4vOsqjoIthp-FW3nlfMD0-gcJi8IiVrMWqVOw2N3MbCud6uQQrTaEAvAdNjtjMpym1JghN-F060rSQKmgtq5R-wJe185IyW4-_c5_ItbhYpCyLxdqdEQ | 模型权重 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/HRNet_SEG_for_Pytorch/lib/models/hrnet.py | https://opr0mq.dm.files.1drv.com/y4mIoWpP2n-LUohHHANpC0jrOixm1FZgO2OsUtP2DwIozH5RsoYVyv_De5wDgR6XuQmirMV3C0AljLeB-zQXevfLlnQpcNeJlT9Q8LwNYDwh3TsECkMTWXCUn3vDGJWpCxQcQWKONr5VQWO1hLEKPeJbbSZ6tgbWwJHgHF7592HY7ilmGe39o5BhHz7P9QqMYLBts6V7QGoaKrr0PL3wvvR4w | 模型权重 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/HRNet_SEG_for_Pytorch/tools/train_npu.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/semantic_segmentation/HRnet-OCR/public_address_statement.md b/PyTorch/contrib/cv/semantic_segmentation/HRnet-OCR/public_address_statement.md index a440e1f6976fb4091a425460f15a7274e58d663b..602b27ec9d77fefba4e44c65659cb92a8ddd4b78 100644 --- a/PyTorch/contrib/cv/semantic_segmentation/HRnet-OCR/public_address_statement.md +++ b/PyTorch/contrib/cv/semantic_segmentation/HRnet-OCR/public_address_statement.md @@ -1,45 +1,7 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|---------------------------------------------------------|-----------------------------|------------------------------------------------------------------------------|---------| -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | HRnet-OCR/network/Resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | HRnet-OCR/network/Resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | HRnet-OCR/network/Resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | HRnet-OCR/network/Resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | HRnet-OCR/network/Resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 下载预训练模型 | -| 开发引入 | / | HRnet-OCR/url.ini | https://github.com/NVIDIA/apex.git | 下载依赖 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | HRnet-OCR/network/Resnet.py | http://data.lip6.fr/cadene/pretrainedmodels/se_resnext50_32x4d-a260b3a4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | HRnet-OCR/network/Resnet.py | 
http://data.lip6.fr/cadene/pretrainedmodels/se_resnext101_32x4d-3b2fe3d8.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | HRnet-OCR/loss/rmi.py | https://github.com/pytorch/pytorch/issues/7500 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | HRnet-OCR/network/xception.py | https://github.com/jfzhang95/pytorch-deeplab-xception/blob/master/modeling/backbone/xception.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | HRnet-OCR/utils/f_boundary.py | https://github.com/fperazzi/davis/blob/master/python/lib/davis/measures/f_boundary.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | HRnet-OCR/network/hrnetv2.py | https://github.com/HRNet/HRNet-Semantic-Segmentation/tree/HRNet-OCR | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | HRnet-OCR/loss/rmi_utils.py | https://github.com/ZJULearning/RMI | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | HRnet-OCR/loss/rmi_utils.py | https://github.com/tensorflow/tensorflow/blob/r1.13/tensorflow/python/ops/linalg/linalg_impl.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | HRnet-OCR/datasets/randaugment.py | https://github.com/rpmcruz/autoaugment/blob/master/transformations.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | HRnet-OCR/datasets/randaugment.py | https://github.com/quark0/darts/blob/master/cnn/utils.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | HRnet-OCR/network/ocrnet.py | https://github.com/HRNet/HRNet-Semantic-Segmentation/tree/HRNet-OCR | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | HRnet-OCR/network/ocr_utils.py | https://github.com/HRNet/HRNet-Semantic-Segmentation | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | HRnet-OCR/network/hrnetv2.py | sunk@mail.ustc.edu.cn","hsfzxjy@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | HRnet-OCR/loss/rmi_utils.py | https://www.pugetsystems.com/labs/hpc/PyTorch-for-Scientific-Computing | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | HRnet-OCR/transforms/transforms.py | https://en.wikipedia.org/wiki/Hue | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | HRnet-OCR/utils/my_data_parallel.py | https://github.com/pytorch/pytorch/blob/master/torch/nn/parallel/data_parallel.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | HRnet-OCR/datasets/randaugment.py | https://github.com/google-research/uda/blob/master/image/randaugment/policies.py#L57 | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | HRnet-OCR/network/ocr_utils.py | sunk@mail.ustc.edu.cn","hsfzxjy@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | HRnet-OCR/network/deepv3.py | https://github.com/sthalles/deeplab_v3 | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | HRnet-OCR/network/utils.py | https://github.com/lingtengqiu/Deeperlab-pytorch/blob/master/seg_opr/seg_oprs.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | HRnet-OCR/datasets/cityscapes_labels.py | www.cityscapes-dataset.net | 数据集地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | HRnet-OCR/network/SEresnext.py | https://github.com/Cadene/pretrained-models.pytorch | 预训练模型 | 
-| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | HRnet-OCR/network/wider_resnet.py | https://github.com/mapillary/inplace_abn/ | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | HRnet-OCR/loss/radam.py | https://github.com/LiyuanLucasLiu/RAdam | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | HRnet-OCR/loss/rmi.py | https://github.com/ZJULearning/RMI | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | HRnet-OCR/loss/rmi_utils.py | https://pytorch.org/docs/stable/nn.html?highlight=f%20pad#torch.nn.functional.pad | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | HRnet-OCR/loss/radam.py | https://arxiv.org/abs/1908.03265 | 论文地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | HRnet-OCR/datasets/randaugment.py | https://github.com/ildoonet/pytorch-randaugment | 相关说明 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | HRnet-OCR/transforms/transforms.py | https://github.com/zijundeng/pytorch-semantic-segmentation/blob/master/utils/transforms.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | HRnet-OCR/datasets/cityscapes_labels.py | https://github.com/mcordts/cityscapesScripts/ | 数据集地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | HRnet-OCR/datasets/sampler.py | https://github.com/pytorch/pytorch/blob/master/torch/utils/data/distributed.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | HRnet-OCR/utils/f_boundary.py | dmartin@eecs.berkeley.edu | 邮箱地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | HRnet-OCR/transforms/joint_transforms.py | https://github.com/zijundeng/pytorch-semantic-segmentation/blob/master/utils/joint_transforms.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | HRnet-OCR/datasets/cityscapes_labels.py | https://github.com/mcordts/cityscapesScripts/blob/master/license.txt | license地址 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | HRnet-OCR/network/Resnet.py | https://github.com/pytorch/vision/blob/master/torchvision/models/resnet.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | HRnet-OCR/utils/attr_dict.py | https://github.com/facebookresearch/Detectron/blob/master/detectron/utils/collections.py | 源码实现 | -| 开源代码引入 | https://github.com/huggingface/pytorch-image-models.git | HRnet-OCR/config.py | https://github.com/facebookresearch/Detectron/blob/master/detectron/core/config.py | 源码实现 | +| 文件位置 | 公网地址 | 公网地址用途 | +|---------------------------------------------------------------------------------------|------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/HRnet-OCR/network/Resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/HRnet-OCR/network/Resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/HRnet-OCR/network/Resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/HRnet-OCR/network/Resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/HRnet-OCR/network/Resnet.py | 
https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/semantic_segmentation/ICNet_ID1781_for_PyTorch/public_address_statement.md b/PyTorch/contrib/cv/semantic_segmentation/ICNet_ID1781_for_PyTorch/public_address_statement.md index 30575d628208fac41267213b1ce34523eff05fb9..344b2f8dff80938d02a20d153d00b416a4bd89ea 100644 --- a/PyTorch/contrib/cv/semantic_segmentation/ICNet_ID1781_for_PyTorch/public_address_statement.md +++ b/PyTorch/contrib/cv/semantic_segmentation/ICNet_ID1781_for_PyTorch/public_address_statement.md @@ -1,11 +1,11 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|-----------------------------------------------------------------|----------------------------------------------------------|------------------------------------------------------------|---------| -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch | ICNet_ID1781_for_PyTorch/models/base_models/resnetv1b.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch | ICNet_ID1781_for_PyTorch/models/base_models/resnetv1b.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch | ICNet_ID1781_for_PyTorch/models/base_models/resnetv1b.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch | ICNet_ID1781_for_PyTorch/models/base_models/resnetv1b.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch | ICNet_ID1781_for_PyTorch/models/base_models/resnetv1b.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch | ICNet_ID1781_for_PyTorch/models/model_store.py | https://hangzh.s3.amazonaws.com/ | 仓库地址 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch | ICNet_ID1781_for_PyTorch/utils/logger.py | https://github.com/facebookresearch/maskrcnn-benchmark/blob/master/maskrcnn_benchmark/utils/logger.py | 源码实现 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch | ICNet_ID1781_for_PyTorch/utils/lr_scheduler.py | https://blog.csdn.net/mieleizhi0522/article/details/83113824 | 论文地址 | -| 开源代码引入 | https://github.com/Tramac/awesome-semantic-segmentation-pytorch | ICNet_ID1781_for_PyTorch/dataset/cityscapes.py | https://github.com/mcordts/cityscapesScripts/blob/master/cityscapesscripts/helpers/labels.py | 源码实现 | +| 文件位置 | 公网地址 | 公网地址用途 | +|--------------------------------------------------------------------------------------------------------------------|------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/ICNet_ID1781_for_PyTorch/models/base_models/resnetv1b.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/ICNet_ID1781_for_PyTorch/models/base_models/resnetv1b.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/ICNet_ID1781_for_PyTorch/models/base_models/resnetv1b.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/ICNet_ID1781_for_PyTorch/models/base_models/resnetv1b.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/ICNet_ID1781_for_PyTorch/models/base_models/resnetv1b.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/ICNet_ID1781_for_PyTorch/models/base_models/resnetv1b.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/ICNet_ID1781_for_PyTorch/models/base_models/resnetv1b.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/ICNet_ID1781_for_PyTorch/models/base_models/resnetv1b.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/ICNet_ID1781_for_PyTorch/models/model_store.py | https://hangzh.s3.amazonaws.com/ | 相关说明 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/semantic_segmentation/IntraDA/public_address_statement.md b/PyTorch/contrib/cv/semantic_segmentation/IntraDA/public_address_statement.md index ed64ce5118f5c9c2194216d42853987463335641..b8bb3c8124f1ca5aaaaff84ade5bf591e8f13d5d 100644 --- a/PyTorch/contrib/cv/semantic_segmentation/IntraDA/public_address_statement.md +++ b/PyTorch/contrib/cv/semantic_segmentation/IntraDA/public_address_statement.md @@ -1,8 +1,7 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|------------------------------------------|---------------------------|-----------------------------------------------------------------------|---------------| -| 开源代码引入 | https://github.com/feipan664/IntraDA.git | IntraDA/ADVENT/Dockerfile | https://repo.continuum.io/miniconda/Miniconda3-latest-Linux-x86_64.sh | 下载工具 | -| 开源代码引入 | https://github.com/feipan664/IntraDA.git | IntraDA/figure/font.css | https://fonts.gstatic.com/s/lato/v16/S6uyw4BMUTPHjxAwXjeu.woff2 | 前端css文件下载字体样式 | -| 开源代码引入 | https://github.com/feipan664/IntraDA.git | IntraDA/figure/font.css | https://fonts.gstatic.com/s/lato/v16/S6uyw4BMUTPHjx4wXg.woff2 | 前端css文件下载字体样式 | -| 开源代码引入 | https://github.com/feipan664/IntraDA.git | IntraDA/figure/font.css | https://fonts.gstatic.com/s/lato/v16/S6u9w4BMUTPHh6UVSwaPGR_p.woff2 | 前端css文件下载字体样式 | -| 开源代码引入 | https://github.com/feipan664/IntraDA.git | IntraDA/figure/font.css | https://fonts.gstatic.com/s/lato/v16/S6u9w4BMUTPHh6UVSwiPGQ.woff2 | 前端css文件下载字体样式 | -| 开源代码引入 | https://github.com/feipan664/IntraDA.git | IntraDA/ADVENT/advent/domain_adaptation/config.py | https://github.com/rbgirshick/py-faster-rcnn/blob/master/lib/fast_rcnn/config.py | 源码实现 | +| 文件位置 | 公网地址 | 公网地址用途 | +|-------------------------------------------------------------------------------------|-----------------------------------------------------------------------|----------------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/IntraDA/ADVENT/Dockerfile | https://repo.continuum.io/miniconda/Miniconda3-latest-Linux-x86_64.sh | miniconda链接 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/IntraDA/figure/font.css | https://fonts.gstatic.com/s/lato/v16/S6uyw4BMUTPHjxAwXjeu.woff2 | 前端css文件下载字体样式 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/IntraDA/figure/font.css | https://fonts.gstatic.com/s/lato/v16/S6uyw4BMUTPHjx4wXg.woff2 | 前端css文件下载字体样式 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/IntraDA/figure/font.css | https://fonts.gstatic.com/s/lato/v16/S6u9w4BMUTPHh6UVSwiPGQ.woff2 | 前端css文件下载字体样式 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/IntraDA/figure/font.css | https://fonts.gstatic.com/s/lato/v16/S6u9w4BMUTPHh6UVSwaPGR_p.woff2 | 前端css文件下载字体样式 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/public_address_statement.md b/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/public_address_statement.md index 537a9a6b075acc3fd9ebae0c0c0836306b628766..d3f5df5d67e6b7158c1d542406e6cf8aff368d6e 100644 --- a/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/public_address_statement.md +++ b/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/public_address_statement.md @@ -1,1268 +1,861 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ---- | ------------ | ------ | ------------------------------------ | -------- | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation |MMseg-swin/setup.py | http://github.com/open-mmlab/mmsegmentation | setuptools在开源社区的url的配置选项| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/setup.py | MMseg-swin/mmcv/setup.py | openmmlab@gmail.com | setuptools在开源社区的author邮箱的配置选项| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/_base_/models/segmenter_vit-b16_mask.py|MMseg-swin/configs/_base_/models/segmenter_vit-b16_mask.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segmenter/vit_base_p16_384_20220308-96dfe169.pth | segmenter_vit-b16_mask在开源社区上的权重pth链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/_base_/models/twins_pcpvt-s_upernet.py|MMseg-swin/configs/_base_/models/twins_pcpvt-s_fpn.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/twins/pcpvt_small_20220308-e638c41c.pth | pcpvt_small_20220308-e638c41c在开源社区上的权重pth链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/_base_/models/twins_pcpvt-s_upernet.py|MMseg-swin/configs/_base_/models/twins_pcpvt-s_upernet.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/twins/pcpvt_small_20220308-e638c41c.pth | pcpvt_small_20220308-e638c41c在开源社区上的权重pth链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/_base_/models/upernet_convnext.py|MMseg-swin/configs/_base_/models/upernet_convnext.py | https://download.openmmlab.com/mmclassification/v0/convnext/downstream/convnext-base_3rdparty_32xb128-noema_in1k_20220301-2a0ee547.pth | convnext-base_3rdparty_32xb128-noema_in1k_20220301-2a0ee547在开源社区上的权重pth链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/ann/README.md|MMseg-swin/configs/ann/ann.yml | https://arxiv.org/abs/1908.0767 | ann模型yml文件中的论文url链接声明| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/ann/README.md|MMseg-swin/configs/ann/ann.yml | https://github.com/open-mmlab/mmsegmentation/blob/v0.17.0/mmseg/models/decode_heads/ann_head.py#L18 | ann模型在开源社区上的decode_heads下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/ann/README.md|MMseg-swin/configs/ann/ann.yml | https://github.com/MendelXu/AN | ann模型在开源社区上的源码下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/docs/zh_cn/user_guides/visualization_feature_map.md|MMseg-swin/configs/ann/ann.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r50-d8_512x1024_40k_cityscapes/ann_r50-d8_512x1024_40k_cityscapes_20200605_095211-049fc292.pt | ann模型在开源社区上的ann_r50-d8_512x1024_40k_cityscapes_20200605_095211-049fc292.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/ann/README.md|MMseg-swin/configs/ann/ann.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r101-d8_512x1024_40k_cityscapes/ann_r101-d8_512x1024_40k_cityscapes_20200605_095243-adf6eece.pt | ann模型在开源社区上的ann_r101-d8_512x1024_40k_cityscapes_20200605_095243-adf6eece.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/ann/README.md|MMseg-swin/configs/ann/ann.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r50-d8_769x769_40k_cityscapes/ann_r50-d8_769x769_40k_cityscapes_20200530_025712-2b46b04d.pt | ann模型在开源社区上的ann_r50-d8_769x769_40k_cityscapes_20200530_025712-2b46b04d.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/ann/README.md|MMseg-swin/configs/ann/ann.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r101-d8_769x769_40k_cityscapes/ann_r101-d8_769x769_40k_cityscapes_20200530_025720-059bff28.pt | ann模型在开源社区上的ann_r101-d8_769x769_40k_cityscapes_20200530_025720-059bff28.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/ann/README.md|MMseg-swin/configs/ann/ann.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r50-d8_512x1024_80k_cityscapes/ann_r50-d8_512x1024_80k_cityscapes_20200607_101911-5a9ad545.pt | ann模型在开源社区上的ann_r50-d8_512x1024_80k_cityscapes_20200607_101911-5a9ad545.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/ann/README.md|MMseg-swin/configs/ann/ann.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r101-d8_512x1024_80k_cityscapes/ann_r101-d8_512x1024_80k_cityscapes_20200607_013728-aceccc6e.pt | ann模型在开源社区上的ann_r101-d8_512x1024_80k_cityscapes_20200607_013728-aceccc6e.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/ann/README.md|MMseg-swin/configs/ann/ann.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r50-d8_769x769_80k_cityscapes/ann_r50-d8_769x769_80k_cityscapes_20200607_044426-cc7ff323.pt | ann模型在开源社区上的ann_r50-d8_769x769_80k_cityscapes_20200607_044426-cc7ff323.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/ann/README.md|MMseg-swin/configs/ann/ann.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r101-d8_769x769_80k_cityscapes/ann_r101-d8_769x769_80k_cityscapes_20200607_013713-a9d4be8d.pt | ann模型在开源社区上的ann_r101-d8_769x769_80k_cityscapes_20200607_013713-a9d4be8d.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/ann/README.md|MMseg-swin/configs/ann/ann.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r50-d8_512x512_80k_ade20k/ann_r50-d8_512x512_80k_ade20k_20200615_014818-26f75e11.pt | ann模型在开源社区上的ann_r50-d8_512x512_80k_ade20k_20200615_014818-26f75e11.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/ann/README.md|MMseg-swin/configs/ann/ann.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r101-d8_512x512_80k_ade20k/ann_r101-d8_512x512_80k_ade20k_20200615_014818-c0153543.pt | ann模型在开源社区上的ann_r101-d8_512x512_80k_ade20k_20200615_014818-c0153543.pt的下载链接| -| 开源代码引入 | 
https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/ann/README.md|MMseg-swin/configs/ann/ann.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r50-d8_512x512_160k_ade20k/ann_r50-d8_512x512_160k_ade20k_20200615_231733-892247bc.pt | ann模型在开源社区上的ann_r50-d8_512x512_160k_ade20k_20200615_231733-892247bc.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/ann/README.md|MMseg-swin/configs/ann/ann.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r101-d8_512x512_160k_ade20k/ann_r101-d8_512x512_160k_ade20k_20200615_231733-955eb1ec.pt | ann模型在开源社区上的ann_r101-d8_512x512_160k_ade20k_20200615_231733-955eb1ec.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/ann/README.md|MMseg-swin/configs/ann/ann.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r50-d8_512x512_20k_voc12aug/ann_r50-d8_512x512_20k_voc12aug_20200617_222246-dfcb1c62.pt | ann模型在开源社区上的ann_r50-d8_512x512_20k_voc12aug_20200617_222246-dfcb1c62.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/ann/README.md|MMseg-swin/configs/ann/ann.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r101-d8_512x512_20k_voc12aug/ann_r101-d8_512x512_20k_voc12aug_20200617_222246-2fad0042.pt | ann模型在开源社区上的ann_r101-d8_512x512_20k_voc12aug_20200617_222246-2fad0042.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/ann/README.md|MMseg-swin/configs/ann/ann.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r50-d8_512x512_40k_voc12aug/ann_r50-d8_512x512_40k_voc12aug_20200613_231314-b5dac322.pt | ann模型在开源社区上的ann_r50-d8_512x512_40k_voc12aug_20200613_231314-b5dac322.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/ann/README.md|MMseg-swin/configs/ann/ann.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r101-d8_512x512_40k_voc12aug/ann_r101-d8_512x512_40k_voc12aug_20200613_231314-bd205bbe.pt | ann模型在开源社区上的ann_r101-d8_512x512_40k_voc12aug_20200613_231314-bd205bbe.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/apcnet/README.md|MMseg-swin/configs/apcnet/apcnet.yml | https://openaccess.thecvf.com/content_CVPR_2019/html/He_Adaptive_Pyramid_Context_Network_for_Semantic_Segmentation_CVPR_2019_paper.htm | content_CVPR_2019开源论文下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/apcnet/README.md|MMseg-swin/configs/apcnet/apcnet.yml | https://github.com/open-mmlab/mmsegmentation/blob/v0.17.0/mmseg/models/decode_heads/apc_head.py#L11 | 预训练模型下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/apcnet/README.md|MMseg-swin/configs/apcnet/apcnet.yml | https://github.com/Junjun2016/APCNe | 源码下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/apcnet/README.md|MMseg-swin/configs/apcnet/apcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/apcnet/apcnet_r50-d8_512x1024_40k_cityscapes/apcnet_r50-d8_512x1024_40k_cityscapes_20201214_115717-5e88fa33.pt | apcnet模型在开源社区上的apcnet_r50-d8_512x1024_40k_cityscapes_20201214_115717-5e88fa33.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/apcnet/README.md|MMseg-swin/configs/apcnet/apcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/apcnet/apcnet_r101-d8_512x1024_40k_cityscapes/apcnet_r101-d8_512x1024_40k_cityscapes_20201214_115716-abc9d111.pt 
| apcnet模型在开源社区上的apcnet_r101-d8_512x1024_40k_cityscapes_20201214_115716-abc9d111.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/apcnet/README.md|MMseg-swin/configs/apcnet/apcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/apcnet/apcnet_r50-d8_769x769_40k_cityscapes/apcnet_r50-d8_769x769_40k_cityscapes_20201214_115717-2a2628d7.pt | apcnet模型在开源社区上的apcnet_r50-d8_769x769_40k_cityscapes_20201214_115717-2a2628d7.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/apcnet/README.md|MMseg-swin/configs/apcnet/apcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/apcnet/apcnet_r101-d8_769x769_40k_cityscapes/apcnet_r101-d8_769x769_40k_cityscapes_20201214_115718-b650de90.pt | apcnet模型在开源社区上的apcnet_r101-d8_769x769_40k_cityscapes_20201214_115718-b650de90.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/apcnet/README.md|MMseg-swin/configs/apcnet/apcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/apcnet/apcnet_r50-d8_512x1024_80k_cityscapes/apcnet_r50-d8_512x1024_80k_cityscapes_20201214_115716-987f51e3.pt | apcnet模型在开源社区上的apcnet_r50-d8_512x1024_80k_cityscapes_20201214_115716-987f51e3.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/apcnet/README.md|MMseg-swin/configs/apcnet/apcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/apcnet/apcnet_r101-d8_512x1024_80k_cityscapes/apcnet_r101-d8_512x1024_80k_cityscapes_20201214_115705-b1ff208a.pt | apcnet模型在开源社区上的apcnet_r101-d8_512x1024_80k_cityscapes_20201214_115705-b1ff208a.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/apcnet/README.md|MMseg-swin/configs/apcnet/apcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/apcnet/apcnet_r50-d8_769x769_80k_cityscapes/apcnet_r50-d8_769x769_80k_cityscapes_20201214_115718-7ea9fa12.pt | apcnet模型在开源社区上的apcnet_r50-d8_769x769_80k_cityscapes_20201214_115718-7ea9fa12.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/apcnet/README.md|MMseg-swin/configs/apcnet/apcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/apcnet/apcnet_r101-d8_769x769_80k_cityscapes/apcnet_r101-d8_769x769_80k_cityscapes_20201214_115716-a7fbc2ab.pt | apcnet模型在开源社区上的apcnet_r101-d8_769x769_80k_cityscapes_20201214_115716-a7fbc2ab.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/apcnet/README.md|MMseg-swin/configs/apcnet/apcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/apcnet/apcnet_r50-d8_512x512_80k_ade20k/apcnet_r50-d8_512x512_80k_ade20k_20201214_115705-a8626293.pt | apcnet模型在开源社区上的apcnet_r50-d8_512x512_80k_ade20k_20201214_115705-a8626293.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/apcnet/README.md|MMseg-swin/configs/apcnet/apcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/apcnet/apcnet_r101-d8_512x512_80k_ade20k/apcnet_r101-d8_512x512_80k_ade20k_20201214_115704-c656c3fb.pt | apcnet模型在开源社区上的apcnet_r101-d8_512x512_80k_ade20k_20201214_115704-c656c3fb.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/apcnet/README.md|MMseg-swin/configs/apcnet/apcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/apcnet/apcnet_r50-d8_512x512_160k_ade20k/apcnet_r50-d8_512x512_160k_ade20k_20201214_115706-25fb92c2.pt | apcnet模型在开源社区上的apcnet_r50-d8_512x512_160k_ade20k_20201214_115706-25fb92c2.pt的下载链接| -| 
开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/apcnet/README.md|MMseg-swin/configs/apcnet/apcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/apcnet/apcnet_r101-d8_512x512_160k_ade20k/apcnet_r101-d8_512x512_160k_ade20k_20201214_115705-73f9a8d7.pt | apcnet模型在开源社区上的apcnet_r101-d8_512x512_160k_ade20k_20201214_115705-73f9a8d7.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/beit/README.md|MMseg-swin/configs/beit/beit.yml | https://download.openmmlab.com/mmsegmentation/v0.5/beit/upernet_beit-base_8x2_640x640_160k_ade20k/upernet_beit-base_8x2_640x640_160k_ade20k-eead221d.pt | beit模型在开源社区上的upernet_beit-base_8x2_640x640_160k_ade20k-eead221d.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/beit/README.md|MMseg-swin/configs/beit/beit.yml | https://download.openmmlab.com/mmsegmentation/v0.5/beit/upernet_beit-large_fp16_8x1_640x640_160k_ade20k/upernet_beit-large_fp16_8x1_640x640_160k_ade20k-8fc0dd5d.pt | beit模型在开源社区上的upernet_beit-large_fp16_8x1_640x640_160k_ade20k-8fc0dd5d.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/bisenetv1/README.md|MMseg-swin/configs/bisenetv1/bisenetv1.yml | https://arxiv.org/abs/1808.0089 | 开源论文url链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/bisenetv1/README.md|MMseg-swin/configs/bisenetv1/bisenetv1.yml | https://github.com/open-mmlab/mmsegmentation/blob/v0.18.0/mmseg/models/backbones/bisenetv1.py#L26 | 预训练模型下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/bisenetv1/README.md|MMseg-swin/configs/bisenetv1/bisenetv1.yml | https://github.com/ycszen/TorchSeg/tree/master/model/bisene | 源码下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/bisenetv1/README.md|MMseg-swin/configs/bisenetv1/bisenetv1.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv1/bisenetv1_r18-d32_4x4_1024x1024_160k_cityscapes/bisenetv1_r18-d32_4x4_1024x1024_160k_cityscapes_20210922_172239-c55e78e2.pt | bisenetv1模型在开源社区上的bisenetv1_r18-d32_4x4_1024x1024_160k_cityscapes_20210922_172239-c55e78e2.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/bisenetv1/README.md|MMseg-swin/configs/bisenetv1/bisenetv1.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv1/bisenetv1_r18-d32_in1k-pre_4x4_1024x1024_160k_cityscapes/bisenetv1_r18-d32_in1k-pre_4x4_1024x1024_160k_cityscapes_20210905_220251-8ba80eff.pt | bisenetv1模型在开源社区上的bisenetv1_r18-d32_in1k-pre_4x4_1024x1024_160k_cityscapes_20210905_220251-8ba80eff.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/bisenetv1/README.md|MMseg-swin/configs/bisenetv1/bisenetv1.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv1/bisenetv1_r18-d32_in1k-pre_4x8_1024x1024_160k_cityscapes/bisenetv1_r18-d32_in1k-pre_4x8_1024x1024_160k_cityscapes_20210905_220322-bb8db75f.pt | bisenetv1模型在开源社区上的bisenetv1_r18-d32_in1k-pre_4x8_1024x1024_160k_cityscapes_20210905_220322-bb8db75f.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/bisenetv1/README.md|MMseg-swin/configs/bisenetv1/bisenetv1.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv1/bisenetv1_r50-d32_4x4_1024x1024_160k_cityscapes/bisenetv1_r50-d32_4x4_1024x1024_160k_cityscapes_20210923_222639-7b28a2a6.pt | 
bisenetv1模型在开源社区上的bisenetv1_r50-d32_4x4_1024x1024_160k_cityscapes_20210923_222639-7b28a2a6.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/bisenetv1/README.md|MMseg-swin/configs/bisenetv1/bisenetv1.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv1/bisenetv1_r50-d32_in1k-pre_4x4_1024x1024_160k_cityscapes/bisenetv1_r50-d32_in1k-pre_4x4_1024x1024_160k_cityscapes_20210917_234628-8b304447.pt | bisenetv1模型在开源社区上的bisenetv1_r50-d32_in1k-pre_4x4_1024x1024_160k_cityscapes_20210917_234628-8b304447.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/bisenetv1/README.md|MMseg-swin/configs/bisenetv1/bisenetv1.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv1/bisenetv1_r18-d32_lr5e-3_4x4_512x512_160k_coco-stuff164k/bisenetv1_r18-d32_lr5e-3_4x4_512x512_160k_coco-stuff164k_20211022_054328-046aa2f2.pt | bisenetv1模型在开源社区上的bisenetv1_r18-d32_lr5e-3_4x4_512x512_160k_coco-stuff164k_20211022_054328-046aa2f2.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/bisenetv1/README.md|MMseg-swin/configs/bisenetv1/bisenetv1.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv1/bisenetv1_r18-d32_in1k-pre_lr5e-3_4x4_512x512_160k_coco-stuff164k/bisenetv1_r18-d32_in1k-pre_lr5e-3_4x4_512x512_160k_coco-stuff164k_20211023_013100-f700dbf7.pt | bisenetv1模型在开源社区上的bisenetv1_r18-d32_in1k-pre_lr5e-3_4x4_512x512_160k_coco-stuff164k_20211023_013100-f700dbf7.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/bisenetv1/README.md|MMseg-swin/configs/bisenetv1/bisenetv1.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv1/bisenetv1_r50-d32_lr5e-3_4x4_512x512_160k_coco-stuff164k/bisenetv1_r50-d32_lr5e-3_4x4_512x512_160k_coco-stuff164k_20211101_040616-d2bb0df4.pt | bisenetv1模型在开源社区上的bisenetv1_r50-d32_lr5e-3_4x4_512x512_160k_coco-stuff164k_20211101_040616-d2bb0df4.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/bisenetv1/README.md|MMseg-swin/configs/bisenetv1/bisenetv1.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv1/bisenetv1_r50-d32_in1k-pre_lr5e-3_4x4_512x512_160k_coco-stuff164k/bisenetv1_r50-d32_in1k-pre_lr5e-3_4x4_512x512_160k_coco-stuff164k_20211101_181932-66747911.pt | bisenetv1模型在开源社区上的bisenetv1_r50-d32_in1k-pre_lr5e-3_4x4_512x512_160k_coco-stuff164k_20211101_181932-66747911.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/bisenetv1/README.md|MMseg-swin/configs/bisenetv1/bisenetv1.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv1/bisenetv1_r101-d32_lr5e-3_4x4_512x512_160k_coco-stuff164k/bisenetv1_r101-d32_lr5e-3_4x4_512x512_160k_coco-stuff164k_20211102_164147-c6b32c3b.pt | bisenetv1模型在开源社区上的bisenetv1_r101-d32_lr5e-3_4x4_512x512_160k_coco-stuff164k_20211102_164147-c6b32c3b.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/bisenetv1/README.md|MMseg-swin/configs/bisenetv1/bisenetv1.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv1/bisenetv1_r101-d32_in1k-pre_lr5e-3_4x4_512x512_160k_coco-stuff164k/bisenetv1_r101-d32_in1k-pre_lr5e-3_4x4_512x512_160k_coco-stuff164k_20211101_225220-28c8f092.pt | bisenetv1模型在开源社区上的bisenetv1_r101-d32_in1k-pre_lr5e-3_4x4_512x512_160k_coco-stuff164k_20211101_225220-28c8f092.pt的下载链接| -| 开源代码引入 | 
https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/bisenetv2/README.md|MMseg-swin/configs/bisenetv2/bisenetv2.yml | https://arxiv.org/abs/2004.0214 | 开源论文url链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/bisenetv2/README.md|MMseg-swin/configs/bisenetv2/bisenetv2.yml | https://github.com/open-mmlab/mmsegmentation/blob/v0.18.0/mmseg/models/backbones/bisenetv2.py#L54 | 预训练模型下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/bisenetv2/README.md|MMseg-swin/configs/bisenetv2/bisenetv2.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv2/bisenetv2_fcn_4x4_1024x1024_160k_cityscapes/bisenetv2_fcn_4x4_1024x1024_160k_cityscapes_20210902_015551-bcf10f09.pt | bisenetv2模型在开源社区上的bisenetv2_fcn_4x4_1024x1024_160k_cityscapes_20210902_015551-bcf10f09.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/bisenetv2/README.md|MMseg-swin/configs/bisenetv2/bisenetv2.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv2/bisenetv2_fcn_ohem_4x4_1024x1024_160k_cityscapes/bisenetv2_fcn_ohem_4x4_1024x1024_160k_cityscapes_20210902_112947-5f8103b4.pt | bisenetv2模型在开源社区上的bisenetv2_fcn_ohem_4x4_1024x1024_160k_cityscapes_20210902_112947-5f8103b4.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/bisenetv2/README.md|MMseg-swin/configs/bisenetv2/bisenetv2.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv2/bisenetv2_fcn_4x8_1024x1024_160k_cityscapes/bisenetv2_fcn_4x8_1024x1024_160k_cityscapes_20210903_000032-e1a2eed6.pt | bisenetv2模型在开源社区上的bisenetv2_fcn_4x8_1024x1024_160k_cityscapes_20210903_000032-e1a2eed6.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/bisenetv2/README.md|MMseg-swin/configs/bisenetv2/bisenetv2.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv2/bisenetv2_fcn_fp16_4x4_1024x1024_160k_cityscapes/bisenetv2_fcn_fp16_4x4_1024x1024_160k_cityscapes_20210902_045942-b979777b.pt | bisenetv2模型在开源社区上的bisenetv2_fcn_fp16_4x4_1024x1024_160k_cityscapes_20210902_045942-b979777b.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/ccnet/README.md|MMseg-swin/configs/ccnet/ccnet.yml | https://arxiv.org/abs/1811.1172 | 开源论文url链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/apcnet/README.md|MMseg-swin/configs/ccnet/ccnet.yml | https://github.com/open-mmlab/mmsegmentation/blob/v0.17.0/mmseg/models/decode_heads/apc_head.py#L11 | 预训练模型下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/ccnet/README.md|MMseg-swin/configs/ccnet/ccnet.yml | https://github.com/speedinghzl/CCNe | 源码下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/ccnet/README.md|MMseg-swin/configs/ccnet/ccnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r50-d8_512x1024_40k_cityscapes/ccnet_r50-d8_512x1024_40k_cityscapes_20200616_142517-4123f401.pt | ccnet模型在开源社区上的ccnet_r50-d8_512x1024_40k_cityscapes_20200616_142517-4123f401.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/ccnet/README.md|MMseg-swin/configs/ccnet/ccnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r101-d8_512x1024_40k_cityscapes/ccnet_r101-d8_512x1024_40k_cityscapes_20200616_142540-a3b84ba6.pt | ccnet模型在开源社区上的ccnet_r101-d8_512x1024_40k_cityscapes_20200616_142540-a3b84ba6.pt的下载链接| -| 开源代码引入 | 
https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/ccnet/README.md|MMseg-swin/configs/ccnet/ccnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r50-d8_769x769_40k_cityscapes/ccnet_r50-d8_769x769_40k_cityscapes_20200616_145125-76d11884.pt | ccnet模型在开源社区上的ccnet_r50-d8_769x769_40k_cityscapes_20200616_145125-76d11884.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/ccnet/README.md|MMseg-swin/configs/ccnet/ccnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r101-d8_769x769_40k_cityscapes/ccnet_r101-d8_769x769_40k_cityscapes_20200617_101428-4f57c8d0.pt | ccnet模型在开源社区上的ccnet_r101-d8_769x769_40k_cityscapes_20200617_101428-4f57c8d0.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/ccnet/README.md|MMseg-swin/configs/ccnet/ccnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r50-d8_512x1024_80k_cityscapes/ccnet_r50-d8_512x1024_80k_cityscapes_20200617_010421-869a3423.pt | ccnet模型在开源社区上的ccnet_r50-d8_512x1024_80k_cityscapes_20200617_010421-869a3423.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/ccnet/README.md|MMseg-swin/configs/ccnet/ccnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r101-d8_512x1024_80k_cityscapes/ccnet_r101-d8_512x1024_80k_cityscapes_20200617_203935-ffae8917.pt | ccnet模型在开源社区上的ccnet_r101-d8_512x1024_80k_cityscapes_20200617_203935-ffae8917.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/ccnet/README.md|MMseg-swin/configs/ccnet/ccnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r50-d8_769x769_80k_cityscapes/ccnet_r50-d8_769x769_80k_cityscapes_20200617_010421-73eed8ca.pt | ccnet模型在开源社区上的ccnet_r50-d8_769x769_80k_cityscapes_20200617_010421-73eed8ca.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/ccnet/README.md|MMseg-swin/configs/ccnet/ccnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r101-d8_769x769_80k_cityscapes/ccnet_r101-d8_769x769_80k_cityscapes_20200618_011502-ad3cd481.pt | ccnet模型在开源社区上的ccnet_r101-d8_769x769_80k_cityscapes_20200618_011502-ad3cd481.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/ccnet/README.md|MMseg-swin/configs/ccnet/ccnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r50-d8_512x512_80k_ade20k/ccnet_r50-d8_512x512_80k_ade20k_20200615_014848-aa37f61e.pt | ccnet模型在开源社区上的ccnet_r50-d8_512x512_80k_ade20k_20200615_014848-aa37f61e.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/ccnet/README.md|MMseg-swin/configs/ccnet/ccnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r101-d8_512x512_80k_ade20k/ccnet_r101-d8_512x512_80k_ade20k_20200615_014848-1f4929a3.pt | ccnet模型在开源社区上的ccnet_r101-d8_512x512_80k_ade20k_20200615_014848-1f4929a3.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/ccnet/README.md|MMseg-swin/configs/ccnet/ccnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r50-d8_512x512_160k_ade20k/ccnet_r50-d8_512x512_160k_ade20k_20200616_084435-7c97193b.pt | ccnet模型在开源社区上的ccnet_r50-d8_512x512_160k_ade20k_20200616_084435-7c97193b.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/ccnet/README.md|MMseg-swin/configs/ccnet/ccnet.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r101-d8_512x512_160k_ade20k/ccnet_r101-d8_512x512_160k_ade20k_20200616_000644-e849e007.pt | ccnet模型在开源社区上的ccnet_r101-d8_512x512_160k_ade20k_20200616_000644-e849e007.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/ccnet/README.md|MMseg-swin/configs/ccnet/ccnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r50-d8_512x512_20k_voc12aug/ccnet_r50-d8_512x512_20k_voc12aug_20200617_193212-fad81784.pt | ccnet模型在开源社区上的ccnet_r50-d8_512x512_20k_voc12aug_20200617_193212-fad81784.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/ccnet/README.md|MMseg-swin/configs/ccnet/ccnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r101-d8_512x512_20k_voc12aug/ccnet_r101-d8_512x512_20k_voc12aug_20200617_193212-0007b61d.pt | ccnet模型在开源社区上的ccnet_r101-d8_512x512_20k_voc12aug_20200617_193212-0007b61d.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/ccnet/README.md|MMseg-swin/configs/ccnet/ccnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r50-d8_512x512_40k_voc12aug/ccnet_r50-d8_512x512_40k_voc12aug_20200613_232127-c2a15f02.pt | ccnet模型在开源社区上的ccnet_r50-d8_512x512_40k_voc12aug_20200613_232127-c2a15f02.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/ccnet/README.md|MMseg-swin/configs/ccnet/ccnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r101-d8_512x512_40k_voc12aug/ccnet_r101-d8_512x512_40k_voc12aug_20200613_232127-c30da577.pt | ccnet模型在开源社区上的ccnet_r101-d8_512x512_40k_voc12aug_20200613_232127-c30da577.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/cgnet/README.md|MMseg-swin/configs/cgnet/cgnet.yml | https://arxiv.org/abs/1811.0820 | 开源论文url链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/cgnet/README.md|MMseg-swin/configs/cgnet/cgnet.yml | https://github.com/open-mmlab/mmsegmentation/blob/v0.17.0/mmseg/models/backbones/cgnet.py#L18 | 预训练模型下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/cgnet/README.md|MMseg-swin/configs/cgnet/cgnet.yml | https://github.com/wutianyiRosun/CGNe | 源码下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/cgnet/README.md|MMseg-swin/configs/cgnet/cgnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/cgnet/cgnet_680x680_60k_cityscapes/cgnet_680x680_60k_cityscapes_20201101_110253-4c0b2f2d.pt | cgnet模型在开源社区上的cgnet_680x680_60k_cityscapes_20201101_110253-4c0b2f2d.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/cgnet/README.md|MMseg-swin/configs/cgnet/cgnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/cgnet/cgnet_512x1024_60k_cityscapes/cgnet_512x1024_60k_cityscapes_20201101_110254-124ea03b.pt | cgnet模型在开源社区上的cgnet_512x1024_60k_cityscapes_20201101_110254-124ea03b.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/convnext/README.md|MMseg-swin/configs/convnext/convnext.yml | https://download.openmmlab.com/mmsegmentation/v0.5/convnext/upernet_convnext_tiny_fp16_512x512_160k_ade20k/upernet_convnext_tiny_fp16_512x512_160k_ade20k_20220227_124553-cad485de.pt | convnext模型在开源社区上的upernet_convnext_tiny_fp16_512x512_160k_ade20k_20220227_124553-cad485de.pt的下载链接| -| 开源代码引入 | 
https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/convnext/README.md|MMseg-swin/configs/convnext/convnext.yml | https://download.openmmlab.com/mmsegmentation/v0.5/convnext/upernet_convnext_small_fp16_512x512_160k_ade20k/upernet_convnext_small_fp16_512x512_160k_ade20k_20220227_131208-1b1e394f.pt | convnext模型在开源社区上的upernet_convnext_small_fp16_512x512_160k_ade20k_20220227_131208-1b1e394f.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/convnext/README.md|MMseg-swin/configs/convnext/convnext.yml | https://download.openmmlab.com/mmsegmentation/v0.5/convnext/upernet_convnext_base_fp16_512x512_160k_ade20k/upernet_convnext_base_fp16_512x512_160k_ade20k_20220227_181227-02a24fc6.pt | convnext模型在开源社区上的upernet_convnext_base_fp16_512x512_160k_ade20k_20220227_181227-02a24fc6.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/convnext/README.md|MMseg-swin/configs/convnext/convnext.yml | https://download.openmmlab.com/mmsegmentation/v0.5/convnext/upernet_convnext_base_fp16_640x640_160k_ade20k/upernet_convnext_base_fp16_640x640_160k_ade20k_20220227_182859-9280e39b.pt | convnext模型在开源社区上的upernet_convnext_base_fp16_640x640_160k_ade20k_20220227_182859-9280e39b.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/convnext/README.md|MMseg-swin/configs/convnext/convnext.yml | https://download.openmmlab.com/mmsegmentation/v0.5/convnext/upernet_convnext_large_fp16_640x640_160k_ade20k/upernet_convnext_large_fp16_640x640_160k_ade20k_20220226_040532-e57aa54d.pt | convnext模型在开源社区上的upernet_convnext_large_fp16_640x640_160k_ade20k_20220226_040532-e57aa54d.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/convnext/README.md|MMseg-swin/configs/convnext/convnext.yml | https://download.openmmlab.com/mmsegmentation/v0.5/convnext/upernet_convnext_xlarge_fp16_640x640_160k_ade20k/upernet_convnext_xlarge_fp16_640x640_160k_ade20k_20220226_080344-95fc38c2.pt | convnext模型在开源社区上的upernet_convnext_xlarge_fp16_640x640_160k_ade20k_20220226_080344-95fc38c2.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/danet/README.md|MMseg-swin/configs/danet/danet.yml | https://arxiv.org/abs/1809.0298 | 开源论文url链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/danet/README.md|MMseg-swin/configs/danet/danet.yml | https://github.com/open-mmlab/mmsegmentation/blob/v0.17.0/mmseg/models/decode_heads/da_head.py#L7 | 预训练模型下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/danet/README.md|MMseg-swin/configs/danet/danet.yml | https://github.com/junfu1115/DANet | 源码下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/danet/README.md|MMseg-swin/configs/danet/danet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r50-d8_512x1024_40k_cityscapes/danet_r50-d8_512x1024_40k_cityscapes_20200605_191324-c0dbfa5f.pt | danet模型在开源社区上的danet_r50-d8_512x1024_40k_cityscapes_20200605_191324-c0dbfa5f.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/danet/README.md|MMseg-swin/configs/danet/danet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r101-d8_512x1024_40k_cityscapes/danet_r101-d8_512x1024_40k_cityscapes_20200605_200831-c57a7157.pt | danet模型在开源社区上的danet_r101-d8_512x1024_40k_cityscapes_20200605_200831-c57a7157.pt的下载链接| -| 开源代码引入 | 
https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/danet/README.md|MMseg-swin/configs/danet/danet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r50-d8_769x769_40k_cityscapes/danet_r50-d8_769x769_40k_cityscapes_20200530_025703-76681c60.pt | danet模型在开源社区上的danet_r50-d8_769x769_40k_cityscapes_20200530_025703-76681c60.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/danet/README.md|MMseg-swin/configs/danet/danet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r101-d8_769x769_40k_cityscapes/danet_r101-d8_769x769_40k_cityscapes_20200530_025717-dcb7fd4e.pt | danet模型在开源社区上的danet_r101-d8_769x769_40k_cityscapes_20200530_025717-dcb7fd4e.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/danet/README.md|MMseg-swin/configs/danet/danet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r50-d8_512x1024_80k_cityscapes/danet_r50-d8_512x1024_80k_cityscapes_20200607_133029-2bfa2293.pt | danet模型在开源社区上的danet_r50-d8_512x1024_80k_cityscapes_20200607_133029-2bfa2293.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/danet/README.md|MMseg-swin/configs/danet/danet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r101-d8_512x1024_80k_cityscapes/danet_r101-d8_512x1024_80k_cityscapes_20200607_132918-955e6350.pt | danet模型在开源社区上的danet_r101-d8_512x1024_80k_cityscapes_20200607_132918-955e6350.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/danet/README.md|MMseg-swin/configs/danet/danet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r50-d8_769x769_80k_cityscapes/danet_r50-d8_769x769_80k_cityscapes_20200607_132954-495689b4.pt | danet模型在开源社区上的danet_r50-d8_769x769_80k_cityscapes_20200607_132954-495689b4.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/danet/README.md|MMseg-swin/configs/danet/danet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r101-d8_769x769_80k_cityscapes/danet_r101-d8_769x769_80k_cityscapes_20200607_132918-f3a929e7.pt | danet模型在开源社区上的danet_r101-d8_769x769_80k_cityscapes_20200607_132918-f3a929e7.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/danet/README.md|MMseg-swin/configs/danet/danet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r50-d8_512x512_80k_ade20k/danet_r50-d8_512x512_80k_ade20k_20200615_015125-edb18e08.pt | danet模型在开源社区上的danet_r50-d8_512x512_80k_ade20k_20200615_015125-edb18e08.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/danet/README.md|MMseg-swin/configs/danet/danet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r101-d8_512x512_80k_ade20k/danet_r101-d8_512x512_80k_ade20k_20200615_015126-d0357c73.pt | danet模型在开源社区上的danet_r101-d8_512x512_80k_ade20k_20200615_015126-d0357c73.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/danet/README.md|MMseg-swin/configs/danet/danet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r50-d8_512x512_160k_ade20k/danet_r50-d8_512x512_160k_ade20k_20200616_082340-9cb35dcd.pt | danet模型在开源社区上的danet_r50-d8_512x512_160k_ade20k_20200616_082340-9cb35dcd.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/danet/README.md|MMseg-swin/configs/danet/danet.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r101-d8_512x512_160k_ade20k/danet_r101-d8_512x512_160k_ade20k_20200616_082348-23bf12f9.pt | danet模型在开源社区上的danet_r101-d8_512x512_160k_ade20k_20200616_082348-23bf12f9.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/danet/README.md|MMseg-swin/configs/danet/danet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r50-d8_512x512_20k_voc12aug/danet_r50-d8_512x512_20k_voc12aug_20200618_070026-9e9e3ab3.pt | danet模型在开源社区上的danet_r50-d8_512x512_20k_voc12aug_20200618_070026-9e9e3ab3.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/danet/README.md|MMseg-swin/configs/danet/danet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r101-d8_512x512_20k_voc12aug/danet_r101-d8_512x512_20k_voc12aug_20200618_070026-d48d23b2.pt | danet模型在开源社区上的danet_r101-d8_512x512_20k_voc12aug_20200618_070026-d48d23b2.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/danet/README.md|MMseg-swin/configs/danet/danet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r50-d8_512x512_40k_voc12aug/danet_r50-d8_512x512_40k_voc12aug_20200613_235526-426e3a64.pt | danet模型在开源社区上的danet_r50-d8_512x512_40k_voc12aug_20200613_235526-426e3a64.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/danet/README.md|MMseg-swin/configs/danet/danet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r101-d8_512x512_40k_voc12aug/danet_r101-d8_512x512_40k_voc12aug_20200613_223031-788e232a.pt | danet模型在开源社区上的danet_r101-d8_512x512_40k_voc12aug_20200613_223031-788e232a.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3/README.md|MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://arxiv.org/abs/1706.0558 | 开源论文url链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3/README.md|MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://github.com/open-mmlab/mmsegmentation/blob/v0.17.0/mmseg/models/decode_heads/aspp_head.py#L5 | 预训练模型下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/mobilenet_v2/README.md|MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://github.com/tensorflow/models/tree/master/research/deepla | 源码下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3/README.md|MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50-d8_512x1024_40k_cityscapes/deeplabv3_r50-d8_512x1024_40k_cityscapes_20200605_022449-acadc2f8.pt | deeplabv3模型在开源社区上的deeplabv3_r50-d8_512x1024_40k_cityscapes_20200605_022449-acadc2f8.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3/README.md|MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_512x1024_40k_cityscapes/deeplabv3_r101-d8_512x1024_40k_cityscapes_20200605_012241-7fd3f799.pt | deeplabv3模型在开源社区上的deeplabv3_r101-d8_512x1024_40k_cityscapes_20200605_012241-7fd3f799.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3/README.md|MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50-d8_769x769_40k_cityscapes/deeplabv3_r50-d8_769x769_40k_cityscapes_20200606_113723-7eda553c.pt | 
deeplabv3模型在开源社区上的deeplabv3_r50-d8_769x769_40k_cityscapes_20200606_113723-7eda553c.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3/README.md|MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_769x769_40k_cityscapes/deeplabv3_r101-d8_769x769_40k_cityscapes_20200606_113809-c64f889f.pt | deeplabv3模型在开源社区上的deeplabv3_r101-d8_769x769_40k_cityscapes_20200606_113809-c64f889f.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3/README.md|MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r18-d8_512x1024_80k_cityscapes/deeplabv3_r18-d8_512x1024_80k_cityscapes_20201225_021506-23dffbe2.pt | deeplabv3模型在开源社区上的deeplabv3_r18-d8_512x1024_80k_cityscapes_20201225_021506-23dffbe2.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3/README.md|MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50-d8_512x1024_80k_cityscapes/deeplabv3_r50-d8_512x1024_80k_cityscapes_20200606_113404-b92cfdd4.pt | deeplabv3模型在开源社区上的deeplabv3_r50-d8_512x1024_80k_cityscapes_20200606_113404-b92cfdd4.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3/README.md|MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_512x1024_80k_cityscapes/deeplabv3_r101-d8_512x1024_80k_cityscapes_20200606_113503-9e428899.pt | deeplabv3模型在开源社区上的deeplabv3_r101-d8_512x1024_80k_cityscapes_20200606_113503-9e428899.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3/README.md|MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_fp16_512x1024_80k_cityscapes/deeplabv3_r101-d8_fp16_512x1024_80k_cityscapes_20200717_230920-774d9cec.pt | deeplabv3模型在开源社区上的deeplabv3_r101-d8_fp16_512x1024_80k_cityscapes_20200717_230920-774d9cec.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3/README.md|MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r18-d8_769x769_80k_cityscapes/deeplabv3_r18-d8_769x769_80k_cityscapes_20201225_021506-6452126a.pt | deeplabv3模型在开源社区上的deeplabv3_r18-d8_769x769_80k_cityscapes_20201225_021506-6452126a.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3/README.md|MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50-d8_769x769_80k_cityscapes/deeplabv3_r50-d8_769x769_80k_cityscapes_20200606_221338-788d6228.pt | deeplabv3模型在开源社区上的deeplabv3_r50-d8_769x769_80k_cityscapes_20200606_221338-788d6228.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3/README.md|MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_769x769_80k_cityscapes/deeplabv3_r101-d8_769x769_80k_cityscapes_20200607_013353-60e95418.pt | deeplabv3模型在开源社区上的deeplabv3_r101-d8_769x769_80k_cityscapes_20200607_013353-60e95418.pt的下载链接| -| 开源代码引入 | 
https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3/README.md|MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d16-mg124_512x1024_80k_cityscapes/deeplabv3_r101-d16-mg124_512x1024_80k_cityscapes_20200908_005644-57bb8425.pt | deeplabv3模型在开源社区上的deeplabv3_r101-d16-mg124_512x1024_80k_cityscapes_20200908_005644-57bb8425.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3/README.md|MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r18b-d8_512x1024_80k_cityscapes/deeplabv3_r18b-d8_512x1024_80k_cityscapes_20201225_094144-46040cef.pt | deeplabv3模型在开源社区上的deeplabv3_r18b-d8_512x1024_80k_cityscapes_20201225_094144-46040cef.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3/README.md|MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50b-d8_512x1024_80k_cityscapes/deeplabv3_r50b-d8_512x1024_80k_cityscapes_20201225_155148-ec368954.pt | deeplabv3模型在开源社区上的deeplabv3_r50b-d8_512x1024_80k_cityscapes_20201225_155148-ec368954.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3/README.md|MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101b-d8_512x1024_80k_cityscapes/deeplabv3_r101b-d8_512x1024_80k_cityscapes_20201226_171821-8fd49503.pt | deeplabv3模型在开源社区上的deeplabv3_r101b-d8_512x1024_80k_cityscapes_20201226_171821-8fd49503.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3/README.md|MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r18b-d8_769x769_80k_cityscapes/deeplabv3_r18b-d8_769x769_80k_cityscapes_20201225_094144-fdc985d9.pt | deeplabv3模型在开源社区上的deeplabv3_r18b-d8_769x769_80k_cityscapes_20201225_094144-fdc985d9.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3/README.md|MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50b-d8_769x769_80k_cityscapes/deeplabv3_r50b-d8_769x769_80k_cityscapes_20201225_155404-87fb0cf4.pt | deeplabv3模型在开源社区上的deeplabv3_r50b-d8_769x769_80k_cityscapes_20201225_155404-87fb0cf4.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3/README.md|MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101b-d8_769x769_80k_cityscapes/deeplabv3_r101b-d8_769x769_80k_cityscapes_20201226_190843-9142ee57.pt | deeplabv3模型在开源社区上的deeplabv3_r101b-d8_769x769_80k_cityscapes_20201226_190843-9142ee57.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3/README.md|MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50-d8_512x512_80k_ade20k/deeplabv3_r50-d8_512x512_80k_ade20k_20200614_185028-0bb3f844.pt | deeplabv3模型在开源社区上的deeplabv3_r50-d8_512x512_80k_ade20k_20200614_185028-0bb3f844.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3/README.md|MMseg-swin/configs/deeplabv3/deeplabv3.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_512x512_80k_ade20k/deeplabv3_r101-d8_512x512_80k_ade20k_20200615_021256-d89c7fa4.pt | deeplabv3模型在开源社区上的deeplabv3_r101-d8_512x512_80k_ade20k_20200615_021256-d89c7fa4.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3/README.md|MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50-d8_512x512_160k_ade20k/deeplabv3_r50-d8_512x512_160k_ade20k_20200615_123227-5d0ee427.pt | deeplabv3模型在开源社区上的deeplabv3_r50-d8_512x512_160k_ade20k_20200615_123227-5d0ee427.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3/README.md|MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_512x512_160k_ade20k/deeplabv3_r101-d8_512x512_160k_ade20k_20200615_105816-b1f72b3b.pt | deeplabv3模型在开源社区上的deeplabv3_r101-d8_512x512_160k_ade20k_20200615_105816-b1f72b3b.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3/README.md|MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50-d8_512x512_20k_voc12aug/deeplabv3_r50-d8_512x512_20k_voc12aug_20200617_010906-596905ef.pt | deeplabv3模型在开源社区上的deeplabv3_r50-d8_512x512_20k_voc12aug_20200617_010906-596905ef.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3/README.md|MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_512x512_20k_voc12aug/deeplabv3_r101-d8_512x512_20k_voc12aug_20200617_010932-8d13832f.pt | deeplabv3模型在开源社区上的deeplabv3_r101-d8_512x512_20k_voc12aug_20200617_010932-8d13832f.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3/README.md|MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50-d8_512x512_40k_voc12aug/deeplabv3_r50-d8_512x512_40k_voc12aug_20200613_161546-2ae96e7e.pt | deeplabv3模型在开源社区上的deeplabv3_r50-d8_512x512_40k_voc12aug_20200613_161546-2ae96e7e.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3/README.md|MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_512x512_40k_voc12aug/deeplabv3_r101-d8_512x512_40k_voc12aug_20200613_161432-0017d784.pt | deeplabv3模型在开源社区上的deeplabv3_r101-d8_512x512_40k_voc12aug_20200613_161432-0017d784.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3/README.md|MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_480x480_40k_pascal_context/deeplabv3_r101-d8_480x480_40k_pascal_context_20200911_204118-1aa27336.pt | deeplabv3模型在开源社区上的deeplabv3_r101-d8_480x480_40k_pascal_context_20200911_204118-1aa27336.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3/README.md|MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_480x480_80k_pascal_context/deeplabv3_r101-d8_480x480_80k_pascal_context_20200911_170155-2a21fff3.pt | deeplabv3模型在开源社区上的deeplabv3_r101-d8_480x480_80k_pascal_context_20200911_170155-2a21fff3.pt的下载链接| -| 开源代码引入 | 
https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3/README.md|MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_480x480_40k_pascal_context_59/deeplabv3_r101-d8_480x480_40k_pascal_context_59_20210416_110332-cb08ea46.pt | deeplabv3模型在开源社区上的deeplabv3_r101-d8_480x480_40k_pascal_context_59_20210416_110332-cb08ea46.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3/README.md|MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_480x480_80k_pascal_context_59/deeplabv3_r101-d8_480x480_80k_pascal_context_59_20210416_113002-26303993.pt | deeplabv3模型在开源社区上的deeplabv3_r101-d8_480x480_80k_pascal_context_59_20210416_113002-26303993.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3/README.md|MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50-d8_512x512_4x4_20k_coco-stuff10k/deeplabv3_r50-d8_512x512_4x4_20k_coco-stuff10k_20210821_043025-b35f789d.pt | deeplabv3模型在开源社区上的deeplabv3_r50-d8_512x512_4x4_20k_coco-stuff10k_20210821_043025-b35f789d.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3/README.md|MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_512x512_4x4_20k_coco-stuff10k/deeplabv3_r101-d8_512x512_4x4_20k_coco-stuff10k_20210821_043025-c49752cb.pt | deeplabv3模型在开源社区上的deeplabv3_r101-d8_512x512_4x4_20k_coco-stuff10k_20210821_043025-c49752cb.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3/README.md|MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50-d8_512x512_4x4_40k_coco-stuff10k/deeplabv3_r50-d8_512x512_4x4_40k_coco-stuff10k_20210821_043305-dc76f3ff.pt | deeplabv3模型在开源社区上的deeplabv3_r50-d8_512x512_4x4_40k_coco-stuff10k_20210821_043305-dc76f3ff.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3/README.md|MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_512x512_4x4_40k_coco-stuff10k/deeplabv3_r101-d8_512x512_4x4_40k_coco-stuff10k_20210821_043305-636cb433.pt | deeplabv3模型在开源社区上的deeplabv3_r101-d8_512x512_4x4_40k_coco-stuff10k_20210821_043305-636cb433.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3/README.md|MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50-d8_512x512_4x4_80k_coco-stuff164k/deeplabv3_r50-d8_512x512_4x4_80k_coco-stuff164k_20210709_163016-88675c24.pt | deeplabv3模型在开源社区上的deeplabv3_r50-d8_512x512_4x4_80k_coco-stuff164k_20210709_163016-88675c24.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3/README.md|MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_512x512_4x4_80k_coco-stuff164k/deeplabv3_r101-d8_512x512_4x4_80k_coco-stuff164k_20210709_201252-13600dc2.pt | deeplabv3模型在开源社区上的deeplabv3_r101-d8_512x512_4x4_80k_coco-stuff164k_20210709_201252-13600dc2.pt的下载链接| -| 开源代码引入 | 
https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3/README.md|MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50-d8_512x512_4x4_160k_coco-stuff164k/deeplabv3_r50-d8_512x512_4x4_160k_coco-stuff164k_20210709_163016-49f2812b.pt | deeplabv3模型在开源社区上的deeplabv3_r50-d8_512x512_4x4_160k_coco-stuff164k_20210709_163016-49f2812b.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3/README.md|MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_512x512_4x4_160k_coco-stuff164k/deeplabv3_r101-d8_512x512_4x4_160k_coco-stuff164k_20210709_155402-f035acfd.pt | deeplabv3模型在开源社区上的deeplabv3_r101-d8_512x512_4x4_160k_coco-stuff164k_20210709_155402-f035acfd.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3/README.md|MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50-d8_512x512_4x4_320k_coco-stuff164k/deeplabv3_r50-d8_512x512_4x4_320k_coco-stuff164k_20210709_155403-51b21115.pt | deeplabv3模型在开源社区上的deeplabv3_r50-d8_512x512_4x4_320k_coco-stuff164k_20210709_155403-51b21115.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3/README.md|MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_512x512_4x4_320k_coco-stuff164k/deeplabv3_r101-d8_512x512_4x4_320k_coco-stuff164k_20210709_155402-3cbca14d.pt | deeplabv3模型在开源社区上的deeplabv3_r101-d8_512x512_4x4_320k_coco-stuff164k_20210709_155402-3cbca14d.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/projects/hssn/decode_head/sep_aspp_contrast_head.py|MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://arxiv.org/abs/1802.0261 | 开源论文url链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3plus/README.md|MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://github.com/open-mmlab/mmsegmentation/blob/v0.17.0/mmseg/models/decode_heads/sep_aspp_head.py#L3 | 预训练模型下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/mobilenet_v2/README.md|MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://github.com/tensorflow/models/tree/master/research/deepla | 源码下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3plus/README.md|MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50-d8_512x1024_40k_cityscapes/deeplabv3plus_r50-d8_512x1024_40k_cityscapes_20200605_094610-d222ffcd.pt | deeplabv3plus模型在开源社区上的deeplabv3plus_r50-d8_512x1024_40k_cityscapes_20200605_094610-d222ffcd.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3plus/README.md|MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_512x1024_40k_cityscapes/deeplabv3plus_r101-d8_512x1024_40k_cityscapes_20200605_094614-3769eecf.pt | deeplabv3plus模型在开源社区上的deeplabv3plus_r101-d8_512x1024_40k_cityscapes_20200605_094614-3769eecf.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3plus/README.md|MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50-d8_769x769_40k_cityscapes/deeplabv3plus_r50-d8_769x769_40k_cityscapes_20200606_114143-1dcb0e3c.pt | deeplabv3plus模型在开源社区上的deeplabv3plus_r50-d8_769x769_40k_cityscapes_20200606_114143-1dcb0e3c.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3plus/README.md|MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_769x769_40k_cityscapes/deeplabv3plus_r101-d8_769x769_40k_cityscapes_20200606_114304-ff414b9e.pt | deeplabv3plus模型在开源社区上的deeplabv3plus_r101-d8_769x769_40k_cityscapes_20200606_114304-ff414b9e.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3plus/README.md|MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r18-d8_512x1024_80k_cityscapes/deeplabv3plus_r18-d8_512x1024_80k_cityscapes_20201226_080942-cff257fe.pt | deeplabv3plus模型在开源社区上的deeplabv3plus_r18-d8_512x1024_80k_cityscapes_20201226_080942-cff257fe.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3plus/README.md|MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50-d8_512x1024_80k_cityscapes/deeplabv3plus_r50-d8_512x1024_80k_cityscapes_20200606_114049-f9fb496d.pt | deeplabv3plus模型在开源社区上的deeplabv3plus_r50-d8_512x1024_80k_cityscapes_20200606_114049-f9fb496d.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3plus/README.md|MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_512x1024_80k_cityscapes/deeplabv3plus_r101-d8_512x1024_80k_cityscapes_20200606_114143-068fcfe9.pt | deeplabv3plus模型在开源社区上的deeplabv3plus_r101-d8_512x1024_80k_cityscapes_20200606_114143-068fcfe9.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3plus/README.md|MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_fp16_512x1024_80k_cityscapes/deeplabv3plus_r101-d8_fp16_512x1024_80k_cityscapes_20200717_230920-f1104f4b.pt | deeplabv3plus模型在开源社区上的deeplabv3plus_r101-d8_fp16_512x1024_80k_cityscapes_20200717_230920-f1104f4b.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3plus/README.md|MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r18-d8_769x769_80k_cityscapes/deeplabv3plus_r18-d8_769x769_80k_cityscapes_20201226_083346-f326e06a.pt | deeplabv3plus模型在开源社区上的deeplabv3plus_r18-d8_769x769_80k_cityscapes_20201226_083346-f326e06a.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3plus/README.md|MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50-d8_769x769_80k_cityscapes/deeplabv3plus_r50-d8_769x769_80k_cityscapes_20200606_210233-0e9dfdc4.pt | deeplabv3plus模型在开源社区上的deeplabv3plus_r50-d8_769x769_80k_cityscapes_20200606_210233-0e9dfdc4.pt的下载链接| -| 开源代码引入 | 
https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3plus/README.md|MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_769x769_80k_cityscapes/deeplabv3plus_r101-d8_769x769_80k_cityscapes_20220406_154720-dfcc0b68.pt | deeplabv3plus模型在开源社区上的deeplabv3plus_r101-d8_769x769_80k_cityscapes_20220406_154720-dfcc0b68.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3plus/README.md|MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d16-mg124_512x1024_40k_cityscapes/deeplabv3plus_r101-d16-mg124_512x1024_40k_cityscapes_20200908_005644-cf9ce186.pt | deeplabv3plus模型在开源社区上的deeplabv3plus_r101-d16-mg124_512x1024_40k_cityscapes_20200908_005644-cf9ce186.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3plus/README.md|MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d16-mg124_512x1024_80k_cityscapes/deeplabv3plus_r101-d16-mg124_512x1024_80k_cityscapes_20200908_005644-ee6158e0.pt | deeplabv3plus模型在开源社区上的deeplabv3plus_r101-d16-mg124_512x1024_80k_cityscapes_20200908_005644-ee6158e0.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3plus/README.md|MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r18b-d8_512x1024_80k_cityscapes/deeplabv3plus_r18b-d8_512x1024_80k_cityscapes_20201226_090828-e451abd9.pt | deeplabv3plus模型在开源社区上的deeplabv3plus_r18b-d8_512x1024_80k_cityscapes_20201226_090828-e451abd9.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3plus/README.md|MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50b-d8_512x1024_80k_cityscapes/deeplabv3plus_r50b-d8_512x1024_80k_cityscapes_20201225_213645-a97e4e43.pt | deeplabv3plus模型在开源社区上的deeplabv3plus_r50b-d8_512x1024_80k_cityscapes_20201225_213645-a97e4e43.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3plus/README.md|MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101b-d8_512x1024_80k_cityscapes/deeplabv3plus_r101b-d8_512x1024_80k_cityscapes_20201226_190843-9c3c93a4.pt | deeplabv3plus模型在开源社区上的deeplabv3plus_r101b-d8_512x1024_80k_cityscapes_20201226_190843-9c3c93a4.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3plus/README.md|MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r18b-d8_769x769_80k_cityscapes/deeplabv3plus_r18b-d8_769x769_80k_cityscapes_20201226_151312-2c868aff.pt | deeplabv3plus模型在开源社区上的deeplabv3plus_r18b-d8_769x769_80k_cityscapes_20201226_151312-2c868aff.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3plus/README.md|MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50b-d8_769x769_80k_cityscapes/deeplabv3plus_r50b-d8_769x769_80k_cityscapes_20201225_224655-8b596d1c.pt | 
deeplabv3plus模型在开源社区上的deeplabv3plus_r50b-d8_769x769_80k_cityscapes_20201225_224655-8b596d1c.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3plus/README.md|MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101b-d8_769x769_80k_cityscapes/deeplabv3plus_r101b-d8_769x769_80k_cityscapes_20201226_205041-227cdf7c.pt | deeplabv3plus模型在开源社区上的deeplabv3plus_r101b-d8_769x769_80k_cityscapes_20201226_205041-227cdf7c.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3plus/README.md|MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50-d8_512x512_80k_ade20k/deeplabv3plus_r50-d8_512x512_80k_ade20k_20200614_185028-bf1400d8.pt | deeplabv3plus模型在开源社区上的deeplabv3plus_r50-d8_512x512_80k_ade20k_20200614_185028-bf1400d8.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3plus/README.md|MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_512x512_80k_ade20k/deeplabv3plus_r101-d8_512x512_80k_ade20k_20200615_014139-d5730af7.pt | deeplabv3plus模型在开源社区上的deeplabv3plus_r101-d8_512x512_80k_ade20k_20200615_014139-d5730af7.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3plus/README.md|MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50-d8_512x512_160k_ade20k/deeplabv3plus_r50-d8_512x512_160k_ade20k_20200615_124504-6135c7e0.pt | deeplabv3plus模型在开源社区上的deeplabv3plus_r50-d8_512x512_160k_ade20k_20200615_124504-6135c7e0.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3plus/README.md|MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_512x512_160k_ade20k/deeplabv3plus_r101-d8_512x512_160k_ade20k_20200615_123232-38ed86bb.pt | deeplabv3plus模型在开源社区上的deeplabv3plus_r101-d8_512x512_160k_ade20k_20200615_123232-38ed86bb.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3plus/README.md|MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50-d8_512x512_20k_voc12aug/deeplabv3plus_r50-d8_512x512_20k_voc12aug_20200617_102323-aad58ef1.pt | deeplabv3plus模型在开源社区上的deeplabv3plus_r50-d8_512x512_20k_voc12aug_20200617_102323-aad58ef1.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3plus/README.md|MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_512x512_20k_voc12aug/deeplabv3plus_r101-d8_512x512_20k_voc12aug_20200617_102345-c7ff3d56.pt | deeplabv3plus模型在开源社区上的deeplabv3plus_r101-d8_512x512_20k_voc12aug_20200617_102345-c7ff3d56.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3plus/README.md|MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50-d8_512x512_40k_voc12aug/deeplabv3plus_r50-d8_512x512_40k_voc12aug_20200613_161759-e1b43aa9.pt | 
deeplabv3plus模型在开源社区上的deeplabv3plus_r50-d8_512x512_40k_voc12aug_20200613_161759-e1b43aa9.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3plus/README.md|MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_512x512_40k_voc12aug/deeplabv3plus_r101-d8_512x512_40k_voc12aug_20200613_205333-faf03387.pt | deeplabv3plus模型在开源社区上的deeplabv3plus_r101-d8_512x512_40k_voc12aug_20200613_205333-faf03387.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3plus/README.md|MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_480x480_40k_pascal_context/deeplabv3plus_r101-d8_480x480_40k_pascal_context_20200911_165459-d3c8a29e.pt | deeplabv3plus模型在开源社区上的deeplabv3plus_r101-d8_480x480_40k_pascal_context_20200911_165459-d3c8a29e.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3plus/README.md|MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_480x480_80k_pascal_context/deeplabv3plus_r101-d8_480x480_80k_pascal_context_20200911_155322-145d3ee8.pt | deeplabv3plus模型在开源社区上的deeplabv3plus_r101-d8_480x480_80k_pascal_context_20200911_155322-145d3ee8.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3plus/README.md|MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_480x480_40k_pascal_context_59/deeplabv3plus_r101-d8_480x480_40k_pascal_context_59_20210416_111233-ed937f15.pt | deeplabv3plus模型在开源社区上的deeplabv3plus_r101-d8_480x480_40k_pascal_context_59_20210416_111233-ed937f15.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3plus/README.md|MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_480x480_80k_pascal_context_59/deeplabv3plus_r101-d8_480x480_80k_pascal_context_59_20210416_111127-7ca0331d.pt | deeplabv3plus模型在开源社区上的deeplabv3plus_r101-d8_480x480_80k_pascal_context_59_20210416_111127-7ca0331d.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3plus/README.md|MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r18-d8_512x512_80k_loveda/deeplabv3plus_r18-d8_512x512_80k_loveda_20211104_132800-ce0fa0ca.pt | deeplabv3plus模型在开源社区上的deeplabv3plus_r18-d8_512x512_80k_loveda_20211104_132800-ce0fa0ca.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3plus/README.md|MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50-d8_512x512_80k_loveda/deeplabv3plus_r50-d8_512x512_80k_loveda_20211105_080442-f0720392.pt | deeplabv3plus模型在开源社区上的deeplabv3plus_r50-d8_512x512_80k_loveda_20211105_080442-f0720392.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3plus/README.md|MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_512x512_80k_loveda/deeplabv3plus_r101-d8_512x512_80k_loveda_20211105_110759-4c1f297e.pt | deeplabv3plus模型在开源社区上的deeplabv3plus_r101-d8_512x512_80k_loveda_20211105_110759-4c1f297e.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3plus/README.md|MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r18-d8_512x512_80k_potsdam/deeplabv3plus_r18-d8_512x512_80k_potsdam_20211219_020601-75fd5bc3.pt | deeplabv3plus模型在开源社区上的deeplabv3plus_r18-d8_512x512_80k_potsdam_20211219_020601-75fd5bc3.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3plus/README.md|MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50-d8_512x512_80k_potsdam/deeplabv3plus_r50-d8_512x512_80k_potsdam_20211219_031508-7e7a2b24.pt | deeplabv3plus模型在开源社区上的deeplabv3plus_r50-d8_512x512_80k_potsdam_20211219_031508-7e7a2b24.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3plus/README.md|MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_512x512_80k_potsdam/deeplabv3plus_r101-d8_512x512_80k_potsdam_20211219_031508-8b112708.pt | deeplabv3plus模型在开源社区上的deeplabv3plus_r101-d8_512x512_80k_potsdam_20211219_031508-8b112708.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3plus/README.md|MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r18-d8_4x4_512x512_80k_vaihingen/deeplabv3plus_r18-d8_4x4_512x512_80k_vaihingen_20211231_230805-7626a263.pt | deeplabv3plus模型在开源社区上的deeplabv3plus_r18-d8_4x4_512x512_80k_vaihingen_20211231_230805-7626a263.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3plus/README.md|MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50-d8_4x4_512x512_80k_vaihingen/deeplabv3plus_r50-d8_4x4_512x512_80k_vaihingen_20211231_230816-5040938d.pt | deeplabv3plus模型在开源社区上的deeplabv3plus_r50-d8_4x4_512x512_80k_vaihingen_20211231_230816-5040938d.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3plus/README.md|MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_4x4_512x512_80k_vaihingen/deeplabv3plus_r101-d8_4x4_512x512_80k_vaihingen_20211231_230816-8a095afa.pt | deeplabv3plus模型在开源社区上的deeplabv3plus_r101-d8_4x4_512x512_80k_vaihingen_20211231_230816-8a095afa.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3plus/README.md|MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r18-d8_4x4_896x896_80k_isaid/deeplabv3plus_r18-d8_4x4_896x896_80k_isaid_20220110_180526-7059991d.pt | deeplabv3plus模型在开源社区上的deeplabv3plus_r18-d8_4x4_896x896_80k_isaid_20220110_180526-7059991d.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/deeplabv3plus/README.md|MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50-d8_4x4_896x896_80k_isaid/deeplabv3plus_r50-d8_4x4_896x896_80k_isaid_20220110_180526-598be439.pt | deeplabv3plus模型在开源社区上的deeplabv3plus_r50-d8_4x4_896x896_80k_isaid_20220110_180526-598be439.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/dmnet/README.md|MMseg-swin/configs/dmnet/dmnet.yml | https://openaccess.thecvf.com/content_ICCV_2019/papers/He_Dynamic_Multi-Scale_Filters_for_Semantic_Segmentation_ICCV_2019_paper.pd | 开源论文url链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/dmnet/README.md|MMseg-swin/configs/dmnet/dmnet.yml | https://github.com/open-mmlab/mmsegmentation/blob/v0.17.0/mmseg/models/decode_heads/dm_head.py#L9 | 预训练模型下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/dmnet/README.md|MMseg-swin/configs/dmnet/dmnet.yml | https://github.com/Junjun2016/DMNe | 源码下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/dmnet/README.md|MMseg-swin/configs/dmnet/dmnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dmnet/dmnet_r50-d8_512x1024_40k_cityscapes/dmnet_r50-d8_512x1024_40k_cityscapes_20201215_042326-615373cf.pt | dmnet模型在开源社区上的dmnet_r50-d8_512x1024_40k_cityscapes_20201215_042326-615373cf.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/dmnet/README.md|MMseg-swin/configs/dmnet/dmnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dmnet/dmnet_r101-d8_512x1024_40k_cityscapes/dmnet_r101-d8_512x1024_40k_cityscapes_20201215_043100-8291e976.pt | dmnet模型在开源社区上的dmnet_r101-d8_512x1024_40k_cityscapes_20201215_043100-8291e976.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/dmnet/README.md|MMseg-swin/configs/dmnet/dmnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dmnet/dmnet_r50-d8_769x769_40k_cityscapes/dmnet_r50-d8_769x769_40k_cityscapes_20201215_093706-e7f0e23e.pt | dmnet模型在开源社区上的dmnet_r50-d8_769x769_40k_cityscapes_20201215_093706-e7f0e23e.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/dmnet/README.md|MMseg-swin/configs/dmnet/dmnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dmnet/dmnet_r101-d8_769x769_40k_cityscapes/dmnet_r101-d8_769x769_40k_cityscapes_20201215_081348-a74261f6.pt | dmnet模型在开源社区上的dmnet_r101-d8_769x769_40k_cityscapes_20201215_081348-a74261f6.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/dmnet/README.md|MMseg-swin/configs/dmnet/dmnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dmnet/dmnet_r50-d8_512x1024_80k_cityscapes/dmnet_r50-d8_512x1024_80k_cityscapes_20201215_053728-3c8893b9.pt | dmnet模型在开源社区上的dmnet_r50-d8_512x1024_80k_cityscapes_20201215_053728-3c8893b9.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/dmnet/README.md|MMseg-swin/configs/dmnet/dmnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dmnet/dmnet_r101-d8_512x1024_80k_cityscapes/dmnet_r101-d8_512x1024_80k_cityscapes_20201215_031718-fa081cb8.pt | dmnet模型在开源社区上的dmnet_r101-d8_512x1024_80k_cityscapes_20201215_031718-fa081cb8.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/dmnet/README.md|MMseg-swin/configs/dmnet/dmnet.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/dmnet/dmnet_r50-d8_769x769_80k_cityscapes/dmnet_r50-d8_769x769_80k_cityscapes_20201215_034006-6060840e.pt | dmnet模型在开源社区上的dmnet_r50-d8_769x769_80k_cityscapes_20201215_034006-6060840e.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/dmnet/README.md|MMseg-swin/configs/dmnet/dmnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dmnet/dmnet_r101-d8_769x769_80k_cityscapes/dmnet_r101-d8_769x769_80k_cityscapes_20201215_082810-7f0de59a.pt | dmnet模型在开源社区上的dmnet_r101-d8_769x769_80k_cityscapes_20201215_082810-7f0de59a.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/dmnet/README.md|MMseg-swin/configs/dmnet/dmnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dmnet/dmnet_r50-d8_512x512_80k_ade20k/dmnet_r50-d8_512x512_80k_ade20k_20201215_144744-f89092a6.pt | dmnet模型在开源社区上的dmnet_r50-d8_512x512_80k_ade20k_20201215_144744-f89092a6.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/dmnet/README.md|MMseg-swin/configs/dmnet/dmnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dmnet/dmnet_r101-d8_512x512_80k_ade20k/dmnet_r101-d8_512x512_80k_ade20k_20201215_104812-bfa45311.pt | dmnet模型在开源社区上的dmnet_r101-d8_512x512_80k_ade20k_20201215_104812-bfa45311.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/dmnet/README.md|MMseg-swin/configs/dmnet/dmnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dmnet/dmnet_r50-d8_512x512_160k_ade20k/dmnet_r50-d8_512x512_160k_ade20k_20201215_115313-025ab3f9.pt | dmnet模型在开源社区上的dmnet_r50-d8_512x512_160k_ade20k_20201215_115313-025ab3f9.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/dmnet/README.md|MMseg-swin/configs/dmnet/dmnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dmnet/dmnet_r101-d8_512x512_160k_ade20k/dmnet_r101-d8_512x512_160k_ade20k_20201215_111145-a0bc02ef.pt | dmnet模型在开源社区上的dmnet_r101-d8_512x512_160k_ade20k_20201215_111145-a0bc02ef.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/dnlnet/README.md|MMseg-swin/configs/dnlnet/dnlnet.yml | https://arxiv.org/abs/2006.0666 | 开源论文url链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/dnlnet/README.md|MMseg-swin/configs/dnlnet/dnlnet.yml | https://github.com/open-mmlab/mmsegmentation/blob/v0.17.0/mmseg/models/decode_heads/dnl_head.py#L8 | 预训练模型下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/dnlnet/README.md|MMseg-swin/configs/dnlnet/dnlnet.yml | https://github.com/yinmh17/DNL-Semantic-Segmentatio | 源码下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/dnlnet/README.md|MMseg-swin/configs/dnlnet/dnlnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dnlnet/dnl_r50-d8_512x1024_40k_cityscapes/dnl_r50-d8_512x1024_40k_cityscapes_20200904_233629-53d4ea93.pt | dnlnet模型在开源社区上的dnl_r50-d8_512x1024_40k_cityscapes_20200904_233629-53d4ea93.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/dnlnet/README.md|MMseg-swin/configs/dnlnet/dnlnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dnlnet/dnl_r101-d8_512x1024_40k_cityscapes/dnl_r101-d8_512x1024_40k_cityscapes_20200904_233629-9928ffef.pt | dnlnet模型在开源社区上的dnl_r101-d8_512x1024_40k_cityscapes_20200904_233629-9928ffef.pt的下载链接| -| 开源代码引入 | 
https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/dnlnet/README.md|MMseg-swin/configs/dnlnet/dnlnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dnlnet/dnl_r50-d8_769x769_40k_cityscapes/dnl_r50-d8_769x769_40k_cityscapes_20200820_232206-0f283785.pt | dnlnet模型在开源社区上的dnl_r50-d8_769x769_40k_cityscapes_20200820_232206-0f283785.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/dnlnet/README.md|MMseg-swin/configs/dnlnet/dnlnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dnlnet/dnl_r101-d8_769x769_40k_cityscapes/dnl_r101-d8_769x769_40k_cityscapes_20200820_171256-76c596df.pt | dnlnet模型在开源社区上的dnl_r101-d8_769x769_40k_cityscapes_20200820_171256-76c596df.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/dnlnet/README.md|MMseg-swin/configs/dnlnet/dnlnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dnlnet/dnl_r50-d8_512x1024_80k_cityscapes/dnl_r50-d8_512x1024_80k_cityscapes_20200904_233629-58b2f778.pt | dnlnet模型在开源社区上的dnl_r50-d8_512x1024_80k_cityscapes_20200904_233629-58b2f778.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/dnlnet/README.md|MMseg-swin/configs/dnlnet/dnlnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dnlnet/dnl_r101-d8_512x1024_80k_cityscapes/dnl_r101-d8_512x1024_80k_cityscapes_20200904_233629-758e2dd4.pt | dnlnet模型在开源社区上的dnl_r101-d8_512x1024_80k_cityscapes_20200904_233629-758e2dd4.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/dnlnet/README.md|MMseg-swin/configs/dnlnet/dnlnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dnlnet/dnl_r50-d8_769x769_80k_cityscapes/dnl_r50-d8_769x769_80k_cityscapes_20200820_011925-366bc4c7.pt | dnlnet模型在开源社区上的dnl_r50-d8_769x769_80k_cityscapes_20200820_011925-366bc4c7.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/dnlnet/README.md|MMseg-swin/configs/dnlnet/dnlnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dnlnet/dnl_r101-d8_769x769_80k_cityscapes/dnl_r101-d8_769x769_80k_cityscapes_20200821_051111-95ff84ab.pt | dnlnet模型在开源社区上的dnl_r101-d8_769x769_80k_cityscapes_20200821_051111-95ff84ab.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/dnlnet/README.md|MMseg-swin/configs/dnlnet/dnlnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dnlnet/dnl_r50-d8_512x512_80k_ade20k/dnl_r50-d8_512x512_80k_ade20k_20200826_183354-1cf6e0c1.pt | dnlnet模型在开源社区上的dnl_r50-d8_512x512_80k_ade20k_20200826_183354-1cf6e0c1.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/dnlnet/README.md|MMseg-swin/configs/dnlnet/dnlnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dnlnet/dnl_r101-d8_512x512_80k_ade20k/dnl_r101-d8_512x512_80k_ade20k_20200826_183354-d820d6ea.pt | dnlnet模型在开源社区上的dnl_r101-d8_512x512_80k_ade20k_20200826_183354-d820d6ea.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/dnlnet/README.md|MMseg-swin/configs/dnlnet/dnlnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dnlnet/dnl_r50-d8_512x512_160k_ade20k/dnl_r50-d8_512x512_160k_ade20k_20200826_183350-37837798.pt | dnlnet模型在开源社区上的dnl_r50-d8_512x512_160k_ade20k_20200826_183350-37837798.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/dnlnet/README.md|MMseg-swin/configs/dnlnet/dnlnet.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/dnlnet/dnl_r101-d8_512x512_160k_ade20k/dnl_r101-d8_512x512_160k_ade20k_20200826_183350-ed522c61.pt | dnlnet模型在开源社区上的dnl_r101-d8_512x512_160k_ade20k_20200826_183350-ed522c61.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/dpt/README.md|MMseg-swin/configs/dpt/dpt.yml | https://arxiv.org/abs/2103.1341 | 开源论文url链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/dpt/README.md|MMseg-swin/configs/dpt/dpt.yml | https://github.com/open-mmlab/mmsegmentation/blob/v0.17.0/mmseg/models/decode_heads/dpt_head.py#L21 | 预训练模型下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/dpt/README.md|MMseg-swin/configs/dpt/dpt.yml | https://github.com/isl-org/DP | 源码下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/dpt/README.md|MMseg-swin/configs/dpt/dpt.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dpt/dpt_vit-b16_512x512_160k_ade20k/dpt_vit-b16_512x512_160k_ade20k-db31cf52.pt | dpt模型在开源社区上的dpt_vit-b16_512x512_160k_ade20k-db31cf52.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/emanet/README.md|MMseg-swin/configs/emanet/emanet.yml | https://arxiv.org/abs/1907.1342 | 开源论文url链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/emanet/README.md|MMseg-swin/configs/emanet/emanet.yml | https://github.com/open-mmlab/mmsegmentation/blob/v0.17.0/mmseg/models/decode_heads/ema_head.py#L8 | 预训练模型下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/emanet/README.md|MMseg-swin/configs/emanet/emanet.yml | https://xialipku.github.io/EMANe | 源码下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/emanet/README.md|MMseg-swin/configs/emanet/emanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/emanet/emanet_r50-d8_512x1024_80k_cityscapes/emanet_r50-d8_512x1024_80k_cityscapes_20200901_100301-c43fcef1.pt | emanet模型在开源社区上的emanet_r50-d8_512x1024_80k_cityscapes_20200901_100301-c43fcef1.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/emanet/README.md|MMseg-swin/configs/emanet/emanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/emanet/emanet_r101-d8_512x1024_80k_cityscapes/emanet_r101-d8_512x1024_80k_cityscapes_20200901_100301-2d970745.pt | emanet模型在开源社区上的emanet_r101-d8_512x1024_80k_cityscapes_20200901_100301-2d970745.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/emanet/README.md|MMseg-swin/configs/emanet/emanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/emanet/emanet_r50-d8_769x769_80k_cityscapes/emanet_r50-d8_769x769_80k_cityscapes_20200901_100301-16f8de52.pt | emanet模型在开源社区上的emanet_r50-d8_769x769_80k_cityscapes_20200901_100301-16f8de52.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/emanet/README.md|MMseg-swin/configs/emanet/emanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/emanet/emanet_r101-d8_769x769_80k_cityscapes/emanet_r101-d8_769x769_80k_cityscapes_20200901_100301-47a324ce.pt | emanet模型在开源社区上的emanet_r101-d8_769x769_80k_cityscapes_20200901_100301-47a324ce.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/encnet/README.md|MMseg-swin/configs/encnet/encnet.yml | https://arxiv.org/abs/1803.0890 | 开源论文url链接| -| 开源代码引入 | 
https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/encnet/README.md|MMseg-swin/configs/encnet/encnet.yml | https://github.com/open-mmlab/mmsegmentation/blob/v0.17.0/mmseg/models/decode_heads/enc_head.py#L6 | 预训练模型下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/encnet/README.md|MMseg-swin/configs/encnet/encnet.yml | https://github.com/zhanghang1989/PyTorch-Encodin | 源码下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/encnet/README.md|MMseg-swin/configs/encnet/encnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/encnet/encnet_r50-d8_512x1024_40k_cityscapes/encnet_r50-d8_512x1024_40k_cityscapes_20200621_220958-68638a47.pt | encnet模型在开源社区上的encnet_r50-d8_512x1024_40k_cityscapes_20200621_220958-68638a47.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/encnet/README.md|MMseg-swin/configs/encnet/encnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/encnet/encnet_r101-d8_512x1024_40k_cityscapes/encnet_r101-d8_512x1024_40k_cityscapes_20200621_220933-35e0a3e8.pt | encnet模型在开源社区上的encnet_r101-d8_512x1024_40k_cityscapes_20200621_220933-35e0a3e8.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/encnet/README.md|MMseg-swin/configs/encnet/encnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/encnet/encnet_r50-d8_769x769_40k_cityscapes/encnet_r50-d8_769x769_40k_cityscapes_20200621_220958-3bcd2884.pt | encnet模型在开源社区上的encnet_r50-d8_769x769_40k_cityscapes_20200621_220958-3bcd2884.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/encnet/README.md|MMseg-swin/configs/encnet/encnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/encnet/encnet_r101-d8_769x769_40k_cityscapes/encnet_r101-d8_769x769_40k_cityscapes_20200621_220933-2fafed55.pt | encnet模型在开源社区上的encnet_r101-d8_769x769_40k_cityscapes_20200621_220933-2fafed55.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/encnet/README.md|MMseg-swin/configs/encnet/encnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/encnet/encnet_r50-d8_512x1024_80k_cityscapes/encnet_r50-d8_512x1024_80k_cityscapes_20200622_003554-fc5c5624.pt | encnet模型在开源社区上的encnet_r50-d8_512x1024_80k_cityscapes_20200622_003554-fc5c5624.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/encnet/README.md|MMseg-swin/configs/encnet/encnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/encnet/encnet_r101-d8_512x1024_80k_cityscapes/encnet_r101-d8_512x1024_80k_cityscapes_20200622_003555-1de64bec.pt | encnet模型在开源社区上的encnet_r101-d8_512x1024_80k_cityscapes_20200622_003555-1de64bec.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/encnet/README.md|MMseg-swin/configs/encnet/encnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/encnet/encnet_r50-d8_769x769_80k_cityscapes/encnet_r50-d8_769x769_80k_cityscapes_20200622_003554-55096dcb.pt | encnet模型在开源社区上的encnet_r50-d8_769x769_80k_cityscapes_20200622_003554-55096dcb.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/encnet/README.md|MMseg-swin/configs/encnet/encnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/encnet/encnet_r101-d8_769x769_80k_cityscapes/encnet_r101-d8_769x769_80k_cityscapes_20200622_003555-470ef79d.pt | encnet模型在开源社区上的encnet_r101-d8_769x769_80k_cityscapes_20200622_003555-470ef79d.pt的下载链接| -| 
开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/encnet/README.md|MMseg-swin/configs/encnet/encnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/encnet/encnet_r50-d8_512x512_80k_ade20k/encnet_r50-d8_512x512_80k_ade20k_20200622_042412-44b46b04.pt | encnet模型在开源社区上的encnet_r50-d8_512x512_80k_ade20k_20200622_042412-44b46b04.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/encnet/README.md|MMseg-swin/configs/encnet/encnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/encnet/encnet_r101-d8_512x512_80k_ade20k/encnet_r101-d8_512x512_80k_ade20k_20200622_101128-dd35e237.pt | encnet模型在开源社区上的encnet_r101-d8_512x512_80k_ade20k_20200622_101128-dd35e237.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/encnet/README.md|MMseg-swin/configs/encnet/encnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/encnet/encnet_r50-d8_512x512_160k_ade20k/encnet_r50-d8_512x512_160k_ade20k_20200622_101059-b2db95e0.pt | encnet模型在开源社区上的encnet_r50-d8_512x512_160k_ade20k_20200622_101059-b2db95e0.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/encnet/README.md|MMseg-swin/configs/encnet/encnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/encnet/encnet_r101-d8_512x512_160k_ade20k/encnet_r101-d8_512x512_160k_ade20k_20200622_073348-7989641f.pt | encnet模型在开源社区上的encnet_r101-d8_512x512_160k_ade20k_20200622_073348-7989641f.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/erfnet/README.md|MMseg-swin/configs/erfnet/erfnet.yml | http://www.robesafe.uah.es/personal/eduardo.romera/pdfs/Romera17tits.pd | 开源论文url链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/erfnet/README.md|MMseg-swin/configs/erfnet/erfnet.yml | https://github.com/open-mmlab/mmsegmentation/blob/v0.20.0/mmseg/models/backbones/erfnet.py#L32 | 预训练模型下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/erfnet/README.md|MMseg-swin/configs/erfnet/erfnet.yml | https://github.com/Eromera/erfnet_pytorc | 源码下载链接| -| 开发引入 | / |MMseg-swin/configs/erfnet/erfnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/erfnet/erfnet_fcn_4x4_512x1024_160k_cityscapes/erfnet_fcn_4x4_512x1024_160k_cityscapes_20211126_082056-03d333ed.pt | erfnet模型在开源社区上的erfnet_fcn_4x4_512x1024_160k_cityscapes_20211126_082056-03d333ed.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/fastfcn/README.md|MMseg-swin/configs/fastfcn/fastfcn.yml | https://arxiv.org/abs/1903.1181 | 开源论文url链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/fastfcn/README.md|MMseg-swin/configs/fastfcn/fastfcn.yml | https://github.com/open-mmlab/mmsegmentation/blob/v0.18.0/mmseg/models/necks/jpu.py#L1 | 预训练模型下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/fastfcn/README.md|MMseg-swin/configs/fastfcn/fastfcn.yml | https://github.com/wuhuikai/FastFC | 源码下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/fastfcn/README.md|MMseg-swin/configs/fastfcn/fastfcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fastfcn/fastfcn_r50-d32_jpu_aspp_512x1024_80k_cityscapes/fastfcn_r50-d32_jpu_aspp_512x1024_80k_cityscapes_20210928_053722-5d1a2648.pt | fastfcn模型在开源社区上的fastfcn_r50-d32_jpu_aspp_512x1024_80k_cityscapes_20210928_053722-5d1a2648.pt的下载链接| -| 开源代码引入 | 
https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/fastfcn/README.md|MMseg-swin/configs/fastfcn/fastfcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fastfcn/fastfcn_r50-d32_jpu_aspp_4x4_512x1024_80k_cityscapes/fastfcn_r50-d32_jpu_aspp_4x4_512x1024_80k_cityscapes_20210924_214357-72220849.pt | fastfcn模型在开源社区上的fastfcn_r50-d32_jpu_aspp_4x4_512x1024_80k_cityscapes_20210924_214357-72220849.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/fastfcn/README.md|MMseg-swin/configs/fastfcn/fastfcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fastfcn/fastfcn_r50-d32_jpu_psp_512x1024_80k_cityscapes/fastfcn_r50-d32_jpu_psp_512x1024_80k_cityscapes_20210928_053722-57749bed.pt | fastfcn模型在开源社区上的fastfcn_r50-d32_jpu_psp_512x1024_80k_cityscapes_20210928_053722-57749bed.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/fastfcn/README.md|MMseg-swin/configs/fastfcn/fastfcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fastfcn/fastfcn_r50-d32_jpu_psp_4x4_512x1024_80k_cityscapes/fastfcn_r50-d32_jpu_psp_4x4_512x1024_80k_cityscapes_20210925_061841-77e87b0a.pt | fastfcn模型在开源社区上的fastfcn_r50-d32_jpu_psp_4x4_512x1024_80k_cityscapes_20210925_061841-77e87b0a.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/fastfcn/README.md|MMseg-swin/configs/fastfcn/fastfcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fastfcn/fastfcn_r50-d32_jpu_enc_512x1024_80k_cityscapes/fastfcn_r50-d32_jpu_enc_512x1024_80k_cityscapes_20210928_030036-78da5046.pt | fastfcn模型在开源社区上的fastfcn_r50-d32_jpu_enc_512x1024_80k_cityscapes_20210928_030036-78da5046.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/fastfcn/README.md|MMseg-swin/configs/fastfcn/fastfcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fastfcn/fastfcn_r50-d32_jpu_enc_4x4_512x1024_80k_cityscapes/fastfcn_r50-d32_jpu_enc_4x4_512x1024_80k_cityscapes_20210926_093217-e1eb6dbb.pt | fastfcn模型在开源社区上的fastfcn_r50-d32_jpu_enc_4x4_512x1024_80k_cityscapes_20210926_093217-e1eb6dbb.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/fastfcn/README.md|MMseg-swin/configs/fastfcn/fastfcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fastfcn/fastfcn_r50-d32_jpu_aspp_512x512_80k_ade20k/fastfcn_r50-d32_jpu_aspp_512x512_80k_ade20k_20211013_190619-3aa40f2d.pt | fastfcn模型在开源社区上的fastfcn_r50-d32_jpu_aspp_512x512_80k_ade20k_20211013_190619-3aa40f2d.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/fastfcn/README.md|MMseg-swin/configs/fastfcn/fastfcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fastfcn/fastfcn_r50-d32_jpu_aspp_512x512_160k_ade20k/fastfcn_r50-d32_jpu_aspp_512x512_160k_ade20k_20211008_152246-27036aee.pt | fastfcn模型在开源社区上的fastfcn_r50-d32_jpu_aspp_512x512_160k_ade20k_20211008_152246-27036aee.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/fastfcn/README.md|MMseg-swin/configs/fastfcn/fastfcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fastfcn/fastfcn_r50-d32_jpu_psp_512x512_80k_ade20k/fastfcn_r50-d32_jpu_psp_512x512_80k_ade20k_20210930_225137-993d07c8.pt | fastfcn模型在开源社区上的fastfcn_r50-d32_jpu_psp_512x512_80k_ade20k_20210930_225137-993d07c8.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/fastfcn/README.md|MMseg-swin/configs/fastfcn/fastfcn.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/fastfcn/fastfcn_r50-d32_jpu_psp_512x512_160k_ade20k/fastfcn_r50-d32_jpu_psp_512x512_160k_ade20k_20211008_105455-e8f5a2fd.pt | fastfcn模型在开源社区上的fastfcn_r50-d32_jpu_psp_512x512_160k_ade20k_20211008_105455-e8f5a2fd.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/fastfcn/README.md|MMseg-swin/configs/fastfcn/fastfcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fastfcn/fastfcn_r50-d32_jpu_enc_512x512_80k_ade20k/fastfcn_r50-d32_jpu_enc_512x512_80k_ade20k_20210930_225214-65aef6dd.pt | fastfcn模型在开源社区上的fastfcn_r50-d32_jpu_enc_512x512_80k_ade20k_20210930_225214-65aef6dd.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/fastfcn/README.md|MMseg-swin/configs/fastfcn/fastfcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fastfcn/fastfcn_r50-d32_jpu_enc_512x512_160k_ade20k/fastfcn_r50-d32_jpu_enc_512x512_160k_ade20k_20211008_105456-d875ce3c.pt | fastfcn模型在开源社区上的fastfcn_r50-d32_jpu_enc_512x512_160k_ade20k_20211008_105456-d875ce3c.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/fastscnn/README.md|MMseg-swin/configs/fastscnn/fastscnn.yml | https://arxiv.org/abs/1902.0450 | 开源论文url链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/fastscnn/README.md|MMseg-swin/configs/fastscnn/fastscnn.yml | https://github.com/open-mmlab/mmsegmentation/blob/v0.17.0/mmseg/models/backbones/fast_scnn.py#L27 | 预训练模型下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/fastscnn/README.md|MMseg-swin/configs/fastscnn/fastscnn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fast_scnn/fast_scnn_lr0.12_8x4_160k_cityscapes/fast_scnn_lr0.12_8x4_160k_cityscapes_20210630_164853-0cec9937.pt | fast_scnn模型在开源社区上的fast_scnn_lr0.12_8x4_160k_cityscapes_20210630_164853-0cec9937.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/fcn/README.md|MMseg-swin/configs/fcn/fcn.yml | https://arxiv.org/abs/1411.403 | 开源论文url链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/fcn/README.md|MMseg-swin/configs/fcn/fcn.yml | https://github.com/open-mmlab/mmsegmentation/blob/v0.17.0/mmseg/models/decode_heads/fcn_head.py#L1 | 预训练模型下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/fcn/README.md|MMseg-swin/configs/fcn/fcn.yml | https://github.com/BVLC/caffe/wiki/Model-Zoo#fc | 源码下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/projects/example_project/README.md|MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r50-d8_512x1024_40k_cityscapes/fcn_r50-d8_512x1024_40k_cityscapes_20200604_192608-efe53f0d.pt | fcn模型在开源社区上的fcn_r50-d8_512x1024_40k_cityscapes_20200604_192608-efe53f0d.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/fcn/README.md|MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_512x1024_40k_cityscapes/fcn_r101-d8_512x1024_40k_cityscapes_20200604_181852-a883d3a1.pt | fcn模型在开源社区上的fcn_r101-d8_512x1024_40k_cityscapes_20200604_181852-a883d3a1.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/fcn/README.md|MMseg-swin/configs/fcn/fcn.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r50-d8_769x769_40k_cityscapes/fcn_r50-d8_769x769_40k_cityscapes_20200606_113104-977b5d02.pt | fcn模型在开源社区上的fcn_r50-d8_769x769_40k_cityscapes_20200606_113104-977b5d02.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/fcn/README.md|MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_769x769_40k_cityscapes/fcn_r101-d8_769x769_40k_cityscapes_20200606_113208-7d4ab69c.pt | fcn模型在开源社区上的fcn_r101-d8_769x769_40k_cityscapes_20200606_113208-7d4ab69c.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/fcn/README.md|MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r18-d8_512x1024_80k_cityscapes/fcn_r18-d8_512x1024_80k_cityscapes_20201225_021327-6c50f8b4.pt | fcn模型在开源社区上的fcn_r18-d8_512x1024_80k_cityscapes_20201225_021327-6c50f8b4.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/fcn/README.md|MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r50-d8_512x1024_80k_cityscapes/fcn_r50-d8_512x1024_80k_cityscapes_20200606_113019-03aa804d.pt | fcn模型在开源社区上的fcn_r50-d8_512x1024_80k_cityscapes_20200606_113019-03aa804d.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/fcn/README.md|MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_512x1024_80k_cityscapes/fcn_r101-d8_512x1024_80k_cityscapes_20200606_113038-3fb937eb.pt | fcn模型在开源社区上的fcn_r101-d8_512x1024_80k_cityscapes_20200606_113038-3fb937eb.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/fcn/README.md|MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_fp16_512x1024_80k_cityscapes/fcn_r101-d8_fp16_512x1024_80k_cityscapes_20200717_230921-fb13e883.pt | fcn模型在开源社区上的fcn_r101-d8_fp16_512x1024_80k_cityscapes_20200717_230921-fb13e883.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/fcn/README.md|MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r18-d8_769x769_80k_cityscapes/fcn_r18-d8_769x769_80k_cityscapes_20201225_021451-9739d1b8.pt | fcn模型在开源社区上的fcn_r18-d8_769x769_80k_cityscapes_20201225_021451-9739d1b8.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/fcn/README.md|MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r50-d8_769x769_80k_cityscapes/fcn_r50-d8_769x769_80k_cityscapes_20200606_195749-f5caeabc.pt | fcn模型在开源社区上的fcn_r50-d8_769x769_80k_cityscapes_20200606_195749-f5caeabc.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/fcn/README.md|MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_769x769_80k_cityscapes/fcn_r101-d8_769x769_80k_cityscapes_20200606_214354-45cbac68.pt | fcn模型在开源社区上的fcn_r101-d8_769x769_80k_cityscapes_20200606_214354-45cbac68.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/fcn/README.md|MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r18b-d8_512x1024_80k_cityscapes/fcn_r18b-d8_512x1024_80k_cityscapes_20201225_230143-92c0f445.pt | fcn模型在开源社区上的fcn_r18b-d8_512x1024_80k_cityscapes_20201225_230143-92c0f445.pt的下载链接| -| 开源代码引入 | 
https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/fcn/README.md|MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r50b-d8_512x1024_80k_cityscapes/fcn_r50b-d8_512x1024_80k_cityscapes_20201225_094221-82957416.pt | fcn模型在开源社区上的fcn_r50b-d8_512x1024_80k_cityscapes_20201225_094221-82957416.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/fcn/README.md|MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101b-d8_512x1024_80k_cityscapes/fcn_r101b-d8_512x1024_80k_cityscapes_20201226_160213-4543858f.pt | fcn模型在开源社区上的fcn_r101b-d8_512x1024_80k_cityscapes_20201226_160213-4543858f.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/fcn/README.md|MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r18b-d8_769x769_80k_cityscapes/fcn_r18b-d8_769x769_80k_cityscapes_20201226_004430-32d504e5.pt | fcn模型在开源社区上的fcn_r18b-d8_769x769_80k_cityscapes_20201226_004430-32d504e5.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/fcn/README.md|MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r50b-d8_769x769_80k_cityscapes/fcn_r50b-d8_769x769_80k_cityscapes_20201225_094223-94552d38.pt | fcn模型在开源社区上的fcn_r50b-d8_769x769_80k_cityscapes_20201225_094223-94552d38.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/fcn/README.md|MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101b-d8_769x769_80k_cityscapes/fcn_r101b-d8_769x769_80k_cityscapes_20201226_170012-82be37e2.pt | fcn模型在开源社区上的fcn_r101b-d8_769x769_80k_cityscapes_20201226_170012-82be37e2.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/fcn/README.md|MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_d6_r50-d16_512x1024_40k_cityscapes/fcn_d6_r50-d16_512x1024_40k_cityscapes_20210305_130133-98d5d1bc.pt | fcn模型在开源社区上的fcn_d6_r50-d16_512x1024_40k_cityscapes_20210305_130133-98d5d1bc.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/fcn/README.md|MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_d6_r50-d16_512x1024_80k_cityscapes/fcn_d6_r50-d16_512x1024_80k_cityscapes_20210306_115604-133c292f.pt | fcn模型在开源社区上的fcn_d6_r50-d16_512x1024_80k_cityscapes_20210306_115604-133c292f.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/fcn/README.md|MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_d6_r50-d16_769x769_40k_cityscapes/fcn_d6_r50-d16_769x769_40k_cityscapes_20210305_185744-1aab18ed.pt | fcn模型在开源社区上的fcn_d6_r50-d16_769x769_40k_cityscapes_20210305_185744-1aab18ed.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/fcn/README.md|MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_d6_r50-d16_769x769_80k_cityscapes/fcn_d6_r50-d16_769x769_80k_cityscapes_20210305_200413-109d88eb.pt | fcn模型在开源社区上的fcn_d6_r50-d16_769x769_80k_cityscapes_20210305_200413-109d88eb.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/fcn/README.md|MMseg-swin/configs/fcn/fcn.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_d6_r101-d16_512x1024_40k_cityscapes/fcn_d6_r101-d16_512x1024_40k_cityscapes_20210305_130337-9cf2b450.pt | fcn模型在开源社区上的fcn_d6_r101-d16_512x1024_40k_cityscapes_20210305_130337-9cf2b450.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/fcn/README.md|MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_d6_r101-d16_512x1024_80k_cityscapes/fcn_d6_r101-d16_512x1024_80k_cityscapes_20210308_102747-cb336445.pt | fcn模型在开源社区上的fcn_d6_r101-d16_512x1024_80k_cityscapes_20210308_102747-cb336445.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/fcn/README.md|MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_d6_r101-d16_769x769_40k_cityscapes/fcn_d6_r101-d16_769x769_40k_cityscapes_20210308_102453-60b114e9.pt | fcn模型在开源社区上的fcn_d6_r101-d16_769x769_40k_cityscapes_20210308_102453-60b114e9.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/fcn/README.md|MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_d6_r101-d16_769x769_80k_cityscapes/fcn_d6_r101-d16_769x769_80k_cityscapes_20210306_120016-e33adc4f.pt | fcn模型在开源社区上的fcn_d6_r101-d16_769x769_80k_cityscapes_20210306_120016-e33adc4f.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/fcn/README.md|MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_d6_r50b-d16_512x1024_80k_cityscapes/fcn_d6_r50b-d16_512x1024_80k_cityscapes_20210311_125550-6a0b62e9.pt | fcn模型在开源社区上的fcn_d6_r50b-d16_512x1024_80k_cityscapes_20210311_125550-6a0b62e9.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/fcn/README.md|MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_d6_r50b-d16_769x769_80k_cityscapes/fcn_d6_r50b-d16_769x769_80k_cityscapes_20210311_131012-d665f231.pt | fcn模型在开源社区上的fcn_d6_r50b-d16_769x769_80k_cityscapes_20210311_131012-d665f231.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/fcn/README.md|MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_d6_r101b-d16_512x1024_80k_cityscapes/fcn_d6_r101b-d16_512x1024_80k_cityscapes_20210311_144305-3f2eb5b4.pt | fcn模型在开源社区上的fcn_d6_r101b-d16_512x1024_80k_cityscapes_20210311_144305-3f2eb5b4.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/fcn/README.md|MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_d6_r101b-d16_769x769_80k_cityscapes/fcn_d6_r101b-d16_769x769_80k_cityscapes_20210311_154527-c4d8bfbc.pt | fcn模型在开源社区上的fcn_d6_r101b-d16_769x769_80k_cityscapes_20210311_154527-c4d8bfbc.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/fcn/README.md|MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r50-d8_512x512_80k_ade20k/fcn_r50-d8_512x512_80k_ade20k_20200614_144016-f8ac5082.pt | fcn模型在开源社区上的fcn_r50-d8_512x512_80k_ade20k_20200614_144016-f8ac5082.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/fcn/README.md|MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_512x512_80k_ade20k/fcn_r101-d8_512x512_80k_ade20k_20200615_014143-bc1809f7.pt | 
fcn模型在开源社区上的fcn_r101-d8_512x512_80k_ade20k_20200615_014143-bc1809f7.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/fcn/README.md|MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r50-d8_512x512_160k_ade20k/fcn_r50-d8_512x512_160k_ade20k_20200615_100713-4edbc3b4.pt | fcn模型在开源社区上的fcn_r50-d8_512x512_160k_ade20k_20200615_100713-4edbc3b4.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/fcn/README.md|MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_512x512_160k_ade20k/fcn_r101-d8_512x512_160k_ade20k_20200615_105816-fd192bd5.pt | fcn模型在开源社区上的fcn_r101-d8_512x512_160k_ade20k_20200615_105816-fd192bd5.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/fcn/README.md|MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r50-d8_512x512_20k_voc12aug/fcn_r50-d8_512x512_20k_voc12aug_20200617_010715-52dc5306.pt | fcn模型在开源社区上的fcn_r50-d8_512x512_20k_voc12aug_20200617_010715-52dc5306.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/fcn/README.md|MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_512x512_20k_voc12aug/fcn_r101-d8_512x512_20k_voc12aug_20200617_010842-0bb4e798.pt | fcn模型在开源社区上的fcn_r101-d8_512x512_20k_voc12aug_20200617_010842-0bb4e798.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/fcn/README.md|MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r50-d8_512x512_40k_voc12aug/fcn_r50-d8_512x512_40k_voc12aug_20200613_161222-5e2dbf40.pt | fcn模型在开源社区上的fcn_r50-d8_512x512_40k_voc12aug_20200613_161222-5e2dbf40.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/fcn/README.md|MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_512x512_40k_voc12aug/fcn_r101-d8_512x512_40k_voc12aug_20200613_161240-4c8bcefd.pt | fcn模型在开源社区上的fcn_r101-d8_512x512_40k_voc12aug_20200613_161240-4c8bcefd.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/fcn/README.md|MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_480x480_40k_pascal_context/fcn_r101-d8_480x480_40k_pascal_context_20210421_154757-b5e97937.pt | fcn模型在开源社区上的fcn_r101-d8_480x480_40k_pascal_context_20210421_154757-b5e97937.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/fcn/README.md|MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_480x480_80k_pascal_context/fcn_r101-d8_480x480_80k_pascal_context_20210421_163310-4711813f.pt | fcn模型在开源社区上的fcn_r101-d8_480x480_80k_pascal_context_20210421_163310-4711813f.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/fcn/README.md|MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_480x480_40k_pascal_context_59/fcn_r101-d8_480x480_40k_pascal_context_59_20210415_230724-8cf83682.pt | fcn模型在开源社区上的fcn_r101-d8_480x480_40k_pascal_context_59_20210415_230724-8cf83682.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/fcn/README.md|MMseg-swin/configs/fcn/fcn.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_480x480_80k_pascal_context_59/fcn_r101-d8_480x480_80k_pascal_context_59_20210416_110804-9a6f2c94.pt | fcn模型在开源社区上的fcn_r101-d8_480x480_80k_pascal_context_59_20210416_110804-9a6f2c94.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/gcnet/README.md|MMseg-swin/configs/gcnet/gcnet.yml | https://arxiv.org/abs/1904.1149 | 开源论文url链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/gcnet/README.md|MMseg-swin/configs/gcnet/gcnet.yml | https://github.com/open-mmlab/mmsegmentation/blob/v0.17.0/mmseg/models/decode_heads/gc_head.py#L1 | 预训练模型下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/gcnet/README.md|MMseg-swin/configs/gcnet/gcnet.yml | https://github.com/xvjiarui/GCNe | 源码下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/gcnet/README.md|MMseg-swin/configs/gcnet/gcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r50-d8_512x1024_40k_cityscapes/gcnet_r50-d8_512x1024_40k_cityscapes_20200618_074436-4b0fd17b.pt | gcnet模型在开源社区上的gcnet_r50-d8_512x1024_40k_cityscapes_20200618_074436-4b0fd17b.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/gcnet/README.md|MMseg-swin/configs/gcnet/gcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r101-d8_512x1024_40k_cityscapes/gcnet_r101-d8_512x1024_40k_cityscapes_20200618_074436-5e62567f.pt | gcnet模型在开源社区上的gcnet_r101-d8_512x1024_40k_cityscapes_20200618_074436-5e62567f.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/gcnet/README.md|MMseg-swin/configs/gcnet/gcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r50-d8_769x769_40k_cityscapes/gcnet_r50-d8_769x769_40k_cityscapes_20200618_182814-a26f4471.pt | gcnet模型在开源社区上的gcnet_r50-d8_769x769_40k_cityscapes_20200618_182814-a26f4471.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/gcnet/README.md|MMseg-swin/configs/gcnet/gcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r101-d8_769x769_40k_cityscapes/gcnet_r101-d8_769x769_40k_cityscapes_20200619_092550-ca4f0a84.pt | gcnet模型在开源社区上的gcnet_r101-d8_769x769_40k_cityscapes_20200619_092550-ca4f0a84.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/gcnet/README.md|MMseg-swin/configs/gcnet/gcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r50-d8_512x1024_80k_cityscapes/gcnet_r50-d8_512x1024_80k_cityscapes_20200618_074450-ef8f069b.pt | gcnet模型在开源社区上的gcnet_r50-d8_512x1024_80k_cityscapes_20200618_074450-ef8f069b.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/gcnet/README.md|MMseg-swin/configs/gcnet/gcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r101-d8_512x1024_80k_cityscapes/gcnet_r101-d8_512x1024_80k_cityscapes_20200618_074450-778ebf69.pt | gcnet模型在开源社区上的gcnet_r101-d8_512x1024_80k_cityscapes_20200618_074450-778ebf69.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/gcnet/README.md|MMseg-swin/configs/gcnet/gcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r50-d8_769x769_80k_cityscapes/gcnet_r50-d8_769x769_80k_cityscapes_20200619_092516-4839565b.pt | gcnet模型在开源社区上的gcnet_r50-d8_769x769_80k_cityscapes_20200619_092516-4839565b.pt的下载链接| -| 开源代码引入 | 
https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/gcnet/README.md|MMseg-swin/configs/gcnet/gcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r101-d8_769x769_80k_cityscapes/gcnet_r101-d8_769x769_80k_cityscapes_20200619_092628-8e043423.pt | gcnet模型在开源社区上的gcnet_r101-d8_769x769_80k_cityscapes_20200619_092628-8e043423.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/gcnet/README.md|MMseg-swin/configs/gcnet/gcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r50-d8_512x512_80k_ade20k/gcnet_r50-d8_512x512_80k_ade20k_20200614_185146-91a6da41.pt | gcnet模型在开源社区上的gcnet_r50-d8_512x512_80k_ade20k_20200614_185146-91a6da41.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/gcnet/README.md|MMseg-swin/configs/gcnet/gcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r101-d8_512x512_80k_ade20k/gcnet_r101-d8_512x512_80k_ade20k_20200615_020811-c3fcb6dd.pt | gcnet模型在开源社区上的gcnet_r101-d8_512x512_80k_ade20k_20200615_020811-c3fcb6dd.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/gcnet/README.md|MMseg-swin/configs/gcnet/gcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r50-d8_512x512_160k_ade20k/gcnet_r50-d8_512x512_160k_ade20k_20200615_224122-d95f3e1f.pt | gcnet模型在开源社区上的gcnet_r50-d8_512x512_160k_ade20k_20200615_224122-d95f3e1f.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/gcnet/README.md|MMseg-swin/configs/gcnet/gcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r101-d8_512x512_160k_ade20k/gcnet_r101-d8_512x512_160k_ade20k_20200615_225406-615528d7.pt | gcnet模型在开源社区上的gcnet_r101-d8_512x512_160k_ade20k_20200615_225406-615528d7.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/gcnet/README.md|MMseg-swin/configs/gcnet/gcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r50-d8_512x512_20k_voc12aug/gcnet_r50-d8_512x512_20k_voc12aug_20200617_165701-3cbfdab1.pt | gcnet模型在开源社区上的gcnet_r50-d8_512x512_20k_voc12aug_20200617_165701-3cbfdab1.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/gcnet/README.md|MMseg-swin/configs/gcnet/gcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r101-d8_512x512_20k_voc12aug/gcnet_r101-d8_512x512_20k_voc12aug_20200617_165713-6c720aa9.pt | gcnet模型在开源社区上的gcnet_r101-d8_512x512_20k_voc12aug_20200617_165713-6c720aa9.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/gcnet/README.md|MMseg-swin/configs/gcnet/gcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r50-d8_512x512_40k_voc12aug/gcnet_r50-d8_512x512_40k_voc12aug_20200613_195105-9797336d.pt | gcnet模型在开源社区上的gcnet_r50-d8_512x512_40k_voc12aug_20200613_195105-9797336d.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/gcnet/README.md|MMseg-swin/configs/gcnet/gcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r101-d8_512x512_40k_voc12aug/gcnet_r101-d8_512x512_40k_voc12aug_20200613_185806-1e38208d.pt | gcnet模型在开源社区上的gcnet_r101-d8_512x512_40k_voc12aug_20200613_185806-1e38208d.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/hrnet/README.md|MMseg-swin/configs/hrnet/hrnet.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18s_512x1024_40k_cityscapes/fcn_hr18s_512x1024_40k_cityscapes_20200601_014216-93db27d0.pt | hrnet模型在开源社区上的fcn_hr18s_512x1024_40k_cityscapes_20200601_014216-93db27d0.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/hrnet/README.md|MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18_512x1024_40k_cityscapes/fcn_hr18_512x1024_40k_cityscapes_20200601_014216-f196fb4e.pt | hrnet模型在开源社区上的fcn_hr18_512x1024_40k_cityscapes_20200601_014216-f196fb4e.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/hrnet/README.md|MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_512x1024_40k_cityscapes/fcn_hr48_512x1024_40k_cityscapes_20200601_014240-a989b146.pt | hrnet模型在开源社区上的fcn_hr48_512x1024_40k_cityscapes_20200601_014240-a989b146.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/hrnet/README.md|MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18s_512x1024_80k_cityscapes/fcn_hr18s_512x1024_80k_cityscapes_20200601_202700-1462b75d.pt | hrnet模型在开源社区上的fcn_hr18s_512x1024_80k_cityscapes_20200601_202700-1462b75d.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/hrnet/README.md|MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18_512x1024_80k_cityscapes/fcn_hr18_512x1024_80k_cityscapes_20200601_223255-4e7b345e.pt | hrnet模型在开源社区上的fcn_hr18_512x1024_80k_cityscapes_20200601_223255-4e7b345e.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/hrnet/README.md|MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_512x1024_80k_cityscapes/fcn_hr48_512x1024_80k_cityscapes_20200601_202606-58ea95d6.pt | hrnet模型在开源社区上的fcn_hr48_512x1024_80k_cityscapes_20200601_202606-58ea95d6.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/hrnet/README.md|MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18s_512x1024_160k_cityscapes/fcn_hr18s_512x1024_160k_cityscapes_20200602_190901-4a0797ea.pt | hrnet模型在开源社区上的fcn_hr18s_512x1024_160k_cityscapes_20200602_190901-4a0797ea.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/hrnet/README.md|MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18_512x1024_160k_cityscapes/fcn_hr18_512x1024_160k_cityscapes_20200602_190822-221e4a4f.pt | hrnet模型在开源社区上的fcn_hr18_512x1024_160k_cityscapes_20200602_190822-221e4a4f.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/hrnet/README.md|MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_512x1024_160k_cityscapes/fcn_hr48_512x1024_160k_cityscapes_20200602_190946-59b7973e.pt | hrnet模型在开源社区上的fcn_hr48_512x1024_160k_cityscapes_20200602_190946-59b7973e.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/hrnet/README.md|MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18s_512x512_80k_ade20k/fcn_hr18s_512x512_80k_ade20k_20200614_144345-77fc814a.pt | 
hrnet模型在开源社区上的fcn_hr18s_512x512_80k_ade20k_20200614_144345-77fc814a.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/hrnet/README.md|MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18_512x512_80k_ade20k/fcn_hr18_512x512_80k_ade20k_20210827_114910-6c9382c0.pt | hrnet模型在开源社区上的fcn_hr18_512x512_80k_ade20k_20210827_114910-6c9382c0.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/hrnet/README.md|MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_512x512_80k_ade20k/fcn_hr48_512x512_80k_ade20k_20200614_193946-7ba5258d.pt | hrnet模型在开源社区上的fcn_hr48_512x512_80k_ade20k_20200614_193946-7ba5258d.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/hrnet/README.md|MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18s_512x512_160k_ade20k/fcn_hr18s_512x512_160k_ade20k_20210829_174739-f1e7c2e7.pt | hrnet模型在开源社区上的fcn_hr18s_512x512_160k_ade20k_20210829_174739-f1e7c2e7.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/hrnet/README.md|MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18_512x512_160k_ade20k/fcn_hr18_512x512_160k_ade20k_20200614_214426-ca961836.pt | hrnet模型在开源社区上的fcn_hr18_512x512_160k_ade20k_20200614_214426-ca961836.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/hrnet/README.md|MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_512x512_160k_ade20k/fcn_hr48_512x512_160k_ade20k_20200614_214407-a52fc02c.pt | hrnet模型在开源社区上的fcn_hr48_512x512_160k_ade20k_20200614_214407-a52fc02c.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/hrnet/README.md|MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18s_512x512_20k_voc12aug/fcn_hr18s_512x512_20k_voc12aug_20210829_174910-0aceadb4.pt | hrnet模型在开源社区上的fcn_hr18s_512x512_20k_voc12aug_20210829_174910-0aceadb4.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/hrnet/README.md|MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18_512x512_20k_voc12aug/fcn_hr18_512x512_20k_voc12aug_20200617_224503-488d45f7.pt | hrnet模型在开源社区上的fcn_hr18_512x512_20k_voc12aug_20200617_224503-488d45f7.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/hrnet/README.md|MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_512x512_20k_voc12aug/fcn_hr48_512x512_20k_voc12aug_20200617_224419-89de05cd.pt | hrnet模型在开源社区上的fcn_hr48_512x512_20k_voc12aug_20200617_224419-89de05cd.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/hrnet/README.md|MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18s_512x512_40k_voc12aug/fcn_hr18s_512x512_40k_voc12aug_20200614_000648-4f8d6e7f.pt | hrnet模型在开源社区上的fcn_hr18s_512x512_40k_voc12aug_20200614_000648-4f8d6e7f.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/hrnet/README.md|MMseg-swin/configs/hrnet/hrnet.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18_512x512_40k_voc12aug/fcn_hr18_512x512_40k_voc12aug_20200613_224401-1b4b76cd.pt | hrnet模型在开源社区上的fcn_hr18_512x512_40k_voc12aug_20200613_224401-1b4b76cd.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/hrnet/README.md|MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_512x512_40k_voc12aug/fcn_hr48_512x512_40k_voc12aug_20200613_222111-1b0f18bc.pt | hrnet模型在开源社区上的fcn_hr48_512x512_40k_voc12aug_20200613_222111-1b0f18bc.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/hrnet/README.md|MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_480x480_40k_pascal_context/fcn_hr48_480x480_40k_pascal_context_20200911_164852-667d00b0.pt | hrnet模型在开源社区上的fcn_hr48_480x480_40k_pascal_context_20200911_164852-667d00b0.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/hrnet/README.md|MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_480x480_80k_pascal_context/fcn_hr48_480x480_80k_pascal_context_20200911_155322-847a6711.pt | hrnet模型在开源社区上的fcn_hr48_480x480_80k_pascal_context_20200911_155322-847a6711.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/hrnet/README.md|MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_480x480_40k_pascal_context_59/fcn_hr48_480x480_40k_pascal_context_59_20210410_122738-b808b8b2.pt | hrnet模型在开源社区上的fcn_hr48_480x480_40k_pascal_context_59_20210410_122738-b808b8b2.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/hrnet/README.md|MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_480x480_80k_pascal_context_59/fcn_hr48_480x480_80k_pascal_context_59_20210411_003240-3ae7081e.pt | hrnet模型在开源社区上的fcn_hr48_480x480_80k_pascal_context_59_20210411_003240-3ae7081e.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/hrnet/README.md|MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18s_512x512_80k_loveda/fcn_hr18s_512x512_80k_loveda_20211210_203228-60a86a7a.pt | hrnet模型在开源社区上的fcn_hr18s_512x512_80k_loveda_20211210_203228-60a86a7a.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/hrnet/README.md|MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18_512x512_80k_loveda/fcn_hr18_512x512_80k_loveda_20211210_203952-93d9c3b3.pt | hrnet模型在开源社区上的fcn_hr18_512x512_80k_loveda_20211210_203952-93d9c3b3.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/hrnet/README.md|MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_512x512_80k_loveda/fcn_hr48_512x512_80k_loveda_20211211_044756-67072f55.pt | hrnet模型在开源社区上的fcn_hr48_512x512_80k_loveda_20211211_044756-67072f55.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/hrnet/README.md|MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18s_512x512_80k_potsdam/fcn_hr18s_512x512_80k_potsdam_20211218_205517-ba32af63.pt | hrnet模型在开源社区上的fcn_hr18s_512x512_80k_potsdam_20211218_205517-ba32af63.pt的下载链接| -| 开源代码引入 | 
https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/hrnet/README.md|MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18_512x512_80k_potsdam/fcn_hr18_512x512_80k_potsdam_20211218_205517-5d0387ad.pt | hrnet模型在开源社区上的fcn_hr18_512x512_80k_potsdam_20211218_205517-5d0387ad.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/hrnet/README.md|MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_512x512_80k_potsdam/fcn_hr48_512x512_80k_potsdam_20211219_020601-97434c78.pt | hrnet模型在开源社区上的fcn_hr48_512x512_80k_potsdam_20211219_020601-97434c78.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/hrnet/README.md|MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18s_4x4_512x512_80k_vaihingen/fcn_hr18s_4x4_512x512_80k_vaihingen_20211231_230909-b23aae02.pt | hrnet模型在开源社区上的fcn_hr18s_4x4_512x512_80k_vaihingen_20211231_230909-b23aae02.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/hrnet/README.md|MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18_4x4_512x512_80k_vaihingen/fcn_hr18_4x4_512x512_80k_vaihingen_20211231_231216-2ec3ae8a.pt | hrnet模型在开源社区上的fcn_hr18_4x4_512x512_80k_vaihingen_20211231_231216-2ec3ae8a.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/hrnet/README.md|MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_4x4_512x512_80k_vaihingen/fcn_hr48_4x4_512x512_80k_vaihingen_20211231_231244-7133cb22.pt | hrnet模型在开源社区上的fcn_hr48_4x4_512x512_80k_vaihingen_20211231_231244-7133cb22.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/hrnet/README.md|MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18s_4x4_896x896_80k_isaid/fcn_hr18s_4x4_896x896_80k_isaid_20220118_001603-3cc0769b.pt | hrnet模型在开源社区上的fcn_hr18s_4x4_896x896_80k_isaid_20220118_001603-3cc0769b.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/hrnet/README.md|MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18_4x4_896x896_80k_isaid/fcn_hr18_4x4_896x896_80k_isaid_20220110_182230-49bf752e.pt | hrnet模型在开源社区上的fcn_hr18_4x4_896x896_80k_isaid_20220110_182230-49bf752e.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/hrnet/README.md|MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_4x4_896x896_80k_isaid/fcn_hr48_4x4_896x896_80k_isaid_20220114_174643-547fc420.pt | hrnet模型在开源社区上的fcn_hr48_4x4_896x896_80k_isaid_20220114_174643-547fc420.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/icnet/README.md|MMseg-swin/configs/icnet/icnet.yml | https://arxiv.org/abs/1704.0854 | 开源论文url链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/icnet/README.md|MMseg-swin/configs/icnet/icnet.yml | https://github.com/open-mmlab/mmsegmentation/blob/v0.18.0/mmseg/models/necks/ic_neck.py#L7 | 预训练模型下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/icnet/README.md|MMseg-swin/configs/icnet/icnet.yml | https://github.com/hszhao/ICNe | 源码下载链接| -| 开源代码引入 | 
https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/icnet/README.md|MMseg-swin/configs/icnet/icnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/icnet/icnet_r18-d8_832x832_80k_cityscapes/icnet_r18-d8_832x832_80k_cityscapes_20210925_225521-2e36638d.pt | icnet模型在开源社区上的icnet_r18-d8_832x832_80k_cityscapes_20210925_225521-2e36638d.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/icnet/README.md|MMseg-swin/configs/icnet/icnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/icnet/icnet_r18-d8_832x832_160k_cityscapes/icnet_r18-d8_832x832_160k_cityscapes_20210925_230153-2c6eb6e0.pt | icnet模型在开源社区上的icnet_r18-d8_832x832_160k_cityscapes_20210925_230153-2c6eb6e0.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/icnet/README.md|MMseg-swin/configs/icnet/icnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/icnet/icnet_r18-d8_in1k-pre_832x832_80k_cityscapes/icnet_r18-d8_in1k-pre_832x832_80k_cityscapes_20210925_230354-1cbe3022.pt | icnet模型在开源社区上的icnet_r18-d8_in1k-pre_832x832_80k_cityscapes_20210925_230354-1cbe3022.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/icnet/README.md|MMseg-swin/configs/icnet/icnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/icnet/icnet_r18-d8_in1k-pre_832x832_160k_cityscapes/icnet_r18-d8_in1k-pre_832x832_160k_cityscapes_20210926_052702-619c8ae1.pt | icnet模型在开源社区上的icnet_r18-d8_in1k-pre_832x832_160k_cityscapes_20210926_052702-619c8ae1.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/icnet/README.md|MMseg-swin/configs/icnet/icnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/icnet/icnet_r50-d8_832x832_80k_cityscapes/icnet_r50-d8_832x832_80k_cityscapes_20210926_044625-c6407341.pt | icnet模型在开源社区上的icnet_r50-d8_832x832_80k_cityscapes_20210926_044625-c6407341.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/icnet/README.md|MMseg-swin/configs/icnet/icnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/icnet/icnet_r50-d8_832x832_160k_cityscapes/icnet_r50-d8_832x832_160k_cityscapes_20210925_232612-a95f0d4e.pt | icnet模型在开源社区上的icnet_r50-d8_832x832_160k_cityscapes_20210925_232612-a95f0d4e.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/icnet/README.md|MMseg-swin/configs/icnet/icnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/icnet/icnet_r50-d8_in1k-pre_832x832_80k_cityscapes/icnet_r50-d8_in1k-pre_832x832_80k_cityscapes_20210926_032943-1743dc7b.pt | icnet模型在开源社区上的icnet_r50-d8_in1k-pre_832x832_80k_cityscapes_20210926_032943-1743dc7b.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/icnet/README.md|MMseg-swin/configs/icnet/icnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/icnet/icnet_r50-d8_in1k-pre_832x832_160k_cityscapes/icnet_r50-d8_in1k-pre_832x832_160k_cityscapes_20210926_042715-ce310aea.pt | icnet模型在开源社区上的icnet_r50-d8_in1k-pre_832x832_160k_cityscapes_20210926_042715-ce310aea.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/icnet/README.md|MMseg-swin/configs/icnet/icnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/icnet/icnet_r101-d8_832x832_80k_cityscapes/icnet_r101-d8_832x832_80k_cityscapes_20210926_072447-b52f936e.pt | icnet模型在开源社区上的icnet_r101-d8_832x832_80k_cityscapes_20210926_072447-b52f936e.pt的下载链接| -| 开源代码引入 | 
https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/icnet/README.md|MMseg-swin/configs/icnet/icnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/icnet/icnet_r101-d8_832x832_160k_cityscapes/icnet_r101-d8_832x832_160k_cityscapes_20210926_092350-3a1ebf1a.pt | icnet模型在开源社区上的icnet_r101-d8_832x832_160k_cityscapes_20210926_092350-3a1ebf1a.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/icnet/README.md|MMseg-swin/configs/icnet/icnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/icnet/icnet_r101-d8_in1k-pre_832x832_80k_cityscapes/icnet_r101-d8_in1k-pre_832x832_80k_cityscapes_20210926_020414-7ceb12c5.pt | icnet模型在开源社区上的icnet_r101-d8_in1k-pre_832x832_80k_cityscapes_20210926_020414-7ceb12c5.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/icnet/README.md|MMseg-swin/configs/icnet/icnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/icnet/icnet_r101-d8_in1k-pre_832x832_160k_cityscapes/icnet_r101-d8_in1k-pre_832x832_160k_cityscapes_20210925_232612-9484ae8a.pt | icnet模型在开源社区上的icnet_r101-d8_in1k-pre_832x832_160k_cityscapes_20210925_232612-9484ae8a.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/isanet/README.md|MMseg-swin/configs/isanet/isanet.yml | https://arxiv.org/abs/1907.1227 | 开源论文url链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/isanet/README.md|MMseg-swin/configs/isanet/isanet.yml | https://github.com/open-mmlab/mmsegmentation/blob/v0.18.0/mmseg/models/decode_heads/isa_head.py#L5 | 预训练模型下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/isanet/README.md|MMseg-swin/configs/isanet/isanet.yml | https://github.com/openseg-group/openseg.pytorc | 源码下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/isanet/README.md|MMseg-swin/configs/isanet/isanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/isanet/isanet_r50-d8_512x1024_40k_cityscapes/isanet_r50-d8_512x1024_40k_cityscapes_20210901_054739-981bd763.pt | isanet模型在开源社区上的isanet_r50-d8_512x1024_40k_cityscapes_20210901_054739-981bd763.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/isanet/README.md|MMseg-swin/configs/isanet/isanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/isanet/isanet_r50-d8_512x1024_80k_cityscapes/isanet_r50-d8_512x1024_80k_cityscapes_20210901_074202-89384497.pt | isanet模型在开源社区上的isanet_r50-d8_512x1024_80k_cityscapes_20210901_074202-89384497.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/isanet/README.md|MMseg-swin/configs/isanet/isanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/isanet/isanet_r50-d8_769x769_40k_cityscapes/isanet_r50-d8_769x769_40k_cityscapes_20210903_050200-4ae7e65b.pt | isanet模型在开源社区上的isanet_r50-d8_769x769_40k_cityscapes_20210903_050200-4ae7e65b.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/isanet/README.md|MMseg-swin/configs/isanet/isanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/isanet/isanet_r50-d8_769x769_80k_cityscapes/isanet_r50-d8_769x769_80k_cityscapes_20210903_101126-99b54519.pt | isanet模型在开源社区上的isanet_r50-d8_769x769_80k_cityscapes_20210903_101126-99b54519.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/isanet/README.md|MMseg-swin/configs/isanet/isanet.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/isanet/isanet_r101-d8_512x1024_40k_cityscapes/isanet_r101-d8_512x1024_40k_cityscapes_20210901_145553-293e6bd6.pt | isanet模型在开源社区上的isanet_r101-d8_512x1024_40k_cityscapes_20210901_145553-293e6bd6.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/isanet/README.md|MMseg-swin/configs/isanet/isanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/isanet/isanet_r101-d8_512x1024_80k_cityscapes/isanet_r101-d8_512x1024_80k_cityscapes_20210901_145243-5b99c9b2.pt | isanet模型在开源社区上的isanet_r101-d8_512x1024_80k_cityscapes_20210901_145243-5b99c9b2.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/isanet/README.md|MMseg-swin/configs/isanet/isanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/isanet/isanet_r101-d8_769x769_40k_cityscapes/isanet_r101-d8_769x769_40k_cityscapes_20210903_111320-509e7224.pt | isanet模型在开源社区上的isanet_r101-d8_769x769_40k_cityscapes_20210903_111320-509e7224.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/isanet/README.md|MMseg-swin/configs/isanet/isanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/isanet/isanet_r101-d8_769x769_80k_cityscapes/isanet_r101-d8_769x769_80k_cityscapes_20210903_111319-24f71dfa.pt | isanet模型在开源社区上的isanet_r101-d8_769x769_80k_cityscapes_20210903_111319-24f71dfa.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/isanet/README.md|MMseg-swin/configs/isanet/isanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/isanet/isanet_r50-d8_512x512_80k_ade20k/isanet_r50-d8_512x512_80k_ade20k_20210903_124557-6ed83a0c.pt | isanet模型在开源社区上的isanet_r50-d8_512x512_80k_ade20k_20210903_124557-6ed83a0c.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/isanet/README.md|MMseg-swin/configs/isanet/isanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/isanet/isanet_r50-d8_512x512_160k_ade20k/isanet_r50-d8_512x512_160k_ade20k_20210903_104850-f752d0a3.pt | isanet模型在开源社区上的isanet_r50-d8_512x512_160k_ade20k_20210903_104850-f752d0a3.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/isanet/README.md|MMseg-swin/configs/isanet/isanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/isanet/isanet_r101-d8_512x512_80k_ade20k/isanet_r101-d8_512x512_80k_ade20k_20210903_162056-68b235c2.pt | isanet模型在开源社区上的isanet_r101-d8_512x512_80k_ade20k_20210903_162056-68b235c2.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/isanet/README.md|MMseg-swin/configs/isanet/isanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/isanet/isanet_r101-d8_512x512_160k_ade20k/isanet_r101-d8_512x512_160k_ade20k_20210903_211431-a7879dcd.pt | isanet模型在开源社区上的isanet_r101-d8_512x512_160k_ade20k_20210903_211431-a7879dcd.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/isanet/README.md|MMseg-swin/configs/isanet/isanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/isanet/isanet_r50-d8_512x512_20k_voc12aug/isanet_r50-d8_512x512_20k_voc12aug_20210901_164838-79d59b80.pt | isanet模型在开源社区上的isanet_r50-d8_512x512_20k_voc12aug_20210901_164838-79d59b80.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/isanet/README.md|MMseg-swin/configs/isanet/isanet.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/isanet/isanet_r50-d8_512x512_40k_voc12aug/isanet_r50-d8_512x512_40k_voc12aug_20210901_151349-7d08a54e.pt | isanet模型在开源社区上的isanet_r50-d8_512x512_40k_voc12aug_20210901_151349-7d08a54e.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/isanet/README.md|MMseg-swin/configs/isanet/isanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/isanet/isanet_r101-d8_512x512_20k_voc12aug/isanet_r101-d8_512x512_20k_voc12aug_20210901_115805-3ccbf355.pt | isanet模型在开源社区上的isanet_r101-d8_512x512_20k_voc12aug_20210901_115805-3ccbf355.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/isanet/README.md|MMseg-swin/configs/isanet/isanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/isanet/isanet_r101-d8_512x512_40k_voc12aug/isanet_r101-d8_512x512_40k_voc12aug_20210901_145814-bc71233b.pt | isanet模型在开源社区上的isanet_r101-d8_512x512_40k_voc12aug_20210901_145814-bc71233b.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/knet/README.md|MMseg-swin/configs/knet/knet.yml | https://arxiv.org/abs/2106.1485 | 开源论文url链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/knet/README.md|MMseg-swin/configs/knet/knet.yml | https://github.com/open-mmlab/mmsegmentation/blob/v0.23.0/mmseg/models/decode_heads/knet_head.py#L39 | 预训练模型下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/knet/README.md|MMseg-swin/configs/knet/knet.yml | https://github.com/ZwwWayne/K-Net | 源码下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/knet/README.md|MMseg-swin/configs/knet/knet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/knet/knet_s3_fcn_r50-d8_8x2_512x512_adamw_80k_ade20k/knet_s3_fcn_r50-d8_8x2_512x512_adamw_80k_ade20k_20220228_043751-abcab920.pt | knet模型在开源社区上的knet_s3_fcn_r50-d8_8x2_512x512_adamw_80k_ade20k_20220228_043751-abcab920.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/knet/README.md|MMseg-swin/configs/knet/knet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/knet/knet_s3_pspnet_r50-d8_8x2_512x512_adamw_80k_ade20k/knet_s3_pspnet_r50-d8_8x2_512x512_adamw_80k_ade20k_20220228_054634-d2c72240.pt | knet模型在开源社区上的knet_s3_pspnet_r50-d8_8x2_512x512_adamw_80k_ade20k_20220228_054634-d2c72240.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/knet/README.md|MMseg-swin/configs/knet/knet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/knet/knet_s3_deeplabv3_r50-d8_8x2_512x512_adamw_80k_ade20k/knet_s3_deeplabv3_r50-d8_8x2_512x512_adamw_80k_ade20k_20220228_041642-00c8fbeb.pt | knet模型在开源社区上的knet_s3_deeplabv3_r50-d8_8x2_512x512_adamw_80k_ade20k_20220228_041642-00c8fbeb.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/knet/README.md|MMseg-swin/configs/knet/knet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/knet/knet_s3_upernet_r50-d8_8x2_512x512_adamw_80k_ade20k/knet_s3_upernet_r50-d8_8x2_512x512_adamw_80k_ade20k_20220304_125657-215753b0.pt | knet模型在开源社区上的knet_s3_upernet_r50-d8_8x2_512x512_adamw_80k_ade20k_20220304_125657-215753b0.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/knet/README.md|MMseg-swin/configs/knet/knet.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/knet/knet_s3_upernet_swin-t_8x2_512x512_adamw_80k_ade20k/knet_s3_upernet_swin-t_8x2_512x512_adamw_80k_ade20k_20220303_133059-7545e1dc.pt | knet模型在开源社区上的knet_s3_upernet_swin-t_8x2_512x512_adamw_80k_ade20k_20220303_133059-7545e1dc.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/knet/README.md|MMseg-swin/configs/knet/knet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/knet/knet_s3_upernet_swin-l_8x2_512x512_adamw_80k_ade20k/knet_s3_upernet_swin-l_8x2_512x512_adamw_80k_ade20k_20220303_154559-d8da9a90.pt | knet模型在开源社区上的knet_s3_upernet_swin-l_8x2_512x512_adamw_80k_ade20k_20220303_154559-d8da9a90.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/knet/README.md|MMseg-swin/configs/knet/knet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/knet/knet_s3_upernet_swin-l_8x2_640x640_adamw_80k_ade20k/knet_s3_upernet_swin-l_8x2_640x640_adamw_80k_ade20k_20220301_220747-8787fc71.pt | knet模型在开源社区上的knet_s3_upernet_swin-l_8x2_640x640_adamw_80k_ade20k_20220301_220747-8787fc71.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/mae/README.md|MMseg-swin/configs/mae/mae.yml | https://download.openmmlab.com/mmsegmentation/v0.5/mae/upernet_mae-base_fp16_8x2_512x512_160k_ade20k/upernet_mae-base_fp16_8x2_512x512_160k_ade20k_20220426_174752-f92a2975.pt | mae模型在开源社区上的upernet_mae-base_fp16_8x2_512x512_160k_ade20k_20220426_174752-f92a2975.pt的下载链接| -| 开发引入 | / |MMseg-swin/configs/mobilenet_v2/mobilenet_v2.yml | https://download.openmmlab.com/mmsegmentation/v0.5/mobilenet_v2/fcn_m-v2-d8_512x1024_80k_cityscapes/fcn_m-v2-d8_512x1024_80k_cityscapes_20200825_124817-d24c28c1.pt | mobilenet_v2模型在开源社区上的fcn_m-v2-d8_512x1024_80k_cityscapes_20200825_124817-d24c28c1.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/mobilenet_v2/README.md|MMseg-swin/configs/mobilenet_v2/mobilenet_v2.yml | https://download.openmmlab.com/mmsegmentation/v0.5/mobilenet_v2/pspnet_m-v2-d8_512x1024_80k_cityscapes/pspnet_m-v2-d8_512x1024_80k_cityscapes_20200825_124817-19e81d51.pt | mobilenet_v2模型在开源社区上的pspnet_m-v2-d8_512x1024_80k_cityscapes_20200825_124817-19e81d51.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/mobilenet_v2/README.md|MMseg-swin/configs/mobilenet_v2/mobilenet_v2.yml | https://download.openmmlab.com/mmsegmentation/v0.5/mobilenet_v2/deeplabv3_m-v2-d8_512x1024_80k_cityscapes/deeplabv3_m-v2-d8_512x1024_80k_cityscapes_20200825_124836-bef03590.pt | mobilenet_v2模型在开源社区上的deeplabv3_m-v2-d8_512x1024_80k_cityscapes_20200825_124836-bef03590.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/mobilenet_v2/README.md|MMseg-swin/configs/mobilenet_v2/mobilenet_v2.yml | https://download.openmmlab.com/mmsegmentation/v0.5/mobilenet_v2/deeplabv3plus_m-v2-d8_512x1024_80k_cityscapes/deeplabv3plus_m-v2-d8_512x1024_80k_cityscapes_20200825_124836-d256dd4b.pt | mobilenet_v2模型在开源社区上的deeplabv3plus_m-v2-d8_512x1024_80k_cityscapes_20200825_124836-d256dd4b.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/mobilenet_v2/README.md|MMseg-swin/configs/mobilenet_v2/mobilenet_v2.yml | https://download.openmmlab.com/mmsegmentation/v0.5/mobilenet_v2/fcn_m-v2-d8_512x512_160k_ade20k/fcn_m-v2-d8_512x512_160k_ade20k_20200825_214953-c40e1095.pt | mobilenet_v2模型在开源社区上的fcn_m-v2-d8_512x512_160k_ade20k_20200825_214953-c40e1095.pt的下载链接| -| 
开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/mobilenet_v2/README.md|MMseg-swin/configs/mobilenet_v2/mobilenet_v2.yml | https://download.openmmlab.com/mmsegmentation/v0.5/mobilenet_v2/pspnet_m-v2-d8_512x512_160k_ade20k/pspnet_m-v2-d8_512x512_160k_ade20k_20200825_214953-f5942f7a.pt | mobilenet_v2模型在开源社区上的pspnet_m-v2-d8_512x512_160k_ade20k_20200825_214953-f5942f7a.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/mobilenet_v2/README.md|MMseg-swin/configs/mobilenet_v2/mobilenet_v2.yml | https://download.openmmlab.com/mmsegmentation/v0.5/mobilenet_v2/deeplabv3_m-v2-d8_512x512_160k_ade20k/deeplabv3_m-v2-d8_512x512_160k_ade20k_20200825_223255-63986343.pt | mobilenet_v2模型在开源社区上的deeplabv3_m-v2-d8_512x512_160k_ade20k_20200825_223255-63986343.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/mobilenet_v2/README.md|MMseg-swin/configs/mobilenet_v2/mobilenet_v2.yml | https://download.openmmlab.com/mmsegmentation/v0.5/mobilenet_v2/deeplabv3plus_m-v2-d8_512x512_160k_ade20k/deeplabv3plus_m-v2-d8_512x512_160k_ade20k_20200825_223255-465a01d4.pt | mobilenet_v2模型在开源社区上的deeplabv3plus_m-v2-d8_512x512_160k_ade20k_20200825_223255-465a01d4.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/mobilenet_v3/README.md|MMseg-swin/configs/mobilenet_v3/mobilenet_v3.yml | https://arxiv.org/abs/1905.0224 | 开源论文url链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/mobilenet_v3/README.md|MMseg-swin/configs/mobilenet_v3/mobilenet_v3.yml | https://github.com/open-mmlab/mmsegmentation/blob/v0.17.0/mmseg/models/backbones/mobilenet_v3.py#L1 | 预训练模型下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/mobilenet_v2/README.md|MMseg-swin/configs/mobilenet_v3/mobilenet_v3.yml | https://github.com/tensorflow/models/tree/master/research/deepla | 源码下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/mobilenet_v3/README.md|MMseg-swin/configs/mobilenet_v3/mobilenet_v3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/mobilenet_v3/lraspp_m-v3-d8_512x1024_320k_cityscapes/lraspp_m-v3-d8_512x1024_320k_cityscapes_20201224_220337-cfe8fb07.pt | mobilenet_v3模型在开源社区上的lraspp_m-v3-d8_512x1024_320k_cityscapes_20201224_220337-cfe8fb07.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/mobilenet_v3/README.md|MMseg-swin/configs/mobilenet_v3/mobilenet_v3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/mobilenet_v3/lraspp_m-v3-d8_scratch_512x1024_320k_cityscapes/lraspp_m-v3-d8_scratch_512x1024_320k_cityscapes_20201224_220337-9f29cd72.pt | mobilenet_v3模型在开源社区上的lraspp_m-v3-d8_scratch_512x1024_320k_cityscapes_20201224_220337-9f29cd72.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/mobilenet_v3/README.md|MMseg-swin/configs/mobilenet_v3/mobilenet_v3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/mobilenet_v3/lraspp_m-v3s-d8_512x1024_320k_cityscapes/lraspp_m-v3s-d8_512x1024_320k_cityscapes_20201224_223935-61565b34.pt | mobilenet_v3模型在开源社区上的lraspp_m-v3s-d8_512x1024_320k_cityscapes_20201224_223935-61565b34.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/mobilenet_v3/README.md|MMseg-swin/configs/mobilenet_v3/mobilenet_v3.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/mobilenet_v3/lraspp_m-v3s-d8_scratch_512x1024_320k_cityscapes/lraspp_m-v3s-d8_scratch_512x1024_320k_cityscapes_20201224_223935-03daeabb.pt | mobilenet_v3模型在开源社区上的lraspp_m-v3s-d8_scratch_512x1024_320k_cityscapes_20201224_223935-03daeabb.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/nonlocal_net/README.md|MMseg-swin/configs/nonlocal_net/nonlocal_net.yml | https://arxiv.org/abs/1711.0797 | 开源论文url链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/nonlocal_net/README.md|MMseg-swin/configs/nonlocal_net/nonlocal_net.yml | https://github.com/open-mmlab/mmsegmentation/blob/v0.17.0/mmseg/models/decode_heads/nl_head.py#L1 | 预训练模型下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/nonlocal_net/README.md|MMseg-swin/configs/nonlocal_net/nonlocal_net.yml | https://github.com/facebookresearch/video-nonlocal-ne | 源码下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/nonlocal_net/README.md|MMseg-swin/configs/nonlocal_net/nonlocal_net.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r50-d8_512x1024_40k_cityscapes/nonlocal_r50-d8_512x1024_40k_cityscapes_20200605_210748-c75e81e3.pt | nonlocal_net模型在开源社区上的nonlocal_r50-d8_512x1024_40k_cityscapes_20200605_210748-c75e81e3.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/nonlocal_net/README.md|MMseg-swin/configs/nonlocal_net/nonlocal_net.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r101-d8_512x1024_40k_cityscapes/nonlocal_r101-d8_512x1024_40k_cityscapes_20200605_210748-d63729fa.pt | nonlocal_net模型在开源社区上的nonlocal_r101-d8_512x1024_40k_cityscapes_20200605_210748-d63729fa.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/nonlocal_net/README.md|MMseg-swin/configs/nonlocal_net/nonlocal_net.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r50-d8_769x769_40k_cityscapes/nonlocal_r50-d8_769x769_40k_cityscapes_20200530_045243-82ef6749.pt | nonlocal_net模型在开源社区上的nonlocal_r50-d8_769x769_40k_cityscapes_20200530_045243-82ef6749.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/nonlocal_net/README.md|MMseg-swin/configs/nonlocal_net/nonlocal_net.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r101-d8_769x769_40k_cityscapes/nonlocal_r101-d8_769x769_40k_cityscapes_20200530_045348-8fe9a9dc.pt | nonlocal_net模型在开源社区上的nonlocal_r101-d8_769x769_40k_cityscapes_20200530_045348-8fe9a9dc.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/nonlocal_net/README.md|MMseg-swin/configs/nonlocal_net/nonlocal_net.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r50-d8_512x1024_80k_cityscapes/nonlocal_r50-d8_512x1024_80k_cityscapes_20200607_193518-d6839fae.pt | nonlocal_net模型在开源社区上的nonlocal_r50-d8_512x1024_80k_cityscapes_20200607_193518-d6839fae.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/nonlocal_net/README.md|MMseg-swin/configs/nonlocal_net/nonlocal_net.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r101-d8_512x1024_80k_cityscapes/nonlocal_r101-d8_512x1024_80k_cityscapes_20200607_183411-32700183.pt | nonlocal_net模型在开源社区上的nonlocal_r101-d8_512x1024_80k_cityscapes_20200607_183411-32700183.pt的下载链接| -| 开源代码引入 | 
https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/nonlocal_net/README.md|MMseg-swin/configs/nonlocal_net/nonlocal_net.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r50-d8_769x769_80k_cityscapes/nonlocal_r50-d8_769x769_80k_cityscapes_20200607_193506-1f9792f6.pt | nonlocal_net模型在开源社区上的nonlocal_r50-d8_769x769_80k_cityscapes_20200607_193506-1f9792f6.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/nonlocal_net/README.md|MMseg-swin/configs/nonlocal_net/nonlocal_net.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r101-d8_769x769_80k_cityscapes/nonlocal_r101-d8_769x769_80k_cityscapes_20200607_183428-0e1fa4f9.pt | nonlocal_net模型在开源社区上的nonlocal_r101-d8_769x769_80k_cityscapes_20200607_183428-0e1fa4f9.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/nonlocal_net/README.md|MMseg-swin/configs/nonlocal_net/nonlocal_net.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r50-d8_512x512_80k_ade20k/nonlocal_r50-d8_512x512_80k_ade20k_20200615_015801-5ae0aa33.pt | nonlocal_net模型在开源社区上的nonlocal_r50-d8_512x512_80k_ade20k_20200615_015801-5ae0aa33.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/nonlocal_net/README.md|MMseg-swin/configs/nonlocal_net/nonlocal_net.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r101-d8_512x512_80k_ade20k/nonlocal_r101-d8_512x512_80k_ade20k_20200615_015758-24105919.pt | nonlocal_net模型在开源社区上的nonlocal_r101-d8_512x512_80k_ade20k_20200615_015758-24105919.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/nonlocal_net/README.md|MMseg-swin/configs/nonlocal_net/nonlocal_net.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r50-d8_512x512_160k_ade20k/nonlocal_r50-d8_512x512_160k_ade20k_20200616_005410-baef45e3.pt | nonlocal_net模型在开源社区上的nonlocal_r50-d8_512x512_160k_ade20k_20200616_005410-baef45e3.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/nonlocal_net/README.md|MMseg-swin/configs/nonlocal_net/nonlocal_net.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r101-d8_512x512_160k_ade20k/nonlocal_r101-d8_512x512_160k_ade20k_20210827_221502-7881aa1a.pt | nonlocal_net模型在开源社区上的nonlocal_r101-d8_512x512_160k_ade20k_20210827_221502-7881aa1a.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/nonlocal_net/README.md|MMseg-swin/configs/nonlocal_net/nonlocal_net.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r50-d8_512x512_20k_voc12aug/nonlocal_r50-d8_512x512_20k_voc12aug_20200617_222613-07f2a57c.pt | nonlocal_net模型在开源社区上的nonlocal_r50-d8_512x512_20k_voc12aug_20200617_222613-07f2a57c.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/nonlocal_net/README.md|MMseg-swin/configs/nonlocal_net/nonlocal_net.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r101-d8_512x512_20k_voc12aug/nonlocal_r101-d8_512x512_20k_voc12aug_20200617_222615-948c68ab.pt | nonlocal_net模型在开源社区上的nonlocal_r101-d8_512x512_20k_voc12aug_20200617_222615-948c68ab.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/nonlocal_net/README.md|MMseg-swin/configs/nonlocal_net/nonlocal_net.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r50-d8_512x512_40k_voc12aug/nonlocal_r50-d8_512x512_40k_voc12aug_20200614_000028-0139d4a9.pt | nonlocal_net模型在开源社区上的nonlocal_r50-d8_512x512_40k_voc12aug_20200614_000028-0139d4a9.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/nonlocal_net/README.md|MMseg-swin/configs/nonlocal_net/nonlocal_net.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r101-d8_512x512_40k_voc12aug/nonlocal_r101-d8_512x512_40k_voc12aug_20200614_000028-7e5ff470.pt | nonlocal_net模型在开源社区上的nonlocal_r101-d8_512x512_40k_voc12aug_20200614_000028-7e5ff470.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/ocrnet/README.md|MMseg-swin/configs/ocrnet/ocrnet.yml | https://arxiv.org/abs/1909.1106 | 开源论文url链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/ocrnet/README.md|MMseg-swin/configs/ocrnet/ocrnet.yml | https://github.com/open-mmlab/mmsegmentation/blob/v0.17.0/mmseg/models/decode_heads/ocr_head.py#L8 | 预训练模型下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/pidnet/pidnet-s_2xb6-120k_1024x1024-cityscapes.py|MMseg-swin/configs/ocrnet/ocrnet.yml | https://github.com/openseg-group/OCNet.pytorc | 源码下载链接| -| 开发引入 | / |MMseg-swin/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18s_512x1024_40k_cityscapes/ocrnet_hr18s_512x1024_40k_cityscapes_20200601_033304-fa2436c2.pt | ocrnet模型在开源社区上的ocrnet_hr18s_512x1024_40k_cityscapes_20200601_033304-fa2436c2.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/ocrnet/README.md|MMseg-swin/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18_512x1024_40k_cityscapes/ocrnet_hr18_512x1024_40k_cityscapes_20200601_033320-401c5bdd.pt | ocrnet模型在开源社区上的ocrnet_hr18_512x1024_40k_cityscapes_20200601_033320-401c5bdd.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/ocrnet/README.md|MMseg-swin/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr48_512x1024_40k_cityscapes/ocrnet_hr48_512x1024_40k_cityscapes_20200601_033336-55b32491.pt | ocrnet模型在开源社区上的ocrnet_hr48_512x1024_40k_cityscapes_20200601_033336-55b32491.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/ocrnet/README.md|MMseg-swin/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18s_512x1024_80k_cityscapes/ocrnet_hr18s_512x1024_80k_cityscapes_20200601_222735-55979e63.pt | ocrnet模型在开源社区上的ocrnet_hr18s_512x1024_80k_cityscapes_20200601_222735-55979e63.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/ocrnet/README.md|MMseg-swin/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18_512x1024_80k_cityscapes/ocrnet_hr18_512x1024_80k_cityscapes_20200614_230521-c2e1dd4a.pt | ocrnet模型在开源社区上的ocrnet_hr18_512x1024_80k_cityscapes_20200614_230521-c2e1dd4a.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/ocrnet/README.md|MMseg-swin/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr48_512x1024_80k_cityscapes/ocrnet_hr48_512x1024_80k_cityscapes_20200601_222752-9076bcdf.pt | 
ocrnet模型在开源社区上的ocrnet_hr48_512x1024_80k_cityscapes_20200601_222752-9076bcdf.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/ocrnet/README.md|MMseg-swin/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18s_512x1024_160k_cityscapes/ocrnet_hr18s_512x1024_160k_cityscapes_20200602_191005-f4a7af28.pt | ocrnet模型在开源社区上的ocrnet_hr18s_512x1024_160k_cityscapes_20200602_191005-f4a7af28.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/ocrnet/README.md|MMseg-swin/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18_512x1024_160k_cityscapes/ocrnet_hr18_512x1024_160k_cityscapes_20200602_191001-b9172d0c.pt | ocrnet模型在开源社区上的ocrnet_hr18_512x1024_160k_cityscapes_20200602_191001-b9172d0c.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/ocrnet/README.md|MMseg-swin/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr48_512x1024_160k_cityscapes/ocrnet_hr48_512x1024_160k_cityscapes_20200602_191037-dfbf1b0c.pt | ocrnet模型在开源社区上的ocrnet_hr48_512x1024_160k_cityscapes_20200602_191037-dfbf1b0c.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/ocrnet/README.md|MMseg-swin/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_r101-d8_512x1024_40k_b8_cityscapes/ocrnet_r101-d8_512x1024_40k_b8_cityscapes_20200717_110721-02ac0f13.pt | ocrnet模型在开源社区上的ocrnet_r101-d8_512x1024_40k_b8_cityscapes_20200717_110721-02ac0f13.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/ocrnet/README.md|MMseg-swin/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_r101-d8_512x1024_40k_b16_cityscapes/ocrnet_r101-d8_512x1024_40k_b16_cityscapes_20200723_193726-db500f80.pt | ocrnet模型在开源社区上的ocrnet_r101-d8_512x1024_40k_b16_cityscapes_20200723_193726-db500f80.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/ocrnet/README.md|MMseg-swin/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_r101-d8_512x1024_80k_b16_cityscapes/ocrnet_r101-d8_512x1024_80k_b16_cityscapes_20200723_192421-78688424.pt | ocrnet模型在开源社区上的ocrnet_r101-d8_512x1024_80k_b16_cityscapes_20200723_192421-78688424.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/ocrnet/README.md|MMseg-swin/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18s_512x512_80k_ade20k/ocrnet_hr18s_512x512_80k_ade20k_20200615_055600-e80b62af.pt | ocrnet模型在开源社区上的ocrnet_hr18s_512x512_80k_ade20k_20200615_055600-e80b62af.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/ocrnet/README.md|MMseg-swin/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18_512x512_80k_ade20k/ocrnet_hr18_512x512_80k_ade20k_20200615_053157-d173d83b.pt | ocrnet模型在开源社区上的ocrnet_hr18_512x512_80k_ade20k_20200615_053157-d173d83b.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/ocrnet/README.md|MMseg-swin/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr48_512x512_80k_ade20k/ocrnet_hr48_512x512_80k_ade20k_20200615_021518-d168c2d1.pt | 
ocrnet模型在开源社区上的ocrnet_hr48_512x512_80k_ade20k_20200615_021518-d168c2d1.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/ocrnet/README.md|MMseg-swin/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18s_512x512_160k_ade20k/ocrnet_hr18s_512x512_160k_ade20k_20200615_184505-8e913058.pt | ocrnet模型在开源社区上的ocrnet_hr18s_512x512_160k_ade20k_20200615_184505-8e913058.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/ocrnet/README.md|MMseg-swin/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18_512x512_160k_ade20k/ocrnet_hr18_512x512_160k_ade20k_20200615_200940-d8fcd9d1.pt | ocrnet模型在开源社区上的ocrnet_hr18_512x512_160k_ade20k_20200615_200940-d8fcd9d1.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/ocrnet/README.md|MMseg-swin/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr48_512x512_160k_ade20k/ocrnet_hr48_512x512_160k_ade20k_20200615_184705-a073726d.pt | ocrnet模型在开源社区上的ocrnet_hr48_512x512_160k_ade20k_20200615_184705-a073726d.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/ocrnet/README.md|MMseg-swin/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18s_512x512_20k_voc12aug/ocrnet_hr18s_512x512_20k_voc12aug_20200617_233913-02b04fcb.pt | ocrnet模型在开源社区上的ocrnet_hr18s_512x512_20k_voc12aug_20200617_233913-02b04fcb.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/ocrnet/README.md|MMseg-swin/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18_512x512_20k_voc12aug/ocrnet_hr18_512x512_20k_voc12aug_20200617_233932-8954cbb7.pt | ocrnet模型在开源社区上的ocrnet_hr18_512x512_20k_voc12aug_20200617_233932-8954cbb7.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/ocrnet/README.md|MMseg-swin/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr48_512x512_20k_voc12aug/ocrnet_hr48_512x512_20k_voc12aug_20200617_233932-9e82080a.pt | ocrnet模型在开源社区上的ocrnet_hr48_512x512_20k_voc12aug_20200617_233932-9e82080a.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/ocrnet/README.md|MMseg-swin/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18s_512x512_40k_voc12aug/ocrnet_hr18s_512x512_40k_voc12aug_20200614_002025-42b587ac.pt | ocrnet模型在开源社区上的ocrnet_hr18s_512x512_40k_voc12aug_20200614_002025-42b587ac.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/ocrnet/README.md|MMseg-swin/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18_512x512_40k_voc12aug/ocrnet_hr18_512x512_40k_voc12aug_20200614_015958-714302be.pt | ocrnet模型在开源社区上的ocrnet_hr18_512x512_40k_voc12aug_20200614_015958-714302be.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/ocrnet/README.md|MMseg-swin/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr48_512x512_40k_voc12aug/ocrnet_hr48_512x512_40k_voc12aug_20200614_015958-255bc5ce.pt | ocrnet模型在开源社区上的ocrnet_hr48_512x512_40k_voc12aug_20200614_015958-255bc5ce.pt的下载链接| -| 开源代码引入 | 
https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/point_rend/README.md|MMseg-swin/configs/point_rend/point_rend.yml | https://arxiv.org/abs/1912.0819 | 开源论文url链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/point_rend/README.md|MMseg-swin/configs/point_rend/point_rend.yml | https://github.com/open-mmlab/mmsegmentation/blob/v0.17.0/mmseg/models/decode_heads/point_head.py#L3 | 预训练模型下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/point_rend/README.md|MMseg-swin/configs/point_rend/point_rend.yml | https://github.com/facebookresearch/detectron2/tree/master/projects/PointRen | 源码下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/point_rend/README.md|MMseg-swin/configs/point_rend/point_rend.yml | https://download.openmmlab.com/mmsegmentation/v0.5/point_rend/pointrend_r50_512x1024_80k_cityscapes/pointrend_r50_512x1024_80k_cityscapes_20200711_015821-bb1ff523.pt | point_rend模型在开源社区上的pointrend_r50_512x1024_80k_cityscapes_20200711_015821-bb1ff523.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/point_rend/README.md|MMseg-swin/configs/point_rend/point_rend.yml | https://download.openmmlab.com/mmsegmentation/v0.5/point_rend/pointrend_r101_512x1024_80k_cityscapes/pointrend_r101_512x1024_80k_cityscapes_20200711_170850-d0ca84be.pt | point_rend模型在开源社区上的pointrend_r101_512x1024_80k_cityscapes_20200711_170850-d0ca84be.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/point_rend/README.md|MMseg-swin/configs/point_rend/point_rend.yml | https://download.openmmlab.com/mmsegmentation/v0.5/point_rend/pointrend_r50_512x512_160k_ade20k/pointrend_r50_512x512_160k_ade20k_20200807_232644-ac3febf2.pt | point_rend模型在开源社区上的pointrend_r50_512x512_160k_ade20k_20200807_232644-ac3febf2.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/point_rend/README.md|MMseg-swin/configs/point_rend/point_rend.yml | https://download.openmmlab.com/mmsegmentation/v0.5/point_rend/pointrend_r101_512x512_160k_ade20k/pointrend_r101_512x512_160k_ade20k_20200808_030852-8834902a.pt | point_rend模型在开源社区上的pointrend_r101_512x512_160k_ade20k_20200808_030852-8834902a.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/psanet/README.md|MMseg-swin/configs/psanet/psanet.yml | https://openaccess.thecvf.com/content_ECCV_2018/papers/Hengshuang_Zhao_PSANet_Point-wise_Spatial_ECCV_2018_paper.pd | 开源论文url链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/psanet/README.md|MMseg-swin/configs/psanet/psanet.yml | https://github.com/open-mmlab/mmsegmentation/blob/v0.17.0/mmseg/models/decode_heads/psa_head.py#L1 | 预训练模型下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/psanet/README.md|MMseg-swin/configs/psanet/psanet.yml | https://github.com/hszhao/PSANe | 源码下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/psanet/README.md|MMseg-swin/configs/psanet/psanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r50-d8_512x1024_40k_cityscapes/psanet_r50-d8_512x1024_40k_cityscapes_20200606_103117-99fac37c.pt | psanet模型在开源社区上的psanet_r50-d8_512x1024_40k_cityscapes_20200606_103117-99fac37c.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/psanet/README.md|MMseg-swin/configs/psanet/psanet.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r101-d8_512x1024_40k_cityscapes/psanet_r101-d8_512x1024_40k_cityscapes_20200606_001418-27b9cfa7.pt | psanet模型在开源社区上的psanet_r101-d8_512x1024_40k_cityscapes_20200606_001418-27b9cfa7.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/psanet/README.md|MMseg-swin/configs/psanet/psanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r50-d8_769x769_40k_cityscapes/psanet_r50-d8_769x769_40k_cityscapes_20200530_033717-d5365506.pt | psanet模型在开源社区上的psanet_r50-d8_769x769_40k_cityscapes_20200530_033717-d5365506.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/psanet/README.md|MMseg-swin/configs/psanet/psanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r101-d8_769x769_40k_cityscapes/psanet_r101-d8_769x769_40k_cityscapes_20200530_035107-997da1e6.pt | psanet模型在开源社区上的psanet_r101-d8_769x769_40k_cityscapes_20200530_035107-997da1e6.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/psanet/README.md|MMseg-swin/configs/psanet/psanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r50-d8_512x1024_80k_cityscapes/psanet_r50-d8_512x1024_80k_cityscapes_20200606_161842-ab60a24f.pt | psanet模型在开源社区上的psanet_r50-d8_512x1024_80k_cityscapes_20200606_161842-ab60a24f.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/psanet/README.md|MMseg-swin/configs/psanet/psanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r101-d8_512x1024_80k_cityscapes/psanet_r101-d8_512x1024_80k_cityscapes_20200606_161823-0f73a169.pt | psanet模型在开源社区上的psanet_r101-d8_512x1024_80k_cityscapes_20200606_161823-0f73a169.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/psanet/README.md|MMseg-swin/configs/psanet/psanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r50-d8_769x769_80k_cityscapes/psanet_r50-d8_769x769_80k_cityscapes_20200606_225134-fe42f49e.pt | psanet模型在开源社区上的psanet_r50-d8_769x769_80k_cityscapes_20200606_225134-fe42f49e.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/psanet/README.md|MMseg-swin/configs/psanet/psanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r101-d8_769x769_80k_cityscapes/psanet_r101-d8_769x769_80k_cityscapes_20200606_214550-7665827b.pt | psanet模型在开源社区上的psanet_r101-d8_769x769_80k_cityscapes_20200606_214550-7665827b.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/psanet/README.md|MMseg-swin/configs/psanet/psanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r50-d8_512x512_80k_ade20k/psanet_r50-d8_512x512_80k_ade20k_20200614_144141-835e4b97.pt | psanet模型在开源社区上的psanet_r50-d8_512x512_80k_ade20k_20200614_144141-835e4b97.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/psanet/README.md|MMseg-swin/configs/psanet/psanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r101-d8_512x512_80k_ade20k/psanet_r101-d8_512x512_80k_ade20k_20200614_185117-1fab60d4.pt | psanet模型在开源社区上的psanet_r101-d8_512x512_80k_ade20k_20200614_185117-1fab60d4.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/psanet/README.md|MMseg-swin/configs/psanet/psanet.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r50-d8_512x512_160k_ade20k/psanet_r50-d8_512x512_160k_ade20k_20200615_161258-148077dd.pt | psanet模型在开源社区上的psanet_r50-d8_512x512_160k_ade20k_20200615_161258-148077dd.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/psanet/README.md|MMseg-swin/configs/psanet/psanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r101-d8_512x512_160k_ade20k/psanet_r101-d8_512x512_160k_ade20k_20200615_161537-dbfa564c.pt | psanet模型在开源社区上的psanet_r101-d8_512x512_160k_ade20k_20200615_161537-dbfa564c.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/psanet/README.md|MMseg-swin/configs/psanet/psanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r50-d8_512x512_20k_voc12aug/psanet_r50-d8_512x512_20k_voc12aug_20200617_102413-2f1bbaa1.pt | psanet模型在开源社区上的psanet_r50-d8_512x512_20k_voc12aug_20200617_102413-2f1bbaa1.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/psanet/README.md|MMseg-swin/configs/psanet/psanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r101-d8_512x512_20k_voc12aug/psanet_r101-d8_512x512_20k_voc12aug_20200617_110624-946fef11.pt | psanet模型在开源社区上的psanet_r101-d8_512x512_20k_voc12aug_20200617_110624-946fef11.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/psanet/README.md|MMseg-swin/configs/psanet/psanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r50-d8_512x512_40k_voc12aug/psanet_r50-d8_512x512_40k_voc12aug_20200613_161946-f596afb5.pt | psanet模型在开源社区上的psanet_r50-d8_512x512_40k_voc12aug_20200613_161946-f596afb5.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/psanet/README.md|MMseg-swin/configs/psanet/psanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r101-d8_512x512_40k_voc12aug/psanet_r101-d8_512x512_40k_voc12aug_20200613_161946-1f560f9e.pt | psanet模型在开源社区上的psanet_r101-d8_512x512_40k_voc12aug_20200613_161946-1f560f9e.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/docs/zh_cn/advanced_guides/add_models.md|MMseg-swin/configs/pspnet/pspnet.yml | https://arxiv.org/abs/1612.0110 | 开源论文url链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/pspnet/README.md|MMseg-swin/configs/pspnet/pspnet.yml | https://github.com/open-mmlab/mmsegmentation/blob/v0.17.0/mmseg/models/decode_heads/psp_head.py#L6 | 预训练模型下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/pspnet/README.md|MMseg-swin/configs/pspnet/pspnet.yml | https://github.com/hszhao/PSPNe | 源码下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/demo/inference_demo.ipynb|MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_512x1024_40k_cityscapes/pspnet_r50-d8_512x1024_40k_cityscapes_20200605_003338-2966598c.pt | pspnet模型在开源社区上的pspnet_r50-d8_512x1024_40k_cityscapes_20200605_003338-2966598c.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/pspnet/README.md|MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_512x1024_40k_cityscapes/pspnet_r101-d8_512x1024_40k_cityscapes_20200604_232751-467e7cf4.pt | 
pspnet模型在开源社区上的pspnet_r101-d8_512x1024_40k_cityscapes_20200604_232751-467e7cf4.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/pspnet/README.md|MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_769x769_40k_cityscapes/pspnet_r50-d8_769x769_40k_cityscapes_20200606_112725-86638686.pt | pspnet模型在开源社区上的pspnet_r50-d8_769x769_40k_cityscapes_20200606_112725-86638686.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/pspnet/README.md|MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_769x769_40k_cityscapes/pspnet_r101-d8_769x769_40k_cityscapes_20200606_112753-61c6f5be.pt | pspnet模型在开源社区上的pspnet_r101-d8_769x769_40k_cityscapes_20200606_112753-61c6f5be.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/pspnet/README.md|MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r18-d8_512x1024_80k_cityscapes/pspnet_r18-d8_512x1024_80k_cityscapes_20201225_021458-09ffa746.pt | pspnet模型在开源社区上的pspnet_r18-d8_512x1024_80k_cityscapes_20201225_021458-09ffa746.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/pspnet/README.md|MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_512x1024_80k_cityscapes/pspnet_r50-d8_512x1024_80k_cityscapes_20200606_112131-2376f12b.pt | pspnet模型在开源社区上的pspnet_r50-d8_512x1024_80k_cityscapes_20200606_112131-2376f12b.pt的下载链接| -| 开发引入 | / |MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_rsb-pretrain_512x1024_adamw_80k_cityscapes/pspnet_r50-d8_rsb-pretrain_512x1024_adamw_80k_cityscapes_20220315_123238-588c30be.pt | pspnet模型在开源社区上的pspnet_r50-d8_rsb-pretrain_512x1024_adamw_80k_cityscapes_20220315_123238-588c30be.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/pspnet/README.md|MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_512x1024_80k_cityscapes/pspnet_r101-d8_512x1024_80k_cityscapes_20200606_112211-e1e1100f.pt | pspnet模型在开源社区上的pspnet_r101-d8_512x1024_80k_cityscapes_20200606_112211-e1e1100f.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/pspnet/README.md|MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_fp16_512x1024_80k_cityscapes/pspnet_r101-d8_fp16_512x1024_80k_cityscapes_20200717_230919-a0875e5c.pt | pspnet模型在开源社区上的pspnet_r101-d8_fp16_512x1024_80k_cityscapes_20200717_230919-a0875e5c.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/pspnet/README.md|MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r18-d8_769x769_80k_cityscapes/pspnet_r18-d8_769x769_80k_cityscapes_20201225_021458-3deefc62.pt | pspnet模型在开源社区上的pspnet_r18-d8_769x769_80k_cityscapes_20201225_021458-3deefc62.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/pspnet/README.md|MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_769x769_80k_cityscapes/pspnet_r50-d8_769x769_80k_cityscapes_20200606_210121-5ccf03dd.pt | 
pspnet模型在开源社区上的pspnet_r50-d8_769x769_80k_cityscapes_20200606_210121-5ccf03dd.pt的下载链接| -| 开发引入 | / |MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_769x769_80k_cityscapes/pspnet_r101-d8_769x769_80k_cityscapes_20200606_225055-dba412fa.pt | pspnet模型在开源社区上的pspnet_r101-d8_769x769_80k_cityscapes_20200606_225055-dba412fa.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/pspnet/README.md|MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r18b-d8_512x1024_80k_cityscapes/pspnet_r18b-d8_512x1024_80k_cityscapes_20201226_063116-26928a60.pt | pspnet模型在开源社区上的pspnet_r18b-d8_512x1024_80k_cityscapes_20201226_063116-26928a60.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/pspnet/README.md|MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50b-d8_512x1024_80k_cityscapes/pspnet_r50b-d8_512x1024_80k_cityscapes_20201225_094315-6344287a.pt | pspnet模型在开源社区上的pspnet_r50b-d8_512x1024_80k_cityscapes_20201225_094315-6344287a.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/pspnet/README.md|MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101b-d8_512x1024_80k_cityscapes/pspnet_r101b-d8_512x1024_80k_cityscapes_20201226_170012-3a4d38ab.pt | pspnet模型在开源社区上的pspnet_r101b-d8_512x1024_80k_cityscapes_20201226_170012-3a4d38ab.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/pspnet/README.md|MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r18b-d8_769x769_80k_cityscapes/pspnet_r18b-d8_769x769_80k_cityscapes_20201226_080942-bf98d186.pt | pspnet模型在开源社区上的pspnet_r18b-d8_769x769_80k_cityscapes_20201226_080942-bf98d186.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/pspnet/README.md|MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50b-d8_769x769_80k_cityscapes/pspnet_r50b-d8_769x769_80k_cityscapes_20201225_094316-4c643cf6.pt | pspnet模型在开源社区上的pspnet_r50b-d8_769x769_80k_cityscapes_20201225_094316-4c643cf6.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/pspnet/README.md|MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101b-d8_769x769_80k_cityscapes/pspnet_r101b-d8_769x769_80k_cityscapes_20201226_171823-f0e7c293.pt | pspnet模型在开源社区上的pspnet_r101b-d8_769x769_80k_cityscapes_20201226_171823-f0e7c293.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/pspnet/README.md|MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d32_512x1024_80k_cityscapes/pspnet_r50-d32_512x1024_80k_cityscapes_20220316_224840-9092b254.pt | pspnet模型在开源社区上的pspnet_r50-d32_512x1024_80k_cityscapes_20220316_224840-9092b254.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/pspnet/README.md|MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d32_rsb-pretrain_512x1024_adamw_80k_cityscapes/pspnet_r50-d32_rsb-pretrain_512x1024_adamw_80k_cityscapes_20220316_141229-dd9c9610.pt | 
pspnet模型在开源社区上的pspnet_r50-d32_rsb-pretrain_512x1024_adamw_80k_cityscapes_20220316_141229-dd9c9610.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/pspnet/README.md|MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50b-d32_512x1024_80k_cityscapes/pspnet_r50b-d32_512x1024_80k_cityscapes_20220311_152152-23bcaf8c.pt | pspnet模型在开源社区上的pspnet_r50b-d32_512x1024_80k_cityscapes_20220311_152152-23bcaf8c.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/pspnet/README.md|MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_512x512_80k_ade20k/pspnet_r50-d8_512x512_80k_ade20k_20200615_014128-15a8b914.pt | pspnet模型在开源社区上的pspnet_r50-d8_512x512_80k_ade20k_20200615_014128-15a8b914.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/pspnet/README.md|MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_512x512_80k_ade20k/pspnet_r101-d8_512x512_80k_ade20k_20200614_031423-b6e782f0.pt | pspnet模型在开源社区上的pspnet_r101-d8_512x512_80k_ade20k_20200614_031423-b6e782f0.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/pspnet/README.md|MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_512x512_160k_ade20k/pspnet_r50-d8_512x512_160k_ade20k_20200615_184358-1890b0bd.pt | pspnet模型在开源社区上的pspnet_r50-d8_512x512_160k_ade20k_20200615_184358-1890b0bd.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/pspnet/README.md|MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_512x512_160k_ade20k/pspnet_r101-d8_512x512_160k_ade20k_20200615_100650-967c316f.pt | pspnet模型在开源社区上的pspnet_r101-d8_512x512_160k_ade20k_20200615_100650-967c316f.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/pspnet/README.md|MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_512x512_20k_voc12aug/pspnet_r50-d8_512x512_20k_voc12aug_20200617_101958-ed5dfbd9.pt | pspnet模型在开源社区上的pspnet_r50-d8_512x512_20k_voc12aug_20200617_101958-ed5dfbd9.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/pspnet/README.md|MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_512x512_20k_voc12aug/pspnet_r101-d8_512x512_20k_voc12aug_20200617_102003-4aef3c9a.pt | pspnet模型在开源社区上的pspnet_r101-d8_512x512_20k_voc12aug_20200617_102003-4aef3c9a.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/pspnet/README.md|MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_512x512_40k_voc12aug/pspnet_r50-d8_512x512_40k_voc12aug_20200613_161222-ae9c1b8c.pt | pspnet模型在开源社区上的pspnet_r50-d8_512x512_40k_voc12aug_20200613_161222-ae9c1b8c.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/pspnet/README.md|MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_512x512_40k_voc12aug/pspnet_r101-d8_512x512_40k_voc12aug_20200613_161222-bc933b18.pt | pspnet模型在开源社区上的pspnet_r101-d8_512x512_40k_voc12aug_20200613_161222-bc933b18.pt的下载链接| -| 开源代码引入 | 
https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/pspnet/README.md|MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_480x480_40k_pascal_context/pspnet_r101-d8_480x480_40k_pascal_context_20200911_211210-bf0f5d7c.pt | pspnet模型在开源社区上的pspnet_r101-d8_480x480_40k_pascal_context_20200911_211210-bf0f5d7c.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/pspnet/README.md|MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_480x480_80k_pascal_context/pspnet_r101-d8_480x480_80k_pascal_context_20200911_190530-c86d6233.pt | pspnet模型在开源社区上的pspnet_r101-d8_480x480_80k_pascal_context_20200911_190530-c86d6233.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/pspnet/README.md|MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_480x480_40k_pascal_context_59/pspnet_r101-d8_480x480_40k_pascal_context_59_20210416_114524-86d44cd4.pt | pspnet模型在开源社区上的pspnet_r101-d8_480x480_40k_pascal_context_59_20210416_114524-86d44cd4.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/pspnet/README.md|MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_480x480_80k_pascal_context_59/pspnet_r101-d8_480x480_80k_pascal_context_59_20210416_114418-fa6caaa2.pt | pspnet模型在开源社区上的pspnet_r101-d8_480x480_80k_pascal_context_59_20210416_114418-fa6caaa2.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/pspnet/README.md|MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_512x512_4x4_20k_coco-stuff10k/pspnet_r50-d8_512x512_4x4_20k_coco-stuff10k_20210820_203258-b88df27f.pt | pspnet模型在开源社区上的pspnet_r50-d8_512x512_4x4_20k_coco-stuff10k_20210820_203258-b88df27f.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/pspnet/README.md|MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_512x512_4x4_20k_coco-stuff10k/pspnet_r101-d8_512x512_4x4_20k_coco-stuff10k_20210820_232135-76aae482.pt | pspnet模型在开源社区上的pspnet_r101-d8_512x512_4x4_20k_coco-stuff10k_20210820_232135-76aae482.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/pspnet/README.md|MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_512x512_4x4_40k_coco-stuff10k/pspnet_r50-d8_512x512_4x4_40k_coco-stuff10k_20210821_030857-92e2902b.pt | pspnet模型在开源社区上的pspnet_r50-d8_512x512_4x4_40k_coco-stuff10k_20210821_030857-92e2902b.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/pspnet/README.md|MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_512x512_4x4_40k_coco-stuff10k/pspnet_r101-d8_512x512_4x4_40k_coco-stuff10k_20210821_014022-831aec95.pt | pspnet模型在开源社区上的pspnet_r101-d8_512x512_4x4_40k_coco-stuff10k_20210821_014022-831aec95.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/pspnet/README.md|MMseg-swin/configs/pspnet/pspnet.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_512x512_4x4_80k_coco-stuff164k/pspnet_r50-d8_512x512_4x4_80k_coco-stuff164k_20210707_152034-0e41b2db.pt | pspnet模型在开源社区上的pspnet_r50-d8_512x512_4x4_80k_coco-stuff164k_20210707_152034-0e41b2db.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/pspnet/README.md|MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_512x512_4x4_80k_coco-stuff164k/pspnet_r101-d8_512x512_4x4_80k_coco-stuff164k_20210707_152034-7eb41789.pt | pspnet模型在开源社区上的pspnet_r101-d8_512x512_4x4_80k_coco-stuff164k_20210707_152034-7eb41789.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/pspnet/README.md|MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_512x512_4x4_160k_coco-stuff164k/pspnet_r50-d8_512x512_4x4_160k_coco-stuff164k_20210707_152004-51276a57.pt | pspnet模型在开源社区上的pspnet_r50-d8_512x512_4x4_160k_coco-stuff164k_20210707_152004-51276a57.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/pspnet/README.md|MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_512x512_4x4_160k_coco-stuff164k/pspnet_r101-d8_512x512_4x4_160k_coco-stuff164k_20210707_152004-4af9621b.pt | pspnet模型在开源社区上的pspnet_r101-d8_512x512_4x4_160k_coco-stuff164k_20210707_152004-4af9621b.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/pspnet/README.md|MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_512x512_4x4_320k_coco-stuff164k/pspnet_r50-d8_512x512_4x4_320k_coco-stuff164k_20210707_152004-be9610cc.pt | pspnet模型在开源社区上的pspnet_r50-d8_512x512_4x4_320k_coco-stuff164k_20210707_152004-be9610cc.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/pspnet/README.md|MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_512x512_4x4_320k_coco-stuff164k/pspnet_r101-d8_512x512_4x4_320k_coco-stuff164k_20210707_152004-72220c60.pt | pspnet模型在开源社区上的pspnet_r101-d8_512x512_4x4_320k_coco-stuff164k_20210707_152004-72220c60.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/pspnet/README.md|MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r18-d8_512x512_80k_loveda/pspnet_r18-d8_512x512_80k_loveda_20211105_052100-b97697f1.pt | pspnet模型在开源社区上的pspnet_r18-d8_512x512_80k_loveda_20211105_052100-b97697f1.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/pspnet/README.md|MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_512x512_80k_loveda/pspnet_r50-d8_512x512_80k_loveda_20211104_155728-88610f9f.pt | pspnet模型在开源社区上的pspnet_r50-d8_512x512_80k_loveda_20211104_155728-88610f9f.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/pspnet/README.md|MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_512x512_80k_loveda/pspnet_r101-d8_512x512_80k_loveda_20211104_153212-1c06c6a8.pt | pspnet模型在开源社区上的pspnet_r101-d8_512x512_80k_loveda_20211104_153212-1c06c6a8.pt的下载链接| -| 开源代码引入 | 
https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/pspnet/README.md|MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r18-d8_4x4_512x512_80k_potsdam/pspnet_r18-d8_4x4_512x512_80k_potsdam_20211220_125612-7cd046e1.pt | pspnet模型在开源社区上的pspnet_r18-d8_4x4_512x512_80k_potsdam_20211220_125612-7cd046e1.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/pspnet/README.md|MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_4x4_512x512_80k_potsdam/pspnet_r50-d8_4x4_512x512_80k_potsdam_20211219_043541-2dd5fe67.pt | pspnet模型在开源社区上的pspnet_r50-d8_4x4_512x512_80k_potsdam_20211219_043541-2dd5fe67.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/pspnet/README.md|MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_4x4_512x512_80k_potsdam/pspnet_r101-d8_4x4_512x512_80k_potsdam_20211220_125612-aed036c4.pt | pspnet模型在开源社区上的pspnet_r101-d8_4x4_512x512_80k_potsdam_20211220_125612-aed036c4.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/pspnet/README.md|MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r18-d8_4x4_512x512_80k_vaihingen/pspnet_r18-d8_4x4_512x512_80k_vaihingen_20211228_160355-52a8a6f6.pt | pspnet模型在开源社区上的pspnet_r18-d8_4x4_512x512_80k_vaihingen_20211228_160355-52a8a6f6.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/pspnet/README.md|MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_4x4_512x512_80k_vaihingen/pspnet_r50-d8_4x4_512x512_80k_vaihingen_20211228_160355-382f8f5b.pt | pspnet模型在开源社区上的pspnet_r50-d8_4x4_512x512_80k_vaihingen_20211228_160355-382f8f5b.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/pspnet/README.md|MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_4x4_512x512_80k_vaihingen/pspnet_r101-d8_4x4_512x512_80k_vaihingen_20211231_230806-8eba0a09.pt | pspnet模型在开源社区上的pspnet_r101-d8_4x4_512x512_80k_vaihingen_20211231_230806-8eba0a09.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/pspnet/README.md|MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r18-d8_4x4_896x896_80k_isaid/pspnet_r18-d8_4x4_896x896_80k_isaid_20220110_180526-e84c0b6a.pt | pspnet模型在开源社区上的pspnet_r18-d8_4x4_896x896_80k_isaid_20220110_180526-e84c0b6a.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/pspnet/README.md|MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_4x4_896x896_80k_isaid/pspnet_r50-d8_4x4_896x896_80k_isaid_20220110_180629-1f21dc32.pt | pspnet模型在开源社区上的pspnet_r50-d8_4x4_896x896_80k_isaid_20220110_180629-1f21dc32.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/resnest/README.md|MMseg-swin/configs/resnest/resnest.yml | https://download.openmmlab.com/mmsegmentation/v0.5/resnest/fcn_s101-d8_512x1024_80k_cityscapes/fcn_s101-d8_512x1024_80k_cityscapes_20200807_140631-f8d155b3.pt | resnest模型在开源社区上的fcn_s101-d8_512x1024_80k_cityscapes_20200807_140631-f8d155b3.pt的下载链接| -| 开源代码引入 | 
https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/resnest/README.md|MMseg-swin/configs/resnest/resnest.yml | https://download.openmmlab.com/mmsegmentation/v0.5/resnest/pspnet_s101-d8_512x1024_80k_cityscapes/pspnet_s101-d8_512x1024_80k_cityscapes_20200807_140631-c75f3b99.pt | resnest模型在开源社区上的pspnet_s101-d8_512x1024_80k_cityscapes_20200807_140631-c75f3b99.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/resnest/README.md|MMseg-swin/configs/resnest/resnest.yml | https://download.openmmlab.com/mmsegmentation/v0.5/resnest/deeplabv3_s101-d8_512x1024_80k_cityscapes/deeplabv3_s101-d8_512x1024_80k_cityscapes_20200807_144429-b73c4270.pt | resnest模型在开源社区上的deeplabv3_s101-d8_512x1024_80k_cityscapes_20200807_144429-b73c4270.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/resnest/README.md|MMseg-swin/configs/resnest/resnest.yml | https://download.openmmlab.com/mmsegmentation/v0.5/resnest/deeplabv3plus_s101-d8_512x1024_80k_cityscapes/deeplabv3plus_s101-d8_512x1024_80k_cityscapes_20200807_144429-1239eb43.pt | resnest模型在开源社区上的deeplabv3plus_s101-d8_512x1024_80k_cityscapes_20200807_144429-1239eb43.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/resnest/README.md|MMseg-swin/configs/resnest/resnest.yml | https://download.openmmlab.com/mmsegmentation/v0.5/resnest/fcn_s101-d8_512x512_160k_ade20k/fcn_s101-d8_512x512_160k_ade20k_20200807_145416-d3160329.pt | resnest模型在开源社区上的fcn_s101-d8_512x512_160k_ade20k_20200807_145416-d3160329.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/resnest/README.md|MMseg-swin/configs/resnest/resnest.yml | https://download.openmmlab.com/mmsegmentation/v0.5/resnest/pspnet_s101-d8_512x512_160k_ade20k/pspnet_s101-d8_512x512_160k_ade20k_20200807_145416-a6daa92a.pt | resnest模型在开源社区上的pspnet_s101-d8_512x512_160k_ade20k_20200807_145416-a6daa92a.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/resnest/README.md|MMseg-swin/configs/resnest/resnest.yml | https://download.openmmlab.com/mmsegmentation/v0.5/resnest/deeplabv3_s101-d8_512x512_160k_ade20k/deeplabv3_s101-d8_512x512_160k_ade20k_20200807_144503-17ecabe5.pt | resnest模型在开源社区上的deeplabv3_s101-d8_512x512_160k_ade20k_20200807_144503-17ecabe5.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/resnest/README.md|MMseg-swin/configs/resnest/resnest.yml | https://download.openmmlab.com/mmsegmentation/v0.5/resnest/deeplabv3plus_s101-d8_512x512_160k_ade20k/deeplabv3plus_s101-d8_512x512_160k_ade20k_20200807_144503-27b26226.pt | resnest模型在开源社区上的deeplabv3plus_s101-d8_512x512_160k_ade20k_20200807_144503-27b26226.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/segformer/README.md|MMseg-swin/configs/segformer/segformer.yml | https://arxiv.org/abs/2105.1520 | 开源论文url链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/segformer/README.md|MMseg-swin/configs/segformer/segformer.yml | https://github.com/open-mmlab/mmsegmentation/blob/v0.17.0/mmseg/models/backbones/mit.py#L24 | 预训练模型下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/segformer/README.md|MMseg-swin/configs/segformer/segformer.yml | https://github.com/NVlabs/SegForme | 源码下载链接| -| 开发引入 | / |MMseg-swin/configs/segformer/segformer.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/segformer/segformer_mit-b0_512x512_160k_ade20k/segformer_mit-b0_512x512_160k_ade20k_20220617_162207-c00b9603.pt | segformer模型在开源社区上的segformer_mit-b0_512x512_160k_ade20k_20220617_162207-c00b9603.pt的下载链接| -| 开发引入 | / |MMseg-swin/configs/segformer/segformer.yml | https://download.openmmlab.com/mmsegmentation/v0.5/segformer/segformer_mit-b1_512x512_160k_ade20k/segformer_mit-b1_512x512_160k_ade20k_20220620_112037-c3f39e00.pt | segformer模型在开源社区上的segformer_mit-b1_512x512_160k_ade20k_20220620_112037-c3f39e00.pt的下载链接| -| 开发引入 | / |MMseg-swin/configs/segformer/segformer.yml | https://download.openmmlab.com/mmsegmentation/v0.5/segformer/segformer_mit-b2_512x512_160k_ade20k/segformer_mit-b2_512x512_160k_ade20k_20220620_114047-64e4feca.pt | segformer模型在开源社区上的segformer_mit-b2_512x512_160k_ade20k_20220620_114047-64e4feca.pt的下载链接| -| 开发引入 | / |MMseg-swin/configs/segformer/segformer.yml | https://download.openmmlab.com/mmsegmentation/v0.5/segformer/segformer_mit-b3_512x512_160k_ade20k/segformer_mit-b3_512x512_160k_ade20k_20220617_162254-3a4b7363.pt | segformer模型在开源社区上的segformer_mit-b3_512x512_160k_ade20k_20220617_162254-3a4b7363.pt的下载链接| -| 开发引入 | / |MMseg-swin/configs/segformer/segformer.yml | https://download.openmmlab.com/mmsegmentation/v0.5/segformer/segformer_mit-b4_512x512_160k_ade20k/segformer_mit-b4_512x512_160k_ade20k_20220620_112216-4fa4f58f.pt | segformer模型在开源社区上的segformer_mit-b4_512x512_160k_ade20k_20220620_112216-4fa4f58f.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/segformer/README.md|MMseg-swin/configs/segformer/segformer.yml | https://download.openmmlab.com/mmsegmentation/v0.5/segformer/segformer_mit-b5_512x512_160k_ade20k/segformer_mit-b5_512x512_160k_ade20k_20210726_145235-94cedf59.pt | segformer模型在开源社区上的segformer_mit-b5_512x512_160k_ade20k_20210726_145235-94cedf59.pt的下载链接| -| 开发引入 | / |MMseg-swin/configs/segformer/segformer.yml | https://download.openmmlab.com/mmsegmentation/v0.5/segformer/segformer_mit-b5_640x640_160k_ade20k/segformer_mit-b5_640x640_160k_ade20k_20220617_203542-940a6bd8.pt | segformer模型在开源社区上的segformer_mit-b5_640x640_160k_ade20k_20220617_203542-940a6bd8.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/segformer/README.md|MMseg-swin/configs/segformer/segformer.yml | https://download.openmmlab.com/mmsegmentation/v0.5/segformer/segformer_mit-b0_8x1_1024x1024_160k_cityscapes/segformer_mit-b0_8x1_1024x1024_160k_cityscapes_20211208_101857-e7f88502.pt | segformer模型在开源社区上的segformer_mit-b0_8x1_1024x1024_160k_cityscapes_20211208_101857-e7f88502.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/segformer/README.md|MMseg-swin/configs/segformer/segformer.yml | https://download.openmmlab.com/mmsegmentation/v0.5/segformer/segformer_mit-b1_8x1_1024x1024_160k_cityscapes/segformer_mit-b1_8x1_1024x1024_160k_cityscapes_20211208_064213-655c7b3f.pt | segformer模型在开源社区上的segformer_mit-b1_8x1_1024x1024_160k_cityscapes_20211208_064213-655c7b3f.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/segformer/README.md|MMseg-swin/configs/segformer/segformer.yml | https://download.openmmlab.com/mmsegmentation/v0.5/segformer/segformer_mit-b2_8x1_1024x1024_160k_cityscapes/segformer_mit-b2_8x1_1024x1024_160k_cityscapes_20211207_134205-6096669a.pt | segformer模型在开源社区上的segformer_mit-b2_8x1_1024x1024_160k_cityscapes_20211207_134205-6096669a.pt的下载链接| -| 开源代码引入 | 
https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/segformer/README.md|MMseg-swin/configs/segformer/segformer.yml | https://download.openmmlab.com/mmsegmentation/v0.5/segformer/segformer_mit-b3_8x1_1024x1024_160k_cityscapes/segformer_mit-b3_8x1_1024x1024_160k_cityscapes_20211206_224823-a8f8a177.pt | segformer模型在开源社区上的segformer_mit-b3_8x1_1024x1024_160k_cityscapes_20211206_224823-a8f8a177.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/segformer/README.md|MMseg-swin/configs/segformer/segformer.yml | https://download.openmmlab.com/mmsegmentation/v0.5/segformer/segformer_mit-b4_8x1_1024x1024_160k_cityscapes/segformer_mit-b4_8x1_1024x1024_160k_cityscapes_20211207_080709-07f6c333.pt | segformer模型在开源社区上的segformer_mit-b4_8x1_1024x1024_160k_cityscapes_20211207_080709-07f6c333.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/segformer/README.md|MMseg-swin/configs/segformer/segformer.yml | https://download.openmmlab.com/mmsegmentation/v0.5/segformer/segformer_mit-b5_8x1_1024x1024_160k_cityscapes/segformer_mit-b5_8x1_1024x1024_160k_cityscapes_20211206_072934-87a052ec.pt | segformer模型在开源社区上的segformer_mit-b5_8x1_1024x1024_160k_cityscapes_20211206_072934-87a052ec.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/segmenter/README.md|MMseg-swin/configs/segmenter/segmenter.yml | https://arxiv.org/abs/2105.0563 | 开源论文url链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/segmenter/README.md|MMseg-swin/configs/segmenter/segmenter.yml | https://github.com/open-mmlab/mmsegmentation/blob/v0.21.0/mmseg/models/decode_heads/segmenter_mask_head.py#L1 | 预训练模型下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/segmenter/README.md|MMseg-swin/configs/segmenter/segmenter.yml | https://github.com/rstrudel/segmente | 源码下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/segmenter/README.md|MMseg-swin/configs/segmenter/segmenter.yml | https://download.openmmlab.com/mmsegmentation/v0.5/segmenter/segmenter_vit-t_mask_8x1_512x512_160k_ade20k/segmenter_vit-t_mask_8x1_512x512_160k_ade20k_20220105_151706-ffcf7509.pt | segmenter模型在开源社区上的segmenter_vit-t_mask_8x1_512x512_160k_ade20k_20220105_151706-ffcf7509.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/segmenter/README.md|MMseg-swin/configs/segmenter/segmenter.yml | https://download.openmmlab.com/mmsegmentation/v0.5/segmenter/segmenter_vit-s_linear_8x1_512x512_160k_ade20k/segmenter_vit-s_linear_8x1_512x512_160k_ade20k_20220105_151713-39658c46.pt | segmenter模型在开源社区上的segmenter_vit-s_linear_8x1_512x512_160k_ade20k_20220105_151713-39658c46.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/segmenter/README.md|MMseg-swin/configs/segmenter/segmenter.yml | https://download.openmmlab.com/mmsegmentation/v0.5/segmenter/segmenter_vit-s_mask_8x1_512x512_160k_ade20k/segmenter_vit-s_mask_8x1_512x512_160k_ade20k_20220105_151706-511bb103.pt | segmenter模型在开源社区上的segmenter_vit-s_mask_8x1_512x512_160k_ade20k_20220105_151706-511bb103.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/segmenter/README.md|MMseg-swin/configs/segmenter/segmenter.yml | https://download.openmmlab.com/mmsegmentation/v0.5/segmenter/segmenter_vit-b_mask_8x1_512x512_160k_ade20k/segmenter_vit-b_mask_8x1_512x512_160k_ade20k_20220105_151706-bc533b08.pt | 
segmenter模型在开源社区上的segmenter_vit-b_mask_8x1_512x512_160k_ade20k_20220105_151706-bc533b08.pt的下载链接| -| 开发引入 | / |MMseg-swin/configs/segmenter/segmenter.yml | https://download.openmmlab.com/mmsegmentation/v0.5/segmenter/segmenter_vit-l_mask_8x1_640x640_160k_ade20k/segmenter_vit-l_mask_8x1_640x640_160k_ade20k_20220614_024513-4783a347.pt | segmenter模型在开源社区上的segmenter_vit-l_mask_8x1_640x640_160k_ade20k_20220614_024513-4783a347.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/sem_fpn/README.md|MMseg-swin/configs/sem_fpn/sem_fpn.yml | https://arxiv.org/abs/1901.0244 | 开源论文url链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/sem_fpn/README.md|MMseg-swin/configs/sem_fpn/sem_fpn.yml | https://github.com/open-mmlab/mmsegmentation/blob/v0.17.0/mmseg/models/decode_heads/fpn_head.py#L1 | 预训练模型下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/projects/sam_inference_demo/sam/modeling/common.py|MMseg-swin/configs/sem_fpn/sem_fpn.yml | https://github.com/facebookresearch/detectron | github.com模型在开源社区上的detectron的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/sem_fpn/README.md|MMseg-swin/configs/sem_fpn/sem_fpn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/sem_fpn/fpn_r50_512x1024_80k_cityscapes/fpn_r50_512x1024_80k_cityscapes_20200717_021437-94018a0d.pt | sem_fpn模型在开源社区上的fpn_r50_512x1024_80k_cityscapes_20200717_021437-94018a0d.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/sem_fpn/README.md|MMseg-swin/configs/sem_fpn/sem_fpn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/sem_fpn/fpn_r101_512x1024_80k_cityscapes/fpn_r101_512x1024_80k_cityscapes_20200717_012416-c5800d4c.pt | sem_fpn模型在开源社区上的fpn_r101_512x1024_80k_cityscapes_20200717_012416-c5800d4c.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/sem_fpn/README.md|MMseg-swin/configs/sem_fpn/sem_fpn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/sem_fpn/fpn_r50_512x512_160k_ade20k/fpn_r50_512x512_160k_ade20k_20200718_131734-5b5a6ab9.pt | sem_fpn模型在开源社区上的fpn_r50_512x512_160k_ade20k_20200718_131734-5b5a6ab9.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/sem_fpn/README.md|MMseg-swin/configs/sem_fpn/sem_fpn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/sem_fpn/fpn_r101_512x512_160k_ade20k/fpn_r101_512x512_160k_ade20k_20200718_131734-306b5004.pt | sem_fpn模型在开源社区上的fpn_r101_512x512_160k_ade20k_20200718_131734-306b5004.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/setr/README.md|MMseg-swin/configs/setr/setr.yml | https://arxiv.org/abs/2012.1584 | 开源论文url链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/setr/README.md|MMseg-swin/configs/setr/setr.yml | https://github.com/open-mmlab/mmsegmentation/blob/v0.17.0/mmseg/models/decode_heads/setr_up_head.py#L1 | 预训练模型下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/setr/README.md|MMseg-swin/configs/setr/setr.yml | https://github.com/fudan-zvg/SET | 源码下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/setr/README.md|MMseg-swin/configs/setr/setr.yml | https://download.openmmlab.com/mmsegmentation/v0.5/setr/setr_naive_512x512_160k_b16_ade20k/setr_naive_512x512_160k_b16_ade20k_20210619_191258-061f24f5.pt | 
setr模型在开源社区上的setr_naive_512x512_160k_b16_ade20k_20210619_191258-061f24f5.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/setr/README.md|MMseg-swin/configs/setr/setr.yml | https://download.openmmlab.com/mmsegmentation/v0.5/setr/setr_pup_512x512_160k_b16_ade20k/setr_pup_512x512_160k_b16_ade20k_20210619_191343-7e0ce826.pt | setr模型在开源社区上的setr_pup_512x512_160k_b16_ade20k_20210619_191343-7e0ce826.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/setr/README.md|MMseg-swin/configs/setr/setr.yml | https://download.openmmlab.com/mmsegmentation/v0.5/setr/setr_mla_512x512_160k_b8_ade20k/setr_mla_512x512_160k_b8_ade20k_20210619_191118-c6d21df0.pt | setr模型在开源社区上的setr_mla_512x512_160k_b8_ade20k_20210619_191118-c6d21df0.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/setr/README.md|MMseg-swin/configs/setr/setr.yml | https://download.openmmlab.com/mmsegmentation/v0.5/setr/setr_mla_512x512_160k_b16_ade20k/setr_mla_512x512_160k_b16_ade20k_20210619_191057-f9741de7.pt | setr模型在开源社区上的setr_mla_512x512_160k_b16_ade20k_20210619_191057-f9741de7.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/setr/README.md|MMseg-swin/configs/setr/setr.yml | https://download.openmmlab.com/mmsegmentation/v0.5/setr/setr_naive_vit-large_8x1_768x768_80k_cityscapes/setr_naive_vit-large_8x1_768x768_80k_cityscapes_20211123_000505-20728e80.pt | setr模型在开源社区上的setr_naive_vit-large_8x1_768x768_80k_cityscapes_20211123_000505-20728e80.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/setr/README.md|MMseg-swin/configs/setr/setr.yml | https://download.openmmlab.com/mmsegmentation/v0.5/setr/setr_pup_vit-large_8x1_768x768_80k_cityscapes/setr_pup_vit-large_8x1_768x768_80k_cityscapes_20211122_155115-f6f37b8f.pt | setr模型在开源社区上的setr_pup_vit-large_8x1_768x768_80k_cityscapes_20211122_155115-f6f37b8f.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/setr/README.md|MMseg-swin/configs/setr/setr.yml | https://download.openmmlab.com/mmsegmentation/v0.5/setr/setr_mla_vit-large_8x1_768x768_80k_cityscapes/setr_mla_vit-large_8x1_768x768_80k_cityscapes_20211119_101003-7f8dccbe.pt | setr模型在开源社区上的setr_mla_vit-large_8x1_768x768_80k_cityscapes_20211119_101003-7f8dccbe.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/stdc/README.md|MMseg-swin/configs/stdc/stdc.yml | https://arxiv.org/abs/2104.1318 | 开源论文url链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/stdc/README.md|MMseg-swin/configs/stdc/stdc.yml | https://github.com/open-mmlab/mmsegmentation/blob/v0.20.0/mmseg/models/backbones/stdc.py#L39 | 预训练模型下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/stdc/README.md|MMseg-swin/configs/stdc/stdc.yml | https://github.com/MichaelFan01/STDC-Se | 源码下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/stdc/README.md|MMseg-swin/configs/stdc/stdc.yml | https://download.openmmlab.com/mmsegmentation/v0.5/stdc/stdc1_512x1024_80k_cityscapes/stdc1_512x1024_80k_cityscapes_20220224_073048-74e6920a.pt | stdc模型在开源社区上的stdc1_512x1024_80k_cityscapes_20220224_073048-74e6920a.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/stdc/README.md|MMseg-swin/configs/stdc/stdc.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/stdc/stdc1_in1k-pre_512x1024_80k_cityscapes/stdc1_in1k-pre_512x1024_80k_cityscapes_20220224_141648-3d4c2981.pt | stdc模型在开源社区上的stdc1_in1k-pre_512x1024_80k_cityscapes_20220224_141648-3d4c2981.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/stdc/README.md|MMseg-swin/configs/stdc/stdc.yml | https://download.openmmlab.com/mmsegmentation/v0.5/stdc/stdc2_512x1024_80k_cityscapes/stdc2_512x1024_80k_cityscapes_20220222_132015-fb1e3a1a.pt | stdc模型在开源社区上的stdc2_512x1024_80k_cityscapes_20220222_132015-fb1e3a1a.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/stdc/README.md|MMseg-swin/configs/stdc/stdc.yml | https://download.openmmlab.com/mmsegmentation/v0.5/stdc/stdc2_in1k-pre_512x1024_80k_cityscapes/stdc2_in1k-pre_512x1024_80k_cityscapes_20220224_073048-1f8f0f6c.pt | stdc模型在开源社区上的stdc2_in1k-pre_512x1024_80k_cityscapes_20220224_073048-1f8f0f6c.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/swin/README.md|MMseg-swin/configs/swin/swin.yml | https://download.openmmlab.com/mmsegmentation/v0.5/swin/upernet_swin_tiny_patch4_window7_512x512_160k_ade20k_pretrain_224x224_1K/upernet_swin_tiny_patch4_window7_512x512_160k_ade20k_pretrain_224x224_1K_20210531_112542-e380ad3e.pt | swin模型在开源社区上的upernet_swin_tiny_patch4_window7_512x512_160k_ade20k_pretrain_224x224_1K_20210531_112542-e380ad3e.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/swin/README.md|MMseg-swin/configs/swin/swin.yml | https://download.openmmlab.com/mmsegmentation/v0.5/swin/upernet_swin_small_patch4_window7_512x512_160k_ade20k_pretrain_224x224_1K/upernet_swin_small_patch4_window7_512x512_160k_ade20k_pretrain_224x224_1K_20210526_192015-ee2fff1c.pt | swin模型在开源社区上的upernet_swin_small_patch4_window7_512x512_160k_ade20k_pretrain_224x224_1K_20210526_192015-ee2fff1c.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/swin/README.md|MMseg-swin/configs/swin/swin.yml | https://download.openmmlab.com/mmsegmentation/v0.5/swin/upernet_swin_base_patch4_window7_512x512_160k_ade20k_pretrain_224x224_1K/upernet_swin_base_patch4_window7_512x512_160k_ade20k_pretrain_224x224_1K_20210526_192340-593b0e13.pt | swin模型在开源社区上的upernet_swin_base_patch4_window7_512x512_160k_ade20k_pretrain_224x224_1K_20210526_192340-593b0e13.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/swin/README.md|MMseg-swin/configs/swin/swin.yml | https://download.openmmlab.com/mmsegmentation/v0.5/swin/upernet_swin_base_patch4_window7_512x512_160k_ade20k_pretrain_224x224_22K/upernet_swin_base_patch4_window7_512x512_160k_ade20k_pretrain_224x224_22K_20210526_211650-762e2178.pt | swin模型在开源社区上的upernet_swin_base_patch4_window7_512x512_160k_ade20k_pretrain_224x224_22K_20210526_211650-762e2178.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/swin/README.md|MMseg-swin/configs/swin/swin.yml | https://download.openmmlab.com/mmsegmentation/v0.5/swin/upernet_swin_base_patch4_window12_512x512_160k_ade20k_pretrain_384x384_1K/upernet_swin_base_patch4_window12_512x512_160k_ade20k_pretrain_384x384_1K_20210531_132020-05b22ea4.pt | swin模型在开源社区上的upernet_swin_base_patch4_window12_512x512_160k_ade20k_pretrain_384x384_1K_20210531_132020-05b22ea4.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/swin/README.md|MMseg-swin/configs/swin/swin.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/swin/upernet_swin_base_patch4_window12_512x512_160k_ade20k_pretrain_384x384_22K/upernet_swin_base_patch4_window12_512x512_160k_ade20k_pretrain_384x384_22K_20210531_125459-429057bf.pt | swin模型在开源社区上的upernet_swin_base_patch4_window12_512x512_160k_ade20k_pretrain_384x384_22K_20210531_125459-429057bf.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/twins/README.md|MMseg-swin/configs/twins/twins.yml | https://download.openmmlab.com/mmsegmentation/v0.5/twins/twins_pcpvt-s_fpn_fpnhead_8x4_512x512_80k_ade20k/twins_pcpvt-s_fpn_fpnhead_8x4_512x512_80k_ade20k_20211201_204132-41acd132.pt | twins模型在开源社区上的twins_pcpvt-s_fpn_fpnhead_8x4_512x512_80k_ade20k_20211201_204132-41acd132.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/twins/README.md|MMseg-swin/configs/twins/twins.yml | https://download.openmmlab.com/mmsegmentation/v0.5/twins/twins_pcpvt-s_uperhead_8x4_512x512_160k_ade20k/twins_pcpvt-s_uperhead_8x4_512x512_160k_ade20k_20211201_233537-8e99c07a.pt | twins模型在开源社区上的twins_pcpvt-s_uperhead_8x4_512x512_160k_ade20k_20211201_233537-8e99c07a.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/twins/README.md|MMseg-swin/configs/twins/twins.yml | https://download.openmmlab.com/mmsegmentation/v0.5/twins/twins_pcpvt-b_fpn_fpnhead_8x4_512x512_80k_ade20k/twins_pcpvt-b_fpn_fpnhead_8x4_512x512_80k_ade20k_20211130_141019-d396db72.pt | twins模型在开源社区上的twins_pcpvt-b_fpn_fpnhead_8x4_512x512_80k_ade20k_20211130_141019-d396db72.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/twins/README.md|MMseg-swin/configs/twins/twins.yml | https://download.openmmlab.com/mmsegmentation/v0.5/twins/twins_pcpvt-b_uperhead_8x2_512x512_160k_ade20k/twins_pcpvt-b_uperhead_8x2_512x512_160k_ade20k_20211130_141020-02094ea5.pt | twins模型在开源社区上的twins_pcpvt-b_uperhead_8x2_512x512_160k_ade20k_20211130_141020-02094ea5.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/twins/README.md|MMseg-swin/configs/twins/twins.yml | https://download.openmmlab.com/mmsegmentation/v0.5/twins/twins_pcpvt-l_fpn_fpnhead_8x4_512x512_80k_ade20k/twins_pcpvt-l_fpn_fpnhead_8x4_512x512_80k_ade20k_20211201_105226-bc6d61dc.pt | twins模型在开源社区上的twins_pcpvt-l_fpn_fpnhead_8x4_512x512_80k_ade20k_20211201_105226-bc6d61dc.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/twins/README.md|MMseg-swin/configs/twins/twins.yml | https://download.openmmlab.com/mmsegmentation/v0.5/twins/twins_pcpvt-l_uperhead_8x2_512x512_160k_ade20k/twins_pcpvt-l_uperhead_8x2_512x512_160k_ade20k_20211201_075053-c6095c07.pt | twins模型在开源社区上的twins_pcpvt-l_uperhead_8x2_512x512_160k_ade20k_20211201_075053-c6095c07.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/twins/README.md|MMseg-swin/configs/twins/twins.yml | https://download.openmmlab.com/mmsegmentation/v0.5/twins/twins_svt-s_fpn_fpnhead_8x4_512x512_80k_ade20k/twins_svt-s_fpn_fpnhead_8x4_512x512_80k_ade20k_20211130_141006-0a0d3317.pt | twins模型在开源社区上的twins_svt-s_fpn_fpnhead_8x4_512x512_80k_ade20k_20211130_141006-0a0d3317.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/twins/README.md|MMseg-swin/configs/twins/twins.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/twins/twins_svt-s_uperhead_8x2_512x512_160k_ade20k/twins_svt-s_uperhead_8x2_512x512_160k_ade20k_20211130_141005-e48a2d94.pt | twins模型在开源社区上的twins_svt-s_uperhead_8x2_512x512_160k_ade20k_20211130_141005-e48a2d94.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/twins/README.md|MMseg-swin/configs/twins/twins.yml | https://download.openmmlab.com/mmsegmentation/v0.5/twins/twins_svt-b_fpn_fpnhead_8x4_512x512_80k_ade20k/twins_svt-b_fpn_fpnhead_8x4_512x512_80k_ade20k_20211201_113849-88b2907c.pt | twins模型在开源社区上的twins_svt-b_fpn_fpnhead_8x4_512x512_80k_ade20k_20211201_113849-88b2907c.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/twins/README.md|MMseg-swin/configs/twins/twins.yml | https://download.openmmlab.com/mmsegmentation/v0.5/twins/twins_svt-b_uperhead_8x2_512x512_160k_ade20k/twins_svt-b_uperhead_8x2_512x512_160k_ade20k_20211202_040826-0943a1f1.pt | twins模型在开源社区上的twins_svt-b_uperhead_8x2_512x512_160k_ade20k_20211202_040826-0943a1f1.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/twins/README.md|MMseg-swin/configs/twins/twins.yml | https://download.openmmlab.com/mmsegmentation/v0.5/twins/twins_svt-l_fpn_fpnhead_8x4_512x512_80k_ade20k/twins_svt-l_fpn_fpnhead_8x4_512x512_80k_ade20k_20211130_141005-1d59bee2.pt | twins模型在开源社区上的twins_svt-l_fpn_fpnhead_8x4_512x512_80k_ade20k_20211130_141005-1d59bee2.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/twins/README.md|MMseg-swin/configs/twins/twins.yml | https://download.openmmlab.com/mmsegmentation/v0.5/twins/twins_svt-l_uperhead_8x2_512x512_160k_ade20k/twins_svt-l_uperhead_8x2_512x512_160k_ade20k_20211130_141005-3e2cae61.pt | twins模型在开源社区上的twins_svt-l_uperhead_8x2_512x512_160k_ade20k_20211130_141005-3e2cae61.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/unet/README.md|MMseg-swin/configs/unet/unet.yml | https://arxiv.org/abs/1505.0459 | 开源论文url链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/unet/README.md|MMseg-swin/configs/unet/unet.yml | https://github.com/open-mmlab/mmsegmentation/blob/v0.17.0/mmseg/models/backbones/unet.py#L22 | 预训练模型下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/unet/README.md|MMseg-swin/configs/unet/unet.yml | http://lmb.informatik.uni-freiburg.de/people/ronneber/u-ne | 源码下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/unet/README.md|MMseg-swin/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/fcn_unet_s5-d16_4x4_512x1024_160k_cityscapes/fcn_unet_s5-d16_4x4_512x1024_160k_cityscapes_20211210_145204-6860854e.pt | unet模型在开源社区上的fcn_unet_s5-d16_4x4_512x1024_160k_cityscapes_20211210_145204-6860854e.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/unet/README.md|MMseg-swin/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/fcn_unet_s5-d16_64x64_40k_drive/fcn_unet_s5-d16_64x64_40k_drive_20201223_191051-5daf6d3b.pt | unet模型在开源社区上的fcn_unet_s5-d16_64x64_40k_drive_20201223_191051-5daf6d3b.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/unet/README.md|MMseg-swin/configs/unet/unet.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/unet/fcn_unet_s5-d16_ce-1.0-dice-3.0_64x64_40k_drive/fcn_unet_s5-d16_ce-1.0-dice-3.0_64x64_40k_drive_20211210_201820-785de5c2.pt | unet模型在开源社区上的fcn_unet_s5-d16_ce-1.0-dice-3.0_64x64_40k_drive_20211210_201820-785de5c2.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/unet/README.md|MMseg-swin/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/pspnet_unet_s5-d16_64x64_40k_drive/pspnet_unet_s5-d16_64x64_40k_drive_20201227_181818-aac73387.pt | unet模型在开源社区上的pspnet_unet_s5-d16_64x64_40k_drive_20201227_181818-aac73387.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/unet/README.md|MMseg-swin/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/pspnet_unet_s5-d16_ce-1.0-dice-3.0_64x64_40k_drive/pspnet_unet_s5-d16_ce-1.0-dice-3.0_64x64_40k_drive_20211210_201821-22b3e3ba.pt | unet模型在开源社区上的pspnet_unet_s5-d16_ce-1.0-dice-3.0_64x64_40k_drive_20211210_201821-22b3e3ba.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/unet/README.md|MMseg-swin/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/deeplabv3_unet_s5-d16_64x64_40k_drive/deeplabv3_unet_s5-d16_64x64_40k_drive_20201226_094047-0671ff20.pt | unet模型在开源社区上的deeplabv3_unet_s5-d16_64x64_40k_drive_20201226_094047-0671ff20.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/unet/README.md|MMseg-swin/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/deeplabv3_unet_s5-d16_ce-1.0-dice-3.0_64x64_40k_drive/deeplabv3_unet_s5-d16_ce-1.0-dice-3.0_64x64_40k_drive_20211210_201825-6bf0efd7.pt | unet模型在开源社区上的deeplabv3_unet_s5-d16_ce-1.0-dice-3.0_64x64_40k_drive_20211210_201825-6bf0efd7.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/unet/README.md|MMseg-swin/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/fcn_unet_s5-d16_128x128_40k_stare/fcn_unet_s5-d16_128x128_40k_stare_20201223_191051-7d77e78b.pt | unet模型在开源社区上的fcn_unet_s5-d16_128x128_40k_stare_20201223_191051-7d77e78b.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/unet/README.md|MMseg-swin/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/fcn_unet_s5-d16_ce-1.0-dice-3.0_128x128_40k_stare/fcn_unet_s5-d16_ce-1.0-dice-3.0_128x128_40k_stare_20211210_201821-f75705a9.pt | unet模型在开源社区上的fcn_unet_s5-d16_ce-1.0-dice-3.0_128x128_40k_stare_20211210_201821-f75705a9.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/unet/README.md|MMseg-swin/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/pspnet_unet_s5-d16_128x128_40k_stare/pspnet_unet_s5-d16_128x128_40k_stare_20201227_181818-3c2923c4.pt | unet模型在开源社区上的pspnet_unet_s5-d16_128x128_40k_stare_20201227_181818-3c2923c4.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/unet/README.md|MMseg-swin/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/pspnet_unet_s5-d16_ce-1.0-dice-3.0_128x128_40k_stare/pspnet_unet_s5-d16_ce-1.0-dice-3.0_128x128_40k_stare_20211210_201823-f1063ef7.pt | unet模型在开源社区上的pspnet_unet_s5-d16_ce-1.0-dice-3.0_128x128_40k_stare_20211210_201823-f1063ef7.pt的下载链接| -| 开源代码引入 | 
https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/unet/README.md|MMseg-swin/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/deeplabv3_unet_s5-d16_128x128_40k_stare/deeplabv3_unet_s5-d16_128x128_40k_stare_20201226_094047-93dcb93c.pt | unet模型在开源社区上的deeplabv3_unet_s5-d16_128x128_40k_stare_20201226_094047-93dcb93c.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/unet/README.md|MMseg-swin/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/deeplabv3_unet_s5-d16_ce-1.0-dice-3.0_128x128_40k_stare/deeplabv3_unet_s5-d16_ce-1.0-dice-3.0_128x128_40k_stare_20211210_201825-21db614c.pt | unet模型在开源社区上的deeplabv3_unet_s5-d16_ce-1.0-dice-3.0_128x128_40k_stare_20211210_201825-21db614c.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/unet/README.md|MMseg-swin/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/fcn_unet_s5-d16_128x128_40k_chase_db1/fcn_unet_s5-d16_128x128_40k_chase_db1_20201223_191051-11543527.pt | unet模型在开源社区上的fcn_unet_s5-d16_128x128_40k_chase_db1_20201223_191051-11543527.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/unet/README.md|MMseg-swin/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/fcn_unet_s5-d16_ce-1.0-dice-3.0_128x128_40k_chase-db1/fcn_unet_s5-d16_ce-1.0-dice-3.0_128x128_40k_chase-db1_20211210_201821-1c4eb7cf.pt | unet模型在开源社区上的fcn_unet_s5-d16_ce-1.0-dice-3.0_128x128_40k_chase-db1_20211210_201821-1c4eb7cf.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/unet/README.md|MMseg-swin/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/pspnet_unet_s5-d16_128x128_40k_chase_db1/pspnet_unet_s5-d16_128x128_40k_chase_db1_20201227_181818-68d4e609.pt | unet模型在开源社区上的pspnet_unet_s5-d16_128x128_40k_chase_db1_20201227_181818-68d4e609.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/unet/README.md|MMseg-swin/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/pspnet_unet_s5-d16_ce-1.0-dice-3.0_128x128_40k_chase-db1/pspnet_unet_s5-d16_ce-1.0-dice-3.0_128x128_40k_chase-db1_20211210_201823-c0802c4d.pt | unet模型在开源社区上的pspnet_unet_s5-d16_ce-1.0-dice-3.0_128x128_40k_chase-db1_20211210_201823-c0802c4d.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/unet/README.md|MMseg-swin/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/deeplabv3_unet_s5-d16_128x128_40k_chase_db1/deeplabv3_unet_s5-d16_128x128_40k_chase_db1_20201226_094047-4c5aefa3.pt | unet模型在开源社区上的deeplabv3_unet_s5-d16_128x128_40k_chase_db1_20201226_094047-4c5aefa3.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/unet/README.md|MMseg-swin/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/deeplabv3_unet_s5-d16_ce-1.0-dice-3.0_128x128_40k_chase-db1/deeplabv3_unet_s5-d16_ce-1.0-dice-3.0_128x128_40k_chase-db1_20211210_201825-4ef29df5.pt | unet模型在开源社区上的deeplabv3_unet_s5-d16_ce-1.0-dice-3.0_128x128_40k_chase-db1_20211210_201825-4ef29df5.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/unet/README.md|MMseg-swin/configs/unet/unet.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/unet/fcn_unet_s5-d16_256x256_40k_hrf/fcn_unet_s5-d16_256x256_40k_hrf_20201223_173724-d89cf1ed.pt | unet模型在开源社区上的fcn_unet_s5-d16_256x256_40k_hrf_20201223_173724-d89cf1ed.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/unet/README.md|MMseg-swin/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/fcn_unet_s5-d16_ce-1.0-dice-3.0_256x256_40k_hrf/fcn_unet_s5-d16_ce-1.0-dice-3.0_256x256_40k_hrf_20211210_201821-c314da8a.pt | unet模型在开源社区上的fcn_unet_s5-d16_ce-1.0-dice-3.0_256x256_40k_hrf_20211210_201821-c314da8a.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/unet/README.md|MMseg-swin/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/pspnet_unet_s5-d16_256x256_40k_hrf/pspnet_unet_s5-d16_256x256_40k_hrf_20201227_181818-fdb7e29b.pt | unet模型在开源社区上的pspnet_unet_s5-d16_256x256_40k_hrf_20201227_181818-fdb7e29b.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/unet/README.md|MMseg-swin/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/pspnet_unet_s5-d16_ce-1.0-dice-3.0_256x256_40k_hrf/pspnet_unet_s5-d16_ce-1.0-dice-3.0_256x256_40k_hrf_20211210_201823-53d492fa.pt | unet模型在开源社区上的pspnet_unet_s5-d16_ce-1.0-dice-3.0_256x256_40k_hrf_20211210_201823-53d492fa.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/unet/README.md|MMseg-swin/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/deeplabv3_unet_s5-d16_256x256_40k_hrf/deeplabv3_unet_s5-d16_256x256_40k_hrf_20201226_094047-3a1fdf85.pt | unet模型在开源社区上的deeplabv3_unet_s5-d16_256x256_40k_hrf_20201226_094047-3a1fdf85.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/unet/README.md|MMseg-swin/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/deeplabv3_unet_s5-d16_ce-1.0-dice-3.0_256x256_40k_hrf/deeplabv3_unet_s5-d16_ce-1.0-dice-3.0_256x256_40k_hrf_20211210_202032-59daf7a4.pt | unet模型在开源社区上的deeplabv3_unet_s5-d16_ce-1.0-dice-3.0_256x256_40k_hrf_20211210_202032-59daf7a4.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/upernet/README.md|MMseg-swin/configs/upernet/upernet.yml | https://arxiv.org/pdf/1807.10221.pd | 开源论文url链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/upernet/README.md|MMseg-swin/configs/upernet/upernet.yml | https://github.com/open-mmlab/mmsegmentation/blob/v0.17.0/mmseg/models/decode_heads/uper_head.py#L1 | 预训练模型下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/upernet/README.md|MMseg-swin/configs/upernet/upernet.yml | https://github.com/CSAILVision/unifiedparsin | 源码下载链接| -| 开发引入 | / |MMseg-swin/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r18_512x1024_40k_cityscapes/upernet_r18_512x1024_40k_cityscapes_20220615_113231-12ee861d.pt | upernet模型在开源社区上的upernet_r18_512x1024_40k_cityscapes_20220615_113231-12ee861d.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/upernet/README.md|MMseg-swin/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r50_512x1024_40k_cityscapes/upernet_r50_512x1024_40k_cityscapes_20200605_094827-aa54cb54.pt | 
upernet模型在开源社区上的upernet_r50_512x1024_40k_cityscapes_20200605_094827-aa54cb54.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/upernet/README.md|MMseg-swin/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r101_512x1024_40k_cityscapes/upernet_r101_512x1024_40k_cityscapes_20200605_094933-ebce3b10.pt | upernet模型在开源社区上的upernet_r101_512x1024_40k_cityscapes_20200605_094933-ebce3b10.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/upernet/README.md|MMseg-swin/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r50_769x769_40k_cityscapes/upernet_r50_769x769_40k_cityscapes_20200530_033048-92d21539.pt | upernet模型在开源社区上的upernet_r50_769x769_40k_cityscapes_20200530_033048-92d21539.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/upernet/README.md|MMseg-swin/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r101_769x769_40k_cityscapes/upernet_r101_769x769_40k_cityscapes_20200530_040819-83c95d01.pt | upernet模型在开源社区上的upernet_r101_769x769_40k_cityscapes_20200530_040819-83c95d01.pt的下载链接| -| 开发引入 | / |MMseg-swin/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r18_512x1024_80k_cityscapes/upernet_r18_512x1024_80k_cityscapes_20220614_110712-c89a9188.pt | upernet模型在开源社区上的upernet_r18_512x1024_80k_cityscapes_20220614_110712-c89a9188.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/upernet/README.md|MMseg-swin/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r50_512x1024_80k_cityscapes/upernet_r50_512x1024_80k_cityscapes_20200607_052207-848beca8.pt | upernet模型在开源社区上的upernet_r50_512x1024_80k_cityscapes_20200607_052207-848beca8.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/upernet/README.md|MMseg-swin/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r101_512x1024_80k_cityscapes/upernet_r101_512x1024_80k_cityscapes_20200607_002403-f05f2345.pt | upernet模型在开源社区上的upernet_r101_512x1024_80k_cityscapes_20200607_002403-f05f2345.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/upernet/README.md|MMseg-swin/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r50_769x769_80k_cityscapes/upernet_r50_769x769_80k_cityscapes_20200607_005107-82ae7d15.pt | upernet模型在开源社区上的upernet_r50_769x769_80k_cityscapes_20200607_005107-82ae7d15.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/upernet/README.md|MMseg-swin/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r101_769x769_80k_cityscapes/upernet_r101_769x769_80k_cityscapes_20200607_001014-082fc334.pt | upernet模型在开源社区上的upernet_r101_769x769_80k_cityscapes_20200607_001014-082fc334.pt的下载链接| -| 开发引入 | / |MMseg-swin/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r18_512x512_80k_ade20k/upernet_r18_512x512_80k_ade20k_20220614_110319-22e81719.pt | upernet模型在开源社区上的upernet_r18_512x512_80k_ade20k_20220614_110319-22e81719.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/upernet/README.md|MMseg-swin/configs/upernet/upernet.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r50_512x512_80k_ade20k/upernet_r50_512x512_80k_ade20k_20200614_144127-ecc8377b.pt | upernet模型在开源社区上的upernet_r50_512x512_80k_ade20k_20200614_144127-ecc8377b.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/upernet/README.md|MMseg-swin/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r101_512x512_80k_ade20k/upernet_r101_512x512_80k_ade20k_20200614_185117-32e4db94.pt | upernet模型在开源社区上的upernet_r101_512x512_80k_ade20k_20200614_185117-32e4db94.pt的下载链接| -| 开发引入 | / |MMseg-swin/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r18_512x512_160k_ade20k/upernet_r18_512x512_160k_ade20k_20220615_113300-791c3f3e.pt | upernet模型在开源社区上的upernet_r18_512x512_160k_ade20k_20220615_113300-791c3f3e.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/upernet/README.md|MMseg-swin/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r50_512x512_160k_ade20k/upernet_r50_512x512_160k_ade20k_20200615_184328-8534de8d.pt | upernet模型在开源社区上的upernet_r50_512x512_160k_ade20k_20200615_184328-8534de8d.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/upernet/README.md|MMseg-swin/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r101_512x512_160k_ade20k/upernet_r101_512x512_160k_ade20k_20200615_161951-91b32684.pt | upernet模型在开源社区上的upernet_r101_512x512_160k_ade20k_20200615_161951-91b32684.pt的下载链接| -| 开发引入 | / |MMseg-swin/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r18_512x512_20k_voc12aug/upernet_r18_512x512_20k_voc12aug_20220614_123910-ed66e455.pt | upernet模型在开源社区上的upernet_r18_512x512_20k_voc12aug_20220614_123910-ed66e455.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/upernet/README.md|MMseg-swin/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r50_512x512_20k_voc12aug/upernet_r50_512x512_20k_voc12aug_20200617_165330-5b5890a7.pt | upernet模型在开源社区上的upernet_r50_512x512_20k_voc12aug_20200617_165330-5b5890a7.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/upernet/README.md|MMseg-swin/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r101_512x512_20k_voc12aug/upernet_r101_512x512_20k_voc12aug_20200617_165629-f14e7f27.pt | upernet模型在开源社区上的upernet_r101_512x512_20k_voc12aug_20200617_165629-f14e7f27.pt的下载链接| -| 开发引入 | / |MMseg-swin/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r18_512x512_40k_voc12aug/upernet_r18_512x512_40k_voc12aug_20220614_153605-fafeb868.pt | upernet模型在开源社区上的upernet_r18_512x512_40k_voc12aug_20220614_153605-fafeb868.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/upernet/README.md|MMseg-swin/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r50_512x512_40k_voc12aug/upernet_r50_512x512_40k_voc12aug_20200613_162257-ca9bcc6b.pt | upernet模型在开源社区上的upernet_r50_512x512_40k_voc12aug_20200613_162257-ca9bcc6b.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/upernet/README.md|MMseg-swin/configs/upernet/upernet.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r101_512x512_40k_voc12aug/upernet_r101_512x512_40k_voc12aug_20200613_163549-e26476ac.pt | upernet模型在开源社区上的upernet_r101_512x512_40k_voc12aug_20200613_163549-e26476ac.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/vit/README.md|MMseg-swin/configs/vit/vit.yml | https://download.openmmlab.com/mmsegmentation/v0.5/vit/upernet_vit-b16_mln_512x512_80k_ade20k/upernet_vit-b16_mln_512x512_80k_ade20k_20210624_130547-0403cee1.pt | vit模型在开源社区上的upernet_vit-b16_mln_512x512_80k_ade20k_20210624_130547-0403cee1.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/vit/README.md|MMseg-swin/configs/vit/vit.yml | https://download.openmmlab.com/mmsegmentation/v0.5/vit/upernet_vit-b16_mln_512x512_160k_ade20k/upernet_vit-b16_mln_512x512_160k_ade20k_20210624_130547-852fa768.pt | vit模型在开源社区上的upernet_vit-b16_mln_512x512_160k_ade20k_20210624_130547-852fa768.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/vit/README.md|MMseg-swin/configs/vit/vit.yml | https://download.openmmlab.com/mmsegmentation/v0.5/vit/upernet_vit-b16_ln_mln_512x512_160k_ade20k/upernet_vit-b16_ln_mln_512x512_160k_ade20k_20210621_172828-f444c077.pt | vit模型在开源社区上的upernet_vit-b16_ln_mln_512x512_160k_ade20k_20210621_172828-f444c077.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/vit/README.md|MMseg-swin/configs/vit/vit.yml | https://download.openmmlab.com/mmsegmentation/v0.5/vit/upernet_deit-s16_512x512_80k_ade20k/upernet_deit-s16_512x512_80k_ade20k_20210624_095228-afc93ec2.pt | vit模型在开源社区上的upernet_deit-s16_512x512_80k_ade20k_20210624_095228-afc93ec2.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/vit/README.md|MMseg-swin/configs/vit/vit.yml | https://download.openmmlab.com/mmsegmentation/v0.5/vit/upernet_deit-s16_512x512_160k_ade20k/upernet_deit-s16_512x512_160k_ade20k_20210621_160903-5110d916.pt | vit模型在开源社区上的upernet_deit-s16_512x512_160k_ade20k_20210621_160903-5110d916.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/vit/README.md|MMseg-swin/configs/vit/vit.yml | https://download.openmmlab.com/mmsegmentation/v0.5/vit/upernet_deit-s16_mln_512x512_160k_ade20k/upernet_deit-s16_mln_512x512_160k_ade20k_20210621_161021-fb9a5dfb.pt | vit模型在开源社区上的upernet_deit-s16_mln_512x512_160k_ade20k_20210621_161021-fb9a5dfb.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/vit/README.md|MMseg-swin/configs/vit/vit.yml | https://download.openmmlab.com/mmsegmentation/v0.5/vit/upernet_deit-s16_ln_mln_512x512_160k_ade20k/upernet_deit-s16_ln_mln_512x512_160k_ade20k_20210621_161021-c0cd652f.pt | vit模型在开源社区上的upernet_deit-s16_ln_mln_512x512_160k_ade20k_20210621_161021-c0cd652f.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/vit/README.md|MMseg-swin/configs/vit/vit.yml | https://download.openmmlab.com/mmsegmentation/v0.5/vit/upernet_deit-b16_512x512_80k_ade20k/upernet_deit-b16_512x512_80k_ade20k_20210624_130529-1e090789.pt | vit模型在开源社区上的upernet_deit-b16_512x512_80k_ade20k_20210624_130529-1e090789.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/vit/README.md|MMseg-swin/configs/vit/vit.yml | https://download.openmmlab.com/mmsegmentation/v0.5/vit/upernet_deit-b16_512x512_160k_ade20k/upernet_deit-b16_512x512_160k_ade20k_20210621_180100-828705d7.pt 
| vit模型在开源社区上的upernet_deit-b16_512x512_160k_ade20k_20210621_180100-828705d7.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/vit/README.md|MMseg-swin/configs/vit/vit.yml | https://download.openmmlab.com/mmsegmentation/v0.5/vit/upernet_deit-b16_mln_512x512_160k_ade20k/upernet_deit-b16_mln_512x512_160k_ade20k_20210621_191949-4e1450f3.pt | vit模型在开源社区上的upernet_deit-b16_mln_512x512_160k_ade20k_20210621_191949-4e1450f3.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/configs/vit/README.md|MMseg-swin/configs/vit/vit.yml | https://download.openmmlab.com/mmsegmentation/v0.5/vit/upernet_deit-b16_ln_mln_512x512_160k_ade20k/upernet_deit-b16_ln_mln_512x512_160k_ade20k_20210623_153535-8a959c14.pt | vit模型在开源社区上的upernet_deit-b16_ln_mln_512x512_160k_ade20k_20210623_153535-8a959c14.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/docker/Dockerfile|MMseg-swin/docker/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64/3bf863cc.pub | Dockerfile在开源社区上的公钥下载配置| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/docker/Dockerfile|MMseg-swin/docker/Dockerfile | https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1804/x86_64/7fa2af80.pub | Dockerfile在开源社区上的公钥下载配置| -| 开发引入 | / |MMseg-swin/docker/Dockerfile | https://download.openmmlab.com/mmcv/dist/cu${CUDA//./}/torch${PYTORCH}/index.html | Dockerfile在开源社区上的torch下载配置| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/docker/Dockerfile|MMseg-swin/docker/Dockerfile | https://github.com/open-mmlab/mmsegmentation.git | Dockerfile在开源社区上的mmsegmentation.git下载配置| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/docker/serve/config.properties|MMseg-swin/docker/serve/config.properties | http://0.0.0.0:8080 | config.properties在开源社区上的inference_address配置| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/docker/serve/config.properties|MMseg-swin/docker/serve/config.properties | http://0.0.0.0:8081 | config.properties在开源社区上的management_address配置| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/docker/serve/config.properties|MMseg-swin/docker/serve/config.properties | http://0.0.0.0:8082 | config.properties在开源社区上的inference_address配置| -| 开发引入 | / |MMseg-swin/docker/serve/Dockerfile | https://download.openmmlab.com/mmcv/dist/cu${CUDA//./}/torch${PYTORCH}/index.html | Dockerfile在开源社区上的torch下载配置| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/docs/zh_cn/user_guides/5_deployment.md|MMseg-swin/docs/en/conf.py | https://mmsegmentation.readthedocs.io/en/latest | en_conf.py在开源社区上的url配置| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/docs/zh_cn/modelzoo_statistics.md|MMseg-swin/docs/en/conf.py | https://github.com/open-mmlab/mmsegmentation/blob/master | en_conf.py在开源社区上的url配置| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/docker/Dockerfile|MMseg-swin/docs/en/conf.py | https://github.com/open-mmlab/mmsegmentation | en_conf.py在开源社区上的url配置| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/tests/test_models/test_backbones/test_blocks.py|MMseg-swin/docs/en/conf.py | https://github.com/open-mmlab/mmcv | en_conf.py在开源社区上的url配置| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/docs/zh_cn/modelzoo_statistics.md|MMseg-swin/docs/en/stat.py | 
https://github.com/open-mmlab/mmsegmentation/blob/master | en_stat.py在开源社区上的url配置| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/docs/zh_cn/conf.py|MMseg-swin/docs/zh_cn/conf.py | https://mmsegmentation.readthedocs.io/zh-CN/latest | zh_cn_conf.py在开源社区上的url配置| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/docs/zh_cn/modelzoo_statistics.md|MMseg-swin/docs/zh_cn/conf.py | https://github.com/open-mmlab/mmsegmentation/blob/master | zh_cn_conf.py在开源社区上的url配置| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/docker/Dockerfile|MMseg-swin/docs/zh_cn/conf.py | https://github.com/open-mmlab/mmsegmentation | zh_cn_conf.py在开源社区上的url配置| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/tests/test_models/test_backbones/test_blocks.py|MMseg-swin/docs/zh_cn/conf.py | https://github.com/open-mmlab/mmcv | zh_cn_conf.py在开源社区上的url配置| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/docs/zh_cn/modelzoo_statistics.md|MMseg-swin/docs/zh_cn/stat.py | https://github.com/open-mmlab/mmsegmentation/blob/master | zh_cn_stat.py在开源社区上的url配置| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/tests/test_models/test_backbones/test_blocks.py|MMseg-swin/mmcv/CITATION.cff | https://github.com/open-mmlab/mmcv | mmcv_CITATION在开源社区的url声明| -| 开发引入 | / |MMseg-swin/mmcv/Jenkinsfile | https://mirrors.aliyun.com/pypi/simple | pypi镜像在开源社区上的url配置| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/LICENSE|MMseg-swin/mmcv/LICENSE | http://www.apache.org/licenses/ | mmcv_LICENSE在开源社区的url声明| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/LICENSE|MMseg-swin/mmcv/LICENSE | http://www.apache.org/licenses/LICENSE-2.0 | mmcv_LICENSE在开源社区的url声明| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/docs/zh_cn/user_guides/5_deployment.md|MMseg-swin/mmcv/setup.py | https://github.com/open-mmlab/mmdeploy | setuptools在开源社区的mmdeploy置选项| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/docs/zh_cn/user_guides/5_deployment.md|MMseg-swin/mmcv/setup.py | https://github.com/open-mmlab/mmdeploy | setuptools在开源社区的mmdeploy置选项| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/tation/tests/test_models/test_backbones/test_blocks.py|MMseg-swin/mmcv/setup.py | https://github.com/open-mmlab/mmcv | setuptools在开源社区的url配置选项| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/setup.py | MMseg-swin/mmcv/setup.py | openmmlab@gmail.com | setuptools在开源社区的author邮箱的配置选项| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg11_batch256_imagenet_20210208-4271cd6c.pth | "vgg11"模型在开源社区上的vgg11_batch256_imagenet_20210208-4271cd6c.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg13_batch256_imagenet_20210208-4d1d6080.pth | "vgg13"模型在开源社区上的vgg13_batch256_imagenet_20210208-4d1d6080.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg16_batch256_imagenet_20210208-db26f1a5.pth | "vgg16"模型在开源社区上的vgg16_batch256_imagenet_20210208-db26f1a5.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg19_batch256_imagenet_20210208-e6920e4a.pth | "vgg19"模型在开源社区上的vgg19_batch256_imagenet_20210208-e6920e4a.pth的下载链接| -| 开发引入 | / 
|MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg11_bn_batch256_imagenet_20210207-f244902c.pth | "vgg11_bn"模型在开源社区上的vgg11_bn_batch256_imagenet_20210207-f244902c.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg13_bn_batch256_imagenet_20210207-1a8b7864.pth | "vgg13_bn"模型在开源社区上的vgg13_bn_batch256_imagenet_20210207-1a8b7864.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg16_bn_batch256_imagenet_20210208-7e55cd29.pth | "vgg16_bn"模型在开源社区上的vgg16_bn_batch256_imagenet_20210208-7e55cd29.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg19_bn_batch256_imagenet_20210208-da620c4f.pth | "vgg19_bn"模型在开源社区上的vgg19_bn_batch256_imagenet_20210208-da620c4f.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet18_8xb32_in1k_20210831-fbbb1da6.pth | "resnet18"模型在开源社区上的resnet18_8xb32_in1k_20210831-fbbb1da6.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet34_8xb32_in1k_20210831-f257d4e6.pth | "resnet34"模型在开源社区上的resnet34_8xb32_in1k_20210831-f257d4e6.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet50_8xb32_in1k_20210831-ea4938fc.pth | "resnet50"模型在开源社区上的resnet50_8xb32_in1k_20210831-ea4938fc.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet101_8xb32_in1k_20210831-539c63f8.pth | "resnet101"模型在开源社区上的resnet101_8xb32_in1k_20210831-539c63f8.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet152_8xb32_in1k_20210901-4d7582fa.pth | "resnet152"模型在开源社区上的resnet152_8xb32_in1k_20210901-4d7582fa.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnetv1d50_b32x8_imagenet_20210531-db14775a.pth | "resnet50_v1d"模型在开源社区上的resnetv1d50_b32x8_imagenet_20210531-db14775a.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnetv1d101_b32x8_imagenet_20210531-6e13bcd3.pth | "resnet101_v1d"模型在开源社区上的resnetv1d101_b32x8_imagenet_20210531-6e13bcd3.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnetv1d152_b32x8_imagenet_20210531-278cf22a.pth | "resnet152_v1d"模型在开源社区上的resnetv1d152_b32x8_imagenet_20210531-278cf22a.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnext/resnext50_32x4d_b32x8_imagenet_20210429-56066e27.pth | "resnext50_32x4d"模型在开源社区上的resnext50_32x4d_b32x8_imagenet_20210429-56066e27.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnext/resnext101_32x4d_b32x8_imagenet_20210506-e0fa3dd5.pth | "resnext101_32x4d"模型在开源社区上的resnext101_32x4d_b32x8_imagenet_20210506-e0fa3dd5.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnext/resnext101_32x8d_b32x8_imagenet_20210506-23a247d5.pth | 
"resnext101_32x8d"模型在开源社区上的resnext101_32x8d_b32x8_imagenet_20210506-23a247d5.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnext/resnext152_32x4d_b32x8_imagenet_20210524-927787be.pth | "resnext152_32x4d"模型在开源社区上的resnext152_32x4d_b32x8_imagenet_20210524-927787be.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/se-resnet/se-resnet50_batch256_imagenet_20200804-ae206104.pth | "se-resnet50"模型在开源社区上的se-resnet50_batch256_imagenet_20200804-ae206104.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/se-resnet/se-resnet101_batch256_imagenet_20200804-ba5b51d4.pth | "se-resnet101"模型在开源社区上的se-resnet101_batch256_imagenet_20200804-ba5b51d4.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnest/resnest50_imagenet_converted-1ebf0afe.pth | "resnest50"模型在开源社区上的resnest50_imagenet_converted-1ebf0afe.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnest/resnest101_imagenet_converted-032caa52.pth | "resnest101"模型在开源社区上的resnest101_imagenet_converted-032caa52.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnest/resnest200_imagenet_converted-581a60f2.pth | "resnest200"模型在开源社区上的resnest200_imagenet_converted-581a60f2.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnest/resnest269_imagenet_converted-59930960.pth | "resnest269"模型在开源社区上的resnest269_imagenet_converted-59930960.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/shufflenet_v1/shufflenet_v1_batch1024_imagenet_20200804-5d6cec73.pth | "shufflenet_v1"模型在开源社区上的shufflenet_v1_batch1024_imagenet_20200804-5d6cec73.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/shufflenet_v2/shufflenet_v2_batch1024_imagenet_20200812-5bf4721e.pth | "shufflenet_v2"模型在开源社区上的shufflenet_v2_batch1024_imagenet_20200812-5bf4721e.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/mobilenet_v2/mobilenet_v2_batch256_imagenet_20200708-3b2dc3af.pth | "mobilenet_v2"模型在开源社区上的mobilenet_v2_batch256_imagenet_20200708-3b2dc3af.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/mobilenet_v3/convert/mobilenet_v3_small-8427ecf0.pth | "mobilenet_v3_small"模型在开源社区上的mobilenet_v3_small-8427ecf0.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/mobilenet_v3/convert/mobilenet_v3_large-3ea3c186.pth | "mobilenet_v3_large"模型在开源社区上的mobilenet_v3_large-3ea3c186.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-A0_3rdparty_4xb64-coslr-120e_in1k_20210909-883ab98c.pth | "repvgg_A0"模型在开源社区上的repvgg-A0_3rdparty_4xb64-coslr-120e_in1k_20210909-883ab98c.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-A1_3rdparty_4xb64-coslr-120e_in1k_20210909-24003a24.pth | 
"repvgg_A1"模型在开源社区上的repvgg-A1_3rdparty_4xb64-coslr-120e_in1k_20210909-24003a24.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-A2_3rdparty_4xb64-coslr-120e_in1k_20210909-97d7695a.pth | "repvgg_A2"模型在开源社区上的repvgg-A2_3rdparty_4xb64-coslr-120e_in1k_20210909-97d7695a.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-B0_3rdparty_4xb64-coslr-120e_in1k_20210909-446375f4.pth | "repvgg_B0"模型在开源社区上的repvgg-B0_3rdparty_4xb64-coslr-120e_in1k_20210909-446375f4.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-B1_3rdparty_4xb64-coslr-120e_in1k_20210909-750cdf67.pth | "repvgg_B1"模型在开源社区上的repvgg-B1_3rdparty_4xb64-coslr-120e_in1k_20210909-750cdf67.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-B1g2_3rdparty_4xb64-coslr-120e_in1k_20210909-344f6422.pth | "repvgg_B1g2"模型在开源社区上的repvgg-B1g2_3rdparty_4xb64-coslr-120e_in1k_20210909-344f6422.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-B1g4_3rdparty_4xb64-coslr-120e_in1k_20210909-d4c1a642.pth | "repvgg_B1g4"模型在开源社区上的repvgg-B1g4_3rdparty_4xb64-coslr-120e_in1k_20210909-d4c1a642.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-B2_3rdparty_4xb64-coslr-120e_in1k_20210909-bd6b937c.pth | "repvgg_B2"模型在开源社区上的repvgg-B2_3rdparty_4xb64-coslr-120e_in1k_20210909-bd6b937c.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-B2g4_3rdparty_4xb64-autoaug-lbs-mixup-coslr-200e_in1k_20210909-7b7955f0.pth | "repvgg_B2g4"模型在开源社区上的repvgg-B2g4_3rdparty_4xb64-autoaug-lbs-mixup-coslr-200e_in1k_20210909-7b7955f0.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-B3_3rdparty_4xb64-autoaug-lbs-mixup-coslr-200e_in1k_20210909-dda968bf.pth | "repvgg_B3"模型在开源社区上的repvgg-B3_3rdparty_4xb64-autoaug-lbs-mixup-coslr-200e_in1k_20210909-dda968bf.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-B3g4_3rdparty_4xb64-autoaug-lbs-mixup-coslr-200e_in1k_20210909-4e54846a.pth | "repvgg_B3g4"模型在开源社区上的repvgg-B3g4_3rdparty_4xb64-autoaug-lbs-mixup-coslr-200e_in1k_20210909-4e54846a.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-D2se_3rdparty_4xb64-autoaug-lbs-mixup-coslr-200e_in1k_20210909-cf3139b7.pth | "repvgg_D2se"模型在开源社区上的repvgg-D2se_3rdparty_4xb64-autoaug-lbs-mixup-coslr-200e_in1k_20210909-cf3139b7.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/res2net/res2net101-w26-s4_3rdparty_8xb32_in1k_20210927-870b6c36.pth | "res2net101_w26"模型在开源社区上的res2net101-w26-s4_3rdparty_8xb32_in1k_20210927-870b6c36.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/res2net/res2net50-w14-s8_3rdparty_8xb32_in1k_20210927-bc967bf1.pth | "res2net50_w14"模型在开源社区上的res2net50-w14-s8_3rdparty_8xb32_in1k_20210927-bc967bf1.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | 
https://download.openmmlab.com/mmclassification/v0/res2net/res2net50-w26-s8_3rdparty_8xb32_in1k_20210927-f547a94b.pth | "res2net50_w26"模型在开源社区上的res2net50-w26-s8_3rdparty_8xb32_in1k_20210927-f547a94b.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/swin-transformer/swin_tiny_224_b16x64_300e_imagenet_20210616_090925-66df6be6.pth | "swin_tiny"模型在开源社区上的swin_tiny_224_b16x64_300e_imagenet_20210616_090925-66df6be6.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/swin-transformer/swin_small_224_b16x64_300e_imagenet_20210615_110219-7f9d988b.pth | "swin_small"模型在开源社区上的swin_small_224_b16x64_300e_imagenet_20210615_110219-7f9d988b.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/swin-transformer/convert/swin_base_patch4_window7_224_22kto1k-f967f799.pth | "swin_base"模型在开源社区上的swin_base_patch4_window7_224_22kto1k-f967f799.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/swin-transformer/convert/swin_large_patch4_window7_224_22kto1k-5f0996db.pth | "swin_large"模型在开源社区上的swin_large_patch4_window7_224_22kto1k-5f0996db.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/t2t-vit/t2t-vit-t-14_3rdparty_8xb64_in1k_20210928-b7c09b62.pth | "t2t_vit_t_14"模型在开源社区上的t2t-vit-t-14_3rdparty_8xb64_in1k_20210928-b7c09b62.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/t2t-vit/t2t-vit-t-19_3rdparty_8xb64_in1k_20210928-7f1478d5.pth | "t2t_vit_t_19"模型在开源社区上的t2t-vit-t-19_3rdparty_8xb64_in1k_20210928-7f1478d5.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/t2t-vit/t2t-vit-t-24_3rdparty_8xb64_in1k_20210928-fe95a61b.pth | "t2t_vit_t_24"模型在开源社区上的t2t-vit-t-24_3rdparty_8xb64_in1k_20210928-fe95a61b.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/tnt/tnt-small-p16_3rdparty_in1k_20210903-c56ee7df.pth | "tnt_small"模型在开源社区上的tnt-small-p16_3rdparty_in1k_20210903-c56ee7df.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vit/finetune/vit-base-p16_in21k-pre-3rdparty_ft-64xb64_in1k-384_20210928-98e8652b.pth | "vit_base_p16"模型在开源社区上的vit-base-p16_in21k-pre-3rdparty_ft-64xb64_in1k-384_20210928-98e8652b.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vit/finetune/vit-base-p32_in21k-pre-3rdparty_ft-64xb64_in1k-384_20210928-9cea8599.pth | "vit_base_p32"模型在开源社区上的vit-base-p32_in21k-pre-3rdparty_ft-64xb64_in1k-384_20210928-9cea8599.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vit/finetune/vit-large-p16_in21k-pre-3rdparty_ft-64xb64_in1k-384_20210928-b20ba619.pth | "vit_large_p16"模型在开源社区上的vit-large-p16_in21k-pre-3rdparty_ft-64xb64_in1k-384_20210928-b20ba619.pt的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/vgg16_caffe-292e1171.pth | "vgg16_caffe"模型在开源社区上的vgg16_caffe-292e1171.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_caffe-788b5fa3.pth | 
"detectron/resnet50_caffe"模型在开源社区上的resnet50_caffe-788b5fa3.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_msra-5891d200.pth | "detectron2/resnet50_caffe"模型在开源社区上的resnet50_msra-5891d200.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_caffe-3ad79236.pth | "detectron/resnet101_caffe"模型在开源社区上的resnet101_caffe-3ad79236.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_msra-6cc46731.pth | "detectron2/resnet101_caffe"模型在开源社区上的resnet101_msra-6cc46731.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_32x8d-1516f1aa.pth | "detectron2/resnext101_32x8d"模型在开源社区上的resnext101_32x8d-1516f1aa.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext50-32x4d-0ab1a123.pth | "resnext50_32x4d"模型在开源社区上的resnext50-32x4d-0ab1a123.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_32x4d-a5af3160.pth | "resnext101_32x4d"模型在开源社区上的resnext101_32x4d-a5af3160.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_64x4d-ee2c6f71.pth | "resnext101_64x4d"模型在开源社区上的resnext101_64x4d-ee2c6f71.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_gn_thangvubk-ad1730dd.pth | "contrib/resnet50_gn"模型在开源社区上的resnet50_gn_thangvubk-ad1730dd.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_gn-9186a21c.pth | "detectron/resnet50_gn"模型在开源社区上的resnet50_gn-9186a21c.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_gn-cac0ab98.pth | "detectron/resnet101_gn"模型在开源社区上的resnet101_gn-cac0ab98.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_gn_ws-15beedd8.pth | "jhu/resnet50_gn_ws"模型在开源社区上的resnet50_gn_ws-15beedd8.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_gn_ws-3e3c308c.pth | "jhu/resnet101_gn_ws"模型在开源社区上的resnet101_gn_ws-3e3c308c.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext50_32x4d_gn_ws-0d87ac85.pth | "jhu/resnext50_32x4d_gn_ws"模型在开源社区上的resnext50_32x4d_gn_ws-0d87ac85.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_32x4d_gn_ws-34ac1a9e.pth | "jhu/resnext101_32x4d_gn_ws"模型在开源社区上的resnext101_32x4d_gn_ws-34ac1a9e.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext50_32x4d_gn-c7e8b754.pth | "jhu/resnext50_32x4d_gn"模型在开源社区上的resnext50_32x4d_gn-c7e8b754.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_32x4d_gn-ac3bb84e.pth | "jhu/resnext101_32x4d_gn"模型在开源社区上的resnext101_32x4d_gn-ac3bb84e.pth的下载链接| -| 开发引入 | / 
|MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w18_small-b5a04e21.pth | "msra/hrnetv2_w18_small"模型在开源社区上的hrnetv2_w18_small-b5a04e21.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w18-00eb2006.pth | "msra/hrnetv2_w18"模型在开源社区上的hrnetv2_w18-00eb2006.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w32-dc9eeb4f.pth | "msra/hrnetv2_w32"模型在开源社区上的hrnetv2_w32-dc9eeb4f.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w40-ed0b031c.pth | "msra/hrnetv2_w40"模型在开源社区上的hrnetv2_w40-ed0b031c.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w48-d2186c55.pth | "msra/hrnetv2_w48"模型在开源社区上的hrnetv2_w48-d2186c55.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/bn_inception_caffe-ed2e8665.pth | "bninception_caffe"模型在开源社区上的bn_inception_caffe-ed2e8665.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/i3d_r50_f32s2_k400-2c57e077.pth | "kin400/i3d_r50_f32s2_k400"模型在开源社区上的i3d_r50_f32s2_k400-2c57e077.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/nl3d_r50_f32s2_k400-fa7e7caa.pth | "kin400/nl3d_r50_f32s2_k400"模型在开源社区上的nl3d_r50_f32s2_k400-fa7e7caa.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/res2net101_v1d_26w_4s_mmdetv2-f0a600f9.pth | "res2net101_v1d_26w_4s"模型在开源社区上的res2net101_v1d_26w_4s_mmdetv2-f0a600f9.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_400mf-a5b10d96.pth | "regnetx_400mf"模型在开源社区上的regnetx_400mf-a5b10d96.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_800mf-1f4be4c7.pth | "regnetx_800mf"模型在开源社区上的regnetx_800mf-1f4be4c7.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_1.6gf-5791c176.pth | "regnetx_1.6gf"模型在开源社区上的regnetx_1.6gf-5791c176.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_3.2gf-c2599b0f.pth | "regnetx_3.2gf"模型在开源社区上的regnetx_3.2gf-c2599b0f.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_4.0gf-a88f671e.pth | "regnetx_4.0gf"模型在开源社区上的regnetx_4.0gf-a88f671e.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_6.4gf-006af45d.pth | "regnetx_6.4gf"模型在开源社区上的regnetx_6.4gf-006af45d.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_8.0gf-3c68abe7.pth | "regnetx_8.0gf"模型在开源社区上的regnetx_8.0gf-3c68abe7.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_12gf-4c2a3350.pth | "regnetx_12gf"模型在开源社区上的regnetx_12gf-4c2a3350.pth的下载链接| -| 开发引入 | / 
|MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet18_v1c-b5776b93.pth | "resnet18_v1c"模型在开源社区上的resnet18_v1c-b5776b93.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_v1c-2cccc1ad.pth | "resnet50_v1c"模型在开源社区上的resnet50_v1c-2cccc1ad.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_v1c-e67eebb6.pth | "resnet101_v1c"模型在开源社区上的resnet101_v1c-e67eebb6.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/mmediting/third_party/vgg_state_dict.pth | "mmedit/vgg16"模型在开源社区上的vgg_state_dict.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/mmediting/third_party/model_best_resnet34_En_nomixup.pth | "mmedit/res34_en_nomixup"模型在开源社区上的model_best_resnet34_En_nomixup.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/mmediting/third_party/mobilenet_v2.pth | "mmedit/mobilenet_v2"模型在开源社区上的mobilenet_v2.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/mobilenet_v3_large-bc2c3fd3.pth | "contrib/mobilenet_v3_large"模型在开源社区上的mobilenet_v3_large-bc2c3fd3.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/mobilenet_v3_small-47085aa1.pth | "contrib/mobilenet_v3_small"模型在开源社区上的mobilenet_v3_small-47085aa1.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnest50_d2-7497a55b.pth | "resnest50"模型在开源社区上的resnest50_d2-7497a55b.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnest101_d2-f3b931b2.pth | "resnest101"模型在开源社区上的resnest101_d2-f3b931b2.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnest200_d2-ca88e41f.pth | "resnest200"模型在开源社区上的resnest200_d2-ca88e41f.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/darknet53-a628ea1b.pth | "darknet53"模型在开源社区上的darknet53-a628ea1b.pth的下载链接| -| 开发引入 | / |MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/mmdetection/v2.0/third_party/mobilenet_v2_batch256_imagenet-ff34753d.pth | "mmdet/mobilenet_v2"模型在开源社区上的mobilenet_v2_batch256_imagenet-ff34753d.pt的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/.circleci/test.yml |MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/alexnet-owt-7be5be79.pth | "alexnet"模型在开源社区上的alexnet-owt-7be5be79.pth的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/.circleci/test.yml |MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/densenet121-a639ec97.pth | "densenet121"模型在开源社区上的densenet121-a639ec97.pth的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/.circleci/test.yml |MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/densenet169-b2777c0a.pth | "densenet169"模型在开源社区上的densenet169-b2777c0a.pth的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/.circleci/test.yml 
|MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/densenet201-c1103571.pth | "densenet201"模型在开源社区上的densenet201-c1103571.pth的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/.circleci/test.yml |MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/densenet161-8d451a50.pth | "densenet161"模型在开源社区上的densenet161-8d451a50.pth的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/.circleci/test.yml |MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/efficientnet_b0_rwightman-3dd342df.pth | "efficientnet_b0"模型在开源社区上的efficientnet_b0_rwightman-3dd342df.pth的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/.circleci/test.yml |MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/efficientnet_b1_rwightman-533bc792.pth | "efficientnet_b1"模型在开源社区上的efficientnet_b1_rwightman-533bc792.pth的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/.circleci/test.yml |MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/efficientnet_b2_rwightman-bcdf34b7.pth | "efficientnet_b2"模型在开源社区上的efficientnet_b2_rwightman-bcdf34b7.pth的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/.circleci/test.yml |MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/efficientnet_b3_rwightman-cf984f9c.pth | "efficientnet_b3"模型在开源社区上的efficientnet_b3_rwightman-cf984f9c.pth的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/.circleci/test.yml |MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/efficientnet_b4_rwightman-7eb33cd5.pth | "efficientnet_b4"模型在开源社区上的efficientnet_b4_rwightman-7eb33cd5.pth的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/.circleci/test.yml |MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/efficientnet_b5_lukemelas-b6417697.pth | "efficientnet_b5"模型在开源社区上的efficientnet_b5_lukemelas-b6417697.pth的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/.circleci/test.yml |MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/efficientnet_b6_lukemelas-c76e70fd.pth | "efficientnet_b6"模型在开源社区上的efficientnet_b6_lukemelas-c76e70fd.pth的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/.circleci/test.yml |MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/efficientnet_b7_lukemelas-dcc49843.pth | "efficientnet_b7"模型在开源社区上的efficientnet_b7_lukemelas-dcc49843.pth的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/.circleci/test.yml |MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/googlenet-1378be20.pth | "googlenet"模型在开源社区上的googlenet-1378be20.pth的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/.circleci/test.yml |MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/inception_v3_google-0cc3c7bd.pth | "inception_v3_google"模型在开源社区上的inception_v3_google-0cc3c7bd.pth的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/.circleci/test.yml |MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/mobilenet_v2-b0353104.pth | "mobilenet_v2"模型在开源社区上的mobilenet_v2-b0353104.pth的下载链接| -| 开源代码引入 
| https://github.com/open-mmlab/mmsegmentation/blob/main/.circleci/test.yml |MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/mobilenet_v3_large-8738ca79.pth | "mobilenet_v3_large"模型在开源社区上的mobilenet_v3_large-8738ca79.pth的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/.circleci/test.yml |MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/mobilenet_v3_small-047dcff4.pth | "mobilenet_v3_small"模型在开源社区上的mobilenet_v3_small-047dcff4.pth的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/.circleci/test.yml |MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/regnet_y_400mf-c65dace8.pth | "regnet_y_400mf"模型在开源社区上的regnet_y_400mf-c65dace8.pth的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/.circleci/test.yml |MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/regnet_y_800mf-1b27b58c.pth | "regnet_y_800mf"模型在开源社区上的regnet_y_800mf-1b27b58c.pth的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/.circleci/test.yml |MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/regnet_y_1_6gf-b11a554e.pth | "regnet_y_1_6gf"模型在开源社区上的regnet_y_1_6gf-b11a554e.pth的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/.circleci/test.yml |MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/regnet_y_3_2gf-b5a9779c.pth | "regnet_y_3_2gf"模型在开源社区上的regnet_y_3_2gf-b5a9779c.pth的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/.circleci/test.yml |MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/regnet_y_8gf-d0d0e4a8.pth | "regnet_y_8gf"模型在开源社区上的regnet_y_8gf-d0d0e4a8.pth的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/.circleci/test.yml |MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/regnet_y_16gf-9e6ed7dd.pth | "regnet_y_16gf"模型在开源社区上的regnet_y_16gf-9e6ed7dd.pth的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/.circleci/test.yml |MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/regnet_y_32gf-4dee3f7a.pth | "regnet_y_32gf"模型在开源社区上的regnet_y_32gf-4dee3f7a.pth的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/.circleci/test.yml |MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/regnet_x_400mf-adf1edd5.pth | "regnet_x_400mf"模型在开源社区上的regnet_x_400mf-adf1edd5.pth的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/.circleci/test.yml |MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/regnet_x_800mf-ad17e45c.pth | "regnet_x_800mf"模型在开源社区上的regnet_x_800mf-ad17e45c.pth的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/.circleci/test.yml |MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/regnet_x_1_6gf-e3633e7f.pth | "regnet_x_1_6gf"模型在开源社区上的regnet_x_1_6gf-e3633e7f.pth的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/.circleci/test.yml |MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/regnet_x_3_2gf-f342aeae.pth | "regnet_x_3_2gf"模型在开源社区上的regnet_x_3_2gf-f342aeae.pth的下载链接| -| 开源代码引入 | 
https://github.com/open-mmlab/mmsegmentation/blob/main/.circleci/test.yml |MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/regnet_x_8gf-03ceed89.pth | "regnet_x_8gf"模型在开源社区上的regnet_x_8gf-03ceed89.pth的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/.circleci/test.yml |MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/regnet_x_16gf-2007eb11.pth | "regnet_x_16gf"模型在开源社区上的regnet_x_16gf-2007eb11.pth的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/.circleci/test.yml |MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/regnet_x_32gf-9d47f8d0.pth | "regnet_x_32gf"模型在开源社区上的regnet_x_32gf-9d47f8d0.pth的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/.circleci/test.yml |MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/resnet18-f37072fd.pth | "resnet18"模型在开源社区上的resnet18-f37072fd.pth的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/.circleci/test.yml |MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/resnet34-b627a593.pth | "resnet34"模型在开源社区上的resnet34-b627a593.pth的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/.circleci/test.yml |MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/resnet50-0676ba61.pth | "resnet50"模型在开源社区上的resnet50-0676ba61.pth的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/.circleci/test.yml |MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/resnet101-63fe2227.pth | "resnet101"模型在开源社区上的resnet101-63fe2227.pth的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/.circleci/test.yml |MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/resnet152-394f9c45.pth | "resnet152"模型在开源社区上的resnet152-394f9c45.pth的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/.circleci/test.yml |MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | "resnext50_32x4d"模型在开源社区上的resnext50_32x4d-7cdf4587.pth的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/.circleci/test.yml |MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | "resnext101_32x8d"模型在开源社区上的resnext101_32x8d-8ba56ff5.pth的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/.circleci/test.yml |MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/wide_resnet50_2-95faca4d.pth | "wide_resnet50_2"模型在开源社区上的wide_resnet50_2-95faca4d.pth的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/.circleci/test.yml |MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | "wide_resnet101_2"模型在开源社区上的wide_resnet101_2-32ee1156.pth的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/.circleci/test.yml |MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/shufflenetv2_x0.5-f707e7126e.pth | "shufflenetv2_x0.5"模型在开源社区上的shufflenetv2_x0.5-f707e7126e.pth的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/.circleci/test.yml |MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | 
https://download.pytorch.org/models/shufflenetv2_x1-5666bf0f80.pth | "shufflenetv2_x1.0"模型在开源社区上的shufflenetv2_x1-5666bf0f80.pth的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/.circleci/test.yml |MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/squeezenet1_0-b66bff10.pth | "squeezenet1_0"模型在开源社区上的squeezenet1_0-b66bff10.pth的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/.circleci/test.yml |MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/squeezenet1_1-b8a52dc0.pth | "squeezenet1_1"模型在开源社区上的squeezenet1_1-b8a52dc0.pth的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/.circleci/test.yml |MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/vgg11-8a719046.pth | "vgg11"模型在开源社区上的vgg11-8a719046.pth的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/.circleci/test.yml |MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/vgg13-19584684.pth | "vgg13"模型在开源社区上的vgg13-19584684.pth的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/.circleci/test.yml |MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/vgg16-397923af.pth | "vgg16"模型在开源社区上的vgg16-397923af.pth的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/.circleci/test.yml |MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/vgg19-dcbb9e9d.pth | "vgg19"模型在开源社区上的vgg19-dcbb9e9d.pth的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/.circleci/test.yml |MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/vgg11_bn-6002323d.pth | "vgg11_bn"模型在开源社区上的vgg11_bn-6002323d.pth的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/.circleci/test.yml |MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/vgg13_bn-abd245e5.pth | "vgg13_bn"模型在开源社区上的vgg13_bn-abd245e5.pth的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/.circleci/test.yml |MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/vgg16_bn-6c64b313.pth | "vgg16_bn"模型在开源社区上的vgg16_bn-6c64b313.pth的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/.circleci/test.yml |MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/vgg19_bn-c79401a0.pth | "vgg19_bn"模型在开源社区上的vgg19_bn-c79401a0.pth的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/docs/zh_cn/user_guides/5_deployment.md|MMseg-swin/tools/deploy_test.py | https://github.com/open-mmlab/mmdeploy | MMDeploy在开源社区上的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/docs/zh_cn/user_guides/5_deployment.md|MMseg-swin/tools/onnx2tensorrt.py | https://github.com/open-mmlab/mmdeploy | MMDeploy在开源社区上的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/docs/zh_cn/user_guides/5_deployment.md|MMseg-swin/tools/pytorch2onnx.py | https://github.com/open-mmlab/mmdeploy | MMDeploy在开源社区上的下载链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/requirements/docs.txt|MMseg-swin/mmcv/requirements/docs.txt | https://github.com/open-mmlab/pytorch_sphinx_theme.git#egg=pytorch_sphinx_theme | pytorch_sphinx_theme在开源社区上的git链接| -| 开发引入 
| / |MMseg-swin/requirements/docs.txt | https://github.com/gaotongxiao/pytorch_sphinx_theme.git#egg=pytorch_sphinx_theme | pytorch_sphinx_theme在开源社区上的git链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/setup.py |MMseg-swin/setup.py | http://setuptools.readthedocs.io/en/latest/setuptools.html#declaring-platform-specific-dependencies | 相关依赖| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/utils/util_distribution.py |MMseg-swin/mmseg/utils/util_distribution.py | https://pytorch.org/docs/stable/generated/torch.nn.parallel.DistributedDataParallel.html | 相关说明| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/utils/set_env.py |MMseg-swin/mmseg/utils/set_env.py | https://github.com/pytorch/pytorch/blob/master/torch/distributed/run.py | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/utils/misc.py |MMseg-swin/mmseg/utils/misc.py | https://github.com/open-mmlab/mmdetection/blob/dev-v2.20.0/mmdet/utils/misc.py | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/utils/self_attention_block.py |MMseg-swin/mmseg/models/utils/self_attention_block.py | https://arxiv.org/abs/1706.03762 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/utils/make_divisible.py |MMseg-swin/mmseg/models/utils/make_divisible.py | https://github.com/tensorflow/models/blob/master/research/slim/nets/mobilenet/mobilenet.py | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/utils/make_divisible.py |MMseg-swin/mmseg/models/utils/make_divisible.py | https://pytorch.org/docs/stable/generated/torch.nn.Conv2d.html | 相关说明| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/segmentors/base.py |MMseg-swin/mmseg/models/segmentors/base.py | https://github.com/open-mmlab/mmdetection/issues/5844 | 相关说明| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/necks/mla_neck.py |MMseg-swin/mmseg/models/necks/mla_neck.py | https://arxiv.org/abs/2012.15840 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/necks/jpu.py |MMseg-swin/mmseg/models/necks/jpu.py | https://arxiv.org/abs/1903.11816 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/necks/ic_neck.py |MMseg-swin/mmseg/models/necks/ic_neck.py | https://arxiv.org/abs/1704.08545 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/necks/fpn.py |MMseg-swin/mmseg/models/necks/fpn.py | https://arxiv.org/abs/1612.03144 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/losses/lovasz_loss.py |MMseg-swin/mmseg/models/losses/lovasz_loss.py | https://github.com/bermanmaxim/LovaszSoftmax/blob/master/pytorch/lovasz_losses.py | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/losses/lovasz_loss.py |MMseg-swin/mmseg/models/losses/lovasz_loss.py | https://arxiv.org/abs/1705.08790 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/losses/focal_loss.py |MMseg-swin/mmseg/models/losses/focal_loss.py | https://github.com/open-mmlab/mmdetection | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/losses/focal_loss.py |MMseg-swin/mmseg/models/losses/focal_loss.py |https://arxiv.org/abs/1708.02002 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/losses/dice_loss.py 
|MMseg-swin/mmseg/models/losses/dice_loss.py | https://github.com/LikeLy-Journey/SegmenTron/blob/master/segmentron/solver/loss.py | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/losses/dice_loss.py |MMseg-swin/mmseg/models/losses/dice_loss.py | https://arxiv.org/abs/1606.04797 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/losses/cross_entropy_loss.py |MMseg-swin/mmseg/models/losses/cross_entropy_loss.py | https://github.com/pytorch/pytorch/blob/56b43f4fec1f76953f15a627694d4bba34588969/torch/nn/functional.py#L2660 | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/decode_heads/uper_head.py |MMseg-swin/mmseg/models/decode_heads/uper_head.py | https://arxiv.org/abs/1807.10221 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/decode_heads/stdc_head.py |MMseg-swin/mmseg/models/decode_heads/stdc_head.py | https://arxiv.org/abs/2104.13188 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/decode_heads/setr_up_head.py |MMseg-swin/mmseg/models/decode_heads/setr_up_head.py | https://arxiv.org/pdf/2012.15840.pdf | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/decode_heads/setr_mla_head.py |MMseg-swin/mmseg/models/decode_heads/setr_mla_head.py | https://arxiv.org/pdf/2012.15840.pdf | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/decode_heads/sep_fcn_head.py |MMseg-swin/mmseg/models/decode_heads/sep_fcn_head.py | https://arxiv.org/abs/1902.04502 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/decode_heads/sep_aspp_head.py |MMseg-swin/mmseg/models/decode_heads/sep_aspp_head.py | https://arxiv.org/abs/1802.02611 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/decode_heads/segmenter_mask_head.py |MMseg-swin/mmseg/models/decode_heads/segmenter_mask_head.py | https://arxiv.org/abs/2105.05633 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/decode_heads/segformer_head.py |MMseg-swin/mmseg/models/decode_heads/segformer_head.py | https://github.com/NVlabs/SegFormer/blob/master/mmseg/models/decode_heads/segformer_head.py | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/decode_heads/segformer_head.py |MMseg-swin/mmseg/models/decode_heads/segformer_head.py | https://arxiv.org/abs/2105.15203 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/decode_heads/psp_head.py |MMseg-swin/mmseg/models/decode_heads/psp_head.py | https://arxiv.org/abs/1612.01105 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/decode_heads/psa_head.py |MMseg-swin/mmseg/models/decode_heads/psa_head.py | https://hszhao.github.io/papers/eccv18_psanet.pdf | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/decode_heads/point_head.py |MMseg-swin/mmseg/models/decode_heads/point_head.py | https://github.com/facebookresearch/detectron2/tree/master/projects/PointRend/point_head/point_head.py | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/decode_heads/point_head.py |MMseg-swin/mmseg/models/decode_heads/point_head.py | https://arxiv.org/abs/1912.08193 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/decode_heads/ocr_head.py 
|MMseg-swin/mmseg/models/decode_heads/ocr_head.py | https://arxiv.org/abs/1909.11065 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/decode_heads/nl_head.py |MMseg-swin/mmseg/models/decode_heads/nl_head.py | https://arxiv.org/abs/1711.07971 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/decode_heads/lraspp_head.py |MMseg-swin/mmseg/models/decode_heads/lraspp_head.py | https://ieeexplore.ieee.org/document/9008835 | 相关说明| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/decode_heads/knet_head.py |MMseg-swin/mmseg/models/decode_heads/knet_head.py | https://arxiv.org/abs/2106.14855 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/decode_heads/isa_head.py |MMseg-swin/mmseg/models/decode_heads/isa_head.py | https://arxiv.org/abs/1907.12273 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/decode_heads/gc_head.py |MMseg-swin/mmseg/models/decode_heads/gc_head.py | https://arxiv.org/abs/1904.11492 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/decode_heads/fpn_head.py |MMseg-swin/mmseg/models/decode_heads/fpn_head.py | https://arxiv.org/abs/1901.02446 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/decode_heads/fcn_head.py |MMseg-swin/mmseg/models/decode_heads/fcn_head.py | https://arxiv.org/abs/1411.4038 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/decode_heads/enc_head.py |MMseg-swin/mmseg/models/decode_heads/enc_head.py | https://arxiv.org/abs/1803.08904 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/decode_heads/ema_head.py |MMseg-swin/mmseg/models/decode_heads/ema_head.py | https://arxiv.org/abs/1907.13426 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/decode_heads/dpt_head.py |MMseg-swin/mmseg/models/decode_heads/dpt_head.py | https://arxiv.org/abs/2103.13413 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/decode_heads/dnl_head.py |MMseg-swin/mmseg/models/decode_heads/dnl_head.py | https://arxiv.org/abs/2006.06668 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/decode_heads/da_head.py |MMseg-swin/mmseg/models/decode_heads/da_head.py | https://arxiv.org/abs/1809.02983 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/decode_heads/cc_head.py |MMseg-swin/mmseg/models/decode_heads/cc_head.py | https://arxiv.org/abs/1811.11721 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/decode_heads/aspp_head.py |MMseg-swin/mmseg/models/decode_heads/aspp_head.py | https://arxiv.org/abs/1706.05587 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/decode_heads/apc_head.py |MMseg-swin/mmseg/models/decode_heads/apc_head.py | https://openaccess.thecvf.com/content_CVPR_2019/papers/ | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/decode_heads/ann_head.py |MMseg-swin/mmseg/models/decode_heads/ann_head.py | https://arxiv.org/abs/1908.07678 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/backbones/vit.py |MMseg-swin/mmseg/models/backbones/vit.py | https://arxiv.org/abs/2010.11929 | 论文地址| -| 开源代码引入 | 
https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/backbones/vit.py |MMseg-swin/mmseg/models/backbones/vit.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/vision_transformer.py#L353 | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/backbones/unet.py |MMseg-swin/mmseg/models/backbones/unet.py | https://arxiv.org/abs/1505.04597 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/backbones/twins.py |MMseg-swin/mmseg/models/backbones/twins.py | https://arxiv.org/abs/2102.10882 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/backbones/twins.py |MMseg-swin/mmseg/models/backbones/twins.py | https://arxiv.org/abs/1512.03385 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/backbones/twins.py |MMseg-swin/mmseg/models/backbones/twins.py | https://arxiv.org/abs/1512.03385 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/backbones/timm_backbone.py |MMseg-swin/mmseg/models/backbones/timm_backbone.py | https://github.com/rwightman/pytorch-image-models | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/backbones/swin.py |MMseg-swin/mmseg/models/backbones/swin.py | https://arxiv.org/abs/2103.14030 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/backbones/swin.py |MMseg-swin/mmseg/models/backbones/swin.py | https://github.com/microsoft/Swin-Transformer | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/backbones/stdc.py |MMseg-swin/mmseg/models/backbones/stdc.py | https://github.com/MichaelFan01/STDC-Seg | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/backbones/stdc.py |MMseg-swin/mmseg/models/backbones/stdc.py | https://arxiv.org/abs/2104.13188 | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/backbones/resnext.py |MMseg-swin/mmseg/models/backbones/resnext.py | https://arxiv.org/abs/1611.05431 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/backbones/resnet.py |MMseg-swin/mmseg/models/backbones/resnet.py | https://arxiv.org/abs/1512.03385 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/backbones/resnet.py |MMseg-swin/mmseg/models/backbones/resnet.py | https://arxiv.org/abs/1812.01187 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/backbones/resnest.py |MMseg-swin/mmseg/models/backbones/resnest.py | https://arxiv.org/abs/2004.08955 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/backbones/mobilenet_v3.py |MMseg-swin/mmseg/models/backbones/mobilenet_v3.py | https://ieeexplore.ieee.org/document/9008835 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/backbones/mobilenet_v2.py |MMseg-swin/mmseg/models/backbones/mobilenet_v2.py | https://arxiv.org/abs/1801.04381 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/backbones/mit.py |MMseg-swin/mmseg/models/backbones/mit.py | https://github.com/open-mmlab/mmcv/pull/1418 | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/backbones/mit.py |MMseg-swin/mmseg/models/backbones/mit.py | https://github.com/pytorch/pytorch/issues/37583 | 相关说明| -| 开源代码引入 | 
https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/backbones/mit.py |MMseg-swin/mmseg/models/backbones/mit.py | https://arxiv.org/abs/2105.15203 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/backbones/mae.py |MMseg-swin/mmseg/models/backbones/mae.py | https://github.com/microsoft/unilm/blob/master/beit/modeling_pretrain.py | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/backbones/mae.py |MMseg-swin/mmseg/models/backbones/mae.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/vision_transformer.py#L353 | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/backbones/mae.py |MMseg-swin/mmseg/models/backbones/mae.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/vision_transformer.py#L353 | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/backbones/icnet.py |MMseg-swin/mmseg/models/backbones/icnet.py | https://arxiv.org/abs/1704.08545 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/backbones/hrnet.py |MMseg-swin/mmseg/models/backbones/hrnet.py | https://arxiv.org/abs/1904.04514 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/backbones/fast_scnn.py |MMseg-swin/mmseg/models/backbones/fast_scnn.py | https://arxiv.org/abs/1902.04502 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/backbones/erfnet.py |MMseg-swin/mmseg/models/backbones/erfnet.py | https://ieeexplore.ieee.org/document/8063438 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/backbones/cgnet.py |MMseg-swin/mmseg/models/backbones/cgnet.py | https://arxiv.org/abs/1811.08201 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/backbones/bisenetv2.py |MMseg-swin/mmseg/models/backbones/bisenetv2.py | https://arxiv.org/abs/2004.02147 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/backbones/bisenetv1.py |MMseg-swin/mmseg/models/backbones/bisenetv1.py | https://arxiv.org/abs/1808.00897 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/backbones/beit.py |MMseg-swin/mmseg/models/backbones/beit.py | https://github.com/microsoft/unilm/blob/master/beit/semantic_segmentation/mmcv_custom/checkpoint.py | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/models/backbones/beit.py |MMseg-swin/mmseg/models/backbones/beit.py | https://github.com/rwightman/pytorch-image-models/blob/master/timm/models/vision_transformer.py#L353 | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/datasets/pipelines/transforms.py |MMseg-swin/mmseg/datasets/pipelines/transforms.py | https://arxiv.org/abs/1708.04552 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/datasets/pipelines/loading.py |MMseg-swin/mmseg/datasets/pipelines/loading.py | https://github.com/open-mmlab/mmsegmentation/pull/1445/ | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/datasets/custom.py |MMseg-swin/mmseg/datasets/custom.py | https://github.com/open-mmlab/mmsegmentation/issues/1415 | 相关说明| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/datasets/custom.py |MMseg-swin/mmseg/datasets/custom.py | https://github.com/open-mmlab/mmdetection/issues/5844 | 相关说明| -| 开源代码引入 | 
https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/datasets/builder.py |MMseg-swin/mmseg/datasets/builder.py | https://github.com/pytorch/pytorch/issues/973 | 相关说明| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/core/seg/sampler/ohem_pixel_sampler.py |MMseg-swin/mmseg/core/seg/sampler/ohem_pixel_sampler.py | https://github.com/pytorch/pytorch/issues/22812 | 相关说明| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/core/hook/wandblogger_hook.py |MMseg-swin/mmseg/core/hook/wandblogger_hook.py | https://docs.wandb.ai/guides/artifacts/model-versioning | 相关说明| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/core/hook/wandblogger_hook.py |MMseg-swin/mmseg/core/hook/wandblogger_hook.py | https://docs.wandb.ai/guides/data-vis | 相关说明| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/core/hook/wandblogger_hook.py |MMseg-swin/mmseg/core/hook/wandblogger_hook.py | https://docs.wandb.ai/ref/python/init | 相关说明| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/apis/train.py |MMseg-swin/mmseg/apis/train.py | https://github.com/open-mmlab/mmdetection/issues/6339 | 相关说明| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmseg/apis/train.py |MMseg-swin/mmseg/apis/train.py | https://github.com/open-mmlab/mmcv/pull/1193 | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/setup.py |MMseg-swin/mmcv/setup.py | http://setuptools.readthedocs.io/en/latest/setuptools.html#declaring-platform-specific-dependencies | 相关依赖| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/setup.py |MMseg-swin/mmcv/setup.py | https://github.com/pytorch/pytorch/pull/45956 | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/setup.py |MMseg-swin/mmcv/setup.py | https://github.com/open-mmlab/mmcv/pull/1463 | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/video/optflow.py |MMseg-swin/mmcv/mmcv/video/optflow.py | https://github.com/princeton-vl/RAFT/blob/224320502d66c356d88e6c712f38129e60661e80/core/utils/frame_utils.py#L102 | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/utils/trace.py |MMseg-swin/mmcv/mmcv/utils/trace.py | https://github.com/pytorch/pytorch/issues/42448 | 相关说明| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/utils/registry.py |MMseg-swin/mmcv/mmcv/utils/registry.py | https://mmcv.readthedocs.io/en/latest/understand_mmcv/registry.html | 相关说明| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/utils/hub.py |MMseg-swin/mmcv/mmcv/utils/hub.py | https://github.com/open-mmlab/mmpose/issues/904 | 相关说明| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/utils/hub.py |MMseg-swin/mmcv/mmcv/utils/hub.py | https://github.com/pytorch/pytorch/blob/master/torch/hub.py | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/utils/hub.py |MMseg-swin/mmcv/mmcv/utils/hub.py | https://s3.amazonaws.com/pytorch/models/resnet18-5c106 | 预训练模型| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/runner/iter_based_runner.py |MMseg-swin/mmcv/mmcv/runner/iter_based_runner.py | https://github.com/open-mmlab/mmcv/pull/1108 | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/runner/hooks/profiler.py |MMseg-swin/mmcv/mmcv/runner/hooks/profiler.py | 
https://pytorch.org/docs/1.8.1/profiler.html#torch.profiler.profile | 相关说明| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/runner/hooks/optimizer.py|MMseg-swin/mmcv/mmcv/runner/hooks/optimizer.py | https://pytorch.org/docs/stable/amp.html#torch.cuda.amp.GradScaler | 相关说明| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/runner/hooks/optimizer.py|MMseg-swin/mmcv/mmcv/runner/hooks/optimizer.py | https://arxiv.org/abs/1710.03740 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/runner/hooks/momentum_updater.py|MMseg-swin/mmcv/mmcv/runner/hooks/momentum_updater.py | https://arxiv.org/pdf/1708.07120.pdf | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/runner/hooks/lr_updater.py|MMseg-swin/mmcv/mmcv/runner/hooks/lr_updater.py | https://github.com/fastai/fastai/blob/master/fastai/callback/schedule.py#L128 | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/runner/hooks/lr_updater.py|MMseg-swin/mmcv/mmcv/runner/hooks/lr_updater.py | https://arxiv.org/pdf/1506.01186.pdf | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/runner/hooks/momentum_updater.py|MMseg-swin/mmcv/mmcv/runner/hooks/lr_updater.py | https://arxiv.org/pdf/1708.07120.pdf | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/runner/hooks/logger/wandb.py|MMseg-swin/mmcv/mmcv/runner/hooks/logger/wandb.py | https://docs.wandb.ai/ref/python/init | 相关说明| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/runner/hooks/logger/wandb.py|MMseg-swin/mmcv/mmcv/runner/hooks/logger/wandb.py | https://docs.wandb.ai | 相关说明| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/runner/hooks/logger/segmind.py|MMseg-swin/mmcv/mmcv/runner/hooks/logger/segmind.py | https://docs.segmind.com/python-library | 相关依赖| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/runner/hooks/logger/neptune.py|MMseg-swin/mmcv/mmcv/runner/hooks/logger/neptune.py | https://docs.neptune.ai/api-reference/neptune#init | 相关说明| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/runner/hooks/logger/neptune.py|MMseg-swin/mmcv/mmcv/runner/hooks/logger/neptune.py | https://docs.neptune.ai | 相关说明| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/runner/hooks/logger/mlflow.py|MMseg-swin/mmcv/mmcv/runner/hooks/logger/mlflow.py | https://www.mlflow.org/docs/latest/index.html | 相关说明| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/runner/hooks/logger/dvclive.py|MMseg-swin/mmcv/mmcv/runner/hooks/logger/dvclive.py | https://dvc.org/doc/dvclive | 相关说明| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/runner/hooks/logger/dvclive.py|MMseg-swin/mmcv/mmcv/runner/hooks/logger/dvclive.py | https://dvc.org/doc/dvclive/api-reference/live#parameters | 相关说明| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/runner/hooks/logger/clearml.py|MMseg-swin/mmcv/mmcv/runner/hooks/logger/clearml.py | https://clear.ml/docs/latest/docs/ | 相关说明| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/runner/hooks/logger/clearml.py|MMseg-swin/mmcv/mmcv/runner/hooks/logger/clearml.py | https://clear.ml/docs/latest/docs/references/sdk/task/#taskinit | 相关说明| -| 开源代码引入 | 
https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/runner/hooks/evaluation.py|MMseg-swin/mmcv/mmcv/runner/hooks/evaluation.py | https://github.com/open-mmlab/mmsegmentation/issues/694 | 相关说明| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/runner/hooks/evaluation.py|MMseg-swin/mmcv/mmcv/runner/hooks/evaluation.py | https://github.com/open-mmlab/mmdetection/issues/6265 | 相关说明| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/runner/fp16_utils.py|MMseg-swin/mmcv/mmcv/runner/fp16_utils.py | https://github.com/NVIDIA/apex/blob/master/apex/fp16_utils/loss_scaler.py | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/runner/hooks/epoch_based_runner.py|MMseg-swin/mmcv/mmcv/runner/hooks/epoch_based_runner.py | https://github.com/open-mmlab/mmcv/pull/1108 | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/runner/hooks/dist_utils.py|MMseg-swin/mmcv/mmcv/runner/hooks/dist_utils.py | https://github.com/facebookresearch/detectron2/blob/main/detectron2/engine/launch.py | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/parallel/distributed.py|MMseg-swin/mmcv/mmcv/parallel/distributed.py | https://github.com/open-mmlab/mmsegmentation/issues/1742 | 相关说明| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/voxelize.py|MMseg-swin/mmcv/mmcv/ops/voxelize.py | https://github.com/open-mmlab/mmdetection3d/issues/894 | 相关说明| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/voxelize.py|MMseg-swin/mmcv/mmcv/ops/voxelize.py | https://github.com/open-mmlab/mmdetection3d/pull/904 | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/voxelize.py|MMseg-swin/mmcv/mmcv/ops/voxelize.py | https://arxiv.org/abs/1907.03739 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/upfirdn2d.py|MMseg-swin/mmcv/mmcv/ops/upfirdn2d.py | https://github.com/rosinality/stylegan2-pytorch/blob/master/op/upfirdn2d.py | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/upfirdn2d.py|MMseg-swin/mmcv/mmcv/ops/upfirdn2d.py | https://www.mathworks.com/help/signal/ref/upfirdn.html | 相关说明| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/tin_shift.py|MMseg-swin/mmcv/mmcv/ops/tin_shift.py | https://github.com/deepcs233/TIN/blob/master/cuda_shift/rtc_wrap.py | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/tin_shift.py|MMseg-swin/mmcv/mmcv/ops/tin_shift.py | shaoh19@mails.tsinghua.edu.cn | 作者邮箱| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/tin_shift.py|MMseg-swin/mmcv/mmcv/ops/tin_shift.py | sjqian@cse.cuhk.edu.hk | 作者邮箱| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/tin_shift.py|MMseg-swin/mmcv/mmcv/ops/tin_shift.py | yuliu@ee.cuhk.edu.hk | 作者邮箱| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/tin_shift.py|MMseg-swin/mmcv/mmcv/ops/tin_shift.py | https://arxiv.org/abs/2001.06499 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/tin_shift.py|MMseg-swin/mmcv/mmcv/ops/tin_shift.py | https://github.com/mit-han-lab/temporal-shift-module | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/three_nn.py|MMseg-swin/mmcv/mmcv/ops/three_nn.py | https://arxiv.org/abs/1706.02413 | 
论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/three_interpolate.py|MMseg-swin/mmcv/mmcv/ops/three_interpolate.py | https://arxiv.org/abs/1706.02413 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/sparse_functional.py|MMseg-swin/mmcv/mmcv/ops/sparse_functional.py | https://www.mdpi.com/1424-8220/18/10/3337 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/saconv.py|MMseg-swin/mmcv/mmcv/ops/saconv.py | https://arxiv.org/abs/2006.02334 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/rotated_feature_align.py|MMseg-swin/mmcv/mmcv/ops/rotated_feature_align.py | https://arxiv.org/abs/1908.05612 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/roipoint_pool3d.py|MMseg-swin/mmcv/mmcv/ops/roipoint_pool3d.py | https://arxiv.org/pdf/1907.03670.pdf | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/roiaware_pool3d.py|MMseg-swin/mmcv/mmcv/ops/roiaware_pool3d.py | https://arxiv.org/pdf/1907.03670.pdf | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/riroi_align_rotated.py|MMseg-swin/mmcv/mmcv/ops/riroi_align_rotated.py | https://arxiv.org/abs/2103.07733 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/psa_mask.py|MMseg-swin/mmcv/mmcv/ops/psa_mask.py | https://github.com/hszhao/semseg/blob/master/lib/psa | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/points_in_boxes.py|MMseg-swin/mmcv/mmcv/ops/points_in_boxes.py | https://github.com/open-mmlab/mmdetection3d/issues/305 | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/point_sample.py|MMseg-swin/mmcv/mmcv/ops/point_sample.py | https://github.com/facebookresearch/detectron2/tree/master/projects/PointRend | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/nms.py|MMseg-swin/mmcv/mmcv/ops/nms.py | https://github.com/pytorch/vision/ | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/nms.py|MMseg-swin/mmcv/mmcv/ops/nms.py | https://github.com/pytorch/vision/blob/505cd6957711af790211896d32b40291bea1bc21/torchvision/ops/boxes.py | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/multi_scale_deform_attn.py|MMseg-swin/mmcv/mmcv/ops/multi_scale_deform_attn.py | https://arxiv.org/pdf/2010.04159.pdf | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/knn.py|MMseg-swin/mmcv/mmcv/ops/knn.py | https://github.com/CVMI-Lab/PAConv/tree/main/scene_seg/lib/pointops/src/knnquery_heap | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/fused_bias_leakyrelu.py|MMseg-swin/mmcv/mmcv/ops/fused_bias_leakyrelu.py | https://github.com/rosinality/stylegan2-pytorch/blob/master/op/fused_act.py | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/fused_bias_leakyrelu.py|MMseg-swin/mmcv/mmcv/ops/fused_bias_leakyrelu.py | http://arxiv.org/abs/1912.04958 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/diff_iou_rotated.py|MMseg-swin/mmcv/mmcv/ops/diff_iou_rotated.py | https://github.com/lilanxiao/Rotated_IoU/blob/master/box_intersection_2d.py | 源码实现| -| 开源代码引入 | 
https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/diff_iou_rotated.py|MMseg-swin/mmcv/mmcv/ops/diff_iou_rotated.py | https://github.com/lilanxiao/Rotated_IoU/blob/master/oriented_iou_loss.py | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/diff_iou_rotated.py|MMseg-swin/mmcv/mmcv/ops/diff_iou_rotated.py | https://en.wikipedia.org/wiki/Line%E2%80%93line_intersection | 相关参考| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/deform_conv.py|MMseg-swin/mmcv/mmcv/ops/deform_conv.py | https://arxiv.org/pdf/1703.06211.pdf | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/deform_conv.py|MMseg-swin/mmcv/mmcv/ops/deform_conv.py | https://github.com/open-mmlab/mmcv/issues/1440 | 相关说明| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/tensorrt/trt_serialize.hpp|MMseg-swin/mmcv/mmcv/ops/csrc/tensorrt/trt_serialize.hpp | https://github.com/NVIDIA/TensorRT/blob/master/plugin/common/serialize.hpp | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/tensorrt/trt_instance_norm.hpp|MMseg-swin/mmcv/mmcv/ops/csrc/tensorrt/trt_instance_norm.hpp | https://github.com/NVIDIA/TensorRT/blob/master/plugin/instanceNormalizationPlugin/instanceNormalizationPlugin.h | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/tensorrt/plugins/trt_instance_norm.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/tensorrt/plugins/trt_instance_norm.cpp | https://github.com/NVIDIA/TensorRT/blob/master/plugin/instanceNormalizationPlugin/instanceNormalizationPlugin.cpp | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/tensorrt/plugins/trt_grid_sampler_kernel.cu|MMseg-swin/mmcv/mmcv/ops/csrc/tensorrt/plugins/trt_grid_sampler_kernel.cu | https://github.com/pytorch/pytorch/blob/ec683299ebabf297a3504c76248d37be830e4342/aten/src/ATen/native/cuda/GridSampler.cuh | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/tensorrt/plugins/trt_grid_sampler_kernel.cu|MMseg-swin/mmcv/mmcv/ops/csrc/tensorrt/plugins/trt_grid_sampler_kernel.cu | https://github.com/pytorch/pytorch/blob/ec683299ebabf297a3504c76248d37be830e4342/aten/src/ATen/native/cuda/GridSampler.cu | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/pytorch/upfirdn2d.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/pytorch/upfirdn2d.cpp | https://github.com/rosinality/stylegan2-pytorch/blob/master/op/upfirdn2d.cpp | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/pytorch/three_nn.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/pytorch/three_nn.cpp | https://github.com/sshaoshuai/Pointnet2.PyTorch/tree/master/pointnet2/src/interpolate.cpp | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/pytorch/three_interpolate.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/pytorch/three_interpolate.cpp | https://github.com/sshaoshuai/Pointnet2.PyTorch/tree/master/pointnet2/src/interpolate.cpp | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/pytorch/rotated_feature_align.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/pytorch/rotated_feature_align.cpp | https://github.com/SJTU-Thinklab-Det/r3det-on-mmdetection/blob/master/mmdet/ops/fr/src/feature_refine_cuda.cpp | 源码实现| -| 开源代码引入 | 
https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/pytorch/roipoint_pool3d.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/pytorch/roipoint_pool3d.cpp | https://github.com/open-mmlab/OpenPCDet/blob/master/pcdet/ops/roipoint_pool3d/src/roipoint_pool3d.cpp | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/pytorch/psamask.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/pytorch/psamask.cpp | https://github.com/hszhao/semseg/blob/master/lib/psa/src | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/pytorch/pixel_group.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/pytorch/pixel_group.cpp | https://github.com/WenmuZhou/PAN.pytorch | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/pytorch/nms_rotated.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/pytorch/nms_rotated.cpp | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/nms_rotated/nms_rotated.h | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/pytorch/ms_deform_attn.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/pytorch/ms_deform_attn.cpp | https://github.com/chengdazhi/Deformable-Convolution-V2-PyTorch/tree/pytorch_1.0.0 | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/pytorch/knn.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/pytorch/knn.cpp | https://github.com/CVMI-Lab/PAConv/tree/main/scene_seg/lib/pointops/src/knnquery_heap | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/pytorch/iou3d.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/pytorch/iou3d.cpp | https://github.com/open-mmlab/OpenPCDet/blob/master/pcdet/ops/iou3d_nms/src/iou3d_nms.cpp | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/pytorch/info.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/pytorch/info.cpp | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/vision.cpp | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/pytorch/info.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/pytorch/info.cpp | https://github.com/pytorch/pytorch/blob/master/aten/src/ATen/cuda/detail/CUDAHooks.cpp#L231 | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/pytorch/info.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/pytorch/info.cpp | https://github.com/pytorch/pytorch/blob/master/aten/src/ATen/Version.cpp | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/pytorch/group_points.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/pytorch/group_points.cpp | https://github.com/sshaoshuai/Pointnet2.PyTorch/tree/master/pointnet2/src/group_points.cpp | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/pytorch/fused_bias_leakyrelu.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/pytorch/fused_bias_leakyrelu.cpp | https://github.com/rosinality/stylegan2-pytorch/blob/master/op/fused_bias_act.cpp | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/pytorch/furthest_point_sample.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/pytorch/furthest_point_sample.cpp | https://github.com/sshaoshuai/Pointnet2.PyTorch/tree/master/pointnet2/src/sampling.cpp | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/pytorch/cuda/upfirdn2d_kernel.cu|MMseg-swin/mmcv/mmcv/ops/csrc/pytorch/cuda/upfirdn2d_kernel.cu | 
https://github.com/rosinality/stylegan2-pytorch/blob/master/op/upfirdn2d_kernel.cu | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/pytorch/cuda/three_nn_cuda.cu|MMseg-swin/mmcv/mmcv/ops/csrc/pytorch/cuda/three_nn_cuda.cu | https://github.com/sshaoshuai/Pointnet2.PyTorch/tree/master/pointnet2/src/interpolate_gpu.cu | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/pytorch/cuda/three_interpolate_cuda.cu|MMseg-swin/mmcv/mmcv/ops/csrc/pytorch/cuda/three_interpolate_cuda.cu | https://github.com/sshaoshuai/Pointnet2.PyTorch/tree/master/pointnet2/src/interpolate_gpu.cu | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/pytorch/cuda/rotated_feature_align_cuda.cu|MMseg-swin/mmcv/mmcv/ops/csrc/pytorch/cuda/rotated_feature_align_cuda.cu | https://github.com/SJTU-Thinklab-Det/r3det-on-mmdetection/blob/master/mmdet/ops/fr/src/feature_refine_kernel.cu | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/pytorch/cuda/roipoint_pool3d_cuda.cu|MMseg-swin/mmcv/mmcv/ops/csrc/pytorch/cuda/roipoint_pool3d_cuda.cu | https://github.com/open-mmlab/OpenPCDet/blob/master/pcdet/ops/roipoint_pool3d/src/roipoint_pool3d_kernel.cu | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/pytorch/cuda/roiaware_pool3d_cuda.cu|MMseg-swin/mmcv/mmcv/ops/csrc/pytorch/cuda/roiaware_pool3d_cuda.cu | https://github.com/sshaoshuai/PCDet/blob/master/pcdet/ops/roiaware_pool3d/src/roiaware_pool3d_kernel.cu | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/pytorch/cuda/psamask_cuda.cu|MMseg-swin/mmcv/mmcv/ops/csrc/pytorch/cuda/psamask_cuda.cu | https://github.com/hszhao/semseg/blob/master/lib/psa/src | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/pytorch/cuda/points_in_polygons_cuda.cu|MMseg-swin/mmcv/mmcv/ops/csrc/pytorch/cuda/points_in_polygons_cuda.cu | https://github.com/ming71/CUDA/blob/master/point_justify/points_justify_kernel.cu | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/pytorch/cuda/points_in_boxes_cuda.cu|MMseg-swin/mmcv/mmcv/ops/csrc/pytorch/cuda/points_in_boxes_cuda.cu | https://github.com/sshaoshuai/PCDet/blob/master/pcdet/ops/roiaware_pool3d/src/roiaware_pool3d_kernel.cu | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/pytorch/cuda/nms_rotated_cuda.cu|MMseg-swin/mmcv/mmcv/ops/csrc/pytorch/cuda/nms_rotated_cuda.cu | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/nms_rotated/nms_rotated_cuda.cu | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/pytorch/cuda/ms_deform_attn_cuda.cu|MMseg-swin/mmcv/mmcv/ops/csrc/pytorch/cuda/ms_deform_attn_cuda.cu | https://github.com/chengdazhi/Deformable-Convolution-V2-PyTorch/tree/pytorch_1.0.0 | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/pytorch/cuda/min_area_polygons.cu|MMseg-swin/mmcv/mmcv/ops/csrc/pytorch/cuda/min_area_polygons.cu | https://github.com/SDL-GuoZonghao/BeyondBoundingBox/blob/main/mmdet/ops/minareabbox/src/minareabbox_kernel.cu | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/pytorch/cuda/knn_cuda.cu|MMseg-swin/mmcv/mmcv/ops/csrc/pytorch/cuda/knn_cuda.cu | 
https://github.com/CVMI-Lab/PAConv/tree/main/scene_seg/lib/pointops/src/knnquery_heap | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/pytorch/cuda/iou3d_cuda.cu|MMseg-swin/mmcv/mmcv/ops/csrc/pytorch/cuda/iou3d_cuda.cu | https://github.com/open-mmlab/OpenPCDet/blob/master/pcdet/ops/iou3d_nms/src/iou3d_nms_kernel.cu | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/pytorch/cuda/group_points_cuda.cu|MMseg-swin/mmcv/mmcv/ops/csrc/pytorch/cuda/group_points_cuda.cu | https://github.com/sshaoshuai/Pointnet2.PyTorch/tree/master/pointnet2/src/group_points_gpu.cu | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/pytorch/cuda/fused_bias_leakyrelu_cuda.cu|MMseg-swin/mmcv/mmcv/ops/csrc/pytorch/cuda/fused_bias_leakyrelu_cuda.cu | https://github.com/rosinality/stylegan2-pytorch/blob/master/op/fused_bias_act_kernel.cu | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/pytorch/cuda/furthest_point_sample_cuda.cu|MMseg-swin/mmcv/mmcv/ops/csrc/pytorch/cuda/furthest_point_sample_cuda.cu | https://github.com/sshaoshuai/Pointnet2.PyTorch/tree/master/pointnet2/src/sampling_gpu.cu | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/pytorch/cuda/diff_iou_rotated_cuda.cu|MMseg-swin/mmcv/mmcv/ops/csrc/pytorch/cuda/diff_iou_rotated_cuda.cu | https://github.com/lilanxiao/Rotated_IoU/cuda_op/sort_vert_kernel.cu | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/pytorch/cuda/correlation_cuda.cu|MMseg-swin/mmcv/mmcv/ops/csrc/pytorch/cuda/correlation_cuda.cu | https://github.com/ClementPinard/Pytorch-Correlation-extension/blob/master/Correlation_Module/correlation_cuda_kernel.cu | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/pytorch/cuda/convex_iou.cu|MMseg-swin/mmcv/mmcv/ops/csrc/pytorch/cuda/convex_iou.cu | https://github.com/SDL-GuoZonghao/BeyondBoundingBox/blob/main/mmdet/ops/iou/src/convex_iou_kernel.cu | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/pytorch/cuda/chamfer_distance_cuda.cu|MMseg-swin/mmcv/mmcv/ops/csrc/pytorch/cuda/chamfer_distance_cuda.cu | https://github.com/chrdiller/pyTorchChamferDistance/blob/master/chamfer_distance/chamfer_distance.cpp | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/pytorch/cuda/box_iou_rotated_cuda.cu|MMseg-swin/mmcv/mmcv/ops/csrc/pytorch/cuda/box_iou_rotated_cuda.cu | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_cuda.cu | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/pytorch/cuda/ball_query_cuda.cu|MMseg-swin/mmcv/mmcv/ops/csrc/pytorch/cuda/ball_query_cuda.cu | https://github.com/sshaoshuai/Pointnet2.PyTorch/tree/master/pointnet2/src/ball_query_gpu.cu| 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/pytorch/cuda/assign_score_withk_cuda.cu|MMseg-swin/mmcv/mmcv/ops/csrc/pytorch/cuda/assign_score_withk_cuda.cu | https://github.com/CVMI-Lab/PAConv/tree/main/scene_seg/lib/paconv_lib/src/gpu | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/pytorch/cuda/active_rotated_filter_cuda.cu|MMseg-swin/mmcv/mmcv/ops/csrc/pytorch/cuda/active_rotated_filter_cuda.cu | 
https://github.com/csuhan/s2anet/blob/master/mmdet/ops/orn/src/cuda/ActiveRotatingFilter_cuda.cu | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/pytorch/cpu/rotated_feature_align.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/pytorch/cpu/rotated_feature_align.cpp | https://github.com/SJTU-Thinklab-Det/r3det-on-mmdetection/blob/master/mmdet/ops/fr/src/feature_refine_kernel.cu | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/pytorch/cpu/roi_align.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/pytorch/cpu/roi_align.cpp | https://github.com/facebookresearch/detectron2/tree/master/detectron2/layers/csrc/ROIAlign | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/pytorch/cpu/roi_align_rotated.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/pytorch/cpu/roi_align_rotated.cpp | https://github.com/facebookresearch/detectron2/tree/master/detectron2/layers/csrc/ROIAlignRotated | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/pytorch/cpu/psamask.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/pytorch/cpu/psamask.cpp | https://github.com/hszhao/semseg/blob/master/lib/psa/src | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/pytorch/cpu/pixel_group.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/pytorch/cpu/pixel_group.cpp | https://github.com/WenmuZhou/PAN.pytorch | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/pytorch/cpu/nms_rotated.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/pytorch/cpu/nms_rotated.cpp | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/nms_rotated/nms_rotated_cpu.cpp | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/pytorch/cpu/box_iou_rotated.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/pytorch/cpu/box_iou_rotated.cpp | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_cpu.cpp | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/pytorch/cpu/active_rotated_filter.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/pytorch/cpu/active_rotated_filter.cpp | https://github.com/csuhan/s2anet/blob/master/mmdet/ops/orn/src/cpu/ActiveRotatingFilter_cpu.cpp | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/pytorch/convex_iou.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/pytorch/convex_iou.cpp | https://github.com/SDL-GuoZonghao/BeyondBoundingBox/tree/main/mmdet/ops/iou/src | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/pytorch/contour_expand.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/pytorch/contour_expand.cpp | https://github.com/whai362/PSENet | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/pytorch/convex_iou.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/pytorch/convex_iou.cpp | https://github.com/SDL-GuoZonghao/BeyondBoundingBox/tree/main/mmdet/ops/iou/src | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/pytorch/chamfer_distance.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/pytorch/chamfer_distance.cpp | https://github.com/chrdiller/pyTorchChamferDistance/blob/master/chamfer_distance/chamfer_distance.cpp | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/pytorch/box_iou_rotated.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/pytorch/box_iou_rotated.cpp | 
https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated.h | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/pytorch/ball_query.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/pytorch/ball_query.cpp | https://github.com/sshaoshuai/Pointnet2.PyTorch/tree/master/pointnet2/src/ball_query.cpp | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/pytorch/assign_score_withk.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/pytorch/assign_score_withk.cpp | https://github.com/CVMI-Lab/PAConv/tree/main/scene_seg/lib/paconv_lib/src/gpu | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/pytorch/active_rotated_filter.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/pytorch/active_rotated_filter.cpp | https://github.com/csuhan/s2anet/blob/master/mmdet/ops/orn/src/ActiveRotatingFilter.h | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/parrots/upfirdn2d.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/parrots/upfirdn2d.cpp | https://github.com/rosinality/stylegan2-pytorch/blob/master/op/upfirdn2d.cpp | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/parrots/three_nn.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/parrots/three_nn.cpp | https://github.com/sshaoshuai/Pointnet2.PyTorch/tree/master/pointnet2/src/interpolate.cpp | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/parrots/three_interpolate.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/parrots/three_interpolate.cpp | https://github.com/sshaoshuai/Pointnet2.PyTorch/tree/master/pointnet2/src/interpolate.cpp | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/parrots/rotated_feature_align.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/parrots/rotated_feature_align.cpp | https://github.com/SJTU-Thinklab-Det/r3det-on-mmdetection/blob/master/mmdet/ops/fr/src/feature_refine_cuda.cpp | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/parrots/roipoint_pool3d.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/parrots/roipoint_pool3d.cpp | https://github.com/open-mmlab/OpenPCDet/blob/master/pcdet/ops/roipoint_pool3d/src/roipoint_pool3d.cpp | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/parrots/psamask.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/parrots/psamask.cpp | https://github.com/hszhao/semseg/blob/master/lib/psa/src | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/parrots/pixel_group.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/parrots/pixel_group.cpp | https://github.com/WenmuZhou/PAN.pytorch | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/parrots/nms_rotated.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/parrots/nms_rotated.cpp | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/nms_rotated/nms_rotated.h | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/parrots/ms_deform_attn.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/parrots/ms_deform_attn.cpp | https://github.com/chengdazhi/Deformable-Convolution-V2-PyTorch/tree/pytorch_1.0.0 | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/parrots/knn.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/parrots/knn.cpp | https://github.com/CVMI-Lab/PAConv/tree/main/scene_seg/lib/pointops/src/knnquery_heap | 源码实现| -| 开源代码引入 | 
https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/parrots/iou3d.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/parrots/iou3d.cpp | https://github.com/open-mmlab/OpenPCDet/blob/master/pcdet/ops/iou3d_nms/src/iou3d_nms.cpp | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/parrots/info.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/parrots/info.cpp | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/vision.cpp | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/parrots/info.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/parrots/info.cpp | https://github.com/pytorch/pytorch/blob/master/aten/src/ATen/cuda/detail/CUDAHooks.cpp#L231 | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/parrots/info.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/parrots/info.cpp | https://github.com/pytorch/pytorch/blob/master/aten/src/ATen/Version.cpp | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/parrots/group_points.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/parrots/group_points.cpp | https://github.com/sshaoshuai/Pointnet2.PyTorch/tree/master/pointnet2/src/group_points.cpp | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/parrots/fused_bias_leakyrelu.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/parrots/fused_bias_leakyrelu.cpp | https://github.com/rosinality/stylegan2-pytorch/blob/master/op/fused_bias_act.cpp | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/parrots/furthest_point_sample.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/parrots/furthest_point_sample.cpp | https://github.com/sshaoshuai/Pointnet2.PyTorch/tree/master/pointnet2/src/sampling.cpp | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/parrots/convex_iou.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/parrots/convex_iou.cpp | https://github.com/SDL-GuoZonghao/BeyondBoundingBox/tree/main/mmdet/ops/iou/src | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/parrots/contour_expand.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/parrots/contour_expand.cpp | https://github.com/whai362/PSENet | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/parrots/box_iou_rotated.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/parrots/box_iou_rotated.cpp | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated.h | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/parrots/ball_query.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/parrots/ball_query.cpp | https://github.com/sshaoshuai/Pointnet2.PyTorch/tree/master/pointnet2/src/ball_query.cpp | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/parrots/assign_score_withk.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/parrots/assign_score_withk.cpp | https://github.com/CVMI-Lab/PAConv/tree/main/scene_seg/lib/paconv_lib/src/gpu | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/parrots/active_rotated_filter.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/parrots/active_rotated_filter.cpp | https://github.com/csuhan/s2anet/blob/master/mmdet/ops/orn/src/ActiveRotatingFilter.h | 源码实现| -| 开源代码引入 | 
https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/onnxruntime/cpu/rotated_feature_align.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/onnxruntime/cpu/rotated_feature_align.cpp | https://github.com/SJTU-Thinklab-Det/r3det-on-mmdetection/blob/master/mmdet/ops/fr/src/feature_refine_kernel.cu | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/onnxruntime/cpu/roi_align_rotated.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/onnxruntime/cpu/roi_align_rotated.cpp | https://github.com/facebookresearch/detectron2/tree/master/detectron2/layers/csrc/ROIAlignRotated | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/onnxruntime/cpu/reduce_ops.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/onnxruntime/cpu/reduce_ops.cpp | https://github.com/pytorch/pytorch/blob/master/aten/src/ATen/native/ReduceOps.cpp | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/onnxruntime/cpu/reduce_ops.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/onnxruntime/cpu/reduce_ops.cpp | https://github.com/pytorch/pytorch/blob/master/aten/src/ATen/native/TensorDimApply.h | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/onnxruntime/cpu/reduce_ops.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/onnxruntime/cpu/reduce_ops.cpp | https://github.com/pytorch/pytorch | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/onnxruntime/cpu/gridSample.cpp|MMseg-swin/mmcv/mmcv/ops/csrc/onnxruntime/cpu/gridSample.cpp | https://github.com/pytorch/pytorch/blob/master/aten/src/ATen/native/GridSampler.cpp | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/common/pytorch_device_registry.hpp|MMseg-swin/mmcv/mmcv/ops/csrc/common/pytorch_device_registry.hpp | https://pytorch.org/tutorials/advanced/cpp_extension.html#writing-the-c-op | 相关说明| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/common/pytorch_device_registry.hpp|MMseg-swin/mmcv/mmcv/ops/csrc/common/pytorch_device_registry.hpp | https://github.com/pytorch/extension-cpp/issues/35 | 相关说明| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/common/cuda/rotated_feature_align_cuda_kernel.cuh|MMseg-swin/mmcv/mmcv/ops/csrc/common/cuda/rotated_feature_align_cuda_kernel.cuh | https://github.com/SJTU-Thinklab-Det/r3det-on-mmdetection/blob/master/mmdet/ops/fr/src/feature_refine_kernel.cu | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/common/cuda/roi_align_rotated_cuda_kernel.cuh|MMseg-swin/mmcv/mmcv/ops/csrc/common/cuda/roi_align_rotated_cuda_kernel.cuh | https://github.com/facebookresearch/detectron2/tree/master/detectron2/layers/csrc/ROIAlignRotated | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/common/cuda/riroi_align_rotated_cuda_kernel.cuh|MMseg-swin/mmcv/mmcv/ops/csrc/common/cuda/riroi_align_rotated_cuda_kernel.cuh | https://github.com/csuhan/ReDet/blob/master/mmdet/ops/riroi_align/src/riroi_align_kernel.cu | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/common/cuda/nms_rotated_cuda.cuh|MMseg-swin/mmcv/mmcv/ops/csrc/common/cuda/nms_rotated_cuda.cuh | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/nms_rotated/nms_rotated_cuda.cu | 源码实现| -| 开源代码引入 | 
https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/common/cuda/ms_deform_attn_cuda_kernel.cuh|MMseg-swin/mmcv/mmcv/ops/csrc/common/cuda/ms_deform_attn_cuda_kernel.cuh | https://github.com/chengdazhi/Deformable-Convolution-V2-PyTorch/tree/pytorch_1.0.0 | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/common/cuda/modulated_deform_conv_cuda_kernel.cuh|MMseg-swin/mmcv/mmcv/ops/csrc/common/cuda/modulated_deform_conv_cuda_kernel.cuh | https://arxiv.org/abs/1703.06211 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/common/cuda/modulated_deform_conv_cuda_kernel.cuh|MMseg-swin/mmcv/mmcv/ops/csrc/common/cuda/modulated_deform_conv_cuda_kernel.cuh | https://github.com/chengdazhi/Deformable-Convolution-V2-PyTorch/blob/mmdetection/mmdet/ops/dcn/src/deform_conv_cuda_kernel.cu | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/common/cuda/knn_cuda_kernel.cuh|MMseg-swin/mmcv/mmcv/ops/csrc/common/cuda/knn_cuda_kernel.cuh | https://github.com/CVMI-Lab/PAConv/tree/main/scene_seg/lib/pointops/src/knnquery_heap | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/common/cuda/group_points_cuda_kernel.cuh|MMseg-swin/mmcv/mmcv/ops/csrc/common/cuda/group_points_cuda_kernel.cuh | https://github.com/sshaoshuai/Pointnet2.PyTorch/tree/master/pointnet2/src/group_points_gpu.cu | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/common/cuda/furthest_point_sample_cuda_kernel.cuh|MMseg-swin/mmcv/mmcv/ops/csrc/common/cuda/furthest_point_sample_cuda_kernel.cuh | https://github.com/qiqihaer/3DSSD-pytorch/blob/master/lib/pointnet2/src/sampling_gpu.cu | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/common/cuda/diff_iou_rotated_cuda_kernel.cuh|MMseg-swin/mmcv/mmcv/ops/csrc/common/cuda/diff_iou_rotated_cuda_kernel.cuh | https://github.com/lilanxiao/Rotated_IoU/cuda_op/sort_vert_kernel.cu | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/common/cuda/deform_conv_cuda_kernel.cuh|MMseg-swin/mmcv/mmcv/ops/csrc/common/cuda/deform_conv_cuda_kernel.cuh | https://arxiv.org/abs/1703.06211 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/common/cuda/deform_conv_cuda_kernel.cuh|MMseg-swin/mmcv/mmcv/ops/csrc/common/cuda/diff_iou_rotated_cuda_kernel.cuh | https://github.com/chengdazhi/Deformable-Convolution-V2-PyTorch/blob/mmdetection/mmdet/ops/dcn/src/deform_conv_cuda_kernel.cu | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/common/cuda/correlation_cuda.cuh|MMseg-swin/mmcv/mmcv/ops/csrc/common/cuda/correlation_cuda.cuh | https://github.com/ClementPinard/Pytorch-Correlation-extension/blob/master/Correlation_Module/correlation_cuda_kernel.cu | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/common/cuda/correlation_cuda.cuh|MMseg-swin/mmcv/mmcv/ops/csrc/common/cuda/correlation_cuda.cuh | https://pytorch.org/tutorials/advanced/cpp_extension.html#writing-the-c-op | 相关说明| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/common/cuda/correlation_cuda.cuh|MMseg-swin/mmcv/mmcv/ops/csrc/common/cuda/correlation_cuda.cuh | https://github.com/pytorch/extension-cpp/issues/35 | 相关说明| -| 开源代码引入 | 
https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/common/cuda/chamfer_distance_cuda_kernel.cuh|MMseg-swin/mmcv/mmcv/ops/csrc/common/cuda/chamfer_distance_cuda_kernel.cuh | https://github.com/chrdiller/pyTorchChamferDistance/blob/master/chamfer_distance/chamfer_distance.cu | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/common/cuda/correlation_cuda.cuh|MMseg-swin/mmcv/mmcv/ops/csrc/common/cuda/correlation_cuda.cuh | https://pytorch.org/tutorials/advanced/cpp_extension.html#writing-the-c-op | 相关说明| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/common/cuda/carafe_cuda_kernel.cuh|MMseg-swin/mmcv/mmcv/ops/csrc/common/cuda/carafe_cuda_kernel.cuh | https://devblogs.nvidia.com/efficient-matrix-transpose-cuda-cc/ | 相关说明| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/common/cuda/box_iou_rotated_cuda.cuh|MMseg-swin/mmcv/mmcv/ops/csrc/common/cuda/box_iou_rotated_cuda.cuh | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_cuda.cu | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/common/cuda/border_align_cuda_kernel.cuh|MMseg-swin/mmcv/mmcv/ops/csrc/common/cuda/border_align_cuda_kernel.cuh | https://github.com/Megvii-BaseDetection/cvpods/blob/master/cvpods/layers/csrc/border_align/border_align_kernel.cu | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/common/cuda/ball_query_cuda_kernel.cuh|MMseg-swin/mmcv/mmcv/ops/csrc/common/cuda/ball_query_cuda_kernel.cuh | https://github.com/sshaoshuai/Pointnet2.PyTorch/tree/master/pointnet2/src/ball_query_gpu.cu | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/common/cuda/active_rotated_filter_cuda_kernel.cuh|MMseg-swin/mmcv/mmcv/ops/csrc/common/cuda/active_rotated_filter_cuda_kernel.cuh | https://github.com/csuhan/s2anet/blob/master/mmdet/ops/orn/src/cuda/ActiveRotatingFilter_cuda.cu | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/csrc/common/box_iou_rotated_utils.hpp|MMseg-swin/mmcv/mmcv/ops/csrc/common/box_iou_rotated_utils.hpp | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_utils.h | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/corner_pool.py|MMseg-swin/mmcv/mmcv/ops/corner_pool.py | https://arxiv.org/abs/1808.01244 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/corner_pool.py|MMseg-swin/mmcv/mmcv/ops/corner_pool.py | https://github.com/princeton-vl/CornerNet-Lite | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/chamfer_distance.py|MMseg-swin/mmcv/mmcv/ops/chamfer_distance.py | https://arxiv.org/abs/2105.11111 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/cc_attention.py|MMseg-swin/mmcv/mmcv/ops/cc_attention.py | https://github.com/open-mmlab/mmcv/pull/1201 | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/carafe.py|MMseg-swin/mmcv/mmcv/ops/carafe.py | https://arxiv.org/abs/1905.02188 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/border_align.py|MMseg-swin/mmcv/mmcv/ops/border_align.py | 
https://github.com/Megvii-BaseDetection/cvpods/blob/master/cvpods/layers/border_align.py | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/border_align.py|MMseg-swin/mmcv/mmcv/ops/border_align.py | https://arxiv.org/abs/2007.11056 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/assign_score_withk.py|MMseg-swin/mmcv/mmcv/ops/assign_score_withk.py | https://github.com/CVMI-Lab/PAConv/tree/main/ | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/assign_score_withk.py|MMseg-swin/mmcv/mmcv/ops/assign_score_withk.py | https://arxiv.org/pdf/2103.14635.pdf | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/assign_score_withk.py|MMseg-swin/mmcv/mmcv/ops/assign_score_withk.py | https://github.com/CVMI-Lab/PAConv/blob/main/scene_seg/model/ | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/ops/active_rotated_filter.py|MMseg-swin/mmcv/mmcv/ops/active_rotated_filter.py | https://arxiv.org/abs/2008.09397 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/onnx/symbolic.py|MMseg-swin/mmcv/mmcv/onnx/symbolic.py | https://github.com/pytorch/pytorch | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/onnx/onnx_utils/symbolic_helper.py|MMseg-swin/mmcv/mmcv/onnx/onnx_utils/symbolic_helper.py | https://github.com/pytorch/pytorch | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/image/photometric.py|MMseg-swin/mmcv/mmcv/image/photometric.py | https://dl.acm.org/doi/pdf/10.1145/3065386 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/image/photometric.py|MMseg-swin/mmcv/mmcv/image/photometric.py | https://github.com/pytorch/vision/blob/main/torchvision/ | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/image/io.py|MMseg-swin/mmcv/mmcv/image/io.py | https://github.com/lilohuang/PyTurboJPEG | 图片链接| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/image/colorspace.py|MMseg-swin/mmcv/mmcv/image/colorspace.py | https://en.wikipedia.org/wiki/YCbCr#ITU-R_BT.601_conversion | 相关说明| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/image/colorspace.py|MMseg-swin/mmcv/mmcv/image/colorspace.py | https://en.wikipedia.org/wiki/YCbCr#JPEG_conversion | 相关说明| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/engine/test.py|MMseg-swin/mmcv/mmcv/engine/test.py | https://github.com/open-mmlab/mmcv/issues/985 | 相关说明| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/device/ipu/model_wrapper.py|MMseg-swin/mmcv/mmcv/device/ipu/model_wrapper.py | https://docs.graphcore.ai/projects/poptorch-user-guide/en/latest/index.html | 相关说明| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/device/ipu/hook_wrapper.py|MMseg-swin/mmcv/mmcv/device/ipu/hook_wrapper.py | https://pytorch.org/docs/stable/amp.html#torch.cuda.amp.GradScaler | 相关说明| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/cnn/utils/weight_init.py|MMseg-swin/mmcv/mmcv/cnn/utils/weight_init.py | http://proceedings.mlr.press/v9/glorot10a/glorot10a.pdf | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/cnn/utils/weight_init.py|MMseg-swin/mmcv/mmcv/cnn/utils/weight_init.py | https://www.cv-foundation.org/openaccess/content_iccv_2015/ | 
论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/cnn/utils/weight_init.py|MMseg-swin/mmcv/mmcv/cnn/utils/weight_init.py | http://download.openmmlab.com/mmdetection/v2.0/retinanet/ | 预训练模型| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/cnn/utils/weight_init.py|MMseg-swin/mmcv/mmcv/cnn/utils/weight_init.py | https://people.sc.fsu.edu/~jburkardt/presentations/truncated_normal.pdf | 相关说明| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/cnn/utils/weight_init.py|MMseg-swin/mmcv/mmcv/cnn/utils/weight_init.py | https://github.com/pytorch/pytorch/blob/master/torch/nn/init.py | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/cnn/bricks/wrappers.py|MMseg-swin/mmcv/mmcv/cnn/bricks/wrappers.py | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/wrappers.py | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/cnn/bricks/transformer.py|MMseg-swin/mmcv/mmcv/cnn/bricks/transformer.py | https://pytorch.org/docs/stable/generated/torch.nn.Conv2d.html | 相关说明| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/cnn/bricks/transformer.py|MMseg-swin/mmcv/mmcv/cnn/bricks/transformer.py | https://arxiv.org/abs/2002.04745 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/cnn/bricks/plugin.py|MMseg-swin/mmcv/mmcv/cnn/bricks/plugin.py | https://inflection.readthedocs.io/en/latest/#inflection.underscore | 相关说明| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/cnn/bricks/non_local.py|MMseg-swin/mmcv/mmcv/cnn/bricks/non_local.py | https://arxiv.org/abs/1711.07971 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/cnn/bricks/non_local.py|MMseg-swin/mmcv/mmcv/cnn/bricks/non_local.py | https://github.com/AlexHex7/Non-local_pytorch | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/cnn/bricks/generalized_attention.py|MMseg-swin/mmcv/mmcv/cnn/bricks/generalized_attention.py | https://arxiv.org/abs/1711.07971 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/cnn/bricks/drop.py|MMseg-swin/mmcv/mmcv/cnn/bricks/drop.py | https://github.com/rwightman/pytorch-image-models/blob/a2727c1bf78ba0d7b5727f5f95e37fb7f8866b1f/timm/models/layers/drop.py | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/cnn/bricks/depthwise_separable_conv_module.py|MMseg-swin/mmcv/mmcv/cnn/bricks/depthwise_separable_conv_module.py | https://arxiv.org/pdf/1704.04861.pdf | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/cnn/bricks/conv_ws.py|MMseg-swin/mmcv/mmcv/cnn/bricks/conv_ws.py | https://arxiv.org/pdf/1903.10520.pdf | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/cnn/bricks/conv_ws.py|MMseg-swin/mmcv/mmcv/cnn/bricks/conv_ws.py | https://arxiv.org/pdf/2006.02334.pdf | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/mmcv/cnn/bricks/context_block.py|MMseg-swin/mmcv/mmcv/cnn/bricks/context_block.py | https://arxiv.org/abs/1904.11492 | 论文地址| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/mmcv/examples/train.py|MMseg-swin/mmcv/examples/train.py | https://github.com/open-mmlab/mmcv/issues/1470 | 相关说明| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/docs/zh_cn/conf.py|MMseg-swin/docs/zh_cn/conf.py | 
https://www.sphinx-doc.org/en/master/usage/configuration.html | 相关说明| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/docs/zh_cn/conf.py|MMseg-swin/docs/zh_cn/conf.py | https://mmsegmentation.readthedocs.io/zh-CN/latest/ | 相关说明| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/docs/zh_cn/conf.py|MMseg-swin/docs/zh_cn/conf.py | https://github.com/open-mmlab/mmsegmentation/blob/master/ | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/docs/zh_cn/stat.py|MMseg-swin/docs/zh_cn/stat.py | https://github.com/open-mmlab/mmsegmentation/blob/master/ | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/docs/zh_cn/make.bat|MMseg-swin/docs/zh_cn/make.bat | http://sphinx-doc.org/ | 相关说明| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/docs/en/conf.py|MMseg-swin/docs/zh_cn/conf.py | https://www.sphinx-doc.org/en/master/usage/configuration.html | 相关说明| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/docs/en/conf.py|MMseg-swin/docs/zh_cn/conf.py | https://mmsegmentation.readthedocs.io/zh-CN/latest/ | 相关说明| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/docs/en/conf.py|MMseg-swin/docs/zh_cn/conf.py | https://github.com/open-mmlab/mmsegmentation/blob/master/ | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/docs/en/stat.py|MMseg-swin/docs/zh_cn/stat.py | https://github.com/open-mmlab/mmsegmentation/blob/master/ | 源码实现| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/docs/en/make.bat|MMseg-swin/docs/zh_cn/make.bat | http://sphinx-doc.org/ | 相关说明| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/twins/twins_pcpvt-b_uperhead_8x2_512x512_160k_ade20k.py|MMseg-swin/configs/twins/twins_pcpvt-b_uperhead_8x2_512x512_160k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/twins/pcpvt_base_20220308-0621964c.pth | 预训练模型| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/twins/twins_pcpvt-b_fpn_fpnhead_8x4_512x512_80k_ade20k.py|MMseg-swin/configs/twins/twins_pcpvt-b_fpn_fpnhead_8x4_512x512_80k_ade20k.py| https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/twins/pcpvt_base_20220308-0621964c.pth | 预训练模型| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/twins/twins_pcpvt-l_fpn_fpnhead_8x4_512x512_80k_ade20k.py|MMseg-swin/configs/twins/twins_pcpvt-l_fpn_fpnhead_8x4_512x512_80k_ade20k.py| https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/twins/pcpvt_large_20220308-37579dc6.pth | 预训练模型| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/twins/twins_pcpvt-l_uperhead_8x2_512x512_160k_ade20k.py|MMseg-swin/configs/twins/twins_pcpvt-l_uperhead_8x2_512x512_160k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/twins/pcpvt_large_20220308-37579dc6.pth | 预训练模型| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/twins/twins_svt-b_fpn_fpnhead_8x4_512x512_80k_ade20k.py|MMseg-swin/configs/twins/twins_svt-b_fpn_fpnhead_8x4_512x512_80k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/twins/alt_gvt_base_20220308-1b7eb711.pth | 预训练模型| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/twins/twins_svt-b_uperhead_8x2_512x512_160k_ade20k.py|MMseg-swin/configs/twins/twins_svt-b_uperhead_8x2_512x512_160k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/twins/alt_gvt_base_20220308-1b7eb711.pth | 预训练模型| -| 开源代码引入 | 
https://github.com/open-mmlab/mmsegmentation/blob/main/configs/twins/twins_svt-l_fpn_fpnhead_8x4_512x512_80k_ade20k.py|MMseg-swin/configs/twins/twins_svt-l_fpn_fpnhead_8x4_512x512_80k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/twins/alt_gvt_large_20220308-fb5936f3.pth | 预训练模型| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/twins/twins_svt-l_uperhead_8x2_512x512_160k_ade20k.py|MMseg-swin/configs/twins/twins_svt-l_uperhead_8x2_512x512_160k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/twins/alt_gvt_large_20220308-fb5936f3.pth | 预训练模型| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/twins/twins_svt-s_fpn_fpnhead_8x4_512x512_80k_ade20k.py|MMseg-swin/configs/twins/twins_svt-s_fpn_fpnhead_8x4_512x512_80k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/twins/alt_gvt_small_20220308-7e1c3695.pth | 预训练模型| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/twins/twins_svt-s_uperhead_8x2_512x512_160k_ade20k.py|MMseg-swin/configs/twins/twins_svt-s_uperhead_8x2_512x512_160k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/twins/alt_gvt_small_20220308-7e1c3695.pth | 预训练模型| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/swin/upernet_swin_base_patch4_window7_512x512_160k_ade20k_pretrain_224x224_1K.py|MMseg-swin/configs/swin/upernet_swin_base_patch4_window7_512x512_160k_ade20k_pretrain_224x224_1K.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/swin/swin_base_patch4_window7_224_20220317-e9b98025.pth | 预训练模型| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/swin/upernet_swin_base_patch4_window7_512x512_160k_ade20k_pretrain_224x224_22K.py|MMseg-swin/configs/swin/upernet_swin_base_patch4_window7_512x512_160k_ade20k_pretrain_224x224_22K.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/swin/swin_base_patch4_window7_224_22k_20220317-4f79f7c0.pth | 预训练模型| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/swin/upernet_swin_base_patch4_window12_512x512_160k_ade20k_pretrain_384x384_1K.py|MMseg-swin/configs/swin/upernet_swin_base_patch4_window12_512x512_160k_ade20k_pretrain_384x384_1K.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/swin/swin_base_patch4_window12_384_20220317-55b0104a.pth | 预训练模型| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/swin/upernet_swin_base_patch4_window12_512x512_160k_ade20k_pretrain_384x384_22K.py|MMseg-swin/configs/swin/upernet_swin_base_patch4_window12_512x512_160k_ade20k_pretrain_384x384_22K.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/swin/swin_base_patch4_window12_384_22k_20220317-e5c09f74.pth | 预训练模型| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/swin/upernet_swin_small_patch4_window7_512x512_160k_ade20k_pretrain_224x224_1K.py|MMseg-swin/configs/swin/upernet_swin_small_patch4_window7_512x512_160k_ade20k_pretrain_224x224_1K.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/swin/swin_small_patch4_window7_224_20220317-7ba6d6dd.pth | 预训练模型| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/swin/upernet_swin_tiny_patch4_window7_512x512_160k_ade20k_pretrain_224x224_1K.py|MMseg-swin/configs/swin/upernet_swin_tiny_patch4_window7_512x512_160k_ade20k_pretrain_224x224_1K.py | 
https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/swin/swin_tiny_patch4_window7_224_20220317-1cdeb081.pth | 预训练模型| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/stdc/stdc1_in1k-pre_512x1024_80k_cityscapes.py|MMseg-swin/configs/stdc/stdc1_in1k-pre_512x1024_80k_cityscapes.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/stdc/stdc1_20220308-5368626c.pth | 预训练模型| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/stdc/stdc2_in1k-pre_512x1024_80k_cityscapes.py|MMseg-swin/configs/stdc/stdc2_in1k-pre_512x1024_80k_cityscapes.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/stdc/stdc2_20220308-7dbd9127.pth | 预训练模型| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/segmenter/segmenter_vit-l_mask_8x1_640x640_160k_ade20k.py|MMseg-swin/configs/segmenter/segmenter_vit-l_mask_8x1_640x640_160k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segmenter/vit_large_p16_384_20220308-d4efb41d.pth | 预训练模型| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/segmenter/segmenter_vit-s_mask_8x1_512x512_160k_ade20k.py|MMseg-swin/configs/segmenter/segmenter_vit-s_mask_8x1_512x512_160k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segmenter/vit_small_p16_384_20220308-410f6037.pth | 预训练模型| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/segmenter/segmenter_vit-t_mask_8x1_512x512_160k_ade20k.py|MMseg-swin/configs/segmenter/segmenter_vit-t_mask_8x1_512x512_160k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segmenter/vit_tiny_p16_384_20220308-cce8c795.pth | 预训练模型| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/segformer/segformer_mit-b0_8x1_1024x1024_160k_cityscapes.py|MMseg-swin/configs/segformer/segformer_mit-b0_8x1_1024x1024_160k_cityscapes.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segformer/mit_b0_20220624-7e0fe6dd.pth | 预训练模型| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/segformer/segformer_mit-b0_512x512_160k_ade20k.py|MMseg-swin/configs/segformer/segformer_mit-b0_512x512_160k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segformer/mit_b0_20220624-7e0fe6dd.pth | 预训练模型| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/segformer/segformer_mit-b1_8x1_1024x1024_160k_cityscapes.py|MMseg-swin/configs/segformer/segformer_mit-b1_8x1_1024x1024_160k_cityscapes.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segformer/mit_b1_20220624-02e5a6a1.pth | 预训练模型| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/segformer/segformer_mit-b1_512x512_160k_ade20k.py|MMseg-swin/configs/segformer/segformer_mit-b1_512x512_160k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segformer/mit_b1_20220624-02e5a6a1.pth | 预训练模型| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/segformer/segformer_mit-b2_8x1_1024x1024_160k_cityscapes.py|MMseg-swin/configs/segformer/segformer_mit-b2_8x1_1024x1024_160k_cityscapes.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segformer/mit_b2_20220624-66e8bf70.pth | 预训练模型| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/segformer/segformer_mit-b2_512x512_160k_ade20k.py|MMseg-swin/configs/segformer/segformer_mit-b2_512x512_160k_ade20k.py | 
https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segformer/mit_b2_20220624-66e8bf70.pth | 预训练模型| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/segformer/segformer_mit-b3_8x1_1024x1024_160k_cityscapes.py|MMseg-swin/configs/segformer/segformer_mit-b3_8x1_1024x1024_160k_cityscapes.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segformer/mit_b3_20220624-13b1141c.pth | 预训练模型| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/segformer/segformer_mit-b3_512x512_160k_ade20k.py|MMseg-swin/configs/segformer/segformer_mit-b3_512x512_160k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segformer/mit_b3_20220624-13b1141c.pth | 预训练模型| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/segformer/segformer_mit-b4_8x1_1024x1024_160k_cityscapes.py|MMseg-swin/configs/segformer/segformer_mit-b4_8x1_1024x1024_160k_cityscapes.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segformer/mit_b4_20220624-d588d980.pth | 预训练模型| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/segformer/segformer_mit-b4_512x512_160k_ade20k.py|MMseg-swin/configs/segformer/segformer_mit-b4_512x512_160k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segformer/mit_b4_20220624-d588d980.pth | 预训练模型| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/segformer/segformer_mit-b5_8x1_1024x1024_160k_cityscapes.py|MMseg-swin/configs/segformer/segformer_mit-b5_8x1_1024x1024_160k_cityscapes.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segformer/mit_b5_20220624-658746d9.pthh | 预训练模型| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/segformer/segformer_mit-b5_512x512_160k_ade20k.py|MMseg-swin/configs/segformer/segformer_mit-b5_512x512_160k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segformer/mit_b5_20220624-658746d9.pthh | 预训练模型| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/segformer/segformer_mit-b5_640x640_160k_ade20k.py|MMseg-swin/configs/segformer/segformer_mit-b5_640x640_160k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segformer/mit_b5_20220624-658746d9.pthh | 预训练模型| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/pspnet/pspnet_r50-d8_rsb-pretrain_512x1024_adamw_80k_cityscapes.py|MMseg-swin/configs/pspnet/pspnet_r50-d8_rsb-pretrain_512x1024_adamw_80k_cityscapes.py | https://download.openmmlab.com/mmclassification/v0/resnet/resnet50_8xb256-rsb-a1-600e_in1k_20211228-20e21305.pth | 预训练模型| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/pspnet/pspnet_r50-d32_rsb-pretrain_512x1024_adamw_80k_cityscapes.py|MMseg-swin/configs/pspnet/pspnet_r50-d32_rsb-pretrain_512x1024_adamw_80k_cityscapes.py | https://download.openmmlab.com/mmclassification/v0/resnet/resnet50_8xb256-rsb-a1-600e_in1k_20211228-20e21305.pth | 预训练模型| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/knet/knet_s3_upernet_swin-t_8x2_512x512_adamw_80k_ade20k.py|MMseg-swin/configs/knet/knet_s3_upernet_swin-t_8x2_512x512_adamw_80k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/swin/swin_tiny_patch4_window7_224_20220308-f41b89d3.pth | 预训练模型| -| 开源代码引入 | 
https://github.com/open-mmlab/mmsegmentation/blob/main/configs/knet/knet_s3_upernet_swin-l_8x2_640x640_adamw_80k_ade20k.py|MMseg-swin/configs/knet/knet_s3_upernet_swin-l_8x2_640x640_adamw_80k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/swin/swin_large_patch4_window7_224_22k_20220308-d5bdebaf.pth | 预训练模型| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/knet/knet_s3_upernet_swin-l_8x2_512x512_adamw_80k_ade20k.py|MMseg-swin/configs/knet/knet_s3_upernet_swin-l_8x2_512x512_adamw_80k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/swin/swin_large_patch4_window7_224_22k_20220308-d5bdebaf.pth | 预训练模型| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/convnext/upernet_convnext_base_fp16_640x640_160k_ade20k.py|MMseg-swin/configs/convnext/upernet_convnext_base_fp16_640x640_160k_ade20k.py | https://download.openmmlab.com/mmclassification/v0/convnext/downstream/convnext-base_3rdparty_in21k_20220301-262fd037.pth | 预训练模型| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/convnext/upernet_convnext_large_fp16_640x640_160k_ade20k.py|MMseg-swin/configs/convnext/upernet_convnext_large_fp16_640x640_160k_ade20k.py | https://download.openmmlab.com/mmclassification/v0/convnext/downstream/convnext-large_3rdparty_in21k_20220301-e6e0ea0a.pth | 预训练模型| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/convnext/upernet_convnext_small_fp16_512x512_160k_ade20k.py|MMseg-swin/configs/convnext/upernet_convnext_small_fp16_512x512_160k_ade20k.py | https://download.openmmlab.com/mmclassification/v0/convnext/downstream/convnext-small_3rdparty_32xb128-noema_in1k_20220301-303e75e3.pth | 预训练模型| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/convnext/upernet_convnext_tiny_fp16_512x512_160k_ade20k.py|MMseg-swin/configs/convnext/upernet_convnext_tiny_fp16_512x512_160k_ade20k.py | https://download.openmmlab.com/mmclassification/v0/convnext/downstream/convnext-tiny_3rdparty_32xb128-noema_in1k_20220301-795e9634.pth | 预训练模型| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/convnext/upernet_convnext_tiny_fp16_512x512_160k_ade20k.py|MMseg-swin/configs/convnext/upernet_convnext_tiny_fp16_512x512_160k_ade20k.py | https://download.openmmlab.com/mmclassification/v0/convnext/downstream/convnext-tiny_3rdparty_32xb128-noema_in1k_20220301-795e9634.pth | 预训练模型| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/convnext/upernet_convnext_xlarge_fp16_640x640_160k_ade20k.py|MMseg-swin/configs/convnext/upernet_convnext_xlarge_fp16_640x640_160k_ade20k.py | https://download.openmmlab.com/mmclassification/v0/convnext/downstream/convnext-xlarge_3rdparty_in21k_20220301-08aa5ddc.pth | 预训练模型| -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/main/configs/_base_/datasets/isaid.pyy|MMseg-swin/configs/_base_/datasets/isaid.py | https://arxiv.org/pdf/2103.06564.pdf | 论文地址| -| 开发引入 | /|MMseg-swin/mmcv/requirements/docs.txt | https://github.com/open-mmlab/pytorch_sphinx_theme.git#egg=pytorch_sphinx_theme | 相关依赖| -| 开发引入 | /|MMseg-swin/requirements/docs.txt | https://github.com/gaotongxiao/pytorch_sphinx_theme.git#egg=pytorch_sphinx_theme | 相关依赖| +| 文件位置 | 公网地址 | 公网地址用途 | 
+|-----------------------------------------------------------------------------------------------------------------------------------------------------------------|--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|------------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/_base_/models/segmenter_vit-b16_mask.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segmenter/vit_base_p16_384_20220308-96dfe169.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/_base_/models/twins_pcpvt-s_fpn.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/twins/pcpvt_small_20220308-e638c41c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/_base_/models/twins_pcpvt-s_upernet.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/twins/pcpvt_small_20220308-e638c41c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/_base_/models/upernet_convnext.py | https://download.openmmlab.com/mmclassification/v0/convnext/downstream/convnext-base_3rdparty_32xb128-noema_in1k_20220301-2a0ee547.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/ann/ann.yml | https://arxiv.org/abs/1908.07678 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/ann/ann.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r50-d8_512x1024_40k_cityscapes/ann_r50-d8_512x1024_40k_cityscapes_20200605_095211-049fc292.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/ann/ann.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r101-d8_512x1024_40k_cityscapes/ann_r101-d8_512x1024_40k_cityscapes_20200605_095243-adf6eece.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/ann/ann.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r50-d8_769x769_40k_cityscapes/ann_r50-d8_769x769_40k_cityscapes_20200530_025712-2b46b04d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/ann/ann.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r101-d8_769x769_40k_cityscapes/ann_r101-d8_769x769_40k_cityscapes_20200530_025720-059bff28.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/ann/ann.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r50-d8_512x1024_80k_cityscapes/ann_r50-d8_512x1024_80k_cityscapes_20200607_101911-5a9ad545.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/ann/ann.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r101-d8_512x1024_80k_cityscapes/ann_r101-d8_512x1024_80k_cityscapes_20200607_013728-aceccc6e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/ann/ann.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r50-d8_769x769_80k_cityscapes/ann_r50-d8_769x769_80k_cityscapes_20200607_044426-cc7ff323.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/ann/ann.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r101-d8_769x769_80k_cityscapes/ann_r101-d8_769x769_80k_cityscapes_20200607_013713-a9d4be8d.pth 
| 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/ann/ann.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r50-d8_512x512_80k_ade20k/ann_r50-d8_512x512_80k_ade20k_20200615_014818-26f75e11.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/ann/ann.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r101-d8_512x512_80k_ade20k/ann_r101-d8_512x512_80k_ade20k_20200615_014818-c0153543.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/ann/ann.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r50-d8_512x512_160k_ade20k/ann_r50-d8_512x512_160k_ade20k_20200615_231733-892247bc.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/ann/ann.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r101-d8_512x512_160k_ade20k/ann_r101-d8_512x512_160k_ade20k_20200615_231733-955eb1ec.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/ann/ann.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r50-d8_512x512_20k_voc12aug/ann_r50-d8_512x512_20k_voc12aug_20200617_222246-dfcb1c62.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/ann/ann.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r101-d8_512x512_20k_voc12aug/ann_r101-d8_512x512_20k_voc12aug_20200617_222246-2fad0042.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/ann/ann.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r50-d8_512x512_40k_voc12aug/ann_r50-d8_512x512_40k_voc12aug_20200613_231314-b5dac322.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/ann/ann.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ann/ann_r101-d8_512x512_40k_voc12aug/ann_r101-d8_512x512_40k_voc12aug_20200613_231314-bd205bbe.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/apcnet/apcnet.yml | https://openaccess.thecvf.com/content_CVPR_2019/html/He_Adaptive_Pyramid_Context_Network_for_Semantic_Segmentation_CVPR_2019_paper.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/apcnet/apcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/apcnet/apcnet_r50-d8_512x1024_40k_cityscapes/apcnet_r50-d8_512x1024_40k_cityscapes_20201214_115717-5e88fa33.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/apcnet/apcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/apcnet/apcnet_r101-d8_512x1024_40k_cityscapes/apcnet_r101-d8_512x1024_40k_cityscapes_20201214_115716-abc9d111.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/apcnet/apcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/apcnet/apcnet_r50-d8_769x769_40k_cityscapes/apcnet_r50-d8_769x769_40k_cityscapes_20201214_115717-2a2628d7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/apcnet/apcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/apcnet/apcnet_r101-d8_769x769_40k_cityscapes/apcnet_r101-d8_769x769_40k_cityscapes_20201214_115718-b650de90.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/apcnet/apcnet.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/apcnet/apcnet_r50-d8_512x1024_80k_cityscapes/apcnet_r50-d8_512x1024_80k_cityscapes_20201214_115716-987f51e3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/apcnet/apcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/apcnet/apcnet_r101-d8_512x1024_80k_cityscapes/apcnet_r101-d8_512x1024_80k_cityscapes_20201214_115705-b1ff208a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/apcnet/apcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/apcnet/apcnet_r50-d8_769x769_80k_cityscapes/apcnet_r50-d8_769x769_80k_cityscapes_20201214_115718-7ea9fa12.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/apcnet/apcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/apcnet/apcnet_r101-d8_769x769_80k_cityscapes/apcnet_r101-d8_769x769_80k_cityscapes_20201214_115716-a7fbc2ab.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/apcnet/apcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/apcnet/apcnet_r50-d8_512x512_80k_ade20k/apcnet_r50-d8_512x512_80k_ade20k_20201214_115705-a8626293.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/apcnet/apcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/apcnet/apcnet_r101-d8_512x512_80k_ade20k/apcnet_r101-d8_512x512_80k_ade20k_20201214_115704-c656c3fb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/apcnet/apcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/apcnet/apcnet_r50-d8_512x512_160k_ade20k/apcnet_r50-d8_512x512_160k_ade20k_20201214_115706-25fb92c2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/apcnet/apcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/apcnet/apcnet_r101-d8_512x512_160k_ade20k/apcnet_r101-d8_512x512_160k_ade20k_20201214_115705-73f9a8d7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/beit/beit.yml | https://download.openmmlab.com/mmsegmentation/v0.5/beit/upernet_beit-large_fp16_8x1_640x640_160k_ade20k/upernet_beit-large_fp16_8x1_640x640_160k_ade20k-8fc0dd5d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/beit/beit.yml | https://download.openmmlab.com/mmsegmentation/v0.5/beit/upernet_beit-base_8x2_640x640_160k_ade20k/upernet_beit-base_8x2_640x640_160k_ade20k-eead221d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/bisenetv1/bisenetv1.yml | https://arxiv.org/abs/1808.00897 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/bisenetv1/bisenetv1.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv1/bisenetv1_r18-d32_4x4_1024x1024_160k_cityscapes/bisenetv1_r18-d32_4x4_1024x1024_160k_cityscapes_20210922_172239-c55e78e2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/bisenetv1/bisenetv1.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv1/bisenetv1_r18-d32_in1k-pre_4x4_1024x1024_160k_cityscapes/bisenetv1_r18-d32_in1k-pre_4x4_1024x1024_160k_cityscapes_20210905_220251-8ba80eff.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/bisenetv1/bisenetv1.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv1/bisenetv1_r18-d32_in1k-pre_4x8_1024x1024_160k_cityscapes/bisenetv1_r18-d32_in1k-pre_4x8_1024x1024_160k_cityscapes_20210905_220322-bb8db75f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/bisenetv1/bisenetv1.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv1/bisenetv1_r50-d32_4x4_1024x1024_160k_cityscapes/bisenetv1_r50-d32_4x4_1024x1024_160k_cityscapes_20210923_222639-7b28a2a6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/bisenetv1/bisenetv1.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv1/bisenetv1_r50-d32_in1k-pre_4x4_1024x1024_160k_cityscapes/bisenetv1_r50-d32_in1k-pre_4x4_1024x1024_160k_cityscapes_20210917_234628-8b304447.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/bisenetv1/bisenetv1.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv1/bisenetv1_r18-d32_lr5e-3_4x4_512x512_160k_coco-stuff164k/bisenetv1_r18-d32_lr5e-3_4x4_512x512_160k_coco-stuff164k_20211022_054328-046aa2f2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/bisenetv1/bisenetv1.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv1/bisenetv1_r18-d32_in1k-pre_lr5e-3_4x4_512x512_160k_coco-stuff164k/bisenetv1_r18-d32_in1k-pre_lr5e-3_4x4_512x512_160k_coco-stuff164k_20211023_013100-f700dbf7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/bisenetv1/bisenetv1.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv1/bisenetv1_r50-d32_lr5e-3_4x4_512x512_160k_coco-stuff164k/bisenetv1_r50-d32_lr5e-3_4x4_512x512_160k_coco-stuff164k_20211101_040616-d2bb0df4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/bisenetv1/bisenetv1.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv1/bisenetv1_r50-d32_in1k-pre_lr5e-3_4x4_512x512_160k_coco-stuff164k/bisenetv1_r50-d32_in1k-pre_lr5e-3_4x4_512x512_160k_coco-stuff164k_20211101_181932-66747911.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/bisenetv1/bisenetv1.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv1/bisenetv1_r101-d32_lr5e-3_4x4_512x512_160k_coco-stuff164k/bisenetv1_r101-d32_lr5e-3_4x4_512x512_160k_coco-stuff164k_20211102_164147-c6b32c3b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/bisenetv1/bisenetv1.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv1/bisenetv1_r101-d32_in1k-pre_lr5e-3_4x4_512x512_160k_coco-stuff164k/bisenetv1_r101-d32_in1k-pre_lr5e-3_4x4_512x512_160k_coco-stuff164k_20211101_225220-28c8f092.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/bisenetv2/bisenetv2.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv2/bisenetv2_fcn_ohem_4x4_1024x1024_160k_cityscapes/bisenetv2_fcn_ohem_4x4_1024x1024_160k_cityscapes_20210902_112947-5f8103b4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/bisenetv2/bisenetv2.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv2/bisenetv2_fcn_fp16_4x4_1024x1024_160k_cityscapes/bisenetv2_fcn_fp16_4x4_1024x1024_160k_cityscapes_20210902_045942-b979777b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/bisenetv2/bisenetv2.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv2/bisenetv2_fcn_4x8_1024x1024_160k_cityscapes/bisenetv2_fcn_4x8_1024x1024_160k_cityscapes_20210903_000032-e1a2eed6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/bisenetv2/bisenetv2.yml | https://download.openmmlab.com/mmsegmentation/v0.5/bisenetv2/bisenetv2_fcn_4x4_1024x1024_160k_cityscapes/bisenetv2_fcn_4x4_1024x1024_160k_cityscapes_20210902_015551-bcf10f09.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/bisenetv2/bisenetv2.yml | https://arxiv.org/abs/2004.02147 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/ccnet/ccnet.yml | https://arxiv.org/abs/1811.11721 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/ccnet/ccnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r50-d8_512x1024_40k_cityscapes/ccnet_r50-d8_512x1024_40k_cityscapes_20200616_142517-4123f401.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/ccnet/ccnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r101-d8_512x1024_40k_cityscapes/ccnet_r101-d8_512x1024_40k_cityscapes_20200616_142540-a3b84ba6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/ccnet/ccnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r50-d8_769x769_40k_cityscapes/ccnet_r50-d8_769x769_40k_cityscapes_20200616_145125-76d11884.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/ccnet/ccnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r101-d8_769x769_40k_cityscapes/ccnet_r101-d8_769x769_40k_cityscapes_20200617_101428-4f57c8d0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/ccnet/ccnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r50-d8_512x1024_80k_cityscapes/ccnet_r50-d8_512x1024_80k_cityscapes_20200617_010421-869a3423.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/ccnet/ccnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r101-d8_512x1024_80k_cityscapes/ccnet_r101-d8_512x1024_80k_cityscapes_20200617_203935-ffae8917.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/ccnet/ccnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r50-d8_769x769_80k_cityscapes/ccnet_r50-d8_769x769_80k_cityscapes_20200617_010421-73eed8ca.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/ccnet/ccnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r101-d8_769x769_80k_cityscapes/ccnet_r101-d8_769x769_80k_cityscapes_20200618_011502-ad3cd481.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/ccnet/ccnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r50-d8_512x512_80k_ade20k/ccnet_r50-d8_512x512_80k_ade20k_20200615_014848-aa37f61e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/ccnet/ccnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r101-d8_512x512_80k_ade20k/ccnet_r101-d8_512x512_80k_ade20k_20200615_014848-1f4929a3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/ccnet/ccnet.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r50-d8_512x512_160k_ade20k/ccnet_r50-d8_512x512_160k_ade20k_20200616_084435-7c97193b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/ccnet/ccnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r101-d8_512x512_160k_ade20k/ccnet_r101-d8_512x512_160k_ade20k_20200616_000644-e849e007.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/ccnet/ccnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r50-d8_512x512_20k_voc12aug/ccnet_r50-d8_512x512_20k_voc12aug_20200617_193212-fad81784.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/ccnet/ccnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r101-d8_512x512_20k_voc12aug/ccnet_r101-d8_512x512_20k_voc12aug_20200617_193212-0007b61d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/ccnet/ccnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r50-d8_512x512_40k_voc12aug/ccnet_r50-d8_512x512_40k_voc12aug_20200613_232127-c2a15f02.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/ccnet/ccnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ccnet/ccnet_r101-d8_512x512_40k_voc12aug/ccnet_r101-d8_512x512_40k_voc12aug_20200613_232127-c30da577.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/cgnet/cgnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/cgnet/cgnet_680x680_60k_cityscapes/cgnet_680x680_60k_cityscapes_20201101_110253-4c0b2f2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/cgnet/cgnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/cgnet/cgnet_512x1024_60k_cityscapes/cgnet_512x1024_60k_cityscapes_20201101_110254-124ea03b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/cgnet/cgnet.yml | https://arxiv.org/abs/1811.08201 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/convnext/convnext.yml | https://download.openmmlab.com/mmsegmentation/v0.5/convnext/upernet_convnext_xlarge_fp16_640x640_160k_ade20k/upernet_convnext_xlarge_fp16_640x640_160k_ade20k_20220226_080344-95fc38c2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/convnext/convnext.yml | https://download.openmmlab.com/mmsegmentation/v0.5/convnext/upernet_convnext_tiny_fp16_512x512_160k_ade20k/upernet_convnext_tiny_fp16_512x512_160k_ade20k_20220227_124553-cad485de.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/convnext/convnext.yml | https://download.openmmlab.com/mmsegmentation/v0.5/convnext/upernet_convnext_small_fp16_512x512_160k_ade20k/upernet_convnext_small_fp16_512x512_160k_ade20k_20220227_131208-1b1e394f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/convnext/convnext.yml | https://download.openmmlab.com/mmsegmentation/v0.5/convnext/upernet_convnext_large_fp16_640x640_160k_ade20k/upernet_convnext_large_fp16_640x640_160k_ade20k_20220226_040532-e57aa54d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/convnext/convnext.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/convnext/upernet_convnext_base_fp16_640x640_160k_ade20k/upernet_convnext_base_fp16_640x640_160k_ade20k_20220227_182859-9280e39b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/convnext/convnext.yml | https://download.openmmlab.com/mmsegmentation/v0.5/convnext/upernet_convnext_base_fp16_512x512_160k_ade20k/upernet_convnext_base_fp16_512x512_160k_ade20k_20220227_181227-02a24fc6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/convnext/upernet_convnext_base_fp16_640x640_160k_ade20k.py | https://download.openmmlab.com/mmclassification/v0/convnext/downstream/convnext-base_3rdparty_in21k_20220301-262fd037.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/convnext/upernet_convnext_large_fp16_640x640_160k_ade20k.py | https://download.openmmlab.com/mmclassification/v0/convnext/downstream/convnext-large_3rdparty_in21k_20220301-e6e0ea0a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/convnext/upernet_convnext_small_fp16_512x512_160k_ade20k.py | https://download.openmmlab.com/mmclassification/v0/convnext/downstream/convnext-small_3rdparty_32xb128-noema_in1k_20220301-303e75e3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/convnext/upernet_convnext_tiny_fp16_512x512_160k_ade20k.py | https://download.openmmlab.com/mmclassification/v0/convnext/downstream/convnext-tiny_3rdparty_32xb128-noema_in1k_20220301-795e9634.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/convnext/upernet_convnext_xlarge_fp16_640x640_160k_ade20k.py | https://download.openmmlab.com/mmclassification/v0/convnext/downstream/convnext-xlarge_3rdparty_in21k_20220301-08aa5ddc.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/danet/danet.yml | https://arxiv.org/abs/1809.02983 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/danet/danet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r50-d8_512x1024_40k_cityscapes/danet_r50-d8_512x1024_40k_cityscapes_20200605_191324-c0dbfa5f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/danet/danet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r101-d8_512x1024_40k_cityscapes/danet_r101-d8_512x1024_40k_cityscapes_20200605_200831-c57a7157.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/danet/danet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r50-d8_769x769_40k_cityscapes/danet_r50-d8_769x769_40k_cityscapes_20200530_025703-76681c60.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/danet/danet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r101-d8_769x769_40k_cityscapes/danet_r101-d8_769x769_40k_cityscapes_20200530_025717-dcb7fd4e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/danet/danet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r50-d8_512x1024_80k_cityscapes/danet_r50-d8_512x1024_80k_cityscapes_20200607_133029-2bfa2293.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/danet/danet.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r101-d8_512x1024_80k_cityscapes/danet_r101-d8_512x1024_80k_cityscapes_20200607_132918-955e6350.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/danet/danet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r50-d8_769x769_80k_cityscapes/danet_r50-d8_769x769_80k_cityscapes_20200607_132954-495689b4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/danet/danet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r101-d8_769x769_80k_cityscapes/danet_r101-d8_769x769_80k_cityscapes_20200607_132918-f3a929e7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/danet/danet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r50-d8_512x512_80k_ade20k/danet_r50-d8_512x512_80k_ade20k_20200615_015125-edb18e08.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/danet/danet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r101-d8_512x512_80k_ade20k/danet_r101-d8_512x512_80k_ade20k_20200615_015126-d0357c73.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/danet/danet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r50-d8_512x512_160k_ade20k/danet_r50-d8_512x512_160k_ade20k_20200616_082340-9cb35dcd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/danet/danet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r101-d8_512x512_160k_ade20k/danet_r101-d8_512x512_160k_ade20k_20200616_082348-23bf12f9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/danet/danet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r50-d8_512x512_20k_voc12aug/danet_r50-d8_512x512_20k_voc12aug_20200618_070026-9e9e3ab3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/danet/danet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r101-d8_512x512_20k_voc12aug/danet_r101-d8_512x512_20k_voc12aug_20200618_070026-d48d23b2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/danet/danet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r50-d8_512x512_40k_voc12aug/danet_r50-d8_512x512_40k_voc12aug_20200613_235526-426e3a64.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/danet/danet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/danet/danet_r101-d8_512x512_40k_voc12aug/danet_r101-d8_512x512_40k_voc12aug_20200613_223031-788e232a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://arxiv.org/abs/1706.05587 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50-d8_512x1024_40k_cityscapes/deeplabv3_r50-d8_512x1024_40k_cityscapes_20200605_022449-acadc2f8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_512x1024_40k_cityscapes/deeplabv3_r101-d8_512x1024_40k_cityscapes_20200605_012241-7fd3f799.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50-d8_769x769_40k_cityscapes/deeplabv3_r50-d8_769x769_40k_cityscapes_20200606_113723-7eda553c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_769x769_40k_cityscapes/deeplabv3_r101-d8_769x769_40k_cityscapes_20200606_113809-c64f889f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r18-d8_512x1024_80k_cityscapes/deeplabv3_r18-d8_512x1024_80k_cityscapes_20201225_021506-23dffbe2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50-d8_512x1024_80k_cityscapes/deeplabv3_r50-d8_512x1024_80k_cityscapes_20200606_113404-b92cfdd4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_512x1024_80k_cityscapes/deeplabv3_r101-d8_512x1024_80k_cityscapes_20200606_113503-9e428899.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_fp16_512x1024_80k_cityscapes/deeplabv3_r101-d8_fp16_512x1024_80k_cityscapes_20200717_230920-774d9cec.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r18-d8_769x769_80k_cityscapes/deeplabv3_r18-d8_769x769_80k_cityscapes_20201225_021506-6452126a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50-d8_769x769_80k_cityscapes/deeplabv3_r50-d8_769x769_80k_cityscapes_20200606_221338-788d6228.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_769x769_80k_cityscapes/deeplabv3_r101-d8_769x769_80k_cityscapes_20200607_013353-60e95418.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d16-mg124_512x1024_80k_cityscapes/deeplabv3_r101-d16-mg124_512x1024_80k_cityscapes_20200908_005644-57bb8425.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r18b-d8_512x1024_80k_cityscapes/deeplabv3_r18b-d8_512x1024_80k_cityscapes_20201225_094144-46040cef.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50b-d8_512x1024_80k_cityscapes/deeplabv3_r50b-d8_512x1024_80k_cityscapes_20201225_155148-ec368954.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101b-d8_512x1024_80k_cityscapes/deeplabv3_r101b-d8_512x1024_80k_cityscapes_20201226_171821-8fd49503.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r18b-d8_769x769_80k_cityscapes/deeplabv3_r18b-d8_769x769_80k_cityscapes_20201225_094144-fdc985d9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50b-d8_769x769_80k_cityscapes/deeplabv3_r50b-d8_769x769_80k_cityscapes_20201225_155404-87fb0cf4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101b-d8_769x769_80k_cityscapes/deeplabv3_r101b-d8_769x769_80k_cityscapes_20201226_190843-9142ee57.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50-d8_512x512_80k_ade20k/deeplabv3_r50-d8_512x512_80k_ade20k_20200614_185028-0bb3f844.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_512x512_80k_ade20k/deeplabv3_r101-d8_512x512_80k_ade20k_20200615_021256-d89c7fa4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50-d8_512x512_160k_ade20k/deeplabv3_r50-d8_512x512_160k_ade20k_20200615_123227-5d0ee427.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_512x512_160k_ade20k/deeplabv3_r101-d8_512x512_160k_ade20k_20200615_105816-b1f72b3b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50-d8_512x512_20k_voc12aug/deeplabv3_r50-d8_512x512_20k_voc12aug_20200617_010906-596905ef.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_512x512_20k_voc12aug/deeplabv3_r101-d8_512x512_20k_voc12aug_20200617_010932-8d13832f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50-d8_512x512_40k_voc12aug/deeplabv3_r50-d8_512x512_40k_voc12aug_20200613_161546-2ae96e7e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_512x512_40k_voc12aug/deeplabv3_r101-d8_512x512_40k_voc12aug_20200613_161432-0017d784.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3/deeplabv3.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_480x480_40k_pascal_context/deeplabv3_r101-d8_480x480_40k_pascal_context_20200911_204118-1aa27336.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_480x480_80k_pascal_context/deeplabv3_r101-d8_480x480_80k_pascal_context_20200911_170155-2a21fff3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_480x480_40k_pascal_context_59/deeplabv3_r101-d8_480x480_40k_pascal_context_59_20210416_110332-cb08ea46.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_480x480_80k_pascal_context_59/deeplabv3_r101-d8_480x480_80k_pascal_context_59_20210416_113002-26303993.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50-d8_512x512_4x4_20k_coco-stuff10k/deeplabv3_r50-d8_512x512_4x4_20k_coco-stuff10k_20210821_043025-b35f789d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_512x512_4x4_20k_coco-stuff10k/deeplabv3_r101-d8_512x512_4x4_20k_coco-stuff10k_20210821_043025-c49752cb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50-d8_512x512_4x4_40k_coco-stuff10k/deeplabv3_r50-d8_512x512_4x4_40k_coco-stuff10k_20210821_043305-dc76f3ff.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_512x512_4x4_40k_coco-stuff10k/deeplabv3_r101-d8_512x512_4x4_40k_coco-stuff10k_20210821_043305-636cb433.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50-d8_512x512_4x4_80k_coco-stuff164k/deeplabv3_r50-d8_512x512_4x4_80k_coco-stuff164k_20210709_163016-88675c24.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_512x512_4x4_80k_coco-stuff164k/deeplabv3_r101-d8_512x512_4x4_80k_coco-stuff164k_20210709_201252-13600dc2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50-d8_512x512_4x4_160k_coco-stuff164k/deeplabv3_r50-d8_512x512_4x4_160k_coco-stuff164k_20210709_163016-49f2812b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_512x512_4x4_160k_coco-stuff164k/deeplabv3_r101-d8_512x512_4x4_160k_coco-stuff164k_20210709_155402-f035acfd.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r50-d8_512x512_4x4_320k_coco-stuff164k/deeplabv3_r50-d8_512x512_4x4_320k_coco-stuff164k_20210709_155403-51b21115.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3/deeplabv3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3/deeplabv3_r101-d8_512x512_4x4_320k_coco-stuff164k/deeplabv3_r101-d8_512x512_4x4_320k_coco-stuff164k_20210709_155402-3cbca14d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50-d8_512x1024_40k_cityscapes/deeplabv3plus_r50-d8_512x1024_40k_cityscapes_20200605_094610-d222ffcd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_512x1024_40k_cityscapes/deeplabv3plus_r101-d8_512x1024_40k_cityscapes_20200605_094614-3769eecf.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50-d8_769x769_40k_cityscapes/deeplabv3plus_r50-d8_769x769_40k_cityscapes_20200606_114143-1dcb0e3c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_769x769_40k_cityscapes/deeplabv3plus_r101-d8_769x769_40k_cityscapes_20200606_114304-ff414b9e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r18-d8_512x1024_80k_cityscapes/deeplabv3plus_r18-d8_512x1024_80k_cityscapes_20201226_080942-cff257fe.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50-d8_512x1024_80k_cityscapes/deeplabv3plus_r50-d8_512x1024_80k_cityscapes_20200606_114049-f9fb496d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_512x1024_80k_cityscapes/deeplabv3plus_r101-d8_512x1024_80k_cityscapes_20200606_114143-068fcfe9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r18-d8_769x769_80k_cityscapes/deeplabv3plus_r18-d8_769x769_80k_cityscapes_20201226_083346-f326e06a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50-d8_769x769_80k_cityscapes/deeplabv3plus_r50-d8_769x769_80k_cityscapes_20200606_210233-0e9dfdc4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d16-mg124_512x1024_40k_cityscapes/deeplabv3plus_r101-d16-mg124_512x1024_40k_cityscapes_20200908_005644-cf9ce186.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d16-mg124_512x1024_80k_cityscapes/deeplabv3plus_r101-d16-mg124_512x1024_80k_cityscapes_20200908_005644-ee6158e0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r18b-d8_512x1024_80k_cityscapes/deeplabv3plus_r18b-d8_512x1024_80k_cityscapes_20201226_090828-e451abd9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50b-d8_512x1024_80k_cityscapes/deeplabv3plus_r50b-d8_512x1024_80k_cityscapes_20201225_213645-a97e4e43.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101b-d8_512x1024_80k_cityscapes/deeplabv3plus_r101b-d8_512x1024_80k_cityscapes_20201226_190843-9c3c93a4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r18b-d8_769x769_80k_cityscapes/deeplabv3plus_r18b-d8_769x769_80k_cityscapes_20201226_151312-2c868aff.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50b-d8_769x769_80k_cityscapes/deeplabv3plus_r50b-d8_769x769_80k_cityscapes_20201225_224655-8b596d1c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101b-d8_769x769_80k_cityscapes/deeplabv3plus_r101b-d8_769x769_80k_cityscapes_20201226_205041-227cdf7c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50-d8_512x512_80k_ade20k/deeplabv3plus_r50-d8_512x512_80k_ade20k_20200614_185028-bf1400d8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_512x512_80k_ade20k/deeplabv3plus_r101-d8_512x512_80k_ade20k_20200615_014139-d5730af7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50-d8_512x512_160k_ade20k/deeplabv3plus_r50-d8_512x512_160k_ade20k_20200615_124504-6135c7e0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_512x512_160k_ade20k/deeplabv3plus_r101-d8_512x512_160k_ade20k_20200615_123232-38ed86bb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50-d8_512x512_20k_voc12aug/deeplabv3plus_r50-d8_512x512_20k_voc12aug_20200617_102323-aad58ef1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_512x512_20k_voc12aug/deeplabv3plus_r101-d8_512x512_20k_voc12aug_20200617_102345-c7ff3d56.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50-d8_512x512_40k_voc12aug/deeplabv3plus_r50-d8_512x512_40k_voc12aug_20200613_161759-e1b43aa9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_512x512_40k_voc12aug/deeplabv3plus_r101-d8_512x512_40k_voc12aug_20200613_205333-faf03387.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_480x480_40k_pascal_context/deeplabv3plus_r101-d8_480x480_40k_pascal_context_20200911_165459-d3c8a29e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_480x480_80k_pascal_context/deeplabv3plus_r101-d8_480x480_80k_pascal_context_20200911_155322-145d3ee8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_480x480_40k_pascal_context_59/deeplabv3plus_r101-d8_480x480_40k_pascal_context_59_20210416_111233-ed937f15.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_480x480_80k_pascal_context_59/deeplabv3plus_r101-d8_480x480_80k_pascal_context_59_20210416_111127-7ca0331d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_fp16_512x1024_80k_cityscapes/deeplabv3plus_r101-d8_fp16_512x1024_80k_cityscapes_20200717_230920-f1104f4b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_769x769_80k_cityscapes/deeplabv3plus_r101-d8_769x769_80k_cityscapes_20220406_154720-dfcc0b68.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r18-d8_512x512_80k_loveda/deeplabv3plus_r18-d8_512x512_80k_loveda_20211104_132800-ce0fa0ca.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50-d8_512x512_80k_loveda/deeplabv3plus_r50-d8_512x512_80k_loveda_20211105_080442-f0720392.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_512x512_80k_loveda/deeplabv3plus_r101-d8_512x512_80k_loveda_20211105_110759-4c1f297e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r18-d8_512x512_80k_potsdam/deeplabv3plus_r18-d8_512x512_80k_potsdam_20211219_020601-75fd5bc3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50-d8_512x512_80k_potsdam/deeplabv3plus_r50-d8_512x512_80k_potsdam_20211219_031508-7e7a2b24.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_512x512_80k_potsdam/deeplabv3plus_r101-d8_512x512_80k_potsdam_20211219_031508-8b112708.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r18-d8_4x4_512x512_80k_vaihingen/deeplabv3plus_r18-d8_4x4_512x512_80k_vaihingen_20211231_230805-7626a263.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50-d8_4x4_512x512_80k_vaihingen/deeplabv3plus_r50-d8_4x4_512x512_80k_vaihingen_20211231_230816-5040938d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r101-d8_4x4_512x512_80k_vaihingen/deeplabv3plus_r101-d8_4x4_512x512_80k_vaihingen_20211231_230816-8a095afa.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r18-d8_4x4_896x896_80k_isaid/deeplabv3plus_r18-d8_4x4_896x896_80k_isaid_20220110_180526-7059991d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://download.openmmlab.com/mmsegmentation/v0.5/deeplabv3plus/deeplabv3plus_r50-d8_4x4_896x896_80k_isaid/deeplabv3plus_r50-d8_4x4_896x896_80k_isaid_20220110_180526-598be439.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/deeplabv3plus/deeplabv3plus.yml | https://arxiv.org/abs/1802.02611 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/dmnet/dmnet.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/dmnet/dmnet_r50-d8_769x769_80k_cityscapes/dmnet_r50-d8_769x769_80k_cityscapes_20201215_034006-6060840e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/dmnet/dmnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dmnet/dmnet_r50-d8_769x769_40k_cityscapes/dmnet_r50-d8_769x769_40k_cityscapes_20201215_093706-e7f0e23e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/dmnet/dmnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dmnet/dmnet_r50-d8_512x512_80k_ade20k/dmnet_r50-d8_512x512_80k_ade20k_20201215_144744-f89092a6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/dmnet/dmnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dmnet/dmnet_r50-d8_512x512_160k_ade20k/dmnet_r50-d8_512x512_160k_ade20k_20201215_115313-025ab3f9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/dmnet/dmnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dmnet/dmnet_r50-d8_512x1024_80k_cityscapes/dmnet_r50-d8_512x1024_80k_cityscapes_20201215_053728-3c8893b9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/dmnet/dmnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dmnet/dmnet_r50-d8_512x1024_40k_cityscapes/dmnet_r50-d8_512x1024_40k_cityscapes_20201215_042326-615373cf.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/dmnet/dmnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dmnet/dmnet_r101-d8_769x769_80k_cityscapes/dmnet_r101-d8_769x769_80k_cityscapes_20201215_082810-7f0de59a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/dmnet/dmnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dmnet/dmnet_r101-d8_769x769_40k_cityscapes/dmnet_r101-d8_769x769_40k_cityscapes_20201215_081348-a74261f6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/dmnet/dmnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dmnet/dmnet_r101-d8_512x512_80k_ade20k/dmnet_r101-d8_512x512_80k_ade20k_20201215_104812-bfa45311.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/dmnet/dmnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dmnet/dmnet_r101-d8_512x512_160k_ade20k/dmnet_r101-d8_512x512_160k_ade20k_20201215_111145-a0bc02ef.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/dmnet/dmnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dmnet/dmnet_r101-d8_512x1024_80k_cityscapes/dmnet_r101-d8_512x1024_80k_cityscapes_20201215_031718-fa081cb8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/dmnet/dmnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dmnet/dmnet_r101-d8_512x1024_40k_cityscapes/dmnet_r101-d8_512x1024_40k_cityscapes_20201215_043100-8291e976.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/dmnet/dmnet.yml | https://openaccess.thecvf.com/content_ICCV_2019/papers/He_Dynamic_Multi-Scale_Filters_for_Semantic_Segmentation_ICCV_2019_paper.pdf | 论文地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/dnlnet/dnlnet.yml | https://arxiv.org/abs/2006.06668 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/dnlnet/dnlnet.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/dnlnet/dnl_r50-d8_512x1024_40k_cityscapes/dnl_r50-d8_512x1024_40k_cityscapes_20200904_233629-53d4ea93.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/dnlnet/dnlnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dnlnet/dnl_r101-d8_512x1024_40k_cityscapes/dnl_r101-d8_512x1024_40k_cityscapes_20200904_233629-9928ffef.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/dnlnet/dnlnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dnlnet/dnl_r50-d8_769x769_40k_cityscapes/dnl_r50-d8_769x769_40k_cityscapes_20200820_232206-0f283785.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/dnlnet/dnlnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dnlnet/dnl_r101-d8_769x769_40k_cityscapes/dnl_r101-d8_769x769_40k_cityscapes_20200820_171256-76c596df.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/dnlnet/dnlnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dnlnet/dnl_r50-d8_512x1024_80k_cityscapes/dnl_r50-d8_512x1024_80k_cityscapes_20200904_233629-58b2f778.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/dnlnet/dnlnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dnlnet/dnl_r101-d8_512x1024_80k_cityscapes/dnl_r101-d8_512x1024_80k_cityscapes_20200904_233629-758e2dd4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/dnlnet/dnlnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dnlnet/dnl_r50-d8_769x769_80k_cityscapes/dnl_r50-d8_769x769_80k_cityscapes_20200820_011925-366bc4c7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/dnlnet/dnlnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dnlnet/dnl_r101-d8_769x769_80k_cityscapes/dnl_r101-d8_769x769_80k_cityscapes_20200821_051111-95ff84ab.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/dnlnet/dnlnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dnlnet/dnl_r50-d8_512x512_80k_ade20k/dnl_r50-d8_512x512_80k_ade20k_20200826_183354-1cf6e0c1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/dnlnet/dnlnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dnlnet/dnl_r101-d8_512x512_80k_ade20k/dnl_r101-d8_512x512_80k_ade20k_20200826_183354-d820d6ea.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/dnlnet/dnlnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dnlnet/dnl_r50-d8_512x512_160k_ade20k/dnl_r50-d8_512x512_160k_ade20k_20200826_183350-37837798.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/dnlnet/dnlnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dnlnet/dnl_r101-d8_512x512_160k_ade20k/dnl_r101-d8_512x512_160k_ade20k_20200826_183350-ed522c61.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/dpt/dpt.yml | https://download.openmmlab.com/mmsegmentation/v0.5/dpt/dpt_vit-b16_512x512_160k_ade20k/dpt_vit-b16_512x512_160k_ade20k-db31cf52.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/dpt/dpt.yml | https://arxiv.org/abs/2103.13413 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/emanet/emanet.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/emanet/emanet_r50-d8_512x1024_80k_cityscapes/emanet_r50-d8_512x1024_80k_cityscapes_20200901_100301-c43fcef1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/emanet/emanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/emanet/emanet_r101-d8_512x1024_80k_cityscapes/emanet_r101-d8_512x1024_80k_cityscapes_20200901_100301-2d970745.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/emanet/emanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/emanet/emanet_r50-d8_769x769_80k_cityscapes/emanet_r50-d8_769x769_80k_cityscapes_20200901_100301-16f8de52.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/emanet/emanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/emanet/emanet_r101-d8_769x769_80k_cityscapes/emanet_r101-d8_769x769_80k_cityscapes_20200901_100301-47a324ce.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/emanet/emanet.yml | https://arxiv.org/abs/1907.13426 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/encnet/encnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/encnet/encnet_r50-d8_512x1024_40k_cityscapes/encnet_r50-d8_512x1024_40k_cityscapes_20200621_220958-68638a47.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/encnet/encnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/encnet/encnet_r101-d8_512x1024_40k_cityscapes/encnet_r101-d8_512x1024_40k_cityscapes_20200621_220933-35e0a3e8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/encnet/encnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/encnet/encnet_r50-d8_769x769_40k_cityscapes/encnet_r50-d8_769x769_40k_cityscapes_20200621_220958-3bcd2884.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/encnet/encnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/encnet/encnet_r101-d8_769x769_40k_cityscapes/encnet_r101-d8_769x769_40k_cityscapes_20200621_220933-2fafed55.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/encnet/encnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/encnet/encnet_r50-d8_512x1024_80k_cityscapes/encnet_r50-d8_512x1024_80k_cityscapes_20200622_003554-fc5c5624.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/encnet/encnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/encnet/encnet_r101-d8_512x1024_80k_cityscapes/encnet_r101-d8_512x1024_80k_cityscapes_20200622_003555-1de64bec.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/encnet/encnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/encnet/encnet_r50-d8_769x769_80k_cityscapes/encnet_r50-d8_769x769_80k_cityscapes_20200622_003554-55096dcb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/encnet/encnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/encnet/encnet_r101-d8_769x769_80k_cityscapes/encnet_r101-d8_769x769_80k_cityscapes_20200622_003555-470ef79d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/encnet/encnet.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/encnet/encnet_r50-d8_512x512_80k_ade20k/encnet_r50-d8_512x512_80k_ade20k_20200622_042412-44b46b04.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/encnet/encnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/encnet/encnet_r101-d8_512x512_80k_ade20k/encnet_r101-d8_512x512_80k_ade20k_20200622_101128-dd35e237.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/encnet/encnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/encnet/encnet_r50-d8_512x512_160k_ade20k/encnet_r50-d8_512x512_160k_ade20k_20200622_101059-b2db95e0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/encnet/encnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/encnet/encnet_r101-d8_512x512_160k_ade20k/encnet_r101-d8_512x512_160k_ade20k_20200622_073348-7989641f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/encnet/encnet.yml | https://arxiv.org/abs/1803.08904 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/erfnet/erfnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/erfnet/erfnet_fcn_4x4_512x1024_160k_cityscapes/erfnet_fcn_4x4_512x1024_160k_cityscapes_20211126_082056-03d333ed.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/fastfcn/fastfcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fastfcn/fastfcn_r50-d32_jpu_psp_512x512_80k_ade20k/fastfcn_r50-d32_jpu_psp_512x512_80k_ade20k_20210930_225137-993d07c8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/fastfcn/fastfcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fastfcn/fastfcn_r50-d32_jpu_psp_512x512_160k_ade20k/fastfcn_r50-d32_jpu_psp_512x512_160k_ade20k_20211008_105455-e8f5a2fd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/fastfcn/fastfcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fastfcn/fastfcn_r50-d32_jpu_psp_512x1024_80k_cityscapes/fastfcn_r50-d32_jpu_psp_512x1024_80k_cityscapes_20210928_053722-57749bed.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/fastfcn/fastfcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fastfcn/fastfcn_r50-d32_jpu_psp_4x4_512x1024_80k_cityscapes/fastfcn_r50-d32_jpu_psp_4x4_512x1024_80k_cityscapes_20210925_061841-77e87b0a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/fastfcn/fastfcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fastfcn/fastfcn_r50-d32_jpu_enc_512x512_80k_ade20k/fastfcn_r50-d32_jpu_enc_512x512_80k_ade20k_20210930_225214-65aef6dd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/fastfcn/fastfcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fastfcn/fastfcn_r50-d32_jpu_enc_512x512_160k_ade20k/fastfcn_r50-d32_jpu_enc_512x512_160k_ade20k_20211008_105456-d875ce3c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/fastfcn/fastfcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fastfcn/fastfcn_r50-d32_jpu_enc_512x1024_80k_cityscapes/fastfcn_r50-d32_jpu_enc_512x1024_80k_cityscapes_20210928_030036-78da5046.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/fastfcn/fastfcn.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/fastfcn/fastfcn_r50-d32_jpu_enc_4x4_512x1024_80k_cityscapes/fastfcn_r50-d32_jpu_enc_4x4_512x1024_80k_cityscapes_20210926_093217-e1eb6dbb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/fastfcn/fastfcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fastfcn/fastfcn_r50-d32_jpu_aspp_512x512_80k_ade20k/fastfcn_r50-d32_jpu_aspp_512x512_80k_ade20k_20211013_190619-3aa40f2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/fastfcn/fastfcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fastfcn/fastfcn_r50-d32_jpu_aspp_512x512_160k_ade20k/fastfcn_r50-d32_jpu_aspp_512x512_160k_ade20k_20211008_152246-27036aee.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/fastfcn/fastfcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fastfcn/fastfcn_r50-d32_jpu_aspp_512x1024_80k_cityscapes/fastfcn_r50-d32_jpu_aspp_512x1024_80k_cityscapes_20210928_053722-5d1a2648.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/fastfcn/fastfcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fastfcn/fastfcn_r50-d32_jpu_aspp_4x4_512x1024_80k_cityscapes/fastfcn_r50-d32_jpu_aspp_4x4_512x1024_80k_cityscapes_20210924_214357-72220849.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/fastfcn/fastfcn.yml | https://arxiv.org/abs/1903.11816 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/fastscnn/fastscnn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fast_scnn/fast_scnn_lr0.12_8x4_160k_cityscapes/fast_scnn_lr0.12_8x4_160k_cityscapes_20210630_164853-0cec9937.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/fastscnn/fastscnn.yml | https://arxiv.org/abs/1902.04502 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_fp16_512x1024_80k_cityscapes/fcn_r101-d8_fp16_512x1024_80k_cityscapes_20200717_230921-fb13e883.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r50-d8_512x1024_40k_cityscapes/fcn_r50-d8_512x1024_40k_cityscapes_20200604_192608-efe53f0d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_512x1024_40k_cityscapes/fcn_r101-d8_512x1024_40k_cityscapes_20200604_181852-a883d3a1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r50-d8_769x769_40k_cityscapes/fcn_r50-d8_769x769_40k_cityscapes_20200606_113104-977b5d02.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_769x769_40k_cityscapes/fcn_r101-d8_769x769_40k_cityscapes_20200606_113208-7d4ab69c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r18-d8_512x1024_80k_cityscapes/fcn_r18-d8_512x1024_80k_cityscapes_20201225_021327-6c50f8b4.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r50-d8_512x1024_80k_cityscapes/fcn_r50-d8_512x1024_80k_cityscapes_20200606_113019-03aa804d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_512x1024_80k_cityscapes/fcn_r101-d8_512x1024_80k_cityscapes_20200606_113038-3fb937eb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r18-d8_769x769_80k_cityscapes/fcn_r18-d8_769x769_80k_cityscapes_20201225_021451-9739d1b8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r50-d8_769x769_80k_cityscapes/fcn_r50-d8_769x769_80k_cityscapes_20200606_195749-f5caeabc.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_769x769_80k_cityscapes/fcn_r101-d8_769x769_80k_cityscapes_20200606_214354-45cbac68.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r18b-d8_512x1024_80k_cityscapes/fcn_r18b-d8_512x1024_80k_cityscapes_20201225_230143-92c0f445.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r50b-d8_512x1024_80k_cityscapes/fcn_r50b-d8_512x1024_80k_cityscapes_20201225_094221-82957416.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101b-d8_512x1024_80k_cityscapes/fcn_r101b-d8_512x1024_80k_cityscapes_20201226_160213-4543858f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r18b-d8_769x769_80k_cityscapes/fcn_r18b-d8_769x769_80k_cityscapes_20201226_004430-32d504e5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r50b-d8_769x769_80k_cityscapes/fcn_r50b-d8_769x769_80k_cityscapes_20201225_094223-94552d38.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101b-d8_769x769_80k_cityscapes/fcn_r101b-d8_769x769_80k_cityscapes_20201226_170012-82be37e2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_d6_r50-d16_512x1024_40k_cityscapes/fcn_d6_r50-d16_512x1024_40k_cityscapes_20210305_130133-98d5d1bc.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_d6_r50-d16_512x1024_80k_cityscapes/fcn_d6_r50-d16_512x1024_80k_cityscapes_20210306_115604-133c292f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/fcn/fcn.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_d6_r50-d16_769x769_40k_cityscapes/fcn_d6_r50-d16_769x769_40k_cityscapes_20210305_185744-1aab18ed.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_d6_r50-d16_769x769_80k_cityscapes/fcn_d6_r50-d16_769x769_80k_cityscapes_20210305_200413-109d88eb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_d6_r101-d16_512x1024_40k_cityscapes/fcn_d6_r101-d16_512x1024_40k_cityscapes_20210305_130337-9cf2b450.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_d6_r101-d16_512x1024_80k_cityscapes/fcn_d6_r101-d16_512x1024_80k_cityscapes_20210308_102747-cb336445.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_d6_r101-d16_769x769_40k_cityscapes/fcn_d6_r101-d16_769x769_40k_cityscapes_20210308_102453-60b114e9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_d6_r101-d16_769x769_80k_cityscapes/fcn_d6_r101-d16_769x769_80k_cityscapes_20210306_120016-e33adc4f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_d6_r50b-d16_512x1024_80k_cityscapes/fcn_d6_r50b-d16_512x1024_80k_cityscapes_20210311_125550-6a0b62e9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_d6_r50b-d16_769x769_80k_cityscapes/fcn_d6_r50b-d16_769x769_80k_cityscapes_20210311_131012-d665f231.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_d6_r101b-d16_512x1024_80k_cityscapes/fcn_d6_r101b-d16_512x1024_80k_cityscapes_20210311_144305-3f2eb5b4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_d6_r101b-d16_769x769_80k_cityscapes/fcn_d6_r101b-d16_769x769_80k_cityscapes_20210311_154527-c4d8bfbc.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r50-d8_512x512_80k_ade20k/fcn_r50-d8_512x512_80k_ade20k_20200614_144016-f8ac5082.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_512x512_80k_ade20k/fcn_r101-d8_512x512_80k_ade20k_20200615_014143-bc1809f7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r50-d8_512x512_160k_ade20k/fcn_r50-d8_512x512_160k_ade20k_20200615_100713-4edbc3b4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_512x512_160k_ade20k/fcn_r101-d8_512x512_160k_ade20k_20200615_105816-fd192bd5.pth | 
权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r50-d8_512x512_20k_voc12aug/fcn_r50-d8_512x512_20k_voc12aug_20200617_010715-52dc5306.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_512x512_20k_voc12aug/fcn_r101-d8_512x512_20k_voc12aug_20200617_010842-0bb4e798.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r50-d8_512x512_40k_voc12aug/fcn_r50-d8_512x512_40k_voc12aug_20200613_161222-5e2dbf40.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_512x512_40k_voc12aug/fcn_r101-d8_512x512_40k_voc12aug_20200613_161240-4c8bcefd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_480x480_40k_pascal_context/fcn_r101-d8_480x480_40k_pascal_context-20210421_154757-b5e97937.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_480x480_80k_pascal_context/fcn_r101-d8_480x480_80k_pascal_context-20210421_163310-4711813f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_480x480_40k_pascal_context_59/fcn_r101-d8_480x480_40k_pascal_context_59_20210415_230724-8cf83682.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/fcn/fcn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/fcn/fcn_r101-d8_480x480_80k_pascal_context_59/fcn_r101-d8_480x480_80k_pascal_context_59_20210416_110804-9a6f2c94.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/fcn/fcn.yml | https://arxiv.org/abs/1411.4038 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/gcnet/gcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r50-d8_512x1024_40k_cityscapes/gcnet_r50-d8_512x1024_40k_cityscapes_20200618_074436-4b0fd17b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/gcnet/gcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r101-d8_512x1024_40k_cityscapes/gcnet_r101-d8_512x1024_40k_cityscapes_20200618_074436-5e62567f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/gcnet/gcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r50-d8_769x769_40k_cityscapes/gcnet_r50-d8_769x769_40k_cityscapes_20200618_182814-a26f4471.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/gcnet/gcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r101-d8_769x769_40k_cityscapes/gcnet_r101-d8_769x769_40k_cityscapes_20200619_092550-ca4f0a84.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/gcnet/gcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r50-d8_512x1024_80k_cityscapes/gcnet_r50-d8_512x1024_80k_cityscapes_20200618_074450-ef8f069b.pth | 权重地址 
| +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/gcnet/gcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r101-d8_512x1024_80k_cityscapes/gcnet_r101-d8_512x1024_80k_cityscapes_20200618_074450-778ebf69.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/gcnet/gcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r50-d8_769x769_80k_cityscapes/gcnet_r50-d8_769x769_80k_cityscapes_20200619_092516-4839565b.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/gcnet/gcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r101-d8_769x769_80k_cityscapes/gcnet_r101-d8_769x769_80k_cityscapes_20200619_092628-8e043423.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/gcnet/gcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r50-d8_512x512_80k_ade20k/gcnet_r50-d8_512x512_80k_ade20k_20200614_185146-91a6da41.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/gcnet/gcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r101-d8_512x512_80k_ade20k/gcnet_r101-d8_512x512_80k_ade20k_20200615_020811-c3fcb6dd.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/gcnet/gcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r50-d8_512x512_160k_ade20k/gcnet_r50-d8_512x512_160k_ade20k_20200615_224122-d95f3e1f.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/gcnet/gcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r101-d8_512x512_160k_ade20k/gcnet_r101-d8_512x512_160k_ade20k_20200615_225406-615528d7.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/gcnet/gcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r50-d8_512x512_20k_voc12aug/gcnet_r50-d8_512x512_20k_voc12aug_20200617_165701-3cbfdab1.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/gcnet/gcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r101-d8_512x512_20k_voc12aug/gcnet_r101-d8_512x512_20k_voc12aug_20200617_165713-6c720aa9.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/gcnet/gcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r50-d8_512x512_40k_voc12aug/gcnet_r50-d8_512x512_40k_voc12aug_20200613_195105-9797336d.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/gcnet/gcnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/gcnet/gcnet_r101-d8_512x512_40k_voc12aug/gcnet_r101-d8_512x512_40k_voc12aug_20200613_185806-1e38208d.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/gcnet/gcnet.yml | https://arxiv.org/abs/1904.11492 | 论文地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18s_512x1024_40k_cityscapes/fcn_hr18s_512x1024_40k_cityscapes_20200601_014216-93db27d0.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18_512x1024_40k_cityscapes/fcn_hr18_512x1024_40k_cityscapes_20200601_014216-f196fb4e.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_512x1024_40k_cityscapes/fcn_hr48_512x1024_40k_cityscapes_20200601_014240-a989b146.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18s_512x1024_80k_cityscapes/fcn_hr18s_512x1024_80k_cityscapes_20200601_202700-1462b75d.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18_512x1024_80k_cityscapes/fcn_hr18_512x1024_80k_cityscapes_20200601_223255-4e7b345e.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_512x1024_80k_cityscapes/fcn_hr48_512x1024_80k_cityscapes_20200601_202606-58ea95d6.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18s_512x1024_160k_cityscapes/fcn_hr18s_512x1024_160k_cityscapes_20200602_190901-4a0797ea.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18_512x1024_160k_cityscapes/fcn_hr18_512x1024_160k_cityscapes_20200602_190822-221e4a4f.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_512x1024_160k_cityscapes/fcn_hr48_512x1024_160k_cityscapes_20200602_190946-59b7973e.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18s_512x512_80k_ade20k/fcn_hr18s_512x512_80k_ade20k_20200614_144345-77fc814a.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18_512x512_80k_ade20k/fcn_hr18_512x512_80k_ade20k_20210827_114910-6c9382c0.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_512x512_80k_ade20k/fcn_hr48_512x512_80k_ade20k_20200614_193946-7ba5258d.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/hrnet/hrnet.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18s_512x512_160k_ade20k/fcn_hr18s_512x512_160k_ade20k_20210829_174739-f1e7c2e7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18_512x512_160k_ade20k/fcn_hr18_512x512_160k_ade20k_20200614_214426-ca961836.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_512x512_160k_ade20k/fcn_hr48_512x512_160k_ade20k_20200614_214407-a52fc02c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18s_512x512_20k_voc12aug/fcn_hr18s_512x512_20k_voc12aug_20210829_174910-0aceadb4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18_512x512_20k_voc12aug/fcn_hr18_512x512_20k_voc12aug_20200617_224503-488d45f7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_512x512_20k_voc12aug/fcn_hr48_512x512_20k_voc12aug_20200617_224419-89de05cd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18s_512x512_40k_voc12aug/fcn_hr18s_512x512_40k_voc12aug_20200614_000648-4f8d6e7f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18_512x512_40k_voc12aug/fcn_hr18_512x512_40k_voc12aug_20200613_224401-1b4b76cd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_512x512_40k_voc12aug/fcn_hr48_512x512_40k_voc12aug_20200613_222111-1b0f18bc.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_480x480_40k_pascal_context/fcn_hr48_480x480_40k_pascal_context_20200911_164852-667d00b0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_480x480_80k_pascal_context/fcn_hr48_480x480_80k_pascal_context_20200911_155322-847a6711.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_480x480_40k_pascal_context_59/fcn_hr48_480x480_40k_pascal_context_59_20210410_122738-b808b8b2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_480x480_80k_pascal_context_59/fcn_hr48_480x480_80k_pascal_context_59_20210411_003240-3ae7081e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18s_512x512_80k_loveda/fcn_hr18s_512x512_80k_loveda_20211210_203228-60a86a7a.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18_512x512_80k_loveda/fcn_hr18_512x512_80k_loveda_20211210_203952-93d9c3b3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_512x512_80k_loveda/fcn_hr48_512x512_80k_loveda_20211211_044756-67072f55.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18s_512x512_80k_potsdam/fcn_hr18s_512x512_80k_potsdam_20211218_205517-ba32af63.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18_512x512_80k_potsdam/fcn_hr18_512x512_80k_potsdam_20211218_205517-5d0387ad.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_512x512_80k_potsdam/fcn_hr48_512x512_80k_potsdam_20211219_020601-97434c78.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18s_4x4_512x512_80k_vaihingen/fcn_hr18s_4x4_512x512_80k_vaihingen_20211231_230909-b23aae02.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18_4x4_512x512_80k_vaihingen/fcn_hr18_4x4_512x512_80k_vaihingen_20211231_231216-2ec3ae8a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_4x4_512x512_80k_vaihingen/fcn_hr48_4x4_512x512_80k_vaihingen_20211231_231244-7133cb22.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18s_4x4_896x896_80k_isaid/fcn_hr18s_4x4_896x896_80k_isaid_20220118_001603-3cc0769b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr18_4x4_896x896_80k_isaid/fcn_hr18_4x4_896x896_80k_isaid_20220110_182230-49bf752e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/hrnet/hrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/hrnet/fcn_hr48_4x4_896x896_80k_isaid/fcn_hr48_4x4_896x896_80k_isaid_20220114_174643-547fc420.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/icnet/icnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/icnet/icnet_r50-d8_in1k-pre_832x832_80k_cityscapes/icnet_r50-d8_in1k-pre_832x832_80k_cityscapes_20210926_032943-1743dc7b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/icnet/icnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/icnet/icnet_r50-d8_in1k-pre_832x832_160k_cityscapes/icnet_r50-d8_in1k-pre_832x832_160k_cityscapes_20210926_042715-ce310aea.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/icnet/icnet.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/icnet/icnet_r50-d8_832x832_80k_cityscapes/icnet_r50-d8_832x832_80k_cityscapes_20210926_044625-c6407341.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/icnet/icnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/icnet/icnet_r50-d8_832x832_160k_cityscapes/icnet_r50-d8_832x832_160k_cityscapes_20210925_232612-a95f0d4e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/icnet/icnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/icnet/icnet_r18-d8_in1k-pre_832x832_80k_cityscapes/icnet_r18-d8_in1k-pre_832x832_80k_cityscapes_20210925_230354-1cbe3022.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/icnet/icnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/icnet/icnet_r18-d8_in1k-pre_832x832_160k_cityscapes/icnet_r18-d8_in1k-pre_832x832_160k_cityscapes_20210926_052702-619c8ae1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/icnet/icnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/icnet/icnet_r18-d8_832x832_80k_cityscapes/icnet_r18-d8_832x832_80k_cityscapes_20210925_225521-2e36638d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/icnet/icnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/icnet/icnet_r18-d8_832x832_160k_cityscapes/icnet_r18-d8_832x832_160k_cityscapes_20210925_230153-2c6eb6e0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/icnet/icnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/icnet/icnet_r101-d8_in1k-pre_832x832_80k_cityscapes/icnet_r101-d8_in1k-pre_832x832_80k_cityscapes_20210926_020414-7ceb12c5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/icnet/icnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/icnet/icnet_r101-d8_in1k-pre_832x832_160k_cityscapes/icnet_r101-d8_in1k-pre_832x832_160k_cityscapes_20210925_232612-9484ae8a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/icnet/icnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/icnet/icnet_r101-d8_832x832_80k_cityscapes/icnet_r101-d8_832x832_80k_cityscapes_20210926_072447-b52f936e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/icnet/icnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/icnet/icnet_r101-d8_832x832_160k_cityscapes/icnet_r101-d8_832x832_160k_cityscapes_20210926_092350-3a1ebf1a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/icnet/icnet.yml | https://arxiv.org/abs/1704.08545 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/isanet/isanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/isanet/isanet_r50-d8_769x769_80k_cityscapes/isanet_r50-d8_769x769_80k_cityscapes_20210903_101126-99b54519.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/isanet/isanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/isanet/isanet_r50-d8_769x769_40k_cityscapes/isanet_r50-d8_769x769_40k_cityscapes_20210903_050200-4ae7e65b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/isanet/isanet.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/isanet/isanet_r50-d8_512x512_80k_ade20k/isanet_r50-d8_512x512_80k_ade20k_20210903_124557-6ed83a0c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/isanet/isanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/isanet/isanet_r50-d8_512x512_40k_voc12aug/isanet_r50-d8_512x512_40k_voc12aug_20210901_151349-7d08a54e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/isanet/isanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/isanet/isanet_r50-d8_512x512_20k_voc12aug/isanet_r50-d8_512x512_20k_voc12aug_20210901_164838-79d59b80.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/isanet/isanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/isanet/isanet_r50-d8_512x512_160k_ade20k/isanet_r50-d8_512x512_160k_ade20k_20210903_104850-f752d0a3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/isanet/isanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/isanet/isanet_r50-d8_512x1024_80k_cityscapes/isanet_r50-d8_512x1024_80k_cityscapes_20210901_074202-89384497.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/isanet/isanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/isanet/isanet_r50-d8_512x1024_40k_cityscapes/isanet_r50-d8_512x1024_40k_cityscapes_20210901_054739-981bd763.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/isanet/isanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/isanet/isanet_r101-d8_769x769_80k_cityscapes/isanet_r101-d8_769x769_80k_cityscapes_20210903_111319-24f71dfa.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/isanet/isanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/isanet/isanet_r101-d8_769x769_40k_cityscapes/isanet_r101-d8_769x769_40k_cityscapes_20210903_111320-509e7224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/isanet/isanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/isanet/isanet_r101-d8_512x512_80k_ade20k/isanet_r101-d8_512x512_80k_ade20k_20210903_162056-68b235c2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/isanet/isanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/isanet/isanet_r101-d8_512x512_40k_voc12aug/isanet_r101-d8_512x512_40k_voc12aug_20210901_145814-bc71233b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/isanet/isanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/isanet/isanet_r101-d8_512x512_20k_voc12aug/isanet_r101-d8_512x512_20k_voc12aug_20210901_115805-3ccbf355.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/isanet/isanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/isanet/isanet_r101-d8_512x512_160k_ade20k/isanet_r101-d8_512x512_160k_ade20k_20210903_211431-a7879dcd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/isanet/isanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/isanet/isanet_r101-d8_512x1024_80k_cityscapes/isanet_r101-d8_512x1024_80k_cityscapes_20210901_145243-5b99c9b2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/isanet/isanet.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/isanet/isanet_r101-d8_512x1024_40k_cityscapes/isanet_r101-d8_512x1024_40k_cityscapes_20210901_145553-293e6bd6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/isanet/isanet.yml | https://arxiv.org/abs/1907.12273 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/knet/knet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/knet/knet_s3_upernet_swin-t_8x2_512x512_adamw_80k_ade20k/knet_s3_upernet_swin-t_8x2_512x512_adamw_80k_ade20k_20220303_133059-7545e1dc.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/knet/knet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/knet/knet_s3_upernet_swin-l_8x2_640x640_adamw_80k_ade20k/knet_s3_upernet_swin-l_8x2_640x640_adamw_80k_ade20k_20220301_220747-8787fc71.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/knet/knet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/knet/knet_s3_upernet_swin-l_8x2_512x512_adamw_80k_ade20k/knet_s3_upernet_swin-l_8x2_512x512_adamw_80k_ade20k_20220303_154559-d8da9a90.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/knet/knet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/knet/knet_s3_upernet_r50-d8_8x2_512x512_adamw_80k_ade20k/knet_s3_upernet_r50-d8_8x2_512x512_adamw_80k_ade20k_20220304_125657-215753b0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/knet/knet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/knet/knet_s3_pspnet_r50-d8_8x2_512x512_adamw_80k_ade20k/knet_s3_pspnet_r50-d8_8x2_512x512_adamw_80k_ade20k_20220228_054634-d2c72240.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/knet/knet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/knet/knet_s3_fcn_r50-d8_8x2_512x512_adamw_80k_ade20k/knet_s3_fcn_r50-d8_8x2_512x512_adamw_80k_ade20k_20220228_043751-abcab920.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/knet/knet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/knet/knet_s3_deeplabv3_r50-d8_8x2_512x512_adamw_80k_ade20k/knet_s3_deeplabv3_r50-d8_8x2_512x512_adamw_80k_ade20k_20220228_041642-00c8fbeb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/knet/knet.yml | https://arxiv.org/abs/2106.14855 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/knet/knet_s3_upernet_swin-l_8x2_512x512_adamw_80k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/swin/swin_large_patch4_window7_224_22k_20220308-d5bdebaf.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/knet/knet_s3_upernet_swin-l_8x2_640x640_adamw_80k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/swin/swin_large_patch4_window7_224_22k_20220308-d5bdebaf.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/knet/knet_s3_upernet_swin-t_8x2_512x512_adamw_80k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/swin/swin_tiny_patch4_window7_224_20220308-f41b89d3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/mae/mae.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/mae/upernet_mae-base_fp16_8x2_512x512_160k_ade20k/upernet_mae-base_fp16_8x2_512x512_160k_ade20k_20220426_174752-f92a2975.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/mobilenet_v2/mobilenet_v2.yml | https://download.openmmlab.com/mmsegmentation/v0.5/mobilenet_v2/fcn_m-v2-d8_512x1024_80k_cityscapes/fcn_m-v2-d8_512x1024_80k_cityscapes_20200825_124817-d24c28c1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/mobilenet_v2/mobilenet_v2.yml | https://download.openmmlab.com/mmsegmentation/v0.5/mobilenet_v2/pspnet_m-v2-d8_512x1024_80k_cityscapes/pspnet_m-v2-d8_512x1024_80k_cityscapes_20200825_124817-19e81d51.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/mobilenet_v2/mobilenet_v2.yml | https://download.openmmlab.com/mmsegmentation/v0.5/mobilenet_v2/deeplabv3_m-v2-d8_512x1024_80k_cityscapes/deeplabv3_m-v2-d8_512x1024_80k_cityscapes_20200825_124836-bef03590.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/mobilenet_v2/mobilenet_v2.yml | https://download.openmmlab.com/mmsegmentation/v0.5/mobilenet_v2/deeplabv3plus_m-v2-d8_512x1024_80k_cityscapes/deeplabv3plus_m-v2-d8_512x1024_80k_cityscapes_20200825_124836-d256dd4b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/mobilenet_v2/mobilenet_v2.yml | https://download.openmmlab.com/mmsegmentation/v0.5/mobilenet_v2/fcn_m-v2-d8_512x512_160k_ade20k/fcn_m-v2-d8_512x512_160k_ade20k_20200825_214953-c40e1095.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/mobilenet_v2/mobilenet_v2.yml | https://download.openmmlab.com/mmsegmentation/v0.5/mobilenet_v2/pspnet_m-v2-d8_512x512_160k_ade20k/pspnet_m-v2-d8_512x512_160k_ade20k_20200825_214953-f5942f7a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/mobilenet_v2/mobilenet_v2.yml | https://download.openmmlab.com/mmsegmentation/v0.5/mobilenet_v2/deeplabv3_m-v2-d8_512x512_160k_ade20k/deeplabv3_m-v2-d8_512x512_160k_ade20k_20200825_223255-63986343.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/mobilenet_v2/mobilenet_v2.yml | https://download.openmmlab.com/mmsegmentation/v0.5/mobilenet_v2/deeplabv3plus_m-v2-d8_512x512_160k_ade20k/deeplabv3plus_m-v2-d8_512x512_160k_ade20k_20200825_223255-465a01d4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/mobilenet_v3/mobilenet_v3.yml | https://arxiv.org/abs/1905.02244 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/mobilenet_v3/mobilenet_v3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/mobilenet_v3/lraspp_m-v3-d8_512x1024_320k_cityscapes/lraspp_m-v3-d8_512x1024_320k_cityscapes_20201224_220337-cfe8fb07.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/mobilenet_v3/mobilenet_v3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/mobilenet_v3/lraspp_m-v3-d8_scratch_512x1024_320k_cityscapes/lraspp_m-v3-d8_scratch_512x1024_320k_cityscapes_20201224_220337-9f29cd72.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/mobilenet_v3/mobilenet_v3.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/mobilenet_v3/lraspp_m-v3s-d8_512x1024_320k_cityscapes/lraspp_m-v3s-d8_512x1024_320k_cityscapes_20201224_223935-61565b34.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/mobilenet_v3/mobilenet_v3.yml | https://download.openmmlab.com/mmsegmentation/v0.5/mobilenet_v3/lraspp_m-v3s-d8_scratch_512x1024_320k_cityscapes/lraspp_m-v3s-d8_scratch_512x1024_320k_cityscapes_20201224_223935-03daeabb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/nonlocal_net/nonlocal_net.yml | https://arxiv.org/abs/1711.07971 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/nonlocal_net/nonlocal_net.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r50-d8_512x1024_40k_cityscapes/nonlocal_r50-d8_512x1024_40k_cityscapes_20200605_210748-c75e81e3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/nonlocal_net/nonlocal_net.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r101-d8_512x1024_40k_cityscapes/nonlocal_r101-d8_512x1024_40k_cityscapes_20200605_210748-d63729fa.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/nonlocal_net/nonlocal_net.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r50-d8_769x769_40k_cityscapes/nonlocal_r50-d8_769x769_40k_cityscapes_20200530_045243-82ef6749.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/nonlocal_net/nonlocal_net.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r101-d8_769x769_40k_cityscapes/nonlocal_r101-d8_769x769_40k_cityscapes_20200530_045348-8fe9a9dc.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/nonlocal_net/nonlocal_net.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r50-d8_512x1024_80k_cityscapes/nonlocal_r50-d8_512x1024_80k_cityscapes_20200607_193518-d6839fae.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/nonlocal_net/nonlocal_net.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r101-d8_512x1024_80k_cityscapes/nonlocal_r101-d8_512x1024_80k_cityscapes_20200607_183411-32700183.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/nonlocal_net/nonlocal_net.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r50-d8_769x769_80k_cityscapes/nonlocal_r50-d8_769x769_80k_cityscapes_20200607_193506-1f9792f6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/nonlocal_net/nonlocal_net.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r101-d8_769x769_80k_cityscapes/nonlocal_r101-d8_769x769_80k_cityscapes_20200607_183428-0e1fa4f9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/nonlocal_net/nonlocal_net.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r50-d8_512x512_80k_ade20k/nonlocal_r50-d8_512x512_80k_ade20k_20200615_015801-5ae0aa33.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/nonlocal_net/nonlocal_net.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r101-d8_512x512_80k_ade20k/nonlocal_r101-d8_512x512_80k_ade20k_20200615_015758-24105919.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/nonlocal_net/nonlocal_net.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r50-d8_512x512_160k_ade20k/nonlocal_r50-d8_512x512_160k_ade20k_20200616_005410-baef45e3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/nonlocal_net/nonlocal_net.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r101-d8_512x512_160k_ade20k/nonlocal_r101-d8_512x512_160k_ade20k_20210827_221502-7881aa1a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/nonlocal_net/nonlocal_net.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r50-d8_512x512_20k_voc12aug/nonlocal_r50-d8_512x512_20k_voc12aug_20200617_222613-07f2a57c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/nonlocal_net/nonlocal_net.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r101-d8_512x512_20k_voc12aug/nonlocal_r101-d8_512x512_20k_voc12aug_20200617_222615-948c68ab.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/nonlocal_net/nonlocal_net.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r50-d8_512x512_40k_voc12aug/nonlocal_r50-d8_512x512_40k_voc12aug_20200614_000028-0139d4a9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/nonlocal_net/nonlocal_net.yml | https://download.openmmlab.com/mmsegmentation/v0.5/nonlocal_net/nonlocal_r101-d8_512x512_40k_voc12aug/nonlocal_r101-d8_512x512_40k_voc12aug_20200614_000028-7e5ff470.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_r101-d8_512x1024_80k_b16_cityscapes/ocrnet_r101-d8_512x1024_80k_b16_cityscapes_20200723_192421-78688424.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18s_512x1024_40k_cityscapes/ocrnet_hr18s_512x1024_40k_cityscapes_20200601_033304-fa2436c2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18_512x1024_40k_cityscapes/ocrnet_hr18_512x1024_40k_cityscapes_20200601_033320-401c5bdd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr48_512x1024_40k_cityscapes/ocrnet_hr48_512x1024_40k_cityscapes_20200601_033336-55b32491.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18s_512x1024_80k_cityscapes/ocrnet_hr18s_512x1024_80k_cityscapes_20200601_222735-55979e63.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18_512x1024_80k_cityscapes/ocrnet_hr18_512x1024_80k_cityscapes_20200614_230521-c2e1dd4a.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr48_512x1024_80k_cityscapes/ocrnet_hr48_512x1024_80k_cityscapes_20200601_222752-9076bcdf.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18s_512x1024_160k_cityscapes/ocrnet_hr18s_512x1024_160k_cityscapes_20200602_191005-f4a7af28.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18_512x1024_160k_cityscapes/ocrnet_hr18_512x1024_160k_cityscapes_20200602_191001-b9172d0c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr48_512x1024_160k_cityscapes/ocrnet_hr48_512x1024_160k_cityscapes_20200602_191037-dfbf1b0c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_r101-d8_512x1024_40k_b8_cityscapes/ocrnet_r101-d8_512x1024_40k_b8_cityscapes_20200717_110721-02ac0f13.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_r101-d8_512x1024_40k_b16_cityscapes/ocrnet_r101-d8_512x1024_40k_b16_cityscapes_20200723_193726-db500f80.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18s_512x512_80k_ade20k/ocrnet_hr18s_512x512_80k_ade20k_20200615_055600-e80b62af.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18_512x512_80k_ade20k/ocrnet_hr18_512x512_80k_ade20k_20200615_053157-d173d83b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr48_512x512_80k_ade20k/ocrnet_hr48_512x512_80k_ade20k_20200615_021518-d168c2d1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18s_512x512_160k_ade20k/ocrnet_hr18s_512x512_160k_ade20k_20200615_184505-8e913058.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18_512x512_160k_ade20k/ocrnet_hr18_512x512_160k_ade20k_20200615_200940-d8fcd9d1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr48_512x512_160k_ade20k/ocrnet_hr48_512x512_160k_ade20k_20200615_184705-a073726d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18s_512x512_20k_voc12aug/ocrnet_hr18s_512x512_20k_voc12aug_20200617_233913-02b04fcb.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18_512x512_20k_voc12aug/ocrnet_hr18_512x512_20k_voc12aug_20200617_233932-8954cbb7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr48_512x512_20k_voc12aug/ocrnet_hr48_512x512_20k_voc12aug_20200617_233932-9e82080a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18s_512x512_40k_voc12aug/ocrnet_hr18s_512x512_40k_voc12aug_20200614_002025-42b587ac.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr18_512x512_40k_voc12aug/ocrnet_hr18_512x512_40k_voc12aug_20200614_015958-714302be.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/ocrnet/ocrnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/ocrnet/ocrnet_hr48_512x512_40k_voc12aug/ocrnet_hr48_512x512_40k_voc12aug_20200614_015958-255bc5ce.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/ocrnet/ocrnet.yml | https://arxiv.org/abs/1909.11065 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/point_rend/point_rend.yml | https://arxiv.org/abs/1912.08193 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/point_rend/point_rend.yml | https://download.openmmlab.com/mmsegmentation/v0.5/point_rend/pointrend_r50_512x1024_80k_cityscapes/pointrend_r50_512x1024_80k_cityscapes_20200711_015821-bb1ff523.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/point_rend/point_rend.yml | https://download.openmmlab.com/mmsegmentation/v0.5/point_rend/pointrend_r101_512x1024_80k_cityscapes/pointrend_r101_512x1024_80k_cityscapes_20200711_170850-d0ca84be.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/point_rend/point_rend.yml | https://download.openmmlab.com/mmsegmentation/v0.5/point_rend/pointrend_r50_512x512_160k_ade20k/pointrend_r50_512x512_160k_ade20k_20200807_232644-ac3febf2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/point_rend/point_rend.yml | https://download.openmmlab.com/mmsegmentation/v0.5/point_rend/pointrend_r101_512x512_160k_ade20k/pointrend_r101_512x512_160k_ade20k_20200808_030852-8834902a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/psanet/psanet.yml | https://openaccess.thecvf.com/content_ECCV_2018/papers/Hengshuang_Zhao_PSANet_Point-wise_Spatial_ECCV_2018_paper.pdf | 论文地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/psanet/psanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r50-d8_512x1024_40k_cityscapes/psanet_r50-d8_512x1024_40k_cityscapes_20200606_103117-99fac37c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/psanet/psanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r101-d8_512x1024_40k_cityscapes/psanet_r101-d8_512x1024_40k_cityscapes_20200606_001418-27b9cfa7.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/psanet/psanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r50-d8_769x769_40k_cityscapes/psanet_r50-d8_769x769_40k_cityscapes_20200530_033717-d5365506.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/psanet/psanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r101-d8_769x769_40k_cityscapes/psanet_r101-d8_769x769_40k_cityscapes_20200530_035107-997da1e6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/psanet/psanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r50-d8_512x1024_80k_cityscapes/psanet_r50-d8_512x1024_80k_cityscapes_20200606_161842-ab60a24f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/psanet/psanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r101-d8_512x1024_80k_cityscapes/psanet_r101-d8_512x1024_80k_cityscapes_20200606_161823-0f73a169.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/psanet/psanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r50-d8_769x769_80k_cityscapes/psanet_r50-d8_769x769_80k_cityscapes_20200606_225134-fe42f49e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/psanet/psanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r101-d8_769x769_80k_cityscapes/psanet_r101-d8_769x769_80k_cityscapes_20200606_214550-7665827b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/psanet/psanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r50-d8_512x512_80k_ade20k/psanet_r50-d8_512x512_80k_ade20k_20200614_144141-835e4b97.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/psanet/psanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r101-d8_512x512_80k_ade20k/psanet_r101-d8_512x512_80k_ade20k_20200614_185117-1fab60d4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/psanet/psanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r50-d8_512x512_160k_ade20k/psanet_r50-d8_512x512_160k_ade20k_20200615_161258-148077dd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/psanet/psanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r101-d8_512x512_160k_ade20k/psanet_r101-d8_512x512_160k_ade20k_20200615_161537-dbfa564c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/psanet/psanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r50-d8_512x512_20k_voc12aug/psanet_r50-d8_512x512_20k_voc12aug_20200617_102413-2f1bbaa1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/psanet/psanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r101-d8_512x512_20k_voc12aug/psanet_r101-d8_512x512_20k_voc12aug_20200617_110624-946fef11.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/psanet/psanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r50-d8_512x512_40k_voc12aug/psanet_r50-d8_512x512_40k_voc12aug_20200613_161946-f596afb5.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/psanet/psanet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/psanet/psanet_r101-d8_512x512_40k_voc12aug/psanet_r101-d8_512x512_40k_voc12aug_20200613_161946-1f560f9e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_512x1024_40k_cityscapes/pspnet_r50-d8_512x1024_40k_cityscapes_20200605_003338-2966598c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_512x1024_40k_cityscapes/pspnet_r101-d8_512x1024_40k_cityscapes_20200604_232751-467e7cf4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_769x769_40k_cityscapes/pspnet_r50-d8_769x769_40k_cityscapes_20200606_112725-86638686.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_769x769_40k_cityscapes/pspnet_r101-d8_769x769_40k_cityscapes_20200606_112753-61c6f5be.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r18-d8_512x1024_80k_cityscapes/pspnet_r18-d8_512x1024_80k_cityscapes_20201225_021458-09ffa746.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_512x1024_80k_cityscapes/pspnet_r50-d8_512x1024_80k_cityscapes_20200606_112131-2376f12b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_512x1024_80k_cityscapes/pspnet_r101-d8_512x1024_80k_cityscapes_20200606_112211-e1e1100f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r18-d8_769x769_80k_cityscapes/pspnet_r18-d8_769x769_80k_cityscapes_20201225_021458-3deefc62.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_769x769_80k_cityscapes/pspnet_r50-d8_769x769_80k_cityscapes_20200606_210121-5ccf03dd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_769x769_80k_cityscapes/pspnet_r101-d8_769x769_80k_cityscapes_20200606_225055-dba412fa.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r18b-d8_512x1024_80k_cityscapes/pspnet_r18b-d8_512x1024_80k_cityscapes_20201226_063116-26928a60.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50b-d8_512x1024_80k_cityscapes/pspnet_r50b-d8_512x1024_80k_cityscapes_20201225_094315-6344287a.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101b-d8_512x1024_80k_cityscapes/pspnet_r101b-d8_512x1024_80k_cityscapes_20201226_170012-3a4d38ab.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r18b-d8_769x769_80k_cityscapes/pspnet_r18b-d8_769x769_80k_cityscapes_20201226_080942-bf98d186.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50b-d8_769x769_80k_cityscapes/pspnet_r50b-d8_769x769_80k_cityscapes_20201225_094316-4c643cf6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101b-d8_769x769_80k_cityscapes/pspnet_r101b-d8_769x769_80k_cityscapes_20201226_171823-f0e7c293.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_512x512_80k_ade20k/pspnet_r50-d8_512x512_80k_ade20k_20200615_014128-15a8b914.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_512x512_80k_ade20k/pspnet_r101-d8_512x512_80k_ade20k_20200614_031423-b6e782f0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_512x512_160k_ade20k/pspnet_r50-d8_512x512_160k_ade20k_20200615_184358-1890b0bd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_512x512_160k_ade20k/pspnet_r101-d8_512x512_160k_ade20k_20200615_100650-967c316f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_512x512_20k_voc12aug/pspnet_r50-d8_512x512_20k_voc12aug_20200617_101958-ed5dfbd9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_512x512_20k_voc12aug/pspnet_r101-d8_512x512_20k_voc12aug_20200617_102003-4aef3c9a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_512x512_40k_voc12aug/pspnet_r50-d8_512x512_40k_voc12aug_20200613_161222-ae9c1b8c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_512x512_40k_voc12aug/pspnet_r101-d8_512x512_40k_voc12aug_20200613_161222-bc933b18.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_480x480_40k_pascal_context/pspnet_r101-d8_480x480_40k_pascal_context_20200911_211210-bf0f5d7c.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_480x480_80k_pascal_context/pspnet_r101-d8_480x480_80k_pascal_context_20200911_190530-c86d6233.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_480x480_40k_pascal_context_59/pspnet_r101-d8_480x480_40k_pascal_context_59_20210416_114524-86d44cd4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_480x480_80k_pascal_context_59/pspnet_r101-d8_480x480_80k_pascal_context_59_20210416_114418-fa6caaa2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_rsb-pretrain_512x1024_adamw_80k_cityscapes/pspnet_r50-d8_rsb-pretrain_512x1024_adamw_80k_cityscapes_20220315_123238-588c30be.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_fp16_512x1024_80k_cityscapes/pspnet_r101-d8_fp16_512x1024_80k_cityscapes_20200717_230919-a0875e5c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d32_512x1024_80k_cityscapes/pspnet_r50-d32_512x1024_80k_cityscapes_20220316_224840-9092b254.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d32_rsb-pretrain_512x1024_adamw_80k_cityscapes/pspnet_r50-d32_rsb-pretrain_512x1024_adamw_80k_cityscapes_20220316_141229-dd9c9610.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50b-d32_512x1024_80k_cityscapes/pspnet_r50b-d32_512x1024_80k_cityscapes_20220311_152152-23bcaf8c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_512x512_4x4_20k_coco-stuff10k/pspnet_r50-d8_512x512_4x4_20k_coco-stuff10k_20210820_203258-b88df27f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_512x512_4x4_20k_coco-stuff10k/pspnet_r101-d8_512x512_4x4_20k_coco-stuff10k_20210820_232135-76aae482.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_512x512_4x4_40k_coco-stuff10k/pspnet_r50-d8_512x512_4x4_40k_coco-stuff10k_20210821_030857-92e2902b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_512x512_4x4_40k_coco-stuff10k/pspnet_r101-d8_512x512_4x4_40k_coco-stuff10k_20210821_014022-831aec95.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/pspnet/pspnet.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_512x512_4x4_80k_coco-stuff164k/pspnet_r50-d8_512x512_4x4_80k_coco-stuff164k_20210707_152034-0e41b2db.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_512x512_4x4_80k_coco-stuff164k/pspnet_r101-d8_512x512_4x4_80k_coco-stuff164k_20210707_152034-7eb41789.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_512x512_4x4_160k_coco-stuff164k/pspnet_r50-d8_512x512_4x4_160k_coco-stuff164k_20210707_152004-51276a57.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_512x512_4x4_160k_coco-stuff164k/pspnet_r101-d8_512x512_4x4_160k_coco-stuff164k_20210707_152004-4af9621b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_512x512_4x4_320k_coco-stuff164k/pspnet_r50-d8_512x512_4x4_320k_coco-stuff164k_20210707_152004-be9610cc.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_512x512_4x4_320k_coco-stuff164k/pspnet_r101-d8_512x512_4x4_320k_coco-stuff164k_20210707_152004-72220c60.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r18-d8_512x512_80k_loveda/pspnet_r18-d8_512x512_80k_loveda_20211105_052100-b97697f1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_512x512_80k_loveda/pspnet_r50-d8_512x512_80k_loveda_20211104_155728-88610f9f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_512x512_80k_loveda/pspnet_r101-d8_512x512_80k_loveda_20211104_153212-1c06c6a8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r18-d8_4x4_512x512_80k_potsdam/pspnet_r18-d8_4x4_512x512_80k_potsdam_20211220_125612-7cd046e1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_4x4_512x512_80k_potsdam/pspnet_r50-d8_4x4_512x512_80k_potsdam_20211219_043541-2dd5fe67.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_4x4_512x512_80k_potsdam/pspnet_r101-d8_4x4_512x512_80k_potsdam_20211220_125612-aed036c4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r18-d8_4x4_512x512_80k_vaihingen/pspnet_r18-d8_4x4_512x512_80k_vaihingen_20211228_160355-52a8a6f6.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_4x4_512x512_80k_vaihingen/pspnet_r50-d8_4x4_512x512_80k_vaihingen_20211228_160355-382f8f5b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r101-d8_4x4_512x512_80k_vaihingen/pspnet_r101-d8_4x4_512x512_80k_vaihingen_20211231_230806-8eba0a09.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r18-d8_4x4_896x896_80k_isaid/pspnet_r18-d8_4x4_896x896_80k_isaid_20220110_180526-e84c0b6a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/pspnet/pspnet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/pspnet/pspnet_r50-d8_4x4_896x896_80k_isaid/pspnet_r50-d8_4x4_896x896_80k_isaid_20220110_180629-1f21dc32.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/pspnet/pspnet.yml | https://arxiv.org/abs/1612.01105 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/pspnet/pspnet_r50-d32_rsb-pretrain_512x1024_adamw_80k_cityscapes.py | https://download.openmmlab.com/mmclassification/v0/resnet/resnet50_8xb256-rsb-a1-600e_in1k_20211228-20e21305.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/pspnet/pspnet_r50-d8_rsb-pretrain_512x1024_adamw_80k_cityscapes.py | https://download.openmmlab.com/mmclassification/v0/resnet/resnet50_8xb256-rsb-a1-600e_in1k_20211228-20e21305.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/resnest/resnest.yml | https://download.openmmlab.com/mmsegmentation/v0.5/resnest/fcn_s101-d8_512x512_160k_ade20k/fcn_s101-d8_512x512_160k_ade20k_20200807_145416-d3160329.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/resnest/resnest.yml | https://download.openmmlab.com/mmsegmentation/v0.5/resnest/fcn_s101-d8_512x1024_80k_cityscapes/fcn_s101-d8_512x1024_80k_cityscapes_20200807_140631-f8d155b3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/resnest/resnest.yml | https://download.openmmlab.com/mmsegmentation/v0.5/resnest/pspnet_s101-d8_512x1024_80k_cityscapes/pspnet_s101-d8_512x1024_80k_cityscapes_20200807_140631-c75f3b99.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/resnest/resnest.yml | https://download.openmmlab.com/mmsegmentation/v0.5/resnest/deeplabv3_s101-d8_512x1024_80k_cityscapes/deeplabv3_s101-d8_512x1024_80k_cityscapes_20200807_144429-b73c4270.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/resnest/resnest.yml | https://download.openmmlab.com/mmsegmentation/v0.5/resnest/deeplabv3plus_s101-d8_512x1024_80k_cityscapes/deeplabv3plus_s101-d8_512x1024_80k_cityscapes_20200807_144429-1239eb43.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/resnest/resnest.yml | https://download.openmmlab.com/mmsegmentation/v0.5/resnest/pspnet_s101-d8_512x512_160k_ade20k/pspnet_s101-d8_512x512_160k_ade20k_20200807_145416-a6daa92a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/resnest/resnest.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/resnest/deeplabv3_s101-d8_512x512_160k_ade20k/deeplabv3_s101-d8_512x512_160k_ade20k_20200807_144503-17ecabe5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/resnest/resnest.yml | https://download.openmmlab.com/mmsegmentation/v0.5/resnest/deeplabv3plus_s101-d8_512x512_160k_ade20k/deeplabv3plus_s101-d8_512x512_160k_ade20k_20200807_144503-27b26226.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/segformer/segformer.yml | https://download.openmmlab.com/mmsegmentation/v0.5/segformer/segformer_mit-b5_8x1_1024x1024_160k_cityscapes/segformer_mit-b5_8x1_1024x1024_160k_cityscapes_20211206_072934-87a052ec.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/segformer/segformer.yml | https://download.openmmlab.com/mmsegmentation/v0.5/segformer/segformer_mit-b5_640x640_160k_ade20k/segformer_mit-b5_640x640_160k_ade20k_20220617_203542-940a6bd8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/segformer/segformer.yml | https://download.openmmlab.com/mmsegmentation/v0.5/segformer/segformer_mit-b5_512x512_160k_ade20k/segformer_mit-b5_512x512_160k_ade20k_20210726_145235-94cedf59.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/segformer/segformer.yml | https://download.openmmlab.com/mmsegmentation/v0.5/segformer/segformer_mit-b4_8x1_1024x1024_160k_cityscapes/segformer_mit-b4_8x1_1024x1024_160k_cityscapes_20211207_080709-07f6c333.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/segformer/segformer.yml | https://download.openmmlab.com/mmsegmentation/v0.5/segformer/segformer_mit-b4_512x512_160k_ade20k/segformer_mit-b4_512x512_160k_ade20k_20220620_112216-4fa4f58f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/segformer/segformer.yml | https://download.openmmlab.com/mmsegmentation/v0.5/segformer/segformer_mit-b3_8x1_1024x1024_160k_cityscapes/segformer_mit-b3_8x1_1024x1024_160k_cityscapes_20211206_224823-a8f8a177.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/segformer/segformer.yml | https://download.openmmlab.com/mmsegmentation/v0.5/segformer/segformer_mit-b3_512x512_160k_ade20k/segformer_mit-b3_512x512_160k_ade20k_20220617_162254-3a4b7363.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/segformer/segformer.yml | https://download.openmmlab.com/mmsegmentation/v0.5/segformer/segformer_mit-b2_8x1_1024x1024_160k_cityscapes/segformer_mit-b2_8x1_1024x1024_160k_cityscapes_20211207_134205-6096669a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/segformer/segformer.yml | https://download.openmmlab.com/mmsegmentation/v0.5/segformer/segformer_mit-b2_512x512_160k_ade20k/segformer_mit-b2_512x512_160k_ade20k_20220620_114047-64e4feca.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/segformer/segformer.yml | https://download.openmmlab.com/mmsegmentation/v0.5/segformer/segformer_mit-b1_8x1_1024x1024_160k_cityscapes/segformer_mit-b1_8x1_1024x1024_160k_cityscapes_20211208_064213-655c7b3f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/segformer/segformer.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/segformer/segformer_mit-b1_512x512_160k_ade20k/segformer_mit-b1_512x512_160k_ade20k_20220620_112037-c3f39e00.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/segformer/segformer.yml | https://download.openmmlab.com/mmsegmentation/v0.5/segformer/segformer_mit-b0_8x1_1024x1024_160k_cityscapes/segformer_mit-b0_8x1_1024x1024_160k_cityscapes_20211208_101857-e7f88502.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/segformer/segformer.yml | https://download.openmmlab.com/mmsegmentation/v0.5/segformer/segformer_mit-b0_512x512_160k_ade20k/segformer_mit-b0_512x512_160k_ade20k_20220617_162207-c00b9603.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/segformer/segformer.yml | https://arxiv.org/abs/2105.15203 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/segformer/segformer_mit-b0_512x512_160k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segformer/mit_b0_20220624-7e0fe6dd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/segformer/segformer_mit-b0_8x1_1024x1024_160k_cityscapes.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segformer/mit_b0_20220624-7e0fe6dd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/segformer/segformer_mit-b1_512x512_160k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segformer/mit_b1_20220624-02e5a6a1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/segformer/segformer_mit-b1_8x1_1024x1024_160k_cityscapes.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segformer/mit_b1_20220624-02e5a6a1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/segformer/segformer_mit-b2_512x512_160k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segformer/mit_b2_20220624-66e8bf70.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/segformer/segformer_mit-b2_8x1_1024x1024_160k_cityscapes.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segformer/mit_b2_20220624-66e8bf70.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/segformer/segformer_mit-b3_512x512_160k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segformer/mit_b3_20220624-13b1141c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/segformer/segformer_mit-b3_8x1_1024x1024_160k_cityscapes.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segformer/mit_b3_20220624-13b1141c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/segformer/segformer_mit-b4_512x512_160k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segformer/mit_b4_20220624-d588d980.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/segformer/segformer_mit-b4_8x1_1024x1024_160k_cityscapes.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segformer/mit_b4_20220624-d588d980.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/segformer/segformer_mit-b5_512x512_160k_ade20k.py | 
https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segformer/mit_b5_20220624-658746d9.pthh | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/segformer/segformer_mit-b5_640x640_160k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segformer/mit_b5_20220624-658746d9.pthh | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/segformer/segformer_mit-b5_8x1_1024x1024_160k_cityscapes.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segformer/mit_b5_20220624-658746d9.pthh | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/segmenter/segmenter.yml | https://download.openmmlab.com/mmsegmentation/v0.5/segmenter/segmenter_vit-t_mask_8x1_512x512_160k_ade20k/segmenter_vit-t_mask_8x1_512x512_160k_ade20k_20220105_151706-ffcf7509.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/segmenter/segmenter.yml | https://download.openmmlab.com/mmsegmentation/v0.5/segmenter/segmenter_vit-s_mask_8x1_512x512_160k_ade20k/segmenter_vit-s_mask_8x1_512x512_160k_ade20k_20220105_151706-511bb103.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/segmenter/segmenter.yml | https://download.openmmlab.com/mmsegmentation/v0.5/segmenter/segmenter_vit-s_linear_8x1_512x512_160k_ade20k/segmenter_vit-s_linear_8x1_512x512_160k_ade20k_20220105_151713-39658c46.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/segmenter/segmenter.yml | https://download.openmmlab.com/mmsegmentation/v0.5/segmenter/segmenter_vit-l_mask_8x1_640x640_160k_ade20k/segmenter_vit-l_mask_8x1_640x640_160k_ade20k_20220614_024513-4783a347.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/segmenter/segmenter.yml | https://download.openmmlab.com/mmsegmentation/v0.5/segmenter/segmenter_vit-b_mask_8x1_512x512_160k_ade20k/segmenter_vit-b_mask_8x1_512x512_160k_ade20k_20220105_151706-bc533b08.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/segmenter/segmenter.yml | https://arxiv.org/abs/2105.05633 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/segmenter/segmenter_vit-l_mask_8x1_640x640_160k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segmenter/vit_large_p16_384_20220308-d4efb41d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/segmenter/segmenter_vit-s_mask_8x1_512x512_160k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segmenter/vit_small_p16_384_20220308-410f6037.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/segmenter/segmenter_vit-t_mask_8x1_512x512_160k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/segmenter/vit_tiny_p16_384_20220308-cce8c795.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/sem_fpn/sem_fpn.yml | https://arxiv.org/abs/1901.02446 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/sem_fpn/sem_fpn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/sem_fpn/fpn_r50_512x1024_80k_cityscapes/fpn_r50_512x1024_80k_cityscapes_20200717_021437-94018a0d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/sem_fpn/sem_fpn.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/sem_fpn/fpn_r101_512x1024_80k_cityscapes/fpn_r101_512x1024_80k_cityscapes_20200717_012416-c5800d4c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/sem_fpn/sem_fpn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/sem_fpn/fpn_r50_512x512_160k_ade20k/fpn_r50_512x512_160k_ade20k_20200718_131734-5b5a6ab9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/sem_fpn/sem_fpn.yml | https://download.openmmlab.com/mmsegmentation/v0.5/sem_fpn/fpn_r101_512x512_160k_ade20k/fpn_r101_512x512_160k_ade20k_20200718_131734-306b5004.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/setr/setr.yml | https://download.openmmlab.com/mmsegmentation/v0.5/setr/setr_pup_vit-large_8x1_768x768_80k_cityscapes/setr_pup_vit-large_8x1_768x768_80k_cityscapes_20211122_155115-f6f37b8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/setr/setr.yml | https://download.openmmlab.com/mmsegmentation/v0.5/setr/setr_pup_512x512_160k_b16_ade20k/setr_pup_512x512_160k_b16_ade20k_20210619_191343-7e0ce826.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/setr/setr.yml | https://download.openmmlab.com/mmsegmentation/v0.5/setr/setr_naive_vit-large_8x1_768x768_80k_cityscapes/setr_naive_vit-large_8x1_768x768_80k_cityscapes_20211123_000505-20728e80.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/setr/setr.yml | https://download.openmmlab.com/mmsegmentation/v0.5/setr/setr_naive_512x512_160k_b16_ade20k/setr_naive_512x512_160k_b16_ade20k_20210619_191258-061f24f5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/setr/setr.yml | https://download.openmmlab.com/mmsegmentation/v0.5/setr/setr_mla_vit-large_8x1_768x768_80k_cityscapes/setr_mla_vit-large_8x1_768x768_80k_cityscapes_20211119_101003-7f8dccbe.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/setr/setr.yml | https://download.openmmlab.com/mmsegmentation/v0.5/setr/setr_mla_512x512_160k_b8_ade20k/setr_mla_512x512_160k_b8_ade20k_20210619_191118-c6d21df0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/setr/setr.yml | https://download.openmmlab.com/mmsegmentation/v0.5/setr/setr_mla_512x512_160k_b16_ade20k/setr_mla_512x512_160k_b16_ade20k_20210619_191057-f9741de7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/setr/setr.yml | https://arxiv.org/abs/2012.15840 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/stdc/stdc.yml | https://download.openmmlab.com/mmsegmentation/v0.5/stdc/stdc2_in1k-pre_512x1024_80k_cityscapes/stdc2_in1k-pre_512x1024_80k_cityscapes_20220224_073048-1f8f0f6c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/stdc/stdc.yml | https://download.openmmlab.com/mmsegmentation/v0.5/stdc/stdc2_512x1024_80k_cityscapes/stdc2_512x1024_80k_cityscapes_20220222_132015-fb1e3a1a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/stdc/stdc.yml | https://download.openmmlab.com/mmsegmentation/v0.5/stdc/stdc1_in1k-pre_512x1024_80k_cityscapes/stdc1_in1k-pre_512x1024_80k_cityscapes_20220224_141648-3d4c2981.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/stdc/stdc.yml | https://download.openmmlab.com/mmsegmentation/v0.5/stdc/stdc1_512x1024_80k_cityscapes/stdc1_512x1024_80k_cityscapes_20220224_073048-74e6920a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/stdc/stdc.yml | https://arxiv.org/abs/2104.13188 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/stdc/stdc1_in1k-pre_512x1024_80k_cityscapes.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/stdc/stdc1_20220308-5368626c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/stdc/stdc2_in1k-pre_512x1024_80k_cityscapes.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/stdc/stdc2_20220308-7dbd9127.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/swin/swin.yml | https://download.openmmlab.com/mmsegmentation/v0.5/swin/upernet_swin_tiny_patch4_window7_512x512_160k_ade20k_pretrain_224x224_1K/upernet_swin_tiny_patch4_window7_512x512_160k_ade20k_pretrain_224x224_1K_20210531_112542-e380ad3e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/swin/swin.yml | https://download.openmmlab.com/mmsegmentation/v0.5/swin/upernet_swin_small_patch4_window7_512x512_160k_ade20k_pretrain_224x224_1K/upernet_swin_small_patch4_window7_512x512_160k_ade20k_pretrain_224x224_1K_20210526_192015-ee2fff1c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/swin/swin.yml | https://download.openmmlab.com/mmsegmentation/v0.5/swin/upernet_swin_base_patch4_window7_512x512_160k_ade20k_pretrain_224x224_22K/upernet_swin_base_patch4_window7_512x512_160k_ade20k_pretrain_224x224_22K_20210526_211650-762e2178.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/swin/swin.yml | https://download.openmmlab.com/mmsegmentation/v0.5/swin/upernet_swin_base_patch4_window7_512x512_160k_ade20k_pretrain_224x224_1K/upernet_swin_base_patch4_window7_512x512_160k_ade20k_pretrain_224x224_1K_20210526_192340-593b0e13.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/swin/swin.yml | https://download.openmmlab.com/mmsegmentation/v0.5/swin/upernet_swin_base_patch4_window12_512x512_160k_ade20k_pretrain_384x384_22K/upernet_swin_base_patch4_window12_512x512_160k_ade20k_pretrain_384x384_22K_20210531_125459-429057bf.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/swin/swin.yml | https://download.openmmlab.com/mmsegmentation/v0.5/swin/upernet_swin_base_patch4_window12_512x512_160k_ade20k_pretrain_384x384_1K/upernet_swin_base_patch4_window12_512x512_160k_ade20k_pretrain_384x384_1K_20210531_132020-05b22ea4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/swin/upernet_swin_base_patch4_window12_512x512_160k_ade20k_pretrain_384x384_1K.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/swin/swin_base_patch4_window12_384_20220317-55b0104a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/swin/upernet_swin_base_patch4_window12_512x512_160k_ade20k_pretrain_384x384_22K.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/swin/swin_base_patch4_window12_384_22k_20220317-e5c09f74.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/swin/upernet_swin_base_patch4_window7_512x512_160k_ade20k_pretrain_224x224_1K.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/swin/swin_base_patch4_window7_224_20220317-e9b98025.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/swin/upernet_swin_base_patch4_window7_512x512_160k_ade20k_pretrain_224x224_22K.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/swin/swin_base_patch4_window7_224_22k_20220317-4f79f7c0.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/swin/upernet_swin_small_patch4_window7_512x512_160k_ade20k_pretrain_224x224_1K.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/swin/swin_small_patch4_window7_224_20220317-7ba6d6dd.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/swin/upernet_swin_tiny_patch4_window7_512x512_160k_ade20k_pretrain_224x224_1K.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/swin/swin_tiny_patch4_window7_224_20220317-1cdeb081.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/twins/twins.yml | https://download.openmmlab.com/mmsegmentation/v0.5/twins/twins_svt-s_uperhead_8x2_512x512_160k_ade20k/twins_svt-s_uperhead_8x2_512x512_160k_ade20k_20211130_141005-e48a2d94.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/twins/twins.yml | https://download.openmmlab.com/mmsegmentation/v0.5/twins/twins_svt-s_fpn_fpnhead_8x4_512x512_80k_ade20k/twins_svt-s_fpn_fpnhead_8x4_512x512_80k_ade20k_20211130_141006-0a0d3317.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/twins/twins.yml | https://download.openmmlab.com/mmsegmentation/v0.5/twins/twins_svt-l_uperhead_8x2_512x512_160k_ade20k/twins_svt-l_uperhead_8x2_512x512_160k_ade20k_20211130_141005-3e2cae61.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/twins/twins.yml | https://download.openmmlab.com/mmsegmentation/v0.5/twins/twins_svt-b_uperhead_8x2_512x512_160k_ade20k/twins_svt-b_uperhead_8x2_512x512_160k_ade20k_20211202_040826-0943a1f1.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/twins/twins.yml | https://download.openmmlab.com/mmsegmentation/v0.5/twins/twins_svt-b_fpn_fpnhead_8x4_512x512_80k_ade20k/twins_svt-b_fpn_fpnhead_8x4_512x512_80k_ade20k_20211201_113849-88b2907c.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/twins/twins.yml | https://download.openmmlab.com/mmsegmentation/v0.5/twins/twins_pcpvt-s_uperhead_8x4_512x512_160k_ade20k/twins_pcpvt-s_uperhead_8x4_512x512_160k_ade20k_20211201_233537-8e99c07a.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/twins/twins.yml | https://download.openmmlab.com/mmsegmentation/v0.5/twins/twins_pcpvt-s_fpn_fpnhead_8x4_512x512_80k_ade20k/twins_pcpvt-s_fpn_fpnhead_8x4_512x512_80k_ade20k_20211201_204132-41acd132.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/twins/twins.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/twins/twins_pcpvt-l_uperhead_8x2_512x512_160k_ade20k/twins_pcpvt-l_uperhead_8x2_512x512_160k_ade20k_20211201_075053-c6095c07.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/twins/twins.yml | https://download.openmmlab.com/mmsegmentation/v0.5/twins/twins_pcpvt-l_fpn_fpnhead_8x4_512x512_80k_ade20k/twins_pcpvt-l_fpn_fpnhead_8x4_512x512_80k_ade20k_20211201_105226-bc6d61dc.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/twins/twins.yml | https://download.openmmlab.com/mmsegmentation/v0.5/twins/twins_pcpvt-b_uperhead_8x2_512x512_160k_ade20k/twins_pcpvt-b_uperhead_8x2_512x512_160k_ade20k_20211130_141020-02094ea5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/twins/twins.yml | https://download.openmmlab.com/mmsegmentation/v0.5/twins/twins_pcpvt-b_fpn_fpnhead_8x4_512x512_80k_ade20k/twins_pcpvt-b_fpn_fpnhead_8x4_512x512_80k_ade20k_20211130_141019-d396db72.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/twins/twins_pcpvt-b_fpn_fpnhead_8x4_512x512_80k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/twins/pcpvt_base_20220308-0621964c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/twins/twins_pcpvt-b_uperhead_8x2_512x512_160k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/twins/pcpvt_base_20220308-0621964c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/twins/twins_pcpvt-l_fpn_fpnhead_8x4_512x512_80k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/twins/pcpvt_large_20220308-37579dc6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/twins/twins_pcpvt-l_uperhead_8x2_512x512_160k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/twins/pcpvt_large_20220308-37579dc6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/twins/twins_svt-b_fpn_fpnhead_8x4_512x512_80k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/twins/alt_gvt_base_20220308-1b7eb711.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/twins/twins_svt-b_uperhead_8x2_512x512_160k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/twins/alt_gvt_base_20220308-1b7eb711.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/twins/twins_svt-l_fpn_fpnhead_8x4_512x512_80k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/twins/alt_gvt_large_20220308-fb5936f3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/twins/twins_svt-l_uperhead_8x2_512x512_160k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/twins/alt_gvt_large_20220308-fb5936f3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/twins/twins_svt-s_fpn_fpnhead_8x4_512x512_80k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/twins/alt_gvt_small_20220308-7e1c3695.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/twins/twins_svt-s_uperhead_8x2_512x512_160k_ade20k.py | https://download.openmmlab.com/mmsegmentation/v0.5/pretrain/twins/alt_gvt_small_20220308-7e1c3695.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/pspnet_unet_s5-d16_ce-1.0-dice-3.0_64x64_40k_drive/pspnet_unet_s5-d16_ce-1.0-dice-3.0_64x64_40k_drive_20211210_201821-22b3e3ba.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/pspnet_unet_s5-d16_ce-1.0-dice-3.0_256x256_40k_hrf/pspnet_unet_s5-d16_ce-1.0-dice-3.0_256x256_40k_hrf_20211210_201823-53d492fa.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/pspnet_unet_s5-d16_ce-1.0-dice-3.0_128x128_40k_stare/pspnet_unet_s5-d16_ce-1.0-dice-3.0_128x128_40k_stare_20211210_201823-f1063ef7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/pspnet_unet_s5-d16_ce-1.0-dice-3.0_128x128_40k_chase-db1/pspnet_unet_s5-d16_ce-1.0-dice-3.0_128x128_40k_chase-db1_20211210_201823-c0802c4d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/pspnet_unet_s5-d16_64x64_40k_drive/pspnet_unet_s5-d16_64x64_40k_drive_20201227_181818-aac73387.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/pspnet_unet_s5-d16_256x256_40k_hrf/pspnet_unet_s5-d16_256x256_40k_hrf_20201227_181818-fdb7e29b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/pspnet_unet_s5-d16_128x128_40k_stare/pspnet_unet_s5-d16_128x128_40k_stare_20201227_181818-3c2923c4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/pspnet_unet_s5-d16_128x128_40k_chase_db1/pspnet_unet_s5-d16_128x128_40k_chase_db1_20201227_181818-68d4e609.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/fcn_unet_s5-d16_ce-1.0-dice-3.0_64x64_40k_drive/fcn_unet_s5-d16_ce-1.0-dice-3.0_64x64_40k_drive_20211210_201820-785de5c2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/fcn_unet_s5-d16_ce-1.0-dice-3.0_256x256_40k_hrf/fcn_unet_s5-d16_ce-1.0-dice-3.0_256x256_40k_hrf_20211210_201821-c314da8a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/fcn_unet_s5-d16_ce-1.0-dice-3.0_128x128_40k_stare/fcn_unet_s5-d16_ce-1.0-dice-3.0_128x128_40k_stare_20211210_201821-f75705a9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/fcn_unet_s5-d16_ce-1.0-dice-3.0_128x128_40k_chase-db1/fcn_unet_s5-d16_ce-1.0-dice-3.0_128x128_40k_chase-db1_20211210_201821-1c4eb7cf.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/unet/unet.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/unet/fcn_unet_s5-d16_64x64_40k_drive/fcn_unet_s5-d16_64x64_40k_drive_20201223_191051-5daf6d3b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/fcn_unet_s5-d16_4x4_512x1024_160k_cityscapes/fcn_unet_s5-d16_4x4_512x1024_160k_cityscapes_20211210_145204-6860854e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/fcn_unet_s5-d16_256x256_40k_hrf/fcn_unet_s5-d16_256x256_40k_hrf_20201223_173724-d89cf1ed.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/fcn_unet_s5-d16_128x128_40k_stare/fcn_unet_s5-d16_128x128_40k_stare_20201223_191051-7d77e78b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/fcn_unet_s5-d16_128x128_40k_chase_db1/fcn_unet_s5-d16_128x128_40k_chase_db1_20201223_191051-11543527.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/deeplabv3_unet_s5-d16_ce-1.0-dice-3.0_64x64_40k_drive/deeplabv3_unet_s5-d16_ce-1.0-dice-3.0_64x64_40k_drive_20211210_201825-6bf0efd7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/deeplabv3_unet_s5-d16_ce-1.0-dice-3.0_256x256_40k_hrf/deeplabv3_unet_s5-d16_ce-1.0-dice-3.0_256x256_40k_hrf_20211210_202032-59daf7a4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/deeplabv3_unet_s5-d16_ce-1.0-dice-3.0_128x128_40k_stare/deeplabv3_unet_s5-d16_ce-1.0-dice-3.0_128x128_40k_stare_20211210_201825-21db614c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/deeplabv3_unet_s5-d16_ce-1.0-dice-3.0_128x128_40k_chase-db1/deeplabv3_unet_s5-d16_ce-1.0-dice-3.0_128x128_40k_chase-db1_20211210_201825-4ef29df5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/deeplabv3_unet_s5-d16_64x64_40k_drive/deeplabv3_unet_s5-d16_64x64_40k_drive_20201226_094047-0671ff20.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/deeplabv3_unet_s5-d16_256x256_40k_hrf/deeplabv3_unet_s5-d16_256x256_40k_hrf_20201226_094047-3a1fdf85.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/deeplabv3_unet_s5-d16_128x128_40k_stare/deeplabv3_unet_s5-d16_128x128_40k_stare_20201226_094047-93dcb93c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/unet/unet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/unet/deeplabv3_unet_s5-d16_128x128_40k_chase_db1/deeplabv3_unet_s5-d16_128x128_40k_chase_db1_20201226_094047-4c5aefa3.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/unet/unet.yml | https://arxiv.org/abs/1505.04597 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/unet/unet.yml | http://lmb.informatik.uni-freiburg.de/people/ronneber/u-net | 相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r50_512x1024_40k_cityscapes/upernet_r50_512x1024_40k_cityscapes_20200605_094827-aa54cb54.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r101_512x1024_40k_cityscapes/upernet_r101_512x1024_40k_cityscapes_20200605_094933-ebce3b10.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r50_769x769_40k_cityscapes/upernet_r50_769x769_40k_cityscapes_20200530_033048-92d21539.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r101_769x769_40k_cityscapes/upernet_r101_769x769_40k_cityscapes_20200530_040819-83c95d01.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r50_512x1024_80k_cityscapes/upernet_r50_512x1024_80k_cityscapes_20200607_052207-848beca8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r101_512x1024_80k_cityscapes/upernet_r101_512x1024_80k_cityscapes_20200607_002403-f05f2345.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r50_769x769_80k_cityscapes/upernet_r50_769x769_80k_cityscapes_20200607_005107-82ae7d15.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r101_769x769_80k_cityscapes/upernet_r101_769x769_80k_cityscapes_20200607_001014-082fc334.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r50_512x512_80k_ade20k/upernet_r50_512x512_80k_ade20k_20200614_144127-ecc8377b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r101_512x512_80k_ade20k/upernet_r101_512x512_80k_ade20k_20200614_185117-32e4db94.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r50_512x512_160k_ade20k/upernet_r50_512x512_160k_ade20k_20200615_184328-8534de8d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r101_512x512_160k_ade20k/upernet_r101_512x512_160k_ade20k_20200615_161951-91b32684.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r50_512x512_20k_voc12aug/upernet_r50_512x512_20k_voc12aug_20200617_165330-5b5890a7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r101_512x512_20k_voc12aug/upernet_r101_512x512_20k_voc12aug_20200617_165629-f14e7f27.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r50_512x512_40k_voc12aug/upernet_r50_512x512_40k_voc12aug_20200613_162257-ca9bcc6b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r101_512x512_40k_voc12aug/upernet_r101_512x512_40k_voc12aug_20200613_163549-e26476ac.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r18_512x1024_80k_cityscapes/upernet_r18_512x1024_80k_cityscapes_20220614_110712-c89a9188.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r18_512x512_80k_ade20k/upernet_r18_512x512_80k_ade20k_20220614_110319-22e81719.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r18_512x512_160k_ade20k/upernet_r18_512x512_160k_ade20k_20220615_113300-791c3f3e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/upernet/upernet.yml | https://arxiv.org/pdf/1807.10221.pdf | 论文地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r18_512x512_20k_voc12aug/upernet_r18_512x512_20k_voc12aug_20220614_123910-ed66e455.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r18_512x512_40k_voc12aug/upernet_r18_512x512_40k_voc12aug_20220614_153605-fafeb868.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/upernet/upernet.yml | https://download.openmmlab.com/mmsegmentation/v0.5/upernet/upernet_r18_512x1024_40k_cityscapes/upernet_r18_512x1024_40k_cityscapes_20220615_113231-12ee861d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/vit/vit.yml | https://download.openmmlab.com/mmsegmentation/v0.5/vit/upernet_vit-b16_mln_512x512_80k_ade20k/upernet_vit-b16_mln_512x512_80k_ade20k_20210624_130547-0403cee1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/vit/vit.yml | https://download.openmmlab.com/mmsegmentation/v0.5/vit/upernet_vit-b16_mln_512x512_160k_ade20k/upernet_vit-b16_mln_512x512_160k_ade20k_20210624_130547-852fa768.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/vit/vit.yml | 
https://download.openmmlab.com/mmsegmentation/v0.5/vit/upernet_vit-b16_ln_mln_512x512_160k_ade20k/upernet_vit-b16_ln_mln_512x512_160k_ade20k_20210621_172828-f444c077.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/vit/vit.yml | https://download.openmmlab.com/mmsegmentation/v0.5/vit/upernet_deit-s16_mln_512x512_160k_ade20k/upernet_deit-s16_mln_512x512_160k_ade20k_20210621_161021-fb9a5dfb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/vit/vit.yml | https://download.openmmlab.com/mmsegmentation/v0.5/vit/upernet_deit-s16_ln_mln_512x512_160k_ade20k/upernet_deit-s16_ln_mln_512x512_160k_ade20k_20210621_161021-c0cd652f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/vit/vit.yml | https://download.openmmlab.com/mmsegmentation/v0.5/vit/upernet_deit-s16_512x512_80k_ade20k/upernet_deit-s16_512x512_80k_ade20k_20210624_095228-afc93ec2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/vit/vit.yml | https://download.openmmlab.com/mmsegmentation/v0.5/vit/upernet_deit-s16_512x512_160k_ade20k/upernet_deit-s16_512x512_160k_ade20k_20210621_160903-5110d916.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/vit/vit.yml | https://download.openmmlab.com/mmsegmentation/v0.5/vit/upernet_deit-b16_mln_512x512_160k_ade20k/upernet_deit-b16_mln_512x512_160k_ade20k_20210621_191949-4e1450f3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/vit/vit.yml | https://download.openmmlab.com/mmsegmentation/v0.5/vit/upernet_deit-b16_ln_mln_512x512_160k_ade20k/upernet_deit-b16_ln_mln_512x512_160k_ade20k_20210623_153535-8a959c14.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/vit/vit.yml | https://download.openmmlab.com/mmsegmentation/v0.5/vit/upernet_deit-b16_512x512_80k_ade20k/upernet_deit-b16_512x512_80k_ade20k_20210624_130529-1e090789.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/configs/vit/vit.yml | https://download.openmmlab.com/mmsegmentation/v0.5/vit/upernet_deit-b16_512x512_160k_ade20k/upernet_deit-b16_512x512_160k_ade20k_20210621_180100-828705d7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/docker/Dockerfile | https://download.openmmlab.com/mmcv/dist/cu${CUDA//./}/torch${PYTORCH}/index.html | mmcv下载地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/docker/Dockerfile | https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1804/x86_64/7fa2af80.pub | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/docker/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64/3bf863cc.pub | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/docker/serve/Dockerfile | https://download.openmmlab.com/mmcv/dist/cu${CUDA//./}/torch${PYTORCH}/index.html | mmcv下载地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/Jenkinsfile | https://mirrors.aliyun.com/pypi/simple | 环境创建所使用的源 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vit/finetune/vit-large-p16_in21k-pre-3rdparty_ft-64xb64_in1k-384_20210928-b20ba619.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vit/finetune/vit-base-p32_in21k-pre-3rdparty_ft-64xb64_in1k-384_20210928-9cea8599.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vit/finetune/vit-base-p16_in21k-pre-3rdparty_ft-64xb64_in1k-384_20210928-98e8652b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg19_bn_batch256_imagenet_20210208-da620c4f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg19_batch256_imagenet_20210208-e6920e4a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg16_bn_batch256_imagenet_20210208-7e55cd29.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg16_batch256_imagenet_20210208-db26f1a5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg13_bn_batch256_imagenet_20210207-1a8b7864.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg13_batch256_imagenet_20210208-4d1d6080.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg11_bn_batch256_imagenet_20210207-f244902c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg11_batch256_imagenet_20210208-4271cd6c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/tnt/tnt-small-p16_3rdparty_in1k_20210903-c56ee7df.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/t2t-vit/t2t-vit-t-24_3rdparty_8xb64_in1k_20210928-fe95a61b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/t2t-vit/t2t-vit-t-19_3rdparty_8xb64_in1k_20210928-7f1478d5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/t2t-vit/t2t-vit-t-14_3rdparty_8xb64_in1k_20210928-b7c09b62.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/swin-transformer/swin_tiny_224_b16x64_300e_imagenet_20210616_090925-66df6be6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | 
https://download.openmmlab.com/mmclassification/v0/swin-transformer/swin_small_224_b16x64_300e_imagenet_20210615_110219-7f9d988b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/swin-transformer/convert/swin_large_patch4_window7_224_22kto1k-5f0996db.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/swin-transformer/convert/swin_base_patch4_window7_224_22kto1k-f967f799.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/shufflenet_v2/shufflenet_v2_batch1024_imagenet_20200812-5bf4721e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/shufflenet_v1/shufflenet_v1_batch1024_imagenet_20200804-5d6cec73.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/se-resnet/se-resnet50_batch256_imagenet_20200804-ae206104.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/se-resnet/se-resnet101_batch256_imagenet_20200804-ba5b51d4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnext/resnext50_32x4d_b32x8_imagenet_20210429-56066e27.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnext/resnext152_32x4d_b32x8_imagenet_20210524-927787be.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnext/resnext101_32x8d_b32x8_imagenet_20210506-23a247d5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnext/resnext101_32x4d_b32x8_imagenet_20210506-e0fa3dd5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnetv1d50_b32x8_imagenet_20210531-db14775a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet50_8xb32_in1k_20210831-ea4938fc.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet34_8xb32_in1k_20210831-f257d4e6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet18_8xb32_in1k_20210831-fbbb1da6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnetv1d152_b32x8_imagenet_20210531-278cf22a.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet152_8xb32_in1k_20210901-4d7582fa.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnetv1d101_b32x8_imagenet_20210531-6e13bcd3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet101_8xb32_in1k_20210831-539c63f8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnest/resnest50_imagenet_converted-1ebf0afe.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnest/resnest269_imagenet_converted-59930960.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnest/resnest200_imagenet_converted-581a60f2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnest/resnest101_imagenet_converted-032caa52.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/res2net/res2net50-w26-s8_3rdparty_8xb32_in1k_20210927-f547a94b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/res2net/res2net50-w14-s8_3rdparty_8xb32_in1k_20210927-bc967bf1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/res2net/res2net101-w26-s4_3rdparty_8xb32_in1k_20210927-870b6c36.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-D2se_3rdparty_4xb64-autoaug-lbs-mixup-coslr-200e_in1k_20210909-cf3139b7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-B3g4_3rdparty_4xb64-autoaug-lbs-mixup-coslr-200e_in1k_20210909-4e54846a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-B3_3rdparty_4xb64-autoaug-lbs-mixup-coslr-200e_in1k_20210909-dda968bf.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-B2g4_3rdparty_4xb64-autoaug-lbs-mixup-coslr-200e_in1k_20210909-7b7955f0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-B2_3rdparty_4xb64-coslr-120e_in1k_20210909-bd6b937c.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-B1g4_3rdparty_4xb64-coslr-120e_in1k_20210909-d4c1a642.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-B1g2_3rdparty_4xb64-coslr-120e_in1k_20210909-344f6422.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-B1_3rdparty_4xb64-coslr-120e_in1k_20210909-750cdf67.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-B0_3rdparty_4xb64-coslr-120e_in1k_20210909-446375f4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-A2_3rdparty_4xb64-coslr-120e_in1k_20210909-97d7695a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-A1_3rdparty_4xb64-coslr-120e_in1k_20210909-24003a24.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/repvgg/repvgg-A0_3rdparty_4xb64-coslr-120e_in1k_20210909-883ab98c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/mobilenet_v3/convert/mobilenet_v3_small-8427ecf0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/mobilenet_v3/convert/mobilenet_v3_large-3ea3c186.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/mobilenet_v2/mobilenet_v2_batch256_imagenet_20200708-3b2dc3af.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/vgg16_caffe-292e1171.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_caffe-788b5fa3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_msra-5891d200.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_caffe-3ad79236.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_msra-6cc46731.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_32x8d-1516f1aa.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext50-32x4d-0ab1a123.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_32x4d-a5af3160.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_64x4d-ee2c6f71.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_gn_thangvubk-ad1730dd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/mmdetection/v2.0/third_party/mobilenet_v2_batch256_imagenet-ff34753d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_gn-9186a21c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_gn-cac0ab98.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_gn_ws-15beedd8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_gn_ws-3e3c308c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext50_32x4d_gn_ws-0d87ac85.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_32x4d_gn_ws-34ac1a9e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext50_32x4d_gn-c7e8b754.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_32x4d_gn-ac3bb84e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w18_small-b5a04e21.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w18-00eb2006.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w32-dc9eeb4f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w40-ed0b031c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | 
https://download.openmmlab.com/pretrain/third_party/hrnetv2_w48-d2186c55.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/bn_inception_caffe-ed2e8665.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/i3d_r50_f32s2_k400-2c57e077.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/nl3d_r50_f32s2_k400-fa7e7caa.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/res2net101_v1d_26w_4s_mmdetv2-f0a600f9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_400mf-a5b10d96.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_800mf-1f4be4c7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_1.6gf-5791c176.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_3.2gf-c2599b0f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_4.0gf-a88f671e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_6.4gf-006af45d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_8.0gf-3c68abe7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_12gf-4c2a3350.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet18_v1c-b5776b93.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_v1c-2cccc1ad.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_v1c-e67eebb6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/mmediting/third_party/vgg_state_dict.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/mmediting/third_party/model_best_resnet34_En_nomixup.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/mmediting/third_party/mobilenet_v2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/mobilenet_v3_large-bc2c3fd3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/mobilenet_v3_small-47085aa1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnest50_d2-7497a55b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnest101_d2-f3b931b2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnest200_d2-ca88e41f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/darknet53-a628ea1b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/vgg19-dcbb9e9d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/vgg19_bn-c79401a0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/vgg16-397923af.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/vgg16_bn-6c64b313.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/resnet50-0676ba61.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/densenet201-c1103571.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/wide_resnet50_2-95faca4d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/vgg13_bn-abd245e5.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/vgg13-19584684.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/vgg11_bn-6002323d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/vgg11-8a719046.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/squeezenet1_1-b8a52dc0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/squeezenet1_0-b66bff10.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/shufflenetv2_x1-5666bf0f80.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/shufflenetv2_x0.5-f707e7126e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/resnet34-b627a593.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/resnet18-f37072fd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/resnet152-394f9c45.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/resnet101-63fe2227.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/regnet_y_8gf-d0d0e4a8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/regnet_y_800mf-1b27b58c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/regnet_y_400mf-c65dace8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/regnet_y_32gf-4dee3f7a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/regnet_y_3_2gf-b5a9779c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/regnet_y_16gf-9e6ed7dd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/regnet_y_1_6gf-b11a554e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/regnet_x_8gf-03ceed89.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/regnet_x_800mf-ad17e45c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/regnet_x_400mf-adf1edd5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/regnet_x_32gf-9d47f8d0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/regnet_x_3_2gf-f342aeae.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/regnet_x_16gf-2007eb11.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/regnet_x_1_6gf-e3633e7f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/mobilenet_v3_small-047dcff4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/mobilenet_v3_large-8738ca79.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/mobilenet_v2-b0353104.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/inception_v3_google-0cc3c7bd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/googlenet-1378be20.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/efficientnet_b7_lukemelas-dcc49843.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/efficientnet_b6_lukemelas-c76e70fd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/efficientnet_b5_lukemelas-b6417697.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/efficientnet_b4_rwightman-7eb33cd5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/efficientnet_b3_rwightman-cf984f9c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/efficientnet_b2_rwightman-bcdf34b7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/efficientnet_b1_rwightman-533bc792.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/efficientnet_b0_rwightman-3dd342df.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/densenet169-b2777c0a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/densenet161-8d451a50.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/densenet121-a639ec97.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/mmcv/model_zoo/torchvision_0.12.json | https://download.pytorch.org/models/alexnet-owt-7be5be79.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/mmcv/setup.py | openmmlab@gmail.com | 作者邮箱 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MMseg-swin/setup.py | openmmlab@gmail.com | 作者邮箱 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/semantic_segmentation/MedSAM_for_PyTorch/public_address_statement.md b/PyTorch/contrib/cv/semantic_segmentation/MedSAM_for_PyTorch/public_address_statement.md index bec1ba6ba6dd69057d9d14fc0e1f431ee57782b0..bf7b9e21995869af94748116f54969e785aaae0c 100644 --- a/PyTorch/contrib/cv/semantic_segmentation/MedSAM_for_PyTorch/public_address_statement.md +++ b/PyTorch/contrib/cv/semantic_segmentation/MedSAM_for_PyTorch/public_address_statement.md @@ -1,20 +1,9 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|------|--------|---------|------------------------|--------| -| 开发引入 | / | MedSAM_for_PyTorch/assets/architecture.png | http://ns.adobe.com/tiff/1.0/ | 相关说明 | -| 开发引入 | / | MedSAM_for_PyTorch/segment_anything/modeling/image_encoder.py | https://github.com/facebookresearch/mvit/blob/19786631e330df9f3622e5402b4a419a263a2c80/mvit/models/attention.py | 源码实现 | -| 开发引入 | / | MedSAM_for_PyTorch/segment_anything/modeling/common.py | https://github.com/facebookresearch/detectron2/blob/main/detectron2/layers/batch_norm.py | 源码实现 | -| 开源代码引入 | https://github.com/bowang-lab/MedSAM.git/MedSAM_Inference.py | MedSAM_for_PyTorch/MedSAM_Inference.py | https://github.com/facebookresearch/segment-anything/blob/main/notebooks/predictor_example.ipynb | 源码实现 | -| 开发引入 | / | MedSAM_for_PyTorch/segment_anything/modeling/mask_decoder.py | https://github.com/facebookresearch/MaskFormer/blob/main/mask_former/modeling/transformer/transformer_predictor.py | 源码实现 | -| 开源代码引入 | https://github.com/bowang-lab/MedSAM.git/utils/format_convert.py | MedSAM_for_PyTorch/utils/format_convert.py | https://stackoverflow.com/a/46574906/4521646 | 相关说明 | -| 开源代码引入 | https://github.com/bowang-lab/MedSAM.git/utils/README.md | MedSAM_for_PyTorch/pre_CT_MR.py | https://radiopaedia.org/articles/windowing-ct | 相关说明 | -| 开源代码引入 | https://github.com/bowang-lab/MedSAM.git/utils/SurfaceDice.py | MedSAM_for_PyTorch/utils/SurfaceDice.py | http://medicaldecathlon.com/files/Surface_distance_based_measures.ipynb | 相关说明 | -| 开源代码引入 | https://github.com/bowang-lab/MedSAM.git/utils/SurfaceDice.py | MedSAM_for_PyTorch/utils/SurfaceDice.py | https://en.wikipedia.org/wiki/Marching_cubes | 相关说明 | -| 开发引入 | / | MedSAM_for_PyTorch/segment_anything/modeling/image_encoder.py | 
https://github.com/facebookresearch/detectron2/blob/main/detectron2/modeling/backbone/vit.py | 源码实现 | -| 开源代码引入 | https://github.com/bowang-lab/MedSAM.git/MedSAM_Inference.py | MedSAM_for_PyTorch/extensions/seg_3dnii_sparse_marker/medsam_infer_3Dbox_adrenal.py | https://github.com/facebookresearch/segment-anything/blob/main/notebooks/predictor_example.ipynb | 源码实现 | -| 开源代码引入 | https://github.com/bowang-lab/MedSAM.git/segment_anything/build_sam.py | MedSAM_for_PyTorch/segment_anything/build_sam.py | https://dl.fbaipublicfiles.com/segment_anything/sam_vit_h_4b8939.pth | 预训练模型 | -| 开源代码引入 | https://github.com/bowang-lab/MedSAM.git/segment_anything/build_sam.py | MedSAM_for_PyTorch/segment_anything/build_sam.py | https://dl.fbaipublicfiles.com/segment_anything/sam_vit_b_01ec64.pth | 预训练模型 | -| 开源代码引入 | https://github.com/bowang-lab/MedSAM.git/utils/README.md | MedSAM_for_PyTorch/utils/pre_CT_MR.py | https://radiopaedia.org/articles/windowing-ct | 相关说明 | -| 开发引入 | / | MedSAM_for_PyTorch/assets/architecture.png | http://www.w3.org/1999/02/22-rdf-syntax-ns# | 相关说明 | -| 开源代码引入 | https://github.com/bowang-lab/MedSAM.git/setup.py | MedSAM_for_PyTorch/setup.py | https://github.com/facebookresearch/segment-anything | 源码实现 | -| 开发引入 | / | MedSAM_for_PyTorch/segment_anything/modeling/common.py | https://github.com/facebookresearch/ConvNeXt/blob/d1fa8f6fef0a165b27399986cc2bdacc92777e40/models/convnext.py#L119 | 源码实现 | -| 开源代码引入 | https://github.com/bowang-lab/MedSAM.git/segment_anything/build_sam.py | MedSAM_for_PyTorch/segment_anything/build_sam.py | https://dl.fbaipublicfiles.com/segment_anything/sam_vit_l_0b3195.pth | 预训练模型 | +| 文件位置 | 公网地址 | 公网地址用途 | +|--------------------------------------------------------------------------------------------------------------------------------------|-------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MedSAM_for_PyTorch/extensions/point_prompt/tutorial_point_prompt_seg.ipynb | https://zenodo.org/record/7860267 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MedSAM_for_PyTorch/extensions/text_prompt/tutorial_text_prompt_seg.ipynb | https://zenodo.org/record/7860267 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MedSAM_for_PyTorch/extensions/text_prompt/tutorial_text_prompt_seg.ipynb | https://rumc-gcorg-p-public.s3.amazonaws.com/i/2022/03/29/20220309-FLARE22-Pictures-2.png | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MedSAM_for_PyTorch/segment_anything/build_sam.py | https://dl.fbaipublicfiles.com/segment_anything/sam_vit_l_0b3195.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MedSAM_for_PyTorch/segment_anything/build_sam.py | https://dl.fbaipublicfiles.com/segment_anything/sam_vit_h_4b8939.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MedSAM_for_PyTorch/segment_anything/build_sam.py | https://dl.fbaipublicfiles.com/segment_anything/sam_vit_b_01ec64.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/MedSAM_for_PyTorch/tutorial_quickstart.ipynb | https://pytorch.org/get-started/locally/ | 下载依赖 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/semantic_segmentation/PSPNet/public_address_statement.md b/PyTorch/contrib/cv/semantic_segmentation/PSPNet/public_address_statement.md index 55f08f7df37ed72f78d5effc6f0bf814be9b0c0b..9e64602b2f374f442eec5216858a0363a4f1f6d8 100644 --- 
a/PyTorch/contrib/cv/semantic_segmentation/PSPNet/public_address_statement.md +++ b/PyTorch/contrib/cv/semantic_segmentation/PSPNet/public_address_statement.md @@ -1,188 +1,80 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|------|--------------------------------------------------------------|----------------------------------------------|------------------------|----| -| 开发引入 | / | url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 下载测试图片 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg11_imagenet-01ecd97e.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg13_imagenet-9ad3945d.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg16_imagenet-91b6d117.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg19_imagenet-fee352a8.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg11_bn_imagenet-6fbbbf3f.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg13_bn_imagenet-4b5f9390.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg16_bn_imagenet-3ac6d8fd.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg19_bn_imagenet-7c058385.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet18_batch256_imagenet_20200708-34ab8f90.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet34_batch256_imagenet_20200708-32ffb4f7.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet50_batch256_imagenet_20200708-cfb998bf.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet101_batch256_imagenet_20200708-753f3608.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet152_batch256_imagenet_20200708-ec25b1f9.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnetv1d50_batch256_imagenet_20200708-1ad0ce94.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnetv1d101_batch256_imagenet_20200708-9cb302ef.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnetv1d152_batch256_imagenet_20200708-e79cb6a2.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnext/resnext50_32x4d_batch256_imagenet_20200708-c07adbb7.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnext/resnext101_32x4d_batch256_imagenet_20200708-87f2d1c9.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/mmcls.json | 
https://download.openmmlab.com/mmclassification/v0/resnext/resnext101_32x8d_batch256_imagenet_20200708-1ec34aa7.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnext/resnext152_32x4d_batch256_imagenet_20200708-aab5034c.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/se-resnet/se-resnet50_batch256_imagenet_20200804-ae206104.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/se-resnet/se-resnet101_batch256_imagenet_20200804-ba5b51d4.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnest/resnest50_imagenet_converted-1ebf0afe.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnest/resnest101_imagenet_converted-032caa52.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnest/resnest200_imagenet_converted-581a60f2.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnest/resnest269_imagenet_converted-59930960.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/shufflenet_v1/shufflenet_v1_batch1024_imagenet_20200804-5d6cec73.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/shufflenet_v2/shufflenet_v2_batch1024_imagenet_20200812-5bf4721e.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/mobilenet_v2/mobilenet_v2_batch256_imagenet_20200708-3b2dc3af.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/vgg16_caffe-292e1171.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_caffe-788b5fa3.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_msra-5891d200.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_caffe-3ad79236.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_msra-6cc46731.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_32x8d-1516f1aa.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext50-32x4d-0ab1a123.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_32x4d-a5af3160.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_64x4d-ee2c6f71.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_gn_thangvubk-ad1730dd.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/open_mmlab.json | 
https://download.openmmlab.com/pretrain/third_party/resnet50_gn-9186a21c.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_gn-cac0ab98.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_gn_ws-15beedd8.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_gn_ws-3e3c308c.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext50_32x4d_gn_ws-0d87ac85.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_32x4d_gn_ws-34ac1a9e.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext50_32x4d_gn-c7e8b754.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_32x4d_gn-ac3bb84e.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w18_small-b5a04e21.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w18-00eb2006.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w32-dc9eeb4f.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w40-ed0b031c.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w48-d2186c55.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/bn_inception_caffe-ed2e8665.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/i3d_r50_f32s2_k400-2c57e077.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/nl3d_r50_f32s2_k400-fa7e7caa.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/res2net101_v1d_26w_4s_mmdetv2-f0a600f9.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_400mf-a5b10d96.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_800mf-1f4be4c7.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_1.6gf-5791c176.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_3.2gf-c2599b0f.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_4.0gf-a88f671e.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_6.4gf-006af45d.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/open_mmlab.json | 
https://download.openmmlab.com/pretrain/third_party/regnetx_8.0gf-3c68abe7.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_12gf-4c2a3350.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet18_v1c-b5776b93.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_v1c-2cccc1ad.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_v1c-e67eebb6.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/mmediting/third_party/vgg_state_dict.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/mmediting/third_party/model_best_resnet34_En_nomixup.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/mmediting/third_party/mobilenet_v2.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/mobilenet_v3_large-bc2c3fd3.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/mobilenet_v3_small-47085aa1.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnest50_d2-7497a55b.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnest101_d2-f3b931b2.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnest200_d2-ca88e41f.pth | 下载权重文件 | -| 开发引入 | / | PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/darknet53-a628ea1b.pth | 下载权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/setup.py | PSPNet/setup.py | openmmlab@gmail.com | 作者邮箱 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/setup.py | PSPNet/setup.py | http://github.com/open-mmlab/mmsegmentation | 开源地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/setup.py | PSPNet/setup.py | http://setuptools.readthedocs.io/en/latest/setuptools.html | 相关依赖 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/ | PSPNet/setup.py | https://arxiv.org/abs/1706.03762 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmseg/models/utils/self_attention_block.py | PSPNet/mmseg/models/utils/self_attention_block.py | https://arxiv.org/abs/1706.03762 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmseg/models/utils/make_divisible.py | PSPNet/mmseg/models/utils/make_divisible.py | https://github.com/tensorflow/models/blob/master/research/slim/nets/mobilenet/mobilenet.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmseg/models/necks/fpn.py | PSPNet/mmseg/models/necks/fpn.py | https://arxiv.org/abs/1612.03144 | 论文地址 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmseg/models/decode_heads/uper_head.py | PSPNet/mmseg/models/decode_heads/uper_head.py | https://arxiv.org/abs/1807.10221 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmseg/models/decode_heads/sep_aspp_head.py | PSPNet/mmseg/models/decode_heads/sep_aspp_head.py | https://arxiv.org/abs/1802.02611 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmseg/models/decode_heads/psp_head.py | PSPNet/mmseg/models/decode_heads/psp_head.py | https://arxiv.org/abs/1612.01105 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmseg/models/decode_heads/point_head.py | PSPNet/mmseg/models/decode_heads/point_head.py | https://github.com/facebookresearch/detectron2/tree/master/projects/PointRend/point_head/point_head.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmseg/models/decode_heads/ocr_head.py | PSPNet/mmseg/models/decode_heads/ocr_head.py | https://arxiv.org/abs/1909.11065 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmseg/models/decode_heads/nl_head.py | PSPNet/mmseg/models/decode_heads/nl_head.py | https://arxiv.org/abs/1711.07971 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmseg/models/decode_heads/lraspp_head.py | PSPNet/mmseg/models/decode_heads/lraspp_head.py | https://ieeexplore.ieee.org/document/9008835 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmseg/models/decode_heads/gc_head.py | PSPNet/mmseg/models/decode_heads/gc_head.py | https://arxiv.org/abs/1904.11492 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmseg/models/decode_heads/fpn_head.py | PSPNet/mmseg/models/decode_heads/fpn_head.py | https://arxiv.org/abs/1901.02446 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmseg/models/decode_heads/fcn_head.py | PSPNet/mmseg/models/decode_heads/fcn_head.py | https://arxiv.org/abs/1411.4038 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmseg/models/decode_heads/enc_head.py | PSPNet/mmseg/models/decode_heads/enc_head.py | https://arxiv.org/abs/1803.08904 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmseg/models/decode_heads/ema_head.py | PSPNet/mmseg/models/decode_heads/ema_head.py | https://arxiv.org/abs/1907.13426 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmseg/models/decode_heads/dnl_head.py | PSPNet/mmseg/models/decode_heads/dnl_head.py | https://arxiv.org/abs/2006.06668 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmseg/models/decode_heads/dm_head.py | PSPNet/mmseg/models/decode_heads/dm_head.py | https://openaccess.thecvf.com/content_ICCV_2019/papers/ | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmseg/models/decode_heads/da_head.py | PSPNet/mmseg/models/decode_heads/da_head.py | 
https://arxiv.org/abs/1809.02983 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmseg/models/decode_heads/cc_head.py | PSPNet/mmseg/models/decode_heads/cc_head.py | https://arxiv.org/abs/1811.11721 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmseg/models/decode_heads/aspp_head.py | PSPNet/mmseg/models/decode_heads/aspp_head.py | https://arxiv.org/abs/1706.05587 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmseg/models/decode_heads/apc_head.py | PSPNet/mmseg/models/decode_heads/apc_head.py | https://openaccess.thecvf.com/content_CVPR_2019/papers/ | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmseg/models/decode_heads/ann_head.py | PSPNet/mmseg/models/decode_heads/ann_head.py | https://arxiv.org/abs/1908.07678 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmseg/models/backbones/unet.py | PSPNet/mmseg/models/backbones/unet.py | https://arxiv.org/pdf/1505.04597.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmseg/models/backbones/resnet.py | PSPNet/mmseg/models/backbones/resnet.py | https://arxiv.org/pdf/1812.01187.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmseg/models/backbones/mobilenet_v3.py | PSPNet/mmseg/models/backbones/mobilenet_v3.py | https://ieeexplore.ieee.org/document/9008835 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmseg/models/backbones/hrnet.py | PSPNet/mmseg/models/backbones/hrnet.py | https://arxiv.org/abs/1904.04514 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmseg/models/backbones/cgnet.py | PSPNet/mmseg/models/backbones/cgnet.py | https://arxiv.org/abs/1811.08201 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmseg/datasets/builder.py | PSPNet/mmseg/datasets/builder.py | https://github.com/pytorch/pytorch/issues/973 | 相关参考 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmseg/core/seg/sampler/ohem_pixel_sampler.py | PSPNet/mmseg/core/seg/sampler/ohem_pixel_sampler.py | https://github.com/pytorch/pytorch/issues/22812 | 相关参考 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/runner/hooks/optimizer.py | PSPNet/mmcv_replace/runner/hooks/optimizer.py | https://arxiv.org/abs/1710.03740 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/runner/hooks/optimizer.py | PSPNet/mmcv_replace/runner/hooks/optimizer.py | https://arxiv.org/abs/1710.03740 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/runner/hooks/momentum_updater.py | PSPNet/mmcv_replace/runner/hooks/momentum_updater.py | https://arxiv.org/pdf/1708.07120.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/runner/hooks/lr_updater.py | PSPNet/mmcv_replace/runner/hooks/lr_updater.py | 
https://arxiv.org/pdf/1506.01186.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/runner/hooks/logger/mlflow.py | PSPNet/mmcv_replace/runner/hooks/logger/mlflow.py | https://www.mlflow.org/docs/latest/index.html | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/runner/fp16_utils.py | PSPNet/mmcv_replace/runner/fp16_utils.py | https://github.com/NVIDIA/apex/blob/master/apex/fp16_utils/loss_scaler.py | 相关参考 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/ops/tin_shift.py | PSPNet/mmcv_replace/ops/tin_shift.py | https://github.com/deepcs233/TIN/blob/master/cuda_shift/rtc_wrap.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/ops/tin_shift.py | PSPNet/mmcv_replace/ops/tin_shift.py | https://github.com/deepcs233/TIN/blob/master/cuda_shift/rtc_wrap.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/ops/tin_shift.py | PSPNet/mmcv_replace/ops/tin_shift.py | shaoh19@mails.tsinghua.edu.cn | 作者邮箱 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/ops/tin_shift.py | PSPNet/mmcv_replace/ops/tin_shift.py | sjqian@cse.cuhk.edu.hk | 作者邮箱 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/ops/tin_shift.py | PSPNet/mmcv_replace/ops/tin_shift.py | yuliu@ee.cuhk.edu.hk | 作者邮箱 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/ops/tin_shift.py | PSPNet/mmcv_replace/ops/tin_shift.py | https://arxiv.org/abs/2001.06499 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/ops/tin_shift.py | PSPNet/mmcv_replace/ops/tin_shift.py | https://github.com/mit-han-lab/temporal-shift-module | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/ops/saconv.py | PSPNet/mmcv_replace/ops/saconv.py | https://arxiv.org/pdf/2006.02334.pdf | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/ops/aoi_align.py | PSPNet/mmcv_replace/ops/aoi_align.py | https://github.com/facebookresearch/detectron2/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/ops/psa_mask.py | PSPNet/mmcv_replace/ops/psa_mask.py | https://github.com/hszhao/semseg/blob/master/lib/psa | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/ops/point_sample.py | PSPNet/mmcv_replace/ops/point_sample.py | https://github.com/facebookresearch/detectron2/tree/master/projects/PointRend | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/ops/nms.py | PSPNet/mmcv_replace/ops/nms.py | https://github.com/pytorch/vision/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/ops/csrc/pytorch/roi_align_cpu.cpp | PSPNet/mmcv_replace/ops/csrc/pytorch/roi_align_cpu.cpp | 
https://github.com/facebookresearch/detectron2/tree/master/detectron2/layers/csrc/ROIAlign | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/ops/csrc/pytorch/roi_align_cpu.cpp | PSPNet/mmcv_replace/ops/csrc/pytorch/roi_align_cpu.cpp | https://github.com/facebookresearch/detectron2/tree/master/detectron2/layers/csrc/ROIAlign | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/ops/csrc/pytorch/psamask_cuda.cu | PSPNet/mmcv_replace/ops/csrc/pytorch/psamask_cuda.cu | https://github.com/hszhao/semseg/blob/master/lib/psa/src | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/ops/csrc/pytorch/psamask_cuda.cpp | PSPNet/mmcv_replace/ops/csrc/pytorch/psamask_cuda.cpp | https://github.com/hszhao/semseg/blob/master/lib/psa/src | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/ops/csrc/pytorch/nms_rotated_cuda.cu | PSPNet/mmcv_replace/ops/csrc/pytorch/nms_rotated_cuda.cu | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/nms_rotated/nms_rotated_cuda.cu | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/ops/csrc/pytorch/nms_rotated_cpu.cpp | PSPNet/mmcv_replace/ops/csrc/pytorch/nms_rotated_cpu.cpp | https://github.com/hszhao/semseg/blob/master/lib/psa/src | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/ops/csrc/pytorch/nms_rotated.cpp | PSPNet/mmcv_replace/ops/csrc/pytorch/nms_rotated.cpp | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/nms_rotated/nms_rotated.h | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/ops/csrc/pytorch/info.cpp | PSPNet/mmcv_replace/ops/csrc/pytorch/info.cpp | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/vision.cpp | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/ops/csrc/pytorch/info.cpp | PSPNet/mmcv_replace/ops/csrc/pytorch/info.cpp | https://github.com/pytorch/pytorch/blob/master/aten/src/ATen/cuda/detail/CUDAHooks.cpp#L231 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/ops/csrc/pytorch/info.cpp | PSPNet/mmcv_replace/ops/csrc/pytorch/info.cpp | https://github.com/pytorch/pytorch/blob/master/aten/src/ATen/Version.cpp | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/ops/csrc/pytorch/corner_pool.cpp | PSPNet/mmcv_replace/ops/csrc/pytorch/corner_pool.cpp | https://github.com/princeton-vl/CornerNet-Lite/tree/master/core/models/py_utils/_cpools/src | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/ops/csrc/pytorch/cc_attention_cuda.cpp | PSPNet/mmcv_replace/ops/csrc/pytorch/cc_attention_cuda.cpp | https://github.com/LikeLy-Journey/SegmenTron/blob/master/segmentron/modules/csrc/criss_cross_attention/ca_cuda.cu | 源码实现 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/ops/csrc/pytorch/box_iou_rotated_cuda.cu | PSPNet/mmcv_replace/ops/csrc/pytorch/box_iou_rotated_cuda.cu | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_cuda.cu | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/ops/csrc/pytorch/box_iou_rotated_cpu.cpp | PSPNet/mmcv_replace/ops/csrc/pytorch/box_iou_rotated_cpu.cpp | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_cpu.cpp | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/ops/csrc/pytorch/box_iou_rotated.cpp | PSPNet/mmcv_replace/ops/csrc/pytorch/box_iou_rotated.cpp | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated.h | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/ops/csrc/parrots/roi_align_cpu.cpp | PSPNet/mmcv_replace/ops/csrc/parrots/roi_align_cpu.cpp | https://github.com/facebookresearch/detectron2/tree/master/detectron2/layers/csrc/ROIAlign | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/ops/csrc/parrots/nms_rotated_cuda.cu | PSPNet/mmcv_replace/ops/csrc/parrots/nms_rotated_cuda.cu | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/nms_rotated/nms_rotated_cuda.cu | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/ops/csrc/parrots/psamask_cuda.cu | PSPNet/mmcv_replace/ops/csrc/parrots/psamask_cuda.cu | https://github.com/hszhao/semseg/blob/master/lib/psa/src | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/ops/csrc/parrots/nms_rotated.cpp | PSPNet/mmcv_replace/ops/csrc/parrots/nms_rotated.cpp | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/nms_rotated/nms_rotated.h | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/ops/csrc/parrots/corner_pool.cpp | PSPNet/mmcv_replace/ops/csrc/parrots/corner_pool.cpp | https://github.com/princeton-vl/CornerNet-Lite/tree/master/core/models/py_utils/_cpools/src | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/ops/csrc/parrots/box_iou_rotated_cuda.cu | PSPNet/mmcv_replace/ops/csrc/parrots/box_iou_rotated_cuda.cu | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_cuda.cu | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/ops/csrc/parrots/box_iou_rotated.cpp | PSPNet/mmcv_replace/ops/csrc/parrots/box_iou_rotated.cpp | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated.h | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/ops/csrc/parrots/box_iou_rotated_cpu.cpp | PSPNet/mmcv_replace/ops/csrc/parrots/box_iou_rotated_cpu.cpp | 
https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_cpu.cpp | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/ops/csrc/parrots/nms_rotated_cuda.cuh | PSPNet/mmcv_replace/ops/csrc/parrots/nms_rotated_cuda.cuh | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/nms_rotated/nms_rotated_cuda.cu | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/ops/csrc/parrots/modulated_deform_conv_cuda_kernel.cuh | PSPNet/mmcv_replace/ops/csrc/parrots/modulated_deform_conv_cuda_kernel.cuh | https://arxiv.org/abs/1703.06211 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/ops/csrc/parrots/modulated_deform_conv_cuda_kernel.cuh | PSPNet/mmcv_replace/ops/csrc/parrots/modulated_deform_conv_cuda_kernel.cuh | https://github.com/chengdazhi/Deformable-Convolution-V2-PyTorch/blob/mmdetection/mmdet/ops/dcn/src/deform_conv_cuda_kernel.cu | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/ops/csrc/parrots/deform_conv_cuda_kernel.cuh | PSPNet/mmcv_replace/ops/csrc/parrots/deform_conv_cuda_kernel.cuh | https://arxiv.org/abs/1703.06211 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/ops/csrc/parrots/deform_conv_cuda_kernel.cuh | PSPNet/mmcv_replace/ops/csrc/parrots/deform_conv_cuda_kernel.cuh | https://github.com/chengdazhi/Deformable-Convolution-V2-PyTorch/blob/mmdetection/mmdet/ops/dcn/src/deform_conv_cuda_kernel.cu | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/ops/csrc/parrots/carafe_cuda_kernel.cuh | PSPNet/mmcv_replace/ops/csrc/parrots/carafe_cuda_kernel.cuh | https://devblogs.nvidia.com/efficient-matrix-transpose-cuda-cc/ | 相关参考 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/ops/csrc/parrots/box_iou_rotated_utils.hpp | PSPNet/mmcv_replace/ops/csrc/parrots/box_iou_rotated_utils.hpp | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_utils.h | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/ops/csrc/parrots/box_iou_rotated_cuda.cuh | PSPNet/mmcv_replace/ops/csrc/parrots/box_iou_rotated_cuda.cuh | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/csrc/box_iou_rotated/box_iou_rotated_cuda.cu | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/ops/corner_pool.py | PSPNet/mmcv_replace/ops/corner_pool.py | https://arxiv.org/abs/1808.01244 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/ops/corner_pool.py | PSPNet/mmcv_replace/ops/corner_pool.py | https://github.com/princeton-vl/CornerNet-Lite | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/ops/carafe.py | PSPNet/mmcv_replace/ops/carafe.py | https://arxiv.org/abs/1905.02188 | 论文地址 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/onnx/symbolic.py | PSPNet/mmcv_replace/onnx/symbolic.py | https://github.com/pytorch/pytorch | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/onnx/simplify/core.py | PSPNet/mmcv_replace/onnx/simplify/core.py | https://github.com/daquexian/onnx-simplifier | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/onnx/simplify/core.py | PSPNet/mmcv_replace/onnx/simplify/core.py | https://github.com/onnx/onnx/blob/e5e9a539f550f07ec156812484e8d4f33fb91f88/onnx/onnx.proto#L461 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/onnx/simplify/core.py | PSPNet/mmcv_replace/onnx/simplify/core.py | https://github.com/onnx/onnx/issues/2417 | 相关参考 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/onnx/simplify/core.py | PSPNet/mmcv_replace/onnx/simplify/core.py | https://github.com/onnx/onnx/issues/2613 | 相关参考 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/onnx/onnx_utils/symbolic_helper.py | PSPNet/mmcv_replace/onnx/onnx_utils/symbolic_helper.py | https://github.com/pytorch/pytorch | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/onnx/onnx_utils/symbolic_helper.py | PSPNet/mmcv_replace/onnx/onnx_utils/symbolic_helper.py | https://github.com/pytorch/pytorch/blob/75ee5756715e7161314ce037474843b68f69fc04/torch/onnx/symbolic_helper.py#L375 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/image/io.py | PSPNet/mmcv_replace/image/io.py | https://github.com/lilohuang/PyTurboJPEG | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/image/colorspace.py | PSPNet/mmcv_replace/image/colorspace.py | https://en.wikipedia.org/wiki/YCbCr#ITU-R_BT.601_conversion | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/image/colorspace.py | PSPNet/mmcv_replace/image/colorspace.py | https://en.wikipedia.org/wiki/YCbCr#JPEG_conversion | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/cnn/utils/weight_init.py | PSPNet/mmcv_replace/cnn/utils/weight_init.py | http://proceedings.mlr.press/v9/glorot10a/glorot10a.pdf | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/cnn/utils/weight_init.py | PSPNet/mmcv_replace/cnn/utils/weight_init.py | https://www.cv-foundation.org/openaccess/content_iccv_2015/ | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/cnn/utils/weight_init.py | PSPNet/mmcv_replace/cnn/utils/weight_init.py | http://download.openmmlab.com/mmdetection/v2.0/retinanet/ | 预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/cnn/utils/flops_counter.py | PSPNet/mmcv_replace/cnn/utils/flops_counter.py | https://github.com/sovrasov/flops-counter.pytorch | 源码实现 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/cnn/bricks/wrappers.py | PSPNet/mmcv_replace/cnn/bricks/wrappers.py | https://github.com/facebookresearch/detectron2/blob/master/detectron2/layers/wrappers.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/cnn/bricks/plugin.py | PSPNet/mmcv_replace/cnn/bricks/plugin.py | https://inflection.readthedocs.io/en/latest/#inflection.underscore | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/cnn/bricks/non_local.py | PSPNet/mmcv_replace/cnn/bricks/non_local.py | https://arxiv.org/abs/1711.07971 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/cnn/bricks/non_local.py | PSPNet/mmcv_replace/cnn/bricks/non_local.py | https://github.com/AlexHex7/Non-local_pytorch | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/cnn/bricks/generalized_attention.py | PSPNet/mmcv_replace/cnn/bricks/generalized_attention.py | https://arxiv.org/abs/1711.07971 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/cnn/bricks/depthwise_separable_conv_module.py | PSPNet/mmcv_replace/cnn/bricks/depthwise_separable_conv_module.py | https://arxiv.org/pdf/1704.04861.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/cnn/bricks/conv_ws.py | PSPNet/mmcv_replace/cnn/bricks/conv_ws.py | https://arxiv.org/pdf/1903.10520.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/cnn/bricks/conv_ws.py | PSPNet/mmcv_replace/cnn/bricks/conv_ws.py | https://arxiv.org/pdf/2006.02334.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation/blob/9f071cade8cdc59c13b416c7c9843005410c055c/mmcv_replace/cnn/bricks/context_block.py | PSPNet/mmcv_replace/cnn/bricks/context_block.py | https://arxiv.org/abs/1904.11492 | 论文地址 | - +| 文件位置 | 公网地址 | 公网地址用途 | +|---------------------------------------------------------------------------------------------------------|-------------------------------------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg11_imagenet-01ecd97e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg13_imagenet-9ad3945d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg16_imagenet-91b6d117.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg19_imagenet-fee352a8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg11_bn_imagenet-6fbbbf3f.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg13_bn_imagenet-4b5f9390.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg16_bn_imagenet-3ac6d8fd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/vgg/vgg19_bn_imagenet-7c058385.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet18_batch256_imagenet_20200708-34ab8f90.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet34_batch256_imagenet_20200708-32ffb4f7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet50_batch256_imagenet_20200708-cfb998bf.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet101_batch256_imagenet_20200708-753f3608.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnet152_batch256_imagenet_20200708-ec25b1f9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnetv1d50_batch256_imagenet_20200708-1ad0ce94.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnetv1d101_batch256_imagenet_20200708-9cb302ef.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnet/resnetv1d152_batch256_imagenet_20200708-e79cb6a2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnext/resnext50_32x4d_batch256_imagenet_20200708-c07adbb7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnext/resnext101_32x4d_batch256_imagenet_20200708-87f2d1c9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnext/resnext101_32x8d_batch256_imagenet_20200708-1ec34aa7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnext/resnext152_32x4d_batch256_imagenet_20200708-aab5034c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/se-resnet/se-resnet50_batch256_imagenet_20200804-ae206104.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/se-resnet/se-resnet101_batch256_imagenet_20200804-ba5b51d4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnest/resnest50_imagenet_converted-1ebf0afe.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnest/resnest101_imagenet_converted-032caa52.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnest/resnest200_imagenet_converted-581a60f2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/resnest/resnest269_imagenet_converted-59930960.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/shufflenet_v1/shufflenet_v1_batch1024_imagenet_20200804-5d6cec73.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/shufflenet_v2/shufflenet_v2_batch1024_imagenet_20200812-5bf4721e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/mmcls.json | https://download.openmmlab.com/mmclassification/v0/mobilenet_v2/mobilenet_v2_batch256_imagenet_20200708-3b2dc3af.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/vgg16_caffe-292e1171.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_caffe-788b5fa3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_msra-5891d200.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_caffe-3ad79236.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_msra-6cc46731.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_32x8d-1516f1aa.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext50-32x4d-0ab1a123.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_32x4d-a5af3160.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/open_mmlab.json | 
https://download.openmmlab.com/pretrain/third_party/resnext101_64x4d-ee2c6f71.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_gn_thangvubk-ad1730dd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_gn-9186a21c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_gn-cac0ab98.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_gn_ws-15beedd8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_gn_ws-3e3c308c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext50_32x4d_gn_ws-0d87ac85.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_32x4d_gn_ws-34ac1a9e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext50_32x4d_gn-c7e8b754.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnext101_32x4d_gn-ac3bb84e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w18_small-b5a04e21.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w18-00eb2006.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w32-dc9eeb4f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w40-ed0b031c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/hrnetv2_w48-d2186c55.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/bn_inception_caffe-ed2e8665.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/i3d_r50_f32s2_k400-2c57e077.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/nl3d_r50_f32s2_k400-fa7e7caa.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/res2net101_v1d_26w_4s_mmdetv2-f0a600f9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_400mf-a5b10d96.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_800mf-1f4be4c7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_1.6gf-5791c176.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_3.2gf-c2599b0f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_4.0gf-a88f671e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_6.4gf-006af45d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_8.0gf-3c68abe7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/regnetx_12gf-4c2a3350.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet18_v1c-b5776b93.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet50_v1c-2cccc1ad.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnet101_v1c-e67eebb6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/mmediting/third_party/vgg_state_dict.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/mmediting/third_party/model_best_resnet34_En_nomixup.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/mmediting/third_party/mobilenet_v2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/mobilenet_v3_large-bc2c3fd3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/mobilenet_v3_small-47085aa1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/open_mmlab.json | 
https://download.openmmlab.com/pretrain/third_party/resnest50_d2-7497a55b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnest101_d2-f3b931b2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/resnest200_d2-ca88e41f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/mmcv_replace/model_zoo/open_mmlab.json | https://download.openmmlab.com/pretrain/third_party/darknet53-a628ea1b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/setup.py | openmmlab@gmail.com | 作者邮箱 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PSPNet/url.ini | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 数据集地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/semantic_segmentation/PointRend/public_address_statement.md b/PyTorch/contrib/cv/semantic_segmentation/PointRend/public_address_statement.md index b3eabe170d62d1aaf7a3ac236d4976d5ca745336..ed12ce6bd3499d6fcbe6d88ea1c4a65d29f85d94 100644 --- a/PyTorch/contrib/cv/semantic_segmentation/PointRend/public_address_statement.md +++ b/PyTorch/contrib/cv/semantic_segmentation/PointRend/public_address_statement.md @@ -1,160 +1,15 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------|----------------------------------------------------------------------------| ------------------------------------ |---------| -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/configs/COCO-InstanceSegmentation/mask_rcnn_regnetx_4gf_dds_fpn_1x.py | https://dl.fbaipublicfiles.com/pycls/dds_baselines/160906383/RegNetX-4.0GF_dds_8gpu.pyth | 下载预训练权重 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/configs/COCO-InstanceSegmentation/mask_rcnn_regnety_4gf_dds_fpn_1x.py | https://dl.fbaipublicfiles.com/pycls/dds_baselines/160906838/RegNetY-4.0GF_dds_8gpu.pyth | 下载预训练权重 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/datasets/prepare_for_tests.sh | https://dl.fbaipublicfiles.com/detectron2 | 下载依赖 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/datasets/prepare_panoptic_fpn.py | https://dl.fbaipublicfiles.com/detectron2/ | 下载依赖 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/checkpoint/catalog.py | https://dl.fbaipublicfiles.com/detectron | 下载依赖 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/model_zoo/model_zoo.py | https://dl.fbaipublicfiles.com/detectron2/ | 下载依赖 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/utils/collect_env.py | file:///tmp/nccl_tmp_file | 下载依赖 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/utils/file_io.py | https://dl.fbaipublicfiles.com/detectron2/ | 下载依赖 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/utils/testing.py | http://images.cocodataset.org/train2017/000000000009.jpg | 下载图片 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/dev/packaging/build_wheel.sh | https://download.pytorch.org/whl/ | 下载依赖 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/dev/packaging/gen_install_table.py | https://dl.fbaipublicfiles.com/detectron2/wheels/ | 下载依赖 | -| 开源代码引入 | 
https://github.com/facebookresearch/detectron2 | PointRend/docker/deploy.Dockerfile | https://github.com/protocolbuffers/protobuf/releases/download/v3.11.4/protobuf-cpp-3.11.4.tar.gz | 下载依赖 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/docker/deploy.Dockerfile | https://github.com/pytorch/vision/ | 下载依赖 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/docker/Dockerfile | https://bootstrap.pypa.io/get-pip.py | 下载依赖 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/docker/Dockerfile | https://download.pytorch.org/whl/cu111/torch_stable.html | 下载依赖 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/docker/Dockerfile | https://github.com/facebookresearch/fvcore | 下载依赖 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/docker/Dockerfile | https://github.com/facebookresearch/detectron2 | 下载依赖 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/docs/conf.py | https://github.com/facebookresearch/detectron2/blob/master/ | 下载依赖 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/docs/conf.py | https://docs.python.org/3.6 | 下载依赖 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/docs/conf.py | https://docs.scipy.org/doc/numpy/ | 下载依赖 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/docs/conf.py | https://pytorch.org/docs/master/ | 下载依赖 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/docs/conf.py | https://arxiv.org/abs/ | 下载依赖 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/docs/requirements.txt | https://download.pytorch.org/whl/cpu/torch-1.7.0%2Bcpu-cp37-cp37m-linux_x86_64.whl | 下载依赖 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/docs/requirements.txt | https://download.pytorch.org/whl/cpu/torchvision-0.8.1%2Bcpu-cp37-cp37m-linux_x86_64.whl | 下载依赖 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/setup.py | https://github.com/facebookresearch/detectron2 | 下载依赖 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/setup.py | https://github.com/cocodataset/panopticapi/archive/master.zip | 下载依赖 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/tests/data/test_coco_evaluation.py | http://farm8.staticflickr.com/7434/9138147604_c6225224b8_z.jpg | 下载图片 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/tests/test_model_zoo.py | https://dl.fbaipublicfiles.com/detectron2/Misc/scratch_mask_rcnn_R_50_FPN_3x_gn/138602908/model_final_01ca85.pkl | 下载预训练权重 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/tools/deploy/export_model.py | https://github.com/pytorch/pytorch/issues/46944 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/tests/test_export_torchscript.py | https://detectron2.readthedocs.io/tutorials/deployment.html | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/tests/test_export_torchscript.py | https://github.com/pytorch/pytorch/issues/46944 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/tests/layers/test_roi_align.py | https://github.com/tensorflow/tensorflow/issues/26278 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/tests/data/test_detection_utils.py | https://github.com/recurser/exif-orientation-examples/raw/master/Landscape_5.jpg | 图片链接 | -| 开源代码引入 | 
https://github.com/facebookresearch/detectron2 | PointRend/tests/data/test_coco_evaluation.py | http://images.cocodataset.org/val2017/000000000285.jpg | 图片链接 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/tests/data/test_coco_evaluation.py | http://farm8.staticflickr.com/7434/9138147604_c6225224b8_z.jpg | 图片链接 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/tests/data/test_coco_evaluation.py | http://images.cocodataset.org/val2017/000000000139.jpg | 图片链接 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/setup.py | https://github.com/pytorch/pytorch/pull/43931 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/setup.py | https://github.com/ppwwyyxx/cocoapi | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/setup.py | https://pypi.org/project/{name}/#files | 相关依赖 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/setup.py | https://github.com/skvark/opencv-python | 相关依赖 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/projects/PointRend/point_rend/color_augmentation.py | https://github.com/weiliu89/caffe/blob/4817bf8b4200b35ada8ed0dc378dceaf38c539e4/src/caffe/util/im_transforms.cpp | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/projects/PointRend/point_rend/color_augmentation.py | https://github.com/chainer/chainercv/blob/7159616642e0be7c5b3ef380b848e16b7e99355b/chainercv/links/model/ssd/transforms.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/docs/conf.py | http://www.sphinx-doc.org/en/master/config | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/docs/conf.py | https://github.com/readthedocs/recommonmark/blob/ddd56e7717e9745f11300059e4268e204138a6b1/recommonmark/parser.py#L152-L155 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/docker/Dockerfile | https://pytorch.org/ | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/docker/Dockerfile | http://images.cocodataset.org/val2017/000000439715.jpg | 图片链接 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/dev/packaging/pkg_helpers.bash | https://github.com/pytorch/pytorch/blob/master/torch/utils/cpp_extension.py#L1363 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/dev/packaging/build_wheel.sh | https://github.com/NVIDIA/nvidia-docker/issues/854 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/utils/visualizer.py | https://github.com/matplotlib/matplotlib/issues/15363 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/utils/visualizer.py | https://stackoverflow.com/questions/8919719/how-to-plot-a-complex-polygon | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/utils/serialize.py | https://github.com/joblib/joblib/blob/master/joblib/externals/loky/cloudpickle_wrapper.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/utils/logger.py | https://github.com/abseil/abseil-py/blob/master/absl/logging/__init__.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/utils/env.py | https://stackoverflow.com/questions/67631/how-to-import-a-module-given-the-full-path | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/utils/env.py 
| https://github.com/python-trio/trio/blob/6754c74eacfad9cc5c92d5c24727a2f3b620624e/trio/_util.py#L216-L241 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/structures/masks.py | https://stackoverflow.com/questions/24467972/calculate-area-of-polygon-given-x-y-coordinates | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/structures/keypoints.py | https://github.com/pytorch/pytorch/issues/44768 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/structures/boxes.py | https://github.com/pytorch/pytorch/issues/18627 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/structures/boxes.py | https://github.com/kuangliu/torchcv/blob/master/torchcv/utils/box.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/solver/build.py | https://github.com/facebookresearch/detr/pull/287 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/projects/__init__.py | https://github.com/pypa/setuptools/issues/230 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/modeling/proposal_generator/rrpn.py | https://github.com/pytorch/pytorch/issues/22812 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/modeling/proposal_generator/proposal_utils.py | https://github.com/pytorch/pytorch/issues/47379 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/modeling/proposal_generator/proposal_utils.py | https://github.com/pytorch/pytorch/issues/22812 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/modeling/proposal_generator/proposal_utils.py | https://github.com/facebookresearch/Detectron/issues/459 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/modeling/poolers.py | https://github.com/pytorch/pytorch/issues/41412 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/modeling/mmdet_wrapper.py | https://github.com/open-mmlab/mmdetection/blob/master/mmdet/models/detectors/two_stage.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/modeling/mmdet_wrapper.py | https://github.com/open-mmlab/mmdetection/blob/master/mmdet/models/backbones/resnet.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/modeling/mmdet_wrapper.py | https://github.com/open-mmlab/mmdetection/tree/master/configs/_base_/datasets | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/modeling/mmdet_wrapper.py | https://github.com/open-mmlab/mmdetection/blob/master/mmdet/models/detectors/base.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/modeling/meta_arch/semantic_seg.py | https://github.com/pytorch/pytorch/issues/48163 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/modeling/backbone/regnet.py | https://github.com/facebookresearch/pycls | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/modeling/backbone/fpn.py | https://github.com/pytorch/pytorch/issues/47336 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/modeling/anchor_generator.py | https://github.com/facebookresearch/Detectron/issues/227 | 
相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/model_zoo/__init__.py | https://github.com/facebookresearch/detectron2/blob/master/MODEL_ZOO.md | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/layers/wrappers.py | https://github.com/pytorch/pytorch/issues/12013 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/layers/wrappers.py | https://github.com/pytorch/pytorch/issues/40507 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/layers/wrappers.py | https://github.com/pytorch/pytorch/issues/38718 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/layers/roi_align.py | https://github.com/pytorch/vision/pull/2438 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/layers/csrc/vision.cpp | https://github.com/pytorch/pytorch/blob/master/aten/src/ATen/cuda/detail/CUDAHooks.cpp#L231 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/layers/csrc/vision.cpp | https://github.com/pytorch/pytorch/blob/master/aten/src/ATen/Version.cpp | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/layers/csrc/deformable/deform_conv_cuda_kernel.cu | https://github.com/open-mmlab/mmdetection/blob/master/mmdet/ops/dcn/src/deform_conv_cuda_kernel.cu | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/layers/csrc/deformable/deform_conv_cuda_kernel.cu | https://github.com/chengdazhi/Deformable-Convolution-V2-PyTorch/blob/mmdetection/mmdet/ops/dcn/src/deform_conv_cuda_kernel.cu | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/layers/csrc/deformable/deform_conv_cuda_kernel.cu | https://arxiv.org/abs/1703.06211 | 论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/layers/csrc/deformable/deform_conv_cuda.cu | https://github.com/open-mmlab/mmdetection/blob/master/mmdet/ops/dcn/src/deform_conv_cuda.cpp | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/layers/csrc/deformable/deform_conv_cuda.cu | https://github.com/chengdazhi/Deformable-Convolution-V2-PyTorch/blob/mmdetection/mmdet/ops/dcn/src/deform_conv_cuda.c | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/layers/batch_norm.py | https://github.com/pytorch/pytorch/blob/master/torch/nn/modules/batchnorm.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/layers/batch_norm.py | https://github.com/pytorch/pytorch/pull/36382 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/layers/aspp.py | https://github.com/tensorflow/models/blob/21b73d22f3ed05b650e85ac50849408dd36de32e/research/deeplab/model.py#L532 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/export/torchscript_patch.py | https://github.com/pytorch/pytorch/issues/38964 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/export/torchscript_patch.py | https://pytorch.org/docs/stable/jit_language_reference.html#optional-type-refinement | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/export/torchscript_patch.py | https://github.com/pytorch/pytorch/issues/36061 | 相关说明 | -| 开源代码引入 | 
https://github.com/facebookresearch/detectron2 | PointRend/detectron2/export/torchscript.py | https://pytorch.org/docs/stable/jit.html#inspecting-code | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/export/shared.py | https://www.geeksforgeeks.org/find-paths-given-source-destination/ | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/export/api.py | https://github.com/lutzroeder/netron | 相关工具 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/evaluation/sem_seg_evaluation.py | http://cocodataset.org/#stuff-eval | 数据集链接 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/evaluation/sem_seg_evaluation.py | http://cocodataset.org/#format-results | 数据集链接 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/evaluation/pascal_voc_evaluation.py | https://github.com/rbgirshick/py-faster-rcnn/blob/master/lib/datasets/voc_eval.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/evaluation/lvis_evaluation.py | https://github.com/facebookresearch/Detectron/blob/a6a835f5b8208c45d0dce217ce9bbda915f44df7/detectron/datasets/json_dataset_evaluator.py#L255 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/evaluation/coco_evaluation.py | http://cocodataset.org/#detection-eval | 数据集链接 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/evaluation/coco_evaluation.py | http://cocodataset.org/#keypoints-eval | 数据集链接 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/evaluation/coco_evaluation.py | https://arxiv.org/pdf/2102.01066.pdf | 论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/evaluation/coco_evaluation.py | https://github.com/facebookresearch/Detectron/blob/a6a835f5b8208c45d0dce217ce9bbda915f44df7/detectron/datasets/json_dataset_evaluator.py#L222-L252 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/evaluation/coco_evaluation.py | https://github.com/facebookresearch/Detectron/blob/a6a835f5b8208c45d0dce217ce9bbda915f44df7/detectron/datasets/json_dataset_evaluator.py#L255 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/evaluation/cityscapes_evaluation.py | https://github.com/mcordts/cityscapesScripts/blob/master/cityscapesscripts/evaluation/evalInstanceLevelSemanticLabeling.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/evaluation/cityscapes_evaluation.py | https://github.com/mcordts/cityscapesScripts/blob/master/cityscapesscripts/evaluation/evalPixelLevelSemanticLabeling.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/engine/train_loop.py | http://engineering.hearsaysocial.com/2013/06/16/circular-references-in-python/ | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/engine/train_loop.py | https://arxiv.org/abs/2006.15704 | 论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/engine/launch.py | https://github.com/pytorch/pytorch/pull/14391 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/engine/launch.py | https://github.com/facebookresearch/maskrcnn-benchmark/issues/172 | 相关参考 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | 
PointRend/detectron2/engine/defaults.py | https://pytorch.org/docs/stable/ddp_comm_hooks.html#torch.distributed.algorithms.ddp_comm_hooks.default_hooks.fp16_compress_hook | 相关参考 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/engine/defaults.py | https://pytorch.org/docs/stable/distributed.html | 相关参考 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/engine/defaults.py | https://github.com/sphinx-doc/sphinx/issues/4258 | 相关参考 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/data/transforms/transform.py | https://detectron2.readthedocs.io/tutorials/augmentation.html | 相关参考 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/data/transforms/transform.py | https://pillow.readthedocs.io/en/latest/PIL.html#PIL.ImageTransform.ExtentTransform | 相关参考 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/data/transforms/transform.py | https://github.com/opencv/opencv/issues/11784 | 相关参考 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/data/transforms/transform.py | https://pillow.readthedocs.io/en/stable/ | 相关参考 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/data/transforms/augmentation_impl.py | https://github.com/tensorflow/tpu/blob/master/models/official/detection/utils/input_utils.py#L127 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/data/transforms/augmentation_impl.py | https://pillow.readthedocs.io/en/3.0.x/reference/ImageEnhance.html | 相关参考 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/data/transforms/augmentation.py | https://detectron2.readthedocs.io/tutorials/augmentation.html | 相关参考 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/data/detection_utils.py | https://en.wikipedia.org/wiki/YUV#SDTV_with_BT.601 | 相关参考 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/data/detection_utils.py | hhttps://www.exiv2.org/tags.html | 相关参考 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/data/detection_utils.py | https://github.com/python-pillow/Pillow/issues/3973 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/data/detection_utils.py | https://github.com/wkentaro/labelme/blob/v4.5.4/labelme/utils/image.py#L59 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/data/detection_utils.py | https://github.com/python-pillow/Pillow/blob/7.1.2/src/PIL/ImageOps.py#L527 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/data/detection_utils.py | https://github.com/facebookresearch/detectron2/issues/1885 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/data/datasets/lvis.py | http://farm6.staticflickr.com/5454/9413846304_881d5e5c3b_z.jpg | 图片链接 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/data/datasets/lvis.py | http://images.cocodataset.org/train2017/000000155379.jpg | 图片链接 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/data/datasets/coco.py | http://farm6.staticflickr.com/5454/9413846304_881d5e5c3b_z.jpg | 图片链接 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/data/datasets/coco.py | 
https://detectron2.readthedocs.io/en/latest/tutorials/datasets.html | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/data/datasets/coco.py | https://detectron2.readthedocs.io/tutorials/datasets.html#register-a-dataset | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/data/datasets/coco.py | http://cocodataset.org/#format-data | 数据集链接 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/data/datasets/coco.py | https://github.com/facebookresearch/detectron2/pull/175#issuecomment-551202163 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/data/datasets/cityscapes.py | https://github.com/mcordts/cityscapesScripts/blob/master/cityscapesscripts/preparation/json2instanceImg.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/data/datasets/cityscapes.py | https://github.com/mcordts/cityscapesScripts/blob/master/cityscapesscripts/evaluation/instances2dict.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/data/datasets/builtin_meta.py | https://github.com/cocodataset/panopticapi/blob/master/panoptic_coco_categories.json | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/data/datasets/builtin_meta.py | https://github.com/mcordts/cityscapesScripts/blob/master/cityscapesscripts/helpers/labels.py | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/config/lazy.py | https://github.com/omry/omegaconf/issues/784 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/config/lazy.py | https://github.com/open-mmlab/mmcv/blob/master/mmcv/utils/config.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/config/lazy.py | https://hydra.cc/docs/next/advanced/override_grammar/basic/ | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/config/instantiate.py | https://github.com/facebookresearch/hydra/issues/1200 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/config/defaults.py | https://detectron2.readthedocs.io/en/latest/tutorials/lazyconfigs.html | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/config/defaults.py | https://pillow.readthedocs.io/en/stable/handbook/concepts.html#concept-modes | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/detectron2/config/defaults.py | https://arxiv.org/abs/1811.11168 | 论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/datasets/prepare_cocofied_lvis.py | https://github.com/lvis-dataset/lvis-api/blob/master/data/coco_to_synset.json | 相关配置 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/configs/new_baselines/mask_rcnn_regnetx_4gf_dds_FPN_100ep_LSJ.py | https://github.com/facebookresearch/detectron2/blob/master/configs/COCO-InstanceSegmentation/mask_rcnn_regnetx_4gf_dds_fpn_1x.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/configs/new_baselines/mask_rcnn_regnety_4gf_dds_FPN_100ep_LSJ.py | https://github.com/facebookresearch/detectron2/blob/master/configs/COCO-InstanceSegmentation/mask_rcnn_regnety_4gf_dds_fpn_1x.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/configs/new_baselines/mask_rcnn_R_50_FPN_100ep_LSJ.py | 
https://github.com/pytorch/pytorch/issues/36530 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/configs/new_baselines/mask_rcnn_R_50_FPN_100ep_LSJ.py | https://github.com/tensorflow/tpu/blob/b24729de804fdb751b06467d3dce0637fa652060/models/official/detection/modeling/architecture/heads.py#L95-L97 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/configs/new_baselines/mask_rcnn_R_50_FPN_100ep_LSJ.py | https://github.com/tensorflow/tpu/blob/b24729de804fdb751b06467d3dce0637fa652060/models/official/detection/utils/input_utils.py#L127 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/configs/COCO-InstanceSegmentation/mask_rcnn_regnety_4gf_dds_fpn_1x.py | https://github.com/facebookresearch/pycls/blob/2c152a6e5d913e898cca4f0a758f41e6b976714d/configs/dds_baselines/regnety/RegNetY-4.0GF_dds_8gpu.yaml#L4-L10 | 相关配置 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/configs/COCO-InstanceSegmentation/mask_rcnn_regnety_4gf_dds_fpn_1x.py | https://dl.fbaipublicfiles.com/pycls/dds_baselines/160906838/RegNetY-4.0GF_dds_8gpu.pyth | 模型权重 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/configs/COCO-InstanceSegmentation/mask_rcnn_regnetx_4gf_dds_fpn_1x.py | https://github.com/facebookresearch/pycls/blob/2c152a6e5d913e898cca4f0a758f41e6b976714d/configs/dds_baselines/regnetx/RegNetX-4.0GF_dds_8gpu.yaml#L4-L9 | 相关配置 | -| 开源代码引入 | https://github.com/facebookresearch/detectron2 | PointRend/configs/COCO-InstanceSegmentation/mask_rcnn_regnetx_4gf_dds_fpn_1x.py | https://dl.fbaipublicfiles.com/pycls/dds_baselines/160906383/RegNetX-4.0GF_dds_8gpu.pyth | 模型权重 | -| 开发引入 | / | PointRend/docs/requirements.txt | git://github.com/facebookresearch/fvcore.git | 相关依赖 | -| 开发引入 | / | PointRend/docs/requirements.txt | https://download.pytorch.org/whl/cpu/torch-1.7.0%2Bcpu-cp37-cp37m-linux_x86_64.whl | 相关依赖 | -| 开发引入 | / | PointRend/docs/requirements.txt | https://download.pytorch.org/whl/cpu/torchvision-0.8.1%2Bcpu-cp37-cp37m-linux_x86_64.whl | 相关依赖 | -| 开发引入 | / | PointRend/tools/deploy/CMakeLists.txt | https://pytorch.org/tutorials/advanced/cpp_frontend.html | 相关依赖 | - +| 文件位置 | 公网地址 | 公网地址用途 | +|-------------------------------------------------------------------------------------------------------------------------------------------|------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PointRend/configs/COCO-InstanceSegmentation/mask_rcnn_regnetx_4gf_dds_fpn_1x.py | https://dl.fbaipublicfiles.com/pycls/dds_baselines/160906383/RegNetX-4.0GF_dds_8gpu.pyth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PointRend/configs/COCO-InstanceSegmentation/mask_rcnn_regnety_4gf_dds_fpn_1x.py | https://dl.fbaipublicfiles.com/pycls/dds_baselines/160906838/RegNetY-4.0GF_dds_8gpu.pyth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PointRend/datasets/prepare_for_tests.sh | https://dl.fbaipublicfiles.com/detectron2 | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PointRend/datasets/prepare_panoptic_fpn.py | https://dl.fbaipublicfiles.com/detectron2 | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PointRend/detectron2/data/datasets/coco.py | https://detectron2.readthedocs.io/en/latest/tutorials/datasets.html | 相关说明 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PointRend/detectron2/evaluation/coco_evaluation.py | http://cocodataset.org/#keypoints-eval | 数据集详情 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PointRend/detectron2/model_zoo/model_zoo.py | https://dl.fbaipublicfiles.com/detectron2/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PointRend/detectron2/utils/file_io.py | https://dl.fbaipublicfiles.com/detectron2 | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PointRend/detectron2/utils/testing.py | http://images.cocodataset.org/train2017/000000000009.jpg | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PointRend/dev/packaging/build_wheel.sh | https://download.pytorch.org/whl/"$CU_VERSION"/torch_stable.html | 三方库地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PointRend/dev/packaging/gen_install_table.py | https://dl.fbaipublicfiles.com/detectron2/wheels/{cuda}/torch{torch}/index.html | 三方库地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PointRend/docker/Dockerfile | https://bootstrap.pypa.io/get-pip.py | 三方库地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PointRend/docker/Dockerfile | https://download.pytorch.org/whl/cu111/torch_stable.html | 三方库地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/semantic_segmentation/PraNet/public_address_statement.md b/PyTorch/contrib/cv/semantic_segmentation/PraNet/public_address_statement.md index 1db8df0e3e54d0e0b6a6fd651bdff7eb4d136e59..5625f3b21254c37cb112a08c5d7fe564742b37fb 100644 --- a/PyTorch/contrib/cv/semantic_segmentation/PraNet/public_address_statement.md +++ b/PyTorch/contrib/cv/semantic_segmentation/PraNet/public_address_statement.md @@ -1,5 +1,4 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------|----------------------------------------------------------------------------| ------------------------------------ |---------| -| 开源代码引入 | https://github.com/DengPingFan/PraNet | PraNet/lib/Res2Net_v1b.py | https://shanghuagao.oss-cn-beijing.aliyuncs.com/res2net/res2net50_v1b_26w_4s-3cf99910.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/DengPingFan/PraNet | PraNet/lib/Res2Net_v1b.py | https://shanghuagao.oss-cn-beijing.aliyuncs.com/res2net/res2net101_v1b_26w_4s-0812c246.pth | 下载预训练权重 | -| 开源代码引入 | https://github.com/DengPingFan/PraNet | PraNet/utils/utils.py | https://github.com/Lyken17/pytorch-OpCounter | 相关说明 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|-------------------------------------------------------------------------------------|--------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PraNet/lib/Res2Net_v1b.py | https://shanghuagao.oss-cn-beijing.aliyuncs.com/res2net/res2net50_v1b_26w_4s-3cf99910.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/PraNet/lib/Res2Net_v1b.py | https://shanghuagao.oss-cn-beijing.aliyuncs.com/res2net/res2net101_v1b_26w_4s-0812c246.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/semantic_segmentation/RefineNet/public_address_statement.md b/PyTorch/contrib/cv/semantic_segmentation/RefineNet/public_address_statement.md index 1a172755275ce4dc2a245ccbc7569413071fdd90..8950d9844b20953ebb3a1574eeed067e30db4b5a 100644 --- a/PyTorch/contrib/cv/semantic_segmentation/RefineNet/public_address_statement.md +++ 
b/PyTorch/contrib/cv/semantic_segmentation/RefineNet/public_address_statement.md @@ -1,4 +1,4 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ------ | -------------- | -------- | ------------------ | ----------- | -| 开源代码引入 | https://github.com/DrSleep/refinenet-pytorch | RefineNet/models/refinenet.py | https://cloudstor.aarnet.edu.au/plus/s/Owmttk9bdPROwc6/download | 下载模型初始化时的权重文件 | -| 开源代码引入 | https://github.com/DrSleep/refinenet-pytorch | RefineNet/models/refinenet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 下载模型初始化时的权重文件 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|-----------------------------------------------------------------------------------------|-----------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/RefineNet/models/refinenet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/RefineNet/models/refinenet.py | https://cloudstor.aarnet.edu.au/plus/s/Owmttk9bdPROwc6/download | 下载链接 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/semantic_segmentation/SETR/public_address_statement.md b/PyTorch/contrib/cv/semantic_segmentation/SETR/public_address_statement.md index 1d7c8dec65a3a114b3d46947a0dedc4d37275632..5f7d2874eafc9b22b963534032913a94d254f593 100644 --- a/PyTorch/contrib/cv/semantic_segmentation/SETR/public_address_statement.md +++ b/PyTorch/contrib/cv/semantic_segmentation/SETR/public_address_statement.md @@ -1,25 +1,5 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|------|--------|---------|------------------------|--------| -| 开源代码引入 | https://github.com/fudan-zvg/SETR/setup.py | SETR/setup.py | http://setuptools.readthedocs.io/en/latest/setuptools.html#declaring-platform-specific-dependencies | 相关依赖 | -| 开源代码引入 | https://github.com/fudan-zvg/SETR/mmseg/models/backbones/vit_mla.py | SETR/mmseg/models/backbones/vit.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_p16_384-83fb41ba.pth | 预训练模型 | -| 开发引入 | / | SETR/tools/analyze_logs.py | https://github.com/open- | 源码实现 | -| 开发引入 | / | SETR/demo.py | https://bbs-img.huaweicloud.com/blogs/img/thumb/1591951315139_8989_1363.png | 图片地址 | -| 开源代码引入 | https://github.com/fudan-zvg/SETR/mmseg/models/backbones/vit_mla.py | SETR/mmseg/models/backbones/vit.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/vit_base_p16_224-4e355ebd.pth | 预训练模型 | -| 开源代码引入 | https://github.com/fudan-zvg/SETR/mmseg/datasets/builder.py | SETR/mmseg/datasets/builder.py | https://github.com/pytorch/pytorch/issues/973 | 相关说明 | -| 开源代码引入 | https://github.com/fudan-zvg/SETR/mmseg/models/backbones/vit_mla.py | SETR/mmseg/models/backbones/vit.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_large_p16_384-b3be5167.pth | 预训练模型 | -| 开发引入 | / | SETR/mmseg/core/seg/sampler/ohem_pixel_sampler.py | https://github.com/pytorch/pytorch/issues/22812 | 相关说明 | -| 开发引入 | / | SETR/mmseg/models/utils/self_attention_block.py | https://arxiv.org/abs/1706.03762 | 论文地址 | -| 开发引入 | / | SETR/tools/accuracy_comparision.py | https://github.com/NVIDIA/apex/tree/master/examples/imagenet | 源码实现 | -| 开源代码引入 | https://github.com/fudan-zvg/SETR/docker/Dockerfile | SETR/docker/Dockerfile | https://download.openmmlab.com/mmcv/dist/index.html | 相关依赖 | -| 开源代码引入 | https://github.com/fudan-zvg/SETR/mmseg/models/backbones/vit_mla.py | 
SETR/mmseg/models/backbones/vit.py | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_384-d0272ac0.pth | 预训练模型 | -| 开源代码引入 | https://github.com/fudan-zvg/SETR/setup.py | SETR/setup.py | openmmlab@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/fudan-zvg/SETR/mmseg/models/decode_heads/vit_up_head.py | SETR/mmseg/models/decode_heads/vit_up_head.py | https://people.sc.fsu.edu/~jburkardt/presentations/truncated_normal.pdf | 论文地址 | -| 开源代码引入 | https://github.com/fudan-zvg/SETR/mmseg/models/backbones/vit_mla.py | SETR/mmseg/models/backbones/vit.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/vit_small_p16_224-15ec54c9.pth | 预训练模型 | -| 开源代码引入 | https://github.com/fudan-zvg/SETR/mmseg/models/decode_heads/vit_up_head.py | SETR/mmseg/models/backbones/vit.py | https://people.sc.fsu.edu/~jburkardt/presentations/truncated_normal.pdf | 论文地址 | -| 开源代码引入 | https://github.com/fudan-zvg/SETR/mmseg/models/backbones/vit_mla.py | SETR/mmseg/models/backbones/vit.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_large_p32_384-9b920ba8.pth | 预训练模型 | -| 开发引入 | / | SETR/tools/train.py | https://github.com/NVIDIA/apex/tree/master/examples/imagenet | 源码实现 | -| 开源代码引入 | https://github.com/fudan-zvg/SETR/mmseg/models/backbones/vit_mla.py | SETR/mmseg/models/backbones/vit.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_p32_384-830016f5.pth | 预训练模型 | -| 开源代码引入 | https://github.com/fudan-zvg/SETR/setup.py | SETR/setup.py | http://github.com/open-mmlab/mmsegmentation | 源码实现 | -| 开发引入 | / | SETR/mmcv-need/fp16_utils.py | https://github.com/NVIDIA/apex/blob/master/apex/fp16_utils/loss_scaler.py | 源码实现 | -| 开发引入 | / | SETR/docker/Dockerfile | https://github.com/open-mmlab/mmsegmenation.git | 源码实现 | -| 开发引入 | / | SETR/mmcv-need/optimizer.py | https://arxiv.org/abs/1710.03740 | 论文地址 | +| 文件位置 | 公网地址 | 公网地址用途 | +|----------------------------------------------------------------------------------------------|----------------------------------------------------------------------------------|-----------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/SETR/docker/Dockerfile | https://download.openmmlab.com/mmcv/dist/index.html | mmcv下载地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/SETR/mmseg/models/backbones/vit.py | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_384-d0272ac0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/SETR/setup.py | openmmlab@gmail.com | 作者邮箱 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/semantic_segmentation/SeMask/public_address_statement.md b/PyTorch/contrib/cv/semantic_segmentation/SeMask/public_address_statement.md index c1142816c0f317e083bdcc3dc8cdc153d41d8b9e..a8c1ffe9784eedf54dab15d5ae933255adac5792 100644 --- a/PyTorch/contrib/cv/semantic_segmentation/SeMask/public_address_statement.md +++ b/PyTorch/contrib/cv/semantic_segmentation/SeMask/public_address_statement.md @@ -1,17 +1,3 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ---- | ------------ | ------ | ------------------------------------ | -------- | -| 开源代码引入 | https://github.com/Picsart-AI-Research/SeMask-Segmentation/blob/main/SeMask-FPN/mmseg/models/decode_heads/branch_fpn_head.py|SeMask/mmseg/models/decode_heads/branch_fpn_head.py | https://arxiv.org/abs/1901.02446 | 引用参考论文地址 | -| 开源代码引入 | 
https://github.com/Picsart-AI-Research/SeMask-Segmentation/blob/main/SeMask-FAPN/SeMask-Mask2Former/mask2former/modeling/backbone/semask_swin.py|SeMask/mmseg/models/backbones/swin_transformer.py | https://arxiv.org/pdf/2103.14030 | 引用参考论文地址 | -| 开源代码引入 | https://github.com/Picsart-AI-Research/SeMask-Segmentation/blob/main/SeMask-FPN/mmseg/models/decode_heads/branch_fpn_head.py|SeMask/mmseg/models/decode_heads/fpn_head.py | https://arxiv.org/abs/1901.02446 | 引用参考论文地址 | -| 开源代码引入 | https://github.com/Picsart-AI-Research/SeMask-Segmentation/blob/main/SeMask-FPN/mmseg/models/losses/lovasz_loss.py|SeMask/mmseg/models/losses/lovasz_loss.py | https://github.com/bermanmaxim/LovaszSoftmax/blob/master/pytor | 源码实现 | -| 开源代码引入 | https://github.com/Picsart-AI-Research/SeMask-Segmentation/blob/main/SeMask-FAPN/SeMask-Mask2Former/mask2former/modeling/backbone/semask_swin.py|SeMask/mmseg/models/backbones/semask_swin_transformer.py | https://arxiv.org/pdf/2103.14030 | 引用参考论文地址 | -| 开源代码引入 | https://github.com/Picsart-AI-Research/SeMask-Segmentation/blob/main/SeMask-FPN/mmseg/models/losses/lovasz_loss.py|SeMask/mmseg/models/losses/lovasz_loss.py | https://arxiv.org/abs/1705.08790 | 模型相关说明 | -| 开源代码引入 | https://github.com/Picsart-AI-Research/SeMask-Segmentation/blob/main/SeMask-FPN/mmseg/models/necks/fpn.py|SeMask/mmseg/models/necks/fpn.py | https://arxiv.org/abs/1612.03144 | 引用参考论文地址 | -| 开源代码引入 | https://github.com/Picsart-AI-Research/SeMask-Segmentation/blob/main/SeMask-FPN/mmseg/models/segmentors/base.py|SeMask/mmseg/models/segmentors/base.py | https://github.com/open-mmlab/mmdetection/issues/5844 | 模型相关说明 | -| 开源代码引入 | https://github.com/Picsart-AI-Research/SeMask-Segmentation/blob/main/SeMask-FPN/mmseg/models/utils/make_divisible.py|SeMask/mmseg/models/utils/make_divisible.py | https://github.com/tensorflow/models/blob/master/research/slim/nets/mobilenet/mobilenet.py | 源码实现 | -| 开源代码引入 | https://github.com/Picsart-AI-Research/SeMask-Segmentation/blob/main/SeMask-FPN/mmseg/models/utils/self_attention_block.py|SeMask/mmseg/models/utils/self_attention_block.py | https://arxiv.org/abs/1706.03762 | 引用参考论文地址 | -| 开源代码引入 | https://github.com/Picsart-AI-Research/SeMask-Segmentation/blob/main/SeMask-FPN/mmseg/core/seg/sampler/ohem_pixel_sampler.py|SeMask/mmseg/core/seg/sampler/ohem_pixel_sampler.py | https://github.com/pytorch/pytorch/issues/22812 | 模型相关说明 | -| 开源代码引入 | https://github.com/Picsart-AI-Research/SeMask-Segmentation/blob/main/SeMask-FPN/setup.py|SeMask/setup.py | http://setuptools.readthedocs.io/en/latest/setuptools.html#declaring-platform-specific-dependencies | 模型相关说明 | -| 开源代码引入 | https://github.com/Picsart-AI-Research/SeMask-Segmentation/blob/main/SeMask-FPN/setup.py|SeMask/setup.py | openmmlab@gmail.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/Picsart-AI-Research/SeMask-Segmentation/blob/main/SeMask-FPN/setup.py|SeMask/setup.py | http://github.com/open-mmlab/mmsegmentation | 源码实现 | -| 开源代码引入 | https://github.com/Picsart-AI-Research/SeMask-Segmentation/blob/main/SeMask-FPN/mmseg/datasets/builder.py|SeMask/mmseg/datasets/builder.py | https://github.com/pytorch/pytorch/issues/973 | 模型相关说明 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|---------------------------------------------------------------------------|---------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/semantic_segmentation/SeMask/setup.py | openmmlab@gmail.com | 作者邮箱 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/video/BSN/public_address_statement.md 
b/PyTorch/contrib/cv/video/BSN/public_address_statement.md index 6d21d4a55bb795d69e1db23e36884b4a3c372ac0..a8e3e1c716fa924e58c88fac56864590c56c1823 100644 --- a/PyTorch/contrib/cv/video/BSN/public_address_statement.md +++ b/PyTorch/contrib/cv/video/BSN/public_address_statement.md @@ -1,3 +1,3 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------|---------|------------------------|--------| -| 开源代码引入 | https://github.com/wzmsltw/BSN-boundary-sensitive-network/Evaluation/utils.py | BSN/Evaluation/utils.py | http://ec2-52-11-11-89.us-west-2.compute.amazonaws.com/challenge16/api.py | 源码实现 | +| 文件位置 | 公网地址 | 公网地址用途 | +|-------------------------------------------------------------------|---------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/BSN/Evaluation/utils.py | http://ec2-52-11-11-89.us-west-2.compute.amazonaws.com/challenge16/api.py | 源码实现 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/video/C3D/public_address_statement.md b/PyTorch/contrib/cv/video/C3D/public_address_statement.md index 2a643f5e3c2bad017bebd8029d51d72aca54806a..1d0d86c9075ed373c408d709081a8f8ffdbc7e22 100644 --- a/PyTorch/contrib/cv/video/C3D/public_address_statement.md +++ b/PyTorch/contrib/cv/video/C3D/public_address_statement.md @@ -1,615 +1,517 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ------ | -------------- | -------- | --------------------- |----------- | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/csn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/csn/ircsn_ig65m_pretrained_r152_32x2x1_58e_kinetics400_rgb/20200728_031952.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/csn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/csn/ircsn_ig65m_pretrained_r152_32x2x1_58e_kinetics400_rgb/20200728_031952.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/csn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/csn/ircsn_ig65m_pretrained_r152_32x2x1_58e_kinetics400_rgb/ircsn_ig65m_pretrained_r152_32x2x1_58e_kinetics400_rgb_20200803-fc66ce8d.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/csn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/csn/ircsn_ig65m_pretrained_bnfrozen_r152_32x2x1_58e_kinetics400_rgb/20200809_053132.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/csn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/csn/ircsn_ig65m_pretrained_bnfrozen_r152_32x2x1_58e_kinetics400_rgb/20200809_053132.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/csn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/csn/ircsn_ig65m_pretrained_bnfrozen_r152_32x2x1_58e_kinetics400_rgb/ircsn_ig65m_pretrained_bnfrozen_r152_32x2x1_58e_kinetics400_rgb_20200812-9037a758.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/csn/metafile.yml | 
https://download.openmmlab.com/mmaction/recognition/csn/vmz/vmz_ipcsn_from_scratch_r152_32x2x1_180e_kinetics400_rgb_20210617-d565828d.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/csn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/csn/vmz/vmz_ipcsn_ig65m_pretrained_r152_32x2x1_58e_kinetics400_rgb_20210617-c3be9793.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/csn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/csn/vmz/vmz_ipcsn_sports1m_pretrained_r152_32x2x1_58e_kinetics400_rgb_20210617-3367437a.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/csn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/csn/vmz/vmz_ircsn_from_scratch_r152_32x2x1_180e_kinetics400_rgb_20210617-5c933ae1.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/csn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/csn/vmz/vmz_ircsn_ig65m_pretrained_r50_32x2x1_58e_kinetics400_rgb_20210617-86d33018.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/csn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/csn/vmz/vmz_ircsn_sports1m_pretrained_r152_32x2x1_58e_kinetics400_rgb_20210617-b9b10241.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/c3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/c3d/c3d_sports1m_16x1x1_45e_ucf101_rgb/20201021_140429.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/c3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/c3d/c3d_sports1m_16x1x1_45e_ucf101_rgb/c3d_sports1m_16x1x1_45e_ucf101_rgb_20201021-26655025.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/c3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/c3d/c3d_sports1m_16x1x1_45e_ucf101_rgb/20201021_140429.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_32x2x1_100e_kinetics400_rgb/20200614_060456.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_32x2x1_100e_kinetics400_rgb/20200614_060456.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_32x2x1_100e_kinetics400_rgb/i3d_r50_32x2x1_100e_kinetics400_rgb_20200614-c25ef9a4.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_256p_32x2x1_100e_kinetics400_rgb/20200725_031555.log.json | 源码模型训练日志 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_256p_32x2x1_100e_kinetics400_rgb/20200725_031555.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_256p_32x2x1_100e_kinetics400_rgb/i3d_r50_256p_32x2x1_100e_kinetics400_rgb_20200801-7d9f44de.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_video_32x2x1_100e_kinetics400_rgb/20200706_143014.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_video_32x2x1_100e_kinetics400_rgb/20200706_143014.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_video_32x2x1_100e_kinetics400_rgb/i3d_r50_video_32x2x1_100e_kinetics400_rgb_20200826-e31c6f52.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_dense_32x2x1_100e_kinetics400_rgb/20200616_230011.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_dense_32x2x1_100e_kinetics400_rgb/20200616_230011.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_dense_32x2x1_100e_kinetics400_rgb/i3d_r50_dense_32x2x1_100e_kinetics400_rgb_20200616-2bbb4361.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_dense_256p_32x2x1_100e_kinetics400_rgb/20200725_031604.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_dense_256p_32x2x1_100e_kinetics400_rgb/20200725_031604.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_dense_256p_32x2x1_100e_kinetics400_rgb/i3d_r50_dense_256p_32x2x1_100e_kinetics400_rgb_20200725-24eb54cc.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_fast_32x2x1_100e_kinetics400_rgb/20200612_233836.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/i3d/metafile.yml | 
https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_fast_32x2x1_100e_kinetics400_rgb/20200612_233836.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_fast_32x2x1_100e_kinetics400_rgb/i3d_r50_fast_32x2x1_100e_kinetics400_rgb_20200612-000e4d2a.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_fast_256p_32x2x1_100e_kinetics400_rgb/20200725_031457.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_fast_256p_32x2x1_100e_kinetics400_rgb/20200725_031457.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_fast_256p_32x2x1_100e_kinetics400_rgb/i3d_r50_fast_256p_32x2x1_100e_kinetics400_rgb_20200817-4e90d1d5.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_nl_embedded_gaussian_r50_32x2x1_100e_kinetics400_rgb/20200813_034054.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_nl_embedded_gaussian_r50_32x2x1_100e_kinetics400_rgb/20200813_034054.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_nl_embedded_gaussian_r50_32x2x1_100e_kinetics400_rgb/i3d_nl_embedded_gaussian_r50_32x2x1_100e_kinetics400_rgb_20200813-6e6aef1b.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_nl_gaussian_r50_32x2x1_100e_kinetics400_rgb/20200813_034909.log.json| 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_nl_gaussian_r50_32x2x1_100e_kinetics400_rgb/20200813_034909.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_nl_gaussian_r50_32x2x1_100e_kinetics400_rgb/i3d_nl_gaussian_r50_32x2x1_100e_kinetics400_rgb_20200815-17f84aa2.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_nl_dot_product_r50_32x2x1_100e_kinetics400_rgb/20200814_044208.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/i3d/metafile.yml | 
https://download.openmmlab.com/mmaction/recognition/i3d/i3d_nl_dot_product_r50_32x2x1_100e_kinetics400_rgb/20200814_044208.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_nl_dot_product_r50_32x2x1_100e_kinetics400_rgb/i3d_nl_dot_product_r50_32x2x1_100e_kinetics400_rgb_20200814-7c30d5bb.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/baseline/tsn_r50_1x1x8_100e_minikinetics_rgb_20201030.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/baseline/tsn_r50_1x1x8_100e_minikinetics_rgb_20201030.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/baseline/tsn_r50_1x1x8_100e_minikinetics_rgb_20201030-b4eaf92b.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/googleimage/tsn_r50_1x1x8_100e_minikinetics_googleimage_rgb_20201030.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/googleimage/tsn_r50_1x1x8_100e_minikinetics_googleimage_rgb_20201030.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/googleimage/tsn_r50_1x1x8_100e_minikinetics_googleimage_rgb_20201030-23966b4b.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/webimage/tsn_r50_1x1x8_100e_minikinetics_webimage_rgb_20201030.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/webimage/tsn_r50_1x1x8_100e_minikinetics_webimage_rgb_20201030.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/webimage/tsn_r50_1x1x8_100e_minikinetics_webimage_rgb_20201030-66f5e046.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/omnisource/metafile.yml | 
https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/insvideo/tsn_r50_1x1x8_100e_minikinetics_insvideo_rgb_20201030.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/insvideo/tsn_r50_1x1x8_100e_minikinetics_insvideo_rgb_20201030.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/insvideo/tsn_r50_1x1x8_100e_minikinetics_insvideo_rgb_20201030-011f984d.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/kineticsraw/tsn_r50_1x1x8_100e_minikinetics_kineticsraw_rgb_20201030.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/kineticsraw/tsn_r50_1x1x8_100e_minikinetics_kineticsraw_rgb_20201030.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/kineticsraw/tsn_r50_1x1x8_100e_minikinetics_kineticsraw_rgb_20201030-59f5d064.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/omnisource/tsn_r50_1x1x8_100e_minikinetics_omnisource_rgb_20201030.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/omnisource/tsn_r50_1x1x8_100e_minikinetics_omnisource_rgb_20201030.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/omnisource/tsn_r50_1x1x8_100e_minikinetics_omnisource_rgb_20201030-0f56ef51.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/baseline/slowonly_r50_8x8x1_256e_minikinetics_rgb_20201030.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/baseline/slowonly_r50_8x8x1_256e_minikinetics_rgb_20201030.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | 
C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/baseline/slowonly_r50_8x8x1_256e_minikinetics_rgb_20201030-168eb098.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/googleimage/slowonly_r50_8x8x1_256e_minikinetics_googleimage_rgb_20201030.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/googleimage/slowonly_r50_8x8x1_256e_minikinetics_googleimage_rgb_20201030.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/googleimage/slowonly_r50_8x8x1_256e_minikinetics_googleimage_rgb_20201030-7da6dfc3.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/webimage/slowonly_r50_8x8x1_256e_minikinetics_webimage_rgb_20201030.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/webimage/slowonly_r50_8x8x1_256e_minikinetics_webimage_rgb_20201030.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/webimage/slowonly_r50_8x8x1_256e_minikinetics_webimage_rgb_20201030-c36616e9.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/insvideo/slowonly_r50_8x8x1_256e_minikinetics_insvideo_rgb_20201030.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/insvideo/slowonly_r50_8x8x1_256e_minikinetics_insvideo_rgb_20201030.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/insvideo/slowonly_r50_8x8x1_256e_minikinetics_insvideo_rgb_20201030-e2890e8d.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/omnisource/metafile.yml | 
https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/kineticsraw/slowonly_r50_8x8x1_256e_minikinetics_kineticsraw_rgb_20201030.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/kineticsraw/slowonly_r50_8x8x1_256e_minikinetics_kineticsraw_rgb_20201030.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/kineticsraw/slowonly_r50_8x8x1_256e_minikinetics_kineticsraw_rgb_20201030-62974bac.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/omnisource/slowonly_r50_8x8x1_256e_minikinetics_omnisource_rgb_20201030.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/omnisource/slowonly_r50_8x8x1_256e_minikinetics_omnisource_rgb_20201030.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/omnisource/slowonly_r50_8x8x1_256e_minikinetics_omnisource_rgb_20201030-284cfd3b.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/omni/tsn_imagenet_pretrained_r50_omni_1x1x3_kinetics400_rgb_20200926-54192355.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/omni/tsn_imagenet_pretrained_r50_omni_1x1x3_kinetics400_rgb_20200926-54192355.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/omni/slowonly_r50_omni_4x16x1_kinetics400_rgb_20200926-51b1f7ea.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/omni/slowonly_r101_omni_8x8x1_kinetics400_rgb_20200926-b5dbb701.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/r2plus1d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/r2plus1d/r2plus1d_r34_256p_8x8x1_180e_kinetics400_rgb/20200728_021421.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/r2plus1d/metafile.yml | 
https://download.openmmlab.com/mmaction/recognition/r2plus1d/r2plus1d_r34_256p_8x8x1_180e_kinetics400_rgb/20200728_021421.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/r2plus1d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/r2plus1d/r2plus1d_r34_256p_8x8x1_180e_kinetics400_rgb/r2plus1d_r34_256p_8x8x1_180e_kinetics400_rgb_20200729-aa94765e.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/r2plus1d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/r2plus1d/r2plus1d_r34_video_8x8x1_180e_kinetics400_rgb/20200724_201360.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/r2plus1d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/r2plus1d/r2plus1d_r34_video_8x8x1_180e_kinetics400_rgb/20200724_201360.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/r2plus1d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/r2plus1d/r2plus1d_r34_video_8x8x1_180e_kinetics400_rgb/r2plus1d_r34_video_8x8x1_180e_kinetics400_rgb_20200826-ab35a529.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/r2plus1d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/r2plus1d/r2plus1d_r34_8x8x1_180e_kinetics400_rgb/r2plus1d_r34_8x8_69.58_88.36.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/r2plus1d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/r2plus1d/r2plus1d_r34_8x8x1_180e_kinetics400_rgb/r21d_8x8.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/r2plus1d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/r2plus1d/r2plus1d_r34_8x8x1_180e_kinetics400_rgb/r2plus1d_r34_8x8x1_180e_kinetics400_rgb_20200618-3fce5629.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/r2plus1d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/r2plus1d/r2plus1d_r34_32x2x1_180e_kinetics400_rgb/r2plus1d_r34_32x2_74.6_91.6.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/r2plus1d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/r2plus1d/r2plus1d_r34_32x2x1_180e_kinetics400_rgb/r21d_32x2.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/r2plus1d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/r2plus1d/r2plus1d_r34_32x2x1_180e_kinetics400_rgb/r2plus1d_r34_32x2x1_180e_kinetics400_rgb_20200618-63462eb3.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_256p_4x16x1_256e_kinetics400_rgb/20200731_151706.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowfast/metafile.yml | 
https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_256p_4x16x1_256e_kinetics400_rgb/20200731_151706.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_256p_4x16x1_256e_kinetics400_rgb/slowfast_r50_256p_4x16x1_256e_kinetics400_rgb_20200728-145f1097.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_video_4x16x1_256e_kinetics400_rgb/20200812_160237.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_video_4x16x1_256e_kinetics400_rgb/20200812_160237.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_video_4x16x1_256e_kinetics400_rgb/slowfast_r50_video_4x16x1_256e_kinetics400_rgb_20200826-f85b90c5.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_4x16x1_256e_kinetics400_rgb/20200704_232901.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_4x16x1_256e_kinetics400_rgb/20200704_232901.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_4x16x1_256e_kinetics400_rgb/slowfast_r50_4x16x1_256e_kinetics400_rgb_20200704-bcde7ed7.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_256p_8x8x1_256e_kinetics400_rgb/20200731_151537.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_256p_8x8x1_256e_kinetics400_rgb/20200731_151537.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_256p_8x8x1_256e_kinetics400_rgb/slowfast_r50_256p_8x8x1_256e_kinetics400_rgb_20200810-863812c2.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_8x8x1_256e_kinetics400_rgb/20200716_192653.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowfast/metafile.yml | 
https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_8x8x1_256e_kinetics400_rgb/20200716_192653.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_8x8x1_256e_kinetics400_rgb/slowfast_r50_8x8x1_256e_kinetics400_rgb_20200716-73547d2b.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r101_4x16x1_256e_kinetics400_rgb/20210118_133528.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r101_4x16x1_256e_kinetics400_rgb/20210118_133528.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r101_4x16x1_256e_kinetics400_rgb/slowfast_r101_4x16x1_256e_kinetics400_rgb_20210218-d8b58813.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r101_8x8x1_256e_kinetics400_rgb/20210218_121513.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r101_8x8x1_256e_kinetics400_rgb/20210218_121513.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r101_8x8x1_256e_kinetics400_rgb/slowfast_r101_8x8x1_256e_kinetics400_rgb_20210218-0dd54025.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r152_4x16x1_256e_kinetics400_rgb/20210122_131321.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r152_4x16x1_256e_kinetics400_rgb/20210122_131321.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r152_4x16x1_256e_kinetics400_rgb/slowfast_r152_4x16x1_256e_kinetics400_rgb_20210122-bdeb6b87.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_16x8x1_22e_sthv1_rgb/20210606_225114.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowfast/metafile.yml | 
https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_16x8x1_22e_sthv1_rgb/20210606_225114.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_16x8x1_22e_sthv1_rgb/slowfast_r50_16x8x1_22e_sthv1_rgb_20210630-53355c16.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/omni/slowonly_r50_omni_4x16x1_kinetics400_rgb_20200926-51b1f7ea.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/omni/slowonly_r101_without_omni_8x8x1_kinetics400_rgb_20200926-0c730aef.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/omni/slowonly_r101_omni_8x8x1_kinetics400_rgb_20200926-b5dbb701.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_256p_4x16x1_256e_kinetics400_rgb/20200817_001411.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_256p_4x16x1_256e_kinetics400_rgb/20200817_001411.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_256p_4x16x1_256e_kinetics400_rgb/slowonly_r50_256p_4x16x1_256e_kinetics400_rgb_20200820-bea7701f.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_video_320p_4x16x1_256e_kinetics400_rgb/slowonly_r50_video_320p_4x16x1_256e_kinetics400_rgb_20201014.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_video_320p_4x16x1_256e_kinetics400_rgb/slowonly_r50_video_320p_4x16x1_256e_kinetics400_rgb_20201014.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_video_320p_4x16x1_256e_kinetics400_rgb/slowonly_r50_video_320p_4x16x1_256e_kinetics400_rgb_20201014-c9cdc656.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_256p_8x8x1_256e_kinetics400_rgb/20200817_003320.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | 
C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_256p_8x8x1_256e_kinetics400_rgb/20200817_003320.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_256p_8x8x1_256e_kinetics400_rgb/slowonly_r50_256p_8x8x1_256e_kinetics400_rgb_20200820-75851a7d.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_4x16x1_256e_kinetics400_rgb/slowonly_r50_4x16_73.02_90.77.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_4x16x1_256e_kinetics400_rgb/so_4x16.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_4x16x1_256e_kinetics400_rgb/slowonly_r50_4x16x1_256e_kinetics400_rgb_20200704-a69556c6.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_8x8x1_256e_kinetics400_rgb/slowonly_r50_8x8_74.93_91.92.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_8x8x1_256e_kinetics400_rgb/so_8x8.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_8x8x1_256e_kinetics400_rgb/slowonly_r50_8x8x1_256e_kinetics400_rgb_20200703-a79c555a.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_4x16x1_150e_kinetics400_rgb/slowonly_imagenet_pretrained_r50_4x16x1_150e_kinetics400_rgb_20200912.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_4x16x1_150e_kinetics400_rgb/slowonly_imagenet_pretrained_r50_4x16x1_150e_kinetics400_rgb_20200912.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_4x16x1_150e_kinetics400_rgb/slowonly_imagenet_pretrained_r50_4x16x1_150e_kinetics400_rgb_20200912-1e8fc736.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/metafile.yml | 
https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_8x8x1_150e_kinetics400_rgb/slowonly_imagenet_pretrained_r50_8x8x1_150e_kinetics400_rgb_20200912.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_8x8x1_150e_kinetics400_rgb/slowonly_imagenet_pretrained_r50_8x8x1_150e_kinetics400_rgb_20200912.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_8x8x1_150e_kinetics400_rgb/slowonly_imagenet_pretrained_r50_8x8x1_150e_kinetics400_rgb_20200912-3f9ce182.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_nl_embedded_gaussian_r50_4x16x1_150e_kinetics400_rgb/20210305_152630.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_nl_embedded_gaussian_r50_4x16x1_150e_kinetics400_rgb/20210305_152630.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_nl_embedded_gaussian_r50_4x16x1_150e_kinetics400_rgb/slowonly_nl_embedded_gaussian_r50_4x16x1_150e_kinetics400_rgb_20210308-0d6e5a69.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_nl_embedded_gaussian_r50_8x8x1_150e_kinetics400_rgb/20210308_212250.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_nl_embedded_gaussian_r50_8x8x1_150e_kinetics400_rgb/20210308_212250.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_nl_embedded_gaussian_r50_8x8x1_150e_kinetics400_rgb/slowonly_nl_embedded_gaussian_r50_8x8x1_150e_kinetics400_rgb_20210308-e8dd9e82.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_4x16x1_256e_kinetics400_flow/slowonly_r50_4x16x1_256e_kinetics400_flow_61.8_83.6.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_4x16x1_256e_kinetics400_flow/slowonly_r50_4x16x1_256e_kinetics400_flow_61.8_83.6.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | 
C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_4x16x1_256e_kinetics400_flow/slowonly_r50_4x16x1_256e_kinetics400_flow_20200704-decb8568.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_8x8x1_256e_kinetics400_flow/slowonly_r50_8x8x1_196e_kinetics400_flow_65.8_86.3.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_8x8x1_256e_kinetics400_flow/slowonly_r50_8x8x1_196e_kinetics400_flow_65.8_86.3.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_8x8x1_256e_kinetics400_flow/slowonly_r50_8x8x1_256e_kinetics400_flow_20200704-6b384243.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_video_8x8x1_256e_kinetics600_rgb/slowonly_r50_video_8x8x1_256e_kinetics600_rgb_20201015.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_video_8x8x1_256e_kinetics600_rgb/slowonly_r50_video_8x8x1_256e_kinetics600_rgb_20201015.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_video_8x8x1_256e_kinetics600_rgb/slowonly_r50_video_8x8x1_256e_kinetics600_rgb_20201015-81e5153e.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_video_8x8x1_256e_kinetics700_rgb/slowonly_r50_video_8x8x1_256e_kinetics700_rgb_20201015.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_video_8x8x1_256e_kinetics700_rgb/slowonly_r50_video_8x8x1_256e_kinetics700_rgb_20201015.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_video_8x8x1_256e_kinetics700_rgb/slowonly_r50_video_8x8x1_256e_kinetics700_rgb_20201015-9250f662.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_4x16x1_120e_gym99_rgb/slowonly_imagenet_pretrained_r50_4x16x1_120e_gym99_rgb_20201111.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | 
C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_4x16x1_120e_gym99_rgb/slowonly_imagenet_pretrained_r50_4x16x1_120e_gym99_rgb_20201111.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_4x16x1_120e_gym99_rgb/slowonly_imagenet_pretrained_r50_4x16x1_120e_gym99_rgb_20201111-a9c34b54.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_8x8x1_64e_jester_rgb/slowonly_imagenet_pretrained_r50_8x8x1_64e_jester_rgb.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_8x8x1_64e_jester_rgb/slowonly_imagenet_pretrained_r50_8x8x1_64e_jester_rgb.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_8x8x1_64e_jester_rgb/slowonly_imagenet_pretrained_r50_8x8x1_64e_jester_rgb-b56a5389.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_8x4x1_64e_hmdb51_rgb/20210605_185256.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_8x4x1_64e_hmdb51_rgb/20210605_185256.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_8x4x1_64e_hmdb51_rgb/slowonly_imagenet_pretrained_r50_8x4x1_64e_hmdb51_rgb_20210630-16faeb6a.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_k400_pretrained_r50_8x4x1_40e_hmdb51_rgb/20210606_010153.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_k400_pretrained_r50_8x4x1_40e_hmdb51_rgb/20210606_010153.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_k400_pretrained_r50_8x4x1_40e_hmdb51_rgb/slowonly_k400_pretrained_r50_8x4x1_40e_hmdb51_rgb_20210630-cee5f725.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/metafile.yml | 
https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_8x4x1_64e_ucf101_rgb/20210605_213503.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_8x4x1_64e_ucf101_rgb/20210605_213503.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_8x4x1_64e_ucf101_rgb/slowonly_imagenet_pretrained_r50_8x4x1_64e_ucf101_rgb_20210630-181e1661.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_k400_pretrained_r50_8x4x1_40e_ucf101_rgb/20210606_010231.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_k400_pretrained_r50_8x4x1_40e_ucf101_rgb/20210606_010231.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_k400_pretrained_r50_8x4x1_40e_ucf101_rgb/slowonly_k400_pretrained_r50_8x4x1_40e_ucf101_rgb_20210630-ee8c850f.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_8x4x1_64e_sthv1_rgb/20210605_235410.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_8x4x1_64e_sthv1_rgb/20210605_235410.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_8x4x1_64e_sthv1_rgb/slowonly_imagenet_pretrained_r50_8x4x1_64e_sthv1_rgb_20210630-807a9a9a.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tanet/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tanet/tanet_r50_dense_1x1x8_100e_kinetics400_rgb/tanet_r50_dense_1x1x8_100e_kinetics400_rgb_20210219.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tanet/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tanet/tanet_r50_dense_1x1x8_100e_kinetics400_rgb/tanet_r50_dense_1x1x8_100e_kinetics400_rgb_20210219.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tanet/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tanet/tanet_r50_dense_1x1x8_100e_kinetics400_rgb/tanet_r50_dense_1x1x8_100e_kinetics400_rgb_20210219-032c8e94.pth | 源码模型训练权重文件 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tanet/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tanet/tanet_r50_1x1x8_50e_sthv1_rgb/20210606_205006.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tanet/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tanet/tanet_r50_1x1x8_50e_sthv1_rgb/20210606_205006.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tanet/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tanet/tanet_r50_1x1x8_50e_sthv1_rgb/tanet_r50_1x1x8_50e_sthv1_rgb_20210630-f4a48609.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tanet/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tanet/tanet_r50_1x1x8_50e_sthv1_rgb/tanet_r50_1x1x8_50e_sthv1_rgb_20210630-f4a48609.pth | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tanet/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tanet/tanet_r50_1x1x16_50e_sthv1_rgb/20210607_155335.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tanet/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tanet/tanet_r50_1x1x16_50e_sthv1_rgb/tanet_r50_1x1x16_50e_sthv1_rgb_20210630-7c19303c.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tin/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tin/tin_r50_1x1x8_40e_sthv1_rgb/20200729_034132.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tin/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tin/tin_r50_1x1x8_40e_sthv1_rgb/20200729_034132.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tin/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tin/tin_r50_1x1x8_40e_sthv1_rgb/20200729_034132.log | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tin/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tin/tin_r50_1x1x8_40e_sthv2_rgb/20200912_225451.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tin/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tin/tin_r50_1x1x8_40e_sthv2_rgb/20200912_225451.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tin/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tin/tin_r50_1x1x8_40e_sthv2_rgb/20200912_225451.log.json | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tin/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tin/tin_tsm_finetune_r50_1x1x8_50e_kinetics400_rgb/20200809_142447.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | 
C3D/configs/recognition/tin/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tin/tin_tsm_finetune_r50_1x1x8_50e_kinetics400_rgb/20200809_142447.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tin/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tin/tin_tsm_finetune_r50_1x1x8_50e_kinetics400_rgb/tin_tsm_finetune_r50_1x1x8_50e_kinetics400_rgb_20200810-4a146a70.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tpn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tpn/tpn_slowonly_r50_8x8x1_150e_kinetics_rgb/20200910_134330.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tpn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tpn/tpn_slowonly_r50_8x8x1_150e_kinetics_rgb/20200910_134330.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tpn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tpn/tpn_slowonly_r50_8x8x1_150e_kinetics_rgb/tpn_slowonly_r50_8x8x1_150e_kinetics_rgb_20200910-b796d7a0.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tpn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tpn/tpn_imagenet_pretrained_slowonly_r50_8x8x1_150e_kinetics_rgb/20200923_151919.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tpn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tpn/tpn_imagenet_pretrained_slowonly_r50_8x8x1_150e_kinetics_rgb/20200923_151919.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tpn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tpn/tpn_imagenet_pretrained_slowonly_r50_8x8x1_150e_kinetics_rgb/tpn_imagenet_pretrained_slowonly_r50_8x8x1_150e_kinetics_rgb_20200923-52629684.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tpn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tpn/tpn_tsm_r50_1x1x8_150e_sthv1_rgb/20210311_162636.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tpn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tpn/tpn_tsm_r50_1x1x8_150e_sthv1_rgb/20210311_162636.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tpn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tpn/tpn_tsm_r50_1x1x8_150e_sthv1_rgb/20210311_162636.log.json | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/trn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/trn/trn_r50_1x1x8_50e_sthv1_rgb/20210326_103948.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/trn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/trn/trn_r50_1x1x8_50e_sthv1_rgb/20210326_103948.log.json | 源码模型训练日志 | 
-| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/trn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/trn/trn_r50_1x1x8_50e_sthv1_rgb/20210326_103948.log.json | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/trn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/trn/trn_r50_1x1x8_50e_sthv1_rgb/20210326_103948.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/trn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/trn/trn_r50_1x1x8_50e_sthv1_rgb/20210326_103948.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/trn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/trn/trn_r50_1x1x8_50e_sthv2_rgb/trn_r50_1x1x8_50e_sthv2_rgb_20210401-773eca7b.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_50e_kinetics400_rgb/20200607_211800.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_50e_kinetics400_rgb/20200607_211800.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_50e_kinetics400_rgb/tsm_r50_1x1x8_50e_kinetics400_rgb_20200607-af7fb746.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_256p_1x1x8_50e_kinetics400_rgb/20200725_031623.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_256p_1x1x8_50e_kinetics400_rgb/20200725_031623.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_256p_1x1x8_50e_kinetics400_rgb/tsm_r50_256p_1x1x8_50e_kinetics400_rgb_20200726-020785e2.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_50e_kinetics400_rgb/20210616_021451.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_50e_kinetics400_rgb/20210616_021451.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_50e_kinetics400_rgb/tsm_r50_1x1x8_50e_kinetics400_rgb_20210701-68d582b4.pth | 源码模型训练权重文件 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_100e_kinetics400_rgb/20210617_103543.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_100e_kinetics400_rgb/20210617_103543.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_100e_kinetics400_rgb/20210617_103543.log | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_gpu_normalize_1x1x8_50e_kinetics400_rgb/tsm_r50_gpu_normalize_1x1x8_50e_kinetics400_rgb_20210219.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_gpu_normalize_1x1x8_50e_kinetics400_rgb/tsm_r50_gpu_normalize_1x1x8_50e_kinetics400_rgb_20210219.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_gpu_normalize_1x1x8_50e_kinetics400_rgb/tsm_r50_gpu_normalize_1x1x8_50e_kinetics400_rgb_20210219-bf96e6cc.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_video_1x1x8_100e_kinetics400_rgb/tsm_r50_video_2d_1x1x8_50e_kinetics400_rgb.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_video_1x1x8_100e_kinetics400_rgb/tsm_r50_video_2d_1x1x8_50e_kinetics400_rgb.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_video_1x1x8_100e_kinetics400_rgb/tsm_r50_video_1x1x8_100e_kinetics400_rgb_20200702-a77f4328.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_dense_1x1x8_50e_kinetics400_rgb/20210617_103245.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_dense_1x1x8_50e_kinetics400_rgb/20210617_103245.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_dense_1x1x8_50e_kinetics400_rgb/tsm_r50_dense_1x1x8_50e_kinetics400_rgb_20210701-a54ff3d3.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | 
C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_dense_1x1x8_100e_kinetics400_rgb/20210613_034931.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_dense_1x1x8_100e_kinetics400_rgb/20210613_034931.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_dense_1x1x8_100e_kinetics400_rgb/tsm_r50_dense_1x1x8_100e_kinetics400_rgb_20210701-e3e5e97f.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_256p_1x1x16_50e_kinetics400_rgb/20201010_224825.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_256p_1x1x16_50e_kinetics400_rgb/20201010_224825.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_256p_1x1x16_50e_kinetics400_rgb/tsm_r50_256p_1x1x16_50e_kinetics400_rgb_20201010-85645c2a.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x16_50e_kinetics400_rgb/20210621_115844.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x16_50e_kinetics400_rgb/20210621_115844.log | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x16_50e_kinetics400_rgb/tsm_r50_1x1x16_50e_kinetics400_rgb_20210701-7c0c5d54.pth | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_nl_embedded_gaussian_r50_1x1x8_50e_kinetics400_rgb/20200724_120023.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_nl_embedded_gaussian_r50_1x1x8_50e_kinetics400_rgb/20200724_120023.log | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_nl_embedded_gaussian_r50_1x1x8_50e_kinetics400_rgb/tsm_nl_embedded_gaussian_r50_1x1x8_50e_kinetics400_rgb_20200724-f00f1336.pth | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_nl_gaussian_r50_1x1x8_50e_kinetics400_rgb/20200815_210253.log.jso | 
源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_nl_gaussian_r50_1x1x8_50e_kinetics400_rgb/20200815_210253.log | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_nl_gaussian_r50_1x1x8_50e_kinetics400_rgb/tsm_nl_gaussian_r50_1x1x8_50e_kinetics400_rgb_20200816-b93fd297.pth | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_nl_dot_product_r50_1x1x8_50e_kinetics400_rgb/20200723_220442.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_nl_dot_product_r50_1x1x8_50e_kinetics400_rgb/20200723_220442.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_nl_dot_product_r50_1x1x8_50e_kinetics400_rgb/tsm_nl_dot_product_r50_1x1x8_50e_kinetics400_rgb_20200724-d8ad84d2.pth | 源码模型权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_mobilenetv2_dense_1x1x8_100e_kinetics400_rgb/20210129_024936.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_mobilenetv2_dense_1x1x8_100e_kinetics400_rgb/20210129_024936.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_mobilenetv2_dense_1x1x8_100e_kinetics400_rgb/tsm_mobilenetv2_dense_320p_1x1x8_100e_kinetics400_rgb_20210202-61135809.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_video_1x1x8_50e_diving48_rgb/20210426_012424.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_video_1x1x8_50e_diving48_rgb/20210426_012424.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_video_1x1x8_50e_diving48_rgb/tsm_r50_video_1x1x8_50e_diving48_rgb_20210426-aba5aa3d.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_video_1x1x16_50e_diving48_rgb/20210426_012823.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | 
C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_video_1x1x16_50e_diving48_rgb/20210426_012823.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_video_1x1x16_50e_diving48_rgb/tsm_r50_video_1x1x16_50e_diving48_rgb_20210426-aa9631c0.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_50e_sthv1_rgb/20210203_150227.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_50e_sthv1_rgb/20210203_150227.log| 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_50e_sthv1_rgb/tsm_r50_1x1x8_50e_sthv1_rgb_20210203-01dce462.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://github.com/mit-han-lab/temporal-shift-module/tree/8d53d6fda40bea2f1b37a6095279c4b454d672bd#training | 源码模型训练精度参考 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://github.com/mit-han-lab/temporal-shift-module/tree/8d53d6fda40bea2f1b37a6095279c4b454d672bd#training | 源码模型训练精度参考 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_flip_1x1x8_50e_sthv1_rgb/20210203_145829.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_flip_1x1x8_50e_sthv1_rgb/20210203_145829.log | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_flip_1x1x8_50e_sthv1_rgb/tsm_r50_flip_1x1x8_50e_sthv1_rgb_20210203-12596f16.pth | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://github.com/mit-han-lab/temporal-shift-module/tree/8d53d6fda40bea2f1b37a6095279c4b454d672bd#training | 源码模型训练精度参考 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://github.com/mit-han-lab/temporal-shift-module/tree/8d53d6fda40bea2f1b37a6095279c4b454d672bd#training | 源码模型训练精度参考 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_randaugment_1x1x8_50e_sthv1_rgb/tsm_r50_randaugment_1x1x8_50e_sthv1_rgb.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | 
https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_randaugment_1x1x8_50e_sthv1_rgb/tsm_r50_randaugment_1x1x8_50e_sthv1_rgb.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_randaugment_1x1x8_50e_sthv1_rgb/tsm_r50_randaugment_1x1x8_50e_sthv1_rgb_20210324-481268d9.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://github.com/mit-han-lab/temporal-shift-module/tree/8d53d6fda40bea2f1b37a6095279c4b454d672bd#training | 源码模型训练精度参考 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://github.com/mit-han-lab/temporal-shift-module/tree/8d53d6fda40bea2f1b37a6095279c4b454d672bd#training | 源码模型训练精度参考 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_flip_randaugment_1x1x8_50e_sthv1_rgb/tsm_r50_flip_randaugment_1x1x8_50e_sthv1_rgb.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_flip_randaugment_1x1x8_50e_sthv1_rgb/tsm_r50_flip_randaugment_1x1x8_50e_sthv1_rgb.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_flip_randaugment_1x1x8_50e_sthv1_rgb/tsm_r50_flip_randaugment_1x1x8_50e_sthv1_rgb_20210324-76937692.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://github.com/mit-han-lab/temporal-shift-module/tree/8d53d6fda40bea2f1b37a6095279c4b454d672bd#training | 源码模型训练精度参考 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://github.com/mit-han-lab/temporal-shift-module/tree/8d53d6fda40bea2f1b37a6095279c4b454d672bd#training | 源码模型训练精度参考 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x16_50e_sthv1_rgb/20201010_221240.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x16_50e_sthv1_rgb/20201010_221240.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x16_50e_sthv1_rgb/tsm_r50_1x1x16_50e_sthv1_rgb_20201010-17fa49f6.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://github.com/mit-han-lab/temporal-shift-module/tree/8d53d6fda40bea2f1b37a6095279c4b454d672bd#training | 源码模型训练精度参考 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | 
C3D/configs/recognition/tsm/metafile.yml | https://github.com/mit-han-lab/temporal-shift-module/tree/8d53d6fda40bea2f1b37a6095279c4b454d672bd#training | 源码模型训练精度参考 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r101_1x1x8_50e_sthv1_rgb/20201010_224055.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r101_1x1x8_50e_sthv1_rgb/20201010_224055.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r101_1x1x8_50e_sthv1_rgb/tsm_r101_1x1x8_50e_sthv1_rgb_20201010-43fedf2e.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://github.com/mit-han-lab/temporal-shift-module/tree/8d53d6fda40bea2f1b37a6095279c4b454d672bd#training | 源码模型训练精度参考 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://github.com/mit-han-lab/temporal-shift-module/tree/8d53d6fda40bea2f1b37a6095279c4b454d672bd#training | 源码模型训练精度参考 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_50e_sthv2_rgb/20200912_140737.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_50e_sthv2_rgb/20200912_140737.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_50e_sthv2_rgb/tsm_r50_1x1x8_50e_sthv2_rgb_20200912-033c4ac6.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://github.com/mit-han-lab/temporal-shift-module/tree/8d53d6fda40bea2f1b37a6095279c4b454d672bd#training | 源码模型训练精度参考 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://github.com/mit-han-lab/temporal-shift-module/tree/8d53d6fda40bea2f1b37a6095279c4b454d672bd#training | 源码模型训练精度参考 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_50e_sthv2_rgb/20210401_143656.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_50e_sthv2_rgb/20210401_143656.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_50e_sthv2_rgb/tsm_r50_256h_1x1x8_50e_sthv2_rgb_20210401-df97f3e1.pth | 
源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://github.com/mit-han-lab/temporal-shift-module/tree/8d53d6fda40bea2f1b37a6095279c4b454d672bd#training | 源码模型训练精度参考 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://github.com/mit-han-lab/temporal-shift-module/tree/8d53d6fda40bea2f1b37a6095279c4b454d672bd#training | 源码模型训练精度参考 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x16_50e_sthv2_rgb/20201010_224215.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x16_50e_sthv2_rgb/20201010_224215.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x16_50e_sthv2_rgb/tsm_r50_1x1x16_50e_sthv2_rgb_20201010-16469c6f.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://github.com/mit-han-lab/temporal-shift-module/tree/8d53d6fda40bea2f1b37a6095279c4b454d672bd#training | 源码模型训练精度参考 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://github.com/mit-han-lab/temporal-shift-module/tree/8d53d6fda40bea2f1b37a6095279c4b454d672bd#training | 源码模型训练精度参考 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x16_50e_sthv2_rgb/20210331_134458.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x16_50e_sthv2_rgb/20210331_134458.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x16_50e_sthv2_rgb/tsm_r50_256h_1x1x16_50e_sthv2_rgb_20210331-0a45549c.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://github.com/mit-han-lab/temporal-shift-module/tree/8d53d6fda40bea2f1b37a6095279c4b454d672bd#training | 源码模型训练精度参考 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://github.com/mit-han-lab/temporal-shift-module/tree/8d53d6fda40bea2f1b37a6095279c4b454d672bd#training | 源码模型训练精度参考 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r101_1x1x8_50e_sthv2_rgb/20201010_224100.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | 
https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r101_1x1x8_50e_sthv2_rgb/20201010_224100.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r101_1x1x8_50e_sthv2_rgb/tsm_r101_1x1x8_50e_sthv2_rgb_20201010-98cdedb8.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://github.com/mit-han-lab/temporal-shift-module/tree/8d53d6fda40bea2f1b37a6095279c4b454d672bd#training | 源码模型训练精度参考 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://github.com/mit-han-lab/temporal-shift-module/tree/8d53d6fda40bea2f1b37a6095279c4b454d672bd#training | 源码模型训练精度参考 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_mixup_1x1x8_50e_sthv1_rgb/tsm_r50_mixup_1x1x8_50e_sthv1_rgb.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_mixup_1x1x8_50e_sthv1_rgb/tsm_r50_mixup_1x1x8_50e_sthv1_rgb.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_mixup_1x1x8_50e_sthv1_rgb/tsm_r50_mixup_1x1x8_50e_sthv1_rgb-9eca48e5.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_cutmix_1x1x8_50e_sthv1_rgb/tsm_r50_cutmix_1x1x8_50e_sthv1_rgb.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_cutmix_1x1x8_50e_sthv1_rgb/tsm_r50_cutmix_1x1x8_50e_sthv1_rgb.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_cutmix_1x1x8_50e_sthv1_rgb/tsm_r50_cutmix_1x1x8_50e_sthv1_rgb-34934615.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_50e_jester_rgb/tsm_r50_1x1x8_50e_jester_rgb.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_50e_jester_rgb/tsm_r50_1x1x8_50e_jester_rgb.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_50e_jester_rgb/tsm_r50_1x1x8_50e_jester_rgb-c799267e.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | 
https://download.openmmlab.com/mmaction/recognition/tsm/tsm_k400_pretrained_r50_1x1x8_25e_hmdb51_rgb/20210605_182554.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_k400_pretrained_r50_1x1x8_25e_hmdb51_rgb/20210605_182554.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_k400_pretrained_r50_1x1x8_25e_hmdb51_rgb/tsm_k400_pretrained_r50_1x1x8_25e_hmdb51_rgb_20210630-10c74ee5.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_k400_pretrained_r50_1x1x16_25e_hmdb51_rgb/20210605_182505.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_k400_pretrained_r50_1x1x16_25e_hmdb51_rgb/20210605_182505.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_k400_pretrained_r50_1x1x16_25e_hmdb51_rgb/tsm_k400_pretrained_r50_1x1x16_25e_hmdb51_rgb_20210630-4785548e.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_k400_pretrained_r50_1x1x8_25e_ucf101_rgb/20210605_182720.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_k400_pretrained_r50_1x1x8_25e_ucf101_rgb/20210605_182720.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_k400_pretrained_r50_1x1x8_25e_ucf101_rgb/tsm_k400_pretrained_r50_1x1x8_25e_ucf101_rgb_20210630-1fae312b.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_k400_pretrained_r50_1x1x16_25e_ucf101_rgb/20210605_182720.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_k400_pretrained_r50_1x1x16_25e_ucf101_rgb/20210605_182720.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_k400_pretrained_r50_1x1x16_25e_ucf101_rgb/tsm_k400_pretrained_r50_1x1x16_25e_ucf101_rgb_20210630-8df9c358.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | 
https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x3_75e_ucf101_rgb/tsn_r50_1x1x3_75e_ucf101_rgb_20201023.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x3_75e_ucf101_rgb/tsn_r50_1x1x3_75e_ucf101_rgb_20201023.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x3_75e_ucf101_rgb/tsn_r50_1x1x3_75e_ucf101_rgb_20201023-d85ab600.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_1x1x8_100e_diving48_rgb/20210426_014138.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_1x1x8_100e_diving48_rgb/20210426_014138.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_1x1x8_100e_diving48_rgb/tsn_r50_video_1x1x8_100e_diving48_rgb_20210426-6dde0185.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_1x1x16_100e_diving48_rgb/20210426_014103.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_1x1x16_100e_diving48_rgb/20210426_014103.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_1x1x16_100e_diving48_rgb/tsn_r50_video_1x1x16_100e_diving48_rgb_20210426-63c5f2f7.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x8_50e_hmdb51_imagenet_rgb/20201025_231108.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x8_50e_hmdb51_imagenet_rgb/20201025_231108.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x8_50e_hmdb51_imagenet_rgb/tsn_r50_1x1x8_50e_hmdb51_imagenet_rgb_20201123-ce6c27ed.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x8_50e_hmdb51_kinetics400_rgb/20201108_190805.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | 
C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x8_50e_hmdb51_kinetics400_rgb/20201108_190805.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x8_50e_hmdb51_kinetics400_rgb/tsn_r50_1x1x8_50e_hmdb51_kinetics400_rgb_20201123-7f84701b.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x8_50e_hmdb51_mit_rgb/20201112_170135.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x8_50e_hmdb51_mit_rgb/20201112_170135.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x8_50e_hmdb51_mit_rgb/tsn_r50_1x1x8_50e_hmdb51_mit_rgb_20201123-01526d41.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x3_100e_kinetics400_rgb/20200614_063526.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x3_100e_kinetics400_rgb/20200614_063526.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x3_100e_kinetics400_rgb/tsn_r50_1x1x3_100e_kinetics400_rgb_20200614-e508be42.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_256p_1x1x3_100e_kinetics400_rgb/20200725_031325.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_256p_1x1x3_100e_kinetics400_rgb/20200725_031325.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_256p_1x1x3_100e_kinetics400_rgb/tsn_r50_256p_1x1x3_100e_kinetics400_rgb_20200725-22592236.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_dense_1x1x5_100e_kinetics400_rgb/20200627_105310.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_dense_1x1x5_100e_kinetics400_rgb/20200627_105310.log | 源码模型训练日志 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_dense_1x1x5_100e_kinetics400_rgb/tsn_r50_dense_1x1x5_100e_kinetics400_rgb_20200627-a063165f.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x3_100e_kinetics400_rgb/tsn_r50_f3_kinetics400_shortedge_70.9_89.5.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x3_100e_kinetics400_rgb/tsn_r50_f3_kinetics400_shortedge_70.9_89.5.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x3_100e_kinetics400_rgb/tsn_r50_320p_1x1x3_100e_kinetics400_rgb_20200702-cc665e2a.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x3_110e_kinetics400_flow/tsn_r50_f3_kinetics400_flow_shortedge_55.7_79.9.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x3_110e_kinetics400_flow/tsn_r50_f3_kinetics400_flow_shortedge_55.7_79.9.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x3_110e_kinetics400_flow/tsn_r50_320p_1x1x3_110e_kinetics400_flow_20200705-3036bab6.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_256p_1x1x8_100e_kinetics400_rgb/20200815_173413.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_256p_1x1x8_100e_kinetics400_rgb/20200815_173413.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_256p_1x1x8_100e_kinetics400_rgb/tsn_r50_256p_1x1x8_100e_kinetics400_rgb_20200817-883baf16.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x8_100e_kinetics400_rgb/tsn_r50_f8_kinetics400_shortedge_72.4_90.6.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x8_100e_kinetics400_rgb/tsn_r50_f8_kinetics400_shortedge_72.4_90.6.log | 源码模型训练日志 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x8_100e_kinetics400_rgb/tsn_r50_320p_1x1x8_100e_kinetics400_rgb_20200702-ef80e3d7.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x8_110e_kinetics400_flow/tsn_r50_f8_kinetics400_flow_shortedge_57.8_81.0.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x8_110e_kinetics400_flow/tsn_r50_f8_kinetics400_flow_shortedge_57.8_81.0.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x8_110e_kinetics400_flow/tsn_r50_320p_1x1x8_110e_kinetics400_flow_20200705-1f39486b.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_320p_1x1x3_100e_kinetics400_rgb/tsn_r50_video_320p_1x1x3_100e_kinetics400_rgb_20201014.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_320p_1x1x3_100e_kinetics400_rgb/tsn_r50_video_320p_1x1x3_100e_kinetics400_rgb_20201014.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_320p_1x1x3_100e_kinetics400_rgb/tsn_r50_video_320p_1x1x3_100e_kinetics400_rgb_20201014-5ae1ee79.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_dense_1x1x8_100e_kinetics400_rgb/20200606_003901.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_dense_1x1x8_100e_kinetics400_rgb/20200606_003901.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_dense_1x1x8_100e_kinetics400_rgb/tsn_r50_dense_1x1x8_100e_kinetics400_rgb_20200606-e925e6e3.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_1x1x8_100e_kinetics400_rgb/tsn_r50_video_2d_1x1x8_100e_kinetics400_rgb.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_1x1x8_100e_kinetics400_rgb/tsn_r50_video_2d_1x1x8_100e_kinetics400_rgb.log | 源码模型训练日志 
| -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_1x1x8_100e_kinetics400_rgb/tsn_r50_video_1x1x8_100e_kinetics400_rgb_20200702-568cde33.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_dense_1x1x8_100e_kinetics400_rgb/tsn_r50_video_2d_1x1x8_dense_100e_kinetics400_rgb.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_dense_1x1x8_100e_kinetics400_rgb/tsn_r50_video_2d_1x1x8_dense_100e_kinetics400_rgb.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_dense_1x1x8_100e_kinetics400_rgb/tsn_r50_video_dense_1x1x8_100e_kinetics400_rgb_20200703-0f19175f.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/custom_backbones/tsn_rn101_32x4d_320p_1x1x3_100e_kinetics400_rgb.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/custom_backbones/tsn_rn101_32x4d_320p_1x1x3_100e_kinetics400_rgb.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/custom_backbones/tsn_rn101_32x4d_320p_1x1x3_100e_kinetics400_rgb-16a8b561.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/custom_backbones/tsn_dense161_320p_1x1x3_100e_kinetics400_rgb/tsn_dense161_320p_1x1x3_100e_kinetics400_rgb.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/custom_backbones/tsn_dense161_320p_1x1x3_100e_kinetics400_rgb/tsn_dense161_320p_1x1x3_100e_kinetics400_rgb.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/custom_backbones/tsn_dense161_320p_1x1x3_100e_kinetics400_rgb/tsn_dense161_320p_1x1x3_100e_kinetics400_rgb-cbe85332.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/omni/tsn_imagenet_pretrained_r50_omni_1x1x3_kinetics400_rgb_20200926-54192355.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | 
https://download.openmmlab.com/mmaction/recognition/tsn/omni/tsn_1G1B_pretrained_r50_without_omni_1x1x3_kinetics400_rgb_20200926-c133dd49.pth | 源码模型训练权重文件| -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/omni/tsn_1G1B_pretrained_r50_omni_1x1x3_kinetics400_rgb_20200926-2863fed0.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_1x1x8_100e_kinetics600_rgb/tsn_r50_video_1x1x8_100e_kinetics600_rgb_20201015.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_1x1x8_100e_kinetics600_rgb/tsn_r50_video_1x1x8_100e_kinetics600_rgb_20201015.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_1x1x8_100e_kinetics600_rgb/tsn_r50_video_1x1x8_100e_kinetics600_rgb_20201015-4db3c461.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_1x1x8_100e_kinetics700_rgb/tsn_r50_video_1x1x8_100e_kinetics700_rgb_20201015.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_1x1x8_100e_kinetics700_rgb/tsn_r50_video_1x1x8_100e_kinetics700_rgb_20201015.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_1x1x8_100e_kinetics700_rgb/tsn_r50_video_1x1x8_100e_kinetics700_rgb_20201015-e381a6c7.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x8_50e_sthv1_rgb/tsn_r50_f8_sthv1_18.1_45.0.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x8_50e_sthv1_rgb/tsn_sthv1.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x8_50e_sthv1_rgb/tsn_r50_1x1x8_50e_sthv1_rgb_20200618-061b9195.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x16_50e_sthv1_rgb/20200614_211932.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x16_50e_sthv1_rgb/20200614_211932.log | 源码模型训练日志 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x16_50e_sthv1_rgb/tsn_r50_1x1x16_50e_sthv1_rgb_20200614-7e2fe4f1.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x8_50e_sthv2_rgb/20200915_114139.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x8_50e_sthv2_rgb/20200915_114139.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x8_50e_sthv2_rgb/tsn_r50_1x1x8_50e_sthv2_rgb_20200915-f3b381a5.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x16_50e_sthv2_rgb/20200917_105855.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x16_50e_sthv2_rgb/20200917_105855.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x16_50e_sthv2_rgb/tsn_r50_1x1x16_50e_sthv2_rgb_20200917-80bc3611.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x6_100e_mit_rgb/tsn_r50_f6_mit_26.8_51.6.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x6_100e_mit_rgb/tsn_mit.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x6_100e_mit_rgb/tsn_r50_1x1x6_100e_mit_rgb_20200618-d512ab1b.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r101_1x1x5_50e_mmit_rgb/tsn_r101_f6_mmit_61.1.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r101_1x1x5_50e_mmit_rgb/tsn_mmit.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r101_1x1x5_50e_mmit_rgb/tsn_r101_1x1x5_50e_mmit_rgb_20200618-642f450d.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | 
https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x8_50e_activitynet_video_rgb/20210228_223327.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x8_50e_activitynet_video_rgb/20210228_223327.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x8_50e_activitynet_video_rgb/tsn_r50_320p_1x1x8_50e_activitynet_video_rgb_20210301-7f8da0c6.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x8_50e_activitynet_clip_rgb/20210217_181313.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x8_50e_activitynet_clip_rgb/20210217_181313.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x8_50e_activitynet_clip_rgb/tsn_r50_320p_1x1x8_50e_activitynet_clip_rgb_20210301-c0f04a7e.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x8_150e_activitynet_video_flow/tsn_r50_320p_1x1x8_150e_activitynet_video_flow_20200804.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x8_150e_activitynet_video_flow/tsn_r50_320p_1x1x8_150e_activitynet_video_flow_20200804.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x8_150e_activitynet_video_flow/tsn_r50_320p_1x1x8_150e_activitynet_video_flow_20200804-13313f52.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x8_150e_activitynet_clip_flow/tsn_r50_320p_1x1x8_150e_activitynet_clip_flow_20200804.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x8_150e_activitynet_clip_flow/tsn_r50_320p_1x1x8_150e_activitynet_clip_flow_20200804.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x8_150e_activitynet_clip_flow/tsn_r50_320p_1x1x8_150e_activitynet_clip_flow_20200804-8622cf38.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | 
C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/hvu/action/tsn_r18_1x1x8_100e_hvu_action_rgb_20201027.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/hvu/action/tsn_r18_1x1x8_100e_hvu_action_rgb_20201027.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/hvu/action/tsn_r18_1x1x8_100e_hvu_action_rgb_20201027-011b282b.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/hvu/scene/tsn_r18_1x1x8_100e_hvu_scene_rgb_20201027.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/hvu/scene/tsn_r18_1x1x8_100e_hvu_scene_rgb_20201027.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/hvu/scene/tsn_r18_1x1x8_100e_hvu_scene_rgb_20201027-00e5748d.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/hvu/object/tsn_r18_1x1x8_100e_hvu_object_rgb_20201027.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/hvu/object/tsn_r18_1x1x8_100e_hvu_object_rgb_20201027.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/hvu/object/tsn_r18_1x1x8_100e_hvu_object_rgb_20201102-24a22f30.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/hvu/concept/tsn_r18_1x1x8_100e_hvu_concept_rgb_20201027.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/hvu/concept/tsn_r18_1x1x8_100e_hvu_concept_rgb_20201027.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/hvu/concept/tsn_r18_1x1x8_100e_hvu_concept_rgb_20201027-fc1dd8e3.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/hvu/attribute/tsn_r18_1x1x8_100e_hvu_attribute_rgb_20201027.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | 
https://download.openmmlab.com/mmaction/recognition/tsn/hvu/attribute/tsn_r18_1x1x8_100e_hvu_attribute_rgb_20201027.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/hvu/attribute/tsn_r18_1x1x8_100e_hvu_attribute_rgb_20201027-0b3b49d2.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/custom_backbones/tsn_swin_transformer_video_320p_1x1x3_100e_kinetics400_rgb/tsn_swin_transformer_video_320p_1x1x3_100e_kinetics400_rgb.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/custom_backbones/tsn_swin_transformer_video_320p_1x1x3_100e_kinetics400_rgb/tsn_swin_transformer_video_320p_1x1x3_100e_kinetics400_rgb.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/custom_backbones/tsn_swin_transformer_video_320p_1x1x3_100e_kinetics400_rgb/tsn_swin_transformer_video_320p_1x1x3_100e_kinetics400_rgb-805380f6.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/x3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/x3d/facebook/x3d_s_facebook_13x6x1_kinetics400_rgb_20201027-623825a0.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/x3d/metafile.yml | https://github.com/facebookresearch/SlowFast/blob/master/MODEL_ZOO.md | 源码模型训练精度参考 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/x3d/metafile.yml | https://github.com/facebookresearch/SlowFast/blob/master/MODEL_ZOO.md | 源码模型训练精度参考 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/x3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/x3d/facebook/x3d_m_facebook_16x5x1_kinetics400_rgb_20201027-3f42382a.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/x3d/metafile.yml | https://github.com/facebookresearch/SlowFast/blob/master/MODEL_ZOO.md | 源码模型训练精度参考 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/x3d/metafile.yml | https://github.com/facebookresearch/SlowFast/blob/master/MODEL_ZOO.md | 源码模型训练精度参考 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition_audio/resnet/metafile.yml | https://download.openmmlab.com/mmaction/recognition/audio_recognition/tsn_r18_64x1x1_100e_kinetics400_audio_feature/20201010_144630.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition_audio/resnet/metafile.yml | https://download.openmmlab.com/mmaction/recognition/audio_recognition/tsn_r18_64x1x1_100e_kinetics400_audio_feature/20201010_144630.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | 
C3D/configs/recognition_audio/resnet/metafile.yml | https://download.openmmlab.com/mmaction/recognition/audio_recognition/tsn_r18_64x1x1_100e_kinetics400_audio_feature/tsn_r18_64x1x1_100e_kinetics400_audio_feature_20201012-bf34df6c.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/docker/Dockerfile | https://download.openmmlab.com/mmcv/dist/index.html | 三方库下载路径 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/docker/Dockerfile | https://github.com/open-mmlab/mmaction2.git /mmaction2 | 三方库下载路径 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/Dockerfile | https://download.openmmlab.com/mmcv/dist/index.html | 三方库下载路径 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/Dockerfile | https://github.com/open-mmlab/mmaction2.git /mmaction2 | 三方库下载路径 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/tools/data/activitynet/download_annotations.sh | http://ec2-52-25-205-214.us-west-2.compute.amazonaws.com/files/activity_net.v1-3.min.jso | 数据集相关文件下载 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/tools/data/activitynet/download_feature_annotations.sh | https://raw.githubusercontent.com/wzmsltw/BSN-boundary-sensitive-network/master/data/activitynet_annotations/anet_anno_action.json | 数据集相关文件下载 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/tools/data/activitynet/download_feature_annotations.sh | https://raw.githubusercontent.com/wzmsltw/BSN-boundary-sensitive-network/master/data/activitynet_annotations/video_info_new.csv | 数据集相关文件下载 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/tools/data/activitynet/download_feature_annotations.sh | https://download.openmmlab.com/mmaction/localization/anet_activity_indexes_val.txt | 数据集相关文件下载 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/tools/data/activitynet/download_features.sh | https://docs.google.com/uc?export=download&id=1ISemndlSDS2FtqQOKL0t3Cjj9yk2yznF | 数据集相关文件下载 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/tools/data/ava/download_annotations.sh | https://research.google.com/ava/download/ava_v${VERSION}.zip | 数据集相关文件下载 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/tools/data/ava/download_videos.sh | https://s3.amazonaws.com/ava-dataset/annotations/ava_file_names_trainval_v2.1.txt | 数据集相关文件下载 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/tools/data/ava/download_videos.sh | https://s3.amazonaws.com/ava-dataset/trainval | 数据集相关文件下载 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/tools/data/ava/download_videos_gnu_parallel.sh | https://s3.amazonaws.com/ava-dataset/annotations/ava_file_names_trainval_v2.1.txt | 数据集相关文件下载 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/tools/data/ava/download_videos_gnu_parallel.sh | https://s3.amazonaws.com/ava-dataset/trainval/ | 数据集相关文件下载 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/tools/data/ava/download_videos_parallel.sh | 
https://s3.amazonaws.com/ava-dataset/annotations/ava_file_names_trainval_v2.1.txt | 数据集相关文件下载 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/tools/data/ava/fetch_ava_proposals.sh | https://download.openmmlab.com/mmaction/dataset/ava/ava_dense_proposals_train.FAIR.recall_93.9.pkl | 数据集相关文件下载 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/tools/data/ava/fetch_ava_proposals.sh | https://download.openmmlab.com/mmaction/dataset/ava/ava_dense_proposals_val.FAIR.recall_93.9.pkl | 数据集相关文件下载 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/tools/data/ava/fetch_ava_proposals.sh | https://download.openmmlab.com/mmaction/dataset/ava/ava_dense_proposals_test.FAIR.recall_93.9.pkl | 数据集相关文件下载 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/tools/data/diving48/download_annotations.sh | http://www.svcl.ucsd.edu/projects/resound/Diving48_vocab.json | 数据集相关文件下载 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/tools/data/diving48/download_annotations.sh | http://www.svcl.ucsd.edu/projects/resound/Diving48_V2_train.json | 数据集相关文件下载 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/tools/data/diving48/download_annotations.sh | http://www.svcl.ucsd.edu/projects/resound/Diving48_V2_test.json | 数据集相关文件下载 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/tools/data/diving48/download_videos.sh | http://www.svcl.ucsd.edu/projects/resound/Diving48_rgb.tar.gz | 数据集相关文件下载 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/tools/data/gym/download_annotations.sh | https://sdolivia.github.io/FineGym/resources/dataset/finegym_annotation_info_v1.0.json | 数据集相关文件下载 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/tools/data/gym/download_annotations.sh | https://sdolivia.github.io/FineGym/resources/dataset/gym99_train_element_v1.0.txt -O $DATA_DIR/gym99_train_org.txt | 数据集相关文件下载 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/tools/data/gym/download_annotations.sh | https://sdolivia.github.io/FineGym/resources/dataset/gym99_val_element.txt | 数据集相关文件下载 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/tools/data/hmdb51/download_annotations.sh | http://serre-lab.clps.brown.edu/wp-content/uploads/2013/10/test_train_splits.rar | 数据集相关文件下载 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/tools/data/hmdb51/download_videos.sh | http://serre-lab.clps.brown.edu/wp-content/uploads/2013/10/hmdb51_org.rar | 数据集相关文件下载 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/tools/data/hvu/download_annotations.sh | https://github.com/holistic-video-understanding/HVU-Dataset.git | 数据集相关文件下载 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/tools/data/kinetics/download_annotations.sh | https://storage.googleapis.com/deepmind-media/Datasets | 数据集相关文件下载 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/tools/data/kinetics/download_backup_annotations.sh | https://download.openmmlab.com/mmaction/dataset/${DATASET}/annotations/kinetics_train.csv | 数据集相关文件下载 | -| 
开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/tools/data/kinetics/download_backup_annotations.sh | https://download.openmmlab.com/mmaction/dataset/${DATASET}/annotations/kinetics_val.csv | 数据集相关文件下载 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/tools/data/kinetics/download_backup_annotations.sh | https://download.openmmlab.com/mmaction/dataset/${DATASET}/annotations/kinetics_test.csv | 数据集相关文件下载 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/tools/data/skeleton/download_annotations.sh | https://download.openmmlab.com/mmaction/posec3d/${DATASET}_train.pkl | 数据集相关文件下载 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/tools/data/skeleton/download_annotations.sh | https://download.openmmlab.com/mmaction/posec3d/${DATASET}_val.pkl | 数据集相关文件下载 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/tools/data/thumos14/download_annotations.sh | http://crcv.ucf.edu/THUMOS14/Validation_set/TH14_Temporal_annotations_validation.zip | 数据集相关文件下载 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/tools/data/thumos14/download_annotations.sh | http://crcv.ucf.edu/THUMOS14/test_set/TH14_Temporal_annotations_test.zip | 数据集相关文件下载 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/tools/data/thumos14/download_videos.sh | https://storage.googleapis.com/thumos14_files/TH14_validation_set_mp4.zip | 数据集相关文件下载 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/tools/data/thumos14/download_videos.sh | https://storage.googleapis.com/thumos14_files/TH14_Test_set_mp4.zip | 数据集相关文件下载 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/tools/data/thumos14/fetch_tag_proposals.sh | https://open-mmlab.s3.ap-northeast-2.amazonaws.com/mmaction/filelist/thumos14_tag_val_normalized_proposal_list.txt | 数据集相关文件下载 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/tools/data/thumos14/fetch_tag_proposals.sh | https://open-mmlab.s3.ap-northeast-2.amazonaws.com/mmaction/filelist/thumos14_tag_test_normalized_proposal_list.txt | 数据集相关文件下载 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/tools/data/ucf101/download_annotations.sh | https://www.crcv.ucf.edu/wp-content/uploads/2019/03/UCF101TrainTestSplits-RecognitionTask.zip | 数据集相关文件下载 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/tools/data/ucf101/download_videos.sh | https://www.crcv.ucf.edu/datasets/human-actions/ucf101/UCF101.rar | 数据集相关文件下载 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/_base_/models/c3d_sports1m_pretrained.py | https://download.openmmlab.com/mmaction/recognition/c3d/c3d_sports1m_pretrain_20201016-dcc47ddc.pth | 模型权重文件下载路径 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/csn/ipcsn_ig65m_pretrained_bnfrozen_r152_32x2x1_58e_kinetics400_rgb.py | https://download.openmmlab.com/mmaction/recognition/csn/ipcsn_from_scratch_r152_ig65m_20210617-c4b99d38.pth | 模型权重文件下载路径 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | 
C3D/configs/recognition/csn/ipcsn_sports1m_pretrained_bnfrozen_r152_32x2x1_58e_kinetics400_rgb.py | https://download.openmmlab.com/mmaction/recognition/csn/ircsn_from_scratch_r152_sports1m_20210617-bcc9c0dd.pth | 模型权重文件下载路径 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/csn/ircsn_ig65m_pretrained_bnfrozen_r50_32x2x1_58e_kinetics400_rgb.py | https://download.openmmlab.com/mmaction/recognition/csn/ircsn_from_scratch_r152_ig65m_20200807-771c4135.pth | 模型权重文件下载路径 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/csn/ircsn_sports1m_pretrained_bnfrozen_r152_32x2x1_58e_kinetics400_rgb.py | https://download.openmmlab.com/mmaction/recognition/csn/ircsn_from_scratch_r50_ig65m_20210617-ce545a37.pth | 模型权重文件下载路径 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/csn/ircsn_ig65m_pretrained_r152_32x2x1_58e_kinetics400_rgb.py | https://download.openmmlab.com/mmaction/recognition/csn/ircsn_from_scratch_r152_ig65m_20200807-771c4135.pth | 模型权重文件下载路径 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/csn/ircsn_sports1m_pretrained_r152_32x2x1_58e_kinetics400_rgb.py | https://download.openmmlab.com/mmaction/recognition/csn/ircsn_from_scratch_r152_sports1m_20210617-bcc9c0dd.pth | 模型权重文件下载路径 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowfast/slowfast_r50_16x8x1_22e_sthv1_rgb.py | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_8x8x1_256e_kinetics400_rgb/slowfast_r50_8x8x1_256e_kinetics400_rgb_20200716-73547d2b.pth | 模型权重文件下载路径 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/slowonly_k400_pretrained_r50_4x16x1_120e_gym99_flow.py | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_4x16x1_256e_kinetics400_flow/ | 模型权重文件下载路径 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/slowonly/slowonly_k400_pretrained_r50_8x4x1_40e_ucf101_rgb.py | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_8x8x1_256e_kinetics400_rgb/slowonly_r50_8x8x1_256e_kinetics400_rgb_20200703-a79c555a.pth | 模型权重文件下载路径 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/timesformer/timesformer_divST_8x32x1_15e_kinetics400_rgb.py | https://download.openmmlab.com/mmaction/recognition/timesformer/vit_base_patch16_224.pth | 模型权重文件下载路径 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/timesformer/timesformer_spaceOnly_8x32x1_15e_kinetics400_rgb.py | https://download.openmmlab.com/mmaction/recognition/timesformer/vit_base_patch16_224.pth | 模型权重文件下载路径 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/timesformer/timesformer_jointST_8x32x1_15e_kinetics400_rgb.py | https://download.openmmlab.com/mmaction/recognition/timesformer/vit_base_patch16_224.pth | 模型权重文件下载路径 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tin/tin_tsm_finetune_r50_1x1x8_50e_kinetics400_rgb.py | 
https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_50e_kinetics400_rgb/tsm_r50_1x1x8_50e_kinetics400_rgb_20200607-af7fb746.pth | 模型权重文件下载路径 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/tsm_k400_pretrained_r50_1x1x16_25e_hmdb51_rgb.py | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_256p_1x1x16_50e_kinetics400_rgb/tsm_r50_256p_1x1x16_50e_kinetics400_rgb_20201010-85645c2a.pth | 模型权重文件下载路径 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/tsm_k400_pretrained_r50_1x1x16_25e_ucf101_rgb.py | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_256p_1x1x16_50e_kinetics400_rgb/tsm_r50_256p_1x1x16_50e_kinetics400_rgb_20201010-85645c2a.pth | 模型权重文件下载路径 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/tsm_k400_pretrained_r50_1x1x8_25e_hmdb51_rgb.py | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_256p_1x1x8_50e_kinetics400_rgb/tsm_r50_256p_1x1x8_50e_kinetics400_rgb_20200726-020785e2.pth | 模型权重文件下载路径 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsm/tsm_k400_pretrained_r50_1x1x8_25e_ucf101_rgb.py | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_256p_1x1x8_50e_kinetics400_rgb/tsm_r50_256p_1x1x8_50e_kinetics400_rgb_20200726-020785e2.pth | 模型权重文件下载路径 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/tsn_r50_1x1x8_50e_hmdb51_kinetics400_rgb.py | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_256p_1x1x8_100e_kinetics400_rgb/tsn_r50_256p_1x1x8_100e_kinetics400_rgb_20200817-883baf16.pth | 模型权重文件下载路径 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/tsn_r50_1x1x8_50e_hmdb51_mit_rgb.py | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x6_100e_mit_rgb/tsn_r50_1x1x6_100e_mit_rgb_20200618-d512ab1b.pth | 模型权重文件下载路径 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/tsn_r50_320p_1x1x8_150e_activitynet_clip_flow.py | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x8_110e_kinetics400_flow/tsn_r50_320p_1x1x8_110e_kinetics400_flow_20200705-1f39486b.pth | 模型权重文件下载路径 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/tsn_r50_320p_1x1x8_150e_activitynet_video_flow.py | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x8_110e_kinetics400_flow/tsn_r50_320p_1x1x8_110e_kinetics400_flow_20200705-1f39486b.pth | 模型权重文件下载路径 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/tsn_r50_320p_1x1x8_50e_activitynet_clip_rgb.py | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x8_100e_kinetics400_rgb/tsn_r50_320p_1x1x8_100e_kinetics400_rgb_20200702-ef80e3d7.pth | 模型权重文件下载路径 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/recognition/tsn/tsn_r50_320p_1x1x8_50e_activitynet_video_rgb.py | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x8_100e_kinetics400_rgb/tsn_r50_320p_1x1x8_100e_kinetics400_rgb_20200702-ef80e3d7.pth | 模型权重文件下载路径 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/tools/analysis/report_map.py | ttps://download.openmmlab.com/mmaction/localization/cuhk_anet17_pred.json | 数据集相关文件下载 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/tools/data/activitynet/download.py | https://www.youtube.com/watch?v= | 数据集相关文件下载 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/tools/data/gym/download.py | https://www.youtube.com/watch?v= | 数据集相关文件下载 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/tools/data/hvu/download.py | https://www.youtube.com/watch?v= | 数据集相关文件下载 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/tools/data/kinetics/download.py | https://www.youtube.com/watch?v= | 数据集相关文件下载 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/tools/data/ava/download_videos_parallel.py | https://s3.amazonaws.com/ava-dataset/trainval/ | 数据集相关文件下载 | -| 开发引入 | / | C3D/train.py | https://github.com/open-mmlab/mmaction2/pull/123 | 源码实现 | -| 开发引入 | / | C3D/tools/train.py | https://github.com/open-mmlab/mmaction2/pull/123 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/tools/data/kinetics/download.py | https://github.com/activitynet/ActivityNet/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/tools/data/hvu/download.py | https://github.com/activitynet/ActivityNet/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/tools/data/gym/download.py | https://github.com/activitynet/ActivityNet/blob/master/Crawler/Kinetics/download.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/tools/data/gym/download.py | https://github.com/activitynet/ActivityNet/blob/master/Crawler/Kinetics/download.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/tools/data/build_audio_features.py | https://github.com/r9y9/deepvoice3_pytorch | 相关依赖 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/tools/data/build_audio_features.py | https://pypi.org/project/lws/1.2.6/ | 相关依赖 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/tools/data/activitynet/download.py | https://github.com/activitynet/ActivityNet/blob/master/Crawler/Kinetics/download.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/tools/data/activitynet/convert_proposal_format.py | https://github.com/activitynet/ActivityNet/blob/master/Evaluation/eval_classification.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/tools/argparse.bash | https://github.com/nhoffman/argparse-bash | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/tools/analysis/report_map.py | http://activity-net.org/challenges/2017/evaluation.html | 相关参考 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/mmaction/utils/precise_bn.py | https://github.com/facebookresearch/fvcore/blob/master/fvcore/nn/precise_bn.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | 
C3D/mmaction/utils/gradcam_utils.py | https://github.com/facebookresearch/SlowFast/blob/master/slowfast/visualization/gradcam_utils.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/mmaction/utils/gradcam_utils.py | https://arxiv.org/pdf/1610.02391.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/mmaction/utils/gradcam_utils.py | https://matplotlib.org/3.3.0/tutorials/colors/colormaps.html | 相关参考 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/mmaction/models/necks/tpn.py | https://arxiv.org/pdf/2004.03548.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/mmaction/models/losses/cross_entropy_loss.py | https://pytorch.org/docs/stable/generated/torch.nn.CrossEntropyLoss.html | 相关参考 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/mmaction/models/losses/bmn_loss.py | https://arxiv.org/abs/1907.09702 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/mmaction/models/losses/bmn_loss.py | https://github.com/JJBOY/BMN-Boundary-Matching-Network | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/mmaction/models/localizers/bsn.py | http://arxiv.org/abs/1806.02964 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/mmaction/models/localizers/bsn.py | https://github.com/wzmsltw/BSN-boundary-sensitive-network | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/mmaction/models/localizers/bmn.py | https://arxiv.org/abs/1907.09702 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/mmaction/models/localizers/bmn.py | https://github.com/JJBOY/BMN-Boundary-Matching-Network | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/mmaction/models/localizers/base.py | https://github.com/open-mmlab/mmaction2/pull/913 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/mmaction/models/heads/tpn_head.py | https://arxiv.org/abs/1906.02629 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/mmaction/models/heads/misc_head.py | https://arxiv.org/abs/1807.10982 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/mmaction/models/common/tam.py | https://arxiv.org/pdf/2005.06803 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/mmaction/models/common/lfb.py | https://arxiv.org/abs/1812.05038 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/mmaction/models/common/conv2plus1d.py | https://arxiv.org/pdf/1711.11248.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/mmaction/models/common/conv_audio.py | https://arxiv.org/abs/2001.08740 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/mmaction/models/builder.py | https://github.com/open-mmlab/mmaction2/pull/629 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/mmaction/models/backbones/x3d.py | 
https://arxiv.org/pdf/2004.04730.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/mmaction/models/backbones/timesformer.py | https://arxiv.org/abs/2102.05095 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/mmaction/models/backbones/tanet.py | https://arxiv.org/pdf/2005.06803 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/mmaction/models/backbones/tanet.py | https://arxiv.org/pdf/2005.06803 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/mmaction/models/backbones/resnet3d_slowfast.py | https://arxiv.org/abs/1812.03982 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/mmaction/models/backbones/resnet3d_csn.py | https://arxiv.org/pdf/1711.11248.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/mmaction/models/backbones/resnet2plus1d.py | https://arxiv.org/abs/1711.11248.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/mmaction/models/backbones/resnet_tsm.py | https://arxiv.org/abs/1811.08383 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/mmaction/models/backbones/resnet_tin.py | https://arxiv.org/abs/2001.06499 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/mmaction/models/backbones/resnet_audio.py| https://arxiv.org/abs/2001.08740 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/mmaction/datasets/ssn_dataset.py | https://github.com/open-mmlab/mmaction2/pull/286 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/mmaction/datasets/pipelines/loading.py | https://github.com/open-mmlab/mmaction2/pull/89 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/mmaction/datasets/pipelines/loading.py | https://github.com/mikeboers/PyAV | 相关依赖 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/mmaction/datasets/pipelines/loading.py | https://github.com/soft-matter/pims | 相关依赖 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/mmaction/datasets/pipelines/loading.py | https://github.com/PyAV-Org/PyAV/ | 相关依赖 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/mmaction/datasets/pipelines/loading.py | https://github.com/dmlc/decord | 相关依赖 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/mmaction/datasets/pipelines/augmentations.py | https://imgaug.readthedocs.io/en/latest/index.html | 相关参考 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/mmaction/datasets/pipelines/augmentations.py | https://arxiv.org/abs/1909.13719 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/mmaction/datasets/pipelines/augmentations.py | https://github.com/tensorflow/tpu/blob/master/models/official/efficientnet/autoaugment.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/mmaction/datasets/pipelines/augmentations.py | 
https://gluon-cv.mxnet.io/_modules/gluoncv/data/transforms/experimental/image.html | 相关参考 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/mmaction/datasets/pipelines/augmentations.py | https://mxnet.apache.org/api/python/docs/_modules/mxnet/image/image.html#LightingAug | 相关参考 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/mmaction/datasets/builder.py | https://github.com/pytorch/pytorch/issues/973 | 相关参考 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/mmaction/datasets/blending_utils.py | https://arxiv.org/abs/1710.09412 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/mmaction/datasets/blending_utils.py | https://github.com/open-mmlab/mmclassification/blob/master/mmcls/models/utils/mixup.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/mmaction/datasets/blending_utils.py | https://github.com/open-mmlab/mmclassification/blob/master/mmcls/models/utils/mixup.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/mmaction/datasets/blending_utils.py | https://arxiv.org/abs/1905.04899 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/mmaction/datasets/blending_utils.py | https://github.com/clovaai/CutMix-PyTorch | 相关参考 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/mmaction/datasets/base.py | https://github.com/open-mmlab/mmaction2/pull/286 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/mmaction/datasets/ava_dataset.py | https://github.com/open-mmlab/mmaction2/pull/567 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/mmaction/datasets/activitynet_dataset.py | https://github.com/open-mmlab/mmaction2/pull/286 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/mmaction/core/scheduler/lr_updater.py | https://github.com/deepcs233/TIN/blob/master/main.py#L409-L412 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/mmaction/core/hooks/output.py | https://stackoverflow.com/questions/31174295/getattr-and-setattr-on-nested-objects | 相关参考 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/mmaction/core/evaluation/eval_hooks.py | https://github.com/open-mmlab/mmaction2/pull/395 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/mmaction/core/evaluation/ava_utils.py | https://research.google.com/ava/download.html | 下载数据集 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/mmaction/core/evaluation/ava_evaluation/metrics.py | https://www.robots.ox.ac.uk/~vgg/rg/papers/deselaers-eccv10.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/configs/_base_/models/i3d_r50.py | https://github.com/open-mmlab/mmaction/blob/master/mmaction/models/tenons/backbones/resnet_i3d.py#L329-L332 | 相关参考 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/additional_need/mmcv/test.py | https://github.com/open-mmlab/mmcv/issues/985 | 源码实现 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/additional_need/mmcv/optimizer.py | https://arxiv.org/abs/1710.03740 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | C3D/additional_need/mmcv/epoch_based_runner.py | https://github.com/open-mmlab/mmcv/pull/1108 | 源码实现 | +| 文件位置 | 公网地址 | 公网地址用途 | +|---------------------------------------------------------------------------------------------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|-----------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/c3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/c3d/c3d_sports1m_16x1x1_45e_ucf101_rgb/c3d_sports1m_16x1x1_45e_ucf101_rgb_20201021-26655025.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/c3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/c3d/c3d_sports1m_16x1x1_45e_ucf101_rgb/20201021_140429.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/c3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/c3d/c3d_sports1m_16x1x1_45e_ucf101_rgb/20201021_140429.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/csn/ipcsn_ig65m_pretrained_bnfrozen_r152_32x2x1_58e_kinetics400_rgb.py | https://download.openmmlab.com/mmaction/recognition/csn/ipcsn_from_scratch_r152_ig65m_20210617-c4b99d38.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/csn/ipcsn_sports1m_pretrained_bnfrozen_r152_32x2x1_58e_kinetics400_rgb.py | https://download.openmmlab.com/mmaction/recognition/csn/ipcsn_from_scratch_r152_sports1m_20210617-7a7cc5b9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/csn/ircsn_ig65m_pretrained_bnfrozen_r152_32x2x1_58e_kinetics400_rgb.py | https://download.openmmlab.com/mmaction/recognition/csn/ircsn_from_scratch_r152_ig65m_20200807-771c4135.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/csn/ircsn_ig65m_pretrained_bnfrozen_r50_32x2x1_58e_kinetics400_rgb.py | https://download.openmmlab.com/mmaction/recognition/csn/ircsn_from_scratch_r50_ig65m_20210617-ce545a37.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/csn/ircsn_ig65m_pretrained_r152_32x2x1_58e_kinetics400_rgb.py | https://download.openmmlab.com/mmaction/recognition/csn/ircsn_from_scratch_r152_ig65m_20200807-771c4135.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/csn/ircsn_sports1m_pretrained_bnfrozen_r152_32x2x1_58e_kinetics400_rgb.py | https://download.openmmlab.com/mmaction/recognition/csn/ircsn_from_scratch_r152_sports1m_20210617-bcc9c0dd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/csn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/csn/vmz/vmz_ircsn_sports1m_pretrained_r152_32x2x1_58e_kinetics400_rgb_20210617-b9b10241.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/csn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/csn/vmz/vmz_ircsn_ig65m_pretrained_r50_32x2x1_58e_kinetics400_rgb_20210617-86d33018.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/csn/metafile.yml | 
https://download.openmmlab.com/mmaction/recognition/csn/vmz/vmz_ircsn_from_scratch_r152_32x2x1_180e_kinetics400_rgb_20210617-5c933ae1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/csn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/csn/vmz/vmz_ipcsn_sports1m_pretrained_r152_32x2x1_58e_kinetics400_rgb_20210617-3367437a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/csn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/csn/vmz/vmz_ipcsn_ig65m_pretrained_r152_32x2x1_58e_kinetics400_rgb_20210617-c3be9793.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/csn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/csn/vmz/vmz_ipcsn_from_scratch_r152_32x2x1_180e_kinetics400_rgb_20210617-d565828d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/csn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/csn/ircsn_ig65m_pretrained_r152_32x2x1_58e_kinetics400_rgb/ircsn_ig65m_pretrained_r152_32x2x1_58e_kinetics400_rgb_20200803-fc66ce8d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/csn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/csn/ircsn_ig65m_pretrained_bnfrozen_r152_32x2x1_58e_kinetics400_rgb/ircsn_ig65m_pretrained_bnfrozen_r152_32x2x1_58e_kinetics400_rgb_20200812-9037a758.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/csn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/csn/ircsn_ig65m_pretrained_r152_32x2x1_58e_kinetics400_rgb/20200728_031952.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/csn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/csn/ircsn_ig65m_pretrained_bnfrozen_r152_32x2x1_58e_kinetics400_rgb/20200809_053132.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/csn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/csn/ircsn_ig65m_pretrained_r152_32x2x1_58e_kinetics400_rgb/20200728_031952.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/csn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/csn/ircsn_ig65m_pretrained_bnfrozen_r152_32x2x1_58e_kinetics400_rgb/20200809_053132.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_nl_embedded_gaussian_r50_32x2x1_100e_kinetics400_rgb/20200813_034054.log | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_nl_gaussian_r50_32x2x1_100e_kinetics400_rgb/20200813_034909.log | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_nl_dot_product_r50_32x2x1_100e_kinetics400_rgb/20200814_044208.log | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_32x2x1_100e_kinetics400_rgb/20200614_060456.log | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_video_32x2x1_100e_kinetics400_rgb/i3d_r50_video_32x2x1_100e_kinetics400_rgb_20200826-e31c6f52.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_fast_32x2x1_100e_kinetics400_rgb/i3d_r50_fast_32x2x1_100e_kinetics400_rgb_20200612-000e4d2a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_fast_256p_32x2x1_100e_kinetics400_rgb/i3d_r50_fast_256p_32x2x1_100e_kinetics400_rgb_20200817-4e90d1d5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_dense_32x2x1_100e_kinetics400_rgb/i3d_r50_dense_32x2x1_100e_kinetics400_rgb_20200616-2bbb4361.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_dense_256p_32x2x1_100e_kinetics400_rgb/i3d_r50_dense_256p_32x2x1_100e_kinetics400_rgb_20200725-24eb54cc.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_32x2x1_100e_kinetics400_rgb/i3d_r50_32x2x1_100e_kinetics400_rgb_20200614-c25ef9a4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_256p_32x2x1_100e_kinetics400_rgb/i3d_r50_256p_32x2x1_100e_kinetics400_rgb_20200801-7d9f44de.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_nl_gaussian_r50_32x2x1_100e_kinetics400_rgb/i3d_nl_gaussian_r50_32x2x1_100e_kinetics400_rgb_20200815-17f84aa2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_nl_embedded_gaussian_r50_32x2x1_100e_kinetics400_rgb/i3d_nl_embedded_gaussian_r50_32x2x1_100e_kinetics400_rgb_20200813-6e6aef1b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_nl_dot_product_r50_32x2x1_100e_kinetics400_rgb/i3d_nl_dot_product_r50_32x2x1_100e_kinetics400_rgb_20200814-7c30d5bb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_video_32x2x1_100e_kinetics400_rgb/20200706_143014.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_fast_32x2x1_100e_kinetics400_rgb/20200612_233836.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_fast_256p_32x2x1_100e_kinetics400_rgb/20200725_031457.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_dense_32x2x1_100e_kinetics400_rgb/20200616_230011.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_dense_256p_32x2x1_100e_kinetics400_rgb/20200725_031604.log.json | log地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_32x2x1_100e_kinetics400_rgb/20200614_060456.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_256p_32x2x1_100e_kinetics400_rgb/20200725_031555.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_nl_gaussian_r50_32x2x1_100e_kinetics400_rgb/20200813_034909.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_nl_embedded_gaussian_r50_32x2x1_100e_kinetics400_rgb/20200813_034054.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_nl_dot_product_r50_32x2x1_100e_kinetics400_rgb/20200814_044208.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_256p_32x2x1_100e_kinetics400_rgb/20200725_031555.log | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_video_32x2x1_100e_kinetics400_rgb/20200706_143014.log | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_dense_32x2x1_100e_kinetics400_rgb/20200616_230011.log | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_dense_256p_32x2x1_100e_kinetics400_rgb/20200725_031604.log | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_fast_32x2x1_100e_kinetics400_rgb/20200612_233836.log | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_fast_256p_32x2x1_100e_kinetics400_rgb/20200725_031457.log | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/baseline/tsn_r50_1x1x8_100e_minikinetics_rgb_20201030.json | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/baseline/tsn_r50_1x1x8_100e_minikinetics_rgb_20201030.log | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/baseline/tsn_r50_1x1x8_100e_minikinetics_rgb_20201030-b4eaf92b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/googleimage/tsn_r50_1x1x8_100e_minikinetics_googleimage_rgb_20201030.json | 训练日志地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/googleimage/tsn_r50_1x1x8_100e_minikinetics_googleimage_rgb_20201030.log | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/googleimage/tsn_r50_1x1x8_100e_minikinetics_googleimage_rgb_20201030-23966b4b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/webimage/tsn_r50_1x1x8_100e_minikinetics_webimage_rgb_20201030.json | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/webimage/tsn_r50_1x1x8_100e_minikinetics_webimage_rgb_20201030.log | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/webimage/tsn_r50_1x1x8_100e_minikinetics_webimage_rgb_20201030-66f5e046.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/insvideo/tsn_r50_1x1x8_100e_minikinetics_insvideo_rgb_20201030.json | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/insvideo/tsn_r50_1x1x8_100e_minikinetics_insvideo_rgb_20201030.log | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/insvideo/tsn_r50_1x1x8_100e_minikinetics_insvideo_rgb_20201030-011f984d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/kineticsraw/tsn_r50_1x1x8_100e_minikinetics_kineticsraw_rgb_20201030.json | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/kineticsraw/tsn_r50_1x1x8_100e_minikinetics_kineticsraw_rgb_20201030.log | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/kineticsraw/tsn_r50_1x1x8_100e_minikinetics_kineticsraw_rgb_20201030-59f5d064.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/omnisource/tsn_r50_1x1x8_100e_minikinetics_omnisource_rgb_20201030.json | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/omnisource/metafile.yml | 
https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/omnisource/tsn_r50_1x1x8_100e_minikinetics_omnisource_rgb_20201030.log | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/omnisource/tsn_r50_1x1x8_100e_minikinetics_omnisource_rgb_20201030-0f56ef51.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/baseline/slowonly_r50_8x8x1_256e_minikinetics_rgb_20201030.json | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/baseline/slowonly_r50_8x8x1_256e_minikinetics_rgb_20201030.log | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/baseline/slowonly_r50_8x8x1_256e_minikinetics_rgb_20201030-168eb098.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/googleimage/slowonly_r50_8x8x1_256e_minikinetics_googleimage_rgb_20201030.json | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/googleimage/slowonly_r50_8x8x1_256e_minikinetics_googleimage_rgb_20201030.log | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/googleimage/slowonly_r50_8x8x1_256e_minikinetics_googleimage_rgb_20201030-7da6dfc3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/webimage/slowonly_r50_8x8x1_256e_minikinetics_webimage_rgb_20201030.json | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/webimage/slowonly_r50_8x8x1_256e_minikinetics_webimage_rgb_20201030.log | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/webimage/slowonly_r50_8x8x1_256e_minikinetics_webimage_rgb_20201030-c36616e9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/insvideo/slowonly_r50_8x8x1_256e_minikinetics_insvideo_rgb_20201030.json | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/omnisource/metafile.yml | 
https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/insvideo/slowonly_r50_8x8x1_256e_minikinetics_insvideo_rgb_20201030.log | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/insvideo/slowonly_r50_8x8x1_256e_minikinetics_insvideo_rgb_20201030-e2890e8d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/kineticsraw/slowonly_r50_8x8x1_256e_minikinetics_kineticsraw_rgb_20201030.json | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/kineticsraw/slowonly_r50_8x8x1_256e_minikinetics_kineticsraw_rgb_20201030.log | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/kineticsraw/slowonly_r50_8x8x1_256e_minikinetics_kineticsraw_rgb_20201030-62974bac.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/omnisource/slowonly_r50_8x8x1_256e_minikinetics_omnisource_rgb_20201030.json | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/omnisource/slowonly_r50_8x8x1_256e_minikinetics_omnisource_rgb_20201030.log | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/omnisource/slowonly_r50_8x8x1_256e_minikinetics_omnisource_rgb_20201030-284cfd3b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/omni/tsn_imagenet_pretrained_r50_omni_1x1x3_kinetics400_rgb_20200926-54192355.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/omni/tsn_1G1B_pretrained_r50_omni_1x1x3_kinetics400_rgb_20200926-2863fed0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/omni/slowonly_r50_omni_4x16x1_kinetics400_rgb_20200926-51b1f7ea.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/omni/slowonly_r101_omni_8x8x1_kinetics400_rgb_20200926-b5dbb701.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/r2plus1d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/r2plus1d/r2plus1d_r34_video_8x8x1_180e_kinetics400_rgb/r2plus1d_r34_video_8x8x1_180e_kinetics400_rgb_20200826-ab35a529.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/r2plus1d/metafile.yml | 
https://download.openmmlab.com/mmaction/recognition/r2plus1d/r2plus1d_r34_8x8x1_180e_kinetics400_rgb/r2plus1d_r34_8x8x1_180e_kinetics400_rgb_20200618-3fce5629.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/r2plus1d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/r2plus1d/r2plus1d_r34_32x2x1_180e_kinetics400_rgb/r2plus1d_r34_32x2x1_180e_kinetics400_rgb_20200618-63462eb3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/r2plus1d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/r2plus1d/r2plus1d_r34_256p_8x8x1_180e_kinetics400_rgb/r2plus1d_r34_256p_8x8x1_180e_kinetics400_rgb_20200729-aa94765e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/r2plus1d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/r2plus1d/r2plus1d_r34_video_8x8x1_180e_kinetics400_rgb/20200724_201360.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/r2plus1d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/r2plus1d/r2plus1d_r34_8x8x1_180e_kinetics400_rgb/r21d_8x8.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/r2plus1d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/r2plus1d/r2plus1d_r34_32x2x1_180e_kinetics400_rgb/r21d_32x2.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/r2plus1d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/r2plus1d/r2plus1d_r34_256p_8x8x1_180e_kinetics400_rgb/20200728_021421.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/r2plus1d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/r2plus1d/r2plus1d_r34_video_8x8x1_180e_kinetics400_rgb/20200724_201360.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/r2plus1d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/r2plus1d/r2plus1d_r34_8x8x1_180e_kinetics400_rgb/r2plus1d_r34_8x8_69.58_88.36.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/r2plus1d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/r2plus1d/r2plus1d_r34_32x2x1_180e_kinetics400_rgb/r2plus1d_r34_32x2_74.6_91.6.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/r2plus1d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/r2plus1d/r2plus1d_r34_256p_8x8x1_180e_kinetics400_rgb/20200728_021421.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/r2plus1d/r2plus1d_ucf101_rgb_1p.py | https://download.openmmlab.com/mmaction/recognition/r2plus1d/r2plus1d_r34_256p_8x8x1_180e_kinetics400_rgb/r2plus1d_r34_256p_8x8x1_180e_kinetics400_rgb_20200729-aa94765e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/r2plus1d/r2plus1d_ucf101_rgb_8p.py | https://download.openmmlab.com/mmaction/recognition/r2plus1d/r2plus1d_r34_256p_8x8x1_180e_kinetics400_rgb/r2plus1d_r34_256p_8x8x1_180e_kinetics400_rgb_20200729-aa94765e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_video_4x16x1_256e_kinetics400_rgb/slowfast_r50_video_4x16x1_256e_kinetics400_rgb_20200826-f85b90c5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowfast/metafile.yml | 
https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_8x8x1_256e_kinetics400_rgb/slowfast_r50_8x8x1_256e_kinetics400_rgb_20200716-73547d2b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_4x16x1_256e_kinetics400_rgb/slowfast_r50_4x16x1_256e_kinetics400_rgb_20200704-bcde7ed7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_256p_8x8x1_256e_kinetics400_rgb/slowfast_r50_256p_8x8x1_256e_kinetics400_rgb_20200810-863812c2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_256p_4x16x1_256e_kinetics400_rgb/slowfast_r50_256p_4x16x1_256e_kinetics400_rgb_20200728-145f1097.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_16x8x1_22e_sthv1_rgb/slowfast_r50_16x8x1_22e_sthv1_rgb_20210630-53355c16.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r152_4x16x1_256e_kinetics400_rgb/slowfast_r152_4x16x1_256e_kinetics400_rgb_20210122-bdeb6b87.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r101_8x8x1_256e_kinetics400_rgb/slowfast_r101_8x8x1_256e_kinetics400_rgb_20210218-0dd54025.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r101_4x16x1_256e_kinetics400_rgb/slowfast_r101_4x16x1_256e_kinetics400_rgb_20210218-d8b58813.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_video_4x16x1_256e_kinetics400_rgb/20200812_160237.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_8x8x1_256e_kinetics400_rgb/20200716_192653.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_4x16x1_256e_kinetics400_rgb/20200704_232901.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_256p_8x8x1_256e_kinetics400_rgb/20200731_151537.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_256p_4x16x1_256e_kinetics400_rgb/20200731_151706.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_16x8x1_22e_sthv1_rgb/20210606_225114.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowfast/metafile.yml | 
https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r152_4x16x1_256e_kinetics400_rgb/20210122_131321.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r101_8x8x1_256e_kinetics400_rgb/20210218_121513.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r101_4x16x1_256e_kinetics400_rgb/20210118_133528.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_video_4x16x1_256e_kinetics400_rgb/20200812_160237.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_8x8x1_256e_kinetics400_rgb/20200716_192653.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_4x16x1_256e_kinetics400_rgb/20200704_232901.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_256p_8x8x1_256e_kinetics400_rgb/20200731_151537.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_256p_4x16x1_256e_kinetics400_rgb/20200731_151706.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_16x8x1_22e_sthv1_rgb/20210606_225114.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r152_4x16x1_256e_kinetics400_rgb/20210122_131321.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r101_8x8x1_256e_kinetics400_rgb/20210218_121513.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r101_4x16x1_256e_kinetics400_rgb/20210118_133528.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowfast/slowfast_r50_16x8x1_22e_sthv1_rgb.py | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_8x8x1_256e_kinetics400_rgb/slowfast_r50_8x8x1_256e_kinetics400_rgb_20200716-73547d2b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_video_8x8x1_256e_kinetics700_rgb/slowonly_r50_video_8x8x1_256e_kinetics700_rgb_20201015-9250f662.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_video_8x8x1_256e_kinetics600_rgb/slowonly_r50_video_8x8x1_256e_kinetics600_rgb_20201015-81e5153e.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_video_320p_4x16x1_256e_kinetics400_rgb/slowonly_r50_video_320p_4x16x1_256e_kinetics400_rgb_20201014-c9cdc656.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_8x8x1_256e_kinetics400_rgb/slowonly_r50_8x8x1_256e_kinetics400_rgb_20200703-a79c555a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_8x8x1_256e_kinetics400_flow/slowonly_r50_8x8x1_256e_kinetics400_flow_20200704-6b384243.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_4x16x1_256e_kinetics400_rgb/slowonly_r50_4x16x1_256e_kinetics400_rgb_20200704-a69556c6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_4x16x1_256e_kinetics400_flow/slowonly_r50_4x16x1_256e_kinetics400_flow_20200704-decb8568.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_256p_8x8x1_256e_kinetics400_rgb/slowonly_r50_256p_8x8x1_256e_kinetics400_rgb_20200820-75851a7d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_256p_4x16x1_256e_kinetics400_rgb/slowonly_r50_256p_4x16x1_256e_kinetics400_rgb_20200820-bea7701f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_nl_embedded_gaussian_r50_8x8x1_150e_kinetics400_rgb/slowonly_nl_embedded_gaussian_r50_8x8x1_150e_kinetics400_rgb_20210308-e8dd9e82.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_nl_embedded_gaussian_r50_4x16x1_150e_kinetics400_rgb/slowonly_nl_embedded_gaussian_r50_4x16x1_150e_kinetics400_rgb_20210308-0d6e5a69.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_kinetics_pretrained_r50_4x16x1_120e_gym99_flow/slowonly_kinetics_pretrained_r50_4x16x1_120e_gym99_flow_20201111-66ecdb3c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_k400_pretrained_r50_8x4x1_40e_ucf101_rgb/slowonly_k400_pretrained_r50_8x4x1_40e_ucf101_rgb_20210630-ee8c850f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_k400_pretrained_r50_8x4x1_40e_hmdb51_rgb/slowonly_k400_pretrained_r50_8x4x1_40e_hmdb51_rgb_20210630-cee5f725.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | 
https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_8x8x1_64e_jester_rgb/slowonly_imagenet_pretrained_r50_8x8x1_64e_jester_rgb-b56a5389.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_8x8x1_150e_kinetics400_rgb/slowonly_imagenet_pretrained_r50_8x8x1_150e_kinetics400_rgb_20200912-3f9ce182.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_8x4x1_64e_ucf101_rgb/slowonly_imagenet_pretrained_r50_8x4x1_64e_ucf101_rgb_20210630-181e1661.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_8x4x1_64e_sthv1_rgb/slowonly_imagenet_pretrained_r50_8x4x1_64e_sthv1_rgb_20210630-807a9a9a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_8x4x1_64e_hmdb51_rgb/slowonly_imagenet_pretrained_r50_8x4x1_64e_hmdb51_rgb_20210630-16faeb6a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_4x16x1_150e_kinetics400_rgb/slowonly_imagenet_pretrained_r50_4x16x1_150e_kinetics400_rgb_20200912-1e8fc736.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_4x16x1_120e_gym99_rgb/slowonly_imagenet_pretrained_r50_4x16x1_120e_gym99_rgb_20201111-a9c34b54.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/omni/slowonly_r50_omni_4x16x1_kinetics400_rgb_20200926-51b1f7ea.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/omni/slowonly_r101_without_omni_8x8x1_kinetics400_rgb_20200926-0c730aef.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/omni/slowonly_r101_omni_8x8x1_kinetics400_rgb_20200926-b5dbb701.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_video_8x8x1_256e_kinetics700_rgb/slowonly_r50_video_8x8x1_256e_kinetics700_rgb_20201015.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_video_8x8x1_256e_kinetics600_rgb/slowonly_r50_video_8x8x1_256e_kinetics600_rgb_20201015.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_video_320p_4x16x1_256e_kinetics400_rgb/slowonly_r50_video_320p_4x16x1_256e_kinetics400_rgb_20201014.log | log地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_8x8x1_256e_kinetics400_rgb/so_8x8.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_8x8x1_256e_kinetics400_flow/slowonly_r50_8x8x1_196e_kinetics400_flow_65.8_86.3.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_4x16x1_256e_kinetics400_rgb/so_4x16.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_4x16x1_256e_kinetics400_flow/slowonly_r50_4x16x1_256e_kinetics400_flow_61.8_83.6.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_256p_8x8x1_256e_kinetics400_rgb/20200817_003320.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_256p_4x16x1_256e_kinetics400_rgb/20200817_001411.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_nl_embedded_gaussian_r50_8x8x1_150e_kinetics400_rgb/20210308_212250.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_nl_embedded_gaussian_r50_4x16x1_150e_kinetics400_rgb/20210305_152630.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_kinetics_pretrained_r50_4x16x1_120e_gym99_flow/slowonly_kinetics_pretrained_r50_4x16x1_120e_gym99_flow_20201111.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_k400_pretrained_r50_8x4x1_40e_ucf101_rgb/20210606_010231.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_k400_pretrained_r50_8x4x1_40e_hmdb51_rgb/20210606_010153.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_8x8x1_64e_jester_rgb/slowonly_imagenet_pretrained_r50_8x8x1_64e_jester_rgb.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_8x8x1_150e_kinetics400_rgb/slowonly_imagenet_pretrained_r50_8x8x1_150e_kinetics400_rgb_20200912.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_8x4x1_64e_ucf101_rgb/20210605_213503.log | log地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_8x4x1_64e_sthv1_rgb/20210605_235410.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_8x4x1_64e_hmdb51_rgb/20210605_185256.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_4x16x1_150e_kinetics400_rgb/slowonly_imagenet_pretrained_r50_4x16x1_150e_kinetics400_rgb_20200912.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_4x16x1_120e_gym99_rgb/slowonly_imagenet_pretrained_r50_4x16x1_120e_gym99_rgb_20201111.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_video_8x8x1_256e_kinetics700_rgb/slowonly_r50_video_8x8x1_256e_kinetics700_rgb_20201015.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_video_8x8x1_256e_kinetics600_rgb/slowonly_r50_video_8x8x1_256e_kinetics600_rgb_20201015.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_video_320p_4x16x1_256e_kinetics400_rgb/slowonly_r50_video_320p_4x16x1_256e_kinetics400_rgb_20201014.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_8x8x1_256e_kinetics400_rgb/slowonly_r50_8x8_74.93_91.92.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_8x8x1_256e_kinetics400_flow/slowonly_r50_8x8x1_196e_kinetics400_flow_65.8_86.3.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_4x16x1_256e_kinetics400_rgb/slowonly_r50_4x16_73.02_90.77.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_4x16x1_256e_kinetics400_flow/slowonly_r50_4x16x1_256e_kinetics400_flow_61.8_83.6.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_256p_8x8x1_256e_kinetics400_rgb/20200817_003320.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_256p_4x16x1_256e_kinetics400_rgb/20200817_001411.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | 
https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_nl_embedded_gaussian_r50_8x8x1_150e_kinetics400_rgb/20210308_212250.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_nl_embedded_gaussian_r50_4x16x1_150e_kinetics400_rgb/20210305_152630.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_kinetics_pretrained_r50_4x16x1_120e_gym99_flow/slowonly_kinetics_pretrained_r50_4x16x1_120e_gym99_flow_20201111.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_k400_pretrained_r50_8x4x1_40e_ucf101_rgb/20210606_010231.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_k400_pretrained_r50_8x4x1_40e_hmdb51_rgb/20210606_010153.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_8x8x1_64e_jester_rgb/slowonly_imagenet_pretrained_r50_8x8x1_64e_jester_rgb.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_8x8x1_150e_kinetics400_rgb/slowonly_imagenet_pretrained_r50_8x8x1_150e_kinetics400_rgb_20200912.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_8x4x1_64e_ucf101_rgb/20210605_213503.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_8x4x1_64e_sthv1_rgb/20210605_235410.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_8x4x1_64e_hmdb51_rgb/20210605_185256.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_4x16x1_150e_kinetics400_rgb/slowonly_imagenet_pretrained_r50_4x16x1_150e_kinetics400_rgb_20200912.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_4x16x1_120e_gym99_rgb/slowonly_imagenet_pretrained_r50_4x16x1_120e_gym99_rgb_20201111.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/slowonly_k400_pretrained_r50_4x16x1_120e_gym99_flow.py | https://download.openmmlab.com/mmaction/recognition/slowonly/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/slowonly/slowonly_k400_pretrained_r50_8x4x1_40e_ucf101_rgb.py | 
https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_8x8x1_256e_kinetics400_rgb/slowonly_r50_8x8x1_256e_kinetics400_rgb_20200703-a79c555a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tanet/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tanet/tanet_r50_dense_1x1x8_100e_kinetics400_rgb/tanet_r50_dense_1x1x8_100e_kinetics400_rgb_20210219-032c8e94.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tanet/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tanet/tanet_r50_1x1x8_50e_sthv1_rgb/tanet_r50_1x1x8_50e_sthv1_rgb_20210630-f4a48609.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tanet/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tanet/tanet_r50_1x1x16_50e_sthv1_rgb/tanet_r50_1x1x16_50e_sthv1_rgb_20210630-7c19303c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tanet/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tanet/tanet_r50_dense_1x1x8_100e_kinetics400_rgb/tanet_r50_dense_1x1x8_100e_kinetics400_rgb_20210219.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tanet/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tanet/tanet_r50_1x1x8_50e_sthv1_rgb/20210606_205006.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tanet/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tanet/tanet_r50_1x1x16_50e_sthv1_rgb/20210607_155335.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tanet/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tanet/tanet_r50_dense_1x1x8_100e_kinetics400_rgb/tanet_r50_dense_1x1x8_100e_kinetics400_rgb_20210219.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tanet/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tanet/tanet_r50_1x1x8_50e_sthv1_rgb/20210606_205006.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tanet/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tanet/tanet_r50_1x1x16_50e_sthv1_rgb/20210607_155335.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/timesformer/timesformer_divST_8x32x1_15e_kinetics400_rgb.py | https://download.openmmlab.com/mmaction/recognition/timesformer/vit_base_patch16_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/timesformer/timesformer_jointST_8x32x1_15e_kinetics400_rgb.py | https://download.openmmlab.com/mmaction/recognition/timesformer/vit_base_patch16_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/timesformer/timesformer_spaceOnly_8x32x1_15e_kinetics400_rgb.py | https://download.openmmlab.com/mmaction/recognition/timesformer/vit_base_patch16_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tin/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tin/tin_tsm_finetune_r50_1x1x8_50e_kinetics400_rgb/tin_tsm_finetune_r50_1x1x8_50e_kinetics400_rgb_20200810-4a146a70.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tin/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tin/tin_r50_1x1x8_40e_sthv2_rgb/tin_r50_1x1x8_40e_sthv2_rgb_20200912-b27a7337.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tin/metafile.yml 
| https://download.openmmlab.com/mmaction/recognition/tin/tin_r50_1x1x8_40e_sthv1_rgb/tin_r50_1x1x8_40e_sthv1_rgb_20200729-4a33db86.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tin/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tin/tin_tsm_finetune_r50_1x1x8_50e_kinetics400_rgb/20200809_142447.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tin/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tin/tin_r50_1x1x8_40e_sthv2_rgb/20200912_225451.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tin/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tin/tin_r50_1x1x8_40e_sthv1_rgb/20200729_034132.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tin/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tin/tin_tsm_finetune_r50_1x1x8_50e_kinetics400_rgb/20200809_142447.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tin/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tin/tin_r50_1x1x8_40e_sthv2_rgb/20200912_225451.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tin/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tin/tin_r50_1x1x8_40e_sthv1_rgb/20200729_034132.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tin/tin_tsm_finetune_r50_1x1x8_50e_kinetics400_rgb.py | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_50e_kinetics400_rgb/tsm_r50_1x1x8_50e_kinetics400_rgb_20200607-af7fb746.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tpn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tpn/tpn_slowonly_r50_8x8x1_150e_kinetics_rgb/tpn_slowonly_r50_8x8x1_150e_kinetics_rgb_20200910-b796d7a0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tpn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tpn/tpn_tsm_r50_1x1x8_150e_sthv1_rgb/tpn_tsm_r50_1x1x8_150e_sthv1_rgb_20210311-28de4cd5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tpn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tpn/tpn_imagenet_pretrained_slowonly_r50_8x8x1_150e_kinetics_rgb/tpn_imagenet_pretrained_slowonly_r50_8x8x1_150e_kinetics_rgb_20200923-52629684.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tpn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tpn/tpn_tsm_r50_1x1x8_150e_sthv1_rgb/20210311_162636.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tpn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tpn/tpn_slowonly_r50_8x8x1_150e_kinetics_rgb/20200910_134330.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tpn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tpn/tpn_imagenet_pretrained_slowonly_r50_8x8x1_150e_kinetics_rgb/20200923_151919.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tpn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tpn/tpn_tsm_r50_1x1x8_150e_sthv1_rgb/20210311_162636.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tpn/metafile.yml | 
https://download.openmmlab.com/mmaction/recognition/tpn/tpn_slowonly_r50_8x8x1_150e_kinetics_rgb/20200910_134330.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tpn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tpn/tpn_imagenet_pretrained_slowonly_r50_8x8x1_150e_kinetics_rgb/20200923_151919.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/trn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/trn/trn_r50_1x1x8_50e_sthv2_rgb/trn_r50_1x1x8_50e_sthv2_rgb_20210401-773eca7b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/trn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/trn/trn_r50_1x1x8_50e_sthv1_rgb/trn_r50_1x1x8_50e_sthv1_rgb_20210401-163704a8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/trn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/trn/trn_r50_1x1x8_50e_sthv2_rgb/20210326_103951.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/trn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/trn/trn_r50_1x1x8_50e_sthv1_rgb/20210326_103948.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/trn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/trn/trn_r50_1x1x8_50e_sthv2_rgb/20210326_103951.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/trn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/trn/trn_r50_1x1x8_50e_sthv1_rgb/20210326_103948.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_video_1x1x8_50e_diving48_rgb/tsm_r50_video_1x1x8_50e_diving48_rgb_20210426-aba5aa3d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_video_1x1x8_100e_kinetics400_rgb/tsm_r50_video_1x1x8_100e_kinetics400_rgb_20200702-a77f4328.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_video_1x1x16_50e_diving48_rgb/tsm_r50_video_1x1x16_50e_diving48_rgb_20210426-aa9631c0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_randaugment_1x1x8_50e_sthv1_rgb/tsm_r50_randaugment_1x1x8_50e_sthv1_rgb_20210324-481268d9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_mixup_1x1x8_50e_sthv1_rgb/tsm_r50_mixup_1x1x8_50e_sthv1_rgb-9eca48e5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_gpu_normalize_1x1x8_50e_kinetics400_rgb/tsm_r50_gpu_normalize_1x1x8_50e_kinetics400_rgb_20210219-bf96e6cc.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_flip_randaugment_1x1x8_50e_sthv1_rgb/tsm_r50_flip_randaugment_1x1x8_50e_sthv1_rgb_20210324-76937692.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | 
https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_flip_1x1x8_50e_sthv1_rgb/tsm_r50_flip_1x1x8_50e_sthv1_rgb_20210203-12596f16.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_dense_1x1x8_50e_kinetics400_rgb/tsm_r50_dense_1x1x8_50e_kinetics400_rgb_20210701-a54ff3d3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_dense_1x1x8_100e_kinetics400_rgb/tsm_r50_dense_1x1x8_100e_kinetics400_rgb_20210701-e3e5e97f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_cutmix_1x1x8_50e_sthv1_rgb/tsm_r50_cutmix_1x1x8_50e_sthv1_rgb-34934615.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_256p_1x1x8_50e_kinetics400_rgb/tsm_r50_256p_1x1x8_50e_kinetics400_rgb_20200726-020785e2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_256p_1x1x16_50e_kinetics400_rgb/tsm_r50_256p_1x1x16_50e_kinetics400_rgb_20201010-85645c2a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_50e_sthv2_rgb/tsm_r50_256h_1x1x8_50e_sthv2_rgb_20210401-df97f3e1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_50e_sthv2_rgb/tsm_r50_1x1x8_50e_sthv2_rgb_20200912-033c4ac6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_50e_sthv1_rgb/tsm_r50_1x1x8_50e_sthv1_rgb_20210203-01dce462.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_50e_kinetics400_rgb/tsm_r50_1x1x8_50e_kinetics400_rgb_20210701-68d582b4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_50e_kinetics400_rgb/tsm_r50_1x1x8_50e_kinetics400_rgb_20200607-af7fb746.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_50e_jester_rgb/tsm_r50_1x1x8_50e_jester_rgb-c799267e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_100e_kinetics400_rgb/tsm_r50_1x1x8_100e_kinetics400_rgb_20210701-7ff22268.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x16_50e_sthv2_rgb/tsm_r50_256h_1x1x16_50e_sthv2_rgb_20210331-0a45549c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x16_50e_sthv2_rgb/tsm_r50_1x1x16_50e_sthv2_rgb_20201010-16469c6f.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x16_50e_sthv1_rgb/tsm_r50_1x1x16_50e_sthv1_rgb_20201010-17fa49f6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x16_50e_kinetics400_rgb/tsm_r50_340x256_1x1x16_50e_kinetics400_rgb_20201011-2f27f229.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x16_50e_kinetics400_rgb/tsm_r50_1x1x16_50e_kinetics400_rgb_20210701-7c0c5d54.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r101_1x1x8_50e_sthv2_rgb/tsm_r101_1x1x8_50e_sthv2_rgb_20201010-98cdedb8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r101_1x1x8_50e_sthv1_rgb/tsm_r101_1x1x8_50e_sthv1_rgb_20201010-43fedf2e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_nl_gaussian_r50_1x1x8_50e_kinetics400_rgb/tsm_nl_gaussian_r50_1x1x8_50e_kinetics400_rgb_20200816-b93fd297.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_nl_embedded_gaussian_r50_1x1x8_50e_kinetics400_rgb/tsm_nl_embedded_gaussian_r50_1x1x8_50e_kinetics400_rgb_20200724-f00f1336.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_nl_dot_product_r50_1x1x8_50e_kinetics400_rgb/tsm_nl_dot_product_r50_1x1x8_50e_kinetics400_rgb_20200724-d8ad84d2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_mobilenetv2_dense_1x1x8_100e_kinetics400_rgb/tsm_mobilenetv2_dense_320p_1x1x8_100e_kinetics400_rgb_20210202-61135809.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_k400_pretrained_r50_1x1x8_25e_ucf101_rgb/tsm_k400_pretrained_r50_1x1x8_25e_ucf101_rgb_20210630-1fae312b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_k400_pretrained_r50_1x1x8_25e_hmdb51_rgb/tsm_k400_pretrained_r50_1x1x8_25e_hmdb51_rgb_20210630-10c74ee5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_k400_pretrained_r50_1x1x16_25e_ucf101_rgb/tsm_k400_pretrained_r50_1x1x16_25e_ucf101_rgb_20210630-8df9c358.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_k400_pretrained_r50_1x1x16_25e_hmdb51_rgb/tsm_k400_pretrained_r50_1x1x16_25e_hmdb51_rgb_20210630-4785548e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | 
https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_video_1x1x8_50e_diving48_rgb/20210426_012424.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_video_1x1x8_100e_kinetics400_rgb/tsm_r50_video_2d_1x1x8_50e_kinetics400_rgb.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_video_1x1x16_50e_diving48_rgb/20210426_012823.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_randaugment_1x1x8_50e_sthv1_rgb/tsm_r50_randaugment_1x1x8_50e_sthv1_rgb.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_mixup_1x1x8_50e_sthv1_rgb/tsm_r50_mixup_1x1x8_50e_sthv1_rgb.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_gpu_normalize_1x1x8_50e_kinetics400_rgb/tsm_r50_gpu_normalize_1x1x8_50e_kinetics400_rgb_20210219.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_flip_randaugment_1x1x8_50e_sthv1_rgb/tsm_r50_flip_randaugment_1x1x8_50e_sthv1_rgb.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_flip_1x1x8_50e_sthv1_rgb/20210203_145829.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_dense_1x1x8_50e_kinetics400_rgb/20210617_103245.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_dense_1x1x8_100e_kinetics400_rgb/20210613_034931.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_cutmix_1x1x8_50e_sthv1_rgb/tsm_r50_cutmix_1x1x8_50e_sthv1_rgb.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_256p_1x1x8_50e_kinetics400_rgb/20200725_031623.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_256p_1x1x16_50e_kinetics400_rgb/20201010_224825.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_50e_sthv2_rgb/20210401_143656.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_50e_sthv2_rgb/20200912_140737.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_50e_sthv1_rgb/20210203_150227.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | 
https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_50e_kinetics400_rgb/20210616_021451.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_50e_kinetics400_rgb/20200607_211800.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_50e_jester_rgb/tsm_r50_1x1x8_50e_jester_rgb.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_100e_kinetics400_rgb/20210617_103543.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x16_50e_sthv2_rgb/20210331_134458.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x16_50e_sthv2_rgb/20201010_224215.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x16_50e_sthv1_rgb/20201010_221240.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x16_50e_kinetics400_rgb/20210621_115844.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x16_50e_kinetics400_rgb/20201011_205356.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r101_1x1x8_50e_sthv2_rgb/20201010_224100.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r101_1x1x8_50e_sthv1_rgb/20201010_224055.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_nl_gaussian_r50_1x1x8_50e_kinetics400_rgb/20200815_210253.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_nl_embedded_gaussian_r50_1x1x8_50e_kinetics400_rgb/20200724_120023.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_nl_dot_product_r50_1x1x8_50e_kinetics400_rgb/20200723_220442.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_mobilenetv2_dense_1x1x8_100e_kinetics400_rgb/20210129_024936.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_k400_pretrained_r50_1x1x8_25e_ucf101_rgb/20210605_182720.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_k400_pretrained_r50_1x1x8_25e_hmdb51_rgb/20210605_182554.log | log地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_k400_pretrained_r50_1x1x16_25e_ucf101_rgb/20210605_182720.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_k400_pretrained_r50_1x1x16_25e_hmdb51_rgb/20210605_182505.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_video_1x1x8_50e_diving48_rgb/20210426_012424.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_video_1x1x8_100e_kinetics400_rgb/tsm_r50_video_2d_1x1x8_50e_kinetics400_rgb.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_video_1x1x16_50e_diving48_rgb/20210426_012823.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_randaugment_1x1x8_50e_sthv1_rgb/tsm_r50_randaugment_1x1x8_50e_sthv1_rgb.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_mixup_1x1x8_50e_sthv1_rgb/tsm_r50_mixup_1x1x8_50e_sthv1_rgb.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_gpu_normalize_1x1x8_50e_kinetics400_rgb/tsm_r50_gpu_normalize_1x1x8_50e_kinetics400_rgb_20210219.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_flip_randaugment_1x1x8_50e_sthv1_rgb/tsm_r50_flip_randaugment_1x1x8_50e_sthv1_rgb.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_flip_1x1x8_50e_sthv1_rgb/20210203_145829.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_dense_1x1x8_50e_kinetics400_rgb/20210617_103245.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_dense_1x1x8_100e_kinetics400_rgb/20210613_034931.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_cutmix_1x1x8_50e_sthv1_rgb/tsm_r50_cutmix_1x1x8_50e_sthv1_rgb.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_256p_1x1x8_50e_kinetics400_rgb/20200725_031623.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_256p_1x1x16_50e_kinetics400_rgb/20201010_224825.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | 
https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_50e_sthv2_rgb/20210401_143656.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_50e_sthv2_rgb/20200912_140737.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_50e_sthv1_rgb/20210203_150227.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_50e_kinetics400_rgb/20210616_021451.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_50e_kinetics400_rgb/20200607_211800.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_50e_jester_rgb/tsm_r50_1x1x8_50e_jester_rgb.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_100e_kinetics400_rgb/20210617_103543.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x16_50e_sthv2_rgb/20210331_134458.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x16_50e_sthv2_rgb/20201010_224215.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x16_50e_sthv1_rgb/20201010_221240.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x16_50e_kinetics400_rgb/20210621_115844.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x16_50e_kinetics400_rgb/20201011_205356.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r101_1x1x8_50e_sthv2_rgb/20201010_224100.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r101_1x1x8_50e_sthv1_rgb/20201010_224055.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_nl_gaussian_r50_1x1x8_50e_kinetics400_rgb/20200815_210253.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_nl_embedded_gaussian_r50_1x1x8_50e_kinetics400_rgb/20200724_120023.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_nl_dot_product_r50_1x1x8_50e_kinetics400_rgb/20200723_220442.log.json | log地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_mobilenetv2_dense_1x1x8_100e_kinetics400_rgb/20210129_024936.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_k400_pretrained_r50_1x1x8_25e_ucf101_rgb/20210605_182720.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_k400_pretrained_r50_1x1x8_25e_hmdb51_rgb/20210605_182554.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_k400_pretrained_r50_1x1x16_25e_ucf101_rgb/20210605_182720.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_k400_pretrained_r50_1x1x16_25e_hmdb51_rgb/20210605_182505.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/tsm_k400_pretrained_r50_1x1x16_25e_hmdb51_rgb.py | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_256p_1x1x16_50e_kinetics400_rgb/tsm_r50_256p_1x1x16_50e_kinetics400_rgb_20201010-85645c2a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/tsm_k400_pretrained_r50_1x1x16_25e_ucf101_rgb.py | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_256p_1x1x16_50e_kinetics400_rgb/tsm_r50_256p_1x1x16_50e_kinetics400_rgb_20201010-85645c2a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/tsm_k400_pretrained_r50_1x1x8_25e_hmdb51_rgb.py | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_256p_1x1x8_50e_kinetics400_rgb/tsm_r50_256p_1x1x8_50e_kinetics400_rgb_20200726-020785e2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsm/tsm_k400_pretrained_r50_1x1x8_25e_ucf101_rgb.py | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_256p_1x1x8_50e_kinetics400_rgb/tsm_r50_256p_1x1x8_50e_kinetics400_rgb_20200726-020785e2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/custom_backbones/tsn_rn101_32x4d_320p_1x1x3_100e_kinetics400_rgb.py | https://download.openmmlab.com/mmclassification/v0/resnext/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_dense_1x1x8_100e_kinetics400_rgb/tsn_r50_video_dense_1x1x8_100e_kinetics400_rgb_20200703-0f19175f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_320p_1x1x3_100e_kinetics400_rgb/tsn_r50_video_320p_1x1x3_100e_kinetics400_rgb_20201014-5ae1ee79.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_1x1x8_100e_kinetics700_rgb/tsn_r50_video_1x1x8_100e_kinetics700_rgb_20201015-e381a6c7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_1x1x8_100e_kinetics600_rgb/tsn_r50_video_1x1x8_100e_kinetics600_rgb_20201015-4db3c461.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_1x1x8_100e_kinetics400_rgb/tsn_r50_video_1x1x8_100e_kinetics400_rgb_20200702-568cde33.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_1x1x8_100e_diving48_rgb/tsn_r50_video_1x1x8_100e_diving48_rgb_20210426-6dde0185.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_1x1x16_100e_diving48_rgb/tsn_r50_video_1x1x16_100e_diving48_rgb_20210426-63c5f2f7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_dense_1x1x8_100e_kinetics400_rgb/tsn_r50_dense_1x1x8_100e_kinetics400_rgb_20200606-e925e6e3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_dense_1x1x5_100e_kinetics400_rgb/tsn_r50_dense_1x1x5_100e_kinetics400_rgb_20200627-a063165f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x8_50e_activitynet_video_rgb/tsn_r50_320p_1x1x8_50e_activitynet_video_rgb_20210301-7f8da0c6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x8_50e_activitynet_clip_rgb/tsn_r50_320p_1x1x8_50e_activitynet_clip_rgb_20210301-c0f04a7e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x8_150e_activitynet_video_flow/tsn_r50_320p_1x1x8_150e_activitynet_video_flow_20200804-13313f52.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x8_150e_activitynet_clip_flow/tsn_r50_320p_1x1x8_150e_activitynet_clip_flow_20200804-8622cf38.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x8_110e_kinetics400_flow/tsn_r50_320p_1x1x8_110e_kinetics400_flow_20200705-1f39486b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x8_100e_kinetics400_rgb/tsn_r50_320p_1x1x8_100e_kinetics400_rgb_20200702-ef80e3d7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x3_110e_kinetics400_flow/tsn_r50_320p_1x1x3_110e_kinetics400_flow_20200705-3036bab6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x3_100e_kinetics400_rgb/tsn_r50_320p_1x1x3_100e_kinetics400_rgb_20200702-cc665e2a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | 
https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_256p_1x1x8_100e_kinetics400_rgb/tsn_r50_256p_1x1x8_100e_kinetics400_rgb_20200817-883baf16.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_256p_1x1x3_100e_kinetics400_rgb/tsn_r50_256p_1x1x3_100e_kinetics400_rgb_20200725-22592236.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x8_50e_sthv2_rgb/tsn_r50_1x1x8_50e_sthv2_rgb_20200915-f3b381a5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x8_50e_sthv1_rgb/tsn_r50_1x1x8_50e_sthv1_rgb_20200618-061b9195.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x8_50e_hmdb51_mit_rgb/tsn_r50_1x1x8_50e_hmdb51_mit_rgb_20201123-01526d41.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x8_50e_hmdb51_kinetics400_rgb/tsn_r50_1x1x8_50e_hmdb51_kinetics400_rgb_20201123-7f84701b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x8_50e_hmdb51_imagenet_rgb/tsn_r50_1x1x8_50e_hmdb51_imagenet_rgb_20201123-ce6c27ed.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x6_100e_mit_rgb/tsn_r50_1x1x6_100e_mit_rgb_20200618-d512ab1b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x3_75e_ucf101_rgb/tsn_r50_1x1x3_75e_ucf101_rgb_20201023-d85ab600.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x3_100e_kinetics400_rgb/tsn_r50_1x1x3_100e_kinetics400_rgb_20200614-e508be42.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x16_50e_sthv2_rgb/tsn_r50_1x1x16_50e_sthv2_rgb_20200917-80bc3611.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x16_50e_sthv1_rgb/tsn_r50_1x1x16_50e_sthv1_rgb_20200614-7e2fe4f1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r101_1x1x5_50e_mmit_rgb/tsn_r101_1x1x5_50e_mmit_rgb_20200618-642f450d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/omni/tsn_imagenet_pretrained_r50_omni_1x1x3_kinetics400_rgb_20200926-54192355.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/omni/tsn_1G1B_pretrained_r50_without_omni_1x1x3_kinetics400_rgb_20200926-c133dd49.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/omni/tsn_1G1B_pretrained_r50_omni_1x1x3_kinetics400_rgb_20200926-2863fed0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/hvu/scene/tsn_r18_1x1x8_100e_hvu_scene_rgb_20201027-00e5748d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/hvu/object/tsn_r18_1x1x8_100e_hvu_object_rgb_20201102-24a22f30.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/hvu/event/tsn_r18_1x1x8_100e_hvu_event_rgb_20201027-dea8cd71.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/hvu/concept/tsn_r18_1x1x8_100e_hvu_concept_rgb_20201027-fc1dd8e3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/hvu/attribute/tsn_r18_1x1x8_100e_hvu_attribute_rgb_20201027-0b3b49d2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/hvu/action/tsn_r18_1x1x8_100e_hvu_action_rgb_20201027-011b282b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/custom_backbones/tsn_swin_transformer_video_320p_1x1x3_100e_kinetics400_rgb/tsn_swin_transformer_video_320p_1x1x3_100e_kinetics400_rgb-805380f6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/custom_backbones/tsn_rn101_32x4d_320p_1x1x3_100e_kinetics400_rgb-16a8b561.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/custom_backbones/tsn_dense161_320p_1x1x3_100e_kinetics400_rgb/tsn_dense161_320p_1x1x3_100e_kinetics400_rgb-cbe85332.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_dense_1x1x8_100e_kinetics400_rgb/tsn_r50_video_2d_1x1x8_dense_100e_kinetics400_rgb.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_320p_1x1x3_100e_kinetics400_rgb/tsn_r50_video_320p_1x1x3_100e_kinetics400_rgb_20201014.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_1x1x8_100e_kinetics700_rgb/tsn_r50_video_1x1x8_100e_kinetics700_rgb_20201015.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_1x1x8_100e_kinetics600_rgb/tsn_r50_video_1x1x8_100e_kinetics600_rgb_20201015.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | 
https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_1x1x8_100e_kinetics400_rgb/tsn_r50_video_2d_1x1x8_100e_kinetics400_rgb.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_1x1x8_100e_diving48_rgb/20210426_014138.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_1x1x16_100e_diving48_rgb/20210426_014103.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_dense_1x1x8_100e_kinetics400_rgb/20200606_003901.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_dense_1x1x5_100e_kinetics400_rgb/20200627_105310.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x8_50e_activitynet_video_rgb/20210228_223327.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x8_50e_activitynet_clip_rgb/20210217_181313.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x8_150e_activitynet_video_flow/tsn_r50_320p_1x1x8_150e_activitynet_video_flow_20200804.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x8_150e_activitynet_clip_flow/tsn_r50_320p_1x1x8_150e_activitynet_clip_flow_20200804.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x8_110e_kinetics400_flow/tsn_r50_f8_kinetics400_flow_shortedge_57.8_81.0.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x8_100e_kinetics400_rgb/tsn_r50_f8_kinetics400_shortedge_72.4_90.6.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x3_110e_kinetics400_flow/tsn_r50_f3_kinetics400_flow_shortedge_55.7_79.9.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x3_100e_kinetics400_rgb/tsn_r50_f3_kinetics400_shortedge_70.9_89.5.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_256p_1x1x8_100e_kinetics400_rgb/20200815_173413.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_256p_1x1x3_100e_kinetics400_rgb/20200725_031325.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | 
https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x8_50e_sthv2_rgb/20200915_114139.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x8_50e_sthv1_rgb/tsn_sthv1.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x8_50e_hmdb51_mit_rgb/20201112_170135.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x8_50e_hmdb51_kinetics400_rgb/20201108_190805.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x8_50e_hmdb51_imagenet_rgb/20201025_231108.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x6_100e_mit_rgb/tsn_mit.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x3_75e_ucf101_rgb/tsn_r50_1x1x3_75e_ucf101_rgb_20201023.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x3_100e_kinetics400_rgb/20200614_063526.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x16_50e_sthv2_rgb/20200917_105855.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x16_50e_sthv1_rgb/20200614_211932.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r101_1x1x5_50e_mmit_rgb/tsn_mmit.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/hvu/scene/tsn_r18_1x1x8_100e_hvu_scene_rgb_20201027.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/hvu/object/tsn_r18_1x1x8_100e_hvu_object_rgb_20201027.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/hvu/event/tsn_r18_1x1x8_100e_hvu_event_rgb_20201027.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/hvu/concept/tsn_r18_1x1x8_100e_hvu_concept_rgb_20201027.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/hvu/attribute/tsn_r18_1x1x8_100e_hvu_attribute_rgb_20201027.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/hvu/action/tsn_r18_1x1x8_100e_hvu_action_rgb_20201027.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | 
https://download.openmmlab.com/mmaction/recognition/tsn/custom_backbones/tsn_swin_transformer_video_320p_1x1x3_100e_kinetics400_rgb/tsn_swin_transformer_video_320p_1x1x3_100e_kinetics400_rgb.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/custom_backbones/tsn_rn101_32x4d_320p_1x1x3_100e_kinetics400_rgb.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/custom_backbones/tsn_dense161_320p_1x1x3_100e_kinetics400_rgb/tsn_dense161_320p_1x1x3_100e_kinetics400_rgb.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_dense_1x1x8_100e_kinetics400_rgb/tsn_r50_video_2d_1x1x8_dense_100e_kinetics400_rgb.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_320p_1x1x3_100e_kinetics400_rgb/tsn_r50_video_320p_1x1x3_100e_kinetics400_rgb_20201014.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_1x1x8_100e_kinetics700_rgb/tsn_r50_video_1x1x8_100e_kinetics700_rgb_20201015.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_1x1x8_100e_kinetics600_rgb/tsn_r50_video_1x1x8_100e_kinetics600_rgb_20201015.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_1x1x8_100e_kinetics400_rgb/tsn_r50_video_2d_1x1x8_100e_kinetics400_rgb.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_1x1x8_100e_diving48_rgb/20210426_014138.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_1x1x16_100e_diving48_rgb/20210426_014103.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_dense_1x1x8_100e_kinetics400_rgb/20200606_003901.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_dense_1x1x5_100e_kinetics400_rgb/20200627_105310.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x8_50e_activitynet_video_rgb/20210228_223327.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x8_50e_activitynet_clip_rgb/20210217_181313.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x8_150e_activitynet_video_flow/tsn_r50_320p_1x1x8_150e_activitynet_video_flow_20200804.json | log地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x8_150e_activitynet_clip_flow/tsn_r50_320p_1x1x8_150e_activitynet_clip_flow_20200804.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x8_110e_kinetics400_flow/tsn_r50_f8_kinetics400_flow_shortedge_57.8_81.0.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x8_100e_kinetics400_rgb/tsn_r50_f8_kinetics400_shortedge_72.4_90.6.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x3_110e_kinetics400_flow/tsn_r50_f3_kinetics400_flow_shortedge_55.7_79.9.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x3_100e_kinetics400_rgb/tsn_r50_f3_kinetics400_shortedge_70.9_89.5.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_256p_1x1x8_100e_kinetics400_rgb/20200815_173413.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_256p_1x1x3_100e_kinetics400_rgb/20200725_031325.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x8_50e_sthv2_rgb/20200915_114139.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x8_50e_sthv1_rgb/tsn_r50_f8_sthv1_18.1_45.0.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x8_50e_hmdb51_mit_rgb/20201112_170135.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x8_50e_hmdb51_kinetics400_rgb/20201108_190805.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x8_50e_hmdb51_imagenet_rgb/20201025_231108.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x6_100e_mit_rgb/tsn_r50_f6_mit_26.8_51.6.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x3_75e_ucf101_rgb/tsn_r50_1x1x3_75e_ucf101_rgb_20201023.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x3_100e_kinetics400_rgb/20200614_063526.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | 
https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x16_50e_sthv2_rgb/20200917_105855.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x16_50e_sthv1_rgb/20200614_211932.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r101_1x1x5_50e_mmit_rgb/tsn_r101_f6_mmit_61.1.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/hvu/scene/tsn_r18_1x1x8_100e_hvu_scene_rgb_20201027.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/hvu/object/tsn_r18_1x1x8_100e_hvu_object_rgb_20201027.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/hvu/event/tsn_r18_1x1x8_100e_hvu_event_rgb_20201027.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/hvu/concept/tsn_r18_1x1x8_100e_hvu_concept_rgb_20201027.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/hvu/attribute/tsn_r18_1x1x8_100e_hvu_attribute_rgb_20201027.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/hvu/action/tsn_r18_1x1x8_100e_hvu_action_rgb_20201027.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/custom_backbones/tsn_swin_transformer_video_320p_1x1x3_100e_kinetics400_rgb/tsn_swin_transformer_video_320p_1x1x3_100e_kinetics400_rgb.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/custom_backbones/tsn_rn101_32x4d_320p_1x1x3_100e_kinetics400_rgb.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/custom_backbones/tsn_dense161_320p_1x1x3_100e_kinetics400_rgb/tsn_dense161_320p_1x1x3_100e_kinetics400_rgb.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/tsn_r50_1x1x8_50e_hmdb51_kinetics400_rgb.py | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_256p_1x1x8_100e_kinetics400_rgb/tsn_r50_256p_1x1x8_100e_kinetics400_rgb_20200817-883baf16.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/tsn_r50_1x1x8_50e_hmdb51_mit_rgb.py | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x6_100e_mit_rgb/tsn_r50_1x1x6_100e_mit_rgb_20200618-d512ab1b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/tsn_r50_320p_1x1x8_150e_activitynet_clip_flow.py | https://download.openmmlab.com/mmaction/recognition/tsn/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/tsn_r50_320p_1x1x8_150e_activitynet_video_flow.py | https://download.openmmlab.com/mmaction/recognition/tsn/ | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/tsn_r50_320p_1x1x8_50e_activitynet_clip_rgb.py | https://download.openmmlab.com/mmaction/recognition/tsn/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/tsn/tsn_r50_320p_1x1x8_50e_activitynet_video_rgb.py | https://download.openmmlab.com/mmaction/recognition/tsn/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/x3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/x3d/facebook/x3d_s_facebook_13x6x1_kinetics400_rgb_20201027-623825a0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition/x3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/x3d/facebook/x3d_m_facebook_16x5x1_kinetics400_rgb_20201027-3f42382a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition_audio/resnet/metafile.yml | https://download.openmmlab.com/mmaction/recognition/audio_recognition/tsn_r18_64x1x1_100e_kinetics400_audio_feature/tsn_r18_64x1x1_100e_kinetics400_audio_feature_20201012-bf34df6c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition_audio/resnet/metafile.yml | https://download.openmmlab.com/mmaction/recognition/audio_recognition/tsn_r18_64x1x1_100e_kinetics400_audio_feature/20201010_144630.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/configs/recognition_audio/resnet/metafile.yml | https://download.openmmlab.com/mmaction/recognition/audio_recognition/tsn_r18_64x1x1_100e_kinetics400_audio_feature/20201010_144630.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/docker/Dockerfile | https://download.openmmlab.com/mmcv/dist/index.html | mmcv下载地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/DockerFile | https://download.openmmlab.com/mmcv/dist/index.html | mmcv下载地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/tools/analysis/report_map.py | https://download.openmmlab.com/ | 下载地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/tools/data/activitynet/download_annotations.sh | http://ec2-52-25-205-214.us-west-2.compute.amazonaws.com/files/activity_net.v1-3.min.json | 模型配置参数 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/tools/data/activitynet/download_feature_annotations.sh | https://raw.githubusercontent.com/wzmsltw/BSN-boundary-sensitive-network/master/data/activitynet_annotations/video_info_new.csv | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/tools/data/activitynet/download_feature_annotations.sh | https://raw.githubusercontent.com/wzmsltw/BSN-boundary-sensitive-network/master/data/activitynet_annotations/anet_anno_action.json | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/tools/data/activitynet/download_feature_annotations.sh | https://download.openmmlab.com/mmaction/localization/anet_activity_indexes_val.txt | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/tools/data/ava/download_videos.sh | https://s3.amazonaws.com/ava-dataset/annotations/ava_file_names_trainval_v2.1.txt | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/tools/data/ava/download_videos.sh | https://s3.amazonaws.com/ava-dataset/trainval/${vid} | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/tools/data/ava/download_videos_gnu_parallel.sh | https://s3.amazonaws.com/ava-dataset/annotations/ava_file_names_trainval_v2.1.txt | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/tools/data/ava/download_videos_gnu_parallel.sh | https://s3.amazonaws.com/ava-dataset/trainval/ | 数据集链接 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/tools/data/ava/download_videos_parallel.py | https://s3.amazonaws.com/ava-dataset/trainval/ | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/tools/data/ava/download_videos_parallel.sh | https://s3.amazonaws.com/ava-dataset/annotations/ava_file_names_trainval_v2.1.txt | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/tools/data/ava/fetch_ava_proposals.sh | https://download.openmmlab.com/mmaction/dataset/ava/ava_dense_proposals_val.FAIR.recall_93.9.pkl | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/tools/data/ava/fetch_ava_proposals.sh | https://download.openmmlab.com/mmaction/dataset/ava/ava_dense_proposals_train.FAIR.recall_93.9.pkl | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/tools/data/ava/fetch_ava_proposals.sh | https://download.openmmlab.com/mmaction/dataset/ava/ava_dense_proposals_test.FAIR.recall_93.9.pkl | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/tools/data/diving48/download_annotations.sh | http://www.svcl.ucsd.edu/projects/resound/Diving48_vocab.json | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/tools/data/diving48/download_annotations.sh | http://www.svcl.ucsd.edu/projects/resound/Diving48_V2_train.json | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/tools/data/diving48/download_annotations.sh | http://www.svcl.ucsd.edu/projects/resound/Diving48_V2_test.json | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/tools/data/diving48/download_videos.sh | http://www.svcl.ucsd.edu/projects/resound/Diving48_rgb.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/tools/data/hmdb51/download_annotations.sh | http://serre-lab.clps.brown.edu/wp-content/uploads/2013/10/test_train_splits.rar | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/tools/data/hmdb51/download_videos.sh | http://serre-lab.clps.brown.edu/wp-content/uploads/2013/10/hmdb51_org.rar | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/tools/data/kinetics/download_annotations.sh | https://storage.googleapis.com/deepmind-media/Datasets/${DATASET}.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/tools/data/kinetics/download_backup_annotations.sh | https://download.openmmlab.com/mmaction/dataset/${DATASET}/annotations/kinetics_val.csv | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/tools/data/kinetics/download_backup_annotations.sh | https://download.openmmlab.com/mmaction/dataset/${DATASET}/annotations/kinetics_train.csv | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/tools/data/kinetics/download_backup_annotations.sh | https://download.openmmlab.com/mmaction/dataset/${DATASET}/annotations/kinetics_test.csv | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/tools/data/skeleton/download_annotations.sh | https://download.openmmlab.com/mmaction/posec3d/${DATASET}_val.pkl | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/tools/data/skeleton/download_annotations.sh | https://download.openmmlab.com/mmaction/posec3d/${DATASET}_train.pkl | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/tools/data/thumos14/download_annotations.sh | http://crcv.ucf.edu/THUMOS14/Validation_set/TH14_Temporal_annotations_validation.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/tools/data/thumos14/download_annotations.sh | http://crcv.ucf.edu/THUMOS14/test_set/TH14_Temporal_annotations_test.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/tools/data/thumos14/download_videos.sh | 
https://storage.googleapis.com/thumos14_files/TH14_validation_set_mp4.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/tools/data/thumos14/download_videos.sh | https://storage.googleapis.com/thumos14_files/TH14_Test_set_mp4.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/tools/data/thumos14/fetch_tag_proposals.sh | https://open-mmlab.s3.ap-northeast-2.amazonaws.com/mmaction/filelist/thumos14_tag_val_normalized_proposal_list.txt | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/tools/data/thumos14/fetch_tag_proposals.sh | https://open-mmlab.s3.ap-northeast-2.amazonaws.com/mmaction/filelist/thumos14_tag_test_normalized_proposal_list.txt | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/tools/data/ucf101/download_annotations.sh | https://www.crcv.ucf.edu/wp-content/uploads/2019/03/UCF101TrainTestSplits-RecognitionTask.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/C3D/tools/data/ucf101/download_videos.sh | https://www.crcv.ucf.edu/datasets/human-actions/ucf101/UCF101.rar | 数据集地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/video/DeepRemaster/public_address_statement.md b/PyTorch/contrib/cv/video/DeepRemaster/public_address_statement.md index a54f8a29c95eb32e42673737aac92ae00ab6a351..29b8f1ed947c9bfe233f0b8bf909769789b5ced9 100644 --- a/PyTorch/contrib/cv/video/DeepRemaster/public_address_statement.md +++ b/PyTorch/contrib/cv/video/DeepRemaster/public_address_statement.md @@ -1,7 +1,3 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|--------|---------|------------------------|--------| -| 开发引入 | / | DeepRemaster/remaster.py | http://creativecommons.org/licenses/by-nc-sa/4.0/ | license地址 | -| 开发引入 | / | DeepRemaster/remaster.py | https://esslab.jp/~ess/ | 相关说明 | -| 开发引入 | / | DeepRemaster/download_model.sh | http://iizuka.cs.tsukuba.ac.jp/data/remasternet.pth.tar | 预训练模型 | -| 开发引入 | / | DeepRemaster/remaster.py | http://iizuka.cs.tsukuba.ac.jp/index_eng.html | 相关说明 | -| 开发引入 | / | DeepRemaster/example/测试下载链接.txt | https://pan.baidu.com/s/112FiG-RE0dCfQ4RZoSi9pg | 预训练模型 | +| 文件位置 | 公网地址 | 公网地址用途 | +|--------------------------------------------------------------------------|---------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/DeepRemaster/download_model.sh | http://iizuka.cs.tsukuba.ac.jp/data/remasternet.pth.tar | 权重地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/video/R(2+1)D/public_address_statement.md b/PyTorch/contrib/cv/video/R(2+1)D/public_address_statement.md index e919216c8b3cc226251b9fc7861b023046396e72..cbcbd79e9221b8a9741e51e5f86e83457cc3312e 100644 --- a/PyTorch/contrib/cv/video/R(2+1)D/public_address_statement.md +++ b/PyTorch/contrib/cv/video/R(2+1)D/public_address_statement.md @@ -1,381 +1,621 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ------ | -------------- | -------- | --------------------- |---------------- | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/detection/acrn/metafile.yml | https://download.openmmlab.com/mmaction/detection/acrn/slowfast_acrn_kinetics_pretrained_r50_8x8x1_cosine_10e_ava_rgb/slowfast_acrn_kinetics_pretrained_r50_8x8x1_cosine_10e_ava_rgb.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/detection/acrn/metafile.yml | 
https://download.openmmlab.com/mmaction/detection/acrn/slowfast_acrn_kinetics_pretrained_r50_8x8x1_cosine_10e_ava_rgb/slowfast_acrn_kinetics_pretrained_r50_8x8x1_cosine_10e_ava_rgb.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/detection/acrn/metafile.yml | https://download.openmmlab.com/mmaction/detection/acrn/slowfast_acrn_kinetics_pretrained_r50_8x8x1_cosine_10e_ava_rgb/slowfast_acrn_kinetics_pretrained_r50_8x8x1_cosine_10e_ava_rgb-49b07bf2.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/detection/acrn/metafile.yml | https://download.openmmlab.com/mmaction/detection/acrn/slowfast_acrn_kinetics_pretrained_r50_8x8x1_cosine_10e_ava22_rgb/slowfast_acrn_kinetics_pretrained_r50_8x8x1_cosine_10e_ava22_rgb.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/detection/acrn/metafile.yml | https://download.openmmlab.com/mmaction/detection/acrn/slowfast_acrn_kinetics_pretrained_r50_8x8x1_cosine_10e_ava22_rgb/slowfast_acrn_kinetics_pretrained_r50_8x8x1_cosine_10e_ava22_rgb.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/detection/acrn/metafile.yml | https://download.openmmlab.com/mmaction/detection/acrn/slowfast_acrn_kinetics_pretrained_r50_8x8x1_cosine_10e_ava22_rgb/slowfast_acrn_kinetics_pretrained_r50_8x8x1_cosine_10e_ava22_rgb-2be32625.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowonly_kinetics_pretrained_r50_4x16x1_20e_ava_rgb/slowonly_kinetics_pretrained_r50_4x16x1_20e_ava_rgb_20201127.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowonly_kinetics_pretrained_r50_4x16x1_20e_ava_rgb/slowonly_kinetics_pretrained_r50_4x16x1_20e_ava_rgb_20201127.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowonly_kinetics_pretrained_r50_4x16x1_20e_ava_rgb/slowonly_kinetics_pretrained_r50_4x16x1_20e_ava_rgb_20201217-40061d5f.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowonly_omnisource_pretrained_r50_4x16x1_20e_ava_rgb/slowonly_omnisource_pretrained_r50_4x16x1_20e_ava_rgb_20201127.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowonly_omnisource_pretrained_r50_4x16x1_20e_ava_rgb/slowonly_omnisource_pretrained_r50_4x16x1_20e_ava_rgb_20201127.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowonly_omnisource_pretrained_r50_4x16x1_20e_ava_rgb/slowonly_omnisource_pretrained_r50_4x16x1_20e_ava_rgb_20201217-0c6d2e98.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowonly_nl_kinetics_pretrained_r50_4x16x1_10e_ava_rgb/20210316_122517.log.json | 源码模型训练日志 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowonly_nl_kinetics_pretrained_r50_4x16x1_10e_ava_rgb/20210316_122517.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowonly_nl_kinetics_pretrained_r50_4x16x1_10e_ava_rgb/slowonly_nl_kinetics_pretrained_r50_4x16x1_10e_ava_rgb_20210316-959829ec.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowonly_nl_kinetics_pretrained_r50_8x8x1_10e_ava_rgb/20210316_122517.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowonly_nl_kinetics_pretrained_r50_8x8x1_10e_ava_rgb/20210316_122517.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowonly_nl_kinetics_pretrained_r50_8x8x1_10e_ava_rgb/slowonly_nl_kinetics_pretrained_r50_8x8x1_10e_ava_rgb_20210316-5742e4dd.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowonly_omnisource_pretrained_r101_8x8x1_20e_ava_rgb/slowonly_omnisource_pretrained_r101_8x8x1_20e_ava_rgb_20201127.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowonly_omnisource_pretrained_r101_8x8x1_20e_ava_rgb/slowonly_omnisource_pretrained_r101_8x8x1_20e_ava_rgb_20201127.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowonly_omnisource_pretrained_r101_8x8x1_20e_ava_rgb/slowonly_omnisource_pretrained_r101_8x8x1_20e_ava_rgb_20201217-16378594.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowfast_kinetics_pretrained_r50_4x16x1_20e_ava_rgb/slowfast_kinetics_pretrained_r50_4x16x1_20e_ava_rgb_20201217.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowfast_kinetics_pretrained_r50_4x16x1_20e_ava_rgb/slowfast_kinetics_pretrained_r50_4x16x1_20e_ava_rgb_20201217.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowfast_kinetics_pretrained_r50_4x16x1_20e_ava_rgb/slowfast_kinetics_pretrained_r50_4x16x1_20e_ava_rgb_20201217-6e7c704d.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowfast_context_kinetics_pretrained_r50_4x16x1_20e_ava_rgb/slowfast_context_kinetics_pretrained_r50_4x16x1_20e_ava_rgb_20201222.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | 
R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowfast_context_kinetics_pretrained_r50_4x16x1_20e_ava_rgb/slowfast_context_kinetics_pretrained_r50_4x16x1_20e_ava_rgb_20201222.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowfast_context_kinetics_pretrained_r50_4x16x1_20e_ava_rgb/slowfast_context_kinetics_pretrained_r50_4x16x1_20e_ava_rgb_20201222-f4d209c9.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowfast_kinetics_pretrained_r50_8x8x1_20e_ava_rgb/slowfast_kinetics_pretrained_r50_8x8x1_20e_ava_rgb_20201217.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowfast_kinetics_pretrained_r50_8x8x1_20e_ava_rgb/slowfast_kinetics_pretrained_r50_8x8x1_20e_ava_rgb_20201217.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowfast_kinetics_pretrained_r50_8x8x1_20e_ava_rgb/slowfast_kinetics_pretrained_r50_8x8x1_20e_ava_rgb_20201217-ae225e97.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowfast_kinetics_pretrained_r50_8x8x1_cosine_10e_ava22_rgb/slowfast_kinetics_pretrained_r50_8x8x1_cosine_10e_ava22_rgb.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowfast_kinetics_pretrained_r50_8x8x1_cosine_10e_ava22_rgb/slowfast_kinetics_pretrained_r50_8x8x1_cosine_10e_ava22_rgb.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowfast_kinetics_pretrained_r50_8x8x1_cosine_10e_ava22_rgb/slowfast_kinetics_pretrained_r50_8x8x1_cosine_10e_ava22_rgb-b987b516.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowfast_temporal_max_focal_alpha3_gamma1_kinetics_pretrained_r50_8x8x1_cosine_10e_ava22_rgb/slowfast_temporal_max_focal_alpha3_gamma1_kinetics_pretrained_r50_8x8x1_cosine_10e_ava22_rgb.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowfast_temporal_max_focal_alpha3_gamma1_kinetics_pretrained_r50_8x8x1_cosine_10e_ava22_rgb/slowfast_temporal_max_focal_alpha3_gamma1_kinetics_pretrained_r50_8x8x1_cosine_10e_ava22_rgb.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowfast_temporal_max_focal_alpha3_gamma1_kinetics_pretrained_r50_8x8x1_cosine_10e_ava22_rgb/slowfast_temporal_max_focal_alpha3_gamma1_kinetics_pretrained_r50_8x8x1_cosine_10e_ava22_rgb-345618cd.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | 
R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowfast_temporal_max_kinetics_pretrained_r50_8x8x1_cosine_10e_ava22_rgb/slowfast_temporal_max_kinetics_pretrained_r50_8x8x1_cosine_10e_ava22_rgb.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowfast_temporal_max_kinetics_pretrained_r50_8x8x1_cosine_10e_ava22_rgb/slowfast_temporal_max_kinetics_pretrained_r50_8x8x1_cosine_10e_ava22_rgb.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowfast_temporal_max_kinetics_pretrained_r50_8x8x1_cosine_10e_ava22_rgb/slowfast_temporal_max_kinetics_pretrained_r50_8x8x1_cosine_10e_ava22_rgb-874e0845.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/detection/lfb/metafile.yml | https://download.openmmlab.com/mmaction/detection/lfb/lfb_nl_kinetics_pretrained_slowonly_r50_4x16x1_20e_ava_rgb/20210224_125052.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/detection/lfb/metafile.yml | https://download.openmmlab.com/mmaction/detection/lfb/lfb_nl_kinetics_pretrained_slowonly_r50_4x16x1_20e_ava_rgb/20210224_125052.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/detection/lfb/metafile.yml | https://download.openmmlab.com/mmaction/detection/lfb/lfb_nl_kinetics_pretrained_slowonly_r50_4x16x1_20e_ava_rgb/lfb_nl_kinetics_pretrained_slowonly_r50_4x16x1_20e_ava_rgb_20210224-2ae136d9.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/detection/lfb/metafile.yml | https://download.openmmlab.com/mmaction/detection/lfb/lfb_avg_kinetics_pretrained_slowonly_r50_4x16x1_20e_ava_rgb/20210301_124812.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/detection/lfb/metafile.yml | https://download.openmmlab.com/mmaction/detection/lfb/lfb_avg_kinetics_pretrained_slowonly_r50_4x16x1_20e_ava_rgb/20210301_124812.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/detection/lfb/metafile.yml | https://download.openmmlab.com/mmaction/detection/lfb/lfb_avg_kinetics_pretrained_slowonly_r50_4x16x1_20e_ava_rgb/lfb_avg_kinetics_pretrained_slowonly_r50_4x16x1_20e_ava_rgb_20210301-19c330b7.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/detection/lfb/metafile.yml | https://download.openmmlab.com/mmaction/detection/lfb/lfb_max_kinetics_pretrained_slowonly_r50_4x16x1_20e_ava_rgb/20210301_124812.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/detection/lfb/metafile.yml | https://download.openmmlab.com/mmaction/detection/lfb/lfb_max_kinetics_pretrained_slowonly_r50_4x16x1_20e_ava_rgb/20210301_124812.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/detection/lfb/metafile.yml | https://download.openmmlab.com/mmaction/detection/lfb/lfb_max_kinetics_pretrained_slowonly_r50_4x16x1_20e_ava_rgb/lfb_max_kinetics_pretrained_slowonly_r50_4x16x1_20e_ava_rgb_20210301-37efcd15.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/localization/bmn/metafile.yml | 
https://download.openmmlab.com/mmaction/localization/bmn/bmn_400x100_9e_activitynet_feature/bmn_400x100_9e_activitynet_feature.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/localization/bmn/metafile.yml | https://download.openmmlab.com/mmaction/localization/bmn/bmn_400x100_9e_activitynet_feature/bmn_400x100_9e_activitynet_feature.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/localization/bmn/metafile.yml | https://download.openmmlab.com/mmaction/localization/bmn/bmn_400x100_9e_activitynet_feature/bmn_400x100_9e_activitynet_feature_20200619-42a3b111.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/localization/bmn/metafile.yml | https://download.openmmlab.com/mmaction/localization/bmn/bmn_400x100_2x8_9e_mmaction_video/bmn_400x100_2x8_9e_mmaction_video_20200809.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/localization/bmn/metafile.yml | https://download.openmmlab.com/mmaction/localization/bmn/bmn_400x100_2x8_9e_mmaction_video/bmn_400x100_2x8_9e_mmaction_video_20200809.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/localization/bmn/metafile.yml | https://download.openmmlab.com/mmaction/localization/bmn/bmn_400x100_2x8_9e_mmaction_video/bmn_400x100_2x8_9e_mmaction_video_20200809-c9fd14d2.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/localization/bmn/metafile.yml | https://download.openmmlab.com/mmaction/localization/bmn/bmn_400x100_2x8_9e_mmaction_clip/bmn_400x100_2x8_9e_mmaction_clip_20200809.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/localization/bmn/metafile.yml | https://download.openmmlab.com/mmaction/localization/bmn/bmn_400x100_2x8_9e_mmaction_clip/bmn_400x100_2x8_9e_mmaction_clip_20200809.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/localization/bmn/metafile.yml | https://download.openmmlab.com/mmaction/localization/bmn/bmn_400x100_2x8_9e_mmaction_clip/bmn_400x100_2x8_9e_mmaction_clip_20200809-10d803ce.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/localization/bsn/metafile.yml | https://download.openmmlab.com/mmaction/localization/bsn/bsn_tem_400x100_1x16_20e_activitynet_feature/bsn_tem_400x100_1x16_20e_activitynet_feature.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/localization/bsn/metafile.yml | https://download.openmmlab.com/mmaction/localization/bsn/bsn_pem_400x100_1x16_20e_activitynet_feature/bsn_pem_400x100_1x16_20e_activitynet_feature.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/localization/bsn/metafile.yml | https://download.openmmlab.com/mmaction/localization/bsn/bsn_tem_400x100_1x16_20e_activitynet_feature/bsn_tem_400x100_1x16_20e_activitynet_feature.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/localization/bsn/metafile.yml | https://download.openmmlab.com/mmaction/localization/bsn/bsn_pem_400x100_1x16_20e_activitynet_feature/bsn_pem_400x100_1x16_20e_activitynet_feature.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/localization/bsn/metafile.yml | 
https://download.openmmlab.com/mmaction/localization/bsn/bsn_tem_400x100_1x16_20e_activitynet_feature/bsn_tem_400x100_1x16_20e_activitynet_feature_20200619-cd6accc3.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/localization/bsn/metafile.yml | https://download.openmmlab.com/mmaction/localization/bsn/bsn_pem_400x100_1x16_20e_activitynet_feature/bsn_pem_400x100_1x16_20e_activitynet_feature_20210203-1c27763d.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/localization/bsn/metafile.yml | https://download.openmmlab.com/mmaction/localization/bsn/bsn_tem_400x100_1x16_20e_mmaction_video/bsn_tem_400x100_1x16_20e_mmaction_video_20200809.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/localization/bsn/metafile.yml | https://download.openmmlab.com/mmaction/localization/bsn/bsn_pem_400x100_1x16_20e_mmaction_video/bsn_pem_400x100_1x16_20e_mmaction_video_20200809.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/localization/bsn/metafile.yml | https://download.openmmlab.com/mmaction/localization/bsn/bsn_tem_400x100_1x16_20e_mmaction_video/bsn_tem_400x100_1x16_20e_mmaction_video_20200809.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/localization/bsn/metafile.yml | https://download.openmmlab.com/mmaction/localization/bsn/bsn_pem_400x100_1x16_20e_mmaction_video/bsn_pem_400x100_1x16_20e_mmaction_video_20200809.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/localization/bsn/metafile.yml | https://download.openmmlab.com/mmaction/localization/bsn/bsn_tem_400x100_1x16_20e_mmaction_video/bsn_tem_400x100_1x16_20e_mmaction_video_20200809-ad6ec626.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/localization/bsn/metafile.yml | https://download.openmmlab.com/mmaction/localization/bsn/bsn_pem_400x100_1x16_20e_mmaction_video/bsn_pem_400x100_1x16_20e_mmaction_video_20200809-aa861b26.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/localization/bsn/metafile.yml | https://download.openmmlab.com/mmaction/localization/bsn/bsn_tem_400x100_1x16_20e_mmaction_clip/bsn_tem_400x100_1x16_20e_mmaction_clip_20200809.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/localization/bsn/metafile.yml | https://download.openmmlab.com/mmaction/localization/bsn/bsn_pem_400x100_1x16_20e_mmaction_clip/bsn_pem_400x100_1x16_20e_mmaction_clip_20200809.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/localization/bsn/metafile.yml | https://download.openmmlab.com/mmaction/localization/bsn/bsn_tem_400x100_1x16_20e_mmaction_clip/bsn_tem_400x100_1x16_20e_mmaction_clip_20200809.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/localization/bsn/metafile.yml | https://download.openmmlab.com/mmaction/localization/bsn/bsn_pem_400x100_1x16_20e_mmaction_clip/bsn_pem_400x100_1x16_20e_mmaction_clip_20200809.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/localization/bsn/metafile.yml | https://download.openmmlab.com/mmaction/localization/bsn/bsn_tem_400x100_1x16_20e_mmaction_clip/bsn_tem_400x100_1x16_20e_mmaction_clip_20200809-0a563554.pth | 源码模型训练权重文件 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/localization/bsn/metafile.yml | https://download.openmmlab.com/mmaction/localization/bsn/bsn_pem_400x100_1x16_20e_mmaction_clip/bsn_pem_400x100_1x16_20e_mmaction_clip_20200809-e32f61e6.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/localization/ssn/metafile.yml | https://download.openmmlab.com/mmaction/localization/ssn/ssn_r50_450e_thumos14_rgb/20201005_144656.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/localization/ssn/metafile.yml | https://download.openmmlab.com/mmaction/localization/ssn/ssn_r50_450e_thumos14_rgb/20201005_144656.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/localization/ssn/metafile.yml | https://download.openmmlab.com/mmaction/localization/ssn/ssn_r50_450e_thumos14_rgb/ssn_r50_450e_thumos14_rgb_20201012-1920ab16.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/localization/ssn/metafile.yml | https://github.com/open-mmlab/mmaction/tree/c7e3b7c11fb94131be9b48a8e3d510589addc3ce#Get%20started | 源码模型训练精度参考 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/localization/ssn/metafile.yml | https://github.com/open-mmlab/mmaction/tree/c7e3b7c11fb94131be9b48a8e3d510589addc3ce#Get%20started | 源码模型训练精度参考 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/localization/ssn/metafile.yml | https://github.com/open-mmlab/mmaction/tree/c7e3b7c11fb94131be9b48a8e3d510589addc3ce#Get%20started | 源码模型训练精度参考 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/localization/ssn/metafile.yml | https://download.openmmlab.com/mmaction/localization/ssn/mmaction_reference/ssn_r50_450e_thumos14_rgb_ref/ssn_r50_450e_thumos14_rgb_ref_20201014-b6f48f68.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/localization/ssn/metafile.yml | https://download.openmmlab.com/mmaction/localization/ssn/mmaction_reference/ssn_r50_450e_thumos14_rgb_ref/20201008_103258.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/c3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/c3d/c3d_sports1m_16x1x1_45e_ucf101_rgb/20201021_140429.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/c3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/c3d/c3d_sports1m_16x1x1_45e_ucf101_rgb/20201021_140429.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/c3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/c3d/c3d_sports1m_16x1x1_45e_ucf101_rgb/c3d_sports1m_16x1x1_45e_ucf101_rgb_20201021-26655025.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/csn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/csn/ircsn_ig65m_pretrained_r152_32x2x1_58e_kinetics400_rgb/20200728_031952.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/csn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/csn/ircsn_ig65m_pretrained_r152_32x2x1_58e_kinetics400_rgb/20200728_031952.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | 
R(2+1)D/configs/recognition/csn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/csn/ircsn_ig65m_pretrained_r152_32x2x1_58e_kinetics400_rgb/ircsn_ig65m_pretrained_r152_32x2x1_58e_kinetics400_rgb_20200803-fc66ce8d.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/csn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/csn/ircsn_ig65m_pretrained_bnfrozen_r152_32x2x1_58e_kinetics400_rgb/20200809_053132.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/csn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/csn/ircsn_ig65m_pretrained_bnfrozen_r152_32x2x1_58e_kinetics400_rgb/20200809_053132.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/csn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/csn/ircsn_ig65m_pretrained_bnfrozen_r152_32x2x1_58e_kinetics400_rgb/ircsn_ig65m_pretrained_bnfrozen_r152_32x2x1_58e_kinetics400_rgb_20200812-9037a758.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/csn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/csn/vmz/vmz_ipcsn_from_scratch_r152_32x2x1_180e_kinetics400_rgb_20210617-d565828d.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/csn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/csn/vmz/vmz_ipcsn_ig65m_pretrained_r152_32x2x1_58e_kinetics400_rgb_20210617-c3be9793.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/csn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/csn/vmz/vmz_ipcsn_sports1m_pretrained_r152_32x2x1_58e_kinetics400_rgb_20210617-3367437a.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/csn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/csn/vmz/vmz_ircsn_from_scratch_r152_32x2x1_180e_kinetics400_rgb_20210617-5c933ae1.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/csn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/csn/vmz/vmz_ircsn_ig65m_pretrained_r50_32x2x1_58e_kinetics400_rgb_20210617-86d33018.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/csn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/csn/vmz/vmz_ircsn_sports1m_pretrained_r152_32x2x1_58e_kinetics400_rgb_20210617-b9b10241.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_32x2x1_100e_kinetics400_rgb/20200614_060456.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_32x2x1_100e_kinetics400_rgb/20200614_060456.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_32x2x1_100e_kinetics400_rgb/i3d_r50_32x2x1_100e_kinetics400_rgb_20200614-c25ef9a4.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | 
R(2+1)D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_256p_32x2x1_100e_kinetics400_rgb/20200725_031555.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_256p_32x2x1_100e_kinetics400_rgb/20200725_031555.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_256p_32x2x1_100e_kinetics400_rgb/i3d_r50_256p_32x2x1_100e_kinetics400_rgb_20200801-7d9f44de.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_video_32x2x1_100e_kinetics400_rgb/20200706_143014.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_video_32x2x1_100e_kinetics400_rgb/20200706_143014.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_video_32x2x1_100e_kinetics400_rgb/i3d_r50_video_32x2x1_100e_kinetics400_rgb_20200826-e31c6f52.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_dense_32x2x1_100e_kinetics400_rgb/20200616_230011.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_dense_32x2x1_100e_kinetics400_rgb/20200616_230011.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_dense_32x2x1_100e_kinetics400_rgb/i3d_r50_dense_32x2x1_100e_kinetics400_rgb_20200616-2bbb4361.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_dense_256p_32x2x1_100e_kinetics400_rgb/20200725_031604.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_dense_256p_32x2x1_100e_kinetics400_rgb/20200725_031604.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_dense_256p_32x2x1_100e_kinetics400_rgb/i3d_r50_dense_256p_32x2x1_100e_kinetics400_rgb_20200725-24eb54cc.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_fast_32x2x1_100e_kinetics400_rgb/20200612_233836.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_fast_32x2x1_100e_kinetics400_rgb/20200612_233836.log | 源码模型训练日志 | -| 
开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_fast_32x2x1_100e_kinetics400_rgb/i3d_r50_fast_32x2x1_100e_kinetics400_rgb_20200612-000e4d2a.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_fast_256p_32x2x1_100e_kinetics400_rgb/20200725_031457.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_fast_256p_32x2x1_100e_kinetics400_rgb/20200725_031457.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_fast_256p_32x2x1_100e_kinetics400_rgb/i3d_r50_fast_256p_32x2x1_100e_kinetics400_rgb_20200817-4e90d1d5.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_nl_embedded_gaussian_r50_32x2x1_100e_kinetics400_rgb/20200813_034054.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_nl_embedded_gaussian_r50_32x2x1_100e_kinetics400_rgb/20200813_034054.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_nl_embedded_gaussian_r50_32x2x1_100e_kinetics400_rgb/i3d_nl_embedded_gaussian_r50_32x2x1_100e_kinetics400_rgb_20200813-6e6aef1b.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_nl_gaussian_r50_32x2x1_100e_kinetics400_rgb/20200813_034909.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_nl_gaussian_r50_32x2x1_100e_kinetics400_rgb/20200813_034909.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_nl_gaussian_r50_32x2x1_100e_kinetics400_rgb/i3d_nl_gaussian_r50_32x2x1_100e_kinetics400_rgb_20200815-17f84aa2.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_nl_dot_product_r50_32x2x1_100e_kinetics400_rgb/20200814_044208.log.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_nl_dot_product_r50_32x2x1_100e_kinetics400_rgb/20200814_044208.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_nl_dot_product_r50_32x2x1_100e_kinetics400_rgb/i3d_nl_dot_product_r50_32x2x1_100e_kinetics400_rgb_20200814-7c30d5bb.pth | 源码模型训练权重文件 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/baseline/tsn_r50_1x1x8_100e_minikinetics_rgb_20201030.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/baseline/tsn_r50_1x1x8_100e_minikinetics_rgb_20201030.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/baseline/tsn_r50_1x1x8_100e_minikinetics_rgb_20201030-b4eaf92b.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/googleimage/tsn_r50_1x1x8_100e_minikinetics_googleimage_rgb_20201030.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/googleimage/tsn_r50_1x1x8_100e_minikinetics_googleimage_rgb_20201030.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/googleimage/tsn_r50_1x1x8_100e_minikinetics_googleimage_rgb_20201030-23966b4b.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/webimage/tsn_r50_1x1x8_100e_minikinetics_webimage_rgb_20201030.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/webimage/tsn_r50_1x1x8_100e_minikinetics_webimage_rgb_20201030.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/webimage/tsn_r50_1x1x8_100e_minikinetics_webimage_rgb_20201030-66f5e046.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/insvideo/tsn_r50_1x1x8_100e_minikinetics_insvideo_rgb_20201030.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/insvideo/tsn_r50_1x1x8_100e_minikinetics_insvideo_rgb_20201030.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/omnisource/metafile.yml | 
https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/insvideo/tsn_r50_1x1x8_100e_minikinetics_insvideo_rgb_20201030-011f984d.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/kineticsraw/tsn_r50_1x1x8_100e_minikinetics_kineticsraw_rgb_20201030.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/kineticsraw/tsn_r50_1x1x8_100e_minikinetics_kineticsraw_rgb_20201030.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/kineticsraw/tsn_r50_1x1x8_100e_minikinetics_kineticsraw_rgb_20201030-59f5d064.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/omnisource/tsn_r50_1x1x8_100e_minikinetics_omnisource_rgb_20201030.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/omnisource/tsn_r50_1x1x8_100e_minikinetics_omnisource_rgb_20201030.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/omnisource/tsn_r50_1x1x8_100e_minikinetics_omnisource_rgb_20201030-0f56ef51.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/baseline/slowonly_r50_8x8x1_256e_minikinetics_rgb_20201030.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/baseline/slowonly_r50_8x8x1_256e_minikinetics_rgb_20201030.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/baseline/slowonly_r50_8x8x1_256e_minikinetics_rgb_20201030-168eb098.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/googleimage/slowonly_r50_8x8x1_256e_minikinetics_googleimage_rgb_20201030.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/googleimage/slowonly_r50_8x8x1_256e_minikinetics_googleimage_rgb_20201030.log | 
源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/googleimage/slowonly_r50_8x8x1_256e_minikinetics_googleimage_rgb_20201030-7da6dfc3.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/webimage/slowonly_r50_8x8x1_256e_minikinetics_webimage_rgb_20201030.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/webimage/slowonly_r50_8x8x1_256e_minikinetics_webimage_rgb_20201030.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/webimage/slowonly_r50_8x8x1_256e_minikinetics_webimage_rgb_20201030-c36616e9.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/insvideo/slowonly_r50_8x8x1_256e_minikinetics_insvideo_rgb_20201030.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/insvideo/slowonly_r50_8x8x1_256e_minikinetics_insvideo_rgb_20201030.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/insvideo/slowonly_r50_8x8x1_256e_minikinetics_insvideo_rgb_20201030-e2890e8d.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/kineticsraw/slowonly_r50_8x8x1_256e_minikinetics_kineticsraw_rgb_20201030.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/kineticsraw/slowonly_r50_8x8x1_256e_minikinetics_kineticsraw_rgb_20201030.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/kineticsraw/slowonly_r50_8x8x1_256e_minikinetics_kineticsraw_rgb_20201030-62974bac.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/omnisource/slowonly_r50_8x8x1_256e_minikinetics_omnisource_rgb_20201030.json | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | 
R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/omnisource/slowonly_r50_8x8x1_256e_minikinetics_omnisource_rgb_20201030.log | 源码模型训练日志 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/omnisource/slowonly_r50_8x8x1_256e_minikinetics_omnisource_rgb_20201030-284cfd3b.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/omni/tsn_imagenet_pretrained_r50_omni_1x1x3_kinetics400_rgb_20200926-54192355.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/omni/tsn_1G1B_pretrained_r50_omni_1x1x3_kinetics400_rgb_20200926-2863fed0.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/omni/slowonly_r50_omni_4x16x1_kinetics400_rgb_20200926-51b1f7ea.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tree/main | R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/omni/slowonly_r101_omni_8x8x1_kinetics400_rgb_20200926-b5dbb701.pth | 源码模型训练权重文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_8x8x1_256e_kinetics400_rgb/slowfast_r50_8x8x1_256e_kinetics400_rgb_20200716-73547d2b.pth | 预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | R(2+1)D/configs/detection/acrn/slowfast_acrn_kinetics_pretrained_r50_8x8x1_cosine_10e_ava_rgb.py | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_8x8x1_256e_kinetics400_rgb/slowfast_r50_8x8x1_256e_kinetics400_rgb_20200716-73547d2b.pth | 预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/configs/detection/acrn/slowfast-acrn_kinetics400-pretrained-r50_8xb8-8x8x1-cosine-10e_ava21-rgb.py|R(2+1)D/configs/detection/ava/slowfast_context_kinetics_pretrained_r50_4x16x1_20e_ava_rgb.py | https://download.openmmlab.com/mmaction/recognition/slowfast/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/configs/detection/acrn/slowfast-acrn_kinetics400-pretrained-r50_8xb8-8x8x1-cosine-10e_ava21-rgb.py|R(2+1)D/configs/detection/ava/slowfast_kinetics_pretrained_r50_4x16x1_20e_ava_rgb.py | https://download.openmmlab.com/mmaction/recognition/slowfast/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/configs/detection/acrn/slowfast-acrn_kinetics400-pretrained-r50_8xb8-8x8x1-cosine-10e_ava21-rgb.py|R(2+1)D/configs/detection/ava/slowfast_kinetics_pretrained_r50_4x16x1_20e_ava_rgb_custom_classes.py | https://download.openmmlab.com/mmaction/recognition/slowfast/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/configs/detection/acrn/slowfast-acrn_kinetics400-pretrained-r50_8xb8-8x8x1-cosine-10e_ava21-rgb.py|R(2+1)D/configs/detection/ava/slowfast_kinetics_pretrained_r50_8x8x1_20e_ava_rgb.py | https://download.openmmlab.com/mmaction/recognition/slowfast/ | 
模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | R(2+1)D/configs/detection/ava/slowfast_kinetics_pretrained_r50_8x8x1_cosine_10e_ava22_rgb.py | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_8x8x1_256e_kinetics400_rgb/slowfast_r50_8x8x1_256e_kinetics400_rgb_20200716-73547d2b.pth | 预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | R(2+1)D/configs/detection/ava/slowfast_temporal_max_focal_alpha3_gamma1_kinetics_pretrained_r50_8x8x1_cosine_10e_ava22_rgb.py | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_8x8x1_256e_kinetics400_rgb/slowfast_r50_8x8x1_256e_kinetics400_rgb_20200716-73547d2b.pth | 预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | R(2+1)D/configs/detection/ava/slowfast_temporal_max_kinetics_pretrained_r50_8x8x1_cosine_10e_ava22_rgb.py | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_8x8x1_256e_kinetics400_rgb/slowfast_r50_8x8x1_256e_kinetics400_rgb_20200716-73547d2b.pth | 预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/configs/detection/slowonly/slowonly_kinetics400-pretrained-r101_8xb16-8x8x1-20e_ava21-rgb.py|R(2+1)D/configs/detection/ava/slowonly_kinetics_pretrained_r101_8x8x1_20e_ava_rgb.py | https://download.openmmlab.com/mmaction/recognition/slowonly/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/configs/detection/slowonly/slowonly_kinetics400-pretrained-r101_8xb16-8x8x1-20e_ava21-rgb.py|R(2+1)D/configs/detection/ava/slowonly_kinetics_pretrained_r50_4x16x1_20e_ava_rgb.py | https://download.openmmlab.com/mmaction/recognition/slowonly/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/configs/detection/slowonly/slowonly_kinetics400-pretrained-r101_8xb16-8x8x1-20e_ava21-rgb.py|R(2+1)D/configs/detection/ava/slowonly_nl_kinetics_pretrained_r50_4x16x1_10e_ava_rgb.py | https://download.openmmlab.com/mmaction/recognition/slowonly/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/configs/detection/slowonly/slowonly_kinetics400-pretrained-r101_8xb16-8x8x1-20e_ava21-rgb.py|R(2+1)D/configs/detection/ava/slowonly_kinetics_pretrained_r50_4x16x1_20e_ava_rgb_custom_classes.py | https://download.openmmlab.com/mmaction/recognition/slowonly/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/configs/detection/slowonly/slowonly_kinetics400-pretrained-r101_8xb16-8x8x1-20e_ava21-rgb.py|R(2+1)D/configs/detection/ava/slowonly_nl_kinetics_pretrained_r50_8x8x1_10e_ava_rgb.py | https://download.openmmlab.com/mmaction/recognition/slowonly/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/configs/detection/slowonly/slowonly_kinetics400-pretrained-r101_8xb16-8x8x1-20e_ava21-rgb.py|R(2+1)D/configs/detection/ava/slowonly_omnisource_pretrained_r101_8x8x1_20e_ava_rgb.py | https://download.openmmlab.com/mmaction/recognition/slowonly/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/configs/detection/slowonly/slowonly_kinetics400-pretrained-r101_8xb16-8x8x1-20e_ava21-rgb.py|R(2+1)D/configs/detection/lfb/lfb_avg_kinetics_pretrained_slowonly_r50_4x16x1_20e_ava_rgb.py | https://download.openmmlab.com/mmaction/recognition/slowonly/ | 模型相关说明 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmaction2/blob/main/configs/detection/slowonly/slowonly_kinetics400-pretrained-r101_8xb16-8x8x1-20e_ava21-rgb.py|R(2+1)D/configs/detection/ava/slowonly_omnisource_pretrained_r50_4x16x1_20e_ava_rgb.py | https://download.openmmlab.com/mmaction/recognition/slowonly/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/configs/detection/slowonly/slowonly_kinetics400-pretrained-r101_8xb16-8x8x1-20e_ava21-rgb.py|R(2+1)D/configs/detection/lfb/lfb_max_kinetics_pretrained_slowonly_r50_4x16x1_20e_ava_rgb.py | https://download.openmmlab.com/mmaction/recognition/slowonly/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/configs/detection/slowonly/slowonly_kinetics400-pretrained-r101_8xb16-8x8x1-20e_ava21-rgb.py|R(2+1)D/configs/detection/lfb/lfb_nl_kinetics_pretrained_slowonly_r50_4x16x1_20e_ava_rgb.py | https://download.openmmlab.com/mmaction/recognition/slowonly/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/configs/recognition/csn/ipcsn_ig65m-pretrained-r152-bnfrozen_32x2x1-58e_kinetics400-rgb.py|R(2+1)D/configs/recognition/csn/ipcsn_ig65m_pretrained_bnfrozen_r152_32x2x1_58e_kinetics400_rgb.py | https://download.openmmlab.com/mmaction/recognition/csn/ipcsn_from_scratch_r152_ig65m_20210617-c4b99d38.pth | 预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/configs/recognition/csn/ipcsn_sports1m-pretrained-r152-bnfrozen_32x2x1-58e_kinetics400-rgb.py|R(2+1)D/configs/recognition/csn/ipcsn_sports1m_pretrained_bnfrozen_r152_32x2x1_58e_kinetics400_rgb.py | https://download.openmmlab.com/mmaction/recognition/csn/ipcsn_from_scratch_r152_sports1m_20210617-7a7cc5b9.pth | 预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | R(2+1)D/configs/recognition/csn/ircsn_ig65m_pretrained_bnfrozen_r152_32x2x1_58e_kinetics400_rgb.py | https://download.openmmlab.com/mmaction/recognition/csn/ircsn_from_scratch_r152_ig65m_20200807-771c4135.pth | 预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | R(2+1)D/configs/recognition/csn/ircsn_ig65m_pretrained_bnfrozen_r50_32x2x1_58e_kinetics400_rgb.py | https://download.openmmlab.com/mmaction/recognition/csn/ircsn_from_scratch_r50_ig65m_20210617-ce545a37.pth | 预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | R(2+1)D/configs/recognition/csn/ircsn_ig65m_pretrained_r152_32x2x1_58e_kinetics400_rgb.py | https://download.openmmlab.com/mmaction/recognition/csn/ircsn_from_scratch_r152_ig65m_20200807-771c4135.pth | 预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/configs/recognition/csn/ircsn_sports1m-pretrained-r152-bnfrozen_32x2x1-58e_kinetics400-rgb.py|R(2+1)D/configs/recognition/csn/ircsn_sports1m_pretrained_bnfrozen_r152_32x2x1_58e_kinetics400_rgb.py | https://download.openmmlab.com/mmaction/recognition/csn/ircsn_from_scratch_r152_sports1m_20210617-bcc9c0dd.pth | 预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | R(2+1)D/configs/recognition/r2plus1d/r2plus1d_ucf101_rgb_1p.py | https://download.openmmlab.com/mmaction/recognition/r2plus1d/r2plus1d_r34_256p_8x8x1_180e_kinetics400_rgb/r2plus1d_r34_256p_8x8x1_180e_kinetics400_rgb_20200729-aa94765e.pth | 预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | R(2+1)D/configs/recognition/r2plus1d/r2plus1d_ucf101_rgb_1p_perf.py | 
https://download.openmmlab.com/mmaction/recognition/r2plus1d/r2plus1d_r34_256p_8x8x1_180e_kinetics400_rgb/r2plus1d_r34_256p_8x8x1_180e_kinetics400_rgb_20200729-aa94765e.pth | 预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | R(2+1)D/configs/recognition/r2plus1d/r2plus1d_ucf101_rgb_8p.py | https://download.openmmlab.com/mmaction/recognition/r2plus1d/r2plus1d_r34_256p_8x8x1_180e_kinetics400_rgb/r2plus1d_r34_256p_8x8x1_180e_kinetics400_rgb_20200729-aa94765e.pth | 预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | R(2+1)D/configs/recognition/r2plus1d/r2plus1d_ucf101_rgb_8p_perf.py | https://download.openmmlab.com/mmaction/recognition/r2plus1d/r2plus1d_r34_256p_8x8x1_180e_kinetics400_rgb/r2plus1d_r34_256p_8x8x1_180e_kinetics400_rgb_20200729-aa94765e.pth | 预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | R(2+1)D/configs/recognition/slowfast/slowfast_r50_16x8x1_22e_sthv1_rgb.py | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_8x8x1_256e_kinetics400_rgb/slowfast_r50_8x8x1_256e_kinetics400_rgb_20200716-73547d2b.pth | 预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/configs/detection/slowonly/slowonly_kinetics400-pretrained-r101_8xb16-8x8x1-20e_ava21-rgb.py|R(2+1)D/configs/recognition/slowonly/slowonly_k400_pretrained_r50_4x16x1_120e_gym99_flow.py | https://download.openmmlab.com/mmaction/recognition/slowonly/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | R(2+1)D/configs/recognition/slowonly/slowonly_k400_pretrained_r50_8x4x1_40e_ucf101_rgb.py | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_8x8x1_256e_kinetics400_rgb/slowonly_r50_8x8x1_256e_kinetics400_rgb_20200703-a79c555a.pth | 预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/configs/recognition/timesformer/timesformer_spaceOnly_8xb8-8x32x1-15e_kinetics400-rgb.py|R(2+1)D/configs/recognition/timesformer/timesformer_divST_8x32x1_15e_kinetics400_rgb.py | https://download.openmmlab.com/mmaction/recognition/timesformer/vit_base_patch16_224.pth | 预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/configs/recognition/timesformer/timesformer_spaceOnly_8xb8-8x32x1-15e_kinetics400-rgb.py|R(2+1)D/configs/recognition/timesformer/timesformer_jointST_8x32x1_15e_kinetics400_rgb.py | https://download.openmmlab.com/mmaction/recognition/timesformer/vit_base_patch16_224.pth | 预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/configs/recognition/timesformer/timesformer_spaceOnly_8xb8-8x32x1-15e_kinetics400-rgb.py|R(2+1)D/configs/recognition/timesformer/timesformer_spaceOnly_8x32x1_15e_kinetics400_rgb.py | https://download.openmmlab.com/mmaction/recognition/timesformer/vit_base_patch16_224.pth | 预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/configs/recognition/tin/tin_kinetics400-pretrained-tsm-r50_1x1x8-50e_kinetics400-rgb.py|R(2+1)D/configs/recognition/tin/tin_tsm_finetune_r50_1x1x8_50e_kinetics400_rgb.py | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_50e_kinetics400_rgb/tsm_r50_1x1x8_50e_kinetics400_rgb_20200607-af7fb746.pth | 预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | R(2+1)D/configs/recognition/tsm/tsm_k400_pretrained_r50_1x1x16_25e_hmdb51_rgb.py | 
https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_256p_1x1x16_50e_kinetics400_rgb/tsm_r50_256p_1x1x16_50e_kinetics400_rgb_20201010-85645c2a.pth | 预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | R(2+1)D/configs/recognition/tsm/tsm_k400_pretrained_r50_1x1x16_25e_ucf101_rgb.py | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_256p_1x1x16_50e_kinetics400_rgb/tsm_r50_256p_1x1x16_50e_kinetics400_rgb_20201010-85645c2a.pth | 预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | R(2+1)D/configs/recognition/tsm/tsm_k400_pretrained_r50_1x1x8_25e_hmdb51_rgb.py | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_256p_1x1x8_50e_kinetics400_rgb/tsm_r50_256p_1x1x8_50e_kinetics400_rgb_20200726-020785e2.pth | 预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | R(2+1)D/configs/recognition/tsm/tsm_k400_pretrained_r50_1x1x8_25e_ucf101_rgb.py | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_256p_1x1x8_50e_kinetics400_rgb/tsm_r50_256p_1x1x8_50e_kinetics400_rgb_20200726-020785e2.pth | 预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | R(2+1)D/configs/recognition/tsn/tsn_r50_1x1x8_50e_hmdb51_kinetics400_rgb.py | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_256p_1x1x8_100e_kinetics400_rgb/tsn_r50_256p_1x1x8_100e_kinetics400_rgb_20200817-883baf16.pth | 预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | R(2+1)D/configs/recognition/tsn/tsn_r50_1x1x8_50e_hmdb51_mit_rgb.py | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x6_100e_mit_rgb/tsn_r50_1x1x6_100e_mit_rgb_20200618-d512ab1b.pth | 预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/demo/mmaction2_tutorial.ipynb|R(2+1)D/configs/recognition/tsn/tsn_r50_320p_1x1x8_150e_activitynet_clip_flow.py | https://download.openmmlab.com/mmaction/recognition/tsn/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/demo/mmaction2_tutorial.ipynb|R(2+1)D/configs/recognition/tsn/tsn_r50_320p_1x1x8_150e_activitynet_video_flow.py | https://download.openmmlab.com/mmaction/recognition/tsn/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/demo/mmaction2_tutorial.ipynb|R(2+1)D/configs/recognition/tsn/tsn_r50_320p_1x1x8_50e_activitynet_clip_rgb.py | https://download.openmmlab.com/mmaction/recognition/tsn/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/demo/mmaction2_tutorial.ipynb|R(2+1)D/configs/recognition/tsn/tsn_r50_320p_1x1x8_50e_activitynet_video_rgb.py | https://download.openmmlab.com/mmaction/recognition/tsn/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/configs/_base_/models/c3d_sports1m_pretrained.py|R(2+1)D/configs/_base_/models/c3d_sports1m_pretrained.py | https://download.openmmlab.com/mmaction/recognition/c3d/c3d_sports1m_pretrain_20201016-dcc47ddc.pth | 预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/configs/_base_/models/i3d_r50.py|R(2+1)D/configs/_base_/models/i3d_r50.py | https://github.com/open-mmlab/mmaction/blob/master/mmaction/models/tenons/backbones/resnet_i3d.py#L329-L332 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/mmaction/evaluation/functional/ava_utils.py|R(2+1)D/mmaction/core/evaluation/ava_utils.py | https://research.google.com/ava/download.html | 模型相关说明 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmaction2/blob/main/docs/en/notes/changelog.md|R(2+1)D/mmaction/core/evaluation/eval_hooks.py | https://github.com/open-mmlab/mmaction2/pull/395 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/mmaction/engine/hooks/output.py|R(2+1)D/mmaction/core/hooks/output.py | https://stackoverflow.com/questions/31174295/getattr-and-setattr-on-nested-objects | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | R(2+1)D/mmaction/core/scheduler/lr_updater.py | https://github.com/deepcs233/TIN/blob/master/main.py#L409-L412 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/docs/en/notes/changelog.md|R(2+1)D/mmaction/datasets/pipelines/loading.py | https://github.com/open-mmlab/mmaction2/pull/89 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/docs/en/notes/changelog.md|R(2+1)D/mmaction/datasets/pipelines/loading.py | https://github.com/open-mmlab/mmaction2/pull/89 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/docs/en/notes/changelog.md|R(2+1)D/mmaction/datasets/pipelines/augmentations.py | https://imgaug.readthedocs.io/en/latest/index.html | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/mmaction/datasets/transforms/wrappers.py|R(2+1)D/mmaction/datasets/pipelines/augmentations.py | https://arxiv.org/abs/1909.13719 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/docs/en/notes/changelog.md|R(2+1)D/mmaction/datasets/pipelines/loading.py | https://github.com/open-mmlab/mmaction2/pull/89 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/mmaction/datasets/transforms/wrappers.py|R(2+1)D/mmaction/datasets/pipelines/augmentations.py | https://github.com/tensorflow/tpu/blob/master/models/official/efficientnet/autoaugment.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/docs/en/notes/changelog.md|R(2+1)D/mmaction/datasets/pipelines/loading.py | https://github.com/open-mmlab/mmaction2/pull/89 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/mmaction/datasets/transforms/loading.py|R(2+1)D/mmaction/datasets/pipelines/loading.py | https://github.com/mikeboers/PyAV | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/mmaction/datasets/transforms/loading.py|R(2+1)D/mmaction/datasets/pipelines/loading.py | https://github.com/mikeboers/PyAV | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/mmaction/datasets/transforms/loading.py|R(2+1)D/mmaction/datasets/pipelines/loading.py | https://github.com/soft-matter/pims | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/configs/recognition/r2plus1d/metafile.yml|R(2+1)D/mmaction/models/backbones/resnet2plus1d.py | https://arxiv.org/abs/1711.11248 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/mmaction/datasets/transforms/loading.py|R(2+1)D/mmaction/datasets/pipelines/loading.py | https://github.com/soft-matter/pims | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/mmaction/datasets/transforms/loading.py|R(2+1)D/mmaction/datasets/pipelines/loading.py | https://github.com/PyAV-Org/PyAV/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/docs/en/advanced_guides/customize_pipeline.md|R(2+1)D/mmaction/datasets/pipelines/loading.py | https://github.com/dmlc/decord | 源码实现 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmaction2/blob/main/docs/en/advanced_guides/customize_pipeline.md|R(2+1)D/mmaction/datasets/pipelines/loading.py | https://github.com/dmlc/decord | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | R(2+1)D/mmaction/datasets/pipelines/augmentations.py | https://gluon-cv.mxnet.io/_modules/gluoncv/data/transforms/experimental/image.html | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | R(2+1)D/mmaction/datasets/pipelines/augmentations.py | https://mxnet.apache.org/api/python/docs/_modules/mxnet/image/image.html#LightingAug | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/mmaction/models/backbones/resnet3d_csn.py|R(2+1)D/mmaction/models/backbones/resnet3d_csn.py | https://arxiv.org/pdf/1711.11248.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/configs/detection/slowfast/metafile.yml|R(2+1)D/mmaction/models/backbones/resnet3d_slowfast.py | https://arxiv.org/abs/1812.03982 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/configs/recognition_audio/resnet/metafile.yml|R(2+1)D/mmaction/models/backbones/resnet_audio.py | https://arxiv.org/abs/2001.08740 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/configs/recognition/tsm/metafile.yml|R(2+1)D/mmaction/models/backbones/resnet_tsm.py | https://arxiv.org/abs/1811.08383 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/configs/recognition/tin/metafile.yml|R(2+1)D/mmaction/models/backbones/resnet_tin.py | https://arxiv.org/abs/2001.06499 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/configs/detection/slowfast/metafile.yml|R(2+1)D/mmaction/models/backbones/resnet3d_slowfast.py | https://arxiv.org/abs/1812.03982 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/mmaction/models/backbones/tanet.py|R(2+1)D/mmaction/models/backbones/tanet.py | https://arxiv.org/pdf/2005.06803 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/mmaction/models/backbones/tanet.py|R(2+1)D/mmaction/models/backbones/tanet.py | https://arxiv.org/pdf/2005.06803 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/configs/recognition/timesformer/metafile.yml|R(2+1)D/mmaction/models/backbones/timesformer.py | https://arxiv.org/abs/2102.05095 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/mmaction/models/backbones/resnet3d_csn.py|R(2+1)D/mmaction/models/common/conv2plus1d.py | https://arxiv.org/pdf/1711.11248.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/configs/recognition_audio/resnet/metafile.yml|R(2+1)D/mmaction/models/common/conv_audio.py | https://arxiv.org/abs/2001.08740 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/mmaction/models/backbones/resnet3d_csn.py|R(2+1)D/mmaction/models/common/conv2plus1d.py | https://arxiv.org/pdf/1711.11248.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/mmaction/models/backbones/x3d.py|R(2+1)D/mmaction/models/backbones/x3d.py | https://arxiv.org/pdf/2004.04730.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/configs/detection/lfb/metafile.yml|R(2+1)D/mmaction/models/common/lfb.py | https://arxiv.org/abs/1812.05038 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/mmaction/models/backbones/tanet.py|R(2+1)D/mmaction/models/common/tam.py | 
https://arxiv.org/pdf/2005.06803 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/configs/detection/acrn/metafile.yml|R(2+1)D/mmaction/models/heads/misc_head.py | https://arxiv.org/abs/1807.10982 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | R(2+1)D/mmaction/models/heads/tpn_head.py | https://arxiv.org/abs/1906.02629 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/configs/localization/bmn/metafile.yml|R(2+1)D/mmaction/models/localizers/bmn.py | https://arxiv.org/abs/1907.09702 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/mmaction/models/localizers/bsn.py|R(2+1)D/mmaction/models/localizers/bsn.py | http://arxiv.org/abs/1806.02964 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/configs/localization/bmn/README.md|R(2+1)D/mmaction/models/localizers/bmn.py | https://github.com/JJBOY/BMN-Boundary-Matching-Network | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/mmaction/models/localizers/bsn.py|R(2+1)D/mmaction/models/localizers/bsn.py | https://github.com/wzmsltw/BSN-boundary-sensitive-network | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/docs/en/notes/changelog.md|R(2+1)D/mmaction/models/localizers/base.py | https://github.com/open-mmlab/mmaction2/pull/913 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/mmaction/models/localizers/bsn.py|R(2+1)D/mmaction/models/localizers/bsn.py | http://arxiv.org/abs/1806.02964 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/mmaction/models/localizers/bsn.py|R(2+1)D/mmaction/models/localizers/bsn.py | https://github.com/wzmsltw/BSN-boundary-sensitive-network | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/configs/localization/bmn/metafile.yml|R(2+1)D/mmaction/models/losses/bmn_loss.py | https://arxiv.org/abs/1907.09702 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/configs/localization/bmn/README.md|R(2+1)D/mmaction/models/losses/bmn_loss.py | https://github.com/JJBOY/BMN-Boundary-Matching-Network | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/mmaction/models/losses/cross_entropy_loss.py|R(2+1)D/mmaction/models/losses/cross_entropy_loss.py | https://pytorch.org/docs/stable/generated/torch.nn.CrossEntropyLoss.html | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/mmaction/models/necks/tpn.py|R(2+1)D/mmaction/models/necks/tpn.py | https://arxiv.org/pdf/2004.03548.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/tools/data/activitynet/download.py|R(2+1)D/tools/data/activitynet/download.py | https://github.com/activitynet/ActivityNet/blob/master/Crawler/Kinetics/download.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/tests/data/eval_localization/gt.json|R(2+1)D/tools/data/activitynet/download.py | https://www.youtube.com/watch?v= | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/tools/data/activitynet/download_annotations.sh|R(2+1)D/tools/data/activitynet/download_annotations.sh | http://ec2-52-25-205-214.us-west-2.compute.amazonaws.com/files/activity_net.v1-3.min.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/espnet/espnet/tree/v.0.10.5 | R(2+1)D/tools/data/activitynet/download_features.sh | https://docs.google.com/uc?export=download&confirm= | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | 
R(2+1)D/tools/data/activitynet/download_features.sh | https://docs.google.com/uc?export=download&id=1ISemndlSDS2FtqQOKL0t3Cjj9yk2yznF | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/tools/data/activitynet/download_feature_annotations.sh|R(2+1)D/tools/data/activitynet/download_feature_annotations.sh | https://raw.githubusercontent.com/wzmsltw/BSN-boundary-sensitive-network/master/data/activitynet_annotations/anet_anno_action.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/tools/data/activitynet/download_feature_annotations.sh|R(2+1)D/tools/data/activitynet/download_feature_annotations.sh | https://raw.githubusercontent.com/wzmsltw/BSN-boundary-sensitive-network/master/data/activitynet_annotations/video_info_new.csv | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/tools/data/activitynet/download_feature_annotations.sh|R(2+1)D/tools/data/activitynet/download_feature_annotations.sh | https://download.openmmlab.com/mmaction/localization/anet_activity_indexes_val.txt | 模型参数相关配置 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/tools/data/activitynet/convert_proposal_format.py|R(2+1)D/tools/data/activitynet/convert_proposal_format.py | https://github.com/activitynet/ActivityNet/blob/master/Evaluation/eval_classification.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/tools/data/ava/download_annotations.sh|R(2+1)D/tools/data/ava/download_annotations.sh | https://research.google.com/ava/download/ava_v | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/tools/data/ava/download_videos.sh|R(2+1)D/tools/data/ava/download_videos.sh | https://s3.amazonaws.com/ava-dataset/annotations/ava_file_names_trainval_v2.1.txt | 模型参数相关配置 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/tools/data/ava/download_videos.sh|R(2+1)D/tools/data/ava/download_videos.sh | https://s3.amazonaws.com/ava-dataset/trainval/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/tools/data/ava/download_videos.sh|R(2+1)D/tools/data/ava/download_videos_gnu_parallel.sh | https://s3.amazonaws.com/ava-dataset/annotations/ava_file_names_trainval_v2.1.txt | 模型参数相关配置 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/tools/data/ava/download_videos.sh|R(2+1)D/tools/data/ava/download_videos_gnu_parallel.sh | https://s3.amazonaws.com/ava-dataset/trainval/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/tools/data/ava/download_videos.sh|R(2+1)D/tools/data/ava/download_videos_parallel.py | https://s3.amazonaws.com/ava-dataset/trainval/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/tools/data/ava/download_videos.sh|R(2+1)D/tools/data/ava/download_videos_parallel.sh | https://s3.amazonaws.com/ava-dataset/annotations/ava_file_names_trainval_v2.1.txt | 模型参数相关配置 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/tools/data/ava/fetch_ava_proposals.sh|R(2+1)D/tools/data/ava/fetch_ava_proposals.sh | https://download.openmmlab.com/mmaction/dataset/ava/ava_dense_proposals_train.FAIR.recall_93.9.pkl | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/tools/data/ava/fetch_ava_proposals.sh|R(2+1)D/tools/data/ava/fetch_ava_proposals.sh | https://download.openmmlab.com/mmaction/dataset/ava/ava_dense_proposals_val.FAIR.recall_93.9.pkl | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/tools/data/ava/fetch_ava_proposals.sh|R(2+1)D/tools/data/ava/fetch_ava_proposals.sh | 
https://download.openmmlab.com/mmaction/dataset/ava/ava_dense_proposals_test.FAIR.recall_93.9.pkl | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/tools/data/diving48/download_annotations.sh|R(2+1)D/tools/data/diving48/download_annotations.sh | http://www.svcl.ucsd.edu/projects/resound/Diving48_vocab.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/tools/data/diving48/download_annotations.sh|R(2+1)D/tools/data/diving48/download_annotations.sh | http://www.svcl.ucsd.edu/projects/resound/Diving48_V2_train.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/tools/data/diving48/download_annotations.sh|R(2+1)D/tools/data/diving48/download_annotations.sh | http://www.svcl.ucsd.edu/projects/resound/Diving48_V2_test.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/tools/data/diving48/download_videos.sh|R(2+1)D/tools/data/diving48/download_videos.sh | http://www.svcl.ucsd.edu/projects/resound/Diving48_rgb.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/tools/data/activitynet/download.py|R(2+1)D/tools/data/gym/download.py | https://github.com/activitynet/ActivityNet/blob/master/Crawler/Kinetics/download.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/tools/data/gym/download_annotations.sh|R(2+1)D/tools/data/gym/download_annotations.sh | https://sdolivia.github.io/FineGym/resources/dataset/finegym_annotation_info_v1.0.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/tools/data/gym/download_annotations.sh|R(2+1)D/tools/data/gym/download_annotations.sh | https://sdolivia.github.io/FineGym/resources/dataset/gym99_train_element_v1.0.txt | 模型参数相关配置 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/tests/data/eval_localization/gt.json|R(2+1)D/tools/data/gym/download.py | https://www.youtube.com/watch?v= | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/tools/data/gym/download_annotations.sh|R(2+1)D/tools/data/gym/download_annotations.sh | https://sdolivia.github.io/FineGym/resources/dataset/gym99_val_element.txt | 模型参数相关配置 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/tools/data/hmdb51/download_annotations.sh|R(2+1)D/tools/data/hmdb51/download_annotations.sh | http://serre-lab.clps.brown.edu/wp-content/uploads/2013/10/test_train_splits.rar | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/tools/data/hmdb51/download_videos.sh|R(2+1)D/tools/data/hmdb51/download_videos.sh | http://serre-lab.clps.brown.edu/wp-content/uploads/2013/10/hmdb51_org.rar | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/mmaction/evaluation/functional/ava_utils.py|R(2+1)D/tools/data/hvu/download.py | https://github.com/activitynet/ActivityNet/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/tools/data/hvu/download_annotations.sh|R(2+1)D/tools/data/hvu/download_annotations.sh | https://github.com/holistic-video-understanding/HVU-Dataset.git | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/tests/data/eval_localization/gt.json|R(2+1)D/tools/data/hvu/download.py | https://www.youtube.com/watch?v= | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/mmaction/evaluation/functional/ava_utils.py|R(2+1)D/tools/data/kinetics/download.py | https://github.com/activitynet/ActivityNet/ | 源码实现 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmaction2/blob/main/tools/data/ava_kinetics/README.md|R(2+1)D/tools/data/kinetics/download_annotations.sh | https://storage.googleapis.com/deepmind-media/Datasets/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/tests/data/eval_localization/gt.json|R(2+1)D/tools/data/kinetics/download.py | https://www.youtube.com/watch?v= | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/configs/recognition/c2d/README.md|R(2+1)D/tools/data/kinetics/download_backup_annotations.sh | https://download.openmmlab.com/mmaction/dataset/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/configs/recognition/c2d/README.md|R(2+1)D/tools/data/kinetics/download_backup_annotations.sh | https://download.openmmlab.com/mmaction/dataset/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/configs/recognition/c2d/README.md|R(2+1)D/tools/data/kinetics/download_backup_annotations.sh | https://download.openmmlab.com/mmaction/dataset/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | R(2+1)D/tools/data/skeleton/download_annotations.sh | https://download.openmmlab.com/mmaction/posec3d/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | R(2+1)D/tools/data/skeleton/download_annotations.sh | https://download.openmmlab.com/mmaction/posec3d/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/tools/data/thumos14/download_annotations.sh|R(2+1)D/tools/data/thumos14/download_annotations.sh | http://crcv.ucf.edu/THUMOS14/Validation_set/TH14_Temporal_annotations_validation.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/tools/data/thumos14/download_annotations.sh|R(2+1)D/tools/data/thumos14/download_annotations.sh | http://crcv.ucf.edu/THUMOS14/test_set/TH14_Temporal_annotations_test.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/tools/data/thumos14/download_videos.sh|R(2+1)D/tools/data/thumos14/download_videos.sh | https://storage.googleapis.com/thumos14_files/TH14_validation_set_mp4.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/tools/data/thumos14/download_videos.sh|R(2+1)D/tools/data/thumos14/download_videos.sh | https://storage.googleapis.com/thumos14_files/TH14_Test_set_mp4.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | R(2+1)D/tools/data/thumos14/fetch_tag_proposals.sh | https://open-mmlab.s3.ap-northeast-2.amazonaws.com/mmaction/filelist/thumos14_tag_val_normalized_proposal_list.txt | 模型参数相关配置 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | R(2+1)D/tools/data/thumos14/fetch_tag_proposals.sh | https://open-mmlab.s3.ap-northeast-2.amazonaws.com/mmaction/filelist/thumos14_tag_test_normalized_proposal_list.txt | 模型参数相关配置 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/tools/data/ucf101/download_annotations.sh|R(2+1)D/tools/data/ucf101/download_annotations.sh | https://www.crcv.ucf.edu/wp-content/uploads/2019/03/UCF101TrainTestSplits-RecognitionTask.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/tools/data/ucf101/download_videos.sh|R(2+1)D/tools/data/ucf101/download_videos.sh | https://www.crcv.ucf.edu/datasets/human-actions/ucf101/UCF101.rar | 模型相关说明 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmaction2/blob/main/configs/recognition/tsn/custom_backbones/tsn_imagenet-pretrained-rn101-32x4d_8xb32-1x1x3-100e_kinetics400-rgb.py|R(2+1)D/configs/recognition/tsn/custom_backbones/tsn_rn101_32x4d_320p_1x1x3_100e_kinetics400_rgb.py | https://download.openmmlab.com/mmclassification/v0/resnext/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/mmaction/evaluation/functional/ava_evaluation/metrics.py|R(2+1)D/mmaction/core/evaluation/ava_evaluation/metrics.py | https://www.robots.ox.ac.uk/~vgg/rg/papers/deselaers-eccv10.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/setup.py|R(2+1)D/setup.py | http://setuptools.readthedocs.io/en/latest/setuptools.html#declaring-platform-specific-dependencies | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/setup.py|R(2+1)D/setup.py | openmmlab@gmail.com | 开发者邮箱配置 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/.github/ISSUE_TEMPLATE/1-bug-report.yml|R(2+1)D/setup.py | https://github.com/open-mmlab/mmaction2 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/docs/en/notes/changelog.md|R(2+1)D/train.py | https://github.com/open-mmlab/mmaction2/pull/123 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmsegmentation.git/docker/Dockerfile | R(2+1)D/docker/Dockerfile | https://download.openmmlab.com/mmcv/dist/index.html | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/demo/mmaction2_tutorial.ipynb|R(2+1)D/docker/Dockerfile | https://github.com/open-mmlab/mmaction2.git | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/docs/en/conf.py|R(2+1)D/docs/conf.py | https://www.sphinx-doc.org/en/master/usage/configuration.html | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2 | R(2+1)D/docs/merge_docs.sh | https://github.com/open-mmlab/mmaction2/tree/master/=g | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2 | R(2+1)D/docs/merge_docs.sh | https://github.com/open-mmlab/mmaction2/tree/master/=g | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2 | R(2+1)D/docs/merge_docs.sh | https://github.com/open-mmlab/mmaction2/tree/master/=g | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2 | R(2+1)D/docs/merge_docs.sh | https://github.com/open-mmlab/mmaction2/tree/master/=g | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2 | R(2+1)D/docs/merge_docs.sh | https://github.com/open-mmlab/mmaction2/tree/master/=g | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2 | R(2+1)D/docs/merge_docs.sh | https://github.com/open-mmlab/mmaction2/tree/master/=g | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2 | R(2+1)D/docs/merge_docs.sh | https://github.com/open-mmlab/mmaction2/tree/master/=g | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2 | R(2+1)D/docs/merge_docs.sh | https://github.com/open-mmlab/mmaction2/tree/master/=g | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2 | R(2+1)D/docs/merge_docs.sh | https://github.com/open-mmlab/mmaction2/tree/master/=g | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2 | R(2+1)D/docs/merge_docs.sh | https://github.com/open-mmlab/mmaction2/tree/master/=g | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2 | R(2+1)D/docs/merge_docs.sh | https://github.com/open-mmlab/mmaction2/tree/master/=g | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/.circleci/test.yml|R(2+1)D/docs/stat.py | https://download | 模型相关说明 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmaction2/blob/main/docs/en/conf.py|R(2+1)D/docs_zh_CN/conf.py | https://www.sphinx-doc.org/en/master/usage/configuration.html | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2 | R(2+1)D/docs_zh_CN/merge_docs.sh | https://github.com/open-mmlab/mmaction2/tree/master/=g | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2 | R(2+1)D/docs_zh_CN/merge_docs.sh | https://github.com/open-mmlab/mmaction2/tree/master/=g | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2 | R(2+1)D/docs_zh_CN/merge_docs.sh | https://github.com/open-mmlab/mmaction2/tree/master/=g | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2 | R(2+1)D/docs_zh_CN/merge_docs.sh | https://github.com/open-mmlab/mmaction2/tree/master/=g | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2 | R(2+1)D/docs_zh_CN/merge_docs.sh | https://github.com/open-mmlab/mmaction2/tree/master/=g | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2 | R(2+1)D/docs_zh_CN/merge_docs.sh | https://github.com/open-mmlab/mmaction2/tree/master/=g | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2 | R(2+1)D/docs_zh_CN/merge_docs.sh | https://github.com/open-mmlab/mmaction2/tree/master/=g | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2 | R(2+1)D/docs_zh_CN/merge_docs.sh | https://github.com/open-mmlab/mmaction2/tree/master/=g | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2 | R(2+1)D/docs_zh_CN/merge_docs.sh | https://github.com/open-mmlab/mmaction2/tree/master/=g | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2 | R(2+1)D/docs_zh_CN/merge_docs.sh | https://github.com/open-mmlab/mmaction2/tree/master/=g | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/.circleci/test.yml|R(2+1)D/docs_zh_CN/stat.py | https://download | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/tools/argparse.bash|R(2+1)D/tools/argparse.bash | https://github.com/nhoffman/argparse-bash | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/docs/en/make.bat|R(2+1)D/docs/make.bat | http://sphinx-doc.org/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/docs/en/notes/changelog.md|R(2+1)D/tools/train.py | https://github.com/open-mmlab/mmaction2/pull/123 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/docs/en/make.bat|R(2+1)D/docs_zh_CN/make.bat | http://sphinx-doc.org/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmpose/blob/main/configs/body_2d_keypoint/topdown_heatmap/coco/hrnet_fp16_coco.md | R(2+1)D/additional_need/mmcv/optimizer.py | https://arxiv.org/abs/1710.03740 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/docs/en/notes/changelog.md|R(2+1)D/mmaction/datasets/activitynet_dataset.py | https://github.com/open-mmlab/mmaction2/pull/286 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/docs/en/notes/changelog.md|R(2+1)D/mmaction/datasets/activitynet_dataset.py | https://github.com/open-mmlab/mmaction2/pull/286 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/mmaction/models/utils/blending_utils.py|R(2+1)D/mmaction/datasets/blending_utils.py | https://arxiv.org/abs/1710.09412 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/mmaction/models/utils/blending_utils.py|R(2+1)D/mmaction/datasets/blending_utils.py | https://github.com/open-mmlab/mmclassification/blob/master/mmcls/models/utils/mixup.py | 源码实现 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmaction2/blob/main/mmaction/models/utils/blending_utils.py|R(2+1)D/mmaction/datasets/blending_utils.py | https://arxiv.org/abs/1905.04899 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/6c1347d7c0fa220a7be99cb19d1a9e8b6cbf7544/mmdet/datasets/builder.py | R(2+1)D/mmaction/datasets/builder.py | https://github.com/pytorch/pytorch/issues/973 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/mmaction/models/utils/blending_utils.py|R(2+1)D/mmaction/datasets/blending_utils.py | https://github.com/clovaai/CutMix-PyTorch | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/docs/en/notes/changelog.md|R(2+1)D/mmaction/datasets/base.py | https://github.com/open-mmlab/mmaction2/pull/286 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/docs/en/notes/changelog.md|R(2+1)D/mmaction/datasets/base.py | https://github.com/open-mmlab/mmaction2/pull/286 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/docs/en/notes/changelog.md|R(2+1)D/mmaction/datasets/ava_dataset.py | https://github.com/open-mmlab/mmaction2/pull/567 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/docs/en/notes/changelog.md|R(2+1)D/mmaction/datasets/ssn_dataset.py | https://github.com/open-mmlab/mmaction2/pull/286 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/docs/en/notes/changelog.md|R(2+1)D/mmaction/datasets/ssn_dataset.py | https://github.com/open-mmlab/mmaction2/pull/286 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/docs/en/notes/changelog.md|R(2+1)D/mmaction/models/builder.py | https://github.com/open-mmlab/mmaction2/pull/629 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/docs/en/notes/changelog.md|R(2+1)D/mmaction/models/builder.py | https://github.com/open-mmlab/mmaction2/pull/629 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/mmaction/utils/gradcam_utils.py|R(2+1)D/mmaction/utils/gradcam_utils.py | https://github.com/facebookresearch/SlowFast/blob/master/slowfast/visualization/gradcam_utils.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/mmaction/utils/gradcam_utils.py|R(2+1)D/mmaction/utils/gradcam_utils.py | https://arxiv.org/pdf/1610.02391.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/mmaction/utils/gradcam_utils.py|R(2+1)D/mmaction/utils/gradcam_utils.py | https://matplotlib.org/3.3.0/tutorials/colors/colormaps.html | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/master/configs/recognition/c3d | R(2+1)D/mmaction/utils/precise_bn.py | https://github.com/facebookresearch/fvcore/blob/master/fvcore/nn/precise_bn.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/docs/en/notes/changelog.md|R(2+1)D/tools/analysis/report_map.py | http://activity-net.org/challenges/2017/evaluation.html | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/configs/detection/acrn/metafile.yml|R(2+1)D/tools/analysis/report_map.py | https://download.openmmlab.com/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/tools/data/build_audio_features.py|R(2+1)D/tools/data/build_audio_features.py | https://github.com/r9y9/deepvoice3_pytorch | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/tools/data/build_audio_features.py|R(2+1)D/tools/data/build_audio_features.py | https://pypi.org/project/lws/1.2.6/ | 模型相关说明 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmaction2/blob/main/tools/data/build_audio_features.py|R(2+1)D/tools/data/build_audio_features.py | https://pypi.org/project/lws/1.2.6/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/tools/data/build_audio_features.py|R(2+1)D/tools/data/build_audio_features.py | https://pypi.org/project/lws/1.2.6/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/tools/data/build_audio_features.py|R(2+1)D/tools/data/build_audio_features.py | https://pypi.org/project/lws/1.2.6/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/tools/data/build_audio_features.py|R(2+1)D/tools/data/build_audio_features.py | https://github.com/r9y9/deepvoice3_pytorch | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/tools/data/build_audio_features.py|R(2+1)D/tools/data/build_audio_features.py | https://github.com/r9y9/deepvoice3_pytorch | 源码实现 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|---------------------------------------------------------------------------------------------------------------------------------------------------------------------------|--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|---------------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/acrn/metafile.yml | https://download.openmmlab.com/mmaction/detection/acrn/slowfast_acrn_kinetics_pretrained_r50_8x8x1_cosine_10e_ava22_rgb/slowfast_acrn_kinetics_pretrained_r50_8x8x1_cosine_10e_ava22_rgb-2be32625.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/acrn/metafile.yml | https://download.openmmlab.com/mmaction/detection/acrn/slowfast_acrn_kinetics_pretrained_r50_8x8x1_cosine_10e_ava_rgb/slowfast_acrn_kinetics_pretrained_r50_8x8x1_cosine_10e_ava_rgb-49b07bf2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/acrn/metafile.yml | https://download.openmmlab.com/mmaction/detection/acrn/slowfast_acrn_kinetics_pretrained_r50_8x8x1_cosine_10e_ava22_rgb/slowfast_acrn_kinetics_pretrained_r50_8x8x1_cosine_10e_ava22_rgb.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/acrn/metafile.yml | https://download.openmmlab.com/mmaction/detection/acrn/slowfast_acrn_kinetics_pretrained_r50_8x8x1_cosine_10e_ava_rgb/slowfast_acrn_kinetics_pretrained_r50_8x8x1_cosine_10e_ava_rgb.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/acrn/metafile.yml | https://download.openmmlab.com/mmaction/detection/acrn/slowfast_acrn_kinetics_pretrained_r50_8x8x1_cosine_10e_ava22_rgb/slowfast_acrn_kinetics_pretrained_r50_8x8x1_cosine_10e_ava22_rgb.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/acrn/metafile.yml | https://download.openmmlab.com/mmaction/detection/acrn/slowfast_acrn_kinetics_pretrained_r50_8x8x1_cosine_10e_ava_rgb/slowfast_acrn_kinetics_pretrained_r50_8x8x1_cosine_10e_ava_rgb.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/acrn/slowfast_acrn_kinetics_pretrained_r50_8x8x1_cosine_10e_ava_rgb.py | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_8x8x1_256e_kinetics400_rgb/slowfast_r50_8x8x1_256e_kinetics400_rgb_20200716-73547d2b.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/acrn/slowfast_acrn_kinetics_pretrained_r50_8x8x1_cosine_10e_ava22_rgb.py | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_8x8x1_256e_kinetics400_rgb/slowfast_r50_8x8x1_256e_kinetics400_rgb_20200716-73547d2b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowonly_omnisource_pretrained_r50_4x16x1_20e_ava_rgb/slowonly_omnisource_pretrained_r50_4x16x1_20e_ava_rgb_20201217-0c6d2e98.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowonly_omnisource_pretrained_r101_8x8x1_20e_ava_rgb/slowonly_omnisource_pretrained_r101_8x8x1_20e_ava_rgb_20201217-16378594.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowonly_nl_kinetics_pretrained_r50_8x8x1_10e_ava_rgb/slowonly_nl_kinetics_pretrained_r50_8x8x1_10e_ava_rgb_20210316-5742e4dd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowonly_nl_kinetics_pretrained_r50_4x16x1_10e_ava_rgb/slowonly_nl_kinetics_pretrained_r50_4x16x1_10e_ava_rgb_20210316-959829ec.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowonly_kinetics_pretrained_r50_4x16x1_20e_ava_rgb/slowonly_kinetics_pretrained_r50_4x16x1_20e_ava_rgb_20201217-40061d5f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowonly_kinetics_pretrained_r101_8x8x1_20e_ava_rgb/slowonly_kinetics_pretrained_r101_8x8x1_20e_ava_rgb_20201217-1c9b4117.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowfast_temporal_max_kinetics_pretrained_r50_8x8x1_cosine_10e_ava22_rgb/slowfast_temporal_max_kinetics_pretrained_r50_8x8x1_cosine_10e_ava22_rgb-874e0845.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowfast_temporal_max_focal_alpha3_gamma1_kinetics_pretrained_r50_8x8x1_cosine_10e_ava22_rgb/slowfast_temporal_max_focal_alpha3_gamma1_kinetics_pretrained_r50_8x8x1_cosine_10e_ava22_rgb-345618cd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowfast_kinetics_pretrained_r50_8x8x1_cosine_10e_ava22_rgb/slowfast_kinetics_pretrained_r50_8x8x1_cosine_10e_ava22_rgb-b987b516.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowfast_kinetics_pretrained_r50_8x8x1_20e_ava_rgb/slowfast_kinetics_pretrained_r50_8x8x1_20e_ava_rgb_20201217-ae225e97.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowfast_kinetics_pretrained_r50_4x16x1_20e_ava_rgb/slowfast_kinetics_pretrained_r50_4x16x1_20e_ava_rgb_20201217-6e7c704d.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowfast_context_kinetics_pretrained_r50_4x16x1_20e_ava_rgb/slowfast_context_kinetics_pretrained_r50_4x16x1_20e_ava_rgb_20201222-f4d209c9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowonly_omnisource_pretrained_r50_4x16x1_20e_ava_rgb/slowonly_omnisource_pretrained_r50_4x16x1_20e_ava_rgb_20201127.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowonly_omnisource_pretrained_r101_8x8x1_20e_ava_rgb/slowonly_omnisource_pretrained_r101_8x8x1_20e_ava_rgb_20201127.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowonly_nl_kinetics_pretrained_r50_8x8x1_10e_ava_rgb/20210316_122517.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowonly_nl_kinetics_pretrained_r50_4x16x1_10e_ava_rgb/20210316_122517.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowonly_kinetics_pretrained_r50_4x16x1_20e_ava_rgb/slowonly_kinetics_pretrained_r50_4x16x1_20e_ava_rgb_20201127.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowonly_kinetics_pretrained_r101_8x8x1_20e_ava_rgb/slowonly_kinetics_pretrained_r101_8x8x1_20e_ava_rgb_20201127.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowfast_temporal_max_kinetics_pretrained_r50_8x8x1_cosine_10e_ava22_rgb/slowfast_temporal_max_kinetics_pretrained_r50_8x8x1_cosine_10e_ava22_rgb.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowfast_temporal_max_focal_alpha3_gamma1_kinetics_pretrained_r50_8x8x1_cosine_10e_ava22_rgb/slowfast_temporal_max_focal_alpha3_gamma1_kinetics_pretrained_r50_8x8x1_cosine_10e_ava22_rgb.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowfast_kinetics_pretrained_r50_8x8x1_cosine_10e_ava22_rgb/slowfast_kinetics_pretrained_r50_8x8x1_cosine_10e_ava22_rgb.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowfast_kinetics_pretrained_r50_8x8x1_20e_ava_rgb/slowfast_kinetics_pretrained_r50_8x8x1_20e_ava_rgb_20201217.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowfast_kinetics_pretrained_r50_4x16x1_20e_ava_rgb/slowfast_kinetics_pretrained_r50_4x16x1_20e_ava_rgb_20201217.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/ava/metafile.yml | 
https://download.openmmlab.com/mmaction/detection/ava/slowfast_context_kinetics_pretrained_r50_4x16x1_20e_ava_rgb/slowfast_context_kinetics_pretrained_r50_4x16x1_20e_ava_rgb_20201222.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowonly_omnisource_pretrained_r50_4x16x1_20e_ava_rgb/slowonly_omnisource_pretrained_r50_4x16x1_20e_ava_rgb_20201127.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowonly_omnisource_pretrained_r101_8x8x1_20e_ava_rgb/slowonly_omnisource_pretrained_r101_8x8x1_20e_ava_rgb_20201127.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowonly_nl_kinetics_pretrained_r50_8x8x1_10e_ava_rgb/20210316_122517.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowonly_nl_kinetics_pretrained_r50_4x16x1_10e_ava_rgb/20210316_122517.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowonly_kinetics_pretrained_r50_4x16x1_20e_ava_rgb/slowonly_kinetics_pretrained_r50_4x16x1_20e_ava_rgb_20201127.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowonly_kinetics_pretrained_r101_8x8x1_20e_ava_rgb/slowonly_kinetics_pretrained_r101_8x8x1_20e_ava_rgb_20201127.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowfast_temporal_max_kinetics_pretrained_r50_8x8x1_cosine_10e_ava22_rgb/slowfast_temporal_max_kinetics_pretrained_r50_8x8x1_cosine_10e_ava22_rgb.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowfast_temporal_max_focal_alpha3_gamma1_kinetics_pretrained_r50_8x8x1_cosine_10e_ava22_rgb/slowfast_temporal_max_focal_alpha3_gamma1_kinetics_pretrained_r50_8x8x1_cosine_10e_ava22_rgb.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowfast_kinetics_pretrained_r50_8x8x1_cosine_10e_ava22_rgb/slowfast_kinetics_pretrained_r50_8x8x1_cosine_10e_ava22_rgb.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowfast_kinetics_pretrained_r50_8x8x1_20e_ava_rgb/slowfast_kinetics_pretrained_r50_8x8x1_20e_ava_rgb_20201217.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowfast_kinetics_pretrained_r50_4x16x1_20e_ava_rgb/slowfast_kinetics_pretrained_r50_4x16x1_20e_ava_rgb_20201217.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/ava/metafile.yml | https://download.openmmlab.com/mmaction/detection/ava/slowfast_context_kinetics_pretrained_r50_4x16x1_20e_ava_rgb/slowfast_context_kinetics_pretrained_r50_4x16x1_20e_ava_rgb_20201222.json | log地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/ava/slowfast_context_kinetics_pretrained_r50_4x16x1_20e_ava_rgb.py | https://download.openmmlab.com/mmaction/recognition/slowfast/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/ava/slowfast_kinetics_pretrained_r50_4x16x1_20e_ava_rgb.py | https://download.openmmlab.com/mmaction/recognition/slowfast/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/ava/slowfast_kinetics_pretrained_r50_4x16x1_20e_ava_rgb_custom_classes.py | https://download.openmmlab.com/mmaction/recognition/slowfast/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/ava/slowfast_kinetics_pretrained_r50_8x8x1_20e_ava_rgb.py | https://download.openmmlab.com/mmaction/recognition/slowfast/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/ava/slowfast_kinetics_pretrained_r50_8x8x1_cosine_10e_ava22_rgb.py | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_8x8x1_256e_kinetics400_rgb/slowfast_r50_8x8x1_256e_kinetics400_rgb_20200716-73547d2b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/ava/slowfast_temporal_max_focal_alpha3_gamma1_kinetics_pretrained_r50_8x8x1_cosine_10e_ava22_rgb.py | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_8x8x1_256e_kinetics400_rgb/slowfast_r50_8x8x1_256e_kinetics400_rgb_20200716-73547d2b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/ava/slowfast_temporal_max_kinetics_pretrained_r50_8x8x1_cosine_10e_ava22_rgb.py | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_8x8x1_256e_kinetics400_rgb/slowfast_r50_8x8x1_256e_kinetics400_rgb_20200716-73547d2b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/ava/slowonly_kinetics_pretrained_r101_8x8x1_20e_ava_rgb.py | https://download.openmmlab.com/mmaction/recognition/slowonly/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/ava/slowonly_kinetics_pretrained_r50_4x16x1_20e_ava_rgb.py | https://download.openmmlab.com/mmaction/recognition/slowonly/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/ava/slowonly_kinetics_pretrained_r50_4x16x1_20e_ava_rgb_custom_classes.py | https://download.openmmlab.com/mmaction/recognition/slowonly/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/ava/slowonly_nl_kinetics_pretrained_r50_4x16x1_10e_ava_rgb.py | https://download.openmmlab.com/mmaction/recognition/slowonly/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/ava/slowonly_nl_kinetics_pretrained_r50_8x8x1_10e_ava_rgb.py | https://download.openmmlab.com/mmaction/recognition/slowonly/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/ava/slowonly_omnisource_pretrained_r101_8x8x1_20e_ava_rgb.py | https://download.openmmlab.com/mmaction/recognition/slowonly/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/ava/slowonly_omnisource_pretrained_r50_4x16x1_20e_ava_rgb.py | https://download.openmmlab.com/mmaction/recognition/slowonly/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/lfb/lfb_avg_kinetics_pretrained_slowonly_r50_4x16x1_20e_ava_rgb.py | https://download.openmmlab.com/mmaction/recognition/slowonly/ | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/lfb/lfb_max_kinetics_pretrained_slowonly_r50_4x16x1_20e_ava_rgb.py | https://download.openmmlab.com/mmaction/recognition/slowonly/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/lfb/lfb_nl_kinetics_pretrained_slowonly_r50_4x16x1_20e_ava_rgb.py | https://download.openmmlab.com/mmaction/recognition/slowonly/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/lfb/metafile.yml | https://download.openmmlab.com/mmaction/detection/lfb/lfb_nl_kinetics_pretrained_slowonly_r50_4x16x1_20e_ava_rgb/lfb_nl_kinetics_pretrained_slowonly_r50_4x16x1_20e_ava_rgb_20210224-2ae136d9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/lfb/metafile.yml | https://download.openmmlab.com/mmaction/detection/lfb/lfb_max_kinetics_pretrained_slowonly_r50_4x16x1_20e_ava_rgb/lfb_max_kinetics_pretrained_slowonly_r50_4x16x1_20e_ava_rgb_20210301-37efcd15.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/lfb/metafile.yml | https://download.openmmlab.com/mmaction/detection/lfb/lfb_avg_kinetics_pretrained_slowonly_r50_4x16x1_20e_ava_rgb/lfb_avg_kinetics_pretrained_slowonly_r50_4x16x1_20e_ava_rgb_20210301-19c330b7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/lfb/metafile.yml | https://download.openmmlab.com/mmaction/detection/lfb/lfb_nl_kinetics_pretrained_slowonly_r50_4x16x1_20e_ava_rgb/20210224_125052.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/lfb/metafile.yml | https://download.openmmlab.com/mmaction/detection/lfb/lfb_max_kinetics_pretrained_slowonly_r50_4x16x1_20e_ava_rgb/20210301_124812.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/lfb/metafile.yml | https://download.openmmlab.com/mmaction/detection/lfb/lfb_avg_kinetics_pretrained_slowonly_r50_4x16x1_20e_ava_rgb/20210301_124812.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/lfb/metafile.yml | https://download.openmmlab.com/mmaction/detection/lfb/lfb_nl_kinetics_pretrained_slowonly_r50_4x16x1_20e_ava_rgb/20210224_125052.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/lfb/metafile.yml | https://download.openmmlab.com/mmaction/detection/lfb/lfb_max_kinetics_pretrained_slowonly_r50_4x16x1_20e_ava_rgb/20210301_124812.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/detection/lfb/metafile.yml | https://download.openmmlab.com/mmaction/detection/lfb/lfb_avg_kinetics_pretrained_slowonly_r50_4x16x1_20e_ava_rgb/20210301_124812.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/localization/bmn/metafile.yml | https://download.openmmlab.com/mmaction/localization/bmn/bmn_400x100_9e_activitynet_feature/bmn_400x100_9e_activitynet_feature_20200619-42a3b111.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/localization/bmn/metafile.yml | https://download.openmmlab.com/mmaction/localization/bmn/bmn_400x100_2x8_9e_mmaction_video/bmn_400x100_2x8_9e_mmaction_video_20200809-c9fd14d2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/localization/bmn/metafile.yml | https://download.openmmlab.com/mmaction/localization/bmn/bmn_400x100_2x8_9e_mmaction_clip/bmn_400x100_2x8_9e_mmaction_clip_20200809-10d803ce.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/localization/bmn/metafile.yml | https://download.openmmlab.com/mmaction/localization/bmn/bmn_400x100_9e_activitynet_feature/bmn_400x100_9e_activitynet_feature.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/localization/bmn/metafile.yml | https://download.openmmlab.com/mmaction/localization/bmn/bmn_400x100_2x8_9e_mmaction_video/bmn_400x100_2x8_9e_mmaction_video_20200809.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/localization/bmn/metafile.yml | https://download.openmmlab.com/mmaction/localization/bmn/bmn_400x100_2x8_9e_mmaction_clip/bmn_400x100_2x8_9e_mmaction_clip_20200809.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/localization/bmn/metafile.yml | https://download.openmmlab.com/mmaction/localization/bmn/bmn_400x100_9e_activitynet_feature/bmn_400x100_9e_activitynet_feature.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/localization/bmn/metafile.yml | https://download.openmmlab.com/mmaction/localization/bmn/bmn_400x100_2x8_9e_mmaction_video/bmn_400x100_2x8_9e_mmaction_video_20200809.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/localization/bmn/metafile.yml | https://download.openmmlab.com/mmaction/localization/bmn/bmn_400x100_2x8_9e_mmaction_clip/bmn_400x100_2x8_9e_mmaction_clip_20200809.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/localization/ssn/metafile.yml | https://download.openmmlab.com/mmaction/localization/ssn/ssn_r50_450e_thumos14_rgb/ssn_r50_450e_thumos14_rgb_20201012-1920ab16.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/localization/ssn/metafile.yml | https://download.openmmlab.com/mmaction/localization/ssn/ssn_r50_450e_thumos14_rgb/20201005_144656.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/localization/ssn/metafile.yml | https://download.openmmlab.com/mmaction/localization/ssn/ssn_r50_450e_thumos14_rgb/20201005_144656.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/localization/ssn/metafile.yml | https://download.openmmlab.com/mmaction/localization/ssn/mmaction_reference/ssn_r50_450e_thumos14_rgb_ref/20201008_103258.log.json | 源码模型训练日志 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/localization/ssn/metafile.yml | https://download.openmmlab.com/mmaction/localization/ssn/mmaction_reference/ssn_r50_450e_thumos14_rgb_ref/ssn_r50_450e_thumos14_rgb_ref_20201014-b6f48f68.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/c3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/c3d/c3d_sports1m_16x1x1_45e_ucf101_rgb/c3d_sports1m_16x1x1_45e_ucf101_rgb_20201021-26655025.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/c3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/c3d/c3d_sports1m_16x1x1_45e_ucf101_rgb/20201021_140429.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/c3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/c3d/c3d_sports1m_16x1x1_45e_ucf101_rgb/20201021_140429.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/csn/ipcsn_ig65m_pretrained_bnfrozen_r152_32x2x1_58e_kinetics400_rgb.py | https://download.openmmlab.com/mmaction/recognition/csn/ipcsn_from_scratch_r152_ig65m_20210617-c4b99d38.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/csn/ipcsn_sports1m_pretrained_bnfrozen_r152_32x2x1_58e_kinetics400_rgb.py | https://download.openmmlab.com/mmaction/recognition/csn/ipcsn_from_scratch_r152_sports1m_20210617-7a7cc5b9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/csn/ircsn_ig65m_pretrained_bnfrozen_r152_32x2x1_58e_kinetics400_rgb.py | https://download.openmmlab.com/mmaction/recognition/csn/ircsn_from_scratch_r152_ig65m_20200807-771c4135.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/csn/ircsn_ig65m_pretrained_bnfrozen_r50_32x2x1_58e_kinetics400_rgb.py | https://download.openmmlab.com/mmaction/recognition/csn/ircsn_from_scratch_r50_ig65m_20210617-ce545a37.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/csn/ircsn_ig65m_pretrained_r152_32x2x1_58e_kinetics400_rgb.py | https://download.openmmlab.com/mmaction/recognition/csn/ircsn_from_scratch_r152_ig65m_20200807-771c4135.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/csn/ircsn_sports1m_pretrained_bnfrozen_r152_32x2x1_58e_kinetics400_rgb.py | https://download.openmmlab.com/mmaction/recognition/csn/ircsn_from_scratch_r152_sports1m_20210617-bcc9c0dd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/csn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/csn/vmz/vmz_ircsn_sports1m_pretrained_r152_32x2x1_58e_kinetics400_rgb_20210617-b9b10241.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/csn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/csn/vmz/vmz_ircsn_ig65m_pretrained_r50_32x2x1_58e_kinetics400_rgb_20210617-86d33018.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/csn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/csn/vmz/vmz_ircsn_from_scratch_r152_32x2x1_180e_kinetics400_rgb_20210617-5c933ae1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/csn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/csn/vmz/vmz_ipcsn_sports1m_pretrained_r152_32x2x1_58e_kinetics400_rgb_20210617-3367437a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/csn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/csn/vmz/vmz_ipcsn_ig65m_pretrained_r152_32x2x1_58e_kinetics400_rgb_20210617-c3be9793.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/csn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/csn/vmz/vmz_ipcsn_from_scratch_r152_32x2x1_180e_kinetics400_rgb_20210617-d565828d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/csn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/csn/ircsn_ig65m_pretrained_r152_32x2x1_58e_kinetics400_rgb/ircsn_ig65m_pretrained_r152_32x2x1_58e_kinetics400_rgb_20200803-fc66ce8d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/csn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/csn/ircsn_ig65m_pretrained_bnfrozen_r152_32x2x1_58e_kinetics400_rgb/ircsn_ig65m_pretrained_bnfrozen_r152_32x2x1_58e_kinetics400_rgb_20200812-9037a758.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/csn/metafile.yml | 
https://download.openmmlab.com/mmaction/recognition/csn/ircsn_ig65m_pretrained_r152_32x2x1_58e_kinetics400_rgb/20200728_031952.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/csn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/csn/ircsn_ig65m_pretrained_bnfrozen_r152_32x2x1_58e_kinetics400_rgb/20200809_053132.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/csn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/csn/ircsn_ig65m_pretrained_r152_32x2x1_58e_kinetics400_rgb/20200728_031952.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/csn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/csn/ircsn_ig65m_pretrained_bnfrozen_r152_32x2x1_58e_kinetics400_rgb/20200809_053132.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_32x2x1_100e_kinetics400_rgb/20200614_060456.log | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_256p_32x2x1_100e_kinetics400_rgb/20200725_031555.log | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_video_32x2x1_100e_kinetics400_rgb/20200706_143014.log | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_dense_32x2x1_100e_kinetics400_rgb/20200616_230011.log | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_dense_256p_32x2x1_100e_kinetics400_rgb/20200725_031604.log | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_fast_32x2x1_100e_kinetics400_rgb/20200612_233836.log | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_fast_256p_32x2x1_100e_kinetics400_rgb/20200725_031457.log | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_nl_embedded_gaussian_r50_32x2x1_100e_kinetics400_rgb/20200813_034054.log | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_nl_gaussian_r50_32x2x1_100e_kinetics400_rgb/20200813_034909.log | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_nl_dot_product_r50_32x2x1_100e_kinetics400_rgb/20200814_044208.log | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_video_32x2x1_100e_kinetics400_rgb/i3d_r50_video_32x2x1_100e_kinetics400_rgb_20200826-e31c6f52.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/i3d/metafile.yml | 
https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_fast_32x2x1_100e_kinetics400_rgb/i3d_r50_fast_32x2x1_100e_kinetics400_rgb_20200612-000e4d2a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_fast_256p_32x2x1_100e_kinetics400_rgb/i3d_r50_fast_256p_32x2x1_100e_kinetics400_rgb_20200817-4e90d1d5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_dense_32x2x1_100e_kinetics400_rgb/i3d_r50_dense_32x2x1_100e_kinetics400_rgb_20200616-2bbb4361.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_dense_256p_32x2x1_100e_kinetics400_rgb/i3d_r50_dense_256p_32x2x1_100e_kinetics400_rgb_20200725-24eb54cc.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_32x2x1_100e_kinetics400_rgb/i3d_r50_32x2x1_100e_kinetics400_rgb_20200614-c25ef9a4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_256p_32x2x1_100e_kinetics400_rgb/i3d_r50_256p_32x2x1_100e_kinetics400_rgb_20200801-7d9f44de.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_nl_gaussian_r50_32x2x1_100e_kinetics400_rgb/i3d_nl_gaussian_r50_32x2x1_100e_kinetics400_rgb_20200815-17f84aa2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_nl_embedded_gaussian_r50_32x2x1_100e_kinetics400_rgb/i3d_nl_embedded_gaussian_r50_32x2x1_100e_kinetics400_rgb_20200813-6e6aef1b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_nl_dot_product_r50_32x2x1_100e_kinetics400_rgb/i3d_nl_dot_product_r50_32x2x1_100e_kinetics400_rgb_20200814-7c30d5bb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_video_32x2x1_100e_kinetics400_rgb/20200706_143014.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_fast_32x2x1_100e_kinetics400_rgb/20200612_233836.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_fast_256p_32x2x1_100e_kinetics400_rgb/20200725_031457.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_dense_32x2x1_100e_kinetics400_rgb/20200616_230011.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_dense_256p_32x2x1_100e_kinetics400_rgb/20200725_031604.log.json | log地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_32x2x1_100e_kinetics400_rgb/20200614_060456.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_r50_256p_32x2x1_100e_kinetics400_rgb/20200725_031555.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_nl_gaussian_r50_32x2x1_100e_kinetics400_rgb/20200813_034909.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_nl_embedded_gaussian_r50_32x2x1_100e_kinetics400_rgb/20200813_034054.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/i3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/i3d/i3d_nl_dot_product_r50_32x2x1_100e_kinetics400_rgb/20200814_044208.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/baseline/tsn_r50_1x1x8_100e_minikinetics_rgb_20201030.json | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/baseline/tsn_r50_1x1x8_100e_minikinetics_rgb_20201030.log | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/baseline/tsn_r50_1x1x8_100e_minikinetics_rgb_20201030-b4eaf92b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/googleimage/tsn_r50_1x1x8_100e_minikinetics_googleimage_rgb_20201030.json | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/googleimage/tsn_r50_1x1x8_100e_minikinetics_googleimage_rgb_20201030.log | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/googleimage/tsn_r50_1x1x8_100e_minikinetics_googleimage_rgb_20201030-23966b4b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/webimage/tsn_r50_1x1x8_100e_minikinetics_webimage_rgb_20201030.json | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/webimage/tsn_r50_1x1x8_100e_minikinetics_webimage_rgb_20201030.log | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/omnisource/metafile.yml | 
https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/webimage/tsn_r50_1x1x8_100e_minikinetics_webimage_rgb_20201030-66f5e046.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/insvideo/tsn_r50_1x1x8_100e_minikinetics_insvideo_rgb_20201030.json | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/insvideo/tsn_r50_1x1x8_100e_minikinetics_insvideo_rgb_20201030.log | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/insvideo/tsn_r50_1x1x8_100e_minikinetics_insvideo_rgb_20201030-011f984d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/kineticsraw/tsn_r50_1x1x8_100e_minikinetics_kineticsraw_rgb_20201030.json | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/kineticsraw/tsn_r50_1x1x8_100e_minikinetics_kineticsraw_rgb_20201030.log | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/kineticsraw/tsn_r50_1x1x8_100e_minikinetics_kineticsraw_rgb_20201030-59f5d064.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/omnisource/tsn_r50_1x1x8_100e_minikinetics_omnisource_rgb_20201030.json | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/omnisource/tsn_r50_1x1x8_100e_minikinetics_omnisource_rgb_20201030.log | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/tsn_r50_1x1x8_100e_minikinetics_rgb/omnisource/tsn_r50_1x1x8_100e_minikinetics_omnisource_rgb_20201030-0f56ef51.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/baseline/slowonly_r50_8x8x1_256e_minikinetics_rgb_20201030.json | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/baseline/slowonly_r50_8x8x1_256e_minikinetics_rgb_20201030.log | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/baseline/slowonly_r50_8x8x1_256e_minikinetics_rgb_20201030-168eb098.pth | 
权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/googleimage/slowonly_r50_8x8x1_256e_minikinetics_googleimage_rgb_20201030.json | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/googleimage/slowonly_r50_8x8x1_256e_minikinetics_googleimage_rgb_20201030.log | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/googleimage/slowonly_r50_8x8x1_256e_minikinetics_googleimage_rgb_20201030-7da6dfc3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/webimage/slowonly_r50_8x8x1_256e_minikinetics_webimage_rgb_20201030.json | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/webimage/slowonly_r50_8x8x1_256e_minikinetics_webimage_rgb_20201030.log | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/webimage/slowonly_r50_8x8x1_256e_minikinetics_webimage_rgb_20201030-c36616e9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/insvideo/slowonly_r50_8x8x1_256e_minikinetics_insvideo_rgb_20201030.json | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/insvideo/slowonly_r50_8x8x1_256e_minikinetics_insvideo_rgb_20201030.log | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/insvideo/slowonly_r50_8x8x1_256e_minikinetics_insvideo_rgb_20201030-e2890e8d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/kineticsraw/slowonly_r50_8x8x1_256e_minikinetics_kineticsraw_rgb_20201030.json | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/kineticsraw/slowonly_r50_8x8x1_256e_minikinetics_kineticsraw_rgb_20201030.log | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/kineticsraw/slowonly_r50_8x8x1_256e_minikinetics_kineticsraw_rgb_20201030-62974bac.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/omnisource/slowonly_r50_8x8x1_256e_minikinetics_omnisource_rgb_20201030.json | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/omnisource/slowonly_r50_8x8x1_256e_minikinetics_omnisource_rgb_20201030.log | 训练日志地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/omnisource/slowonly_r50_8x8x1_256e_minikinetics_rgb/omnisource/slowonly_r50_8x8x1_256e_minikinetics_omnisource_rgb_20201030-284cfd3b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/omni/tsn_imagenet_pretrained_r50_omni_1x1x3_kinetics400_rgb_20200926-54192355.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/omni/tsn_1G1B_pretrained_r50_omni_1x1x3_kinetics400_rgb_20200926-2863fed0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/omni/slowonly_r50_omni_4x16x1_kinetics400_rgb_20200926-51b1f7ea.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/omnisource/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/omni/slowonly_r101_omni_8x8x1_kinetics400_rgb_20200926-b5dbb701.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/r2plus1d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/r2plus1d/r2plus1d_r34_video_8x8x1_180e_kinetics400_rgb/r2plus1d_r34_video_8x8x1_180e_kinetics400_rgb_20200826-ab35a529.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/r2plus1d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/r2plus1d/r2plus1d_r34_8x8x1_180e_kinetics400_rgb/r2plus1d_r34_8x8x1_180e_kinetics400_rgb_20200618-3fce5629.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/r2plus1d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/r2plus1d/r2plus1d_r34_32x2x1_180e_kinetics400_rgb/r2plus1d_r34_32x2x1_180e_kinetics400_rgb_20200618-63462eb3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/r2plus1d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/r2plus1d/r2plus1d_r34_256p_8x8x1_180e_kinetics400_rgb/r2plus1d_r34_256p_8x8x1_180e_kinetics400_rgb_20200729-aa94765e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/r2plus1d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/r2plus1d/r2plus1d_r34_video_8x8x1_180e_kinetics400_rgb/20200724_201360.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/r2plus1d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/r2plus1d/r2plus1d_r34_8x8x1_180e_kinetics400_rgb/r21d_8x8.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/r2plus1d/metafile.yml | 
https://download.openmmlab.com/mmaction/recognition/r2plus1d/r2plus1d_r34_32x2x1_180e_kinetics400_rgb/r21d_32x2.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/r2plus1d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/r2plus1d/r2plus1d_r34_256p_8x8x1_180e_kinetics400_rgb/20200728_021421.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/r2plus1d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/r2plus1d/r2plus1d_r34_video_8x8x1_180e_kinetics400_rgb/20200724_201360.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/r2plus1d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/r2plus1d/r2plus1d_r34_8x8x1_180e_kinetics400_rgb/r2plus1d_r34_8x8_69.58_88.36.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/r2plus1d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/r2plus1d/r2plus1d_r34_32x2x1_180e_kinetics400_rgb/r2plus1d_r34_32x2_74.6_91.6.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/r2plus1d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/r2plus1d/r2plus1d_r34_256p_8x8x1_180e_kinetics400_rgb/20200728_021421.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/r2plus1d/r2plus1d_ucf101_rgb_1p.py | https://download.openmmlab.com/mmaction/recognition/r2plus1d/r2plus1d_r34_256p_8x8x1_180e_kinetics400_rgb/r2plus1d_r34_256p_8x8x1_180e_kinetics400_rgb_20200729-aa94765e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/r2plus1d/r2plus1d_ucf101_rgb_1p_perf.py | https://download.openmmlab.com/mmaction/recognition/r2plus1d/r2plus1d_r34_256p_8x8x1_180e_kinetics400_rgb/r2plus1d_r34_256p_8x8x1_180e_kinetics400_rgb_20200729-aa94765e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/r2plus1d/r2plus1d_ucf101_rgb_8p.py | https://download.openmmlab.com/mmaction/recognition/r2plus1d/r2plus1d_r34_256p_8x8x1_180e_kinetics400_rgb/r2plus1d_r34_256p_8x8x1_180e_kinetics400_rgb_20200729-aa94765e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/r2plus1d/r2plus1d_ucf101_rgb_8p_perf.py | https://download.openmmlab.com/mmaction/recognition/r2plus1d/r2plus1d_r34_256p_8x8x1_180e_kinetics400_rgb/r2plus1d_r34_256p_8x8x1_180e_kinetics400_rgb_20200729-aa94765e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_video_4x16x1_256e_kinetics400_rgb/slowfast_r50_video_4x16x1_256e_kinetics400_rgb_20200826-f85b90c5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_8x8x1_256e_kinetics400_rgb/slowfast_r50_8x8x1_256e_kinetics400_rgb_20200716-73547d2b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_4x16x1_256e_kinetics400_rgb/slowfast_r50_4x16x1_256e_kinetics400_rgb_20200704-bcde7ed7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowfast/metafile.yml | 
https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_256p_8x8x1_256e_kinetics400_rgb/slowfast_r50_256p_8x8x1_256e_kinetics400_rgb_20200810-863812c2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_256p_4x16x1_256e_kinetics400_rgb/slowfast_r50_256p_4x16x1_256e_kinetics400_rgb_20200728-145f1097.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_16x8x1_22e_sthv1_rgb/slowfast_r50_16x8x1_22e_sthv1_rgb_20210630-53355c16.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r152_4x16x1_256e_kinetics400_rgb/slowfast_r152_4x16x1_256e_kinetics400_rgb_20210122-bdeb6b87.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r101_8x8x1_256e_kinetics400_rgb/slowfast_r101_8x8x1_256e_kinetics400_rgb_20210218-0dd54025.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r101_4x16x1_256e_kinetics400_rgb/slowfast_r101_4x16x1_256e_kinetics400_rgb_20210218-d8b58813.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_video_4x16x1_256e_kinetics400_rgb/20200812_160237.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_8x8x1_256e_kinetics400_rgb/20200716_192653.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_4x16x1_256e_kinetics400_rgb/20200704_232901.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_256p_8x8x1_256e_kinetics400_rgb/20200731_151537.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_256p_4x16x1_256e_kinetics400_rgb/20200731_151706.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_16x8x1_22e_sthv1_rgb/20210606_225114.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r152_4x16x1_256e_kinetics400_rgb/20210122_131321.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r101_8x8x1_256e_kinetics400_rgb/20210218_121513.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowfast/metafile.yml | 
https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r101_4x16x1_256e_kinetics400_rgb/20210118_133528.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_video_4x16x1_256e_kinetics400_rgb/20200812_160237.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_8x8x1_256e_kinetics400_rgb/20200716_192653.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_4x16x1_256e_kinetics400_rgb/20200704_232901.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_256p_8x8x1_256e_kinetics400_rgb/20200731_151537.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_256p_4x16x1_256e_kinetics400_rgb/20200731_151706.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_16x8x1_22e_sthv1_rgb/20210606_225114.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r152_4x16x1_256e_kinetics400_rgb/20210122_131321.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r101_8x8x1_256e_kinetics400_rgb/20210218_121513.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowfast/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r101_4x16x1_256e_kinetics400_rgb/20210118_133528.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowfast/slowfast_r50_16x8x1_22e_sthv1_rgb.py | https://download.openmmlab.com/mmaction/recognition/slowfast/slowfast_r50_8x8x1_256e_kinetics400_rgb/slowfast_r50_8x8x1_256e_kinetics400_rgb_20200716-73547d2b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_video_8x8x1_256e_kinetics700_rgb/slowonly_r50_video_8x8x1_256e_kinetics700_rgb_20201015-9250f662.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_video_8x8x1_256e_kinetics600_rgb/slowonly_r50_video_8x8x1_256e_kinetics600_rgb_20201015-81e5153e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_video_320p_4x16x1_256e_kinetics400_rgb/slowonly_r50_video_320p_4x16x1_256e_kinetics400_rgb_20201014-c9cdc656.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | 
https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_8x8x1_256e_kinetics400_rgb/slowonly_r50_8x8x1_256e_kinetics400_rgb_20200703-a79c555a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_8x8x1_256e_kinetics400_flow/slowonly_r50_8x8x1_256e_kinetics400_flow_20200704-6b384243.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_4x16x1_256e_kinetics400_rgb/slowonly_r50_4x16x1_256e_kinetics400_rgb_20200704-a69556c6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_4x16x1_256e_kinetics400_flow/slowonly_r50_4x16x1_256e_kinetics400_flow_20200704-decb8568.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_256p_8x8x1_256e_kinetics400_rgb/slowonly_r50_256p_8x8x1_256e_kinetics400_rgb_20200820-75851a7d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_256p_4x16x1_256e_kinetics400_rgb/slowonly_r50_256p_4x16x1_256e_kinetics400_rgb_20200820-bea7701f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_nl_embedded_gaussian_r50_8x8x1_150e_kinetics400_rgb/slowonly_nl_embedded_gaussian_r50_8x8x1_150e_kinetics400_rgb_20210308-e8dd9e82.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_nl_embedded_gaussian_r50_4x16x1_150e_kinetics400_rgb/slowonly_nl_embedded_gaussian_r50_4x16x1_150e_kinetics400_rgb_20210308-0d6e5a69.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_kinetics_pretrained_r50_4x16x1_120e_gym99_flow/slowonly_kinetics_pretrained_r50_4x16x1_120e_gym99_flow_20201111-66ecdb3c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_k400_pretrained_r50_8x4x1_40e_ucf101_rgb/slowonly_k400_pretrained_r50_8x4x1_40e_ucf101_rgb_20210630-ee8c850f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_k400_pretrained_r50_8x4x1_40e_hmdb51_rgb/slowonly_k400_pretrained_r50_8x4x1_40e_hmdb51_rgb_20210630-cee5f725.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_8x8x1_64e_jester_rgb/slowonly_imagenet_pretrained_r50_8x8x1_64e_jester_rgb-b56a5389.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | 
https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_8x8x1_150e_kinetics400_rgb/slowonly_imagenet_pretrained_r50_8x8x1_150e_kinetics400_rgb_20200912-3f9ce182.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_8x4x1_64e_ucf101_rgb/slowonly_imagenet_pretrained_r50_8x4x1_64e_ucf101_rgb_20210630-181e1661.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_8x4x1_64e_sthv1_rgb/slowonly_imagenet_pretrained_r50_8x4x1_64e_sthv1_rgb_20210630-807a9a9a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_8x4x1_64e_hmdb51_rgb/slowonly_imagenet_pretrained_r50_8x4x1_64e_hmdb51_rgb_20210630-16faeb6a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_4x16x1_150e_kinetics400_rgb/slowonly_imagenet_pretrained_r50_4x16x1_150e_kinetics400_rgb_20200912-1e8fc736.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_4x16x1_120e_gym99_rgb/slowonly_imagenet_pretrained_r50_4x16x1_120e_gym99_rgb_20201111-a9c34b54.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/omni/slowonly_r50_omni_4x16x1_kinetics400_rgb_20200926-51b1f7ea.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/omni/slowonly_r101_without_omni_8x8x1_kinetics400_rgb_20200926-0c730aef.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/omni/slowonly_r101_omni_8x8x1_kinetics400_rgb_20200926-b5dbb701.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_video_8x8x1_256e_kinetics700_rgb/slowonly_r50_video_8x8x1_256e_kinetics700_rgb_20201015.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_video_8x8x1_256e_kinetics600_rgb/slowonly_r50_video_8x8x1_256e_kinetics600_rgb_20201015.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_video_320p_4x16x1_256e_kinetics400_rgb/slowonly_r50_video_320p_4x16x1_256e_kinetics400_rgb_20201014.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_8x8x1_256e_kinetics400_rgb/so_8x8.log | log地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_8x8x1_256e_kinetics400_flow/slowonly_r50_8x8x1_196e_kinetics400_flow_65.8_86.3.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_4x16x1_256e_kinetics400_rgb/so_4x16.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_4x16x1_256e_kinetics400_flow/slowonly_r50_4x16x1_256e_kinetics400_flow_61.8_83.6.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_256p_8x8x1_256e_kinetics400_rgb/20200817_003320.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_256p_4x16x1_256e_kinetics400_rgb/20200817_001411.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_nl_embedded_gaussian_r50_8x8x1_150e_kinetics400_rgb/20210308_212250.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_nl_embedded_gaussian_r50_4x16x1_150e_kinetics400_rgb/20210305_152630.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_kinetics_pretrained_r50_4x16x1_120e_gym99_flow/slowonly_kinetics_pretrained_r50_4x16x1_120e_gym99_flow_20201111.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_k400_pretrained_r50_8x4x1_40e_ucf101_rgb/20210606_010231.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_k400_pretrained_r50_8x4x1_40e_hmdb51_rgb/20210606_010153.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_8x8x1_64e_jester_rgb/slowonly_imagenet_pretrained_r50_8x8x1_64e_jester_rgb.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_8x8x1_150e_kinetics400_rgb/slowonly_imagenet_pretrained_r50_8x8x1_150e_kinetics400_rgb_20200912.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_8x4x1_64e_ucf101_rgb/20210605_213503.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_8x4x1_64e_sthv1_rgb/20210605_235410.log | log地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_8x4x1_64e_hmdb51_rgb/20210605_185256.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_4x16x1_150e_kinetics400_rgb/slowonly_imagenet_pretrained_r50_4x16x1_150e_kinetics400_rgb_20200912.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_4x16x1_120e_gym99_rgb/slowonly_imagenet_pretrained_r50_4x16x1_120e_gym99_rgb_20201111.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_video_8x8x1_256e_kinetics700_rgb/slowonly_r50_video_8x8x1_256e_kinetics700_rgb_20201015.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_video_8x8x1_256e_kinetics600_rgb/slowonly_r50_video_8x8x1_256e_kinetics600_rgb_20201015.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_video_320p_4x16x1_256e_kinetics400_rgb/slowonly_r50_video_320p_4x16x1_256e_kinetics400_rgb_20201014.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_8x8x1_256e_kinetics400_rgb/slowonly_r50_8x8_74.93_91.92.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_8x8x1_256e_kinetics400_flow/slowonly_r50_8x8x1_196e_kinetics400_flow_65.8_86.3.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_4x16x1_256e_kinetics400_rgb/slowonly_r50_4x16_73.02_90.77.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_4x16x1_256e_kinetics400_flow/slowonly_r50_4x16x1_256e_kinetics400_flow_61.8_83.6.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_256p_8x8x1_256e_kinetics400_rgb/20200817_003320.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_256p_4x16x1_256e_kinetics400_rgb/20200817_001411.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_nl_embedded_gaussian_r50_8x8x1_150e_kinetics400_rgb/20210308_212250.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | 
https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_nl_embedded_gaussian_r50_4x16x1_150e_kinetics400_rgb/20210305_152630.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_kinetics_pretrained_r50_4x16x1_120e_gym99_flow/slowonly_kinetics_pretrained_r50_4x16x1_120e_gym99_flow_20201111.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_k400_pretrained_r50_8x4x1_40e_ucf101_rgb/20210606_010231.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_k400_pretrained_r50_8x4x1_40e_hmdb51_rgb/20210606_010153.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_8x8x1_64e_jester_rgb/slowonly_imagenet_pretrained_r50_8x8x1_64e_jester_rgb.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_8x8x1_150e_kinetics400_rgb/slowonly_imagenet_pretrained_r50_8x8x1_150e_kinetics400_rgb_20200912.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_8x4x1_64e_ucf101_rgb/20210605_213503.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_8x4x1_64e_sthv1_rgb/20210605_235410.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_8x4x1_64e_hmdb51_rgb/20210605_185256.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_4x16x1_150e_kinetics400_rgb/slowonly_imagenet_pretrained_r50_4x16x1_150e_kinetics400_rgb_20200912.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/metafile.yml | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_imagenet_pretrained_r50_4x16x1_120e_gym99_rgb/slowonly_imagenet_pretrained_r50_4x16x1_120e_gym99_rgb_20201111.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/slowonly_k400_pretrained_r50_4x16x1_120e_gym99_flow.py | https://download.openmmlab.com/mmaction/recognition/slowonly/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/slowonly/slowonly_k400_pretrained_r50_8x4x1_40e_ucf101_rgb.py | https://download.openmmlab.com/mmaction/recognition/slowonly/slowonly_r50_8x8x1_256e_kinetics400_rgb/slowonly_r50_8x8x1_256e_kinetics400_rgb_20200703-a79c555a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tanet/metafile.yml | 
https://download.openmmlab.com/mmaction/recognition/tanet/tanet_r50_dense_1x1x8_100e_kinetics400_rgb/tanet_r50_dense_1x1x8_100e_kinetics400_rgb_20210219-032c8e94.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tanet/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tanet/tanet_r50_1x1x8_50e_sthv1_rgb/tanet_r50_1x1x8_50e_sthv1_rgb_20210630-f4a48609.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tanet/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tanet/tanet_r50_1x1x16_50e_sthv1_rgb/tanet_r50_1x1x16_50e_sthv1_rgb_20210630-7c19303c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tanet/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tanet/tanet_r50_dense_1x1x8_100e_kinetics400_rgb/tanet_r50_dense_1x1x8_100e_kinetics400_rgb_20210219.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tanet/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tanet/tanet_r50_1x1x8_50e_sthv1_rgb/20210606_205006.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tanet/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tanet/tanet_r50_1x1x16_50e_sthv1_rgb/20210607_155335.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tanet/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tanet/tanet_r50_dense_1x1x8_100e_kinetics400_rgb/tanet_r50_dense_1x1x8_100e_kinetics400_rgb_20210219.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tanet/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tanet/tanet_r50_1x1x8_50e_sthv1_rgb/20210606_205006.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tanet/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tanet/tanet_r50_1x1x16_50e_sthv1_rgb/20210607_155335.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/timesformer/timesformer_divST_8x32x1_15e_kinetics400_rgb.py | https://download.openmmlab.com/mmaction/recognition/timesformer/vit_base_patch16_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/timesformer/timesformer_jointST_8x32x1_15e_kinetics400_rgb.py | https://download.openmmlab.com/mmaction/recognition/timesformer/vit_base_patch16_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/timesformer/timesformer_spaceOnly_8x32x1_15e_kinetics400_rgb.py | https://download.openmmlab.com/mmaction/recognition/timesformer/vit_base_patch16_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tin/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tin/tin_tsm_finetune_r50_1x1x8_50e_kinetics400_rgb/tin_tsm_finetune_r50_1x1x8_50e_kinetics400_rgb_20200810-4a146a70.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tin/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tin/tin_r50_1x1x8_40e_sthv2_rgb/tin_r50_1x1x8_40e_sthv2_rgb_20200912-b27a7337.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tin/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tin/tin_r50_1x1x8_40e_sthv1_rgb/tin_r50_1x1x8_40e_sthv1_rgb_20200729-4a33db86.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tin/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tin/tin_tsm_finetune_r50_1x1x8_50e_kinetics400_rgb/20200809_142447.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tin/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tin/tin_r50_1x1x8_40e_sthv2_rgb/20200912_225451.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tin/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tin/tin_r50_1x1x8_40e_sthv1_rgb/20200729_034132.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tin/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tin/tin_tsm_finetune_r50_1x1x8_50e_kinetics400_rgb/20200809_142447.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tin/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tin/tin_r50_1x1x8_40e_sthv2_rgb/20200912_225451.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tin/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tin/tin_r50_1x1x8_40e_sthv1_rgb/20200729_034132.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tin/tin_tsm_finetune_r50_1x1x8_50e_kinetics400_rgb.py | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_50e_kinetics400_rgb/tsm_r50_1x1x8_50e_kinetics400_rgb_20200607-af7fb746.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tpn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tpn/tpn_tsm_r50_1x1x8_150e_sthv1_rgb/tpn_tsm_r50_1x1x8_150e_sthv1_rgb_20210311-28de4cd5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tpn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tpn/tpn_slowonly_r50_8x8x1_150e_kinetics_rgb/tpn_slowonly_r50_8x8x1_150e_kinetics_rgb_20200910-b796d7a0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tpn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tpn/tpn_imagenet_pretrained_slowonly_r50_8x8x1_150e_kinetics_rgb/tpn_imagenet_pretrained_slowonly_r50_8x8x1_150e_kinetics_rgb_20200923-52629684.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tpn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tpn/tpn_tsm_r50_1x1x8_150e_sthv1_rgb/20210311_162636.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tpn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tpn/tpn_slowonly_r50_8x8x1_150e_kinetics_rgb/20200910_134330.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tpn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tpn/tpn_imagenet_pretrained_slowonly_r50_8x8x1_150e_kinetics_rgb/20200923_151919.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tpn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tpn/tpn_tsm_r50_1x1x8_150e_sthv1_rgb/20210311_162636.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tpn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tpn/tpn_slowonly_r50_8x8x1_150e_kinetics_rgb/20200910_134330.log.json | log地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tpn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tpn/tpn_imagenet_pretrained_slowonly_r50_8x8x1_150e_kinetics_rgb/20200923_151919.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/trn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/trn/trn_r50_1x1x8_50e_sthv2_rgb/trn_r50_1x1x8_50e_sthv2_rgb_20210401-773eca7b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/trn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/trn/trn_r50_1x1x8_50e_sthv1_rgb/trn_r50_1x1x8_50e_sthv1_rgb_20210401-163704a8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/trn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/trn/trn_r50_1x1x8_50e_sthv2_rgb/20210326_103951.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/trn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/trn/trn_r50_1x1x8_50e_sthv1_rgb/20210326_103948.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/trn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/trn/trn_r50_1x1x8_50e_sthv2_rgb/20210326_103951.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/trn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/trn/trn_r50_1x1x8_50e_sthv1_rgb/20210326_103948.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_video_1x1x8_50e_diving48_rgb/tsm_r50_video_1x1x8_50e_diving48_rgb_20210426-aba5aa3d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_video_1x1x8_100e_kinetics400_rgb/tsm_r50_video_1x1x8_100e_kinetics400_rgb_20200702-a77f4328.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_video_1x1x16_50e_diving48_rgb/tsm_r50_video_1x1x16_50e_diving48_rgb_20210426-aa9631c0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_randaugment_1x1x8_50e_sthv1_rgb/tsm_r50_randaugment_1x1x8_50e_sthv1_rgb_20210324-481268d9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_mixup_1x1x8_50e_sthv1_rgb/tsm_r50_mixup_1x1x8_50e_sthv1_rgb-9eca48e5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_gpu_normalize_1x1x8_50e_kinetics400_rgb/tsm_r50_gpu_normalize_1x1x8_50e_kinetics400_rgb_20210219-bf96e6cc.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_flip_randaugment_1x1x8_50e_sthv1_rgb/tsm_r50_flip_randaugment_1x1x8_50e_sthv1_rgb_20210324-76937692.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | 
https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_flip_1x1x8_50e_sthv1_rgb/tsm_r50_flip_1x1x8_50e_sthv1_rgb_20210203-12596f16.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_dense_1x1x8_50e_kinetics400_rgb/tsm_r50_dense_1x1x8_50e_kinetics400_rgb_20210701-a54ff3d3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_dense_1x1x8_100e_kinetics400_rgb/tsm_r50_dense_1x1x8_100e_kinetics400_rgb_20210701-e3e5e97f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_cutmix_1x1x8_50e_sthv1_rgb/tsm_r50_cutmix_1x1x8_50e_sthv1_rgb-34934615.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_256p_1x1x8_50e_kinetics400_rgb/tsm_r50_256p_1x1x8_50e_kinetics400_rgb_20200726-020785e2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_256p_1x1x16_50e_kinetics400_rgb/tsm_r50_256p_1x1x16_50e_kinetics400_rgb_20201010-85645c2a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_50e_sthv2_rgb/tsm_r50_256h_1x1x8_50e_sthv2_rgb_20210401-df97f3e1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_50e_sthv2_rgb/tsm_r50_1x1x8_50e_sthv2_rgb_20200912-033c4ac6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_50e_sthv1_rgb/tsm_r50_1x1x8_50e_sthv1_rgb_20210203-01dce462.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_50e_kinetics400_rgb/tsm_r50_1x1x8_50e_kinetics400_rgb_20210701-68d582b4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_50e_kinetics400_rgb/tsm_r50_1x1x8_50e_kinetics400_rgb_20200607-af7fb746.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_50e_jester_rgb/tsm_r50_1x1x8_50e_jester_rgb-c799267e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_100e_kinetics400_rgb/tsm_r50_1x1x8_100e_kinetics400_rgb_20210701-7ff22268.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x16_50e_sthv2_rgb/tsm_r50_256h_1x1x16_50e_sthv2_rgb_20210331-0a45549c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | 
https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x16_50e_sthv2_rgb/tsm_r50_1x1x16_50e_sthv2_rgb_20201010-16469c6f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x16_50e_sthv1_rgb/tsm_r50_1x1x16_50e_sthv1_rgb_20201010-17fa49f6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x16_50e_kinetics400_rgb/tsm_r50_340x256_1x1x16_50e_kinetics400_rgb_20201011-2f27f229.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x16_50e_kinetics400_rgb/tsm_r50_1x1x16_50e_kinetics400_rgb_20210701-7c0c5d54.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r101_1x1x8_50e_sthv2_rgb/tsm_r101_1x1x8_50e_sthv2_rgb_20201010-98cdedb8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r101_1x1x8_50e_sthv1_rgb/tsm_r101_1x1x8_50e_sthv1_rgb_20201010-43fedf2e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_nl_gaussian_r50_1x1x8_50e_kinetics400_rgb/tsm_nl_gaussian_r50_1x1x8_50e_kinetics400_rgb_20200816-b93fd297.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_nl_embedded_gaussian_r50_1x1x8_50e_kinetics400_rgb/tsm_nl_embedded_gaussian_r50_1x1x8_50e_kinetics400_rgb_20200724-f00f1336.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_nl_dot_product_r50_1x1x8_50e_kinetics400_rgb/tsm_nl_dot_product_r50_1x1x8_50e_kinetics400_rgb_20200724-d8ad84d2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_mobilenetv2_dense_1x1x8_100e_kinetics400_rgb/tsm_mobilenetv2_dense_320p_1x1x8_100e_kinetics400_rgb_20210202-61135809.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_k400_pretrained_r50_1x1x8_25e_ucf101_rgb/tsm_k400_pretrained_r50_1x1x8_25e_ucf101_rgb_20210630-1fae312b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_k400_pretrained_r50_1x1x8_25e_hmdb51_rgb/tsm_k400_pretrained_r50_1x1x8_25e_hmdb51_rgb_20210630-10c74ee5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_k400_pretrained_r50_1x1x16_25e_ucf101_rgb/tsm_k400_pretrained_r50_1x1x16_25e_ucf101_rgb_20210630-8df9c358.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_k400_pretrained_r50_1x1x16_25e_hmdb51_rgb/tsm_k400_pretrained_r50_1x1x16_25e_hmdb51_rgb_20210630-4785548e.pth | 
权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_video_1x1x8_50e_diving48_rgb/20210426_012424.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_video_1x1x8_100e_kinetics400_rgb/tsm_r50_video_2d_1x1x8_50e_kinetics400_rgb.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_video_1x1x16_50e_diving48_rgb/20210426_012823.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_randaugment_1x1x8_50e_sthv1_rgb/tsm_r50_randaugment_1x1x8_50e_sthv1_rgb.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_mixup_1x1x8_50e_sthv1_rgb/tsm_r50_mixup_1x1x8_50e_sthv1_rgb.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_gpu_normalize_1x1x8_50e_kinetics400_rgb/tsm_r50_gpu_normalize_1x1x8_50e_kinetics400_rgb_20210219.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_flip_randaugment_1x1x8_50e_sthv1_rgb/tsm_r50_flip_randaugment_1x1x8_50e_sthv1_rgb.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_flip_1x1x8_50e_sthv1_rgb/20210203_145829.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_dense_1x1x8_50e_kinetics400_rgb/20210617_103245.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_dense_1x1x8_100e_kinetics400_rgb/20210613_034931.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_cutmix_1x1x8_50e_sthv1_rgb/tsm_r50_cutmix_1x1x8_50e_sthv1_rgb.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_256p_1x1x8_50e_kinetics400_rgb/20200725_031623.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_256p_1x1x16_50e_kinetics400_rgb/20201010_224825.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_50e_sthv2_rgb/20210401_143656.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_50e_sthv2_rgb/20200912_140737.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | 
https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_50e_sthv1_rgb/20210203_150227.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_50e_kinetics400_rgb/20210616_021451.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_50e_kinetics400_rgb/20200607_211800.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_50e_jester_rgb/tsm_r50_1x1x8_50e_jester_rgb.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_100e_kinetics400_rgb/20210617_103543.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x16_50e_sthv2_rgb/20210331_134458.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x16_50e_sthv2_rgb/20201010_224215.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x16_50e_sthv1_rgb/20201010_221240.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x16_50e_kinetics400_rgb/20210621_115844.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x16_50e_kinetics400_rgb/20201011_205356.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r101_1x1x8_50e_sthv2_rgb/20201010_224100.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r101_1x1x8_50e_sthv1_rgb/20201010_224055.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_nl_gaussian_r50_1x1x8_50e_kinetics400_rgb/20200815_210253.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_nl_embedded_gaussian_r50_1x1x8_50e_kinetics400_rgb/20200724_120023.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_nl_dot_product_r50_1x1x8_50e_kinetics400_rgb/20200723_220442.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_mobilenetv2_dense_1x1x8_100e_kinetics400_rgb/20210129_024936.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | 
https://download.openmmlab.com/mmaction/recognition/tsm/tsm_k400_pretrained_r50_1x1x8_25e_ucf101_rgb/20210605_182720.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_k400_pretrained_r50_1x1x8_25e_hmdb51_rgb/20210605_182554.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_k400_pretrained_r50_1x1x16_25e_ucf101_rgb/20210605_182720.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_k400_pretrained_r50_1x1x16_25e_hmdb51_rgb/20210605_182505.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_video_1x1x8_50e_diving48_rgb/20210426_012424.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_video_1x1x8_100e_kinetics400_rgb/tsm_r50_video_2d_1x1x8_50e_kinetics400_rgb.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_video_1x1x16_50e_diving48_rgb/20210426_012823.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_randaugment_1x1x8_50e_sthv1_rgb/tsm_r50_randaugment_1x1x8_50e_sthv1_rgb.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_mixup_1x1x8_50e_sthv1_rgb/tsm_r50_mixup_1x1x8_50e_sthv1_rgb.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_gpu_normalize_1x1x8_50e_kinetics400_rgb/tsm_r50_gpu_normalize_1x1x8_50e_kinetics400_rgb_20210219.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_flip_randaugment_1x1x8_50e_sthv1_rgb/tsm_r50_flip_randaugment_1x1x8_50e_sthv1_rgb.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_flip_1x1x8_50e_sthv1_rgb/20210203_145829.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_dense_1x1x8_50e_kinetics400_rgb/20210617_103245.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_dense_1x1x8_100e_kinetics400_rgb/20210613_034931.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_cutmix_1x1x8_50e_sthv1_rgb/tsm_r50_cutmix_1x1x8_50e_sthv1_rgb.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | 
https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_256p_1x1x8_50e_kinetics400_rgb/20200725_031623.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_256p_1x1x16_50e_kinetics400_rgb/20201010_224825.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_50e_sthv2_rgb/20210401_143656.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_50e_sthv2_rgb/20200912_140737.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_50e_sthv1_rgb/20210203_150227.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_50e_kinetics400_rgb/20210616_021451.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_50e_kinetics400_rgb/20200607_211800.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_50e_jester_rgb/tsm_r50_1x1x8_50e_jester_rgb.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x8_100e_kinetics400_rgb/20210617_103543.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x16_50e_sthv2_rgb/20210331_134458.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x16_50e_sthv2_rgb/20201010_224215.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x16_50e_sthv1_rgb/20201010_221240.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x16_50e_kinetics400_rgb/20210621_115844.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_1x1x16_50e_kinetics400_rgb/20201011_205356.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r101_1x1x8_50e_sthv2_rgb/20201010_224100.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r101_1x1x8_50e_sthv1_rgb/20201010_224055.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | 
https://download.openmmlab.com/mmaction/recognition/tsm/tsm_nl_gaussian_r50_1x1x8_50e_kinetics400_rgb/20200815_210253.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_nl_embedded_gaussian_r50_1x1x8_50e_kinetics400_rgb/20200724_120023.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_nl_dot_product_r50_1x1x8_50e_kinetics400_rgb/20200723_220442.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_mobilenetv2_dense_1x1x8_100e_kinetics400_rgb/20210129_024936.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_k400_pretrained_r50_1x1x8_25e_ucf101_rgb/20210605_182720.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_k400_pretrained_r50_1x1x8_25e_hmdb51_rgb/20210605_182554.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_k400_pretrained_r50_1x1x16_25e_ucf101_rgb/20210605_182720.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_k400_pretrained_r50_1x1x16_25e_hmdb51_rgb/20210605_182505.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/tsm_k400_pretrained_r50_1x1x16_25e_hmdb51_rgb.py | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_256p_1x1x16_50e_kinetics400_rgb/tsm_r50_256p_1x1x16_50e_kinetics400_rgb_20201010-85645c2a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/tsm_k400_pretrained_r50_1x1x16_25e_ucf101_rgb.py | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_256p_1x1x16_50e_kinetics400_rgb/tsm_r50_256p_1x1x16_50e_kinetics400_rgb_20201010-85645c2a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/tsm_k400_pretrained_r50_1x1x8_25e_hmdb51_rgb.py | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_256p_1x1x8_50e_kinetics400_rgb/tsm_r50_256p_1x1x8_50e_kinetics400_rgb_20200726-020785e2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsm/tsm_k400_pretrained_r50_1x1x8_25e_ucf101_rgb.py | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_256p_1x1x8_50e_kinetics400_rgb/tsm_r50_256p_1x1x8_50e_kinetics400_rgb_20200726-020785e2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/custom_backbones/tsn_rn101_32x4d_320p_1x1x3_100e_kinetics400_rgb.py | https://download.openmmlab.com/mmclassification/v0/resnext/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_dense_1x1x8_100e_kinetics400_rgb/tsn_r50_video_dense_1x1x8_100e_kinetics400_rgb_20200703-0f19175f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | 
https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_320p_1x1x3_100e_kinetics400_rgb/tsn_r50_video_320p_1x1x3_100e_kinetics400_rgb_20201014-5ae1ee79.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_1x1x8_100e_kinetics700_rgb/tsn_r50_video_1x1x8_100e_kinetics700_rgb_20201015-e381a6c7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_1x1x8_100e_kinetics400_rgb/tsn_r50_video_1x1x8_100e_kinetics400_rgb_20200702-568cde33.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_1x1x8_100e_diving48_rgb/tsn_r50_video_1x1x8_100e_diving48_rgb_20210426-6dde0185.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_1x1x16_100e_diving48_rgb/tsn_r50_video_1x1x16_100e_diving48_rgb_20210426-63c5f2f7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_dense_1x1x8_100e_kinetics400_rgb/tsn_r50_dense_1x1x8_100e_kinetics400_rgb_20200606-e925e6e3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_dense_1x1x5_100e_kinetics400_rgb/tsn_r50_dense_1x1x5_100e_kinetics400_rgb_20200627-a063165f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x8_50e_activitynet_video_rgb/tsn_r50_320p_1x1x8_50e_activitynet_video_rgb_20210301-7f8da0c6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x8_50e_activitynet_clip_rgb/tsn_r50_320p_1x1x8_50e_activitynet_clip_rgb_20210301-c0f04a7e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x8_150e_activitynet_video_flow/tsn_r50_320p_1x1x8_150e_activitynet_video_flow_20200804-13313f52.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x8_150e_activitynet_clip_flow/tsn_r50_320p_1x1x8_150e_activitynet_clip_flow_20200804-8622cf38.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x8_110e_kinetics400_flow/tsn_r50_320p_1x1x8_110e_kinetics400_flow_20200705-1f39486b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x8_100e_kinetics400_rgb/tsn_r50_320p_1x1x8_100e_kinetics400_rgb_20200702-ef80e3d7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | 
https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x3_110e_kinetics400_flow/tsn_r50_320p_1x1x3_110e_kinetics400_flow_20200705-3036bab6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x3_100e_kinetics400_rgb/tsn_r50_320p_1x1x3_100e_kinetics400_rgb_20200702-cc665e2a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_256p_1x1x8_100e_kinetics400_rgb/tsn_r50_256p_1x1x8_100e_kinetics400_rgb_20200817-883baf16.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_256p_1x1x3_100e_kinetics400_rgb/tsn_r50_256p_1x1x3_100e_kinetics400_rgb_20200725-22592236.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x8_50e_sthv2_rgb/tsn_r50_1x1x8_50e_sthv2_rgb_20200915-f3b381a5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x8_50e_sthv1_rgb/tsn_r50_1x1x8_50e_sthv1_rgb_20200618-061b9195.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x8_50e_hmdb51_mit_rgb/tsn_r50_1x1x8_50e_hmdb51_mit_rgb_20201123-01526d41.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x8_50e_hmdb51_kinetics400_rgb/tsn_r50_1x1x8_50e_hmdb51_kinetics400_rgb_20201123-7f84701b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x8_50e_hmdb51_imagenet_rgb/tsn_r50_1x1x8_50e_hmdb51_imagenet_rgb_20201123-ce6c27ed.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x6_100e_mit_rgb/tsn_r50_1x1x6_100e_mit_rgb_20200618-d512ab1b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x3_75e_ucf101_rgb/tsn_r50_1x1x3_75e_ucf101_rgb_20201023-d85ab600.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x3_100e_kinetics400_rgb/tsn_r50_1x1x3_100e_kinetics400_rgb_20200614-e508be42.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x16_50e_sthv2_rgb/tsn_r50_1x1x16_50e_sthv2_rgb_20200917-80bc3611.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x16_50e_sthv1_rgb/tsn_r50_1x1x16_50e_sthv1_rgb_20200614-7e2fe4f1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | 
https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r101_1x1x5_50e_mmit_rgb/tsn_r101_1x1x5_50e_mmit_rgb_20200618-642f450d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/omni/tsn_imagenet_pretrained_r50_omni_1x1x3_kinetics400_rgb_20200926-54192355.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/omni/tsn_1G1B_pretrained_r50_without_omni_1x1x3_kinetics400_rgb_20200926-c133dd49.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/omni/tsn_1G1B_pretrained_r50_omni_1x1x3_kinetics400_rgb_20200926-2863fed0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/hvu/scene/tsn_r18_1x1x8_100e_hvu_scene_rgb_20201027-00e5748d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/hvu/object/tsn_r18_1x1x8_100e_hvu_object_rgb_20201102-24a22f30.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/hvu/event/tsn_r18_1x1x8_100e_hvu_event_rgb_20201027-dea8cd71.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/hvu/concept/tsn_r18_1x1x8_100e_hvu_concept_rgb_20201027-fc1dd8e3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/hvu/attribute/tsn_r18_1x1x8_100e_hvu_attribute_rgb_20201027-0b3b49d2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/hvu/action/tsn_r18_1x1x8_100e_hvu_action_rgb_20201027-011b282b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/custom_backbones/tsn_swin_transformer_video_320p_1x1x3_100e_kinetics400_rgb/tsn_swin_transformer_video_320p_1x1x3_100e_kinetics400_rgb-805380f6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/custom_backbones/tsn_rn101_32x4d_320p_1x1x3_100e_kinetics400_rgb-16a8b561.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/custom_backbones/tsn_dense161_320p_1x1x3_100e_kinetics400_rgb/tsn_dense161_320p_1x1x3_100e_kinetics400_rgb-cbe85332.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_dense_1x1x8_100e_kinetics400_rgb/tsn_r50_video_2d_1x1x8_dense_100e_kinetics400_rgb.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_320p_1x1x3_100e_kinetics400_rgb/tsn_r50_video_320p_1x1x3_100e_kinetics400_rgb_20201014.log | log地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_1x1x8_100e_kinetics700_rgb/tsn_r50_video_1x1x8_100e_kinetics700_rgb_20201015.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_1x1x8_100e_kinetics600_rgb/tsn_r50_video_1x1x8_100e_kinetics600_rgb_20201015.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_1x1x8_100e_kinetics400_rgb/tsn_r50_video_2d_1x1x8_100e_kinetics400_rgb.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_1x1x8_100e_diving48_rgb/20210426_014138.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_1x1x16_100e_diving48_rgb/20210426_014103.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_dense_1x1x8_100e_kinetics400_rgb/20200606_003901.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_dense_1x1x5_100e_kinetics400_rgb/20200627_105310.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x8_50e_activitynet_video_rgb/20210228_223327.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x8_50e_activitynet_clip_rgb/20210217_181313.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x8_150e_activitynet_video_flow/tsn_r50_320p_1x1x8_150e_activitynet_video_flow_20200804.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x8_150e_activitynet_clip_flow/tsn_r50_320p_1x1x8_150e_activitynet_clip_flow_20200804.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x8_110e_kinetics400_flow/tsn_r50_f8_kinetics400_flow_shortedge_57.8_81.0.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x8_100e_kinetics400_rgb/tsn_r50_f8_kinetics400_shortedge_72.4_90.6.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x3_110e_kinetics400_flow/tsn_r50_f3_kinetics400_flow_shortedge_55.7_79.9.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | 
https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x3_100e_kinetics400_rgb/tsn_r50_f3_kinetics400_shortedge_70.9_89.5.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_256p_1x1x8_100e_kinetics400_rgb/20200815_173413.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_256p_1x1x3_100e_kinetics400_rgb/20200725_031325.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x8_50e_sthv2_rgb/20200915_114139.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x8_50e_sthv1_rgb/tsn_sthv1.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x8_50e_hmdb51_mit_rgb/20201112_170135.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x8_50e_hmdb51_kinetics400_rgb/20201108_190805.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x8_50e_hmdb51_imagenet_rgb/20201025_231108.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x6_100e_mit_rgb/tsn_mit.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x3_75e_ucf101_rgb/tsn_r50_1x1x3_75e_ucf101_rgb_20201023.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x3_100e_kinetics400_rgb/20200614_063526.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x16_50e_sthv2_rgb/20200917_105855.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x16_50e_sthv1_rgb/20200614_211932.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r101_1x1x5_50e_mmit_rgb/tsn_mmit.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/hvu/scene/tsn_r18_1x1x8_100e_hvu_scene_rgb_20201027.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/hvu/object/tsn_r18_1x1x8_100e_hvu_object_rgb_20201027.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/hvu/event/tsn_r18_1x1x8_100e_hvu_event_rgb_20201027.log | log地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/hvu/concept/tsn_r18_1x1x8_100e_hvu_concept_rgb_20201027.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/hvu/attribute/tsn_r18_1x1x8_100e_hvu_attribute_rgb_20201027.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/hvu/action/tsn_r18_1x1x8_100e_hvu_action_rgb_20201027.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/custom_backbones/tsn_swin_transformer_video_320p_1x1x3_100e_kinetics400_rgb/tsn_swin_transformer_video_320p_1x1x3_100e_kinetics400_rgb.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/custom_backbones/tsn_rn101_32x4d_320p_1x1x3_100e_kinetics400_rgb.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/custom_backbones/tsn_dense161_320p_1x1x3_100e_kinetics400_rgb/tsn_dense161_320p_1x1x3_100e_kinetics400_rgb.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_1x1x8_100e_kinetics600_rgb/tsn_r50_video_1x1x8_100e_kinetics600_rgb_20201015-4db3c461.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_dense_1x1x8_100e_kinetics400_rgb/tsn_r50_video_2d_1x1x8_dense_100e_kinetics400_rgb.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_320p_1x1x3_100e_kinetics400_rgb/tsn_r50_video_320p_1x1x3_100e_kinetics400_rgb_20201014.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_1x1x8_100e_kinetics700_rgb/tsn_r50_video_1x1x8_100e_kinetics700_rgb_20201015.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_1x1x8_100e_kinetics600_rgb/tsn_r50_video_1x1x8_100e_kinetics600_rgb_20201015.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_1x1x8_100e_kinetics400_rgb/tsn_r50_video_2d_1x1x8_100e_kinetics400_rgb.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_1x1x8_100e_diving48_rgb/20210426_014138.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_video_1x1x16_100e_diving48_rgb/20210426_014103.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | 
https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_dense_1x1x8_100e_kinetics400_rgb/20200606_003901.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_dense_1x1x5_100e_kinetics400_rgb/20200627_105310.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x8_50e_activitynet_video_rgb/20210228_223327.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x8_50e_activitynet_clip_rgb/20210217_181313.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x8_150e_activitynet_video_flow/tsn_r50_320p_1x1x8_150e_activitynet_video_flow_20200804.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x8_150e_activitynet_clip_flow/tsn_r50_320p_1x1x8_150e_activitynet_clip_flow_20200804.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x8_110e_kinetics400_flow/tsn_r50_f8_kinetics400_flow_shortedge_57.8_81.0.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x8_100e_kinetics400_rgb/tsn_r50_f8_kinetics400_shortedge_72.4_90.6.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x3_110e_kinetics400_flow/tsn_r50_f3_kinetics400_flow_shortedge_55.7_79.9.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_320p_1x1x3_100e_kinetics400_rgb/tsn_r50_f3_kinetics400_shortedge_70.9_89.5.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_256p_1x1x8_100e_kinetics400_rgb/20200815_173413.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_256p_1x1x3_100e_kinetics400_rgb/20200725_031325.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x8_50e_sthv2_rgb/20200915_114139.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x8_50e_sthv1_rgb/tsn_r50_f8_sthv1_18.1_45.0.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x8_50e_hmdb51_mit_rgb/20201112_170135.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | 
https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x8_50e_hmdb51_kinetics400_rgb/20201108_190805.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x8_50e_hmdb51_imagenet_rgb/20201025_231108.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x6_100e_mit_rgb/tsn_r50_f6_mit_26.8_51.6.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x3_75e_ucf101_rgb/tsn_r50_1x1x3_75e_ucf101_rgb_20201023.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x3_100e_kinetics400_rgb/20200614_063526.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x16_50e_sthv2_rgb/20200917_105855.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x16_50e_sthv1_rgb/20200614_211932.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r101_1x1x5_50e_mmit_rgb/tsn_r101_f6_mmit_61.1.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/hvu/scene/tsn_r18_1x1x8_100e_hvu_scene_rgb_20201027.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/hvu/object/tsn_r18_1x1x8_100e_hvu_object_rgb_20201027.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/hvu/event/tsn_r18_1x1x8_100e_hvu_event_rgb_20201027.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/hvu/concept/tsn_r18_1x1x8_100e_hvu_concept_rgb_20201027.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/hvu/attribute/tsn_r18_1x1x8_100e_hvu_attribute_rgb_20201027.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/hvu/action/tsn_r18_1x1x8_100e_hvu_action_rgb_20201027.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/custom_backbones/tsn_swin_transformer_video_320p_1x1x3_100e_kinetics400_rgb/tsn_swin_transformer_video_320p_1x1x3_100e_kinetics400_rgb.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/custom_backbones/tsn_rn101_32x4d_320p_1x1x3_100e_kinetics400_rgb.json | log地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/metafile.yml | https://download.openmmlab.com/mmaction/recognition/tsn/custom_backbones/tsn_dense161_320p_1x1x3_100e_kinetics400_rgb/tsn_dense161_320p_1x1x3_100e_kinetics400_rgb.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/tsn_r50_1x1x8_50e_hmdb51_kinetics400_rgb.py | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_256p_1x1x8_100e_kinetics400_rgb/tsn_r50_256p_1x1x8_100e_kinetics400_rgb_20200817-883baf16.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/tsn_r50_1x1x8_50e_hmdb51_mit_rgb.py | https://download.openmmlab.com/mmaction/recognition/tsn/tsn_r50_1x1x6_100e_mit_rgb/tsn_r50_1x1x6_100e_mit_rgb_20200618-d512ab1b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/tsn_r50_320p_1x1x8_150e_activitynet_clip_flow.py | https://download.openmmlab.com/mmaction/recognition/tsn/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/tsn_r50_320p_1x1x8_150e_activitynet_video_flow.py | https://download.openmmlab.com/mmaction/recognition/tsn/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/tsn_r50_320p_1x1x8_50e_activitynet_clip_rgb.py | https://download.openmmlab.com/mmaction/recognition/tsn/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/tsn/tsn_r50_320p_1x1x8_50e_activitynet_video_rgb.py | https://download.openmmlab.com/mmaction/recognition/tsn/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/x3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/x3d/facebook/x3d_s_facebook_13x6x1_kinetics400_rgb_20201027-623825a0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition/x3d/metafile.yml | https://download.openmmlab.com/mmaction/recognition/x3d/facebook/x3d_m_facebook_16x5x1_kinetics400_rgb_20201027-3f42382a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition_audio/resnet/metafile.yml | https://download.openmmlab.com/mmaction/recognition/audio_recognition/tsn_r18_64x1x1_100e_kinetics400_audio_feature/tsn_r18_64x1x1_100e_kinetics400_audio_feature_20201012-bf34df6c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition_audio/resnet/metafile.yml | https://download.openmmlab.com/mmaction/recognition/audio_recognition/tsn_r18_64x1x1_100e_kinetics400_audio_feature/20201010_144630.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/recognition_audio/resnet/metafile.yml | https://download.openmmlab.com/mmaction/recognition/audio_recognition/tsn_r18_64x1x1_100e_kinetics400_audio_feature/20201010_144630.log.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/skeleton/posec3d/metafile.yml | https://download.openmmlab.com/mmaction/skeleton/posec3d/slowonly_r50_u48_240e_ntu60_xsub_limb/slowonly_r50_u48_240e_ntu60_xsub_limb-1d69006a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/skeleton/posec3d/metafile.yml | https://download.openmmlab.com/mmaction/skeleton/posec3d/slowonly_r50_u48_240e_ntu60_xsub_keypoint/slowonly_r50_u48_240e_ntu60_xsub_keypoint-f3adabf1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/skeleton/posec3d/metafile.yml | 
https://download.openmmlab.com/mmaction/skeleton/posec3d/slowonly_r50_u48_240e_ntu120_xsub_limb/slowonly_r50_u48_240e_ntu120_xsub_limb-803c2317.pth? | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/skeleton/posec3d/metafile.yml | https://download.openmmlab.com/mmaction/skeleton/posec3d/slowonly_r50_u48_240e_ntu120_xsub_keypoint/slowonly_r50_u48_240e_ntu120_xsub_keypoint-6736b03f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/skeleton/posec3d/metafile.yml | https://download.openmmlab.com/mmaction/skeleton/posec3d/slowonly_r50_u48_240e_gym_limb/slowonly_r50_u48_240e_gym_limb-c0d7b482.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/skeleton/posec3d/metafile.yml | https://download.openmmlab.com/mmaction/skeleton/posec3d/slowonly_r50_u48_240e_gym_keypoint/slowonly_r50_u48_240e_gym_keypoint-b07a98a0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/skeleton/posec3d/metafile.yml | https://download.openmmlab.com/mmaction/skeleton/posec3d/slowonly_r50_u48_240e_ntu60_xsub_limb/slowonly_r50_u48_240e_ntu60_xsub_limb.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/skeleton/posec3d/metafile.yml | https://download.openmmlab.com/mmaction/skeleton/posec3d/slowonly_r50_u48_240e_ntu60_xsub_keypoint/slowonly_r50_u48_240e_ntu60_xsub_keypoint.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/skeleton/posec3d/metafile.yml | https://download.openmmlab.com/mmaction/skeleton/posec3d/slowonly_r50_u48_240e_ntu120_xsub_limb/slowonly_r50_u48_240e_ntu120_xsub_limb.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/skeleton/posec3d/metafile.yml | https://download.openmmlab.com/mmaction/skeleton/posec3d/slowonly_r50_u48_240e_ntu120_xsub_keypoint/slowonly_r50_u48_240e_ntu120_xsub_keypoint.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/skeleton/posec3d/metafile.yml | https://download.openmmlab.com/mmaction/skeleton/posec3d/slowonly_r50_u48_240e_gym_limb/slowonly_r50_u48_240e_gym_limb.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/skeleton/posec3d/metafile.yml | https://download.openmmlab.com/mmaction/skeleton/posec3d/slowonly_r50_u48_240e_gym_keypoint/slowonly_r50_u48_240e_gym_keypoint.log | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/skeleton/posec3d/metafile.yml | https://download.openmmlab.com/mmaction/skeleton/posec3d/slowonly_r50_u48_240e_ntu60_xsub_limb/slowonly_r50_u48_240e_ntu60_xsub_limb.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/skeleton/posec3d/metafile.yml | https://download.openmmlab.com/mmaction/skeleton/posec3d/slowonly_r50_u48_240e_ntu60_xsub_keypoint/slowonly_r50_u48_240e_ntu60_xsub_keypoint.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/skeleton/posec3d/metafile.yml | https://download.openmmlab.com/mmaction/skeleton/posec3d/slowonly_r50_u48_240e_ntu120_xsub_limb/slowonly_r50_u48_240e_ntu120_xsub_limb.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/skeleton/posec3d/metafile.yml | https://download.openmmlab.com/mmaction/skeleton/posec3d/slowonly_r50_u48_240e_ntu120_xsub_keypoint/slowonly_r50_u48_240e_ntu120_xsub_keypoint.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/skeleton/posec3d/metafile.yml | https://download.openmmlab.com/mmaction/skeleton/posec3d/slowonly_r50_u48_240e_gym_limb/slowonly_r50_u48_240e_gym_limb.json | log地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/configs/skeleton/posec3d/metafile.yml | https://download.openmmlab.com/mmaction/skeleton/posec3d/slowonly_r50_u48_240e_gym_keypoint/slowonly_r50_u48_240e_gym_keypoint.json | log地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/docker/Dockerfile | https://download.openmmlab.com/mmcv/dist/index.html | mmcv下载地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/setup.py | openmmlab@gmail.com | maintainer邮箱 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/tools/analysis/report_map.py | https://download.openmmlab.com/ | 下载地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/tools/data/activitynet/download_annotations.sh | http://ec2-52-25-205-214.us-west-2.compute.amazonaws.com/files/activity_net.v1-3.min.json | 模型配置参数 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/tools/data/activitynet/download_feature_annotations.sh | https://raw.githubusercontent.com/wzmsltw/BSN-boundary-sensitive-network/master/data/activitynet_annotations/video_info_new.csv | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/tools/data/activitynet/download_feature_annotations.sh | https://raw.githubusercontent.com/wzmsltw/BSN-boundary-sensitive-network/master/data/activitynet_annotations/anet_anno_action.json | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/tools/data/activitynet/download_feature_annotations.sh | https://download.openmmlab.com/mmaction/localization/anet_activity_indexes_val.txt | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/tools/data/ava/download_videos.sh | https://s3.amazonaws.com/ava-dataset/annotations/ava_file_names_trainval_v2.1.txt | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/tools/data/ava/download_videos.sh | https://s3.amazonaws.com/ava-dataset/trainval/${vid} | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/tools/data/ava/download_videos_gnu_parallel.sh | https://s3.amazonaws.com/ava-dataset/annotations/ava_file_names_trainval_v2.1.txt | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/tools/data/ava/download_videos_gnu_parallel.sh | https://s3.amazonaws.com/ava-dataset/trainval/ | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/tools/data/ava/download_videos_parallel.py | https://s3.amazonaws.com/ava-dataset/trainval/ | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/tools/data/ava/download_videos_parallel.sh | https://s3.amazonaws.com/ava-dataset/annotations/ava_file_names_trainval_v2.1.txt | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/tools/data/ava/fetch_ava_proposals.sh | https://download.openmmlab.com/mmaction/dataset/ava/ava_dense_proposals_val.FAIR.recall_93.9.pkl | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/tools/data/ava/fetch_ava_proposals.sh | https://download.openmmlab.com/mmaction/dataset/ava/ava_dense_proposals_train.FAIR.recall_93.9.pkl | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/tools/data/ava/fetch_ava_proposals.sh | https://download.openmmlab.com/mmaction/dataset/ava/ava_dense_proposals_test.FAIR.recall_93.9.pkl | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/tools/data/diving48/download_annotations.sh | http://www.svcl.ucsd.edu/projects/resound/Diving48_vocab.json | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/tools/data/diving48/download_annotations.sh | http://www.svcl.ucsd.edu/projects/resound/Diving48_V2_train.json | 数据集地址 | +| 
ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/tools/data/diving48/download_annotations.sh | http://www.svcl.ucsd.edu/projects/resound/Diving48_V2_test.json | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/tools/data/diving48/download_videos.sh | http://www.svcl.ucsd.edu/projects/resound/Diving48_rgb.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/tools/data/hmdb51/download_annotations.sh | http://serre-lab.clps.brown.edu/wp-content/uploads/2013/10/test_train_splits.rar | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/tools/data/hmdb51/download_videos.sh | http://serre-lab.clps.brown.edu/wp-content/uploads/2013/10/hmdb51_org.rar | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/tools/data/kinetics/download_annotations.sh | https://storage.googleapis.com/deepmind-media/Datasets/${DATASET}.tar.gz | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/tools/data/kinetics/download_backup_annotations.sh | https://download.openmmlab.com/mmaction/dataset/${DATASET}/annotations/kinetics_val.csv | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/tools/data/kinetics/download_backup_annotations.sh | https://download.openmmlab.com/mmaction/dataset/${DATASET}/annotations/kinetics_train.csv | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/tools/data/kinetics/download_backup_annotations.sh | https://download.openmmlab.com/mmaction/dataset/${DATASET}/annotations/kinetics_test.csv | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/tools/data/skeleton/download_annotations.sh | https://download.openmmlab.com/mmaction/posec3d/${DATASET}_val.pkl | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/tools/data/skeleton/download_annotations.sh | https://download.openmmlab.com/mmaction/posec3d/${DATASET}_train.pkl | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/tools/data/thumos14/download_annotations.sh | http://crcv.ucf.edu/THUMOS14/Validation_set/TH14_Temporal_annotations_validation.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/tools/data/thumos14/download_annotations.sh | http://crcv.ucf.edu/THUMOS14/test_set/TH14_Temporal_annotations_test.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/tools/data/thumos14/download_videos.sh | https://storage.googleapis.com/thumos14_files/TH14_validation_set_mp4.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/tools/data/thumos14/download_videos.sh | https://storage.googleapis.com/thumos14_files/TH14_Test_set_mp4.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/tools/data/thumos14/fetch_tag_proposals.sh | https://open-mmlab.s3.ap-northeast-2.amazonaws.com/mmaction/filelist/thumos14_tag_val_normalized_proposal_list.txt | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/tools/data/thumos14/fetch_tag_proposals.sh | https://open-mmlab.s3.ap-northeast-2.amazonaws.com/mmaction/filelist/thumos14_tag_test_normalized_proposal_list.txt | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/tools/data/ucf101/download_annotations.sh | https://www.crcv.ucf.edu/wp-content/uploads/2019/03/UCF101TrainTestSplits-RecognitionTask.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/R(2+1)D/tools/data/ucf101/download_videos.sh | https://www.crcv.ucf.edu/datasets/human-actions/ucf101/UCF101.rar | 数据集地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/video/SiamRPN/public_address_statement.md 
b/PyTorch/contrib/cv/video/SiamRPN/public_address_statement.md index e7fa97117e417e1903b8bdf6cb7333174c014d45..fb3addb4ae0792aa874bd277d261a0a96a5d069c 100644 --- a/PyTorch/contrib/cv/video/SiamRPN/public_address_statement.md +++ b/PyTorch/contrib/cv/video/SiamRPN/public_address_statement.md @@ -1,7 +1,3 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|----------------------------------------------|--------------------------------------------------|---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|---------| -| 开源代码引入 | https://github.com/STVIR/pysot/toolkit/utils/statistics.py | SiamRPN/pysot-master/toolkit/utils/statistics.py | fangyi.zhang@vipl.ict.ac.cn | 邮箱地址 | -| 开源代码引入 | https://github.com/STVIR/pysot/pysot/core/config.py | SiamRPN/pysot-master/pysot/core/config.py | https://arxiv.org/pdf/1812.11703 | 论文地址 | -| 开源代码引入 | https://github.com/STVIR/pysot/pysot/core/config.py | SiamRPN/pysot-master/pysot/core/config.py | https://arxiv.org/pdf/1808.06048 | 论文地址 | -| 开源代码引入 | https://github.com/STVIR/pysot/toolkit/utils/statistics.py | SiamRPN/pysot-master/toolkit/utils/misc.py | fangyi.zhang@vipl.ict.ac.cn | 邮箱地址 | -| 开源代码引入 | https://github.com/STVIR/pysot/toolkit/utils/statistics.py | SiamRPN/pysot-master/toolkit/utils/region.pyx | fangyi.zhang@vipl.ict.ac.cn | 邮箱地址 | +| 文件位置 | 公网地址 | 公网地址用途 | +|-------------------------------------------------------------------------------------------|---------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/SiamRPN/pysot-master/pysot/utils/distributed.py | 8.8.8.8 | ip地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/video/TSM/public_address_statement.md b/PyTorch/contrib/cv/video/TSM/public_address_statement.md index 3d5ca5845d695018cb075a9a66bf669e51d4b7ee..ea0d9dbe81d3042f86095765e75b1bd42b4bede5 100644 --- a/PyTorch/contrib/cv/video/TSM/public_address_statement.md +++ b/PyTorch/contrib/cv/video/TSM/public_address_statement.md @@ -1,84 +1,6 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ---- | ------------ | ------ | ------------------------------------ | -------- | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/mmaction/evaluation/functional/ava_utils.py|TSM/mmaction/core/evaluation/ava_utils.py | https://research.google.com/ava/download.html | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/docs/en/notes/changelog.md|TSM/mmaction/core/evaluation/eval_hooks.py | https://github.com/open-mmlab/mmaction2/pull/395 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/mmaction/engine/hooks/output.py|TSM/mmaction/core/hooks/output.py | https://stackoverflow.com/questions/31174295/getattr-and-setattr-on-nested-objects | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/configs/recognition/tin/README.md | TSM/mmaction/core/scheduler/lr_updater.py | https://github.com/deepcs233/TIN/blob/master/main.py#L409-L412 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/docs/en/notes/changelog.md|TSM/mmaction/datasets/pipelines/loading.py | https://github.com/open-mmlab/mmaction2/pull/89 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/docs/en/notes/changelog.md|TSM/mmaction/datasets/pipelines/augmentations.py | https://imgaug.readthedocs.io/en/latest/index.html | 模型相关说明 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmaction2/blob/main/docs/en/notes/changelog.md|TSM/mmaction/datasets/pipelines/loading.py | https://github.com/open-mmlab/mmaction2/pull/89 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/mmaction/datasets/transforms/wrappers.py|TSM/mmaction/datasets/pipelines/augmentations.py | https://arxiv.org/abs/1909.13719 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/mmaction/datasets/transforms/wrappers.py|TSM/mmaction/datasets/pipelines/augmentations.py | https://github.com/tensorflow/tpu/blob/master/models/official/efficientnet/autoaugment.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/docs/en/notes/changelog.md|TSM/mmaction/datasets/pipelines/loading.py | https://github.com/open-mmlab/mmaction2/pull/89 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/docs/en/notes/changelog.md|TSM/mmaction/datasets/pipelines/loading.py | https://github.com/open-mmlab/mmaction2/pull/89 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/mmaction/datasets/transforms/loading.py|TSM/mmaction/datasets/pipelines/loading.py | https://github.com/mikeboers/PyAV | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/mmaction/datasets/transforms/loading.py|TSM/mmaction/datasets/pipelines/loading.py | https://github.com/mikeboers/PyAV | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/mmaction/datasets/transforms/loading.py|TSM/mmaction/datasets/pipelines/loading.py | https://github.com/soft-matter/pims | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/mmaction/datasets/transforms/loading.py|TSM/mmaction/datasets/pipelines/loading.py | https://github.com/soft-matter/pims | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/mmaction/datasets/transforms/loading.py|TSM/mmaction/datasets/pipelines/loading.py | https://github.com/PyAV-Org/PyAV/ | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/docs/en/advanced_guides/customize_pipeline.md|TSM/mmaction/datasets/pipelines/loading.py | https://github.com/dmlc/decord | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/configs/recognition/r2plus1d/metafile.yml|TSM/mmaction/models/backbones/resnet2plus1d.py | https://arxiv.org/abs/1711.11248 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/docs/en/advanced_guides/customize_pipeline.md|TSM/mmaction/datasets/pipelines/loading.py | https://github.com/dmlc/decord | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/mmaction/models/backbones/resnet3d_csn.py|TSM/mmaction/models/backbones/resnet3d_csn.py | https://arxiv.org/pdf/1711.11248.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/configs/detection/slowfast/metafile.yml|TSM/mmaction/models/backbones/resnet3d_slowfast.py | https://arxiv.org/abs/1812.03982 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/configs/recognition_audio/resnet/metafile.yml|TSM/mmaction/models/backbones/resnet_audio.py | https://arxiv.org/abs/2001.08740 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/configs/detection/slowfast/metafile.yml|TSM/mmaction/models/backbones/resnet3d_slowfast.py | https://arxiv.org/abs/1812.03982 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/configs/recognition/tin/metafile.yml|TSM/mmaction/models/backbones/resnet_tin.py | https://arxiv.org/abs/2001.06499 | 模型相关说明 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmaction2/blob/main/mmaction/models/backbones/tanet.py|TSM/mmaction/models/backbones/tanet.py | https://arxiv.org/pdf/2005.06803 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/configs/recognition/tsm/metafile.yml|TSM/mmaction/models/backbones/resnet_tsm.py | https://arxiv.org/abs/1811.08383 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/mmaction/models/backbones/tanet.py|TSM/mmaction/models/backbones/tanet.py | https://arxiv.org/pdf/2005.06803 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/configs/recognition/timesformer/metafile.yml|TSM/mmaction/models/backbones/timesformer.py | https://arxiv.org/abs/2102.05095 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/mmaction/models/backbones/x3d.py|TSM/mmaction/models/backbones/x3d.py | https://arxiv.org/pdf/2004.04730.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/configs/recognition_audio/resnet/metafile.yml|TSM/mmaction/models/common/conv_audio.py | https://arxiv.org/abs/2001.08740 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/mmaction/models/backbones/resnet3d_csn.py|TSM/mmaction/models/common/conv2plus1d.py | https://arxiv.org/pdf/1711.11248.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/mmaction/models/backbones/tanet.py|TSM/mmaction/models/common/tam.py | https://arxiv.org/pdf/2005.06803 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/configs/detection/lfb/metafile.yml|TSM/mmaction/models/common/lfb.py | https://arxiv.org/abs/1812.05038 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/mmaction/models/backbones/resnet3d_csn.py|TSM/mmaction/models/common/conv2plus1d.py | https://arxiv.org/pdf/1711.11248.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/configs/detection/acrn/metafile.yml|TSM/mmaction/models/heads/misc_head.py | https://arxiv.org/abs/1807.10982 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/README.md | TSM/mmaction/models/heads/tpn_head.py | https://arxiv.org/abs/1906.02629 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/configs/localization/bmn/metafile.yml|TSM/mmaction/models/localizers/bmn.py | https://arxiv.org/abs/1907.09702 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/configs/localization/bmn/README.md|TSM/mmaction/models/localizers/bmn.py | https://github.com/JJBOY/BMN-Boundary-Matching-Network | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/mmaction/models/localizers/bsn.py|TSM/mmaction/models/localizers/bsn.py | http://arxiv.org/abs/1806.02964 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/mmaction/models/localizers/bsn.py|TSM/mmaction/models/localizers/bsn.py | https://github.com/wzmsltw/BSN-boundary-sensitive-network | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/docs/en/notes/changelog.md|TSM/mmaction/models/localizers/base.py | https://github.com/open-mmlab/mmaction2/pull/913 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/mmaction/models/localizers/bsn.py|TSM/mmaction/models/localizers/bsn.py | http://arxiv.org/abs/1806.02964 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/mmaction/models/localizers/bsn.py|TSM/mmaction/models/localizers/bsn.py | https://github.com/wzmsltw/BSN-boundary-sensitive-network | 源码实现 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmaction2/blob/main/configs/localization/bmn/metafile.yml|TSM/mmaction/models/losses/bmn_loss.py | https://arxiv.org/abs/1907.09702 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/configs/localization/bmn/README.md|TSM/mmaction/models/losses/bmn_loss.py | https://github.com/JJBOY/BMN-Boundary-Matching-Network | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/mmaction/models/losses/cross_entropy_loss.py|TSM/mmaction/models/losses/cross_entropy_loss.py | https://pytorch.org/docs/stable/generated/torch.nn.CrossEntropyLoss.html | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/mmaction/models/necks/tpn.py|TSM/mmaction/models/necks/tpn.py | https://arxiv.org/pdf/2004.03548.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/mmaction/evaluation/functional/ava_evaluation/metrics.py|TSM/mmaction/core/evaluation/ava_evaluation/metrics.py | https://www.robots.ox.ac.uk/~vgg/rg/papers/deselaers-eccv10.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/docs/en/notes/changelog.md|TSM/train.py | https://github.com/open-mmlab/mmaction2/pull/123 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/configs/detection/acrn/README.md | TSM/config/tsm_k400_pretrained_r50_1x1x8_25e_ucf101_rgb.py | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_256p_1x1x8_50e_kinetics400_rgb/tsm_r50_256p_1x1x8_50e_kinetics400_rgb_20200726-020785e2.pth | 预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/configs/detection/acrn/README.md | TSM/config/tsm_r50_1x1x8_50e_sthv2_rgb.py | https://download.openmmlab.com/mmclassification/v0/resnet/resnet50_8xb32_in1k_20210831-ea4938fc.pth | 预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/tools/data/build_audio_features.py|TSM/dataset/build_audio_features.py | https://github.com/r9y9/deepvoice3_pytorch | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/tools/data/build_audio_features.py|TSM/dataset/build_audio_features.py | https://pypi.org/project/lws/1.2.6/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/tools/data/build_audio_features.py|TSM/dataset/build_audio_features.py | https://pypi.org/project/lws/1.2.6/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/tools/data/build_audio_features.py|TSM/dataset/build_audio_features.py | https://pypi.org/project/lws/1.2.6/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/tools/data/build_audio_features.py|TSM/dataset/build_audio_features.py | https://pypi.org/project/lws/1.2.6/ | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/tools/data/build_audio_features.py|TSM/dataset/build_audio_features.py | https://github.com/r9y9/deepvoice3_pytorch | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/tools/data/build_audio_features.py|TSM/dataset/build_audio_features.py | https://github.com/r9y9/deepvoice3_pytorch | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/tools/data/ucf101/download_annotations.sh|TSM/dataset/download_annotations.sh | https://www.crcv.ucf.edu/wp-content/uploads/2019/03/UCF101TrainTestSplits-RecognitionTask.zip | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/tools/data/ucf101/download_videos.sh|TSM/dataset/download_videos.sh | https://www.crcv.ucf.edu/datasets/human-actions/ucf101/UCF101.rar | 模型相关说明 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmaction2/blob/main/docs/en/user_guides/config.md | TSM/mmcv_need/optimizer.py | https://pytorch.org/docs/stable/amp.html#torch.cuda.amp.GradScaler | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/README.md | TSM/mmcv_need/epoch_based_runner.py | https://github.com/open-mmlab/mmcv/pull/1108 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/docs/en/user_guides/config.md | TSM/mmcv_need/optimizer.py | https://pytorch.org/docs/stable/amp.html#torch.cuda.amp.GradScaler | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/README.md | TSM/mmcv_need/optimizer.py | https://arxiv.org/abs/1710.03740 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/README.md | TSM/mmcv_need/transformer.py | https://arxiv.org/abs/2002.04745 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/docs/en/notes/changelog.md|TSM/mmaction/datasets/activitynet_dataset.py | https://github.com/open-mmlab/mmaction2/pull/286 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/docs/en/notes/changelog.md|TSM/mmaction/datasets/activitynet_dataset.py | https://github.com/open-mmlab/mmaction2/pull/286 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/mmaction/models/utils/blending_utils.py|TSM/mmaction/datasets/blending_utils.py | https://arxiv.org/abs/1710.09412 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/mmaction/models/utils/blending_utils.py|TSM/mmaction/datasets/blending_utils.py | https://github.com/open-mmlab/mmclassification/blob/master/mmcls/models/utils/mixup.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/mmaction/models/multimodal/vindlu/modeling_bert.py | TSM/mmaction/datasets/builder.py | https://github.com/pytorch/pytorch/issues/973 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/docs/en/notes/changelog.md|TSM/mmaction/datasets/base.py | https://github.com/open-mmlab/mmaction2/pull/286 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/mmaction/models/utils/blending_utils.py|TSM/mmaction/datasets/blending_utils.py | https://arxiv.org/abs/1905.04899 | 模型相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/mmaction/models/utils/blending_utils.py|TSM/mmaction/datasets/blending_utils.py | https://github.com/clovaai/CutMix-PyTorch | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/docs/en/notes/changelog.md|TSM/mmaction/datasets/base.py | https://github.com/open-mmlab/mmaction2/pull/286 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/docs/en/notes/changelog.md|TSM/mmaction/datasets/ava_dataset.py | https://github.com/open-mmlab/mmaction2/pull/567 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/docs/en/notes/changelog.md|TSM/mmaction/models/builder.py | https://github.com/open-mmlab/mmaction2/pull/629 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/docs/en/notes/changelog.md|TSM/mmaction/models/builder.py | https://github.com/open-mmlab/mmaction2/pull/629 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/docs/en/notes/changelog.md|TSM/mmaction/datasets/ssn_dataset.py | https://github.com/open-mmlab/mmaction2/pull/286 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/docs/en/notes/changelog.md|TSM/mmaction/datasets/ssn_dataset.py | https://github.com/open-mmlab/mmaction2/pull/286 | 源码实现 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmaction2/blob/main/mmaction/utils/gradcam_utils.py|TSM/mmaction/utils/gradcam_utils.py | https://github.com/facebookresearch/SlowFast/blob/master/slowfast/visualization/gradcam_utils.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/mmaction/utils/gradcam_utils.py|TSM/mmaction/utils/gradcam_utils.py | https://arxiv.org/pdf/1610.02391.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/blob/main/mmaction/utils/gradcam_utils.py|TSM/mmaction/utils/gradcam_utils.py | https://matplotlib.org/3.3.0/tutorials/colors/colormaps.html | 模型相关说明 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|------------------------------------------------------------------------------------------------------|-------------------------------------------------------------------------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/TSM/config/tsm_k400_pretrained_r50_1x1x8_25e_ucf101_rgb.py | https://download.openmmlab.com/mmaction/recognition/tsm/tsm_r50_256p_1x1x8_50e_kinetics400_rgb/tsm_r50_256p_1x1x8_50e_kinetics400_rgb_20200726-020785e2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/TSM/config/tsm_r50_1x1x8_50e_sthv2_rgb.py | https://download.openmmlab.com/mmclassification/v0/resnet/resnet50_8xb32_in1k_20210831-ea4938fc.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/TSM/dataset/download_annotations.sh | https://www.crcv.ucf.edu/wp-content/uploads/2019/03/UCF101TrainTestSplits-RecognitionTask.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/TSM/dataset/download_videos.sh | https://www.crcv.ucf.edu/datasets/human-actions/ucf101/UCF101.rar | 数据集地址 | \ No newline at end of file diff --git a/PyTorch/contrib/cv/video/TSN/public_address_statement.md b/PyTorch/contrib/cv/video/TSN/public_address_statement.md index 1cd3c1ceb5fc2148c7c269029bc7e7dcd6c05daf..7d4a544332cce050e2eff43d8f4ae93ee7b992fc 100644 --- a/PyTorch/contrib/cv/video/TSN/public_address_statement.md +++ b/PyTorch/contrib/cv/video/TSN/public_address_statement.md @@ -1,63 +1,4 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|---|----------------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------------------------------|------| -| 开发引入 | / | TSN/mmaction/datasets/pipelines/loading.py | https://github.com/open-mmlab/mmaction2/pull/89 | 源码实现 | -| 开发引入 | / | TSN/mmaction/models/losses/cross_entropy_loss.py | https://pytorch.org/docs/stable/generated/torch.nn.CrossEntropyLoss.html | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/mmaction/models/common/tam.py | TSN/mmaction/models/backbones/tanet.py | https://arxiv.org/pdf/2005.06803 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tools/data/build_audio_features.py | TSN/dataset/build_audio_features.py | https://pypi.org/project/lws/1.2.6/ | 相关依赖 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/mmaction/utils/gradcam_utils.py | TSN/mmaction/utils/gradcam_utils.py | https://arxiv.org/pdf/1610.02391.pdf | 论文地址 | -| 开发引入 | / | TSN/mmaction/datasets/pipelines/augmentations.py | https://github.com/tensorflow/tpu/blob/master/models/official/efficientnet/autoaugment.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/mmaction/models/utils/blending_utils.py | TSN/mmaction/datasets/blending_utils.py | https://arxiv.org/abs/1905.04899 | 论文地址 | -| 开源代码引入 
| https://github.com/open-mmlab/mmaction2/mmaction/datasets/transforms/loading.py | TSN/mmaction/datasets/pipelines/loading.py | https://github.com/mikeboers/PyAV | 源码实现 | -| 开发引入 | / | TSN/mmcv_need/optimizer.py | https://arxiv.org/abs/1710.03740 | 论文地址 | -| 开发引入 | / | TSN/mmaction/core/scheduler/lr_updater.py | https://github.com/deepcs233/TIN/blob/master/main.py#L409-L412 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/mmaction/models/roi_heads/shared_heads/lfb.py | TSN/mmaction/models/common/lfb.py | https://arxiv.org/abs/1812.05038 | 论文地址 | -| 开发引入 | / | TSN/mmaction/datasets/pipelines/augmentations.py | https://mxnet.apache.org/api/python/docs/_modules/mxnet/image/image.html#LightingAug | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/mmaction/datasets/transforms/loading.py | TSN/mmaction/datasets/pipelines/loading.py | https://github.com/soft-matter/pims | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/mmaction/models/common/conv2plus1d.py | TSN/mmaction/models/backbones/resnet3d_csn.py | https://arxiv.org/pdf/1711.11248.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/docs/en/notes/changelog.md | TSN/mmaction/models/localizers/base.py | https://github.com/open-mmlab/mmaction2/pull/913 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/mmaction/models/common/tam.py | TSN/mmaction/models/common/tam.py | https://arxiv.org/pdf/2005.06803 | 论文地址 | -| 开发引入 | / | TSN/mmcv_need/transformer.py | https://arxiv.org/abs/2002.04745 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/mmaction/models/utils/blending_utils.py | TSN/mmaction/datasets/blending_utils.py | https://arxiv.org/abs/1710.09412 | 论文地址 | -| 开发引入 | / | TSN/mmaction/utils/gradcam_utils.py | https://github.com/facebookresearch/SlowFast/blob/master/slowfast/visualization/gradcam_utils.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/mmaction/models/losses/bmn_loss.py | TSN/mmaction/models/localizers/bmn.py | https://arxiv.org/abs/1907.09702 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/mmaction/models/necks/tpn.py | TSN/mmaction/models/necks/tpn.py | https://arxiv.org/pdf/2004.03548.pdf | 论文地址 | -| 开发引入 | / | TSN/mmcv_need/epoch_based_runner.py | https://github.com/open-mmlab/mmcv/pull/1108 | 源码实现 | -| 开发引入 | / | TSN/mmaction/datasets/blending_utils.py | https://github.com/open-mmlab/mmclassification/blob/master/mmcls/models/utils/mixup.py | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/mmaction/models/losses/bmn_loss.py | TSN/mmaction/models/localizers/bmn.py | https://github.com/JJBOY/BMN-Boundary-Matching-Network | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/docs/en/notes/changelog.md | TSN/mmaction/models/builder.py | https://github.com/open-mmlab/mmaction2/pull/629 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/mmaction/utils/gradcam_utils.py | TSN/mmaction/utils/gradcam_utils.py | https://matplotlib.org/3.3.0/tutorials/colors/colormaps.html | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/mmaction/models/losses/bmn_loss.py | TSN/mmaction/models/losses/bmn_loss.py | https://github.com/JJBOY/BMN-Boundary-Matching-Network | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/mmaction/evaluation/functional/ava_evaluation/metrics.py | TSN/mmaction/core/evaluation/ava_evaluation/metrics.py | https://www.robots.ox.ac.uk/~vgg/rg/papers/deselaers-eccv10.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/docs/en/notes/changelog.md | 
TSN/mmaction/datasets/ssn_dataset.py | https://github.com/open-mmlab/mmaction2/pull/286 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/mmaction/models/utils/blending_utils.py | TSN/mmaction/datasets/blending_utils.py | https://github.com/clovaai/CutMix-PyTorch | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/docs/en/notes/changelog.md | TSN/mmaction/datasets/ava_dataset.py | https://github.com/open-mmlab/mmaction2/pull/567 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/mmaction/models/backbones/resnet3d_slowfast.py | TSN/mmaction/models/backbones/resnet3d_slowfast.py | https://arxiv.org/abs/1812.03982 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/mmaction/models/backbones/resnet_tsm.py | TSN/mmaction/models/backbones/resnet_tsm.py | https://arxiv.org/abs/1811.08383 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/docs/en/notes/changelog.md | TSN/train.py | https://github.com/open-mmlab/mmaction2/pull/123 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/mmaction/models/common/conv_audio.py | TSN/mmaction/models/common/conv_audio.py | https://arxiv.org/abs/2001.08740 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/mmaction/models/common/conv2plus1d.py | TSN/mmaction/models/common/conv2plus1d.py | https://arxiv.org/pdf/1711.11248.pdf | 论文地址 | -| 开发引入 | / | TSN/dataset/download_videos.sh | https://www.crcv.ucf.edu/datasets/human-actions/ucf101/UCF101.rar | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/mmaction/models/losses/bmn_loss.py | TSN/mmaction/models/losses/bmn_loss.py | https://arxiv.org/abs/1907.09702 | 论文地址 | -| 开发引入 | / | TSN/mmaction/datasets/builder.py | https://github.com/pytorch/pytorch/issues/973 | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/mmaction/models/backbones/resnet_tin.py | TSN/mmaction/models/backbones/resnet_tin.py | https://arxiv.org/abs/2001.06499 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/mmaction/datasets/transforms/loading.py | TSN/mmaction/datasets/pipelines/loading.py | https://github.com/dmlc/decord | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/mmaction/models/localizers/bsn.py | TSN/mmaction/models/localizers/bsn.py | https://github.com/wzmsltw/BSN-boundary-sensitive-network | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/mmaction/models/backbones/resnet2plus1d.py | TSN/mmaction/models/backbones/resnet2plus1d.py | https://arxiv.org/abs/1711.11248 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/mmaction/models/backbones/x3d.py | TSN/mmaction/models/backbones/x3d.py | https://arxiv.org/pdf/2004.04730.pdf | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/mmaction/models/localizers/bsn.py | TSN/mmaction/models/localizers/bsn.py | http://arxiv.org/abs/1806.02964 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/mmaction/datasets/transforms/wrappers.py | TSN/mmaction/datasets/pipelines/augmentations.py | https://imgaug.readthedocs.io/en/latest/index.html | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/mmaction/models/common/conv_audio.py | TSN/mmaction/models/backbones/resnet_audio.py | https://arxiv.org/abs/2001.08740 | 论文地址 | -| 开发引入 | / | TSN/mmcv_need/optimizer.py | https://pytorch.org/docs/stable/amp.html#torch.cuda.amp.GradScaler | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/tools/data/build_audio_features.py | TSN/dataset/build_audio_features.py | https://github.com/r9y9/deepvoice3_pytorch | 源码实现 | -| 开发引入 | / | 
TSN/mmaction/models/heads/tpn_head.py | https://arxiv.org/abs/1906.02629 | 论文地址 | -| 开发引入 | / | TSN/mmaction/datasets/pipelines/augmentations.py | https://gluon-cv.mxnet.io/_modules/gluoncv/data/transforms/experimental/image.html | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/mmaction/models/backbones/timesformer.py | TSN/mmaction/models/backbones/timesformer.py | https://arxiv.org/abs/2102.05095 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/mmaction/models/roi_heads/shared_heads/acrn_head.py | TSN/mmaction/models/heads/misc_head.py | https://arxiv.org/abs/1807.10982 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/mmaction/evaluation/functional/ava_utils.py | TSN/mmaction/core/evaluation/ava_utils.py | https://research.google.com/ava/download.html | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/mmaction/engine/hooks/output.py | TSN/mmaction/core/hooks/output.py | https://stackoverflow.com/questions/31174295/getattr-and-setattr-on-nested-objects | 相关说明 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/docs/en/notes/changelog.md | TSN/mmaction/datasets/base.py | https://github.com/open-mmlab/mmaction2/pull/286 | 源码实现 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/mmaction/datasets/transforms/loading.py | TSN/mmaction/datasets/pipelines/loading.py | https://github.com/PyAV-Org/PyAV/ | 源码实现 | -| 开发引入 | / | TSN/dataset/download_annotations.sh | https://www.crcv.ucf.edu/wp-content/uploads/2019/03/UCF101TrainTestSplits-RecognitionTask.zip | 下载文件 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/docs/en/notes/changelog.md | TSN/mmaction/datasets/activitynet_dataset.py | https://github.com/open-mmlab/mmaction2/pull/286 | 源码实现 | -| 开发引入 | / | TSN/mmaction/datasets/pipelines/augmentations.py | https://arxiv.org/abs/1909.13719 | 论文地址 | -| 开源代码引入 | https://github.com/open-mmlab/mmaction2/docs/en/notes/changelog.md | TSN/mmaction/core/evaluation/eval_hooks.py | https://github.com/open-mmlab/mmaction2/pull/395 | 源码实现 | +| 文件位置 | 公网地址 | 公网地址用途 | +|-------------------------------------------------------------------------------|-----------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/TSN/dataset/download_annotations.sh | https://www.crcv.ucf.edu/wp-content/uploads/2019/03/UCF101TrainTestSplits-RecognitionTask.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/contrib/cv/video/TSN/dataset/download_videos.sh | https://www.crcv.ucf.edu/datasets/human-actions/ucf101/UCF101.rar | 数据集地址 | \ No newline at end of file diff --git a/PyTorch/contrib/foundation/ChatGLM3-6B/publish_address_statement.md b/PyTorch/contrib/foundation/ChatGLM3-6B/publish_address_statement.md index 5fea53927e349ec6cce37071d42a4ae20aa69197..61bf1c50f706ab5ce785023ae55792dd9d199676 100644 --- a/PyTorch/contrib/foundation/ChatGLM3-6B/publish_address_statement.md +++ b/PyTorch/contrib/foundation/ChatGLM3-6B/publish_address_statement.md @@ -1,74 +1,5 @@ -|类型 |开源代码地址| 文件名 |公网IP地址/公网URL地址/域名/邮箱地址 |用途说明| -|-|-|-------------------------------------|------------------------------------------------------------------------------------|-| -|README.md|https://github.com/huggingface/transformers.git| README.md |https://github.com/THUDM/ChatGLM3|源码实现| -|README.md|https://gitee.com/ascend/ModelZoo-PyTorch.git| README.md |https://gitee.com/ascend/ModelZoo-PyTorch.git|源码实现| -|README.md|https://www.hiascend.com/document/detail/zh/ModelZoo/pytorchframework/ptes| README.md 
|https://www.hiascend.com/document/detail/zh/ModelZoo/pytorchframework/ptes|昇腾环境说明| -|README.md|https://huggingface.co/THUDM/chatglm3-6b| README.md |https://huggingface.co/THUDM/chatglm3-6b|权重文件| -|开源代码引入|https://huggingface.co/THUDM/chatglm3-6b| models/modeling_chatglm.py |https://huggingface.co/models?filter=chatglm|源码实现| -|开源代码引入|https://huggingface.co/THUDM/chatglm3-6b| models/modeling_chatglm.py |https://github.com/labmlai/annotated_deep_learning_paper_implementations/blob/master/labml_nn/|源码实现| -|开源代码引入|https://huggingface.co/THUDM/chatglm3-6b| models/modeling_chatglm.py |https://github.com/labmlai/annotated_deep_learning_paper_implementations/blob/master/license|源码实现| -|开源代码引入|https://huggingface.co/THUDM/chatglm3-6b| models/modeling_chatglm.py |https://arxiv.org/pdf/2002.05202.pdf|参考论文地址| -|开源代码引入|https://huggingface.co/THUDM/chatglm3-6b| models/modeling_chatglm.py |https://huggingface.co/docs/transformers/main/en/main_classes/text_generation|源码实现| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers/optimization.py |https://github.com/google-research/bert/blob/f39e881b169b9d53bea03d2d341b31707a6c052b/optimization.py|源码实现| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers/optimization.py |https://github.com/google-research/big_vision/blob/f071ce68852d56099437004fd70057597a95f6ef/big_vision/utils.py|源码实现| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers/optimization.py |https://arxiv.org/abs/1711.05101|参考论文地址| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers/optimization.py |https://github.com/pytorch/fairseq/blob/master/fairseq/optim/adafactor.py|源码实现| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers/optimization.py |https://arxiv.org/abs/1804.04235|参考论文地址| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers/optimization.py |https://discuss.huggingface.co/t/t5-finetuning-tips/684/3|源码实现| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers/optimization.py |https://arxiv.org/abs/1804.04235|参考论文地址| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers/optimization.py |https://github.com/huggingface/transformers/blob/8395f14de6068012787d83989c3627c3df6a252b/src/transformers/optimization.py|源码实现| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers/training_args.py |https://github.com/pytorch/xla/pull/3609|源码实现| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers/training_args.py |https://github.com/huggingface/optimum-neuron|源码实现| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers/training_args.py: |https://github.com/huggingface/transformers/pull/25903|源码实现| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers/training_args.py |https://docs.python.org/3/library/argparse#module-argparse|源码实现| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers/training_args.py |https://github.com/huggingface/transformers/tree/main/examples|源码实现| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers/training_args.py |https://www.tensorflow.org/tensorboard|源码实现| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers/training_args.py |https://huggingface.co/docs/safetensors|源码实现| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers/training_args.py |https://github.com/intel/intel-extension-for-pytorch|源码实现| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers/training_args.py 
|https://nvidia.github.io/apex/amp|源码实现| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers/training_args.py |https://huggingface.co/docs/transformers/performance|源码实现| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers/training_args.py |https://www.wandb.com/|源码实现| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers/training_args.py |https://www.mlflow.org/|源码实现| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers/training_args.py |https://huggingface.co/docs/transformers/main_classes/trainer|源码实现| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers/training_args.py |https://dev-discuss.pytorch.org/t/rethinking-pytorch-fully-sharded-data-parallel-fsdp-from-first-principles/1019|源码实现| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers/training_args.py |https://github.com/pytorch/xla/blob/master/torch_xla/distributed/fsdp/xla_fully_sharded_data_parallel.py|源码实现| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers/training_args.py |https://github.com/microsoft/deepspeed|源码实现| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers/training_args.py |https://github.com/huggingface/transformers/tree/main/examples|源码实现| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers/training_args.py |https://docs.ray.io/en/latest/tune/api_docs/analysis.html#ray.tune.ExperimentAnalysis.get_best_trial|参考论文地址| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers/training_args.py |https://pytorch.org/docs/stable/distributed.html#torch.distributed.init_process_group|源码实现| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers/training_args.py |https://pytorch.org/get-started/pytorch-2.0/|源码实现| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers/training_args.py |https://pytorch.org/docs/stable/generated/torch.compile.html?highlight=torch+compile#torch.compile|源码实现| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers/training_args.py |https://arxiv.org/abs/2310.05914|参考论文地址| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers/training_args.py |https://github.com/neelsjain/NEFTune|源码实现| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers/training_args.py |https://arxiv.org/abs/2403.03507|参考论文地址| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers/training_args.py |https://github.com/jiaweizzhao/GaLore|源码实现| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers/training_args.py |https://github.com/intel/intel-extension-for-pytorch|源码实现| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers/training_args.py |https://nvidia.github.io/apex/amp.html|源码实现| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers/training_args.py |https://docs.ray.io/en/latest/tune/api_docs/analysis.html|参考论文地址| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers/training_args.py |https://arxiv.org/abs/2310.05914|参考论文地址| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers/training_args.py |https://github.com/neelsjain/NEFTune|源码实现| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers/training_args.py |https://github.com/huggingface/transformers/issues/10628|源码实现| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers/training_args.py |https://github.com/huggingface/safetensors|源码实现| 
-|开源代码引入|https://github.com/huggingface/transformers.git| transformers/trainer.py |https://github.com/huggingface/transformers/pull/28321|源码实现| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers/trainer.py |https://huggingface.co/docs/transformers/model_doc/auto|源码实现| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers/trainer.py |https://huggingface.co/docs/transformers/peft|源码实现| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers/trainer.py |https://github.com/huggingface/transformers|源码实现| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers/trainer.py |https://www.github.com/nvidia/apex|源码实现| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers/trainer.py |https://github.com/neelsjain/NEFTune|源码实现| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers/trainer.py |https://arxiv.org/abs/2310.05914|参考论文地址| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers/trainer.py |https://github.com/pytorch/torchdistx|源码实现| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers/trainer.py |https://github.com/jiaweizzhao/GaLore|源码实现| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers/trainer.py |https://github.com/jiaweizzhao/GaLore|源码实现| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers/trainer.py |https://github.com/hiyouga/LLaMA-Factory/commit/8664262cde3919e10eaecbd66e8c5d356856362e#diff-ebe08ab14496dfb9e06075f0fdd36799ef6d1535cc4dd4715b74c4e3e06fe3ba|源码实现| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers/trainer.py |https://github.com/intel/intel-extension-for-pytorch|源码实现| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers/trainer.py |https://github.com/huggingface/transformers/pull/4659#issuecomment-643356021|源码实现| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers/trainer.py |https://github.com/pytorch/pytorch/issues/82963|源码实现| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers/trainer.py |https://github.com/huggingface/peft/issues/96|源码实现| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers/trainer.py |https://github.com/huggingface/peft/issues/96|源码实现| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers/trainer.py |https://github.com/pytorch/pytorch/issues/82963|源码实现| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers/trainer.py |https://optuna.readthedocs.io/en/stable/reference/generated/optuna.study.create_study.html|源码实现| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers/trainer.py |https://docs.ray.io/en/latest/tune/api_docs/execution.html#tune-run|参考论文地址| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers/trainer.py |https://app.sigopt.com/docs/endpoints/experiments/create|源码实现| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers_op/trainer_pt_utils.py |https://github.com/numpy/numpy/blob/a47ecdea856986cd60eabbd53265c2ca5916ad5d/doc/source/user/basics.types.rst|源码实现| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers/trainer_pt_utils.py |https://github.com/pytorch/pytorch/issues/16266|源码实现| -|开源代码引入|https://github.com/huggingface/transformers.git| transformers/trainer_pt_utils.py |https://github.com/hiyouga/LLaMA-Factory/commit/8664262cde3919e10eaecbd66e8c5d356856362e#diff-ebe08ab14496dfb9e06075f0fdd36799ef6d1535cc4dd4715b74c4e3e06fe3ba|源码实现| +| 文件位置 | 公网地址 | 公网地址用途 | 
+|-------------------------------------------------------------------------------------------------|-----------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/foundation/ChatGLM3-6B/script/transformers_op/training_args.py | https://docs.ray.io/en/latest/tune/api_docs/analysis.html | 相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/foundation/ChatGLM3-6B/script/transformers_op/training_args.py | https://arxiv.org/abs/2310.05914 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/contrib/foundation/ChatGLM3-6B/script/transformers_op/training_args.py | https://github.com/neelsjain/NEFTune | 源码地址 | \ No newline at end of file diff --git a/PyTorch/contrib/nlp/BERT-NER-Pytorch/public_address_statement.md b/PyTorch/contrib/nlp/BERT-NER-Pytorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..ccb910cb7f8fdd4f55fac0a87e571d3e3a75fbb2 --- /dev/null +++ b/PyTorch/contrib/nlp/BERT-NER-Pytorch/public_address_statement.md @@ -0,0 +1,13 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|-----------------------------------------------------------------------------------|------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/nlp/BERT-NER-Pytorch/tools/download_clue_data.py | https://storage.googleapis.com/cluebenchmark/tasks/cmrc2018_public.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/BERT-NER-Pytorch/tools/download_clue_data.py | https://storage.googleapis.com/cluebenchmark/tasks/cluener_public.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/BERT-NER-Pytorch/tools/download_clue_data.py | https://storage.googleapis.com/cluebenchmark/tasks/wsc_public.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/BERT-NER-Pytorch/tools/download_clue_data.py | https://storage.googleapis.com/cluebenchmark/tasks/tnews_public.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/BERT-NER-Pytorch/tools/download_clue_data.py | https://storage.googleapis.com/cluebenchmark/tasks/iflytek_public.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/BERT-NER-Pytorch/tools/download_clue_data.py | https://storage.googleapis.com/cluebenchmark/tasks/drcd_public.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/BERT-NER-Pytorch/tools/download_clue_data.py | https://storage.googleapis.com/cluebenchmark/tasks/csl_public.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/BERT-NER-Pytorch/tools/download_clue_data.py | https://storage.googleapis.com/cluebenchmark/tasks/copa_public.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/BERT-NER-Pytorch/tools/download_clue_data.py | https://storage.googleapis.com/cluebenchmark/tasks/cmnli_public.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/BERT-NER-Pytorch/tools/download_clue_data.py | https://storage.googleapis.com/cluebenchmark/tasks/chid_public.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/BERT-NER-Pytorch/tools/download_clue_data.py | https://storage.googleapis.com/cluebenchmark/tasks/afqmc_public.zip | 数据集链接 | \ No newline at end of file diff --git a/PyTorch/contrib/nlp/MarkupLM_ID2781_for_PyTorch/public_address_statement.md b/PyTorch/contrib/nlp/MarkupLM_ID2781_for_PyTorch/public_address_statement.md index 26d725c5083eec51e1d4cf243e24db63c94e0049..278fc20c4d2307b92d9b37ff6208b93f0b8d7178 100644 --- a/PyTorch/contrib/nlp/MarkupLM_ID2781_for_PyTorch/public_address_statement.md +++ b/PyTorch/contrib/nlp/MarkupLM_ID2781_for_PyTorch/public_address_statement.md @@ -1,3 +1,3 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | 
-|--------|----------------------------------------------------------------------------------------------------------------------------------|----------------------------------------------------|-----------------------|--------| -| 开发引入 | / | MarkupLM_ID2781_for_PyTorch/requirements.txt | https://download.pytorch.org/whl/torch_stable.html | 相关依赖 | +| 文件位置 | 公网地址 | 公网地址用途 | +|------------------------------------------------------------------------------------------------------------------|---------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/nlp/MarkupLM_ID2781_for_PyTorch/markuplmft/models/markuplm/modeling_markuplm.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | \ No newline at end of file diff --git a/PyTorch/contrib/nlp/SpanBERT/public_address_statement.md b/PyTorch/contrib/nlp/SpanBERT/public_address_statement.md index 0fe541aedc7c70568513b5f4abdf9fa6c000b533..25d2295163bcea922ec36e7846c66163845d8142 100644 --- a/PyTorch/contrib/nlp/SpanBERT/public_address_statement.md +++ b/PyTorch/contrib/nlp/SpanBERT/public_address_statement.md @@ -1,33 +1,17 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|----------------------------------------------|--------------------------------------------------|---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|---------| -| 开源代码引入 | https://github.com/facebookresearch/SpanBERT.git/code/pytorch_pretrained_bert/modeling.py | SpanBERT/code/pytorch_pretrained_bert/modeling.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/SpanBERT.git/code/pytorch_pretrained_bert/modeling.py | SpanBERT/code/pytorch_pretrained_bert/modeling.py | https://dl.fbaipublicfiles.com/fairseq/models/spanbert_hf.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/SpanBERT.git/code/pytorch_pretrained_bert/modeling.py | SpanBERT/code/pytorch_pretrained_bert/modeling.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/SpanBERT.git/code/pytorch_pretrained_bert/file_utils.py | SpanBERT/code/pytorch_pretrained_bert/file_utils_for_network.py | https://github.com/allenai/allennlp | 源码实现 | -| 开发引入 | / | SpanBERT/code/pytorch_pretrained_bert/tokenization.py | https://en.wikipedia.org/wiki/CJK_Unified_Ideographs_ | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/SpanBERT.git/code/pytorch_pretrained_bert/modeling.py | SpanBERT/code/pytorch_pretrained_bert/modeling.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/SpanBERT.git/code/run_tacred.py | SpanBERT/code/pytorch_pretrained_bert/modeling.py | https://www.github.com/nvidia/apex | 相关依赖 | -| 开发引入 | / | SpanBERT/pretraining/fairseq/optim/lr_scheduler/triangular_lr_scheduler.py | https://arxiv.org/pdf/1506.01186.pdf | 论文地址 | -| 开发引入 | / | SpanBERT/code/pytorch_pretrained_bert/file_utils_for_network.py | https://s3.amazonaws.com | 相关说明 | -| 开发引入 | / | SpanBERT/code/pytorch_pretrained_bert/modeling.py | https://www.tensorflow.org/install/ | 相关依赖 | -| 开源代码引入 | https://github.com/facebookresearch/SpanBERT.git/code/run_tacred.py | SpanBERT/code/run_tacred.py | https://www.github.com/nvidia/apex | 相关依赖 | -| 
开源代码引入 | https://github.com/facebookresearch/SpanBERT.git/pretraining/fairseq/models/pair_bert.py | SpanBERT/pretraining/fairseq/models/pair_bert.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/SpanBERT.git/code/pytorch_pretrained_bert/tokenization.py | SpanBERT/code/pytorch_pretrained_bert/tokenization.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-vocab.txt | 相关配置 | -| 开源代码引入 | https://github.com/facebookresearch/SpanBERT.git/code/run_tacred.py | SpanBERT/code/run_mrqa.py | https://www.github.com/nvidia/apex | 相关依赖 | -| 开发引入 | / | SpanBERT/code/download_finetuned.sh | http://dl.fbaipublicfiles.com/fairseq/models/spanbert_ | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/SpanBERT.git/pretraining/fairseq/optim/adam.py | SpanBERT/pretraining/fairseq/optim/adam.py | https://arxiv.org/abs/1412.6980 | 论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/SpanBERT.git/code/pytorch_pretrained_bert/tokenization.py | SpanBERT/code/pytorch_pretrained_bert/tokenization.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased-vocab.txt | 相关配置 | -| 开源代码引入 | https://github.com/facebookresearch/SpanBERT.git/code/pytorch_pretrained_bert/modeling.py | SpanBERT/code/pytorch_pretrained_bert/modeling.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/SpanBERT.git/pretraining/fairseq/optim/adam.py | SpanBERT/pretraining/fairseq/optim/adam.py | https://openreview.net/forum?id=ryQu7f-RZ | 相关说明 | -| 开发引入 | / | SpanBERT/pretraining/fairseq/optim/lr_scheduler/cosine_lr_scheduler.py | https://arxiv.org/pdf/1608.03983.pdf | 论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/SpanBERT.git/code/pytorch_pretrained_bert/file_utils.py | SpanBERT/code/pytorch_pretrained_bert/file_utils.py | https://github.com/allenai/allennlp | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/SpanBERT.git/code/pytorch_pretrained_bert/modeling.py | SpanBERT/code/pytorch_pretrained_bert/modeling.py | https://arxiv.org/abs/1606.08415 | 论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/SpanBERT.git/pretraining/fairseq/models/pair_bert.py | SpanBERT/pretraining/fairseq/models/hf_bert.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/SpanBERT.git/code/pytorch_pretrained_bert/tokenization.py | SpanBERT/code/pytorch_pretrained_bert/tokenization.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt | 相关配置 | -| 开源代码引入 | https://github.com/facebookresearch/SpanBERT.git/code/pytorch_pretrained_bert/tokenization.py | SpanBERT/code/pytorch_pretrained_bert/tokenization.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-vocab.txt | 相关配置 | -| 开源代码引入 | https://github.com/facebookresearch/SpanBERT.git/code/pytorch_pretrained_bert/modeling.py | SpanBERT/code/pytorch_pretrained_bert/modeling.py | https://dl.fbaipublicfiles.com/fairseq/models/spanbert_hf_base.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/SpanBERT.git/pretraining/fairseq/modules/adaptive_softmax.py | SpanBERT/pretraining/fairseq/modules/adaptive_softmax.py | http://arxiv.org/abs/1609.04309 | 论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/SpanBERT.git/code/run_tacred.py | SpanBERT/code/run_glue.py | https://www.github.com/nvidia/apex | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/SpanBERT.git/pretraining/fairseq/modules/highway.py | 
SpanBERT/pretraining/fairseq/modules/highway.py | https://arxiv.org/abs/1505.00387 | 论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/SpanBERT.git/pretraining/fairseq/models/pair_bert.py | SpanBERT/code/pytorch_pretrained_bert/modeling.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/SpanBERT.git/pretraining/fairseq/optim/adam.py | SpanBERT/pretraining/fairseq/optim/adam.py | https://arxiv.org/abs/1711.05101 | 论文地址 | +| 文件位置 | 公网地址 | 公网地址用途 | +|------------------------------------------------------------------------------------------------------|----------------------------------------------------------------------------------|-------------| +| ModelZoo-PyTorch/PyTorch/contrib/nlp/SpanBERT/code/download_finetuned.sh | http://dl.fbaipublicfiles.com/fairseq/models/spanbert_$model.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/SpanBERT/code/pytorch_pretrained_bert/file_utils_for_network.py | https://s3.amazonaws.com | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/SpanBERT/code/pytorch_pretrained_bert/modeling.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/SpanBERT/code/pytorch_pretrained_bert/modeling.py | https://dl.fbaipublicfiles.com/fairseq/models/spanbert_hf.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/SpanBERT/code/pytorch_pretrained_bert/modeling.py | https://dl.fbaipublicfiles.com/fairseq/models/spanbert_hf_base.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/SpanBERT/code/pytorch_pretrained_bert/modeling.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/SpanBERT/code/pytorch_pretrained_bert/modeling.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/SpanBERT/code/pytorch_pretrained_bert/modeling.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/SpanBERT/code/pytorch_pretrained_bert/modeling.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/SpanBERT/code/pytorch_pretrained_bert/tokenization.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-vocab.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/SpanBERT/code/pytorch_pretrained_bert/tokenization.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased-vocab.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/SpanBERT/code/pytorch_pretrained_bert/tokenization.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-vocab.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/SpanBERT/code/pytorch_pretrained_bert/tokenization.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-vocab.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/SpanBERT/code/pytorch_pretrained_bert/tokenization.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/SpanBERT/code/pytorch_pretrained_bert/tokenization.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased-vocab.txt | 模型相关说明 | \ No newline at end of file diff --git a/PyTorch/contrib/nlp/TransformerXL/public_address_statement.md b/PyTorch/contrib/nlp/TransformerXL/public_address_statement.md index 
5010aaff4804f285904a82f280cc33f8187dd097..dc2a7b0c35bd16aa9daa9ae9e2f42a0871978237 100644 --- a/PyTorch/contrib/nlp/TransformerXL/public_address_statement.md +++ b/PyTorch/contrib/nlp/TransformerXL/public_address_statement.md @@ -1,11 +1,8 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|---|----------------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------------------------------|--------| -| 开源代码引入 | https://github.com/kimiyoung/transformer-xl/tree/master/pytorch/getdata.sh | TransformerXL/getdata.sh | http://www.statmt.org/lm-benchmark/1-billion-word-language-modeling-benchmark-r13output.tar.gz | 数据集地址 | -| 开源代码引入 | https://github.com/kimiyoung/transformer-xl/tree/master/pytorch/getdata.sh | TransformerXL/getdata.sh | http://mattmahoney.net/dc/enwik8.zip | 数据集地址 | -| 开源代码引入 | https://github.com/kimiyoung/transformer-xl/tree/master/pytorch/pytorch/utils/log_uniform_sampler.py | TransformerXL/utils/log_uniform_sampler.py | https://github.com/tensorflow/tensorflow/blob/r1.10/tensorflow/python/ops/candidate_sampling_ops.py | 源码实现 | -| 开源代码引入 | https://github.com/kimiyoung/transformer-xl/tree/master/pytorch/getdata.sh | TransformerXL/getdata.sh | http://mattmahoney.net/dc/text8.zip | 数据集地址 | -| 开源代码引入 | https://github.com/kimiyoung/transformer-xl/tree/master/pytorch/getdata.sh | TransformerXL/getdata.sh | https://s3.amazonaws.com/research.metamind.io/wikitext/wikitext-103-v1.zip | 数据集地址 | -| 开源代码引入 | https://github.com/kimiyoung/transformer-xl/tree/master/pytorch/getdata.sh | TransformerXL/getdata.sh | https://s3.amazonaws.com/research.metamind.io/wikitext/wikitext-2-v1.zip | 数据集地址 | -| 开源代码引入 | https://github.com/kimiyoung/transformer-xl/tree/master/pytorch/getdata.sh | TransformerXL/getdata.sh | http://www.fit.vutbr.cz/~imikolov/rnnlm/simple-examples.tgz | 数据集地址 | -| 开源代码引入 | https://github.com/kimiyoung/transformer-xl/tree/master/pytorch/getdata.sh | TransformerXL/getdata.sh | https://raw.githubusercontent.com/salesforce/awd-lstm-lm/master/data/enwik8/prep_enwik8.py | 源码实现 | -| 开源代码引入 | https://github.com/kimiyoung/transformer-xl/tree/master/pytorch/getdata.sh | TransformerXL/getdata.sh | https://github.com/rafaljozefowicz/lm/raw/master/1b_word_vocab.txt | 数据集地址 | +| 文件位置 | 公网地址 | 公网地址用途 | +|---------------------------------------------------------------|------------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/nlp/TransformerXL/getdata.sh | http://www.statmt.org/lm-benchmark/1-billion-word-language-modeling-benchmark-r13output.tar.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/TransformerXL/getdata.sh | https://s3.amazonaws.com/research.metamind.io/wikitext/wikitext-2-v1.zip | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/TransformerXL/getdata.sh | https://s3.amazonaws.com/research.metamind.io/wikitext/wikitext-103-v1.zip | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/TransformerXL/getdata.sh | http://mattmahoney.net/dc/text8.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/TransformerXL/getdata.sh | http://mattmahoney.net/dc/enwik8.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/TransformerXL/getdata.sh | http://www.vision.caltech.edu/Image_Datasets/Caltech101/Annotations.tar | 数据集链接 | \ No newline at end of file diff --git a/PyTorch/contrib/nlp/albert_ID0335_for_PyTorch/public_address_statement.md 
b/PyTorch/contrib/nlp/albert_ID0335_for_PyTorch/public_address_statement.md index f869447615e04daae2b8d10e90251af9f486c69d..93f735d76bf461620ee6a133347ac39437f772fb 100644 --- a/PyTorch/contrib/nlp/albert_ID0335_for_PyTorch/public_address_statement.md +++ b/PyTorch/contrib/nlp/albert_ID0335_for_PyTorch/public_address_statement.md @@ -1,59 +1,24 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ---- | ------------ | ------ | ------------------------------------ | -------- | -| 开源代码引入 | https://github.com/lonePatient/albert_pytorch/blob/master/run_classifier.py|albert_ID0335_for_PyTorch/run_classifier.py | https://www.github.com/nvidia/apex | 模型相关说明 | -| 开源代码引入 | https://github.com/lonePatient/albert_pytorch/blob/master/run_classifier.py|albert_ID0335_for_PyTorch/run_classifier.py | https://nvidia.github.io/apex/amp.html | 模型相关说明 | -| 开源代码引入 | https://github.com/lonePatient/albert_pytorch/blob/master/run_classifier.py|albert_ID0335_for_PyTorch/run_classifier.py | https://code.visualstudio.com/docs/python/debugging#_attach-to-a-local-script | 模型相关说明 | -| 开源代码引入 | https://github.com/lonePatient/albert_pytorch/blob/master/metrics/glue_compute_metrics.py|albert_ID0335_for_PyTorch/metrics/glue_compute_metrics.py | https://scikit-learn.org/stable/index.html | 模型相关说明 | -| 开源代码引入 | https://github.com/lonePatient/albert_pytorch/blob/master/callback/lr_scheduler.py|albert_ID0335_for_PyTorch/callback/lr_scheduler.py | https://arxiv.org/abs/1711.05101 | 模型相关说明 | -| 开源代码引入 | https://github.com/lonePatient/albert_pytorch/blob/master/callback/lr_scheduler.py|albert_ID0335_for_PyTorch/callback/lr_scheduler.py | https://arxiv.org/abs/1711.05101 | 模型相关说明 | -| 开源代码引入 | https://github.com/lonePatient/albert_pytorch/blob/master/model/file_utils.py|albert_ID0335_for_PyTorch/model/file_utils.py | https://github.com/allenai/allennlp | 源码实现 | -| 开源代码引入 | https://github.com/lonePatient/albert_pytorch/blob/master/model/modeling_albert.py|albert_ID0335_for_PyTorch/model/modeling_albert.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/lonePatient/albert_pytorch/blob/master/model/modeling_albert.py|albert_ID0335_for_PyTorch/model/modeling_albert.py | https://arxiv.org/abs/1606.08415 | 模型相关说明 | -| 开源代码引入 | https://github.com/lonePatient/albert_pytorch/blob/master/model/modeling_albert.py|albert_ID0335_for_PyTorch/model/modeling_albert.py | https://arxiv.org/abs/1606.08415 | 模型相关说明 | -| 开源代码引入 | https://github.com/lonePatient/albert_pytorch/blob/master/model/modeling_albert.py|albert_ID0335_for_PyTorch/model/modeling_albert_bright.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/lonePatient/albert_pytorch/blob/master/model/modeling_bert.py|albert_ID0335_for_PyTorch/model/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-pytorch_model.bin | 模型相关说明 | -| 开源代码引入 | https://github.com/lonePatient/albert_pytorch/blob/master/model/modeling_bert.py|albert_ID0335_for_PyTorch/model/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-pytorch_model.bin | 模型相关说明 | -| 开源代码引入 | https://github.com/lonePatient/albert_pytorch/blob/master/model/modeling_bert.py|albert_ID0335_for_PyTorch/model/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased-pytorch_model.bin | 模型相关说明 | -| 开源代码引入 | https://github.com/lonePatient/albert_pytorch/blob/master/model/modeling_bert.py|albert_ID0335_for_PyTorch/model/modeling_bert.py | 
https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-pytorch_model.bin | 模型相关说明 | -| 开源代码引入 | https://github.com/lonePatient/albert_pytorch/blob/master/model/modeling_bert.py|albert_ID0335_for_PyTorch/model/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-uncased-pytorch_model.bin | 模型相关说明 | -| 开源代码引入 | https://github.com/lonePatient/albert_pytorch/blob/master/model/modeling_bert.py|albert_ID0335_for_PyTorch/model/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-cased-pytorch_model.bin | 模型相关说明 | -| 开源代码引入 | https://github.com/lonePatient/albert_pytorch/blob/master/model/modeling_bert.py|albert_ID0335_for_PyTorch/model/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-chinese-pytorch_model.bin | 模型相关说明 | -| 开源代码引入 | https://github.com/lonePatient/albert_pytorch/blob/master/model/modeling_bert.py|albert_ID0335_for_PyTorch/model/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-german-cased-pytorch_model.bin | 模型相关说明 | -| 开源代码引入 | https://github.com/lonePatient/albert_pytorch/blob/master/model/modeling_bert.py|albert_ID0335_for_PyTorch/model/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-whole-word-masking-pytorch_model.bin | 模型相关说明 | -| 开源代码引入 | https://github.com/lonePatient/albert_pytorch/blob/master/model/modeling_bert.py|albert_ID0335_for_PyTorch/model/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-whole-word-masking-pytorch_model.bin | 模型相关说明 | -| 开源代码引入 | https://github.com/lonePatient/albert_pytorch/blob/master/model/modeling_bert.py|albert_ID0335_for_PyTorch/model/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-whole-word-masking-finetuned-squad-pytorch_model.bin | 模型相关说明 | -| 开源代码引入 | https://github.com/lonePatient/albert_pytorch/blob/master/model/modeling_bert.py|albert_ID0335_for_PyTorch/model/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-whole-word-masking-finetuned-squad-pytorch_model.bin | 模型相关说明 | -| 开源代码引入 | https://github.com/lonePatient/albert_pytorch/blob/master/model/modeling_bert.py|albert_ID0335_for_PyTorch/model/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased-finetuned-mrpc-pytorch_model.bin | 模型相关说明 | -| 开源代码引入 | https://github.com/lonePatient/albert_pytorch/blob/master/model/modeling_bert.py|albert_ID0335_for_PyTorch/model/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-german-dbmdz-cased-pytorch_model.bin | 模型相关说明 | -| 开源代码引入 | https://github.com/lonePatient/albert_pytorch/blob/master/model/modeling_bert.py|albert_ID0335_for_PyTorch/model/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-german-dbmdz-uncased-pytorch_model.bin | 模型相关说明 | -| 开源代码引入 | https://github.com/lonePatient/albert_pytorch/blob/master/model/modeling_albert.py|albert_ID0335_for_PyTorch/model/modeling_bert.py | https://www.tensorflow.org/install/ | 模型相关说明 | -| 开源代码引入 | https://github.com/lonePatient/albert_pytorch/blob/master/model/modeling_albert.py|albert_ID0335_for_PyTorch/model/modeling_bert.py | https://arxiv.org/abs/1606.08415 | 模型相关说明 | -| 开源代码引入 | https://github.com/lonePatient/albert_pytorch/blob/master/model/modeling_albert.py|albert_ID0335_for_PyTorch/model/modeling_bert.py | https://arxiv.org/abs/1606.08415 | 模型相关说明 | -| 开源代码引入 | 
https://github.com/lonePatient/albert_pytorch/blob/master/model/modeling_albert.py|albert_ID0335_for_PyTorch/model/modeling_albert_bright.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/lonePatient/albert_pytorch/blob/master/model/modeling_albert.py|albert_ID0335_for_PyTorch/model/modeling_albert.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/lonePatient/albert_pytorch/blob/master/model/modeling_albert_bright.py|albert_ID0335_for_PyTorch/model/modeling_albert_bright.py | https://arxiv.org/abs/1810.04805 | 模型相关说明 | -| 开源代码引入 | https://github.com/lonePatient/albert_pytorch/blob/master/model/modeling_albert.py|albert_ID0335_for_PyTorch/model/modeling_albert_bright.py | https://pytorch.org/docs/stable/nn.html#module | 模型相关说明 | -| 开源代码引入 | https://github.com/lonePatient/albert_pytorch/blob/master/model/modeling_albert.py|albert_ID0335_for_PyTorch/model/modeling_albert.py | https://arxiv.org/abs/1909.11942 | 模型相关说明 | -| 开源代码引入 | https://github.com/lonePatient/albert_pytorch/blob/master/model/modeling_albert.py|albert_ID0335_for_PyTorch/model/modeling_albert.py | https://pytorch.org/docs/stable/nn.html#module | 模型相关说明 | -| 开源代码引入 | https://github.com/lonePatient/albert_pytorch/blob/master/model/modeling_albert.py|albert_ID0335_for_PyTorch/model/modeling_bert.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/lonePatient/albert_pytorch/blob/master/model/modeling_albert_bright.py|albert_ID0335_for_PyTorch/model/modeling_bert.py | https://arxiv.org/abs/1810.04805 | 模型相关说明 | -| 开源代码引入 | https://github.com/lonePatient/albert_pytorch/blob/master/model/modeling_albert.py|albert_ID0335_for_PyTorch/model/modeling_bert.py | https://pytorch.org/docs/stable/nn.html#module | 模型相关说明 | -| 开源代码引入 | https://github.com/lonePatient/albert_pytorch/blob/master/model/modeling_utils.py|albert_ID0335_for_PyTorch/model/modeling_utils.py | https://github.com/zihangdai/xlnet/blob/master/modeling.py#L253-L276 | 源码实现 | -| 开源代码引入 | https://github.com/lonePatient/albert_pytorch/blob/master/model/tokenization_bert.py|albert_ID0335_for_PyTorch/model/tokenization_bert.py | https://github.com/huggingface/pytorch-pretrained-BERT/issues/328 | 模型相关说明 | -| 开源代码引入 | https://github.com/lonePatient/albert_pytorch/blob/master/model/tokenization_albert.py|albert_ID0335_for_PyTorch/model/tokenization_albert.py | https://en.wikipedia.org/wiki/CJK_Unified_Ideographs_ | 模型相关说明 | -| 开源代码引入 | https://github.com/lonePatient/albert_pytorch/blob/master/model/tokenization_bert.py|albert_ID0335_for_PyTorch/model/tokenization_bert.py | https://github.com/huggingface/pytorch-pretrained-BERT/issues/328 | 模型相关说明 | -| 开源代码引入 | https://github.com/lonePatient/albert_pytorch/blob/master/model/tokenization_albert.py|albert_ID0335_for_PyTorch/model/tokenization_bert.py | https://en.wikipedia.org/wiki/CJK_Unified_Ideographs_ | 模型相关说明 | -| 开源代码引入 | https://github.com/lonePatient/albert_pytorch/blob/master/model/tokenization_utils.py|albert_ID0335_for_PyTorch/model/tokenization_utils.py | https://github.com/huggingface/transformers/issues/1133 | 模型相关说明 | -| 开源代码引入 | https://github.com/lonePatient/albert_pytorch/blob/master/callback/optimization/adabound.py|albert_ID0335_for_PyTorch/callback/optimization/adabound.py | https://openreview.net/forum?id=Bkg3g2R9FX | 模型相关说明 | -| 开源代码引入 | https://github.com/lonePatient/albert_pytorch/blob/master/callback/optimization/adafactor.py|albert_ID0335_for_PyTorch/callback/optimization/adafactor.py | 
https://arxiv.org/pdf/1804.04235.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/lonePatient/albert_pytorch/blob/master/callback/optimization/adafactor.py|albert_ID0335_for_PyTorch/callback/optimization/adafactor.py | https://github.com/DeadAt0m/adafactor-pytorch | 源码实现 | -| 开源代码引入 | https://github.com/lonePatient/albert_pytorch/blob/master/callback/optimization/lamb.py|albert_ID0335_for_PyTorch/callback/optimization/lamb.py | https://arxiv.org/abs/1904.00962 | 模型相关说明 | -| 开源代码引入 | https://github.com/lonePatient/albert_pytorch/blob/master/callback/optimization/lars.py|albert_ID0335_for_PyTorch/callback/optimization/lars.py | https://arxiv.org/pdf/1708.03888.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/lonePatient/albert_pytorch/blob/master/callback/optimization/lookahead.py|albert_ID0335_for_PyTorch/callback/optimization/lookahead.py | https://arxiv.org/abs/1907.08610 | 模型相关说明 | -| 开源代码引入 | https://github.com/lonePatient/albert_pytorch/blob/master/callback/optimization/nadam.py|albert_ID0335_for_PyTorch/callback/optimization/nadam.py | http://cs229.stanford.edu/proj2015/054_report.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/lonePatient/albert_pytorch/blob/master/callback/optimization/nadam.py|albert_ID0335_for_PyTorch/callback/optimization/nadam.py | http://www.cs.toronto.edu/~fritz/absps/momentum.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/lonePatient/albert_pytorch/blob/master/callback/optimization/nadam.py|albert_ID0335_for_PyTorch/callback/optimization/nadam.py | https://github.com/pytorch/pytorch/pull/1408 | 源码实现 | -| 开源代码引入 | https://github.com/lonePatient/albert_pytorch/blob/master/callback/optimization/radam.py|albert_ID0335_for_PyTorch/callback/optimization/radam.py | https://arxiv.org/pdf/1908.03265.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/lonePatient/albert_pytorch/blob/master/callback/optimization/radam.py|albert_ID0335_for_PyTorch/callback/optimization/ralars.py | https://arxiv.org/pdf/1908.03265.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/lonePatient/albert_pytorch/blob/master/callback/optimization/lars.py|albert_ID0335_for_PyTorch/callback/optimization/ralars.py | https://arxiv.org/pdf/1708.03888.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/lonePatient/albert_pytorch/blob/master/callback/lr_scheduler.py|albert_ID0335_for_PyTorch/callback/optimization/sgdw.py | https://arxiv.org/abs/1711.0510 | 模型相关说明 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|------------------------------------------------------------------------------------------------|-----------------------------------------------------------------------------------------------------------------------------|-------------| +| ModelZoo-PyTorch/PyTorch/contrib/nlp/albert_ID0335_for_PyTorch/metrics/glue_compute_metrics.py | https://scikit-learn.org/stable/index.html | 相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/albert_ID0335_for_PyTorch/model/modeling_albert.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/albert_ID0335_for_PyTorch/model/modeling_albert.py | https://pytorch.org/docs/stable/nn.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/albert_ID0335_for_PyTorch/model/modeling_albert_bright.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/albert_ID0335_for_PyTorch/model/modeling_albert_bright.py | https://pytorch.org/docs/stable/nn.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/albert_ID0335_for_PyTorch/model/modeling_bert.py | https://www.tensorflow.org/install/ | 三方库install | +| 
ModelZoo-PyTorch/PyTorch/contrib/nlp/albert_ID0335_for_PyTorch/model/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-whole-word-masking-finetuned-squad-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/albert_ID0335_for_PyTorch/model/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-whole-word-masking-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/albert_ID0335_for_PyTorch/model/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/albert_ID0335_for_PyTorch/model/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-whole-word-masking-finetuned-squad-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/albert_ID0335_for_PyTorch/model/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-whole-word-masking-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/albert_ID0335_for_PyTorch/model/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/albert_ID0335_for_PyTorch/model/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/albert_ID0335_for_PyTorch/model/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-uncased-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/albert_ID0335_for_PyTorch/model/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-cased-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/albert_ID0335_for_PyTorch/model/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-german-dbmdz-uncased-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/albert_ID0335_for_PyTorch/model/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-german-dbmdz-cased-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/albert_ID0335_for_PyTorch/model/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-german-cased-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/albert_ID0335_for_PyTorch/model/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-chinese-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/albert_ID0335_for_PyTorch/model/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased-finetuned-mrpc-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/albert_ID0335_for_PyTorch/model/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/albert_ID0335_for_PyTorch/model/modeling_bert.py | https://pytorch.org/docs/stable/nn.html | 三方库链接 | \ No newline at end of file diff --git a/PyTorch/contrib/nlp/roberta_for_PyTorch/public_address_statement.md b/PyTorch/contrib/nlp/roberta_for_PyTorch/public_address_statement.md index 95d50f8775385dcb7695ff89fcfe1470fb95064b..ac524d8bd64a7b4e1def885ef6430cfe8243266f 100644 --- a/PyTorch/contrib/nlp/roberta_for_PyTorch/public_address_statement.md +++ 
b/PyTorch/contrib/nlp/roberta_for_PyTorch/public_address_statement.md @@ -1,169 +1,81 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ---- | ------------ | ------ | ------------------------------------ | -------- | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/setup.py|roberta_for_PyTorch/setup.py | https://stackoverflow.com/a/54128391 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/setup.py|roberta_for_PyTorch/setup.py | https://download.pytorch.org/whl/cpu/torch-1.7.0%2Bcpu-cp36-cp36m-linux_x86_64.whl | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/setup.py|roberta_for_PyTorch/setup.py | https://bit.ly/2NLVsgE | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/.github/ISSUE_TEMPLATE.md|roberta_for_PyTorch/setup.py | https://github.com/pytorch/fairseq | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/data/audio/feature_transforms/specaugment.py|roberta_for_PyTorch/fairseq/data/audio/feature_transforms/specaugment.py | https://arxiv.org/abs/1904.08779 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/modules/emformer.py|roberta_for_PyTorch/fairseq/models/speech_to_text/modules/emformer.py | https://arxiv.org/abs/1803.02155 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/modules/augmented_memory_attention.py|roberta_for_PyTorch/fairseq/models/speech_to_text/modules/augmented_memory_attention.py | https://arxiv.org/abs/2005.08042 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/modules/augmented_memory_attention.py|roberta_for_PyTorch/fairseq/models/speech_to_text/modules/augmented_memory_attention.py | https://arxiv.org/abs/2005.09137 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/modules/emformer.py|roberta_for_PyTorch/fairseq/models/speech_to_text/modules/emformer.py | https://arxiv.org/abs/2005.09684 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/README.md | roberta_for_PyTorch/docs/conf.py | https://github.com/pytorch/fairseq/tree/master/docs/ | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/docs/conf.py|roberta_for_PyTorch/docs/conf.py | http://docs.scipy.org/doc/numpy/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/docs/conf.py|roberta_for_PyTorch/docs/conf.py | https://docs.python.org/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/docs/conf.py|roberta_for_PyTorch/docs/conf.py | https://pytorch.org/docs/master/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/file_utils.py|roberta_for_PyTorch/fairseq/file_utils.py | https://github.com/allenai/allennlp | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/bart/README.md|roberta_for_PyTorch/fairseq/file_utils.py | https://github.com/huggingface | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/layerdrop/README.md|roberta_for_PyTorch/fairseq/checkpoint_utils.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/constrained_decoding/README.md|roberta_for_PyTorch/fairseq/search.py | https://www.aclweb.org/anthology/N18-1119/ | 模型相关说明 | -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/main/examples/constrained_decoding/README.md|roberta_for_PyTorch/fairseq/search.py | https://www.aclweb.org/anthology/N19-1090/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/docs/make.bat|roberta_for_PyTorch/docs/make.bat | http://sphinx-doc.org/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_dlm/sequence_generator/multichannel_search.py|roberta_for_PyTorch/fairseq/search.py | https://arxiv.org/abs/1904.09751 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/search.py|roberta_for_PyTorch/fairseq/search.py | https://arxiv.org/abs/1611.08562 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/wav2vec/unsupervised/w2vu_generate.py|roberta_for_PyTorch/fairseq_cli/hydra_train.py | https://github.com/facebookresearch/hydra/issues/1126 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/wav2vec/unsupervised/w2vu_generate.py|roberta_for_PyTorch/fairseq_cli/train.py | https://github.com/facebookresearch/hydra/issues/1126 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/scripts/build_sym_alignment.py|roberta_for_PyTorch/scripts/build_sym_alignment.py | http://github.com/clab/fast_align | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/scripts/build_sym_alignment.py|roberta_for_PyTorch/scripts/build_sym_alignment.py | http://github.com/moses-smt/mosesdecoder | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/scripts/build_sym_alignment.py|roberta_for_PyTorch/scripts/build_sym_alignment.py | http://www.statmt.org/moses/?n=Development.GetStarted | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/bart/README.summarization.md|roberta_for_PyTorch/examples/roberta/multiprocessing_bpe_encoder.py | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/encoder.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/bart/README.summarization.md|roberta_for_PyTorch/examples/roberta/multiprocessing_bpe_encoder.py | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/vocab.bpe | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/roberta/preprocess_GLUE_tasks.sh|roberta_for_PyTorch/examples/roberta/preprocess_GLUE_tasks.sh | https://gist.github.com/W4ngatang/60c2bdb54d156a41194446737ce03e2e | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/bart/README.summarization.md|roberta_for_PyTorch/examples/roberta/preprocess_GLUE_tasks.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/encoder.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/bart/README.summarization.md|roberta_for_PyTorch/examples/roberta/preprocess_GLUE_tasks.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/vocab.bpe | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/bart/README.summarization.md|roberta_for_PyTorch/examples/roberta/preprocess_GLUE_tasks.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/dict.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/bart/README.summarization.md|roberta_for_PyTorch/examples/roberta/preprocess_RACE.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/encoder.json | 模型参数相关配置 | -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/main/examples/bart/README.summarization.md|roberta_for_PyTorch/examples/roberta/preprocess_RACE.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/vocab.bpe | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/bart/README.summarization.md|roberta_for_PyTorch/examples/roberta/preprocess_RACE.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/dict.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/criterions/adaptive_loss.py|roberta_for_PyTorch/fairseq/criterions/adaptive_loss.py | http://arxiv.org/abs/1609.04309 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/data/indexed_dataset.py|roberta_for_PyTorch/fairseq/data/indexed_dataset.py | https://github.com/numpy/numpy/issues/5745 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/wav2vec/README.md|roberta_for_PyTorch/fairseq/data/mask_tokens_dataset.py | https://arxiv.org/abs/1910.05453 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/dataclass/constants.py|roberta_for_PyTorch/fairseq/dataclass/constants.py | https://github.com/facebookresearch/hydra/issues/1156 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/dataclass/configs.py|roberta_for_PyTorch/fairseq/dataclass/configs.py | https://github.com/facebookresearch/hydra/issues/1117 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/fairseq_incremental_decoder.py|roberta_for_PyTorch/fairseq/models/fairseq_incremental_decoder.py | http://www.telesens.co/2019/04/21/understanding-incremental-decoding-in-fairseq/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/conv_seq2seq/README.md|roberta_for_PyTorch/fairseq/models/fconv.py | https://arxiv.org/abs/1705.03122 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/docs/getting_started.rst|roberta_for_PyTorch/fairseq/models/fconv.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt14.v2.en-fr.fconv-py.tar.bz2 | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/conv_seq2seq/README.md|roberta_for_PyTorch/fairseq/models/fconv.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt14.en-de.fconv-py.tar.bz2 | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/conv_seq2seq/README.md|roberta_for_PyTorch/fairseq/models/fconv.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt17.v2.en-de.fconv-py.tar.bz2 | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/fconv_self_att.py|roberta_for_PyTorch/fairseq/models/fconv_self_att.py | https://dl.fbaipublicfiles.com/fairseq/models/stories_checkpoint.tar.gz | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/fconv_self_att.py|roberta_for_PyTorch/fairseq/models/fconv_self_att.py | https://dl.fbaipublicfiles.com/fairseq/models/stories_checkpoint.tar.gz | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/stories/README.md|roberta_for_PyTorch/fairseq/models/fconv_self_att.py | https://dl.fbaipublicfiles.com/fairseq/data/stories_test.tar.bz2 | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/lightconv.py|roberta_for_PyTorch/fairseq/models/lightconv.py | https://openreview.net/pdf?id=SkVhlh09tX | 模型相关说明 | -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/main/examples/pay_less_attention_paper/README.md|roberta_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/iwslt14.de-en.lightconv.tar.gz | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/pay_less_attention_paper/README.md|roberta_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/iwslt14.de-en.dynamicconv.tar.gz | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/pay_less_attention_paper/README.md|roberta_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.lightconv.tar.gz | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/pay_less_attention_paper/README.md|roberta_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.dynamicconv.tar.gz | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/pay_less_attention_paper/README.md|roberta_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.lightconv-glu.tar.gz | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/pay_less_attention_paper/README.md|roberta_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.dynamicconv-glu.tar.gz | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/pay_less_attention_paper/README.md|roberta_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.lightconv-glu.tar.gz | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/pay_less_attention_paper/README.md|roberta_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.dynamicconv-glu.tar.gz | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/pay_less_attention_paper/README.md|roberta_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt14.en-fr.joined-dict.lightconv-glu.tar.gz | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/pay_less_attention_paper/README.md|roberta_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt14.en-fr.joined-dict.dynamicconv-glu.tar.gz | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/pay_less_attention_paper/README.md|roberta_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt17.zh-en.lightconv-glu.tar.gz | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/pay_less_attention_paper/README.md|roberta_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt17.zh-en.dynamicconv-glu.tar.gz | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/language_model/README.adaptive_inputs.md|roberta_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/adaptive_lm_gbw_huge.tar.bz2 | 相应预训练模型 | -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/main/examples/language_model/README.adaptive_inputs.md|roberta_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/adaptive_lm_wiki103.v2.tar.bz2 | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer_lm.py|roberta_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.en.tar.bz2 | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer_lm.py|roberta_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.de.tar.bz2 | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer_lm.py|roberta_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.ru.tar.bz2 | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer_lm.py|roberta_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt20.en.tar.gz | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer_lm.py|roberta_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt20.ta.tar.gz | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer_lm.py|roberta_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt20.iu.news.tar.gz | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer_lm.py|roberta_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt20.iu.nh.tar.gz | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/criterions/adaptive_loss.py|roberta_for_PyTorch/fairseq/modules/adaptive_softmax.py | http://arxiv.org/abs/1609.04309 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/modules/character_token_embedder.py|roberta_for_PyTorch/fairseq/modules/character_token_embedder.py | https://arxiv.org/abs/1505.00387 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/nonautoregressive_translation/README.md|roberta_for_PyTorch/fairseq/modules/dynamic_crf_layer.py | https://arxiv.org/abs/1910.11555 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/modules/dynamic_crf_layer.py|roberta_for_PyTorch/fairseq/modules/dynamic_crf_layer.py | https://github.com/kmkurn/pytorch-crf/blob/master/torchcrf/__init__.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/modules/gelu.py|roberta_for_PyTorch/fairseq/modules/gelu.py | https://github.com/hendrycks/GELUs | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/layerdrop/README.md|roberta_for_PyTorch/fairseq/modules/layer_drop.py | https://arxiv.org/abs/1909.11556 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/modules/sparse_multihead_attention.py|roberta_for_PyTorch/fairseq/modules/sparse_multihead_attention.py | https://arxiv.org/pdf/1904.10509.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/modules/vggblock.py|roberta_for_PyTorch/fairseq/modules/vggblock.py | https://arxiv.org/pdf/1409.1556.pdf | 参考论文地址 | -| 
开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/optim/adafactor.py|roberta_for_PyTorch/fairseq/optim/adafactor.py | https://arxiv.org/abs/1804.04235 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/optim/adam.py|roberta_for_PyTorch/fairseq/optim/adamax.py | https://arxiv.org/abs/1412.6980 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/optim/adam.py|roberta_for_PyTorch/fairseq/optim/adam.py | https://arxiv.org/abs/1711.05101 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/optim/adam.py|roberta_for_PyTorch/fairseq/optim/adam.py | https://arxiv.org/abs/1412.6980 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/optim/adam.py|roberta_for_PyTorch/fairseq/optim/adam.py | https://openreview.net/forum?id=ryQu7f-RZ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/optim/bmuf.py|roberta_for_PyTorch/fairseq/optim/bmuf.py | https://ieeexplore.ieee.org/document/7472805 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/optim/adam.py|roberta_for_PyTorch/fairseq/optim/fused_adam.py | https://arxiv.org/abs/1412.6980 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/optim/adam.py|roberta_for_PyTorch/fairseq/optim/fused_adam.py | https://openreview.net/forum?id=ryQu7f-RZ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/scaling_nmt/README.md|roberta_for_PyTorch/fairseq/scoring/tokenizer.py | https://github.com/mjpost/sacrebleu | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/cross_lingual_language_model/README.md|roberta_for_PyTorch/fairseq/tasks/cross_lingual_lm.py | https://arxiv.org/pdf/1901.07291.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/tasks/fairseq_task.py|roberta_for_PyTorch/fairseq/tasks/fairseq_task.py | https://arxiv.org/abs/2010.00904 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/tasks/fairseq_task.py|roberta_for_PyTorch/fairseq/tasks/fairseq_task.py | https://github.com/facebookresearch/GENRE | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/nonautoregressive_translation/README.md|roberta_for_PyTorch/fairseq/tasks/translation_lev.py | https://arxiv.org/abs/1905.11006 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/tasks/translation_lev.py|roberta_for_PyTorch/fairseq/tasks/translation_lev.py | https://www.aclweb.org/anthology/2020.acl-main.325/ | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/roberta/commonsense_qa/download_cqa_data.sh|roberta_for_PyTorch/examples/roberta/commonsense_qa/download_cqa_data.sh | https://s3.amazonaws.com/commensenseqa/train_rand_split.jsonl | 模型参数相关配置 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/roberta/commonsense_qa/download_cqa_data.sh|roberta_for_PyTorch/examples/roberta/commonsense_qa/download_cqa_data.sh | https://s3.amazonaws.com/commensenseqa/dev_rand_split.jsonl | 模型参数相关配置 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/roberta/commonsense_qa/download_cqa_data.sh|roberta_for_PyTorch/examples/roberta/commonsense_qa/download_cqa_data.sh | https://s3.amazonaws.com/commensenseqa/test_rand_split_no_answers.jsonl | 模型参数相关配置 | -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/main/examples/bart/README.summarization.md|roberta_for_PyTorch/examples/roberta/commonsense_qa/download_cqa_data.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/dict.txt | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/clib/libbase/balanced_assignment.cpp|roberta_for_PyTorch/fairseq/clib/libbase/balanced_assignment.cpp | https://dspace.mit.edu/bitstream/handle/1721.1/3265/P-2108-26912652.pdf | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/clib/libbase/balanced_assignment.cpp|roberta_for_PyTorch/fairseq/clib/libbase/balanced_assignment.cpp | https://github.com/bkj/auction-lap | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/clib/libnat_cuda/binding.cpp|roberta_for_PyTorch/fairseq/clib/libnat_cuda/binding.cpp | https://github.com/1ytic/pytorch-edit-distance | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/bart/README.summarization.md|roberta_for_PyTorch/fairseq/data/encoders/gpt2_bpe.py | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/encoder.json | 模型参数相关配置 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/bart/README.summarization.md|roberta_for_PyTorch/fairseq/data/encoders/gpt2_bpe.py | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/vocab.bpe | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/data/encoders/gpt2_bpe_utils.py|roberta_for_PyTorch/fairseq/data/encoders/gpt2_bpe_utils.py | https://github.com/openai/gpt-2/blob/master/src/encoder.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/data/audio/speech_to_text_dataset.py|roberta_for_PyTorch/fairseq/data/audio/speech_to_text_dataset.py | https://arxiv.org/abs/1907.05019 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/README.md | roberta_for_PyTorch/fairseq/models/bart/hub_interface.py | https://github.com/pytorch/fairseq/tree/master/examples/bart | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/bart/model.py|roberta_for_PyTorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.base.tar.gz | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/bart/model.py|roberta_for_PyTorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.tar.gz | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/bart/model.py|roberta_for_PyTorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.mnli.tar.gz | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/bart/model.py|roberta_for_PyTorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.cnn.tar.gz | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/bart/model.py|roberta_for_PyTorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.xsum.tar.gz | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/README.md | roberta_for_PyTorch/fairseq/models/roberta/hub_interface.py | https://github.com/pytorch/fairseq/tree/master/examples/roberta | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model.py|roberta_for_PyTorch/fairseq/models/roberta/model.py | 
http://dl.fbaipublicfiles.com/fairseq/models/roberta.base.tar.gz | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model.py|roberta_for_PyTorch/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.tar.gz | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model.py|roberta_for_PyTorch/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.mnli.tar.gz | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model.py|roberta_for_PyTorch/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.wsc.tar.gz | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/hub_interface.py|roberta_for_PyTorch/fairseq/models/roberta/hub_interface.py | https://github.com/pytorch/fairseq/issues/1306 | 模型相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_camembert.py|roberta_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base.tar.gz | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_camembert.py|roberta_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base.tar.gz | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_camembert.py|roberta_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base.tar.gz | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_camembert.py|roberta_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-large.tar.gz | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_camembert.py|roberta_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-ccnet.tar.gz | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_camembert.py|roberta_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-ccnet-4gb.tar.gz | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_camembert.py|roberta_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-wikipedia-4gb.tar.gz | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_camembert.py|roberta_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-oscar-4gb.tar.gz | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/gottbert/README.md|roberta_for_PyTorch/fairseq/models/roberta/model_gottbert.py | https://dl.gottbert.de/fairseq/models/gottbert-base.tar.gz | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_xlmr.py|roberta_for_PyTorch/fairseq/models/roberta/model_xlmr.py | http://dl.fbaipublicfiles.com/fairseq/models/xlmr.base.tar.gz | 相应预训练模型 | -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_xlmr.py|roberta_for_PyTorch/fairseq/models/roberta/model_xlmr.py | http://dl.fbaipublicfiles.com/fairseq/models/xlmr.large.tar.gz | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_xlmr.py|roberta_for_PyTorch/fairseq/models/roberta/model_xlmr.py | http://dl.fbaipublicfiles.com/fairseq/models/xlmr/xlmr.xl.tar.gz | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/roberta/model_xlmr.py|roberta_for_PyTorch/fairseq/models/roberta/model_xlmr.py | http://dl.fbaipublicfiles.com/fairseq/models/xlmr/xlmr.xxl.tar.gz | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/berard.py|roberta_for_PyTorch/fairseq/models/speech_to_text/berard.py | https://arxiv.org/abs/1802.04200 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/berard.py|roberta_for_PyTorch/fairseq/models/speech_to_text/berard.py | https://github.com/eske/seq2seq | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/berard.py|roberta_for_PyTorch/fairseq/models/speech_to_text/berard.py | https://github.com/eske/seq2seq/blob/master/config/LibriSpeech/AST.yaml | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/berard.py|roberta_for_PyTorch/fairseq/models/speech_to_text/berard.py | https://github.com/eske/seq2seq/blob/master/translate/models.py | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/berard.py|roberta_for_PyTorch/fairseq/models/speech_to_text/berard.py | https://arxiv.org/abs/1409.0473 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/convtransformer.py|roberta_for_PyTorch/fairseq/models/speech_to_text/convtransformer.py | https://arxiv.org/abs/2004.10234 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/modules/convolution.py|roberta_for_PyTorch/fairseq/models/speech_to_text/s2t_transformer.py | https://arxiv.org/abs/1911.08460 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/pointer_generator/pointer_generator_src/transformer_pg.py|roberta_for_PyTorch/fairseq/models/speech_to_text/s2t_transformer.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/berard.py|roberta_for_PyTorch/fairseq/models/speech_to_text/berard.py | https://arxiv.org/abs/1409.0473 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/berard.py|roberta_for_PyTorch/fairseq/models/speech_to_text/berard.py | https://arxiv.org/abs/1802.04200 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/speech_to_text/README.md|roberta_for_PyTorch/fairseq/models/speech_to_text/berard.py | https://arxiv.org/abs/1909.06515 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/speech_to_text/berard.py|roberta_for_PyTorch/fairseq/models/speech_to_text/berard.py | https://arxiv.org/pdf/2002.01320.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/speech_to_text/README.md|roberta_for_PyTorch/fairseq/models/speech_to_text/berard.py | 
https://arxiv.org/abs/2006.12124 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/pointer_generator/pointer_generator_src/transformer_pg.py|roberta_for_PyTorch/fairseq/models/transformer/transformer_base.py | https://arxiv.org/abs/1706.03762 | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/scaling_nmt/README.md|roberta_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt14.en-fr.joined-dict.transformer.tar.bz2 | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/scaling_nmt/README.md|roberta_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt16.en-de.joined-dict.transformer.tar.bz2 | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/backtranslation/README.md|roberta_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt18.en-de.ensemble.tar.gz | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/README.md|roberta_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-de.joined-dict.ensemble.tar.gz | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/README.md|roberta_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-ru.ensemble.tar.gz | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/README.md|roberta_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.de-en.joined-dict.ensemble.tar.gz | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/translation/README.md|roberta_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.ru-en.ensemble.tar.gz | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer/transformer_legacy.py|roberta_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-de.joined-dict.single_model.tar.gz | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer/transformer_legacy.py|roberta_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-ru.single_model.tar.gz | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer/transformer_legacy.py|roberta_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.de-en.joined-dict.single_model.tar.gz | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/models/transformer/transformer_legacy.py|roberta_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.ru-en.single_model.tar.gz | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/wmt20/README.md|roberta_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.en-ta.single.tar.gz | 相应预训练模型 | -| 开源代码引入 | 
https://github.com/facebookresearch/fairseq/blob/main/examples/wmt20/README.md|roberta_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.en-iu.news.single.tar.gz | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/wmt20/README.md|roberta_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.en-iu.nh.single.tar.gz | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/wmt20/README.md|roberta_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.ta-en.single.tar.gz | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/wmt20/README.md|roberta_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.iu-en.news.single.tar.gz | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/wmt20/README.md|roberta_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.iu-en.nh.single.tar.gz | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/flores101/README.md|roberta_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/flores101/pretrained_models/flores101_mm100_615M.tar.gz | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/flores101/README.md|roberta_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/flores101/pretrained_models/flores101_mm100_175M.tar.gz | 相应预训练模型 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/megatron_11b/README.md|roberta_for_PyTorch/fairseq/model_parallel/modules/multihead_attention.py | https://arxiv.org/pdf/1909.08053.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/megatron_11b/README.md|roberta_for_PyTorch/fairseq/model_parallel/modules/transformer_layer.py | https://arxiv.org/pdf/1909.08053.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/examples/megatron_11b/README.md|roberta_for_PyTorch/fairseq/model_parallel/modules/transformer_layer.py | https://arxiv.org/pdf/1909.08053.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/optim/lr_scheduler/cosine_lr_scheduler.py|roberta_for_PyTorch/fairseq/optim/lr_scheduler/cosine_lr_scheduler.py | https://arxiv.org/pdf/1608.03983.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/optim/lr_scheduler/triangular_lr_scheduler.py|roberta_for_PyTorch/fairseq/optim/lr_scheduler/triangular_lr_scheduler.py | https://arxiv.org/pdf/1506.01186.pdf | 参考论文地址 | -| 开源代码引入 | https://github.com/facebookresearch/fairseq/blob/main/fairseq/optim/lr_scheduler/tri_stage_lr_scheduler.py|roberta_for_PyTorch/fairseq/optim/lr_scheduler/tri_stage_lr_scheduler.py | https://arxiv.org/pdf/1904.08779.pdf | 参考论文地址 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|---------------------------------------------------------------------------------------------------------------|----------------------------------------------------------------------------------------------------------|-----------| +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/examples/roberta/commonsense_qa/download_cqa_data.sh | 
https://s3.amazonaws.com/commensenseqa/dev_rand_split.jsonl | 模型参数相关配置 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/examples/roberta/commonsense_qa/download_cqa_data.sh | https://s3.amazonaws.com/commensenseqa/train_rand_split.jsonl | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/examples/roberta/commonsense_qa/download_cqa_data.sh | https://s3.amazonaws.com/commensenseqa/test_rand_split_no_answers.jsonl | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/examples/roberta/commonsense_qa/download_cqa_data.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/dict.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/examples/roberta/preprocess_GLUE_tasks.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/vocab.bpe | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/examples/roberta/preprocess_GLUE_tasks.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/encoder.json | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/examples/roberta/preprocess_GLUE_tasks.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/dict.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/examples/roberta/preprocess_RACE.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/vocab.bpe | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/examples/roberta/preprocess_RACE.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/encoder.json | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/examples/roberta/preprocess_RACE.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/dict.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/data/encoders/gpt2_bpe.py | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/encoder.json | 模型参数相关配置 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/data/encoders/gpt2_bpe.py | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/vocab.bpe | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.xsum.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.mnli.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.cnn.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.base.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/fconv.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt17.v2.en-de.fconv-py.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/fconv.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt14.v2.en-fr.fconv-py.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/fconv.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt14.en-de.fconv-py.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/fconv_self_att.py | https://dl.fbaipublicfiles.com/fairseq/models/stories_checkpoint.tar.gz | 模型相关说明 | +| 
ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/fconv_self_att.py | https://dl.fbaipublicfiles.com/fairseq/models/stories_checkpoint.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/fconv_self_att.py | https://dl.fbaipublicfiles.com/fairseq/data/stories_test.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.lightconv.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/iwslt14.de-en.lightconv.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt17.zh-en.lightconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.lightconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.lightconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt14.en-fr.joined-dict.lightconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.dynamicconv.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/iwslt14.de-en.dynamicconv.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt17.zh-en.dynamicconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.dynamicconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt16.en-de.joined-dict.dynamicconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/lightconv.py | https://dl.fbaipublicfiles.com/fairseq/models/dynamicconv/wmt14.en-fr.joined-dict.dynamicconv-glu.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.wsc.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.mnli.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.base.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/roberta/model_camembert.py | 
http://dl.fbaipublicfiles.com/fairseq/models/camembert-large.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-wikipedia-4gb.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-oscar-4gb.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-ccnet-4gb.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base-ccnet.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/roberta/model_camembert.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert-base.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/roberta/model_gottbert.py | https://dl.gottbert.de/fairseq/models/gottbert-base.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/roberta/model_xlmr.py | http://dl.fbaipublicfiles.com/fairseq/models/xlmr/xlmr.xxl.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/roberta/model_xlmr.py | http://dl.fbaipublicfiles.com/fairseq/models/xlmr/xlmr.xl.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/roberta/model_xlmr.py | http://dl.fbaipublicfiles.com/fairseq/models/xlmr.large.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/roberta/model_xlmr.py | http://dl.fbaipublicfiles.com/fairseq/models/xlmr.base.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.ru-en.ensemble.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-de.joined-dict.ensemble.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.ta-en.single.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.iu-en.nh.single.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.iu-en.news.single.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.en-ta.single.tar.gz | 模型相关说明 | +| 
ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.en-iu.nh.single.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt20.en-iu.news.single.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.ru-en.single_model.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-ru.single_model.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-de.joined-dict.single_model.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.de-en.joined-dict.single_model.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt18.en-de.ensemble.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/flores101/pretrained_models/flores101_mm100_615M.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/flores101/pretrained_models/flores101_mm100_175M.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt16.en-de.joined-dict.transformer.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/transformer/transformer_legacy.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt14.en-fr.joined-dict.transformer.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt20.ta.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt20.iu.nh.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt20.iu.news.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt20.en.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/adaptive_lm_wiki103.v2.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/adaptive_lm_gbw_huge.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.ru.tar.bz2 | 模型相关说明 | +| 
ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.en.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.de.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/roberta_for_PyTorch/setup.py | https://download.pytorch.org/whl/cpu/torch-1.7.0%2Bcpu-cp36-cp36m-linux_x86_64.whl | 三方库链接 | \ No newline at end of file diff --git a/PyTorch/contrib/nlp/tinybert/public_address_statement.md b/PyTorch/contrib/nlp/tinybert/public_address_statement.md index bed42b0485b34485a141cf6ae2284c6fa46aec2a..d65b7ef15f19f8aa1ee5f6762ea77e0f3f0fdf6d 100644 --- a/PyTorch/contrib/nlp/tinybert/public_address_statement.md +++ b/PyTorch/contrib/nlp/tinybert/public_address_statement.md @@ -1,32 +1,23 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|---|----------------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------------------------------|--------| -| 开源代码引入 | https://github.com/huawei-noah/Pretrained-Language-Model/TinyBERT/transformer/modeling.py | tinybert/transformer/modeling_for_finetune.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/huawei-noah/Pretrained-Language-Model/TinyBERT/transformer/modeling.py | tinybert/transformer/modeling.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开发引入 | / | tinybert/transformer/modeling_for_finetune.py | https://www.tensorflow.org/install/ | 相关依赖 | -| 开源代码引入 | https://github.com/huawei-noah/Pretrained-Language-Model/TinyBERT/transformer/modeling.py | tinybert/transformer/modeling.py | https://www.github.com/nvidia/apex | 相关依赖 | -| 开源代码引入 | https://github.com/huawei-noah/Pretrained-Language-Model/TinyBERT/transformer/modeling.py | tinybert/transformer/modeling_for_finetune.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/huawei-noah/Pretrained-Language-Model/TinyBERT/transformer/modeling.py | tinybert/transformer/modeling.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased.tar.gz | 预训练模型 | -| 开发引入 | / | tinybert/transformer/tokenization.py | https://en.wikipedia.org/wiki/CJK_Unified_Ideographs_ | 相关说明 | -| 开源代码引入 | https://github.com/huawei-noah/Pretrained-Language-Model/TinyBERT/transformer/modeling.py | tinybert/transformer/modeling_for_finetune.py | https://arxiv.org/abs/1606.08415 | 论文地址 | -| 开源代码引入 | https://github.com/huawei-noah/Pretrained-Language-Model/TinyBERT/transformer/tokenization.py | tinybert/transformer/tokenization.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-cased-vocab.txt | 相关配置 | -| 开源代码引入 | https://github.com/huawei-noah/Pretrained-Language-Model/TinyBERT/transformer/modeling.py | tinybert/transformer/modeling.py | https://arxiv.org/abs/1606.08415 | 论文地址 | -| 开源代码引入 | https://github.com/huawei-noah/Pretrained-Language-Model/TinyBERT/transformer/modeling.py | tinybert/transformer/modeling.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-cased.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/huawei-noah/Pretrained-Language-Model/TinyBERT/transformer/modeling.py | tinybert/transformer/modeling_for_finetune.py | 
https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/huawei-noah/Pretrained-Language-Model/TinyBERT/transformer/modeling.py | tinybert/transformer/modeling_for_finetune.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-uncased.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/huawei-noah/Pretrained-Language-Model/TinyBERT/transformer/modeling.py | tinybert/transformer/modeling.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/huawei-noah/Pretrained-Language-Model/TinyBERT/transformer/file_utils.py | tinybert/transformer/file_utils.py | https://github.com/allenai/allennlp | 源码实现 | -| 开源代码引入 | https://github.com/huawei-noah/Pretrained-Language-Model/TinyBERT/transformer/modeling.py | tinybert/transformer/modeling_for_finetune.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-cased.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/huawei-noah/Pretrained-Language-Model/TinyBERT/transformer/modeling.py | tinybert/transformer/modeling.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/huawei-noah/Pretrained-Language-Model/TinyBERT/transformer/modeling.py | tinybert/transformer/modeling_for_finetune.py | https://www.github.com/nvidia/apex | 相关依赖 | -| 开源代码引入 | https://github.com/huawei-noah/Pretrained-Language-Model/TinyBERT/transformer/tokenization.py | tinybert/transformer/tokenization.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-chinese-vocab.txt | 相关配置 | -| 开源代码引入 | https://github.com/huawei-noah/Pretrained-Language-Model/TinyBERT/transformer/modeling.py | tinybert/transformer/modeling.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-uncased.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/huawei-noah/Pretrained-Language-Model/TinyBERT/transformer/modeling.py | tinybert/transformer/modeling.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/huawei-noah/Pretrained-Language-Model/TinyBERT/transformer/modeling.py | tinybert/transformer/modeling_for_finetune.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased.tar.gz | 预训练模型 | -| 开源代码引入 | https://github.com/huawei-noah/Pretrained-Language-Model/TinyBERT/transformer/tokenization.py | tinybert/transformer/tokenization.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-uncased-vocab.txt | 相关配置 | -| 开源代码引入 | https://github.com/huawei-noah/Pretrained-Language-Model/TinyBERT/transformer/tokenization.py | tinybert/transformer/tokenization.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-vocab.txt | 相关配置 | -| 开源代码引入 | https://github.com/huawei-noah/Pretrained-Language-Model/TinyBERT/transformer/tokenization.py | tinybert/transformer/tokenization.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt | 相关配置 | -| 开源代码引入 | https://github.com/huawei-noah/Pretrained-Language-Model/TinyBERT/transformer/modeling.py | tinybert/transformer/modeling_for_finetune.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开发引入 | / | tinybert/transformer/modeling.py | https://www.tensorflow.org/install/ | 相关依赖 | -| 开源代码引入 | https://github.com/huawei-noah/Pretrained-Language-Model/TinyBERT/transformer/tokenization.py | tinybert/transformer/tokenization.py | 
https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-vocab.txt | 相关配置 | -| 开源代码引入 | https://github.com/huawei-noah/Pretrained-Language-Model/TinyBERT/transformer/tokenization.py | tinybert/transformer/tokenization.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased-vocab.txt | 相关配置 | - +| 文件位置 | 公网地址 | 公网地址用途 | +|------------------------------------------------------------------------------------|----------------------------------------------------------------------------------------------|-------------| +| ModelZoo-PyTorch/PyTorch/contrib/nlp/tinybert/transformer/modeling.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/tinybert/transformer/modeling.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/tinybert/transformer/modeling.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/tinybert/transformer/modeling.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/tinybert/transformer/modeling.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-uncased.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/tinybert/transformer/modeling.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-cased.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/tinybert/transformer/modeling.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/tinybert/transformer/modeling_for_finetune.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/tinybert/transformer/modeling_for_finetune.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/tinybert/transformer/modeling_for_finetune.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/tinybert/transformer/modeling_for_finetune.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/tinybert/transformer/modeling_for_finetune.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-uncased.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/tinybert/transformer/modeling_for_finetune.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-cased.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/tinybert/transformer/modeling_for_finetune.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/tinybert/transformer/tokenization.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-vocab.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/tinybert/transformer/tokenization.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-vocab.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/tinybert/transformer/tokenization.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/tinybert/transformer/tokenization.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-uncased-vocab.txt | 
模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/tinybert/transformer/tokenization.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-cased-vocab.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/tinybert/transformer/tokenization.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-chinese-vocab.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/contrib/nlp/tinybert/transformer/tokenization.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased-vocab.txt | 模型相关说明 | \ No newline at end of file diff --git a/PyTorch/contrib/others/C51/public_address_statement.md b/PyTorch/contrib/others/C51/public_address_statement.md index c981e0377efed857ccf4e6e0b2e37d6918ffd1fa..ee8d178448aad1b19ed318e52a5075a58b44cbe1 100644 --- a/PyTorch/contrib/others/C51/public_address_statement.md +++ b/PyTorch/contrib/others/C51/public_address_statement.md @@ -1,25 +1,3 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|---|----------------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------------------------------|--------| -| 开源代码引入 | https://github.com/ShangtongZhang/DeepRL/setup.py | C51/deep_rl/utils/torch_utils.py | zhangshangtong.cpp@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/ShangtongZhang/DeepRL/setup.py | C51/setup.py | https://github.com/ShangtongZhang/DeepRL | 源码实现 | -| 开源代码引入 | https://github.com/ShangtongZhang/DeepRL/setup.py | C51/examples.py | zhangshangtong.cpp@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/ShangtongZhang/DeepRL/setup.py | C51/deep_rl/network/network_heads.py | zhangshangtong.cpp@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/ShangtongZhang/DeepRL/setup.py | C51/deep_rl/component/envs.py | zhangshangtong.cpp@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/ShangtongZhang/DeepRL/setup.py | C51/deep_rl/component/replay.py | zhangshangtong.cpp@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/ShangtongZhang/DeepRL/setup.py | C51/deep_rl/network/network_bodies.py | zhangshangtong.cpp@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/ShangtongZhang/DeepRL/setup.py | C51/deep_rl/agent/BaseAgent.py | zhangshangtong.cpp@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/ShangtongZhang/DeepRL/setup.py | C51/deep_rl/agent/CategoricalDQN_agent.py | zhangshangtong.cpp@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/ShangtongZhang/DeepRL/setup.py | C51/setup.py | zhangshangtong.cpp@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/ShangtongZhang/DeepRL/setup.py | C51/deep_rl/utils/config.py | zhangshangtong.cpp@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/ShangtongZhang/DeepRL/deep_rl/network/network_utils.py | C51/deep_rl/network/network_utils.py | https://github.com/saj1919/RL-Adventure/blob/master/5.noisy%20dqn.ipynb | 源码实现 | -| 开源代码引入 | https://github.com/ShangtongZhang/DeepRL/setup.py | C51/deep_rl/agent/DQN_agent.py | zhangshangtong.cpp@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/ShangtongZhang/DeepRL/setup.py | C51/deep_rl/utils/normalizer.py | zhangshangtong.cpp@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/ShangtongZhang/DeepRL/setup.py | C51/deep_rl/component/random_process.py | zhangshangtong.cpp@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/ShangtongZhang/DeepRL/setup.py | C51/deep_rl/utils/misc.py | zhangshangtong.cpp@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/ShangtongZhang/DeepRL/deep_rl/component/envs.py | C51/deep_rl/component/envs.py | 
https://github.com/ikostrikov/pytorch-a2c-ppo-acktr/blob/master/envs.py | 源码实现 | -| 开源代码引入 | https://github.com/ShangtongZhang/DeepRL/deep_rl/utils/sum_tree.py | C51/deep_rl/utils/sum_tree.py | https://github.com/rlcode/per/blob/master/SumTree.py | 源码实现 | -| 开源代码引入 | https://github.com/ShangtongZhang/DeepRL/setup.py | C51/deep_rl/utils/schedule.py | zhangshangtong.cpp@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/ShangtongZhang/DeepRL/setup.py | C51/deep_rl/utils/logger.py | zhangshangtong.cpp@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/ShangtongZhang/DeepRL/deep_rl/utils/torch_utils.py | C51/deep_rl/utils/torch_utils.py | https://github.com/pytorch/pytorch/issues/12160 | 相关说明 | -| 开源代码引入 | https://github.com/ShangtongZhang/DeepRL/setup.py | C51/deep_rl/network/network_utils.py | zhangshangtong.cpp@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/ShangtongZhang/DeepRL/deep_rl/utils/torch_utils.py | C51/deep_rl/utils/torch_utils.py | https://discuss.pytorch.org/t/batch-of-diagonal-matrix/13560 | 相关说明 | +| 文件位置 | 公网地址 | 公网地址用途 | +|------------------------------------------------------|------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/others/C51/setup.py | zhangshangtong.cpp@gmail.com | 作者邮箱 | \ No newline at end of file diff --git a/PyTorch/contrib/others/DLRM/public_address_statement.md b/PyTorch/contrib/others/DLRM/public_address_statement.md index c98d9509c788a36e6357082426ac3c3555da9fea..0f6082640ad64aaa2228bdc31d522b5d920024b8 100644 --- a/PyTorch/contrib/others/DLRM/public_address_statement.md +++ b/PyTorch/contrib/others/DLRM/public_address_statement.md @@ -1,14 +1,4 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|---|----------------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------------------------------|--------| -| 开源代码引入 | https://github.com/facebookresearch/dlrm/dlrm_data_pytorch.py | DLRM/dlrm_data_pytorch.py | https://labs.criteo.com/2014/02/kaggle-display-advertising-challenge-dataset | 数据集地址 | -| 开源代码引入 | https://github.com/facebookresearch/dlrm/dlrm_s_pytorch.py | DLRM/dlrm_s_pytorch.py | https://github.com/facebookresearch/dlrm/issues/172 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/dlrm/dlrm_data_pytorch.py | DLRM/dlrm_data_caffe2.py | https://labs.criteo.com/2014/02/kaggle-display-advertising-challenge-dataset | 数据集地址 | -| 开源代码引入 | https://github.com/facebookresearch/dlrm/dlrm_s_pytorch.py | DLRM/dlrm_s_caffe2.py | https://github.com/facebookresearch/dlrm/issues/172 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/dlrm/dlrm_data_pytorch.py | DLRM/dlrm_data_caffe2.py | https://labs.criteo.com/2013/12/download-terabyte-click-logs | 数据集地址 | -| 开源代码引入 | https://github.com/facebookresearch/dlrm/dlrm_s_caffe2.py | DLRM/dlrm_s_caffe2.py | https://github.com/caffe2/tutorials/blob/master/MNIST.ipynb | 源码实现 | -| 开源代码引入 | https://github.com/facebookresearch/dlrm/dlrm_s_caffe2.py | DLRM/dlrm_s_caffe2.py | https://github.com/pytorch/pytorch/issues/9533 | 相关说明 | -| 开源代码引入 | https://github.com/facebookresearch/dlrm/dlrm_data_pytorch.py | DLRM/data_utils.py | https://labs.criteo.com/2014/02/kaggle-display-advertising-challenge-dataset | 数据集地址 | -| 开源代码引入 | https://github.com/facebookresearch/dlrm/tools/visualize.py | DLRM/tools/visualize.py | https://scikit-learn.org/stable/modules/generated/sklearn.manifold.TSNE.html | 数据集地址 | -| 开源代码引入 | 
https://github.com/facebookresearch/dlrm/dlrm_data_pytorch.py | DLRM/data_utils.py | https://labs.criteo.com/2013/12/download-terabyte-click-logs | 数据集地址 | -| 开源代码引入 | https://github.com/facebookresearch/dlrm/dlrm_data_pytorch.py | DLRM/dlrm_data_pytorch.py | https://labs.criteo.com/2013/12/download-terabyte-click-logs | 数据集地址 | -| 开发引入 | / | DLRM/tools/visualize.py | https://umap-learn.readthedocs.io/en/latest/ | 数据集地址 | +| 文件位置 | 公网地址 | 公网地址用途 | +|------------------------------------------------------------|--------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/others/DLRM/data_utils.py | http://images.cocodataset.org/annotations/ | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/contrib/others/DLRM/data_utils.py | https://labs.criteo.com/2013/12/download-terabyte-click-logs | 数据集链接 | \ No newline at end of file diff --git a/PyTorch/contrib/others/DQN/public_address_statement.md b/PyTorch/contrib/others/DQN/public_address_statement.md index 51995dcfe384c9c4fa749ac52b7322755c4226d5..7a285c6044c06797492a84e94bada2cb7aa513d0 100644 --- a/PyTorch/contrib/others/DQN/public_address_statement.md +++ b/PyTorch/contrib/others/DQN/public_address_statement.md @@ -1,30 +1,7 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|------|--------|---------|------------------------|--------| -| 开源代码引入 | https://github.com/ShangtongZhang/DeepRL/setup.py | DQN/deep_rl/component/random_process.py | zhangshangtong.cpp@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/ShangtongZhang/DeepRL/setup.py | DQN/deep_rl/network/network_utils.py | zhangshangtong.cpp@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/ShangtongZhang/DeepRL/setup.py | DQN/deep_rl/utils/normalizer.py | zhangshangtong.cpp@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/ShangtongZhang/DeepRL/setup.py | DQN/deep_rl/utils/schedule.py | zhangshangtong.cpp@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/ShangtongZhang/DeepRL/setup.py | DQN/deep_rl/utils/torch_utils.py | zhangshangtong.cpp@gmail.com | 邮箱地址 | -| 开发引入 | / | DQN/Dockerfile | https://www.roboti.us/download/mujoco200_linux.zip | 下载链接 | -| 开源代码引入 | https://github.com/ShangtongZhang/DeepRL/setup.py | DQN/deep_rl/agent/DQN_agent.py | zhangshangtong.cpp@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/ShangtongZhang/DeepRL/Dockerfile | DQN/Dockerfile | https://github.com/ShangtongZhang/dm_control2gym.git@scalar_fix | 邮箱地址 | -| 开发引入 | / | DQN/Dockerfile | https://conda.anaconda.org/angloyna | 相关说明 | -| 开源代码引入 | https://github.com/ShangtongZhang/DeepRL/deep_rl/utils/torch_utils.py | DQN/deep_rl/utils/torch_utils.py | https://discuss.pytorch.org/t/batch-of-diagonal-matrix/13560 | 相关说明 | -| 开源代码引入 | https://github.com/ShangtongZhang/DeepRL/deep_rl/component/envs.py | DQN/deep_rl/component/envs.py | https://github.com/ikostrikov/pytorch-a2c-ppo-acktr/blob/master/envs.py | 源码实现 | -| 开源代码引入 | https://github.com/ShangtongZhang/DeepRL/setup.py | DQN/deep_rl/network/network_bodies.py | zhangshangtong.cpp@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/ShangtongZhang/DeepRL/deep_rl/network/network_utils.py | DQN/deep_rl/network/network_utils.py | https://github.com/saj1919/RL-Adventure/blob/master/5.noisy%20dqn.ipynb | 源码实现 | -| 开源代码引入 | https://github.com/ShangtongZhang/DeepRL/Dockerfile | DQN/Dockerfile | https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh | 相关依赖 | -| 开源代码引入 | https://github.com/ShangtongZhang/DeepRL/setup.py | DQN/deep_rl/component/envs.py | zhangshangtong.cpp@gmail.com | 邮箱地址 | -| 开源代码引入 | 
https://github.com/ShangtongZhang/DeepRL/deep_rl/utils/sum_tree.py | DQN/deep_rl/utils/sum_tree.py | https://github.com/rlcode/per/blob/master/SumTree.py | 源码实现 | -| 开源代码引入 | https://github.com/ShangtongZhang/DeepRL/setup.py | DQN/setup.py | zhangshangtong.cpp@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/ShangtongZhang/DeepRL/setup.py | DQN/examples.py | zhangshangtong.cpp@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/ShangtongZhang/DeepRL/setup.py | DQN/setup.py | https://github.com/ShangtongZhang/DeepRL | 源码实现 | -| 开源代码引入 | https://github.com/ShangtongZhang/DeepRL/setup.py | DQN/deep_rl/utils/logger.py | zhangshangtong.cpp@gmail.com | 邮箱地址 | -| 开发引入 | / | DQN/Dockerfile | https://www.roboti.us/download/mjpro150_linux.zip | 下载链接 | -| 开源代码引入 | https://github.com/ShangtongZhang/DeepRL/setup.py | DQN/deep_rl/utils/misc.py | zhangshangtong.cpp@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/ShangtongZhang/DeepRL/setup.py | DQN/deep_rl/component/replay.py | zhangshangtong.cpp@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/ShangtongZhang/DeepRL/deep_rl/utils/torch_utils.py | DQN/deep_rl/utils/torch_utils.py | https://github.com/pytorch/pytorch/issues/12160 | 相关说明 | -| 开源代码引入 | https://github.com/ShangtongZhang/DeepRL/setup.py | DQN/deep_rl/agent/BaseAgent.py | zhangshangtong.cpp@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/ShangtongZhang/DeepRL/setup.py | DQN/deep_rl/utils/config.py | zhangshangtong.cpp@gmail.com | 邮箱地址 | -| 开源代码引入 | https://github.com/ShangtongZhang/DeepRL/setup.py | DQN/deep_rl/network/network_heads.py | zhangshangtong.cpp@gmail.com | 邮箱地址 | - +| 文件位置 | 公网地址 | 公网地址用途 | +|--------------------------------------------------------|-----------------------------------------------------------------------|----------------| +| ModelZoo-PyTorch/PyTorch/contrib/others/DQN/Dockerfile | https://repo.anaconda.com/miniconda/Miniconda3-latest-Linux-x86_64.sh | miniconda下载链接 | +| ModelZoo-PyTorch/PyTorch/contrib/others/DQN/Dockerfile | https://www.roboti.us/download/mujoco200_linux.zip | mujoco200下载链接 | +| ModelZoo-PyTorch/PyTorch/contrib/others/DQN/Dockerfile | https://conda.anaconda.org/angloyna nvidia-apex | 三方库连接 | +| ModelZoo-PyTorch/PyTorch/contrib/others/DQN/Dockerfile | https://www.roboti.us/download/mjpro150_linux.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/contrib/others/DQN/setup.py | zhangshangtong.cpp@gmail.com | 作者邮箱 | \ No newline at end of file diff --git a/PyTorch/contrib/others/MarkupLM_ID2781_for_PyTorch/public_address_statement.md b/PyTorch/contrib/others/MarkupLM_ID2781_for_PyTorch/public_address_statement.md index aca63a5757e80fc70315a87622efcb680752c681..ccd977a16c63acfe2dfd406b79ce89e9612b0266 100644 --- a/PyTorch/contrib/others/MarkupLM_ID2781_for_PyTorch/public_address_statement.md +++ b/PyTorch/contrib/others/MarkupLM_ID2781_for_PyTorch/public_address_statement.md @@ -1,21 +1,3 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|------|--------|---------|------------------------|--------| -| 开发引入 | / | MarkupLM_ID2781_for_PyTorch/markuplmft/models/markuplm/configuration_markuplm.py | https://huggingface.co/microsoft/markuplm-base-uncased | 相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm.git/markuplm/markuplmft/models/markuplm/tokenization_markuplm_fast.py | MarkupLM_ID2781_for_PyTorch/markuplmft/models/markuplm/tokenization_markuplm_fast.py | https://huggingface.co/microsoft/markuplm-base/resolve/main/tokenizer.json | 相关配置 | -| 开源代码引入 | https://github.com/microsoft/unilm.git/xtune/src/run_tag.py | 
MarkupLM_ID2781_for_PyTorch/examples/fine_tuning/run_swde/run.py | https://nvidia.github.io/apex/amp.html | 相关依赖 | -| 开源代码引入 | https://github.com/microsoft/unilm.git/markuplm/markuplmft/models/markuplm/tokenization_markuplm_fast.py | MarkupLM_ID2781_for_PyTorch/markuplmft/models/markuplm/tokenization_markuplm_fast.py | https://huggingface.co/microsoft/markuplm-large/resolve/main/merges.txt | 相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm.git/markuplm/markuplmft/models/markuplm/tokenization_markuplm_fast.py | MarkupLM_ID2781_for_PyTorch/markuplmft/models/markuplm/tokenization_markuplm.py | https://huggingface.co/microsoft/markuplm-large/resolve/main/vocab.json | 相关配置 | -| 开源代码引入 | https://github.com/microsoft/unilm.git/markuplm/markuplmft/models/markuplm/tokenization_markuplm_fast.py | MarkupLM_ID2781_for_PyTorch/markuplmft/models/markuplm/tokenization_markuplm_fast.py | https://huggingface.co/microsoft/markuplm-large/resolve/main/vocab.json | 相关配置 | -| 开源代码引入 | https://github.com/microsoft/unilm.git/markuplm/markuplmft/models/markuplm/tokenization_markuplm_fast.py | MarkupLM_ID2781_for_PyTorch/markuplmft/models/markuplm/tokenization_markuplm_fast.py | https://huggingface.co/microsoft/markuplm-large/resolve/main/tokenizer.json | 相关配置 | -| 开源代码引入 | https://github.com/microsoft/unilm.git/markuplm/markuplmft/models/markuplm/tokenization_markuplm_fast.py | MarkupLM_ID2781_for_PyTorch/markuplmft/models/markuplm/tokenization_markuplm.py | https://huggingface.co/microsoft/markuplm-base/resolve/main/vocab.json | 相关配置 | -| 开源代码引入 | https://github.com/microsoft/unilm.git/markuplm/markuplmft/models/markuplm/tokenization_markuplm_fast.py | MarkupLM_ID2781_for_PyTorch/markuplmft/models/markuplm/tokenization_markuplm.py | https://huggingface.co/microsoft/markuplm-base/resolve/main/merges.txt | 相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm.git/markuplm/examples/fine_tuning/run_swde/pack_data.py | MarkupLM_ID2781_for_PyTorch/examples/fine_tuning/run_swde/pack_data.py | http://shortn/_g22KuARPAi | 相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm.git/markuplm/markuplmft/models/markuplm/configuration_markuplm.py | MarkupLM_ID2781_for_PyTorch/markuplmft/models/markuplm/configuration_markuplm.py | https://huggingface.co/microsoft/markuplm-base/resolve/main/config.json | 相关配置 | -| 开源代码引入 | https://github.com/microsoft/unilm.git/markuplm/markuplmft/models/markuplm/configuration_markuplm.py | MarkupLM_ID2781_for_PyTorch/markuplmft/models/markuplm/configuration_markuplm.py | https://huggingface.co/microsoft/markuplm-large/resolve/main/config.json | 相关配置 | -| 开发引入 | / | MarkupLM_ID2781_for_PyTorch/markuplmft/models/markuplm/modeling_markuplm.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm.git/markuplm/markuplmft/models/markuplm/tokenization_markuplm_fast.py | MarkupLM_ID2781_for_PyTorch/markuplmft/models/markuplm/tokenization_markuplm_fast.py | https://huggingface.co/microsoft/markuplm-base/resolve/main/merges.txt | 相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm.git/xtune/src/transformers/modeling_xlnet.py | MarkupLM_ID2781_for_PyTorch/markuplmft/models/markuplm/modeling_markuplm.py | https://github.com/pytorch/pytorch/pull/5617 | 源码实现 | -| 开源代码引入 | https://github.com/microsoft/unilm.git/unilm-v1/src/pytorch_pretrained_bert/modeling.py | MarkupLM_ID2781_for_PyTorch/examples/fine_tuning/run_swde/run.py | https://www.github.com/nvidia/apex | 相关依赖 | -| 开源代码引入 | 
https://github.com/microsoft/unilm.git/markuplm/markuplmft/models/markuplm/tokenization_markuplm_fast.py | MarkupLM_ID2781_for_PyTorch/markuplmft/models/markuplm/tokenization_markuplm.py | https://huggingface.co/microsoft/markuplm-large/resolve/main/merges.txt | 相关说明 | -| 开源代码引入 | https://github.com/microsoft/unilm.git/markuplm/markuplmft/models/markuplm/tokenization_markuplm_fast.py | MarkupLM_ID2781_for_PyTorch/markuplmft/models/markuplm/tokenization_markuplm_fast.py | https://huggingface.co/microsoft/markuplm-base/resolve/main/vocab.json | 相关配置 | - +| 文件位置 | 公网地址 | 公网地址用途 | +|---------------------------------------------------------------------------------------------------------------------|---------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/others/MarkupLM_ID2781_for_PyTorch/markuplmft/models/markuplm/modeling_markuplm.py | https://pytorch.org/docs/stable/nn.html#torch.nn.Module | 模型说明链接 | \ No newline at end of file diff --git a/PyTorch/contrib/others/Vaecf_ID2903_for_PyTorch/public_address_statement.md b/PyTorch/contrib/others/Vaecf_ID2903_for_PyTorch/public_address_statement.md index e2fd67b166d91ddb6a06c5167bf2d0f9c17bd80e..062b24887452d23073411d78db6588167d943222 100644 --- a/PyTorch/contrib/others/Vaecf_ID2903_for_PyTorch/public_address_statement.md +++ b/PyTorch/contrib/others/Vaecf_ID2903_for_PyTorch/public_address_statement.md @@ -1,19 +1,6 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|------|--------|---------|------------------------|--------| -| 开源代码引入 | https://github.com/PreferredAI/cornac.git/cornac/datasets/movielens.py | Vaecf_ID2903_for_PyTorch/cornac/datasets/movielens.py | http://files.grouplens.org/datasets/movielens/ml-1m.zip | 数据集地址 | -| 开源代码引入 | https://github.com/PreferredAI/cornac.git/cornac/datasets/movielens.py | Vaecf_ID2903_for_PyTorch/cornac/datasets/movielens.py | https://static.preferred.ai/cornac/datasets/movielens/ml_plot.zip | 数据集地址 | -| 开源代码引入 | https://github.com/PreferredAI/cornac.git/cornac/data/text.py | Vaecf_ID2903_for_PyTorch/cornac/data/text.py | https://github.com/scikit-learn/scikit-learn/blob/master/sklearn/feature_extraction/text.py#L790 | 源码实现 | -| 开源代码引入 | https://github.com/PreferredAI/cornac.git/cornac/metrics/ranking.py | Vaecf_ID2903_for_PyTorch/cornac/metrics/ranking.py | https://en.wikipedia.org/wiki/Mean_reciprocal_rank | 相关说明 | -| 开源代码引入 | https://github.com/PreferredAI/cornac.git/cornac/datasets/movielens.py | Vaecf_ID2903_for_PyTorch/cornac/datasets/movielens.py | http://files.grouplens.org/datasets/movielens/ml-10m.zip | 数据集地址 | -| 开源代码引入 | https://github.com/PreferredAI/cornac.git/cornac/utils/init_utils.py | Vaecf_ID2903_for_PyTorch/cornac/utils/init_utils.py | http://www.jmlr.org/proceedings/papers/v9/glorot10a/glorot10a.pdf | 论文地址 | -| 开源代码引入 | https://github.com/PreferredAI/cornac.git/cornac/metrics/ranking.py | Vaecf_ID2903_for_PyTorch/cornac/metrics/ranking.py | https://en.wikipedia.org/wiki/Discounted_cumulative_gain | 相关说明 | -| 开源代码引入 | https://github.com/PreferredAI/cornac.git/cornac/utils/common.py | Vaecf_ID2903_for_PyTorch/cornac/utils/common.py | https://github.com/scikit-learn/scikit-learn/blob/1495f69242646d239d89a5713982946b8ffcf9d9/sklearn/preprocessing/data.py#L1553 | 源码实现 | -| 开源代码引入 | https://github.com/PreferredAI/cornac.git/cornac/datasets/movielens.py | Vaecf_ID2903_for_PyTorch/cornac/datasets/movielens.py | https://grouplens.org/datasets/movielens/ | 数据集地址 | -| 开源代码引入 | https://github.com/PreferredAI/cornac.git/cornac/data/text.py | 
Vaecf_ID2903_for_PyTorch/cornac/data/text.py | https://github.com/scikit-learn/scikit-learn/blob/d6d1d63fa6b098c72953a6827aae475f611936ed/sklearn/feature_extraction/text.py#L1451 | 源码实现 | -| 开源代码引入 | https://github.com/PreferredAI/cornac.git/examples/propensity_stratified_evaluation_example.py | Vaecf_ID2903_for_PyTorch/cornac/eval_methods/propensity_stratified_evaluation.py | https://arxiv.org/abs/2104.08912 | 论文地址 | -| 开源代码引入 | https://github.com/PreferredAI/cornac.git/cornac/metrics/ranking.py | Vaecf_ID2903_for_PyTorch/cornac/metrics/ranking.py | https://arxiv.org/ftp/arxiv/papers/1205/1205.2618.pdf | 论文地址 | -| 开源代码引入 | https://github.com/PreferredAI/cornac.git/cornac/datasets/movielens.py | Vaecf_ID2903_for_PyTorch/cornac/datasets/movielens.py | http://files.grouplens.org/datasets/movielens/ml-100k/u.data | 数据集地址 | -| 开源代码引入 | https://github.com/PreferredAI/cornac.git/cornac/utils/common.py | Vaecf_ID2903_for_PyTorch/cornac/utils/common.py | https://github.com/scikit-learn/scikit-learn/blob/master/sklearn/utils/__init__.py | 源码实现 | -| 开发引入 | / | Vaecf_ID2903_for_PyTorch/cornac/metrics/ranking.py | https://en.wikipedia.org/wiki/Evaluation_measures_ | 相关说明 | -| 开源代码引入 | https://github.com/PreferredAI/cornac.git/cornac/datasets/movielens.py | Vaecf_ID2903_for_PyTorch/cornac/datasets/movielens.py | http://dm.postech.ac.kr/~cartopy/ConvMF/ | 相关说明 | -| 开源代码引入 | https://github.com/PreferredAI/cornac.git/cornac/datasets/movielens.py | Vaecf_ID2903_for_PyTorch/cornac/datasets/movielens.py | http://files.grouplens.org/datasets/movielens/ml-20m.zip | 数据集地址 | +| 文件位置 | 公网地址 | 公网地址用途 | +|-----------------------------------------------------------------------------------------------|----------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/contrib/others/Vaecf_ID2903_for_PyTorch/cornac/datasets/movielens.py | http://files.grouplens.org/datasets/movielens/ml-20m.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/contrib/others/Vaecf_ID2903_for_PyTorch/cornac/datasets/movielens.py | http://files.grouplens.org/datasets/movielens/ml-1m.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/contrib/others/Vaecf_ID2903_for_PyTorch/cornac/datasets/movielens.py | http://files.grouplens.org/datasets/movielens/ml-10m.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/contrib/others/Vaecf_ID2903_for_PyTorch/cornac/datasets/movielens.py | http://cocodataset.org/ | 数据集链接 | \ No newline at end of file diff --git a/PyTorch/contrib/others/movielens_sequence_ID2897_for_PyTorch/public_address_statement.md b/PyTorch/contrib/others/movielens_sequence_ID2897_for_PyTorch/public_address_statement.md index cf861da518889699cd78d1ef0c919af7148a5f18..ebfbb3d9a9b6474fdb182dad24b9983386f7390f 100644 --- a/PyTorch/contrib/others/movielens_sequence_ID2897_for_PyTorch/public_address_statement.md +++ b/PyTorch/contrib/others/movielens_sequence_ID2897_for_PyTorch/public_address_statement.md @@ -1,31 +1,5 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ---- | ------------ | ------ | ------------------------------------ | -------- | -| 开发引入 | / | movielens_sequence_ID2897_for_PyTorch/examples/movielens_sequence/spotlight/datasets/amazon.py | https://github.com/maciejkula/recommender_dataset | 源码实现 | -| 开发引入 | / | movielens_sequence_ID2897_for_PyTorch/examples/movielens_sequence/spotlight/datasets/goodbooks.py | https://github.com/zygmuntz/goodbooks-1 | 源码实现 | -| 开发引入 | / | movielens_sequence_ID2897_for_PyTorch/examples/movielens_sequence/spotlight/datasets/amazon.py | 
https://snap.stanford.edu/data/amazon-meta.ht | 模型相关说明 | -| 开发引入 | / | movielens_sequence_ID2897_for_PyTorch/examples/movielens_sequence/spotlight/datasets/goodbooks.py | https://github.com/zygmuntz/goodbooks-10 | 源码实现 | -| 开发引入 | / | movielens_sequence_ID2897_for_PyTorch/examples/movielens_sequence/spotlight/datasets/goodbooks.py | https://github.com/zygmuntz/goodbooks-1 | 源码实现 | -| 开发引入 | / | movielens_sequence_ID2897_for_PyTorch/examples/movielens_sequence/spotlight/datasets/movielens.py | https://grouplens.org/datasets/movielen | 模型相关说明 | -| 开发引入 | / | movielens_sequence_ID2897_for_PyTorch/examples/movielens_sequence/spotlight/datasets/movielens.py | https://github.com/maciejkula/recommender_dataset | 源码实现 | -| 开发引入 | / | movielens_sequence_ID2897_for_PyTorch/examples/movielens_sequence/spotlight/sequence/representations.py | https://github.com/maciejkula/mixtu | 源码实现 | -| 开发引入 | / | movielens_sequence_ID2897_for_PyTorch/.travis/docs.sh | git@github.c | 开发者邮箱配置 | -| 开发引入 | / | movielens_sequence_ID2897_for_PyTorch/.travis/install.sh | https://repo.continuum.io/miniconda/Miniconda3-latest-MacOSX-x86_64. | 模型相关说明 | -| 开发引入 | / | movielens_sequence_ID2897_for_PyTorch/.travis/install.sh | https://repo.continuum.io/miniconda/Miniconda3-latest-Linux-x86_64. | 模型相关说明 | -| 开发引入 | / | movielens_sequence_ID2897_for_PyTorch/docs/Makefile | http://sphinx-doc.or | 模型相关说明 | -| 开发引入 | / | movielens_sequence_ID2897_for_PyTorch/spotlight/datasets/amazon.py | https://github.com/maciejkula/recommender_dataset | 源码实现 | -| 开发引入 | / | movielens_sequence_ID2897_for_PyTorch/spotlight/datasets/amazon.py | https://snap.stanford.edu/data/amazon-meta.ht | 模型相关说明 | -| 开发引入 | / | movielens_sequence_ID2897_for_PyTorch/spotlight/datasets/goodbooks.py | https://github.com/zygmuntz/goodbooks-1 | 源码实现 | -| 开发引入 | / | movielens_sequence_ID2897_for_PyTorch/spotlight/datasets/goodbooks.py | https://github.com/zygmuntz/goodbooks-10 | 源码实现 | -| 开发引入 | / | movielens_sequence_ID2897_for_PyTorch/spotlight/datasets/goodbooks.py | https://github.com/zygmuntz/goodbooks-1 | 源码实现 | -| 开发引入 | / | movielens_sequence_ID2897_for_PyTorch/spotlight/datasets/movielens.py | https://grouplens.org/datasets/movielen | 模型相关说明 | -| 开发引入 | / | movielens_sequence_ID2897_for_PyTorch/spotlight/datasets/movielens.py | https://github.com/maciejkula/recommender_dataset | 源码实现 | -| 开发引入 | / | movielens_sequence_ID2897_for_PyTorch/spotlight/sequence/representations.py | https://github.com/maciejkula/mixtu | 源码实现 | -| 开发引入 | / | movielens_sequence_ID2897_for_PyTorch/docs/_static/img/spotlight.svg | http://www.inkscape.or | 模型相关说明 | -| 开发引入 | / | movielens_sequence_ID2897_for_PyTorch/docs/_static/img/spotlight.svg | http://purl.org/dc/elements/1. 
| 模型相关说明 | -| 开发引入 | / | movielens_sequence_ID2897_for_PyTorch/docs/_static/img/spotlight.svg | http://creativecommons.org/n | 模型相关说明 | -| 开发引入 | / | movielens_sequence_ID2897_for_PyTorch/docs/_static/img/spotlight.svg | http://www.w3.org/1999/02/22-rdf-syntax-n | 模型相关说明 | -| 开发引入 | / | movielens_sequence_ID2897_for_PyTorch/docs/_static/img/spotlight.svg | http://www.w3.org/2000/s | 模型相关说明 | -| 开发引入 | / | movielens_sequence_ID2897_for_PyTorch/docs/_static/img/spotlight.svg | http://www.w3.org/2000/s | 模型相关说明 | -| 开发引入 | / | movielens_sequence_ID2897_for_PyTorch/docs/_static/img/spotlight.svg | http://sodipodi.sourceforge.net/DTD/sodipodi-0.d | 模型相关说明 | -| 开发引入 | / | movielens_sequence_ID2897_for_PyTorch/docs/_static/img/spotlight.svg | http://www.inkscape.org/namespaces/inksca | 模型相关说明 | -| 开发引入 | / | movielens_sequence_ID2897_for_PyTorch/docs/_static/img/spotlight.svg | http://purl.org/dc/dcmitype/StillIma | 模型相关说明 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|--------------------------------------------------------------------------------------------------|------------------------------------------------------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/contrib/others/movielens_sequence_ID2897_for_PyTorch/.travis.yml | travis@travis.com | 作者邮箱 | +| ModelZoo-PyTorch/PyTorch/contrib/others/movielens_sequence_ID2897_for_PyTorch/.travis/install.sh | https://repo.continuum.io/miniconda/Miniconda3-latest-MacOSX-x86_64.sh | miniconda链接 | +| ModelZoo-PyTorch/PyTorch/contrib/others/movielens_sequence_ID2897_for_PyTorch/.travis/install.sh | https://repo.continuum.io/miniconda/Miniconda3-latest-Linux-x86_64.sh | miniconda链接 | \ No newline at end of file diff --git a/PyTorch/dev/audio/tacotron2_ID0406_for_PyTorch/public_address_statement.md b/PyTorch/dev/audio/tacotron2_ID0406_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..2f0df47c8d986d101993cfb8739266ba752800d6 --- /dev/null +++ b/PyTorch/dev/audio/tacotron2_ID0406_for_PyTorch/public_address_statement.md @@ -0,0 +1,3 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|--------------------------------------------------------------------------------------------|--------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/dev/audio/tacotron2_ID0406_for_PyTorch/scripts/prepare_dataset.sh | http://data.keithito.com/data/speech/$BZ2ARCHIVE | 数据集链接 | \ No newline at end of file diff --git a/PyTorch/dev/cv/detection/3D_attentionnet_ID0478_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/detection/3D_attentionnet_ID0478_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..821caf7327b8738a2deab7222ab147c21df9df25 --- /dev/null +++ b/PyTorch/dev/cv/detection/3D_attentionnet_ID0478_for_PyTorch/public_address_statement.md @@ -0,0 +1,3 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|---------------------------------------------------------------------------------------|--------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/3D_attentionnet_ID0478_for_PyTorch/train.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | \ No newline at end of file diff --git a/PyTorch/dev/cv/detection/ChangeNet_ID3663_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/detection/ChangeNet_ID3663_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..b771e07ffe29a3630b2af7cfc70a24711d663997 --- /dev/null +++ 
b/PyTorch/dev/cv/detection/ChangeNet_ID3663_for_PyTorch/public_address_statement.md @@ -0,0 +1,9 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|-------------------------------------------------------------------------------------------|-------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/ChangeNet_ID3663_for_PyTorch/utils_resnet_TL.py | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 模型权重 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/ChangeNet_ID3663_for_PyTorch/utils_resnet_TL.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/ChangeNet_ID3663_for_PyTorch/utils_resnet_TL.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/ChangeNet_ID3663_for_PyTorch/utils_resnet_TL.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/ChangeNet_ID3663_for_PyTorch/utils_resnet_TL.py | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/ChangeNet_ID3663_for_PyTorch/utils_resnet_TL.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/ChangeNet_ID3663_for_PyTorch/utils_resnet_TL.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/dev/cv/detection/DeepLabV2_ID1871_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/detection/DeepLabV2_ID1871_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..88d864a7612daea3f808c79c4054b3c2667dfc74 --- /dev/null +++ b/PyTorch/dev/cv/detection/DeepLabV2_ID1871_for_PyTorch/public_address_statement.md @@ -0,0 +1,8 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|-------------------------------------------------------------------------------------------------------|-----------------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/DeepLabV2_ID1871_for_PyTorch/scripts/setup_caffemodels.sh | http://liangchiehchen.com/projects/released/deeplab_aspp_resnet101/prototxt_and_model.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/DeepLabV2_ID1871_for_PyTorch/scripts/setup_cocostuff10k.sh | http://calvin.inf.ed.ac.uk/wp-content/uploads/data/cocostuffdataset/stuffthingmaps_trainval2017.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/DeepLabV2_ID1871_for_PyTorch/scripts/setup_cocostuff164k.sh | http://images.cocodataset.org/zips/val2017.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/DeepLabV2_ID1871_for_PyTorch/scripts/setup_cocostuff164k.sh | http://images.cocodataset.org/zips/train2017.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/DeepLabV2_ID1871_for_PyTorch/scripts/setup_cocostuff164k.sh | http://calvin.inf.ed.ac.uk/wp-content/uploads/data/cocostuffdataset/stuffthingmaps_trainval2017.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/DeepLabV2_ID1871_for_PyTorch/scripts/setup_voc12.sh | http://host.robots.ox.ac.uk/pascal/VOC/voc2012/VOCtrainval_11-May-2012.tar | 数据集链接 | \ No newline at end of file diff --git a/PyTorch/dev/cv/detection/FasterRCNN-Resnet50-FPN_ID1552_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/detection/FasterRCNN-Resnet50-FPN_ID1552_for_PyTorch/public_address_statement.md new file mode 100644 index 
0000000000000000000000000000000000000000..276a738b55e2043055b1524cc84cf0f7591ba239 --- /dev/null +++ b/PyTorch/dev/cv/detection/FasterRCNN-Resnet50-FPN_ID1552_for_PyTorch/public_address_statement.md @@ -0,0 +1,14 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|-------------------------------------------------------------------------------------------------------------------------------|---------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/FasterRCNN-Resnet50-FPN_ID1552_for_PyTorch/datasets/prepare_for_tests.sh | https://dl.fbaipublicfiles.com/detectron2 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/FasterRCNN-Resnet50-FPN_ID1552_for_PyTorch/datasets/prepare_panoptic_fpn.py | https://dl.fbaipublicfiles.com/detectron2/ | 下载权重文件 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/FasterRCNN-Resnet50-FPN_ID1552_for_PyTorch/detectron2/engine/defaults.py | https://pytorch.org/docs/stable/distributed.html for details | 说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/FasterRCNN-Resnet50-FPN_ID1552_for_PyTorch/detectron2/evaluation/coco_evaluation.py | http://cocodataset.org/#keypoints-eval | 设置说明 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/FasterRCNN-Resnet50-FPN_ID1552_for_PyTorch/detectron2/model_zoo/model_zoo.py | https://dl.fbaipublicfiles.com/detectron2/ | 下载权重文件 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/FasterRCNN-Resnet50-FPN_ID1552_for_PyTorch/dev/packaging/build_wheel.sh | https://download.pytorch.org/whl/"$CU_VERSION"/torch_stable.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/FasterRCNN-Resnet50-FPN_ID1552_for_PyTorch/dev/packaging/gen_install_table.py | https://dl.fbaipublicfiles.com/detectron2/wheels/{cuda}/torch{torch}/index.html | 下载权重文件 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/FasterRCNN-Resnet50-FPN_ID1552_for_PyTorch/docker/Dockerfile | https://bootstrap.pypa.io/get-pip.py | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/FasterRCNN-Resnet50-FPN_ID1552_for_PyTorch/docker/Dockerfile | https://download.pytorch.org/whl/cu101/torch_stable.html | 三方库下载 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/FasterRCNN-Resnet50-FPN_ID1552_for_PyTorch/docker/Dockerfile-circleci | https://bootstrap.pypa.io/get-pip.py | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/FasterRCNN-Resnet50-FPN_ID1552_for_PyTorch/docker/Dockerfile-circleci | https://download.pytorch.org/whl/cu101/torch_stable.html | 三方库下载 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/FasterRCNN-Resnet50-FPN_ID1552_for_PyTorch/tools/convert-torchvision-to-d2.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/dev/cv/detection/MobileNetV3-SSD_ID0408_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/detection/MobileNetV3-SSD_ID0408_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..a535d0c144ef5f1813f29e4b1af8cba4d41d4ddb --- /dev/null +++ b/PyTorch/dev/cv/detection/MobileNetV3-SSD_ID0408_for_PyTorch/public_address_statement.md @@ -0,0 +1,7 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|--------------------------------------------------------------------------------------------------------|------------------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/MobileNetV3-SSD_ID0408_for_PyTorch/open_images_downloader.py | https://storage.googleapis.com/openimages/2018_04/{dataset_type}/{dataset_type}-annotations-bbox.csv | 
数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/MobileNetV3-SSD_ID0408_for_PyTorch/open_images_downloader.py | https://storage.googleapis.com/openimages/2018_04/class-descriptions-boxable.csv | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/MobileNetV3-SSD_ID0408_for_PyTorch/vision/nn/alexnet.py | https://download.pytorch.org/models/alexnet-owt-4df8aa71.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/MobileNetV3-SSD_ID0408_for_PyTorch/vision/nn/squeezenet.py | https://download.pytorch.org/models/squeezenet1_1-f364aa15.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/MobileNetV3-SSD_ID0408_for_PyTorch/vision/nn/squeezenet.py | https://download.pytorch.org/models/squeezenet1_0-a815701f.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/dev/cv/detection/PointNet_ID0430_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/detection/PointNet_ID0430_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..c7d399a4a77b7e2858a0846c488b7da951a72d21 --- /dev/null +++ b/PyTorch/dev/cv/detection/PointNet_ID0430_for_PyTorch/public_address_statement.md @@ -0,0 +1,3 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|-------------------------------------------------------------------------------------------|---------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/PointNet_ID0430_for_PyTorch/scripts/download.sh | https://shapenet.cs.stanford.edu/ericyi/shapenetcore_partanno_segmentation_benchmark_v0.zip | 数据集链接 | \ No newline at end of file diff --git a/PyTorch/dev/cv/detection/Retinaface_ID0328_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/detection/Retinaface_ID0328_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..e9ec782ab1be23baa897a769f3fde4b38d462ff7 --- /dev/null +++ b/PyTorch/dev/cv/detection/Retinaface_ID0328_for_PyTorch/public_address_statement.md @@ -0,0 +1,11 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|--------------------------------------------------------------------------------------------|-------------------------------------------------------------------|----------| +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/Retinaface_ID0328_for_PyTorch/models/resnet50.py | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 模型权重 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/Retinaface_ID0328_for_PyTorch/models/resnet50.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/Retinaface_ID0328_for_PyTorch/models/resnet50.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/Retinaface_ID0328_for_PyTorch/models/resnet50.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/Retinaface_ID0328_for_PyTorch/models/resnet50.py | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/Retinaface_ID0328_for_PyTorch/models/resnet50.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/Retinaface_ID0328_for_PyTorch/models/resnet50.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/Retinaface_ID0328_for_PyTorch/models/resnet50.py | https://download.pytorch.org/models/wide_resnet50_2-95faca4d.pth | 
下载预训练模型 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/Retinaface_ID0328_for_PyTorch/models/resnet50.py | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 下载预训练模型 | \ No newline at end of file diff --git a/PyTorch/dev/cv/detection/YOLOX_Dynamic_ID4069_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/detection/YOLOX_Dynamic_ID4069_for_PyTorch/public_address_statement.md index 1ff1207aaa5b844c8b7b2e54a3301b0ec13b81eb..3506b5f10aef8116e7d35b737afd5054b495d9a7 100644 --- a/PyTorch/dev/cv/detection/YOLOX_Dynamic_ID4069_for_PyTorch/public_address_statement.md +++ b/PyTorch/dev/cv/detection/YOLOX_Dynamic_ID4069_for_PyTorch/public_address_statement.md @@ -1,5 +1,3 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|----------------------------------------------------------------------------------------------------------------------------------|----------------------------------------------------|-----------------------|--------| -| 开发引入 | / | YOLOX_Dynamic_ID4069_for_PyTorch/1.5_requirements.txt | https://github.com/ppwwyyxx/cocoapi | 相关依赖 | -| 开发引入 | / | YOLOX_Dynamic_ID4069_for_PyTorch/1.8_requirements.txt | https://github.com/ppwwyyxx/cocoapi | 相关依赖 | -| 开发引入 | / | YOLOX_Dynamic_ID4069_for_PyTorch/docs/requirements-doc.txt | https://github.com/sphinx-doc/sphinx/commit/7acd3ada3f38076af7b2b5c9f3b60bb9c2587a3d | 相关依赖 | +| 文件位置 | 公网地址 | 公网地址用途 | +|-------------------------------------------------------------------------------------|------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_Dynamic_ID4069_for_PyTorch/setup.py | https://yolox.readthedocs.io | 相关说明 | \ No newline at end of file diff --git a/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/public_address_statement.md index 9928c6b761ce5933d3f6fc2b765ab88648d1a23a..9d0b8d898a07207467069877874572c03ef7696a 100644 --- a/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/public_address_statement.md +++ b/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/public_address_statement.md @@ -1,224 +1,610 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|-------------------------------------------------------|------------------------------------------------------------------|----------------------------------------------------------------------------------------------------------|---------| -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/resnest/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/resnest/faster_rcnn_s50_fpn_syncbn-backbone%2Bhead_mstrain-range_1x_coco/faster_rcnn_s50_fpn_syncbn-backbone%2Bhead_mstrain-range_1x_coco_20200926_125502-20289c16.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/resnest/metafile.yml | https://arxiv.org/abs/2004.08955 | 参考文献 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/resnest/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.7.0/mmdet/models/backbones/resnest.py | 下载源码 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/resnest/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/resnest/faster_rcnn_s101_fpn_syncbn-backbone%2Bhead_mstrain-range_1x_coco/faster_rcnn_s101_fpn_syncbn-backbone%2Bhead_mstrain-range_1x_coco_20201006_021058-421517f1.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/resnest/metafile.yml | https://arxiv.org/abs/2004.08955 | 参考文献 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/resnest/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/resnest/mask_rcnn_s50_fpn_syncbn-backbone%2Bhead_mstrain_1x_coco/mask_rcnn_s50_fpn_syncbn-backbone%2Bhead_mstrain_1x_coco_20200926_125503-8a2c3d47.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/resnest/metafile.yml | https://arxiv.org/abs/2004.08955 | 参考文献 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/resnest/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.7.0/mmdet/models/backbones/resnest.py | 下载源码 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/resnest/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/resnest/mask_rcnn_s101_fpn_syncbn-backbone%2Bhead_mstrain_1x_coco/mask_rcnn_s101_fpn_syncbn-backbone%2Bhead_mstrain_1x_coco_20201005_215831-af60cdf9.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/resnest/metafile.yml | https://arxiv.org/abs/2004.08955 | 参考文献 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/resnest/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.7.0/mmdet/models/backbones/resnest.py | 下载源码 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/resnest/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/resnest/cascade_rcnn_s50_fpn_syncbn-backbone%2Bhead_mstrain-range_1x_coco/cascade_rcnn_s50_fpn_syncbn-backbone%2Bhead_mstrain-range_1x_coco_20201122_213640-763cc7b5.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/resnest/metafile.yml | https://arxiv.org/abs/2004.08955 | 参考文献 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/resnest/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.7.0/mmdet/models/backbones/resnest.py | 下载源码 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/resnest/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/resnest/cascade_rcnn_s101_fpn_syncbn-backbone%2Bhead_mstrain-range_1x_coco/cascade_rcnn_s101_fpn_syncbn-backbone%2Bhead_mstrain-range_1x_coco_20201005_113242-b9459f8f.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/resnest/metafile.yml | https://arxiv.org/abs/2004.08955 | 参考文献 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/resnest/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.7.0/mmdet/models/backbones/resnest.py | 下载源码 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/resnest/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/resnest/cascade_mask_rcnn_s50_fpn_syncbn-backbone%2Bhead_mstrain_1x_coco/cascade_mask_rcnn_s50_fpn_syncbn-backbone%2Bhead_mstrain_1x_coco_20201122_104428-99eca4c7.pth | 下载源码 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/resnest/metafile.yml | https://arxiv.org/abs/2004.08955 | 参考文献 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/resnest/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.7.0/mmdet/models/backbones/resnest.py | 下载源码| -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/resnest/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/resnest/cascade_mask_rcnn_s101_fpn_syncbn-backbone%2Bhead_mstrain_1x_coco/cascade_mask_rcnn_s101_fpn_syncbn-backbone%2Bhead_mstrain_1x_coco_20201005_113243-42607475.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/resnest/metafile.yml | https://arxiv.org/abs/2004.08955 | 参考文献 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/resnest/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.7.0/mmdet/models/backbones/resnest.py | 下载源码 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/cascade_rcnn/metafile.yml | http://dx.doi.org/10.1109/tpami.2019.2956516 | 参考文献 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/detectors/cascade_rcnn.py | 下载源码 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/cascade_rcnn/metafile.yml | http://dx.doi.org/10.1109/tpami.2019.2956516 | 参考文献 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/detectors/cascade_rcnn.py | 下载源码 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_rcnn_r50_caffe_fpn_1x_coco/cascade_rcnn_r50_caffe_fpn_1x_coco_bbox_mAP-0.404_20200504_174853-b857be87.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_rcnn_r50_fpn_1x_coco/cascade_rcnn_r50_fpn_1x_coco_20200316-3dc56deb.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_rcnn_r50_fpn_20e_coco/cascade_rcnn_r50_fpn_20e_coco_bbox_mAP-0.41_20200504_175131-e9872a90.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/cascade_rcnn/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_rcnn_r101_caffe_fpn_1x_coco/cascade_rcnn_r101_caffe_fpn_1x_coco_bbox_mAP-0.423_20200504_175649-cab8dbd5.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_rcnn_r101_fpn_1x_coco/cascade_rcnn_r101_fpn_1x_coco_20200317-0b6a2fbf.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_rcnn_r101_fpn_20e_coco/cascade_rcnn_r101_fpn_20e_coco_bbox_mAP-0.425_20200504_231812-5057dcc5.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_rcnn_x101_32x4d_fpn_1x_coco/cascade_rcnn_x101_32x4d_fpn_1x_coco_20200316-95c2deb6.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_rcnn_x101_32x4d_fpn_20e_coco/cascade_rcnn_x101_32x4d_fpn_20e_coco_20200906_134608-9ae0a720.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_rcnn_x101_64x4d_fpn_1x_coco/cascade_rcnn_x101_64x4d_fpn_1x_coco_20200515_075702-43ce6a30.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_rcnn_x101_64x4d_fpn_20e_coco/cascade_rcnn_x101_64x4d_fpn_20e_coco_20200509_224357-051557b1.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_r50_caffe_fpn_1x_coco/cascade_mask_rcnn_r50_caffe_fpn_1x_coco_bbox_mAP-0.412__segm_mAP-0.36_20200504_174659-5004b251.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_r50_fpn_1x_coco/cascade_mask_rcnn_r50_fpn_1x_coco_20200203-9d4dcb24.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_r50_fpn_20e_coco/cascade_mask_rcnn_r50_fpn_20e_coco_bbox_mAP-0.419__segm_mAP-0.365_20200504_174711-4af8e66e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_r101_caffe_fpn_1x_coco/cascade_mask_rcnn_r101_caffe_fpn_1x_coco_bbox_mAP-0.432__segm_mAP-0.376_20200504_174813-5c1e9599.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | 
YOLOX_ID2833_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_r101_fpn_1x_coco/cascade_mask_rcnn_r101_fpn_1x_coco_20200203-befdf6ee.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_r101_fpn_20e_coco/cascade_mask_rcnn_r101_fpn_20e_coco_bbox_mAP-0.434__segm_mAP-0.378_20200504_174836-005947da.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_x101_32x4d_fpn_1x_coco/cascade_mask_rcnn_x101_32x4d_fpn_1x_coco_20200201-0f411b1f.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_x101_32x4d_fpn_20e_coco/cascade_mask_rcnn_x101_32x4d_fpn_20e_coco_20200528_083917-ed1f4751.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_x101_64x4d_fpn_1x_coco/cascade_mask_rcnn_x101_64x4d_fpn_1x_coco_20200203-9a2db89d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_x101_64x4d_fpn_20e_coco/cascade_mask_rcnn_x101_64x4d_fpn_20e_coco_20200512_161033-bdb5126a.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_x101_64x4d_fpn_20e_coco/cascade_mask_rcnn_x101_64x4d_fpn_20e_coco_20200512_161033-bdb5126a.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_r50_caffe_fpn_mstrain_3x_coco/cascade_mask_rcnn_r50_caffe_fpn_mstrain_3x_coco_20210707_002651-6e29b3a6.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_r50_fpn_mstrain_3x_coco/cascade_mask_rcnn_r50_fpn_mstrain_3x_coco_20210628_164719-5bdc3824.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_r101_caffe_fpn_mstrain_3x_coco/cascade_mask_rcnn_r101_caffe_fpn_mstrain_3x_coco_20210707_002620-a5bd2389.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_r101_fpn_mstrain_3x_coco/cascade_mask_rcnn_r101_fpn_mstrain_3x_coco_20210628_165236-51a2d363.pth | 下载预训练模型 | -| 开源代码引入 
| https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_x101_32x4d_fpn_mstrain_3x_coco/cascade_mask_rcnn_x101_32x4d_fpn_mstrain_3x_coco_20210706_225234-40773067.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_x101_32x8d_fpn_mstrain_3x_coco/cascade_mask_rcnn_x101_32x8d_fpn_mstrain_3x_coco_20210719_180640-9ff7e76f.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_x101_64x4d_fpn_mstrain_3x_coco/cascade_mask_rcnn_x101_64x4d_fpn_mstrain_3x_coco_20210719_210311-d3e64ba0.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/metafile.yml | https://arxiv.org/abs/1506.01497 | 参考文献 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/detectors/faster_rcnn.py | 下载源码 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_caffe_c4_1x_coco/faster_rcnn_r50_caffe_c4_1x_coco_20220316_150152-3f885b85.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_caffe_c4_mstrain_1x_coco/faster_rcnn_r50_caffe_c4_mstrain_1x_coco_20220316_150527-db276fed.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_caffe_dc5_1x_coco/faster_rcnn_r50_caffe_dc5_1x_coco_20201030_151909-531f0f43.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_caffe_fpn_1x_coco/faster_rcnn_r50_caffe_fpn_1x_coco_bbox_mAP-0.378_20200504_180032-c5925ee5.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_fpn_1x_coco/faster_rcnn_r50_fpn_1x_coco_20200130-047c8118.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fp16/faster_rcnn_r50_fpn_fp16_1x_coco/faster_rcnn_r50_fpn_fp16_1x_coco_20200204-d4dc1471.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_fpn_2x_coco/faster_rcnn_r50_fpn_2x_coco_bbox_mAP-0.384_20200504_210434-a5d8aa15.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r101_caffe_fpn_1x_coco/faster_rcnn_r101_caffe_fpn_1x_coco_bbox_mAP-0.398_20200504_180057-b269e9dd.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r101_fpn_1x_coco/faster_rcnn_r101_fpn_1x_coco_20200130-f513f705.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r101_fpn_2x_coco/faster_rcnn_r101_fpn_2x_coco_bbox_mAP-0.398_20200504_210455-1d2dac9c.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_x101_32x4d_fpn_1x_coco/faster_rcnn_x101_32x4d_fpn_1x_coco_20200203-cff10310.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_x101_32x4d_fpn_2x_coco/faster_rcnn_x101_32x4d_fpn_2x_coco_bbox_mAP-0.412_20200506_041400-64a12c0b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_x101_64x4d_fpn_1x_coco/faster_rcnn_x101_64x4d_fpn_1x_coco_20200204-833ee192.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_x101_64x4d_fpn_2x_coco/faster_rcnn_x101_64x4d_fpn_2x_coco_20200512_161033-5961fa95.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_fpn_iou_1x_coco/faster_rcnn_r50_fpn_iou_1x_coco_20200506_095954-938e81f0.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_fpn_1x_coco/faster_rcnn_r50_fpn_giou_1x_coco-0eada910.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_fpn_1x_coco/faster_rcnn_r50_fpn_bounded_iou_1x_coco-98ad993b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_caffe_dc5_mstrain_1x_coco/faster_rcnn_r50_caffe_dc5_mstrain_1x_coco_20201028_233851-b33d21b9.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_caffe_dc5_mstrain_3x_coco/faster_rcnn_r50_caffe_dc5_mstrain_3x_coco_20201028_002107-34a53b2c.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_caffe_fpn_mstrain_2x_coco/faster_rcnn_r50_caffe_fpn_mstrain_2x_coco_bbox_mAP-0.397_20200504_231813-10b2de58.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_caffe_fpn_mstrain_3x_coco/faster_rcnn_r50_caffe_fpn_mstrain_3x_coco_20210526_095054-1f77628b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_caffe_fpn_mstrain_3x_coco/faster_rcnn_r50_caffe_fpn_mstrain_3x_coco_20210526_095054-1f77628b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r101_caffe_fpn_mstrain_3x_coco/faster_rcnn_r101_caffe_fpn_mstrain_3x_coco_20210526_095742-a7ae426d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r101_fpn_mstrain_3x_coco/faster_rcnn_r101_fpn_mstrain_3x_coco_20210524_110822-4d4d2ca8.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_x101_32x4d_fpn_mstrain_3x_coco/faster_rcnn_x101_32x4d_fpn_mstrain_3x_coco_20210524_124151-16b9b260.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_x101_32x8d_fpn_mstrain_3x_coco/faster_rcnn_x101_32x8d_fpn_mstrain_3x_coco_20210604_182954-002e082a.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_x101_64x4d_fpn_mstrain_3x_coco/faster_rcnn_x101_64x4d_fpn_mstrain_3x_coco_20210524_124528-26c63de6.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_fpn_tnr-pretrain_1x_coco/faster_rcnn_r50_fpn_tnr-pretrain_1x_coco_20220320_085147-efedfda4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | 
YOLOX_ID2833_for_PyTorch/configs/pvt/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/pvt/retinanet_pvt-t_fpn_1x_coco/retinanet_pvt-t_fpn_1x_coco_20210831_103110-17b566bd.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/pvt/metafile.yml | https://arxiv.org/abs/2102.12122 | 参考文献 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/pvt/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.17.0/mmdet/models/backbones/pvt.py | 下载源码 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/pvt/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/pvt/retinanet_pvt-s_fpn_1x_coco/retinanet_pvt-s_fpn_1x_coco_20210906_142921-b6c94a5b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/pvt/metafile.yml | https://arxiv.org/abs/2102.12122 | 参考文献 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/pvt/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.17.0/mmdet/models/backbones/pvt.py | 下载源码 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/pvt/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/pvt/retinanet_pvt-m_fpn_1x_coco/retinanet_pvt-m_fpn_1x_coco_20210831_103243-55effa1b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/pvt/metafile.yml | https://arxiv.org/abs/2102.12122 | 参考文献 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/pvt/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.17.0/mmdet/models/backbones/pvt.py | 下载源码 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/pvt/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/pvt/retinanet_pvtv2-b0_fpn_1x_coco/retinanet_pvtv2-b0_fpn_1x_coco_20210831_103157-13e9aabe.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/pvt/metafile.yml | https://arxiv.org/abs/2106.13797 | 参考文献 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/pvt/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.17.0/mmdet/models/backbones/pvt.py | 下载源码 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/pvt/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/pvt/retinanet_pvtv2-b1_fpn_1x_coco/retinanet_pvtv2-b1_fpn_1x_coco_20210831_103318-7e169a7d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/pvt/metafile.yml | https://arxiv.org/abs/2106.13797 | 参考文献 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/pvt/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.17.0/mmdet/models/backbones/pvt.py | 下载源码 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/pvt/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/pvt/retinanet_pvtv2-b2_fpn_1x_coco/retinanet_pvtv2-b2_fpn_1x_coco_20210901_174843-529f0b9a.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/pvt/metafile.yml | https://arxiv.org/abs/2106.13797 | 参考文献 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/pvt/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.17.0/mmdet/models/backbones/pvt.py | 下载源码 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/pvt/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/pvt/retinanet_pvtv2-b3_fpn_1x_coco/retinanet_pvtv2-b3_fpn_1x_coco_20210903_151512-8357deff.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/pvt/metafile.yml | https://arxiv.org/abs/2106.13797 | 参考文献 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/pvt/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.17.0/mmdet/models/backbones/pvt.py | 下载源码 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/pvt/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/pvt/retinanet_pvtv2-b4_fpn_1x_coco/retinanet_pvtv2-b4_fpn_1x_coco_20210901_170151-83795c86.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/pvt/metafile.yml | https://arxiv.org/abs/2106.13797 | 参考文献 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/pvt/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.17.0/mmdet/models/backbones/pvt.py | 下载源码 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/pvt/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/pvt/retinanet_pvtv2-b5_fpn_1x_coco/retinanet_pvtv2-b5_fpn_1x_coco_20210902_201800-3420eb57.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/pvt/metafile.yml | https://arxiv.org/abs/2106.13797 | 参考文献 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/pvt/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.17.0/mmdet/models/backbones/pvt.py | 下载源码 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/mask_rcnn/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/detectors/mask_rcnn.py#L6 | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/mask_rcnn/metafile.yml |https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r50_caffe_fpn_1x_coco/mask_rcnn_r50_caffe_fpn_1x_coco_bbox_mAP-0.38__segm_mAP-0.344_20200504_231812-0ebd1859.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r50_fpn_1x_coco/mask_rcnn_r50_fpn_1x_coco_20200205-d4b0c5d6.pth | 下载预训练模型 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fp16/mask_rcnn_r50_fpn_fp16_1x_coco/mask_rcnn_r50_fpn_fp16_1x_coco_20200205-59faf7e4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r50_fpn_2x_coco/mask_rcnn_r50_fpn_2x_coco_bbox_mAP-0.392__segm_mAP-0.354_20200505_003907-3e542a40.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r101_caffe_fpn_1x_coco/mask_rcnn_r101_caffe_fpn_1x_coco_20200601_095758-805e06c1.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r101_fpn_1x_coco/mask_rcnn_r101_fpn_1x_coco_20200204-1efe0ed5.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r101_fpn_2x_coco/mask_rcnn_r101_fpn_2x_coco_bbox_mAP-0.408__segm_mAP-0.366_20200505_071027-14b391c7.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_x101_32x4d_fpn_1x_coco/mask_rcnn_x101_32x4d_fpn_1x_coco_20200205-478d0b67.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_x101_32x4d_fpn_2x_coco/mask_rcnn_x101_32x4d_fpn_2x_coco_bbox_mAP-0.422__segm_mAP-0.378_20200506_004702-faef898c.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_x101_64x4d_fpn_1x_coco/mask_rcnn_x101_64x4d_fpn_1x_coco_20200201-9352eb0d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_x101_64x4d_fpn_2x_coco/mask_rcnn_x101_64x4d_fpn_2x_coco_20200509_224208-39d6f70c.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_x101_32x8d_fpn_1x_coco/mask_rcnn_x101_32x8d_fpn_1x_coco_20220630_173841-0aaf329e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r50_caffe_fpn_mstrain-poly_2x_coco/mask_rcnn_r50_caffe_fpn_mstrain-poly_2x_coco_bbox_mAP-0.403__segm_mAP-0.365_20200504_231822-a75c98ce.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/mask_rcnn/metafile.yml 
| https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r50_caffe_fpn_mstrain-poly_3x_coco/mask_rcnn_r50_caffe_fpn_mstrain-poly_3x_coco_bbox_mAP-0.408__segm_mAP-0.37_20200504_163245-42aa3d00.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r50_fpn_mstrain-poly_3x_coco/mask_rcnn_r50_fpn_mstrain-poly_3x_coco_20210524_201154-21b550bb.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r101_fpn_mstrain-poly_3x_coco/mask_rcnn_r101_fpn_mstrain-poly_3x_coco_20210524_200244-5675c317.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r101_caffe_fpn_mstrain-poly_3x_coco/mask_rcnn_r101_caffe_fpn_mstrain-poly_3x_coco_20210526_132339-3c33ce02.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_x101_32x4d_fpn_mstrain-poly_3x_coco/mask_rcnn_x101_32x4d_fpn_mstrain-poly_3x_coco_20210524_201410-abcd7859.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_x101_32x8d_fpn_mstrain-poly_1x_coco/mask_rcnn_x101_32x8d_fpn_mstrain-poly_1x_coco_20220630_170346-b4637974.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_x101_32x8d_fpn_mstrain-poly_3x_coco/mask_rcnn_x101_32x8d_fpn_mstrain-poly_3x_coco_20210607_161042-8bd2c639.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_x101_64x4d_fpn_mstrain-poly_3x_coco/mask_rcnn_x101_64x4d_fpn_mstrain-poly_3x_coco_20210526_120447-c376f129.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/gcnet/metafile.yml | https://arxiv.org/abs/1904.11492 | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/gcnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/ops/context_block.py | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_r50_fpn_r16_gcb_c3-c5_1x_coco/mask_rcnn_r50_fpn_r16_gcb_c3-c5_1x_coco_20200515_211915-187da160.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_r50_fpn_r4_gcb_c3-c5_1x_coco/mask_rcnn_r50_fpn_r4_gcb_c3-c5_1x_coco_20200204-17235656.pth | 下载预训练模型 | -| 开源代码引入 | 
https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_r101_fpn_r16_gcb_c3-c5_1x_coco/mask_rcnn_r101_fpn_r16_gcb_c3-c5_1x_coco_20200205-e58ae947.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_r101_fpn_r4_gcb_c3-c5_1x_coco/mask_rcnn_r101_fpn_r4_gcb_c3-c5_1x_coco_20200206-af22dc9d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_r50_fpn_syncbn-backbone_1x_coco/mask_rcnn_r50_fpn_syncbn-backbone_1x_coco_20200202-bb3eb55c.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_r50_fpn_syncbn-backbone_r16_gcb_c3-c5_1x_coco/mask_rcnn_r50_fpn_syncbn-backbone_r16_gcb_c3-c5_1x_coco_20200202-587b99aa.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_r50_fpn_syncbn-backbone_r4_gcb_c3-c5_1x_coco/mask_rcnn_r50_fpn_syncbn-backbone_r4_gcb_c3-c5_1x_coco_20200202-50b90e5c.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_r101_fpn_syncbn-backbone_1x_coco/mask_rcnn_r101_fpn_syncbn-backbone_1x_coco_20200210-81658c8a.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_r101_fpn_syncbn-backbone_r16_gcb_c3-c5_1x_coco/mask_rcnn_r101_fpn_syncbn-backbone_r16_gcb_c3-c5_1x_coco_20200207-945e77ca.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_r101_fpn_syncbn-backbone_r4_gcb_c3-c5_1x_coco/mask_rcnn_r101_fpn_syncbn-backbone_r4_gcb_c3-c5_1x_coco_20200206-8407a3f0.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_x101_32x4d_fpn_syncbn-backbone_1x_coco/mask_rcnn_x101_32x4d_fpn_syncbn-backbone_1x_coco_20200211-7584841c.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_x101_32x4d_fpn_syncbn-backbone_r16_gcb_c3-c5_1x_coco/mask_rcnn_x101_32x4d_fpn_syncbn-backbone_r16_gcb_c3-c5_1x_coco_20200211-cbed3d2c.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/gcnet/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_x101_32x4d_fpn_syncbn-backbone_r4_gcb_c3-c5_1x_coco/mask_rcnn_x101_32x4d_fpn_syncbn-backbone_r4_gcb_c3-c5_1x_coco_20200212-68164964.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/cascade_mask_rcnn_x101_32x4d_fpn_syncbn-backbone_1x_coco/cascade_mask_rcnn_x101_32x4d_fpn_syncbn-backbone_1x_coco_20200310-d5ad2a5e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/cascade_mask_rcnn_x101_32x4d_fpn_syncbn-backbone_r16_gcb_c3-c5_1x_coco/cascade_mask_rcnn_x101_32x4d_fpn_syncbn-backbone_r16_gcb_c3-c5_1x_coco_20200211-10bf2463.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/cascade_mask_rcnn_x101_32x4d_fpn_syncbn-backbone_r4_gcb_c3-c5_1x_coco/cascade_mask_rcnn_x101_32x4d_fpn_syncbn-backbone_r4_gcb_c3-c5_1x_coco_20200703_180653-ed035291.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/cascade_mask_rcnn_x101_32x4d_fpn_syncbn-backbone_dconv_c3-c5_1x_coco/cascade_mask_rcnn_x101_32x4d_fpn_syncbn-backbone_dconv_c3-c5_1x_coco_20210615_211019-abbc39ea.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/cascade_mask_rcnn_x101_32x4d_fpn_syncbn-backbone_dconv_c3-c5_r16_gcb_c3-c5_1x_coco/cascade_mask_rcnn_x101_32x4d_fpn_syncbn-backbone_dconv_c3-c5_r16_gcb_c3-c5_1x_coco_20210615_215648-44aa598a.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/cascade_mask_rcnn_x101_32x4d_fpn_syncbn-backbone_dconv_c3-c5_r4_gcb_c3-c5_1x_coco/cascade_mask_rcnn_x101_32x4d_fpn_syncbn-backbone_dconv_c3-c5_r4_gcb_c3-c5_1x_coco_20210615_161851-720338ec.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/retinanet/metafile.yml | https://arxiv.org/abs/1708.02002 | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/retinanet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/detectors/retinanet.py#L6 | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_r18_fpn_1x_coco/retinanet_r18_fpn_1x_coco_20220407_171055-614fd399.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_r18_fpn_1x8_1x_coco/retinanet_r18_fpn_1x8_1x_coco_20220407_171255-4ea310d7.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | 
YOLOX_ID2833_for_PyTorch/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_r50_caffe_fpn_1x_coco/retinanet_r50_caffe_fpn_1x_coco_20200531-f11027c5.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_r50_fpn_1x_coco/retinanet_r50_fpn_1x_coco_20200130-c2398f9e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fp16/retinanet_r50_fpn_fp16_1x_coco/retinanet_r50_fpn_fp16_1x_coco_20200702-0dbfb212.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_r50_fpn_2x_coco/retinanet_r50_fpn_2x_coco_20200131-fdb43119.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_r50_fpn_mstrain_3x_coco/retinanet_r50_fpn_mstrain_3x_coco_20210718_220633-88476508.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_r101_caffe_fpn_1x_coco/retinanet_r101_caffe_fpn_1x_coco_20200531-b428fa0f.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_r101_caffe_fpn_mstrain_3x_coco/retinanet_r101_caffe_fpn_mstrain_3x_coco_20210721_063439-88a8a944.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_r101_fpn_1x_coco/retinanet_r101_fpn_1x_coco_20200130-7a93545f.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_r101_fpn_2x_coco/retinanet_r101_fpn_2x_coco_20200131-5560aee8.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_r101_fpn_mstrain_3x_coco/retinanet_r101_fpn_mstrain_3x_coco_20210720_214650-7ee888e0.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_x101_32x4d_fpn_1x_coco/retinanet_x101_32x4d_fpn_1x_coco_20200130-5c8b7ec4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_x101_32x4d_fpn_2x_coco/retinanet_x101_32x4d_fpn_2x_coco_20200131-237fc5e1.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/retinanet/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_x101_64x4d_fpn_1x_coco/retinanet_x101_64x4d_fpn_1x_coco_20200130-366f5af1.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_x101_64x4d_fpn_2x_coco/retinanet_x101_64x4d_fpn_2x_coco_20200131-bca068ab.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection/blob/master/configs/yolox | YOLOX_ID2833_for_PyTorch/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_x101_64x4d_fpn_mstrain_3x_coco/retinanet_x101_64x4d_fpn_mstrain_3x_coco_20210719_051838-022c2187.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection.git| YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/faster_rcnn_hrnetv2p_w18_1x_coco/faster_rcnn_hrnetv2p_w18_1x_coco_20200130-56651a6d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection.git| YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://arxiv.org/abs/1904.04514 | 下载论文 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection.git | YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/backbones/hrnet.py#L195 | 下载代码 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection.git| YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/faster_rcnn_hrnetv2p_w18_2x_coco/faster_rcnn_hrnetv2p_w18_2x_coco_20200702_085731-a4ec0611.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection.git| YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://arxiv.org/abs/1904.04514 | 下载论文 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection.git | YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/backbones/hrnet.py#L195 | 下载代码 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection.git| YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/faster_rcnn_hrnetv2p_w32_1x_coco/faster_rcnn_hrnetv2p_w32_1x_coco_20200130-6e286425.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection.git| YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://arxiv.org/abs/1904.04514 | 下载论文 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection.git | YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/backbones/hrnet.py#L195 | 下载代码 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection.git| YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/faster_rcnn_hrnetv2p_w32_2x_coco/faster_rcnn_hrnetv2p_w32_2x_coco_20200529_015927-976a9c15.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection.git| YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://arxiv.org/abs/1904.04514 | 下载论文 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection.git | YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/backbones/hrnet.py#L195 | 下载代码 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection.git| YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/hrnet/faster_rcnn_hrnetv2p_w40_1x_coco/faster_rcnn_hrnetv2p_w40_1x_coco_20200210-95c1f5ce.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection.git| YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://arxiv.org/abs/1904.04514 | 下载论文 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection.git | YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/backbones/hrnet.py#L195 | 下载代码 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection.git| YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/faster_rcnn_hrnetv2p_w40_2x_coco/faster_rcnn_hrnetv2p_w40_2x_coco_20200512_161033-0f236ef4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection.git| YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://arxiv.org/abs/1904.04514 | 下载论文 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection.git | YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/backbones/hrnet.py#L195 | 下载代码 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection.git| YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/mask_rcnn_hrnetv2p_w18_1x_coco/mask_rcnn_hrnetv2p_w18_1x_coco_20200205-1c3d78ed.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection.git| YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://arxiv.org/abs/1904.04514 | 下载论文 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection.git | YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/backbones/hrnet.py#L195 | 下载代码 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection.git| YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/mask_rcnn_hrnetv2p_w18_2x_coco/mask_rcnn_hrnetv2p_w18_2x_coco_20200212-b3c825b1.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection.git| YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://arxiv.org/abs/1904.04514 | 下载论文 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection.git | YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/backbones/hrnet.py#L195 | 下载代码 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection.git| YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/mask_rcnn_hrnetv2p_w32_1x_coco/mask_rcnn_hrnetv2p_w32_1x_coco_20200207-b29f616e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection.git| YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://arxiv.org/abs/1904.04514 | 下载论文 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection.git | YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/backbones/hrnet.py#L195 | 下载代码 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection.git| YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/mask_rcnn_hrnetv2p_w32_2x_coco/mask_rcnn_hrnetv2p_w32_2x_coco_20200213-45b75b4d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection.git| YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://arxiv.org/abs/1904.04514 | 下载论文 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection.git | 
YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/backbones/hrnet.py#L195 | 下载代码 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection.git| YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/mask_rcnn_hrnetv2p_w40_1x_coco/mask_rcnn_hrnetv2p_w40_1x_coco_20200511_015646-66738b35.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection.git| YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://arxiv.org/abs/1904.04514 | 下载论文 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection.git | YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/backbones/hrnet.py#L195 | 下载代码 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection.git| YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/mask_rcnn_hrnetv2p_w40_2x_coco/mask_rcnn_hrnetv2p_w40_2x_coco_20200512_163732-aed5e4ab.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection.git| YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://arxiv.org/abs/1904.04514 | 下载论文 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection.git | YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/backbones/hrnet.py#L195 | 下载代码 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection.git| YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/cascade_rcnn_hrnetv2p_w18_20e_coco/cascade_rcnn_hrnetv2p_w18_20e_coco_20200210-434be9d7.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection.git| YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://arxiv.org/abs/1904.04514 | 下载论文 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection.git | YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/backbones/hrnet.py#L195 | 下载代码 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection.git| YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/cascade_rcnn_hrnetv2p_w32_20e_coco/cascade_rcnn_hrnetv2p_w32_20e_coco_20200208-928455a4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection.git| YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://arxiv.org/abs/1904.04514 | 下载论文 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection.git | YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/backbones/hrnet.py#L195 | 下载代码 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection.git| YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/cascade_rcnn_hrnetv2p_w40_20e_coco/cascade_rcnn_hrnetv2p_w40_20e_coco_20200512_161112-75e47b04.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection.git| YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://arxiv.org/abs/1904.04514 | 下载论文 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection.git | YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/backbones/hrnet.py#L195 | 下载代码 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection.git| YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/hrnet/cascade_mask_rcnn_hrnetv2p_w18_20e_coco/cascade_mask_rcnn_hrnetv2p_w18_20e_coco_20200210-b543cd2b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection.git| YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://arxiv.org/abs/1904.04514 | 下载论文 | -| 开源代码引入 | https://github.com/open-mmlab/mmdetection.git | YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://github.com/open-mmlab/mmdetection/blob/v2.0.0/mmdet/models/backbones/hrnet.py#L195 | 下载代码 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|-----------------------------------------------------------------------------------------------------------------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|-------------| +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/atss/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/atss/atss_r50_fpn_1x_coco/atss_r50_fpn_1x_coco_20200209-985f7bd0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/atss/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/atss/atss_r101_fpn_1x_coco/atss_r101_fpn_1x_20200825-dfcadd6f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/atss/metafile.yml | https://arxiv.org/abs/1912.02424 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/autoassign/metafile.yml | https://arxiv.org/abs/2007.03496 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/autoassign/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/autoassign/auto_assign_r50_fpn_1x_coco/auto_assign_r50_fpn_1x_coco_20210413_115540-5e17991f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/carafe/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/carafe/mask_rcnn_r50_fpn_carafe_1x_coco/mask_rcnn_r50_fpn_carafe_1x_coco_bbox_mAP-0.393__segm_mAP-0.358_20200503_135957-8687f195.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/carafe/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/carafe/faster_rcnn_r50_fpn_carafe_1x_coco/faster_rcnn_r50_fpn_carafe_1x_coco_bbox_mAP-0.386_20200504_175733-385a75b7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/carafe/metafile.yml | https://arxiv.org/abs/1905.02188 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_rcnn_x101_64x4d_fpn_20e_coco/cascade_rcnn_x101_64x4d_fpn_20e_coco_20200509_224357-051557b1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_rcnn_x101_64x4d_fpn_1x_coco/cascade_rcnn_x101_64x4d_fpn_1x_coco_20200515_075702-43ce6a30.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_rcnn_x101_32x4d_fpn_20e_coco/cascade_rcnn_x101_32x4d_fpn_20e_coco_20200906_134608-9ae0a720.pth | 权重地址 | 
+| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_rcnn_x101_32x4d_fpn_1x_coco/cascade_rcnn_x101_32x4d_fpn_1x_coco_20200316-95c2deb6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_rcnn_r50_fpn_20e_coco/cascade_rcnn_r50_fpn_20e_coco_bbox_mAP-0.41_20200504_175131-e9872a90.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_rcnn_r50_fpn_1x_coco/cascade_rcnn_r50_fpn_1x_coco_20200316-3dc56deb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_rcnn_r50_caffe_fpn_1x_coco/cascade_rcnn_r50_caffe_fpn_1x_coco_bbox_mAP-0.404_20200504_174853-b857be87.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_rcnn_r101_fpn_20e_coco/cascade_rcnn_r101_fpn_20e_coco_bbox_mAP-0.425_20200504_231812-5057dcc5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_rcnn_r101_fpn_1x_coco/cascade_rcnn_r101_fpn_1x_coco_20200317-0b6a2fbf.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_rcnn_r101_caffe_fpn_1x_coco/cascade_rcnn_r101_caffe_fpn_1x_coco_bbox_mAP-0.423_20200504_175649-cab8dbd5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_x101_64x4d_fpn_mstrain_3x_coco/cascade_mask_rcnn_x101_64x4d_fpn_mstrain_3x_coco_20210719_210311-d3e64ba0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_x101_64x4d_fpn_20e_coco/cascade_mask_rcnn_x101_64x4d_fpn_20e_coco_20200512_161033-bdb5126a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_x101_64x4d_fpn_1x_coco/cascade_mask_rcnn_x101_64x4d_fpn_1x_coco_20200203-9a2db89d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_x101_32x8d_fpn_mstrain_3x_coco/cascade_mask_rcnn_x101_32x8d_fpn_mstrain_3x_coco_20210719_180640-9ff7e76f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_x101_32x4d_fpn_mstrain_3x_coco/cascade_mask_rcnn_x101_32x4d_fpn_mstrain_3x_coco_20210706_225234-40773067.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/cascade_rcnn/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_x101_32x4d_fpn_20e_coco/cascade_mask_rcnn_x101_32x4d_fpn_20e_coco_20200528_083917-ed1f4751.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_x101_32x4d_fpn_1x_coco/cascade_mask_rcnn_x101_32x4d_fpn_1x_coco_20200201-0f411b1f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_r50_fpn_mstrain_3x_coco/cascade_mask_rcnn_r50_fpn_mstrain_3x_coco_20210628_164719-5bdc3824.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_r50_fpn_20e_coco/cascade_mask_rcnn_r50_fpn_20e_coco_bbox_mAP-0.419__segm_mAP-0.365_20200504_174711-4af8e66e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_r50_fpn_1x_coco/cascade_mask_rcnn_r50_fpn_1x_coco_20200203-9d4dcb24.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_r50_caffe_fpn_mstrain_3x_coco/cascade_mask_rcnn_r50_caffe_fpn_mstrain_3x_coco_20210707_002651-6e29b3a6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_r50_caffe_fpn_1x_coco/cascade_mask_rcnn_r50_caffe_fpn_1x_coco_bbox_mAP-0.412__segm_mAP-0.36_20200504_174659-5004b251.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_r101_fpn_mstrain_3x_coco/cascade_mask_rcnn_r101_fpn_mstrain_3x_coco_20210628_165236-51a2d363.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_r101_fpn_20e_coco/cascade_mask_rcnn_r101_fpn_20e_coco_bbox_mAP-0.434__segm_mAP-0.378_20200504_174836-005947da.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_r101_fpn_1x_coco/cascade_mask_rcnn_r101_fpn_1x_coco_20200203-befdf6ee.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_r101_caffe_fpn_mstrain_3x_coco/cascade_mask_rcnn_r101_caffe_fpn_mstrain_3x_coco_20210707_002620-a5bd2389.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/cascade_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_r101_caffe_fpn_1x_coco/cascade_mask_rcnn_r101_caffe_fpn_1x_coco_bbox_mAP-0.432__segm_mAP-0.376_20200504_174813-5c1e9599.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/cascade_rcnn/metafile.yml | 
http://dx.doi.org/10.1109/tpami.2019.2956516 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/cascade_rcnn/metafile.yml | http://dx.doi.org/10.1109/tpami.2019.2956516 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/cascade_rpn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rpn/crpn_faster_rcnn_r50_caffe_fpn_1x_coco/crpn_faster_rcnn_r50_caffe_fpn_1x_coco-c8283cca.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/cascade_rpn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cascade_rpn/crpn_fast_rcnn_r50_caffe_fpn_1x_coco/crpn_fast_rcnn_r50_caffe_fpn_1x_coco-cb486e66.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/cascade_rpn/metafile.yml | https://arxiv.org/abs/1909.06720 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/centernet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/centernet/centernet_resnet18_dcnv2_140e_coco/centernet_resnet18_dcnv2_140e_coco_20210702_155131-c8cd631f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/centernet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/centernet/centernet_resnet18_140e_coco/centernet_resnet18_140e_coco_20210705_093630-bb5b3bf7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/centernet/metafile.yml | https://arxiv.org/abs/1904.07850 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/centripetalnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/centripetalnet/centripetalnet_hourglass104_mstest_16x6_210e_coco/centripetalnet_hourglass104_mstest_16x6_210e_coco_20200915_204804-3ccc61e5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/centripetalnet/metafile.yml | https://arxiv.org/abs/2003.09119 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/cityscapes/faster_rcnn_r50_fpn_1x_cityscapes.py | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_fpn_1x_coco/faster_rcnn_r50_fpn_1x_coco_20200130-047c8118.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/cityscapes/mask_rcnn_r50_fpn_1x_cityscapes.py | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r50_fpn_1x_coco/mask_rcnn_r50_fpn_1x_coco_20200205-d4b0c5d6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/convnext/cascade_mask_rcnn_convnext-s_p4_w7_fpn_giou_4conv1f_fp16_ms-crop_3x_coco.py | https://download.openmmlab.com/mmclassification/v0/convnext/downstream/convnext-small_3rdparty_32xb128-noema_in1k_20220301-303e75e3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/convnext/cascade_mask_rcnn_convnext-t_p4_w7_fpn_giou_4conv1f_fp16_ms-crop_3x_coco.py | https://download.openmmlab.com/mmclassification/v0/convnext/downstream/convnext-tiny_3rdparty_32xb128-noema_in1k_20220301-795e9634.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/convnext/mask_rcnn_convnext-t_p4_w7_fpn_fp16_ms-crop_3x_coco.py | https://download.openmmlab.com/mmclassification/v0/convnext/downstream/convnext-tiny_3rdparty_32xb128-noema_in1k_20220301-795e9634.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/convnext/metafile.yml | https://arxiv.org/abs/2201.03545 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/convnext/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/convnext/mask_rcnn_convnext-t_p4_w7_fpn_fp16_ms-crop_3x_coco/mask_rcnn_convnext-t_p4_w7_fpn_fp16_ms-crop_3x_coco_20220426_154953-050731f4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/convnext/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/convnext/cascade_mask_rcnn_convnext-t_p4_w7_fpn_giou_4conv1f_fp16_ms-crop_3x_coco/cascade_mask_rcnn_convnext-t_p4_w7_fpn_giou_4conv1f_fp16_ms-crop_3x_coco_20220509_204200-8f07c40b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/convnext/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/convnext/cascade_mask_rcnn_convnext-s_p4_w7_fpn_giou_4conv1f_fp16_ms-crop_3x_coco/cascade_mask_rcnn_convnext-s_p4_w7_fpn_giou_4conv1f_fp16_ms-crop_3x_coco_20220510_201004-3d24f5a4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/cornernet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cornernet/cornernet_hourglass104_mstest_8x6_210e_coco/cornernet_hourglass104_mstest_8x6_210e_coco_20200825_150618-79b44c30.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/cornernet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cornernet/cornernet_hourglass104_mstest_32x3_210e_coco/cornernet_hourglass104_mstest_32x3_210e_coco_20200819_203110-1efaea91.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/cornernet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/cornernet/cornernet_hourglass104_mstest_10x5_210e_coco/cornernet_hourglass104_mstest_10x5_210e_coco_20200824_185720-5fefbf1c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/cornernet/metafile.yml | https://arxiv.org/abs/1808.01244 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/dcn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fp16/mask_rcnn_r50_fpn_fp16_dconv_c3-c5_1x_coco/mask_rcnn_r50_fpn_fp16_dconv_c3-c5_1x_coco_20210520_180247-c06429d2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/dcn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/dcn/mask_rcnn_r50_fpn_dconv_c3-c5_1x_coco/mask_rcnn_r50_fpn_dconv_c3-c5_1x_coco_20200203-4d9ad43b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/dcn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/dcn/mask_rcnn_r101_fpn_dconv_c3-c5_1x_coco/mask_rcnn_r101_fpn_dconv_c3-c5_1x_coco_20200216-a71f5bce.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/dcn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/dcn/faster_rcnn_x101_32x4d_fpn_dconv_c3-c5_1x_coco/faster_rcnn_x101_32x4d_fpn_dconv_c3-c5_1x_coco_20200203-4f85c69c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/dcn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/dcn/faster_rcnn_r50_fpn_dpool_1x_coco/faster_rcnn_r50_fpn_dpool_1x_coco_20200307-90d3c01d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/dcn/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/dcn/faster_rcnn_r50_fpn_dconv_c3-c5_1x_coco/faster_rcnn_r50_fpn_dconv_c3-c5_1x_coco_20200130-d68aed1e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/dcn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/dcn/faster_rcnn_r101_fpn_dconv_c3-c5_1x_coco/faster_rcnn_r101_fpn_dconv_c3-c5_1x_coco_20200203-1377f13d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/dcn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/dcn/cascade_rcnn_r50_fpn_dconv_c3-c5_1x_coco/cascade_rcnn_r50_fpn_dconv_c3-c5_1x_coco_20200130-2f1fca44.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/dcn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/dcn/cascade_rcnn_r101_fpn_dconv_c3-c5_1x_coco/cascade_rcnn_r101_fpn_dconv_c3-c5_1x_coco_20200203-3b2f0594.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/dcn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/dcn/cascade_mask_rcnn_x101_32x4d_fpn_dconv_c3-c5_1x_coco/cascade_mask_rcnn_x101_32x4d_fpn_dconv_c3-c5_1x_coco-e75f90c8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/dcn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/dcn/cascade_mask_rcnn_r50_fpn_dconv_c3-c5_1x_coco/cascade_mask_rcnn_r50_fpn_dconv_c3-c5_1x_coco_20200202-42e767a2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/dcn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/dcn/cascade_mask_rcnn_r101_fpn_dconv_c3-c5_1x_coco/cascade_mask_rcnn_r101_fpn_dconv_c3-c5_1x_coco_20200204-df0c5f10.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/dcn/metafile.yml | https://arxiv.org/abs/1703.06211 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/dcnv2/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fp16/mask_rcnn_r50_fpn_fp16_mdconv_c3-c5_1x_coco/mask_rcnn_r50_fpn_fp16_mdconv_c3-c5_1x_coco_20210520_180434-cf8fefa5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/dcnv2/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/dcn/mask_rcnn_r50_fpn_mdconv_c3-c5_1x_coco/mask_rcnn_r50_fpn_mdconv_c3-c5_1x_coco_20200203-ad97591f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/dcnv2/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/dcn/faster_rcnn_r50_fpn_mdpool_1x_coco/faster_rcnn_r50_fpn_mdpool_1x_coco_20200307-c0df27ff.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/dcnv2/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/dcn/faster_rcnn_r50_fpn_mdconv_c3-c5_group4_1x_coco/faster_rcnn_r50_fpn_mdconv_c3-c5_group4_1x_coco_20200130-01262257.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/dcnv2/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/dcn/faster_rcnn_r50_fpn_mdconv_c3-c5_1x_coco/faster_rcnn_r50_fpn_mdconv_c3-c5_1x_coco_20200130-d099253b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/dcnv2/metafile.yml | https://arxiv.org/abs/1811.11168 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/ddod/metafile.yml | https://arxiv.org/pdf/2107.02963.pdf | 论文地址 | +| 
ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/ddod/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/ddod/ddod_r50_fpn_1x_coco/ddod_r50_fpn_1x_coco_20220523_223737-29b2fc67.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/deformable_detr/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/deformable_detr/deformable_detr_twostage_refine_r50_16x2_50e_coco/deformable_detr_twostage_refine_r50_16x2_50e_coco_20210419_220613-9d28ab72.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/deformable_detr/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/deformable_detr/deformable_detr_refine_r50_16x2_50e_coco/deformable_detr_refine_r50_16x2_50e_coco_20210419_220503-5f5dff21.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/deformable_detr/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/deformable_detr/deformable_detr_r50_16x2_50e_coco/deformable_detr_r50_16x2_50e_coco_20210419_220030-a12b9512.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/deformable_detr/metafile.yml | https://openreview.net/forum?id=gZ9hCDWe6ke | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/detectors/metafile.yml | https://arxiv.org/abs/2006.02334 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/detectors/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/detectors/htc_r50_sac_1x_coco/htc_r50_sac_1x_coco-bfa60c54.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/detectors/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/detectors/htc_r50_rfp_1x_coco/htc_r50_rfp_1x_coco-8ff87c51.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/detectors/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/detectors/detectors_htc_r50_1x_coco/detectors_htc_r50_1x_coco-329b1453.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/detectors/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/detectors/detectors_cascade_rcnn_r50_1x_coco/detectors_cascade_rcnn_r50_1x_coco-32a10ba0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/detectors/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/detectors/cascade_rcnn_r50_sac_1x_coco/cascade_rcnn_r50_sac_1x_coco-24bfda62.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/detectors/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/detectors/cascade_rcnn_r50_rfp_1x_coco/cascade_rcnn_r50_rfp_1x_coco-8cf51bfd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/detr/metafile.yml | https://arxiv.org/abs/2005.12872 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/detr/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/detr/detr_r50_8x2_150e_coco/detr_r50_8x2_150e_coco_20201130_194835-2c4b8974.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/double_heads/metafile.yml | https://arxiv.org/pdf/1904.06493 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/double_heads/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/double_heads/dh_faster_rcnn_r50_fpn_1x_coco/dh_faster_rcnn_r50_fpn_1x_coco_20200130-586b67df.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/dyhead/metafile.yml | https://arxiv.org/abs/2106.08322 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/dyhead/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/dyhead/atss_swin-l-p4-w12_fpn_dyhead_mstrain_2x_coco/atss_swin-l-p4-w12_fpn_dyhead_mstrain_2x_coco_20220509_100315-bc5b6516.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/dyhead/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/dyhead/atss_r50_fpn_dyhead_for_reproduction_1x_coco/atss_r50_fpn_dyhead_for_reproduction_4x4_1x_coco_20220107_213939-162888e6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/dyhead/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/dyhead/atss_r50_fpn_dyhead_4x4_1x_coco/atss_r50_fpn_dyhead_4x4_1x_coco_20211219_023314-eaa620c6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/dynamic_rcnn/metafile.yml | https://arxiv.org/pdf/2004.06002 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/dynamic_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/dynamic_rcnn/dynamic_rcnn_r50_fpn_1x/dynamic_rcnn_r50_fpn_1x-62a3f276.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/efficientnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/efficientnet/retinanet_effb3_fpn_crop896_8x4_1x_coco/retinanet_effb3_fpn_crop896_8x4_1x_coco_20220322_234806-615a0dda.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/efficientnet/metafile.yml | https://arxiv.org/abs/1905.11946v5 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/efficientnet/retinanet_effb3_fpn_crop896_8x4_1x_coco.py | https://download.openmmlab.com/mmclassification/v0/efficientnet/efficientnet-b3_3rdparty_8xb32-aa_in1k_20220119-5b4887a0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/empirical_attention/metafile.yml | https://arxiv.org/pdf/1904.05873 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/empirical_attention/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/empirical_attention/faster_rcnn_r50_fpn_attention_1111_dcn_1x_coco/faster_rcnn_r50_fpn_attention_1111_dcn_1x_coco_20200130-8b2523a6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/empirical_attention/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/empirical_attention/faster_rcnn_r50_fpn_attention_1111_1x_coco/faster_rcnn_r50_fpn_attention_1111_1x_coco_20200130-403cccba.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/empirical_attention/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/empirical_attention/faster_rcnn_r50_fpn_attention_0010_dcn_1x_coco/faster_rcnn_r50_fpn_attention_0010_dcn_1x_coco_20200130-1a2e831d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/empirical_attention/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/empirical_attention/faster_rcnn_r50_fpn_attention_0010_1x_coco/faster_rcnn_r50_fpn_attention_0010_1x_coco_20200130-7cb0c14d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/faster_rcnn_r50_caffe_fpn_mstrain_1x_coco-person.py | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_caffe_fpn_mstrain_3x_coco/faster_rcnn_r50_caffe_fpn_mstrain_3x_coco_bbox_mAP-0.398_20200504_163323-30042637.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/faster_rcnn_r50_caffe_fpn_mstrain_1x_coco-person-bicycle-car.py | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_caffe_fpn_mstrain_3x_coco/faster_rcnn_r50_caffe_fpn_mstrain_3x_coco_bbox_mAP-0.398_20200504_163323-30042637.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/faster_rcnn_r50_fpn_tnr-pretrain_1x_coco.py | https://download.pytorch.org/models/resnet50-11ad3fa6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fp16/faster_rcnn_r50_fpn_fp16_1x_coco/faster_rcnn_r50_fpn_fp16_1x_coco_20200204-d4dc1471.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_x101_64x4d_fpn_mstrain_3x_coco/faster_rcnn_x101_64x4d_fpn_mstrain_3x_coco_20210524_124528-26c63de6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_x101_64x4d_fpn_2x_coco/faster_rcnn_x101_64x4d_fpn_2x_coco_20200512_161033-5961fa95.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_x101_64x4d_fpn_1x_coco/faster_rcnn_x101_64x4d_fpn_1x_coco_20200204-833ee192.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_x101_32x8d_fpn_mstrain_3x_coco/faster_rcnn_x101_32x8d_fpn_mstrain_3x_coco_20210604_182954-002e082a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_x101_32x4d_fpn_mstrain_3x_coco/faster_rcnn_x101_32x4d_fpn_mstrain_3x_coco_20210524_124151-16b9b260.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_x101_32x4d_fpn_2x_coco/faster_rcnn_x101_32x4d_fpn_2x_coco_bbox_mAP-0.412_20200506_041400-64a12c0b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_x101_32x4d_fpn_1x_coco/faster_rcnn_x101_32x4d_fpn_1x_coco_20200203-cff10310.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_fpn_tnr-pretrain_1x_coco/faster_rcnn_r50_fpn_tnr-pretrain_1x_coco_20220320_085147-efedfda4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_fpn_mstrain_3x_coco/faster_rcnn_r50_fpn_mstrain_3x_coco_20210524_110822-e10bd31c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_fpn_iou_1x_coco/faster_rcnn_r50_fpn_iou_1x_coco_20200506_095954-938e81f0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_fpn_2x_coco/faster_rcnn_r50_fpn_2x_coco_bbox_mAP-0.384_20200504_210434-a5d8aa15.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_fpn_1x_coco/faster_rcnn_r50_fpn_giou_1x_coco-0eada910.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_fpn_1x_coco/faster_rcnn_r50_fpn_bounded_iou_1x_coco-98ad993b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_fpn_1x_coco/faster_rcnn_r50_fpn_1x_coco_20200130-047c8118.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_caffe_fpn_mstrain_3x_coco/faster_rcnn_r50_caffe_fpn_mstrain_3x_coco_20210526_095054-1f77628b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_caffe_fpn_mstrain_2x_coco/faster_rcnn_r50_caffe_fpn_mstrain_2x_coco_bbox_mAP-0.397_20200504_231813-10b2de58.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_caffe_fpn_1x_coco/faster_rcnn_r50_caffe_fpn_1x_coco_bbox_mAP-0.378_20200504_180032-c5925ee5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_caffe_dc5_mstrain_3x_coco/faster_rcnn_r50_caffe_dc5_mstrain_3x_coco_20201028_002107-34a53b2c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_caffe_dc5_mstrain_1x_coco/faster_rcnn_r50_caffe_dc5_mstrain_1x_coco_20201028_233851-b33d21b9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_caffe_dc5_1x_coco/faster_rcnn_r50_caffe_dc5_1x_coco_20201030_151909-531f0f43.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_caffe_c4_mstrain_1x_coco/faster_rcnn_r50_caffe_c4_mstrain_1x_coco_20220316_150527-db276fed.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_caffe_c4_1x_coco/faster_rcnn_r50_caffe_c4_1x_coco_20220316_150152-3f885b85.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r101_fpn_mstrain_3x_coco/faster_rcnn_r101_fpn_mstrain_3x_coco_20210524_110822-4d4d2ca8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r101_fpn_2x_coco/faster_rcnn_r101_fpn_2x_coco_bbox_mAP-0.398_20200504_210455-1d2dac9c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r101_fpn_1x_coco/faster_rcnn_r101_fpn_1x_coco_20200130-f513f705.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r101_caffe_fpn_mstrain_3x_coco/faster_rcnn_r101_caffe_fpn_mstrain_3x_coco_20210526_095742-a7ae426d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r101_caffe_fpn_1x_coco/faster_rcnn_r101_caffe_fpn_1x_coco_bbox_mAP-0.398_20200504_180057-b269e9dd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/faster_rcnn/metafile.yml | https://arxiv.org/abs/1506.01497 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/fcos/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fcos/fcos_x101_64x4d_fpn_gn-head_mstrain_640-800_2x_coco/fcos_x101_64x4d_fpn_gn-head_mstrain_640-800_2x_coco-ede514a8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/fcos/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fcos/fcos_r50_caffe_fpn_gn-head_mstrain_640-800_2x_coco/fcos_r50_caffe_fpn_gn-head_mstrain_640-800_2x_coco-d92ceeea.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/fcos/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fcos/fcos_r50_caffe_fpn_gn-head_1x_coco/fcos_r50_caffe_fpn_gn-head_1x_coco-821213aa.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/fcos/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fcos/fcos_r101_caffe_fpn_gn-head_mstrain_640-800_2x_coco/fcos_r101_caffe_fpn_gn-head_mstrain_640-800_2x_coco-511424d6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/fcos/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fcos/fcos_r101_caffe_fpn_gn-head_1x_coco/fcos_r101_caffe_fpn_gn-head_1x_coco-0e37b982.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/fcos/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/fcos/fcos_center-normbbox-centeronreg-giou_r50_caffe_fpn_gn-head_dcn_1x_coco/fcos_center-normbbox-centeronreg-giou_r50_caffe_fpn_gn-head_dcn_1x_coco-ae4d8b3d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/fcos/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fcos/fcos_center-normbbox-centeronreg-giou_r50_caffe_fpn_gn-head_1x_coco/fcos_center-normbbox-centeronreg-giou_r50_caffe_fpn_gn-head_1x_coco-0a0d75a8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/fcos/metafile.yml | https://arxiv.org/abs/1904.01355 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/foveabox/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/foveabox/fovea_r50_fpn_4x4_2x_coco/fovea_r50_fpn_4x4_2x_coco_20200203-2df792b1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/foveabox/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/foveabox/fovea_r50_fpn_4x4_1x_coco/fovea_r50_fpn_4x4_1x_coco_20200219-ee4d5303.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/foveabox/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/foveabox/fovea_r101_fpn_4x4_2x_coco/fovea_r101_fpn_4x4_2x_coco_20200208-02320ea4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/foveabox/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/foveabox/fovea_r101_fpn_4x4_1x_coco/fovea_r101_fpn_4x4_1x_coco_20200219-05e38f1c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/foveabox/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/foveabox/fovea_align_r50_fpn_gn-head_mstrain_640-800_4x4_2x_coco/fovea_align_r50_fpn_gn-head_mstrain_640-800_4x4_2x_coco_20200205-85ce26cb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/foveabox/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/foveabox/fovea_align_r50_fpn_gn-head_4x4_2x_coco/fovea_align_r50_fpn_gn-head_4x4_2x_coco_20200203-8987880d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/foveabox/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/foveabox/fovea_align_r101_fpn_gn-head_mstrain_640-800_4x4_2x_coco/fovea_align_r101_fpn_gn-head_mstrain_640-800_4x4_2x_coco_20200208-649c5eb6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/foveabox/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/foveabox/fovea_align_r101_fpn_gn-head_4x4_2x_coco/fovea_align_r101_fpn_gn-head_4x4_2x_coco_20200208-c39a027a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/foveabox/metafile.yml | https://arxiv.org/abs/1904.03797 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/fpg/metafile.yml | https://arxiv.org/abs/2004.03580 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/fpg/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fpg/retinanet_r50_fpg_crop640_50e_coco/retinanet_r50_fpg_crop640_50e_coco_20220311_110809-b0bcf5f4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/fpg/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/fpg/mask_rcnn_r50_fpg-chn128_crop640_50e_coco/mask_rcnn_r50_fpg-chn128_crop640_50e_coco_20220311_011859-043c9b4e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/fpg/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fpg/mask_rcnn_r50_fpg_crop640_50e_coco/mask_rcnn_r50_fpg_crop640_50e_coco_20220311_011857-233b8334.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/fpg/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fpg/faster_rcnn_r50_fpg-chn128_crop640_50e_coco/faster_rcnn_r50_fpg-chn128_crop640_50e_coco_20220311_011857-9376aa9d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/fpg/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fpg/faster_rcnn_r50_fpg_crop640_50e_coco/faster_rcnn_r50_fpg_crop640_50e_coco_20220311_011856-74109f42.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/fpg/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fpg/retinanet_r50_fpg-chn128_crop640_50e_coco/retinanet_r50_fpg-chn128_crop640_50e_coco_20220313_104829-ee99a686.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/free_anchor/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/free_anchor/retinanet_free_anchor_x101_32x4d_fpn_1x_coco/retinanet_free_anchor_x101_32x4d_fpn_1x_coco_20200130-d4846968.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/free_anchor/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/free_anchor/retinanet_free_anchor_r50_fpn_1x_coco/retinanet_free_anchor_r50_fpn_1x_coco_20200130-0f67375f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/free_anchor/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/free_anchor/retinanet_free_anchor_r101_fpn_1x_coco/retinanet_free_anchor_r101_fpn_1x_coco_20200130-358324e6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/free_anchor/metafile.yml | https://arxiv.org/abs/1909.02466 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/fsaf/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fsaf/fsaf_x101_64x4d_fpn_1x_coco/fsaf_x101_64x4d_fpn_1x_coco-e3f6e6fd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/fsaf/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fsaf/fsaf_r50_fpn_1x_coco/fsaf_r50_fpn_1x_coco-94ccc51f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/fsaf/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fsaf/fsaf_r101_fpn_1x_coco/fsaf_r101_fpn_1x_coco-9e71098f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/fsaf/metafile.yml | https://arxiv.org/abs/1903.00621 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_x101_32x4d_fpn_syncbn-backbone_r4_gcb_c3-c5_1x_coco/mask_rcnn_x101_32x4d_fpn_syncbn-backbone_r4_gcb_c3-c5_1x_coco_20200212-68164964.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/gcnet/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_x101_32x4d_fpn_syncbn-backbone_r16_gcb_c3-c5_1x_coco/mask_rcnn_x101_32x4d_fpn_syncbn-backbone_r16_gcb_c3-c5_1x_coco_20200211-cbed3d2c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_x101_32x4d_fpn_syncbn-backbone_1x_coco/mask_rcnn_x101_32x4d_fpn_syncbn-backbone_1x_coco_20200211-7584841c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_r50_fpn_syncbn-backbone_r4_gcb_c3-c5_1x_coco/mask_rcnn_r50_fpn_syncbn-backbone_r4_gcb_c3-c5_1x_coco_20200202-50b90e5c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_r50_fpn_syncbn-backbone_r16_gcb_c3-c5_1x_coco/mask_rcnn_r50_fpn_syncbn-backbone_r16_gcb_c3-c5_1x_coco_20200202-587b99aa.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_r50_fpn_syncbn-backbone_1x_coco/mask_rcnn_r50_fpn_syncbn-backbone_1x_coco_20200202-bb3eb55c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_r50_fpn_r4_gcb_c3-c5_1x_coco/mask_rcnn_r50_fpn_r4_gcb_c3-c5_1x_coco_20200204-17235656.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_r50_fpn_r16_gcb_c3-c5_1x_coco/mask_rcnn_r50_fpn_r16_gcb_c3-c5_1x_coco_20200515_211915-187da160.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_r101_fpn_syncbn-backbone_r4_gcb_c3-c5_1x_coco/mask_rcnn_r101_fpn_syncbn-backbone_r4_gcb_c3-c5_1x_coco_20200206-8407a3f0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_r101_fpn_syncbn-backbone_r16_gcb_c3-c5_1x_coco/mask_rcnn_r101_fpn_syncbn-backbone_r16_gcb_c3-c5_1x_coco_20200207-945e77ca.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_r101_fpn_syncbn-backbone_1x_coco/mask_rcnn_r101_fpn_syncbn-backbone_1x_coco_20200210-81658c8a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_r101_fpn_r4_gcb_c3-c5_1x_coco/mask_rcnn_r101_fpn_r4_gcb_c3-c5_1x_coco_20200206-af22dc9d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_r101_fpn_r16_gcb_c3-c5_1x_coco/mask_rcnn_r101_fpn_r16_gcb_c3-c5_1x_coco_20200205-e58ae947.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/gcnet/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/gcnet/cascade_mask_rcnn_x101_32x4d_fpn_syncbn-backbone_r4_gcb_c3-c5_1x_coco/cascade_mask_rcnn_x101_32x4d_fpn_syncbn-backbone_r4_gcb_c3-c5_1x_coco_20200703_180653-ed035291.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/cascade_mask_rcnn_x101_32x4d_fpn_syncbn-backbone_r16_gcb_c3-c5_1x_coco/cascade_mask_rcnn_x101_32x4d_fpn_syncbn-backbone_r16_gcb_c3-c5_1x_coco_20200211-10bf2463.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/cascade_mask_rcnn_x101_32x4d_fpn_syncbn-backbone_dconv_c3-c5_r4_gcb_c3-c5_1x_coco/cascade_mask_rcnn_x101_32x4d_fpn_syncbn-backbone_dconv_c3-c5_r4_gcb_c3-c5_1x_coco_20210615_161851-720338ec.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/cascade_mask_rcnn_x101_32x4d_fpn_syncbn-backbone_dconv_c3-c5_r16_gcb_c3-c5_1x_coco/cascade_mask_rcnn_x101_32x4d_fpn_syncbn-backbone_dconv_c3-c5_r16_gcb_c3-c5_1x_coco_20210615_215648-44aa598a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/cascade_mask_rcnn_x101_32x4d_fpn_syncbn-backbone_dconv_c3-c5_1x_coco/cascade_mask_rcnn_x101_32x4d_fpn_syncbn-backbone_dconv_c3-c5_1x_coco_20210615_211019-abbc39ea.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/gcnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gcnet/cascade_mask_rcnn_x101_32x4d_fpn_syncbn-backbone_1x_coco/cascade_mask_rcnn_x101_32x4d_fpn_syncbn-backbone_1x_coco_20200310-d5ad2a5e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/gcnet/metafile.yml | https://arxiv.org/abs/1904.11492 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/gfl/metafile.yml | https://arxiv.org/abs/2006.04388 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/gfl/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gfl/gfl_x101_32x4d_fpn_mstrain_2x_coco/gfl_x101_32x4d_fpn_mstrain_2x_coco_20200630_102002-50c1ffdb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/gfl/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gfl/gfl_x101_32x4d_fpn_dconv_c4-c5_mstrain_2x_coco/gfl_x101_32x4d_fpn_dconv_c4-c5_mstrain_2x_coco_20200630_102002-14a2bf25.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/gfl/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gfl/gfl_r50_fpn_mstrain_2x_coco/gfl_r50_fpn_mstrain_2x_coco_20200629_213802-37bb1edc.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/gfl/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gfl/gfl_r50_fpn_1x_coco/gfl_r50_fpn_1x_coco_20200629_121244-25944287.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/gfl/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gfl/gfl_r101_fpn_mstrain_2x_coco/gfl_r101_fpn_mstrain_2x_coco_20200629_200126-dd12f847.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/gfl/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/gfl/gfl_r101_fpn_dconv_c3-c5_mstrain_2x_coco/gfl_r101_fpn_dconv_c3-c5_mstrain_2x_coco_20200630_102002-134b07df.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/ghm/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/ghm/retinanet_ghm_x101_64x4d_fpn_1x_coco/retinanet_ghm_x101_64x4d_fpn_1x_coco_20200131-dd381cef.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/ghm/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/ghm/retinanet_ghm_x101_32x4d_fpn_1x_coco/retinanet_ghm_x101_32x4d_fpn_1x_coco_20200131-e4333bd0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/ghm/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/ghm/retinanet_ghm_r50_fpn_1x_coco/retinanet_ghm_r50_fpn_1x_coco_20200130-a437fda3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/ghm/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/ghm/retinanet_ghm_r101_fpn_1x_coco/retinanet_ghm_r101_fpn_1x_coco_20200130-c148ee8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/ghm/metafile.yml | https://arxiv.org/abs/1811.05181 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/gn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gn/mask_rcnn_r50_fpn_gn-all_contrib_3x_coco/mask_rcnn_r50_fpn_gn-all_contrib_3x_coco_20200225-542aefbc.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/gn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gn/mask_rcnn_r50_fpn_gn-all_contrib_2x_coco/mask_rcnn_r50_fpn_gn-all_contrib_2x_coco_20200207-20d3e849.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/gn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gn/mask_rcnn_r50_fpn_gn-all_3x_coco/mask_rcnn_r50_fpn_gn-all_3x_coco_20200214-8b23b1e5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/gn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gn/mask_rcnn_r50_fpn_gn-all_2x_coco/mask_rcnn_r50_fpn_gn-all_2x_coco_20200206-8eee02a6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/gn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gn/mask_rcnn_r101_fpn_gn-all_3x_coco/mask_rcnn_r101_fpn_gn-all_3x_coco_20200513_181609-0df864f4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/gn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gn/mask_rcnn_r101_fpn_gn-all_2x_coco/mask_rcnn_r101_fpn_gn-all_2x_coco_20200205-d96b1b50.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/gn/metafile.yml | https://arxiv.org/abs/1803.08494 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/gn+ws/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gn%2Bws/mask_rcnn_x50_32x4d_fpn_gn_ws-all_2x_coco/mask_rcnn_x50_32x4d_fpn_gn_ws-all_2x_coco_20200216-649fdb6f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/gn+ws/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gn%2Bws/mask_rcnn_x50_32x4d_fpn_gn_ws-all_20_23_24e_coco/mask_rcnn_x50_32x4d_fpn_gn_ws-all_20_23_24e_coco_20200226-969bcb2c.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/gn+ws/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gn%2Bws/mask_rcnn_x101_32x4d_fpn_gn_ws-all_2x_coco/mask_rcnn_x101_32x4d_fpn_gn_ws-all_2x_coco_20200319-33fb95b5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/gn+ws/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gn%2Bws/mask_rcnn_x101_32x4d_fpn_gn_ws-all_20_23_24e_coco/mask_rcnn_x101_32x4d_fpn_gn_ws-all_20_23_24e_coco_20200316-e6cd35ef.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/gn+ws/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gn%2Bws/mask_rcnn_r50_fpn_gn_ws-all_2x_coco/mask_rcnn_r50_fpn_gn_ws-all_2x_coco_20200226-16acb762.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/gn+ws/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gn%2Bws/mask_rcnn_r50_fpn_gn_ws-all_20_23_24e_coco/mask_rcnn_r50_fpn_gn_ws-all_20_23_24e_coco_20200213-487d1283.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/gn+ws/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gn%2Bws/mask_rcnn_r101_fpn_gn_ws-all_2x_coco/mask_rcnn_r101_fpn_gn_ws-all_2x_coco_20200212-ea357cd9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/gn+ws/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gn%2Bws/mask_rcnn_r101_fpn_gn_ws-all_20_23_24e_coco/mask_rcnn_r101_fpn_gn_ws-all_20_23_24e_coco_20200213-57b5a50f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/gn+ws/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gn%2Bws/faster_rcnn_x50_32x4d_fpn_gn_ws-all_1x_coco/faster_rcnn_x50_32x4d_fpn_gn_ws-all_1x_coco_20200203-839c5d9d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/gn+ws/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gn%2Bws/faster_rcnn_x101_32x4d_fpn_gn_ws-all_1x_coco/faster_rcnn_x101_32x4d_fpn_gn_ws-all_1x_coco_20200212-27da1bc2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/gn+ws/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gn%2Bws/faster_rcnn_r50_fpn_gn_ws-all_1x_coco/faster_rcnn_r50_fpn_gn_ws-all_1x_coco_20200130-613d9fe2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/gn+ws/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/gn%2Bws/faster_rcnn_r101_fpn_gn_ws-all_1x_coco/faster_rcnn_r101_fpn_gn_ws-all_1x_coco_20200205-a93b0d75.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/gn+ws/metafile.yml | https://arxiv.org/abs/1903.10520 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/grid_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/grid_rcnn/grid_rcnn_x101_64x4d_fpn_gn-head_2x_coco/grid_rcnn_x101_64x4d_fpn_gn-head_2x_coco_20200204-ec76a754.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/grid_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/grid_rcnn/grid_rcnn_x101_32x4d_fpn_gn-head_2x_coco/grid_rcnn_x101_32x4d_fpn_gn-head_2x_coco_20200130-d8f0e3ff.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/grid_rcnn/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/grid_rcnn/grid_rcnn_r50_fpn_gn-head_2x_coco/grid_rcnn_r50_fpn_gn-head_2x_coco_20200130-6cca8223.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/grid_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/grid_rcnn/grid_rcnn_r101_fpn_gn-head_2x_coco/grid_rcnn_r101_fpn_gn-head_2x_coco_20200309-d6eca030.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/grid_rcnn/metafile.yml | https://arxiv.org/abs/1906.05688 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/groie/metafile.yml | https://arxiv.org/abs/2004.13665 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/groie/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/groie/mask_rcnn_r50_fpn_syncbn-backbone_r4_gcb_c3-c5_groie_1x_coco/mask_rcnn_r50_fpn_syncbn-backbone_r4_gcb_c3-c5_groie_1x_coco_20200604_211715-42eb79e1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/groie/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/groie/mask_rcnn_r50_fpn_groie_1x_coco/mask_rcnn_r50_fpn_groie_1x_coco_20200604_211715-50d90c74.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/groie/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/groie/mask_rcnn_r101_fpn_syncbn-backbone_r4_gcb_c3-c5_groie_1x_coco/mask_rcnn_r101_fpn_syncbn-backbone_r4_gcb_c3-c5_groie_1x_coco_20200607_224507-8daae01c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/groie/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/groie/grid_rcnn_r50_fpn_gn-head_groie_1x_coco/grid_rcnn_r50_fpn_gn-head_groie_1x_coco_20200605_202059-4b75d86f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/groie/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/groie/faster_rcnn_r50_fpn_groie_1x_coco/faster_rcnn_r50_fpn_groie_1x_coco_20200604_211715-66ee9516.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/guided_anchoring/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/guided_anchoring/ga_rpn_x101_64x4d_fpn_1x_coco/ga_rpn_x101_64x4d_fpn_1x_coco_20200225-3c6e1aa2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/guided_anchoring/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/guided_anchoring/ga_rpn_x101_32x4d_fpn_1x_coco/ga_rpn_x101_32x4d_fpn_1x_coco_20200220-c28d1b18.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/guided_anchoring/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/guided_anchoring/ga_rpn_r50_caffe_fpn_1x_coco/ga_rpn_r50_caffe_fpn_1x_coco_20200531-899008a6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/guided_anchoring/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/guided_anchoring/ga_rpn_r101_caffe_fpn_1x_coco/ga_rpn_r101_caffe_fpn_1x_coco_20200531-ca9ba8fb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/guided_anchoring/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/guided_anchoring/ga_retinanet_x101_64x4d_fpn_1x_coco/ga_retinanet_x101_64x4d_fpn_1x_coco_20200226-ef9f7f1f.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/guided_anchoring/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/guided_anchoring/ga_retinanet_x101_32x4d_fpn_1x_coco/ga_retinanet_x101_32x4d_fpn_1x_coco_20200219-40c56caa.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/guided_anchoring/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/guided_anchoring/ga_retinanet_r50_caffe_fpn_1x_coco/ga_retinanet_r50_caffe_fpn_1x_coco_20201020-39581c6f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/guided_anchoring/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/guided_anchoring/ga_retinanet_r101_caffe_fpn_1x_coco/ga_retinanet_r101_caffe_fpn_1x_coco_20200531-6266453c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/guided_anchoring/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/guided_anchoring/ga_faster_x101_64x4d_fpn_1x_coco/ga_faster_x101_64x4d_fpn_1x_coco_20200215-0fa7bde7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/guided_anchoring/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/guided_anchoring/ga_faster_x101_32x4d_fpn_1x_coco/ga_faster_x101_32x4d_fpn_1x_coco_20200215-1ded9da3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/guided_anchoring/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/guided_anchoring/ga_faster_r50_caffe_fpn_1x_coco/ga_faster_r50_caffe_fpn_1x_coco_20200702_000718-a11ccfe6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/guided_anchoring/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/guided_anchoring/ga_faster_r101_caffe_fpn_1x_coco/ga_faster_r101_caffe_fpn_1x_coco_bbox_mAP-0.415_20200505_115528-fb82e499.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/guided_anchoring/metafile.yml | https://arxiv.org/abs/1901.03278 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://arxiv.org/abs/1904.04514 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/mask_rcnn_hrnetv2p_w40_2x_coco/mask_rcnn_hrnetv2p_w40_2x_coco_20200512_163732-aed5e4ab.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/mask_rcnn_hrnetv2p_w40_1x_coco/mask_rcnn_hrnetv2p_w40_1x_coco_20200511_015646-66738b35.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/mask_rcnn_hrnetv2p_w32_2x_coco/mask_rcnn_hrnetv2p_w32_2x_coco_20200213-45b75b4d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/mask_rcnn_hrnetv2p_w32_1x_coco/mask_rcnn_hrnetv2p_w32_1x_coco_20200207-b29f616e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/mask_rcnn_hrnetv2p_w18_2x_coco/mask_rcnn_hrnetv2p_w18_2x_coco_20200212-b3c825b1.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/mask_rcnn_hrnetv2p_w18_1x_coco/mask_rcnn_hrnetv2p_w18_1x_coco_20200205-1c3d78ed.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/htc_hrnetv2p_w40_20e_coco/htc_hrnetv2p_w40_20e_coco_20200529_183411-417c4d5b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/htc_hrnetv2p_w32_20e_coco/htc_hrnetv2p_w32_20e_coco_20200207-7639fa12.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/htc_hrnetv2p_w18_20e_coco/htc_hrnetv2p_w18_20e_coco_20200210-b266988c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/fcos_hrnetv2p_w40_gn-head_mstrain_640-800_4x4_2x_coco/fcos_hrnetv2p_w40_gn-head_mstrain_640-800_4x4_2x_coco_20201212_124752-f22d2ce5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/fcos_hrnetv2p_w32_gn-head_mstrain_640-800_4x4_2x_coco/fcos_hrnetv2p_w32_gn-head_mstrain_640-800_4x4_2x_coco_20201212_090846-b6f2b49f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/fcos_hrnetv2p_w32_gn-head_4x4_2x_coco/fcos_hrnetv2p_w32_gn-head_4x4_2x_coco_20201212_112133-77b6b9bb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/fcos_hrnetv2p_w32_gn-head_4x4_1x_coco/fcos_hrnetv2p_w32_gn-head_4x4_1x_coco_20201211_134730-cb8055c0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/fcos_hrnetv2p_w18_gn-head_mstrain_640-800_4x4_2x_coco/fcos_hrnetv2p_w18_gn-head_mstrain_640-800_4x4_2x_coco_20201212_111651-441e9d9f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/fcos_hrnetv2p_w18_gn-head_4x4_2x_coco/fcos_hrnetv2p_w18_gn-head_4x4_2x_coco_20201212_101110-5c575fa5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/fcos_hrnetv2p_w18_gn-head_4x4_1x_coco/fcos_hrnetv2p_w18_gn-head_4x4_1x_coco_20201212_100710-4ad151de.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/faster_rcnn_hrnetv2p_w40_2x_coco/faster_rcnn_hrnetv2p_w40_2x_coco_20200512_161033-0f236ef4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/faster_rcnn_hrnetv2p_w40_1x_coco/faster_rcnn_hrnetv2p_w40_1x_coco_20200210-95c1f5ce.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/hrnet/faster_rcnn_hrnetv2p_w32_2x_coco/faster_rcnn_hrnetv2p_w32_2x_coco_20200529_015927-976a9c15.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/faster_rcnn_hrnetv2p_w32_1x_coco/faster_rcnn_hrnetv2p_w32_1x_coco_20200130-6e286425.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/faster_rcnn_hrnetv2p_w18_2x_coco/faster_rcnn_hrnetv2p_w18_2x_coco_20200702_085731-a4ec0611.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/faster_rcnn_hrnetv2p_w18_1x_coco/faster_rcnn_hrnetv2p_w18_1x_coco_20200130-56651a6d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/cascade_rcnn_hrnetv2p_w40_20e_coco/cascade_rcnn_hrnetv2p_w40_20e_coco_20200512_161112-75e47b04.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/cascade_rcnn_hrnetv2p_w32_20e_coco/cascade_rcnn_hrnetv2p_w32_20e_coco_20200208-928455a4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/cascade_rcnn_hrnetv2p_w18_20e_coco/cascade_rcnn_hrnetv2p_w18_20e_coco_20200210-434be9d7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/cascade_mask_rcnn_hrnetv2p_w40_20e_coco/cascade_mask_rcnn_hrnetv2p_w40_20e_coco_20200527_204922-969c4610.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/cascade_mask_rcnn_hrnetv2p_w32_20e_coco/cascade_mask_rcnn_hrnetv2p_w32_20e_coco_20200512_154043-39d9cf7b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/cascade_mask_rcnn_hrnetv2p_w18_20e_coco/cascade_mask_rcnn_hrnetv2p_w18_20e_coco_20200210-b543cd2b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/htc/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/htc/htc_x101_64x4d_fpn_dconv_c3-c5_mstrain_400_1400_16x1_20e_coco/htc_x101_64x4d_fpn_dconv_c3-c5_mstrain_400_1400_16x1_20e_coco_20200312-946fd751.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/htc/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/htc/htc_x101_64x4d_fpn_16x1_20e_coco/htc_x101_64x4d_fpn_16x1_20e_coco_20200318-b181fd7a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/htc/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/htc/htc_x101_32x4d_fpn_16x1_20e_coco/htc_x101_32x4d_fpn_16x1_20e_coco_20200318-de97ae01.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/htc/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/htc/htc_r50_fpn_20e_coco/htc_r50_fpn_20e_coco_20200319-fe28c577.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/htc/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/htc/htc_r50_fpn_1x_coco/htc_r50_fpn_1x_coco_20200317-7332cf16.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/htc/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/htc/htc_r101_fpn_20e_coco/htc_r101_fpn_20e_coco_20200317-9b41b48f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/htc/metafile.yml | https://arxiv.org/abs/1901.07518 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/instaboost/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/instaboost/mask_rcnn_x101_64x4d_fpn_instaboost_4x_coco/mask_rcnn_x101_64x4d_fpn_instaboost_4x_coco_20200515_080947-8ed58c1b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/instaboost/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/instaboost/mask_rcnn_r50_fpn_instaboost_4x_coco/mask_rcnn_r50_fpn_instaboost_4x_coco_20200307-d025f83a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/instaboost/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/instaboost/mask_rcnn_r101_fpn_instaboost_4x_coco/mask_rcnn_r101_fpn_instaboost_4x_coco_20200703_235738-f23f3a5f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/instaboost/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/instaboost/cascade_mask_rcnn_r50_fpn_instaboost_4x_coco/cascade_mask_rcnn_r50_fpn_instaboost_4x_coco_20200307-c19d98d9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/instaboost/metafile.yml | https://arxiv.org/abs/1908.07801 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/lad/lad_r101_paa_r50_fpn_coco_1x.py | https://download.openmmlab.com/mmdetection/v2.0/paa/paa_r50_fpn_1x_coco/paa_r50_fpn_1x_coco_20200821-936edec3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/lad/lad_r50_paa_r101_fpn_coco_1x.py | http://download.openmmlab.com/mmdetection/v2.0/paa/paa_r101_fpn_1x_coco/paa_r101_fpn_1x_coco_20200821-0a1825a4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/lad/metafile.yml | https://arxiv.org/abs/2108.10520 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/lad/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/lad/lad_r50_paa_r101_fpn_coco_1x/lad_r50_paa_r101_fpn_coco_1x_20220708_124246-74c76ff0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/lad/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/lad/lad_r101_paa_r50_fpn_coco_1x/lad_r101_paa_r50_fpn_coco_1x_20220708_124357-9407ac54.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/ld/ld_r101_gflv1_r101dcn_fpn_coco_2x.py | https://download.openmmlab.com/mmdetection/v2.0/gfl/gfl_r101_fpn_dconv_c3-c5_mstrain_2x_coco/gfl_r101_fpn_dconv_c3-c5_mstrain_2x_coco_20200630_102002-134b07df.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/ld/ld_r18_gflv1_r101_fpn_coco_1x.py | https://download.openmmlab.com/mmdetection/v2.0/gfl/gfl_r101_fpn_mstrain_2x_coco/gfl_r101_fpn_mstrain_2x_coco_20200629_200126-dd12f847.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/ld/metafile.yml | https://arxiv.org/abs/2102.12252 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/ld/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/ld/ld_r50_gflv1_r101_fpn_coco_1x/ld_r50_gflv1_r101_fpn_coco_1x_20220629_145355-8dc5bad8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/ld/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/ld/ld_r34_gflv1_r101_fpn_coco_1x/ld_r34_gflv1_r101_fpn_coco_1x_20220630_134007-9bc69413.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/ld/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/ld/ld_r18_gflv1_r101_fpn_coco_1x/ld_r18_gflv1_r101_fpn_coco_1x_20220702_062206-330e6332.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/ld/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/ld/ld_r101_gflv1_r101dcn_fpn_coco_2x/ld_r101_gflv1_r101dcn_fpn_coco_2x_20220629_185920-9e658426.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/libra_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/libra_rcnn/libra_retinanet_r50_fpn_1x_coco/libra_retinanet_r50_fpn_1x_coco_20200205-804d94ce.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/libra_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/libra_rcnn/libra_faster_rcnn_x101_64x4d_fpn_1x_coco/libra_faster_rcnn_x101_64x4d_fpn_1x_coco_20200315-3a7d0488.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/libra_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/libra_rcnn/libra_faster_rcnn_r50_fpn_1x_coco/libra_faster_rcnn_r50_fpn_1x_coco_20200130-3afee3a9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/libra_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/libra_rcnn/libra_faster_rcnn_r101_fpn_1x_coco/libra_faster_rcnn_r101_fpn_1x_coco_20200203-8dba6a5a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/libra_rcnn/metafile.yml | https://arxiv.org/abs/1904.02701 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_x101_64x4d_fpn_mstrain-poly_3x_coco/mask_rcnn_x101_64x4d_fpn_mstrain-poly_3x_coco_20210526_120447-c376f129.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_x101_64x4d_fpn_2x_coco/mask_rcnn_x101_64x4d_fpn_2x_coco_20200509_224208-39d6f70c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_x101_64x4d_fpn_1x_coco/mask_rcnn_x101_64x4d_fpn_1x_coco_20200201-9352eb0d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_x101_32x8d_fpn_mstrain-poly_3x_coco/mask_rcnn_x101_32x8d_fpn_mstrain-poly_3x_coco_20210607_161042-8bd2c639.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/mask_rcnn/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_x101_32x8d_fpn_mstrain-poly_1x_coco/mask_rcnn_x101_32x8d_fpn_mstrain-poly_1x_coco_20220630_170346-b4637974.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_x101_32x8d_fpn_1x_coco/mask_rcnn_x101_32x8d_fpn_1x_coco_20220630_173841-0aaf329e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_x101_32x4d_fpn_mstrain-poly_3x_coco/mask_rcnn_x101_32x4d_fpn_mstrain-poly_3x_coco_20210524_201410-abcd7859.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_x101_32x4d_fpn_2x_coco/mask_rcnn_x101_32x4d_fpn_2x_coco_bbox_mAP-0.422__segm_mAP-0.378_20200506_004702-faef898c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_x101_32x4d_fpn_1x_coco/mask_rcnn_x101_32x4d_fpn_1x_coco_20200205-478d0b67.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r50_fpn_mstrain-poly_3x_coco/mask_rcnn_r50_fpn_mstrain-poly_3x_coco_20210524_201154-21b550bb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r50_fpn_2x_coco/mask_rcnn_r50_fpn_2x_coco_bbox_mAP-0.392__segm_mAP-0.354_20200505_003907-3e542a40.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r50_fpn_1x_coco/mask_rcnn_r50_fpn_1x_coco_20200205-d4b0c5d6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r50_caffe_fpn_mstrain-poly_3x_coco/mask_rcnn_r50_caffe_fpn_mstrain-poly_3x_coco_bbox_mAP-0.408__segm_mAP-0.37_20200504_163245-42aa3d00.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r50_caffe_fpn_mstrain-poly_2x_coco/mask_rcnn_r50_caffe_fpn_mstrain-poly_2x_coco_bbox_mAP-0.403__segm_mAP-0.365_20200504_231822-a75c98ce.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r50_caffe_fpn_1x_coco/mask_rcnn_r50_caffe_fpn_1x_coco_bbox_mAP-0.38__segm_mAP-0.344_20200504_231812-0ebd1859.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r101_fpn_mstrain-poly_3x_coco/mask_rcnn_r101_fpn_mstrain-poly_3x_coco_20210524_200244-5675c317.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/mask_rcnn/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r101_fpn_2x_coco/mask_rcnn_r101_fpn_2x_coco_bbox_mAP-0.408__segm_mAP-0.366_20200505_071027-14b391c7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r101_fpn_1x_coco/mask_rcnn_r101_fpn_1x_coco_20200204-1efe0ed5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r101_caffe_fpn_mstrain-poly_3x_coco/mask_rcnn_r101_caffe_fpn_mstrain-poly_3x_coco_20210526_132339-3c33ce02.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r101_caffe_fpn_1x_coco/mask_rcnn_r101_caffe_fpn_1x_coco_20200601_095758-805e06c1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/mask_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/fp16/mask_rcnn_r50_fpn_fp16_1x_coco/mask_rcnn_r50_fpn_fp16_1x_coco_20200205-59faf7e4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/mask_rcnn/metafile.yml | https://arxiv.org/abs/1703.06870v3 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/mask2former/metafile.yml | https://arxiv.org/pdf/2112.01527 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/mask2former/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask2former/mask2former_swin-t-p4-w7-224_lsj_8x2_50e_coco-panoptic/mask2former_swin-t-p4-w7-224_lsj_8x2_50e_coco-panoptic_20220326_224553-fc567107.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/mask2former/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask2former/mask2former_swin-t-p4-w7-224_lsj_8x2_50e_coco/mask2former_swin-t-p4-w7-224_lsj_8x2_50e_coco_20220508_091649-4a943037.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/mask2former/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask2former/mask2former_swin-s-p4-w7-224_lsj_8x2_50e_coco-panoptic/mask2former_swin-s-p4-w7-224_lsj_8x2_50e_coco-panoptic_20220329_225200-c7b94355.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/mask2former/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask2former/mask2former_swin-s-p4-w7-224_lsj_8x2_50e_coco/mask2former_swin-s-p4-w7-224_lsj_8x2_50e_coco_20220504_001756-743b7d99.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/mask2former/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask2former/mask2former_swin-l-p4-w12-384-in21k_lsj_16x1_100e_coco-panoptic/mask2former_swin-l-p4-w12-384-in21k_lsj_16x1_100e_coco-panoptic_20220407_104949-d4919c44.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/mask2former/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask2former/mask2former_swin-b-p4-w12-384-in21k_lsj_8x2_50e_coco-panoptic/mask2former_swin-b-p4-w12-384-in21k_lsj_8x2_50e_coco-panoptic_20220329_230021-3bb8b482.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/mask2former/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/mask2former/mask2former_swin-b-p4-w12-384_lsj_8x2_50e_coco-panoptic/mask2former_swin-b-p4-w12-384_lsj_8x2_50e_coco-panoptic_20220331_002244-c149a9e9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/mask2former/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask2former/mask2former_r50_lsj_8x2_50e_coco-panoptic/mask2former_r50_lsj_8x2_50e_coco-panoptic_20220326_224516-11a44721.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/mask2former/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask2former/mask2former_r50_lsj_8x2_50e_coco/mask2former_r50_lsj_8x2_50e_coco_20220506_191028-8e96e88b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/mask2former/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask2former/mask2former_r101_lsj_8x2_50e_coco-panoptic/mask2former_r101_lsj_8x2_50e_coco-panoptic_20220329_225104-c54e64c9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/mask2former/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/mask2former/mask2former_r101_lsj_8x2_50e_coco/mask2former_r101_lsj_8x2_50e_coco_20220426_100250-c50b6fa6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/maskformer/metafile.yml | https://arxiv.org/pdf/2107.06278 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/maskformer/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/maskformer/maskformer_swin-l-p4-w12_mstrain_64x1_300e_coco/maskformer_swin-l-p4-w12_mstrain_64x1_300e_coco_20220326_221612-061b4eb8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/maskformer/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/maskformer/maskformer_r50_mstrain_16x1_75e_coco/maskformer_r50_mstrain_16x1_75e_coco_20220221_141956-bc2699cb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/ms_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/ms_rcnn/ms_rcnn_x101_64x4d_fpn_2x_coco/ms_rcnn_x101_64x4d_fpn_2x_coco_20200308-02a445e2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/ms_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/ms_rcnn/ms_rcnn_x101_64x4d_fpn_1x_coco/ms_rcnn_x101_64x4d_fpn_1x_coco_20200206-86ba88d2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/ms_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/ms_rcnn/ms_rcnn_x101_32x4d_fpn_1x_coco/ms_rcnn_x101_32x4d_fpn_1x_coco_20200206-81fd1740.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/ms_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/ms_rcnn/ms_rcnn_r50_caffe_fpn_2x_coco/ms_rcnn_r50_caffe_fpn_2x_coco_bbox_mAP-0.388__segm_mAP-0.363_20200506_004738-ee87b137.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/ms_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/ms_rcnn/ms_rcnn_r50_caffe_fpn_1x_coco/ms_rcnn_r50_caffe_fpn_1x_coco_20200702_180848-61c9355e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/ms_rcnn/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/ms_rcnn/ms_rcnn_r101_caffe_fpn_2x_coco/ms_rcnn_r101_caffe_fpn_2x_coco_bbox_mAP-0.411__segm_mAP-0.381_20200506_011134-5f3cc74f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/ms_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/ms_rcnn/ms_rcnn_r101_caffe_fpn_1x_coco/ms_rcnn_r101_caffe_fpn_1x_coco_bbox_mAP-0.404__segm_mAP-0.376_20200506_004755-b9b12a37.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/ms_rcnn/metafile.yml | https://arxiv.org/abs/1903.00241 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/nas_fcos/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/nas_fcos/nas_fcos_nashead_r50_caffe_fpn_gn-head_4x4_1x_coco/nas_fcos_nashead_r50_caffe_fpn_gn-head_4x4_1x_coco_20200520-1bdba3ce.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/nas_fcos/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/nas_fcos/nas_fcos_fcoshead_r50_caffe_fpn_gn-head_4x4_1x_coco/nas_fcos_fcoshead_r50_caffe_fpn_gn-head_4x4_1x_coco_20200521-7fdcbce0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/nas_fcos/metafile.yml | https://arxiv.org/abs/1906.04423 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/nas_fpn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/nas_fpn/retinanet_r50_nasfpn_crop640_50e_coco/retinanet_r50_nasfpn_crop640_50e_coco-0ad1f644.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/nas_fpn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/nas_fpn/retinanet_r50_fpn_crop640_50e_coco/retinanet_r50_fpn_crop640_50e_coco-9b953d76.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/nas_fpn/metafile.yml | https://arxiv.org/abs/1904.07392 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/openimages/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/openimages/ssd300_32x8_36e_openimages/ssd300_32x8_36e_openimages_20211224_000232-dce93846.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/openimages/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/openimages/retinanet_r50_fpn_32x2_1x_openimages/retinanet_r50_fpn_32x2_1x_openimages_20211223_071954-d2ae5462.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/openimages/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/openimages/faster_rcnn_r50_fpn_32x2_cas_1x_openimages_challenge/faster_rcnn_r50_fpn_32x2_cas_1x_openimages_challenge_20220221_192021-34c402d9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/openimages/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/openimages/faster_rcnn_r50_fpn_32x2_cas_1x_openimages/faster_rcnn_r50_fpn_32x2_cas_1x_openimages_20220306_202424-98c630e5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/openimages/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/openimages/faster_rcnn_r50_fpn_32x2_1x_openimages_challenge/faster_rcnn_r50_fpn_32x2_1x_openimages_challenge_20220114_045100-0e79e5df.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/openimages/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/openimages/faster_rcnn_r50_fpn_32x2_1x_openimages/faster_rcnn_r50_fpn_32x2_1x_openimages_20211130_231159-e87ab7ce.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/paa/metafile.yml | https://arxiv.org/abs/2007.08103 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/paa/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/paa/paa_r50_fpn_mstrain_3x_coco/paa_r50_fpn_mstrain_3x_coco_20210121_145722-06a6880b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/paa/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/paa/paa_r50_fpn_2x_coco/paa_r50_fpn_2x_coco_20200821-c98bfc4e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/paa/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/paa/paa_r50_fpn_1x_coco/paa_r50_fpn_1x_coco_20200821-936edec3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/paa/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/paa/paa_r50_fpn_1.5x_coco/paa_r50_fpn_1.5x_coco_20200823-805d6078.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/paa/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/paa/paa_r101_fpn_mstrain_3x_coco/paa_r101_fpn_mstrain_3x_coco_20210122_084202-83250d22.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/paa/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/paa/paa_r101_fpn_2x_coco/paa_r101_fpn_2x_coco_20200821-6829f96b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/paa/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/paa/paa_r101_fpn_1x_coco/paa_r101_fpn_1x_coco_20200821-0a1825a4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/pafpn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/pafpn/faster_rcnn_r50_pafpn_1x_coco/faster_rcnn_r50_pafpn_1x_coco_bbox_mAP-0.375_20200503_105836-b7b4b9bd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/pafpn/metafile.yml | https://arxiv.org/abs/1803.01534 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/panoptic_fpn/metafile.yml | https://arxiv.org/pdf/1901.02446 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/panoptic_fpn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/panoptic_fpn/panoptic_fpn_r50_fpn_mstrain_3x_coco/panoptic_fpn_r50_fpn_mstrain_3x_coco_20210824_171155-5650f98b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/panoptic_fpn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/panoptic_fpn/panoptic_fpn_r50_fpn_1x_coco/panoptic_fpn_r50_fpn_1x_coco_20210821_101153-9668fd13.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/panoptic_fpn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/panoptic_fpn/panoptic_fpn_r101_fpn_mstrain_3x_coco/panoptic_fpn_r101_fpn_mstrain_3x_coco_20210823_114712-9c99acc4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/panoptic_fpn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/panoptic_fpn/panoptic_fpn_r101_fpn_1x_coco/panoptic_fpn_r101_fpn_1x_coco_20210820_193950-ab9157a2.pth | 权重地址 | 
+| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/pisa/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/pisa/pisa_ssd512_coco/pisa_ssd512_coco-247addee.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/pisa/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/pisa/pisa_ssd300_coco/pisa_ssd300_coco-710e3ac9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/pisa/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/pisa/pisa_retinanet_x101_32x4d_fpn_1x_coco/pisa_retinanet_x101_32x4d_fpn_1x_coco-a0c13c73.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/pisa/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/pisa/pisa_retinanet_r50_fpn_1x_coco/pisa_retinanet_r50_fpn_1x_coco-76409952.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/pisa/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/pisa/pisa_mask_rcnn_r50_fpn_1x_coco/pisa_mask_rcnn_r50_fpn_1x_coco-dfcedba6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/pisa/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/pisa/pisa_faster_rcnn_x101_32x4d_fpn_1x_coco/pisa_faster_rcnn_x101_32x4d_fpn_1x_coco-e4accec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/pisa/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/pisa/pisa_faster_rcnn_r50_fpn_1x_coco/pisa_faster_rcnn_r50_fpn_1x_coco-dea93523.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/pisa/metafile.yml | https://arxiv.org/abs/1904.04821 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/point_rend/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/point_rend/point_rend_r50_caffe_fpn_mstrain_3x_coco/point_rend_r50_caffe_fpn_mstrain_3x_coco-e0ebb6b7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/point_rend/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/point_rend/point_rend_r50_caffe_fpn_mstrain_1x_coco/point_rend_r50_caffe_fpn_mstrain_1x_coco-1bcb5fb4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/point_rend/metafile.yml | https://arxiv.org/abs/1912.08193 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/pvt/metafile.yml | https://arxiv.org/abs/2106.13797 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/pvt/metafile.yml | https://arxiv.org/abs/2102.12122 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/pvt/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/pvt/retinanet_pvtv2-b5_fpn_1x_coco/retinanet_pvtv2-b5_fpn_1x_coco_20210902_201800-3420eb57.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/pvt/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/pvt/retinanet_pvtv2-b4_fpn_1x_coco/retinanet_pvtv2-b4_fpn_1x_coco_20210901_170151-83795c86.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/pvt/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/pvt/retinanet_pvtv2-b3_fpn_1x_coco/retinanet_pvtv2-b3_fpn_1x_coco_20210903_151512-8357deff.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/pvt/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/pvt/retinanet_pvtv2-b2_fpn_1x_coco/retinanet_pvtv2-b2_fpn_1x_coco_20210901_174843-529f0b9a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/pvt/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/pvt/retinanet_pvtv2-b1_fpn_1x_coco/retinanet_pvtv2-b1_fpn_1x_coco_20210831_103318-7e169a7d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/pvt/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/pvt/retinanet_pvtv2-b0_fpn_1x_coco/retinanet_pvtv2-b0_fpn_1x_coco_20210831_103157-13e9aabe.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/pvt/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/pvt/retinanet_pvt-t_fpn_1x_coco/retinanet_pvt-t_fpn_1x_coco_20210831_103110-17b566bd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/pvt/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/pvt/retinanet_pvt-s_fpn_1x_coco/retinanet_pvt-s_fpn_1x_coco_20210906_142921-b6c94a5b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/pvt/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/pvt/retinanet_pvt-m_fpn_1x_coco/retinanet_pvt-m_fpn_1x_coco_20210831_103243-55effa1b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/queryinst/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/queryinst/queryinst_r50_fpn_mstrain_480-800_3x_coco/queryinst_r50_fpn_mstrain_480-800_3x_coco_20210901_103643-7837af86.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/queryinst/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/queryinst/queryinst_r50_fpn_300_proposals_crop_mstrain_480-800_3x_coco/queryinst_r50_fpn_300_proposals_crop_mstrain_480-800_3x_coco_20210904_101802-85cffbd8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/queryinst/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/queryinst/queryinst_r50_fpn_1x_coco/queryinst_r50_fpn_1x_coco_20210907_084916-5a8f1998.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/queryinst/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/queryinst/queryinst_r101_fpn_mstrain_480-800_3x_coco/queryinst_r101_fpn_mstrain_480-800_3x_coco_20210904_104048-91f9995b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/queryinst/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/queryinst/queryinst_r101_fpn_300_proposals_crop_mstrain_480-800_3x_coco/queryinst_r101_fpn_300_proposals_crop_mstrain_480-800_3x_coco_20210904_153621-76cce59f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/queryinst/metafile.yml | https://openaccess.thecvf.com/content/ICCV2021/papers/Fang_Instances_As_Queries_ICCV_2021_paper.pdf | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/regnet/metafile.yml | https://arxiv.org/abs/2003.13678 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/retinanet_regnetx-800MF_fpn_1x_coco/retinanet_regnetx-800MF_fpn_1x_coco_20200517_191403-f6f91d10.pth | 权重地址 | 
+| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/retinanet_regnetx-3.2GF_fpn_1x_coco/retinanet_regnetx-3.2GF_fpn_1x_coco_20200520_163141-cb1509e8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/retinanet_regnetx-1.6GF_fpn_1x_coco/retinanet_regnetx-1.6GF_fpn_1x_coco_20200517_191403-37009a9d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/mask_rcnn_regnetx-8GF_fpn_1x_coco/mask_rcnn_regnetx-8GF_fpn_1x_coco_20200517_180515-09daa87e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/mask_rcnn_regnetx-800MF_fpn_mstrain-poly_3x_coco/mask_rcnn_regnetx-800MF_fpn_mstrain-poly_3x_coco_20210602_210641-715d51f5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/mask_rcnn_regnetx-6.4GF_fpn_1x_coco/mask_rcnn_regnetx-6.4GF_fpn_1x_coco_20200517_180439-3a7aae83.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/mask_rcnn_regnetx-4GF_fpn_mstrain-poly_3x_coco/mask_rcnn_regnetx-4GF_fpn_mstrain-poly_3x_coco_20210602_032621-00f0331c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/mask_rcnn_regnetx-4GF_fpn_1x_coco/mask_rcnn_regnetx-4GF_fpn_1x_coco_20200517_180217-32e9c92d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/mask_rcnn_regnetx-400MF_fpn_mstrain-poly_3x_coco/mask_rcnn_regnetx-400MF_fpn_mstrain-poly_3x_coco_20210601_235443-8aac57a4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/mask_rcnn_regnetx-3.2GF_fpn_mstrain_3x_coco/mask_rcnn_regnetx-3.2GF_fpn_mstrain_3x_coco_20200521_202221-99879813.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/mask_rcnn_regnetx-3.2GF_fpn_mstrain_3x_coco/mask_rcnn_regnetx-3.2GF_fpn_mstrain_3x_coco_20200521_202221-99879813.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/mask_rcnn_regnetx-3.2GF_fpn_mdconv_c3-c5_1x_coco/mask_rcnn_regnetx-3.2GF_fpn_mdconv_c3-c5_1x_coco_20200520_172726-75f40794.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/mask_rcnn_regnetx-3.2GF_fpn_1x_coco/mask_rcnn_regnetx-3.2GF_fpn_1x_coco_20200520_163141-2a9d1814.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/regnet/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/regnet/mask_rcnn_regnetx-12GF_fpn_1x_coco/mask_rcnn_regnetx-12GF_fpn_1x_coco_20200517_180552-b538bd8b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/mask_rcnn_regnetx-1.6GF_fpn_mstrain-poly_3x_coco/mask_rcnn_regnetx-1_20210602_210641-6764cff5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/faster_rcnn_regnetx-800MF_fpn_mstrain_3x_coco/faster_rcnn_regnetx-800MF_fpn_mstrain_3x_coco_20210526_095118-a2c70b20.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/faster_rcnn_regnetx-4GF_fpn_mstrain_3x_coco/faster_rcnn_regnetx-4GF_fpn_mstrain_3x_coco_20210526_095201-65eaf841.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/faster_rcnn_regnetx-400MF_fpn_mstrain_3x_coco/faster_rcnn_regnetx-400MF_fpn_mstrain_3x_coco_20210526_095112-e1967c37.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/faster_rcnn_regnetx-3.2GF_fpn_mstrain_3x_coco/faster_rcnn_regnetx-3_20210526_095152-e16a5227.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/faster_rcnn_regnetx-3.2GF_fpn_2x_coco/faster_rcnn_regnetx-3.2GF_fpn_2x_coco_20200520_223955-e2081918.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/faster_rcnn_regnetx-3.2GF_fpn_1x_coco/faster_rcnn_regnetx-3.2GF_fpn_1x_coco_20200517_175927-126fd9bf.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/faster_rcnn_regnetx-1.6GF_fpn_mstrain_3x_coco/faster_rcnn_regnetx-1_20210526_095325-94aa46cc.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/cascade_mask_rcnn_regnetx-800MF_fpn_mstrain_3x_coco/cascade_mask_rcnn_regnetx-800MF_fpn_mstrain_3x_coco_20210715_211616-dcbd13f4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/cascade_mask_rcnn_regnetx-4GF_fpn_mstrain_3x_coco/cascade_mask_rcnn_regnetx-4GF_fpn_mstrain_3x_coco_20210715_212034-cbb1be4c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/cascade_mask_rcnn_regnetx-400MF_fpn_mstrain_3x_coco/cascade_mask_rcnn_regnetx-400MF_fpn_mstrain_3x_coco_20210715_211619-5142f449.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/cascade_mask_rcnn_regnetx-3.2GF_fpn_mstrain_3x_coco/cascade_mask_rcnn_regnetx-3_20210715_211616-b9c2c58b.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/regnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/regnet/cascade_mask_rcnn_regnetx-1.6GF_fpn_mstrain_3x_coco/cascade_mask_rcnn_regnetx-1_20210715_211616-75f29a61.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/reppoints/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/reppoints/reppoints_moment_x101_fpn_dconv_c3-c5_gn-neck%2Bhead_2x_coco/reppoints_moment_x101_fpn_dconv_c3-c5_gn-neck%2Bhead_2x_coco_20200329-f87da1ea.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/reppoints/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/reppoints/reppoints_moment_r50_fpn_gn-neck%2Bhead_2x_coco/reppoints_moment_r50_fpn_gn-neck%2Bhead_2x_coco_20200329-91babaa2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/reppoints/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/reppoints/reppoints_moment_r50_fpn_gn-neck%2Bhead_1x_coco/reppoints_moment_r50_fpn_gn-neck%2Bhead_1x_coco_20200329_145952-3e51b550.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/reppoints/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/reppoints/reppoints_moment_r50_fpn_1x_coco/reppoints_moment_r50_fpn_1x_coco_20200330-b73db8d1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/reppoints/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/reppoints/reppoints_moment_r101_fpn_gn-neck%2Bhead_2x_coco/reppoints_moment_r101_fpn_gn-neck%2Bhead_2x_coco_20200329-4fbc7310.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/reppoints/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/reppoints/reppoints_moment_r101_fpn_dconv_c3-c5_gn-neck%2Bhead_2x_coco/reppoints_moment_r101_fpn_dconv_c3-c5_gn-neck%2Bhead_2x_coco_20200329-3309fbf2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/reppoints/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/reppoints/bbox_r50_grid_fpn_gn-neck%2Bhead_1x_coco/bbox_r50_grid_fpn_gn-neck%2Bhead_1x_coco_20200329_145916-0eedf8d1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/reppoints/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/reppoints/bbox_r50_grid_fpn_gn-neck%2Bhead_1x_coco/bbox_r50_grid_fpn_gn-neck%2Bhead_1x_coco_20200329_145916-0eedf8d1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/reppoints/metafile.yml | https://arxiv.org/abs/1904.11490 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/res2net/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/res2net/mask_rcnn_r2_101_fpn_2x_coco/mask_rcnn_r2_101_fpn_2x_coco-17f061e8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/res2net/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/res2net/htc_r2_101_fpn_20e_coco/htc_r2_101_fpn_20e_coco-3a8d2112.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/res2net/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/res2net/faster_rcnn_r2_101_fpn_2x_coco/faster_rcnn_r2_101_fpn_2x_coco-175f1da6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/res2net/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/res2net/cascade_rcnn_r2_101_fpn_20e_coco/cascade_rcnn_r2_101_fpn_20e_coco-f4b7b7db.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/res2net/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/res2net/cascade_mask_rcnn_r2_101_fpn_20e_coco/cascade_mask_rcnn_r2_101_fpn_20e_coco-8a7b41e1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/res2net/metafile.yml | https://arxiv.org/abs/1904.01169 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/res2net/metafile.yml | https://arxiv.org/abs/1904.01169 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/res2net/metafile.yml | https://arxiv.org/abs/1904.01169 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/res2net/metafile.yml | https://arxiv.org/abs/1904.01169 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/res2net/metafile.yml | https://arxiv.org/abs/1904.01169 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/resnest/metafile.yml | https://arxiv.org/abs/2004.08955 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/resnest/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/resnest/mask_rcnn_s50_fpn_syncbn-backbone%2Bhead_mstrain_1x_coco/mask_rcnn_s50_fpn_syncbn-backbone%2Bhead_mstrain_1x_coco_20200926_125503-8a2c3d47.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/resnest/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/resnest/mask_rcnn_s101_fpn_syncbn-backbone%2Bhead_mstrain_1x_coco/mask_rcnn_s101_fpn_syncbn-backbone%2Bhead_mstrain_1x_coco_20201005_215831-af60cdf9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/resnest/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/resnest/faster_rcnn_s50_fpn_syncbn-backbone%2Bhead_mstrain-range_1x_coco/faster_rcnn_s50_fpn_syncbn-backbone%2Bhead_mstrain-range_1x_coco_20200926_125502-20289c16.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/resnest/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/resnest/faster_rcnn_s101_fpn_syncbn-backbone%2Bhead_mstrain-range_1x_coco/faster_rcnn_s101_fpn_syncbn-backbone%2Bhead_mstrain-range_1x_coco_20201006_021058-421517f1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/resnest/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/resnest/cascade_rcnn_s50_fpn_syncbn-backbone%2Bhead_mstrain-range_1x_coco/cascade_rcnn_s50_fpn_syncbn-backbone%2Bhead_mstrain-range_1x_coco_20201122_213640-763cc7b5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/resnest/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/resnest/cascade_rcnn_s101_fpn_syncbn-backbone%2Bhead_mstrain-range_1x_coco/cascade_rcnn_s101_fpn_syncbn-backbone%2Bhead_mstrain-range_1x_coco_20201005_113242-b9459f8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/resnest/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/resnest/cascade_mask_rcnn_s50_fpn_syncbn-backbone%2Bhead_mstrain_1x_coco/cascade_mask_rcnn_s50_fpn_syncbn-backbone%2Bhead_mstrain_1x_coco_20201122_104428-99eca4c7.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/resnest/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/resnest/cascade_mask_rcnn_s101_fpn_syncbn-backbone%2Bhead_mstrain_1x_coco/cascade_mask_rcnn_s101_fpn_syncbn-backbone%2Bhead_mstrain_1x_coco_20201005_113243-42607475.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/resnet_strikes_back/cascade_mask_rcnn_r50_fpn_rsb-pretrain_1x_coco.py | https://download.openmmlab.com/mmclassification/v0/resnet/resnet50_8xb256-rsb-a1-600e_in1k_20211228-20e21305.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/resnet_strikes_back/faster_rcnn_r50_fpn_rsb-pretrain_1x_coco.py | https://download.openmmlab.com/mmclassification/v0/resnet/resnet50_8xb256-rsb-a1-600e_in1k_20211228-20e21305.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/resnet_strikes_back/mask_rcnn_r50_fpn_rsb-pretrain_1x_coco.py | https://download.openmmlab.com/mmclassification/v0/resnet/resnet50_8xb256-rsb-a1-600e_in1k_20211228-20e21305.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/resnet_strikes_back/metafile.yml | https://arxiv.org/abs/2110.00476 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/resnet_strikes_back/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/resnet_strikes_back/retinanet_r50_fpn_rsb-pretrain_1x_coco/retinanet_r50_fpn_rsb-pretrain_1x_coco_20220113_175432-bd24aae9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/resnet_strikes_back/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/resnet_strikes_back/mask_rcnn_r50_fpn_rsb-pretrain_1x_coco/mask_rcnn_r50_fpn_rsb-pretrain_1x_coco_20220113_174054-06ce8ba0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/resnet_strikes_back/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/resnet_strikes_back/faster_rcnn_r50_fpn_rsb-pretrain_1x_coco/faster_rcnn_r50_fpn_rsb-pretrain_1x_coco_20220113_162229-32ae82a9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/resnet_strikes_back/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/resnet_strikes_back/cascade_mask_rcnn_r50_fpn_rsb-pretrain_1x_coco/cascade_mask_rcnn_r50_fpn_rsb-pretrain_1x_coco_20220113_193636-8b9ad50f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/resnet_strikes_back/retinanet_r50_fpn_rsb-pretrain_1x_coco.py | https://download.openmmlab.com/mmclassification/v0/resnet/resnet50_8xb256-rsb-a1-600e_in1k_20211228-20e21305.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_x101_64x4d_fpn_mstrain_3x_coco/retinanet_x101_64x4d_fpn_mstrain_3x_coco_20210719_051838-022c2187.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_x101_64x4d_fpn_2x_coco/retinanet_x101_64x4d_fpn_2x_coco_20200131-bca068ab.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_x101_64x4d_fpn_1x_coco/retinanet_x101_64x4d_fpn_1x_coco_20200130-366f5af1.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_x101_32x4d_fpn_2x_coco/retinanet_x101_32x4d_fpn_2x_coco_20200131-237fc5e1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_x101_32x4d_fpn_1x_coco/retinanet_x101_32x4d_fpn_1x_coco_20200130-5c8b7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_r50_fpn_mstrain_3x_coco/retinanet_r50_fpn_mstrain_3x_coco_20210718_220633-88476508.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_r50_fpn_2x_coco/retinanet_r50_fpn_2x_coco_20200131-fdb43119.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_r50_fpn_1x_coco/retinanet_r50_fpn_1x_coco_20200130-c2398f9e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_r50_caffe_fpn_1x_coco/retinanet_r50_caffe_fpn_1x_coco_20200531-f11027c5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_r18_fpn_1x8_1x_coco/retinanet_r18_fpn_1x8_1x_coco_20220407_171255-4ea310d7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_r18_fpn_1x_coco/retinanet_r18_fpn_1x_coco_20220407_171055-614fd399.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_r101_fpn_mstrain_3x_coco/retinanet_r101_fpn_mstrain_3x_coco_20210720_214650-7ee888e0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_r101_fpn_2x_coco/retinanet_r101_fpn_2x_coco_20200131-5560aee8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_r101_fpn_1x_coco/retinanet_r101_fpn_1x_coco_20200130-7a93545f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_r101_caffe_fpn_mstrain_3x_coco/retinanet_r101_caffe_fpn_mstrain_3x_coco_20210721_063439-88a8a944.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/retinanet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_r101_caffe_fpn_1x_coco/retinanet_r101_caffe_fpn_1x_coco_20200531-b428fa0f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/retinanet/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/fp16/retinanet_r50_fpn_fp16_1x_coco/retinanet_r50_fpn_fp16_1x_coco_20200702-0dbfb212.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/retinanet/metafile.yml | https://arxiv.org/abs/1708.02002 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/sabl/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/sabl/sabl_retinanet_r50_fpn_gn_1x_coco/sabl_retinanet_r50_fpn_gn_1x_coco-e16dfcf1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/sabl/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/sabl/sabl_retinanet_r50_fpn_1x_coco/sabl_retinanet_r50_fpn_1x_coco-6c54fd4f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/sabl/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/sabl/sabl_retinanet_r101_fpn_gn_2x_ms_640_800_coco/sabl_retinanet_r101_fpn_gn_2x_ms_640_800_coco-1e63382c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/sabl/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/sabl/sabl_retinanet_r101_fpn_gn_2x_ms_480_960_coco/sabl_retinanet_r101_fpn_gn_2x_ms_480_960_coco-5342f857.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/sabl/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/sabl/sabl_retinanet_r101_fpn_gn_1x_coco/sabl_retinanet_r101_fpn_gn_1x_coco-40a893e8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/sabl/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/sabl/sabl_retinanet_r101_fpn_1x_coco/sabl_retinanet_r101_fpn_1x_coco-42026904.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/sabl/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/sabl/sabl_faster_rcnn_r50_fpn_1x_coco/sabl_faster_rcnn_r50_fpn_1x_coco-e867595b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/sabl/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/sabl/sabl_faster_rcnn_r101_fpn_1x_coco/sabl_faster_rcnn_r101_fpn_1x_coco-f804c6c1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/sabl/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/sabl/sabl_cascade_rcnn_r50_fpn_1x_coco/sabl_cascade_rcnn_r50_fpn_1x_coco-e1748e5e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/sabl/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/sabl/sabl_cascade_rcnn_r101_fpn_1x_coco/sabl_cascade_rcnn_r101_fpn_1x_coco-2b83e87c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/sabl/metafile.yml | https://arxiv.org/abs/1912.04260 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/scnet/metafile.yml | https://arxiv.org/abs/2012.10150 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/scnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/scnet/scnet_x101_64x4d_fpn_20e_coco/scnet_x101_64x4d_fpn_20e_coco-fb09dec9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/scnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/scnet/scnet_r50_fpn_20e_coco/scnet_r50_fpn_20e_coco-a569f645.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/scnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/scnet/scnet_r50_fpn_1x_coco/scnet_r50_fpn_1x_coco-c3f09857.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/scnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/scnet/scnet_r101_fpn_20e_coco/scnet_r101_fpn_20e_coco-294e312c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/scratch/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/scratch/mask_rcnn_r50_fpn_gn-all_scratch_6x_coco/scratch_mask_rcnn_r50_fpn_gn_6x_bbox_mAP-0.412__segm_mAP-0.374_20200201_193051-1e190a40.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/scratch/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/scratch/faster_rcnn_r50_fpn_gn-all_scratch_6x_coco/scratch_faster_rcnn_r50_fpn_gn_6x_bbox_mAP-0.407_20200201_193013-90813d01.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/scratch/metafile.yml | https://arxiv.org/abs/1811.08883 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/seesaw_loss/metafile.yml | https://arxiv.org/abs/2008.10032 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/seesaw_loss/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/seesaw_loss/mask_rcnn_r50_fpn_sample1e-3_seesaw_loss_normed_mask_mstrain_2x_lvis_v1-cd0f6a12.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/seesaw_loss/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/seesaw_loss/mask_rcnn_r50_fpn_sample1e-3_seesaw_loss_mstrain_2x_lvis_v1-392a804b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/seesaw_loss/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/seesaw_loss/mask_rcnn_r50_fpn_random_seesaw_loss_normed_mask_mstrain_2x_lvis_v1-a1c11314.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/seesaw_loss/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/seesaw_loss/mask_rcnn_r50_fpn_random_seesaw_loss_mstrain_2x_lvis_v1-a698dd3d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/seesaw_loss/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/seesaw_loss/mask_rcnn_r101_fpn_sample1e-3_seesaw_loss_normed_mask_mstrain_2x_lvis_v1-1d817139.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/seesaw_loss/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/seesaw_loss/mask_rcnn_r101_fpn_sample1e-3_seesaw_loss_mstrain_2x_lvis_v1-e68eb464.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/seesaw_loss/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/seesaw_loss/mask_rcnn_r101_fpn_random_seesaw_loss_normed_mask_mstrain_2x_lvis_v1-a0b59c42.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/seesaw_loss/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/seesaw_loss/mask_rcnn_r101_fpn_random_seesaw_loss_mstrain_2x_lvis_v1-8e6e6dd5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/seesaw_loss/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/seesaw_loss/cascade_mask_rcnn_r101_fpn_sample1e-3_seesaw_loss_normed_mask_mstrain_2x_lvis_v1-c8551505.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/seesaw_loss/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/seesaw_loss/cascade_mask_rcnn_r101_fpn_sample1e-3_seesaw_loss_mstrain_2x_lvis_v1-5d8ca2a4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/seesaw_loss/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/seesaw_loss/cascade_mask_rcnn_r101_fpn_random_seesaw_loss_normed_mask_mstrain_2x_lvis_v1-8b5a6745.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/seesaw_loss/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/seesaw_loss/cascade_mask_rcnn_r101_fpn_random_seesaw_loss_mstrain_2x_lvis_v1-71e2215e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/simple_copy_paste/metafile.yml | https://arxiv.org/abs/2012.07177 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/simple_copy_paste/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/simple_copy_paste/mask_rcnn_r50_fpn_syncbn-all_rpn-2conv_ssj_scp_32x2_90k_coco/mask_rcnn_r50_fpn_syncbn-all_rpn-2conv_ssj_scp_32x2_90k_coco_20220316_181307-6bc5726f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/simple_copy_paste/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/simple_copy_paste/mask_rcnn_r50_fpn_syncbn-all_rpn-2conv_ssj_scp_32x2_270k_coco/mask_rcnn_r50_fpn_syncbn-all_rpn-2conv_ssj_scp_32x2_270k_coco_20220324_201229-80ee90b7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/simple_copy_paste/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/simple_copy_paste/mask_rcnn_r50_fpn_syncbn-all_rpn-2conv_ssj_32x2_90k_coco/mask_rcnn_r50_fpn_syncbn-all_rpn-2conv_ssj_32x2_90k_coco_20220316_181409-f79c84c5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/simple_copy_paste/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/simple_copy_paste/mask_rcnn_r50_fpn_syncbn-all_rpn-2conv_ssj_32x2_270k_coco/mask_rcnn_r50_fpn_syncbn-all_rpn-2conv_ssj_32x2_270k_coco_20220324_182940-33a100c5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/solo/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/solo/solo_r50_fpn_3x_coco/solo_r50_fpn_3x_coco_20210901_012353-11d224d7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/solo/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/solo/solo_r50_fpn_1x_coco/solo_r50_fpn_1x_coco_20210821_035055-2290a6b8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/solo/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/solo/decoupled_solo_r50_fpn_3x_coco/decoupled_solo_r50_fpn_3x_coco_20210821_042504-7b3301ec.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/solo/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/solo/decoupled_solo_r50_fpn_1x_coco/decoupled_solo_r50_fpn_1x_coco_20210820_233348-6337c589.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/solo/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/solo/decoupled_solo_light_r50_fpn_3x_coco/decoupled_solo_light_r50_fpn_3x_coco_20210906_142703-e70e226f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/solov2/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/solov2/solov2_x101_dcn_fpn_3x_coco/solov2_x101_dcn_fpn_3x_coco_20220513_214337-aef41095.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/solov2/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/solov2/solov2_r50_fpn_3x_coco/solov2_r50_fpn_3x_coco_20220512_125856-fed092d4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/solov2/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/solov2/solov2_r50_fpn_1x_coco/solov2_r50_fpn_1x_coco_20220512_125858-a357fa23.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/solov2/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/solov2/solov2_r101_fpn_3x_coco/solov2_r101_fpn_3x_coco_20220511_095119-c559a076.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/solov2/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/solov2/solov2_r101_dcn_fpn_3x_coco/solov2_r101_dcn_fpn_3x_coco_20220513_214734-16c966cb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/solov2/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/solov2/solov2_light_r50_fpn_3x_coco/solov2_light_r50_fpn_3x_coco_20220512_165256-c93a6074.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/solov2/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/solov2/solov2_light_r34_fpn_3x_coco/solov2_light_r34_fpn_3x_coco_20220511_091839-e51659d3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/solov2/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/solov2/solov2_light_r18_fpn_3x_coco/solov2_light_r18_fpn_3x_coco_20220511_083717-75fa355b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/sparse_rcnn/metafile.yml | https://arxiv.org/abs/2011.12450 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/sparse_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/sparse_rcnn/sparse_rcnn_r50_fpn_mstrain_480-800_3x_coco/sparse_rcnn_r50_fpn_mstrain_480-800_3x_coco_20201218_154234-7bc5c054.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/sparse_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/sparse_rcnn/sparse_rcnn_r50_fpn_300_proposals_crop_mstrain_480-800_3x_coco/sparse_rcnn_r50_fpn_300_proposals_crop_mstrain_480-800_3x_coco_20201223_024605-9fe92701.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/sparse_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/sparse_rcnn/sparse_rcnn_r50_fpn_1x_coco/sparse_rcnn_r50_fpn_1x_coco_20201222_214453-dc79b137.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/sparse_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/sparse_rcnn/sparse_rcnn_r101_fpn_mstrain_480-800_3x_coco/sparse_rcnn_r101_fpn_mstrain_480-800_3x_coco_20201223_121552-6c46c9d6.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/sparse_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/sparse_rcnn/sparse_rcnn_r101_fpn_300_proposals_crop_mstrain_480-800_3x_coco/sparse_rcnn_r101_fpn_300_proposals_crop_mstrain_480-800_3x_coco_20201223_023452-c23c3564.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/ssd/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/ssd/ssdlite_mobilenetv2_scratch_600e_coco/ssdlite_mobilenetv2_scratch_600e_coco_20210629_110627-974d9307.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/ssd/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/ssd/ssd512_coco/ssd512_coco_20210803_022849-0a47a1ca.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/ssd/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/ssd/ssd300_coco/ssd300_coco_20210803_015428-d231a06e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/ssd/metafile.yml | https://arxiv.org/abs/1512.02325 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/swin/metafile.yml | https://arxiv.org/abs/2107.08430 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/swin/metafile.yml | https://arxiv.org/abs/2107.08430 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/swin/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/swin/mask_rcnn_swin-t-p4-w7_fpn_ms-crop-3x_coco/mask_rcnn_swin-t-p4-w7_fpn_ms-crop-3x_coco_20210906_131725-bacf6f7b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/swin/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/swin/mask_rcnn_swin-t-p4-w7_fpn_fp16_ms-crop-3x_coco/mask_rcnn_swin-t-p4-w7_fpn_fp16_ms-crop-3x_coco_20210908_165006-90a4008c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/swin/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/swin/mask_rcnn_swin-t-p4-w7_fpn_1x_coco/mask_rcnn_swin-t-p4-w7_fpn_1x_coco_20210902_120937-9d6b7cfa.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/swin/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/swin/mask_rcnn_swin-s-p4-w7_fpn_fp16_ms-crop-3x_coco/mask_rcnn_swin-s-p4-w7_fpn_fp16_ms-crop-3x_coco_20210903_104808-b92c91f1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/tood/metafile.yml | https://arxiv.org/abs/2108.07755 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/tood/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/tood/tood_x101_64x4d_fpn_mstrain_2x_coco/tood_x101_64x4d_fpn_mstrain_2x_coco_20211211_003519-a4f36113.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/tood/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/tood/tood_r50_fpn_mstrain_2x_coco/tood_r50_fpn_mstrain_2x_coco_20211210_144231-3b23174c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/tood/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/tood/tood_r50_fpn_anchor_based_1x_coco/tood_r50_fpn_anchor_based_1x_coco_20211214_100105-b776c134.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/tood/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/tood/tood_r50_fpn_1x_coco/tood_r50_fpn_1x_coco_20211210_103425-20e20746.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/tood/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/tood/tood_r101_fpn_mstrain_2x_coco/tood_r101_fpn_mstrain_2x_coco_20211210_144232-a18f53c8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/tood/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/tood/tood_r101_fpn_dconv_c3-c5_mstrain_2x_coco/tood_r101_fpn_dconv_c3-c5_mstrain_2x_coco_20211210_213728-4a824142.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/tridentnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/tridentnet/tridentnet_r50_caffe_mstrain_3x_coco/tridentnet_r50_caffe_mstrain_3x_coco_20201130_100539-46d227ba.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/tridentnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/tridentnet/tridentnet_r50_caffe_mstrain_1x_coco/tridentnet_r50_caffe_mstrain_1x_coco_20201230_141839-6ce55ccb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/tridentnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/tridentnet/tridentnet_r50_caffe_1x_coco/tridentnet_r50_caffe_1x_coco_20201230_141838-2ec0b530.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/tridentnet/metafile.yml | https://arxiv.org/abs/1901.01892 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/vfnet/metafile.yml | https://arxiv.org/abs/2008.13367 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/vfnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/vfnet/vfnet_x101_64x4d_fpn_mdconv_c3-c5_mstrain_2x_coco/vfnet_x101_64x4d_fpn_mdconv_c3-c5_mstrain_2x_coco_20201027pth-b5f6da5e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/vfnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/vfnet/vfnet_x101_32x4d_fpn_mdconv_c3-c5_mstrain_2x_coco/vfnet_x101_32x4d_fpn_mdconv_c3-c5_mstrain_2x_coco_20201027pth-d300a6fc.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/vfnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/vfnet/vfnet_r50_fpn_mstrain_2x_coco/vfnet_r50_fpn_mstrain_2x_coco_20201027-7cc75bd2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/vfnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/vfnet/vfnet_r50_fpn_mdconv_c3-c5_mstrain_2x_coco/vfnet_r50_fpn_mdconv_c3-c5_mstrain_2x_coco_20201027pth-6879c318.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/vfnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/vfnet/vfnet_r50_fpn_1x_coco/vfnet_r50_fpn_1x_coco_20201027-38db6f58.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/vfnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/vfnet/vfnet_r101_fpn_mstrain_2x_coco/vfnet_r101_fpn_mstrain_2x_coco_20201027pth-4a5d53f1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/vfnet/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/vfnet/vfnet_r101_fpn_mdconv_c3-c5_mstrain_2x_coco/vfnet_r101_fpn_mdconv_c3-c5_mstrain_2x_coco_20201027pth-7729adb5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/vfnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/vfnet/vfnet_r101_fpn_1x_coco/vfnet_r101_fpn_1x_coco_20201027pth-c831ece7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/yolact/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/yolact/yolact_r50_8x8_coco/yolact_r50_8x8_coco_20200908-ca34f5db.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/yolact/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/yolact/yolact_r50_1x8_coco/yolact_r50_1x8_coco_20200908-f38d58df.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/yolact/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/yolact/yolact_r101_1x8_coco/yolact_r101_1x8_coco_20200908-4cbe9101.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/yolact/metafile.yml | https://arxiv.org/abs/1904.02689 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/yolo/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/yolo/yolov3_mobilenetv2_mstrain-416_300e_coco/yolov3_mobilenetv2_mstrain-416_300e_coco_20210718_010823-f68a07b3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/yolo/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/yolo/yolov3_mobilenetv2_320_300e_coco/yolov3_mobilenetv2_320_300e_coco_20210719_215349-d18dff72.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/yolo/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/yolo/yolov3_d53_mstrain-608_273e_coco/yolov3_d53_mstrain-608_273e_coco_20210518_115020-a2c3acb8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/yolo/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/yolo/yolov3_d53_mstrain-416_273e_coco/yolov3_d53_mstrain-416_273e_coco-2b60fcd9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/yolo/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/yolo/yolov3_d53_fp16_mstrain-608_273e_coco/yolov3_d53_fp16_mstrain-608_273e_coco_20210517_213542-4bc34944.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/yolo/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/yolo/yolov3_d53_320_273e_coco/yolov3_d53_320_273e_coco-421362b6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/yolo/metafile.yml | https://arxiv.org/abs/1804.02767 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/yolof/metafile.yml | https://arxiv.org/abs/2103.09460 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/yolof/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/yolof/yolof_r50_c5_8x8_1x_coco/yolof_r50_c5_8x8_1x_coco_20210425_024427-8e864411.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/yolox/metafile.yml | https://arxiv.org/abs/2107.08430 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/yolox/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/yolox/yolox_x_8x8_300e_coco/yolox_x_8x8_300e_coco_20211126_140254-1ef88d67.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/yolox/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/yolox/yolox_tiny_8x8_300e_coco/yolox_tiny_8x8_300e_coco_20211124_171234-b4047906.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/yolox/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/yolox/yolox_s_8x8_300e_coco/yolox_s_8x8_300e_coco_20211121_095711-4592a793.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/configs/yolox/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/yolox/yolox_l_8x8_300e_coco/yolox_l_8x8_300e_coco_20211126_140236-d3bd2b23.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/docker/Dockerfile | https://download.openmmlab.com/mmcv/dist/cu101/torch1.6.0/index.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/docker/Dockerfile | https://developer.download.nvidia.com/compute/machine-learning/repos/ubuntu1804/x86_64/7fa2af80.pub | ubuntu镜像地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/docker/Dockerfile | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64/3bf863cc.pub | ubuntu镜像地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/docker/serve/Dockerfile | https://download.openmmlab.com/mmcv/dist/cu${CUDA//./}/torch${PYTORCH}/index.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/mmdet/models/detectors/two_stage.py | https://mmdetection.readthedocs.io/en/latest/tutorials/pytorch2onnx.html#list-of-supported-models-exportable-to-onnx | 相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/setup.py | openmmlab@gmail.com | 作者邮箱 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/tools/misc/download_dataset.py | https://s3-us-west-2.amazonaws.com/dl.fbaipublicfiles.com/LVIS/lvis_v1_train.json.zip | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/tools/misc/download_dataset.py | https://s3-us-west-2.amazonaws.com/dl.fbaipublicfiles.com/LVIS/lvis_v1_train.json.zip | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/tools/misc/download_dataset.py | http://images.cocodataset.org/zips/val2017.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/tools/misc/download_dataset.py | http://images.cocodataset.org/zips/train2017.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/tools/misc/download_dataset.py | http://images.cocodataset.org/zips/test2017.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/tools/misc/download_dataset.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2007/VOCtrainval_06-Nov-2007.tar | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/tools/misc/download_dataset.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2007/VOCtest_06-Nov-2007.tar | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/tools/misc/download_dataset.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2007/VOCdevkit_08-Jun-2007.tar | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/detection/YOLOX_ID2833_for_PyTorch/tools/misc/download_dataset.py | http://images.cocodataset.org/annotations/ | 数据集链接 | \ No newline at 
end of file diff --git a/PyTorch/dev/cv/image_classification/ADLayer_ID1087_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/ADLayer_ID1087_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..4f79427284cbe98ff74fcf64a7b0af9e6dfe9241 --- /dev/null +++ b/PyTorch/dev/cv/image_classification/ADLayer_ID1087_for_PyTorch/public_address_statement.md @@ -0,0 +1,13 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|--------------------------------------------------------------------------------------------------------------------|----------------------------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ADLayer_ID1087_for_PyTorch/dataset/prepare_cub.sh | https://onedrive.live.com/download?cid=B7111B95B80CCC66&resid=B7111B95B80CCC66%2130812&authkey=AFMzb4akufUiWU0 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ADLayer_ID1087_for_PyTorch/dataset/prepare_evaluation_data.sh | https://s3-us-west-2.amazonaws.com/imagenetv2public/imagenetv2-threshold0.7.tar.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ADLayer_ID1087_for_PyTorch/dataset/prepare_evaluation_data.sh | http://www.vision.caltech.edu/visipedia-data/CUB-200-2011/CUB_200_2011.tgz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ADLayer_ID1087_for_PyTorch/dataset/prepare_evaluation_data.sh | https://onedrive.live.com/download?cid=B7111B95B80CCC66&resid=B7111B95B80CCC66%2130826&authkey=AJI0Cb0e74L3s00 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ADLayer_ID1087_for_PyTorch/dataset/prepare_evaluation_data.sh | https://onedrive.live.com/download?cid=B7111B95B80CCC66&resid=B7111B95B80CCC66%2130812&authkey=AFMzb4akufUiWU0 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ADLayer_ID1087_for_PyTorch/dataset/prepare_imagenet.sh | https://s3-us-west-2.amazonaws.com/imagenetv2public/imagenetv2-threshold0.7.tar.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ADLayer_ID1087_for_PyTorch/dataset/prepare_openimages.sh | https://onedrive.live.com/download?cid=B7111B95B80CCC66&resid=B7111B95B80CCC66%2130813&authkey=AHgXVPxKxO_5Fvc | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ADLayer_ID1087_for_PyTorch/dataset/prepare_openimages.sh | https://onedrive.live.com/download?cid=B7111B95B80CCC66&resid=B7111B95B80CCC66%2130811&authkey=AMWbBWZVQFbm4jw | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ADLayer_ID1087_for_PyTorch/wsol/inception.py | https://download.pytorch.org/models/inception_v3_google-1a9a5a14.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ADLayer_ID1087_for_PyTorch/wsol/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ADLayer_ID1087_for_PyTorch/wsol/vgg.py | https://download.pytorch.org/models/vgg16-397923af.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/ATS_ID2682_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/ATS_ID2682_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..5992352cdfd69c3612b7465e8dbc113c431acf7d --- /dev/null +++ b/PyTorch/dev/cv/image_classification/ATS_ID2682_for_PyTorch/public_address_statement.md @@ -0,0 +1,3 @@ +| 文件位置 | 公网地址 | 公网地址用途 | 
+|--------------------------------------------------------------------------------------|----------------------|---------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ATS_ID2682_for_PyTorch/setup.py | lucidrains@gmail.com | 作者邮箱 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/AdvancedEast_ID0473_for_PyTorch/README_ori.md b/PyTorch/dev/cv/image_classification/AdvancedEast_ID0473_for_PyTorch/README_ori.md index 1a712c230e09778f6ca4136d03a972ac022631e4..1d75e82f24e61cb9c949f8f996b6631c3c8f7fd9 100644 --- a/PyTorch/dev/cv/image_classification/AdvancedEast_ID0473_for_PyTorch/README_ori.md +++ b/PyTorch/dev/cv/image_classification/AdvancedEast_ID0473_for_PyTorch/README_ori.md @@ -3,7 +3,7 @@ pytorch实现AdvancedEast+mobilenetv3 # 参考https://github.com/huoyijie/AdvancedEAST # training -## tianchi ICPR dataset download 链接: https://pan.baidu.com/s/1NSyc-cHKV3IwDo6qojIrKA 密码: ye9y +## 用户自行准备数据集 ### 1.modify config params in cfg.py, see default values. ### 2.python preprocess.py, resize image to 256*256,384*384,512*512,640*640,736*736, and train respectively could speed up training process. ### 3.python label.py diff --git a/PyTorch/dev/cv/image_classification/AlexNet_ID0472_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/AlexNet_ID0472_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..17432a8932ff92fcbf38e67c5684177ab65f967d --- /dev/null +++ b/PyTorch/dev/cv/image_classification/AlexNet_ID0472_for_PyTorch/public_address_statement.md @@ -0,0 +1,10 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|----------------------------------------------------------------------------------------|-----------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/AlexNet_ID0472_for_PyTorch/vgg.py | https://download.pytorch.org/models/vgg16-397923af.pth    | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/AlexNet_ID0472_for_PyTorch/vgg.py | https://download.pytorch.org/models/vgg19_bn-c79401a0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/AlexNet_ID0472_for_PyTorch/vgg.py | https://download.pytorch.org/models/vgg19-dcbb9e9d.pth    | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/AlexNet_ID0472_for_PyTorch/vgg.py | https://download.pytorch.org/models/vgg16_bn-6c64b313.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/AlexNet_ID0472_for_PyTorch/vgg.py | https://download.pytorch.org/models/vgg13_bn-abd245e5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/AlexNet_ID0472_for_PyTorch/vgg.py | https://download.pytorch.org/models/vgg13-c768596a.pth    | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/AlexNet_ID0472_for_PyTorch/vgg.py | https://download.pytorch.org/models/vgg11_bn-6002323d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/AlexNet_ID0472_for_PyTorch/vgg.py | https://download.pytorch.org/models/vgg11-bbd30ac9.pth    | 权重地址 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/ArcFace_ID0852_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/ArcFace_ID0852_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..1d67365e3d1ea9a94d36f0af486726ca65b38575 --- /dev/null +++ b/PyTorch/dev/cv/image_classification/ArcFace_ID0852_for_PyTorch/public_address_statement.md @@ -0,0 +1,3 @@ +| 文件位置 | 公网地址 | 公网地址用途 | 
+|------------------------------------------------------------------------------------------|--------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ArcFace_ID0852_for_PyTorch/train.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/BertBase_ID0490_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/BertBase_ID0490_for_PyTorch/public_address_statement.md index 1d8ae9a9386e67591b465fd0b8229da99cacb2f1..dc1ac4c6712d3d599bf21db44734d249a484c47f 100644 --- a/PyTorch/dev/cv/image_classification/BertBase_ID0490_for_PyTorch/public_address_statement.md +++ b/PyTorch/dev/cv/image_classification/BertBase_ID0490_for_PyTorch/public_address_statement.md @@ -1,3 +1,31 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|----------------------------------------------------------------------------------------------------------------------------------|----------------------------------------------------|-----------------------|--------| -| 开发引入 | / | BertBase_ID0490_for_PyTorch/requirements.txt | https://github.com/NVIDIA/dllogger | 相关依赖 | +| 文件位置 | 公网地址 | 公网地址用途 | +|---------------------------------------------------------------------------------------------------------------------------|--------------------------------------------------------------------------------------------------------------------------------------------------|-------------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/BertBase_ID0490_for_PyTorch/data/GLUEDownloader.py | https://gist.githubusercontent.com/W4ngatang/60c2bdb54d156a41194446737ce03e2e/raw/17b8dd0d724281ed7c3b2aeeda662b92809aadd5/download_glue_data.py | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/BertBase_ID0490_for_PyTorch/data/GooglePretrainedWeightDownloader.py | https://storage.googleapis.com/bert_models/2018_10_18/uncased_L-24_H-1024_A-16.zip | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/BertBase_ID0490_for_PyTorch/data/GooglePretrainedWeightDownloader.py | https://storage.googleapis.com/bert_models/2018_11_03/multilingual_L-12_H-768_A-12.zip | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/BertBase_ID0490_for_PyTorch/data/GooglePretrainedWeightDownloader.py | https://storage.googleapis.com/bert_models/2018_10_18/cased_L-24_H-1024_A-16.zip | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/BertBase_ID0490_for_PyTorch/data/GooglePretrainedWeightDownloader.py | https://storage.googleapis.com/bert_models/2018_10_18/uncased_L-12_H-768_A-12.zip | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/BertBase_ID0490_for_PyTorch/data/GooglePretrainedWeightDownloader.py | https://storage.googleapis.com/bert_models/2018_11_23/multi_cased_L-12_H-768_A-12.zip | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/BertBase_ID0490_for_PyTorch/data/GooglePretrainedWeightDownloader.py | https://storage.googleapis.com/bert_models/2018_11_03/chinese_L-12_H-768_A-12.zip | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/BertBase_ID0490_for_PyTorch/data/GooglePretrainedWeightDownloader.py | https://storage.googleapis.com/bert_models/2018_10_18/cased_L-12_H-768_A-12.zip | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/BertBase_ID0490_for_PyTorch/data/squad/squad_download.sh | https://worksheets.codalab.org/rest/bundles/0xbcd57bee090b421c982906709c8c27e1/contents/blob/ -O $v1/evaluate-v1.1.py | 数据集链接 | +| 
ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/BertBase_ID0490_for_PyTorch/data/squad/squad_download.sh | https://worksheets.codalab.org/rest/bundles/0x6b567e1cf2e041ec80d7098f031c5c9e/contents/blob/ -O $v2/evaluate-v2.0.py | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/BertBase_ID0490_for_PyTorch/data/SquadDownloader.py | https://worksheets.codalab.org/rest/bundles/0xbcd57bee090b421c982906709c8c27e1/contents/blob/ | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/BertBase_ID0490_for_PyTorch/data/SquadDownloader.py | https://worksheets.codalab.org/rest/bundles/0x6b567e1cf2e041ec80d7098f031c5c9e/contents/blob/ | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/BertBase_ID0490_for_PyTorch/data/WikiDownloader.py | https://dumps.wikimedia.org/zhwiki/latest/zhwiki-latest-pages-articles.xml.bz2 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/BertBase_ID0490_for_PyTorch/data/WikiDownloader.py | https://dumps.wikimedia.org/enwiki/latest/enwiki-latest-pages-articles.xml.bz2 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/BertBase_ID0490_for_PyTorch/modeling.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/BertBase_ID0490_for_PyTorch/modeling.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased.tar.gz | 相关配置 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/BertBase_ID0490_for_PyTorch/modeling.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased.tar.gz | 相关配置 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/BertBase_ID0490_for_PyTorch/modeling.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased.tar.gz | 相关配置 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/BertBase_ID0490_for_PyTorch/modeling.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-uncased.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/BertBase_ID0490_for_PyTorch/modeling.py | https://s7.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-cased.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/BertBase_ID0490_for_PyTorch/modeling.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-chinese.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/BertBase_ID0490_for_PyTorch/modeling.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased.tar.gz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/BertBase_ID0490_for_PyTorch/tokenization.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-vocab.txt | 相关配置 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/BertBase_ID0490_for_PyTorch/tokenization.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-vocab.txt | 相关配置 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/BertBase_ID0490_for_PyTorch/tokenization.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt | 相关配置 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/BertBase_ID0490_for_PyTorch/tokenization.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-uncased-vocab.txt | 相关配置 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/BertBase_ID0490_for_PyTorch/tokenization.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-cased-vocab.txt | 相关配置 | +| 
ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/BertBase_ID0490_for_PyTorch/tokenization.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-chinese-vocab.txt | 相关配置 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/BertBase_ID0490_for_PyTorch/tokenization.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased-vocab.txt | 模型相关说明 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/BertBase_ID0490_for_PyTorch/scripts/run_pretraining_npu_8p.sh b/PyTorch/dev/cv/image_classification/BertBase_ID0490_for_PyTorch/scripts/run_pretraining_npu_8p.sh index e51dd826c983091b101375748437d2ddbccfef5b..c666fe4a1b02e06edee4b7652a6b464c46076662 100644 --- a/PyTorch/dev/cv/image_classification/BertBase_ID0490_for_PyTorch/scripts/run_pretraining_npu_8p.sh +++ b/PyTorch/dev/cv/image_classification/BertBase_ID0490_for_PyTorch/scripts/run_pretraining_npu_8p.sh @@ -152,7 +152,7 @@ CMD+=" $INIT_CHECKPOINT" CMD+=" --do_train" CMD+=" --use_npu" CMD+=" --loss_scale=16384.0" -CMD+=" --addr=90.90.176.102" +CMD+=" --addr=x.x.x.x" # change to your address CMD+=" --json-summary ${RESULTS_DIR}/dllogger.json " CMD="python3.7 -u $CMD" diff --git a/PyTorch/dev/cv/image_classification/BigGAN_ID0522_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/BigGAN_ID0522_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..effdf801c55b377d04e33da302687c666b63999a --- /dev/null +++ b/PyTorch/dev/cv/image_classification/BigGAN_ID0522_for_PyTorch/public_address_statement.md @@ -0,0 +1,4 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|--------------------------------------------------------------------------------------------------|-------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/BigGAN_ID0522_for_PyTorch/datasets.py | http://www.cs.toronto.edu/~kriz/cifar-100-python.tar.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/BigGAN_ID0522_for_PyTorch/inception_tf13.py | http://download.tensorflow.org/models/image/imagenet/inception-2015-12-05.tgz | 数据集链接 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/public_address_statement.md index 6ac397d11e0cad106515cbf86c462e660f5ddb49..61f73a59c5ae1b3b60a068587d6f26cb9c28e6fa 100644 --- a/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/public_address_statement.md +++ b/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/public_address_statement.md @@ -1,227 +1,181 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ---- | ------------ | ------ | ------------------------------------ | -------- | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mnasnet_b1-74cb7081.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mnasnet_a1-d9418771.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv2_100_ra-b33bc2c4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv2_110d_ra-77090ade.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv2_120d_ra-5987e2ed.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv2_140_ra-21a4e913.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/fbnetc_100-c345b898.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/spnasnet_100-048bc3f4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b0_ra-3dd342df.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b1-533bc792.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b2_ra-bcdf34b7.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b3_ra2-cf984f9c.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b4_ra2_320-7eb33cd5.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_es_ra-f111e99c.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_em_ra2-66250f76.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/DeGirum/pruned-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/DeGirum/pruned-models/releases/download/efficientnet_v1.0/efficientnet_el.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/DeGirum/pruned-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | 
https://github.com/DeGirum/pruned-models/releases/download/efficientnet_v1.0/efficientnet_es_pruned75.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/DeGirum/pruned-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/DeGirum/pruned-models/releases/download/efficientnet_v1.0/efficientnet_el_pruned70.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_lite0_ra-37913777.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb1_pruned_9ebb3fe6.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb2_pruned_203f55bc.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb3_pruned_5abcc29f.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnetv2_t_agc-3620981a.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gc_efficientnetv2_rw_t_agc-927a0bde.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_v2s_ra2_288-a6477665.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnetv2_rw_m_agc-3d90cb1e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b0_aa-827b6e33.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b1_aa-ea7a6ee0.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b2_aa-60c94f97.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b3_aa-84b4657e.pth | 下载预训练模型 | -| 开源代码引入 | 
https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b4_aa-818f208c.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b5_ra-9a3e5369.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b6_aa-80ba17e4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b7_ra-6c08e654.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b8_ra-572d5dd9.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b0_ap-f262efe1.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b1_ap-44ef0a3d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b2_ap-2f8e7636.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b3_ap-aad25bdd.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b4_ap-dedb23e6.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b5_ap-9e82fae8.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b6_ap-4ffb161f.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b7_ap-ddb28fec.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b8_ap-00e169fa.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b0_ns-c0e6a31c.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b1_ns-99dd0c41.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b2_ns-00306e48.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b3_ns-9d44bf68.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b4_ns-d6313a46.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b5_ns-6f26d0cf.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b6_ns-51548356.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b7_ns-1dbc32de.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_l2_ns_475-bebbd00a.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_l2_ns-df73bb44.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_es-ca1afbfe.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_em-e78cfe58.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_el-5143854e.pth | 下载预训练模型 | -| 开源代码引入 | 
https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_cc_b0_4e-4362b6b2.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_cc_b0_8e-66184a25.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_cc_b1_8e-f7c79ae1.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite0-0aa007d2.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite1-bde8b488.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite2-dcccb7df.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite3-b733e338.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite4-741542c3.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_s-eb54923e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_m-cc09e0cd.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_l-d664b728.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_s_21ft1k-d7dafa41.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_m_21ft1k-bf41664a.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_l_21ft1k-60127a9d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_xl_in21ft1k-06c35c48.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_s_21k-6337ad01.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_m_21k-361418a2.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_l_21k-91a19ec9.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_xl_in21k-fd7e8abf.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_b0-c7cc451f.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_b1-be6e41b0.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_b2-847de54e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_b3-57773f13.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mixnet_s-a907afbc.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mixnet_m-4647fc68.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mixnet_l-5a9a2ed8.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mixnet_xl_ra-aac3c00c.pth | 下载预训练模型 | -| 开源代码引入 | 
https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mixnet_s-89d3354b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mixnet_m-0f4d8805.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mixnet_l-6c92e0c8.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet18d_ra2-48a79e06.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet34-43635321.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet34d_ra2-f8dcfcaf.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet26-9aa10e23.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet26d-69e92c46.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet50_ram-a26f946b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet50d_ra2-464e36ba.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet101d_ra2-2803ffab.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet152d_ra2-5cac0439.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet200d_ra2-bdba9bf9.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 
下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/wide_resnet50_racm-8234f177.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnext50_32x4d_ra-d733960d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnext50d_32x4d-103e99f8.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x8-c38310e5.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x16-c6f796b0.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x32-e4b90b00.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x48-3e41cc8a.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet18-d92f0530.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet50-08389792.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext50_32x4-ddb3e555.pth | 下载预训练模型 | -| 开源代码引入 | 
https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x4-dc43570a.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x8-2cfe2f8b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x16-15fffa57.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet18-118f1556.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet50-16a12f1b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext50_32x4-72679e44.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x4-3f87e46b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x8-b4712904.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x16-f3559a9c.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnet50_ra_224-8efdb4bb.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnet152d_ra2-04464dd2.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext26d_32x4d-80fa48a3.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext26tn_32x4d-569cb627.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext50_32x4d_racm-a304a460.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | 
CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecaresnet26t_ra2-46609757.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNetLight_4f34b35b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet50D_833caf58.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45899/outputs/ECAResNet50D_P_9c67f710.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecaresnet50t_ra2-f7ac63c4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet101D_281c5844.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45610/outputs/ECAResNet101D_P_75a3370e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecaresnet269d_320_ra2-7baa55cb.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnetblur50-84f4748f.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs50_ema-6b53758b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs101_i192_ema-1509bbf6.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs152_i256_ema-a9aff7f9.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs200_ema-623d2f59.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs270_ema-b40e674c.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | 
CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs350_i256_ema-5a1aa8f1.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs420_ema-972dee69.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet18_v1b-0757602b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet34_v1b-c6d82d59.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1b-0ebe02e2.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1b-3b017079.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1b-c1edb0dd.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1c-48092f55.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1c-1f26822a.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1c-a3bb0b98.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1d-818a1b1b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1d-0f9c8644.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1d-bd354e12.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1s-1762acc0.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | 
CSPResNeXt-50_ID1888_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1s-60fe0cc1.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1s-dcc41b81.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnext50_32x4d-e6a097c1.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnext101_32x4d-b253c8c4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnext101_64x4d-f9a8e184.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_seresnext50_32x4d-90cf2d6e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_seresnext101_32x4d-cf52900d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_seresnext101_64x4d-f9926f93.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_senet154-70a1a3c0.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/mlp_mixer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_mixer_b16_224-76587d61.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/mlp_mixer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_mixer_b16_224_in21k-617b3de2.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/mlp_mixer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_mixer_l16_224-92f9adc4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/mlp_mixer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_mixer_l16_224_in21k-846aa33c.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/mlp_mixer.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/mixer_b16_224_miil_in21k.pth | 下载预训练模型 | -| 开源代码引入 
| https://github.com/rwightman/pytorch-pretrained-gluonresnet | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/mlp_mixer.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/mixer_b16_224_miil.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/mlp_mixer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gmixer_24_224_raa-7daf7ae6.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_12_no_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_24_no_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_36_no_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlpB_24_no_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_12_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_24_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_36_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlpB_24_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlpB_24_22k.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/mlp_mixer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gmlp_s16_224_raa-10536d42.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p16_224.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p16_224_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p16_384_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p16_224.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | 
https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p16_224_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p16_384_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p16_224.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p16_224_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p16_384_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p16_224.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p16_224_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p16_384_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p16_224.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p16_224_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p16_384_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p16_224.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p16_224_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p16_384_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p16_224.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p16_224_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p16_384_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p8_224.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | 
CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p8_224_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p8_384_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p8_224.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p8_224_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p8_384_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p8_224.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p8_224_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p8_384_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p8_224.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p8_224_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p8_384_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p8_224.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p8_224_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p8_384_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p8_224.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p8_224_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p8_384_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p8_224.pth | 下载预训练模型 | -| 开源代码引入 | 
https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p8_224_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p8_384_dist.pth | 下载预训练模型 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|--------------------------------------------------------------------------------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/mkdocs.yml | https://cdnjs.cloudflare.com/ajax/libs/tablesort/5.2.1/tablesort.min.js | 开源引用说明 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/mkdocs.yml | https://cdnjs.cloudflare.com/ajax/libs/mathjax/2.7.0/MathJax.js?config=TeX-MML-AM_CHTML | 开源引用说明 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/setup.py | hello@rwightman.com | 作者邮箱 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XXS36_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XXS36_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XXS24_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XXS24_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XS24_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/S36_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/S24_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/S24_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/M48_448.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/M36_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/convit.py | https://dl.fbaipublicfiles.com/convit/convit_tiny.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/convit.py | https://dl.fbaipublicfiles.com/convit/convit_small.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/convit.py | https://dl.fbaipublicfiles.com/convit/convit_base.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/densenet.py | https://download.pytorch.org/models/densenet121-a639ec97.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/densenet.py | https://download.pytorch.org/models/densenet201-c1103571.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/densenet.py | https://download.pytorch.org/models/densenet169-b2777c0a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/densenet.py | https://download.pytorch.org/models/densenet161-8d451a50.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60x_c-b870c45c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60x-d15cacda.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60-24839fc4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla46x_c-d761bae7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla46_c-2bfd52c3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla34-ba72cf86.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla169-0914e092.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102x2-262837b6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102x-ad62be81.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102-d94d9790.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb3_pruned_5abcc29f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb2_pruned_203f55bc.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb1_pruned_9ebb3fe6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_F_Green_60ms_78.1_2855edf1.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_E_Green_55ms_77.9_90f20e8a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_D_Green_50ms_77.4_23e3cdde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_C_Green_44ms_77.1_d4148c9e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_B_Green_40ms_76.5_1f882d1e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_A_Green_38ms_75.9_23474aeb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/inception_v3.py | https://download.pytorch.org/models/inception_v3_google-1a9a5a14.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-384-9bdaf2e2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-256-13b5763e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-192-92712e41.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-128S-96703c44.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-128-b88c2750.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/mlp_mixer.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/mixer_b16_224_miil_in21k.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/mlp_mixer.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/mixer_b16_224_miil.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlpB_24_no_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlpB_24_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlpB_24_22k.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_36_no_dist.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_36_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_24_no_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_24_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_12_no_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_12_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/mobilenetv3.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/mobilenetv3_large_100_in21k_miil.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/mobilenetv3.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/mobilenetv3_large_100_1k_miil_78_0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/regnet.py | https://dl.fbaipublicfiles.com/deit/regnety_160-a5fe301d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x8-c38310e5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x48-3e41cc8a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | 
https://download.pytorch.org/models/ig_resnext101_32x32-e4b90b00.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x16-c6f796b0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45899/outputs/ECAResNet50D_P_9c67f710.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45610/outputs/ECAResNet101D_P_75a3370e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNetLight_4f34b35b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet50D_833caf58.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet101D_281c5844.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext50_32x4-72679e44.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x8-b4712904.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x4-3f87e46b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x16-f3559a9c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet50-16a12f1b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet18-118f1556.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext50_32x4-ddb3e555.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x8-2cfe2f8b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | 
https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x4-dc43570a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x16-15fffa57.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet50-08389792.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet18-d92f0530.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/distill/R50x1_224.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/distill/R152x2_T_384.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/distill/R152x2_T_224.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x3-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x3.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x1-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x1.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x4-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x4.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x2-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x2.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x3-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x3.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x1-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/resnetv2.py | 
https://storage.googleapis.com/bit_models/BiT-M-R101x1.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/tresnet.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/tresnet_m_miil_in21k.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/tresnet.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/tresnet_m_1k_miil_83_1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg19_bn-c79401a0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg19-dcbb9e9d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg16_bn-6c64b313.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg16-397923af.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg13_bn-abd245e5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg13-c768596a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg11_bn-6002323d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg11-bbd30ac9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/vision_transformer.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/vit_base_patch16_224_in21k_miil.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_tiny_patch16_224-a1311bcf.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_tiny_distilled_patch16_224-b40b3cf7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_small_patch16_224-cd65a155.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_small_distilled_patch16_224-649709d9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_384-8de9b5d1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_224-b5f2ef4d.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_384-d0272ac0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_224-df68dfff.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/sam/ViT-B_32.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/sam/ViT-B_16.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/imagenet21k/ViT-H_14.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/Ti_16-i21k-300ep-lr_0.001-aug_none-wd_0.03-do_0.0-sd_0.0.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/S_32-i21k-300ep-lr_0.001-aug_light1-wd_0.03-do_0.0-sd_0.0.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/S_16-i21k-300ep-lr_0.001-aug_light1-wd_0.03-do_0.0-sd_0.0.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/L_16-i21k-300ep-lr_0.001-aug_medium1-wd_0.1-do_0.1-sd_0.1.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/B_32-i21k-300ep-lr_0.001-aug_medium1-wd_0.03-do_0.0-sd_0.0.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/B_16-i21k-300ep-lr_0.001-aug_medium1-wd_0.1-do_0.0-sd_0.0.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/vision_transformer.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/R50_L_32-i21k-300ep-lr_0.001-aug_medium2-wd_0.1-do_0.0-sd_0.0.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/R26_S_32-i21k-300ep-lr_0.001-aug_medium2-wd_0.03-do_0.0-sd_0.0.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/vision_transformer_hybrid.py | 
https://storage.googleapis.com/vit_models/augreg/R_Ti_16-i21k-300ep-lr_0.001-aug_none-wd_0.03-do_0.0-sd_0.0.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p8_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p8_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p16_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p16_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p8_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p8_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p16_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p16_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p8_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p8_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p16_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p16_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p8_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p8_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p16_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p16_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p8_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | 
https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p8_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p16_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p16_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p8_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p8_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p16_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p16_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p8_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p8_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p16_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p16_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p8_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p16_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p8_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p16_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p8_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p16_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p8_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p16_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | 
https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p8_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p16_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p8_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p16_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p8_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CSPResNeXt-50_ID1888_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p16_384_dist.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/CycleGAN_ID0521_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/CycleGAN_ID0521_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..a73446c12c8a5e31dcbf7b7c13a2e2fe6e7d35f1 --- /dev/null +++ b/PyTorch/dev/cv/image_classification/CycleGAN_ID0521_for_PyTorch/public_address_statement.md @@ -0,0 +1,4 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|------------------------------------------------------------------------------------------------------|------------------------------------------------------------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CycleGAN_ID0521_for_PyTorch/data/get_dataset.sh | https://people.eecs.berkeley.edu/~taesung_park/CycleGAN/datasets/${FILE}.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/CycleGAN_ID0521_for_PyTorch/train.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/DEEP-HEAD-POSE_ID0796_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/DEEP-HEAD-POSE_ID0796_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..ea0bacc029cc94db0aaedbded42c3d8a7f0d69b8 --- /dev/null +++ b/PyTorch/dev/cv/image_classification/DEEP-HEAD-POSE_ID0796_for_PyTorch/public_address_statement.md @@ -0,0 +1,5 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|--------------------------------------------------------------------------------------------------------------------------|--------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DEEP-HEAD-POSE_ID0796_for_PyTorch/code/train_alexnet.py | https://download.pytorch.org/models/alexnet-owt-4df8aa71.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DEEP-HEAD-POSE_ID0796_for_PyTorch/code/train_hopenet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DEEP-HEAD-POSE_ID0796_for_PyTorch/code/train_resnet50_regression.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/DGMS_ID2460_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/DGMS_ID2460_for_PyTorch/public_address_statement.md new file mode 100644 index 
0000000000000000000000000000000000000000..b0cc754326281ebbb33d660cc8d539017fe03263 --- /dev/null +++ b/PyTorch/dev/cv/image_classification/DGMS_ID2460_for_PyTorch/public_address_statement.md @@ -0,0 +1,4 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|-------------------------------------------------------------------------------------------------------------|----------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DGMS_ID2460_for_PyTorch/dataloader/datasets/cifar10.py | https://www.cs.toronto.edu/~kriz/cifar-10-python.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DGMS_ID2460_for_PyTorch/dataloader/datasets/cifar10.py | https://www.cs.toronto.edu/~kriz/cifar-100-python.tar.gz | 模型相关说明 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/DINO_Series_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/DINO_Series_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..6d8ca7efb75511b153cdb904f8feb7669778ab6d --- /dev/null +++ b/PyTorch/dev/cv/image_classification/DINO_Series_for_PyTorch/public_address_statement.md @@ -0,0 +1,22 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|---------------------------------------------------------------------------------------------------------|-------------------------------------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DINO_Series_for_PyTorch/eval_copy_detection.py | https://pytorch.org/docs/stable/distributed.html | 参数相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DINO_Series_for_PyTorch/eval_image_retrieval.py | https://dl.fbaipublicfiles.com/dino/dino_vitsmall16_googlelandmark_pretrain/dino_vitsmall16_googlelandmark_pretrain.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DINO_Series_for_PyTorch/eval_image_retrieval.py | https://pytorch.org/docs/stable/distributed.html | 参数相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DINO_Series_for_PyTorch/eval_knn.py | https://pytorch.org/docs/stable/distributed.html | 参数相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DINO_Series_for_PyTorch/eval_linear.py | https://pytorch.org/docs/stable/distributed.html | 参数相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DINO_Series_for_PyTorch/eval_video_segmentation.py | https://raw.githubusercontent.com/Liusifei/UVC/master/libs/data/palette.txt | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DINO_Series_for_PyTorch/hubconf.py | https://dl.fbaipublicfiles.com/dino/dino_xcit_small_12_p8_pretrain/dino_xcit_small_12_p8_pretrain.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DINO_Series_for_PyTorch/hubconf.py | https://dl.fbaipublicfiles.com/dino/dino_xcit_small_12_p16_pretrain/dino_xcit_small_12_p16_pretrain.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DINO_Series_for_PyTorch/hubconf.py | https://dl.fbaipublicfiles.com/dino/dino_xcit_medium_24_p8_pretrain/dino_xcit_medium_24_p8_pretrain.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DINO_Series_for_PyTorch/hubconf.py | https://dl.fbaipublicfiles.com/dino/dino_xcit_medium_24_p16_pretrain/dino_xcit_medium_24_p16_pretrain.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DINO_Series_for_PyTorch/hubconf.py | 
https://dl.fbaipublicfiles.com/dino/dino_vitbase8_pretrain/dino_vitbase8_pretrain.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DINO_Series_for_PyTorch/hubconf.py | https://dl.fbaipublicfiles.com/dino/dino_vitbase16_pretrain/dino_vitbase16_pretrain.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DINO_Series_for_PyTorch/hubconf.py | https://dl.fbaipublicfiles.com/dino/dino_resnet50_pretrain/dino_resnet50_pretrain.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DINO_Series_for_PyTorch/hubconf.py | https://dl.fbaipublicfiles.com/dino/dino_deitsmall8_pretrain/dino_deitsmall8_pretrain.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DINO_Series_for_PyTorch/hubconf.py | https://dl.fbaipublicfiles.com/dino/dino_deitsmall16_pretrain/dino_deitsmall16_pretrain.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DINO_Series_for_PyTorch/main_dino.py | https://pytorch.org/docs/stable/distributed.html | 参数相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DINO_Series_for_PyTorch/utils.py | https://dl.fbaipublicfiles.com/dino/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DINO_Series_for_PyTorch/video_generation.py | https://dl.fbaipublicfiles.com/dino/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DINO_Series_for_PyTorch/visualize_attention.py | https://dl.fbaipublicfiles.com/dino/img.png | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DINO_Series_for_PyTorch/visualize_attention.py | https://dl.fbaipublicfiles.com/dino/ | 权重地址 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/DIN_ID2837_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/DIN_ID2837_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..14400e3ec2250e39875219e752f02763fd082f30 --- /dev/null +++ b/PyTorch/dev/cv/image_classification/DIN_ID2837_for_PyTorch/public_address_statement.md @@ -0,0 +1,3 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|---------------------------------------------------------------------------------------------|-------------------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DIN_ID2837_for_PyTorch/environment.yml | https://conda.anaconda.org/anaconda | miniconda链接 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/public_address_statement.md index 0b1d56c6c5562f0013d9bc383f676d7f4ad1c143..a6cf847c222968d2a5792998223e28bc1f79bf52 100644 --- a/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/public_address_statement.md +++ b/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/public_address_statement.md @@ -1,227 +1,181 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ---- | ------------ | ------ | ------------------------------------ | -------- | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mnasnet_b1-74cb7081.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mnasnet_a1-d9418771.pth | 下载预训练模型 | -| 开源代码引入 | 
https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv2_100_ra-b33bc2c4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv2_110d_ra-77090ade.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv2_120d_ra-5987e2ed.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv2_140_ra-21a4e913.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/fbnetc_100-c345b898.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/spnasnet_100-048bc3f4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b0_ra-3dd342df.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b1-533bc792.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b2_ra-bcdf34b7.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b3_ra2-cf984f9c.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b4_ra2_320-7eb33cd5.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_es_ra-f111e99c.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_em_ra2-66250f76.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/DeGirum/pruned-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/DeGirum/pruned-models/releases/download/efficientnet_v1.0/efficientnet_el.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/DeGirum/pruned-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py 
| https://github.com/DeGirum/pruned-models/releases/download/efficientnet_v1.0/efficientnet_es_pruned75.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/DeGirum/pruned-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/DeGirum/pruned-models/releases/download/efficientnet_v1.0/efficientnet_el_pruned70.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_lite0_ra-37913777.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb1_pruned_9ebb3fe6.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb2_pruned_203f55bc.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb3_pruned_5abcc29f.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnetv2_t_agc-3620981a.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gc_efficientnetv2_rw_t_agc-927a0bde.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_v2s_ra2_288-a6477665.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnetv2_rw_m_agc-3d90cb1e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b0_aa-827b6e33.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b1_aa-ea7a6ee0.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b2_aa-60c94f97.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b3_aa-84b4657e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b4_aa-818f208c.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b5_ra-9a3e5369.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b6_aa-80ba17e4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b7_ra-6c08e654.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b8_ra-572d5dd9.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b0_ap-f262efe1.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b1_ap-44ef0a3d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b2_ap-2f8e7636.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b3_ap-aad25bdd.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b4_ap-dedb23e6.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b5_ap-9e82fae8.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b6_ap-4ffb161f.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b7_ap-ddb28fec.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b8_ap-00e169fa.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b0_ns-c0e6a31c.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b1_ns-99dd0c41.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b2_ns-00306e48.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b3_ns-9d44bf68.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b4_ns-d6313a46.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b5_ns-6f26d0cf.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b6_ns-51548356.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b7_ns-1dbc32de.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_l2_ns_475-bebbd00a.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_l2_ns-df73bb44.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_es-ca1afbfe.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_em-e78cfe58.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_el-5143854e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_cc_b0_4e-4362b6b2.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_cc_b0_8e-66184a25.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_cc_b1_8e-f7c79ae1.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite0-0aa007d2.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite1-bde8b488.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite2-dcccb7df.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite3-b733e338.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite4-741542c3.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_s-eb54923e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_m-cc09e0cd.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_l-d664b728.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_s_21ft1k-d7dafa41.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_m_21ft1k-bf41664a.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_l_21ft1k-60127a9d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_xl_in21ft1k-06c35c48.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | 
DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_s_21k-6337ad01.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_m_21k-361418a2.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_l_21k-91a19ec9.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_xl_in21k-fd7e8abf.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_b0-c7cc451f.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_b1-be6e41b0.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_b2-847de54e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_b3-57773f13.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mixnet_s-a907afbc.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mixnet_m-4647fc68.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mixnet_l-5a9a2ed8.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mixnet_xl_ra-aac3c00c.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mixnet_s-89d3354b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mixnet_m-0f4d8805.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | 
DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mixnet_l-6c92e0c8.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet18d_ra2-48a79e06.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet34-43635321.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet34d_ra2-f8dcfcaf.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet26-9aa10e23.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet26d-69e92c46.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet50_ram-a26f946b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet50d_ra2-464e36ba.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet101d_ra2-2803ffab.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet152d_ra2-5cac0439.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet200d_ra2-bdba9bf9.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | 
DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/wide_resnet50_racm-8234f177.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnext50_32x4d_ra-d733960d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnext50d_32x4d-103e99f8.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x8-c38310e5.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x16-c6f796b0.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x32-e4b90b00.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x48-3e41cc8a.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet18-d92f0530.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet50-08389792.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext50_32x4-ddb3e555.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x4-dc43570a.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x8-2cfe2f8b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x16-15fffa57.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | 
DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet18-118f1556.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet50-16a12f1b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext50_32x4-72679e44.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x4-3f87e46b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x8-b4712904.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x16-f3559a9c.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnet50_ra_224-8efdb4bb.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnet152d_ra2-04464dd2.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext26d_32x4d-80fa48a3.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext26tn_32x4d-569cb627.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext50_32x4d_racm-a304a460.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecaresnet26t_ra2-46609757.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNetLight_4f34b35b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet50D_833caf58.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | 
https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45899/outputs/ECAResNet50D_P_9c67f710.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecaresnet50t_ra2-f7ac63c4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet101D_281c5844.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45610/outputs/ECAResNet101D_P_75a3370e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecaresnet269d_320_ra2-7baa55cb.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnetblur50-84f4748f.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs50_ema-6b53758b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs101_i192_ema-1509bbf6.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs152_i256_ema-a9aff7f9.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs200_ema-623d2f59.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs270_ema-b40e674c.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs350_i256_ema-5a1aa8f1.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs420_ema-972dee69.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | DPN-68_ID1889_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet18_v1b-0757602b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | DPN-68_ID1889_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet34_v1b-c6d82d59.pth | 
下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | DPN-68_ID1889_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1b-0ebe02e2.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | DPN-68_ID1889_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1b-3b017079.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | DPN-68_ID1889_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1b-c1edb0dd.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | DPN-68_ID1889_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1c-48092f55.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | DPN-68_ID1889_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1c-1f26822a.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | DPN-68_ID1889_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1c-a3bb0b98.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | DPN-68_ID1889_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1d-818a1b1b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | DPN-68_ID1889_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1d-0f9c8644.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | DPN-68_ID1889_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1d-bd354e12.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | DPN-68_ID1889_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1s-1762acc0.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | DPN-68_ID1889_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1s-60fe0cc1.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | DPN-68_ID1889_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1s-dcc41b81.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | DPN-68_ID1889_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnext50_32x4d-e6a097c1.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | DPN-68_ID1889_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnext101_32x4d-b253c8c4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | 
DPN-68_ID1889_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnext101_64x4d-f9a8e184.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | DPN-68_ID1889_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_seresnext50_32x4d-90cf2d6e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | DPN-68_ID1889_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_seresnext101_32x4d-cf52900d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | DPN-68_ID1889_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_seresnext101_64x4d-f9926f93.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | DPN-68_ID1889_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_senet154-70a1a3c0.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | DPN-68_ID1889_for_PyTorch/timm/models/mlp_mixer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_mixer_b16_224-76587d61.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | DPN-68_ID1889_for_PyTorch/timm/models/mlp_mixer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_mixer_b16_224_in21k-617b3de2.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | DPN-68_ID1889_for_PyTorch/timm/models/mlp_mixer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_mixer_l16_224-92f9adc4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | DPN-68_ID1889_for_PyTorch/timm/models/mlp_mixer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_mixer_l16_224_in21k-846aa33c.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | DPN-68_ID1889_for_PyTorch/timm/models/mlp_mixer.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/mixer_b16_224_miil_in21k.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | DPN-68_ID1889_for_PyTorch/timm/models/mlp_mixer.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/mixer_b16_224_miil.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | DPN-68_ID1889_for_PyTorch/timm/models/mlp_mixer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gmixer_24_224_raa-7daf7ae6.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | DPN-68_ID1889_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_12_no_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | DPN-68_ID1889_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_24_no_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | DPN-68_ID1889_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_36_no_dist.pth | 下载预训练模型 | -| 开源代码引入 
| https://github.com/rwightman/pytorch-pretrained-gluonresnet | DPN-68_ID1889_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlpB_24_no_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | DPN-68_ID1889_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_12_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | DPN-68_ID1889_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_24_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | DPN-68_ID1889_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_36_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | DPN-68_ID1889_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlpB_24_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | DPN-68_ID1889_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlpB_24_22k.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | DPN-68_ID1889_for_PyTorch/timm/models/mlp_mixer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gmlp_s16_224_raa-10536d42.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p16_224.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p16_224_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p16_384_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p16_224.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p16_224_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p16_384_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p16_224.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p16_224_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p16_384_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p16_224.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p16_224_dist.pth | 下载预训练模型 | -| 开源代码引入 | 
https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p16_384_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p16_224.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p16_224_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p16_384_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p16_224.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p16_224_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p16_384_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p16_224.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p16_224_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p16_384_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p8_224.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p8_224_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p8_384_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p8_224.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p8_224_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p8_384_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p8_224.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p8_224_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | 
DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p8_384_dist.pth | 下载预训练模型 |
-| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p8_224.pth | 下载预训练模型 |
-| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p8_224_dist.pth | 下载预训练模型 |
-| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p8_384_dist.pth | 下载预训练模型 |
-| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p8_224.pth | 下载预训练模型 |
-| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p8_224_dist.pth | 下载预训练模型 |
-| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p8_384_dist.pth | 下载预训练模型 |
-| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p8_224.pth | 下载预训练模型 |
-| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p8_224_dist.pth | 下载预训练模型 |
-| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p8_384_dist.pth | 下载预训练模型 |
-| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p8_224.pth | 下载预训练模型 |
-| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p8_224_dist.pth | 下载预训练模型 |
-| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p8_384_dist.pth | 下载预训练模型 |
\ No newline at end of file
+| 文件位置 | 公网地址 | 公网地址用途 |
+|-------------------------------------------------------------------------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------------------|---------|
+| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/mkdocs.yml | https://cdnjs.cloudflare.com/ajax/libs/tablesort/5.2.1/tablesort.min.js | 开源引用说明 |
+| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/mkdocs.yml | https://cdnjs.cloudflare.com/ajax/libs/mathjax/2.7.0/MathJax.js?config=TeX-MML-AM_CHTML | 开源引用说明 |
+| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/setup.py | hello@rwightman.com | 作者邮箱 |
+| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XXS36_384.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XXS36_224.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XXS24_384.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XXS24_224.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XS24_384.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/S36_384.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/S24_384.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/S24_224.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/M48_448.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/M36_384.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/convit.py | https://dl.fbaipublicfiles.com/convit/convit_tiny.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/convit.py | https://dl.fbaipublicfiles.com/convit/convit_small.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/convit.py | https://dl.fbaipublicfiles.com/convit/convit_base.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/densenet.py | https://download.pytorch.org/models/densenet121-a639ec97.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/densenet.py | https://download.pytorch.org/models/densenet201-c1103571.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/densenet.py | https://download.pytorch.org/models/densenet169-b2777c0a.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/densenet.py | https://download.pytorch.org/models/densenet161-8d451a50.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60x_c-b870c45c.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60x-d15cacda.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60-24839fc4.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla46x_c-d761bae7.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla46_c-2bfd52c3.pth | 权重地址 |
+| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/dla.py |
http://dl.yf.io/dla/models/imagenet/dla34-ba72cf86.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla169-0914e092.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102x2-262837b6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102x-ad62be81.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102-d94d9790.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb3_pruned_5abcc29f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb2_pruned_203f55bc.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb1_pruned_9ebb3fe6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_F_Green_60ms_78.1_2855edf1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_E_Green_55ms_77.9_90f20e8a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_D_Green_50ms_77.4_23e3cdde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_C_Green_44ms_77.1_d4148c9e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_B_Green_40ms_76.5_1f882d1e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_A_Green_38ms_75.9_23474aeb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/inception_v3.py | https://download.pytorch.org/models/inception_v3_google-1a9a5a14.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-384-9bdaf2e2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-256-13b5763e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/levit.py | 
https://dl.fbaipublicfiles.com/LeViT/LeViT-192-92712e41.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-128S-96703c44.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-128-b88c2750.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/mlp_mixer.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/mixer_b16_224_miil_in21k.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/mlp_mixer.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/mixer_b16_224_miil.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlpB_24_no_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlpB_24_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlpB_24_22k.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_36_no_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_36_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_24_no_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_24_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_12_no_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_12_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/mobilenetv3.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/mobilenetv3_large_100_in21k_miil.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/mobilenetv3.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/mobilenetv3_large_100_1k_miil_78_0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/regnet.py | https://dl.fbaipublicfiles.com/deit/regnety_160-a5fe301d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x8-c38310e5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x48-3e41cc8a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x32-e4b90b00.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x16-c6f796b0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45899/outputs/ECAResNet50D_P_9c67f710.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45610/outputs/ECAResNet101D_P_75a3370e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNetLight_4f34b35b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet50D_833caf58.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet101D_281c5844.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext50_32x4-72679e44.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x8-b4712904.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | 
https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x4-3f87e46b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x16-f3559a9c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet50-16a12f1b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet18-118f1556.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext50_32x4-ddb3e555.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x8-2cfe2f8b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x4-dc43570a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x16-15fffa57.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet50-08389792.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet18-d92f0530.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/distill/R50x1_224.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/distill/R152x2_T_384.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/distill/R152x2_T_224.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x3-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x3.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x1-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x1.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/resnetv2.py | 
https://storage.googleapis.com/bit_models/BiT-M-R152x4-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x4.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x2-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x2.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x3-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x3.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x1-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x1.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/tresnet.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/tresnet_m_miil_in21k.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/tresnet.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/tresnet_m_1k_miil_83_1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg19_bn-c79401a0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg19-dcbb9e9d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg16_bn-6c64b313.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg16-397923af.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg13_bn-abd245e5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg13-c768596a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg11_bn-6002323d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg11-bbd30ac9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/vision_transformer.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/vit_base_patch16_224_in21k_miil.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/vision_transformer.py | 
https://dl.fbaipublicfiles.com/deit/deit_tiny_patch16_224-a1311bcf.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_tiny_distilled_patch16_224-b40b3cf7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_small_patch16_224-cd65a155.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_small_distilled_patch16_224-649709d9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_384-8de9b5d1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_224-b5f2ef4d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_384-d0272ac0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_224-df68dfff.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/sam/ViT-B_32.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/sam/ViT-B_16.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/imagenet21k/ViT-H_14.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/Ti_16-i21k-300ep-lr_0.001-aug_none-wd_0.03-do_0.0-sd_0.0.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/S_32-i21k-300ep-lr_0.001-aug_light1-wd_0.03-do_0.0-sd_0.0.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/S_16-i21k-300ep-lr_0.001-aug_light1-wd_0.03-do_0.0-sd_0.0.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/L_16-i21k-300ep-lr_0.001-aug_medium1-wd_0.1-do_0.1-sd_0.1.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/B_32-i21k-300ep-lr_0.001-aug_medium1-wd_0.03-do_0.0-sd_0.0.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/B_16-i21k-300ep-lr_0.001-aug_medium1-wd_0.1-do_0.0-sd_0.0.npz | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/vision_transformer.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/R50_L_32-i21k-300ep-lr_0.001-aug_medium2-wd_0.1-do_0.0-sd_0.0.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/R26_S_32-i21k-300ep-lr_0.001-aug_medium2-wd_0.03-do_0.0-sd_0.0.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/R_Ti_16-i21k-300ep-lr_0.001-aug_none-wd_0.03-do_0.0-sd_0.0.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p8_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p8_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p16_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p16_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p8_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p8_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p16_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p16_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p8_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p8_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p16_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p16_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | 
https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p8_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p8_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p16_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p16_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p8_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p8_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p16_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p16_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p8_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p8_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p16_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p16_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p8_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p8_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p16_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p16_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p8_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p16_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p8_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p16_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | 
https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p8_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p16_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p8_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p16_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p8_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p16_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p8_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p16_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p8_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DPN-68_ID1889_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p16_384_dist.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/DeepLab-CRF_ID1873_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/DeepLab-CRF_ID1873_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..7241a207b3dca86dc9a54dd83c56cadeda618e82 --- /dev/null +++ b/PyTorch/dev/cv/image_classification/DeepLab-CRF_ID1873_for_PyTorch/public_address_statement.md @@ -0,0 +1,8 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|--------------------------------------------------------------------------------------------------------------------|-----------------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeepLab-CRF_ID1873_for_PyTorch/scripts/setup_caffemodels.sh | http://liangchiehchen.com/projects/released/deeplab_aspp_resnet101/prototxt_and_model.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeepLab-CRF_ID1873_for_PyTorch/scripts/setup_cocostuff10k.sh | http://calvin.inf.ed.ac.uk/wp-content/uploads/data/cocostuffdataset/stuffthingmaps_trainval2017.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeepLab-CRF_ID1873_for_PyTorch/scripts/setup_cocostuff164k.sh | http://images.cocodataset.org/zips/val2017.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeepLab-CRF_ID1873_for_PyTorch/scripts/setup_cocostuff164k.sh | http://images.cocodataset.org/zips/train2017.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeepLab-CRF_ID1873_for_PyTorch/scripts/setup_cocostuff164k.sh | http://calvin.inf.ed.ac.uk/wp-content/uploads/data/cocostuffdataset/stuffthingmaps_trainval2017.zip | 数据集地址 | +| 
ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeepLab-CRF_ID1873_for_PyTorch/scripts/setup_voc12.sh | http://host.robots.ox.ac.uk/pascal/VOC/voc2012/VOCtrainval_11-May-2012.tar | 数据集链接 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/public_address_statement.md index 3c65931bb1cdd1af535f6b99c35b7b11d9ba503f..d43b3c3ddf2dd268fb41def73fd306cf59604d56 100644 --- a/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/public_address_statement.md +++ b/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/public_address_statement.md @@ -1,184 +1,128 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ---- | ------------ | ------ | ------------------------------------ | -------- | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mnasnet_b1-74cb7081.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mnasnet_a1-d9418771.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv2_100_ra-b33bc2c4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv2_110d_ra-77090ade.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv2_120d_ra-5987e2ed.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv2_140_ra-21a4e913.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/fbnetc_100-c345b898.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/spnasnet_100-048bc3f4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b0_ra-3dd342df.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b1-533bc792.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b2_ra-bcdf34b7.pth | 下载预训练模型 | -| 开源代码引入 | 
https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b3_ra2-cf984f9c.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b4_ra2_320-7eb33cd5.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_es_ra-f111e99c.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_em_ra2-66250f76.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/DeGirum/pruned-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/DeGirum/pruned-models/releases/download/efficientnet_v1.0/efficientnet_el.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/DeGirum/pruned-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/DeGirum/pruned-models/releases/download/efficientnet_v1.0/efficientnet_es_pruned75.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/DeGirum/pruned-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/DeGirum/pruned-models/releases/download/efficientnet_v1.0/efficientnet_el_pruned70.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_lite0_ra-37913777.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb1_pruned_9ebb3fe6.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb2_pruned_203f55bc.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb3_pruned_5abcc29f.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_v2s_ra2_288-a6477665.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b0_aa-827b6e33.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b1_aa-ea7a6ee0.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py 
| https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b2_aa-60c94f97.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b3_aa-84b4657e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b4_aa-818f208c.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b5_ra-9a3e5369.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b6_aa-80ba17e4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b7_ra-6c08e654.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b8_ra-572d5dd9.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b0_ap-f262efe1.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b1_ap-44ef0a3d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b2_ap-2f8e7636.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b3_ap-aad25bdd.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b4_ap-dedb23e6.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b5_ap-9e82fae8.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b6_ap-4ffb161f.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b7_ap-ddb28fec.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b8_ap-00e169fa.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b0_ns-c0e6a31c.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b1_ns-99dd0c41.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b2_ns-00306e48.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b3_ns-9d44bf68.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b4_ns-d6313a46.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b5_ns-6f26d0cf.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b6_ns-51548356.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b7_ns-1dbc32de.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_l2_ns_475-bebbd00a.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_l2_ns-df73bb44.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_es-ca1afbfe.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_em-e78cfe58.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_el-5143854e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_cc_b0_4e-4362b6b2.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_cc_b0_8e-66184a25.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_cc_b1_8e-f7c79ae1.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite0-0aa007d2.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite1-bde8b488.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite2-dcccb7df.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite3-b733e338.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite4-741542c3.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_s-eb54923e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_m-cc09e0cd.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_l-d664b728.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_s_21ft1k-d7dafa41.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_m_21ft1k-bf41664a.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_l_21ft1k-60127a9d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_s_21k-6337ad01.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_m_21k-361418a2.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_l_21k-91a19ec9.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_b0-c7cc451f.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_b1-be6e41b0.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_b2-847de54e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_b3-57773f13.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mixnet_s-a907afbc.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mixnet_m-4647fc68.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mixnet_l-5a9a2ed8.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mixnet_xl_ra-aac3c00c.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mixnet_s-89d3354b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mixnet_m-0f4d8805.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mixnet_l-6c92e0c8.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet18d_ra2-48a79e06.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet34-43635321.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet34d_ra2-f8dcfcaf.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet26-9aa10e23.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet26d-69e92c46.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet50_ram-a26f946b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet50d_ra2-464e36ba.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet101d_ra2-2803ffab.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet152d_ra2-5cac0439.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet200d_ra2-bdba9bf9.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/resnet.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/wide_resnet50_racm-8234f177.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnext50_32x4d_ra-d733960d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnext50d_32x4d-103e99f8.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x8-c38310e5.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x16-c6f796b0.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x32-e4b90b00.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x48-3e41cc8a.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https:h//dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet18-d92f0530.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https:h//dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet50-08389792.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https:h//dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext50_32x4-ddb3e555.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https:h//dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x4-dc43570a.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https:h//dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x8-2cfe2f8b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https:h//dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x16-15fffa57.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/resnet.py | 
https:h//dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet18-118f1556.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https:h//dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet50-16a12f1b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https:h//dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext50_32x4-72679e44.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https:h//dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x4-3f87e46b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https:h//dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x8-b4712904.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https:h//dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x16-f3559a9c.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnet50_ra_224-8efdb4bb.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnet152d_ra2-04464dd2.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext26d_32x4d-80fa48a3.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext26tn_32x4d-569cb627.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext50_32x4d_racm-a304a460.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecaresnet26t_ra2-46609757.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNetLight_4f34b35b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet50D_833caf58.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45899/outputs/ECAResNet50D_P_9c67f710.pth | 下载预训练模型 | -| 开源代码引入 | 
https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecaresnet50t_ra2-f7ac63c4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet101D_281c5844.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45610/outputs/ECAResNet101D_P_75a3370e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecaresnet269d_320_ra2-7baa55cb.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnetblur50-84f4748f.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs50_ema-6b53758b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs101_i192_ema-1509bbf6.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs152_i256_ema-a9aff7f9.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs200_ema-623d2f59.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs270_ema-b40e674c.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs350_i256_ema-5a1aa8f1.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs420_ema-972dee69.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | DeiT_ID1558_for_PyTorch/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/vit_small_p16_224-15ec54c9.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | DeiT_ID1558_for_PyTorch/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_p16_224-80ecf9dd.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | DeiT_ID1558_for_PyTorch/timm/models/vision_transformer.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_p16_384-83fb41ba.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | DeiT_ID1558_for_PyTorch/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_p32_384-830016f5.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | DeiT_ID1558_for_PyTorch/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_large_p16_224-4ee7a4dc.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | DeiT_ID1558_for_PyTorch/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_large_p16_384-b3be5167.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | DeiT_ID1558_for_PyTorch/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_large_p32_384-9b920ba8.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | DeiT_ID1558_for_PyTorch/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_patch16_224_in21k-e5005f0a.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | DeiT_ID1558_for_PyTorch/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_patch32_224_in21k-8db57226.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | DeiT_ID1558_for_PyTorch/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_large_patch16_224_in21k-606da67d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | DeiT_ID1558_for_PyTorch/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_large_patch32_224_in21k-9046d2e7.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | DeiT_ID1558_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_tiny_patch16_224-a1311bcf.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | DeiT_ID1558_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_small_patch16_224-cd65a155.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | DeiT_ID1558_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_224-b5f2ef4d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | DeiT_ID1558_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_384-8de9b5d1.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | DeiT_ID1558_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_tiny_distilled_patch16_224-b40b3cf7.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | DeiT_ID1558_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_small_distilled_patch16_224-649709d9.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | DeiT_ID1558_for_PyTorch/timm/models/vision_transformer.py | 
https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_224-df68dfff.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | DeiT_ID1558_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_384-d0272ac0.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | DeiT_ID1558_for_PyTorch/timm/models/vision_transformer.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/vit_base_patch16_224_in21k_miil.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | DeiT_ID1558_for_PyTorch/timm/models/vision_transformer.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | DeiT_ID1558_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet18_v1b-0757602b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | DeiT_ID1558_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet34_v1b-c6d82d59.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | DeiT_ID1558_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1b-0ebe02e2.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | DeiT_ID1558_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1b-3b017079.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | DeiT_ID1558_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1b-c1edb0dd.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | DeiT_ID1558_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1c-48092f55.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | DeiT_ID1558_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1c-1f26822a.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | DeiT_ID1558_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1c-a3bb0b98.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | DeiT_ID1558_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1d-818a1b1b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | DeiT_ID1558_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1d-0f9c8644.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | DeiT_ID1558_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1d-bd354e12.pth | 下载预训练模型 | -| 开源代码引入 | 
https://github.com/rwightman/pytorch-image-models | DeiT_ID1558_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1s-1762acc0.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | DeiT_ID1558_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1s-60fe0cc1.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | DeiT_ID1558_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1s-dcc41b81.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | DeiT_ID1558_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnext50_32x4d-e6a097c1.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | DeiT_ID1558_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnext101_32x4d-b253c8c4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | DeiT_ID1558_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnext101_64x4d-f9a8e184.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | DeiT_ID1558_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_seresnext50_32x4d-90cf2d6e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | DeiT_ID1558_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_seresnext101_32x4d-cf52900d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | DeiT_ID1558_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_seresnext101_64x4d-f9926f93.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models | DeiT_ID1558_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_senet154-70a1a3c0.pth | 下载预训练模型 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|----------------------------------------------------------------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------------------|----------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/cait_models.py | https://dl.fbaipublicfiles.com/deit/XXS36_384.pth | 下载权重文件 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/cait_models.py | https://dl.fbaipublicfiles.com/deit/XXS36_224.pth | 下载权重文件 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/cait_models.py | https://dl.fbaipublicfiles.com/deit/XXS24_384.pth | 下载权重文件 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/cait_models.py | https://dl.fbaipublicfiles.com/deit/XXS24_224.pth | 下载权重文件 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/cait_models.py | https://dl.fbaipublicfiles.com/deit/XS24_384.pth 
| 下载权重文件 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/cait_models.py | https://dl.fbaipublicfiles.com/deit/S36_384.pth | 下载权重文件 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/cait_models.py | https://dl.fbaipublicfiles.com/deit/S24_384.pth | 下载权重文件 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/cait_models.py | https://dl.fbaipublicfiles.com/deit/S24_224.pth | 下载权重文件 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/cait_models.py | https://dl.fbaipublicfiles.com/deit/M48_448.pth | 下载权重文件 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/cait_models.py | https://dl.fbaipublicfiles.com/deit/M36_384.pth | 下载权重文件 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/models.py | https://dl.fbaipublicfiles.com/deit/deit_tiny_patch16_224-a1311bcf.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/models.py | https://dl.fbaipublicfiles.com/deit/deit_tiny_distilled_patch16_224-b40b3cf7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/models.py | https://dl.fbaipublicfiles.com/deit/deit_small_patch16_224-cd65a155.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/models.py | https://dl.fbaipublicfiles.com/deit/deit_small_distilled_patch16_224-649709d9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/models.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_384-8de9b5d1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/models.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_224-b5f2ef4d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/models.py | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_384-d0272ac0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/models.py | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_224-df68dfff.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/resmlp_models.py | http://dl.fbaipublicfiles.com/deit/resmlpB_24_no_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/resmlp_models.py | http://dl.fbaipublicfiles.com/deit/resmlpB_24_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/resmlp_models.py | http://dl.fbaipublicfiles.com/deit/resmlpB_24_22k.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/resmlp_models.py | http://dl.fbaipublicfiles.com/deit/resmlp_36_no_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/resmlp_models.py | http://dl.fbaipublicfiles.com/deit/resmlp_36_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/resmlp_models.py | http://dl.fbaipublicfiles.com/deit/resmlp_24_no_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/resmlp_models.py | http://dl.fbaipublicfiles.com/deit/resmlp_24_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/resmlp_models.py | http://dl.fbaipublicfiles.com/deit/resmlp_24_dino.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/resmlp_models.py | http://dl.fbaipublicfiles.com/deit/resmlp_12_no_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/resmlp_models.py | http://dl.fbaipublicfiles.com/deit/resmlp_12_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XXS36_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XXS36_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XXS24_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XXS24_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XS24_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/S36_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/S24_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/S24_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/M48_448.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/M36_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/densenet.py | https://download.pytorch.org/models/densenet121-a639ec97.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/densenet.py | https://download.pytorch.org/models/densenet201-c1103571.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/densenet.py | https://download.pytorch.org/models/densenet169-b2777c0a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/densenet.py | https://download.pytorch.org/models/densenet161-8d451a50.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60x_c-b870c45c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60x-d15cacda.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60-24839fc4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla46x_c-d761bae7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla46_c-2bfd52c3.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla34-ba72cf86.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla169-0914e092.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102x2-262837b6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102x-ad62be81.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102-d94d9790.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb3_pruned_5abcc29f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb2_pruned_203f55bc.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb1_pruned_9ebb3fe6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_F_Green_60ms_78.1_2855edf1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_E_Green_55ms_77.9_90f20e8a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_D_Green_50ms_77.4_23e3cdde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_C_Green_44ms_77.1_d4148c9e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_B_Green_40ms_76.5_1f882d1e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_A_Green_38ms_75.9_23474aeb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/inception_v3.py | https://download.pytorch.org/models/inception_v3_google-1a9a5a14.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/mobilenetv3.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/mobilenetv3_large_100_in21k_miil.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/mobilenetv3.py | 
https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/mobilenetv3_large_100_1k_miil_78_0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/regnet.py | https://dl.fbaipublicfiles.com/deit/regnety_160-a5fe301d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 模型权重 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 下载预训练模型 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x8-c38310e5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x48-3e41cc8a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x32-e4b90b00.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x16-c6f796b0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45899/outputs/ECAResNet50D_P_9c67f710.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45610/outputs/ECAResNet101D_P_75a3370e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNetLight_4f34b35b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet50D_833caf58.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/resnet.py | 
https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet101D_281c5844.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext50_32x4-72679e44.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x8-b4712904.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x4-3f87e46b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x16-f3559a9c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet50-16a12f1b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet18-118f1556.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext50_32x4-ddb3e555.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x8-2cfe2f8b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x4-dc43570a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x16-15fffa57.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet50-08389792.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet18-d92f0530.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x3-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x3.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x1-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x1.npz | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x4-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x4.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x2-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x2.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x3-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x3.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x1-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x1.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/tresnet.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/tresnet_m_miil_in21k.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/tresnet.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/tresnet_m_1k_miil_83_1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg19_bn-c79401a0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg19-dcbb9e9d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg16_bn-6c64b313.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg16-397923af.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg13_bn-abd245e5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg13-c768596a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg11_bn-6002323d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg11-bbd30ac9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/vision_transformer.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/vit_base_patch16_224_in21k_miil.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_tiny_patch16_224-a1311bcf.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_tiny_distilled_patch16_224-b40b3cf7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_small_patch16_224-cd65a155.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_small_distilled_patch16_224-649709d9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_384-8de9b5d1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_224-b5f2ef4d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_384-d0272ac0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_224-df68dfff.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/DeiT_ID1558_for_PyTorch/timm/models/vision_transformer.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm | 权重地址 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/EfficientNet-B6_ID1715_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/EfficientNet-B6_ID1715_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..94cd25a8a4e5de925c93a6cef2ec948269db4771 --- /dev/null +++ b/PyTorch/dev/cv/image_classification/EfficientNet-B6_ID1715_for_PyTorch/public_address_statement.md @@ -0,0 +1,5 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|-------------------------------------------------------------------------------------------------------------------|-----------------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/EfficientNet-B6_ID1715_for_PyTorch/examples/imagenet/main.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/EfficientNet-B6_ID1715_for_PyTorch/modelarts/train_start.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/EfficientNet-B6_ID1715_for_PyTorch/setup.py | lmelaskyriazi@college.harvard.edu | 作者邮箱 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/EfficientNet-B7_ID1716_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/EfficientNet-B7_ID1716_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..179905e968be3cade568d46142321193ad5dc704 --- /dev/null +++ b/PyTorch/dev/cv/image_classification/EfficientNet-B7_ID1716_for_PyTorch/public_address_statement.md @@ -0,0 +1,5 @@ +| 文件位置 | 公网地址 | 公网地址用途 | 
+|-------------------------------------------------------------------------------------------------------------------|-----------------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/EfficientNet-B7_ID1716_for_PyTorch/examples/imagenet/main.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/EfficientNet-B7_ID1716_for_PyTorch/modelarts/train_start.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/EfficientNet-B7_ID1716_for_PyTorch/setup.py | lmelaskyriazi@college.harvard.edu | 作者邮箱 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/FasterRCNN_ID0100_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/FasterRCNN_ID0100_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..46f9105fa88c125477c7150d9c6a8d47802424de --- /dev/null +++ b/PyTorch/dev/cv/image_classification/FasterRCNN_ID0100_for_PyTorch/public_address_statement.md @@ -0,0 +1,14 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|-----------------------------------------------------------------------------------------------------------------------------|---------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/FasterRCNN_ID0100_for_PyTorch/datasets/prepare_for_tests.sh | https://dl.fbaipublicfiles.com/detectron2 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/FasterRCNN_ID0100_for_PyTorch/datasets/prepare_panoptic_fpn.py | https://dl.fbaipublicfiles.com/detectron2/ | 下载权重文件 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/FasterRCNN_ID0100_for_PyTorch/detectron2/engine/defaults.py | https://pytorch.org/docs/stable/distributed.html for details | 说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/FasterRCNN_ID0100_for_PyTorch/detectron2/evaluation/coco_evaluation.py | http://cocodataset.org/#keypoints-eval | 设置说明 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/FasterRCNN_ID0100_for_PyTorch/detectron2/model_zoo/model_zoo.py | https://dl.fbaipublicfiles.com/detectron2/ | 下载权重文件 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/FasterRCNN_ID0100_for_PyTorch/dev/packaging/build_wheel.sh | https://download.pytorch.org/whl/"$CU_VERSION"/torch_stable.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/FasterRCNN_ID0100_for_PyTorch/dev/packaging/gen_install_table.py | https://dl.fbaipublicfiles.com/detectron2/wheels/{cuda}/torch{torch}/index.html | 下载权重文件 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/FasterRCNN_ID0100_for_PyTorch/docker/Dockerfile | https://bootstrap.pypa.io/get-pip.py | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/FasterRCNN_ID0100_for_PyTorch/docker/Dockerfile | https://download.pytorch.org/whl/cu101/torch_stable.html | 三方库下载 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/FasterRCNN_ID0100_for_PyTorch/docker/Dockerfile-circleci | https://bootstrap.pypa.io/get-pip.py | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/FasterRCNN_ID0100_for_PyTorch/docker/Dockerfile-circleci | https://download.pytorch.org/whl/cu101/torch_stable.html | 三方库下载 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/FasterRCNN_ID0100_for_PyTorch/tools/convert-torchvision-to-d2.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | \ No newline at end of file diff --git 
a/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/public_address_statement.md index 20c15b50cf6a0cc5a466b5388bdc63f25cfba60a..e287762a65cf12dbe0825b275920e57f51d49de2 100644 --- a/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/public_address_statement.md +++ b/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/public_address_statement.md @@ -1,161 +1,77 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ---- | ------------ | ------ | ------------------------------------ | -------- | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | GhostNet_ID1622_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet18_v1b-0757602b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | GhostNet_ID1622_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet34_v1b-c6d82d59.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | GhostNet_ID1622_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1b-0ebe02e2.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | GhostNet_ID1622_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1b-3b017079.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | GhostNet_ID1622_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1b-c1edb0dd.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | GhostNet_ID1622_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1c-48092f55.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | GhostNet_ID1622_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1c-1f26822a.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | GhostNet_ID1622_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1c-a3bb0b98.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | GhostNet_ID1622_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1d-818a1b1b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | GhostNet_ID1622_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1d-0f9c8644.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | GhostNet_ID1622_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1d-bd354e12.pth | 下载预训练模型 | -| 开源代码引入 | 
https://github.com/rwightman/pytorch-pretrained-gluonresnet | GhostNet_ID1622_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1s-1762acc0.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | GhostNet_ID1622_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1s-60fe0cc1.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | GhostNet_ID1622_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1s-dcc41b81.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | GhostNet_ID1622_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnext50_32x4d-e6a097c1.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | GhostNet_ID1622_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnext101_32x4d-b253c8c4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | GhostNet_ID1622_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnext101_64x4d-f9a8e184.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | GhostNet_ID1622_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_seresnext50_32x4d-90cf2d6e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | GhostNet_ID1622_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_seresnext101_32x4d-cf52900d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | GhostNet_ID1622_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_seresnext101_64x4d-f9926f93.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | GhostNet_ID1622_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_senet154-70a1a3c0.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | GhostNet_ID1622_for_PyTorch/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/vit_small_p16_224-15ec54c9.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | GhostNet_ID1622_for_PyTorch/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_p16_224-80ecf9dd.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | GhostNet_ID1622_for_PyTorch/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_p16_384-83fb41ba.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | GhostNet_ID1622_for_PyTorch/timm/models/vision_transformer.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_p32_384-830016f5.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | GhostNet_ID1622_for_PyTorch/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_large_p16_224-4ee7a4dc.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | GhostNet_ID1622_for_PyTorch/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_large_p16_384-b3be5167.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | GhostNet_ID1622_for_PyTorch/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_large_p32_384-9b920ba8.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | GhostNet_ID1622_for_PyTorch/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_patch16_224_in21k-e5005f0a.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | GhostNet_ID1622_for_PyTorch/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_patch32_224_in21k-8db57226.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | GhostNet_ID1622_for_PyTorch/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_large_patch16_224_in21k-606da67d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | GhostNet_ID1622_for_PyTorch/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_large_patch32_224_in21k-9046d2e7.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | GhostNet_ID1622_for_PyTorch/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_resnet50_224_in21k-6f7c7740.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | GhostNet_ID1622_for_PyTorch/timm/models/vision_transformer.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_vit_base_resnet50_384-9fd3c705.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | GhostNet_ID1622_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_tiny_patch16_224-a1311bcf.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | GhostNet_ID1622_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_small_patch16_224-cd65a155.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | GhostNet_ID1622_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_224-b5f2ef4d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | GhostNet_ID1622_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_384-8de9b5d1.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | 
GhostNet_ID1622_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_tiny_distilled_patch16_224-b40b3cf7.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | GhostNet_ID1622_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_small_distilled_patch16_224-649709d9.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | GhostNet_ID1622_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_224-df68dfff.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | GhostNet_ID1622_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_384-d0272ac0.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mnasnet_b1-74cb7081.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mnasnet_a1-d9418771.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv2_100_ra-b33bc2c4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv2_110d_ra-77090ade.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv2_120d_ra-5987e2ed.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv2_140_ra-21a4e913.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/fbnetc_100-c345b898.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/spnasnet_100-048bc3f4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b0_ra-3dd342df.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b1-533bc792.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b2_ra-bcdf34b7.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b2_ra-bcdf34b7.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b3_ra2-cf984f9c.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b3_ra2-cf984f9c.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_es_ra-f111e99c.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_em_ra2-66250f76.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_lite0_ra-37913777.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb1_pruned_9ebb3fe6.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb2_pruned_203f55bc.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb3_pruned_5abcc29f.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b0_aa-827b6e33.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b1_aa-ea7a6ee0.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b2_aa-60c94f97.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b3_aa-84b4657e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py 
| https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b4_aa-818f208c.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b5_ra-9a3e5369.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b6_aa-80ba17e4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b7_ra-6c08e654.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b8_ra-572d5dd9.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b0_ap-f262efe1.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b1_ap-44ef0a3d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b2_ap-2f8e7636.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b3_ap-aad25bdd.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b4_ap-dedb23e6.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b5_ap-9e82fae8.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b6_ap-4ffb161f.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b7_ap-ddb28fec.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b8_ap-00e169fa.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | 
GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b0_ns-c0e6a31c.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b1_ns-99dd0c41.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b2_ns-00306e48.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b3_ns-9d44bf68.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b4_ns-d6313a46.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b5_ns-6f26d0cf.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b6_ns-51548356.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b7_ns-1dbc32de.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_l2_ns_475-bebbd00a.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_l2_ns-df73bb44.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_es-ca1afbfe.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_em-e78cfe58.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_el-5143854e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_cc_b0_4e-4362b6b2.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | 
GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_cc_b0_8e-66184a25.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_cc_b1_8e-f7c79ae1.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite0-0aa007d2.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite1-bde8b488.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite2-dcccb7df.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite3-b733e338.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite4-741542c3.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mixnet_s-a907afbc.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mixnet_m-4647fc68.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mixnet_l-5a9a2ed8.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mixnet_xl_ra-aac3c00c.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mixnet_s-89d3354b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mixnet_m-0f4d8805.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mixnet_l-6c92e0c8.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | 
https://download.pytorch.org/models/resnet18-5c106cde.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet18d_ra2-48a79e06.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet34-43635321.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet34d_ra2-f8dcfcaf.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet26-9aa10e23.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet26d-69e92c46.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet50_ram-a26f946b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet50d_ra2-464e36ba.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet101d_ra2-2803ffab.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet152d_ra2-5cac0439.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet200d_ra2-bdba9bf9.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/wide_resnet50_racm-8234f177.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | 
GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnext50_32x4d_ra-d733960d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnext50d_32x4d-103e99f8.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x8-c38310e5.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x16-c6f796b0.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x32-e4b90b00.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x48-3e41cc8a.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet18-d92f0530.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet50-08389792.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext50_32x4-ddb3e555.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x4-dc43570a.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x8-2cfe2f8b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x16-15fffa57.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet18-118f1556.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | 
GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet50-16a12f1b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext50_32x4-72679e44.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x4-3f87e46b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x8-b4712904.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x16-f3559a9c.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnet50_ra_224-8efdb4bb.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnet152d_ra2-04464dd2.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext26d_32x4d-80fa48a3.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext26tn_32x4d-569cb627.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext50_32x4d_racm-a304a460.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecaresnet26t_ra2-46609757.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNetLight_4f34b35b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet50D_833caf58.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45899/outputs/ECAResNet50D_P_9c67f710.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecaresnet50t_ra2-f7ac63c4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet101D_281c5844.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45610/outputs/ECAResNet101D_P_75a3370e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecaresnet269d_320_ra2-7baa55cb.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnetblur50-84f4748f.pth | 下载预训练模型 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|--------------------------------------------------------------------------------------------------------------------|--------------------------------------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/densenet.py | https://download.pytorch.org/models/densenet121-a639ec97.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/densenet.py | https://download.pytorch.org/models/densenet201-c1103571.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/densenet.py | https://download.pytorch.org/models/densenet169-b2777c0a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/densenet.py | https://download.pytorch.org/models/densenet161-8d451a50.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60x_c-b870c45c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60x-d15cacda.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60-24839fc4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla46x_c-d761bae7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla46_c-2bfd52c3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla34-ba72cf86.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla169-0914e092.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102x2-262837b6.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102x-ad62be81.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102-d94d9790.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb3_pruned_5abcc29f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb2_pruned_203f55bc.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb1_pruned_9ebb3fe6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/inception_v3.py | https://download.pytorch.org/models/inception_v3_google-1a9a5a14.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x8-c38310e5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x48-3e41cc8a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x32-e4b90b00.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x16-c6f796b0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | 
https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45899/outputs/ECAResNet50D_P_9c67f710.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45610/outputs/ECAResNet101D_P_75a3370e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNetLight_4f34b35b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet50D_833caf58.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet101D_281c5844.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext50_32x4-72679e44.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x8-b4712904.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x4-3f87e46b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x16-f3559a9c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet50-16a12f1b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet18-118f1556.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext50_32x4-ddb3e555.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x8-2cfe2f8b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x4-dc43570a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x16-15fffa57.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | 
https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet50-08389792.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet18-d92f0530.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x3-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x3.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x1-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x1.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x4-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x4.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x2-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x2.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x3-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x3.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x1-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x1.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg19_bn-c79401a0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg19-dcbb9e9d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg16_bn-6c64b313.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg16-397923af.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg13_bn-abd245e5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg13-c768596a.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg11_bn-6002323d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg11-bbd30ac9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_tiny_patch16_224-a1311bcf.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_tiny_distilled_patch16_224-b40b3cf7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_small_patch16_224-cd65a155.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_small_distilled_patch16_224-649709d9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_384-8de9b5d1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_224-b5f2ef4d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_384-d0272ac0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/GhostNet_ID1622_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_224-df68dfff.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/InceptionV3_ID0445_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/InceptionV3_ID0445_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..67bb541a2617e754f055a1145d6f173981020dd1 --- /dev/null +++ b/PyTorch/dev/cv/image_classification/InceptionV3_ID0445_for_PyTorch/public_address_statement.md @@ -0,0 +1,3 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|--------------------------------------------------------------------------------------------------|----------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/InceptionV3_ID0445_for_PyTorch/inception.py | https://download.pytorch.org/models/inception_v3_google-1a9a5a14.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/InceptionV4_ID0444_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/InceptionV4_ID0444_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..2604232fec845bcff86fc84936de18ab605df1a5 --- /dev/null +++ b/PyTorch/dev/cv/image_classification/InceptionV4_ID0444_for_PyTorch/public_address_statement.md @@ -0,0 +1,3 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|-----------------------------------------------------------------------------------------------|----------------------------------|---------| +| 
ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/InceptionV4_ID0444_for_PyTorch/common.py | https://arxiv.org/abs/1709.01507 | 论文地址 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/Keyword-MLP_ID2441_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/Keyword-MLP_ID2441_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..b3da4ea1c27e17d8ee495b2d9a2437359592f089 --- /dev/null +++ b/PyTorch/dev/cv/image_classification/Keyword-MLP_ID2441_for_PyTorch/public_address_statement.md @@ -0,0 +1,3 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|------------------------------------------------------------------------------------------------------------|------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Keyword-MLP_ID2441_for_PyTorch/download_gspeech_v2.sh | http://download.tensorflow.org/data/speech_commands_v0.02.tar.gz | 数据集链接 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/LADE_ID2445_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/LADE_ID2445_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..7c508e2fdc20c0fb3e15ed77332bc4a1ee90204a --- /dev/null +++ b/PyTorch/dev/cv/image_classification/LADE_ID2445_for_PyTorch/public_address_statement.md @@ -0,0 +1,3 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|-----------------------------------------------------------------------------------------------------|----------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/LADE_ID2445_for_PyTorch/data/ImbalanceCIFAR.py | https://www.cs.toronto.edu/~kriz/cifar-100-python.tar.gz | 数据集链接 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/MMAL-NET_ID1116_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/MMAL-NET_ID1116_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..bfa5bd7287e80dbdce5242470f5942b11f31423d --- /dev/null +++ b/PyTorch/dev/cv/image_classification/MMAL-NET_ID1116_for_PyTorch/public_address_statement.md @@ -0,0 +1,15 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|---------------------------------------------------------------------------------------------------------|-------------------------------------------------------------------|----------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MMAL-NET_ID1116_for_PyTorch/envconfig/env_list.yml | https://mirrors.ustc.edu.cn/anaconda/pkgs/main/ | 镜像地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MMAL-NET_ID1116_for_PyTorch/envconfig/env_list.yml | https://mirrors.ustc.edu.cn/anaconda/pkgs/main | 镜像地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MMAL-NET_ID1116_for_PyTorch/envconfig/env_list.yml | https://mirrors.ustc.edu.cn/anaconda/cloud/conda-forge/ | 镜像地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MMAL-NET_ID1116_for_PyTorch/envconfig/env_list.yml | https://mirrors.tuna.tsinghua.edu.cn/anaconda/pkgs/free/ | 镜像地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MMAL-NET_ID1116_for_PyTorch/networks/resnet.py | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 模型权重 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MMAL-NET_ID1116_for_PyTorch/networks/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MMAL-NET_ID1116_for_PyTorch/networks/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MMAL-NET_ID1116_for_PyTorch/networks/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MMAL-NET_ID1116_for_PyTorch/networks/resnet.py | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MMAL-NET_ID1116_for_PyTorch/networks/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MMAL-NET_ID1116_for_PyTorch/networks/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MMAL-NET_ID1116_for_PyTorch/networks/resnet.py | https://download.pytorch.org/models/wide_resnet50_2-95faca4d.pth | 下载预训练模型 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MMAL-NET_ID1116_for_PyTorch/networks/resnet.py | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 下载预训练模型 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/public_address_statement.md index 6f59cb51936e4c564afb82f1238be4231b65b3d2..3d781e268b5ce9f6603b05ac2c57b45ffbcc0b56 100644 --- a/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/public_address_statement.md +++ b/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/public_address_statement.md @@ -1,227 +1,182 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ---- | ------------ | ------ | ------------------------------------ | -------- | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mnasnet_b1-74cb7081.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mnasnet_a1-d9418771.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv2_100_ra-b33bc2c4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv2_110d_ra-77090ade.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv2_120d_ra-5987e2ed.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mobilenetv2_140_ra-21a4e913.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/fbnetc_100-c345b898.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/spnasnet_100-048bc3f4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b0_ra-3dd342df.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b1-533bc792.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b2_ra-bcdf34b7.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b3_ra2-cf984f9c.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_b4_ra2_320-7eb33cd5.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_es_ra-f111e99c.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_em_ra2-66250f76.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/DeGirum/pruned-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/DeGirum/pruned-models/releases/download/efficientnet_v1.0/efficientnet_el.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/DeGirum/pruned-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/DeGirum/pruned-models/releases/download/efficientnet_v1.0/efficientnet_es_pruned75.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/DeGirum/pruned-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/DeGirum/pruned-models/releases/download/efficientnet_v1.0/efficientnet_el_pruned70.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_lite0_ra-37913777.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb1_pruned_9ebb3fe6.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | 
https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb2_pruned_203f55bc.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb3_pruned_5abcc29f.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnetv2_t_agc-3620981a.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gc_efficientnetv2_rw_t_agc-927a0bde.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnet_v2s_ra2_288-a6477665.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/efficientnetv2_rw_m_agc-3d90cb1e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b0_aa-827b6e33.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b1_aa-ea7a6ee0.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b2_aa-60c94f97.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b3_aa-84b4657e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b4_aa-818f208c.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b5_ra-9a3e5369.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b6_aa-80ba17e4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b7_ra-6c08e654.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | 
MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b8_ra-572d5dd9.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b0_ap-f262efe1.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b1_ap-44ef0a3d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b2_ap-2f8e7636.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b3_ap-aad25bdd.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b4_ap-dedb23e6.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b5_ap-9e82fae8.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b6_ap-4ffb161f.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b7_ap-ddb28fec.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b8_ap-00e169fa.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b0_ns-c0e6a31c.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b1_ns-99dd0c41.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b2_ns-00306e48.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b3_ns-9d44bf68.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | 
MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b4_ns-d6313a46.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b5_ns-6f26d0cf.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b6_ns-51548356.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_b7_ns-1dbc32de.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_l2_ns_475-bebbd00a.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_l2_ns-df73bb44.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_es-ca1afbfe.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_em-e78cfe58.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_el-5143854e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_cc_b0_4e-4362b6b2.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_cc_b0_8e-66184a25.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_cc_b1_8e-f7c79ae1.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite0-0aa007d2.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite1-bde8b488.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | 
MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite2-dcccb7df.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite3-b733e338.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_efficientnet_lite4-741542c3.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_s-eb54923e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_m-cc09e0cd.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_l-d664b728.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_s_21ft1k-d7dafa41.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_m_21ft1k-bf41664a.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_l_21ft1k-60127a9d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_xl_in21ft1k-06c35c48.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_s_21k-6337ad01.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_m_21k-361418a2.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_l_21k-91a19ec9.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_xl_in21k-fd7e8abf.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_b0-c7cc451f.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_b1-be6e41b0.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_b2-847de54e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-effv2-weights/tf_efficientnetv2_b3-57773f13.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mixnet_s-a907afbc.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mixnet_m-4647fc68.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mixnet_l-5a9a2ed8.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/mixnet_xl_ra-aac3c00c.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mixnet_s-89d3354b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mixnet_m-0f4d8805.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/tf_mixnet_l-6c92e0c8.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | MNasNet_ID1723_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet18_v1b-0757602b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | MNasNet_ID1723_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet34_v1b-c6d82d59.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | MNasNet_ID1723_for_PyTorch/timm/models/gluon_resnet.py | 
https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1b-0ebe02e2.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | MNasNet_ID1723_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1b-3b017079.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | MNasNet_ID1723_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1b-c1edb0dd.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | MNasNet_ID1723_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1c-48092f55.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | MNasNet_ID1723_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1c-1f26822a.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | MNasNet_ID1723_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1c-a3bb0b98.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | MNasNet_ID1723_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1d-818a1b1b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | MNasNet_ID1723_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1d-0f9c8644.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | MNasNet_ID1723_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1d-bd354e12.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | MNasNet_ID1723_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet50_v1s-1762acc0.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | MNasNet_ID1723_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet101_v1s-60fe0cc1.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | MNasNet_ID1723_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnet152_v1s-dcc41b81.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | MNasNet_ID1723_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnext50_32x4d-e6a097c1.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | MNasNet_ID1723_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnext101_32x4d-b253c8c4.pth | 下载预训练模型 | -| 开源代码引入 | 
https://github.com/rwightman/pytorch-pretrained-gluonresnet | MNasNet_ID1723_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_resnext101_64x4d-f9a8e184.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | MNasNet_ID1723_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_seresnext50_32x4d-90cf2d6e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | MNasNet_ID1723_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_seresnext101_32x4d-cf52900d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | MNasNet_ID1723_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_seresnext101_64x4d-f9926f93.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | MNasNet_ID1723_for_PyTorch/timm/models/gluon_resnet.py | https://github.com/rwightman/pytorch-pretrained-gluonresnet/releases/download/v0.1/gluon_senet154-70a1a3c0.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | MNasNet_ID1723_for_PyTorch/timm/models/mlp_mixer.py| https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_mixer_b16_224-76587d61.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | MNasNet_ID1723_for_PyTorch/timm/models/mlp_mixer.py| https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_mixer_b16_224_in21k-617b3de2.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | MNasNet_ID1723_for_PyTorch/timm/models/mlp_mixer.py| https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_mixer_l16_224-92f9adc4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | MNasNet_ID1723_for_PyTorch/timm/models/mlp_mixer.py| https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-vitjx/jx_mixer_l16_224_in21k-846aa33c.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | MNasNet_ID1723_for_PyTorch/timm/models/mlp_mixer.py| https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/mixer_b16_224_miil_in21k.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | MNasNet_ID1723_for_PyTorch/timm/models/mlp_mixer.py| https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/mixer_b16_224_miil.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | MNasNet_ID1723_for_PyTorch/timm/models/mlp_mixer.py| https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gmixer_24_224_raa-7daf7ae6.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | MNasNet_ID1723_for_PyTorch/timm/models/mlp_mixer.py| https://dl.fbaipublicfiles.com/deit/resmlp_12_no_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | MNasNet_ID1723_for_PyTorch/timm/models/mlp_mixer.py| https://dl.fbaipublicfiles.com/deit/resmlp_24_no_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | 
MNasNet_ID1723_for_PyTorch/timm/models/mlp_mixer.py| https://dl.fbaipublicfiles.com/deit/resmlp_36_no_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | MNasNet_ID1723_for_PyTorch/timm/models/mlp_mixer.py| https://dl.fbaipublicfiles.com/deit/resmlpB_24_no_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | MNasNet_ID1723_for_PyTorch/timm/models/mlp_mixer.py| https://dl.fbaipublicfiles.com/deit/resmlp_12_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | MNasNet_ID1723_for_PyTorch/timm/models/mlp_mixer.py| https://dl.fbaipublicfiles.com/deit/resmlp_24_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | MNasNet_ID1723_for_PyTorch/timm/models/mlp_mixer.py| https://dl.fbaipublicfiles.com/deit/resmlp_36_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | MNasNet_ID1723_for_PyTorch/timm/models/mlp_mixer.py| https://dl.fbaipublicfiles.com/deit/resmlpB_24_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | MNasNet_ID1723_for_PyTorch/timm/models/mlp_mixer.py| https://dl.fbaipublicfiles.com/deit/resmlpB_24_22k.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-pretrained-gluonresnet | MNasNet_ID1723_for_PyTorch/timm/models/mlp_mixer.py| https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/gmlp_s16_224_raa-10536d42.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet18d_ra2-48a79e06.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet34-43635321.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet34d_ra2-f8dcfcaf.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet26-9aa10e23.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet26d-69e92c46.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet50_ram-a26f946b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet50d_ra2-464e36ba.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | 
https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet101d_ra2-2803ffab.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet152d_ra2-5cac0439.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnet200d_ra2-bdba9bf9.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/wide_resnet50_racm-8234f177.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnext50_32x4d_ra-d733960d.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnext50d_32x4d-103e99f8.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x8-c38310e5.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x16-c6f796b0.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x32-e4b90b00.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x48-3e41cc8a.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | 
MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet18-d92f0530.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet50-08389792.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext50_32x4-ddb3e555.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x4-dc43570a.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x8-2cfe2f8b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x16-15fffa57.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet18-118f1556.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet50-16a12f1b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext50_32x4-72679e44.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x4-3f87e46b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x8-b4712904.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x16-f3559a9c.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnet50_ra_224-8efdb4bb.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnet152d_ra2-04464dd2.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext26d_32x4d-80fa48a3.pth | 下载预训练模型 | -| 开源代码引入 | 
https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext26tn_32x4d-569cb627.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/seresnext50_32x4d_racm-a304a460.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecaresnet26t_ra2-46609757.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNetLight_4f34b35b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet50D_833caf58.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45899/outputs/ECAResNet50D_P_9c67f710.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecaresnet50t_ra2-f7ac63c4.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet101D_281c5844.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45610/outputs/ECAResNet101D_P_75a3370e.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/ecaresnet269d_320_ra2-7baa55cb.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-weights/resnetblur50-84f4748f.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs50_ema-6b53758b.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs101_i192_ema-1509bbf6.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs152_i256_ema-a9aff7f9.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | 
MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs200_ema-623d2f59.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs270_ema-b40e674c.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs350_i256_ema-5a1aa8f1.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://github.com/rwightman/pytorch-image-models/releases/download/v0.1-rs-weights/resnetrs420_ema-972dee69.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p16_224.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p16_224_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p16_384_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p16_224.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p16_224_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p16_384_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p16_224.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p16_224_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p16_384_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p16_224.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p16_224_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p16_384_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p16_224.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | 
https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p16_224_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p16_384_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p16_224.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p16_224_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p16_384_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p16_224.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p16_224_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p16_384_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p8_224.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p8_224_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p8_384_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p8_224.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p8_224_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p8_384_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p8_224.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p8_224_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p8_384_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p8_224.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p8_224_dist.pth | 
下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p8_384_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p8_224.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p8_224_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p8_384_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p8_224.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p8_224_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p8_384_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p8_224.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p8_224_dist.pth | 下载预训练模型 | -| 开源代码引入 | https://github.com/rwightman/pytorch-image-models.git | MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p8_384_dist.pth | 下载预训练模型 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|--------------------------------------------------------------------------------------------------------------------------|----------------------------------------------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/.github/workflows/tests.yml | https://download.pytorch.org/whl/torch_stable.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/mkdocs.yml | https://cdnjs.cloudflare.com/ajax/libs/tablesort/5.2.1/tablesort.min.js | 开源引用说明 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/mkdocs.yml | https://cdnjs.cloudflare.com/ajax/libs/mathjax/2.7.0/MathJax.js?config=TeX-MML-AM_CHTML | 开源引用说明 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/setup.py | hello@rwightman.com | 作者邮箱 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XXS36_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XXS36_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XXS24_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/cait.py | 
https://dl.fbaipublicfiles.com/deit/XXS24_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/XS24_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/S36_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/S24_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/S24_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/M48_448.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/cait.py | https://dl.fbaipublicfiles.com/deit/M36_384.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/convit.py | https://dl.fbaipublicfiles.com/convit/convit_tiny.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/convit.py | https://dl.fbaipublicfiles.com/convit/convit_small.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/convit.py | https://dl.fbaipublicfiles.com/convit/convit_base.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/densenet.py | https://download.pytorch.org/models/densenet121-a639ec97.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/densenet.py | https://download.pytorch.org/models/densenet201-c1103571.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/densenet.py | https://download.pytorch.org/models/densenet169-b2777c0a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/densenet.py | https://download.pytorch.org/models/densenet161-8d451a50.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60x_c-b870c45c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60x-d15cacda.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla60-24839fc4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla46x_c-d761bae7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla46_c-2bfd52c3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla34-ba72cf86.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla169-0914e092.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/dla.py | 
http://dl.yf.io/dla/models/imagenet/dla102x2-262837b6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102x-ad62be81.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/dla.py | http://dl.yf.io/dla/models/imagenet/dla102-d94d9790.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb3_pruned_5abcc29f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb2_pruned_203f55bc.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/efficientnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45403/outputs/effnetb1_pruned_9ebb3fe6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_F_Green_60ms_78.1_2855edf1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_E_Green_55ms_77.9_90f20e8a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_D_Green_50ms_77.4_23e3cdde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_C_Green_44ms_77.1_d4148c9e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_B_Green_40ms_76.5_1f882d1e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/hardcorenas.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/public/HardCoReNAS/HardCoreNAS_A_Green_38ms_75.9_23474aeb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/inception_v3.py | https://download.pytorch.org/models/inception_v3_google-1a9a5a14.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-384-9bdaf2e2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-256-13b5763e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-192-92712e41.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/levit.py | https://dl.fbaipublicfiles.com/LeViT/LeViT-128S-96703c44.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/levit.py | 
https://dl.fbaipublicfiles.com/LeViT/LeViT-128-b88c2750.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/mlp_mixer.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/mixer_b16_224_miil_in21k.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/mlp_mixer.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/mixer_b16_224_miil.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlpB_24_no_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlpB_24_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlpB_24_22k.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_36_no_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_36_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_24_no_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_24_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_12_no_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/mlp_mixer.py | https://dl.fbaipublicfiles.com/deit/resmlp_12_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/mobilenetv3.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/mobilenetv3_large_100_in21k_miil.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/mobilenetv3.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/mobilenetv3_large_100_1k_miil_78_0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/regnet.py | https://dl.fbaipublicfiles.com/deit/regnety_160-a5fe301d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x8-c38310e5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x48-3e41cc8a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x32-e4b90b00.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://download.pytorch.org/models/ig_resnext101_32x16-c6f796b0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45899/outputs/ECAResNet50D_P_9c67f710.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45610/outputs/ECAResNet101D_P_75a3370e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNetLight_4f34b35b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet50D_833caf58.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://imvl-automl-sh.oss-cn-shanghai.aliyuncs.com/darts/hyperml/hyperml/job_45402/outputs/ECAResNet101D_281c5844.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext50_32x4-72679e44.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x8-b4712904.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x4-3f87e46b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnext101_32x16-f3559a9c.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet50-16a12f1b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_weakly_supervised_resnet18-118f1556.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext50_32x4-ddb3e555.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x8-2cfe2f8b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x4-dc43570a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnext101_32x16-15fffa57.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet50-08389792.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/resnet.py | https://dl.fbaipublicfiles.com/semiweaksupervision/model_files/semi_supervised_resnet18-d92f0530.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/distill/R50x1_224.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/distill/R152x2_T_384.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/distill/R152x2_T_224.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x3-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x3.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x1-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R50x1.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x4-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x4.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/resnetv2.py | 
https://storage.googleapis.com/bit_models/BiT-M-R152x2-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R152x2.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x3-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x3.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x1-ILSVRC2012.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/resnetv2.py | https://storage.googleapis.com/bit_models/BiT-M-R101x1.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/tresnet.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/tresnet_m_miil_in21k.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/tresnet.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/tresnet_m_1k_miil_83_1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg19_bn-c79401a0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg19-dcbb9e9d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg16_bn-6c64b313.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg16-397923af.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg13_bn-abd245e5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg13-c768596a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg11_bn-6002323d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/vgg.py | https://download.pytorch.org/models/vgg11-bbd30ac9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/vision_transformer.py | https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm/vit_base_patch16_224_in21k_miil.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_tiny_patch16_224-a1311bcf.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_tiny_distilled_patch16_224-b40b3cf7.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_small_patch16_224-cd65a155.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_small_distilled_patch16_224-649709d9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_384-8de9b5d1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_224-b5f2ef4d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_384-d0272ac0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/vision_transformer.py | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_224-df68dfff.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/sam/ViT-B_32.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/sam/ViT-B_16.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/imagenet21k/ViT-H_14.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/Ti_16-i21k-300ep-lr_0.001-aug_none-wd_0.03-do_0.0-sd_0.0.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/S_32-i21k-300ep-lr_0.001-aug_light1-wd_0.03-do_0.0-sd_0.0.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/S_16-i21k-300ep-lr_0.001-aug_light1-wd_0.03-do_0.0-sd_0.0.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/L_16-i21k-300ep-lr_0.001-aug_medium1-wd_0.1-do_0.1-sd_0.1.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/B_32-i21k-300ep-lr_0.001-aug_medium1-wd_0.03-do_0.0-sd_0.0.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/B_16-i21k-300ep-lr_0.001-aug_medium1-wd_0.1-do_0.0-sd_0.0.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/vision_transformer.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/vision_transformer.py | 
https://miil-public-eu.oss-eu-central-1.aliyuncs.com/model-zoo/ImageNet_21K_P/models/timm | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/R50_L_32-i21k-300ep-lr_0.001-aug_medium2-wd_0.1-do_0.0-sd_0.0.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/R26_S_32-i21k-300ep-lr_0.001-aug_medium2-wd_0.03-do_0.0-sd_0.0.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/R_Ti_16-i21k-300ep-lr_0.001-aug_none-wd_0.03-do_0.0-sd_0.0.npz | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/vision_transformer_hybrid.py | https://storage.googleapis.com/vit_models/augreg/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p8_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p8_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p16_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p16_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p8_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p8_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p16_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p16_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p8_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p8_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p16_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p16_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p8_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p8_224.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p16_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p16_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p8_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p8_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p16_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p16_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p8_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p8_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p16_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p16_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p8_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p8_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p16_224_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p16_224.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p8_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_24_p16_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p8_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_tiny_12_p16_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p8_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | 
https://dl.fbaipublicfiles.com/xcit/xcit_small_24_p16_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p8_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_small_12_p16_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p8_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_nano_12_p16_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p8_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_medium_24_p16_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p8_384_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MNasNet_ID1723_for_PyTorch/timm/models/xcit.py | https://dl.fbaipublicfiles.com/xcit/xcit_large_24_p16_384_dist.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/MaskRCNN_ID0101_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/MaskRCNN_ID0101_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..854c08fe6f3c8e456db8d07e221bc2f98684ae28 --- /dev/null +++ b/PyTorch/dev/cv/image_classification/MaskRCNN_ID0101_for_PyTorch/public_address_statement.md @@ -0,0 +1,14 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|---------------------------------------------------------------------------------------------------------------------------|---------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MaskRCNN_ID0101_for_PyTorch/datasets/prepare_for_tests.sh | https://dl.fbaipublicfiles.com/detectron2 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MaskRCNN_ID0101_for_PyTorch/datasets/prepare_panoptic_fpn.py | https://dl.fbaipublicfiles.com/detectron2/ | 下载权重文件 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MaskRCNN_ID0101_for_PyTorch/detectron2/engine/defaults.py | https://pytorch.org/docs/stable/distributed.html for details | 说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MaskRCNN_ID0101_for_PyTorch/detectron2/evaluation/coco_evaluation.py | http://cocodataset.org/#keypoints-eval | 设置说明 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MaskRCNN_ID0101_for_PyTorch/detectron2/model_zoo/model_zoo.py | https://dl.fbaipublicfiles.com/detectron2/ | 下载权重文件 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MaskRCNN_ID0101_for_PyTorch/dev/packaging/build_wheel.sh | https://download.pytorch.org/whl/"$CU_VERSION"/torch_stable.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MaskRCNN_ID0101_for_PyTorch/dev/packaging/gen_install_table.py | https://dl.fbaipublicfiles.com/detectron2/wheels/{cuda}/torch{torch}/index.html | 下载权重文件 | +| 
ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MaskRCNN_ID0101_for_PyTorch/docker/Dockerfile | https://bootstrap.pypa.io/get-pip.py | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MaskRCNN_ID0101_for_PyTorch/docker/Dockerfile | https://download.pytorch.org/whl/cu101/torch_stable.html | 三方库下载 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MaskRCNN_ID0101_for_PyTorch/docker/Dockerfile-circleci | https://bootstrap.pypa.io/get-pip.py | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MaskRCNN_ID0101_for_PyTorch/docker/Dockerfile-circleci | https://download.pytorch.org/whl/cu101/torch_stable.html | 三方库下载 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MaskRCNN_ID0101_for_PyTorch/tools/convert-torchvision-to-d2.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/Mnasnet0_75_ID0439_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/Mnasnet0_75_ID0439_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..91fb225857eb1cdefb9a6cbe9b834b2912ddac6f --- /dev/null +++ b/PyTorch/dev/cv/image_classification/Mnasnet0_75_ID0439_for_PyTorch/public_address_statement.md @@ -0,0 +1,4 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|------------------------------------------------------------------------------------------------|---------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Mnasnet0_75_ID0439_for_PyTorch/mnasnet.py | https://download.pytorch.org/models/mnasnet1.0_top1_73.512-f206786ef8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Mnasnet0_75_ID0439_for_PyTorch/mnasnet.py | https://download.pytorch.org/models/mnasnet0.5_top1_67.592-7c6cb539b9.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/Mnasnet1_0_ID0438_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/Mnasnet1_0_ID0438_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..c8225027279fc8ec81d261cdeee94288192fea71 --- /dev/null +++ b/PyTorch/dev/cv/image_classification/Mnasnet1_0_ID0438_for_PyTorch/public_address_statement.md @@ -0,0 +1,4 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|-----------------------------------------------------------------------------------------------|---------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Mnasnet1_0_ID0438_for_PyTorch/mnasnet.py | https://download.pytorch.org/models/mnasnet1.0_top1_73.512-f206786ef8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Mnasnet1_0_ID0438_for_PyTorch/mnasnet.py | https://download.pytorch.org/models/mnasnet0.5_top1_67.592-7c6cb539b9.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/Mnasnet1_3_ID0437_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/Mnasnet1_3_ID0437_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..c5a4706db3320229a75a541c419adddcc0fc433a --- /dev/null +++ b/PyTorch/dev/cv/image_classification/Mnasnet1_3_ID0437_for_PyTorch/public_address_statement.md @@ -0,0 +1,4 @@ +| 文件位置 | 公网地址 | 公网地址用途 | 
+|-----------------------------------------------------------------------------------------------|---------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Mnasnet1_3_ID0437_for_PyTorch/mnasnet.py | https://download.pytorch.org/models/mnasnet1.0_top1_73.512-f206786ef8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Mnasnet1_3_ID0437_for_PyTorch/mnasnet.py | https://download.pytorch.org/models/mnasnet0.5_top1_67.592-7c6cb539b9.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/MobileNetV3-Small_ID1785_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/MobileNetV3-Small_ID1785_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..bb717b340ac04a6bc9ceb05d270e5aa209ec7799 --- /dev/null +++ b/PyTorch/dev/cv/image_classification/MobileNetV3-Small_ID1785_for_PyTorch/public_address_statement.md @@ -0,0 +1,5 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|----------------------------------------------------------------------------------------------------------|---------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MobileNetV3-Small_ID1785_for_PyTorch/mobilenetv2.py | https://download.pytorch.org/models/mobilenet_v2-b0353104.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MobileNetV3-Small_ID1785_for_PyTorch/mobilenetv3.py | https://download.pytorch.org/models/mobilenet_v3_small-047dcff4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/MobileNetV3-Small_ID1785_for_PyTorch/mobilenetv3.py | https://download.pytorch.org/models/mobilenet_v3_large-8738ca79.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/Multigrain_ID3664_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/Multigrain_ID3664_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..fad424552d13f4b86a55ac9c38cdc5904e7d434d --- /dev/null +++ b/PyTorch/dev/cv/image_classification/Multigrain_ID3664_for_PyTorch/public_address_statement.md @@ -0,0 +1,3 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|---------------------------------------------------------------------------------------------------------------------|--------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Multigrain_ID3664_for_PyTorch/multigrain/datasets/retrieval.py | https://archive.org/download/ukbench/ukbench.zip | 数据集链接 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/Mutual-Channel-Loss_ID1113_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/Mutual-Channel-Loss_ID1113_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..92b0cb79e82d83f14fba922c0549ec9ac3b5aa4a --- /dev/null +++ b/PyTorch/dev/cv/image_classification/Mutual-Channel-Loss_ID1113_for_PyTorch/public_address_statement.md @@ -0,0 +1,7 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|----------------------------------------------------------------------------------------------------------------------|------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Mutual-Channel-Loss_ID1113_for_PyTorch/CUB-200-2011_ResNet18.py | 
https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Mutual-Channel-Loss_ID1113_for_PyTorch/CUB-200-2011_ResNet18.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Mutual-Channel-Loss_ID1113_for_PyTorch/CUB-200-2011_ResNet18.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Mutual-Channel-Loss_ID1113_for_PyTorch/CUB-200-2011_ResNet18.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Mutual-Channel-Loss_ID1113_for_PyTorch/CUB-200-2011_ResNet18.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/PiT_ID2671_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/PiT_ID2671_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..37e4a2f0c39389a435baf2789759a031ae8a5192 --- /dev/null +++ b/PyTorch/dev/cv/image_classification/PiT_ID2671_for_PyTorch/public_address_statement.md @@ -0,0 +1,3 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|--------------------------------------------------------------------------------------|----------------------|---------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/PiT_ID2671_for_PyTorch/setup.py | lucidrains@gmail.com | 作者邮箱 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/Pix2Pix_ID0331_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/Pix2Pix_ID0331_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..308a999154719dc257ec0f4bf08e1ab599ff1783 --- /dev/null +++ b/PyTorch/dev/cv/image_classification/Pix2Pix_ID0331_for_PyTorch/public_address_statement.md @@ -0,0 +1,4 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|-------------------------------------------------------------------------------------------------------------------|-----------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Pix2Pix_ID0331_for_PyTorch/data/download_cyclegan_dataset.sh | https://people.eecs.berkeley.edu/~taesung_park/CycleGAN/datasets/$FILE.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Pix2Pix_ID0331_for_PyTorch/data/download_pix2pix_dataset.sh | https://people.eecs.berkeley.edu/~tinghuiz/projects/pix2pix/datasets/$FILE.tar.gz | 模型相关说明 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/Pysot_ID0428_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/Pysot_ID0428_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..16aeaee6979a9c1657b4ac16149ee659192eae46 --- /dev/null +++ b/PyTorch/dev/cv/image_classification/Pysot_ID0428_for_PyTorch/public_address_statement.md @@ -0,0 +1,3 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|----------------------------------------------------------------------------------------------------------|---------|---------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Pysot_ID0428_for_PyTorch/pysot/utils/distributed.py | 8.8.8.8 | ip地址 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/RAFT_for_PyTorch/00-access/RAFT/public_address_statement.md 
b/PyTorch/dev/cv/image_classification/RAFT_for_PyTorch/00-access/RAFT/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..d51df71a97ff2f7afe7b50b9f3a90388edcfd1d3 --- /dev/null +++ b/PyTorch/dev/cv/image_classification/RAFT_for_PyTorch/00-access/RAFT/public_address_statement.md @@ -0,0 +1,4 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|-------------------------------------------------------------------------------------------------------------|-------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/RAFT_for_PyTorch/00-access/RAFT/core/utils/flow_viz.py | http://vision.middlebury.edu/flow/flowEval-iccv07.pdf | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/RAFT_for_PyTorch/00-access/RAFT/download_models.sh | https://www.dropbox.com/s/4j4z58wuv8o0mfz/models.zip | 数据集链接 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/RES2NET_ID0824_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/RES2NET_ID0824_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..89b3d2c1ddd1321c29c367f7c73a8278962d8211 --- /dev/null +++ b/PyTorch/dev/cv/image_classification/RES2NET_ID0824_for_PyTorch/public_address_statement.md @@ -0,0 +1,9 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|--------------------------------------------------------------------------------------------|----------------------------------------------------------------------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/RES2NET_ID0824_for_PyTorch/main.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/RES2NET_ID0824_for_PyTorch/res2net.py | https://shanghuagao.oss-cn-beijing.aliyuncs.com/res2net/res2net50_48w_2s-afed724a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/RES2NET_ID0824_for_PyTorch/res2net.py | https://shanghuagao.oss-cn-beijing.aliyuncs.com/res2net/res2net50_26w_8s-2c7c9f12.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/RES2NET_ID0824_for_PyTorch/res2net.py | https://shanghuagao.oss-cn-beijing.aliyuncs.com/res2net/res2net50_26w_6s-19041792.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/RES2NET_ID0824_for_PyTorch/res2net.py | https://shanghuagao.oss-cn-beijing.aliyuncs.com/res2net/res2net50_26w_4s-06e79181.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/RES2NET_ID0824_for_PyTorch/res2net.py | https://shanghuagao.oss-cn-beijing.aliyuncs.com/res2net/res2net50_14w_8s-6527dddc.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/RES2NET_ID0824_for_PyTorch/res2net.py | https://shanghuagao.oss-cn-beijing.aliyuncs.com/res2net/res2net101_26w_4s-02a759a1.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/ResNeXt101_ID1717_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/ResNeXt101_ID1717_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..faab4eba2c41a3238bebfc024a7cd06a523fba70 --- /dev/null +++ b/PyTorch/dev/cv/image_classification/ResNeXt101_ID1717_for_PyTorch/public_address_statement.md @@ -0,0 +1,12 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|-----------------------------------------------------------------------------------------------------------|-------------------------------------------------------------------|--------------| 
+| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ResNeXt101_ID1717_for_PyTorch/main.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ResNeXt101_ID1717_for_PyTorch/models/resnet_0_6_0.py | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 模型权重 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ResNeXt101_ID1717_for_PyTorch/models/resnet_0_6_0.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ResNeXt101_ID1717_for_PyTorch/models/resnet_0_6_0.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ResNeXt101_ID1717_for_PyTorch/models/resnet_0_6_0.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ResNeXt101_ID1717_for_PyTorch/models/resnet_0_6_0.py | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ResNeXt101_ID1717_for_PyTorch/models/resnet_0_6_0.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ResNeXt101_ID1717_for_PyTorch/models/resnet_0_6_0.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ResNeXt101_ID1717_for_PyTorch/models/resnet_0_6_0.py | https://download.pytorch.org/models/wide_resnet50_2-95faca4d.pth | 下载预训练模型 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ResNeXt101_ID1717_for_PyTorch/models/resnet_0_6_0.py | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 下载预训练模型 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/ResNeXt50_ID0419_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/ResNeXt50_ID0419_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..db181bbdbda82c969248fbc93a4cadb408d1a010 --- /dev/null +++ b/PyTorch/dev/cv/image_classification/ResNeXt50_ID0419_for_PyTorch/public_address_statement.md @@ -0,0 +1,11 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|---------------------------------------------------------------------------------------------|-------------------------------------------------------------------|----------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ResNeXt50_ID0419_for_PyTorch/resnet.py | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 模型权重 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ResNeXt50_ID0419_for_PyTorch/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ResNeXt50_ID0419_for_PyTorch/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ResNeXt50_ID0419_for_PyTorch/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ResNeXt50_ID0419_for_PyTorch/resnet.py | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ResNeXt50_ID0419_for_PyTorch/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ResNeXt50_ID0419_for_PyTorch/resnet.py | 
https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ResNeXt50_ID0419_for_PyTorch/resnet.py | https://download.pytorch.org/models/wide_resnet50_2-95faca4d.pth | 下载预训练模型 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ResNeXt50_ID0419_for_PyTorch/resnet.py | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 下载预训练模型 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/ResNet50_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/ResNet50_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..9039a818b2dcc12f96bcbd238bb3d6b14a0cd25b --- /dev/null +++ b/PyTorch/dev/cv/image_classification/ResNet50_for_PyTorch/public_address_statement.md @@ -0,0 +1,4 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|--------------------------------------------------------------------------------------------------------------------|--------------------------|---------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ResNet50_for_PyTorch/DistributedResnet50/main_apex_d76_npu.py | tcp://224.66.41.62:23456 | ip地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ResNet50_for_PyTorch/pytorch_resnet50_apex.py | tcp://224.66.41.62:23456 | ip地址 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/Resnet101_ID0425_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/Resnet101_ID0425_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..91691ea7feb6652684c9f646e99f2810e4c34b81 --- /dev/null +++ b/PyTorch/dev/cv/image_classification/Resnet101_ID0425_for_PyTorch/public_address_statement.md @@ -0,0 +1,11 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|---------------------------------------------------------------------------------------------|-------------------------------------------------------------------|----------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Resnet101_ID0425_for_PyTorch/resnet.py | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 模型权重 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Resnet101_ID0425_for_PyTorch/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Resnet101_ID0425_for_PyTorch/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Resnet101_ID0425_for_PyTorch/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Resnet101_ID0425_for_PyTorch/resnet.py | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Resnet101_ID0425_for_PyTorch/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Resnet101_ID0425_for_PyTorch/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Resnet101_ID0425_for_PyTorch/resnet.py | https://download.pytorch.org/models/wide_resnet50_2-95faca4d.pth | 下载预训练模型 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Resnet101_ID0425_for_PyTorch/resnet.py | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 下载预训练模型 | \ No newline at end of file diff 
--git a/PyTorch/dev/cv/image_classification/Resnet152_ID1592_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/Resnet152_ID1592_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..ca447c18f964fdb781cb5a35202eb457d37ae376 --- /dev/null +++ b/PyTorch/dev/cv/image_classification/Resnet152_ID1592_for_PyTorch/public_address_statement.md @@ -0,0 +1,20 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|---------------------------------------------------------------------------------------------|-------------------------------------------------------------------|----------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ResNet152_ID0424_for_PyTorch/resnet.py | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 模型权重 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ResNet152_ID0424_for_PyTorch/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ResNet152_ID0424_for_PyTorch/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ResNet152_ID0424_for_PyTorch/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ResNet152_ID0424_for_PyTorch/resnet.py | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ResNet152_ID0424_for_PyTorch/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ResNet152_ID0424_for_PyTorch/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ResNet152_ID0424_for_PyTorch/resnet.py | https://download.pytorch.org/models/wide_resnet50_2-95faca4d.pth | 下载预训练模型 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ResNet152_ID0424_for_PyTorch/resnet.py | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 下载预训练模型 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Resnet152_ID1592_for_PyTorch/resnet.py | http://download.pytorch.org/models/wide_resnet50_2-95faca4d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Resnet152_ID1592_for_PyTorch/resnet.py | http://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Resnet152_ID1592_for_PyTorch/resnet.py | http://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Resnet152_ID1592_for_PyTorch/resnet.py | http://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Resnet152_ID1592_for_PyTorch/resnet.py | http://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Resnet152_ID1592_for_PyTorch/resnet.py | http://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Resnet152_ID1592_for_PyTorch/resnet.py | http://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Resnet152_ID1592_for_PyTorch/resnet.py | http://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Resnet152_ID1592_for_PyTorch/resnet.py | http://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/Resnet18_ID0423_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/Resnet18_ID0423_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..ab0fc27e2d7b9b11c1a07a0021ca9202c5f5c97f --- /dev/null +++ b/PyTorch/dev/cv/image_classification/Resnet18_ID0423_for_PyTorch/public_address_statement.md @@ -0,0 +1,11 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|--------------------------------------------------------------------------------------------|-------------------------------------------------------------------|----------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Resnet18_ID0423_for_PyTorch/resnet.py | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 模型权重 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Resnet18_ID0423_for_PyTorch/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Resnet18_ID0423_for_PyTorch/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Resnet18_ID0423_for_PyTorch/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Resnet18_ID0423_for_PyTorch/resnet.py | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Resnet18_ID0423_for_PyTorch/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Resnet18_ID0423_for_PyTorch/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Resnet18_ID0423_for_PyTorch/resnet.py | https://download.pytorch.org/models/wide_resnet50_2-95faca4d.pth | 下载预训练模型 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Resnet18_ID0423_for_PyTorch/resnet.py | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 下载预训练模型 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/Resnet34_ID0422_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/Resnet34_ID0422_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..b37544090c8e13d66e8dc8d0dfc70d1d0b37d1c1 --- /dev/null +++ b/PyTorch/dev/cv/image_classification/Resnet34_ID0422_for_PyTorch/public_address_statement.md @@ -0,0 +1,11 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|--------------------------------------------------------------------------------------------|-------------------------------------------------------------------|----------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Resnet34_ID0422_for_PyTorch/resnet.py | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 模型权重 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Resnet34_ID0422_for_PyTorch/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Resnet34_ID0422_for_PyTorch/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Resnet34_ID0422_for_PyTorch/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Resnet34_ID0422_for_PyTorch/resnet.py | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Resnet34_ID0422_for_PyTorch/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Resnet34_ID0422_for_PyTorch/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Resnet34_ID0422_for_PyTorch/resnet.py | https://download.pytorch.org/models/wide_resnet50_2-95faca4d.pth | 下载预训练模型 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Resnet34_ID0422_for_PyTorch/resnet.py | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 下载预训练模型 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/Resnext101_32x8d_ID0420_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/Resnext101_32x8d_ID0420_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..3f30499cf4ff3d02a5c88859998a989eac66b116 --- /dev/null +++ b/PyTorch/dev/cv/image_classification/Resnext101_32x8d_ID0420_for_PyTorch/public_address_statement.md @@ -0,0 +1,11 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|----------------------------------------------------------------------------------------------------|-------------------------------------------------------------------|----------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Resnext101_32x8d_ID0420_for_PyTorch/resnet.py | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 模型权重 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Resnext101_32x8d_ID0420_for_PyTorch/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Resnext101_32x8d_ID0420_for_PyTorch/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Resnext101_32x8d_ID0420_for_PyTorch/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Resnext101_32x8d_ID0420_for_PyTorch/resnet.py | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Resnext101_32x8d_ID0420_for_PyTorch/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Resnext101_32x8d_ID0420_for_PyTorch/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Resnext101_32x8d_ID0420_for_PyTorch/resnet.py | https://download.pytorch.org/models/wide_resnet50_2-95faca4d.pth | 下载预训练模型 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Resnext101_32x8d_ID0420_for_PyTorch/resnet.py | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 下载预训练模型 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/SG2IM_ID0786_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/SG2IM_ID0786_for_PyTorch/public_address_statement.md new file mode 100644 index 
0000000000000000000000000000000000000000..d51c8fe14c7de5523fe599eaf9bb795f5b48d92c --- /dev/null +++ b/PyTorch/dev/cv/image_classification/SG2IM_ID0786_for_PyTorch/public_address_statement.md @@ -0,0 +1,44 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|------------------------------------------------------------------------------------------------------------------|-------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SG2IM_ID0786_for_PyTorch/scripts/download_ablated_models.sh | https://storage.googleapis.com/sg2im-data/small/vg64_no_relations.pt | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SG2IM_ID0786_for_PyTorch/scripts/download_ablated_models.sh | https://storage.googleapis.com/sg2im-data/small/vg64_no_obj_discriminator.pt | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SG2IM_ID0786_for_PyTorch/scripts/download_ablated_models.sh | https://storage.googleapis.com/sg2im-data/small/vg64_no_img_discriminator.pt | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SG2IM_ID0786_for_PyTorch/scripts/download_ablated_models.sh | https://storage.googleapis.com/sg2im-data/small/vg64_no_gconv.pt | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SG2IM_ID0786_for_PyTorch/scripts/download_ablated_models.sh | https://storage.googleapis.com/sg2im-data/small/vg64_no_discriminators.pt | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SG2IM_ID0786_for_PyTorch/scripts/download_ablated_models.sh | https://storage.googleapis.com/sg2im-data/small/coco64_no_relations.pt | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SG2IM_ID0786_for_PyTorch/scripts/download_ablated_models.sh | https://storage.googleapis.com/sg2im-data/small/coco64_no_obj_discriminator.pt | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SG2IM_ID0786_for_PyTorch/scripts/download_ablated_models.sh | https://storage.googleapis.com/sg2im-data/small/coco64_no_img_discriminator.pt | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SG2IM_ID0786_for_PyTorch/scripts/download_ablated_models.sh | https://storage.googleapis.com/sg2im-data/small/coco64_no_gconv.pt | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SG2IM_ID0786_for_PyTorch/scripts/download_ablated_models.sh | https://storage.googleapis.com/sg2im-data/small/coco64_no_discriminators.pt | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SG2IM_ID0786_for_PyTorch/scripts/download_ablated_models.sh | https://storage.googleapis.com/sg2im-data/small/coco64_gt_layout_no_gconv.pt | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SG2IM_ID0786_for_PyTorch/scripts/download_ablated_models.sh | https://storage.googleapis.com/sg2im-data/small/coco64_gt_layout.pt | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SG2IM_ID0786_for_PyTorch/scripts/download_coco.sh | http://images.cocodataset.org/zips/val2017.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SG2IM_ID0786_for_PyTorch/scripts/download_coco.sh | http://images.cocodataset.org/zips/train2017.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SG2IM_ID0786_for_PyTorch/scripts/download_coco.sh | http://images.cocodataset.org/annotations/stuff_annotations_trainval2017.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SG2IM_ID0786_for_PyTorch/scripts/download_coco.sh | 
http://images.cocodataset.org/annotations/annotations_trainval2017.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SG2IM_ID0786_for_PyTorch/scripts/download_full_models.sh | https://storage.googleapis.com/sg2im-data/full/vg64_no_relations.pt | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SG2IM_ID0786_for_PyTorch/scripts/download_full_models.sh | https://storage.googleapis.com/sg2im-data/full/vg64_no_obj_discriminator.pt | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SG2IM_ID0786_for_PyTorch/scripts/download_full_models.sh | https://storage.googleapis.com/sg2im-data/full/vg64_no_img_discriminator.pt | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SG2IM_ID0786_for_PyTorch/scripts/download_full_models.sh | https://storage.googleapis.com/sg2im-data/full/vg64_no_gconv.pt | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SG2IM_ID0786_for_PyTorch/scripts/download_full_models.sh | https://storage.googleapis.com/sg2im-data/full/vg64_no_discriminators.pt | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SG2IM_ID0786_for_PyTorch/scripts/download_full_models.sh | https://storage.googleapis.com/sg2im-data/full/vg64.pt | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SG2IM_ID0786_for_PyTorch/scripts/download_full_models.sh | https://storage.googleapis.com/sg2im-data/full/vg128.pt | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SG2IM_ID0786_for_PyTorch/scripts/download_full_models.sh | https://storage.googleapis.com/sg2im-data/full/coco64_no_relations.pt | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SG2IM_ID0786_for_PyTorch/scripts/download_full_models.sh | https://storage.googleapis.com/sg2im-data/full/coco64_no_obj_discriminator.pt | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SG2IM_ID0786_for_PyTorch/scripts/download_full_models.sh | https://storage.googleapis.com/sg2im-data/full/coco64_no_img_discriminator.pt | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SG2IM_ID0786_for_PyTorch/scripts/download_full_models.sh | https://storage.googleapis.com/sg2im-data/full/coco64_no_gconv.pt | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SG2IM_ID0786_for_PyTorch/scripts/download_full_models.sh | https://storage.googleapis.com/sg2im-data/full/coco64_no_discriminators.pt | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SG2IM_ID0786_for_PyTorch/scripts/download_full_models.sh | https://storage.googleapis.com/sg2im-data/full/coco64_gt_layout_no_gconv.pt | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SG2IM_ID0786_for_PyTorch/scripts/download_full_models.sh | https://storage.googleapis.com/sg2im-data/full/coco64_gt_layout.pt | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SG2IM_ID0786_for_PyTorch/scripts/download_full_models.sh | https://storage.googleapis.com/sg2im-data/full/coco64.pt | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SG2IM_ID0786_for_PyTorch/scripts/download_models.sh | https://storage.googleapis.com/sg2im-data/small/vg64.pt | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SG2IM_ID0786_for_PyTorch/scripts/download_models.sh | https://storage.googleapis.com/sg2im-data/small/vg128.pt | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SG2IM_ID0786_for_PyTorch/scripts/download_models.sh | https://storage.googleapis.com/sg2im-data/small/coco64.pt -O sg2im-models/coco64.pt | 数据集链接 | +| 
ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SG2IM_ID0786_for_PyTorch/scripts/download_vg.sh | https://visualgenome.org/static/data/dataset/relationships.json.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SG2IM_ID0786_for_PyTorch/scripts/download_vg.sh | https://visualgenome.org/static/data/dataset/relationship_alias.txt | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SG2IM_ID0786_for_PyTorch/scripts/download_vg.sh | https://visualgenome.org/static/data/dataset/objects.json.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SG2IM_ID0786_for_PyTorch/scripts/download_vg.sh | https://visualgenome.org/static/data/dataset/object_alias.txt | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SG2IM_ID0786_for_PyTorch/scripts/download_vg.sh | https://visualgenome.org/static/data/dataset/image_data.json.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SG2IM_ID0786_for_PyTorch/scripts/download_vg.sh | https://visualgenome.org/static/data/dataset/attributes.json.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SG2IM_ID0786_for_PyTorch/scripts/download_vg.sh | https://cs.stanford.edu/people/rak248/VG_100K_2/images2.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SG2IM_ID0786_for_PyTorch/scripts/download_vg.sh | https://cs.stanford.edu/people/rak248/VG_100K_2/images.zip | 数据集链接 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/SRGAN_ID1880_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/SRGAN_ID1880_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..be4c518819d87e02396f1921525cac4ec8be41ef --- /dev/null +++ b/PyTorch/dev/cv/image_classification/SRGAN_ID1880_for_PyTorch/public_address_statement.md @@ -0,0 +1,4 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|-----------------------------------------------------------------------------------------------------------------|-----------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SRGAN_ID1880_for_PyTorch/data/download_cyclegan_dataset.sh | https://people.eecs.berkeley.edu/~taesung_park/CycleGAN/datasets/$FILE.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SRGAN_ID1880_for_PyTorch/data/download_pix2pix_dataset.sh | https://people.eecs.berkeley.edu/~tinghuiz/projects/pix2pix/datasets/$FILE.tar.gz | 数据集链接 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/STARGAN_ID0725_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/STARGAN_ID0725_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..0bb31e2ec0be52f23bda775e1f8404ae4ffaf06d --- /dev/null +++ b/PyTorch/dev/cv/image_classification/STARGAN_ID0725_for_PyTorch/public_address_statement.md @@ -0,0 +1,5 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|---------------------------------------------------------------------------------------------|--------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/STARGAN_ID0725_for_PyTorch/download.sh | https://www.dropbox.com/s/zdq6roqf63m0v5f/celeba-256x256-5attrs.zip?dl=0 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/STARGAN_ID0725_for_PyTorch/download.sh | https://www.dropbox.com/s/d1kjpkqklf0uw77/celeba.zip?dl=0 | 数据集链接 | +| 
ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/STARGAN_ID0725_for_PyTorch/download.sh | https://www.dropbox.com/s/7e966qq0nlxwte4/celeba-128x128-5attrs.zip?dl=0 | 数据集链接 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/public_address_statement.md index 4713ddede3dcacc741195dba7fe0e85845e33c00..6dcd5189b50d9e26eb83a084fd397e98e31aff64 100644 --- a/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/public_address_statement.md +++ b/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/public_address_statement.md @@ -1,12 +1,120 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|----------------------------------------------------------------------------------------------------------------------------------|----------------------------------------------------|-----------------------|--------| -| 开发引入 | / | SlowFast_ID0646_for_PyTorch/detectron2/docs/requirements.txt | https://github.com/sphinx-doc/sphinx/commit/7acd3ada3f38076af7b2b5c9f3b60bb9c2587a3d | 相关依赖 | -| 开发引入 | / | SlowFast_ID0646_for_PyTorch/detectron2/docs/requirements.txt | git://github.com/facebookresearch/fvcore.git | 相关依赖 | -| 开发引入 | / | SlowFast_ID0646_for_PyTorch/detectron2/docs/requirements.txt | https://download.pytorch.org/whl/cpu/torch-1.7.0%2Bcpu-cp37-cp37m-linux_x86_64.whl | 相关依赖 | -| 开发引入 | / | SlowFast_ID0646_for_PyTorch/detectron2/docs/requirements.txt | https://download.pytorch.org/whl/cpu/torchvision-0.8.1%2Bcpu-cp37-cp37m-linux_x86_64.whl | 相关依赖 | -| 开发引入 | / | SlowFast_ID0646_for_PyTorch/detectron2/tools/deploy/CMakeLists.txt | https://pytorch.org/tutorials/advanced/cpp_frontend.html | 相关依赖 | -| 开发引入 | / | SlowFast_ID0646_for_PyTorch/requirements.txt | https://github.com/facebookresearch/fvcore | 相关依赖 | -| 开发引入 | / | SlowFast_ID0646_for_PyTorch/requirements.txt | https://github.com/facebookresearch/fvcore.git | 相关依赖 | -| 开发引入 | / | SlowFast_ID0646_for_PyTorch/requirements.txt | https://github.com/cocodataset/cocoapi.git#subdirectory=PythonAPI | 相关依赖 | -| 开发引入 | / | SlowFast_ID0646_for_PyTorch/requirements.txt | https://github.com/facebookresearch/detectron2 | 相关依赖 | -| 开发引入 | / | SlowFast_ID0646_for_PyTorch/requirements.txt | https://github.com/facebookresearch/detectron2/blob/master/INSTALL.md | 相关说明 | +| 文件位置 | 公网地址 | 公网地址用途 | +|---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|------------------------------------------------------------------------------------------------------------------------------------------|----------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/.circleci/config.yml | https://download.pytorch.org/whl/torch_stable.html | 三方库下载 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/.circleci/config.yml | https://download.pytorch.org/whl/cu102/torch_stable.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/.github/ISSUE_TEMPLATE/config.yml | https://detectron2.readthedocs.io/index.html | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/.github/workflows/workflow.yml | https://download.pytorch.org/whl/torch_stable.html | 三方库下载 | +| 
ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/configs/COCO-InstanceSegmentation/mask_rcnn_regnetx_4gf_dds_fpn_1x.py | https://dl.fbaipublicfiles.com/pycls/dds_baselines/160906838/RegNetY-4.0GF_dds_8gpu.pyth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/configs/COCO-InstanceSegmentation/mask_rcnn_regnety_4gf_dds_fpn_1x.py | https://dl.fbaipublicfiles.com/pycls/dds_baselines/160906838/RegNetY-4.0GF_dds_8gpu.pyth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/datasets/prepare_for_tests.sh | https://dl.fbaipublicfiles.com/detectron2 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/datasets/prepare_panoptic_fpn.py | https://dl.fbaipublicfiles.com/detectron2/ | 下载权重文件 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/detectron2/data/datasets/coco.py | https://detectron2.readthedocs.io/en/latest/tutorials/datasets.html | 问题引导 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/detectron2/engine/defaults.py | https://pytorch.org/docs/stable/distributed.html for details | 说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/detectron2/evaluation/coco_evaluation.py | http://cocodataset.org/#keypoints-eval | 设置说明 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/detectron2/model_zoo/model_zoo.py | https://dl.fbaipublicfiles.com/detectron2/ | 下载权重文件 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/detectron2/utils/file_io.py | https://dl.fbaipublicfiles.com/detectron2/ | 下载权重文件 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/detectron2/utils/testing.py | http://images.cocodataset.org/train2017/000000000009.jpg | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/dev/packaging/build_wheel.sh | https://download.pytorch.org/whl/"$CU_VERSION"/torch_stable.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/dev/packaging/gen_install_table.py | https://dl.fbaipublicfiles.com/detectron2/wheels/{cuda}/torch{torch}/index.html | 下载权重文件 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/docker/Dockerfile | https://bootstrap.pypa.io/get-pip.py | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/docker/Dockerfile | https://download.pytorch.org/whl/cu111/torch_stable.html | 三方库下载 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/Base-DensePose-RCNN-FPN-Human.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_smpl_27554_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_CA_finetune_16k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_cat_7466_256.pkl | 数据集相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_CA_finetune_16k.yaml | 
https://dl.fbaipublicfiles.com/densepose/cse/densepose_rcnn_R_50_FPN_soft_s1x/250533982/model_final_2c4512.pkl | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_CA_finetune_16k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_zebra_5002_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_CA_finetune_16k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_sheep_5004_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_CA_finetune_16k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_horse_5004_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_CA_finetune_16k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_giraffe_5002_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_CA_finetune_16k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_elephant_5002_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_CA_finetune_16k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_dog_7466_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_CA_finetune_16k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_cow_5002_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_CA_finetune_16k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_bear_4936_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_CA_finetune_4k.yaml | https://dl.fbaipublicfiles.com/densepose/cse/densepose_rcnn_R_50_FPN_soft_s1x/250533982/model_final_2c4512.pkl | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_CA_finetune_4k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_zebra_5002_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_CA_finetune_4k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_sheep_5004_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_CA_finetune_4k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_horse_5004_256.pkl | 相关说明文档 | +| 
ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_CA_finetune_4k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_giraffe_5002_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_CA_finetune_4k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_elephant_5002_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_CA_finetune_4k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_dog_7466_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_CA_finetune_4k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_cow_5002_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_CA_finetune_4k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_cat_5001_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_CA_finetune_4k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_bear_4936_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_finetune_16k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_cat_7466_256.pkl | 数据集相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_finetune_16k.yaml | https://dl.fbaipublicfiles.com/densepose/cse/densepose_rcnn_R_50_FPN_soft_s1x/250533982/model_final_2c4512.pkl | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_finetune_16k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_zebra_5002_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_finetune_16k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_sheep_5004_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_finetune_16k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_horse_5004_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_finetune_16k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_giraffe_5002_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_finetune_16k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_elephant_5002_256.pkl 
| 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_finetune_16k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_dog_7466_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_finetune_16k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_cow_5002_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_finetune_16k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_bear_4936_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_finetune_4k.yaml | https://dl.fbaipublicfiles.com/densepose/cse/densepose_rcnn_R_50_FPN_soft_s1x/250533982/model_final_2c4512.pkl | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_finetune_4k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_zebra_5002_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_finetune_4k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_sheep_5004_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_finetune_4k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_horse_5004_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_finetune_4k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_giraffe_5002_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_finetune_4k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_elephant_5002_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_finetune_4k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_dog_7466_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_finetune_4k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_cow_5002_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_finetune_4k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_cat_5001_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_finetune_4k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_bear_4936_256.pkl | 相关说明文档 | +| 
ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_finetune_maskonly_24k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_cat_7466_256.pkl | 数据集相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_finetune_maskonly_24k.yaml | https://dl.fbaipublicfiles.com/densepose/cse/densepose_rcnn_R_50_FPN_soft_s1x/250533982/model_final_2c4512.pkl | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_finetune_maskonly_24k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_zebra_5002_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_finetune_maskonly_24k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_sheep_5004_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_finetune_maskonly_24k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_horse_5004_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_finetune_maskonly_24k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_giraffe_5002_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_finetune_maskonly_24k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_elephant_5002_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_finetune_maskonly_24k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_dog_7466_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_finetune_maskonly_24k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_cow_5002_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_finetune_maskonly_24k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_bear_4936_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_I0_finetune_16k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_cat_7466_256.pkl | 数据集相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_I0_finetune_16k.yaml | https://dl.fbaipublicfiles.com/densepose/cse/densepose_rcnn_R_50_FPN_soft_animals_finetune_maskonly_24k/270668502/model_final_21b1d2.pkl | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_I0_finetune_16k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_zebra_5002_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_I0_finetune_16k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_sheep_5004_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_I0_finetune_16k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_horse_5004_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_I0_finetune_16k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_giraffe_5002_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_I0_finetune_16k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_elephant_5002_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_I0_finetune_16k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_dog_7466_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_I0_finetune_16k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_cow_5002_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_I0_finetune_16k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_bear_4936_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_I0_finetune_i2m_16k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_cat_7466_256.pkl | 数据集相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_I0_finetune_i2m_16k.yaml | https://dl.fbaipublicfiles.com/densepose/cse/densepose_rcnn_R_50_FPN_soft_animals_finetune_maskonly_24k/270668502/model_final_21b1d2.pkl | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_I0_finetune_i2m_16k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_zebra_5002_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_I0_finetune_i2m_16k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_sheep_5004_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_I0_finetune_i2m_16k.yaml | 
https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_horse_5004_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_I0_finetune_i2m_16k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_giraffe_5002_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_I0_finetune_i2m_16k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_elephant_5002_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_I0_finetune_i2m_16k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_dog_7466_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_I0_finetune_i2m_16k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_cow_5002_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_I0_finetune_i2m_16k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_bear_4936_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_I0_finetune_m2m_16k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_cat_7466_256.pkl | 数据集相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_I0_finetune_m2m_16k.yaml | https://dl.fbaipublicfiles.com/densepose/cse/densepose_rcnn_R_50_FPN_soft_animals_finetune_maskonly_24k/270668502/model_final_21b1d2.pkl | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_I0_finetune_m2m_16k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_zebra_5002_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_I0_finetune_m2m_16k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_smpl_27554_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_I0_finetune_m2m_16k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_sheep_5004_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_I0_finetune_m2m_16k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_horse_5004_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_I0_finetune_m2m_16k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_giraffe_5002_256.pkl | 相关说明文档 | +| 
ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_I0_finetune_m2m_16k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_elephant_5002_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_I0_finetune_m2m_16k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_dog_7466_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_I0_finetune_m2m_16k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_cow_5002_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_animals_I0_finetune_m2m_16k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_bear_4936_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_chimps_finetune_4k.yaml | https://dl.fbaipublicfiles.com/densepose/cse/densepose_rcnn_R_50_FPN_soft_s1x/250533982/model_final_2c4512.pkl | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/cse/densepose_rcnn_R_50_FPN_soft_chimps_finetune_4k.yaml | https://dl.fbaipublicfiles.com/densepose/data/cse/lbo/phi_chimp_5029_256.pkl | 相关说明文档 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/evolution/densepose_R_50_FPN_DL_WC1M_3x_Atop10P_CA_B_coarsesegm.yaml | https://dl.fbaipublicfiles.com/densepose/evolution/densepose_R_50_FPN_DL_WC1M_3x_Atop10P_CA/217578784/model_final_9fe1cc.pkl | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/evolution/densepose_R_50_FPN_DL_WC1M_3x_Atop10P_CA_B_coarsesegm.yaml | https://dl.fbaipublicfiles.com/densepose/evolution/densepose_R_50_FPN_DL_WC1M_3x_Atop10P_CA/217578784/model_final_9fe1cc.pkl | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/evolution/densepose_R_50_FPN_DL_WC1M_3x_Atop10P_CA_B_finesegm.yaml | https://dl.fbaipublicfiles.com/densepose/evolution/densepose_R_50_FPN_DL_WC1M_3x_Atop10P_CA/217578784/model_final_9fe1cc.pkl | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/evolution/densepose_R_50_FPN_DL_WC1M_3x_Atop10P_CA_B_finesegm.yaml | https://dl.fbaipublicfiles.com/densepose/evolution/densepose_R_50_FPN_DL_WC1M_3x_Atop10P_CA/217578784/model_final_9fe1cc.pkl | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/evolution/densepose_R_50_FPN_DL_WC1M_3x_Atop10P_CA_B_uniform.yaml | https://dl.fbaipublicfiles.com/densepose/evolution/densepose_R_50_FPN_DL_WC1M_3x_Atop10P_CA/217578784/model_final_9fe1cc.pkl | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/evolution/densepose_R_50_FPN_DL_WC1M_3x_Atop10P_CA_B_uniform.yaml | 
https://dl.fbaipublicfiles.com/densepose/evolution/densepose_R_50_FPN_DL_WC1M_3x_Atop10P_CA/217578784/model_final_9fe1cc.pkl | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/evolution/densepose_R_50_FPN_DL_WC1M_3x_Atop10P_CA_B_uv.yaml | https://dl.fbaipublicfiles.com/densepose/evolution/densepose_R_50_FPN_DL_WC1M_3x_Atop10P_CA/217578784/model_final_9fe1cc.pkl | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/configs/evolution/densepose_R_50_FPN_DL_WC1M_3x_Atop10P_CA_B_uv.yaml | https://dl.fbaipublicfiles.com/densepose/evolution/densepose_R_50_FPN_DL_WC1M_3x_Atop10P_CA/217578784/model_final_9fe1cc.pkl | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/densepose/data/datasets/coco.py | https://dl.fbaipublicfiles.com/densepose/data/ | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/densepose/data/meshes/builtin.py | https://dl.fbaipublicfiles.com/densepose/meshes/ | 相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/densepose/evaluation/densepose_coco_evaluation.py | https://dl.fbaipublicfiles.com/densepose/data/SMPL_SUBDIV_TRANSFORM.mat | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/densepose/evaluation/densepose_coco_evaluation.py | https://dl.fbaipublicfiles.com/densepose/data/SMPL_subdiv.mat | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/densepose/evaluation/densepose_coco_evaluation.py | https://dl.fbaipublicfiles.com/densepose/data/Pdist_matrix.pkl | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/densepose/evaluation/mesh_alignment_evaluator.py | https://dl.fbaipublicfiles.com/densepose/data/cse/mesh_keyvertices_v0.json | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/projects/DensePose/densepose/vis/densepose_outputs_vertex.py | https://dl.fbaipublicfiles.com/densepose/data/cse/mds_d=256.npy | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SlowFast_ID0646_for_PyTorch/detectron2/tools/convert-torchvision-to-d2.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/SmartSketch_ID1046_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/SmartSketch_ID1046_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..aac3c2bd100ea2c9d46fcba7cb329f9f90edd140 --- /dev/null +++ b/PyTorch/dev/cv/image_classification/SmartSketch_ID1046_for_PyTorch/public_address_statement.md @@ -0,0 +1,13 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|------------------------------------------------------------------------------------------------------------------------|--------------------------------------------------------------------------|-----------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SmartSketch_ID1046_for_PyTorch/backend/Dockerfile | nyoshida@nd.edu | 镜像作者邮箱 | +| 
ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SmartSketch_ID1046_for_PyTorch/backend/static/drawingboard.min.js | http://browsehappy.com/ | html相关配置 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SmartSketch_ID1046_for_PyTorch/backend/static/style.css | https://use.typekit.net/ymw8ood.css | 相关设置 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SmartSketch_ID1046_for_PyTorch/backend/templates/gallery.html | https://code.jquery.com/jquery-3.3.1.slim.min.js | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SmartSketch_ID1046_for_PyTorch/backend/templates/gallery.html | https://use.typekit.net/ymw8ood.css | html相关配置 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SmartSketch_ID1046_for_PyTorch/backend/templates/gallery.html | https://stackpath.bootstrapcdn.com/bootstrap/4.3.1/css/bootstrap.min.css | html相关配置 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SmartSketch_ID1046_for_PyTorch/backend/templates/index.html | https://arxiv.org/abs/1903.07291 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SmartSketch_ID1046_for_PyTorch/backend/templates/index.html | https://code.jquery.com/jquery-3.3.1.slim.min.js | html相关配置 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SmartSketch_ID1046_for_PyTorch/backend/templates/index.html | https://stackpath.bootstrapcdn.com/bootstrap/4.3.1/js/bootstrap.min.js | html相关配置 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SmartSketch_ID1046_for_PyTorch/backend/templates/index.html | https://ajax.googleapis.com/ajax/libs/jquery/3.3.1/jquery.min.js | html相关配置 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SmartSketch_ID1046_for_PyTorch/backend/templates/index.html | https://stackpath.bootstrapcdn.com/bootstrap/4.3.1/css/bootstrap.min.css | html相关配置 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/SqueezeNet_ID0413_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/SqueezeNet_ID0413_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..2b9a4007b0ec58e3c05e36ca647f531b03341a23 --- /dev/null +++ b/PyTorch/dev/cv/image_classification/SqueezeNet_ID0413_for_PyTorch/public_address_statement.md @@ -0,0 +1,4 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|--------------------------------------------------------------------------------------------------|----------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SqueezeNet_ID0413_for_PyTorch/squeezenet.py | https://download.pytorch.org/models/squeezenet1_1-f364aa15.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/SqueezeNet_ID0413_for_PyTorch/squeezenet.py | https://download.pytorch.org/models/squeezenet1_0-a815701f.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/TADE_ID2443_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/TADE_ID2443_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..2ea3d30d0f0239c68884ce62a663bbb5b3890712 --- /dev/null +++ b/PyTorch/dev/cv/image_classification/TADE_ID2443_for_PyTorch/public_address_statement.md @@ -0,0 +1,3 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|-------------------------------------------------------------------------------------------------------------|----------------------------------------------------------|---------| +| 
ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/TADE_ID2443_for_PyTorch/data_loader/imbalance_cifar.py | https://www.cs.toronto.edu/~kriz/cifar-100-python.tar.gz | 数据集链接 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/TabNet_ID2862_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/TabNet_ID2862_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..7b499576a3cb31364414ec87010cd4f14175390b --- /dev/null +++ b/PyTorch/dev/cv/image_classification/TabNet_ID2862_for_PyTorch/public_address_statement.md @@ -0,0 +1,4 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|-----------------------------------------------------------------------------------------------|-------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/TabNet_ID2862_for_PyTorch/Dockerfile | https://raw.githubusercontent.com/sdispater/poetry/master/get-poetry.py | ssl配置 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/TabNet_ID2862_for_PyTorch/Dockerfile_gpu | https://raw.githubusercontent.com/sdispater/poetry/master/get-poetry.py | ssl配置 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/Token-to-Token-ViT_ID2668_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/Token-to-Token-ViT_ID2668_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..a449f73bdb4578aa15c6d9ae1dc65658b99adc29 --- /dev/null +++ b/PyTorch/dev/cv/image_classification/Token-to-Token-ViT_ID2668_for_PyTorch/public_address_statement.md @@ -0,0 +1,3 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|-----------------------------------------------------------------------------------------------------|----------------------|---------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Token-to-Token-ViT_ID2668_for_PyTorch/setup.py | lucidrains@gmail.com | 作者邮箱 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/TransformerXL_ID0699_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/TransformerXL_ID0699_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..d53de6df78a3b4c63729033dbc67026f5d11f3b7 --- /dev/null +++ b/PyTorch/dev/cv/image_classification/TransformerXL_ID0699_for_PyTorch/public_address_statement.md @@ -0,0 +1,8 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|--------------------------------------------------------------------------------------------------|------------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/TransformerXL_ID0699_for_PyTorch/getdata.sh | https://s3.amazonaws.com/research.metamind.io/wikitext/wikitext-2-v1.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/TransformerXL_ID0699_for_PyTorch/getdata.sh | http://www.statmt.org/lm-benchmark/1-billion-word-language-modeling-benchmark-r13output.tar.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/TransformerXL_ID0699_for_PyTorch/getdata.sh | https://raw.githubusercontent.com/salesforce/awd-lstm-lm/master/data/enwik8/prep_enwik8.py | 源码实现 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/TransformerXL_ID0699_for_PyTorch/getdata.sh | https://s3.amazonaws.com/research.metamind.io/wikitext/wikitext-103-v1.zip | 相关说明 | +| 
ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/TransformerXL_ID0699_for_PyTorch/getdata.sh | http://mattmahoney.net/dc/text8.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/TransformerXL_ID0699_for_PyTorch/getdata.sh | http://mattmahoney.net/dc/enwik8.zip | 数据集链接 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/VGG16_ID0467_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/VGG16_ID0467_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..45f6f9cfd7463563afd17d1d0e69a8d045baa14c --- /dev/null +++ b/PyTorch/dev/cv/image_classification/VGG16_ID0467_for_PyTorch/public_address_statement.md @@ -0,0 +1,10 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|--------------------------------------------------------------------------------------|---------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/VGG16_ID0467_for_PyTorch/vgg.py | http//download.pytorch.org/models/vgg19_bn-c79401a0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/VGG16_ID0467_for_PyTorch/vgg.py | http//download.pytorch.org/models/vgg19-dcbb9e9d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/VGG16_ID0467_for_PyTorch/vgg.py | http//download.pytorch.org/models/vgg16_bn-6c64b313.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/VGG16_ID0467_for_PyTorch/vgg.py | http//download.pytorch.org/models/vgg16-397923af.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/VGG16_ID0467_for_PyTorch/vgg.py | http//download.pytorch.org/models/vgg13_bn-abd245e5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/VGG16_ID0467_for_PyTorch/vgg.py | http//download.pytorch.org/models/vgg13-c768596a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/VGG16_ID0467_for_PyTorch/vgg.py | http//download.pytorch.org/models/vgg11_bn-6002323d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/VGG16_ID0467_for_PyTorch/vgg.py | http//download.pytorch.org/models/vgg11-bbd30ac9.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/VGG19_ID0244_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/VGG19_ID0244_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..5636d2ee429b0b4089638cafc1bf04f0efd9c3e8 --- /dev/null +++ b/PyTorch/dev/cv/image_classification/VGG19_ID0244_for_PyTorch/public_address_statement.md @@ -0,0 +1,10 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|--------------------------------------------------------------------------------------|-----------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/VGG19_ID0244_for_PyTorch/vgg.py | https://download.pytorch.org/models/vgg16-397923af.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/VGG19_ID0244_for_PyTorch/vgg.py | https://download.pytorch.org/models/vgg19_bn-c79401a0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/VGG19_ID0244_for_PyTorch/vgg.py | https://download.pytorch.org/models/vgg19-dcbb9e9d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/VGG19_ID0244_for_PyTorch/vgg.py | https://download.pytorch.org/models/vgg16_bn-6c64b313.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/VGG19_ID0244_for_PyTorch/vgg.py | 
https://download.pytorch.org/models/vgg13_bn-abd245e5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/VGG19_ID0244_for_PyTorch/vgg.py | https://download.pytorch.org/models/vgg13-c768596a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/VGG19_ID0244_for_PyTorch/vgg.py | https://download.pytorch.org/models/vgg11_bn-6002323d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/VGG19_ID0244_for_PyTorch/vgg.py | https://download.pytorch.org/models/vgg11-bbd30ac9.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/VGGNet_ID0400_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/VGGNet_ID0400_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..3e58e645f28d08ce6aa4ec9d189df78c1aa00602 --- /dev/null +++ b/PyTorch/dev/cv/image_classification/VGGNet_ID0400_for_PyTorch/public_address_statement.md @@ -0,0 +1,13 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|----------------------------------------------------------------------------------------------------------|---------------------------------------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/VGGNet_ID0400_for_PyTorch/examples/cifar/main.py | tcp://172.168.1.1:11111 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/VGGNet_ID0400_for_PyTorch/examples/imagenet/main.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/VGGNet_ID0400_for_PyTorch/setup.py | liuchangyu1111@gmail.com | 作者邮箱 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/VGGNet_ID0400_for_PyTorch/vgg_pytorch/utils.py | http//download.pytorch.org/models/vgg19_bn-c79401a0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/VGGNet_ID0400_for_PyTorch/vgg_pytorch/utils.py | http//download.pytorch.org/models/vgg19-dcbb9e9d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/VGGNet_ID0400_for_PyTorch/vgg_pytorch/utils.py | http//download.pytorch.org/models/vgg16_bn-6c64b313.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/VGGNet_ID0400_for_PyTorch/vgg_pytorch/utils.py | http//download.pytorch.org/models/vgg16-397923af.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/VGGNet_ID0400_for_PyTorch/vgg_pytorch/utils.py | http//download.pytorch.org/models/vgg13_bn-abd245e5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/VGGNet_ID0400_for_PyTorch/vgg_pytorch/utils.py | http//download.pytorch.org/models/vgg13-c768596a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/VGGNet_ID0400_for_PyTorch/vgg_pytorch/utils.py | http//download.pytorch.org/models/vgg11_bn-6002323d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/VGGNet_ID0400_for_PyTorch/vgg_pytorch/utils.py | http//download.pytorch.org/models/vgg11-bbd30ac9.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/VGGNet_for_Pytorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/VGGNet_for_Pytorch/public_address_statement.md index 417d40f579fb10470ec72adc4a10efd487ea55a6..3e203edffde4d9bc76091f3a8e28bba2f5a9aac3 100644 --- a/PyTorch/dev/cv/image_classification/VGGNet_for_Pytorch/public_address_statement.md +++ b/PyTorch/dev/cv/image_classification/VGGNet_for_Pytorch/public_address_statement.md @@ -1,4 +1,13 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | 
-|--------|----------------------------------------------------------------------------------------------------------------------------------|----------------------------------------------------|-----------------------|--------| -| 开发引入 | / | VGGNet_for_Pytorch/requirements.txt | https://github.com/NVIDIA/dllogger | 相关依赖 | -| 开发引入 | / | VGGNet_for_Pytorch/VGGNet_for_Pytorch/requirements.txt | https://github.com/NVIDIA/dllogger | 相关依赖 | +| 文件位置 | 公网地址 | 公网地址用途 | +|----------------------------------------------------------------------------------------------------------------------|-----------------------------------------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/VGGNet_for_Pytorch/VGGNet_for_PyTorch/examples/cifar/main.py | tcp://172.168.1.1:11111 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/VGGNet_for_Pytorch/VGGNet_for_PyTorch/examples/imagenet/main.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/VGGNet_for_Pytorch/VGGNet_for_PyTorch/setup.py | liuchangyu1111@gmail.com | 作者邮箱 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/VGGNet_for_Pytorch/VGGNet_for_PyTorch/vgg_pytorch/utils.py | https://download.pytorch.org/models/vgg16-397923af.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/VGGNet_for_Pytorch/VGGNet_for_PyTorch/vgg_pytorch/utils.py | https://download.pytorch.org/models/vgg19_bn-c79401a0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/VGGNet_for_Pytorch/VGGNet_for_PyTorch/vgg_pytorch/utils.py | https://download.pytorch.org/models/vgg19-dcbb9e9d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/VGGNet_for_Pytorch/VGGNet_for_PyTorch/vgg_pytorch/utils.py | https://download.pytorch.org/models/vgg16_bn-6c64b313.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/VGGNet_for_Pytorch/VGGNet_for_PyTorch/vgg_pytorch/utils.py | https://download.pytorch.org/models/vgg13_bn-abd245e5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/VGGNet_for_Pytorch/VGGNet_for_PyTorch/vgg_pytorch/utils.py | https://download.pytorch.org/models/vgg13-c768596a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/VGGNet_for_Pytorch/VGGNet_for_PyTorch/vgg_pytorch/utils.py | https://download.pytorch.org/models/vgg11_bn-6002323d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/VGGNet_for_Pytorch/VGGNet_for_PyTorch/vgg_pytorch/utils.py | https://download.pytorch.org/models/vgg11-bbd30ac9.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/VIT_ID2381_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/VIT_ID2381_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..86640e56f3769edc046fbdbad11eb814457baf2d --- /dev/null +++ b/PyTorch/dev/cv/image_classification/VIT_ID2381_for_PyTorch/public_address_statement.md @@ -0,0 +1,3 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|--------------------------------------------------------------------------------------|----------------------|---------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/VIT_ID2381_for_PyTorch/setup.py | lucidrains@gmail.com | 作者邮箱 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/Wide_resnet101_2_ID0398_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/Wide_resnet101_2_ID0398_for_PyTorch/public_address_statement.md new file 
mode 100644 index 0000000000000000000000000000000000000000..5cdd3a1b502ab9a2787d94bc10fa4e7d006c6b6c --- /dev/null +++ b/PyTorch/dev/cv/image_classification/Wide_resnet101_2_ID0398_for_PyTorch/public_address_statement.md @@ -0,0 +1,11 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|----------------------------------------------------------------------------------------------------|-------------------------------------------------------------------|----------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Wide_resnet101_2_ID0398_for_PyTorch/resnet.py | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 模型权重 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Wide_resnet101_2_ID0398_for_PyTorch/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Wide_resnet101_2_ID0398_for_PyTorch/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Wide_resnet101_2_ID0398_for_PyTorch/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Wide_resnet101_2_ID0398_for_PyTorch/resnet.py | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Wide_resnet101_2_ID0398_for_PyTorch/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Wide_resnet101_2_ID0398_for_PyTorch/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Wide_resnet101_2_ID0398_for_PyTorch/resnet.py | https://download.pytorch.org/models/wide_resnet50_2-95faca4d.pth | 下载预训练模型 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Wide_resnet101_2_ID0398_for_PyTorch/resnet.py | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 下载预训练模型 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/Wide_resnet50_2_ID0397_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/Wide_resnet50_2_ID0397_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..45c36382837f1fdc255b22da53b9e13fab9cd80e --- /dev/null +++ b/PyTorch/dev/cv/image_classification/Wide_resnet50_2_ID0397_for_PyTorch/public_address_statement.md @@ -0,0 +1,11 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|---------------------------------------------------------------------------------------------------|-------------------------------------------------------------------|----------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Wide_resnet50_2_ID0397_for_PyTorch/resnet.py | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 模型权重 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Wide_resnet50_2_ID0397_for_PyTorch/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Wide_resnet50_2_ID0397_for_PyTorch/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Wide_resnet50_2_ID0397_for_PyTorch/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Wide_resnet50_2_ID0397_for_PyTorch/resnet.py | 
https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Wide_resnet50_2_ID0397_for_PyTorch/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Wide_resnet50_2_ID0397_for_PyTorch/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Wide_resnet50_2_ID0397_for_PyTorch/resnet.py | https://download.pytorch.org/models/wide_resnet50_2-95faca4d.pth | 下载预训练模型 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Wide_resnet50_2_ID0397_for_PyTorch/resnet.py | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 下载预训练模型 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/Xception_ID1454_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/Xception_ID1454_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..35855b05bc8f3925f5a6f38ba0a0d9405fab1f88 --- /dev/null +++ b/PyTorch/dev/cv/image_classification/Xception_ID1454_for_PyTorch/public_address_statement.md @@ -0,0 +1,3 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|----------------------------------------------------------------------------------------------|------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/Xception_ID1454_for_PyTorch/xception.py | http//www.dropbox.com/s/1hplpzet9d7dv29/xception-c0a72b38.pth.tar?dl=1 | 权重地址 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/ZERO-DCE_ID1040_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/ZERO-DCE_ID1040_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..259f321cee5427f9aba80b191471e9c184f2231a --- /dev/null +++ b/PyTorch/dev/cv/image_classification/ZERO-DCE_ID1040_for_PyTorch/public_address_statement.md @@ -0,0 +1,22 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|-------------------------------------------------------------------------------------------------------------------------------------|-----------------------------------------------------------------------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ZERO-DCE_ID1040_for_PyTorch/Pytorch_EfficientDet/efficientnet/utils.py | http//publicmodels.blob.core.windows.net/container/advprop/efficientnet-b8-22a8fe65.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ZERO-DCE_ID1040_for_PyTorch/Pytorch_EfficientDet/efficientnet/utils.py | http//publicmodels.blob.core.windows.net/container/advprop/efficientnet-b7-4652b6dd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ZERO-DCE_ID1040_for_PyTorch/Pytorch_EfficientDet/efficientnet/utils.py | http//publicmodels.blob.core.windows.net/container/aa/efficientnet-b7-dcc49843.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ZERO-DCE_ID1040_for_PyTorch/Pytorch_EfficientDet/efficientnet/utils.py | http//publicmodels.blob.core.windows.net/container/advprop/efficientnet-b6-ac80338e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ZERO-DCE_ID1040_for_PyTorch/Pytorch_EfficientDet/efficientnet/utils.py | http//publicmodels.blob.core.windows.net/container/aa/efficientnet-b6-c76e70fd.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ZERO-DCE_ID1040_for_PyTorch/Pytorch_EfficientDet/efficientnet/utils.py | http//publicmodels.blob.core.windows.net/container/advprop/efficientnet-b5-86493f6b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ZERO-DCE_ID1040_for_PyTorch/Pytorch_EfficientDet/efficientnet/utils.py | http//publicmodels.blob.core.windows.net/container/aa/efficientnet-b5-b6417697.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ZERO-DCE_ID1040_for_PyTorch/Pytorch_EfficientDet/efficientnet/utils.py | http//publicmodels.blob.core.windows.net/container/advprop/efficientnet-b4-44fb3a87.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ZERO-DCE_ID1040_for_PyTorch/Pytorch_EfficientDet/efficientnet/utils.py | http//publicmodels.blob.core.windows.net/container/aa/efficientnet-b4-6ed6700e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ZERO-DCE_ID1040_for_PyTorch/Pytorch_EfficientDet/efficientnet/utils.py | http//publicmodels.blob.core.windows.net/container/advprop/efficientnet-b3-cdd7c0f4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ZERO-DCE_ID1040_for_PyTorch/Pytorch_EfficientDet/efficientnet/utils.py | http//publicmodels.blob.core.windows.net/container/aa/efficientnet-b3-5fb5a3c3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ZERO-DCE_ID1040_for_PyTorch/Pytorch_EfficientDet/efficientnet/utils.py | http//publicmodels.blob.core.windows.net/container/advprop/efficientnet-b2-6e9d97e5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ZERO-DCE_ID1040_for_PyTorch/Pytorch_EfficientDet/efficientnet/utils.py | http//publicmodels.blob.core.windows.net/container/aa/efficientnet-b2-8bb594d6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ZERO-DCE_ID1040_for_PyTorch/Pytorch_EfficientDet/efficientnet/utils.py | http//publicmodels.blob.core.windows.net/container/advprop/efficientnet-b1-0f3ce85a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ZERO-DCE_ID1040_for_PyTorch/Pytorch_EfficientDet/efficientnet/utils.py | http//publicmodels.blob.core.windows.net/container/aa/efficientnet-b1-f1951068.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ZERO-DCE_ID1040_for_PyTorch/Pytorch_EfficientDet/efficientnet/utils.py | http//publicmodels.blob.core.windows.net/container/advprop/efficientnet-b0-b64d5a18.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ZERO-DCE_ID1040_for_PyTorch/Pytorch_EfficientDet/efficientnet/utils.py | http//publicmodels.blob.core.windows.net/container/aa/efficientnet-b0-355c32eb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ZERO-DCE_ID1040_for_PyTorch/Pytorch_VGGNet/00-access/examples/cifar/main.py | tcp://172.168.1.1:11111 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ZERO-DCE_ID1040_for_PyTorch/Pytorch_VGGNet/00-access/examples/imagenet/main.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/ZERO-DCE_ID1040_for_PyTorch/Pytorch_VGGNet/00-access/setup.py | liuchangyu1111@gmail.com | 作者邮箱 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/deit_ID2467_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/deit_ID2467_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..fb2731e733191ad9f74f10cbad53e8d9709c94a4 --- /dev/null +++ 
b/PyTorch/dev/cv/image_classification/deit_ID2467_for_PyTorch/public_address_statement.md @@ -0,0 +1,30 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|-----------------------------------------------------------------------------------------------|-----------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/deit_ID2467_for_PyTorch/cait_models.py | https://dl.fbaipublicfiles.com/deit/XXS36_384.pth | 下载权重文件 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/deit_ID2467_for_PyTorch/cait_models.py | https://dl.fbaipublicfiles.com/deit/XXS36_224.pth | 下载权重文件 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/deit_ID2467_for_PyTorch/cait_models.py | https://dl.fbaipublicfiles.com/deit/XXS24_384.pth | 下载权重文件 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/deit_ID2467_for_PyTorch/cait_models.py | https://dl.fbaipublicfiles.com/deit/XXS24_224.pth | 下载权重文件 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/deit_ID2467_for_PyTorch/cait_models.py | https://dl.fbaipublicfiles.com/deit/XS24_384.pth | 下载权重文件 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/deit_ID2467_for_PyTorch/cait_models.py | https://dl.fbaipublicfiles.com/deit/S36_384.pth | 下载权重文件 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/deit_ID2467_for_PyTorch/cait_models.py | https://dl.fbaipublicfiles.com/deit/S24_384.pth | 下载权重文件 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/deit_ID2467_for_PyTorch/cait_models.py | https://dl.fbaipublicfiles.com/deit/S24_224.pth | 下载权重文件 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/deit_ID2467_for_PyTorch/cait_models.py | https://dl.fbaipublicfiles.com/deit/M48_448.pth | 下载权重文件 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/deit_ID2467_for_PyTorch/cait_models.py | https://dl.fbaipublicfiles.com/deit/M36_384.pth | 下载权重文件 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/deit_ID2467_for_PyTorch/models.py | https://dl.fbaipublicfiles.com/deit/deit_tiny_patch16_224-a1311bcf.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/deit_ID2467_for_PyTorch/models.py | https://dl.fbaipublicfiles.com/deit/deit_tiny_distilled_patch16_224-b40b3cf7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/deit_ID2467_for_PyTorch/models.py | https://dl.fbaipublicfiles.com/deit/deit_small_patch16_224-cd65a155.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/deit_ID2467_for_PyTorch/models.py | https://dl.fbaipublicfiles.com/deit/deit_small_distilled_patch16_224-649709d9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/deit_ID2467_for_PyTorch/models.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_384-8de9b5d1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/deit_ID2467_for_PyTorch/models.py | https://dl.fbaipublicfiles.com/deit/deit_base_patch16_224-b5f2ef4d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/deit_ID2467_for_PyTorch/models.py | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_384-d0272ac0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/deit_ID2467_for_PyTorch/models.py | https://dl.fbaipublicfiles.com/deit/deit_base_distilled_patch16_224-df68dfff.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/deit_ID2467_for_PyTorch/resmlp_models.py | https://dl.fbaipublicfiles.com/deit/resmlpB_24_no_dist.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/deit_ID2467_for_PyTorch/resmlp_models.py | https://dl.fbaipublicfiles.com/deit/resmlpB_24_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/deit_ID2467_for_PyTorch/resmlp_models.py | https://dl.fbaipublicfiles.com/deit/resmlpB_24_22k.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/deit_ID2467_for_PyTorch/resmlp_models.py | https://dl.fbaipublicfiles.com/deit/resmlp_36_no_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/deit_ID2467_for_PyTorch/resmlp_models.py | https://dl.fbaipublicfiles.com/deit/resmlp_36_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/deit_ID2467_for_PyTorch/resmlp_models.py | https://dl.fbaipublicfiles.com/deit/resmlp_24_no_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/deit_ID2467_for_PyTorch/resmlp_models.py | https://dl.fbaipublicfiles.com/deit/resmlp_24_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/deit_ID2467_for_PyTorch/resmlp_models.py | https://dl.fbaipublicfiles.com/deit/resmlp_24_dino.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/deit_ID2467_for_PyTorch/resmlp_models.py | https://dl.fbaipublicfiles.com/deit/resmlp_12_no_dist.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/deit_ID2467_for_PyTorch/resmlp_models.py | https://dl.fbaipublicfiles.com/deit/resmlp_12_dist.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/mBART_ID1550_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/mBART_ID1550_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..68a2b423558b8e02fdb14d2a5bf60d8b40e92bd9 --- /dev/null +++ b/PyTorch/dev/cv/image_classification/mBART_ID1550_for_PyTorch/public_address_statement.md @@ -0,0 +1,8 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|--------------------------------------------------------------------------------------------|----------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/mBART_ID1550_for_PyTorch/onmt/opts.py | https://arxiv.org/pdf/1803.02155.pdf | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/mBART_ID1550_for_PyTorch/onmt/opts.py | https://arxiv.org/abs/1909.02074 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/mBART_ID1550_for_PyTorch/onmt/opts.py | https://arxiv.org/abs/1512.00567 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/mBART_ID1550_for_PyTorch/onmt/opts.py | https://keras.io/optimizers/ | 优化器链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/mBART_ID1550_for_PyTorch/onmt/opts.py | https://www.tensorflow.org/api_docs/python/tf/train/Adam | 优化器链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/mBART_ID1550_for_PyTorch/onmt/opts.py | http://www.aclweb.org/anthology/P18-4020 | 参数相关说明 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_classification/residual_adapters_ID1598_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_classification/residual_adapters_ID1598_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..eb4b72f5ef3ccd8aec407a85789f2776d0242e48 --- /dev/null +++ b/PyTorch/dev/cv/image_classification/residual_adapters_ID1598_for_PyTorch/public_address_statement.md @@ -0,0 +1,6 @@ +| 文件位置 | 公网地址 | 公网地址用途 | 
+|------------------------------------------------------------------------------------------------------------------|-------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/residual_adapters_ID1598_for_PyTorch/download_data.sh | http://www.robots.ox.ac.uk/~vgg/share/decathlon-1.0-devkit.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/residual_adapters_ID1598_for_PyTorch/download_data.sh | http://www.robots.ox.ac.uk/~vgg/share/decathlon-1.0-data.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/residual_adapters_ID1598_for_PyTorch/matconvnet/cnn_cifar.m | http://www.cs.toronto.edu/~kriz/cifar-10-matlab.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_classification/residual_adapters_ID1598_for_PyTorch/matconvnet/cnn_cifar.m | http://www.cs.toronto.edu/~kriz/cifar-100-matlab.tar.gz | 模型相关说明 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_segmentation/DeepLabV3+_ID0458_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_segmentation/DeepLabV3+_ID0458_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..a0ebcadffb76653f63604ea3e5b82c2509bb3c9f --- /dev/null +++ b/PyTorch/dev/cv/image_segmentation/DeepLabV3+_ID0458_for_PyTorch/public_address_statement.md @@ -0,0 +1,18 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|------------------------------------------------------------------------------------------------------------------|----------------------------------------------------------------------------|----------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_segmentation/DeepLabV3+_ID0458_for_PyTorch/datasets/voc.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2012/VOCtrainval_11-May-2012.tar | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_segmentation/DeepLabV3+_ID0458_for_PyTorch/datasets/voc.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2011/VOCtrainval_25-May-2011.tar | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_segmentation/DeepLabV3+_ID0458_for_PyTorch/datasets/voc.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2010/VOCtrainval_03-May-2010.tar | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_segmentation/DeepLabV3+_ID0458_for_PyTorch/datasets/voc.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2009/VOCtrainval_11-May-2009.tar | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_segmentation/DeepLabV3+_ID0458_for_PyTorch/datasets/voc.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2008/VOCtrainval_14-Jul-2008.tar | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_segmentation/DeepLabV3+_ID0458_for_PyTorch/datasets/voc.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2007/VOCtrainval_06-Nov-2007.tar | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_segmentation/DeepLabV3+_ID0458_for_PyTorch/network/backbone/mobilenetv2.py | https://download.pytorch.org/models/mobilenet_v2-b0353104.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_segmentation/DeepLabV3+_ID0458_for_PyTorch/network/backbone/resnet.py | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 模型权重 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_segmentation/DeepLabV3+_ID0458_for_PyTorch/network/backbone/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_segmentation/DeepLabV3+_ID0458_for_PyTorch/network/backbone/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/dev/cv/image_segmentation/DeepLabV3+_ID0458_for_PyTorch/network/backbone/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_segmentation/DeepLabV3+_ID0458_for_PyTorch/network/backbone/resnet.py | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_segmentation/DeepLabV3+_ID0458_for_PyTorch/network/backbone/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_segmentation/DeepLabV3+_ID0458_for_PyTorch/network/backbone/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_segmentation/DeepLabV3+_ID0458_for_PyTorch/network/backbone/resnet.py | https://download.pytorch.org/models/wide_resnet50_2-95faca4d.pth | 下载预训练模型 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_segmentation/DeepLabV3+_ID0458_for_PyTorch/network/backbone/resnet.py | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 下载预训练模型 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_segmentation/DeepLabV3_ID0621_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_segmentation/DeepLabV3_ID0621_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..76af10447191845ab3ffffffa4b22d7854981e04 --- /dev/null +++ b/PyTorch/dev/cv/image_segmentation/DeepLabV3_ID0621_for_PyTorch/public_address_statement.md @@ -0,0 +1,18 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|-----------------------------------------------------------------------------------------------------------------|----------------------------------------------------------------------------|----------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_segmentation/DeepLabV3_ID0621_for_PyTorch/datasets/voc.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2012/VOCtrainval_11-May-2012.tar | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_segmentation/DeepLabV3_ID0621_for_PyTorch/datasets/voc.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2011/VOCtrainval_25-May-2011.tar | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_segmentation/DeepLabV3_ID0621_for_PyTorch/datasets/voc.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2010/VOCtrainval_03-May-2010.tar | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_segmentation/DeepLabV3_ID0621_for_PyTorch/datasets/voc.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2009/VOCtrainval_11-May-2009.tar | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_segmentation/DeepLabV3_ID0621_for_PyTorch/datasets/voc.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2008/VOCtrainval_14-Jul-2008.tar | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_segmentation/DeepLabV3_ID0621_for_PyTorch/datasets/voc.py | http://host.robots.ox.ac.uk/pascal/VOC/voc2007/VOCtrainval_06-Nov-2007.tar | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_segmentation/DeepLabV3_ID0621_for_PyTorch/network/backbone/mobilenetv2.py | https://download.pytorch.org/models/mobilenet_v2-b0353104.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_segmentation/DeepLabV3_ID0621_for_PyTorch/network/backbone/resnet.py | https://download.pytorch.org/models/resnext50_32x4d-7cdf4587.pth | 模型权重 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_segmentation/DeepLabV3_ID0621_for_PyTorch/network/backbone/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/dev/cv/image_segmentation/DeepLabV3_ID0621_for_PyTorch/network/backbone/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_segmentation/DeepLabV3_ID0621_for_PyTorch/network/backbone/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_segmentation/DeepLabV3_ID0621_for_PyTorch/network/backbone/resnet.py | https://download.pytorch.org/models/resnext101_32x8d-8ba56ff5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_segmentation/DeepLabV3_ID0621_for_PyTorch/network/backbone/resnet.py | https://download.pytorch.org/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_segmentation/DeepLabV3_ID0621_for_PyTorch/network/backbone/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_segmentation/DeepLabV3_ID0621_for_PyTorch/network/backbone/resnet.py | https://download.pytorch.org/models/wide_resnet50_2-95faca4d.pth | 下载预训练模型 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_segmentation/DeepLabV3_ID0621_for_PyTorch/network/backbone/resnet.py | https://download.pytorch.org/models/wide_resnet101_2-32ee1156.pth | 下载预训练模型 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_segmentation/NAS-SEGM_ID1142_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_segmentation/NAS-SEGM_ID1142_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..e1e778b7ac3ad10147bd937ef2a850a218706dd8 --- /dev/null +++ b/PyTorch/dev/cv/image_segmentation/NAS-SEGM_ID1142_for_PyTorch/public_address_statement.md @@ -0,0 +1,3 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|------------------------------------------------------------------------------------------------------------|-----------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_segmentation/NAS-SEGM_ID1142_for_PyTorch/src/kd/rf_lw/model_lw_v2.py | https://cloudstor.aarnet.edu.au/plus/s/2w8bFOd45JtPqbD/download | 权重地址 | diff --git a/PyTorch/dev/cv/image_segmentation/RefineNet_ID1814_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_segmentation/RefineNet_ID1814_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..0e6f044a2ff0db5a8586bb38977c7b69aeedb601 --- /dev/null +++ b/PyTorch/dev/cv/image_segmentation/RefineNet_ID1814_for_PyTorch/public_address_statement.md @@ -0,0 +1,20 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|-----------------------------------------------------------------------------------------------------|-----------------------------------------------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_segmentation/RefineNet_ID1814_for_PyTorch/models/mobilenet.py | https://cloudstor.aarnet.edu.au/plus/s/nQ6wDnTEFhyidot/download | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_segmentation/RefineNet_ID1814_for_PyTorch/models/mobilenet.py | https://cloudstor.aarnet.edu.au/plus/s/uRgFbkaRjD3qOg5/download | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_segmentation/RefineNet_ID1814_for_PyTorch/models/resnet.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_segmentation/RefineNet_ID1814_for_PyTorch/models/resnet.py | https://download.pytorch.org/models/resnet152-b121ed2d.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/dev/cv/image_segmentation/RefineNet_ID1814_for_PyTorch/models/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_segmentation/RefineNet_ID1814_for_PyTorch/models/resnet.py | https://download.pytorch.org/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_segmentation/RefineNet_ID1814_for_PyTorch/models/resnet.py | https://cloudstor.aarnet.edu.au/plus/s/xp7GcVKC0GbxhTv/download | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_segmentation/RefineNet_ID1814_for_PyTorch/models/resnet.py | https://cloudstor.aarnet.edu.au/plus/s/mLA7NxVSPjNL7Oo/download | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_segmentation/RefineNet_ID1814_for_PyTorch/models/resnet.py | https://cloudstor.aarnet.edu.au/plus/s/gE8dnQmHr9svpfu/download | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_segmentation/RefineNet_ID1814_for_PyTorch/models/resnet.py | https://cloudstor.aarnet.edu.au/plus/s/2w8bFOd45JtPqbD/download | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_segmentation/RefineNet_ID1814_for_PyTorch/models/resnet.py | https://cloudstor.aarnet.edu.au/plus/s/Ql64rWqiTvWGAA0/download | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_segmentation/RefineNet_ID1814_for_PyTorch/models/resnet.py | https://cloudstor.aarnet.edu.au/plus/s/EkPQzB2KtrrDnKf/download | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_segmentation/RefineNet_ID1814_for_PyTorch/models/resnet.py | https://cloudstor.aarnet.edu.au/plus/s/O84NszlYlsu00fW/download | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_segmentation/RefineNet_ID1814_for_PyTorch/models/resnet.py | https://cloudstor.aarnet.edu.au/plus/s/CPRKWiaCIDRdOwF/download | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_segmentation/RefineNet_ID1814_for_PyTorch/models/resnet.py | https://cloudstor.aarnet.edu.au/plus/s/f1tGGpwdCnYS3xu/download | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_segmentation/RefineNet_ID1814_for_PyTorch/models/resnet.py | https://cloudstor.aarnet.edu.au/plus/s/VnsaSUHNZkuIqeB/download | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_segmentation/RefineNet_ID1814_for_PyTorch/models/resnet.py | https://cloudstor.aarnet.edu.au/plus/s/hqmplxWOBbOYYjN/download | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_segmentation/RefineNet_ID1814_for_PyTorch/src/miou_utils.c | vladimir.nekrasov@adelaide.edu.au | Copyright信息 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_segmentation/deeplabv3+_ID0326_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_segmentation/deeplabv3+_ID0326_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..20a92fe3a591e0b1d6a0ea48afbf2f7a96911ece --- /dev/null +++ b/PyTorch/dev/cv/image_segmentation/deeplabv3+_ID0326_for_PyTorch/public_address_statement.md @@ -0,0 +1,6 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|--------------------------------------------------------------------------------------------------------------|------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_segmentation/deeplabv3+_ID0326_for_PyTorch/modeling/backbone/drn.py | https://download.pytorch.org/models/resnet50-19c8e357.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_segmentation/deeplabv3+_ID0326_for_PyTorch/modeling/backbone/drn.py | http://dl.yf.io/drn/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_segmentation/deeplabv3+_ID0326_for_PyTorch/modeling/backbone/drn.py | http://dl.yf.io/drn/ | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/dev/cv/image_segmentation/deeplabv3+_ID0326_for_PyTorch/modeling/backbone/resnet.py | https://download.pytorch.org/models/resnet101-5d3b4d8f.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/dev/cv/image_synthesis/GAN_ID1931_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/image_synthesis/GAN_ID1931_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..1a1e25f389a13f6884829610d26242d9bff8ba76 --- /dev/null +++ b/PyTorch/dev/cv/image_synthesis/GAN_ID1931_for_PyTorch/public_address_statement.md @@ -0,0 +1,4 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|----------------------------------------------------------------------------------------------------------|-----------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/dev/cv/image_synthesis/GAN_ID1931_for_PyTorch/data/download_cyclegan_dataset.sh | https://people.eecs.berkeley.edu/~taesung_park/CycleGAN/datasets/$FILE.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/image_synthesis/GAN_ID1931_for_PyTorch/data/download_pix2pix_dataset.sh | https://people.eecs.berkeley.edu/~tinghuiz/projects/pix2pix/datasets/$FILE.tar.gz | 数据集链接 | \ No newline at end of file diff --git a/PyTorch/dev/cv/multimodality/st-gcn_ID2967_for_PyTorch/public_address_statement.md b/PyTorch/dev/cv/multimodality/st-gcn_ID2967_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..8336ec39e2358dde68fb6f9c7f2b2f6dcc20d501 --- /dev/null +++ b/PyTorch/dev/cv/multimodality/st-gcn_ID2967_for_PyTorch/public_address_statement.md @@ -0,0 +1,4 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|---------------------------------------------------------------------------------------------|------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/dev/cv/multimodality/st-gcn_ID2967_for_PyTorch/tools/get_models.sh | http://posefs1.perception.cs.cmu.edu/OpenPose/models/ | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/cv/multimodality/st-gcn_ID2967_for_PyTorch/tools/get_models.sh | https://open-mmlab.s3.ap-northeast-2.amazonaws.com/mmskeleton/models/st-gcn/ | 权重地址 | \ No newline at end of file diff --git a/PyTorch/dev/nlp/BERT-ITPT-FiT_ID0340_for_PyTorch/public_address_statement.md b/PyTorch/dev/nlp/BERT-ITPT-FiT_ID0340_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..c061609464248ae6796c5877298c867ee090f7f6 --- /dev/null +++ b/PyTorch/dev/nlp/BERT-ITPT-FiT_ID0340_for_PyTorch/public_address_statement.md @@ -0,0 +1,31 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|--------------------------------------------------------------------------------------------|-----------------------------------------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/dev/nlp/BERT-ITPT-FiT_ID0340_for_PyTorch/bert_utils/bertmodel.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-whole-word-masking-finetuned-squad-config.json | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BERT-ITPT-FiT_ID0340_for_PyTorch/bert_utils/bertmodel.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-whole-word-masking-config.json | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BERT-ITPT-FiT_ID0340_for_PyTorch/bert_utils/bertmodel.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-config.json | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/dev/nlp/BERT-ITPT-FiT_ID0340_for_PyTorch/bert_utils/bertmodel.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-whole-word-masking-finetuned-squad-config.json | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BERT-ITPT-FiT_ID0340_for_PyTorch/bert_utils/bertmodel.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-whole-word-masking-config.json | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BERT-ITPT-FiT_ID0340_for_PyTorch/bert_utils/bertmodel.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-config.json | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BERT-ITPT-FiT_ID0340_for_PyTorch/bert_utils/bertmodel.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-config.json | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BERT-ITPT-FiT_ID0340_for_PyTorch/bert_utils/bertmodel.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-uncased-config.json | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BERT-ITPT-FiT_ID0340_for_PyTorch/bert_utils/bertmodel.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-cased-config.json | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BERT-ITPT-FiT_ID0340_for_PyTorch/bert_utils/bertmodel.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-german-cased-config.json | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BERT-ITPT-FiT_ID0340_for_PyTorch/bert_utils/bertmodel.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-chinese-config.json | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BERT-ITPT-FiT_ID0340_for_PyTorch/bert_utils/bertmodel.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased-finetuned-mrpc-config.json | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BERT-ITPT-FiT_ID0340_for_PyTorch/bert_utils/bertmodel.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased-config.json | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BERT-ITPT-FiT_ID0340_for_PyTorch/bert_utils/bertmodel.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-whole-word-masking-finetuned-squad-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BERT-ITPT-FiT_ID0340_for_PyTorch/bert_utils/bertmodel.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-whole-word-masking-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BERT-ITPT-FiT_ID0340_for_PyTorch/bert_utils/bertmodel.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BERT-ITPT-FiT_ID0340_for_PyTorch/bert_utils/bertmodel.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-whole-word-masking-finetuned-squad-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BERT-ITPT-FiT_ID0340_for_PyTorch/bert_utils/bertmodel.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-whole-word-masking-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BERT-ITPT-FiT_ID0340_for_PyTorch/bert_utils/bertmodel.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BERT-ITPT-FiT_ID0340_for_PyTorch/bert_utils/bertmodel.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BERT-ITPT-FiT_ID0340_for_PyTorch/bert_utils/bertmodel.py | 
https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-uncased-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BERT-ITPT-FiT_ID0340_for_PyTorch/bert_utils/bertmodel.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-cased-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BERT-ITPT-FiT_ID0340_for_PyTorch/bert_utils/bertmodel.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-german-cased-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BERT-ITPT-FiT_ID0340_for_PyTorch/bert_utils/bertmodel.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-chinese-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BERT-ITPT-FiT_ID0340_for_PyTorch/bert_utils/bertmodel.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased-finetuned-mrpc-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BERT-ITPT-FiT_ID0340_for_PyTorch/bert_utils/bertmodel.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BERT-ITPT-FiT_ID0340_for_PyTorch/bert_utils/bertmodel.py | https://pytorch.org/docs/stable/nn.html#module | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BERT-ITPT-FiT_ID0340_for_PyTorch/bert_utils/bertmodel.py | https://www.tensorflow.org/install/ | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BERT-ITPT-FiT_ID0340_for_PyTorch/download_imdb_dataset.sh | http://ai.stanford.edu/~amaas/data/sentiment/aclImdb_v1.tar.gz --output aclImdb_v1.tar.gz | 模型相关说明 | \ No newline at end of file diff --git a/PyTorch/dev/nlp/BERT_base_for_PyTorch/public_address_statement.md b/PyTorch/dev/nlp/BERT_base_for_PyTorch/public_address_statement.md index 44b65c38ac745104f5f830e499257e95e383ab97..472e3d974d44a5d498e3c47eda12263431f0beb7 100644 --- a/PyTorch/dev/nlp/BERT_base_for_PyTorch/public_address_statement.md +++ b/PyTorch/dev/nlp/BERT_base_for_PyTorch/public_address_statement.md @@ -1,3 +1,31 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|----------------------------------------------------------------------------------------------------------------------------------|----------------------------------------------------|-----------------------|--------| -| 开发引入 | / | BERT_base_for_PyTorch/requirements.txt | https://github.com/NVIDIA/dllogger | 相关依赖 | +| 文件位置 | 公网地址 | 公网地址用途 | +|-------------------------------------------------------------------------------------------------|--------------------------------------------------------------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/dev/nlp/BERT_base_for_PyTorch/data/GLUEDownloader.py | https://gist.githubusercontent.com/W4ngatang/60c2bdb54d156a41194446737ce03e2e/raw/17b8dd0d724281ed7c3b2aeeda662b92809aadd5/download_glue_data.py | 代码下载链接 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BERT_base_for_PyTorch/data/GooglePretrainedWeightDownloader.py | https://storage.googleapis.com/bert_models/2018_10_18/uncased_L-24_H-1024_A-16.zip | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BERT_base_for_PyTorch/data/GooglePretrainedWeightDownloader.py | https://storage.googleapis.com/bert_models/2018_11_03/multilingual_L-12_H-768_A-12.zip'', ''multilingual_L-12_H-768_A-12.zip | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BERT_base_for_PyTorch/data/GooglePretrainedWeightDownloader.py | https://storage.googleapis.com/bert_models/2018_10_18/cased_L-24_H-1024_A-16.zip'', 
''cased_L-24_H-1024_A-16.zip | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BERT_base_for_PyTorch/data/GooglePretrainedWeightDownloader.py | https://storage.googleapis.com/bert_models/2018_10_18/uncased_L-12_H-768_A-12.zip'', ''uncased_L-12_H-768_A-12.zip | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BERT_base_for_PyTorch/data/GooglePretrainedWeightDownloader.py | https://storage.googleapis.com/bert_models/2018_11_23/multi_cased_L-12_H-768_A-12.zip | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BERT_base_for_PyTorch/data/GooglePretrainedWeightDownloader.py | https://storage.googleapis.com/bert_models/2018_11_03/chinese_L-12_H-768_A-12.zip | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BERT_base_for_PyTorch/data/GooglePretrainedWeightDownloader.py | https://storage.googleapis.com/bert_models/2018_10_18/cased_L-12_H-768_A-12.zip'', ''cased_L-12_H-768_A-12.zip | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BERT_base_for_PyTorch/data/squad/squad_download.sh | https://worksheets.codalab.org/rest/bundles/0xbcd57bee090b421c982906709c8c27e1/contents/blob/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BERT_base_for_PyTorch/data/squad/squad_download.sh | https://worksheets.codalab.org/rest/bundles/0x6b567e1cf2e041ec80d7098f031c5c9e/contents/blob/ | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BERT_base_for_PyTorch/data/SquadDownloader.py | https://worksheets.codalab.org/rest/bundles/0xbcd57bee090b421c982906709c8c27e1/contents/blob/ | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BERT_base_for_PyTorch/data/SquadDownloader.py | https://worksheets.codalab.org/rest/bundles/0x6b567e1cf2e041ec80d7098f031c5c9e/contents/blob/ | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BERT_base_for_PyTorch/data/WikiDownloader.py | https://dumps.wikimedia.org/zhwiki/latest/zhwiki-latest-pages-articles.xml.bz2 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BERT_base_for_PyTorch/data/WikiDownloader.py | https://dumps.wikimedia.org/enwiki/latest/enwiki-latest-pages-articles.xml.bz2 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BERT_base_for_PyTorch/modeling.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BERT_base_for_PyTorch/modeling.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BERT_base_for_PyTorch/modeling.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BERT_base_for_PyTorch/modeling.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-uncased.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BERT_base_for_PyTorch/modeling.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-cased.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BERT_base_for_PyTorch/modeling.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-chinese.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BERT_base_for_PyTorch/modeling.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BERT_base_for_PyTorch/modeling.py | https://www.tensorflow.org/install/ | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BERT_base_for_PyTorch/tokenization.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-vocab.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BERT_base_for_PyTorch/tokenization.py | 
https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-vocab.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BERT_base_for_PyTorch/tokenization.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BERT_base_for_PyTorch/tokenization.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-uncased-vocab.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BERT_base_for_PyTorch/tokenization.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-cased-vocab.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BERT_base_for_PyTorch/tokenization.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-chinese-vocab.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BERT_base_for_PyTorch/tokenization.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased-vocab.txt | 模型相关说明 | \ No newline at end of file diff --git a/PyTorch/dev/nlp/BERT_base_for_PyTorch/scripts/run_pretraining_npu_8p.sh b/PyTorch/dev/nlp/BERT_base_for_PyTorch/scripts/run_pretraining_npu_8p.sh index e51dd826c983091b101375748437d2ddbccfef5b..b7c18ca3780f5c7a486bae72863c7ffca94cde78 100644 --- a/PyTorch/dev/nlp/BERT_base_for_PyTorch/scripts/run_pretraining_npu_8p.sh +++ b/PyTorch/dev/nlp/BERT_base_for_PyTorch/scripts/run_pretraining_npu_8p.sh @@ -152,7 +152,7 @@ CMD+=" $INIT_CHECKPOINT" CMD+=" --do_train" CMD+=" --use_npu" CMD+=" --loss_scale=16384.0" -CMD+=" --addr=90.90.176.102" +CMD+=" --addr=x.x.x.x" # change to your address CMD+=" --json-summary ${RESULTS_DIR}/dllogger.json " CMD="python3.7 -u $CMD" diff --git a/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/public_address_statement.md b/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..4003b5c8fc168a2b456ebd0cbc15cbf6fd2f7dfd --- /dev/null +++ b/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/public_address_statement.md @@ -0,0 +1,62 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|-------------------------------------------------------------------------------------------------------|-----------------------------------------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/models/transformers/configuration_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-whole-word-masking-finetuned-squad-config.json | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/models/transformers/configuration_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-whole-word-masking-config.json | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/models/transformers/configuration_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-config.json | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/models/transformers/configuration_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-whole-word-masking-finetuned-squad-config.json | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/models/transformers/configuration_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-whole-word-masking-config.json | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/models/transformers/configuration_bert.py | 
https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-config.json | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/models/transformers/configuration_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-config.json | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/models/transformers/configuration_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-uncased-config.json | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/models/transformers/configuration_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-cased-config.json | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/models/transformers/configuration_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-german-dbmdz-uncased-config.json | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/models/transformers/configuration_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-german-dbmdz-cased-config.json | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/models/transformers/configuration_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-german-cased-config.json | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/models/transformers/configuration_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-chinese-config.json | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/models/transformers/configuration_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased-finetuned-mrpc-config.json | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/models/transformers/configuration_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased-config.json | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/models/transformers/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-whole-word-masking-finetuned-squad-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/models/transformers/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-whole-word-masking-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/models/transformers/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/models/transformers/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-whole-word-masking-finetuned-squad-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/models/transformers/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-whole-word-masking-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/models/transformers/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/models/transformers/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/models/transformers/modeling_bert.py | 
https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-uncased-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/models/transformers/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-cased-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/models/transformers/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-german-dbmdz-uncased-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/models/transformers/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-german-dbmdz-cased-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/models/transformers/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-german-cased-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/models/transformers/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-chinese-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/models/transformers/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased-finetuned-mrpc-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/models/transformers/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/models/transformers/modeling_bert.py | https://pytorch.org/docs/stable/nn.html#module | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/models/transformers/modeling_bert.py | https://www.tensorflow.org/install/ | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/models/transformers/modeling_utils.py | https://pytorch.org/ | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/models/transformers/modeling_utils.py | https://www.tensorflow.org/install/ | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/models/transformers/tokenization_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-whole-word-masking-finetuned-squad-vocab.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/models/transformers/tokenization_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-whole-word-masking-vocab.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/models/transformers/tokenization_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-vocab.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/models/transformers/tokenization_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-whole-word-masking-finetuned-squad-vocab.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/models/transformers/tokenization_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-whole-word-masking-vocab.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/models/transformers/tokenization_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-vocab.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/models/transformers/tokenization_bert.py 
| https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/models/transformers/tokenization_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-uncased-vocab.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/models/transformers/tokenization_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-cased-vocab.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/models/transformers/tokenization_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-german-dbmdz-uncased-vocab.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/models/transformers/tokenization_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-german-dbmdz-cased-vocab.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/models/transformers/tokenization_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-chinese-vocab.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/models/transformers/tokenization_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased-finetuned-mrpc-vocab.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/models/transformers/tokenization_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased-vocab.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/models/transformers/tokenization_bert.py | https://int-deepset-models-bert.s3.eu-central-1.amazonaws.com/pytorch/bert-base-german-cased-vocab.txt | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/tools/download_clue_data.py | https://storage.googleapis.com/cluebenchmark/tasks/cmrc2018_public.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/tools/download_clue_data.py | https://storage.googleapis.com/cluebenchmark/tasks/cluener_public.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/tools/download_clue_data.py | https://storage.googleapis.com/cluebenchmark/tasks/wsc_public.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/tools/download_clue_data.py | https://storage.googleapis.com/cluebenchmark/tasks/tnews_public.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/tools/download_clue_data.py | https://storage.googleapis.com/cluebenchmark/tasks/iflytek_public.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/tools/download_clue_data.py | https://storage.googleapis.com/cluebenchmark/tasks/drcd_public.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/tools/download_clue_data.py | https://storage.googleapis.com/cluebenchmark/tasks/csl_public.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/tools/download_clue_data.py | https://storage.googleapis.com/cluebenchmark/tasks/copa_public.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/tools/download_clue_data.py | https://storage.googleapis.com/cluebenchmark/tasks/cmnli_public.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/tools/download_clue_data.py | https://storage.googleapis.com/cluebenchmark/tasks/chid_public.zip | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/BertNER_ID3438_for_PyTorch/tools/download_clue_data.py | 
https://storage.googleapis.com/cluebenchmark/tasks/afqmc_public.zip | 数据集地址 | \ No newline at end of file diff --git a/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/public_address_statement.md b/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..d46b4899acb571f2caa31b5678ae156f19503a21 --- /dev/null +++ b/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/public_address_statement.md @@ -0,0 +1,67 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|-------------------------------------------------------------------------------------------------------------------------------------------------------------------|-------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | http://statmt.org/wmt14/test-full.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | http://statmt.org/wmt13/training-parallel-europarl-v7.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | http://statmt.org/wmt13/training-parallel-commoncrawl.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | http://data.statmt.org/wmt18/translation-task/training-parallel-nc-v13.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | http://data.statmt.org/wmt18/translation-task/rapid2016.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/examples/joint_alignment_translation/prepare-wmt18en2de_no_norm_no_escape_no_agressive.sh | http://data.statmt.org/wmt17/translation-task/dev.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/examples/language_model/prepare-wikitext-103.sh | https://s3.amazonaws.com/research.metamind.io/wikitext/wikitext-103-v1.zip | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/examples/roberta/commonsense_qa/download_cqa_data.sh | https://s3.amazonaws.com/commensenseqa/dev_rand_split.jsonl | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/examples/roberta/commonsense_qa/download_cqa_data.sh | https://s3.amazonaws.com/commensenseqa/train_rand_split.jsonl | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/examples/roberta/commonsense_qa/download_cqa_data.sh | https://s3.amazonaws.com/commensenseqa/test_rand_split_no_answers.jsonl | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/examples/roberta/commonsense_qa/download_cqa_data.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/dict.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/examples/roberta/preprocess_GLUE_tasks.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/encoder.json | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/examples/roberta/preprocess_GLUE_tasks.sh | 
https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/vocab.bpe | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/examples/roberta/preprocess_GLUE_tasks.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/dict.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/examples/roberta/preprocess_RACE.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/encoder.json | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/examples/roberta/preprocess_RACE.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/vocab.bpe | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/examples/roberta/preprocess_RACE.sh | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/dict.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/examples/speech_recognition/datasets/prepare-librispeech.sh | www.openslr.org/resources/12 | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/examples/translation/prepare-wmt14en2de.sh | http://statmt.org/wmt14/training-parallel-nc-v9.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/examples/translation/prepare-wmt14en2de.sh | http://statmt.org/wmt14/test-full.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/examples/translation/prepare-wmt14en2de.sh | http://statmt.org/wmt13/training-parallel-europarl-v7.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/examples/translation/prepare-wmt14en2de.sh | http://statmt.org/wmt13/training-parallel-commoncrawl.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/examples/translation/prepare-wmt14en2de.sh | http://data.statmt.org/wmt17/translation-task/training-parallel-nc-v12.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/examples/translation/prepare-wmt14en2de.sh | http://data.statmt.org/wmt17/translation-task/dev.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh | http://statmt.org/wmt10/training-giga-fren.tar | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh | http://statmt.org/wmt14/training-parallel-nc-v9.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh | http://statmt.org/wmt14/test-full.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh | http://statmt.org/wmt13/training-parallel-un.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh | http://statmt.org/wmt13/training-parallel-europarl-v7.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/examples/translation/prepare-wmt14en2fr.sh | http://statmt.org/wmt13/training-parallel-commoncrawl.tgz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/fairseq/data/encoders/gpt2_bpe.py | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/encoder.json | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/fairseq/data/encoders/gpt2_bpe.py | https://dl.fbaipublicfiles.com/fairseq/gpt2_bpe/vocab.bpe | 模型相关说明 | +| 
ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.mnli.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/fairseq/models/bart/model.py | http://dl.fbaipublicfiles.com/fairseq/models/bart.large.cnn.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/fairseq/models/fconv.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt17.v2.en-de.fconv-py.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/fairseq/models/fconv.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt14.v2.en-fr.fconv-py.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/fairseq/models/fconv.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt14.en-de.fconv-py.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/fairseq/models/fconv_self_att.py | https://dl.fbaipublicfiles.com/fairseq/data/stories_test.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/fairseq/models/fconv_self_att.py | https://dl.fbaipublicfiles.com/fairseq/models/stories_checkpoint.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/fairseq/models/fconv_self_att.py | https://dl.fbaipublicfiles.com/fairseq/models/stories_checkpoint.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/xlmr.large.v0.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/xlmr.base.v0.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.wsc.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.large.mnli.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/roberta.base.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/fairseq/models/roberta/model.py | http://dl.fbaipublicfiles.com/fairseq/models/camembert.v0.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.ru-en.ensemble.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.ru-en.single_model.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/fairseq/models/transformer.py | 
https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-ru.ensemble.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-ru.single_model.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-de.joined-dict.ensemble.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.en-de.joined-dict.single_model.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.de-en.joined-dict.ensemble.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt19.de-en.joined-dict.single_model.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt18.en-de.ensemble.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt16.en-de.joined-dict.transformer.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/fairseq/models/transformer.py | https://dl.fbaipublicfiles.com/fairseq/models/wmt14.en-fr.joined-dict.transformer.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.ru.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.en.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/wmt19.de.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/adaptive_lm_wiki103.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/fairseq/models/transformer_lm.py | https://dl.fbaipublicfiles.com/fairseq/models/lm/adaptive_lm_gbw_huge.tar.bz2 | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/FairSeq_Transformer_ID0496_for_PyTorch/setup.py | https://download.pytorch.org/whl/cpu/torch-1.3.0%2Bcpu-cp36-cp36m-linux_x86_64.whl | 三方库链接 | \ No newline at end of file diff --git a/PyTorch/dev/nlp/Retinanet_for_PyTorch/public_address_statement.md b/PyTorch/dev/nlp/Retinanet_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..ea36a4b653bf014140098c668d8814417038d1f4 --- /dev/null +++ b/PyTorch/dev/nlp/Retinanet_for_PyTorch/public_address_statement.md @@ -0,0 +1,7 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|-----------------------------------------------------------------------|--------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/dev/nlp/Retinanet_for_PyTorch/nets/resnet.py | http//s3.amazonaws.com/pytorch/models/resnet50-19c8e357.pth | 权重地址 | 
+| ModelZoo-PyTorch/PyTorch/dev/nlp/Retinanet_for_PyTorch/nets/resnet.py | http//s3.amazonaws.com/pytorch/models/resnet34-333f7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/Retinanet_for_PyTorch/nets/resnet.py | http//s3.amazonaws.com/pytorch/models/resnet18-5c106cde.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/Retinanet_for_PyTorch/nets/resnet.py | http//s3.amazonaws.com/pytorch/models/resnet152-b121ed2d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/Retinanet_for_PyTorch/nets/resnet.py | http//s3.amazonaws.com/pytorch/models/resnet101-5d3b4d8f.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/dev/nlp/SpanBERT_ID0337_for_PyTorch/public_address_statement.md b/PyTorch/dev/nlp/SpanBERT_ID0337_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..4a8903867a454fc9ac92481a33387e5846bf2a7d --- /dev/null +++ b/PyTorch/dev/nlp/SpanBERT_ID0337_for_PyTorch/public_address_statement.md @@ -0,0 +1,16 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|-----------------------------------------------------------------------------------------------------------|----------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/dev/nlp/SpanBERT_ID0337_for_PyTorch/code/download_finetuned.sh | http://dl.fbaipublicfiles.com/fairseq/models/spanbert_$model.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/SpanBERT_ID0337_for_PyTorch/code/pytorch_pretrained_bert/modeling.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/SpanBERT_ID0337_for_PyTorch/code/pytorch_pretrained_bert/modeling.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/SpanBERT_ID0337_for_PyTorch/code/pytorch_pretrained_bert/modeling.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/SpanBERT_ID0337_for_PyTorch/code/pytorch_pretrained_bert/modeling.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/SpanBERT_ID0337_for_PyTorch/code/pytorch_pretrained_bert/modeling.py | https://dl.fbaipublicfiles.com/fairseq/models/spanbert_hf.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/SpanBERT_ID0337_for_PyTorch/code/pytorch_pretrained_bert/modeling.py | https://dl.fbaipublicfiles.com/fairseq/models/spanbert_hf_base.tar.gz | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/SpanBERT_ID0337_for_PyTorch/code/pytorch_pretrained_bert/modeling.py | https://www.tensorflow.org/install/ | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/SpanBERT_ID0337_for_PyTorch/code/pytorch_pretrained_bert/tokenization.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-vocab.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/SpanBERT_ID0337_for_PyTorch/code/pytorch_pretrained_bert/tokenization.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased-vocab.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/SpanBERT_ID0337_for_PyTorch/code/pytorch_pretrained_bert/tokenization.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-vocab.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/SpanBERT_ID0337_for_PyTorch/code/pytorch_pretrained_bert/tokenization.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-vocab.txt | 模型相关说明 | +| 
ModelZoo-PyTorch/PyTorch/dev/nlp/SpanBERT_ID0337_for_PyTorch/code/pytorch_pretrained_bert/tokenization.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/nlp/SpanBERT_ID0337_for_PyTorch/code/pytorch_pretrained_bert/tokenization.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased-vocab.txt | 模型相关说明 | \ No newline at end of file diff --git a/PyTorch/dev/nlp/Textcnn_for_PyTorch/README_ori.md b/PyTorch/dev/nlp/Textcnn_for_PyTorch/README_ori.md index df351264d91e627ccb15d84cb54207eeaa57147e..541daed9e211d0cdb51f395d9a0a0467aab45042 100644 --- a/PyTorch/dev/nlp/Textcnn_for_PyTorch/README_ori.md +++ b/PyTorch/dev/nlp/Textcnn_for_PyTorch/README_ori.md @@ -32,7 +32,6 @@ CNN做句子分类的论文可以参看: [Convolutional Neural Networks for Sent 体育, 财经, 房产, 家居, 教育, 科技, 时尚, 时政, 游戏, 娱乐 ``` -这个子集可以在此下载:链接: https://pan.baidu.com/s/1hugrfRu 密码: qfud 数据集划分如下: diff --git a/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/public_address_statement.md b/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..3ff1d9ce7656a982b288bbf58d24eebf77c657f4 --- /dev/null +++ b/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/public_address_statement.md @@ -0,0 +1,348 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|-----------------------------------------------------------------------------------------------------------------------------------------------------------|---------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/atss/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/atss/atss_r50_fpn_1x_coco/atss_r50_fpn_1x_coco_20200209-985f7bd0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/atss/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/atss/atss_r101_fpn_1x_coco/atss_r101_fpn_1x_20200825-dfcadd6f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/autoassign/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/autoassign/auto_assign_r50_fpn_1x_coco/auto_assign_r50_fpn_1x_coco_20210413_115540-5e17991f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/cascade_rcnn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_rcnn_x101_64x4d_fpn_20e_coco/cascade_rcnn_x101_64x4d_fpn_20e_coco_20200509_224357-051557b1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/cascade_rcnn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_rcnn_x101_64x4d_fpn_1x_coco/cascade_rcnn_x101_64x4d_fpn_1x_coco_20200515_075702-43ce6a30.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/cascade_rcnn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_rcnn_x101_32x4d_fpn_20e_coco/cascade_rcnn_x101_32x4d_fpn_20e_coco_20200906_134608-9ae0a720.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/cascade_rcnn/metafile.yml | 
http://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_rcnn_x101_32x4d_fpn_1x_coco/cascade_rcnn_x101_32x4d_fpn_1x_coco_20200316-95c2deb6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/cascade_rcnn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_rcnn_r50_fpn_20e_coco/cascade_rcnn_r50_fpn_20e_coco_bbox_mAP-0.41_20200504_175131-e9872a90.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/cascade_rcnn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_rcnn_r50_fpn_1x_coco/cascade_rcnn_r50_fpn_1x_coco_20200316-3dc56deb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/cascade_rcnn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_rcnn_r50_caffe_fpn_1x_coco/cascade_rcnn_r50_caffe_fpn_1x_coco_bbox_mAP-0.404_20200504_174853-b857be87.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/cascade_rcnn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_rcnn_r101_fpn_20e_coco/cascade_rcnn_r101_fpn_20e_coco_bbox_mAP-0.425_20200504_231812-5057dcc5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/cascade_rcnn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_rcnn_r101_fpn_1x_coco/cascade_rcnn_r101_fpn_1x_coco_20200317-0b6a2fbf.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/cascade_rcnn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_rcnn_r101_caffe_fpn_1x_coco/cascade_rcnn_r101_caffe_fpn_1x_coco_bbox_mAP-0.423_20200504_175649-cab8dbd5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/cascade_rcnn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_x101_64x4d_fpn_20e_coco/cascade_mask_rcnn_x101_64x4d_fpn_20e_coco_20200512_161033-bdb5126a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/cascade_rcnn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_x101_64x4d_fpn_1x_coco/cascade_mask_rcnn_x101_64x4d_fpn_1x_coco_20200203-9a2db89d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/cascade_rcnn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_x101_32x4d_fpn_20e_coco/cascade_mask_rcnn_x101_32x4d_fpn_20e_coco_20200528_083917-ed1f4751.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/cascade_rcnn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_x101_32x4d_fpn_1x_coco/cascade_mask_rcnn_x101_32x4d_fpn_1x_coco_20200201-0f411b1f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/cascade_rcnn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_r50_fpn_20e_coco/cascade_mask_rcnn_r50_fpn_20e_coco_bbox_mAP-0.419__segm_mAP-0.365_20200504_174711-4af8e66e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/cascade_rcnn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_r50_fpn_1x_coco/cascade_mask_rcnn_r50_fpn_1x_coco_20200203-9d4dcb24.pth | 权重地址 | 
+| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/cascade_rcnn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_r50_caffe_fpn_1x_coco/cascade_mask_rcnn_r50_caffe_fpn_1x_coco_bbox_mAP-0.412__segm_mAP-0.36_20200504_174659-5004b251.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/cascade_rcnn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_r101_fpn_20e_coco/cascade_mask_rcnn_r101_fpn_20e_coco_bbox_mAP-0.434__segm_mAP-0.378_20200504_174836-005947da.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/cascade_rcnn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_r101_fpn_1x_coco/cascade_mask_rcnn_r101_fpn_1x_coco_20200203-befdf6ee.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/cascade_rcnn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/cascade_rcnn/cascade_mask_rcnn_r101_caffe_fpn_1x_coco/cascade_mask_rcnn_r101_caffe_fpn_1x_coco_bbox_mAP-0.432__segm_mAP-0.376_20200504_174813-5c1e9599.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/centernet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/centernet/centernet_resnet18_dcnv2_140e_coco/centernet_resnet18_dcnv2_140e_coco_20210520_101209-da388ba2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/centernet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/centernet/centernet_resnet18_140e_coco/centernet_resnet18_140e_coco_20210519_092334-eafe8ccd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/centripetalnet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/centripetalnet/centripetalnet_hourglass104_mstest_16x6_210e_coco/centripetalnet_hourglass104_mstest_16x6_210e_coco_20200915_204804-3ccc61e5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/cityscapes/faster_rcnn_r50_fpn_1x_cityscapes.py | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_fpn_1x_coco/faster_rcnn_r50_fpn_1x_coco_20200130-047c8118.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/cityscapes/mask_rcnn_r50_fpn_1x_cityscapes.py | https://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r50_fpn_1x_coco/mask_rcnn_r50_fpn_1x_coco_20200205-d4b0c5d6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/cornernet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/cornernet/cornernet_hourglass104_mstest_8x6_210e_coco/cornernet_hourglass104_mstest_8x6_210e_coco_20200825_150618-79b44c30.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/cornernet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/cornernet/cornernet_hourglass104_mstest_32x3_210e_coco/cornernet_hourglass104_mstest_32x3_210e_coco_20200819_203110-1efaea91.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/cornernet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/cornernet/cornernet_hourglass104_mstest_10x5_210e_coco/cornernet_hourglass104_mstest_10x5_210e_coco_20200824_185720-5fefbf1c.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/dcn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/dcn/mask_rcnn_r50_fpn_mdconv_c3-c5_1x_coco/mask_rcnn_r50_fpn_mdconv_c3-c5_1x_coco_20200203-ad97591f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/dcn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/dcn/mask_rcnn_r50_fpn_dconv_c3-c5_1x_coco/mask_rcnn_r50_fpn_dconv_c3-c5_1x_coco_20200203-4d9ad43b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/dcn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/dcn/mask_rcnn_r101_fpn_dconv_c3-c5_1x_coco/mask_rcnn_r101_fpn_dconv_c3-c5_1x_coco_20200216-a71f5bce.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/dcn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/dcn/faster_rcnn_x101_32x4d_fpn_dconv_c3-c5_1x_coco/faster_rcnn_x101_32x4d_fpn_dconv_c3-c5_1x_coco_20200203-4f85c69c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/dcn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/dcn/faster_rcnn_r50_fpn_mdpool_1x_coco/faster_rcnn_r50_fpn_mdpool_1x_coco_20200307-c0df27ff.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/dcn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/dcn/faster_rcnn_r50_fpn_mdconv_c3-c5_group4_1x_coco/faster_rcnn_r50_fpn_mdconv_c3-c5_group4_1x_coco_20200130-01262257.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/dcn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/dcn/faster_rcnn_r50_fpn_mdconv_c3-c5_1x_coco/faster_rcnn_r50_fpn_mdconv_c3-c5_1x_coco_20200130-d099253b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/dcn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/dcn/faster_rcnn_r50_fpn_dpool_1x_coco/faster_rcnn_r50_fpn_dpool_1x_coco_20200307-90d3c01d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/dcn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/dcn/faster_rcnn_r50_fpn_dconv_c3-c5_1x_coco/faster_rcnn_r50_fpn_dconv_c3-c5_1x_coco_20200130-d68aed1e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/dcn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/dcn/faster_rcnn_r101_fpn_dconv_c3-c5_1x_coco/faster_rcnn_r101_fpn_dconv_c3-c5_1x_coco_20200203-1377f13d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/dcn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/dcn/cascade_rcnn_r50_fpn_dconv_c3-c5_1x_coco/cascade_rcnn_r50_fpn_dconv_c3-c5_1x_coco_20200130-2f1fca44.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/dcn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/dcn/cascade_rcnn_r101_fpn_dconv_c3-c5_1x_coco/cascade_rcnn_r101_fpn_dconv_c3-c5_1x_coco_20200203-3b2f0594.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/dcn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/dcn/cascade_mask_rcnn_x101_32x4d_fpn_dconv_c3-c5_1x_coco/cascade_mask_rcnn_x101_32x4d_fpn_dconv_c3-c5_1x_coco-e75f90c8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/dcn/metafile.yml | 
http://download.openmmlab.com/mmdetection/v2.0/dcn/cascade_mask_rcnn_r50_fpn_dconv_c3-c5_1x_coco/cascade_mask_rcnn_r50_fpn_dconv_c3-c5_1x_coco_20200202-42e767a2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/dcn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/dcn/cascade_mask_rcnn_r101_fpn_dconv_c3-c5_1x_coco/cascade_mask_rcnn_r101_fpn_dconv_c3-c5_1x_coco_20200204-df0c5f10.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/deformable_detr/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/deformable_detr/deformable_detr_twostage_refine_r50_16x2_50e_coco/deformable_detr_twostage_refine_r50_16x2_50e_coco_20210419_220613-9d28ab72.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/deformable_detr/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/deformable_detr/deformable_detr_refine_r50_16x2_50e_coco/deformable_detr_refine_r50_16x2_50e_coco_20210419_220503-5f5dff21.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/deformable_detr/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/deformable_detr/deformable_detr_r50_16x2_50e_coco/deformable_detr_r50_16x2_50e_coco_20210419_220030-a12b9512.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/deformable_detr/metafile.yml | https://openreview.net/forum?id=gZ9hCDWe6ke | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/detectors/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/detectors/htc_r50_sac_1x_coco/htc_r50_sac_1x_coco-bfa60c54.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/detectors/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/detectors/htc_r50_rfp_1x_coco/htc_r50_rfp_1x_coco-8ff87c51.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/detectors/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/detectors/detectors_htc_r50_1x_coco/detectors_htc_r50_1x_coco-329b1453.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/detectors/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/detectors/detectors_cascade_rcnn_r50_1x_coco/detectors_cascade_rcnn_r50_1x_coco-32a10ba0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/detectors/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/detectors/cascade_rcnn_r50_sac_1x_coco/cascade_rcnn_r50_sac_1x_coco-24bfda62.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/detectors/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/detectors/cascade_rcnn_r50_rfp_1x_coco/cascade_rcnn_r50_rfp_1x_coco-8cf51bfd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/detr/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/detr/detr_r50_8x2_150e_coco/detr_r50_8x2_150e_coco_20201130_194835-2c4b8974.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/double_heads/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/double_heads/dh_faster_rcnn_r50_fpn_1x_coco/dh_faster_rcnn_r50_fpn_1x_coco_20200130-586b67df.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/dynamic_rcnn/metafile.yml | 
http://download.openmmlab.com/mmdetection/v2.0/dynamic_rcnn/dynamic_rcnn_r50_fpn_1x/dynamic_rcnn_r50_fpn_1x-62a3f276.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/empirical_attention/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/empirical_attention/faster_rcnn_r50_fpn_attention_1111_dcn_1x_coco/faster_rcnn_r50_fpn_attention_1111_dcn_1x_coco_20200130-8b2523a6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/empirical_attention/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/empirical_attention/faster_rcnn_r50_fpn_attention_1111_1x_coco/faster_rcnn_r50_fpn_attention_1111_1x_coco_20200130-403cccba.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/empirical_attention/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/empirical_attention/faster_rcnn_r50_fpn_attention_0010_dcn_1x_coco/faster_rcnn_r50_fpn_attention_0010_dcn_1x_coco_20200130-1a2e831d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/empirical_attention/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/empirical_attention/faster_rcnn_r50_fpn_attention_0010_1x_coco/faster_rcnn_r50_fpn_attention_0010_1x_coco_20200130-7cb0c14d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/faster_rcnn/faster_rcnn_r50_caffe_fpn_mstrain_1x_coco-person.py | http://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_caffe_fpn_mstrain_3x_coco/faster_rcnn_r50_caffe_fpn_mstrain_3x_coco_bbox_mAP-0.398_20200504_163323-30042637.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/faster_rcnn/faster_rcnn_r50_caffe_fpn_mstrain_1x_coco-person-bicycle-car.py | http://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_caffe_fpn_mstrain_3x_coco/faster_rcnn_r50_caffe_fpn_mstrain_3x_coco_bbox_mAP-0.398_20200504_163323-30042637.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_caffe_dc5_mstrain_3x_coco/faster_rcnn_r50_caffe_dc5_mstrain_3x_coco_20201028_002107-34a53b2c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_caffe_dc5_mstrain_1x_coco/faster_rcnn_r50_caffe_dc5_mstrain_1x_coco_20201028_233851-b33d21b9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/faster_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_caffe_dc5_1x_coco/faster_rcnn_r50_caffe_dc5_1x_coco_20201030_151909-531f0f43.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/faster_rcnn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_x101_64x4d_fpn_2x_coco/faster_rcnn_x101_64x4d_fpn_2x_coco_20200512_161033-5961fa95.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/faster_rcnn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_x101_64x4d_fpn_1x_coco/faster_rcnn_x101_64x4d_fpn_1x_coco_20200204-833ee192.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/faster_rcnn/metafile.yml | 
http://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_x101_32x4d_fpn_2x_coco/faster_rcnn_x101_32x4d_fpn_2x_coco_bbox_mAP-0.412_20200506_041400-64a12c0b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/faster_rcnn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_x101_32x4d_fpn_1x_coco/faster_rcnn_x101_32x4d_fpn_1x_coco_20200203-cff10310.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/faster_rcnn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_fpn_2x_coco/faster_rcnn_r50_fpn_2x_coco_bbox_mAP-0.384_20200504_210434-a5d8aa15.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/faster_rcnn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_fpn_1x_coco/faster_rcnn_r50_fpn_iou_1x_coco-fdd207f3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/faster_rcnn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_fpn_1x_coco/faster_rcnn_r50_fpn_giou_1x_coco-0eada910.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/faster_rcnn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_fpn_1x_coco/faster_rcnn_r50_fpn_bounded_iou_1x_coco-98ad993b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/faster_rcnn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_fpn_1x_coco/faster_rcnn_r50_fpn_1x_coco_20200130-047c8118.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/faster_rcnn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_caffe_fpn_mstrain_3x_coco/faster_rcnn_r50_caffe_fpn_mstrain_3x_coco_bbox_mAP-0.398_20200504_163323-30042637.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/faster_rcnn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_caffe_fpn_mstrain_2x_coco/faster_rcnn_r50_caffe_fpn_mstrain_2x_coco_bbox_mAP-0.397_20200504_231813-10b2de58.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/faster_rcnn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r50_caffe_fpn_1x_coco/faster_rcnn_r50_caffe_fpn_1x_coco_bbox_mAP-0.378_20200504_180032-c5925ee5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/faster_rcnn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r101_fpn_2x_coco/faster_rcnn_r101_fpn_2x_coco_bbox_mAP-0.398_20200504_210455-1d2dac9c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/faster_rcnn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r101_fpn_1x_coco/faster_rcnn_r101_fpn_1x_coco_20200130-f513f705.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/faster_rcnn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/faster_rcnn/faster_rcnn_r101_caffe_fpn_1x_coco/faster_rcnn_r101_caffe_fpn_1x_coco_bbox_mAP-0.398_20200504_180057-b269e9dd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/fcos/metafile.yml 
| https://openmmlab.oss-cn-hangzhou.aliyuncs.com/mmdetection/v2.0/fcos/fcos_x101_64x4d_fpn_gn-head_mstrain_640-800_2x_coco/fcos_x101_64x4d_fpn_gn-head_mstrain_640-800_2x_coco-ede514a8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/fcos/metafile.yml | https://openmmlab.oss-cn-hangzhou.aliyuncs.com/mmdetection/v2.0/fcos/fcos_r50_caffe_fpn_gn-head_mstrain_640-800_2x_coco/fcos_r50_caffe_fpn_gn-head_mstrain_640-800_2x_coco-d92ceeea.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/fcos/metafile.yml | https://openmmlab.oss-cn-hangzhou.aliyuncs.com/mmdetection/v2.0/fcos/fcos_r50_caffe_fpn_gn-head_1x_coco/fcos_r50_caffe_fpn_gn-head_1x_coco-821213aa.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/fcos/metafile.yml | https://openmmlab.oss-cn-hangzhou.aliyuncs.com/mmdetection/v2.0/fcos/fcos_r101_caffe_fpn_gn-head_mstrain_640-800_2x_coco/fcos_r101_caffe_fpn_gn-head_mstrain_640-800_2x_coco-511424d6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/fcos/metafile.yml | https://openmmlab.oss-cn-hangzhou.aliyuncs.com/mmdetection/v2.0/fcos/fcos_r101_caffe_fpn_gn-head_1x_coco/fcos_r101_caffe_fpn_gn-head_1x_coco-0e37b982.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/fcos/metafile.yml | https://openmmlab.oss-cn-hangzhou.aliyuncs.com/mmdetection/v2.0/fcos/fcos_center-normbbox-centeronreg-giou_r50_caffe_fpn_gn-head_dcn_1x_coco/fcos_center-normbbox-centeronreg-giou_r50_caffe_fpn_gn-head_dcn_1x_coco-ae4d8b3d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/fcos/metafile.yml | https://openmmlab.oss-cn-hangzhou.aliyuncs.com/mmdetection/v2.0/fcos/fcos_center-normbbox-centeronreg-giou_r50_caffe_fpn_gn-head_1x_coco/fcos_center-normbbox-centeronreg-giou_r50_caffe_fpn_gn-head_1x_coco-0a0d75a8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/foveabox/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/foveabox/fovea_r50_fpn_4x4_2x_coco/fovea_r50_fpn_4x4_2x_coco_20200203-2df792b1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/foveabox/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/foveabox/fovea_r50_fpn_4x4_1x_coco/fovea_r50_fpn_4x4_1x_coco_20200219-ee4d5303.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/foveabox/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/foveabox/fovea_r101_fpn_4x4_2x_coco/fovea_r101_fpn_4x4_2x_coco_20200208-02320ea4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/foveabox/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/foveabox/fovea_r101_fpn_4x4_1x_coco/fovea_r101_fpn_4x4_1x_coco_20200219-05e38f1c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/foveabox/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/foveabox/fovea_align_r50_fpn_gn-head_mstrain_640-800_4x4_2x_coco/fovea_align_r50_fpn_gn-head_mstrain_640-800_4x4_2x_coco_20200205-85ce26cb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/foveabox/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/foveabox/fovea_align_r50_fpn_gn-head_4x4_2x_coco/fovea_align_r50_fpn_gn-head_4x4_2x_coco_20200203-8987880d.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/foveabox/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/foveabox/fovea_align_r101_fpn_gn-head_mstrain_640-800_4x4_2x_coco/fovea_align_r101_fpn_gn-head_mstrain_640-800_4x4_2x_coco_20200208-649c5eb6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/foveabox/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/foveabox/fovea_align_r101_fpn_gn-head_4x4_2x_coco/fovea_align_r101_fpn_gn-head_4x4_2x_coco_20200208-c39a027a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/fp16/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/fp16/retinanet_r50_fpn_fp16_1x_coco/retinanet_r50_fpn_fp16_1x_coco_20200702-0dbfb212.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/fp16/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/fp16/mask_rcnn_r50_fpn_fp16_1x_coco/mask_rcnn_r50_fpn_fp16_1x_coco_20200205-59faf7e4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/fp16/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/fp16/faster_rcnn_r50_fpn_fp16_1x_coco/faster_rcnn_r50_fpn_fp16_1x_coco_20200204-d4dc1471.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/fpg/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/fpg/retinanet_r50_fpg-chn128_crop640_50e_coco/retinanet_r50_fpg-chn128_crop640_50e_coco-5cf33c76.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/fpg/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/fpg/retinanet_r50_fpg_crop640_50e_coco/retinanet_r50_fpg_crop640_50e_coco-46fdd1c6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/fpg/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/fpg/mask_rcnn_r50_fpg-chn128_crop640_50e_coco/mask_rcnn_r50_fpg-chn128_crop640_50e_coco-5c6ea10d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/fpg/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/fpg/mask_rcnn_r50_fpg_crop640_50e_coco/mask_rcnn_r50_fpg_crop640_50e_coco-c5860453.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/fpg/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/fpg/faster_rcnn_r50_fpg-chn128_crop640_50e_coco/faster_rcnn_r50_fpg-chn128_crop640_50e_coco-24257de9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/fpg/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/fpg/faster_rcnn_r50_fpg_crop640_50e_coco/faster_rcnn_r50_fpg_crop640_50e_coco-76220505.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/free_anchor/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/free_anchor/retinanet_free_anchor_x101_32x4d_fpn_1x_coco/retinanet_free_anchor_x101_32x4d_fpn_1x_coco_20200130-d4846968.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/free_anchor/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/free_anchor/retinanet_free_anchor_r50_fpn_1x_coco/retinanet_free_anchor_r50_fpn_1x_coco_20200130-0f67375f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/free_anchor/metafile.yml | 
http://download.openmmlab.com/mmdetection/v2.0/free_anchor/retinanet_free_anchor_r101_fpn_1x_coco/retinanet_free_anchor_r101_fpn_1x_coco_20200130-358324e6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/fsaf/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/fsaf/fsaf_x101_64x4d_fpn_1x_coco/fsaf_x101_64x4d_fpn_1x_coco-e3f6e6fd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/fsaf/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/fsaf/fsaf_r50_fpn_1x_coco/fsaf_r50_fpn_1x_coco-94ccc51f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/fsaf/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/fsaf/fsaf_r101_fpn_1x_coco/fsaf_r101_fpn_1x_coco-9e71098f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/gcnet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_x101_32x4d_fpn_syncbn-backbone_r4_gcb_c3-c5_1x_coco/mask_rcnn_x101_32x4d_fpn_syncbn-backbone_r4_gcb_c3-c5_1x_coco_20200212-68164964.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/gcnet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_x101_32x4d_fpn_syncbn-backbone_r16_gcb_c3-c5_1x_coco/mask_rcnn_x101_32x4d_fpn_syncbn-backbone_r16_gcb_c3-c5_1x_coco_20200211-cbed3d2c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/gcnet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_x101_32x4d_fpn_syncbn-backbone_1x_coco/mask_rcnn_x101_32x4d_fpn_syncbn-backbone_1x_coco_20200211-7584841c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/gcnet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_r50_fpn_syncbn-backbone_r4_gcb_c3-c5_1x_coco/mask_rcnn_r50_fpn_syncbn-backbone_r4_gcb_c3-c5_1x_coco_20200202-50b90e5c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/gcnet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_r50_fpn_syncbn-backbone_r16_gcb_c3-c5_1x_coco/mask_rcnn_r50_fpn_syncbn-backbone_r16_gcb_c3-c5_1x_coco_20200202-587b99aa.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/gcnet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_r50_fpn_syncbn-backbone_1x_coco/mask_rcnn_r50_fpn_syncbn-backbone_1x_coco_20200202-bb3eb55c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/gcnet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_r50_fpn_r4_gcb_c3-c5_1x_coco/mask_rcnn_r50_fpn_r4_gcb_c3-c5_1x_coco_20200204-17235656.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/gcnet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_r50_fpn_r16_gcb_c3-c5_1x_coco/mask_rcnn_r50_fpn_r16_gcb_c3-c5_1x_coco_20200515_211915-187da160.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/gcnet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_r101_fpn_syncbn-backbone_r4_gcb_c3-c5_1x_coco/mask_rcnn_r101_fpn_syncbn-backbone_r4_gcb_c3-c5_1x_coco_20200206-8407a3f0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/gcnet/metafile.yml | 
http://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_r101_fpn_syncbn-backbone_r16_gcb_c3-c5_1x_coco/mask_rcnn_r101_fpn_syncbn-backbone_r16_gcb_c3-c5_1x_coco_20200207-945e77ca.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/gcnet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_r101_fpn_syncbn-backbone_1x_coco/mask_rcnn_r101_fpn_syncbn-backbone_1x_coco_20200210-81658c8a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/gcnet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_r101_fpn_r4_gcb_c3-c5_1x_coco/mask_rcnn_r101_fpn_r4_gcb_c3-c5_1x_coco_20200206-af22dc9d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/gcnet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/gcnet/mask_rcnn_r101_fpn_r16_gcb_c3-c5_1x_coco/mask_rcnn_r101_fpn_r16_gcb_c3-c5_1x_coco_20200205-e58ae947.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/gcnet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/gcnet/cascade_mask_rcnn_x101_32x4d_fpn_syncbn-backbone_r4_gcb_c3-c5_1x_coco/cascade_mask_rcnn_x101_32x4d_fpn_syncbn-backbone_r4_gcb_c3-c5_1x_coco_20200703_180653-ed035291.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/gcnet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/gcnet/cascade_mask_rcnn_x101_32x4d_fpn_syncbn-backbone_r16_gcb_c3-c5_1x_coco/cascade_mask_rcnn_x101_32x4d_fpn_syncbn-backbone_r16_gcb_c3-c5_1x_coco_20200211-10bf2463.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/gcnet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/gcnet/cascade_mask_rcnn_x101_32x4d_fpn_syncbn-backbone_dconv_c3-c5_r4_gcb_c3-c5_1x_coco/cascade_mask_rcnn_x101_32x4d_fpn_syncbn-backbone_dconv_c3-c5_r4_gcb_c3-c5_1x_coco_20200518_041145-24cabcfd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/gcnet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/gcnet/cascade_mask_rcnn_x101_32x4d_fpn_syncbn-backbone_dconv_c3-c5_r16_gcb_c3-c5_1x_coco/cascade_mask_rcnn_x101_32x4d_fpn_syncbn-backbone_dconv_c3-c5_r16_gcb_c3-c5_1x_coco_20200516_015634-08f56b56.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/gcnet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/gcnet/cascade_mask_rcnn_x101_32x4d_fpn_syncbn-backbone_dconv_c3-c5_1x_coco/cascade_mask_rcnn_x101_32x4d_fpn_syncbn-backbone_dconv_c3-c5_1x_coco_20200516_182249-680fc3f2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/gcnet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/gcnet/cascade_mask_rcnn_x101_32x4d_fpn_syncbn-backbone_1x_coco/cascade_mask_rcnn_x101_32x4d_fpn_syncbn-backbone_1x_coco_20200310-d5ad2a5e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/gfl/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/gfl/gfl_x101_32x4d_fpn_mstrain_2x_coco/gfl_x101_32x4d_fpn_mstrain_2x_coco_20200630_102002-50c1ffdb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/gfl/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/gfl/gfl_x101_32x4d_fpn_dconv_c4-c5_mstrain_2x_coco/gfl_x101_32x4d_fpn_dconv_c4-c5_mstrain_2x_coco_20200630_102002-14a2bf25.pth | 权重地址 | 
+| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/gfl/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/gfl/gfl_r50_fpn_mstrain_2x_coco/gfl_r50_fpn_mstrain_2x_coco_20200629_213802-37bb1edc.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/gfl/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/gfl/gfl_r50_fpn_1x_coco/gfl_r50_fpn_1x_coco_20200629_121244-25944287.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/gfl/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/gfl/gfl_r101_fpn_mstrain_2x_coco/gfl_r101_fpn_mstrain_2x_coco_20200629_200126-dd12f847.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/gfl/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/gfl/gfl_r101_fpn_dconv_c3-c5_mstrain_2x_coco/gfl_r101_fpn_dconv_c3-c5_mstrain_2x_coco_20200630_102002-134b07df.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/ghm/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/ghm/retinanet_ghm_x101_64x4d_fpn_1x_coco/retinanet_ghm_x101_64x4d_fpn_1x_coco_20200131-dd381cef.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/ghm/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/ghm/retinanet_ghm_x101_32x4d_fpn_1x_coco/retinanet_ghm_x101_32x4d_fpn_1x_coco_20200131-e4333bd0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/ghm/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/ghm/retinanet_ghm_r50_fpn_1x_coco/retinanet_ghm_r50_fpn_1x_coco_20200130-a437fda3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/ghm/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/ghm/retinanet_ghm_r101_fpn_1x_coco/retinanet_ghm_r101_fpn_1x_coco_20200130-c148ee8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/gn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/gn/mask_rcnn_r50_fpn_gn-all_contrib_3x_coco/mask_rcnn_r50_fpn_gn-all_contrib_3x_coco_20200225-542aefbc.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/gn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/gn/mask_rcnn_r50_fpn_gn-all_contrib_2x_coco/mask_rcnn_r50_fpn_gn-all_contrib_2x_coco_20200207-20d3e849.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/gn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/gn/mask_rcnn_r50_fpn_gn-all_3x_coco/mask_rcnn_r50_fpn_gn-all_3x_coco_20200214-8b23b1e5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/gn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/gn/mask_rcnn_r50_fpn_gn-all_2x_coco/mask_rcnn_r50_fpn_gn-all_2x_coco_20200206-8eee02a6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/gn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/gn/mask_rcnn_r101_fpn_gn-all_3x_coco/mask_rcnn_r101_fpn_gn-all_3x_coco_20200513_181609-0df864f4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/gn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/gn/mask_rcnn_r101_fpn_gn-all_2x_coco/mask_rcnn_r101_fpn_gn-all_2x_coco_20200205-d96b1b50.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/gn+ws/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/gn%2Bws/mask_rcnn_x50_32x4d_fpn_gn_ws-all_2x_coco/mask_rcnn_x50_32x4d_fpn_gn_ws-all_2x_coco_20200216-649fdb6f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/gn+ws/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/gn%2Bws/mask_rcnn_x50_32x4d_fpn_gn_ws-all_20_23_24e_coco/mask_rcnn_x50_32x4d_fpn_gn_ws-all_20_23_24e_coco_20200226-969bcb2c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/gn+ws/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/gn%2Bws/mask_rcnn_x101_32x4d_fpn_gn_ws-all_2x_coco/mask_rcnn_x101_32x4d_fpn_gn_ws-all_2x_coco_20200319-33fb95b5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/gn+ws/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/gn%2Bws/mask_rcnn_x101_32x4d_fpn_gn_ws-all_20_23_24e_coco/mask_rcnn_x101_32x4d_fpn_gn_ws-all_20_23_24e_coco_20200316-e6cd35ef.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/gn+ws/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/gn%2Bws/mask_rcnn_r50_fpn_gn_ws-all_2x_coco/mask_rcnn_r50_fpn_gn_ws-all_2x_coco_20200226-16acb762.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/gn+ws/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/gn%2Bws/mask_rcnn_r50_fpn_gn_ws-all_20_23_24e_coco/mask_rcnn_r50_fpn_gn_ws-all_20_23_24e_coco_20200213-487d1283.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/gn+ws/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/gn%2Bws/mask_rcnn_r101_fpn_gn_ws-all_2x_coco/mask_rcnn_r101_fpn_gn_ws-all_2x_coco_20200212-ea357cd9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/gn+ws/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/gn%2Bws/mask_rcnn_r101_fpn_gn_ws-all_20_23_24e_coco/mask_rcnn_r101_fpn_gn_ws-all_20_23_24e_coco_20200213-57b5a50f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/gn+ws/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/gn%2Bws/faster_rcnn_x50_32x4d_fpn_gn_ws-all_1x_coco/faster_rcnn_x50_32x4d_fpn_gn_ws-all_1x_coco_20200203-839c5d9d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/gn+ws/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/gn%2Bws/faster_rcnn_x101_32x4d_fpn_gn_ws-all_1x_coco/faster_rcnn_x101_32x4d_fpn_gn_ws-all_1x_coco_20200212-27da1bc2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/gn+ws/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/gn%2Bws/faster_rcnn_r50_fpn_gn_ws-all_1x_coco/faster_rcnn_r50_fpn_gn_ws-all_1x_coco_20200130-613d9fe2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/gn+ws/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/gn%2Bws/faster_rcnn_r101_fpn_gn_ws-all_1x_coco/faster_rcnn_r101_fpn_gn_ws-all_1x_coco_20200205-a93b0d75.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/grid_rcnn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/grid_rcnn/grid_rcnn_x101_64x4d_fpn_gn-head_2x_coco/grid_rcnn_x101_64x4d_fpn_gn-head_2x_coco_20200204-ec76a754.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/grid_rcnn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/grid_rcnn/grid_rcnn_x101_32x4d_fpn_gn-head_2x_coco/grid_rcnn_x101_32x4d_fpn_gn-head_2x_coco_20200130-d8f0e3ff.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/grid_rcnn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/grid_rcnn/grid_rcnn_r50_fpn_gn-head_2x_coco/grid_rcnn_r50_fpn_gn-head_2x_coco_20200130-6cca8223.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/grid_rcnn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/grid_rcnn/grid_rcnn_r101_fpn_gn-head_2x_coco/grid_rcnn_r101_fpn_gn-head_2x_coco_20200309-d6eca030.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/groie/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/groie/mask_rcnn_r50_fpn_syncbn-backbone_r4_gcb_c3-c5_groie_1x_coco/mask_rcnn_r50_fpn_syncbn-backbone_r4_gcb_c3-c5_groie_1x_coco_20200604_211715-42eb79e1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/groie/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/groie/mask_rcnn_r50_fpn_groie_1x_coco/mask_rcnn_r50_fpn_groie_1x_coco_20200604_211715-50d90c74.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/groie/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/groie/mask_rcnn_r101_fpn_syncbn-backbone_r4_gcb_c3-c5_groie_1x_coco/mask_rcnn_r101_fpn_syncbn-backbone_r4_gcb_c3-c5_groie_1x_coco_20200607_224507-8daae01c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/groie/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/groie/faster_rcnn_r50_fpn_groie_1x_coco/faster_rcnn_r50_fpn_groie_1x_coco_20200604_211715-66ee9516.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/guided_anchoring/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/guided_anchoring/ga_retinanet_r50_caffe_fpn_1x_coco/ga_retinanet_r50_caffe_fpn_1x_coco_20201020-39581c6f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/guided_anchoring/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/guided_anchoring/ga_retinanet_x101_64x4d_fpn_1x_coco/ga_retinanet_x101_64x4d_fpn_1x_coco_20200226-ef9f7f1f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/guided_anchoring/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/guided_anchoring/ga_retinanet_x101_32x4d_fpn_1x_coco/ga_retinanet_x101_32x4d_fpn_1x_coco_20200219-40c56caa.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/guided_anchoring/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/guided_anchoring/ga_retinanet_r101_caffe_fpn_1x_coco/ga_retinanet_r101_caffe_fpn_1x_coco_20200531-6266453c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/guided_anchoring/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/guided_anchoring/ga_faster_x101_64x4d_fpn_1x_coco/ga_faster_x101_64x4d_fpn_1x_coco_20200215-0fa7bde7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/guided_anchoring/metafile.yml | 
http://download.openmmlab.com/mmdetection/v2.0/guided_anchoring/ga_faster_x101_32x4d_fpn_1x_coco/ga_faster_x101_32x4d_fpn_1x_coco_20200215-1ded9da3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/guided_anchoring/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/guided_anchoring/ga_faster_r50_caffe_fpn_1x_coco/ga_faster_r50_caffe_fpn_1x_coco_20200702_000718-a11ccfe6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/guided_anchoring/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/guided_anchoring/ga_faster_r101_caffe_fpn_1x_coco/ga_faster_r101_caffe_fpn_1x_coco_bbox_mAP-0.415_20200505_115528-fb82e499.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/hrnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/hrnet/fcos_hrnetv2p_w18_gn-head_4x4_1x_coco/fcos_hrnetv2p_w18_gn-head_4x4_1x_coco_20201212_100710-4ad151de.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/hrnet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/hrnet/mask_rcnn_hrnetv2p_w40_2x_coco/mask_rcnn_hrnetv2p_w40_2x_coco_20200512_163732-aed5e4ab.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/hrnet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/hrnet/mask_rcnn_hrnetv2p_w40_1x_coco/mask_rcnn_hrnetv2p_w40_1x_coco_20200511_015646-66738b35.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/hrnet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/hrnet/mask_rcnn_hrnetv2p_w32_2x_coco/mask_rcnn_hrnetv2p_w32_2x_coco_20200213-45b75b4d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/hrnet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/hrnet/mask_rcnn_hrnetv2p_w32_1x_coco/mask_rcnn_hrnetv2p_w32_1x_coco_20200207-b29f616e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/hrnet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/hrnet/mask_rcnn_hrnetv2p_w18_2x_coco/mask_rcnn_hrnetv2p_w18_2x_coco_20200212-b3c825b1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/hrnet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/hrnet/mask_rcnn_hrnetv2p_w18_1x_coco/mask_rcnn_hrnetv2p_w18_1x_coco_20200205-1c3d78ed.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/hrnet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/hrnet/htc_hrnetv2p_w40_20e_coco/htc_hrnetv2p_w40_20e_coco_20200529_183411-417c4d5b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/hrnet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/hrnet/htc_hrnetv2p_w32_20e_coco/htc_hrnetv2p_w32_20e_coco_20200207-7639fa12.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/hrnet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/hrnet/htc_hrnetv2p_w18_20e_coco/htc_hrnetv2p_w18_20e_coco_20200210-b266988c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/hrnet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/hrnet/fcos_hrnetv2p_w40_gn-head_mstrain_640-800_4x4_2x_coco/fcos_hrnetv2p_w40_gn-head_mstrain_640-800_4x4_2x_coco_20201212_124752-f22d2ce5.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/hrnet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/hrnet/fcos_hrnetv2p_w32_gn-head_mstrain_640-800_4x4_2x_coco/fcos_hrnetv2p_w32_gn-head_mstrain_640-800_4x4_2x_coco_20201212_090846-b6f2b49f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/hrnet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/hrnet/fcos_hrnetv2p_w32_gn-head_4x4_2x_coco/fcos_hrnetv2p_w32_gn-head_4x4_2x_coco_20201212_112133-77b6b9bb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/hrnet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/hrnet/fcos_hrnetv2p_w32_gn-head_4x4_1x_coco/fcos_hrnetv2p_w32_gn-head_4x4_1x_coco_20201211_134730-cb8055c0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/hrnet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/hrnet/fcos_hrnetv2p_w18_gn-head_mstrain_640-800_4x4_2x_coco/fcos_hrnetv2p_w18_gn-head_mstrain_640-800_4x4_2x_coco_20201212_111651-441e9d9f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/hrnet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/hrnet/fcos_hrnetv2p_w18_gn-head_4x4_2x_coco/fcos_hrnetv2p_w18_gn-head_4x4_2x_coco_20201212_101110-5c575fa5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/hrnet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/hrnet/faster_rcnn_hrnetv2p_w40_2x_coco/faster_rcnn_hrnetv2p_w40_2x_coco_20200512_161033-0f236ef4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/hrnet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/hrnet/faster_rcnn_hrnetv2p_w40_1x_coco/faster_rcnn_hrnetv2p_w40_1x_coco_20200210-95c1f5ce.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/hrnet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/hrnet/faster_rcnn_hrnetv2p_w32_2x_coco/faster_rcnn_hrnetv2p_w32_2x_coco_20200529_015927-976a9c15.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/hrnet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/hrnet/faster_rcnn_hrnetv2p_w32_1x_coco/faster_rcnn_hrnetv2p_w32_1x_coco_20200130-6e286425.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/hrnet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/hrnet/faster_rcnn_hrnetv2p_w18_2x_coco/faster_rcnn_hrnetv2p_w18_2x_coco_20200702_085731-a4ec0611.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/hrnet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/hrnet/faster_rcnn_hrnetv2p_w18_1x_coco/faster_rcnn_hrnetv2p_w18_1x_coco_20200130-56651a6d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/hrnet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/hrnet/cascade_rcnn_hrnetv2p_w40_20e_coco/cascade_rcnn_hrnetv2p_w40_20e_coco_20200512_161112-75e47b04.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/hrnet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/hrnet/cascade_rcnn_hrnetv2p_w32_20e_coco/cascade_rcnn_hrnetv2p_w32_20e_coco_20200208-928455a4.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/hrnet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/hrnet/cascade_rcnn_hrnetv2p_w18_20e_coco/cascade_rcnn_hrnetv2p_w18_20e_coco_20200210-434be9d7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/hrnet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/hrnet/cascade_mask_rcnn_hrnetv2p_w40_20e_coco/cascade_mask_rcnn_hrnetv2p_w40_20e_coco_20200527_204922-969c4610.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/hrnet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/hrnet/cascade_mask_rcnn_hrnetv2p_w32_20e_coco/cascade_mask_rcnn_hrnetv2p_w32_20e_coco_20200512_154043-39d9cf7b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/hrnet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/hrnet/cascade_mask_rcnn_hrnetv2p_w18_20e_coco/cascade_mask_rcnn_hrnetv2p_w18_20e_coco_20200210-b543cd2b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/htc/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/htc/htc_x101_64x4d_fpn_dconv_c3-c5_mstrain_400_1400_16x1_20e_coco/htc_x101_64x4d_fpn_dconv_c3-c5_mstrain_400_1400_16x1_20e_coco_20200312-946fd751.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/htc/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/htc/htc_x101_64x4d_fpn_16x1_20e_coco/htc_x101_64x4d_fpn_16x1_20e_coco_20200318-b181fd7a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/htc/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/htc/htc_x101_32x4d_fpn_16x1_20e_coco/htc_x101_32x4d_fpn_16x1_20e_coco_20200318-de97ae01.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/htc/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/htc/htc_r50_fpn_20e_coco/htc_r50_fpn_20e_coco_20200319-fe28c577.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/htc/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/htc/htc_r50_fpn_1x_coco/htc_r50_fpn_1x_coco_20200317-7332cf16.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/htc/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/htc/htc_r101_fpn_20e_coco/htc_r101_fpn_20e_coco_20200317-9b41b48f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/instaboost/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/instaboost/mask_rcnn_x101_64x4d_fpn_instaboost_4x_coco/mask_rcnn_x101_64x4d_fpn_instaboost_4x_coco_20200515_080947-8ed58c1b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/instaboost/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/instaboost/mask_rcnn_r50_fpn_instaboost_4x_coco/mask_rcnn_r50_fpn_instaboost_4x_coco_20200307-d025f83a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/instaboost/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/instaboost/mask_rcnn_r101_fpn_instaboost_4x_coco/mask_rcnn_r101_fpn_instaboost_4x_coco_20200703_235738-f23f3a5f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/instaboost/metafile.yml | 
http://download.openmmlab.com/mmdetection/v2.0/instaboost/cascade_mask_rcnn_r50_fpn_instaboost_4x_coco/cascade_mask_rcnn_r50_fpn_instaboost_4x_coco_20200307-c19d98d9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/ld/ld_r101_gflv1_r101dcn_fpn_coco_2x.py | http://download.openmmlab.com/mmdetection/v2.0/gfl/gfl_r101_fpn_dconv_c3-c5_mstrain_2x_coco/gfl_r101_fpn_dconv_c3-c5_mstrain_2x_coco_20200630_102002-134b07df.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/ld/ld_r18_gflv1_r101_fpn_coco_1x.py | http://download.openmmlab.com/mmdetection/v2.0/gfl/gfl_r101_fpn_mstrain_2x_coco/gfl_r101_fpn_mstrain_2x_coco_20200629_200126-dd12f847.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/libra_rcnn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/libra_rcnn/libra_retinanet_r50_fpn_1x_coco/libra_retinanet_r50_fpn_1x_coco_20200205-804d94ce.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/libra_rcnn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/libra_rcnn/libra_faster_rcnn_x101_64x4d_fpn_1x_coco/libra_faster_rcnn_x101_64x4d_fpn_1x_coco_20200315-3a7d0488.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/libra_rcnn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/libra_rcnn/libra_faster_rcnn_r50_fpn_1x_coco/libra_faster_rcnn_r50_fpn_1x_coco_20200130-3afee3a9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/libra_rcnn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/libra_rcnn/libra_faster_rcnn_r101_fpn_1x_coco/libra_faster_rcnn_r101_fpn_1x_coco_20200203-8dba6a5a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/mask_rcnn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_x101_64x4d_fpn_2x_coco/mask_rcnn_x101_64x4d_fpn_2x_coco_20200509_224208-39d6f70c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/mask_rcnn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_x101_64x4d_fpn_1x_coco/mask_rcnn_x101_64x4d_fpn_1x_coco_20200201-9352eb0d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/mask_rcnn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_x101_32x4d_fpn_2x_coco/mask_rcnn_x101_32x4d_fpn_2x_coco_bbox_mAP-0.422__segm_mAP-0.378_20200506_004702-faef898c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/mask_rcnn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_x101_32x4d_fpn_1x_coco/mask_rcnn_x101_32x4d_fpn_1x_coco_20200205-478d0b67.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/mask_rcnn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r50_fpn_2x_coco/mask_rcnn_r50_fpn_2x_coco_bbox_mAP-0.392__segm_mAP-0.354_20200505_003907-3e542a40.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/mask_rcnn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r50_fpn_1x_coco/mask_rcnn_r50_fpn_1x_coco_20200205-d4b0c5d6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/mask_rcnn/metafile.yml | 
http://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r50_caffe_fpn_mstrain-poly_3x_coco/mask_rcnn_r50_caffe_fpn_mstrain-poly_3x_coco_bbox_mAP-0.408__segm_mAP-0.37_20200504_163245-42aa3d00.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/mask_rcnn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r50_caffe_fpn_mstrain-poly_2x_coco/mask_rcnn_r50_caffe_fpn_mstrain-poly_2x_coco_bbox_mAP-0.403__segm_mAP-0.365_20200504_231822-a75c98ce.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/mask_rcnn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r50_caffe_fpn_1x_coco/mask_rcnn_r50_caffe_fpn_1x_coco_bbox_mAP-0.38__segm_mAP-0.344_20200504_231812-0ebd1859.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/mask_rcnn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r101_fpn_2x_coco/mask_rcnn_r101_fpn_2x_coco_bbox_mAP-0.408__segm_mAP-0.366_20200505_071027-14b391c7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/mask_rcnn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r101_fpn_1x_coco/mask_rcnn_r101_fpn_1x_coco_20200204-1efe0ed5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/mask_rcnn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/mask_rcnn/mask_rcnn_r101_caffe_fpn_1x_coco/mask_rcnn_r101_caffe_fpn_1x_coco_20200601_095758-805e06c1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/ms_rcnn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/ms_rcnn/ms_rcnn_x101_64x4d_fpn_2x_coco/ms_rcnn_x101_64x4d_fpn_2x_coco_20200308-02a445e2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/ms_rcnn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/ms_rcnn/ms_rcnn_x101_64x4d_fpn_1x_coco/ms_rcnn_x101_64x4d_fpn_1x_coco_20200206-86ba88d2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/ms_rcnn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/ms_rcnn/ms_rcnn_x101_32x4d_fpn_1x_coco/ms_rcnn_x101_32x4d_fpn_1x_coco_20200206-81fd1740.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/ms_rcnn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/ms_rcnn/ms_rcnn_r50_caffe_fpn_2x_coco/ms_rcnn_r50_caffe_fpn_2x_coco_bbox_mAP-0.388__segm_mAP-0.363_20200506_004738-ee87b137.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/ms_rcnn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/ms_rcnn/ms_rcnn_r50_caffe_fpn_1x_coco/ms_rcnn_r50_caffe_fpn_1x_coco_20200702_180848-61c9355e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/ms_rcnn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/ms_rcnn/ms_rcnn_r101_caffe_fpn_2x_coco/ms_rcnn_r101_caffe_fpn_2x_coco_bbox_mAP-0.411__segm_mAP-0.381_20200506_011134-5f3cc74f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/ms_rcnn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/ms_rcnn/ms_rcnn_r101_caffe_fpn_1x_coco/ms_rcnn_r101_caffe_fpn_1x_coco_bbox_mAP-0.404__segm_mAP-0.376_20200506_004755-b9b12a37.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/nas_fcos/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/nas_fcos/nas_fcos_nashead_r50_caffe_fpn_gn-head_4x4_1x_coco/nas_fcos_nashead_r50_caffe_fpn_gn-head_4x4_1x_coco_20200520-1bdba3ce.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/nas_fcos/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/nas_fcos/nas_fcos_fcoshead_r50_caffe_fpn_gn-head_4x4_1x_coco/nas_fcos_fcoshead_r50_caffe_fpn_gn-head_4x4_1x_coco_20200521-7fdcbce0.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/nas_fpn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/nas_fpn/retinanet_r50_nasfpn_crop640_50e_coco/retinanet_r50_nasfpn_crop640_50e_coco-0ad1f644.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/nas_fpn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/nas_fpn/retinanet_r50_fpn_crop640_50e_coco/retinanet_r50_fpn_crop640_50e_coco-9b953d76.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/paa/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/paa/paa_r50_fpn_mstrain_3x_coco/paa_r50_fpn_mstrain_3x_coco_20210121_145722-06a6880b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/paa/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/paa/paa_r50_fpn_2x_coco/paa_r50_fpn_2x_coco_20200821-c98bfc4e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/paa/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/paa/paa_r50_fpn_1x_coco/paa_r50_fpn_1x_coco_20200821-936edec3.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/paa/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/paa/paa_r50_fpn_1.5x_coco/paa_r50_fpn_1.5x_coco_20200823-805d6078.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/paa/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/paa/paa_r101_fpn_mstrain_3x_coco/paa_r101_fpn_mstrain_3x_coco_20210122_084202-83250d22.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/paa/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/paa/paa_r101_fpn_2x_coco/paa_r101_fpn_2x_coco_20200821-6829f96b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/paa/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/paa/paa_r101_fpn_1x_coco/paa_r101_fpn_1x_coco_20200821-0a1825a4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/pafpn/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/pafpn/faster_rcnn_r50_pafpn_1x_coco/faster_rcnn_r50_pafpn_1x_coco_bbox_mAP-0.375_20200503_105836-b7b4b9bd.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/pisa/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/pisa/pisa_ssd512_coco/pisa_ssd512_coco-247addee.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/pisa/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/pisa/pisa_ssd300_coco/pisa_ssd300_coco-710e3ac9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/pisa/metafile.yml | 
http://download.openmmlab.com/mmdetection/v2.0/pisa/pisa_retinanet_x101_32x4d_fpn_1x_coco/pisa_retinanet_x101_32x4d_fpn_1x_coco-a0c13c73.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/pisa/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/pisa/pisa_retinanet_r50_fpn_1x_coco/pisa_retinanet_r50_fpn_1x_coco-76409952.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/pisa/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/pisa/pisa_mask_rcnn_r50_fpn_1x_coco/pisa_mask_rcnn_r50_fpn_1x_coco-dfcedba6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/pisa/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/pisa/pisa_faster_rcnn_x101_32x4d_fpn_1x_coco/pisa_faster_rcnn_x101_32x4d_fpn_1x_coco-e4accec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/pisa/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/pisa/pisa_faster_rcnn_r50_fpn_1x_coco/pisa_faster_rcnn_r50_fpn_1x_coco-dea93523.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/point_rend/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/point_rend/point_rend_r50_caffe_fpn_mstrain_3x_coco/point_rend_r50_caffe_fpn_mstrain_3x_coco-e0ebb6b7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/point_rend/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/point_rend/point_rend_r50_caffe_fpn_mstrain_1x_coco/point_rend_r50_caffe_fpn_mstrain_1x_coco-1bcb5fb4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/regnet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/regnet/retinanet_regnetx-800MF_fpn_1x_coco/retinanet_regnetx-800MF_fpn_1x_coco_20200517_191403-f6f91d10.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/regnet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/regnet/retinanet_regnetx-3.2GF_fpn_1x_coco/retinanet_regnetx-3.2GF_fpn_1x_coco_20200520_163141-cb1509e8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/regnet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/regnet/retinanet_regnetx-1.6GF_fpn_1x_coco/retinanet_regnetx-1.6GF_fpn_1x_coco_20200517_191403-37009a9d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/regnet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/regnet/mask_rcnn_regnetx-8GF_fpn_1x_coco/mask_rcnn_regnetx-8GF_fpn_1x_coco_20200517_180515-09daa87e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/regnet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/regnet/mask_rcnn_regnetx-6.4GF_fpn_1x_coco/mask_rcnn_regnetx-6.4GF_fpn_1x_coco_20200517_180439-3a7aae83.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/regnet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/regnet/mask_rcnn_regnetx-4GF_fpn_1x_coco/mask_rcnn_regnetx-4GF_fpn_1x_coco_20200517_180217-32e9c92d.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/regnet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/regnet/mask_rcnn_regnetx-3.2GF_fpn_mstrain_3x_coco/mask_rcnn_regnetx-3.2GF_fpn_mstrain_3x_coco_20200521_202221-99879813.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/regnet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/regnet/mask_rcnn_regnetx-3.2GF_fpn_mdconv_c3-c5_1x_coco/mask_rcnn_regnetx-3.2GF_fpn_mdconv_c3-c5_1x_coco_20200520_172726-75f40794.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/regnet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/regnet/mask_rcnn_regnetx-3.2GF_fpn_1x_coco/mask_rcnn_regnetx-3.2GF_fpn_1x_coco_20200520_163141-2a9d1814.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/regnet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/regnet/mask_rcnn_regnetx-12GF_fpn_1x_coco/mask_rcnn_regnetx-12GF_fpn_1x_coco_20200517_180552-b538bd8b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/regnet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/regnet/faster_rcnn_regnetx-3.2GF_fpn_mstrain_3x_coco/faster_rcnn_regnetx-3.2GF_fpn_mstrain_3x_coco_20200520_224253-bf85ae3e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/regnet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/regnet/faster_rcnn_regnetx-3.2GF_fpn_2x_coco/faster_rcnn_regnetx-3.2GF_fpn_2x_coco_20200520_223955-e2081918.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/regnet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/regnet/faster_rcnn_regnetx-3.2GF_fpn_1x_coco/faster_rcnn_regnetx-3.2GF_fpn_1x_coco_20200517_175927-126fd9bf.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/reppoints/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/reppoints/reppoints_moment_x101_fpn_dconv_c3-c5_gn-neck%2Bhead_2x_coco/reppoints_moment_x101_fpn_dconv_c3-c5_gn-neck%2Bhead_2x_coco_20200329-f87da1ea.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/reppoints/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/reppoints/reppoints_moment_r50_fpn_gn-neck%2Bhead_2x_coco/reppoints_moment_r50_fpn_gn-neck%2Bhead_2x_coco_20200329-91babaa2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/reppoints/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/reppoints/reppoints_moment_r50_fpn_gn-neck%2Bhead_1x_coco/reppoints_moment_r50_fpn_gn-neck%2Bhead_1x_coco_20200329-4b38409a.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/reppoints/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/reppoints/reppoints_moment_r50_fpn_1x_coco/reppoints_moment_r50_fpn_1x_coco_20200330-b73db8d1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/reppoints/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/reppoints/reppoints_moment_r101_fpn_gn-neck%2Bhead_2x_coco/reppoints_moment_r101_fpn_gn-neck%2Bhead_2x_coco_20200329-4fbc7310.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/reppoints/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/reppoints/reppoints_moment_r101_fpn_dconv_c3-c5_gn-neck%2Bhead_2x_coco/reppoints_moment_r101_fpn_dconv_c3-c5_gn-neck%2Bhead_2x_coco_20200329-3309fbf2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/reppoints/metafile.yml | 
http://download.openmmlab.com/mmdetection/v2.0/reppoints/bbox_r50_grid_fpn_gn-neck%2Bhead_1x_coco/bbox_r50_grid_fpn_gn-neck%2Bhead_1x_coco_20200329-c98bfa96.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/reppoints/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/reppoints/bbox_r50_grid_center_fpn_gn-neck%2Bhead_1x_coco/bbox_r50_grid_center_fpn_gn-neck%2Bhead_1x_coco_20200330-00f73d58.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/res2net/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/res2net/mask_rcnn_r2_101_fpn_2x_coco/mask_rcnn_r2_101_fpn_2x_coco-17f061e8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/res2net/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/res2net/htc_r2_101_fpn_20e_coco/htc_r2_101_fpn_20e_coco-3a8d2112.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/res2net/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/res2net/faster_rcnn_r2_101_fpn_2x_coco/faster_rcnn_r2_101_fpn_2x_coco-175f1da6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/res2net/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/res2net/cascade_rcnn_r2_101_fpn_20e_coco/cascade_rcnn_r2_101_fpn_20e_coco-f4b7b7db.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/res2net/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/res2net/cascade_mask_rcnn_r2_101_fpn_20e_coco/cascade_mask_rcnn_r2_101_fpn_20e_coco-8a7b41e1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/resnest/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/resnest/mask_rcnn_s50_fpn_syncbn-backbone%2Bhead_mstrain_1x_coco/mask_rcnn_s50_fpn_syncbn-backbone%2Bhead_mstrain_1x_coco_20200926_125503-8a2c3d47.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/resnest/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/resnest/mask_rcnn_s101_fpn_syncbn-backbone%2Bhead_mstrain_1x_coco/mask_rcnn_s101_fpn_syncbn-backbone%2Bhead_mstrain_1x_coco_20201005_215831-af60cdf9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/resnest/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/resnest/faster_rcnn_s50_fpn_syncbn-backbone%2Bhead_mstrain-range_1x_coco/faster_rcnn_s50_fpn_syncbn-backbone%2Bhead_mstrain-range_1x_coco_20200926_125502-20289c16.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/resnest/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/resnest/faster_rcnn_s101_fpn_syncbn-backbone%2Bhead_mstrain-range_1x_coco/faster_rcnn_s101_fpn_syncbn-backbone%2Bhead_mstrain-range_1x_coco_20201006_021058-421517f1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/resnest/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/resnest/cascade_rcnn_s50_fpn_syncbn-backbone%2Bhead_mstrain-range_1x_coco/cascade_rcnn_s50_fpn_syncbn-backbone%2Bhead_mstrain-range_1x_coco_20201122_213640-763cc7b5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/resnest/metafile.yml | 
http://download.openmmlab.com/mmdetection/v2.0/resnest/cascade_rcnn_s101_fpn_syncbn-backbone%2Bhead_mstrain-range_1x_coco/cascade_rcnn_s101_fpn_syncbn-backbone%2Bhead_mstrain-range_1x_coco_20201005_113242-b9459f8f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/resnest/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/resnest/cascade_mask_rcnn_s50_fpn_syncbn-backbone%2Bhead_mstrain_1x_coco/cascade_mask_rcnn_s50_fpn_syncbn-backbone%2Bhead_mstrain_1x_coco_20201122_104428-99eca4c7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/resnest/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/resnest/cascade_mask_rcnn_s101_fpn_syncbn-backbone%2Bhead_mstrain_1x_coco/cascade_mask_rcnn_s101_fpn_syncbn-backbone%2Bhead_mstrain_1x_coco_20201005_113243-42607475.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/retinanet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_x101_64x4d_fpn_2x_coco/retinanet_x101_64x4d_fpn_2x_coco_20200131-bca068ab.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/retinanet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_x101_64x4d_fpn_1x_coco/retinanet_x101_64x4d_fpn_1x_coco_20200130-366f5af1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/retinanet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_x101_32x4d_fpn_2x_coco/retinanet_x101_32x4d_fpn_2x_coco_20200131-237fc5e1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/retinanet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_x101_32x4d_fpn_1x_coco/retinanet_x101_32x4d_fpn_1x_coco_20200130-5c8b7ec4.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/retinanet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_r50_fpn_2x_coco/retinanet_r50_fpn_2x_coco_20200131-fdb43119.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/retinanet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_r50_fpn_1x_coco/retinanet_r50_fpn_1x_coco_20200130-c2398f9e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/retinanet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_r50_caffe_fpn_1x_coco/retinanet_r50_caffe_fpn_1x_coco_20200531-f11027c5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/retinanet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_r101_fpn_2x_coco/retinanet_r101_fpn_2x_coco_20200131-5560aee8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/retinanet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_r101_fpn_1x_coco/retinanet_r101_fpn_1x_coco_20200130-7a93545f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/retinanet/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/retinanet/retinanet_r101_caffe_fpn_1x_coco/retinanet_r101_caffe_fpn_1x_coco_20200531-b428fa0f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/sabl/metafile.yml | 
http://download.openmmlab.com/mmdetection/v2.0/sabl/sabl_retinanet_r50_fpn_gn_1x_coco/sabl_retinanet_r50_fpn_gn_1x_coco-e16dfcf1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/sabl/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/sabl/sabl_retinanet_r50_fpn_1x_coco/sabl_retinanet_r50_fpn_1x_coco-6c54fd4f.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/sabl/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/sabl/sabl_retinanet_r101_fpn_gn_2x_ms_640_800_coco/sabl_retinanet_r101_fpn_gn_2x_ms_640_800_coco-1e63382c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/sabl/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/sabl/sabl_retinanet_r101_fpn_gn_2x_ms_480_960_coco/sabl_retinanet_r101_fpn_gn_2x_ms_480_960_coco-5342f857.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/sabl/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/sabl/sabl_retinanet_r101_fpn_gn_1x_coco/sabl_retinanet_r101_fpn_gn_1x_coco-40a893e8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/sabl/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/sabl/sabl_retinanet_r101_fpn_1x_coco/sabl_retinanet_r101_fpn_1x_coco-42026904.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/sabl/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/sabl/sabl_faster_rcnn_r50_fpn_1x_coco/sabl_faster_rcnn_r50_fpn_1x_coco-e867595b.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/sabl/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/sabl/sabl_faster_rcnn_r101_fpn_1x_coco/sabl_faster_rcnn_r101_fpn_1x_coco-f804c6c1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/sabl/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/sabl/sabl_cascade_rcnn_r50_fpn_1x_coco/sabl_cascade_rcnn_r50_fpn_1x_coco-e1748e5e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/sabl/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/sabl/sabl_cascade_rcnn_r101_fpn_1x_coco/sabl_cascade_rcnn_r101_fpn_1x_coco-2b83e87c.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/scratch/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/scratch/mask_rcnn_r50_fpn_gn-all_scratch_6x_coco/scratch_mask_rcnn_r50_fpn_gn_6x_bbox_mAP-0.412__segm_mAP-0.374_20200201_193051-1e190a40.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/scratch/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/scratch/faster_rcnn_r50_fpn_gn-all_scratch_6x_coco/scratch_faster_rcnn_r50_fpn_gn_6x_bbox_mAP-0.407_20200201_193013-90813d01.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/sparse_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/sparse_rcnn/sparse_rcnn_r50_fpn_mstrain_480-800_3x_coco/sparse_rcnn_r50_fpn_mstrain_480-800_3x_coco_20201218_154234-7bc5c054.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/sparse_rcnn/metafile.yml | 
https://download.openmmlab.com/mmdetection/v2.0/sparse_rcnn/sparse_rcnn_r50_fpn_300_proposals_crop_mstrain_480-800_3x_coco/sparse_rcnn_r50_fpn_300_proposals_crop_mstrain_480-800_3x_coco_20201223_024605-9fe92701.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/sparse_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/sparse_rcnn/sparse_rcnn_r50_fpn_1x_coco/sparse_rcnn_r50_fpn_1x_coco_20201222_214453-dc79b137.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/sparse_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/sparse_rcnn/sparse_rcnn_r101_fpn_mstrain_480-800_3x_coco/sparse_rcnn_r101_fpn_mstrain_480-800_3x_coco_20201223_121552-6c46c9d6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/sparse_rcnn/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/sparse_rcnn/sparse_rcnn_r101_fpn_300_proposals_crop_mstrain_480-800_3x_coco/sparse_rcnn_r101_fpn_300_proposals_crop_mstrain_480-800_3x_coco_20201223_023452-c23c3564.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/ssd/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/ssd/ssd512_coco/ssd512_coco_20200308-038c5591.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/ssd/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/ssd/ssd300_coco/ssd300_coco_20200307-a92d2092.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/tridentnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/tridentnet/tridentnet_r50_caffe_mstrain_3x_coco/tridentnet_r50_caffe_mstrain_3x_coco_20201130_100539-46d227ba.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/tridentnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/tridentnet/tridentnet_r50_caffe_mstrain_1x_coco/tridentnet_r50_caffe_mstrain_1x_coco_20201230_141839-6ce55ccb.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/tridentnet/metafile.yml | https://download.openmmlab.com/mmdetection/v2.0/tridentnet/tridentnet_r50_caffe_1x_coco/tridentnet_r50_caffe_1x_coco_20201230_141838-2ec0b530.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/vfnet/metafile.yml | https://openmmlab.oss-cn-hangzhou.aliyuncs.com/mmdetection/v2.0/vfnet/vfnet_x101_64x4d_fpn_mdconv_c3-c5_mstrain_2x_coco/vfnet_x101_64x4d_fpn_mdconv_c3-c5_mstrain_2x_coco_20201027pth-b5f6da5e.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/vfnet/metafile.yml | https://openmmlab.oss-cn-hangzhou.aliyuncs.com/mmdetection/v2.0/vfnet/vfnet_x101_32x4d_fpn_mdconv_c3-c5_mstrain_2x_coco/vfnet_x101_32x4d_fpn_mdconv_c3-c5_mstrain_2x_coco_20201027pth-d300a6fc.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/vfnet/metafile.yml | https://openmmlab.oss-cn-hangzhou.aliyuncs.com/mmdetection/v2.0/vfnet/vfnet_r50_fpn_mstrain_2x_coco/vfnet_r50_fpn_mstrain_2x_coco_20201027-7cc75bd2.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/vfnet/metafile.yml | https://openmmlab.oss-cn-hangzhou.aliyuncs.com/mmdetection/v2.0/vfnet/vfnet_r50_fpn_mdconv_c3-c5_mstrain_2x_coco/vfnet_r50_fpn_mdconv_c3-c5_mstrain_2x_coco_20201027pth-6879c318.pth | 权重地址 | +| 
ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/vfnet/metafile.yml | https://openmmlab.oss-cn-hangzhou.aliyuncs.com/mmdetection/v2.0/vfnet/vfnet_r50_fpn_1x_coco/vfnet_r50_fpn_1x_coco_20201027-38db6f58.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/vfnet/metafile.yml | https://openmmlab.oss-cn-hangzhou.aliyuncs.com/mmdetection/v2.0/vfnet/vfnet_r101_fpn_mstrain_2x_coco/vfnet_r101_fpn_mstrain_2x_coco_20201027pth-4a5d53f1.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/vfnet/metafile.yml | https://openmmlab.oss-cn-hangzhou.aliyuncs.com/mmdetection/v2.0/vfnet/vfnet_r101_fpn_mdconv_c3-c5_mstrain_2x_coco/vfnet_r101_fpn_mdconv_c3-c5_mstrain_2x_coco_20201027pth-7729adb5.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/vfnet/metafile.yml | https://openmmlab.oss-cn-hangzhou.aliyuncs.com/mmdetection/v2.0/vfnet/vfnet_r101_fpn_1x_coco/vfnet_r101_fpn_1x_coco_20201027pth-c831ece7.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/yolact/metafile.yml | https://openmmlab.oss-cn-hangzhou.aliyuncs.com/mmdetection/v2.0/yolact/yolact_r50_8x8_coco_20200908-ca34f5db.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/yolact/metafile.yml | https://openmmlab.oss-cn-hangzhou.aliyuncs.com/mmdetection/v2.0/yolact/yolact_r50_1x8_coco_20200908-f38d58df.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/yolact/metafile.yml | https://openmmlab.oss-cn-hangzhou.aliyuncs.com/mmdetection/v2.0/yolact/yolact_r101_1x8_coco_20200908-4cbe9101.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/yolo/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/yolo/yolov3_d53_mstrain-608_273e_coco/yolov3_d53_mstrain-608_273e_coco_20210518_115020-a2c3acb8.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/yolo/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/yolo/yolov3_d53_mstrain-416_273e_coco/yolov3_d53_mstrain-416_273e_coco-2b60fcd9.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/yolo/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/yolo/yolov3_d53_fp16_mstrain-608_273e_coco/yolov3_d53_fp16_mstrain-608_273e_coco_20210517_213542-4bc34944.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/yolo/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/yolo/yolov3_d53_320_273e_coco/yolov3_d53_320_273e_coco-421362b6.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/configs/yolof/metafile.yml | http://download.openmmlab.com/mmdetection/v2.0/yolof/yolof_r50_c5_8x8_1x_coco/yolof_r50_c5_8x8_1x_coco_20210425_024427-8e864411.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/docker/Dockerfile | https://openmmlab.oss-accelerate.aliyuncs.com/mmcv/dist/index.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/docker/serve/Dockerfile | https://download.openmmlab.com/mmcv/dist/cu101/torch1.6.0/index.html | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/mmdet/core/export/model_wrappers.py | https://mmcv.readthedocs.io/en/latest/tensorrt_plugin.html#how-to-build-tensorrt-plugins-in-mmcv | 三方库链接 | +| 
ModelZoo-PyTorch/PyTorch/dev/perf/CascadeMaskRCNN_iflytek_for_PyTorch/setup.py | openmmlab@gmail.com | 作者邮箱 | \ No newline at end of file diff --git a/PyTorch/dev/perf/DLRM/public_address_statement.md b/PyTorch/dev/perf/DLRM/public_address_statement.md index 34bd0496d39670e57c7dadcffdad52f682c7367a..62e2ff41f8e4d22cea9c18000d4eb6b7af68352b 100644 --- a/PyTorch/dev/perf/DLRM/public_address_statement.md +++ b/PyTorch/dev/perf/DLRM/public_address_statement.md @@ -1,3 +1,6 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -|--------|----------------------------------------------------------------------------------------------------------------------------------|----------------------------------------------------|-----------------------|--------| -| 开发引入 | / | DLRM/requirements.txt | https://github.com/NVIDIA/dllogger#egg=dllogger | 相关依赖 | +| 文件位置 | 公网地址 | 公网地址用途 | +|----------------------------------------------------------------------------|---------------------------------------------------------------------------------------------------|-------------| +| ModelZoo-PyTorch/PyTorch/dev/perf/DLRM/Dockerfile_preprocessing | https://developer.download.nvidia.com/compute/cuda/repos/ubuntu1804/x86_64/3bf863cc.pub | ubuntu镜像地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/DLRM/Dockerfile_preprocessing | https://repo1.maven.org/maven2/com/nvidia/rapids-4-spark_2.12/0.4.0/rapids-4-spark_2.12-0.4.0.jar | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/DLRM/Dockerfile_preprocessing | https://repo1.maven.org/maven2/ai/rapids/cudf/0.18.1/cudf-0.18.1-cuda11.jar | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/DLRM/preproc/verify_criteo_downloaded.sh | http://labs.criteo.com/2013/12/download-terabyte-click-logs/ | 下载链接 | \ No newline at end of file diff --git a/PyTorch/dev/perf/ShuffleNetV2_iflytek_for_Pytorch/public_address_statement.md b/PyTorch/dev/perf/ShuffleNetV2_iflytek_for_Pytorch/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..8014a441dcd1e51cac4c8565c023d9cadc82cabc --- /dev/null +++ b/PyTorch/dev/perf/ShuffleNetV2_iflytek_for_Pytorch/public_address_statement.md @@ -0,0 +1,5 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|-------------------------------------------------------------------------------------------|----------------------------------------------------------------------|--------------| +| ModelZoo-PyTorch/PyTorch/dev/perf/ShuffleNetV2_iflytek_for_Pytorch/main.py | tcp://224.66.41.62:23456 | 分布式配置ip默认入参 | +| ModelZoo-PyTorch/PyTorch/dev/perf/ShuffleNetV2_iflytek_for_Pytorch/models/shufflenetv2.py | https://download.pytorch.org/models/shufflenetv2_x1-5666bf0f80.pth | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/ShuffleNetV2_iflytek_for_Pytorch/models/shufflenetv2.py | https://download.pytorch.org/models/shufflenetv2_x0.5-f707e7126e.pth | 权重地址 | \ No newline at end of file diff --git a/PyTorch/dev/perf/chineseglue_tnew_bert/public_address_statement.md b/PyTorch/dev/perf/chineseglue_tnew_bert/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..cc99c065fd2222289fb51a8a10f35507878f2b20 --- /dev/null +++ b/PyTorch/dev/perf/chineseglue_tnew_bert/public_address_statement.md @@ -0,0 +1,164 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|---------------------------------------------------------------------------------------------------------------------|-----------------------------------------------------------------------------------------------------------------------------|-------------| +| 
ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/metrics/glue_compute_metrics.py | https://scikit-learn.org/stable/index.html | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/__main__.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/__main__.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/__main__.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/__main__.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/configuration_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-config.json | 相关配置 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/configuration_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-config.json | 相关配置 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/configuration_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-config.json | 相关配置 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/configuration_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-whole-word-masking-finetuned-squad-config.json | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/configuration_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-whole-word-masking-config.json | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/configuration_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-whole-word-masking-finetuned-squad-config.json | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/configuration_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-whole-word-masking-config.json | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/configuration_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-uncased-config.json | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/configuration_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-cased-config.json | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/configuration_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-german-dbmdz-uncased-config.json | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/configuration_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-german-dbmdz-cased-config.json | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/configuration_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-german-cased-config.json | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/configuration_bert.py | 
https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-chinese-config.json | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/configuration_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased-finetuned-mrpc-config.json | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/configuration_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased-config.json | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/configuration_ctrl.py | https://storage.googleapis.com/sf-ctrl/pytorch/ctrl-config.json | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/configuration_distilbert.py | https://s3.amazonaws.com/models.huggingface.co/bert/distilbert-base-uncased-distilled-squad-config.json | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/configuration_distilbert.py | https://s3.amazonaws.com/models.huggingface.co/bert/distilbert-base-uncased-config.json | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/configuration_gpt2.py | https://s3.amazonaws.com/models.huggingface.co/bert/gpt2-config.json | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/configuration_gpt2.py | https://s3.amazonaws.com/models.huggingface.co/bert/gpt2-medium-config.json | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/configuration_gpt2.py | https://s3.amazonaws.com/models.huggingface.co/bert/gpt2-large-config.json | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/configuration_gpt2.py | https://s3.amazonaws.com/models.huggingface.co/bert/distilgpt2-config.json | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/configuration_openai.py | https://s3.amazonaws.com/models.huggingface.co/bert/openai-gpt-config.json | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/configuration_roberta.py | https://s3.amazonaws.com/models.huggingface.co/bert/roberta-large-mnli-config.json | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/configuration_roberta.py | https://s3.amazonaws.com/models.huggingface.co/bert/roberta-large-config.json | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/configuration_roberta.py | https://s3.amazonaws.com/models.huggingface.co/bert/roberta-base-config.json | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/configuration_transfo_xl.py | https://s3.amazonaws.com/models.huggingface.co/bert/transfo-xl-wt103-config.json | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/configuration_xlm.py | https://s3.amazonaws.com/models.huggingface.co/bert/xlm-mlm-xnli15-1024-config.json | 词表链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/configuration_xlm.py | https://s3.amazonaws.com/models.huggingface.co/bert/xlm-mlm-tlm-xnli15-1024-config.json | 词表链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/configuration_xlm.py | 
https://s3.amazonaws.com/models.huggingface.co/bert/xlm-mlm-enro-1024-config.json | 词表链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/configuration_xlm.py | https://s3.amazonaws.com/models.huggingface.co/bert/xlm-mlm-enfr-1024-config.json | 词表链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/configuration_xlm.py | https://s3.amazonaws.com/models.huggingface.co/bert/xlm-mlm-ende-1024-config.json | 词表链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/configuration_xlm.py | https://s3.amazonaws.com/models.huggingface.co/bert/xlm-mlm-en-2048-config.json | 词表链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/configuration_xlm.py | https://s3.amazonaws.com/models.huggingface.co/bert/xlm-mlm-17-1280-config.json | 词表链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/configuration_xlm.py | https://s3.amazonaws.com/models.huggingface.co/bert/xlm-mlm-100-1280-config.json | 词表链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/configuration_xlm.py | https://s3.amazonaws.com/models.huggingface.co/bert/xlm-clm-enfr-1024-config.json | 词表链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/configuration_xlm.py | https://s3.amazonaws.com/models.huggingface.co/bert/xlm-clm-ende-1024-config.json | 词表链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/configuration_xlnet.py | https://s3.amazonaws.com/models.huggingface.co/bert/xlnet-large-cased-config.json | 词表链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/configuration_xlnet.py | https://s3.amazonaws.com/models.huggingface.co/bert/xlnet-base-cased-config.json | 词表链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/modeling_albert.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/modeling_albert.py | https://pytorch.org/docs/stable/nn.html#module | 三方库说明 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/modeling_bert.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/modeling_bert.py | https://pytorch.org/docs/stable/nn.html#module | 三方库说明 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-pytorch_model.bin | 相关配置 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-pytorch_model.bin | 相关配置 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-pytorch_model.bin | 相关配置 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-whole-word-masking-finetuned-squad-pytorch_model.bin | 模型相关说明 | +| 
ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-whole-word-masking-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-whole-word-masking-finetuned-squad-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-whole-word-masking-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-uncased-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-cased-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-german-dbmdz-uncased-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-german-dbmdz-cased-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-german-cased-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-chinese-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased-finetuned-mrpc-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/modeling_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/modeling_ctrl.py | https://pytorch.org/docs/stable/nn.html#module | 三方库说明 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/modeling_ctrl.py | https://storage.googleapis.com/sf-ctrl/pytorch/seqlen256_v1.bin | 权重地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/modeling_distilbert.py | https://medium.com/huggingface/distilbert-8cf3380435b5 | 相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/modeling_distilbert.py | https://s3.amazonaws.com/models.huggingface.co/bert/distilbert-base-uncased-distilled-squad-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/modeling_distilbert.py | https://s3.amazonaws.com/models.huggingface.co/bert/distilbert-base-uncased-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/modeling_gpt2.py | 
https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/modeling_gpt2.py | https://pytorch.org/docs/stable/nn.html#module | 三方库说明 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/modeling_gpt2.py | https://openai.com/blog/better-language-models/ | 相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/modeling_gpt2.py | https://s3.amazonaws.com/models.huggingface.co/bert/gpt2-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/modeling_gpt2.py | https://s3.amazonaws.com/models.huggingface.co/bert/gpt2-medium-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/modeling_gpt2.py | https://s3.amazonaws.com/models.huggingface.co/bert/gpt2-large-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/modeling_gpt2.py | https://s3.amazonaws.com/models.huggingface.co/bert/distilgpt2-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/modeling_openai.py | https://pytorch.org/docs/stable/nn.html#module | 三方库说明 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/modeling_openai.py | https://openai.com/blog/language-unsupervised/ | 相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/modeling_openai.py | https://s3.amazonaws.com/models.huggingface.co/bert/openai-gpt-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/modeling_roberta.py | https://pytorch.org/docs/stable/nn.html#module | 三方库说明 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/modeling_roberta.py | https://s3.amazonaws.com/models.huggingface.co/bert/roberta-large-mnli-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/modeling_roberta.py | https://s3.amazonaws.com/models.huggingface.co/bert/roberta-large-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/modeling_roberta.py | https://s3.amazonaws.com/models.huggingface.co/bert/roberta-base-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/modeling_transfo_xl.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/modeling_transfo_xl.py | https://pytorch.org/docs/stable/nn.html#module | 三方库说明 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/modeling_transfo_xl.py | https://s3.amazonaws.com/models.huggingface.co/bert/transfo-xl-wt103-pytorch_model.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/modeling_utils.py | https://www.tensorflow.org/install/ | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/modeling_utils.py | https://pytorch.org/ | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/modeling_xlm.py | https://pytorch.org/docs/stable/nn.html#module | 三方库说明 | +| 
ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/modeling_xlm.py | https://s3.amazonaws.com/models.huggingface.co/bert/xlm-mlm-xnli15-1024-pytorch_model.bin | 词表链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/modeling_xlm.py | https://s3.amazonaws.com/models.huggingface.co/bert/xlm-mlm-tlm-xnli15-1024-pytorch_model.bin | 词表链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/modeling_xlm.py | https://s3.amazonaws.com/models.huggingface.co/bert/xlm-mlm-enro-1024-pytorch_model.bin | 词表链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/modeling_xlm.py | https://s3.amazonaws.com/models.huggingface.co/bert/xlm-mlm-enfr-1024-pytorch_model.bin | 词表链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/modeling_xlm.py | https://s3.amazonaws.com/models.huggingface.co/bert/xlm-mlm-ende-1024-pytorch_model.bin | 词表链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/modeling_xlm.py | https://s3.amazonaws.com/models.huggingface.co/bert/xlm-mlm-en-2048-pytorch_model.bin | 词表链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/modeling_xlm.py | https://s3.amazonaws.com/models.huggingface.co/bert/xlm-mlm-17-1280-pytorch_model.bin | 词表链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/modeling_xlm.py | https://s3.amazonaws.com/models.huggingface.co/bert/xlm-mlm-100-1280-pytorch_model.bin | 词表链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/modeling_xlm.py | https://s3.amazonaws.com/models.huggingface.co/bert/xlm-clm-enfr-1024-pytorch_model.bin | 词表链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/modeling_xlm.py | https://s3.amazonaws.com/models.huggingface.co/bert/xlm-clm-ende-1024-pytorch_model.bin | 词表链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/modeling_xlnet.py | https://www.tensorflow.org/install/ | 三方库install | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/modeling_xlnet.py | https://pytorch.org/docs/stable/nn.html#module | 三方库说明 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/modeling_xlnet.py | http://arxiv.org/abs/1906.08237 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/modeling_xlnet.py | https://s3.amazonaws.com/models.huggingface.co/bert/xlnet-large-cased-pytorch_model.bin | 词表链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/modeling_xlnet.py | https://s3.amazonaws.com/models.huggingface.co/bert/xlnet-base-cased-pytorch_model.bin | 词表链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/tokenization_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-uncased-vocab.txt | 相关配置 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/tokenization_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-chinese-vocab.txt | 相关配置 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/tokenization_bert.py | 
https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased-vocab.txt | 下载链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/tokenization_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-cased-finetuned-mrpc-vocab.txt | 下载链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/tokenization_bert.py | https://int-deepset-models-bert.s3.eu-central-1.amazonaws.com/pytorch/bert-base-german-cased-vocab.txt | 下载链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/tokenization_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-german-dbmdz-cased-vocab.txt | 下载链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/tokenization_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-german-dbmdz-uncased-vocab.txt | 下载链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/tokenization_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-multilingual-cased-vocab.txt | 下载链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/tokenization_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt | 下载链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/tokenization_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-vocab.txt | 下载链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/tokenization_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-whole-word-masking-vocab.txt | 下载链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/tokenization_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-cased-whole-word-masking-finetuned-squad-vocab.txt | 下载链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/tokenization_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-vocab.txt | 下载链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/tokenization_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-whole-word-masking-vocab.txt | 下载链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/tokenization_bert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-whole-word-masking-finetuned-squad-vocab.txt | 下载链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/tokenization_ctrl.py | https://raw.githubusercontent.com/salesforce/ctrl/master/ctrl-vocab.json | 词表链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/tokenization_ctrl.py | https://raw.githubusercontent.com/salesforce/ctrl/master/ctrl-merges.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/tokenization_distilbert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-base-uncased-vocab.txt | 下载链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/tokenization_distilbert.py | https://s3.amazonaws.com/models.huggingface.co/bert/bert-large-uncased-vocab.txt | 下载链接 | +| 
ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/tokenization_gpt2.py | https://s3.amazonaws.com/models.huggingface.co/bert/distilgpt2-merges.txt | 下载链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/tokenization_gpt2.py | https://s3.amazonaws.com/models.huggingface.co/bert/distilgpt2-vocab.json | 词表链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/tokenization_gpt2.py | https://s3.amazonaws.com/models.huggingface.co/bert/gpt2-merges.txt | 词表链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/tokenization_gpt2.py | https://s3.amazonaws.com/models.huggingface.co/bert/gpt2-vocab.json | 词表链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/tokenization_gpt2.py | https://s3.amazonaws.com/models.huggingface.co/bert/gpt2-large-merges.txt | 词表链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/tokenization_gpt2.py | https://s3.amazonaws.com/models.huggingface.co/bert/gpt2-large-vocab.json | 词表链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/tokenization_gpt2.py | https://s3.amazonaws.com/models.huggingface.co/bert/gpt2-medium-merges.txt | 词表链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/tokenization_gpt2.py | https://s3.amazonaws.com/models.huggingface.co/bert/gpt2-medium-vocab.json | 词表链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/tokenization_openai.py | https://s3.amazonaws.com/models.huggingface.co/bert/openai-gpt-merges.txt | 词表链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/tokenization_openai.py | https://s3.amazonaws.com/models.huggingface.co/bert/openai-gpt-vocab.json | 词表链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/tokenization_roberta.py | https://s3.amazonaws.com/models.huggingface.co/bert/roberta-large-mnli-vocab.json | 词表链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/tokenization_roberta.py | https://s3.amazonaws.com/models.huggingface.co/bert/roberta-large-vocab.json | 词表链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/tokenization_roberta.py | https://s3.amazonaws.com/models.huggingface.co/bert/roberta-base-vocab.json | 词表链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/tokenization_roberta.py | https://s3.amazonaws.com/models.huggingface.co/bert/roberta-large-mnli-merges.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/tokenization_roberta.py | https://s3.amazonaws.com/models.huggingface.co/bert/roberta-large-merges.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/tokenization_roberta.py | https://s3.amazonaws.com/models.huggingface.co/bert/roberta-base-merges.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/tokenization_transfo_xl.py | https://s3.amazonaws.com/models.huggingface.co/bert/transfo-xl-wt103-corpus.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/tokenization_transfo_xl.py | 
https://s3.amazonaws.com/models.huggingface.co/bert/transfo-xl-wt103-vocab.bin | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/tokenization_xlm.py | https://s3.amazonaws.com/models.huggingface.co/bert/xlm-mlm-xnli15-1024-vocab.json | 词表链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/tokenization_xlm.py | https://s3.amazonaws.com/models.huggingface.co/bert/xlm-mlm-tlm-xnli15-1024-vocab.json | 词表链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/tokenization_xlm.py | https://s3.amazonaws.com/models.huggingface.co/bert/xlm-mlm-enro-1024-vocab.json | 词表链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/tokenization_xlm.py | https://s3.amazonaws.com/models.huggingface.co/bert/xlm-mlm-enfr-1024-vocab.json | 词表链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/tokenization_xlm.py | https://s3.amazonaws.com/models.huggingface.co/bert/xlm-mlm-ende-1024-vocab.json | 词表链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/tokenization_xlm.py | https://s3.amazonaws.com/models.huggingface.co/bert/xlm-mlm-en-2048-vocab.json | 词表链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/tokenization_xlm.py | https://s3.amazonaws.com/models.huggingface.co/bert/xlm-mlm-17-1280-vocab.json | 词表链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/tokenization_xlm.py | https://s3.amazonaws.com/models.huggingface.co/bert/xlm-mlm-100-1280-vocab.json | 词表链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/tokenization_xlm.py | https://s3.amazonaws.com/models.huggingface.co/bert/xlm-clm-enfr-1024-vocab.json | 词表链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/tokenization_xlm.py | https://s3.amazonaws.com/models.huggingface.co/bert/xlm-clm-ende-1024-vocab.json | 词表链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/tokenization_xlm.py | https://s3.amazonaws.com/models.huggingface.co/bert/xlm-mlm-xnli15-1024-merges.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/tokenization_xlm.py | https://s3.amazonaws.com/models.huggingface.co/bert/xlm-mlm-tlm-xnli15-1024-merges.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/tokenization_xlm.py | https://s3.amazonaws.com/models.huggingface.co/bert/xlm-mlm-enro-1024-merges.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/tokenization_xlm.py | https://s3.amazonaws.com/models.huggingface.co/bert/xlm-mlm-enfr-1024-merges.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/tokenization_xlm.py | https://s3.amazonaws.com/models.huggingface.co/bert/xlm-mlm-ende-1024-merges.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/tokenization_xlm.py | https://s3.amazonaws.com/models.huggingface.co/bert/xlm-mlm-en-2048-merges.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/tokenization_xlm.py | https://s3.amazonaws.com/models.huggingface.co/bert/xlm-mlm-17-1280-merges.txt | 
模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/tokenization_xlm.py | https://s3.amazonaws.com/models.huggingface.co/bert/xlm-mlm-100-1280-merges.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/tokenization_xlm.py | https://s3.amazonaws.com/models.huggingface.co/bert/xlm-mlm-enfr-1024-merges.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/tokenization_xlm.py | https://s3.amazonaws.com/models.huggingface.co/bert/xlm-mlm-ende-1024-merges.txt | 模型相关说明 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/tokenization_xlnet.py | https://s3.amazonaws.com/models.huggingface.co/bert/xlnet-large-cased-spiece.model | 词表链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/chineseglue_tnew_bert/classifier_pytorch/transformers/tokenization_xlnet.py | https://s3.amazonaws.com/models.huggingface.co/bert/xlnet-base-cased-spiece.model | 词表链接 | \ No newline at end of file diff --git a/PyTorch/dev/perf/mlperf_bert/public_address_statement.md b/PyTorch/dev/perf/mlperf_bert/public_address_statement.md index 141d41bde40a1f96d85dc1d04ea9ee7aa7ab077b..c36e3ed6e06dede13818045bfe28cfeba6d2258f 100644 --- a/PyTorch/dev/perf/mlperf_bert/public_address_statement.md +++ b/PyTorch/dev/perf/mlperf_bert/public_address_statement.md @@ -1,129 +1,3 @@ -| 类型 | 开源代码地址 | 文件名 | 公网IP地址/公网URL地址/域名/邮箱地址 | 用途说明 | -| ---- | ------------ | ------ | ------------------------------------ | -------- | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_01 | https://en.wikipedia.org/wiki?curid=624 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_01 | https://en.wikipedia.org/wiki?curid=627 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_01 | https://en.wikipedia.org/wiki?curid=628 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_01 | https://en.wikipedia.org/wiki?curid=630 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_01 | https://en.wikipedia.org/wiki?curid=632 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_01 | https://en.wikipedia.org/wiki?curid=633 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_01 | https://en.wikipedia.org/wiki?curid=634 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_01 | https://en.wikipedia.org/wiki?curid=639 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_01 | https://en.wikipedia.org/wiki?curid=640 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_01 | https://en.wikipedia.org/wiki?curid=642 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_01 | https://en.wikipedia.org/wiki?curid=643 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_01 | https://en.wikipedia.org/wiki?curid=649 | 百科介绍 | -| 开源代码引入 | 
https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_01 | https://en.wikipedia.org/wiki?curid=651 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_01 | https://en.wikipedia.org/wiki?curid=653 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_01 | https://en.wikipedia.org/wiki?curid=655 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_01 | https://en.wikipedia.org/wiki?curid=656 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_01 | https://en.wikipedia.org/wiki?curid=657 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_01 | https://en.wikipedia.org/wiki?curid=659 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_01 | https://en.wikipedia.org/wiki?curid=661 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_01 | https://en.wikipedia.org/wiki?curid=662 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_01 | https://en.wikipedia.org/wiki?curid=663 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_01 | https://en.wikipedia.org/wiki?curid=664 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_01 | https://en.wikipedia.org/wiki?curid=665 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_01 | https://en.wikipedia.org/wiki?curid=666 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_01 | https://en.wikipedia.org/wiki?curid=670 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_01 | https://en.wikipedia.org/wiki?curid=673 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_01 | https://en.wikipedia.org/wiki?curid=674 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_01 | https://en.wikipedia.org/wiki?curid=675 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_01 | https://en.wikipedia.org/wiki?curid=676 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_01 | https://en.wikipedia.org/wiki?curid=677 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_01 | https://en.wikipedia.org/wiki?curid=679 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_01 | https://en.wikipedia.org/wiki?curid=680 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_01 | https://en.wikipedia.org/wiki?curid=681 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_01 | 
https://en.wikipedia.org/wiki?curid=682 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_01 | https://en.wikipedia.org/wiki?curid=683 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_01 | https://en.wikipedia.org/wiki?curid=689 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_01 | https://en.wikipedia.org/wiki?curid=690 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_01 | https://en.wikipedia.org/wiki?curid=691 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_01 | https://en.wikipedia.org/wiki?curid=694 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_01 | https://en.wikipedia.org/wiki?curid=698 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_01 | https://en.wikipedia.org/wiki?curid=700 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=701 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=704 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=705 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=706 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=708 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=709 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=710 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=711 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=713 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=717 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=728 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=734 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=736 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=737 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | 
mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=738 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=740 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=742 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=746 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=748 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=751 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=752 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=764 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=765 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=766 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=771 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=772 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=775 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=777 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=779 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=780 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=782 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=783 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=784 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=785 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=786 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=787 | 百科介绍 | -| 开源代码引入 | 
https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=788 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=789 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=791 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=794 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=795 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=798 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/cleanup_scripts/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=799 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/input_preprocessing/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=701 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/input_preprocessing/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=704 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/input_preprocessing/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=705 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/input_preprocessing/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=706 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/input_preprocessing/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=708 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/input_preprocessing/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=709 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/input_preprocessing/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=710 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/input_preprocessing/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=711 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/input_preprocessing/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=713 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/input_preprocessing/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=717 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/input_preprocessing/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=728 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/input_preprocessing/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=734 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/input_preprocessing/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=736 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/input_preprocessing/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=737 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | 
mlperf_bert/pytorch/input_preprocessing/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=738 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/input_preprocessing/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=740 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/input_preprocessing/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=742 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/input_preprocessing/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=746 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/input_preprocessing/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=748 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/input_preprocessing/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=751 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/input_preprocessing/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=752 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/input_preprocessing/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=764 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/input_preprocessing/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=765 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/input_preprocessing/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=766 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/input_preprocessing/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=771 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/input_preprocessing/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=772 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/input_preprocessing/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=775 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/input_preprocessing/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=777 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/input_preprocessing/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=779 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/input_preprocessing/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=780 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/input_preprocessing/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=782 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/input_preprocessing/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=783 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/input_preprocessing/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=784 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/input_preprocessing/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=785 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/input_preprocessing/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=786 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | 
mlperf_bert/pytorch/input_preprocessing/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=787 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/input_preprocessing/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=788 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/input_preprocessing/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=789 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/input_preprocessing/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=791 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/input_preprocessing/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=794 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/input_preprocessing/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=795 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/input_preprocessing/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=798 | 百科介绍 | -| 开源代码引入 | https://github.com/sharathts/training.git | mlperf_bert/pytorch/input_preprocessing/sample_data/wiki_02 | https://en.wikipedia.org/wiki?curid=799 | 百科介绍 | \ No newline at end of file +| 文件位置 | 公网地址 | 公网地址用途 | +|-------------------------------------------------------------------|-------------------------------------|-------------| +| ModelZoo-PyTorch/PyTorch/dev/perf/mlperf_bert/pytorch/modeling.py | https://www.tensorflow.org/install/ | 三方库install | \ No newline at end of file diff --git a/PyTorch/dev/perf/mobilnetv3-cifar/public_address_statement.md b/PyTorch/dev/perf/mobilnetv3-cifar/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..6d909a3955ea5f141a5ee28416bec679b888b2e0 --- /dev/null +++ b/PyTorch/dev/perf/mobilnetv3-cifar/public_address_statement.md @@ -0,0 +1,4 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|------------------------------------------------------------|--------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/dev/perf/mobilnetv3-cifar/SVHN.py | http://ufldl.stanford.edu/housenumbers/train_32x32.mat | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/mobilnetv3-cifar/SVHN.py | http://ufldl.stanford.edu/housenumbers/test_32x32.mat | 数据集链接 | \ No newline at end of file diff --git a/PyTorch/dev/perf/speechbrain-tdnn/public_address_statement.md b/PyTorch/dev/perf/speechbrain-tdnn/public_address_statement.md new file mode 100644 index 0000000000000000000000000000000000000000..653caef78a4bf77fac54bbf9e29a0ac5a7bda1c6 --- /dev/null +++ b/PyTorch/dev/perf/speechbrain-tdnn/public_address_statement.md @@ -0,0 +1,18 @@ +| 文件位置 | 公网地址 | 公网地址用途 | +|-----------------------------------------------------------------------------------------------------------------------|--------------------------------------------------------------------|---------| +| ModelZoo-PyTorch/PyTorch/dev/perf/speechbrain-tdnn/recipes/VoxCeleb/SpeakerRec/hparams/train_ecapa_tdnn.yaml | https://www.robots.ox.ac.uk/~vgg/data/voxceleb/meta/veri_test2.txt | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/speechbrain-tdnn/recipes/VoxCeleb/SpeakerRec/hparams/train_x_vectors.yaml | https://www.robots.ox.ac.uk/~vgg/data/voxceleb/meta/veri_test2.txt | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/speechbrain-tdnn/recipes/VoxCeleb/SpeakerRec/hparams/verification_ecapa.yaml | https://www.robots.ox.ac.uk/~vgg/data/voxceleb/meta/veri_test2.txt | 数据集地址 | +| 
ModelZoo-PyTorch/PyTorch/dev/perf/speechbrain-tdnn/recipes/VoxCeleb/SpeakerRec/hparams/verification_plda_xvector.yaml | https://www.robots.ox.ac.uk/~vgg/data/voxceleb/meta/veri_test2.txt | 数据集地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/speechbrain-tdnn/recipes/VoxCeleb/voxceleb_prepare.py | http://www.robots.ox.ac.uk/~vgg/data/voxceleb/ | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/speechbrain-tdnn/setup.py | speechbrain@gmail.com | 作者邮箱 | +| ModelZoo-PyTorch/PyTorch/dev/perf/speechbrain-tdnn/speechbrain/lobes/augment.py | http://www.openslr.org/resources/28/rirs_noises.zip | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/speechbrain-tdnn/speechbrain/lobes/models/EnhanceResnet.py | https://arxiv.org/abs/1709.01507 | 论文地址 | +| ModelZoo-PyTorch/PyTorch/dev/perf/speechbrain-tdnn/speechbrain/lobes/models/fairseq_wav2vec.py | https://fairseq.readthedocs.io/en/latest/ | 三方库链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/speechbrain-tdnn/templates/enhancement/mini_librispeech_prepare.py | http://www.openslr.org/resources/31/dev-clean-2.tar.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/speechbrain-tdnn/templates/enhancement/mini_librispeech_prepare.py | http://www.openslr.org/resources/31/train-clean-5.tar.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/speechbrain-tdnn/templates/enhancement/mini_librispeech_prepare.py | https://www.openslr.org/resources/12/test-clean.tar.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/speechbrain-tdnn/templates/speaker_id/mini_librispeech_prepare.py | http://www.openslr.org/resources/31/train-clean-5.tar.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/speechbrain-tdnn/templates/speech_recognition/mini_librispeech_prepare.py | http://www.openslr.org/resources/31/dev-clean-2.tar.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/speechbrain-tdnn/templates/speech_recognition/mini_librispeech_prepare.py | http://www.openslr.org/resources/31/train-clean-5.tar.gz | 数据集链接 | +| ModelZoo-PyTorch/PyTorch/dev/perf/speechbrain-tdnn/templates/speech_recognition/mini_librispeech_prepare.py | https://www.openslr.org/resources/12/test-clean.tar.gz | 数据集链接 | \ No newline at end of file